Science.gov

Sample records for reliability organizations theory

  1. Centralized Reliability Data Organization (CREDO)

    SciTech Connect

    Haire, M J

    1987-04-21

    One of the primary goals of the Centralized Reliability Data Organization (CREDO) is to be an international focal point for the collection, analysis, and dissemination of liquid metal reactor (LMR) component reliability, availability, and maintainability (RAM) data. During FY-1985, the Department of Energy (DOE) entered into a Specific Memorandum of Agreement (SMA) with Japan's Power Reactor and Nuclear Fuel Development Corporation (PNC) regarding cooperative data exchange efforts. This agreement was CREDO's first step toward internationalization and represented an initial realization of the previously mentioned goal. DOE's interest in further internationalization of the CREDO system was the primary motivation for the traveler's attendance at the Reliability '87 conference.

  3. Becoming a high reliability organization.

    PubMed

    Christianson, Marlys K; Sutcliffe, Kathleen M; Miller, Melissa A; Iwashyna, Theodore J

    2011-01-01

    Aircraft carriers, electrical power grids, and wildland firefighting, though seemingly different, are exemplars of high reliability organizations (HROs)--organizations that have the potential for catastrophic failure yet engage in nearly error-free performance. HROs commit to safety at the highest level and adopt a special approach to its pursuit. High reliability organizing has been studied and discussed for some time in other industries and is receiving increasing attention in health care, particularly in high-risk settings like the intensive care unit (ICU). The essence of high reliability organizing is a set of principles that enable organizations to focus attention on emergent problems and to deploy the right set of resources to address those problems. HROs behave in ways that sometimes seem counterintuitive--they do not try to hide failures but rather celebrate them as windows into the health of the system, they seek out problems, they avoid focusing on just one aspect of work and are able to see how all the parts of work fit together, they expect unexpected events and develop the capability to manage them, and they defer decision making to local frontline experts who are empowered to solve problems. Given the complexity of patient care in the ICU, the potential for medical error, and the particular sensitivity of critically ill patients to harm, high reliability organizing principles hold promise for improving ICU patient care.

  4. Organization Theory as Ideology.

    ERIC Educational Resources Information Center

    Greenfield, Thomas B.

    The theory that organizations are ideological inventions of the human mind is discussed. Organizational science is described as an ideology which is based upon social concepts and experiences. The main justification for organizational theory is that it attempts to answer why we behave as we do in social organizations. Ways in which ideas and…

  5. Creating High Reliability in Health Care Organizations

    PubMed Central

    Pronovost, Peter J; Berenholtz, Sean M; Goeschel, Christine A; Needham, Dale M; Sexton, J Bryan; Thompson, David A; Lubomski, Lisa H; Marsteller, Jill A; Makary, Martin A; Hunt, Elizabeth

    2006-01-01

    Objective: The objective of this paper was to present a comprehensive approach to help health care organizations reliably deliver effective interventions. Context: Reliability in healthcare translates into using valid rate-based measures. Yet high reliability organizations have proven that the context in which care is delivered, called organizational culture, also has important influences on patient safety. Model for Improvement: Our model to improve reliability, which also includes interventions to improve culture, focuses on valid rate-based measures. This model includes (1) identifying evidence-based interventions that improve the outcome, (2) selecting interventions with the most impact on outcomes and converting to behaviors, (3) developing measures to evaluate reliability, (4) measuring baseline performance, and (5) ensuring patients receive the evidence-based interventions. The comprehensive unit-based safety program (CUSP) is used to improve culture and guide organizations in learning from mistakes that are important, but cannot be measured as rates. Conclusions: We present how this model was used in over 100 intensive care units in Michigan to improve culture and eliminate catheter-related blood stream infections—both were accomplished. Our model differs from existing models in that it incorporates efforts to improve a vital component for system redesign—culture, it targets 3 important groups—senior leaders, team leaders, and front line staff, and facilitates change management—engage, educate, execute, and evaluate for planned interventions. PMID:16898981

  6. High Reliability Organizations in Education. Noteworthy Perspectives

    ERIC Educational Resources Information Center

    Eck, James H.; Bellamy, G. Thomas; Schaffer, Eugene; Stringfield, Sam; Reynolds, David

    2011-01-01

    The authors of this monograph assert that by assisting school systems to more closely resemble "high reliability" organizations (HROs) that already exist in other industries and benchmarking against top-performing education systems from around the globe, America's school systems can transform themselves from compliance-driven bureaucracies to…

  7. 18 CFR 39.3 - Electric Reliability Organization certification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    18 CFR, Conservation of Power and Water Resources, vol. 1 (revised as of 2011-04-01). Rules concerning certification of the Electric Reliability Organization; and procedures for the establishment, approval, and enforcement of Electric Reliability Standards. § 39.3 Electric Reliability Organization certification. (a)…

  8. 18 CFR 39.3 - Electric Reliability Organization certification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    18 CFR, Conservation of Power and Water Resources, vol. 1 (revised as of 2010-04-01). Rules concerning certification of the Electric Reliability Organization; and procedures for the establishment, approval, and enforcement of Electric Reliability Standards. § 39.3 Electric Reliability Organization certification. (a)…

  9. Microbial community modeling using reliability theory.

    PubMed

    Zilles, Julie L; Rodríguez, Luis F; Bartolerio, Nicholas A; Kent, Angela D

    2016-08-01

    Linking microbial community composition with the corresponding ecosystem functions remains challenging. Because microbial communities can differ in their functional responses, this knowledge gap limits ecosystem assessment, design and management. To develop models that explicitly incorporate microbial populations and guide efforts to characterize their functional differences, we propose a novel approach derived from reliability engineering. This reliability modeling approach is illustrated here using a microbial ecology dataset from denitrifying bioreactors. Reliability modeling is well-suited for analyzing the stability of complex networks composed of many microbial populations. It could also be applied to evaluate the redundancy within a particular biochemical pathway in a microbial community. Reliability modeling allows characterization of the system's resilience and identification of failure-prone functional groups or biochemical steps, which can then be targeted for monitoring or enhancement. The reliability engineering approach provides a new perspective for unraveling the interactions between microbial community diversity, functional redundancy and ecosystem services, as well as practical tools for the design and management of engineered ecosystems.
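
    A minimal sketch of the kind of calculation this approach implies (hypothetical numbers and structure, not the paper's actual model): treat functionally redundant populations within a biochemical step as components in parallel, and the successive steps of the pathway as components in series.

        def step_reliability(populations):
            # Redundant populations performing the same function act in parallel:
            # the step fails only if every population fails.
            fail = 1.0
            for p in populations:
                fail *= (1.0 - p)
            return 1.0 - fail

        def pathway_reliability(steps):
            # Sequential biochemical steps act in series: all must succeed.
            r = 1.0
            for populations in steps:
                r *= step_reliability(populations)
            return r

        # Hypothetical three-step denitrification pathway with varying redundancy
        steps = [
            [0.9, 0.8, 0.7],  # step 1: three redundant populations
            [0.95],           # step 2: single population, a failure-prone point
            [0.85, 0.6],      # step 3: two redundant populations
        ]
        print(round(pathway_reliability(steps), 3))  # 0.888

    Failure-prone steps show up as the low-redundancy factors that dominate the product, which is what makes them natural targets for monitoring or enhancement.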

  10. High Reliability Organizations--Medication Safety.

    PubMed

    Yip, Luke; Farmer, Brenna

    2015-06-01

    High reliability organizations (HROs), such as the aviation industry, successfully engage in high-risk endeavors and have a low incidence of adverse events. HROs have a preoccupation with failure and errors. They analyze each event to effect system-wide change in an attempt to mitigate the occurrence of similar errors. The healthcare industry can adapt HRO practices, specifically with regard to teamwork and communication. Crew resource management concepts can be adapted to healthcare with the use of certain tools, such as checklists and the sterile cockpit, to reduce medication errors. HROs also use the Swiss Cheese Model to evaluate risk and look for vulnerabilities across multiple protective barriers, instead of focusing on a single failure. This model can be used in medication safety to evaluate medication management in addition to using the teamwork and communication tools of HROs. PMID:25812532

  12. [Qualitative analysis: theory, steps and reliability].

    PubMed

    Minayo, Maria Cecília de Souza

    2012-03-01

    This essay seeks to conduct in-depth analysis of qualitative research, based on benchmark authors and the author's own experience. The hypothesis is that in order for an analysis to be considered reliable, it needs to be based on structuring terms of qualitative research, namely the verbs 'comprehend' and 'interpret', and the nouns 'experience', 'common sense' and 'social action'. The 10 steps begin with the construction of the scientific object by its inclusion on the national and international agenda; the development of tools that make the theoretical concepts tangible; conducting field work that involves the researcher empathetically with the participants in the use of various techniques and approaches, making it possible to build relationships, observations and a narrative with perspective. Finally, the author deals with the analysis proper, showing how the object, which has already been studied in all the previous steps, should become a second-order construct, in which the logic of the actors in their diversity and not merely their speech predominates. The final report must be a theoretic, contextual, concise and clear narrative.

  13. 76 FR 23222 - Electric Reliability Organization Interpretation of Transmission Operations Reliability

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-26

    …52 FR 47897 (Dec. 17, 1987), FERC Stats. & Regs. Preambles 1986-1990 ¶ 30,783 (1987); 18 CFR 380… Federal Energy Regulatory Commission, 18 CFR Part 40, Electric Reliability Organization Interpretation of… Background: Section 215 of the FPA requires a Commission-certified Electric Reliability Organization…

  14. Design of high reliability organizations in health care.

    PubMed

    Carroll, J S; Rudolph, J W

    2006-12-01

    To improve safety performance, many healthcare organizations have sought to emulate high reliability organizations from industries such as nuclear power, chemical processing, and military operations. We outline high reliability design principles for healthcare organizations including both the formal structures and the informal practices that complement those structures. A stage model of organizational structures and practices, moving from local autonomy to formal controls to open inquiry to deep self-understanding, is used to illustrate typical challenges and design possibilities at each stage. We suggest how organizations can use the concepts and examples presented to increase their capacity to self-design for safety and reliability.

  15. 76 FR 58101 - Electric Reliability Organization Interpretation of Transmission Operations Reliability Standard

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-20

    …Reliability Standard, Notice of Proposed Rulemaking, 76 FR 23222 (Apr. 26, 2011), FERC Stats. & Regs. ¶ 32,674… No. 486, 52 FR 47897 (Dec. 17, 1987), FERC Stats. & Regs. Preambles 1986-1990 ¶ 30,783 (1987)… Federal Energy Regulatory Commission, 18 CFR Part 40, Electric Reliability Organization Interpretation…

  16. 76 FR 23171 - Electric Reliability Organization Interpretations of Interconnection Reliability Operations and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-26

    …"Current Status of Bulk Electric System elements (transmission or generation including critical…" Federal Energy Regulatory Commission, 18 CFR Part 40, Electric Reliability Organization Interpretations of… Federal Power Act, the Federal Energy Regulatory Commission hereby approves the North American…

  17. [Process design in high-reliability organizations].

    PubMed

    Sommer, K-J; Kranz, J; Steffens, J

    2014-05-01

    Modern medicine is a highly complex service industry in which individual care providers are linked in a complicated network. The complexity and interlinkedness is associated with risks concerning patient safety. Other highly complex industries like commercial aviation have succeeded in maintaining or even increasing its safety levels despite rapidly increasing passenger figures. Standard operating procedures (SOPs), crew resource management (CRM), as well as operational risk evaluation (ORE) are historically developed and trusted parts of a comprehensive and systemic safety program. If medicine wants to follow this quantum leap towards increased patient safety, it must intensively evaluate the results of other high-reliability industries and seek step-by-step implementation after a critical assessment.

  18. Studying Reliability of Open Ended Mathematics Items According to the Classical Test Theory and Generalizability Theory

    ERIC Educational Resources Information Center

    Guler, Nese; Gelbal, Selahattin

    2010-01-01

    In this study, classical test theory and generalizability theory were used to determine the reliability of scores obtained from a measurement tool of mathematics achievement. Twenty-four open-ended mathematics questions from TIMSS-1999 were administered to 203 students in the 2007 spring semester. The internal consistency of the scores was found to be 0.92. For…
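
    For context, the internal-consistency figure reported in such CTT analyses is typically Cronbach's alpha; a minimal computation on made-up item scores might look like this:

        def cronbach_alpha(items):
            # items: one list of scores per item, aligned across the same students
            k, n = len(items), len(items[0])
            def var(xs):
                m = sum(xs) / len(xs)
                return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
            total_item_var = sum(var(item) for item in items)
            totals = [sum(item[j] for item in items) for j in range(n)]
            return k / (k - 1) * (1 - total_item_var / var(totals))

        # Hypothetical scores: 3 open-ended items, 5 students
        items = [[2, 3, 4, 5, 4], [1, 3, 4, 4, 5], [2, 2, 5, 4, 4]]
        print(round(cronbach_alpha(items), 2))  # 0.91

    G-theory goes further than a single coefficient like this by partitioning the error into facets (raters, occasions, tasks), which is the comparison the study draws.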

  19. Modeling Sensor Reliability in Fault Diagnosis Based on Evidence Theory

    PubMed Central

    Yuan, Kaijuan; Xiao, Fuyuan; Fei, Liguo; Kang, Bingyi; Deng, Yong

    2016-01-01

    Sensor data fusion plays an important role in fault diagnosis. Dempster–Shafer (D-S) evidence theory is widely used in fault diagnosis, since it is efficient at combining evidence from different sensors. However, when the evidence is highly conflicting, it may produce a counterintuitive result. To address this issue, a new method is proposed in this paper. Not only the static sensor reliability, but also the dynamic sensor reliability is taken into consideration. The evidence distance function and the belief entropy are combined to obtain the dynamic reliability of each sensor report. A weighted averaging method is adopted to modify the conflicting evidence by assigning different weights to evidence according to sensor reliability. The proposed method performs better in conflict management and fault diagnosis because the information volume of each sensor report is taken into consideration. An application in fault diagnosis based on sensor fusion is presented to show the efficiency of the proposed method. The results show that the proposed method improves the accuracy of fault diagnosis from 81.19% to 89.48% compared to existing methods. PMID:26797611
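
    A toy sketch of the general scheme (the authors' distance-and-entropy weighting is richer; the weights and reports below are invented): weight each sensor's basic probability assignment by its reliability, average, then fuse with Dempster's rule.

        from itertools import product

        def dempster(m1, m2):
            # Dempster's rule of combination over frozenset focal elements
            combined, conflict = {}, 0.0
            for (a, x), (b, y) in product(m1.items(), m2.items()):
                inter = a & b
                if inter:
                    combined[inter] = combined.get(inter, 0.0) + x * y
                else:
                    conflict += x * y
            return {k: v / (1.0 - conflict) for k, v in combined.items()}

        def weighted_average(masses, weights):
            # Reliability-weighted averaging of evidence before combination
            total, avg = sum(weights), {}
            for m, w in zip(masses, weights):
                for focal, val in m.items():
                    avg[focal] = avg.get(focal, 0.0) + val * w / total
            return avg

        A, B = frozenset({"fault_A"}), frozenset({"fault_B"})
        m1 = {A: 0.7, B: 0.2, A | B: 0.1}  # report from a reliable sensor
        m2 = {A: 0.1, B: 0.8, A | B: 0.1}  # conflicting, less reliable report
        avg = weighted_average([m1, m2], weights=[0.9, 0.3])
        fused = dempster(avg, avg)  # n-1 = 1 self-combination for two reports
        print({",".join(sorted(k)): round(v, 3) for k, v in fused.items()})
        # {'fault_A': 0.671, 'fault_B': 0.313, 'fault_A,fault_B': 0.016}

    Because the unreliable sensor is down-weighted before combination, the fused belief follows the reliable sensor instead of the counterintuitive result plain Dempster combination can give under high conflict.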

  1. The Stability and Reliability of a Modified Work Components Study Questionnaire in the Educational Organization.

    ERIC Educational Resources Information Center

    Miskel, Cecil; Heller, Leonard E.

    The investigation attempted to establish the factorial validity and reliability of an industrial selection device based on Herzberg's theory of work motivation related to the school organization. The questionnaire was reworded to reflect an educational work situation; and a random sample of 197 students, 118 administrators, and 432 teachers was…

  2. Teamwork as an Essential Component of High-Reliability Organizations

    PubMed Central

    Baker, David P; Day, Rachel; Salas, Eduardo

    2006-01-01

    Organizations are increasingly becoming dynamic and unstable. This evolution has given rise to greater reliance on teams and increased complexity in terms of team composition, skills required, and degree of risk involved. High-reliability organizations (HROs) are those that exist in such hazardous environments where the consequences of errors are high, but the occurrence of error is extremely low. In this article, we argue that teamwork is an essential component of achieving high reliability particularly in health care organizations. We describe the fundamental characteristics of teams, review strategies in team training, demonstrate the criticality of teamwork in HROs and finally, identify specific challenges the health care community must address to improve teamwork and enhance reliability. PMID:16898980

  3. Generalizability Theory as a Unifying Framework of Measurement Reliability in Adolescent Research

    ERIC Educational Resources Information Center

    Fan, Xitao; Sun, Shaojing

    2014-01-01

    In adolescence research, the treatment of measurement reliability is often fragmented, and it is not always clear how different reliability coefficients are related. We show that generalizability theory (G-theory) is a comprehensive framework of measurement reliability, encompassing all other reliability methods (e.g., Pearson "r,"…

  4. 18 CFR 39.3 - Electric Reliability Organization certification.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    18 CFR, Conservation of Power and Water Resources, vol. 1 (revised as of 2012-04-01). § 39.3 Electric Reliability Organization certification. Federal Energy Regulatory Commission… operators of the Bulk-Power System, and other interested parties for improvement of the Electric…

  5. 18 CFR 39.3 - Electric Reliability Organization certification.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    18 CFR, Conservation of Power and Water Resources, vol. 1 (revised as of 2014-04-01). § 39.3 Electric Reliability Organization certification. Federal Energy Regulatory Commission, Department of Energy, regulations under the Federal Power Act; rules concerning certification of the Electric…

  6. 18 CFR 39.3 - Electric Reliability Organization certification.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    18 CFR, Conservation of Power and Water Resources, vol. 1 (revised as of 2013-04-01). § 39.3 Electric Reliability Organization certification. Federal Energy Regulatory Commission… operators of the Bulk-Power System, and other interested parties for improvement of the Electric…

  7. Evaluating reliability and resolution of ensemble forecasts using information theory

    NASA Astrophysics Data System (ADS)

    Weijs, Steven; van de Giesen, Nick

    2010-05-01

    Ensemble forecasts are increasingly popular for the communication of uncertainty towards the public and decision makers. Ideally, an ensemble forecast reflects both the uncertainty and the information in a forecast, which means that the spread in the ensemble should accurately represent the true uncertainty. For ensembles to be useful, they should be probabilistic, as probability is the language for precisely describing an incomplete state of knowledge, which is typical of forecasts. Information theory provides the ideal tools to deal with uncertainty and information in forecasts. Essential to the use and development of models and forecasts are ways to evaluate their quality. Without a proper definition of what is good, it is impossible to improve forecasts. In contrast to forecast value, which is user dependent, forecast quality, which is defined as the correspondence between forecasts and observations, can be objectively defined, given the question that is asked. The evaluation of forecast quality is known as forecast verification. Numerous techniques for forecast verification have been developed over the past decades. The Brier score (BS) and the derived Ranked Probability Score (RPS) are among the most widely used scores for measuring forecast quality. Both of these scores can be split into three additive components: uncertainty, reliability and resolution. While the first component, uncertainty, depends only on the inherent variability in the forecasted event, the latter two measure different aspects of the quality of the forecasts themselves. Resolution measures the difference between the conditional probabilities and the marginal probabilities of occurrence. The third component, reliability, measures the conditional bias in the probability estimates, hence unreliability would be a better name. In this work, we argue that information theory should be adopted as the correct framework for measuring the quality of probabilistic ensemble forecasts. We use the information…
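
    For reference, the decomposition referred to here is Murphy's partition of the Brier score; for forecasts grouped into K probability bins it reads:

        BS = \underbrace{\bar{o}\,(1-\bar{o})}_{\text{uncertainty}}
           \;-\; \underbrace{\tfrac{1}{N}\sum_{k=1}^{K} n_k\,(\bar{o}_k - \bar{o})^2}_{\text{resolution}}
           \;+\; \underbrace{\tfrac{1}{N}\sum_{k=1}^{K} n_k\,(f_k - \bar{o}_k)^2}_{\text{reliability}}

    where f_k is the forecast probability in bin k, n_k the number of forecasts falling in that bin, ō_k the observed relative frequency within the bin, and ō the overall climatological frequency; a smaller reliability term and a larger resolution term both improve the score.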

  8. Theory of reliable systems. [reliability analysis and on-line fault diagnosis

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1974-01-01

    Research is reported on a program to refine the current notion of system reliability by identifying and investigating attributes of a system which are important to reliability considerations, and to develop techniques which facilitate analysis of system reliability. Reliability analysis and on-line fault diagnosis are discussed.

  9. Comparison of Reliability Measures under Factor Analysis and Item Response Theory

    ERIC Educational Resources Information Center

    Cheng, Ying; Yuan, Ke-Hai; Liu, Cheng

    2012-01-01

    Reliability of test scores is one of the most pervasive psychometric concepts in measurement. Reliability coefficients based on a unifactor model for continuous indicators include maximal reliability rho and an unweighted sum score-based omega, among many others. With increasing popularity of item response theory, a parallel reliability measure pi…
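
    For context, the unifactor-model coefficients named here have closed forms in the item loadings λ_i and unique variances θ_i; the unweighted sum score-based omega, for instance, is:

        \omega = \frac{\left(\sum_i \lambda_i\right)^2}
                      {\left(\sum_i \lambda_i\right)^2 + \sum_i \theta_i}

    Maximal reliability ρ instead weights the items optimally, so ρ ≥ ω for the same fitted model.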

  10. Putting "Organizations" into an Organization Theory Course: A Hybrid CAO Model for Teaching Organization Theory

    ERIC Educational Resources Information Center

    Hannah, David R.; Venkatachary, Ranga

    2010-01-01

    In this article, the authors present a retrospective analysis of an instructor's multiyear redesign of a course on organization theory into what is called a hybrid Classroom-as-Organization model. It is suggested that this new course design served to apprentice students to function in quasi-real organizational structures. The authors further argue…

  11. A Postmodern Theory of Knowledge Organization.

    ERIC Educational Resources Information Center

    Mai, Jens-Erik

    1999-01-01

    Suggests a postmodern theory regarding knowledge organizations as active constructions of a perceived conception of particular discourse communities in the company, organization or knowledge fields for which the knowledge organization is intended. In this view, the interpretive process in knowledge organization and the culture and social context…

  12. Theory of reliable systems. [systems analysis and design

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1973-01-01

    The analysis and design of reliable systems are discussed. The attributes of system reliability studied are fault tolerance, diagnosability, and reconfigurability. Objectives of the study include: to determine properties of system structure that are conducive to a particular attribute; to determine methods for obtaining reliable realizations of a given system; and to determine how properties of system behavior relate to the complexity of fault tolerant realizations. A list of 34 references is included.

  13. Educational Management Organizations as High Reliability Organizations: A Study of Victory's Philadelphia High School Reform Work

    ERIC Educational Resources Information Center

    Thomas, David E.

    2013-01-01

    This executive position paper proposes recommendations for designing reform models between public and private sectors dedicated to improving school reform work in low performing urban high schools. It reviews scholarly research about for-profit educational management organizations, high reliability organizations, American high school reform, and…

  14. Conceptualizing Essay Tests' Reliability and Validity: From Research to Theory

    ERIC Educational Resources Information Center

    Badjadi, Nour El Imane

    2013-01-01

    The current paper on writing assessment surveys the literature on the reliability and validity of essay tests. The paper aims to examine the two concepts in relationship with essay testing as well as to provide a snapshot of the current understandings of the reliability and validity of essay tests as drawn in recent research studies. Bearing in…

  15. Universities as Theory Z Organizations.

    ERIC Educational Resources Information Center

    McQuillen, Charles D.

    1982-01-01

    Contrasts in the approaches of a Japanese company and the American university to management development provide the basis for a discussion of Theory Z's potential application to faculty affairs. Among the issues discussed are the tenure and promotion system, collegial decision making and responsibility, faculty development and incentives, and…

  16. A PERSPECTIVE ON RELIABILITY: PROBABILITY THEORY AND BEYOND

    SciTech Connect

    J. M. BOOKER; N. D. SINGPURWALLA

    2001-05-01

    Reliability assessment in the coming era is inclined to be characterized by a difficult dilemma. On the one hand, units and systems will be required to be ultra-reliable; on the other hand, it may not be possible to subject them to full-scale testing. A case in point occurs where testing is limited, as with one-of-a-kind complex systems such as space exploration vehicles, or where severe testing constraints are imposed, such as full-scale testing of strategic nuclear weapons prohibited by test ban treaties and international agreements. Decision makers also require reliability assessments for problems with terabytes of data, such as from complex simulations of system performance. Quantitative measures of reliability and their associated uncertainties will remain integral to system monitoring and tactical decision making. The challenge is to derive these defensible measures in light of these dilemmas. Because reliability is usually defined as a probability that the system performs to its required specification, probability enters into the heart of these dilemmas, both philosophically and practically. This paper provides an overview of the several interpretations of probability as they relate to reliability and to the uncertainties involved. The philosophical issues pertain to the interpretation and the quantification of reliability. For example, how must we interpret a number like 10⁻⁹ for the failure rate of an airplane flight or an electrical power plant? Such numbers are common, particularly in the context of safety. Does it mean one failure in 10⁹ identical, or almost identical, trials? Are identical trials physically possible, let alone the fact that 10⁹ trials can take generations to perform? How can we make precise the notion of almost identical trials? If the trials are truly identical, then all of them must produce the same outcome, and so the reliability must be either one or zero. However, tautologies, like certainty and impossibility, can…

  17. Further Progress toward Theory in Knowledge Organization.

    ERIC Educational Resources Information Center

    Smiraglia, Richard P.

    2001-01-01

    Discussion of theory focuses on knowledge organization and the generation of theory and research in three specific areas: author productivity and the distribution of name headings; the work phenomenon and association with Lotka's Law; and external validity in the examination of knowledge entities. (Author/LRW)

  18. 18 CFR 39.10 - Changes to an Electric Reliability Organization Rule or Regional Entity Rule.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... with the Commission for approval any proposed Electric Reliability Organization Rule or Rule change. A Regional Entity shall submit a Regional Entity Rule or Rule change to the Electric Reliability Organization... or upon complaint, may propose a change to an Electric Reliability Organization Rule or...

  19. 18 CFR 39.10 - Changes to an Electric Reliability Organization Rule or Regional Entity Rule.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... with the Commission for approval any proposed Electric Reliability Organization Rule or Rule change. A Regional Entity shall submit a Regional Entity Rule or Rule change to the Electric Reliability Organization... or upon complaint, may propose a change to an Electric Reliability Organization Rule or...

  20. A Research of Weapon System Storage Reliability Simulation Method Based on Fuzzy Theory

    NASA Astrophysics Data System (ADS)

    Shi, Yonggang; Wu, Xuguang; Chen, Haijian; Xu, Tingxue

    Aimed at the problem of storage reliability analysis for new, complicated weapon equipment systems, this paper investigates methods of fuzzy fault tree analysis and fuzzy system storage reliability simulation, discusses an approach that regards the weapon system as a fuzzy system, and studies the storage reliability of weapon systems based on fuzzy theory, thereby providing a storage reliability research method for new, complicated weapon equipment systems. As an example, the fuzzy fault tree of one type of missile control instrument is built up based on function analysis, and the method of fuzzy system storage reliability simulation is used to analyze the storage reliability index of the control instrument.
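
    As a rough illustration of the technique (the gate structure and numbers below are invented, since the paper's tree is not reproduced here), basic-event probabilities can be carried as triangular fuzzy numbers (low, mode, high) through AND/OR gates:

        def fuzzy_and(*events):
            # AND gate: component-wise product of fuzzy failure probabilities
            low = mode = high = 1.0
            for (l, m, h) in events:
                low, mode, high = low * l, mode * m, high * h
            return (low, mode, high)

        def fuzzy_or(*events):
            # OR gate: 1 - product of complements, component-wise
            cl = cm = ch = 1.0
            for (l, m, h) in events:
                cl, cm, ch = cl * (1 - l), cm * (1 - m), ch * (1 - h)
            return (1 - cl, 1 - cm, 1 - ch)

        # Invented storage-failure tree for a control instrument:
        # top = (seal degradation OR humidity ingress) AND circuit drift
        seal, humid, drift = (0.01, 0.02, 0.04), (0.005, 0.01, 0.03), (0.02, 0.03, 0.05)
        top = fuzzy_and(fuzzy_or(seal, humid), drift)
        print(tuple(round(x, 5) for x in top))  # (0.0003, 0.00089, 0.00344)

    The fuzzy spread of the top event then expresses how imprecise failure-rate knowledge for stored (rarely exercised) equipment propagates into the system-level storage reliability index.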

  1. Bi-Factor Multidimensional Item Response Theory Modeling for Subscores Estimation, Reliability, and Classification

    ERIC Educational Resources Information Center

    Md Desa, Zairul Nor Deana

    2012-01-01

    In recent years, there has been increasing interest in estimating and improving subscore reliability. In this study, the multidimensional item response theory (MIRT) and the bi-factor model were combined to estimate subscores, to obtain subscores reliability, and subscores classification. Both the compensatory and partially compensatory MIRT…

  2. Using Metaphors to Teach Organization Theory

    ERIC Educational Resources Information Center

    Taber, Tom D.

    2007-01-01

    Metaphors were used to teach systems thinking and to clarify concepts of organizational theory in an introductory MBA management course. Gareth Morgan's metaphors of organization were read by students and applied as frames to analyze a business case. In addition, personal metaphors were written by individual students in order to describe the…

  3. The Progress of Theory in Knowledge Organization.

    ERIC Educational Resources Information Center

    Smiraglia, Richard P.

    2002-01-01

    Presents a background on theory in knowledge organization, which has moved from an epistemic stance of pragmatism and rationalism (based on observation of the construction of retrieval tools), to empiricism (based on the results of empirical research). Discusses historicism, external validity, classification, user-interface design, and…

  4. Reliability of the Measure of Acceptance of the Theory of Evolution (MATE) Instrument with University Students

    ERIC Educational Resources Information Center

    Rutledge, Michael L.; Sadler, Kim C.

    2007-01-01

    The Measure of Acceptance of the Theory of Evolution (MATE) instrument was initially designed to assess high school biology teachers' acceptance of evolutionary theory. To determine if the MATE instrument is reliable with university students, it was administered to students in a non-majors biology course (n = 61) twice over a 3-week period.…

  5. Electronic-Structure Theory of Organic Semiconductors: Charge-Transport Parameters and Metal/Organic Interfaces

    NASA Astrophysics Data System (ADS)

    Coropceanu, Veaceslav; Li, Hong; Winget, Paul; Zhu, Lingyun; Brédas, Jean-Luc

    2013-07-01

    We focus this review on the theoretical description, at the density functional theory level, of two key processes that are common to electronic devices based on organic semiconductors (such as organic light-emitting diodes, field-effect transistors, and solar cells), namely charge transport and charge injection from electrodes. By using representative examples of current interest, our main goal is to introduce some of the reliable theoretical methodologies that can best depict these processes. We first discuss the evaluation of the microscopic parameters that determine charge-carrier transport in organic molecular crystals, i.e., electronic couplings and electron-vibration couplings. We then examine the electronic structure at interfaces between an organic layer and a metal or conducting oxide electrode, with an emphasis on the work-function modifications induced by the organic layer and on the interfacial energy-level alignments.
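
    For orientation, in the hopping picture commonly applied to such molecular crystals these two microscopic parameters enter the semiclassical Marcus rate for charge transfer between neighboring molecules (a standard expression, quoted here for context rather than taken from the review):

        k_{ET} = \frac{2\pi}{\hbar}\, t^2\,
                 \frac{1}{\sqrt{4\pi \lambda k_B T}}\,
                 \exp\!\left[-\frac{(\Delta G^{0} + \lambda)^2}{4 \lambda k_B T}\right]

    with t the electronic coupling (transfer integral) and λ the reorganization energy arising from electron-vibration coupling, which is why the review concentrates on computing exactly these quantities at the DFT level.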

  6. Mathematic Modeling of Complex Hydraulic Machinery Systems When Evaluating Reliability Using Graph Theory

    NASA Astrophysics Data System (ADS)

    Zemenkova, M. Yu; Shipovalov, A. N.; Zemenkov, Yu D.

    2016-04-01

    The main technological equipment of pipeline transport of hydrocarbons is hydraulic machines. Oil transportation mainly uses centrifugal pumps designed to work in the “pumping station-pipeline” system. A standard pumping station consists of several pumps and complex hydraulic piping. The authors have developed a set of models and algorithms for calculating the system reliability of pumps, based on reliability theory. As an example, one of the estimation methods is considered with the application of graph theory.
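
    A minimal sketch of one such estimate, with invented figures: a station holding n identical pumps that needs at least k of them running is a k-out-of-n system, whose availability follows from the binomial distribution.

        from math import comb

        def k_out_of_n(k, n, p):
            # Probability that at least k of n independent pumps
            # (each available with probability p) are operating
            return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

        # Station with 4 pumps, at least 3 required, 95% per-pump availability
        print(round(k_out_of_n(3, 4, 0.95), 4))  # 0.986

    Graph-based methods generalize this by composing such station blocks in series and parallel along the pipeline network.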

  7. Reliability analysis of the objective structured clinical examination using generalizability theory

    PubMed Central

    Trejo-Mejía, Juan Andrés; Sánchez-Mendiola, Melchor; Méndez-Ramírez, Ignacio; Martínez-González, Adrián

    2016-01-01

    Background: The objective structured clinical examination (OSCE) is a widely used method for assessing clinical competence in health sciences education. Studies using this method have shown evidence of validity and reliability. There are no published studies of OSCE reliability measurement with generalizability theory (G-theory) in Latin America. The aims of this study were to assess the reliability of an OSCE in medical students using G-theory and explore its usefulness for quality improvement. Methods: An observational cross-sectional study was conducted at National Autonomous University of Mexico (UNAM) Faculty of Medicine in Mexico City. A total of 278 fifth-year medical students were assessed with an 18-station OSCE in a summative end-of-career final examination. There were four exam versions. G-theory with a crossover random effects design was used to identify the main sources of variance. Examiners, standardized patients, and cases were considered as a single facet of analysis. Results: The exam was applied to 278 medical students. The OSCE had a generalizability coefficient of 0.93. The major components of variance were stations, students, and residual error. The sites and the versions of the tests had minimum variance. Conclusions: Our study achieved a G coefficient similar to that found in other reports, which is acceptable for summative tests. G-theory allows the estimation of the magnitude of multiple sources of error and helps decision makers to determine the number of stations, test versions, and examiners needed to obtain reliable measurements. PMID:27543188
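
    For readers unfamiliar with the statistic, in a simple one-facet persons-by-stations design the (relative) generalizability coefficient reported here takes the form:

        E\rho^2 = \frac{\sigma^2_p}{\sigma^2_p + \sigma^2_{ps,e} / n_s}

    where σ²_p is the variance among students, σ²_{ps,e} the student-station interaction confounded with residual error, and n_s the number of stations; decision studies vary n_s (and, in richer designs, examiners and versions) to find the cheapest configuration that keeps Eρ² acceptable.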

  8. Some Characteristics of One Type of High Reliability Organization.

    ERIC Educational Resources Information Center

    Roberts, Karlene H.

    1990-01-01

    Attempts to define organizational processes necessary to operate safely technologically complex organizations. Identifies nuclear powered aircraft carriers as examples of potentially hazardous organizations with histories of excellent operations. Discusses how carriers deal with components of risk and antecedents to catastrophe cited by Perrow and…

  9. Two Prophecy Formulas for Assessing the Reliability of Item Response Theory-Based Ability Estimates

    ERIC Educational Resources Information Center

    Raju, Nambury S.; Oshima, T.C.

    2005-01-01

    Two new prophecy formulas for estimating item response theory (IRT)-based reliability of a shortened or lengthened test are proposed. Some of the relationships between the two formulas, one of which is identical to the well-known Spearman-Brown prophecy formula, are examined and illustrated. The major assumptions underlying these formulas are…
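
    The classical formula referenced (the abstract notes that one of the two proposed formulas is identical to it) predicts the reliability of a test lengthened by a factor k from its current reliability ρ_xx':

        \rho_{kk'} = \frac{k\,\rho_{xx'}}{1 + (k - 1)\,\rho_{xx'}}

    For example, doubling a test (k = 2) with reliability 0.70 predicts 2(0.70)/(1 + 0.70) ≈ 0.82.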

  10. Assessing Academic Advising Outcomes Using Social Cognitive Theory: A Validity and Reliability Study

    ERIC Educational Resources Information Center

    Erlich, Richard J.; Russ-Eft, Darlene F.

    2012-01-01

    The validity and reliability of three instruments, the "Counselor Rubric for Gauging Student Understanding of Academic Planning," micro-analytic questions, and the "Student Survey for Understanding Academic Planning," all based on social cognitive theory, were tested as means to assess self-efficacy and self-regulated learning in college academic…

  11. Estimating Reliability of School-Level Scores Using Multilevel and Generalizability Theory Models

    ERIC Educational Resources Information Center

    Jeon, Min-Jeong; Lee, Guemin; Hwang, Jeong-Won; Kang, Sang-Jin

    2009-01-01

    The purpose of this study was to investigate the methods of estimating the reliability of school-level scores using generalizability theory and multilevel models. Two approaches, "student within schools" and "students within schools and subject areas," were conceptualized and implemented in this study. Four methods resulting from the combination…

  12. 77 FR 59745 - Delegation of Authority Regarding Electric Reliability Organization's Budget, Delegation...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-01

    …No. 486, 52 FR 47897 (Dec. 17, 1987), FERC Stats. & Regs. ¶ 30,783 (1987); 18 CFR 380.4(a)(1)… Federal Energy Regulatory Commission, 18 CFR Part 375, Delegation of Authority Regarding Electric Reliability… responsibilities for specific Electric Reliability Organization (ERO) filings. In particular, this Final…

  13. Using chemical organization theory for model checking

    PubMed Central

    Kaleta, Christoph; Richter, Stephan; Dittrich, Peter

    2009-01-01

    Motivation: The increasing number and complexity of biomodels makes automatic procedures for checking the models' properties and quality necessary. Approaches like elementary mode analysis, flux balance analysis, deficiency analysis and chemical organization theory (OT) require only the stoichiometric structure of the reaction network for derivation of valuable information. In formalisms like Systems Biology Markup Language (SBML), however, information about the stoichiometric coefficients required for an analysis of chemical organizations can be hidden in kinetic laws. Results: First, we introduce an algorithm that uncovers stoichiometric information that might be hidden in the kinetic laws of a reaction network. This allows us to apply OT to SBML models using modifiers. Second, using the new algorithm, we performed a large-scale analysis of the 185 models contained in the manually curated BioModels Database. We found that for 41 models (22%) the set of organizations changes when modifiers are considered correctly. We discuss one of these models in detail (BIOMD149, a combined model of the ERK- and Wnt-signaling pathways), whose set of organizations drastically changes when modifiers are considered. Third, we found inconsistencies in 5 models (3%) and identified their characteristics. Compared with flux-based methods, OT is able to identify those species and reactions more accurately [in 26 cases (14%)] that can be present in a long-term simulation of the model. We conclude that our approach is a valuable tool that helps to improve the consistency of biomodels and their repositories. Availability: All data and a JAVA applet to check SBML-models is available from http://www.minet.uni-jena.de/csb/prj/ot/tools Contact: dittrich@minet.uni-jena.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:19468053
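
    A tiny sketch of the closure half of an organization check (the self-maintenance condition additionally requires a linear-programming step over the stoichiometric matrix and is omitted; the network below is made up):

        def closure(species, reactions):
            # Grow the set until it is closed: whenever all reactants of a
            # reaction are present, its products must be present too.
            closed, changed = set(species), True
            while changed:
                changed = False
                for reactants, products in reactions:
                    if reactants <= closed and not products <= closed:
                        closed |= products
                        changed = True
            return closed

        # Made-up network: a -> b, a + b -> c, c -> a
        reactions = [({"a"}, {"b"}), ({"a", "b"}, {"c"}), ({"c"}, {"a"})]
        print(sorted(closure({"a"}, reactions)))  # ['a', 'b', 'c']

    The paper's point about modifiers is that SBML kinetic laws can hide species on both sides of a reaction, so the reactant/product sets fed into a computation like this must first be reconstructed from the kinetics.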

  14. Sensor reliability evaluation scheme for target classification using belief function theory.

    PubMed

    Zhu, Jing; Luo, Yupin; Zhou, Jianjun

    2013-01-01

    In the target classification based on belief function theory, sensor reliability evaluation has two basic issues: reasonable dissimilarity measure among evidences, and adaptive combination of static and dynamic discounting. One solution to the two issues has been proposed here. Firstly, an improved dissimilarity measure based on dualistic exponential function has been designed. We assess the static reliability from a training set by the local decision of each sensor and the dissimilarity measure among evidences. The dynamic reliability factors are obtained from each test target using the dissimilarity measure between the output information of each sensor and the consensus. Secondly, an adaptive combination method of static and dynamic discounting has been introduced. We adopt Parzen-window to estimate the matching degree of current performance and static performance for the sensor. Through fuzzy theory, the fusion system can realize self-learning and self-adapting with the sensor performance changing. Experiments conducted on real databases demonstrate that our proposed scheme performs better in target classification under different target conditions compared with other methods. PMID:24351632
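
    The classical discounting operation underlying static/dynamic schemes like this one (a generic sketch, not the authors' full method) scales each focal element by the reliability factor and transfers the remainder to the whole frame of discernment:

        def discount(mass, alpha, frame):
            # Shafer discounting: m'(A) = alpha * m(A) for A != frame,
            # and m'(frame) absorbs the discounted mass 1 - alpha.
            out = {focal: alpha * v for focal, v in mass.items()}
            out[frame] = out.get(frame, 0.0) + (1.0 - alpha)
            return out

        frame = frozenset({"A", "B", "C"})
        m = {frozenset({"A"}): 0.6, frozenset({"B"}): 0.3, frame: 0.1}
        print(discount(m, alpha=0.8, frame=frame))
        # masses become ~0.48, 0.24, and 0.28 on the frame

    An unreliable sensor (small alpha) thus contributes mostly ignorance rather than misleading belief when the discounted evidence is later combined.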

  16. Generalizability theory reliability of written expression curriculum-based measurement in universal screening.

    PubMed

    Keller-Margulis, Milena A; Mercer, Sterett H; Thomas, Erin L

    2016-09-01

    The purpose of this study was to examine the reliability of written expression curriculum-based measurement (WE-CBM) in the context of universal screening from a generalizability theory framework. Students in second through fifth grade (n = 145) participated in the study. The sample included 54% female students, 49% White students, 23% African American students, 17% Hispanic students, 8% Asian students, and 3% of students identified as 2 or more races. Of the sample, 8% were English Language Learners and 6% were students receiving special education. Three WE-CBM probes were administered for 7 min each at 3 time points across 1 year. Writing samples were scored for commonly used WE-CBM metrics (e.g., correct minus incorrect word sequences; CIWS). Results suggest that nearly half the variance in WE-CBM is related to unsystematic error and that conventional screening procedures (i.e., the use of one 3-min sample) do not yield scores with adequate reliability for relative or absolute decisions about student performance. In most grades, three 3-min writing samples (or 2 longer duration samples) were required for adequate reliability for relative decisions, and three 7-min writing samples would not yield adequate reliability for relative decisions about within-year student growth. Implications and recommendations are discussed. PMID:26322656

  17. Influencing Organizations to Promote Health: Applying Stakeholder Theory

    ERIC Educational Resources Information Center

    Kok, Gerjo; Gurabardhi, Zamira; Gottlieb, Nell H.; Zijlstra, Fred R. H.

    2015-01-01

    Stakeholder theory may help health promoters to make changes at the organizational and policy level to promote health. A stakeholder is any individual, group, or organization that can influence an organization. The organization that is the focus for influence attempts is called the focal organization. The more salient a stakeholder is and the more…

  18. Reliability assessment of different plate theories for elastic wave propagation analysis in functionally graded plates.

    PubMed

    Mehrkash, Milad; Azhari, Mojtaba; Mirdamadi, Hamid Reza

    2014-01-01

    The importance of the elastic wave propagation problem in plates arises from the application of ultrasonic elastic waves in the non-destructive evaluation of plate-like structures. However, precise study and analysis of acoustic guided waves, especially in non-homogeneous waveguides such as functionally graded plates, are so complicated that exact elastodynamic methods are rarely employed in practical applications. Thus, simple approximate plate theories have attracted much interest for the calculation of wave fields in FGM plates. Therefore, in the current research, the classical plate theory (CPT), first-order shear deformation theory (FSDT) and third-order shear deformation theory (TSDT) are used to obtain the transient responses of flexural waves in FGM plates subjected to transverse impulsive loadings. Moreover, by comparing the results with those based on a well-recognized hybrid numerical method (HNM), we examine the accuracy of the plate theories for several plates of various thicknesses under excitations of different frequencies. The material properties of the plate are assumed to vary across the plate thickness according to a simple power-law distribution in terms of volume fractions of constituents. In all analyses, the spatial Fourier transform together with modal analysis is applied to compute displacement responses of the plates. A comparison of the results demonstrates the reliability ranges of the approximate plate theories for elastic wave propagation analysis in FGM plates. Furthermore, based on various examples, it is shown that whenever the plate theories are used within the appropriate ranges of plate thickness and frequency content, the solution process in the wavenumber-time domain based on the modal analysis approach is not only sufficient but also efficient for finding the transient waveforms in FGM plates.
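
    As a point of reference for why the simpler theories degrade with thickness and frequency, classical (Kirchhoff) plate theory gives flexural waves the dispersion relation:

        \omega = k^2 \sqrt{\frac{D}{\rho h}}, \qquad D = \frac{E h^3}{12\,(1 - \nu^2)}

    so the predicted phase velocity grows without bound with wavenumber k, whereas FSDT and TSDT bound it and track the exact elastodynamic solution further into the thick-plate, high-frequency regime (for an FGM plate, E, ν and ρ here stand for effective, thickness-graded properties).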

  19. In search of principles for a Theory of Organisms.

    PubMed

    Longo, Giuseppe; Montevil, Mael; Sonnenschein, Carlos; Soto, Ana M

    2015-12-01

    Lacking an operational theory to explain the organization and behaviour of matter in unicellular and multicellular organisms hinders progress in biology. Such a theory should address life cycles from ontogenesis to death. This theory would complement the theory of evolution that addresses phylogenesis, and would posit theoretical extensions to accepted physical principles and default states in order to grasp the living state of matter and define proper biological observables. Thus, we favour adopting the default state implicit in Darwin's theory, namely, cell proliferation with variation plus motility, and a framing principle, namely, life phenomena manifest themselves as non-identical iterations of morphogenetic processes. From this perspective, organisms become a consequence of the inherent variability generated by proliferation, motility and self-organization. Morphogenesis would then be the result of the default state plus physical constraints, like gravity, and those present in living organisms, like muscular tension. PMID:26648040

  1. High Reliability Organizations and Transformational Leadership as Lenses for Examining a School Improvement Effort.

    ERIC Educational Resources Information Center

    Taylor, Dianne L.; Angelle, Pamela S.

    A matrix of characteristics associated with transformational leadership and with high reliability organizations was developed. Using the matrix as a lens, researchers examined a successful school involved in a school improvement effort to understand the success more fully. Transformational leaders provide opportunities for personal growth for…

  2. 75 FR 14097 - Revision to Electric Reliability Organization Definition of Bulk Electric System

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-24

    …Environmental Policy Act, Order No. 486, 52 FR 47897 (Dec. 17, 1987), FERC Stats. & Regs. Preambles 1986… Department of Energy, Federal Energy Regulatory Commission, 18 CFR Part 40 [¶ 61,204]: Revision to Electric Reliability Organization Definition of Bulk Electric System, March 18, 2010. AGENCY: Federal Energy…

  3. 75 FR 72909 - Revision to Electric Reliability Organization Definition of Bulk Electric System

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-26

    …Reliability Organization Definition of Bulk Electric System, Notice of Proposed Rulemaking, 75 FR 14097 (Mar. 24, 2010)… effective January 25, 2011. FOR FURTHER INFORMATION CONTACT: Robert V. Snow (Technical Information), Office… Electric System, Notice of Proposed Rulemaking, 75 FR 14097 (Mar. 24, 2010), FERC Stats. & Regs. ¶…

  4. [Employees in high-reliability organizations: systematic selection of personnel as a final criterion].

    PubMed

    Oubaid, V; Anheuser, P

    2014-05-01

    Employees represent an important safety factor in high-reliability organizations. The combination of clear organizational structures, a nonpunitive safety culture, and psychological personnel selection guarantee a high level of safety. The cockpit personnel selection process of a major German airline is presented in order to demonstrate a possible transferability into medicine and urology.

  6. Using the Reliability Theory for Assessing the Decision Confidence Probability for Comparative Life Cycle Assessments.

    PubMed

    Wei, Wei; Larrey-Lassalle, Pyrène; Faure, Thierry; Dumoulin, Nicolas; Roux, Philippe; Mathias, Jean-Denis

    2016-03-01

    Comparative decision making is widely used to identify which option (system, product, service, etc.) has the smaller environmental footprint and to provide recommendations that help stakeholders take future decisions. However, uncertainty complicates both the comparison and the decision making. Probability-based decision support in LCA is a way to help stakeholders in their decision-making process. It calculates the decision confidence probability, which expresses the probability that one option has a smaller environmental impact than another. Here we apply reliability theory to approximate the decision confidence probability, comparing the traditional Monte Carlo method with the first-order reliability method (FORM). The Monte Carlo method needs high computational time to calculate the decision confidence probability. The FORM method approximates the decision confidence probability with fewer simulations than the Monte Carlo method by approximating the response surface. Moreover, the FORM method calculates the associated importance factors, which correspond to a sensitivity analysis with respect to the probability and allow stakeholders to determine which factors influence their decision. Our results clearly show that the reliability method provides additional useful information to stakeholders while also reducing computational time. PMID:26815724
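
    The record gives no equations, but the contrast it draws can be sketched in a few lines. In the simplest possible case, a linear limit state g = impact_B - impact_A with independent normal inputs (all variable names and numbers below are hypothetical, not taken from the paper), FORM collapses to a closed-form reliability index while Monte Carlo must draw many samples:

```python
import numpy as np
from scipy.stats import norm

# Hypothetical comparison of two options with uncertain impact scores.
# Limit state: g = impact_B - impact_A (g > 0 means option A is better).
mu_a, sd_a = 8.5, 1.5    # option A: mean impact and uncertainty
mu_b, sd_b = 10.0, 2.0   # option B: mean impact and uncertainty

# Monte Carlo: sample both impacts many times and count how often A wins.
rng = np.random.default_rng(0)
n = 100_000
g = rng.normal(mu_b, sd_b, n) - rng.normal(mu_a, sd_a, n)
p_mc = np.mean(g > 0)

# FORM: for a linear limit state with independent normal inputs, the
# reliability index beta = E[g]/std[g] gives the probability in closed
# form, with no sampling at all.
beta = (mu_b - mu_a) / np.hypot(sd_a, sd_b)
p_form = norm.cdf(beta)

print(f"Monte Carlo: {p_mc:.4f}  FORM: {p_form:.4f}")
```

    For nonlinear limit states, FORM instead linearizes at the most probable failure point, which is where the savings over Monte Carlo described in the abstract come from.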

  6. Achieving a high-reliability organization through implementation of the ARCC model for systemwide sustainability of evidence-based practice.

    PubMed

    Melnyk, Bernadette Mazurek

    2012-01-01

    High-reliability health care organizations are those that provide care that is safe, that minimizes errors, and that achieves exceptional performance in quality and safety. This article presents major concepts and characteristics of a patient safety culture and a high-reliability health care organization, and explains how building a culture of evidence-based practice can assist organizations in achieving high reliability. The ARCC (Advancing Research and Clinical practice through close Collaboration) model for systemwide implementation and sustainability of evidence-based practice is highlighted as a key strategy for achieving high reliability in health care organizations.

  7. Enhance the lifetime and bias stress reliability in organic vertical transistor by UV/Ozone treatment

    NASA Astrophysics Data System (ADS)

    Lin, Hung-Cheng; Chang, Ming-Yu; Zan, Hsiao-Wen; Meng, Hsin-Fei; Chao, Yu-Chiang

    In this paper, we use UV/Ozone treatment to improve the lifetime and bias stress reliability of organic transistors with a vertical channel. Although vertical organic transistors exhibit better bias stress reliability than organic field-effect transistors (OFETs) owing to their bulk conduction mechanism, poor lifetime performance remains a challenge. Treating the vertical channel with octadecyltrichlorosilane (OTS) reduces trapping states and hence improves bias stress stability; however, the off-current rises markedly after 6 days and lifetime performance degrades. In contrast, after 4000 s of on-state bias stress, UV/Ozone-treated vertical channels show a stable output current and on/off current ratio, with a threshold voltage shift of only -0.02 V, much smaller than that of an OFET with the same organic semiconductor material. Furthermore, the output current is enhanced by an order of magnitude, and, unlike OTS-treated devices, UV/Ozone-treated devices show no obvious degradation even after 170 days. With UV/Ozone treatment, the output current, bias stress reliability and lifetime are all improved, making the vertical transistor a promising device for future applications in display technology and flexible electronics.

  8. Organizations and Social Systems: Organization Theory's Neglected Mandate.

    ERIC Educational Resources Information Center

    Stern, Robert N.; Barley, Stephen R.

    1996-01-01

    The social-systems perspective in organizational theory faded because the increasing complexity of social relations hindered determination of an appropriate unit of analysis. Also, the business-school environment in which organizational research occurred discouraged examination of broad social questions, promoted a particular approach to science,…

  9. Influencing organizations to promote health: applying stakeholder theory.

    PubMed

    Kok, Gerjo; Gurabardhi, Zamira; Gottlieb, Nell H; Zijlstra, Fred R H

    2015-04-01

    Stakeholder theory may help health promoters to make changes at the organizational and policy level to promote health. A stakeholder is any individual, group, or organization that can influence an organization. The organization that is the focus for influence attempts is called the focal organization. The more salient a stakeholder is and the more central in the network, the stronger the influence. As stakeholders, health promoters may use communicative, compromise, deinstitutionalization, or coercive methods through an ally or a coalition. A hypothetical case study, involving adolescent use of harmful legal products, illustrates the process of applying stakeholder theory to strategic decision making. PMID:25829111

  10. Cross Cultural Perspectives of the Learning Organization: Assessing the Validity and Reliability of the DLOQ in Korea

    ERIC Educational Resources Information Center

    Song, Ji Hoon; Kim, Jin Yong; Chermack, Thomas J.; Yang, Baiyin

    2008-01-01

    The primary purpose of this research was to adapt the Dimensions of Learning Organization Questionnaire (DLOQ) from Watkins and Marsick (1993, 1996) and examine its validity and reliability in a Korean context. Results indicate that the DLOQ produces valid and reliable scores of learning organization characteristics in a Korean cultural context.…

  11. Teaching Organization Theory and Practice: An Experiential and Reflective Approach

    ERIC Educational Resources Information Center

    Cameron, Mark; Turkiewicz, Rita M.; Holdaway, Britt A.; Bill, Jacqueline S.; Goodman, Jessica; Bonner, Aisha; Daly, Stacey; Cohen, Michael D.; Lorenz, Cassandra; Wilson, Paul R.; Rusk, James

    2009-01-01

    The organization is often the overlooked level in social work's ecological perspective. However, organizational realities exert a profound influence on human development and well-being as well as the nature and quality of social work practice. This article describes a model of teaching organization theory and practice which requires master's…

  12. Estimation of the reliability of all-ceramic crowns using finite element models and the stress-strength interference theory.

    PubMed

    Li, Yan; Chen, Jianjun; Liu, Jipeng; Zhang, Lei; Wang, Weiguo; Zhang, Shaofeng

    2013-09-01

    The reliability of all-ceramic crowns is of concern to both patients and doctors. This study introduces a new methodology for quantifying the reliability of all-ceramic crowns based on the stress-strength interference theory and finite element models. The variables selected for the reliability analysis include the magnitude of the occlusal contact area, the occlusal load and the residual thermal stress. The calculated reliabilities of crowns under different loading conditions showed that overly small occlusal contact areas or overly large differences in thermal expansion coefficient between the veneer and core layers led to high failure probabilities. These results are consistent with many previous reports. The methodology is therefore shown to be a valuable tool for analyzing the reliability of restorations in the complicated oral environment.
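
    For readers unfamiliar with stress-strength interference: reliability is the probability that strength exceeds stress, obtained by integrating the stress density against the strength survival function. A minimal sketch with hypothetical normal distributions follows (the paper's actual inputs come from finite element models and are not reproduced in the record):

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

# Hypothetical normal stress and strength distributions (MPa).
mu_stress, sd_stress = 300.0, 40.0      # stress from occlusal loading
mu_strength, sd_strength = 450.0, 60.0  # strength of the ceramic

# Interference integral: R = P(strength > stress)
#   R = integral of f_stress(x) * P(strength > x) dx
integrand = lambda x: (norm.pdf(x, mu_stress, sd_stress)
                       * norm.sf(x, mu_strength, sd_strength))
r_num, _ = quad(integrand,
                mu_stress - 8 * sd_stress,
                mu_strength + 8 * sd_strength)

# Closed form when both distributions are normal
r_exact = norm.cdf((mu_strength - mu_stress)
                   / np.hypot(sd_stress, sd_strength))

print(f"numerical: {r_num:.4f}  closed form: {r_exact:.4f}")
```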

  13. Measuring theory of mind across middle childhood: Reliability and validity of the Silent Films and Strange Stories tasks.

    PubMed

    Devine, Rory T; Hughes, Claire

    2016-09-01

    Recent years have seen a growth of research on the development of children's ability to reason about others' mental states (or "theory of mind") beyond the narrow confines of the preschool period. The overall aim of this study was to investigate the psychometric properties of a task battery composed of items from Happé's Strange Stories task and Devine and Hughes' Silent Film task. A sample of 460 ethnically and socially diverse children (211 boys) between 7 and 13 years of age completed the task battery at two time points separated by 1 month. The Strange Stories and Silent Film tasks were strongly correlated even when verbal ability and narrative comprehension were taken into account, and all items loaded onto a single theory-of-mind latent factor. The theory-of-mind latent factor provided reliable estimates of performance across a wide range of theory-of-mind ability and showed no evidence of differential item functioning across gender, ethnicity, or socioeconomic status. The theory-of-mind latent factor also exhibited strong 1-month test-retest reliability, and this stability did not vary as a function of child characteristics. Taken together, these findings provide evidence for the validity and reliability of the Strange Stories and Silent Film task battery as a measure of individual differences in theory of mind suitable for use across middle childhood. We consider the methodological and conceptual implications of these findings for research on theory of mind beyond the preschool years.

  14. Reliability-based robust design optimization of vehicle components, Part I: Theory

    NASA Astrophysics Data System (ADS)

    Zhang, Yimin

    2015-06-01

    Reliability-based design optimization, reliability sensitivity analysis and robust design methods are combined to present a practical and effective approach for the reliability-based robust design optimization of vehicle components, and a corresponding procedure is proposed. Application of the method is illustrated by the reliability-based robust design optimization of an axle and a spring. Numerical results show that the proposed method can be trusted to perform reliability-based robust design optimization of vehicle components.

  15. Safeguarding patients: complexity science, high reliability organizations, and implications for team training in healthcare.

    PubMed

    McKeon, Leslie M; Oswaks, Jill D; Cunningham, Patricia D

    2006-01-01

    Serious events occur daily within healthcare, exposing the failure of the system to safeguard patients and providers. The complex nature of healthcare contributes to myriad ambiguities affecting the quality of nursing care and patient outcomes. Because of slow rates of improvement in patient safety and insufficient application of evidence-based research in practice, leaders in healthcare organizations are looking outside the industry for ways to improve care. Military and aviation industry strategies are recognized by clinicians in high-risk care settings such as the operating room, emergency department, and intensive care unit as having great potential to create safe and effective systems of care. Complexity science forms the basis for high reliability teams to recognize even the most minor variances in expected outcomes and take strong action to prevent serious error from occurring. Cultural and system barriers to achieving high reliability performance within healthcare, and implications for team training, are discussed.

  16. A modelling approach to find stable and reliable soil organic carbon values for further regionalization.

    NASA Astrophysics Data System (ADS)

    Bönecke, Eric; Franko, Uwe

    2015-04-01

    Soil organic matter (SOM) and soil organic carbon (SOC) may be the most important components for describing the fertility of agriculturally used soils. SOC is sensitive to temporal and spatial changes caused by varying weather conditions, uneven crops and soil management practices, and reliable delineation of its spatial variability remains difficult. Soil organic carbon is, furthermore, an essential initial parameter for dynamic modelling, e.g. for understanding carbon and nitrogen processes, yet obtaining and using this information requires cost- and time-intensive field and laboratory work. The objective of this study is to assess an approach that reduces the effort of laboratory and field analyses by finding stable initial soil organic carbon values for further soil process modelling and regionalization at the field scale. Strategies, techniques and tools that improve the reliability of high-resolution soil organic carbon maps while reducing cost constraints therefore continue to receive increasing attention in scientific research. Although combining effective sampling schemes with geophysical sensing techniques is nowadays a widely used practice for describing the within-field variability of soil organic carbon, large uncertainties remain, even at the field scale, in both science and agriculture. An analytical and modelling approach might facilitate and improve this strategy at both small and large field scales. This study presents a method for finding reliable steady-state values of soil organic carbon at particular points using the established soil process model CANDY (Franko et al. 1995). It focuses on an iterative algorithm that adjusts the key driving components (soil physical properties, meteorological data and management information), for which we quantified the inputs and losses of soil carbon (manure, crop residues, other organic inputs, decomposition, leaching). Furthermore, this approach can be combined with geophysical

  17. Reliability and validity of the German version of the Structured Interview of Personality Organization (STIPO)

    PubMed Central

    2013-01-01

    Background The assessment of personality organization and its observable behavioral manifestations, i.e. personality functioning, has a long tradition in psychodynamic psychiatry. Recently, the DSM-5 Levels of Personality Functioning Scale has moved it into the focus of psychiatric diagnostics. Based on Kernberg's concept of personality organization, the Structured Interview of Personality Organization (STIPO) was developed for diagnosing personality functioning. The STIPO covers seven dimensions: (1) identity, (2) object relations, (3) primitive defenses, (4) coping/rigidity, (5) aggression, (6) moral values, and (7) reality testing and perceptual distortions. The English version of the STIPO has previously revealed satisfying psychometric properties. Methods Validity and reliability of the German version of the 100-item instrument were evaluated in 122 psychiatric patients. All patients were diagnosed according to the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV) and were assessed by means of the STIPO. Moreover, all patients completed eight questionnaires that served as criteria for the external validity of the STIPO. Results Interrater reliability varied between intraclass correlations of .89 and 1.0; Cronbach's α for the seven dimensions was .69 to .93. All a priori selected questionnaire scales correlated significantly with the corresponding STIPO dimensions. Patients with personality disorder (PD) revealed significantly higher STIPO scores (i.e. worse personality functioning) than patients without PD; patients with cluster B PD showed significantly higher STIPO scores than patients with cluster C PD. Conclusions Interrater reliability, Cronbach's α, concurrent validity, and differential validity of the STIPO are satisfying. The STIPO represents an appropriate instrument for the assessment of personality functioning in clinical and research settings. PMID:23941404

  18. Layered and segmented system organization (LASSO) for highly reliable inventory monitoring systems (IMS)

    SciTech Connect

    Mangan, Dennis L.; Matter, John C.; Waddoups, I.; Abhold, M. E.; Chiaro, P.

    2002-01-01

    The Trilateral Initiative is preparing for International Atomic Energy Agency (IAEA) verification of excess fissile material released from the defense programs of the United States and the Russian Federation. Following acceptance of the material using an Attribute Verification System, the IAEA will depend on an Inventory Monitoring System (IMS) to maintain Continuity of Knowledge of the large inventory of thousands of items. Recovery from a total loss of Continuity of Knowledge in such a large storage facility would involve an extremely costly inventory re-verification. This paper presents the framework for a Layered and Segmented System Organization that is the basis for a highly reliable IMS with protection-in-depth.

  1. Theory of Microcavity Organic Light-Emitting Diodes

    NASA Astrophysics Data System (ADS)

    Rothberg, L. J.; Dodabalapur, A.; Jordan, R. H.; Slusher, R. E.

    1996-03-01

    We adapt the theory of Schubert and coworkers [N. E. J. Hunt, E. F. Schubert and G. J. Zydzik, Appl. Phys. Lett. 63, 391 (1993)] for resonant-cavity inorganic light-emitting diodes (LEDs) to evaluate analogous organic devices for various display applications. The theory is used to calculate the angular distribution of intensity and color, as well as to investigate optimizing light output from organic LEDs. Our results agree well with experimental measurements on microcavity devices we have fabricated from hydroxyquinoline aluminum (Alq) and doped Alq emitters.

  2. A Critique of Raju and Oshima's Prophecy Formulas for Assessing the Reliability of Item Response Theory-Based Ability Estimates

    ERIC Educational Resources Information Center

    Wang, Wen-Chung

    2008-01-01

    Raju and Oshima (2005) proposed two prophecy formulas based on item response theory in order to predict the reliability of ability estimates for a test after change in its length. The first prophecy formula is equivalent to the classical Spearman-Brown prophecy formula. The second prophecy formula is misleading because of an underlying false…
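
    For context, the classical Spearman-Brown prophecy formula referred to here predicts the reliability of a test whose length is scaled by a factor k from its current reliability rho: rho_k = k*rho / (1 + (k - 1)*rho). A one-line implementation (example values hypothetical):

```python
def spearman_brown(rho: float, k: float) -> float:
    """Classical prophecy: predicted reliability after scaling test length by k."""
    return k * rho / (1 + (k - 1) * rho)

# Doubling (k = 2) a test whose current reliability is 0.70
print(f"{spearman_brown(0.70, 2):.3f}")  # 0.824
```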

  3. The Development of the Functional Literacy Experience Scale Based upon Ecological Theory (FLESBUET) and Validity-Reliability Study

    ERIC Educational Resources Information Center

    Özenç, Emine Gül; Dogan, M. Cihangir

    2014-01-01

    This study aims to perform a validity-reliability test by developing the Functional Literacy Experience Scale based upon Ecological Theory (FLESBUET) for primary education students. The study group includes 209 fifth grade students at Sabri Taskin Primary School in the Kartal District of Istanbul, Turkey during the 2010-2011 academic year.…

  4. Using multivariate generalizability theory to assess the effect of content stratification on the reliability of a performance assessment.

    PubMed

    Keller, Lisa A; Clauser, Brian E; Swanson, David B

    2010-12-01

    In recent years, demand for performance assessments has continued to grow. However, performance assessments are notorious for lower reliability, and in particular, low reliability resulting from task specificity. Since reliability analyses typically treat the performance tasks as randomly sampled from an infinite universe of tasks, these estimates of reliability may not be accurate. For tests built according to a table of specifications, tasks are randomly sampled from different strata (content domains, skill areas, etc.). If these strata remain fixed in the test construction process, ignoring the stratification in the reliability analysis results in an underestimate of "parallel forms" reliability and an overestimate of the person-by-task variance component. This research explores the effect of representing the stratification appropriately, or misrepresenting it, in the estimation of reliability and the standard error of measurement. Both multivariate and univariate generalizability studies are reported. Results indicate that proper specification of the analytic design is essential to obtaining accurate information about both the generalizability of the assessment and the standard error of measurement. Illustrative D studies further present the effect under a variety of situations and test designs, and additional benefits of multivariate generalizability theory in test design and evaluation are discussed.
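
    To make the univariate baseline concrete: in a fully crossed person-by-task G study, the generalizability coefficient is built from ANOVA variance components. Below is a minimal simulated sketch of that baseline (all variances are hypothetical; the multivariate analyses reported in the paper additionally model the fixed content strata, which this sketch ignores):

```python
import numpy as np

# Simulated person-by-task (p x t) design: 50 examinees, 6 tasks.
# True person variance 10^2; residual (p x t confounded with error) 5^2.
rng = np.random.default_rng(1)
n_p, n_t = 50, 6
scores = rng.normal(50, 10, (n_p, 1)) + rng.normal(0, 5, (n_p, n_t))

# ANOVA mean squares for the fully crossed design
grand = scores.mean()
ms_p = n_t * ((scores.mean(axis=1) - grand) ** 2).sum() / (n_p - 1)
resid = (scores - scores.mean(axis=1, keepdims=True)
         - scores.mean(axis=0) + grand)
ms_pt = (resid ** 2).sum() / ((n_p - 1) * (n_t - 1))

# Variance components and the generalizability coefficient
var_pt = ms_pt                   # p x t interaction plus error
var_p = (ms_p - ms_pt) / n_t     # universe-score (person) variance
g_coef = var_p / (var_p + var_pt / n_t)
print(f"G coefficient for {n_t} tasks: {g_coef:.3f}")
```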

  5. Territoriality: Necessary Concept in Conflict Theories of Organization?

    ERIC Educational Resources Information Center

    Brumbaugh, Robert B.

    This paper examines the Ardrey Concept of territoriality -- that in his dealing with others, man is driven by a "territorial imperative" -- for its possible relevance to the design of more powerful conflict theories of organization. The importance of territoriality is explored as a conceivable precondition necessary to eventual understanding. An…

  6. Investigating Postgraduate College Admission Interviews: Generalizability Theory Reliability and Incremental Predictive Validity

    ERIC Educational Resources Information Center

    Arce-Ferrer, Alvaro J.; Castillo, Irene Borges

    2007-01-01

    The use of face-to-face interviews is controversial for college admissions decisions in light of the lack of availability of validity and reliability evidence for most college admission processes. This study investigated reliability and incremental predictive validity of a face-to-face postgraduate college admission interview with a sample of…

  7. Utilizing Generalizability Theory to Investigate the Reliability of the Grades Assigned to Undergraduate Research Papers

    ERIC Educational Resources Information Center

    Gugiu, Mihaiela R.; Gugiu, Paul C.; Baldus, Robert

    2012-01-01

    Background: Educational researchers have long espoused the virtues of writing with regard to student cognitive skills. However, research on the reliability of the grades assigned to written papers reveals a high degree of contradiction, with some researchers concluding that the grades assigned are very reliable and others suggesting that they…

  8. Validity and Reliability of Published Comprehensive Theory of Mind Tests for Normal Preschool Children: A Systematic Review

    PubMed Central

    Ziatabar Ahmadi, Seyyede Zohreh; Jalaie, Shohreh; Ashayeri, Hassan

    2015-01-01

    Objective: Theory of mind (ToM) or mindreading is an aspect of social cognition that evaluates mental states and beliefs of oneself and others. Validity and reliability are very important criteria when evaluating standard tests; without them, these tests are not usable. The aim of this study was to systematically review the validity and reliability of published English comprehensive ToM tests developed for normal preschool children. Method: We searched the MEDLINE (PubMed interface), Web of Science, ScienceDirect, PsycINFO, and evidence-based medicine (The Cochrane Library) databases from 1990 to June 2015. The search strategy was the Latin transcription of 'Theory of Mind' AND test AND children. We also manually searched the reference lists of all included articles. Inclusion criteria were as follows: valid and reliable diagnostic ToM tests published from 1990 to June 2015 for normal preschool children. Exclusion criteria were as follows: studies that only used ToM tests and single tasks (false belief tasks) for ToM assessment and/or had no description of the structure, validity or reliability of their tests. The methodological quality of the selected articles was assessed using the Critical Appraisal Skills Programme (CASP). Result: The primary search identified 1237 articles across all databases. After removing duplicates and applying all inclusion and exclusion criteria, we selected 11 tests for this systematic review. Conclusion: There were only a few valid, reliable and comprehensive ToM tests for normal preschool children. However, we had limitations concerning the included articles. The identified ToM tests differed in populations, tasks, modes of presentation, scoring, modes of response, times and other variables, and they had various validities and reliabilities. It is therefore recommended that researchers and clinicians select ToM tests according to their psychometric characteristics

  9. Organization Theory, Political Theory, and the International Arena: Some Hope But Very Little Time.

    ERIC Educational Resources Information Center

    Thayer, Frederick C.

    This paper presents background on a non-hierarchical organizational perspective. In addition, it presents guidelines for using a non-hierarchical perspective to create generally acceptable forms of international organizations. The theory on which the non-hierarchical perspective is based maintains that a form of comprehensive global planning…

  10. Polyanthraquinone as a Reliable Organic Electrode for Stable and Fast Lithium Storage.

    PubMed

    Song, Zhiping; Qian, Yumin; Gordin, Mikhail L; Tang, Duihai; Xu, Terrence; Otani, Minoru; Zhan, Hui; Zhou, Haoshen; Wang, Donghai

    2015-11-16

    In spite of recent progress, there is still a lack of reliable organic electrodes for Li storage with high comprehensive performance, especially in terms of long-term cycling stability. Herein, we report an ideal polymer electrode based on anthraquinone, namely, polyanthraquinone (PAQ), or specifically, poly(1,4-anthraquinone) (P14AQ) and poly(1,5-anthraquinone) (P15AQ). As a lithium-storage cathode, P14AQ showed exceptional performance, including a reversible capacity almost equal to the theoretical value (260 mA h g⁻¹; >257 mA h g⁻¹ for AQ), a very small voltage gap between the charge and discharge curves (2.18 V - 2.14 V = 0.04 V), stable cycling performance (99.4% capacity retention after 1000 cycles), and fast discharge/charge ability (release of 69% of the low-rate capacity, or 64% of the energy, in just 2 min). Exploration of the structure-performance relationship between P14AQ and related materials also provided a deeper understanding for the design of organic electrodes.

  11. [Evolutionary theories of asymmetrization of organisms, brain and body].

    PubMed

    Geodakian, V A

    2005-01-01

    Growth of the dispersion of the elements of unitary systems (US) inevitably transforms them into binary connected differentiations (BCD). Thus, at the level of genes, females and males arose from bisexual (hermaphroditic) forms, and at the level of hormones (mentality, behavior), asymmetric functions, organs, right-handers and left-handers arose from symmetric ones. All BCD are isomorphic: they consist of a subsystem of preservation (conservative, SP) and a subsystem of change (operative, SC). SP is more important than SC, so the dispersion of its elements is smaller than that of the elements of SC. This basic difference transforms a monomodal population into a bimodal one, direct interaction with the environment (E-US) into a consequent one (E-SC-SP), and synchronous evolution into asynchronous evolution (first SC, later SP). Only SC then "pays" for evolution, which makes asynchronous evolution more economical than synchronous evolution; this is the adaptive sense of any BCD. All theories of biology are theories of US: they treat the subsystems not as phases but as forms, and therefore cannot explain BCD. The general idea of asynchronous evolution is that the control centre of any function arises in the left hemisphere and reaches the right hemisphere only after approbation. Asymmetry is therefore created by different phases of the same function in the left and right hemispheres, not by different functions. This view allows many erroneous notions to be rejected and adaptive, internally consistent evolutionary theories of the three-dimensional asymmetry of organisms and the brain, of the cis-trans asymmetry of paired organs, and of dextrality-sinistrality to be constructed, with unique explanatory and predictive potential.

  12. Generalizability Theory Analysis of CBM Maze Reliability in Third- through Fifth-Grade Students

    ERIC Educational Resources Information Center

    Mercer, Sterett H.; Dufrene, Brad A.; Zoder-Martell, Kimberly; Harpole, Lauren Lestremau; Mitchell, Rachel R.; Blaze, John T.

    2012-01-01

    Despite growing use of CBM Maze in universal screening and research, little information is available regarding the number of CBM Maze probes needed for reliable decisions. The current study extends existing research on the technical adequacy of CBM Maze by investigating the number of probes and assessment durations (1-3 min) needed for reliable…

  13. Magnetoelectroluminescence of organic heterostructures: Analytical theory and spectrally resolved measurements

    SciTech Connect

    Liu, Feilong; Kelley, Megan R.; Crooker, Scott A.; Nie, Wanyi; Mohite, Aditya D.; Ruden, P. Paul; Smith, Darryl L.

    2014-12-22

    The effect of a magnetic field on the electroluminescence of organic light emitting devices originates from the hyperfine interaction between the electron/hole polarons and the hydrogen nuclei of the host molecules. In this paper, we present an analytical theory of magnetoelectroluminescence for organic semiconductors. To be specific, we focus on bilayer heterostructure devices. In the case we are considering, light generation at the interface of the donor and acceptor layers results from the formation and recombination of exciplexes. The spin physics is described by a stochastic Liouville equation for the electron/hole spin density matrix. By finding the steady-state analytical solution using Bloch-Wangsness-Redfield theory, we explore how the singlet/triplet exciplex ratio is affected by the hyperfine interaction strength and by the external magnetic field. In order to validate the theory, spectrally resolved electroluminescence experiments on BPhen/m-MTDATA devices are analyzed. With increasing emission wavelength, the width of the magnetic field modulation curve of the electroluminescence increases while its depth decreases. Furthermore, these observations are consistent with the model.

  14. The "New Institutionalism" in Organization Theory: Bringing Society and Culture Back in

    ERIC Educational Resources Information Center

    Senge, Konstanze

    2013-01-01

    This investigation will discuss the emergence of an economistical perspective among the dominant approaches of organization theory in the United States since the inception of "organization studies" as an academic discipline. It maintains that Contingency theory, Resource Dependency theory, Population Ecology theory, and Transaction Cost theory…

  15. Reliability of a tool for measuring theory of planned behaviour constructs for use in evaluating research use in policymaking

    PubMed Central

    2011-01-01

    Background Although measures of knowledge translation and exchange (KTE) effectiveness based on the theory of planned behavior (TPB) have been used among patients and providers, no measure has been developed for use among health system policymakers and stakeholders. A tool that measures the intention to use research evidence in policymaking could assist researchers in evaluating the effectiveness of KTE strategies that aim to support evidence-informed health system decision-making. We therefore developed a 15-item tool to measure four TPB constructs (intention, attitude, subjective norm and perceived control) and assessed its face validity through key informant interviews. Methods We carried out a reliability study to assess the tool's internal consistency and test-retest reliability. Our study sample consisted of 62 policymakers and stakeholders who participated in deliberative dialogues. We assessed internal consistency using Cronbach's alpha and generalizability (G) coefficients, and we assessed test-retest reliability by calculating Pearson correlation coefficients (r) and G coefficients for each construct and for the tool overall. Results The internal consistency of items within each construct was good, with alpha ranging from 0.68 to 0.89. G coefficients were lower for a single administration (G = 0.34 to G = 0.73) than for the average of two administrations (G = 0.79 to G = 0.89). Test-retest reliability coefficients for the constructs ranged from r = 0.26 to r = 0.77, and from G = 0.31 to G = 0.62 for a single administration and G = 0.47 to G = 0.86 for the average of two administrations. Test-retest reliability of the tool using G theory was moderate (G = 0.5) when we generalized across a single observation, but became strong (G = 0.9) when we averaged across both administrations. Conclusion This study provides preliminary evidence for the reliability of a tool that can be used to measure TPB constructs in relation to research use in policymaking
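
    As a reference point for the internal-consistency figures reported here, Cronbach's alpha for a k-item construct is k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal sketch with hypothetical response data (not the study's data):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical 5-point responses from six respondents to one
# four-item TPB construct (e.g., intention)
x = np.array([[4, 5, 4, 4],
              [3, 3, 2, 3],
              [5, 5, 4, 5],
              [2, 3, 3, 2],
              [4, 4, 5, 4],
              [3, 2, 3, 3]], dtype=float)
print(f"alpha = {cronbach_alpha(x):.2f}")
```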

  16. Altruism and organism: disentangling the themes of multilevel selection theory.

    PubMed

    Wilson, D S

    1997-07-01

    The evolution of groups into adaptive units, similar to single organisms in the coordination of their parts, is one major theme of multilevel selection theory. Another major theme is the evolution of altruistic behaviors that benefit others at the expense of self. These themes are often assumed to be strongly linked, such that altruism is required for group-level adaptation. Multilevel selection theory reveals a more complex relationship between the themes of altruism and organism. Adaptation at every level of the biological hierarchy requires a corresponding process of natural selection, which includes the fundamental ingredients of phenotypic variation, heritability, and fitness consequences. These ingredients can exist for many kinds of groups and do not require the extreme genetic variation among groups that is usually associated with the evolution of altruism. Thus, it is reasonable to expect higher-level units to evolve into adaptive units with respect to specific traits, even when their members are not genealogically related and do not behave in ways that are obviously altruistic. As one example, the concept of a group mind, which has been well documented in the social insects, may be applicable to other species.

  17. Understanding Schools as High-Reliability Organizations: An Exploratory Examination of Teachers' and School Leaders' Perceptions of Success

    ERIC Educational Resources Information Center

    Lorton, Juli A.; Bellamy, G. Thomas; Reece, Anne; Carlson, Jill

    2013-01-01

    Drawing on research on high-reliability organizations, this interviewbased qualitative case study employs four characteristics of such organizations as a lens for analyzing the operations of one very successful K-5 public school. Results suggest that the school had processes similar to those characteristic of high-reliability organizations: a…

  18. Reliability of surgical skills scores in otolaryngology residents: analysis using generalizability theory.

    PubMed

    Fernandez, Soledad A; Wiet, Gregory J; Butler, Nancy N; Welling, Bradley; Jarjoura, David

    2008-12-01

    Assessments of temporal bone dissection performance among otolaryngology residents have not been adequately developed. At the Ohio State College of Medicine, an instrument (Welling Scale, Version 1 [WS1]) is used to evaluate residents' end-product performance after drilling a temporal bone. In this study, the authors evaluate the components that contribute to measurement error using this scale. Generalizability theory was used to reveal components of measurement error that allow for better understanding of test results. A major component of measurement error came from inconsistency in performance across the two cadaveric test bones each resident was assigned. In contrast, ratings of performance using the WS1 were highly consistent across raters and rating sessions within raters. The largest source of measurement error was caused by residents' inconsistent performance across bones. Rater disagreement introduced only small error into scores. The WS1 provides small measurement error, with two raters and two bones for each participant. PMID:18842619

  1. Reliable Energy Level Alignment at Physisorbed Molecule–Metal Interfaces from Density Functional Theory

    PubMed Central

    2015-01-01

    A key quantity for molecule–metal interfaces is the energy level alignment of molecular electronic states with the metallic Fermi level. We develop and apply an efficient theoretical method, based on density functional theory (DFT) that can yield quantitatively accurate energy level alignment information for physisorbed metal–molecule interfaces. The method builds on the “DFT+Σ” approach, grounded in many-body perturbation theory, which introduces an approximate electron self-energy that corrects the level alignment obtained from conventional DFT for missing exchange and correlation effects associated with the gas-phase molecule and substrate polarization. Here, we extend the DFT+Σ approach in two important ways: first, we employ optimally tuned range-separated hybrid functionals to compute the gas-phase term, rather than rely on GW or total energy differences as in prior work; second, we use a nonclassical DFT-determined image-charge plane of the metallic surface to compute the substrate polarization term, rather than the classical DFT-derived image plane used previously. We validate this new approach by a detailed comparison with experimental and theoretical reference data for several prototypical molecule–metal interfaces, where excellent agreement with experiment is achieved: benzene on graphite (0001), and 1,4-benzenediamine, Cu-phthalocyanine, and 3,4,9,10-perylene-tetracarboxylic-dianhydride on Au(111). In particular, we show that the method correctly captures level alignment trends across chemical systems and that it retains its accuracy even for molecules for which conventional DFT suffers from severe self-interaction errors. PMID:25741626

  2. The Relationship between Educational Research and Organization Theory. Uppsala Reports on Education 15.

    ERIC Educational Resources Information Center

    Wallin, Erik; And Others

    Originally written to accompany a request for funds to set up a research project concerning the relationship between educational theory and organization theory, this brief report is based on the assumption that organization theory has a valuable contribution to make to the development of education as a science. First outlined are trends in…

  3. Modeling of a bubble-memory organization with self-checking translators to achieve high reliability.

    NASA Technical Reports Server (NTRS)

    Bouricius, W. G.; Carter, W. C.; Hsieh, E. P.; Wadia, A. B.; Jessep, D. C., Jr.

    1973-01-01

    This study addresses the design and modeling of a highly reliable bubble-memory system with the capabilities of: (1) correcting a single 16-adjacent-bit group error resulting from failures in a single basic storage module (BSM), and (2) detecting, with probability greater than 0.99, any double errors resulting from failures in BSMs. The results of the study justify the adopted design philosophy of employing memory data encoding and a translator to correct single group errors and detect double group errors, thereby enhancing overall system reliability.

  4. Theory manual for FAROW version 1.1: A numerical analysis of the Fatigue And Reliability Of Wind turbine components

    SciTech Connect

    WINTERSTEIN,STEVEN R.; VEERS,PAUL S.

    2000-01-01

    Because the fatigue lifetime of wind turbine components depends on several factors that are highly variable, a numerical analysis tool called FAROW has been created to cast the problem of component fatigue life in a probabilistic framework. The probabilistic analysis is accomplished using methods of structural reliability (FORM/SORM). While the workings of the FAROW software package are defined in the user's manual, this theory manual outlines the mathematical basis. A deterministic solution for the time to failure is made possible by assuming analytical forms for the basic inputs of wind speed, stress response, and material resistance. Each parameter of the assumed forms for the inputs can be defined to be a random variable. The analytical framework is described and the solution for time to failure is derived.

  5. Organization Theory and Its Application to Research in Librarianship.

    ERIC Educational Resources Information Center

    Howard, Helen

    1984-01-01

    This review discusses major organizational theories used by researchers to investigate questions related to librarianship: libraries as bureaucracies; contingency theory; decision-making; design and structure; technology; organizational climate; research by social scientists; theory, research, practice. Development of organizational theory and its…

  6. 78 FR 29209 - Revisions to Electric Reliability Organization Definition of Bulk Electric System and Rules of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-17

    ...The Commission denies rehearing in part, grants rehearing in part and otherwise reaffirms its determinations in Order No. 773. In addition, the Commission clarifies certain provisions of the Final Rule. Order No. 773 approved the modifications to the currently-effective definition of ``bulk electric system'' developed by the North American Electric Reliability Corporation (NERC), the...

  7. 76 FR 16263 - Revision to Electric Reliability Organization Definition of Bulk Electric System

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-23

    ... impact-based test for identifying bulk electric system elements and asks that the Commission reconsider an impact-based test as a viable approach. The NYPSC asserts that ``NERC and the NPCC have both..., effectively and efficiently ensures reliability.'' \\48\\ It contends that, because an impact-based...

  8. Ultrasonic study on organic liquid and binary organic liquid mixtures by using Schaaffs' collision factor theory

    NASA Astrophysics Data System (ADS)

    Lu, Yi-Gang; Dong, Yan-Wu

    2006-09-01

    Based on Schaaffs' collision factor theory (CFT) for liquids, equations for the nonlinear ultrasonic parameters of both pure organic liquids and binary organic liquid mixtures are deduced. The nonlinear ultrasonic parameters, including the pressure and temperature coefficients of ultrasonic velocity and the nonlinear acoustic parameter B/A, are evaluated and compared with measured results and with data from other sources. The equations show that the coefficients of ultrasonic velocity and the nonlinear acoustic parameter B/A are closely related to molecular interactions; these parameters carry information about the internal structure and external state of the medium or mixture. From the exponent of the intermolecular repulsive forces, several thermodynamic parameters, and the pressure and temperature of the medium, the nonlinear ultrasonic parameters and the ultrasonic behaviour of the medium can be evaluated. When evaluating the nonlinear acoustic parameter B/A of binary organic liquid mixtures, there is no need to know the B/A values of the components. The equations thus reveal the connection between the nonlinear ultrasonic behaviour and the internal structure and external state of the mixtures more directly and distinctly than traditional mixture laws for B/A, e.g. Apfel's and Sehgal's laws for binary liquid mixtures.

  9. Reliability training

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Dillard, Richard B.; Wong, Kam L.; Barber, Frank J.; Barina, Frank J.

    1992-01-01

    Discussed here is failure physics, the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low cost reliable products. A review of reliability for the years 1940 to 2000 is given. Next, a review of mathematics is given as well as a description of what elements contribute to product failures. Basic reliability theory and the disciplines that allow us to control and eliminate failures are elucidated.
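
    The abstract invokes basic reliability theory without reproducing it; the core identities are compact enough to sketch. Under a constant failure rate, R(t) = exp(-t/MTBF), and system reliability multiplies across series components while redundancy multiplies failure probabilities (illustrative numbers only, not from the course material):

```python
import math

def r_exp(t: float, mtbf: float) -> float:
    """Survival probability at time t under a constant failure rate."""
    return math.exp(-t / mtbf)

def r_series(*rs: float) -> float:
    """A series system works only if every component works."""
    out = 1.0
    for r in rs:
        out *= r
    return out

def r_parallel(*rs: float) -> float:
    """A redundant (parallel) system fails only if all components fail."""
    out = 1.0
    for r in rs:
        out *= (1.0 - r)
    return 1.0 - out

# Two components with MTBFs of 2000 h and 5000 h on a 500 h mission
r1, r2 = r_exp(500, 2000), r_exp(500, 5000)
print(f"series: {r_series(r1, r2):.3f}  parallel: {r_parallel(r1, r2):.3f}")
```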

  10. Assessing Variations in Areal Organization for the Intrinsic Brain: From Fingerprints to Reliability

    PubMed Central

    Xu, Ting; Opitz, Alexander; Craddock, R. Cameron; Wright, Margaret J.; Zuo, Xi-Nian; Milham, Michael P.

    2016-01-01

    Resting state fMRI (R-fMRI) is a powerful in vivo tool for examining the functional architecture of the human brain. Recent studies have demonstrated the ability to characterize transitions between functionally distinct cortical areas through the mapping of gradients in intrinsic functional connectivity (iFC) profiles. To date, this novel approach has primarily been applied to iFC profiles averaged across groups of individuals, or in one case, a single individual scanned multiple times. Here, we used a publicly available R-fMRI dataset, in which 30 healthy participants were scanned 10 times (10 min per session), to investigate differences in full-brain transition profiles (i.e., gradient maps, edge maps) across individuals, and their reliability. 10-min R-fMRI scans were sufficient to achieve high accuracies in efforts to “fingerprint” individuals based upon full-brain transition profiles. Regarding test–retest reliability, the image-wise intraclass correlation coefficient (ICC) was moderate, and vertex-level ICC varied depending on region; larger durations of data yielded higher reliability scores universally. Initial application of gradient-based methodologies to a recently published dataset obtained from twins suggested that inter-individual variation in areal profiles might have genetic and familial origins. Overall, these results illustrate the utility of gradient-based iFC approaches for studying inter-individual variation in brain function. PMID:27600846

  11. Aligning the Undergraduate Organic Laboratory Experience with Professional Work: The Centrality of Reliable and Meaningful Data

    ERIC Educational Resources Information Center

    Alaimo, Peter J.; Langenhan, Joseph M.; Suydam, Ian T.

    2014-01-01

    Many traditional organic chemistry lab courses do not adequately help students to develop the professional skills required for creative, independent work. The overarching goal of the new organic chemistry lab series at Seattle University is to teach undergraduates to think, perform, and behave more like professional scientists. The conversion of…

  12. Left-right organizer flow dynamics: how much cilia activity reliably yields laterality?

    PubMed

    Sampaio, Pedro; Ferreira, Rita R; Guerrero, Adán; Pintado, Petra; Tavares, Bárbara; Amaro, Joana; Smith, Andrew A; Montenegro-Johnson, Thomas; Smith, David J; Lopes, Susana S

    2014-06-23

    Internal organs are asymmetrically positioned inside the body. Embryonic motile cilia play an essential role in this process by generating a directional fluid flow inside the vertebrate left-right organizer. Detailed characterization of how fluid flow dynamics modulates laterality is lacking. We used zebrafish genetics to experimentally generate a range of flow dynamics. By following the development of each embryo, we show that fluid flow in the left-right organizer is asymmetric and provides a good predictor of organ laterality. This was tested in mosaic organizers composed of motile and immotile cilia generated by dnah7 knockdowns. In parallel, we used simulations of fluid dynamics to analyze our experimental data. These revealed that fluid flow generated by 30 or more cilia predicts 90% situs solitus, similar to experimental observations. We conclude that cilia number, dorsal anterior motile cilia clustering, and left flow are critical to situs solitus via robust asymmetric charon expression. PMID:24930722

  13. An examination of maintenance activities in liquid metal reactor facilities: An analysis by the Centralized Reliability Data Organization (CREDO)

    SciTech Connect

    Haire, M J; Knee, H E; Manning, J J; Manneschmidt, J F; Setoguchi, K

    1987-01-01

    The Centralized Reliability Data Organization (CREDO) is the largest repository of liquid metal reactor (LMR) component reliability data in the world. It is jointly sponsored by the US Department of Energy (DOE) and the Power Reactor and Nuclear Fuel Development Corporation (PNC) of Japan. The CREDO database contains information on a population of more than 21,000 components and approximately 1300 event records. Total experience is approaching 1.2 billion component operating hours. Although data gathering for CREDO concentrates on event (failure) information, the work reported here focuses on the maintenance information contained in CREDO and the development of maintenance critical items lists. That is, components are ranked in prioritized lists from worst to best performers from a maintenance standpoint.

  14. A theory for the arrangement of sensory organs in Drosophila.

    PubMed

    Zhu, Huifeng; Gunaratne, Preethi H; Roman, Gregg W; Gunaratne, Gemunu H

    2010-03-01

    We study the arrangements of recurved bristles on the anterior wing margin of wild-type and mutant Drosophila. The epidermal or neural fate of a proneural cell depends on the concentrations of proteins of the achaete-scute complex. At puparium formation, concentrations of proteins are nearly identical in all cells of the anterior wing and each cell has the potential for neural fate. In wild-type flies, the action of regulatory networks drives the initial state to one where a bristle grows out of every fifth cell. Recent experiments have shown that the frequency of recurved bristles can be made to change by adjusting the mean concentrations of the zinc-finger transcription factor Senseless and the micro-RNA miR-9a. Specifically, mutant flies with reduced levels of miR-9a exhibit ectopic bristles, and those with lower levels of both miR-9a and Senseless show regular organization of recurved bristles, but with a lower periodicity of 4. We argue that these characteristics can be explained assuming an underlying Turing-type bifurcation whereby a periodic pattern spontaneously emerges from a uniform background. However, bristle patterns occur in a discrete array of cells, and are not mediated by diffusion. We argue that intracellular actions of transmembrane proteins such as Delta and Notch can play a role of diffusion in destabilizing the homogeneous state. In contrast to diffusion, intercellular actions can be activating or inhibiting; further, there can be lateral cross-species interactions. We introduce a phenomenological model to study bristle arrangements and make several model-independent predictions that can be tested in experiments. In our theory, miRNA-9a is one of the components of the underlying network and has no special regulatory role. The loss of periodicity in its absence is due to the transfer of the system to a bistable state. PMID:20370287

  16. A theory for the arrangement of sensory organs in Drosophila

    NASA Astrophysics Data System (ADS)

    Zhu, Huifeng; Gunaratne, Preethi H.; Roman, Gregg W.; Gunaratne, Gemunu H.

    2010-03-01

    We study the arrangements of recurved bristles on the anterior wing margin of wild-type and mutant Drosophila. The epidermal or neural fate of a proneural cell depends on the concentrations of proteins of the achaete-scute complex. At puparium formation, concentrations of proteins are nearly identical in all cells of the anterior wing and each cell has the potential for neural fate. In wild-type flies, the action of regulatory networks drives the initial state to one where a bristle grows out of every fifth cell. Recent experiments have shown that the frequency of recurved bristles can be made to change by adjusting the mean concentrations of the zinc-finger transcription factor Senseless and the micro-RNA miR-9a. Specifically, mutant flies with reduced levels of miR-9a exhibit ectopic bristles, and those with lower levels of both miR-9a and Senseless show regular organization of recurved bristles, but with a lower periodicity of 4. We argue that these characteristics can be explained assuming an underlying Turing-type bifurcation whereby a periodic pattern spontaneously emerges from a uniform background. However, bristle patterns occur in a discrete array of cells and are not mediated by diffusion. We argue that intercellular actions of transmembrane proteins such as Delta and Notch can play the role of diffusion in destabilizing the homogeneous state. In contrast to diffusion, intercellular actions can be activating or inhibiting; further, there can be lateral cross-species interactions. We introduce a phenomenological model to study bristle arrangements and make several model-independent predictions that can be tested in experiments. In our theory, miR-9a is one of the components of the underlying network and has no special regulatory role. The loss of periodicity in its absence is due to the transfer of the system to a bistable state.
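
    The destabilization of the homogeneous state by cell-to-cell interactions can be illustrated with a toy discrete lateral-inhibition map. This is a minimal sketch, not the authors' phenomenological model; the gains and interaction range are assumptions chosen so that the uniform state is linearly unstable:

      import numpy as np

      # Cells on a ring self-amplify and inhibit neighbours within two cells;
      # a near-uniform initial state resolves into regularly spaced "bristle"
      # cells, a discrete analogue of a Turing-type instability.
      rng = np.random.default_rng(0)
      n = 60
      u = 0.09 + 0.01 * rng.normal(size=n)      # nearly homogeneous start

      for _ in range(60):
          inhib = sum(np.roll(u, s) for s in (-2, -1, 1, 2))
          u = np.clip(0.1 + 1.5 * u - 0.4 * inhib, 0.0, 1.0)

      print("".join("|" if x > 0.5 else "." for x in u))   # e.g. |..|..|..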

  17. The chronic toxicity of molybdate to freshwater organisms. I. Generating reliable effects data.

    PubMed

    De Schamphelaere, K A C; Stubblefield, W; Rodriguez, P; Vleminckx, K; Janssen, C R

    2010-10-15

    The European Union regulation on Registration, Evaluation, Authorization and Restriction of Chemical substances (REACH) (EC, 2006) requires the characterization of the chronic toxicity of many chemicals in the aquatic environment, including molybdate (MoO(4)(2-)). Our literature review on the ecotoxicity of molybdate revealed that only a limited number of reliable chronic no-observed-effect concentrations (NOECs) were available for the derivation of a predicted no-effect concentration (PNEC). This paper presents the results of additional ecotoxicity experiments that were conducted in order to fulfill the requirements for the derivation of a PNEC by means of the scientifically most robust species sensitivity distribution (SSD) approach (also called the statistical extrapolation approach). Ten test species were chronically exposed to molybdate (added as sodium molybdate dihydrate, Na(2)MoO(4)·2H(2)O) according to internationally accepted standard testing guidelines or equivalent. The 10% effective concentrations (EC10, expressed as measured dissolved molybdenum) for the most sensitive endpoint per species were 62.8-105.6 (mg Mo)/L for Daphnia magna (21day-reproduction), 78.2 (mg Mo)/L for Ceriodaphnia dubia (7day-reproduction), 61.2-366.2 (mg Mo)/L for the green alga Pseudokirchneriella subcapitata (72h-growth rate), 193.6 (mg Mo)/L for the rotifer Brachionus calyciflorus (48h-population growth rate), 121.4 (mg Mo)/L for the midge Chironomus riparius (14day-growth), 211.3 (mg Mo)/L for the snail Lymnaea stagnalis (28day-growth rate), 115.9 (mg Mo)/L for the frog Xenopus laevis (4day-larval development), 241.5 (mg Mo)/L for the higher plant Lemna minor (7day-growth rate), 39.3 (mg Mo)/L for the fathead minnow Pimephales promelas (34day-dry weight/biomass), and 43.2 (mg Mo)/L for the rainbow trout Oncorhynchus mykiss (78day-biomass). These effect concentrations are in line with the few reliable data currently available in the open literature. The data presented in this study can…
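
    The SSD approach mentioned above is straightforward to sketch: fit a log-normal distribution to the per-species EC10 values and read off the 5th percentile (HC5). The snippet below uses the values quoted in the abstract, taking the geometric mean where a range is reported; that simplification is an assumption of this sketch, not the authors' procedure:

      import numpy as np
      from scipy import stats

      # EC10 values in (mg Mo)/L, most sensitive endpoint per species.
      ec10 = np.array([
          (62.8 * 105.6) ** 0.5,   # Daphnia magna
          78.2,                    # Ceriodaphnia dubia
          (61.2 * 366.2) ** 0.5,   # Pseudokirchneriella subcapitata
          193.6,                   # Brachionus calyciflorus
          121.4,                   # Chironomus riparius
          211.3,                   # Lymnaea stagnalis
          115.9,                   # Xenopus laevis
          241.5,                   # Lemna minor
          39.3,                    # Pimephales promelas
          43.2,                    # Oncorhynchus mykiss
      ])

      # Log-normal SSD: fit a normal distribution to log10(EC10), then take
      # the 5th percentile (HC5); a PNEC would apply a further assessment factor.
      logs = np.log10(ec10)
      hc5 = 10 ** stats.norm.ppf(0.05, loc=logs.mean(), scale=logs.std(ddof=1))
      print(f"HC5 = {hc5:.1f} (mg Mo)/L")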

  18. How Settings Change People: Applying Behavior Setting Theory to Consumer-Run Organizations

    ERIC Educational Resources Information Center

    Brown, Louis D.; Shepherd, Matthew D.; Wituk, Scott A.; Meissen, Greg

    2007-01-01

    Self-help initiatives stand as a classic context for organizational studies in community psychology. Behavior setting theory stands as a classic conception of organizations and the environment. This study explores both, applying behavior setting theory to consumer-run organizations (CROs). Analysis of multiple data sets from all CROs in Kansas…

  19. Using Multivariate Generalizability Theory to Assess the Effect of Content Stratification on the Reliability of a Performance Assessment

    ERIC Educational Resources Information Center

    Keller, Lisa A.; Clauser, Brian E.; Swanson, David B.

    2010-01-01

    In recent years, demand for performance assessments has continued to grow. However, performance assessments are notorious for lower reliability, and in particular, low reliability resulting from task specificity. Since reliability analyses typically treat the performance tasks as randomly sampled from an infinite universe of tasks, these estimates…
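
    The task-specificity problem has a compact algebraic form in generalizability theory: for a persons-by-tasks design, the generalizability coefficient is E(rho^2) = var_p / (var_p + var_pt/n_t), so large person-by-task variance depresses reliability unless many tasks are sampled. A toy illustration with hypothetical variance components:

      # Hypothetical variance components for a persons-by-tasks (p x t) design.
      var_p = 0.40    # person (universe-score) variance
      var_pt = 0.60   # person-by-task interaction + residual variance

      def g_coefficient(n_tasks: int) -> float:
          """E(rho^2) = var_p / (var_p + var_pt / n_t), relative decisions."""
          return var_p / (var_p + var_pt / n_tasks)

      for n in (1, 4, 8, 16):
          print(n, round(g_coefficient(n), 3))   # reliability climbs with tasks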

  20. Reliable measurement of the Seebeck coefficient of organic and inorganic materials between 260 K and 460 K

    SciTech Connect

    Beretta, D.; Lanzani, G.; Bruno, P.; Caironi, M.

    2015-07-15

    A new experimental setup for reliable measurement of the in-plane Seebeck coefficient of organic and inorganic thin films and bulk materials is reported. The system is based on the “Quasi-Static” approach and can measure the thermopower in the range of temperature between 260 K and 460 K. The system has been tested on a pure nickel bulk sample and on a thin film of commercially available PEDOT:PSS deposited by spin coating on glass. Repeatability within 1.5% for the nickel sample is demonstrated, while accuracy in the measurement of both organic and inorganic samples is guaranteed by time interpolation of data and by operating with a temperature difference over the sample of less than 1 K.
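
    Although the paper's measurement protocol is not reproduced here, the quantity extracted by a quasi-static sweep can be sketched as a regression of thermovoltage against a slowly varied temperature difference; all numbers below are illustrative:

      import numpy as np

      # Sweep dT across the sample (kept below 1 K) while recording the
      # thermovoltage dV; the Seebeck coefficient follows from S = -dV/dT.
      rng = np.random.default_rng(0)
      S_true = -19.5e-6                        # V/K, a nickel-like value
      dT = np.linspace(-0.5, 0.5, 21)          # K
      dV = -S_true * dT + rng.normal(0.0, 2e-8, dT.size)

      slope, _ = np.polyfit(dT, dV, 1)         # linear fit of dV vs dT
      print(f"S = {-slope * 1e6:.1f} uV/K")    # recovers roughly -19.5 uV/K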

  1. A Comparison of the Approaches of Generalizability Theory and Item Response Theory in Estimating the Reliability of Test Scores for Testlet-Composed Tests

    ERIC Educational Resources Information Center

    Lee, Guemin; Park, In-Yong

    2012-01-01

    Previous assessments of the reliability of test scores for testlet-composed tests have indicated that item-based estimation methods overestimate reliability. This study was designed to address issues related to the extent to which item-based estimation methods overestimate the reliability of test scores composed of testlets and to compare several…

  2. An Examination of Test-Retest, Alternate Form Reliability, and Generalizability Theory Study of the easyCBM Reading Assessments: Grade 1. Technical Report #1216

    ERIC Educational Resources Information Center

    Anderson, Daniel; Park, Bitnara Jasmine; Lai, Cheng-Fei; Alonzo, Julie; Tindal, Gerald

    2012-01-01

    This technical report is one in a series of five describing the reliability (test/retest and alternate form) and G-Theory/D-Study research on the easyCBM reading measures, grades 1-5. Data were gathered in the spring of 2011 from a convenience sample of students nested within classrooms at a medium-sized school district in the Pacific Northwest. Due…

  3. Toward an Integrative Theory of Power and Educational Organizations.

    ERIC Educational Resources Information Center

    Muth, Rodney

    1984-01-01

    Suggests an empirical model to test the assumption that conflict and consensus theories of organizational management can be combined, with the central concept being power. Three studies supporting the model suggest that it has considerable heuristic and empirical potential. (JW)

  4. How Reliable is the Bulk δ13C value of Soil Organic Matter in Paleovegetational Reconstruction?

    NASA Astrophysics Data System (ADS)

    Sanyal, P.; Rakshit, S.

    2015-12-01

    Carbon isotope ratios of soil/paleosol organic matter (δ13CSOM) have been used to reconstruct the abundance of C3 and C4 plants in the landscape, as the δ13C values of C3 (-27‰) and C4 (-12.5‰) plants are distinctly different. In an attempt to reconstruct the abundance of C3 and C4 plants, δ13CSOM has been measured from three soil profiles developed on the flood plain of the Gangetic plain, Mohanpur, West Bengal, India. Satellite images reveal that the investigated sediments were deposited in an oxbow lake setting of the river Ganges. The total organic carbon content of the profiles ranges from 0.9% to 0.1%. The δ13CSOM values mostly range from -19.2‰ to -22‰, except for a rapid positive excursion of ~5‰ at 1.5 m depth, showing an enriched value (-14.2‰), in all three profiles. Based on mass balance calculations using the δ13C values of C3 and C4 plants, the δ13CSOM in the Gangetic plain indicates the presence of both C3 and C4 plants in the floodplain. However, characterization of alkanes separated from lipids extracted from the same soil organic matter reveals a dominant preference for short carbon chains (C14, C16, C18, C20) with only a minor preference for longer chains (C29, C31, C33). Interestingly, the n-alkanes at 1.5 m depth show a very high concentration of short-chain n-alkanes. Since short-chain n-alkanes represent aquatic productivity or intense bacterial decomposition and longer chains indicate contributions from C3 and C4 plants, the data from the investigated sedimentary profile show contributions mostly from aquatic vegetation with only a minor contribution from terrestrial plants. This implies that before using bulk δ13CSOM values for the reconstruction of C3-C4 plant abundance from soils/paleosols, molecular-level characterization of the soil organic matter is required.
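
    The mass balance referred to above is the standard two-end-member mixing relation, f_C4 = (δ13C_SOM - δ13C_C3) / (δ13C_C4 - δ13C_C3). A minimal sketch using the end-member values quoted in the abstract:

      # Two-end-member mass balance for the C4-derived fraction of soil
      # organic matter; end members as quoted above.
      D13C_C3 = -27.0    # per mil
      D13C_C4 = -12.5    # per mil

      def c4_fraction(d13c_som: float) -> float:
          return (d13c_som - D13C_C3) / (D13C_C4 - D13C_C3)

      for sample in (-22.0, -19.2, -14.2):   # values reported for the profiles
          print(sample, f"{100 * c4_fraction(sample):.0f}% C4")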

  5. Retrenchment in health care organizations: theory and practice.

    PubMed

    Fottler, M D; Smith, H L; Muller, H J

    1986-01-01

    This paper analyzes retrenchment in health care organizations in terms of prescriptions in the literature and the actual responses of health care executives to retrenchment. Case studies of five organizations indicate that the range of coping strategies is much more limited than the range of possibilities suggested in the literature. Constraints within the culture of the organization are suggested as an explanation for this disparity.

  6. Can the second order multireference perturbation theory be considered a reliable tool to study mixed-valence compounds?

    PubMed

    Pastore, Mariachiara; Helal, Wissam; Evangelisti, Stefano; Leininger, Thierry; Malrieu, Jean-Paul; Maynau, Daniel; Angeli, Celestino; Cimiraglia, Renzo

    2008-05-01

    In this paper, the problem of the calculation of the electronic structure of mixed-valence compounds is addressed in the frame of multireference perturbation theory (MRPT). Using a simple mixed-valence compound (the 5,5(') (4H,4H('))-spirobi[cyclopenta[c]pyrrole] 2,2('),6,6(') tetrahydro cation), and the n-electron valence state perturbation theory (NEVPT2) and CASPT2 approaches, it is shown that the ground state (GS) energy curve presents an unphysical "well" for nuclear coordinates close to the symmetric case, where a maximum is expected. For NEVPT, the correct shape of the energy curve is retrieved by applying the MRPT at the (computationally expensive) third order. This behavior is rationalized using a simple model (the ionized GS of two weakly interacting identical systems, each neutral system being described by two electrons in two orbitals), showing that the unphysical well is due to the canonical orbital energies which at the symmetric (delocalized) conformation lead to a sudden modification of the denominators in the perturbation expansion. In this model, the bias introduced in the second order correction to the energy is almost entirely removed going to the third order. With the results of the model in mind, one can predict that all MRPT methods in which the zero order Hamiltonian is based on canonical orbital energies are prone to present unreasonable energy profiles close to the symmetric situation. However, the model allows a strategy to be devised which can give a correct behavior even at the second order, by simply averaging the orbital energies of the two charge-localized electronic states. Such a strategy is adopted in a NEVPT2 scheme obtaining a good agreement with the third order results based on the canonical orbital energies. The answer to the question reported in the title (is this theoretical approach a reliable tool for a correct description of these systems?) is therefore positive, but care must be exercised, either in defining the orbital…
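
    The "denominators in the perturbation expansion" are those of standard second-order Rayleigh-Schrödinger theory, written schematically below (a textbook statement, not this paper's specific working equations); the remedy described averages the canonical orbital energies entering the zero-order energies over the two charge-localized states:

      E^{(2)} = \sum_{K \neq 0} \frac{\left| \langle \Psi_K^{(0)} | \hat{H} | \Psi_0^{(0)} \rangle \right|^2}{E_0^{(0)} - E_K^{(0)}}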

  7. Reliability of equivalent sphere model in blood-forming organ dose estimation

    SciTech Connect

    Shinn, J.L.; Wilson, J.W.; Nealy, J.E.

    1990-04-01

    The radiation dose equivalents to blood-forming organs (BFO's) of the astronauts at the Martian surface due to major solar flare events are calculated using the detailed body geometry of Langley and Billings. The solar flare spectra of February 1956, November 1960, and August 1972 events are employed instead of the idealized Webber form. The detailed geometry results are compared with those based on the 5-cm sphere model which was used often in the past to approximate BFO dose or dose equivalent. Larger discrepancies are found for the latter two events, possibly due to the lower numbers of highly penetrating protons. It is concluded that the 5-cm sphere model is not suitable for quantitative use in connection with future NASA deep-space, long-duration mission shield design studies.

  8. Reliability of equivalent sphere model in blood-forming organ dose estimation

    NASA Technical Reports Server (NTRS)

    Shinn, Judy L.; Wilson, John W.; Nealy, John E.

    1990-01-01

    The radiation dose equivalents to blood-forming organs (BFO's) of the astronauts at the Martian surface due to major solar flare events are calculated using the detailed body geometry of Langley and Billings. The solar flare spectra of February 1956, November 1960, and August 1972 events are employed instead of the idealized Webber form. The detailed geometry results are compared with those based on the 5-cm sphere model which was used often in the past to approximate BFO dose or dose equivalent. Larger discrepancies are found for the latter two events, possibly due to the lower numbers of highly penetrating protons. It is concluded that the 5-cm sphere model is not suitable for quantitative use in connection with future NASA deep-space, long-duration mission shield design studies.

  9. Implications of Complexity and Chaos Theories for Organizations that Learn

    ERIC Educational Resources Information Center

    Smith, Peter A. C.

    2003-01-01

    In 1996 Hubert Saint-Onge and Smith published an article ("The evolutionary organization: avoiding a Titanic fate", in The Learning Organization, Vol. 3 No. 4), based on their experience at the Canadian Imperial Bank of Commerce (CIBC). It was established at CIBC that change could be successfully facilitated through blended application of theory…

  10. Notes on a Political Theory of Educational Organizations.

    ERIC Educational Resources Information Center

    Bacharach, Samuel B.

    This essay reviews major trends in methodological and theoretical approaches to the study of organizations since the mid-sixties and espouses the political analysis of organizations, a position representing a middle ground between comparative structuralism and the loosely coupled systems approach. This position emphasizes micropolitics as well as…

  11. Cracking Silent Codes: Critical Race Theory and Education Organizing

    ERIC Educational Resources Information Center

    Su, Celina

    2007-01-01

    Critical race theory (CRT) has moved beyond legal scholarship to critique the ways in which "colorblind" laws and policies perpetuate existing racial inequalities in education policy. While criticisms of CRT have focused on the pessimism and lack of remedies presented, CRT scholars have begun to address issues of praxis. Specifically, communities…

  12. Push-Pull Receptive Field Organization and Synaptic Depression: Mechanisms for Reliably Encoding Naturalistic Stimuli in V1

    PubMed Central

    Kremkow, Jens; Perrinet, Laurent U.; Monier, Cyril; Alonso, Jose-Manuel; Aertsen, Ad; Frégnac, Yves; Masson, Guillaume S.

    2016-01-01

    Neurons in the primary visual cortex are known for responding vigorously but with high variability to classical stimuli such as drifting bars or gratings. By contrast, natural scenes are encoded more efficiently by sparse and temporally precise spiking responses. We used a conductance-based model of the visual system in higher mammals to investigate how two specific features of the thalamo-cortical pathway, namely push-pull receptive field organization and fast synaptic depression, can contribute to this contextual reshaping of V1 responses. By comparing cortical dynamics evoked respectively by natural vs. artificial stimuli in a comprehensive parametric space analysis, we demonstrate that the reliability and sparseness of the spiking responses during natural vision are not a mere consequence of the increased bandwidth in the sensory input spectrum. Rather, they result from the combined impacts of fast synaptic depression and push-pull inhibition, the latter acting for natural scenes as a form of “effective” feed-forward inhibition as demonstrated in other sensory systems. Thus, the combination of feedforward-like inhibition with fast thalamo-cortical synaptic depression by simple cells receiving a direct structured input from thalamus composes a generic computational mechanism for generating a sparse and reliable encoding of natural sensory events. PMID:27242445

  13. Push-Pull Receptive Field Organization and Synaptic Depression: Mechanisms for Reliably Encoding Naturalistic Stimuli in V1.

    PubMed

    Kremkow, Jens; Perrinet, Laurent U; Monier, Cyril; Alonso, Jose-Manuel; Aertsen, Ad; Frégnac, Yves; Masson, Guillaume S

    2016-01-01

    Neurons in the primary visual cortex are known for responding vigorously but with high variability to classical stimuli such as drifting bars or gratings. By contrast, natural scenes are encoded more efficiently by sparse and temporally precise spiking responses. We used a conductance-based model of the visual system in higher mammals to investigate how two specific features of the thalamo-cortical pathway, namely push-pull receptive field organization and fast synaptic depression, can contribute to this contextual reshaping of V1 responses. By comparing cortical dynamics evoked respectively by natural vs. artificial stimuli in a comprehensive parametric space analysis, we demonstrate that the reliability and sparseness of the spiking responses during natural vision are not a mere consequence of the increased bandwidth in the sensory input spectrum. Rather, they result from the combined impacts of fast synaptic depression and push-pull inhibition, the latter acting for natural scenes as a form of "effective" feed-forward inhibition as demonstrated in other sensory systems. Thus, the combination of feedforward-like inhibition with fast thalamo-cortical synaptic depression by simple cells receiving a direct structured input from thalamus composes a generic computational mechanism for generating a sparse and reliable encoding of natural sensory events. PMID:27242445
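
    Fast synaptic depression of the kind invoked above is often captured by a resource model in which each presynaptic spike consumes a fraction of a recovering synaptic resource. The sketch below is a Tsodyks-Markram-style toy, not the authors' conductance-based V1 model; parameters are illustrative:

      import math

      def spike_efficacies(spike_times, U=0.5, tau_rec=0.3):
          """Efficacy of each spike: resource x recovers toward 1 with time
          constant tau_rec and each spike consumes a fraction U of x."""
          x, last_t, eff = 1.0, None, []
          for t in spike_times:
              if last_t is not None:
                  x = 1.0 - (1.0 - x) * math.exp(-(t - last_t) / tau_rec)
              eff.append(U * x)     # strength delivered by this spike
              x -= U * x            # resource consumed
              last_t = t
          return eff

      # A rapid triplet depresses strongly; a long pause restores efficacy.
      print([round(e, 3) for e in spike_efficacies([0.0, 0.01, 0.02, 0.5])])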

  14. Increasing Reliability of Direct Observation Measurement Approaches in Emotional and/or Behavioral Disorders Research Using Generalizability Theory

    ERIC Educational Resources Information Center

    Gage, Nicholas A.; Prykanowski, Debra; Hirn, Regina

    2014-01-01

    Reliability of direct observation outcomes ensures the results are consistent, dependable, and trustworthy. Typically, reliability of direct observation measurement approaches is assessed using interobserver agreement (IOA) and the calculation of observer agreement (e.g., percentage of agreement). However, IOA does not address intraobserver…
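
    The interobserver agreement statistic being critiqued is typically just percentage agreement across observation intervals, which says nothing about a single observer's consistency with themselves. A minimal sketch with hypothetical interval records:

      # Percentage agreement between two observers' interval-by-interval
      # records (1 = behavior observed); data are hypothetical.
      obs_a = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
      obs_b = [1, 0, 0, 1, 0, 1, 1, 0, 1, 1]

      agreements = sum(a == b for a, b in zip(obs_a, obs_b))
      print(f"IOA = {100 * agreements / len(obs_a):.0f}%")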

  15. Applying Hofstede's Cross-Cultural Theory of Organizations to School Governance: A French Case Study.

    ERIC Educational Resources Information Center

    Fowler, Frances C.

    This paper applies Geert Hofstede's cross-cultural theory of organizational structure and behavior to school administration, examining the governance structure of the French public school system to determine how accurately it predicts the form of that educational organization. The first section of the paper presents Hofstede's theory and his…

  16. Development of an Axiomatic Theory of Organization/Environment Interaction: A Theoretical and Empirical Analysis.

    ERIC Educational Resources Information Center

    Ganey, Rodney F.

    The goal of this paper was to develop a theory of organization/environment interaction by examining the impact of perceived environmental uncertainty on organizational processes and on organizational goal attainment. It examines theories from the organizational environment literature and derives corollaries that are empirically tested using a data…

  17. A Theory of Electronic Propinquity: Mediated Communication in Organizations.

    ERIC Educational Resources Information Center

    Korzenny, Felipe

    This paper proposes a theoretical approach to mediated communication in organizations. It is argued that the man/machine interface in mediated human communication is better dealt with when a comprehensive theoretical approach is used than when separate communication devices are tested as they appear in the market, such as video-teleconferencing.…

  18. An Examination of Test-Retest, Alternate Form Reliability, and Generalizability Theory Study of the easyCBM Reading Assessments: Grade 5. Technical Report #1220

    ERIC Educational Resources Information Center

    Lai, Cheng-Fei; Park, Bitnara Jasmine; Anderson, Daniel; Alonzo, Julie; Tindal, Gerald

    2012-01-01

    This technical report is one in a series of five describing the reliability (test/retest and alternate form) and G-Theory/D-Study research on the easyCBM reading measures, grades 1-5. Data were gathered in the spring of 2011 from a convenience sample of students nested within classrooms at a medium-sized school district in the Pacific Northwest.…

  19. An Examination of Test-Retest, Alternate Form Reliability, and Generalizability Theory Study of the easyCBM Reading Assessments: Grade 2. Technical Report #1217

    ERIC Educational Resources Information Center

    Anderson, Daniel; Lai, Cheng-Fei; Park, Bitnara Jasmine; Alonzo, Julie; Tindal, Gerald

    2012-01-01

    This technical report is one in a series of five describing the reliability (test/retest and alternate form) and G-Theory/D-Study research on the easyCBM reading measures, grades 1-5. Data were gathered in the spring of 2011 from a convenience sample of students nested within classrooms at a medium-sized school district in the Pacific Northwest. Due to…

  20. An Examination of Test-Retest, Alternate Form Reliability, and Generalizability Theory Study of the easyCBM Passage Reading Fluency Assessments: Grade 4. Technical Report #1219

    ERIC Educational Resources Information Center

    Park, Bitnara Jasmine; Anderson, Daniel; Alonzo, Julie; Lai, Cheng-Fei; Tindal, Gerald

    2012-01-01

    This technical report is one in a series of five describing the reliability (test/retest and alternate form) and G-Theory/D-Study research on the easyCBM reading measures, grades 1-5. Data were gathered in the spring of 2011 from a convenience sample of students nested within classrooms at a medium-sized school district in the Pacific Northwest.…

  1. [The reliability of reliability].

    PubMed

    Blancas Espejo, A

    1991-01-01

    The author critically analyzes an article by Rodolfo Corona Vazquez that questions the reliability of the preliminary results of the Eleventh Census of Population and Housing, conducted in Mexico in March 1990. The need to define what constitutes "reliability" for preliminary results is stressed. PMID:12317739

  2. Targeting helicase-dependent amplification products with an electrochemical genosensor for reliable and sensitive screening of genetically modified organisms.

    PubMed

    Moura-Melo, Suely; Miranda-Castro, Rebeca; de-Los-Santos-Álvarez, Noemí; Miranda-Ordieres, Arturo J; Dos Santos Junior, J Ribeiro; da Silva Fonseca, Rosana A; Lobo-Castañón, Maria Jesús

    2015-08-18

    Cultivation of genetically modified organisms (GMOs) and their use in food and feed is constantly expanding; thus, the question of informing consumers about their presence in food has proven of significant interest. The development of sensitive, rapid, robust, and reliable methods for the detection of GMOs is crucial for proper food labeling. In response, we have experimentally characterized the helicase-dependent isothermal amplification (HDA) and sequence-specific detection of a transgene from the Cauliflower Mosaic Virus 35S Promoter (CaMV35S), inserted into most transgenic plants. HDA is one of the simplest approaches for DNA amplification, emulating the bacterial replication machinery, and resembling PCR but under isothermal conditions. However, it usually suffers from a lack of selectivity, which is due to the accumulation of spurious amplification products. To improve the selectivity of HDA, which makes the detection of amplification products more reliable, we have developed an electrochemical platform targeting the central sequence of HDA copies of the transgene. A binary monolayer architecture is built onto a thin gold film where, upon the formation of perfect nucleic acid duplexes with the amplification products, these are enzyme-labeled and electrochemically transduced. The resulting combined system increases genosensor detectability up to 10(6)-fold, allowing Yes/No detection of GMOs with a limit of detection of ∼30 copies of the CaMV35S genomic DNA. A set of general utility rules in the design of genosensors for detection of HDA amplicons, which may assist in the development of point-of-care tests, is also included. The method provides a versatile tool for detecting nucleic acids with extremely low abundance not only for food safety control but also in the diagnostics and environmental control areas. PMID:26198403

  3. Targeting helicase-dependent amplification products with an electrochemical genosensor for reliable and sensitive screening of genetically modified organisms.

    PubMed

    Moura-Melo, Suely; Miranda-Castro, Rebeca; de-Los-Santos-Álvarez, Noemí; Miranda-Ordieres, Arturo J; Dos Santos Junior, J Ribeiro; da Silva Fonseca, Rosana A; Lobo-Castañón, Maria Jesús

    2015-08-18

    Cultivation of genetically modified organisms (GMOs) and their use in food and feed is constantly expanding; thus, the question of informing consumers about their presence in food has proven of significant interest. The development of sensitive, rapid, robust, and reliable methods for the detection of GMOs is crucial for proper food labeling. In response, we have experimentally characterized the helicase-dependent isothermal amplification (HDA) and sequence-specific detection of a transgene from the Cauliflower Mosaic Virus 35S Promoter (CaMV35S), inserted into most transgenic plants. HDA is one of the simplest approaches for DNA amplification, emulating the bacterial replication machinery, and resembling PCR but under isothermal conditions. However, it usually suffers from a lack of selectivity, which is due to the accumulation of spurious amplification products. To improve the selectivity of HDA, which makes the detection of amplification products more reliable, we have developed an electrochemical platform targeting the central sequence of HDA copies of the transgene. A binary monolayer architecture is built onto a thin gold film where, upon the formation of perfect nucleic acid duplexes with the amplification products, these are enzyme-labeled and electrochemically transduced. The resulting combined system increases genosensor detectability up to 10(6)-fold, allowing Yes/No detection of GMOs with a limit of detection of ∼30 copies of the CaMV35S genomic DNA. A set of general utility rules in the design of genosensors for detection of HDA amplicons, which may assist in the development of point-of-care tests, is also included. The method provides a versatile tool for detecting nucleic acids with extremely low abundance not only for food safety control but also in the diagnostics and environmental control areas.

  4. Theory of zwitterionic molecular-based organic magnets

    NASA Astrophysics Data System (ADS)

    Shelton, William A.; Aprà, Edoardo; Sumpter, Bobby G.; Saraiva-Souza, Aldilene; Souza Filho, Antonio G.; Nero, Jordan Del; Meunier, Vincent

    2011-08-01

    We describe a class of organic molecular magnets based on zwitterionic molecules (betaine derivatives) possessing donor, π bridge, and acceptor groups. Using extensive electronic structure calculations we show that the electronic ground state in these systems is magnetic. In addition, we show that the large energy differences computed for the various magnetic states indicate a high Néel temperature. The quantum mechanical nature of the magnetic properties originates from the conjugated π bridge (only p electrons) in cooperation with the molecular donor-acceptor character. The exchange interactions between electron spins are strong, local, and independent of the length of the π bridge.

  5. Theory of self-organized critical transport in tokamak plasmas

    SciTech Connect

    Kishimoto, Y.; Tajima, T.; Horton, W.; LeBrun, M.J.; Kim, J.Y. |

    1995-07-01

    A theoretical and computational study of the ion temperature gradient (ITG) and η_i instabilities in tokamak plasmas has been carried out. In toroidal geometry the modes have a radially extended structure and their eigenfrequencies are constant over many rational surfaces that are coupled through toroidicity. These nonlocal properties of the ITG modes impose a strong constraint on the drift mode fluctuations and the associated transport, showing a self-organized characteristic. As any significant deviation away from marginal stability causes rapid temperature relaxation and intermittent bursts, the modes hover near marginality and exhibit strong kinetic characteristics. As a result, the temperature relaxation is self-similar and nonlocal, leading to a radially increasing heat diffusivity. The nonlocal transport leads to the Bohm-like diffusion scaling. The heat input regulates the deviation of the temperature gradient away from marginality. The obtained transport scalings and properties are globally consistent with experimental observations of L-mode discharges.

  6. Theory of self-organized critical transport in tokamak plasmas

    SciTech Connect

    Kishimoto, Y.; Tajima, T.; Horton, W.; LeBrun, M.J.; Kim, J.Y.

    1996-04-01

    A theoretical and computational study of the ion temperature gradient (ITG) and η_i instabilities in tokamak plasmas has been carried out. In a toroidal geometry the modes have a radially extended structure and their eigenfrequencies are constant over many rational surfaces that are coupled through toroidicity. These nonlocal properties of the ITG modes impose a strong constraint on the drift mode fluctuations and the associated transport, showing self-organized criticality. As any significant deviation away from marginal stability causes rapid temperature relaxation and intermittent bursts, the modes hover near marginality and exhibit strong kinetic characteristics. As a result of this, the temperature relaxation is self-similar and nonlocal, leading to radially increasing heat diffusivity. The nonlocal transport leads to Bohm-like diffusion scaling. Heat input regulates the deviation of the temperature gradient away from marginality. We present a critical gradient transport model that describes such a self-organized relaxed state. Some of the important aspects in tokamak transport like Bohm diffusion, near marginal stability, radially increasing fluctuation energy and heat diffusivity, intermittency of the wave excitation, and resilient tendency of the plasma profile can be described by this model, and these prominent features are found to belong to one physical category that originates from the radially extended nonlocal drift modes. The obtained transport properties and scalings are globally consistent with experimental observations of low confinement mode (L-mode) discharges. The nonlocal modes can be disintegrated into smaller radial islands by a poloidal shear flow, suggesting that the transport changes from Bohm-like to near gyro-Bohm. © 1996 American Institute of Physics.
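
    For reference, the Bohm-like scaling invoked above is conventionally written as follows (the standard Bohm coefficient, not a result derived in the paper); gyro-Bohm scaling would carry an additional factor of the normalized gyroradius:

      D_{\mathrm{Bohm}} = \frac{1}{16} \, \frac{k_B T_e}{e B}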

  7. Photoluminescent Organic Molecules from the Perspective of Density Functional Theory

    NASA Astrophysics Data System (ADS)

    Massaro, Richard Douglas

    2011-12-01

    I have studied the electronic structure, vibrational modes, and photophysics of methyl salicylate (MS) isomers in detail using density functional theory (DFT) and its time-dependent (TDDFT) companion. I have confirmed that six isomers are stable in their ground states with the ketoB isomer being the global minimum structure. I have performed free energy calculations which show that other isomers may be energetically favorable at higher temperatures. The calculated vibrational modes of ketoB match well with experimental infrared spectra. Using TDDFT, I have confirmed that the ketoB isomer undergoes an energetically favorable excited-state intramolecular proton transfer (ESIPT) to an enol isomer. I found that the ESIPT has a small potential energy barrier when the proton transitions from the ketoB to the enol structure and a ten times larger barrier to accomplish a reverse ESIPT from enol to ketoB. The barrier asymmetry is responsible for the temperature dependent suppression of the far-blue fluorescence. I modeled the emission spectra for gas phase MS using Franck-Condon factors based on the calculated 0-0 transition and vibrational modes for the ground and excited states. The calculated spectra match well to gas phase experimental spectra. Finally, I performed detailed DFT studies on dipicolinic acid (DPA) and determined its stable structures, energetics, and vibrational modes. My calculations predict the existence of six stable isomers of gas phase DPA in the ground state. Three of these isomers are nearly energetically degenerate. I calculated several transition state reaction paths between these isomers. I performed similar calculations on five dimerized formations. By using periodic boundary conditions (PBC) on three dimerized DPA arrays containing hydrogen-bonding DPA monomers, I was able to predict three different crystal structures. I report the band structures of the resulting DPA crystals for the first time. All of them are insulators.

  8. Human hair follicle organ culture: theory, application and perspectives.

    PubMed

    Langan, Ewan A; Philpott, Michael P; Kloepper, Jennifer E; Paus, Ralf

    2015-12-01

    For almost a quarter of a century, ex vivo studies of human scalp hair follicles (HFs) have permitted major advances in hair research, spanning diverse fields such as chronobiology, endocrinology, immunology, metabolism, mitochondrial biology, neurobiology, pharmacology, pigmentation and stem cell biology. Despite this, a comprehensive methodological guide to serum-free human HF organ culture (HFOC) that facilitates the selection and analysis of standard HF biological parameters and points out both research opportunities and pitfalls to newcomers to the field is still lacking. The current methods review aims to close an important gap in the literature and attempts to promote standardisation of human HFOC. We provide basic information outlining the establishment of HFOC through to detailed descriptions of the analysis of standard read-out parameters alongside practical examples. The guide closes by pointing out how serum-free HFOC can be utilised optimally to obtain previously inaccessible insights into human HF biology and pathology that are of interest to experimental dermatologists, geneticists, developmental biologists and (neuro-) endocrinologists alike and by highlighting novel applications of the model, including gene silencing and gene expression profiling of defined, laser capture-microdissected HF compartments. PMID:26284830

  9. Human hair follicle organ culture: theory, application and perspectives.

    PubMed

    Langan, Ewan A; Philpott, Michael P; Kloepper, Jennifer E; Paus, Ralf

    2015-12-01

    For almost a quarter of a century, ex vivo studies of human scalp hair follicles (HFs) have permitted major advances in hair research, spanning diverse fields such as chronobiology, endocrinology, immunology, metabolism, mitochondrial biology, neurobiology, pharmacology, pigmentation and stem cell biology. Despite this, a comprehensive methodological guide to serum-free human HF organ culture (HFOC) that facilitates the selection and analysis of standard HF biological parameters and points out both research opportunities and pitfalls to newcomers to the field is still lacking. The current methods review aims to close an important gap in the literature and attempts to promote standardisation of human HFOC. We provide basic information outlining the establishment of HFOC through to detailed descriptions of the analysis of standard read-out parameters alongside practical examples. The guide closes by pointing out how serum-free HFOC can be utilised optimally to obtain previously inaccessible insights into human HF biology and pathology that are of interest to experimental dermatologists, geneticists, developmental biologists and (neuro-) endocrinologists alike and by highlighting novel applications of the model, including gene silencing and gene expression profiling of defined, laser capture-microdissected HF compartments.

  10. The validity and reliability of the World Health Organization Mental Disorders Checklist for use in a telehealth clinic in Hong Kong.

    PubMed

    Leung, Sau Fong; Chui, Caroline; Arthur, David; French, Peter; Lai, Alice; Lee, Wai Man; Wong, Daniel

    2005-06-01

    This research aimed to test the validity and reliability of the 'World Health Organization Mental Disorders Checklist' for use in a telehealth clinic in Hong Kong. The Checklist comprises four subscales: (i) depression; (ii) anxiety; (iii) alcohol use disorders; and (iv) functioning and disablement, and was translated from English into Chinese. It was validated by a panel of five experts to confirm its content validity (content validity index = 0.98) and cultural appropriateness in Hong Kong. The reliability of the checklist was supported by the findings of a test-retest procedure (Pearson correlation = 0.66-0.88, P < 0.01), internal consistency reliability (Cronbach's alpha = 0.54-0.83), and interrater reliability (Kendall's coefficient of concordance = 0.58-1.00, P < 0.01) involving a sample of 197 subjects from one telehealth clinic in Hong Kong.
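
    Of the reliability statistics reported, internal consistency is the easiest to sketch. Cronbach's alpha for a respondents-by-items score matrix is computed below on simulated data (one latent trait plus noise), an illustration only, not the study's data:

      import numpy as np

      def cronbach_alpha(items):
          """alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
          k = items.shape[1]
          item_var = items.var(axis=0, ddof=1).sum()
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1.0 - item_var / total_var)

      rng = np.random.default_rng(0)
      trait = rng.normal(size=(200, 1))
      scores = trait + rng.normal(scale=0.7, size=(200, 5))   # 5 items, one trait
      print(round(cronbach_alpha(scores), 2))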

  11. Towards a Theory of Variation in the Organization of the Word Reading System

    PubMed Central

    Rueckl, Jay G.

    2015-01-01

    The strategy underlying most computational models of word reading is to specify the organization of the reading system—its architecture and the processes and representations it employs—and to demonstrate that this organization would give rise to the behavior observed in word reading tasks. This approach fails to adequately address the variation in reading behavior observed across and within linguistic communities. Only computational models that incorporate learning can fully account for variation in organization. However, even extant learning models (e.g., the triangle model) must be extended if they are to fully account for variation in organization. The challenges associated with extending theories in this way are discussed. PMID:26997862

  12. A review of carrier thermoelectric-transport theory in organic semiconductors.

    PubMed

    Lu, Nianduan; Li, Ling; Liu, Ming

    2016-07-20

    Carrier thermoelectric-transport theory has recently attracted growing interest, and numerous thermoelectric-transport models have been proposed for organic semiconductors, driven by pressing current issues involving energy production and the environment. The purpose of this review is to provide a theoretical description of the thermoelectric Seebeck effect in organic semiconductors. Special attention is devoted to the dependence of the Seebeck effect on carrier concentration, temperature, the polaron effect and the dipole effect, and to its relationship to hopping transport theory. Furthermore, various theoretical methods are used to discuss carrier thermoelectric transport. Finally, an outlook of the remaining challenges ahead for future theoretical research is provided.

  13. A review of carrier thermoelectric-transport theory in organic semiconductors.

    PubMed

    Lu, Nianduan; Li, Ling; Liu, Ming

    2016-07-20

    Carrier thermoelectric-transport theory has recently attracted growing interest, and numerous thermoelectric-transport models have been proposed for organic semiconductors, driven by pressing current issues involving energy production and the environment. The purpose of this review is to provide a theoretical description of the thermoelectric Seebeck effect in organic semiconductors. Special attention is devoted to the dependence of the Seebeck effect on carrier concentration, temperature, the polaron effect and the dipole effect, and to its relationship to hopping transport theory. Furthermore, various theoretical methods are used to discuss carrier thermoelectric transport. Finally, an outlook of the remaining challenges ahead for future theoretical research is provided. PMID:27386952

  14. Application of fuzzy set and Dempster-Shafer theory to organic geochemistry interpretation

    NASA Technical Reports Server (NTRS)

    Kim, C. S.; Isaksen, G. H.

    1993-01-01

    An application of fuzzy sets and Dempster-Shafer Theory (DST) in modeling the interpretational process of organic geochemistry data for predicting the level of maturities of oil and source rock samples is presented. This was accomplished by (1) representing linguistic imprecision and imprecision associated with experience by fuzzy set theory, (2) capturing the probabilistic nature of imperfect evidence by DST, and (3) combining multiple evidences by utilizing John Yen's generalized Dempster-Shafer Theory (GDST), which allows DST to deal with fuzzy information. The current prototype provides collective beliefs on the predicted levels of maturity by combining multiple evidences through GDST's rule of combination.
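
    For crisp (non-fuzzy) focal elements, GDST's rule of combination reduces to Dempster's classical rule: multiply masses, pool the products on intersecting focal elements, and renormalize by the non-conflicting mass. A minimal sketch over a hypothetical frame of maturity levels:

      from itertools import product

      FRAME = frozenset({"immature", "mature", "overmature"})

      def combine(m1: dict, m2: dict) -> dict:
          """Dempster's rule for two basic mass assignments over FRAME."""
          out, conflict = {}, 0.0
          for (a, x), (b, y) in product(m1.items(), m2.items()):
              inter = a & b
              if inter:
                  out[inter] = out.get(inter, 0.0) + x * y
              else:
                  conflict += x * y
          return {k: v / (1.0 - conflict) for k, v in out.items()}

      # Two hypothetical pieces of evidence about sample maturity.
      m1 = {frozenset({"mature"}): 0.6, FRAME: 0.4}
      m2 = {frozenset({"mature", "overmature"}): 0.7, FRAME: 0.3}
      print(combine(m1, m2))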

  15. Egalitarian and maximin theories of justice: directed donation of organs for transplant.

    PubMed

    Veatch, R M

    1998-08-01

    It is common to interpret Rawls's maximin theory of justice as egalitarian. Compared to utilitarian theories, this may be true. However, in special cases practices that distribute resources so as to benefit the worst off actually increase the inequality between the worst off and some who are better off. In these cases the Rawlsian maximin parts company with what is here called true egalitarianism. A policy question requiring a distinction between maximin and "true egalitarian" allocations has arisen in the arena of organ transplantation. This case is examined here as a venue for differentiating maximin and true egalitarian theories. Directed donation is the name given to donations of organs restricted to a particular social group. For example, the family of a member of the Ku Klux Klan donated his organs on the provision that they go only to members of the Caucasian race. While such donations appear to be discriminatory, if certain plausible assumptions are made, they satisfy the maximin criterion. They selectively advantage the recipient of the organs without harming anyone (assuming the organs would otherwise go unused). Moreover, everyone who is lower on the waiting list (who, thereby, could be considered worse off) is advantaged by moving up on the waiting list. This paper examines how maximin and more truly egalitarian theories handle this case arguing that, to the extent that directed donation is unethical, the best account of that conclusion is that an egalitarian principle of justice is to be preferred to the maximin. PMID:9892035

  16. Egalitarian and maximin theories of justice: directed donation of organs for transplant.

    PubMed

    Veatch, R M

    1998-08-01

    It is common to interpret Rawls's maximin theory of justice as egalitarian. Compared to utilitarian theories, this may be true. However, in special cases practices that distribute resources so as to benefit the worst off actually increase the inequality between the worst off and some who are better off. In these cases the Rawlsian maximin parts company with what is here called true egalitarianism. A policy question requiring a distinction between maximin and "true egalitarian" allocations has arisen in the arena of organ transplantation. This case is examined here as a venue for differentiating maximin and true egalitarian theories. Directed donation is the name given to donations of organs restricted to a particular social group. For example, the family of a member of the Ku Klux Klan donated his organs on the provision that they go only to members of the Caucasian race. While such donations appear to be discriminatory, if certain plausible assumptions are made, they satisfy the maximin criterion. They selectively advantage the recipient of the organs without harming anyone (assuming the organs would otherwise go unused). Moreover, everyone who is lower on the waiting list (who, thereby, could be considered worse off) is advantaged by moving up on the waiting list. This paper examines how maximin and more truly egalitarian theories handle this case arguing that, to the extent that directed donation is unethical, the best account of that conclusion is that an egalitarian principle of justice is to be preferred to the maximin.

  17. Reliability and Validity Study of the Mobile Learning Adoption Scale Developed Based on the Diffusion of Innovations Theory

    ERIC Educational Resources Information Center

    Celik, Ismail; Sahin, Ismail; Aydin, Mustafa

    2014-01-01

    In this study, a mobile learning adoption scale (MLAS) was developed on the basis of Rogers' (2003) Diffusion of Innovations Theory. The scale that was developed consists of four sections. These sections are as follows: Stages in the innovation-decision process, Types of m-learning decision, Innovativeness level and attributes of m-learning.…

  18. Reliability and Validity Study of the Mobile Learning Adoption Scale Developed Based on the Diffusion of Innovations Theory

    ERIC Educational Resources Information Center

    Celik, Ismail; Sahin, Ismail; Aydin, Mustafa

    2014-01-01

    In this study, a mobile learning adoption scale (MLAS) was developed on the basis of Rogers' (2003) Diffusion of Innovations Theory. The scale that was developed consists of four sections. These sections are as follows: Stages in the innovation-decision process, Types of m-learning decision, Innovativeness level and attributes of m-learning. There…

  19. Compatibility between Text Mining and Qualitative Research in the Perspectives of Grounded Theory, Content Analysis, and Reliability

    ERIC Educational Resources Information Center

    Yu, Chong Ho; Jannasch-Pennell, Angel; DiGangi, Samuel

    2011-01-01

    The objective of this article is to illustrate that text mining and qualitative research are epistemologically compatible. First, like many qualitative research approaches, such as grounded theory, text mining encourages open-mindedness and discourages preconceptions. Contrary to the popular belief that text mining is a linear and fully automated…

  20. Improving the Reliability of Student Scores from Speeded Assessments: An Illustration of Conditional Item Response Theory Using a Computer-Administered Measure of Vocabulary

    PubMed Central

    Petscher, Yaacov; Mitchell, Alison M.; Foorman, Barbara R.

    2016-01-01

    A growing body of literature suggests that response latency, the amount of time it takes an individual to respond to an item, may be an important factor to consider when using assessment data to estimate the ability of an individual. Considering that tests of passage and list fluency are being adapted to a computer administration format, it is possible that accounting for individual differences in response times may be an increasingly feasible option to strengthen the precision of individual scores. The present research evaluated the differential reliability of scores when using classical test theory and item response theory as compared to a conditional item response model which includes response time as an item parameter. Results indicated that the precision of student ability scores increased by an average of 5 % when using the conditional item response model, with greater improvements for those who were average or high ability. Implications for measurement models of speeded assessments are discussed. PMID:27721568

  1. Applications of the Conceptual Density Functional Theory Indices to Organic Chemistry Reactivity.

    PubMed

    Domingo, Luis R; Ríos-Gutiérrez, Mar; Pérez, Patricia

    2016-01-01

    Theoretical reactivity indices based on the conceptual Density Functional Theory (DFT) have become a powerful tool for the semiquantitative study of organic reactivity. A large number of reactivity indices have been proposed in the literature. Herein, global quantities like the electronic chemical potential μ, the electrophilicity ω and the nucleophilicity N indices, and local condensed indices like the electrophilic P_k^+ and nucleophilic P_k^- Parr functions, as the most relevant indices for the study of organic reactivity, are discussed. PMID:27294896
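
    The global indices listed above have simple frozen-orbital estimates from frontier orbital energies: μ ≈ (E_HOMO + E_LUMO)/2, hardness η ≈ E_LUMO − E_HOMO, and ω = μ²/(2η). A sketch with illustrative orbital energies (not values from the paper):

      # Koopmans-type estimates of the conceptual-DFT global indices.
      E_HOMO, E_LUMO = -6.5, -1.5     # eV, hypothetical organic molecule

      mu = (E_HOMO + E_LUMO) / 2      # electronic chemical potential
      eta = E_LUMO - E_HOMO           # chemical hardness
      omega = mu**2 / (2 * eta)       # electrophilicity index
      print(f"mu = {mu} eV, eta = {eta} eV, omega = {omega:.2f} eV")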

  2. Customer-organization relationships: development and test of a theory of extended identities.

    PubMed

    Bagozzi, Richard P; Bergami, Massimo; Marzocchi, Gian Luca; Morandin, Gabriele

    2012-01-01

    We develop a theory of personal, relational, and collective identities that links organizations and consumers. Four targets of identity are studied: small friendship groups of aficionados of Ducati motorcycles, virtual communities centered on Ducatis, the Ducati brand, and Ducati the company. The interplay amongst the identities is shown to order affective, cognitive, and evaluative reactions toward each target. Hypotheses are tested on a sample of 210 Ducati aficionados, and implications of these multiple, extended identities for organizations are examined.

  3. Applications of the Conceptual Density Functional Theory Indices to Organic Chemistry Reactivity.

    PubMed

    Domingo, Luis R; Ríos-Gutiérrez, Mar; Pérez, Patricia

    2016-06-09

    Theoretical reactivity indices based on the conceptual Density Functional Theory (DFT) have become a powerful tool for the semiquantitative study of organic reactivity. A large number of reactivity indices have been proposed in the literature. Herein, global quantities like the electronic chemical potential μ, the electrophilicity ω and the nucleophilicity N indices, and local condensed indices like the electrophilic P_k^+ and nucleophilic P_k^- Parr functions, as the most relevant indices for the study of organic reactivity, are discussed.

  4. Examining Agency Theory in Training & Development: Understanding Self-Interest Behaviors in the Organization

    ERIC Educational Resources Information Center

    Azevedo, Ross E.; Akdere, Mesut

    2011-01-01

    Agency theory has been discussed widely in the business and management literature. However, to date there has been no investigation about its utility and implications for problems in training & development. Whereas organizations are still struggling to develop and implement effective training programs, there is little emphasis on the self-interest…

  5. Assuring reliability program effectiveness.

    NASA Technical Reports Server (NTRS)

    Ball, L. W.

    1973-01-01

    An attempt is made to provide simple identification and description of techniques that have proved to be most useful either in developing a new product or in improving reliability of an established product. The first reliability task is obtaining and organizing parts failure rate data. Other tasks are parts screening, tabulation of general failure rates, preventive maintenance, prediction of new product reliability, and statistical demonstration of achieved reliability. Five principal tasks for improving reliability involve the physics of failure research, derating of internal stresses, control of external stresses, functional redundancy, and failure effects control. A final task is the training and motivation of reliability specialist engineers.
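
    Of the reliability-improvement tasks listed, functional redundancy is the most directly quantifiable: with n identical parallel units, the subsystem survives if at least one unit does. A minimal sketch assuming a constant failure rate, with all numbers illustrative:

      import math

      def redundant_reliability(lam: float, t: float, n: int) -> float:
          """Probability that at least one of n units with constant failure
          rate lam (per hour) survives mission time t (hours)."""
          r_unit = math.exp(-lam * t)
          return 1.0 - (1.0 - r_unit) ** n

      for n in (1, 2, 3):
          print(n, round(redundant_reliability(1e-4, 1000.0, n), 4))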

  6. Organisms, organizations and interactions: an information theory approach to biocultural evolution.

    PubMed

    Wallace, R; Wallace, R G

    1999-08-01

    The language metaphor of theoretical biology, proposed by Waddington in 1972, provides a basis for the formal examination of how different self-reproducing structures interact in an extended evolutionary context. Such interactions have become central objects of study in fields ranging from human evolution (genes and culture) to economics (firms, markets, and technology). Here we use the Shannon-McMillan Theorem, one of the fundamental asymptotic relations of probability theory, to study the 'weakest', and hence most universal, forms of interaction between generalized languages. We propose that the co-evolving gene-culture structure that permits human ultra-sociality emerged in a singular coagulation of genetic and cultural 'languages', in the general sense of the word. Human populations have since hosted a series of culture-only speciations and coagulations, events that, in this formulation, do not become mired in the 'meme' concept.
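
    The Shannon-McMillan theorem invoked here is the asymptotic equipartition property: for a stationary ergodic source with entropy rate H, the per-symbol log-probability of almost every long sequence converges to H (a textbook statement, included for orientation):

      \lim_{n \to \infty} -\frac{1}{n} \log p(X_1, X_2, \ldots, X_n) = H \quad \text{almost surely}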

  7. [Business organization theory: its potential use in the organization of the operating room].

    PubMed

    Bartz, H-J

    2005-07-01

    The paradigm of patient care in the German health system is changing. The introduction of German Diagnosis Related Groups (G-DRGs), a diagnosis-related coding system, has made process-oriented thinking increasingly important. The treatment process is viewed and managed as a whole from the admission to the discharge of the patient. The interfaces of departments and sectors are diminished. A main objective of these measures is to render patient care more cost efficient. Within the hospital, the operating room (OR) is the most expensive factor, accounting for 25 - 50 % of the costs of a surgical patient, and is also a bottleneck in surgical patient care. Therefore, controlling of the perioperative treatment process is becoming more and more important. Here, business organisation theory can be a very useful tool. Especially the concepts of process organisation and process management can be applied to hospitals. Process-oriented thinking uncovers and solves typical organisational problems. Competences, responsibilities and tasks are reorganised by process orientation and the enterprise is gradually transformed to a process-oriented system. Process management includes objective-oriented controlling of the value chain of an enterprise with regard to quality, time, costs and customer satisfaction. The quality of the process is continuously improved using process-management techniques. The main advantage of process management is consistent customer orientation. Customer orientation means to be aware of the customer's needs at any time during the daily routine. The performance is therefore always directed towards current market requirements. This paper presents the basics of business organisation theory and points out its potential use in the organisation of the OR.

  8. [Business organization theory: its potential use in the organization of the operating room].

    PubMed

    Bartz, H-J

    2005-07-01

    The paradigm of patient care in the German health system is changing. The introduction of German Diagnosis Related Groups (G-DRGs), a diagnosis-related coding system, has made process-oriented thinking increasingly important. The treatment process is viewed and managed as a whole from the admission to the discharge of the patient. The interfaces of departments and sectors are diminished. A main objective of these measures is to render patient care more cost efficient. Within the hospital, the operating room (OR) is the most expensive factor, accounting for 25 - 50 % of the costs of a surgical patient, and is also a bottleneck in surgical patient care. Therefore, controlling of the perioperative treatment process is becoming more and more important. Here, business organisation theory can be a very useful tool. Especially the concepts of process organisation and process management can be applied to hospitals. Process-oriented thinking uncovers and solves typical organisational problems. Competences, responsibilities and tasks are reorganised by process orientation and the enterprise is gradually transformed to a process-oriented system. Process management includes objective-oriented controlling of the value chain of an enterprise with regard to quality, time, costs and customer satisfaction. The quality of the process is continuously improved using process-management techniques. The main advantage of process management is consistent customer orientation. Customer orientation means to be aware of the customer's needs at any time during the daily routine. The performance is therefore always directed towards current market requirements. This paper presents the basics of business organisation theory and points out its potential use in the organisation of the OR. PMID:16001317

  9. Organizers.

    ERIC Educational Resources Information Center

    Callison, Daniel

    2000-01-01

    Focuses on "organizers," tools or techniques that provide identification and classification along with possible relationships or connections among ideas, concepts, and issues. Discusses David Ausubel's research and ideas concerning advance organizers; the implications of Ausubel's theory to curriculum and teaching; "webbing," a specific…

  10. The relationship between the reliability of transistors with 2D AlGaN/GaN channel and organization type of nanomaterial

    NASA Astrophysics Data System (ADS)

    Emtsev, V. V.; Zavarin, E. E.; Oganesyan, G. A.; Petrov, V. N.; Sakharov, A. V.; Shmidt, N. M.; V'yuginov, V. N.; Zybin, A. A.; Parnes, Ya. M.; Vidyakin, S. I.; Gudkov, A. G.; Chernyakov, A. E.

    2016-07-01

    The first experimental results are presented demonstrating that the carrier mobility in the AlGaN/GaN 2D channel of transistor structures (AlGaN/GaN-HEMT) is correlated both with the manner in which the nanomaterial is organized and with the operational reliability of transistor parameters. It is shown that improving the organization of the nanomaterial in AlGaN/GaN-HEMT structures, as evaluated by the multifractal parameter characterizing the extent to which the nanomaterial is disordered (local symmetry breaking), is accompanied by a significant, several-fold increase in the electron mobility in the 2D channel and in the reliability of parameters of transistors fabricated from these structures.

  11. Assessment of the ΔSCF density functional theory approach for electronic excitations in organic dyes

    SciTech Connect

    Kowalczyk, T.; Yost, S. R.; Van Voorhis, T.

    2010-01-01

    This paper assesses the accuracy of the ΔSCF method for computing low-lying HOMO→LUMO transitions in organic dye molecules. For a test set of vertical excitation energies of 16 chromophores, surprisingly similar accuracy is observed for time-dependent density functional theory and for ΔSCF density functional theory. In light of this performance, we reconsider the ad hoc ΔSCF prescription and demonstrate that it formally obtains the exact stationary density within the adiabatic approximation, partially justifying its use. The relative merits and future prospects of ΔSCF for simulating individual excited states are discussed.
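
    In ΔSCF, the excitation energy is obtained from two independent self-consistent-field calculations rather than from a response calculation; schematically (our notation, not taken from the paper),

        \omega_{\Delta\mathrm{SCF}} = E_{\mathrm{SCF}}\left[\mathrm{HOMO}^{1}\,\mathrm{LUMO}^{1}\right] - E_{\mathrm{SCF}}\left[\mathrm{HOMO}^{2}\,\mathrm{LUMO}^{0}\right],

    i.e., the energy of a non-aufbau SCF solution with one electron promoted from the HOMO to the LUMO, minus the ground-state SCF energy.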

  12. Self-organization theories and environmental management: The case of South Moresby, Canada

    NASA Astrophysics Data System (ADS)

    Grzybowski, Alex G. S.; Slocombe, D. Scott

    1988-07-01

    This article presents a new approach to the analysis and management of large-scale societal problems with complex ecological, economic, and social dimensions. The approach is based on the theory of self-organizing systems—complex, open, far-from-equilibrium systems with nonlinear dynamics. A brief overview and comparison of different self-organization theories (synergetics, self-organization theory, hypercycles, and autopoiesis) is presented in order to isolate the key characteristics of such systems. The approach is used to develop an analysis of the land-use controversy in the South Moresby area of the Queen Charlotte Islands, British Columbia, Canada. Critical variables are identified for each subsystem and classified by spatial and temporal scale, and discussed in terms of information content and internal/external origin. Eradication of sea otters, introduction of black-tailed deer, impacts of large-scale clearcut logging, sustainability of the coastal forest industry, and changing relations between native peoples and governments are discussed in detail to illustrate the system dynamics of the South Moresby “sociobiophysical” system. Finally, implications of the self-organizing sociobiophysical system view for regional analysis and management are identified.

  13. Human reliability analysis

    SciTech Connect

    Dougherty, E.M.; Fragola, J.R.

    1988-01-01

    The authors present a treatment of human reliability analysis, incorporating an introduction to probabilistic risk assessment for nuclear power generating stations. They treat the subject within the framework established for general systems theory, drawing upon reliability analysis, psychology, human factors engineering, and statistics, and integrating elements of these fields within a systems framework. A history of human reliability analysis is provided, with examples of the application of the systems approach.

  14. Content-oriented Approach to Organization of Theories and Its Utilization

    NASA Astrophysics Data System (ADS)

    Hayashi, Yusuke; Bourdeau, Jacqueline; Mizoguch, Riichiro

    Although the relation between theory and practice is a foundation of scientific and technological development, the gap between theory and practice has widened in recent years. This gap carries a risk of distrust of science and technology. Ontological engineering, as content-oriented research, is expected to contribute to resolving the gap. This paper presents the feasibility of organizing theoretical knowledge on the basis of ontological engineering, and of new-generation intelligent systems built on it, through an application of ontological engineering in the area of learning/instruction support. This area also suffers from the gap between theory and practice, and its resolution is strongly required. We have previously proposed the OMNIBUS ontology, a comprehensive ontology that covers different learning/instructional theories and paradigms, and SMARTIES, a theory-aware and standard-compliant authoring system for making learning/instructional scenarios based on the OMNIBUS ontology. We believe that theory-awareness and standard-compliance bridge the gap between theory and practice, because they link theories to the practical use of standard technologies and enable practitioners to enjoy theoretical support easily while using standard technologies in practice. The following goals are set in order to achieve this: computers (1) understand a variety of learning/instructional theories based on an organization of them, (2) utilize this understanding to help authors make learning/instructional scenarios, and (3) make such theoretically sound scenarios interoperable within the framework of standard technologies. This paper suggests an ontological engineering solution for achieving these three goals. Although the evaluation is far from complete in terms of practical use, we believe that the results of this study address high-level technical challenges from the viewpoint of the current state of the art in the research area.

  15. Assessing governance theory and practice in health-care organizations: a survey of UK hospices.

    PubMed

    Chambers, Naomi; Benson, Lawrence; Boyd, Alan; Girling, Jeff

    2012-05-01

    This paper sets out a theoretical framework for analyzing board governance, and describes an empirical study of corporate governance practices in a subset of non-profit organizations (hospices in the UK). It examines how practices in hospice governance compare with what is known about effective board working. We found that key strengths of hospice boards included a strong focus on the mission and the finances of the organizations, and common weaknesses included a lack of involvement in strategic matters and a lack of confidence, and some nervousness about challenging the organization on the quality of clinical care. Finally, the paper offers suggestions for theoretical development particularly in relation to board governance in non-profit organizations. It develops an engagement theory for boards which comprises a triadic proposition of high challenge, high support and strong grip. PMID:22673698

  17. Validation and test-retest reliability of a health measure, health as ability of acting, based on the welfare theory of health.

    PubMed

    Snellman, Ingrid; Jonsson, Bosse; Wikblad, Karin

    2012-03-01

    The aim of this study was to conduct a validation and assess the test-retest reliability of the health questionnaire based on Nordenfelt's Welfare Theory of Health (WTH). The study used a questionnaire on health together with the Short Form 12-Item Health Survey (SF-12) questionnaire, and 490 pupils at colleges for adult education participated. The results of the study are in accordance with Nordenfelt's WTH. Three hypotheses were stated, and the first was confirmed: People who were satisfied with life rated higher levels than those who were dissatisfied with life concerning both mental and physical health, measured with the SF-12. The second hypothesis was partially confirmed: People with high education were more often satisfied with life than those with low education, but they were not healthier. The third hypothesis, that women are unhealthy more often than men, was not confirmed. The questionnaire on health showed acceptable stability. PMID:21930655
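
    For readers unfamiliar with test-retest procedures, the stability check reported above boils down to comparing two administrations of the same instrument. A minimal sketch in Python (the scores below are hypothetical, not the study's data):

        import numpy as np

        # Hypothetical scores for the same eight respondents on two occasions.
        test   = np.array([72, 65, 80, 55, 90, 77, 60, 84], dtype=float)
        retest = np.array([70, 68, 78, 57, 88, 75, 63, 82], dtype=float)

        # Pearson correlation as a simple stability coefficient.
        r = np.corrcoef(test, retest)[0, 1]

        # Mean difference between occasions flags any systematic shift.
        bias = (retest - test).mean()

        print(f"test-retest r = {r:.3f}, mean shift = {bias:+.2f} points")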

  18. Insights into the organization of biochemical regulatory networks using graph theory analyses.

    PubMed

    Ma'ayan, Avi

    2009-02-27

    Graph theory has been a valuable mathematical modeling tool to gain insights into the topological organization of biochemical networks. There are two types of insights that may be obtained by graph theory analyses. The first provides an overview of the global organization of biochemical networks; the second uses prior knowledge to place results from multivariate experiments, such as microarray data sets, in the context of known pathways and networks to infer regulation. Using graph analyses, biochemical networks are found to be scale-free and small-world, indicating that these networks contain hubs, which are proteins that interact with many other molecules. These hubs may interact with many different types of proteins at the same time and location or at different times and locations, resulting in diverse biological responses. Groups of components in networks are organized in recurring patterns termed network motifs such as feedback and feed-forward loops. Graph analysis revealed that negative feedback loops are less common and are present mostly in proximity to the membrane, whereas positive feedback loops are highly nested in an architecture that promotes dynamical stability. Cell signaling networks have multiple pathways from some input receptors and few from others. Such topology is reminiscent of a classification system. Signaling networks display a bow-tie structure indicative of funneling information from extracellular signals and then dispatching information from a few specific central intracellular signaling nexuses. These insights show that graph theory is a valuable tool for gaining an understanding of global regulatory features of biochemical networks. PMID:18940806
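
    The hub and small-world diagnostics mentioned above are straightforward to compute with standard graph libraries. A minimal sketch using networkx on a synthetic scale-free graph (the graph, its size, and the generator are stand-ins, not a curated biochemical network):

        import networkx as nx

        # Toy "interaction network"; a real analysis would load curated pathways.
        G = nx.barabasi_albert_graph(n=200, m=2, seed=1)  # scale-free-like graph

        # Hubs: the few highest-degree nodes.
        hubs = sorted(G.degree, key=lambda kv: kv[1], reverse=True)[:5]

        # Small-world indicators: clustering versus shortest path length.
        C = nx.average_clustering(G)
        L = nx.average_shortest_path_length(G)

        print("top hubs (node, degree):", hubs)
        print(f"avg clustering = {C:.3f}, avg shortest path = {L:.2f}")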

  20. Reliability of fluid systems

    NASA Astrophysics Data System (ADS)

    Kopáček, Jaroslav; Fojtášek, Kamil; Dvořák, Lukáš

    2016-03-01

    This paper focuses on the importance of determining reliability, especially in complex fluid systems for demanding production technology. The initial criterion for assessing reliability is the failure of an object (element), which is treated as a random variable whose data (values) can be processed using the mathematical methods of probability theory and statistics. The basic indicators of reliability are defined, together with their application in calculations for serial, parallel and backed-up systems. For illustration, calculation examples of reliability indicators are given for various elements of the system and for a selected pneumatic circuit.
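
    The series, parallel and standby formulas referred to above are standard; a minimal sketch (all failure rates and the mission time are hypothetical):

        import math

        def series(rs):
            """Series system: works only if every element works."""
            out = 1.0
            for r in rs:
                out *= r
            return out

        def parallel(rs):
            """Parallel system: fails only if every element fails."""
            out = 1.0
            for r in rs:
                out *= (1.0 - r)
            return 1.0 - out

        def exp_reliability(lam, t):
            """Constant failure rate: R(t) = exp(-lambda * t)."""
            return math.exp(-lam * t)

        def cold_standby(lam, t):
            """One identical cold spare, perfect switching: R(t) = e^(-lt)(1 + lt)."""
            return math.exp(-lam * t) * (1.0 + lam * t)

        # Hypothetical pneumatic elements over a 1000 h mission.
        r_valve = exp_reliability(2e-5, 1000.0)
        r_cyl   = exp_reliability(5e-6, 1000.0)

        print(f"valve-cylinder series  R = {series([r_valve, r_cyl]):.4f}")
        print(f"two valves in parallel R = {parallel([r_valve, r_valve]):.6f}")
        print(f"valve with cold spare  R = {cold_standby(2e-5, 1000.0):.6f}")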

  1. Subcellular metabolic organization in the context of dynamic energy budget and biochemical systems theories.

    PubMed

    Vinga, S; Neves, A R; Santos, H; Brandt, B W; Kooijman, S A L M

    2010-11-12

    The dynamic modelling of metabolic networks aims to describe the temporal evolution of metabolite concentrations in cells. This area has attracted increasing attention in recent years owing to the availability of high-throughput data and the general development of systems biology as a promising approach to study living organisms. Biochemical Systems Theory (BST) provides an accurate formalism to describe biological dynamic phenomena. However, knowledge about the molecular organization level, used in these models, is not enough to explain phenomena such as the driving forces of these metabolic networks. Dynamic Energy Budget (DEB) theory captures the quantitative aspects of the organization of metabolism at the organism level in a way that is non-species-specific. This imposes constraints on the sub-organismal organization that are not present in the bottom-up approach of systems biology. We use in vivo data of lactic acid bacteria under various conditions to compare some aspects of BST and DEB approaches. Due to the large number of parameters to be estimated in the BST model, we applied powerful parameter identification techniques. Both models fitted equally well, but the BST model employs more parameters. The DEB model uses similarities of processes under growth and no-growth conditions and under aerobic and anaerobic conditions, which reduce the number of parameters. This paper discusses some future directions for the integration of knowledge from these two rich and promising areas, working top-down and bottom-up simultaneously. This middle-out approach is expected to bring new ideas and insights to both areas in terms of describing how living organisms operate. PMID:20921043
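
    BST models metabolite dynamics with power-law (S-system) rate laws of the form dX_i/dt = alpha_i * prod_j X_j^g_ij - beta_i * prod_j X_j^h_ij. A generic two-metabolite sketch (all rate constants and exponents are invented for illustration, not the fitted values from this study):

        import numpy as np
        from scipy.integrate import solve_ivp

        def s_system(t, x):
            """Toy S-system: production and degradation as power laws."""
            x1, x2 = x
            dx1 = 2.0 * x2**0.5 - 1.2 * x1**0.8
            dx2 = 1.5 * x1**0.3 - 0.9 * x2**0.6
            return [dx1, dx2]

        sol = solve_ivp(s_system, (0.0, 20.0), [0.5, 0.5])
        print("final concentrations:", sol.y[:, -1])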

  2. The Mosaic Theory Revisited: Common Molecular Mechanisms Coordinating Diverse Organ and Cellular Events in Hypertension

    PubMed Central

    Harrison, David G.

    2012-01-01

    Over 60 years ago, Dr. Irvine Page proposed the Mosaic Theory of hypertension, which states that many factors, including genetics, environment, adaptive, neural, mechanical and hormonal perturbations interdigitate to raise blood pressure. In the past two decades, it has become clear that common molecular and cellular events in various organs underlie many features of the Mosaic Theory. Two of these are the production of reactive oxygen species (ROS) and inflammation. These factors increase neuronal firing in specific brain centers, increase sympathetic outflow, alter vascular tone and morphology and promote sodium retention in the kidney. Moreover, factors such as genetics and environment contribute to oxidant generation and inflammation. Other common cellular signals, including calcium signaling and endoplasmic reticulum stress are similarly perturbed in different cells in hypertension and contribute to components of Dr. Page’s theory. Thus, Dr. Page’s Mosaic Theory formed a framework for future studies of molecular and cellular signals in the context of hypertension, and has greatly aided our understanding of this complex disease. PMID:23321405

  3. Are the Somatic Mutation and Tissue Organization Field Theories of Carcinogenesis Incompatible?

    PubMed Central

    Rosenfeld, Simon

    2013-01-01

    Two drastically different approaches to understanding the forces driving carcinogenesis have crystallized through years of research. These are the somatic mutation theory (SMT) and the tissue organization field theory (TOFT). The essence of SMT is that cancer is derived from a single somatic cell that has successively accumulated multiple DNA mutations, and that those mutations occur on genes which control cell proliferation and cell cycle. Thus, according to SMT, neoplastic lesions are the results of DNA-level events. Conversely, according to TOFT, carcinogenesis is primarily a problem of tissue organization: carcinogenic agents destroy the normal tissue architecture thus disrupting cell-to-cell signaling and compromising genomic integrity. Hence, in TOFT the DNA mutations are the effect, and not the cause, of the tissue-level events. Cardinal importance of successful resolution of the TOFT versus SMT controversy dwells in the fact that, according to SMT, cancer is a unidirectional and mostly irreversible disease; whereas, according to TOFT, it is curable and reversible. In this paper, our goal is to outline a plausible scenario in which TOFT and SMT can be reconciled using the framework and concepts of the self-organized criticality (SOC), the principle proven to be extremely fruitful in a wide range of disciplines pertaining to natural phenomena, to biological communities, to large-scale social developments, to technological networks, and to many other subjects of research. PMID:24324325

  4. Excited state and charge dynamics of hybrid organic/inorganic heterojunctions. I. Theory

    NASA Astrophysics Data System (ADS)

    Renshaw, C. Kyle; Forrest, Stephen R.

    2014-07-01

    The different cohesive forces that bond organic (i.e. excitonic) and inorganic semiconductors lead to widely disparate dielectric constants, charge mobilities, and other fundamental optoelectronic properties that make junctions between these materials interesting for numerous practical applications. Yet, there are no detailed theories addressing charge and energy transport across interfaces between these hybrid systems. Here, we develop a comprehensive physical model describing charge transport and photocurrent generation based on first-principles charge and excited state dynamics at the organic/inorganic heterojunction. We consider interfaces that are trap-free, as well as those with an exponential distribution of trap states. We find that the hybrid charge-transfer state resulting from photon absorption near the junction that subsequently migrates to the heterointerface is often unstable at room temperature, leading to its rapid dissociation into free charges that are collected at the device contacts. In the companion Paper II [A. Panda et al., Phys. Rev. B 90, 045303 (2014), 10.1103/PhysRevB.90.045303], we apply our theories to understanding the optical and electronic properties of archetype organic/inorganic heterojunction diodes. Our analysis provides insights for developing high performance optoelectronic devices whose properties are otherwise inaccessible to either conventional excitonic or inorganic semiconductor junctions.

  5. Are the somatic mutation and tissue organization field theories of carcinogenesis incompatible?

    PubMed

    Rosenfeld, Simon

    2013-01-01

    Two drastically different approaches to understanding the forces driving carcinogenesis have crystallized through years of research. These are the somatic mutation theory (SMT) and the tissue organization field theory (TOFT). The essence of SMT is that cancer is derived from a single somatic cell that has successively accumulated multiple DNA mutations, and that those mutations occur on genes which control cell proliferation and cell cycle. Thus, according to SMT, neoplastic lesions are the results of DNA-level events. Conversely, according to TOFT, carcinogenesis is primarily a problem of tissue organization: carcinogenic agents destroy the normal tissue architecture thus disrupting cell-to-cell signaling and compromising genomic integrity. Hence, in TOFT the DNA mutations are the effect, and not the cause, of the tissue-level events. Cardinal importance of successful resolution of the TOFT versus SMT controversy dwells in the fact that, according to SMT, cancer is a unidirectional and mostly irreversible disease; whereas, according to TOFT, it is curable and reversible. In this paper, our goal is to outline a plausible scenario in which TOFT and SMT can be reconciled using the framework and concepts of the self-organized criticality (SOC), the principle proven to be extremely fruitful in a wide range of disciplines pertaining to natural phenomena, to biological communities, to large-scale social developments, to technological networks, and to many other subjects of research. PMID:24324325

  6. Decision-making regarding organ donation in Korean adults: A grounded-theory study.

    PubMed

    Yeun, Eun Ja; Kwon, Young Mi; Kim, Jung A

    2015-06-01

    The aim of this study was to identify the hidden patterns of behavior leading toward the decision to donate organs. Thirteen registrants at the Association for Organ Sharing in Korea were recruited. Data were collected using in-depth interviews, and the interview transcripts were analyzed using Glaserian grounded-theory methodology. The main problem of participants was "body attachment" and the core category (management process) was determined to be "pursuing life." The theme consisted of four phases: "hesitating," "investigating," "releasing," and "re-discovering." Therefore, to increase organ donations, it is important to find a strategy that will create positive attitudes toward organ donation through education and public relations. These results explain and provide a deeper understanding of the main problem that Korean people have with organ donation and their management of the decision-making process. These findings can help care providers to facilitate the decision-making process and respond to public needs while taking into account the sociocultural context within which decisions are made.

  7. African American Organ Donor Registration: A Mixed Methods Design using the Theory of Planned Behavior

    PubMed Central

    DuBay, Derek A.; Ivankova, Nataliya; Herby, Ivan; Wynn, Theresa A.; Kohler, Connie; Berry, Beverly; Foushee, Herman; Carson, April; Redden, David T.; Holt, Cheryl; Siminoff, Laura; Fouad, Mona; Martin, Michelle Y.

    2015-01-01

    Context: A large racial disparity exists in organ donation. Objective: The purpose of this study was to identify factors associated with becoming a registered organ donor among African Americans in Alabama. Methods: The study utilized a concurrent mixed methods design guided by the Theory of Planned Behavior to analyze African Americans’ decisions to become registered organ donors, using both qualitative (focus groups) and quantitative (survey) methods. Results: The sample consisted of 22 registered organ donors (ROD) and 65 non-registered participants (NRP) from six focus groups completed in urban (n=3) and rural (n=3) areas. Participants emphasized the importance of the autonomy to make one’s own organ donation decision and to have this decision honored posthumously. One novel barrier to becoming a ROD was the perception that organs from African Americans are often unusable due to the high prevalence of chronic medical conditions such as diabetes and hypertension. Another novel theme, discussed as an advantage of becoming a ROD, was the subsequent motivation to take responsibility for one’s health. Family and friends were the groups most commonly identified as approving and disapproving of the decision to become a ROD. The most common facilitator to becoming a ROD was information, while fear and the lack of information were the most common barriers. In contrast, religious beliefs, mistrust and social justice themes were infrequently referenced as barriers to becoming a ROD. Discussion: Findings from this study may be useful for prioritizing organ donation community-based educational interventions in campaigns to increase donor registration. PMID:25193729

  8. Did Geomagnetic Activity Challenge Electric Power Reliability During Solar Cycle 23? Evidence from the PJM Regional Transmission Organization in North America

    NASA Technical Reports Server (NTRS)

    Forbes, Kevin F.; Cyr, Chris St

    2012-01-01

    During solar cycle 22, a very intense geomagnetic storm on 13 March 1989 contributed to the collapse of the Hydro-Quebec power system in Canada. This event clearly demonstrated that geomagnetic storms have the potential to lead to blackouts. This paper addresses whether geomagnetic activity challenged power system reliability during solar cycle 23. Operations by PJM Interconnection, LLC (hereafter PJM), a regional transmission organization in North America, are examined over the period 1 April 2002 through 30 April 2004. During this time PJM coordinated the movement of wholesale electricity in all or parts of Delaware, Maryland, New Jersey, Ohio, Pennsylvania, Virginia, West Virginia, and the District of Columbia in the United States. We examine the relationship between a proxy of geomagnetically induced currents (GICs) and a metric of challenged reliability. In this study, GICs are proxied using magnetometer data from a geomagnetic observatory located just outside the PJM control area. The metric of challenged reliability is the incidence of out-of-economic-merit order dispatching due to adverse reactive power conditions. The statistical methods employed make it possible to disentangle the effects of GICs on power system operations from purely terrestrial factors. The results of the analysis indicate that geomagnetic activity can significantly increase the likelihood that the system operator will dispatch generating units based on system stability considerations rather than economic merit.
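
    The statistical idea, relating a GIC proxy to the odds of out-of-merit dispatch while controlling for terrestrial factors, can be illustrated with a toy discrete-choice fit. Everything below is synthetic (the covariates, coefficients and sample are stand-ins, not PJM or magnetometer data):

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        n = 5000

        db_dt = rng.gamma(shape=2.0, scale=10.0, size=n)  # synthetic |dB/dt| proxy
        load  = rng.normal(0.0, 1.0, size=n)              # standardized demand proxy

        # Generate a synthetic outcome: 1 = out-of-merit dispatch hour.
        p = 1.0 / (1.0 + np.exp(-(-4.0 + 0.04 * db_dt + 0.5 * load)))
        out_of_merit = rng.binomial(1, p)

        X = np.column_stack([db_dt, load])
        model = LogisticRegression().fit(X, out_of_merit)
        print("fitted coefficients (dB/dt, load):", model.coef_[0])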

  9. High reliable and stable organic field-effect transistor nonvolatile memory with a poly(4-vinyl phenol) charge trapping layer based on a pn-heterojunction active layer

    NASA Astrophysics Data System (ADS)

    Xiang, Lanyi; Ying, Jun; Han, Jinhua; Zhang, Letian; Wang, Wei

    2016-04-01

    In this letter, we demonstrate a highly reliable and stable organic field-effect transistor (OFET)-based nonvolatile memory (NVM) with the polymer poly(4-vinyl phenol) (PVP) as the charge trapping layer. In unipolar OFETs, the irreversible shifts of the turn-on voltage (Von) and the severe degradation of the memory window (ΔVon) at programming (P) and erasing (E) voltages, respectively, block their application in NVMs. This obstacle is overcome by using a pn-heterojunction as the active layer in the OFET memory, which supplies hole- and electron-accumulation channels at the applied P and E voltages, respectively. Both holes and electrons transferring from the channels to the PVP layer and overwriting the trapped charges of opposite polarity result in reliable bidirectional shifts of Von at the P and E voltages, respectively. The heterojunction OFET exhibits excellent nonvolatile memory characteristics, with a large ΔVon of 8.5 V, a desired reading (R) voltage of 0 V, reliable P/R/E/R dynamic endurance over 100 cycles, and a retention time of over 10 years.

  10. Making Reliability Arguments in Classrooms

    ERIC Educational Resources Information Center

    Parkes, Jay; Giron, Tilia

    2006-01-01

    Reliability methodology needs to evolve as validity has done into an argument supported by theory and empirical evidence. Nowhere is the inadequacy of current methods more visible than in classroom assessment. Reliability arguments would also permit additional methodologies for evidencing reliability in classrooms. It would liberalize methodology…

  11. Firm Size, a Self-Organized Critical Phenomenon: Evidence from the Dynamical Systems Theory

    NASA Astrophysics Data System (ADS)

    Chandra, Akhilesh

    This research draws upon a recent innovation in the dynamical systems literature called the theory of self-organized criticality (SOC) (Bak, Tang, and Wiesenfeld 1988) to develop a computational model of a firm's size by relating its internal and external sub-systems. As a holistic paradigm, the theory of SOC implies that a firm, as a composite system of many degrees of freedom, naturally evolves to a critical state in which a minor event starts a chain reaction that can affect either a part or the system as a whole. Thus, the global features of a firm cannot be understood by analyzing its individual parts separately. The causal framework builds upon a constant capital resource to support a volume of production at the existing level of efficiency. The critical size is defined as the production level at which the average product of a firm's factors of production attains its maximum value. The non-linearity is inferred from a change, at the border of criticality, in the nature of the relations between size and the two performance variables, viz., operating efficiency and financial efficiency. The effect of breaching the critical size is examined on stock price reactions. Consistent with the theory of SOC, it is hypothesized that the temporal response of a firm breaching the level of critical size should behave as a flicker noise (1/f) process. The flicker noise is characterized by correlations extended over a wide range of time scales, indicating some sort of cooperative effect among a firm's degrees of freedom. It is further hypothesized that a firm's size evolves to a spatial structure with scale-invariant, self-similar (fractal) properties. The system is said to be self-organized inasmuch as it naturally evolves to the state of criticality without any detailed specification of the initial conditions. In this respect, the critical state is an attractor of the firm's dynamics. Another set of hypotheses examines the relations between the size and the
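
    The canonical toy model behind SOC is the Bak-Tang-Wiesenfeld sandpile, whose avalanche-size statistics develop the heavy tails associated with flicker-noise behavior. A minimal simulation (grid size and drive length are arbitrary):

        import numpy as np

        rng = np.random.default_rng(42)
        N = 20
        grid = np.zeros((N, N), dtype=int)

        def topple(grid):
            """Relax every site holding >= 4 grains; return the avalanche size."""
            size = 0
            while True:
                unstable = np.argwhere(grid >= 4)
                if len(unstable) == 0:
                    return size
                for i, j in unstable:
                    grid[i, j] -= 4
                    size += 1
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < N and 0 <= nj < N:
                            grid[ni, nj] += 1  # grains falling off the edge are lost

        sizes = []
        for _ in range(20000):
            i, j = rng.integers(0, N, size=2)
            grid[i, j] += 1          # drive: drop one grain at a random site
            sizes.append(topple(grid))

        sizes = np.array([s for s in sizes if s > 0])
        print("mean avalanche size:", sizes.mean(), "largest:", sizes.max())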

  12. The search for reliable aqueous solubility (Sw) and octanol-water partition coefficient (Kow) data for hydrophobic organic compounds; DDT and DDE as a case study

    USGS Publications Warehouse

    Pontolillo, James; Eganhouse, R.P.

    2001-01-01

    The accurate determination of an organic contaminant's physico-chemical properties is essential for predicting its environmental impact and fate. Approximately 700 publications (1944-2001) were reviewed and all known aqueous solubilities (Sw) and octanol-water partition coefficients (Kow) for the organochlorine pesticide, DDT, and its persistent metabolite, DDE, were compiled and examined. Two problems are evident with the available database: 1) egregious errors in reporting data and references, and 2) poor data quality and/or inadequate documentation of procedures. The published literature (particularly the collative literature such as compilation articles and handbooks) is characterized by a preponderance of unnecessary data duplication. Numerous data and citation errors are also present in the literature. The percentage of original Sw and Kow data in compilations has decreased with time, and in the most recent publications (1994-97) it composes only 6-26 percent of the reported data. The variability of original DDT/DDE Sw and Kow data spans 2-4 orders of magnitude, and there is little indication that the uncertainty in these properties has declined over the last 5 decades. A criteria-based evaluation of DDT/DDE Sw and Kow data sources shows that 95-100 percent of the database literature is of poor or unevaluatable quality. The accuracy and reliability of the vast majority of the data are unknown due to inadequate documentation of the methods of determination used by the authors. [For example, estimates of precision have been reported for only 20 percent of experimental Sw data and 10 percent of experimental Kow data.] Computational methods for estimating these parameters have been increasingly substituted for direct or indirect experimental determination despite the fact that the data used for model development and validation may be of unknown reliability. Because of the prevalence of errors, the lack of methodological documentation, and unsatisfactory data
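
    For reference, the octanol-water partition coefficient under discussion is the equilibrium concentration ratio (standard definition, not specific to this study)

        K_{\mathrm{ow}} = \left. \frac{C_{\mathrm{octanol}}}{C_{\mathrm{water}}} \right|_{\text{equilibrium}},

    usually reported as log10 Kow for hydrophobic compounds such as DDT, while Sw is the equilibrium aqueous-phase concentration of the pure compound.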

  13. Adsorptive desulfurization with metal-organic frameworks: A density functional theory investigation

    NASA Astrophysics Data System (ADS)

    Chen, Zhiping; Ling, Lixia; Wang, Baojun; Fan, Huiling; Shangguan, Ju; Mi, Jie

    2016-11-01

    The contribution of each fragment of metal-organic frameworks (MOFs) to the adsorption of sulfur compounds was investigated using density functional theory (DFT). The sulfur compounds involved are dimethyl sulfide (CH3SCH3), ethyl mercaptan (CH3CH2SH) and hydrogen sulfide (H2S). MOFs with different organic ligands (NH2-BDC, BDC and NDC), metal-center structures (M, M-M and M3O) and metal ions (Zn, Cu and Fe) were used to study their effects on sulfur species adsorption. The results revealed that MOFs with coordinatively unsaturated sites (CUS) have the strongest binding strength with sulfur compounds; MOFs with the NH2-BDC substituted ligand come second, followed by those with saturated metal centers, while organic ligands without a substituent group show the weakest adsorption strength. Moreover, it was found that, among the different metal ions (Fe, Zn and Cu), MOFs with unsaturated Fe have the strongest adsorption strength for sulfur compounds. These results are consistent with our previous experimental observations and therefore provide insights into the better design of MOFs for desulfurization applications.
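
    The binding strengths being compared are, in the usual DFT convention (our notation; the paper's sign convention may differ), adsorption energies of the form

        E_{\mathrm{ads}} = E_{\mathrm{MOF+S}} - E_{\mathrm{MOF}} - E_{\mathrm{S}},

    where S is the sulfur compound; a more negative E_ads indicates stronger adsorption.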

  14. Precise segmentation of multiple organs in CT volumes using learning-based approach and information theory.

    PubMed

    Lu, Chao; Zheng, Yefeng; Birkbeck, Neil; Zhang, Jingdan; Kohlberger, Timo; Tietjen, Christian; Boettger, Thomas; Duncan, James S; Zhou, S Kevin

    2012-01-01

    In this paper, we present a novel method incorporating information theory into a learning-based approach for automatic and accurate pelvic organ segmentation (including the prostate, bladder and rectum). We target 3D CT volumes that are generated using different scanning protocols (e.g., contrast and non-contrast, with and without implant in the prostate, various resolutions and positions), and the volumes come from largely diverse sources (e.g., diseased in different organs). Three key ingredients are combined to solve this challenging segmentation problem. First, marginal space learning (MSL) is applied to efficiently and effectively localize the multiple organs in the largely diverse CT volumes. Second, learning techniques with steerable features are applied for robust boundary detection, which enables the handling of highly heterogeneous texture patterns. Third, a novel information-theoretic scheme is incorporated into the boundary inference process; the incorporation of the Jensen-Shannon divergence further drives the mesh to the best fit of the image, thus improving the segmentation performance. The proposed approach is tested on a challenging dataset containing 188 volumes from diverse sources. Our approach not only produces excellent segmentation accuracy, but also runs about eighty times faster than previous state-of-the-art solutions. The proposed method can be applied to CT images to provide visual guidance to physicians during computer-aided diagnosis, treatment planning and image-guided radiotherapy to treat cancers in the pelvic region. PMID:23286081
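
    The Jensen-Shannon term used to drive the mesh can be illustrated in isolation. A minimal sketch (the two histograms are synthetic stand-ins for intensity distributions inside and outside a candidate boundary, not the paper's data):

        import numpy as np

        def js_divergence(p, q, eps=1e-12):
            """Jensen-Shannon divergence between two discrete distributions."""
            p = np.asarray(p, dtype=float) + eps
            q = np.asarray(q, dtype=float) + eps
            p /= p.sum()
            q /= q.sum()
            m = 0.5 * (p + q)
            kl = lambda a, b: float(np.sum(a * np.log(a / b)))
            return 0.5 * kl(p, m) + 0.5 * kl(q, m)

        rng = np.random.default_rng(0)
        inside  = np.histogram(rng.normal(120, 15, 5000), bins=64, range=(0, 255))[0]
        outside = np.histogram(rng.normal(80, 20, 5000), bins=64, range=(0, 255))[0]
        print(f"JS divergence = {js_divergence(inside, outside):.4f}")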

  16. Collection-limited theory interprets the extraordinary response of single semiconductor organic solar cells

    PubMed Central

    Ray, Biswajit; Baradwaj, Aditya G.; Khan, Mohammad Ryyan; Boudouris, Bryan W.; Alam, Muhammad Ashraful

    2015-01-01

    The bulk heterojunction (BHJ) organic photovoltaic (OPV) architecture has dominated the literature due to its ability to be implemented in devices with relatively high efficiency values. However, a simpler device architecture based on a single organic semiconductor (SS-OPV) offers several advantages: it obviates the need to control the highly system-dependent nanoscale BHJ morphology, and therefore would allow the use of a broader range of organic semiconductors. Unfortunately, the photocurrent in standard SS-OPV devices is typically very low, which is generally attributed to inefficient charge separation of the photogenerated excitons. Here we show that the short-circuit current density from SS-OPV devices can be enhanced significantly (∼100-fold) through the use of inverted device configurations, relative to a standard OPV device architecture. This result suggests that charge generation may not be the performance bottleneck in OPV device operation. Instead, poor charge collection, caused by defect-induced electric field screening, is most likely the primary performance bottleneck in regular-geometry SS-OPV cells. We justify this hypothesis by: (i) detailed numerical simulations, (ii) electrical characterization experiments of functional SS-OPV devices using multiple polymers as active layer materials, and (iii) impedance spectroscopy measurements. Furthermore, we show that the collection-limited photocurrent theory consistently interprets typical characteristics of regular SS-OPV devices. These insights should encourage the design and OPV implementation of high-purity, high-mobility polymers, and other soft materials that have shown promise in organic field-effect transistor applications, but have not performed well in BHJ OPV devices, wherein they adopt less-than-ideal nanostructures when blended with electron-accepting materials. PMID:26290582

  17. Collection-limited theory interprets the extraordinary response of single semiconductor organic solar cells.

    PubMed

    Ray, Biswajit; Baradwaj, Aditya G; Khan, Mohammad Ryyan; Boudouris, Bryan W; Alam, Muhammad Ashraful

    2015-09-01

    The bulk heterojunction (BHJ) organic photovoltaic (OPV) architecture has dominated the literature due to its ability to be implemented in devices with relatively high efficiency values. However, a simpler device architecture based on a single organic semiconductor (SS-OPV) offers several advantages: it obviates the need to control the highly system-dependent nanoscale BHJ morphology, and therefore would allow the use of a broader range of organic semiconductors. Unfortunately, the photocurrent in standard SS-OPV devices is typically very low, which is generally attributed to inefficient charge separation of the photogenerated excitons. Here we show that the short-circuit current density from SS-OPV devices can be enhanced significantly (∼100-fold) through the use of inverted device configurations, relative to a standard OPV device architecture. This result suggests that charge generation may not be the performance bottleneck in OPV device operation. Instead, poor charge collection, caused by defect-induced electric field screening, is most likely the primary performance bottleneck in regular-geometry SS-OPV cells. We justify this hypothesis by: (i) detailed numerical simulations, (ii) electrical characterization experiments of functional SS-OPV devices using multiple polymers as active layer materials, and (iii) impedance spectroscopy measurements. Furthermore, we show that the collection-limited photocurrent theory consistently interprets typical characteristics of regular SS-OPV devices. These insights should encourage the design and OPV implementation of high-purity, high-mobility polymers, and other soft materials that have shown promise in organic field-effect transistor applications, but have not performed well in BHJ OPV devices, wherein they adopt less-than-ideal nanostructures when blended with electron-accepting materials.

  18. Simple, stable and reliable modeling of gas properties of organic working fluids in aerodynamic designs of turbomachinery for ORC and VCC

    NASA Astrophysics Data System (ADS)

    Kawakubo, T.

    2016-05-01

    A simple, stable and reliable model of the real-gas behavior of the working fluid is required for the aerodynamic design of the turbine in the Organic Rankine Cycle and of the compressor in the Vapor Compression Cycle. Although many modern Computational Fluid Dynamics tools can incorporate real-gas models, simulations with such models tend to be more time-consuming than those with a perfect-gas model and can even become unstable near the saturation boundary. A perfect-gas approximation therefore remains an attractive option for conducting design simulations stably and swiftly. In this paper, an effective method for CFD simulation with a perfect-gas approximation is discussed. A method is presented for representing the performance of the centrifugal compressor or the radial-inflow turbine by a set of non-dimensional performance parameters and translating the fictitious perfect-gas result into the actual real-gas performance.
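
    One common way to build such a perfect-gas surrogate is to freeze an effective specific-heat ratio and compressibility at a representative state of the real fluid. A sketch of that idea using the CoolProp property library (the fluid and state are chosen arbitrarily; this illustrates the general approach, not the author's specific procedure):

        from CoolProp.CoolProp import PropsSI

        fluid, T_in, p_in = "R245fa", 400.0, 2.0e6   # hypothetical turbine inlet (K, Pa)

        cp  = PropsSI("CPMASS", "T", T_in, "P", p_in, fluid)  # J/(kg K)
        cv  = PropsSI("CVMASS", "T", T_in, "P", p_in, fluid)
        Z   = PropsSI("Z", "T", T_in, "P", p_in, fluid)       # compressibility factor
        rho = PropsSI("D", "T", T_in, "P", p_in, fluid)       # kg/m^3

        gamma_eff = cp / cv                 # frozen ratio of specific heats
        R_eff = p_in / (rho * T_in * Z)     # specific gas constant from p = Z*rho*R*T
        print(f"gamma_eff = {gamma_eff:.3f}, Z = {Z:.3f}, R = {R_eff:.1f} J/(kg K)")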

  19. Density functional theory study of the organic functionalization of hydrogenated silicene.

    PubMed

    Rubio-Pereda, Pamela; Takeuchi, Noboru

    2013-05-21

    Silicene, the silicon analogue of graphene, is a newly synthesized two-dimensional nanomaterial with unique features and promising potential applications. In this paper we present density functional theory calculations of the organic functionalization of hydrogenated silicene with acetylene, ethylene, and styrene. The results are compared with previous work on adsorption on H-Si[111]. For styrene, the binding energies for the intermediate and final states, as well as the energy barrier for hydrogen abstraction, are rather similar for the two systems. On the other hand, the results for acetylene and ethylene are surprisingly different: the abstraction barrier is much smaller on H-silicene than on H-Si[111]. These differences can be understood from the different electrostatic potentials due to the presence of the H atoms at the bottom of the silicene bilayer, which allow delocalization of the spin density in the reaction intermediate state.

  20. A simple theory of molecular organization in fullerene-containing liquid crystals

    NASA Astrophysics Data System (ADS)

    Peroukidis, S. D.; Vanakaras, A. G.; Photinos, D. J.

    2005-10-01

    Systematic efforts to synthesize fullerene-containing liquid crystals have produced a variety of successful model compounds. We present a simple molecular theory, based on the interconverting shape approach [Vanakaras and Photinos, J. Mater. Chem. 15, 2002 (2005)], that relates the self-organization observed in these systems to their molecular structure. The interactions are modeled by dividing each molecule into a number of submolecular blocks to which specific interactions are assigned. Three types of blocks are introduced, corresponding to fullerene units, mesogenic units, and nonmesogenic linkage units. The blocks are constrained to move on a cubic three-dimensional lattice and molecular flexibility is allowed by retaining a number of representative conformations within the block representation of the molecule. Calculations are presented for a variety of molecular architectures including twin mesogenic branch monoadducts of C60, twin dendromesogenic branch monoadducts, and conical (badminton shuttlecock) multiadducts of C60. The dependence of the phase diagrams on the interaction parameters is explored. In spite of its many simplifications and the minimal molecular modeling used (three types of chemically distinct submolecular blocks with only repulsive interactions), the theory accounts remarkably well for the phase behavior of these systems.

  1. Functional Organization of the Action Observation Network in Autism: A Graph Theory Approach

    PubMed Central

    Alaerts, Kaat; Geerlings, Franca; Herremans, Lynn; Swinnen, Stephan P.; Verhoeven, Judith; Sunaert, Stefan; Wenderoth, Nicole

    2015-01-01

    Background: The ability to recognize, understand and interpret others' actions and emotions has been linked to the mirror system or action-observation network (AON). Although variations in these abilities are prevalent in the neurotypical population, persons diagnosed with autism spectrum disorders (ASD) have deficits in the social domain and exhibit alterations in this neural network. Method: Here, we examined functional network properties of the AON using graph theory measures and region-to-region functional connectivity analyses of resting-state fMRI data from adolescents and young adults with ASD and typical controls (TC). Results: Overall, our graph theory analyses provided convergent evidence that the network integrity of the AON is altered in ASD, and that reductions in network efficiency relate to reductions in overall network density (i.e., decreased overall connection strength). Compared to TC, individuals with ASD showed significant reductions in network efficiency and increased shortest path lengths and centrality. Importantly, when adjusting for overall differences in network density between the ASD and TC groups, participants with ASD continued to display reductions in network integrity, suggesting that network-level organizational properties of the AON are also altered in ASD. Conclusion: While differences in empirical connectivity contributed to reductions in network integrity, graph-theoretical analyses indicated that changes in the high-level network organization also reduce the integrity of the AON. PMID:26317222
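
    The network measures named above (density, efficiency, path length, centrality) can be reproduced on any connectivity matrix with standard tooling. A minimal sketch (the matrix below is random; a real analysis would threshold ROI-to-ROI correlations from resting-state fMRI):

        import numpy as np
        import networkx as nx

        rng = np.random.default_rng(7)
        n_roi = 12

        # Hypothetical symmetric "connectivity" matrix, binarized at a threshold.
        corr = rng.uniform(0.0, 1.0, size=(n_roi, n_roi))
        corr = (corr + corr.T) / 2.0
        np.fill_diagonal(corr, 0.0)
        adj = (corr > 0.6).astype(int)

        G = nx.from_numpy_array(adj)
        print("density          :", nx.density(G))
        print("global efficiency:", nx.global_efficiency(G))
        print("max betweenness  :", max(nx.betweenness_centrality(G).values()))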

  2. Coding theory based models for protein translation initiation in prokaryotic organisms.

    SciTech Connect

    May, Elebeoba Eni; Bitzer, Donald L. (North Carolina State University, Raleigh, NC); Rosnick, David I. (North Carolina State University, Raleigh, NC); Vouk, Mladen A.

    2003-03-01

    Our research explores the feasibility of using communication theory, specifically error control (EC) coding theory, for quantitatively modeling the protein translation initiation mechanism. The messenger RNA (mRNA) of Escherichia coli K-12 is modeled as a noisy (errored), encoded signal and the ribosome as a minimum Hamming distance decoder, where the 16S ribosomal RNA (rRNA) serves as a template for generating a set of valid codewords (the codebook). We tested the E. coli-based coding models on 5' untranslated leader sequences of prokaryotic organisms of varying taxonomical relation to E. coli, including Salmonella typhimurium LT2, Bacillus subtilis, and Staphylococcus aureus Mu50. The model identified regions on the 5' untranslated leader where the minimum Hamming distance values of translated mRNA sub-sequences and non-translated genomic sequences differ the most. These regions correspond to the Shine-Dalgarno domain and the non-random domain. Applying the EC coding-based models to B. subtilis and S. aureus Mu50 yielded results similar to those for E. coli K-12. Contrary to our expectations, the behavior of S. typhimurium LT2, the organism most taxonomically related to E. coli, resembled that of the non-translated sequence group.
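
    The decoding step described above can be sketched in a few lines: slide a window along the leader sequence and score it against a codebook by minimum Hamming distance. The codebook below is a toy stand-in; the actual model derives codewords from the 16S rRNA (anti-Shine-Dalgarno) template:

        def hamming(a, b):
            """Hamming distance between two equal-length sequences."""
            return sum(x != y for x, y in zip(a, b))

        codebook = ["AGGAGG", "GGAGGU", "AGGAGU"]   # toy codewords

        def min_distance(window):
            return min(hamming(window, c) for c in codebook)

        mrna = "UUAACUAGGAGGAAUUACCAUG"   # hypothetical 5' leader ending in AUG
        k = len(codebook[0])
        scores = [min_distance(mrna[i:i + k]) for i in range(len(mrna) - k + 1)]
        best = min(range(len(scores)), key=scores.__getitem__)
        print("per-window minimum distances:", scores)
        print(f"best window starts at {best}: {mrna[best:best + k]}")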

  3. Nuclear weapons decision-making; an application of organization theory to the mini-nuke case

    SciTech Connect

    Kangas, J.L.

    1985-01-01

    This dissertation addresses the problem of constructing and developing normative theory responsive to the need for improving the quality of decision-making in nuclear weapons policy-making. Against the background of a critical evaluation of various paradigms in the literature (systems analysis and opposed-systems design, the bureaucratic politics model, and the cybernetic theory of decision), an attempt is made to design an alternative analytic framework based on the writings of numerous organization theorists such as Herbert Simon and Kenneth Arrow. The framework is applied to the case of mini-nukes, i.e., proposals in the mid-1970s to develop and deploy tens of thousands of very low-yield (sub-kiloton), miniaturized fission weapons in NATO. "Heuristic case study" identifies the type of study undertaken in the dissertation, in contrast to the more familiar paradigmatic studies identified, for example, with the Harvard Weapons Project. Application of the analytic framework to the mini-nuke case resulted in an empirical understanding of why decision-making concerning tactical nuclear weapons has been such a complex task and why force modernization issues in particular have been so controversial and lacking in policy resolution.

  4. FFLO strange metal and quantum criticality in two dimensions: Theory and application to organic superconductors

    NASA Astrophysics Data System (ADS)

    Piazza, Francesco; Zwerger, Wilhelm; Strack, Philipp

    2016-02-01

    Increasing the spin imbalance in superconductors can spatially modulate the gap by forming Cooper pairs with finite momentum. For large imbalances compared to the Fermi energy, the inhomogeneous FFLO superconductor ultimately becomes a normal metal. There is mounting experimental evidence for this scenario in two-dimensional (2D) organic superconductors in large in-plane magnetic fields; this is complemented by ongoing efforts to realize this scenario in coupled tubes of atomic Fermi gases with spin imbalance. Yet, a theory for the phase transition from a metal to an FFLO superconductor has not been developed so far and the universality class has remained unknown. Here we propose and analyze a spin imbalance driven quantum critical point between a 2D metal and an FFLO phase in anisotropic electron systems. We derive the effective action for electrons and bosonic FFLO pairs at this quantum phase transition. Using this action, we predict non-Fermi-liquid behavior and the absence of quasiparticles at a discrete set of hot spots on the Fermi surfaces. This results in strange power laws in thermodynamics and response functions, which are testable with existing experimental setups on 2D organic superconductors and may also serve as signatures of the elusive FFLO phase itself. The proposed universality class is distinct from previously known quantum critical metals and, because its critical fluctuations appear already in the pairing channel, a promising candidate for naked metallic quantum criticality over extended temperature ranges.

  5. The effect of the labile organic fraction in food waste and the substrate/inoculum ratio on anaerobic digestion for a reliable methane yield.

    PubMed

    Kawai, Minako; Nagao, Norio; Tajima, Nobuaki; Niwa, Chiaki; Matsuyama, Tatsushi; Toda, Tatsuki

    2014-04-01

    The influence of the labile organic fraction (LOF) on the anaerobic digestion of food waste was investigated at different S/I ratios of 0.33, 0.5, 1.0, 2.0 and 4.0 g-VSsubstrate/g-VSinoculum. Two types of substrate were used: standard food waste (Substrate 1) and standard food waste with the supernatant (containing the LOF) removed (Substrate 2). The highest methane yield, 435 ml-CH4 g-VS(-1), was observed for Substrate 1 at the lowest S/I ratio, while the methane yields at the other S/I ratios were 38-73% lower than the highest yield due to acidification. The methane yields for Substrate 2 were relatively stable under all S/I conditions, although the maximum methane yield was low compared with Substrate 1. These results showed that the LOF in food waste causes acidification but also contributes to high methane yields, suggesting that a low S/I ratio (<0.33) is required to obtain a reliable methane yield from food waste compared with other organic substrates.

  6. Discovery of fairy circles in Australia supports self-organization theory

    PubMed Central

    Getzin, Stephan; Yizhaq, Hezi; Bell, Bronwyn; Erickson, Todd E.; Postle, Anthony C.; Katra, Itzhak; Tzuk, Omer; Zelnik, Yuval R.; Wiegand, Kerstin; Wiegand, Thorsten; Meron, Ehud

    2016-01-01

    Vegetation gap patterns in arid grasslands, such as the “fairy circles” of Namibia, are one of nature’s greatest mysteries and subject to a lively debate on their origin. They are characterized by small-scale hexagonal ordering of circular bare-soil gaps that persists uniformly in the landscape scale to form a homogeneous distribution. Pattern-formation theory predicts that such highly ordered gap patterns should be found also in other water-limited systems across the globe, even if the mechanisms of their formation are different. Here we report that so far unknown fairy circles with the same spatial structure exist 10,000 km away from Namibia in the remote outback of Australia. Combining fieldwork, remote sensing, spatial pattern analysis, and process-based mathematical modeling, we demonstrate that these patterns emerge by self-organization, with no correlation with termite activity; the driving mechanism is a positive biomass–water feedback associated with water runoff and biomass-dependent infiltration rates. The remarkable match between the patterns of Australian and Namibian fairy circles and model results indicate that both patterns emerge from a nonuniform stationary instability, supporting a central universality principle of pattern-formation theory. Applied to the context of dryland vegetation, this principle predicts that different systems that go through the same instability type will show similar vegetation patterns even if the feedback mechanisms and resulting soil–water distributions are different, as we indeed found by comparing the Australian and the Namibian fairy-circle ecosystems. These results suggest that biomass–water feedbacks and resultant vegetation gap patterns are likely more common in remote drylands than is currently known. PMID:26976567

  7. Discovery of fairy circles in Australia supports self-organization theory.

    PubMed

    Getzin, Stephan; Yizhaq, Hezi; Bell, Bronwyn; Erickson, Todd E; Postle, Anthony C; Katra, Itzhak; Tzuk, Omer; Zelnik, Yuval R; Wiegand, Kerstin; Wiegand, Thorsten; Meron, Ehud

    2016-03-29

    Vegetation gap patterns in arid grasslands, such as the "fairy circles" of Namibia, are one of nature's greatest mysteries and subject to a lively debate on their origin. They are characterized by small-scale hexagonal ordering of circular bare-soil gaps that persists uniformly at the landscape scale to form a homogeneous distribution. Pattern-formation theory predicts that such highly ordered gap patterns should also be found in other water-limited systems across the globe, even if the mechanisms of their formation are different. Here we report that so far unknown fairy circles with the same spatial structure exist 10,000 km away from Namibia in the remote outback of Australia. Combining fieldwork, remote sensing, spatial pattern analysis, and process-based mathematical modeling, we demonstrate that these patterns emerge by self-organization, with no correlation with termite activity; the driving mechanism is a positive biomass-water feedback associated with water runoff and biomass-dependent infiltration rates. The remarkable match between the patterns of Australian and Namibian fairy circles and model results indicates that both patterns emerge from a nonuniform stationary instability, supporting a central universality principle of pattern-formation theory. Applied to the context of dryland vegetation, this principle predicts that different systems that go through the same instability type will show similar vegetation patterns even if the feedback mechanisms and resulting soil-water distributions are different, as we indeed found by comparing the Australian and the Namibian fairy-circle ecosystems. These results suggest that biomass-water feedbacks and resultant vegetation gap patterns are likely more common in remote drylands than is currently known. PMID:26976567

  8. A regulatory theory of cortical organization and its applications to robotics

    NASA Astrophysics Data System (ADS)

    Thangavelautham, Jekanthan

    2009-11-01

    Fundamental aspects of biologically inspired regulatory mechanisms are considered in a robotics context, using artificial neural-network control systems. Regulatory mechanisms are used to control the expression of genes and the adaptation of form and behavior in organisms. Traditional neural-network control architectures assume networks of neurons are fixed and are interconnected by wires. However, these architectures tend to be specified by a designer and face several limitations that reduce scalability and tractability for tasks with larger search spaces. The traditional way to overcome these limitations with fixed network topologies is to provide more supervision by a designer. As shown, more supervision does not guarantee improvement during training, particularly when incorrect assumptions are made for little-known task domains. Biological organisms often do not require such external intervention (more supervision) and have self-organized through adaptation. Artificial neural tissue (ANT) addresses limitations of current neural-network architectures by modeling both wired interactions between neurons and wireless interactions through the use of chemical diffusion fields. An evolutionary (Darwinian) selection process is used to 'breed' ANT controllers for a task at hand, and the framework facilitates the emergence of creative solutions, since only a system goal function and a generic set of basis behaviours need be defined. Regulatory mechanisms are formed dynamically within ANT through the superpositioning of chemical diffusion fields from multiple sources and are used to select neuronal groups. Regulation drives competition and cooperation among neuronal groups and results in areas of specialization forming within the tissue. These regulatory mechanisms are also shown to increase tractability without requiring more supervision, using a new statistical theory developed to predict the performance characteristics of fixed network topologies. Simulations also confirm the
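    The one mechanism this record describes concretely, selecting neuronal groups wherever superposed chemical diffusion fields are strong enough, can be caricatured in a few lines. This is not the ANT codebase; the grid size, source positions, field width sigma, and the 0.5 threshold are all illustrative assumptions:

        import numpy as np

        # 16x16 "tissue" of neuron coordinates
        coords = np.stack(np.meshgrid(np.arange(16), np.arange(16)), axis=-1)

        def gaussian_field(center, sigma=3.0):
            """Steady-state stand-in for a chemical diffusion field from one source."""
            d2 = ((coords - np.asarray(center)) ** 2).sum(axis=-1)
            return np.exp(-d2 / (2.0 * sigma ** 2))

        sources = [(4, 4), (11, 10)]                     # hypothetical diffusion sources
        field = sum(gaussian_field(c) for c in sources)  # superposition of the fields
        selected = field > 0.5                           # neurons gated into active groups
        print(f"{selected.sum()} of {selected.size} neurons selected")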

  9. An Examination of Test-Retest, Alternate Form Reliability, and Generalizability Theory Study of the easyCBM Word and Passage Reading Fluency Assessments: Grade 3. Technical Report #1218

    ERIC Educational Resources Information Center

    Park, Bitnara Jasmine; Anderson, Daniel; Alonzo, Julie; Lai, Cheng-Fei; Tindal, Gerald

    2012-01-01

    This technical report is one in a series of five describing the reliability (test/retest and alternate form) and G-Theory/D-Study research on the easyCBM reading measures, grades 1-5. Data were gathered in the spring of 2011 from a convenience sample of students nested within classrooms at a medium-sized school district in the Pacific Northwest.…

  10. Species Detection and Identification in Sexual Organisms Using Population Genetic Theory and DNA Sequences

    PubMed Central

    Birky, C. William

    2013-01-01

    Phylogenetic trees of DNA sequences of a group of specimens may include clades of two kinds: those produced by stochastic processes (random genetic drift) within a species, and clades that represent different species. The ratio of the mean pairwise sequence difference between a pair of clades (K) to the mean pairwise sequence difference within a clade (θ) can be used to determine whether the clades are samples from different species (K/θ≥4) or the same species (K/θ<4) with probability ≥0.95. Previously I applied this criterion to delimit species of asexual organisms. Here I use data from the literature to show how it can also be applied to delimit sexual species using four groups of sexual organisms as examples: ravens, spotted leopards, sea butterflies, and liverworts. Mitochondrial or chloroplast genes are used because these segregate earlier during speciation than most nuclear genes and hence detect earlier stages of speciation. In several cases the K/θ ratio was greater than 4, confirming the original authors' intuition that the clades were sufficiently different to be assigned to different species. But the K/θ ratio split each of two liverwort species into two evolutionary species, and showed that support for the distinction between the common and Chihuahuan raven species is weak. I also discuss some possible sources of error in using the K/θ ratio; the most significant one would be cases where males migrate between different populations but females do not, making the use of maternally inherited organelle genes problematic. The K/θ ratio must be used with some caution, like all other methods for species delimitation. Nevertheless, it is a simple theory-based quantitative method for using DNA sequences to make rigorous decisions about species delimitation in sexual as well as asexual eukaryotes. PMID:23308113
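    The decision rule in this record is operationally simple. A minimal Python sketch follows (not the author's software; taking θ as the larger of the two within-clade means is an assumption made here for conservatism, and each clade needs at least two sequences):

        from itertools import combinations, product

        def pdiff(a: str, b: str) -> float:
            """Proportion of differing sites between two aligned sequences."""
            assert len(a) == len(b)
            return sum(x != y for x, y in zip(a, b)) / len(a)

        def mean_within(clade):
            pairs = list(combinations(clade, 2))
            return sum(pdiff(a, b) for a, b in pairs) / len(pairs)

        def mean_between(c1, c2):
            pairs = list(product(c1, c2))
            return sum(pdiff(a, b) for a, b in pairs) / len(pairs)

        def k_over_theta(c1, c2):
            theta = max(mean_within(c1), mean_within(c2))  # conservative choice
            return mean_between(c1, c2) / theta

        # toy aligned sequences; K/theta >= 4 would suggest distinct species
        cladeA = ["ACGTACGTAC", "ACGTACGTAT"]
        cladeB = ["ACGAACTTAC", "ACGAACTTAT"]
        print(f"K/theta = {k_over_theta(cladeA, cladeB):.2f}")  # 2.50: same species by the rule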

  11. Species detection and identification in sexual organisms using population genetic theory and DNA sequences.

    PubMed

    Birky, C William

    2013-01-01

    Phylogenetic trees of DNA sequences of a group of specimens may include clades of two kinds: those produced by stochastic processes (random genetic drift) within a species, and clades that represent different species. The ratio of the mean pairwise sequence difference between a pair of clades (K) to the mean pairwise sequence difference within a clade (θ) can be used to determine whether the clades are samples from different species (K/θ ≥ 4) or the same species (K/θ<4) with probability ≥ 0.95. Previously I applied this criterion to delimit species of asexual organisms. Here I use data from the literature to show how it can also be applied to delimit sexual species using four groups of sexual organisms as examples: ravens, spotted leopards, sea butterflies, and liverworts. Mitochondrial or chloroplast genes are used because these segregate earlier during speciation than most nuclear genes and hence detect earlier stages of speciation. In several cases the K/θ ratio was greater than 4, confirming the original authors' intuition that the clades were sufficiently different to be assigned to different species. But the K/θ ratio split each of two liverwort species into two evolutionary species, and showed that support for the distinction between the common and Chihuahuan raven species is weak. I also discuss some possible sources of error in using the K/θ ratio; the most significant one would be cases where males migrate between different populations but females do not, making the use of maternally inherited organelle genes problematic. The K/θ ratio must be used with some caution, like all other methods for species delimitation. Nevertheless, it is a simple theory-based quantitative method for using DNA sequences to make rigorous decisions about species delimitation in sexual as well as asexual eukaryotes.

  12. hfAIM: A reliable bioinformatics approach for in silico genome-wide identification of autophagy-associated Atg8-interacting motifs in various organisms.

    PubMed

    Xie, Qingjun; Tzfadia, Oren; Levy, Matan; Weithorn, Efrat; Peled-Zehavi, Hadas; Van Parys, Thomas; Van de Peer, Yves; Galili, Gad

    2016-05-01

    Most of the proteins that are specifically turned over by selective autophagy are recognized by the presence of short Atg8-interacting motifs (AIMs) that facilitate their association with the autophagy apparatus. Such AIMs can be identified by bioinformatics methods based on their defined degenerate consensus F/W/Y-X-X-L/I/V sequences, in which X represents any amino acid. Achieving reliability and/or fidelity in the prediction of such AIMs on a genome-wide scale represents a major challenge. Here, we present a bioinformatics approach, high fidelity AIM (hfAIM), which uses additional sequence requirements (the presence of acidic amino acids and the absence of positively charged amino acids in certain positions) to reliably identify AIMs in proteins. We demonstrate that the use of the hfAIM method allows for in silico high fidelity prediction of AIMs in AIM-containing proteins (ACPs) on a genome-wide scale in various organisms. Furthermore, by using hfAIM to identify putative AIMs in the Arabidopsis proteome, we illustrate a potential contribution of selective autophagy to various biological processes. More specifically, we identified 9 peroxisomal PEX proteins that contain hfAIM motifs, among which AtPEX1, AtPEX6 and AtPEX10 possess evolutionarily conserved AIMs. Bimolecular fluorescence complementation (BiFC) results verified that AtPEX6 and AtPEX10 indeed interact with Atg8 in planta. In addition, we show that mutations occurring within or nearby hfAIMs in PEX1, PEX6 and PEX10 caused defects in the growth and development of various organisms. Taken together, the above results suggest that the hfAIM tool can be used to effectively perform genome-wide in silico screens of proteins that are potentially regulated by selective autophagy. The hfAIM system is a web tool that can be accessed at http://bioinformatics.psb.ugent.be/hfAIM/. PMID:27071037
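    The stated consensus translates directly into a pattern scan. The sketch below is illustrative only: the core regex follows the F/W/Y-X-X-L/I/V consensus quoted above, while the flanking-charge rule is a paraphrase of the hfAIM idea rather than the tool's exact positional rules (those live at the URL above):

        import re

        CORE = re.compile(r"[FWY]..[LIV]")
        ACIDIC, BASIC = set("DE"), set("RK")

        def candidate_aims(seq: str, flank: int = 4):
            """Yield core AIM matches whose upstream flank looks 'high fidelity':
            at least one acidic residue and no basic residues (illustrative rule)."""
            for m in CORE.finditer(seq):
                up = seq[max(0, m.start() - flank):m.start()]
                if ACIDIC & set(up) and not BASIC & set(up):
                    yield m.start(), m.group()

        print(list(candidate_aims("MSEEDWTTLKAGSDDFVVLRK")))  # [(5, 'WTTL'), (15, 'FVVL')]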

  13. Implications of complex adaptive systems theory for interpreting research about health care organizations

    PubMed Central

    Jordon, Michelle; Lanham, Holly Jordan; Anderson, Ruth A.; McDaniel, Reuben R.

    2013-01-01

    Rationale Data about health care organizations (HCOs) is not useful until it is interpreted. Such interpretations are influenced by the theoretical lenses employed by the researcher. Objective Our purpose is to suggest the usefulness of theories of complex adaptive systems (CASs) in guiding research interpretation. Specifically, we address two questions. (1) What are the implications for interpreting research observations in HCOs of the fact that we are observing relationships among diverse agents? (2) What are the implications for interpreting research observations in HCOs of the fact that we are observing relationships among agents that learn? Method We define diversity and learning and the implications of the nonlinear relationships among agents from a CAS perspective. We then identify some common analytical practices that are problematic and may lead to conceptual and methodological errors. Then we describe strategies for interpreting the results of research observations. Conclusions We suggest that the task of interpreting research observations of HCOs could be improved if researchers take into account that the systems they study are CAS with nonlinear relationships among diverse, learning agents. Our analysis points out how interpretation of research results might be shaped by the fact that HCOs are CASs. We describe how learning is, in fact, the result of interactions among diverse agents and that learning can, by itself, reduce or increase agent diversity. We encourage researchers to be persistent in their attempts to reason about complex systems, and learn to attend not only to structures, but also to processes and functions of complex systems. PMID:20367840

  14. Body without Organs: Notes on Deleuze & Guattari, Critical Race Theory and the Socius of Anti-Racism

    ERIC Educational Resources Information Center

    Ibrahim, Awad

    2015-01-01

    My aim in this article is to epistemologically read Deleuze and Guattari (D & G) against critical race theory (CRT) and simultaneously delineate how D & G's notion of "body without organs" can benefit from CRT. At first glance, especially for language instructors and researchers, these two epistemological frameworks not only…

  15. From Structural Dilemmas to Institutional Imperatives: A Descriptive Theory of the School as an Institution and of School Organizations

    ERIC Educational Resources Information Center

    Berg, Gunnar

    2007-01-01

    This study outlines a descriptive theory that seeks to grasp the complexity of the school as a state and societal institution as well as single schools as organizations. A significant characteristic of this complexity is the ambiguity of the missions and goals--the outer boundaries--of the school-institution. The more institutional ambiguity that…

  16. Organizational Economics: Notes on the Use of Transaction-Cost Theory in the Study of Organizations.

    ERIC Educational Resources Information Center

    Robins, James A.

    1987-01-01

    Reviews transaction-cost approaches to organizational analysis, examines their use in microeconomic theory, and identifies some important flaws in the study. Advocates transaction-cost theory as a powerful tool for organizational and strategic analysis when set within the framework of more general organizational theory. Includes 61 references. (MLH)

  17. A Theory of Complex Adaptive Inquiring Organizations: Application to Continuous Assurance of Corporate Financial Information

    ERIC Educational Resources Information Center

    Kuhn, John R., Jr.

    2009-01-01

    Drawing upon the theories of complexity and complex adaptive systems and the Singerian Inquiring System from C. West Churchman's seminal work "The Design of Inquiring Systems" the dissertation herein develops a systems design theory for continuous auditing systems. The dissertation consists of discussion of the two foundational theories,…

  18. Theory aided design and analysis of dielectric and semiconductor components for organic field-effect transistors

    NASA Astrophysics Data System (ADS)

    Dibenedetto, Sara Arlene

    Perfluoroacyl/acyl-derivatized quaterthiophenes are developed and synthesized. The frontier molecular orbital energies of these compounds are studied by optical spectroscopy and electrochemistry, while solid-state/film properties are investigated by thermal analysis, x-ray diffraction, and scanning electron microscopy. Organic thin-film transistor (OTFT) performance parameters are discussed in terms of the interplay between semiconductor molecular energetics and film morphologies/microstructures. The majority charge carrier type and mobility exhibit a strong correlation with the regiochemistry of perfluoroarene incorporation. In quaterthiophene-based semiconductors, carbonyl functionalization allows tuning of the majority carrier type from p-type to ambipolar and to n-type. In situ conversion of a p-type semiconducting film to an n-type film is also demonstrated. The design of alternative hybrid organic-inorganic gate dielectrics, in terms of chemistry and film microstructure, is described using the classic Clausius-Mossotti relation. The Maxwell-Wagner effective medium model is used to compute the effective dielectric permittivity of two types of dielectrics: self-assembled nanodielectrics (SANDs) and crosslinked polymer blends (CPBs). In these calculations, which show good agreement between theory and experiment, it is found that greater capacitances should be achievable with mixed composites than with layered composites. With this insight, a series of mixed metal oxide-polyolefin nanocomposites is synthesized via in-situ olefin polymerization using single-site metallocene catalysts. By integrating organic and inorganic constituents, the resulting hybrid materials exhibit high permittivity (from the inorganic inclusions) and high breakdown strength, mechanical flexibility, and facile processability (from the polymer matrices). In order to better optimize the capacitance and leakage current of hybrid organic-inorganic dielectrics, the capacitance, leakage current and OFET gate
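    Two relations named in this record have compact standard forms, reproduced here as generic background rather than the dissertation's specific parametrization: the Clausius-Mossotti relation linking molecular polarizability to permittivity, and the elementary series/parallel permittivity limits between which effective-medium treatments such as Maxwell-Wagner interpolate:

        \[
        \frac{\varepsilon_r - 1}{\varepsilon_r + 2} = \frac{N\alpha}{3\varepsilon_0},
        \qquad
        \varepsilon_{\parallel} = f_1 \varepsilon_1 + f_2 \varepsilon_2,
        \qquad
        \frac{1}{\varepsilon_{\perp}} = \frac{f_1}{\varepsilon_1} + \frac{f_2}{\varepsilon_2},
        \]

    where \(N\) is the number density of polarizable units, \(\alpha\) the molecular polarizability, and \(f_i\) the volume fractions of the two phases. A mixed composite's effective permittivity lies between the layered (series) limit and the parallel limit, consistent with the finding above that mixed composites can reach higher capacitances than layered ones.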

  19. Intentions of becoming a living organ donor among Hispanics: a theory-based approach exploring differences between living and nonliving organ donation.

    PubMed

    Siegel, Jason T; Alvaro, Eusebio M; Lac, Andrew; Crano, William D; Dominick, Alexander

    2008-01-01

    This research examines perceptions concerning living (n = 1,253) and nonliving (n = 1,259) organ donation among Hispanic adults, a group considerably less likely than the general population to become donors. Measures are derived from the Theory of Planned Behavior (Ajzen, 1991) and Vested Interest Theory (Crano, 1983, 1997). A substantial percentage of respondents reported positive attitudes and high personal stake concerning organ donation. Mean differences in norms, attitudes, intentions, and assumed immediacy of payoff were found between living and nonliving donor groups, suggesting that these two donation formats are dissimilar and should be examined independently. Accordingly, separate hierarchical multiple regression models were estimated for living and nonliving donation. Analyses supported both theoretical frameworks: Constructs associated with Planned Behavior and Vested Interest independently contributed to donor intentions. The implications of these results, and our recommendations for future health campaigns, are presented in light of these theoretical models. PMID:18307137
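    As an illustration of the analysis style described above, the following Python sketch runs a blockwise ("hierarchical") OLS on synthetic data. All variables and coefficients are invented stand-ins for the study's demographics, Planned Behavior constructs, and vested-interest measures:

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 500
        age = rng.normal(40, 12, n)
        attitude = rng.normal(0, 1, n)
        norms = rng.normal(0, 1, n)
        stake = rng.normal(0, 1, n)  # vested interest: personal stake
        intention = 0.4 * attitude + 0.3 * norms + 0.2 * stake + rng.normal(0, 1, n)

        blocks = {
            "demographics":      np.column_stack([age]),
            "+ TPB constructs":  np.column_stack([age, attitude, norms]),
            "+ vested interest": np.column_stack([age, attitude, norms, stake]),
        }
        for name, X in blocks.items():
            r2 = sm.OLS(intention, sm.add_constant(X)).fit().rsquared
            print(f"{name:18s} R^2 = {r2:.3f}")  # R^2 grows as each block is entered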

  1. A Theory-Based Comparison of the Reliabilities of Fixed-Length and Trials-to-Criterion Scoring of Physical Education Skills Tests.

    ERIC Educational Resources Information Center

    Feldt, Leonard S.; Spray, Judith A.

    1983-01-01

    The reliabilities of two types of measurement plans were compared across six hypothetical distributions of true scores or abilities. The measurement plans were: (1) fixed-length, where the number of trials for all examinees is set in advance; and (2) trials-to-criterion, where examinees must keep trying until they complete a given number of trials…

  2. A Monte Carlo Simulation Investigating the Validity and Reliability of Ability Estimation in Item Response Theory with Speeded Computer Adaptive Tests

    ERIC Educational Resources Information Center

    Schmitt, T. A.; Sass, D. A.; Sullivan, J. R.; Walker, C. M.

    2010-01-01

    Imposed time limits on computer adaptive tests (CATs) can result in examinees having difficulty completing all items, thus compromising the validity and reliability of ability estimates. In this study, the effects of speededness were explored in a simulated CAT environment by varying examinee response patterns to end-of-test items. Expectedly,…

  3. Elastic, not plastic species: Frozen plasticity theory and the origin of adaptive evolution in sexually reproducing organisms

    PubMed Central

    2010-01-01

    Background Darwin's evolutionary theory could easily explain the evolution of adaptive traits (organs and behavioral patterns) in asexual but not in sexual organisms. Two models, the selfish gene theory and the frozen plasticity theory, have been suggested in the past 30 years to explain the evolution of adaptive traits in sexual organisms. Results The frozen plasticity theory suggests that sexual species can evolve new adaptations only when their members are genetically uniform, i.e. only after a portion of the population of the original species has split off, balanced on the edge of extinction for several generations, and then undergone rapid expansion. After a short period of time, estimated on the basis of paleontological data to correspond to 1-2% of the duration of the species, polymorphism accumulates in the gene pool due to frequency-dependent selection; and thus, in each generation, new mutations occur in the presence of different alleles and therefore change their selection coefficients from generation to generation. The species ceases to behave in an evolutionarily plastic manner and becomes evolutionarily elastic on a microevolutionary time-scale and evolutionarily frozen on a macroevolutionary time-scale. It then exists in this state until such changes accumulate in the environment that the species becomes extinct. Conclusion Frozen plasticity theory, which includes the Darwinian model of evolution as a special case - the evolution of species in a plastic state - not only offers plenty of new predictions to be tested, but also provides explanations for a much broader spectrum of known biological phenomena than classic evolutionary theories. Reviewers This article was reviewed by Rob Knight, Fyodor Kondrashov and Massimo Di Giulio (nominated by David H. Ardell). PMID:20067646

  4. Reliability Prediction

    NASA Technical Reports Server (NTRS)

    1993-01-01

    RELAV, a NASA-developed computer program, enables Systems Control Technology, Inc. (SCT) to predict the performance of aircraft subsystems. RELAV provides a system-level evaluation of a technology. Systems, the mechanism of a landing gear for example, are first described as a set of components performing a specific function. RELAV analyzes the total system and the individual subsystem probabilities to predict success probability and reliability. This information is then translated into operational support and maintenance requirements. SCT provides research and development services in support of government contracts.
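    RELAV's internals are not shown in this record; the Python sketch below is only a generic illustration of the idea it embodies, composing component success probabilities through series and redundant (parallel) structures into a system-level reliability. The landing-gear-style decomposition and all numbers are hypothetical:

        def series(*r):
            """All components must work: probabilities multiply."""
            p = 1.0
            for x in r:
                p *= x
            return p

        def parallel(*r):
            """Redundant components: the group fails only if all fail."""
            q = 1.0
            for x in r:
                q *= (1.0 - x)
            return 1.0 - q

        actuator = series(0.999, 0.995)               # motor and linkage in series
        sensors = parallel(0.99, 0.99)                # redundant position sensors
        system = series(actuator, sensors, 0.9999)    # plus a structural element
        print(f"predicted system reliability: {system:.6f}")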

  5. A theory for classification of health care organizations in the new economy.

    PubMed

    Vimarlund, Vivian; Sjöberg, Cecilia; Timpka, Toomas

    2003-10-01

    Most of the available studies of information technology (IT) have been limited to investigating specific issues, such as how IT can support decision makers by distributing information throughout the health care organization, or how technology impacts organizational performance. In this study, a theoretical model for the classification of health care organizations is proposed for use in the planning of information system development projects. We try to reflect developments in the contemporary digital economy by theoretically classifying health care organizations into three types, namely traditional, developing, and flexible. We describe traditional health care organizations as organizations with a centralized system for management and control. In developing health care organizations, IT is spread over the horizontal dimension and is used for coordinating the different parties throughout the organization. Finally, flexible health care organizations are those which work actively with the design of new health care organizational structures while they are designing the information system.

  6. Communication as a predictor of willingness to donate one's organs: an addition to the Theory of Reasoned Action.

    PubMed

    Jeffres, Leo W; Carroll, Jeanine A; Rubenking, Bridget E; Amschlinger, Joe

    2008-12-01

    Fishbein and Ajzen's theory of reasoned action has been used by many researchers, particularly in regard to health communication, to predict behavioral intentions and behavior. According to that theory, one's intention is the best predictor that one will engage in a behavior, and attitudes and social norms predict behavioral intentions. Other researchers have added different variables to the postulates of attitudes and social norms that Fishbein and Ajzen maintain are the best predictors of behavioral intention. Here we draw on data from a 2006 telephone survey (N = 420) gauging the awareness of an organ donation campaign in Northeast Ohio to examine the impact of communication on people's intentions. The current study supports the hypothesis that those who communicate with others are more likely to express a greater willingness to become an organ donor, but it expands the range of communication contexts. With demographics and attitudes toward organ donation controlled for, this study shows that communication with others about organ donation increases the willingness of individuals to have favorable attitudes about being an organ donor.

  7. Network Disturbance Theory: Spatial and Temporal Organization of Physical Heterogeneity in Rivers

    NASA Astrophysics Data System (ADS)

    Benda, L.; Poff, L.; Miller, D.; Dunne, T.; Reeves, G.; Pess, G.; Pollock, M.

    2003-12-01

    Beginning with the premise that extreme events, or "disturbances" in the parlance of ecologists (i.e., storms, fires, floods, and punctuated erosion and sediment transport), are intrinsic to many landscapes and river systems across the world, we develop new theory on the interaction between disturbances and branching river networks. The interaction of disturbances with network geometry is central to the question of how habitat heterogeneity forms within riverine corridors, a principle underlying the emerging ecosystem concept of "riverscapes". We explore that interaction by examining how tributary confluences interrupt gradual downstream changes in channel morphology leading to locally increased physical heterogeneity. Punctuated erosion during storms and following fires and episodic floods leads to discontinuous inputs of water, sediment, and organic material at confluences that modify channel and valley-floor morphology for tens of meters to kilometers, including substrate sizes, channel hydraulic geometry, floodplain widths, fans, terraces, and log jams. Based on 14 studies that documented these confluence effects at 168 junctions spanning 6 orders of magnitude in drainage area, it appears that the probability of confluence effects increases with increasing size ratio of tributary to mainstem river. This simple scaling relationship indicates that the downstream increase in tributary basin size found in many watersheds results in a downstream increase in the spacing between confluences that have morphological effects, identifying a control on the spatial scale of confluence-related heterogeneity in rivers. Therefore, when a river network is viewed as a population of tributaries and confluences, the extent to which a network interrupts downstream continua of physical and biological processes (and the degree of habitat heterogeneity) should depend on network geometry, basin shape, drainage density, and basin size. For example, oval-shaped basins (containing

  8. Selection of allelic isozyme polymorphisms in marine organisms: pattern, theory, and application.

    PubMed

    Nevo, E; Lavie, E; Ben-Shlomo, R

    1983-01-01

    The evolutionary significance of allelic isozyme polymorphisms in several Mediterranean marine organisms was tested initially by post-hoc gene frequency analyses at 11-15 gene loci in natural populations of barnacles, Balanus amphitrite, under thermal [Nevo et al, 1977] and chemical [Nevo et al, 1978] pollutions. We next carried out pre-hoc controlled laboratory experiments to test the effects of heavy metal pollution (Hg, Zn, Cd) on genotypic frequencies of 15 phosphoglucomutase (PGM) genotypes in thousands of individuals of the shrimp Palaemon elegans [Nevo et al, 1980, 1981a, and the present study]. Similarly, we tested the effects of Hg, Zn, Cd, Pb, Cu pollutions on the genotypic and allelic frequencies of five phosphoglucose isomerase (PGI) genotypes in the two close species of marine gastropods, Monodonta turbinata and M turbiformis [Lavie and Nevo, 1982, and the present study]. In both the thermal and chemical pollution studies, we established in repeated experiments statistically significant differences of allele frequencies at 8 out of 11 (73%) and 10 out of 15 (67%) gene loci, respectively, between the contrasting environments in each. While no specific function could be singled out in the post-hoc chemical study due to the complex nature of polluted marine water, temperature could be specified as the primary selective agent in the thermal study. The strongest direct and specific evidence for significant differential survivorship among allelic isozyme genotypes was obtained in the pre-hoc studies in Palaemon and Monodonta. Their differential viability was probably associated with the different degree of heavy metal inhibition uniquely related to each specific pollutant. Furthermore, we demonstrated in the two closely related Monodonta species parallel genotypic differentiation as a response to pollution. Our results are inconsistent with the neutral theory of allelic isozyme polymorphisms and appear to reflect the adaptive nature of the allelic isozyme

  9. Multifractality to Photonic Crystal & Self-Organization to Metamaterials through Anderson Localizations & Group/Gauge Theory

    NASA Astrophysics Data System (ADS)

    Hidajatullah-Maksoed, Widastra

    2015-04-01

    Arthur Cayley investigated such structures by creating the theory of permutation groups [F:\\Group_theory.htm]. In the addressing of cell elements of the lattice Qmf a Cayley tree is used, and the self-affine object Qmf is described by the combination of the finite groups of rotation & inversion and the infinite groups of translation & dilation [G Corso & LS Lucena: ``Multifractal lattice and group theory'', Physica A: Statistical Mechanics & Its Applications, 2005, v 357, issue 1, pp 64-70; http://www.sciencedirect.com/science/articel/pii/S0378437105005005 ]; hence multifractals can be related to group theory. Many grateful thanks to HE Mr. Drs. P. SWANTORO & HE Mr. Ir. SARWONO KUSUMAATMADJA.

  10. Immodest Witnesses: Reliability and Writing Assessment

    ERIC Educational Resources Information Center

    Gallagher, Chris W.

    2014-01-01

    This article offers a survey of three reliability theories in writing assessment: positivist, hermeneutic, and rhetorical. Drawing on an interdisciplinary investigation of the notion of "witnessing," this survey emphasizes the kinds of readers and readings each theory of reliability produces and the epistemological grounds on which it…

  11. Latent Trait Theory Approach to Measuring Person-Organization Fit: Conceptual Rationale and Empirical Evaluation

    ERIC Educational Resources Information Center

    Chernyshenko, Oleksandr S.; Stark, Stephen; Williams, Alex

    2009-01-01

    The purpose of this article is to offer a new approach to measuring person-organization (P-O) fit, referred to here as "Latent fit." Respondents were administered unidimensional forced choice items and were asked to choose the statement in each pair that better reflected the correspondence between their values and those of the organization;…

  12. Toward a Theory of Variation in the Organization of the Word Reading System

    ERIC Educational Resources Information Center

    Rueckl, Jay G.

    2016-01-01

    The strategy underlying most computational models of word reading is to specify the organization of the reading system--its architecture and the processes and representations it employs--and to demonstrate that this organization would give rise to the behavior observed in word reading tasks. This approach fails to adequately address the variation…

  13. Understanding the Environmental Elements in Religious Student Organizations through Sharon Parks' Mentoring Community Theory

    ERIC Educational Resources Information Center

    Gill, David Christopher

    2011-01-01

    Students are coming to colleges and universities for spiritual fulfillment and have turned to religious student organizations (i.e. Campus Crusade for Christ, Newman Centers, Muslim Student Association, Hillel, etc.) to attain guidance and support. To better understand the spiritual environment religious student organizations have in place, many…

  14. Power laws and self-organized criticality in theory and nature

    NASA Astrophysics Data System (ADS)

    Marković, Dimitrije; Gros, Claudius

    2014-03-01

    Power laws and distributions with heavy tails are common features of many complex systems. Examples are the distribution of earthquake magnitudes, solar flare intensities and the sizes of neuronal avalanches. Previously, researchers surmised that a single general concept may act as an underlying generative mechanism, with the theory of self-organized criticality being a weighty contender. The power-law scaling observed in the primary statistical analysis is an important, but by far not the only, feature characterizing experimental data. The scaling function, the distribution of energy fluctuations, the distribution of inter-event waiting times, and other higher-order spatial and temporal correlations have seen increased consideration over the last years, leading to the realization that basic models, like the original sandpile model, are often insufficient to adequately describe the complexity of real-world systems with power-law distributions. Consequently, a substantial amount of effort has gone into developing new and extended models and, hitherto, three classes of models have emerged. The first line of models is based on a separation between the time scales of an external drive and an internal dissipation, and includes the original sandpile model and its extensions, like the dissipative earthquake model. Within this approach the steady state is close to criticality in terms of an absorbing phase transition. The second line of models is based on external drives and internal dynamics competing on similar time scales and includes the coherent noise model, which has a non-critical steady state characterized by heavy-tailed distributions. The third line of models proposes a non-critical self-organizing state, guided by an optimization principle, such as the concept of highly optimized tolerance. We present a comparative overview of these distinct modeling approaches together with a discussion of their potential relevance as underlying generative models for real
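    The "original sandpile model" used as the reference point above is short enough to state in full. Here is a minimal Python implementation of the Bak-Tang-Wiesenfeld rules on an open-boundary grid; the grid size and run length are illustrative, and heavy-tailed avalanche statistics only emerge clearly for larger grids and longer runs:

        import random

        N = 20
        grid = [[0] * N for _ in range(N)]

        def drop_and_topple():
            """Add one grain at a random site, relax fully, return avalanche size."""
            i, j = random.randrange(N), random.randrange(N)
            grid[i][j] += 1
            size = 0
            unstable = [(i, j)] if grid[i][j] >= 4 else []
            while unstable:
                x, y = unstable.pop()
                if grid[x][y] < 4:
                    continue
                grid[x][y] -= 4
                size += 1
                if grid[x][y] >= 4:              # may need to topple again
                    unstable.append((x, y))
                for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nx, ny = x + dx, y + dy
                    if 0 <= nx < N and 0 <= ny < N:  # grains fall off the open boundary
                        grid[nx][ny] += 1
                        if grid[nx][ny] >= 4:
                            unstable.append((nx, ny))
            return size

        sizes = [drop_and_topple() for _ in range(20000)]
        big = sum(s >= 50 for s in sizes)
        print(f"avalanches with >= 50 topplings: {big} of {len(sizes)}")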

  15. Knowledge sharing within organizations: linking art, theory, scenarios and professional experience

    NASA Technical Reports Server (NTRS)

    Bailey, T.; Burton, Y. C.

    2000-01-01

    In this discussion, T. Bailey will be addressing the multiple paradigms within organizations using imagery. Dr. Burton will discuss the relationship between these paradigms and social exchanges that lead to knowledge sharing.

  16. Reliable prediction of three-body intermolecular interactions using dispersion-corrected second-order Møller-Plesset perturbation theory

    SciTech Connect

    Huang, Yuanhang; Beran, Gregory J. O.

    2015-07-28

    Three-body and higher intermolecular interactions can play an important role in molecular condensed phases. Recent benchmark calculations found problematic behavior for many widely used density functional approximations in treating 3-body intermolecular interactions. Here, we demonstrate that the combination of second-order Møller-Plesset (MP2) perturbation theory plus short-range damped Axilrod-Teller-Muto (ATM) dispersion accurately describes 3-body interactions with reasonable computational cost. The empirical damping function used in the ATM dispersion term compensates both for the absence of higher-order dispersion contributions beyond the triple-dipole ATM term and for non-additive short-range exchange terms which arise in third-order perturbation theory and beyond. Empirical damping enables this simple model to out-perform a non-expanded coupled Kohn-Sham dispersion correction for 3-body intermolecular dispersion. The MP2 plus ATM dispersion model approaches the accuracy of O(N⁶) methods like MP2.5 or even spin-component-scaled coupled cluster models for 3-body intermolecular interactions with only O(N⁵) computational cost.
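    For reference, the triple-dipole ATM term named above has a standard closed form; the damping factor is shown generically, since the record does not reproduce the paper's specific parametrization:

        \[
        E^{\mathrm{ATM}}_{abc}
        = f_d(r_{ab}, r_{bc}, r_{ca})\, C_9^{abc}\,
          \frac{1 + 3\cos\theta_a \cos\theta_b \cos\theta_c}{(r_{ab}\, r_{bc}\, r_{ca})^{3}},
        \]

    where the \(\theta\) are the internal angles of the triangle formed by atoms a, b, c, \(C_9^{abc}\) is the triple-dipole dispersion coefficient, and the damping function \(f_d \to 1\) at long range while switching the term off at short range, absorbing the neglected higher-order dispersion and exchange effects described in the abstract.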

  17. Molecular Theory of Detonation Initiation: Insight from First Principles Modeling of the Decomposition Mechanisms of Organic Nitro Energetic Materials.

    PubMed

    Tsyshevsky, Roman V; Sharia, Onise; Kuklja, Maija M

    2016-02-19

    This review presents a concept, which assumes that thermal decomposition processes play a major role in defining the sensitivity of organic energetic materials to detonation initiation. As a science and engineering community we are still far away from having a comprehensive molecular detonation initiation theory in a widely agreed upon form. However, recent advances in experimental and theoretical methods allow for a constructive and rigorous approach to design and test the theory or at least some of its fundamental building blocks. In this review, we analyzed a set of select experimental and theoretical articles, which were augmented by our own first principles modeling and simulations, to reveal new trends in energetic materials and to refine known existing correlations between their structures, properties, and functions. Our consideration is intentionally limited to the processes of thermally stimulated chemical reactions at the earliest stage of decomposition of molecules and materials containing defects.

  18. Molecular Theory of Detonation Initiation: Insight from First Principles Modeling of the Decomposition Mechanisms of Organic Nitro Energetic Materials

    DOE PAGES

    Tsyshevsky, Roman; Sharia, Onise; Kuklja, Maija

    2016-02-19

    Our review presents a concept, which assumes that thermal decomposition processes play a major role in defining the sensitivity of organic energetic materials to detonation initiation. As a science and engineering community we are still far away from having a comprehensive molecular detonation initiation theory in a widely agreed upon form. However, recent advances in experimental and theoretical methods allow for a constructive and rigorous approach to design and test the theory or at least some of its fundamental building blocks. In this review, we analyzed a set of select experimental and theoretical articles, which were augmented by our own first principles modeling and simulations, to reveal new trends in energetic materials and to refine known existing correlations between their structures, properties, and functions. Lastly, our consideration is intentionally limited to the processes of thermally stimulated chemical reactions at the earliest stage of decomposition of molecules and materials containing defects.

  19. Molecular Theory of Detonation Initiation: Insight from First Principles Modeling of the Decomposition Mechanisms of Organic Nitro Energetic Materials.

    PubMed

    Tsyshevsky, Roman V; Sharia, Onise; Kuklja, Maija M

    2016-01-01

    This review presents a concept, which assumes that thermal decomposition processes play a major role in defining the sensitivity of organic energetic materials to detonation initiation. As a science and engineering community we are still far away from having a comprehensive molecular detonation initiation theory in a widely agreed upon form. However, recent advances in experimental and theoretical methods allow for a constructive and rigorous approach to design and test the theory or at least some of its fundamental building blocks. In this review, we analyzed a set of select experimental and theoretical articles, which were augmented by our own first principles modeling and simulations, to reveal new trends in energetic materials and to refine known existing correlations between their structures, properties, and functions. Our consideration is intentionally limited to the processes of thermally stimulated chemical reactions at the earliest stage of decomposition of molecules and materials containing defects. PMID:26907231

  20. Intermolecular symmetry-adapted perturbation theory study of large organic complexes

    SciTech Connect

    Heßelmann, Andreas; Korona, Tatiana

    2014-09-07

    Binding energies for the complexes of the S12L database by Grimme [Chem. Eur. J. 18, 9955 (2012)] were calculated using intermolecular symmetry-adapted perturbation theory combined with a density-functional theory description of the interacting molecules. The individual interaction energy decompositions revealed no particular change in the stabilisation pattern as compared to smaller dimer systems at equilibrium structures. This demonstrates that, to some extent, the qualitative description of the interaction of small dimer systems may be extrapolated to larger systems, a method that is widely used in force-fields in which the total interaction energy is decomposed into atom-atom contributions. A comparison of the binding energies with accurate experimental reference values from Grimme, the latter including thermodynamic corrections from semiempirical calculations, has shown a fairly good agreement to within the error range of the reference binding energies.
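    For reference, the "interaction energy decompositions" mentioned above refer to the physically interpretable components of the SAPT expansion; through second order the standard textbook form is (whether the δHF estimate of higher-order induction was added in this work is not stated in the record):

        \[
        E_{\mathrm{int}} \approx
        E^{(1)}_{\mathrm{elst}} + E^{(1)}_{\mathrm{exch}}
        + E^{(2)}_{\mathrm{ind}} + E^{(2)}_{\mathrm{exch\text{-}ind}}
        + E^{(2)}_{\mathrm{disp}} + E^{(2)}_{\mathrm{exch\text{-}disp}}
        \;\bigl(+\, \delta_{\mathrm{HF}}\bigr),
        \]

    i.e., electrostatics, exchange repulsion, induction, and dispersion with their exchange counterparts, each computed separately for monomers that are here described by density-functional theory.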

  1. Theoretical modeling of the linear and nonlinear optical properties of organic crystals within the rigorous local field theory (RLFT)

    SciTech Connect

    Seidler, T.; Stadnicka, K.; Champagne, B.

    2015-03-30

    This contribution summarizes our current findings in the field of calculating and predicting the linear and second-order nonlinear electric susceptibility tensor components of organic crystals. The methodology used for this purpose is based on a combination of the electrostatic interaction scheme developed by Munn and his coworkers (RLFT) with high-level electronic structure calculations. We compare the results of the calculations with available experimental data for several examples of molecular crystals. We show that the quality of the final results is influenced by (i) the chromophore geometry, (ii) the method used for the molecular property calculations, and (iii) the partitioning scheme used. In conclusion, we summarize further plans to improve the reliability and predictive power of the method.

  2. Top-emitting white organic light-emitting devices with down-conversion phosphors: theory and experiment.

    PubMed

    Ji, Wenyu; Zhang, Letian; Gao, Ruixue; Zhang, Liming; Xie, Wenfa; Zhang, Hanzhuang; Li, Bin

    2008-09-29

    White top-emitting organic light-emitting devices (TEOLEDs) with down-conversion phosphors are investigated in theory and experiment. The theoretical simulation combines a microcavity model with a down-conversion model. A white TEOLED combining a blue TEOLED with the organic down-conversion phosphor 3-(4-(diphenylamino)phenyl)-1-phenylprop-2-en-1-one was fabricated to validate the simulated results. It is shown that this approach permits the generation of white light in TEOLEDs. The efficiency of the white TEOLED is twice that of the corresponding blue TEOLED. Feasible methods to improve the performance of such white TEOLEDs are discussed.

  3. Hierarchies in eukaryotic genome organization: Insights from polymer theory and simulations

    PubMed Central

    2011-01-01

    Eukaryotic genomes possess an elaborate and dynamic higher-order structure within the limiting confines of the cell nucleus. Knowledge of the physical principles and the molecular machinery that govern the 3D organization of this structure and its regulation are key to understanding the relationship between genome structure and function. Elegant microscopy and chromosome conformation capture techniques supported by analysis based on polymer models are important steps in this direction. Here, we review results from these efforts and provide some additional insights that elucidate the relationship between structure and function at different hierarchical levels of genome organization. PMID:21595865

  4. The Impact of Multiple Master Patient Index Records on the Business Performance of Health Care Organizations: A Qualitative Grounded Theory Study

    ERIC Educational Resources Information Center

    Banton, Cynthia L.

    2014-01-01

    The purpose of this qualitative grounded theory study was to explore and examine the factors that led to the creation of multiple record entries, and present a theory on the impact the problem has on the business performance of health care organizations. A sample of 59 health care professionals across the United States participated in an online…

  5. Understanding the Value of Enterprise Architecture for Organizations: A Grounded Theory Approach

    ERIC Educational Resources Information Center

    Nassiff, Edwin

    2012-01-01

    There is a high rate of information system implementation failures attributed to the lack of alignment between business and information technology strategy. Although enterprise architecture (EA) is a means to correct alignment problems and executives highly rate the importance of EA, it is still not used in most organizations today. Current…

  6. Understanding Program Planning Theory and Practice in a Feminist Community-Based Organization

    ERIC Educational Resources Information Center

    Bracken, Susan J.

    2011-01-01

    The purpose of this article is to discuss feminist-program-planning issues, drawing from a critical ethnographic study of a Latin American feminist community-based organization. The research findings discuss the centrality of feminist identity to understanding and analyzing day-to-day program-planning process issues within a feminist…

  7. Knowledge sharing within organizations: linking art, theory, scenarios and professional experience

    NASA Technical Reports Server (NTRS)

    Burton, Y. C.; Bailey, T.

    2000-01-01

    In this presentation, Burton and Bailey, discuss the challenges and opportunities in developing knowledge sharing systems in organizations. Bailey provides a tool using imagery and collage for identifying and utilizing the diverse values and beliefs of individuals and groups. Burton reveals findings from a business research study that examines how social construction influences knowledge sharing among task oriented groups.

  8. The self-organizing fractal theory as a universal discovery method: the phenomenon of life.

    PubMed

    Kurakin, Alexei

    2011-03-29

    A universal discovery method potentially applicable to all disciplines studying organizational phenomena has been developed. This method takes advantage of a new form of global symmetry, namely, scale-invariance of self-organizational dynamics of energy/matter at all levels of organizational hierarchy, from elementary particles through cells and organisms to the Universe as a whole. The method is based on an alternative conceptualization of physical reality postulating that the energy/matter comprising the Universe is far from equilibrium, that it exists as a flow, and that it develops via self-organization in accordance with the empirical laws of nonequilibrium thermodynamics. It is postulated that the energy/matter flowing through and comprising the Universe evolves as a multiscale, self-similar structure-process, i.e., as a self-organizing fractal. This means that certain organizational structures and processes are scale-invariant and are reproduced at all levels of the organizational hierarchy. Being a form of symmetry, scale-invariance naturally lends itself to a new discovery method that allows for the deduction of missing information by comparing scale-invariant organizational patterns across different levels of the organizational hierarchy. An application of the new discovery method to life sciences reveals that moving electrons represent a keystone physical force (flux) that powers, animates, informs, and binds all living structures-processes into a planetary-wide, multiscale system of electron flow/circulation, and that all living organisms and their larger-scale organizations emerge to function as electron transport networks that are supported by and, at the same time, support the flow of electrons down the Earth's redox gradient maintained along the core-mantle-crust-ocean-atmosphere axis of the planet. The presented findings lead to a radically new perspective on the nature and origin of life, suggesting that living matter is an organizational state

  9. The self-organizing fractal theory as a universal discovery method: the phenomenon of life

    PubMed Central

    2011-01-01

    A universal discovery method potentially applicable to all disciplines studying organizational phenomena has been developed. This method takes advantage of a new form of global symmetry, namely, scale-invariance of self-organizational dynamics of energy/matter at all levels of organizational hierarchy, from elementary particles through cells and organisms to the Universe as a whole. The method is based on an alternative conceptualization of physical reality postulating that the energy/matter comprising the Universe is far from equilibrium, that it exists as a flow, and that it develops via self-organization in accordance with the empirical laws of nonequilibrium thermodynamics. It is postulated that the energy/matter flowing through and comprising the Universe evolves as a multiscale, self-similar structure-process, i.e., as a self-organizing fractal. This means that certain organizational structures and processes are scale-invariant and are reproduced at all levels of the organizational hierarchy. Being a form of symmetry, scale-invariance naturally lends itself to a new discovery method that allows for the deduction of missing information by comparing scale-invariant organizational patterns across different levels of the organizational hierarchy. An application of the new discovery method to life sciences reveals that moving electrons represent a keystone physical force (flux) that powers, animates, informs, and binds all living structures-processes into a planetary-wide, multiscale system of electron flow/circulation, and that all living organisms and their larger-scale organizations emerge to function as electron transport networks that are supported by and, at the same time, support the flow of electrons down the Earth's redox gradient maintained along the core-mantle-crust-ocean-atmosphere axis of the planet. The presented findings lead to a radically new perspective on the nature and origin of life, suggesting that living matter is an organizational state

  10. The self-organizing fractal theory as a universal discovery method: the phenomenon of life.

    PubMed

    Kurakin, Alexei

    2011-01-01

    A universal discovery method potentially applicable to all disciplines studying organizational phenomena has been developed. This method takes advantage of a new form of global symmetry, namely, scale-invariance of self-organizational dynamics of energy/matter at all levels of organizational hierarchy, from elementary particles through cells and organisms to the Universe as a whole. The method is based on an alternative conceptualization of physical reality postulating that the energy/matter comprising the Universe is far from equilibrium, that it exists as a flow, and that it develops via self-organization in accordance with the empirical laws of nonequilibrium thermodynamics. It is postulated that the energy/matter flowing through and comprising the Universe evolves as a multiscale, self-similar structure-process, i.e., as a self-organizing fractal. This means that certain organizational structures and processes are scale-invariant and are reproduced at all levels of the organizational hierarchy. Being a form of symmetry, scale-invariance naturally lends itself to a new discovery method that allows for the deduction of missing information by comparing scale-invariant organizational patterns across different levels of the organizational hierarchy. An application of the new discovery method to life sciences reveals that moving electrons represent a keystone physical force (flux) that powers, animates, informs, and binds all living structures-processes into a planetary-wide, multiscale system of electron flow/circulation, and that all living organisms and their larger-scale organizations emerge to function as electron transport networks that are supported by and, at the same time, support the flow of electrons down the Earth's redox gradient maintained along the core-mantle-crust-ocean-atmosphere axis of the planet. The presented findings lead to a radically new perspective on the nature and origin of life, suggesting that living matter is an organizational state

  11. [Theory of spatial organization of epithelial layers (using neuroepithelia as an example)].

    PubMed

    Savost'ianov, G A

    2001-01-01

    New principles of the spatial organization of epithelial layers and a highly productive approach for studying their three-dimensional histoarchitecture are proposed for the first time. The approach is based on the conception of the modular structure of tissues and comprises a family of topological and geometrical models of tissue structure together with their experimental approbation. It allows the creation of a theoretical histology of epithelial layers, making it possible to predict, and to identify experimentally, new variants of multi-row and multi-series epithelia and to forecast their changes in development and pathology. According to this conception, a family of new three-dimensional tissue models was created. It is demonstrated that the structures studied are characteristic of real epithelia. The possible existence of a new type of epithelium is predicted, and a complex of new informative signs of its spatial organization is presented.

  12. Electronic structure of the organic semiconductor copper phthalocyanine: experiment and theory.

    PubMed

    Aristov, V Yu; Molodtsova, O V; Maslyuk, V V; Vyalikh, D V; Zhilin, V M; Ossipyan, Yu A; Bredow, T; Mertig, I; Knupfer, M

    2008-01-21

    The electronic structure of the organic semiconductor copper phthalocyanine (CuPc) has been determined by a combination of conventional and resonant photoemission and near-edge x-ray absorption, as well as by first-principles calculations. The experimentally obtained electronic valence band structure of CuPc is in very good agreement with the calculated density-of-states results, allowing the derivation of detailed site-specific information.

  13. [Participation and integration: the self-organization theories point of view].

    PubMed

    Aleksandrowicz, Ana Maria Coutinho

    2009-10-01

    This article presents theoretical bases to facilitate participation and integration within an interdisciplinary research team. In order to achieve this, we will sketch fundamental notions related to the new conceptual field of self-organization of living beings. Subsequently, we will expose some ideas by Henri Atlan, Jean-Pierre Dupuy and Cornelius Castoriadis that are important to reach our objectives. Finally, we will suggest how to turn these principles into practice. PMID:19750370

  14. Ethical models in bioethics: theory and application in organ allocation policies.

    PubMed

    Petrini, C

    2010-12-01

    Policies for allocating organs to people awaiting a transplant constitute a major ethical challenge. First and foremost, they demand balance between the principles of beneficence and justice, but many other ethically relevant principles are also involved: autonomy, responsibility, equity, efficiency, utility, therapeutic outcome, medical urgency, and so forth. Various organ allocation models can be developed based on the hierarchical importance assigned to a given principle over the others, but none of the principles should be completely disregarded. An ethically acceptable organ allocation policy must therefore be in conformity, to a certain extent, with the requirements of all the principles. Many models for organ allocation can be derived. The utilitarian model aims to maximize benefits, which can be of various types on a social or individual level, such as the number of lives saved, prognosis, and so forth. The prioritarian model favours the neediest or those who suffer most. The egalitarian model privileges equity and justice, suggesting that all people should have an equal opportunity (random allocation) or that priority should be given to those who have been waiting longer. The personalist model focuses on each individual patient, attempting to mesh together all the various aspects affecting the person: therapeutic needs (urgency), fairness, clinical outcomes, respect for persons. In the individualistic model the main element is free choice, and the system of opting-in is privileged. Contrary to the individualistic model, the communitarian model identifies in the community the fundamental elements for the legitimacy of choices; therefore, the system of opting-out is privileged. This article does not aim at suggesting practical solutions. Rather, it furnishes decision makers with an overview of possible ethical approaches to this matter.
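
    The contrast among these models can be made concrete with a toy ranking exercise. The sketch below is not from the article; the patient fields, weights, and scoring rules are invented purely to show how the utilitarian, prioritarian, and egalitarian criteria reorder the same waitlist.

    ```python
    # Toy waitlist ranking under three of the allocation models named above.
    # All fields and scores are hypothetical illustrations, not clinical criteria.
    patients = [
        {"name": "A", "years_waiting": 4.0, "urgency": 0.9, "expected_benefit": 0.5},
        {"name": "B", "years_waiting": 1.0, "urgency": 0.4, "expected_benefit": 0.9},
        {"name": "C", "years_waiting": 2.5, "urgency": 0.7, "expected_benefit": 0.7},
    ]

    models = {
        "utilitarian": lambda p: p["expected_benefit"],   # maximize outcome
        "prioritarian": lambda p: p["urgency"],           # favour the neediest
        "egalitarian": lambda p: p["years_waiting"],      # longest wait first
    }

    for label, score in models.items():
        order = sorted(patients, key=score, reverse=True)
        print(label, [p["name"] for p in order])
    ```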

  16. Integrating mechanistic organism--environment interactions into the basic theory of community and evolutionary ecology.

    PubMed

    Baskett, Marissa L

    2012-03-15

    This paper presents an overview of how mechanistic knowledge of organism-environment interactions, including biomechanical interactions of heat, mass and momentum transfer, can be integrated into basic theoretical population biology through mechanistic functional responses that quantitatively describe how organisms respond to their physical environment. Integrating such functional responses into simple community and microevolutionary models allows scaling up of the organism-level understanding from biomechanics both ecologically and temporally. For community models, Holling-type functional responses for predator-prey interactions provide a classic example of the functional response affecting qualitative model dynamics, and recent efforts are expanding analogous models to incorporate environmental influences such as temperature. For evolutionary models, mechanistic functional responses dependent on the environment can serve as fitness functions in both quantitative genetic and game theoretic frameworks, especially those concerning function-valued traits. I present a novel comparison of a mechanistic fitness function based on thermal performance curves to a commonly used generic fitness function, which quantitatively differ in their predictions for response to environmental change. A variety of examples illustrate how mechanistic functional responses enhance model connections to biologically relevant traits and processes as well as environmental conditions and therefore have the potential to link theoretical and empirical studies. Sensitivity analysis of such models can provide biologically relevant insight into which parameters and processes are important to community and evolutionary responses to environmental change such as climate change, which can inform conservation management aimed at protecting response capacity. Overall, the distillation of detailed knowledge of organism-environment interactions into mechanistic functional responses in simple population
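
    As a concrete (and deliberately simplified) illustration of a mechanistic functional response, the sketch below scales the attack rate of a Holling type II response by a Gaussian thermal performance curve; the functional forms and every parameter value are assumptions made for this example, not values from the paper.

    ```python
    import numpy as np

    def holling_type_ii(N, a, h):
        """Holling type II response: prey consumed per predator per unit time.
        N: prey density, a: attack rate, h: handling time per prey item."""
        return a * N / (1.0 + a * h * N)

    def thermal_performance(T, T_opt=25.0, width=8.0):
        """Hypothetical Gaussian thermal performance curve (dimensionless 0-1)."""
        return np.exp(-((T - T_opt) / width) ** 2)

    # Temperature modulates the attack rate, which propagates into intake rate.
    for T in (15.0, 25.0, 35.0):
        a_T = 0.5 * thermal_performance(T)
        print(f"T={T:4.1f} C  intake={holling_type_ii(40.0, a_T, 0.1):.2f} prey/time")
    ```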

  17. Cortical organization: a description and interpretation of anatomical findings based on systems theory

    PubMed Central

    Casanova, Manuel F.

    2012-01-01

    The organization of the cortex can be understood as a complex system comprised of interconnected modules called minicolumns. Comparative anatomical studies suggest that evolution has prompted a scale free world network of connectivity within the white matter while simultaneously increasing the complexity of minicolumnar composition. It is this author’s opinion that this complex system is poised to collapse under the weight of environmental exigencies. Some mental disorders may be the manifestations of this collapse. PMID:22754693

  18. Burial of organic carbon and pyrite sulfur in sediments over phanerozoic time: a new theory

    NASA Astrophysics Data System (ADS)

    Berner, Robert A.; Raiswell, Robert

    1983-05-01

    In present-day marine sediments, almost all of which are deposited in normal oxygenated seawater, rates of burial of organic carbon (C) and pyrite sulfur (S) correlate positively and bear a constant ratio to one another (C/S ≈ 3 on a weight basis). By contrast, calculations based on the isotopic model of Garrels and Lerman (1981) indicate that at various times during the Phanerozoic the worldwide burial ratio must have been considerably different from the present-day value. This ratio change is caused by the requirement that increases in the worldwide mass of organic carbon must be accompanied by equivalent decreases in the mass of sedimentary pyrite sulfur, in order to maintain a roughly constant level of O2 in the atmosphere. Such apparently contradictory behavior can be explained if the locus of major organic carbon burial has shifted over time from normal marine environments, as at present, to non-marine freshwater or to euxinic environments in the geologic past. A shift to predominantly freshwater burial can help explain predicted high C/S ratios in Permo-Carboniferous sediments, and a shift to euxinic environments can help explain predicted low C/S ratios during the early Paleozoic. It is demonstrated that the three environments today exhibit distinguishably different average C/S ratios.
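
    A minimal sketch of the diagnostic arithmetic implied above; the two cutoff values are hypothetical, anchored only to the C/S ≈ 3 average for normal marine sediments.

    ```python
    def cs_ratio(c_burial, s_burial):
        """Organic carbon to pyrite sulfur burial ratio (weight basis)."""
        return c_burial / s_burial

    def classify(ratio):
        if ratio > 5.0:   # assumed cutoff: freshwater burial runs C-rich, S-poor
            return "freshwater-dominated burial (high C/S)"
        if ratio < 1.5:   # assumed cutoff: euxinic basins bury extra pyrite S
            return "euxinic-dominated burial (low C/S)"
        return "normal marine burial (C/S near 3)"

    print(classify(cs_ratio(9.0, 3.0)))   # ratio 3.0 -> normal marine
    ```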

  19. On the purposes of color for living beings: toward a theory of color organization.

    PubMed

    Pinna, Baingio; Reeves, Adam

    2015-01-01

    Phylogenetic and paleontological evidence indicates that in the animal kingdom the ability to perceive colors evolved independently several times over the course of millennia. This implies a high evolutionary neural investment and suggests that color vision provides some fundamental biological benefits. What are these benefits? Why are some animals so colorful? What are the adaptive and perceptual meanings of polychromatism? We suggest that in addition to the discrimination of light and surface chromaticity, sensitivity to color contributes to the whole, the parts and the fragments of perceptual organization. New versions of neon color spreading and the watercolor illusion indicate that the visual purpose of color in humans is threefold: to inter-relate each chromatic component of an object, thus favoring the emergence of the whole; to support a part-whole organization in which components reciprocally enhance each other by amodal completion; and, paradoxically, to reveal fragments and hide the whole; that is, there is a chromatic parceling-out process of separation, division, and fragmentation of the whole. The evolution of these contributions of color to organization needs to be established, but traces of it can be found in Harlequin camouflage by animals and in the coloration of flowers.

  20. Reliability and Maintainability (RAM) Training

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Packard, Michael H. (Editor)

    2000-01-01

    The theme of this manual is failure physics: the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low-cost reliable products. In a broader sense the manual should do more: it should underscore the urgent need for mature attitudes toward reliability. Five of the chapters were originally presented as a classroom course to over 1000 Martin Marietta engineers and technicians. Another four chapters and three appendixes have been added. We begin with a view of reliability from the years 1940 to 2000. Chapter 2 starts the training material with a review of mathematics and a description of what elements contribute to product failures. The remaining chapters elucidate basic reliability theory and the disciplines that allow us to control and eliminate failures.
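
    As a worked example of the most elementary result in that theory, the sketch below evaluates the constant-failure-rate (exponential) model; the failure rate is an assumed number, not one from the manual.

    ```python
    import math

    def reliability(t_hours, failure_rate_per_hour):
        """Survival probability under the exponential model: R(t) = exp(-lambda t)."""
        return math.exp(-failure_rate_per_hour * t_hours)

    lam = 1e-5                                   # assumed: 10 failures per 10^6 hours
    print("MTBF (h):", 1 / lam)                  # mean time between failures = 1/lambda
    print("R(1000 h):", reliability(1000, lam))  # ~0.990
    ```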

  1. A Theory for the Function of the Spermaceti Organ of the Sperm Whale (Physeter Catodon L.)

    NASA Technical Reports Server (NTRS)

    Norris, K. S.; Harvey, G. W.

    1972-01-01

    The function of the spermaceti organ of the sperm whale is studied using a model of its acoustic system. Suggested functions of the system include: (1) action as an acoustic resonating and sound-focussing chamber to form and process burst-pulsed clicks; (2) use of the nasal passages in the forehead for repeated recycling of air for phonation during dives and to provide mirrors for sound reflection and signal processing; and (3) use of the entire system to allow sound signal production especially useful for long-range echolocation in the deep sea.

  2. Theory of coupled hybrid inorganic/organic systems: Excitation transfer at semiconductor/molecule interfaces

    NASA Astrophysics Data System (ADS)

    Specht, Judith; Verdenhalven, Eike; Theuerholz, Sverre; Knorr, Andreas; Richter, Marten

    2016-03-01

    We derive a theoretical framework for describing hybrid organic-inorganic systems consisting of an ordered organic molecular layer coupled to a semiconductor quantum well (e.g., ZnO). A Heisenberg equation of motion technique based on a density matrix formalism is applied to derive dynamical equations for the composite system on a mesoscopic scale. Our theoretical approach focuses on the influence of nonradiative Förster excitation transfer across the hybrid interface on linear optical absorption spectra. Therefore, the dielectric screening is discussed at the interface of two materials with different dielectric constants. Moreover, the Förster transfer matrix element is calculated in the point-dipole approximation. For a consistent theoretical description of both constituents (i.e., the molecular layer and the semiconductor substrate), the problem is treated in momentum space. Solving the equations of motion for the microscopic polarizations in frequency space directly leads to an equation for the frequency-dependent linear absorption coefficient. Our theoretical approach forms the basis for studying parameter regimes and geometries with optimized excitation transfer efficiency across the semiconductor/molecule interface.

  3. Organizing principles as tools for bridging the gap between system theory and biological experimentation.

    PubMed

    Mekios, Constantinos

    2016-04-01

    Twentieth-century theoretical efforts towards the articulation of general system properties fell short of having the significant impact on biological practice that their proponents envisioned. Although the latter did arrive at preliminary mathematical formulations of such properties, they had little success in showing how these could be productively incorporated into the research agenda of biologists. Consequently, the gap that kept system-theoretic principles cut off from biological experimentation persisted. More recently, however, simple theoretical tools have proved readily applicable within the context of systems biology. In particular, examples reviewed in this paper suggest that rigorous mathematical expressions of design principles, imported primarily from engineering, could produce experimentally confirmable predictions of the regulatory properties of small biological networks. But this is not enough for contemporary systems biologists who adopt the holistic aspirations of early systemologists, seeking high-level organizing principles that could provide insights into problems of biological complexity at the whole-system level. While the presented evidence is not conclusive about whether this strategy could lead to the realization of the lofty goal of a comprehensive explanatory integration, it suggests that the ongoing quest for organizing principles is pragmatically advantageous for systems biologists. The formalisms postulated in the course of this process can serve as bridges between system-theoretic concepts and the results of molecular experimentation: they constitute theoretical tools for generalizing molecular data, thus producing increasingly accurate explanations of system-wide phenomena. PMID:26781787

  4. From organized high throughput data to phenomenological theory: The example of dielectric breakdown

    NASA Astrophysics Data System (ADS)

    Kim, Chiho; Pilania, Ghanshyam; Ramprasad, Rampi

    Understanding the behavior (and failure) of dielectric insulators experiencing extreme electric fields is critical to the operation of present and emerging electrical and electronic devices. Despite its importance, the development of a predictive theory of dielectric breakdown has remained a challenge, owing to the complex multiscale nature of this process. Here, we focus on the intrinsic dielectric breakdown field of insulators: the theoretical limit of breakdown determined purely by the chemistry of the material, i.e., the elements the material is composed of, the atomic-level structure, and the bonding. Starting from a benchmark dataset (generated from laborious first-principles computations) of the intrinsic dielectric breakdown field of a variety of model insulators, simple predictive phenomenological models of dielectric breakdown are distilled using advanced statistical or machine learning schemes, revealing key correlations and analytical relationships between the breakdown field and easily accessible material properties. The models are shown to be general, and can hence guide the screening and systematic identification of high electric field tolerant materials.
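
    A minimal sketch of the distillation step, assuming a hypothetical two-descriptor dataset (band gap, phonon cutoff) and a simple power-law ansatz fit in log space; the actual descriptors, data, and learning schemes in the work are more elaborate.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Hypothetical training set: two easily accessible properties per insulator
    # (band gap in eV, phonon cutoff in THz) against a first-principles intrinsic
    # breakdown field in MV/m. All numbers are invented for illustration.
    X = np.array([[5.0, 12.0], [9.0, 30.0], [6.5, 20.0], [11.0, 40.0]])
    y = np.array([300.0, 2200.0, 800.0, 4500.0])

    # Power-law ansatz E_bd ~ C * gap^a * cutoff^b becomes linear in log space.
    model = LinearRegression().fit(np.log(X), np.log(y))
    print("fitted exponents (a, b):", model.coef_)
    print("screening prediction:", np.exp(model.predict(np.log([[8.0, 25.0]])))[0])
    ```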

  5. Beyond frontier molecular orbital theory: a systematic electron transfer model (ETM) for polar bimolecular organic reactions.

    PubMed

    Cahill, Katharine J; Johnson, Richard P

    2013-03-01

    Polar bimolecular reactions often begin as charge-transfer complexes and may proceed with a high degree of electron transfer character. Frontier molecular orbital (FMO) theory is predicated in part on this concept. We have developed an electron transfer model (ETM) in which we systematically transfer one electron between reactants and then use density functional methods to model the resultant radical or radical ion intermediates. Sites of higher reactivity are revealed by a composite spin density map (SDM) of odd electron character on the electron density surface, assuming that a new two-electron bond would occur preferentially at these sites. ETM correctly predicts regio- and stereoselectivity for a broad array of reactions, including Diels-Alder, dipolar and ketene cycloadditions, Birch reduction, many types of nucleophilic additions, and electrophilic addition to aromatic rings and polyenes. Conformational analysis of radical ions is often necessary to predict reaction stereochemistry. The electronic and geometric changes due to one-electron oxidation or reduction parallel the reaction coordinate for electrophilic or nucleophilic addition, respectively. The effect is more dramatic for one-electron reduction.

  6. Mean-field theory of atomic self-organization in optical cavities

    NASA Astrophysics Data System (ADS)

    Jäger, Simon B.; Schütz, Stefan; Morigi, Giovanna

    2016-08-01

    Photons mediate long-range optomechanical forces between atoms in high-finesse resonators, which can induce the formation of ordered spatial patterns. When a transverse laser drives the atoms, the system undergoes a second-order phase transition that separates a uniform spatial density from a Bragg grating maximizing scattering into the cavity and is controlled by the laser intensity. Starting from a Fokker-Planck equation describing the semiclassical dynamics of the N-atom distribution function, we systematically develop a mean-field model and analyze its predictions for the equilibrium and out-of-equilibrium dynamics. The validity of the mean-field model is tested by comparison with the numerical simulations of the N-body Fokker-Planck equation and by means of a Bogoliubov-Born-Green-Kirkwood-Yvon (BBGKY) hierarchy. The mean-field theory predictions reproduce several results of the N-body Fokker-Planck equation well for sufficiently short times and are in good agreement with existing theoretical approaches based on field-theoretical models. The mean field, on the other hand, predicts thermalization time scales which are at least one order of magnitude shorter than the ones predicted by the N-body dynamics. We attribute this discrepancy to the fact that the mean-field ansatz discards the effects of the long-range incoherent forces due to cavity losses.

  7. The metabolic pace-of-life model: incorporating ectothermic organisms into the theory of vertebrate ecoimmunology.

    PubMed

    Sandmeier, Franziska C; Tracy, Richard C

    2014-09-01

    We propose a new heuristic model that incorporates metabolic rate and pace of life to predict a vertebrate species' investment in adaptive immune function. Using reptiles as an example, we hypothesize that animals with low metabolic rates will invest more in innate immunity compared with adaptive immunity. High metabolic rates and body temperatures should logically optimize the efficacy of the adaptive immune system through rapid replication of T and B cells, prolific production of induced antibodies, and the kinetics of antibody-antigen interactions. In current theory, the precise mechanisms of vertebrate immune function often are inadequately considered as diverse selective pressures on the evolution of pathogens. We propose that the strength of adaptive immune function and pace of life together determine many of the important dynamics of host-pathogen evolution, namely, that hosts with a short lifespan and innate immunity or with a long lifespan and strong adaptive immunity are expected to drive the rapid evolution of their populations of pathogens. Long-lived hosts that rely primarily on innate immune functions are more likely to use defense mechanisms of tolerance (instead of resistance), which are not expected to act as a selection pressure for the rapid evolution of pathogens' virulence. PMID:24760792

  8. Organizations.

    ERIC Educational Resources Information Center

    Aviation/Space, 1980

    1980-01-01

    This is a list of aerospace organizations and other groups that provide educators with assistance and information in specific areas. Both government and nongovernment organizations are included. (Author/SA)

  9. Spin-boson theory for charge photogeneration in organic molecules: Role of quantum coherence

    NASA Astrophysics Data System (ADS)

    Yao, Yao

    2015-01-01

    The charge photogeneration process in organic molecules is investigated by a quantum heat engine model, in which two molecules are modeled by a two-spin system sandwiched between two bosonic baths. The two baths represent the high-temperature photon emission source and the low-temperature phonon environment, respectively. We utilize the time-dependent density matrix renormalization group algorithm to investigate the quantum dynamics of the model. It is found that the transient energy current flowing through the two molecules exhibits two stages. In the first stage the energy current is coherent and represents the ultrafast delocalization of the charge-transfer state, and in the second stage a steady incoherent current is established. The power conversion efficiency is remarkably high and may reach the maximum value of 93% with optimized model parameters. The long-lived quantum entanglement between the two spins is found to be primarily responsible for this hyperefficiency.

  10. Computational organic chemistry: bridging theory and experiment in establishing the mechanisms of chemical reactions.

    PubMed

    Cheng, Gui-Juan; Zhang, Xinhao; Chung, Lung Wa; Xu, Liping; Wu, Yun-Dong

    2015-02-11

    Understanding the mechanisms of chemical reactions, especially catalysis, has been an important and active area of computational organic chemistry, and close collaborations between experimentalists and theorists represent a growing trend. This Perspective provides examples of such productive collaborations. The understanding of various reaction mechanisms and the insight gained from these studies are emphasized. The applications of various experimental techniques in elucidation of reaction details as well as the development of various computational techniques to meet the demand of emerging synthetic methods, e.g., C-H activation, organocatalysis, and single electron transfer, are presented along with some conventional developments of mechanistic aspects. Examples of applications are selected to demonstrate the advantages and limitations of these techniques. Some challenges in the mechanistic studies and predictions of reactions are also analyzed.

  11. Leadership in nonprofit organizations of Nicaragua and El Salvador: a study from the social identity theory.

    PubMed

    Moriano León, Juan Antonio; Topa Cantisano, Gabriela; Lévy Mangin, Jean-Pierre

    2009-11-01

    This study follows the social identity model of leadership proposed by van Knippenberg and Hogg (2003), in order to examine empirically the mediator effect of leadership prototypicality between social identity, extra effort, and perceived effectiveness of group members. The sample consisted of 109 participants who worked in 22 different work-teams of non-profit organizations (NPO) from Nicaragua and El Salvador. The data analysis was performed through structural equation modeling (SEM). The results show that NPO membership is related to a high level of social identity. In addition, the results confirmed that leadership prototypicality has a significant and positive mediator effect in the relationship between the group identification and the group members' extra effort and the perceived effectiveness of leadership.

  12. Recent results on analytical plasma turbulence theory: Realizability, intermittency, submarginal turbulence, and self-organized criticality

    SciTech Connect

    Krommes, J.A.

    2000-01-18

    Recent results and future challenges in the systematic analytical description of plasma turbulence are described. First, the importance of statistical realizability is stressed, and the development and successes of the Realizable Markovian Closure are briefly reviewed. Next, submarginal turbulence (linearly stable but nonlinearly self-sustained fluctuations) is considered and the relevance of nonlinear instability in neutral-fluid shear flows to submarginal turbulence in magnetized plasmas is discussed. For the Hasegawa-Wakatani equations, a self-consistency loop that leads to steady-state vortex regeneration in the presence of dissipation is demonstrated and a partial unification of recent work of Drake (for plasmas) and of Waleffe (for neutral fluids) is given. Brief remarks are made on the difficulties facing a quantitatively accurate statistical description of submarginal turbulence. Finally, possible connections between intermittency, submarginal turbulence, and self-organized criticality (SOC) are considered and outstanding questions are identified.

  13. High Reliability and Excellence in Staffing.

    PubMed

    Mensik, Jennifer

    2015-01-01

    Nurse staffing is a complex issue, with many facets and no one right answer. High-reliability organizations (HROs) strive and succeed in achieving a high degree of safety or reliability despite operating in hazardous conditions. HROs have systems in place that make them extremely consistent in accomplishing their goals and avoiding potential errors. However, the inability to resolve quality issues may very well be related to the lack of adoption of high-reliability principles throughout our organizations.

  14. Theory of Current Transients in Planar Semiconductor Devices: Insights and Applications to Organic Solar Cells

    NASA Astrophysics Data System (ADS)

    Hawks, Steven A.; Finck, Benjamin Y.; Schwartz, Benjamin J.

    2015-04-01

    Time-domain current measurements are widely used to characterize semiconductor material properties, such as carrier mobility, doping concentration, carrier lifetime, and the static dielectric constant. It is therefore critical that these measurements be theoretically understood if they are to be successfully applied to assess the properties of materials and devices. In this paper, we derive generalized relations for describing current-density transients in planar semiconductor devices at uniform temperature. By spatially averaging the charge densities inside the semiconductor, we are able to provide a rigorous, straightforward, and experimentally relevant way to interpret these measurements. The formalism details several subtle aspects of current transients, including how the electrode charge relates to applied bias and internal space charge, how the displacement current can alter the apparent free-carrier current, and how to understand the integral of a charge-extraction transient. We also demonstrate how the formalism can be employed to derive the current transients arising from simple physical models, like those used to describe charge extraction by linearly increasing voltage (CELIV) and time-of-flight experiments. In doing so, we find that there is a nonintuitive factor-of-2 reduction in the apparent free-carrier concentration that can be easily missed, for example, in the application of charge-extraction models. Finally, to validate our theory and better understand the different current contributions, we perform a full time-domain drift-diffusion simulation of a CELIV trace and compare the results to our formalism. As expected, our analytic equations match precisely with the numerical solutions to the drift-diffusion, Poisson, and continuity equations. Thus, overall, our formalism provides a straightforward and general way to think about how the internal space-charge distribution, the electrode charge, and the externally applied bias translate into a measured
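
    For orientation, the sketch below evaluates the simple textbook CELIV mobility estimate commonly attributed to Juška and co-workers, not the generalized formalism derived in this paper; the device numbers are assumed.

    ```python
    def celiv_mobility(d, ramp, t_max, delta_j, j0):
        """Textbook CELIV estimate: mu = 2 d^2 / (3 A t_max^2 [1 + 0.36 dj/j0]).
        d: thickness (m), ramp A: voltage ramp rate (V/s), t_max: extraction-peak
        time (s), delta_j: peak current above the displacement step j0 (same units)."""
        return 2 * d**2 / (3 * ramp * t_max**2 * (1 + 0.36 * delta_j / j0))

    # Assumed numbers for a ~100 nm organic diode; result in m^2/(V s).
    print(celiv_mobility(d=100e-9, ramp=1e5, t_max=2e-6, delta_j=5.0, j0=10.0))
    ```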

  15. Mean Field Theory of a Coupled Heisenberg Model and Its Application to an Organic Antiferromagnet with Magnetic Anions

    NASA Astrophysics Data System (ADS)

    Ito, Kazuhiro; Shimahara, Hiroshi

    2016-02-01

    We examine the mean field theory of a uniaxial coupled Heisenberg antiferromagnet with two subsystems, one of which consists of strongly interacting small spins and the other consists of weakly interacting large spins. We reanalyze the experimental data of specific heat and magnetic susceptibility obtained by previous authors for the organic compound λ-(BETS)2FeCl4 at low temperatures, where BETS stands for bis(ethylenedithio)tetraselenafulvalene. The model parameters for this compound are evaluated, where the applicability of the theory is checked. As a result, it is found that J1 ≫ J12 ≫ J2, where J1, J2, and J12 denote the exchange coupling constant between π spins, that between 3d spins, and that between π and 3d spins, respectively. At the low-temperature limit, both sublattice magnetizations of the 3d and π spins are saturated, and the present model is reduced to the Schottky model, which successfully explains experimental observations in previous studies. As temperature increases, fluctuations of 3d spins increase, while π spins remain almost saturated. Near the critical temperature, both spins fluctuate significantly, and thus the mean field approximation breaks down. It is revealed that the magnetic anisotropy, which may be crucial to the antiferromagnetic long-range order, originates from J12 rather than from J2 and that the angle between the magnetic easy-axis and the crystal c-axis is approximately 26-27° in the present effective model.
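
    A minimal numerical sketch of a two-sublattice mean-field solution of this kind follows. The Brillouin-function self-consistency is textbook material; the spin values match the compound (S = 1/2 π spins, S = 5/2 Fe 3d spins), but the effective couplings (in kelvin, with coordination numbers absorbed) are illustrative assumptions ordered as J1 ≫ J12 ≫ J2, not the paper's fitted parameters.

    ```python
    import math

    def brillouin(S, x):
        """Brillouin function B_S(x); B_S -> 1 as x -> infinity."""
        if abs(x) < 1e-9:
            return 0.0
        a, b = (2 * S + 1) / (2 * S), 1 / (2 * S)
        return a / math.tanh(a * x) - b / math.tanh(b * x)

    def sublattice_moments(T, J1=50.0, J12=5.0, J2=0.5, S1=0.5, S2=2.5, iters=2000):
        """Fixed-point iteration for the staggered magnetizations (k_B = 1)."""
        m1, m2 = S1, S2                      # start from saturation
        for _ in range(iters):
            h1 = J1 * m1 + J12 * m2          # mean field on a pi spin
            h2 = J12 * m1 + J2 * m2          # mean field on a 3d spin
            m1 = S1 * brillouin(S1, S1 * h1 / T)
            m2 = S2 * brillouin(S2, S2 * h2 / T)
        return m1, m2

    # The 3d moments soften well before the pi moments as temperature rises.
    for T in (2.0, 6.0, 10.0):
        print(T, sublattice_moments(T))
    ```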

  16. Information theory and local learning rules in a self-organizing network of Ising spins

    NASA Astrophysics Data System (ADS)

    Haft, Michael; Schlang, Martin; Deco, Gustavo

    1995-09-01

    The Boltzmann machine uses the relative entropy as a cost function to fit the Boltzmann distribution to a fixed given distribution. Instead of the relative entropy, we use the mutual information between input and output units to define an unsupervised analogy to the conventional Boltzmann machine. Our network of Ising spins is fed by an external field via the input units. The output units should self-organize to form an "internal" representation of the "environmental" input, thereby compressing the data and extracting relevant features. The mutual information and its gradient with respect to the weights principally require nonlocal information, e.g., in the form of multipoint correlation functions. Hence the exact gradient can hardly be boiled down to a local learning rule. Conversely, by using only local terms and two-point interactions, the entropy of the output layer cannot be ensured to reach the maximum possible entropy for a fixed number of output neurons. Some redundancy may remain in the representation of the data at the output. We account for this limitation from the very beginning by reformulating the cost function correspondingly. From this cost function, local Hebb-like learning rules can be derived. Some experiments with these local learning rules are presented.
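
    For concreteness, a plug-in estimate of the mutual information between binary inputs and outputs; the noisy-channel toy data are hypothetical and far simpler than the Ising network above.

    ```python
    import numpy as np

    def mutual_information_bits(x, y):
        """Plug-in estimate of I(X;Y) in bits for two binary arrays."""
        mi = 0.0
        for a in (0, 1):
            for b in (0, 1):
                p_ab = np.mean((x == a) & (y == b))
                p_a, p_b = np.mean(x == a), np.mean(y == b)
                if p_ab > 0:
                    mi += p_ab * np.log2(p_ab / (p_a * p_b))
        return mi

    rng = np.random.default_rng(0)
    inp = rng.integers(0, 2, 10_000)
    out = np.where(rng.random(10_000) < 0.9, inp, 1 - inp)  # output = noisy copy
    print(mutual_information_bits(inp, out))  # close to 1 - H(0.1) ~ 0.531 bits
    ```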

  17. Small optical gap molecules and polymers: using theory to design more efficient materials for organic photovoltaics.

    PubMed

    Risko, Chad; Brédas, Jean-Luc

    2014-01-01

    Recent improvements in the power conversion efficiencies of organic solar cells have been derived through a combination of new materials, processing, and device designs. A key factor has also been quantum-chemical studies that have led to a better understanding not only of the intrinsic electronic and optical properties of the materials but also of the physical processes that take place during the photovoltaic effect. In this chapter we review some recent quantum-chemical investigations of donor-acceptor copolymers, systems that have found wide use as the primary absorbing and hole-transport materials in bulk-heterojunction solar cells. We underline a number of current limitations with regard to available electronic structure methods and in terms of the understanding of the processes involved in solar cell operation. We conclude with a brief outlook that discusses the need to develop multiscale simulation methods that combine quantum-chemical techniques with large-scale classically-based simulations to provide a more complete picture.

  18. Charge Photogeneration Experiments and Theory in Aggregated Squaraine Donor Materials for Improved Organic Solar Cell Efficiencies

    NASA Astrophysics Data System (ADS)

    Spencer, Susan Demetra

    Fossil fuel consumption has a deleterious effect on humans, the economy, and the environment. Renewable energy technologies must be identified and commercialized as quickly as possible so that the transition to renewables can happen at a minimum of financial and societal cost. Organic photovoltaic cells offer an inexpensive and disruptive energy technology, if the scientific challenges of understanding charge photogeneration in a bulk heterojunction material can be overcome. At RIT, there is a strong focus on creating new materials that can both offer fundamentally important scientific results relating to quantum photophysics, and simultaneously assist in the development of strong candidates for future commercialized technology. In this presentation, the results of intensive materials characterization of a series of squaraine small molecule donors will be presented, as well as a full study of the fabrication and optimization required to achieve >4% photovoltaic cell efficiency. A relationship between the molecular structure of the squaraine and its ability to form nanoscale aggregates will be explored. Squaraine aggregation will be described as a unique optoelectronic probe of the structure of the bulk heterojunction. This relationship will then be utilized to explain changes in crystallinity that impact the overall performance of the devices. Finally, a predictive summary will be given for the future of donor material research at RIT.

  19. Reliability model generator

    NASA Technical Reports Server (NTRS)

    McMann, Catherine M. (Inventor); Cohen, Gerald C. (Inventor)

    1991-01-01

    An improved method and system for automatically generating reliability models for use with a reliability evaluation tool is described. The reliability model generator of the present invention includes means for storing a plurality of low level reliability models which represent the reliability characteristics for low level system components. In addition, the present invention includes means for defining the interconnection of the low level reliability models via a system architecture description. In accordance with the principles of the present invention, a reliability model for the entire system is automatically generated by aggregating the low level reliability models based on the system architecture description.
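
    A minimal sketch of the aggregation idea, assuming a toy architecture description of nested series/parallel blocks; the data structures and component names are invented for illustration and are not the patented generator's actual format.

    ```python
    from math import prod

    # Hypothetical low-level reliability models: survival probability per component.
    component = {"cpu": 0.999, "bus": 0.9995, "sensor": 0.98}

    # Hypothetical architecture description: nested series/parallel blocks.
    architecture = ("series",
                    "cpu",
                    ("parallel", "sensor", "sensor"),  # redundant sensors
                    "bus")

    def system_reliability(node):
        """Aggregate low-level models bottom-up over the architecture tree."""
        if isinstance(node, str):
            return component[node]
        kind, *children = node
        r = [system_reliability(c) for c in children]
        if kind == "series":
            return prod(r)                     # series: all children must survive
        return 1 - prod(1 - x for x in r)      # parallel: fails only if all fail

    print(system_reliability(architecture))    # ~0.998
    ```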

  20. Design of an organic zeolite toward the selective adsorption of small molecules at the dispersion corrected density functional theory level.

    PubMed

    Li, Wenliang; Gahungu, Godefroid; Zhang, Jingping; Hao, Lizhu

    2009-12-31

    Tris(o-phenylenedioxy)cyclotriphosphazene (TPP) became the compound of choice to investigate the structural features of organic zeolites and their potential applications as soft materials. A van der Waals crystal of the TPP analogue (host) with the thiophene side fragment, tris(3,4-thiophenedioxy)cyclotriphosphazene (TTP), was designed to investigate the selective adsorption among some common gases (guests): methane (CH4), carbon dioxide (CO2), nitrogen (N2), and hydrogen (H2). The crystal structure of TTP was modeled by applying minimization methods using the COMPASS (condensed-phase optimized molecular potentials for atomic simulation studies) force field. Interaction energies and structural properties of van der Waals complexes of the TTP crystal and gas molecules were studied using dispersion-corrected density functional theory (DFT-D). The proper functional and basis set were selected after comparing with benchmark data from coupled-cluster calculations with singles, doubles, and perturbative triple excitations [CCSD(T)] estimated at the complete basis set (CBS) limit. On the basis of our results, the interaction energy between the host and the guest molecules was predicted to increase in the order host-H2 < host-N2 < host-CH4 < host-CO2, suggesting the designed TTP is a good candidate as an organic zeolite for potential fuel storage, hydrogen purification, and carbon dioxide removal from the air, as well as for safety applications in coal mines. PMID:19968318

  1. Well-organized raspberry-like Ag@Cu bimetal nanoparticles for highly reliable and reproducible surface-enhanced Raman scattering

    NASA Astrophysics Data System (ADS)

    Lee, Jung-Pil; Chen, Dongchang; Li, Xiaxi; Yoo, Seungmin; Bottomley, Lawrence A.; El-Sayed, Mostafa A.; Park, Soojin; Liu, Meilin

    2013-11-01

    Surface-enhanced Raman scattering (SERS) is ideally suited for probing and mapping surface species and incipient phases on fuel cell electrodes because of its high sensitivity and surface-selectivity, potentially offering insights into the mechanisms of chemical and energy transformation processes. In particular, bimetal nanostructures of coinage metals (Au, Ag, and Cu) have attracted much attention as SERS-active agents due to their distinctive electromagnetic field enhancements originated from surface plasmon resonance. Here we report excellent SERS-active, raspberry-like nanostructures composed of a silver (Ag) nanoparticle core decorated with smaller copper (Cu) nanoparticles, which displayed enhanced and broadened UV-Vis absorption spectra. These unique Ag@Cu raspberry nanostructures enable us to use blue, green, and red light as the excitation laser source for surface-enhanced Raman spectroscopy (SERS) with a large enhancement factor (EF). A highly reliable SERS effect was demonstrated using Rhodamine 6G (R6G) molecules and a thin film of gadolinium doped ceria.

  2. Reliability of Scores on the Summative Performance Assessments

    ERIC Educational Resources Information Center

    Yang, Yanyun; Oosterhof, Albert; Xia, Yan

    2015-01-01

    The authors address the reliability of scores obtained on the summative performance assessments during the pilot year of our research. In contrast to classical test theory, we discuss the advantages of using generalizability theory for estimating the reliability of scores for summative performance assessments. Generalizability theory was used as the…
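
    A minimal sketch of the kind of estimate generalizability theory yields for a fully crossed persons-by-tasks design; the score matrix and design are hypothetical.

    ```python
    import numpy as np

    def g_coefficient(scores):
        """Generalizability coefficient for a persons x tasks crossed design:
        E(rho^2) = var_p / (var_p + var_residual / n_tasks)."""
        n_p, n_t = scores.shape
        grand = scores.mean()
        ss_p = n_t * ((scores.mean(axis=1) - grand) ** 2).sum()
        ss_t = n_p * ((scores.mean(axis=0) - grand) ** 2).sum()
        ss_tot = ((scores - grand) ** 2).sum()
        ms_p = ss_p / (n_p - 1)
        ms_res = (ss_tot - ss_p - ss_t) / ((n_p - 1) * (n_t - 1))
        var_p = max((ms_p - ms_res) / n_t, 0.0)   # person variance component
        return var_p / (var_p + ms_res / n_t)

    # Hypothetical ratings: 5 students x 3 performance tasks.
    scores = np.array([[4, 5, 4], [2, 3, 2], [5, 5, 4], [3, 3, 3], [1, 2, 2]], float)
    print(round(g_coefficient(scores), 3))
    ```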

  3. Kleiber's Law: How the Fire of Life ignited debate, fueled theory, and neglected plants as model organisms

    PubMed Central

    Niklas, Karl J; Kutschera, Ulrich

    2015-01-01

    Size is a key feature of any organism since it influences the rate at which resources are consumed and thus affects metabolic rates. In the 1930s, size-dependent relationships were codified as “allometry” and it was shown that most of these could be quantified using the slopes of log-log plots of any 2 variables of interest. During the decades that followed, physiologists explored how animal respiration rates varied as a function of body size across taxa. The expectation was that rates would scale as the 2/3 power of body size as a reflection of the Euclidean relationship between surface area and volume. However, the work of Max Kleiber (1893–1976) and others revealed that animal respiration rates apparently scale more closely as the 3/4 power of body size. This phenomenology, which is called “Kleiber's Law,” has been described for a broad range of organisms, including some algae and plants. It has also been severely criticized on theoretical and empirical grounds. Here, we review the history of the analysis of metabolism, which originated with the works of Antoine L. Lavoisier (1743–1794) and Julius Sachs (1832–1897), and culminated in Kleiber's book The Fire of Life (1961; 2. ed. 1975). We then evaluate some of the criticisms that have been leveled against Kleiber's Law and some examples of the theories that have tried to explain it. We revive the speculation that intracellular exo- and endocytotic processes are resource delivery-systems, analogous to the supercellular systems in multicellular organisms. Finally, we present data that cast doubt on the existence of a single scaling relationship between growth and body size in plants. PMID:26156204
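
    The log-log codification mentioned above reduces to a one-line regression; the masses and rates below are synthetic, generated around a 3/4-power law purely to demonstrate the method.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    mass = np.array([0.02, 0.3, 4.0, 70.0, 4000.0])             # body mass (kg)
    rate = 3.4 * mass ** 0.75 * np.exp(rng.normal(0, 0.05, 5))  # metabolic rate (W)

    # Allometric exponent = slope of the log-log regression.
    slope, intercept = np.polyfit(np.log10(mass), np.log10(rate), 1)
    print(f"fitted exponent: {slope:.2f} (surface law 2/3 = 0.67, Kleiber 3/4 = 0.75)")
    ```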

  6. Well-organized raspberry-like Ag@Cu bimetal nanoparticles for highly reliable and reproducible surface-enhanced Raman scattering.

    PubMed

    Lee, Jung-Pil; Chen, Dongchang; Li, Xiaxi; Yoo, Seungmin; Bottomley, Lawrence A; El-Sayed, Mostafa A; Park, Soojin; Liu, Meilin

    2013-12-01

    Surface-enhanced Raman scattering (SERS) is ideally suited for probing and mapping surface species and incipient phases on fuel cell electrodes because of its high sensitivity and surface-selectivity, potentially offering insights into the mechanisms of chemical and energy transformation processes. In particular, bimetal nanostructures of coinage metals (Au, Ag, and Cu) have attracted much attention as SERS-active agents due to their distinctive electromagnetic field enhancements originated from surface plasmon resonance. Here we report excellent SERS-active, raspberry-like nanostructures composed of a silver (Ag) nanoparticle core decorated with smaller copper (Cu) nanoparticles, which displayed enhanced and broadened UV-Vis absorption spectra. These unique Ag@Cu raspberry nanostructures enable us to use blue, green, and red light as the excitation laser source for surface-enhanced Raman spectroscopy (SERS) with a large enhancement factor (EF). A highly reliable SERS effect was demonstrated using Rhodamine 6G (R6G) molecules and a thin film of gadolinium doped ceria.

  7. Reliability beyond Theory and into Practice

    ERIC Educational Resources Information Center

    Sijtsma, Klaas

    2009-01-01

    The critical reactions of Bentler (2009, doi: 10.1007/s11336-008-9100-1), Green and Yang (2009a, doi: 10.1007/s11336-008-9098-4 ; 2009b, doi: 10.1007/s11336-008-9099-3), and Revelle and Zinbarg (2009, doi: 10.1007/s11336-008-9102-z) to Sijtsma's (2009, doi: 10.1007/s11336-008-9101-0) paper on Cronbach's alpha are addressed. The dissemination of…

  8. Organics.

    ERIC Educational Resources Information Center

    Chian, Edward S. K.; DeWalle, Foppe B.

    1978-01-01

    Presents water analysis literature for 1978. This review is concerned with organics, and it covers: (1) detergents and surfactants; (2) aliphatic and aromatic hydrocarbons; (3) pesticides and chlorinated hydrocarbons; and (4) naturally occurring organics. A list of 208 references is also presented. (HM)

  9. Packaging Theory.

    ERIC Educational Resources Information Center

    Williams, Jeffrey

    1994-01-01

    Considers the recent flood of anthologies of literary criticism and theory as exemplifications of the confluence of pedagogical concerns, economics of publishing, and other historical factors. Looks specifically at how these anthologies present theory. Cites problems with their formatting of theory and proposes alternative ways of organizing theory…

  10. Solvent-free MALDI-MS: developmental improvements in the reliability and the potential of MALDI in the analysis of synthetic polymers and giant organic molecules.

    PubMed

    Trimpin, S; Keune, S; Räder, H J; Müllen, K

    2006-05-01

    A dry sample preparation strategy was previously established as a new method for matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS), so-called solvent-free MALDI-MS. In this contribution, we examine systems that have been shown to be problematic with conventional solvent-based MALDI approaches. Problems frequently encountered are solubility, miscibility, and segregation effects during crystallization as a result of unfavorable analyte and matrix polarities. In all cases studied, solvent-free MALDI-MS simplified the measurement and improved the analysis. Solvent-free MALDI-MS enables more reliable results in well-known problematic systems such as polydimethylsiloxane with its segregation effects. However, even in highly compatible analyte/matrix systems such as polystyrene and dithranol, there were undesirable suppression effects when employing THF as the solvent. Generally, the solvent-free method allows for more homogeneous analyte/matrix mixtures as well as higher shot-to-shot and sample-to-sample reproducibility. As a result, less laser power has to be applied, which yields milder MALDI conditions and reduced background signals, and provides better resolution of the analyte signals. Solvent-free MALDI-MS proved valuable for the characterization of nanosized materials, e.g., fullerene-based structures, which appeared to have an increased susceptibility to fragmentation. New analyte/matrix combinations (e.g., polyvinylpyrrolidone/dithranol) are accessible independent of solubility and compatibility in common solvents. An improved potential for quantitation is recognized (e.g., an insoluble polycyclic aromatic hydrocarbon against a soluble dendrite precursor). The rapid and easy measurement of industrial products demonstrates that the solvent-free method is capable of improved-throughput analysis of a variety of compounds (e.g., poly(butylmethacrylate) diol) in routine industrial analysis. Hence, this new MALDI method leads to qualitative and quantitative improvements, making

  11. Reliability Generalization: "Lapsus Linguae"

    ERIC Educational Resources Information Center

    Smith, Julie M.

    2011-01-01

    This study examines the proposed Reliability Generalization (RG) method for studying reliability. RG employs the application of meta-analytic techniques similar to those used in validity generalization studies to examine reliability coefficients. This study explains why RG does not provide a proper research method for the study of reliability,…

  12. Business of reliability

    NASA Astrophysics Data System (ADS)

    Engel, Pierre

    1999-12-01

    The presentation is organized around three themes: (1) The decrease in reception equipment costs allows non-remote-sensing organizations to access a technology until recently reserved for a scientific elite. What this means is the rise of 'operational' executive agencies that consider space-based technology and operations a viable input to their daily tasks. This is possible thanks to totally dedicated ground receiving entities focusing on one application for themselves, rather than serving a vast community of users. (2) The multiplication of earth observation platforms will form the base for reliable technical and financial solutions. One obstacle to the growth of the earth observation industry is the variety of policies (commercial versus non-commercial) governing the distribution of data and value-added products. In particular, the high volume of data sales required for a return on investment conflicts with the traditionally low-volume use of data in most applications. Constant access to data sources presupposes monitoring needs as well as technical proficiency. (3) Large-volume use of data coupled with low equipment costs is only possible when the technology has proven reliable, in terms of application results, financial risks, and data supply. Each of these factors is reviewed. The expectation is that international cooperation between agencies and private ventures will pave the way for future business models. As an illustration, the presentation proposes to use some recent non-traditional monitoring applications that may lead to significant use of earth observation data, value-added products, and services: flood monitoring, ship detection, marine oil pollution deterrent systems, and rice acreage monitoring.

  13. Can There Be Reliability without "Reliability?"

    ERIC Educational Resources Information Center

    Mislevy, Robert J.

    2004-01-01

    An "Educational Researcher" article by Pamela Moss (1994) asks the title question, "Can there be validity without reliability?" Yes, she answers, if by reliability one means "consistency among independent observations intended as interchangeable" (Moss, 1994, p. 7), quantified by internal consistency indices such as KR-20 coefficients and…

  14. HELIOS Critical Design Review: Reliability

    NASA Technical Reports Server (NTRS)

    Benoehr, H. C.; Herholz, J.; Prem, H.; Mann, D.; Reichert, L.; Rupp, W.; Campbell, D.; Boettger, H.; Zerwes, G.; Kurvin, C.

    1972-01-01

    This paper presents the Helios Critical Design Review on reliability from October 16-20, 1972. The topics include: 1) Reliability Requirement; 2) Reliability Apportionment; 3) Failure Rates; 4) Reliability Assessment; 5) Reliability Block Diagram; and 6) Reliability Information Sheet.

  15. Driving Method for Compensating Reliability Problem of Hydrogenated Amorphous Silicon Thin Film Transistors and Image Sticking Phenomenon in Active Matrix Organic Light-Emitting Diode Displays

    NASA Astrophysics Data System (ADS)

    Shin, Min-Seok; Jo, Yun-Rae; Kwon, Oh-Kyong

    2011-03-01

    In this paper, we propose a driving method for compensating the electrical instability of hydrogenated amorphous silicon (a-Si:H) thin film transistors (TFTs) and the luminance degradation of organic light-emitting diode (OLED) devices for large active matrix OLED (AMOLED) displays. The proposed driving method senses the electrical characteristics of a-Si:H TFTs and OLEDs using current integrators and compensates for them by an external compensation method. The threshold voltage shift is controlled using a negative bias voltage. After applying the proposed driving method, the measured error of the maximum emission current ranges from -1.23 to +1.59 least significant bits (LSB) of a 10-bit gray scale under a threshold voltage shift ranging from -0.16 to 0.17 V.
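
    A minimal sketch of the external-compensation idea, assuming a toy quadratic pixel model and a plain proportional feedback loop; the gain, currents, and threshold shift are invented, and the paper's current-integrator sensing is abstracted into a callable.

    ```python
    def compensate(v_start, sense_current, i_target, gain=5e5, steps=20):
        """Trim the data voltage until the sensed pixel current hits the target."""
        v = v_start
        for _ in range(steps):
            v += gain * (i_target - sense_current(v))
        return v

    # Toy pixel: drive current grows quadratically above a shifted threshold.
    v_th = 1.2  # includes an assumed +0.2 V stress-induced threshold shift
    pixel = lambda v: 0.5e-6 * max(v - v_th, 0.0) ** 2   # amps
    print(compensate(3.0, pixel, i_target=1.0e-6))        # settles near 2.61 V
    ```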

  16. Crystallization force--a density functional theory concept for revealing intermolecular interactions and molecular packing in organic crystals.

    PubMed

    Li, Tonglei; Ayers, Paul W; Liu, Shubin; Swadley, Matthew J; Aubrey-Medendorp, Clare

    2009-01-01

    Organic molecules are prone to polymorphic formation in the solid state due to the rich diversity of functional groups that results in comparable intermolecular interactions, which can be greatly affected by the selection of solvent and other crystallization conditions. Intermolecular interactions are typically weak forces, such as van der Waals and stronger short-range ones including hydrogen bonding, that are believed to determine the packing of organic molecules during the crystal-growth process. A different packing of the same molecules leads to the formation of a new crystal structure. To disclose the underlying causes that drive the molecule to have various packing motifs in the solid state, an electronic concept or function within the framework of conceptual density functional theory has been developed, namely, crystallization force. The concept aims to describe the local change in electronic structure as a result of the self-assembly process of crystallization and may quantify the locality of intermolecular interactions that directs the molecular packing in a crystal. To assess the applicability of the concept, 5-methyl-2-[(2-nitrophenyl)amino]-3-thiophenecarbonitrile, so-called ROY, which is known to have the largest number of solved polymorphs, has been examined. Electronic calculations were conducted on the seven available crystal structures as well as on the single molecule. The electronic structures were analyzed and crystallization force values were obtained. The results indicate that the crystallization forces are able to reveal intermolecular interactions in the crystals, in particular, the close contacts that are formed between molecules. Strong correlations exist between the total crystallization force and lattice energy of a crystal structure, further suggesting the underlying connection between the crystallization force and molecular packing.

  17. Phosphorescence lifetimes of organic light-emitting diodes from two-component time-dependent density functional theory

    SciTech Connect

    Kühn, Michael; Weigend, Florian

    2014-12-14

    “Spin-forbidden” transitions are calculated for an eight-membered set of iridium-containing candidate molecules for organic light-emitting diodes (OLEDs) using two-component time-dependent density functional theory. Phosphorescence lifetimes (obtained from averaging over relevant excitations) are compared to experimental data. Assessment of parameters like non-distorted and distorted geometric structures, density functionals, relativistic Hamiltonians, and basis sets was done by a thorough study for Ir(ppy)3 focussing not only on averaged phosphorescence lifetimes, but also on the agreement of the triplet substate structure with experimental data. The most favorable methods were applied to an eight-membered test set of OLED candidate molecules; Boltzmann-averaged phosphorescence lifetimes were investigated concerning the convergence with the number of excited states and the changes when including solvent effects. Finally, a simple model for sorting out molecules with long averaged phosphorescence lifetimes is developed by visual inspection of computationally easily achievable one-component frontier orbitals.
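
    For readers unfamiliar with the averaging step mentioned above, the sketch below shows the standard way a Boltzmann-averaged phosphorescence lifetime is formed from individual substate energies and lifetimes; the three substates listed are invented placeholders, not the computed Ir(ppy)3 values.

      # Boltzmann-averaged lifetime over triplet substates:
      # tau_av = sum(w_i) / sum(w_i / tau_i), with w_i = exp(-E_i / kT).
      import math

      KB_CM = 0.695035  # Boltzmann constant in cm^-1 per kelvin

      def boltzmann_avg_lifetime(substates, temp=300.0):
          """substates: (energy above lowest substate in cm^-1, lifetime in us)."""
          weights = [math.exp(-e / (KB_CM * temp)) for e, _ in substates]
          rates = [w / tau for w, (_, tau) in zip(weights, substates)]
          return sum(weights) / sum(rates)

      substates = [(0.0, 100.0), (15.0, 10.0), (100.0, 1.0)]  # hypothetical
      print(f"tau_av ~ {boltzmann_avg_lifetime(substates):.1f} us")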

  18. Self-organized criticality as Witten-type topological field theory with spontaneously broken Becchi-Rouet-Stora-Tyutin symmetry

    SciTech Connect

    Ovchinnikov, Igor V.

    2011-05-15

    Here, a scenario is proposed, according to which a generic self-organized critical (SOC) system can be looked upon as a Witten-type topological field theory (W-TFT) with spontaneously broken Becchi-Rouet-Stora-Tyutin (BRST) symmetry. One of the conditions for the SOC is the slow driving noise, which unambiguously suggests the Stratonovich interpretation of the corresponding stochastic differential equation (SDE). This, in turn, necessitates the use of the Parisi-Sourlas-Wu stochastic quantization procedure, which straightforwardly leads to a model with BRST-exact action, i.e., to a W-TFT. In the parameter space of the SDE, there must exist full-dimensional regions where the BRST symmetry is spontaneously broken by instantons, which in the context of SOC are essentially avalanches. In these regions, the avalanche-type SOC dynamics is liberated from the otherwise dynamics-less W-TFT, and a Goldstone mode of Faddeev-Popov ghosts exists. Goldstinos represent moduli of instantons (avalanches) and, being gapless, are responsible for the critical avalanche distribution in the low-energy, long-wavelength limit. The above arguments are robust against moderate variations of the SDE's parameters, and the criticality is 'self-tuned'. The proposition of this paper suggests that the machinery of W-TFTs may find applications in many different areas of modern science studying various physical realizations of SOC. It also suggests that there may in principle exist a connection between some SOCs and the concept of topological quantum computing.

  19. Analysis of algal bloom risk with uncertainties in lakes by integrating self-organizing map and fuzzy information theory.

    PubMed

    Chen, Qiuwen; Rui, Han; Li, Weifeng; Zhang, Yanhui

    2014-06-01

    Algal blooms are a serious problem in waters, damaging aquatic ecosystems and threatening drinking water safety. However, the outbreak mechanism of algal blooms is very complex, with great uncertainty, especially for large water bodies where environmental conditions vary markedly in both space and time. This study developed an innovative method which integrated a self-organizing map (SOM) and fuzzy information diffusion theory to comprehensively analyze algal bloom risks and their uncertainties. Lake Taihu was taken as the study case, using long-term (2004-2010) on-site monitoring data. The results showed that algal blooms in Lake Taihu fell into four categories and exhibited obvious spatial-temporal patterns. The lake was mainly characterized by moderate blooms with high uncertainty, whereas severe blooms with low uncertainty were observed in the northwest part of the lake. The study gives insight into the spatial-temporal dynamics of algal blooms, and should help government and decision-makers outline policies and practices on bloom monitoring and prevention. The developed method provides a promising approach to estimating algal bloom risks under uncertainty.

  20. Luminescent properties of metal-organic framework MOF-5: relativistic time-dependent density functional theory investigations.

    PubMed

    Ji, Min; Lan, Xin; Han, Zhenping; Hao, Ce; Qiu, Jieshan

    2012-11-19

    The electronically excited state and luminescence property of metal-organic framework MOF-5 were investigated using relativistic density functional theory (DFT) and time-dependent DFT (TDDFT). The geometry, IR spectra, and UV-vis spectra of MOF-5 in the ground state were calculated using relativistic DFT, leading to good agreement between the experimental and theoretical results. The frontier molecular orbitals and electronic configuration indicated that the luminescence mechanism in MOF-5 follows ligand-to-ligand charge transfer (LLCT), namely, π* → π, rather than emission with the ZnO quantum dot (QD) proposed by Bordiga et al. The geometry and IR spectra of MOF-5 in the electronically excited state have been calculated using the relativistic TDDFT and compared with those for the ground state. The comparison reveals that the Zn4O13 QD is rigid, whereas the ligands BDC(2-) are nonrigid. In addition, the calculated emission band of MOF-5 is in good agreement with the experimental result and is similar to that of the ligand H2BDC. The combined results confirmed that the luminescence mechanism for MOF-5 should be LLCT with little mixing of the ligand-to-metal charge transfer. The reason for the MOF-5 luminescence is explained by the excellent coplanarity between the six-membered ring consisting of zinc, oxygen, carbon, and the benzene ring. PMID:23136957

  1. The effects of instructors' autonomy support and students' autonomous motivation on learning organic chemistry: A self-determination theory perspective

    NASA Astrophysics Data System (ADS)

    Black, Aaron E.; Deci, Edward L.

    2000-11-01

    This prospective study applied self-determination theory to investigate the effects of students' course-specific self-regulation and their perceptions of their instructors' autonomy support on adjustment and academic performance in a college-level organic chemistry course. The study revealed that: (1) students' reports of entering the course for relatively autonomous (vs. controlled) reasons predicted higher perceived competence and interest/enjoyment and lower anxiety and grade-focused performance goals during the course, and were related to whether or not the students dropped the course; and (2) students' perceptions of their instructors' autonomy support predicted increases in autonomous self-regulation, perceived competence, and interest/enjoyment, and decreases in anxiety over the semester. The change in autonomous self-regulation in turn predicted students' performance in the course. Further, instructor autonomy support also predicted course performance directly, although differences in the initial level of students' autonomous self-regulation moderated that effect, with autonomy support relating strongly to academic performance for students initially low in autonomous self-regulation but not for students initially high in autonomous self-regulation.

  2. Human Reliability Program Overview

    SciTech Connect

    Bodin, Michael

    2012-09-25

    This presentation covers the high points of the Human Reliability Program, including certification/decertification, critical positions, due process, organizational structure, program components, personnel security, an overview of the US DOE reliability program, retirees and academia, and security program integration.

  3. Power electronics reliability analysis.

    SciTech Connect

    Smith, Mark A.; Atcitty, Stanley

    2009-12-01

    This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.
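
    The fault-tree approach described above reduces to simple probability algebra once component failure probabilities are in hand. A minimal sketch, assuming independent failures and an invented converter topology (the probabilities are not from the report):

      # Toy fault-tree evaluation with independent basic events.

      def p_and(*ps):
          """AND gate: the gate fails only if every input fails."""
          out = 1.0
          for p in ps:
              out *= p
          return out

      def p_or(*ps):
          """OR gate: the gate fails if any input fails."""
          ok = 1.0
          for p in ps:
              ok *= (1.0 - p)
          return 1.0 - ok

      # Hypothetical converter: fails if the switch or its gate driver
      # fails, or if both redundant cooling fans fail.
      p_switch, p_driver, p_fan = 0.01, 0.005, 0.05
      p_system = p_or(p_or(p_switch, p_driver), p_and(p_fan, p_fan))
      print(f"system failure probability ~ {p_system:.4f}")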

  4. Reliable Design Versus Trust

    NASA Technical Reports Server (NTRS)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    This presentation focuses on reliability and trust for the user's portion of the FPGA design flow. It is assumed that the manufacturer tests the FPGA's internal components prior to hand-off to the user. The objective is to present the challenges of creating reliable and trusted designs. The following will be addressed: What makes a design vulnerable to functional flaws (reliability) or attackers (trust)? What are the challenges for verifying a reliable design versus a trusted design?

  5. Reliability in aposematic signaling

    PubMed Central

    2010-01-01

    In light of recent work, we will expand on the role and variability of aposematic signals. The focus of this review will be the concepts of reliability and honesty in aposematic signaling. We claim that reliable signaling can solve the problem of aposematic evolution, and that variability in reliability can shed light on the complexity of aposematic systems. PMID:20539774

  6. Viking Lander reliability program

    NASA Technical Reports Server (NTRS)

    Pilny, M. J.

    1978-01-01

    The Viking Lander reliability program is reviewed with attention given to the development of the reliability program requirements, reliability program management, documents evaluation, failure modes evaluation, production variation control, failure reporting and correction, and the parts program. Lander hardware failures which have occurred during the mission are listed.

  7. Reliability as Argument

    ERIC Educational Resources Information Center

    Parkes, Jay

    2007-01-01

    Reliability consists of both important social and scientific values and methods for evidencing those values, though in practice methods are often conflated with the values. With the two distinctly understood, a reliability argument can be made that articulates the particular reliability values most relevant to the particular measurement situation…

  8. Reliability model generator specification

    NASA Technical Reports Server (NTRS)

    Cohen, Gerald C.; Mccann, Catherine

    1990-01-01

    The Reliability Model Generator (RMG), a program which produces reliability models from block diagrams for ASSIST, the interface for the reliability evaluation tool SURE, is described. An account is given of the motivation for RMG, and the implemented algorithms are discussed. The appendices contain the algorithms and two detailed traces of examples.
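
    The reduction a tool like RMG performs on a series/parallel block diagram can be sketched in a few lines; the nested-tuple encoding below is an illustrative assumption, not RMG's actual input or output format.

      # Recursive evaluation of a series/parallel reliability block diagram.

      def rbd(block):
          """Evaluate ('series', ...) / ('parallel', ...) trees of reliabilities."""
          if isinstance(block, float):
              return block                      # leaf: component reliability
          kind, *children = block
          rs = [rbd(c) for c in children]
          if kind == 'series':                  # every block must work
              out = 1.0
              for r in rs:
                  out *= r
              return out
          if kind == 'parallel':                # at least one block must work
              q = 1.0
              for r in rs:
                  q *= (1.0 - r)
              return 1.0 - q
          raise ValueError(f"unknown block type: {kind}")

      # A sensor in series with two redundant processors:
      print(rbd(('series', 0.99, ('parallel', 0.95, 0.95))))   # ~0.9875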

  9. Cognitive decision errors and organization vulnerabilities in nuclear power plant safety management: Modeling using the TOGA meta-theory framework

    SciTech Connect

    Cappelli, M.; Gadomski, A. M.; Sepiellis, M.; Wronikowska, M. W.

    2012-07-01

    In the field of nuclear power plant (NPP) safety modeling, the perceived role of socio-cognitive engineering (SCE) is continuously increasing. Today, the focus is especially on the identification of human and organizational decisional errors caused by operators and managers under high-risk conditions, as is evident from analyses of reports on past nuclear incidents. At present, the engineering and social safety requirements need to enlarge their domain of interest so as to include all possible loss-generating events that could be the consequences of an abnormal state of an NPP. Socio-cognitive modeling of Integrated Nuclear Safety Management (INSM) using the TOGA meta-theory was discussed during the ICCAP 2011 Conference. In this paper, more detailed aspects of cognitive decision-making and its possible human errors and organizational vulnerability are presented. The formal TOGA-based network model for cognitive decision-making makes it possible to indicate and analyze the nodes and arcs in which plant operators' and managers' errors may appear. TOGA's multi-level IPK (Information, Preferences, Knowledge) model of abstract intelligent agents (AIAs) is applied. In the NPP context, a super-safety approach is also discussed, taking into consideration unexpected events and managing them from a systemic perspective. As the nature of human errors depends on the specific properties of the decision-maker and the decisional context of operation, a classification of decision-making using IPK is suggested. Several types of initial decision-making situations useful for the diagnosis of NPP operator and manager errors are considered. The developed models can be used as a basis for NPP educational or engineering simulators for training the NPP executive staff. (authors)

  10. Measurement Issues in High Stakes Testing: Validity and Reliability

    ERIC Educational Resources Information Center

    Mason, Emanuel J.

    2007-01-01

    Validity and reliability of the new high stakes testing systems initiated in school systems across the United States in recent years in response to the accountability features mandated in the No Child Left Behind Legislation largely depend on item response theory and new rules of measurement. Reliability and validity in item response theory and…

  11. Applicability of the Multiple Intelligence Theory to the Process of Organizing and Planning of Learning and Teaching

    ERIC Educational Resources Information Center

    Acat, M. Bahaddin

    2005-01-01

    It has long been under discussion how the teaching and learning environment should be arranged, how individuals achieve learning, and how teachers can effectively contribute to this process. Accordingly, a considerable number of theories and models have been proposed. Gardner (1983) caused a remarkable shift in the perception of learning theory as…

  12. Stoking the Dialogue on the Domains of Transformative Learning Theory: Insights From Research With Faith-Based Organizations in Kenya

    ERIC Educational Resources Information Center

    Moyer, Joanne M.; Sinclair, A. John

    2016-01-01

    Transformative learning theory is applied in a variety of fields, including archaeology, religious studies, health care, the physical sciences, environmental studies, and natural resource management. Given the breadth of the theory's application, it needs to be adaptable to broad contexts. This article shares insights gained from applying the…

  13. International Lead Zinc Research Organization-sponsored field-data collection and analysis to determine relationships between service conditions and reliability of valve-regulated lead-acid batteries in stationary applications

    NASA Astrophysics Data System (ADS)

    Taylor, P. A.; Moseley, P. T.; Butler, P. C.

    The International Lead Zinc Research Organization (ILZRO), in cooperation with Sandia National Laboratories, has initiated a multi-phase project with the following aims: to characterize relationships between valve-regulated lead-acid (VRLA) batteries, service conditions, and failure modes; to establish the degree of correlation between specific operating procedures and premature capacity loss (PCL); to identify operating procedures that mitigate PCL; to identify the best fits between the operating requirements of specific applications and the capabilities of specific VRLA technologies; and to recommend combinations of battery design, manufacturing processes, and operating conditions that enhance VRLA performance and reliability. In the first phase of this project, ILZRO has contracted with Energetics to identify and survey manufacturers and users of VRLA batteries for stationary applications (including electric utilities, telecommunications companies, and government facilities). The confidential survey is collecting the service conditions of specific applications and performance records for specific VRLA technologies. From the data collected, Energetics is constructing a database of the service histories and analyzing the data to determine trends in performance for particular technologies in specific service conditions. ILZRO plans to make the final report of the analysis and a version of the database (containing no proprietary information) available to ILZRO members, participants in the survey, and participants in a follow-on workshop for stakeholders in VRLA reliability. This paper presents the surveys distributed to manufacturers and end-users, discusses the analytic approach, presents an overview of the responses to the surveys and trends that have emerged in the early analysis of the data, and previews the functionality of the database being constructed.

  14. Environmental control of sepalness and petalness in perianth organs of waterlilies: a new Mosaic theory for the evolutionary origin of a differentiated perianth.

    PubMed

    Warner, Kate A; Rudall, Paula J; Frohlich, Michael W

    2009-01-01

    The conventional concept of an 'undifferentiated perianth', implying that all perianth organs of a flower are alike, obscures the fact that individual perianth organs are sometimes differentiated into sepaloid and petaloid regions, as in the early-divergent angiosperms Nuphar, Nymphaea, and Schisandra. In the waterlilies Nuphar and Nymphaea, sepaloid regions closely coincide with regions of the perianth that were exposed when the flower was in bud, whereas petaloid regions occur in covered regions, suggesting that their development is at least partly controlled by the environment of the developing tepal. Green and colourful areas differ from each other in trichome density and presence of papillae, features that often distinguish sepals and petals. Field experiments to test whether artificial exposure can induce sepalness in the inner tepals showed that development of sepaloid patches is initiated by exposure, at least in the waterlily species examined. Although light is an important environmental cue, other important factors include an absence of surface contact. Our interpretation contradicts the unspoken rule that 'sepal' and 'petal' must refer to whole organs. We propose a novel theory (the Mosaic theory), in which the distinction between sepalness and petalness evolved early in angiosperm history, but these features were not fixed to particular organs and were primarily environmentally controlled. At a later stage in angiosperm evolution, sepaloid and petaloid characteristics became fixed to whole organs in specific whorls, thus reducing or removing the need for environmental control in favour of fixed developmental control. PMID:19574253

  15. Environmental control of sepalness and petalness in perianth organs of waterlilies: a new Mosaic Theory for the evolutionary origin of a differentiated perianth

    PubMed Central

    Warner, Kate A.; Rudall, Paula J.; Frohlich, Michael W.

    2009-01-01

    The conventional concept of an ‘undifferentiated perianth’, implying that all perianth organs of a flower are alike, obscures the fact that individual perianth organs are sometimes differentiated into sepaloid and petaloid regions, as in the early-divergent angiosperms Nuphar, Nymphaea, and Schisandra. In the waterlilies Nuphar and Nymphaea, sepaloid regions closely coincide with regions of the perianth that were exposed when the flower was in bud, whereas petaloid regions occur in covered regions, suggesting that their development is at least partly controlled by the environment of the developing tepal. Green and colourful areas differ from each other in trichome density and presence of papillae, features that often distinguish sepals and petals. Field experiments to test whether artificial exposure can induce sepalness in the inner tepals showed that development of sepaloid patches is initiated by exposure, at least in the waterlily species examined. Although light is an important environmental cue, other important factors include an absence of surface contact. Our interpretation contradicts the unspoken rule that ‘sepal’ and ‘petal’ must refer to whole organs. We propose a novel theory (the Mosaic theory), in which the distinction between sepalness and petalness evolved early in angiosperm history, but these features were not fixed to particular organs and were primarily environmentally controlled. At a later stage in angiosperm evolution, sepaloid and petaloid characteristics became fixed to whole organs in specific whorls, thus reducing or removing the need for environmental control in favour of fixed developmental control. PMID:19574253

  16. Signal verification can promote reliable signalling

    PubMed Central

    Broom, Mark; Ruxton, Graeme D.; Schaefer, H. Martin

    2013-01-01

    The central question in communication theory is whether communication is reliable, and if so, which mechanisms select for reliability. The primary approach in the past has been to attribute reliability to strategic costs associated with signalling as predicted by the handicap principle. Yet, reliability can arise through other mechanisms, such as signal verification; but the theoretical understanding of such mechanisms has received relatively little attention. Here, we model whether verification can lead to reliability in repeated interactions that typically characterize mutualisms. Specifically, we model whether fruit consumers that discriminate among poor- and good-quality fruits within a population can select for reliable fruit signals. In our model, plants either signal or they do not; costs associated with signalling are fixed and independent of plant quality. We find parameter combinations where discriminating fruit consumers can select for signal reliability by abandoning unprofitable plants more quickly. This self-serving behaviour imposes costs upon plants as a by-product, rendering it unprofitable for unrewarding plants to signal. Thus, strategic costs to signalling are not a prerequisite for reliable communication. We expect verification to more generally explain signal reliability in repeated consumer–resource interactions that typify mutualisms but also in antagonistic interactions such as mimicry and aposematism. PMID:24068354

  17. Signal verification can promote reliable signalling.

    PubMed

    Broom, Mark; Ruxton, Graeme D; Schaefer, H Martin

    2013-11-22

    The central question in communication theory is whether communication is reliable, and if so, which mechanisms select for reliability. The primary approach in the past has been to attribute reliability to strategic costs associated with signalling as predicted by the handicap principle. Yet, reliability can arise through other mechanisms, such as signal verification; but the theoretical understanding of such mechanisms has received relatively little attention. Here, we model whether verification can lead to reliability in repeated interactions that typically characterize mutualisms. Specifically, we model whether fruit consumers that discriminate among poor- and good-quality fruits within a population can select for reliable fruit signals. In our model, plants either signal or they do not; costs associated with signalling are fixed and independent of plant quality. We find parameter combinations where discriminating fruit consumers can select for signal reliability by abandoning unprofitable plants more quickly. This self-serving behaviour imposes costs upon plants as a by-product, rendering it unprofitable for unrewarding plants to signal. Thus, strategic costs to signalling are not a prerequisite for reliable communication. We expect verification to more generally explain signal reliability in repeated consumer-resource interactions that typify mutualisms but also in antagonistic interactions such as mimicry and aposematism.

  18. Software reliability experiments data analysis and investigation

    NASA Technical Reports Server (NTRS)

    Walker, J. Leslie; Caglayan, Alper K.

    1991-01-01

    The objectives are to investigate the fundamental reasons which cause independently developed software programs to fail dependently, and to examine fault tolerant software structures which maximize reliability gain in the presence of such dependent failure behavior. The authors used 20 redundant programs from a software reliability experiment to analyze the software errors causing coincident failures, to compare the reliability of N-version and recovery block structures composed of these programs, and to examine the impact of diversity on software reliability using subpopulations of these programs. The results indicate that both conceptually related and unrelated errors can cause coincident failures and that recovery block structures offer more reliability gain than N-version structures if acceptance checks that fail independently from the software components are available. The authors present a theory of general program checkers that have potential application for acceptance tests.

  19. Operational safety reliability research

    SciTech Connect

    Hall, R.E.; Boccio, J.L.

    1986-01-01

    Operating reactor events such as the TMI accident and the Salem automatic-trip failures raised the concern that during a plant's operating lifetime the reliability of systems could degrade from the design level that was considered in the licensing process. To address this concern, NRC is sponsoring the Operational Safety Reliability Research project. The objectives of this project are to identify the essential tasks of a reliability program and to evaluate the effectiveness and attributes of such a reliability program applicable to maintaining an acceptable level of safety during the operating lifetime of the plant.

  20. Managing Reliability in the 21st Century

    SciTech Connect

    Dellin, T.A.

    1998-11-23

    The rapid pace of change at the end of the 20th Century should continue unabated well into the 21st Century. The driver will be the marketplace imperative of "faster, better, cheaper." This imperative has already stimulated a revolution-in-engineering in design and manufacturing. In contrast, to date, reliability engineering has not undergone a similar level of change. It is critical that we implement a corresponding revolution-in-reliability-engineering as we enter the new millennium. If we are still using 20th Century reliability approaches in the 21st Century, then reliability issues will be the limiting factor in faster, better, and cheaper. At the heart of this reliability revolution will be a science-based approach to reliability engineering. Science-based reliability will enable building-in reliability, application-specific products, virtual qualification, and predictive maintenance. The purpose of this paper is to stimulate a dialogue on the future of reliability engineering. We will try to gaze into the crystal ball and predict some key issues that will drive reliability programs in the new millennium. In the 21st Century, we will demand more of our reliability programs. We will need the ability to make accurate reliability predictions that will enable optimizing cost, performance, and time-to-market to meet the needs of every market segment. We will require that all of these new capabilities be in place prior to the start of a product development cycle. The management of reliability programs will be driven by quantifiable metrics of value added to the organization's business objectives.

  1. Hawaii electric system reliability.

    SciTech Connect

    Silva Monroy, Cesar Augusto; Loose, Verne William

    2012-09-01

    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability, but resource adequacy is reviewed in reference to electric consumers' views of reliability "worth" and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with a comparison and contrast of the performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers' views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.
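
    The cost-integration method described can be illustrated with a toy calculation: carry reserve capacity up to the point where its annualized cost stops paying for the outage cost it avoids. All numbers below are hypothetical and are not drawn from the Hawaii systems.

      # Least-cost reserve: capacity cost vs. customers' outage cost.
      import math

      CAP_COST = 120_000   # $/MW-yr, annualized reserve capacity cost (assumed)
      VOLL = 15_000        # $/MWh, value of lost load, i.e. reliability "worth" (assumed)

      def expected_unserved_energy(reserve_mw):
          """Toy adequacy model: unserved energy falls off exponentially."""
          return 5_000 * math.exp(-reserve_mw / 40.0)   # MWh/yr

      def total_cost(reserve_mw):
          return CAP_COST * reserve_mw + VOLL * expected_unserved_energy(reserve_mw)

      best = min(range(0, 301, 10), key=total_cost)
      print(f"least-cost reserve ~ {best} MW, total cost ${total_cost(best):,.0f}/yr")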

  2. Bayesian methods in reliability

    NASA Astrophysics Data System (ADS)

    Sander, P.; Badoux, R.

    1991-11-01

    The present proceedings from a course on Bayesian methods in reliability encompass Bayesian statistical methods and their computational implementation, models for analyzing censored data from nonrepairable systems, the traits of repairable systems and growth models, the use of expert judgment, and a review of the problem of forecasting software reliability. Specific issues addressed include the use of Bayesian methods to estimate the leak rate of a gas pipeline, approximate analyses under great prior uncertainty, reliability estimation techniques, and a nonhomogeneous Poisson process. Also addressed are the calibration sets and seed variables of expert judgment systems for risk assessment, experimental illustrations of the use of expert judgment for reliability testing, and analyses of the predictive quality of software-reliability growth models such as the Weibull order statistics.
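
    As a concrete instance of the techniques covered, a conjugate gamma-Poisson update for a constant event rate (in the spirit of the gas-pipeline leak-rate example) takes only a few lines; the prior parameters and the data below are invented for illustration.

      # Bayesian update of a failure/leak rate with a Gamma prior.
      # Gamma(alpha, beta) prior on lambda; Poisson likelihood for counts.

      alpha, beta = 2.0, 4.0        # prior: mean alpha/beta = 0.5 events/yr
      events, exposure = 3, 10.0    # observed: 3 events in 10 unit-years

      alpha_post = alpha + events   # conjugacy: posterior is also Gamma
      beta_post = beta + exposure
      mean = alpha_post / beta_post
      std = alpha_post ** 0.5 / beta_post
      print(f"posterior rate: {mean:.3f} +/- {std:.3f} per year")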

  3. Reliability Models and Attributable Risk

    NASA Technical Reports Server (NTRS)

    Jarvinen, Richard D.

    1999-01-01

    The intention of this report is to bring a developing and extremely useful statistical methodology to greater attention within the Safety, Reliability, and Quality Assurance Office of the NASA Johnson Space Center. The statistical methods in this exposition are found under the heading of attributable risk. Recently the Safety, Reliability, and Quality Assurance Office at the Johnson Space Center has supported efforts to introduce methods of medical research statistics dealing with the survivability of people to bear on the problems of aerospace that deal with the reliability of component hardware used in the NASA space program. This report, which describes several study designs for which attributable risk is used, is in concert with the latter goals. The report identifies areas of active research in attributable risk while briefly describing much of what has been developed in the theory of attributable risk. The report, which is largely a report on a report, attempts to recast the medical setting and language commonly found in descriptions of attributable risk into the setting and language of the space program and its component hardware.

  4. Predicting Cloud Computing Technology Adoption by Organizations: An Empirical Integration of Technology Acceptance Model and Theory of Planned Behavior

    ERIC Educational Resources Information Center

    Ekufu, ThankGod K.

    2012-01-01

    Organizations are finding it difficult in today's economy to implement the vast information technology infrastructure required to effectively conduct their business operations. Despite the fact that some of these organizations are leveraging the computational power and cost-saving benefits of computing on the Internet cloud, others…

  5. Maximum phonation time: variability and reliability.

    PubMed

    Speyer, Renée; Bogaardt, Hans C A; Passos, Valéria Lima; Roodenburg, Nel P H D; Zumach, Anne; Heijnen, Mariëlle A M; Baijens, Laura W J; Fleskens, Stijn J H M; Brunings, Jan W

    2010-05-01

    The objective of the study was to determine maximum phonation time reliability as a function of the number of trials, days, and raters in dysphonic and control subjects. Two groups of adult subjects participated in this reliability study: a group of outpatients with functional or organic dysphonia versus a group of healthy control subjects matched by age and gender. Over a period of maximally 6 weeks, three video recordings were made of each subject's five maximum phonation time trials. A panel of five experts was responsible for all measurements, including a repeated measurement of the subjects' first recordings. Patients showed significantly shorter maximum phonation times compared with healthy controls (on average, 6.6 seconds shorter). The averaged intraclass correlation coefficient (ICC) over all raters per trial for the first day was 0.998. The averaged reliability coefficient per rater and per trial for repeated measurements of the first day's data was 0.997, indicating high intrarater reliability. The mean reliability coefficient per day for one trial was 0.939. When using five trials, the reliability increased to 0.987. The reliability over five trials for a single day was 0.836; for 2 days, 0.911; and for 3 days, 0.935. To conclude, the maximum phonation time has proven to be a highly reliable measure in voice assessment. A single rater is sufficient to provide highly reliable measurements.
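
    The reported gain from averaging trials matches the Spearman-Brown prophecy formula: plugging the single-trial reliability from the abstract (0.939) into it recovers the five-trial value (0.987).

      # Spearman-Brown: reliability of the mean of k parallel trials.

      def spearman_brown(r_single, k):
          return k * r_single / (1.0 + (k - 1.0) * r_single)

      print(round(spearman_brown(0.939, 5), 3))   # -> 0.987, as reported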

  6. Harm reduction theory: users' culture, micro-social indigenous harm reduction, and the self-organization and outside-organizing of users' groups.

    PubMed

    Friedman, Samuel R; de Jong, Wouter; Rossi, Diana; Touzé, Graciela; Rockwell, Russell; Des Jarlais, Don C; Elovich, Richard

    2007-03-01

    This paper discusses the user side of harm reduction, focusing to some extent on the early responses to the HIV/AIDS epidemic in each of four sets of localities-New York City, Rotterdam, Buenos Aires, and sites in Central Asia. Using available qualitative and quantitative information, we present a series of vignettes about user activities in four different localities on behalf of reducing drug-related harm. Some of these activities have been micro-social (small group) activities; others have been conducted by formal organizations of users that the users organized at their own initiative. In spite of the limitations of the methodology, the data suggest that users' activities have helped limit HIV spread. These activities are shaped by broader social contexts, such as the extent to which drug scenes are integrated with broader social networks and the way the political and economic systems impinge on drug users' lives. Drug users are active agents on their own individual and collective behalf, and in helping to protect wider communities. Harm reduction activities and research should take note of and draw upon both the micro-social and formal organizations of users. Finally, both researchers and policy makers should help develop ways to enable and support both micro-social and formally organized action by users.

  7. Learning in Complex Organizations as Practicing and Reflecting: A Model Development and Application from a Theory of Practice Perspective

    ERIC Educational Resources Information Center

    Schulz, Klaus-Peter

    2005-01-01

    Purpose: The article seeks to conceptualize learning in practice from a theories of practice view. This paradigmatic shift allows one to overcome problem areas related to traditional conceptions of learning such as the difficulty of knowledge transfer, and related to many situated learning models that neglect the aspect of reification of practice.…

  8. Workplace Support, Discrimination, and Person-Organization Fit: Tests of the Theory of Work Adjustment with LGB Individuals

    ERIC Educational Resources Information Center

    Velez, Brandon L.; Moradi, Bonnie

    2012-01-01

    The present study explored the links of 2 workplace contextual variables--perceptions of workplace heterosexist discrimination and lesbian, gay, and bisexual (LGB)-supportive climates--with job satisfaction and turnover intentions in a sample of LGB employees. An extension of the theory of work adjustment (TWA) was used as the conceptual framework…

  9. Change of Mind: How Organization Theory Led Me to Move from Studying Educational Reform to Pursuing Educational Design

    ERIC Educational Resources Information Center

    Ogawa, Rodney T.

    2015-01-01

    Purpose: The purpose of this paper is for the author to recount how his use of organizational theory to understand educational reform in the USA led to a change of mind. Design/methodology/approach: My shift resulted from my conclusion, derived from the new institutionalism, that only marginal changes can be made in schools and, thus, fundamental…

  10. Generalizability Theory and Classical Test Theory

    ERIC Educational Resources Information Center

    Brennan, Robert L.

    2011-01-01

    Broadly conceived, reliability involves quantifying the consistencies and inconsistencies in observed scores. Generalizability theory, or G theory, is particularly well suited to addressing such matters in that it enables an investigator to quantify and distinguish the sources of inconsistencies in observed scores that arise, or could arise, over…

  11. Correcting Fallacies in Validity, Reliability, and Classification

    ERIC Educational Resources Information Center

    Sijtsma, Klaas

    2009-01-01

    This article reviews three topics from test theory that continue to raise discussion and controversy and capture test theorists' and constructors' interest. The first topic concerns the discussion of the methodology of investigating and establishing construct validity; the second topic concerns reliability and its misuse, alternative definitions…

  12. Harm reduction theory: Users culture, micro-social indigenous harm reduction, and the self-organization and outside-organizing of users’ groups

    PubMed Central

    Friedman, Samuel R.; de Jong, Wouter; Rossi, Diana; Touzé, Graciela; Rockwell, Russell; Jarlais, Don C Des; Elovich, Richard

    2007-01-01

    This paper discusses the user side of harm reduction, focusing to some extent on the early responses to the HIV/AIDS epidemic in each of four sets of localities—New York City, Rotterdam, Buenos Aires, and sites in Central Asia. Using available qualitative and quantitative information, we present a series of vignettes about user activities in four different localities on behalf of reducing drug-related harm. Some of these activities have been micro-social (small group) activities; others have been conducted by formal organizations of users that the users organised at their own initiative. In spite of the limitations of the methodology, the data suggest that users’ activities have helped limit HIV spread. These activities are shaped by broader social contexts, such as the extent to which drug scenes are integrated with broader social networks and the way the political and economic systems impinge on drug users’ lives. Drug users are active agents on their own individual and collective behalf, and in helping to protect wider communities. Harm reduction activities and research should take note of and draw upon both the micro-social and formal organizations of users. Finally, both researchers and policy makers should help develop ways to enable and support both micro-social and formally organized action by users. PMID:17689353

  13. Multidisciplinary System Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics, electrical circuits, etc., without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.
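
    Although the study relies on NESSUS's fast probability integration, the quantity being estimated - the probability that a limit state is violated - can be illustrated with a plain Monte Carlo sketch. The heat-exchanger limit state and the input distributions below are invented for illustration and are not the report's example.

      # Monte Carlo estimate of a failure probability for a toy limit state.
      import random

      random.seed(1)

      def fails():
          """Failure if the outlet temperature exceeds a hypothetical limit."""
          flow = random.gauss(2.0, 0.2)     # coolant flow, kg/s (assumed)
          t_in = random.gauss(80.0, 5.0)    # inlet temperature, deg C (assumed)
          ua = random.gauss(1.5, 0.15)      # heat transfer capacity, kW/K (assumed)
          t_out = t_in - ua * 20.0 / (4.18 * flow)
          return t_out > 75.0

      n = 100_000
      pf = sum(fails() for _ in range(n)) / n
      print(f"estimated failure probability ~ {pf:.4f}")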

  14. Solvent dependence of Stokes shift for organic solute-solvent systems: A comparative study by spectroscopy and reference interaction-site model-self-consistent-field theory.

    PubMed

    Nishiyama, Katsura; Watanabe, Yasuhiro; Yoshida, Norio; Hirata, Fumio

    2013-09-01

    The Stokes shift magnitudes for coumarin 153 (C153) in 13 organic solvents with various polarities have been determined by means of steady-state spectroscopy and reference interaction-site model-self-consistent-field (RISM-SCF) theory. RISM-SCF calculations have reproduced experimental results fairly well, including individual solvent characteristics. It is empirically known that in some solvents, larger Stokes shift magnitudes are detected than anticipated on the basis of the solvent relative permittivity, ɛr. In practice, 1,4-dioxane (ɛr = 2.21) provides almost identical Stokes shift magnitudes to that of tetrahydrofuran (THF, ɛr = 7.58), for C153 and other typical organic solutes. In this work, RISM-SCF theory has been used to estimate the energetics of C153-solvent systems involved in the absorption and fluorescence processes. The Stokes shift magnitudes estimated by RISM-SCF theory are ∼5 kJ mol-1 (400 cm-1) less than those determined by spectroscopy; however, the results obtained are still adequate for dipole moment comparisons, in a qualitative sense. We have also calculated the solute-solvent site-site radial distributions by this theory. It is shown that solvation structures with respect to the C-O-C framework, which is common to dioxane and THF, in the near vicinity (∼0.4 nm) of specific solute sites can largely account for their similar Stokes shift magnitudes. In previous works, such solute-solvent short-range interactions have been explained in terms of the higher-order multipole moments of the solvents. Our present study shows that along with the short-range interactions that contribute most significantly to the energetics, long-range electrostatic interactions are also important. Such long-range interactions are effective up to 2 nm from the solute site, as in the case of a typical polar solvent, acetonitrile.
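
    A quick unit check on the discrepancy quoted above (1 kJ/mol corresponds to about 83.6 cm-1):

      # ~5 kJ/mol expressed in wavenumbers, using 83.593 cm^-1 per kJ/mol.
      print(f"{5 * 83.593:.0f} cm-1")   # ~418 cm-1, consistent with the quoted ~400 cm-1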

  15. Solvent dependence of Stokes shift for organic solute-solvent systems: A comparative study by spectroscopy and reference interaction-site model-self-consistent-field theory

    NASA Astrophysics Data System (ADS)

    Nishiyama, Katsura; Watanabe, Yasuhiro; Yoshida, Norio; Hirata, Fumio

    2013-09-01

    The Stokes shift magnitudes for coumarin 153 (C153) in 13 organic solvents with various polarities have been determined by means of steady-state spectroscopy and reference interaction-site model-self-consistent-field (RISM-SCF) theory. RISM-SCF calculations have reproduced experimental results fairly well, including individual solvent characteristics. It is empirically known that in some solvents, larger Stokes shift magnitudes are detected than anticipated on the basis of the solvent relative permittivity, ɛr. In practice, 1,4-dioxane (ɛr = 2.21) provides almost identical Stokes shift magnitudes to that of tetrahydrofuran (THF, ɛr = 7.58), for C153 and other typical organic solutes. In this work, RISM-SCF theory has been used to estimate the energetics of C153-solvent systems involved in the absorption and fluorescence processes. The Stokes shift magnitudes estimated by RISM-SCF theory are ˜5 kJ mol-1 (400 cm-1) less than those determined by spectroscopy; however, the results obtained are still adequate for dipole moment comparisons, in a qualitative sense. We have also calculated the solute-solvent site-site radial distributions by this theory. It is shown that solvation structures with respect to the C-O-C framework, which is common to dioxane and THF, in the near vicinity (˜0.4 nm) of specific solute sites can largely account for their similar Stokes shift magnitudes. In previous works, such solute-solvent short-range interactions have been explained in terms of the higher-order multipole moments of the solvents. Our present study shows that along with the short-range interactions that contribute most significantly to the energetics, long-range electrostatic interactions are also important. Such long-range interactions are effective up to 2 nm from the solute site, as in the case of a typical polar solvent, acetonitrile.

  16. The body of the soul. Lucretian echoes in the Renaissance theories on the psychic substance and its organic repartition.

    PubMed

    Tutrone, Fabio

    2014-01-01

    In the 16th and 17th centuries, when Aristotelianism still was the leading current of natural philosophy and atomistic theories began to arise, Lucretius' De Rerum Natura stood out as an attractive and dangerous model. The present paper reassesses several relevant aspects of Lucretius' materialistic psychology by focusing on the problem of the soul's repartition through the limbs discussed in Book 3. A very successful Lucretian image serves as fil rouge throughout this survey: the description of a snake chopped up, with its pieces moving on the ground (Lucretius DRN 1969, 3.657-669). The paper's first section sets the poet's theory against the background of ancient psychology, pointing out its often neglected assimilation of Aristotelian elements. The second section highlights the influence of De Rerum Natura and its physiology of the soul on Bernardino Telesio, Agostino Doni and Francis Bacon, since all of these authors engage in an original recombination of mechanical and teleological explanations. PMID:25707096

  17. The body of the soul. Lucretian echoes in the Renaissance theories on the psychic substance and its organic repartition.

    PubMed

    Tutrone, Fabio

    2014-01-01

    In the 16th and 17th centuries, when Aristotelianism still was the leading current of natural philosophy and atomistic theories began to arise, Lucretius' De Rerum Natura stood out as an attractive and dangerous model. The present paper reassesses several relevant aspects of Lucretius' materialistic psychology by focusing on the problem of the soul's repartition through the limbs discussed in Book 3. A very successful Lucretian image serves as fil rouge throughout this survey: the description of a snake chopped up, with its pieces moving on the ground (Lucretius DRN 1969, 3.657-669). The paper's first section sets the poet's theory against the background of ancient psychology, pointing out its often neglected assimilation of Aristotelian elements. The second section highlights the influence of De Rerum Natura and its physiology of the soul on Bernardino Telesio, Agostino Doni and Francis Bacon, since all of these authors engage in an original recombination of mechanical and teleological explanations.

  18. Scaled CMOS Technology Reliability Users Guide

    NASA Technical Reports Server (NTRS)

    White, Mark

    2010-01-01

    The desire to assess the reliability of emerging scaled microelectronics technologies through faster reliability trials and more accurate acceleration models is the precursor for further research and experimentation in this relevant field. The effect of semiconductor scaling on microelectronics product reliability is an important aspect to the high-reliability application user. From the perspective of a customer or user, who in many cases must deal with very limited, if any, manufacturer's reliability data to assess the product for a highly reliable application, product-level testing is critical in the characterization and reliability assessment of advanced nanometer semiconductor scaling effects on microelectronics reliability. A methodology on how to accomplish this and techniques for deriving the expected product-level reliability on commercial memory products are provided. Competing mechanism theory and the multiple failure mechanism model are applied to the experimental results of scaled SDRAM products. Accelerated stress testing at multiple conditions is applied at the product level of several scaled memory products to assess the performance degradation and product reliability. Acceleration models are derived for each case. For several scaled SDRAM products, retention time degradation is studied and two distinct soft error populations are observed with each technology generation: early breakdown, characterized by randomly distributed weak bits with Weibull slope (beta)=1, and a main population breakdown with an increasing failure rate. Retention time soft error rates are calculated and a multiple failure mechanism acceleration model with parameters is derived for each technology. Defect densities are calculated and reflect a decreasing trend in the percentage of random defective bits for each successive product generation. A normalized soft error failure rate of the memory data retention time in FIT/Gb and FIT/cm2 for several scaled SDRAM generations is…
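
    The two-population behavior described - randomly distributed weak bits with Weibull slope beta = 1 plus a main population with an increasing failure rate - can be sketched as a simple Weibull mixture; every parameter below is hypothetical, not fitted SDRAM data.

      # Two-population (bimodal Weibull) retention-time failure model.
      import math

      def weibull_cdf(t, eta, beta):
          return 1.0 - math.exp(-((t / eta) ** beta))

      def mixture_cdf(t, frac_defect=0.02):
          early = weibull_cdf(t, eta=2.0, beta=1.0)   # random weak bits, beta = 1
          main = weibull_cdf(t, eta=10.0, beta=4.0)   # wear-out, increasing rate
          return frac_defect * early + (1.0 - frac_defect) * main

      for t in (1.0, 5.0, 10.0):
          print(f"t={t:>4}: cumulative failed fraction = {mixture_cdf(t):.4f}")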

  19. Screening for high-spin metal organic frameworks (MOFs): density functional theory study on DUT-8(M1,M2) (with Mi = V,…,Cu).

    PubMed

    Schwalbe, Sebastian; Trepte, Kai; Seifert, Gotthard; Kortus, Jens

    2016-03-21

    We present a first principles study of low-spin (LS)/high-spin (HS) screening for 3d metal centers in the metal organic framework (MOF) DUT-8(Ni). Various density functional theory (DFT) codes have been used to evaluate numerical and DFT related errors. We compare highly accurate all-electron implementations with the widely used plane wave approach. We present electronically and magnetically stable DUT-8(Ni) HS secondary building units (SBUs). In this work we show how to tune the magnetic and electronic properties of the original SBU only by changing the metal centers. PMID:26922864

  20. Measurement Practices for Reliability and Power Quality

    SciTech Connect

    Kueck, JD

    2005-05-06

    This report provides a distribution reliability measurement "toolkit" that is intended to be an asset to regulators, utilities and power users. The metrics and standards discussed range from simple reliability, to power quality, to the new blend of reliability and power quality analysis that is now developing. This report was sponsored by the Office of Electric Transmission and Distribution, U.S. Department of Energy (DOE). Inconsistencies presently exist in commonly agreed-upon practices for measuring the reliability of the distribution systems. However, efforts are being made by a number of organizations to develop solutions. In addition, there is growing interest in methods or standards for measuring power quality, and in defining power quality levels that are acceptable to various industries or user groups. The problems and solutions vary widely among geographic areas and among large investor-owned utilities, rural cooperatives, and municipal utilities; but there is still a great degree of commonality. Industry organizations such as the National Rural Electric Cooperative Association (NRECA), the Electric Power Research Institute (EPRI), the American Public Power Association (APPA), and the Institute of Electrical and Electronics Engineers (IEEE) have made tremendous strides in preparing self-assessment templates, optimization guides, diagnostic techniques, and better definitions of reliability and power quality measures. In addition, public utility commissions have developed codes and methods for assessing performance that consider local needs. There is considerable overlap among these various organizations, and we see real opportunity and value in sharing these methods, guides, and standards in this report. This report provides a "toolkit" containing synopses of noteworthy reliability measurement practices. The toolkit has been developed to address the interests of three groups: electric power users, utilities, and regulators. The report will also serve…
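
    The most widely shared of the distribution indices a toolkit like this covers (SAIFI, SAIDI, and CAIDI, as defined in IEEE Std 1366) compute directly from outage records; the event list below is made up for illustration.

      # IEEE 1366-style indices from (customers interrupted, minutes out) records.

      outages = [(1_200, 90), (300, 45), (5_000, 120)]   # hypothetical events
      customers_served = 50_000

      saifi = sum(n for n, _ in outages) / customers_served       # interruptions/cust.
      saidi = sum(n * m for n, m in outages) / customers_served   # minutes/cust.
      caidi = saidi / saifi                                       # minutes/interruption
      print(f"SAIFI={saifi:.3f}  SAIDI={saidi:.1f} min  CAIDI={caidi:.1f} min")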

  1. General theory for multiple input-output perturbations in complex molecular systems. 1. Linear QSPR electronegativity models in physical, organic, and medicinal chemistry.

    PubMed

    González-Díaz, Humberto; Arrasate, Sonia; Gómez-SanJuan, Asier; Sotomayor, Nuria; Lete, Esther; Besada-Porto, Lina; Ruso, Juan M

    2013-01-01

    In general, perturbation methods start with a known exact solution of a problem and add "small" variation terms in order to approach a solution for a related problem without a known exact solution. Perturbation theory has been widely used in almost all areas of science. Bohr's quantum model, Heisenberg's matrix mechanics, Feynman diagrams, and Poincaré's chaos model or "butterfly effect" in complex systems are examples of perturbation theories. On the other hand, the study of Quantitative Structure-Property Relationships (QSPR) in molecular complex systems is an ideal area for the application of perturbation theory. There are several problems with exact experimental solutions (new chemical reactions, physicochemical properties, drug activity and distribution, metabolic networks, etc.) in public databases like CHEMBL. However, in all these cases, we have an even larger list of related problems without known solutions. We need to know the change in all these properties after a perturbation of the initial boundary conditions; that is, when we test large sets of similar, but different, compounds and/or chemical reactions under slightly different conditions (temperature, time, solvents, enzymes, assays, protein targets, tissues, partition systems, organisms, etc.). However, to the best of our knowledge, there is no QSPR general-purpose perturbation theory to solve this problem. In this work, we first review general aspects and applications of both perturbation theory and QSPR models. Second, we formulate a general-purpose perturbation theory for multiple-boundary QSPR problems. Last, we develop three new QSPR-Perturbation theory models. The first model correctly classifies >100,000 pairs of intra-molecular carbolithiations with 75-95% Accuracy (Ac), Sensitivity (Sn), and Specificity (Sp). The model predicts probabilities of variations in the yield and enantiomeric excess of reactions due to at least one perturbation in boundary conditions (solvent, temperature

  2. 75 FR 70752 - Reliability Monitoring, Enforcement and Compliance Issues; Announcement of Panelists for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-18

    ... of its performance as the nation's Electric Reliability Organization (ERO), and performance by the... Skaar, President, Midwest Reliability Organization Steven Goodwill, General Counsel, Western Electricity... ``Documentation'' Violations and ``Performance'' Violations Are Regional Entities and NERC conducting...

  3. Statistical modelling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1991-01-01

    During the six-month period from 1 April 1991 to 30 September 1991 the following research papers in statistical modeling of software reliability appeared: (1) A Nonparametric Software Reliability Growth Model; (2) On the Use and the Performance of Software Reliability Growth Models; (3) Research and Development Issues in Software Reliability Engineering; (4) Special Issues on Software; and (5) Software Reliability and Safety.

  4. Orbiter Autoland reliability analysis

    NASA Technical Reports Server (NTRS)

    Welch, D. Phillip

    1993-01-01

    The Space Shuttle Orbiter is the only space reentry vehicle in which the crew is seated upright. This position presents some physiological effects requiring countermeasures to prevent a crewmember from becoming incapacitated. This also introduces a potential need for automated vehicle landing capability. Autoland is a primary procedure that was identified as a requirement for landing following an extended duration orbiter mission. This report documents the results of the reliability analysis performed on the hardware required for an automated landing. A reliability block diagram was used to evaluate system reliability. The analysis considers the manual and automated landing modes currently available on the Orbiter. (Autoland is presently a backup system only.) Results of this study indicate a +/- 36 percent probability of successfully extending a nominal mission to 30 days. Enough variations were evaluated to verify that the reliability could be altered with mission planning and procedures. If the crew is modeled as being fully capable after 30 days, the probability of a successful manual landing is comparable to that of Autoland because much of the hardware is used for both manual and automated landing modes. The analysis indicates that the reliability for the manual mode is limited by the hardware and depends greatly on crew capability. Crew capability for a successful landing after 30 days has not yet been determined.
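
    A reliability block diagram reduces to elementary algebra: series blocks multiply reliabilities, while redundant (parallel) blocks multiply failure probabilities. The sketch below illustrates the technique with hypothetical component values; it is not the report's Orbiter hardware data.

        # Reliability block diagram algebra (component values are hypothetical).
        def series(*rs):
            """Series blocks: all must work, so reliabilities multiply."""
            out = 1.0
            for r in rs:
                out *= r
            return out

        def parallel(*rs):
            """Redundant blocks: the system fails only if every branch fails."""
            fail = 1.0
            for r in rs:
                fail *= 1.0 - r
            return 1.0 - fail

        # e.g. triple-redundant computers in series with a sensor and an actuator
        print(series(parallel(0.95, 0.95, 0.95), 0.99, 0.98))  # ~0.9701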

  5. Photovoltaic module reliability workshop

    SciTech Connect

    Mrig, L.

    1990-01-01

    The papers and presentations compiled in this volume form the Proceedings of the fourth in a series of workshops sponsored by the Solar Energy Research Institute (SERI/DOE) under the general theme of photovoltaic module reliability during the period 1986-1990. The reliability of photovoltaic (PV) modules/systems is exceedingly important, along with the initial cost and efficiency of modules, if PV technology is to make a major impact in the power generation market and compete with conventional electricity-producing technologies. The reliability of photovoltaic modules has progressed significantly in the last few years, as evidenced by warranties available on commercial modules of as long as 12 years. However, substantial research and testing are still required to improve module field reliability to levels of 30 years or more. Several small groups of researchers are involved in this research, development, and monitoring activity around the world. In the US, PV manufacturers, DOE laboratories, electric utilities and others are engaged in photovoltaic reliability research and testing. This group of researchers and others interested in this field were brought together under SERI/DOE sponsorship to exchange technical knowledge and field experience related to current information in this important field. The papers presented here reflect this effort.

  6. Proposed reliability cost model

    NASA Technical Reports Server (NTRS)

    Delionback, L. M.

    1973-01-01

    The research investigations involved in the study include: cost analysis/allocation, reliability and product assurance, forecasting methodology, systems analysis, and model-building. This is a classic example of an interdisciplinary problem, since the model-building requirements include the need for understanding and communication between technical disciplines on one hand, and the financial/accounting skill categories on the other. The systems approach is utilized within this context to establish a clearer and more objective relationship between reliability assurance and the subcategories (or subelements) that provide, or reinforce, the reliability assurance for a system. Subcategories are further subdivided as illustrated by a tree diagram. The reliability assurance elements can be seen to be potential alternative strategies, or approaches, depending on the specific goals/objectives of the trade studies. The scope was limited to the establishment of a proposed reliability cost-model format. The model format/approach depends upon the use of a series of subsystem-oriented CERs and, where possible, CTRs, in devising a suitable cost-effective policy.

  7. Learning for Social Justice: A Cultural Historical Activity Theory Analysis of Community Leadership Empowerment in a Korean American Community Organization

    ERIC Educational Resources Information Center

    Kim, Junghwan

    2012-01-01

    Community organizations, especially those aiming at social change, play a significant role in establishing societal health and contributing to adult learning in daily communities. Their existence secures marginalized groups' involvement in society and enhances community development by building community leadership with multiple stakeholders…

  8. A Grounded Theory of the College Experiences of African American Males in Black Greek-Letter Organizations

    ERIC Educational Resources Information Center

    Ford, David Julius, Jr.

    2014-01-01

    Studies have shown that involvement in a student organization can improve the academic and psychosocial outcomes of African American male students (Harper, 2006b; Robertson & Mason, 2008; Williams & Justice, 2010). Further, Harper, Byars, and Jelke (2005) stated that African American fraternities and sororities (i.e., Black Greek-letter…

  9. CRITICAL EVALUATION OF THE DIFFUSION HYPOTHESIS IN THE THEORY OF POROUS MEDIA VOLATILE ORGANIC COMPOUND (VOC) SOURCES AND SINKS

    EPA Science Inventory

    The paper proposes three alternative, diffusion-limited mathematical models to account for volatile organic compound (VOC) interactions with indoor sinks, using the linear isotherm model as a reference point. (NOTE: Recent reports by both the U.S. EPA and a study committee of the...

  10. Test-Retest Reliability of High Angular Resolution Diffusion Imaging Acquisition within Medial Temporal Lobe Connections Assessed via Tract Based Spatial Statistics, Probabilistic Tractography and a Novel Graph Theory Metric

    PubMed Central

    Kuhn, T.; Gullett, J. M.; Nguyen, P.; Boutzoukas, A. E.; Ford, A.; Colon-Perez, L. M.; Triplett, W.; Carney, P.R.; Mareci, T. H.; Price, C. C.; Bauer, R. M.

    2015-01-01

    Introduction: This study examined the reliability of high angular resolution diffusion tensor imaging (HARDI) data collected on a single individual across several sessions using the same scanner. Methods: HARDI data was acquired for one healthy adult male at the same time of day on ten separate days across a one-month period. Environmental factors (e.g. temperature) were controlled across scanning sessions. Tract Based Spatial Statistics (TBSS) was used to assess session-to-session variability in measures of diffusion, fractional anisotropy (FA) and mean diffusivity (MD). To address reliability within specific structures of the medial temporal lobe (MTL; the focus of an ongoing investigation), probabilistic tractography segmented the Entorhinal cortex (ERc) based on connections with Hippocampus (HC), Perirhinal (PRc) and Parahippocampal (PHc) cortices. Streamline tractography generated edge weight (EW) metrics for the aforementioned ERc connections and, as comparison regions, connections between left and right rostral and caudal anterior cingulate cortex (ACC). Coefficients of variation (CoV) were derived for the surface area and volumes of these ERc connectivity-defined regions (CDR) and for EW across all ten scans, expecting that scan-to-scan reliability would yield low CoVs. Results: TBSS revealed no significant variation in FA or MD across scanning sessions. Probabilistic tractography successfully reproduced histologically-verified adjacent medial temporal lobe circuits. Tractography-derived metrics displayed larger ranges of scanner-to-scanner variability. Connections involving HC displayed greater variability than metrics of connection between other investigated regions. Conclusions: By confirming the test-retest reliability of HARDI data acquisition, support for the validity of significant results derived from diffusion data can be obtained. PMID:26189060
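
    The coefficient of variation used throughout this record is simply the standard deviation scaled by the mean. A minimal sketch, with hypothetical per-scan volumes standing in for the actual measurements:

        import statistics

        # Hypothetical volumes (mm^3) of one connectivity-defined region, ten scans
        volumes = [812, 798, 805, 820, 809, 801, 815, 807, 811, 803]

        cov = statistics.stdev(volumes) / statistics.mean(volumes)
        print(f"CoV = {cov:.2%}")  # a low CoV indicates good scan-to-scan reliability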

  11. Test-retest reliability of high angular resolution diffusion imaging acquisition within medial temporal lobe connections assessed via tract based spatial statistics, probabilistic tractography and a novel graph theory metric.

    PubMed

    Kuhn, T; Gullett, J M; Nguyen, P; Boutzoukas, A E; Ford, A; Colon-Perez, L M; Triplett, W; Carney, P R; Mareci, T H; Price, C C; Bauer, R M

    2016-06-01

    This study examined the reliability of high angular resolution diffusion tensor imaging (HARDI) data collected on a single individual across several sessions using the same scanner. HARDI data was acquired for one healthy adult male at the same time of day on ten separate days across a one-month period. Environmental factors (e.g. temperature) were controlled across scanning sessions. Tract Based Spatial Statistics (TBSS) was used to assess session-to-session variability in measures of diffusion, fractional anisotropy (FA) and mean diffusivity (MD). To address reliability within specific structures of the medial temporal lobe (MTL; the focus of an ongoing investigation), probabilistic tractography segmented the Entorhinal cortex (ERc) based on connections with Hippocampus (HC), Perirhinal (PRc) and Parahippocampal (PHc) cortices. Streamline tractography generated edge weight (EW) metrics for the aforementioned ERc connections and, as comparison regions, connections between left and right rostral and caudal anterior cingulate cortex (ACC). Coefficients of variation (CoV) were derived for the surface area and volumes of these ERc connectivity-defined regions (CDR) and for EW across all ten scans, expecting that scan-to-scan reliability would yield low CoVs. TBSS revealed no significant variation in FA or MD across scanning sessions. Probabilistic tractography successfully reproduced histologically-verified adjacent medial temporal lobe circuits. Tractography-derived metrics displayed larger ranges of scanner-to-scanner variability. Connections involving HC displayed greater variability than metrics of connection between other investigated regions. By confirming the test-retest reliability of HARDI data acquisition, support for the validity of significant results derived from diffusion data can be obtained.

  12. Accounting for natural organic matter in aqueous chemical equilibrium models: a review of the theories and applications

    NASA Astrophysics Data System (ADS)

    Dudal, Yves; Gérard, Frédéric

    2004-08-01

    Soil organic matter consists of a highly complex and diversified blend of organic molecules, ranging from low molecular weight organic acids (LMWOAs), sugars, amines, alcohols, etc., to high apparent molecular weight fulvic and humic acids. The presence of a wide range of functional groups on these molecules makes them very reactive and influential in soil chemistry, with regard to acid-base chemistry, metal complexation, precipitation and dissolution of minerals and microbial reactions. Of these functional groups, the carboxylic and phenolic ones are the most abundant and the most influential with regard to metal complexation. Therefore, chemical equilibrium models have progressively dealt with organic matter in their calculations. This paper presents a review of six chemical equilibrium models, namely NICA-Donnan, EQ3/6, GEOCHEM, MINTEQA2, PHREEQC and WHAM, in light of the account they make of natural organic matter (NOM), with the objective of helping potential users in choosing a modelling approach. This accounting has taken various forms, mainly by adding specific molecules within the existing model databases (EQ3/6, GEOCHEM, and PHREEQC) or by using either a discrete (WHAM) or a continuous (NICA-Donnan and MINTEQA2) distribution of the deprotonated carboxylic and phenolic groups. The different ways in which soil organic matter has been integrated into these models are discussed with regard to the model-experiment comparisons found in the literature, concerning applications to either laboratory or natural systems. Much of the attention has been focused on the two most advanced models, WHAM and NICA-Donnan, which are able to reasonably describe most of the experimental results. Nevertheless, a better knowledge of the metal-binding properties of humic substances is needed to better constrain model inputs with site-specific parameter values. This represents the main axis of research that needs to be carried out to improve the models. In addition to

  13. Self-organization in irregular landscapes: Detecting autogenic interactions from field data using descriptive statistics and dynamical systems theory

    NASA Astrophysics Data System (ADS)

    Larsen, L.; Watts, D.; Khurana, A.; Anderson, J. L.; Xu, C.; Merritts, D. J.

    2015-12-01

    The classic signal of self-organization in nature is pattern formation. However, the interactions and feedbacks that organize depositional landscapes do not always result in regular or fractal patterns. How might we detect their existence and effects in these "irregular" landscapes? Emergent landscapes such as newly forming deltaic marshes or some restoration sites provide opportunities to study the autogenic processes that organize landscapes and their physical signatures. Here we describe a quest to understand autogenic vs. allogenic controls on landscape evolution in Big Spring Run, PA, a landscape undergoing restoration from bare-soil conditions to a target wet meadow landscape. The contemporary motivation for asking questions about autogenic vs. allogenic controls is to evaluate how important initial conditions or environmental controls may be for the attainment of management objectives. However, these questions can also inform interpretation of the sedimentary record by enabling researchers to separate signals that may have arisen through self-organization processes from those resulting from environmental perturbations. Over three years at Big Spring Run, we mapped the dynamic evolution of floodplain vegetation communities and distributions of abiotic variables and topography. We used principal component analysis and transition probability analysis to detect associative interactions between vegetation and geomorphic variables and convergent cross-mapping on lidar data to detect causal interactions between biomass and topography. Exploratory statistics revealed that plant communities with distinct morphologies exerted control on landscape evolution through stress divergence (i.e., channel initiation) and promoting the accumulation of fine sediment in channels. Together, these communities participated in a negative feedback that maintains low energy and multiple channels. Because of the spatially explicit nature of this feedback, causal interactions could not

  14. Organ-specific rates of cellular respiration in developing sunflower seedlings and their bearing on metabolic scaling theory.

    PubMed

    Kutschera, Ulrich; Niklas, Karl J

    2012-10-01

    Fifty years ago, Max Kleiber described what has become known as the "mouse-to-elephant" curve, i.e., a log-log plot of basal metabolic rate versus body mass. From these data, "Kleiber's 3/4 law" was deduced, which states that metabolic activity scales as the three-fourths power of body mass. However, for reasons unknown so far, no such "universal scaling law" has been discovered for land plants (embryophytes). Here, we report that the metabolic rates of four different organs (cotyledons, cotyledonary hook, hypocotyl, and roots) of developing sunflower (Helianthus annuus L.) seedlings grown in darkness (skotomorphogenesis) and in white light (photomorphogenesis) differ by a factor of 2 to 5 and are largely independent of light treatment. The organ-specific respiration rate (oxygen uptake per minute per gram of fresh mass) of the apical hook, which is composed of cells with densely packed cytoplasm, is much higher than that of the hypocotyl, an organ that contains vacuolated cells. Data for cell length, cell density, and DNA content reveal that (1) hook opening in white light is caused by a stimulation of cell elongation on the inside of the curved organ, (2) respiration, cell density and DNA content are much higher in the hook than in the stem, and (3) organ-specific respiration rates and the DNA contents of tissues are statistically correlated. We conclude that, due to the heterogeneity of the plant body caused by the vacuolization of the cells, Kleiber's law, which was deduced using mammals as a model system, cannot be applied to embryophytes. In plants, this rule may reflect scaling phenomena at the level of the metabolically active protoplasmic contents of the cells.
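
    For reference, the scaling relation at issue is the standard allometric form, which plots as a straight line of slope 3/4 on log-log axes:

        \[
        B = B_0 M^{3/4} \quad\Longleftrightarrow\quad \log B = \log B_0 + \tfrac{3}{4}\log M
        \]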

  15. Software reliability perspectives

    NASA Technical Reports Server (NTRS)

    Wilson, Larry; Shen, Wenhui

    1987-01-01

    Software which is used in life-critical functions must be known to be highly reliable before installation. This requires a strong testing program to estimate the reliability, since neither formal methods, software engineering nor fault-tolerant methods can guarantee perfection. Prior to final testing, software goes through a debugging period, and many models have been developed to try to estimate reliability from the debugging data. However, the existing models are poorly validated and often give poor performance. This paper emphasizes the fact that part of their failure can be attributed to the random nature of the debugging data given to these models as input, and it poses the problem of correcting this defect as an area of future research.

  16. Reliability Centered Maintenance - Methodologies

    NASA Technical Reports Server (NTRS)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  17. First-principles calculation of photo-induced electron transfer rate constants in phthalocyanine-C60 organic photovoltaic materials: Beyond Marcus theory

    NASA Astrophysics Data System (ADS)

    Lee, Myeong H.; Dunietz, Barry D.; Geva, Eitan

    2014-03-01

    Classical Marcus theory is commonly adopted for solvent-mediated charge transfer (CT) processes to obtain the CT rate constant, but it can become questionable when intramolecular vibrational modes dominate the CT process, as in OPV devices, because Marcus theory treats these modes classically and therefore nuclear tunneling is not accounted for. We present a computational scheme to obtain the electron transfer rate constant beyond classical Marcus theory. Within this approach, the nuclear vibrational modes are treated quantum-mechanically and a short-time approximation is avoided. Ab initio calculations are used to obtain the basic parameters needed for calculating the electron transfer rate constant. We apply our methodology to the phthalocyanine (H2PC)-C60 organic photovoltaic system, where one C60 acceptor and one or two H2PC donors are included to model the donor-acceptor interface configuration. We obtain the electron transfer and recombination rate constants for all accessible charge transfer (CT) states, from which the CT exciton dynamics is determined by employing a master equation. The role of higher lying excited states in CT exciton dynamics is discussed. This work is pursued as part of the Center for Solar and Thermal Energy Conversion, an Energy Frontier Research Center funded by the US Department of Energy, Office of Science, Office of Basic Energy Sciences, under Award No. DE-SC0000957.
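
    The classical expression being improved upon is the standard nonadiabatic Marcus rate, given here for orientation (H_DA is the electronic coupling, λ the reorganization energy, ΔG° the driving force):

        \[
        k_{\mathrm{ET}} = \frac{2\pi}{\hbar}\,|H_{DA}|^2\,
        \frac{1}{\sqrt{4\pi\lambda k_B T}}\,
        \exp\!\left[-\frac{(\Delta G^{\circ}+\lambda)^2}{4\lambda k_B T}\right]
        \]

    Because the nuclear coordinate enters only through the classical activation term, tunneling through high-frequency intramolecular modes is absent, which is precisely the regime the record's quantum-mechanical treatment addresses.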

  18. Gearbox Reliability Collaborative Update (Presentation)

    SciTech Connect

    Sheng, S.

    2013-10-01

    This presentation was given at the Sandia Reliability Workshop in August 2013 and provides information on current statistics, a status update, next steps, and other reliability research and development activities related to the Gearbox Reliability Collaborative.

  19. Materials reliability issues in microelectronics

    SciTech Connect

    Lloyd, J.R. ); Yost, F.G. ); Ho, P.S. )

    1991-01-01

    This book covers the proceedings of a MRS symposium on materials reliability in microelectronics. Topics include: electromigration; stress effects on reliability; stress and packaging; metallization; device, oxide and dielectric reliability; new investigative techniques; and corrosion.

  1. Quantifying Human Performance Reliability.

    ERIC Educational Resources Information Center

    Askren, William B.; Regulinski, Thaddeus L.

    Human performance reliability for tasks in the time-space continuous domain is defined and a general mathematical model presented. The human performance measurement terms time-to-error and time-to-error-correction are defined. The model and measurement terms are tested using laboratory vigilance and manual control tasks. Error and error-correction…

  2. Grid reliability management tools

    SciTech Connect

    Eto, J.; Martinez, C.; Dyer, J.; Budhraja, V.

    2000-10-01

    To summarize, the Consortium for Electric Reliability Technology Solutions (CERTS) is engaged in a multi-year program of public interest R&D to develop and prototype software tools that will enhance system reliability during the transition to competitive markets. The core philosophy embedded in the design of these tools is the recognition that in the future reliability will be provided through market operations, not the decisions of central planners. Embracing this philosophy calls for tools that: (1) Recognize that the game has moved from machine modeling and engineering analysis to simulating markets to understand the impacts on reliability (and vice versa); (2) Provide real-time data and support information transparency toward enhancing the ability of operators and market participants to quickly grasp, analyze, and act effectively on information; (3) Allow operators, in particular, to measure, monitor, assess, and predict both system performance and the performance of market participants; and (4) Allow rapid incorporation of the latest sensing, data communication, computing, visualization, and algorithmic techniques and technologies.

  3. Parametric Mass Reliability Study

    NASA Technical Reports Server (NTRS)

    Holt, James P.

    2014-01-01

    The International Space Station (ISS) systems are designed based upon having redundant systems with replaceable orbital replacement units (ORUs). These ORUs are designed to be swapped out fairly quickly, but some are very large, and some are made up of many components. When an ORU fails, it is replaced on orbit with a spare; the failed unit is sometimes returned to Earth to be serviced and re-launched. Such a system is not feasible for a 500+ day long-duration mission beyond low Earth orbit. The components that make up these ORUs have mixed reliabilities. Components that make up the most mass (such as computer housings, pump casings, and the silicon boards of PCBs) are typically the most reliable. Meanwhile, components that tend to fail earliest (such as seals or gaskets) typically have a small mass. To better understand the problem, my project is to create a parametric model that relates both the mass of ORUs to reliability and the mass of ORU subcomponents to reliability.
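
    The abstract does not give the functional form of the parametric model, so the sketch below simply illustrates one plausible choice, fitting a power law between subcomponent mass and MTBF; all names and numbers are hypothetical.

        import numpy as np
        from scipy.optimize import curve_fit

        # Hypothetical ORU subcomponent data: mass (kg) vs. observed MTBF (hours)
        mass = np.array([0.05, 0.2, 1.0, 5.0, 20.0, 80.0])
        mtbf = np.array([3e3, 9e3, 4e4, 1.5e5, 6e5, 2e6])

        def power_law(m, a, b):
            return a * m ** b  # assumed form: MTBF ~ a * mass^b

        (a, b), _ = curve_fit(power_law, mass, mtbf, p0=(1e4, 1.0))
        print(f"fitted MTBF ~ {a:.3g} * mass^{b:.2f}")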

  4. Software reliability report

    NASA Technical Reports Server (NTRS)

    Wilson, Larry

    1991-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Unfortunately, the models appear to be unable to account for the random nature of the data. If the same code is debugged multiple times and one of the models is used to make predictions, intolerable variance is observed in the resulting reliability predictions. It is believed that data replication can remove this variance in lab-type situations and that it is less than scientific to talk about validating a software reliability model without considering replication. It is also believed that data replication may prove to be cost effective in the real world; thus the research centered on verification of the need for replication and on methodologies for generating replicated data in a cost-effective manner. The concept of the debugging graph was pursued through simulation and experimentation. Simulation was done for the Basic model and the Log-Poisson model. Reasonable values of the parameters were assigned and used to generate simulated data, which was then processed by the models in order to determine limitations on their accuracy. These experiments exploit the existing software and program specimens in AIR-LAB to measure the performance of reliability models.
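
    As a sketch of what such a simulation involves, the snippet below generates failure times from a logarithmic-Poisson (Musa-Okumoto) process by mapping unit-rate Poisson arrivals through the inverse of the mean-value function μ(t) = (1/θ)ln(λ0θt + 1); the parameter values are hypothetical, not those used in the study.

        import math, random

        lam0, theta = 20.0, 0.025  # hypothetical initial intensity and decay parameter

        def mu_inverse(m):
            # Invert mu(t) = (1/theta) * ln(lam0*theta*t + 1)
            return (math.exp(theta * m) - 1.0) / (lam0 * theta)

        # NHPP property: failure times are mu^{-1} of unit-rate Poisson arrival times.
        random.seed(1)
        m, failure_times = 0.0, []
        for _ in range(50):
            m += random.expovariate(1.0)
            failure_times.append(mu_inverse(m))

        print([round(t, 2) for t in failure_times[:5]])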

  5. Reliable solar cookers

    SciTech Connect

    Magney, G.K.

    1992-12-31

    The author describes the activities of SERVE, a Christian relief and development agency, to introduce solar ovens to the Afghan refugees in Pakistan. It has provided 5,000 solar cookers since 1984. The experience has demonstrated the potential of the technology and the need for a durable and reliable product. Common complaints about the cookers are discussed and the ideal cooker is described.

  6. IRT-Estimated Reliability for Tests Containing Mixed Item Formats

    ERIC Educational Resources Information Center

    Shu, Lianghua; Schwarz, Richard D.

    2014-01-01

    As a global measure of precision, item response theory (IRT) estimated reliability is derived for four coefficients (Cronbach's α, Feldt-Raju, stratified α, and marginal reliability). Models with different underlying assumptions concerning test-part similarity are discussed. A detailed computational example is presented for the targeted…
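
    The first of these coefficients has the familiar closed form for a k-item test with item variances σ_i² and total-score variance σ_X²:

        \[
        \alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma_i^2}{\sigma_X^2}\right)
        \]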

  7. Toward panchromatic organic functional molecules: density functional theory study on the electronic absorption spectra of substituted tetraanthracenylporphyrins.

    PubMed

    Qi, Dongdong; Jiang, Jianzhuang

    2011-12-01

    To achieve full solar spectrum absorption of organic dyes for organic solar cells and organic solar antenna collectors, a series of tetraanthracenylporphyrin derivatives including H(2)(TAnP), H(2)(α-F(4)TAnP), H(2)(β,β'-F(8)TAnP), H(2)(γ,γ'-F(8)TAnP), H(2)(δ,δ'-F(8)TAnP), H(2)[α-(NH(2))(4)TAnP], H(2)[β,β'-(NH(2))(8)TAnP], H(2)[γ,γ'-(NH(2))(8)TAnP], and H(2)[δ,δ'-(NH(2))(8)TAnP] was designed, and their electronic absorption spectra were systematically studied on the basis of TDDFT calculations. The nature of the broad and intense electronic absorptions of H(2)(TAnP) in the range of 500-1700 nm is clearly revealed, and the different types of π → π* electronic transitions associated with different absorption bands are shown to correspond to different directions of electron-density movement between the peripherally fused 14-electron-π-conjugated anthracene units and the central 18-electron-π-conjugated porphyrin core. Introduction of electron-donating groups onto the periphery of the H(2)(TAnP) macrocycle is shown to lead to novel NIR dyes such as H(2)[α-(NH(2))(4)TAnP] and H(2)[δ,δ'-(NH(2))(8)TAnP] with regulated UV-vis-NIR absorption bands covering the full solar spectrum in the range of 300-2400 nm.

  8. The Harvard Clean Energy Project: High-throughput screening of organic photovoltaic materials using first-principles electronic structure theory

    NASA Astrophysics Data System (ADS)

    Hachmann, Johannes; Olivares-Amaya, Roberto; Atahan-Evrenk, Sule; Amador-Bedolla, Carlos; Aspuru-Guzik, Alan

    2012-02-01

    We present the Harvard Clean Energy Project (CEP) which is concerned with the computational screening and design of new organic photovoltaic materials. CEP has established an automated, high-throughput, in silico framework to study millions of potential candidate structures. This presentation discusses the CEP branch which employs first-principles computational quantum chemistry for the characterization of molecular motifs and the assessment of their quality with respect to applications as electronic materials. In addition to finding specific structures with certain properties, it is the goal of CEP to illuminate and understand the structure-property relations in the domain of organic electronics. Such insights can open the door to a rational, systematic, and accelerated development of future high-performance materials. CEP is a large-scale investigation which utilizes the massive computational resource of IBM's World Community Grid. In this context, it is deployed as a screensaver application harvesting idle computing time on donor machines. This cyberinfrastructure paradigm has already allowed us to characterize 3.5 million molecules of interest in about 50 million DFT calculations.

  9. Donating blood and organs: using an extended theory of planned behavior perspective to identify similarities and differences in individual motivations to donate.

    PubMed

    Hyde, Melissa K; Knowles, Simon R; White, Katherine M

    2013-12-01

    Due to the critical shortage of and continued need for blood and organ donations (ODs), research exploring similarities and differences in the motivational determinants of these behaviors is needed. In a sample of 258 university students, we used a cross-sectional design to test the utility of an extended theory of planned behavior (TPB), including moral norm, self-identity and in-group altruism (family/close friends and ethnic group), to predict people's blood and OD intentions. Overall, the extended TPB explained 77.0% and 74.6% of variance in blood and OD intentions, respectively. In regression analyses, common contributors to intentions across donation contexts were attitude, self-efficacy and self-identity. Normative influences varied, with subjective norm a significant predictor of OD intentions but not blood donation intentions at the final step of the regression analyses. Moral norm did not contribute significantly to blood or OD intentions. In-group altruism (family/close friends) was significantly related to OD intentions only in regressions. Future donation strategies should increase confidence to donate, foster a perception of self as the type of person who donates blood and/or organs, and address preferences to donate organs to in-group members only. PMID:23943782

  10. 78 FR 58492 - Generator Verification Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-24

    ... Energy Regulatory Commission 18 CFR Part 40 Generator Verification Reliability Standards AGENCY: Federal... Organization: MOD-025-2 (Verification and Data Reporting of Generator Real and Reactive Power Capability and Synchronous Condenser Reactive Power Capability), MOD- 026-1 (Verification of Models and Data for...

  11. World Health Organization Quality-of-Life Scale (WHOQOL-BREF): Analyses of Their Item Response Theory Properties Based on the Graded Responses Model

    PubMed Central

    2010-01-01

    Objective: This study used Item Response Theory (IRT) to examine the psychometric properties of Health-Related Quality-of-Life. Method: This investigation is a descriptive-analytic study. Subjects were 370 undergraduate students of nursing and midwifery who were selected from Tabriz University of Medical Sciences. All participants were asked to complete the Farsi version of the WHOQOL-BREF. Samejima's graded response model was used for the analyses. Results: The results revealed that the discrimination parameters for all items in the four scales were low to moderate. The threshold parameters showed adequate representation of the relevant traits from low to the mean trait level. With the exception of items 15, 18, 24 and 26, all other items showed low item information function values, and thus relatively high reliability from low to moderate trait levels. Conclusions: The results of this study indicate that although there was general support for the psychometric properties of the WHOQOL-BREF from an IRT perspective, this measure can be further improved. IRT analyses provided useful measurement information and proved to be a better methodological approach for enhancing our knowledge of the functionality of the WHOQOL-BREF. PMID:22952508
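
    Samejima's graded response model, as used here, specifies for item i (discrimination a_i, category thresholds b_ik) the cumulative probability of responding in category k or higher, and obtains category probabilities by differencing (standard form):

        \[
        P^{*}_{ik}(\theta) = \frac{1}{1+\exp[-a_i(\theta-b_{ik})]},
        \qquad
        P(X_i = k \mid \theta) = P^{*}_{ik}(\theta) - P^{*}_{i,k+1}(\theta)
        \]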

  12. The Estimation of the IRT Reliability Coefficient and Its Lower and Upper Bounds, with Comparisons to CTT Reliability Statistics

    ERIC Educational Resources Information Center

    Kim, Seonghoon; Feldt, Leonard S.

    2010-01-01

    The primary purpose of this study is to investigate the mathematical characteristics of the test reliability coefficient rho[subscript XX'] as a function of item response theory (IRT) parameters and present the lower and upper bounds of the coefficient. Another purpose is to examine relative performances of the IRT reliability statistics and two…

  13. Inverse modelling of Köhler theory - Part 1: A response surface analysis of CCN spectra with respect to surface-active organic species

    NASA Astrophysics Data System (ADS)

    Lowe, Samuel; Partridge, Daniel G.; Topping, David; Stier, Philip

    2016-09-01

    In this study a novel framework for inverse modelling of cloud condensation nuclei (CCN) spectra is developed using Köhler theory. The framework is established by using model-generated synthetic measurements as calibration data for a parametric sensitivity analysis. Assessment of the relative importance of aerosol physicochemical parameters, while accounting for bulk-surface partitioning of surface-active organic species, is carried out over a range of atmospherically relevant supersaturations. By introducing an objective function that provides a scalar metric for diagnosing the deviation of modelled CCN concentrations from synthetic observations, objective function response surfaces are presented as a function of model input parameters. Crucially, for the chosen calibration data, aerosol-CCN spectrum closure is confirmed as a well-posed inverse modelling exercise for a subset of the parameters explored herein. The response surface analysis indicates that the appointment of appropriate calibration data is particularly important. To perform an inverse aerosol-CCN closure analysis and constrain parametric uncertainties, it is shown that a high-resolution CCN spectrum definition of the calibration data is required where single-valued definitions may be expected to fail. Using Köhler theory to model CCN concentrations requires knowledge of many physicochemical parameters, some of which are difficult to measure in situ on the scale of interest and introduce a considerable amount of parametric uncertainty to model predictions. For all partitioning schemes and environments modelled, model output showed significant sensitivity to perturbations in aerosol log-normal parameters describing the accumulation mode, surface tension, organic : inorganic mass ratio, insoluble fraction, and solution ideality. Many response surfaces pertaining to these parameters contain well-defined minima and are therefore good candidates for calibration using a Monte Carlo Markov Chain (MCMC
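
    The forward model here is Köhler theory; in its common approximate form, the equilibrium supersaturation over a droplet of diameter D balances a Kelvin (curvature) term against a Raoult (solute) term, with the surface tension σ_s being the quantity perturbed by surface-active organics (notation is the standard textbook form, not the paper's):

        \[
        \ln S(D) \approx \frac{A}{D} - \frac{B}{D^{3}},
        \qquad
        A = \frac{4\sigma_s M_w}{R\,T\,\rho_w},
        \qquad
        B = \frac{6\,n_s M_w}{\pi \rho_w}
        \]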

  14. From Organized High-Throughput Data to Phenomenological Theory using Machine Learning: The Example of Dielectric Breakdown

    DOE PAGES

    Kim, Chiho; Pilania, Ghanshyam; Ramprasad, Ramamurthy

    2016-02-02

    Understanding the behavior (and failure) of dielectric insulators experiencing extreme electric fields is critical to the operation of present and emerging electrical and electronic devices. Despite its importance, the development of a predictive theory of dielectric breakdown has remained a challenge, owing to the complex multiscale nature of this process. We focus on the intrinsic dielectric breakdown field of insulators—the theoretical limit of breakdown determined purely by the chemistry of the material, i.e., the elements the material is composed of, the atomic-level structure, and the bonding. Starting from a benchmark dataset (generated from laborious first principles computations) of the intrinsic dielectric breakdown field of a variety of model insulators, simple predictive phenomenological models of dielectric breakdown are distilled using advanced statistical or machine learning schemes, revealing key correlations and analytical relationships between the breakdown field and easily accessible material properties. Lastly, the models are shown to be general, and can hence guide the screening and systematic identification of high electric field tolerant materials.

  15. Bio-inspired transition metal-organic hydride conjugates for catalysis of transfer hydrogenation: experiment and theory.

    PubMed

    McSkimming, Alex; Chan, Bun; Bhadbhade, Mohan M; Ball, Graham E; Colbran, Stephen B

    2015-02-01

    Taking inspiration from yeast alcohol dehydrogenase (yADH), a benzimidazolium (BI(+)) organic hydride-acceptor domain has been coupled with a 1,10-phenanthroline (phen) metal-binding domain to afford a novel multifunctional ligand (L(BI+)) with hydride-carrier capacity (L(BI+) + H(-) ⇌ L(BI)H). Complexes of the type [Cp*M(L(BI))Cl][PF6]2 (M=Rh, Ir) have been made and fully characterised by cyclic voltammetry, UV/Vis spectroelectrochemistry, and, for the Ir(III) congener, X-ray crystallography. [Cp*Rh(L(BI))Cl][PF6]2 catalyses the transfer hydrogenation of imines by formate ion in very good yields under conditions where the corresponding [Cp*Ir(L(BI))Cl][PF6] and [Cp*M(phen)Cl][PF6] (M=Rh, Ir) complexes are almost inert as catalysts. Possible alternatives for the catalysis pathway are canvassed, and the free energies of intermediates and transition states were determined by DFT calculations. The DFT study supports a mechanism involving formate-driven RhH formation (90 kJ mol(-1) free-energy barrier), transfer of hydride between the Rh and BI(+) centres to generate a tethered benzimidazoline (BIH) hydride donor, binding of imine substrate at Rh, back-transfer of hydride from the BIH organic hydride donor to the Rh-activated imine substrate (89 kJ mol(-1) barrier), and exergonic protonation of the metal-bound amide by formic acid with release of amine product to close the catalytic cycle. Parallels with the mechanism of biological hydride transfer in yADH are discussed.
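
    To put the computed 90 kJ mol(-1) barrier in kinetic terms, transition-state theory gives a rough room-temperature rate (a back-of-envelope estimate, not a figure from the paper):

        \[
        k = \frac{k_B T}{h}\,e^{-\Delta G^{\ddagger}/RT}
        \approx \left(6.2\times10^{12}\ \mathrm{s^{-1}}\right)
        e^{-90000/(8.314\times298)}
        \approx 1\times10^{-3}\ \mathrm{s^{-1}}
        \]

    i.e., turnover on the timescale of minutes to hours, consistent with a barrier that is surmountable under mild catalytic conditions.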

  16. Reliability of photovoltaic modules

    NASA Technical Reports Server (NTRS)

    Ross, R. G., Jr.

    1986-01-01

    In order to assess the reliability of photovoltaic modules, four categories of known array failure and degradation mechanisms are discussed, and target reliability allocations have been developed within each category based on the available technology and the life-cycle-cost requirements of future large-scale terrestrial applications. Cell-level failure mechanisms associated with open-circuiting or short-circuiting of individual solar cells generally arise from cell cracking or the fatigue of cell-to-cell interconnects. Power degradation mechanisms considered include gradual power loss in cells, light-induced effects, and module optical degradation. Module-level failure mechanisms and life-limiting wear-out mechanisms are also explored.

  17. Reliability and durability problems

    NASA Astrophysics Data System (ADS)

    Bojtsov, B. V.; Kondrashov, V. Z.

    The papers presented in this volume focus on methods for determining the stress-strain state of structures and machines and evaluating their reliability and service life. Specific topics discussed include a method for estimating the service life of thin-sheet automotive structures, stressed state at the tip of small cracks in anisotropic plates under biaxial tension, evaluation of the elastic-dissipative characteristics of joints by vibrational diagnostics methods, and calculation of the reliability of ceramic structures for arbitrary long-term loading programs. Papers are also presented on the effect of prior plastic deformation on fatigue damage kinetics, axisymmetric and local deformation of cylindrical parts during finishing-hardening treatments, and adhesion of polymers to diffusion coatings on steels.

  18. Human Reliability Program Workshop

    SciTech Connect

    Landers, John; Rogers, Erin; Gerke, Gretchen

    2014-05-18

    A Human Reliability Program (HRP) is designed to protect national security as well as worker and public safety by continuously evaluating the reliability of those who have access to sensitive materials, facilities, and programs. Some elements of a site HRP include systematic (1) supervisory reviews, (2) medical and psychological assessments, (3) management evaluations, (4) personnel security reviews, and (5) training of HRP staff and critical positions. Over the years of implementing an HRP, the Department of Energy (DOE) has faced various challenges and overcome obstacles. During this 4-day activity, participants will examine programs that mitigate threats to nuclear security and the insider threat, including HRP, Nuclear Security Culture (NSC) Enhancement, and Employee Assistance Programs. The focus will be to develop an understanding of the need for a systematic HRP and to discuss challenges and best practices associated with mitigating the insider threat.

  19. Reliable broadcast protocols

    NASA Technical Reports Server (NTRS)

    Joseph, T. A.; Birman, Kenneth P.

    1989-01-01

    A number of broadcast protocols that are reliable subject to a variety of ordering and delivery guarantees are considered. Developing applications that are distributed over a number of sites and/or must tolerate the failures of some of them becomes a considerably simpler task when such protocols are available for communication. Without such protocols the kinds of distributed applications that can reasonably be built will have a very limited scope. As the trend towards distribution and decentralization continues, it will not be surprising if reliable broadcast protocols have the same role in distributed operating systems of the future that message passing mechanisms have in the operating systems of today. On the other hand, the problems of engineering such a system remain large. For example, deciding which protocol is the most appropriate to use in a certain situation or how to balance the latency-communication-storage costs is not an easy question.

  20. Space Shuttle Propulsion System Reliability

    NASA Technical Reports Server (NTRS)

    Welzyn, Ken; VanHooser, Katherine; Moore, Dennis; Wood, David

    2011-01-01

    This session includes the following presentations: (1) External Tank (ET) System Reliability and Lessons, (2) Space Shuttle Main Engine (SSME), Reliability Validated by a Million Seconds of Testing, (3) Reusable Solid Rocket Motor (RSRM) Reliability via Process Control, and (4) Solid Rocket Booster (SRB) Reliability via Acceptance and Testing.

  1. Spacecraft transmitter reliability

    NASA Technical Reports Server (NTRS)

    1980-01-01

    A workshop on spacecraft transmitter reliability was held at the NASA Lewis Research Center on September 25 and 26, 1979, to discuss present knowledge and to plan future research areas. Since formal papers were not submitted, this synopsis was derived from audio tapes of the workshop. The following subjects were covered: users' experience with space transmitters; cathodes; power supplies and interfaces; and specifications and quality assurance. A panel discussion ended the workshop.

  2. ATLAS reliability analysis

    SciTech Connect

    Bartsch, R.R.

    1995-09-01

    Key elements of the 36 MJ ATLAS capacitor bank have been evaluated for individual probabilities of failure. These have been combined to estimate system reliability which is to be greater than 95% on each experimental shot. This analysis utilizes Weibull or Weibull-like distributions with increasing probability of failure with the number of shots. For transmission line insulation, a minimum thickness is obtained and for the railgaps, a method for obtaining a maintenance interval from forthcoming life tests is suggested.
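
    A minimal sketch of the kind of combination described: per-component Weibull reliabilities as a function of shot number, multiplied across all components to estimate system reliability. The parameter values and component counts below are hypothetical placeholders, not ATLAS data.

        import math

        # Hypothetical Weibull scale/shape parameters and counts per component class
        components = {
            "capacitor":  (2.0e5, 1.3, 612),
            "railgap":    (5.0e4, 1.8, 153),
            "insulation": (1.0e4, 2.0, 36),
        }

        def weibull_reliability(n_shots, eta, beta):
            return math.exp(-((n_shots / eta) ** beta))

        def system_reliability(n_shots):
            r = 1.0
            for eta, beta, count in components.values():
                r *= weibull_reliability(n_shots, eta, beta) ** count
            return r

        print(f"R(shot 100) = {system_reliability(100):.3f}")  # target: > 0.95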

  3. Compact, Reliable EEPROM Controller

    NASA Technical Reports Server (NTRS)

    Katz, Richard; Kleyner, Igor

    2010-01-01

    A compact, reliable controller for an electrically erasable, programmable read-only memory (EEPROM) has been developed specifically for a space-flight application. The design may be adaptable to other applications in which there are requirements for reliability in general and, in particular, for prevention of inadvertent writing of data in EEPROM cells. Inadvertent writes pose risks of loss of reliability in the original space-flight application and could pose such risks in other applications. Prior EEPROM controllers are large and complex and do not provide all reasonable protections (in many cases, few or no protections) against inadvertent writes. In contrast, the present controller provides several layers of protection against inadvertent writes. The controller also incorporates a write-time monitor, enabling determination of trends in the performance of an EEPROM through all phases of testing. The controller has been designed as an integral subsystem of a system that includes not only the controller and the controlled EEPROM aboard a spacecraft but also computers in a ground control station, relatively simple onboard support circuitry, and an onboard communication subsystem that utilizes the MIL-STD-1553B protocol. (MIL-STD-1553B is a military standard that encompasses a method of communication and electrical-interface requirements for digital electronic subsystems connected to a data bus. MIL-STD-1553B is commonly used in defense and space applications.) The intent was to maximize reliability while minimizing the size and complexity of onboard circuitry. In operation, control of the EEPROM is effected via the ground computers, the MIL-STD-1553B communication subsystem, and the onboard support circuitry, all of which, in combination, provide the multiple layers of protection against inadvertent writes. There is no controller software, unlike in many prior EEPROM controllers; software can be a major contributor to unreliability, particularly in fault

  4. Software reliability studies

    NASA Technical Reports Server (NTRS)

    Hoppa, Mary Ann; Wilson, Larry W.

    1994-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Our research has shown that by improving the quality of the data one can greatly improve the predictions. We are working on methodologies which control some of the randomness inherent in the standard data generation processes in order to improve the accuracy of predictions. Our contribution is twofold in that we describe an experimental methodology using a data structure called the debugging graph and apply this methodology to assess the robustness of existing models. The debugging graph is used to analyze the effects of various fault recovery orders on the predictive accuracy of several well-known software reliability algorithms. We found that, along a particular debugging path in the graph, the predictive performance of different models can vary greatly. Similarly, just because a model 'fits' a given path's data well does not guarantee that the model would perform well on a different path. Further we observed bug interactions and noted their potential effects on the predictive process. We saw that not only do different faults fail at different rates, but that those rates can be affected by the particular debugging stage at which the rates are evaluated. Based on our experiment, we conjecture that the accuracy of a reliability prediction is affected by the fault recovery order as well as by fault interaction.

  5. Infrared measurements of organic radical anions in solution using mid-infrared optical fibers and spectral analyses based on density functional theory calculations

    NASA Astrophysics Data System (ADS)

    Sakamoto, Akira; Kuroda, Masahito; Harada, Tomohisa; Tasumi, Mitsuo

    2005-02-01

    By using ATR and transmission probes combined with bundles of mid-infrared optical fibers, high-quality infrared spectra are observed for the radical anions of biphenyl and naphthalene in deuterated tetrahydrofuran solutions. The ATR and transmission probes can be inserted into a glass-tube cell with O-rings under vacuum. Organic radical anions prepared separately in a vacuum system are transferred into the cell for infrared absorption measurements. Observed infrared spectra are in good agreement with those calculated by density functional theory. The origin of the strong infrared absorption intensities characteristic of the radical anions is discussed in terms of changes in electronic structures induced by specific normal vibrations (electron-molecular vibration interaction).

  6. General Aviation Aircraft Reliability Study

    NASA Technical Reports Server (NTRS)

    Pettit, Duane; Turnbull, Andrew; Roelant, Henk A. (Technical Monitor)

    2001-01-01

    This reliability study was performed in order to provide the aviation community with an estimate of Complex General Aviation (GA) Aircraft System reliability. To successfully improve the safety and reliability for the next generation of GA aircraft, a study of current GA aircraft attributes was prudent. This was accomplished by benchmarking the reliability of operational Complex GA Aircraft Systems. Specifically, Complex GA Aircraft System reliability was estimated using data obtained from the logbooks of a random sample of the Complex GA Aircraft population.

  7. Age-related transcriptional changes in gene expression in different organs of mice support the metabolic stability theory of aging.

    PubMed

    Brink, Thore C; Demetrius, Lloyd; Lehrach, Hans; Adjaye, James

    2009-10-01

    Individual differences in the rate of aging are determined by the efficiency with which an organism transforms resources into metabolic energy thus maintaining the homeostatic condition of its cells and tissues. This observation has been integrated with analytical studies of the metabolic process to derive the following principle: The metabolic stability of regulatory networks, that is the ability of cells to maintain stable concentrations of reactive oxygen species (ROS) and other critical metabolites is the prime determinant of life span. The metabolic stability of a regulatory network is determined by the diversity of the metabolic pathways or the degree of connectivity of genes in the network. These properties can be empirically evaluated in terms of transcriptional changes in gene expression. We use microarrays to investigate the age-dependence of transcriptional changes of genes in the insulin signaling, oxidative phosphorylation and glutathione metabolism pathways in mice. Our studies delineate age and tissue specific patterns of transcriptional changes which are consistent with the metabolic stability-longevity principle. This study, in addition, rejects the free radical hypothesis which postulates that the production rate of ROS, and not its stability, determines life span.

  8. Exploring the sodium storage mechanism in disodium terephthalate as anode for organic battery using density-functional theory calculations

    NASA Astrophysics Data System (ADS)

    Sk, Mahasin Alam; Manzhos, Sergei

    2016-08-01

    We present an ab initio study of the sodium storage mechanism in disodium terephthalate (Na2TP), a very promising anode material for organic sodium (Na)-ion batteries with reported experimental capacities of ∼255 mAh g-1, previously attributed to Na attachment to the two carboxylate groups (coordination to oxygen atoms). We show here that the inserted Na atoms prefer to bind at carboxylate sites at low Na concentrations, and these sites are dominant for insertion of up to one Na atom per molecule; for higher Na concentrations, the hexagonal sites (on the aromatic ring) become dominant. We confirm that the Na2TP crystal can store a maximum of two Na atoms per molecule, as observed in experiments. These results are intriguing, as they reveal that Na binding at carboxylate sites contributes to the initial part of the Na2TP sodiation curve and Na binding at hexagonal sites contributes to the second part of the curve. The inserted Na atoms donate electrons to empty states in the conduction band. Moreover, we show that the Na diffusion barriers in clean Na2TP can be as low as 0.23 eV. We also show that there is a significant difference in the mechanism of Na interaction between individual molecules and the crystal.
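
    To gauge what a 0.23 eV barrier implies for Na transport, a standard Arrhenius hop-rate estimate at room temperature (assuming a typical attempt frequency ν ≈ 10¹³ s⁻¹, which is not computed in the paper):

        \[
        \Gamma = \nu\, e^{-E_a/k_B T}
        \approx 10^{13}\ \mathrm{s^{-1}} \times e^{-0.23/0.0257}
        \approx 1\times10^{9}\ \mathrm{s^{-1}}
        \qquad (T = 298\ \mathrm{K})
        \]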

  9. Density Functional Theory Study of Hydrogen Adsorption in a Ti-Decorated Mg-Based Metal-Organic Framework-74.

    PubMed

    Suksaengrat, Pitphichaya; Amornkitbamrung, Vittaya; Srepusharawoot, Pornjuk; Ahuja, Rajeev

    2016-03-16

    The Ti-binding energy and hydrogen adsorption energy of a Ti-decorated Mg-based metal-organic framework-74 (Mg-MOF-74) were evaluated by using first-principles calculations. Our results revealed that only three Ti adsorption sites are stable, of which the site near the metal oxide unit is the most stable. To investigate the hydrogen-adsorption properties of Ti-functionalized Mg-MOF-74, the hydrogen-binding energy was determined. For the most stable Ti adsorption site, we found that the hydrogen adsorption energy ranged from 0.26 to 0.48 eV per H2, which is within the desirable range for practical hydrogen-storage applications. Moreover, the hydrogen capacity was determined by using ab initio molecular dynamics simulations. Our results revealed that the hydrogen uptakes by Ti-decorated Mg-MOF-74 at temperatures of 77, 150, and 298 K and ambient pressure were 1.81, 1.74, and 1.29 wt % H2, respectively. PMID:26717417

  10. Reliability sensitivity-based correlation coefficient calculation in structural reliability analysis

    NASA Astrophysics Data System (ADS)

    Yang, Zhou; Zhang, Yimin; Zhang, Xufang; Huang, Xianzhen

    2012-05-01

    The correlation coefficients of the random variables of mechanical structures are generally chosen from experience or even ignored, which cannot actually reflect the effects of parameter uncertainties on reliability. To address the selection of correlation coefficients from the reliability-based sensitivity point of view, the theoretical principle of the problem is established from the results of the reliability sensitivity analysis, and a criterion for correlation among random variables is given. The values of the correlation coefficients are obtained according to the proposed principle, and the reliability sensitivity problem is discussed. Numerical studies show the following results: (1) if the sensitivity of the reliability to the correlation coefficient ρ is below a small threshold (on the order of 0.00001), the correlation can be ignored, which simplifies the procedure without introducing additional error; (2) if the difference between ρ_s, the coefficient to which the reliability is most sensitive, and ρ_R, the coefficient yielding the smallest reliability, is less than 0.001, ρ_s is suggested for modeling the dependency of the random variables, which ensures robust system quality without loss of the safety requirement; (3) when |E_abs| > 0.001 and |E_rel| > 0.001, ρ_R should be employed to quantify the correlation among random variables in order to ensure the accuracy of the reliability analysis. The proposed approach provides a practical routine for mechanical design and manufacturing to study the reliability and reliability-based sensitivity of basic design variables in mechanical reliability analysis and design.
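
    The three-part criterion above reduces to a compact decision rule. The sketch below is a hypothetical illustration rather than the authors' code: the names sensitivity, rho_s, rho_R, e_abs, and e_rel mirror the quantities named in the abstract, and the thresholds are those quoted there.

    ```python
    # Hypothetical decision rule mirroring the abstract's criterion.
    def select_correlation(sensitivity, rho_s, rho_R, e_abs, e_rel,
                           sens_tol=0.00001, diff_tol=0.001):
        """Pick the correlation coefficient to carry into a reliability
        analysis; inputs come from a prior reliability-sensitivity study."""
        if abs(sensitivity) < sens_tol:
            return 0.0    # rule (1): correlation negligible, ignore it
        if abs(rho_s - rho_R) < diff_tol:
            return rho_s  # rule (2): robust choice without loss of safety
        if abs(e_abs) > diff_tol and abs(e_rel) > diff_tol:
            return rho_R  # rule (3): conservative, accuracy-preserving
        return rho_R      # otherwise fall back to the conservative value
    ```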

  11. Reliable predictions of waste performance in a geologic repository

    SciTech Connect

    Pigford, T.H.; Chambre, P.L.

    1985-08-01

    Establishing reliable estimates of the long-term performance of a waste repository requires emphasis upon valid theories to predict performance. Predicting the rates at which radionuclides are released from waste packages cannot rest upon empirical extrapolations of laboratory leach data. Reliable predictions can be based on simple bounding theoretical models, such as solubility-limited bulk flow, if the assumed parameters are reliably known or defensibly conservative. Wherever possible, performance analysis should proceed beyond simple bounding calculations to obtain more realistic - and usually more favorable - estimates of expected performance. The desire for greater realism must be balanced against increasing uncertainties in prediction and loss of reliability. Theoretical predictions of release rate based on mass-transfer analysis are bounding, and the theory can be verified. Postulated repository analogues intended to simulate laboratory leach experiments introduce arbitrary and fictitious repository parameters and are shown not to agree with well-established theory. 34 refs., 3 figs., 2 tabs.
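
    As a concrete illustration of a simple bounding model of the solubility-limited bulk-flow type, the sketch below caps a nuclide's release rate by the saturation concentration carried away in the groundwater flow; the function name and all numbers are assumptions for illustration, not values from the paper.

    ```python
    # Hypothetical solubility-limited bulk-flow bound: the release rate of
    # a nuclide cannot exceed its saturation concentration carried away by
    # the groundwater flow. All numbers are illustrative.

    def bounding_release_rate(c_sat_g_per_m3, darcy_flux_m_per_yr, area_m2):
        """Upper-bound release rate (g/yr) past one waste package."""
        return c_sat_g_per_m3 * darcy_flux_m_per_yr * area_m2

    # Saturation 1e-3 g/m^3, flux 1 m/yr, package cross-section 10 m^2:
    print(bounding_release_rate(1e-3, 1.0, 10.0))  # 0.01 g/yr
    ```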

  12. Ultimately Reliable Pyrotechnic Systems

    NASA Technical Reports Server (NTRS)

    Scott, John H.; Hinkel, Todd

    2015-01-01

    This paper presents the methods by which NASA has designed, built, tested, and certified pyrotechnic devices for high-reliability operation in extreme environments, and it illustrates the potential applications in the oil and gas industry. NASA's extremely successful application of pyrotechnics is built upon documented procedures and test methods that have been maintained and developed since the Apollo Program. Standards are managed and rigorously enforced for performance margins, redundancy, lot sampling, and personnel safety. The pyrotechnics utilized in spacecraft include such devices as small initiators and detonators with the power of a shotgun shell, detonating cord systems for explosive energy transfer across many feet, precision linear shaped charges for breaking structural membranes, and booster charges to actuate valves and pistons. NASA's pyrotechnics program is one of the most successful in the history of human spaceflight: no pyrotechnic device developed in accordance with NASA's Human Spaceflight standards has ever failed in flight use. NASA's pyrotechnic initiators work reliably at temperatures as low as -420 F. Each of the 135 Space Shuttle flights fired 102 of these initiators, some setting off multiple pyrotechnic devices, without a single failure. The recent landing on Mars of the Curiosity rover fired 174 of NASA's pyrotechnic initiators to complete the famous '7 minutes of terror'; even after traveling through extreme radiation and thermal environments on the way to Mars, every one of them worked. These initiators have also fired on the surface of Titan. NASA's design controls, procedures, and processes produce the most reliable pyrotechnics in the world. Application of pyrotechnics designed and procured in this manner could enable the energy industry's emergency equipment, such as shutoff valves and deep-sea blowout preventers, to be left in place for years in extreme environments and still be relied upon to function when needed, thus greatly enhancing

  13. Blade reliability collaborative

    SciTech Connect

    Ashwill, Thomas D.; Ogilvie, Alistair B.; Paquette, Joshua A.

    2013-04-01

    The Blade Reliability Collaborative (BRC) was started by the Wind Energy Technologies Department of Sandia National Laboratories and DOE in 2010 with the goal of gaining insight into planned and unplanned O&M issues associated with wind turbine blades. A significant part of BRC is the Blade Defect, Damage and Repair Survey task, which will gather data from blade manufacturers, service companies, operators and prior studies to determine details about the largest sources of blade unreliability. This report summarizes the initial findings from this work.

  14. Why Generalizability Theory Yields Better Results than Classical Test Theory.

    ERIC Educational Resources Information Center

    Eason, Sandra H.

    Generalizability theory provides a technique for accurately estimating the reliability of measurements. The power of this theory is based on the simultaneous analysis of multiple sources of error variances. Equally important, generalizability theory considers relationships among the sources of measurement error. Just as multivariate inferential…
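
    To make the variance-components idea concrete, the sketch below computes a relative generalizability coefficient for a simple persons-by-items design; the variance components and item counts are hypothetical, and in practice they would be estimated from a variance-components (ANOVA) analysis of the measurement data.

    ```python
    # Illustrative relative G coefficient for a persons-by-items design.
    def g_coefficient(var_person, var_interaction_error, n_items):
        """Person variance over person variance plus the averaged
        person-by-item (plus residual error) variance."""
        return var_person / (var_person + var_interaction_error / n_items)

    # Doubling the number of items raises the estimated reliability:
    print(g_coefficient(0.5, 0.8, 10))  # ~0.86
    print(g_coefficient(0.5, 0.8, 20))  # ~0.93
    ```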

  15. Fault Tree Reliability Analysis and Design-for-reliability

    1998-05-05

    WinR provides a fault tree analysis capability for performing systems reliability and design-for-reliability analyses. The package includes capabilities for sensitivity and uncertainty analysis, field failure data analysis, and optimization.

  16. On Component Reliability and System Reliability for Space Missions

    NASA Technical Reports Server (NTRS)

    Chen, Yuan; Gillespie, Amanda M.; Monaghan, Mark W.; Sampson, Michael J.; Hodson, Robert F.

    2012-01-01

    This paper addresses the basics of, the limitations of, and the relationship between component reliability and system reliability through a study of flight computing architectures and related avionics components for future NASA missions. Component reliability analysis and system reliability analysis need to be evaluated at the same time, and the limitations of each analysis and the relationship between the two need to be understood.

  17. Improve relief valve reliability

    SciTech Connect

    Nelson, W.E.

    1993-01-01

    This paper reports on the careful evaluation of safety relief valves and their service conditions, which can improve reliability and permit more time between tests. Factors that aid in getting long-run results include: use of valves suitable for the service; attention to the design of the relieving system (including the use of block valves); and close attention to repair procedures. Use these procedures for each installation, applying good engineering practices. The Clean Air Act of 1990 and other legislation limiting allowable fugitive emissions in hydrocarbon processing plants will greatly impact safety relief valve installations. The normal leakage rate from a relief valve will require that it be connected to a closed vent system tied to a recovery or control device. Tying the outlet of an existing valve into a header system can cause accelerated corrosion and operating difficulties. The reliability of many existing safety relief valves may be compromised when they are connected to an outlet header without following good engineering practices. The law has been enacted, but not all of the rules have been promulgated.

  18. Integrated circuit reliability testing

    NASA Technical Reports Server (NTRS)

    Buehler, Martin G. (Inventor); Sayah, Hoshyar R. (Inventor)

    1990-01-01

    A technique is described for use in determining the reliability of microscopic conductors deposited on an uneven surface of an integrated circuit device. A wafer containing integrated circuit chips is formed with a test area having regions of different heights. At the time the conductors are formed on the chip areas of the wafer, an elongated serpentine assay conductor is deposited on the test area so the assay conductor extends over multiple steps between regions of different heights. Also, a first test conductor is deposited in the test area upon a uniform region of first height, and a second test conductor is deposited in the test area upon a uniform region of second height. The occurrence of high resistances at the steps between regions of different height is indicated by deriving the measured length of the serpentine conductor using the resistance measured between the ends of the serpentine conductor, and comparing that to the design length of the serpentine conductor. The percentage by which the measured length exceeds the design length, at which the integrated circuit will be discarded, depends on the required reliability of the integrated circuit.
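
    The pass/fail computation described here reduces to a few lines: infer the electrical length of the serpentine conductor from its end-to-end resistance and compare it with the design length. The sheet resistance, linewidth, and allowed excess in the sketch below are hypothetical placeholders, since the patent ties the threshold to the required reliability.

    ```python
    # Hypothetical numbers throughout; the patent does not specify values.
    def inferred_length(r_measured_ohm, r_sheet_ohm_per_sq, width_um):
        """Electrical length implied by the end-to-end resistance of the
        serpentine assay conductor (number of squares times linewidth)."""
        return (r_measured_ohm / r_sheet_ohm_per_sq) * width_um

    def fails_step_coverage(r_measured, r_sheet, width_um, design_len_um,
                            max_excess=0.10):
        """Discard the circuit if the inferred length exceeds the design
        length by more than the allowed fraction."""
        excess = inferred_length(r_measured, r_sheet, width_um) / design_len_um - 1.0
        return excess > max_excess

    print(fails_step_coverage(5_800.0, 0.05, 2.0, 200_000.0))  # True: thinned steps
    ```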

  19. Integrated circuit reliability testing

    NASA Technical Reports Server (NTRS)

    Buehler, Martin G. (Inventor); Sayah, Hoshyar R. (Inventor)

    1988-01-01

    A technique is described for use in determining the reliability of microscopic conductors deposited on an uneven surface of an integrated circuit device. A wafer containing integrated circuit chips is formed with a test area having regions of different heights. At the time the conductors are formed on the chip areas of the wafer, an elongated serpentine assay conductor is deposited on the test area so the assay conductor extends over multiple steps between regions of different heights. Also, a first test conductor is deposited in the test area upon a uniform region of first height, and a second test conductor is deposited in the test area upon a uniform region of second height. The occurrence of high resistances at the steps between regions of different height is indicated by deriving the measured length of the serpentine conductor using the resistance measured between the ends of the serpentine conductor, and comparing that to the design length of the serpentine conductor. The percentage by which the measured length exceeds the design length, at which the integrated circuit will be discarded, depends on the required reliability of the integrated circuit.

  1. Reliable Entanglement Verification

    NASA Astrophysics Data System (ADS)

    Arrazola, Juan; Gittsovich, Oleg; Donohue, John; Lavoie, Jonathan; Resch, Kevin; Lütkenhaus, Norbert

    2013-05-01

    Entanglement plays a central role in quantum protocols, so it is important to be able to verify the presence of entanglement in physical systems from experimental data. In the evaluation of these data, the proper treatment of statistical effects requires special attention, as one can never claim to have verified the presence of entanglement with certainty. Recently, increased attention has been paid to the development of proper frameworks for posing and answering these types of questions. In this work, we apply recent results by Christandl and Renner on reliable quantum state tomography to construct a reliable entanglement verification procedure based on the concept of confidence regions. The statements made do not require the specification of a prior distribution nor the assumption of an independent and identically distributed (i.i.d.) source of states. Moreover, we develop the efficient numerical tools necessary to employ this approach in practice, rendering the procedure ready to be employed in current experiments. We demonstrate this by analyzing data from an experiment in which entangled two-photon states were generated and their entanglement verified with an accessible nonlinear witness.

  2. Load Control System Reliability

    SciTech Connect

    Trudnowski, Daniel

    2015-04-03

    This report summarizes the results of the Load Control System Reliability project (DOE Award DE-FC26-06NT42750). The original grant was awarded to Montana Tech in April 2006. Follow-on DOE awards and expansions of the project scope occurred in August 2007, January 2009, April 2011, and April 2013. In addition to the DOE monies, the project also included matching funds from the states of Montana and Wyoming. Project participants included Montana Tech, the University of Wyoming, Montana State University, NorthWestern Energy, Inc., and MSE. Research focused on two areas: real-time power-system load control methodologies, and power-system measurement-based stability-assessment operation and control tools. The majority of the effort was focused on area 2. Results from the research include: development of fundamental power-system dynamic concepts, control schemes, and signal-processing algorithms; many papers (including two prize papers) in leading journals and conferences and leadership of IEEE activities; one patent; participation in major actual-system testing in the western North American power system; prototype power-system operation and control software installed and tested at three major North American control centers; and the incubation of a new commercial-grade operation and control software tool. Work under this grant certainly supported the DOE-OE goals in the area of “Real Time Grid Reliability Management.”

  3. 78 FR 18817 - Revisions to Reliability Standard for Transmission Vegetation Management

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-28

    ...Under section 215 of the Federal Power Act (FPA), the Federal Energy Regulatory Commission (Commission) approves Reliability Standard FAC-003-2 (Transmission Vegetation Management), submitted to the Commission for approval by the North American Electric Reliability Corporation (NERC), the Commission-certified Electric Reliability Organization. Reliability Standard FAC-003-2 expands the......

  4. Understanding the Elements of Operational Reliability: A Key for Achieving High Reliability

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.

    2010-01-01

    This viewgraph presentation reviews operational reliability and its role in achieving high reliability through design and process reliability. The topics include: 1) Reliability Engineering Major Areas and interfaces; 2) Design Reliability; 3) Process Reliability; and 4) Reliability Applications.

  5. Ionization Energies and Aqueous Redox Potentials of Organic Molecules: Comparison of DFT, Correlated ab Initio Theory and Pair Natural Orbital Approaches.

    PubMed

    Isegawa, Miho; Neese, Frank; Pantazis, Dimitrios A

    2016-05-10

    The calculation of redox potentials involves large energetic terms arising from gas-phase ionization energies, thermodynamic contributions, and solvation energies of the reduced and oxidized species. In this work we study the performance of a wide range of wave function and density functional theory methods for the prediction of ionization energies and aqueous one-electron oxidation potentials of a set of 19 organic molecules. Emphasis is placed on evaluating methods that employ the computationally efficient local pair natural orbital (LPNO) approach, as well as several implementations of coupled cluster theory and explicitly correlated F12 methods. The electronic energies are combined with implicit solvation models for the solvation energies. With the exception of MP2 and its variants, which suffer from enormous errors arising at least partially from the poor Hartree-Fock reference, ionization energies can be systematically predicted with average errors below 0.1 eV for most of the correlated wave-function-based methods studied here, provided basis set extrapolation is performed. LPNO methods are the most efficient way to achieve this level of accuracy. DFT methods show larger errors in general and suffer from inconsistent behavior; the only exception is the M06-2X functional, which is found to be competitive with the best LPNO-based approaches for ionization energies. Importantly, the limiting factor for the calculation of accurate redox potentials is the solvation energy. The errors in the solvation energies predicted by all continuum solvation models tested in this work dominate the final computed reduction potential, resulting in average errors typically in excess of 0.3 V and hence obscuring the gains that arise from choosing a more accurate electronic structure method.
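
    For orientation, the sketch below traces the thermodynamic cycle that underlies such calculations: a gas-phase ionization energy plus the difference in solvation free energies, referenced to the absolute potential of the standard hydrogen electrode (a commonly used value of 4.44 V). All numeric inputs are placeholders, not data from the paper.

    ```python
    # Thermodynamic-cycle sketch for a one-electron oxidation potential.
    def oxidation_potential(ie_gas_ev, dG_solv_ox_ev, dG_solv_red_ev,
                            she_abs_v=4.44):
        """Aqueous oxidation potential (V vs SHE) from the gas-phase
        ionization energy and the solvation free energies of the oxidized
        and reduced species (all in eV; one electron, so eV maps to V)."""
        return ie_gas_ev + dG_solv_ox_ev - dG_solv_red_ev - she_abs_v

    # A 0.3 eV error in a solvation term shifts the potential by 0.3 V,
    # which is why solvation dominates the final accuracy:
    print(oxidation_potential(7.8, -2.1, -0.3))  # ~1.56 V
    ```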

  6. Testing for PV Reliability (Presentation)

    SciTech Connect

    Kurtz, S.; Bansal, S.

    2014-09-01

    The DOE SUNSHOT workshop is seeking input from the community about PV reliability and how the DOE might address gaps in understanding. This presentation describes the types of testing that are needed for PV reliability and introduces a discussion to identify gaps in our understanding of PV reliability testing.

  7. Factor reliability into load management

    SciTech Connect

    Feight, G.R.

    1983-07-01

    Hardware reliability is a major factor to consider when selecting a direct-load-control system. The author outlines a method of estimating present-value costs associated with system reliability. He points out that small differences in receiver reliability make a significant difference in owning cost. 4 figures.
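
    The abstract does not reproduce the author's present-value method, so the sketch below shows only the generic shape of such a calculation: expected annual repair costs from a constant failure rate, discounted to present value. All figures are invented for illustration.

    ```python
    # Generic present-value comparison; the failure and cost model here is
    # invented for illustration and is not the author's.

    def pv_failure_cost(n_units, annual_fail_rate, repair_cost,
                        years, discount_rate):
        """Present value of expected repair costs for a receiver fleet
        with a constant annual failure rate."""
        annual_cost = n_units * annual_fail_rate * repair_cost
        return sum(annual_cost / (1.0 + discount_rate) ** t
                   for t in range(1, years + 1))

    # A one-point difference in annual failure rate, 10,000 receivers,
    # 15 years at a 7% discount rate:
    print(pv_failure_cost(10_000, 0.02, 80.0, 15, 0.07))  # ~$145,700
    print(pv_failure_cost(10_000, 0.03, 80.0, 15, 0.07))  # ~$218,600
    ```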

  8. Optimum Reliability of Gain Scores.

    ERIC Educational Resources Information Center

    Sharma, K. K.; Gupta, J. K.

    1986-01-01

    This paper gives a mathematical treatment to findings of Zimmerman and Williams and establishes a minimum reliability for gain scores when the pretest and posttest have equal reliabilities and equal standard deviations. It discusses the behavior of the reliability of gain scores in terms of variations in other test parameters. (Author/LMO)
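
    The behavior described here can be reproduced with the standard classical-test-theory formula for gain-score reliability when pretest and posttest have equal reliabilities and equal standard deviations; the sketch below implements that textbook formula, not necessarily the authors' own derivation.

    ```python
    # Standard classical-test-theory result for the reliability of gain
    # (difference) scores with equal reliabilities and equal SDs.

    def gain_score_reliability(rho, r_pre_post):
        """rho: common reliability of pretest and posttest;
        r_pre_post: correlation between pretest and posttest scores."""
        return (rho - r_pre_post) / (1.0 - r_pre_post)

    # Gain-score reliability falls as the two tests correlate more highly:
    print(gain_score_reliability(0.8, 0.3))  # ~0.71
    print(gain_score_reliability(0.8, 0.7))  # ~0.33
    ```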

  9. Resource based view: a promising new theory for healthcare organizations: Comment on "Resource based view of the firm as a theoretical lens on the organisational consequences of quality improvement".

    PubMed

    Ferlie, Ewan

    2014-11-01

    This commentary reviews a recent piece by Burton and Rycroft-Malone on the use of the Resource Based View (RBV) in healthcare organizations. It first outlines the core content of their piece and then discusses their attempt to extend RBV to the analysis of large-scale quality improvement efforts in healthcare. Some critique is elaborated. The broader question of why RBV seems to be migrating into healthcare management research is considered. The commentary concludes that RBV is a promising new theory for healthcare organizations.

  10. Origins of life: a comparison of theories and application to Mars

    NASA Technical Reports Server (NTRS)

    Davis, W. L.; McKay, C. P.

    1996-01-01

    The field of study that deals with the origins of life does not have a consensus for a theory of life's origin. An analysis of the range of theories offered shows that they share some common features that may be reliable predictors when considering the possible origins of life on another planet. The fundamental datum dealing with the origins of life is that life appeared early in the history of the Earth, probably before 3.5 Ga and possibly before 3.8 Ga. What might be called the standard theory (the Oparin-Haldane theory) posits the production of organic molecules on the early Earth followed by chemical reactions that produced increased organic complexity leading eventually to organic life capable of reproduction, mutation, and selection using organic material as nutrients. A distinct class of other theories (panspermia theories) suggests that life was carried to Earth from elsewhere--these theories receive some support from recent work on planetary impact processes. Other alternatives to the standard model suggest that life arose as an inorganic (clay) form and/or that the initial energy source was not organic material but chemical energy or sunlight. We find that the entire range of current theories suggests that liquid water is the quintessential environmental criterion for both the origin and sustenance of life. It is therefore of interest that during the time that life appeared on Earth we have evidence for liquid water present on the surface of Mars.

  11. Origins of life: a comparison of theories and application to Mars.

    PubMed

    Davis, W L; McKay, C P

    1996-02-01

    The field of study that deals with the origins of life does not have a consensus for a theory of life's origin. An analysis of the range of theories offered shows that they share some common features that may be reliable predictors when considering the possible origins of life on another planet. The fundamental datum dealing with the origins of life is that life appeared early in the history of the Earth, probably before 3.5 Ga and possibly before 3.8 Ga. What might be called the standard theory (the Oparin-Haldane theory) posits the production of organic molecules on the early Earth followed by chemical reactions that produced increased organic complexity leading eventually to organic life capable of reproduction, mutation, and selection using organic material as nutrients. A distinct class of other theories (panspermia theories) suggests that life was carried to Earth from elsewhere--these theories receive some support from recent work on planetary impact processes. Other alternatives to the standard model suggest that life arose as an inorganic (clay) form and/or that the initial energy source was not organic material but chemical energy or sunlight. We find that the entire range of current theories suggests that liquid water is the quintessential environmental criterion for both the origin and sustenance of life. It is therefore of interest that during the time that life appeared on Earth we have evidence for liquid water present on the surface of Mars.

  12. Reliable software and communication 1: An overview

    NASA Astrophysics Data System (ADS)

    Chung, Fan R. K.

    1994-01-01

    We discuss the general state of affairs in a variety of related areas, ranging from software safety, reliability, and testing, through protocol specification and verification and network congestion control and reliability, to communication security and complexity. We intend to identify useful theory and tools, point out connections between different areas, and, to a large extent, raise a number of questions whose answers may still lie far beyond the limits of our current knowledge. Some of these areas are still in a very primitive state and call for new ideas, bold approaches, radical thinking, and perhaps extraordinary efforts. This paper is the overview of a report consisting of the following surveys in selected areas, two of which (A and B) also appear in this issue: A. Quality, Reliability, and Safety by Bob Horgan, Sid Dalal, and Jon Kettenring (1); B. Congestion control and network reliability by Brian Coan and Dan Heyman (2); C. Protocol specification and validation by Linda Ness (3); D. Security and correctness of computation by Stuart Haber (4).

  13. The origin of allometric scaling laws in biology from genomes to ecosystems: towards a quantitative unifying theory of biological structure and organization.

    PubMed

    West, Geoffrey B; Brown, James H

    2005-05-01

    Life is the most complex physical phenomenon in the Universe, manifesting an extraordinary diversity of form and function over an enormous scale, from the largest animals and plants to the smallest microbes and subcellular units. Despite this, many of its most fundamental and complex phenomena scale with size in a surprisingly simple fashion. For example, metabolic rate scales as the 3/4-power of mass over 27 orders of magnitude, from molecular and intracellular levels up to the largest organisms. Similarly, time-scales (such as lifespans and growth rates) and sizes (such as bacterial genome lengths, tree heights and mitochondrial densities) scale with exponents that are typically simple powers of 1/4. The universality and simplicity of these relationships suggest that fundamental universal principles underlie much of the coarse-grained generic structure and organisation of living systems. We have proposed a set of principles based on the observation that almost all life is sustained by hierarchical branching networks, which we assume have invariant terminal units, are space-filling and are optimised by the process of natural selection. We show how these general constraints explain quarter-power scaling and lead to a quantitative, predictive theory that captures many of the essential features of diverse biological systems. Examples considered include animal circulatory systems, plant vascular systems, growth, mitochondrial densities, and the concept of a universal molecular clock. Temperature considerations, dimensionality and the role of invariants are discussed. Criticisms and controversies associated with this approach are also addressed.
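
    A one-line numerical illustration of quarter-power scaling, with a hypothetical normalization constant b0:

    ```python
    # Under B = b0 * M**(3/4), a 10,000-fold range in body mass spans
    # only a 1,000-fold range in metabolic rate; b0 is a placeholder.

    def metabolic_rate(mass_kg, b0=3.4):
        return b0 * mass_kg ** 0.75

    print(metabolic_rate(10_000.0) / metabolic_rate(1.0))  # 1000.0
    ```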

  14. Benchmarking density functional theory predictions of framework structures and properties in a chemically diverse test set of metal-organic frameworks

    SciTech Connect

    Nazarian, Dalar; Ganesh, P.; Sholl, David S.

    2015-09-30

    We compiled a test set of chemically and topologically diverse Metal–Organic Frameworks (MOFs) with high-accuracy, experimentally derived crystallographic structure data. The test set was used to benchmark the performance of Density Functional Theory (DFT) functionals (M06L, PBE, PW91, PBE-D2, PBE-D3, and vdW-DF2) for predicting lattice parameters, unit cell volume, bonded parameters, and pore descriptors. On average, PBE-D2, PBE-D3, and vdW-DF2 predict more accurate structures, but all functionals predicted pore diameters within 0.5 Å of the experimental diameter for every MOF in the test set. The test set was also used to assess the variance in performance of DFT functionals for elastic properties and atomic partial charges. DFT-predicted elastic properties such as the minimum shear modulus and Young's modulus can differ by an average of 3 and 9 GPa, respectively, for rigid MOFs such as those in the test set. Moreover, we find that the partial charges calculated by vdW-DF2 deviate the most from the other functionals, while there is no significant difference between the partial charges calculated by M06L, PBE, PW91, PBE-D2, and PBE-D3 for the MOFs in the test set. We find that while there are differences in the magnitude of the properties predicted by the various functionals, these discrepancies are small compared to the accuracy necessary for most practical applications.

  15. Inverse modelling of Köhler theory - Part 1: A response surface analysis of CCN spectra with respect to surface-active organic species

    NASA Astrophysics Data System (ADS)

    Lowe, Samuel; Partridge, Daniel; Topping, David; Stier, Philip

    2016-04-01

    partitioning process. The response surface sensitivity analysis identifies the accumulation mode concentration and the surface tension as the most sensitive parameters. The organic:inorganic mass ratio, insoluble fraction, solution ideality, and the mean diameter and geometric standard deviation of the accumulation mode showed significant sensitivity, while the chemical properties of the organic exhibited little sensitivity within parametric uncertainties. Parameters such as surface tension and solution ideality can introduce considerable parametric uncertainty to models and are therefore particularly good candidates for further parameter calibration studies. A complete treatment of bulk-surface partitioning is found to model CCN spectra similar to those calculated using classical Köhler Theory with the surface tension of a pure water drop, as found in traditional sensitivity analysis studies. In addition, the sensitivity of CCN spectra to perturbations in the partitioning parameters K and Γ was found to be negligible. As a result, this study supports previously held recommendations that complex surfactant effects might be neglected, and continued use of classical Köhler Theory in GCMs is recommended to avoid additional computational burden.

  16. Adsorption of organic dyes on TiO2 surfaces in dye-sensitized solar cells: interplay of theory and experiment.

    PubMed

    Anselmi, Chiara; Mosconi, Edoardo; Pastore, Mariachiara; Ronca, Enrico; De Angelis, Filippo

    2012-12-14

    First-principles computer simulations can contribute to a deeper understanding of the dye/semiconductor interface lying at the heart of Dye-sensitized Solar Cells (DSCs). Here, we present the results of simulation of dye adsorption onto TiO2 surfaces, and of their implications for the functioning of the corresponding solar cells. We propose an integrated strategy which combines FT-IR measurements with DFT calculations to individuate the energetically favorable TiO2 adsorption mode of acetic acid, as a meaningful model for realistic organic dyes. Although we found a sizable variability in the relative stability of the considered adsorption modes with the model system and the method, a bridged bidentate structure was found to closely match the FT-IR frequency pattern, also being calculated as the most stable adsorption mode by calculations in solution. This adsorption mode was found to be the most stable binding also for realistic organic dyes bearing cyanoacrylic anchoring groups, while for a rhodanine-3-acetic acid anchoring group, an undissociated monodentate adsorption mode was found to be of comparable stability. The structural differences induced by the different anchoring groups were related to the different electron injection/recombination with oxidized dye properties which were experimentally assessed for the two classes of dyes. A stronger coupling and a possibly faster electron injection were also calculated for the bridged bidentate mode. We then investigated the adsorption mode and I2 binding of prototype organic dyes. Car-Parrinello molecular dynamics and geometry optimizations were performed for two coumarin dyes differing by the length of the π-bridge separating the donor and acceptor moieties. We related the decreasing distance of the carbonylic oxygen from the titania to an increased I2 concentration in proximity of the oxide surface, which might account for the different observed photovoltaic performances. The interplay between theory

  17. A Study of Birnbaum's Theory of the Relationship between the Constructs of Leadership and Organization as Depicted in His Higher Education Models of Organizational Functioning: A Contextual Leadership Paradigm for Higher Education

    ERIC Educational Resources Information Center

    Douglas, Pamela A.

    2013-01-01

    This quantitative, nonexperimental study used survey research design and nonparametric statistics to investigate Birnbaum's (1988) theory that there is a relationship between the constructs of leadership and organization, as depicted in his five higher education models of organizational functioning: bureaucratic, collegial, political,…

  18. Chemical Applications of Graph Theory: Part II. Isomer Enumeration.

    ERIC Educational Resources Information Center

    Hansen, Peter J.; Jurs, Peter C.

    1988-01-01

    Discusses the use of graph theory to aid in the depiction of organic molecular structures. Gives a historical perspective of graph theory and explains graph theory terminology with organic examples. Lists applications of graph theory to current research projects. (ML)

  19. High-Reliability Health Care: Getting There from Here

    PubMed Central

    Chassin, Mark R; Loeb, Jerod M

    2013-01-01

    Context Despite serious and widespread efforts to improve the quality of health care, many patients still suffer preventable harm every day. Hospitals find improvement difficult to sustain, and they suffer “project fatigue” because so many problems need attention. No hospitals or health systems have achieved consistent excellence throughout their institutions. High-reliability science is the study of organizations in industries like commercial aviation and nuclear power that operate under hazardous conditions while maintaining safety levels that are far better than those of health care. Adapting and applying the lessons of this science to health care offer the promise of enabling hospitals to reach levels of quality and safety that are comparable to those of the best high-reliability organizations. Methods We combined the Joint Commission's knowledge of health care organizations with knowledge from the published literature and from experts in high-reliability industries and leading safety scholars outside health care. We developed a conceptual and practical framework for assessing hospitals’ readiness for and progress toward high reliability. By iterative testing with hospital leaders, we refined the framework and, for each of its fourteen components, defined stages of maturity through which we believe hospitals must pass to reach high reliability. Findings We discovered that the ways that high-reliability organizations generate and maintain high levels of safety cannot be directly applied to today's hospitals. We defined a series of incremental changes that hospitals should undertake to progress toward high reliability. These changes involve the leadership's commitment to achieving zero patient harm, a fully functional culture of safety throughout the organization, and the widespread deployment of highly effective process improvement tools. Conclusions Hospitals can make substantial progress toward high reliability by undertaking several specific

  20. Nuclear weapon reliability evaluation methodology

    SciTech Connect

    Wright, D.L.

    1993-06-01

    This document provides an overview of the activities that are normally performed by Sandia National Laboratories to provide nuclear weapon reliability evaluations for the Department of Energy. These reliability evaluations are first provided as a prediction of the attainable stockpile reliability of a proposed weapon design. Stockpile reliability assessments are provided for each weapon type as the weapon is fielded and are continuously updated throughout the weapon's stockpile life. The reliability predictions and assessments depend heavily on data from both laboratory simulation and actual flight tests. An important part of the methodology is the set of review opportunities that occur throughout the entire process, which assure a consistent approach and appropriate use of the data for reliability evaluation purposes.

  1. Lithium battery safety and reliability

    NASA Astrophysics Data System (ADS)

    Levy, Samuel C.

    Lithium batteries have been used in a variety of applications for a number of years. As their use continues to grow, particularly in the consumer market, a greater emphasis needs to be placed on safety and reliability. A useful technique, known as fault tree analysis, can help in designing cells and batteries with a greater degree of safety and higher reliability; it can also be useful in determining the cause of unsafe behavior and poor reliability in existing designs.
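
    Fault tree analysis combines basic-event probabilities through AND and OR gates up to a top event. The sketch below is a generic illustration with an invented cell-venting tree, not a tree from the paper.

    ```python
    # Generic AND/OR gate evaluation with independent basic events.
    def or_gate(*probs):
        """P(at least one input event occurs), assuming independence."""
        p = 1.0
        for q in probs:
            p *= (1.0 - q)
        return 1.0 - p

    def and_gate(*probs):
        """P(all input events occur), assuming independence."""
        p = 1.0
        for q in probs:
            p *= q
        return p

    # Top event: cell vents if (overcharge AND vent blocked) OR internal short.
    print(or_gate(and_gate(1e-3, 1e-2), 1e-5))  # ~2.0e-5
    ```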

  2. Computerized life and reliability modelling for turboprop transmissions

    NASA Technical Reports Server (NTRS)

    Savage, M.; Radil, K. C.; Lewicki, D. G.; Coy, J. J.

    1988-01-01

    A generalized life and reliability model is presented for parallel shaft geared prop-fan and turboprop aircraft transmissions. The transmission life and reliability model is a combination of the individual reliability models for all the bearings and gears in the main load paths. The bearing and gear reliability models are based on classical fatigue theory and the two parameter Weibull failure distribution. A computer program was developed to calculate the transmission life and reliability. The program is modular. In its present form, the program can analyze five different transmission arrangements. However, the program can be modified easily to include additional transmission arrangements. An example is included which compares the life of a compound two-stage transmission with the life of a split-torque, parallel compound two-stage transmission as calculated by the computer program.
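
    The series-system structure of the model, combined with the two-parameter Weibull failure distribution, can be sketched as follows; the component shape parameters and characteristic lives below are placeholders, not the paper's data.

    ```python
    # Series-system life model with two-parameter Weibull components.
    import math

    def weibull_reliability(t, beta, eta):
        """Two-parameter Weibull survival probability at life t."""
        return math.exp(-((t / eta) ** beta))

    def system_reliability(t, components):
        """Series system: every bearing and gear in the load path must survive."""
        r = 1.0
        for beta, eta in components:
            r *= weibull_reliability(t, beta, eta)
        return r

    # Two bearings and one gear mesh: (shape, characteristic life in hours)
    parts = [(1.5, 8_000.0), (1.5, 8_000.0), (2.5, 12_000.0)]
    print(system_reliability(2_000.0, parts))  # ~0.77
    ```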

  3. Computerized life and reliability modelling for turboprop transmissions

    NASA Technical Reports Server (NTRS)

    Savage, M.; Radil, K. C.; Lewicki, D. G.; Coy, J. J.

    1988-01-01

    A generalized life and reliability model is presented for parallel shaft geared prop-fan and turboprop aircraft transmissions. The transmission life and reliability model is a combination of the individual reliability models for all the bearings and gears in the main load paths. The bearing and gear reliability models are based on classical fatigue theory and the two parameter Weibull failure distribution. A computer program was developed to calculate the transmission life and reliability. The program is modular. In its present form, the program can analyze five different transmission arrangements. However, the program can be modified easily to include additional transmission arrangements. An example is included which compares the life of a compound two-stage transmission with the life of a split-torque, parallel compound two-stage transmission, as calculated by the computer program.

  4. A fourth generation reliability predictor

    NASA Technical Reports Server (NTRS)

    Bavuso, Salvatore J.; Martensen, Anna L.

    1988-01-01

    A reliability/availability predictor computer program has been developed and is currently being beta-tested by over 30 US companies. The computer program is called the Hybrid Automated Reliability Predictor (HARP). HARP was developed to fill an important gap in reliability assessment capabilities. This gap was manifested through the use of its third-generation cousin, the Computer-Aided Reliability Estimation (CARE III) program, over a six-year development period and an additional three-year period during which CARE III has been in the public domain. The accumulated experience of the over 30 establishments now using CARE III was used in the development of the HARP program.

  5. INSTRUCTIONAL CONFERENCE ON THE THEORY OF STOCHASTIC PROCESSES: Some applications of the theory of martingales to statistics

    NASA Astrophysics Data System (ADS)

    Khmaladze, E. V.

    1982-12-01

    CONTENTS
    § 1. Introduction
    § 2. Martingale methods in the theory of testing hypotheses
    § 3. Martingale limit theorems in the theory of decomposable and similar statistics
    § 4. Martingale methods in reliability theory
    References

  6. Benchmarking density functional theory predictions of framework structures and properties in a chemically diverse test set of metal-organic frameworks

    DOE PAGES

    Nazarian, Dalar; Ganesh, P.; Sholl, David S.

    2015-09-30

    We compiled a test set of chemically and topologically diverse Metal–Organic Frameworks (MOFs) with high-accuracy, experimentally derived crystallographic structure data. The test set was used to benchmark the performance of Density Functional Theory (DFT) functionals (M06L, PBE, PW91, PBE-D2, PBE-D3, and vdW-DF2) for predicting lattice parameters, unit cell volume, bonded parameters, and pore descriptors. On average, PBE-D2, PBE-D3, and vdW-DF2 predict more accurate structures, but all functionals predicted pore diameters within 0.5 Å of the experimental diameter for every MOF in the test set. The test set was also used to assess the variance in performance of DFT functionals for elastic properties and atomic partial charges. DFT-predicted elastic properties such as the minimum shear modulus and Young's modulus can differ by an average of 3 and 9 GPa, respectively, for rigid MOFs such as those in the test set. Moreover, we find that the partial charges calculated by vdW-DF2 deviate the most from the other functionals, while there is no significant difference between the partial charges calculated by M06L, PBE, PW91, PBE-D2, and PBE-D3 for the MOFs in the test set. We find that while there are differences in the magnitude of the properties predicted by the various functionals, these discrepancies are small compared to the accuracy necessary for most practical applications.

  7. Integrating theory, synthesis, spectroscopy and device efficiency to design and characterize donor materials for organic photovoltaics: a case study including 12 donors

    SciTech Connect

    Oosterhout, S. D.; Kopidakis, N.; Owczarczyk, Z. R.; Braunecker, W. A.; Larsen, R. E.; Ratcliff, E. L.; Olson, D. C.

    2015-04-07

    Remarkable improvements in the power conversion efficiency of solution-processable Organic Photovoltaics (OPV) have largely been driven by the development of novel narrow bandgap copolymer donors comprising an electron-donating (D) and an electron-withdrawing (A) group within the repeat unit. The large pool of potential D and A units and the laborious processes of chemical synthesis and device optimization have made progress on new high efficiency materials slow, with a few new efficient copolymers reported every year despite the large number of groups pursuing these materials. In our paper we present an integrated approach toward new narrow bandgap copolymers that uses theory to guide the selection of materials to be synthesized based on their predicted energy levels, and time-resolved microwave conductivity (TRMC) to select the best-performing copolymer–fullerene bulk heterojunction to be incorporated into complete OPV devices. We validate our methodology by using a diverse group of 12 copolymers, including new and literature materials, to demonstrate good correlation between (a) theoretically determined energy levels of polymers and experimentally determined ionization energies and electron affinities and (b) photoconductance, measured by TRMC, and OPV device performance. The materials used here also allow us to explore whether further copolymer design rules need to be incorporated into our methodology for materials selection. For example, we explore the effect of the enthalpy change (ΔH) during exciton dissociation on the efficiency of free charge carrier generation and device efficiency and find that a ΔH of -0.4 eV is sufficient for efficient charge generation.

  8. Integrating theory, synthesis, spectroscopy and device efficiency to design and characterize donor materials for organic photovoltaics: a case study including 12 donors

    DOE PAGES

    Oosterhout, S. D.; Kopidakis, N.; Owczarczyk, Z. R.; Braunecker, W. A.; Larsen, R. E.; Ratcliff, E. L.; Olson, D. C.

    2015-04-07

    Remarkable improvements in the power conversion efficiency of solution-processable Organic Photovoltaics (OPV) have largely been driven by the development of novel narrow bandgap copolymer donors comprising an electron-donating (D) and an electron-withdrawing (A) group within the repeat unit. The large pool of potential D and A units and the laborious processes of chemical synthesis and device optimization have made progress on new high efficiency materials slow, with a few new efficient copolymers reported every year despite the large number of groups pursuing these materials. In our paper we present an integrated approach toward new narrow bandgap copolymers that uses theory to guide the selection of materials to be synthesized based on their predicted energy levels, and time-resolved microwave conductivity (TRMC) to select the best-performing copolymer–fullerene bulk heterojunction to be incorporated into complete OPV devices. We validate our methodology by using a diverse group of 12 copolymers, including new and literature materials, to demonstrate good correlation between (a) theoretically determined energy levels of polymers and experimentally determined ionization energies and electron affinities and (b) photoconductance, measured by TRMC, and OPV device performance. The materials used here also allow us to explore whether further copolymer design rules need to be incorporated into our methodology for materials selection. For example, we explore the effect of the enthalpy change (ΔH) during exciton dissociation on the efficiency of free charge carrier generation and device efficiency and find that a ΔH of -0.4 eV is sufficient for efficient charge generation.

  9. Stirling Convertor Fasteners Reliability Quantification

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Korovaichuk, Igor; Kovacevich, Tiodor; Schreiber, Jeffrey G.

    2006-01-01

    Onboard Radioisotope Power Systems (RPS) being developed for NASA's deep-space science and exploration missions require reliable operation for up to 14 years and beyond. Stirling power conversion is a candidate for use in an RPS because it offers a multifold increase in the conversion efficiency of heat to electric power and a reduced inventory of radioactive material. Structural fasteners are responsible for maintaining the structural integrity of the Stirling power convertor, which is critical to ensuring reliable performance during the entire mission. The design of fasteners involves variables related to fabrication, manufacturing, the material behavior of the fasteners and joined parts, the structural geometry of the joined components, the size and spacing of fasteners, mission loads, boundary conditions, etc. These variables have inherent uncertainties, which need to be accounted for in the reliability assessment. This paper describes these uncertainties along with a methodology to quantify the reliability, and provides results of the analysis in terms of quantified reliability and the sensitivity of Stirling power conversion reliability to the design variables. Quantification of the reliability includes both structural and functional aspects of the joined components. Based on the results, the paper also describes guidelines to improve reliability and verification testing.
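
    One generic way to propagate the design-variable uncertainties listed above is Monte Carlo sampling of a limit state. The sketch below uses an invented preload/capacity criterion and made-up distributions purely for illustration; it is not the paper's methodology.

    ```python
    # Invented limit state and distributions, for illustration only.
    import random

    def fastener_pof(n_samples=100_000, seed=1):
        rng = random.Random(seed)
        failures = 0
        for _ in range(n_samples):
            capacity = rng.gauss(50.0, 3.0)   # kN, joint capacity
            preload = rng.gauss(30.0, 2.0)    # kN, assembly preload
            service = rng.gauss(12.0, 2.5)    # kN, mission load share
            if preload + service > capacity:  # overload criterion
                failures += 1
        return failures / n_samples

    print(fastener_pof())  # probability of failure, roughly 0.03 here
    ```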

  10. Issues in Modeling System Reliability

    NASA Astrophysics Data System (ADS)

    Cruse, Thomas A.; Annis, Chuck; Booker, Jane; Robinson, David; Sues, Rob

    2002-10-01

    This paper discusses various issues in modeling system reliability. The topics include: 1) Statistical formalisms versus pragmatic numerics; 2) Language; 3) Statistical methods versus reliability-based design methods; 4) Professional bias; and 5) Real issues that need to be identified and resolved prior to certifying designs. This paper is in viewgraph form.

  11. Avionics design for reliability bibliography

    NASA Technical Reports Server (NTRS)

    1976-01-01

    A bibliography with abstracts was presented in support of AGARD lecture series No. 81. The following areas were covered: (1) program management, (2) design for high reliability, (3) selection of components and parts, (4) environment consideration, (5) reliable packaging, (6) life cycle cost, and (7) case histories.

  12. Photovoltaic performance and reliability workshop

    SciTech Connect

    Mrig, L.

    1993-12-01

    This workshop was the sixth in a series of workshops sponsored by NREL/DOE under the general subject of photovoltaic testing and reliability during the period 1986--1993. PV performance and PV reliability are at least as important as PV cost, if not more. In the US, PV manufacturers, DOE laboratories, electric utilities, and others are engaged in the photovoltaic reliability research and testing. This group of researchers and others interested in the field were brought together to exchange the technical knowledge and field experience as related to current information in this evolving field of PV reliability. The papers presented here reflect this effort since the last workshop held in September, 1992. The topics covered include: cell and module characterization, module and system testing, durability and reliability, system field experience, and standards and codes.

  13. Statistical modeling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1992-01-01

    This working paper discusses the statistical simulation part of a controlled software development experiment being conducted under the direction of the System Validation Methods Branch, Information Systems Division, NASA Langley Research Center. The experiment uses guidance and control software (GCS) aboard a fictitious planetary landing spacecraft: real-time control software operating on a transient mission. Software execution is simulated to study the statistical aspects of reliability and other failure characteristics of the software during development, testing, and random usage. Quantification of software reliability is a major goal. Various reliability concepts are discussed. Experiments are described for performing simulations and collecting appropriate simulated software performance and failure data. This data is then used to make statistical inferences about the quality of the software development and verification processes as well as inferences about the reliability of software versions and reliability growth under random testing and debugging.
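
    To make the idea of simulated failure data concrete, the sketch below draws inter-failure times from a simple Jelinski-Moranda-style model in which each remaining fault contributes a constant hazard; this is a standard illustrative model, not necessarily the one used in the experiment.

    ```python
    # Reliability growth: detected faults are removed, so inter-failure
    # times lengthen as testing and debugging proceed.
    import random

    def simulate_failure_times(n_faults=20, phi=0.01, seed=42):
        rng = random.Random(seed)
        t, times = 0.0, []
        for remaining in range(n_faults, 0, -1):
            t += rng.expovariate(phi * remaining)  # mean 1/(phi*remaining)
            times.append(t)
        return times

    times = simulate_failure_times()
    print(times[:3], times[-1])  # early failures cluster; the last is far out
    ```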

  14. Organic matter diagenesis as the key to a unifying theory for the genesis of tabular uranium-vanadium deposits in the Morrison Formation, Colorado Plateau

    USGS Publications Warehouse

    Hansley, P.L.; Spirakis, C.S.

    1992-01-01

    Interstitial, epigenetic amorphous organic matter is intimately associated with uranium in the Grants uranium region and is considered essential to genetic models for these deposits. In contrast, uranium minerals are intimately associated with authigenic vanadium chlorite and vanadium oxides in amorphous organic matter-poor ores of the Slick Rock and Henry Mountains mining districts and therefore, in some genetic models amorphous organic matter is not considered crucial to the formation of these deposits. Differences in organic matter content can be explained by recognizing that amorphous organic matter-poor deposits have been subjected to more advanced stages of diagenesis than amorphous organic matter-rich deposits. Evidence that amorphous organic matter was involved in the genesis of organic matter-poor, as well as organic matter-rich, deposits is described. -from Authors

  15. Descriptive Case Study of Theories of Action, Strategic Objectives, and Strategic Initiatives Used by California Female County Superintendents to Move Their Organizations from Current State to Desired Future

    ERIC Educational Resources Information Center

    Park, Valerie Darlene

    2014-01-01

    The purpose of this study was to describe the theories of action, strategic objectives, and strategic initiatives of school systems led by female county superintendents in California and examine their impact on improving system outcomes. Additionally, the factors influencing theory of action, strategic objective, and initiative development were…

  16. Are specialist certification examinations a reliable measure of physician competence?

    PubMed

    Burch, V C; Norman, G R; Schmidt, H G; van der Vleuten, C P M

    2008-11-01

    High stakes postgraduate specialist certification examinations have considerable implications for the future careers of examinees. Medical colleges and professional boards have a social and professional responsibility to ensure their fitness for purpose. To date there is a paucity of published data about the reliability of specialist certification examinations and objective methods for improvement. Such data are needed to improve current assessment practices and sustain the international credibility of specialist certification processes. To determine the component and composite reliability of the Fellowship examination of the College of Physicians of South Africa, and identify strategies for further improvement, generalizability and multivariate generalizability theory were used to estimate the reliability of examination subcomponents and the overall reliability of the composite examination. Decision studies were used to identify strategies for improving the composition of the examination. Reliability coefficients of the component subtests ranged from 0.58 to 0.64. The composite reliability of the examination was 0.72. This could be increased to 0.8 by weighting all test components equally or increasing the number of patient encounters in the clinical component of the examination. Correlations between examination components were high, suggesting that similar parameters of competence were being assessed. This composite certification examination, if equally weighted, achieved an overall reliability sufficient for high stakes examination purposes. Increasing the weighting of the clinical component decreased the reliability. This could be rectified by increasing the number of patient encounters in the examination. Practical ways of achieving this are suggested.
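
    The suggested remedy of adding patient encounters follows the classical Spearman-Brown projection; the sketch below applies that standard formula, with a 0.60 starting reliability taken from the component range reported above and lengthening factors invented for illustration.

    ```python
    # Standard Spearman-Brown prophecy formula.
    def spearman_brown(rho, k):
        """Projected reliability when a test is lengthened by a factor k."""
        return (k * rho) / (1.0 + (k - 1.0) * rho)

    for k in (1, 2, 3):  # e.g., 2, 4, then 6 patient encounters
        print(k, round(spearman_brown(0.60, k), 2))  # 0.6, 0.75, 0.82
    ```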

  17. Calculating system reliability with SRFYDO

    SciTech Connect

    Morzinski, Jerome; Anderson - Cook, Christine M; Klamann, Richard M

    2010-01-01

    SRFYDO is a process for estimating reliability of complex systems. Using information from all applicable sources, including full-system (flight) data, component test data, and expert (engineering) judgment, SRFYDO produces reliability estimates and predictions. It is appropriate for series systems with possibly several versions of the system which share some common components. It models reliability as a function of age and up to 2 other lifecycle (usage) covariates. Initial output from its Exploratory Data Analysis mode consists of plots and numerical summaries so that the user can check data entry and model assumptions, and help determine a final form for the system model. The System Reliability mode runs a complete reliability calculation using Bayesian methodology. This mode produces results that estimate reliability at the component, sub-system, and system level. The results include estimates of uncertainty, and can predict reliability at some not-too-distant time in the future. This paper presents an overview of the underlying statistical model for the analysis, discusses model assumptions, and demonstrates usage of SRFYDO.
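
    A minimal sketch of the flavor of calculation described above, assuming Beta-Binomial component models: pass/fail test counts give Beta posteriors for component success probabilities, which are propagated to a series-system reliability distribution by sampling. The counts are placeholders, and SRFYDO's actual model additionally handles age and usage covariates.

    ```python
    # Beta-Binomial sketch of a Bayesian series-system calculation.
    import random

    def sample_system_reliability(component_data, n_draws=10_000, seed=7):
        """component_data: (successes, failures) per component, Beta(1, 1)
        priors. Returns posterior draws of series-system reliability."""
        rng = random.Random(seed)
        draws = []
        for _ in range(n_draws):
            r = 1.0
            for s, f in component_data:
                r *= rng.betavariate(1 + s, 1 + f)
            draws.append(r)
        return draws

    draws = sorted(sample_system_reliability([(48, 2), (95, 5), (29, 1)]))
    print(draws[len(draws) // 2])          # posterior median
    print(draws[int(0.05 * len(draws))])   # lower 5% credible bound
    ```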

  18. Aerospace reliability applied to biomedicine.

    NASA Technical Reports Server (NTRS)

    Lalli, V. R.; Vargo, D. J.

    1972-01-01

    An analysis is presented that indicates that the reliability and quality assurance methodology selected by NASA to minimize failures in aerospace equipment can be applied directly to biomedical devices to improve hospital equipment reliability. The Space Electric Rocket Test project is used as an example of NASA application of reliability and quality assurance (R&QA) methods. By analogy a comparison is made to show how these same methods can be used in the development of transducers, instrumentation, and complex systems for use in medicine.

  19. Reliability analysis of interdependent lattices

    NASA Astrophysics Data System (ADS)

    Limiao, Zhang; Daqing, Li; Pengju, Qin; Bowen, Fu; Yinan, Jiang; Zio, Enrico; Rui, Kang

    2016-06-01

    Network reliability analysis has drawn much attention recently due to the risks of catastrophic damage in networked infrastructures. These infrastructures are dependent on each other as a result of various interactions. However, most of the reliability analyses of these interdependent networks do not consider spatial constraints, which are found important for robustness of infrastructures including power grid and transport systems. Here we study the reliability properties of interdependent lattices with different ranges of spatial constraints. Our study shows that interdependent lattices with strong spatial constraints are more resilient than interdependent Erdős-Rényi networks. There exists an intermediate range of spatial constraints, at which the interdependent lattices have minimal resilience.
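
    The notion of resilience used here can be made concrete with a toy robustness experiment: remove a random fraction of nodes and track the size of the largest surviving cluster. The sketch below treats a single lattice only; the paper's cascading failures between coupled lattices would add an iterated dependency-pruning step on top of this.

      import networkx as nx
      import numpy as np

      rng = np.random.default_rng(1)

      def giant_fraction(G, remove_frac):
          """Largest-cluster fraction left after removing a random node fraction."""
          H = G.copy()
          n = H.number_of_nodes()
          kill = rng.choice(n, size=int(remove_frac * n), replace=False)
          H.remove_nodes_from(kill.tolist())
          if H.number_of_nodes() == 0:
              return 0.0
          return max(len(c) for c in nx.connected_components(H)) / n

      lattice = nx.convert_node_labels_to_integers(nx.grid_2d_graph(50, 50))
      for f in (0.2, 0.4, 0.6):
          print(f, round(giant_fraction(lattice, f), 3))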

  20. Integrating reliability analysis and design

    SciTech Connect

    Rasmuson, D. M.

    1980-10-01

    This report describes the Interactive Reliability Analysis Project and demonstrates the advantages of using computer-aided design systems (CADS) in reliability analysis. Common-cause failure problems require representations of systems, analysis of fault trees, and evaluation of solutions to these. Results must be communicated between the reliability analyst and the system designer. Using a computer-aided design system saves time and money in the analysis of designs. Computer-aided design systems lend themselves to cable routing, valve and switch lists, pipe routing, and other component studies. At EG and G Idaho, Inc., the Applicon CADS is being applied to the study of water reactor safety systems.

  1. Fatigue Reliability of Gas Turbine Engine Structures

    NASA Technical Reports Server (NTRS)

    Cruse, Thomas A.; Mahadevan, Sankaran; Tryon, Robert G.

    1997-01-01

    The results of an investigation are described for fatigue reliability in engine structures. The description consists of two parts. Part 1 is for method development. Part 2 is a specific case study. In Part 1, the essential concepts and practical approaches to damage tolerance design in the gas turbine industry are summarized. These have evolved over the years in response to flight safety certification requirements. The effect of Non-Destructive Evaluation (NDE) methods on these approaches is also reviewed. Assessment methods based on probabilistic fracture mechanics, with regard to both crack initiation and crack growth, are outlined. Limit state modeling techniques from structural reliability theory are shown to be appropriate for application to this problem, for both individual failure mode and system-level assessment. In Part 2, the results of a case study for the high pressure turbine of a turboprop engine are described. The response surface approach is used to construct a fatigue performance function. This performance function is used with the First Order Reliability Method (FORM) to determine the probability of failure and the sensitivity of the fatigue life to the engine parameters for the first stage disk rim of the two stage turbine. A hybrid combination of regression and Monte Carlo simulation is used to incorporate time-dependent random variables. System reliability is used to determine the system probability of failure, and the sensitivity of the system fatigue life to the engine parameters of the high pressure turbine. The variation in the primary hot gas and secondary cooling air, the uncertainty of the complex mission loading, and the scatter in the material data are considered.
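
    The probability-of-failure machinery described in Part 2 can be illustrated with a crude limit-state sketch: sample scatter in fatigue capability and mission demand, count failures, and convert to an equivalent reliability index. All distributions and numbers below are hypothetical stand-ins, and plain Monte Carlo is used in place of the paper's response-surface-plus-FORM procedure.

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(2)
      N = 1_000_000

      # Hypothetical limit state g = life - required life for a disk rim:
      # lognormal scatter in material fatigue life and in mission severity.
      life = rng.lognormal(mean=np.log(2.0e4), sigma=0.30, size=N)
      required = rng.lognormal(mean=np.log(1.0e4), sigma=0.20, size=N)

      pf = np.mean(life < required)   # probability of failure
      beta = -norm.ppf(pf)            # equivalent FORM-style reliability index
      print(f"Pf = {pf:.2e}, beta = {beta:.2f}")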

  2. 18 CFR 375.303 - Delegations to the Director of the Office of Electric Reliability.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Director of the Office of Electric Reliability. 375.303 Section 375.303 Conservation of Power and Water... Delegations § 375.303 Delegations to the Director of the Office of Electric Reliability. The Commission... Electric Reliability Organization or Regional Entity rules or procedures; (ii) Reject an...

  3. A theory of knowledge

    NASA Astrophysics Data System (ADS)

    Pachner, J.

    1984-11-01

    In order to make reliable predictions in any region of human activity, it is necessary to distinguish clearly what is based on experience and what is a construction of intellect. The theory of knowledge developed in the present paper is an attempt to devise a set of axioms that demarcate experience, as the only source of our knowledge of the external world, from the ideas, scientific models, and theories by means of which scientific predictions are made. After a discussion of causality in relation to the laws of nature, the axioms of the expounded theory are formulated in the formalism of set theory. The theory is then applied to some problems in physics to demonstrate its usefulness.

  4. Test Pac: A Program for Comprehensive Item and Reliability Analysis.

    ERIC Educational Resources Information Center

    Luecht, Richard M.

    1987-01-01

    Test Pac, a test scoring and analysis computer program for moderate-sized sample designs using dichotomous response items, performs comprehensive item analyses and multiple reliability estimates. It also performs single-facet generalizability analysis of variance, single-parameter item response theory analyses, test score reporting, and computer…
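
    For dichotomous items, the core reliability statistic such a program reports is KR-20, which needs only the item difficulties and the variance of total scores. A minimal sketch (the score matrix is made up for illustration):

      import numpy as np

      def kr20(X):
          """Kuder-Richardson 20 for a persons x items matrix of 0/1 scores."""
          X = np.asarray(X, dtype=float)
          k = X.shape[1]
          p = X.mean(axis=0)                       # item difficulties
          total_var = X.sum(axis=1).var(ddof=1)    # variance of total scores
          return (k / (k - 1)) * (1.0 - np.sum(p * (1 - p)) / total_var)

      scores = np.array([[1, 1, 0, 1],
                         [1, 0, 0, 1],
                         [0, 0, 0, 1],
                         [1, 1, 1, 1],
                         [0, 1, 0, 0]])
      print(round(kr20(scores), 3))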

  5. Interrater Reliability of a Team-Scored Electronic Portfolio

    ERIC Educational Resources Information Center

    Yao, Yuankun; Foster, Karen; Aldrich, Jennifer

    2009-01-01

    This study applied generalizability theory to investigate the interrater reliability of a team-scored electronic portfolio required for initial teacher certification. The sample consisted of 31 preservice teacher portfolios which were assigned to three groups of portfolio review teams. The review teams, which had received several rounds of…

  6. Measuring Rater Reliability on a Special Education Observation Tool

    ERIC Educational Resources Information Center

    Semmelroth, Carrie Lisa; Johnson, Evelyn

    2014-01-01

    This study used generalizability theory to measure reliability on the Recognizing Effective Special Education Teachers (RESET) observation tool designed to evaluate special education teacher effectiveness. At the time of this study, the RESET tool included three evidence-based instructional practices (direct, explicit instruction; whole-group…

  7. Robinson's Measure of Agreement as a Parallel Forms Reliability Coefficient.

    ERIC Educational Resources Information Center

    Willson, Victor L.

    A major deficiency in classical test theory is the reliance on Pearson product-moment (PPM) correlation concepts in the definition of reliability. PPM measures are totally insensitive to first-moment differences in tests, which leads to the dubious assumption of essential tau-equivalence. Robinson proposed a measure of agreement that is sensitive…

  8. Failure Analysis for Improved Reliability

    NASA Technical Reports Server (NTRS)

    Sood, Bhanu

    2016-01-01

    Outline: Section 1 - What is reliability and root cause? Section 2 - Overview of failure mechanisms. Section 3 - Failure analysis techniques (1. Nondestructive analysis techniques; 2. Destructive analysis; 3. Materials characterization). Section 4 - Summary and closure.

  9. An experiment in software reliability

    NASA Technical Reports Server (NTRS)

    Dunham, J. R.; Pierce, J. L.

    1986-01-01

    The results of a software reliability experiment conducted in a controlled laboratory setting are reported. The experiment was undertaken to gather data on software failures and is one in a series of experiments being pursued by the Fault Tolerant Systems Branch of NASA Langley Research Center to find a means of credibly performing reliability evaluations of flight control software. The experiment tests a small sample of implementations of radar tracking software having ultra-reliability requirements and uses n-version programming for error detection, and repetitive run modeling for failure and fault rate estimation. The experiment results agree with those of Nagel and Skrivan in that the program error rates suggest an approximate log-linear pattern and the individual faults occurred with significantly different error rates. Additional analysis of the experimental data raises new questions concerning the phenomenon of interacting faults. This phenomenon may provide one explanation for software reliability decay.
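
    The "approximate log-linear pattern" means that when the individual fault error rates are ordered by size, their logarithms fall roughly on a straight line. A minimal check of that pattern (the rates below are invented for illustration, not the experiment's data):

      import numpy as np

      # Hypothetical per-fault failure probabilities from repetitive-run testing.
      rates = np.array([3e-2, 1.1e-2, 4e-3, 1.5e-3, 5e-4, 2e-4])
      j = np.arange(1, rates.size + 1)

      # Log-linear pattern: log(rate) is approximately linear in the fault index.
      slope, intercept = np.polyfit(j, np.log(rates), 1)
      print(f"rate_j ~ exp({intercept:.2f} + {slope:.2f} j)")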

  10. Accelerator Availability and Reliability Issues

    SciTech Connect

    Steve Suhring

    2003-05-01

    Maintaining reliable machine operations for existing machines as well as planning for future machines' operability present significant challenges to those responsible for system performance and improvement. Changes to machine requirements and beam specifications often reduce overall machine availability in an effort to meet user needs. Accelerator reliability issues from around the world will be presented, followed by a discussion of the major factors influencing machine availability.

  11. MEMS reliability: coming of age

    NASA Astrophysics Data System (ADS)

    Douglass, Michael R.

    2008-02-01

    In today's high-volume semiconductor world, one could easily take reliability for granted. As the MOEMS/MEMS industry continues to establish itself as a viable alternative to conventional manufacturing in the macro world, reliability can be of high concern. Currently, there are several emerging market opportunities in which MOEMS/MEMS is gaining a foothold. Markets such as mobile media, consumer electronics, biomedical devices, and homeland security are all showing great interest in microfabricated products. At the same time, these markets are among the most demanding when it comes to reliability assurance. To be successful, each company developing a MOEMS/MEMS device must consider reliability on an equal footing with cost, performance and manufacturability. What can this maturing industry learn from the successful development of DLP technology, air bag accelerometers and inkjet printheads? This paper discusses some basic reliability principles which any MOEMS/MEMS device development must use. Examples from the commercially successful and highly reliable Digital Micromirror Device complement the discussion.

  12. Progress in string theory

    NASA Astrophysics Data System (ADS)

    Maldacena, Juan Martín

    D-Branes on Calabi-Yau manifolds / Paul S. Aspinwall -- Lectures on AdS/CFT / Juan M. Maldacena -- Tachyon dynamics in open string theory / Ashoke Sen -- TASI/PITP/ISS lectures on moduli and microphysics / Eva Silverstein -- The duality cascade / Matthew J. Strassler -- Perturbative computations in string field theory / Washington Taylor -- Student seminars -- Student participants -- Lecturers, directors, and local organizing committee.

  13. The Assessment of Reliability Under Range Restriction: A Comparison of [Alpha], [Omega], and Test-Retest Reliability for Dichotomous Data

    ERIC Educational Resources Information Center

    Fife, Dustin A.; Mendoza, Jorge L.; Terry, Robert

    2012-01-01

    Though much research and attention has been directed at assessing the correlation coefficient under range restriction, the assessment of reliability under range restriction has been largely ignored. This article uses item response theory to simulate dichotomous item-level data to assess the robustness of KR-20 ([alpha]), [omega], and test-retest…

  14. Lifetime Reliability Prediction of Ceramic Structures Under Transient Thermomechanical Loads

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Jadaan, Osama J.; Gyekenyesi, John P.

    2005-01-01

    An analytical methodology is developed to predict the probability of survival (reliability) of ceramic components subjected to harsh thermomechanical loads that can vary with time (transient reliability analysis). This capability enables more accurate prediction of ceramic component integrity against fracture in situations such as turbine startup and shutdown, operational vibrations, atmospheric reentry, or other rapid heating or cooling situations (thermal shock). The transient reliability analysis methodology developed herein incorporates the following features: fast-fracture transient analysis (reliability analysis without slow crack growth, SCG); transient analysis with SCG (reliability analysis with time-dependent damage due to SCG); a computationally efficient algorithm to compute the reliability for components subjected to repeated transient loading (block loading); cyclic fatigue modeling using a combined SCG and Walker fatigue law; proof testing for transient loads; and Weibull and fatigue parameters that are allowed to vary with temperature or time. Component-to-component variation in strength (stochastic strength response) is accounted for with the Weibull distribution, and either the principle of independent action or the Batdorf theory is used to predict the effect of multiaxial stresses on reliability. The reliability analysis can be performed either as a function of the component surface (for surface-distributed flaws) or component volume (for volume-distributed flaws). The transient reliability analysis capability has been added to the NASA CARES/Life (Ceramic Analysis and Reliability Evaluation of Structures/Life) code. CARES/Life was also updated to interface with commercially available finite element analysis software, such as ANSYS, when used to model the effects of transient load histories. Examples are provided to demonstrate the features of the methodology as implemented in the CARES/Life program.
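
    At the heart of such a code is the Weibull survival integral for volume-distributed flaws, evaluated element by element over a finite element mesh. The sketch below shows only that fast-fracture building block under assumed units and parameters; CARES/Life layers slow crack growth, transient load blocks, and multiaxial theories on top of it.

      import numpy as np

      def weibull_survival(stress, volume, m, sigma0):
          """Probability of survival exp(-sum V_e * (sigma_e/sigma0)^m) over
          mesh elements, for volume-distributed flaws (compression ignored)."""
          risk = np.sum(volume * (np.maximum(stress, 0.0) / sigma0) ** m)
          return np.exp(-risk)

      # Hypothetical 4-element model (stress in MPa, volume in mm^3); sigma0
      # is a volume-scaled Weibull scale parameter in consistent units.
      stress = np.array([120.0, 200.0, 310.0, 150.0])
      volume = np.array([2.0, 1.5, 0.5, 2.0])
      print(weibull_survival(stress, volume, m=10.0, sigma0=400.0))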

  15. Reliability Assessment of Graphite Specimens under Multiaxial Stresses

    NASA Technical Reports Server (NTRS)

    Sookdeo, Steven; Nemeth, Noel N.; Bratton, Robert L.

    2008-01-01

    An investigation was conducted to predict the failure strength response of IG-100 nuclear grade graphite exposed to multiaxial stresses. As part of this effort, a review of failure criteria accounting for the stochastic strength response is provided. The experimental work was performed in the early 1990s at the Oak Ridge National Laboratory (ORNL) on hollow graphite tubes under the action of axial tensile loading and internal pressurization. As part of the investigation, finite-element analysis (FEA) was performed and compared with results of FEA from the original ORNL report. The new analysis generally compared well with the original analysis, although some discrepancies in the location of peak stresses were noted. The Ceramics Analysis and Reliability Evaluation of Structures Life prediction code (CARES/Life) was used with the FEA results to predict the quadrant I (tensile-tensile) and quadrant IV (compression-tension) strength response of the graphite tubes for the principle of independent action (PIA), the Weibull normal stress averaging (NSA), and the Batdorf multiaxial failure theories. The CARES/Life reliability analysis showed that all three failure theories gave similar results in quadrant I but that in quadrant IV, the PIA and Weibull normal stress-averaging theories were not conservative, whereas the Batdorf theory was able to correlate well with experimental results. The conclusion of the study was that the Batdorf theory should generally be used to predict the reliability response of graphite and brittle materials in multiaxial loading situations.
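
    The quadrant IV behavior follows from the structure of the PIA criterion, which sums a Weibull risk term over only the tensile principal stresses:

      $$P_s \;=\; \exp\!\left[-\int_V \sum_{i=1}^{3}\left(\frac{\langle\sigma_i\rangle}{\sigma_0}\right)^{m} \mathrm{d}V\right], \qquad \langle\sigma_i\rangle = \max(\sigma_i, 0).$$

    Because compressive principal stresses are clipped to zero, adding compression in quadrant IV leaves the PIA failure probability unchanged, whereas the Batdorf model resolves the full stress state onto distributed crack orientations, so the compressive component can still influence the driving force on inclined cracks. This is one way to read the nonconservatism of PIA and NSA observed in that quadrant.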

  16. Environmental education curriculum evaluation questionnaire: A reliability and validity study

    NASA Astrophysics Data System (ADS)

    Minner, Daphne Diane

    The intention of this research project was to bridge the gap between social science research and application to the environmental domain through the development of a theoretically derived instrument designed to give educators a template by which to evaluate environmental education curricula. The theoretical base for instrument development was provided by several developmental theories such as Piaget's theory of cognitive development, Developmental Systems Theory, Life-span Perspective, as well as curriculum research within the area of environmental education. This theoretical base fueled the generation of a list of components which were then translated into a questionnaire with specific questions relevant to the environmental education domain. The specific research question for this project is: Can a valid assessment instrument based largely on human development and education theory be developed that reliably discriminates high, moderate, and low quality in environmental education curricula? The types of analyses conducted to answer this question were interrater reliability (percent agreement, Cohen's Kappa coefficient, Pearson's Product-Moment correlation coefficient), test-retest reliability (percent agreement, correlation), and criterion-related validity (correlation). Face validity and content validity were also assessed through thorough reviews. Overall results indicate that 29% of the questions on the questionnaire demonstrated a high level of interrater reliability and 43% of the questions demonstrated a moderate level of interrater reliability. Seventy-one percent of the questions demonstrated a high test-retest reliability and 5% a moderate level. Fifty-five percent of the questions on the questionnaire were reliable (high or moderate) both across time and raters. Only eight questions (8%) did not show either interrater or test-retest reliability. The global overall rating of high, medium, or low quality was reliable across both coders and time, indicating
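
    Of the interrater statistics named here, Cohen's kappa is the one that corrects percent agreement for chance. A minimal sketch with made-up ratings of ten curricula (not the study's data):

      import numpy as np

      def cohens_kappa(r1, r2, categories):
          """Cohen's kappa for two raters assigning the same items to categories."""
          r1, r2 = np.asarray(r1), np.asarray(r2)
          po = np.mean(r1 == r2)   # observed agreement
          pe = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in categories)
          return (po - pe) / (1.0 - pe)

      a = ["high", "high", "low", "mod", "mod", "high", "low", "low", "mod", "high"]
      b = ["high", "mod", "low", "mod", "high", "high", "low", "low", "mod", "high"]
      print(round(cohens_kappa(a, b, ["high", "mod", "low"]), 3))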

  17. String Theory and Gauge Theories

    SciTech Connect

    Maldacena, Juan

    2009-02-20

    We will see how gauge theories, in the limit that the number of colors is large, give string theories. We will discuss some examples of particular gauge theories where the corresponding string theory is known precisely, starting with the case of the maximally supersymmetric theory in four dimensions which corresponds to ten dimensional string theory. We will discuss recent developments in this area.

  18. [School Organization: Theory and Practice; Selected Readings on Grading, Nongrading, Multigrading, Self-Contained Classrooms, Departmentalization, Team Heterogeneous Grouping. Selected Bibliographies.] Rand McNally Education Series.

    ERIC Educational Resources Information Center

    Franklin, Marian Pope, Comp.

    Over 400 journal articles, case studies, research reports, dissertations, and position papers are briefly described in a series of eight selected bibliographies related to school organization. The eight specific areas treated in the volume and the number of items listed for each include: nongraded elementary school organization, 96; nongraded…

  19. Measurement, estimation, and prediction of software reliability

    NASA Technical Reports Server (NTRS)

    Hecht, H.

    1977-01-01

    Quantitative indices of software reliability are defined, and application of three important indices is indicated: (1) reliability measurement, (2) reliability estimation, and (3) reliability prediction. State of the art techniques for each of these procedures are presented together with considerations of data acquisition. Failure classifications and other documentation for comprehensive software reliability evaluation are described.

  20. Electronics reliability and measurement technology

    NASA Technical Reports Server (NTRS)

    Heyman, Joseph S. (Editor)

    1987-01-01

    A summary is presented of the Electronics Reliability and Measurement Technology Workshop. The meeting examined the U.S. electronics industry with particular focus on reliability and state-of-the-art technology. A general consensus of the approximately 75 attendees was that "the U.S. electronics industries are facing a crisis that may threaten their existence". The workshop had specific objectives to discuss mechanisms to improve areas such as reliability, yield, and performance while reducing failure rates, delivery times, and cost. The findings of the workshop addressed various aspects of the industry from wafers to parts to assemblies. Key problem areas that were singled out for attention are identified, and action items necessary to accomplish their resolution are recommended.

  1. Reliability model for planetary gear

    NASA Technical Reports Server (NTRS)

    Savage, M.; Paridon, C. A.; Coy, J. J.

    1982-01-01

    A reliability model is presented for planetary gear trains in which the ring gear is fixed, the sun gear is the input, and the planet arm is the output. The input and output shafts are coaxial and the input and output torques are assumed to be coaxial with these shafts. Thrust and side loading are neglected. This type of gear train is commonly used in main rotor transmissions for helicopters and in other applications which require high reductions in speed. The reliability model is based on the Weibull distribution of the individual reliabilities of the transmission components. The transmission's basic dynamic capacity is defined as the input torque which may be applied for one million input rotations of the sun gear. Load and life are related by a power law. The load life exponent and basic dynamic capacity are developed as functions of the component capacities.
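
    The structure of such a model is easy to sketch: each component contributes a two-parameter Weibull reliability, the strict-series product gives the transmission reliability, and the life at a target reliability follows by inverting the monotone curve. Everything numeric below is hypothetical; the paper derives its parameters from the component dynamic capacities and the load-life power law.

      import numpy as np

      def system_reliability(life, betas, thetas):
          """Series Weibull model: R_sys(L) = exp(-sum (L/theta_i)^beta_i),
          with L and the characteristic lives in millions of input rotations."""
          betas, thetas = np.asarray(betas), np.asarray(thetas)
          return np.exp(-np.sum((life / thetas) ** betas))

      betas = [1.2, 1.2, 1.1]        # hypothetical Weibull slopes
      thetas = [80.0, 60.0, 50.0]    # hypothetical characteristic lives

      # Life at 90% system reliability by bisection on the monotone curve.
      lo, hi = 1e-6, 200.0
      for _ in range(60):
          mid = 0.5 * (lo + hi)
          lo, hi = (mid, hi) if system_reliability(mid, betas, thetas) > 0.9 else (lo, mid)
      print(f"L10 ~ {lo:.2f} million input rotations")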

  2. A Review of Score Reliability: Contemporary Thinking on Reliability Issues

    ERIC Educational Resources Information Center

    Rosen, Gerald A.

    2004-01-01

    Bruce Thompson's edited volume begins with a basic principle, one might call it a basic truth: "reliability is a property that applies to scores, and not immutably across all conceivable uses everywhere of a given measure" (p. 3). The author claims that this principle is little known and/or little understood. While that is an arguable point, the…

  3. [The repeat reliability of somatosensory evoked potentials].

    PubMed

    Strenge, H

    1989-09-01

    The test-immediate-retest reliability of latency and amplitude values of cervical and cortical somatosensory evoked potentials (SEP) to median nerve stimulation was assessed in 86 normal subjects aged 15 to 71 years. In addition to the stability of data between repeat trials within one test session the standard errors of measurement and the interpretable differences for SEP measures were calculated according to measurement theory. The study revealed retest correlations rtt greater than 0.80 for all latency measures of the cervical and cortical SEPs and all cortical amplitude parameters. The highest stability was found for the latency measures of the cervical components P10, N11, N13, the cortical components P16 and N20 and for the amplitude N20/P25. PMID:2507277
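
    The measurement-theory quantities mentioned, the standard error of measurement and the interpretable (smallest real) difference, follow directly from the sample SD and the retest correlation. A minimal sketch; the numbers are placeholders, not the study's values:

      import numpy as np

      def sem(sd, rtt):
          """Standard error of measurement: SD * sqrt(1 - retest reliability)."""
          return sd * np.sqrt(1.0 - rtt)

      def interpretable_difference(sd, rtt, z=1.96):
          """Change a retest must exceed to be interpretable at ~95% confidence."""
          return z * np.sqrt(2.0) * sem(sd, rtt)

      # Hypothetical N20 latency: SD = 1.1 ms, retest correlation = 0.85.
      print(round(sem(1.1, 0.85), 3), round(interpretable_difference(1.1, 0.85), 3))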

  4. Reliability study of TWT output RF window

    NASA Astrophysics Data System (ADS)

    Rocci, Peter J.

    1990-05-01

    Rome Air Development Center Computer-Aided Systems Engineering Branch (RBES) has documented an in-house effort to evaluate the structural reliability of the output waveguide window on a Traveling Wave Tube (TWT). This window acts as a seal between the TWT's vacuum envelope and output waveguide. Its purpose is to prevent any loss due to leakage of the vacuum while allowing passage of the microwave signal. This particular disk-shaped window is constructed of a ceramic material, beryllia, and contains an inner ring of copper and an outer ring of Monel K-500 (70 to 30 nickel-copper). It was suspected that excessive thermal stresses associated with the very high operating temperatures experienced by this window have caused it to fail. Finite element analyses, along with material failure theories, were used to determine the window's response to a time-dependent heat source and operating heat sink temperature.

  5. Reliability of two sintered silicon nitride materials

    NASA Technical Reports Server (NTRS)

    Mieskowski, D. M.; Sanders, W. A.; Pierce, L. A.

    1985-01-01

    Two types of sintered silicon nitride were evaluated in terms of reliability: an experimental high pressure nitrogen sintered material and a commercial material. The results show wide variations in strength for both materials. The Weibull moduli were 5.5, 8.9, and 11 for the experimental material at room temperature, 1200, and 1370 C, respectively. The commercial material showed Weibull moduli of 9.0, 8.6, and 8.9 at these respective temperatures. No correlation between strength and flaw size was noted for the experimental material. The applicability of the Weibull and Griffith theories to processing defects on the order of 100 microns or less in size is discussed.

  6. Designing magnetic systems for reliability

    SciTech Connect

    Heitzenroeder, P.J.

    1991-01-01

    Designing magnetic systems is an iterative process in which the requirements are set, a design is developed, materials and manufacturing processes are defined, interrelationships with the various elements of the system are established, engineering analyses are performed, and fault modes and effects are studied. Reliability requires that all elements of the design process, from the seemingly most straightforward, such as utilities connection design and implementation, to the most sophisticated, such as advanced finite element analyses, receive a balanced and appropriate level of attention. D.B. Montgomery's study of magnet failures has shown that the predominance of magnet failures tend not to be in the most intensively engineered areas but are associated with insulation, leads, and unanticipated conditions. TFTR, JET, JT-60, and PBX are all major tokamaks which have suffered loss of reliability due to water leaks. Similarly, the majority of causes of loss of magnet reliability at PPPL have not been in the sophisticated areas of the design but are due to difficulties associated with coolant connections, bus connections, and external structural connections. Looking toward the future, the major next devices such as BPX and ITER are more costly and complex than any of their predecessors and are pressing the bounds of operating levels, materials, and fabrication. Emphasis on reliability is a must as the fusion program enters a phase with fewer, but very costly, devices and the goal of reaching a reactor prototype stage in the next two or three decades. This paper reviews some of the magnet reliability issues which PPPL has faced over the years, the lessons learned from them, and the magnet design and fabrication practices which have been found to contribute to magnet reliability.

  7. Mechanically reliable scales and coatings

    SciTech Connect

    Tortorelli, P.F.; Alexander, K.B.

    1995-07-01

    As the first stage in examining the mechanical reliability of protective surface oxides, the behavior of alumina scales formed on iron-aluminum alloys during high-temperature cyclic oxidation was characterized in terms of damage and spallation tendencies. Scales were thermally grown on specimens of three iron-aluminum compositions using a series of exposures to air at 1000°C. Gravimetric data and microscopy revealed substantially better integrity and adhesion of the scales grown on an alloy containing zirconium. The use of polished (rather than just ground) specimens resulted in scales that were more suitable for subsequent characterization of mechanical reliability.

  8. Metrological Reliability of Medical Devices

    NASA Astrophysics Data System (ADS)

    Costa Monteiro, E.; Leon, L. F.

    2015-02-01

    The prominent development of health technologies of the 20th century triggered demands for metrological reliability of physiological measurements comprising physical, chemical and biological quantities, essential to ensure accurate and comparable results of clinical measurements. In the present work, aspects concerning metrological reliability in premarket and postmarket assessments of medical devices are discussed, pointing out challenges to be overcome. In addition, considering the social relevance of the biomeasurements results, Biometrological Principles to be pursued by research and innovation aimed at biomedical applications are proposed, along with the analysis of their contributions to guarantee the innovative health technologies compliance with the main ethical pillars of Bioethics.

  9. Reliability in the design phase

    SciTech Connect

    Siahpush, A.S.; Hills, S.W.; Pham, H. ); Majumdar, D. )

    1991-12-01

    A study was performed to determine the common methods and tools that are available to calculate or predict a system's reliability. A literature review and software survey are included. The desired product of this developmental work is a tool for the system designer to use in the early design phase so that the final design will achieve the desired system reliability without lengthy testing and rework. Three computer programs were written which provide the first attempt at fulfilling this need. The programs are described and a case study is presented for each one. This is a continuing effort which will be furthered in FY-1992. 10 refs.

  11. Photovoltaic power system reliability considerations

    NASA Technical Reports Server (NTRS)

    Lalli, V. R.

    1980-01-01

    An example of how modern engineering and safety techniques can be used to assure the reliable and safe operation of photovoltaic power systems is presented. This particular application is for a solar cell power system demonstration project designed to provide electric power requirements for remote villages. The techniques utilized involve a definition of the power system natural and operating environment, use of design criteria and analysis techniques, an awareness of potential problems via the inherent reliability and FMEA methods, and use of fail-safe and planned spare parts engineering philosophy.

  12. CERTS: Consortium for Electric Reliability Technology Solutions - Research Highlights

    SciTech Connect

    Eto, Joseph

    2003-07-30

    Historically, the U.S. electric power industry was vertically integrated, and utilities were responsible for system planning, operations, and reliability management. As the nation moves to a competitive market structure, these functions have been disaggregated, and no single entity is responsible for reliability management. As a result, new tools, technologies, systems, and management processes are needed to manage the reliability of the electricity grid. However, a number of simultaneous trends prevent electricity market participants from pursuing development of these reliability tools: utilities are preoccupied with restructuring their businesses, research funding has declined, and the formation of Independent System Operators (ISOs) and Regional Transmission Organizations (RTOs) to operate the grid means that control of transmission assets is separate from ownership of these assets; at the same time, business uncertainty and changing regulatory policies have created a climate in which needed investment for transmission infrastructure and tools for reliability management has dried up. To address the resulting emerging gaps in reliability R&D, CERTS has undertaken much-needed public interest research on reliability technologies for the electricity grid. CERTS' vision is to: (1) Transform the electricity grid into an intelligent network that can sense and respond automatically to changing flows of power and emerging problems; (2) Enhance reliability management through market mechanisms, including transparency of real-time information on the status of the grid; (3) Empower customers to manage their energy use and reliability needs in response to real-time market price signals; and (4) Seamlessly integrate distributed technologies--including those for generation, storage, controls, and communications--to support the reliability needs of both the grid and individual customers.

  13. A discussion of system reliability and the relative importance of pumps and valves to overall system availability

    SciTech Connect

    Poole, A.B.

    1996-12-01

    An analysis was undertaken to establish preliminary trends for how component aging can affect failure rates for swing check valves, centrifugal pumps, and motor-operated valves. These failure rate trends were evaluated over time and linear aging rate models established. The failure rate models were then used with classic reliability theories to estimate reliability as a function of operating time. Reliability theory was also used to establish a simple system reliability model. Using the system model, the relative importance of pumps and valves to overall system reliability was studied. Conclusions were established relative to overall system availability over time and the relative unavailabilities of the various components studied.
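
    The chain from an aging failure rate to system reliability can be sketched compactly: a linearly increasing hazard integrates to a reliability curve, and a strict-series model multiplies the component curves. The hazard parameters below are invented placeholders, not the report's fitted trends.

      import numpy as np

      def reliability_linear_aging(t, lam0, a):
          """R(t) = exp(-(lam0*t + a*t^2/2)) for hazard lam(t) = lam0 + a*t."""
          return np.exp(-(lam0 * t + 0.5 * a * t**2))

      # Hypothetical per-year hazard parameters: one pump and two valves.
      parts = [(0.02, 0.004), (0.01, 0.002), (0.01, 0.002)]
      t = 10.0
      r_parts = [reliability_linear_aging(t, l0, a) for l0, a in parts]
      print([round(r, 3) for r in r_parts], round(float(np.prod(r_parts)), 3))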

  14. Geographical Theories.

    ERIC Educational Resources Information Center

    Golledge, Reginald G.

    1996-01-01

    Discusses the origin of theories in geography and particularly the development of location theories. Considers the influence of economic theory on agricultural land use, industrial location, and geographic location theories. Explores a set of interrelated activities that show how the marketing process illustrates process theory. (MJP)

  15. 75 FR 80391 - Electric Reliability Organization Interpretations of Interconnection Reliability Operations and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-22

    ..., NOPR, Docket No. RM10-15-000, 75 FR 71613 (Nov. 24, 2010), 133 FERC ] 61,151, at P 65 (2010... Electrical and Electronics Engineers, Inc. (IEEE) definition of degraded as ``the inability of an item to... request for interpretation at 4-5 (citing full IEEE definitions of degraded: ``A failure that is...

  16. 78 FR 38851 - Electric Reliability Organization Proposal To Retire Requirements in Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-28

    ... through http://www.ferc.gov . Documents created electronically using word processing software should be... R5--Facility Ratings FAC-010-2.1, Requirement R5--System Operating Limits Methodology for the Planning Horizon FAC-011-2.1, Requirement R5--System Operating Limits Methodology for the...

  17. Management theory and applications.

    PubMed

    Fallon, L F

    2001-01-01

    Management is critical as an organization pursues its mission. There are many theories of management, but all agree that an effective organizational structure can facilitate the operation of a company. The author describes the typical functional areas found in most organizations (finance, operations, marketing, information systems, legal, and human resources); examines how the organization of tasks and people is interlinked; and shows that administrators who have a working knowledge of management theory tend to be effective in the performance of their jobs. PMID:11401785

  18. Wanted: A Solid, Reliable PC

    ERIC Educational Resources Information Center

    Goldsborough, Reid

    2004-01-01

    This article discusses PC reliability, one of the most pressing issues regarding computers. Nearly a quarter century after the introduction of the first IBM PC and the outset of the personal computer revolution, PCs have largely become commodities, with little differentiating one brand from another in terms of capability and performance. Most of…

  19. Web Awards: Are They Reliable?

    ERIC Educational Resources Information Center

    Everhart, Nancy; McKnight, Kathleen

    1997-01-01

    School library media specialists recommend quality Web sites to children based on evaluations and Web awards. This article examines three types of Web awards and who grants them, suggests ways to determine their reliability, and discusses specific award sites. Includes a bibliography of Web sites. (PEN)

  20. The Reliability of College Grades

    ERIC Educational Resources Information Center

    Beatty, Adam S.; Walmsley, Philip T.; Sackett, Paul R.; Kuncel, Nathan R.; Koch, Amanda J.

    2015-01-01

    Little is known about the reliability of college grades relative to how prominently they are used in educational research, and the results to date tend to be based on small sample studies or are decades old. This study uses two large databases (N > 800,000) from over 200 educational institutions spanning 13 years and finds that both first-year…

  1. Averaging Internal Consistency Reliability Coefficients

    ERIC Educational Resources Information Center

    Feldt, Leonard S.; Charter, Richard A.

    2006-01-01

    Seven approaches to averaging reliability coefficients are presented. Each approach starts with a unique definition of the concept of "average," and no approach is more correct than the others. Six of the approaches are applicable to internal consistency coefficients. The seventh approach is specific to alternate-forms coefficients. Although the…

  2. Photovoltaic performance and reliability workshop

    SciTech Connect

    Kroposki, B

    1996-10-01

    This proceedings is the compilation of papers presented at the ninth PV Performance and Reliability Workshop held at the Sheraton Denver West Hotel on September 4--6, 1996. This year's workshop included presentations from 25 speakers and had over 100 attendees. All of the presentations that were given are included in this proceedings. Topics of the papers included: defining service lifetime and developing models for PV module lifetime; examining and determining failure and degradation mechanisms in PV modules; combining IEEE/IEC/UL testing procedures; AC module performance and reliability testing; inverter reliability/qualification testing; standardization of utility interconnect requirements for PV systems; needed activities to separate variables by testing individual components of PV systems (e.g. cells, modules, batteries, inverters, charge controllers) for individual reliability and then testing them in actual system configurations; more results reported from field experience on modules, inverters, batteries, and charge controllers from field-deployed PV systems; and system certification and standardized testing for stand-alone and grid-tied systems.

  3. Reliability Analysis of Money Habitudes

    ERIC Educational Resources Information Center

    Delgadillo, Lucy M.; Bushman, Brittani S.

    2015-01-01

    Use of the Money Habitudes exercise has gained popularity among various financial professionals. This article reports on the reliability of this resource. A survey administered to young adults at a western state university was conducted, and each Habitude or "domain" was analyzed using Cronbach's alpha procedures. Results showed all six…

  4. Wind turbine reliability database update.

    SciTech Connect

    Peters, Valerie A.; Hill, Roger Ray; Stinebaugh, Jennifer A.; Veers, Paul S.

    2009-03-01

    This report documents the status of the Sandia National Laboratories' Wind Plant Reliability Database. Included in this report are updates on the form and contents of the Database, which stems from a five-step process of data partnerships, data definition and transfer, data formatting and normalization, analysis, and reporting. Selected observations are also reported.

  5. Compound estimation procedures in reliability

    NASA Technical Reports Server (NTRS)

    Barnes, Ron

    1990-01-01

    At NASA, components and subsystems of components in the Space Shuttle and Space Station generally go through a number of redesign stages. While data on failures for various design stages are sometimes available, the classical procedures for evaluating reliability only utilize the failure data on the present design stage of the component or subsystem. Often, few or no failures have been recorded on the present design stage. Previously, Bayesian estimators for the reliability of a single component, conditioned on the failure data for the present design, were developed. These new estimators permit NASA to evaluate the reliability, even when few or no failures have been recorded. Point estimates for the latter evaluation were not possible with the classical procedures. Since different design stages of a component (or subsystem) generally have a good deal in common, the development of new statistical procedures for evaluating the reliability, which consider the entire failure record for all design stages, has great intuitive appeal. A typical subsystem consists of a number of different components and each component has evolved through a number of redesign stages. The present investigations considered compound estimation procedures and related models. Such models permit the statistical consideration of all design stages of each component and thus incorporate all the available failure data to obtain estimates for the reliability of the present version of the component (or subsystem). A number of models were considered to estimate the reliability of a component conditioned on its total failure history from two design stages. It was determined that reliability estimators for the present design stage, conditioned on the complete failure history for two design stages have lower risk than the corresponding estimators conditioned only on the most recent design failure data. Several models were explored and preliminary models involving bivariate Poisson distribution and the
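
    One simple way to let an earlier design stage inform the present estimate, in the spirit described here, is a discounted (power-prior style) Beta-binomial update: the old stage's data count toward the posterior, but with a weight below one. This is a hedged illustration of the idea, not the models the report develops; the weight and test counts are assumptions.

      import numpy as np

      def posterior_reliability(s_new, n_new, s_old, n_old, w=0.5,
                                a0=1.0, b0=1.0, size=100_000):
          """Beta posterior for present-stage reliability with the previous
          stage's data discounted by weight w (0 = ignore, 1 = pool fully)."""
          a = a0 + s_new + w * s_old
          b = b0 + (n_new - s_new) + w * (n_old - s_old)
          draws = np.random.default_rng(3).beta(a, b, size=size)
          return a / (a + b), np.percentile(draws, [5, 95])

      mean, (lo, hi) = posterior_reliability(s_new=9, n_new=9, s_old=18, n_old=20)
      print(f"posterior mean {mean:.3f}, 90% interval {lo:.3f}-{hi:.3f}")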

  6. The Information Function for the One-Parameter Logistic Model: Is it Reliability?

    ERIC Educational Resources Information Center

    Doran, Harold C.

    2005-01-01

    The information function is an important statistic in item response theory (IRT) applications. Although the information function is often described as the IRT version of reliability, it differs from the classical notion of reliability from a critical perspective: replication. This article first explores the information function for the…
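
    The contrast the article draws is visible in the formula itself: for the one-parameter logistic (Rasch) model the test information is I(theta) = sum over items of P_i(theta)(1 - P_i(theta)), so precision varies with the examinee's location rather than being a single test-wide number like a reliability coefficient. A minimal sketch with made-up item difficulties:

      import numpy as np

      def rasch_information(theta, difficulties):
          """Test information under the Rasch model: sum of P*(1-P) over items."""
          b = np.asarray(difficulties)
          p = 1.0 / (1.0 + np.exp(-(theta - b)))
          return np.sum(p * (1.0 - p))

      items = [-1.5, -0.5, 0.0, 0.5, 1.5]   # hypothetical item difficulties
      for theta in (-2.0, 0.0, 2.0):
          info = rasch_information(theta, items)
          print(theta, round(info, 3), round(1.0 / np.sqrt(info), 3))  # SE(theta)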

  7. Goal Theory and Individual Productivity.

    ERIC Educational Resources Information Center

    Frost, Peter J.

    The paper provides a review of goal theory as articulated by Edwin Locke. The theory is evaluated in terms of laboratory and field research, and its practical usefulness is explored as a means to improving individual productivity in "real world" organizations. Research findings provide support for some goal theory propositions but suggest also the…

  8. Crosslinkage theory of senile cataracts.

    PubMed

    Bellows, J G; Bellows, R T

    1976-02-01

    The theory of crosslinkage as a cause of aging of all living organic tissues has withstood the test of time. Nearly all the past theories of senile cataract formation contain elements involving a potent crosslinking agent. Therefore crosslinkage may now be recognized as the common denominator in past theories of senile cataracts. This paper proposes that crosslinkage is the mechanism for senile cataract formation.

  9. Power Quality and Reliability Project

    NASA Technical Reports Server (NTRS)

    Attia, John O.

    2001-01-01

    One area where universities and industry can link is in the area of power systems reliability and quality - key concepts in the commercial, industrial and public sector engineering environments. Prairie View A&M University (PVAMU) has established a collaborative relationship with the University of Texas at Arlington (UTA), NASA/Johnson Space Center (JSC), and EP&C Engineering and Technology Group (EP&C), a small disadvantaged business that specializes in power quality and engineering services. The primary goal of this collaboration is to facilitate the development and implementation of a Strategic Integrated Power/Systems Reliability and Curriculum Enhancement Program. The objectives of the first phase of this work are: (a) to develop a course in power quality and reliability, (b) to use the campus of Prairie View A&M University as a laboratory for the study of systems reliability and quality issues, (c) to provide students with NASA/EPC shadowing and internship experience. In this work, a course titled "Reliability Analysis of Electrical Facilities" was developed and taught for two semesters. About thirty-seven students have benefited directly from this course. A laboratory accompanying the course was also developed. Four facilities at Prairie View A&M University were surveyed. Some tests that were performed are (i) earth-ground testing, (ii) voltage, amperage, and harmonics of various panels in the buildings, (iii) checking the wire sizes to see if they were the right size for the load that they were carrying, (iv) vibration tests to test the status of the engines or chillers and water pumps, and (v) infrared testing to test for arcing or misfiring of electrical or mechanical systems.

  10. Domains and Naive Theories

    PubMed Central

    Gelman, Susan A.; Noles, Nicholaus S.

    2013-01-01

    Human cognition entails domain-specific cognitive processes that influence memory, attention, categorization, problem-solving, reasoning, and knowledge organization. This review examines domain-specific causal theories, which are of particular interest for permitting an examination of how knowledge structures change over time. We first describe the properties of commonsense theories, and how commonsense theories differ from scientific theories, illustrating with children’s classification of biological and non-biological kinds. We next consider the implications of domain-specificity for broader issues regarding cognitive development and conceptual change. We then examine the extent to which domain-specific theories interact, and how people reconcile competing causal frameworks. Future directions for research include examining how different content domains interact, the nature of theory change, the role of context (including culture, language, and social interaction) in inducing different frameworks, and the neural bases for domain-specific reasoning. PMID:24187603

  11. Indentation of polydimethylsiloxane submerged in organic solvents

    NASA Astrophysics Data System (ADS)

    Hu, Yuhang; Chen, Xin; Whitesides, George; Vlassak, Joost; Suo, Zhigang

    2011-03-01

    This study uses a method based on indentation to characterize a polydimethylsiloxane (PDMS) elastomer submerged in an organic solvent (decane, heptane, pentane, or cyclohexane). An indenter is pressed into a disk of a swollen elastomer to a fixed depth, and the force on the indenter is recorded as a function of time. By examining how the relaxation time scales with the radius of contact, one can differentiate the poroelastic behavior from the viscoelastic behavior. By matching the relaxation curve measured experimentally to that derived from the theory of poroelasticity, one can identify elastic constants and permeability. The measured elastic constants are interpreted within the Flory-Huggins theory. The measured permeabilities indicate that the solvents migrate in PDMS by diffusion, rather than by convection. This work confirms that indentation is a reliable and convenient method to characterize swollen elastomers.
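
    The scaling argument that separates the two behaviors is compact: poroelastic relaxation is limited by solvent diffusion across the contact, so the characteristic time grows as tau ~ a^2/D with contact radius a, while a viscoelastic time constant is independent of a. A back-of-envelope sketch with an assumed diffusivity (not the paper's fitted value):

      # Poroelastic relaxation time scales as tau ~ a^2 / D; a viscoelastic
      # relaxation time would not change with the contact radius a.
      D = 1.0e-9   # assumed solvent diffusivity in the network, m^2/s
      for a_mm in (0.1, 0.3, 1.0):
          a = a_mm * 1e-3
          print(f"a = {a_mm} mm -> tau ~ {a**2 / D:.0f} s")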

  12. Reliability analysis applied to structural tests

    NASA Technical Reports Server (NTRS)

    Diamond, P.; Payne, A. O.

    1972-01-01

    The application of reliability theory to predict, from structural fatigue test data, the risk of failure of a structure under service conditions because its load-carrying capability is progressively reduced by the extension of a fatigue crack, is considered. The procedure is applicable to both safe-life and fail-safe structures and, for a prescribed safety level, it will enable an inspection procedure to be planned or, if inspection is not feasible, it will evaluate the life to replacement. The theory has been further developed to cope with the case of structures with initial cracks, such as can occur in modern high-strength materials which are susceptible to the formation of small flaws during the production process. The method has been applied to a structure of high-strength steel and the results are compared with those obtained by the current life estimation procedures. This has shown that the conventional methods can be unconservative in certain cases, depending on the characteristics of the structure and the design operating conditions. The suitability of the probabilistic approach to the interpretation of the results from full-scale fatigue testing of aircraft structures is discussed and the assumptions involved are examined.

  13. Reliable biological communication with realistic constraints.

    PubMed

    de Polavieja, Gonzalo G

    2004-12-01

    Communication in biological systems must deal with noise and metabolic or temporal constraints. We include these constraints into information theory to obtain the distributions of signal usage corresponding to a maximal rate of information transfer given any noise structure and any constraints. Generalized versions of the Boltzmann, Gaussian, or Poisson distributions are obtained for linear, quadratic and temporal constraints, respectively. These distributions are shown to imply that biological transformations must dedicate a larger output range to the more probable inputs and less to the outputs with higher noise and higher participation in the constraint. To show the general theory of reliable communication at work, we apply these results to biochemical and neuronal signaling. Noncooperative enzyme kinetics is shown to be suited for transfer of a high signal quality when the input distribution has a maximum at low concentrations while cooperative kinetics for near-Gaussian input statistics. Neuronal codes based on spike rates, spike times or bursts have to balance signal quality and cost-efficiency and at the network level imply sparseness and uncorrelation within the limits of noise, cost, and processing operations. PMID:15697405
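
    The constrained-maximization step behind those distribution families is the standard maximum-entropy calculation; as a sketch (suppressing the noise structure the paper handles in general),

      $$\max_p \; -\!\int p(x)\ln p(x)\,\mathrm{d}x \quad \text{s.t.} \quad \int p(x)\,\mathrm{d}x = 1, \;\; \int C(x)\,p(x)\,\mathrm{d}x = c_0 \;\;\Longrightarrow\;\; p^*(x) \;\propto\; e^{-\lambda C(x)},$$

    so a linear cost C(x) = x yields a Boltzmann-like exponential, a quadratic cost yields a Gaussian, and a count-based temporal constraint leads to the Poisson family, matching the generalized distributions named in the abstract.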

  14. Computational Thermochemistry and Benchmarking of Reliable Methods

    SciTech Connect

    Feller, David F.; Dixon, David A.; Dunning, Thom H.; Dupuis, Michel; McClemore, Doug; Peterson, Kirk A.; Xantheas, Sotiris S.; Bernholdt, David E.; Windus, Theresa L.; Chalasinski, Grzegorz; Fosada, Rubicelia; Olguim, Jorge; Dobbs, Kerwin D.; Frurip, Donald; Stevens, Walter J.; Rondan, Nelson; Chase, Jared M.; Nichols, Jeffrey A.

    2006-06-20

    During the first and second years of the Computational Thermochemistry and Benchmarking of Reliable Methods project, we completed several studies using the parallel computing capabilities of the NWChem software and Molecular Science Computing Facility (MSCF), including large-scale density functional theory (DFT), second-order Moeller-Plesset (MP2) perturbation theory, and CCSD(T) calculations. During the third year, we continued to pursue the computational thermodynamic and benchmarking studies outlined in our proposal. With the issues affecting the robustness of the coupled cluster part of NWChem resolved, we pursued studies of the heats-of-formation of compounds containing 5 to 7 first- and/or second-row elements and approximately 10 to 14 hydrogens. The size of these systems, when combined with the large basis sets (cc-pVQZ and aug-cc-pVQZ) that are necessary for extrapolating to the complete basis set limit, creates a formidable computational challenge, for which NWChem on NWMPP1 is well suited.
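
    A common form of the complete-basis-set extrapolation referred to here fits total energies at successive cardinal numbers to a mixed exponential/Gaussian expression and reads off the limit. The sketch below uses that widely used three-point form with invented energies; it is illustrative, not the project's actual protocol or data.

      import numpy as np
      from scipy.optimize import curve_fit

      # E(n) = E_cbs + A*exp(-(n-1)) + B*exp(-(n-1)^2), n = cc-pVnZ cardinal number.
      def model(n, e_cbs, a, b):
          return e_cbs + a * np.exp(-(n - 1)) + b * np.exp(-((n - 1) ** 2))

      n = np.array([2.0, 3.0, 4.0])            # cc-pVDZ, cc-pVTZ, cc-pVQZ
      e = np.array([-76.24, -76.33, -76.36])   # hypothetical energies (hartree)
      (e_cbs, a, b), _ = curve_fit(model, n, e, p0=(e[-1], 0.1, 0.1))
      print(f"E(CBS) ~ {e_cbs:.4f} hartree")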

  15. Electromagnetic Model Reliably Predicts Radar Scattering Characteristics of Airborne Organisms

    PubMed Central

    Mirkovic, Djordje; Stepanian, Phillip M.; Kelly, Jeffrey F.; Chilson, Phillip B.

    2016-01-01

    The radar scattering characteristics of aerial animals are typically obtained from controlled laboratory measurements of a freshly harvested specimen. These measurements are tedious to perform, difficult to replicate, and typically yield only a small subset of the full azimuthal, elevational, and polarimetric radio scattering data. As an alternative, biological applications of radar often assume that the radar cross sections of flying animals are isotropic, since sophisticated computer models are required to estimate the 3D scattering properties of objects having complex shapes. Using the method of moments implemented in the WIPL-D software package, we show for the first time that such electromagnetic modeling techniques (typically applied to man-made objects) can accurately predict organismal radio scattering characteristics from an anatomical model: here the Brazilian free-tailed bat (Tadarida brasiliensis). The simulated scattering properties of the bat agree with controlled measurements and radar observations made during a field study of bats in flight. This numerical technique can produce the full angular set of quantitative polarimetric scattering characteristics, while eliminating many practical difficulties associated with physical measurements. Such a modeling framework can be applied for bird, bat, and insect species, and will help drive a shift in radar biology from a largely qualitative and phenomenological science toward quantitative estimation of animal densities and taxonomic identification. PMID:27762292

  16. Promoting health care safety through training high reliability teams.

    PubMed

    Wilson, K A; Burke, C S; Priest, H A; Salas, E

    2005-08-01

    Many organizations have been using teams as a means of achieving organizational outcomes (such as productivity and safety). Research has indicated that teams, especially those operating in complex environments, are not always effective. There is a subset of organizations in which teams operate that are able to balance effectiveness and safety despite the complexities of the environment (for example, aviation, nuclear power). These high reliability organizations (HROs) have begun to be examined as a model for those in other complex domains, such as health care, that strive to reach a status of high reliability. In this paper we analyse the components leading to the effectiveness of HROs by examining the teams that comprise them. We use a systems perspective to uncover the behavioral markers by which high reliability teams (HRTs) are able to uphold the values of their parent organizations, thereby promoting safety. Using these markers, we offer guidelines and developmental strategies that will help the healthcare community to shift more quickly to high reliability status by not focusing solely on the organizational level.

  17. Promoting health care safety through training high reliability teams

    PubMed Central

    Wilson, K; Burke, C; Priest, H; Salas, E

    2005-01-01

    

 Many organizations have been using teams as a means of achieving organizational outcomes (such as productivity and safety). Research has indicated that teams, especially those operating in complex environments, are not always effective. There is a subset of organizations in which teams operate that are able to balance effectiveness and safety despite the complexities of the environment (for example, aviation, nuclear power). These high reliability organizations (HROs) have begun to be examined as a model for those in other complex domains, such as health care, that strive to reach a status of high reliability. In this paper we analyse the components leading to the effectiveness of HROs by examining the teams that comprise them. We use a systems perspective to uncover the behavioral markers by which high reliability teams (HRTs) are able to uphold the values of their parent organizations, thereby promoting safety. Using these markers, we offer guidelines and developmental strategies that will help the healthcare community to shift more quickly to high reliability status by not focusing solely on the organizational level. PMID:16076797

  18. 18 CFR 39.7 - Enforcement of Reliability Standards.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... application a proposal for the prompt reporting to the Commission of any self-reported violation or... provision for the prompt reporting through the Electric Reliability Organization to the Commission of any... or operator that is the subject of such penalty. (4) Any answer, intervention or comment to...

  19. 18 CFR 39.7 - Enforcement of Reliability Standards.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... application a proposal for the prompt reporting to the Commission of any self-reported violation or... provision for the prompt reporting through the Electric Reliability Organization to the Commission of any... penalty. (4) Any answer, intervention or comment to an application for review of a penalty imposed...

  20. 18 CFR 39.7 - Enforcement of Reliability Standards.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... application a proposal for the prompt reporting to the Commission of any self-reported violation or... provision for the prompt reporting through the Electric Reliability Organization to the Commission of any... penalty. (4) Any answer, intervention or comment to an application for review of a penalty imposed...