Science.gov

Sample records for reliability organizations theory

  1. Organization Theory as Ideology.

    ERIC Educational Resources Information Center

    Greenfield, Thomas B.

    The theory that organizations are ideological inventions of the human mind is discussed. Organizational science is described as an ideology which is based upon social concepts and experiences. The main justification for organizational theory is that it attempts to answer why we behave as we do in social organizations. Ways in which ideas and…

  2. Creating High Reliability in Health Care Organizations

    PubMed Central

    Pronovost, Peter J; Berenholtz, Sean M; Goeschel, Christine A; Needham, Dale M; Sexton, J Bryan; Thompson, David A; Lubomski, Lisa H; Marsteller, Jill A; Makary, Martin A; Hunt, Elizabeth

    2006-01-01

    Objective The objective of this paper was to present a comprehensive approach to help health care organizations reliably deliver effective interventions. Context Reliability in healthcare translates into using valid rate-based measures. Yet high reliability organizations have proven that the context in which care is delivered, called organizational culture, also has important influences on patient safety. Model for Improvement Our model to improve reliability, which also includes interventions to improve culture, focuses on valid rate-based measures. This model includes (1) identifying evidence-based interventions that improve the outcome, (2) selecting interventions with the most impact on outcomes and converting to behaviors, (3) developing measures to evaluate reliability, (4) measuring baseline performance, and (5) ensuring patients receive the evidence-based interventions. The comprehensive unit-based safety program (CUSP) is used to improve culture and guide organizations in learning from mistakes that are important, but cannot be measured as rates. Conclusions We present how this model was used in over 100 intensive care units in Michigan to improve culture and eliminate catheter-related blood stream infections—both were accomplished. Our model differs from existing models in that it incorporates efforts to improve a vital component for system redesign—culture, it targets 3 important groups—senior leaders, team leaders, and front line staff, and facilitates change management—engage, educate, execute, and evaluate for planned interventions. PMID:16898981

  3. High Reliability Organizations in Education. Noteworthy Perspectives

    ERIC Educational Resources Information Center

    Eck, James H.; Bellamy, G. Thomas; Schaffer, Eugene; Stringfield, Sam; Reynolds, David

    2011-01-01

    The authors of this monograph assert that by assisting school systems to more closely resemble "high reliability" organizations (HROs) that already exist in other industries and benchmarking against top-performing education systems from around the globe, America's school systems can transform themselves from compliance-driven bureaucracies to…

  4. 78 FR 38851 - Electric Reliability Organization Proposal To Retire Requirements in Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-28

    ... Federal Energy Regulatory Commission 18 CFR Part 40 Electric Reliability Organization Proposal To Retire... Electric Reliability Corporation (NERC), the Commission-certified Electric Reliability Organization. The... 20426, Telephone: (202) 502-6840. Michael Gandolfo (Technical Information), Office of...

  5. Organization theory. Analyzing health care organizations.

    PubMed

    Cors, W K

    1997-02-01

    Organization theory (OT) is a tool that can be applied to analyze and understand health care organizations. Transaction cost theory is used to explain, in a unifying fashion, the myriad changes being undertaken by different groups of constituencies in health care. Agency theory is applied to aligning economic incentives needed to ensure Integrated Delivery System (IDS) success. By using tools such as OT, a clearer understanding of organizational changes is possible. PMID:10164970

  6. 18 CFR 39.3 - Electric Reliability Organization certification.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 18 Conservation of Power and Water Resources 1 2013-04-01 2013-04-01 false Electric Reliability... CERTIFICATION OF THE ELECTRIC RELIABILITY ORGANIZATION; AND PROCEDURES FOR THE ESTABLISHMENT, APPROVAL, AND ENFORCEMENT OF ELECTRIC RELIABILITY STANDARDS § 39.3 Electric Reliability Organization certification. (a)...

  7. Microbial community modeling using reliability theory.

    PubMed

    Zilles, Julie L; Rodríguez, Luis F; Bartolerio, Nicholas A; Kent, Angela D

    2016-08-01

    Linking microbial community composition with the corresponding ecosystem functions remains challenging. Because microbial communities can differ in their functional responses, this knowledge gap limits ecosystem assessment, design and management. To develop models that explicitly incorporate microbial populations and guide efforts to characterize their functional differences, we propose a novel approach derived from reliability engineering. This reliability modeling approach is illustrated here using a microbial ecology dataset from denitrifying bioreactors. Reliability modeling is well-suited for analyzing the stability of complex networks composed of many microbial populations. It could also be applied to evaluate the redundancy within a particular biochemical pathway in a microbial community. Reliability modeling allows characterization of the system's resilience and identification of failure-prone functional groups or biochemical steps, which can then be targeted for monitoring or enhancement. The reliability engineering approach provides a new perspective for unraveling the interactions between microbial community diversity, functional redundancy and ecosystem services, as well as practical tools for the design and management of engineered ecosystems. PMID:26882268
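    In reliability engineering terms, redundant populations that perform the same function behave like components in parallel, while the successive steps of a biochemical pathway behave like components in series. A minimal sketch of that arithmetic (the pathway structure and population reliabilities below are invented for illustration, not taken from the paper):

```python
def parallel_reliability(ps):
    """Redundant populations performing one function: the step works
    if at least one member works."""
    fail = 1.0
    for p in ps:
        fail *= (1.0 - p)
    return 1.0 - fail

def series_reliability(ps):
    """Sequential pathway steps: the pathway works only if every step works."""
    r = 1.0
    for p in ps:
        r *= p
    return r

# A three-step pathway with varying redundancy (illustrative numbers only).
steps = [
    [0.9, 0.8],       # step 1: two redundant populations
    [0.7, 0.7, 0.6],  # step 2: three redundant populations
    [0.95],           # step 3: a single population, no redundancy
]
pathway = series_reliability(parallel_reliability(s) for s in steps)
```

    In this toy pathway the single-population step has the lowest step reliability, which is exactly the kind of failure-prone functional group the abstract proposes identifying and targeting for monitoring or enhancement.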

  8. Administrative Law and Organization Theory

    ERIC Educational Resources Information Center

    Evan, William M.

    1977-01-01

    Considered are some trends in American administrative law, some trends in organization theory, a model of the administrative process, and several potentially useful research strategies. The analysis has implications for comparative research on legal systems. (Author/LBH)

  9. Design of high reliability organizations in health care

    PubMed Central

    Carroll, J S; Rudolph, J W

    2006-01-01

    To improve safety performance, many healthcare organizations have sought to emulate high reliability organizations from industries such as nuclear power, chemical processing, and military operations. We outline high reliability design principles for healthcare organizations including both the formal structures and the informal practices that complement those structures. A stage model of organizational structures and practices, moving from local autonomy to formal controls to open inquiry to deep self‐understanding, is used to illustrate typical challenges and design possibilities at each stage. We suggest how organizations can use the concepts and examples presented to increase their capacity to self‐design for safety and reliability. PMID:17142607

  10. 75 FR 80391 - Electric Reliability Organization Interpretations of Interconnection Reliability Operations and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-22

    ..., NOPR, Docket No. RM10-15-000, 75 FR 71613 (Nov. 24, 2010), 133 FERC ] 61,151, at P 65 (2010... Energy Regulatory Commission 18 CFR Part 40 Electric Reliability Organization Interpretations of... (Commission) proposes to approve the North American Electric Reliability Corporation's (NERC)...

  11. Studying Reliability of Open Ended Mathematics Items According to the Classical Test Theory and Generalizability Theory

    ERIC Educational Resources Information Center

    Guler, Nese; Gelbal, Selahattin

    2010-01-01

    In this study, classical test theory and generalizability theory were used to determine the reliability of scores obtained from a measurement tool of mathematics success. Twenty-four open-ended mathematics questions from TIMSS-1999 were administered to 203 students in the 2007 spring semester. Internal consistency of scores was found to be 0.92. For…
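    The internal-consistency figure reported above is typically Cronbach's alpha, the workhorse reliability coefficient of classical test theory. A minimal sketch of the computation (the score matrix below is invented for illustration, not data from the TIMSS study):

```python
def cronbach_alpha(scores):
    """Cronbach's alpha from a persons-by-items score matrix."""
    k = len(scores[0])            # number of items
    def var(xs):                  # unbiased sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = [var([row[i] for row in scores]) for i in range(k)]
    total_var = var([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Four students, three items (hypothetical scores).
scores = [[2, 3, 3], [4, 4, 5], [1, 2, 1], [5, 4, 4]]
alpha = cronbach_alpha(scores)   # about 0.93 here
```

    When the items are perfectly correlated, the item variances sum to a small fraction of the total-score variance and alpha approaches 1.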

  12. Applicability of Complex Organization Theory to Small Organizations

    ERIC Educational Resources Information Center

    Dolch, Norman A.; Heffernan, William D.

    1978-01-01

    Reviews research literature and describes a study concerning the applicability of complex organization theory to small organizations. Finds that organizational-structural properties can be measured in small organizations; complex organization theory can be used to better understand small organizations; and certain measurement techniques used in…

  13. Modeling Sensor Reliability in Fault Diagnosis Based on Evidence Theory

    PubMed Central

    Yuan, Kaijuan; Xiao, Fuyuan; Fei, Liguo; Kang, Bingyi; Deng, Yong

    2016-01-01

    Sensor data fusion plays an important role in fault diagnosis. Dempster–Shafer (D-S) evidence theory is widely used in fault diagnosis, since it is efficient at combining evidence from different sensors. However, when the evidence is highly conflicting, it may yield a counterintuitive result. To address this issue, a new method is proposed in this paper. Not only the static sensor reliability, but also the dynamic sensor reliability is taken into consideration. The evidence distance function and the belief entropy are combined to obtain the dynamic reliability of each sensor report. A weighted averaging method is adopted to modify the conflicting evidence by assigning different weights to evidence according to sensor reliability. The proposed method performs better in conflict management and fault diagnosis because the information volume of each sensor report is taken into consideration. An application in fault diagnosis based on sensor fusion is presented to illustrate the efficiency of the proposed method. The results show that the proposed method improves the accuracy of fault diagnosis from 81.19% to 89.48% compared to the existing methods. PMID:26797611

  14. Modeling Sensor Reliability in Fault Diagnosis Based on Evidence Theory.

    PubMed

    Yuan, Kaijuan; Xiao, Fuyuan; Fei, Liguo; Kang, Bingyi; Deng, Yong

    2016-01-01

    Sensor data fusion plays an important role in fault diagnosis. Dempster-Shafer (D-S) evidence theory is widely used in fault diagnosis, since it is efficient at combining evidence from different sensors. However, when the evidence is highly conflicting, it may yield a counterintuitive result. To address this issue, a new method is proposed in this paper. Not only the static sensor reliability, but also the dynamic sensor reliability is taken into consideration. The evidence distance function and the belief entropy are combined to obtain the dynamic reliability of each sensor report. A weighted averaging method is adopted to modify the conflicting evidence by assigning different weights to evidence according to sensor reliability. The proposed method performs better in conflict management and fault diagnosis because the information volume of each sensor report is taken into consideration. An application in fault diagnosis based on sensor fusion is presented to illustrate the efficiency of the proposed method. The results show that the proposed method improves the accuracy of fault diagnosis from 81.19% to 89.48% compared to the existing methods. PMID:26797611
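    The combination step underlying the approach described above is Dempster's rule. A minimal sketch of reliability-weighted averaging followed by Dempster combination (the frame of discernment, sensor masses, and reliability weights below are invented for illustration, not taken from the paper):

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for BPAs keyed by frozenset focal elements."""
    fused, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            fused[inter] = fused.get(inter, 0.0) + x * y
        else:
            conflict += x * y              # mass falling on the empty set
    return {k: v / (1.0 - conflict) for k, v in fused.items()}

def weighted_average(bpas, weights):
    """Average BPAs using sensor-reliability weights that sum to 1."""
    avg = {}
    for m, w in zip(bpas, weights):
        for focal, mass in m.items():
            avg[focal] = avg.get(focal, 0.0) + w * mass
    return avg

A, B = frozenset({"A"}), frozenset({"B"})
m1 = {A: 0.9, B: 0.1}                      # sensor 1: strongly favors fault A
m2 = {A: 0.1, B: 0.9}                      # sensor 2: conflicts with sensor 1
avg = weighted_average([m1, m2], [0.7, 0.3])   # sensor 1 deemed more reliable
fused = dempster_combine(avg, avg)         # combine the averaged evidence
```

    With these weights the fused mass on A comes out near 0.79, so the sensor judged more reliable dominates the verdict despite the conflict.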

  15. Reliability between nurse managers: the key to the high-reliability organization.

    PubMed

    Kerfoot, Karlene

    2006-01-01

    Flawless execution rests in the hands of nurse managers. No one can work alone in health care any more. We are interdependent and know that the best outcomes happen when practices are organized around collegial supportive structures rather than autonomous competitive units. We are only as strong as our weakest link. If all managers see the big picture and look beyond their units for what is right for the common good, we will achieve high-reliability organizations in health care. In turn health care organizations will become very safe places to operate. Shared governance structures for nurse managers are the perfect vehicle to develop collaborative organizations and flawless execution, and to adopt high-reliability organization principles. PMID:17131622

  16. Teamwork as an Essential Component of High-Reliability Organizations

    PubMed Central

    Baker, David P; Day, Rachel; Salas, Eduardo

    2006-01-01

    Organizations are increasingly becoming dynamic and unstable. This evolution has given rise to greater reliance on teams and increased complexity in terms of team composition, skills required, and degree of risk involved. High-reliability organizations (HROs) are those that exist in such hazardous environments where the consequences of errors are high, but the occurrence of error is extremely low. In this article, we argue that teamwork is an essential component of achieving high reliability particularly in health care organizations. We describe the fundamental characteristics of teams, review strategies in team training, demonstrate the criticality of teamwork in HROs and finally, identify specific challenges the health care community must address to improve teamwork and enhance reliability. PMID:16898980

  17. Teamwork as an essential component of high-reliability organizations.

    PubMed

    Baker, David P; Day, Rachel; Salas, Eduardo

    2006-08-01

    Organizations are increasingly becoming dynamic and unstable. This evolution has given rise to greater reliance on teams and increased complexity in terms of team composition, skills required, and degree of risk involved. High-reliability organizations (HROs) are those that exist in such hazardous environments where the consequences of errors are high, but the occurrence of error is extremely low. In this article, we argue that teamwork is an essential component of achieving high reliability particularly in health care organizations. We describe the fundamental characteristics of teams, review strategies in team training, demonstrate the criticality of teamwork in HROs and finally, identify specific challenges the health care community must address to improve teamwork and enhance reliability. PMID:16898980

  18. 78 FR 41339 - Electric Reliability Organization Proposal To Retire Requirements in Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-10

    ... published in the Federal Register of Friday, June 28, 2013 (78 FR 38851). The proposed regulations would...,'' respectively, and the Total is changed from ``$535,500'' to ``$518,220.'' In FR Doc. 2013-15433 appearing on... Federal Energy Regulatory Commission 18 CFR Part 40 Electric Reliability Organization Proposal To...

  19. Generalizability Theory as a Unifying Framework of Measurement Reliability in Adolescent Research

    ERIC Educational Resources Information Center

    Fan, Xitao; Sun, Shaojing

    2014-01-01

    In adolescence research, the treatment of measurement reliability is often fragmented, and it is not always clear how different reliability coefficients are related. We show that generalizability theory (G-theory) is a comprehensive framework of measurement reliability, encompassing all other reliability methods (e.g., Pearson "r,"…

  20. 18 CFR 39.3 - Electric Reliability Organization certification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Electric Reliability Organization certification. 39.3 Section 39.3 Conservation of Power and Water Resources FEDERAL ENERGY... operators of the Bulk-Power System, and other interested parties for improvement of the Electric...

  1. 18 CFR 39.3 - Electric Reliability Organization certification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Electric Reliability Organization certification. 39.3 Section 39.3 Conservation of Power and Water Resources FEDERAL ENERGY... operators of the Bulk-Power System, and other interested parties for improvement of the Electric...

  2. 18 CFR 39.3 - Electric Reliability Organization certification.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 18 Conservation of Power and Water Resources 1 2014-04-01 2014-04-01 false Electric Reliability Organization certification. 39.3 Section 39.3 Conservation of Power and Water Resources FEDERAL ENERGY... operators of the Bulk-Power System, and other interested parties for improvement of the Electric...

  3. 18 CFR 39.3 - Electric Reliability Organization certification.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Electric Reliability Organization certification. 39.3 Section 39.3 Conservation of Power and Water Resources FEDERAL ENERGY... operators of the Bulk-Power System, and other interested parties for improvement of the Electric...

  4. Evaluating reliability and resolution of ensemble forecasts using information theory

    NASA Astrophysics Data System (ADS)

    Weijs, Steven; van de Giesen, Nick

    2010-05-01

    Ensemble forecasts are increasingly popular for the communication of uncertainty to the public and decision makers. Ideally, an ensemble forecast reflects both the uncertainty and the information in a forecast, which means that the spread in the ensemble should accurately represent the true uncertainty. For ensembles to be useful, they should be probabilistic, as probability is the language to precisely describe an incomplete state of knowledge, which is typical of forecasts. Information theory provides the ideal tools to deal with uncertainty and information in forecasts. Essential to the use and development of models and forecasts are ways to evaluate their quality. Without a proper definition of what is good, it is impossible to improve forecasts. In contrast to forecast value, which is user dependent, forecast quality, which is defined as the correspondence between forecasts and observations, can be objectively defined, given the question that is asked. The evaluation of forecast quality is known as forecast verification. Numerous techniques for forecast verification have been developed over the past decades. The Brier score (BS) and the derived Ranked Probability Score (RPS) are among the most widely used scores for measuring forecast quality. Both of these scores can be split into three additive components: uncertainty, reliability and resolution. While the first component, uncertainty, depends only on the inherent variability in the forecasted event, the latter two measure different aspects of the quality of the forecasts themselves. Resolution measures the difference between the conditional probabilities and the marginal probabilities of occurrence. The third component, reliability, measures the conditional bias in the probability estimates, hence unreliability would be a better name. In this work, we argue that information theory should be adopted as the correct framework for measuring the quality of probabilistic ensemble forecasts. We use the information
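    The three-component split described above is Murphy's decomposition of the Brier score, BS = reliability - resolution + uncertainty, computed over bins of forecast probability. A minimal sketch with invented forecast data (binning forecasts by their exact stated probability):

```python
from collections import defaultdict

def brier_decomposition(forecasts, outcomes):
    """Murphy decomposition of the Brier score over forecast-probability bins:
    BS = reliability - resolution + uncertainty."""
    n = len(forecasts)
    bins = defaultdict(list)
    for p, o in zip(forecasts, outcomes):
        bins[p].append(o)
    obar = sum(outcomes) / n               # climatological base rate
    rel = sum(len(os) * (p - sum(os) / len(os)) ** 2
              for p, os in bins.items()) / n
    res = sum(len(os) * (sum(os) / len(os) - obar) ** 2
              for os in bins.values()) / n
    unc = obar * (1.0 - obar)
    return rel, res, unc

# Eight binary events forecast at two probability levels (invented data).
forecasts = [0.8, 0.8, 0.8, 0.8, 0.2, 0.2, 0.2, 0.2]
outcomes  = [1,   1,   1,   0,   0,   0,   1,   0]
rel, res, unc = brier_decomposition(forecasts, outcomes)
bs = sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)
# Murphy's identity: bs equals rel - res + unc
```

    Low reliability (conditional bias) and high resolution are both desirable; uncertainty is fixed by the event's base rate and cannot be improved by the forecaster.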

  5. A Postmodern Theory of Knowledge Organization.

    ERIC Educational Resources Information Center

    Mai, Jens-Erik

    1999-01-01

    Suggests a postmodern theory that regards knowledge organizations as active constructions of a perceived conception of the particular discourse communities in the company, organization, or knowledge field for which the knowledge organization is intended. In this view, the interpretive process in knowledge organization and the culture and social context…

  6. Putting "Organizations" into an Organization Theory Course: A Hybrid CAO Model for Teaching Organization Theory

    ERIC Educational Resources Information Center

    Hannah, David R.; Venkatachary, Ranga

    2010-01-01

    In this article, the authors present a retrospective analysis of an instructor's multiyear redesign of a course on organization theory into what is called a hybrid Classroom-as-Organization model. It is suggested that this new course design served to apprentice students to function in quasi-real organizational structures. The authors further argue…

  7. Theory of reliable systems. [reliability analysis and on-line fault diagnosis

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1974-01-01

    Research is reported on a program to refine the current notion of system reliability by identifying and investigating attributes of a system that are important to reliability considerations, and to develop techniques that facilitate analysis of system reliability. Reliability analysis and on-line fault diagnosis are discussed.

  8. Comparison of Reliability Measures under Factor Analysis and Item Response Theory

    ERIC Educational Resources Information Center

    Cheng, Ying; Yuan, Ke-Hai; Liu, Cheng

    2012-01-01

    Reliability of test scores is one of the most pervasive psychometric concepts in measurement. Reliability coefficients based on a unifactor model for continuous indicators include maximal reliability rho and an unweighted sum score-based omega, among many others. With increasing popularity of item response theory, a parallel reliability measure pi…
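    Under a one-factor model, the sum-score coefficient omega mentioned above has a simple closed form: omega = (sum of loadings)^2 / ((sum of loadings)^2 + sum of unique variances), with standardized loadings. A minimal sketch with invented loadings (not values from the article):

```python
def coefficient_omega(loadings):
    """Omega for a unidimensional factor model with standardized loadings,
    where each item's unique variance is psi = 1 - lambda**2."""
    s = sum(loadings)
    uniquenesses = [1.0 - lam ** 2 for lam in loadings]
    return s * s / (s * s + sum(uniquenesses))

# Four items with hypothetical standardized loadings.
omega = coefficient_omega([0.7, 0.6, 0.8, 0.5])   # about 0.75
```

    Omega weights all items equally in the sum score; maximal reliability rho instead uses optimal item weights, so rho is at least as large as omega for the same model.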

  9. Theory of reliable systems. [systems analysis and design

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1973-01-01

    The analysis and design of reliable systems are discussed. The attributes of system reliability studied are fault tolerance, diagnosability, and reconfigurability. Objectives of the study include: to determine properties of system structure that are conducive to a particular attribute; to determine methods for obtaining reliable realizations of a given system; and to determine how properties of system behavior relate to the complexity of fault tolerant realizations. A list of 34 references is included.

  10. A Holistic Equilibrium Theory of Organization Development

    ERIC Educational Resources Information Center

    Yang, Baiyin; Zheng, Wei

    2005-01-01

    This paper proposes a holistic equilibrium theory of organizational development (OD). The theory states that there are three driving forces in organizational change and development--rationality, reality, and liberty. OD can be viewed as a planned process of change in an organization so as to establish equilibrium among these three interacting…

  11. Human Resource Management, Computers, and Organization Theory.

    ERIC Educational Resources Information Center

    Garson, G. David

    In an attempt to provide a framework for research and theory building in public management information systems (PMIS), state officials responsible for computing in personnel operations were surveyed. The data were applied to hypotheses arising from a recent model by Bozeman and Bretschneider, attempting to relate organization theory to management…

  12. Educational Management Organizations as High Reliability Organizations: A Study of Victory's Philadelphia High School Reform Work

    ERIC Educational Resources Information Center

    Thomas, David E.

    2013-01-01

    This executive position paper proposes recommendations for designing reform models between public and private sectors dedicated to improving school reform work in low performing urban high schools. It reviews scholarly research about for-profit educational management organizations, high reliability organizations, American high school reform, and…

  13. Conceptualizing Essay Tests' Reliability and Validity: From Research to Theory

    ERIC Educational Resources Information Center

    Badjadi, Nour El Imane

    2013-01-01

    The current paper on writing assessment surveys the literature on the reliability and validity of essay tests. The paper aims to examine the two concepts in relationship with essay testing as well as to provide a snapshot of the current understandings of the reliability and validity of essay tests as drawn in recent research studies. Bearing in…

  14. Reliability theory for diffusion processes on interconnected networks

    NASA Astrophysics Data System (ADS)

    Khorramzadeh, Yasamin; Youssef, Mina; Eubank, Stephen

    2014-03-01

    We present the concept of network reliability as a framework to study diffusion dynamics in interdependent networks. We illustrate how different outcomes of diffusion processes, such as cascading failure, can be studied by estimating the reliability polynomial under different reliability rules. As an example, we investigate the effect of structural properties on diffusion dynamics for a few different topologies of two coupled networks. We evaluate the effect of varying the probability of failure propagating along the edges, both within a single network as well as between the networks. We exhibit the sensitivity of interdependent network reliability and connectivity to edge failures in each topology.
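    Exact evaluation of the reliability polynomial is #P-hard in general, so in practice it is often estimated by sampling edge failures. A minimal Monte Carlo sketch for a single network under an all-terminal connectivity rule (the triangle topology and failure probability below are invented for illustration):

```python
import random

def connected(n, edges):
    """Union-find check that nodes 0..n-1 form a single component."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x
    comps = n
    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            comps -= 1
    return comps == 1

def reliability_estimate(n, edges, p_fail, trials=20000, seed=1):
    """Monte Carlo estimate of all-terminal reliability: the probability
    the network stays connected when each edge fails independently."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        survivors = [e for e in edges if rng.random() >= p_fail]
        hits += connected(n, survivors)
    return hits / trials

# Triangle: stays connected iff at most one of its three edges fails.
triangle = [(0, 1), (1, 2), (0, 2)]
r = reliability_estimate(3, triangle, p_fail=0.1)
```

    For the triangle the exact value at p = 0.1 is (1-p)^3 + 3p(1-p)^2 = 0.972, which the estimate should approach as the number of trials grows.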

  15. A PERSPECTIVE ON RELIABILITY: PROBABILITY THEORY AND BEYOND

    SciTech Connect

    J. M. BOOKER; N. D. SINGPURWALLA

    2001-05-01

    Reliability assessment in the coming era is inclined to be characterized by a difficult dilemma. On the one hand, units and systems will be required to be ultra reliable; on the other hand, it may not be possible to subject them to full-scale testing. A case in point occurs where testing is limited, as with one-of-a-kind complex systems such as space exploration vehicles, or where severe testing constraints are imposed, such as full-scale testing of strategic nuclear weapons prohibited by test ban treaties and international agreements. Decision makers also require reliability assessments for problems with terabytes of data, such as from complex simulations of system performance. Quantitative measures of reliability and their associated uncertainties will remain integral to system monitoring and tactical decision making. The challenge is to derive these defensible measures in light of these dilemmas. Because reliability is usually defined as a probability that the system performs to its required specification, probability enters into the heart of these dilemmas, both philosophically and practically. This paper provides an overview of the several interpretations of probability as they relate to reliability and to the uncertainties involved. The philosophical issues pertain to the interpretation and the quantification of reliability. For example, how must we interpret a number like 10^-9 for the failure rate of an airplane flight or an electrical power plant? Such numbers are common, particularly in the context of safety. Does it mean one failure in 10^9 identical, or almost identical, trials? Are identical trials physically possible, let alone the fact that 10^9 trials can take generations to perform? How can we make precise the notion of almost identical trials? If the trials are truly identical, then all of them must produce the same outcome, and so the reliability must be either one or zero. However, tautologies, like certainty and impossibility, can

  16. 18 CFR 39.10 - Changes to an Electric Reliability Organization Rule or Regional Entity Rule.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... with the Commission for approval any proposed Electric Reliability Organization Rule or Rule change. A Regional Entity shall submit a Regional Entity Rule or Rule change to the Electric Reliability Organization... or upon complaint, may propose a change to an Electric Reliability Organization Rule or...

  17. 76 FR 23222 - Electric Reliability Organization Interpretation of Transmission Operations Reliability

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-26

    ..., 52 FR 47897 (Dec. 17, 1987), FERC Stats. & Regs. Preambles 1986-1990 ] 30,783 (1987). \\23\\ 18 CFR 380...) proposed interpretation of Reliability Standard, TOP-001-1, Requirement R8. DATES: Comments are due June 27... Requirement R8 in Commission-approved NERC Reliability Standard TOP-001-1 -- Reliability Responsibilities...

  18. Test Theories, Educational Priorities and Reliability of Public Examinations in England

    ERIC Educational Resources Information Center

    Baird, Jo-Anne; Black, Paul

    2013-01-01

    Much has already been written on the controversies surrounding the use of different test theories in educational assessment. Other authors have noted the prevalence of classical test theory over item response theory in practice. This Special Issue draws together articles based upon work conducted on the Reliability Programme for England's…

  19. Understanding organic photovoltaic cells: Electrode, nanostructure, reliability, and performance

    NASA Astrophysics Data System (ADS)

    Kim, Myung-Su

    My Ph.D. research has focused on alternative renewable energy using organic semiconductors. During my study, first, I established reliable characterization methods for organic photovoltaic devices. More specifically, less than 5% variation in the power conversion efficiency of fabricated organic blend photovoltaic cells (OBPC) was achieved after optimization. The reproducibility of organic photovoltaic cell performance is one of the essential issues that must be clarified before beginning serious investigations of the application of creative and challenging ideas. Second, the relationships between fill factor (FF) and process variables were demonstrated through series and shunt resistance, and this provided a chance to understand the electrical device behavior. In the blend layer, series resistance (Rs) and shunt resistance (Rsh) were varied by controlling the morphology of the blend layer, the regioregularity of the conjugated polymer, and the thickness of the blend layer. At the interface between the cathode including PEDOT:PSS and the blend layer, cathode conductivity was controlled by varying the structure of the cathode or adding an additive. Third, we thoroughly examined possible characterization mistakes in OPVC. One significant characterization mistake is observed when the crossbar electrode geometry of an OPVC using PEDOT:PSS is fabricated and characterized with an illumination area larger than the actual device area. The hypothesis to explain this overestimation was excess photocurrent generated from the cell region outside the overlapped electrode area, where PEDOT:PSS acts as the anode; this was clearly supported by the investigations. Finally, I incorporated a creative idea that enhances the exciton dissociation efficiency by increasing the interface area between donor and acceptor, to improve the power conversion efficiency of organic photovoltaic cells. To achieve this, nanoimprint lithography was applied to increase the interface area. To clarify the

  20. Using Metaphors to Teach Organization Theory

    ERIC Educational Resources Information Center

    Taber, Tom D.

    2007-01-01

    Metaphors were used to teach systems thinking and to clarify concepts of organizational theory in an introductory MBA management course. Gareth Morgan's metaphors of organization were read by students and applied as frames to analyze a business case. In addition, personal metaphors were written by individual students in order to describe the…

  1. The Progress of Theory in Knowledge Organization.

    ERIC Educational Resources Information Center

    Smiraglia, Richard P.

    2002-01-01

    Presents a background on theory in knowledge organization, which has moved from an epistemic stance of pragmatism and rationalism (based on observation of the construction of retrieval tools), to empiricism (based on the results of empirical research). Discusses historicism, external validity, classification, user-interface design, and…

  2. The Fail-Safe Schools Challenge: Leadership Possibilities from High Reliability Organizations. Perspective

    ERIC Educational Resources Information Center

    Bellamy, G. Thomas; Crawford, Lindy; Marshall, Laura Huber; Coulter, Gail A.

    2005-01-01

    As public policies increasingly hold schools responsible for preventing school failure, experiences of other organizations that must operate with high reliability may be helpful. This article builds on previous studies of high reliability organizations to inquire how their strategies might inform efforts to improve reliability in loosely coupled…

  4. Reliability of the Optimized Perturbation Theory for scalar fields at finite temperature

    SciTech Connect

    Farias, R. L.; Teixeira, D. L. Jr.; Ramos, R. O.

    2013-03-25

    The thermodynamics of a massless scalar field with a quartic interaction is studied up to third order in the Optimized Perturbation Theory (OPT) method. A comparison with other nonperturbative approaches is performed so that the reliability of OPT is assessed.

  5. 76 FR 58101 - Electric Reliability Organization Interpretation of Transmission Operations Reliability Standard

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-20

    ... Reliability Standard, Notice of Proposed Rulemaking, 76 FR 23222 (Apr. 26, 2011), FERC Stats. & Regs. ¶ 32,674... No. 486, 52 FR 47897 (Dec. 17, 1987), FERC Stats. & Regs. Preambles 1986-1990 ¶ 30,783 (1987). \\24... of Reliability Standard, TOP-001-1, Requirement R8, which pertains to the restoration of real...

  6. Bi-Factor Multidimensional Item Response Theory Modeling for Subscores Estimation, Reliability, and Classification

    ERIC Educational Resources Information Center

    Md Desa, Zairul Nor Deana

    2012-01-01

    In recent years, there has been increasing interest in estimating and improving subscore reliability. In this study, the multidimensional item response theory (MIRT) and the bi-factor model were combined to estimate subscores, to obtain subscores reliability, and subscores classification. Both the compensatory and partially compensatory MIRT…

  7. Electronic-Structure Theory of Organic Semiconductors: Charge-Transport Parameters and Metal/Organic Interfaces

    NASA Astrophysics Data System (ADS)

    Coropceanu, Veaceslav; Li, Hong; Winget, Paul; Zhu, Lingyun; Brédas, Jean-Luc

    2013-07-01

    We focus this review on the theoretical description, at the density functional theory level, of two key processes that are common to electronic devices based on organic semiconductors (such as organic light-emitting diodes, field-effect transistors, and solar cells), namely charge transport and charge injection from electrodes. By using representative examples of current interest, our main goal is to introduce some of the reliable theoretical methodologies that can best depict these processes. We first discuss the evaluation of the microscopic parameters that determine charge-carrier transport in organic molecular crystals, i.e., electronic couplings and electron-vibration couplings. We then examine the electronic structure at interfaces between an organic layer and a metal or conducting oxide electrode, with an emphasis on the work-function modifications induced by the organic layer and on the interfacial energy-level alignments.
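In the hopping regime, the two microscopic parameters the review highlights, the electronic coupling and the electron-vibration (reorganization) coupling, enter the standard Marcus expression for the transfer rate. A sketch with illustrative parameter values (not taken from the review):

```python
import math

# Marcus-theory electron-transfer rate between neighboring molecules.
# t   : electronic coupling (eV)
# lam : reorganization energy (eV), encoding electron-vibration coupling
# Values below are typical-order assumptions for organic semiconductors.

HBAR = 6.582119569e-16   # reduced Planck constant, eV*s
KB = 8.617333262e-5      # Boltzmann constant, eV/K

def marcus_rate(t, lam, temp, dg=0.0):
    """Hopping rate in 1/s for coupling t, reorganization lam, both in eV."""
    kbt = KB * temp
    prefactor = (2.0 * math.pi / HBAR) * t ** 2
    return (prefactor / math.sqrt(4.0 * math.pi * lam * kbt)
            * math.exp(-(dg + lam) ** 2 / (4.0 * lam * kbt)))

rate = marcus_rate(t=0.05, lam=0.2, temp=300.0)
print(f"{rate:.2e} 1/s")   # on the order of 1e13 hops per second
```

Larger couplings raise the rate quadratically, while a larger reorganization energy suppresses it exponentially, which is why both quantities must be evaluated accurately.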

  8. Reliability of the Measure of Acceptance of the Theory of Evolution (MATE) Instrument with University Students

    ERIC Educational Resources Information Center

    Rutledge, Michael L.; Sadler, Kim C.

    2007-01-01

    The Measure of Acceptance of the Theory of Evolution (MATE) instrument was initially designed to assess high school biology teachers' acceptance of evolutionary theory. To determine if the MATE instrument is reliable with university students, it was administered to students in a non-majors biology course (n = 61) twice over a 3-week period.…

  9. 76 FR 23171 - Electric Reliability Organization Interpretations of Interconnection Reliability Operations and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-26

    ... Standards, Notice of Proposed Rulemaking, 75 FR 80391 (Dec. 22, 2010), 133 FERC ¶ 61,234, at P 27 (2010... Interconnection Reliability Operating Limits, Order No. 748, 76 FR 16240 (Mar. 23, 2011), 134 FERC ¶ 61,213 (2011... monitoring), Notice of Proposed Rulemaking, 75 FR 71613 (Nov. 24, 2010), FERC Stats. & Regs. ¶ 32,665, at...

  10. 78 FR 803 - Revisions to Electric Reliability Organization Definition of Bulk Electric System and Rules of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-04

    ... Corporation (NERC), the Commission-certified Electric Reliability Organization. The Commission finds that the... North American Electric Reliability Corporation (NERC), the Commission-certified Electric Reliability... Procedure, Notice of Proposed Rulemaking, 77 FR 39857 (July 5, 2012) 139 FERC ¶ 61,247 (2012) (NOPR)....

  11. 77 FR 39857 - Revisions to Electric Reliability Organization Definition of Bulk Electric System and Rules of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-05

    ... Revisions to Electric Reliability Organization Definition of Bulk Electric System and Rules of Procedure...; ] DEPARTMENT OF ENERGY Federal Energy Regulatory Commission 18 CFR Part 40 Revisions to Electric Reliability Organization Definition of Bulk Electric System and Rules of Procedure AGENCY: Federal Energy...

  12. 78 FR 29209 - Revisions to Electric Reliability Organization Definition of Bulk Electric System and Rules of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-17

    ... to Electric Reliability Organization Definition of Bulk Electric System and Rules of Procedure; Final...; ] DEPARTMENT OF ENERGY Federal Energy Regulatory Commission 18 CFR Part 40 Revisions to Electric Reliability Organization Definition of Bulk Electric System and Rules of Procedure AGENCY: Federal Energy...

  13. Mathematic Modeling of Complex Hydraulic Machinery Systems When Evaluating Reliability Using Graph Theory

    NASA Astrophysics Data System (ADS)

    Zemenkova, M. Yu; Shipovalov, A. N.; Zemenkov, Yu D.

    2016-04-01

    The main technological equipment of hydrocarbon pipeline transport is hydraulic machinery. Oil transportation relies mainly on centrifugal pumps designed to work in the “pumping station-pipeline” system; a standard pumping station comprises several pumps and complex hydraulic piping. The authors have developed a set of models and algorithms, based on reliability theory, for calculating the system reliability of pumps. As an example, one of the estimation methods is considered with the application of graph theory.
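The series/parallel decomposition underlying such graph-based reliability models can be sketched directly; the component reliabilities below are illustrative assumptions, not values from the paper:

```python
# Reliability of a pumping station modeled with independent components:
# pumps in parallel (the station works if any pump works), and the station
# in series with the pipeline (both must work).

def series(*rs):
    """System works only if every component works."""
    out = 1.0
    for r in rs:
        out *= r
    return out

def parallel(*rs):
    """System works if at least one component works."""
    out = 1.0
    for r in rs:
        out *= (1.0 - r)
    return 1.0 - out

pump_station = parallel(0.95, 0.95, 0.95)   # three redundant pumps
system = series(pump_station, 0.99)         # station in series with pipeline
print(round(system, 6))
```

Real networks mix these motifs, which is where graph algorithms take over from closed-form decomposition.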

  14. Reliability analysis of the objective structured clinical examination using generalizability theory

    PubMed Central

    Trejo-Mejía, Juan Andrés; Sánchez-Mendiola, Melchor; Méndez-Ramírez, Ignacio; Martínez-González, Adrián

    2016-01-01

    Background The objective structured clinical examination (OSCE) is a widely used method for assessing clinical competence in health sciences education. Studies using this method have shown evidence of validity and reliability. There are no published studies of OSCE reliability measurement with generalizability theory (G-theory) in Latin America. The aims of this study were to assess the reliability of an OSCE in medical students using G-theory and explore its usefulness for quality improvement. Methods An observational cross-sectional study was conducted at National Autonomous University of Mexico (UNAM) Faculty of Medicine in Mexico City. A total of 278 fifth-year medical students were assessed with an 18-station OSCE in a summative end-of-career final examination. There were four exam versions. G-theory with a crossover random effects design was used to identify the main sources of variance. Examiners, standardized patients, and cases were considered as a single facet of analysis. Results The exam was applied to 278 medical students. The OSCE had a generalizability coefficient of 0.93. The major components of variance were stations, students, and residual error. The sites and the versions of the tests had minimum variance. Conclusions Our study achieved a G coefficient similar to that found in other reports, which is acceptable for summative tests. G-theory allows the estimation of the magnitude of multiple sources of error and helps decision makers to determine the number of stations, test versions, and examiners needed to obtain reliable measurements. PMID:27543188
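The G coefficient reported above comes from a variance decomposition; a sketch of the one-facet calculation with assumed variance components (not the UNAM study's values):

```python
# Generalizability coefficient for a one-facet design (students crossed with
# stations): G = var_student / (var_student + var_residual / n_stations).
# The variance components below are illustrative assumptions.

var_student = 36.0        # universe-score variance (between students)
var_residual = 50.0       # student-by-station interaction plus error
n_stations = 18

g = var_student / (var_student + var_residual / n_stations)
print(round(g, 3))
```

Rearranging the same formula for a target G is the decision-study step the abstract mentions: it tells planners how many stations or examiners are needed for a chosen reliability.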

  15. Some Characteristics of One Type of High Reliability Organization.

    ERIC Educational Resources Information Center

    Roberts, Karlene H.

    1990-01-01

    Attempts to define organizational processes necessary to operate safely technologically complex organizations. Identifies nuclear powered aircraft carriers as examples of potentially hazardous organizations with histories of excellent operations. Discusses how carriers deal with components of risk and antecedents to catastrophe cited by Perrow and…

  16. Two Prophecy Formulas for Assessing the Reliability of Item Response Theory-Based Ability Estimates

    ERIC Educational Resources Information Center

    Raju, Nambury S.; Oshima, T.C.

    2005-01-01

    Two new prophecy formulas for estimating item response theory (IRT)-based reliability of a shortened or lengthened test are proposed. Some of the relationships between the two formulas, one of which is identical to the well-known Spearman-Brown prophecy formula, are examined and illustrated. The major assumptions underlying these formulas are…
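The classical case that one of the two formulas reduces to, the Spearman-Brown prophecy, can be sketched as:

```python
# Spearman-Brown prophecy formula: reliability of a test lengthened (or
# shortened) by factor k, given current reliability rho.

def spearman_brown(rho, k):
    """Predicted reliability after changing test length by factor k."""
    return k * rho / (1.0 + (k - 1.0) * rho)

# Doubling a test that currently has reliability 0.70:
print(round(spearman_brown(0.70, 2.0), 3))
```

The IRT-based formulas in the article play the same role for ability estimates, where reliability depends on the test information function rather than on classical item parallelism.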

  17. Estimating Reliability of School-Level Scores Using Multilevel and Generalizability Theory Models

    ERIC Educational Resources Information Center

    Jeon, Min-Jeong; Lee, Guemin; Hwang, Jeong-Won; Kang, Sang-Jin

    2009-01-01

    The purpose of this study was to investigate the methods of estimating the reliability of school-level scores using generalizability theory and multilevel models. Two approaches, "student within schools" and "students within schools and subject areas," were conceptualized and implemented in this study. Four methods resulting from the combination…

  18. Assessing Academic Advising Outcomes Using Social Cognitive Theory: A Validity and Reliability Study

    ERIC Educational Resources Information Center

    Erlich, Richard J.; Russ-Eft, Darlene F.

    2012-01-01

    The validity and reliability of three instruments, the "Counselor Rubric for Gauging Student Understanding of Academic Planning," micro-analytic questions, and the "Student Survey for Understanding Academic Planning," all based on social cognitive theory, were tested as means to assess self-efficacy and self-regulated learning in college academic…

  19. Using chemical organization theory for model checking

    PubMed Central

    Kaleta, Christoph; Richter, Stephan; Dittrich, Peter

    2009-01-01

    Motivation: The increasing number and complexity of biomodels makes automatic procedures for checking the models' properties and quality necessary. Approaches like elementary mode analysis, flux balance analysis, deficiency analysis and chemical organization theory (OT) require only the stoichiometric structure of the reaction network for derivation of valuable information. In formalisms like Systems Biology Markup Language (SBML), however, information about the stoichiometric coefficients required for an analysis of chemical organizations can be hidden in kinetic laws. Results: First, we introduce an algorithm that uncovers stoichiometric information that might be hidden in the kinetic laws of a reaction network. This allows us to apply OT to SBML models using modifiers. Second, using the new algorithm, we performed a large-scale analysis of the 185 models contained in the manually curated BioModels Database. We found that for 41 models (22%) the set of organizations changes when modifiers are considered correctly. We discuss one of these models in detail (BIOMD149, a combined model of the ERK- and Wnt-signaling pathways), whose set of organizations drastically changes when modifiers are considered. Third, we found inconsistencies in 5 models (3%) and identified their characteristics. Compared with flux-based methods, OT is able to identify those species and reactions more accurately [in 26 cases (14%)] that can be present in a long-term simulation of the model. We conclude that our approach is a valuable tool that helps to improve the consistency of biomodels and their repositories. Availability: All data and a JAVA applet to check SBML models are available from http://www.minet.uni-jena.de/csb/prj/ot/tools Contact: dittrich@minet.uni-jena.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:19468053
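The closure half of a chemical organization (a set of species whose reactions produce nothing outside the set) can be computed by a simple fixed-point iteration; the toy network below is an assumption for illustration, not a BioModels entry:

```python
# Closure step from chemical organization theory: repeatedly add every
# product of a reaction whose reactants are already present, until the
# species set stops growing.

reactions = [
    ({"A", "B"}, {"C"}),       # A + B -> C
    ({"C"}, {"A", "D"}),       # C -> A + D
    ({"E"}, {"F"}),            # E -> F (never fires from {A, B})
]

def closure(species):
    species = set(species)
    changed = True
    while changed:
        changed = False
        for reactants, products in reactions:
            if reactants <= species and not products <= species:
                species |= products
                changed = True
    return species

print(sorted(closure({"A", "B"})))   # E and F stay out
```

A full organization additionally requires self-maintenance, a flux-based condition that the paper checks on top of closure.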

  20. 78 FR 30245 - Electric Reliability Organization Interpretation of Specific Requirements of the Disturbance...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-22

    ... Environmental Policy Act, Order No. 486, 52 FR 47897 (Dec. 17, 1987), FERC Stats. & Regs. Regulations Preambles... Energy Regulatory Commission 18 CFR Part 40 Electric Reliability Organization Interpretation of Specific... for approval by the North American Electric Reliability Corporation, the Commission-certified...

  1. 77 FR 59745 - Delegation of Authority Regarding Electric Reliability Organization's Budget, Delegation...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-01

    ... No. 486, 52 FR 47897 (Dec. 17, 1987), FERC Stats. & Regs. ¶ 30,783 (1987). \\14\\ 18 CFR 380.4(a)(1). V... Energy Regulatory Commission 18 CFR Part 375 Delegation of Authority Regarding Electric Reliability... responsibilities for specific Electric Reliability Organization (ERO) filings. In particular, this Final...

  2. Organization Theory and Its Application to Adult Education.

    ERIC Educational Resources Information Center

    Geering, Adrian D.

    This paper surveys the field of organization theory and its application to adult education agencies. The paper first defines organization theory (the study of the structure and functioning of organizations and the behavior of groups and individuals within them), and discusses its historical development. It then presents four emerging trends of…

  3. 76 FR 16263 - Revision to Electric Reliability Organization Definition of Bulk Electric System

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-23

    ... Organization Definition of Bulk Electric System, Order No. 743, 75 FR 72910 (Nov. 26, 2010), 133 FERC ¶ 61,150... Energy Regulatory Commission 18 CFR Part 40 Revision to Electric Reliability Organization Definition of Bulk Electric System AGENCY: Federal Energy Regulatory Commission, DOE. ACTION: Order on...

  4. Influencing Organizations to Promote Health: Applying Stakeholder Theory

    ERIC Educational Resources Information Center

    Kok, Gerjo; Gurabardhi, Zamira; Gottlieb, Nell H.; Zijlstra, Fred R. H.

    2015-01-01

    Stakeholder theory may help health promoters to make changes at the organizational and policy level to promote health. A stakeholder is any individual, group, or organization that can influence an organization. The organization that is the focus for influence attempts is called the focal organization. The more salient a stakeholder is and the more…

  5. Building New Bridges: Linking Organization Theory with Other Educational Literatures

    ERIC Educational Resources Information Center

    Johnson, Bob L., Jr.; Owens, Michael

    2005-01-01

    Purpose: This paper provides an example of how organization theory can be linked with other literatures in a complementary and productive manner. Establishing a bridge between the organization theory and learning environment literatures, the authors seek to provide an example of how such literature-bridging can enrich our understanding of the…

  6. In search of principles for a Theory of Organisms.

    PubMed

    Longo, Giuseppe; Montevil, Mael; Sonnenschein, Carlos; Soto, Ana M

    2015-12-01

    Lacking an operational theory to explain the organization and behaviour of matter in unicellular and multicellular organisms hinders progress in biology. Such a theory should address life cycles from ontogenesis to death. This theory would complement the theory of evolution that addresses phylogenesis, and would posit theoretical extensions to accepted physical principles and default states in order to grasp the living state of matter and define proper biological observables. Thus, we favour adopting the default state implicit in Darwin's theory, namely, cell proliferation with variation plus motility, and a framing principle, namely, life phenomena manifest themselves as non-identical iterations of morphogenetic processes. From this perspective, organisms become a consequence of the inherent variability generated by proliferation, motility and self-organization. Morphogenesis would then be the result of the default state plus physical constraints, like gravity, and those present in living organisms, like muscular tension. PMID:26648040

  7. Sensor Reliability Evaluation Scheme for Target Classification Using Belief Function Theory

    PubMed Central

    Zhu, Jing; Luo, Yupin; Zhou, Jianjun

    2013-01-01

    In target classification based on belief function theory, sensor reliability evaluation involves two basic issues: a reasonable dissimilarity measure among evidences, and adaptive combination of static and dynamic discounting. A solution to both issues is proposed here. First, an improved dissimilarity measure based on a dualistic exponential function has been designed. We assess static reliability from a training set using the local decision of each sensor and the dissimilarity measure among evidences. Dynamic reliability factors are obtained for each test target from the dissimilarity measure between each sensor's output and the consensus. Second, an adaptive combination method for static and dynamic discounting has been introduced. We adopt a Parzen-window estimator to gauge how closely a sensor's current performance matches its static performance. Through fuzzy theory, the fusion system can realize self-learning and self-adaptation as sensor performance changes. Experiments conducted on real databases demonstrate that the proposed scheme performs better in target classification under different target conditions compared with other methods. PMID:24351632
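Discounting a mass function by a reliability factor, the basic operation behind both static and dynamic discounting, can be sketched as follows (the frame and mass values are illustrative assumptions, not from the cited scheme):

```python
# Shafer discounting of a belief-function mass assignment by reliability
# factor alpha: m'(A) = alpha * m(A) for A != Theta, and the removed mass
# goes to total ignorance: m'(Theta) = 1 - alpha + alpha * m(Theta).

THETA = frozenset({"car", "truck"})   # frame of discernment

def discount(m, alpha):
    out = {a: alpha * v for a, v in m.items() if a != THETA}
    out[THETA] = 1.0 - alpha + alpha * m.get(THETA, 0.0)
    return out

m = {frozenset({"car"}): 0.7, frozenset({"truck"}): 0.2, THETA: 0.1}
md = discount(m, 0.8)                 # sensor judged 80% reliable
print(md[THETA])                      # ignorance mass grows from 0.1 to 0.28
```

A less reliable sensor thus contributes more ignorance and less committed evidence when the discounted masses are later combined.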

  8. Generalizability theory reliability of written expression curriculum-based measurement in universal screening.

    PubMed

    Keller-Margulis, Milena A; Mercer, Sterett H; Thomas, Erin L

    2016-09-01

    The purpose of this study was to examine the reliability of written expression curriculum-based measurement (WE-CBM) in the context of universal screening from a generalizability theory framework. Students in second through fifth grade (n = 145) participated in the study. The sample included 54% female students, 49% White students, 23% African American students, 17% Hispanic students, 8% Asian students, and 3% of students identified as 2 or more races. Of the sample, 8% were English Language Learners and 6% were students receiving special education. Three WE-CBM probes were administered for 7 min each at 3 time points across 1 year. Writing samples were scored for commonly used WE-CBM metrics (e.g., correct minus incorrect word sequences; CIWS). Results suggest that nearly half the variance in WE-CBM is related to unsystematic error and that conventional screening procedures (i.e., the use of one 3-min sample) do not yield scores with adequate reliability for relative or absolute decisions about student performance. In most grades, three 3-min writing samples (or 2 longer duration samples) were required for adequate reliability for relative decisions, and three 7-min writing samples would not yield adequate reliability for relative decisions about within-year student growth. Implications and recommendations are discussed. (PsycINFO Database Record) PMID:26322656

  9. [Reliability theory based on quality risk network analysis for Chinese medicine injection].

    PubMed

    Li, Zheng; Kang, Li-Yuan; Fan, Xiao-Hui

    2014-08-01

    A new risk analysis method based on reliability theory is introduced in this paper for the quality risk management of Chinese medicine injection manufacturing plants. Risk events, both causes and effects, were derived in the framework as nodes with a Bayesian network analysis approach, transforming the risk analysis results from failure mode and effect analysis (FMEA) into a Bayesian network platform. With its structure and parameters determined, the network can be used to evaluate system reliability quantitatively with probabilistic analytical approaches. Using network analysis tools such as GeNIe and AgenaRisk, we are able to find the nodes most critical to system reliability. The importance of each node can be quantitatively evaluated by calculating its effect on the overall risk, and a mitigation plan can be determined accordingly to reduce its influence and improve system reliability. Using the Shengmai injection manufacturing plant of SZYY Ltd as a user case, we analyzed the quality risk with both static FMEA analysis and dynamic Bayesian network analysis. The potential risk factors for Shengmai injection manufacturing quality were identified with the network analysis platform, and quality assurance actions were further defined to reduce the risk and improve product quality. PMID:25509315
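The roll-up of root-cause probabilities into a system-level risk that such a network performs can be sketched by brute-force enumeration; the structure and numbers below are hypothetical, not the Shengmai plant's:

```python
# Minimal Bayesian-network style risk roll-up: a quality failure depends on
# two root causes with known marginal probabilities and an assumed
# conditional probability table.

p_contamination = 0.02
p_filling_error = 0.05

# P(batch fails | contamination, filling error) -- assumed for illustration.
p_fail = {
    (True,  True):  0.95,
    (True,  False): 0.60,
    (False, True):  0.40,
    (False, False): 0.01,
}

total = 0.0
for c in (True, False):
    for f in (True, False):
        pc = p_contamination if c else 1 - p_contamination
        pf = p_filling_error if f else 1 - p_filling_error
        total += pc * pf * p_fail[(c, f)]
print(round(total, 4))   # marginal probability of a failed batch
```

Node importance, as in the paper, falls out of the same machinery: perturb one root-cause probability and observe the change in the marginal failure probability.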

  10. Central Perspectives and Debates in Organization Theory.

    ERIC Educational Resources Information Center

    Astley, W. Graham; Van de Ven, Andrew H.

    1983-01-01

    Classifies organizational theories, by analytical level and assumptions about human nature, into four perspectives (system-structural, strategic choice, natural selection, collective action), each with different concepts of organizational structure, behavior, change, and managerial roles. Identifies six debates generated among the perspectives and…

  11. Research on High Reliability Organizations: Implications for School Effects Research, Policy, and Educational Practice.

    ERIC Educational Resources Information Center

    Stringfield, Sam

    Current theorizing in education, as in industry, is largely devoted to explaining trial-and-error, failure-tolerant, low-reliability organizations. This article examines changing societal demands on education and argues that effective responses to those demands require new and different organizational structures. Schools must abandon industrial…

  12. 75 FR 72909 - Revision to Electric Reliability Organization Definition of Bulk Electric System

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-26

    ... Reliability Organization Definition of Bulk Electric System, Notice of Proposed Rulemaking, 75 FR 14097 (Mar... Electric System, Notice of Proposed Rulemaking, 75 FR 14097 (Mar. 24, 2010), FERC Stats. & Regs. ¶ 32,654... the NOPR in this Final Rule, as described below. \\26\\ See 75 FR 14097 (Mar. 24, 2010). \\27\\ A list...

  13. 18 CFR 39.4 - Funding of the Electric Reliability Organization.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Funding of the Electric Reliability Organization. 39.4 Section 39.4 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT RULES...

  14. [Employees in high-reliability organizations: systematic selection of personnel as a final criterion].

    PubMed

    Oubaid, V; Anheuser, P

    2014-05-01

    Employees represent an important safety factor in high-reliability organizations. The combination of clear organizational structures, a nonpunitive safety culture, and psychological personnel selection guarantee a high level of safety. The cockpit personnel selection process of a major German airline is presented in order to demonstrate a possible transferability into medicine and urology. PMID:24806799

  15. 18 CFR 39.4 - Funding of the Electric Reliability Organization.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Funding of the Electric Reliability Organization. 39.4 Section 39.4 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT RULES...

  16. Using the Reliability Theory for Assessing the Decision Confidence Probability for Comparative Life Cycle Assessments.

    PubMed

    Wei, Wei; Larrey-Lassalle, Pyrène; Faure, Thierry; Dumoulin, Nicolas; Roux, Philippe; Mathias, Jean-Denis

    2016-03-01

    A comparative decision-making process is widely used to identify which option (system, product, service, etc.) has the smaller environmental footprint and to provide recommendations that help stakeholders take future decisions. However, uncertainty complicates the comparison and the decision making. Probability-based decision support in LCA is a way to help stakeholders in their decision-making process. It calculates the decision confidence probability, which expresses the probability that one option has a smaller environmental impact than another. Here we apply reliability theory to approximate the decision confidence probability, comparing the traditional Monte Carlo method with a reliability method called FORM. The Monte Carlo method needs high computational time to calculate the decision confidence probability. The FORM method approximates the decision confidence probability with fewer simulations by approximating the response surface. Moreover, the FORM method calculates the associated importance factors, which correspond to a sensitivity analysis with respect to the probability and allow stakeholders to determine which factors influence their decision. Our results clearly show that the reliability method provides additional useful information to stakeholders while reducing the computational time. PMID:26815724
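The Monte Carlo baseline that FORM is compared against can be sketched in a few lines; the impact distributions are assumed lognormals, not LCA results:

```python
import random

# Monte Carlo estimate of the decision confidence probability: the chance
# that option A's environmental impact is lower than option B's under
# parameter uncertainty.

random.seed(42)
N = 100_000
wins = 0
for _ in range(N):
    impact_a = random.lognormvariate(0.0, 0.3)   # assumed impact of option A
    impact_b = random.lognormvariate(0.2, 0.3)   # assumed impact of option B
    if impact_a < impact_b:
        wins += 1
print(wins / N)   # approximate P(A has the smaller impact)
```

FORM reaches a comparable estimate with far fewer model evaluations by locating the most probable point on the boundary where the two impacts are equal, which is what makes it attractive when each LCA evaluation is expensive.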

  17. Organizations and Social Systems: Organization Theory's Neglected Mandate.

    ERIC Educational Resources Information Center

    Stern, Robert N.; Barley, Stephen R.

    1996-01-01

    The social-systems perspective in organizational theory faded because the increasing complexity of social relations hindered determination of an appropriate unit of analysis. Also, the business-school environment in which organizational research occurred discouraged examination of broad social questions, promoted a particular approach to science,…

  18. Influencing organizations to promote health: applying stakeholder theory.

    PubMed

    Kok, Gerjo; Gurabardhi, Zamira; Gottlieb, Nell H; Zijlstra, Fred R H

    2015-04-01

    Stakeholder theory may help health promoters to make changes at the organizational and policy level to promote health. A stakeholder is any individual, group, or organization that can influence an organization. The organization that is the focus for influence attempts is called the focal organization. The more salient a stakeholder is and the more central in the network, the stronger the influence. As stakeholders, health promoters may use communicative, compromise, deinstitutionalization, or coercive methods through an ally or a coalition. A hypothetical case study, involving adolescent use of harmful legal products, illustrates the process of applying stakeholder theory to strategic decision making. PMID:25829111

  19. Selecting Organization Development Theory from an HRD Perspective

    ERIC Educational Resources Information Center

    Lynham, Susan A.; Chermack, Thomas J.; Noggle, Melissa A.

    2004-01-01

    As is true for human resource development (HRD), the field of organization development (OD) draws from numerous disciplines to inform its theory base. However, the identification and selection of theory to inform improved practice remains a challenge and begs the question of what can be used to inform and guide one in the identification and…

  20. General Systems Theory Approaches to Organizations: Some Problems in Application

    ERIC Educational Resources Information Center

    Peery, Newman S., Jr.

    1975-01-01

    Considers the limitations of General Systems Theory (GST) as a major paradigm within administrative theory and concludes that most systems formulations overemphasize growth and show little appreciation for intraorganizational conflict, diversity of values, and political action within organizations. Suggests that these limitations are mainly due to…

  1. Teaching Organization Theory and Practice: An Experiential and Reflective Approach

    ERIC Educational Resources Information Center

    Cameron, Mark; Turkiewicz, Rita M.; Holdaway, Britt A.; Bill, Jacqueline S.; Goodman, Jessica; Bonner, Aisha; Daly, Stacey; Cohen, Michael D.; Lorenz, Cassandra; Wilson, Paul R.; Rusk, James

    2009-01-01

    The organization is often the overlooked level in social work's ecological perspective. However, organizational realities exert a profound influence on human development and well-being as well as the nature and quality of social work practice. This article describes a model of teaching organization theory and practice which requires master's…

  2. Do Advance Organizers Facilitate Learning? A Review of Subsumption Theory.

    ERIC Educational Resources Information Center

    McEneany, John E.

    1990-01-01

    A review of four studies conducted by Ausubel raises serious doubts about the efficacy of advance organizers under a variety of circumstances. In addition, this review questions the adequacy of definitions for two central notions of subsumption theory (discriminability and advance organizer). (IAH)

  3. Enhance the lifetime and bias stress reliability in organic vertical transistor by UV/Ozone treatment

    NASA Astrophysics Data System (ADS)

    Lin, Hung-Cheng; Chang, Ming-Yu; Zan, Hsiao-Wen; Meng, Hsin-Fei; Chao, Yu-Chiang

    In this paper, we use UV/ozone treatment to improve the lifetime and bias-stress reliability of an organic transistor with a vertical channel. Although the vertical organic transistor exhibits better bias-stress reliability than the organic field-effect transistor (OFET) owing to its bulk conduction mechanism, poor lifetime remains a challenge. Treating the vertical channel with octadecyltrichlorosilane (OTS) can reduce trap states and hence improve bias-stress stability; however, the off-current is much higher after 6 days and lifetime performance degrades. In contrast, after 4000 s of on-state bias stress, devices with UV/ozone-treated vertical channels show a stable output current and on/off current ratio. The threshold-voltage shift is only -0.02 V, much smaller than that of an OFET with the same organic semiconductor material, and the output current is enhanced by an order of magnitude. Moreover, unlike OTS-treated devices, UV/ozone-treated devices show no obvious degradation even after 170 days. With UV/ozone treatment, the output current, bias-stress reliability, and lifetime were all improved, making the vertical transistor a promising device for future applications in display technology and flexible electronics.

  4. Thought analysis on self-organization theories of MHD plasma

    NASA Astrophysics Data System (ADS)

    Kondoh, Yoshiomi; Sato, Tetsuya

    1992-08-01

    A thought analysis of the self-organization theories of dissipative MHD plasmas is presented, identifying three groups of theories that lead to the same relaxed state, ∇ × B = λB, in order to find the essential physical picture embedded in self-organization phenomena due to nonlinear and dissipative processes. The self-organized relaxed state due to dissipation by the Ohm loss is shown to be formulated generally as the state that yields the minimum dissipation rate of global auto- and/or cross-correlations between two quantities among j, B, and A for their own instantaneous values of the global correlations.
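    For context, the relaxed state named in this abstract is the force-free configuration familiar from Taylor relaxation theory (a standard result, not derived in this record): minimizing the magnetic energy while holding the magnetic helicity fixed yields the same equation.

```latex
\min_{\mathbf{B}} \int \frac{B^2}{2\mu_0}\, dV
\quad \text{subject to} \quad
K = \int \mathbf{A} \cdot \mathbf{B}\, dV = \text{const}
\;\;\Longrightarrow\;\;
\nabla \times \mathbf{B} = \lambda \mathbf{B}
```

    Here λ appears as the Lagrange multiplier enforcing the helicity constraint.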

  5. Cliophysics: Socio-Political Reliability Theory, Polity Duration and African Political (In)stabilities

    PubMed Central

    Cherif, Alhaji; Barley, Kamal

    2010-01-01

    Quantification of historical sociological processes has recently gained attention among theoreticians in the effort to provide a solid theoretical understanding of the behaviors and regularities present in socio-political dynamics. Here we present a reliability theory of polity processes with emphasis on the individual political dynamics of African countries. We found that the structural properties of polity failure rates successfully capture the risk of political vulnerability and instability, in which countries with monotonically increasing, unimodal, U-shaped, and monotonically decreasing polity failure rates show correspondingly high levels of state fragility indices. The quasi-U-shaped relationship between average polity duration and regime type corroborates historical precedents and explains the stability of autocracies and democracies. PMID:21206911

  6. Reliability and validity of the Leuven Perceptual Organization Screening Test (L-POST).

    PubMed

    Vancleef, Kathleen; Acke, Elia; Torfs, Katrien; Demeyere, Nele; Lafosse, Christophe; Humphreys, Glyn; Wagemans, Johan; de-Wit, Lee

    2015-09-01

    Neuropsychological tests of visual perception mostly assess high-level processes like object recognition. Object recognition, however, relies on distinct mid-level processes of perceptual organization that are only implicitly tested in classical tests. Furthermore, the psychometric properties of the existing instruments are limited. To fill this gap, the Leuven perceptual organization screening test (L-POST) was developed, in which a wide range of mid-level phenomena are measured in 15 subtests. In this study, we evaluated reliability and validity of the L-POST. Performance on the test is evaluated relative to a norm sample of more than 1,500 healthy control participants. Cronbach's alpha of the norm sample and test-retest correlations for 20 patients provide evidence for adequate reliability of L-POST performance. The convergent and discriminant validity of the test was assessed in 40 brain-damaged patients, whose performance on the L-POST was compared with standard clinical tests of visual perception and other measures of cognitive function. The L-POST showed high sensitivity to visual dysfunction and decreased performance was specific to visual problems. In conclusion, the L-POST is a reliable and valid screening test for perceptual organization. It offers a useful online tool for researchers and clinicians to get a broader overview of the mid-level processes that are preserved or disrupted in a given patient. PMID:25042381

  7. Measuring theory of mind across middle childhood: Reliability and validity of the Silent Films and Strange Stories tasks.

    PubMed

    Devine, Rory T; Hughes, Claire

    2016-09-01

    Recent years have seen a growth of research on the development of children's ability to reason about others' mental states (or "theory of mind") beyond the narrow confines of the preschool period. The overall aim of this study was to investigate the psychometric properties of a task battery composed of items from Happé's Strange Stories task and Devine and Hughes' Silent Film task. A sample of 460 ethnically and socially diverse children (211 boys) between 7 and 13 years of age completed the task battery at two time points separated by 1 month. The Strange Stories and Silent Film tasks were strongly correlated even when verbal ability and narrative comprehension were taken into account, and all items loaded onto a single theory-of-mind latent factor. The theory-of-mind latent factor provided reliable estimates of performance across a wide range of theory-of-mind ability and showed no evidence of differential item functioning across gender, ethnicity, or socioeconomic status. The theory-of-mind latent factor also exhibited strong 1-month test-retest reliability, and this stability did not vary as a function of child characteristics. Taken together, these findings provide evidence for the validity and reliability of the Strange Stories and Silent Film task battery as a measure of individual differences in theory of mind suitable for use across middle childhood. We consider the methodological and conceptual implications of these findings for research on theory of mind beyond the preschool years. PMID:26255713

  8. A win for HROs. Employing high-reliability organization characteristics in EMS.

    PubMed

    Heightman, A J

    2013-06-01

    Was I insubordinate, arrogant or disrespectful? You may feel that I was. But in reality, I was educated to a level that could have been validated and should have been respected by command. I was, in fact, practicing a key aspect of HRO. I was stopping an obvious dangerous condition before it could harm or kill emergency responders. My IC colleague knew it from the facts presented and, in fact, joked with me about my "subtle sarcasm" and moved the perimeter to the recommended half-mile distance. Did I win, or did a proactive HRO win? Actually, HRO won and potentially saved 30 lives. I simply presented the hazards of CFC inhalation. A high-reliability organization must not rely on only one source of data when detailed information on a hazard isn't immediately available, or if it isn't very informative during an emergency decision-making process. Read "EMS & High Reliability Organizing: Achieving Safety & Reliability in the Dynamic, High-Risk Environment," pp. 60-63, and practice its important principles. It's really common sense, not rocket science, and may save you, your crews or others in your community. PMID:24159720

  9. A modelling approach to find stable and reliable soil organic carbon values for further regionalization.

    NASA Astrophysics Data System (ADS)

    Bönecke, Eric; Franko, Uwe

    2015-04-01

    Soil organic matter (SOM) and soil organic carbon (SOC) may be the most important components for describing the fertility of agriculturally used soils. SOC is sensitive to temporal and spatial changes due to varying weather conditions, crops, and soil management practices, and reliable delineation of its spatial variability remains difficult. Soil organic carbon is, furthermore, an essential initial parameter for dynamic modelling of, e.g., carbon and nitrogen processes. However, obtaining and using this information requires cost- and time-intensive field and laboratory work. The objective of this study is to assess an approach that reduces the effort of laboratory and field analyses by using a method to find stable initial soil organic carbon values for further soil process modelling and regionalization at the field scale. Strategies, techniques, and tools that improve the reliability of high-resolution soil organic carbon maps while reducing costs therefore continue to attract scientific attention. Although combining efficient sampling schemes with geophysical sensing techniques is now a widely used practice for describing the within-field variability of soil organic carbon, large uncertainties remain, even at the field scale, in both science and agriculture. An analytical and modelling approach might facilitate and improve this strategy at small and large field scales. This study presents a method for finding reliable steady-state values of soil organic carbon at particular points using the established soil process model CANDY (Franko et al. 1995). It focuses on an iterative algorithm that adjusts the key driving components: soil physical properties, meteorological data, and management information, for which we quantified the inputs and losses of soil carbon (manure, crop residues, other organic inputs, decomposition, leaching). Furthermore, this approach can be combined with geophysical
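    The idea of iterating a carbon model to a steady state can be illustrated with a deliberately simplified one-pool carbon balance. This is only a sketch: the actual CANDY model tracks multiple pools and site-specific drivers, and the rate constant and input flux below are hypothetical.

```python
# Sketch: iterate a one-pool carbon balance dC/dt = input - k*C
# until the stock stops changing (steady state C* = input / k).
# The pool structure, k, and input flux are illustrative only.
def steady_state_soc(annual_input, k, c0=1.0, tol=1e-9, max_years=100000):
    """Step the balance yearly until successive stocks differ by < tol."""
    c = c0
    for _ in range(max_years):
        c_next = c + annual_input - k * c
        if abs(c_next - c) < tol:
            return c_next
        c = c_next
    return c

# e.g. 2 t C/ha/yr input with 4% annual decomposition
print(steady_state_soc(2.0, 0.04))  # approaches input/k = 50 t C/ha
```

    In practice the iteration would adjust measured pools against observed SOC rather than a single lumped stock, but the convergence logic is the same.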

  10. Dependence Assessment in Human Reliability Analysis Using Evidence Theory and AHP.

    PubMed

    Su, Xiaoyan; Mahadevan, Sankaran; Xu, Peida; Deng, Yong

    2015-07-01

    Dependence assessment among human errors in human reliability analysis (HRA) is an important issue. Many dependence assessment methods in HRA rely heavily on expert opinion and are thus subjective and sometimes inconsistent. In this article, we propose a computational model based on the Dempster-Shafer evidence theory (DSET) and the analytic hierarchy process (AHP) to handle dependence in HRA. First, the factors influencing dependence among human tasks are identified, and the weights of the factors are determined by experts using the AHP method. Second, a judgment on each factor is given by the analyst with reference to anchors and linguistic labels. Third, the judgments are represented as basic belief assignments (BBAs) and integrated into a fused BBA by weighted-average combination in DSET. Finally, the conditional human error probability (CHEP) is calculated based on the fused BBA. The proposed model can deal with ambiguity and the degree of confidence in the judgments, and is able to reduce subjectivity and improve consistency in the evaluation process. PMID:25847228
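    The weighted-average BBA combination step described in this abstract can be sketched as follows. The dependence levels, factor weights, and mass assignments below are hypothetical illustrations, not values from the study.

```python
# Sketch of weighted-average combination of basic belief assignments
# (BBAs), as used in the abstract's step 3. Frame elements and all
# numeric values are hypothetical.
def fuse_bbas(bbas, weights):
    """Weighted average of BBAs defined over a shared frame of discernment."""
    total = sum(weights)
    fused = {}
    for bba, w in zip(bbas, weights):
        for hypothesis, mass in bba.items():
            fused[hypothesis] = fused.get(hypothesis, 0.0) + w * mass / total
    return fused

# Hypothetical dependence levels: ZD = zero, LD = low, MD = moderate
bbas = [
    {"ZD": 0.6, "LD": 0.3, "MD": 0.1},   # analyst judgment on factor 1
    {"ZD": 0.2, "LD": 0.5, "MD": 0.3},   # analyst judgment on factor 2
]
weights = [0.7, 0.3]                     # AHP-derived factor weights
fused = fuse_bbas(bbas, weights)
print(fused)  # fused masses still sum to 1
```

    In the full method the fused BBA would then be mapped to a conditional human error probability; that mapping is model-specific and not shown here.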

  11. Reliability and validity of the German version of the Structured Interview of Personality Organization (STIPO)

    PubMed Central

    2013-01-01

    Background: The assessment of personality organization and its observable behavioral manifestations, i.e. personality functioning, has a long tradition in psychodynamic psychiatry. Recently, the DSM-5 Levels of Personality Functioning Scale has moved it into the focus of psychiatric diagnostics. Based on Kernberg's concept of personality organization, the Structured Interview of Personality Organization (STIPO) was developed for diagnosing personality functioning. The STIPO covers seven dimensions: (1) identity, (2) object relations, (3) primitive defenses, (4) coping/rigidity, (5) aggression, (6) moral values, and (7) reality testing and perceptual distortions. The English version of the STIPO has previously shown satisfactory psychometric properties. Methods: Validity and reliability of the German version of the 100-item instrument were evaluated in 122 psychiatric patients. All patients were diagnosed according to the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV) and assessed by means of the STIPO. Moreover, all patients completed eight questionnaires that served as criteria for the external validity of the STIPO. Results: Interrater reliability varied between intraclass correlations of .89 and 1.0; Cronbach's α for the seven dimensions ranged from .69 to .93. All a priori selected questionnaire scales correlated significantly with the corresponding STIPO dimensions. Patients with personality disorder (PD) revealed significantly higher STIPO scores (i.e. worse personality functioning) than patients without PD; patients with cluster B PD showed significantly higher STIPO scores than patients with cluster C PD. Conclusions: Interrater reliability, Cronbach's α, concurrent validity, and differential validity of the STIPO are satisfactory. The STIPO represents an appropriate instrument for the assessment of personality functioning in clinical and research settings. PMID:23941404
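    The internal-consistency statistic reported here, Cronbach's α, can be computed from per-item scores as in this sketch. The toy scores below are fabricated for illustration, not STIPO data.

```python
# Sketch: Cronbach's alpha from per-item scores.
# alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)
def cronbach_alpha(items):
    """items: list of per-item score lists, one inner list per item."""
    k = len(items)
    n = len(items[0])
    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = sum(var(it) for it in items)
    totals = [sum(items[i][j] for i in range(k)) for j in range(n)]
    return k / (k - 1) * (1 - item_vars / var(totals))

# 3 hypothetical items rated for 4 respondents
scores = [[2, 4, 3, 5], [3, 5, 2, 4], [2, 5, 3, 5]]
print(round(cronbach_alpha(scores), 3))
```

    Values toward 1 indicate that the items move together, which is why dimension-level α between .69 and .93 is read as adequate to good internal consistency.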

  12. Layered and segmented system organization (LASSO) for highly reliable inventory monitoring systems (IMS)

    SciTech Connect

    Mangan, Dennis L.; Matter, John C.; Waddoups, I.; Abhold, M. E.; Chiaro, P.

    2002-01-01

    The Trilateral Initiative is preparing for International Atomic Energy Agency (IAEA) verification of excess fissile material released from the defense programs of the United States and the Russian Federation. Following acceptance of the material using an Attribute Verification System, the IAEA will depend on an Inventory Monitoring System (IMS) to maintain continuity of knowledge of the large inventory of thousands of items. Recovery from a total loss of continuity of knowledge in such a large storage facility would involve an extremely costly inventory re-verification. This paper presents the framework for a Layered and Segmented System Organization that is the basis for a highly reliable IMS with protection-in-depth.

  13. Entity Model Based Quality Management: A First Step Towards High Reliability Organization Management

    NASA Astrophysics Data System (ADS)

    Engelbrecht, S.; Radestock, Chr.; Bohle, D. K. H.

    2010-09-01

    A management system built upon a generic entity model is presented as an approach towards management systems for High Reliability Organizations (HRO). The entity model is derived from the Ground Systems and Operations standard of the European Cooperation for Space Standardization (ECSS). DLR has launched a first application of the model in its Applied Remote Sensing Cluster, especially for the Center for Satellite Based Crisis Information. It is proposed that a management system built upon the entity model systematically enhances a significant number of HRO characteristics.

  14. Teaching organization theory for healthcare management: three applied learning methods.

    PubMed

    Olden, Peter C

    2006-01-01

    Organization theory (OT) provides a way of seeing, describing, analyzing, understanding, and improving organizations based on patterns of organizational design and behavior (Daft 2004). It gives managers models, principles, and methods with which to diagnose and fix organization structure, design, and process problems. Health care organizations (HCOs) face serious problems such as fatal medical errors, harmful treatment delays, misuse of scarce nurses, costly inefficiency, and service failures. Some of health care managers' most critical work involves designing and structuring their organizations so their missions, visions, and goals can be achieved, and in some cases so their organizations can survive. Thus, it is imperative that graduate healthcare management programs develop effective approaches for teaching OT to students who will manage HCOs. Guided by principles of education, three applied teaching/learning activities/assignments were created to teach OT in a graduate healthcare management program. These educational methods develop students' competency with OT applied to HCOs. The teaching techniques in this article may be useful to faculty teaching graduate courses in organization theory and related subjects such as leadership, quality, and operations management. PMID:16566496

  15. A Critique of Raju and Oshima's Prophecy Formulas for Assessing the Reliability of Item Response Theory-Based Ability Estimates

    ERIC Educational Resources Information Center

    Wang, Wen-Chung

    2008-01-01

    Raju and Oshima (2005) proposed two prophecy formulas based on item response theory in order to predict the reliability of ability estimates for a test after change in its length. The first prophecy formula is equivalent to the classical Spearman-Brown prophecy formula. The second prophecy formula is misleading because of an underlying false…
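    The classical Spearman-Brown prophecy formula that the first formula reduces to predicts the reliability of a test whose length is changed by a factor k from the original reliability:

```latex
\rho_{kk'} = \frac{k\,\rho_{xx'}}{1 + (k-1)\,\rho_{xx'}}
```

    Here ρ_xx' is the reliability of the original test and ρ_kk' the predicted reliability after lengthening (k > 1) or shortening (k < 1).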

  16. The [Alpha] and the [Omega] of Congeneric Test Theory: An Extension of Reliability and Internal Consistency to Heterogeneous Tests

    ERIC Educational Resources Information Center

    Lucke, Joseph F.

    2005-01-01

    Psychometric theory focuses primarily on tests that are homogeneous, measuring only one attribute of a psychosocial entity. However, the complexity of psychosocial behavior often requires tests that are heterogeneous, measuring more than one attribute. In this presentation, reliability and internal consistency are extended to heterogeneous tests…

  17. The Validity and Reliability of Concept Mapping as an Alternative Science Assessment when Item Response Theory Is Used for Scoring.

    ERIC Educational Resources Information Center

    Liu, Xiufeng

    Problems of validity and reliability of concept mapping are addressed by using item-response theory (IRT) models for scoring. In this study, the overall structure of students' concept maps are defined by the number of links, the number of hierarchies, the number of cross-links, and the number of examples. The study was conducted with 92 students…

  18. The Development of the Functional Literacy Experience Scale Based upon Ecological Theory (FLESBUET) and Validity-Reliability Study

    ERIC Educational Resources Information Center

    Özenç, Emine Gül; Dogan, M. Cihangir

    2014-01-01

    This study aims to perform a validity-reliability test by developing the Functional Literacy Experience Scale based upon Ecological Theory (FLESBUET) for primary education students. The study group includes 209 fifth grade students at Sabri Taskin Primary School in the Kartal District of Istanbul, Turkey during the 2010-2011 academic year.…

  19. Organic unity theory: the mind-body problem revisited.

    PubMed

    Goodman, A

    1991-05-01

    The purpose of this essay is to delineate the conceptual framework for psychiatry as an integrated and integrative science that unites the mental and the physical. Four basic philosophical perspectives concerning the relationship between mind and body are introduced. The biopsychosocial model, at this time the preeminent model in medical science that addresses this relationship, is examined and found to be flawed. Mental-physical identity theory is presented as the most valid philosophical approach to understanding the relationship between mind and body. Organic unity theory is then proposed as a synthesis of the biopsychosocial model and mental-physical identity theory in which the difficulties of the biopsychosocial model are resolved. Finally, some implications of organic unity theory for psychiatry are considered. 1) The conventional dichotomy between physical (organic) and mental (functional) is linguistic/conceptual rather than inherent in nature, and all events and processes involved in the etiology, pathogenesis, symptomatic manifestation, and treatment of psychiatric disorders are simultaneously biological and psychological. 2) Neuroscience requires new conceptual models to comprehend the integrated and emergent physiological processes to which psychological phenomena correspond. 3) Introspective awareness provides data that are valid for scientific inquiry and is the most direct method of knowing psychophysical events. 4) Energy currently being expended in disputes between biological and psychological psychiatry would be more productively invested in attempting to formulate the conditions under which each approach is maximally effective. PMID:2018155

  20. Validity and Reliability of Published Comprehensive Theory of Mind Tests for Normal Preschool Children: A Systematic Review

    PubMed Central

    Ziatabar Ahmadi, Seyyede Zohreh; Jalaie, Shohreh; Ashayeri, Hassan

    2015-01-01

    Objective: Theory of mind (ToM) or mindreading is an aspect of social cognition that evaluates mental states and beliefs of oneself and others. Validity and reliability are very important criteria when evaluating standard tests; without them, these tests are not usable. The aim of this study was to systematically review the validity and reliability of published English comprehensive ToM tests developed for normal preschool children. Method: We searched MEDLINE (PubMed interface), Web of Science, ScienceDirect, PsycINFO, and evidence-based medicine databases (The Cochrane Library) from 1990 to June 2015. The search strategy was the Latin transcription of ‘Theory of Mind’ AND test AND children. We also manually searched the reference lists of all included articles. Inclusion criteria were as follows: valid and reliable diagnostic ToM tests published from 1990 to June 2015 for normal preschool children. Exclusion criteria were as follows: studies that only used ToM tests and single tasks (false belief tasks) for ToM assessment and/or had no description of the structure, validity or reliability of their tests. Methodological quality of the selected articles was assessed using the Critical Appraisal Skills Programme (CASP). Results: In the primary search, we found 1237 articles across all databases. After removing duplicates and applying all inclusion and exclusion criteria, we selected 11 tests for this systematic review. Conclusion: There were few valid, reliable and comprehensive ToM tests for normal preschool children. However, we had limitations concerning the included articles. The identified ToM tests differed in populations, tasks, modes of presentation, scoring, modes of response, timing and other variables. They also had varying validities and reliabilities. Therefore, it is recommended that researchers and clinicians select ToM tests according to their psychometric characteristics

  1. Investigating Postgraduate College Admission Interviews: Generalizability Theory Reliability and Incremental Predictive Validity

    ERIC Educational Resources Information Center

    Arce-Ferrer, Alvaro J.; Castillo, Irene Borges

    2007-01-01

    The use of face-to-face interviews is controversial for college admissions decisions in light of the lack of availability of validity and reliability evidence for most college admission processes. This study investigated reliability and incremental predictive validity of a face-to-face postgraduate college admission interview with a sample of…

  2. Utilizing Generalizability Theory to Investigate the Reliability of the Grades Assigned to Undergraduate Research Papers

    ERIC Educational Resources Information Center

    Gugiu, Mihaiela R.; Gugiu, Paul C.; Baldus, Robert

    2012-01-01

    Background: Educational researchers have long espoused the virtues of writing with regard to student cognitive skills. However, research on the reliability of the grades assigned to written papers reveals a high degree of contradiction, with some researchers concluding that the grades assigned are very reliable whereas others suggesting that they…

  3. The chronic toxicity of molybdate to marine organisms. I. Generating reliable effects data.

    PubMed

    Heijerick, D G; Regoli, L; Stubblefield, W

    2012-07-15

    A scientific research program was initiated by the International Molybdenum Association (IMOA) to address identified gaps in the environmental toxicity data for the molybdate ion (MoO₄²⁻). These gaps were previously identified during the preparation of EU-REACH dossiers for different molybdenum compounds (European Union regulation on Registration, Evaluation, Authorization and Restriction of Chemical substances; EC, 2006). Evaluation of the open literature identified few reliable marine ecotoxicological data that could be used for deriving a Predicted No-Effect Concentration (PNEC) for the marine environment. Rather than calculating a marine PNEC using the assessment-factor methodology on a combined freshwater/marine dataset, IMOA decided to generate sufficient reliable marine chronic data to permit derivation of a PNEC by means of the more scientifically robust species sensitivity distribution (SSD) approach (also called the statistical extrapolation approach). Nine test species were chronically exposed to molybdate (added as sodium molybdate dihydrate, Na₂MoO₄·2H₂O) according to published standard testing guidelines that are acceptable for a broad range of regulatory purposes. The selected test organisms were representative of typical marine trophic levels: micro-algae/diatom (Phaeodactylum tricornutum, Dunaliella tertiolecta), macro-alga (Ceramium tenuicorne), mysid (Americamysis bahia), copepod (Acartia tonsa), fish (Cyprinodon variegatus), echinoderms (Dendraster excentricus, Strongylocentrotus purpuratus) and molluscs (Mytilus edulis, Crassostrea gigas). Available NOEC/EC10 levels ranged between 4.4 mg Mo/L (blue mussel M. edulis) and 1174 mg Mo/L (oyster C. gigas). Using all reliable marine chronic effects data currently available, an HC5,50% (median hazardous concentration affecting 5% of the species) of 5.74 mg Mo/L was derived with the statistical extrapolation approach, a value that can be used for national and
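    The SSD/statistical-extrapolation step can be sketched as a log-normal fit to the chronic NOEC/EC10 values, with the HC5 read off as the 5th percentile. Only the endpoints 4.4 and 1174 mg Mo/L come from the abstract; the intermediate values below are placeholders, so this sketch will not reproduce the study's HC5,50% of 5.74 mg Mo/L.

```python
# Sketch of a species sensitivity distribution (SSD): fit a log-normal
# to chronic NOEC/EC10 values and take the 5th percentile as the HC5.
# All NOEC values except the two endpoints are hypothetical.
import math

def hc5_lognormal(noecs):
    logs = [math.log10(x) for x in noecs]
    mu = sum(logs) / len(logs)
    sd = math.sqrt(sum((v - mu) ** 2 for v in logs) / (len(logs) - 1))
    z05 = -1.6449  # 5th percentile of the standard normal distribution
    return 10 ** (mu + z05 * sd)

noecs = [4.4, 12.0, 35.0, 90.0, 240.0, 600.0, 1174.0]  # mg Mo/L, illustrative
print(f"HC5 ~ {hc5_lognormal(noecs):.2f} mg Mo/L")
```

    Regulatory derivations typically also report confidence bounds on the HC5 (hence the "50%" in HC5,50%), which require the sampling distribution of the fitted percentile rather than this point estimate.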

  4. Highly Conductive and Reliable Copper-Filled Isotropically Conductive Adhesives Using Organic Acids for Oxidation Prevention

    NASA Astrophysics Data System (ADS)

    Chen, Wenjun; Deng, Dunying; Cheng, Yuanrong; Xiao, Fei

    2015-07-01

    The easy oxidation of copper is one critical obstacle to high-performance copper-filled isotropically conductive adhesives (ICAs). In this paper, a facile method to prepare highly reliable, highly conductive, and low-cost ICAs is reported. The copper fillers were treated with organic acids for oxidation prevention. Compared with ICA filled with untreated copper flakes, the ICA filled with copper flakes treated by different organic acids exhibited much lower bulk resistivity. The lowest bulk resistivity achieved was 4.5 × 10⁻⁵ Ω cm, which is comparable to that of commercially available Ag-filled ICA. After 500 h of 85°C/85% relative humidity (RH) aging, the treated ICAs showed quite stable bulk resistivity and relatively stable contact resistance. Through analyzing the results of x-ray diffraction, x-ray photoelectron spectroscopy, and thermogravimetric analysis, we found that, with the assistance of organic acids, the treated copper flakes exhibited resistance to oxidation, thus guaranteeing good performance.

  5. Portable SERS-enabled micropipettes for microarea sampling and reliably quantitative detection of surface organic residues.

    PubMed

    Fang, Wei; Zhang, Xinwei; Chen, Yong; Wan, Liang; Huang, Weihua; Shen, Aiguo; Hu, Jiming

    2015-09-15

    We report the first microsampling device for reliably quantitative, label-free and separation-free detection of multiple components of surface organic residues (SORs) by means of a quality-controllable surface-enhanced Raman scattering (SERS)-enabled micropipette. The micropipette comprises a drawn glass capillary with a tiny orifice (∼50 μm) at the distal tip, where specially designed nanorattles (NRs) are compactly coated on the inner wall surface. SERS signals of 4-mercaptobenzoic acid (MBA) anchored inside the internal gap of the NRs can be used to evaluate and control the quality of the micropipettes and therefore allow us to overcome the limitations of a reliably quantitative SERS assay using traditional substrates without an internal standard. By dropping a trace of extraction agent on target SORs located on a narrow surface, the capillary and SERS functionalities of these micropipettes allow on-site microsampling via capillary action and subsequent multiplex distinction/detection thanks to their molecularly narrow Raman peaks. For example, 8 nM thiram (TMTD), 8 nM malachite green (MG), and 1.5 μM (400 ppb) methyl parathion (MPT) on pepper and cucumber peels have been simultaneously detected over a wide detection range. The portable SERS-enabled device could potentially be incorporated with liquid-liquid or solid-phase micro-extraction devices for a broader range of applications in rapid field analysis of food-, public- and environmental-security-related SORs. PMID:26274894

  6. Magnetoelectroluminescence of organic heterostructures: Analytical theory and spectrally resolved measurements

    DOE PAGESBeta

    Liu, Feilong; Kelley, Megan R.; Crooker, Scott A.; Nie, Wanyi; Mohite, Aditya D.; Ruden, P. Paul; Smith, Darryl L.

    2014-12-22

    The effect of a magnetic field on the electroluminescence of organic light emitting devices originates from the hyperfine interaction between the electron/hole polarons and the hydrogen nuclei of the host molecules. In this paper, we present an analytical theory of magnetoelectroluminescence for organic semiconductors. To be specific, we focus on bilayer heterostructure devices. In the case we are considering, light generation at the interface of the donor and acceptor layers results from the formation and recombination of exciplexes. The spin physics is described by a stochastic Liouville equation for the electron/hole spin density matrix. By finding the steady-state analytical solution using Bloch-Wangsness-Redfield theory, we explore how the singlet/triplet exciplex ratio is affected by the hyperfine interaction strength and by the external magnetic field. In order to validate the theory, spectrally resolved electroluminescence experiments on BPhen/m-MTDATA devices are analyzed. With increasing emission wavelength, the width of the magnetic field modulation curve of the electroluminescence increases while its depth decreases. Furthermore, these observations are consistent with the model.

  7. Magnetoelectroluminescence of organic heterostructures: Analytical theory and spectrally resolved measurements

    SciTech Connect

    Liu, Feilong; Kelley, Megan R.; Crooker, Scott A.; Nie, Wanyi; Mohite, Aditya D.; Ruden, P. Paul; Smith, Darryl L.

    2014-12-22

    The effect of a magnetic field on the electroluminescence of organic light emitting devices originates from the hyperfine interaction between the electron/hole polarons and the hydrogen nuclei of the host molecules. In this paper, we present an analytical theory of magnetoelectroluminescence for organic semiconductors. To be specific, we focus on bilayer heterostructure devices. In the case we are considering, light generation at the interface of the donor and acceptor layers results from the formation and recombination of exciplexes. The spin physics is described by a stochastic Liouville equation for the electron/hole spin density matrix. By finding the steady-state analytical solution using Bloch-Wangsness-Redfield theory, we explore how the singlet/triplet exciplex ratio is affected by the hyperfine interaction strength and by the external magnetic field. In order to validate the theory, spectrally resolved electroluminescence experiments on BPhen/m-MTDATA devices are analyzed. With increasing emission wavelength, the width of the magnetic field modulation curve of the electroluminescence increases while its depth decreases. Furthermore, these observations are consistent with the model.

  8. Polyanthraquinone as a Reliable Organic Electrode for Stable and Fast Lithium Storage.

    PubMed

    Song, Zhiping; Qian, Yumin; Gordin, Mikhail L; Tang, Duihai; Xu, Terrence; Otani, Minoru; Zhan, Hui; Zhou, Haoshen; Wang, Donghai

    2015-11-16

    In spite of recent progress, there is still a lack of reliable organic electrodes for Li storage with high comprehensive performance, especially in terms of long-term cycling stability. Herein, we report an ideal polymer electrode based on anthraquinone, namely, polyanthraquinone (PAQ), or specifically, poly(1,4-anthraquinone) (P14AQ) and poly(1,5-anthraquinone) (P15AQ). As a lithium-storage cathode, P14AQ showed exceptional performance, including reversible capacity almost equal to the theoretical value (260 mA h g⁻¹; >257 mA h g⁻¹ for AQ), a very small voltage gap between the charge and discharge curves (2.18 V − 2.14 V = 0.04 V), stable cycling performance (99.4% capacity retention after 1000 cycles), and fast-discharge/charge ability (release of 69% of the low-rate capacity or 64% of the energy in just 2 min). Exploration of the structure-performance relationship between P14AQ and related materials also provided us with deeper understanding for the design of organic electrodes. PMID:26411505

  9. Theory of hydrogen migration in organic-inorganic halide perovskites.

    PubMed

    Egger, David A; Kronik, Leeor; Rappe, Andrew M

    2015-10-12

    Solar cells based on organic-inorganic halide perovskites have recently been proven to be remarkably efficient. However, they exhibit hysteresis in their current-voltage curves, and their stability in the presence of water is problematic. Both issues are possibly related to a diffusion of defects in the perovskite material. By using first-principles calculations based on density functional theory, we study the properties of an important defect in hybrid perovskites-interstitial hydrogen. We show that differently charged defects occupy different crystal sites, which may allow for ionization-enhanced defect migration following the Bourgoin-Corbett mechanism. Our analysis highlights the structural flexibility of organic-inorganic perovskites: successive iodide displacements, combined with hydrogen bonding, enable proton diffusion with low migration barriers. These findings indicate that hydrogen defects can be mobile and thus highly relevant for the performance of perovskite solar cells. PMID:26073061

  10. The "New Institutionalism" in Organization Theory: Bringing Society and Culture Back in

    ERIC Educational Resources Information Center

    Senge, Konstanze

    2013-01-01

    This investigation will discuss the emergence of an economistic perspective among the dominant approaches of organization theory in the United States since the inception of "organization studies" as an academic discipline. It maintains that Contingency theory, Resource Dependency theory, Population Ecology theory, and Transaction Cost theory…

  11. Generalizability Theory Analysis of CBM Maze Reliability in Third- through Fifth-Grade Students

    ERIC Educational Resources Information Center

    Mercer, Sterett H.; Dufrene, Brad A.; Zoder-Martell, Kimberly; Harpole, Lauren Lestremau; Mitchell, Rachel R.; Blaze, John T.

    2012-01-01

    Despite growing use of CBM Maze in universal screening and research, little information is available regarding the number of CBM Maze probes needed for reliable decisions. The current study extends existing research on the technical adequacy of CBM Maze by investigating the number of probes and assessment durations (1-3 min) needed for reliable…

  12. Reliable thermal processing of organic perovskite films deposited on ZnO

    NASA Astrophysics Data System (ADS)

    Zakhidov, Alex; Manspeaker, Chris; Lyashenko, Dmitry; Alex Zakhidov Team

    Zinc oxide (ZnO) is a promising semiconducting material to serve as an electron transport layer (ETL) for solar cell devices based on organo-halide lead perovskites. ZnO ETLs for perovskite photovoltaics have a combination of attractive electronic and optical properties: i) the electron affinity of ZnO is well aligned with the valence band edge of CH3NH3PbI3, ii) the electron mobility of ZnO is >1 cm2/(Vs), a few orders of magnitude higher than that of TiO2 (another popular choice of ETL for perovskite photovoltaic devices), and iii) ZnO has a large band gap of 3.3 eV, which ensures optical transparency and a large barrier for hole injection. Moreover, ZnO nanostructures can be printed on flexible substrates at room temperature in a cost-effective manner. However, it was recently found that organic perovskites deposited on ZnO are unstable and readily decompose at >90°C. In this work, we further investigate the mechanism of decomposition of CH3NH3PbI3 films deposited on ZnO and reveal the role of the solvent in the film during the annealing process. We also develop a restricted volume solvent annealing (RVSA) process for post-annealing of the perovskite film on ZnO without decomposition. We demonstrate that RVSA enables reliable perovskite solar cell fabrication.

  13. Reliable sex and strain discrimination in the mouse vomeronasal organ and accessory olfactory bulb.

    PubMed

    Tolokh, Illya I; Fu, Xiaoyan; Holy, Timothy E

    2013-08-21

    Animals modulate their courtship and territorial behaviors in response to olfactory cues produced by other animals. In rodents, detecting these cues is the primary role of the accessory olfactory system (AOS). We sought to systematically investigate the natural stimulus coding logic and robustness in neurons of the first two stages of accessory olfactory processing, the vomeronasal organ (VNO) and accessory olfactory bulb (AOB). We show that firing rate responses of just a few well-chosen mouse VNO or AOB neurons can be used to reliably encode both sex and strain of other mice from cues contained in urine. Additionally, we show that this population code can generalize to new concentrations of stimuli and appears to represent stimulus identity in terms of diverging paths in coding space. Together, the results indicate that a firing rate code on the timescale of seconds is sufficient for accurate classification of pheromonal patterns at different concentrations and may be used by AOS neural circuitry to discriminate among naturally occurring urine stimuli. PMID:23966710

  14. Reliability of a tool for measuring theory of planned behaviour constructs for use in evaluating research use in policymaking

    PubMed Central

    2011-01-01

    Background Although measures of knowledge translation and exchange (KTE) effectiveness based on the theory of planned behavior (TPB) have been used among patients and providers, no measure has been developed for use among health system policymakers and stakeholders. A tool that measures the intention to use research evidence in policymaking could assist researchers in evaluating the effectiveness of KTE strategies that aim to support evidence-informed health system decision-making. Therefore, we developed a 15-item tool to measure four TPB constructs (intention, attitude, subjective norm, and perceived control) and assessed its face validity through key informant interviews. Methods We carried out a reliability study to assess the tool's internal consistency and test-retest reliability. Our study sample consisted of 62 policymakers and stakeholders who participated in deliberative dialogues. We assessed internal consistency using Cronbach's alpha and generalizability (G) coefficients, and we assessed test-retest reliability by calculating Pearson correlation coefficients (r) and G coefficients for each construct and for the tool overall. Results The internal consistency of items within each construct was good, with alpha ranging from 0.68 to 0.89. G coefficients were lower for a single administration (G = 0.34 to G = 0.73) than for the average of two administrations (G = 0.79 to G = 0.89). Test-retest reliability coefficients for the constructs ranged from r = 0.26 to r = 0.77, and from G = 0.31 to G = 0.62 for a single administration and from G = 0.47 to G = 0.86 for the average of two administrations. Test-retest reliability of the tool using G theory was moderate (G = 0.5) when we generalized across a single observation, but became strong (G = 0.9) when we averaged across both administrations. Conclusion This study provides preliminary evidence for the reliability of a tool that can be used to measure TPB constructs in relation to research use in policymaking.

  15. Understanding Schools as High-Reliability Organizations: An Exploratory Examination of Teachers' and School Leaders' Perceptions of Success

    ERIC Educational Resources Information Center

    Lorton, Juli A.; Bellamy, G. Thomas; Reece, Anne; Carlson, Jill

    2013-01-01

    Drawing on research on high-reliability organizations, this interview-based qualitative case study employs four characteristics of such organizations as a lens for analyzing the operations of one very successful K-5 public school. Results suggest that the school had processes similar to those characteristic of high-reliability organizations: a…

  16. What range of trait levels can the Autism-Spectrum Quotient (AQ) measure reliably? An item response theory analysis.

    PubMed

    Murray, Aja Louise; Booth, Tom; McKenzie, Karen; Kuenssberg, Renate

    2016-06-01

    It has previously been noted that inventories measuring traits that originated in a psychopathological paradigm can often reliably measure only a very narrow range of trait levels that are near and above clinical cutoffs. Much recent work has, however, suggested that autism spectrum disorder traits are on a continuum of severity that extends well into the nonclinical range. This implies a need for inventories that can capture individual differences in autistic traits from very high levels all the way to the opposite end of the continuum. The Autism-Spectrum Quotient (AQ) was developed based on a closely related rationale, but there has, to date, been no direct test of the range of trait levels that the AQ can reliably measure. To assess this, we fit a bifactor item response theory model to the AQ. Results suggested that the AQ measures moderately low to moderately high levels of a general autistic trait with good measurement precision. The reliable range of measurement was significantly improved by scoring the instrument using its 4-point response scale, rather than dichotomizing responses. These results support the use of the AQ in nonclinical samples, but suggest that items measuring very low and very high levels of autistic traits would be beneficial additions to the inventory. PMID:26302097

  17. Reliable Energy Level Alignment at Physisorbed Molecule–Metal Interfaces from Density Functional Theory

    PubMed Central

    2015-01-01

    A key quantity for molecule–metal interfaces is the energy level alignment of molecular electronic states with the metallic Fermi level. We develop and apply an efficient theoretical method, based on density functional theory (DFT), that can yield quantitatively accurate energy level alignment information for physisorbed metal–molecule interfaces. The method builds on the “DFT+Σ” approach, grounded in many-body perturbation theory, which introduces an approximate electron self-energy that corrects the level alignment obtained from conventional DFT for missing exchange and correlation effects associated with the gas-phase molecule and substrate polarization. Here, we extend the DFT+Σ approach in two important ways: first, we employ optimally tuned range-separated hybrid functionals to compute the gas-phase term, rather than relying on GW or total energy differences as in prior work; second, we use a nonclassical DFT-determined image-charge plane of the metallic surface to compute the substrate polarization term, rather than the classical DFT-derived image plane used previously. We validate this new approach by a detailed comparison with experimental and theoretical reference data for several prototypical molecule–metal interfaces, where excellent agreement with experiment is achieved: benzene on graphite (0001), and 1,4-benzenediamine, Cu-phthalocyanine, and 3,4,9,10-perylene-tetracarboxylic-dianhydride on Au(111). In particular, we show that the method correctly captures level alignment trends across chemical systems and that it retains its accuracy even for molecules for which conventional DFT suffers from severe self-interaction errors. PMID:25741626

  18. General systems theory, brain organization, and early experiences.

    PubMed

    Denenberg, V H

    1980-01-01

    Three hypothetical brain processes--interhemispheric coupling, hemispheric activation, and interhemispheric inhibition--are derived from an equation characterizing general systems theory. To investigate these processes, experimental rats were reared under differing early experience conditions. When adult, they had their right or left neocortex lesioned, had a sham operation, or were left undisturbed. Interhemispheric coupling was measured by means of a correlation coefficient between the right and left hemispheres. The presence of a significant positive correlation is taken as evidence of a negative feedback loop between the hemispheres. In one experimental population, in which rats did not receive any extra stimulation in infancy, the correlation was not significantly different from zero, thus implying that the two hemispheres were operating independently. In another population, in which rats had received handling stimulation in infancy, the correlation coefficient was significant (0.543), indicating that the hemispheres were coupled in a systems arrangement. The processes of hemispheric activation and interhemispheric inhibition were assessed by comparing the mean performance of the two unilateral lesion groups and the group with intact brain. The two rat populations had different forms of brain organizations as measured by these processes. These analyses show that the behavior of the isolated hemisphere cannot be directly extrapolated to the behavior of the connected hemisphere. If there is hemispheric coupling via a negative feedback loop or if there is interhemispheric inhibition, then the disconnected hemisphere may show behaviors that are not evident in the normal connected condition. PMID:7356045

  19. Theory manual for FAROW version 1.1: A numerical analysis of the Fatigue And Reliability Of Wind turbine components

    SciTech Connect

    WINTERSTEIN,STEVEN R.; VEERS,PAUL S.

    2000-01-01

    Because the fatigue lifetime of wind turbine components depends on several factors that are highly variable, a numerical analysis tool called FAROW has been created to cast the problem of component fatigue life in a probabilistic framework. The probabilistic analysis is accomplished using methods of structural reliability (FORM/SORM). While the workings of the FAROW software package are defined in the user's manual, this theory manual outlines the mathematical basis. A deterministic solution for the time to failure is made possible by assuming analytical forms for the basic inputs of wind speed, stress response, and material resistance. Each parameter of the assumed forms for the inputs can be defined to be a random variable. The analytical framework is described and the solution for time to failure is derived.
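
    FAROW itself solves the probabilistic fatigue problem analytically with FORM/SORM methods; as a rough, hypothetical illustration of the same framing (every distribution and parameter value below is invented for the sketch and is not a FAROW input), a Monte Carlo sample over random S-N material strength and stress response gives the probability that fatigue life falls short of a design life:

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 100_000

# Hypothetical random inputs: S-N curve N(S) = C * S**(-m) and an
# equivalent constant-amplitude stress range per sampled realization
C = rng.lognormal(mean=np.log(1e13), sigma=0.3, size=n_samples)  # material coefficient
S = rng.lognormal(mean=np.log(50.0), sigma=0.2, size=n_samples)  # stress range, MPa
m = 3.0                    # S-N slope, held fixed here
cycles_per_year = 1e7      # assumed loading rate

N_fail = C * S ** (-m)                   # cycles to failure per realization
life_years = N_fail / cycles_per_year    # fatigue life per realization

target_life = 5.0          # hypothetical design life in years
p_failure = np.mean(life_years < target_life)
print(f"estimated P(life < {target_life:g} yr) = {p_failure:.3f}")
```

    FORM/SORM reach the same failure probability by locating the most probable failure point on the limit-state surface rather than by brute-force sampling, which is why FAROW can afford an analytical treatment of the inputs.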

  20. A theory for the arrangement of sensory organs in Drosophila.

    PubMed

    Zhu, Huifeng; Gunaratne, Preethi H; Roman, Gregg W; Gunaratne, Gemunu H

    2010-03-01

    We study the arrangements of recurved bristles on the anterior wing margin of wild-type and mutant Drosophila. The epidermal or neural fate of a proneural cell depends on the concentrations of proteins of the achaete-scute complex. At puparium formation, concentrations of proteins are nearly identical in all cells of the anterior wing and each cell has the potential for neural fate. In wild-type flies, the action of regulatory networks drives the initial state to one where a bristle grows out of every fifth cell. Recent experiments have shown that the frequency of recurved bristles can be made to change by adjusting the mean concentrations of the zinc-finger transcription factor Senseless and the micro-RNA miR-9a. Specifically, mutant flies with reduced levels of miR-9a exhibit ectopic bristles, and those with lower levels of both miR-9a and Senseless show regular organization of recurved bristles, but with a lower periodicity of 4. We argue that these characteristics can be explained assuming an underlying Turing-type bifurcation whereby a periodic pattern spontaneously emerges from a uniform background. However, bristle patterns occur in a discrete array of cells, and are not mediated by diffusion. We argue that intracellular actions of transmembrane proteins such as Delta and Notch can play a role of diffusion in destabilizing the homogeneous state. In contrast to diffusion, intercellular actions can be activating or inhibiting; further, there can be lateral cross-species interactions. We introduce a phenomenological model to study bristle arrangements and make several model-independent predictions that can be tested in experiments. In our theory, miRNA-9a is one of the components of the underlying network and has no special regulatory role. The loss of periodicity in its absence is due to the transfer of the system to a bistable state. PMID:20370287

  1. A theory for the arrangement of sensory organs in Drosophila

    NASA Astrophysics Data System (ADS)

    Zhu, Huifeng; Gunaratne, Preethi H.; Roman, Gregg W.; Gunaratne, Gemunu H.

    2010-03-01

    We study the arrangements of recurved bristles on the anterior wing margin of wild-type and mutant Drosophila. The epidermal or neural fate of a proneural cell depends on the concentrations of proteins of the achaete-scute complex. At puparium formation, concentrations of proteins are nearly identical in all cells of the anterior wing and each cell has the potential for neural fate. In wild-type flies, the action of regulatory networks drives the initial state to one where a bristle grows out of every fifth cell. Recent experiments have shown that the frequency of recurved bristles can be made to change by adjusting the mean concentrations of the zinc-finger transcription factor Senseless and the micro-RNA miR-9a. Specifically, mutant flies with reduced levels of miR-9a exhibit ectopic bristles, and those with lower levels of both miR-9a and Senseless show regular organization of recurved bristles, but with a lower periodicity of 4. We argue that these characteristics can be explained assuming an underlying Turing-type bifurcation whereby a periodic pattern spontaneously emerges from a uniform background. However, bristle patterns occur in a discrete array of cells, and are not mediated by diffusion. We argue that intracellular actions of transmembrane proteins such as Delta and Notch can play a role of diffusion in destabilizing the homogeneous state. In contrast to diffusion, intercellular actions can be activating or inhibiting; further, there can be lateral cross-species interactions. We introduce a phenomenological model to study bristle arrangements and make several model-independent predictions that can be tested in experiments. In our theory, miRNA-9a is one of the components of the underlying network and has no special regulatory role. The loss of periodicity in its absence is due to the transfer of the system to a bistable state.

  2. Aligning the Undergraduate Organic Laboratory Experience with Professional Work: The Centrality of Reliable and Meaningful Data

    ERIC Educational Resources Information Center

    Alaimo, Peter J.; Langenhan, Joseph M.; Suydam, Ian T.

    2014-01-01

    Many traditional organic chemistry lab courses do not adequately help students to develop the professional skills required for creative, independent work. The overarching goal of the new organic chemistry lab series at Seattle University is to teach undergraduates to think, perform, and behave more like professional scientists. The conversion of…

  3. Left-right organizer flow dynamics: how much cilia activity reliably yields laterality?

    PubMed

    Sampaio, Pedro; Ferreira, Rita R; Guerrero, Adán; Pintado, Petra; Tavares, Bárbara; Amaro, Joana; Smith, Andrew A; Montenegro-Johnson, Thomas; Smith, David J; Lopes, Susana S

    2014-06-23

    Internal organs are asymmetrically positioned inside the body. Embryonic motile cilia play an essential role in this process by generating a directional fluid flow inside the vertebrate left-right organizer. Detailed characterization of how fluid flow dynamics modulates laterality is lacking. We used zebrafish genetics to experimentally generate a range of flow dynamics. By following the development of each embryo, we show that fluid flow in the left-right organizer is asymmetric and provides a good predictor of organ laterality. This was tested in mosaic organizers composed of motile and immotile cilia generated by dnah7 knockdowns. In parallel, we used simulations of fluid dynamics to analyze our experimental data. These revealed that fluid flow generated by 30 or more cilia predicts 90% situs solitus, similar to experimental observations. We conclude that cilia number, dorsal anterior motile cilia clustering, and left flow are critical to situs solitus via robust asymmetric charon expression. PMID:24930722

  4. How Settings Change People: Applying Behavior Setting Theory to Consumer-Run Organizations

    ERIC Educational Resources Information Center

    Brown, Louis D.; Shepherd, Matthew D.; Wituk, Scott A.; Meissen, Greg

    2007-01-01

    Self-help initiatives stand as a classic context for organizational studies in community psychology. Behavior setting theory stands as a classic conception of organizations and the environment. This study explores both, applying behavior setting theory to consumer-run organizations (CROs). Analysis of multiple data sets from all CROs in Kansas…

  5. Organization Theory and Memory for Prose: A Review of the Literature

    ERIC Educational Resources Information Center

    Shimmerlik, Susan M.

    1978-01-01

    Organization theory emphasizes groupings of items on the basis of a variety of characteristics, and the role of the learner as an active processor or encoder of information. Research on organization theory as it is applied to memory and recall of prose is reviewed here. (BW)

  6. Optimizing Reliability of Digital Inclinometer and Flexicurve Ruler Measures of Spine Curvatures in Postmenopausal Women with Osteoporosis of the Spine: An Illustration of the Use of Generalizability Theory

    PubMed Central

    MacIntyre, Norma J.; Bennett, Lisa; Bonnyman, Alison M.; Stratford, Paul W.

    2011-01-01

    The study illustrates the application of generalizability theory (G-theory) to identify measurement protocols that optimize reliability of two clinical methods for assessing spine curvatures in women with osteoporosis. Triplicate measures of spine curvatures were acquired for 9 postmenopausal women with spine osteoporosis by two raters during a single visit using a digital inclinometer and a flexicurve ruler. G-coefficients were estimated using a G-study, and a measurement protocol that optimized inter-rater and inter-trial reliability was identified using follow-up decision studies. G-theory provides reliability estimates for measurement devices which can be generalized to different clinical contexts and/or measurement designs. PMID:22482067
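
    A decision (D-) study of the kind described projects the G coefficient for alternative measurement protocols from the G-study variance components. The sketch below uses hypothetical variance components (not the paper's estimates) for a fully crossed person × rater × trial design, showing how averaging over more raters and trials shrinks the relative error term:

```python
def d_study_g(var_p, var_pr, var_pt, var_res, n_raters, n_trials):
    """Relative G coefficient for the mean over n_raters raters and
    n_trials trials in a crossed person x rater x trial design."""
    rel_error = (var_pr / n_raters
                 + var_pt / n_trials
                 + var_res / (n_raters * n_trials))
    return var_p / (var_p + rel_error)

# Hypothetical components: person, person x rater, person x trial, residual
components = dict(var_p=4.0, var_pr=1.0, var_pt=0.5, var_res=1.5)

for n_r in (1, 2):
    for n_t in (1, 2, 3):
        g = d_study_g(**components, n_raters=n_r, n_trials=n_t)
        print(f"raters={n_r} trials={n_t}  G={g:.2f}")
```

    Scanning the candidate designs this way is exactly how a follow-up D-study picks the protocol (number of raters and trials) that reaches an acceptable G with the least measurement effort.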

  7. Toward a Broader Theory of Administration: A Defense of the Artistic Perspective on Organization.

    ERIC Educational Resources Information Center

    McDaniel, Thomas R.

    1982-01-01

    Modern administrative theory has evolved from social science models and experiments. Such theory has not adequately accounted for the individual in the organization. Future theory development should reflect both the scientific and artistic dimensions of inquiry and should move toward the integration of concepts from the arts and the sciences.…

  8. Reliability training

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Dillard, Richard B.; Wong, Kam L.; Barber, Frank J.; Barina, Frank J.

    1992-01-01

    Discussed here is failure physics, the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low cost reliable products. A review of reliability for the years 1940 to 2000 is given. Next, a review of mathematics is given as well as a description of what elements contribute to product failures. Basic reliability theory and the disciplines that allow us to control and eliminate failures are elucidated.

  9. An examination of maintenance activities in liquid metal reactor facilities: An analysis by the Centralized Reliability Data Organization (CREDO)

    SciTech Connect

    Haire, M J; Knee, H E; Manning, J J; Manneschmidt, J F; Setoguchi, K

    1987-01-01

    The Centralized Reliability Data Organization (CREDO) is the largest repository of liquid metal reactor (LMR) component reliability data in the world. It is jointly sponsored by the US Department of Energy (DOE) and the Power Reactor and Nuclear Fuel Development Corporation (PNC) of Japan. The CREDO database contains information on a population of more than 21,000 components and approximately 1300 event records. Total experience is approaching 1.2 billion component operating hours. Although data gathering for CREDO concentrates on event (failure) information, the work reported here focuses on the maintenance information contained in CREDO and the development of maintenance critical items lists. That is, components are ranked in prioritized lists from worst to best performers from a maintenance standpoint.

  10. Density-functional theory with screened van der Waals interactions for the modeling of hybrid inorganic-organic systems.

    PubMed

    Ruiz, Victor G; Liu, Wei; Zojer, Egbert; Scheffler, Matthias; Tkatchenko, Alexandre

    2012-04-01

    The electronic properties and the function of hybrid inorganic-organic systems (HIOS) are intimately linked to their interface geometry. Here we show that the inclusion of the many-body collective response of the substrate electrons inside the inorganic bulk enables us to reliably predict the HIOS geometries and energies. This is achieved by the combination of dispersion-corrected density-functional theory (the DFT+van der Waals approach) [Phys. Rev. Lett. 102, 073005 (2009)], with the Lifshitz-Zaremba-Kohn theory for the nonlocal Coulomb screening within the bulk. Our method yields geometries in remarkable agreement (≈0.1 Å) with normal-incidence x-ray standing wave measurements for the 3,4,9,10-perylene-tetracarboxylic acid dianhydride (C₂₄O₆H₈, PTCDA) molecule on Cu(111), Ag(111), and Au(111) surfaces. Similarly accurate results are obtained for xenon and benzene adsorbed on metal surfaces. PMID:22540809

  11. Using Multivariate Generalizability Theory to Assess the Effect of Content Stratification on the Reliability of a Performance Assessment

    ERIC Educational Resources Information Center

    Keller, Lisa A.; Clauser, Brian E.; Swanson, David B.

    2010-01-01

    In recent years, demand for performance assessments has continued to grow. However, performance assessments are notorious for lower reliability, and in particular, low reliability resulting from task specificity. Since reliability analyses typically treat the performance tasks as randomly sampled from an infinite universe of tasks, these estimates…

  12. The Design Organization Test: Further Demonstration of Reliability and Validity as a Brief Measure of Visuospatial Ability

    PubMed Central

    Killgore, William D. S.; Gogel, Hannah

    2013-01-01

    Neuropsychological assessments are frequently time-consuming and fatiguing for patients. Brief screening evaluations may reduce test duration and allow more efficient use of time by permitting greater attention toward neuropsychological domains showing probable deficits. The Design Organization Test (DOT) was initially developed as a 2-min paper-and-pencil alternative for the Block Design (BD) subtest of the Wechsler scales. Although initially validated for clinical neurologic patients, we sought to further establish the reliability and validity of this test in a healthy, more diverse population. Two alternate versions of the DOT and the Wechsler Abbreviated Scale of Intelligence (WASI) were administered to 61 healthy adult participants. The DOT showed high alternate forms reliability (r = .90–.92), and the two versions yielded equivalent levels of performance. The DOT was highly correlated with BD (r = .76–.79) and was significantly correlated with all subscales of the WASI. The DOT proved useful when used in lieu of BD in the calculation of WASI IQ scores. Findings support the reliability and validity of the DOT as a measure of visuospatial ability and suggest its potential worth as an efficient estimate of intellectual functioning in situations where lengthier tests may be inappropriate or unfeasible. PMID:25265311

  13. The design organization test: further demonstration of reliability and validity as a brief measure of visuospatial ability.

    PubMed

    Killgore, William D S; Gogel, Hannah

    2014-01-01

    Neuropsychological assessments are frequently time-consuming and fatiguing for patients. Brief screening evaluations may reduce test duration and allow more efficient use of time by permitting greater attention toward neuropsychological domains showing probable deficits. The Design Organization Test (DOT) was initially developed as a 2-min paper-and-pencil alternative for the Block Design (BD) subtest of the Wechsler scales. Although initially validated for clinical neurologic patients, we sought to further establish the reliability and validity of this test in a healthy, more diverse population. Two alternate versions of the DOT and the Wechsler Abbreviated Scale of Intelligence (WASI) were administered to 61 healthy adult participants. The DOT showed high alternate forms reliability (r = .90-.92), and the two versions yielded equivalent levels of performance. The DOT was highly correlated with BD (r = .76-.79) and was significantly correlated with all subscales of the WASI. The DOT proved useful when used in lieu of BD in the calculation of WASI IQ scores. Findings support the reliability and validity of the DOT as a measure of visuospatial ability and suggest its potential worth as an efficient estimate of intellectual functioning in situations where lengthier tests may be inappropriate or unfeasible. PMID:25265311

  14. Implications of Complexity and Chaos Theories for Organizations that Learn

    ERIC Educational Resources Information Center

    Smith, Peter A. C.

    2003-01-01

    In 1996 Hubert Saint-Onge and Smith published an article ("The evolutionary organization: avoiding a Titanic fate", in The Learning Organization, Vol. 3 No. 4), based on their experience at the Canadian Imperial Bank of Commerce (CIBC). It was established at CIBC that change could be successfully facilitated through blended application of theory…

  15. Reliable measurement of the Seebeck coefficient of organic and inorganic materials between 260 K and 460 K

    SciTech Connect

    Beretta, D.; Lanzani, G.; Bruno, P.; Caironi, M.

    2015-07-15

    A new experimental setup for reliable measurement of the in-plane Seebeck coefficient of organic and inorganic thin films and bulk materials is reported. The system is based on the “Quasi-Static” approach and can measure the thermopower in the range of temperature between 260 K and 460 K. The system has been tested on a pure nickel bulk sample and on a thin film of commercially available PEDOT:PSS deposited by spin coating on glass. Repeatability within 1.5% for the nickel sample is demonstrated, while accuracy in the measurement of both organic and inorganic samples is guaranteed by time interpolation of data and by operating with a temperature difference over the sample of less than 1 K.
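
    In a quasi-static measurement of the kind described, the Seebeck coefficient is extracted from the slope of the thermovoltage against the slowly swept temperature difference, rather than from a single (ΔV, ΔT) pair. The sketch below is a hypothetical illustration on synthetic data (the nickel-like coefficient and noise level are invented, and this is not the authors' apparatus code):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic quasi-static sweep: the temperature difference drifts slowly
# through zero (kept under 1 K, as in the setup described) while the
# thermovoltage is sampled; all values are invented for illustration.
S_true = -18.5e-6                          # V/K, a roughly nickel-like value
delta_T = np.linspace(-0.8, 0.8, 81)       # K
delta_V = S_true * delta_T + rng.normal(scale=2e-7, size=delta_T.size)  # V

# Least-squares slope of V against dT recovers the Seebeck coefficient
slope, intercept = np.polyfit(delta_T, delta_V, 1)
print(f"S ≈ {slope * 1e6:.1f} µV/K")
```

    Fitting the full sweep averages out voltage noise and drift, which is one reason the quasi-static approach achieves the repeatability quoted in the abstract.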

  16. A Comparison of the Approaches of Generalizability Theory and Item Response Theory in Estimating the Reliability of Test Scores for Testlet-Composed Tests

    ERIC Educational Resources Information Center

    Lee, Guemin; Park, In-Yong

    2012-01-01

    Previous assessments of the reliability of test scores for testlet-composed tests have indicated that item-based estimation methods overestimate reliability. This study was designed to address issues related to the extent to which item-based estimation methods overestimate the reliability of test scores composed of testlets and to compare several…
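
    The overestimation described arises because item-based reliability estimates treat locally dependent items within a testlet as if they were independent. A small simulation (all parameters hypothetical, not from the study) makes the effect visible: Cronbach's alpha computed on individual items exceeds alpha computed on testlet sums whenever items share a passage effect.

```python
import numpy as np

rng = np.random.default_rng(1)
n_persons, n_testlets, items_per = 500, 4, 5

theta = rng.normal(size=(n_persons, 1))                        # person ability
passage = rng.normal(scale=0.8, size=(n_persons, n_testlets))  # shared testlet effect

# Item score = ability + its testlet's shared effect + independent item noise
scores = np.concatenate(
    [theta + passage[:, [t]] + rng.normal(size=(n_persons, items_per))
     for t in range(n_testlets)], axis=1)

def cronbach_alpha(x):
    k = x.shape[1]
    return k / (k - 1) * (1 - x.var(axis=0, ddof=1).sum() / x.sum(axis=1).var(ddof=1))

alpha_items = cronbach_alpha(scores)           # treats all 20 items as independent
testlet_sums = scores.reshape(n_persons, n_testlets, items_per).sum(axis=2)
alpha_testlets = cronbach_alpha(testlet_sums)  # treats each testlet as one unit

print(f"item-based alpha    = {alpha_items:.3f}")
print(f"testlet-based alpha = {alpha_testlets:.3f}")
```

    Scoring at the testlet level absorbs the shared passage variance into the unit of analysis, which is the intuition behind the testlet-aware estimation methods the study compares.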

  17. An Examination of Test-Retest, Alternate Form Reliability, and Generalizability Theory Study of the easyCBM Reading Assessments: Grade 1. Technical Report #1216

    ERIC Educational Resources Information Center

    Anderson, Daniel; Park, Bitnara Jasmine; Lai, Cheng-Fei; Alonzo, Julie; Tindal, Gerald

    2012-01-01

    This technical report is one in a series of five describing the reliability (test/retest and alternate form) and G-Theory/D-Study research on the easyCBM reading measures, grades 1-5. Data were gathered in the spring of 2011 from a convenience sample of students nested within classrooms at a medium-sized school district in the Pacific Northwest. Due…

  18. Linking Quality Assurance to Performance Improvement to Produce a High Reliability Organization

    SciTech Connect

    Silvey, Andrea B.; Warrick, Louise H.

    2008-05-01

    Three basic change management models are currently used in healthcare to produce and sustain quality improvement. We have presented the context to determine where any particular organization stands within these paradigms. We also have introduced a change-management tool used to assess, plan, and monitor leadership effort and commitment to quality improvement and culture change activities, tracked as 'momentum for change.' This 'momentum' is measured at eight discrete levels, from recognizing a performance gap to officially implementing changes intended to improve quality.

  19. Cracking Silent Codes: Critical Race Theory and Education Organizing

    ERIC Educational Resources Information Center

    Su, Celina

    2007-01-01

    Critical race theory (CRT) has moved beyond legal scholarship to critique the ways in which "colorblind" laws and policies perpetuate existing racial inequalities in education policy. While criticisms of CRT have focused on the pessimism and lack of remedies presented, CRT scholars have begun to address issues of praxis. Specifically, communities…

  20. Can the second order multireference perturbation theory be considered a reliable tool to study mixed-valence compounds?

    NASA Astrophysics Data System (ADS)

    Pastore, Mariachiara; Helal, Wissam; Evangelisti, Stefano; Leininger, Thierry; Malrieu, Jean-Paul; Maynau, Daniel; Angeli, Celestino; Cimiraglia, Renzo

    2008-05-01

    In this paper, the problem of the calculation of the electronic structure of mixed-valence compounds is addressed in the frame of multireference perturbation theory (MRPT). Using a simple mixed-valence compound (the 5,5' (4H,4H')-spirobi[cyclopenta[c]pyrrole] 2,2',6,6' tetrahydro cation), and the n-electron valence state perturbation theory (NEVPT2) and CASPT2 approaches, it is shown that the ground state (GS) energy curve presents an unphysical "well" for nuclear coordinates close to the symmetric case, where a maximum is expected. For NEVPT, the correct shape of the energy curve is retrieved by applying the MRPT at the (computationally expensive) third order. This behavior is rationalized using a simple model (the ionized GS of two weakly interacting identical systems, each neutral system being described by two electrons in two orbitals), showing that the unphysical well is due to the canonical orbital energies, which at the symmetric (delocalized) conformation lead to a sudden modification of the denominators in the perturbation expansion. In this model, the bias introduced in the second order correction to the energy is almost entirely removed going to the third order. With the results of the model in mind, one can predict that all MRPT methods in which the zero order Hamiltonian is based on canonical orbital energies are prone to present unreasonable energy profiles close to the symmetric situation. However, the model allows a strategy to be devised which can give a correct behavior even at the second order, by simply averaging the orbital energies of the two charge-localized electronic states. Such a strategy is adopted in a NEVPT2 scheme, obtaining a good agreement with the third order results based on the canonical orbital energies. The answer to the question reported in the title (is this theoretical approach a reliable tool for a correct description of these systems?) is therefore positive, but care must be exercised, either in defining the orbital energies…

  1. The contribution of organization theory to nursing health services research.

    PubMed

    Mick, Stephen S; Mark, Barbara A

    2005-01-01

    We review nursing and health services research on health care organizations over the period 1950 through 2004 to reveal the contribution of nursing to this field. Notwithstanding this rich tradition and the unique perspective of nursing researchers grounded in patient care production processes, the following gaps in nursing research remain: (1) the lack of theoretical frameworks about organizational factors relating to internal work processes; (2) the need for sophisticated methodologies to guide empirical investigations; (3) the difficulty in understanding how organizations adapt models for patient care delivery in response to market forces; (4) the paucity of attention to the impact of new technologies on the organization of patient care work processes. Given nurses' deep understanding of the inner workings of health care facilities, we hope to see an increasing number of research programs that tackle these deficiencies. PMID:16360704

  2. Reliability of equivalent sphere model in blood-forming organ dose estimation

    NASA Technical Reports Server (NTRS)

    Shinn, Judy L.; Wilson, John W.; Nealy, John E.

    1990-01-01

    The radiation dose equivalents to blood-forming organs (BFO's) of the astronauts at the Martian surface due to major solar flare events are calculated using the detailed body geometry of Langley and Billings. The solar flare spectra of the February 1956, November 1960, and August 1972 events are employed instead of the idealized Webber form. The detailed geometry results are compared with those based on the 5-cm sphere model, which was often used in the past to approximate BFO dose or dose equivalent. Larger discrepancies are found for the latter two events, possibly due to the lower numbers of highly penetrating protons. It is concluded that the 5-cm sphere model is not suitable for quantitative use in connection with future NASA deep-space, long-duration mission shield design studies.

  3. Reliability of equivalent sphere model in blood-forming organ dose estimation

    SciTech Connect

    Shinn, J.L.; Wilson, J.W.; Nealy, J.E.

    1990-04-01

    The radiation dose equivalents to blood-forming organs (BFO's) of the astronauts at the Martian surface due to major solar flare events are calculated using the detailed body geometry of Langley and Billings. The solar flare spectra of the February 1956, November 1960, and August 1972 events are employed instead of the idealized Webber form. The detailed geometry results are compared with those based on the 5-cm sphere model, which was often used in the past to approximate BFO dose or dose equivalent. Larger discrepancies are found for the latter two events, possibly due to the lower numbers of highly penetrating protons. It is concluded that the 5-cm sphere model is not suitable for quantitative use in connection with future NASA deep-space, long-duration mission shield design studies.

  4. [Phenomenological theory of the recuperative period of the living organism].

    PubMed

    Zaĭtsev, A A; Sazonov, S V

    1997-01-01

    A phenomenological nonlinear model describing the recovery of a living organism after strong loading is proposed. The model describes the restitution dynamics of the organism's functional state back to its initial state, including a supercompensation stage. In the simplest (one-component) case, the model is an overdamped Duffing oscillator. It is shown that the mutation phenomena may be described as a phase transition within the framework of the Landau-Khalatnikov approach. A generalized many-component nonlinear reconstruction model is also proposed. PMID:9172700
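    The one-component case named in the abstract, an overdamped Duffing oscillator, can be integrated directly. A minimal sketch (the coefficients and time step are hypothetical, chosen for illustration, not the authors' fitted parameters):

```python
import numpy as np

def relax(x0, a=1.0, b=1.0, dt=0.01, steps=2000):
    """Overdamped Duffing relaxation: dx/dt = -(a*x + b*x**3).

    x measures the deviation of the functional state from its baseline;
    the cubic term makes large deviations decay faster than the purely
    exponential tail of the linear term.
    """
    x = float(x0)
    traj = [x]
    for _ in range(steps):
        x += dt * (-(a * x + b * x ** 3))  # explicit Euler step
        traj.append(x)
    return np.array(traj)

traj = relax(x0=2.0)      # strong initial perturbation ("loading")
print(traj[0], traj[-1])  # deviation decays monotonically toward baseline 0
```

    A supercompensation stage (overshoot below baseline before settling) would require extending this one-component gradient flow, e.g. with a second component, as the generalized model in the abstract does.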

  5. Development of an Axiomatic Theory of Organization/Environment Interaction: A Theoretical and Empirical Analysis.

    ERIC Educational Resources Information Center

    Ganey, Rodney F.

    The goal of this paper was to develop a theory of organization/environment interaction by examining the impact of perceived environmental uncertainty on organizational processes and on organizational goal attainment. It examines theories from the organizational environment literature and derives corollaries that are empirically tested using a data…

  6. Applying Hofstede's Cross-Cultural Theory of Organizations to School Governance: A French Case Study.

    ERIC Educational Resources Information Center

    Fowler, Frances C.

    This paper applies Geert Hofstede's cross-cultural theory of organizational structure and behavior to school administration, examining the governance structure of the French public school system to determine how accurately it predicts the form of that educational organization. The first section of the paper presents Hofstede's theory and his…

  7. Economic and Political Theories of Organization: The Case of Human Rights INGOs.

    ERIC Educational Resources Information Center

    Blaser, Arthur W.

    This paper reviews research on international nongovernmental organizations dealing with human rights (INGOs), and interprets this research in light of the overlap of the fields of organizational theory (including group theory) and human rights. The purpose is to contribute toward a useful exchange between social scientists who seek to explain…

  8. A Theory of Electronic Propinquity: Mediated Communication in Organizations.

    ERIC Educational Resources Information Center

    Korzenny, Felipe

    This paper proposes a theoretical approach to mediated communication in organizations. It is argued that the man/machine interface in mediated human communication is better dealt with when a comprehensive theoretical approach is used than when separate communication devices are tested as they appear in the market, such as video-teleconferencing.…

  9. Increasing Reliability of Direct Observation Measurement Approaches in Emotional and/or Behavioral Disorders Research Using Generalizability Theory

    ERIC Educational Resources Information Center

    Gage, Nicholas A.; Prykanowski, Debra; Hirn, Regina

    2014-01-01

    Reliability of direct observation outcomes ensures the results are consistent, dependable, and trustworthy. Typically, reliability of direct observation measurement approaches is assessed using interobserver agreement (IOA), calculated, for example, as a percentage of agreement. However, IOA does not address intraobserver…

  10. Push-Pull Receptive Field Organization and Synaptic Depression: Mechanisms for Reliably Encoding Naturalistic Stimuli in V1.

    PubMed

    Kremkow, Jens; Perrinet, Laurent U; Monier, Cyril; Alonso, Jose-Manuel; Aertsen, Ad; Frégnac, Yves; Masson, Guillaume S

    2016-01-01

    Neurons in the primary visual cortex are known for responding vigorously but with high variability to classical stimuli such as drifting bars or gratings. By contrast, natural scenes are encoded more efficiently by sparse and temporally precise spiking responses. We used a conductance-based model of the visual system in higher mammals to investigate how two specific features of the thalamo-cortical pathway, namely push-pull receptive field organization and fast synaptic depression, can contribute to this contextual reshaping of V1 responses. By comparing cortical dynamics evoked respectively by natural vs. artificial stimuli in a comprehensive parametric space analysis, we demonstrate that the reliability and sparseness of the spiking responses during natural vision are not a mere consequence of the increased bandwidth in the sensory input spectrum. Rather, they result from the combined impacts of fast synaptic depression and push-pull inhibition, the latter acting for natural scenes as a form of "effective" feed-forward inhibition, as demonstrated in other sensory systems. Thus, the combination of feedforward-like inhibition with fast thalamo-cortical synaptic depression by simple cells receiving a direct structured input from thalamus composes a generic computational mechanism for generating a sparse and reliable encoding of natural sensory events. PMID:27242445

  11. Push-Pull Receptive Field Organization and Synaptic Depression: Mechanisms for Reliably Encoding Naturalistic Stimuli in V1

    PubMed Central

    Kremkow, Jens; Perrinet, Laurent U.; Monier, Cyril; Alonso, Jose-Manuel; Aertsen, Ad; Frégnac, Yves; Masson, Guillaume S.

    2016-01-01

    Neurons in the primary visual cortex are known for responding vigorously but with high variability to classical stimuli such as drifting bars or gratings. By contrast, natural scenes are encoded more efficiently by sparse and temporally precise spiking responses. We used a conductance-based model of the visual system in higher mammals to investigate how two specific features of the thalamo-cortical pathway, namely push-pull receptive field organization and fast synaptic depression, can contribute to this contextual reshaping of V1 responses. By comparing cortical dynamics evoked respectively by natural vs. artificial stimuli in a comprehensive parametric space analysis, we demonstrate that the reliability and sparseness of the spiking responses during natural vision are not a mere consequence of the increased bandwidth in the sensory input spectrum. Rather, they result from the combined impacts of fast synaptic depression and push-pull inhibition, the latter acting for natural scenes as a form of “effective” feed-forward inhibition, as demonstrated in other sensory systems. Thus, the combination of feedforward-like inhibition with fast thalamo-cortical synaptic depression by simple cells receiving a direct structured input from thalamus composes a generic computational mechanism for generating a sparse and reliable encoding of natural sensory events. PMID:27242445

  12. An Examination of Test-Retest, Alternate Form Reliability, and Generalizability Theory Study of the easyCBM Passage Reading Fluency Assessments: Grade 4. Technical Report #1219

    ERIC Educational Resources Information Center

    Park, Bitnara Jasmine; Anderson, Daniel; Alonzo, Julie; Lai, Cheng-Fei; Tindal, Gerald

    2012-01-01

    This technical report is one in a series of five describing the reliability (test/retest and alternate form) and G-Theory/D-Study research on the easyCBM reading measures, grades 1-5. Data were gathered in the spring of 2011 from a convenience sample of students nested within classrooms at a medium-sized school district in the Pacific Northwest.…

  13. An Examination of Test-Retest, Alternate Form Reliability, and Generalizability Theory Study of the easyCBM Reading Assessments: Grade 5. Technical Report #1220

    ERIC Educational Resources Information Center

    Lai, Cheng-Fei; Park, Bitnara Jasmine; Anderson, Daniel; Alonzo, Julie; Tindal, Gerald

    2012-01-01

    This technical report is one in a series of five describing the reliability (test/retest and alternate form) and G-Theory/D-Study research on the easyCBM reading measures, grades 1-5. Data were gathered in the spring of 2011 from a convenience sample of students nested within classrooms at a medium-sized school district in the Pacific Northwest.…

  14. An Examination of Test-Retest, Alternate Form Reliability, and Generalizability Theory Study of the easyCBM Reading Assessments: Grade 2. Technical Report #1217

    ERIC Educational Resources Information Center

    Anderson, Daniel; Lai, Cheng-Fei; Park, Bitnara Jasmine; Alonzo, Julie; Tindal, Gerald

    2012-01-01

    This technical report is one in a series of five describing the reliability (test/retest and alternate form) and G-Theory/D-Study research on the easyCBM reading measures, grades 1-5. Data were gathered in the spring of 2011 from a convenience sample of students nested within classrooms at a medium-sized school district in the Pacific Northwest. Due to…

  15. Reliable Prediction with Tuned Range-Separated Functionals of the Singlet-Triplet Gap in Organic Emitters for Thermally Activated Delayed Fluorescence.

    PubMed

    Sun, Haitao; Zhong, Cheng; Brédas, Jean-Luc

    2015-08-11

    The thermally activated delayed fluorescence (TADF) mechanism has recently attracted significant interest in the field of organic light-emitting diodes (OLEDs). TADF relies on the presence of a very small energy gap between the lowest singlet and triplet excited states. Here, we demonstrate that time-dependent density functional theory in the Tamm-Dancoff approximation can be very successful in calculations of the lowest singlet and triplet excitation energies and the corresponding singlet-triplet gap when using nonempirically tuned range-separated functionals. Such functionals provide very good estimates in a series of 17 molecules used in TADF-based OLED devices with mean absolute deviations of 0.15 eV for the vertical singlet excitation energies and 0.09 eV [0.07 eV] for the adiabatic [vertical] singlet-triplet energy gaps as well as low relative errors and high correlation coefficients compared to the corresponding experimental values. They significantly outperform conventional functionals, a feature which is rationalized on the basis of the amount of exact-exchange included and the delocalization error. The present work provides a reliable theoretical tool for the prediction and development of novel TADF-based materials with low singlet-triplet energetic splittings. PMID:26574466

  16. Human hair follicle organ culture: theory, application and perspectives.

    PubMed

    Langan, Ewan A; Philpott, Michael P; Kloepper, Jennifer E; Paus, Ralf

    2015-12-01

    For almost a quarter of a century, ex vivo studies of human scalp hair follicles (HFs) have permitted major advances in hair research, spanning diverse fields such as chronobiology, endocrinology, immunology, metabolism, mitochondrial biology, neurobiology, pharmacology, pigmentation and stem cell biology. Despite this, a comprehensive methodological guide to serum-free human HF organ culture (HFOC) that facilitates the selection and analysis of standard HF biological parameters and points out both research opportunities and pitfalls to newcomers to the field is still lacking. The current methods review aims to close an important gap in the literature and attempts to promote standardisation of human HFOC. We provide basic information outlining the establishment of HFOC through to detailed descriptions of the analysis of standard read-out parameters alongside practical examples. The guide closes by pointing out how serum-free HFOC can be utilised optimally to obtain previously inaccessible insights into human HF biology and pathology that are of interest to experimental dermatologists, geneticists, developmental biologists and (neuro-) endocrinologists alike and by highlighting novel applications of the model, including gene silencing and gene expression profiling of defined, laser capture-microdissected HF compartments. PMID:26284830

  17. Targeting helicase-dependent amplification products with an electrochemical genosensor for reliable and sensitive screening of genetically modified organisms.

    PubMed

    Moura-Melo, Suely; Miranda-Castro, Rebeca; de-Los-Santos-Álvarez, Noemí; Miranda-Ordieres, Arturo J; Dos Santos Junior, J Ribeiro; da Silva Fonseca, Rosana A; Lobo-Castañón, Maria Jesús

    2015-08-18

    Cultivation of genetically modified organisms (GMOs) and their use in food and feed is constantly expanding; thus, the question of informing consumers about their presence in food has proven of significant interest. The development of sensitive, rapid, robust, and reliable methods for the detection of GMOs is crucial for proper food labeling. In response, we have experimentally characterized the helicase-dependent isothermal amplification (HDA) and sequence-specific detection of a transgene from the Cauliflower Mosaic Virus 35S Promoter (CaMV35S), inserted into most transgenic plants. HDA is one of the simplest approaches for DNA amplification, emulating the bacterial replication machinery, and resembling PCR but under isothermal conditions. However, it usually suffers from a lack of selectivity, which is due to the accumulation of spurious amplification products. To improve the selectivity of HDA, which makes the detection of amplification products more reliable, we have developed an electrochemical platform targeting the central sequence of HDA copies of the transgene. A binary monolayer architecture is built onto a thin gold film where, upon the formation of perfect nucleic acid duplexes with the amplification products, these are enzyme-labeled and electrochemically transduced. The resulting combined system increases genosensor detectability up to 10^6-fold, allowing Yes/No detection of GMOs with a limit of detection of ∼30 copies of the CaMV35S genomic DNA. A set of general utility rules in the design of genosensors for detection of HDA amplicons, which may assist in the development of point-of-care tests, is also included. The method provides a versatile tool for detecting nucleic acids with extremely low abundance not only for food safety control but also in the diagnostics and environmental control areas. PMID:26198403

  18. A predictive theory of charge separation in organic photovoltaics interfaces

    NASA Astrophysics Data System (ADS)

    Troisi, Alessandro; Liu, Tao; Caruso, Domenico; Cheung, David L.; McMahon, David P.

    2012-09-01

    The key process in organic photovoltaic cells is the separation of an exciton close to the donor/acceptor interface into a free hole (in the donor) and a free electron (in the acceptor). In an efficient solar cell, the majority of absorbed photons generate such hole-electron pairs, but it is not clear why the charge separation process is so efficient in some blends (for example, in the blend formed by poly(3-hexylthiophene) (P3HT) and a C60 derivative (PCBM)) or how one can design better OPV materials. The electronic and geometric structure of the prototypical polymer:fullerene interface (P3HT:PCBM) is investigated theoretically using a combination of classical and quantum simulation methods. It is shown that the electronic structure of P3HT in contact with PCBM is significantly altered compared to bulk P3HT. Due to the additional free volume of the interface, P3HT chains close to PCBM are more disordered and, consequently, they are characterized by an increased band gap. Excitons and holes are therefore repelled by the interface. This provides a possible explanation of the low recombination efficiency and supports the direct formation of "quasi-free" charge-separated species at the interface. This idea is further explored here using a more general, system-independent model Hamiltonian. The long-range exciton dissociation rate is computed as a function of the exciton distance from the interface, and the average dissociation distance is evaluated by comparing this rate with the exciton migration rate in a kinetic model. The phenomenological model shows that the direct formation of quasi-free charges is extremely likely in a generic interface as well.
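    The kinetic competition described in the abstract (long-range dissociation vs. migration of the exciton) can be illustrated with a toy branching-ratio model. All rate constants and the exponential distance dependence below are hypothetical, chosen only to show the bookkeeping, not values from the paper:

```python
import numpy as np

def dissociation_probability(d, k0=1e12, beta=0.5, k_migration=1e11):
    """Toy branching ratio for an exciton at distance d (nm) from the interface.

    A distance-dependent dissociation rate k_diss(d) = k0 * exp(-beta*d)
    competes with a distance-independent migration rate; the exciton
    dissociates with probability k_diss / (k_diss + k_migration).
    """
    k_diss = k0 * np.exp(-beta * d)
    return k_diss / (k_diss + k_migration)

for d in (0.0, 2.0, 5.0, 10.0):
    print(d, dissociation_probability(d))  # probability falls off with distance
```

    Averaging such a profile over the exciton's distance distribution is what yields an average dissociation distance in a kinetic treatment of this kind.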

  19. Derivations and Comparisons of Three Groups of Self-Organization Theories for Magnetohydrodynamic Plasmas

    NASA Astrophysics Data System (ADS)

    Kondoh, Yoshiomi; Sato, Tetsuya

    1994-04-01

    A theoretical investigation on self-organization theories of dissipative MHD plasmas is presented to derive three groups of theories that lead to the same relaxed state of ∇ × B = λB, in order to find a more essential physical picture embedded in self-organization phenomena due to nonlinear and dissipative processes. Comparisons among all of the theories treated and derived here suggest that a theory standing upon spectrum spreadings and selective dissipations of eigenmodes for the dissipative operator -∇ × ηj, and leading to self-organized relaxed states of ∇ × ηj = αB/2 with the minimum dissipation rate, is the most agreeable to various results obtained by experiments and by 3-D MHD simulations reported so far.

  20. Towards a Theory of Variation in the Organization of the Word Reading System

    PubMed Central

    Rueckl, Jay G.

    2015-01-01

    The strategy underlying most computational models of word reading is to specify the organization of the reading system—its architecture and the processes and representations it employs—and to demonstrate that this organization would give rise to the behavior observed in word reading tasks. This approach fails to adequately address the variation in reading behavior observed across and within linguistic communities. Only computational models that incorporate learning can fully account for variation in organization. However, even extant learning models (e.g., the triangle model) must be extended if they are to fully account for variation in organization. The challenges associated with extending theories in this way are discussed. PMID:26997862

  1. A review of carrier thermoelectric-transport theory in organic semiconductors.

    PubMed

    Lu, Nianduan; Li, Ling; Liu, Ming

    2016-07-20

    Carrier thermoelectric-transport theory has recently attracted growing interest, and numerous thermoelectric-transport models have been proposed for organic semiconductors, driven by pressing issues involving energy production and the environment. The purpose of this review is to provide a theoretical description of the thermoelectric Seebeck effect in organic semiconductors. Special attention is devoted to the dependence of the Seebeck effect on carrier concentration, temperature, the polaron effect and the dipole effect, and to its relationship to hopping transport theory. Furthermore, various theoretical methods are used to discuss carrier thermoelectric transport. Finally, an outlook on the remaining challenges for future theoretical research is provided. PMID:27386952

  2. Application of fuzzy set and Dempster-Shafer theory to organic geochemistry interpretation

    NASA Technical Reports Server (NTRS)

    Kim, C. S.; Isaksen, G. H.

    1993-01-01

    An application of fuzzy sets and Dempster-Shafer Theory (DST) in modeling the interpretational process of organic geochemistry data for predicting the maturity levels of oil and source rock samples is presented. This was accomplished by (1) representing linguistic imprecision and the imprecision associated with experience by fuzzy set theory, (2) capturing the probabilistic nature of imperfect evidence by DST, and (3) combining multiple pieces of evidence by utilizing John Yen's generalized Dempster-Shafer Theory (GDST), which allows DST to deal with fuzzy information. The current prototype provides collective beliefs on the predicted levels of maturity by combining multiple pieces of evidence through GDST's rule of combination.
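    The evidence pooling described above builds on Dempster's rule of combination. A toy implementation of the classical crisp rule (not Yen's fuzzy generalization; the maturity labels and mass values are made up for illustration):

```python
def combine(m1, m2):
    """Dempster's rule: fuse two mass functions whose focal elements are frozensets.

    m(A) = sum over B∩C=A of m1(B)*m2(C), renormalized by 1-K, where K is
    the mass falling on conflicting (empty) intersections.
    """
    fused, conflict = {}, 0.0
    for b, p in m1.items():
        for c, q in m2.items():
            inter = b & c
            if inter:
                fused[inter] = fused.get(inter, 0.0) + p * q
            else:
                conflict += p * q
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    return {a: v / (1.0 - conflict) for a, v in fused.items()}

# Two pieces of evidence about thermal maturity (hypothetical labels/masses)
m1 = {frozenset({'mature'}): 0.6, frozenset({'mature', 'overmature'}): 0.4}
m2 = {frozenset({'mature'}): 0.7, frozenset({'overmature'}): 0.3}
fused = combine(m1, m2)
print(fused[frozenset({'mature'})])  # ~0.854
```

    Note how mass on the composite set {mature, overmature} lets the second body of evidence sharpen the belief rather than simply conflict with it.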

  3. Egalitarian and maximin theories of justice: directed donation of organs for transplant.

    PubMed

    Veatch, R M

    1998-08-01

    It is common to interpret Rawls's maximin theory of justice as egalitarian. Compared to utilitarian theories, this may be true. However, in special cases practices that distribute resources so as to benefit the worst off actually increase the inequality between the worst off and some who are better off. In these cases the Rawlsian maximin parts company with what is here called true egalitarianism. A policy question requiring a distinction between maximin and "true egalitarian" allocations has arisen in the arena of organ transplantation. This case is examined here as a venue for differentiating maximin and true egalitarian theories. Directed donation is the name given to donations of organs restricted to a particular social group. For example, the family of a member of the Ku Klux Klan donated his organs on the provision that they go only to members of the Caucasian race. While such donations appear to be discriminatory, if certain plausible assumptions are made, they satisfy the maximin criterion. They selectively advantage the recipient of the organs without harming anyone (assuming the organs would otherwise go unused). Moreover, everyone who is lower on the waiting list (who, thereby, could be considered worse off) is advantaged by moving up on the waiting list. This paper examines how maximin and more truly egalitarian theories handle this case arguing that, to the extent that directed donation is unethical, the best account of that conclusion is that an egalitarian principle of justice is to be preferred to the maximin. PMID:9892035

  4. Customer-organization relationships: development and test of a theory of extended identities.

    PubMed

    Bagozzi, Richard P; Bergami, Massimo; Marzocchi, Gian Luca; Morandin, Gabriele

    2012-01-01

    We develop a theory of personal, relational, and collective identities that links organizations and consumers. Four targets of identity are studied: small friendship groups of aficionados of Ducati motorcycles, virtual communities centered on Ducatis, the Ducati brand, and Ducati the company. The interplay amongst the identities is shown to order affective, cognitive, and evaluative reactions toward each target. Hypotheses are tested on a sample of 210 Ducati aficionados, and implications of these multiple, extended identities for organizations are examined. PMID:21766998

  5. Applications of the Conceptual Density Functional Theory Indices to Organic Chemistry Reactivity.

    PubMed

    Domingo, Luis R; Ríos-Gutiérrez, Mar; Pérez, Patricia

    2016-01-01

    Theoretical reactivity indices based on conceptual Density Functional Theory (DFT) have become a powerful tool for the semiquantitative study of organic reactivity. A large number of reactivity indices have been proposed in the literature. Herein, global quantities like the electronic chemical potential μ, the electrophilicity ω and the nucleophilicity N indices, and local condensed indices like the electrophilic P_k^+ and nucleophilic P_k^- Parr functions, as the most relevant indices for the study of organic reactivity, are discussed. PMID:27294896

  6. Examining Agency Theory in Training & Development: Understanding Self-Interest Behaviors in the Organization

    ERIC Educational Resources Information Center

    Azevedo, Ross E.; Akdere, Mesut

    2011-01-01

    Agency theory has been discussed widely in the business and management literature. However, to date there has been no investigation about its utility and implications for problems in training & development. Whereas organizations are still struggling to develop and implement effective training programs, there is little emphasis on the self-interest…

  7. How Youth Get Engaged: Grounded-Theory Research on Motivational Development in Organized Youth Programs

    ERIC Educational Resources Information Center

    Dawes, Nickki Pearce; Larson, Reed

    2011-01-01

    For youth to benefit from many of the developmental opportunities provided by organized programs, they need to not only attend but become psychologically engaged in program activities. This research was aimed at formulating empirically based grounded theory on the processes through which this engagement develops. Longitudinal interviews were…

  8. An Investigation of the Advance Organizer Theory as an Effective Teaching Model.

    ERIC Educational Resources Information Center

    Downing, Agnes

    This paper advocates for the improvement of presentational methods of teaching and expository learning, based on David Ausubel's theory of Meaningful Verbal Learning and its derivative, the Advance Organizer Model of Teaching. This approach to teaching enables teachers to convey large amounts of information as meaningfully and efficiently as…

  9. New type of time-series sediment trap for the reliable collection of inorganic and organic trace chemical substances

    NASA Astrophysics Data System (ADS)

    Kremling, K.; Lentz, U.; Zeitzschel, B.; Schulz-Bull, D. E.; Duinker, J. C.

    1996-12-01

    The new sediment trap has a 0.5 m2 aperture and a funnel slope of 34°, and is capable of collecting 21 samples at programmed intervals (1 min-1 year) during deployment in the deep ocean. The trap has been designed to yield reliable data on trace inorganic and organic components (such as trace elements, n-alkanes, PCBs, PAHs, and amino/fatty acids) in addition to the standard biogeochemical variables in the collected particles. Due to the exclusive use of synthetic materials (such as fiberglass, PVC, PTFE, or POM-Delrin®) and highly resistant metallic materials, contamination problems have been eliminated for these species. Blank values determined in several tests in the open ocean were as low as 1% or even less of the amounts present in trap material, even at low particulate loadings. Another major aim was to eliminate the loss of dissolved components from the sample cups into seawater. Microbial and chemical processes modify the collected particles, thereby mobilizing originally particulate species into solution; it is thus essential to avoid larger losses of dissolved species through diffusion into the surrounding seawater after collection. This was achieved by means of a specially designed sealing mechanism. Tests with tracer substances during field studies proved that losses of dissolved components from supernatants during one year of deployment are as low as 10%. Additionally, the relationship between the flow characteristics around the traps and their excursions from the vertical position in a bottom-tethered array was studied during a one-year deployment in the North Atlantic.

  10. Density-Functional Theory with Screened van der Waals Interactions for the Modeling of Hybrid Inorganic/Organic Systems

    NASA Astrophysics Data System (ADS)

    Ruiz, Victor G.; Liu, Wei; Zojer, Egbert; Scheffler, Matthias; Tkatchenko, Alexandre

    2012-02-01

    The electronic properties and the function of hybrid inorganic/organic systems (HIOS) are intimately linked to their geometry, with van der Waals (vdW) interactions playing an essential role for the latter. Here we show that the inclusion of the many-body collective response of the substrate electrons inside the inorganic bulk enables us to reliably predict the HIOS geometries and energies. Specifically, dispersion-corrected density-functional theory (the DFT+vdW approach) [PRL 102, 073005 (2009)] is combined with the Lifshitz-Zaremba-Kohn theory [PRB 13, 2270 (1976)] for the non-local Coulomb screening within the bulk. Our method (DFT+vdW^surf) includes both image-plane and interface polarization effects. We show that DFT+vdW^surf yields geometries in remarkable agreement (≈0.1 Å) with normal incidence x-ray standing wave measurements for the 3,4,9,10-perylene-tetracarboxylic acid dianhydride (C24H8O6, PTCDA) molecule on Cu(111), Ag(111), and Au(111). Similarly accurate results are obtained for xenon and benzene adsorbed on metal surfaces.

  11. Density-Functional Theory with Screened van der Waals Interactions for the Modeling of Hybrid Inorganic-Organic Systems

    NASA Astrophysics Data System (ADS)

    Ruiz, Victor G.; Liu, Wei; Zojer, Egbert; Scheffler, Matthias; Tkatchenko, Alexandre

    2012-04-01

    The electronic properties and the function of hybrid inorganic-organic systems (HIOS) are intimately linked to their interface geometry. Here we show that the inclusion of the many-body collective response of the substrate electrons inside the inorganic bulk enables us to reliably predict the HIOS geometries and energies. This is achieved by the combination of dispersion-corrected density-functional theory (the DFT+van der Waals approach) [Phys. Rev. Lett. 102, 073005 (2009)] with the Lifshitz-Zaremba-Kohn theory for the nonlocal Coulomb screening within the bulk. Our method yields geometries in remarkable agreement (≈0.1 Å) with normal incidence x-ray standing wave measurements for the 3,4,9,10-perylene-tetracarboxylic acid dianhydride (C24O6H8, PTCDA) molecule on Cu(111), Ag(111), and Au(111) surfaces. Similarly accurate results are obtained for xenon and benzene adsorbed on metal surfaces.
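
    Schematically, pairwise dispersion corrections of the DFT+vdW family add an energy term of the form

```latex
E_{\mathrm{vdW}} \;=\; -\frac{1}{2}\sum_{A \neq B} f_{\mathrm{damp}}(R_{AB})\,\frac{C_{6,AB}}{R_{AB}^{6}}
```

    where, in the surface-screened variant described here, the C6 coefficients and vdW radii of substrate atoms are renormalized by the Lifshitz-Zaremba-Kohn screening (a sketch of the functional form only, not the full working equations of the method).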

  12. Reliability and Validity Study of the Mobile Learning Adoption Scale Developed Based on the Diffusion of Innovations Theory

    ERIC Educational Resources Information Center

    Celik, Ismail; Sahin, Ismail; Aydin, Mustafa

    2014-01-01

    In this study, a mobile learning adoption scale (MLAS) was developed on the basis of Rogers' (2003) Diffusion of Innovations Theory. The scale that was developed consists of four sections. These sections are as follows: Stages in the innovation-decision process, Types of m-learning decision, Innovativeness level and attributes of m-learning.…

  14. Compatibility between Text Mining and Qualitative Research in the Perspectives of Grounded Theory, Content Analysis, and Reliability

    ERIC Educational Resources Information Center

    Yu, Chong Ho; Jannasch-Pennell, Angel; DiGangi, Samuel

    2011-01-01

    The objective of this article is to illustrate that text mining and qualitative research are epistemologically compatible. First, like many qualitative research approaches, such as grounded theory, text mining encourages open-mindedness and discourages preconceptions. Contrary to the popular belief that text mining is a linear and fully automated…

  15. Estimation of reliability and dynamic property for polymeric material at high strain rate using SHPB technique and probability theory

    NASA Astrophysics Data System (ADS)

    Kim, Dong Hyeok; Lee, Ouk Sub; Kim, Hong Min; Choi, Hye Bin

    2008-11-01

    A modified Split Hopkinson Pressure Bar (SHPB) technique with aluminum pressure bars and a pulse shaper was used to achieve a closer impedance match between the pressure bars and the specimen materials, namely thermally degraded POM (polyoxymethylene) and PP (polypropylene). More distinguishable experimental signals were thereby obtained, allowing a more accurate evaluation of the dynamic deformation behavior of the materials under high-strain-rate loading. The pulse shaping technique reduces non-equilibrium effects in the dynamic material response by modulating the incident wave over the short duration of the test, which increases the rise time of the incident pulse in the SHPB experiment. The Johnson-Cook model is applied as a constitutive equation for the dynamic stress-strain curve obtained from the SHPB experiment, and its applicability is verified using a probabilistic reliability estimation method. Two reliability methodologies, the first-order reliability method (FORM) and the second-order reliability method (SORM), are employed. The limit state function (LSF) includes the Johnson-Cook model and the applied stresses, and allows more statistical flexibility on the yield stress than previously published formulations. The failure probability estimated by the SORM is found to be more reliable than that estimated by the FORM, and the failure probability increases with the applied stress. Moreover, sensitivity analysis shows that the Johnson-Cook parameters A and n, together with the applied stress, affect the failure probability more severely than the other random variables.
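
    For orientation, the Johnson-Cook model has the standard multiplicative form σ = (A + Bε^n)(1 + C ln(ε̇/ε̇₀))(1 − T*^m). A minimal sketch in Python, with purely illustrative parameter values (not the fitted values from this study; the function name is our own):

```python
import math


def johnson_cook_stress(strain, strain_rate, T,
                        A, B, n, C, m,
                        ref_rate=1.0, T_room=293.0, T_melt=438.0):
    """Johnson-Cook flow stress.

    sigma = (A + B*eps^n) * (1 + C*ln(rate/ref_rate)) * (1 - T*^m),
    where T* = (T - T_room) / (T_melt - T_room) is the homologous
    temperature. Units follow those of A and B (e.g. MPa).
    """
    T_star = (T - T_room) / (T_melt - T_room)
    return ((A + B * strain ** n)
            * (1.0 + C * math.log(strain_rate / ref_rate))
            * (1.0 - T_star ** m))
```

    At zero plastic strain, the reference strain rate, and room temperature, the expression reduces to the quasi-static yield stress A; the strain-hardening exponent n and the amplitude A are exactly the parameters the sensitivity analysis above singles out.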

  16. [Business organization theory: its potential use in the organization of the operating room].

    PubMed

    Bartz, H-J

    2005-07-01

    The paradigm of patient care in the German health system is changing. The introduction of German Diagnosis Related Groups (G-DRGs), a diagnosis-related coding system, has made process-oriented thinking increasingly important. The treatment process is viewed and managed as a whole, from the admission to the discharge of the patient, and the interfaces between departments and sectors are diminished. A main objective of these measures is to render patient care more cost efficient. Within the hospital, the operating room (OR) is the most expensive factor, accounting for 25-50% of the costs of a surgical patient, and is also a bottleneck in surgical patient care. Therefore, controlling the perioperative treatment process is becoming increasingly important. Here, business organisation theory can be a very useful tool. Especially the concepts of process organisation and process management can be applied to hospitals. Process-oriented thinking uncovers and solves typical organisational problems. Competences, responsibilities and tasks are reorganised by process orientation, and the enterprise is gradually transformed into a process-oriented system. Process management includes objective-oriented controlling of the value chain of an enterprise with regard to quality, time, costs and customer satisfaction. The quality of the process is continuously improved using process-management techniques. The main advantage of process management is consistent customer orientation, which means being aware of the customer's needs at any time during the daily routine. The performance is therefore always directed towards current market requirements. This paper presents the basics of business organisation theory and points out its potential use in the organisation of the OR. PMID:16001317

  17. Organizers.

    ERIC Educational Resources Information Center

    Callison, Daniel

    2000-01-01

    Focuses on "organizers," tools or techniques that provide identification and classification along with possible relationships or connections among ideas, concepts, and issues. Discusses David Ausubel's research and ideas concerning advance organizers; the implications of Ausubel's theory to curriculum and teaching; "webbing," a specific…

  18. Highly stable amorphous silicon thin film transistors and integration approaches for reliable organic light emitting diode displays on clear plastic

    NASA Astrophysics Data System (ADS)

    Hekmatshoar, Bahman

    Hydrogenated amorphous silicon (a-Si:H) thin-film transistors (TFTs) are currently in widespread production for integration with liquid crystals as driver devices. Liquid crystal displays are driven in AC with very low duty cycles and therefore fairly insensitive to the TFT threshold voltage rise which is well-known in a-Si:H devices. Organic light-emitting diodes (OLEDs) are a future technology choice for flexible displays with several advantages over liquid crystals. In contrast to liquid crystal displays, however, OLEDs are driven in DC and thus far more demanding in terms of the TFT stability requirements. Therefore the conventional thinking has been that a-Si:H TFTs are too unstable for driving OLEDs and the more expensive poly-Si or alternative TFT technologies are required. This thesis defies the conventional thinking by demonstrating that the knowledge of the degradation mechanisms in a-Si:H TFTs may be used to enhance the drive current half-life of a-Si:H TFTs from lower than a month to over 1000 years by modifying the growth conditions of the channel and the gate dielectric. Such high lifetimes suggest that the improved a-Si:H TFTs may qualify for driving OLEDs in commercial products. Taking advantage of industry-standard growth techniques, the improved a-Si:H TFTs offer a low barrier for industry insertion, in stark contrast with alternative technologies which require new infrastructure development. Further support for the practical advantages of a-Si:H TFTs for driving OLEDs is provided by a universal lifetime comparison framework proposed in this work, showing that the lifetime of the improved a-Si:H TFTs is well above those of other TFT technologies reported in the literature. Manufacturing of electronic devices on flexible plastic substrates is highly desirable for reducing the weight of the finished products as well as increasing their ruggedness. In addition, the flexibility of the substrate allows manufacturing bendable, foldable or rollable

  19. Self-organization theories and environmental management: The case of South Moresby, Canada

    NASA Astrophysics Data System (ADS)

    Grzybowski, Alex G. S.; Slocombe, D. Scott

    1988-07-01

    This article presents a new approach to the analysis and management of large-scale societal problems with complex ecological, economic, and social dimensions. The approach is based on the theory of self-organizing systems: complex, open, far-from-equilibrium systems with nonlinear dynamics. A brief overview and comparison of different self-organization theories (synergetics, self-organization theory, hypercycles, and autopoiesis) is presented in order to isolate the key characteristics of such systems. The approach is used to develop an analysis of the land-use controversy in the South Moresby area of the Queen Charlotte Islands, British Columbia, Canada. Critical variables are identified for each subsystem, classified by spatial and temporal scale, and discussed in terms of information content and internal/external origin. Eradication of sea otters, introduction of black-tailed deer, impacts of large-scale clearcut logging, sustainability of the coastal forest industry, and changing relations between native peoples and governments are discussed in detail to illustrate the system dynamics of the South Moresby “sociobiophysical” system. Finally, implications of the self-organizing sociobiophysical system view for regional analysis and management are identified.

  20. Assessment of the ΔSCF density functional theory approach for electronic excitations in organic dyes

    SciTech Connect

    Kowalczyk, T.; Yost, S. R.; Van Voorhis, T.

    2010-01-01

    This paper assesses the accuracy of the ΔSCF method for computing low-lying HOMO→LUMO transitions in organic dye molecules. For a test set of vertical excitation energies of 16 chromophores, surprisingly similar accuracy is observed for time-dependent density functional theory and for ΔSCF density functional theory. In light of this performance, we reconsider the ad hoc ΔSCF prescription and demonstrate that it formally obtains the exact stationary density within the adiabatic approximation, partially justifying its use. The relative merits and future prospects of ΔSCF for simulating individual excited states are discussed.
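
    In outline, the ΔSCF excitation energy is the difference between two separately converged SCF solutions, the ground state and a non-aufbau state with the HOMO→LUMO occupation (schematic form of the prescription discussed above):

```latex
\omega_{\Delta\mathrm{SCF}}
  \;=\; E_{\mathrm{SCF}}\!\left[\cdots(\mathrm{HOMO})^{1}(\mathrm{LUMO})^{1}\right]
  \;-\; E_{\mathrm{SCF}}\!\left[\cdots(\mathrm{HOMO})^{2}\right]
```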

  1. Integrating self-organization theory into an advanced course on morphogenesis at Moscow State University.

    PubMed

    Beloussov, Lev V

    2003-01-01

    A lecture course on morphogenesis for fourth-year Moscow State University Specialist Diploma students specializing in embryology is described. The main goal of the course is to give the students an extensive theoretical background based on the tenets of the modern theory of Self-Organization and to show them how important this theory is for the proper understanding of developmental events. The corresponding mathematics are bound as tightly as possible to the actual morphogenetic processes. All of the lectures take the format of an active dialogue between the students and a tutor. PMID:12705667

  2. Content-oriented Approach to Organization of Theories and Its Utilization

    NASA Astrophysics Data System (ADS)

    Hayashi, Yusuke; Bourdeau, Jacqueline; Mizoguchi, Riichiro

    In spite of the fact that the relation between theory and practice is a foundation of scientific and technological development, the gap between theory and practice has widened in recent years. This gap carries a risk of distrust of science and technology. Ontological engineering, as content-oriented research, is expected to contribute to the resolution of the gap. This paper presents the feasibility of organizing theoretical knowledge through ontological engineering, and of new-generation intelligent systems based on it, through an application in the area of learning/instruction support. This area also suffers from the gap between theory and practice, and its resolution is strongly required. We previously proposed the OMNIBUS ontology, a comprehensive ontology that covers different learning/instructional theories and paradigms, and SMARTIES, a theory-aware and standard-compliant authoring system for making learning/instructional scenarios based on the OMNIBUS ontology. We believe theory-awareness and standard-compliance bridge the gap between theory and practice because they link theories to the practical use of standard technologies and enable practitioners to easily enjoy theoretical support while using standard technologies in practice. To achieve this, the following goals are set: computers (1) understand a variety of learning/instructional theories based on their organization, (2) use that understanding to help authors make learning/instructional scenarios, and (3) make such theoretically sound scenarios interoperable within the framework of standard technologies. This paper suggests an ontological engineering solution to the achievement of these three goals. Although the evaluation is far from complete in terms of practical use, we believe that the results of this study address high-level technical challenges from the viewpoint of the current state of the art in the research area.

  3. Assessing governance theory and practice in health-care organizations: a survey of UK hospices.

    PubMed

    Chambers, Naomi; Benson, Lawrence; Boyd, Alan; Girling, Jeff

    2012-05-01

    This paper sets out a theoretical framework for analyzing board governance, and describes an empirical study of corporate governance practices in a subset of non-profit organizations (hospices in the UK). It examines how practices in hospice governance compare with what is known about effective board working. We found that key strengths of hospice boards included a strong focus on the mission and the finances of the organizations, and common weaknesses included a lack of involvement in strategic matters and a lack of confidence, and some nervousness about challenging the organization on the quality of clinical care. Finally, the paper offers suggestions for theoretical development particularly in relation to board governance in non-profit organizations. It develops an engagement theory for boards which comprises a triadic proposition of high challenge, high support and strong grip. PMID:22673698

  4. The Advancement of Family Therapy Theory Based on the Science of Self-Organizing Complex Systems.

    NASA Astrophysics Data System (ADS)

    Ramsey-Kemper, Valerie Ann

    1995-01-01

    Problem. The purpose of this study was to review the literature which presents the latest advancements in the field of family therapy theory. Since such advancement has relied on the scientific developments in the study of autopoietic self-organizing complex systems, then the review began with an historical overview of the development of these natural scientific concepts. The study then examined how the latest scientific concepts have been integrated with family therapy practice. The document is built on the theory that individuals are living, complex, self-organizing, autopoietic systems. When individual systems interact with other individual systems (such as in family interaction, or in interaction between therapist and client), then a third system emerges, which is the relationship. It is through interaction in the relationship that transformation of an individual system can occur. Method. The historical antecedents of the field of family therapy were outlined. It was demonstrated, via literature review, that the field of family therapy has traditionally paralleled developments in the hard sciences. Further, it was demonstrated via literature review that the newest understandings of the development of individuals, family systems, and therapeutic systems also parallel recent natural science developments, namely those developments based on the science of self-organizing complex systems. Outcome. The results of the study are twofold. First, the study articulates an expanded theory of the therapist, individual, and family as autopoietic self-organizing complex systems. Second, the study provides an expanded hypothesis which concerns recommendations for future research which will further advance the latest theories of family therapy. More precisely, the expanded hypothesis suggests that qualitative research, rather than quantitative research, is the method of choice for studying the effectiveness of phenomenological therapy.

  5. Insights into the organization of biochemical regulatory networks using graph theory analyses.

    PubMed

    Ma'ayan, Avi

    2009-02-27

    Graph theory has been a valuable mathematical modeling tool to gain insights into the topological organization of biochemical networks. There are two types of insights that may be obtained by graph theory analyses. The first provides an overview of the global organization of biochemical networks; the second uses prior knowledge to place results from multivariate experiments, such as microarray data sets, in the context of known pathways and networks to infer regulation. Using graph analyses, biochemical networks are found to be scale-free and small-world, indicating that these networks contain hubs, which are proteins that interact with many other molecules. These hubs may interact with many different types of proteins at the same time and location or at different times and locations, resulting in diverse biological responses. Groups of components in networks are organized in recurring patterns termed network motifs such as feedback and feed-forward loops. Graph analysis revealed that negative feedback loops are less common and are present mostly in proximity to the membrane, whereas positive feedback loops are highly nested in an architecture that promotes dynamical stability. Cell signaling networks have multiple pathways from some input receptors and few from others. Such topology is reminiscent of a classification system. Signaling networks display a bow-tie structure indicative of funneling information from extracellular signals and then dispatching information from a few specific central intracellular signaling nexuses. These insights show that graph theory is a valuable tool for gaining an understanding of global regulatory features of biochemical networks. PMID:18940806
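
    As a concrete illustration of the motif analysis mentioned above: a feed-forward loop is a triple of directed edges a→b, b→c, a→c. A minimal, self-contained sketch of counting such motifs (our own illustrative code, not from the paper):

```python
def count_feed_forward_loops(edges):
    """Count feed-forward loop motifs (a->b, b->c, a->c) in a directed graph.

    `edges` is an iterable of (source, target) pairs.
    """
    adj = {}
    for a, b in edges:
        adj.setdefault(a, set()).add(b)
    count = 0
    for a, targets in adj.items():
        for b in targets:
            for c in adj.get(b, set()):
                # a->b and b->c exist; the motif closes if a->c also exists.
                if c != a and c in targets:
                    count += 1
    return count
```

    Note that a simple feedback cycle (a→b, b→a or a→b, b→c, c→a) contributes nothing here, which is exactly the distinction between feedback and feed-forward loops drawn in the abstract.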

  6. Artificial organisms as tools for the development of psychological theory: Tolman's lesson.

    PubMed

    Miglino, Orazio; Gigliotta, Onofrio; Cardaci, Maurizio; Ponticorvo, Michela

    2007-12-01

    In the 1930s and 1940s, Edward Tolman developed a psychological theory of spatial orientation in rats and humans. He expressed his theory as an automaton (the "schematic sowbug") or what today we would call an "artificial organism." With the technology of the day, he could not implement his model. Nonetheless, he used it to develop empirical predictions, which he then tested with animals in the laboratory. This way of proceeding was in line with scientific practice dating back to Galileo. The way psychologists use artificial organisms in their work today breaks with this tradition. Modern "artificial organisms" are constructed a posteriori, working from experimental or ethological observations. As a result, researchers can use them to confirm a theoretical model or to simulate its operation, but they make no contribution to the actual building of models. In this paper, we try to return to Tolman's original strategy: implementing his theory of "vicarious trial and error" in a simulated robot, forecasting the robot's behavior, and conducting experiments that verify or falsify these predictions. PMID:17665237

  7. Universal carrier thermoelectric-transport model based on percolation theory in organic semiconductors

    NASA Astrophysics Data System (ADS)

    Lu, Nianduan; Li, Ling; Liu, Ming

    2015-05-01

    Recent measurements conducted over a large range of temperature and carrier density have found that the Seebeck coefficient exhibits an approaching disorder-free transport feature in high-mobility conjugated polymers [D. Venkateshvaran et al., Nature 515, 384 (2014), 10.1038/nature13854]. Current Seebeck coefficient models have difficulty interpreting this near-disorder-free charge transport. We present a general analytical model of the Seebeck effect in organic semiconductors based on hopping transport and percolation theory. The proposed model explains the Seebeck behavior both of polymers approaching disorder-free transport and of organic semiconductors with general disorder. The simulated results imply that the Seebeck coefficient in organic semiconductors shifts from temperature dependence to temperature independence as the energetic disorder decreases.

  8. Validation and test-retest reliability of a health measure, health as ability of acting, based on the welfare theory of health.

    PubMed

    Snellman, Ingrid; Jonsson, Bosse; Wikblad, Karin

    2012-03-01

    The aim of this study was to conduct a validation and assess the test-retest reliability of a health questionnaire based on Nordenfelt's Welfare Theory of Health (WTH). The study used the questionnaire on health together with the Short Form 12-Item Health Survey (SF-12), and 490 pupils at colleges for adult education participated. The results of the study are in accordance with Nordenfelt's WTH. Three hypotheses were stated, and the first was confirmed: people who were satisfied with life reported higher levels of both mental and physical health, as measured with the SF-12, than those who were dissatisfied with life. The second hypothesis was partially confirmed: people with high education were more often satisfied with life than those with low education, but they were not healthier. The third hypothesis, that women are unhealthy more often than men, was not confirmed. The questionnaire on health showed acceptable stability. PMID:21930655
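
    Test-retest stability of a questionnaire is often summarized as a correlation between scores from the two administrations. A minimal Pearson r sketch in pure Python (illustrative only; the authors may have used a different stability coefficient):

```python
def pearson_r(x, y):
    """Pearson correlation between two equal-length score lists.

    Assumes neither list is constant (otherwise the denominator is zero).
    """
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5
```

    Applied to test and retest totals per respondent, values near 1.0 indicate the "acceptable stability" reported above; for ordinal item-level data, rank-based or intraclass coefficients are common alternatives.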

  9. Human reliability analysis

    SciTech Connect

    Dougherty, E.M.; Fragola, J.R.

    1988-01-01

    The authors present a treatment of human reliability analysis incorporating an introduction to probabilistic risk assessment for nuclear power generating stations, organized according to the framework established for general systems theory. The treatment draws upon reliability analysis, psychology, human factors engineering, and statistics, integrating elements of these fields within a systems framework. It provides a history of human reliability analysis and includes examples of the application of the systems approach.

  10. The Mosaic Theory Revisited: Common Molecular Mechanisms Coordinating Diverse Organ and Cellular Events in Hypertension

    PubMed Central

    Harrison, David G.

    2012-01-01

    Over 60 years ago, Dr. Irvine Page proposed the Mosaic Theory of hypertension, which states that many factors, including genetics, environment, adaptive, neural, mechanical and hormonal perturbations interdigitate to raise blood pressure. In the past two decades, it has become clear that common molecular and cellular events in various organs underlie many features of the Mosaic Theory. Two of these are the production of reactive oxygen species (ROS) and inflammation. These factors increase neuronal firing in specific brain centers, increase sympathetic outflow, alter vascular tone and morphology and promote sodium retention in the kidney. Moreover, factors such as genetics and environment contribute to oxidant generation and inflammation. Other common cellular signals, including calcium signaling and endoplasmic reticulum stress are similarly perturbed in different cells in hypertension and contribute to components of Dr. Page’s theory. Thus, Dr. Page’s Mosaic Theory formed a framework for future studies of molecular and cellular signals in the context of hypertension, and has greatly aided our understanding of this complex disease. PMID:23321405

  11. Using organization theory to understand the determinants of effective implementation of worksite health promotion programs.

    PubMed

    Weiner, Bryan J; Lewis, Megan A; Linnan, Laura A

    2009-04-01

    The field of worksite health promotion has moved toward the development and testing of comprehensive programs that target health behaviors with interventions operating at multiple levels of influence. Yet, observational and process evaluation studies indicate that such programs are challenging for worksites to implement effectively. Research has identified several organizational factors that promote or inhibit effective implementation of comprehensive worksite health promotion programs. However, no integrated theory of implementation has emerged from this research. This article describes a theory of the organizational determinants of effective implementation of comprehensive worksite health promotion programs. The model is adapted from theory and research on the implementation of complex innovations in manufacturing, education and health care settings. The article uses the Working Well Trial to illustrate the model's theoretical constructs. Although the article focuses on comprehensive worksite health promotion programs, the conceptual model may also apply to other types of complex health promotion programs. The result is an organization-level theory of the determinants of effective implementation of worksite health promotion programs. PMID:18469319

  12. Are the Somatic Mutation and Tissue Organization Field Theories of Carcinogenesis Incompatible?

    PubMed Central

    Rosenfeld, Simon

    2013-01-01

    Two drastically different approaches to understanding the forces driving carcinogenesis have crystallized through years of research. These are the somatic mutation theory (SMT) and the tissue organization field theory (TOFT). The essence of SMT is that cancer is derived from a single somatic cell that has successively accumulated multiple DNA mutations, and that those mutations occur on genes which control cell proliferation and cell cycle. Thus, according to SMT, neoplastic lesions are the results of DNA-level events. Conversely, according to TOFT, carcinogenesis is primarily a problem of tissue organization: carcinogenic agents destroy the normal tissue architecture thus disrupting cell-to-cell signaling and compromising genomic integrity. Hence, in TOFT the DNA mutations are the effect, and not the cause, of the tissue-level events. Cardinal importance of successful resolution of the TOFT versus SMT controversy dwells in the fact that, according to SMT, cancer is a unidirectional and mostly irreversible disease; whereas, according to TOFT, it is curable and reversible. In this paper, our goal is to outline a plausible scenario in which TOFT and SMT can be reconciled using the framework and concepts of the self-organized criticality (SOC), the principle proven to be extremely fruitful in a wide range of disciplines pertaining to natural phenomena, to biological communities, to large-scale social developments, to technological networks, and to many other subjects of research. PMID:24324325

  13. Excited state and charge dynamics of hybrid organic/inorganic heterojunctions. I. Theory

    NASA Astrophysics Data System (ADS)

    Renshaw, C. Kyle; Forrest, Stephen R.

    2014-07-01

    The different cohesive forces that bond organic (i.e. excitonic) and inorganic semiconductors lead to widely disparate dielectric constants, charge mobilities, and other fundamental optoelectronic properties that make junctions between these materials interesting for numerous practical applications. Yet, there are no detailed theories addressing charge and energy transport across interfaces between these hybrid systems. Here, we develop a comprehensive physical model describing charge transport and photocurrent generation based on first-principles charge and excited state dynamics at the organic/inorganic heterojunction. We consider interfaces that are trap-free, as well as those with an exponential distribution of trap states. We find that the hybrid charge-transfer state resulting from photon absorption near the junction that subsequently migrates to the heterointerface is often unstable at room temperature, leading to its rapid dissociation into free charges that are collected at the device contacts. In the companion Paper II [A. Panda et al., Phys. Rev. B 90, 045303 (2014), 10.1103/PhysRevB.90.045303], we apply our theories to understanding the optical and electronic properties of archetype organic/inorganic heterojunction diodes. Our analysis provides insights for developing high performance optoelectronic devices whose properties are otherwise inaccessible to either conventional excitonic or inorganic semiconductor junctions.

  14. Are the somatic mutation and tissue organization field theories of carcinogenesis incompatible?

    PubMed

    Rosenfeld, Simon

    2013-01-01

    Two drastically different approaches to understanding the forces driving carcinogenesis have crystallized through years of research. These are the somatic mutation theory (SMT) and the tissue organization field theory (TOFT). The essence of SMT is that cancer is derived from a single somatic cell that has successively accumulated multiple DNA mutations, and that those mutations occur in genes that control cell proliferation and the cell cycle. Thus, according to SMT, neoplastic lesions are the result of DNA-level events. Conversely, according to TOFT, carcinogenesis is primarily a problem of tissue organization: carcinogenic agents destroy the normal tissue architecture, thus disrupting cell-to-cell signaling and compromising genomic integrity. Hence, in TOFT the DNA mutations are the effect, and not the cause, of the tissue-level events. The cardinal importance of a successful resolution of the TOFT versus SMT controversy lies in the fact that, according to SMT, cancer is a unidirectional and mostly irreversible disease, whereas, according to TOFT, it is curable and reversible. In this paper, our goal is to outline a plausible scenario in which TOFT and SMT can be reconciled using the framework and concepts of self-organized criticality (SOC), a principle proven to be extremely fruitful in a wide range of disciplines pertaining to natural phenomena, biological communities, large-scale social developments, technological networks, and many other subjects of research. PMID:24324325

  15. African American Organ Donor Registration: A Mixed Methods Design using the Theory of Planned Behavior

    PubMed Central

    DuBay, Derek A.; Ivankova, Nataliya; Herby, Ivan; Wynn, Theresa A.; Kohler, Connie; Berry, Beverly; Foushee, Herman; Carson, April; Redden, David T.; Holt, Cheryl; Siminoff, Laura; Fouad, Mona; Martin, Michelle Y.

    2015-01-01

    Context A large racial disparity exists in organ donation. Objective The purpose of this study was to identify factors associated with becoming a registered organ donor among African Americans in Alabama. Methods The study utilized a concurrent mixed methods design guided by the Theory of Planned Behavior to analyze African Americans' decisions to become registered organ donors using both qualitative (focus groups) and quantitative (survey) methods. Results The sample consisted of 22 registered organ donors (ROD) and 65 non-registered participants (NRP) from six focus groups completed in urban (n=3) and rural (n=3) areas. Participants emphasized the importance of the autonomy to make one's own organ donation decision and have this decision honored posthumously. One novel barrier to becoming a ROD was the perception that organs from African Americans were often unusable due to the high prevalence of chronic medical conditions such as diabetes and hypertension. Another novel theme discussed as an advantage to becoming a ROD was the subsequent motivation to take responsibility for one's health. Family and friends were the most common groups of persons identified as approving and disapproving of the decision to become a ROD. The most common facilitator to becoming a ROD was information, while fear and the lack of information were the most common barriers. In contrast, religious beliefs, mistrust, and social justice themes were infrequently referenced as barriers to becoming a ROD. Discussion Findings from this study may be useful for prioritizing organ donation community-based educational interventions in campaigns to increase donor registration. PMID:25193729

  16. Demonstration for novel self-organization theory by three-dimensional magnetohydrodynamic simulation

    NASA Astrophysics Data System (ADS)

    Kondoh, Yoshiomi; Hosaka, Yasuo; Liang, Jia-Ling

    1993-03-01

    It is demonstrated by three-dimensional simulations for resistive magnetohydrodynamic (MHD) plasmas with both 'spatially nonuniform resistivity eta' and 'uniform eta' that the attractor of the dissipative structure in resistive MHD plasmas is given by del x (eta j) = (alpha/2)B, which is derived from a self-organization theory based on the minimum dissipation rate profile. It is shown by the simulations that the attractor reduces to del x B = (lambda)B in the special case with uniform eta and no pressure gradient.

  17. Enhanced Breakdown Reliability and Spatial Uniformity of Atomic Layer Deposited High-k Gate Dielectrics on Graphene via Organic Seeding Layers

    NASA Astrophysics Data System (ADS)

    Sangwan, Vinod; Jariwala, Deep; Filippone, Stephen; Karmel, Hunter; Johns, James; Alaboson, Justice; Marks, Tobin; Lauhon, Lincoln; Hersam, Mark

    2013-03-01

    Ultra-thin high-κ top-gate dielectrics are essential for high-speed graphene-based nanoelectronic circuits. Motivated by the need for high reliability and spatial uniformity, we report here the first statistical analysis of the breakdown characteristics of dielectrics grown on graphene. Based on these measurements, a rational approach is devised that simultaneously optimizes the gate capacitance and the key parameters of large-area uniformity and dielectric strength. In particular, vertically heterogeneous oxide stacks grown via atomic-layer deposition (ALD) seeded by a molecularly thin perylene-3,4,9,10-tetracarboxylic dianhydride (PTCDA) organic monolayer result in improved reliability (Weibull shape parameter β > 25) compared to the control dielectric directly grown on graphene without PTCDA (β < 1). The optimized sample also showed a large breakdown strength (Weibull scale parameter, EBD > 7 MV/cm) that is comparable to that of the control dielectric grown on Si substrates.
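    The Weibull shape (β) and scale (breakdown strength) parameters quoted above are typically extracted by fitting a set of measured breakdown fields to a Weibull distribution. A minimal sketch of one common estimator, median-rank regression, follows; the estimator choice and the sample values are illustrative assumptions, not taken from the paper:

```python
import math

def weibull_fit(samples):
    """Estimate the Weibull shape (beta) and scale (eta) by median-rank
    regression: linearize F(x) = 1 - exp(-(x/eta)**beta) as
    ln(-ln(1 - F)) = beta*ln(x) - beta*ln(eta) and fit a line by least squares."""
    xs = sorted(samples)
    n = len(xs)
    # Benard's median-rank estimate of the cumulative failure probability
    pts = [(math.log(x), math.log(-math.log(1.0 - (i - 0.3) / (n + 0.4))))
           for i, x in enumerate(xs, start=1)]
    mx = sum(x for x, _ in pts) / n
    my = sum(y for _, y in pts) / n
    sxx = sum((x - mx) ** 2 for x, _ in pts)
    sxy = sum((x - mx) * (y - my) for x, y in pts)
    beta = sxy / sxx                # slope of the Weibull plot
    eta = math.exp(mx - my / beta)  # scale recovered from the intercept
    return beta, eta

# Illustrative breakdown fields in MV/cm (not data from the study)
fields = [6.8, 7.0, 7.1, 7.2, 7.3, 7.4, 7.5]
beta, eta = weibull_fit(fields)
```

A large β (narrow spread of breakdown fields) corresponds to the improved uniformity reported for the PTCDA-seeded stacks.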

  18. Reliability of fluid systems

    NASA Astrophysics Data System (ADS)

    Kopáček, Jaroslav; Fojtášek, Kamil; Dvořák, Lukáš

    2016-03-01

    This paper focuses on the importance of assessing reliability, especially in complex fluid systems for demanding production technologies. The initial criterion for assessing reliability is the failure of an object (element), which is treated as a random variable whose data (values) can be processed using the mathematical methods of probability theory and statistics. The basic indicators of reliability are defined, along with their application in calculations for serial, parallel, and backed-up systems. For illustration, calculation examples of reliability indicators are given for various elements of the system and for a selected pneumatic circuit.
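    The serial and parallel system calculations mentioned above follow directly from elementary probability. A minimal sketch, with illustrative element reliabilities rather than values from the paper:

```python
def series_reliability(reliabilities):
    """Series system: every element must work, so R = R1 * R2 * ... * Rn."""
    r = 1.0
    for ri in reliabilities:
        r *= ri
    return r

def parallel_reliability(reliabilities):
    """Parallel (redundant) system: at least one element must work,
    so R = 1 - (1 - R1)(1 - R2)...(1 - Rn)."""
    q = 1.0
    for ri in reliabilities:
        q *= 1.0 - ri
    return 1.0 - q

# Illustrative element reliabilities for a small pneumatic circuit
valve, cylinder, hose = 0.95, 0.99, 0.90
serial_system = series_reliability([valve, cylinder, hose])
redundant_valves = parallel_reliability([valve, valve])  # two valves in parallel
```

Note that a series system is always less reliable than its weakest element, while redundancy raises reliability above that of any single element, which is why backed-up systems are treated separately in such analyses.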

  19. In search of a reliable technique for the determination of the biological stability of the organic matter in the mechanical-biological treated waste.

    PubMed

    Barrena, Raquel; d'Imporzano, Giuliana; Ponsá, Sergio; Gea, Teresa; Artola, Adriana; Vázquez, Felícitas; Sánchez, Antoni; Adani, Fabrizio

    2009-03-15

    The biological stability determines the extent to which readily biodegradable organic matter has decomposed. In this work, a large-scale evaluation of indices suitable for measuring the biological stability of the organic matter content of solid waste samples has been carried out. Samples from different stages of a mechanical-biological treatment (MBT) plant treating municipal solid waste (MSW) were selected as examples of different stages of organic matter stability in waste biological treatment. Aerobic indices based on respiration techniques properly reflected the process of organic matter biodegradation. Static and dynamic respirometry showed similar values in terms of aerobic biological activity (expressed as oxygen uptake rate, OUR), whereas cumulative oxygen consumption was a reliable method to express the biological stability of organic matter in solid samples. Methods based on OUR and cumulative oxygen consumption were positively correlated. Anaerobic methods based on biogas production (BP) tests also reflected the degree of biological stability well, although significant differences were found between solid and liquid BP assays. A significant correlation was found between cumulative oxygen consumption and ultimate biogas production. The results obtained in this study can be a basis for the quantitative measurement of stabilization efficiency in waste treatment plants, including MBT plants, anaerobic digestion of MSW, and composting plants. PMID:18606494

  20. Benchmarking DFT and semi-empirical methods for a reliable and cost-efficient computational screening of benzofulvene derivatives as donor materials for small-molecule organic solar cells.

    PubMed

    Tortorella, Sara; Talamo, Maurizio Mastropasqua; Cardone, Antonio; Pastore, Mariachiara; De Angelis, Filippo

    2016-02-24

    A systematic computational investigation on the optical properties of a group of novel benzofulvene derivatives (Martinelli 2014 Org. Lett. 16 3424-7), proposed as possible donor materials in small molecule organic photovoltaic (smOPV) devices, is presented. A benchmark evaluation against experimental results on the accuracy of different exchange and correlation functionals and semi-empirical methods in predicting both reliable ground state equilibrium geometries and electronic absorption spectra is carried out. The benchmark of the geometry optimization level indicated that the best agreement with x-ray data is achieved by using the B3LYP functional. Concerning the optical gap prediction, we found that, among the employed functionals, MPW1K provides the most accurate excitation energies over the entire set of benzofulvenes. Similarly reliable results were also obtained for range-separated hybrid functionals (CAM-B3LYP and wB97XD) and for global hybrid methods incorporating a large amount of non-local exchange (M06-2X and M06-HF). Density functional theory (DFT) hybrids with a moderate (about 20-30%) extent of Hartree-Fock exchange (HFexc) (PBE0, B3LYP and M06) were also found to deliver HOMO-LUMO energy gaps which compare well with the experimental absorption maxima, thus representing a valuable alternative for a prompt and predictive estimation of the optical gap. The possibility of using completely semi-empirical approaches (AM1/ZINDO) is also discussed. PMID:26808717

  1. Benchmarking DFT and semi-empirical methods for a reliable and cost-efficient computational screening of benzofulvene derivatives as donor materials for small-molecule organic solar cells

    NASA Astrophysics Data System (ADS)

    Tortorella, Sara; Mastropasqua Talamo, Maurizio; Cardone, Antonio; Pastore, Mariachiara; De Angelis, Filippo

    2016-02-01

    A systematic computational investigation on the optical properties of a group of novel benzofulvene derivatives (Martinelli 2014 Org. Lett. 16 3424-7), proposed as possible donor materials in small molecule organic photovoltaic (smOPV) devices, is presented. A benchmark evaluation against experimental results on the accuracy of different exchange and correlation functionals and semi-empirical methods in predicting both reliable ground state equilibrium geometries and electronic absorption spectra is carried out. The benchmark of the geometry optimization level indicated that the best agreement with x-ray data is achieved by using the B3LYP functional. Concerning the optical gap prediction, we found that, among the employed functionals, MPW1K provides the most accurate excitation energies over the entire set of benzofulvenes. Similarly reliable results were also obtained for range-separated hybrid functionals (CAM-B3LYP and wB97XD) and for global hybrid methods incorporating a large amount of non-local exchange (M06-2X and M06-HF). Density functional theory (DFT) hybrids with a moderate (about 20-30%) extent of Hartree-Fock exchange (HFexc) (PBE0, B3LYP and M06) were also found to deliver HOMO-LUMO energy gaps which compare well with the experimental absorption maxima, thus representing a valuable alternative for a prompt and predictive estimation of the optical gap. The possibility of using completely semi-empirical approaches (AM1/ZINDO) is also discussed.

  2. Firm Size, a Self-Organized Critical Phenomenon: Evidence from the Dynamical Systems Theory

    NASA Astrophysics Data System (ADS)

    Chandra, Akhilesh

    This research draws upon a recent innovation in the dynamical systems literature called the theory of self-organized criticality (SOC) (Bak, Tang, and Wiesenfeld 1988) to develop a computational model of a firm's size by relating its internal and external sub-systems. As a holistic paradigm, the theory of SOC implies that a firm, as a composite system of many degrees of freedom, naturally evolves to a critical state in which a minor event starts a chain reaction that can affect either a part or the system as a whole. Thus, the global features of a firm cannot be understood by analyzing its individual parts separately. The causal framework builds upon a constant capital resource to support a volume of production at the existing level of efficiency. The critical size is defined as the production level at which the average product of a firm's factors of production attains its maximum value. The non-linearity is inferred from a change in the nature of relations, at the border of criticality, between size and the two performance variables, viz., the operating efficiency and the financial efficiency. The effect of breaching the critical size is examined on stock price reactions. Consistent with the theory of SOC, it is hypothesized that the temporal response of a firm breaching the level of critical size should behave as a flicker noise (1/f) process. The flicker noise is characterized by correlations extended over a wide range of time scales, indicating some sort of cooperative effect among a firm's degrees of freedom. It is further hypothesized that a firm's size evolves to a spatial structure with scale-invariant, self-similar (fractal) properties. The system is said to be self-organized inasmuch as it naturally evolves to the state of criticality without any detailed specification of the initial conditions. In this respect, the critical state is an attractor of the firm's dynamics. Another set of hypotheses examines the relations between the size and the

  3. Did Geomagnetic Activity Challenge Electric Power Reliability During Solar Cycle 23? Evidence from the PJM Regional Transmission Organization in North America

    NASA Technical Reports Server (NTRS)

    Forbes, Kevin F.; St. Cyr, Chris

    2012-01-01

    During solar cycle 22, a very intense geomagnetic storm on 13 March 1989 contributed to the collapse of the Hydro-Quebec power system in Canada. This event clearly demonstrated that geomagnetic storms have the potential to lead to blackouts. This paper addresses whether geomagnetic activity challenged power system reliability during solar cycle 23. Operations by PJM Interconnection, LLC (hereafter PJM), a regional transmission organization in North America, are examined over the period 1 April 2002 through 30 April 2004. During this time PJM coordinated the movement of wholesale electricity in all or parts of Delaware, Maryland, New Jersey, Ohio, Pennsylvania, Virginia, West Virginia, and the District of Columbia in the United States. We examine the relationship between a proxy of geomagnetically induced currents (GICs) and a metric of challenged reliability. In this study, GICs are proxied using magnetometer data from a geomagnetic observatory located just outside the PJM control area. The metric of challenged reliability is the incidence of out-of-economic-merit order dispatching due to adverse reactive power conditions. The statistical methods employed make it possible to disentangle the effects of GICs on power system operations from purely terrestrial factors. The results of the analysis indicate that geomagnetic activity can significantly increase the likelihood that the system operator will dispatch generating units based on system stability considerations rather than economic merit.

  4. Highly reliable and stable organic field-effect transistor nonvolatile memory with a poly(4-vinyl phenol) charge trapping layer based on a pn-heterojunction active layer

    NASA Astrophysics Data System (ADS)

    Xiang, Lanyi; Ying, Jun; Han, Jinhua; Zhang, Letian; Wang, Wei

    2016-04-01

    In this letter, we demonstrate a highly reliable and stable organic field-effect transistor (OFET) based nonvolatile memory (NVM) with the polymer poly(4-vinyl phenol) (PVP) as the charge trapping layer. In unipolar OFETs, irreversible shifts of the turn-on voltage (Von) and severe degradation of the memory window (ΔVon) at programming (P) and erasing (E) voltages, respectively, block their application in NVMs. This obstacle is overcome by using a pn-heterojunction as the active layer in the OFET memory, which supplies hole- and electron-accumulating channels at the applied P and E voltages, respectively. Holes and electrons transferring from these channels to the PVP layer, overwriting the trapped charges of opposite polarity, result in reliable bidirectional shifts of Von at P and E voltages. The heterojunction OFET exhibits excellent nonvolatile memory characteristics, with a large ΔVon of 8.5 V, a desired reading (R) voltage of 0 V, reliable P/R/E/R dynamic endurance over 100 cycles, and a retention time over 10 years.

  5. Using Ontological Engineering to Organize Learning/Instructional Theories and Build a Theory-Aware Authoring System

    ERIC Educational Resources Information Center

    Hayashi, Yusuke; Bourdeau, Jacqueline; Mizoguchi, Riichiro

    2009-01-01

    This paper describes the achievements of an innovative eight-year research program first introduced in Mizoguchi and Bourdeau (2000), which was aimed at building a theory-aware authoring system by using ontological engineering. To date, we have proposed OMNIBUS, an ontology that comprehensively covers different learning/instructional theories and…

  6. 18 CFR 39.5 - Reliability Standards.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Reliability Standards... RELIABILITY ORGANIZATION; AND PROCEDURES FOR THE ESTABLISHMENT, APPROVAL, AND ENFORCEMENT OF ELECTRIC RELIABILITY STANDARDS § 39.5 Reliability Standards. (a) The Electric Reliability Organization shall file...

  7. 18 CFR 39.11 - Reliability reports.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Reliability reports. 39... RELIABILITY ORGANIZATION; AND PROCEDURES FOR THE ESTABLISHMENT, APPROVAL, AND ENFORCEMENT OF ELECTRIC RELIABILITY STANDARDS § 39.11 Reliability reports. (a) The Electric Reliability Organization shall...

  8. Investigation of Multiconfigurational Short-Range Density Functional Theory for Electronic Excitations in Organic Molecules.

    PubMed

    Hubert, Mickaël; Hedegård, Erik D; Jensen, Hans Jørgen Aa

    2016-05-10

    Computational methods that can accurately and effectively predict all types of electronic excitations for any molecular system are missing from the toolbox of the computational chemist. Although various Kohn-Sham density-functional methods (KS-DFT) fulfill this aim in some cases, they become inadequate when the molecule has near-degeneracies and/or low-lying doubly excited states. To address these issues we have recently proposed multiconfiguration short-range density-functional theory (MC-srDFT) as a new tool in the toolbox. While initial applications for systems with multireference character and double excitations have been promising, it is nevertheless important that the accuracy of MC-srDFT is at least comparable to the best KS-DFT methods also for organic molecules that are typically of single-reference character. In this paper we therefore systematically investigate the performance of MC-srDFT for a selected benchmark set of electronic excitations of organic molecules, covering the most common types of organic chromophores. This investigation confirms the expectation that the MC-srDFT method is accurate for a broad range of excitations and comparable to accurate wave function methods such as CASPT2, NEVPT2, and the coupled cluster based CC2 and CC3. PMID:27058733

  9. Precise segmentation of multiple organs in CT volumes using learning-based approach and information theory.

    PubMed

    Lu, Chao; Zheng, Yefeng; Birkbeck, Neil; Zhang, Jingdan; Kohlberger, Timo; Tietjen, Christian; Boettger, Thomas; Duncan, James S; Zhou, S Kevin

    2012-01-01

    In this paper, we present a novel method incorporating information theory into a learning-based approach for automatic and accurate pelvic organ segmentation (including the prostate, bladder, and rectum). We target 3D CT volumes that are generated using different scanning protocols (e.g., contrast and non-contrast, with and without implant in the prostate, various resolutions and positions), and the volumes come from largely diverse sources (e.g., diseases in different organs). Three key ingredients are combined to solve this challenging segmentation problem. First, marginal space learning (MSL) is applied to efficiently and effectively localize the multiple organs in the largely diverse CT volumes. Second, learning techniques with steerable features are applied for robust boundary detection, which enables handling of highly heterogeneous texture patterns. Third, a novel information theoretic scheme is incorporated into the boundary inference process. The incorporation of the Jensen-Shannon divergence further drives the mesh to the best fit of the image, thus improving segmentation performance. The proposed approach is tested on a challenging dataset containing 188 volumes from diverse sources. Our approach not only produces excellent segmentation accuracy, but also runs about eighty times faster than previous state-of-the-art solutions. The proposed method can be applied to CT images to provide visual guidance to physicians during computer-aided diagnosis, treatment planning, and image-guided radiotherapy to treat cancers in the pelvic region. PMID:23286081
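    The Jensen-Shannon divergence used in the boundary inference step has a compact closed form for discrete distributions: the symmetrized Kullback-Leibler divergence of each distribution against their mixture. A minimal sketch (the mesh-fitting machinery itself is not reproduced; the histogram values are illustrative):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence (natural log); terms with p_i = 0 contribute 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js_divergence(p, q):
    """Jensen-Shannon divergence: JSD(p, q) = (KL(p, m) + KL(q, m)) / 2
    with the mixture m = (p + q) / 2. Symmetric and bounded by ln(2)."""
    m = [(pi + qi) / 2.0 for pi, qi in zip(p, q)]
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

# Two normalized intensity histograms (illustrative values)
inside_organ = [0.1, 0.6, 0.3]
outside_organ = [0.5, 0.2, 0.3]
d = js_divergence(inside_organ, outside_organ)
```

Because the JSD is zero for identical distributions and grows toward ln(2) for disjoint ones, maximizing it between the regions on either side of a candidate boundary is a natural criterion for driving a mesh toward the true organ surface.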

  10. Making Reliability Arguments in Classrooms

    ERIC Educational Resources Information Center

    Parkes, Jay; Giron, Tilia

    2006-01-01

    Reliability methodology needs to evolve, as validity methodology has, into an argument supported by theory and empirical evidence. Nowhere is the inadequacy of current methods more visible than in classroom assessment. Reliability arguments would also permit additional methodologies for evidencing reliability in classrooms. It would liberalize methodology…

  11. 18 CFR 39.5 - Reliability Standards.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Reliability Standards... RELIABILITY STANDARDS § 39.5 Reliability Standards. (a) The Electric Reliability Organization shall file each Reliability Standard or modification to a Reliability Standard that it proposes to be made effective...

  12. 18 CFR 39.5 - Reliability Standards.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 18 Conservation of Power and Water Resources 1 2014-04-01 2014-04-01 false Reliability Standards... RELIABILITY STANDARDS § 39.5 Reliability Standards. (a) The Electric Reliability Organization shall file each Reliability Standard or modification to a Reliability Standard that it proposes to be made effective...

  13. Investigating the self-organization of debris flows: theory, modelling, and empirical work

    NASA Astrophysics Data System (ADS)

    von Elverfeldt, Kirsten; Keiler, Margreth; Elmenreich, Wilfried; Fehárvári, István; Zhevzhyk, Sergii

    2014-05-01

    Here we present the conceptual framework of an interdisciplinary project on the theory, empirics, and modelling of the self-organisation mechanisms within debris flows. Despite the fact that debris flows cause severe damage in mountainous regions such as the Alps, the process behaviour of debris flows is still not well understood. This is mainly due to the process dynamics of debris flows: erosion and material entrainment are essential for their destructive power, and because of this destructiveness it is nearly impossible to measure and observe these mechanisms in action. Hence, the interactions between channel bed and debris flow remain largely unknown, while this knowledge is crucial for the understanding of debris flow behaviour. Furthermore, these internal parameter interactions change during an event while at the same time governing the temporal and spatial evolution of the event. This project aims at answering some of these unknowns by bringing together theory, empirical work, and modelling of debris flows. It especially aims at explaining why process types switch along the flow path during an event, e.g. the change from a debris flow to a hyperconcentrated flow and back. A second focus is the question of why debris flows sometimes exhibit strong erosion and sediment mobilisation during an event and at other times do not. A promising theoretical framework for the analysis of these observations is that of self-organizing systems, and especially Haken's theory of synergetics. Synergetics is an interdisciplinary theory of open systems that are characterized by many individual, yet interacting parts, resulting in spatio-temporal structures. We hypothesize that debris flows can successfully be analysed within this theoretical framework. In order to test this hypothesis, an innovative modelling approach is chosen in combination with detailed field work. In self-organising systems the interactions of the system

  14. Improved device reliability in organic light emitting devices by controlling the etching of indium zinc oxide anode

    NASA Astrophysics Data System (ADS)

    Liao, Ying-Jie; Lou, Yan-Hui; Wang, Zhao-Kui; Liao, Liang-Sheng

    2014-11-01

    A controllable etching process for indium zinc oxide (IZO) films was developed by using a weak etchant, oxalic acid, with a slow etching rate. With controlled etching time and temperature, a patterned IZO electrode with smooth surface morphology and a sloped edge was achieved. In practical application in organic light emitting devices (OLEDs), a suppression of the leakage current in the current-voltage characteristics of the OLEDs was observed. This resulted in a 1.6 times longer half-lifetime in the IZO-based OLEDs compared to those using an indium tin oxide (ITO) anode etched by a conventional strong etchant, aqua regia.

  15. Collection-limited theory interprets the extraordinary response of single semiconductor organic solar cells.

    PubMed

    Ray, Biswajit; Baradwaj, Aditya G; Khan, Mohammad Ryyan; Boudouris, Bryan W; Alam, Muhammad Ashraful

    2015-09-01

    The bulk heterojunction (BHJ) organic photovoltaic (OPV) architecture has dominated the literature due to its ability to be implemented in devices with relatively high efficiency values. However, a simpler device architecture based on a single organic semiconductor (SS-OPV) offers several advantages: it obviates the need to control the highly system-dependent nanoscale BHJ morphology, and therefore would allow the use of a broader range of organic semiconductors. Unfortunately, the photocurrent in standard SS-OPV devices is typically very low, which generally is attributed to inefficient charge separation of the photogenerated excitons. Here we show that the short-circuit current density from SS-OPV devices can be enhanced significantly (∼100-fold) through the use of inverted device configurations, relative to a standard OPV device architecture. This result suggests that charge generation may not be the performance bottleneck in OPV device operation. Instead, poor charge collection, caused by defect-induced electric field screening, is most likely the primary performance bottleneck in regular-geometry SS-OPV cells. We justify this hypothesis by: (i) detailed numerical simulations, (ii) electrical characterization experiments of functional SS-OPV devices using multiple polymers as active layer materials, and (iii) impedance spectroscopy measurements. Furthermore, we show that the collection-limited photocurrent theory consistently interprets typical characteristics of regular SS-OPV devices. These insights should encourage the design and OPV implementation of high-purity, high-mobility polymers, and other soft materials that have shown promise in organic field-effect transistor applications, but have not performed well in BHJ OPV devices, wherein they adopt less-than-ideal nanostructures when blended with electron-accepting materials. PMID:26290582

  16. Collection-limited theory interprets the extraordinary response of single semiconductor organic solar cells

    PubMed Central

    Ray, Biswajit; Baradwaj, Aditya G.; Khan, Mohammad Ryyan; Boudouris, Bryan W.; Alam, Muhammad Ashraful

    2015-01-01

    The bulk heterojunction (BHJ) organic photovoltaic (OPV) architecture has dominated the literature due to its ability to be implemented in devices with relatively high efficiency values. However, a simpler device architecture based on a single organic semiconductor (SS-OPV) offers several advantages: it obviates the need to control the highly system-dependent nanoscale BHJ morphology, and therefore would allow the use of a broader range of organic semiconductors. Unfortunately, the photocurrent in standard SS-OPV devices is typically very low, which generally is attributed to inefficient charge separation of the photogenerated excitons. Here we show that the short-circuit current density from SS-OPV devices can be enhanced significantly (∼100-fold) through the use of inverted device configurations, relative to a standard OPV device architecture. This result suggests that charge generation may not be the performance bottleneck in OPV device operation. Instead, poor charge collection, caused by defect-induced electric field screening, is most likely the primary performance bottleneck in regular-geometry SS-OPV cells. We justify this hypothesis by: (i) detailed numerical simulations, (ii) electrical characterization experiments of functional SS-OPV devices using multiple polymers as active layer materials, and (iii) impedance spectroscopy measurements. Furthermore, we show that the collection-limited photocurrent theory consistently interprets typical characteristics of regular SS-OPV devices. These insights should encourage the design and OPV implementation of high-purity, high-mobility polymers, and other soft materials that have shown promise in organic field-effect transistor applications, but have not performed well in BHJ OPV devices, wherein they adopt less-than-ideal nanostructures when blended with electron-accepting materials. PMID:26290582

  17. A simple theory of molecular organization in fullerene-containing liquid crystals

    NASA Astrophysics Data System (ADS)

    Peroukidis, S. D.; Vanakaras, A. G.; Photinos, D. J.

    2005-10-01

    Systematic efforts to synthesize fullerene-containing liquid crystals have produced a variety of successful model compounds. We present a simple molecular theory, based on the interconverting shape approach [Vanakaras and Photinos, J. Mater. Chem. 15, 2002 (2005)], that relates the self-organization observed in these systems to their molecular structure. The interactions are modeled by dividing each molecule into a number of submolecular blocks to which specific interactions are assigned. Three types of blocks are introduced, corresponding to fullerene units, mesogenic units, and nonmesogenic linkage units. The blocks are constrained to move on a cubic three-dimensional lattice and molecular flexibility is allowed by retaining a number of representative conformations within the block representation of the molecule. Calculations are presented for a variety of molecular architectures including twin mesogenic branch monoadducts of C60, twin dendromesogenic branch monoadducts, and conical (badminton shuttlecock) multiadducts of C60. The dependence of the phase diagrams on the interaction parameters is explored. In spite of its many simplifications and the minimal molecular modeling used (three types of chemically distinct submolecular blocks with only repulsive interactions), the theory accounts remarkably well for the phase behavior of these systems.

  18. Functional Organization of the Action Observation Network in Autism: A Graph Theory Approach

    PubMed Central

    Alaerts, Kaat; Geerlings, Franca; Herremans, Lynn; Swinnen, Stephan P.; Verhoeven, Judith; Sunaert, Stefan; Wenderoth, Nicole

    2015-01-01

Background The ability to recognize, understand and interpret others' actions and emotions has been linked to the mirror system or action-observation-network (AON). Although variations in these abilities are prevalent in the neuro-typical population, persons diagnosed with autism spectrum disorders (ASD) have deficits in the social domain and exhibit alterations in this neural network. Method Here, we examined functional network properties of the AON using graph theory measures and region-to-region functional connectivity analyses of resting-state fMRI-data from adolescents and young adults with ASD and typical controls (TC). Results Overall, our graph theory analyses provided convergent evidence that the network integrity of the AON is altered in ASD, and that reductions in network efficiency relate to reductions in overall network density (i.e., decreased overall connection strength). Compared to TC, individuals with ASD showed significant reductions in network efficiency and increased shortest path lengths and centrality. Importantly, when adjusting for overall differences in network density between ASD and TC groups, participants with ASD continued to display reductions in network integrity, suggesting that network-level organizational properties of the AON are also altered in ASD. Conclusion While differences in empirical connectivity contributed to reductions in network integrity, graph theoretical analyses provided indications that changes in the high-level network organization also reduced the integrity of the AON. PMID:26317222
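The shortest-path-length and efficiency measures used in such graph-theory analyses can be illustrated with a minimal sketch. The toy 4-node network and the helper names `shortest_paths` and `global_efficiency` below are illustrative assumptions, not the study's fMRI connectivity data.

```python
# Sketch: two common graph-theory measures (BFS shortest paths, global
# efficiency) on a toy unweighted graph given as an adjacency dict.
from itertools import combinations
from collections import deque

def shortest_paths(adj, src):
    """BFS shortest path lengths from src to every reachable node."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def global_efficiency(adj):
    """Mean inverse shortest path length over all node pairs."""
    nodes = list(adj)
    eff = 0.0
    for u, v in combinations(nodes, 2):
        d = shortest_paths(adj, u).get(v)
        if d:  # skip unreachable pairs (infinite distance, zero efficiency)
            eff += 1.0 / d
    n = len(nodes)
    return 2 * eff / (n * (n - 1))

# toy network: a path graph 0-1-2-3 (long paths, hence modest efficiency)
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(round(global_efficiency(adj), 3))  # 0.722
```

Densely connected graphs push this measure toward 1; the reductions reported in the abstract correspond to lower values of exactly this kind of statistic.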

  19. Coding theory based models for protein translation initiation in prokaryotic organisms.

    SciTech Connect

    May, Elebeoba Eni; Bitzer, Donald L. (North Carolina State University, Raleigh, NC); Rosnick, David I. (North Carolina State University, Raleigh, NC); Vouk, Mladen A.

    2003-03-01

    Our research explores the feasibility of using communication theory, error control (EC) coding theory specifically, for quantitatively modeling the protein translation initiation mechanism. The messenger RNA (mRNA) of Escherichia coli K-12 is modeled as a noisy (errored), encoded signal and the ribosome as a minimum Hamming distance decoder, where the 16S ribosomal RNA (rRNA) serves as a template for generating a set of valid codewords (the codebook). We tested the E. coli based coding models on 5' untranslated leader sequences of prokaryotic organisms of varying taxonomical relation to E. coli including: Salmonella typhimurium LT2, Bacillus subtilis, and Staphylococcus aureus Mu50. The model identified regions on the 5' untranslated leader where the minimum Hamming distance values of translated mRNA sub-sequences and non-translated genomic sequences differ the most. These regions correspond to the Shine-Dalgarno domain and the non-random domain. Applying the EC coding-based models to B. subtilis, and S. aureus Mu50 yielded results similar to those for E. coli K-12. Contrary to our expectations, the behavior of S. typhimurium LT2, the more taxonomically related to E. coli, resembled that of the non-translated sequence group.
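The minimum-Hamming-distance decoding step described above can be sketched in miniature. The codebook and received sequence below are invented toy values, not the paper's E. coli-derived codewords.

```python
# Sketch: minimum Hamming distance decoding over a small codebook,
# mirroring the ribosome-as-decoder view of translation initiation.
def hamming(a, b):
    """Hamming distance between two equal-length sequences."""
    if len(a) != len(b):
        raise ValueError("sequences must be equal length")
    return sum(x != y for x, y in zip(a, b))

def min_distance_decode(received, codebook):
    """Return the codeword closest to `received` and its distance."""
    best = min(codebook, key=lambda cw: hamming(received, cw))
    return best, hamming(received, best)

codebook = ["AGGAGG", "AGGAGA", "GGAGGT"]  # hypothetical valid codewords
word, d = min_distance_decode("AGGTGG", codebook)
print(word, d)  # AGGAGG 1
```

In the paper's model, translated mRNA sub-sequences are expected to yield systematically smaller minimum distances than non-translated genomic sequence, which is what distinguishes the Shine-Dalgarno region.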

  20. Nuclear weapons decision-making; an application of organization theory to the mini-nuke case

    SciTech Connect

    Kangas, J.L.

    1985-01-01

This dissertation addresses the problem of constructing and developing normative theory responsive to the need for improving the quality of decision-making in nuclear weapons policy-making. Against the background of a critical evaluation of various paradigms in the literature (systems analysis and opposed-systems design, the bureaucratic politics model, and the cybernetic theory of decision), an attempt is made to design an alternative analytic framework based on the writings of numerous organization theorists such as Herbert Simon and Kenneth Arrow. The framework is applied to the case of mini-nukes, i.e., proposals in the mid-1970s to develop and deploy tens of thousands of very low-yield (sub-kiloton), miniaturized fission weapons in NATO. Heuristic case study identifies the type of study undertaken in the dissertation, in contrast to the more familiar paradigmatic studies identified, for example, with the Harvard Weapons Project. Application of the analytic framework developed in the dissertation to the mini-nuke case resulted in an empirical understanding of why decision-making concerning tactical nuclear weapons has been such a complex task and why force modernization issues in particular have been so controversial and lacking in policy resolution.

  1. The search for reliable aqueous solubility (Sw) and octanol-water partition coefficient (Kow) data for hydrophobic organic compounds; DDT and DDE as a case study

    USGS Publications Warehouse

    Pontolillo, James; Eganhouse, R.P.

    2001-01-01

The accurate determination of an organic contaminant's physico-chemical properties is essential for predicting its environmental impact and fate. Approximately 700 publications (1944–2001) were reviewed and all known aqueous solubilities (Sw) and octanol-water partition coefficients (Kow) for the organochlorine pesticide, DDT, and its persistent metabolite, DDE were compiled and examined. Two problems are evident with the available database: 1) egregious errors in reporting data and references, and 2) poor data quality and/or inadequate documentation of procedures. The published literature (particularly the collative literature such as compilation articles and handbooks) is characterized by a preponderance of unnecessary data duplication. Numerous data and citation errors are also present in the literature. The percentage of original Sw and Kow data in compilations has decreased with time, and in the most recent publications (1994–97) it composes only 6–26 percent of the reported data. The variability of original DDT/DDE Sw and Kow data spans 2–4 orders of magnitude, and there is little indication that the uncertainty in these properties has declined over the last 5 decades. A criteria-based evaluation of DDT/DDE Sw and Kow data sources shows that 95–100 percent of the database literature is of poor or unevaluatable quality. The accuracy and reliability of the vast majority of the data are unknown due to inadequate documentation of the methods of determination used by the authors. [For example, estimates of precision have been reported for only 20 percent of experimental Sw data and 10 percent of experimental Kow data.] Computational methods for estimating these parameters have been increasingly substituted for direct or indirect experimental determination despite the fact that the data used for model development and validation may be of unknown reliability. Because of the prevalence of errors, the lack of methodological documentation, and unsatisfactory data

  2. FFLO strange metal and quantum criticality in two dimensions: Theory and application to organic superconductors

    NASA Astrophysics Data System (ADS)

    Piazza, Francesco; Zwerger, Wilhelm; Strack, Philipp

    2016-02-01

    Increasing the spin imbalance in superconductors can spatially modulate the gap by forming Cooper pairs with finite momentum. For large imbalances compared to the Fermi energy, the inhomogeneous FFLO superconductor ultimately becomes a normal metal. There is mounting experimental evidence for this scenario in two-dimensional (2D) organic superconductors in large in-plane magnetic fields; this is complemented by ongoing efforts to realize this scenario in coupled tubes of atomic Fermi gases with spin imbalance. Yet, a theory for the phase transition from a metal to an FFLO superconductor has not been developed so far and the universality class has remained unknown. Here we propose and analyze a spin imbalance driven quantum critical point between a 2D metal and an FFLO phase in anisotropic electron systems. We derive the effective action for electrons and bosonic FFLO pairs at this quantum phase transition. Using this action, we predict non-Fermi-liquid behavior and the absence of quasiparticles at a discrete set of hot spots on the Fermi surfaces. This results in strange power laws in thermodynamics and response functions, which are testable with existing experimental setups on 2D organic superconductors and may also serve as signatures of the elusive FFLO phase itself. The proposed universality class is distinct from previously known quantum critical metals and, because its critical fluctuations appear already in the pairing channel, a promising candidate for naked metallic quantum criticality over extended temperature ranges.

  3. The reversed Müller-Lyer illusion and figure-ground organization theory.

    PubMed

    Taya, R; Ohashi, Y

    1992-01-01

    When the shaft is shortened and reaches neither of the vertices of the two pairs of wings, a reversed Müller-Lyer illusion is observed: a shaft between inward-pointing wings appears to be longer than a shaft between the outward-pointing wings. In this paper it is examined whether this illusion can be explained in terms of figure-ground organization. A circle was used as the focal area, instead of a shaft or a pair of dots, so that the figure-ground character could be seen more definitely in this focal area. The apparent size of the focal circle was measured under different conditions with three variables (enclosure, wings direction, and depth). The focal circle appeared to be largest in the condition where the circle should appear most readily as a hole, ie in the single, wings-in, space condition. The circle appeared to be smallest in the condition where the circle should appear most readily as a disc, ie in the separate, wings-out, object condition. This is consistent with an explanation of the usual, as well as the reversed, Müller-Lyer illusion in terms of figure-ground organization theory. PMID:1488264

  4. Simple, stable and reliable modeling of gas properties of organic working fluids in aerodynamic designs of turbomachinery for ORC and VCC

    NASA Astrophysics Data System (ADS)

    Kawakubo, T.

    2016-05-01

A simple, stable and reliable modeling of the real gas nature of the working fluid is required for the aerodynamic design of the turbine in the Organic Rankine Cycle and of the compressor in the Vapor Compression Cycle. Although many modern Computational Fluid Dynamics tools are capable of incorporating real gas models, simulations with such a gas model tend to be more time-consuming than those with a perfect gas model and can even become unstable when simulating near the saturation boundary. Thus a perfect gas approximation is still an attractive option to stably and swiftly conduct a design simulation. In this paper, an effective method of CFD simulation with a perfect gas approximation is discussed. A method of representing the performance of the centrifugal compressor or the radial-inflow turbine by means of a set of non-dimensional performance parameters and translating the fictitious perfect gas result to the actual real gas performance is presented.

  5. Further discussion on reliability: the art of reliability estimation.

    PubMed

    Yang, Yanyun; Green, Samuel B

    2015-01-01

    Sijtsma and van der Ark (2015) focused in their lead article on three frameworks for reliability estimation in nursing research: classical test theory (CTT), factor analysis (FA), and generalizability theory. We extend their presentation with particular attention to CTT and FA methods. We first consider the potential of yielding an overly negative or an overly positive assessment of reliability based on coefficient alpha. Next, we discuss other CTT methods for estimating reliability and how the choice of methods affects the interpretation of the reliability coefficient. Finally, we describe FA methods, which not only permit an understanding of a measure's underlying structure but also yield a variety of reliability coefficients with different interpretations. On a more general note, we discourage reporting reliability as a two-choice outcome--unsatisfactory or satisfactory; rather, we recommend that nursing researchers make a conceptual and empirical argument about when a measure might be more or less reliable, depending on its use. PMID:25738627
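As a concrete illustration of one CTT estimate discussed above, coefficient alpha can be computed directly from an item-by-respondent score matrix. The scores below are invented toy data, and `cronbach_alpha` is an illustrative helper name.

```python
# Sketch: coefficient (Cronbach's) alpha from item scores.
# alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))
def variance(xs):
    """Unbiased sample variance."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: one list of scores per item, same respondents in each."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent sums
    item_var = sum(variance(it) for it in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# four items, five respondents (toy data)
items = [
    [3, 4, 2, 5, 4],
    [3, 5, 2, 4, 4],
    [2, 4, 3, 5, 3],
    [3, 4, 2, 5, 5],
]
print(round(cronbach_alpha(items), 3))  # 0.914
```

As the article cautions, a single number like this should not be read as a satisfactory/unsatisfactory verdict; alpha can understate or overstate reliability depending on the measure's structure and use.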

  6. A regulatory theory of cortical organization and its applications to robotics

    NASA Astrophysics Data System (ADS)

    Thangavelautham, Jekanthan

    2009-11-01

Fundamental aspects of biologically-inspired regulatory mechanisms are considered in a robotics context, using artificial neural-network control systems. Regulatory mechanisms are used to control expression of genes, adaptation of form and behavior in organisms. Traditional neural network control architectures assume networks of neurons are fixed and are interconnected by wires. However, these architectures tend to be specified by a designer and are faced with several limitations that reduce scalability and tractability for tasks with larger search spaces. Traditional methods used to overcome these limitations with fixed network topologies are to provide more supervision by a designer. More supervision, as shown, does not guarantee improvement during training, particularly when making incorrect assumptions for little known task domains. Biological organisms often do not require such external intervention (more supervision) and have self-organized through adaptation. The artificial neural tissue (ANT) approach addresses limitations with current neural-network architectures by modeling both wired interactions between neurons and wireless interactions through use of chemical diffusion fields. An evolutionary (Darwinian) selection process is used to 'breed' ANT controllers for a task at hand and the framework facilitates emergence of creative solutions since only a system goal function and a generic set of basis behaviours need be defined. Regulatory mechanisms are formed dynamically within ANT through superpositioning of chemical diffusion fields from multiple sources and are used to select neuronal groups. Regulation drives competition and cooperation among neuronal groups and results in areas of specialization forming within the tissue. These regulatory mechanisms are also shown to increase tractability without requiring more supervision using a new statistical theory developed to predict performance characteristics of fixed network topologies. Simulations also confirm the

  7. Discovery of fairy circles in Australia supports self-organization theory.

    PubMed

    Getzin, Stephan; Yizhaq, Hezi; Bell, Bronwyn; Erickson, Todd E; Postle, Anthony C; Katra, Itzhak; Tzuk, Omer; Zelnik, Yuval R; Wiegand, Kerstin; Wiegand, Thorsten; Meron, Ehud

    2016-03-29

Vegetation gap patterns in arid grasslands, such as the "fairy circles" of Namibia, are one of nature's greatest mysteries and subject to a lively debate on their origin. They are characterized by small-scale hexagonal ordering of circular bare-soil gaps that persists uniformly at the landscape scale to form a homogeneous distribution. Pattern-formation theory predicts that such highly ordered gap patterns should be found also in other water-limited systems across the globe, even if the mechanisms of their formation are different. Here we report that so far unknown fairy circles with the same spatial structure exist 10,000 km away from Namibia in the remote outback of Australia. Combining fieldwork, remote sensing, spatial pattern analysis, and process-based mathematical modeling, we demonstrate that these patterns emerge by self-organization, with no correlation with termite activity; the driving mechanism is a positive biomass-water feedback associated with water runoff and biomass-dependent infiltration rates. The remarkable match between the patterns of Australian and Namibian fairy circles and model results indicates that both patterns emerge from a nonuniform stationary instability, supporting a central universality principle of pattern-formation theory. Applied to the context of dryland vegetation, this principle predicts that different systems that go through the same instability type will show similar vegetation patterns even if the feedback mechanisms and resulting soil-water distributions are different, as we indeed found by comparing the Australian and the Namibian fairy-circle ecosystems. These results suggest that biomass-water feedbacks and resultant vegetation gap patterns are likely more common in remote drylands than is currently known. PMID:26976567

  8. Species Detection and Identification in Sexual Organisms Using Population Genetic Theory and DNA Sequences

    PubMed Central

    Birky, C. William

    2013-01-01

    Phylogenetic trees of DNA sequences of a group of specimens may include clades of two kinds: those produced by stochastic processes (random genetic drift) within a species, and clades that represent different species. The ratio of the mean pairwise sequence difference between a pair of clades (K) to the mean pairwise sequence difference within a clade (θ) can be used to determine whether the clades are samples from different species (K/θ≥4) or the same species (K/θ<4) with probability ≥0.95. Previously I applied this criterion to delimit species of asexual organisms. Here I use data from the literature to show how it can also be applied to delimit sexual species using four groups of sexual organisms as examples: ravens, spotted leopards, sea butterflies, and liverworts. Mitochondrial or chloroplast genes are used because these segregate earlier during speciation than most nuclear genes and hence detect earlier stages of speciation. In several cases the K/θ ratio was greater than 4, confirming the original authors' intuition that the clades were sufficiently different to be assigned to different species. But the K/θ ratio split each of two liverwort species into two evolutionary species, and showed that support for the distinction between the common and Chihuahuan raven species is weak. I also discuss some possible sources of error in using the K/θ ratio; the most significant one would be cases where males migrate between different populations but females do not, making the use of maternally inherited organelle genes problematic. The K/θ ratio must be used with some caution, like all other methods for species delimitation. Nevertheless, it is a simple theory-based quantitative method for using DNA sequences to make rigorous decisions about species delimitation in sexual as well as asexual eukaryotes. PMID:23308113
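The K/θ criterion can be illustrated with a minimal sketch on toy sequences (not the study's organelle-gene data); `mean_within` and `mean_between` are illustrative helper names.

```python
# Sketch: the K/theta species-delimitation criterion. K is the mean
# pairwise difference between two clades, theta the mean pairwise
# difference within a clade; K/theta >= 4 suggests distinct species.
from itertools import combinations, product

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def mean_within(seqs):
    """Mean pairwise difference (theta) within one clade."""
    pairs = list(combinations(seqs, 2))
    return sum(hamming(a, b) for a, b in pairs) / len(pairs)

def mean_between(c1, c2):
    """Mean pairwise difference (K) between two clades."""
    pairs = list(product(c1, c2))
    return sum(hamming(a, b) for a, b in pairs) / len(pairs)

clade_a = ["AAAAAAAAAA", "AAAAAAAAAT"]  # low within-clade variation
clade_b = ["GGGGGAAAAA", "GGGGGAAAAT"]  # diverged from clade_a
K = mean_between(clade_a, clade_b)
theta = max(mean_within(clade_a), mean_within(clade_b))  # conservative choice
print(K / theta >= 4)  # True
```

Real applications would use aligned mitochondrial or chloroplast sequences and, as the abstract notes, interpret the ratio with caution when sex-biased migration is plausible.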

  9. The organic surface of 5145 Pholus: Constraints set by scattering theory

    NASA Technical Reports Server (NTRS)

    Wilson, Peter D.; Sagan, Carl; Thompson, W. Reid

    1994-01-01

No known body in the Solar System has a spectrum redder than that of object 5145 Pholus. We use Hapke scattering theory and optical constants measured in this laboratory to examine the ability of mixtures of a number of organic solids and ices to reproduce the observed spectrum and phase variation. The primary materials considered are poly-HCN, kerogen, Murchison organic extract, Titan tholin, ice tholin, and water ice. In a computer grid search of over 10 million models, we find an intraparticle mixture of 15% Titan tholin, 10% poly-HCN, and 75% water ice with 10-micrometer particles to provide an excellent fit. Replacing water ice with ammonia ice improves the fits significantly while using a pure hydrocarbon tholin, Tholin alpha, instead of Titan tholin makes only modest improvements. All acceptable fits require Titan tholin or some comparable material to provide the steep slope in the visible, and poly-HCN or some comparable material to provide strong absorption in the near-infrared. A pure Titan tholin surface with 16-micrometer particles, as well as all acceptable Pholus models, fits the present spectrophotometric data for the transplutonian object 1992 QB1. The feasibility of gas-phase chemistry to generate material like Titan tholin on such small objects is examined. An irradiated transient atmosphere arising from sublimating ices may generate at most a few centimeters of tholin over the lifetime of the Solar System, but this is insignificant compared to the expected lag deposit of primordial contaminants left behind by the sublimating ice. Irradiation of subsurface N2/CH4 or NH3/CH4 ice by cosmic rays may generate approximately 20 cm of tholin in the upper 10 m of regolith in the same time scale but the identity of this tholin to its gas-phase equivalent has not been demonstrated.

  10. The Reliability Of Planar InGaAs/InP PIN Photodiodes With Organic Coatings For Use In Low Cost Receivers

    NASA Astrophysics Data System (ADS)

    Sutherland, Robert R.; Stokoe, J. C.; Skrimshire, Christopher P.; MacDonald, Brian M.; Sloan, Donald F.

    1990-01-01

Low-cost optoelectronic components in non-hermetic packages are now required for use in the local loop, and these components will be subject to humidity-induced failure mechanisms. The results presented in this paper show that it is possible to provide some protection for the photodiode against the effects of humidity by means of an organic coating. Photodiodes with three different organic coatings were tested at 85°C/85% relative humidity, and all survived longer than uncoated control photodiodes. One of the coatings was superior to the other two, and photodiodes with this coating may have adequate reliability for some local loop applications. The temperature and humidity acceleration factors for uncoated photodiodes were determined. The temperature acceleration factor was found to be low (equivalent to an activation energy of 0.2 eV, instead of the more usual 0.6 eV). Failure analysis of failed photodiodes, which included Auger analysis of corrosion products, showed that failure was due to oxidation of the InP surface.

  11. Interface dipoles of organic molecules on Ag(111) in hybrid density-functional theory

    NASA Astrophysics Data System (ADS)

    Hofmann, Oliver T.; Atalla, Viktor; Moll, Nikolaj; Rinke, Patrick; Scheffler, Matthias

    2013-12-01

We investigate the molecular acceptors 3,4,9,10-perylene-tetracarboxylic acid dianhydride (PTCDA), 2,3,5,6-tetrafluoro-7,7,8,8-tetracyanoquinodimethane (F4TCNQ) and 4,5,9,10-pyrenetetraone (PYTON) on Ag(111) using density-functional theory (DFT). For two groups of the Heyd-Scuseria-Ernzerhof (HSE(α, ω)) family of exchange-correlation functionals (ω = 0 and 0.2 Å⁻¹) we study the isolated components as well as the combined systems as a function of the amount of exact exchange (α). We find that hybrid functionals favour electron transfer to the adsorbate. Comparing with experimental work function data, for α ≈ 0.25 we report a notable but small improvement over (semi) local functionals for the interface dipole. Although Kohn-Sham eigenvalues are only approximate representations of ionization energies, incidentally, at this value also the density of states agrees well with the photoelectron spectra. However, increasing α to values for which the energy of the lowest unoccupied molecular orbital matches the experimental electron affinity in the gas phase worsens both the interface dipole and the density of states. Our results imply that semi-local DFT calculations may often be adequate for conjugated organic molecules on metal surfaces and that the much more computationally demanding hybrid functionals yield only small improvements.

  12. From Structural Dilemmas to Institutional Imperatives: A Descriptive Theory of the School as an Institution and of School Organizations

    ERIC Educational Resources Information Center

    Berg, Gunnar

    2007-01-01

    This study outlines a descriptive theory that seeks to grasp the complexity of the school as a state and societal institution as well as single schools as organizations. A significant characteristic of this complexity is the ambiguity of the missions and goals--the outer boundaries--of the school-institution. The more institutional ambiguity that…

  13. Body without Organs: Notes on Deleuze & Guattari, Critical Race Theory and the Socius of Anti-Racism

    ERIC Educational Resources Information Center

    Ibrahim, Awad

    2015-01-01

    My aim in this article is to epistemologically read Deleuze and Guattari (D & G) against critical race theory (CRT) and simultaneously delineate how D & G's notion of "body without organs" can benefit from CRT. At first glance, especially for language instructors and researchers, these two epistemological frameworks not only…

  14. The Process by Which Black Male College Students Become Leaders of Predominantly White Organizations in Higher Education: A Grounded Theory

    ERIC Educational Resources Information Center

    Moschella, Eric J.

    2013-01-01

    This study sought to understand the process by which Black undergraduate men on predominately White college campuses become leaders of predominately White organizations. Using the theoretical frameworks of Black and White racial identity development (Helms, 1990), Critical Race Theory (Delgado & Stefancic, 2001), and Wijeyesinghe's (2001)…

  15. Understanding and Administering Educational Organizations: The Contribution of Greenfield's "Alternative Theory."

    ERIC Educational Resources Information Center

    Johnson, Neil A.

    1990-01-01

    Assesses the contributions of T. B. Greenfield's "Alternative Theory" to a comprehensive theory for school administration in practice and scholarship. Considers Greenfield's standpoint in relation to rational, natural, and open systems perspectives. (DMM)

  16. Excited states properties of organic molecules: from density functional theory to the GW and Bethe-Salpeter Green's function formalisms.

    PubMed

    Faber, C; Boulanger, P; Attaccalite, C; Duchemin, I; Blase, X

    2014-03-13

Many-body Green's function perturbation theories, such as the GW and Bethe-Salpeter formalisms, are starting to be routinely applied to study charged and neutral electronic excitations in molecular organic systems relevant to applications in photovoltaics, photochemistry or biology. In parallel, density functional theory and its time-dependent extensions significantly progressed along the line of range-separated hybrid functionals within the generalized Kohn-Sham formalism designed to provide correct excitation energies. We give an overview and compare these approaches with examples drawn from the study of gas phase organic systems such as fullerenes, porphyrins, bacteriochlorophylls or nucleobase molecules. The perspectives and challenges that many-body perturbation theory is facing, such as the role of self-consistency, the calculation of forces and potential energy surfaces in the excited states, or the development of embedding techniques specific to the GW and Bethe-Salpeter equation formalisms, are outlined. PMID:24516185

  17. Organizational Economics: Notes on the Use of Transaction-Cost Theory in the Study of Organizations.

    ERIC Educational Resources Information Center

    Robins, James A.

    1987-01-01

Reviews transaction-cost approaches to organizational analysis, examines their use in microeconomic theory, and identifies some important flaws in the study. Advocates transaction-cost theory as a powerful tool for organizational and strategic analysis when set within the framework of more general organizational theory. Includes 61 references. (MLH)

  18. A Theory of Complex Adaptive Inquiring Organizations: Application to Continuous Assurance of Corporate Financial Information

    ERIC Educational Resources Information Center

    Kuhn, John R., Jr.

    2009-01-01

Drawing upon the theories of complexity and complex adaptive systems and the Singerian Inquiring System from C. West Churchman's seminal work "The Design of Inquiring Systems," the dissertation herein develops a systems design theory for continuous auditing systems. The dissertation consists of discussion of the two foundational theories,…

  19. Theory aided design and analysis of dielectric and semiconductor components for organic field-effect transistors

    NASA Astrophysics Data System (ADS)

    Dibenedetto, Sara Arlene

Perfluoroacyl/acyl-derivatized quaterthiophenes are developed and synthesized. The frontier molecular orbital energies of these compounds are studied by optical spectroscopy and electrochemistry while solid-state/film properties are investigated by thermal analysis, x-ray diffraction, and scanning electron microscopy. Organic thin-film transistor (OTFT) performance parameters are discussed in terms of the interplay between semiconductor molecular energetics and film morphologies/microstructures. The majority charge carrier type and mobility exhibit a strong correlation with the regiochemistry of perfluoroarene incorporation. In quaterthiophene-based semiconductors, carbonyl-functionalization allows tuning of the majority carrier type from p-type to ambipolar and to n-type. In situ conversion of a p-type semiconducting film to n-type film is also demonstrated. The design of chemical and film microstructural alternative hybrid organic-inorganic gate dielectrics is described using the classic Clausius-Mossotti relation. The Maxwell-Wagner effective medium model is used to compute the effective dielectric permittivity of two types of dielectrics: self-assembled nanodielectrics (SANDs) and crosslinked polymer blends (CPBs). In these calculations, which show good agreement between theory and experiment, it is found that greater capacitances should be achievable with mixed composites than with layered composites. With this insight, a series of mixed metal oxide-polyolefin nanocomposites is synthesized via in-situ olefin polymerization using single-site metallocene catalysts. By integrating organic and inorganic constituents, the resulting hybrid materials exhibit high permittivity (from the inorganic inclusions) and high breakdown strength, mechanical flexibility, and facile processability (from the polymer matrices). In order to better optimize the capacitance and leakage current of hybrid organic-inorganic dielectrics, the capacitance, leakage current and OFET gate
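The effective-medium reasoning described above can be illustrated with the closely related Maxwell-Garnett mixing rule for spherical inclusions; this is a standard textbook approximation, not the dissertation's exact Maxwell-Wagner model, and the permittivity values below are invented, not measured data.

```python
# Sketch: Maxwell-Garnett effective permittivity of a composite with
# spherical inclusions (eps_i, volume fraction f) in a matrix (eps_m).
def maxwell_garnett(eps_m, eps_i, f):
    """Effective permittivity in the Maxwell-Garnett approximation."""
    num = eps_i + 2 * eps_m + 2 * f * (eps_i - eps_m)
    den = eps_i + 2 * eps_m - f * (eps_i - eps_m)
    return eps_m * num / den

# toy values: polyolefin-like matrix (eps ~ 2.3) with 20% oxide (eps ~ 9)
print(round(maxwell_garnett(2.3, 9.0, 0.2), 3))  # 3.054
```

The formula recovers the matrix permittivity at f = 0, consistent with the intuition that dispersed high-permittivity inclusions raise the composite's effective permittivity smoothly with loading.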

  20. An Examination of Test-Retest, Alternate Form Reliability, and Generalizability Theory Study of the easyCBM Word and Passage Reading Fluency Assessments: Grade 3. Technical Report #1218

    ERIC Educational Resources Information Center

    Park, Bitnara Jasmine; Anderson, Daniel; Alonzo, Julie; Lai, Cheng-Fei; Tindal, Gerald

    2012-01-01

    This technical report is one in a series of five describing the reliability (test/retest and alternate form) and G-Theory/D-Study research on the easyCBM reading measures, grades 1-5. Data were gathered in the spring of 2011 from a convenience sample of students nested within classrooms at a medium-sized school district in the Pacific Northwest.…

  1. Intentions of becoming a living organ donor among Hispanics: a theory-based approach exploring differences between living and nonliving organ donation.

    PubMed

    Siegel, Jason T; Alvaro, Eusebio M; Lac, Andrew; Crano, William D; Dominick, Alexander

    2008-01-01

    This research examines perceptions concerning living (n = 1,253) and nonliving (n = 1,259) organ donation among Hispanic adults, a group considerably less likely than the general population to become donors. Measures are derived from the Theory of Planned Behavior (Ajzen, 1991) and Vested Interest Theory (Crano, 1983, 1997). A substantial percentage of respondents reported positive attitudes and high personal stake concerning organ donation. Mean differences in norms, attitudes, intentions, and assumed immediacy of payoff were found between living and nonliving donor groups, suggesting that these two donation formats are dissimilar and should be examined independently. Accordingly, separate hierarchical multiple regression models were estimated for living and nonliving donation. Analyses supported both theoretical frameworks: Constructs associated with Planned Behavior and Vested Interest independently contributed to donor intentions. The implications of these results, and our recommendations for future health campaigns, are presented in light of these theoretical models. PMID:18307137

  2. Microvascular changes explain the "two-hit" theory of multiple organ failure.

    PubMed Central

    Garrison, R N; Spain, D A; Wilson, M A; Keelen, P A; Harris, P D

    1998-01-01

    OBJECTIVE: The objective was to determine intestinal microvascular endothelial cell control after sequential hemorrhage and bacteremia. SUMMARY BACKGROUND DATA: Sepsis that follows severe hemorrhagic shock often results in multiple system organ failure (MSOF) and death. The sequential nature of this clinical scenario has led to the idea of a "two-hit" theory for the development of MSOF, the hallmark of which is peripheral vasodilation and acidosis. Acute bacteremia alone results in persistent intestinal vasoconstriction and mucosal hypoperfusion. Little experimental data exist to support the pathogenesis of vascular dysregulation during sequential physiologic insults. We postulate that hemorrhagic shock followed by bacteremia results in altered microvascular endothelial cell control of dilation and blood flow. METHODS: Rats underwent volume hemorrhage and resuscitation. A sham group underwent the vascular cannulation without hemorrhage and resuscitation, and controls had no surgical manipulation. After 24 and 72 hours, the small intestine microcirculation was visualized by in vivo videomicroscopy. Mean arterial pressure, heart rate, arteriolar diameters, and A1 flow by Doppler velocimetry were measured. Endothelial-dependent dilator function was determined by the topical application of acetylcholine (ACh). After 1 hour of Escherichia coli bacteremia, ACh dose responses were again measured. Topical nitroprusside was then applied to assess direct smooth muscle dilation (endothelial-independent dilator function) in all groups. Vascular reactivity to ACh was compared among the groups. RESULTS: Acute bacteremia, with or without prior hemorrhage, caused significant large-caliber A1 arteriolar constriction with a concomitant decrease in blood flow. This constriction was blunted at 24 hours after hemorrhage but was restored to control values by 72 hours.
There was a reversal of the response to bacteremia in the premucosal A3 vessels, with a marked dilation both at 24 and

  3. Elastic, not plastic species: Frozen plasticity theory and the origin of adaptive evolution in sexually reproducing organisms

    PubMed Central

    2010-01-01

    Background Darwin's evolutionary theory could easily explain the evolution of adaptive traits (organs and behavioral patterns) in asexual but not in sexual organisms. Two models, the selfish gene theory and the frozen plasticity theory, were suggested in the past 30 years to explain the evolution of adaptive traits in sexual organisms. Results The frozen plasticity theory suggests that sexual species can evolve new adaptations only when their members are genetically uniform, i.e. only after a portion of the population of the original species had split off, balanced on the edge of extinction for several generations, and then undergone rapid expansion. After a short period of time, estimated on the basis of paleontological data to correspond to 1-2% of the duration of the species, polymorphism accumulates in the gene pool due to frequency-dependent selection; and thus, in each generation, new mutations occur in the presence of different alleles and therefore change their selection coefficients from generation to generation. The species ceases to behave in an evolutionarily plastic manner and becomes evolutionarily elastic on a microevolutionary time-scale and evolutionarily frozen on a macroevolutionary time-scale. It then exists in this state until such changes accumulate in the environment that the species becomes extinct. Conclusion Frozen plasticity theory, which includes the Darwinian model of evolution as a special case - the evolution of species in a plastic state - not only offers plenty of new predictions to be tested, but also provides explanations for a much broader spectrum of known biological phenomena than classic evolutionary theories. Reviewers This article was reviewed by Rob Knight, Fyodor Kondrashov and Massimo Di Giulio (nominated by David H. Ardell). PMID:20067646

  4. Assessment of Student Performance in a PSI College Physics Course Using Ausubel's Learning Theory as a Theoretical Framework for Content Organization.

    ERIC Educational Resources Information Center

    Moriera, M. A.

    1979-01-01

    David Ausubel's learning theory was used as a framework for the content organization of an experimental Personalized System of Instruction (PSI) course in physics. Evaluation suggests that the combination of PSI as a method of instruction and Ausubel's theory for organization might result in better learning outcomes. (Author/JMD)

  5. Reliability physics

    NASA Technical Reports Server (NTRS)

    Cuddihy, E. F.; Ross, R. G., Jr.

    1984-01-01

    Speakers whose topics relate to the reliability physics of solar arrays are listed and their topics briefly reviewed. Nine reports are reviewed ranging in subjects from studies of photothermal degradation in encapsulants and polymerizable ultraviolet stabilizers to interface bonding stability to electrochemical degradation of photovoltaic modules.

  6. A Theory for Nonprecipitating Convection between Two Parallel Plates. Part II: Nonlinear Theory and Cloud Field Organization.

    NASA Astrophysics Data System (ADS)

    Bretherton, Christopher S.

    1988-09-01

    In Part I, an idealized model of nonprecipitating moist convection in a shallow conditionally unstable layer of viscous and diffusive air between two parallel plates was introduced, and the "linear" instability of an exactly saturated static state maintained by diffusion was investigated. If there are initially many clouds, the "linear" theory predicted that weaker clouds are suppressed by the subsidence warming and drying from the ever-growing stronger clouds, and the average cloud spacing becomes arbitrarily large as time goes on. Each growing cloud is surrounded by compensating subsidence, which decreases away from the cloud with a characteristic decay scale Rs, the subsidence radius, which can be understood from gravity wave arguments. In Part II, fields of finite-amplitude clouds are considered. An asymptotic analysis is performed in which the moist Rayleigh number Nc² exceeds by only a small amount ε the value Nc0² necessary for the onset of convection. This leads to a nonlinear set of "cloud field equations" which predict how the amplitudes and positions of all the clouds evolve in time. These equations predict a minimum stable cloud spacing λc ~ Rs log(1/ε). If the cloud spacing is less than λc, slight differences in the strengths of neighboring clouds increase until the weaker clouds are suppressed. Unevenly spaced clouds drift until they become evenly spaced, ultimately resulting in a steady field of identical clouds with uniform spacing greater than λc. Numerical experiments with dry stability Nd = Nc corroborate the conclusions from the cloud field equations when Nc²/Nc0² is less than ten. As Nc² increases, the numerically determined λc becomes approximately 1.8Rs ≈ 1.8/Nd. There is a second threshold spacing λt ≈ 1.6/Nd from the theory, below which a field of identical growing clouds is transient. This leads to two types of cloud field evolution. If Nc²/Nc0² is less than 10, all initial conditions lead to steady uniformly spaced fields of identical clouds. If Nc²/Nc0²

  7. Combination of structural reliability and interval analysis

    NASA Astrophysics Data System (ADS)

    Qiu, Zhiping; Yang, Di; Elishakoff, Isaac

    2008-02-01

    In engineering applications, probabilistic reliability theory is presently the most important method; in many cases, however, precise probabilistic reliability theory cannot be considered an adequate and credible model of the real state of affairs. In this paper, we develop a hybrid of probabilistic and non-probabilistic reliability theory, which describes the structural uncertain parameters as interval variables when statistical data are insufficient. By using interval analysis, a new method for calculating the interval of the structural reliability as well as the reliability index is introduced, and the traditional probabilistic theory is incorporated with the interval analysis. Moreover, the new method preserves the useful part of the traditional probabilistic reliability theory but removes the restriction of its strict requirement on data acquisition. An example is presented to demonstrate the feasibility and validity of the proposed theory.
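
    The hybrid idea in this abstract can be sketched with a first-order reliability index whose mean parameters are intervals rather than precise statistics. The limit-state form, the monotonicity argument, and all numbers below are illustrative assumptions, not the paper's actual formulation.

```python
import math

def beta_interval(mu_r, mu_s, sigma_r, sigma_s):
    """Interval of the reliability index beta = (mu_R - mu_S) / sqrt(sR^2 + sS^2).
    mu_r and mu_s are (low, high) intervals; beta is monotone increasing in
    mu_R and decreasing in mu_S, so its extremes occur at interval endpoints."""
    denom = math.hypot(sigma_r, sigma_s)
    return (mu_r[0] - mu_s[1]) / denom, (mu_r[1] - mu_s[0]) / denom

# Hypothetical resistance/load means known only as intervals (sparse data):
lo, hi = beta_interval(mu_r=(320.0, 360.0), mu_s=(180.0, 210.0),
                       sigma_r=25.0, sigma_s=20.0)
print(lo, hi)  # an interval guaranteed to enclose the reliability index
```

    Rather than a single reliability index, the output is a guaranteed enclosure, which is the kind of result the interval approach trades precision for when statistical data are insufficient.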

  8. A Theory-Based Comparison of the Reliabilities of Fixed-Length and Trials-to-Criterion Scoring of Physical Education Skills Tests.

    ERIC Educational Resources Information Center

    Feldt, Leonard S.; Spray, Judith A.

    1983-01-01

    The reliabilities of two types of measurement plans were compared across six hypothetical distributions of true scores or abilities. The measurement plans were: (1) fixed-length, where the number of trials for all examinees is set in advance; and (2) trials-to-criterion, where examinees must keep trying until they complete a given number of trials…

  9. A Monte Carlo Simulation Investigating the Validity and Reliability of Ability Estimation in Item Response Theory with Speeded Computer Adaptive Tests

    ERIC Educational Resources Information Center

    Schmitt, T. A.; Sass, D. A.; Sullivan, J. R.; Walker, C. M.

    2010-01-01

    Imposed time limits on computer adaptive tests (CATs) can result in examinees having difficulty completing all items, thus compromising the validity and reliability of ability estimates. In this study, the effects of speededness were explored in a simulated CAT environment by varying examinee response patterns to end-of-test items. Expectedly,…

  10. Multifractality to Photonic Crystal & Self-Organization to Metamaterials through Anderson Localizations & Group/Gauge Theory

    NASA Astrophysics Data System (ADS)

    Hidajatullah-Maksoed, Widastra

    2015-04-01

    Arthur Cayley investigated permutation groups, creating the theory of the permutation group [F:\\Group_theory.htm]. In cell-element addressing of the lattice Qmf a Cayley tree is used, and the self-affine object Qmf is described by the combination of the finite groups of rotation & inversion and the infinite groups of translation & dilation [G. Corso & L. S. Lucena: "Multifractal lattice and group theory", Physica A: Statistical Mechanics and Its Applications, 2005, v. 357, issue 1, pp. 64-70; http://www.sciencedirect.com/science/articel/pii/S0378437105005005]; hence multifractals can be related to group theory. Many grateful thanks to HE Mr. Drs. P. SWANTORO & HE Mr. Ir. SARWONO KUSUMAATMADJA.

  11. Latent Trait Theory Approach to Measuring Person-Organization Fit: Conceptual Rationale and Empirical Evaluation

    ERIC Educational Resources Information Center

    Chernyshenko, Oleksandr S.; Stark, Stephen; Williams, Alex

    2009-01-01

    The purpose of this article is to offer a new approach to measuring person-organization (P-O) fit, referred to here as "Latent fit." Respondents were administered unidimensional forced choice items and were asked to choose the statement in each pair that better reflected the correspondence between their values and those of the organization;…

  12. Potential Applications of Matrix Organization Theory for the New Jersey Department of Education. Position Paper.

    ERIC Educational Resources Information Center

    Hanson, J. Robert

    Matrix organization focuses on the shift from cost center or process input planning to product output or results planning. Matrix organization puts the personnel and the resources where they are needed to get the job done. This management efficiency is brought about by dividing all organizational activities into two areas: (1) input or maintenance…

  13. Hiring the Best Teachers? Rural Values and Person-Organization Fit Theory

    ERIC Educational Resources Information Center

    Little, Paula S.; Miller, Stephen K.

    2007-01-01

    Person-organization fit theorizes perceptions of congruity between applicants and organizational characteristics in hiring decisions. This study extends person-organization fit to teacher selection in rural districts, hypothesizing that officials with strong rural values favor applicants who reflect the community's sense of place. Rural values of…

  14. Toward a Theory of Variation in the Organization of the Word Reading System

    ERIC Educational Resources Information Center

    Rueckl, Jay G.

    2016-01-01

    The strategy underlying most computational models of word reading is to specify the organization of the reading system--its architecture and the processes and representations it employs--and to demonstrate that this organization would give rise to the behavior observed in word reading tasks. This approach fails to adequately address the variation…

  15. Understanding the Environmental Elements in Religious Student Organizations through Sharon Parks' Mentoring Community Theory

    ERIC Educational Resources Information Center

    Gill, David Christopher

    2011-01-01

    Students are coming to colleges and universities for spiritual fulfillment and have turned to religious student organizations (i.e. Campus Crusade for Christ, Newman Centers, Muslim Student Association, Hillel, etc.) to attain guidance and support. To better understand the spiritual environment religious student organizations have in place, many…

  16. Network reliability

    NASA Technical Reports Server (NTRS)

    Johnson, Marjory J.

    1985-01-01

    Network control (or network management) functions are essential for efficient and reliable operation of a network. Some control functions are currently included as part of the Open System Interconnection model. For local area networks, it is widely recognized that there is a need for additional control functions, including fault isolation functions, monitoring functions, and configuration functions. These functions can be implemented in either a central or distributed manner. The Fiber Distributed Data Interface Medium Access Control and Station Management protocols provide an example of distributed implementation. Relative information is presented here in outline form.

  17. Power laws and self-organized criticality in theory and nature

    NASA Astrophysics Data System (ADS)

    Marković, Dimitrije; Gros, Claudius

    2014-03-01

    Power laws and distributions with heavy tails are common features of many complex systems. Examples are the distribution of earthquake magnitudes, solar flare intensities and the sizes of neuronal avalanches. Previously, researchers surmised that a single general concept may act as an underlying generative mechanism, with the theory of self-organized criticality being a weighty contender. The power-law scaling observed in the primary statistical analysis is an important, but by no means the only, feature characterizing experimental data. The scaling function, the distribution of energy fluctuations, the distribution of inter-event waiting times, and other higher-order spatial and temporal correlations have seen increased consideration in recent years, leading to the realization that basic models, like the original sandpile model, are often insufficient to adequately describe the complexity of real-world systems with power-law distributions. Consequently, a substantial amount of effort has gone into developing new and extended models and, hitherto, three classes of models have emerged. The first line of models is based on a separation between the time scales of an external drive and an internal dissipation, and includes the original sandpile model and its extensions, like the dissipative earthquake model. Within this approach the steady state is close to criticality in terms of an absorbing phase transition. The second line of models is based on external drives and internal dynamics competing on similar time scales and includes the coherent noise model, which has a non-critical steady state characterized by heavy-tailed distributions. The third line of models proposes a non-critical self-organizing state, being guided by an optimization principle, such as the concept of highly optimized tolerance. We present a comparative overview regarding distinct modeling approaches together with a discussion of their potential relevance as underlying generative models for real
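
    The "original sandpile model" the abstract refers to can be illustrated in a few lines. This is a toy Bak-Tang-Wiesenfeld implementation on a small grid; the grid size, drop count, and seed are arbitrary choices for illustration, not parameters from any cited study.

```python
import random

def sandpile(n=20, drops=5000, seed=0):
    """Drop grains at random sites; any site reaching 4 grains topples,
    sending one grain to each neighbour (grains fall off the edges).
    Returns the avalanche size (number of topplings) for every drop."""
    rng = random.Random(seed)
    grid = [[0] * n for _ in range(n)]
    sizes = []
    for _ in range(drops):
        i, j = rng.randrange(n), rng.randrange(n)
        grid[i][j] += 1
        size = 0
        unstable = [(i, j)]
        while unstable:
            x, y = unstable.pop()
            if grid[x][y] < 4:
                continue
            grid[x][y] -= 4
            size += 1
            if grid[x][y] >= 4:        # may still be unstable
                unstable.append((x, y))
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nx, ny = x + dx, y + dy
                if 0 <= nx < n and 0 <= ny < n:
                    grid[nx][ny] += 1
                    if grid[nx][ny] >= 4:
                        unstable.append((nx, ny))
        sizes.append(size)
    return sizes

sizes = sandpile()
print(max(sizes), sum(s == 0 for s in sizes))
```

    The separation of time scales (one slow grain added per fast avalanche) is exactly the "first line" mechanism described above; heavy-tailed avalanche statistics emerge without tuning any parameter.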

  18. Knowledge sharing within organizations: linking art, theory, scenarios and professional experience

    NASA Technical Reports Server (NTRS)

    Bailey, T.; Burton, Y. C.

    2000-01-01

    In this discussion, T. Bailey will be addressing the multiple paradigms within organizations using imagery. Dr. Burton will discuss the relationship between these paradigms and social exchanges that lead to knowledge sharing.

  19. Molecular Theory of Detonation Initiation: Insight from First Principles Modeling of the Decomposition Mechanisms of Organic Nitro Energetic Materials.

    PubMed

    Tsyshevsky, Roman V; Sharia, Onise; Kuklja, Maija M

    2016-01-01

    This review presents a concept, which assumes that thermal decomposition processes play a major role in defining the sensitivity of organic energetic materials to detonation initiation. As a science and engineering community we are still far away from having a comprehensive molecular detonation initiation theory in a widely agreed upon form. However, recent advances in experimental and theoretical methods allow for a constructive and rigorous approach to design and test the theory or at least some of its fundamental building blocks. In this review, we analyzed a set of select experimental and theoretical articles, which were augmented by our own first principles modeling and simulations, to reveal new trends in energetic materials and to refine known existing correlations between their structures, properties, and functions. Our consideration is intentionally limited to the processes of thermally stimulated chemical reactions at the earliest stage of decomposition of molecules and materials containing defects. PMID:26907231

  20. Molecular Theory of Detonation Initiation: Insight from First Principles Modeling of the Decomposition Mechanisms of Organic Nitro Energetic Materials

    DOE PAGES Beta

    Tsyshevsky, Roman; Sharia, Onise; Kuklja, Maija

    2016-02-19

    Our review presents a concept, which assumes that thermal decomposition processes play a major role in defining the sensitivity of organic energetic materials to detonation initiation. As a science and engineering community we are still far away from having a comprehensive molecular detonation initiation theory in a widely agreed upon form. However, recent advances in experimental and theoretical methods allow for a constructive and rigorous approach to design and test the theory or at least some of its fundamental building blocks. In this review, we analyzed a set of select experimental and theoretical articles, which were augmented by our own first principles modeling and simulations, to reveal new trends in energetic materials and to refine known existing correlations between their structures, properties, and functions. Lastly, our consideration is intentionally limited to the processes of thermally stimulated chemical reactions at the earliest stage of decomposition of molecules and materials containing defects.

  1. Organization of instabilities in multispecies systems, a test of hierarchy theory.

    PubMed Central

    Waltho, N; Kolasa, J

    1994-01-01

    The hierarchy theory predicts that system components functioning at lower levels of hierarchy operate or change at higher rates than the components at the level(s) above. If this prediction is correct, then interpretation of stability in complex ecological systems may be in need of revision. We test the prediction using a model of hierarchical structure of habitat and a coral reef fish community. We found that the variability of ecological range and abundance increases exponentially from habitat generalists (high in hierarchy) to specialists (low in hierarchy), as postulated by the hierarchy theory. Our result suggests that community stability is a composite property and should be evaluated by considering the hierarchical structure of that community. PMID:11607461

  2. Intermolecular symmetry-adapted perturbation theory study of large organic complexes

    SciTech Connect

    Heßelmann, Andreas; Korona, Tatiana

    2014-09-07

    Binding energies for the complexes of the S12L database by Grimme [Chem. Eur. J. 18, 9955 (2012)] were calculated using intermolecular symmetry-adapted perturbation theory combined with a density-functional theory description of the interacting molecules. The individual interaction energy decompositions revealed no particular change in the stabilisation pattern as compared to smaller dimer systems at equilibrium structures. This demonstrates that, to some extent, the qualitative description of the interaction of small dimer systems may be extrapolated to larger systems, a method that is widely used in force fields in which the total interaction energy is decomposed into atom-atom contributions. A comparison of the binding energies with accurate experimental reference values from Grimme, the latter including thermodynamic corrections from semiempirical calculations, showed fairly good agreement, to within the error range of the reference binding energies.

  3. Adsorption of an Organic Molecule on a Corrugated BN/Rh(111) "Nanomesh": Atomistic Simulation Using Density Functional Theory

    NASA Astrophysics Data System (ADS)

    Gomez Diaz, J.; Seitsonen, A. P.; Iannuzzi, M.; Hutter, J.

    2013-05-01

    We perform modelling of the organic hexa-iodo-cyclohexa-m-phenylene (CHP) molecule on the h-BN/Rh(111) nanomesh [M Corso et al., Science 303, 217 (2004)]. The nanomesh structure consists of a Moiré pattern with a periodicity of 3.2 nm. It forms a template on which the molecules preferentially adsorb in the lower-lying "pores". We employ density functional theory in a slab geometry to investigate the adsorption and the abstraction of iodine atoms of the CHP on the nanomesh.

  4. The oxidative stress theory of aging: embattled or invincible? Insights from non-traditional model organisms

    PubMed Central

    Edrey, Yael H.; Yang, Ting; Mele, James

    2008-01-01

    Reactive oxygen species (ROS), inevitable byproducts of aerobic metabolism, are known to cause oxidative damage to cells and molecules. This, in turn, is widely accepted as a pivotal determinant of both lifespan and health span. While studies in a wide range of species support the role of ROS in many age-related diseases, its role in aging per se is questioned. Comparative data from a wide range of endotherms offer equivocal support for this theory, with many exceptions and inconclusive findings as to whether or not oxidative stress is either a correlate or a determinant of maximum species lifespan. Available data do not support the premise that metabolic rate and in vivo ROS production are determinants of lifespan, or that superior antioxidant defense contributes to species longevity. Rather, published studies often show either a negative association or a lack of correlation with species longevity. Furthermore, many long-living species such as birds, bats and mole-rats exhibit high levels of oxidative damage even at young ages. Similarly, genetic manipulations altering expression of key antioxidants do not necessarily show an impact on lifespan, even though oxidative damage levels may be affected. While it is possible that these multiple exceptions to straightforward predictions of the free radical theory of aging all reflect species-specific, "private" mechanisms of aging, the preponderance of contrary data nevertheless presents a challenge to this august theory. Therefore, contrary to accepted dogma, the role of oxidative stress as a determinant of longevity is still open to question. PMID:19424860

  5. Theory and simulation of organic solar cell model compounds: how packing and morphology determine the electronic conductivity.

    PubMed

    Lampe, Benjamin; Koslowski, Thorsten

    2012-09-01

    We approach the electronic conductivity of simple models of organic solar cells containing linear and branched αα'-oligothiophenes and buckminsterfullerene. Close-packed model geometries are generated using a Monte Carlo method; this procedure is verified using an analogue model. The electronic structure is described by an extended Su-Schrieffer-Heeger Hamiltonian, and the resulting potential energy surfaces relevant to charge transfer can be analyzed using Marcus' theory, leading to local and, via Kirchhoff's rule, global conductivities for uniform oligothiophene and fullerene systems and their mixtures. Dense fullerene systems or subsystems always exhibit a conductivity in excess of 100 S/cm. In contrast, oligothiophenes show a comparable conductivity only for uniform, well-ordered arrangements of layers. Branched oligomers show only a slight improvement over linear oligothiophenes. Our results support the bulk heterojunction approach as a design principle of organic solar cells from a theoretical perspective. PMID:22957590
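
    The Marcus-theory step in this pipeline (turning site-pair energetics into local transfer rates, and from there into conductivities) can be sketched with the generic nonadiabatic Marcus rate expression. The coupling J, reorganization energy lam, and driving force dG below are illustrative assumptions, not parameters from the study.

```python
import math

HBAR = 6.582119569e-16  # reduced Planck constant, eV*s
KB = 8.617333262e-5     # Boltzmann constant, eV/K

def marcus_rate(J, lam, dG, T=300.0):
    """Nonadiabatic Marcus electron-transfer rate (energies in eV):
    k = (2*pi/hbar) * J^2 * (4*pi*lam*kB*T)^(-1/2)
        * exp(-(dG + lam)^2 / (4*lam*kB*T))"""
    kt = KB * T
    pref = (2.0 * math.pi / HBAR) * J**2 / math.sqrt(4.0 * math.pi * lam * kt)
    return pref * math.exp(-(dG + lam) ** 2 / (4.0 * lam * kt))

# Hypothetical site pair: weak coupling, moderate reorganization energy.
k = marcus_rate(J=0.01, lam=0.3, dG=0.0)
print(k)  # transfer rate in s^-1
```

    The rate peaks when the driving force matches the reorganization energy (dG = -lam), the activationless regime; per-pair rates like this can then be combined via Kirchhoff's rule into a global conductivity.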

  6. Hierarchies in eukaryotic genome organization: Insights from polymer theory and simulations

    PubMed Central

    2011-01-01

    Eukaryotic genomes possess an elaborate and dynamic higher-order structure within the limiting confines of the cell nucleus. Knowledge of the physical principles and the molecular machinery that govern the 3D organization of this structure and its regulation are key to understanding the relationship between genome structure and function. Elegant microscopy and chromosome conformation capture techniques supported by analysis based on polymer models are important steps in this direction. Here, we review results from these efforts and provide some additional insights that elucidate the relationship between structure and function at different hierarchical levels of genome organization. PMID:21595865

  7. Immodest Witnesses: Reliability and Writing Assessment

    ERIC Educational Resources Information Center

    Gallagher, Chris W.

    2014-01-01

    This article offers a survey of three reliability theories in writing assessment: positivist, hermeneutic, and rhetorical. Drawing on an interdisciplinary investigation of the notion of "witnessing," this survey emphasizes the kinds of readers and readings each theory of reliability produces and the epistemological grounds on which it…

  8. Theoretical modeling of the linear and nonlinear optical properties of organic crystals within the rigorous local field theory (RLFT)

    SciTech Connect

    Seidler, T.; Stadnicka, K.; Champagne, B.

    2015-03-30

    This contribution summarizes our current findings in the field of calculating and predicting the linear and second-order nonlinear electric susceptibility tensor components of organic crystals. The methodology used for this purpose is based on a combination of the electrostatic interaction scheme developed by Munn and his coworkers (RLFT) with high-level electronic structure calculations. We compare the results of calculations with available experimental data for several examples of molecular crystals. We show that the quality of the final results is influenced by (i) the chromophore geometry, (ii) the method used for the molecular property calculations, and (iii) the partitioning scheme used. In conclusion, we summarize further plans to improve the reliability and predictability of the method.

  9. The Impact of Multiple Master Patient Index Records on the Business Performance of Health Care Organizations: A Qualitative Grounded Theory Study

    ERIC Educational Resources Information Center

    Banton, Cynthia L.

    2014-01-01

    The purpose of this qualitative grounded theory study was to explore and examine the factors that led to the creation of multiple record entries, and present a theory on the impact the problem has on the business performance of health care organizations. A sample of 59 health care professionals across the United States participated in an online…

  10. Reliability analysis in intelligent machines

    NASA Technical Reports Server (NTRS)

    Mcinroy, John E.; Saridis, George N.

    1990-01-01

    Given an explicit task to be executed, an intelligent machine must be able to find the probability of success, or reliability, of alternative control and sensing strategies. By using concepts from information theory and reliability theory, new techniques are proposed for finding the reliability corresponding to alternative subsets of control and sensing strategies, such that a desired set of specifications can be satisfied. The analysis is straightforward, provided that a set of Gaussian random state variables is available. An example problem illustrates the technique, and general reliability results are presented for visual servoing with a computed torque-control algorithm. Moreover, the example illustrates the principle of increasing precision with decreasing intelligence at the execution level of an intelligent machine.
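
    With Gaussian state variables, the reliability of a candidate strategy can be scored as the probability that the resulting state lands inside its specification window. This is a minimal sketch of that scoring step only; the tolerance and noise figures, and the two hypothetical strategies, are made-up illustration values, not the paper's example.

```python
import math

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def success_probability(mu, sigma, lower, upper):
    """P(lower < X < upper) for Gaussian X ~ N(mu, sigma^2)."""
    return phi((upper - mu) / sigma) - phi((lower - mu) / sigma)

# Two hypothetical sensing/control strategies against the same spec window:
coarse = success_probability(mu=0.0, sigma=0.8, lower=-1.0, upper=1.0)
fine = success_probability(mu=0.0, sigma=0.3, lower=-1.0, upper=1.0)
print(coarse, fine)  # the lower-variance strategy is more reliable
```

    Ranking alternative subsets of strategies then reduces to comparing such probabilities against the desired specifications.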

  11. Understanding Program Planning Theory and Practice in a Feminist Community-Based Organization

    ERIC Educational Resources Information Center

    Bracken, Susan J.

    2011-01-01

    The purpose of this article is to discuss feminist-program-planning issues, drawing from a critical ethnographic study of a Latin American feminist community-based organization. The research findings discuss the centrality of feminist identity to understanding and analyzing day-to-day program-planning process issues within a feminist…

  12. Knowledge sharing within organizations: linking art, theory, scenarios and professional experience

    NASA Technical Reports Server (NTRS)

    Burton, Y. C.; Bailey, T.

    2000-01-01

In this presentation, Burton and Bailey discuss the challenges and opportunities in developing knowledge-sharing systems in organizations. Bailey provides a tool using imagery and collage for identifying and utilizing the diverse values and beliefs of individuals and groups. Burton reveals findings from a business research study that examines how social construction influences knowledge sharing among task-oriented groups.

  13. Understanding the Value of Enterprise Architecture for Organizations: A Grounded Theory Approach

    ERIC Educational Resources Information Center

    Nassiff, Edwin

    2012-01-01

    There is a high rate of information system implementation failures attributed to the lack of alignment between business and information technology strategy. Although enterprise architecture (EA) is a means to correct alignment problems and executives highly rate the importance of EA, it is still not used in most organizations today. Current…

  14. Data networks reliability

    NASA Astrophysics Data System (ADS)

    Gallager, Robert G.

    1988-10-01

The research from 1984 to 1986 on Data Network Reliability had the objective of developing general principles governing the reliable and efficient control of data networks. The research was centered around three major areas: congestion control, multiaccess networks, and distributed asynchronous algorithms. The major topics within congestion control were the use of flow control to reduce congestion and the use of routing to reduce congestion. The major topics within multiaccess networks were the communication properties of multiaccess channels, collision resolution, and packet radio networks. The major topics within asynchronous distributed algorithms were failure recovery, time vs. communication tradeoffs, and the general theory of distributed algorithms.

  15. Reliable prediction of three-body intermolecular interactions using dispersion-corrected second-order Møller-Plesset perturbation theory

    SciTech Connect

    Huang, Yuanhang; Beran, Gregory J. O.

    2015-07-28

    Three-body and higher intermolecular interactions can play an important role in molecular condensed phases. Recent benchmark calculations found problematic behavior for many widely used density functional approximations in treating 3-body intermolecular interactions. Here, we demonstrate that the combination of second-order Møller-Plesset (MP2) perturbation theory plus short-range damped Axilrod-Teller-Muto (ATM) dispersion accurately describes 3-body interactions with reasonable computational cost. The empirical damping function used in the ATM dispersion term compensates both for the absence of higher-order dispersion contributions beyond the triple-dipole ATM term and non-additive short-range exchange terms which arise in third-order perturbation theory and beyond. Empirical damping enables this simple model to out-perform a non-expanded coupled Kohn-Sham dispersion correction for 3-body intermolecular dispersion. The MP2 plus ATM dispersion model approaches the accuracy of O(N{sup 6}) methods like MP2.5 or even spin-component-scaled coupled cluster models for 3-body intermolecular interactions with only O(N{sup 5}) computational cost.
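The triple-dipole Axilrod-Teller-Muto term referenced above has a standard closed form; it is shown here generically, since the specific short-range damping function used in any given dispersion correction is model-dependent:

```latex
% Triple-dipole ATM three-body dispersion energy for atoms a, b, c,
% with interior triangle angles \theta and pair distances r;
% f_damp is a short-range damping function whose form is model-dependent.
E^{(3)}_{\mathrm{ATM}} =
  C_9^{abc}\,
  \frac{1 + 3\cos\theta_a \cos\theta_b \cos\theta_c}
       {\left(r_{ab}\, r_{bc}\, r_{ca}\right)^{3}}\;
  f_{\mathrm{damp}}(r_{ab}, r_{bc}, r_{ca})
```

The abstract's point is that fitting the damping empirically lets this single term absorb the higher-order dispersion and short-range exchange contributions it formally omits.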

  16. Excitons in Organics Using Time-Dependent Density Functional Theory: PPV, Pentacene, and Picene.

    PubMed

    Sharma, S; Dewhurst, J K; Shallcross, S; Madjarova, G K; Gross, E K U

    2015-04-14

We apply the bootstrap kernel within time-dependent density functional theory to study the one-dimensional polymer chain of poly(p-phenylenevinylene) and molecular crystals of picene and pentacene. The absorption spectrum of poly(p-phenylenevinylene) has a bound excitonic peak that is well reproduced. Pentacene and picene, electronically similar materials, have remarkably different excitonic physics, and this difference is also well captured. We show that the inclusion of local-field effects dramatically changes the spectra of both picene and pentacene but not that of poly(p-phenylenevinylene). PMID:26574381

  17. The self-organizing fractal theory as a universal discovery method: the phenomenon of life

    PubMed Central

    2011-01-01

    A universal discovery method potentially applicable to all disciplines studying organizational phenomena has been developed. This method takes advantage of a new form of global symmetry, namely, scale-invariance of self-organizational dynamics of energy/matter at all levels of organizational hierarchy, from elementary particles through cells and organisms to the Universe as a whole. The method is based on an alternative conceptualization of physical reality postulating that the energy/matter comprising the Universe is far from equilibrium, that it exists as a flow, and that it develops via self-organization in accordance with the empirical laws of nonequilibrium thermodynamics. It is postulated that the energy/matter flowing through and comprising the Universe evolves as a multiscale, self-similar structure-process, i.e., as a self-organizing fractal. This means that certain organizational structures and processes are scale-invariant and are reproduced at all levels of the organizational hierarchy. Being a form of symmetry, scale-invariance naturally lends itself to a new discovery method that allows for the deduction of missing information by comparing scale-invariant organizational patterns across different levels of the organizational hierarchy. An application of the new discovery method to life sciences reveals that moving electrons represent a keystone physical force (flux) that powers, animates, informs, and binds all living structures-processes into a planetary-wide, multiscale system of electron flow/circulation, and that all living organisms and their larger-scale organizations emerge to function as electron transport networks that are supported by and, at the same time, support the flow of electrons down the Earth's redox gradient maintained along the core-mantle-crust-ocean-atmosphere axis of the planet. The presented findings lead to a radically new perspective on the nature and origin of life, suggesting that living matter is an organizational state

  18. The lag effect and differential organization theory: nine failures to replicate.

    PubMed

    Toppino, T C; Gracen, T F

    1985-01-01

    In our first experiment, we tried to replicate and extend previous results that had provided seemingly convincing support for a differential organization explanation of the monotonically increasing lag effect in free recall. However, we failed to obtain the lag effect. Eight additional experiments also failed to replicate the lag effect. In all, we made 918 observations of performance on items repeated at each of three lags, and the mean percentage of correct free recall varied by less than one percentage point. These results suggest that there are boundary conditions limiting the generality of the monotonically increasing lag effect in free recall. In addition, caution should be exercised in accepting certain previous findings as strong evidence for a differential organization explanation of the phenomenon. PMID:3156950

  19. Ethical models in bioethics: theory and application in organ allocation policies.

    PubMed

    Petrini, C

    2010-12-01

Policies for allocating organs to people awaiting a transplant constitute a major ethical challenge. First and foremost, they demand balance between the principles of beneficence and justice, but many other ethically relevant principles are also involved: autonomy, responsibility, equity, efficiency, utility, therapeutic outcome, medical urgency, and so forth. Various organ allocation models can be developed based on the hierarchical importance assigned to a given principle over the others, but none of the principles should be completely disregarded. An ethically acceptable organ allocation policy must therefore be in conformity, to a certain extent, with the requirements of all the principles. Many models for organ allocation can be derived. The utilitarian model aims to maximize benefits, which can be of various types on a social or individual level, such as the number of lives saved, prognosis, and so forth. The prioritarian model favours the neediest or those who suffer most. The egalitarian model privileges equity and justice, suggesting that all people should have an equal opportunity (casual allocation) or priority should be given to those who have been waiting longer. The personalist model focuses on each individual patient, attempting to mesh together all the various aspects affecting the person: therapeutic needs (urgency), fairness, clinical outcomes, respect for persons. In the individualistic model the main element is free choice and the system of opting-in is privileged. Contrary to the individualistic model, the communitarian model identifies in the community the fundamental elements for the legitimacy of choices: therefore, the system of opting-out is privileged. This article does not aim to suggest practical solutions. Rather, it furnishes decision makers with an overview of possible ethical approaches to this matter. PMID:21196904

  20. Integrating mechanistic organism--environment interactions into the basic theory of community and evolutionary ecology.

    PubMed

    Baskett, Marissa L

    2012-03-15

This paper presents an overview of how mechanistic knowledge of organism-environment interactions, including biomechanical interactions of heat, mass and momentum transfer, can be integrated into basic theoretical population biology through mechanistic functional responses that quantitatively describe how organisms respond to their physical environment. Integrating such functional responses into simple community and microevolutionary models allows scaling up of the organism-level understanding from biomechanics both ecologically and temporally. For community models, Holling-type functional responses for predator-prey interactions provide a classic example of the functional response affecting qualitative model dynamics, and recent efforts are expanding analogous models to incorporate environmental influences such as temperature. For evolutionary models, mechanistic functional responses dependent on the environment can serve as fitness functions in both quantitative genetic and game theoretic frameworks, especially those concerning function-valued traits. I present a novel comparison of a mechanistic fitness function based on thermal performance curves to a commonly used generic fitness function, which quantitatively differ in their predictions for response to environmental change. A variety of examples illustrate how mechanistic functional responses enhance model connections to biologically relevant traits and processes as well as environmental conditions and therefore have the potential to link theoretical and empirical studies. Sensitivity analysis of such models can provide biologically relevant insight into which parameters and processes are important to community and evolutionary responses to environmental change such as climate change, which can inform conservation management aimed at protecting response capacity. Overall, the distillation of detailed knowledge of organism-environment interactions into mechanistic functional responses in simple population
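The Holling type II functional response mentioned in the abstract has a standard closed form; the sketch below is a generic illustration (parameter values are hypothetical) of how a predator's per-capita intake rate saturates at 1/h as prey density grows, which is the mechanism behind its effect on model dynamics.

```python
def holling_type_II(N, a, h):
    """Holling type II functional response: per-predator intake rate
    at prey density N, with attack rate a and handling time h."""
    return a * N / (1.0 + a * h * N)

# Intake rate rises with prey density but saturates toward 1/h
# (hypothetical parameter values).
for N in (1, 10, 100, 1000):
    print(N, round(holling_type_II(N, a=0.5, h=2.0), 3))
```

Making the attack rate a function of temperature, a(T), is one simple way to fold an environmental driver into the same response, in the spirit of the extensions the abstract describes.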

  1. On the purposes of color for living beings: toward a theory of color organization.

    PubMed

    Pinna, Baingio; Reeves, Adam

    2015-01-01

Phylogenetic and paleontological evidence indicates that in the animal kingdom the ability to perceive colors evolved independently several times over the course of millennia. This implies a high evolutionary neural investment and suggests that color vision provides some fundamental biological benefits. What are these benefits? Why are some animals so colorful? What are the adaptive and perceptual meanings of polychromatism? We suggest that in addition to the discrimination of light and surface chromaticity, sensitivity to color contributes to the whole, the parts and the fragments of perceptual organization. New versions of neon color spreading and the watercolor illusion indicate that the visual purpose of color in humans is threefold: to inter-relate each chromatic component of an object, thus favoring the emergence of the whole; to support a part-whole organization in which components reciprocally enhance each other by amodal completion; and, paradoxically, to reveal fragments and hide the whole; that is, there is a chromatic parceling-out process of separation, division, and fragmentation of the whole. The evolution of these contributions of color to organization needs to be established, but traces of it can be found in Harlequin camouflage by animals and in the coloration of flowers. PMID:24374380

  2. Attachment at (not to) work: applying attachment theory to explain individual behavior in organizations.

    PubMed

    Richards, David A; Schat, Aaron C H

    2011-01-01

    In this article, we report the results of 2 studies that were conducted to investigate whether adult attachment theory explains employee behavior at work. In the first study, we examined the structure of a measure of adult attachment and its relations with measures of trait affectivity and the Big Five. In the second study, we examined the relations between dimensions of attachment and emotion regulation behaviors, turnover intentions, and supervisory reports of counterproductive work behavior and organizational citizenship behavior. Results showed that anxiety and avoidance represent 2 higher order dimensions of attachment that predicted these criteria (except for counterproductive work behavior) after controlling for individual difference variables and organizational commitment. The implications of these results for the study of attachment at work are discussed. PMID:20718531

  3. A Theory for the Function of the Spermaceti Organ of the Sperm Whale (Physeter Catodon L.)

    NASA Technical Reports Server (NTRS)

    Norris, K. S.; Harvey, G. W.

    1972-01-01

The function of the spermaceti organ of the sperm whale is studied using a model of its acoustic system. Suggested functions of the system include: (1) action as an acoustic resonating and sound focussing chamber to form and process burst-pulsed clicks; (2) use of nasal passages in forehead for repeated recycling of air for phonation during dives and to provide mirrors for sound reflection and signal processing; and (3) use of the entire system to allow sound signal production especially useful for long-range echolocation in the deep sea.

  4. Current theories on the pathophysiology of multiple organ failure after trauma.

    PubMed

    Tsukamoto, Takeshi; Chanthaphavong, R Savanh; Pape, Hans-Christoph

    2010-01-01

Despite enormous efforts to elucidate the mechanisms behind the development of multiple organ failure (MOF) following trauma, MOF remains a leading cause of late post-injury death and morbidity. It has now been established that excessive systemic inflammation following trauma participates in the development of MOF. Fundamentally, the inflammatory response is a host-defence response; on occasion, however, this response turns against the host, depending on exogenous and endogenous factors. In this review we describe the pathophysiological approaches to MOF after trauma studied so far and outline the prospects for future work on this issue. PMID:19729158

  5. The logic of transaction cost economics in health care organization theory.

    PubMed

    Stiles, R A; Mick, S S; Wise, C G

    2001-01-01

    Health care is, at its core, comprised of complex sequences of transactions among patients, providers, and other stakeholders; these transactions occur in markets as well as within systems and organizations. Health care transactions serve one of two functions: the production of care (i.e., the laying on of hands) or the coordination of that care (i.e., scheduling, logistics). Because coordinating transactions is integral to care delivery, it is imperative that they are executed smoothly and efficiently. Transaction cost economics (TCE) is a conceptual framework for analyzing health care transactions and quantifying their impact on health care structures (organizational forms), processes, and outcomes. PMID:11293015

  6. Predicting intentions to purchase organic food: the role of affective and moral attitudes in the Theory of Planned Behaviour.

    PubMed

    Arvola, A; Vassallo, M; Dean, M; Lampila, P; Saba, A; Lähteenmäki, L; Shepherd, R

    2008-01-01

This study examined the usefulness of integrating measures of affective and moral attitudes into the Theory of Planned Behaviour (TPB)-model in predicting purchase intentions of organic foods. Moral attitude was operationalised as positive self-rewarding feelings of doing the right thing. Questionnaire data were gathered in three countries: Italy (N=202), Finland (N=270) and UK (N=200) in March 2004. Questions focussed on intentions to purchase organic apples and organic ready-to-cook pizza instead of their conventional alternatives. Data were analysed using Structural Equation Modelling by simultaneous multi-group analysis of the three countries. Along with attitudes, moral attitude and subjective norms explained considerable shares of variances in intentions. The relative influences of these variables varied between the countries, such that in the UK and Italy moral attitude rather than subjective norms had stronger explanatory power. In Finland it was the other way around. Inclusion of moral attitude improved the model fit and predictive ability of the model, although only marginally in Finland. Thus the results partially support the usefulness of incorporating moral measures as well as affective items for attitude into the framework of TPB. PMID:18036702

  7. Mean-field theory of atomic self-organization in optical cavities

    NASA Astrophysics Data System (ADS)

    Jäger, Simon B.; Schütz, Stefan; Morigi, Giovanna

    2016-08-01

Photons mediate long-range optomechanical forces between atoms in high-finesse resonators, which can induce the formation of ordered spatial patterns. When a transverse laser drives the atoms, the system undergoes a second-order phase transition that separates a uniform spatial density from a Bragg grating maximizing scattering into the cavity and is controlled by the laser intensity. Starting from a Fokker-Planck equation describing the semiclassical dynamics of the N-atom distribution function, we systematically develop a mean-field model and analyze its predictions for the equilibrium and out-of-equilibrium dynamics. The validity of the mean-field model is tested by comparison with the numerical simulations of the N-body Fokker-Planck equation and by means of a Bogoliubov-Born-Green-Kirkwood-Yvon (BBGKY) hierarchy. The mean-field predictions reproduce several results of the N-body Fokker-Planck equation well for sufficiently short times and are in good agreement with existing theoretical approaches based on field-theoretical models. The mean field, on the other hand, predicts thermalization time scales which are at least one order of magnitude shorter than the ones predicted by the N-body dynamics. We attribute this discrepancy to the fact that the mean-field ansatz discards the effects of the long-range incoherent forces due to cavity losses.

  8. From organized high throughput data to phenomenological theory: The example of dielectric breakdown

    NASA Astrophysics Data System (ADS)

    Kim, Chiho; Pilania, Ghanshyam; Ramprasad, Rampi

    Understanding the behavior (and failure) of dielectric insulators experiencing extreme electric fields is critical to the operation of present and emerging electrical and electronic devices. Despite its importance, the development of a predictive theory of dielectric breakdown has remained a challenge, owing to the complex multiscale nature of this process. Here, we focus on the intrinsic dielectric breakdown field of insulators--the theoretical limit of breakdown determined purely by the chemistry of the material, i.e., the elements the material is composed of, the atomic-level structure, and the bonding. Starting from a benchmark dataset (generated from laborious first principles computations) of the intrinsic dielectric breakdown field of a variety of model insulators, simple predictive phenomenological models of dielectric breakdown are distilled using advanced statistical or machine learning schemes, revealing key correlations and analytical relationships between the breakdown field and easily accessible material properties. The models are shown to be general, and can hence guide the screening and systematic identification of high electric field tolerant materials.

  9. The metabolic pace-of-life model: incorporating ectothermic organisms into the theory of vertebrate ecoimmunology.

    PubMed

    Sandmeier, Franziska C; Tracy, Richard C

    2014-09-01

We propose a new heuristic model that incorporates metabolic rate and pace of life to predict a vertebrate species' investment in adaptive immune function. Using reptiles as an example, we hypothesize that animals with low metabolic rates will invest more in innate immunity compared with adaptive immunity. High metabolic rates and body temperatures should logically optimize the efficacy of the adaptive immune system, through rapid replication of T and B cells, prolific production of induced antibodies, and the kinetics of antibody-antigen interactions. In current theory, the precise mechanisms of vertebrate immune function are often inadequately considered as diverse selective pressures on the evolution of pathogens. We propose that the strength of adaptive immune function and pace of life together determine many of the important dynamics of host-pathogen evolution, namely, that hosts with a short lifespan and innate immunity or with a long lifespan and strong adaptive immunity are expected to drive the rapid evolution of their populations of pathogens. Long-lived hosts that rely primarily on innate immune functions are more likely to use defense mechanisms of tolerance (instead of resistance), which are not expected to act as a selection pressure for the rapid evolution of pathogens' virulence. PMID:24760792

  10. Optical detection of charge carriers in multilayer organic light-emitting diodes: Experiment and theory

    NASA Astrophysics Data System (ADS)

    Book, K.; Nikitenko, V. R.; Bässler, H.; Elschner, A.

    2001-03-01

We have investigated a multilayer organic light-emitting diode with 1,3,5-tris (N,N-bis-(4-methoxyphenyl)aminophenyl)-benzene acting as the hole transporting layer (HTL) and tris (8-hydroxy-quinolinolato) aluminum (Alq3) as the electron transporting layer. Positive charge carriers in the HTL were detected optically as a function of the applied bias. It was found that a hole injecting layer, consisting of poly(3,4-ethylenedioxythiophene) doped with polystyrenesulfonate (PEDOT:PSS), forms an ohmic contact to the HTL by inducing a thin layer of holes in the interfacial region. An analytical model is developed to describe the observed carrier concentrations as well as the current-brightness-voltage characteristics quantitatively.

  11. Spin-boson theory for charge photogeneration in organic molecules: Role of quantum coherence

    NASA Astrophysics Data System (ADS)

    Yao, Yao

    2015-01-01

The charge photogeneration process in organic molecules is investigated by a quantum heat engine model, in which two molecules are modeled by a two-spin system sandwiched between two bosonic baths. The two baths represent the high-temperature photon emission source and the low-temperature phonon environment, respectively. We utilize the time-dependent density matrix renormalization group algorithm to investigate the quantum dynamics of the model. It is found that the transient energy current flowing through the two molecules exhibits two stages. In the first stage the energy current is coherent and represents the ultrafast delocalization of the charge-transfer state, and in the second stage a steady incoherent current is established. The power conversion efficiency is remarkably high and may reach a maximum value of 93% with optimized model parameters. The long-lived quantum entanglement between the two spins is found to be primarily responsible for the hyperefficiency.

  12. Organizations.

    ERIC Educational Resources Information Center

    Aviation/Space, 1980

    1980-01-01

    This is a list of aerospace organizations and other groups that provides educators with assistance and information in specific areas. Both government and nongovernment organizations are included. (Author/SA)

  13. Theory of Current Transients in Planar Semiconductor Devices: Insights and Applications to Organic Solar Cells

    NASA Astrophysics Data System (ADS)

    Hawks, Steven A.; Finck, Benjamin Y.; Schwartz, Benjamin J.

    2015-04-01

    Time-domain current measurements are widely used to characterize semiconductor material properties, such as carrier mobility, doping concentration, carrier lifetime, and the static dielectric constant. It is therefore critical that these measurements be theoretically understood if they are to be successfully applied to assess the properties of materials and devices. In this paper, we derive generalized relations for describing current-density transients in planar semiconductor devices at uniform temperature. By spatially averaging the charge densities inside the semiconductor, we are able to provide a rigorous, straightforward, and experimentally relevant way to interpret these measurements. The formalism details several subtle aspects of current transients, including how the electrode charge relates to applied bias and internal space charge, how the displacement current can alter the apparent free-carrier current, and how to understand the integral of a charge-extraction transient. We also demonstrate how the formalism can be employed to derive the current transients arising from simple physical models, like those used to describe charge extraction by linearly increasing voltage (CELIV) and time-of-flight experiments. In doing so, we find that there is a nonintuitive factor-of-2 reduction in the apparent free-carrier concentration that can be easily missed, for example, in the application of charge-extraction models. Finally, to validate our theory and better understand the different current contributions, we perform a full time-domain drift-diffusion simulation of a CELIV trace and compare the results to our formalism. As expected, our analytic equations match precisely with the numerical solutions to the drift-diffusion, Poisson, and continuity equations. 
Thus, overall, our formalism provides a straightforward and general way to think about how the internal space-charge distribution, the electrode charge, and the externally applied bias translate into a measured

  14. Reliability and Confidence.

    ERIC Educational Resources Information Center

    Test Service Bulletin, 1952

    1952-01-01

    Some aspects of test reliability are discussed. Topics covered are: (1) how high should a reliability coefficient be?; (2) two factors affecting the interpretation of reliability coefficients--range of talent and interval between testings; (3) some common misconceptions--reliability of speed tests, part vs. total reliability, reliability for what…
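The bulletin's point about part versus total reliability is usually made quantitative with the Spearman-Brown prophecy formula, which predicts the reliability of a lengthened (or shortened) test from the reliability of a part; a minimal sketch with hypothetical example coefficients:

```python
def spearman_brown(r, k):
    """Predicted reliability when a test with reliability r is
    lengthened by a factor k (Spearman-Brown prophecy formula)."""
    return k * r / (1.0 + (k - 1.0) * r)

# A half-test with reliability 0.50 predicts a full-test (k=2)
# reliability of 2/3; hypothetical numbers for illustration.
print(round(spearman_brown(0.50, 2), 3))
```

The same formula run with k < 1 shows why a short part of a test is always less reliable than the whole, one of the misconceptions the bulletin flags.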

  15. Theory of the β-Type Organic Superconductivity under Uniaxial Compression

    NASA Astrophysics Data System (ADS)

    Suzuki, Takeo; Onari, Seiichiro; Ito, Hiroshi; Tanaka, Yukio

    2011-09-01

We study theoretically the shift of the superconducting transition temperature (Tc) under uniaxial compression in β-type organic superconductors, β-(BEDT-TTF)2I3 and β-(BDA-TTP)2X (X=SbF6, AsF6), in order to clarify the electron correlation, the spin frustration, and the effect of dimerization. The transfer integrals are calculated by the extended Hückel method assuming the uniaxial strain, and the superconducting state mediated by spin fluctuations is solved using Eliashberg's equation with the fluctuation-exchange approximation. The calculation is carried out on both the dimerized (one-band) and nondimerized (two-band) Hubbard models. We have found that (i) the behavior of Tc in β-(BEDT-TTF)2I3 with a stronger dimerization is well reproduced by the dimer model, while that in weakly dimerized β-BDA-TTP salts is rather well reproduced by the two-band model, and (ii) the competition between the spin frustration and the effect induced by the fluctuation is important in these materials, which causes the nonmonotonic shift of Tc against uniaxial compression.

  16. The vibrational energy flow transition in organic molecules: Theory meets experiment

    PubMed Central

    Bigwood, R.; Gruebele, M.; Leitner, D. M.; Wolynes, P. G.

    1998-01-01

    Most large dynamical systems are thought to have ergodic dynamics, whereas small systems may not have free interchange of energy between degrees of freedom. This assumption is made in many areas of chemistry and physics, ranging from nuclei to reacting molecules and on to quantum dots. We examine the transition to facile vibrational energy flow in a large set of organic molecules as molecular size is increased. Both analytical and computational results based on local random matrix models describe the transition to unrestricted vibrational energy flow in these molecules. In particular, the models connect the number of states participating in intramolecular energy flow to simple molecular properties such as the molecular size and the distribution of vibrational frequencies. The transition itself is governed by a local anharmonic coupling strength and a local state density. The theoretical results for the transition characteristics compare well with those implied by experimental measurements using IR fluorescence spectroscopy of dilution factors reported by Stewart and McDonald [Stewart, G. M. & McDonald, J. D. (1983) J. Chem. Phys. 78, 3907–3915]. PMID:9600899

  17. Charge Photogeneration Experiments and Theory in Aggregated Squaraine Donor Materials for Improved Organic Solar Cell Efficiencies

    NASA Astrophysics Data System (ADS)

    Spencer, Susan Demetra

    Fossil fuel consumption has a deleterious effect on humans, the economy, and the environment. Renewable energy technologies must be identified and commercialized as quickly as possible so that the transition to renewables can happen at a minimum of financial and societal cost. Organic photovoltaic cells offer an inexpensive and disruptive energy technology, if the scientific challenges of understanding charge photogeneration in a bulk heterojunction material can be overcome. At RIT, there is a strong focus on creating new materials that can both offer fundamentally important scientific results relating to quantum photophysics, and simultaneously assist in the development of strong candidates for future commercialized technology. In this presentation, the results of intensive materials characterization of a series of squaraine small molecule donors will be presented, as well as a full study of the fabrication and optimization required to achieve >4% photovoltaic cell efficiency. A relationship between the molecular structure of the squaraine and its ability to form nanoscale aggregates will be explored. Squaraine aggregation will be described as a unique optoelectronic probe of the structure of the bulk heterojunction. This relationship will then be utilized to explain changes in crystallinity that impact the overall performance of the devices. Finally, a predictive summary will be given for the future of donor material research at RIT.

  18. Understanding Small-Molecule Interactions in Metal-Organic Frameworks: Coupling Experiment with Theory.

    PubMed

    Lee, Jason S; Vlaisavljevich, Bess; Britt, David K; Brown, Craig M; Haranczyk, Maciej; Neaton, Jeffrey B; Smit, Berend; Long, Jeffrey R; Queen, Wendy L

    2015-10-14

Metal-organic frameworks (MOFs) have gained much attention as next-generation porous media for various applications, especially gas separation/storage and catalysis. New MOFs are regularly reported; however, to develop better materials in a timely manner for specific applications, the interactions between guest molecules and the internal surface of the framework must first be understood. A combined experimental and theoretical approach is presented, which proves essential for the elucidation of small-molecule interactions in a model MOF system known as M2(dobdc) (dobdc(4-) = 2,5-dioxido-1,4-benzenedicarboxylate; M = Mg, Mn, Fe, Co, Ni, Cu, or Zn), a material whose adsorption properties can be readily tuned via chemical substitution. It is additionally shown that the study of extensive families like this one can provide a platform to test the efficacy and accuracy of developing computational methodologies in slightly varying chemical environments, a task that is necessary for their evolution into viable, robust tools for screening large numbers of materials. PMID:26033176

  19. Fatigue properties of atomic-layer-deposited alumina ultra-barriers and their implications for the reliability of flexible organic electronics

    NASA Astrophysics Data System (ADS)

    Baumert, E. K.; Pierron, O. N.

    2012-12-01

    The fatigue degradation properties of atomic-layer-deposited alumina, with thickness ranging from 4.2 to 50 nm, were investigated using a silicon micro-resonator on which the coatings were deposited and strained in a static or cyclic manner, with strain amplitudes up to 2.2%, in controlled environments. Based on the measured resonant frequency evolution, post-test scanning electron microscopy observations, and finite element models, it is shown that cracks in the alumina nucleate and propagate under cyclic loading, and that the crack growth rates scale with the strain energy release rates for crack channeling. The implications for the reliability of flexible electronics are discussed.
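The reported scaling of crack growth with the strain energy release rate for channel cracking can be illustrated with the textbook steady-state estimate G_ss = Z·σ²·h/Ē for a thin film on a substrate. This is a generic sketch, not the paper's finite element analysis; the modulus, Poisson's ratio, and channeling coefficient Z below are assumed round values.

```python
# Rough illustrative estimate (not from the paper): steady-state energy
# release rate for a crack channeling through a thin brittle film,
# G_ss = Z * sigma^2 * h / E_bar. All material values are assumptions.

E = 180e9          # Young's modulus of ALD alumina, Pa (assumed)
nu = 0.24          # Poisson's ratio (assumed)
Z = 2.0            # dimensionless channeling coefficient (assumed ~2)
E_bar = E / (1.0 - nu ** 2)   # plane-strain modulus

def channeling_G(strain, thickness_m):
    """Energy release rate (J/m^2) for channel cracking at a given
    applied film strain and film thickness."""
    sigma = E_bar * strain            # film stress from applied strain
    return Z * sigma ** 2 * thickness_m / E_bar

# G scales linearly with film thickness, which is one reason thinner
# barriers tolerate larger strains before cracking.
for h_nm in (4.2, 25.0, 50.0):
    print(f"h = {h_nm:5.1f} nm -> G = {channeling_G(0.022, h_nm * 1e-9):.2f} J/m^2")
```

With these assumed constants, a 50 nm film at 2.2% strain gives a driving force of roughly 9 J/m², in the range where cracking of a brittle oxide becomes plausible; the point of the sketch is the linear thickness scaling, not the absolute numbers.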

  20. Fatigue properties of atomic-layer-deposited alumina ultra-barriers and their implications for the reliability of flexible organic electronics

    SciTech Connect

    Baumert, E. K.; Pierron, O. N.

    2012-12-17

    The fatigue degradation properties of atomic-layer-deposited alumina, with thickness ranging from 4.2 to 50 nm, were investigated using a silicon micro-resonator on which the coatings were deposited and strained in a static or cyclic manner, with strain amplitudes up to 2.2%, in controlled environments. Based on the measured resonant frequency evolution, post-test scanning electron microscopy observations, and finite element models, it is shown that cracks in the alumina nucleate and propagate under cyclic loading, and that the crack growth rates scale with the strain energy release rates for crack channeling. The implications for the reliability of flexible electronics are discussed.

  1. Inventing the future of reliability: FERC's recent orders and the consolidation of reliability authority

    SciTech Connect

    Skees, J. Daniel

    2010-06-15

    The Energy Policy Act of 2005 established mandatory reliability standard enforcement under a system in which the Federal Energy Regulatory Commission and the Electric Reliability Organization would have their own spheres of responsibility and authority. Recent orders, however, reflect the Commission's frustration with the reliability standard drafting process and suggest that the Electric Reliability Organization's discretion is likely to receive less deference in the future. (author)

  2. Formation of persistent organic pollutants from 2,4,5-trichlorothiophenol combustion: a density functional theory investigation.

    PubMed

    Dar, Tajwar; Shah, Kalpit; Moghtaderi, Behdad; Page, Alister J

    2016-06-01

Polychlorinated dibenzothiophene (PCDT) and polychlorinated thianthrene (PCTA) are sulfur analogues of dioxins, such as polychlorinated dibenzo-p-dioxins and polychlorinated dibenzofurans (PCDD/F). In this work, we present a detailed mechanistic and kinetic analysis of PCDT and PCTA formation from the combustion of 2,4,5-trichlorothiophenol. It is shown that the formation of these persistent organic pollutants is more favourable, both kinetically and thermodynamically, than their analogous dioxin counterparts. This is rationalised in terms of the different influences of the S-H and O-H moieties in the 2,4,5-trichlorothiophenol and 2,4,5-trichlorophenol precursors. Kinetic parameters also indicate that the yield of PCDT should exceed that of PCDD. Finally, we demonstrate here that the degree and pattern of chlorination on the 2,4,5-trichlorothiophenol precursor leads to subtle thermodynamic and kinetic changes to the PCDT/PCTA formation mechanisms. Graphical abstract Formation mechanisms of the persistent organic pollutants PCDT and PCTA from 2,4,5-trichlorothiophenol combustion have been investigated using density functional theory. PMID:27179803

  3. Reliability and Maintainability (RAM) Training

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Packard, Michael H. (Editor)

    2000-01-01

The theme of this manual is failure physics: the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low-cost reliable products. In a broader sense the manual should do more. It should underscore the urgent need for mature attitudes toward reliability. Five of the chapters were originally presented as a classroom course to over 1000 Martin Marietta engineers and technicians. Another four chapters and three appendixes have been added. We begin with a view of reliability from the years 1940 to 2000. Chapter 2 starts the training material with a review of mathematics and a description of what elements contribute to product failures. The remaining chapters elucidate basic reliability theory and the disciplines that allow us to control and eliminate failures.

  4. Kleiber's Law: How the Fire of Life ignited debate, fueled theory, and neglected plants as model organisms

    PubMed Central

    Niklas, Karl J; Kutschera, Ulrich

    2015-01-01

Size is a key feature of any organism since it influences the rate at which resources are consumed and thus affects metabolic rates. In the 1930s, size-dependent relationships were codified as “allometry” and it was shown that most of these could be quantified using the slopes of log-log plots of any 2 variables of interest. During the decades that followed, physiologists explored how animal respiration rates varied as a function of body size across taxa. The expectation was that rates would scale as the 2/3 power of body size as a reflection of the Euclidean relationship between surface area and volume. However, the work of Max Kleiber (1893–1976) and others revealed that animal respiration rates apparently scale more closely as the 3/4 power of body size. This phenomenology, which is called “Kleiber's Law,” has been described for a broad range of organisms, including some algae and plants. It has also been severely criticized on theoretical and empirical grounds. Here, we review the history of the analysis of metabolism, which originated with the works of Antoine L. Lavoisier (1743–1794) and Julius Sachs (1832–1897), and culminated in Kleiber's book The Fire of Life (1961; 2nd ed. 1975). We then evaluate some of the criticisms that have been leveled against Kleiber's Law and some examples of the theories that have tried to explain it. We revive the speculation that intracellular exo- and endocytotic processes are resource-delivery systems, analogous to the supercellular systems in multicellular organisms. Finally, we present data that cast doubt on the existence of a single scaling relationship between growth and body size in plants. PMID:26156204

  5. Kleiber's Law: How the Fire of Life ignited debate, fueled theory, and neglected plants as model organisms.

    PubMed

    Niklas, Karl J; Kutschera, Ulrich

    2015-01-01

Size is a key feature of any organism since it influences the rate at which resources are consumed and thus affects metabolic rates. In the 1930s, size-dependent relationships were codified as "allometry" and it was shown that most of these could be quantified using the slopes of log-log plots of any 2 variables of interest. During the decades that followed, physiologists explored how animal respiration rates varied as a function of body size across taxa. The expectation was that rates would scale as the 2/3 power of body size as a reflection of the Euclidean relationship between surface area and volume. However, the work of Max Kleiber (1893-1976) and others revealed that animal respiration rates apparently scale more closely as the 3/4 power of body size. This phenomenology, which is called "Kleiber's Law," has been described for a broad range of organisms, including some algae and plants. It has also been severely criticized on theoretical and empirical grounds. Here, we review the history of the analysis of metabolism, which originated with the works of Antoine L. Lavoisier (1743-1794) and Julius Sachs (1832-1897), and culminated in Kleiber's book The Fire of Life (1961; 2nd ed. 1975). We then evaluate some of the criticisms that have been leveled against Kleiber's Law and some examples of the theories that have tried to explain it. We revive the speculation that intracellular exo- and endocytotic processes are resource-delivery systems, analogous to the supercellular systems in multicellular organisms. Finally, we present data that cast doubt on the existence of a single scaling relationship between growth and body size in plants. PMID:26156204
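The allometric procedure the abstract describes (quantifying a scaling relationship from the slope of a log-log plot) can be shown in a minimal sketch: the power law B = a·M^b becomes log B = log a + b·log M, so an ordinary least-squares slope estimates the exponent b. The data below are synthetic, generated to follow a 3/4-power law; they are not Kleiber's measurements.

```python
import math

# Hypothetical body-mass (kg) / metabolic-rate (W) pairs, generated to
# follow a 3/4-power law exactly. Illustrative data only.
masses = [0.02, 0.2, 2.0, 20.0, 200.0, 2000.0]
rates = [3.4 * m ** 0.75 for m in masses]

# On log-log axes, B = a * M^b is linear: log B = log a + b * log M,
# so the least-squares slope of the log-log data estimates b.
xs = [math.log10(m) for m in masses]
ys = [math.log10(r) for r in rates]
n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))

print(f"estimated scaling exponent b = {slope:.3f}")
```

With noiseless synthetic data the recovered slope is exactly 0.75; the empirical debate the abstract reviews is precisely about whether real respiration data yield 2/3, 3/4, or no single exponent at all.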

  6. High Reliability and Excellence in Staffing.

    PubMed

    Mensik, Jennifer

    2015-01-01

    Nurse staffing is a complex issue, with many facets and no one right answer. High-reliability organizations (HROs) strive and succeed in achieving a high degree of safety or reliability despite operating in hazardous conditions. HROs have systems in place that make them extremely consistent in accomplishing their goals and avoiding potential errors. However, the inability to resolve quality issues may very well be related to the lack of adoption of high-reliability principles throughout our organizations. PMID:26625582

  7. Organics.

    ERIC Educational Resources Information Center

    Chian, Edward S. K.; DeWalle, Foppe B.

    1978-01-01

    Presents water analysis literature for 1978. This review is concerned with organics, and it covers: (1) detergents and surfactants; (2) aliphatic and aromatic hydrocarbons; (3) pesticides and chlorinated hydrocarbons; and (4) naturally occurring organics. A list of 208 references is also presented. (HM)

  8. Packaging Theory.

    ERIC Educational Resources Information Center

    Williams, Jeffrey

    1994-01-01

Considers the recent flood of anthologies of literary criticism and theory as exemplifications of the confluence of pedagogical concerns, economics of publishing, and other historical factors. Looks specifically at how these anthologies present theory. Cites problems with their formatting of theory and proposes alternative ways of organizing theory…

  9. Reliability and Validity of the World Health Organization Quality of Life: Brief Version (WHOQOL-BREF) in a Homeless Substance Dependent Veteran Population

    ERIC Educational Resources Information Center

    Garcia-Rea, Elizabeth A.; LePage, James P.

    2010-01-01

    With the high number of homeless, there is a critical need for rapid and accurate assessment of quality of life to assess program outcomes. The World Health Organization's WHOQOL-100 has demonstrated promise in accurately assessing quality-of-life in this population. However, its length may make large scale use impractical for working with a…

  10. Well-organized raspberry-like Ag@Cu bimetal nanoparticles for highly reliable and reproducible surface-enhanced Raman scattering

    NASA Astrophysics Data System (ADS)

    Lee, Jung-Pil; Chen, Dongchang; Li, Xiaxi; Yoo, Seungmin; Bottomley, Lawrence A.; El-Sayed, Mostafa A.; Park, Soojin; Liu, Meilin

    2013-11-01

Surface-enhanced Raman scattering (SERS) is ideally suited for probing and mapping surface species and incipient phases on fuel cell electrodes because of its high sensitivity and surface-selectivity, potentially offering insights into the mechanisms of chemical and energy transformation processes. In particular, bimetal nanostructures of coinage metals (Au, Ag, and Cu) have attracted much attention as SERS-active agents due to their distinctive electromagnetic field enhancements originated from surface plasmon resonance. Here we report excellent SERS-active, raspberry-like nanostructures composed of a silver (Ag) nanoparticle core decorated with smaller copper (Cu) nanoparticles, which displayed enhanced and broadened UV-Vis absorption spectra. These unique Ag@Cu raspberry nanostructures enable us to use blue, green, and red light as the excitation laser source for surface-enhanced Raman spectroscopy (SERS) with a large enhancement factor (EF). A highly reliable SERS effect was demonstrated using Rhodamine 6G (R6G) molecules and a thin film of gadolinium doped ceria.

  11. Reliability beyond Theory and into Practice

    ERIC Educational Resources Information Center

    Sijtsma, Klaas

    2009-01-01

    The critical reactions of Bentler (2009, doi: 10.1007/s11336-008-9100-1), Green and Yang (2009a, doi: 10.1007/s11336-008-9098-4 ; 2009b, doi: 10.1007/s11336-008-9099-3), and Revelle and Zinbarg (2009, doi: 10.1007/s11336-008-9102-z) to Sijtsma's (2009, doi: 10.1007/s11336-008-9101-0) paper on Cronbach's alpha are addressed. The dissemination of…

  12. Reliability of Scores on the Summative Performance Assessments

    ERIC Educational Resources Information Center

    Yang, Yanyun; Oosterhof, Albert; Xia, Yan

    2015-01-01

The authors address the reliability of scores obtained on the summative performance assessments during the pilot year of our research. In contrast to classical test theory, we discuss the advantages of using generalizability theory for estimating the reliability of scores for summative performance assessments. Generalizability theory was used as the…
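As a hedged illustration of the generalizability-theory approach the record favors: for a fully crossed persons-by-raters design, variance components estimated from a two-way ANOVA combine into a generalizability (G) coefficient, the G-theory analogue of a classical reliability coefficient. The score matrix below is invented, and this is the simplest one-facet case only.

```python
# Illustrative sketch: relative G coefficient for a persons x raters
# crossed design. The score matrix is made up for demonstration.
scores = [  # rows: persons (examinees), columns: raters
    [7, 8, 7],
    [5, 5, 6],
    [9, 9, 8],
    [4, 5, 4],
    [6, 7, 7],
]
n_p, n_r = len(scores), len(scores[0])
grand = sum(map(sum, scores)) / (n_p * n_r)
p_means = [sum(row) / n_r for row in scores]
r_means = [sum(scores[p][r] for p in range(n_p)) / n_p for r in range(n_r)]

# Mean squares for the two-way crossed ANOVA (persons, raters, residual)
ms_p = n_r * sum((m - grand) ** 2 for m in p_means) / (n_p - 1)
ms_pr = sum((scores[p][r] - p_means[p] - r_means[r] + grand) ** 2
            for p in range(n_p) for r in range(n_r)) / ((n_p - 1) * (n_r - 1))

# Variance components, then the relative G coefficient for n_r raters:
# E(rho^2) = var_p / (var_p + var_pr / n_r)
var_pr = ms_pr                          # person x rater interaction + error
var_p = max((ms_p - ms_pr) / n_r, 0.0)  # true person (universe-score) variance
g_coef = var_p / (var_p + var_pr / n_r)
print(f"G coefficient = {g_coef:.3f}")
```

The same variance components can be reused to forecast reliability for a different number of raters (a decision study), which is the flexibility classical test theory lacks.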

  13. Self-organized criticality as Witten-type topological field theory with spontaneously broken Becchi-Rouet-Stora-Tyutin symmetry

    SciTech Connect

    Ovchinnikov, Igor V.

    2011-05-15

Here, a scenario is proposed, according to which a generic self-organized critical (SOC) system can be looked upon as a Witten-type topological field theory (W-TFT) with spontaneously broken Becchi-Rouet-Stora-Tyutin (BRST) symmetry. One of the conditions for the SOC is the slow driving noise, which unambiguously suggests Stratonovich interpretation of the corresponding stochastic differential equation (SDE). This, in turn, necessitates the use of the Parisi-Sourlas-Wu stochastic quantization procedure, which straightforwardly leads to a model with BRST-exact action, i.e., to a W-TFT. In the parameter space of the SDE, there must exist full-dimensional regions where the BRST symmetry is spontaneously broken by instantons, which in the context of SOC are essentially avalanches. In these regions, the avalanche-type SOC dynamics is liberated from an otherwise dynamics-less W-TFT, and a Goldstone mode of Faddeev-Popov ghosts exists. Goldstinos represent moduli of instantons (avalanches) and, being gapless, are responsible for the critical avalanche distribution in the low-energy, long-wavelength limit. The above arguments are robust against moderate variations of the SDE's parameters, and the criticality is 'self-tuned'. The proposition of this paper suggests that the machinery of W-TFTs may find its applications in many different areas of modern science studying various physical realizations of SOC. It also suggests that there may in principle exist a connection between some SOC's and the concept of topological quantum computing.

  14. The effects of instructors' autonomy support and students' autonomous motivation on learning organic chemistry: A self-determination theory perspective

    NASA Astrophysics Data System (ADS)

    Black, Aaron E.; Deci, Edward L.

    2000-11-01

    This prospective study applied self-determination theory to investigate the effects of students' course-specific self-regulation and their perceptions of their instructors' autonomy support on adjustment and academic performance in a college-level organic chemistry course. The study revealed that: (1) students' reports of entering the course for relatively autonomous (vs. controlled) reasons predicted higher perceived competence and interest/enjoyment and lower anxiety and grade-focused performance goals during the course, and were related to whether or not the students dropped the course; and (2) students' perceptions of their instructors' autonomy support predicted increases in autonomous self-regulation, perceived competence, and interest/enjoyment, and decreases in anxiety over the semester. The change in autonomous self-regulation in turn predicted students' performance in the course. Further, instructor autonomy support also predicted course performance directly, although differences in the initial level of students' autonomous self-regulation moderated that effect, with autonomy support relating strongly to academic performance for students initially low in autonomous self-regulation but not for students initially high in autonomous self-regulation.

  15. Probing Surface-Adlayer Conjugation on Organic-Modified Si(111) Surfaces with Microscopy, Scattering, Spectroscopy, and Density Functional Theory

    SciTech Connect

    Kellar, Joshua A.; Lin, Jui-Ching; Kim, Jun-Hyun; Yoder, Nathan L.; Bevan, Kirk H.; Stokes, Grace Y.; Geiger, Franz M.; Nguyen, SonBinh T.; Bedzyk, Michael J.; Hersam, Mark C.

    2009-03-24

    Highly conjugated molecules bound to silicon are promising candidates for organosilicon electronic devices and sensors. In this study, 1-bromo-4-ethynylbenzene was synthesized and reacted with a hydrogen-passivated Si(111) surface via ultraviolet irradiation. Through an array of characterization and modeling tools, the binding configuration and morphology of the reacted molecule were thoroughly analyzed. Atomic force microscopy confirmed an atomically flat surface morphology following reaction, while X-ray photoelectron spectroscopy verified reaction to the surface via the terminal alkyne moiety. In addition, synchrotron X-ray characterization, including X-ray reflectivity, X-ray fluorescence, and X-ray standing wave measurements, enabled sub-angstrom determination of the position of the bromine atom with respect to the silicon lattice. This structural characterization was quantitatively compared with density functional theory (DFT) calculations, thus enabling the {pi}-conjugation of the terminal carbon atoms to be deduced. The X-ray and DFT results were additionally corroborated with the vibrational spectrum of the organic adlayer, which was measured with sum frequency generation. Overall, these results illustrate that the terminal carbon atoms in 1-bromo-4-ethynylbenzene adlayers on Si(111) retain {pi}-conjugation, thus revealing alkyne molecules as promising candidates for organosilicon electronics and sensing.

  16. Phosphorescence lifetimes of organic light-emitting diodes from two-component time-dependent density functional theory

    SciTech Connect

    Kühn, Michael; Weigend, Florian

    2014-12-14

    “Spin-forbidden” transitions are calculated for an eight-membered set of iridium-containing candidate molecules for organic light-emitting diodes (OLEDs) using two-component time-dependent density functional theory. Phosphorescence lifetimes (obtained from averaging over relevant excitations) are compared to experimental data. Assessment of parameters like non-distorted and distorted geometric structures, density functionals, relativistic Hamiltonians, and basis sets was done by a thorough study for Ir(ppy){sub 3} focussing not only on averaged phosphorescence lifetimes, but also on the agreement of the triplet substate structure with experimental data. The most favorable methods were applied to an eight-membered test set of OLED candidate molecules; Boltzmann-averaged phosphorescence lifetimes were investigated concerning the convergence with the number of excited states and the changes when including solvent effects. Finally, a simple model for sorting out molecules with long averaged phosphorescence lifetimes is developed by visual inspection of computationally easily achievable one-component frontier orbitals.

  17. Workplace support, discrimination, and person-organization fit: tests of the theory of work adjustment with LGB individuals.

    PubMed

    Velez, Brandon L; Moradi, Bonnie

    2012-07-01

    The present study explored the links of 2 workplace contextual variables--perceptions of workplace heterosexist discrimination and lesbian, gay, and bisexual (LGB)-supportive climates--with job satisfaction and turnover intentions in a sample of LGB employees. An extension of the theory of work adjustment (TWA) was used as the conceptual framework for the study; as such, perceived person-organization (P-O) fit was tested as a mediator of the relations between the workplace contextual variables and job outcomes. Data were analyzed from 326 LGB employees. Zero-order correlations indicated that perceptions of workplace heterosexist discrimination and LGB-supportive climates were correlated in expected directions with P-O fit, job satisfaction, and turnover intentions. Structural equation modeling (SEM) was used to compare multiple alternative measurement models evaluating the discriminant validity of the 2 workplace contextual variables relative to one another, and the 3 TWA job variables relative to one another; SEM was also used to test the hypothesized mediation model. Comparisons of multiple alternative measurement models supported the construct distinctiveness of the variables of interest. The test of the hypothesized structural model revealed that only LGB-supportive climates (and not workplace heterosexist discrimination) had a unique direct positive link with P-O fit and, through the mediating role of P-O fit, had significant indirect positive and negative relations with job satisfaction and turnover intentions, respectively. Moreover, P-O fit had a significant indirect negative link with turnover intentions through job satisfaction. PMID:22642266

  18. Reliability model generator

    NASA Technical Reports Server (NTRS)

    McMann, Catherine M. (Inventor); Cohen, Gerald C. (Inventor)

    1991-01-01

    An improved method and system for automatically generating reliability models for use with a reliability evaluation tool is described. The reliability model generator of the present invention includes means for storing a plurality of low level reliability models which represent the reliability characteristics for low level system components. In addition, the present invention includes means for defining the interconnection of the low level reliability models via a system architecture description. In accordance with the principles of the present invention, a reliability model for the entire system is automatically generated by aggregating the low level reliability models based on the system architecture description.
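The aggregation idea this record describes can be sketched under simple assumptions: treat each stored low-level model as a point reliability, and let a nested series/parallel architecture description drive the aggregation into a system-level figure. The component names, values, and two composition rules below are illustrative, not the patented generator.

```python
# Minimal sketch (not NASA's generator): aggregate low-level reliability
# models into a system reliability according to an architecture
# description. Components and architecture are hypothetical.

component_reliability = {          # low-level models: R at mission time t
    "cpu_a": 0.995, "cpu_b": 0.995, "bus": 0.999, "sensor": 0.990,
}

# Architecture description: nested ("series", [...]) / ("parallel", [...])
architecture = ("series", [
    ("parallel", ["cpu_a", "cpu_b"]),   # redundant processors
    "bus",
    "sensor",
])

def system_reliability(node):
    """Recursively aggregate component reliabilities over the tree."""
    if isinstance(node, str):                  # leaf: a low-level model
        return component_reliability[node]
    kind, children = node
    rs = [system_reliability(c) for c in children]
    if kind == "series":                       # all blocks must work
        r = 1.0
        for x in rs:
            r *= x
        return r
    if kind == "parallel":                     # at least one block works
        q = 1.0
        for x in rs:
            q *= 1.0 - x
        return 1.0 - q
    raise ValueError(f"unknown composition: {kind}")

print(f"system reliability = {system_reliability(architecture):.6f}")
```

The appeal of the generator concept is visible even at this scale: swapping the architecture description (e.g., removing the redundant processor) recomputes the system figure without touching the low-level models.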

  19. 76 FR 66055 - North American Electric Reliability Corporation; Order Approving Interpretation of Reliability...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-25

… intervene serves to make AMP a party to this proceeding. [9] 76 FR 52,325 (2011). [10] 18 CFR 385.214 (2011)… Energy Regulatory Commission; North American Electric Reliability Corporation; Order Approving… Electric Reliability Corporation (NERC), the Commission-certified Electric Reliability Organization…

  20. Driving Method for Compensating Reliability Problem of Hydrogenated Amorphous Silicon Thin Film Transistors and Image Sticking Phenomenon in Active Matrix Organic Light-Emitting Diode Displays

    NASA Astrophysics Data System (ADS)

    Shin, Min-Seok; Jo, Yun-Rae; Kwon, Oh-Kyong

    2011-03-01

In this paper, we propose a driving method for compensating the electrical instability of hydrogenated amorphous silicon (a-Si:H) thin film transistors (TFTs) and the luminance degradation of organic light-emitting diode (OLED) devices for large active matrix OLED (AMOLED) displays. The proposed driving method senses the electrical characteristics of a-Si:H TFTs and OLEDs using current integrators and compensates for them with an external compensation method. Threshold voltage shift is controlled using a negative bias voltage. After applying the proposed driving method, the measured error of the maximum emission current ranges from -1.23 to +1.59 least significant bit (LSB) of a 10-bit gray scale under the threshold voltage shift ranging from -0.16 to 0.17 V.
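The external compensation loop the abstract describes can be caricatured with a square-law TFT model: sense the pixel current, infer the shifted threshold voltage from it, and raise the data voltage by the shift. This is a conceptual sketch only; the transconductance value, voltages, and the sensing math are invented, not the paper's circuit.

```python
# Conceptual sketch of external Vth compensation (illustrative model,
# not the paper's driving circuit). Square-law saturation model assumed.

K = 2.0e-6   # drive-TFT transconductance parameter, A/V^2 (assumed)

def pixel_current(v_data, v_th):
    """Saturation drain current of the drive TFT (square-law model)."""
    v_ov = max(v_data - v_th, 0.0)
    return K * v_ov ** 2

def sense_vth(v_data, measured_i):
    """Infer the threshold voltage from an integrated current
    measurement, as an external compensation loop might."""
    return v_data - (measured_i / K) ** 0.5

# A bias-stressed TFT drifts from Vth = 2.0 V to 2.5 V (assumed values)
v_data, v_th0, v_th_shifted = 6.0, 2.0, 2.5
target_i = pixel_current(v_data, v_th0)

# Degraded pixel emits less; sense Vth and shift the data voltage up
i_degraded = pixel_current(v_data, v_th_shifted)
v_th_est = sense_vth(v_data, i_degraded)
v_data_comp = v_data + (v_th_est - v_th0)
i_comp = pixel_current(v_data_comp, v_th_shifted)

print(f"target {target_i * 1e6:.2f} uA, degraded {i_degraded * 1e6:.2f} uA, "
      f"compensated {i_comp * 1e6:.2f} uA")
```

Under the ideal square-law assumption the compensated current returns exactly to the target; in practice, OLED aging and mobility degradation add error terms, which is why the paper reports residual errors in LSBs rather than zero.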

  1. Organization-wide adoption of computerized provider order entry systems: a study based on diffusion of innovations theory

    PubMed Central

    2009-01-01

Background Computerized provider order entry (CPOE) systems have been introduced to reduce medication errors, increase safety, improve work-flow efficiency, and increase medical service quality at the moment of prescription. Making the impact of CPOE systems more observable may facilitate their adoption by users. We set out to examine factors associated with the adoption of a CPOE system for inter-organizational and intra-organizational care. Methods The diffusion of innovation theory was used to understand physicians' and nurses' attitudes and thoughts about implementation and use of the CPOE system. Two online survey questionnaires were distributed to all physicians and nurses using a CPOE system in county-wide healthcare organizations. The number of complete questionnaires analyzed was 134 from 200 nurses (67.0%) and 176 from 741 physicians (23.8%). Data were analyzed using descriptive-analytical statistical methods. Results More nurses (56.7%) than physicians (31.3%) stated that the CPOE system introduction had worked well in their clinical setting (P < 0.001). Similarly, more physicians (73.9%) than nurses (50.7%) reported that they found the system not adapted to their specific professional practice (P < 0.001). Also more physicians (25.0%) than nurses (13.4%) stated that they did want to return to the previous system (P = 0.041). We found that in particular the perceived relative advantages of the CPOE system were estimated to be significantly (P < 0.001) higher among nurses (39.6%) than physicians (16.5%). However, physicians' agreement with the compatibility of the CPOE and with its complexity was significantly higher than the nurses' (P < 0.001). Conclusions Qualifications for CPOE adoption as defined by three attributes of diffusion of innovation theory were not satisfied in the study setting. CPOE systems are introduced as a response to the present limitations in paper-based systems. In consequence, user expectations are often high on their relative…

  2. Reliability Generalization: "Lapsus Linguae"

    ERIC Educational Resources Information Center

    Smith, Julie M.

    2011-01-01

    This study examines the proposed Reliability Generalization (RG) method for studying reliability. RG employs the application of meta-analytic techniques similar to those used in validity generalization studies to examine reliability coefficients. This study explains why RG does not provide a proper research method for the study of reliability,…

  3. Business of reliability

    NASA Astrophysics Data System (ADS)

    Engel, Pierre

    1999-12-01

The presentation is organized around three themes: (1) The decrease of reception equipment costs allows non-remote-sensing organizations to access a technology until recently reserved for a scientific elite. What this means is the rise of 'operational' executive agencies considering space-based technology and operations as a viable input to their daily tasks. This is possible thanks to totally dedicated ground receiving entities focusing on one application for themselves, rather than serving a vast community of users. (2) The multiplication of earth observation platforms will form the base for reliable technical and financial solutions. One obstacle to the growth of the earth observation industry is the variety of policies (commercial versus non-commercial) ruling the distribution of the data and value-added products. In particular, the high volume of data sales required for the return on investment conflicts with traditional low-volume data use for most applications. Constant access to data sources supposes monitoring needs as well as technical proficiency. (3) Large-volume use of data coupled with low equipment costs is only possible when the technology has proven reliable, in terms of application results, financial risks and data supply. Each of these factors is reviewed. The expectation is that international cooperation between agencies and private ventures will pave the way for future business models. As an illustration, the presentation proposes to use some recent non-traditional monitoring applications that may lead to significant use of earth observation data, value-added products and services: flood monitoring, ship detection, marine oil pollution deterrent systems and rice acreage monitoring.

  4. Cognitive decision errors and organization vulnerabilities in nuclear power plant safety management: Modeling using the TOGA meta-theory framework

    SciTech Connect

    Cappelli, M.; Gadomski, A. M.; Sepiellis, M.; Wronikowska, M. W.

    2012-07-01

In the field of nuclear power plant (NPP) safety modeling, the perception of the role of socio-cognitive engineering (SCE) is continuously increasing. Today, the focus is especially on the identification of human and organizational decisional errors caused by operators and managers under high-risk conditions, as is evident from analyses of reports on nuclear incidents that occurred in the past. At present, the engineering and social safety requirements need to enlarge their domain of interest in such a way as to include all possible loss-generating events that could be the consequences of an abnormal state of an NPP. Socio-cognitive modeling of Integrated Nuclear Safety Management (INSM) using the TOGA meta-theory was discussed during the ICCAP 2011 Conference. In this paper, more detailed aspects of cognitive decision-making and its possible human errors and organizational vulnerability are presented. The formal TOGA-based network model for cognitive decision-making makes it possible to identify and analyze nodes and arcs in which plant operators' and managers' errors may appear. The TOGA's multi-level IPK (Information, Preferences, Knowledge) model of abstract intelligent agents (AIAs) is applied. In the NPP context, a super-safety approach is also discussed, by taking into consideration unexpected events and managing them from a systemic perspective. As the nature of human errors depends on the specific properties of the decision-maker and the decisional context of operation, a classification of decision-making using IPK is suggested. Several types of initial decision-making situations useful for the diagnosis of NPP operators' and managers' errors are considered. The developed models can be used as a basis for applications to NPP educational or engineering simulators to be used for training the NPP executive staff. (authors)

  5. Can There Be Reliability without "Reliability?"

    ERIC Educational Resources Information Center

    Mislevy, Robert J.

    2004-01-01

    An "Educational Researcher" article by Pamela Moss (1994) asks the title question, "Can there be validity without reliability?" Yes, she answers, if by reliability one means "consistency among independent observations intended as interchangeable" (Moss, 1994, p. 7), quantified by internal consistency indices such as KR-20 coefficients and…

  6. HELIOS Critical Design Review: Reliability

    NASA Technical Reports Server (NTRS)

    Benoehr, H. C.; Herholz, J.; Prem, H.; Mann, D.; Reichert, L.; Rupp, W.; Campbell, D.; Boettger, H.; Zerwes, G.; Kurvin, C.

    1972-01-01

    This paper presents the Helios Critical Design Review on Reliability, held October 16-20, 1972. The topics include: 1) Reliability Requirement; 2) Reliability Apportionment; 3) Failure Rates; 4) Reliability Assessment; 5) Reliability Block Diagram; and 6) Reliability Information Sheet.

  7. Gas chromatography coupled to mass spectrometry analysis of volatiles, sugars, organic acids and aminoacids in Valencia Late orange juice and reliability of the Automated Mass Spectral Deconvolution and Identification System for their automatic identification and quantification.

    PubMed

    Cerdán-Calero, Manuela; Sendra, José María; Sentandreu, Enrique

    2012-06-01

    Neutral volatiles and non-volatile polar compounds (sugars, organic acids and amino acids) present in Valencia Late orange juice have been analysed by Gas Chromatography coupled to Mass Spectrometry (GC-MS). Before analysis, the neutral volatiles were extracted by Headspace-Solid Phase Microextraction (HS-SPME), and the non-volatile polar compounds were transformed to their corresponding volatile trimethylsilyl (TMS) derivatives. From the resulting raw GC-MS data files, the reliability of the Automated Mass Spectral Deconvolution and Identification System (AMDIS) to perform accurate identification and quantification of the compounds present in the sample has been tested. Hence, both raw GC-MS data files have been processed automatically by using AMDIS and manually by using Xcalibur™, the manufacturer's data processing software for the GC-MS platform used. Results indicate that the reliability of AMDIS for accurate identification and quantification of the compounds present in the sample strongly depends on a number of operational settings, for both the MS and AMDIS, which must be optimized for the particular type of assayed sample. After optimization of these settings, AMDIS and Xcalibur™ yield practically the same results. A total of 85 volatiles and 22 polar compounds have been identified and quantified in Valencia Late orange juice. PMID:22533907

  8. Reliable Quantum Chemical Prediction of the Localized/Delocalized Character of Organic Mixed-Valence Radical Anions. From Continuum Solvent Models to Direct-COSMO-RS.

    PubMed

    Renz, Manuel; Kess, Martin; Diedenhofen, Michael; Klamt, Andreas; Kaupp, Martin

    2012-11-13

    A recently proposed quantum-chemical protocol for the description of the character of organic mixed-valence (MV) compounds, close from both sides to the localized/delocalized borderline, is evaluated and extended for a series of dinitroaryl radical anions 1-6. A combination of global hybrid functionals with exact-exchange admixtures of 35% (BLYP35) or 42% (BMK) with appropriate solvent modeling allows an essentially quantitative treatment of, for example, structural symmetry-breaking in Robin/Day class II systems, thermal electron transfer (ET) barriers, and intervalence charge-transfer (IV-CT) excitation energies, while covering also the delocalized class III cases. Global hybrid functionals with lower exact-exchange admixtures (e.g., B3LYP, M05, or M06) provide a too delocalized description, while functionals with higher exact-exchange admixtures (M05-2X, M06-2X) provide a too localized one. The B2PLYP double hybrid gives reasonable structures but far too small barriers in class II cases. The CAM-B3LYP range hybrid gives somewhat too high ET barriers and IV-CT energies, while the range hybrids ωB97X and LC-BLYP clearly exhibit too much exact exchange. Continuum solvent models describe the situation well in most aprotic solvents studied. The transition of 1,4-dinitrobenzene anion 1 from a class III behavior in aprotic solvents to a class II behavior in alcohols is not recovered by continuum solvent models. In contrast, it is treated faithfully by the novel direct conductor-like screening model for real solvents (D-COSMO-RS). The D-COSMO-RS approach, the TURBOMOLE implementation of which is reported, also describes accurately the increased ET barriers of class II systems 2 and 3 in alcohols as compared to aprotic solvents and can distinguish at least qualitatively between different aprotic solvents with identical or similar dielectric constants. The dominant role of the solvent environment for the ET character of these MV radical anions is emphasized, as in

  9. Applicability of the Multiple Intelligence Theory to the Process of Organizing and Planning of Learning and Teaching

    ERIC Educational Resources Information Center

    Acat, M. Bahaddin

    2005-01-01

    It has long been under discussion how the teaching and learning environment should be arranged, how individuals achieve learning, and how teachers can effectively contribute to this process. Accordingly, a considerable number of theories and models have been proposed. Gardner (1983) caused a remarkable shift in the perception of learning theory as…

  10. Stoking the Dialogue on the Domains of Transformative Learning Theory: Insights From Research With Faith-Based Organizations in Kenya

    ERIC Educational Resources Information Center

    Moyer, Joanne M.; Sinclair, A. John

    2016-01-01

    Transformative learning theory is applied in a variety of fields, including archaeology, religious studies, health care, the physical sciences, environmental studies, and natural resource management. Given the breadth of the theory's application, it needs to be adaptable to broad contexts. This article shares insights gained from applying the…

  11. Environmental control of sepalness and petalness in perianth organs of waterlilies: a new Mosaic Theory for the evolutionary origin of a differentiated perianth

    PubMed Central

    Warner, Kate A.; Rudall, Paula J.; Frohlich, Michael W.

    2009-01-01

    The conventional concept of an ‘undifferentiated perianth’, implying that all perianth organs of a flower are alike, obscures the fact that individual perianth organs are sometimes differentiated into sepaloid and petaloid regions, as in the early-divergent angiosperms Nuphar, Nymphaea, and Schisandra. In the waterlilies Nuphar and Nymphaea, sepaloid regions closely coincide with regions of the perianth that were exposed when the flower was in bud, whereas petaloid regions occur in covered regions, suggesting that their development is at least partly controlled by the environment of the developing tepal. Green and colourful areas differ from each other in trichome density and presence of papillae, features that often distinguish sepals and petals. Field experiments to test whether artificial exposure can induce sepalness in the inner tepals showed that development of sepaloid patches is initiated by exposure, at least in the waterlily species examined. Although light is an important environmental cue, other important factors include an absence of surface contact. Our interpretation contradicts the unspoken rule that ‘sepal’ and ‘petal’ must refer to whole organs. We propose a novel theory (the Mosaic theory), in which the distinction between sepalness and petalness evolved early in angiosperm history, but these features were not fixed to particular organs and were primarily environmentally controlled. At a later stage in angiosperm evolution, sepaloid and petaloid characteristics became fixed to whole organs in specific whorls, thus reducing or removing the need for environmental control in favour of fixed developmental control. PMID:19574253

  12. Environmental control of sepalness and petalness in perianth organs of waterlilies: a new Mosaic theory for the evolutionary origin of a differentiated perianth.

    PubMed

    Warner, Kate A; Rudall, Paula J; Frohlich, Michael W

    2009-01-01

    The conventional concept of an 'undifferentiated perianth', implying that all perianth organs of a flower are alike, obscures the fact that individual perianth organs are sometimes differentiated into sepaloid and petaloid regions, as in the early-divergent angiosperms Nuphar, Nymphaea, and Schisandra. In the waterlilies Nuphar and Nymphaea, sepaloid regions closely coincide with regions of the perianth that were exposed when the flower was in bud, whereas petaloid regions occur in covered regions, suggesting that their development is at least partly controlled by the environment of the developing tepal. Green and colourful areas differ from each other in trichome density and presence of papillae, features that often distinguish sepals and petals. Field experiments to test whether artificial exposure can induce sepalness in the inner tepals showed that development of sepaloid patches is initiated by exposure, at least in the waterlily species examined. Although light is an important environmental cue, other important factors include an absence of surface contact. Our interpretation contradicts the unspoken rule that 'sepal' and 'petal' must refer to whole organs. We propose a novel theory (the Mosaic theory), in which the distinction between sepalness and petalness evolved early in angiosperm history, but these features were not fixed to particular organs and were primarily environmentally controlled. At a later stage in angiosperm evolution, sepaloid and petaloid characteristics became fixed to whole organs in specific whorls, thus reducing or removing the need for environmental control in favour of fixed developmental control. PMID:19574253

  13. An asymptotic approach for assessing fatigue reliability

    SciTech Connect

    Tang, J.

    1996-12-01

    By applying cumulative fatigue damage theory to the random process reliability problem, and by introducing a new concept of a unified equivalent stress level in fatigue life prediction, a technical reliability model for the random process reliability problem under fatigue failure is proposed. The technical model emphasizes efficiency in the design choice and also focuses on the accuracy of the results. Based on this model, an asymptotic method for fatigue reliability under stochastic process loadings is developed. The proposed method uses a recursive iteration algorithm to obtain results that include reliability and the corresponding life. The method reconciles the requirements of accuracy and efficiency for random process reliability problems under fatigue failure. The accuracy and the analytical and numerical effort required are compared. Through a numerical example, the advantage of the proposed method is demonstrated.
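    The cumulative fatigue damage theory the abstract builds on is classically expressed through the Palmgren-Miner linear damage rule; a minimal sketch (the cycle counts are illustrative, not from the paper):

    ```python
    def miner_damage(blocks):
        """Palmgren-Miner linear damage sum.

        blocks: iterable of (n_i, N_i) pairs, where n_i is the number of
        applied cycles at a stress level and N_i is the cycles-to-failure
        at that level. Failure is predicted once the sum reaches 1.
        """
        return sum(n / N for n, N in blocks)

    # e.g. 10,000 cycles at a level with N = 100,000, then 5,000 at N = 10,000:
    damage = miner_damage([(10_000, 100_000), (5_000, 10_000)])  # 0.6
    ```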

  14. Reliability computation from reliability block diagrams

    NASA Technical Reports Server (NTRS)

    Chelson, P. O.; Eckstein, R. E.

    1971-01-01

    A method and a computer program are presented to calculate probability of system success from an arbitrary reliability block diagram. The class of reliability block diagrams that can be handled include any active/standby combination of redundancy, and the computations include the effects of dormancy and switching in any standby redundancy. The mechanics of the program are based on an extension of the probability tree method of computing system probabilities.
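    The probability-tree idea behind such a program can be sketched by exhaustively enumerating component states, which works for any block diagram structure, including active/standby redundancy expressed through the success predicate (names and numbers below are illustrative, not from the report):

    ```python
    from itertools import product

    def system_reliability(components, success):
        """Exhaustive probability tree: enumerate every up/down state of the
        components and sum the probabilities of states where the system works.

        components: dict name -> reliability (probability the component works)
        success:    function taking dict name -> bool, True if system works
        """
        names = list(components)
        total = 0.0
        for states in product([True, False], repeat=len(names)):
            state = dict(zip(names, states))
            p = 1.0
            for n in names:
                p *= components[n] if state[n] else 1.0 - components[n]
            if success(state):
                total += p
        return total

    # Two redundant units A and B in series with a switch S:
    comps = {"A": 0.9, "B": 0.9, "S": 0.99}
    ok = lambda s: s["S"] and (s["A"] or s["B"])
    ```

    For this diagram the result equals 0.99 × (1 − 0.1 × 0.1) = 0.9801, matching the series/parallel formulas. Enumeration is exponential in the number of components; the probability-tree method in the report prunes this systematically.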

  15. Power electronics reliability analysis.

    SciTech Connect

    Smith, Mark A.; Atcitty, Stanley

    2009-12-01

    This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.
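    The fault-tree approach described above combines independent basic-event failure probabilities through OR and AND gates; a minimal sketch with hypothetical components (the names and numbers are illustrative, not from the report):

    ```python
    def or_gate(*probs):
        """Top event occurs if ANY independent input event occurs."""
        p = 1.0
        for q in probs:
            p *= 1.0 - q
        return 1.0 - p

    def and_gate(*probs):
        """Top event occurs only if ALL independent input events occur."""
        p = 1.0
        for q in probs:
            p *= q
        return p

    # hypothetical component failure probabilities over some mission time
    fan, igbt_a, igbt_b, controller = 0.02, 0.01, 0.01, 0.005
    # system fails if the fan fails, both redundant IGBTs fail, or the controller fails
    top = or_gate(fan, and_gate(igbt_a, igbt_b), controller)
    ```

    System reliability over the same interval is then simply 1 − top.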

  16. Human Reliability Program Overview

    SciTech Connect

    Bodin, Michael

    2012-09-25

    This presentation covers the high points of the Human Reliability Program, including certification/decertification, critical positions, due process, organizational structure, program components, personnel security, an overview of the US DOE reliability program, retirees and academia, and security program integration.

  17. Reliable Design Versus Trust

    NASA Technical Reports Server (NTRS)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    This presentation focuses on reliability and trust for the user's portion of the FPGA design flow. It is assumed that the manufacturer tests FPGA internal components prior to hand-off to the user. The objective is to present the challenges of creating reliable and trusted designs. The following will be addressed: What makes a design vulnerable to functional flaws (reliability) or attackers (trust)? What are the challenges for verifying a reliable design versus a trusted design?

  18. Predicting software reliability

    NASA Technical Reports Server (NTRS)

    Littlewood, B.

    1989-01-01

    A detailed look is given to software reliability techniques. A conceptual model of the failure process is examined, and some software reliability growth models are discussed. Problems for which no current solutions exist are addressed, emphasizing the very difficult problem of safety-critical systems for which the reliability requirements can be enormously demanding.

  19. Reliability model generator specification

    NASA Technical Reports Server (NTRS)

    Cohen, Gerald C.; Mccann, Catherine

    1990-01-01

    The Reliability Model Generator (RMG), a program which produces reliability models from block diagrams for ASSIST, the interface for the reliability evaluation tool SURE, is described. An account is given of the motivation for RMG, and the implemented algorithms are discussed. The appendices contain the algorithms and two detailed traces of examples.

  20. Measurement Issues in High Stakes Testing: Validity and Reliability

    ERIC Educational Resources Information Center

    Mason, Emanuel J.

    2007-01-01

    Validity and reliability of the new high stakes testing systems initiated in school systems across the United States in recent years in response to the accountability features mandated in the No Child Left Behind Legislation largely depend on item response theory and new rules of measurement. Reliability and validity in item response theory and…

  1. Evaluation of reliability and validity of the European Organization for Research and Treatment of Cancer Quality of Life Questionnaire (EORTC QLQ-C30, Albanian version) among breast cancer patients from Kosovo

    PubMed Central

    Shuleta-Qehaja, Selvete; Sterjev, Zoran; Shuturkova, Ljubica

    2015-01-01

    Patients and methods A sample of breast cancer patients (n=62 women) were interviewed using the European Organization for Research and Treatment of Cancer Quality of Life Questionnaire (EORTC QLQ-C30) in Albanian. Reliability of the questionnaire was considered acceptable if Cronbach’s alpha was ≥0.70. Item convergent-discriminant validity was tested through multitrait scaling analysis. Construct validity was tested under the hypothesis that QLQ-C30 interscale correlations would have an acceptable value of ≥0.40, as well as by known-group comparisons assessing differences between patient subgroups with reference to disease stage and education level. Results The mean age of the patients was 50 years (standard deviation: 10.9 years). Cronbach’s alpha ranged from 0.54 for the cognitive functioning scale to 0.96 for the global health quality of life (GH/QoL) scale. In multitrait scaling analysis, the strength of Spearman’s correlations between an item and its own subscale was ≥0.40, with the exception of item 5 (ρ=0.22); results for item discriminant validity were satisfactory, with the exception of item 5, which showed higher correlations with other subscales than with its own physical functioning subscale. The Spearman interscale coefficients were generally correlated with each other. Known-group comparisons did not show significant differences in terms of disease stage. Regarding education level, patients with high school/university education had better functional scale scores only in certain subscales compared to other subgroups; furthermore, patients with secondary school education had better GH/QoL compared to other subgroups. Conclusion The EORTC QLQ-C30 (v3.0) in Albanian was found to be valid and reliable for women with breast cancer and could be considered a starting point for further evaluation studies. PMID:25834410
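    The Cronbach's alpha criterion used above is a standard internal-consistency formula; a minimal sketch (function name and data are illustrative, not from the study):

    ```python
    def cronbach_alpha(items):
        """Cronbach's alpha for one scale.

        items: list of rows, one row of item scores per respondent.
        """
        k = len(items[0])                  # number of items in the scale

        def var(xs):                       # sample variance (ddof = 1)
            m = sum(xs) / len(xs)
            return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

        item_var_sum = sum(var([row[j] for row in items]) for j in range(k))
        total_var = var([sum(row) for row in items])
        return (k / (k - 1)) * (1 - item_var_sum / total_var)
    ```

    Perfectly parallel items yield alpha = 1; the ≥0.70 threshold in the abstract is the conventional acceptability cutoff.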

  2. International Lead Zinc Research Organization-sponsored field-data collection and analysis to determine relationships between service conditions and reliability of valve-regulated lead-acid batteries in stationary applications

    NASA Astrophysics Data System (ADS)

    Taylor, P. A.; Moseley, P. T.; Butler, P. C.

    The International Lead Zinc Research Organization (ILZRO), in cooperation with Sandia National Laboratories, has initiated a multi-phase project with the following aims: to characterize relationships between valve-regulated lead-acid (VRLA) batteries, service conditions, and failure modes; to establish the degree of correlation between specific operating procedures and PCL; to identify operating procedures that mitigate PCL; to identify best-fits between the operating requirements of specific applications and the capabilities of specific VRLA technologies; to recommend combinations of battery design, manufacturing processes, and operating conditions that enhance VRLA performance and reliability. In the first phase of this project, ILZRO has contracted with Energetics to identify and survey manufacturers and users of VRLA batteries for stationary applications (including electric utilities, telecommunications companies, and government facilities). The confidential survey is collecting the service conditions of specific applications and performance records for specific VRLA technologies. From the data collected, Energetics is constructing a database of the service histories and analyzing the data to determine trends in performance for particular technologies in specific service conditions. ILZRO plans to make the final report of the analysis and a version of the database (that contains no proprietary information) available to ILZRO members, participants in the survey, and participants in a follow-on workshop for stakeholders in VRLA reliability. This paper presents the surveys distributed to manufacturers and end-users, discusses the analytic approach, presents an overview of the responses to the surveys and trends that have emerged in the early analysis of the data, and previews the functionality of the database being constructed.

  3. Predicting Cloud Computing Technology Adoption by Organizations: An Empirical Integration of Technology Acceptance Model and Theory of Planned Behavior

    ERIC Educational Resources Information Center

    Ekufu, ThankGod K.

    2012-01-01

    Organizations are finding it difficult in today's economy to implement the vast information technology infrastructure required to effectively conduct their business operations. Despite the fact that some of these organizations are leveraging on the computational powers and the cost-saving benefits of computing on the Internet cloud, others…

  4. Informational Closed-Loop Coding-Decoding Control Concept as the Base of the Living or Organized Systems Theory

    NASA Astrophysics Data System (ADS)

    Kirvelis, Dobilas; Beitas, Kastytis

    2008-10-01

    The aim of this work is to show that the essence of life and living systems is their organization as bioinformational technology based on informational anticipatory control. Principal paradigmatic and structural schemes of the functional organization of life (organisms and their systems) are constructed on the basis of systemic analysis and synthesis of the main phenomenological features of the living world. Life is based on functional elements that implement engineering procedures of closed-loop coding-decoding control (CL-CDC). The phenomenon of natural bioinformational control appeared and developed on the Earth 3-4 billion years ago, when life originated as a result of chemical and, later, biological evolution. The informatics paradigm considers the physical and chemical transformations of energy and matter in organized systems as flows that are controlled, and signals as the means for purposive informational control programs. Social and technical technological systems as informational control systems are a later phenomenon engineered by man. Information emerges in organized systems as a necessary component of control technology. Generalized schemes of functional organization at the levels of the cell, the organism, and the brain neocortex, as the highest biosystem with CL-CDC, are presented. The CL-CDC concept expands the understanding of bioinformatics.

  5. Signal verification can promote reliable signalling.

    PubMed

    Broom, Mark; Ruxton, Graeme D; Schaefer, H Martin

    2013-11-22

    The central question in communication theory is whether communication is reliable, and if so, which mechanisms select for reliability. The primary approach in the past has been to attribute reliability to strategic costs associated with signalling as predicted by the handicap principle. Yet, reliability can arise through other mechanisms, such as signal verification; but the theoretical understanding of such mechanisms has received relatively little attention. Here, we model whether verification can lead to reliability in repeated interactions that typically characterize mutualisms. Specifically, we model whether fruit consumers that discriminate among poor- and good-quality fruits within a population can select for reliable fruit signals. In our model, plants either signal or they do not; costs associated with signalling are fixed and independent of plant quality. We find parameter combinations where discriminating fruit consumers can select for signal reliability by abandoning unprofitable plants more quickly. This self-serving behaviour imposes costs upon plants as a by-product, rendering it unprofitable for unrewarding plants to signal. Thus, strategic costs to signalling are not a prerequisite for reliable communication. We expect verification to more generally explain signal reliability in repeated consumer-resource interactions that typify mutualisms but also in antagonistic interactions such as mimicry and aposematism. PMID:24068354

  6. Signal verification can promote reliable signalling

    PubMed Central

    Broom, Mark; Ruxton, Graeme D.; Schaefer, H. Martin

    2013-01-01

    The central question in communication theory is whether communication is reliable, and if so, which mechanisms select for reliability. The primary approach in the past has been to attribute reliability to strategic costs associated with signalling as predicted by the handicap principle. Yet, reliability can arise through other mechanisms, such as signal verification; but the theoretical understanding of such mechanisms has received relatively little attention. Here, we model whether verification can lead to reliability in repeated interactions that typically characterize mutualisms. Specifically, we model whether fruit consumers that discriminate among poor- and good-quality fruits within a population can select for reliable fruit signals. In our model, plants either signal or they do not; costs associated with signalling are fixed and independent of plant quality. We find parameter combinations where discriminating fruit consumers can select for signal reliability by abandoning unprofitable plants more quickly. This self-serving behaviour imposes costs upon plants as a by-product, rendering it unprofitable for unrewarding plants to signal. Thus, strategic costs to signalling are not a prerequisite for reliable communication. We expect verification to more generally explain signal reliability in repeated consumer–resource interactions that typify mutualisms but also in antagonistic interactions such as mimicry and aposematism. PMID:24068354

  7. Software reliability experiments data analysis and investigation

    NASA Technical Reports Server (NTRS)

    Walker, J. Leslie; Caglayan, Alper K.

    1991-01-01

    The objectives are to investigate the fundamental reasons which cause independently developed software programs to fail dependently, and to examine fault tolerant software structures which maximize reliability gain in the presence of such dependent failure behavior. The authors used 20 redundant programs from a software reliability experiment to analyze the software errors causing coincident failures, to compare the reliability of N-version and recovery block structures composed of these programs, and to examine the impact of diversity on software reliability using subpopulations of these programs. The results indicate that both conceptually related and unrelated errors can cause coincident failures and that recovery block structures offer more reliability gain than N-version structures if acceptance checks that fail independently from the software components are available. The authors present a theory of general program checkers that have potential application for acceptance tests.

  8. Learning in Complex Organizations as Practicing and Reflecting: A Model Development and Application from a Theory of Practice Perspective

    ERIC Educational Resources Information Center

    Schulz, Klaus-Peter

    2005-01-01

    Purpose: The article seeks to conceptualize learning in practice from a theories-of-practice view. This paradigmatic shift allows one to overcome problem areas related to traditional conceptions of learning, such as the difficulty of knowledge transfer, and related to many situated learning models that neglect the aspect of reification of practice.…

  9. Workplace Support, Discrimination, and Person-Organization Fit: Tests of the Theory of Work Adjustment with LGB Individuals

    ERIC Educational Resources Information Center

    Velez, Brandon L.; Moradi, Bonnie

    2012-01-01

    The present study explored the links of 2 workplace contextual variables--perceptions of workplace heterosexist discrimination and lesbian, gay, and bisexual (LGB)-supportive climates--with job satisfaction and turnover intentions in a sample of LGB employees. An extension of the theory of work adjustment (TWA) was used as the conceptual framework…

  10. Change of Mind: How Organization Theory Led Me to Move from Studying Educational Reform to Pursuing Educational Design

    ERIC Educational Resources Information Center

    Ogawa, Rodney T.

    2015-01-01

    Purpose: The purpose of this paper is for the author to recount how his use of organizational theory to understand educational reform in the USA led to a change of mind. Design/methodology/approach: My shift resulted from my conclusion, derived from the new institutionalism, that only marginal changes can be made in schools and, thus, fundamental…

  11. Harm reduction theory: Users culture, micro-social indigenous harm reduction, and the self-organization and outside-organizing of users’ groups

    PubMed Central

    Friedman, Samuel R.; de Jong, Wouter; Rossi, Diana; Touzé, Graciela; Rockwell, Russell; Jarlais, Don C Des; Elovich, Richard

    2007-01-01

    This paper discusses the user side of harm reduction, focusing to some extent on the early responses to the HIV/AIDS epidemic in each of four sets of localities: New York City, Rotterdam, Buenos Aires, and sites in Central Asia. Using available qualitative and quantitative information, we present a series of vignettes about user activities in four different localities aimed at reducing drug-related harm. Some of these activities have been micro-social (small-group) activities; others have been conducted by formal organizations of users that the users organised at their own initiative. In spite of the limitations of the methodology, the data suggest that users’ activities have helped limit HIV spread. These activities are shaped by broader social contexts, such as the extent to which drug scenes are integrated with broader social networks and the way the political and economic systems impinge on drug users’ lives. Drug users are active agents in their own individual and collective behalf, and in helping to protect wider communities. Harm reduction activities and research should take note of and draw upon both the micro-social and formal organizations of users. Finally, both researchers and policy makers should help develop ways to enable and support both micro-social and formally organized action by users. PMID:17689353

  12. Recalibrating software reliability models

    NASA Technical Reports Server (NTRS)

    Brocklehurst, Sarah; Chan, P. Y.; Littlewood, Bev; Snell, John

    1989-01-01

    In spite of much research effort, there is no universally applicable software reliability growth model which can be trusted to give accurate predictions of reliability in all circumstances. Further, it is not even possible to decide a priori which of the many models is most suitable in a particular context. In an attempt to resolve this problem, techniques were developed whereby, for each program, the accuracy of various models can be analyzed. A user is thus enabled to select that model which is giving the most accurate reliability predictions for the particular program under examination. One of these ways of analyzing predictive accuracy, called the u-plot, in fact allows a user to estimate the relationship between the predicted reliability and the true reliability. It is shown how this can be used to improve reliability predictions in a completely general way by a process of recalibration. Simulation results show that the technique gives improved reliability predictions in a large proportion of cases. However, a user does not need to trust the efficacy of recalibration, since the new reliability estimates produced by the technique are truly predictive and so their accuracy in a particular application can be judged using the earlier methods. The generality of this approach would therefore suggest that it be applied as a matter of course whenever a software reliability model is used.
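    The u-plot described above compares the values u_i = F_i(t_i), i.e., each model's predicted failure-time distribution evaluated at the observed failure time, against the uniform distribution they should follow if the predictions were accurate. A minimal sketch of the resulting discrepancy measure (the Kolmogorov distance of the empirical CDF of the u's from the unit diagonal; illustrative, not the authors' code):

    ```python
    def u_plot_distance(us):
        """Kolmogorov distance between the empirical CDF of the u_i values
        and the unit diagonal. Accurate predictions give uniform u's and
        hence a small distance; a large distance signals miscalibration
        that recalibration can then correct.
        """
        us = sorted(us)
        n = len(us)
        d = 0.0
        for i, u in enumerate(us):
            # empirical CDF jumps from i/n to (i+1)/n at u
            d = max(d, abs((i + 1) / n - u), abs(i / n - u))
        return d
    ```

    In the recalibration scheme sketched in the abstract, the estimated relationship between the empirical CDF and the diagonal is used to transform subsequent predictions.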

  13. Software Reliability 2002

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores R.

    2003-01-01

    In FY01 we learned that hardware reliability models need substantial changes to account for differences in software, thus making software reliability measurements more effective, accurate, and easier to apply. These reliability models are generally based on familiar distributions or parametric methods. An obvious question is: "What new statistical and probability models can be developed using non-parametric and distribution-free methods instead of the traditional parametric methods?" Two approaches to software reliability engineering appear somewhat promising. The first study, begun in FY01, is based on hardware reliability, a very well established science that has many aspects that can be applied to software. This research effort has investigated mathematical aspects of hardware reliability and has identified those applicable to software. Currently the research effort is applying and testing these approaches to software reliability measurement. These parametric models require much project data that may be difficult to apply and interpret. Projects at GSFC are often complex in both technology and schedules. Assessing and estimating reliability of the final system is extremely difficult when various subsystems are tested and completed long before others. Parametric and distribution-free techniques may offer a new and accurate way of modeling failure time and other project data to provide earlier and more accurate estimates of system reliability.

  14. Recalibrating software reliability models

    NASA Technical Reports Server (NTRS)

    Brocklehurst, Sarah; Chan, P. Y.; Littlewood, Bev; Snell, John

    1990-01-01

    In spite of much research effort, there is no universally applicable software reliability growth model which can be trusted to give accurate predictions of reliability in all circumstances. Further, it is not even possible to decide a priori which of the many models is most suitable in a particular context. In an attempt to resolve this problem, techniques were developed whereby, for each program, the accuracy of various models can be analyzed. A user is thus enabled to select that model which is giving the most accurate reliability predictions for the particular program under examination. One of these ways of analyzing predictive accuracy, called the u-plot, in fact allows a user to estimate the relationship between the predicted reliability and the true reliability. It is shown how this can be used to improve reliability predictions in a completely general way by a process of recalibration. Simulation results show that the technique gives improved reliability predictions in a large proportion of cases. However, a user does not need to trust the efficacy of recalibration, since the new reliability estimates produced by the technique are truly predictive and so their accuracy in a particular application can be judged using the earlier methods. The generality of this approach would therefore suggest that it be applied as a matter of course whenever a software reliability model is used.
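    The core of u-plot recalibration can be sketched in a few lines. Each u-value is the model's predicted CDF evaluated at the observed inter-failure time; if predictions were perfect, the u-values would be uniform on [0, 1], so the empirical CDF of past u-values estimates the map from predicted to true probability. The function and data below are an illustrative sketch, not the paper's implementation.

```python
def u_plot_recalibrate(u_values, predicted_prob):
    """Recalibrate a raw predicted failure probability by mapping it
    through the empirical CDF of past u-values (a step-function
    estimate of the predicted-vs-true relationship)."""
    n = len(u_values)
    below = sum(1 for u in u_values if u <= predicted_prob)
    return below / n

# Example: a model whose past u-values cluster low, i.e. it has
# systematically over-predicted the chance of early failure
u_history = [0.10, 0.15, 0.20, 0.25, 0.30, 0.35, 0.40, 0.45]
print(u_plot_recalibrate(u_history, 0.50))  # 1.0 -> raw 0.5 overstates risk
```

    In practice the paper smooths this step function before use, but the idea is the same: the recalibrated prediction is itself a prediction whose accuracy can be checked with the u-plot again.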

  15. Generalizability Theory and Classical Test Theory

    ERIC Educational Resources Information Center

    Brennan, Robert L.

    2011-01-01

    Broadly conceived, reliability involves quantifying the consistencies and inconsistencies in observed scores. Generalizability theory, or G theory, is particularly well suited to addressing such matters in that it enables an investigator to quantify and distinguish the sources of inconsistencies in observed scores that arise, or could arise, over…
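    For a concrete (textbook, not from this truncated abstract) instance of how G theory quantifies sources of inconsistency: in a one-facet person-by-item design, the generalizability coefficient is universe-score variance over itself plus relative error variance averaged over the number of items. The variance components below are illustrative numbers.

```python
def g_coefficient(var_person, var_residual, n_items):
    """Generalizability coefficient for a one-facet p x i design:
    var_person / (var_person + var_residual / n_items)."""
    return var_person / (var_person + var_residual / n_items)

# Illustrative variance components from a hypothetical ANOVA
print(round(g_coefficient(0.5, 1.0, 10), 3))  # 0.833
```

    Unlike a single classical reliability coefficient, the variance components let an investigator ask "what if" questions, e.g. how the coefficient changes as `n_items` grows.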

  16. Solvent dependence of Stokes shift for organic solute-solvent systems: A comparative study by spectroscopy and reference interaction-site model-self-consistent-field theory

    NASA Astrophysics Data System (ADS)

    Nishiyama, Katsura; Watanabe, Yasuhiro; Yoshida, Norio; Hirata, Fumio

    2013-09-01

    The Stokes shift magnitudes for coumarin 153 (C153) in 13 organic solvents with various polarities have been determined by means of steady-state spectroscopy and reference interaction-site model-self-consistent-field (RISM-SCF) theory. RISM-SCF calculations have reproduced experimental results fairly well, including individual solvent characteristics. It is empirically known that in some solvents, larger Stokes shift magnitudes are detected than anticipated on the basis of the solvent relative permittivity, ɛr. In practice, 1,4-dioxane (ɛr = 2.21) provides almost identical Stokes shift magnitudes to that of tetrahydrofuran (THF, ɛr = 7.58), for C153 and other typical organic solutes. In this work, RISM-SCF theory has been used to estimate the energetics of C153-solvent systems involved in the absorption and fluorescence processes. The Stokes shift magnitudes estimated by RISM-SCF theory are ~5 kJ mol⁻¹ (400 cm⁻¹) less than those determined by spectroscopy; however, the results obtained are still adequate for dipole moment comparisons, in a qualitative sense. We have also calculated the solute-solvent site-site radial distributions by this theory. It is shown that solvation structures with respect to the C-O-C framework, which is common to dioxane and THF, in the near vicinity (~0.4 nm) of specific solute sites can largely account for their similar Stokes shift magnitudes. In previous works, such solute-solvent short-range interactions have been explained in terms of the higher-order multipole moments of the solvents. Our present study shows that along with the short-range interactions that contribute most significantly to the energetics, long-range electrostatic interactions are also important. Such long-range interactions are effective up to 2 nm from the solute site, as in the case of a typical polar solvent, acetonitrile.

  17. Managing Reliability in the 21st Century

    SciTech Connect

    Dellin, T.A.

    1998-11-23

    The rapid pace of change at the end of the 20th Century should continue unabated well into the 21st Century. The driver will be the marketplace imperative of "faster, better, cheaper." This imperative has already stimulated a revolution-in-engineering in design and manufacturing. In contrast, to date, reliability engineering has not undergone a similar level of change. It is critical that we implement a corresponding revolution-in-reliability-engineering as we enter the new millennium. If we are still using 20th Century reliability approaches in the 21st Century, then reliability issues will be the limiting factor in faster, better, and cheaper. At the heart of this reliability revolution will be a science-based approach to reliability engineering. Science-based reliability will enable building-in reliability, application-specific products, virtual qualification, and predictive maintenance. The purpose of this paper is to stimulate a dialogue on the future of reliability engineering. We will try to gaze into the crystal ball and predict some key issues that will drive reliability programs in the new millennium. In the 21st Century, we will demand more of our reliability programs. We will need the ability to make accurate reliability predictions that will enable optimizing cost, performance and time-to-market to meet the needs of every market segment. We will require that all of these new capabilities be in place prior to the start of a product development cycle. The management of reliability programs will be driven by quantifiable metrics of value added to the organization's business objectives.

  18. The body of the soul. Lucretian echoes in the Renaissance theories on the psychic substance and its organic repartition.

    PubMed

    Tutrone, Fabio

    2014-01-01

    In the 16th and 17th centuries, when Aristotelianism still was the leading current of natural philosophy and atomistic theories began to arise, Lucretius' De Rerum Natura stood out as an attractive and dangerous model. The present paper reassesses several relevant aspects of Lucretius' materialistic psychology by focusing on the problem of the soul's repartition through the limbs discussed in Book 3. A very successful Lucretian image serves as fil rouge throughout this survey: the description of a snake chopped up, with its pieces moving on the ground (Lucretius DRN 1969, 3.657-669). The paper's first section sets the poet's theory against the background of ancient psychology, pointing out its often neglected assimilation of Aristotelian elements. The second section highlights the influence of De Rerum Natura and its physiology of the soul on Bernardino Telesio, Agostino Doni and Francis Bacon, since all of these authors engage in an original recombination of mechanical and teleological explanations. PMID:25707096

  19. Screening for high-spin metal organic frameworks (MOFs): density functional theory study on DUT-8(M1,M2) (with Mi = V,…,Cu).

    PubMed

    Schwalbe, Sebastian; Trepte, Kai; Seifert, Gotthard; Kortus, Jens

    2016-03-01

    We present a first principles study of low-spin (LS)/high-spin (HS) screening for 3d metal centers in the metal organic framework (MOF) DUT-8(Ni). Various density functional theory (DFT) codes have been used to evaluate numerical and DFT related errors. We compare highly accurate all-electron implementations with the widely used plane wave approach. We present electronically and magnetically stable DUT-8(Ni) HS secondary building units (SBUs). In this work we show how to tune the magnetic and electronic properties of the original SBU only by changing the metal centers. PMID:26922864

  20. Hawaii electric system reliability.

    SciTech Connect

    Silva Monroy, Cesar Augusto; Loose, Verne William

    2012-09-01

    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers' views of reliability "worth" and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers' views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.

  1. General theory for multiple input-output perturbations in complex molecular systems. 1. Linear QSPR electronegativity models in physical, organic, and medicinal chemistry.

    PubMed

    González-Díaz, Humberto; Arrasate, Sonia; Gómez-SanJuan, Asier; Sotomayor, Nuria; Lete, Esther; Besada-Porto, Lina; Ruso, Juan M

    2013-01-01

    In general, perturbation methods start with a known exact solution of a problem and add "small" variation terms in order to approach a solution for a related problem without a known exact solution. Perturbation theory has been widely used in almost all areas of science. Bohr's quantum model, Heisenberg's matrix mechanics, Feynman diagrams, and Poincaré's chaos model or "butterfly effect" in complex systems are examples of perturbation theories. On the other hand, the study of Quantitative Structure-Property Relationships (QSPR) in molecular complex systems is an ideal area for the application of perturbation theory. There are several problems with exact experimental solutions (new chemical reactions, physicochemical properties, drug activity and distribution, metabolic networks, etc.) in public databases like CHEMBL. However, in all these cases, we have an even larger list of related problems without known solutions. We need to know the change in all these properties after a perturbation of initial boundary conditions. It means, when we test large sets of similar, but different, compounds and/or chemical reactions under slightly different conditions (temperature, time, solvents, enzymes, assays, protein targets, tissues, partition systems, organisms, etc.). However, to the best of our knowledge, there is no QSPR general-purpose perturbation theory to solve this problem. In this work, firstly we review general aspects and applications of both perturbation theory and QSPR models. Secondly, we formulate a general-purpose perturbation theory for multiple-boundary QSPR problems. Last, we develop three new QSPR-Perturbation theory models. The first model classifies correctly >100,000 pairs of intra-molecular carbolithiations with 75-95% of Accuracy (Ac), Sensitivity (Sn), and Specificity (Sp). The model predicts probabilities of variations in the yield and enantiomeric excess of reactions due to at least one perturbation in boundary conditions (solvent, temperature

  2. Development of distribution system reliability and risk analysis models

    NASA Astrophysics Data System (ADS)

    Northcote-Green, J. E. D.; Vismor, T. D.; Brooks, C. L.

    1981-08-01

    The overall objectives of a research project were to: determine distribution reliability assessment methods currently used by the industry; develop a general outage reporting scheme suitable for a wide variety of distribution utilities (reliability model); develop a model for predicting the reliability of future system configurations (risk model); and compile a handbook of reliability assessment methods designed specifically for use by the practicing distribution engineer. Emphasis was placed on compiling and organizing reliability assessment techniques presently used by the industry. The project examined reliability evaluation from two perspectives: historical and predictive assessment. Two reliability assessment models were developed: HISRAM, the historical reliability assessment model, and PRAM, the predictive reliability assessment model. Each model was tested in a utility environment by the Duquesne Light Company and the Public Service Electric and Gas Company of New Jersey. A survey of 56 diverse utilities served as a basis for examining current distribution reliability assessment practices in the electric power industry.
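    Historical distribution reliability assessment of the kind HISRAM performs typically reduces outage records to customer-weighted indices. As an illustrative sketch (standard industry indices, not the report's own code), SAIFI and SAIDI are computed as:

```python
def saifi(customers_interrupted, customers_served):
    """System Average Interruption Frequency Index:
    total customer interruptions per customer served."""
    return sum(customers_interrupted) / customers_served

def saidi(customer_minutes, customers_served):
    """System Average Interruption Duration Index:
    total customer-minutes of interruption per customer served."""
    return sum(customer_minutes) / customers_served

# Example: three outages on a hypothetical 10,000-customer system
interrupted = [500, 1200, 300]                      # customers per outage
minutes = [500 * 30, 1200 * 45, 300 * 120]          # customers x minutes
print(saifi(interrupted, 10_000))  # 0.2 interruptions per customer
print(saidi(minutes, 10_000))      # 10.5 minutes per customer
```

    A predictive model like PRAM works in the other direction, combining component failure rates and network topology to forecast these same indices for a proposed configuration.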

  3. CRITICAL EVALUATION OF THE DIFFUSION HYPOTHESIS IN THE THEORY OF POROUS MEDIA VOLATILE ORGANIC COMPOUND (VOC) SOURCES AND SINKS

    EPA Science Inventory

    The paper proposes three alternative, diffusion-limited mathematical models to account for volatile organic compound (VOC) interactions with indoor sinks, using the linear isotherm model as a reference point. (NOTE: Recent reports by both the U.S. EPA and a study committee of the...

  4. A Grounded Theory of the College Experiences of African American Males in Black Greek-Letter Organizations

    ERIC Educational Resources Information Center

    Ford, David Julius, Jr.

    2014-01-01

    Studies have shown that involvement in a student organization can improve the academic and psychosocial outcomes of African American male students (Harper, 2006b; Robertson & Mason, 2008; Williams & Justice, 2010). Further, Harper, Byars, and Jelke (2005) stated that African American fraternities and sororities (i.e., Black Greek-letter…

  5. DETERMINATION OF CHEMICAL CLASSES FROM MASS SPECTRA OF TOXIC ORGANIC COMPOUNDS BY SIMCA PATTERN RECOGNITION AND INFORMATION THEORY

    EPA Science Inventory

    The low resolution mass spectra of a set of 78 toxic volatile organic compounds were examined for information concerning chemical classes. These compounds were predominately chloro- and/or bromoaromatics, -alkanes, or -alkenes, which are routinely sought at trace levels in ambien...

  6. Learning for Social Justice: A Cultural Historical Activity Theory Analysis of Community Leadership Empowerment in a Korean American Community Organization

    ERIC Educational Resources Information Center

    Kim, Junghwan

    2012-01-01

    Community organizations, especially those aiming at social change, play a significant role in establishing societal health and contributing to adult learning in daily communities. Their existence secures marginalized groups' involvement in society and enhances community development by building community leadership with multiple stakeholders…

  7. Small Open Chemical Systems Theory: Its Implications to Darwinian Evolution Dynamics, Complex Self-Organization and Beyond

    NASA Astrophysics Data System (ADS)

    Qian, Hong

    2014-10-01

    The study of biological cells in terms of mesoscopic, nonequilibrium, nonlinear, stochastic dynamics of open chemical systems provides a paradigm for other complex, self-organizing systems with ultra-fast stochastic fluctuations, short-time deterministic nonlinear dynamics, and long-time evolutionary behavior with exponentially distributed rare events, discrete jumps among punctuated equilibria, and catastrophe.

  8. Accounting for natural organic matter in aqueous chemical equilibrium models: a review of the theories and applications

    NASA Astrophysics Data System (ADS)

    Dudal, Yves; Gérard, Frédéric

    2004-08-01

    Soil organic matter consists of a highly complex and diversified blend of organic molecules, ranging from low molecular weight organic acids (LMWOAs), sugars, amines, alcohols, etc., to high apparent molecular weight fulvic and humic acids. The presence of a wide range of functional groups on these molecules makes them very reactive and influential in soil chemistry, in regards to acid-base chemistry, metal complexation, precipitation and dissolution of minerals and microbial reactions. Out of these functional groups, the carboxylic and phenolic ones are the most abundant and most influential in regards to metal complexation. Therefore, chemical equilibrium models have progressively dealt with organic matter in their calculations. This paper presents a review of six chemical equilibrium models, namely NICA-Donnan, EQ3/6, GEOCHEM, MINTEQA2, PHREEQC and WHAM, in light of the account they make of natural organic matter (NOM) with the objective of helping potential users in choosing a modelling approach. The account has taken various faces, mainly by adding specific molecules within the existing model databases (EQ3/6, GEOCHEM, and PHREEQC) or by using either a discrete (WHAM) or a continuous (NICA-Donnan and MINTEQA2) distribution of the deprotonated carboxylic and phenolic groups. The different ways in which soil organic matter has been integrated into these models are discussed in regards to the model-experiment comparisons that were found in the literature, concerning applications to either laboratory or natural systems. Much of the attention has been focused on the two most advanced models, WHAM and NICA-Donnan, which are able to reasonably describe most of the experimental results. Nevertheless, a better knowledge of the humic substances metal-binding properties is needed to better constrain model inputs with site-specific parameter values. This represents the main axis of research that needs to be carried out to improve the models.
In addition to

  9. Self-organization in irregular landscapes: Detecting autogenic interactions from field data using descriptive statistics and dynamical systems theory

    NASA Astrophysics Data System (ADS)

    Larsen, L.; Watts, D.; Khurana, A.; Anderson, J. L.; Xu, C.; Merritts, D. J.

    2015-12-01

    The classic signal of self-organization in nature is pattern formation. However, the interactions and feedbacks that organize depositional landscapes do not always result in regular or fractal patterns. How might we detect their existence and effects in these "irregular" landscapes? Emergent landscapes such as newly forming deltaic marshes or some restoration sites provide opportunities to study the autogenic processes that organize landscapes and their physical signatures. Here we describe a quest to understand autogenic vs. allogenic controls on landscape evolution in Big Spring Run, PA, a landscape undergoing restoration from bare-soil conditions to a target wet meadow landscape. The contemporary motivation for asking questions about autogenic vs. allogenic controls is to evaluate how important initial conditions or environmental controls may be for the attainment of management objectives. However, these questions can also inform interpretation of the sedimentary record by enabling researchers to separate signals that may have arisen through self-organization processes from those resulting from environmental perturbations. Over three years at Big Spring Run, we mapped the dynamic evolution of floodplain vegetation communities and distributions of abiotic variables and topography. We used principal component analysis and transition probability analysis to detect associative interactions between vegetation and geomorphic variables and convergent cross-mapping on lidar data to detect causal interactions between biomass and topography. Exploratory statistics revealed that plant communities with distinct morphologies exerted control on landscape evolution through stress divergence (i.e., channel initiation) and promoting the accumulation of fine sediment in channels. Together, these communities participated in a negative feedback that maintains low energy and multiple channels. 
Because of the spatially explicit nature of this feedback, causal interactions could not

  10. 78 FR 30747 - Reliability Standards for Geomagnetic Disturbances

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-23

    ... Reliability Corporation (NERC), the Commission-certified Electric Reliability Organization, to submit to the... Commission directs the North American Electric Reliability Corporation (NERC), the Commission-certified... Report: Effects of Geomagnetic Disturbances on the Bulk Power System at ii (February 2012) (NERC...

  11. 76 FR 58730 - Version 4 Critical Infrastructure Protection Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-22

    ...Under section 215 of the Federal Power Act, the Federal Energy Regulatory Commission (Commission) proposes to approve eight modified Critical Infrastructure Protection (CIP) Reliability Standards, CIP- 002-4 through CIP-009-4, developed and submitted to the Commission for approval by the North American Electric Reliability Corporation (NERC), the Electric Reliability Organization certified by......

  12. 78 FR 72755 - Version 5 Critical Infrastructure Protection Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-03

    ...Pursuant to section 215 of the Federal Power Act, the Commission approves the Version 5 Critical Infrastructure Protection Reliability Standards, CIP-002-5 through CIP-011-1, submitted by the North American Electric Reliability Corporation (NERC), the Commission- certified Electric Reliability Organization. The CIP version 5 Standards address the cyber security of the bulk electric system and......

  13. Structural characterization of genomes by large scale sequence-structure threading: application of reliability analysis in structural genomics

    PubMed Central

    Cherkasov, Artem; Ho Sui, Shannan J; Brunham, Robert C; Jones, Steven JM

    2004-01-01

    Background We establish that the occurrence of protein folds among genomes can be accurately described with a Weibull function. Systems which exhibit Weibull character can be interpreted with reliability theory commonly used in engineering analysis. For instance, Weibull distributions are widely used in reliability, maintainability and safety work to model time-to-failure of mechanical devices, mechanisms, building constructions and equipment. Results We have found that the Weibull function describes protein fold distribution within and among genomes more accurately than conventional power functions which have been used in a number of structural genomic studies reported to date. It has also been found that the Weibull reliability parameter β for protein fold distributions varies between genomes and may reflect differences in rates of gene duplication in evolutionary history of organisms. Conclusions The results of this work demonstrate that reliability analysis can provide useful insights and testable predictions in the fields of comparative and structural genomics. PMID:15274750
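    The Weibull reliability function and the role of the shape parameter β that the abstract highlights can be written out directly. This is the standard Weibull formulation from reliability engineering, sketched with illustrative parameter values rather than the paper's fitted genome data.

```python
import math

def weibull_reliability(t, beta, eta):
    """Weibull survival function R(t) = exp(-(t/eta)^beta).
    beta < 1: decreasing hazard; beta = 1: constant hazard
    (exponential case); beta > 1: increasing (wear-out) hazard."""
    return math.exp(-((t / eta) ** beta))

def weibull_hazard(t, beta, eta):
    """Instantaneous failure rate h(t) = (beta/eta) * (t/eta)^(beta-1)."""
    return (beta / eta) * (t / eta) ** (beta - 1)

# With beta = 1 the model reduces to the memoryless exponential:
print(weibull_reliability(2.0, 1.0, 2.0))  # exp(-1) ≈ 0.3679
```

    In the paper's setting, t is replaced by the number of occurrences of a fold, and differences in the fitted β between genomes are read as differences in gene-duplication history.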

  14. Correcting Fallacies in Validity, Reliability, and Classification

    ERIC Educational Resources Information Center

    Sijtsma, Klaas

    2009-01-01

    This article reviews three topics from test theory that continue to raise discussion and controversy and capture test theorists' and constructors' interest. The first topic concerns the discussion of the methodology of investigating and establishing construct validity; the second topic concerns reliability and its misuse, alternative definitions…

  15. First-principles calculation of photo-induced electron transfer rate constants in phthalocyanine-C60 organic photovoltaic materials: Beyond Marcus theory

    NASA Astrophysics Data System (ADS)

    Lee, Myeong H.; Dunietz, Barry D.; Geva, Eitan

    2014-03-01

    Classical Marcus theory is commonly adopted in solvent-mediated charge transfer (CT) processes to obtain the CT rate constant, but it can become questionable when the intramolecular vibrational modes dominate the CT process as in OPV devices because Marcus theory treats these modes classically and therefore nuclear tunneling is not accounted for. We present a computational scheme to obtain the electron transfer rate constant beyond classical Marcus theory. Within this approach, the nuclear vibrational modes are treated quantum-mechanically and a short-time approximation is avoided. Ab initio calculations are used to obtain the basic parameters needed for calculating the electron transfer rate constant. We apply our methodology to phthalocyanine(H2PC)-C60 organic photovoltaic system where one C60 acceptor and one or two H2PC donors are included to model the donor-acceptor interface configuration. We obtain the electron transfer and recombination rate constants for all accessible charge transfer (CT) states, from which the CT exciton dynamics is determined by employing a master equation. The role of higher lying excited states in CT exciton dynamics is discussed. This work is pursued as part of the Center for Solar and Thermal Energy Conversion, an Energy Frontier Research Center funded by the US Department of Energy Office of Science, Office of Basic Energy Sciences under Award No. DE-SC0000957.
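    For reference, the classical Marcus expression that this work goes beyond can be evaluated in a few lines. The parameter values below are illustrative round numbers, not the paper's ab initio results.

```python
import math

HBAR = 1.0545718e-34   # reduced Planck constant, J·s
KB = 1.380649e-23      # Boltzmann constant, J/K
EV = 1.602176634e-19   # electron volt in joules

def marcus_rate(h_da, lam, dg, temp=300.0):
    """Classical Marcus electron-transfer rate constant (s^-1):
    k = (2*pi/hbar) * |H_DA|^2 * (4*pi*lam*kB*T)^(-1/2)
        * exp(-(dG + lam)^2 / (4*lam*kB*T)).
    All energies (h_da, lam, dg) in joules. Treats nuclear modes
    classically, so nuclear tunneling is not included."""
    prefactor = (2 * math.pi / HBAR) * h_da ** 2
    franck_condon = math.exp(-(dg + lam) ** 2 / (4 * lam * KB * temp)) \
        / math.sqrt(4 * math.pi * lam * KB * temp)
    return prefactor * franck_condon

# Illustrative: 5 meV coupling, 0.3 eV reorganization energy,
# driving force at the activationless point (dG = -lam)
print(f"{marcus_rate(0.005 * EV, 0.3 * EV, -0.3 * EV):.3e}")
```

    The rate is maximal at dG = -λ and falls off on either side (the "inverted region"); quantum treatment of the intramolecular modes mainly changes the Franck-Condon factor.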

  16. Photovoltaic system reliability

    SciTech Connect

    Maish, A.B.; Atcitty, C.; Greenberg, D.

    1997-10-01

    This paper discusses the reliability of several photovoltaic projects including SMUD's PV Pioneer project, various projects monitored by Ascension Technology, and the Colorado Parks project. System times-to-failure range from 1 to 16 years, and maintenance costs range from 1 to 16 cents per kilowatt-hour. Factors contributing to the reliability of these systems are discussed, and practices are recommended that can be applied to future projects. This paper also discusses the methodology used to collect and analyze PV system reliability data.

  17. Reliability of the Eight Guiding Principles and Syndrome Diagnosis in Chinese Medicine Diagnosis of Patients with Knee Osteoarthritis.

    PubMed

    Hua, Bin; Abbas, Estelle; Hayes, Alan; Ryan, Peter F; Nelson, Lisa; O'Brien, Kylie

    2012-08-21

    Background: A Chinese medicine (CM) "Syndrome" or "pattern of disharmony" is a diagnostic subcategory of a disease/disorder or symptom, characterized by particular symptoms and signs, and indicative of the etiology and the state of pathogenesis at that point in time. In CM, treatment is aimed at addressing the disease/disorder and the underlying CM Syndrome. A few studies have assessed reliability of CM Syndrome diagnosis according to one of the major CM theories, Zang-Fu theory, but only 1 study has investigated the reliability of diagnosis according to a fundamental theory, that of the Eight Guiding Principles. Given that treatment follows diagnosis, if diagnosis is not reliable there will be lower confidence that optimal treatment is received. There have not yet been any reliability studies in osteoarthritis (OA). Little is known about the characteristics or Syndromes of OA with respect to the Eight Guiding Principles and Zang-Fu theory. Objectives: The objectives of this study were to characterize diagnostic subcategories of OA according to the Eight Guiding Principles and Zang-Fu theory and to investigate the inter-rater reliability of CM diagnosis according to these two theories. Methods: An inter-rater reliability study was conducted as a substudy of a clinical trial investigating the treatment of knee OA with Chinese herbal medicine. Two (2) experienced CM practitioners conducted a CM examination separately, within 2 hours of each other, of 40 participants. A CM assessment form was utilized to record the diagnostic data. Cohen's κ coefficient was used as a measure of reliability. Results: Results support the concept that knee OA is more likely a disease with characteristics of Interior, Deficiency, and Yin according to the Eight Guiding Principles. There was no clear agreement on CM Syndromes of knee OA according to Zang-Fu theory. The main Zang Organs involved were broadly agreed on; they were Kidney, Liver, and Spleen. 
Conclusions: Results lend
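    Cohen's κ, the reliability measure used in this study, corrects observed agreement for the agreement two raters would reach by chance. A minimal sketch with hypothetical ratings (the category labels and data are illustrative, not the trial's):

```python
def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters: (p_o - p_e) / (1 - p_e), where
    p_o is observed agreement and p_e is chance agreement implied
    by each rater's marginal category frequencies."""
    n = len(ratings_a)
    categories = set(ratings_a) | set(ratings_b)
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    p_e = sum((ratings_a.count(c) / n) * (ratings_b.count(c) / n)
              for c in categories)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical: two practitioners rating 10 patients as
# Deficiency ("D") or Excess ("E")
a = ["D", "D", "D", "D", "D", "D", "D", "E", "E", "E"]
b = ["D", "D", "D", "D", "D", "D", "E", "E", "E", "E"]
print(round(cohens_kappa(a, b), 3))  # 0.783
```

    Here raw agreement is 90%, but κ ≈ 0.78 because the raters would agree often by chance alone given how frequently each uses "D".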

  18. Structure and role of metal clusters in a metal-organic coordination network determined by density functional theory

    NASA Astrophysics Data System (ADS)

    Svane, K. L.; Linderoth, T. R.; Hammer, B.

    2016-02-01

    We present a comprehensive theoretical investigation of the structures formed by self-assembly of tetrahydroxybenzene (THB)-derivatives on Cu(111). The THB molecule is known to dehydrogenate completely during annealing, forming a reactive radical which assembles into a close-packed structure or a porous metal-coordinated network depending on the coverage of the system. Here, we present details on how the structures are determined by density functional theory calculations, using scanning tunneling microscopy-derived information on the periodicity. The porous network is based on adatom trimers. By analysing the charge distribution of the structure, it is found that this unusual coordination motif is preferred because it simultaneously provides a good coordination of all oxygen atoms and allows for the formation of a two-dimensional network on the surface.

  19. Software reliability studies

    NASA Technical Reports Server (NTRS)

    Wilson, Larry W.

    1989-01-01

    The long-term goal of this research is to identify or create a model for use in analyzing the reliability of flight control software. The immediate tasks addressed are the creation of data useful to the study of software reliability and production of results pertinent to software reliability through the analysis of existing reliability models and data. The completed data creation portion of this research consists of a Generic Checkout System (GCS) design document created in cooperation with NASA and Research Triangle Institute (RTI) experimenters. This will lead to design and code reviews with the resulting product being one of the versions used in the Terminal Descent Experiment being conducted by the Systems Validations Methods Branch (SVMB) of NASA/Langley. An appended paper details an investigation of the Jelinski-Moranda and Geometric models for software reliability. The models were given data from a process that they have correctly simulated and asked to make predictions about the reliability of that process. It was found that either model will usually fail to make good predictions. These problems were attributed to randomness in the data and replication of data was recommended.

  20. Multidisciplinary System Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines such as heat transfer, fluid mechanics, electrical circuits etc., without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines are investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.
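    The multidisciplinary system reliability idea, one failure probability over several limit states from different disciplines, can be sketched with plain Monte Carlo. This is only an illustrative sketch: NESSUS uses fast probability integration rather than raw sampling, and the limit states below are hypothetical.

```python
import random

def series_system_failure_prob(limit_states, n_samples=100_000, seed=1):
    """Monte Carlo estimate of series-system failure probability:
    the system fails if ANY limit state g(x) < 0. One standard-normal
    random variable is shared by all disciplines for simplicity."""
    random.seed(seed)
    failures = 0
    for _ in range(n_samples):
        x = random.gauss(0.0, 1.0)
        if any(g(x) < 0 for g in limit_states):
            failures += 1
    return failures / n_samples

# Two hypothetical failure modes, e.g. one structural and one thermal,
# expressed in the same underlying random variable
modes = [lambda x: 2.0 - x,    # fails when x > 2
         lambda x: x + 3.0]    # fails when x < -3
print(series_system_failure_prob(modes))
```

    Because the modes share the same random variable, their failure probabilities do not simply add when they overlap; that correlation handling is what system reliability methods formalize.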

  1. Scaled CMOS Technology Reliability Users Guide

    NASA Technical Reports Server (NTRS)

    White, Mark

    2010-01-01

    The desire to assess the reliability of emerging scaled microelectronics technologies through faster reliability trials and more accurate acceleration models is the precursor for further research and experimentation in this relevant field. The effect of semiconductor scaling on microelectronics product reliability is an important aspect for the high-reliability application user. From the perspective of a customer or user, who in many cases must deal with very limited, if any, manufacturer's reliability data to assess the product for a highly reliable application, product-level testing is critical in the characterization and reliability assessment of advanced nanometer semiconductor scaling effects on microelectronics reliability. A methodology on how to accomplish this and techniques for deriving the expected product-level reliability of commercial memory products are provided. Competing mechanism theory and the multiple failure mechanism model are applied to the experimental results of scaled SDRAM products. Accelerated stress testing at multiple conditions is applied at the product level of several scaled memory products to assess the performance degradation and product reliability. Acceleration models are derived for each case. For several scaled SDRAM products, retention time degradation is studied and two distinct soft error populations are observed with each technology generation: early breakdown, characterized by randomly distributed weak bits with Weibull slope (beta)=1, and a main population breakdown with an increasing failure rate. Retention time soft error rates are calculated and a multiple failure mechanism acceleration model with parameters is derived for each technology. Defect densities are calculated and reflect a decreasing trend in the percentage of random defective bits for each successive product generation. A normalized soft error failure rate of the memory data retention time in FIT/Gb and FIT/cm2 for several scaled SDRAM generations is
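    The product-level approach described here rests on standard acceleration modeling. As a hedged sketch (the activation energy, temperatures, and test counts below are invented, not the paper's data), a temperature-accelerated test can be translated to a use-condition failure rate in FIT:

```python
import math

K_B_EV = 8.617e-5  # Boltzmann constant, eV/K

def arrhenius_af(ea_ev, t_use_k, t_stress_k):
    """Arrhenius acceleration factor between stress and use temperatures."""
    return math.exp((ea_ev / K_B_EV) * (1.0 / t_use_k - 1.0 / t_stress_k))

def fit_rate(failures, devices, hours, af):
    """Point-estimate failure rate in FIT (failures per 1e9 device-hours)
    at use conditions, given accelerated-test results."""
    return failures / (devices * hours * af) * 1e9

af = arrhenius_af(ea_ev=0.7, t_use_k=328.0, t_stress_k=398.0)
fit = fit_rate(failures=3, devices=500, hours=1000.0, af=af)
```

The multiple-failure-mechanism model in the abstract combines several such mechanism-specific acceleration factors rather than assuming a single dominant mechanism.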

  2. Theory-guided nano-engineering of organic electro-optic materials for hybrid silicon photonic, plasmonic, and metamaterial devices

    NASA Astrophysics Data System (ADS)

    Dalton, Larry R.

    2013-03-01

    Coarse-grained Monte Carlo/molecular dynamic calculations are employed to explore the effect of various intermolecular electrostatic interactions upon chromophore order, lattice dimensionality, and viscoelasticity in electrically-poled organic second order nonlinear optical materials. The following classes of organic macromolecular materials are considered: (1) Chromophore-polymer composites, (2) chromophores covalently incorporated into polymers and dendrimers, (3) chromophores incorporating additional dipolar or quadrupolar interactions that enhance poling efficiency, and (4) binary chromophore materials. For chromophore-polymer composites, the competition of chromophore-chromophore dipolar interactions and nuclear repulsive (steric) interactions defines poling-induced acentric order. For covalently incorporated chromophores, covalent bond potentials also influence poling-induced order. These first two classes of materials basically behave as Langevin (3-D) lattice materials. Dipolar (e.g., coumarin) and quadrupolar (arene-perfluoroarene) interactions act to influence lattice dimensionality and thus enhance poling efficiency (the ratio of electro-optic activity to electric poling field strength). The long-range molecular cooperativity associated with these interactions influences viscoelastic properties critical to material processing and integration into silicon photonic, plasmonic, and metamaterial devices. The interaction between different chromophore species in binary chromophore materials also enhances poling efficiency. Polarized laser radiation applied to certain binary chromophore materials can also be used to enhance poling efficiency through control of lattice dimensionality. Poling efficiency approaching 5 (nm/V)2 has been achieved for these latter two classes of materials. Improvements in poling efficiency and control of material viscosity are particularly important for the integration of organic materials into complex device structures.

  3. Test-Retest Reliability of High Angular Resolution Diffusion Imaging Acquisition within Medial Temporal Lobe Connections Assessed via Tract Based Spatial Statistics, Probabilistic Tractography and a Novel Graph Theory Metric

    PubMed Central

    Kuhn, T.; Gullett, J. M.; Nguyen, P.; Boutzoukas, A. E.; Ford, A.; Colon-Perez, L. M.; Triplett, W.; Carney, P.R.; Mareci, T. H.; Price, C. C.; Bauer, R. M.

    2015-01-01

    Introduction This study examined the reliability of high angular resolution diffusion tensor imaging (HARDI) data collected on a single individual across several sessions using the same scanner. Methods HARDI data was acquired for one healthy adult male at the same time of day on ten separate days across a one-month period. Environmental factors (e.g. temperature) were controlled across scanning sessions. Tract Based Spatial Statistics (TBSS) was used to assess session-to-session variability in measures of diffusion, fractional anisotropy (FA) and mean diffusivity (MD). To address reliability within specific structures of the medial temporal lobe (MTL; the focus of an ongoing investigation), probabilistic tractography segmented the Entorhinal cortex (ERc) based on connections with Hippocampus (HC), Perirhinal (PRc) and Parahippocampal (PHc) cortices. Streamline tractography generated edge weight (EW) metrics for the aforementioned ERc connections and, as comparison regions, connections between left and right rostral and caudal anterior cingulate cortex (ACC). Coefficients of variation (CoV) were derived for the surface area and volumes of these ERc connectivity-defined regions (CDR) and for EW across all ten scans, expecting that scan-to-scan reliability would yield low CoVs. Results TBSS revealed no significant variation in FA or MD across scanning sessions. Probabilistic tractography successfully reproduced histologically-verified adjacent medial temporal lobe circuits. Tractography-derived metrics displayed larger ranges of scan-to-scan variability. Connections involving HC displayed greater variability than metrics of connection between other investigated regions. Conclusions By confirming the test-retest reliability of HARDI data acquisition, support for the validity of significant results derived from diffusion data can be obtained. PMID:26189060
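    The scan-to-scan reliability statistic used here, the coefficient of variation, is simple to reproduce (the volumes below are made-up numbers, not the study's measurements):

```python
import statistics

def coefficient_of_variation(values):
    """CoV = sample standard deviation / mean, reported as a percentage."""
    return statistics.stdev(values) / statistics.mean(values) * 100.0

# one tract's volume (mm^3) measured on ten scans -- illustrative values
volumes = [1510, 1495, 1523, 1488, 1502, 1517, 1499, 1506, 1493, 1511]
cov = coefficient_of_variation(volumes)
```

A low CoV (here under 1%) indicates that repeated scans reproduce the measurement closely; tractography-derived metrics in the study showed larger CoVs than voxelwise diffusion measures.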

  4. Test-retest reliability of high angular resolution diffusion imaging acquisition within medial temporal lobe connections assessed via tract based spatial statistics, probabilistic tractography and a novel graph theory metric.

    PubMed

    Kuhn, T; Gullett, J M; Nguyen, P; Boutzoukas, A E; Ford, A; Colon-Perez, L M; Triplett, W; Carney, P R; Mareci, T H; Price, C C; Bauer, R M

    2016-06-01

    This study examined the reliability of high angular resolution diffusion tensor imaging (HARDI) data collected on a single individual across several sessions using the same scanner. HARDI data was acquired for one healthy adult male at the same time of day on ten separate days across a one-month period. Environmental factors (e.g. temperature) were controlled across scanning sessions. Tract Based Spatial Statistics (TBSS) was used to assess session-to-session variability in measures of diffusion, fractional anisotropy (FA) and mean diffusivity (MD). To address reliability within specific structures of the medial temporal lobe (MTL; the focus of an ongoing investigation), probabilistic tractography segmented the Entorhinal cortex (ERc) based on connections with Hippocampus (HC), Perirhinal (PRc) and Parahippocampal (PHc) cortices. Streamline tractography generated edge weight (EW) metrics for the aforementioned ERc connections and, as comparison regions, connections between left and right rostral and caudal anterior cingulate cortex (ACC). Coefficients of variation (CoV) were derived for the surface area and volumes of these ERc connectivity-defined regions (CDR) and for EW across all ten scans, expecting that scan-to-scan reliability would yield low CoVs. TBSS revealed no significant variation in FA or MD across scanning sessions. Probabilistic tractography successfully reproduced histologically-verified adjacent medial temporal lobe circuits. Tractography-derived metrics displayed larger ranges of scan-to-scan variability. Connections involving HC displayed greater variability than metrics of connection between other investigated regions. By confirming the test-retest reliability of HARDI data acquisition, support for the validity of significant results derived from diffusion data can be obtained. PMID:26189060

  5. Measurement Practices for Reliability and Power Quality

    SciTech Connect

    Kueck, JD

    2005-05-06

    This report provides a distribution reliability measurement "toolkit" that is intended to be an asset to regulators, utilities and power users. The metrics and standards discussed range from simple reliability, to power quality, to the new blend of reliability and power quality analysis that is now developing. This report was sponsored by the Office of Electric Transmission and Distribution, U.S. Department of Energy (DOE). Inconsistencies presently exist in commonly agreed-upon practices for measuring the reliability of distribution systems. However, efforts are being made by a number of organizations to develop solutions. In addition, there is growing interest in methods or standards for measuring power quality, and in defining power quality levels that are acceptable to various industries or user groups. The problems and solutions vary widely among geographic areas and among large investor-owned utilities, rural cooperatives, and municipal utilities; but there is still a great degree of commonality. Industry organizations such as the National Rural Electric Cooperative Association (NRECA), the Electric Power Research Institute (EPRI), the American Public Power Association (APPA), and the Institute of Electrical and Electronics Engineers (IEEE) have made tremendous strides in preparing self-assessment templates, optimization guides, diagnostic techniques, and better definitions of reliability and power quality measures. In addition, public utility commissions have developed codes and methods for assessing performance that consider local needs. There is considerable overlap among these various organizations, and we see real opportunity and value in sharing these methods, guides, and standards in this report. This report provides a "toolkit" containing synopses of noteworthy reliability measurement practices. The toolkit has been developed to address the interests of three groups: electric power users, utilities, and regulators.
The report will also serve
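    Distribution reliability is conventionally summarized with IEEE Std 1366 indices such as SAIFI and SAIDI, which reduce to simple ratios over outage events; a minimal sketch (the event data below are illustrative):

```python
def saifi(events, total_customers):
    """System Average Interruption Frequency Index:
    total customer interruptions / customers served (IEEE Std 1366)."""
    return sum(n for n, _ in events) / total_customers

def saidi(events, total_customers):
    """System Average Interruption Duration Index:
    total customer-minutes interrupted / customers served."""
    return sum(n * minutes for n, minutes in events) / total_customers

# (customers affected, outage minutes) per sustained interruption
events = [(1200, 90), (300, 45), (5000, 20)]
freq = saifi(events, total_customers=40_000)  # interruptions per customer
dur = saidi(events, total_customers=40_000)   # minutes per customer
```

Much of the inconsistency the report describes comes down to what counts as a reportable event (e.g. whether major storms are excluded) before these ratios are taken.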

  6. Donating blood and organs: using an extended theory of planned behavior perspective to identify similarities and differences in individual motivations to donate.

    PubMed

    Hyde, Melissa K; Knowles, Simon R; White, Katherine M

    2013-12-01

    Due to the critical shortage of and continued need for blood and organ donations (ODs), research exploring similarities and differences in the motivational determinants of these behaviors is needed. In a sample of 258 university students, we used a cross-sectional design to test the utility of an extended theory of planned behavior (TPB) including moral norm, self-identity and in-group altruism (family/close friends and ethnic group), to predict people's blood and OD intentions. Overall, the extended TPB explained 77.0% and 74.6% of variance in blood and OD intentions, respectively. In regression analyses, common contributors to intentions across donation contexts were attitude, self-efficacy and self-identity. Normative influences varied, with subjective norm a significant predictor of OD intentions but not blood donation intentions at the final step of regression analyses. Moral norm did not contribute significantly to blood or OD intentions. In-group altruism (family/close friends) was significantly related to OD intentions only in regressions. Future donation strategies should increase confidence to donate, foster a perception of self as the type of person who donates blood and/or organs, and address preferences to donate organs to in-group members only. PMID:23943782
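    The headline statistic here, the percentage of variance in intentions explained, is the R-squared of the regression model. A self-contained sketch of that computation (the scores below are invented, not the study's data):

```python
def r_squared(observed, predicted):
    """Proportion of variance explained: 1 - SS_residual / SS_total."""
    mean = sum(observed) / len(observed)
    ss_tot = sum((y - mean) ** 2 for y in observed)
    ss_res = sum((y - p) ** 2 for y, p in zip(observed, predicted))
    return 1.0 - ss_res / ss_tot

# donation-intention scores vs. model predictions (illustrative values)
obs = [6.0, 4.5, 5.5, 3.0, 6.5, 4.0]
pred = [5.8, 4.6, 5.2, 3.4, 6.3, 4.2]
r2 = r_squared(obs, pred)
```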

  7. From Organized High-Throughput Data to Phenomenological Theory using Machine Learning: The Example of Dielectric Breakdown

    DOE PAGESBeta

    Kim, Chiho; Pilania, Ghanshyam; Ramprasad, Ramamurthy

    2016-02-02

    Understanding the behavior (and failure) of dielectric insulators experiencing extreme electric fields is critical to the operation of present and emerging electrical and electronic devices. Despite its importance, the development of a predictive theory of dielectric breakdown has remained a challenge, owing to the complex multiscale nature of this process. We focus on the intrinsic dielectric breakdown field of insulators—the theoretical limit of breakdown determined purely by the chemistry of the material, i.e., the elements the material is composed of, the atomic-level structure, and the bonding. Starting from a benchmark dataset (generated from laborious first principles computations) of the intrinsic dielectric breakdown field of a variety of model insulators, simple predictive phenomenological models of dielectric breakdown are distilled using advanced statistical or machine learning schemes, revealing key correlations and analytical relationships between the breakdown field and easily accessible material properties. Lastly, the models are shown to be general, and can hence guide the screening and systematic identification of high electric field tolerant materials.
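    The kind of phenomenological relationship the authors distill, for example a power law linking breakdown field to an easily computed property such as the band gap, amounts to a least-squares fit in log space. A toy version (the data points are invented, not the paper's benchmark set):

```python
import math

def fit_line(xs, ys):
    """Closed-form ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# wider band gap -> higher breakdown field (illustrative numbers)
gaps_ev = [1.1, 3.4, 5.5, 9.0]
fields_mv_m = [30.0, 300.0, 2000.0, 10000.0]
a, b = fit_line([math.log(g) for g in gaps_ev],
                [math.log(f) for f in fields_mv_m])
```

A positive exponent b recovered from such a fit is the sort of screening rule the paper's machine learning schemes extract automatically from many candidate features.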

  8. [Segments of the recovery story of a patient with anorexia nervosa from the viewpoint of self-organization theory].

    PubMed

    Empt, A K; Schiepek, G

    2000-11-01

    In order to examine the dynamics of anorectic processes, we interviewed a 16-year-old girl who had recently recovered from her illness. She portrayed her experiences for a time span of 22 months, from the beginning of her self-starvation until the date of the interview. After clustering all the information into variables, she rated the degree of these variables for 11 important events during the time of her recovery. Synergies between cognition, emotion, and behaviour during recovery are displayed. Activated feedback dynamics (positive, negative, and mixed feedback loops) support the interpretation of recovery from anorexia nervosa as a dynamic, self-organizing system. Nonlinear dynamics are salient in psychological change even when a qualitative view of the phenomena is adopted. PMID:11138470

  9. Statistical modelling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1991-01-01

    During the six-month period from 1 April 1991 to 30 September 1991 the following research papers in statistical modeling of software reliability appeared: (1) A Nonparametric Software Reliability Growth Model; (2) On the Use and the Performance of Software Reliability Growth Models; (3) Research and Development Issues in Software Reliability Engineering; (4) Special Issues on Software; and (5) Software Reliability and Safety.

  10. Photovoltaic module reliability workshop

    NASA Astrophysics Data System (ADS)

    Mrig, L.

    The papers and presentations compiled in this volume form the Proceedings of the fourth in a series of Workshops sponsored by the Solar Energy Research Institute (SERI/DOE) under the general theme of photovoltaic module reliability during the period 1986 to 1990. The reliability of photovoltaic (PV) modules/systems is exceedingly important, along with the initial cost and efficiency of modules, if PV technology is to make a major impact in the power generation market and compete with conventional electricity-producing technologies. The reliability of photovoltaic modules has progressed significantly in the last few years, as evidenced by warranties available on commercial modules of as long as 12 years. However, substantial research and testing are still required to improve module field reliability to levels of 30 years or more. Several small groups of researchers are involved in this research, development, and monitoring activity around the world. In the U.S., PV manufacturers, DOE laboratories, electric utilities and others are engaged in photovoltaic reliability research and testing. This group of researchers and others interested in this field were brought together under SERI/DOE sponsorship to exchange the technical knowledge and field experience related to current information in this important field. The papers presented here reflect this effort.
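    To put the "30 years or more" target in perspective: under the simplifying assumption of a constant failure rate, the rate a module population must achieve can be back-calculated from a survival target. Real modules degrade gradually rather than failing at a constant rate, so this is only a rough bound:

```python
import math

def required_fit(target_reliability, years):
    """Constant failure rate (in FIT, failures per 1e9 device-hours)
    such that R(t) = exp(-lambda * t) still meets the target after
    `years` of field operation.  The constant-rate assumption is a
    simplification; fielded modules actually show gradual degradation."""
    hours = years * 8760.0
    return -math.log(target_reliability) / hours * 1e9

# e.g. 90% of modules still working after 30 years
fit_needed = required_fit(0.90, 30)
```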

  11. Proposed reliability cost model

    NASA Technical Reports Server (NTRS)

    Delionback, L. M.

    1973-01-01

    The research investigations which were involved in the study include: cost analysis/allocation, reliability and product assurance, forecasting methodology, systems analysis, and model-building. This is a classic example of an interdisciplinary problem, since the model-building requirements include the need for understanding and communication between technical disciplines on one hand, and the financial/accounting skill categories on the other. The systems approach is utilized within this context to establish a clearer and more objective relationship between reliability assurance and the subcategories (or subelements) that provide, or reenforce, the reliability assurance for a system. Subcategories are further subdivided as illustrated by a tree diagram. The reliability assurance elements can be seen to be potential alternative strategies, or approaches, depending on the specific goals/objectives of the trade studies. The scope was limited to the establishment of a proposed reliability cost-model format. The model format/approach is dependent upon the use of a series of subsystem-oriented CER's and sometimes possible CTR's, in devising a suitable cost-effective policy.

  12. Orbiter Autoland reliability analysis

    NASA Technical Reports Server (NTRS)

    Welch, D. Phillip

    1993-01-01

    The Space Shuttle Orbiter is the only space reentry vehicle in which the crew is seated upright. This position presents some physiological effects requiring countermeasures to prevent a crewmember from becoming incapacitated. This also introduces a potential need for automated vehicle landing capability. Autoland is a primary procedure that was identified as a requirement for landing following an extended-duration orbiter mission. This report documents the results of the reliability analysis performed on the hardware required for an automated landing. A reliability block diagram was used to evaluate system reliability. The analysis considers the manual and automated landing modes currently available on the Orbiter. (Autoland is presently a backup system only.) Results of this study indicate a +/- 36 percent probability of successfully extending a nominal mission to 30 days. Enough variations were evaluated to verify that the reliability could be altered with mission planning and procedures. If the crew is modeled as being fully capable after 30 days, the probability of a successful manual landing is comparable to that of Autoland because much of the hardware is used for both manual and automated landing modes. The analysis indicates that the reliability for the manual mode is limited by the hardware and depends greatly on crew capability. Crew capability for a successful landing after 30 days has not yet been determined.
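    A reliability block diagram evaluation reduces to products over series paths and complements over redundant (parallel) paths. A minimal sketch (the structure and numbers below are illustrative, not the Orbiter's actual architecture):

```python
def series(*rs):
    """Reliability of components in series: all must work."""
    r = 1.0
    for x in rs:
        r *= x
    return r

def parallel(*rs):
    """Reliability of redundant components: any one suffices."""
    q = 1.0
    for x in rs:
        q *= (1.0 - x)
    return 1.0 - q

# toy landing chain: redundant flight computers in series with a
# sensor block and an actuator block (all values hypothetical)
r_system = series(parallel(0.95, 0.95), 0.99, 0.98)
```

Shared hardware between the manual and automated modes, as the abstract notes, means the two modes' block diagrams overlap and their reliabilities are not independent.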

  13. Photovoltaic module reliability workshop

    SciTech Connect

    Mrig, L.

    1990-01-01

    The papers and presentations compiled in this volume form the Proceedings of the fourth in a series of Workshops sponsored by the Solar Energy Research Institute (SERI/DOE) under the general theme of photovoltaic module reliability during the period 1986--1990. The reliability of photovoltaic (PV) modules/systems is exceedingly important, along with the initial cost and efficiency of modules, if PV technology is to make a major impact in the power generation market and compete with conventional electricity-producing technologies. The reliability of photovoltaic modules has progressed significantly in the last few years, as evidenced by warranties available on commercial modules of as long as 12 years. However, substantial research and testing are still required to improve module field reliability to levels of 30 years or more. Several small groups of researchers are involved in this research, development, and monitoring activity around the world. In the US, PV manufacturers, DOE laboratories, electric utilities and others are engaged in photovoltaic reliability research and testing. This group of researchers and others interested in this field were brought together under SERI/DOE sponsorship to exchange the technical knowledge and field experience related to current information in this important field. The papers presented here reflect this effort.

  14. Density functional theory meta-GGA + U study of water incorporation in the metal-organic framework material Cu-BTC

    NASA Astrophysics Data System (ADS)

    Cockayne, Eric; Nelson, Eric B.

    2015-07-01

    Water absorption in the metal-organic framework (MOF) material Cu-BTC, up to a concentration of 3.5 H2O per Cu ion, is studied via density functional theory at the meta-GGA + U level. The stable arrangements of water molecules show chains of hydrogen-bonded water molecules and a tendency to form closed cages at high concentration. Water clusters are stabilized primarily by a combination of water-water hydrogen bonding and Cu-water oxygen interactions. Stability is further enhanced by van der Waals interactions, electric field enhancement of water-water bonding, and hydrogen bonding of water to framework oxygens. We hypothesize that the tendency to form such stable clusters explains the particularly strong affinity of water to Cu-BTC and related MOFs with exposed metal sites.

  15. Density functional theory meta-GGA + U study of water incorporation in the metal-organic framework material Cu-BTC.

    PubMed

    Cockayne, Eric; Nelson, Eric B

    2015-07-14

    Water absorption in the metal-organic framework (MOF) material Cu-BTC, up to a concentration of 3.5 H2O per Cu ion, is studied via density functional theory at the meta-GGA + U level. The stable arrangements of water molecules show chains of hydrogen-bonded water molecules and a tendency to form closed cages at high concentration. Water clusters are stabilized primarily by a combination of water-water hydrogen bonding and Cu-water oxygen interactions. Stability is further enhanced by van der Waals interactions, electric field enhancement of water-water bonding, and hydrogen bonding of water to framework oxygens. We hypothesize that the tendency to form such stable clusters explains the particularly strong affinity of water to Cu-BTC and related MOFs with exposed metal sites. PMID:26178120

  16. Infrared measurements of organic radical anions in solution using mid-infrared optical fibers and spectral analyses based on density functional theory calculations

    NASA Astrophysics Data System (ADS)

    Sakamoto, Akira; Kuroda, Masahito; Harada, Tomohisa; Tasumi, Mitsuo

    2005-02-01

    By using ATR and transmission probes combined with bundles of mid-infrared optical fibers, high-quality infrared spectra are observed for the radical anions of biphenyl and naphthalene in deuterated tetrahydrofuran solutions. The ATR and transmission probes can be inserted into a glass-tube cell with O-rings under vacuum. Organic radical anions prepared separately in a vacuum system are transferred into the cell for infrared absorption measurements. Observed infrared spectra are in good agreement with those calculated by density functional theory. The origin of the strong infrared absorption intensities characteristic of the radical anions is discussed in terms of changes in electronic structures induced by specific normal vibrations (electron-molecular vibration interaction).

  17. Reliability Centered Maintenance - Methodologies

    NASA Technical Reports Server (NTRS)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  18. Software reliability perspectives

    NASA Technical Reports Server (NTRS)

    Wilson, Larry; Shen, Wenhui

    1987-01-01

    Software which is used in life-critical functions must be known to be highly reliable before installation. This requires a strong testing program to estimate the reliability, since neither formal methods, software engineering nor fault tolerant methods can guarantee perfection. Prior to final testing, software goes through a debugging period, and many models have been developed to try to estimate reliability from the debugging data. However, the existing models are poorly validated and often give poor performance. This paper emphasizes the fact that part of their failures can be attributed to the random nature of the debugging data given to these models as input, and it poses the problem of correcting this defect as an area of future research.

  19. Exploring the sodium storage mechanism in disodium terephthalate as anode for organic battery using density-functional theory calculations

    NASA Astrophysics Data System (ADS)

    Sk, Mahasin Alam; Manzhos, Sergei

    2016-08-01

    We present an ab initio study of the sodium storage mechanism in disodium terephthalate (Na2TP), a very promising anode material for organic sodium (Na)-ion batteries with reported experimental capacities of ∼255 mAh g-1, previously attributed to Na attachment to the two carboxylate groups (coordinating to oxygen atoms). We show here that the inserted Na atoms prefer to bind at carboxylate sites at low Na concentrations, and these sites are dominant for insertion of up to one Na atom per molecule; for higher Na concentrations, the hexagonal sites (on the aromatic ring) become dominant. We confirm that the Na2TP crystal can store a maximum of two Na atoms per molecule, as observed in experiments. These results reveal that Na binding at carboxylate sites contributes to the initial part of the Na2TP sodiation curve and Na binding at hexagonal sites contributes to the second part of the curve. The inserted Na atoms donate electrons to empty states in the conduction band. Moreover, we show that the Na diffusion barriers in clean Na2TP can be as low as 0.23 eV. We also show that there is a significant difference in the mechanism of Na interaction between individual molecules and the crystal.
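    A diffusion barrier of 0.23 eV implies fast room-temperature hopping. Under transition-state theory with a typical assumed attempt frequency of about 1e13 Hz (the prefactor is an assumption for illustration, not a value computed in the paper):

```python
import math

K_B_EV = 8.617e-5  # Boltzmann constant, eV/K

def hop_rate(barrier_ev, temp_k, attempt_hz=1e13):
    """Transition-state-theory estimate of an ion hop rate:
    rate = attempt_frequency * exp(-barrier / kT).
    The 1e13 Hz attempt frequency is a typical assumed prefactor."""
    return attempt_hz * math.exp(-barrier_ev / (K_B_EV * temp_k))

rate = hop_rate(0.23, 300.0)  # Na hops per second at room temperature
```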

  20. Density Functional Theory Study of Hydrogen Adsorption in a Ti-Decorated Mg-Based Metal-Organic Framework-74.

    PubMed

    Suksaengrat, Pitphichaya; Amornkitbamrung, Vittaya; Srepusharawoot, Pornjuk; Ahuja, Rajeev

    2016-03-16

    The Ti-binding energy and hydrogen adsorption energy of a Ti-decorated Mg-based metal-organic framework-74 (Mg-MOF-74) were evaluated by using first-principles calculations. Our results revealed that only three Ti adsorption sites were found to be stable. The adsorption site near the metal oxide unit is the most stable. To investigate the hydrogen-adsorption properties of Ti-functionalized Mg-MOF-74, the hydrogen-binding energy was determined. For the most stable Ti adsorption site, we found that the hydrogen adsorption energy ranged from 0.26 to 0.48 eV per H2. This is within the desirable range for practical hydrogen-storage applications. Moreover, the hydrogen capacity was determined by using ab initio molecular dynamics simulations. Our results revealed that the hydrogen uptake by Ti-decorated Mg-MOF-74 at temperatures of 77, 150, and 298 K and ambient pressure was 1.81, 1.74, and 1.29 wt % H2, respectively. PMID:26717417

  1. Transcriptional regulation by histone modifications: towards a theory of chromatin re-organization during stem cell differentiation

    NASA Astrophysics Data System (ADS)

    Binder, Hans; Steiner, Lydia; Przybilla, Jens; Rohlf, Thimo; Prohaska, Sonja; Galle, Jörg

    2013-04-01

    Chromatin-related mechanisms, such as histone modifications, are known to be involved in regulatory switches within the transcriptome. Only recently have mathematical models of these mechanisms been established, and so far they have not been applied to genome-wide data. We here introduce a mathematical model of transcriptional regulation by histone modifications and apply it to data of trimethylation of histone 3 at lysine 4 (H3K4me3) and 27 (H3K27me3) in mouse pluripotent and lineage-committed cells. The model describes binding of protein complexes to chromatin which are capable of reading and writing histone marks. Molecular interactions of the complexes with DNA and modified histones create a regulatory switch of transcriptional activity. The regulatory states of the switch depend on the activity of histone (de-) methylases, the strength of complex-DNA-binding and the number of nucleosomes capable of cooperatively contributing to complex-binding. Our model explains experimentally measured length distributions of modified chromatin regions. It suggests (i) that high CpG-density facilitates recruitment of the modifying complexes in embryonic stem cells and (ii) that re-organization of extended chromatin regions during lineage specification into neuronal progenitor cells requires targeted de-modification. Our approach represents a basic step towards multi-scale models of transcriptional control during development and lineage specification.

  2. Gearbox Reliability Collaborative Update (Presentation)

    SciTech Connect

    Sheng, S.

    2013-10-01

    This presentation was given at the Sandia Reliability Workshop in August 2013 and provides information on current statistics, a status update, next steps, and other reliability research and development activities related to the Gearbox Reliability Collaborative.

  3. Materials reliability issues in microelectronics

    SciTech Connect

    Lloyd, J.R. ); Yost, F.G. ); Ho, P.S. )

    1991-01-01

    This book covers the proceedings of a MRS symposium on materials reliability in microelectronics. Topics include: electromigration; stress effects on reliability; stress and packaging; metallization; device, oxide and dielectric reliability; new investigative techniques; and corrosion.

  4. IRT-Estimated Reliability for Tests Containing Mixed Item Formats

    ERIC Educational Resources Information Center

    Shu, Lianghua; Schwarz, Richard D.

    2014-01-01

    As a global measure of precision, item response theory (IRT) estimated reliability is derived for four coefficients (Cronbach's alpha, Feldt-Raju, stratified alpha, and marginal reliability). Models with different underlying assumptions concerning test-part similarity are discussed. A detailed computational example is presented for the targeted…
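
    The first of these coefficients reduces to a one-line formula. As an illustration only (not code from the study), Cronbach's alpha for an examinees-by-items score matrix:

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (examinees x items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)
```

    Perfectly parallel items drive the coefficient to 1; adding independent item noise pulls it down.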

  5. Designing reliability into accelerators

    SciTech Connect

    Hutton, A.

    1992-08-01

    For the next generation of high performance, high average luminosity colliders, the "factories," reliability engineering must be introduced right at the inception of the project and maintained as a central theme throughout the project. There are several aspects which will be addressed separately: Concept; design; motivation; management techniques; and fault diagnosis.

  7. Software reliability report

    NASA Technical Reports Server (NTRS)

    Wilson, Larry

    1991-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Unfortunately, the models appear to be unable to account for the random nature of the data. If the same code is debugged multiple times and one of the models is used to make predictions, intolerable variance is observed in the resulting reliability predictions. It is believed that data replication can remove this variance in lab type situations and that it is less than scientific to talk about validating a software reliability model without considering replication. It is also believed that data replication may prove to be cost effective in the real world, thus the research centered on verification of the need for replication and on methodologies for generating replicated data in a cost effective manner. The context of the debugging graph was pursued by simulation and experimentation. Simulation was done for the Basic model and the Log-Poisson model. Reasonable values of the parameters were assigned and used to generate simulated data which is then processed by the models in order to determine limitations on their accuracy. These experiments exploit the existing software and program specimens which are in AIR-LAB to measure the performance of reliability models.
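
    The replication experiment described here is easy to mimic in miniature. The sketch below is my own illustration with made-up parameters, not the report's code: it simulates repeated debugging runs under a Goel-Okumoto-type NHPP ("Basic") model and exposes the run-to-run spread that makes single-sample reliability predictions unstable.

```python
import numpy as np

def simulate_go_failures(a, b, horizon, rng):
    """One simulated debugging run under a Goel-Okumoto-type NHPP ('Basic')
    model with mean-value function m(t) = a*(1 - exp(-b*t)). Unit-rate
    Poisson arrivals s are mapped through m^{-1}(s) = -ln(1 - s/a)/b
    to produce failure times."""
    times, s = [], 0.0
    while True:
        s += rng.exponential(1.0)
        if s >= a:          # expected total fault content exhausted
            break
        t = -np.log(1.0 - s / a) / b
        if t > horizon:
            break
        times.append(t)
    return np.array(times)

# Replicate the "same code debugged many times" experiment (invented parameters).
rng = np.random.default_rng(1)
counts = [len(simulate_go_failures(100.0, 0.05, 40.0, rng)) for _ in range(200)]
mean_count = float(np.mean(counts))
spread = float(np.std(counts))
```

    The spread across replications is exactly the variance a model fit to a single run cannot see.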

  8. Software Reliability Measurement Experience

    NASA Technical Reports Server (NTRS)

    Nikora, A. P.

    1993-01-01

    In this chapter, we describe a recent study of software reliability measurement methods that was conducted at the Jet Propulsion Laboratory. The first section of the chapter, section 8.1, summarizes the study, characterizes the participating projects, describes the available data, and summarizes the study's results.

  9. Reliable solar cookers

    SciTech Connect

    Magney, G.K.

    1992-12-31

    The author describes the activities of SERVE, a Christian relief and development agency, to introduce solar ovens to the Afghan refugees in Pakistan. It has provided 5,000 solar cookers since 1984. The experience has demonstrated the potential of the technology and the need for a durable and reliable product. Common complaints about the cookers are discussed and the ideal cooker is described.

  10. Parametric Mass Reliability Study

    NASA Technical Reports Server (NTRS)

    Holt, James P.

    2014-01-01

    The International Space Station (ISS) systems are designed based upon having redundant systems with replaceable orbital replacement units (ORUs). These ORUs are designed to be swapped out fairly quickly, but some are very large, and some are made up of many components. When an ORU fails, it is replaced on orbit with a spare; the failed unit is sometimes returned to Earth to be serviced and re-launched. Such a system is not feasible for a 500+ day long-duration mission beyond low Earth orbit. The components that make up these ORUs have mixed reliabilities. Components that make up the most mass, such as computer housings, pump casings, and the silicon boards of PCBs, are typically the most reliable. Meanwhile, components that tend to fail the earliest, such as seals or gaskets, typically have a small mass. To better understand the problem, my project is to create a parametric model that relates both the mass of ORUs and the mass of ORU subcomponents to reliability.
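
    The mass-versus-reliability tension the abstract describes can be made concrete with a toy series-system calculation. All part names, masses, and failure rates below are hypothetical, chosen only to echo the heavy-reliable / light-fragile pattern:

```python
import math

# Hypothetical ORU subcomponents: (name, mass in kg, failure rate per 1000 days).
# Heavy housings and casings get low rates; light seals and gaskets get high ones.
PARTS = [("housing", 40.0, 0.01), ("pump casing", 25.0, 0.02),
         ("circuit board", 5.0, 0.05), ("seal", 0.2, 0.60), ("gasket", 0.1, 0.45)]

def series_reliability(parts, days):
    """Probability the whole ORU survives `days`: every part must survive,
    assuming independent exponential lifetimes."""
    return math.prod(math.exp(-rate * days / 1000.0) for _, _, rate in parts)
```

    Dropping the light soft goods (seals, gaskets) from the series raises the 500-day survival probability far more than dropping the heavy housings would, despite their negligible mass.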

  11. Nonparametric Methods in Reliability

    PubMed Central

    Hollander, Myles; Peña, Edsel A.

    2005-01-01

    Probabilistic and statistical models for the occurrence of a recurrent event over time are described. These models have applicability in the reliability, engineering, biomedical and other areas where a series of events occurs for an experimental unit as time progresses. Nonparametric inference methods, in particular, the estimation of a relevant distribution function, are described. PMID:16710444
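
    A canonical example of such a nonparametric estimator is the Kaplan-Meier product-limit estimate of the survival function. The sketch below is a minimal illustration for right-censored failure data, not the authors' code:

```python
def kaplan_meier(times, observed):
    """Product-limit estimate of the survival function S(t).
    `observed[i]` is True if times[i] is an actual failure,
    False if the unit was right-censored at times[i].
    Returns a list of (event time, S(t)) pairs."""
    events = sorted(set(t for t, d in zip(times, observed) if d))
    surv, estimates = 1.0, []
    for t in events:
        at_risk = sum(1 for x in times if x >= t)
        deaths = sum(1 for x, d in zip(times, observed) if d and x == t)
        surv *= 1.0 - deaths / at_risk
        estimates.append((t, surv))
    return estimates
```

    Censored units still count toward the risk set before their censoring time, which is what distinguishes this from a naive empirical distribution function.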

  12. The Examination of Reliability According to Classical Test and Generalizability on a Job Performance Scale

    ERIC Educational Resources Information Center

    Yelboga, Atilla; Tavsancil, Ezel

    2010-01-01

    In this research, the classical test theory and generalizability theory analyses were carried out with the data obtained by a job performance scale for the years 2005 and 2006. The reliability coefficients obtained (estimated) from the classical test theory and generalizability theory analyses were compared. In classical test theory, test retest…

  13. 78 FR 58492 - Generator Verification Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-24

    ... Energy Regulatory Commission 18 CFR Part 40 Generator Verification Reliability Standards AGENCY: Federal... Organization: MOD-025-2 (Verification and Data Reporting of Generator Real and Reactive Power Capability and Synchronous Condenser Reactive Power Capability), MOD- 026-1 (Verification of Models and Data for...

  14. The Estimation of the IRT Reliability Coefficient and Its Lower and Upper Bounds, with Comparisons to CTT Reliability Statistics

    ERIC Educational Resources Information Center

    Kim, Seonghoon; Feldt, Leonard S.

    2010-01-01

    The primary purpose of this study is to investigate the mathematical characteristics of the test reliability coefficient rho[subscript XX'] as a function of item response theory (IRT) parameters and present the lower and upper bounds of the coefficient. Another purpose is to examine relative performances of the IRT reliability statistics and two…
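
    One common definition of marginal IRT reliability (one of the statistics compared in such studies, though not necessarily the exact formulation used here) treats the latent trait as standard normal and averages the squared standard error of measurement over that population: rho = var(theta) / (var(theta) + E[SE^2]). A sketch for a 2PL test, with invented item parameters in the test:

```python
import math

def marginal_reliability(items, grid=201):
    """Marginal reliability of a 2PL test under a standard-normal trait:
    rho = var(theta) / (var(theta) + E[SE^2]), with SE^2(theta) = 1/I(theta),
    var(theta) = 1, and I(theta) = sum of a^2 * p * (1-p) over items.
    `items` is a list of (a, b) discrimination/difficulty pairs."""
    lo, hi = -4.0, 4.0
    step = (hi - lo) / (grid - 1)
    weight_sum, weighted_se2 = 0.0, 0.0
    for i in range(grid):
        th = lo + i * step
        w = math.exp(-0.5 * th * th)          # unnormalized N(0,1) density
        info = sum(a * a * p * (1.0 - p)
                   for a, b in items
                   for p in [1.0 / (1.0 + math.exp(-a * (th - b)))])
        weighted_se2 += w / info
        weight_sum += w
    return 1.0 / (1.0 + weighted_se2 / weight_sum)
```

    Adding items strictly increases information at every trait level, so the coefficient rises toward its upper bound of 1.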

  15. Can Sex Differences in Science Be Tied to the Long Reach of Prenatal Hormones? Brain Organization Theory, Digit Ratio (2D/4D), and Sex Differences in Preferences and Cognition.

    PubMed

    Valla, Jeffrey; Ceci, Stephen J

    2011-03-01

    Brain organization theory posits a cascade of physiological and behavioral changes initiated and shaped by prenatal hormones. Recently, this theory has been associated with outcomes including gendered toy preference, 2D/4D digit ratio, personality characteristics, sexual orientation, and cognitive profile (spatial, verbal, and mathematical abilities). We examine the evidence for this claim, focusing on 2D/4D and its putative role as a biomarker for organizational features that influence cognitive abilities/interests predisposing males toward mathematically and spatially intensive careers. Although massive support exists for early brain organization theory overall, there are myriad inconsistencies, alternative explanations, and outright contradictions that must be addressed while still taking the entire theory into account. Like a fractal within the larger theory, the 2D/4D hypothesis mirrors this overall support on a smaller scale while likewise suffering from inconsistencies (positive, negative, and sex-dependent correlations), alternative explanations (2D/4D related to spatial preferences rather than abilities per se), and contradictions (feminine 2D/4D in men associated with higher spatial ability). Using the debate over brain organization theory as the theoretical stage, we focus on 2D/4D evidence as an increasingly important player on this stage, a demonstrative case in point of the evidential complexities of the broader debate, and an increasingly important topic in its own right. PMID:22164187

  16. Can Sex Differences in Science Be Tied to the Long Reach of Prenatal Hormones? Brain Organization Theory, Digit Ratio (2D/4D), and Sex Differences in Preferences and Cognition

    PubMed Central

    Valla, Jeffrey; Ceci, Stephen J.

    2011-01-01

    Brain organization theory posits a cascade of physiological and behavioral changes initiated and shaped by prenatal hormones. Recently, this theory has been associated with outcomes including gendered toy preference, 2D/4D digit ratio, personality characteristics, sexual orientation, and cognitive profile (spatial, verbal, and mathematical abilities). We examine the evidence for this claim, focusing on 2D/4D and its putative role as a biomarker for organizational features that influence cognitive abilities/interests predisposing males toward mathematically and spatially intensive careers. Although massive support exists for early brain organization theory overall, there are myriad inconsistencies, alternative explanations, and outright contradictions that must be addressed while still taking the entire theory into account. Like a fractal within the larger theory, the 2D/4D hypothesis mirrors this overall support on a smaller scale while likewise suffering from inconsistencies (positive, negative, and sex-dependent correlations), alternative explanations (2D/4D related to spatial preferences rather than abilities per se), and contradictions (feminine 2D/4D in men associated with higher spatial ability). Using the debate over brain organization theory as the theoretical stage, we focus on 2D/4D evidence as an increasingly important player on this stage, a demonstrative case in point of the evidential complexities of the broader debate, and an increasingly important topic in its own right. PMID:22164187

  17. Demonstration of reliability centered maintenance

    SciTech Connect

    Schwan, C.A.; Morgan, T.A.

    1991-04-01

    Reliability centered maintenance (RCM) is an approach to preventive maintenance planning and evaluation that has been used successfully by other industries, most notably the airlines and military. Now EPRI is demonstrating RCM in the commercial nuclear power industry. Just completed are large-scale, two-year demonstrations at Rochester Gas & Electric (Ginna Nuclear Power Station) and Southern California Edison (San Onofre Nuclear Generating Station). Both demonstrations were begun in the spring of 1988. At each plant, RCM was performed on 12 to 21 major systems. Both demonstrations determined that RCM is an appropriate means to optimize a PM program and improve nuclear plant preventive maintenance on a large scale. Such favorable results had been suggested by three earlier EPRI pilot studies at Florida Power & Light, Duke Power, and Southern California Edison. EPRI selected the Ginna and San Onofre sites because, together, they represent a broad range of utility and plant size, plant organization, plant age, and histories of availability and reliability. Significant steps in each demonstration included: selecting and prioritizing plant systems for RCM evaluation; performing the RCM evaluation steps on selected systems; evaluating the RCM recommendations by a multi-disciplinary task force; implementing the RCM recommendations; establishing a system to track and verify the RCM benefits; and establishing procedures to update the RCM bases and recommendations with time (a living program). 7 refs., 1 tab.

  18. Combined bending-torsion fatigue reliability. III

    NASA Technical Reports Server (NTRS)

    Kececioglu, D.; Chester, L. B.; Nolf, C. F., Jr.

    1975-01-01

    Results generated by three, unique fatigue reliability research machines which can apply reversed bending loads combined with steady torque are presented. AISI 4340 steel, grooved specimens with a stress concentration factor of 1.42 and 2.34, and Rockwell C hardness of 35/40 were subjected to various combinations of these loads and cycled to failure. The generated cycles-to-failure and stress-to-failure data are statistically analyzed to develop distributional S-N and Goodman diagrams. Various failure theories are investigated to determine which one represents the data best. The effects of the groove, and of the various combined bending-torsion loads, on the S-N and Goodman diagrams are determined. Two design applications are presented which illustrate the direct useability and value of the distributional failure governing strength and cycles-to-failure data in designing for specified levels of reliability and in predicting the reliability of given designs.
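
    The deterministic skeleton of a Goodman diagram is a single line in (mean stress, alternating stress) space; the distributional version developed in the paper randomizes its endpoints. A minimal sketch of the deterministic safety-factor calculation, with illustrative strength values in the test (not the paper's data):

```python
def goodman_safety_factor(sigma_a, sigma_m, s_e, s_u):
    """Modified-Goodman safety factor n for combined loading:
    1/n = sigma_a/S_e + sigma_m/S_u, where sigma_a is the alternating
    stress, sigma_m the mean stress, S_e the endurance limit, and
    S_u the ultimate strength (any consistent units, e.g. MPa)."""
    return 1.0 / (sigma_a / s_e + sigma_m / s_u)
```

    Fully reversed bending (sigma_m = 0) leaves n = S_e/sigma_a; adding a steady mean-stress component shrinks the factor, which is the trade-off the Goodman diagram visualizes.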

  19. Reliability analysis of ceramic matrix composite laminates

    NASA Technical Reports Server (NTRS)

    Thomas, David J.; Wetherhold, Robert C.

    1991-01-01

    At a macroscopic level, a composite lamina may be considered as a homogeneous orthotropic solid whose directional strengths are random variables. Incorporation of these random variable strengths into failure models, either interactive or non-interactive, allows for the evaluation of the lamina reliability under a given stress state. Using a non-interactive criterion for demonstration purposes, laminate reliabilities are calculated assuming previously established load sharing rules for the redistribution of load as the failure of laminae occur. The matrix cracking predicted by ACK theory is modeled to allow a loss of stiffness in the fiber direction. The subsequent failure in the fiber direction is controlled by a modified bundle theory. Results using this modified bundle model are compared with previous models which did not permit separate consideration of matrix cracking, as well as to results obtained from experimental data.

  20. Reliability design for impact vibration of hydraulic pressure pipeline systems

    NASA Astrophysics Data System (ADS)

    Zhang, Tianxiao; Liu, Xinhui

    2013-09-01

    Research on reliability design for impact vibration in hydraulic pressure pipeline systems is still at an early stage, and quantitative reliability research on hydraulic components and systems remains incomplete. Given the numerical characteristics of the basic random parameters, several techniques, including probability and statistics theory, hydraulic methods, and the stochastic perturbation method, are employed to carry out reliability design for impact vibration of the hydraulic pressure system. Accounting for the instantaneous pressure pulse of hydraulic impact in the pipeline, a reliability analysis model of the hydraulic pipeline system is established and a reliability-based optimization design method is presented. The proposed method accurately reflects the inherent reliability of the hydraulic pipe system; the reliability design is carried out by computer program, yielding the corresponding design information. This work provides a practical and effective method for reliability-based optimization design of hydraulic pressure systems subject to impact vibration, advances quantitative research on the reliability design of hydraulic pipeline systems, and generalizes to the reliability optimization design of other hydraulic pipeline systems.

  1. Space Shuttle Propulsion System Reliability

    NASA Technical Reports Server (NTRS)

    Welzyn, Ken; VanHooser, Katherine; Moore, Dennis; Wood, David

    2011-01-01

    This session includes the following presentations: (1) External Tank (ET) System Reliability and Lessons, (2) Space Shuttle Main Engine (SSME) Reliability Validated by a Million Seconds of Testing, (3) Reusable Solid Rocket Motor (RSRM) Reliability via Process Control, and (4) Solid Rocket Booster (SRB) Reliability via Acceptance and Testing.

  2. Reliable broadcast protocols

    NASA Technical Reports Server (NTRS)

    Joseph, T. A.; Birman, Kenneth P.

    1989-01-01

    A number of broadcast protocols that are reliable subject to a variety of ordering and delivery guarantees are considered. Developing applications that are distributed over a number of sites and/or must tolerate the failures of some of them becomes a considerably simpler task when such protocols are available for communication. Without such protocols the kinds of distributed applications that can reasonably be built will have a very limited scope. As the trend towards distribution and decentralization continues, it will not be surprising if reliable broadcast protocols have the same role in distributed operating systems of the future that message passing mechanisms have in the operating systems of today. On the other hand, the problems of engineering such a system remain large. For example, deciding which protocol is the most appropriate to use in a certain situation or how to balance the latency-communication-storage costs is not an easy question.
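
    As a drastically simplified illustration of one ordering guarantee discussed above (total order via a fixed sequencer, one of several designs such protocols use), the sketch below stamps every message at a single process and has receivers buffer out-of-order arrivals:

```python
class Sequencer:
    """Fixed-sequencer total-order broadcast, reduced to its essence:
    one process assigns a global sequence number to every message."""
    def __init__(self):
        self.next_seq = 0

    def stamp(self, msg):
        stamped = (self.next_seq, msg)
        self.next_seq += 1
        return stamped

class Receiver:
    """Delivers messages strictly in stamp order, buffering any gaps,
    so every receiver sees the identical delivery sequence."""
    def __init__(self):
        self.expected = 0
        self.buffer = {}
        self.delivered = []

    def receive(self, stamped):
        seq, msg = stamped
        self.buffer[seq] = msg
        while self.expected in self.buffer:
            self.delivered.append(self.buffer.pop(self.expected))
            self.expected += 1
```

    A real protocol must also handle sequencer failure, retransmission, and flow control; those are exactly the engineering costs the abstract alludes to.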

  3. Human Reliability Program Workshop

    SciTech Connect

    Landers, John; Rogers, Erin; Gerke, Gretchen

    2014-05-18

    A Human Reliability Program (HRP) is designed to protect national security as well as worker and public safety by continuously evaluating the reliability of those who have access to sensitive materials, facilities, and programs. Some elements of a site HRP include systematic (1) supervisory reviews, (2) medical and psychological assessments, (3) management evaluations, (4) personnel security reviews, and (5) training of HRP staff and critical positions. Over the years of implementing an HRP, the Department of Energy (DOE) has faced various challenges and overcome obstacles. During this 4-day activity, participants will examine programs that mitigate threats to nuclear security and the insider threat to include HRP, Nuclear Security Culture (NSC) Enhancement, and Employee Assistance Programs. The focus will be to develop an understanding of the need for a systematic HRP and to discuss challenges and best practices associated with mitigating the insider threat.

  4. Reliability of photovoltaic modules

    NASA Technical Reports Server (NTRS)

    Ross, R. G., Jr.

    1986-01-01

    In order to assess the reliability of photovoltaic modules, four categories of known array failure and degradation mechanisms are discussed, and target reliability allocations have been developed within each category based on the available technology and the life-cycle-cost requirements of future large-scale terrestrial applications. Cell-level failure mechanisms associated with open-circuiting or short-circuiting of individual solar cells generally arise from cell cracking or the fatigue of cell-to-cell interconnects. Power degradation mechanisms considered include gradual power loss in cells, light-induced effects, and module optical degradation. Module-level failure mechanisms and life-limiting wear-out mechanisms are also explored.

  5. Compact, Reliable EEPROM Controller

    NASA Technical Reports Server (NTRS)

    Katz, Richard; Kleyner, Igor

    2010-01-01

    A compact, reliable controller for an electrically erasable, programmable read-only memory (EEPROM) has been developed specifically for a space-flight application. The design may be adaptable to other applications in which there are requirements for reliability in general and, in particular, for prevention of inadvertent writing of data in EEPROM cells. Inadvertent writes pose risks of loss of reliability in the original space-flight application and could pose such risks in other applications. Prior EEPROM controllers are large and complex and do not provide all reasonable protections (in many cases, few or no protections) against inadvertent writes. In contrast, the present controller provides several layers of protection against inadvertent writes. The controller also incorporates a write-time monitor, enabling determination of trends in the performance of an EEPROM through all phases of testing. The controller has been designed as an integral subsystem of a system that includes not only the controller and the controlled EEPROM aboard a spacecraft but also computers in a ground control station, relatively simple onboard support circuitry, and an onboard communication subsystem that utilizes the MIL-STD-1553B protocol. (MIL-STD-1553B is a military standard that encompasses a method of communication and electrical-interface requirements for digital electronic subsystems connected to a data bus. MIL-STD-1553B is commonly used in defense and space applications.) The intent was to maximize reliability while minimizing the size and complexity of onboard circuitry. In operation, control of the EEPROM is effected via the ground computers, the MIL-STD-1553B communication subsystem, and the onboard support circuitry, all of which, in combination, provide the multiple layers of protection against inadvertent writes. There is no controller software, unlike in many prior EEPROM controllers; software can be a major contributor to unreliability, particularly in fault

  6. Spacecraft transmitter reliability

    NASA Technical Reports Server (NTRS)

    1980-01-01

    A workshop on spacecraft transmitter reliability was held at the NASA Lewis Research Center on September 25 and 26, 1979, to discuss present knowledge and to plan future research areas. Since formal papers were not submitted, this synopsis was derived from audio tapes of the workshop. The following subjects were covered: users' experience with space transmitters; cathodes; power supplies and interfaces; and specifications and quality assurance. A panel discussion ended the workshop.

  7. Reliability and testing

    NASA Technical Reports Server (NTRS)

    Auer, Werner

    1996-01-01

    Reliability and its interdependence with testing are important topics for the development and manufacturing of successful products. This generally accepted fact is not only a technical statement, but must also be seen in the light of 'Human Factors.' While the background for this paper is the experience gained with electromechanical/electronic space products, including control and system considerations, it is believed that the content could also be of interest for other fields.

  8. Software reliability studies

    NASA Technical Reports Server (NTRS)

    Hoppa, Mary Ann; Wilson, Larry W.

    1994-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Our research has shown that by improving the quality of the data one can greatly improve the predictions. We are working on methodologies which control some of the randomness inherent in the standard data generation processes in order to improve the accuracy of predictions. Our contribution is twofold in that we describe an experimental methodology using a data structure called the debugging graph and apply this methodology to assess the robustness of existing models. The debugging graph is used to analyze the effects of various fault recovery orders on the predictive accuracy of several well-known software reliability algorithms. We found that, along a particular debugging path in the graph, the predictive performance of different models can vary greatly. Similarly, just because a model 'fits' a given path's data well does not guarantee that the model would perform well on a different path. Further we observed bug interactions and noted their potential effects on the predictive process. We saw that not only do different faults fail at different rates, but that those rates can be affected by the particular debugging stage at which the rates are evaluated. Based on our experiment, we conjecture that the accuracy of a reliability prediction is affected by the fault recovery order as well as by fault interaction.

  9. Ionization Energies and Aqueous Redox Potentials of Organic Molecules: Comparison of DFT, Correlated ab Initio Theory and Pair Natural Orbital Approaches.

    PubMed

    Isegawa, Miho; Neese, Frank; Pantazis, Dimitrios A

    2016-05-10

    The calculation of redox potentials involves large energetic terms arising from gas phase ionization energies, thermodynamic contributions, and solvation energies of the reduced and oxidized species. In this work we study the performance of a wide range of wave function and density functional theory methods for the prediction of ionization energies and aqueous one-electron oxidation potentials of a set of 19 organic molecules. Emphasis is placed on evaluating methods that employ the computationally efficient local pair natural orbital (LPNO) approach, as well as several implementations of coupled cluster theory and explicitly correlated F12 methods. The electronic energies are combined with implicit solvation models for the solvation energies. With the exception of MP2 and its variants, which suffer from enormous errors arising at least partially from the poor Hartree-Fock reference, ionization energies can be systematically predicted with average errors below 0.1 eV for most of the correlated wave function based methods studied here, provided basis set extrapolation is performed. LPNO methods are the most efficient way to achieve this type of accuracy. DFT methods show in general larger errors and suffer from inconsistent behavior. The only exception is the M06-2X functional which is found to be competitive with the best LPNO-based approaches for ionization energies. Importantly, the limiting factor for the calculation of accurate redox potentials is the solvation energy. The errors in the predicted solvation energies by all continuum solvation models tested in this work dominate the final computed reduction potential, resulting in average errors typically in excess of 0.3 V and hence obscuring the gains that arise from choosing a more accurate electronic structure method. PMID:27065224
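
    The thermodynamic cycle behind such predictions is compact: a gas-phase ionization energy plus the difference in solvation free energies, referenced to an absolute potential for the standard hydrogen electrode. In the sketch below (my illustration, not the paper's protocol) all inputs in the usage note are invented, and the absolute SHE value (4.44 V here) is convention-dependent; values near 4.28 V are also in use:

```python
def oxidation_potential(ie_gas, dg_solv_neutral, dg_solv_cation, she_abs=4.44):
    """One-electron aqueous oxidation potential (V vs. SHE) from a simple
    Born-Haber cycle, all energies in eV:
        E = IE(gas) + dGsolv(cation) - dGsolv(neutral) - E_abs(SHE).
    Thermal and entropic corrections are neglected in this sketch."""
    return ie_gas + dg_solv_cation - dg_solv_neutral - she_abs
```

    With invented inputs IE = 7.5 eV, dGsolv(neutral) = -0.2 eV, and dGsolv(cation) = -2.3 eV, the cycle gives about +0.96 V; note how a 0.3 eV solvation error shifts the potential by the full 0.3 V the abstract warns about.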

  10. General Aviation Aircraft Reliability Study

    NASA Technical Reports Server (NTRS)

    Pettit, Duane; Turnbull, Andrew; Roelant, Henk A. (Technical Monitor)

    2001-01-01

    This reliability study was performed in order to provide the aviation community with an estimate of Complex General Aviation (GA) Aircraft System reliability. To successfully improve the safety and reliability for the next generation of GA aircraft, a study of current GA aircraft attributes was prudent. This was accomplished by benchmarking the reliability of operational Complex GA Aircraft Systems. Specifically, Complex GA Aircraft System reliability was estimated using data obtained from the logbooks of a random sample of the Complex GA Aircraft population.

  11. Reliable predictions of waste performance in a geologic repository

    SciTech Connect

    Pigford, T.H.; Chambre, P.L.

    1985-08-01

    Establishing reliable estimates of long-term performance of a waste repository requires emphasis upon valid theories to predict performance. Predicting rates that radionuclides are released from waste packages cannot rest upon empirical extrapolations of laboratory leach data. Reliable predictions can be based on simple bounding theoretical models, such as solubility-limited bulk-flow, if the assumed parameters are reliably known or defensibly conservative. Wherever possible, performance analysis should proceed beyond simple bounding calculations to obtain more realistic - and usually more favorable - estimates of expected performance. Desire for greater realism must be balanced against increasing uncertainties in prediction and loss of reliability. Theoretical predictions of release rate based on mass-transfer analysis are bounding and the theory can be verified. Postulated repository analogues to simulate laboratory leach experiments introduce arbitrary and fictitious repository parameters and are shown not to agree with well-established theory. 34 refs., 3 figs., 2 tabs.

  12. Origins of life: a comparison of theories and application to Mars

    NASA Technical Reports Server (NTRS)

    Davis, W. L.; McKay, C. P.

    1996-01-01

    The field of study that deals with the origins of life does not have a consensus for a theory of life's origin. An analysis of the range of theories offered shows that they share some common features that may be reliable predictors when considering the possible origins of life on another planet. The fundamental datum dealing with the origins of life is that life appeared early in the history of the Earth, probably before 3.5 Ga and possibly before 3.8 Ga. What might be called the standard theory (the Oparin-Haldane theory) posits the production of organic molecules on the early Earth followed by chemical reactions that produced increased organic complexity leading eventually to organic life capable of reproduction, mutation, and selection using organic material as nutrients. A distinct class of other theories (panspermia theories) suggests that life was carried to Earth from elsewhere--these theories receive some support from recent work on planetary impact processes. Other alternatives to the standard model suggest that life arose as an inorganic (clay) form and/or that the initial energy source was not organic material but chemical energy or sunlight. We find that the entire range of current theories suggests that liquid water is the quintessential environmental criterion for both the origin and sustenance of life. It is therefore of interest that during the time that life appeared on Earth we have evidence for liquid water present on the surface of Mars.

  13. Origins of life: a comparison of theories and application to Mars.

    PubMed

    Davis, W L; McKay, C P

    1996-02-01

    The field of study that deals with the origins of life does not have a consensus for a theory of life's origin. An analysis of the range of theories offered shows that they share some common features that may be reliable predictors when considering the possible origins of life on another planet. The fundamental datum dealing with the origins of life is that life appeared early in the history of the Earth, probably before 3.5 Ga and possibly before 3.8 Ga. What might be called the standard theory (the Oparin-Haldane theory) posits the production of organic molecules on the early Earth followed by chemical reactions that produced increased organic complexity leading eventually to organic life capable of reproduction, mutation, and selection using organic material as nutrients. A distinct class of other theories (panspermia theories) suggests that life was carried to Earth from elsewhere--these theories receive some support from recent work on planetary impact processes. Other alternatives to the standard model suggest that life arose as an inorganic (clay) form and/or that the initial energy source was not organic material but chemical energy or sunlight. We find that the entire range of current theories suggests that liquid water is the quintessential environmental criterion for both the origin and sustenance of life. It is therefore of interest that during the time that life appeared on Earth we have evidence for liquid water present on the surface of Mars. PMID:8920171

  14. Influences of molecular packing on the charge mobility of organic semiconductors: from quantum charge transfer rate theory beyond the first-order perturbation.

    PubMed

    Nan, Guangjun; Shi, Qiang; Shuai, Zhigang; Li, Zesheng

    2011-05-28

    The electronic coupling between adjacent molecules is an important parameter for the charge transport properties of organic semiconductors. In a previous paper, a semiclassical generalized nonadiabatic transition state theory was used to investigate the nonperturbative effect of the electronic coupling on the charge transport properties, but it is not applicable at low temperatures due to the presence of high-frequency modes from the intramolecular conjugated carbon-carbon stretching vibrations [G. J. Nan et al., J. Chem. Phys., 2009, 130, 024704]. In the present paper, we apply a quantum charge transfer rate formula based on the imaginary-time flux-flux correlation function without the weak electronic coupling approximation. The imaginary-time flux-flux correlation function is then expressed in terms of the vibrational-mode path average and is evaluated by the path integral approach. All parameters are computed by quantum chemical approaches, and the mobility is obtained by kinetic Monte Carlo simulation. We evaluate the intra-layer mobility of sexithiophene crystal structures in high- and low-temperature phases for a wide range of temperatures. In the case of strong coupling, the quantum charge transfer rates were found to be significantly smaller than those calculated using the weak electronic coupling approximation, which leads to reduced mobility, especially at low temperatures. As a consequence, the mobility becomes less dependent on temperature when the molecular packing leads to strong electronic coupling in some charge transport directions. The temperature-independent charge mobility observed experimentally in organic thin-film transistors may be explained by the present model with grain boundaries considered. In addition, we point out that the widely used Marcus equation is invalid for calculating charge carrier transfer rates in sexithiophene crystals. PMID:21503350
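    The Marcus equation that the abstract argues is invalid here is the standard weak-coupling (first-order perturbative) charge-transfer rate. As background, a minimal sketch of that expression; the parameter values below are illustrative assumptions, not values from the paper:

```python
import math

HBAR = 6.582119569e-16  # reduced Planck constant, eV*s
KB = 8.617333262e-5     # Boltzmann constant, eV/K

def marcus_rate(coupling_ev, lambda_ev, dg_ev, temp_k):
    """Weak-coupling (non-adiabatic) Marcus charge-transfer rate in 1/s.

    k = (2*pi/hbar) * V^2 * (4*pi*lambda*kB*T)^(-1/2)
        * exp(-(dG + lambda)^2 / (4*lambda*kB*T))
    """
    kbt = KB * temp_k
    prefactor = (2.0 * math.pi / HBAR) * coupling_ev ** 2
    density = 1.0 / math.sqrt(4.0 * math.pi * lambda_ev * kbt)
    activation = math.exp(-(dg_ev + lambda_ev) ** 2 / (4.0 * lambda_ev * kbt))
    return prefactor * density * activation

# Illustrative self-exchange hop (dG = 0), hypothetical coupling/reorganization
k_300 = marcus_rate(coupling_ev=0.05, lambda_ev=0.2, dg_ev=0.0, temp_k=300.0)
k_100 = marcus_rate(coupling_ev=0.05, lambda_ev=0.2, dg_ev=0.0, temp_k=100.0)
print(k_300, k_100)
```

    Because the coupling V enters only as a squared prefactor, this rate grows without bound with V; the nonperturbative treatment in the paper is what corrects that behavior in the strong-coupling regime.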

  15. Development, reliability and factor analysis of a self-administered questionnaire which originates from the World Health Organization's Composite International Diagnostic Interview – Short Form (CIDI-SF) for assessing mental disorders

    PubMed Central

    2008-01-01

    Background The Composite International Diagnostic Interview – Short Form (CIDI-SF) consists of short-form scales for evaluating psychiatric disorders. Even for this version, training of the interviewer is required. Moreover, confidentiality may not be adequately protected. This study focuses on the preliminary validation of a brief self-completed questionnaire derived from the CIDI-SF. Sampling and Methods A preliminary version was assessed for content and face validity. An intermediate version was evaluated for test-retest reliability. The final version of the questionnaire was evaluated by exploratory factor analysis and for internal consistency. Results After modifications by the focus groups, the questionnaire included 29 initial probe questions and 56 secondary questions. The test-retest reliability weighted kappas were acceptable to excellent for the vast majority of questions. Factor analysis revealed six factors explaining 53.6% of total variance. Cronbach's alpha was 0.89 for the questionnaire and 0.89, 0.67, 0.71, 0.71, 0.49, and 0.67 for the six factors, respectively. Conclusion The questionnaire has satisfactory reliability and internal consistency and might be efficient for use in community research and clinical practice. In the future, the questionnaire could be further validated (i.e., concurrent validity, discriminant validity). PMID:18402667
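    Cronbach's alpha, the internal-consistency statistic reported above, can be sketched directly from item-level scores. The data below are toy values, not the study's:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns.

    alpha = (k / (k - 1)) * (1 - sum(item variances) / variance of totals)
    Uses population variance; `items` is a list of equal-length score lists,
    one list per questionnaire item.
    """
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[i] for item in items) for i in range(n)]
    item_var_sum = sum(var(item) for item in items)
    return (k / (k - 1)) * (1.0 - item_var_sum / var(totals))

# Toy example: three items answered by five respondents (hypothetical data)
items = [
    [3, 4, 2, 5, 4],
    [3, 5, 2, 4, 4],
    [2, 4, 3, 5, 5],
]
alpha = cronbach_alpha(items)
print(round(alpha, 3))
```

    High alpha reflects items that co-vary strongly relative to their individual variances, which is why the full 29-question instrument scores higher than some of its six-item subscales.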

  16. 77 FR 64920 - Revisions to Reliability Standard for Transmission Vegetation Management

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-24

    ...Pursuant to section 215 of the Federal Power Act, the Commission proposes to approve Reliability Standard FAC-003-2 (Transmission Vegetation Management), submitted by the North American Electric Reliability Corporation (NERC), the Commission-certified Electric Reliability Organization. The proposed Reliability Standard would expand the applicability of the standard to include overhead......

  17. 18 CFR 39.6 - Conflict of a Reliability Standard with a Commission Order.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Reliability Standard with a Commission Order. 39.6 Section 39.6 Conservation of Power and Water Resources... CONCERNING CERTIFICATION OF THE ELECTRIC RELIABILITY ORGANIZATION; AND PROCEDURES FOR THE ESTABLISHMENT, APPROVAL, AND ENFORCEMENT OF ELECTRIC RELIABILITY STANDARDS § 39.6 Conflict of a Reliability Standard...

  18. Ultimately Reliable Pyrotechnic Systems

    NASA Technical Reports Server (NTRS)

    Scott, John H.; Hinkel, Todd

    2015-01-01

    This paper presents the methods by which NASA has designed, built, tested, and certified pyrotechnic devices for high reliability operation in extreme environments and illustrates the potential applications in the oil and gas industry. NASA's extremely successful application of pyrotechnics is built upon documented procedures and test methods that have been maintained and developed since the Apollo Program. Standards are managed and rigorously enforced for performance margins, redundancy, lot sampling, and personnel safety. The pyrotechnics utilized in spacecraft include such devices as small initiators and detonators with the power of a shotgun shell, detonating cord systems for explosive energy transfer across many feet, precision linear shaped charges for breaking structural membranes, and booster charges to actuate valves and pistons. NASA's pyrotechnics program is one of the more successful in the history of Human Spaceflight. No pyrotechnic device developed in accordance with NASA's Human Spaceflight standards has ever failed in flight use. NASA's pyrotechnic initiators work reliably in temperatures as low as -420 F. Each of the 135 Space Shuttle flights fired 102 of these initiators, some setting off multiple pyrotechnic devices, with never a failure. The recent landing on Mars of the Opportunity rover fired 174 of NASA's pyrotechnic initiators to complete the famous '7 minutes of terror.' Even after traveling through extreme radiation and thermal environments on the way to Mars, every one of them worked. These initiators have fired on the surface of Titan. NASA's design controls, procedures, and processes produce the most reliable pyrotechnics in the world. Application of pyrotechnics designed and procured in this manner could enable the energy industry's emergency equipment, such as shutoff valves and deep-sea blowout preventers, to be left in place for years in extreme environments and still be relied upon to function when needed, thus greatly enhancing

  19. CR reliability testing

    NASA Astrophysics Data System (ADS)

    Honeyman-Buck, Janice C.; Rill, Lynn; Frost, Meryll M.; Staab, Edward V.

    1998-07-01

    The purpose of this work was to develop a method for systematically testing the reliability of a CR system under realistic daily loads in a non-clinical environment prior to its clinical adoption. Once digital imaging replaces film, it will be very difficult to revert should the digital system become unreliable. Prior to the beginning of the test, a formal evaluation was performed to set the benchmarks for performance and functionality. A formal protocol was established that included all 62 imaging plates in the inventory for each 24-hour period of the study. Imaging plates were exposed using different combinations of collimation, orientation, and SID. Anthropomorphic phantoms were used to acquire images of different sizes. Each combination was chosen randomly to simulate the differences that could occur in clinical practice. The tests were performed over a wide range of times, with batches of plates processed to simulate the temporal constraints imposed by portable radiographs taken in the Intensive Care Unit (ICU). Current patient demographics were used for the test studies so automatic routing algorithms could be tested. During the test, only three minor reliability problems occurred, two of which were not directly related to the CR unit. One plate was discovered to cause a segmentation error that essentially reduced the image to only black and white with no gray levels. This plate was removed from the inventory to be replaced. Another problem was a PACS routing problem that occurred when the DICOM server with which the CR was communicating had a problem with disk space. The final problem was a network printing failure to the laser cameras. Although the units passed the reliability test, problems with interfacing to workstations were discovered. The two issues that were identified were the interpretation of what constitutes a study for CR and the construction of the look-up table for proper gray-scale display.
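    The randomized exposure protocol described above can be sketched as a small plan generator. All category values here are illustrative placeholders, not the study's actual settings:

```python
import random

def daily_test_plan(n_plates=62, seed=0):
    """One randomized (collimation, orientation, SID) combination per
    imaging plate, mimicking the variability of clinical exposures.
    Category values are hypothetical, not from the study."""
    collimations = ["none", "partial", "tight"]
    orientations = ["portrait", "landscape"]
    sids = ["100cm", "180cm"]
    rng = random.Random(seed)  # seeded so a day's plan is reproducible
    return [
        (plate,
         rng.choice(collimations),
         rng.choice(orientations),
         rng.choice(sids))
        for plate in range(1, n_plates + 1)
    ]

plan = daily_test_plan()
print(len(plan), plan[0])
```

    Seeding the generator lets a failed run be reproduced exactly while still covering the combination space across days.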

  20. Reliable VLSI sequential controllers

    NASA Technical Reports Server (NTRS)

    Whitaker, S.; Maki, G.; Shamanna, M.

    1990-01-01

    A VLSI architecture for synchronous sequential controllers is presented that has attractive qualities for producing reliable circuits. In these circuits, one hardware implementation can realize any flow table with a maximum of 2^n internal states and m inputs, and all design equations are identical. A real-time fault detection scheme is presented, along with a strategy for verifying the correctness of the checking hardware. This self-check feature can be employed with no increase in hardware. The architecture can be modified to achieve fail-safe designs. With no increase in hardware, an adaptable circuit can be realized that allows replacement of faulty transitions with fault-free transitions.

  1. Ferrite logic reliability study

    NASA Technical Reports Server (NTRS)

    Baer, J. A.; Clark, C. B.

    1973-01-01

    Development and use of digital circuits called all-magnetic logic are reported. In these circuits the magnetic elements and their windings comprise the active circuit devices in the logic portion of a system. The ferrite logic device belongs to the all-magnetic class of logic circuits. The FLO device is novel in that it makes use of a dual or bimaterial ferrite composition in one physical ceramic body. This bimaterial feature, coupled with its potential for relatively high speed operation, makes it attractive for high reliability applications. (Maximum speed of operation approximately 50 kHz.)

  2. Benchmarking density functional theory predictions of framework structures and properties in a chemically diverse test set of metal-organic frameworks

    SciTech Connect

    Nazarian, Dalar; Ganesh, P.; Sholl, David S.

    2015-09-30

    We compiled a test set of chemically and topologically diverse Metal–Organic Frameworks (MOFs) with high-accuracy, experimentally derived crystallographic structure data. The test set was used to benchmark the performance of Density Functional Theory (DFT) functionals (M06L, PBE, PW91, PBE-D2, PBE-D3, and vdW-DF2) for predicting lattice parameters, unit cell volume, bonded parameters, and pore descriptors. On average PBE-D2, PBE-D3, and vdW-DF2 predict more accurate structures, but all functionals predicted pore diameters within 0.5 Å of the experimental diameter for every MOF in the test set. The test set was also used to assess the variance in performance of DFT functionals for elastic properties and atomic partial charges. The DFT-predicted elastic properties such as minimum shear modulus and Young's modulus can differ by an average of 3 and 9 GPa, respectively, for rigid MOFs such as those in the test set. Moreover, we find that the partial charges calculated by vdW-DF2 deviate the most from the other functionals, while there is no significant difference between the partial charges calculated by M06L, PBE, PW91, PBE-D2, and PBE-D3 for the MOFs in the test set. We find that while there are differences in the magnitude of the properties predicted by the various functionals, these discrepancies are small compared to the accuracy necessary for most practical applications.

  3. Inverse modelling of Köhler theory - Part 1: A response surface analysis of CCN spectra with respect to surface-active organic species

    NASA Astrophysics Data System (ADS)

    Lowe, Samuel; Partridge, Daniel; Topping, David; Stier, Philip

    2016-04-01

    partitioning process. The response surface sensitivity analysis identifies the accumulation mode concentration and surface tension as the most sensitive parameters. The organic:inorganic mass ratio, insoluble fraction, solution ideality, and the mean diameter and geometric standard deviation of the accumulation mode showed significant sensitivity, while chemical properties of the organic exhibited little sensitivity within parametric uncertainties. Parameters such as surface tension and solution ideality can introduce considerable parametric uncertainty to models and are therefore particularly good candidates for further parameter calibration studies. A complete treatment of bulk-surface partitioning is found to model CCN spectra similar to those calculated using classical Köhler theory with the surface tension of a pure water drop, as found in traditional sensitivity analysis studies. In addition, the sensitivity of CCN spectra to perturbations in the partitioning parameters K and Γ was found to be negligible. As a result, this study supports previously held recommendations that complex surfactant effects might be neglected, and continued use of classical Köhler theory in GCMs is recommended to avoid additional computational burden.
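    The strong sensitivity to surface tension can be seen directly in classical Köhler theory, where the critical supersaturation scales with surface tension to the 3/2 power. The sketch below uses the kappa-Köhler form for the solute term, which is an assumption here (the study may use a different parameterization), with illustrative parameter values:

```python
import math

def critical_supersaturation(sigma, dry_diam_m, kappa=0.6, temp_k=293.15):
    """Critical supersaturation (as a fraction) from classical Koehler theory.

    S(r) ~ A/r - B/r^3 with A = 2*sigma*Mw/(R*T*rho_w) (Kelvin term) and
    B = kappa * r_dry^3 (kappa-Koehler solute term, assumed form);
    the maximum of the Koehler curve gives S_crit = sqrt(4*A^3 / (27*B)).
    """
    MW_WATER = 0.018015   # kg/mol
    RHO_WATER = 997.0     # kg/m^3
    R_GAS = 8.314         # J/(mol K)
    a = 2.0 * sigma * MW_WATER / (R_GAS * temp_k * RHO_WATER)
    b = kappa * (dry_diam_m / 2.0) ** 3
    return math.sqrt(4.0 * a ** 3 / (27.0 * b))

# Pure-water surface tension vs a surfactant-suppressed value (illustrative)
s_water = critical_supersaturation(sigma=0.072, dry_diam_m=100e-9)
s_low = critical_supersaturation(sigma=0.050, dry_diam_m=100e-9)
print(s_water, s_low)  # lower surface tension -> lower critical supersaturation
```

    Since S_crit is proportional to sigma^(3/2), even modest surfactant-driven depression of surface tension shifts the CCN activation spectrum noticeably, which is why surface tension dominates the response surface.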

  4. A Study of Birnbaum's Theory of the Relationship between the Constructs of Leadership and Organization as Depicted in His Higher Education Models of Organizational Functioning: A Contextual Leadership Paradigm for Higher Education

    ERIC Educational Resources Information Center

    Douglas, Pamela A.

    2013-01-01

    This quantitative, nonexperimental study used survey research design and nonparametric statistics to investigate Birnbaum's (1988) theory that there is a relationship between the constructs of leadership and organization, as depicted in his five higher education models of organizational functioning: bureaucratic, collegial, political,…

  5. Fault Tree Reliability Analysis and Design-for-reliability

    Energy Science and Technology Software Center (ESTSC)

    1998-05-05

    WinR provides a fault tree analysis capability for performing systems reliability and design-for-reliability analyses. The package includes capabilities for sensitivity and uncertainty analysis, field failure data analysis, and optimization.
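    The core of any fault tree analysis capability is evaluating the top-event probability from basic-event probabilities. A minimal sketch for independent basic events (not WinR's actual algorithm, which also handles sensitivities and uncertainties):

```python
def fault_tree_prob(node):
    """Top-event probability of a fault tree with independent basic events.

    A node is either a float (a basic-event failure probability) or a
    tuple ("AND" | "OR", [children]).  An AND gate multiplies child
    probabilities; an OR gate combines via 1 - prod(1 - p).
    """
    if isinstance(node, float):
        return node
    gate, children = node
    probs = [fault_tree_prob(child) for child in children]
    if gate == "AND":
        out = 1.0
        for p in probs:
            out *= p
        return out
    if gate == "OR":
        out = 1.0
        for p in probs:
            out *= (1.0 - p)
        return 1.0 - out
    raise ValueError("unknown gate: %r" % gate)

# Hypothetical tree: system fails if the pump fails OR both valves fail
tree = ("OR", [0.01, ("AND", [0.05, 0.05])])
print(fault_tree_prob(tree))  # 1 - (1 - 0.01)*(1 - 0.0025) = 0.012475
```

    Real packages additionally handle shared (repeated) basic events via minimal cut sets, since the independence assumption above breaks when the same event feeds multiple gates.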

  6. Chemical Applications of Graph Theory: Part II. Isomer Enumeration.

    ERIC Educational Resources Information Center

    Hansen, Peter J.; Jurs, Peter C.

    1988-01-01

    Discusses the use of graph theory to aid in the depiction of organic molecular structures. Gives a historical perspective of graph theory and explains graph theory terminology with organic examples. Lists applications of graph theory to current research projects. (ML)

  7. On Component Reliability and System Reliability for Space Missions

    NASA Technical Reports Server (NTRS)

    Chen, Yuan; Gillespie, Amanda M.; Monaghan, Mark W.; Sampson, Michael J.; Hodson, Robert F.

    2012-01-01

    This paper is to address the basics, the limitations and the relationship between component reliability and system reliability through a study of flight computing architectures and related avionics components for NASA future missions. Component reliability analysis and system reliability analysis need to be evaluated at the same time, and the limitations of each analysis and the relationship between the two analyses need to be understood.

  8. Integrated circuit reliability testing

    NASA Technical Reports Server (NTRS)

    Buehler, Martin G. (Inventor); Sayah, Hoshyar R. (Inventor)

    1990-01-01

    A technique is described for use in determining the reliability of microscopic conductors deposited on an uneven surface of an integrated circuit device. A wafer containing integrated circuit chips is formed with a test area having regions of different heights. At the time the conductors are formed on the chip areas of the wafer, an elongated serpentine assay conductor is deposited on the test area so the assay conductor extends over multiple steps between regions of different heights. Also, a first test conductor is deposited in the test area upon a uniform region of first height, and a second test conductor is deposited in the test area upon a uniform region of second height. The occurrence of high resistances at the steps between regions of different height is indicated by deriving the measured length of the serpentine conductor using the resistance measured between the ends of the serpentine conductor, and comparing that to the design length of the serpentine conductor. The percentage by which the measured length exceeds the design length, at which the integrated circuit will be discarded, depends on the required reliability of the integrated circuit.
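    The accept/reject rule described — derive a measured length from the serpentine conductor's resistance and compare it to the design length — can be sketched as follows. The numbers are hypothetical; the patent does not specify values:

```python
def serpentine_check(r_measured_ohm, r_per_unit_ohm_per_mm,
                     design_length_mm, max_excess_pct):
    """Accept/reject a wafer based on the serpentine assay conductor.

    Resistance per unit length is calibrated from the flat test conductors;
    high-resistance step-coverage faults show up as apparent extra length.
    Returns (percent excess length, accept?).
    """
    measured_length = r_measured_ohm / r_per_unit_ohm_per_mm
    excess_pct = 100.0 * (measured_length - design_length_mm) / design_length_mm
    return excess_pct, excess_pct <= max_excess_pct

# Hypothetical numbers: 10 mm design length, 2 ohm/mm calibrated baseline
excess, ok = serpentine_check(r_measured_ohm=21.0, r_per_unit_ohm_per_mm=2.0,
                              design_length_mm=10.0, max_excess_pct=3.0)
print(excess, ok)  # 5.0% apparent excess length -> reject
```

    The reject threshold (`max_excess_pct`) is the tunable knob the abstract mentions: it is set according to the reliability required of the finished integrated circuit.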

  9. Integrated circuit reliability testing

    NASA Technical Reports Server (NTRS)

    Buehler, Martin G. (Inventor); Sayah, Hoshyar R. (Inventor)

    1988-01-01

    A technique is described for use in determining the reliability of microscopic conductors deposited on an uneven surface of an integrated circuit device. A wafer containing integrated circuit chips is formed with a test area having regions of different heights. At the time the conductors are formed on the chip areas of the wafer, an elongated serpentine assay conductor is deposited on the test area so the assay conductor extends over multiple steps between regions of different heights. Also, a first test conductor is deposited in the test area upon a uniform region of first height, and a second test conductor is deposited in the test area upon a uniform region of second height. The occurrence of high resistances at the steps between regions of different height is indicated by deriving the measured length of the serpentine conductor using the resistance measured between the ends of the serpentine conductor, and comparing that to the design length of the serpentine conductor. The percentage by which the measured length exceeds the design length, at which the integrated circuit will be discarded, depends on the required reliability of the integrated circuit.

  10. Load Control System Reliability

    SciTech Connect

    Trudnowski, Daniel

    2015-04-03

    This report summarizes the results of the Load Control System Reliability project (DOE Award DE-FC26-06NT42750). The original grant was awarded to Montana Tech April 2006. Follow-on DOE awards and expansions to the project scope occurred August 2007, January 2009, April 2011, and April 2013. In addition to the DOE monies, the project also consisted of matching funds from the states of Montana and Wyoming. Project participants included Montana Tech; the University of Wyoming; Montana State University; NorthWestern Energy, Inc., and MSE. Research focused on two areas: real-time power-system load control methodologies; and power-system measurement-based stability-assessment operation and control tools. The majority of effort was focused on area 2. Results from the research include: development of fundamental power-system dynamic concepts, control schemes, and signal-processing algorithms; many papers (including two prize papers) in leading journals and conferences and leadership of IEEE activities; one patent; participation in major actual-system testing in the western North American power system; prototype power-system operation and control software installed and tested at three major North American control centers; and the incubation of a new commercial-grade operation and control software tool. Work under this grant certainly supported the DOE-OE goals in the area of “Real Time Grid Reliability Management.”

  11. Understanding the Elements of Operational Reliability: A Key for Achieving High Reliability

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.

    2010-01-01

    This viewgraph presentation reviews operational reliability and its role in achieving high reliability through design and process reliability. The topics include: 1) Reliability Engineering Major Areas and interfaces; 2) Design Reliability; 3) Process Reliability; and 4) Reliability Applications.

  12. 75 FR 35689 - System Personnel Training Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-23

    ... Reliability Standards for the Bulk-Power System, Order No. 693, Federal Register 72 FR 16,416 (Apr. 4, 2007... Corporation, the Electric Reliability Organization (ERO) certified by the Commission. In addition, pursuant to... develop rules for operating staff to follow.\\8\\ In addition, the Task Force urged NERC to...

  13. Formal methods and software reliability

    NASA Technical Reports Server (NTRS)

    Holzmann, Gerard J.

    2004-01-01

    In this position statement I briefly describe how the software reliability problem has changed over the years, and the primary reasons for the recent creation of the Laboratory for Reliable Software at JPL.

  14. Creep-rupture reliability analysis

    NASA Technical Reports Server (NTRS)

    Peralta-Duran, A.; Wirsching, P. H.

    1984-01-01

    A probabilistic approach to the correlation and extrapolation of creep-rupture data is presented. Time temperature parameters (TTP) are used to correlate the data, and an analytical expression for the master curve is developed. The expression provides a simple model for the statistical distribution of strength and fits neatly into a probabilistic design format. The analysis focuses on the Larson-Miller and on the Manson-Haferd parameters, but it can be applied to any of the TTP's. A method is developed for evaluating material dependent constants for TTP's. It is shown that optimized constants can provide a significant improvement in the correlation of the data, thereby reducing modelling error. Attempts were made to quantify the performance of the proposed method in predicting long term behavior. Uncertainty in predicting long term behavior from short term tests was derived for several sets of data. Examples are presented which illustrate the theory and demonstrate the application of state of the art reliability methods to the design of components under creep.
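    The Larson-Miller time-temperature parameter on which the analysis focuses correlates rupture time and temperature as LMP = T(C + log10 t). A minimal sketch of correlation and extrapolation; the temperatures, times, and the constant C = 20 are illustrative assumptions, not the paper's optimized values:

```python
import math

def larson_miller(temp_k, time_h, c=20.0):
    """Larson-Miller parameter: LMP = T * (C + log10(t_r)).

    temp_k is absolute temperature, time_h the rupture time in hours,
    and c the material constant (c = 20 is a common default assumption;
    the paper shows optimizing such constants improves the correlation).
    """
    return temp_k * (c + math.log10(time_h))

def rupture_time(temp_k, lmp, c=20.0):
    """Invert the parameter: predicted rupture time at another temperature."""
    return 10.0 ** (lmp / temp_k - c)

# A short, high-temperature test point (hypothetical data)
lmp = larson_miller(temp_k=900.0, time_h=100.0)
# The same parameter value then extrapolates to a long service life
# at a lower operating temperature.
t_service = rupture_time(temp_k=800.0, lmp=lmp)
print(lmp, t_service)
```

    This is exactly the extrapolation step whose modelling error the probabilistic approach quantifies: the deterministic master curve is replaced by a statistical distribution of strength around it.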

  15. Creep-rupture reliability analysis

    NASA Technical Reports Server (NTRS)

    Peralta-Duran, A.; Wirsching, P. H.

    1985-01-01

    A probabilistic approach to the correlation and extrapolation of creep-rupture data is presented. Time temperature parameters (TTP) are used to correlate the data, and an analytical expression for the master curve is developed. The expression provides a simple model for the statistical distribution of strength and fits neatly into a probabilistic design format. The analysis focuses on the Larson-Miller and on the Manson-Haferd parameters, but it can be applied to any of the TTP's. A method is developed for evaluating material dependent constants for TTP's. It is shown that optimized constants can provide a significant improvement in the correlation of the data, thereby reducing modelling error. Attempts were made to quantify the performance of the proposed method in predicting long term behavior. Uncertainty in predicting long term behavior from short term tests was derived for several sets of data. Examples are presented which illustrate the theory and demonstrate the application of state of the art reliability methods to the design of components under creep.

  16. Testing for PV Reliability (Presentation)

    SciTech Connect

    Kurtz, S.; Bansal, S.

    2014-09-01

    The DOE SUNSHOT workshop is seeking input from the community about PV reliability and how the DOE might address gaps in understanding. This presentation describes the types of testing that are needed for PV reliability and introduces a discussion to identify gaps in our understanding of PV reliability testing.

  17. Benchmarking density functional theory predictions of framework structures and properties in a chemically diverse test set of metal-organic frameworks

    DOE PAGESBeta

    Nazarian, Dalar; Ganesh, P.; Sholl, David S.

    2015-09-30

    We compiled a test set of chemically and topologically diverse Metal–Organic Frameworks (MOFs) with high-accuracy, experimentally derived crystallographic structure data. The test set was used to benchmark the performance of Density Functional Theory (DFT) functionals (M06L, PBE, PW91, PBE-D2, PBE-D3, and vdW-DF2) for predicting lattice parameters, unit cell volume, bonded parameters, and pore descriptors. On average PBE-D2, PBE-D3, and vdW-DF2 predict more accurate structures, but all functionals predicted pore diameters within 0.5 Å of the experimental diameter for every MOF in the test set. The test set was also used to assess the variance in performance of DFT functionals for elastic properties and atomic partial charges. The DFT-predicted elastic properties such as minimum shear modulus and Young's modulus can differ by an average of 3 and 9 GPa, respectively, for rigid MOFs such as those in the test set. Moreover, we find that the partial charges calculated by vdW-DF2 deviate the most from the other functionals, while there is no significant difference between the partial charges calculated by M06L, PBE, PW91, PBE-D2, and PBE-D3 for the MOFs in the test set. We find that while there are differences in the magnitude of the properties predicted by the various functionals, these discrepancies are small compared to the accuracy necessary for most practical applications.

  18. Integrating theory, synthesis, spectroscopy and device efficiency to design and characterize donor materials for organic photovoltaics: a case study including 12 donors

    SciTech Connect

    Oosterhout, S. D.; Kopidakis, N.; Owczarczyk, Z. R.; Braunecker, W. A.; Larsen, R. E.; Ratcliff, E. L.; Olson, D. C.

    2015-04-07

    Remarkable improvements in the power conversion efficiency of solution-processable Organic Photovoltaics (OPV) have largely been driven by the development of novel narrow-bandgap copolymer donors comprising an electron-donating (D) and an electron-withdrawing (A) group within the repeat unit. The large pool of potential D and A units and the laborious processes of chemical synthesis and device optimization have made progress on new high-efficiency materials slow, with a few new efficient copolymers reported every year despite the large number of groups pursuing these materials. In our paper we present an integrated approach toward new narrow-bandgap copolymers that uses theory to guide the selection of materials to be synthesized based on their predicted energy levels, and time-resolved microwave conductivity (TRMC) to select the best-performing copolymer–fullerene bulk heterojunction to be incorporated into complete OPV devices. We validate our methodology by using a diverse group of 12 copolymers, including new and literature materials, to demonstrate good correlation between (a) theoretically determined energy levels of polymers and experimentally determined ionization energies and electron affinities and (b) photoconductance, measured by TRMC, and OPV device performance. The materials used here also allow us to explore whether further copolymer design rules need to be incorporated into our methodology for materials selection. For example, we explore the effect of the enthalpy change (ΔH) during exciton dissociation on the efficiency of free charge carrier generation and device efficiency and find that a ΔH of -0.4 eV is sufficient for efficient charge generation.

  19. Integrating theory, synthesis, spectroscopy and device efficiency to design and characterize donor materials for organic photovoltaics: a case study including 12 donors

    DOE PAGESBeta

    Oosterhout, S. D.; Kopidakis, N.; Owczarczyk, Z. R.; Braunecker, W. A.; Larsen, R. E.; Ratcliff, E. L.; Olson, D. C.

    2015-04-07

    Remarkable improvements in the power conversion efficiency of solution-processable Organic Photovoltaics (OPV) have largely been driven by the development of novel narrow-bandgap copolymer donors comprising an electron-donating (D) and an electron-withdrawing (A) group within the repeat unit. The large pool of potential D and A units and the laborious processes of chemical synthesis and device optimization have made progress on new high-efficiency materials slow, with a few new efficient copolymers reported every year despite the large number of groups pursuing these materials. In our paper we present an integrated approach toward new narrow-bandgap copolymers that uses theory to guide the selection of materials to be synthesized based on their predicted energy levels, and time-resolved microwave conductivity (TRMC) to select the best-performing copolymer–fullerene bulk heterojunction to be incorporated into complete OPV devices. We validate our methodology by using a diverse group of 12 copolymers, including new and literature materials, to demonstrate good correlation between (a) theoretically determined energy levels of polymers and experimentally determined ionization energies and electron affinities and (b) photoconductance, measured by TRMC, and OPV device performance. The materials used here also allow us to explore whether further copolymer design rules need to be incorporated into our methodology for materials selection. For example, we explore the effect of the enthalpy change (ΔH) during exciton dissociation on the efficiency of free charge carrier generation and device efficiency and find that a ΔH of -0.4 eV is sufficient for efficient charge generation.

  20. Dehumanized Theories and the Humanization of Work.

    ERIC Educational Resources Information Center

    Friedlander, Frank

    This paper argues that current theories and concepts of organization and organization psychology as represented in journals and books are inadequate in dealing with major contemporary behavioral and societal issues. The topics discussed in this paper include the relevance of organization theory; the fragmentation of organization theory (structure…

  1. Cultural Issues in Organizations.

    ERIC Educational Resources Information Center

    1999

    This document contains four symposium papers on cultural issues in organizations. "Emotion Management and Organizational Functions: A Study of Action in a Not-for-Profit Organization" (Jamie Callahan Fabian) uses Hochschild's emotion systems theory and Parsons' social systems theory to explain why members of an organization managed their…

  2. Organic matter diagenesis as the key to a unifying theory for the genesis of tabular uranium-vanadium deposits in the Morrison Formation, Colorado Plateau

    USGS Publications Warehouse

    Hansley, P.L.; Spirakis, C.S.

    1992-01-01

    Interstitial, epigenetic amorphous organic matter is intimately associated with uranium in the Grants uranium region and is considered essential to genetic models for these deposits. In contrast, uranium minerals are intimately associated with authigenic vanadium chlorite and vanadium oxides in amorphous organic matter-poor ores of the Slick Rock and Henry Mountains mining districts and therefore, in some genetic models amorphous organic matter is not considered crucial to the formation of these deposits. Differences in organic matter content can be explained by recognizing that amorphous organic matter-poor deposits have been subjected to more advanced stages of diagenesis than amorphous organic matter-rich deposits. Evidence that amorphous organic matter was involved in the genesis of organic matter-poor, as well as organic matter-rich, deposits is described. -from Authors

  3. Computational methods for efficient structural reliability and reliability sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.

    1993-01-01

    This paper presents recent developments in efficient structural reliability analysis methods. The paper proposes an efficient, adaptive importance sampling (AIS) method that can be used to compute reliability and reliability sensitivities. The AIS approach uses a sampling density that is proportional to the joint PDF of the random variables. Starting from an initial approximate failure domain, sampling proceeds adaptively and incrementally with the goal of reaching a sampling domain that is slightly greater than the failure domain to minimize over-sampling in the safe region. Several reliability sensitivity coefficients are proposed that can be computed directly and easily from the above AIS-based failure points. These probability sensitivities can be used for identifying key random variables and for adjusting design to achieve reliability-based objectives. The proposed AIS methodology is demonstrated using a turbine blade reliability analysis problem.
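    The adaptive details of the AIS method belong to the paper itself, but the core idea of importance sampling for failure probability can be sketched in a few lines of Python. The limit state g(x) = 3 - x, the standard-normal input, and the sampling density centered on the failure boundary are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def limit_state(x):
    # Assumed linear limit state: failure when g(x) < 0, i.e. x > 3.
    return 3.0 - x

def norm_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

# Sample from a density centered on the failure boundary (x = 3) so that
# most samples land in or near the failure domain, then reweight each
# sample by the ratio of the true density to the sampling density.
n = 100_000
samples = rng.normal(3.0, 1.0, n)
weights = norm_pdf(samples, 0.0, 1.0) / norm_pdf(samples, 3.0, 1.0)
p_f = np.mean((limit_state(samples) < 0) * weights)

print(f"estimated P_f = {p_f:.5f}")  # exact value is 1 - Phi(3), about 0.00135
```

    Crude Monte Carlo with the same sample size would see only about 135 failures; the shifted density makes roughly half the samples informative, which is the variance reduction the abstract describes.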

  4. High-Reliability Health Care: Getting There from Here

    PubMed Central

    Chassin, Mark R; Loeb, Jerod M

    2013-01-01

    Context Despite serious and widespread efforts to improve the quality of health care, many patients still suffer preventable harm every day. Hospitals find improvement difficult to sustain, and they suffer “project fatigue” because so many problems need attention. No hospitals or health systems have achieved consistent excellence throughout their institutions. High-reliability science is the study of organizations in industries like commercial aviation and nuclear power that operate under hazardous conditions while maintaining safety levels that are far better than those of health care. Adapting and applying the lessons of this science to health care offer the promise of enabling hospitals to reach levels of quality and safety that are comparable to those of the best high-reliability organizations. Methods We combined the Joint Commission's knowledge of health care organizations with knowledge from the published literature and from experts in high-reliability industries and leading safety scholars outside health care. We developed a conceptual and practical framework for assessing hospitals’ readiness for and progress toward high reliability. By iterative testing with hospital leaders, we refined the framework and, for each of its fourteen components, defined stages of maturity through which we believe hospitals must pass to reach high reliability. Findings We discovered that the ways that high-reliability organizations generate and maintain high levels of safety cannot be directly applied to today's hospitals. We defined a series of incremental changes that hospitals should undertake to progress toward high reliability. These changes involve the leadership's commitment to achieving zero patient harm, a fully functional culture of safety throughout the organization, and the widespread deployment of highly effective process improvement tools. Conclusions Hospitals can make substantial progress toward high reliability by undertaking several specific

  5. Reliability analysis and optimization in the design of distributed systems

    SciTech Connect

    Hariri, S.

    1986-01-01

    Reliability measures and efficient evaluation algorithms are presented to aid in designing reliable distributed systems. The terminal reliability between a pair of computers is a good measure in computer networks. For distributed systems, to capture more effectively the redundancy in resources, such as programs and files, two new reliability measures are introduced. These measures are Distributed Program Reliability (DPR) and Distributed System Reliability (DSR). A simple and efficient algorithm, SYREL, is developed to evaluate the reliability between two computing centers. This algorithm incorporates conditional probability, set theory, and Boolean algebra in a distinct approach to achieve fast execution times and obtain compact expressions. An elegant and unified approach based on graph-theoretic techniques is used in developing algorithms to evaluate DPR and DSR measures. It performs a breadth-first search on the graph representing a given distributed system to enumerate all the subgraphs that guarantee the proper accessibility for executing the given task(s). These subgraphs are then used to evaluate the desired reliabilities. Several optimization algorithms are developed for designing reliable systems under a cost constraint.
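    SYREL itself is not reproduced in the abstract, but two-terminal reliability for a small network can be illustrated by brute-force enumeration of edge states plus a breadth-first search, mirroring the graph-theoretic flavor described above. The bridge network and the uniform 0.9 link reliabilities are assumed for illustration:

```python
from itertools import product

# Edges of a small bridge network: (node u, node v, link reliability).
edges = [(0, 1, 0.9), (0, 2, 0.9), (1, 2, 0.9), (1, 3, 0.9), (2, 3, 0.9)]

def connected(up_edges, s=0, t=3):
    # Breadth-first search over the surviving edges only.
    adj = {}
    for u, v, _ in up_edges:
        adj.setdefault(u, []).append(v)
        adj.setdefault(v, []).append(u)
    seen, frontier = {s}, [s]
    while frontier:
        n = frontier.pop()
        for m in adj.get(n, []):
            if m not in seen:
                seen.add(m)
                frontier.append(m)
    return t in seen

# Sum the probability of every edge-state combination in which the
# source can still reach the terminal.
rel = 0.0
for states in product([True, False], repeat=len(edges)):
    p, up = 1.0, []
    for edge, is_up in zip(edges, states):
        p *= edge[2] if is_up else 1.0 - edge[2]
        if is_up:
            up.append(edge)
    if connected(up):
        rel += p

print(f"terminal reliability = {rel:.5f}")
```

    Enumeration is exponential in the number of links, which is exactly why compact-expression algorithms like SYREL matter for realistic networks.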

  6. Spike-time reliability of layered neural oscillator networks

    NASA Astrophysics Data System (ADS)

    Lin, K. K.; Shea-Brown, E.; Young, L.-S.

    2013-01-01

    If a network of neurons is repeatedly driven by the same fluctuating signal, will it give the same response each time? If so, the network is said to be reliable. Reliability is of interest in computational neuroscience because the degree to which a network is reliable constrains its ability to encode information in precise temporal patterns of spikes. This note outlines how the question of reliability may be fruitfully formulated and studied within the framework of random dynamical systems theory. A specific network architecture, that of a single-layer network, is examined. For the type of single-neuron dynamics and coupling considered here, single-layer networks are found to be very reliable. A qualitative explanation is proposed for this phenomenon.

  7. Computerized life and reliability modelling for turboprop transmissions

    NASA Technical Reports Server (NTRS)

    Savage, M.; Radil, K. C.; Lewicki, D. G.; Coy, J. J.

    1988-01-01

    A generalized life and reliability model is presented for parallel shaft geared prop-fan and turboprop aircraft transmissions. The transmission life and reliability model is a combination of the individual reliability models for all the bearings and gears in the main load paths. The bearing and gear reliability models are based on classical fatigue theory and the two parameter Weibull failure distribution. A computer program was developed to calculate the transmission life and reliability. The program is modular. In its present form, the program can analyze five different transmission arrangements. However, the program can be modified easily to include additional transmission arrangements. An example is included which compares the life of a compound two-stage transmission with the life of a split-torque, parallel compound two-stage transmission, as calculated by the computer program.
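    The paper's program handles full transmission arrangements; a minimal sketch of the underlying idea, two-parameter Weibull survival for each bearing and gear combined as a series system along the load path, might look like this in Python (all component parameters are hypothetical):

```python
import math

def weibull_reliability(t, beta, eta):
    # Two-parameter Weibull survival function: R(t) = exp(-(t/eta)**beta).
    return math.exp(-((t / eta) ** beta))

# Hypothetical components: (Weibull slope beta, characteristic life eta in hours).
components = {
    "input bearing":  (1.5, 9000.0),
    "output bearing": (1.5, 12000.0),
    "gear mesh":      (2.5, 15000.0),
}

t = 3000.0  # mission time, hours
r_system = 1.0
for name, (beta, eta) in components.items():
    r = weibull_reliability(t, beta, eta)
    r_system *= r  # series system: every component in the load path must survive
    print(f"{name}: R = {r:.4f}")

print(f"system reliability at {t:.0f} h: {r_system:.4f}")
```

    Splitting the torque between parallel paths lowers the load, hence raises each component's characteristic life, which is how the split-torque arrangement in the example gains life over the compound one.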

  9. Reliability of wireless sensor networks.

    PubMed

    Dâmaso, Antônio; Rosa, Nelson; Maciel, Paulo

    2014-01-01

    Wireless Sensor Networks (WSNs) consist of hundreds or thousands of sensor nodes with limited processing, storage, and battery capabilities. There are several strategies to reduce the power consumption of WSN nodes (thereby increasing the network lifetime) and to increase the reliability of the network (by improving the WSN Quality of Service). However, there is an inherent conflict between power consumption and reliability: an increase in reliability usually leads to an increase in power consumption. For example, routing algorithms can send the same packet through different paths (multipath strategy), which is important for reliability but significantly increases the WSN power consumption. In this context, this paper proposes a model for evaluating the reliability of WSNs considering the battery level as a key factor. Moreover, this model is based on routing algorithms used by WSNs. In order to evaluate the proposed models, three scenarios were considered to show the impact of the power consumption on the reliability of WSNs. PMID:25157553

  10. Nuclear weapon reliability evaluation methodology

    SciTech Connect

    Wright, D.L.

    1993-06-01

    This document provides an overview of those activities that are normally performed by Sandia National Laboratories to provide nuclear weapon reliability evaluations for the Department of Energy. These reliability evaluations are first provided as a prediction of the attainable stockpile reliability of a proposed weapon design. Stockpile reliability assessments are provided for each weapon type as the weapon is fielded and are continuously updated throughout the weapon stockpile life. The reliability predictions and assessments depend heavily on data from both laboratory simulation and actual flight tests. An important part of the methodology is the set of review opportunities that occur throughout the entire process, assuring a consistent approach and appropriate use of the data for reliability evaluation purposes.

  11. Reliability analysis of structural ceramics subjected to biaxial flexure

    NASA Technical Reports Server (NTRS)

    Chao, Luen-Yuan; Shetty, Dinesh K.

    1991-01-01

    The reliability of alumina disks subjected to biaxial flexure is predicted on the basis of statistical fracture theory using a critical strain energy release rate fracture criterion. Results on a sintered silicon nitride are consistent with reliability predictions based on pore-initiated penny-shaped cracks with preferred orientation normal to the maximum principal stress. Assumptions with regard to flaw types and their orientations in each ceramic can be justified by fractography. It is shown that there are no universal guidelines for selecting fracture criteria or assuming flaw orientations in reliability analyses.

  12. A fourth generation reliability predictor

    NASA Technical Reports Server (NTRS)

    Bavuso, Salvatore J.; Martensen, Anna L.

    1988-01-01

    A reliability/availability predictor computer program has been developed and is currently being beta-tested by over 30 US companies. The computer program is called the Hybrid Automated Reliability Predictor (HARP). HARP was developed to fill an important gap in reliability assessment capabilities. This gap was manifested through the use of its third-generation cousin, the Computer-Aided Reliability Estimation (CARE III) program, over a six-year development period and an additional three-year period during which CARE III has been in the public domain. The accumulated experience of the over 30 establishments now using CARE III was used in the development of the HARP program.

  13. US electric power system reliability

    NASA Astrophysics Data System (ADS)

    Electric energy supply, transmission and distribution systems are investigated in order to determine priorities for legislation. The status and the outlook for electric power reliability are discussed.

  14. Stirling Convertor Fasteners Reliability Quantification

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Korovaichuk, Igor; Kovacevich, Tiodor; Schreiber, Jeffrey G.

    2006-01-01

    Onboard Radioisotope Power Systems (RPS) being developed for NASA's deep-space science and exploration missions require reliable operation for up to 14 years and beyond. Stirling power conversion is a candidate for use in an RPS because it offers a multifold increase in the conversion efficiency of heat to electric power and a reduced inventory of radioactive material. Structural fasteners are responsible for maintaining the structural integrity of the Stirling power convertor, which is critical to ensure reliable performance during the entire mission. The design of fasteners involves variables related to the fabrication, manufacturing, and material behavior of the fasteners and joined parts, the structural geometry of the joining components, the size and spacing of fasteners, mission loads, boundary conditions, etc. These variables have inherent uncertainties, which need to be accounted for in the reliability assessment. This paper describes these uncertainties along with a methodology to quantify the reliability, and provides results of the analysis in terms of quantified reliability and sensitivity of Stirling power conversion reliability to the design variables. Quantification of the reliability includes both structural and functional aspects of the joining components. Based on the results, the paper also describes guidelines to improve the reliability and verification testing.

  15. Avionics design for reliability bibliography

    NASA Technical Reports Server (NTRS)

    1976-01-01

    A bibliography with abstracts was presented in support of AGARD lecture series No. 81. The following areas were covered: (1) program management, (2) design for high reliability, (3) selection of components and parts, (4) environment consideration, (5) reliable packaging, (6) life cycle cost, and (7) case histories.

  16. Computer-Aided Reliability Estimation

    NASA Technical Reports Server (NTRS)

    Bavuso, S. J.; Stiffler, J. J.; Bryant, L. A.; Petersen, P. L.

    1986-01-01

    CARE III (Computer-Aided Reliability Estimation, Third Generation) helps estimate reliability of complex, redundant, fault-tolerant systems. Program specifically designed for evaluation of fault-tolerant avionics systems. However, CARE III general enough for use in evaluation of other systems as well.

  17. The Reliability of Density Measurements.

    ERIC Educational Resources Information Center

    Crothers, Charles

    1978-01-01

    Data from a land-use study of small- and medium-sized towns in New Zealand are used to ascertain the relationship between official and effective density measures. It was found that the reliability of official measures of density is very low overall, although reliability increases with community size. (Author/RLV)

  18. The path to safe and reliable healthcare.

    PubMed

    Leonard, Michael W; Frankel, Allan

    2010-09-01

    The ability to deliver safe and reliable healthcare is the goal of all healthcare delivery systems. To bridge the current performance gaps in quality and safety, organizations need to apply a systematic model that effectively addresses both culture and reliable processes of care. The model described in this article provides a comprehensive approach to improving the quality of care in any clinical domain. It also provides a roadmap for people working in clinical improvement to assess the strengths and current needs within their care systems, so they can be strategic and systematic in their work, essential elements for success. The concepts and tools provided can be readily applied to improve the quality and safety of care delivered. PMID:20688455

  19. Photovoltaic performance and reliability workshop

    SciTech Connect

    Mrig, L.

    1993-12-01

    This workshop was the sixth in a series of workshops sponsored by NREL/DOE under the general subject of photovoltaic testing and reliability during the period 1986--1993. PV performance and PV reliability are at least as important as PV cost, if not more. In the US, PV manufacturers, DOE laboratories, electric utilities, and others are engaged in the photovoltaic reliability research and testing. This group of researchers and others interested in the field were brought together to exchange the technical knowledge and field experience as related to current information in this evolving field of PV reliability. The papers presented here reflect this effort since the last workshop held in September, 1992. The topics covered include: cell and module characterization, module and system testing, durability and reliability, system field experience, and standards and codes.

  20. Photovoltaic performance and reliability workshop

    NASA Astrophysics Data System (ADS)

    Mrig, L.

    1993-12-01

    This workshop was the sixth in a series of workshops sponsored by NREL/DOE under the general subject of photovoltaic testing and reliability during the period 1986-1993. PV performance and PV reliability are at least as important as PV cost, if not more. In the U.S., PV manufacturers, DOE laboratories, electric utilities, and others are engaged in the photovoltaic reliability research and testing. This group of researchers and others interested in the field were brought together to exchange the technical knowledge and field experience as related to current information in this evolving field of PV reliability. The papers presented here reflect this effort since the last workshop held in September, 1992. The topics covered include: cell and module characterization, module and system testing, durability and reliability, system field experience, and standards and codes.

  1. Statistical modeling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1992-01-01

    This working paper discusses the statistical simulation part of a controlled software development experiment being conducted under the direction of the System Validation Methods Branch, Information Systems Division, NASA Langley Research Center. The experiment uses guidance and control software (GCS) aboard a fictitious planetary landing spacecraft: real-time control software operating on a transient mission. Software execution is simulated to study the statistical aspects of reliability and other failure characteristics of the software during development, testing, and random usage. Quantification of software reliability is a major goal. Various reliability concepts are discussed. Experiments are described for performing simulations and collecting appropriate simulated software performance and failure data. This data is then used to make statistical inferences about the quality of the software development and verification processes as well as inferences about the reliability of software versions and reliability growth under random testing and debugging.

  2. Reliability of large superconducting magnets through design

    SciTech Connect

    Henning, C.D.

    1980-09-05

    As superconducting magnet systems grow larger and become the central component of major systems involving fusion, magnetohydrodynamics, and high-energy physics, their reliability must be commensurate with the enormous capital investment in the projects. Although the magnet may represent only 15% of the cost of a large system such as the Mirror Fusion Test Facility, its failure would be catastrophic to the entire investment. Effective quality control during construction is one method of ensuring success. However, if the design is unforgiving, even an inordinate amount of effort expended on quality control may be inadequate. Creative design is the most effective way of ensuring magnet reliability and providing a reasonable limit on the amount of quality control needed. For example, by subjecting the last drawing operation in superconductor manufacture to a stress larger than the magnet design stress, a 100% proof test is achieved; cabled conductors offer mechanical redundancy, as do some methods of conductor joining; ground-plane insulation should be multilayered to prevent arcs, and interturn and interlayer insulation spaced to be compatible with the self-extinguishing of arcs during quench voltages; electrical leads should be thermally protected; and guard vacuum spaces can be incorporated to control helium leaks. Many reliable design options are known to magnet designers. These options need to be documented and organized to produce a design guide. Eventually, standard procedures, safety factors, and design codes can lead to reliability in magnets comparable to that obtained in pressure vessels and other structures. Without such reliability, large-scale applications in major systems employing magnetic fusion energy, magnetohydrodynamics, or high-energy physics would present unacceptable economic risks.

  3. Reliability estimation procedures and CARE: The Computer-Aided Reliability Estimation Program

    NASA Technical Reports Server (NTRS)

    Mathur, F. P.

    1971-01-01

    Ultrareliable fault-tolerant onboard digital systems for spacecraft intended for long mission life exploration of the outer planets are under development. The design of systems involving self-repair and fault-tolerance leads to the companion problem of quantifying and evaluating the survival probability of the system for the mission under consideration and the constraints imposed upon the system. Methods have been developed to (1) model self-repair and fault-tolerant organizations; (2) compute survival probability, mean life, and many other reliability predictive functions with respect to various systems and mission parameters; (3) perform sensitivity analysis of the system with respect to mission parameters; and (4) quantitatively compare competitive fault-tolerant systems. Various measures of comparison are offered. To automate the procedures of reliability mathematical modeling and evaluation, the CARE (computer-aided reliability estimation) program was developed. CARE is an interactive program residing on the UNIVAC 1108 system, which makes the above calculations and facilitates report preparation by providing output in tabular form, graphical 2-dimensional plots, and 3-dimensional projections. The reliability estimation of fault-tolerant organization by means of the CARE program is described.
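    CARE's models of fault-tolerant organizations are far richer, but the kind of survival-probability calculation it automates can be illustrated with the classic triple-modular-redundancy (TMR) formula; the failure rate below is an assumed value:

```python
import math

def r_simplex(t, lam):
    # Single (simplex) unit with constant failure rate lam: R(t) = exp(-lam*t).
    return math.exp(-lam * t)

def r_tmr(t, lam):
    # TMR with a perfect voter survives if at least 2 of 3 units survive:
    # R_TMR = 3*R^2 - 2*R^3.
    r = r_simplex(t, lam)
    return 3.0 * r**2 - 2.0 * r**3

lam = 1e-4  # assumed failure rate, failures per hour
for t in (1000.0, 5000.0, 20000.0):
    print(f"t = {t:7.0f} h  simplex = {r_simplex(t, lam):.4f}  "
          f"TMR = {r_tmr(t, lam):.4f}")
```

    TMR beats the simplex unit only while R(t) > 0.5; for long missions the extra hardware becomes a liability, which is exactly the kind of mission-parameter trade-off a tool like CARE quantifies.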

  4. String Theory and Gauge Theories

    SciTech Connect

    Maldacena, Juan

    2009-02-20

    We will see how gauge theories, in the limit that the number of colors is large, give string theories. We will discuss some examples of particular gauge theories where the corresponding string theory is known precisely, starting with the case of the maximally supersymmetric theory in four dimensions which corresponds to ten dimensional string theory. We will discuss recent developments in this area.

  5. [School Organization: Theory and Practice; Selected Readings on Grading, Nongrading, Multigrading, Self-Contained Classrooms, Departmentalization, Team Heterogeneous Grouping. Selected Bibliographies.] Rand McNally Education Series.

    ERIC Educational Resources Information Center

    Franklin, Marian Pope, Comp.

    Over 400 journal articles, case studies, research reports, dissertations, and position papers are briefly described in a series of eight selected bibliographies related to school organization. The eight specific areas treated in the volume and the number of items listed for each include: nongraded elementary school organization, 96; nongraded…

  6. 77 FR 27574 - Automatic Underfrequency Load Shedding and Load Shedding Plans Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-11

    ...Under section 215 of the Federal Power Act (FPA), the Federal Energy Regulatory Commission (Commission) approves Reliability Standards PRC-006-1 (Automatic Underfrequency Load Shedding) and EOP- 003-2 (Load Shedding Plans), developed and submitted to the Commission for approval by the North American Electric Reliability Corporation (NERC), the Electric Reliability Organization certified by the......

  7. Calculating system reliability with SRFYDO

    SciTech Connect

    Morzinski, Jerome; Anderson - Cook, Christine M; Klamann, Richard M

    2010-01-01

    SRFYDO is a process for estimating reliability of complex systems. Using information from all applicable sources, including full-system (flight) data, component test data, and expert (engineering) judgment, SRFYDO produces reliability estimates and predictions. It is appropriate for series systems with possibly several versions of the system which share some common components. It models reliability as a function of age and up to 2 other lifecycle (usage) covariates. Initial output from its Exploratory Data Analysis mode consists of plots and numerical summaries so that the user can check data entry and model assumptions, and help determine a final form for the system model. The System Reliability mode runs a complete reliability calculation using Bayesian methodology. This mode produces results that estimate reliability at the component, sub-system, and system level. The results include estimates of uncertainty, and can predict reliability at some not-too-distant time in the future. This paper presents an overview of the underlying statistical model for the analysis, discusses model assumptions, and demonstrates usage of SRFYDO.

  8. A reliable multicast for XTP

    NASA Technical Reports Server (NTRS)

    Dempsey, Bert J.; Weaver, Alfred C.

    1990-01-01

    Multicast services needed for current distributed applications on LAN's fall generally into one of three categories: datagram, semi-reliable, and reliable. Transport layer multicast datagrams represent unreliable service in which the transmitting context 'fires and forgets'. XTP executes these semantics when the MULTI and NOERR mode bits are both set. Distributing sensor data and other applications in which application-level error recovery strategies are appropriate benefit from the efficiency in multidestination delivery offered by datagram service. Semi-reliable service refers to multicasting in which the control algorithms of the transport layer--error, flow, and rate control--are used in transferring the multicast distribution to the set of receiving contexts, the multicast group. The multicast defined in XTP provides semi-reliable service. Since, under a semi-reliable service, joining a multicast group means listening on the group address and entails no coordination with other members, a semi-reliable facility can be used for communication between a client and a server group as well as true peer-to-peer group communication. Resource location in a LAN is an important application domain. The term 'semi-reliable' refers to the fact that group membership changes go undetected. No attempt is made to assess the current membership of the group at any time--before, during, or after--the data transfer.

  9. A Rationale for Assessing the Reliability of an Observational Measure.

    ERIC Educational Resources Information Center

    Rowley, Glenn

    The use of the intraclass correlation in determining reliability is discussed and shown to be both appropriate and simple to use in the case of an observational measure, provided that observations are made on at least two occasions. The interpretation of such coefficients is explained in terms of generalizability theory, and real data are used to…

  10. Measuring Rater Reliability on a Special Education Observation Tool

    ERIC Educational Resources Information Center

    Semmelroth, Carrie Lisa; Johnson, Evelyn

    2014-01-01

    This study used generalizability theory to measure reliability on the Recognizing Effective Special Education Teachers (RESET) observation tool designed to evaluate special education teacher effectiveness. At the time of this study, the RESET tool included three evidence-based instructional practices (direct, explicit instruction; whole-group…

  11. Monte Carlo Approach for Reliability Estimations in Generalizability Studies.

    ERIC Educational Resources Information Center

    Dimitrov, Dimiter M.

    A Monte Carlo approach is proposed, using the Statistical Analysis System (SAS) programming language, for estimating reliability coefficients in generalizability theory studies. Test scores are generated by a probabilistic model that considers the probability for a person with a given ability score to answer an item with a given difficulty…

  12. A Latent-Trait Based Reliability Estimate and Upper Bound.

    ERIC Educational Resources Information Center

    Nicewander, W. Alan

    1990-01-01

    An estimate and upper-bound estimate for the reliability of a test composed of binary items is derived from the multidimensional latent trait theory of R. D. Bock and M. Aitken (1981). The practical uses of such estimates are discussed. (SLD)

  13. Fatigue Reliability of Gas Turbine Engine Structures

    NASA Technical Reports Server (NTRS)

    Cruse, Thomas A.; Mahadevan, Sankaran; Tryon, Robert G.

    1997-01-01

    The results of an investigation are described for fatigue reliability in engine structures. The description consists of two parts. Part 1 is for method development. Part 2 is a specific case study. In Part 1, the essential concepts and practical approaches to damage tolerance design in the gas turbine industry are summarized. These have evolved over the years in response to flight safety certification requirements. The effect of Non-Destructive Evaluation (NDE) methods on these methods is also reviewed. Assessment methods based on probabilistic fracture mechanics, with regard to both crack initiation and crack growth, are outlined. Limit state modeling techniques from structural reliability theory are shown to be appropriate for application to this problem, for both individual failure mode and system-level assessment. In Part 2, the results of a case study for the high pressure turbine of a turboprop engine are described. The response surface approach is used to construct a fatigue performance function. This performance function is used with the First Order Reliability Method (FORM) to determine the probability of failure and the sensitivity of the fatigue life to the engine parameters for the first stage disk rim of the two stage turbine. A hybrid combination of regression and Monte Carlo simulation is used to incorporate time-dependent random variables. System reliability is used to determine the system probability of failure, and the sensitivity of the system fatigue life to the engine parameters of the high pressure turbine. The variation in the primary hot gas and secondary cooling air, the uncertainty of the complex mission loading, and the scatter in the material data are considered.
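    For the linear limit-state case, FORM reduces to a closed-form reliability index; a sketch with assumed normal strength and load distributions (illustrative values, not the engine parameters of the study):

```python
import math

def std_normal_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Assumed linear limit state g = R - S with independent normal
# strength R and load S (means and standard deviations are illustrative).
mu_R, sd_R = 200.0, 20.0
mu_S, sd_S = 150.0, 15.0

# For a linear g with normal inputs, FORM is exact:
beta = (mu_R - mu_S) / math.sqrt(sd_R**2 + sd_S**2)  # reliability index
p_f = std_normal_cdf(-beta)

print(f"beta = {beta:.3f}, P_f = {p_f:.2e}")
```

    For nonlinear performance functions, such as the response surface in the case study, FORM instead iterates to the most probable failure point before computing beta, and the result is an approximation rather than exact.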

  14. Reliability analysis of interdependent lattices

    NASA Astrophysics Data System (ADS)

    Limiao, Zhang; Daqing, Li; Pengju, Qin; Bowen, Fu; Yinan, Jiang; Zio, Enrico; Rui, Kang

    2016-06-01

    Network reliability analysis has drawn much attention recently due to the risks of catastrophic damage in networked infrastructures. These infrastructures are dependent on each other as a result of various interactions. However, most of the reliability analyses of these interdependent networks do not consider spatial constraints, which are found important for robustness of infrastructures including power grid and transport systems. Here we study the reliability properties of interdependent lattices with different ranges of spatial constraints. Our study shows that interdependent lattices with strong spatial constraints are more resilient than interdependent Erdős-Rényi networks. There exists an intermediate range of spatial constraints, at which the interdependent lattices have minimal resilience.

  15. The Assessment of Reliability Under Range Restriction: A Comparison of [Alpha], [Omega], and Test-Retest Reliability for Dichotomous Data

    ERIC Educational Resources Information Center

    Fife, Dustin A.; Mendoza, Jorge L.; Terry, Robert

    2012-01-01

    Though much research and attention have been directed at assessing the correlation coefficient under range restriction, the assessment of reliability under range restriction has been largely ignored. This article uses item response theory to simulate dichotomous item-level data to assess the robustness of KR-20 ([alpha]), [omega], and test-retest…
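    KR-20 is the dichotomous-item form of coefficient alpha: it compares the summed item variances p(1-p) against the variance of total scores. A minimal sketch of the standard formula (not code from the article):

    ```python
    def kr20(items):
        """Kuder-Richardson Formula 20 reliability for dichotomous items.
        `items` is a list of examinee response vectors of 0/1 scores,
        one inner list per person, all of the same length k."""
        n = len(items)      # number of examinees
        k = len(items[0])   # number of items
        # Sum of item variances p*(1 - p)
        pq = 0.0
        for j in range(k):
            p = sum(person[j] for person in items) / n
            pq += p * (1 - p)
        # Population variance of total scores
        totals = [sum(person) for person in items]
        mean = sum(totals) / n
        var = sum((t - mean) ** 2 for t in totals) / n
        return (k / (k - 1)) * (1 - pq / var)

    # Perfectly consistent responses give KR-20 = 1.0
    r = kr20([[1, 1, 1, 1]] * 5 + [[0, 0, 0, 0]] * 5)
    ```

    Range restriction truncates the spread of total scores, shrinking `var` relative to the unrestricted population, which is why estimates like KR-20 can be biased in selected samples.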

  16. An experiment in software reliability

    NASA Technical Reports Server (NTRS)

    Dunham, J. R.; Pierce, J. L.

    1986-01-01

    The results of a software reliability experiment conducted in a controlled laboratory setting are reported. The experiment was undertaken to gather data on software failures and is one in a series of experiments being pursued by the Fault Tolerant Systems Branch of NASA Langley Research Center to find a means of credibly performing reliability evaluations of flight control software. The experiment tests a small sample of implementations of radar tracking software having ultra-reliability requirements and uses n-version programming for error detection, and repetitive run modeling for failure and fault rate estimation. The experiment results agree with those of Nagel and Skrivan in that the program error rates suggest an approximate log-linear pattern and the individual faults occurred with significantly different error rates. Additional analysis of the experimental data raises new questions concerning the phenomenon of interacting faults. This phenomenon may provide one explanation for software reliability decay.
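    The "approximate log-linear pattern" means that the logarithm of the per-fault error rate falls roughly linearly with the fault's rank. A simple least-squares fit of log(rate) against rank, as sketched below, is one way to check such a pattern; this is an illustration of the idea, not the estimation procedure used in the experiment.

    ```python
    import math

    def log_linear_fit(rates):
        """Ordinary least-squares fit of log(rate_i) = a + b * i, where
        i = 1..len(rates) is the fault rank. Returns (a, b); a clearly
        negative slope b is consistent with a log-linear decay."""
        xs = list(range(1, len(rates) + 1))
        ys = [math.log(r) for r in rates]
        n = len(xs)
        mx = sum(xs) / n
        my = sum(ys) / n
        b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
            / sum((x - mx) ** 2 for x in xs)
        a = my - b * mx
        return a, b

    # Synthetic rates generated from an exact log-linear law (assumed data)
    a, b = log_linear_fit([math.exp(2.0 - 0.5 * i) for i in range(1, 6)])
    ```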

  17. Failure Analysis for Improved Reliability

    NASA Technical Reports Server (NTRS)

    Sood, Bhanu

    2016-01-01

    Outline: Section 1 - What is reliability and root cause? Section 2 - Overview of failure mechanisms. Section 3 - Failure analysis techniques (1. non-destructive analysis techniques, 2. destructive analysis, 3. materials characterization). Section 4 - Summary and closure.

  18. GaAs Reliability Database

    NASA Technical Reports Server (NTRS)

    Sacco, T.; Gonzalez, S.; Kayali, S.

    1993-01-01

    The database consists of two main sections, the data references and the device reliability records. The reference section contains 8 fields: reference number, date of publication, authors, article title, publisher, volume, and page numbers.

  19. "High Stage" Organizing.

    ERIC Educational Resources Information Center

    Torbert, William R.

    Although a psychological theory of stages of transformation in human development currently exists, organizational researchers have yet to elaborate and test any theory of organizational transformation of comparable elegance. According to the organizational stage theory being developed since 1974 by William Torbert, bureaucratic organization, which…

  20. Photovoltaics Performance and Reliability Workshop

    NASA Astrophysics Data System (ADS)

    Mrig, L.

    This document consists of papers and viewgraphs compiled from the proceedings of a workshop held in September 1992. This workshop was the fifth in a series sponsored by NREL/DOE under the general subject areas of photovoltaic module testing and reliability. PV manufacturers, DOE laboratories, electric utilities, and others exchanged technical knowledge and field experience. The topics of cell and module characterization, module and system performance, materials and module durability/reliability research, solar radiation, and applications are discussed.