Science.gov

Sample records for reliability organizations theory

  1. Theory of reliable systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1975-01-01

    An attempt was made to refine the current notion of system reliability by identifying and investigating attributes of a system which are important to reliability considerations. Techniques which facilitate analysis of system reliability are included. Special attention was given to fault tolerance, diagnosability, and reconfigurability characteristics of systems.

  2. Decision theory in structural reliability

    NASA Technical Reports Server (NTRS)

    Thomas, J. M.; Hanagud, S.; Hawk, J. D.

    1975-01-01

    Some fundamentals of reliability analysis as applicable to aerospace structures are reviewed, and the concept of a test option is introduced. A decision methodology, based on statistical decision theory, is developed for determining the most cost-effective design factor and method of testing for a given structural assembly. The method is applied to several Saturn V and Space Shuttle structural assemblies as examples. It is observed that the cost and weight features of the design have a significant effect on the optimum decision.
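
    The core of such a methodology is an expected-cost comparison across design/test alternatives. A minimal Python sketch of that logic follows; all costs, failure probabilities, and candidate options are invented for illustration and are not taken from the study.

        # Hypothetical expected-cost comparison across (design factor, test
        # option) pairs: total the test cost, the weight penalty, and the
        # expected cost of structural failure, then pick the cheapest.
        options = [
            # (design factor, test option, test cost, weight penalty, P(failure))
            (1.25, "no test",    0.0, 10.0, 1e-3),
            (1.25, "proof test", 5.0, 10.0, 2e-4),
            (1.40, "no test",    0.0, 25.0, 1e-4),
            (1.40, "proof test", 5.0, 25.0, 5e-5),
        ]
        FAILURE_COST = 2e5  # consequence of a structural failure, same cost units

        def expected_cost(test_cost, weight_penalty, p_fail):
            return test_cost + weight_penalty + p_fail * FAILURE_COST

        best = min(options, key=lambda o: expected_cost(*o[2:]))
        print("best (factor, test):", best[:2])
        print("expected cost:", expected_cost(*best[2:]))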

  3. Centralized Reliability Data Organization (CREDO)

    SciTech Connect

    Haire, M J

    1987-04-21

    One of the primary goals of the Centralized Reliability Data Organization (CREDO) is to be an international focal point for the collection, analysis, and dissemination of liquid metal reactor (LMR) component reliability, availability, and maintainability (RAM) data. During FY-1985, the Department of Energy (DOE) entered into a Specific Memorandum of Agreement (SMA) with Japan's Power Reactor and Nuclear Fuel Development Corporation (PNC) regarding cooperative data exchange efforts. This agreement was CREDO's first step toward internationalization and represented an initial realization of the previously mentioned goal. DOE's interest in further internationalization of the CREDO system was the primary motivation for the traveler's attendance at the Reliability '87 conference.

  4. Becoming a high reliability organization.

    PubMed

    Christianson, Marlys K; Sutcliffe, Kathleen M; Miller, Melissa A; Iwashyna, Theodore J

    2011-01-01

    Aircraft carriers, electrical power grids, and wildland firefighting, though seemingly different, are exemplars of high reliability organizations (HROs)--organizations that have the potential for catastrophic failure yet engage in nearly error-free performance. HROs commit to safety at the highest level and adopt a special approach to its pursuit. High reliability organizing has been studied and discussed for some time in other industries and is receiving increasing attention in health care, particularly in high-risk settings like the intensive care unit (ICU). The essence of high reliability organizing is a set of principles that enable organizations to focus attention on emergent problems and to deploy the right set of resources to address those problems. HROs behave in ways that sometimes seem counterintuitive--they do not try to hide failures but rather celebrate them as windows into the health of the system, they seek out problems, they avoid focusing on just one aspect of work and are able to see how all the parts of work fit together, they expect unexpected events and develop the capability to manage them, and they defer decision making to local frontline experts who are empowered to solve problems. Given the complexity of patient care in the ICU, the potential for medical error, and the particular sensitivity of critically ill patients to harm, high reliability organizing principles hold promise for improving ICU patient care.

  5. Creating Highly Reliable Accountable Care Organizations.

    PubMed

    Vogus, Timothy J; Singer, Sara J

    2016-12-01

    Accountable Care Organizations' (ACOs) pursuit of the triple aim of higher quality, lower cost, and improved population health has met with mixed results. To improve the design and implementation of ACOs, we look to high-reliability organizations: organizations that manage similarly complex, dynamic, and tightly coupled conditions while sustaining exceptional performance. We describe the key processes through which organizations achieve reliability, the leadership and organizational practices that enable it, and the role that professionals can play when charged with enacting it. Specifically, we present concrete practices and processes from health care organizations pursuing high reliability and from early ACOs to illustrate how the triple aim may be met by cultivating mindful organizing, practicing reliability-enhancing leadership, and identifying and supporting reliability professionals. We conclude by proposing a set of research questions to advance the study of ACOs and high-reliability research.

  6. Organization Theory as Ideology.

    ERIC Educational Resources Information Center

    Greenfield, Thomas B.

    The theory that organizations are ideological inventions of the human mind is discussed. Organizational science is described as an ideology which is based upon social concepts and experiences. The main justification for organizational theory is that it attempts to answer why we behave as we do in social organizations. Ways in which ideas and…

  7. High Reliability Organizations in Education. Noteworthy Perspectives

    ERIC Educational Resources Information Center

    Eck, James H.; Bellamy, G. Thomas; Schaffer, Eugene; Stringfield, Sam; Reynolds, David

    2011-01-01

    The authors of this monograph assert that by assisting school systems to more closely resemble "high reliability" organizations (HROs) that already exist in other industries and benchmarking against top-performing education systems from around the globe, America's school systems can transform themselves from compliance-driven…

  8. Are hospitals becoming high reliability organizations?

    PubMed

    Bagnara, Sebastiano; Parlangeli, Oronzo; Tartaglia, Riccardo

    2010-09-01

    High Reliability Organizations (HROs) are complex systems in which many accidents and adverse events that could occur within those systems, or at the interfaces with other systems, are actually avoided or prevented. Many organizations in high-risk industries have successfully implemented HRO approaches. In recent years, initiatives have been undertaken aimed at transforming hospitals into HROs. In practice, despite some improvements, these initiatives have not shown the expected results. In this paper, we discuss the possible reasons for such outcomes. We show that, when compared with traditional HROs, hospitals are undoubtedly high-risk organizations, but they have specific characteristics and face systemic socio-organizational barriers that make them difficult to transform into HROs.

  9. 18 CFR 39.3 - Electric Reliability Organization certification.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Electric Reliability... CERTIFICATION OF THE ELECTRIC RELIABILITY ORGANIZATION; AND PROCEDURES FOR THE ESTABLISHMENT, APPROVAL, AND ENFORCEMENT OF ELECTRIC RELIABILITY STANDARDS § 39.3 Electric Reliability Organization certification. (a)...

  10. 18 CFR 39.3 - Electric Reliability Organization certification.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 18 Conservation of Power and Water Resources 1 2013-04-01 2013-04-01 false Electric Reliability... CERTIFICATION OF THE ELECTRIC RELIABILITY ORGANIZATION; AND PROCEDURES FOR THE ESTABLISHMENT, APPROVAL, AND ENFORCEMENT OF ELECTRIC RELIABILITY STANDARDS § 39.3 Electric Reliability Organization certification. (a)...

  11. 18 CFR 39.3 - Electric Reliability Organization certification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Electric Reliability... CERTIFICATION OF THE ELECTRIC RELIABILITY ORGANIZATION; AND PROCEDURES FOR THE ESTABLISHMENT, APPROVAL, AND ENFORCEMENT OF ELECTRIC RELIABILITY STANDARDS § 39.3 Electric Reliability Organization certification. (a)...

  12. 18 CFR 39.3 - Electric Reliability Organization certification.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 18 Conservation of Power and Water Resources 1 2014-04-01 2014-04-01 false Electric Reliability... CERTIFICATION OF THE ELECTRIC RELIABILITY ORGANIZATION; AND PROCEDURES FOR THE ESTABLISHMENT, APPROVAL, AND ENFORCEMENT OF ELECTRIC RELIABILITY STANDARDS § 39.3 Electric Reliability Organization certification. (a)...

  13. Applying Organization Theory to Understanding the Adoption and Implementation of Accountable Care Organizations: Commentary.

    PubMed

    Shortell, Stephen M

    2016-12-01

    This commentary highlights the key arguments and contributions of institutional theory, transaction cost economics (TCE) theory, high reliability theory, and organizational learning theory to understanding the development and evolution of Accountable Care Organizations (ACOs). Institutional theory and TCE theory primarily emphasize the external influences shaping ACOs, while high reliability theory and organizational learning theory underscore the internal factors influencing ACO performance. A framework based on implementation science is proposed to consider the multiple perspectives on ACOs and, in particular, their ability to innovate to achieve desired cost, quality, and population health goals.

  14. Design of high reliability organizations in health care.

    PubMed

    Carroll, J S; Rudolph, J W

    2006-12-01

    To improve safety performance, many healthcare organizations have sought to emulate high reliability organizations from industries such as nuclear power, chemical processing, and military operations. We outline high reliability design principles for healthcare organizations including both the formal structures and the informal practices that complement those structures. A stage model of organizational structures and practices, moving from local autonomy to formal controls to open inquiry to deep self-understanding, is used to illustrate typical challenges and design possibilities at each stage. We suggest how organizations can use the concepts and examples presented to increase their capacity to self-design for safety and reliability.

  15. Classical Perturbation Theory for Monte Carlo Studies of System Reliability

    SciTech Connect

    Lewins, Jeffrey D.

    2001-03-15

    A variational principle for a Markov system allows the derivation of perturbation theory for models of system reliability, with prospects of extension to generalized Markov processes of a wide nature. It is envisaged that Monte Carlo or stochastic simulation will supply the trial functions for such a treatment, which obviates the standard difficulties of direct analog Monte Carlo perturbation studies. The development is given in the specific mode for first- and second-order theory, using an example with known analytical solutions. The adjoint equation is identified with the importance function and a discussion given as to how both the forward and backward (adjoint) fields can be obtained from a single Monte Carlo study, with similar interpretations for the additional functions required by second-order theory. Generalized Markov models with age-dependence are identified as coming into the scope of this perturbation theory.
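
    As a flavor of how stochastic simulation supplies reliability estimates for a Markov model with a known analytic answer, here is a minimal Python sketch for a two-component parallel system with constant failure rates. It is a stand-in illustration only; the paper's variational/perturbation machinery and adjoint calculations are not reproduced.

        import math, random

        # Two identical components in parallel, constant failure rate lam,
        # no repair. Analytic reliability: R(t) = 2*exp(-lam*t) - exp(-2*lam*t).
        lam, t, n = 0.1, 5.0, 100_000
        random.seed(1)

        survive = 0
        for _ in range(n):
            t1 = random.expovariate(lam)   # failure time of component 1
            t2 = random.expovariate(lam)   # failure time of component 2
            if max(t1, t2) > t:            # system survives while either works
                survive += 1

        mc = survive / n
        exact = 2 * math.exp(-lam * t) - math.exp(-2 * lam * t)
        print(f"Monte Carlo R({t}) = {mc:.4f}, exact = {exact:.4f}")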

  16. Interrater reliability for Kernberg's structural interview for assessing personality organization.

    PubMed

    Ingenhoven, Theo J M; Duivenvoorden, Hugo J; Brogtrop, Janneke; Lindenborn, Anne; van den Brink, Wim; Passchier, Jan

    2009-10-01

    Interrater reliability is considered a precondition for the validity of theoretical models and their corresponding diagnostic instruments. Studies have documented good interrater reliability for structured interviews measuring personality characteristics on a descriptive-phenomenological level, but there is little research on the reliability of assessment procedures at a structural level. The current study investigated the interrater reliability of the structural interview (SI) designed to assess neurotic, borderline, and psychotic personality organization according to Kernberg. Videotaped SIs of 69 psychiatric patients were randomly and independently rated by two out of three trained psychologists. Agreement between rater pairs was expressed as square-weighted kappa (K_sw, 95% CI). Results indicate satisfactory interrater reliability with respect to Kernberg's tripartite classification (K_sw = 0.42, 95% CI 0.07 to 0.77). Subdivision of the borderline category or introduction of intermediate subcategories to the tripartite system did not significantly affect reliability (K_sw = 0.55, 95% CI 0.30 to 0.80; K_sw = 0.59, 95% CI 0.34 to 0.84, respectively). The conclusion is that trained clinicians can reliably assess structural personality organization using the SI. Refining the nosological system by adding subcategories did not reduce reliability.
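
    For readers unfamiliar with the statistic, a square-weighted (quadratically weighted) kappa can be computed directly from a two-rater contingency table, as in the Python sketch below. The 3x3 table is fabricated; only the formula reflects the statistic reported above.

        import numpy as np

        # Quadratically weighted kappa for two raters assigning ordered
        # categories, e.g. neurotic / borderline / psychotic organization.
        obs = np.array([[20,  5,  1],
                        [ 6, 18,  4],
                        [ 1,  3, 11]], dtype=float)

        n = obs.sum()
        k = obs.shape[0]
        row = obs.sum(axis=1) / n
        col = obs.sum(axis=0) / n
        expected = np.outer(row, col) * n        # chance-agreement table
        i, j = np.indices((k, k))
        disagree = (i - j) ** 2                  # squared disagreement weights

        kappa_sw = 1 - (disagree * obs).sum() / (disagree * expected).sum()
        print(f"square-weighted kappa = {kappa_sw:.2f}")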

  17. Theory and the organic bioethicist.

    PubMed

    Chambers, T

    2001-01-01

    This article argues for the importance of theoretical reflections that originate from patients' experiences. Traditionally academic philosophers have linked their ability to theorize about the moral basis of medical practice to their role as outside observer. The author contends that recently a new type of reflection has come from within particular patient populations. Drawing upon a distinction created by Antonio Gramsci, it is argued that one can distinguish the theory generated by traditional bioethicists, who are academically trained, from that of "organic" bioethicists, who identify themselves with a particular patient community. The characteristics of this new type of bioethicist that are explored in this article include a close association of memoir and philosophy, an interrelationship of theory and praxis, and an intimate connection between the individual and a particular patient community.

  18. Modeling Sensor Reliability in Fault Diagnosis Based on Evidence Theory

    PubMed Central

    Yuan, Kaijuan; Xiao, Fuyuan; Fei, Liguo; Kang, Bingyi; Deng, Yong

    2016-01-01

    Sensor data fusion plays an important role in fault diagnosis. Dempster-Shafer (D-S) evidence theory is widely used in fault diagnosis, since it efficiently combines evidence from different sensors. However, when the evidence is highly conflicting, it may yield counterintuitive results. To address this issue, a new method is proposed in this paper. Both static and dynamic sensor reliability are taken into consideration. The evidence distance function and the belief entropy are combined to obtain the dynamic reliability of each sensor report. A weighted averaging method is adopted to modify the conflicting evidence by assigning different weights to evidence according to sensor reliability. The proposed method performs better in conflict management and fault diagnosis because the information volume of each sensor report is taken into consideration. An application in fault diagnosis based on sensor fusion is illustrated to show the efficiency of the proposed method. The results show that the proposed method improves the accuracy of fault diagnosis from 81.19% to 89.48% compared to the existing methods. PMID:26797611

  19. Modeling Sensor Reliability in Fault Diagnosis Based on Evidence Theory.

    PubMed

    Yuan, Kaijuan; Xiao, Fuyuan; Fei, Liguo; Kang, Bingyi; Deng, Yong

    2016-01-18

    Sensor data fusion plays an important role in fault diagnosis. Dempster-Shafer (D-S) evidence theory is widely used in fault diagnosis, since it efficiently combines evidence from different sensors. However, when the evidence is highly conflicting, it may yield counterintuitive results. To address this issue, a new method is proposed in this paper. Both static and dynamic sensor reliability are taken into consideration. The evidence distance function and the belief entropy are combined to obtain the dynamic reliability of each sensor report. A weighted averaging method is adopted to modify the conflicting evidence by assigning different weights to evidence according to sensor reliability. The proposed method performs better in conflict management and fault diagnosis because the information volume of each sensor report is taken into consideration. An application in fault diagnosis based on sensor fusion is illustrated to show the efficiency of the proposed method. The results show that the proposed method improves the accuracy of fault diagnosis from 81.19% to 89.48% compared to the existing methods.
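
    A minimal Python sketch of the two standard belief-function operations underlying methods like this one: discounting each sensor's evidence by a reliability factor, then fusing with Dempster's rule. Classical Shafer discounting is used here as a stand-in; the paper's own weighted-averaging modification and its distance/entropy-based weights are not reproduced, and the frame, masses, and reliability values are invented.

        from itertools import product

        THETA = frozenset({"F1", "F2", "F3"})   # fault hypotheses

        def discount(m, alpha):
            """Shafer discounting: scale masses by alpha, move the rest to THETA."""
            out = {A: alpha * v for A, v in m.items()}
            out[THETA] = out.get(THETA, 0.0) + (1 - alpha)
            return out

        def dempster(m1, m2):
            """Dempster's rule of combination for two mass functions."""
            combined, conflict = {}, 0.0
            for (A, v1), (B, v2) in product(m1.items(), m2.items()):
                inter = A & B
                if inter:
                    combined[inter] = combined.get(inter, 0.0) + v1 * v2
                else:
                    conflict += v1 * v2
            return {A: v / (1 - conflict) for A, v in combined.items()}

        s1 = {frozenset({"F1"}): 0.9, THETA: 0.1}               # sensor 1 report
        s2 = {frozenset({"F2"}): 0.8, frozenset({"F1"}): 0.1, THETA: 0.1}
        fused = dempster(discount(s1, 0.9), discount(s2, 0.6))  # sensor 2 less reliable
        for A, v in sorted(fused.items(), key=lambda kv: -kv[1]):
            print(set(A), round(v, 3))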

  20. Teamwork as an essential component of high-reliability organizations.

    PubMed

    Baker, David P; Day, Rachel; Salas, Eduardo

    2006-08-01

    Organizations are increasingly becoming dynamic and unstable. This evolution has given rise to greater reliance on teams and increased complexity in terms of team composition, skills required, and degree of risk involved. High-reliability organizations (HROs) are those that exist in such hazardous environments where the consequences of errors are high, but the occurrence of error is extremely low. In this article, we argue that teamwork is an essential component of achieving high reliability particularly in health care organizations. We describe the fundamental characteristics of teams, review strategies in team training, demonstrate the criticality of teamwork in HROs and finally, identify specific challenges the health care community must address to improve teamwork and enhance reliability.

  1. Generalizability Theory as a Unifying Framework of Measurement Reliability in Adolescent Research

    ERIC Educational Resources Information Center

    Fan, Xitao; Sun, Shaojing

    2014-01-01

    In adolescence research, the treatment of measurement reliability is often fragmented, and it is not always clear how different reliability coefficients are related. We show that generalizability theory (G-theory) is a comprehensive framework of measurement reliability, encompassing all other reliability methods (e.g., Pearson "r,"…

  2. 18 CFR 39.3 - Electric Reliability Organization certification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Electric Reliability Organization certification. 39.3 Section 39.3 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT RULES...

  3. Theory of reliable systems. [reliability analysis and on-line fault diagnosis

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1974-01-01

    Research is reported in the program to refine the current notion of system reliability by identifying and investigating attributes of a system which are important to reliability considerations, and to develop techniques which facilitate analysis of system reliability. Reliability analysis and on-line fault diagnosis are discussed.

  4. Putting "Organizations" into an Organization Theory Course: A Hybrid CAO Model for Teaching Organization Theory

    ERIC Educational Resources Information Center

    Hannah, David R.; Venkatachary, Ranga

    2010-01-01

    In this article, the authors present a retrospective analysis of an instructor's multiyear redesign of a course on organization theory into what is called a hybrid Classroom-as-Organization model. It is suggested that this new course design served to apprentice students to function in quasi-real organizational structures. The authors further argue…

  5. Comparison of Reliability Measures under Factor Analysis and Item Response Theory

    ERIC Educational Resources Information Center

    Cheng, Ying; Yuan, Ke-Hai; Liu, Cheng

    2012-01-01

    Reliability of test scores is one of the most pervasive psychometric concepts in measurement. Reliability coefficients based on a unifactor model for continuous indicators include maximal reliability rho and an unweighted sum score-based omega, among many others. With increasing popularity of item response theory, a parallel reliability measure pi…

  6. Reliability theory for receptor-ligand bond dissociation

    NASA Astrophysics Data System (ADS)

    Tees, David F. J.; Woodward, John T.; Hammer, David A.

    2001-05-01

    Cell adhesion in the presence of hydrodynamic forces is a critical factor in inflammation, cancer metastasis, and blood clotting. A number of assays have recently been developed to apply forces to small numbers of the receptor-ligand bonds responsible for adhesion. Examples include assays using hydrodynamic shear in flow chambers or elastic probe deflection assays such as the atomic force microscope or the biomembrane force probe. One wishes to use the data on the time distribution of dissociation from these assays to derive information on the force dependence of reaction rates, an important determinant of cell adhesive behavior. The dissociation process can be described using the theory developed for reliability engineering of electronic components and networks. We use this framework along with the Bell model for the reverse reaction rate (k_r = k_r^0 exp[r_0 f/kT], where f is the applied force and k_r^0 and r_0 are Bell model parameters) to write closed form expressions for the probability distribution of break-up with multiple independent or interacting bonds. These expressions show that the average lifetime of n bonds scales with the nth harmonic number multiplied by the lifetime of a single bond. Results from calculation and simulations are used to describe the effect of experimental procedures in forced unbinding assays on the estimation of parameters for the force dependence of reverse reaction rates.
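
    The harmonic-number scaling quoted above is easy to check for the independent-bond case: while i bonds remain, the wait for the next failure is exponential with mean 1/(i k_r), so the mean time for all n bonds to break is the single-bond lifetime times H_n = sum over i of 1/i. A short Python sketch, with arbitrary Bell-model parameter values:

        import math

        def bell_rate(f, kr0=1.0, r0=0.5, kT=4.114):   # kT ~ 4.114 pN*nm at 298 K
            """Bell model reverse rate k_r = k_r^0 * exp(r_0 * f / kT)."""
            return kr0 * math.exp(r0 * f / kT)

        def mean_lifetime(n, f):
            """Mean time for n independent bonds to all dissociate."""
            tau1 = 1.0 / bell_rate(f)                  # single-bond mean lifetime
            harmonic = sum(1.0 / i for i in range(1, n + 1))
            return tau1 * harmonic

        for n in (1, 2, 5):
            print(n, "bonds at f = 20 pN:", round(mean_lifetime(n, 20.0), 4), "s")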

  7. 76 FR 58101 - Electric Reliability Organization Interpretation of Transmission Operations Reliability Standard

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-20

    ... of Reliability Standard, TOP-001-1, Requirement R8, which pertains to the restoration of real and... Requirement R8 in Commission-approved NERC Reliability Standard TOP-001-1-- Reliability Responsibilities and... 107 Reliability Standards filed by NERC, including Reliability Standard TOP-001-1.\\4\\ \\2\\...

  8. Educational Management Organizations as High Reliability Organizations: A Study of Victory's Philadelphia High School Reform Work

    ERIC Educational Resources Information Center

    Thomas, David E.

    2013-01-01

    This executive position paper proposes recommendations for designing reform models between public and private sectors dedicated to improving school reform work in low performing urban high schools. It reviews scholarly research about for-profit educational management organizations, high reliability organizations, American high school reform, and…

  9. Reliability correction for functional connectivity: Theory and implementation.

    PubMed

    Mueller, Sophia; Wang, Danhong; Fox, Michael D; Pan, Ruiqi; Lu, Jie; Li, Kuncheng; Sun, Wei; Buckner, Randy L; Liu, Hesheng

    2015-11-01

    Network properties can be estimated using functional connectivity MRI (fcMRI). However, regional variation of the fMRI signal causes systematic biases in network estimates including correlation attenuation in regions of low measurement reliability. Here we computed the spatial distribution of fcMRI reliability using longitudinal fcMRI datasets and demonstrated how pre-estimated reliability maps can correct for correlation attenuation. As a test case of reliability-based attenuation correction we estimated properties of the default network, where reliability was significantly lower than average in the medial temporal lobe and higher in the posterior medial cortex, heterogeneity that impacts estimation of the network. Accounting for this bias using attenuation correction revealed that the medial temporal lobe's contribution to the default network is typically underestimated. To render this approach useful to a greater number of datasets, we demonstrate that test-retest reliability maps derived from repeated runs within a single scanning session can be used as a surrogate for multi-session reliability mapping. Using data segments with different scan lengths between 1 and 30 min, we found that test-retest reliability of connectivity estimates increases with scan length while the spatial distribution of reliability is relatively stable even at short scan lengths. Finally, analyses of tertiary data revealed that reliability distribution is influenced by age, neuropsychiatric status and scanner type, suggesting that reliability correction may be especially important when studying between-group differences. Collectively, these results illustrate that reliability-based attenuation correction is an easily implemented strategy that mitigates certain features of fMRI signal nonuniformity.
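
    The classic Spearman disattenuation formula conveys the intuition behind reliability-based correction: divide the observed correlation by the geometric mean of the two regions' reliabilities. The sketch below uses invented numbers, and the paper's exact estimator may differ in detail.

        import math

        def disattenuate(r_obs, rel_a, rel_b):
            """Spearman correction for attenuation due to unreliability."""
            r_true = r_obs / math.sqrt(rel_a * rel_b)
            return min(r_true, 1.0)   # cap at the theoretical maximum

        r_obs = 0.35      # observed MTL <-> posterior-medial correlation
        rel_mtl = 0.45    # low test-retest reliability in medial temporal lobe
        rel_pmc = 0.80    # higher reliability in posterior medial cortex
        print("corrected r =", round(disattenuate(r_obs, rel_mtl, rel_pmc), 3))
        # corrected r = 0.583: the MTL contribution was underestimated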

  10. Theory of reliable systems. [systems analysis and design

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1973-01-01

    The analysis and design of reliable systems are discussed. The attributes of system reliability studied are fault tolerance, diagnosability, and reconfigurability. Objectives of the study include: to determine properties of system structure that are conducive to a particular attribute; to determine methods for obtaining reliable realizations of a given system; and to determine how properties of system behavior relate to the complexity of fault tolerant realizations. A list of 34 references is included.

  11. 76 FR 23222 - Electric Reliability Organization Interpretation of Transmission Operations Reliability

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-26

    ...) proposed interpretation of Reliability Standard, TOP-001-1, Requirement R8. DATES: Comments are due June 27... Requirement R8 in Commission-approved NERC Reliability Standard TOP-001-1 -- Reliability Responsibilities and... by NERC, including Reliability Standard TOP-001-1.\\5\\ \\3\\ Rules Concerning Certification of...

  12. Conceptualizing Essay Tests' Reliability and Validity: From Research to Theory

    ERIC Educational Resources Information Center

    Badjadi, Nour El Imane

    2013-01-01

    The current paper on writing assessment surveys the literature on the reliability and validity of essay tests. The paper aims to examine the two concepts in relationship with essay testing as well as to provide a snapshot of the current understandings of the reliability and validity of essay tests as drawn in recent research studies. Bearing in…

  13. Tsallis statistics in reliability analysis: Theory and methods

    NASA Astrophysics Data System (ADS)

    Zhang, Fode; Shi, Yimin; Ng, Hon Keung Tony; Wang, Ruibing

    2016-10-01

    Tsallis statistics, which is based on a non-additive entropy characterized by an index q, is a very useful tool in physics and statistical mechanics. This paper presents an application of Tsallis statistics in reliability analysis. We first show that the q-gamma and incomplete q-gamma functions are q-generalized. Then, three commonly used statistical distributions in reliability analysis are introduced in Tsallis statistics, and the corresponding reliability characteristics including the reliability function, hazard function, cumulative hazard function and mean time to failure are investigated. In addition, we study the statistical inference based on censored reliability data. Specifically, we investigate the point and interval estimation of the model parameters of the q-exponential distribution based on the maximum likelihood method. Simulated and real-life datasets are used to illustrate the methodologies discussed in this paper. Finally, some concluding remarks are provided.
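
    As a pointer for readers, the q-exponential survival and hazard functions have simple closed forms (for 1 < q < 2, reducing to the ordinary exponential as q approaches 1). A small Python sketch with illustrative parameter values:

        import numpy as np

        def reliability(t, q, lam):
            """R(t) = [1 - (1-q)*lam*t]^((2-q)/(1-q)), valid while the base > 0."""
            return (1 - (1 - q) * lam * t) ** ((2 - q) / (1 - q))

        def hazard(t, q, lam):
            """h(t) = (2-q)*lam / (1 - (1-q)*lam*t), decreasing in t for q > 1."""
            return (2 - q) * lam / (1 - (1 - q) * lam * t)

        t = np.array([0.0, 1.0, 5.0, 10.0])
        for q in (1.001, 1.5):   # q near 1 tracks the ordinary exponential
            print(f"q={q}: R(t)={np.round(reliability(t, q, 0.2), 3)}",
                  f"h(t)={np.round(hazard(t, q, 0.2), 3)}")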

  14. 75 FR 14097 - Revision to Electric Reliability Organization Definition of Bulk Electric System

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-24

    ... Reliability Organization Definition of Bulk Electric System March 18, 2010. AGENCY: Federal Energy Regulatory... Electric Reliability Organization (ERO) to revise its definition of the term ``bulk electric system'' to... direct the Electric Reliability Organization (ERO) to revise its definition of the term ``bulk...

  15. Reliability theory for diffusion processes on interconnected networks

    NASA Astrophysics Data System (ADS)

    Khorramzadeh, Yasamin; Youssef, Mina; Eubank, Stephen

    2014-03-01

    We present the concept of network reliability as a framework to study diffusion dynamics in interdependent networks. We illustrate how different outcomes of diffusion processes, such as cascading failure, can be studied by estimating the reliability polynomial under different reliability rules. As an example, we investigate the effect of structural properties on diffusion dynamics for a few different topologies of two coupled networks. We evaluate the effect of varying the probability of failure propagating along the edges, both within a single network as well as between the networks. We exhibit the sensitivity of interdependent network reliability and connectivity to edge failures in each topology.
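
    A reliability polynomial R(p) of this kind can be estimated by direct Monte Carlo: sample surviving edge sets and test the reliability rule. The Python sketch below uses plain connectedness as the rule and a toy topology of two triangles joined by two interconnections; both choices are illustrative stand-ins for the richer diffusion rules studied in the paper.

        import random

        EDGES = [(0, 1), (1, 2), (2, 0),        # network A (triangle)
                 (3, 4), (4, 5), (5, 3),        # network B (triangle)
                 (0, 3), (2, 5)]                # interconnections
        N = 6

        def connected(edges, n):
            """Union-find check that the surviving edges span one component."""
            parent = list(range(n))
            def find(x):
                while parent[x] != x:
                    parent[x] = parent[parent[x]]
                    x = parent[x]
                return x
            for u, v in edges:
                parent[find(u)] = find(v)
            return len({find(i) for i in range(n)}) == 1

        def reliability(p, trials=20_000):
            """Estimate R(p): each edge survives independently with prob p."""
            ok = sum(connected([e for e in EDGES if random.random() < p], N)
                     for _ in range(trials))
            return ok / trials

        random.seed(0)
        for p in (0.5, 0.7, 0.9):
            print(f"R({p}) ~ {reliability(p):.3f}")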

  16. A PERSPECTIVE ON RELIABILITY: PROBABILITY THEORY AND BEYOND

    SciTech Connect

    J. M. BOOKER; N. D. SINGPURWALLA

    2001-05-01

    Reliability assessment in the coming era is inclined to be characterized by a difficult dilemma. On the one hand, units and systems will be required to be ultra reliable; on the other hand, it may not be possible to subject them to full-scale testing. A case in point occurs where testing is limited, as with one-of-a-kind complex systems such as space exploration vehicles, or where severe testing constraints are imposed, such as full-scale testing of strategic nuclear weapons prohibited by test ban treaties and international agreements. Decision makers also require reliability assessments for problems with terabytes of data, such as from complex simulations of system performance. Quantitative measures of reliability and their associated uncertainties will remain integral to system monitoring and tactical decision making. The challenge is to derive these defensible measures in light of these dilemmas. Because reliability is usually defined as a probability that the system performs to its required specification, probability enters into the heart of these dilemmas, both philosophically and practically. This paper provides an overview of the several interpretations of probability as they relate to reliability and to the uncertainties involved. The philosophical issues pertain to the interpretation and the quantification of reliability. For example, how must we interpret a number like 10⁻⁹ for the failure rate of an airplane flight or an electrical power plant? Such numbers are common, particularly in the context of safety. Does it mean one failure in 10⁹ identical, or almost identical, trials? Are identical trials physically possible, let alone the fact that 10⁹ trials can take generations to perform? How can we make precise the notion of almost identical trials? If the trials are truly identical, then all of them must produce the same outcome, and so the reliability must be either one or zero. However tautologies, like certainty and impossibility, can

  17. Understanding organic photovoltaic cells: Electrode, nanostructure, reliability, and performance

    NASA Astrophysics Data System (ADS)

    Kim, Myung-Su

    My Ph.D. research has focused on alternative renewable energy using organic semiconductors. During my study, first, I have established reliable characterization methods of organic photovoltaic devices. More specifically, less than 5% variation of power conversion efficiency of fabricated organic blend photovoltaic cells (OBPC) was achieved after optimization. The reproducibility of organic photovoltaic cell performance is one of the essential issues that must be clarified before beginning serious investigations of the application of creative and challenging ideas. Second, the relationships between fill factor (FF) and process variables have been demonstrated with series and shunt resistance, and this provided a chance to understand the electrical device behavior. In the blend layer, series resistance (Rs) and shunt resistance (Rsh) were varied by controlling the morphology of the blend layer, the regioregularity of the conjugated polymer, and the thickness of the blend layer. At the interface between the cathode including PEDOT:PSS and the blend layer, cathode conductivity was controlled by varying the structure of the cathode or adding an additive. Third, we thoroughly examined possible characterization mistakes in OPVC. One significant characterization mistake is observed when the crossbar electrode geometry of OPVC using PEDOT:PSS was fabricated and characterized with illumination which is larger than the actual device area. The hypothesis to explain this overestimation was excess photo-current generated from the cell region outside the overlapped electrode area, where PEDOT:PSS plays as anode and this was clearly supported with investigations. Finally, I incorporated a creative idea, which enhances the exciton dissociation efficiency by increasing the interface area between donor and acceptor to improve the power conversion efficiency of organic photovoltaic cells. To achieve this, nanoimprint lithography was applied for interface area increase. To clarify the

  18. Test Theories, Educational Priorities and Reliability of Public Examinations in England

    ERIC Educational Resources Information Center

    Baird, Jo-Anne; Black, Paul

    2013-01-01

    Much has already been written on the controversies surrounding the use of different test theories in educational assessment. Other authors have noted the prevalence of classical test theory over item response theory in practice. This Special Issue draws together articles based upon work conducted on the Reliability Programme for England's…

  19. 18 CFR 39.10 - Changes to an Electric Reliability Organization Rule or Regional Entity Rule.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Reliability Organization Rule or Regional Entity Rule. 39.10 Section 39.10 Conservation of Power and Water... RULES CONCERNING CERTIFICATION OF THE ELECTRIC RELIABILITY ORGANIZATION; AND PROCEDURES FOR THE ESTABLISHMENT, APPROVAL, AND ENFORCEMENT OF ELECTRIC RELIABILITY STANDARDS § 39.10 Changes to an...

  20. Bi-Factor Multidimensional Item Response Theory Modeling for Subscores Estimation, Reliability, and Classification

    ERIC Educational Resources Information Center

    Md Desa, Zairul Nor Deana

    2012-01-01

    In recent years, there has been increasing interest in estimating and improving subscore reliability. In this study, the multidimensional item response theory (MIRT) and the bi-factor model were combined to estimate subscores, to obtain subscores reliability, and subscores classification. Both the compensatory and partially compensatory MIRT…

  1. Using Metaphors to Teach Organization Theory

    ERIC Educational Resources Information Center

    Taber, Tom D.

    2007-01-01

    Metaphors were used to teach systems thinking and to clarify concepts of organizational theory in an introductory MBA management course. Gareth Morgan's metaphors of organization were read by students and applied as frames to analyze a business case. In addition, personal metaphors were written by individual students in order to describe the…

  2. The Progress of Theory in Knowledge Organization.

    ERIC Educational Resources Information Center

    Smiraglia, Richard P.

    2002-01-01

    Presents a background on theory in knowledge organization, which has moved from an epistemic stance of pragmatism and rationalism (based on observation of the construction of retrieval tools), to empiricism (based on the results of empirical research). Discusses historicism, external validity, classification, user-interface design, and…

  3. Reliability of the Measure of Acceptance of the Theory of Evolution (MATE) Instrument with University Students

    ERIC Educational Resources Information Center

    Rutledge, Michael L.; Sadler, Kim C.

    2007-01-01

    The Measure of Acceptance of the Theory of Evolution (MATE) instrument was initially designed to assess high school biology teachers' acceptance of evolutionary theory. To determine if the MATE instrument is reliable with university students, it was administered to students in a non-majors biology course (n = 61) twice over a 3-week period.…

  4. 78 FR 38851 - Electric Reliability Organization Proposal To Retire Requirements in Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-28

    ... R5--Facility Ratings FAC-010-2.1, Requirement R5--System Operating Limits Methodology for the Planning Horizon FAC-011-2.1, Requirement R5--System Operating Limits Methodology for the Operations... system operating limits used in the reliable planning of the bulk electric system are determined based...

  5. Theory of organic magnetoresistance in disordered organic semiconductors

    NASA Astrophysics Data System (ADS)

    Harmon, Nicholas J.; Flatté, Michael E.

    2012-10-01

    The understanding of spin transport in organics has been challenged by the discovery of large magnetic field effects on properties such as conductivity and electroluminescence in a wide array of organic systems. To explain the large organic magnetoresistance (OMAR) phenomenon, we present and solve a model for magnetoresistance in positionally disordered organic materials using percolation theory. The model describes the effects of singlet-triplet spin transitions on hopping transport by considering the role of spin dynamics on an effective density of hopping sites. Faster spin transitions open up 'spin-blocked' pathways to become viable conduction channels and hence produce magnetoresistance. We concentrate on spin transitions under the effects of the hyperfine (isotropic and anisotropic), exchange, and dipolar interactions. The magnetoresistance can be found analytically in several regimes and explains several experimental observations.

  6. Organic magnetoresistance based on hopping theory

    NASA Astrophysics Data System (ADS)

    Yang, Fu-Jiang; Xie, Shi-Jie

    2014-09-01

    For the organic magnetoresistance (OMAR) effect, we propose a spin-related hopping model for carriers (polarons) based on Marcus theory. The mobility of polarons is calculated with the master equation (ME), and the magnetoresistance (MR) is then obtained. The theoretical results are consistent with experimental observations. In particular, the sign inversion of the MR under different driving bias voltages found in experiment is predicted. In addition, the effects of molecular disorder, hyperfine interaction (HFI), polaron localization, and temperature on the MR are investigated.

  7. 75 FR 72909 - Revision to Electric Reliability Organization Definition of Bulk Electric System

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-26

    ... Definition of Bulk Electric System; Final Rule #0;#0;Federal Register / Vol. 75 , No. 227 / Friday, November... CFR Part 40 Revision to Electric Reliability Organization Definition of Bulk Electric System Issued... Proposed Rulemaking to require the Electric Reliability Organization (ERO) to revise its definition of...

  8. 76 FR 16263 - Revision to Electric Reliability Organization Definition of Bulk Electric System

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-23

    ... Energy Regulatory Commission 18 CFR Part 40 Revision to Electric Reliability Organization Definition of... Electric Reliability Organization (ERO) to revise the definition of the term ``bulk electric system... technical concerns and ensure that the definition encompasses all facilities necessary for operating...

  9. Development of the centralized reliability data organization (CREDO)

    SciTech Connect

    Bott, T. F.; Cunningham, G. W.; Greene, N. M.; Haas, P. M.; Hudson, S. D.; Knee, H. E.; Manning, J. J.

    1980-01-01

    The Centralized Reliability Data Organization (CREDO) has been established by the Reactor Research and Technology Division of the Department of Energy (RRT/DOE) at Oak Ridge National Laboratory (ORNL). Its primary functions are the collection, evaluation, and dissemination of reliability/availability data pertaining to advanced reactors. Associated information and analysis services will be provided to users. Interface and cooperative data exchange with existing US and international data banks is an integral part of CREDO's program plan. This paper outlines the design and operation of the proposed system and summarizes the status of its development. The schedule for developing CREDO has been lengthened as appropriate to the current schedule for development of advanced reactors in the US, but the initial development phase is nearing completion, and demonstration of system capabilities is anticipated prior to the end of FY 1980.

  10. Mathematic Modeling of Complex Hydraulic Machinery Systems When Evaluating Reliability Using Graph Theory

    NASA Astrophysics Data System (ADS)

    Zemenkova, M. Yu; Shipovalov, A. N.; Zemenkov, Yu D.

    2016-04-01

    The main technological equipment of hydrocarbon pipeline transport consists of hydraulic machines. In oil transportation, centrifugal pumps designed to work in the “pumping station-pipeline” system are mainly used. A standard pumping station consists of several pumps and complex hydraulic piping. The authors have developed a set of models and algorithms, based on reliability theory, for calculating the system reliability of pumps. As an example, one estimation method applying graph theory is considered.
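
    A minimal sketch of the series/parallel reduction that underlies such graph-based models: pumps in a redundant group are combined in parallel, and the groups and piping segments are combined in series. The component reliabilities below are invented.

        import math

        def parallel(rs):
            """At least one of the redundant components works."""
            return 1 - math.prod(1 - r for r in rs)

        def series(rs):
            """Every component in the chain must work."""
            return math.prod(rs)

        pump_group = parallel([0.95, 0.95, 0.95])   # 3 redundant centrifugal pumps
        station = series([0.999,                    # suction piping
                          pump_group,               # pump group
                          0.998])                   # discharge piping
        print(f"station reliability = {station:.6f}")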

  11. Some Characteristics of One Type of High Reliability Organization.

    ERIC Educational Resources Information Center

    Roberts, Karlene H.

    1990-01-01

    Attempts to define organizational processes necessary to operate safely technologically complex organizations. Identifies nuclear powered aircraft carriers as examples of potentially hazardous organizations with histories of excellent operations. Discusses how carriers deal with components of risk and antecedents to catastrophe cited by Perrow and…

  12. Reliability analysis of the objective structured clinical examination using generalizability theory

    PubMed Central

    Trejo-Mejía, Juan Andrés; Sánchez-Mendiola, Melchor; Méndez-Ramírez, Ignacio; Martínez-González, Adrián

    2016-01-01

    Background The objective structured clinical examination (OSCE) is a widely used method for assessing clinical competence in health sciences education. Studies using this method have shown evidence of validity and reliability. There are no published studies of OSCE reliability measurement with generalizability theory (G-theory) in Latin America. The aims of this study were to assess the reliability of an OSCE in medical students using G-theory and explore its usefulness for quality improvement. Methods An observational cross-sectional study was conducted at National Autonomous University of Mexico (UNAM) Faculty of Medicine in Mexico City. A total of 278 fifth-year medical students were assessed with an 18-station OSCE in a summative end-of-career final examination. There were four exam versions. G-theory with a crossover random effects design was used to identify the main sources of variance. Examiners, standardized patients, and cases were considered as a single facet of analysis. Results The exam was applied to 278 medical students. The OSCE had a generalizability coefficient of 0.93. The major components of variance were stations, students, and residual error. The sites and the versions of the tests had minimum variance. Conclusions Our study achieved a G coefficient similar to that found in other reports, which is acceptable for summative tests. G-theory allows the estimation of the magnitude of multiple sources of error and helps decision makers to determine the number of stations, test versions, and examiners needed to obtain reliable measurements. PMID:27543188

  13. Reliability analysis of the objective structured clinical examination using generalizability theory.

    PubMed

    Trejo-Mejía, Juan Andrés; Sánchez-Mendiola, Melchor; Méndez-Ramírez, Ignacio; Martínez-González, Adrián

    2016-01-01

    Background The objective structured clinical examination (OSCE) is a widely used method for assessing clinical competence in health sciences education. Studies using this method have shown evidence of validity and reliability. There are no published studies of OSCE reliability measurement with generalizability theory (G-theory) in Latin America. The aims of this study were to assess the reliability of an OSCE in medical students using G-theory and explore its usefulness for quality improvement. Methods An observational cross-sectional study was conducted at National Autonomous University of Mexico (UNAM) Faculty of Medicine in Mexico City. A total of 278 fifth-year medical students were assessed with an 18-station OSCE in a summative end-of-career final examination. There were four exam versions. G-theory with a crossover random effects design was used to identify the main sources of variance. Examiners, standardized patients, and cases were considered as a single facet of analysis. Results The exam was applied to 278 medical students. The OSCE had a generalizability coefficient of 0.93. The major components of variance were stations, students, and residual error. The sites and the versions of the tests had minimum variance. Conclusions Our study achieved a G coefficient similar to that found in other reports, which is acceptable for summative tests. G-theory allows the estimation of the magnitude of multiple sources of error and helps decision makers to determine the number of stations, test versions, and examiners needed to obtain reliable measurements.
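
    A back-of-envelope illustration of where such a G coefficient comes from in a students x stations design: once variance components are estimated (e.g., by ANOVA), the relative generalizability coefficient is universe-score variance over itself plus relative error. The variance components in the sketch are invented, not taken from the study.

        # One-facet random p x s design (students crossed with stations).
        var_p   = 0.60   # student (universe-score) variance
        var_s   = 0.25   # station difficulty variance (enters absolute error only)
        var_pse = 0.90   # student-by-station interaction + residual error
        n_s     = 18     # stations in the OSCE

        g_relative   = var_p / (var_p + var_pse / n_s)
        phi_absolute = var_p / (var_p + (var_s + var_pse) / n_s)
        print(f"G (relative) = {g_relative:.2f}, Phi (absolute) = {phi_absolute:.2f}")

    Raising or lowering n_s in the last two lines reproduces the decision-study logic mentioned in the conclusions: how many stations are needed for a target level of reliability.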

  14. Estimating Reliability of School-Level Scores Using Multilevel and Generalizability Theory Models

    ERIC Educational Resources Information Center

    Jeon, Min-Jeong; Lee, Guemin; Hwang, Jeong-Won; Kang, Sang-Jin

    2009-01-01

    The purpose of this study was to investigate the methods of estimating the reliability of school-level scores using generalizability theory and multilevel models. Two approaches, "student within schools" and "students within schools and subject areas," were conceptualized and implemented in this study. Four methods resulting from the combination…

  15. Generalizability Theory Reliability of Written Expression Curriculum-Based Measurement in Universal Screening

    ERIC Educational Resources Information Center

    Keller-Margulis, Milena A.; Mercer, Sterett H.; Thomas, Erin L.

    2016-01-01

    The purpose of this study was to examine the reliability of written expression curriculum-based measurement (WE-CBM) in the context of universal screening from a generalizability theory framework. Students in second through fifth grade (n = 145) participated in the study. The sample included 54% female students, 49% White students, 23% African…

  16. Assessing Academic Advising Outcomes Using Social Cognitive Theory: A Validity and Reliability Study

    ERIC Educational Resources Information Center

    Erlich, Richard J.; Russ-Eft, Darlene F.

    2012-01-01

    The validity and reliability of three instruments, the "Counselor Rubric for Gauging Student Understanding of Academic Planning," micro-analytic questions, and the "Student Survey for Understanding Academic Planning," all based on social cognitive theory, were tested as means to assess self-efficacy and self-regulated learning in college academic…

  17. Test Reliability and the Kuder-Richardson Formulas: Derivation from Probability Theory

    ERIC Educational Resources Information Center

    Zimmerman, Donald W.

    1972-01-01

    Although a great deal of attention has been devoted over a period of years to the estimation of reliability from item statistics, there are still gaps in the mathematical derivation of the Kuder-Richardson results. The main purpose of this paper is to fill some of these gaps, using language consistent with modern probability theory. (Author)
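
    For concreteness, KR-20 follows directly from its definition: with k dichotomous items, reliability is (k/(k-1))(1 - sum of p_i q_i over the total-score variance). A small Python sketch with a fabricated response matrix:

        import numpy as np

        X = np.array([[1, 1, 0, 1, 1],
                      [1, 0, 0, 1, 0],
                      [1, 1, 1, 1, 1],
                      [0, 0, 0, 1, 0],
                      [1, 1, 0, 0, 1]])          # rows = examinees, cols = items

        k = X.shape[1]
        p = X.mean(axis=0)                       # item difficulties
        item_var = (p * (1 - p)).sum()           # sum of p_i * q_i
        total_var = X.sum(axis=1).var(ddof=0)    # variance of total scores

        kr20 = (k / (k - 1)) * (1 - item_var / total_var)
        print(f"KR-20 = {kr20:.3f}")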

  18. Score Reliability of a Test Composed of Passage-Based Testlets: A Generalizability Theory Perspective.

    ERIC Educational Resources Information Center

    Lee, Yong-Won

    The purpose of this study was to investigate the impact of local item dependence (LID) in passage-based testlets on the test score reliability of an English as a Foreign Language (EFL) reading comprehension test from the perspective of generalizability (G) theory. Definitions and causes of LID in passage-based testlets are reviewed within the…

  19. Test-retest reliability of graph theory measures of structural brain connectivity.

    PubMed

    Dennis, Emily L; Jahanshad, Neda; Toga, Arthur W; McMahon, Katie L; de Zubicaray, Greig I; Martin, Nicholas G; Wright, Margaret J; Thompson, Paul M

    2012-01-01

    The human connectome has recently become a popular research topic in neuroscience, and many new algorithms have been applied to analyze brain networks. In particular, network topology measures from graph theory have been adapted to analyze network efficiency and 'small-world' properties. While there has been a surge in the number of papers examining connectivity through graph theory, questions remain about its test-retest reliability (TRT). In particular, the reproducibility of structural connectivity measures has not been assessed. We examined the TRT of global connectivity measures generated from graph theory analyses of 17 young adults who underwent two high-angular resolution diffusion (HARDI) scans approximately 3 months apart. Of the measures assessed, modularity had the highest TRT, and it was stable across a range of sparsities (a thresholding parameter used to define which network edges are retained). These reliability measures underline the need to develop network descriptors that are robust to acquisition parameters.
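
    Test-retest reliability of a graph measure such as modularity is often summarized with an intraclass correlation. The sketch below implements the two-way random-effects ICC(2,1) from its ANOVA mean squares; the subjects-by-sessions values are invented, and the paper's exact TRT statistic may differ.

        import numpy as np

        def icc_2_1(data):
            """ICC(2,1) for a subjects x sessions matrix (Shrout & Fleiss)."""
            n, k = data.shape
            grand = data.mean()
            ms_rows = k * ((data.mean(axis=1) - grand) ** 2).sum() / (n - 1)
            ms_cols = n * ((data.mean(axis=0) - grand) ** 2).sum() / (k - 1)
            sse = ((data - data.mean(axis=1, keepdims=True)
                         - data.mean(axis=0, keepdims=True) + grand) ** 2).sum()
            ms_err = sse / ((n - 1) * (k - 1))
            return (ms_rows - ms_err) / (
                ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

        # Hypothetical modularity values at scan 1 and scan 2 for six subjects.
        modularity = np.array([[0.41, 0.43], [0.38, 0.37], [0.45, 0.44],
                               [0.36, 0.39], [0.50, 0.48], [0.42, 0.42]])
        print(f"ICC(2,1) = {icc_2_1(modularity):.2f}")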

  20. Leading Change: Transitioning the AFMS into a High Reliability Organization

    DTIC Science & Technology

    2016-02-16

    Air War College, Air University. Leading Change: Transitioning the AFMS into a High Reliability Organization, by Robert K. Bogart. The views expressed in this academic research paper are those of the author and do not reflect the official policy or position of the US government, the Department of Defense, or Air University. In accordance with Air Force Instruction 51-303, it is not copyrighted, but is the property of the United States government.

  1. 77 FR 59745 - Delegation of Authority Regarding Electric Reliability Organization's Budget, Delegation...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-01

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission 18 CFR Part 375 Delegation of Authority Regarding Electric Reliability Organization's Budget, Delegation Agreement, and Policy and Procedure Filings AGENCY: Federal Energy...

  2. Influencing Organizations to Promote Health: Applying Stakeholder Theory

    ERIC Educational Resources Information Center

    Kok, Gerjo; Gurabardhi, Zamira; Gottlieb, Nell H.; Zijlstra, Fred R. H.

    2015-01-01

    Stakeholder theory may help health promoters to make changes at the organizational and policy level to promote health. A stakeholder is any individual, group, or organization that can influence an organization. The organization that is the focus for influence attempts is called the focal organization. The more salient a stakeholder is and the more…

  3. Sensor Reliability Evaluation Scheme for Target Classification Using Belief Function Theory

    PubMed Central

    Zhu, Jing; Luo, Yupin; Zhou, Jianjun

    2013-01-01

    In target classification based on belief function theory, sensor reliability evaluation has two basic issues: a reasonable dissimilarity measure among evidences, and adaptive combination of static and dynamic discounting. A solution to both issues is proposed here. First, an improved dissimilarity measure based on a dualistic exponential function has been designed. We assess the static reliability from a training set by the local decision of each sensor and the dissimilarity measure among evidences. The dynamic reliability factors are obtained from each test target using the dissimilarity measure between the output information of each sensor and the consensus. Second, an adaptive combination method of static and dynamic discounting has been introduced. We adopt a Parzen window to estimate the matching degree between the current performance and the static performance of the sensor. Through fuzzy theory, the fusion system can self-learn and self-adapt as sensor performance changes. Experiments conducted on real databases demonstrate that our proposed scheme performs better in target classification under different target conditions compared with other methods. PMID:24351632
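
    A common concrete choice for an evidence dissimilarity measure is the Jousselme distance, sketched below as a stand-in; the improved dualistic-exponential measure proposed in the paper is not reproduced, and the frame and mass values are invented.

        import numpy as np

        # Jousselme distance between two mass functions over a small frame:
        # d = sqrt(0.5 * (m1-m2)^T D (m1-m2)), with D_XY = |X & Y| / |X | Y|.
        FOCALS = [frozenset({"A"}), frozenset({"B"}), frozenset({"A", "B"})]

        def jousselme(m1, m2):
            v = np.array([m1.get(F, 0.0) - m2.get(F, 0.0) for F in FOCALS])
            D = np.array([[len(X & Y) / len(X | Y) for Y in FOCALS]
                          for X in FOCALS])
            return float(np.sqrt(0.5 * v @ D @ v))

        s1 = {frozenset({"A"}): 0.8, frozenset({"A", "B"}): 0.2}
        s2 = {frozenset({"B"}): 0.7, frozenset({"A", "B"}): 0.3}
        print(f"evidence distance = {jousselme(s1, s2):.3f}")
        # larger values indicate more conflicting sensor reports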

  4. Organizations or Communities? Changing the Metaphor Changes the Theory.

    ERIC Educational Resources Information Center

    Sergiovanni, Thomas J.

    Educational administration has been shaped by the metaphor of organization. From organizational and management theory, and from economics, the parent of organizational theory, educational administration has borrowed definitions of quality, productivity, and efficiency; strategies to achieve them; and theories of human nature and motivation.…

  5. Generalizability theory reliability of written expression curriculum-based measurement in universal screening.

    PubMed

    Keller-Margulis, Milena A; Mercer, Sterett H; Thomas, Erin L

    2016-09-01

    The purpose of this study was to examine the reliability of written expression curriculum-based measurement (WE-CBM) in the context of universal screening from a generalizability theory framework. Students in second through fifth grade (n = 145) participated in the study. The sample included 54% female students, 49% White students, 23% African American students, 17% Hispanic students, 8% Asian students, and 3% of students identified as 2 or more races. Of the sample, 8% were English Language Learners and 6% were students receiving special education. Three WE-CBM probes were administered for 7 min each at 3 time points across 1 year. Writing samples were scored for commonly used WE-CBM metrics (e.g., correct minus incorrect word sequences; CIWS). Results suggest that nearly half the variance in WE-CBM is related to unsystematic error and that conventional screening procedures (i.e., the use of one 3-min sample) do not yield scores with adequate reliability for relative or absolute decisions about student performance. In most grades, three 3-min writing samples (or 2 longer duration samples) were required for adequate reliability for relative decisions, and three 7-min writing samples would not yield adequate reliability for relative decisions about within-year student growth. Implications and recommendations are discussed.

  6. 18 CFR 39.4 - Funding of the Electric Reliability Organization.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Reliability Organization. 39.4 Section 39.4 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT RULES CONCERNING... interruption as it transitions from one method of funding to another. Any proposed transitional funding...

  7. 78 FR 29209 - Revisions to Electric Reliability Organization Definition of Bulk Electric System and Rules of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-17

    ... 17, 2013 Part II Department of Energy Federal Energy Regulatory Commission 18 CFR Part 40 Revisions...; ] DEPARTMENT OF ENERGY Federal Energy Regulatory Commission 18 CFR Part 40 Revisions to Electric Reliability Organization Definition of Bulk Electric System and Rules of Procedure AGENCY: Federal Energy...

  8. Research on High Reliability Organizations: Implications for School Effects Research, Policy, and Educational Practice.

    ERIC Educational Resources Information Center

    Stringfield, Sam

    Current theorizing in education, as in industry, is largely devoted to explaining trial-and-error, failure-tolerant, low-reliability organizations. This article examines changing societal demands on education and argues that effective responses to those demands require new and different organizational structures. Schools must abandon industrial…

  9. A Novel Evaluation Method for Building Construction Project Based on Integrated Information Entropy with Reliability Theory

    PubMed Central

    Bai, Xiao-ping; Zhang, Xi-wei

    2013-01-01

    Selecting construction schemes of the building engineering project is a complex multiobjective optimization decision process, in which many indexes need to be selected to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes, uses the quantitative method for the cost index, uses integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes, and integrates engineering economics, reliability theories, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, this paper also presents detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing the score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making the analysis and decision. The presented method can offer valuable references for risk computation in building construction projects. PMID:23533352

  10. A novel evaluation method for building construction project based on integrated information entropy with reliability theory.

    PubMed

    Bai, Xiao-ping; Zhang, Xi-wei

    2013-01-01

    Selecting construction schemes of the building engineering project is a complex multiobjective optimization decision process, in which many indexes need to be selected to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes, uses the quantitative method for the cost index, uses integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes, and integrates engineering economics, reliability theories, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, this paper also presents detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing the score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making the analysis and decision. The presented method can offer valuable references for risk computation in building construction projects.

  11. Understanding Schools as High-Reliability Organizations: An Exploratory Examination of Teachers' and School Leaders' Perceptions of Success

    ERIC Educational Resources Information Center

    Lorton, Juli A.; Bellamy, G. Thomas; Reece, Anne; Carlson, Jill

    2013-01-01

    Drawing on research on high-reliability organizations, this interview-based qualitative case study employs four characteristics of such organizations as a lens for analyzing the operations of one very successful K-5 public school. Results suggest that the school had processes similar to those characteristic of high-reliability organizations: a…

  12. A Systems Theory View of Organizations as Communication Networks.

    ERIC Educational Resources Information Center

    Schwartz, Donald F.

    Focusing on the analysis of communication networks within organizations with an eye toward implications for study of external communication, this paper (1) develops a systems theory/communication view of the nature of formal organizations, (2) illustrates the notion of holistic organizational communication networks in organizations which include…

  13. Using the Reliability Theory for Assessing the Decision Confidence Probability for Comparative Life Cycle Assessments.

    PubMed

    Wei, Wei; Larrey-Lassalle, Pyrène; Faure, Thierry; Dumoulin, Nicolas; Roux, Philippe; Mathias, Jean-Denis

    2016-03-01

    Comparative decision making is widely used to identify which option (system, product, service, etc.) has the smaller environmental footprint and to provide recommendations that help stakeholders make future decisions. However, uncertainty complicates the comparison and the decision making. Probability-based decision support in LCA is a way to help stakeholders in their decision-making process. It calculates the decision confidence probability, which expresses the probability that one option has a smaller environmental impact than another. Here we apply reliability theory to approximate the decision confidence probability. We compare the traditional Monte Carlo method with the first-order reliability method (FORM). The Monte Carlo method needs high computational time to calculate the decision confidence probability. The FORM method enables us to approximate the decision confidence probability with fewer simulations than the Monte Carlo method by approximating the response surface. Moreover, the FORM method calculates the associated importance factors, which correspond to a sensitivity analysis with respect to the probability. The importance factors allow stakeholders to determine which factors influence their decision. Our results clearly show that the reliability method provides additional useful information to stakeholders while also reducing computational time.
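
    The quantity both methods target is easy to state in code. A minimal Monte Carlo sketch of the decision confidence probability with illustrative lognormal impact distributions (FORM approximates the same probability from far fewer model evaluations by linearizing the limit state around the most probable point):

        import random

        # P(option A has a smaller impact than option B), estimated by
        # sampling the uncertain impacts; parameters are illustrative.
        def decision_confidence(n=100_000, seed=1):
            rng = random.Random(seed)
            wins = 0
            for _ in range(n):
                impact_a = rng.lognormvariate(0.0, 0.4)
                impact_b = rng.lognormvariate(0.2, 0.4)
                wins += impact_a < impact_b
            return wins / n

        print(decision_confidence())  # ~0.64 for these parameters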

  14. Patient safety in anesthesia: learning from the culture of high-reliability organizations.

    PubMed

    Wright, Suzanne M

    2015-03-01

    There has been an increased awareness of and interest in patient safety and improved outcomes, as well as a growing body of evidence substantiating medical error as a leading cause of death and injury in the United States. According to The Joint Commission, US hospitals demonstrate improvements in health care quality and patient safety. Although this progress is encouraging, much room for improvement remains. High-reliability organizations, industries that deliver reliable performances in the face of complex working environments, can serve as models of safety for our health care system until plausible explanations for patient harm are better understood.

  15. Hospital-based fall program measurement and improvement in high reliability organizations.

    PubMed

    Quigley, Patricia A; White, Susan V

    2013-05-31

    Falls and fall injuries in hospitals are the most frequently reported adverse events among adults in the inpatient setting. Advancing measurement and improvement around falls prevention in the hospital is important, as falls are a nurse-sensitive measure and nurses play a key role in this component of patient care. A framework for applying the concepts of high reliability organizations to falls prevention programs is described, including discussion of the core characteristics of such a model and determination of its impact at the patient, unit, and organizational levels. This article showcases the components of a patient safety culture and the integration of these components with fall prevention, the role of nurses, and high reliability.

  16. Central Perspectives and Debates in Organization Theory.

    ERIC Educational Resources Information Center

    Astley, W. Graham; Van de Ven, Andrew H.

    1983-01-01

    Classifies organizational theories, by analytical level and assumptions about human nature, into four perspectives (system-structural, strategic choice, natural selection, collective action), each with different concepts of organizational structure, behavior, change, and managerial roles. Identifies six debates generated among the perspectives and…

  17. Cultural Organization: Fragments of a Theory,

    DTIC Science & Technology

    1983-11-01

    In B. Staw & L. L. Cummings (eds.), Research in Organization Behavior, Vol. 6. Greenwich, CT: JAI Press. Larson, M. S., The Rise of Professionalism. Berkeley: University of California Press, 1977. Lawrence, P. R. and Lorsch, J. W., Organization and... Pondy and P. Frost (eds.), Organizational Symbolism. Greenwich, CT: JAI Press, forthcoming. Manning, P. K., Metaphors of the Field. Administrative Science

  18. Organizations and Social Systems: Organization Theory's Neglected Mandate.

    ERIC Educational Resources Information Center

    Stern, Robert N.; Barley, Stephen R.

    1996-01-01

    The social-systems perspective in organizational theory faded because the increasing complexity of social relations hindered determination of an appropriate unit of analysis. Also, the business-school environment in which organizational research occurred discouraged examination of broad social questions, promoted a particular approach to science,…

  19. Influencing organizations to promote health: applying stakeholder theory.

    PubMed

    Kok, Gerjo; Gurabardhi, Zamira; Gottlieb, Nell H; Zijlstra, Fred R H

    2015-04-01

    Stakeholder theory may help health promoters to make changes at the organizational and policy level to promote health. A stakeholder is any individual, group, or organization that can influence an organization. The organization that is the focus for influence attempts is called the focal organization. The more salient a stakeholder is and the more central in the network, the stronger the influence. As stakeholders, health promoters may use communicative, compromise, deinstitutionalization, or coercive methods through an ally or a coalition. A hypothetical case study, involving adolescent use of harmful legal products, illustrates the process of applying stakeholder theory to strategic decision making.

  20. Enhance the lifetime and bias stress reliability in organic vertical transistor by UV/Ozone treatment

    NASA Astrophysics Data System (ADS)

    Lin, Hung-Cheng; Chang, Ming-Yu; Zan, Hsiao-Wen; Meng, Hsin-Fei; Chao, Yu-Chiang

    In this paper, we use UV/Ozone treatment to improve the lifetime and bias stress reliability of an organic transistor with a vertical channel. Although the vertical organic transistor exhibits better bias stress reliability than the organic field effect transistor (OFET) due to its bulk conduction mechanism, poor lifetime performance remains a challenge. Treating the vertical channel with octadecyltrichlorosilane (OTS) can reduce the trapping states and hence improve bias stress stability; however, the off-current is much higher after 6 days and lifetime performance degrades. In contrast, after 4000 s of on-state bias stress, a stable output current and on/off current ratio are demonstrated by using UV/Ozone to treat the vertical channels. The threshold voltage shift is only -0.02 V, much smaller than that of an OFET with the same organic semiconductor material. Furthermore, the output current is enhanced by an order of magnitude. Unlike devices with OTS treatment, no obvious degradation is observed for UV/Ozone-treated devices even after 170 days. With UV/Ozone treatment, the output current, bias stress reliability, and lifetime are all improved, making the vertical transistor a promising device for future applications in display technology and flexible electronics.

  1. Theory and modeling of stereoselective organic reactions.

    PubMed

    Houk, K N; Paddon-Row, M N; Rondan, N G; Wu, Y D; Brown, F K; Spellmeyer, D C; Metz, J T; Li, Y; Loncharich, R J

    1986-03-07

    Theoretical investigations of the transition structures of additions and cycloadditions reveal details about the geometries of bond-forming processes that are not directly accessible by experiment. The conformational analysis of transition states has been developed from theoretical generalizations about the preferred angle of attack by reagents on multiple bonds and predictions of conformations with respect to partially formed bonds. Qualitative rules for the prediction of the stereochemistries of organic reactions have been devised, and semi-empirical computational models have also been developed to predict the stereoselectivities of reactions of large organic molecules, such as nucleophilic additions to carbonyls, electrophilic hydroborations and cycloadditions, and intramolecular radical additions and cycloadditions.

  2. Cliophysics: socio-political reliability theory, polity duration and African political (in)stabilities.

    PubMed

    Cherif, Alhaji; Barley, Kamal

    2010-12-29

    Quantification of historical sociological processes has recently gained attention among theoreticians in the effort to provide a solid theoretical understanding of the behaviors and regularities present in socio-political dynamics. Here we present a reliability theory of polity processes with emphasis on the individual political dynamics of African countries. We find that the structural properties of polity failure rates successfully capture the risk of political vulnerability and instability, in which varying fractions of the countries with monotonically increasing, unimodal, U-shaped, and monotonically decreasing polity failure rates, respectively, have a high level of state fragility indices. The quasi-U-shaped relationship between average polity duration and regime type corroborates historical precedents and explains the stability of autocracies and democracies.
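
    For readers unfamiliar with the failure-rate shapes being classified, the Weibull hazard is the textbook way to produce monotonically increasing or decreasing rates; this sketch is purely illustrative and is not the authors' polity model:

        # Weibull hazard h(t) = (k/lam) * (t/lam)**(k-1): shape k < 1 gives a
        # monotonically decreasing rate, k = 1 a constant rate, k > 1 an
        # increasing rate; mixing regimes yields the U-shaped pattern.
        def weibull_hazard(t, k, lam=20.0):
            return (k / lam) * (t / lam) ** (k - 1)

        for k in (0.5, 1.0, 1.5):
            print(k, [round(weibull_hazard(t, k), 4) for t in (5, 10, 40)])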

  3. The Organization of the Living: A Theory of the Living Organization

    ERIC Educational Resources Information Center

    Maturana, H. R.

    1975-01-01

    Article presents a theory of the organization of living systems as autonomous entities, and a theory of the organization of the nervous system as a closed network of interacting neurons structurally coupled to the living system to whose realization it contributes. (Author)

  4. Cross Cultural Perspectives of the Learning Organization: Assessing the Validity and Reliability of the DLOQ in Korea

    ERIC Educational Resources Information Center

    Song, Ji Hoon; Kim, Jin Yong; Chermack, Thomas J.; Yang; Baiyin

    2008-01-01

    The primary purpose of this research was to adapt the Dimensions of Learning Organization Questionnaire (DLOQ) from Watkins and Marsick (1993, 1996) and examine its validity and reliability in a Korean context. Results indicate that the DLOQ produces valid and reliable scores of learning organization characteristics in a Korean cultural context.…

  5. Theory and modeling of stereoselective organic reactions

    SciTech Connect

    Houk, K.N.; Paddon-Row, M.N.; Rondan, N.G.; Wu, Y.D.; Brown, F.K.; Spellmeyer, D.C.; Metz, J.T.; Li, Y.; Loncharich, R.J.

    1986-03-07

    Theoretical investigations of the transition structures of additions and cycloadditions reveal details about the geometries of bond-forming processes that are not directly accessible by experiment. The conformational analysis of transition states has been developed from theoretical generalizations about the preferred angle of attack by reagents on multiple bonds and predictions of conformations with respect to partially formed bonds. Qualitative rules for the prediction of the stereochemistries of organic reactions have been devised, and semi-empirical computational models have also been developed to predict the stereoselectivities of reactions of large organic molecules, such as nucleophilic additions to carbonyls, electrophilic hydroborations and cycloadditions, and intramolecular radical additions and cycloadditions. 52 references, 7 figures.

  6. Prolegomena to a Theory of Organization

    DTIC Science & Technology

    1951-12-10

    of loyalty, the "esprit de corps," corruption, the fact that some leaders of organizations seem to be able to obtain a greater effect than others... such as a cartel, labor union, etc., the cost-profit principle is immediately seen to be in grave jeopardy. We shall return to this below. But first... the first case; the difficulties some highly concentrated European economies have encountered may be illustrations for the second case. However

  7. Teaching Organization Theory and Practice: An Experiential and Reflective Approach

    ERIC Educational Resources Information Center

    Cameron, Mark; Turkiewicz, Rita M.; Holdaway, Britt A.; Bill, Jacqueline S.; Goodman, Jessica; Bonner, Aisha; Daly, Stacey; Cohen, Michael D.; Lorenz, Cassandra; Wilson, Paul R.; Rusk, James

    2009-01-01

    The organization is often the overlooked level in social work's ecological perspective. However, organizational realities exert a profound influence on human development and well-being as well as the nature and quality of social work practice. This article describes a model of teaching organization theory and practice which requires master's…

  8. Do Advance Organizers Facilitate Learning? A Review of Subsumption Theory.

    ERIC Educational Resources Information Center

    McEneany, John E.

    1990-01-01

    A review of four studies conducted by Ausubel raises serious doubts about the efficacy of advance organizers under a variety of circumstances. In addition, this review questions the adequacy of definitions for two central notions of subsumption theory (discriminability and advance organizer). (IAH)

  9. A win for HROs. Employing high-reliability organization characteristics in EMS.

    PubMed

    Heightman, A J

    2013-06-01

    Was I insubordinate, arrogant or disrespectful? You may feel that I was. But in reality, I was educated to a level that could have been validated and should have been respected by command. I was, in fact, practicing a key aspect of HRO. I was stopping an obvious dangerous condition before it could harm or kill emergency responders. My IC colleague knew it from the facts presented and, in fact, joked with me about my "subtle sarcasm" and moved the perimeter to the recommended half-mile distance. Did I win, or did a proactive HRO win? Actually, HRO won and potentially saved 30 lives. I simply presented the hazards of CFC inhalation. A high-reliability organization must not rely on only one source of data when detailed information on a hazard isn't immediately available, or if it isn't very informative during an emergency decision-making process. Read "EMS & High Reliability Organizing: Achieving safety & reliability in the dynamic, high-risk environment and practice its important principles," pp. 60-63. It's really common sense, not rocket science, and may save you, your crews or others in your community.

  10. The tissue organization field theory of cancer: A testable replacement for the somatic mutation theory

    PubMed Central

    Soto, Ana M.; Sonnenschein, Carlos

    2014-01-01

    The somatic mutation theory (SMT) of cancer has been and remains the prevalent theory attempting to explain how neoplasms arise and progress. This theory proposes that cancer is a clonal, cell-based disease, and implicitly assumes that quiescence is the default state of cells in multicellular organisms. The SMT has not been rigorously tested, and several lines of evidence raise questions that are not addressed by this theory. Herein, we propose experimental strategies that may validate the SMT. We also call attention to an alternative theory of carcinogenesis, the tissue organization field theory (TOFT), which posits that cancer is a tissue-based disease and that proliferation is the default state of all cells. Based on epistemological and experimental evidence, we argue that the TOFT compellingly explains carcinogenesis, while placing it within an evolutionarily relevant context. PMID:21503935

  11. Carcinogenesis explained within the context of a theory of organisms.

    PubMed

    Sonnenschein, Carlos; Soto, Ana M

    2016-10-01

    For a century, the somatic mutation theory (SMT) has been the prevalent theory to explain carcinogenesis. According to the SMT, cancer is a cellular problem, and thus, the level of organization where it should be studied is the cellular level. Additionally, the SMT proposes that cancer is a problem of the control of cell proliferation and assumes that proliferative quiescence is the default state of cells in metazoa. In 1999, a competing theory, the tissue organization field theory (TOFT), was proposed. In contraposition to the SMT, the TOFT posits that cancer is a tissue-based disease whereby carcinogens (directly) and mutations in the germ-line (indirectly) alter the normal interactions between the diverse components of an organ, such as the stroma and its adjacent epithelium. The TOFT explicitly acknowledges that the default state of all cells is proliferation with variation and motility. When taking into consideration the principle of organization, we posit that carcinogenesis can be explained as a relational problem whereby release of the constraints created by cell interactions and the physical forces generated by cellular agency lead cells within a tissue to regain their default state of proliferation with variation and motility. Within this perspective, what matters both in morphogenesis and carcinogenesis is not only molecules, but also biophysical forces generated by cells and tissues. Herein, we describe how the principles for a theory of organisms apply to the TOFT and thus to the study of carcinogenesis.

  12. Measuring theory of mind across middle childhood: Reliability and validity of the Silent Films and Strange Stories tasks.

    PubMed

    Devine, Rory T; Hughes, Claire

    2016-09-01

    Recent years have seen a growth of research on the development of children's ability to reason about others' mental states (or "theory of mind") beyond the narrow confines of the preschool period. The overall aim of this study was to investigate the psychometric properties of a task battery composed of items from Happé's Strange Stories task and Devine and Hughes' Silent Film task. A sample of 460 ethnically and socially diverse children (211 boys) between 7 and 13 years of age completed the task battery at two time points separated by 1 month. The Strange Stories and Silent Film tasks were strongly correlated even when verbal ability and narrative comprehension were taken into account, and all items loaded onto a single theory-of-mind latent factor. The theory-of-mind latent factor provided reliable estimates of performance across a wide range of theory-of-mind ability and showed no evidence of differential item functioning across gender, ethnicity, or socioeconomic status. The theory-of-mind latent factor also exhibited strong 1-month test-retest reliability, and this stability did not vary as a function of child characteristics. Taken together, these findings provide evidence for the validity and reliability of the Strange Stories and Silent Film task battery as a measure of individual differences in theory of mind suitable for use across middle childhood. We consider the methodological and conceptual implications of these findings for research on theory of mind beyond the preschool years.

  13. A modelling approach to find stable and reliable soil organic carbon values for further regionalization.

    NASA Astrophysics Data System (ADS)

    Bönecke, Eric; Franko, Uwe

    2015-04-01

    Soil organic matter (SOM) and soil organic carbon (SOC) may be the most important components for describing the fertility of agriculturally used soils. SOC is sensitive to temporal and spatial changes due to varying weather conditions, uneven crops, and soil management practices, and reliable delineation of its spatial variability remains difficult. Soil organic carbon is, furthermore, an essential initial parameter for dynamic modelling and for understanding, e.g., carbon and nitrogen processes. However, obtaining and using this information requires cost- and time-intensive field and laboratory work. The objective of this study is to assess an approach that reduces the effort of laboratory and field analyses by using a method to find stable initial soil organic carbon values for further soil process modelling and regionalization at the field scale. Strategies, techniques, and tools that improve reliable high-resolution soil organic carbon maps while reducing cost constraints are hence still receiving increasing attention in scientific research. Although combining effective sampling schemes with geophysical sensing techniques is nowadays a widely used practice for describing the within-field variability of soil organic carbon, large uncertainties remain a challenge, even at the field scale, in both science and agriculture. An analytical and modelling approach might therefore facilitate and improve this strategy at small and large field scales. This study presents a method for finding reliable steady-state values of soil organic carbon at particular points using the approved soil process model CANDY (Franko et al. 1995). It focuses on an iterative algorithm that adjusts the key driving components: soil physical properties, meteorological data, and management information, for which we quantified the inputs and losses of soil carbon (manure, crop residues, other organic inputs, decomposition, leaching). Furthermore, this approach can be combined with geophysical
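
    The steady-state idea can be caricatured with a one-pool carbon balance; CANDY itself tracks multiple pools with site-specific drivers, so the rates and inputs below are hypothetical placeholders:

        # One-pool spin-up toward steady state: dC/dt = input - k*C has the
        # fixed point C* = input / k; iterating annual balances approaches it.
        def spin_up(c0, carbon_input, k, years=200):
            c = c0
            for _ in range(years):
                c += carbon_input - k * c
            return c

        print(spin_up(c0=30.0, carbon_input=1.2, k=0.02))  # -> about 1.2/0.02 = 60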

  14. The Development of the Functional Literacy Experience Scale Based upon Ecological Theory (FLESBUET) and Validity-Reliability Study

    ERIC Educational Resources Information Center

    Özenç, Emine Gül; Dogan, M. Cihangir

    2014-01-01

    This study aims to perform a validity-reliability test by developing the Functional Literacy Experience Scale based upon Ecological Theory (FLESBUET) for primary education students. The study group includes 209 fifth grade students at Sabri Taskin Primary School in the Kartal District of Istanbul, Turkey during the 2010-2011 academic year.…

  15. Using G-Theory to Enhance Evidence of Reliability and Validity for Common Uses of the Paulhus Deception Scales.

    PubMed

    Vispoel, Walter P; Morris, Carrie A; Kilinc, Murat

    2016-04-13

    We applied a new approach to Generalizability theory (G-theory) involving parallel splits and repeated measures to evaluate common uses of the Paulhus Deception Scales based on polytomous and four types of dichotomous scoring. G-theory indices of reliability and validity accounting for specific-factor, transient, and random-response measurement error supported use of polytomous over dichotomous scores as contamination checks; as control, explanatory, and outcome variables; as aspects of construct validation; and as indexes of environmental effects on socially desirable responding. Polytomous scoring also provided results for flagging faking as dependable as those when using dichotomous scoring methods. These findings argue strongly against the nearly exclusive use of dichotomous scoring for the Paulhus Deception Scales in practice and underscore the value of G-theory in demonstrating this. We provide guidelines for applying our G-theory techniques to other objectively scored clinical assessments, for using G-theory to estimate how changes to a measure might improve reliability, and for obtaining software to conduct G-theory analyses free of charge.

  16. Organic unity theory: the mind-body problem revisited.

    PubMed

    Goodman, A

    1991-05-01

    The purpose of this essay is to delineate the conceptual framework for psychiatry as an integrated and integrative science that unites the mental and the physical. Four basic philosophical perspectives concerning the relationship between mind and body are introduced. The biopsychosocial model, at this time the preeminent model in medical science that addresses this relationship, is examined and found to be flawed. Mental-physical identity theory is presented as the most valid philosophical approach to understanding the relationship between mind and body. Organic unity theory is then proposed as a synthesis of the biopsychosocial model and mental-physical identity theory in which the difficulties of the biopsychosocial model are resolved. Finally, some implications of organic unity theory for psychiatry are considered. 1) The conventional dichotomy between physical (organic) and mental (functional) is linguistic/conceptual rather than inherent in nature, and all events and processes involved in the etiology, pathogenesis, symptomatic manifestation, and treatment of psychiatric disorders are simultaneously biological and psychological. 2) Neuroscience requires new conceptual models to comprehend the integrated and emergent physiological processes to which psychological phenomena correspond. 3) Introspective awareness provides data that are valid for scientific inquiry and is the most direct method of knowing psychophysical events. 4) Energy currently being expended in disputes between biological and psychological psychiatry would be more productively invested in attempting to formulate the conditions under which each approach is maximally effective.

  17. The chronic toxicity of molybdate to marine organisms. I. Generating reliable effects data.

    PubMed

    Heijerick, D G; Regoli, L; Stubblefield, W

    2012-07-15

    A scientific research program was initiated by the International Molybdenum Association (IMOA) which addressed identified gaps in the environmental toxicity data for the molybdate ion (MoO(4)(2-)). These gaps were previously identified during the preparation of EU-REACH-dossiers for different molybdenum compounds (European Union regulation on Registration, Evaluation, Authorization and Restriction of Chemical substances; EC, 2006). Evaluation of the open literature identified few reliable marine ecotoxicological data that could be used for deriving a Predicted No-Effect Concentration (PNEC) for the marine environment. Rather than calculating a PNEC(marine) using the assessment factor methodology on a combined freshwater/marine dataset, IMOA decided to generate sufficient reliable marine chronic data to permit derivation of a PNEC by means of the more scientifically robust species sensitivity distribution (SSD) approach (also called the statistical extrapolation approach). Nine test species were chronically exposed to molybdate (added as sodium molybdate dihydrate, Na(2)MoO(4)·2H(2)O) according to published standard testing guidelines that are acceptable for a broad range of regulatory purposes. The selected test organisms were representative of typical marine trophic levels: micro-algae/diatom (Phaeodactylum tricornutum, Dunaliella tertiolecta), macro-alga (Ceramium tenuicorne), mysids (Americamysis bahia), copepod (Acartia tonsa), fish (Cyprinodon variegatus), echinoderms (Dendraster excentricus, Strongylocentrotus purpuratus) and molluscs (Mytilus edulis, Crassostrea gigas). Available NOEC/EC(10) levels ranged between 4.4 mg Mo/L (blue mussel M. edulis) and 1174 mg Mo/L (oyster C. gigas). Using all reliable marine chronic effects data that are currently available, a HC(5,50%) (median hazardous concentration affecting 5% of the species) of 5.74 mg Mo/L was derived with the statistical extrapolation approach, a value that can be used for national and
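
    The SSD calculation itself is compact: fit a distribution to the chronic endpoints and read off the 5th percentile. A log-normal sketch bracketed by the two endpoint values quoted above, with hypothetical values in between, so the printed HC5 is illustrative rather than the study's 5.74 mg Mo/L:

        import math, statistics

        # Hypothetical spread between the quoted extremes of 4.4 and 1174 mg Mo/L.
        noecs = [4.4, 12.0, 30.0, 55.0, 120.0, 260.0, 400.0, 800.0, 1174.0]
        logs = [math.log(x) for x in noecs]
        mu, sigma = statistics.mean(logs), statistics.stdev(logs)

        z05 = -1.6449  # 5th percentile of the standard normal distribution
        hc5 = math.exp(mu + z05 * sigma)
        print(f"HC5 ~ {hc5:.1f} mg Mo/L")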

  18. Validity and Reliability of Published Comprehensive Theory of Mind Tests for Normal Preschool Children: A Systematic Review

    PubMed Central

    Ziatabar Ahmadi, Seyyede Zohreh; Jalaie, Shohreh; Ashayeri, Hassan

    2015-01-01

    Objective: Theory of mind (ToM) or mindreading is an aspect of social cognition that evaluates mental states and beliefs of oneself and others. Validity and reliability are very important criteria when evaluating standard tests; without them, these tests are not usable. The aim of this study was to systematically review the validity and reliability of published English comprehensive ToM tests developed for normal preschool children. Method: We searched MEDLINE (PubMed interface), Web of Science, Science Direct, PsycINFO, and evidence-based medicine (The Cochrane Library) databases from 1990 to June 2015. The search strategy was the Latin transcription of ‘Theory of Mind’ AND test AND children. We also manually studied the reference lists of all finally searched articles and carried out a search of their references. Inclusion criteria were as follows: valid and reliable diagnostic ToM tests published from 1990 to June 2015 for normal preschool children. Exclusion criteria were as follows: studies that only used ToM tests and single tasks (false-belief tasks) for ToM assessment and/or had no description of the structure, validity, or reliability of their tests. The methodological quality of the selected articles was assessed using the Critical Appraisal Skills Programme (CASP). Result: In the primary search, we found 1237 articles across all databases. After removing duplicates and applying all inclusion and exclusion criteria, we selected 11 tests for this systematic review. Conclusion: There were few valid, reliable, and comprehensive ToM tests for normal preschool children. However, we had limitations concerning the included articles. The defined ToM tests differed in populations, tasks, modes of presentation, scoring, modes of response, timing, and other variables. They also had various validities and reliabilities. Therefore, it is recommended that researchers and clinicians select ToM tests according to their psychometric characteristics

  19. Portable SERS-enabled micropipettes for microarea sampling and reliably quantitative detection of surface organic residues.

    PubMed

    Fang, Wei; Zhang, Xinwei; Chen, Yong; Wan, Liang; Huang, Weihua; Shen, Aiguo; Hu, Jiming

    2015-09-15

    We report the first microsampling device for reliably quantitative, label-free, and separation-free detection of multiple components of surface organic residues (SORs) by means of a quality-controllable surface-enhanced Raman scattering (SERS)-enabled micropipette. The micropipette comprises a drawn glass capillary with a tiny orifice (∼50 μm) at the distal tip, where specially designed nanorattles (NRs) are compactly coated on the inner wall surface. SERS signals of 4-mercaptobenzoic acid (MBA) anchored inside the internal gap of the NRs can be used to evaluate and control the quality of the micropipettes and therefore allow us to overcome the limitations of a reliably quantitative SERS assay using traditional substrates without an internal standard. By dropping a trace extraction agent on target SORs located on a narrow surface, the capillary and SERS functionalities of these micropipettes allow on-site microsampling via capillary action and subsequent multiplex distinction/detection thanks to their molecularly narrow Raman peaks. For example, 8 nM thiram (TMTD), 8 nM malachite green (MG), and 1.5 μM (400 ppb) methyl parathion (MPT) on pepper and cucumber peels have been simultaneously detected over a wide detection range. The portable SERS-enabled device could potentially be incorporated with liquid-liquid or solid-phase micro-extraction devices for a broader range of applications in the rapid field analysis of food/public/environment security-related SORs.

  20. Utilizing Generalizability Theory to Investigate the Reliability of the Grades Assigned to Undergraduate Research Papers

    ERIC Educational Resources Information Center

    Gugiu, Mihaiela R.; Gugiu, Paul C.; Baldus, Robert

    2012-01-01

    Background: Educational researchers have long espoused the virtues of writing with regard to student cognitive skills. However, research on the reliability of the grades assigned to written papers reveals a high degree of contradiction, with some researchers concluding that the grades assigned are very reliable whereas others suggesting that they…

  1. Investigating Postgraduate College Admission Interviews: Generalizability Theory Reliability and Incremental Predictive Validity

    ERIC Educational Resources Information Center

    Arce-Ferrer, Alvaro J.; Castillo, Irene Borges

    2007-01-01

    The use of face-to-face interviews is controversial for college admissions decisions in light of the lack of availability of validity and reliability evidence for most college admission processes. This study investigated reliability and incremental predictive validity of a face-to-face postgraduate college admission interview with a sample of…

  2. Organization Theory, Political Theory, and the International Arena: Some Hope But Very Little Time.

    ERIC Educational Resources Information Center

    Thayer, Frederick C.

    This paper presents background on a non-hierarchical organizational perspective. In addition, it presents guidelines for using a non-hierarchical perspective to create generally acceptable forms of international organizations. The theory on which the non-hierarchical perspective is based maintains that a form of comprehensive global planning…

  3. Organic unity theory: an integrative mind-body theory for psychiatry.

    PubMed

    Goodman, A

    1997-12-01

    The potential of psychiatry as an integrative science has been impeded by an internal schism that derives from the duality of mental and physical. Organic unity theory is proposed as a conceptual framework that brings together the terms of the mind-body duality in one coherent perspective. Organic unity theory is braided of three strands: identity, which describes the relationship between mentally described events and corresponding physically described events; continuity, which describes the linguistic-conceptual system that contains both mental and physical terms; and dialectic, which describes the relationship between the empirical way of knowing that is associated with the physical domain of the linguistic-conceptual system and the hermeneutic way of knowing that is associated with the mental domain. Each strand represents an integrative formulation that resolves an aspect of mental-physical dualism into an underlying unity. After the theory is presented, its implications for psychiatry are briefly considered.

  4. Magnetoelectroluminescence of organic heterostructures: Analytical theory and spectrally resolved measurements

    SciTech Connect

    Liu, Feilong; Kelley, Megan R.; Crooker, Scott A.; Nie, Wanyi; Mohite, Aditya D.; Ruden, P. Paul; Smith, Darryl L.

    2014-12-22

    The effect of a magnetic field on the electroluminescence of organic light emitting devices originates from the hyperfine interaction between the electron/hole polarons and the hydrogen nuclei of the host molecules. In this paper, we present an analytical theory of magnetoelectroluminescence for organic semiconductors. To be specific, we focus on bilayer heterostructure devices. In the case we are considering, light generation at the interface of the donor and acceptor layers results from the formation and recombination of exciplexes. The spin physics is described by a stochastic Liouville equation for the electron/hole spin density matrix. By finding the steady-state analytical solution using Bloch-Wangsness-Redfield theory, we explore how the singlet/triplet exciplex ratio is affected by the hyperfine interaction strength and by the external magnetic field. In order to validate the theory, spectrally resolved electroluminescence experiments on BPhen/m-MTDATA devices are analyzed. With increasing emission wavelength, the width of the magnetic field modulation curve of the electroluminescence increases while its depth decreases. Furthermore, these observations are consistent with the model.

  5. Magnetoelectroluminescence of organic heterostructures: Analytical theory and spectrally resolved measurements

    DOE PAGES

    Liu, Feilong; Kelley, Megan R.; Crooker, Scott A.; ...

    2014-12-22

    The effect of a magnetic field on the electroluminescence of organic light emitting devices originates from the hyperfine interaction between the electron/hole polarons and the hydrogen nuclei of the host molecules. In this paper, we present an analytical theory of magnetoelectroluminescence for organic semiconductors. To be specific, we focus on bilayer heterostructure devices. In the case we are considering, light generation at the interface of the donor and acceptor layers results from the formation and recombination of exciplexes. The spin physics is described by a stochastic Liouville equation for the electron/hole spin density matrix. By finding the steady-state analytical solution using Bloch-Wangsness-Redfield theory, we explore how the singlet/triplet exciplex ratio is affected by the hyperfine interaction strength and by the external magnetic field. In order to validate the theory, spectrally resolved electroluminescence experiments on BPhen/m-MTDATA devices are analyzed. With increasing emission wavelength, the width of the magnetic field modulation curve of the electroluminescence increases while its depth decreases. Furthermore, these observations are consistent with the model.

  6. Magnetoelectroluminescence of organic heterostructures: Analytical theory and spectrally resolved measurements

    NASA Astrophysics Data System (ADS)

    Liu, Feilong; Kelley, Megan R.; Crooker, Scott A.; Nie, Wanyi; Mohite, Aditya D.; Ruden, P. Paul; Smith, Darryl L.

    2014-12-01

    The effect of a magnetic field on the electroluminescence of organic light emitting devices originates from the hyperfine interaction between the electron/hole polarons and the hydrogen nuclei of the host molecules. In this paper, we present an analytical theory of magnetoelectroluminescence for organic semiconductors. To be specific, we focus on bilayer heterostructure devices. In the case we are considering, light generation at the interface of the donor and acceptor layers results from the formation and recombination of exciplexes. The spin physics is described by a stochastic Liouville equation for the electron/hole spin density matrix. By finding the steady-state analytical solution using Bloch-Wangsness-Redfield theory, we explore how the singlet/triplet exciplex ratio is affected by the hyperfine interaction strength and by the external magnetic field. To validate the theory, spectrally resolved electroluminescence experiments on BPhen/m-MTDATA devices are analyzed. With increasing emission wavelength, the width of the magnetic field modulation curve of the electroluminescence increases while its depth decreases. These observations are consistent with the model.

  7. Application of SAW method for multiple-criteria comparative analysis of the reliability of heat supply organizations

    NASA Astrophysics Data System (ADS)

    Akhmetova, I. G.; Chichirova, N. D.

    2016-12-01

    Heat supply is the most energy-consuming sector of the economy: approximately 30% of all primary fuel-and-energy resources are spent on municipal heat-supply needs. One of the key indicators of the activity of heat-supply organizations is the reliability of the energy facility. The reliability index of a heat-supply organization is of interest to potential investors for assessing the risks of investing in projects. The reliability indices established by federal legislation in effect reduce to a single numerical factor, which depends on the number of heat-supply outages caused by disturbances in the operation of heat networks and the volume of their resource recovery in the calculation year. This factor is rather subjective and may vary widely over several years. A technique is proposed for evaluating the reliability of heat-supply organizations using the simple additive weighting (SAW) method. The technique for determining the integrated index satisfies the following conditions: the reliability level of the evaluated heat-supply system is represented as fully and objectively as possible, and the information used for the evaluation is easily available (it is published on the Internet in accordance with data-disclosure standards). The following indicators were selected for the reliability estimation: the wear of equipment at thermal energy sources, the wear of heat networks, the number of outages in the supply of thermal energy (heat carrier) due to technological disturbances in heat networks per 1 km of heat network, the number of outages in the supply of thermal energy (heat carrier) due to technological disturbances at thermal energy sources per 1 Gcal/h of installed power, the share of expenditures in the cost of thermal energy aimed at resource recovery (renewal of fixed assets), the coefficient of renewal of fixed assets, and the coefficient of fixed asset retirement. A versatile program is developed
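
    The SAW step reduces to normalizing each indicator to [0, 1], inverting the "lower is better" ones, and taking a weighted sum. A sketch for three hypothetical organizations; the indicator values and weights are illustrative, not the paper's calibration:

        # Simple additive weighting over reliability indicators for three
        # heat-supply organizations; min-max normalized, inverted where
        # lower values mean better performance.
        indicators = {
            "equipment_wear_pct":  (60.0, 45.0, 50.0),   # lower is better
            "network_wear_pct":    (70.0, 55.0, 65.0),   # lower is better
            "outages_per_km":      (0.8, 0.5, 0.6),      # lower is better
            "renewal_coefficient": (0.03, 0.06, 0.04),   # higher is better
        }
        weights = {"equipment_wear_pct": 0.3, "network_wear_pct": 0.3,
                   "outages_per_km": 0.25, "renewal_coefficient": 0.15}
        lower_is_better = {"equipment_wear_pct", "network_wear_pct", "outages_per_km"}

        def saw_scores(indicators, weights):
            n = len(next(iter(indicators.values())))
            scores = [0.0] * n
            for name, values in indicators.items():
                lo, hi = min(values), max(values)
                for i, v in enumerate(values):
                    if name in lower_is_better:
                        norm = (hi - v) / (hi - lo)
                    else:
                        norm = (v - lo) / (hi - lo)
                    scores[i] += weights[name] * norm
            return scores

        print(saw_scores(indicators, weights))  # higher composite = more reliable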

  8. Reliable thermal processing of organic perovskite films deposited on ZnO

    NASA Astrophysics Data System (ADS)

    Zakhidov, Alex; Manspeaker, Chris; Lyashenko, Dmitry; Alex Zakhidov Team

    Zinc oxide (ZnO) is a promising semiconducting material to serve as an electron transport layer (ETL) for solar cell devices based on organo-halide lead perovskites. A ZnO ETL for perovskite photovoltaics has a combination of attractive electronic and optical properties: i) the electron affinity of ZnO is well aligned with the valence band edge of CH3NH3PbI3, ii) the electron mobility of ZnO is >1 cm2/(Vs), which is a few orders of magnitude higher than that of TiO2 (another popular choice of ETL for perovskite photovoltaic devices), and iii) ZnO has a large band gap of 3.3 eV, which ensures optical transparency and a large barrier for hole injection. Moreover, ZnO nanostructures can be printed on flexible substrates at room temperature in a cost-effective manner. However, it was recently found that organic perovskites deposited on ZnO are unstable and readily decompose at >90°C. In this work, we further investigate the mechanism of decomposition of the CH3NH3PbI3 film deposited on ZnO and reveal the role of the solvent in the film during the annealing process. We also develop a restricted volume solvent annealing (RVSA) process for post-annealing of the perovskite film on ZnO without decomposition. We demonstrate that RVSA enables reliable perovskite solar cell fabrication.

  9. Elaboration and Application of a Theory of Criterion-Referenced Reliability.

    ERIC Educational Resources Information Center

    Lovett, Hubert T.

    The reliability of a criterion referenced test was defined as a measure of the degree to which the test discriminates between an individual's level of performance and a predetermined criterion level. The variances of observed and true scores were defined as the squared deviation of the score from the criterion. Based on these definitions and the…

  10. Generalizability Theory Analysis of CBM Maze Reliability in Third- through Fifth-Grade Students

    ERIC Educational Resources Information Center

    Mercer, Sterett H.; Dufrene, Brad A.; Zoder-Martell, Kimberly; Harpole, Lauren Lestremau; Mitchell, Rachel R.; Blaze, John T.

    2012-01-01

    Despite growing use of CBM Maze in universal screening and research, little information is available regarding the number of CBM Maze probes needed for reliable decisions. The current study extends existing research on the technical adequacy of CBM Maze by investigating the number of probes and assessment durations (1-3 min) needed for reliable…

  11. The "New Institutionalism" in Organization Theory: Bringing Society and Culture Back in

    ERIC Educational Resources Information Center

    Senge, Konstanze

    2013-01-01

    This investigation will discuss the emergence of an economistical perspective among the dominant approaches of organization theory in the United States since the inception of "organization studies" as an academic discipline. It maintains that Contingency theory, Resource Dependency theory, Population Ecology theory, and Transaction Cost theory…

  12. Theory of hydrogen migration in organic-inorganic halide perovskites.

    PubMed

    Egger, David A; Kronik, Leeor; Rappe, Andrew M

    2015-10-12

    Solar cells based on organic-inorganic halide perovskites have recently been proven to be remarkably efficient. However, they exhibit hysteresis in their current-voltage curves, and their stability in the presence of water is problematic. Both issues are possibly related to a diffusion of defects in the perovskite material. By using first-principles calculations based on density functional theory, we study the properties of an important defect in hybrid perovskites-interstitial hydrogen. We show that differently charged defects occupy different crystal sites, which may allow for ionization-enhanced defect migration following the Bourgoin-Corbett mechanism. Our analysis highlights the structural flexibility of organic-inorganic perovskites: successive iodide displacements, combined with hydrogen bonding, enable proton diffusion with low migration barriers. These findings indicate that hydrogen defects can be mobile and thus highly relevant for the performance of perovskite solar cells.

  13. Reliability of a viva assessment of clinical reasoning in an Australian pre-professional osteopathy program assessed using generalizability theory

    PubMed Central

    2017-01-01

    Clinical reasoning is situation-dependent and case-specific; therefore, assessments incorporating different patient presentations are warranted. The present study aimed to determine the reliability of a multi-station case-based viva assessment of clinical reasoning in an Australian pre-registration osteopathy program using generalizability theory. Students (from years 4 and 5) and examiners were recruited from the osteopathy program at Southern Cross University, Lismore, Australia. The study took place on a single day in the student teaching clinic. Examiners were trained before the examination. Students were allocated to 1 of 3 rounds consisting of 5 10-minute stations in an objective structured clinical examination-style. Generalizability analysis was used to explore the reliability of the examination. Fifteen students and 5 faculty members participated in the study. The examination produced a generalizability coefficient of 0.53, with 18 stations required to achieve a generalizability coefficient of 0.80. The reliability estimations were acceptable and the psychometric findings related to the marking rubric and overall scores were acceptable; however, further work is required in examiner training and ensuring consistent case difficulty to improve the reliability of the examination. PMID:28104901

  14. Reliability of a viva assessment of clinical reasoning in an Australian pre-professional osteopathy program assessed using generalizability theory.

    PubMed

    Vaughan, Brett; Orrock, Paul; Grace, Sandra

    2017-01-01

    Clinical reasoning is situation-dependent and case-specific; therefore, assessments incorporating different patient presentations are warranted. The present study aimed to determine the reliability of a multi-station case-based viva assessment of clinical reasoning in an Australian pre-registration osteopathy program using generalizability theory. Students (from years 4 and 5) and examiners were recruited from the osteopathy program at Southern Cross University, Lismore, Australia. The study took place on a single day in the student teaching clinic. Examiners were trained before the examination. Students were allocated to 1 of 3 rounds consisting of 5 10-minute stations in an objective structured clinical examination-style. Generalizability analysis was used to explore the reliability of the examination. Fifteen students and 5 faculty members participated in the study. The examination produced a generalizability coefficient of 0.53, with 18 stations required to achieve a generalizability coefficient of 0.80. The reliability estimations were acceptable and the psychometric findings related to the marking rubric and overall scores were acceptable; however, further work is required in examiner training and ensuring consistent case difficulty to improve the reliability of the examination.
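
    The jump from G = 0.53 with 5 stations to 18 stations for G = 0.80 follows from the standard projection in which only station-linked error shrinks as stations are added. A quick check of the abstract's numbers under that one-facet assumption:

        import math

        # Back out the single-station coefficient g from G = 0.53 over 5
        # stations, then find the smallest n with n*g / (1 + (n-1)*g) >= 0.80.
        def single_station_g(g_obs, n_obs):
            return g_obs / (n_obs - (n_obs - 1) * g_obs)

        g1 = single_station_g(0.53, 5)
        n_needed = math.ceil(0.80 * (1 - g1) / (g1 * (1 - 0.80)))
        print(round(g1, 3), n_needed)  # -> 0.184, 18 stations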

  15. Reliability of surgical skills scores in otolaryngology residents: analysis using generalizability theory.

    PubMed

    Fernandez, Soledad A; Wiet, Gregory J; Butler, Nancy N; Welling, Bradley; Jarjoura, David

    2008-12-01

    Assessments of temporal bone dissection performance among otolaryngology residents have not been adequately developed. At the Ohio State College of Medicine, an instrument (Welling Scale, Version 1 [WS1]) is used to evaluate residents' end-product performance after drilling a temporal bone. In this study, the authors evaluate the components that contribute to measurement error using this scale. Generalizability theory was used to reveal components of measurement error that allow for better understanding of test results. A major component of measurement error came from inconsistency in performance across the two cadaveric test bones each resident was assigned. In contrast, ratings of performance using the WS1 were highly consistent across raters and rating sessions within raters. The largest source of measurement error was caused by residents' inconsistent performance across bones. Rater disagreement introduced only small error into scores. The WS1 provides small measurement error, with two raters and two bones for each participant.

  16. Reliable Energy Level Alignment at Physisorbed Molecule–Metal Interfaces from Density Functional Theory

    PubMed Central

    2015-01-01

    A key quantity for molecule–metal interfaces is the energy level alignment of molecular electronic states with the metallic Fermi level. We develop and apply an efficient theoretical method, based on density functional theory (DFT) that can yield quantitatively accurate energy level alignment information for physisorbed metal–molecule interfaces. The method builds on the “DFT+Σ” approach, grounded in many-body perturbation theory, which introduces an approximate electron self-energy that corrects the level alignment obtained from conventional DFT for missing exchange and correlation effects associated with the gas-phase molecule and substrate polarization. Here, we extend the DFT+Σ approach in two important ways: first, we employ optimally tuned range-separated hybrid functionals to compute the gas-phase term, rather than rely on GW or total energy differences as in prior work; second, we use a nonclassical DFT-determined image-charge plane of the metallic surface to compute the substrate polarization term, rather than the classical DFT-derived image plane used previously. We validate this new approach by a detailed comparison with experimental and theoretical reference data for several prototypical molecule–metal interfaces, where excellent agreement with experiment is achieved: benzene on graphite (0001), and 1,4-benzenediamine, Cu-phthalocyanine, and 3,4,9,10-perylene-tetracarboxylic-dianhydride on Au(111). In particular, we show that the method correctly captures level alignment trends across chemical systems and that it retains its accuracy even for molecules for which conventional DFT suffers from severe self-interaction errors. PMID:25741626
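
    In the classical limit, the substrate polarization term that DFT+Σ evaluates with its DFT-determined image plane reduces to an image-charge shift. A sketch in eV and angstroms; the gas-phase levels and adsorption height are hypothetical:

        # Classical image-charge estimate: occupied levels shift up and
        # unoccupied levels shift down by ~ e^2/(16*pi*eps0*(z - z0)),
        # narrowing the gas-phase gap at the metal surface.
        E2_OVER_16PI_EPS0 = 3.60   # e^2/(16*pi*eps0) in eV*angstrom

        def polarization_shift(z_mol, z_image_plane):
            return E2_OVER_16PI_EPS0 / (z_mol - z_image_plane)

        homo_gas, lumo_gas = -6.5, -1.5   # hypothetical gas-phase levels (eV)
        shift = polarization_shift(z_mol=3.4, z_image_plane=1.0)  # 1.5 eV
        print(homo_gas + shift, lumo_gas - shift)  # adsorbed-level estimates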

  17. Assessing Variations in Areal Organization for the Intrinsic Brain: From Fingerprints to Reliability

    PubMed Central

    Xu, Ting; Opitz, Alexander; Craddock, R. Cameron; Wright, Margaret J.; Zuo, Xi-Nian; Milham, Michael P.

    2016-01-01

    Resting state fMRI (R-fMRI) is a powerful in-vivo tool for examining the functional architecture of the human brain. Recent studies have demonstrated the ability to characterize transitions between functionally distinct cortical areas through the mapping of gradients in intrinsic functional connectivity (iFC) profiles. To date, this novel approach has primarily been applied to iFC profiles averaged across groups of individuals, or in one case, a single individual scanned multiple times. Here, we used a publicly available R-fMRI dataset, in which 30 healthy participants were scanned 10 times (10 min per session), to investigate differences in full-brain transition profiles (i.e., gradient maps, edge maps) across individuals, and their reliability. 10-min R-fMRI scans were sufficient to achieve high accuracies in efforts to “fingerprint” individuals based upon full-brain transition profiles. Regarding test–retest reliability, the image-wise intraclass correlation coefficient (ICC) was moderate, and vertex-level ICC varied depending on region; larger durations of data yielded higher reliability scores universally. Initial application of gradient-based methodologies to a recently published dataset obtained from twins suggested inter-individual variation in areal profiles might have genetic and familial origins. Overall, these results illustrate the utility of gradient-based iFC approaches for studying inter-individual variation in brain function. PMID:27600846
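
    The fingerprinting step is conceptually simple: match each participant's session-one profile to its most correlated session-two profile. A toy sketch with random stand-in data rather than the study's R-fMRI transition profiles:

        import numpy as np

        # Identify participants by maximal correlation across sessions.
        rng = np.random.default_rng(0)
        n_subj, n_feat = 30, 500
        base = rng.normal(size=(n_subj, n_feat))   # stable individual signal
        sess1 = base + 0.5 * rng.normal(size=(n_subj, n_feat))
        sess2 = base + 0.5 * rng.normal(size=(n_subj, n_feat))

        corr = np.corrcoef(sess1, sess2)[:n_subj, n_subj:]  # cross-session r
        accuracy = np.mean(corr.argmax(axis=1) == np.arange(n_subj))
        print(f"fingerprinting accuracy: {accuracy:.2f}")   # ~1.00 here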

  18. Reliability training

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Dillard, Richard B.; Wong, Kam L.; Barber, Frank J.; Barina, Frank J.

    1992-01-01

    Discussed here is failure physics, the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low cost reliable products. A review of reliability for the years 1940 to 2000 is given. Next, a review of mathematics is given as well as a description of what elements contribute to product failures. Basic reliability theory and the disciplines that allow us to control and eliminate failures are elucidated.

  19. 18 CFR 39.4 - Funding of the Electric Reliability Organization.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... business plan and organization chart, explaining the proposed collection of all dues, fees and charges and... budget and business plan no later than sixty (60) days in advance of the beginning of the Electric... Organization may include in the application a plan for a transitional funding mechanism that would allow...

  20. Aligning the Undergraduate Organic Laboratory Experience with Professional Work: The Centrality of Reliable and Meaningful Data

    ERIC Educational Resources Information Center

    Alaimo, Peter J.; Langenhan, Joseph M.; Suydam, Ian T.

    2014-01-01

    Many traditional organic chemistry lab courses do not adequately help students to develop the professional skills required for creative, independent work. The overarching goal of the new organic chemistry lab series at Seattle University is to teach undergraduates to think, perform, and behave more like professional scientists. The conversion of…

  1. Left-right organizer flow dynamics: how much cilia activity reliably yields laterality?

    PubMed

    Sampaio, Pedro; Ferreira, Rita R; Guerrero, Adán; Pintado, Petra; Tavares, Bárbara; Amaro, Joana; Smith, Andrew A; Montenegro-Johnson, Thomas; Smith, David J; Lopes, Susana S

    2014-06-23

    Internal organs are asymmetrically positioned inside the body. Embryonic motile cilia play an essential role in this process by generating a directional fluid flow inside the vertebrate left-right organizer. Detailed characterization of how fluid flow dynamics modulates laterality is lacking. We used zebrafish genetics to experimentally generate a range of flow dynamics. By following the development of each embryo, we show that fluid flow in the left-right organizer is asymmetric and provides a good predictor of organ laterality. This was tested in mosaic organizers composed of motile and immotile cilia generated by dnah7 knockdowns. In parallel, we used simulations of fluid dynamics to analyze our experimental data. These revealed that fluid flow generated by 30 or more cilia predicts 90% situs solitus, similar to experimental observations. We conclude that cilia number, dorsal anterior motile cilia clustering, and left flow are critical to situs solitus via robust asymmetric charon expression.

  2. First evidence on the validity and reliability of the Safety Organizing Scale-Nursing home version (SOS-NH)

    PubMed Central

    Ausserhofer, Dietmar; Anderson, Ruth A.; Colón-Emeric, Cathleen; Schwendimann, René

    2013-01-01

    Background The Safety Organizing Scale is a valid and reliable measure of safety behaviors and practices in hospitals. Purpose of the study This study aimed to explore the psychometric properties of the Safety Organizing Scale-Nursing Home version (SOS-NH). Design and Methods In a cross-sectional analysis of staff survey data, we examined validity and reliability of the 9-item SOS-NH using American Educational Research Association guidelines. Subjects and Setting This sub-study of a larger trial used baseline survey data collected from staff members (n=627) in a variety of work roles in 13 NHs in North Carolina and Virginia, USA. Results Psychometric evaluation of the SOS-NH revealed good response patterns with a low average of missing values across all items (3.05%). Analyses of the SOS-NH’s internal structure (e.g., comparative fit index = 0.929, standardized root mean square error of approximation = 0.045) and consistency (composite reliability = 0.94) suggested its one-dimensionality. Significant between-facility variability, intraclass correlations, within-group agreement, and design effect confirmed the appropriateness of the SOS-NH for measurement at the NH level, justifying data aggregation. The SOS-NH showed discriminant validity from one related concept, communication openness. Implications Initial evidence regarding the validity and reliability of the SOS-NH supports its utility in measuring safety behaviors and practices among a wide range of NH staff members, including those with low literacy. Further psychometric evaluation should focus on testing concurrent and criterion validity, using resident outcome measures (e.g., fall rates). PMID:23684122

  3. A theory for the arrangement of sensory organs in Drosophila

    NASA Astrophysics Data System (ADS)

    Zhu, Huifeng; Gunaratne, Preethi H.; Roman, Gregg W.; Gunaratne, Gemunu H.

    2010-03-01

    We study the arrangements of recurved bristles on the anterior wing margin of wild-type and mutant Drosophila. The epidermal or neural fate of a proneural cell depends on the concentrations of proteins of the achaete-scute complex. At puparium formation, concentrations of proteins are nearly identical in all cells of the anterior wing and each cell has the potential for neural fate. In wild-type flies, the action of regulatory networks drives the initial state to one where a bristle grows out of every fifth cell. Recent experiments have shown that the frequency of recurved bristles can be made to change by adjusting the mean concentrations of the zinc-finger transcription factor Senseless and the micro-RNA miR-9a. Specifically, mutant flies with reduced levels of miR-9a exhibit ectopic bristles, and those with lower levels of both miR-9a and Senseless show regular organization of recurved bristles, but with a lower periodicity of 4. We argue that these characteristics can be explained by assuming an underlying Turing-type bifurcation whereby a periodic pattern spontaneously emerges from a uniform background. However, bristle patterns occur in a discrete array of cells and are not mediated by diffusion. We argue that intercellular actions of transmembrane proteins such as Delta and Notch can play the role of diffusion in destabilizing the homogeneous state. In contrast to diffusion, intercellular actions can be activating or inhibiting; further, there can be lateral cross-species interactions. We introduce a phenomenological model to study bristle arrangements and make several model-independent predictions that can be tested in experiments. In our theory, miR-9a is one of the components of the underlying network and has no special regulatory role. The loss of periodicity in its absence is due to the transfer of the system to a bistable state.

  4. Including a measure of health status in Medicare's health maintenance organization capitation formula: reliability issues.

    PubMed

    Lichtenstein, R; Thomas, J W

    1987-02-01

    Medicare's formula for determining capitation levels for risk-based HMOs, the Adjusted Average Per Capita Cost (AAPCC), has been criticized as a poor basis for establishing payments. Among new adjusting factors suggested for the formula is a measure of beneficiaries' functional health status. The ability of such a measure to improve predictions of Medicare costs has been demonstrated in several studies. In addition to possessing predictive validity, a measure considered for inclusion in the AAPCC must also be reliable. In this paper, the authors examine a measure of functional health status for intrarater reliability or, equivalently, stability over time. A sample of 1,616 Medicare beneficiaries was surveyed twice--in late 1982 and in January 1984. Using a five-point scale, functional health status scores were calculated for each of the beneficiaries at two points in time. For 68.4% of the sample, functional health scores were unchanged over the year, and second-year scores were within one point of first-year scores for 94.3% of the sample. Based on the intraclass correlation coefficient, the scores on this functional health scale demonstrated substantial to "almost perfect" agreement over the 1-year period.

  5. How Settings Change People: Applying Behavior Setting Theory to Consumer-Run Organizations

    ERIC Educational Resources Information Center

    Brown, Louis D.; Shepherd, Matthew D.; Wituk, Scott A.; Meissen, Greg

    2007-01-01

    Self-help initiatives stand as a classic context for organizational studies in community psychology. Behavior setting theory stands as a classic conception of organizations and the environment. This study explores both, applying behavior setting theory to consumer-run organizations (CROs). Analysis of multiple data sets from all CROs in Kansas…

  6. The chronic toxicity of molybdate to freshwater organisms. I. Generating reliable effects data.

    PubMed

    De Schamphelaere, K A C; Stubblefield, W; Rodriguez, P; Vleminckx, K; Janssen, C R

    2010-10-15

    The European Union regulation on Registration, Evaluation, Authorization and Restriction of Chemical substances (REACH) (EC, 2006) requires the characterization of the chronic toxicity of many chemicals in the aquatic environment, including molybdate (MoO₄²⁻). Our literature review on the ecotoxicity of molybdate revealed that only a limited number of reliable chronic no-observed-effect concentrations (NOECs) for the derivation of a predicted no-effect concentration (PNEC) existed. This paper presents the results of additional ecotoxicity experiments that were conducted in order to fulfill the requirements for the derivation of a PNEC by means of the scientifically most robust species sensitivity distribution (SSD) approach (also called the statistical extrapolation approach). Ten test species were chronically exposed to molybdate (added as sodium molybdate dihydrate, Na₂MoO₄·2H₂O) according to internationally accepted standard testing guidelines or equivalent. The 10% effective concentrations (EC10, expressed as measured dissolved molybdenum) for the most sensitive endpoint per species were 62.8-105.6 (mg Mo)/L for Daphnia magna (21-day reproduction), 78.2 (mg Mo)/L for Ceriodaphnia dubia (7-day reproduction), 61.2-366.2 (mg Mo)/L for the green alga Pseudokirchneriella subcapitata (72-h growth rate), 193.6 (mg Mo)/L for the rotifer Brachionus calyciflorus (48-h population growth rate), 121.4 (mg Mo)/L for the midge Chironomus riparius (14-day growth), 211.3 (mg Mo)/L for the snail Lymnaea stagnalis (28-day growth rate), 115.9 (mg Mo)/L for the frog Xenopus laevis (4-day larval development), 241.5 (mg Mo)/L for the higher plant Lemna minor (7-day growth rate), 39.3 (mg Mo)/L for the fathead minnow Pimephales promelas (34-day dry weight/biomass), and 43.2 (mg Mo)/L for the rainbow trout Oncorhynchus mykiss (78-day biomass). These effect concentrations are in line with the few reliable data currently available in the open literature. The data presented in this study can
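
    A simplified sketch of the SSD step that such EC10 values feed into: fit a log-normal distribution to the species means (range midpoints are used where the abstract reports a range) and read off the 5th percentile, the hazardous concentration for 5% of species (HC5). The normal-quantile shortcut below is an illustrative assumption, not the exact fitting procedure of the paper.

    ```python
    import numpy as np

    # Species-mean EC10 values in mg Mo/L taken from the abstract;
    # midpoints are used where a range is reported.
    ec10 = [84.2, 78.2, 213.7, 193.6, 121.4, 211.3, 115.9, 241.5, 39.3, 43.2]
    log_ec10 = np.log(ec10)
    mu, sigma = log_ec10.mean(), log_ec10.std(ddof=1)

    z05 = -1.645  # 5th percentile of the standard normal
    hc5 = np.exp(mu + z05 * sigma)  # hazardous concentration for 5% of species
    print(f"log-normal SSD: HC5 ~ {hc5:.0f} mg Mo/L")
    ```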

  7. Reliable thin film encapsulation for organic light emitting diodes grown by low-temperature atomic layer deposition

    NASA Astrophysics Data System (ADS)

    Meyer, J.; Schneidenbach, D.; Winkler, T.; Hamwi, S.; Weimann, T.; Hinze, P.; Ammermann, S.; Johannes, H.-H.; Riedl, T.; Kowalsky, W.

    2009-06-01

    We report on highly efficient gas diffusion barriers for organic light emitting diodes (OLEDs). Nanolaminate (NL) structures composed of alternating Al2O3 and ZrO2 sublayers grown by atomic layer deposition at 80 °C are used to realize long-term stable OLED devices. While the brightness of phosphorescent p-i-n OLEDs sealed by a single Al2O3 layer drops to 85% of the initial luminance of 1000 cd/m2 after 1000 h of continuous operation, OLEDs encapsulated with the NL retain more than 95% of their brightness. An extrapolated device lifetime substantially in excess of 10 000 h can be achieved, clearly proving the suitability of the NLs as highly dense and reliable thin film encapsulation of sensitive organic electronic devices.

  8. Reliable measurement of the Seebeck coefficient of organic and inorganic materials between 260 K and 460 K

    SciTech Connect

    Beretta, D.; Lanzani, G.; Bruno, P.; Caironi, M.

    2015-07-15

    A new experimental setup for reliable measurement of the in-plane Seebeck coefficient of organic and inorganic thin films and bulk materials is reported. The system is based on the “Quasi-Static” approach and can measure the thermopower in the temperature range between 260 K and 460 K. The system has been tested on a pure nickel bulk sample and on a thin film of commercially available PEDOT:PSS deposited by spin coating on glass. Repeatability within 1.5% for the nickel sample is demonstrated, while accuracy in the measurement of both organic and inorganic samples is ensured by time interpolation of the data and by operating with a temperature difference across the sample of less than 1 K.
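
    The core of a quasi-static Seebeck measurement reduces to regressing the thermovoltage against a slowly varying temperature difference; the slope gives the Seebeck coefficient. The sketch below uses synthetic, roughly nickel-like numbers, and the sign convention depends on the lead configuration; it is an illustration, not the paper's acquisition procedure.

    ```python
    import numpy as np

    # Synthetic quasi-static record: the temperature difference drifts
    # slowly (kept below 1 K, as in the abstract) while the voltage is read.
    rng = np.random.default_rng(1)
    dT = np.linspace(-0.8, 0.8, 200)                # K
    S_true = -19.5e-6                               # V/K, roughly nickel-like
    V = S_true * dT + rng.normal(0, 5e-8, dT.size)  # measured voltage + noise

    # Seebeck coefficient = slope of V vs dT.
    slope, _ = np.polyfit(dT, V, 1)
    print(f"Seebeck coefficient ~ {slope * 1e6:.1f} uV/K")
    ```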

  9. Using Multivariate Generalizability Theory to Assess the Effect of Content Stratification on the Reliability of a Performance Assessment

    ERIC Educational Resources Information Center

    Keller, Lisa A.; Clauser, Brian E.; Swanson, David B.

    2010-01-01

    In recent years, demand for performance assessments has continued to grow. However, performance assessments are notorious for low reliability, in particular low reliability resulting from task specificity. Since reliability analyses typically treat the performance tasks as randomly sampled from an infinite universe of tasks, these estimates…
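
    The D-study logic behind this line of work can be sketched in a few lines: given estimated variance components, the generalizability coefficient shows how slowly reliability grows with the number of tasks when the person-by-task variance (task specificity) is large. The variance components below are hypothetical.

    ```python
    # Sketch of a generalizability (D-study) projection for a persons-x-tasks
    # design; variance component values are hypothetical.
    var_person = 0.40    # true-score variance across examinees
    var_pt_error = 0.90  # person-x-task interaction confounded with error

    for n_tasks in (1, 4, 8, 16):
        # E(rho^2) = var_p / (var_p + var_pt,e / n_t)
        g_coef = var_person / (var_person + var_pt_error / n_tasks)
        print(f"{n_tasks:2d} tasks -> E(rho^2) = {g_coef:.2f}")
    ```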

  10. An Examination of Test-Retest, Alternate Form Reliability, and Generalizability Theory Study of the easyCBM Reading Assessments: Grade 1. Technical Report #1216

    ERIC Educational Resources Information Center

    Anderson, Daniel; Park, Jasmine, Bitnara; Lai, Cheng-Fei; Alonzo, Julie; Tindal, Gerald

    2012-01-01

    This technical report is one in a series of five describing the reliability (test/retest and alternate form) and G-Theory/D-Study research on the easyCBM reading measures, grades 1-5. Data were gathered in the spring of 2011 from a convenience sample of students nested within classrooms at a medium-sized school district in the Pacific Northwest. Due…

  11. A Comparison of the Approaches of Generalizability Theory and Item Response Theory in Estimating the Reliability of Test Scores for Testlet-Composed Tests

    ERIC Educational Resources Information Center

    Lee, Guemin; Park, In-Yong

    2012-01-01

    Previous assessments of the reliability of test scores for testlet-composed tests have indicated that item-based estimation methods overestimate reliability. This study was designed to address issues related to the extent to which item-based estimation methods overestimate the reliability of test scores composed of testlets and to compare several…

  12. How Reliable is the Bulk δ13C value of Soil Organic Matter in Paleovegetational Reconstruction?

    NASA Astrophysics Data System (ADS)

    Sanyal, P.; Rakshit, S.

    2015-12-01

    Carbon isotope ratios of soil/paleosol organic matter (δ13CSOM) have been used to reconstruct the abundance of C3 and C4 plants that survived in the landscape, as the δ13C values of C3 (-27‰) and C4 (-12.5‰) plants are distinctly different. In an attempt to reconstruct the abundance of C3 and C4 plants, δ13CSOM has been measured from three soil profiles developed on the flood plain of the Gangetic plain, Mohanpur, West Bengal, India. Satellite images reveal that the investigated sediments were deposited in an oxbow lake setting of the river Ganges. The total organic carbon content of the profiles ranges from 0.9% to 0.1%. The δ13CSOM values mostly range from -19.2‰ to -22‰, except for a rapid positive excursion of ~5‰ at 1.5 m depth, showing an enriched value (-14.2‰) in all three profiles. Based on a mass balance calculation using the δ13C values of C3 and C4 plants, the δ13CSOM in the Gangetic plain indicates the presence of both C3 and C4 plants in the floodplain. However, characterization of alkanes separated from lipids extracted from the same soil organic matter reveals a dominant preference for short carbon chains (C14, C16, C18, C20) with little preference for longer chains (C29, C31, C33). Interestingly, the n-alkanes at 1.5 m depth show very high concentrations of short-chain n-alkanes. Since short-chain n-alkanes represent aquatic productivity or intense bacterial decomposition, while longer chains indicate the contribution from C3 and C4 plants, the data from the investigated sedimentary profiles show a contribution mostly from aquatic vegetation with only a small contribution from terrestrial plants. This implies that before using bulk δ13CSOM values for the reconstruction of C3 and C4 plant abundance from soils/paleosols, molecular-level characterization of the soil organic matter is required.
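
    The two-endmember mass balance implied by the abstract is straightforward to state in code, using its quoted endmember values of -27‰ (C3) and -12.5‰ (C4); the sketch below applies it to the δ13CSOM values reported for the profiles.

    ```python
    # Two-endmember mass balance for C3/C4 abundance from bulk d13C_SOM,
    # using the endmember values stated in the abstract.
    D_C3, D_C4 = -27.0, -12.5  # permil

    def c4_fraction(d13c_som):
        """Fraction of organic carbon attributable to C4 plants."""
        return (d13c_som - D_C3) / (D_C4 - D_C3)

    for d in (-22.0, -19.2, -14.2):  # values reported for the profiles
        print(f"d13C_SOM = {d:6.1f} permil -> C4 fraction ~ {c4_fraction(d):.2f}")
    ```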

  13. Implications of Complexity and Chaos Theories for Organizations that Learn

    ERIC Educational Resources Information Center

    Smith, Peter A. C.

    2003-01-01

    In 1996 Hubert Saint-Onge and Smith published an article ("The evolutionary organization: avoiding a Titanic fate", in The Learning Organization, Vol. 3 No. 4), based on their experience at the Canadian Imperial Bank of Commerce (CIBC). It was established at CIBC that change could be successfully facilitated through blended application…

  14. Reliability of equivalent sphere model in blood-forming organ dose estimation

    NASA Technical Reports Server (NTRS)

    Shinn, Judy L.; Wilson, John W.; Nealy, John E.

    1990-01-01

    The radiation dose equivalents to blood-forming organs (BFO's) of the astronauts at the Martian surface due to major solar flare events are calculated using the detailed body geometry of Langley and Billings. The solar flare spectra of the February 1956, November 1960, and August 1972 events are employed instead of the idealized Webber form. The detailed geometry results are compared with those based on the 5-cm sphere model, which was often used in the past to approximate BFO dose or dose equivalent. Larger discrepancies are found for the latter two events, possibly due to the lower numbers of highly penetrating protons. It is concluded that the 5-cm sphere model is not suitable for quantitative use in connection with future NASA deep-space, long-duration mission shield design studies.

  15. Reliability of equivalent sphere model in blood-forming organ dose estimation

    SciTech Connect

    Shinn, J.L.; Wilson, J.W.; Nealy, J.E.

    1990-04-01

    The radiation dose equivalents to blood-forming organs (BFO's) of the astronauts at the Martian surface due to major solar flare events are calculated using the detailed body geometry of Langley and Billings. The solar flare spectra of the February 1956, November 1960, and August 1972 events are employed instead of the idealized Webber form. The detailed geometry results are compared with those based on the 5-cm sphere model, which was often used in the past to approximate BFO dose or dose equivalent. Larger discrepancies are found for the latter two events, possibly due to the lower numbers of highly penetrating protons. It is concluded that the 5-cm sphere model is not suitable for quantitative use in connection with future NASA deep-space, long-duration mission shield design studies.

  16. Surrogacy theory and models of convoluted organic systems.

    PubMed

    Konopka, Andrzej K

    2007-03-01

    The theory of surrogacy is briefly outlined as one of the conceptual foundations of systems biology that has been developed over the last 30 years in the context of the Hertz-Rosen modeling relationship. Conceptual foundations of modeling convoluted (biologically complex) systems are briefly reviewed and discussed in terms of current and future research in systems biology. New as well as older results that pertain to the concepts of modeling relationship, sequence of surrogacies, cascade of representations, complementarity, analogy, metaphor, and epistemic time are presented together with a classification of models in a cascade. Examples of anticipated future applications of surrogacy theory in life sciences are briefly discussed.

  17. Push-Pull Receptive Field Organization and Synaptic Depression: Mechanisms for Reliably Encoding Naturalistic Stimuli in V1

    PubMed Central

    Kremkow, Jens; Perrinet, Laurent U.; Monier, Cyril; Alonso, Jose-Manuel; Aertsen, Ad; Frégnac, Yves; Masson, Guillaume S.

    2016-01-01

    Neurons in the primary visual cortex are known for responding vigorously but with high variability to classical stimuli such as drifting bars or gratings. By contrast, natural scenes are encoded more efficiently by sparse and temporally precise spiking responses. We used a conductance-based model of the visual system in higher mammals to investigate how two specific features of the thalamo-cortical pathway, namely push-pull receptive field organization and fast synaptic depression, can contribute to this contextual reshaping of V1 responses. By comparing cortical dynamics evoked respectively by natural vs. artificial stimuli in a comprehensive parametric space analysis, we demonstrate that the reliability and sparseness of the spiking responses during natural vision is not a mere consequence of the increased bandwidth in the sensory input spectrum. Rather, it results from the combined impacts of fast synaptic depression and push-pull inhibition, the latter acting for natural scenes as a form of “effective” feed-forward inhibition, as demonstrated in other sensory systems. Thus, the combination of feedforward-like inhibition with fast thalamo-cortical synaptic depression in simple cells receiving a direct structured input from thalamus constitutes a generic computational mechanism for generating a sparse and reliable encoding of natural sensory events. PMID:27242445

  18. Reliability and validity of World Health Organization Quality of Life-100 in homeless substance-dependent veteran population.

    PubMed

    Garcia-Rea, Elizabeth; LePage, James P

    2008-01-01

    The number of homeless individuals and specifically homeless veterans is increasing. Accurate assessment of quality of life is an important need in working with this population because of the myriad problems encountered. However, the reliability and validity of quality-of-life instruments have not been assessed in this population. This study evaluated the psychometric properties of the U.S. version of the World Health Organization Quality of Life-100 in a homeless veteran population. Results found adequate internal consistency for all domain and most facet scores, while test-retest stability varied for the facet scores. We confirmed validity by using subsamples with physical, emotional, and social problems and by comparing scores from populations that returned to the community with employment and housing. Limitations and directions for future study are discussed.
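
    As a sketch of the internal-consistency analysis mentioned above, the following computes Cronbach's alpha for a synthetic respondents-by-items matrix; the data are made up for illustration and are unrelated to the study's sample.

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """items: (n respondents) x (k items) matrix of item scores."""
        k = items.shape[1]
        item_var_sum = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1.0 - item_var_sum / total_var)

    rng = np.random.default_rng(2)
    latent = rng.normal(0, 1, size=(200, 1))             # shared trait per respondent
    items = latent + rng.normal(0, 0.7, size=(200, 8))   # 8 items in one facet
    print(f"Cronbach's alpha = {cronbach_alpha(items):.2f}")
    ```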

  19. Increasing Reliability of Direct Observation Measurement Approaches in Emotional and/or Behavioral Disorders Research Using Generalizability Theory

    ERIC Educational Resources Information Center

    Gage, Nicholas A.; Prykanowski, Debra; Hirn, Regina

    2014-01-01

    Reliability of direct observation outcomes ensures the results are consistent, dependable, and trustworthy. Typically, reliability of direct observation measurement approaches is assessed using interobserver agreement (IOA) and the calculation of observer agreement (e.g., percentage of agreement). However, IOA does not address intraobserver…

  20. The contribution of organization theory to nursing health services research.

    PubMed

    Mick, Stephen S; Mark, Barbara A

    2005-01-01

    We review nursing and health services research on health care organizations over the period 1950 through 2004 to reveal the contribution of nursing to this field. Notwithstanding this rich tradition and the unique perspective of nursing researchers grounded in patient care production processes, the following gaps in nursing research remain: (1) the lack of theoretical frameworks about organizational factors relating to internal work processes; (2) the need for sophisticated methodologies to guide empirical investigations; (3) the difficulty in understanding how organizations adapt models for patient care delivery in response to market forces; (4) the paucity of attention to the impact of new technologies on the organization of patient care work processes. Given nurses' deep understanding of the inner workings of health care facilities, we hope to see an increasing number of research programs that tackle these deficiencies.

  1. Economic and Political Theories of Organization: The Case of Human Rights INGOs.

    ERIC Educational Resources Information Center

    Blaser, Arthur W.

    This paper reviews research on international nongovernmental organizations dealing with human rights (INGOs), and interprets this research in light of the overlap of the fields of organizational theory (including group theory) and human rights. The purpose is to contribute toward a useful exchange between social scientists who seek to explain…

  2. Prolegomena to a Primitive Theory of Human Communication in Human Organizations.

    ERIC Educational Resources Information Center

    Dance, Frank E. X.

    1979-01-01

    Calls for a reordering of values in the study of human communication in human organizations. Offers a preliminary discourse on a primitive theory of human communication as distinguished from an eclectic theory of organizational communication. Differences between the two types of theoretical approaches are suggested. (JMF)

  3. A Theory of Electronic Propinquity: Mediated Communication in Organizations.

    ERIC Educational Resources Information Center

    Korzenny, Felipe

    This paper proposes a theoretical approach to mediated communication in organizations. It is argued that the man/machine interface in mediated human communication is better dealt with when a comprehensive theoretical approach is used than when separate communication devices are tested as they appear in the market, such as video-teleconferencing.…

  4. A Theory of Electronic Propinquity: Mediated Communication in Organizations

    ERIC Educational Resources Information Center

    Korzenny, Felipe

    1978-01-01

    Proposes a theoretical approach to mediated communication in organizations suggesting that man-machine interface in mediated human communication is more effectively dealt with by using a comprehensive theoretical approach rather than separate communication devices that are tested as they appear in the market. (MH)

  5. Toward a theory of the functional organization of the retina

    NASA Astrophysics Data System (ADS)

    Ratliff, Charles P.

    2007-12-01

    The retina streams visual information to the brain through parallel channels with highly stereotyped patterns of organization and connection. Much progress has been made toward identifying the types of neurons present, and their connectivity. A key problem is inferring the function of a neural system based on its known anatomy and physiology, and identifying the advantages conferred by its particular design. Often, characterizing its architecture reveals some strange features of its organization, and the utility of these features is not always explained easily. Here evidence is presented that several intriguing 'design features' of the retina can be explained by careful application of a single hypothesis: that the retina is organized to maximize the information transmitted about natural visual stimuli, subject to a set of biophysical constraints. Specifically, the input neurons to the retina (photoreceptors) and the output neurons (ganglion cells) exhibit the following interesting features: (1) In trichromats, cone photoreceptors with peak sensitivity to long (L), medium (M) and short (S) wavelengths of light are asymmetrically distributed, so that the ratio of L/M (red/green) cones is highly variable, and S (blue) cones are relatively scarce. (2) Ganglion cell receptive fields are organized so that 3-4 cells of the same type represent each point in a visual image. (3) The retina devotes more resources to ganglion cells selective for negative contrasts (OFF cells) than those selective for positive contrasts (ON cells). (4) The shape of ganglion cell center/surround receptive fields depends on their spatial scale, so that the ratio of surround size to center size decreases with the visual angle subtended by the receptive field. In each case, statistical properties of natural visual stimuli could be coupled with realistic biophysical constraints to account for the features described. The analyses here constitute progress toward long-standing questions concerning the

  6. An Examination of Test-Retest, Alternate Form Reliability, and Generalizability Theory Study of the easyCBM Passage Reading Fluency Assessments: Grade 4. Technical Report #1219

    ERIC Educational Resources Information Center

    Park, Bitnara Jasmine; Anderson, Daniel; Alonzo, Julie; Lai, Cheng-Fei; Tindal, Gerald

    2012-01-01

    This technical report is one in a series of five describing the reliability (test/retest and alternate form) and G-Theory/D-Study research on the easyCBM reading measures, grades 1-5. Data were gathered in the spring of 2011 from a convenience sample of students nested within classrooms at a medium-sized school district in the Pacific Northwest.…

  7. An Examination of Test-Retest, Alternate Form Reliability, and Generalizability Theory Study of the easyCBM Reading Assessments: Grade 2. Technical Report #1217

    ERIC Educational Resources Information Center

    Anderson, Daniel; Lai, Cheg-Fei; Park, Bitnara Jasmine; Alonzo, Julie; Tindal, Gerald

    2012-01-01

    This technical report is one in a series of five describing the reliability (test/retest and alternate form) and G-Theory/D-Study research on the easyCBM reading measures, grades 1-5. Data were gathered in the spring of 2011 from a convenience sample of students nested within classrooms at a medium-sized school district in the Pacific Northwest. Due to…

  8. An Examination of Test-Retest, Alternate Form Reliability, and Generalizability Theory Study of the easyCBM Reading Assessments: Grade 5. Technical Report #1220

    ERIC Educational Resources Information Center

    Lai, Cheng-Fei; Park, Bitnara Jasmine; Anderson, Daniel; Alonzo, Julie; Tindal, Gerald

    2012-01-01

    This technical report is one in a series of five describing the reliability (test/retest and alternate form) and G-Theory/D-Study research on the easyCBM reading measures, grades 1-5. Data were gathered in the spring of 2011 from a convenience sample of students nested within classrooms at a medium-sized school district in the Pacific Northwest.…

  9. Targeting helicase-dependent amplification products with an electrochemical genosensor for reliable and sensitive screening of genetically modified organisms.

    PubMed

    Moura-Melo, Suely; Miranda-Castro, Rebeca; de-Los-Santos-Álvarez, Noemí; Miranda-Ordieres, Arturo J; Dos Santos Junior, J Ribeiro; da Silva Fonseca, Rosana A; Lobo-Castañón, Maria Jesús

    2015-08-18

    Cultivation of genetically modified organisms (GMOs) and their use in food and feed is constantly expanding; thus, the question of informing consumers about their presence in food has proven of significant interest. The development of sensitive, rapid, robust, and reliable methods for the detection of GMOs is crucial for proper food labeling. In response, we have experimentally characterized the helicase-dependent isothermal amplification (HDA) and sequence-specific detection of a transgene from the Cauliflower Mosaic Virus 35S Promoter (CaMV35S), inserted into most transgenic plants. HDA is one of the simplest approaches for DNA amplification, emulating the bacterial replication machinery, and resembling PCR but under isothermal conditions. However, it usually suffers from a lack of selectivity, which is due to the accumulation of spurious amplification products. To improve the selectivity of HDA, which makes the detection of amplification products more reliable, we have developed an electrochemical platform targeting the central sequence of HDA copies of the transgene. A binary monolayer architecture is built onto a thin gold film where, upon the formation of perfect nucleic acid duplexes with the amplification products, these are enzyme-labeled and electrochemically transduced. The resulting combined system increases genosensor detectability up to 10⁶-fold, allowing Yes/No detection of GMOs with a limit of detection of ∼30 copies of the CaMV35S genomic DNA. A set of general utility rules in the design of genosensors for detection of HDA amplicons, which may assist in the development of point-of-care tests, is also included. The method provides a versatile tool for detecting nucleic acids with extremely low abundance not only for food safety control but also in the diagnostics and environmental control areas.

  10. A Study of the Readiness of Hospitals for Implementation of High Reliability Organizations Model in Tehran University of Medical Sciences.

    PubMed

    Mousavi, Seyed Mohammad Hadi; Dargahi, Hossein; Mohammadi, Sara

    2016-10-01

    Creating a safe health care system requires the establishment of High Reliability Organizations (HROs), which reduce errors and increase the level of safety in hospitals. This model focuses on improving reliability through better process design, building a culture of accreditation, and leveraging human factors. The present study intends to determine the readiness of hospitals for the establishment of the HROs model in Tehran University of Medical Sciences from the viewpoint of the managers of these hospitals. This is a descriptive-analytical study carried out in 2013-2014. The research population consists of 105 senior and middle managers of 15 hospitals of Tehran University of Medical Sciences. The data collection tool was a 55-question researcher-made questionnaire, which included six elements of HROs, used to assess the level of readiness for establishing the HROs model from the managers' point of view. The validity of the questionnaire was established through the content validity method using 10 experts in the area of hospital accreditation, and its reliability was calculated through the test-retest method with a correlation coefficient of 0.90. The response rate was 90 percent. A Likert scale was used for the questions, and data analysis was conducted with SPSS version 21. Descriptive statistics were presented via tables, normal distributions of data, and means. Analytical methods, including the t-test, Mann-Whitney, Spearman, and Kruskal-Wallis tests, were used for inferential statistics. The study showed that, from the viewpoint of the senior and middle managers of the hospitals considered in this study, these hospitals are indeed ready for the acceptance and establishment of the HROs model. A significant relationship was shown between the HROs model and its elements and demographic details of the managers, such as their age, work experience, management experience, and level of management. Although the studied hospitals, as viewed by their managers, are capable of attaining the goals of HROs, it

  11. Theory of zwitterionic molecular-based organic magnets

    NASA Astrophysics Data System (ADS)

    Shelton, William A.; Aprà, Edoardo; Sumpter, Bobby G.; Saraiva-Souza, Aldilene; Souza Filho, Antonio G.; Nero, Jordan Del; Meunier, Vincent

    2011-08-01

    We describe a class of organic molecular magnets based on zwitterionic molecules (betaine derivatives) possessing donor, π bridge, and acceptor groups. Using extensive electronic structure calculations, we show that the electronic ground state in these systems is magnetic. In addition, we show that the large energy differences computed for the various magnetic states indicate a high Néel temperature. The quantum mechanical nature of the magnetic properties originates from the conjugated π bridge (only p electrons) in cooperation with the molecular donor-acceptor character. The exchange interactions between electron spins are strong, local, and independent of the length of the π bridge.

  12. Shock interaction with organized structures: Theory and computation

    NASA Astrophysics Data System (ADS)

    Ding, Zhong

    Unsteady interactions between shocks and turbulence are important phenomena frequently encountered in high-speed flows. In this dissertation the problem of a shock interaction with an entropy spot is studied by means of both theoretical analysis and nonlinear computation. The main objective of the studies is to apply both theoretical and computational approaches to study the physics underlying such shock interaction process. The theoretical analysis is based on the Fourier decomposition of the upstream disturbance, the interaction of each Fourier mode with the shock, and the reconstruction of the downstream disturbance via the inverse Fourier transform. The theory is linear in that it assumes the principle of superposition and that the Rankine-Hugoniot relations are linearized about the mean position of the shock. The numerical simulation is carried out within the framework of the unsteady and compressible Euler equations, coupled with an equation for the shock motion, solved numerically by a sixth-order accurate spatial scheme and a fourth-order Runge-Kutta time-integration method. Analyses of the results are concentrated on the case of a Mach 2.0 shock interaction with an entropy spot that has a Gaussian density distribution. The theoretical analysis and the numerical simulation are verified with each other for small amplitude disturbances. The roles of the evanescent and the non-evanescent waves and the mechanisms for downstream disturbance generations are explored in details. In addition, the quasi three-dimensional interaction between a shock and a vortex ring is investigated computationally within the framework of the axisymmetric Euler equations. The vortex ring, which is based on Lamb's formula, has an upstream circulation Gamma = 0.01 and its aspect ratio R lies in the range 8 ≤ R ≤ 100. The shock Mach number varies in the range 1.1 ≤ M1 ≤ 1.8. The interaction results in the streamwise compression of the vortex core and the generation of a toroidal

  13. A predictive theory of charge separation in organic photovoltaics interfaces

    NASA Astrophysics Data System (ADS)

    Troisi, Alessandro; Liu, Tao; Caruso, Domenico; Cheung, David L.; McMahon, David P.

    2012-09-01

    The key process in organic photovoltaic cells is the separation of an exciton, close to the donor/acceptor interface, into a free hole (in the donor) and a free electron (in the acceptor). In an efficient solar cell, the majority of absorbed photons generate such hole-electron pairs, but it is not clear why this charge separation process is so efficient in some blends (for example, in the blend formed by poly(3-hexylthiophene) (P3HT) and a C60 derivative (PCBM)) or how one can design better OPV materials. The electronic and geometric structure of the prototypical polymer:fullerene interface (P3HT:PCBM) is investigated theoretically using a combination of classical and quantum simulation methods. It is shown that the electronic structure of P3HT in contact with PCBM is significantly altered compared to bulk P3HT. Due to the additional free volume of the interface, P3HT chains close to PCBM are more disordered and, consequently, they are characterized by an increased band gap. Excitons and holes are therefore repelled by the interface. This provides a possible explanation of the low recombination efficiency and supports the direct formation of "quasi-free" charge-separated species at the interface. This idea is further explored here by using a more general, system-independent model Hamiltonian. The long-range exciton dissociation rate is computed as a function of the exciton distance from the interface, and the average dissociation distance is evaluated by comparing this rate with the exciton migration rate in a kinetic model. The phenomenological model shows that the direct formation of quasi-free charges is extremely likely at a generic interface as well.

  14. Molecular Electron Density Theory: A Modern View of Reactivity in Organic Chemistry.

    PubMed

    Domingo, Luis R

    2016-09-30

    A new theory for the study of reactivity in Organic Chemistry, named Molecular Electron Density Theory (MEDT), is proposed herein. MEDT is based on the idea that while the electron density distribution at the ground state is responsible for physical and chemical molecular properties, as proposed by Density Functional Theory (DFT), the capability for changes in electron density is responsible for molecular reactivity. Within MEDT, reactivity in Organic Chemistry is studied through a rigorous quantum chemical analysis of the changes in electron density, as well as the energies associated with these changes, along the reaction path, in order to understand experimental outcomes. Studies performed using MEDT allow a modern rationalisation of, and insight into, molecular mechanisms and reactivity in Organic Chemistry.

  15. Towards a Theory of Variation in the Organization of the Word Reading System

    PubMed Central

    Rueckl, Jay G.

    2015-01-01

    The strategy underlying most computational models of word reading is to specify the organization of the reading system—its architecture and the processes and representations it employs—and to demonstrate that this organization would give rise to the behavior observed in word reading tasks. This approach fails to adequately address the variation in reading behavior observed across and within linguistic communities. Only computational models that incorporate learning can fully account for variation in organization. However, even extant learning models (e.g., the triangle model) must be extended if they are to fully account for variation in organization. The challenges associated with extending theories in this way are discussed. PMID:26997862

  16. [A new view of Wolfgang Gutmann and the "Organism-Centred Theory"].

    PubMed

    Weinich, Detlef

    2003-01-01

    Six years after the death of the founder of the so-called 'Organism-Centred Theory' - Prof. Dr. Wolfgang Friedrich Gutmann died on 15 April 1997 - it is obvious that numerous aspects of this theoretical system, which were highly controversial while GUTMANN was still alive, are today gaining increasing acceptance. Two things are worth noting here: on the one hand, it can be observed that statements from this concept, also known as the "Frankfurt Theory" (FT), are slowly establishing themselves in the scientific community as everyday scientific knowledge, that is, without being identified as intrinsic parts of the organism-centred theory. On the other hand, it cannot be ignored that a rethinking process and an adoption of construction-morphological ideas have been observed, even among those bio-scientists who firmly regard themselves as representatives of the traditional view of evolution theory oriented towards Darwinian evolution paradigms. In terms of content, this transformation focuses on the evaluation of two central points of the "organism-centred theory": on the one hand, GUTMANN's criticism of reductionism is finding an increasing number of followers; furthermore, his idea that an organism itself actively generates and creates its own form has been convincingly confirmed by a number of more recent cellular findings.

  17. Application of fuzzy set and Dempster-Shafer theory to organic geochemistry interpretation

    NASA Technical Reports Server (NTRS)

    Kim, C. S.; Isaksen, G. H.

    1993-01-01

    An application of fuzzy sets and Dempster-Shafer Theory (DST) in modeling the interpretational process of organic geochemistry data for predicting the maturity levels of oil and source rock samples is presented. This was accomplished by (1) representing linguistic imprecision and imprecision associated with experience by fuzzy set theory, (2) capturing the probabilistic nature of imperfect evidence by DST, and (3) combining multiple pieces of evidence by utilizing John Yen's generalized Dempster-Shafer Theory (GDST), which allows DST to deal with fuzzy information. The current prototype provides collective beliefs on the predicted levels of maturity by combining multiple pieces of evidence through GDST's rule of combination.

  18. Egalitarian and maximin theories of justice: directed donation of organs for transplant.

    PubMed

    Veatch, R M

    1998-08-01

    It is common to interpret Rawls's maximin theory of justice as egalitarian. Compared to utilitarian theories, this may be true. However, in special cases practices that distribute resources so as to benefit the worst off actually increase the inequality between the worst off and some who are better off. In these cases the Rawlsian maximin parts company with what is here called true egalitarianism. A policy question requiring a distinction between maximin and "true egalitarian" allocations has arisen in the arena of organ transplantation. This case is examined here as a venue for differentiating maximin and true egalitarian theories. Directed donation is the name given to donations of organs restricted to a particular social group. For example, the family of a member of the Ku Klux Klan donated his organs on the provision that they go only to members of the Caucasian race. While such donations appear to be discriminatory, if certain plausible assumptions are made, they satisfy the maximin criterion. They selectively advantage the recipient of the organs without harming anyone (assuming the organs would otherwise go unused). Moreover, everyone who is lower on the waiting list (and who could thereby be considered worse off) is advantaged by moving up the waiting list. This paper examines how maximin and more truly egalitarian theories handle this case, arguing that, to the extent that directed donation is unethical, the best account of that conclusion is that an egalitarian principle of justice is to be preferred to the maximin.

  19. Applications of the Conceptual Density Functional Theory Indices to Organic Chemistry Reactivity.

    PubMed

    Domingo, Luis R; Ríos-Gutiérrez, Mar; Pérez, Patricia

    2016-06-09

    Theoretical reactivity indices based on the conceptual Density Functional Theory (DFT) have become a powerful tool for the semiquantitative study of organic reactivity. A large number of reactivity indices have been proposed in the literature. Herein, global quantities like the electronic chemical potential μ, the electrophilicity ω and the nucleophilicity N indices, and local condensed indices like the electrophilic Pk+ and nucleophilic Pk− Parr functions, as the most relevant indices for the study of organic reactivity, are discussed.
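
    The global indices named above have simple frontier-orbital (Koopmans-like) approximations that can be sketched directly; the HOMO/LUMO energies below are hypothetical inputs, and published work typically derives these quantities from computed ionization potentials and electron affinities instead.

    ```python
    # Sketch of the global conceptual-DFT indices, in the frontier-orbital
    # approximation; input orbital energies (eV) are hypothetical.
    def cdft_indices(e_homo, e_lumo):
        mu = 0.5 * (e_homo + e_lumo)   # electronic chemical potential
        eta = e_lumo - e_homo          # chemical hardness
        omega = mu ** 2 / (2 * eta)    # electrophilicity index (Parr)
        return mu, eta, omega

    mu, eta, omega = cdft_indices(e_homo=-9.0, e_lumo=-1.0)
    print(f"mu = {mu:.2f} eV, eta = {eta:.2f} eV, omega = {omega:.2f} eV")
    ```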

  20. Improving the Reliability of Student Scores from Speeded Assessments: An Illustration of Conditional Item Response Theory Using a Computer-Administered Measure of Vocabulary

    PubMed Central

    Petscher, Yaacov; Mitchell, Alison M.; Foorman, Barbara R.

    2016-01-01

    A growing body of literature suggests that response latency, the amount of time it takes an individual to respond to an item, may be an important factor to consider when using assessment data to estimate the ability of an individual. Considering that tests of passage and list fluency are being adapted to a computer administration format, it is possible that accounting for individual differences in response times may be an increasingly feasible option to strengthen the precision of individual scores. The present research evaluated the differential reliability of scores when using classical test theory and item response theory as compared to a conditional item response model which includes response time as an item parameter. Results indicated that the precision of student ability scores increased by an average of 5% when using the conditional item response model, with greater improvements for those who were average or high ability. Implications for measurement models of speeded assessments are discussed. PMID:27721568

  1. A Contribution to a Theory of Organizations: An Examination of Student Protest.

    ERIC Educational Resources Information Center

    Norr, James L.

    Until recently most of the research on college student protest of the 1960's has taken either a political socialization or cultural-historical perspective. The research reported here takes an organizational perspective with the expectation that an examination of student protest should contribute to a theory of organizations. Two classes of…

  2. Innovation Diffusion: Proposal of an Organizing Theory on Which To Base Research into School Library Development.

    ERIC Educational Resources Information Center

    Knuth, Rebecca

    1997-01-01

    Discusses the appropriateness of applying diffusion theory to the study of five factors that influence school library development globally: (1) the evolution of, acceptance of, and consensus on a viable service-delivery model; (2) influence exercised by professional organizations; (3) generation of acceptable standards; (4) overt government…

  3. How Youth Get Engaged: Grounded-Theory Research on Motivational Development in Organized Youth Programs

    ERIC Educational Resources Information Center

    Dawes, Nickki Pearce; Larson, Reed

    2011-01-01

    For youth to benefit from many of the developmental opportunities provided by organized programs, they need to not only attend but become psychologically engaged in program activities. This research was aimed at formulating empirically based grounded theory on the processes through which this engagement develops. Longitudinal interviews were…

  4. Self-Determination Theory as an Organizing Framework To Investigate Women's Physical Activity Behavior.

    ERIC Educational Resources Information Center

    Landry, Joan B.; Solmon, Melinda A.

    2002-01-01

    Explores the literature on the status of women's health behavior and the benefits of physical activity, using Self- Determination Theory (SDT) as an organizing framework and including the Health Belief Model and Transtheoretical Model in the framework. Women's physical activity behaviors are examined through the lens of SDT with the intention of…

  5. An Investigation of the Advance Organizer Theory as an Effective Teaching Model.

    ERIC Educational Resources Information Center

    Downing, Agnes

    This paper advocates for the improvement of presentational methods of teaching and expository learning, based on David Ausubel's theory of Meaningful Verbal Learning and its derivative, the Advance Organizer Model of Teaching. This approach to teaching enables teachers to convey large amounts of information as meaningfully and efficiently as…

  6. Estimation of reliability and dynamic property for polymeric material at high strain rate using SHPB technique and probability theory

    NASA Astrophysics Data System (ADS)

    Kim, Dong Hyeok; Lee, Ouk Sub; Kim, Hong Min; Choi, Hye Bin

    2008-11-01

    A modified Split Hopkinson Pressure Bar (SHPB) technique with aluminum pressure bars and a pulse shaper was used to achieve a closer impedance match between the pressure bars and specimen materials such as thermally degraded POM (Poly Oxy Methylene) and PP (Poly Propylene). More distinguishable experimental signals were thereby obtained, allowing a more accurate evaluation of the dynamic deformation behavior of the materials under high strain rate loading. The pulse shaping technique reduces non-equilibrium effects in the dynamic material response by modulating the incident wave during the short test period; it increases the rise time of the incident pulse in the SHPB experiment. For the dynamic stress-strain curve obtained from the SHPB experiment, the Johnson-Cook model is applied as a constitutive equation. The applicability of this constitutive equation is verified by using a probabilistic reliability estimation method. Two reliability methodologies, the FORM (first-order reliability method) and the SORM (second-order reliability method), are employed. The limit state function (LSF) includes the Johnson-Cook model and the applied stresses, and allows more statistical flexibility on the yield stress than previously published work. It is found that the failure probability estimated by using the SORM is more reliable than that of the FORM. It is also noted that the failure probability increases with increasing applied stress. Moreover, according to the sensitivity analysis, the parameters of the Johnson-Cook model, such as A and n, and the applied stress are found to affect the failure probability more severely than the other random variables.
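
    For reference, the Johnson-Cook flow-stress form used as the constitutive equation can be written out as a short function; all parameter values below are hypothetical placeholders, not the fitted values of the study.

    ```python
    import numpy as np

    # Johnson-Cook flow stress:
    # sigma = (A + B eps^n) (1 + C ln(rate/eps0)) (1 - T*^m)
    # Parameter values are hypothetical, loosely polymer-like.
    def johnson_cook(strain, strain_rate, T,
                     A=60e6, B=90e6, n=0.5, C=0.05, m=1.0,
                     eps0=1.0, T_room=293.0, T_melt=440.0):
        T_star = (T - T_room) / (T_melt - T_room)  # homologous temperature
        return ((A + B * strain ** n)
                * (1.0 + C * np.log(strain_rate / eps0))
                * (1.0 - T_star ** m))

    sigma = johnson_cook(strain=0.1, strain_rate=1.0e3, T=293.0)
    print(f"flow stress ~ {sigma / 1e6:.1f} MPa at 10^3 1/s")
    ```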

  7. Adequacy of Asymptotic Normal Theory in Estimating Reliability for Mastery Tests Based on the Beta-Binomial Model.

    ERIC Educational Resources Information Center

    Huynh, Huynh

    1981-01-01

    Simulated data based on five test score distributions indicate that a slight modification of the asymptotic normal theory for the estimation of the p and kappa indices in mastery testing will provide results which are in close agreement with those based on small samples from the beta-binomial distribution. (Author/BW)

  8. Reliability and Validity Study of the Mobile Learning Adoption Scale Developed Based on the Diffusion of Innovations Theory

    ERIC Educational Resources Information Center

    Celik, Ismail; Sahin, Ismail; Aydin, Mustafa

    2014-01-01

    In this study, a mobile learning adoption scale (MLAS) was developed on the basis of Rogers' (2003) Diffusion of Innovations Theory. The scale that was developed consists of four sections. These sections are as follows: Stages in the innovation-decision process, Types of m-learning decision, Innovativeness level and attributes of m-learning.…

  9. Reliability and Validity Study of the Mobile Learning Adoption Scale Developed Based on the Diffusion of Innovations Theory

    ERIC Educational Resources Information Center

    Celik, Ismail; Sahin, Ismail; Aydin, Mustafa

    2014-01-01

    In this study, a mobile learning adoption scale (MLAS) was developed on the basis of Rogers' (2003) Diffusion of Innovations Theory. The scale that was developed consists of four sections. These sections are as follows: Stages in the innovation-decision process, Types of m-learning decision, Innovativeness level and attributes of m-learning. There…

  10. Compatibility between Text Mining and Qualitative Research in the Perspectives of Grounded Theory, Content Analysis, and Reliability

    ERIC Educational Resources Information Center

    Yu, Chong Ho; Jannasch-Pennell, Angel; DiGangi, Samuel

    2011-01-01

    The objective of this article is to illustrate that text mining and qualitative research are epistemologically compatible. First, like many qualitative research approaches, such as grounded theory, text mining encourages open-mindedness and discourages preconceptions. Contrary to the popular belief that text mining is a linear and fully automated…

  11. Assuring reliability program effectiveness.

    NASA Technical Reports Server (NTRS)

    Ball, L. W.

    1973-01-01

    An attempt is made to provide simple identification and description of techniques that have proved to be most useful either in developing a new product or in improving reliability of an established product. The first reliability task is obtaining and organizing parts failure rate data. Other tasks are parts screening, tabulation of general failure rates, preventive maintenance, prediction of new product reliability, and statistical demonstration of achieved reliability. Five principal tasks for improving reliability involve the physics of failure research, derating of internal stresses, control of external stresses, functional redundancy, and failure effects control. A final task is the training and motivation of reliability specialist engineers.

  12. Organizers.

    ERIC Educational Resources Information Center

    Callison, Daniel

    2000-01-01

    Focuses on "organizers," tools or techniques that provide identification and classification along with possible relationships or connections among ideas, concepts, and issues. Discusses David Ausubel's research and ideas concerning advance organizers; the implications of Ausubel's theory to curriculum and teaching; "webbing," a…

  13. Organisms, organizations and interactions: an information theory approach to biocultural evolution.

    PubMed

    Wallace, R; Wallace, R G

    1999-08-01

The language metaphor of theoretical biology, proposed by Waddington in 1972, provides a basis for the formal examination of how different self-reproducing structures interact in an extended evolutionary context. Such interactions have become central objects of study in fields ranging from human evolution (genes and culture) to economics (firms, markets and technology). Here we use the Shannon-McMillan Theorem, one of the fundamental asymptotic relations of probability theory, to study the 'weakest', and hence most universal, forms of interaction between generalized languages. We propose that the co-evolving gene-culture structure that permits human ultra-sociality emerged in a singular coagulation of genetic and cultural 'languages', in the general sense of the word. Human populations have since hosted a series of culture-only speciations and coagulations, events that, in this formulation, do not become mired in the 'meme' concept.
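
    The Shannon-McMillan Theorem concerns the asymptotic equipartition of typical sequences from an ergodic source; the quantity it governs, the entropy rate, can be estimated from data with a simple plug-in block estimator. The sketch below uses a toy biased binary "language" invented for the example, not the authors' gene-culture model.

```python
import math
import random
from collections import Counter

def block_entropy_rate(seq, k=5):
    """Plug-in per-symbol entropy estimate (bits) from k-blocks.  For an
    ergodic source the Shannon-McMillan theorem says -(1/k) log2 P(block)
    converges to the entropy rate for typical sequences as k grows."""
    blocks = [tuple(seq[i:i + k]) for i in range(len(seq) - k + 1)]
    counts = Counter(blocks)
    n = len(blocks)
    h_k = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return h_k / k

random.seed(1)
seq = [1 if random.random() < 0.8 else 0 for _ in range(100_000)]
print(block_entropy_rate(seq))   # ~0.72 bits/symbol, i.e. H(0.8)
```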

  14. [Business organization theory: its potential use in the organization of the operating room].

    PubMed

    Bartz, H-J

    2005-07-01

The paradigm of patient care in the German health system is changing. The introduction of German Diagnosis Related Groups (G-DRGs), a diagnosis-related coding system, has made process-oriented thinking increasingly important. The treatment process is viewed and managed as a whole, from the admission to the discharge of the patient, and the interfaces between departments and sectors are diminished. A main objective of these measures is to render patient care more cost efficient. Within the hospital, the operating room (OR) is the most expensive cost factor, accounting for 25-50 % of the costs of a surgical patient, and is also a bottleneck in surgical patient care. Controlling the perioperative treatment process is therefore becoming more and more important. Here, business organisation theory can be a very useful tool. Especially the concepts of process organisation and process management can be applied to hospitals. Process-oriented thinking uncovers and solves typical organisational problems. Competences, responsibilities and tasks are reorganised by process orientation, and the enterprise is gradually transformed into a process-oriented system. Process management includes objective-oriented controlling of the value chain of an enterprise with regard to quality, time, costs and customer satisfaction. The quality of the process is continuously improved using process-management techniques. The main advantage of process management is consistent customer orientation, which means being aware of the customer's needs at any time during the daily routine. The performance is therefore always directed towards current market requirements. This paper presents the basics of business organisation theory and points out its potential use in the organisation of the OR.

  15. Self-organization theories and environmental management: The case of South Moresby, Canada

    NASA Astrophysics Data System (ADS)

    Grzybowski, Alex G. S.; Slocombe, D. Scott

    1988-07-01

This article presents a new approach to the analysis and management of large-scale societal problems with complex ecological, economic, and social dimensions. The approach is based on the theory of self-organizing systems—complex, open, far-from-equilibrium systems with nonlinear dynamics. A brief overview and comparison of different self-organization theories (synergetics, self-organization theory, hypercycles, and autopoiesis) is presented in order to isolate the key characteristics of such systems. The approach is used to develop an analysis of the land-use controversy in the South Moresby area of the Queen Charlotte Islands, British Columbia, Canada. Critical variables are identified for each subsystem, classified by spatial and temporal scale, and discussed in terms of information content and internal/external origin. Eradication of sea otters, introduction of black-tailed deer, impacts of large-scale clearcut logging, sustainability of the coastal forest industry, and changing relations between native peoples and governments are discussed in detail to illustrate the system dynamics of the South Moresby “sociobiophysical” system. Finally, implications of the self-organizing sociobiophysical system view for regional analysis and management are identified.

  16. Content-oriented Approach to Organization of Theories and Its Utilization

    NASA Astrophysics Data System (ADS)

    Hayashi, Yusuke; Bourdeau, Jacqueline; Mizoguch, Riichiro

Although the relation between theory and practice is a foundation of scientific and technological development, the gap between theory and practice has been widening in recent years. This gap carries a risk of distrust of science and technology. Ontological engineering, as content-oriented research, is expected to contribute to the resolution of the gap. This paper presents the feasibility of organizing theoretical knowledge on ontological engineering, and of new-generation intelligent systems based on it, through an application of ontological engineering in the area of learning/instruction support. This area also suffers from the gap between theory and practice, and its resolution is strongly required. We previously proposed OMNIBUS ontology, a comprehensive ontology that covers different learning/instructional theories and paradigms, and SMARTIES, a theory-aware and standard-compliant authoring system for making learning/instructional scenarios based on OMNIBUS ontology. We believe the theory-awareness and standard-compliance bridge the gap between theory and practice because they link theories to practical use of standard technologies and enable practitioners to easily enjoy theoretical support while using standard technologies in practice. The following goals are set in order to achieve this: computers (1) understand a variety of learning/instructional theories based on an organization of them, (2) utilize that understanding to help authors make learning/instructional scenarios and (3) make such theoretically sound scenarios interoperable within the framework of standard technologies. This paper suggests an ontological engineering solution to the achievement of these three goals. Although the evaluation is far from complete in terms of practical use, we believe that the results of this study address high-level technical challenges from the viewpoint of the current state of the art in the research area

  17. Assessing governance theory and practice in health-care organizations: a survey of UK hospices.

    PubMed

    Chambers, Naomi; Benson, Lawrence; Boyd, Alan; Girling, Jeff

    2012-05-01

    This paper sets out a theoretical framework for analyzing board governance, and describes an empirical study of corporate governance practices in a subset of non-profit organizations (hospices in the UK). It examines how practices in hospice governance compare with what is known about effective board working. We found that key strengths of hospice boards included a strong focus on the mission and the finances of the organizations, and common weaknesses included a lack of involvement in strategic matters and a lack of confidence, and some nervousness about challenging the organization on the quality of clinical care. Finally, the paper offers suggestions for theoretical development particularly in relation to board governance in non-profit organizations. It develops an engagement theory for boards which comprises a triadic proposition of high challenge, high support and strong grip.

  18. Theory of the field-effect mobility in amorphous organic transistors

    NASA Astrophysics Data System (ADS)

    Vissenberg, M. C. J. M.; Matters, M.

    1998-05-01

The field-effect mobility in an organic thin-film transistor is studied theoretically. From a percolation model of hopping between localized states and a transistor model, an analytic expression for the field-effect mobility is obtained. The theory is applied to describe the experiments by Brown et al. [Synth. Met. 88, 37 (1997)] on solution-processed amorphous organic transistors, made from a polymer (polythienylene vinylene) and from a small molecule (pentacene). Good agreement is obtained with respect to both the gate voltage and the temperature dependence of the mobility.

  19. Insights into the organization of biochemical regulatory networks using graph theory analyses.

    PubMed

    Ma'ayan, Avi

    2009-02-27

    Graph theory has been a valuable mathematical modeling tool to gain insights into the topological organization of biochemical networks. There are two types of insights that may be obtained by graph theory analyses. The first provides an overview of the global organization of biochemical networks; the second uses prior knowledge to place results from multivariate experiments, such as microarray data sets, in the context of known pathways and networks to infer regulation. Using graph analyses, biochemical networks are found to be scale-free and small-world, indicating that these networks contain hubs, which are proteins that interact with many other molecules. These hubs may interact with many different types of proteins at the same time and location or at different times and locations, resulting in diverse biological responses. Groups of components in networks are organized in recurring patterns termed network motifs such as feedback and feed-forward loops. Graph analysis revealed that negative feedback loops are less common and are present mostly in proximity to the membrane, whereas positive feedback loops are highly nested in an architecture that promotes dynamical stability. Cell signaling networks have multiple pathways from some input receptors and few from others. Such topology is reminiscent of a classification system. Signaling networks display a bow-tie structure indicative of funneling information from extracellular signals and then dispatching information from a few specific central intracellular signaling nexuses. These insights show that graph theory is a valuable tool for gaining an understanding of global regulatory features of biochemical networks.
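
    As a concrete illustration of the motif analysis described above, the sketch below counts feed-forward loops (a regulates b, b regulates c, and a also regulates c) in a small directed graph with networkx; the toy network is invented for the example and stands in for a real regulatory network.

```python
import networkx as nx

def count_feed_forward_loops(g):
    """Count feed-forward loop motifs (a->b, b->c, and a->c) in a digraph
    without self-loops, by closing each edge (a, b) with common targets."""
    count = 0
    for a, b in g.edges():
        # any node c that both a and b point to completes a feed-forward loop
        count += len(set(g.successors(a)) & set(g.successors(b)))
    return count

# toy regulatory network
g = nx.DiGraph([("A", "B"), ("B", "C"), ("A", "C"), ("C", "D"), ("A", "D")])
print(count_feed_forward_loops(g))  # 2: A->B->C with A->C, A->C->D with A->D
```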

  20. Geminate electron-hole recombination in organic photovoltaic cells. A semi-empirical theory.

    PubMed

    Wojcik, Mariusz; Nowak, Artur; Seki, Kazuhiko

    2017-02-07

    We propose a semi-empirical theory which describes the geminate electron-hole separation probability in both homogeneous systems and donor-acceptor heterojunction systems applicable in organic photovoltaics. The theory is based on the results of extensive simulation calculations, which were carried out using various lattice models of the medium and different charge-carrier hopping mechanisms, over the parameter ranges typical for organic solar cells. It is found that the electron-hole separation probability can be conveniently described in terms of measurable parameters by a formula whose functional form is derived from the existing recombination theories, and which contains only one empirical parameter. For homogeneous systems, this parameter is determined by the structure of the medium and only weakly depends on the charge-carrier hopping mechanism. In the case of donor-acceptor heterojunction systems, this empirical parameter shows a simple power-law dependence on the product of the dielectric constant and inter-molecular contact distance. We also study the effect of heterojunction structure on the electron-hole separation probability and show that this probability decreases with increasing roughness of the heterojunction. By analyzing the simulation results obtained for systems under the influence of an external electric field, we find that the field effect on the electron-hole separation probability in donor-acceptor heterojunction systems is weaker than in homogeneous systems. We also describe this field effect by a convenient empirical formula.
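
    A stripped-down version of the kind of lattice simulation such a theory is fitted to can be written in a few lines. The sketch below uses a 1D lattice with Miller-Abrahams-type hops in the pair's Coulomb potential; the lattice constant, dielectric constant, and escape radius are assumed values, and the real simulations are 3D with richer hopping mechanisms.

```python
import math
import random

random.seed(0)
kT = 0.025          # eV, room temperature
a = 1.0e-9          # m, lattice constant (assumed)
eps_r = 3.5         # relative permittivity, typical for organics (assumed)
q2_4pie0 = 1.44e-9  # e^2/(4*pi*eps0) in eV*m

def coulomb(i):
    """Coulomb binding energy (eV) of the electron i sites from the hole."""
    return -q2_4pie0 / (eps_r * i * a)

def escape_probability(n_sep=50, trials=20_000):
    """Monte Carlo estimate of the geminate separation probability:
    recombination is assumed instantaneous at contact (site 1), and the
    pair counts as separated once the electron reaches site n_sep."""
    escapes = 0
    for _ in range(trials):
        i = 2                       # pair starts one hop from contact
        while 1 < i < n_sep:
            # Miller-Abrahams-type relative rates toward / away from the hole
            rates = []
            for j in (i - 1, i + 1):
                dE = coulomb(j) - coulomb(i)
                rates.append(math.exp(-dE / kT) if dE > 0 else 1.0)
            i += -1 if random.random() < rates[0] / sum(rates) else 1
        escapes += (i == n_sep)
    return escapes / trials

print(escape_probability())
```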

  1. Geminate electron-hole recombination in organic photovoltaic cells. A semi-empirical theory

    NASA Astrophysics Data System (ADS)

    Wojcik, Mariusz; Nowak, Artur; Seki, Kazuhiko

    2017-02-01

    We propose a semi-empirical theory which describes the geminate electron-hole separation probability in both homogeneous systems and donor-acceptor heterojunction systems applicable in organic photovoltaics. The theory is based on the results of extensive simulation calculations, which were carried out using various lattice models of the medium and different charge-carrier hopping mechanisms, over the parameter ranges typical for organic solar cells. It is found that the electron-hole separation probability can be conveniently described in terms of measurable parameters by a formula whose functional form is derived from the existing recombination theories, and which contains only one empirical parameter. For homogeneous systems, this parameter is determined by the structure of the medium and only weakly depends on the charge-carrier hopping mechanism. In the case of donor-acceptor heterojunction systems, this empirical parameter shows a simple power-law dependence on the product of the dielectric constant and inter-molecular contact distance. We also study the effect of heterojunction structure on the electron-hole separation probability and show that this probability decreases with increasing roughness of the heterojunction. By analyzing the simulation results obtained for systems under the influence of an external electric field, we find that the field effect on the electron-hole separation probability in donor-acceptor heterojunction systems is weaker than in homogeneous systems. We also describe this field effect by a convenient empirical formula.

  2. The Mosaic Theory Revisited: Common Molecular Mechanisms Coordinating Diverse Organ and Cellular Events in Hypertension

    PubMed Central

    Harrison, David G.

    2012-01-01

    Over 60 years ago, Dr. Irvine Page proposed the Mosaic Theory of hypertension, which states that many factors, including genetics, environment, adaptive, neural, mechanical and hormonal perturbations interdigitate to raise blood pressure. In the past two decades, it has become clear that common molecular and cellular events in various organs underlie many features of the Mosaic Theory. Two of these are the production of reactive oxygen species (ROS) and inflammation. These factors increase neuronal firing in specific brain centers, increase sympathetic outflow, alter vascular tone and morphology and promote sodium retention in the kidney. Moreover, factors such as genetics and environment contribute to oxidant generation and inflammation. Other common cellular signals, including calcium signaling and endoplasmic reticulum stress are similarly perturbed in different cells in hypertension and contribute to components of Dr. Page’s theory. Thus, Dr. Page’s Mosaic Theory formed a framework for future studies of molecular and cellular signals in the context of hypertension, and has greatly aided our understanding of this complex disease. PMID:23321405

  3. Using organization theory to understand the determinants of effective implementation of worksite health promotion programs.

    PubMed

    Weiner, Bryan J; Lewis, Megan A; Linnan, Laura A

    2009-04-01

The field of worksite health promotion has moved toward the development and testing of comprehensive programs that target health behaviors with interventions operating at multiple levels of influence. Yet, observational and process evaluation studies indicate that such programs are challenging for worksites to implement effectively. Research has identified several organizational factors that promote or inhibit effective implementation of comprehensive worksite health promotion programs. However, no integrated theory of implementation has emerged from this research. This article describes a theory of the organizational determinants of effective implementation of comprehensive worksite health promotion programs. The model is adapted from theory and research on the implementation of complex innovations in manufacturing, education and health care settings. The article uses the Working Well Trial to illustrate the model's theoretical constructs. Although the article focuses on comprehensive worksite health promotion programs, the conceptual model may also apply to other types of complex health promotion programs. The result is an organization-level theory of the determinants of effective implementation of worksite health promotion programs.

  4. Predicting behavioural responses to novel organisms: state-dependent detection theory.

    PubMed

    Trimmer, Pete C; Ehlman, Sean M; Sih, Andrew

    2017-01-25

Human activity alters natural habitats for many species. Understanding variation in animals' behavioural responses to these changing environments is critical. We show how signal detection theory can be used within a wider framework of state-dependent modelling to predict behavioural responses to a major environmental change: novel, exotic species. We allow thresholds for action to be a function of reserves, and demonstrate how optimal thresholds can be calculated. We term this framework 'state-dependent detection theory' (SDDT). We focus on behavioural and fitness outcomes when animals continue to use formerly adaptive thresholds following environmental change. In a simple example, we show that exposure to novel animals which appear dangerous but are actually safe (e.g. ecotourists) can have catastrophic consequences for 'prey' (organisms that respond as if the new organisms are predators), significantly increasing mortality even when the novel species is not predatory. SDDT also reveals that the effect on reproduction can be greater than the effect on lifespan. We investigate factors that influence the effect of novel organisms, and address the potential for behavioural adjustments (via evolution or learning) to recover otherwise reduced fitness. Although effects of environmental change are often difficult to predict, we suggest that SDDT provides a useful route ahead.
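
    State-dependent thresholds of the kind described here can be computed by backward induction. In the following sketch every parameter (predator frequency, cue distributions, lethality, horizon) is an invented toy value, not one from the paper; the point is that the optimal flight threshold comes out as a function of reserves.

```python
import numpy as np
from scipy.stats import norm

# Toy model: each period an animal sees a cue c; predators (prob P_PRED)
# give cues ~N(1, 1), safe encounters ~N(0, 1).  Fleeing costs one unit of
# reserves; foraging gains one but is lethal (prob D) if a predator is there.
P_PRED, D, R_MAX, T = 0.1, 0.5, 10, 50
CUES = np.linspace(-3, 4, 141)
f_pred = norm.pdf(CUES, 1.0)
f_safe = norm.pdf(CUES, 0.0)
post = P_PRED * f_pred / (P_PRED * f_pred + (1 - P_PRED) * f_safe)
p_cue = P_PRED * f_pred + (1 - P_PRED) * f_safe
p_cue /= p_cue.sum()                       # cue distribution for expectations

V = np.zeros(R_MAX + 1)
V[1:] = 1.0                                # terminal fitness: alive, reserves > 0
thresholds = np.zeros((T, R_MAX + 1))
for t in range(T - 1, -1, -1):             # backward induction over time
    V_new = np.zeros_like(V)
    for r in range(1, R_MAX + 1):
        v_flee = V[r - 1]                                  # safe, costs energy
        v_forage = (1 - post * D) * V[min(r + 1, R_MAX)]   # risky, gains energy
        act_flee = v_flee > v_forage                       # per-cue decision
        # the lowest cue at which fleeing is optimal = the flight threshold
        thresholds[t, r] = CUES[act_flee.argmax()] if act_flee.any() else np.inf
        V_new[r] = np.sum(p_cue * np.maximum(v_flee, v_forage))
    V = V_new

print("flight thresholds at t=0 by reserve level:", thresholds[0, 1:].round(2))
```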

  5. Are the Somatic Mutation and Tissue Organization Field Theories of Carcinogenesis Incompatible?

    PubMed Central

    Rosenfeld, Simon

    2013-01-01

    Two drastically different approaches to understanding the forces driving carcinogenesis have crystallized through years of research. These are the somatic mutation theory (SMT) and the tissue organization field theory (TOFT). The essence of SMT is that cancer is derived from a single somatic cell that has successively accumulated multiple DNA mutations, and that those mutations occur on genes which control cell proliferation and cell cycle. Thus, according to SMT, neoplastic lesions are the results of DNA-level events. Conversely, according to TOFT, carcinogenesis is primarily a problem of tissue organization: carcinogenic agents destroy the normal tissue architecture thus disrupting cell-to-cell signaling and compromising genomic integrity. Hence, in TOFT the DNA mutations are the effect, and not the cause, of the tissue-level events. Cardinal importance of successful resolution of the TOFT versus SMT controversy dwells in the fact that, according to SMT, cancer is a unidirectional and mostly irreversible disease; whereas, according to TOFT, it is curable and reversible. In this paper, our goal is to outline a plausible scenario in which TOFT and SMT can be reconciled using the framework and concepts of the self-organized criticality (SOC), the principle proven to be extremely fruitful in a wide range of disciplines pertaining to natural phenomena, to biological communities, to large-scale social developments, to technological networks, and to many other subjects of research. PMID:24324325

  6. Are the somatic mutation and tissue organization field theories of carcinogenesis incompatible?

    PubMed

    Rosenfeld, Simon

    2013-01-01

    Two drastically different approaches to understanding the forces driving carcinogenesis have crystallized through years of research. These are the somatic mutation theory (SMT) and the tissue organization field theory (TOFT). The essence of SMT is that cancer is derived from a single somatic cell that has successively accumulated multiple DNA mutations, and that those mutations occur on genes which control cell proliferation and cell cycle. Thus, according to SMT, neoplastic lesions are the results of DNA-level events. Conversely, according to TOFT, carcinogenesis is primarily a problem of tissue organization: carcinogenic agents destroy the normal tissue architecture thus disrupting cell-to-cell signaling and compromising genomic integrity. Hence, in TOFT the DNA mutations are the effect, and not the cause, of the tissue-level events. Cardinal importance of successful resolution of the TOFT versus SMT controversy dwells in the fact that, according to SMT, cancer is a unidirectional and mostly irreversible disease; whereas, according to TOFT, it is curable and reversible. In this paper, our goal is to outline a plausible scenario in which TOFT and SMT can be reconciled using the framework and concepts of the self-organized criticality (SOC), the principle proven to be extremely fruitful in a wide range of disciplines pertaining to natural phenomena, to biological communities, to large-scale social developments, to technological networks, and to many other subjects of research.

  7. Decision-making regarding organ donation in Korean adults: A grounded-theory study.

    PubMed

    Yeun, Eun Ja; Kwon, Young Mi; Kim, Jung A

    2015-06-01

The aim of this study was to identify the hidden patterns of behavior leading toward the decision to donate organs. Thirteen registrants at the Association for Organ Sharing in Korea were recruited. Data were collected using in-depth interviews and the interview transcripts were analyzed using Glaserian grounded-theory methodology. The main problem of participants was "body attachment" and the core category (management process) was determined to be "pursuing life." The theme consisted of four phases, which were: "hesitating," "investigating," "releasing," and "re-discovering." Therefore, to increase organ donations, it is important to find a strategy that will create positive attitudes about organ donation through education and public relations. These results explain and provide a deeper understanding of the main problem that Korean people have about organ donation and their management of decision-making processes. These findings can help care providers to facilitate the decision-making process and respond to public needs while taking into account the sociocultural context within which decisions are made.

  8. Benchmarking DFT and semi-empirical methods for a reliable and cost-efficient computational screening of benzofulvene derivatives as donor materials for small-molecule organic solar cells

    NASA Astrophysics Data System (ADS)

    Tortorella, Sara; Mastropasqua Talamo, Maurizio; Cardone, Antonio; Pastore, Mariachiara; De Angelis, Filippo

    2016-02-01

    A systematic computational investigation on the optical properties of a group of novel benzofulvene derivatives (Martinelli 2014 Org. Lett. 16 3424-7), proposed as possible donor materials in small molecule organic photovoltaic (smOPV) devices, is presented. A benchmark evaluation against experimental results on the accuracy of different exchange and correlation functionals and semi-empirical methods in predicting both reliable ground state equilibrium geometries and electronic absorption spectra is carried out. The benchmark of the geometry optimization level indicated that the best agreement with x-ray data is achieved by using the B3LYP functional. Concerning the optical gap prediction, we found that, among the employed functionals, MPW1K provides the most accurate excitation energies over the entire set of benzofulvenes. Similarly reliable results were also obtained for range-separated hybrid functionals (CAM-B3LYP and wB97XD) and for global hybrid methods incorporating a large amount of non-local exchange (M06-2X and M06-HF). Density functional theory (DFT) hybrids with a moderate (about 20-30%) extent of Hartree-Fock exchange (HFexc) (PBE0, B3LYP and M06) were also found to deliver HOMO-LUMO energy gaps which compare well with the experimental absorption maxima, thus representing a valuable alternative for a prompt and predictive estimation of the optical gap. The possibility of using completely semi-empirical approaches (AM1/ZINDO) is also discussed.
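
    For readers who want to reproduce this style of benchmark at toy scale, the sketch below runs a ground-state DFT calculation and a Tamm-Dancoff (TDA) excitation calculation with PySCF. The molecule (formaldehyde), basis set and functional are placeholders; the paper's benzofulvene geometries and full functional set are not reproduced here.

```python
from pyscf import gto, dft, tddft

# small stand-in chromophore; swap in your own geometry, basis and functional
mol = gto.M(atom="O 0 0 0; C 0 0 1.21; H 0 0.94 1.79; H 0 -0.94 1.79",
            basis="def2-svp")
mf = dft.RKS(mol)
mf.xc = "b3lyp"          # e.g. "camb3lyp" or "m062x" to benchmark functionals
mf.kernel()

# HOMO-LUMO gap as a cheap first estimate of the optical gap (eV)
homo = mf.mo_energy[mol.nelectron // 2 - 1]
lumo = mf.mo_energy[mol.nelectron // 2]
print("HOMO-LUMO gap (eV):", (lumo - homo) * 27.2114)

# lowest TD-DFT (TDA) excitation energies
td = tddft.TDA(mf)
td.nstates = 3
td.kernel()
print("TDA excitations (eV):", td.e * 27.2114)
```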

  9. The higher disinfectant resistance of nosocomial isolates of Klebsiella oxytoca: how reliable are indicator organisms in disinfectant testing?

    PubMed

    Gebel, J; Sonntag, H-G; Werner, H-P; Vacata, V; Exner, M; Kistemann, T

    2002-04-01

    The Children's Clinic in Giessen, Germany recently reported several severe infections with Klebsiella oxytoca resulting in deaths of two neonates. The putative source of the infections was a contaminated infusion solution. The resistance to disinfectant of the K. oxytoca isolates was investigated in three independent laboratories and was indeed found to be significantly increased. Comparative tests with standard strains of K. oxytoca and other recommended bacterial surrogates showed the disinfection procedures used were fully effective. The higher resistance of the nosocomial isolates may have developed due to improper handling and storage of the cleaning utensils. This report describes the events and draws conclusions concerning the use of disinfectants, the treatment of cleaning utensils, the reliability of procedures for testing disinfectants, and suggests additional measures.

  10. 18 CFR 39.11 - Reliability reports.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

ELECTRIC RELIABILITY ORGANIZATION; AND PROCEDURES FOR THE ESTABLISHMENT, APPROVAL, AND ENFORCEMENT OF ELECTRIC RELIABILITY STANDARDS, 18 CFR Part 39 (2010-04-01), § 39.11 Reliability reports. (a) The Electric Reliability Organization shall...

  11. Did Geomagnetic Activity Challenge Electric Power Reliability During Solar Cycle 23? Evidence from the PJM Regional Transmission Organization in North America

    NASA Technical Reports Server (NTRS)

Forbes, Kevin F.; St. Cyr, Chris

    2012-01-01

    During solar cycle 22, a very intense geomagnetic storm on 13 March 1989 contributed to the collapse of the Hydro-Quebec power system in Canada. This event clearly demonstrated that geomagnetic storms have the potential to lead to blackouts. This paper addresses whether geomagnetic activity challenged power system reliability during solar cycle 23. Operations by PJM Interconnection, LLC (hereafter PJM), a regional transmission organization in North America, are examined over the period 1 April 2002 through 30 April 2004. During this time PJM coordinated the movement of wholesale electricity in all or parts of Delaware, Maryland, New Jersey, Ohio, Pennsylvania, Virginia, West Virginia, and the District of Columbia in the United States. We examine the relationship between a proxy of geomagnetically induced currents (GICs) and a metric of challenged reliability. In this study, GICs are proxied using magnetometer data from a geomagnetic observatory located just outside the PJM control area. The metric of challenged reliability is the incidence of out-of-economic-merit order dispatching due to adverse reactive power conditions. The statistical methods employed make it possible to disentangle the effects of GICs on power system operations from purely terrestrial factors. The results of the analysis indicate that geomagnetic activity can significantly increase the likelihood that the system operator will dispatch generating units based on system stability considerations rather than economic merit.
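
    The statistical core of such a study, relating a GIC proxy to the odds of out-of-merit-order dispatch while controlling for terrestrial factors, can be sketched as a logistic regression. The data below are synthetic stand-ins with invented coefficients and distributions, since the PJM and magnetometer series are not reproduced here.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 5000
db_dt = rng.gamma(2.0, 5.0, n)             # nT/min, |dB/dt| proxy for GICs
load = rng.normal(0.0, 1.0, n)             # standardized system load (terrestrial)
logit = -4.0 + 0.05 * db_dt + 0.5 * load   # assumed true relationship
y = rng.random(n) < 1 / (1 + np.exp(-logit))   # out-of-merit dispatch flag

X = sm.add_constant(np.column_stack([db_dt, load]))
fit = sm.Logit(y, X).fit(disp=0)
print(fit.summary(xname=["const", "dB/dt", "load"]))
```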

  12. Making Reliability Arguments in Classrooms

    ERIC Educational Resources Information Center

    Parkes, Jay; Giron, Tilia

    2006-01-01

Reliability methodology needs to evolve, as validity has done, into an argument supported by theory and empirical evidence. Nowhere is the inadequacy of current methods more visible than in classroom assessment. Reliability arguments would also permit additional methodologies for evidencing reliability in classrooms. It would liberalize methodology…

  13. The Mochi project: a field theory approach to plasma dynamics and self-organization

    NASA Astrophysics Data System (ADS)

    You, Setthivoine; von der Linden, Jens; Lavine, Eric Sander; Card, Alexander; Carroll, Evan

    2016-10-01

The Mochi project is designed to study the interaction between plasma flows and magnetic fields from the point-of-view of canonical flux tubes. The Mochi Labjet experiment is being commissioned after achieving first plasma. Analytical and numerical tools are being developed to visualize canonical flux tubes. One analytical tool described here is a field theory approach to plasma dynamics and self-organization. A redefinition of the Lagrangian of a multi-particle system in fields reformulates the single-particle, kinetic, and fluid equations governing fluid and plasma dynamics as a single set of generalized Maxwell's equations and Ohm's law for canonical force-fields. The Lagrangian includes new terms representing the coupling between the motion of particle distributions, between distributions and electromagnetic fields, with relativistic contributions. The formulation shows that the concepts of self-organization and canonical helicity transport are applicable across single-particle, kinetic, and fluid regimes, at classical and relativistic scales. The theory gives the basis for comparing canonical helicity change to energy change in general systems. This work is supported by US DOE Grant DE-SC0010340.

  14. Firm Size, a Self-Organized Critical Phenomenon: Evidence from the Dynamical Systems Theory

    NASA Astrophysics Data System (ADS)

    Chandra, Akhilesh

This research draws upon a recent innovation in the dynamical systems literature called the theory of self-organized criticality (SOC) (Bak, Tang, and Wiesenfeld 1988) to develop a computational model of a firm's size by relating its internal and the external sub-systems. As a holistic paradigm, the theory of SOC implies that a firm as a composite system of many degrees of freedom naturally evolves to a critical state in which a minor event starts a chain reaction that can affect either a part or the system as a whole. Thus, the global features of a firm cannot be understood by analyzing its individual parts separately. The causal framework builds upon a constant capital resource to support a volume of production at the existing level of efficiency. The critical size is defined as the production level at which the average product of a firm's factors of production attains its maximum value. The non-linearity is inferred by a change in the nature of relations at the border of criticality, between size and the two performance variables, viz., the operating efficiency and the financial efficiency. The effect of breaching the critical size is examined on the stock price reactions. Consistent with the theory of SOC, it is hypothesized that the temporal response of a firm breaching the level of critical size should behave as a flicker noise (1/f) process. The flicker noise is characterized by correlations extended over a wide range of time scales, indicating some sort of cooperative effect among a firm's degrees of freedom. It is further hypothesized that a firm's size evolves to a spatial structure with scale-invariant, self-similar (fractal) properties. The system is said to be self-organized inasmuch as it naturally evolves to the state of criticality without any detailed specifications of the initial conditions. In this respect, the critical state is an attractor of the firm's dynamics. Another set of hypotheses examines the relations between the size and the

  15. 78 FR 24107 - Version 5 Critical Infrastructure Protection Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-24

    ... Electric Reliability Corporation, the Commission-certified Electric Reliability Organization. The proposed Reliability Standards, which pertain to the cyber security of the bulk electric system, represent an... Information), Office of Electric Reliability, Division of Reliability Standards and Security, Federal...

  16. The search for reliable aqueous solubility (Sw) and octanol-water partition coefficient (Kow) data for hydrophobic organic compounds; DDT and DDE as a case study

    USGS Publications Warehouse

    Pontolillo, James; Eganhouse, R.P.

    2001-01-01

The accurate determination of an organic contaminant’s physico-chemical properties is essential for predicting its environmental impact and fate. Approximately 700 publications (1944–2001) were reviewed and all known aqueous solubilities (Sw) and octanol-water partition coefficients (Kow) for the organochlorine pesticide, DDT, and its persistent metabolite, DDE, were compiled and examined. Two problems are evident with the available database: 1) egregious errors in reporting data and references, and 2) poor data quality and/or inadequate documentation of procedures. The published literature (particularly the collative literature such as compilation articles and handbooks) is characterized by a preponderance of unnecessary data duplication. Numerous data and citation errors are also present in the literature. The percentage of original Sw and Kow data in compilations has decreased with time, and in the most recent publications (1994–97) it composes only 6–26 percent of the reported data. The variability of original DDT/DDE Sw and Kow data spans 2–4 orders of magnitude, and there is little indication that the uncertainty in these properties has declined over the last 5 decades. A criteria-based evaluation of DDT/DDE Sw and Kow data sources shows that 95–100 percent of the database literature is of poor or unevaluatable quality. The accuracy and reliability of the vast majority of the data are unknown due to inadequate documentation of the methods of determination used by the authors. [For example, estimates of precision have been reported for only 20 percent of experimental Sw data and 10 percent of experimental Kow data.] Computational methods for estimating these parameters have been increasingly substituted for direct or indirect experimental determination despite the fact that the data used for model development and validation may be of unknown reliability. Because of the prevalence of errors, the lack of methodological documentation, and unsatisfactory data

  17. Precise segmentation of multiple organs in CT volumes using learning-based approach and information theory.

    PubMed

    Lu, Chao; Zheng, Yefeng; Birkbeck, Neil; Zhang, Jingdan; Kohlberger, Timo; Tietjen, Christian; Boettger, Thomas; Duncan, James S; Zhou, S Kevin

    2012-01-01

In this paper, we present a novel method by incorporating information theory into the learning-based approach for automatic and accurate pelvic organ segmentation (including the prostate, bladder and rectum). We target 3D CT volumes that are generated using different scanning protocols (e.g., contrast and non-contrast, with and without implant in the prostate, various resolution and position), and the volumes come from largely diverse sources (e.g., diseased in different organs). Three key ingredients are combined to solve this challenging segmentation problem. First, marginal space learning (MSL) is applied to efficiently and effectively localize the multiple organs in the largely diverse CT volumes. Second, learning techniques with steerable features are applied for robust boundary detection, which enables handling of highly heterogeneous texture patterns. Third, a novel information theoretic scheme is incorporated into the boundary inference process. The incorporation of the Jensen-Shannon divergence further drives the mesh to the best fit of the image, and thus improves the segmentation performance. The proposed approach is tested on a challenging dataset containing 188 volumes from diverse sources. Our approach not only produces excellent segmentation accuracy, but also runs about eighty times faster than previous state-of-the-art solutions. The proposed method can be applied to CT images to provide visual guidance to physicians during computer-aided diagnosis, treatment planning and image-guided radiotherapy to treat cancers in the pelvic region.
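
    The Jensen-Shannon divergence used to drive the boundary inference is straightforward to compute from intensity histograms. A minimal sketch follows, with synthetic inside/outside histograms standing in for the CT data.

```python
import numpy as np

def js_divergence(p, q, eps=1e-12):
    """Jensen-Shannon divergence (in bits) between two histograms."""
    p = np.asarray(p, float); p = p / p.sum()
    q = np.asarray(q, float); q = q / q.sum()
    m = 0.5 * (p + q)
    def kl(a, b):
        return np.sum(a * np.log2((a + eps) / (b + eps)))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# synthetic intensity histograms inside vs. outside a candidate organ boundary
inside = np.histogram(np.random.default_rng(0).normal(60, 8, 1000),
                      bins=32, range=(0, 255))[0]
outside = np.histogram(np.random.default_rng(1).normal(120, 20, 1000),
                       bins=32, range=(0, 255))[0]
print(f"JS divergence: {js_divergence(inside, outside):.3f} bits")
```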

  18. Investigation of Multiconfigurational Short-Range Density Functional Theory for Electronic Excitations in Organic Molecules.

    PubMed

    Hubert, Mickaël; Hedegård, Erik D; Jensen, Hans Jørgen Aa

    2016-05-10

Computational methods that can accurately and effectively predict all types of electronic excitations for any molecular system are missing in the toolbox of the computational chemist. Although various Kohn-Sham density-functional methods (KS-DFT) fulfill this aim in some cases, they become inadequate when the molecule has near-degeneracies and/or low-lying double-excited states. To address these issues we have recently proposed multiconfiguration short-range density functional theory (MC-srDFT) as a new tool in the toolbox. While initial applications for systems with multireference character and double excitations have been promising, it is nevertheless important that the accuracy of MC-srDFT is at least comparable to the best KS-DFT methods also for organic molecules that are typically of single-reference character. In this paper we therefore systematically investigate the performance of MC-srDFT for a selected benchmark set of electronic excitations of organic molecules, covering the most common types of organic chromophores. This investigation confirms the expectation that the MC-srDFT method is accurate for a broad range of excitations and comparable to accurate wave function methods such as CASPT2, NEVPT2, and the coupled cluster based CC2 and CC3.

  19. Adsorptive desulfurization with metal-organic frameworks: A density functional theory investigation

    NASA Astrophysics Data System (ADS)

    Chen, Zhiping; Ling, Lixia; Wang, Baojun; Fan, Huiling; Shangguan, Ju; Mi, Jie

    2016-11-01

The contribution of each fragment of metal-organic frameworks (MOFs) to the adsorption of sulfur compounds was investigated using density functional theory (DFT). The sulfur compounds involved are dimethyl sulfide (CH3SCH3), ethyl mercaptan (CH3CH2SH) and hydrogen sulfide (H2S). MOFs with different organic ligands (NH2-BDC, BDC and NDC), metal center structures (M, M-M and M3O) and metal ions (Zn, Cu and Fe) were used to study their effects on sulfur species adsorption. The results revealed that MOFs with coordinatively unsaturated sites (CUS) have the strongest binding strength with sulfur compounds; MOFs with the NH2-BDC substituent-bearing ligand come second, followed by those with saturated metal centers, and organic ligands without a substituent group have the weakest adsorption strength. Moreover, it was also found that, among the different metal ions (Fe, Zn and Cu), MOFs with unsaturated Fe have the strongest adsorption strength for sulfur compounds. These results are consistent with our previous experimental observations, and therefore provide insights into the better design of MOFs for desulfurization applications.

  20. Investigating the self-organization of debris flows: theory, modelling, and empirical work

    NASA Astrophysics Data System (ADS)

    von Elverfeldt, Kirsten; Keiler, Margreth; Elmenreich, Wilfried; Fehárvári, István; Zhevzhyk, Sergii

    2014-05-01

Here we present the conceptual framework of an interdisciplinary project on the theory, empirics, and modelling of the self-organisation mechanisms within debris flows. Despite the fact that debris flows cause severe damage in mountainous regions such as the Alps, the process behaviour of debris flows is still not well understood. This is mainly due to the process dynamics of debris flows: erosion and material entrainment are essential for their destructive power, and because of this destructiveness it is nearly impossible to measure and observe these mechanisms in action. Hence, the interactions between channel bed and debris flow remain largely unknown whilst this knowledge is crucial for the understanding of debris flow behaviour. Furthermore, while these internal parameter interactions are changing during an event, they are at the same time governing the temporal and spatial evolution of a given event. This project aims at answering some of these unknowns by bringing theory, empirical work, and modelling of debris flows together. It especially aims at explaining why process types switch along the flow path during an event, e.g. the change from a debris flow to a hyperconcentrated flow and back. A second focus is the question of why debris flows sometimes exhibit strong erosion and sediment mobilisation during an event and at other times do not. A promising theoretical framework for the analysis of these observations is that of self-organizing systems, and especially Haken's theory of synergetics. Synergetics is an interdisciplinary theory of open systems that are characterized by many individual, yet interacting parts, resulting in spatio-temporal structures. We hypothesize that debris flows can successfully be analysed within this theoretical framework. In order to test this hypothesis, an innovative modelling approach is chosen in combination with detailed field work. In self-organising systems the interactions of the system

  1. Collection-limited theory interprets the extraordinary response of single semiconductor organic solar cells.

    PubMed

    Ray, Biswajit; Baradwaj, Aditya G; Khan, Mohammad Ryyan; Boudouris, Bryan W; Alam, Muhammad Ashraful

    2015-09-08

    The bulk heterojunction (BHJ) organic photovoltaic (OPV) architecture has dominated the literature due to its ability to be implemented in devices with relatively high efficiency values. However, a simpler device architecture based on a single organic semiconductor (SS-OPV) offers several advantages: it obviates the need to control the highly system-dependent nanoscale BHJ morphology, and therefore, would allow the use of broader range of organic semiconductors. Unfortunately, the photocurrent in standard SS-OPV devices is typically very low, which generally is attributed to inefficient charge separation of the photogenerated excitons. Here we show that the short-circuit current density from SS-OPV devices can be enhanced significantly (∼100-fold) through the use of inverted device configurations, relative to a standard OPV device architecture. This result suggests that charge generation may not be the performance bottleneck in OPV device operation. Instead, poor charge collection, caused by defect-induced electric field screening, is most likely the primary performance bottleneck in regular-geometry SS-OPV cells. We justify this hypothesis by: (i) detailed numerical simulations, (ii) electrical characterization experiments of functional SS-OPV devices using multiple polymers as active layer materials, and (iii) impedance spectroscopy measurements. Furthermore, we show that the collection-limited photocurrent theory consistently interprets typical characteristics of regular SS-OPV devices. These insights should encourage the design and OPV implementation of high-purity, high-mobility polymers, and other soft materials that have shown promise in organic field-effect transistor applications, but have not performed well in BHJ OPV devices, wherein they adopt less-than-ideal nanostructures when blended with electron-accepting materials.

  2. Collection-limited theory interprets the extraordinary response of single semiconductor organic solar cells

    PubMed Central

    Ray, Biswajit; Baradwaj, Aditya G.; Khan, Mohammad Ryyan; Boudouris, Bryan W.; Alam, Muhammad Ashraful

    2015-01-01

    The bulk heterojunction (BHJ) organic photovoltaic (OPV) architecture has dominated the literature due to its ability to be implemented in devices with relatively high efficiency values. However, a simpler device architecture based on a single organic semiconductor (SS-OPV) offers several advantages: it obviates the need to control the highly system-dependent nanoscale BHJ morphology, and therefore, would allow the use of broader range of organic semiconductors. Unfortunately, the photocurrent in standard SS-OPV devices is typically very low, which generally is attributed to inefficient charge separation of the photogenerated excitons. Here we show that the short-circuit current density from SS-OPV devices can be enhanced significantly (∼100-fold) through the use of inverted device configurations, relative to a standard OPV device architecture. This result suggests that charge generation may not be the performance bottleneck in OPV device operation. Instead, poor charge collection, caused by defect-induced electric field screening, is most likely the primary performance bottleneck in regular-geometry SS-OPV cells. We justify this hypothesis by: (i) detailed numerical simulations, (ii) electrical characterization experiments of functional SS-OPV devices using multiple polymers as active layer materials, and (iii) impedance spectroscopy measurements. Furthermore, we show that the collection-limited photocurrent theory consistently interprets typical characteristics of regular SS-OPV devices. These insights should encourage the design and OPV implementation of high-purity, high-mobility polymers, and other soft materials that have shown promise in organic field-effect transistor applications, but have not performed well in BHJ OPV devices, wherein they adopt less-than-ideal nanostructures when blended with electron-accepting materials. PMID:26290582

  3. 18 CFR 39.5 - Reliability Standards.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... competition. (d) An approved Reliability Standard or modification to a Reliability Standard shall take effect... will not defer to the Electric Reliability Organization or a Regional Entity with respect to the...

  4. Coding theory based models for protein translation initiation in prokaryotic organisms.

    PubMed

    May, Elebeoba E; Vouk, Mladen A; Bitzer, Donald L; Rosnick, David I

    2004-01-01

    Our research explores the feasibility of using communication theory, error control (EC) coding theory specifically, for quantitatively modeling the protein translation initiation mechanism. The messenger RNA (mRNA) of Escherichia coli K-12 is modeled as a noisy (errored), encoded signal and the ribosome as a minimum Hamming distance decoder, where the 16S ribosomal RNA (rRNA) serves as a template for generating a set of valid codewords (the codebook). We tested the E. coli based coding models on 5' untranslated leader sequences of prokaryotic organisms of varying taxonomical relation to E. coli including: Salmonella typhimurium LT2, Bacillus subtilis, and Staphylococcus aureus Mu50. The model identified regions on the 5' untranslated leader where the minimum Hamming distance values of translated mRNA sub-sequences and non-translated genomic sequences differ the most. These regions correspond to the Shine-Dalgarno domain and the non-random domain. Applying the EC coding-based models to B. subtilis, and S. aureus Mu50 yielded results similar to those for E. coli K-12. Contrary to our expectations, the behavior of S. typhimurium LT2, the more taxonomically related to E. coli, resembled that of the non-translated sequence group.
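
    The decoding step can be illustrated compactly: slide a window along the 5' leader and score each position by its minimum Hamming distance to a codebook of valid codewords. In the sketch below the codebook is a cartoon built around the Shine-Dalgarno consensus, not the paper's 16S rRNA-derived codebook, and the leader sequence is an invented lac-operon-style example.

```python
def hamming(a, b):
    """Number of positions at which two equal-length strings differ."""
    return sum(x != y for x, y in zip(a, b))

# cartoon codebook; the real model derives codewords from the 16S rRNA tail
CODEBOOK = ["AGGAGG", "AGGAGA", "GGAGGA", "AGGGGG"]

def min_distance_profile(utr, k=6):
    """At each offset of a k-base window over a 5' UTR, record the minimum
    Hamming distance to any codeword; low values flag candidate
    ribosome-binding (Shine-Dalgarno-like) regions."""
    return [min(hamming(utr[i:i + k], cw) for cw in CODEBOOK)
            for i in range(len(utr) - k + 1)]

utr = "UUCACACAGGAAACAGCUAUG".replace("U", "T")  # toy leader, DNA alphabet
print(min_distance_profile(utr))
```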

  5. Nuclear weapons decision-making; an application of organization theory to the mini-nuke case

    SciTech Connect

    Kangas, J.L.

    1985-01-01

This dissertation addresses the problem of constructing and developing normative theory responsive to the need for improving the quality of decision-making in nuclear weapons policy-making. Against the background of a critical evaluation of various paradigms in the literature (systems analysis and opposed-systems design, the bureaucratic politics model, and the cybernetic theory of decision), an attempt is made to design an alternative analytic framework based on the writings of numerous organization theorists such as Herbert Simon and Kenneth Arrow. The framework is applied to the case of mini-nukes, i.e., proposals in the mid-1970s to develop and deploy tens of thousands of very low-yield (sub-kiloton), miniaturized fission weapons in NATO. 'Heuristic case study' identifies the type of study undertaken in the dissertation, in contrast to the more familiar paradigmatic studies identified, for example, with the Harvard Weapons Project. Applying the analytic framework to the mini-nuke case resulted in an empirical understanding of why decision-making concerning tactical nuclear weapons has been such a complex task and why force modernization issues in particular have been so controversial and lacking in policy resolution.

  6. Coding theory based models for protein translation initiation in prokaryotic organisms.

    SciTech Connect

    May, Elebeoba Eni; Bitzer, Donald L. (North Carolina State University, Raleigh, NC); Rosnick, David I. (North Carolina State University, Raleigh, NC); Vouk, Mladen A.

    2003-03-01

    Our research explores the feasibility of using communication theory, error control (EC) coding theory specifically, for quantitatively modeling the protein translation initiation mechanism. The messenger RNA (mRNA) of Escherichia coli K-12 is modeled as a noisy (errored), encoded signal and the ribosome as a minimum Hamming distance decoder, where the 16S ribosomal RNA (rRNA) serves as a template for generating a set of valid codewords (the codebook). We tested the E. coli based coding models on 5' untranslated leader sequences of prokaryotic organisms of varying taxonomical relation to E. coli including: Salmonella typhimurium LT2, Bacillus subtilis, and Staphylococcus aureus Mu50. The model identified regions on the 5' untranslated leader where the minimum Hamming distance values of translated mRNA sub-sequences and non-translated genomic sequences differ the most. These regions correspond to the Shine-Dalgarno domain and the non-random domain. Applying the EC coding-based models to B. subtilis, and S. aureus Mu50 yielded results similar to those for E. coli K-12. Contrary to our expectations, the behavior of S. typhimurium LT2, the more taxonomically related to E. coli, resembled that of the non-translated sequence group.

  7. Simple, stable and reliable modeling of gas properties of organic working fluids in aerodynamic designs of turbomachinery for ORC and VCC

    NASA Astrophysics Data System (ADS)

    Kawakubo, T.

    2016-05-01

A simple, stable and reliable model of the real gas nature of the working fluid is required for the aerodynamic design of the turbine in the Organic Rankine Cycle and of the compressor in the Vapor Compression Cycle. Although many modern Computational Fluid Dynamics tools are capable of incorporating real gas models, simulations with such a gas model tend to be more time-consuming than those with a perfect gas model and can even become unstable when the simulation approaches the saturation boundary. Thus a perfect gas approximation is still an attractive option for conducting a design simulation stably and swiftly. In this paper, an effective method of CFD simulation with a perfect gas approximation is discussed. A method of representing the performance of the centrifugal compressor or the radial-inflow turbine by means of a set of non-dimensional performance parameters, and of translating the fictitious perfect gas result into the actual real gas performance, is presented.
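
    The translation step relies on matching non-dimensional performance groups between the perfect gas design run and the real fluid. The sketch below computes two classic groups (a corrected flow parameter and the blade-tip Mach number); it is a schematic of the general similarity idea, not the paper's specific procedure, and all numerical inputs are invented.

```python
import math

def similarity_parameters(mdot, p0, T0, N, D, gamma, R):
    """Classic non-dimensional turbomachinery groups: corrected mass flow
    m*sqrt(R*T0)/(p0*D^2) and blade-tip Mach number U/a0.  Matching these
    between gases is the usual route for carrying a design point over."""
    a0 = math.sqrt(gamma * R * T0)                 # inlet stagnation sound speed
    flow = mdot * math.sqrt(R * T0) / (p0 * D**2)  # non-dimensional flow
    mach_tip = (math.pi * N / 60.0) * D / a0       # tip speed / sound speed
    return flow, mach_tip

# invented operating point for a heavy organic working fluid
print(similarity_parameters(mdot=2.0, p0=2.0e5, T0=320.0,
                            N=30000, D=0.12, gamma=1.05, R=68.0))
```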

  8. hfAIM: A reliable bioinformatics approach for in silico genome-wide identification of autophagy-associated Atg8-interacting motifs in various organisms.

    PubMed

    Xie, Qingjun; Tzfadia, Oren; Levy, Matan; Weithorn, Efrat; Peled-Zehavi, Hadas; Van Parys, Thomas; Van de Peer, Yves; Galili, Gad

    2016-05-03

Most of the proteins that are specifically turned over by selective autophagy are recognized by the presence of short Atg8 interacting motifs (AIMs) that facilitate their association with the autophagy apparatus. Such AIMs can be identified by bioinformatics methods based on their defined degenerate consensus F/W/Y-X-X-L/I/V sequences in which X represents any amino acid. Achieving reliability and/or fidelity of the prediction of such AIMs on a genome-wide scale represents a major challenge. Here, we present a bioinformatics approach, high fidelity AIM (hfAIM), which uses additional sequence requirements (the presence of acidic amino acids and the absence of positively charged amino acids in certain positions) to reliably identify AIMs in proteins. We demonstrate that the use of the hfAIM method allows for in silico high fidelity prediction of AIMs in AIM-containing proteins (ACPs) on a genome-wide scale in various organisms. Furthermore, by using hfAIM to identify putative AIMs in the Arabidopsis proteome, we illustrate a potential contribution of selective autophagy to various biological processes. More specifically, we identified 9 peroxisomal PEX proteins that contain hfAIM motifs, among which AtPEX1, AtPEX6 and AtPEX10 possess evolutionary-conserved AIMs. Bimolecular fluorescence complementation (BiFC) results verified that AtPEX6 and AtPEX10 indeed interact with Atg8 in planta. In addition, we show that mutations occurring within or nearby hfAIMs in PEX1, PEX6 and PEX10 caused defects in the growth and development of various organisms. Taken together, the above results suggest that the hfAIM tool can be used to effectively perform genome-wide in silico screens of proteins that are potentially regulated by selective autophagy. The hfAIM system is a web tool that can be accessed at http://bioinformatics.psb.ugent.be/hfAIM/.
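
    The core AIM consensus is easy to express as a regular expression, and the extra hfAIM-style filters can then be layered on top. In the sketch below the positional rules for acidic and positively charged residues are an illustrative guess, not the published definition; the hfAIM web tool implements the real ones.

```python
import re

# core consensus [FWY]-X-X-[LIV]; lookahead allows overlapping matches
AIM_CORE = re.compile(r"(?=([FWY]..[LIV]))")

def candidate_aims(protein, flank=3):
    """Return (position, motif) pairs passing a crude hfAIM-style filter:
    at least one acidic residue (D/E) and no basic residue (K/R) in a
    window around the core motif.  The window rule is an assumption."""
    hits = []
    for m in AIM_CORE.finditer(protein):
        i = m.start(1)
        window = protein[max(0, i - flank):i + 4 + flank]
        if any(r in window for r in "DE") and not any(r in "KR" for r in window):
            hits.append((i, m.group(1)))
    return hits

print(candidate_aims("MSEEDFVAIAGDWEELQQPT"))  # [(5, 'FVAI'), (12, 'WEEL')]
```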

  9. Prediction of Charge Mobility in Amorphous Organic Materials through the Application of Hopping Theory.

    PubMed

    Lee, Choongkeun; Waterland, Robert; Sohlberg, Karl

    2011-08-09

The application of hopping theory to predict charge (hole) mobility in amorphous organic molecular materials is studied in detail. Application is made to amorphous cells of N,N'-diphenyl-N,N'-bis-(3-methylphenylene)-1,1'-diphenyl-4,4'-diamine (TPD), 1,1-bis-(4,4'-diethylaminophenyl)-4,4-diphenyl-1,3-butadiene (DEPB), N4,N4'-di(biphenyl-3-yl)-N4,N4'-diphenylbiphenyl-4,4'-diamine (mBPD), N1,N4-di(naphthalen-1-yl)-N1,N4-diphenylbenzene-1,4-diamine (NNP), and N,N'-bis[9,9-dimethyl-2-fluorenyl]-N,N'-diphenyl-9,9-dimethylfluorene-2,7-diamine (pFFA). A detailed analysis of the computation of each of the parameters in the equations for the hopping rate is presented, including studies of their convergence with respect to various numerical approximations. Based on these convergence studies, the most robust methodology is then applied to investigate the dependence of mobility on such parameters as the monomer reorganization energy, the monomer-monomer coupling, and the material density. The results give insight into what will be required to improve the accuracy of predictions of mobility in amorphous organic materials, and what factors should be controlled to develop materials with higher (or lower) charge (hole) mobility.
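
    The workhorse equation in such studies is the semi-classical Marcus hopping rate, which, combined with the Einstein relation, gives an order-of-magnitude mobility. The sketch below uses typical, assumed TPD-like parameter values rather than anything computed in the paper.

```python
import math

KB = 8.617e-5       # Boltzmann constant, eV/K
HBAR = 6.582e-16    # reduced Planck constant, eV*s

def marcus_rate(H_ab, lam, dG=0.0, T=300.0):
    """Semi-classical Marcus hopping rate between two molecules, given the
    electronic coupling H_ab and reorganization energy lam (both in eV)."""
    kT = KB * T
    pref = (2 * math.pi / HBAR) * H_ab**2 / math.sqrt(4 * math.pi * lam * kT)
    return pref * math.exp(-(dG + lam) ** 2 / (4 * lam * kT))

# order-of-magnitude zero-field mobility via the Einstein relation,
# mu = e*D/(kB*T) with D ~ k*a^2 for 1D nearest-neighbour hops
k = marcus_rate(H_ab=0.01, lam=0.3)    # assumed coupling and reorg. energy
a = 1.0e-7                             # cm, ~1 nm hop distance (assumed)
D = k * a**2                           # cm^2/s
mu = D / (KB * 300.0)                  # cm^2/(V*s); e cancels in eV units
print(f"rate = {k:.2e} s^-1, mobility ~ {mu:.2e} cm^2/(V s)")
```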

  10. FFLO strange metal and quantum criticality in two dimensions: Theory and application to organic superconductors

    NASA Astrophysics Data System (ADS)

    Piazza, Francesco; Zwerger, Wilhelm; Strack, Philipp

    2016-02-01

    Increasing the spin imbalance in superconductors can spatially modulate the gap by forming Cooper pairs with finite momentum. For large imbalances compared to the Fermi energy, the inhomogeneous FFLO superconductor ultimately becomes a normal metal. There is mounting experimental evidence for this scenario in two-dimensional (2D) organic superconductors in large in-plane magnetic fields; this is complemented by ongoing efforts to realize this scenario in coupled tubes of atomic Fermi gases with spin imbalance. Yet, a theory for the phase transition from a metal to an FFLO superconductor has not been developed so far and the universality class has remained unknown. Here we propose and analyze a spin imbalance driven quantum critical point between a 2D metal and an FFLO phase in anisotropic electron systems. We derive the effective action for electrons and bosonic FFLO pairs at this quantum phase transition. Using this action, we predict non-Fermi-liquid behavior and the absence of quasiparticles at a discrete set of hot spots on the Fermi surfaces. This results in strange power laws in thermodynamics and response functions, which are testable with existing experimental setups on 2D organic superconductors and may also serve as signatures of the elusive FFLO phase itself. The proposed universality class is distinct from previously known quantum critical metals and, because its critical fluctuations appear already in the pairing channel, a promising candidate for naked metallic quantum criticality over extended temperature ranges.

  11. Person Reliability

    ERIC Educational Resources Information Center

    Lumsden, James

    1977-01-01

    Person changes can be of three kinds: developmental trends, swells, and tremors. Person unreliability in the tremor sense (momentary fluctuations) can be estimated from person characteristic curves. Average person reliability for groups can be compared from item characteristic curves. (Author)

  12. The effect of the labile organic fraction in food waste and the substrate/inoculum ratio on anaerobic digestion for a reliable methane yield.

    PubMed

    Kawai, Minako; Nagao, Norio; Tajima, Nobuaki; Niwa, Chiaki; Matsuyama, Tatsushi; Toda, Tatsuki

    2014-04-01

    The influence of the labile organic fraction (LOF) on the anaerobic digestion of food waste was investigated at substrate/inoculum (S/I) ratios of 0.33, 0.5, 1.0, 2.0 and 4.0 g-VS(substrate)/g-VS(inoculum). Two types of substrate were used: standard food waste (Substrate 1) and standard food waste with the LOF-containing supernatant removed (Substrate 2). The highest methane yield for Substrate 1, 435 ml-CH4/g-VS, was observed at the lowest S/I ratio, while the methane yields at the other S/I ratios were 38-73% lower due to acidification. The methane yields for Substrate 2 were relatively stable under all S/I conditions, although the maximum methane yield was low compared with Substrate 1. These results show that the LOF in food waste causes acidification but also contributes to high methane yields, suggesting that a low S/I ratio (<0.33) is required to obtain a reliable methane yield from food waste compared to other organic substrates.

  13. A regulatory theory of cortical organization and its applications to robotics

    NASA Astrophysics Data System (ADS)

    Thangavelautham, Jekanthan

    2009-11-01

    Fundamental aspects of biologically-inspired regulatory mechanisms are considered in a robotics context, using artificial neural-network control systems. In organisms, regulatory mechanisms control gene expression and the adaptation of form and behavior. Traditional neural-network control architectures assume that networks of neurons are fixed and interconnected by wires; such architectures tend to be specified by a designer and face several limitations that reduce scalability and tractability for tasks with larger search spaces. The traditional way to overcome these limitations with fixed network topologies is for a designer to provide more supervision. As shown here, more supervision does not guarantee improvement during training, particularly when incorrect assumptions are made about little-known task domains. Biological organisms often require no such external intervention and have self-organized through adaptation. The artificial neural tissue (ANT) framework addresses the limitations of current neural-network architectures by modeling both wired interactions between neurons and wireless interactions through chemical diffusion fields. An evolutionary (Darwinian) selection process is used to 'breed' ANT controllers for the task at hand, and the framework facilitates the emergence of creative solutions, since only a system goal function and a generic set of basis behaviours need be defined. Regulatory mechanisms are formed dynamically within ANT through the superposition of chemical diffusion fields from multiple sources and are used to select neuronal groups. Regulation drives competition and cooperation among neuronal groups and results in areas of specialization forming within the tissue. These regulatory mechanisms are also shown to increase tractability, without requiring more supervision, using a new statistical theory developed to predict the performance characteristics of fixed network topologies. Simulations also confirm the
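
    A toy illustration of the diffusion-field mechanism described above: Gaussian "chemical" fields from several sources are superposed and thresholded to select a neuronal group. The field shape, threshold, and parameters are invented for illustration; this is not the ANT model itself.

    ```python
    import numpy as np

    def diffusion_field(grid, sources, sigma=2.0):
        """Superpose Gaussian 'chemical' fields emitted by source neurons."""
        yy, xx = np.mgrid[0:grid, 0:grid]
        field = np.zeros((grid, grid))
        for sy, sx, strength in sources:
            field += strength * np.exp(-((yy - sy) ** 2 + (xx - sx) ** 2)
                                       / (2.0 * sigma ** 2))
        return field

    # Neurons fire only where the summed field exceeds a threshold, so
    # overlapping sources dynamically select a "neuronal group".
    sources = [(5, 5, 1.0), (7, 6, 0.8), (15, 15, 0.5)]
    active = diffusion_field(20, sources) > 0.9
    print(f"{active.sum()} of {active.size} neurons selected")
    ```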

  14. Discovery of fairy circles in Australia supports self-organization theory

    PubMed Central

    Getzin, Stephan; Yizhaq, Hezi; Bell, Bronwyn; Erickson, Todd E.; Postle, Anthony C.; Katra, Itzhak; Tzuk, Omer; Zelnik, Yuval R.; Wiegand, Kerstin; Wiegand, Thorsten; Meron, Ehud

    2016-01-01

    Vegetation gap patterns in arid grasslands, such as the “fairy circles” of Namibia, are one of nature’s greatest mysteries and subject to a lively debate on their origin. They are characterized by small-scale hexagonal ordering of circular bare-soil gaps that persists uniformly at the landscape scale, forming a homogeneous distribution. Pattern-formation theory predicts that such highly ordered gap patterns should be found also in other water-limited systems across the globe, even if the mechanisms of their formation are different. Here we report that previously unknown fairy circles with the same spatial structure exist 10,000 km away from Namibia in the remote outback of Australia. Combining fieldwork, remote sensing, spatial pattern analysis, and process-based mathematical modeling, we demonstrate that these patterns emerge by self-organization, with no correlation with termite activity; the driving mechanism is a positive biomass–water feedback associated with water runoff and biomass-dependent infiltration rates. The remarkable match between the patterns of Australian and Namibian fairy circles and model results indicates that both patterns emerge from a nonuniform stationary instability, supporting a central universality principle of pattern-formation theory. Applied to the context of dryland vegetation, this principle predicts that different systems that go through the same instability type will show similar vegetation patterns even if the feedback mechanisms and resulting soil–water distributions are different, as we indeed found by comparing the Australian and the Namibian fairy-circle ecosystems. These results suggest that biomass–water feedbacks and resultant vegetation gap patterns are likely more common in remote drylands than is currently known. PMID:26976567
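
    For readers who want to experiment with the biomass-water feedback idea, the sketch below integrates a generic 1D two-field reaction-diffusion model of the kind used for dryland vegetation patterns. The equations and parameter values are illustrative stand-ins, not the model from the paper.

    ```python
    import numpy as np

    n, dx, dt = 200, 0.5, 0.01
    b = 0.5 + 0.01 * np.random.rand(n)  # biomass
    w = np.ones(n)                      # soil water
    rain, loss, mort, Db, Dw = 1.2, 1.0, 0.5, 0.1, 5.0

    def lap(u):
        """Periodic 1D Laplacian by central differences."""
        return (np.roll(u, 1) - 2.0 * u + np.roll(u, -1)) / dx**2

    for _ in range(20000):
        growth = w * b * (1.0 - b)  # water-limited logistic growth
        uptake = w * b              # biomass-dependent water uptake
        b += dt * (growth - mort * b + Db * lap(b))
        w += dt * (rain - loss * w - uptake + Dw * lap(w))

    print("biomass range:", b.min(), b.max())
    ```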

  15. Dispersion corrected Hartree-Fock and density functional theory for organic crystal structure prediction.

    PubMed

    Brandenburg, Jan Gerit; Grimme, Stefan

    2014-01-01

    We present and evaluate dispersion corrected Hartree-Fock (HF) and Density Functional Theory (DFT) based quantum chemical methods for organic crystal structure prediction. The necessity of correcting for missing long-range electron correlation, also known as van der Waals (vdW) interaction, is pointed out and some methodological issues such as inclusion of three-body dispersion terms are discussed. One of the most efficient and widely used methods is the semi-classical dispersion correction D3. Its applicability for the calculation of sublimation energies is investigated for the benchmark set X23 consisting of 23 small organic crystals. For PBE-D3 the mean absolute deviation (MAD) is below the estimated experimental uncertainty of 1.3 kcal/mol. For two larger π-systems, the equilibrium crystal geometry is investigated and very good agreement with experimental data is found. Since these calculations are carried out with huge plane-wave basis sets, they are rather time-consuming and routinely applicable only to systems with fewer than about 200 atoms in the unit cell. Aiming at crystal structure prediction, which involves screening of many structures, a pre-sorting with faster methods is mandatory. Small, atom-centered basis sets can speed up the computation significantly, but they suffer greatly from basis set errors. We present the recently developed geometrical counterpoise correction gCP. It is a fast semi-empirical method which corrects for most of the inter- and intramolecular basis set superposition error. For HF calculations with nearly minimal basis sets, we additionally correct for short-range basis incompleteness. We combine all three corrections in the scheme denoted HF-3c, which performs very well for the X23 sublimation energies with an MAD of only 1.5 kcal/mol, close to the huge basis set DFT-D3 result.
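
    The functional form of a D3-type pairwise correction with Becke-Johnson damping is compact enough to sketch. The coefficients below are placeholders; the real D3 method interpolates C6 coefficients from tabulated reference values and uses Grimme's fitted a1/a2 damping parameters, none of which are reproduced here.

    ```python
    import numpy as np

    def disp_energy(coords, c6, c8, a1=0.4, a2=4.8):
        """Pairwise -C6/R^6 - C8/R^8 dispersion with Becke-Johnson damping
        (atomic units). Functional form only; not Grimme's parameterization."""
        e, n = 0.0, len(coords)
        for i in range(n):
            for j in range(i + 1, n):
                r = np.linalg.norm(coords[i] - coords[j])
                r0 = np.sqrt(c8[i][j] / c6[i][j])  # BJ cutoff radius
                damp = a1 * r0 + a2
                e -= c6[i][j] / (r**6 + damp**6)
                e -= c8[i][j] / (r**8 + damp**8)
        return e

    # Toy dimer: two 'atoms' 6 bohr apart with made-up coefficients.
    coords = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 6.0]])
    c6 = [[0.0, 40.0], [40.0, 0.0]]
    c8 = [[0.0, 1200.0], [1200.0, 0.0]]
    print(f"E_disp = {disp_energy(coords, c6, c8):.6f} hartree")
    ```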

  16. Species Detection and Identification in Sexual Organisms Using Population Genetic Theory and DNA Sequences

    PubMed Central

    Birky, C. William

    2013-01-01

    Phylogenetic trees of DNA sequences of a group of specimens may include clades of two kinds: those produced by stochastic processes (random genetic drift) within a species, and clades that represent different species. The ratio of the mean pairwise sequence difference between a pair of clades (K) to the mean pairwise sequence difference within a clade (θ) can be used to determine whether the clades are samples from different species (K/θ≥4) or the same species (K/θ<4) with probability ≥0.95. Previously I applied this criterion to delimit species of asexual organisms. Here I use data from the literature to show how it can also be applied to delimit sexual species using four groups of sexual organisms as examples: ravens, spotted leopards, sea butterflies, and liverworts. Mitochondrial or chloroplast genes are used because these segregate earlier during speciation than most nuclear genes and hence detect earlier stages of speciation. In several cases the K/θ ratio was greater than 4, confirming the original authors' intuition that the clades were sufficiently different to be assigned to different species. But the K/θ ratio split each of two liverwort species into two evolutionary species, and showed that support for the distinction between the common and Chihuahuan raven species is weak. I also discuss some possible sources of error in using the K/θ ratio; the most significant one would be cases where males migrate between different populations but females do not, making the use of maternally inherited organelle genes problematic. The K/θ ratio must be used with some caution, like all other methods for species delimitation. Nevertheless, it is a simple theory-based quantitative method for using DNA sequences to make rigorous decisions about species delimitation in sexual as well as asexual eukaryotes. PMID:23308113
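
    The K/θ criterion itself is straightforward to compute from aligned sequences. The sketch below uses simple p-distances and takes the larger within-clade diversity as θ, a conservative choice; the paper's exact distance measure and convention may differ.

    ```python
    from itertools import combinations

    def p_distance(a, b):
        """Proportion of differing sites between two aligned sequences."""
        return sum(x != y for x, y in zip(a, b)) / len(a)

    def mean_within(clade):
        pairs = list(combinations(clade, 2))
        return sum(p_distance(a, b) for a, b in pairs) / len(pairs)

    def mean_between(c1, c2):
        return sum(p_distance(a, b) for a in c1 for b in c2) / (len(c1) * len(c2))

    def k_over_theta(c1, c2):
        """K/theta >= 4 suggests distinct species (per the abstract)."""
        theta = max(mean_within(c1), mean_within(c2))  # conservative choice
        return mean_between(c1, c2) / theta

    clade_a = ["ACGTACGTAC", "ACGTACGTAA", "ACGTACGTAC"]
    clade_b = ["TCGAACGTCC", "TCGAACGTCA"]
    ratio = k_over_theta(clade_a, clade_b)
    print(f"K/theta = {ratio:.2f} ->",
          "different species" if ratio >= 4 else "same species")
    ```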

  17. The organic surface of 5145 Pholus: Constraints set by scattering theory

    NASA Technical Reports Server (NTRS)

    Wilson, Peter D.; Sagan, Carl; Thompson, W. Reid

    1994-01-01

    No known body in the Solar System has a spectrum redder than that of object 5145 Pholus. We use Hapke scattering theory and optical constants measured in this laboratory to examine the ability of mixtures of a number of organic solids and ices to reproduce the observed spectrum and phase variation. The primary materials considered are poly-HCN, kerogen, Murchison organic extract, Titan tholin, ice tholin, and water ice. In a computer grid search of over 10 million models, we find an intraparticle mixture of 15% Titan tholin, 10% poly-HCN, and 75% water ice with 10-micrometer particles to provide an excellent fit. Replacing water ice with ammonia ice improves the fits significantly, while using a pure hydrocarbon tholin, Tholin alpha, instead of Titan tholin makes only modest improvements. All acceptable fits require Titan tholin or some comparable material to provide the steep slope in the visible, and poly-HCN or some comparable material to provide strong absorption in the near-infrared. A pure Titan tholin surface with 16-micrometer particles, as well as all acceptable Pholus models, fits the present spectrophotometric data for the transplutonian object 1992 QB1. The feasibility of gas-phase chemistry to generate material like Titan tholin on such small objects is examined. An irradiated transient atmosphere arising from sublimating ices may generate at most a few centimeters of tholin over the lifetime of the Solar System, but this is insignificant compared to the expected lag deposit of primordial contaminants left behind by the sublimating ice. Irradiation of subsurface N2/CH4 or NH3/CH4 ice by cosmic rays may generate approximately 20 cm of tholin in the upper 10 m of regolith on the same time scale, but the identity of this tholin to its gas-phase equivalent has not been demonstrated.
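
    The grid search described above can be illustrated in miniature. The sketch below replaces Hapke intimate-mixture theory with simple linear mixing of synthetic component spectra, so it demonstrates only the search strategy, not the radiative-transfer model actually used in the paper.

    ```python
    import itertools
    import numpy as np

    wavelengths = np.linspace(0.4, 2.4, 50)  # micrometers
    # Synthetic placeholder spectra, loosely shaped like the materials named.
    components = {
        "titan_tholin": np.clip(wavelengths - 0.3, 0.0, 1.0) * 0.6,
        "poly_hcn":     0.5 - 0.15 * wavelengths / 2.4,
        "water_ice":    0.8 - 0.3 * (wavelengths > 1.4),
    }
    observed = (0.15 * components["titan_tholin"]
                + 0.10 * components["poly_hcn"]
                + 0.75 * components["water_ice"])  # pretend measurement

    best = None
    steps = np.arange(0.0, 1.05, 0.05)
    for f1, f2 in itertools.product(steps, steps):
        f3 = 1.0 - f1 - f2
        if f3 < 0.0:
            continue  # mixture fractions must sum to one
        model = (f1 * components["titan_tholin"]
                 + f2 * components["poly_hcn"]
                 + f3 * components["water_ice"])
        rms = np.sqrt(np.mean((model - observed) ** 2))
        if best is None or rms < best[0]:
            best = (rms, f1, f2, f3)

    print("best mixture (tholin, poly-HCN, ice):", best[1:])
    ```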

  18. An Examination of Test-Retest, Alternate Form Reliability, and Generalizability Theory Study of the easyCBM Word and Passage Reading Fluency Assessments: Grade 3. Technical Report #1218

    ERIC Educational Resources Information Center

    Park, Bitnara Jasmine; Anderson, Daniel; Alonzo, Julie; Lai, Cheng-Fei; Tindal, Gerald

    2012-01-01

    This technical report is one in a series of five describing the reliability (test/retest and alternate form) and G-Theory/D-Study research on the easyCBM reading measures, grades 1-5. Data were gathered in the spring of 2011 from a convenience sample of students nested within classrooms at a medium-sized school district in the Pacific Northwest.…

  19. The Process by Which Black Male College Students Become Leaders of Predominantly White Organizations in Higher Education: A Grounded Theory

    ERIC Educational Resources Information Center

    Moschella, Eric J.

    2013-01-01

    This study sought to understand the process by which Black undergraduate men on predominately White college campuses become leaders of predominately White organizations. Using the theoretical frameworks of Black and White racial identity development (Helms, 1990), Critical Race Theory (Delgado & Stefancic, 2001), and Wijeyesinghe's (2001)…

  20. Body without Organs: Notes on Deleuze & Guattari, Critical Race Theory and the Socius of Anti-Racism

    ERIC Educational Resources Information Center

    Ibrahim, Awad

    2015-01-01

    My aim in this article is to epistemologically read Deleuze and Guattari (D & G) against critical race theory (CRT) and simultaneously delineate how D & G's notion of "body without organs" can benefit from CRT. At first glance, especially for language instructors and researchers, these two epistemological frameworks not only…

  1. From Structural Dilemmas to Institutional Imperatives: A Descriptive Theory of the School as an Institution and of School Organizations

    ERIC Educational Resources Information Center

    Berg, Gunnar

    2007-01-01

    This study outlines a descriptive theory that seeks to grasp the complexity of the school as a state and societal institution as well as single schools as organizations. A significant characteristic of this complexity is the ambiguity of the missions and goals--the outer boundaries--of the school-institution. The more institutional ambiguity that…

  2. A Theory of Complex Adaptive Inquiring Organizations: Application to Continuous Assurance of Corporate Financial Information

    ERIC Educational Resources Information Center

    Kuhn, John R., Jr.

    2009-01-01

    Drawing upon the theories of complexity and complex adaptive systems and the Singerian Inquiring System from C. West Churchman's seminal work "The Design of Inquiring Systems" the dissertation herein develops a systems design theory for continuous auditing systems. The dissertation consists of discussion of the two foundational theories,…

  3. Reliability physics

    NASA Technical Reports Server (NTRS)

    Cuddihy, E. F.; Ross, R. G., Jr.

    1984-01-01

    Speakers whose topics relate to the reliability physics of solar arrays are listed and their topics briefly reviewed. Nine reports are reviewed ranging in subjects from studies of photothermal degradation in encapsulants and polymerizable ultraviolet stabilizers to interface bonding stability to electrochemical degradation of photovoltaic modules.

  4. 76 FR 58730 - Version 4 Critical Infrastructure Protection Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-22

    ... Electric Reliability Corporation (NERC), the Electric Reliability Organization certified by the Commission...), Office of Electric Reliability, Division of Logistics and Security, Federal Energy Regulatory Commission...), Office of Electric Reliability, Division of Logistics and Security, Federal Energy Regulatory...

  5. Elastic, not plastic species: Frozen plasticity theory and the origin of adaptive evolution in sexually reproducing organisms

    PubMed Central

    2010-01-01

    Background Darwin's evolutionary theory could easily explain the evolution of adaptive traits (organs and behavioral patterns) in asexual but not in sexual organisms. Two models, the selfish gene theory and the frozen plasticity theory, have been suggested over the past 30 years to explain the evolution of adaptive traits in sexual organisms. Results The frozen plasticity theory suggests that sexual species can evolve new adaptations only when their members are genetically uniform, i.e. only after a portion of the population of the original species has split off, balanced on the edge of extinction for several generations, and then undergone rapid expansion. After a short period of time, estimated on the basis of paleontological data to correspond to 1-2% of the duration of the species, polymorphism accumulates in the gene pool due to frequency-dependent selection; thus, in each generation, new mutations occur in the presence of different alleles and therefore change their selection coefficients from generation to generation. The species ceases to behave in an evolutionarily plastic manner and becomes evolutionarily elastic on a microevolutionary time-scale and evolutionarily frozen on a macroevolutionary time-scale. It then exists in this state until such changes accumulate in the environment that the species becomes extinct. Conclusion Frozen plasticity theory, which includes the Darwinian model of evolution as a special case - the evolution of species in a plastic state - not only offers plenty of new predictions to be tested, but also provides explanations for a much broader spectrum of known biological phenomena than classic evolutionary theories. Reviewers This article was reviewed by Rob Knight, Fyodor Kondrashov and Massimo Di Giulio (nominated by David H. Ardell). PMID:20067646

  6. Autotrophs' challenge to Dynamic Energy Budget theory: Comment on "Physics of metabolic organization" by Marko Jusup et al.

    NASA Astrophysics Data System (ADS)

    Geček, Sunčana

    2017-03-01

    Jusup and colleagues, in their recent review on the physics of metabolic organization [1], discuss in detail the motivational considerations and common assumptions of Dynamic Energy Budget (DEB) theory, supply readers with a practical guide to DEB-based modeling, demonstrate the construction and dynamics of the standard DEB model, and illustrate several applications. The authors go a step beyond the existing literature by seamlessly bridging the dichotomy between (i) the thermodynamic foundations of the theory (which are often more accessible and understandable to physicists and mathematicians) and (ii) the resulting bioenergetic models (mostly used by biologists in real-world applications).

  7. Assessment of Student Performance in a PSI College Physics Course Using Ausubel's Learning Theory as a Theoretical Framework for Content Organization.

    ERIC Educational Resources Information Center

    Moriera, M. A.

    1979-01-01

    David Ausubel's learning theory was used as a framework for the content organization of an experimental Personalized System of Instruction (PSI) course in physics. Evaluation suggests that the combination of PSI as a method of instruction and Ausubel's theory for organization might result in better learning outcomes. (Author/JMD)

  8. A Monte Carlo Simulation Investigating the Validity and Reliability of Ability Estimation in Item Response Theory with Speeded Computer Adaptive Tests

    ERIC Educational Resources Information Center

    Schmitt, T. A.; Sass, D. A.; Sullivan, J. R.; Walker, C. M.

    2010-01-01

    Imposed time limits on computer adaptive tests (CATs) can result in examinees having difficulty completing all items, thus compromising the validity and reliability of ability estimates. In this study, the effects of speededness were explored in a simulated CAT environment by varying examinee response patterns to end-of-test items. Expectedly,…

  9. Overcoming the Problem of Embedding Change in Educational Organizations: A Perspective from Normalization Process Theory

    ERIC Educational Resources Information Center

    Wood, Phil

    2017-01-01

    In this article, I begin by outlining some of the barriers which constrain sustainable organizational change in schools and universities. I then go on to introduce a theory which has already started to help explain complex change and innovation processes in health and care contexts, Normalization Process Theory. Finally, I consider what this…

  10. Communication as a predictor of willingness to donate one's organs: an addition to the Theory of Reasoned Action.

    PubMed

    Jeffres, Leo W; Carroll, Jeanine A; Rubenking, Bridget E; Amschlinger, Joe

    2008-12-01

    Fishbein and Ajzen's theory of reasoned action has been used by many researchers, particularly in regard to health communication, to predict behavioral intentions and behavior. According to that theory, one's intention is the best predictor that one will engage in a behavior, and attitudes and social norms predict behavioral intentions. Other researchers have added different variables to the postulates of attitudes and social norms that Fishbein and Ajzen maintain are the best predictors of behavioral intention. Here we draw on data from a 2006 telephone survey (N = 420) gauging the awareness of an organ donation campaign in Northeast Ohio to examine the impact of communication on people's intentions. The current study supports the hypothesis that those who communicate with others are more likely to express a greater willingness to become an organ donor, but it expands the range of communication contexts. With demographics and attitudes toward organ donation controlled for, this study shows that communication with others about organ donation increases the willingness of individuals to have favorable attitudes about being an organ donor.
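
    The analysis pattern described (testing whether communication adds predictive power beyond the attitude and norm postulates of the theory of reasoned action) amounts to hierarchical regression. The sketch below reproduces that pattern on synthetic data; none of the numbers come from the survey itself.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 420  # same size as the survey sample; the data here are synthetic

    attitude = rng.normal(size=n)
    norms = rng.normal(size=n)
    communication = rng.normal(size=n)
    willingness = (0.5 * attitude + 0.3 * norms
                   + 0.4 * communication + rng.normal(size=n))

    def r_squared(predictors, y):
        """OLS R^2 with an intercept term."""
        X = np.column_stack([np.ones(len(y)), *predictors])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        return 1.0 - resid.var() / y.var()

    base = r_squared([attitude, norms], willingness)
    full = r_squared([attitude, norms, communication], willingness)
    print(f"R^2 without communication: {base:.3f}, with: {full:.3f}")
    ```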

  11. 75 FR 71625 - System Restoration Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-24

    ... Energy Regulatory Commission 18 CFR Part 40 System Restoration Reliability Standards November 18, 2010... to approve Reliability Standards EOP-001-1 (Emergency Operations Planning), EOP- 005-2 (System... Commission by the North American Electric Reliability Corporation, the Electric Reliability Organization...

  12. Network reliability

    NASA Technical Reports Server (NTRS)

    Johnson, Marjory J.

    1985-01-01

    Network control (or network management) functions are essential for efficient and reliable operation of a network. Some control functions are currently included as part of the Open System Interconnection model. For local area networks, it is widely recognized that there is a need for additional control functions, including fault isolation functions, monitoring functions, and configuration functions. These functions can be implemented in either a central or distributed manner. The Fiber Distributed Data Interface Medium Access Control and Station Management protocols provide an example of distributed implementation. Relevant information is presented here in outline form.

  13. Regulating agents, functional interactions, and stimulus-reaction-schemes: the concept of "organism" in the organic system theories of Stahl, Bordeu, and Barthez.

    PubMed

    Cheung, Tobias

    2008-12-01

    In this essay, I sketch a problem-based framework within which I locate the concept of "organism" in the system theories of Georg Ernst Stahl, Théophile Bordeu, and Paul-Joseph Barthez. Around 1700, Stahl coins the word "organism" for a certain concept of order. For him, the concept explains the form of order of living bodies that is categorically different from the order of other (dead) bodies or composites. At the end of the century, the "organism" as a specific form of order becomes a major topos in many discourses. I will not so much focus on experiments and objects as on basic problems that contribute to the general framework of the concept of organism as a key concept of the vitalist movement between 1700 and 1800. For this purpose, I will investigate the combination of three explanatory tools. These tools refer to regulating agents, functional interactions, and stimulus-reaction-schemes within individual organic systems of forces. They are related to various themes--especially to energy, sensibility, and sympathy. I will retrace some aspects of these relations.

  14. Multifractality to Photonic Crystal & Self-Organization to Metamaterials through Anderson Localizations & Group/Gauge Theory

    NASA Astrophysics Data System (ADS)

    Hidajatullah-Maksoed, Widastra

    2015-04-01

    Arthur Cayley investigated such structures by creating the theory of permutation groups. In the cell-element addressing of the lattice Qmf a Cayley tree is used, and the self-affine object Qmf is described by the combination of the finite groups of rotation & inversion and the infinite groups of translation & dilation [G. Corso & L.S. Lucena: "Multifractal lattice and group theory", Physica A: Statistical Mechanics and Its Applications, 2005, vol. 357, issue 1, pp. 64-70; http://www.sciencedirect.com/science/articel/pii/S0378437105005005 ]; hence multifractals can be related to group theory. Many grateful thanks to HE. Mr. Drs. P. SWANTORO & HE. Mr. Ir. SARWONO KUSUMAATMADJA.

  15. Challenges for dynamic energy budget theory. Comment on "Physics of metabolic organization" by Marko Jusup et al.

    NASA Astrophysics Data System (ADS)

    Nisbet, Roger M.

    2017-03-01

    Jusup et al. [1] provide a comprehensive review of Dynamic Energy Budget (DEB) theory - a theory of metabolic organization that has its roots in a model by S.A.L.M. Kooijman [2] and has evolved over three decades into a remarkably general theory whose use appears to be growing exponentially. The definitive text on DEB theory [3] is a challenging (though exceptionally rewarding) read, and previous reviews (e.g. [4,5]) have provided focused summaries of some of its main themes, targeted at specific groups of readers. The strong case for a further review is well captured in the abstract: "Hitherto, the foundations were more accessible to physicists or mathematicians, and the applications to biologists, causing a dichotomy in what always should have been a single body of work." In response to this need, Jusup et al. provide a review that combines a lucid, rigorous exposition of the core components of DEB theory with a diverse collection of DEB applications. They also highlight some recent advances, notably the rapidly growing on-line database of DEB model parameters (451 species on 15 August 2016 according to [1]; now, just a few months later, over 500 species).

  16. Understanding the Environmental Elements in Religious Student Organizations through Sharon Parks' Mentoring Community Theory

    ERIC Educational Resources Information Center

    Gill, David Christopher

    2011-01-01

    Students are coming to colleges and universities for spiritual fulfillment and have turned to religious student organizations (i.e. Campus Crusade for Christ, Newman Centers, Muslim Student Association, Hillel, etc.) to attain guidance and support. To better understand the spiritual environment religious student organizations have in place, many…

  17. Toward a Theory of Variation in the Organization of the Word Reading System

    ERIC Educational Resources Information Center

    Rueckl, Jay G.

    2016-01-01

    The strategy underlying most computational models of word reading is to specify the organization of the reading system--its architecture and the processes and representations it employs--and to demonstrate that this organization would give rise to the behavior observed in word reading tasks. This approach fails to adequately address the variation…

  18. Potential Applications of Matrix Organization Theory for the New Jersey Department of Education. Position Paper.

    ERIC Educational Resources Information Center

    Hanson, J. Robert

    Matrix organization focuses on the shift from cost center or process input planning to product output or results planning. Matrix organization puts the personnel and the resources where they are needed to get the job done. This management efficiency is brought about by dividing all organizational activities into two areas: (1) input or maintenance…

  19. Immodest Witnesses: Reliability and Writing Assessment

    ERIC Educational Resources Information Center

    Gallagher, Chris W.

    2014-01-01

    This article offers a survey of three reliability theories in writing assessment: positivist, hermeneutic, and rhetorical. Drawing on an interdisciplinary investigation of the notion of "witnessing," this survey emphasizes the kinds of readers and readings each theory of reliability produces and the epistemological grounds on which it…

  20. Power laws and self-organized criticality in theory and nature

    NASA Astrophysics Data System (ADS)

    Marković, Dimitrije; Gros, Claudius

    2014-03-01

    Power laws and distributions with heavy tails are common features of many complex systems. Examples are the distribution of earthquake magnitudes, solar flare intensities and the sizes of neuronal avalanches. Previously, researchers surmised that a single general concept may act as an underlying generative mechanism, with the theory of self-organized criticality being a weighty contender. The power-law scaling observed in the primary statistical analysis is an important, but by far not the only, feature characterizing experimental data. The scaling function, the distribution of energy fluctuations, the distribution of inter-event waiting times, and other higher-order spatial and temporal correlations have seen increased consideration over the last years. This has led to the realization that basic models, like the original sandpile model, are often insufficient to adequately describe the complexity of real-world systems with power-law distributions. Consequently, a substantial amount of effort has gone into developing new and extended models and, hitherto, three classes of models have emerged. The first line of models is based on a separation between the time scales of an external drive and an internal dissipation, and includes the original sandpile model and its extensions, like the dissipative earthquake model. Within this approach the steady state is close to criticality in terms of an absorbing phase transition. The second line of models is based on external drives and internal dynamics competing on similar time scales and includes the coherent noise model, which has a non-critical steady state characterized by heavy-tailed distributions. The third line of models proposes a non-critical self-organizing state, being guided by an optimization principle, such as the concept of highly optimized tolerance. We present a comparative overview regarding distinct modeling approaches together with a discussion of their potential relevance as underlying generative models for real
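
    The original sandpile model mentioned above is short enough to simulate directly. The sketch below is a minimal Bak-Tang-Wiesenfeld implementation whose avalanche-size distribution develops the heavy tail discussed in the abstract; grid size and grain count are arbitrary choices.

    ```python
    import numpy as np

    def btw_avalanche_sizes(grid=20, grains=20000, seed=0):
        """Drop grains at random sites; any site holding 4+ grains topples,
        sending one grain to each of its 4 neighbors (grains fall off the
        open boundary). Returns the size (topple count) of each avalanche."""
        rng = np.random.default_rng(seed)
        z = np.zeros((grid, grid), dtype=int)
        sizes = []
        for _ in range(grains):
            i, j = rng.integers(grid, size=2)
            z[i, j] += 1
            size = 0
            unstable = [(i, j)] if z[i, j] >= 4 else []
            while unstable:
                a, b = unstable.pop()
                if z[a, b] < 4:
                    continue  # already relaxed by an earlier pass
                z[a, b] -= 4
                size += 1
                for x, y in ((a + 1, b), (a - 1, b), (a, b + 1), (a, b - 1)):
                    if 0 <= x < grid and 0 <= y < grid:
                        z[x, y] += 1
                        if z[x, y] >= 4:
                            unstable.append((x, y))
                if z[a, b] >= 4:
                    unstable.append((a, b))
            sizes.append(size)
        return sizes

    sizes = btw_avalanche_sizes()
    print("largest avalanche:", max(sizes))  # heavy-tailed size distribution
    ```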

  1. New Evidence for the Theory of Chromosome Organization by Repetitive Elements (CORE).

    PubMed

    Tang, Shao-Jun

    2017-02-20

    Repetitive DNA elements were proposed to coordinate chromatin folding and interaction in chromosomes by their intrinsic homology-based clustering ability. A recent analysis of the data sets from chromosome-conformation-capture experiments confirms the spatial clustering of DNA repeats of the same family in the nuclear space, and thus provides strong new support for the CORE theory.

  2. New Evidence for the Theory of Chromosome Organization by Repetitive Elements (CORE)

    PubMed Central

    Tang, Shao-Jun

    2017-01-01

    Repetitive DNA elements were proposed to coordinate chromatin folding and interaction in chromosomes by their intrinsic homology-based clustering ability. A recent analysis of the data sets from chromosome-conformation-capture experiments confirms the spatial clustering of DNA repeats of the same family in the nuclear space, and thus provides strong new support for the CORE theory. PMID:28230735

  3. Knowledge sharing within organizations: linking art, theory, scenarios and professional experience

    NASA Technical Reports Server (NTRS)

    Bailey, T.; Burton, Y. C.

    2000-01-01

    In this discussion, T. Bailey will be addressing the multiple paradigms within organizations using imagery. Dr. Burton will discuss the relationship between these paradigms and social exchanges that lead to knowledge sharing.

  4. Molecular Theory of Detonation Initiation: Insight from First Principles Modeling of the Decomposition Mechanisms of Organic Nitro Energetic Materials.

    PubMed

    Tsyshevsky, Roman V; Sharia, Onise; Kuklja, Maija M

    2016-02-19

    This review presents a concept, which assumes that thermal decomposition processes play a major role in defining the sensitivity of organic energetic materials to detonation initiation. As a science and engineering community we are still far away from having a comprehensive molecular detonation initiation theory in a widely agreed upon form. However, recent advances in experimental and theoretical methods allow for a constructive and rigorous approach to design and test the theory or at least some of its fundamental building blocks. In this review, we analyzed a set of select experimental and theoretical articles, which were augmented by our own first principles modeling and simulations, to reveal new trends in energetic materials and to refine known existing correlations between their structures, properties, and functions. Our consideration is intentionally limited to the processes of thermally stimulated chemical reactions at the earliest stage of decomposition of molecules and materials containing defects.

  5. Electronic and optical properties of a metal-organic framework with ab initio many-body perturbation theory

    NASA Astrophysics Data System (ADS)

    Berland, Kristian; Lee, Kyuho; Sharifzadeh, Sahar; Neaton, Jeffrey B.

    2015-03-01

    With their unprecedented surface area, and their structural and chemical tunability, metal-organic frameworks (MOFs) are being thoroughly explored for applications related to gas storage. Less studied are their electronic, excited-state, and optical properties. Here we explored such properties of Mg-MOF-74 using a combination of density functional theory (DFT) and many-body perturbation theory (MBPT) within the GW approximation and the Bethe-Salpeter equation (BSE) approach. The near-gap electronic conduction states were found to fall into two distinct categories: molecular-like and 1d-dispersive. Further, using the BSE approach, we predict a strongly anisotropic absorption spectrum, which we link to the nature of its strongly-bound excitons. Our calculations are found to be in good agreement with experimental absorption spectra, validating our theoretical approach. This work is supported by Chalmers Area of Advance: Materials, Vetenskapsradet, DOE, and computational resources provided by NERSC.

  6. Molecular Theory of Detonation Initiation: Insight from First Principles Modeling of the Decomposition Mechanisms of Organic Nitro Energetic Materials

    SciTech Connect

    Tsyshevsky, Roman; Sharia, Onise; Kuklja, Maija

    2016-02-19

    Our review presents a concept, which assumes that thermal decomposition processes play a major role in defining the sensitivity of organic energetic materials to detonation initiation. As a science and engineering community we are still far away from having a comprehensive molecular detonation initiation theory in a widely agreed upon form. However, recent advances in experimental and theoretical methods allow for a constructive and rigorous approach to design and test the theory or at least some of its fundamental building blocks. In this review, we analyzed a set of select experimental and theoretical articles, which were augmented by our own first principles modeling and simulations, to reveal new trends in energetic materials and to refine known existing correlations between their structures, properties, and functions. Lastly, our consideration is intentionally limited to the processes of thermally stimulated chemical reactions at the earliest stage of decomposition of molecules and materials containing defects.

  7. Molecular Theory of Detonation Initiation: Insight from First Principles Modeling of the Decomposition Mechanisms of Organic Nitro Energetic Materials

    DOE PAGES

    Tsyshevsky, Roman; Sharia, Onise; Kuklja, Maija

    2016-02-19

    Our review presents a concept, which assumes that thermal decomposition processes play a major role in defining the sensitivity of organic energetic materials to detonation initiation. As a science and engineering community we are still far away from having a comprehensive molecular detonation initiation theory in a widely agreed upon form. However, recent advances in experimental and theoretical methods allow for a constructive and rigorous approach to design and test the theory or at least some of its fundamental building blocks. In this review, we analyzed a set of select experimental and theoretical articles, which were augmented by our own first principles modeling and simulations, to reveal new trends in energetic materials and to refine known existing correlations between their structures, properties, and functions. Lastly, our consideration is intentionally limited to the processes of thermally stimulated chemical reactions at the earliest stage of decomposition of molecules and materials containing defects.

  8. Theoretical modeling of the linear and nonlinear optical properties of organic crystals within the rigorous local field theory (RLFT)

    SciTech Connect

    Seidler, T.; Stadnicka, K.; Champagne, B.

    2015-03-30

    This contribution summarizes our current findings in the field of calculating and predicting the linear and second-order nonlinear electric susceptibility tensor components of organic crystals. The methodology used for this purpose is based on a combination of the electrostatic interaction scheme developed by Munn and his coworkers (RLFT) with high-level electronic structure calculations. We compare the results of calculations with available experimental data for several examples of molecular crystals. We show that the quality of the final results is influenced by (i) the chromophore geometry, (ii) the method used for the calculation of molecular properties, and (iii) the partitioning scheme used. In conclusion, we summarize further plans to improve the reliability and predictive power of the method.

  9. Reliable prediction of three-body intermolecular interactions using dispersion-corrected second-order Møller-Plesset perturbation theory

    SciTech Connect

    Huang, Yuanhang; Beran, Gregory J. O.

    2015-07-28

    Three-body and higher intermolecular interactions can play an important role in molecular condensed phases. Recent benchmark calculations found problematic behavior for many widely used density functional approximations in treating 3-body intermolecular interactions. Here, we demonstrate that the combination of second-order Møller-Plesset (MP2) perturbation theory plus short-range damped Axilrod-Teller-Muto (ATM) dispersion accurately describes 3-body interactions with reasonable computational cost. The empirical damping function used in the ATM dispersion term compensates both for the absence of higher-order dispersion contributions beyond the triple-dipole ATM term and non-additive short-range exchange terms which arise in third-order perturbation theory and beyond. Empirical damping enables this simple model to out-perform a non-expanded coupled Kohn-Sham dispersion correction for 3-body intermolecular dispersion. The MP2 plus ATM dispersion model approaches the accuracy of O(N^6) methods like MP2.5 or even spin-component-scaled coupled cluster models for 3-body intermolecular interactions with only O(N^5) computational cost.

  10. Reliable prediction of three-body intermolecular interactions using dispersion-corrected second-order Møller-Plesset perturbation theory

    NASA Astrophysics Data System (ADS)

    Huang, Yuanhang; Beran, Gregory J. O.

    2015-07-01

    Three-body and higher intermolecular interactions can play an important role in molecular condensed phases. Recent benchmark calculations found problematic behavior for many widely used density functional approximations in treating 3-body intermolecular interactions. Here, we demonstrate that the combination of second-order Møller-Plesset (MP2) perturbation theory plus short-range damped Axilrod-Teller-Muto (ATM) dispersion accurately describes 3-body interactions with reasonable computational cost. The empirical damping function used in the ATM dispersion term compensates both for the absence of higher-order dispersion contributions beyond the triple-dipole ATM term and non-additive short-range exchange terms which arise in third-order perturbation theory and beyond. Empirical damping enables this simple model to out-perform a non-expanded coupled Kohn-Sham dispersion correction for 3-body intermolecular dispersion. The MP2 plus ATM dispersion model approaches the accuracy of O(N^6) methods like MP2.5 or even spin-component-scaled coupled cluster models for 3-body intermolecular interactions with only O(N^5) computational cost.
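
    The triple-dipole ATM term named in the abstract has a closed form. The sketch below evaluates it for a single atom triple with a placeholder damping factor; the paper's empirical damping function and the C9 coefficient used here are not reproduced, so the numbers are illustrative only.

    ```python
    import numpy as np

    def atm_energy(r1, r2, r3, c9, a=1.0):
        """Axilrod-Teller-Muto triple-dipole energy for one atom triple:
        E = C9 * (1 + 3 cos t1 cos t2 cos t3) / (r12 r13 r23)^3,
        times an illustrative short-range damping factor."""
        r12 = np.linalg.norm(r2 - r1)
        r13 = np.linalg.norm(r3 - r1)
        r23 = np.linalg.norm(r3 - r2)
        # Interior angles of the triangle via the law of cosines.
        cos1 = (r12**2 + r13**2 - r23**2) / (2.0 * r12 * r13)
        cos2 = (r12**2 + r23**2 - r13**2) / (2.0 * r12 * r23)
        cos3 = (r13**2 + r23**2 - r12**2) / (2.0 * r13 * r23)
        geom = (1.0 + 3.0 * cos1 * cos2 * cos3) / (r12 * r13 * r23) ** 3
        mean_r = (r12 + r13 + r23) / 3.0
        damping = 1.0 / (1.0 + 6.0 * (a / mean_r) ** 16)  # placeholder form
        return c9 * geom * damping

    # Equilateral triangle of 'atoms' 5 bohr apart, made-up C9 coefficient.
    r1, r2, r3 = (np.array([0.0, 0.0, 0.0]),
                  np.array([5.0, 0.0, 0.0]),
                  np.array([2.5, 4.33, 0.0]))
    print(f"E_ATM = {atm_energy(r1, r2, r3, c9=100.0):.3e} hartree")
    ```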

  11. The Impact of Multiple Master Patient Index Records on the Business Performance of Health Care Organizations: A Qualitative Grounded Theory Study

    ERIC Educational Resources Information Center

    Banton, Cynthia L.

    2014-01-01

    The purpose of this qualitative grounded theory study was to explore and examine the factors that led to the creation of multiple record entries, and present a theory on the impact the problem has on the business performance of health care organizations. A sample of 59 health care professionals across the United States participated in an online…

  12. Understanding the Value of Enterprise Architecture for Organizations: A Grounded Theory Approach

    ERIC Educational Resources Information Center

    Nassiff, Edwin

    2012-01-01

    There is a high rate of information system implementation failures attributed to the lack of alignment between business and information technology strategy. Although enterprise architecture (EA) is a means to correct alignment problems and executives highly rate the importance of EA, it is still not used in most organizations today. Current…

  13. Control of organ size: development, regeneration, and the role of theory in biology.

    PubMed

    Stevens, Charles F

    2015-02-19

    How organs grow to be the right size for the animal is one of the central mysteries of biology. In a paper in BMC Biology, Khammash et al. propose a mechanism for escaping the deficiencies of feedback control of growth.

  14. Latent Trait Theory Approach to Measuring Person-Organization Fit: Conceptual Rationale and Empirical Evaluation

    ERIC Educational Resources Information Center

    Chernyshenko, Oleksandr S.; Stark, Stephen; Williams, Alex

    2009-01-01

    The purpose of this article is to offer a new approach to measuring person-organization (P-O) fit, referred to here as "Latent fit." Respondents were administered unidimensional forced choice items and were asked to choose the statement in each pair that better reflected the correspondence between their values and those of the…

  15. To the theory of hybrid organics/semiconductor nanostructures in microcavity

    NASA Astrophysics Data System (ADS)

    Dubovskiy, O. A.; Agranovich, V. M.

    2017-02-01

    We consider a hybrid structure in a microcavity where the energy of the Frenkel exciton in the organic layer is equal to the energy of the Wannier-Mott exciton in the semiconductor quantum well (QW). An exciton located in the QW of the semiconductor layer can interact with molecules of the organic layer and, under the influence of this interaction, can change its position by jumping to and exciting one of the organic molecules. An exciton located on a molecule of the organic layer can likewise jump to the semiconductor QW. The number of such jumps depends on the intensity of the interaction. In this paper we consider the influence of the direct Coulomb dipole-dipole interaction and of the indirect interaction through the optical field of the microcavity on the kinetics of excitation. It is shown that the dispersion of the hybrid states is modified by the Coulomb interaction, particularly when the distance between the layers is small enough. The lowest branch of the dispersion curves, with a deep minimum at nonzero wave vector, may be useful in studies of the condensation of low-energy hybrid excitations.

  16. Knowledge sharing within organizations: linking art, theory, scenarios and professional experience

    NASA Technical Reports Server (NTRS)

    Burton, Y. C.; Bailey, T.

    2000-01-01

    In this presentation, Burton and Bailey discuss the challenges and opportunities in developing knowledge sharing systems in organizations. Bailey provides a tool using imagery and collage for identifying and utilizing the diverse values and beliefs of individuals and groups. Burton reveals findings from a business research study that examines how social construction influences knowledge sharing among task-oriented groups.

  17. Using Population Genetic Theory and DNA Sequences for Species Detection and Identification in Asexual Organisms

    PubMed Central

    Birky, C. William; Adams, Joshua; Gemmel, Marlea; Perry, Julia

    2010-01-01

    Background It is widely agreed that species are fundamental units of biology, but there is little agreement on a definition of species or on an operational criterion for delimiting species that is applicable to all organisms. Methodology/Principal Findings We focus on asexual eukaryotes as the simplest case for investigating species and speciation. We describe a model of speciation in asexual organisms based on basic principles of population and evolutionary genetics. The resulting species are independently evolving populations as described by the evolutionary species concept or the general lineage species concept. Based on this model, we describe a procedure for using gene sequences from small samples of individuals to assign them to the same or different species. Using this method of species delimitation, we demonstrate the existence of species as independent evolutionary units in seven groups of invertebrates, fungi, and protists that reproduce asexually most or all of the time. Conclusions/Significance This wide evolutionary sampling establishes the general existence of species and speciation in asexual organisms. The method is well suited for measuring species diversity when phenotypic data are insufficient to distinguish species, or are not available, as in DNA barcoding and environmental sequencing. We argue that it is also widely applicable to sexual organisms. PMID:20498705

  18. Theory and Practice: Implications for the Implementation of Communication Technology in Organizations.

    ERIC Educational Resources Information Center

    Herndon, Sandra L.

    1997-01-01

    Argues that scientific management principles result in an implementation of technology which fails to take full advantage of organization members and of the technology itself, while in a sociotechnical systems approach, technology is designed and implemented in ways enhancing the potential of both individuals and the technology itself, in…

  19. Vicinal 1H-1H NMR coupling constants from density functional theory as reliable tools for stereochemical analysis of highly flexible multichiral center molecules.

    PubMed

    López-Vallejo, Fabian; Fragoso-Serrano, Mabel; Suárez-Ortiz, Gloria Alejandra; Hernández-Rojas, Adriana C; Cerda-García-Rojas, Carlos M; Pereda-Miranda, Rogelio

    2011-08-05

    A protocol for stereochemical analysis, based on the systematic comparison between theoretical and experimental vicinal (1)H-(1)H NMR coupling constants, was developed and applied to a series of flexible compounds (1-8) derived from the 6-heptenyl-5,6-dihydro-2H-pyran-2-one framework. The method included a broad conformational search, followed by geometry optimization at the DFT B3LYP/DGDZVP level, calculation of the vibrational frequencies, thermochemical parameters, magnetic shielding tensors, and the total NMR spin-spin coupling constants. Three scaling factors, depending on the carbon atom hybridizations, were found for the (1)H-C-C-(1)H vicinal coupling constants: f((sp3)-(sp3)) = 0.910, f((sp3)-(sp2)) = 0.929, and f((sp2)-(sp2)) = 0.977. A remarkable correlation between the theoretical (J(pre)) and experimental (1)H-(1)H NMR (J(exp)) coupling constants for spicigerolide (1), a cytotoxic natural product, and some of its synthetic stereoisomers (2-4) demonstrated the predictive value of this approach for the stereochemical assignment of highly flexible compounds containing multiple chiral centers. The stereochemistry of two natural 6-heptenyl-5,6-dihydro-2H-pyran-2-ones (14 and 15) containing diverse functional groups in the heptenyl side chain was also analyzed by application of this combined theoretical and experimental approach, confirming its reliability. Additionally, a geometrical analysis of the conformations of 1-8 revealed that weak hydrogen bonds substantially guide the conformational behavior of the tetraacyloxy-6-heptenyl-2H-pyran-2-ones.
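
    Applying the reported scaling factors is mechanical once couplings have been computed. The sketch below uses the three factors quoted in the abstract; the input J values are invented examples, not data from the paper.

    ```python
    # Hybridization-dependent scaling factors from the abstract.
    SCALING = {
        ("sp3", "sp3"): 0.910,
        ("sp2", "sp3"): 0.929,
        ("sp2", "sp2"): 0.977,
    }

    def scale_coupling(j_calc_hz, hyb_a, hyb_b):
        """Scale a computed 3J(H,H) coupling by the factor matching the
        hybridizations of the two intervening carbon atoms."""
        key = tuple(sorted((hyb_a, hyb_b)))
        return SCALING[key] * j_calc_hz

    # Invented DFT couplings (Hz) with the carbon hybridizations involved.
    computed = [(11.2, "sp3", "sp3"), (6.8, "sp2", "sp3"), (10.1, "sp2", "sp2")]
    for j, ha, hb in computed:
        print(f"J_calc = {j:4.1f} Hz -> J_pred = {scale_coupling(j, ha, hb):5.2f} Hz")
    ```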

  20. The self-organizing fractal theory as a universal discovery method: the phenomenon of life.

    PubMed

    Kurakin, Alexei

    2011-03-29

    A universal discovery method potentially applicable to all disciplines studying organizational phenomena has been developed. This method takes advantage of a new form of global symmetry, namely, scale-invariance of self-organizational dynamics of energy/matter at all levels of organizational hierarchy, from elementary particles through cells and organisms to the Universe as a whole. The method is based on an alternative conceptualization of physical reality postulating that the energy/matter comprising the Universe is far from equilibrium, that it exists as a flow, and that it develops via self-organization in accordance with the empirical laws of nonequilibrium thermodynamics. It is postulated that the energy/matter flowing through and comprising the Universe evolves as a multiscale, self-similar structure-process, i.e., as a self-organizing fractal. This means that certain organizational structures and processes are scale-invariant and are reproduced at all levels of the organizational hierarchy. Being a form of symmetry, scale-invariance naturally lends itself to a new discovery method that allows for the deduction of missing information by comparing scale-invariant organizational patterns across different levels of the organizational hierarchy.An application of the new discovery method to life sciences reveals that moving electrons represent a keystone physical force (flux) that powers, animates, informs, and binds all living structures-processes into a planetary-wide, multiscale system of electron flow/circulation, and that all living organisms and their larger-scale organizations emerge to function as electron transport networks that are supported by and, at the same time, support the flow of electrons down the Earth's redox gradient maintained along the core-mantle-crust-ocean-atmosphere axis of the planet. The presented findings lead to a radically new perspective on the nature and origin of life, suggesting that living matter is an organizational state

  1. The self-organizing fractal theory as a universal discovery method: the phenomenon of life

    PubMed Central

    2011-01-01

    A universal discovery method potentially applicable to all disciplines studying organizational phenomena has been developed. This method takes advantage of a new form of global symmetry, namely, scale-invariance of self-organizational dynamics of energy/matter at all levels of organizational hierarchy, from elementary particles through cells and organisms to the Universe as a whole. The method is based on an alternative conceptualization of physical reality postulating that the energy/matter comprising the Universe is far from equilibrium, that it exists as a flow, and that it develops via self-organization in accordance with the empirical laws of nonequilibrium thermodynamics. It is postulated that the energy/matter flowing through and comprising the Universe evolves as a multiscale, self-similar structure-process, i.e., as a self-organizing fractal. This means that certain organizational structures and processes are scale-invariant and are reproduced at all levels of the organizational hierarchy. Being a form of symmetry, scale-invariance naturally lends itself to a new discovery method that allows for the deduction of missing information by comparing scale-invariant organizational patterns across different levels of the organizational hierarchy. An application of the new discovery method to life sciences reveals that moving electrons represent a keystone physical force (flux) that powers, animates, informs, and binds all living structures-processes into a planetary-wide, multiscale system of electron flow/circulation, and that all living organisms and their larger-scale organizations emerge to function as electron transport networks that are supported by and, at the same time, support the flow of electrons down the Earth's redox gradient maintained along the core-mantle-crust-ocean-atmosphere axis of the planet. The presented findings lead to a radically new perspective on the nature and origin of life, suggesting that living matter is an organizational state

  2. Software Reliability, Measurement, and Testing Software Reliability and Test Integration

    DTIC Science & Technology

    1992-04-01

    process variables on software reliability. A guidebook was produced to help program managers control and manage software reliability and testing. [The remainder of the indexed snippet is table-of-contents residue, listing sections on the Integrated Reliability Management System (IRMS), software project surveys, and test/support tools such as the DEC Test Manager and SDDL.]

  3. Ethical models in bioethics: theory and application in organ allocation policies.

    PubMed

    Petrini, C

    2010-12-01

    Policies for allocating organs to people awaiting a transplant constitute a major ethical challenge. First and foremost, they demand balance between the principles of beneficence and justice, but many other ethically relevant principles are also involved: autonomy, responsibility, equity, efficiency, utility, therapeutic outcome, medical urgency, and so forth. Various organ allocation models can be developed based on the hierarchical importance assigned to a given principle over the others, but none of the principles should be completely disregarded. An ethically acceptable organ allocation policy must therefore be in conformity, to a certain extent, with the requirements of all the principles. Many models for organ allocation can be derived. The utilitarian model aims to maximize benefits, which can be of various types on a social or individual level, such as the number of lives saved, prognosis, and so forth. The prioritarian model favours the neediest or those who suffer most. The egalitarian model privileges equity and justice, suggesting that all people should have an equal opportunity (casual allocation) or priority should be given to those who have been waiting longer. The personalist model focuses on each individual patient, attempting to mesh together all the various aspects affecting the person: therapeutic needs (urgency), fairness, clinical outcomes, respect for persons. In the individualistic model the main element is free choice and the system of opting-in is privileged. Contrary to the individualistic model, the communitarian model identifies in the community the fundamental elements for the legitimacy of choices; therefore, the system of opting-out is privileged. This article does not aim to suggest practical solutions. Rather, it furnishes decision makers with an overview of possible ethical approaches to this matter.

  4. Electronic structure of the organic semiconductor copper phthalocyanine: experiment and theory.

    PubMed

    Aristov, V Yu; Molodtsova, O V; Maslyuk, V V; Vyalikh, D V; Zhilin, V M; Ossipyan, Yu A; Bredow, T; Mertig, I; Knupfer, M

    2008-01-21

    The electronic structure of the organic semiconductor copper-phthalocyanine (CuPc) has been determined by a combination of conventional and resonant photoemission and near-edge x-ray absorption, as well as by first-principles calculations. The experimentally obtained valence-band structure of CuPc is in very good agreement with the calculated density of states, allowing the derivation of detailed site-specific information.

  5. Cortical organization: a description and interpretation of anatomical findings based on systems theory

    PubMed Central

    Casanova, Manuel F.

    2012-01-01

    The organization of the cortex can be understood as a complex system comprised of interconnected modules called minicolumns. Comparative anatomical studies suggest that evolution has prompted a scale-free world network of connectivity within the white matter while simultaneously increasing the complexity of minicolumnar composition. It is this author’s opinion that this complex system is poised to collapse under the weight of environmental exigencies. Some mental disorders may be the manifestations of this collapse. PMID:22754693

  6. Command and Control in Virtual Environments: Using Contingency Theory to Understand Organization in Virtual Worlds

    DTIC Science & Technology

    2010-10-01

    Life, Second Living Land Group). This is parallel to the kinds of virtual world businesses discussed above (e.g., architect, tailor, club owner)... For instance, users have the ability to purchase “land” (i.e., with real-world currency such as US Dollars; although basic access to SL is...). This raises an important organizational design question regarding the fit of such organizations with their virtual environments and corresponding

  7. Cortical organization: a description and interpretation of anatomical findings based on systems theory.

    PubMed

    Casanova, Manuel F

    2010-01-01

    The organization of the cortex can be understood as a complex system comprised of interconnected modules called minicolumns. Comparative anatomical studies suggest that evolution has prompted a scale-free world network of connectivity within the white matter while simultaneously increasing the complexity of minicolumnar composition. It is this author's opinion that this complex system is poised to collapse under the weight of environmental exigencies. Some mental disorders may be the manifestations of this collapse.

  8. Reliability science and patient safety.

    PubMed

    Luria, Joseph W; Muething, Stephen E; Schoettker, Pamela J; Kotagal, Uma R

    2006-12-01

    Reliability is failure-free operation over time--the measurable capability of a process, procedure, or service to perform its intended function. Reliability science has the potential to help health care organizations reduce defects in care, increase the consistency with which care is delivered, and improve patient outcomes. Based on its principles, the Institute for Healthcare Improvement has developed a three-step model to prevent failures, mitigate the failures that occur, and redesign systems to reduce failures. Lessons may also be learned from complex organizations that have already adopted the principles of reliability science and operate with high rates of reliability. They share a preoccupation with failure, reluctance to simplify interpretations, sensitivity to operations, commitment to resilience, and underspecification of structures.

  9. Understanding reaction mechanisms in organic chemistry from catastrophe theory: ozone addition on benzene.

    PubMed

    Ndassa, Ibrahim Mbouombouo; Silvi, Bernard; Volatron, François

    2010-12-16

    The potential energy profiles of the endo and exo additions of ozone on benzene have been theoretically investigated within the framework provided by the electron localization function (ELF). This has been done by carrying out hybrid Hartree-Fock DFT B3LYP calculations followed by a bonding evolution theory (BET) analysis. For both approaches, the reaction is exothermic by ~98 kJ mol(-1). However, the activation energy is calculated to be ~10 kJ mol(-1) lower in the endo channel than in the exo one; therefore the formation of the endo C(6)H(6)O(3) adduct is kinetically favored. Six structural stability domains are identified along both reaction pathways, as well as the bifurcation catastrophes responsible for the changes in the topology of the system. This provides a chemical description of the reaction mechanism in terms of heterolytic synchronous bond formation.
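    As a quick plausibility check on the kinetic preference reported above, the ~10 kJ mol(-1) barrier difference can be converted into a rate ratio with the Arrhenius relation. The room temperature and the assumption of equal pre-exponential factors are illustrative choices, not taken from the paper:

```python
import math

R = 8.314        # gas constant, J/(mol K)
T = 298.0        # assumed temperature, K (not stated in the abstract)
delta_Ea = 10e3  # endo barrier lower by ~10 kJ/mol, per the abstract

# Arrhenius with equal prefactors: k_endo / k_exo = exp(delta_Ea / (R T))
ratio = math.exp(delta_Ea / (R * T))
print(f"k_endo/k_exo ~ {ratio:.0f}")  # roughly 56-fold preference at 298 K
```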

  10. Attachment at (not to) work: applying attachment theory to explain individual behavior in organizations.

    PubMed

    Richards, David A; Schat, Aaron C H

    2011-01-01

    In this article, we report the results of 2 studies that were conducted to investigate whether adult attachment theory explains employee behavior at work. In the first study, we examined the structure of a measure of adult attachment and its relations with measures of trait affectivity and the Big Five. In the second study, we examined the relations between dimensions of attachment and emotion regulation behaviors, turnover intentions, and supervisory reports of counterproductive work behavior and organizational citizenship behavior. Results showed that anxiety and avoidance represent 2 higher order dimensions of attachment that predicted these criteria (except for counterproductive work behavior) after controlling for individual difference variables and organizational commitment. The implications of these results for the study of attachment at work are discussed.

  11. A Theory for the Function of the Spermaceti Organ of the Sperm Whale (Physeter Catodon L.)

    NASA Technical Reports Server (NTRS)

    Norris, K. S.; Harvey, G. W.

    1972-01-01

    The function of the spermaceti organ of the sperm whale is studied using a model of its acoustic system. Suggested functions of the system include: (1) action as an acoustic resonating and sound focussing chamber to form and process burst-pulsed clicks; (2) use of nasal passages in the forehead for repeated recycling of air for phonation during dives and to provide mirrors for sound reflection and signal processing; and (3) use of the entire system to allow sound signal production especially useful for long-range echolocation in the deep sea.

  12. Predicting organic food consumption: A meta-analytic structural equation model based on the theory of planned behavior.

    PubMed

    Scalco, Andrea; Noventa, Stefano; Sartori, Riccardo; Ceschi, Andrea

    2017-05-01

    During the last decade, the purchase of organic food within a sustainable consumption context has gained momentum. Consequently, the amount of research in the field has increased, leading in some cases to discrepancies regarding both methods and results. The present review examines those works that applied the theory of planned behavior (TPB; Ajzen, 1991) as a theoretical framework in order to understand and predict consumers' motivation to buy organic food. A meta-analysis has been conducted to assess the strength of the relationships between attitude, subjective norms, perceived behavioral control, and intention, as well as between intention and behavior. Results confirm the major role played by individual attitude in shaping buying intention, followed by subjective norms and perceived behavioral control. The intention-behavior relation shows a large effect size; however, few studies explicitly reported such an association. Furthermore, starting from a pooled correlation matrix, a meta-analytic structural equation model has been applied to jointly evaluate the strength of the relationships among the factors of the original model. Results suggest the robustness of the TPB model. In addition, mediation analysis indicates a potential direct effect from subjective norms to individual attitude in the present context. Finally, some issues regarding methodological aspects of the application of the TPB within the context of organic food are discussed for further research developments.
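    The pooled correlation matrix feeding such a meta-analytic structural equation model is built by averaging correlations across studies. Below is a minimal sketch of one common convention (fixed-effect pooling via Fisher's z transform); it is not necessarily the authors' exact procedure, and the study correlations and sample sizes are invented:

```python
import numpy as np

def pool_correlations(rs, ns):
    """Fixed-effect pooling of correlations via Fisher's z transform."""
    rs, ns = np.asarray(rs, float), np.asarray(ns, float)
    zs = np.arctanh(rs)            # Fisher z for each study
    w = ns - 3                     # inverse-variance weights, var(z) = 1/(n-3)
    z_bar = np.sum(w * zs) / np.sum(w)
    return np.tanh(z_bar)          # back-transform to r

# hypothetical attitude-intention correlations from five studies
print(pool_correlations([0.55, 0.48, 0.62, 0.51, 0.58], [120, 210, 95, 300, 150]))
```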

  13. Local electronic structure and nanolevel hierarchical organization of bone tissue: theory and NEXAFS study

    NASA Astrophysics Data System (ADS)

    Pavlychev, A. A.; Avrunin, A. S.; Vinogradov, A. S.; Filatova, E. O.; Doctorov, A. A.; Krivosenko, Yu S.; Samoilenko, D. O.; Svirskiy, G. I.; Konashuk, A. S.; Rostov, D. A.

    2016-12-01

    Theoretical and experimental investigations of native bone are carried out to understand the relationships between its hierarchical organization and the local electronic and atomic structure of the mineralized phase. The 3D superlattice model of a coplanar assembly of hydroxyapatite (HAP) nanocrystallites separated by hydrated nanolayers is introduced to account for the interplay of short-, long- and super-range order parameters in bone tissue. The model is applied to (i) predict and rationalize the HAP-to-bone spectral changes in the electronic structure and (ii) describe the mechanisms ensuring the link of the hierarchical organization with the electronic structure of the mineralized phase in bone. To check the predictions, the near-edge x-ray absorption fine structure (NEXAFS) at the Ca 2p, P 2p and O 1s thresholds is measured for native bone and compared with NEXAFS for reference compounds. The NEXAFS analysis has demonstrated essential hierarchy-induced HAP-to-bone red shifts of the Ca 2p- and P 2p-to-valence transitions. The lowest O 1s excitation line at 532.2 eV in bone is assigned to a superposition of core transitions in the hydroxide OH-(H2O)m anions, Ca2+(H2O)n cations, the carboxyl groups inside the collagen, and [PO4]2- and [PO4]- anions with unsaturated P-O bonds.

  14. Reliability and Maintainability (RAM) Training

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Packard, Michael H. (Editor)

    2000-01-01

    The theme of this manual is failure physics: the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low-cost reliable products. In a broader sense the manual should do more. It should underscore the urgent need for mature attitudes toward reliability. Five of the chapters were originally presented as a classroom course to over 1000 Martin Marietta engineers and technicians. Another four chapters and three appendixes have been added. We begin with a view of reliability from the years 1940 to 2000. Chapter 2 starts the training material with a review of mathematics and a description of what elements contribute to product failures. The remaining chapters elucidate basic reliability theory and the disciplines that allow us to control and eliminate failures.
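    For readers new to the basic reliability theory such a manual covers, the simplest building block is the constant-failure-rate (exponential) model. A minimal sketch with illustrative numbers:

```python
import math

def reliability(t_hours, mtbf_hours):
    """R(t) = exp(-t/MTBF) for a constant failure rate (exponential model)."""
    return math.exp(-t_hours / mtbf_hours)

# illustrative numbers only: a unit with a 50,000 h MTBF on a 5,000 h mission
print(f"R = {reliability(5000, 50000):.3f}")  # ~0.905
```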

  15. Methodology for Software Reliability Prediction. Volume 1.

    DTIC Science & Technology

    1987-11-01

    models to express software reliability in terms of fault density (the number of faults per executable lines of code) and failure rate (the number of failures per unit time)... Recoverable table-of-contents fragments: Software Reliability Measurements; Software Quality Measurement Framework; A Software Reliability Measurement Model; A Model of the Software Failure Process; Organization of Software Reliability Measurements.

  16. The metabolic pace-of-life model: incorporating ectothermic organisms into the theory of vertebrate ecoimmunology.

    PubMed

    Sandmeier, Franziska C; Tracy, Richard C

    2014-09-01

    We propose a new heuristic model that incorporates metabolic rate and pace of life to predict a vertebrate species' investment in adaptive immune function. Using reptiles as an example, we hypothesize that animals with low metabolic rates will invest more in innate immunity compared with adaptive immunity. High metabolic rates and body temperatures should logically optimize the efficacy of the adaptive immune system--through rapid replication of T and B cells, prolific production of induced antibodies, and the kinetics of antibody-antigen interactions. In current theory, the precise mechanisms of vertebrate immune function often are inadequately considered as diverse selective pressures on the evolution of pathogens. We propose that the strength of adaptive immune function and pace of life together determine many of the important dynamics of host-pathogen evolution, namely, that hosts with a short lifespan and innate immunity or with a long lifespan and strong adaptive immunity are expected to drive the rapid evolution of their populations of pathogens. Long-lived hosts that rely primarily on innate immune functions are more likely to use defense mechanisms of tolerance (instead of resistance), which are not expected to act as a selection pressure for the rapid evolution of pathogens' virulence.

  17. From organized high throughput data to phenomenological theory: The example of dielectric breakdown

    NASA Astrophysics Data System (ADS)

    Kim, Chiho; Pilania, Ghanshyam; Ramprasad, Rampi

    Understanding the behavior (and failure) of dielectric insulators experiencing extreme electric fields is critical to the operation of present and emerging electrical and electronic devices. Despite its importance, the development of a predictive theory of dielectric breakdown has remained a challenge, owing to the complex multiscale nature of this process. Here, we focus on the intrinsic dielectric breakdown field of insulators--the theoretical limit of breakdown determined purely by the chemistry of the material, i.e., the elements the material is composed of, the atomic-level structure, and the bonding. Starting from a benchmark dataset (generated from laborious first principles computations) of the intrinsic dielectric breakdown field of a variety of model insulators, simple predictive phenomenological models of dielectric breakdown are distilled using advanced statistical or machine learning schemes, revealing key correlations and analytical relationships between the breakdown field and easily accessible material properties. The models are shown to be general, and can hence guide the screening and systematic identification of high electric field tolerant materials.
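    The "distillation" step described above amounts to regressing the computed breakdown field on easily accessible properties. The sketch below fits a power law by linear regression in log space; the two features and all numbers are hypothetical stand-ins, not the paper's dataset or its actual model form:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical feature matrix: [band gap (eV), max phonon frequency (THz)]
X = np.array([[5.5, 32.0], [9.1, 25.0], [3.9, 8.0], [11.8, 40.0], [6.2, 15.0]])
y = np.array([400.0, 600.0, 90.0, 1100.0, 300.0])  # breakdown field, MV/m (made up)

# log-log linear regression == fitting exponents of a power-law relationship
model = LinearRegression().fit(np.log(X), np.log(y))
print("power-law exponents:", model.coef_, "intercept:", model.intercept_)
```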

  18. Mean-field theory of atomic self-organization in optical cavities

    NASA Astrophysics Data System (ADS)

    Jäger, Simon B.; Schütz, Stefan; Morigi, Giovanna

    2016-08-01

    Photons mediate long-range optomechanical forces between atoms in high-finesse resonators, which can induce the formation of ordered spatial patterns. When a transverse laser drives the atoms, the system undergoes a second-order phase transition that separates a uniform spatial density from a Bragg grating maximizing scattering into the cavity and is controlled by the laser intensity. Starting from a Fokker-Planck equation describing the semiclassical dynamics of the N -atom distribution function, we systematically develop a mean-field model and analyze its predictions for the equilibrium and out-of-equilibrium dynamics. The validity of the mean-field model is tested by comparison with the numerical simulations of the N -body Fokker-Planck equation and by means of a Bogoliubov-Born-Green-Kirkwood-Yvon (BBGKY) hierarchy. The mean-field theory predictions well reproduce several results of the N -body Fokker-Planck equation for sufficiently short times and are in good agreement with existing theoretical approaches based on field-theoretical models. The mean field, on the other hand, predicts thermalization time scales which are at least one order of magnitude shorter than the ones predicted by the N -body dynamics. We attribute this discrepancy to the fact that the mean-field ansatz discards the effects of the long-range incoherent forces due to cavity losses.

  19. Computational organic chemistry: bridging theory and experiment in establishing the mechanisms of chemical reactions.

    PubMed

    Cheng, Gui-Juan; Zhang, Xinhao; Chung, Lung Wa; Xu, Liping; Wu, Yun-Dong

    2015-02-11

    Understanding the mechanisms of chemical reactions, especially catalysis, has been an important and active area of computational organic chemistry, and close collaborations between experimentalists and theorists represent a growing trend. This Perspective provides examples of such productive collaborations. The understanding of various reaction mechanisms and the insight gained from these studies are emphasized. The applications of various experimental techniques in elucidation of reaction details as well as the development of various computational techniques to meet the demand of emerging synthetic methods, e.g., C-H activation, organocatalysis, and single electron transfer, are presented along with some conventional developments of mechanistic aspects. Examples of applications are selected to demonstrate the advantages and limitations of these techniques. Some challenges in the mechanistic studies and predictions of reactions are also analyzed.

  20. Information and Theory of Organizations as a Conceptual Framework for System Design of Automated Medical Information Systems

    PubMed Central

    Fuchs-Kittowski, K.

    1982-01-01

    To date, the design of hospital information systems has been the province of hardware and software specialists. The theories of information and social organizations can contribute to the design of information systems by stressing the principles of formalization and the differences between routine and non-routine tasks, with their accompanying effect on worker satisfaction and organizational efficiency. In particular, the difference between the needs of research hospitals and care hospitals will be discussed. Note: This is an edited version of the longer paper prepared by Professor Fuchs-Kittowski. The editing was done by Vincent Brannigan of the University of Maryland. Professor Fuchs-Kittowski was unable to review the edited version, so this version should not be quoted without reference to the original paper. Please contact Professor Brannigan for a copy of the original paper. All citations are those of the original paper.

  1. Theory of Current Transients in Planar Semiconductor Devices: Insights and Applications to Organic Solar Cells

    NASA Astrophysics Data System (ADS)

    Hawks, Steven A.; Finck, Benjamin Y.; Schwartz, Benjamin J.

    2015-04-01

    Time-domain current measurements are widely used to characterize semiconductor material properties, such as carrier mobility, doping concentration, carrier lifetime, and the static dielectric constant. It is therefore critical that these measurements be theoretically understood if they are to be successfully applied to assess the properties of materials and devices. In this paper, we derive generalized relations for describing current-density transients in planar semiconductor devices at uniform temperature. By spatially averaging the charge densities inside the semiconductor, we are able to provide a rigorous, straightforward, and experimentally relevant way to interpret these measurements. The formalism details several subtle aspects of current transients, including how the electrode charge relates to applied bias and internal space charge, how the displacement current can alter the apparent free-carrier current, and how to understand the integral of a charge-extraction transient. We also demonstrate how the formalism can be employed to derive the current transients arising from simple physical models, like those used to describe charge extraction by linearly increasing voltage (CELIV) and time-of-flight experiments. In doing so, we find that there is a nonintuitive factor-of-2 reduction in the apparent free-carrier concentration that can be easily missed, for example, in the application of charge-extraction models. Finally, to validate our theory and better understand the different current contributions, we perform a full time-domain drift-diffusion simulation of a CELIV trace and compare the results to our formalism. As expected, our analytic equations match precisely with the numerical solutions to the drift-diffusion, Poisson, and continuity equations. Thus, overall, our formalism provides a straightforward and general way to think about how the internal space-charge distribution, the electrode charge, and the externally applied bias translate into a measured

  2. Disrupted Brain Functional Organization in Epilepsy Revealed by Graph Theory Analysis.

    PubMed

    Song, Jie; Nair, Veena A; Gaggl, Wolfgang; Prabhakaran, Vivek

    2015-06-01

    The human brain is a complex and dynamic system that can be modeled as a large-scale brain network to better understand the reorganizational changes secondary to epilepsy. In this study, we developed a brain functional network model using graph theory methods applied to resting-state fMRI data acquired from a group of epilepsy patients and age- and gender-matched healthy controls. A brain functional network model was constructed based on resting-state functional connectivity. A minimum spanning tree combined with a proportional thresholding approach was used to obtain sparse connectivity matrices for each subject, which formed the basis of brain networks. We examined the brain reorganizational changes in epilepsy thoroughly at the level of the whole brain, the functional network, and individual brain regions. At the whole-brain level, local efficiency was significantly decreased in epilepsy patients compared with the healthy controls. However, global efficiency was significantly increased in epilepsy due to an increased number of functional connections between networks (although weakly connected). At the functional network level, there were significant proportions of newly formed connections between the default mode network and other networks and between the subcortical network and other networks. There was a significant proportion of decreasing connections between the cingulo-opercular task control network and other networks. Individual brain regions from different functional networks, however, showed a distinct pattern of reorganizational changes in epilepsy. These findings suggest that epilepsy alters brain efficiency in a consistent pattern at the whole-brain level, yet alters brain functional networks and individual brain regions differently.
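    The sparsification scheme described above, a minimum spanning tree backbone topped up with the strongest remaining connections until a target density is reached, can be sketched in a few lines with networkx. This illustrates the general idea rather than the authors' pipeline; the toy data and the 10% density target are assumptions:

```python
import numpy as np
import networkx as nx

def mst_plus_threshold(corr, density=0.10):
    """Sparse graph: MST backbone plus strongest edges up to a target density."""
    n = corr.shape[0]
    dist = 1.0 - np.abs(corr)              # strong correlation -> short distance
    G = nx.minimum_spanning_tree(nx.from_numpy_array(dist))  # connected backbone
    target = int(density * n * (n - 1) / 2)
    # add remaining edges in order of decreasing |correlation|
    candidates = sorted(((abs(corr[i, j]), i, j)
                         for i in range(n) for j in range(i + 1, n)
                         if not G.has_edge(i, j)), reverse=True)
    for _, i, j in candidates:
        if G.number_of_edges() >= target:
            break
        G.add_edge(i, j)
    return G

corr = np.corrcoef(np.random.rand(30, 200))   # toy "ROI x time" data
G = mst_plus_threshold(corr)
print(nx.global_efficiency(G), nx.local_efficiency(G))
```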

  3. The semantic organization of the animal category: evidence from semantic verbal fluency and network theory.

    PubMed

    Goñi, Joaquín; Arrondo, Gonzalo; Sepulcre, Jorge; Martincorena, Iñigo; Vélez de Mendizábal, Nieves; Corominas-Murtra, Bernat; Bejarano, Bartolomé; Ardanza-Trevijano, Sergio; Peraita, Herminia; Wall, Dennis P; Villoslada, Pablo

    2011-05-01

    Semantic memory is the subsystem of human memory that stores knowledge of concepts or meanings, as opposed to life-specific experiences. How humans organize semantic information remains poorly understood. In an effort to better understand this issue, we conducted a verbal fluency experiment on 200 participants with the aim of inferring and representing the conceptual storage structure of the natural category of animals as a network. This was done by formulating a statistical framework for co-occurring concepts that aims to infer significant concept-concept associations and represent them as a graph. The resulting network was analyzed and enriched by means of a missing-links recovery criterion based on modularity. Both network models were compared to a thresholded co-occurrence approach. They were evaluated using a random subset of verbal fluency tests and comparing the network outcomes (linked pairs are clustering transitions and disconnected pairs are switching transitions) to the outcomes of two expert human raters. Results show that the network models proposed in this study outperform a thresholded co-occurrence approach, and their outcomes are in high agreement with human evaluations. Finally, the interplay between conceptual structure and retrieval mechanisms is discussed.
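    The core bookkeeping behind such a network, counting which concepts co-occur across many fluency lists and keeping pairs that recur, can be sketched as below. The adjacency window and the minimum-count cutoff are crude stand-ins for the paper's statistical significance test:

```python
from itertools import combinations
from collections import Counter

def cooccurrence_graph(fluency_lists, min_count=2):
    """Count adjacent co-occurrences across fluency lists; keep pairs that
    appear together in at least `min_count` lists."""
    pair_counts = Counter()
    for words in fluency_lists:
        seen = set()
        for a, b in zip(words, words[1:]):        # adjacent concepts co-occur
            pair = tuple(sorted((a, b)))
            if pair not in seen:                  # count once per list
                pair_counts[pair] += 1
                seen.add(pair)
    return {p: c for p, c in pair_counts.items() if c >= min_count}

lists = [["cat", "dog", "lion", "tiger"], ["dog", "cat", "cow", "horse"],
         ["lion", "tiger", "cat", "dog"]]
print(cooccurrence_graph(lists))
```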

  4. Charge Photogeneration Experiments and Theory in Aggregated Squaraine Donor Materials for Improved Organic Solar Cell Efficiencies

    NASA Astrophysics Data System (ADS)

    Spencer, Susan Demetra

    Fossil fuel consumption has a deleterious effect on humans, the economy, and the environment. Renewable energy technologies must be identified and commercialized as quickly as possible so that the transition to renewables can happen at a minimum of financial and societal cost. Organic photovoltaic cells offer an inexpensive and disruptive energy technology, if the scientific challenges of understanding charge photogeneration in a bulk heterojunction material can be overcome. At RIT, there is a strong focus on creating new materials that can both offer fundamentally important scientific results relating to quantum photophysics, and simultaneously assist in the development of strong candidates for future commercialized technology. In this presentation, the results of intensive materials characterization of a series of squaraine small molecule donors will be presented, as well as a full study of the fabrication and optimization required to achieve >4% photovoltaic cell efficiency. A relationship between the molecular structure of the squaraine and its ability to form nanoscale aggregates will be explored. Squaraine aggregation will be described as a unique optoelectronic probe of the structure of the bulk heterojunction. This relationship will then be utilized to explain changes in crystallinity that impact the overall performance of the devices. Finally, a predictive summary will be given for the future of donor material research at RIT.

  5. Reliability and Validity of the World Health Organization Quality of Life: Brief Version (WHOQOL-BREF) in a Homeless Substance Dependent Veteran Population

    ERIC Educational Resources Information Center

    Garcia-Rea, Elizabeth A.; LePage, James P.

    2010-01-01

    With the high number of homeless, there is a critical need for rapid and accurate assessment of quality of life to assess program outcomes. The World Health Organization's WHOQOL-100 has demonstrated promise in accurately assessing quality-of-life in this population. However, its length may make large scale use impractical for working with a…

  6. Organics.

    ERIC Educational Resources Information Center

    Chian, Edward S. K.; DeWalle, Foppe B.

    1978-01-01

    Presents water analysis literature for 1978. This review is concerned with organics, and it covers: (1) detergents and surfactants; (2) aliphatic and aromatic hydrocarbons; (3) pesticides and chlorinated hydrocarbons; and (4) naturally occurring organics. A list of 208 references is also presented. (HM)

  7. Kleiber's Law: How the Fire of Life ignited debate, fueled theory, and neglected plants as model organisms

    PubMed Central

    Niklas, Karl J; Kutschera, Ulrich

    2015-01-01

    Size is a key feature of any organism since it influences the rate at which resources are consumed and thus affects metabolic rates. In the 1930s, size-dependent relationships were codified as “allometry” and it was shown that most of these could be quantified using the slopes of log-log plots of any 2 variables of interest. During the decades that followed, physiologists explored how animal respiration rates varied as a function of body size across taxa. The expectation was that rates would scale as the 2/3 power of body size as a reflection of the Euclidean relationship between surface area and volume. However, the work of Max Kleiber (1893–1976) and others revealed that animal respiration rates apparently scale more closely as the 3/4 power of body size. This phenomenology, which is called “Kleiber's Law,” has been described for a broad range of organisms, including some algae and plants. It has also been severely criticized on theoretical and empirical grounds. Here, we review the history of the analysis of metabolism, which originated with the works of Antoine L. Lavoisier (1743–1794) and Julius Sachs (1832–1897), and culminated in Kleiber's book The Fire of Life (1961; 2nd ed. 1975). We then evaluate some of the criticisms that have been leveled against Kleiber's Law and some examples of the theories that have tried to explain it. We revive the speculation that intracellular exo- and endocytotic processes are resource delivery systems, analogous to the supercellular systems in multicellular organisms. Finally, we present data that cast doubt on the existence of a single scaling relationship between growth and body size in plants. PMID:26156204

  8. Kleiber's Law: How the Fire of Life ignited debate, fueled theory, and neglected plants as model organisms.

    PubMed

    Niklas, Karl J; Kutschera, Ulrich

    2015-01-01

    Size is a key feature of any organism since it influences the rate at which resources are consumed and thus affects metabolic rates. In the 1930s, size-dependent relationships were codified as "allometry" and it was shown that most of these could be quantified using the slopes of log-log plots of any 2 variables of interest. During the decades that followed, physiologists explored how animal respiration rates varied as a function of body size across taxa. The expectation was that rates would scale as the 2/3 power of body size as a reflection of the Euclidean relationship between surface area and volume. However, the work of Max Kleiber (1893-1976) and others revealed that animal respiration rates apparently scale more closely as the 3/4 power of body size. This phenomenology, which is called "Kleiber's Law," has been described for a broad range of organisms, including some algae and plants. It has also been severely criticized on theoretical and empirical grounds. Here, we review the history of the analysis of metabolism, which originated with the works of Antoine L. Lavoisier (1743-1794) and Julius Sachs (1832-1897), and culminated in Kleiber's book The Fire of Life (1961; 2nd ed. 1975). We then evaluate some of the criticisms that have been leveled against Kleiber's Law and some examples of the theories that have tried to explain it. We revive the speculation that intracellular exo- and endocytotic processes are resource delivery systems, analogous to the supercellular systems in multicellular organisms. Finally, we present data that cast doubt on the existence of a single scaling relationship between growth and body size in plants.
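    The allometric analysis described in this review reduces to fitting the slope of a log-log plot. A minimal sketch with invented mass/metabolic-rate pairs (a fitted slope near 0.75 would match Kleiber's Law, near 0.67 the surface-area expectation):

```python
import numpy as np

# Allometric scaling: fit b in log10(rate) = log10(a) + b * log10(mass).
# Masses (kg) and metabolic rates (W) are illustrative, not Kleiber's data.
mass = np.array([0.02, 0.3, 4.0, 70.0, 4000.0])
rate = np.array([0.2, 1.4, 9.5, 90.0, 2300.0])
b, log_a = np.polyfit(np.log10(mass), np.log10(rate), 1)
print(f"fitted exponent b = {b:.2f}")
```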

  9. Reliability of Scores on the Summative Performance Assessments

    ERIC Educational Resources Information Center

    Yang, Yanyun; Oosterhof, Albert; Xia, Yan

    2015-01-01

    The authors address the reliability of scores obtained on the summative performance assessments during the pilot year of our research. In contrast to classical test theory, we discussed the advantages of using generalizability theory for estimating the reliability of scores for summative performance assessments. Generalizability theory was used as the…
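    For a one-facet (persons crossed with items) design, the generalizability coefficient favored here over classical indices has a simple closed form. A sketch with illustrative variance components, which would normally come from a G-study ANOVA:

```python
def g_coefficient(var_person, var_interaction_error, n_items):
    """Relative G coefficient for a one-facet (person x item) design:
    E(rho^2) = sigma^2_p / (sigma^2_p + sigma^2_pi,e / n_i)."""
    return var_person / (var_person + var_interaction_error / n_items)

# illustrative variance components
print(g_coefficient(var_person=0.40, var_interaction_error=0.60, n_items=10))  # ~0.87
```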

  10. Reliability and Expected Loss: A Unifying Principle.

    ERIC Educational Resources Information Center

    Cooil, Bruce; Rust, Roland T.

    1994-01-01

    It is proposed that proportional reduction in loss (PRL) be used as a theoretical basis to derive, justify, and interpret reliability measures to gauge reliability on a zero-to-one scale. This PRL approach simplifies the interpretation of existing measures (e.g., generalizability-theory measures). (SLD)

  11. Reliability model generator

    NASA Technical Reports Server (NTRS)

    McMann, Catherine M. (Inventor); Cohen, Gerald C. (Inventor)

    1991-01-01

    An improved method and system for automatically generating reliability models for use with a reliability evaluation tool is described. The reliability model generator of the present invention includes means for storing a plurality of low level reliability models which represent the reliability characteristics for low level system components. In addition, the present invention includes means for defining the interconnection of the low level reliability models via a system architecture description. In accordance with the principles of the present invention, a reliability model for the entire system is automatically generated by aggregating the low level reliability models based on the system architecture description.
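    The aggregation step described above, composing low-level reliability models according to an architecture description, can be illustrated with a recursive evaluator for nested series/parallel block diagrams. This is a toy stand-in for what such a generator feeds to ASSIST/SURE, not the patented method itself:

```python
def system_reliability(node):
    """Evaluate a nested series/parallel block diagram.
    Leaves are component reliabilities; nodes are ('series'|'parallel', [children])."""
    if isinstance(node, float):
        return node
    kind, children = node
    rs = [system_reliability(c) for c in children]
    if kind == "series":                  # all blocks must work
        out = 1.0
        for r in rs:
            out *= r
        return out
    fail = 1.0                            # parallel: fails only if all fail
    for r in rs:
        fail *= (1.0 - r)
    return 1.0 - fail

# two redundant sensors (0.95 each) in series with a processor (0.99)
arch = ("series", [("parallel", [0.95, 0.95]), 0.99])
print(f"{system_reliability(arch):.4f}")  # ~0.9875
```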

  12. Packaging Theory.

    ERIC Educational Resources Information Center

    Williams, Jeffrey

    1994-01-01

    Considers the recent flood of anthologies of literary criticism and theory as exemplifications of the confluence of pedagogical concerns, economics of publishing, and other historical factors. Looks specifically at how these anthologies present theory. Cites problems with their formatting theory and proposes alternative ways of organizing theory…

  13. Reliability beyond Theory and into Practice

    ERIC Educational Resources Information Center

    Sijtsma, Klaas

    2009-01-01

    The critical reactions of Bentler (2009, doi: 10.1007/s11336-008-9100-1), Green and Yang (2009a, doi: 10.1007/s11336-008-9098-4 ; 2009b, doi: 10.1007/s11336-008-9099-3), and Revelle and Zinbarg (2009, doi: 10.1007/s11336-008-9102-z) to Sijtsma's (2009, doi: 10.1007/s11336-008-9101-0) paper on Cronbach's alpha are addressed. The dissemination of…
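    For reference in this debate, the coefficient at issue is straightforward to compute; the critiques concern its interpretation, not its arithmetic. A minimal implementation with simulated item scores:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    items = np.asarray(items, float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
true_score = rng.normal(size=(200, 1))
data = true_score + rng.normal(scale=0.8, size=(200, 5))  # 5 noisy items
print(f"alpha = {cronbach_alpha(data):.2f}")
```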

  14. An Introduction to Structural Reliability Theory

    DTIC Science & Technology

    1989-01-01

    [Garbled extraction of equations; recoverable: equation (5.47) is to be solved for the reliability index β.] ... a straight line on a semi-log plot (Figure 9.1), and c = 2 results in the Rayleigh distribution. Guidelines for c for platforms in the Gulf of Mexico are... [Figure residue removed; recoverable labels: shoaling (shelf profile), natural period, Gulf of Mexico.]

  15. Reliability Generalization: "Lapsus Linguae"

    ERIC Educational Resources Information Center

    Smith, Julie M.

    2011-01-01

    This study examines the proposed Reliability Generalization (RG) method for studying reliability. RG employs the application of meta-analytic techniques similar to those used in validity generalization studies to examine reliability coefficients. This study explains why RG does not provide a proper research method for the study of reliability,…

  16. Crystallization force--a density functional theory concept for revealing intermolecular interactions and molecular packing in organic crystals.

    PubMed

    Li, Tonglei; Ayers, Paul W; Liu, Shubin; Swadley, Matthew J; Aubrey-Medendorp, Clare

    2009-01-01

    Organic molecules are prone to polymorphic formation in the solid state due to the rich diversity of functional groups that results in comparable intermolecular interactions, which can be greatly affected by the selection of solvent and other crystallization conditions. Intermolecular interactions are typically weak forces, such as van der Waals and stronger short-range ones including hydrogen bonding, that are believed to determine the packing of organic molecules during the crystal-growth process. A different packing of the same molecules leads to the formation of a new crystal structure. To disclose the underlying causes that drive the molecule to have various packing motifs in the solid state, an electronic concept or function within the framework of conceptual density functional theory has been developed, namely, crystallization force. The concept aims to describe the local change in electronic structure as a result of the self-assembly process of crystallization and may likely quantify the locality of intermolecular interactions that directs the molecular packing in a crystal. To assess the applicability of the concept, 5-methyl-2-[(2-nitrophenyl)amino]-3-thiophenecarbonitrile, so-called ROY, which is known to have the largest number of solved polymorphs, has been examined. Electronic calculations were conducted on the seven available crystal structures as well as on the single molecule. The electronic structures were analyzed and crystallization force values were obtained. The results indicate that the crystallization forces are able to reveal intermolecular interactions in the crystals, in particular, the close contacts that are formed between molecules. Strong correlations exist between the total crystallization force and lattice energy of a crystal structure, further suggesting the underlying connection between the crystallization force and molecular packing.

  17. Can There Be Reliability without "Reliability?"

    ERIC Educational Resources Information Center

    Mislevy, Robert J.

    2004-01-01

    An "Educational Researcher" article by Pamela Moss (1994) asks the title question, "Can there be validity without reliability?" Yes, she answers, if by reliability one means "consistency among independent observations intended as interchangeable" (Moss, 1994, p. 7), quantified by internal consistency indices such as…

  18. The Trofobiose Theory and organic agriculture: the active mobilization of nutrients and the use of rock powder as a tool for sustainability.

    PubMed

    Polito, Wagner L

    2006-12-01

    The primary objective of the present paper is to link some relevant concepts on the use of ecological agricultural practices to the production of food crops. In a special section, the Trofobiose Theory and the principle of Active Dissolution of Rocks are considered as important tools for improving the sustainability of Organic, Biodynamic, and Process Agricultures.

  19. [Ambulatory, interdisciplinary team work in the tension field between theory and practice--Vorarlberg social medicine organization].

    PubMed

    Girardi, P; Acherer, E; Holzapfl, M; Strebl, L

    1998-11-01

    Presented is the Social-Medical Organization active in the field of ambulatory neurological care of adults in Vorarlberg, Austria, which offers interdisciplinary cooperation in the form of team consultation, case presentation and discussion, as well as supervision. Ambulatory interdisciplinary teamwork sits in a field of tension between theory and practice: occupational training fails to teach interdisciplinary cooperation, and role models for interdisciplinarity are not encountered during practical placements either. The problems and issues thus deliberately identified have been addressed in a planned process. Familiarization with the various occupational fields involved, each with its specific job profile, as well as the notions of cooperation among the various fields, are presented. The roles the various occupational fields play in neurological aftercare, as well as existing job-profile clichés, are reflected upon. Communication is further hampered by the diversity of training content across the occupational fields. By focussing on case presentation and discussion, teams come to be seen as a place for obtaining advice and for the joint development of targets and strategies, and interdisciplinary interfacing is no longer perceived as threatening but as enriching and productive. The role of stronger family doctor involvement in formulating therapy goals remains an unsolved issue.

  20. Phosphorescence lifetimes of organic light-emitting diodes from two-component time-dependent density functional theory

    SciTech Connect

    Kühn, Michael; Weigend, Florian

    2014-12-14

    “Spin-forbidden” transitions are calculated for an eight-membered set of iridium-containing candidate molecules for organic light-emitting diodes (OLEDs) using two-component time-dependent density functional theory. Phosphorescence lifetimes (obtained from averaging over relevant excitations) are compared to experimental data. Assessment of parameters like non-distorted and distorted geometric structures, density functionals, relativistic Hamiltonians, and basis sets was done by a thorough study for Ir(ppy)3 focussing not only on averaged phosphorescence lifetimes, but also on the agreement of the triplet substate structure with experimental data. The most favorable methods were applied to an eight-membered test set of OLED candidate molecules; Boltzmann-averaged phosphorescence lifetimes were investigated concerning the convergence with the number of excited states and the changes when including solvent effects. Finally, a simple model for sorting out molecules with long averaged phosphorescence lifetimes is developed by visual inspection of computationally easily achievable one-component frontier orbitals.
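    The Boltzmann averaging over triplet substates mentioned above follows a standard convention: the observed emission rate is the population-weighted mean of the substate rates. The sketch below assumes that convention; the substate lifetimes and zero-field splittings are illustrative placeholders, not values from the paper:

```python
import numpy as np

def boltzmann_averaged_lifetime(tau_s, delta_E_cm, T=298.0):
    """tau_av = 1 / k_av with k_av the Boltzmann-weighted mean of substate
    rates; substate energies in cm^-1 relative to the lowest substate."""
    kB_cm = 0.695  # Boltzmann constant in cm^-1 per K
    w = np.exp(-np.asarray(delta_E_cm) / (kB_cm * T))   # substate populations
    k_av = w @ (1.0 / np.asarray(tau_s)) / w.sum()
    return 1.0 / k_av

# illustrative substate lifetimes (s) and zero-field splittings (cm^-1)
print(boltzmann_averaged_lifetime([116e-6, 6.4e-6, 0.2e-6], [0.0, 13.5, 83.5]))
```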

  1. Workplace support, discrimination, and person-organization fit: tests of the theory of work adjustment with LGB individuals.

    PubMed

    Velez, Brandon L; Moradi, Bonnie

    2012-07-01

    The present study explored the links of 2 workplace contextual variables--perceptions of workplace heterosexist discrimination and lesbian, gay, and bisexual (LGB)-supportive climates--with job satisfaction and turnover intentions in a sample of LGB employees. An extension of the theory of work adjustment (TWA) was used as the conceptual framework for the study; as such, perceived person-organization (P-O) fit was tested as a mediator of the relations between the workplace contextual variables and job outcomes. Data were analyzed from 326 LGB employees. Zero-order correlations indicated that perceptions of workplace heterosexist discrimination and LGB-supportive climates were correlated in expected directions with P-O fit, job satisfaction, and turnover intentions. Structural equation modeling (SEM) was used to compare multiple alternative measurement models evaluating the discriminant validity of the 2 workplace contextual variables relative to one another, and the 3 TWA job variables relative to one another; SEM was also used to test the hypothesized mediation model. Comparisons of multiple alternative measurement models supported the construct distinctiveness of the variables of interest. The test of the hypothesized structural model revealed that only LGB-supportive climates (and not workplace heterosexist discrimination) had a unique direct positive link with P-O fit and, through the mediating role of P-O fit, had significant indirect positive and negative relations with job satisfaction and turnover intentions, respectively. Moreover, P-O fit had a significant indirect negative link with turnover intentions through job satisfaction.

  2. The effects of instructors' autonomy support and students' autonomous motivation on learning organic chemistry: A self-determination theory perspective

    NASA Astrophysics Data System (ADS)

    Black, Aaron E.; Deci, Edward L.

    2000-11-01

    This prospective study applied self-determination theory to investigate the effects of students' course-specific self-regulation and their perceptions of their instructors' autonomy support on adjustment and academic performance in a college-level organic chemistry course. The study revealed that: (1) students' reports of entering the course for relatively autonomous (vs. controlled) reasons predicted higher perceived competence and interest/enjoyment and lower anxiety and grade-focused performance goals during the course, and were related to whether or not the students dropped the course; and (2) students' perceptions of their instructors' autonomy support predicted increases in autonomous self-regulation, perceived competence, and interest/enjoyment, and decreases in anxiety over the semester. The change in autonomous self-regulation in turn predicted students' performance in the course. Further, instructor autonomy support also predicted course performance directly, although differences in the initial level of students' autonomous self-regulation moderated that effect, with autonomy support relating strongly to academic performance for students initially low in autonomous self-regulation but not for students initially high in autonomous self-regulation.

  3. Self-organized criticality as Witten-type topological field theory with spontaneously broken Becchi-Rouet-Stora-Tyutin symmetry

    SciTech Connect

    Ovchinnikov, Igor V.

    2011-05-15

    Here, a scenario is proposed, according to which a generic self-organized critical (SOC) system can be looked upon as a Witten-type topological field theory (W-TFT) with spontaneously broken Becchi-Rouet-Stora-Tyutin (BRST) symmetry. One of the conditions for the SOC is the slow driving noise, which unambiguously suggests the Stratonovich interpretation of the corresponding stochastic differential equation (SDE). This, in turn, necessitates the use of the Parisi-Sourlas-Wu stochastic quantization procedure, which straightforwardly leads to a model with BRST-exact action, i.e., to a W-TFT. In the parameter space of the SDE, there must exist full-dimensional regions where the BRST symmetry is spontaneously broken by instantons, which in the context of SOC are essentially avalanches. In these regions, the avalanche-type SOC dynamics is liberated from an otherwise rightful, dynamics-less W-TFT, and a Goldstone mode of Faddeev-Popov ghosts exists. Goldstinos represent moduli of instantons (avalanches) and, being gapless, are responsible for the critical avalanche distribution in the low-energy, long-wavelength limit. The above arguments are robust against moderate variations of the SDE's parameters, and the criticality is 'self-tuned'. The proposition of this paper suggests that the machinery of W-TFTs may find its applications in many different areas of modern science studying various physical realizations of SOC. It also suggests that there may in principle exist a connection between some SOCs and the concept of topological quantum computing.

  4. Luminescent properties of metal-organic framework MOF-5: relativistic time-dependent density functional theory investigations.

    PubMed

    Ji, Min; Lan, Xin; Han, Zhenping; Hao, Ce; Qiu, Jieshan

    2012-11-19

    The electronically excited state and luminescence property of metal-organic framework MOF-5 were investigated using relativistic density functional theory (DFT) and time-dependent DFT (TDDFT). The geometry, IR spectra, and UV-vis spectra of MOF-5 in the ground state were calculated using relativistic DFT, leading to good agreement between the experimental and theoretical results. The frontier molecular orbitals and electronic configuration indicated that the luminescence mechanism in MOF-5 follows ligand-to-ligand charge transfer (LLCT), namely, π* → π, rather than emission with the ZnO quantum dot (QD) proposed by Bordiga et al. The geometry and IR spectra of MOF-5 in the electronically excited state have been calculated using the relativistic TDDFT and compared with those for the ground state. The comparison reveals that the Zn4O13 QD is rigid, whereas the ligands BDC(2-) are nonrigid. In addition, the calculated emission band of MOF-5 is in good agreement with the experimental result and is similar to that of the ligand H2BDC. The combined results confirmed that the luminescence mechanism for MOF-5 should be LLCT with little mixing of the ligand-to-metal charge transfer. The reason for the MOF-5 luminescence is explained by the excellent coplanarity between the six-membered ring consisting of zinc, oxygen, carbon, and the benzene ring.

  5. Analysis of algal bloom risk with uncertainties in lakes by integrating self-organizing map and fuzzy information theory.

    PubMed

    Chen, Qiuwen; Rui, Han; Li, Weifeng; Zhang, Yanhui

    2014-06-01

    Algal blooms are a serious problem in waters, which damage aquatic ecosystems and threaten drinking water safety. However, the outbreak mechanism of algal blooms is very complex with great uncertainty, especially for large water bodies where environmental conditions have obvious variation in both space and time. This study developed an innovative method which integrated a self-organizing map (SOM) and fuzzy information diffusion theory to comprehensively analyze algal bloom risks with uncertainties. The Lake Taihu was taken as study case and the long-term (2004-2010) on-site monitoring data were used. The results showed that algal blooms in Taihu Lake were classified into four categories and exhibited obvious spatial-temporal patterns. The lake was mainly characterized by moderate bloom but had high uncertainty, whereas severe blooms with low uncertainty were observed in the northwest part of the lake. The study gives insight on the spatial-temporal dynamics of algal blooms, and should help government and decision-makers outline policies and practices on bloom monitoring and prevention. The developed method provides a promising approach to estimate algal bloom risks under uncertainties.
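    The SOM half of such a hybrid method can be sketched in plain numpy: repeatedly find the best-matching unit for a sample and pull it and its grid neighbours toward the sample while the learning rate and neighbourhood radius shrink. This is a generic SOM, not the authors' pipeline, and the data shape is an assumption:

```python
import numpy as np

def train_som(data, grid=(8, 8), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    """Minimal self-organizing map trained by online stochastic updates."""
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.random((h, w, data.shape[1]))
    ii, jj = np.mgrid[0:h, 0:w]                    # grid coordinates
    n_steps = epochs * len(data)
    for step in range(n_steps):
        x = data[rng.integers(len(data))]
        frac = step / n_steps
        lr = lr0 * (1.0 - frac)                    # decaying learning rate
        sigma = sigma0 * (1.0 - frac) + 0.5        # shrinking neighbourhood
        # locate the best-matching unit, then apply a Gaussian neighbourhood update
        bi, bj = np.unravel_index(np.argmin(((weights - x) ** 2).sum(axis=2)), (h, w))
        nb = np.exp(-((ii - bi) ** 2 + (jj - bj) ** 2) / (2.0 * sigma ** 2))
        weights += lr * nb[:, :, None] * (x - weights)
    return weights

# e.g., 500 samples of 4 water-quality indicators, scaled to [0, 1]
som = train_som(np.random.rand(500, 4))
print(som.shape)  # (8, 8, 4): one prototype vector per map cell
```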

  6. 76 FR 58424 - Transmission Relay Loadability Reliability Standard

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-21

    ... Energy Regulatory Commission 18 CFR Parts 39 and 40 Transmission Relay Loadability Reliability Standard... section 215 of the Federal Power Act, the Commission proposes to approve Reliability Standard PRC-023-2... Reliability Corporation (NERC), the Electric Reliability Organization (ERO) certified by the Commission....

  7. Conference on Operator Theory, Wavelet Theory and Control Theory

    DTIC Science & Technology

    1993-09-30

    [OCR residue from the report documentation page; recoverable fragments:] The conference on Interaction Between Operator Theory, Wavelet Theory and Control Theory, organized by Professor Xingde Dai (grant F49620-93-1-0180); distribution unlimited. Cited references include Y. Meyer, Ondelettes et opérateurs I (Hermann, 1990) and I. P. Natanson, Theory of…
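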

  8. Covariate-free and Covariate-dependent Reliability.

    PubMed

    Bentler, Peter M

    2016-12-01

    Classical test theory reliability coefficients are said to be population specific. Reliability generalization, a meta-analysis method, is the main procedure for evaluating the stability of reliability coefficients across populations. A new approach is developed to evaluate the degree of invariance of reliability coefficients to population characteristics. Factor or common variance of a reliability measure is partitioned into parts that are, and are not, influenced by control variables, resulting in a partition of reliability into a covariate-dependent and a covariate-free part. The approach can be implemented in a single sample and can be applied to a variety of reliability coefficients.

  9. Effects of London dispersion correction in density functional theory on the structures of organic molecules in the gas phase.

    PubMed

    Grimme, Stefan; Steinmetz, Marc

    2013-10-14

    A benchmark set of 25 rotational constants measured in the gas phase for nine molecules (termed ROT25) was compiled from available experimental data. The medium-sized molecules with 18-35 atoms cover common (bio)organic structure motifs including hydrogen bonding and flexible side chains. They were each considered in a single conformation. The experimental B0 values were back-corrected to reference equilibrium rotational constants (Be) by computation of the vibrational corrections ΔBvib. Various density functional theory (DFT) methods and Hartree-Fock with and without dispersion corrections as well as MP2-type methods and semi-empirical quantum chemical approaches are investigated. The ROT25 benchmark tests their ability to describe covalent bond lengths, longer inter-atomic distances, and the relative orientation of functional groups (intramolecular non-covalent interactions). In general, dispersion corrections to DFT and HF increase Be values (shrink molecular size) significantly by about 0.5-1.5%, thereby in general improving agreement with the reference data. Regarding DFT methods, the overall accuracy of the optimized structures roughly follows the 'Jacob's ladder' classification scheme, i.e., it decreases in the series double-hybrid > (meta)hybrid > (meta)GGA > LDA. With B2PLYP-D3, SCS-MP2, B3LYP-D3/NL, or PW6B95-D3 methods and extended QZVP (def2-TZVP) AO basis sets, Be values accurate to about 0.3-0.6 (0.5-1)% on average can be computed routinely. The accuracy of B2PLYP-D3/QZVP with a mean deviation of only 3 MHz and a standard deviation of 0.24% is exceptional, and we recommend this method when highly accurate structures are required or for problematic conformer assignments. The correlation effects for three inter-atomic distance regimes (covalent, medium-range, long) and the performance of minimal basis set (semi-empirical) methods are discussed.
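    The quantity being benchmarked, the equilibrium rotational constant Be, follows directly from a candidate geometry via the principal moments of inertia, B = h / (8 pi^2 I). A self-contained sketch (the water geometry is an illustrative example, not part of the ROT25 set):

```python
import numpy as np
from scipy.constants import h, atomic_mass

def rotational_constants_MHz(masses_amu, coords_angstrom):
    """Principal moments of the inertia tensor, then B = h / (8 pi^2 I) in MHz.
    Returns [A, B, C]: the smallest moment gives the largest constant."""
    m = np.asarray(masses_amu) * atomic_mass                # kg
    r = np.asarray(coords_angstrom) * 1e-10                 # m
    r = r - (m[:, None] * r).sum(axis=0) / m.sum()          # shift to centre of mass
    I = np.zeros((3, 3))
    for mi, ri in zip(m, r):
        I += mi * (np.dot(ri, ri) * np.eye(3) - np.outer(ri, ri))
    principal = np.linalg.eigvalsh(I)                       # kg m^2, ascending
    return h / (8 * np.pi**2 * principal) / 1e6             # Hz -> MHz

# water, approximate equilibrium geometry (illustrative)
masses = [15.999, 1.008, 1.008]
coords = [[0.0, 0.0, 0.117], [0.0, 0.757, -0.469], [0.0, -0.757, -0.469]]
print(rotational_constants_MHz(masses, coords))
```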

  10. Cognitive decision errors and organization vulnerabilities in nuclear power plant safety management: Modeling using the TOGA meta-theory framework

    SciTech Connect

    Cappelli, M.; Gadomski, A. M.; Sepiellis, M.; Wronikowska, M. W.

    2012-07-01

    In the field of nuclear power plant (NPP) safety modeling, the perception of the role of socio-cognitive engineering (SCE) is continuously increasing. Today, the focus is especially on the identification of human and organizational decisional errors caused by operators and managers under high-risk conditions, as is evident from analyses of reports on nuclear incidents that occurred in the past. At present, the engineering and social safety requirements need to enlarge their domain of interest in such a way as to include all possible loss-generating events that could be the consequences of an abnormal state of an NPP. Socio-cognitive modeling of Integrated Nuclear Safety Management (INSM) using the TOGA meta-theory was discussed during the ICCAP 2011 Conference. In this paper, more detailed aspects of cognitive decision-making and its possible human errors and organizational vulnerability are presented. The formal TOGA-based network model for cognitive decision-making enables one to indicate and analyze nodes and arcs in which plant operators' and managers' errors may appear. The TOGA multi-level IPK (Information, Preferences, Knowledge) model of abstract intelligent agents (AIAs) is applied. In the NPP context, a super-safety approach is also discussed, taking into consideration unexpected events and managing them from a systemic perspective. As the nature of human errors depends on the specific properties of the decision-maker and the decisional context of operation, a classification of decision-making using IPK is suggested. Several types of initial situations of decision-making useful for the diagnosis of NPP operators' and managers' errors are considered. The developed models can be used as a basis for applications to NPP educational or engineering simulators to be used for training the NPP executive staff. (authors)

  11. Power electronics reliability analysis.

    SciTech Connect

    Smith, Mark A.; Atcitty, Stanley

    2009-12-01

    This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.
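    The first approach mentioned, computing reliability metrics per failure cause directly from field maintenance data, can be sketched as simple bookkeeping over maintenance records. The record format and all numbers below are invented for illustration:

```python
from collections import defaultdict

def metrics_by_cause(records, fleet_hours):
    """records: (cause, repair_hours) tuples from field maintenance logs.
    Returns per-cause MTBF estimates and fleet-level availability."""
    by_cause = defaultdict(list)
    for cause, repair_h in records:
        by_cause[cause].append(repair_h)
    mtbf = {c: fleet_hours / len(r) for c, r in by_cause.items()}
    downtime = sum(sum(r) for r in by_cause.values())
    availability = (fleet_hours - downtime) / fleet_hours
    return mtbf, availability

records = [("IGBT", 40.0), ("fan", 8.0), ("IGBT", 36.0), ("firmware", 2.0)]
print(metrics_by_cause(records, fleet_hours=20000.0))
```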

  12. Human Reliability Program Overview

    SciTech Connect

    Bodin, Michael

    2012-09-25

    This presentation covers the high points of the Human Reliability Program, including certification/decertification, critical positions, due process, organizational structure, program components, personnel security, an overview of the US DOE reliability program, retirees and academia, and security program integration.

  13. Integrated avionics reliability

    NASA Technical Reports Server (NTRS)

    Alikiotis, Dimitri

    1988-01-01

    The integrated avionics reliability task is an effort to build credible reliability and/or performability models for multisensor integrated navigation and flight control. The research was initiated by the reliability analysis of a multisensor navigation system consisting of the Global Positioning System (GPS), the Long Range Navigation system (Loran C), and an inertial measurement unit (IMU). Markov reliability models were developed based on system failure rates and mission time.
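    A Markov reliability model of the kind described reduces to a generator matrix whose matrix exponential propagates the state probabilities. A minimal three-state sketch (up / degraded / failed) with illustrative rates, not the study's actual GPS/Loran-C/IMU model:

```python
import numpy as np
from scipy.linalg import expm

# States: 0 = all sensors up, 1 = one failed (degraded, repairable),
# 2 = system failed (absorbing). Rates are illustrative, per hour.
lam1, lam2, mu = 1e-3, 5e-3, 1e-2
Q = np.array([[-lam1,         lam1,   0.0],
              [   mu, -(mu + lam2),  lam2],
              [  0.0,          0.0,   0.0]])   # generator: rows sum to zero

p0 = np.array([1.0, 0.0, 0.0])                 # start fully operational
t = 100.0                                      # mission time, hours
p_t = p0 @ expm(Q * t)                         # p(t) = p(0) exp(Qt)
print(f"reliability R({t:.0f} h) = {1 - p_t[2]:.4f}")
```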

  14. Reliable Design Versus Trust

    NASA Technical Reports Server (NTRS)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    This presentation focuses on reliability and trust for the user's portion of the FPGA design flow. It is assumed that the manufacturer tests FPGA internal components prior to hand-off to the user. The objective is to present the challenges of creating reliable and trusted designs. The following will be addressed: What makes a design vulnerable to functional flaws (reliability) or attackers (trust)? What are the challenges for verifying a reliable design versus a trusted design?

  15. Reliability and structural integrity

    NASA Technical Reports Server (NTRS)

    Davidson, J. R.

    1976-01-01

    An analytic model is developed to calculate the reliability of a structure after it is inspected for cracks. The model accounts for the growth of undiscovered cracks between inspections and their effect upon the reliability after subsequent inspections. The model is based upon a differential form of Bayes' Theorem for reliability, and upon fracture mechanics for crack growth.
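
    As a hedged illustration of the Bayesian step only (not the report's actual model), the sketch below updates the probability that a crack of a given size is present after an inspection finds nothing, using an assumed prior and an assumed probability-of-detection (POD) curve:

    import math

    def pod(a_mm, a50=2.0, k=1.5):
        """Hypothetical POD curve: chance of detecting a crack of size a_mm."""
        return 1.0 / (1.0 + math.exp(-k * (a_mm - a50)))

    prior = 0.05   # assumed prior probability that a 3 mm crack is present
    a = 3.0
    p_miss = 1.0 - pod(a)                 # inspection misses the crack
    # Bayes' theorem: P(crack | nothing found)
    posterior = prior * p_miss / (prior * p_miss + (1.0 - prior))
    print(f"P(crack | passed inspection) = {posterior:.4f}")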

  16. Reliability model generator specification

    NASA Technical Reports Server (NTRS)

    Cohen, Gerald C.; Mccann, Catherine

    1990-01-01

    The Reliability Model Generator (RMG) is described: a program that produces reliability models from block diagrams for ASSIST, the interface to the reliability evaluation tool SURE. An account is given of the motivation for RMG, and the implemented algorithms are discussed. The appendices contain the algorithms and two detailed example traces.

  17. Viking Lander reliability program

    NASA Technical Reports Server (NTRS)

    Pilny, M. J.

    1978-01-01

    The Viking Lander reliability program is reviewed with attention given to the development of the reliability program requirements, reliability program management, documents evaluation, failure modes evaluation, production variation control, failure reporting and correction, and the parts program. Lander hardware failures which have occurred during the mission are listed.

  18. Predicting software reliability

    NASA Technical Reports Server (NTRS)

    Littlewood, B.

    1989-01-01

    A detailed look is given to software reliability techniques. A conceptual model of the failure process is examined, and some software reliability growth models are discussed. Problems for which no current solutions exist are addressed, emphasizing the very difficult problem of safety-critical systems for which the reliability requirements can be enormously demanding.
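
    One way to make the idea of a reliability growth model concrete is the classic Goel-Okumoto NHPP sketched below; it is an illustrative example with assumed parameters, not a model endorsed by the paper.

    import math

    a, b = 120.0, 0.005   # expected total faults, per-hour detection rate (assumed)

    def mean_failures(t):
        """Expected cumulative failures by test time t (hours)."""
        return a * (1.0 - math.exp(-b * t))

    def reliability(x, T):
        """P(no failure in the next x hours, given testing up to time T)."""
        return math.exp(-(mean_failures(T + x) - mean_failures(T)))

    print(f"R(10 h | T=500 h) = {reliability(10, 500):.4f}")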

  19. A theory of planned behavior study of college students' intention to register as organ donors in Japan, Korea, and the United States.

    PubMed

    Bresnahan, Mary; Lee, Sun Young; Smith, Sandi W; Shearman, Sachiyo; Nebashi, Reiko; Park, Cheong Yi; Yoo, Jina

    2007-01-01

    This study investigated the willingness of Americans, Koreans, and Japanese to register as organ donors using the theory of planned behavior. Consistent with previous research showing that attitude toward donation and communication with family predict organ donation behaviors for respondents in the United States, these variables were also significant for respondents in Japan and Korea. Perceived behavioral control predicted intention to register for Japanese participants, whereas knowledge about organ donation was associated with reluctance to register for Koreans. Spiritual connection and concern were shown to be causal factors underlying attitude in all 3 countries. In spite of positive attitudes toward organ donation and knowledge comparable to that of Americans and Japanese, most Korean participants declined to take an application to register as a donor. Implications of these findings for future research are discussed.

  20. Applicability of the Multiple Intelligence Theory to the Process of Organizing and Planning of Learning and Teaching

    ERIC Educational Resources Information Center

    Acat, M. Bahaddin

    2005-01-01

    It has long been under discussion how the teaching and learning environment should be arranged, how individuals achieve learning, and how teachers can effectively contribute to this process. Accordingly, a considerable number of theories and models have been proposed. Gardner (1983) caused a remarkable shift in the perception of learning theory as…

  1. Stoking the Dialogue on the Domains of Transformative Learning Theory: Insights From Research With Faith-Based Organizations in Kenya

    ERIC Educational Resources Information Center

    Moyer, Joanne M.; Sinclair, A. John

    2016-01-01

    Transformative learning theory is applied in a variety of fields, including archaeology, religious studies, health care, the physical sciences, environmental studies, and natural resource management. Given the breadth of the theory's application, it needs to be adaptable to broad contexts. This article shares insights gained from applying the…

  2. Measurement Issues in High Stakes Testing: Validity and Reliability

    ERIC Educational Resources Information Center

    Mason, Emanuel J.

    2007-01-01

    Validity and reliability of the new high stakes testing systems initiated in school systems across the United States in recent years in response to the accountability features mandated in the No Child Left Behind Legislation largely depend on item response theory and new rules of measurement. Reliability and validity in item response theory and…

  3. Environmental control of sepalness and petalness in perianth organs of waterlilies: a new Mosaic Theory for the evolutionary origin of a differentiated perianth

    PubMed Central

    Warner, Kate A.; Rudall, Paula J.; Frohlich, Michael W.

    2009-01-01

    The conventional concept of an ‘undifferentiated perianth’, implying that all perianth organs of a flower are alike, obscures the fact that individual perianth organs are sometimes differentiated into sepaloid and petaloid regions, as in the early-divergent angiosperms Nuphar, Nymphaea, and Schisandra. In the waterlilies Nuphar and Nymphaea, sepaloid regions closely coincide with regions of the perianth that were exposed when the flower was in bud, whereas petaloid regions occur in covered regions, suggesting that their development is at least partly controlled by the environment of the developing tepal. Green and colourful areas differ from each other in trichome density and presence of papillae, features that often distinguish sepals and petals. Field experiments to test whether artificial exposure can induce sepalness in the inner tepals showed that development of sepaloid patches is initiated by exposure, at least in the waterlily species examined. Although light is an important environmental cue, other important factors include an absence of surface contact. Our interpretation contradicts the unspoken rule that ‘sepal’ and ‘petal’ must refer to whole organs. We propose a novel theory (the Mosaic theory), in which the distinction between sepalness and petalness evolved early in angiosperm history, but these features were not fixed to particular organs and were primarily environmentally controlled. At a later stage in angiosperm evolution, sepaloid and petaloid characteristics became fixed to whole organs in specific whorls, thus reducing or removing the need for environmental control in favour of fixed developmental control. PMID:19574253

  4. [An examination of "Minamata disease general investigation and research liaison council"--The process of making uncertain the organic mercury causal theory].

    PubMed

    Nakano, Hiroshi

    2010-01-01

    Minamata disease occurred because inhabitants consumed polluted seafood. The official confirmation of Minamata disease came in 1956; however, the material cause of the disease was uncertain at that time. The Minamata Food Poisoning Sub-committee, under the authority of the Food Hygiene Investigation Committee of the Ministry of Health and Welfare, determined in 1959 that the material cause of Minamata disease was a certain kind of organic mercury. The sub-committee was dissolved after its report. Discussion of the investigation of the cause continued in a conference initiated by the Economic Planning Agency, titled the "Minamata Disease General Investigation and Research Liaison Council". The participants were eight scientists: four fishery scientists, two chemists, and only two medical scientists, which implied that only the examination of organic mercury was to be discussed. The conference was held four times from 1960 to 1961. In the first and second conferences, organic mercury research from a medical perspective progressed in cooperation with the fishery sciences. In the third conference, it was reported that UCHIDA Makio, a professor at Kumamoto University, had found organic mercury crystals in shellfish from Minamata Bay. Authorities in biochemistry and medicine at the third conference criticized UCHIDA's research. At the fourth conference, reports contradicting his research were presented. Although those anti-UCHIDA reports were not verified, AKAHORI Shiro, the highest authority in biochemistry, not only accepted them but also expressed doubt about the organic mercury causal theory. Consequently, the theory came to be recognized as uncertain.

  5. Structural studies of crystals of organic and organoelement compounds using modern quantum chemical calculations within the framework of the density functional theory

    NASA Astrophysics Data System (ADS)

    Korlyukov, Alexander A.; Antipin, Mikhail Yu

    2012-02-01

    The review generalizes the results of structural studies of crystals of organic and organometallic compounds by modern quantum chemical calculations within the framework of the density functional theory reported in the last decade. Features of the software for such calculations are discussed. Examples of the use of quantum chemical calculations for the studies of the electronic structure, spectroscopic and other physicochemical properties of molecular crystals are presented. The bibliography includes 223 references.

  6. Signal verification can promote reliable signalling

    PubMed Central

    Broom, Mark; Ruxton, Graeme D.; Schaefer, H. Martin

    2013-01-01

    The central question in communication theory is whether communication is reliable, and if so, which mechanisms select for reliability. The primary approach in the past has been to attribute reliability to strategic costs associated with signalling as predicted by the handicap principle. Yet, reliability can arise through other mechanisms, such as signal verification; but the theoretical understanding of such mechanisms has received relatively little attention. Here, we model whether verification can lead to reliability in repeated interactions that typically characterize mutualisms. Specifically, we model whether fruit consumers that discriminate among poor- and good-quality fruits within a population can select for reliable fruit signals. In our model, plants either signal or they do not; costs associated with signalling are fixed and independent of plant quality. We find parameter combinations where discriminating fruit consumers can select for signal reliability by abandoning unprofitable plants more quickly. This self-serving behaviour imposes costs upon plants as a by-product, rendering it unprofitable for unrewarding plants to signal. Thus, strategic costs to signalling are not a prerequisite for reliable communication. We expect verification to more generally explain signal reliability in repeated consumer–resource interactions that typify mutualisms but also in antagonistic interactions such as mimicry and aposematism. PMID:24068354

  7. Software reliability experiments data analysis and investigation

    NASA Technical Reports Server (NTRS)

    Walker, J. Leslie; Caglayan, Alper K.

    1991-01-01

    The objectives are to investigate the fundamental reasons which cause independently developed software programs to fail dependently, and to examine fault tolerant software structures which maximize reliability gain in the presence of such dependent failure behavior. The authors used 20 redundant programs from a software reliability experiment to analyze the software errors causing coincident failures, to compare the reliability of N-version and recovery block structures composed of these programs, and to examine the impact of diversity on software reliability using subpopulations of these programs. The results indicate that both conceptually related and unrelated errors can cause coincident failures and that recovery block structures offer more reliability gain than N-version structures if acceptance checks that fail independently from the software components are available. The authors present a theory of general program checkers that have potential application for acceptance tests.
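
    The reliability comparison reported here can be illustrated with a back-of-envelope model. The sketch below assumes independent failures and hypothetical probabilities, which is precisely the idealization the experiment calls into question, so treat it as notation rather than as the authors' model.

    p = 0.05   # failure probability of each of 3 program versions (assumed)
    q = 0.01   # P(acceptance test passes a faulty result) (assumed)

    # 3-version programming with majority voting (voter assumed perfect):
    # fails when 2 or more versions fail.
    nvp_fail = 3 * p**2 * (1 - p) + p**3

    def recovery_block_fail(p, q, n=3):
        """P(failure) with n alternates tried in turn; correct results are
        assumed always accepted, faulty ones slip through with probability q."""
        fail = 1.0                           # alternates exhausted: no output
        for _ in range(n):
            fail = p * q + p * (1.0 - q) * fail
        return fail

    print(f"NVP: {nvp_fail:.5f}   recovery block: {recovery_block_fail(p, q):.5f}")

    Here q is where an acceptance check that fails independently of the software components enters the comparison; driving q down is what gives the recovery block its advantage in this toy model.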

  8. Software Reliability 2002

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores R.

    2003-01-01

    In FY01 we learned that hardware reliability models need substantial changes to account for differences in software, thus making software reliability measurements more effective, accurate, and easier to apply. These reliability models are generally based on familiar distributions or parametric methods. An obvious question is: "What new statistical and probability models can be developed using non-parametric and distribution-free methods instead of the traditional parametric methods?" Two approaches to software reliability engineering appear somewhat promising. The first study, begun in FY01, is based in hardware reliability, a very well established science that has many aspects applicable to software. This research effort has investigated mathematical aspects of hardware reliability and has identified those applicable to software. Currently the research effort is applying and testing these approaches to software reliability measurement. These parametric models require much project data that may be difficult to apply and interpret. Projects at GSFC are often complex in both technology and schedule. Assessing and estimating the reliability of the final system is extremely difficult when various subsystems are tested and completed long before others. Parametric and distribution-free techniques may offer a new and accurate way of modeling failure time and other project data to provide earlier and more accurate estimates of system reliability.

  9. Recalibrating software reliability models

    NASA Technical Reports Server (NTRS)

    Brocklehurst, Sarah; Chan, P. Y.; Littlewood, Bev; Snell, John

    1990-01-01

    In spite of much research effort, there is no universally applicable software reliability growth model which can be trusted to give accurate predictions of reliability in all circumstances. Further, it is not even possible to decide a priori which of the many models is most suitable in a particular context. In an attempt to resolve this problem, techniques were developed whereby, for each program, the accuracy of various models can be analyzed. A user is thus enabled to select that model which is giving the most accurate reliability predictions for the particular program under examination. One of these ways of analyzing predictive accuracy, called the u-plot, in fact allows a user to estimate the relationship between the predicted reliability and the true reliability. It is shown how this can be used to improve reliability predictions in a completely general way by a process of recalibration. Simulation results show that the technique gives improved reliability predictions in a large proportion of cases. However, a user does not need to trust the efficacy of recalibration, since the new reliability estimates produced by the technique are truly predictive and so their accuracy in a particular application can be judged using the earlier methods. The generality of this approach would therefore suggest that it be applied as a matter of course whenever a software reliability model is used.
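
    A minimal numerical sketch of the u-plot idea (simulated inter-failure times and an assumed exponential predictive distribution; the recalibration procedure itself is richer than this):

    import numpy as np

    rng = np.random.default_rng(0)
    true_rate, predicted_rate = 1.0, 1.3          # deliberately biased predictor
    t = rng.exponential(1.0 / true_rate, 50)      # observed inter-failure times

    # u_i = predicted CDF evaluated at each observation; if the predictions
    # were perfect, the u_i would look uniform on [0, 1].
    u = np.sort(1.0 - np.exp(-predicted_rate * t))
    ecdf = np.arange(1, len(u) + 1) / len(u)
    print(f"u-plot max deviation from uniformity: {np.abs(ecdf - u).max():.3f}")

    Recalibration then estimates the mapping between this empirical curve and the line of unit slope and uses it to transform subsequent predictions.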

  10. Recalibrating software reliability models

    NASA Technical Reports Server (NTRS)

    Brocklehurst, Sarah; Chan, P. Y.; Littlewood, Bev; Snell, John

    1989-01-01

    In spite of much research effort, there is no universally applicable software reliability growth model which can be trusted to give accurate predictions of reliability in all circumstances. Further, it is not even possible to decide a priori which of the many models is most suitable in a particular context. In an attempt to resolve this problem, techniques were developed whereby, for each program, the accuracy of various models can be analyzed. A user is thus enabled to select that model which is giving the most accurate reliability predictions for the particular program under examination. One of these ways of analyzing predictive accuracy, called the u-plot, in fact allows a user to estimate the relationship between the predicted reliability and the true reliability. It is shown how this can be used to improve reliability predictions in a completely general way by a process of recalibration. Simulation results show that the technique gives improved reliability predictions in a large proportion of cases. However, a user does not need to trust the efficacy of recalibration, since the new reliability estimates produced by the technique are truly predictive and so their accuracy in a particular application can be judged using the earlier methods. The generality of this approach would therefore suggest that it be applied as a matter of course whenever a software reliability model is used.

  11. Subjective indicators as a gauge for improving organizational well-being. An attempt to apply the cognitive activation theory to organizations.

    PubMed

    Arnetz, Bengt B

    2005-11-01

    Globally, organizations are undergoing substantial changes, commonly resulting in significant employee stress. However, when facing similar stressors and challenges, departments within an organization, as well as companies within the same area of business, vary in the way they cope with change. It was hypothesized that collective uncertainty about the future, as well as unclear organizational goals, contributes to chronic stress in organizations exposed to change. Applying the cognitive activation theory of stress (CATS) model of Ursin and Eriksen at an organizational level, support was found for this hypothesis. Changes in chronic stress indicators between two assessments were related to the clarity of organizational goals. It is suggested that the CATS model might be fruitful not only for understanding variations in individual stress responses and experiences, but also for interpreting and managing organizational stress. By doing so, both organizational health and well-being will improve, creating enterprises with healthy employees and healthy productivity and economic results.

  12. Predicting Cloud Computing Technology Adoption by Organizations: An Empirical Integration of Technology Acceptance Model and Theory of Planned Behavior

    ERIC Educational Resources Information Center

    Ekufu, ThankGod K.

    2012-01-01

    Organizations are finding it difficult in today's economy to implement the vast information technology infrastructure required to effectively conduct their business operations. Despite the fact that some of these organizations are leveraging on the computational powers and the cost-saving benefits of computing on the Internet cloud, others…

  13. ERP Reliability Analysis (ERA) Toolbox: An open-source toolbox for analyzing the reliability of event-related brain potentials.

    PubMed

    Clayson, Peter E; Miller, Gregory A

    2017-01-01

    Generalizability theory (G theory) provides a flexible, multifaceted approach to estimating score reliability. G theory's approach to estimating score reliability has important advantages over classical test theory that are relevant for research using event-related brain potentials (ERPs). For example, G theory does not require parallel forms (i.e., equal means, variances, and covariances), can handle unbalanced designs, and provides a single reliability estimate for designs with multiple sources of error. This monograph provides a detailed description of the conceptual framework of G theory using examples relevant to ERP researchers, presents the algorithms needed to estimate ERP score reliability, and provides a detailed walkthrough of newly-developed software, the ERP Reliability Analysis (ERA) Toolbox, that calculates score reliability using G theory. The ERA Toolbox is open-source, Matlab software that uses G theory to estimate the contribution of the number of trials retained for averaging, group, and/or event types on ERP score reliability. The toolbox facilitates the rigorous evaluation of psychometric properties of ERP scores recommended elsewhere in this special issue.
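
    To make the G-theory logic concrete, here is a minimal sketch (in Python rather than the toolbox's Matlab) of a one-facet persons-by-trials design: variance components are estimated from simulated ERP-like scores and combined into a dependability coefficient that depends on the number of trials retained for averaging. Everything below is an illustrative assumption, not the toolbox's implementation.

    import numpy as np

    rng = np.random.default_rng(1)
    n_p, n_t = 30, 40                                    # persons, trials
    person = rng.normal(0.0, 2.0, (n_p, 1))              # true person effects
    scores = person + rng.normal(0.0, 5.0, (n_p, n_t))   # plus trial-level noise

    # One-facet random-effects ANOVA estimates of the variance components.
    ms_person = n_t * scores.mean(axis=1).var(ddof=1)    # E = var_e + n_t * var_p
    ss_resid = ((scores - scores.mean(axis=1, keepdims=True)) ** 2).sum()
    var_e = ss_resid / (n_p * (n_t - 1))                 # estimates error variance
    var_p = (ms_person - var_e) / n_t                    # estimates person variance

    for k in (5, 10, 40):                                # trials kept for averaging
        dep = var_p / (var_p + var_e / k)
        print(f"dependability with {k} trials: {dep:.3f}")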

  14. Reliability, Recursion, and Risk.

    ERIC Educational Resources Information Center

    Henriksen, Melvin, Ed.; Wagon, Stan, Ed.

    1991-01-01

    The discrete mathematics topics of trees and computational complexity are implemented in a simple reliability program which illustrates the process advantages of the PASCAL programming language. The discussion focuses on the impact that reliability research can provide in assessment of the risks found in complex technological ventures. (Author/JJK)

  15. Monte Carlo Reliability Analysis.

    DTIC Science & Technology

    1987-10-01

    (3) Lewis and Z. Tu, "Monte Carlo Reliability Modeling by Inhomogeneous Markov Processes," Reliab. Engr. 16, 277-296 (1986). (4) E. Cinlar, Introduction to Stochastic Processes, Prentice-Hall, Englewood Cliffs, NJ, 1975. (5) R. E. Barlow and F. Proschan, Statistical Theory of Reliability and Life Testing.

  16. Hawaii electric system reliability.

    SciTech Connect

    Silva Monroy, Cesar Augusto; Loose, Verne William

    2012-09-01

    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers' views of reliability “worth” and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers' views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.

  17. Hawaii Electric System Reliability

    SciTech Connect

    Loose, Verne William; Silva Monroy, Cesar Augusto

    2012-08-01

    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers’ views of reliability “worth” and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers’ views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.
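
    The cost-balancing method described in these two reports can be sketched in a few lines: choose the reserve margin that minimizes capacity carrying cost plus customers' expected outage cost. The functional forms and numbers below are illustrative assumptions, not values from the Hawaii analysis.

    import numpy as np

    reserve = np.linspace(0.0, 0.5, 51)     # candidate reserve margins
    capacity_cost = 900.0 * reserve         # carrying cost, $/customer-yr (assumed)
    eue = 50.0 * np.exp(-12.0 * reserve)    # expected unserved kWh/customer-yr (assumed)
    voll = 8.0                              # value of lost load, $/kWh (assumed)

    total = capacity_cost + voll * eue
    best = reserve[np.argmin(total)]
    print(f"cost-minimizing reserve margin: {best:.2f}")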

  18. Behavioral ecology, endocrinology and signal reliability of electric communication

    PubMed Central

    Gavassa, Sat; Goldina, Anna; Silva, Ana C.; Stoddard, Philip K.

    2013-01-01

    The balance between the costs and benefits of conspicuous animal communication signals ensures that signal expression relates to the quality of the bearer. Signal plasticity enables males to enhance conspicuous signals to impress mates and competitors and to reduce signal expression to lower energetic and predation-related signaling costs when competition is low. While signal plasticity may benefit the signaler, it can compromise the reliability of the information conveyed by the signals. In this paper we review the effect of signal plasticity on the reliability of the electrocommunication signal of the gymnotiform fish Brachyhypopomus gauderio. We (1) summarize the endocrine regulation of signal plasticity, (2) explore the regulation of signal plasticity in females, (3) examine the information conveyed by the signal, (4) show how that information changes when the signal changes, and (5) consider the energetic strategies used to sustain expensive signaling. The electric organ discharge (EOD) of B. gauderio changes in response to social environment on two time scales. Two hormone classes, melanocortins and androgens, underlie the short-term and long-term modulation of signal amplitude and duration observed during social interaction. Population density drives signal amplitude enhancement, unexpectedly improving the reliability with which the signal predicts the signaler's size. The signal's second phase elongation predicts androgen levels and male reproductive condition. Males sustain signal enhancement with dietary intake, but when food is limited, they ‘go for broke’ and put extra energy into electric signals. Cortisol diminishes EOD parameters, but energy-limited males offset cortisol effects by boosting androgen levels. While physiological constraints are sufficient to maintain signal amplitude reliability, phenotypic integration and signaling costs maintain reliability of signal duration, consistent with theory of honest signaling. PMID:23761465

  19. Chapter 9: Reliability

    SciTech Connect

    Algora, Carlos; Espinet-Gonzalez, Pilar; Vazquez, Manuel; Bosco, Nick; Miller, David; Kurtz, Sarah; Rubio, Francisca; McConnell,Robert

    2016-04-15

    This chapter describes the accumulated knowledge on CPV reliability, including its fundamentals and qualification. It explains the reliability of solar cells, modules (including optics), and plants. The chapter discusses the relevant statistical distributions, namely the exponential, normal, and Weibull. The treatment of solar cell reliability covers issues in accelerated aging tests of CPV solar cells, types of failure, and failures in real-time operation. The chapter explores accelerated life tests, namely qualitative life tests (mainly HALT) and quantitative accelerated life tests (QALT). It examines other well-proven PV cells and/or semiconductor devices that share similar semiconductor materials, manufacturing techniques, or operating conditions, namely III-V space solar cells and light-emitting diodes (LEDs). It addresses each of the identified reliability issues and presents the current state-of-the-art knowledge for their testing and evaluation. Finally, the chapter summarizes the CPV qualification and reliability standards.

  20. Informational Closed-Loop Coding-Decoding Control Concept as the Base of the Living or Organized Systems Theory

    NASA Astrophysics Data System (ADS)

    Kirvelis, Dobilas; Beitas, Kastytis

    2008-10-01

    The aim of this work is to show that the essence of life and living systems is their organization as bioinformational technology based on informational anticipatory control. Principal paradigmatic and structural schemes of the functional organization of life (organisms and their systems) are constructed through systemic analysis and synthesis of the main phenomenological features of the living world. Life is based on functional elements that implement engineering procedures of closed-loop coding-decoding control (CL-CDC). The phenomenon of natural bioinformational control appeared and developed on Earth 3-4 billion years ago, when life originated as a result of chemical and, later, biological evolution. The informatics paradigm considers the physical and chemical transformations of energy and matter in organized systems as controlled flows, and signals as the means of purposive informational control programs. Social and technological systems, as informational control systems, are a later phenomenon engineered by humans. Information emerges in organized systems as a necessary component of control technology. Generalized schemes of functional organization at the levels of the cell, the organism, and the brain neocortex, the highest biosystem with CL-CDC, are presented. The CL-CDC concept expands the understanding of bioinformatics.

  1. Workplace Support, Discrimination, and Person-Organization Fit: Tests of the Theory of Work Adjustment with LGB Individuals

    ERIC Educational Resources Information Center

    Velez, Brandon L.; Moradi, Bonnie

    2012-01-01

    The present study explored the links of 2 workplace contextual variables--perceptions of workplace heterosexist discrimination and lesbian, gay, and bisexual (LGB)-supportive climates--with job satisfaction and turnover intentions in a sample of LGB employees. An extension of the theory of work adjustment (TWA) was used as the conceptual framework…

  2. Change of Mind: How Organization Theory Led Me to Move from Studying Educational Reform to Pursuing Educational Design

    ERIC Educational Resources Information Center

    Ogawa, Rodney T.

    2015-01-01

    Purpose: The purpose of this paper is for the author to recount how his use of organizational theory to understand educational reform in the USA led to a change of mind. Design/methodology/approach: My shift resulted from my conclusion, derived from the new institutionalism, that only marginal changes can be made in schools and, thus, fundamental…

  3. Harm reduction theory: Users culture, micro-social indigenous harm reduction, and the self-organization and outside-organizing of users’ groups

    PubMed Central

    Friedman, Samuel R.; de Jong, Wouter; Rossi, Diana; Touzé, Graciela; Rockwell, Russell; Jarlais, Don C Des; Elovich, Richard

    2007-01-01

    This paper discusses the user side of harm reduction, focusing to some extent on the early responses to the HIV/AIDS epidemic in each of four sets of localities—New York City, Rotterdam, Buenos Aires, and sites in Central Asia. Using available qualitative and quantitative information, we present a series of vignettes about user activities in four different localities on behalf of reducing drug-related harm. Some of these activities have been micro-social (small group) activities; others have been conducted by formal organizations of users that the users organised at their own initiative. In spite of the limitations of the methodology, the data suggest that users’ activities have helped limit HIV spread. These activities are shaped by broader social contexts, such as the extent to which drug scenes are integrated with broader social networks and the way the political and economic systems impinge on drug users’ lives. Drug users are active agents on their own individual and collective behalf, and in helping to protect wider communities. Harm reduction activities and research should take note of and draw upon both the micro-social and the formal organizations of users. Finally, both researchers and policy makers should help develop ways to enable and support both micro-social and formally organized action by users. PMID:17689353

  4. Generalizability Theory and Classical Test Theory

    ERIC Educational Resources Information Center

    Brennan, Robert L.

    2011-01-01

    Broadly conceived, reliability involves quantifying the consistencies and inconsistencies in observed scores. Generalizability theory, or G theory, is particularly well suited to addressing such matters in that it enables an investigator to quantify and distinguish the sources of inconsistencies in observed scores that arise, or could arise, over…

  5. Refining network reconstruction based on functional reliability.

    PubMed

    Zhang, Yunjun; Ouyang, Qi; Geng, Zhi

    2014-07-21

    Reliable functioning is crucial for the survival and development of the genetic regulatory networks in living cells and organisms. This functional reliability is an important feature of the networks and reflects the structural features that have been embedded in the regulatory networks by evolution. In this paper, we integrate this reliability into network reconstruction. We introduce the concept of dependency probability to measure the dependency of functional reliability on network edges. We also propose a method to estimate the dependency probability and select edges with high contributions to functional reliability. We use two real examples, the regulatory network of the cell cycle of the budding yeast and that of the fission yeast, to demonstrate that the proposed method improves network reconstruction. In addition, the dependency probability is robust in calculation and can be easily implemented in practice.

  6. Correcting Fallacies in Validity, Reliability, and Classification

    ERIC Educational Resources Information Center

    Sijtsma, Klaas

    2009-01-01

    This article reviews three topics from test theory that continue to raise discussion and controversy and capture test theorists' and constructors' interest. The first topic concerns the discussion of the methodology of investigating and establishing construct validity; the second topic concerns reliability and its misuse, alternative definitions…

  7. Solvent dependence of Stokes shift for organic solute-solvent systems: A comparative study by spectroscopy and reference interaction-site model-self-consistent-field theory.

    PubMed

    Nishiyama, Katsura; Watanabe, Yasuhiro; Yoshida, Norio; Hirata, Fumio

    2013-09-07

    The Stokes shift magnitudes for coumarin 153 (C153) in 13 organic solvents with various polarities have been determined by means of steady-state spectroscopy and reference interaction-site model-self-consistent-field (RISM-SCF) theory. RISM-SCF calculations have reproduced experimental results fairly well, including individual solvent characteristics. It is empirically known that in some solvents, larger Stokes shift magnitudes are detected than anticipated on the basis of the solvent relative permittivity, ɛr. In practice, 1,4-dioxane (ɛr = 2.21) provides almost identical Stokes shift magnitudes to those of tetrahydrofuran (THF, ɛr = 7.58), for C153 and other typical organic solutes. In this work, RISM-SCF theory has been used to estimate the energetics of C153-solvent systems involved in the absorption and fluorescence processes. The Stokes shift magnitudes estimated by RISM-SCF theory are ∼5 kJ mol⁻¹ (400 cm⁻¹) less than those determined by spectroscopy; however, the results obtained are still adequate for dipole moment comparisons, in a qualitative sense. We have also calculated the solute-solvent site-site radial distributions by this theory. It is shown that solvation structures with respect to the C-O-C framework, which is common to dioxane and THF, in the near vicinity (∼0.4 nm) of specific solute sites can largely account for their similar Stokes shift magnitudes. In previous works, such solute-solvent short-range interactions have been explained in terms of the higher-order multipole moments of the solvents. Our present study shows that along with the short-range interactions that contribute most significantly to the energetics, long-range electrostatic interactions are also important. Such long-range interactions are effective up to 2 nm from the solute site, as in the case of a typical polar solvent, acetonitrile.

  8. Staff nurses' perceptions of job empowerment and level of burnout: a test of Kanter's theory of structural power in organizations.

    PubMed

    Hatcher, S; Laschinger, H K

    1996-01-01

    Kanter's structural theory of organizational behavior was used as a framework to explore the relationship between perceptions of power and opportunity and level of burnout in a sample of 87 hospital staff nurses. Data were collected using a modified version of the Conditions for Work Effectiveness Questionnaire (Chandler, 1986) and the Human Services Survey (Maslach & Jackson, 1986). Consistent with Kanter's theory, perceived access to power and opportunity was significantly related to the three aspects of burnout: emotional exhaustion and depersonalization (r = -.3419, p = .004; r = -.2931, p = .02) and personal accomplishment (r = .3630, p = .002). The results of this study are useful for nurse administrators positioned to create organizational structures that empower staff nurses and subsequently decrease burnout.

  9. The body of the soul. Lucretian echoes in the Renaissance theories on the psychic substance and its organic repartition.

    PubMed

    Tutrone, Fabio

    2014-01-01

    In the 16th and 17th centuries, when Aristotelianism was still the leading current of natural philosophy and atomistic theories began to arise, Lucretius' De Rerum Natura stood out as an attractive and dangerous model. The present paper reassesses several relevant aspects of Lucretius' materialistic psychology by focusing on the problem of the soul's repartition through the limbs discussed in Book 3. A very successful Lucretian image serves as fil rouge throughout this survey: the description of a snake chopped up, with its pieces moving on the ground (Lucretius DRN 1969, 3.657-669). The paper's first section sets the poet's theory against the background of ancient psychology, pointing out its often neglected assimilation of Aristotelian elements. The second section highlights the influence of De Rerum Natura and its physiology of the soul on Bernardino Telesio, Agostino Doni, and Francis Bacon, since all of these authors engage in an original recombination of mechanical and teleological explanations.

  10. Reliability Analysis Model

    NASA Technical Reports Server (NTRS)

    1970-01-01

    RAM program determines probability of success for one or more given objectives in any complex system. Program includes failure mode and effects, criticality and reliability analyses, and some aspects of operations, safety, flight technology, systems design engineering, and configuration analyses.

  11. The rating reliability calculator

    PubMed Central

    Solomon, David J

    2004-01-01

    Background: Rating scales form an important means of gathering evaluation data. Since important decisions are often based on these evaluations, determining the reliability of rating data can be critical. Most commonly used methods of estimating reliability require a complete set of ratings, i.e., every subject being rated must be rated by each judge. Over fifty years ago Ebel described an algorithm for estimating the reliability of ratings based on incomplete data. While his article has been widely cited over the years, software based on the algorithm is not readily available. This paper describes an easy-to-use Web-based utility for estimating the reliability of ratings based on incomplete data using Ebel's algorithm. Methods: The program is available for public use on our server and the source code is freely available under the GNU General Public License. The utility is written in PHP, a common open-source embedded scripting language. The rating data can be entered in a convenient format on the user's personal computer, and the program will upload the data to the server to calculate the reliability and other statistics describing the ratings. Results: When the program is run it displays the reliability, the number of subjects rated, the harmonic mean number of judges rating each subject, and the mean and standard deviation of the averaged ratings per subject. The program also displays the mean, standard deviation, and number of ratings for each subject rated. Additionally, the program will estimate the reliability of an average of a number of ratings for each subject via the Spearman-Brown prophecy formula. Conclusion: This simple Web-based program provides a convenient means of estimating the reliability of rating data without the need to conduct special studies in order to obtain complete rating data. I would welcome other researchers revising and enhancing the program. PMID:15117416
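
    The Spearman-Brown step mentioned in the Results is simple enough to show directly; this is a generic Python sketch of the formula (the utility itself is written in PHP):

    def spearman_brown(r_single, k):
        """Reliability of the average of k ratings, given the single-rating
        reliability r_single."""
        return (k * r_single) / (1.0 + (k - 1.0) * r_single)

    # e.g., single-judge reliability 0.40 averaged over 4 judges:
    print(f"{spearman_brown(0.40, 4):.3f}")   # 0.727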

  12. Scaled CMOS Technology Reliability Users Guide

    NASA Technical Reports Server (NTRS)

    White, Mark

    2010-01-01

    The desire to assess the reliability of emerging scaled microelectronics technologies through faster reliability trials and more accurate acceleration models is the precursor for further research and experimentation in this relevant field. The effect of semiconductor scaling on microelectronics product reliability is an important consideration for the high-reliability application user. From the perspective of a customer or user, who in many cases must deal with very limited, if any, manufacturer's reliability data to assess the product for a highly-reliable application, product-level testing is critical in the characterization and reliability assessment of advanced nanometer semiconductor scaling effects on microelectronics reliability. A methodology on how to accomplish this and techniques for deriving the expected product-level reliability on commercial memory products are provided. Competing mechanism theory and the multiple failure mechanism model are applied to the experimental results of scaled SDRAM products. Accelerated stress testing at multiple conditions is applied at the product level of several scaled memory products to assess the performance degradation and product reliability. Acceleration models are derived for each case. For several scaled SDRAM products, retention time degradation is studied and two distinct soft error populations are observed with each technology generation: early breakdown, characterized by randomly distributed weak bits with Weibull slope β = 1, and a main population breakdown with an increasing failure rate. Retention time soft error rates are calculated and a multiple failure mechanism acceleration model with parameters is derived for each technology. Defect densities are calculated and reflect a decreasing trend in the percentage of random defective bits for each successive product generation. A normalized soft error failure rate of the memory data retention time in FIT/Gb and FIT/cm2 for several scaled SDRAM generations is presented.
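
    A sketch of the competing-mechanism arithmetic behind such assessments: each mechanism contributes a failure rate, accelerated-test results are mapped to use conditions via an Arrhenius factor, and the de-rated rates add. The activation energies and rates below are illustrative, not the report's values.

    import math

    K = 8.617e-5   # Boltzmann constant, eV/K

    def arrhenius_af(ea_ev, t_stress_c, t_use_c):
        """Acceleration factor from stress temperature to use temperature."""
        ts, tu = t_stress_c + 273.15, t_use_c + 273.15
        return math.exp((ea_ev / K) * (1.0 / tu - 1.0 / ts))

    # Two hypothetical mechanisms observed at 125 C stress, used at 55 C:
    rate_a_stress, ea_a = 200.0, 0.7   # FIT at stress, activation energy in eV
    rate_b_stress, ea_b = 500.0, 0.3

    fit_use = (rate_a_stress / arrhenius_af(ea_a, 125, 55)
               + rate_b_stress / arrhenius_af(ea_b, 125, 55))
    print(f"use-condition failure rate: {fit_use:.1f} FIT")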

  13. General theory for multiple input-output perturbations in complex molecular systems. 1. Linear QSPR electronegativity models in physical, organic, and medicinal chemistry.

    PubMed

    González-Díaz, Humberto; Arrasate, Sonia; Gómez-SanJuan, Asier; Sotomayor, Nuria; Lete, Esther; Besada-Porto, Lina; Ruso, Juan M

    2013-01-01

    In general, perturbation methods start with a known exact solution of a problem and add "small" variation terms in order to approach a solution for a related problem without a known exact solution. Perturbation theory has been widely used in almost all areas of science. Bohr's quantum model, Heisenberg's matrix mechanics, Feynman diagrams, and Poincaré's chaos model or "butterfly effect" in complex systems are examples of perturbation theories. On the other hand, the study of Quantitative Structure-Property Relationships (QSPR) in molecular complex systems is an ideal area for the application of perturbation theory. There are several problems with exact experimental solutions (new chemical reactions, physicochemical properties, drug activity and distribution, metabolic networks, etc.) in public databases like CHEMBL. However, in all these cases, we have an even larger list of related problems without known solutions. We need to know the change in all these properties after a perturbation of the initial boundary conditions, that is, when we test large sets of similar, but different, compounds and/or chemical reactions under slightly different conditions (temperature, time, solvents, enzymes, assays, protein targets, tissues, partition systems, organisms, etc.). However, to the best of our knowledge, there is no general-purpose QSPR perturbation theory to solve this problem. In this work, we first review general aspects and applications of both perturbation theory and QSPR models. Second, we formulate a general-purpose perturbation theory for multiple-boundary QSPR problems. Last, we develop three new QSPR-perturbation theory models. The first model correctly classifies >100,000 pairs of intra-molecular carbolithiations with 75-95% Accuracy (Ac), Sensitivity (Sn), and Specificity (Sp). The model predicts probabilities of variations in the yield and enantiomeric excess of reactions due to at least one perturbation in boundary conditions (solvent, temperature

  14. Multidisciplinary System Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics, and electrical circuits, without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated through a numerical example of a heat exchanger system involving failure modes in the structural, heat transfer, and fluid flow disciplines.

  15. Metrology automation reliability

    NASA Astrophysics Data System (ADS)

    Chain, Elizabeth E.

    1996-09-01

    At Motorola's MOS-12 facility, automated measurements on 200-mm diameter wafers proceed in a hands-off 'load-and-go' mode requiring only wafer loading, measurement recipe loading, and a 'run' command for processing. Upon completion of all sample measurements, the data is uploaded to the factory's data collection software system via a SECS II interface, eliminating the requirement of manual data entry. The scope of in-line measurement automation has been extended to the entire metrology scheme, from job file generation to measurement and data collection. Data analysis and comparison to part specification limits is also carried out automatically. Successful integration of automated metrology into the factory measurement system requires that automated functions, such as autofocus and pattern recognition algorithms, display a high degree of reliability. In the 24-hour factory, reliability data can be collected automatically on every part measured. This reliability data is then uploaded to the factory data collection software system at the same time as the measurement data. Analysis of the metrology reliability data permits improvements to be made as needed, and provides an accurate accounting of automation reliability. This reliability data has so far been collected for the CD-SEM (critical dimension scanning electron microscope) metrology tool, and examples are presented. This analysis method can be applied to such automated in-line measurements as CD, overlay, particle, and film thickness measurements.

  16. A Grounded Theory of the College Experiences of African American Males in Black Greek-Letter Organizations

    ERIC Educational Resources Information Center

    Ford, David Julius, Jr.

    2014-01-01

    Studies have shown that involvement in a student organization can improve the academic and psychosocial outcomes of African American male students (Harper, 2006b; Robertson & Mason, 2008; Williams & Justice, 2010). Further, Harper, Byars, and Jelke (2005) stated that African American fraternities and sororities (i.e., Black Greek-letter…

  17. CRITICAL EVALUATION OF THE DIFFUSION HYPOTHESIS IN THE THEORY OF POROUS MEDIA VOLATILE ORGANIC COMPOUND (VOC) SOURCES AND SINKS

    EPA Science Inventory

    The paper proposes three alternative, diffusion-limited mathematical models to account for volatile organic compound (VOC) interactions with indoor sinks, using the linear isotherm model as a reference point. (NOTE: Recent reports by both the U.S. EPA and a study committee of the...

  18. Self-organization in irregular landscapes: Detecting autogenic interactions from field data using descriptive statistics and dynamical systems theory

    NASA Astrophysics Data System (ADS)

    Larsen, L.; Watts, D.; Khurana, A.; Anderson, J. L.; Xu, C.; Merritts, D. J.

    2015-12-01

    The classic signal of self-organization in nature is pattern formation. However, the interactions and feedbacks that organize depositional landscapes do not always result in regular or fractal patterns. How might we detect their existence and effects in these "irregular" landscapes? Emergent landscapes such as newly forming deltaic marshes or some restoration sites provide opportunities to study the autogenic processes that organize landscapes and their physical signatures. Here we describe a quest to understand autogenic vs. allogenic controls on landscape evolution in Big Spring Run, PA, a landscape undergoing restoration from bare-soil conditions to a target wet meadow landscape. The contemporary motivation for asking questions about autogenic vs. allogenic controls is to evaluate how important initial conditions or environmental controls may be for the attainment of management objectives. However, these questions can also inform interpretation of the sedimentary record by enabling researchers to separate signals that may have arisen through self-organization processes from those resulting from environmental perturbations. Over three years at Big Spring Run, we mapped the dynamic evolution of floodplain vegetation communities and distributions of abiotic variables and topography. We used principal component analysis and transition probability analysis to detect associative interactions between vegetation and geomorphic variables and convergent cross-mapping on lidar data to detect causal interactions between biomass and topography. Exploratory statistics revealed that plant communities with distinct morphologies exerted control on landscape evolution through stress divergence (i.e., channel initiation) and promoting the accumulation of fine sediment in channels. Together, these communities participated in a negative feedback that maintains low energy and multiple channels. Because of the spatially explicit nature of this feedback, causal interactions could not

  19. Organ-specific rates of cellular respiration in developing sunflower seedlings and their bearing on metabolic scaling theory.

    PubMed

    Kutschera, Ulrich; Niklas, Karl J

    2012-10-01

    Fifty years ago Max Kleiber described what has become known as the "mouse-to-elephant" curve, i.e., a log-log plot of basal metabolic rate versus body mass. From these data, "Kleiber's 3/4 law" was deduced, which states that metabolic activity scales as the three fourths-power of body mass. However, for reasons unknown so far, no such "universal scaling law" has been discovered for land plants (embryophytes). Here, we report that the metabolic rates of four different organs (cotyledons, cotyledonary hook, hypocotyl, and roots) of developing sunflower (Helianthus annuus L.) seedlings grown in darkness (skotomorphogenesis) and in white light (photomorphogenesis) differ by a factor of 2 to 5 and are largely independent of light treatment. The organ-specific respiration rate (oxygen uptake per minute per gram of fresh mass) of the apical hook, which is composed of cells with densely packaged cytoplasm, is much higher than that of the hypocotyl, an organ that contains vacuolated cells. Data for cell length, cell density, and DNA content reveal that (1) hook opening in white light is caused by a stimulation of cell elongation on the inside of the curved organ, (2) respiration, cell density and DNA content are much higher in the hook than in the stem, and (3) organ-specific respiration rates and the DNA contents of tissues are statistically correlated. We conclude that, due to the heterogeneity of the plant body caused by the vacuolization of the cells, Kleiber's law, which was deduced using mammals as a model system, cannot be applied to embryophytes. In plants, this rule may reflect scaling phenomena at the level of the metabolically active protoplasmic contents of the cells.
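
    As a numerical aside (ours, not the authors'), the 3/4 rule under discussion says basal metabolic rate scales as B = a * M**0.75, so the mass-specific rate falls as M**-0.25; the masses below are round illustrative values for a mouse and an elephant.

    m_mouse, m_elephant = 0.02, 5000.0   # kg, illustrative
    ratio_total = (m_elephant / m_mouse) ** 0.75      # whole-body rate ratio
    ratio_per_gram = (m_elephant / m_mouse) ** -0.25  # per-gram rate ratio
    print(f"whole-body: x{ratio_total:.0f}, per gram: x{ratio_per_gram:.3f}")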

  20. Statistical modelling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1991-01-01

    During the six-month period from 1 April 1991 to 30 September 1991 the following research papers in statistical modeling of software reliability appeared: (1) A Nonparametric Software Reliability Growth Model; (2) On the Use and the Performance of Software Reliability Growth Models; (3) Research and Development Issues in Software Reliability Engineering; (4) Special Issues on Software; and (5) Software Reliability and Safety.

  1. Photovoltaic module reliability workshop

    SciTech Connect

    Mrig, L.

    1990-01-01

    The papers and presentations compiled in this volume form the Proceedings of the fourth in a series of workshops sponsored by the Solar Energy Research Institute (SERI/DOE) under the general theme of photovoltaic module reliability during the period 1986--1990. The reliability of photovoltaic (PV) modules/systems is exceedingly important, along with the initial cost and efficiency of modules, if the PV technology is to make a major impact in the power generation market and compete with conventional electricity-producing technologies. The reliability of photovoltaic modules has progressed significantly in the last few years, as evidenced by warranties available on commercial modules of as long as 12 years. However, substantial research and testing are still needed to improve module field reliability to levels of 30 years or more. Several small groups of researchers are involved in this research, development, and monitoring activity around the world. In the US, PV manufacturers, DOE laboratories, electric utilities, and others are engaged in photovoltaic reliability research and testing. This group of researchers and others interested in this field were brought together under SERI/DOE sponsorship to exchange technical knowledge and field experience related to current information in this important field. The papers presented here reflect this effort.

  2. Photovoltaic module reliability workshop

    NASA Astrophysics Data System (ADS)

    Mrig, L.

    The papers and presentations compiled in this volume form the Proceedings of the fourth in a series of workshops sponsored by the Solar Energy Research Institute (SERI/DOE) under the general theme of photovoltaic module reliability during the period 1986 to 1990. The reliability of photovoltaic (PV) modules/systems is exceedingly important, along with the initial cost and efficiency of modules, if the PV technology is to make a major impact in the power generation market and compete with conventional electricity-producing technologies. The reliability of photovoltaic modules has progressed significantly in the last few years, as evidenced by warranties available on commercial modules of as long as 12 years. However, substantial research and testing are still needed to improve module field reliability to levels of 30 years or more. Several small groups of researchers are involved in this research, development, and monitoring activity around the world. In the U.S., PV manufacturers, DOE laboratories, electric utilities, and others are engaged in photovoltaic reliability research and testing. This group of researchers and others interested in this field were brought together under SERI/DOE sponsorship to exchange technical knowledge and field experience related to current information in this important field. The papers presented here reflect this effort.

  3. Orbiter Autoland reliability analysis

    NASA Technical Reports Server (NTRS)

    Welch, D. Phillip

    1993-01-01

    The Space Shuttle Orbiter is the only space reentry vehicle in which the crew is seated upright. This position presents some physiological effects requiring countermeasures to prevent a crewmember from becoming incapacitated. This also introduces a potential need for automated vehicle landing capability. Autoland is a primary procedure that was identified as a requirement for landing following an extended-duration Orbiter mission. This report documents the results of the reliability analysis performed on the hardware required for an automated landing. A reliability block diagram was used to evaluate system reliability. The analysis considers the manual and automated landing modes currently available on the Orbiter. (Autoland is presently a backup system only.) Results of this study indicate a +/- 36 percent probability of successfully extending a nominal mission to 30 days. Enough variations were evaluated to verify that the reliability could be altered with mission planning and procedures. If the crew is modeled as being fully capable after 30 days, the probability of a successful manual landing is comparable to that of Autoland, because much of the hardware is used for both manual and automated landing modes. The analysis indicates that the reliability for the manual mode is limited by the hardware and depends greatly on crew capability. Crew capability for a successful landing after 30 days has not yet been determined.

  4. Proposed reliability cost model

    NASA Technical Reports Server (NTRS)

    Delionback, L. M.

    1973-01-01

    The research investigations involved in the study include: cost analysis/allocation, reliability and product assurance, forecasting methodology, systems analysis, and model-building. This is a classic example of an interdisciplinary problem, since the model-building requirements demand understanding and communication between the technical disciplines on one hand and the financial/accounting skill categories on the other. The systems approach is utilized within this context to establish a clearer and more objective relationship between reliability assurance and the subcategories (or subelements) that provide, or reinforce, the reliability assurance for a system. Subcategories are further subdivided, as illustrated by a tree diagram. The reliability assurance elements can be seen to be potential alternative strategies, or approaches, depending on the specific goals/objectives of the trade studies. The scope was limited to the establishment of a proposed reliability cost-model format. The model format/approach depends on the use of a series of subsystem-oriented CERs and, where possible, CTRs in devising a suitable cost-effective policy.
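
    The cost-model format above rests on subsystem-oriented cost-estimating relationships (CERs). As a minimal sketch of what such a relationship looks like, the power-law form below is a common convention in parametric costing; the coefficients and the helper name cer_cost are illustrative assumptions, not taken from the report.

        # Minimal sketch of a subsystem cost-estimating relationship (CER).
        # cost = a * weight^b is a common power-law convention in parametric
        # costing; the coefficients here are purely illustrative.
        def cer_cost(weight_kg, a=1.2e4, b=0.7):
            """Estimated subsystem cost in dollars as a function of mass."""
            return a * weight_kg ** b

        print(cer_cost(150.0))  # ~4.0e5 for a hypothetical 150 kg subsystem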

  5. Test-retest reliability of high angular resolution diffusion imaging acquisition within medial temporal lobe connections assessed via tract based spatial statistics, probabilistic tractography and a novel graph theory metric.

    PubMed

    Kuhn, T; Gullett, J M; Nguyen, P; Boutzoukas, A E; Ford, A; Colon-Perez, L M; Triplett, W; Carney, P R; Mareci, T H; Price, C C; Bauer, R M

    2016-06-01

    This study examined the reliability of high angular resolution diffusion tensor imaging (HARDI) data collected on a single individual across several sessions using the same scanner. HARDI data were acquired for one healthy adult male at the same time of day on ten separate days across a one-month period. Environmental factors (e.g. temperature) were controlled across scanning sessions. Tract Based Spatial Statistics (TBSS) was used to assess session-to-session variability in measures of diffusion, fractional anisotropy (FA) and mean diffusivity (MD). To address reliability within specific structures of the medial temporal lobe (MTL; the focus of an ongoing investigation), probabilistic tractography segmented the Entorhinal cortex (ERc) based on connections with Hippocampus (HC), Perirhinal (PRc) and Parahippocampal (PHc) cortices. Streamline tractography generated edge weight (EW) metrics for the aforementioned ERc connections and, as comparison regions, connections between left and right rostral and caudal anterior cingulate cortex (ACC). Coefficients of variation (CoV) were derived for the surface area and volumes of these ERc connectivity-defined regions (CDR) and for EW across all ten scans, expecting that scan-to-scan reliability would yield low CoVs. TBSS revealed no significant variation in FA or MD across scanning sessions. Probabilistic tractography successfully reproduced histologically-verified adjacent medial temporal lobe circuits. Tractography-derived metrics displayed larger ranges of scanner-to-scanner variability. Connections involving HC displayed greater variability than metrics of connection between other investigated regions. By confirming the test-retest reliability of HARDI data acquisition, support for the validity of significant results derived from diffusion data can be obtained.
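
    The scan-to-scan reliability metric used above, the coefficient of variation, is simply the standard deviation across sessions expressed as a fraction of the mean. A minimal sketch, with hypothetical edge-weight values standing in for the study's measurements:

        import numpy as np

        # Hypothetical edge weights for one ERc-HC connection across the
        # ten scan sessions (arbitrary units); a low CoV indicates high
        # test-retest reliability.
        edge_weights = np.array([0.42, 0.44, 0.41, 0.45, 0.43,
                                 0.40, 0.44, 0.42, 0.46, 0.43])
        cov = edge_weights.std(ddof=1) / edge_weights.mean()
        print(f"CoV = {cov:.3f}")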

  6. Test-Retest Reliability of High Angular Resolution Diffusion Imaging Acquisition within Medial Temporal Lobe Connections Assessed via Tract Based Spatial Statistics, Probabilistic Tractography and a Novel Graph Theory Metric

    PubMed Central

    Kuhn, T.; Gullett, J. M.; Nguyen, P.; Boutzoukas, A. E.; Ford, A.; Colon-Perez, L. M.; Triplett, W.; Carney, P.R.; Mareci, T. H.; Price, C. C.; Bauer, R. M.

    2015-01-01

    Introduction: This study examined the reliability of high angular resolution diffusion tensor imaging (HARDI) data collected on a single individual across several sessions using the same scanner. Methods: HARDI data were acquired for one healthy adult male at the same time of day on ten separate days across a one-month period. Environmental factors (e.g. temperature) were controlled across scanning sessions. Tract Based Spatial Statistics (TBSS) was used to assess session-to-session variability in measures of diffusion, fractional anisotropy (FA) and mean diffusivity (MD). To address reliability within specific structures of the medial temporal lobe (MTL; the focus of an ongoing investigation), probabilistic tractography segmented the Entorhinal cortex (ERc) based on connections with Hippocampus (HC), Perirhinal (PRc) and Parahippocampal (PHc) cortices. Streamline tractography generated edge weight (EW) metrics for the aforementioned ERc connections and, as comparison regions, connections between left and right rostral and caudal anterior cingulate cortex (ACC). Coefficients of variation (CoV) were derived for the surface area and volumes of these ERc connectivity-defined regions (CDR) and for EW across all ten scans, expecting that scan-to-scan reliability would yield low CoVs. Results: TBSS revealed no significant variation in FA or MD across scanning sessions. Probabilistic tractography successfully reproduced histologically-verified adjacent medial temporal lobe circuits. Tractography-derived metrics displayed larger ranges of scanner-to-scanner variability. Connections involving HC displayed greater variability than metrics of connection between other investigated regions. Conclusions: By confirming the test-retest reliability of HARDI data acquisition, support for the validity of significant results derived from diffusion data can be obtained. PMID:26189060

  7. Characterizing metal coordination environments in porous organic polymers: a joint density functional theory and experimental infrared spectroscopy study.

    PubMed

    López-Encarnación, Juan M; Tanabe, Kristine K; Johnson, Marc J A; Jellinek, Julius

    2013-10-04

    Very POP right now! A DFT computational analysis of the structural, energetic, and IR spectroscopic characteristics of a porous organic polymer support, of [Ta(NMe2)5] as a molecular precursor, and of the catalytic material synthesized from these two components is presented and analyzed against recorded IR spectra of these systems. The analysis leads to unambiguous identification of the atomic structure of the POP-supported Ta-amide reaction center synthesized in the experiment.

  8. Self-organized synchronization of digital phase-locked loops with delayed coupling in theory and experiment

    PubMed Central

    Wetzel, Lucas; Jörg, David J.; Pollakis, Alexandros; Rave, Wolfgang; Fettweis, Gerhard; Jülicher, Frank

    2017-01-01

    Self-organized synchronization occurs in a variety of natural and technical systems but has so far only attracted limited attention as an engineering principle. In distributed electronic systems, such as antenna arrays and multi-core processors, a common time reference is key to coordinate signal transmission and processing. Here we show how the self-organized synchronization of mutually coupled digital phase-locked loops (DPLLs) can provide robust clocking in large-scale systems. We develop a nonlinear phase description of individual and coupled DPLLs that takes into account filter impulse responses and delayed signal transmission. Our phase model permits analytical expressions for the collective frequencies of synchronized states, the analysis of stability properties and the time scale of synchronization. In particular, we find that signal filtering introduces stability transitions that are not found in systems without filtering. To test our theoretical predictions, we designed and carried out experiments using networks of off-the-shelf DPLL integrated circuitry. We show that the phase model can quantitatively predict the existence, frequency, and stability of synchronized states. Our results demonstrate that mutually delay-coupled DPLLs can provide robust and self-organized synchronous clocking in electronic systems. PMID:28207779
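
    The nonlinear phase description the authors develop can be caricatured with a simulation of delay-coupled phase oscillators; the sketch below assumes sinusoidal phase detectors and omits the loop-filter dynamics that the paper shows to matter for stability, and all parameter values are illustrative.

        import numpy as np

        # Delay-coupled phase oscillators, a common simplification of
        # mutually coupled PLLs:
        #   dphi_i/dt = w + K/(n-1) * sum_j sin(phi_j(t - tau) - phi_i(t))
        n, omega, K, tau, dt, T = 4, 2 * np.pi, 1.5, 0.25, 1e-3, 20.0
        steps, delay = int(T / dt), int(tau / dt)

        rng = np.random.default_rng(0)
        phi = np.zeros((steps, n))
        phi[:delay + 1] = rng.uniform(0, 2 * np.pi, n)  # constant history

        for t in range(delay, steps - 1):
            diff = np.sin(phi[t - delay][None, :] - phi[t][:, None])
            np.fill_diagonal(diff, 0.0)                 # no self-coupling
            phi[t + 1] = phi[t] + dt * (omega + K / (n - 1) * diff.sum(axis=1))

        # Collective frequency of the synchronized state (rad/s); with
        # delayed coupling it generally differs from the free-running omega.
        print((phi[-1] - phi[-1001]).mean() / (1000 * dt))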

  9. Reliability Centered Maintenance - Methodologies

    NASA Technical Reports Server (NTRS)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the Lean Six Sigma (L6S) tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of an L6S project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  10. Gearbox Reliability Collaborative Update (Presentation)

    SciTech Connect

    Sheng, S.; Keller, J.; Glinsky, C.

    2013-10-01

    This presentation was given at the Sandia Reliability Workshop in August 2013 and provides information on current statistics, a status update, next steps, and other reliability research and development activities related to the Gearbox Reliability Collaborative.

  11. Reliability Engineering Handbook

    DTIC Science & Technology

    1964-06-01

    Excerpt (OCR fragment): Figure 6-4, "TWT Reliability Function, Showing the 90% Confidence Interval" (operating time in hours); Section 6-2-2, "Measurement of Reliability (Application of Confidence Limits)". The accompanying worked example gives a lower one-sided 90% confidence limit on theta of (0.704)(530) = 373 hours, with a companion bound greater than 977 hours and 90% confidence that theta lies between these two bounds.

  12. Optimal reliability-based planning of experiments for POD curves

    SciTech Connect

    Soerensen, J.D.; Faber, M.H.; Kroon, I.B.

    1995-12-31

    Optimal planning of crack detection tests is considered. The tests are used to update the information on the reliability of inspection techniques, modeled by probability of detection (P.O.D.) curves. It is shown how cost-optimal and reliability-based test plans can be obtained using First Order Reliability Methods in combination with life-cycle cost-optimal inspection and maintenance planning. The methodology is based on preposterior analysis from Bayesian decision theory. An illustrative example is presented.
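
    The P.O.D.-curve updating that such test plans feed can be illustrated with a common lognormal POD model, POD(a) = Phi((ln a - mu)/sigma), and a grid-based Bayesian update of mu from hit/miss outcomes. The paper's actual machinery (First Order Reliability Methods plus preposterior analysis) is more involved; the crack sizes and outcomes below are hypothetical.

        import numpy as np
        from scipy.stats import norm

        sigma = 0.4                                    # assumed known slope
        cracks = np.array([1.0, 2.0, 3.0, 5.0, 8.0])   # crack sizes, mm
        hits = np.array([0, 1, 1, 1, 1])               # detected (1) / missed (0)

        mu_grid = np.linspace(-1.0, 3.0, 401)
        prior = np.full(mu_grid.size, 1.0 / mu_grid.size)  # flat prior on mu

        # Likelihood of the hit/miss record for every candidate mu.
        pod = norm.cdf((np.log(cracks)[None, :] - mu_grid[:, None]) / sigma)
        lik = np.prod(np.where(hits == 1, pod, 1.0 - pod), axis=1)

        post = prior * lik
        post /= post.sum()
        print("posterior mean of mu:", (mu_grid * post).sum())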

  13. IRT-Estimated Reliability for Tests Containing Mixed Item Formats

    ERIC Educational Resources Information Center

    Shu, Lianghua; Schwarz, Richard D.

    2014-01-01

    As a global measure of precision, item response theory (IRT) estimated reliability is derived for four coefficients (Cronbach's alpha, Feldt-Raju, stratified alpha, and marginal reliability). Models with different underlying assumptions concerning test-part similarity are discussed. A detailed computational example is presented for the targeted…

  14. Inverse modelling of Köhler theory - Part 1: A response surface analysis of CCN spectra with respect to surface-active organic species

    NASA Astrophysics Data System (ADS)

    Lowe, Samuel; Partridge, Daniel G.; Topping, David; Stier, Philip

    2016-09-01

    In this study a novel framework for inverse modelling of cloud condensation nuclei (CCN) spectra is developed using Köhler theory. The framework is established by using model-generated synthetic measurements as calibration data for a parametric sensitivity analysis. Assessment of the relative importance of aerosol physicochemical parameters, while accounting for bulk-surface partitioning of surface-active organic species, is carried out over a range of atmospherically relevant supersaturations. By introducing an objective function that provides a scalar metric for diagnosing the deviation of modelled CCN concentrations from synthetic observations, objective function response surfaces are presented as a function of model input parameters. Crucially, for the chosen calibration data, aerosol-CCN spectrum closure is confirmed as a well-posed inverse modelling exercise for a subset of the parameters explored herein. The response surface analysis indicates that the appointment of appropriate calibration data is particularly important. To perform an inverse aerosol-CCN closure analysis and constrain parametric uncertainties, it is shown that a high-resolution CCN spectrum definition of the calibration data is required where single-valued definitions may be expected to fail. Using Köhler theory to model CCN concentrations requires knowledge of many physicochemical parameters, some of which are difficult to measure in situ on the scale of interest and introduce a considerable amount of parametric uncertainty to model predictions. For all partitioning schemes and environments modelled, model output showed significant sensitivity to perturbations in aerosol log-normal parameters describing the accumulation mode, surface tension, organic:inorganic mass ratio, insoluble fraction, and solution ideality. Many response surfaces pertaining to these parameters contain well-defined minima and are therefore good candidates for calibration using a Monte Carlo Markov Chain (MCMC
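
    To make the objective-function idea concrete, the sketch below scores a candidate parameter set against synthetic CCN observations using a single-mode kappa-Köhler activation model. This omits the bulk-surface partitioning treatment that is central to the paper, and every name and value in the sketch is an illustrative assumption.

        import numpy as np
        from scipy.special import erfc

        A = 2.1e-9  # Kelvin coefficient [m] near 298 K

        def ccn_conc(params, s):
            """CCN activated from one lognormal mode at supersaturation s."""
            N, dg, sg, kappa = params  # number conc., median diameter, geo. sd, kappa
            d_crit = (4 * A**3 / (27 * kappa * np.log1p(s)**2)) ** (1 / 3)
            return 0.5 * N * erfc(np.log(d_crit / dg) / (np.sqrt(2) * np.log(sg)))

        def objective(params, supersats, obs):
            """Scalar metric for the deviation of modelled CCN from observations."""
            pred = ccn_conc(params, supersats)
            return np.sum(((pred - obs) / obs) ** 2)

        s = np.array([0.001, 0.002, 0.005, 0.01])  # fractional supersaturations
        truth = (1000e6, 80e-9, 1.6, 0.3)          # m^-3, m, -, -
        obs = ccn_conc(truth, s)                   # synthetic "measurements"
        print(objective((900e6, 75e-9, 1.6, 0.25), s, obs))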

  15. Structural reliability of road accidents reconstruction.

    PubMed

    Wach, Wojciech

    2013-05-10

    Reconstruction of road accidents combines objective and subjective action. The former concerns science; the latter, assessment of human behavior in the context of objective findings. It is not uncommon for experts equipped with an arsenal of tools to obtain similar results of calculations but to present radically different conclusions about the cause of the accident. The use of sophisticated methods of uncertainty analysis does not guarantee improvement in the quality of reconstruction because, increasingly, the most serious source of reduced reliability of reconstruction is problems in logical inference. In the article the structure of uncertainty and reliability of accident reconstruction is described. A definition of the reliability of road accident reconstruction, based on the theory of conditional probability and Bayesian networks, as a function of modeling, data, and expert reliability (defined in the text) is proposed. The uncertainty of reconstruction is made dependent only on the uncertainty of the data. This separation makes it possible to conduct a qualitative and quantitative analysis of reconstruction reliability and to analyze its sensitivity to component parameters, independently of the uncertainty analysis. An example calculation is presented. The proposed formalism constitutes a tool helpful in explaining, among other things, the paradox of reliable reconstruction despite uncertain results, or unreliable reconstruction despite high precision of results. This approach is of great importance in the reconstruction of road accidents, which goes far beyond the analysis of a single, homogeneous subsystem.
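
    The factorization of reconstruction reliability into modelling, data, and expert components can be caricatured as a product of component reliabilities. Mutual independence is assumed here purely for illustration; the article develops the full conditional-probability and Bayesian-network treatment.

        # Toy version of reliability-of-reconstruction as a function of
        # modelling, data and expert reliability (independence assumed).
        def reconstruction_reliability(r_model, r_data, r_expert):
            return r_model * r_data * r_expert

        print(reconstruction_reliability(0.95, 0.80, 0.90))  # 0.684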

  16. Assuring Software Reliability

    DTIC Science & Technology

    2014-08-01

    Excerpt (OCR reference fragments): resources.sei.cmu.edu/asset_files/WhitePaper/2009_019_001_29066.pdf; [Boydston 2009] Boydston, A. & Lewis, W., Qualification and Reliability of...; Woody, Carol, Survivability Analysis Framework (CMU/SEI-2010-TN-013), Software Engineering Institute, Carnegie Mellon University, 2010.

  17. Parametric Mass Reliability Study

    NASA Technical Reports Server (NTRS)

    Holt, James P.

    2014-01-01

    The International Space Station (ISS) systems are designed based upon having redundant systems with replaceable orbital replacement units (ORUs). These ORUs are designed to be swapped out fairly quickly, but some are very large, and some are made up of many components. When an ORU fails, it is replaced on orbit with a spare; the failed unit is sometimes returned to Earth to be serviced and re-launched. Such a system is not feasible for a 500+ day long-duration mission beyond low Earth orbit. The components that make up these ORUs have mixed reliabilities. Components that make up the most mass, such as computer housings, pump casings, and the silicon boards of PCBs, typically are the most reliable. Meanwhile, components that tend to fail the earliest, such as seals or gaskets, typically have a small mass. To better understand the problem, my project is to create a parametric model that relates both the mass of ORUs and the mass of ORU subcomponents to reliability.

  18. Sequential Reliability Tests.

    ERIC Educational Resources Information Center

    Eiting, Mindert H.

    1991-01-01

    A method is proposed for sequential evaluation of reliability of psychometric instruments. Sample size is unfixed; a test statistic is computed after each person is sampled and a decision is made in each stage of the sampling process. Results from a series of Monte-Carlo experiments establish the method's efficiency. (SLD)

  19. Designing reliability into accelerators

    SciTech Connect

    Hutton, A.

    1992-08-01

    For the next generation of high performance, high average luminosity colliders, the "factories," reliability engineering must be introduced right at the inception of the project and maintained as a central theme throughout the project. There are several aspects which will be addressed separately: concept, design, motivation, management techniques, and fault diagnosis.

  1. Reliable solar cookers

    SciTech Connect

    Magney, G.K.

    1992-12-31

    The author describes the activities of SERVE, a Christian relief and development agency, to introduce solar ovens to the Afghan refugees in Pakistan. It has provided 5,000 solar cookers since 1984. The experience has demonstrated the potential of the technology and the need for a durable and reliable product. Common complaints about the cookers are discussed and the ideal cooker is described.

  2. Software reliability report

    NASA Technical Reports Server (NTRS)

    Wilson, Larry

    1991-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Unfortunately, the models appear to be unable to account for the random nature of the data. If the same code is debugged multiple times and one of the models is used to make predictions, intolerable variance is observed in the resulting reliability predictions. It is believed that data replication can remove this variance in lab-type situations and that it is less than scientific to talk about validating a software reliability model without considering replication. It is also believed that data replication may prove to be cost effective in the real world; thus the research centered on verifying the need for replication and on methodologies for generating replicated data in a cost-effective manner. The concept of the debugging graph was pursued by simulation and experimentation. Simulation was done for the Basic model and the Log-Poisson model. Reasonable values of the parameters were assigned and used to generate simulated data, which were then processed by the models in order to determine limitations on their accuracy. These experiments exploit the existing software and program specimens in AIR-LAB to measure the performance of reliability models.
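
    Replicated failure-time data of the kind described can be generated by inverting the mean-value functions of the two models named in the abstract. The sketch below assumes the common exponential ("Basic") and Musa-Okumoto logarithmic Poisson forms, with parameter values that are illustrative rather than taken from the report.

        import numpy as np

        rng = np.random.default_rng(42)

        def nhpp_times(inv_mean, n_events):
            """Event times of an NHPP via the inverse mean-value function."""
            cum = np.cumsum(rng.exponential(1.0, n_events))  # unit-rate process
            return inv_mean(cum)

        # Basic model: m(t) = a * (1 - exp(-b t)); inverse valid while m < a.
        a, b = 100.0, 0.05
        basic = nhpp_times(lambda m: -np.log(1.0 - m / a) / b, 60)

        # Logarithmic Poisson model: m(t) = ln(1 + lam0*theta*t) / theta.
        lam0, theta = 10.0, 0.05
        logpois = nhpp_times(lambda m: np.expm1(theta * m) / (lam0 * theta), 60)

        print(basic[:5])
        print(logpois[:5])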

  3. Reliability Design Handbook

    DTIC Science & Technology

    1976-03-01

    Excerpt (OCR fragment): reliability prediction, failure modes and effects analysis (FMEA), and reliability-growth techniques represent the prediction and design-evaluation methods covered, spanning assessment, production, and operation and maintenance (MIL-HDBK-217, Bayesian techniques, probabilistic design, FMEA, reliability growth). At elevated temperatures devices suffer thermal aging; oxidation and other chemical reactions are enhanced; and viscosity reduction and evaporation of lubricants are problems.

  4. The Estimation of the IRT Reliability Coefficient and Its Lower and Upper Bounds, with Comparisons to CTT Reliability Statistics

    ERIC Educational Resources Information Center

    Kim, Seonghoon; Feldt, Leonard S.

    2010-01-01

    The primary purpose of this study is to investigate the mathematical characteristics of the test reliability coefficient ρ_XX′ as a function of item response theory (IRT) parameters and present the lower and upper bounds of the coefficient. Another purpose is to examine relative performances of the IRT reliability statistics and two…

  5. Reliability Generalization (RG) Analysis: The Test Is Not Reliable

    ERIC Educational Resources Information Center

    Warne, Russell

    2008-01-01

    Literature shows that most researchers are unaware of some of the characteristics of reliability. This paper clarifies some misconceptions by describing the procedures, benefits, and limitations of reliability generalization while using it to illustrate the nature of score reliability. Reliability generalization (RG) is a meta-analytic method…

  6. From Organized High-Throughput Data to Phenomenological Theory using Machine Learning: The Example of Dielectric Breakdown

    DOE PAGES

    Kim, Chiho; Pilania, Ghanshyam; Ramprasad, Ramamurthy

    2016-02-02

    Understanding the behavior (and failure) of dielectric insulators experiencing extreme electric fields is critical to the operation of present and emerging electrical and electronic devices. Despite its importance, the development of a predictive theory of dielectric breakdown has remained a challenge, owing to the complex multiscale nature of this process. We focus on the intrinsic dielectric breakdown field of insulators—the theoretical limit of breakdown determined purely by the chemistry of the material, i.e., the elements the material is composed of, the atomic-level structure, and the bonding. Starting from a benchmark dataset (generated from laborious first principles computations) of the intrinsic dielectric breakdown field of a variety of model insulators, simple predictive phenomenological models of dielectric breakdown are distilled using advanced statistical or machine learning schemes, revealing key correlations and analytical relationships between the breakdown field and easily accessible material properties. Lastly, the models are shown to be general, and can hence guide the screening and systematic identification of high electric field tolerant materials.
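
    As a toy version of distilling a phenomenological relationship from organized data, the sketch below fits a power law linking breakdown field to two candidate features (band gap and phonon cutoff frequency) by least squares in log space. The data are synthetic and the feature choice is an assumption made for illustration, not a restatement of the paper's result.

        import numpy as np

        rng = np.random.default_rng(1)

        # Synthetic "materials": band gap (eV) and phonon cutoff (THz), with
        # a breakdown field generated from a known power law plus noise.
        Eg = rng.uniform(1.0, 10.0, 50)
        w = rng.uniform(5.0, 40.0, 50)
        F = 0.3 * Eg**1.0 * w**0.8 * rng.lognormal(0.0, 0.05, 50)

        # Fit log F = log c + p*log Eg + q*log w by linear least squares.
        X = np.column_stack([np.ones(Eg.size), np.log(Eg), np.log(w)])
        coef, *_ = np.linalg.lstsq(X, np.log(F), rcond=None)
        print("log c, p, q =", coef)  # recovers ~(log 0.3, 1.0, 0.8)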

  7. Crossed aphasia in a dextral: a test of the Alexander-Annett theory of anomalous organization of brain function.

    PubMed

    Osmon, D C; Panos, J; Kautz, P; Gandhavadi, B

    1998-07-01

    A case of crossed aphasia is presented in a strongly right-handed 77-year-old white female without history of familial sinistrality or prior neurological illness. She developed a right middle cerebral artery infarction documented by CT and accompanied by obvious clinical signs of a conduction aphasia with some resolution but continuing obvious language defect after 9 weeks in rehabilitation. Comprehensive neuropsychological and aphasia testing suggested anomalous lateralization of phonologic-output aspects of language, emotional prosody, motor planning and body schema modules with usual lateralization of lexical-semantic aspects of language and visuo-spatial functions. Experimental validation of the uncrossed lexical-semantic aspects of language using tachistoscope methods found support for the Alexander-Annett theory that different aspects of language can be dissociated in their lateralization. The subject had difficulty identifying a semantic associate of a picture presented to the left visual field (7 errors out of 10) relative to right visual field presentation (2 errors out of 10). Bilateral free naming errors (6 and 5 errors in the left and right visual fields, respectively) occurred consistent with the aphasic presentation, suggesting phonologic-output dysfunction from the right cerebral vascular accident. Implications of the results for aphasia classification are discussed.

  8. From Organized High-Throughput Data to Phenomenological Theory using Machine Learning: The Example of Dielectric Breakdown

    SciTech Connect

    Kim, Chiho; Pilania, Ghanshyam; Ramprasad, Ramamurthy

    2016-02-02

    Understanding the behavior (and failure) of dielectric insulators experiencing extreme electric fields is critical to the operation of present and emerging electrical and electronic devices. Despite its importance, the development of a predictive theory of dielectric breakdown has remained a challenge, owing to the complex multiscale nature of this process. We focus on the intrinsic dielectric breakdown field of insulators—the theoretical limit of breakdown determined purely by the chemistry of the material, i.e., the elements the material is composed of, the atomic-level structure, and the bonding. Starting from a benchmark dataset (generated from laborious first principles computations) of the intrinsic dielectric breakdown field of a variety of model insulators, simple predictive phenomenological models of dielectric breakdown are distilled using advanced statistical or machine learning schemes, revealing key correlations and analytical relationships between the breakdown field and easily accessible material properties. Lastly, the models are shown to be general, and can hence guide the screening and systematic identification of high electric field tolerant materials.

  9. The Examination of Reliability According to Classical Test and Generalizability on a Job Performance Scale

    ERIC Educational Resources Information Center

    Yelboga, Atilla; Tavsancil, Ezel

    2010-01-01

    In this research, the classical test theory and generalizability theory analyses were carried out with the data obtained by a job performance scale for the years 2005 and 2006. The reliability coefficients obtained (estimated) from the classical test theory and generalizability theory analyses were compared. In classical test theory, test retest…

  10. Bio-inspired transition metal-organic hydride conjugates for catalysis of transfer hydrogenation: experiment and theory.

    PubMed

    McSkimming, Alex; Chan, Bun; Bhadbhade, Mohan M; Ball, Graham E; Colbran, Stephen B

    2015-02-09

    Taking inspiration from yeast alcohol dehydrogenase (yADH), a benzimidazolium (BI+) organic hydride-acceptor domain has been coupled with a 1,10-phenanthroline (phen) metal-binding domain to afford a novel multifunctional ligand (LBI+) with hydride-carrier capacity (LBI+ + H- ⇌ LBIH). Complexes of the type [Cp*M(LBI)Cl][PF6]2 (M = Rh, Ir) have been made and fully characterised by cyclic voltammetry, UV/Vis spectroelectrochemistry, and, for the Ir(III) congener, X-ray crystallography. [Cp*Rh(LBI)Cl][PF6]2 catalyses the transfer hydrogenation of imines by formate ion in very good yields under conditions where the corresponding [Cp*Ir(LBI)Cl][PF6] and [Cp*M(phen)Cl][PF6] (M = Rh, Ir) complexes are almost inert as catalysts. Possible alternatives for the catalysis pathway are canvassed, and the free energies of intermediates and transition states determined by DFT calculations. The DFT study supports a mechanism involving formate-driven RhH formation (90 kJ mol^-1 free-energy barrier), transfer of hydride between the Rh and BI+ centres to generate a tethered benzimidazoline (BIH) hydride donor, binding of imine substrate at Rh, back-transfer of hydride from the BIH organic hydride donor to the Rh-activated imine substrate (89 kJ mol^-1 barrier), and exergonic protonation of the metal-bound amide by formic acid with release of amine product to close the catalytic cycle. Parallels with the mechanism of biological hydride transfer in yADH are discussed.

  11. Human development VIII: a theory of "deep" quantum chemistry and cell consciousness: quantum chemistry controls genes and biochemistry to give cells and higher organisms consciousness and complex behavior.

    PubMed

    Ventegodt, Søren; Hermansen, Tyge Dahl; Flensborg-Madsen, Trine; Nielsen, Maj Lyck; Merrick, Joav

    2006-11-14

    Deep quantum chemistry is a theory of deeply structured quantum fields carrying the biological information of the cell, making it able to remember, intend, represent the inner and outer world for comparison, understand what it "sees", and make choices on its structure, form, behavior and division. We suggest that deep quantum chemistry gives the cell consciousness and all the qualities and abilities related to consciousness. We use geometric symbolism, which is a pre-mathematical and philosophical approach to problems that cannot yet be handled mathematically. Using Occam's razor we have started with the simplest model that works; we presume this to be a many-dimensional, spiral fractal. We suggest that all the electrons of the large biological molecules' orbitals make one huge "cell-orbital", which is structured according to the spiral fractal nature of quantum fields. Consciousness of single cells, multicellular structures such as organs, multicellular organisms, and multi-individual colonies (like ants) and human societies can thus be explained by deep quantum chemistry. When biochemical activity is strictly controlled by the quantum-mechanical super-orbital of the cell, this orbital can deliver energetic quanta as biological information, distributed through many fractal levels of the cell to guide form and behavior of an individual single or a multi-cellular organism. The top level of information is the consciousness of the cell or organism, which controls all the biochemical processes. By this speculative work, inspired by Penrose and Hameroff, we hope to inspire other researchers to formulate stricter and mathematically correct hypotheses on the complex and coherent nature of matter, life and consciousness.

  12. ON THE DISTRIBUTION THEORY FOR SOME CONSTRAINED LIFE TESTING EXPERIMENTS.

    DTIC Science & Technology

    Descriptors: reliability; test methods; distribution theory; mathematical models; statistical distributions; multivariate analysis; decision theory; life expectancy (service life); exponential functions; theses.

  13. Molecular Spintronics: Theory of Spin-Dependent Electron Transport Between Iron Nano-Contacts Bridged by Organic Molecules and Fe Atomic Chains*

    NASA Astrophysics Data System (ADS)

    Dalgleish, Hugh

    2005-03-01

    Recent experiments [1] have lent support to theoretical predictions [2] that organic molecules connecting nickel nano-contacts may exhibit magneto-resistance and spin-valve effects. Here we present predictions of spintronic phenomena in another class of ferromagnetic nano-systems: Fe nano-contacts bridged by single conducting or insulating molecules or chains of Fe atoms. Models are constructed based on semi-empirical considerations, the known electronic structure of bulk Fe, and ab initio density functional calculations. Using Lippmann-Schwinger and Green's function techniques, and Landauer theory, significant magneto-resistance is predicted in these systems. Under appropriate conditions, novel device characteristics such as negative magneto-resistance are also predicted to emerge. *Supported by NSERC and the Canadian Institute for Advanced Research. [1] J. R. Petta et al., Phys. Rev. Lett. 93, 136601 (2004). [2] E. G. Emberly and G. Kirczenow, Chem. Phys. 281, 311 (2002); R. Pati et al., Phys. Rev. B 68, 100407 (2003).

  14. Sorption of nonionic organic solutes from water to tetraalkylammonium bentonites: Mechanistic considerations and application of the Polanyi-Manes potential theory.

    PubMed

    Fuller, Megan; Smith, James A; Burns, Susan E

    2007-09-15

    This work describes the role of quaternary alkylammonium amendment length on sorption mechanisms of modified bentonites for four nonionic organic compounds: benzene, carbon tetrachloride, TCE, and 1,2-DCB. Tetramethyl to tetrabutyl alkyl amendments were studied, and an important mechanistic shift occurred at the propyl chain length for all four solutes studied. Three- and four-carbon-chain functional groups on the ammonium cation resulted in a linear, rather than a curvilinear, isotherm. The uptake on tetrapropyl and tetrabutylammonium clays was noncompetitive in binary systems and showed negligible sensitivity to temperature variations, indicating the linear isotherms describe a partitioning uptake mechanism for these organoclays. The adsorptive organoclays (tetramethyl and tetraethylammonium clays) were fit with the Dubinin-Radushkevich equation to investigate the application of the Polanyi-Manes potential theory to organoclay adsorption. It was found that TCE and carbon tetrachloride, with similar physical and chemical characteristics, behaved according to the Polanyi-Manes theory. Benzene showed an anomalously high adsorption volume limit, possibly due to dense packing in the adsorption space or chemisorption to the short-chain alkyl groups.

  15. Reliability analysis of ceramic matrix composite laminates

    NASA Technical Reports Server (NTRS)

    Thomas, David J.; Wetherhold, Robert C.

    1991-01-01

    At a macroscopic level, a composite lamina may be considered a homogeneous orthotropic solid whose directional strengths are random variables. Incorporation of these random variable strengths into failure models, either interactive or non-interactive, allows for the evaluation of the lamina reliability under a given stress state. Using a non-interactive criterion for demonstration purposes, laminate reliabilities are calculated assuming previously established load sharing rules for the redistribution of load as failures of laminae occur. The matrix cracking predicted by ACK theory is modeled to allow a loss of stiffness in the fiber direction. The subsequent failure in the fiber direction is controlled by a modified bundle theory. Results using this modified bundle model are compared with previous models which did not permit separate consideration of matrix cracking, as well as to results obtained from experimental data.
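
    Under a non-interactive criterion with Weibull directional strengths, lamina reliability reduces to a product of independent mode survival probabilities. A minimal sketch with illustrative Weibull parameters, covering tensile and in-plane shear modes only:

        import numpy as np

        def weibull_survival(stress, scale, shape):
            """P(strength > stress) for a two-parameter Weibull strength."""
            return np.exp(-(np.maximum(stress, 0.0) / scale) ** shape)

        def lamina_reliability(s1, s2, t12, p):
            """Non-interactive criterion: failure modes treated as independent."""
            modes = [(s1, *p["X"]), (s2, *p["Y"]), (abs(t12), *p["S"])]
            return float(np.prod([weibull_survival(s, sc, sh)
                                  for s, sc, sh in modes]))

        # (scale MPa, shape) for longitudinal, transverse and shear strengths
        p = {"X": (350.0, 8.0), "Y": (40.0, 6.0), "S": (60.0, 7.0)}
        print(lamina_reliability(200.0, 10.0, 15.0, p))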

  16. Space Shuttle Propulsion System Reliability

    NASA Technical Reports Server (NTRS)

    Welzyn, Ken; VanHooser, Katherine; Moore, Dennis; Wood, David

    2011-01-01

    This session includes the following presentations: (1) External Tank (ET) System Reliability and Lessons, (2) Space Shuttle Main Engine (SSME) Reliability Validated by a Million Seconds of Testing, (3) Reusable Solid Rocket Motor (RSRM) Reliability via Process Control, and (4) Solid Rocket Booster (SRB) Reliability via Acceptance and Testing.

  17. Transcriptional regulation by histone modifications: towards a theory of chromatin re-organization during stem cell differentiation

    NASA Astrophysics Data System (ADS)

    Binder, Hans; Steiner, Lydia; Przybilla, Jens; Rohlf, Thimo; Prohaska, Sonja; Galle, Jörg

    2013-04-01

    Chromatin-related mechanisms, such as histone modifications, are known to be involved in regulatory switches within the transcriptome. Only recently have mathematical models of these mechanisms been established. So far they have not been applied to genome-wide data. We here introduce a mathematical model of transcriptional regulation by histone modifications and apply it to data of trimethylation of histone 3 at lysine 4 (H3K4me3) and 27 (H3K27me3) in mouse pluripotent and lineage-committed cells. The model describes binding of protein complexes to chromatin which are capable of reading and writing histone marks. Molecular interactions of the complexes with DNA and modified histones create a regulatory switch of transcriptional activity. The regulatory states of the switch depend on the activity of histone (de-)methylases, the strength of complex-DNA-binding and the number of nucleosomes capable of cooperatively contributing to complex-binding. Our model explains experimentally measured length distributions of modified chromatin regions. It suggests (i) that high CpG-density facilitates recruitment of the modifying complexes in embryonic stem cells and (ii) that re-organization of extended chromatin regions during lineage specification into neuronal progenitor cells requires targeted de-modification. Our approach represents a basic step towards multi-scale models of transcriptional control during development and lineage specification.

  18. The biological default state of cell proliferation with variation and motility, a fundamental principle for a theory of organisms.

    PubMed

    Soto, Ana M; Longo, Giuseppe; Montévil, Maël; Sonnenschein, Carlos

    2016-10-01

    The principle of inertia is central to the modern scientific revolution. By postulating this principle Galileo at once identified a pertinent physical observable (momentum) and a conservation law (momentum conservation). He then could scientifically analyze what modifies inertial movement: gravitation and friction. Inertia, the default state in mechanics, represented a major theoretical commitment: there is no need to explain uniform rectilinear motion, rather, there is a need to explain departures from it. By analogy, we propose a biological default state of proliferation with variation and motility. From this theoretical commitment, what requires explanation is proliferative quiescence, lack of variation, lack of movement. That proliferation is the default state is axiomatic for biologists studying unicellular organisms. Moreover, it is implied in Darwin's "descent with modification". Although a "default state" is a theoretical construct and a limit case that does not need to be instantiated, conditions that closely resemble unrestrained cell proliferation are readily obtained experimentally. We will illustrate theoretical and experimental consequences of applying and of ignoring this principle.

  19. Transcriptional regulation by histone modifications: towards a theory of chromatin re-organization during stem cell differentiation.

    PubMed

    Binder, Hans; Steiner, Lydia; Przybilla, Jens; Rohlf, Thimo; Prohaska, Sonja; Galle, Jörg

    2013-04-01

    Chromatin-related mechanisms, such as histone modifications, are known to be involved in regulatory switches within the transcriptome. Only recently have mathematical models of these mechanisms been established. So far they have not been applied to genome-wide data. We here introduce a mathematical model of transcriptional regulation by histone modifications and apply it to data of trimethylation of histone 3 at lysine 4 (H3K4me3) and 27 (H3K27me3) in mouse pluripotent and lineage-committed cells. The model describes binding of protein complexes to chromatin which are capable of reading and writing histone marks. Molecular interactions of the complexes with DNA and modified histones create a regulatory switch of transcriptional activity. The regulatory states of the switch depend on the activity of histone (de-)methylases, the strength of complex-DNA-binding and the number of nucleosomes capable of cooperatively contributing to complex-binding. Our model explains experimentally measured length distributions of modified chromatin regions. It suggests (i) that high CpG-density facilitates recruitment of the modifying complexes in embryonic stem cells and (ii) that re-organization of extended chromatin regions during lineage specification into neuronal progenitor cells requires targeted de-modification. Our approach represents a basic step towards multi-scale models of transcriptional control during development and lineage specification.

  20. Age-related transcriptional changes in gene expression in different organs of mice support the metabolic stability theory of aging.

    PubMed

    Brink, Thore C; Demetrius, Lloyd; Lehrach, Hans; Adjaye, James

    2009-10-01

    Individual differences in the rate of aging are determined by the efficiency with which an organism transforms resources into metabolic energy thus maintaining the homeostatic condition of its cells and tissues. This observation has been integrated with analytical studies of the metabolic process to derive the following principle: The metabolic stability of regulatory networks, that is the ability of cells to maintain stable concentrations of reactive oxygen species (ROS) and other critical metabolites is the prime determinant of life span. The metabolic stability of a regulatory network is determined by the diversity of the metabolic pathways or the degree of connectivity of genes in the network. These properties can be empirically evaluated in terms of transcriptional changes in gene expression. We use microarrays to investigate the age-dependence of transcriptional changes of genes in the insulin signaling, oxidative phosphorylation and glutathione metabolism pathways in mice. Our studies delineate age and tissue specific patterns of transcriptional changes which are consistent with the metabolic stability-longevity principle. This study, in addition, rejects the free radical hypothesis which postulates that the production rate of ROS, and not its stability, determines life span.

  1. Human Reliability Program Workshop

    SciTech Connect

    Landers, John; Rogers, Erin; Gerke, Gretchen

    2014-05-18

    A Human Reliability Program (HRP) is designed to protect national security as well as worker and public safety by continuously evaluating the reliability of those who have access to sensitive materials, facilities, and programs. Some elements of a site HRP include systematic (1) supervisory reviews, (2) medical and psychological assessments, (3) management evaluations, (4) personnel security reviews, and (5) training of HRP staff and critical positions. Over the years of implementing an HRP, the Department of Energy (DOE) has faced various challenges and overcome obstacles. During this 4-day activity, participants will examine programs that mitigate threats to nuclear security and the insider threat, including HRP, Nuclear Security Culture (NSC) Enhancement, and Employee Assistance Programs. The focus will be to develop an understanding of the need for a systematic HRP and to discuss challenges and best practices associated with mitigating the insider threat.

  2. Reliable broadcast protocols

    NASA Technical Reports Server (NTRS)

    Joseph, T. A.; Birman, Kenneth P.

    1989-01-01

    A number of broadcast protocols that are reliable subject to a variety of ordering and delivery guarantees are considered. Developing applications that are distributed over a number of sites and/or must tolerate the failures of some of them becomes a considerably simpler task when such protocols are available for communication. Without such protocols the kinds of distributed applications that can reasonably be built will have a very limited scope. As the trend towards distribution and decentralization continues, it will not be surprising if reliable broadcast protocols have the same role in distributed operating systems of the future that message passing mechanisms have in the operating systems of today. On the other hand, the problems of engineering such a system remain large. For example, deciding which protocol is the most appropriate to use in a certain situation or how to balance the latency-communication-storage costs is not an easy question.

  3. Conditional Reliability Coefficients for Test Scores.

    PubMed

    Nicewander, W Alan

    2017-04-06

    The most widely used, general index of measurement precision for psychological and educational test scores is the reliability coefficient, a ratio of true variance for a test score to the true-plus-error variance of the score. In item response theory (IRT) models for test scores, the information function is the central, conditional index of measurement precision. In this inquiry, conditional reliability coefficients for a variety of score types are derived as simple transformations of information functions. It is shown, for example, that the conditional reliability coefficient for an ordinary number-correct score X is equal to ρ(X,X′|θ) = I(X,θ) / [I(X,θ) + 1], where θ is a latent variable measured by an observed test score X, ρ(X,X′|θ) is the conditional reliability of X at a fixed value of θ, and I(X,θ) is the score information function. This is a surprisingly simple relationship between the two basic indices of measurement precision from IRT and classical test theory (CTT). This relationship holds for item scores as well as test scores based on sums of item scores, and it holds for dichotomous as well as polytomous items, or a mix of both item types. Also, conditional reliabilities are derived for computerized adaptive test scores, and for θ-estimates used as alternatives to number-correct scores. These conditional reliabilities are all related to information in a manner similar or identical to the one given above for the number-correct (NC) score.
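
    The identity above is easy to exercise numerically. The sketch below assumes Rasch items, for which the number-correct score is sufficient and its information reduces to the sum of P_i(1 - P_i); the item difficulties are illustrative.

        import numpy as np

        b = np.array([-1.5, -0.5, 0.0, 0.5, 1.0, 1.5])  # item difficulties

        def nc_information(theta):
            """Information of the number-correct score for Rasch items."""
            p = 1.0 / (1.0 + np.exp(-(theta - b)))
            return np.sum(p * (1.0 - p))

        for theta in (-2.0, 0.0, 2.0):
            info = nc_information(theta)
            print(theta, info / (info + 1.0))  # conditional reliability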

  4. Compact, Reliable EEPROM Controller

    NASA Technical Reports Server (NTRS)

    Katz, Richard; Kleyner, Igor

    2010-01-01

    A compact, reliable controller for an electrically erasable, programmable read-only memory (EEPROM) has been developed specifically for a space-flight application. The design may be adaptable to other applications in which there are requirements for reliability in general and, in particular, for prevention of inadvertent writing of data in EEPROM cells. Inadvertent writes pose risks of loss of reliability in the original space-flight application and could pose such risks in other applications. Prior EEPROM controllers are large and complex and do not provide all reasonable protections (in many cases, few or no protections) against inadvertent writes. In contrast, the present controller provides several layers of protection against inadvertent writes. The controller also incorporates a write-time monitor, enabling determination of trends in the performance of an EEPROM through all phases of testing. The controller has been designed as an integral subsystem of a system that includes not only the controller and the controlled EEPROM aboard a spacecraft but also computers in a ground control station, relatively simple onboard support circuitry, and an onboard communication subsystem that utilizes the MIL-STD-1553B protocol. (MIL-STD-1553B is a military standard that encompasses a method of communication and electrical-interface requirements for digital electronic subsystems connected to a data bus. MIL-STD-1553B is commonly used in defense and space applications.) The intent was to maximize reliability while minimizing the size and complexity of onboard circuitry. In operation, control of the EEPROM is effected via the ground computers, the MIL-STD-1553B communication subsystem, and the onboard support circuitry, all of which, in combination, provide the multiple layers of protection against inadvertent writes. There is no controller software, unlike in many prior EEPROM controllers; software can be a major contributor to unreliability, particularly in fault

  5. Designing reliability into accelerators

    NASA Astrophysics Data System (ADS)

    Hutton, A.

    1992-07-01

    Future accelerators will have to provide a high degree of reliability. Quality must be designed in right from the beginning and must remain a central theme throughout the project. The problem is similar to the problems facing US industry today, and examples of the successful application of quality engineering will be given. Different aspects of an accelerator project will be addressed: Concept, Design, Motivation, Management Techniques, and Fault Diagnosis. The importance of creating and maintaining a coherent team will be stressed.

  6. Reliability and testing

    NASA Technical Reports Server (NTRS)

    Auer, Werner

    1996-01-01

    Reliability and its interdependence with testing are important topics for the development and manufacturing of successful products. This generally accepted fact is not only a technical statement but must also be seen in the light of 'Human Factors.' While the background for this paper is the experience gained with electromechanical/electronic space products, including control and system considerations, it is believed that the content could also be of interest for other fields.

  7. Laser System Reliability

    DTIC Science & Technology

    1977-03-01

    Excerpt (OCR fragment): Section III, Reliability Prediction. Failure-rate data were drawn from a data-exchange-program failure-rate data bank; in addition, some data have been obtained from Hughes, Rocketdyne, Garrett, and the AFWL's APT failure data.

  8. Spacecraft transmitter reliability

    NASA Technical Reports Server (NTRS)

    1980-01-01

    A workshop on spacecraft transmitter reliability was held at the NASA Lewis Research Center on September 25 and 26, 1979, to discuss present knowledge and to plan future research areas. Since formal papers were not submitted, this synopsis was derived from audio tapes of the workshop. The following subjects were covered: users' experience with space transmitters; cathodes; power supplies and interfaces; and specifications and quality assurance. A panel discussion ended the workshop.

  9. Software reliability studies

    NASA Technical Reports Server (NTRS)

    Hoppa, Mary Ann; Wilson, Larry W.

    1994-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Our research has shown that by improving the quality of the data one can greatly improve the predictions. We are working on methodologies which control some of the randomness inherent in the standard data generation processes in order to improve the accuracy of predictions. Our contribution is twofold in that we describe an experimental methodology using a data structure called the debugging graph and apply this methodology to assess the robustness of existing models. The debugging graph is used to analyze the effects of various fault recovery orders on the predictive accuracy of several well-known software reliability algorithms. We found that, along a particular debugging path in the graph, the predictive performance of different models can vary greatly. Similarly, just because a model 'fits' a given path's data well does not guarantee that the model would perform well on a different path. Further we observed bug interactions and noted their potential effects on the predictive process. We saw that not only do different faults fail at different rates, but that those rates can be affected by the particular debugging stage at which the rates are evaluated. Based on our experiment, we conjecture that the accuracy of a reliability prediction is affected by the fault recovery order as well as by fault interaction.

  10. General Aviation Aircraft Reliability Study

    NASA Technical Reports Server (NTRS)

    Pettit, Duane; Turnbull, Andrew; Roelant, Henk A. (Technical Monitor)

    2001-01-01

    This reliability study was performed in order to provide the aviation community with an estimate of Complex General Aviation (GA) Aircraft System reliability. To successfully improve the safety and reliability for the next generation of GA aircraft, a study of current GA aircraft attributes was prudent. This was accomplished by benchmarking the reliability of operational Complex GA Aircraft Systems. Specifically, Complex GA Aircraft System reliability was estimated using data obtained from the logbooks of a random sample of the Complex GA Aircraft population.

  11. 18 CFR 39.6 - Conflict of a Reliability Standard with a Commission Order.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Reliability Standard with a Commission Order. 39.6 Section 39.6 Conservation of Power and Water Resources... CONCERNING CERTIFICATION OF THE ELECTRIC RELIABILITY ORGANIZATION; AND PROCEDURES FOR THE ESTABLISHMENT, APPROVAL, AND ENFORCEMENT OF ELECTRIC RELIABILITY STANDARDS § 39.6 Conflict of a Reliability Standard...

  12. Can Sex Differences in Science Be Tied to the Long Reach of Prenatal Hormones? Brain Organization Theory, Digit Ratio (2D/4D), and Sex Differences in Preferences and Cognition

    PubMed Central

    Valla, Jeffrey; Ceci, Stephen J.

    2011-01-01

    Brain organization theory posits a cascade of physiological and behavioral changes initiated and shaped by prenatal hormones. Recently, this theory has been associated with outcomes including gendered toy preference, 2D/4D digit ratio, personality characteristics, sexual orientation, and cognitive profile (spatial, verbal, and mathematical abilities). We examine the evidence for this claim, focusing on 2D/4D and its putative role as a biomarker for organizational features that influence cognitive abilities/interests predisposing males toward mathematically and spatially intensive careers. Although massive support exists for early brain organization theory overall, there are myriad inconsistencies, alternative explanations, and outright contradictions that must be addressed while still taking the entire theory into account. Like a fractal within the larger theory, the 2D/4D hypothesis mirrors this overall support on a smaller scale while likewise suffering from inconsistencies (positive, negative, and sex-dependent correlations), alternative explanations (2D/4D related to spatial preferences rather than abilities per se), and contradictions (feminine 2D/4D in men associated with higher spatial ability). Using the debate over brain organization theory as the theoretical stage, we focus on 2D/4D evidence as an increasingly important player on this stage, a demonstrative case in point of the evidential complexities of the broader debate, and an increasingly important topic in its own right. PMID:22164187

  13. CR reliability testing

    NASA Astrophysics Data System (ADS)

    Honeyman-Buck, Janice C.; Rill, Lynn; Frost, Meryll M.; Staab, Edward V.

    1998-07-01

    The purpose of this work was to develop a method for systematically testing the reliability of a CR system under realistic daily loads in a non-clinical environment prior to its clinical adoption. Once digital imaging replaces film, it will be very difficult to revert back should the digital system become unreliable. Prior to the beginning of the test, a formal evaluation was performed to set the benchmarks for performance and functionality. A formal protocol was established that included all the 62 imaging plates in the inventory for each 24-hour period in the study. Imaging plates were exposed using different combinations of collimation, orientation, and SID. Anthropomorphic phantoms were used to acquire images of different sizes. Each combination was chosen randomly to simulate the differences that could occur in clinical practice. The tests were performed over a wide range of times with batches of plates processed to simulate the temporal constraints required by the nature of portable radiographs taken in the Intensive Care Unit (ICU). Current patient demographics were used for the test studies so automatic routing algorithms could be tested. During the test, only three minor reliability problems occurred, two of which were not directly related to the CR unit. One plate was discovered to cause a segmentation error that essentially reduced the image to only black and white with no gray levels. This plate was removed from the inventory to be replaced. Another problem was a PACS routing problem that occurred when the DICOM server with which the CR was communicating had a problem with disk space. The final problem was a network printing failure to the laser cameras. Although the units passed the reliability test, problems with interfacing to workstations were discovered. The two issues that were identified were the interpretation of what constitutes a study for CR and the construction of the look-up table for a proper gray scale display.

  14. Ultimately Reliable Pyrotechnic Systems

    NASA Technical Reports Server (NTRS)

    Scott, John H.; Hinkel, Todd

    2015-01-01

    This paper presents the methods by which NASA has designed, built, tested, and certified pyrotechnic devices for high-reliability operation in extreme environments and illustrates the potential applications in the oil and gas industry. NASA's extremely successful application of pyrotechnics is built upon documented procedures and test methods that have been maintained and developed since the Apollo Program. Standards are managed and rigorously enforced for performance margins, redundancy, lot sampling, and personnel safety. The pyrotechnics utilized in spacecraft include such devices as small initiators and detonators with the power of a shotgun shell, detonating cord systems for explosive energy transfer across many feet, precision linear shaped charges for breaking structural membranes, and booster charges to actuate valves and pistons. NASA's pyrotechnics program is one of the more successful in the history of Human Spaceflight. No pyrotechnic device developed in accordance with NASA's Human Spaceflight standards has ever failed in flight use. NASA's pyrotechnic initiators work reliably in temperatures as low as -420 F. Each of the 135 Space Shuttle flights fired 102 of these initiators, some setting off multiple pyrotechnic devices, with never a failure. The recent landing on Mars of the Curiosity rover fired 174 of NASA's pyrotechnic initiators to complete the famous '7 minutes of terror.' Even after traveling through extreme radiation and thermal environments on the way to Mars, every one of them worked. These initiators have fired on the surface of Titan. NASA's design controls, procedures, and processes produce the most reliable pyrotechnics in the world. Application of pyrotechnics designed and procured in this manner could enable the energy industry's emergency equipment, such as shutoff valves and deep-sea blowout preventers, to be left in place for years in extreme environments and still be relied upon to function when needed, thus greatly enhancing

  15. Reliability Growth Prediction

    DTIC Science & Technology

    1986-09-01

    the Duane model because the reliability growth data analyzed were reflective of a single test for each equipment as opposed to a series of tests ... fabrication) and costs which are a function of test length (e.g., chamber operations). A life-cycle cost model (Ref. 14, for example) can be exercised to ... J. Gibson and K. K. Mcain. Approved for public release; distribution unlimited. Rome Air Development Center, Air Force Systems Command, Griffiss AFB
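    The snippet turns on the Duane growth postulate: cumulative failures follow a power law in test time, N(t) = lambda * t**beta, so reliability growth (beta < 1) appears as a straight line on log-log axes. A minimal sketch of fitting that postulate, with invented test data for illustration:

        import numpy as np

        def fit_duane(test_times, cum_failures):
            """Fit the Duane postulate N(t) = lam * t**beta by least squares
            on log-log axes; beta < 1 indicates reliability growth."""
            x = np.log(np.asarray(test_times, dtype=float))
            y = np.log(np.asarray(cum_failures, dtype=float))
            beta, log_lam = np.polyfit(x, y, 1)
            return np.exp(log_lam), beta

        # Illustrative data: cumulative failures observed at increasing test hours.
        t = [100, 300, 700, 1500, 3000]
        n = [6, 12, 20, 30, 44]
        lam, beta = fit_duane(t, n)
        print(f"lambda = {lam:.3f}, beta = {beta:.3f}")
        # Instantaneous MTBF is 1/(lam*beta*t**(beta-1)) = t**(1-beta)/(lam*beta).
        print(f"instantaneous MTBF at t=3000 h: {3000**(1 - beta) / (lam * beta):.1f} h")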

  16. Blade Reliability Collaborative

    SciTech Connect

    Ashwill, Thomas D.; Ogilvie, Alistair B.; Paquette, Joshua A.

    2013-04-01

    The Blade Reliability Collaborative (BRC) was started by the Wind Energy Technologies Department of Sandia National Laboratories and DOE in 2010 with the goal of gaining insight into planned and unplanned O&M issues associated with wind turbine blades. A significant part of BRC is the Blade Defect, Damage and Repair Survey task, which will gather data from blade manufacturers, service companies, operators and prior studies to determine details about the largest sources of blade unreliability. This report summarizes the initial findings from this work.

  17. Reliable VLSI sequential controllers

    NASA Technical Reports Server (NTRS)

    Whitaker, S.; Maki, G.; Shamanna, M.

    1990-01-01

    A VLSI architecture for synchronous sequential controllers is presented that has attractive qualities for producing reliable circuits. In these circuits, one hardware implementation can realize any flow table with a maximum of 2^n internal states and m inputs. Also, all design equations are identical. A real-time fault detection means is presented along with a strategy for verifying the correctness of the checking hardware. This self-check feature can be employed with no increase in hardware. The architecture can be modified to achieve fail-safe designs. With no increase in hardware, an adaptable circuit can be realized that allows replacement of faulty transitions with fault-free transitions.

  18. Ferrite logic reliability study

    NASA Technical Reports Server (NTRS)

    Baer, J. A.; Clark, C. B.

    1973-01-01

    Development and use of digital circuits called all-magnetic logic are reported. In these circuits the magnetic elements and their windings comprise the active circuit devices in the logic portion of a system. The ferrite logic (FLO) device belongs to the all-magnetic class of logic circuits. The FLO device is novel in that it makes use of a dual or bimaterial ferrite composition in one physical ceramic body. This bimaterial feature, coupled with its potential for relatively high-speed operation, makes it attractive for high-reliability applications. (Maximum speed of operation approximately 50 kHz.)

  19. Testing of reliability - Analysis tools

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.

    1989-01-01

    An outline is presented of issues raised in verifying the accuracy of reliability analysis tools. State-of-the-art reliability analysis tools implement various decomposition, aggregation, and estimation techniques to compute the reliability of a diversity of complex fault-tolerant computer systems. However, no formal methodology has been formulated for validating the reliability estimates produced by these tools. The author presents three stages of testing that can be performed on most reliability analysis tools to effectively increase confidence in a tool. These testing stages were applied to the SURE (semi-Markov Unreliability Range Evaluator) reliability analysis tool, and the results of the testing are discussed.

  20. Understanding the Elements of Operational Reliability: A Key for Achieving High Reliability

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.

    2010-01-01

    This viewgraph presentation reviews operational reliability and its role in achieving high reliability through design and process reliability. The topics include: 1) Reliability Engineering Major Areas and interfaces; 2) Design Reliability; 3) Process Reliability; and 4) Reliability Applications.

  1. Verification, validation, and reliability of predictions

    SciTech Connect

    Pigford, T.H.; Chambre, P.L.

    1987-04-01

    The objective of predicting long-term performance should be to make reliable determinations of whether the prediction falls within the criteria for acceptable performance. Establishing reliable predictions of long-term performance of a waste repository requires emphasis on valid theories to predict performance. The validation process must establish the validity of the theory, the parameters used in applying the theory, the arithmetic of calculations, and the interpretation of results; but validation of such performance predictions is not possible unless there are clear criteria for acceptable performance. Validation programs should emphasize identification of the substantive issues of prediction that need to be resolved. Examples relevant to waste package performance are predicting the life of waste containers and the time distribution of container failures, establishing the criteria for defining container failure, validating theories for time-dependent waste dissolution that depend on details of the repository environment, and determining the extent of congruent dissolution of radionuclides in the UO2 matrix of spent fuel. Prediction and validation should go hand in hand and should be done and reviewed frequently, as essential tools for the programs to design and develop repositories. 29 refs.

  2. Origins of life: a comparison of theories and application to Mars.

    PubMed

    Davis, W L; McKay, C P

    1996-02-01

    The field of study that deals with the origins of life does not have a consensus for a theory of life's origin. An analysis of the range of theories offered shows that they share some common features that may be reliable predictors when considering the possible origins of life on another planet. The fundamental datum dealing with the origins of life is that life appeared early in the history of the Earth, probably before 3.5 Ga and possibly before 3.8 Ga. What might be called the standard theory (the Oparin-Haldane theory) posits the production of organic molecules on the early Earth followed by chemical reactions that produced increased organic complexity leading eventually to organic life capable of reproduction, mutation, and selection using organic material as nutrients. A distinct class of other theories (panspermia theories) suggests that life was carried to Earth from elsewhere--these theories receive some support from recent work on planetary impact processes. Other alternatives to the standard model suggest that life arose as an inorganic (clay) form and/or that the initial energy source was not organic material but chemical energy or sunlight. We find that the entire range of current theories suggests that liquid water is the quintessential environmental criterion for both the origin and sustenance of life. It is therefore of interest that during the time that life appeared on Earth we have evidence for liquid water present on the surface of Mars.

  3. Origins of life: a comparison of theories and application to Mars

    NASA Technical Reports Server (NTRS)

    Davis, W. L.; McKay, C. P.

    1996-01-01

    The field of study that deals with the origins of life does not have a consensus for a theory of life's origin. An analysis of the range of theories offered shows that they share some common features that may be reliable predictors when considering the possible origins of life on another planet. The fundamental datum dealing with the origins of life is that life appeared early in the history of the Earth, probably before 3.5 Ga and possibly before 3.8 Ga. What might be called the standard theory (the Oparin-Haldane theory) posits the production of organic molecules on the early Earth followed by chemical reactions that produced increased organic complexity leading eventually to organic life capable of reproduction, mutation, and selection using organic material as nutrients. A distinct class of other theories (panspermia theories) suggests that life was carried to Earth from elsewhere--these theories receive some support from recent work on planetary impact processes. Other alternatives to the standard model suggest that life arose as an inorganic (clay) form and/or that the initial energy source was not organic material but chemical energy or sunlight. We find that the entire range of current theories suggests that liquid water is the quintessential environmental criterion for both the origin and sustenance of life. It is therefore of interest that during the time that life appeared on Earth we have evidence for liquid water present on the surface of Mars.

  4. Load Control System Reliability

    SciTech Connect

    Trudnowski, Daniel

    2015-04-03

    This report summarizes the results of the Load Control System Reliability project (DOE Award DE-FC26-06NT42750). The original grant was awarded to Montana Tech in April 2006. Follow-on DOE awards and expansions to the project scope occurred in August 2007, January 2009, April 2011, and April 2013. In addition to the DOE monies, the project also included matching funds from the states of Montana and Wyoming. Project participants included Montana Tech, the University of Wyoming, Montana State University, NorthWestern Energy, Inc., and MSE. Research focused on two areas: real-time power-system load control methodologies, and power-system measurement-based stability-assessment operation and control tools. The majority of effort was focused on the second area. Results from the research include: development of fundamental power-system dynamic concepts, control schemes, and signal-processing algorithms; many papers (including two prize papers) in leading journals and conferences and leadership of IEEE activities; one patent; participation in major actual-system testing in the western North American power system; prototype power-system operation and control software installed and tested at three major North American control centers; and the incubation of a new commercial-grade operation and control software tool. Work under this grant certainly supported the DOE-OE goals in the area of “Real Time Grid Reliability Management.”

  5. Integrated circuit reliability testing

    NASA Technical Reports Server (NTRS)

    Buehler, Martin G. (Inventor); Sayah, Hoshyar R. (Inventor)

    1990-01-01

    A technique is described for use in determining the reliability of microscopic conductors deposited on an uneven surface of an integrated circuit device. A wafer containing integrated circuit chips is formed with a test area having regions of different heights. At the time the conductors are formed on the chip areas of the wafer, an elongated serpentine assay conductor is deposited on the test area so the assay conductor extends over multiple steps between regions of different heights. Also, a first test conductor is deposited in the test area upon a uniform region of first height, and a second test conductor is deposited in the test area upon a uniform region of second height. The occurrence of high resistances at the steps between regions of different height is indicated by deriving the measured length of the serpentine conductor using the resistance measured between the ends of the serpentine conductor, and comparing that to the design length of the serpentine conductor. The percentage by which the measured length exceeds the design length, at which the integrated circuit will be discarded, depends on the required reliability of the integrated circuit.

  6. Integrated circuit reliability testing

    NASA Technical Reports Server (NTRS)

    Buehler, Martin G. (Inventor); Sayah, Hoshyar R. (Inventor)

    1988-01-01

    A technique is described for use in determining the reliability of microscopic conductors deposited on an uneven surface of an integrated circuit device. A wafer containing integrated circuit chips is formed with a test area having regions of different heights. At the time the conductors are formed on the chip areas of the wafer, an elongated serpentine assay conductor is deposited on the test area so the assay conductor extends over multiple steps between regions of different heights. Also, a first test conductor is deposited in the test area upon a uniform region of first height, and a second test conductor is deposited in the test area upon a uniform region of second height. The occurrence of high resistances at the steps between regions of different height is indicated by deriving the measured length of the serpentine conductor using the resistance measured between the ends of the serpentine conductor, and comparing that to the design length of the serpentine conductor. The percentage by which the measured length exceeds the design length, at which the integrated circuit will be discarded, depends on the required reliability of the integrated circuit.
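    A minimal sketch of the accept/reject arithmetic the patent describes: infer the serpentine assay conductor's apparent length from its end-to-end resistance and compare it to the design length. It assumes, for illustration, that the per-unit-length resistance calibration comes from the two uniform-height test conductors; the function names and numbers are hypothetical:

        def measured_length(r_serpentine_ohm, r_per_cm):
            """Infer the electrical 'length' of the serpentine assay conductor from
            its end-to-end resistance and a resistance-per-unit-length calibration."""
            return r_serpentine_ohm / r_per_cm

        def passes_step_coverage(r_serpentine_ohm, r_per_cm, design_length_cm,
                                 max_excess_fraction=0.05):
            """Reject the wafer when the inferred length exceeds the design length
            by more than the allowed fraction: excess resistance at the steps
            between regions of different height shows up as apparent extra length."""
            excess = measured_length(r_serpentine_ohm, r_per_cm) / design_length_cm - 1.0
            return excess <= max_excess_fraction

        # Illustrative numbers: 2.0 ohm/cm calibration, 10 cm design length.
        print(passes_step_coverage(20.8, r_per_cm=2.0, design_length_cm=10.0))  # True  (4% excess)
        print(passes_step_coverage(21.4, r_per_cm=2.0, design_length_cm=10.0))  # False (7% excess)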

  7. The Reliability of Neurons

    PubMed Central

    Bullock, Theodore Holmes

    1970-01-01

    The prevalent probabilistic view is virtually untestable; it remains a plausible belief. The cases usually cited cannot be taken as evidence for it. Several grounds for this conclusion are developed. Three issues are distinguished in an attempt to clarify a murky debate: (a) the utility of probabilistic methods in data reduction, (b) the value of models that assume indeterminacy, and (c) the validity of the inference that the nervous system is largely indeterministic at the neuronal level. No exception is taken to the first two; the second is a private heuristic question. The third is the issue to which the assertion in the first two sentences is addressed. Of the two kinds of uncertainty, statistical mechanical (= practical unpredictability) as in a gas, and Heisenbergian indeterminacy, the first certainly exists; the second is moot at the neuronal level. It would contribute to discussion to recognize that neurons perform with a degree of reliability. Although unreliability is difficult to establish, to say nothing of measure, evidence that some neurons have a high degree of reliability, in both connections and activity, is increasing greatly. An example is given from sternarchine electric fish. PMID:5462670

  8. Reliable Entanglement Verification

    NASA Astrophysics Data System (ADS)

    Arrazola, Juan; Gittsovich, Oleg; Donohue, John; Lavoie, Jonathan; Resch, Kevin; Lütkenhaus, Norbert

    2013-05-01

    Entanglement plays a central role in quantum protocols. It is therefore important to be able to verify the presence of entanglement in physical systems from experimental data. In the evaluation of these data, the proper treatment of statistical effects requires special attention, as one can never claim to have verified the presence of entanglement with certainty. Recently, increased attention has been paid to the development of proper frameworks to pose and to answer these types of questions. In this work, we apply recent results by Christandl and Renner on reliable quantum state tomography to construct a reliable entanglement verification procedure based on the concept of confidence regions. The statements made do not require the specification of a prior distribution nor the assumption of an independent and identically distributed (i.i.d.) source of states. Moreover, we develop efficient numerical tools that are necessary to employ this approach in practice, rendering the procedure ready to be employed in current experiments. We demonstrate this fact by analyzing the data of an experiment where photonic entangled two-photon states were generated and whose entanglement is verified with the use of an accessible nonlinear witness.

  9. Resource based view: a promising new theory for healthcare organizations: Comment on "Resource based view of the firm as a theoretical lens on the organisational consequences of quality improvement".

    PubMed

    Ferlie, Ewan

    2014-11-01

    This commentary reviews a recent piece by Burton and Rycroft-Malone on the use of the Resource Based View (RBV) in healthcare organizations. It first outlines the core content of their piece. It then discusses their attempts to extend RBV to the analysis of large-scale quality improvement efforts in healthcare. Some critique is elaborated. The broader question of why RBV seems to be migrating into healthcare management research is then considered. The commentary concludes that RBV is a promising new theory for healthcare organizations.

  10. Creep-rupture reliability analysis

    NASA Technical Reports Server (NTRS)

    Peralta-Duran, A.; Wirsching, P. H.

    1984-01-01

    A probabilistic approach to the correlation and extrapolation of creep-rupture data is presented. Time-temperature parameters (TTPs) are used to correlate the data, and an analytical expression for the master curve is developed. The expression provides a simple model for the statistical distribution of strength and fits neatly into a probabilistic design format. The analysis focuses on the Larson-Miller and on the Manson-Haferd parameters, but it can be applied to any of the TTPs. A method is developed for evaluating material-dependent constants for TTPs. It is shown that optimized constants can provide a significant improvement in the correlation of the data, thereby reducing modelling error. Attempts were made to quantify the performance of the proposed method in predicting long-term behavior. Uncertainty in predicting long-term behavior from short-term tests was derived for several sets of data. Examples are presented which illustrate the theory and demonstrate the application of state-of-the-art reliability methods to the design of components under creep.
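    For concreteness, the Larson-Miller parameter mentioned above is LMP = T * (C + log10 t_r), with T the absolute temperature, t_r the rupture life, and C a material constant (C = 20 is a common default; the paper's point is that optimizing such constants improves the correlation). A minimal sketch with illustrative numbers, not data from the paper:

        import math

        def larson_miller(T_kelvin, hours, C=20.0):
            """Larson-Miller parameter LMP = T*(C + log10 t_r); at equal stress,
            LMP is approximately constant across temperature/time combinations."""
            return T_kelvin * (C + math.log10(hours))

        def rupture_time(T_kelvin, lmp, C=20.0):
            """Invert the parameter to estimate rupture life at a new temperature."""
            return 10 ** (lmp / T_kelvin - C)

        # Illustrative: a 1000 h rupture test at 900 K extrapolated to 850 K service.
        lmp = larson_miller(900.0, 1000.0)
        print(f"LMP = {lmp:.0f}")
        print(f"predicted life at 850 K: {rupture_time(850.0, lmp):.0f} h")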

  11. Influences of molecular packing on the charge mobility of organic semiconductors: from quantum charge transfer rate theory beyond the first-order perturbation.

    PubMed

    Nan, Guangjun; Shi, Qiang; Shuai, Zhigang; Li, Zesheng

    2011-05-28

    The electronic coupling between adjacent molecules is an important parameter for the charge transport properties of organic semiconductors. In a previous paper, a semiclassical generalized nonadiabatic transition state theory was used to investigate the nonperturbative effect of the electronic coupling on the charge transport properties, but it is not applicable at low temperatures due to the presence of high-frequency modes from the intramolecular conjugated carbon-carbon stretching vibrations [G. J. Nan et al., J. Chem. Phys., 2009, 130, 024704]. In the present paper, we apply a quantum charge transfer rate formula based on the imaginary-time flux-flux correlation function without the weak electronic coupling approximation. The imaginary-time flux-flux correlation function is then expressed in terms of the vibrational-mode path average and is evaluated by the path integral approach. All parameters are computed by quantum chemical approaches, and the mobility is obtained by kinetic Monte Carlo simulation. We evaluate the intra-layer mobility of sexithiophene crystal structures in high- and low-temperature phases for a wide range of temperatures. In the case of strong coupling, the quantum charge transfer rates were found to be significantly smaller than those calculated using the weak electronic coupling approximation, which leads to reduced mobility especially at low temperatures. As a consequence, the mobility becomes less dependent on temperature when the molecular packing leads to strong electronic coupling in some charge transport directions. The temperature-independent charge mobility in organic thin-film transistors from experimental measurements may be explained by the present model with the grain boundaries considered. In addition, we point out that the widely used Marcus equation is invalid in calculating charge carrier transfer rates in sexithiophene crystals.
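    For reference, the weak-coupling Marcus rate that the abstract says breaks down at strong electronic coupling is k = (2*pi/hbar) * V**2 * (4*pi*lambda*kB*T)**(-1/2) * exp(-(dG + lambda)**2 / (4*lambda*kB*T)). A minimal sketch with illustrative parameter values (not taken from the paper):

        import math

        HBAR = 6.582119569e-16   # eV*s
        KB   = 8.617333262e-5    # eV/K

        def marcus_rate(V_eV, lam_eV, dG_eV, T=300.0):
            """Standard (weak-coupling, high-temperature) Marcus charge transfer
            rate, the first-order-perturbation limit that the paper's quantum
            rate formula goes beyond. Returns a rate in 1/s."""
            kbt = KB * T
            prefactor = (2.0 * math.pi / HBAR) * V_eV**2
            density = 1.0 / math.sqrt(4.0 * math.pi * lam_eV * kbt)
            activation = math.exp(-(dG_eV + lam_eV) ** 2 / (4.0 * lam_eV * kbt))
            return prefactor * density * activation

        # Illustrative values for a self-exchange hop (dG = 0) in an organic crystal.
        print(f"k = {marcus_rate(V_eV=0.05, lam_eV=0.3, dG_eV=0.0):.3e} s^-1")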

  12. Testing for PV Reliability (Presentation)

    SciTech Connect

    Kurtz, S.; Bansal, S.

    2014-09-01

    The DOE SUNSHOT workshop is seeking input from the community about PV reliability and how the DOE might address gaps in understanding. This presentation describes the types of testing that are needed for PV reliability and introduces a discussion to identify gaps in our understanding of PV reliability testing.

  13. Discrete Reliability Projection

    DTIC Science & Technology

    2014-12-01

    Defense, Handbook MIL-HDBK-189C, 2011; Hall, J. B., Methodology for Evaluating Reliability Growth Programs of Discrete Systems, Ph.D. thesis, University... [fragment of Hall's model, eq. (2.13): ...p̆_{k,i}] · [1 − (1 − θ̆_k) · (N_k · T)]^{k−m}] where m is the number of observed failure modes and d*_i estimates d_i (either based...

    Mode   Failures (N_i)   FEF (d*_i)
    1      1                0.95
    2      1                0.70
    3      1                0.90
    4      1                0.90
    5      4                0.95
    6      2                0.70
    7      1                0.80

    Using equations 2.1 and 2.2 we can calculate the failure

  14. 78 FR 14783 - Citizens Energy Task Force, Save Our Unique Lands (Complainants) v. Midwest Reliability...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-07

    ... Reliability Organization, Midwest Independent Transmission System Operator, Inc., Xcel Energy, Inc., Great River Energy, Dairyland Power Cooperative, Wisconsin Public Power Inc., (Respondents); Notice of... grid were not considered, and that instead of improving the reliability of the system, it...

  15. A Study of Birnbaum's Theory of the Relationship between the Constructs of Leadership and Organization as Depicted in His Higher Education Models of Organizational Functioning: A Contextual Leadership Paradigm for Higher Education

    ERIC Educational Resources Information Center

    Douglas, Pamela A.

    2013-01-01

    This quantitative, nonexperimental study used survey research design and nonparametric statistics to investigate Birnbaum's (1988) theory that there is a relationship between the constructs of leadership and organization, as depicted in his five higher education models of organizational functioning: bureaucratic, collegial, political,…

  16. Multiconformation, Density Functional Theory-Based pKa Prediction in Application to Large, Flexible Organic Molecules with Diverse Functional Groups.

    PubMed

    Bochevarov, Art D; Watson, Mark A; Greenwood, Jeremy R; Philipp, Dean M

    2016-12-13

    We consider the conformational flexibility of molecules and its implications for micro- and macro-pKa. The corresponding formulas are derived and discussed against the background of a comprehensive scientific and algorithmic description of the latest version of our computer program Jaguar pKa, a density functional theory-based pKa predictor, which is now capable of acting on multiple conformations explicitly. Jaguar pKa is essentially a complex computational workflow incorporating research and technologies from the fields of cheminformatics, molecular mechanics, quantum mechanics, and implicit solvation models. The workflow also makes use of automatically applied empirical corrections which account for the systematic errors resulting from the neglect of explicit solvent interactions in the algorithm's implicit solvent model. Applications of our program to large, flexible organic molecules representing several classes of functional groups are shown, with particular emphasis on drug-like molecules. It is demonstrated that a combination of aggressive conformational search and an explicit consideration of multiple conformations nearly eliminates the dependence of results on the initially chosen conformation. In certain cases this leads to unprecedented accuracy, which is sufficient for distinguishing stereoisomers that have slightly different pKa values. An application of Jaguar pKa to proton sponges, the pKa values of which are strongly influenced by steric effects, showcases the advantages that pKa predictors based on quantum mechanical calculations have over similar empirical programs.
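    The micro/macro distinction the abstract refers to can be made concrete with two standard relations: parallel deprotonation microstates combine as Ka_macro = sum(Ka_micro), and conformers combine through an ensemble free energy G = -RT ln sum_i exp(-G_i/RT). A minimal sketch of both (generic textbook formulas, not Jaguar pKa's actual implementation):

        import math

        def macro_pka_from_micro(micro_pkas):
            """Parallel deprotonation microstates combine as Ka_macro = sum(Ka_micro),
            so the macroscopic pKa lies below the lowest micro-pKa."""
            return -math.log10(sum(10.0 ** (-pka) for pka in micro_pkas))

        def ensemble_free_energy(energies_kcal, T=298.15):
            """Effective free energy of a conformer ensemble (kcal/mol),
            G = -RT ln sum_i exp(-G_i/RT), evaluated with a shift for stability."""
            RT = 0.0019872041 * T
            g0 = min(energies_kcal)
            return g0 - RT * math.log(sum(math.exp(-(e - g0) / RT) for e in energies_kcal))

        # Two deprotonation microstates one pKa unit apart:
        print(f"macro pKa = {macro_pka_from_micro([4.0, 5.0]):.2f}")            # ~3.96
        # Three conformers of the acid form (relative free energies, kcal/mol):
        print(f"G_ensemble = {ensemble_free_energy([0.0, 0.5, 1.2]):.2f} kcal/mol")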

  17. Benchmarking density functional theory predictions of framework structures and properties in a chemically diverse test set of metal-organic frameworks

    SciTech Connect

    Nazarian, Dalar; Ganesh, P.; Sholl, David S.

    2015-01-01

    We compiled a test set of chemically and topologically diverse Metal–Organic Frameworks (MOFs) with high-accuracy experimentally derived crystallographic structure data. The test set was used to benchmark the performance of Density Functional Theory (DFT) functionals (M06L, PBE, PW91, PBE-D2, PBE-D3, and vdW-DF2) for predicting lattice parameters, unit cell volume, bonded parameters, and pore descriptors. On average PBE-D2, PBE-D3, and vdW-DF2 predict more accurate structures, but all functionals predicted pore diameters within 0.5 Å of the experimental diameter for every MOF in the test set. The test set was also used to assess the variance in performance of DFT functionals for elastic properties and atomic partial charges. The DFT-predicted elastic properties such as minimum shear modulus and Young's modulus can differ by an average of 3 and 9 GPa for rigid MOFs such as those in the test set. Moreover, we find that the partial charges calculated by vdW-DF2 deviate the most from those of the other functionals, while there is no significant difference between the partial charges calculated by M06L, PBE, PW91, PBE-D2, and PBE-D3 for the MOFs in the test set. We find that while there are differences in the magnitude of the properties predicted by the various functionals, these discrepancies are small compared to the accuracy necessary for most practical applications.

  18. Chemical Applications of Graph Theory: Part II. Isomer Enumeration.

    ERIC Educational Resources Information Center

    Hansen, Peter J.; Jurs, Peter C.

    1988-01-01

    Discusses the use of graph theory to aid in the depiction of organic molecular structures. Gives a historical perspective of graph theory and explains graph theory terminology with organic examples. Lists applications of graph theory to current research projects. (ML)

  19. Inverse modelling of Köhler theory - Part 1: A response surface analysis of CCN spectra with respect to surface-active organic species

    NASA Astrophysics Data System (ADS)

    Lowe, Samuel; Partridge, Daniel; Topping, David; Stier, Philip

    2016-04-01

    partitioning process. The response surface sensitivity analysis identifies the accumulation mode concentration and surface tension to be the most sensitive parameters. The organic:inorganic mass ratio, insoluble fraction, solution ideality, and the mean diameter and geometric standard deviation of the accumulation mode showed significant sensitivity, while the chemical properties of the organic exhibited little sensitivity within parametric uncertainties. Parameters such as surface tension and solution ideality can introduce considerable parametric uncertainty to models and are therefore particularly good candidates for further parameter calibration studies. A complete treatment of bulk-surface partitioning is found to model CCN spectra similar to those calculated using classical Köhler Theory with the surface tension of a pure water drop, as found in traditional sensitivity analysis studies. In addition, the sensitivity of CCN spectra to perturbations in the partitioning parameters K and Γ was found to be negligible. As a result, this study supports previously held recommendations that complex surfactant effects might be neglected, and continued use of classical Köhler Theory in GCMs is recommended to avoid additional computational burden.
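    The surface-tension sensitivity flagged above is easy to see in the kappa-Köhler approximation for the critical supersaturation, s_c = exp(sqrt(4 A^3 / (27 kappa D_d^3))) - 1 with A = 4 sigma M_w / (R T rho_w). This is a textbook simplification, not the paper's full bulk-surface partitioning treatment, and the numbers below are illustrative:

        import math

        R_GAS = 8.314462618      # J/(mol*K)
        M_W   = 0.018015         # kg/mol
        RHO_W = 997.0            # kg/m^3

        def critical_supersaturation(dry_diameter_m, kappa, sigma=0.072, T=293.15):
            """Critical supersaturation (%) from kappa-Koehler theory. Lowering
            sigma (surfactants) lowers s_c, the sensitivity the response-surface
            analysis flags."""
            A = 4.0 * sigma * M_W / (R_GAS * T * RHO_W)
            sc = math.exp(math.sqrt(4.0 * A**3 / (27.0 * kappa * dry_diameter_m**3))) - 1.0
            return 100.0 * sc

        # 100 nm particle, kappa = 0.3, pure-water vs surfactant-reduced surface tension.
        for sigma in (0.072, 0.055):
            print(f"sigma={sigma}: s_c = {critical_supersaturation(100e-9, 0.3, sigma):.3f}%")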

  20. Adsorption of organic dyes on TiO2 surfaces in dye-sensitized solar cells: interplay of theory and experiment.

    PubMed

    Anselmi, Chiara; Mosconi, Edoardo; Pastore, Mariachiara; Ronca, Enrico; De Angelis, Filippo

    2012-12-14

    First-principles computer simulations can contribute to a deeper understanding of the dye/semiconductor interface lying at the heart of Dye-sensitized Solar Cells (DSCs). Here, we present the results of simulation of dye adsorption onto TiO(2) surfaces, and of their implications for the functioning of the corresponding solar cells. We propose an integrated strategy which combines FT-IR measurements with DFT calculations to individuate the energetically favorable TiO(2) adsorption mode of acetic acid, as a meaningful model for realistic organic dyes. Although we found a sizable variability in the relative stability of the considered adsorption modes with the model system and the method, a bridged bidentate structure was found to closely match the FT-IR frequency pattern, also being calculated as the most stable adsorption mode by calculations in solution. This adsorption mode was found to be the most stable binding also for realistic organic dyes bearing cyanoacrylic anchoring groups, while for a rhodanine-3-acetic acid anchoring group, an undissociated monodentate adsorption mode was found to be of comparable stability. The structural differences induced by the different anchoring groups were related to the different electron injection/recombination with oxidized dye properties which were experimentally assessed for the two classes of dyes. A stronger coupling and a possibly faster electron injection were also calculated for the bridged bidentate mode. We then investigated the adsorption mode and I(2) binding of prototype organic dyes. Car-Parrinello molecular dynamics and geometry optimizations were performed for two coumarin dyes differing by the length of the π-bridge separating the donor and acceptor moieties. We related the decreasing distance of the carbonylic oxygen from the titania to an increased I(2) concentration in proximity of the oxide surface, which might account for the different observed photovoltaic performances. The interplay between theory

  1. [Reliability and validity of the Japanese version of the Thinking Style Inventory].

    PubMed

    Ochiai, Jun; Maie, Yuko; Wada, Yuichi

    2016-06-01

    This study examined the internal and external validity of the Japanese version of the Thinking Styles Inventory (TSI: Hiruma, 2000), which was originally developed by Sternberg and Wagner (1991) based on the framework of Sternberg's (1988) theory of mental self-government. The term "thinking style" refers to the concept that individuals differ in how they organize, direct, and manage their own thinking activities. We administered the Japanese version of the TSI to Japanese participants (N = 655; age range 20-84 years). The results of item analysis, reliability analysis, and factor analysis were consistent with the general ideas of the theory. In addition, there were significant relationships between certain thinking styles and three participant characteristics: age, gender, and working arrangement. Furthermore, some thinking styles were positively correlated with social skill. Implications of these results for the nature of Japanese thinking styles are discussed.

  2. Reliability assessment of nuclear structural systems

    SciTech Connect

    Reich, M.; Hwang, H.

    1983-01-01

    Reliability assessment of nuclear structural systems has been receiving more emphasis over the last few years. This paper deals with the recent progress made by the Structural Analysis Division of Brookhaven National Laboratory (BNL) in the development of a probability-based reliability analysis methodology for safety evaluation of reactor containments and other seismic category I structures. An important feature of this methodology is the incorporation of finite element analysis and random vibration theory. By utilizing this method, it is possible to evaluate the safety of nuclear structures under various static and dynamic loads in terms of limit state probability. Progress in other related areas, such as the establishment of probabilistic characteristics for various loads and structural resistance, is also described. Results of an application of the methodology to a realistic reinforced concrete containment subjected to dead and live loads, accidental internal pressures, and earthquake ground accelerations are presented.

  3. Computational methods for efficient structural reliability and reliability sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.

    1993-01-01

    This paper presents recent developments in efficient structural reliability analysis methods. The paper proposes an efficient, adaptive importance sampling (AIS) method that can be used to compute reliability and reliability sensitivities. The AIS approach uses a sampling density that is proportional to the joint PDF of the random variables. Starting from an initial approximate failure domain, sampling proceeds adaptively and incrementally with the goal of reaching a sampling domain that is slightly greater than the failure domain to minimize over-sampling in the safe region. Several reliability sensitivity coefficients are proposed that can be computed directly and easily from the above AIS-based failure points. These probability sensitivities can be used for identifying key random variables and for adjusting design to achieve reliability-based objectives. The proposed AIS methodology is demonstrated using a turbine blade reliability analysis problem.
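    A simplified, non-adaptive illustration of the importance-sampling idea behind AIS: sample from a density recentred near an approximate most-probable failure point and reweight by the density ratio. The adaptive domain expansion and the sensitivity coefficients of the actual method are omitted, and the limit state is a standard benchmark chosen so the exact answer (Phi(-3), about 1.35e-3) is known:

        import numpy as np

        def importance_sample_pf(g, x_shift, n=20000, seed=0):
            """Estimate P[g(X) <= 0] for standard-normal X by sampling from a
            normal density recentred at x_shift and reweighting each sample by
            phi(x)/h(x); a sketch of the importance-sampling core of AIS."""
            rng = np.random.default_rng(seed)
            d = len(x_shift)
            x = rng.standard_normal((n, d)) + x_shift          # samples from h
            # log weight = log phi(x) - log h(x) for unit-variance Gaussians
            logw = -0.5 * np.sum(x**2, axis=1) + 0.5 * np.sum((x - x_shift)**2, axis=1)
            fail = np.array([g(xi) <= 0.0 for xi in x])
            return float(np.mean(np.exp(logw) * fail))

        # Limit state g(x) = 3 - (x1 + x2)/sqrt(2): exact pf = Phi(-3) ~ 1.35e-3.
        g = lambda x: 3.0 - (x[0] + x[1]) / np.sqrt(2.0)
        x_mpp = np.array([3.0, 3.0]) / np.sqrt(2.0)            # design point on g = 0
        print(f"pf ~ {importance_sample_pf(g, x_mpp):.2e}")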

  4. Extension of a noninteractive reliability model for ceramic matrix composites

    NASA Technical Reports Server (NTRS)

    Duffy, Stephen F.; Wetherhold, Robert C.; Jain, Lalit K.

    1990-01-01

    Developments in the processing of high temperature ceramic composites demand innovative and progressive design protocols that adequately predict the thermal and mechanical behavior of these materials. The focus here is the extension of a reliability model for orthotropic ceramic composites. The approach chosen to describe failures leads to a noninteractive formulation of reliability that is phenomenological. This particular criterion, which was constructed using tensorial invariant theory, allows six potential failure modes to emerge.

  5. High-Reliability Health Care: Getting There from Here

    PubMed Central

    Chassin, Mark R; Loeb, Jerod M

    2013-01-01

    Context Despite serious and widespread efforts to improve the quality of health care, many patients still suffer preventable harm every day. Hospitals find improvement difficult to sustain, and they suffer “project fatigue” because so many problems need attention. No hospitals or health systems have achieved consistent excellence throughout their institutions. High-reliability science is the study of organizations in industries like commercial aviation and nuclear power that operate under hazardous conditions while maintaining safety levels that are far better than those of health care. Adapting and applying the lessons of this science to health care offer the promise of enabling hospitals to reach levels of quality and safety that are comparable to those of the best high-reliability organizations. Methods We combined the Joint Commission's knowledge of health care organizations with knowledge from the published literature and from experts in high-reliability industries and leading safety scholars outside health care. We developed a conceptual and practical framework for assessing hospitals’ readiness for and progress toward high reliability. By iterative testing with hospital leaders, we refined the framework and, for each of its fourteen components, defined stages of maturity through which we believe hospitals must pass to reach high reliability. Findings We discovered that the ways that high-reliability organizations generate and maintain high levels of safety cannot be directly applied to today's hospitals. We defined a series of incremental changes that hospitals should undertake to progress toward high reliability. These changes involve the leadership's commitment to achieving zero patient harm, a fully functional culture of safety throughout the organization, and the widespread deployment of highly effective process improvement tools. Conclusions Hospitals can make substantial progress toward high reliability by undertaking several specific

  6. Cultural Issues in Organizations.

    ERIC Educational Resources Information Center

    1999

    This document contains four symposium papers on cultural issues in organizations. "Emotion Management and Organizational Functions: A Study of Action in a Not-for-Profit Organization" (Jamie Callahan Fabian) uses Hochschild's emotion systems theory and Parsons' social systems theory to explain why members of an organization managed their…

  7. Computerized life and reliability modelling for turboprop transmissions

    NASA Technical Reports Server (NTRS)

    Savage, M.; Radil, K. C.; Lewicki, D. G.; Coy, J. J.

    1988-01-01

    A generalized life and reliability model is presented for parallel shaft geared prop-fan and turboprop aircraft transmissions. The transmission life and reliability model is a combination of the individual reliability models for all the bearings and gears in the main load paths. The bearing and gear reliability models are based on classical fatigue theory and the two parameter Weibull failure distribution. A computer program was developed to calculate the transmission life and reliability. The program is modular. In its present form, the program can analyze five different transmission arrangements. However, the program can be modified easily to include additional transmission arrangements. An example is included which compares the life of a compound two-stage transmission with the life of a split-torque, parallel compound two-stage transmission, as calculated by the computer program.

  8. Computerized life and reliability modelling for turboprop transmissions

    NASA Technical Reports Server (NTRS)

    Savage, M.; Radil, K. C.; Lewicki, D. G.; Coy, J. J.

    1988-01-01

    A generalized life and reliability model is presented for parallel shaft geared prop-fan and turboprop aircraft transmissions. The transmission life and reliability model is a combination of the individual reliability models for all the bearings and gears in the main load paths. The bearing and gear reliability models are based on classical fatigue theory and the two parameter Weibull failure distribution. A computer program was developed to calculate the transmission life and reliability. The program is modular. In its present form, the program can analyze five different transmission arrangements. However, the program can be modified easily to include additional transmission arrangements. An example is included which compares the life of a compound two-stage transmission with the life of a split-torque, parallel compound two-stage transmission, as calculated by the computer program.
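    A minimal sketch of the series-system combination described above: each bearing and gear in the main load path gets a two-parameter Weibull reliability, and the transmission survives only if every component survives. The (eta, beta) pairs are invented for illustration, not taken from the paper:

        import math

        def weibull_reliability(t, eta, beta):
            """Two-parameter Weibull survival probability at life t."""
            return math.exp(-((t / eta) ** beta))

        def system_reliability(t, components):
            """Series system: the transmission survives only if every bearing
            and gear in the main load path survives (independence assumed)."""
            r = 1.0
            for eta, beta in components:
                r *= weibull_reliability(t, eta, beta)
            return r

        # Illustrative (eta in hours, beta) pairs for two bearings and two gears:
        parts = [(9000.0, 1.2), (9000.0, 1.2), (15000.0, 2.5), (15000.0, 2.5)]
        for t in (1000.0, 2000.0, 4000.0):
            print(f"R_sys({t:.0f} h) = {system_reliability(t, parts):.3f}")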

  9. Benchmarking density functional theory predictions of framework structures and properties in a chemically diverse test set of metal-organic frameworks

    DOE PAGES

    Nazarian, Dalar; Ganesh, P.; Sholl, David S.

    2015-01-01

    We compiled a test set of chemically and topologically diverse Metal–Organic Frameworks (MOFs) with high-accuracy experimentally derived crystallographic structure data. The test set was used to benchmark the performance of Density Functional Theory (DFT) functionals (M06L, PBE, PW91, PBE-D2, PBE-D3, and vdW-DF2) for predicting lattice parameters, unit cell volume, bonded parameters, and pore descriptors. On average PBE-D2, PBE-D3, and vdW-DF2 predict more accurate structures, but all functionals predicted pore diameters within 0.5 Å of the experimental diameter for every MOF in the test set. The test set was also used to assess the variance in performance of DFT functionals for elastic properties and atomic partial charges. The DFT-predicted elastic properties such as minimum shear modulus and Young's modulus can differ by an average of 3 and 9 GPa for rigid MOFs such as those in the test set. Moreover, we find that the partial charges calculated by vdW-DF2 deviate the most from those of the other functionals, while there is no significant difference between the partial charges calculated by M06L, PBE, PW91, PBE-D2, and PBE-D3 for the MOFs in the test set. We find that while there are differences in the magnitude of the properties predicted by the various functionals, these discrepancies are small compared to the accuracy necessary for most practical applications.

  10. Integrating theory, synthesis, spectroscopy and device efficiency to design and characterize donor materials for organic photovoltaics: a case study including 12 donors

    DOE PAGES

    Oosterhout, S. D.; Kopidakis, N.; Owczarczyk, Z. R.; ...

    2015-04-07

    Remarkable improvements in the power conversion efficiency of solution-processable Organic Photovoltaics (OPV) have largely been driven by the development of novel narrow bandgap copolymer donors comprising an electron-donating (D) and an electron-withdrawing (A) group within the repeat unit. The large pool of potential D and A units and the laborious processes of chemical synthesis and device optimization have made progress on new high-efficiency materials slow, with a few new efficient copolymers reported every year despite the large number of groups pursuing these materials. In our paper we present an integrated approach toward new narrow bandgap copolymers that uses theory to guide the selection of materials to be synthesized based on their predicted energy levels, and time-resolved microwave conductivity (TRMC) to select the best-performing copolymer–fullerene bulk heterojunction to be incorporated into complete OPV devices. We validate our methodology by using a diverse group of 12 copolymers, including new and literature materials, to demonstrate good correlation between (a) theoretically determined energy levels of polymers and experimentally determined ionization energies and electron affinities and (b) photoconductance, measured by TRMC, and OPV device performance. The materials used here also allow us to explore whether further copolymer design rules need to be incorporated into our methodology for materials selection. For example, we explore the effect of the enthalpy change (ΔH) during exciton dissociation on the efficiency of free charge carrier generation and device efficiency and find that ΔH of -0.4 eV is sufficient for efficient charge generation.

  11. Integrating theory, synthesis, spectroscopy and device efficiency to design and characterize donor materials for organic photovoltaics: a case study including 12 donors

    SciTech Connect

    Oosterhout, S. D.; Kopidakis, N.; Owczarczyk, Z. R.; Braunecker, W. A.; Larsen, R. E.; Ratcliff, E. L.; Olson, D. C.

    2015-04-07

    Remarkable improvements in the power conversion efficiency of solution-processable Organic Photovoltaics (OPV) have largely been driven by the development of novel narrow bandgap copolymer donors comprising an electron-donating (D) and an electron-withdrawing (A) group within the repeat unit. The large pool of potential D and A units and the laborious processes of chemical synthesis and device optimization have made progress on new high-efficiency materials slow, with a few new efficient copolymers reported every year despite the large number of groups pursuing these materials. In our paper we present an integrated approach toward new narrow bandgap copolymers that uses theory to guide the selection of materials to be synthesized based on their predicted energy levels, and time-resolved microwave conductivity (TRMC) to select the best-performing copolymer–fullerene bulk heterojunction to be incorporated into complete OPV devices. We validate our methodology by using a diverse group of 12 copolymers, including new and literature materials, to demonstrate good correlation between (a) theoretically determined energy levels of polymers and experimentally determined ionization energies and electron affinities and (b) photoconductance, measured by TRMC, and OPV device performance. The materials used here also allow us to explore whether further copolymer design rules need to be incorporated into our methodology for materials selection. For example, we explore the effect of the enthalpy change (ΔH) during exciton dissociation on the efficiency of free charge carrier generation and device efficiency and find that ΔH of -0.4 eV is sufficient for efficient charge generation.

  12. Reliability of Wireless Sensor Networks

    PubMed Central

    Dâmaso, Antônio; Rosa, Nelson; Maciel, Paulo

    2014-01-01

    Wireless Sensor Networks (WSNs) consist of hundreds or thousands of sensor nodes with limited processing, storage, and battery capabilities. There are several strategies to reduce the power consumption of WSN nodes (thereby increasing the network lifetime) and to increase the reliability of the network (thereby improving the WSN Quality of Service). However, there is an inherent conflict between power consumption and reliability: an increase in reliability usually leads to an increase in power consumption. For example, routing algorithms can send the same packet through different paths (multipath strategy), which is important for reliability but significantly increases the WSN power consumption. In this context, this paper proposes a model for evaluating the reliability of WSNs considering the battery level as a key factor. Moreover, this model is based on routing algorithms used by WSNs. In order to evaluate the proposed models, three scenarios were considered to show the impact of the power consumption on the reliability of WSNs. PMID:25157553
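    The reliability/power conflict described above can be made concrete: replicating a packet over k independent paths raises the delivery probability to 1 - prod(1 - R_i), while energy grows with the total number of hop transmissions. A minimal sketch with hypothetical path reliabilities and hop counts:

        def multipath_delivery_reliability(path_reliabilities):
            """A packet replicated over independent paths arrives unless every
            copy is lost: R = 1 - prod(1 - R_i)."""
            miss = 1.0
            for r in path_reliabilities:
                miss *= (1.0 - r)
            return 1.0 - miss

        def relative_energy_cost(path_hop_counts, energy_per_hop=1.0):
            """Energy grows with the total number of transmissions across all
            copies, which is the reliability/power conflict the model captures."""
            return energy_per_hop * sum(path_hop_counts)

        paths = [0.90, 0.85, 0.80]
        print(f"single path: R={paths[0]:.2f}, cost={relative_energy_cost([4]):.0f}")
        print(f"three paths: R={multipath_delivery_reliability(paths):.4f}, "
              f"cost={relative_energy_cost([4, 5, 6]):.0f}")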

  13. Nuclear weapon reliability evaluation methodology

    SciTech Connect

    Wright, D.L.

    1993-06-01

    This document provides an overview of those activities that are normally performed by Sandia National Laboratories to provide nuclear weapon reliability evaluations for the Department of Energy. These reliability evaluations are first provided as a prediction of the attainable stockpile reliability of a proposed weapon design. Stockpile reliability assessments are provided for each weapon type as the weapon is fielded and are continuously updated throughout the weapon stockpile life. The reliability predictions and assessments depend heavily on data from both laboratory simulation and actual flight tests. An important part of the methodology is the opportunities for review that occur throughout the entire process and assure a consistent approach and appropriate use of the data for reliability evaluation purposes.

  14. A fourth generation reliability predictor

    NASA Technical Reports Server (NTRS)

    Bavuso, Salvatore J.; Martensen, Anna L.

    1988-01-01

    A reliability/availability predictor computer program has been developed and is currently being beta-tested by over 30 US companies. The computer program is called the Hybrid Automated Reliability Predictor (HARP). HARP was developed to fill an important gap in reliability assessment capabilities. This gap was manifested through the use of its third-generation cousin, the Computer-Aided Reliability Estimation (CARE III) program, over a six-year development period and an additional three-year period during which CARE III has been in the public domain. The accumulated experience of the over 30 establishments now using CARE III was used in the development of the HARP program.

  15. Organic matter diagenesis as the key to a unifying theory for the genesis of tabular uranium-vanadium deposits in the Morrison Formation, Colorado Plateau

    USGS Publications Warehouse

    Hansley, P.L.; Spirakis, C.S.

    1992-01-01

    Interstitial, epigenetic amorphous organic matter is intimately associated with uranium in the Grants uranium region and is considered essential to genetic models for these deposits. In contrast, uranium minerals are intimately associated with authigenic vanadium chlorite and vanadium oxides in amorphous organic matter-poor ores of the Slick Rock and Henry Mountains mining districts and therefore, in some genetic models amorphous organic matter is not considered crucial to the formation of these deposits. Differences in organic matter content can be explained by recognizing that amorphous organic matter-poor deposits have been subjected to more advanced stages of diagenesis than amorphous organic matter-rich deposits. Evidence that amorphous organic matter was involved in the genesis of organic matter-poor, as well as organic matter-rich, deposits is described. -from Authors

  16. Identity theory and personality theory: mutual relevance.

    PubMed

    Stryker, Sheldon

    2007-12-01

    Some personality psychologists have found a structural symbolic interactionist frame and identity theory relevant to their work. This frame and theory, developed in sociology, are first reviewed. Emphasized in the review are a multiple identity conception of self, identities as internalized expectations derived from roles embedded in organized networks of social interaction, and a view of social structures as facilitators in bringing people into networks or constraints in keeping them out. Subsequently, attention turns to a discussion of the mutual relevance of structural symbolic interactionism/identity theory and personality theory, looking to extensions of the current literature on these topics.

  17. Descriptive Case Study of Theories of Action, Strategic Objectives, and Strategic Initiatives Used by California Female County Superintendents to Move Their Organizations from Current State to Desired Future

    ERIC Educational Resources Information Center

    Park, Valerie Darlene

    2014-01-01

    The purpose of this study was to describe the theories of action, strategic objectives, and strategic initiatives of school systems led by female county superintendents in California and examine their impact on improving system outcomes. Additionally, the factors influencing theory of action, strategic objective, and initiative development were…

  18. Fallacies in the Coordinated Management of Meaning: A Philosophy of Language Critique of the Hierarchical Organization of Coherent Conversation and Related Theory.

    ERIC Educational Resources Information Center

    Brenders, David A.

    1987-01-01

    Analyzes W. Barnett Pearce's "Coordinated Management of Meaning" theory--finding philosophical flaws and equivocations inherent in the model proposed within the theory. Argues that by making all the terms of their hierarchy conform to the notion of "episodic" communication, Pearce reintroduces basic errors about the nature of…

  19. Stirling Convertor Fasteners Reliability Quantification

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Korovaichuk, Igor; Kovacevich, Tiodor; Schreiber, Jeffrey G.

    2006-01-01

    Onboard Radioisotope Power Systems (RPS) being developed for NASA's deep-space science and exploration missions require reliable operation for up to 14 years and beyond. Stirling power conversion is a candidate for use in an RPS because it offers a multifold increase in the conversion efficiency of heat to electric power and a reduced inventory of radioactive material. Structural fasteners are responsible for maintaining the structural integrity of the Stirling power convertor, which is critical to ensure reliable performance during the entire mission. The design of fasteners involves variables related to the fabrication, manufacturing, behavior of fasteners and joining parts material, structural geometry of the joining components, size and spacing of fasteners, mission loads, boundary conditions, etc. These variables have inherent uncertainties, which need to be accounted for in the reliability assessment. This paper describes these uncertainties along with a methodology to quantify the reliability, and provides results of the analysis in terms of quantified reliability and sensitivity of Stirling power conversion reliability to the design variables. Quantification of the reliability includes both structural and functional aspects of the joining components. Based on the results, the paper also describes guidelines to improve the reliability and verification testing.

  20. The Reliability of Density Measurements.

    ERIC Educational Resources Information Center

    Crothers, Charles

    1978-01-01

    Data from a land-use study of small- and medium-sized towns in New Zealand are used to ascertain the relationship between official and effective density measures. It was found that the reliability of official measures of density is very low overall, although reliability increases with community size. (Author/RLV)

  1. Computer-Aided Reliability Estimation

    NASA Technical Reports Server (NTRS)

    Bavuso, S. J.; Stiffler, J. J.; Bryant, L. A.; Petersen, P. L.

    1986-01-01

    CARE III (Computer-Aided Reliability Estimation, Third Generation) helps estimate the reliability of complex, redundant, fault-tolerant systems. The program is specifically designed for the evaluation of fault-tolerant avionics systems. However, CARE III is general enough for use in the evaluation of other systems as well.

  2. The Validity of Reliability Measures.

    ERIC Educational Resources Information Center

    Seddon, G. M.

    1988-01-01

    Demonstrates that some commonly used indices can be misleading in their quantification of reliability. The effects are most pronounced on gain or difference scores. Proposals are made to avoid sources of invalidity by using a procedure to assess reliability in terms of upper and lower limits for the true scores of each examinee. (Author/JDH)

  3. Reliability in CMOS IC processing

    NASA Technical Reports Server (NTRS)

    Shreeve, R.; Ferrier, S.; Hall, D.; Wang, J.

    1990-01-01

    Critical CMOS IC processing reliability monitors are defined in this paper. These monitors are divided into three categories: process qualifications, ongoing production workcell monitors, and ongoing reliability monitors. The key measures in each of these categories are identified and prioritized based on their importance.

  4. Essay Reliability: Form and Meaning.

    ERIC Educational Resources Information Center

    Shale, Doug

    This study is an attempt at a cohesive characterization of the concept of essay reliability. As such, it takes as a basic premise that previous and current practices in reporting reliability estimates for essay tests have certain shortcomings. The study provides an analysis of these shortcomings--partly to encourage a fuller understanding of the…

  5. Reliable avionics design for deep space

    NASA Astrophysics Data System (ADS)

    Johnson, Stephen B.

    The technical and organizational problems posed by the Space Exploration Initiative (SEI) are discussed, and some possible solutions are examined. It is pointed out that SEI poses a whole new set of challenging problems in the design of reliable systems. These missions and their corresponding systems are far more complex than current systems. The initiative requires a set of vehicles and systems which must have very high levels of autonomy, reliability, and operability for long periods of time. It is emphasized that to achieve these goals in the face of great complexity, new technologies and organizational techniques will be necessary. It is noted that the key to a good design is good people. Not only must good people be found, but they must be placed in positions appropriate to their skills. It is argued that the atomistic and autocratic paradigm of vertical organizations must be replaced with more team-oriented and democratic structures.

  6. Reliability-based design optimization using efficient global reliability analysis.

    SciTech Connect

    Bichon, Barron J.; Mahadevan, Sankaran; Eldred, Michael Scott

    2010-05-01

    Finding the optimal (lightest, least expensive, etc.) design for an engineered component that meets or exceeds a specified level of reliability is a problem of obvious interest across a wide spectrum of engineering fields. Various methods for this reliability-based design optimization problem have been proposed. Unfortunately, this problem is rarely solved in practice because, regardless of the method used, solving the problem is too expensive or the final solution is too inaccurate to ensure that the reliability constraint is actually satisfied. This is especially true for engineering applications involving expensive, implicit, and possibly nonlinear performance functions (such as large finite element models). The Efficient Global Reliability Analysis method was recently introduced to improve both the accuracy and efficiency of reliability analysis for this type of performance function. This paper explores how this new reliability analysis method can be used in a design optimization context to create a method of sufficient accuracy and efficiency to enable the use of reliability-based design optimization as a practical design tool.
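
    The structure of the reliability-based design optimization problem can be seen in a toy sketch: minimize a cost proxy subject to a probability-of-failure constraint. The closed-form failure probability below stands in for the expensive reliability analysis that EGRA approximates with an adaptively refined surrogate; all numbers are illustrative assumptions.

        # Toy RBDO: choose capacity d to minimize cost (here, d itself) subject
        # to P[demand >= d] <= p_target. Demand ~ N(1.5, 0.3) is assumed; a real
        # problem would replace failure_prob with an expensive simulation, which
        # is where EGRA-style surrogate reliability analysis pays off.
        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import norm

        p_target = 1e-3

        def failure_prob(d):
            return 1.0 - norm.cdf((d - 1.5) / 0.3)

        res = minimize(lambda d: d[0], x0=[2.0], method="SLSQP",
                       constraints=[{"type": "ineq",
                                     "fun": lambda d: p_target - failure_prob(d[0])}])
        print(f"optimal capacity: {res.x[0]:.3f}, pf: {failure_prob(res.x[0]):.2e}")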

  7. Photovoltaic performance and reliability workshop

    SciTech Connect

    Mrig, L.

    1993-12-01

    This workshop was the sixth in a series of workshops sponsored by NREL/DOE under the general subject of photovoltaic testing and reliability during the period 1986-1993. PV performance and PV reliability are at least as important as PV cost, if not more so. In the US, PV manufacturers, DOE laboratories, electric utilities, and others are engaged in photovoltaic reliability research and testing. This group of researchers and others interested in the field were brought together to exchange technical knowledge and field experience in the evolving field of PV reliability. The papers presented here reflect this effort since the last workshop, held in September 1992. The topics covered include: cell and module characterization, module and system testing, durability and reliability, system field experience, and standards and codes.

  8. Statistical modeling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1992-01-01

    This working paper discusses the statistical simulation part of a controlled software development experiment being conducted under the direction of the System Validation Methods Branch, Information Systems Division, NASA Langley Research Center. The experiment uses guidance and control software (GCS) aboard a fictitious planetary landing spacecraft: real-time control software operating on a transient mission. Software execution is simulated to study the statistical aspects of reliability and other failure characteristics of the software during development, testing, and random usage. Quantification of software reliability is a major goal. Various reliability concepts are discussed. Experiments are described for performing simulations and collecting appropriate simulated software performance and failure data. This data is then used to make statistical inferences about the quality of the software development and verification processes as well as inferences about the reliability of software versions and reliability growth under random testing and debugging.
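
    One standard way to study reliability growth of the kind described here is to simulate failures from a simple model in which each remaining fault contributes a fixed hazard. The sketch below uses a Jelinski-Moranda-style process with made-up parameters; it is not the GCS experiment's actual simulation.

        # Jelinski-Moranda-style reliability growth: N residual faults, each
        # with per-fault hazard phi; failure gaps lengthen as faults are fixed.
        import numpy as np

        rng = np.random.default_rng(1)
        N, phi = 30, 0.02            # assumed fault count and per-fault hazard

        t = 0.0
        for k in range(N, 0, -1):
            gap = rng.exponential(1.0 / (phi * k))   # time to next failure
            t += gap
            print(f"failure {N - k + 1:2d} at t = {t:8.1f} (gap {gap:7.1f})")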

  9. Theories of autism.

    PubMed

    Levy, Florence

    2007-11-01

    The purpose of the present paper was to review psychological theories of autism, and to integrate these theories with neurobiological findings. Cognitive, theory of mind, language and coherence theories were identified, and briefly reviewed. Psychological theories were found not to account for the rigid/repetitive behaviours universally described in autistic subjects, and underlying neurobiological systems were identified. When the developing brain encounters constrained connectivity, it evolves an abnormal organization, the features of which may be best explained by a developmental failure of neural connectivity, where high local connectivity develops in tandem with low long-range connectivity, resulting in constricted repetitive behaviours.

  10. Reliability estimation procedures and CARE: The Computer-Aided Reliability Estimation Program

    NASA Technical Reports Server (NTRS)

    Mathur, F. P.

    1971-01-01

    Ultrareliable fault-tolerant onboard digital systems for spacecraft intended for long mission life exploration of the outer planets are under development. The design of systems involving self-repair and fault-tolerance leads to the companion problem of quantifying and evaluating the survival probability of the system for the mission under consideration and the constraints imposed upon the system. Methods have been developed to (1) model self-repair and fault-tolerant organizations; (2) compute survival probability, mean life, and many other reliability predictive functions with respect to various systems and mission parameters; (3) perform sensitivity analysis of the system with respect to mission parameters; and (4) quantitatively compare competitive fault-tolerant systems. Various measures of comparison are offered. To automate the procedures of reliability mathematical modeling and evaluation, the CARE (computer-aided reliability estimation) program was developed. CARE is an interactive program residing on the UNIVAC 1108 system, which makes the above calculations and facilitates report preparation by providing output in tabular form, graphical 2-dimensional plots, and 3-dimensional projections. The reliability estimation of fault-tolerant organization by means of the CARE program is described.
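
    A simple instance of the survival-probability calculations such tools automate is triple modular redundancy, where a voted stage survives if at least two of three modules do. Assuming independent modules with a constant failure rate (an illustrative assumption), R_TMR(t) = 3R(t)^2 - 2R(t)^3:

        # Survival probability of a triple-modular-redundant stage versus a
        # single (simplex) module; failure rate and mission times are assumed.
        import numpy as np

        lam = 1e-4                        # per-hour module failure rate
        t = np.array([1e3, 1e4, 5e4])     # mission times, hours
        R = np.exp(-lam * t)
        R_tmr = 3 * R**2 - 2 * R**3
        for ti, r, rt in zip(t, R, R_tmr):
            print(f"t = {ti:7.0f} h   simplex R = {r:.4f}   TMR R = {rt:.4f}")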

  11. On the robustness of a Bayes estimate. [in reliability theory

    NASA Technical Reports Server (NTRS)

    Canavos, G. C.

    1974-01-01

    This paper examines the robustness of a Bayes estimator with respect to the assigned prior distribution. A Bayesian analysis for a stochastic scale parameter of a Weibull failure model is summarized in which the natural conjugate is assigned as the prior distribution of the random parameter. The sensitivity analysis is carried out by the Monte Carlo method in which, although an inverted gamma is the assigned prior, realizations are generated using distribution functions of varying shape. For several distributional forms and even for some fixed values of the parameter, simulated mean squared errors of Bayes and minimum variance unbiased estimators are determined and compared. Results indicate that the Bayes estimator remains squared-error superior and appears to be largely robust to the form of the assigned prior distribution.
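
    The flavor of the robustness experiment can be reproduced for the exponential special case (a Weibull with unit shape): the Bayes rule assumes an inverted-gamma prior, but the true scales are drawn from a lognormal instead, and the simulated mean squared errors of the Bayes and minimum-variance unbiased estimators are compared. The hyperparameters and sample sizes below are assumptions for illustration.

        # Robustness-of-Bayes sketch for an exponential scale parameter theta:
        # the estimator assumes an IG(a, b) prior; truth comes from a lognormal.
        import numpy as np

        rng = np.random.default_rng(2)
        a, b, n, reps = 3.0, 4.0, 10, 20_000

        mse_bayes = mse_mvu = 0.0
        for _ in range(reps):
            theta = rng.lognormal(mean=0.6, sigma=0.4)   # "wrong" prior for truth
            x = rng.exponential(theta, size=n)
            bayes = (b + x.sum()) / (a + n - 1)          # posterior mean under IG
            mvu = x.mean()                               # MVU estimator
            mse_bayes += (bayes - theta) ** 2
            mse_mvu += (mvu - theta) ** 2

        print("MSE Bayes:", mse_bayes / reps, "  MSE MVU:", mse_mvu / reps)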

  12. Crystalline-silicon reliability lessons for thin-film modules

    NASA Technical Reports Server (NTRS)

    Ross, Ronald G., Jr.

    1985-01-01

    Key reliability and engineering lessons learned from the 10-year history of the Jet Propulsion Laboratory's Flat-Plate Solar Array Project are presented and analyzed. Particular emphasis is placed on lessons applicable to the evolving new thin-film cell and module technologies and the organizations involved with these technologies. The user-specific demand for reliability is a strong function of the application, its location, and its expected duration. Lessons relative to effective means of specifying reliability are described, and commonly used test requirements are assessed from the standpoint of which are the most troublesome to pass, and which correlate best with field experience. Module design lessons are also summarized, including the significance of the most frequently encountered failure mechanisms and the role of encapsulant and cell reliability in determining module reliability. Lessons pertaining to research, design, and test approaches include the historical role and usefulness of qualification tests and field tests.

  13. A reliable multicast for XTP

    NASA Technical Reports Server (NTRS)

    Dempsey, Bert J.; Weaver, Alfred C.

    1990-01-01

    Multicast services needed for current distributed applications on LANs fall generally into one of three categories: datagram, semi-reliable, and reliable. Transport layer multicast datagrams represent unreliable service in which the transmitting context 'fires and forgets'. XTP executes these semantics when the MULTI and NOERR mode bits are both set. Distributing sensor data and other applications in which application-level error recovery strategies are appropriate benefit from the efficiency in multidestination delivery offered by datagram service. Semi-reliable service refers to multicasting in which the control algorithms of the transport layer--error, flow, and rate control--are used in transferring the multicast distribution to the set of receiving contexts, the multicast group. The multicast defined in XTP provides semi-reliable service. Since, under a semi-reliable service, joining a multicast group means listening on the group address and entails no coordination with other members, a semi-reliable facility can be used for communication between a client and a server group as well as for true peer-to-peer group communication. Resource location in a LAN is an important application domain. The term 'semi-reliable' refers to the fact that group membership changes go undetected. No attempt is made to assess the current membership of the group at any time--before, during, or after--the data transfer.
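
    The "fire and forget" datagram service has a direct analogue in ordinary UDP multicast, sketched below with Python's standard socket API; the group address, port, and payload are arbitrary placeholders.

        # Unreliable multicast datagram: no acknowledgments, no recovery;
        # receivers simply listen on the group address.
        import socket

        GROUP, PORT = "239.1.2.3", 5007   # administratively scoped group (assumed)

        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)
        sock.sendto(b"sensor reading 42", (GROUP, PORT))
        sock.close()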

  14. Calculating system reliability with SRFYDO

    SciTech Connect

    Morzinski, Jerome; Anderson - Cook, Christine M; Klamann, Richard M

    2010-01-01

    SRFYDO is a process for estimating reliability of complex systems. Using information from all applicable sources, including full-system (flight) data, component test data, and expert (engineering) judgment, SRFYDO produces reliability estimates and predictions. It is appropriate for series systems with possibly several versions of the system which share some common components. It models reliability as a function of age and up to 2 other lifecycle (usage) covariates. Initial output from its Exploratory Data Analysis mode consists of plots and numerical summaries so that the user can check data entry and model assumptions, and help determine a final form for the system model. The System Reliability mode runs a complete reliability calculation using Bayesian methodology. This mode produces results that estimate reliability at the component, sub-system, and system level. The results include estimates of uncertainty, and can predict reliability at some not-too-distant time in the future. This paper presents an overview of the underlying statistical model for the analysis, discusses model assumptions, and demonstrates usage of SRFYDO.
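
    For a series system, the roll-up step itself is simple: the system works only if every component works, so sampled component reliabilities multiply. The sketch below mimics that step with beta posteriors from made-up pass/fail test counts; SRFYDO's actual model (age and usage covariates, expert judgment) is richer.

        # Bayesian series-system roll-up: sample each component's reliability
        # from a Beta posterior (uniform prior), multiply, summarize.
        # Test counts are illustrative.
        import numpy as np

        rng = np.random.default_rng(3)
        tests = [(98, 100), (45, 47), (29, 30)]   # (successes, trials)

        draws = np.ones(50_000)
        for s, t in tests:
            draws *= rng.beta(s + 1, t - s + 1, size=50_000)

        lo, med, hi = np.percentile(draws, [5, 50, 95])
        print(f"system reliability ~ {med:.3f} (90% interval {lo:.3f}-{hi:.3f})")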

  15. Fatigue Reliability of Gas Turbine Engine Structures

    NASA Technical Reports Server (NTRS)

    Cruse, Thomas A.; Mahadevan, Sankaran; Tryon, Robert G.

    1997-01-01

    The results of an investigation are described for fatigue reliability in engine structures. The description consists of two parts. Part 1 is for method development. Part 2 is a specific case study. In Part 1, the essential concepts and practical approaches to damage tolerance design in the gas turbine industry are summarized. These have evolved over the years in response to flight safety certification requirements. The effect of Non-Destructive Evaluation (NDE) methods on these approaches is also reviewed. Assessment methods based on probabilistic fracture mechanics, with regard to both crack initiation and crack growth, are outlined. Limit state modeling techniques from structural reliability theory are shown to be appropriate for application to this problem, for both individual failure mode and system-level assessment. In Part 2, the results of a case study for the high pressure turbine of a turboprop engine are described. The response surface approach is used to construct a fatigue performance function. This performance function is used with the First Order Reliability Method (FORM) to determine the probability of failure and the sensitivity of the fatigue life to the engine parameters for the first stage disk rim of the two stage turbine. A hybrid combination of regression and Monte Carlo simulation is used to incorporate time-dependent random variables. System reliability is used to determine the system probability of failure, and the sensitivity of the system fatigue life to the engine parameters of the high pressure turbine. The variation in the primary hot gas and secondary cooling air, the uncertainty of the complex mission loading, and the scatter in the material data are considered.
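
    The FORM step mentioned here has a compact generic form: transform the random variables to standard normal space, find the most probable point on the failure surface, and take pf ~ Phi(-beta), with beta the distance of that point from the origin. The limit state below is a stand-in for the response-surface fatigue performance function, with assumed coefficients.

        # First Order Reliability Method sketch: beta is the distance from the
        # origin to the nearest point of the failure surface g(u) = 0 in
        # standard normal space; the quadratic g below is illustrative only.
        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import norm

        def g(u):
            return 3.0 + 0.5 * u[0] - 1.2 * u[1] - 0.1 * u[0] * u[1]

        res = minimize(lambda u: np.dot(u, u), x0=np.zeros(2), method="SLSQP",
                       constraints=[{"type": "eq", "fun": g}])
        beta = np.sqrt(res.fun)
        print(f"beta = {beta:.3f}, pf ~ {norm.cdf(-beta):.2e}")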

  16. Monte Carlo Approach for Reliability Estimations in Generalizability Studies.

    ERIC Educational Resources Information Center

    Dimitrov, Dimiter M.

    A Monte Carlo approach is proposed, using the Statistical Analysis System (SAS) programming language, for estimating reliability coefficients in generalizability theory studies. Test scores are generated by a probabilistic model that considers the probability for a person with a given ability score to answer an item with a given difficulty…
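
    The generation step the abstract describes amounts to drawing 0/1 responses from an item response model. A minimal version (a Rasch model feeding a KR-20/alpha estimate, written in Python rather than SAS, with assumed person and item distributions) looks like this:

        # Simulate dichotomous scores from a Rasch model, then compute the
        # internal-consistency reliability (coefficient alpha; the KR-20
        # analogue for 0/1 items).
        import numpy as np

        rng = np.random.default_rng(4)
        persons, items = 500, 20
        ability = rng.normal(0, 1, size=(persons, 1))
        difficulty = rng.normal(0, 1, size=(1, items))

        p = 1 / (1 + np.exp(-(ability - difficulty)))
        scores = (rng.random((persons, items)) < p).astype(float)

        k = items
        alpha = k / (k - 1) * (1 - scores.var(axis=0, ddof=1).sum()
                               / scores.sum(axis=1).var(ddof=1))
        print(f"simulated reliability: {alpha:.3f}")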

  17. Integrated modular engine - Reliability assessment

    NASA Astrophysics Data System (ADS)

    Parsley, R. C.; Ward, T. B.

    1992-07-01

    A major driver in the increased interest in integrated modular engine configurations is the desire for ultra reliability for future rocket propulsion systems. The concept of configuring multiple sets of turbomachinery networked to multiple thrust chamber assemblies has been identified as an approach with potential to achieve significant reliability enhancement. This paper summarizes the results of a reliability study comparing networked systems vs. discrete engine installations, both with and without major module and engine redundancy. The study was conducted for gas generator, expander, and staged combustion cycles. The results are representative of either booster or upper-stage applications and are indicative of either plug or nonplug installation philosophies.
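
    The arithmetic behind the networking argument is k-of-n redundancy: a networked system survives if any k of its n turbomachinery sets remain operable, instead of requiring every discrete engine to work. A toy comparison with an assumed module reliability:

        # Discrete engines (series) vs a networked k-of-n arrangement.
        from math import comb

        def k_of_n(k, n, r):
            # P(at least k of n independent modules survive)
            return sum(comb(n, j) * r**j * (1 - r)**(n - j)
                       for j in range(k, n + 1))

        r = 0.98   # single turbomachinery-set reliability (assumed)
        print("4 discrete engines, all required:", round(r**4, 4))
        print("networked, any 4 of 5 sufficient:", round(k_of_n(4, 5, r), 4))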

  18. Aerospace reliability applied to biomedicine.

    NASA Technical Reports Server (NTRS)

    Lalli, V. R.; Vargo, D. J.

    1972-01-01

    An analysis is presented that indicates that the reliability and quality assurance methodology selected by NASA to minimize failures in aerospace equipment can be applied directly to biomedical devices to improve hospital equipment reliability. The Space Electric Rocket Test project is used as an example of NASA application of reliability and quality assurance (R&QA) methods. By analogy a comparison is made to show how these same methods can be used in the development of transducers, instrumentation, and complex systems for use in medicine.

  19. Reliability of tactile tools for pain assessment in blind athletes.

    PubMed

    Leite, Ana Claudia de Souza; Pagliuca, Lorita M Freitag; Almeida, Paulo Cesar P; Dallaire, Clemence C

    2008-06-01

    Health professionals have numerous visual and reporting scales at their disposal to assess pain. In recent years new tactile tools have been created (Pain Texture Scale and Tactile Pain Scale). This study validates these scales compared with the Numerical Rating Scale in 36 blind athletes who were assessed before and after competitions in the World Paralympics Games organized by the International Blind Sports Federation (IBSA) in Quebec, Canada. The reliability of these scales was analyzed through the intraclass correlation coefficient. Results showed good reliability for the Tactile Pain Scale and satisfactory reliability for the Pain Texture Scale.

  20. Progress in string theory

    NASA Astrophysics Data System (ADS)

    Maldacena, Juan Martín

    D-Branes on Calabi-Yau manifolds / Paul S. Aspinwall -- Lectures on AdS/CFT / Juan M. Maldacena -- Tachyon dynamics in open string theory / Ashoke Sen -- TASI/PITP/ISS lectures on moduli and microphysics / Eva Silverstein -- The duality cascade / Matthew J. Strassler -- Perturbative computations in string field theory / Washington Taylor -- Student seminars -- Student participants -- Lecturers, directors, and local organizing committee.

  1. Human- and computer-accessible 2D correlation data for a more reliable structure determination of organic compounds. Future roles of researchers, software developers, spectrometer managers, journal editors, reviewers, publisher and database managers toward artificial-intelligence analysis of NMR spectra.

    PubMed

    Jeannerat, Damien

    2017-01-01

    The introduction of a universal data format to report the correlation data of 2D NMR spectra such as COSY, HSQC and HMBC spectra will have a large impact on the reliability of structure determination of small organic molecules. These lists of assigned cross peaks will bridge signals found in NMR 1D and 2D spectra and the assigned chemical structure. The record could be very compact, human- and computer-readable, so that it can be included in the supplementary material of publications and easily transferred into databases of scientific literature and chemical compounds. The records will allow authors, reviewers and future users to test the consistency and, in favorable situations, the uniqueness of the assignment of the correlation data to the associated chemical structures. Ideally, the data format of the correlation data should include direct links to the NMR spectra to make it possible to validate their reliability and allow direct comparison of spectra. In order to realize their full potential, the correlation data and the NMR spectra should therefore follow any manuscript in the review process and be stored in an open-access database after publication. Keeping all NMR spectra, correlation data and assigned structures together at all times will allow the future development of validation tools, increasing the reliability of past and future NMR data. This will facilitate the development of artificial-intelligence analysis of NMR spectra by providing a source of data that can be used efficiently because they have been validated or can be validated by future users.

  2. The Assessment of Reliability Under Range Restriction: A Comparison of [Alpha], [Omega], and Test-Retest Reliability for Dichotomous Data

    ERIC Educational Resources Information Center

    Fife, Dustin A.; Mendoza, Jorge L.; Terry, Robert

    2012-01-01

    Though much research and attention has been directed at assessing the correlation coefficient under range restriction, the assessment of reliability under range restriction has been largely ignored. This article uses item response theory to simulate dichotomous item-level data to assess the robustness of KR-20 ([alpha]), [omega], and test-retest…
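
    A small simulation makes the range-restriction effect visible: computing KR-20 on a full sample and again on a subsample truncated at the median ability shrinks the true-score variance and deflates the coefficient. All generating parameters below are assumptions.

        # KR-20 on a full simulated sample vs an ability-restricted subsample.
        import numpy as np

        rng = np.random.default_rng(5)

        def kr20(scores):
            k = scores.shape[1]
            p = scores.mean(axis=0)
            return k / (k - 1) * (1 - (p * (1 - p)).sum()
                                  / scores.sum(axis=1).var(ddof=1))

        theta = rng.normal(0, 1, size=(2000, 1))
        b = rng.normal(0, 1, size=(1, 25))
        scores = (rng.random((2000, 25))
                  < 1 / (1 + np.exp(-(theta - b)))).astype(float)

        keep = theta.ravel() > 0           # keep only the top half of abilities
        print("KR-20, full sample:", round(kr20(scores), 3))
        print("KR-20, restricted: ", round(kr20(scores[keep]), 3))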

  3. How Reliable Is Laboratory Testing?

    MedlinePlus

    ... but is constantly monitored for reliability through comprehensive quality control and quality assurance procedures. Therefore, when your blood ...

  4. Accelerator Availability and Reliability Issues

    SciTech Connect

    Steve Suhring

    2003-05-01

    Maintaining reliable machine operations for existing machines as well as planning for future machines' operability present significant challenges to those responsible for system performance and improvement. Changes to machine requirements and beam specifications often reduce overall machine availability in an effort to meet user needs. Accelerator reliability issues from around the world will be presented, followed by a discussion of the major factors influencing machine availability.

  5. Reliability Validation and Improvement Framework

    DTIC Science & Technology

    2012-11-01

    discover and remove bugs using various test coverage metrics to determine test sufficiency. Failure-probability density function based on code metrics ... Coverage Metrics: traditional reliability engineering has focused on fault density and reliability growth as key metrics. These are statistical ... abs_all.jsp?arnumber=781027 [Kwiatkowska 2010] Kwiatkowska, M., Norman, G., & Parker, D. "Advances and Challenges of Probabilistic Model Checking"

  6. Lifetime Reliability Prediction of Ceramic Structures Under Transient Thermomechanical Loads

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Jadaan, Osama J.; Gyekenyesi, John P.

    2005-01-01

    An analytical methodology is developed to predict the probability of survival (reliability) of ceramic components subjected to harsh thermomechanical loads that can vary with time (transient reliability analysis). This capability enables more accurate prediction of ceramic component integrity against fracture in situations such as turbine startup and shutdown, operational vibrations, atmospheric reentry, or other rapid heating or cooling situations (thermal shock). The transient reliability analysis methodology developed herein incorporates the following features: fast-fracture transient analysis (reliability analysis without slow crack growth, SCG); transient analysis with SCG (reliability analysis with time-dependent damage due to SCG); a computationally efficient algorithm to compute the reliability for components subjected to repeated transient loading (block loading); cyclic fatigue modeling using a combined SCG and Walker fatigue law; proof testing for transient loads; and Weibull and fatigue parameters that are allowed to vary with temperature or time. Component-to-component variation in strength (stochastic strength response) is accounted for with the Weibull distribution, and either the principle of independent action or the Batdorf theory is used to predict the effect of multiaxial stresses on reliability. The reliability analysis can be performed either as a function of the component surface (for surface-distributed flaws) or component volume (for volume-distributed flaws). The transient reliability analysis capability has been added to the NASA CARES/Life (Ceramic Analysis and Reliability Evaluation of Structures/Life) code. CARES/Life was also updated to interface with commercially available finite element analysis software, such as ANSYS, when used to model the effects of transient load histories. Examples are provided to demonstrate the features of the methodology as implemented in the CARES/Life program.
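
    The Weibull machinery at the core of such codes is compact. For volume-distributed flaws under a uniform uniaxial stress, the two-parameter form gives R = exp(-V (sigma/sigma0)^m); the sketch below evaluates it with an assumed Weibull modulus and scale, and the stressed volume normalized to one.

        # Two-parameter Weibull reliability for volume flaws, single element,
        # uniform uniaxial stress. Parameters are illustrative.
        import numpy as np

        m, sigma0 = 10.0, 350.0                   # Weibull modulus, scale (MPa)
        V = 1.0                                   # stressed volume (normalized)
        sigma = np.array([250.0, 300.0, 350.0])   # applied stress, MPa

        R = np.exp(-V * (sigma / sigma0) ** m)
        for s, r in zip(sigma, R):
            print(f"sigma = {s:5.1f} MPa  ->  R = {r:.4f}")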

  7. Robust fusion with reliabilities weights

    NASA Astrophysics Data System (ADS)

    Grandin, Jean-Francois; Marques, Miguel

    2002-03-01

    Reliability is a measure of the degree of trust in a given measurement. We analyze and compare: ML (classical Maximum Likelihood), MLE (Maximum Likelihood weighted by Entropy), MLR (Maximum Likelihood weighted by Reliability), MLRE (Maximum Likelihood weighted by Reliability and Entropy), DS (Credibility Plausibility), and DSR (DS weighted by reliabilities). The analysis is based on a model of a dynamical fusion process composed of three sensors, each of which has its own discriminatory capacity, reliability rate, unknown bias, and measurement noise. The knowledge of uncertainties is also severely corrupted, in order to analyze the robustness of the different fusion operators. Two sensor models are used: the first type of sensor is able to estimate the probability of each elementary hypothesis (probabilistic masses); the second type delivers masses on unions of elementary hypotheses (DS masses). In the second case, probabilistic reasoning leads to improperly sharing the mass among elementary hypotheses. Compared to classical ML or DS, which achieve just 50% correct classification in some experiments, DSR, MLE, MLR and MLRE show very good performance in all experiments (more than an 80% correct classification rate). The experiments were performed with large variations of the reliability coefficients for each sensor (from 0 to 1), and with large variations in the knowledge of these coefficients (from 0 to 0.8). All four operators show good robustness, but MLR proves to be uniformly dominant across the experiments in the Bayesian case and achieves the best mean performance under incomplete a priori information.
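
    The MLR idea reduces to exponent weighting: each sensor's log-likelihood enters the fusion sum scaled by its reliability coefficient, so untrusted sensors contribute less. A toy single-report sketch with invented probability vectors and reliabilities:

        # Classical ML fusion vs reliability-weighted ML (MLR-style).
        import numpy as np

        hyps = ["H1", "H2", "H3"]
        sensor_probs = np.array([
            [0.70, 0.20, 0.10],   # reliable sensor
            [0.05, 0.90, 0.05],   # unreliable, strongly biased sensor
            [0.60, 0.25, 0.15],   # fairly reliable sensor
        ])
        reliability = np.array([0.9, 0.2, 0.7])

        ml = np.log(sensor_probs).sum(axis=0)
        mlr = (reliability[:, None] * np.log(sensor_probs)).sum(axis=0)
        print("ML  picks:", hyps[int(np.argmax(ml))])    # biased sensor wins: H2
        print("MLR picks:", hyps[int(np.argmax(mlr))])   # downweighted: H1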

  8. MEMS reliability: coming of age

    NASA Astrophysics Data System (ADS)

    Douglass, Michael R.

    2008-02-01

    In today's high-volume semiconductor world, one could easily take reliability for granted. As the MOEMS/MEMS industry continues to establish itself as a viable alternative to conventional manufacturing in the macro world, reliability can be of high concern. Currently, there are several emerging market opportunities in which MOEMS/MEMS is gaining a foothold. Markets such as mobile media, consumer electronics, biomedical devices, and homeland security are all showing great interest in microfabricated products. At the same time, these markets are among the most demanding when it comes to reliability assurance. To be successful, each company developing a MOEMS/MEMS device must consider reliability on an equal footing with cost, performance and manufacturability. What can this maturing industry learn from the successful development of DLP technology, air bag accelerometers and inkjet printheads? This paper discusses some basic reliability principles which any MOEMS/MEMS device development must use. Examples from the commercially successful and highly reliable Digital Micromirror Device complement the discussion.

  9. Reliability Assessment of Graphite Specimens under Multiaxial Stresses

    NASA Technical Reports Server (NTRS)

    Sookdeo, Steven; Nemeth, Noel N.; Bratton, Robert L.

    2008-01-01

    An investigation was conducted to predict the failure strength response of IG-100 nuclear grade graphite exposed to multiaxial stresses. As part of this effort, a review of failure criteria accounting for the stochastic strength response is provided. The experimental work was performed in the early 1990s at the Oak Ridge National Laboratory (ORNL) on hollow graphite tubes under the action of axial tensile loading and internal pressurization. As part of the investigation, finite-element analysis (FEA) was performed and compared with results of FEA from the original ORNL report. The new analysis generally compared well with the original analysis, although some discrepancies in the location of peak stresses were noted. The Ceramics Analysis and Reliability Evaluation of Structures Life prediction code (CARES/Life) was used with the FEA results to predict the quadrant I (tensile-tensile) and quadrant IV (compression-tension) strength response of the graphite tubes for the principle of independent action (PIA), the Weibull normal stress averaging (NSA), and the Batdorf multiaxial failure theories. The CARES/Life reliability analysis showed that all three failure theories gave similar results in quadrant I but that in quadrant IV, the PIA and Weibull normal stress-averaging theories were not conservative, whereas the Batdorf theory was able to correlate well with experimental results. The conclusion of the study was that the Batdorf theory should generally be used to predict the reliability response of graphite and brittle materials in multiaxial loading situations.

  10. Quantitative metal magnetic memory reliability modeling for welded joints

    NASA Astrophysics Data System (ADS)

    Xing, Haiyan; Dang, Yongbin; Wang, Ben; Leng, Jiancheng

    2016-03-01

    Metal magnetic memory (MMM) testing has been widely used to inspect welded joints. However, load levels, environmental magnetic fields, and measurement noise make MMM data dispersive and complicate quantitative evaluation. In order to promote the development of quantitative MMM reliability assessment, a new MMM model is presented for welded joints. Steel Q235 welded specimens are tested along longitudinal and horizontal lines with a TSC-2M-8 instrument in tensile fatigue experiments. X-ray testing is carried out synchronously to verify the MMM results. It is found that MMM testing can detect hidden cracks earlier than X-ray testing. Moreover, the MMM gradient vector sum K_vs is sensitive to the damage degree, especially at the early and hidden damage stages. Considering the dispersion of MMM data, the statistical law of K_vs is investigated, showing that K_vs obeys a Gaussian distribution; K_vs is therefore a suitable MMM parameter on which to build a reliability model of welded joints. Finally, an original quantitative MMM reliability model is presented, based on improved stress-strength interference theory. It is shown that the reliability degree R gradually decreases as the residual life ratio T decreases, and that the maximal error between the prediction reliability degree R_1 and the verification reliability degree R_2 is 9.15%. The method provides a novel tool for reliability testing and evaluation of welded joints in practical engineering.
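
    Stress-strength interference reduces to a closed form when both stress and strength are taken as Gaussian: R = Phi((mu_strength - mu_stress) / sqrt(sd_strength^2 + sd_stress^2)). A sketch with assumed weld numbers:

        # Stress-strength interference with normal stress and strength.
        from math import sqrt
        from statistics import NormalDist

        mu_c, sd_c = 420.0, 30.0   # weld strength, MPa (assumed)
        mu_s, sd_s = 300.0, 45.0   # service stress, MPa (assumed)

        R = NormalDist().cdf((mu_c - mu_s) / sqrt(sd_c**2 + sd_s**2))
        print(f"reliability degree R = {R:.4f}")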

  11. Environmental education curriculum evaluation questionnaire: A reliability and validity study

    NASA Astrophysics Data System (ADS)

    Minner, Daphne Diane

    The intention of this research project was to bridge the gap between social science research and application to the environmental domain through the development of a theoretically derived instrument designed to give educators a template by which to evaluate environmental education curricula. The theoretical base for instrument development was provided by several developmental theories such as Piaget's theory of cognitive development, Developmental Systems Theory, Life-span Perspective, as well as curriculum research within the area of environmental education. This theoretical base fueled the generation of a list of components which were then translated into a questionnaire with specific questions relevant to the environmental education domain. The specific research question for this project is: Can a valid assessment instrument based largely on human development and education theory be developed that reliably discriminates high, moderate, and low quality in environmental education curricula? The types of analyses conducted to answer this question were interrater reliability (percent agreement, Cohen's Kappa coefficient, Pearson's Product-Moment correlation coefficient), test-retest reliability (percent agreement, correlation), and criterion-related validity (correlation). Face validity and content validity were also assessed through thorough reviews. Overall results indicate that 29% of the questions on the questionnaire demonstrated a high level of interrater reliability and 43% of the questions demonstrated a moderate level of interrater reliability. Seventy-one percent of the questions demonstrated a high test-retest reliability and 5% a moderate level. Fifty-five percent of the questions on the questionnaire were reliable (high or moderate) both across time and raters. Only eight questions (8%) did not show either interrater or test-retest reliability. The global overall rating of high, medium, or low quality was reliable across both coders and time, indicating

  12. Some Clinical Diagnoses are More Reliable than Others

    DTIC Science & Technology

    1989-03-29

    ... a more reliable measure than diagnostic type (e.g., schizophrenia versus personality disorder). Diagnostic type, in turn, was a more reliable ... diagnoses included the following diagnostic types: 1) organic psychoses, 2) schizophrenia, 3) affective psychoses, 4) paranoia, 5) other ...

                            1 vs 2   2 vs 3   3 vs 4   4 vs 5   1st vs Last   Last PE Board
        Organic Psychoses     .15      .0o      .16      .15        .13            .31
        Schizophrenia         .5F      .61      .63      .73        .58            .72
        Affective P...

  13. "High Stage" Organizing.

    ERIC Educational Resources Information Center

    Torbert, William R.

    Although a psychological theory of stages of transformation in human development currently exists, organizational researchers have yet to elaborate and test any theory of organizational transformation of comparable elegance. According to the organizational stage theory being developed since 1974 by William Torbert, bureaucratic organization, which…

  14. Column Grid Array Rework for High Reliability

    NASA Technical Reports Server (NTRS)

    Mehta, Atul C.; Bodie, Charles C.

    2008-01-01

    Due to requirements for reduced size and weight, the use of grid array packages in space applications has become commonplace. To meet the requirements of high reliability and a high number of I/Os, ceramic column grid array (CCGA) packages were selected for major electronic components used in the next Mars Rover mission (specifically, high-density Field Programmable Gate Arrays). The probability of removal and replacement of these devices on the actual flight printed wiring board assemblies is deemed to be very high, because last-minute discoveries in final test will dictate changes in the firmware. The questions and challenges presented to the manufacturing organizations engaged in the production of high-reliability electronic assemblies are: Is the reliability of the PWBA adversely affected by rework (removal and replacement) of the CGA package? And how many times can we rework the same board without destroying a pad or degrading the lifetime of the assembly? To answer these questions, the most complex printed wiring board assembly used by the project was chosen as the test vehicle. The PWB was modified to provide a daisy-chain pattern, and a number of bare PWBs were acquired to this modified design. Non-functional 624-pin CGA packages with internal daisy chains matching the pattern on the PWB were procured. The combination of the modified PWB and the daisy-chained packages enables continuity measurements of every soldered contact during subsequent testing and thermal cycling. Several test vehicle boards were assembled, reworked, and then thermally cycled to assess the reliability of the solder joints and board material, including pads and traces near the CGA. The details of the rework process and the results of thermal cycling are presented in this paper.

  15. Electric system restructuring and system reliability

    NASA Astrophysics Data System (ADS)

    Horiuchi, Catherine Miller

    In 1996 the California legislature passed AB 1890, explicitly defining economic benefits and detailing specific mechanisms for initiating a partial restructuring of the state's electric system. Critics have since sought re-regulation and proponents have asked for patience as the new institutions and markets take shape. Other states' electric system restructuring activities have been tempered by real and perceived problems in the California model. This study examines the reduced regulatory controls and new constraints introduced in California's limited restructuring model, using utility and regulatory agency records from the 1990s to investigate the effects of new institutions and practices on system reliability for the state's five largest public and private utilities. Logit and negative binomial regressions indicate a negative impact from the California model of restructuring on system reliability as measured by customer interruptions. Time series analysis of outage data could not predict the wholesale power market collapse and the subsequent rolling blackouts in early 2001; inclusion of near-outage reliability disturbances--load shedding and energy emergencies--provided a measure of forewarning. Analysis of system disruptions, generation capacity and demand, and the role of purchased power challenges conventional wisdom on the causes of California's power problems. The quantitative analysis was supplemented by a targeted survey of electric system restructuring participants. Findings suggest each utility and the organization controlling the state's electric grid provided protection from power outages comparable to pre-restructuring operations through 2000; however, this reliability has come at an inflated cost, resulting in reduced system purchases and decreased marginal protection. The historic margin of operating safety has fully eroded, increasing mandatory load shedding and emergency declarations for voluntary and mandatory conservation. Proposed remedies focused
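
    For readers unfamiliar with the count-regression step, the sketch below fits a negative binomial model of annual outage counts against a post-restructuring indicator. The data are synthetic and the dispersion parameter is assumed; this only illustrates the technique, not the study's records.

        # Negative binomial regression of outage counts on a period indicator.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(6)
        n = 400
        post = rng.integers(0, 2, size=n)            # 1 = after restructuring
        mu = np.exp(1.0 + 0.3 * post)                # true mean count
        y = rng.negative_binomial(5, 5 / (5 + mu))   # draws with mean mu

        X = sm.add_constant(post.astype(float))
        fit = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=0.2)).fit()
        print(fit.params)   # intercept and restructuring effect (log scale)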

  16. Reliability of plantar pressure platforms.

    PubMed

    Hafer, Jocelyn F; Lenhoff, Mark W; Song, Jinsup; Jordan, Joanne M; Hannan, Marian T; Hillstrom, Howard J

    2013-07-01

    Plantar pressure measurement is common practice in many research and clinical protocols. While the accuracy of some plantar pressure measuring devices and methods for ensuring consistency in data collection on plantar pressure measuring devices have been reported, the reliability of different devices when testing the same individuals is not known. This study calculated intra-mat, intra-manufacturer, and inter-manufacturer reliability of plantar pressure parameters as well as the number of plantar pressure trials needed to reach a stable estimate of the mean for an individual. Twenty-two healthy adults completed ten walking trials across each of two Novel emed-x(®) and two Tekscan MatScan(®) plantar pressure measuring devices in a single visit. Intraclass correlation (ICC) was used to describe the agreement between values measured by different devices. All intra-platform reliability correlations were greater than 0.70. All inter-emed-x(®) reliability correlations were greater than 0.70. Inter-MatScan(®) reliability correlations were greater than 0.70 in 31 and 52 of 56 parameters when looking at a 10-trial average and a 5-trial average, respectively. Inter-manufacturer reliability including all four devices was greater than 0.70 for 52 and 56 of 56 parameters when looking at a 10-trial average and a 5-trial average, respectively. All parameters reached a value within 90% of an unbiased estimate of the mean within five trials. Overall, reliability results are encouraging for investigators and clinicians who may have plantar pressure data sets that include data collected on different devices.
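
    The agreement statistic used here, the intraclass correlation, can be computed directly from a two-way ANOVA decomposition. Below is an ICC(2,1) sketch on synthetic subject-by-device data; the sample sizes and noise levels are invented, not the study's.

        # ICC(2,1): two-way random effects, absolute agreement, single rater.
        import numpy as np

        rng = np.random.default_rng(7)
        n, k = 22, 4                                   # subjects x devices
        truth = rng.normal(200, 40, size=(n, 1))       # peak pressures (kPa-like)
        data = truth + rng.normal(0, 12, size=(n, k))  # per-device noise

        row_m = data.mean(axis=1, keepdims=True)
        col_m = data.mean(axis=0, keepdims=True)
        grand = data.mean()
        MSR = k * ((row_m - grand) ** 2).sum() / (n - 1)
        MSC = n * ((col_m - grand) ** 2).sum() / (k - 1)
        MSE = ((data - row_m - col_m + grand) ** 2).sum() / ((n - 1) * (k - 1))

        icc21 = (MSR - MSE) / (MSR + (k - 1) * MSE + k * (MSC - MSE) / n)
        print(f"ICC(2,1) = {icc21:.3f}")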

  17. String Theory and Gauge Theories

    SciTech Connect

    Maldacena, Juan

    2009-02-20

    We will see how gauge theories, in the limit that the number of colors is large, give string theories. We will discuss some examples of particular gauge theories where the corresponding string theory is known precisely, starting with the case of the maximally supersymmetric theory in four dimensions which corresponds to ten dimensional string theory. We will discuss recent developments in this area.

  18. In Search of a Unified Theory of Biological Organization: What Does the Motor System of a Sea Slug Tell Us About Human Motor Integration?

    DTIC Science & Technology

    1992-04-07

    Therefore, the controlling balance between converging transmitters and neuromodulators that affect neuronal structure need not be a simple linear ... G. M. Edelman and V. B. Mountcastle, 55-110. Cambridge, MA: MIT Press, 1978. 52. Edelman, G. M. Neural Darwinism: The Theory of Neuronal Group Selection

  19. A discussion of system reliability and the relative importance of pumps and valves to overall system availability

    SciTech Connect

    Poole, A.B.

    1996-12-01

    An analysis was undertaken to establish preliminary trends for how component aging can affect failure rates for swing check valves, centrifugal pumps, and motor-operated valves. These failure rate trends were evaluated over time, and linear aging rate models were established. The failure rate models were then used with classic reliability theory to estimate reliability as a function of operating time. Reliability theory was also used to establish a simple system reliability model. Using the system model, the relative importance of pumps and valves to the overall system reliability was studied. Conclusions were established regarding overall system availability over time and the relative unavailabilities of the various components studied.
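
    A linear aging-rate model folds directly into the reliability function: with lambda(t) = lambda0 + a*t, R(t) = exp(-(lambda0*t + a*t^2/2)), and a series system multiplies the component curves. The rates below are invented for illustration, not the paper's fitted values.

        # Linear aging-rate components rolled into a series system.
        import numpy as np

        components = {                      # (lambda0 /yr, aging rate /yr^2)
            "swing check valve":    (2e-3, 1e-4),
            "centrifugal pump":     (5e-3, 3e-4),
            "motor operated valve": (4e-3, 2e-4),
        }

        t = 20.0                            # years of operation
        R_sys = 1.0
        for name, (lam0, a) in components.items():
            R = np.exp(-(lam0 * t + 0.5 * a * t**2))
            R_sys *= R
            print(f"{name:22s} R({t:.0f} y) = {R:.4f}")
        print(f"{'series system':22s} R({t:.0f} y) = {R_sys:.4f}")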

  20. CERTS: Consortium for Electric Reliability Technology Solutions - Research Highlights

    SciTech Connect

    Eto, Joseph

    2003-07-30

    Historically, the U.S. electric power industry was vertically integrated, and utilities were responsible for system planning, operations, and reliability management. As the nation moves to a competitive market structure, these functions have been disaggregated, and no single entity is responsible for reliability management. As a result, new tools, technologies, systems, and management processes are needed to manage the reliability of the electricity grid. However, a number of simultaneous trends prevent electricity market participants from pursuing development of these reliability tools: utilities are preoccupied with restructuring their businesses; research funding has declined; and the formation of Independent System Operators (ISOs) and Regional Transmission Organizations (RTOs) to operate the grid means that control of transmission assets is separate from ownership of these assets. At the same time, business uncertainty and changing regulatory policies have created a climate in which needed investment in transmission infrastructure and tools for reliability management has dried up. To address the resulting gaps in reliability R&D, CERTS has undertaken much-needed public interest research on reliability technologies for the electricity grid. CERTS' vision is to: (1) transform the electricity grid into an intelligent network that can sense and respond automatically to changing flows of power and emerging problems; (2) enhance reliability management through market mechanisms, including transparency of real-time information on the status of the grid; (3) empower customers to manage their energy use and reliability needs in response to real-time market price signals; and (4) seamlessly integrate distributed technologies--including those for generation, storage, controls, and communications--to support the reliability needs of both the grid and individual customers.