Sample records for reliability organizations theory

  1. Reliability theory for repair service organization simulation and increase of innovative attraction of industrial enterprises

    NASA Astrophysics Data System (ADS)

    Dolzhenkova, E. V.; Iurieva, L. V.

    2018-05-01

The study presents the author's algorithm for simulating the repair service organization of an industrial enterprise on the basis of reliability theory, together with the results of its application. Monitoring of the repair service organization is proposed on the basis of the enterprise's state indexes for its main resources (equipment, labour, finances, repair areas), which allows the reliability level to be evaluated quantitatively as a summary rating of these parameters and an appropriate level of operational reliability of the serviced technical objects to be ensured. Under conditions of tough competition, the following approach is advisable: the higher the efficiency of production and of the repair service itself, the higher the innovative attractiveness of an industrial enterprise. The calculations show that applying reliability theory is advisable in order to prevent inefficient production losses and to reduce repair costs; the overall reliability rating calculated with the author's algorithm has low values. Processing the statistical data yields reliability characteristics for the different workshops and services of an industrial enterprise, which makes it possible to define the failure rates of the various units of equipment and to establish the reliability indexes needed for the subsequent mathematical simulation. The proposed simulation algorithm contributes to increasing the efficiency of the repair service organization and improving the innovative attractiveness of an industrial enterprise.
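The abstract does not give the aggregation formula behind the "resulting summary rating," so the following is only a minimal sketch under stated assumptions: per-resource state indexes in [0, 1] combined as a weighted mean, with an exponential failure model for individual equipment units. The index values, weights, and function names are all hypothetical.

```python
import math

def unit_reliability(failure_rate, hours):
    """Survival probability of one equipment unit under an assumed
    exponential failure model: R(t) = exp(-lambda * t)."""
    return math.exp(-failure_rate * hours)

def overall_rating(indexes, weights):
    """Summary rating as a weighted mean of per-resource state indexes
    (equipment, labour, finances, repair areas), each in [0, 1].
    The weighted-mean form is an assumption, not the paper's formula."""
    return sum(i * w for i, w in zip(indexes, weights)) / sum(weights)

# Hypothetical state indexes and weights for the four resources named above.
indexes = {"equipment": 0.71, "labour": 0.64, "finances": 0.58, "repair_areas": 0.69}
weights = {"equipment": 0.4, "labour": 0.3, "finances": 0.2, "repair_areas": 0.1}
rating = overall_rating(list(indexes.values()), list(weights.values()))
```

A low `rating` would flag the repair service for redesign, mirroring the paper's observation that the computed overall reliability rating was low.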

  2. Applying Organization Theory to Understanding the Adoption and Implementation of Accountable Care Organizations: Commentary.

    PubMed

    Shortell, Stephen M

    2016-12-01

This commentary highlights the key arguments and contributions of institutional theory, transaction cost economics (TCE) theory, high reliability theory, and organizational learning theory to understanding the development and evolution of Accountable Care Organizations (ACOs). Institutional theory and TCE theory primarily emphasize the external influences shaping ACOs, while high reliability theory and organizational learning theory underscore the internal factors influencing ACO performance. A framework based on Implementation Science is proposed to consider the multiple perspectives on ACOs and, in particular, their ability to innovate to achieve desired cost, quality, and population health goals. © The Author(s) 2016.

  3. Creating Highly Reliable Accountable Care Organizations.

    PubMed

    Vogus, Timothy J; Singer, Sara J

    2016-12-01

    Accountable Care Organizations' (ACOs) pursuit of the triple aim of higher quality, lower cost, and improved population health has met with mixed results. To improve the design and implementation of ACOs we look to organizations that manage similarly complex, dynamic, and tightly coupled conditions while sustaining exceptional performance known as high-reliability organizations. We describe the key processes through which organizations achieve reliability, the leadership and organizational practices that enable it, and the role that professionals can play when charged with enacting it. Specifically, we present concrete practices and processes from health care organizations pursuing high-reliability and from early ACOs to illustrate how the triple aim may be met by cultivating mindful organizing, practicing reliability-enhancing leadership, and identifying and supporting reliability professionals. We conclude by proposing a set of research questions to advance the study of ACOs and high-reliability research. © The Author(s) 2016.

  4. Improving Patient Safety in Hospitals: Contributions of High-Reliability Theory and Normal Accident Theory

    PubMed Central

    Tamuz, Michal; Harrison, Michael I

    2006-01-01

    Objective To identify the distinctive contributions of high-reliability theory (HRT) and normal accident theory (NAT) as frameworks for examining five patient safety practices. Data Sources/Study Setting We reviewed and drew examples from studies of organization theory and health services research. Study Design After highlighting key differences between HRT and NAT, we applied the frames to five popular safety practices: double-checking medications, crew resource management (CRM), computerized physician order entry (CPOE), incident reporting, and root cause analysis (RCA). Principal Findings HRT highlights how double checking, which is designed to prevent errors, can undermine mindfulness of risk. NAT emphasizes that social redundancy can diffuse and reduce responsibility for locating mistakes. CRM promotes high reliability organizations by fostering deference to expertise, rather than rank. However, HRT also suggests that effective CRM depends on fundamental changes in organizational culture. NAT directs attention to an underinvestigated feature of CPOE: it tightens the coupling of the medication ordering process, and tight coupling increases the chances of a rapid and hard-to-contain spread of infrequent, but harmful errors. Conclusions Each frame can make a valuable contribution to improving patient safety. By applying the HRT and NAT frames, health care researchers and administrators can identify health care settings in which new and existing patient safety interventions are likely to be effective. Furthermore, they can learn how to improve patient safety, not only from analyzing mishaps, but also by studying the organizational consequences of implementing safety measures. PMID:16898984

  5. How to Map Theory: Reliable Methods Are Fruitless Without Rigorous Theory.

    PubMed

    Gray, Kurt

    2017-09-01

Good science requires both reliable methods and rigorous theory. Theory allows us to build a unified structure of knowledge, to connect the dots of individual studies and reveal the bigger picture. Some have criticized the proliferation of pet "Theories," but generic "theory" is essential to healthy science, because questions of theory are ultimately those of validity. Although reliable methods and rigorous theory are synergistic, Action Identification suggests psychological tension between them: The more we focus on methodological details, the less we notice the broader connections. Therefore, psychology needs to supplement training in methods (how to design studies and analyze data) with training in theory (how to connect studies and synthesize ideas). This article provides a technique for visually outlining theory: theory mapping. Theory mapping contains five elements, which are illustrated with moral judgment and with cars. Also included are 15 additional theory maps provided by experts in emotion, culture, priming, power, stress, ideology, morality, marketing, decision-making, and more (see all at theorymaps.org). Theory mapping provides both precision and synthesis, which helps to resolve arguments, prevent redundancies, assess the theoretical contribution of papers, and evaluate the likelihood of surprising effects.

  6. 78 FR 41339 - Electric Reliability Organization Proposal To Retire Requirements in Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-10

    ...] Electric Reliability Organization Proposal To Retire Requirements in Reliability Standards AGENCY: Federal... Reliability Standards identified by the North American Electric Reliability Corporation (NERC), the Commission-certified Electric Reliability Organization. FOR FURTHER INFORMATION CONTACT: Kevin Ryan (Legal Information...

  7. Neurology objective structured clinical examination reliability using generalizability theory

    PubMed Central

    Park, Yoon Soo; Lukas, Rimas V.; Brorson, James R.

    2015-01-01

    Objectives: This study examines factors affecting reliability, or consistency of assessment scores, from an objective structured clinical examination (OSCE) in neurology through generalizability theory (G theory). Methods: Data include assessments from a multistation OSCE taken by 194 medical students at the completion of a neurology clerkship. Facets evaluated in this study include cases, domains, and items. Domains refer to areas of skill (or constructs) that the OSCE measures. G theory is used to estimate variance components associated with each facet, derive reliability, and project the number of cases required to obtain a reliable (consistent, precise) score. Results: Reliability using G theory is moderate (Φ coefficient = 0.61, G coefficient = 0.64). Performance is similar across cases but differs by the particular domain, such that the majority of variance is attributed to the domain. Projections in reliability estimates reveal that students need to participate in 3 OSCE cases in order to increase reliability beyond the 0.70 threshold. Conclusions: This novel use of G theory in evaluating an OSCE in neurology provides meaningful measurement characteristics of the assessment. Differing from prior work in other medical specialties, the cases students were randomly assigned did not influence their OSCE score; rather, scores varied in expected fashion by domain assessed. PMID:26432851

  8. Neurology objective structured clinical examination reliability using generalizability theory.

    PubMed

    Blood, Angela D; Park, Yoon Soo; Lukas, Rimas V; Brorson, James R

    2015-11-03

    This study examines factors affecting reliability, or consistency of assessment scores, from an objective structured clinical examination (OSCE) in neurology through generalizability theory (G theory). Data include assessments from a multistation OSCE taken by 194 medical students at the completion of a neurology clerkship. Facets evaluated in this study include cases, domains, and items. Domains refer to areas of skill (or constructs) that the OSCE measures. G theory is used to estimate variance components associated with each facet, derive reliability, and project the number of cases required to obtain a reliable (consistent, precise) score. Reliability using G theory is moderate (Φ coefficient = 0.61, G coefficient = 0.64). Performance is similar across cases but differs by the particular domain, such that the majority of variance is attributed to the domain. Projections in reliability estimates reveal that students need to participate in 3 OSCE cases in order to increase reliability beyond the 0.70 threshold. This novel use of G theory in evaluating an OSCE in neurology provides meaningful measurement characteristics of the assessment. Differing from prior work in other medical specialties, the cases students were randomly assigned did not influence their OSCE score; rather, scores varied in expected fashion by domain assessed. © 2015 American Academy of Neurology.
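The projection step described above (how many OSCE cases are needed to push reliability past 0.70) follows standard G-theory decision-study formulas for a one-facet person-by-case design. A minimal sketch with illustrative variance components (not the study's estimates; the study's facets also include domains and items):

```python
def g_coefficient(var_p, var_pc, n_c):
    """Relative (generalizability) coefficient for a crossed
    person x case design, averaging over n_c cases."""
    return var_p / (var_p + var_pc / n_c)

def phi_coefficient(var_p, var_c, var_pc, n_c):
    """Absolute (dependability, Phi) coefficient: the case
    main-effect variance also counts as error."""
    return var_p / (var_p + (var_c + var_pc) / n_c)

def cases_needed(var_p, var_c, var_pc, threshold=0.70, max_n=50):
    """Smallest number of cases whose projected Phi clears the threshold."""
    for n in range(1, max_n + 1):
        if phi_coefficient(var_p, var_c, var_pc, n) >= threshold:
            return n
    return None

# Illustrative variance components: person, case, and person-by-case error.
var_p, var_c, var_pc = 0.30, 0.05, 0.35
n_needed = cases_needed(var_p, var_c, var_pc)
```

Because Phi charges the case main effect to error while G does not, Phi is never larger than G for the same design.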

  9. High reliability and implications for nursing leaders.

    PubMed

    Riley, William

    2009-03-01

To review high reliability theory and discuss its implications for the nursing leader. A high reliability organization (HRO) is considered to be one that has measurably near-perfect performance for quality and safety. The author reviewed the literature, discussed research findings that contribute to improving reliability in health care organizations, and made five recommendations for how nursing leaders can create high reliability organizations. Health care is not a safe industry and unintended patient harm occurs at epidemic levels. Health care can learn from high reliability theory and practice developed in other high-risk industries. Viewed by HRO standards, unintended patient injury in health care is excessively high and quality is distressingly low. HRO theory and practice can be successfully applied in health care using advanced interdisciplinary teamwork training and deliberate process design techniques. Nursing has a primary leadership function for ensuring patient safety and achieving high quality in health care organizations. Learning HRO theory and methods for achieving high reliability is a foremost opportunity for nursing leaders.

  10. 18 CFR 39.3 - Electric Reliability Organization certification.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Electric Reliability... CERTIFICATION OF THE ELECTRIC RELIABILITY ORGANIZATION; AND PROCEDURES FOR THE ESTABLISHMENT, APPROVAL, AND ENFORCEMENT OF ELECTRIC RELIABILITY STANDARDS § 39.3 Electric Reliability Organization certification. (a) Any...

  11. 18 CFR 39.3 - Electric Reliability Organization certification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Electric Reliability... CERTIFICATION OF THE ELECTRIC RELIABILITY ORGANIZATION; AND PROCEDURES FOR THE ESTABLISHMENT, APPROVAL, AND ENFORCEMENT OF ELECTRIC RELIABILITY STANDARDS § 39.3 Electric Reliability Organization certification. (a) Any...

  12. 18 CFR 39.3 - Electric Reliability Organization certification.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 18 Conservation of Power and Water Resources 1 2014-04-01 2014-04-01 false Electric Reliability... CERTIFICATION OF THE ELECTRIC RELIABILITY ORGANIZATION; AND PROCEDURES FOR THE ESTABLISHMENT, APPROVAL, AND ENFORCEMENT OF ELECTRIC RELIABILITY STANDARDS § 39.3 Electric Reliability Organization certification. (a) Any...

  13. 18 CFR 39.3 - Electric Reliability Organization certification.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 18 Conservation of Power and Water Resources 1 2013-04-01 2013-04-01 false Electric Reliability... CERTIFICATION OF THE ELECTRIC RELIABILITY ORGANIZATION; AND PROCEDURES FOR THE ESTABLISHMENT, APPROVAL, AND ENFORCEMENT OF ELECTRIC RELIABILITY STANDARDS § 39.3 Electric Reliability Organization certification. (a) Any...

  14. Generalizability Theory as a Unifying Framework of Measurement Reliability in Adolescent Research

    ERIC Educational Resources Information Center

    Fan, Xitao; Sun, Shaojing

    2014-01-01

    In adolescence research, the treatment of measurement reliability is often fragmented, and it is not always clear how different reliability coefficients are related. We show that generalizability theory (G-theory) is a comprehensive framework of measurement reliability, encompassing all other reliability methods (e.g., Pearson "r,"…

  15. Design of high reliability organizations in health care.

    PubMed

    Carroll, J S; Rudolph, J W

    2006-12-01

    To improve safety performance, many healthcare organizations have sought to emulate high reliability organizations from industries such as nuclear power, chemical processing, and military operations. We outline high reliability design principles for healthcare organizations including both the formal structures and the informal practices that complement those structures. A stage model of organizational structures and practices, moving from local autonomy to formal controls to open inquiry to deep self-understanding, is used to illustrate typical challenges and design possibilities at each stage. We suggest how organizations can use the concepts and examples presented to increase their capacity to self-design for safety and reliability.

  16. Design of high reliability organizations in health care

    PubMed Central

    Carroll, J S; Rudolph, J W

    2006-01-01

    To improve safety performance, many healthcare organizations have sought to emulate high reliability organizations from industries such as nuclear power, chemical processing, and military operations. We outline high reliability design principles for healthcare organizations including both the formal structures and the informal practices that complement those structures. A stage model of organizational structures and practices, moving from local autonomy to formal controls to open inquiry to deep self‐understanding, is used to illustrate typical challenges and design possibilities at each stage. We suggest how organizations can use the concepts and examples presented to increase their capacity to self‐design for safety and reliability. PMID:17142607

  17. The Stability and Reliability of a Modified Work Components Study Questionnaire in the Educational Organization.

    ERIC Educational Resources Information Center

    Miskel, Cecil; Heller, Leonard E.

    The investigation attempted to establish the factorial validity and reliability of an industrial selection device based on Herzberg's theory of work motivation related to the school organization. The questionnaire was reworded to reflect an educational work situation; and a random sample of 197 students, 118 administrators, and 432 teachers was…

  18. Reliability of Test Scores in Nonparametric Item Response Theory.

    ERIC Educational Resources Information Center

    Sijtsma, Klaas; Molenaar, Ivo W.

    1987-01-01

    Three methods for estimating reliability are studied within the context of nonparametric item response theory. Two were proposed originally by Mokken and a third is developed in this paper. Using a Monte Carlo strategy, these three estimation methods are compared with four "classical" lower bounds to reliability. (Author/JAZ)

  19. Extended Importance Sampling for Reliability Analysis under Evidence Theory

    NASA Astrophysics Data System (ADS)

    Yuan, X. K.; Chen, B.; Zhang, B. Q.

    2018-05-01

In early engineering practice, the lack of data and information makes uncertainty difficult to deal with. However, evidence theory has been proposed to handle uncertainty with limited information as an alternative to traditional probability theory. In this contribution, a simulation-based approach called 'extended importance sampling' is proposed based on evidence theory to handle problems with epistemic uncertainty. The proposed approach stems from traditional importance sampling for reliability analysis under probability theory and is developed to handle problems with epistemic uncertainty. It first introduces a nominal instrumental probability density function (PDF) for every epistemic uncertainty variable, so that an 'equivalent' reliability problem under probability theory is obtained. Then samples of these variables are generated by importance sampling. From these samples, the plausibility and belief (upper and lower bounds of the failure probability) can be estimated. The approach is more efficient than direct Monte Carlo simulation. Numerical and engineering examples are given to illustrate the efficiency and feasibility of the proposed approach.
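To make the belief/plausibility bounds concrete, here is a simplified sampling sketch for a single epistemic variable described by focal intervals with basic probability assignment (BPA) masses. This is plain interval sampling over each focal element, not the paper's importance-sampling estimator; the performance function and body of evidence are made up for illustration.

```python
import random

def g(x):
    """Illustrative performance function: failure when g(x) <= 0."""
    return 2.5 - x

# Body of evidence: focal intervals with BPA masses summing to 1.
focal = [((0.0, 1.0), 0.3), ((1.0, 2.0), 0.5), ((2.0, 3.0), 0.2)]

def belief_plausibility(focal, g, n=1000, seed=0):
    """Estimate lower (belief) and upper (plausibility) bounds on the
    failure probability by sampling within each focal element."""
    rng = random.Random(seed)
    bel = pl = 0.0
    for (lo, hi), mass in focal:
        values = [g(rng.uniform(lo, hi)) for _ in range(n)]
        if max(values) <= 0:   # the whole element fails -> counts toward belief
            bel += mass
        if min(values) <= 0:   # some point of the element fails -> plausibility
            pl += mass
    return bel, pl

bel, pl = belief_plausibility(focal, g)
```

With these inputs only the interval (2.0, 3.0) contains failing points, so the belief is 0 and the plausibility equals that element's mass, 0.2.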

  20. Comparison of Reliability Measures under Factor Analysis and Item Response Theory

    ERIC Educational Resources Information Center

    Cheng, Ying; Yuan, Ke-Hai; Liu, Cheng

    2012-01-01

    Reliability of test scores is one of the most pervasive psychometric concepts in measurement. Reliability coefficients based on a unifactor model for continuous indicators include maximal reliability rho and an unweighted sum score-based omega, among many others. With increasing popularity of item response theory, a parallel reliability measure pi…

  1. Creating High Reliability in Health Care Organizations

    PubMed Central

    Pronovost, Peter J; Berenholtz, Sean M; Goeschel, Christine A; Needham, Dale M; Sexton, J Bryan; Thompson, David A; Lubomski, Lisa H; Marsteller, Jill A; Makary, Martin A; Hunt, Elizabeth

    2006-01-01

    Objective The objective of this paper was to present a comprehensive approach to help health care organizations reliably deliver effective interventions. Context Reliability in healthcare translates into using valid rate-based measures. Yet high reliability organizations have proven that the context in which care is delivered, called organizational culture, also has important influences on patient safety. Model for Improvement Our model to improve reliability, which also includes interventions to improve culture, focuses on valid rate-based measures. This model includes (1) identifying evidence-based interventions that improve the outcome, (2) selecting interventions with the most impact on outcomes and converting to behaviors, (3) developing measures to evaluate reliability, (4) measuring baseline performance, and (5) ensuring patients receive the evidence-based interventions. The comprehensive unit-based safety program (CUSP) is used to improve culture and guide organizations in learning from mistakes that are important, but cannot be measured as rates. Conclusions We present how this model was used in over 100 intensive care units in Michigan to improve culture and eliminate catheter-related blood stream infections—both were accomplished. Our model differs from existing models in that it incorporates efforts to improve a vital component for system redesign—culture, it targets 3 important groups—senior leaders, team leaders, and front line staff, and facilitates change management—engage, educate, execute, and evaluate for planned interventions. PMID:16898981

  2. Reliability analysis of the objective structured clinical examination using generalizability theory.

    PubMed

    Trejo-Mejía, Juan Andrés; Sánchez-Mendiola, Melchor; Méndez-Ramírez, Ignacio; Martínez-González, Adrián

    2016-01-01

    The objective structured clinical examination (OSCE) is a widely used method for assessing clinical competence in health sciences education. Studies using this method have shown evidence of validity and reliability. There are no published studies of OSCE reliability measurement with generalizability theory (G-theory) in Latin America. The aims of this study were to assess the reliability of an OSCE in medical students using G-theory and explore its usefulness for quality improvement. An observational cross-sectional study was conducted at National Autonomous University of Mexico (UNAM) Faculty of Medicine in Mexico City. A total of 278 fifth-year medical students were assessed with an 18-station OSCE in a summative end-of-career final examination. There were four exam versions. G-theory with a crossover random effects design was used to identify the main sources of variance. Examiners, standardized patients, and cases were considered as a single facet of analysis. The exam was applied to 278 medical students. The OSCE had a generalizability coefficient of 0.93. The major components of variance were stations, students, and residual error. The sites and the versions of the tests had minimum variance. Our study achieved a G coefficient similar to that found in other reports, which is acceptable for summative tests. G-theory allows the estimation of the magnitude of multiple sources of error and helps decision makers to determine the number of stations, test versions, and examiners needed to obtain reliable measurements.
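The closing claim, that G-theory helps decision makers choose the number of stations, versions, and examiners, corresponds to a decision (D-) study. A minimal sketch for a fully crossed person x station x examiner design, with made-up variance components (not this study's estimates, and a simplification of its single-facet error treatment):

```python
def phi_two_facet(var_p, var_s, var_e, var_ps, var_pe, var_se, var_pse, n_s, n_e):
    """Projected dependability (Phi) coefficient when averaging over
    n_s stations and n_e examiners per student; every facet variance
    other than persons counts as absolute error."""
    abs_error = ((var_s + var_ps) / n_s
                 + (var_e + var_pe) / n_e
                 + (var_se + var_pse) / (n_s * n_e))
    return var_p / (var_p + abs_error)

# Illustrative variance components only.
vc = dict(var_p=0.50, var_s=0.20, var_e=0.05, var_ps=0.15,
          var_pe=0.05, var_se=0.02, var_pse=0.10)
phi_18 = phi_two_facet(n_s=18, n_e=1, **vc)   # 18-station, one-examiner design
phi_9 = phi_two_facet(n_s=9, n_e=1, **vc)     # halving the stations lowers Phi
```

Sweeping `n_s` and `n_e` over candidate designs shows directly which combinations reach a target coefficient, which is the practical use the abstract describes.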

  3. Reliability analysis of the objective structured clinical examination using generalizability theory.

    PubMed

    Trejo-Mejía, Juan Andrés; Sánchez-Mendiola, Melchor; Méndez-Ramírez, Ignacio; Martínez-González, Adrián

    2016-01-01

    Background The objective structured clinical examination (OSCE) is a widely used method for assessing clinical competence in health sciences education. Studies using this method have shown evidence of validity and reliability. There are no published studies of OSCE reliability measurement with generalizability theory (G-theory) in Latin America. The aims of this study were to assess the reliability of an OSCE in medical students using G-theory and explore its usefulness for quality improvement. Methods An observational cross-sectional study was conducted at National Autonomous University of Mexico (UNAM) Faculty of Medicine in Mexico City. A total of 278 fifth-year medical students were assessed with an 18-station OSCE in a summative end-of-career final examination. There were four exam versions. G-theory with a crossover random effects design was used to identify the main sources of variance. Examiners, standardized patients, and cases were considered as a single facet of analysis. Results The exam was applied to 278 medical students. The OSCE had a generalizability coefficient of 0.93. The major components of variance were stations, students, and residual error. The sites and the versions of the tests had minimum variance. Conclusions Our study achieved a G coefficient similar to that found in other reports, which is acceptable for summative tests. G-theory allows the estimation of the magnitude of multiple sources of error and helps decision makers to determine the number of stations, test versions, and examiners needed to obtain reliable measurements.

  4. Organization Theory as Ideology.

    ERIC Educational Resources Information Center

    Greenfield, Thomas B.

    The theory that organizations are ideological inventions of the human mind is discussed. Organizational science is described as an ideology which is based upon social concepts and experiences. The main justification for organizational theory is that it attempts to answer why we behave as we do in social organizations. Ways in which ideas and…

  5. Large Sample Confidence Intervals for Item Response Theory Reliability Coefficients

    ERIC Educational Resources Information Center

    Andersson, Björn; Xin, Tao

    2018-01-01

    In applications of item response theory (IRT), an estimate of the reliability of the ability estimates or sum scores is often reported. However, analytical expressions for the standard errors of the estimators of the reliability coefficients are not available in the literature and therefore the variability associated with the estimated reliability…

  6. Test Theories, Educational Priorities and Reliability of Public Examinations in England

    ERIC Educational Resources Information Center

    Baird, Jo-Anne; Black, Paul

    2013-01-01

    Much has already been written on the controversies surrounding the use of different test theories in educational assessment. Other authors have noted the prevalence of classical test theory over item response theory in practice. This Special Issue draws together articles based upon work conducted on the Reliability Programme for England's…

  7. Teamwork as an Essential Component of High-Reliability Organizations

    PubMed Central

    Baker, David P; Day, Rachel; Salas, Eduardo

    2006-01-01

Organizations are increasingly becoming dynamic and unstable. This evolution has given rise to greater reliance on teams and increased complexity in terms of team composition, skills required, and degree of risk involved. High-reliability organizations (HROs) are those that exist in such hazardous environments where the consequences of errors are high, but the occurrence of error is extremely low. In this article, we argue that teamwork is an essential component of achieving high reliability, particularly in health care organizations. We describe the fundamental characteristics of teams, review strategies in team training, demonstrate the criticality of teamwork in HROs, and, finally, identify specific challenges the health care community must address to improve teamwork and enhance reliability. PMID:16898980

  8. Reliable Cellular Automata with Self-Organization

    NASA Astrophysics Data System (ADS)

    Gács, Peter

    2001-04-01

    In a probabilistic cellular automaton in which all local transitions have positive probability, the problem of keeping a bit of information indefinitely is nontrivial, even in an infinite automaton. Still, there is a solution in 2 dimensions, and this solution can be used to construct a simple 3-dimensional discrete-time universal fault-tolerant cellular automaton. This technique does not help much to solve the following problems: remembering a bit of information in 1 dimension; computing in dimensions lower than 3; computing in any dimension with non-synchronized transitions. Our more complex technique organizes the cells in blocks that perform a reliable simulation of a second (generalized) cellular automaton. The cells of the latter automaton are also organized in blocks, simulating even more reliably a third automaton, etc. Since all this (a possibly infinite hierarchy) is organized in "software," it must be under repair all the time from damage caused by errors. A large part of the problem is essentially self-stabilization recovering from a mess of arbitrary size and content. The present paper constructs an asynchronous one-dimensional fault-tolerant cellular automaton, with the further feature of "self-organization." The latter means that unless a large amount of input information must be given, the initial configuration can be chosen homogeneous.
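The problem the abstract opens with, keeping a bit when every local transition has positive error probability, can be made concrete with a naive 1D majority-vote automaton. This sketch (not Gács's hierarchical construction) only illustrates the setting: with zero noise the rule preserves and repairs the stored bit, while any positive noise rate eventually threatens it.

```python
import random

def step(cells, eps, rng):
    """One step of a 1D majority-vote automaton on a ring: each cell
    takes the majority of itself and its two neighbours, then flips
    with probability eps, so every local transition has positive
    probability when eps > 0."""
    n = len(cells)
    out = []
    for i in range(n):
        m = cells[i - 1] + cells[i] + cells[(i + 1) % n]
        bit = 1 if m >= 2 else 0
        if rng.random() < eps:
            bit ^= 1
        out.append(bit)
    return out

rng = random.Random(1)
cells = [1] * 64                 # encode the bit "1" as the all-ones configuration
for _ in range(100):
    cells = step(cells, eps=0.01, rng=rng)
majority_bit = 1 if sum(cells) > len(cells) // 2 else 0
```

Reading the bit back as the global majority works only while errors stay sparse; the paper's block-hierarchy of simulated automata is what makes retention provable despite constant noise.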

  9. Studying Reliability of Open Ended Mathematics Items According to the Classical Test Theory and Generalizability Theory

    ERIC Educational Resources Information Center

    Guler, Nese; Gelbal, Selahattin

    2010-01-01

    In this study, the Classical test theory and generalizability theory were used for determination to reliability of scores obtained from measurement tool of mathematics success. 24 open-ended mathematics question of the TIMSS-1999 was applied to 203 students in 2007-spring semester. Internal consistency of scores was found as 0.92. For…

  10. Seeking high reliability in primary care: Leadership, tools, and organization.

    PubMed

    Weaver, Robert R

    2015-01-01

Leaders in health care increasingly recognize that improving health care quality and safety requires developing an organizational culture that fosters high reliability and continuous process improvement. For various reasons, a reliability-seeking culture is lacking in most health care settings. Developing a reliability-seeking culture requires leaders' sustained commitment to reliability principles using key mechanisms to embed those principles widely in the organization. The aim of this study was to examine how key mechanisms used by a primary care practice (PCP) might foster a reliability-seeking, system-oriented organizational culture. A case study approach was used to investigate the PCP's reliability culture. The study examined four cultural artifacts used to embed reliability-seeking principles across the organization: leadership statements, decision support tools, and two organizational processes. To decipher their effects on reliability, the study relied on observations of work patterns and the tools' use, interactions during morning huddles and process improvement meetings, interviews with clinical and office staff, and a "collective mindfulness" questionnaire. The five reliability principles framed the data analysis. Leadership statements articulated principles that oriented the PCP toward a reliability-seeking culture of care. Reliability principles became embedded in the everyday discourse and actions through the use of "problem knowledge coupler" decision support tools and daily "huddles." Practitioners and staff were encouraged to report unexpected events or close calls, which often initiated a formal "process change" used to adjust routines and prevent adverse events from recurring. Activities that foster reliable patient care became part of the taken-for-granted routine at the PCP. The analysis illustrates the role leadership, tools, and organizational processes play in developing and embedding a reliability-seeking culture across an organization.

  11. Putting "Organizations" into an Organization Theory Course: A Hybrid CAO Model for Teaching Organization Theory

    ERIC Educational Resources Information Center

    Hannah, David R.; Venkatachary, Ranga

    2010-01-01

    In this article, the authors present a retrospective analysis of an instructor's multiyear redesign of a course on organization theory into what is called a hybrid Classroom-as-Organization model. It is suggested that this new course design served to apprentice students to function in quasi-real organizational structures. The authors further argue…

  12. Structural reliability analysis under evidence theory using the active learning kriging model

    NASA Astrophysics Data System (ADS)

    Yang, Xufeng; Liu, Yongshou; Ma, Panke

    2017-11-01

    Structural reliability analysis under evidence theory is investigated. It is rigorously proved that a surrogate model providing only correct sign prediction of the performance function can meet the accuracy requirement of evidence-theory-based reliability analysis. Accordingly, a method based on the active learning kriging model which only correctly predicts the sign of the performance function is proposed. Interval Monte Carlo simulation and a modified optimization method based on Karush-Kuhn-Tucker conditions are introduced to make the method more efficient in estimating the bounds of failure probability based on the kriging model. Four examples are investigated to demonstrate the efficiency and accuracy of the proposed method.
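
    The interval Monte Carlo step the abstract mentions can be illustrated with a toy sketch (this is not the authors' kriging-assisted algorithm): under evidence theory, each focal element is an input box carrying a basic probability mass, and probing the sign of the performance function inside each box yields belief/plausibility bounds on the failure probability. The performance function and masses below are invented for illustration.

```python
import random

def failure_bounds(g, focal_elements, n_samples=200, seed=1):
    """Estimate [belief, plausibility] bounds on P(g(x) < 0) under evidence
    theory. focal_elements is a list of (intervals, mass) pairs, where
    intervals gives one (lo, hi) bound per input variable and the masses
    sum to 1. A focal element counts toward the lower bound (belief) only
    if every probe inside it fails, and toward the upper bound
    (plausibility) if any probe fails."""
    rng = random.Random(seed)
    bel, pl = 0.0, 0.0
    for intervals, mass in focal_elements:
        fails = [g([rng.uniform(lo, hi) for lo, hi in intervals]) < 0
                 for _ in range(n_samples)]
        if all(fails):
            bel += mass   # the whole box appears to lie in the failure region
        if any(fails):
            pl += mass    # the box at least touches the failure region
    return bel, pl

# Toy performance function: failure when g(x) < 0.
g = lambda x: x[0] + x[1] - 1.0
focal = [([(0.0, 0.2), (0.0, 0.2)], 0.5),   # entirely in the failure region
         ([(0.3, 0.7), (0.3, 0.7)], 0.2),   # straddles the limit state
         ([(0.6, 0.8), (0.6, 0.8)], 0.3)]   # entirely safe
bel, pl = failure_bounds(g, focal)          # bounds [0.5, 0.7]
```

    The optimization step of the paper (finding the exact extrema of g over each box via Karush-Kuhn-Tucker conditions) is replaced here by plain sampling, which can misclassify a box that barely crosses the limit state.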

  13. High Reliability Organizations in Education. Noteworthy Perspectives

    ERIC Educational Resources Information Center

    Eck, James H.; Bellamy, G. Thomas; Schaffer, Eugene; Stringfield, Sam; Reynolds, David

    2011-01-01

    The authors of this monograph assert that by assisting school systems to more closely resemble "high reliability" organizations (HROs) that already exist in other industries and benchmarking against top-performing education systems from around the globe, America's school systems can transform themselves from compliance-driven…

  14. 77 FR 59745 - Delegation of Authority Regarding Electric Reliability Organization's Budget, Delegation...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-01

    ...; Order No. 766] Delegation of Authority Regarding Electric Reliability Organization's Budget, Delegation... Electric Reliability Organization (ERO) filings. In particular, this Final Rule transfers delegated... delegation agreements, and ERO policies and procedures. DATES: This rule is effective October 1, 2012. FOR...

  15. The Organization of the Living: A Theory of the Living Organization

    ERIC Educational Resources Information Center

    Maturana, H. R.

    1975-01-01

    Article presents a theory of the organization of living systems as autonomous entities, and a theory of the organization of the nervous system as a closed network of interacting neurons structurally coupled to the living system to whose realization it contributes. (Author)

  16. A Three-Part Theory of Critical Thinking: Dialogue, Mental Models, and Reliability

    DTIC Science & Technology

    2000-08-01

    A Three-Part Theory of Critical Thinking: Dialogue, Mental Models, and Reliability. Marvin S. Cohen, Ph.D., Cognitive Technologies. Report date: August 2000. ...in logic or decision theory? Does it require stand-alone courses? How will we persuade students to devote their time to the study of critical...

  17. Bi-Factor Multidimensional Item Response Theory Modeling for Subscores Estimation, Reliability, and Classification

    ERIC Educational Resources Information Center

    Md Desa, Zairul Nor Deana

    2012-01-01

    In recent years, there has been increasing interest in estimating and improving subscore reliability. In this study, the multidimensional item response theory (MIRT) and the bi-factor model were combined to estimate subscores, to obtain subscores reliability, and subscores classification. Both the compensatory and partially compensatory MIRT…

  18. In search of principles for a Theory of Organisms

    PubMed Central

    Longo, Giuseppe; Montévil, Maël; Sonnenschein, Carlos; Soto, Ana M

    2017-01-01

    Lacking an operational theory to explain the organization and behaviour of matter in unicellular and multicellular organisms hinders progress in biology. Such a theory should address life cycles from ontogenesis to death. This theory would complement the theory of evolution that addresses phylogenesis, and would posit theoretical extensions to accepted physical principles and default states in order to grasp the living state of matter and define proper biological observables. Thus, we favour adopting the default state implicit in Darwin’s theory, namely, cell proliferation with variation plus motility, and a framing principle, namely, life phenomena manifest themselves as non-identical iterations of morphogenetic processes. From this perspective, organisms become a consequence of the inherent variability generated by proliferation, motility and self-organization. Morphogenesis would then be the result of the default state plus physical constraints, like gravity, and those present in living organisms, like muscular tension. PMID:26648040

  19. Using Generalizability Theory to Assess the Score Reliability of Communication Skills of Dentistry Students

    ERIC Educational Resources Information Center

    Uzun, N. Bilge; Aktas, Mehtap; Asiret, Semih; Yormaz, Seha

    2018-01-01

    The goal of this study is to determine the reliability of the performance points of dentistry students regarding communication skills and to examine the scoring reliability by generalizability theory in balanced random and fixed facet (mixed design) data, considering also the interactions of student, rater and duty. The study group of the research…

  20. Modeling Sensor Reliability in Fault Diagnosis Based on Evidence Theory

    PubMed Central

    Yuan, Kaijuan; Xiao, Fuyuan; Fei, Liguo; Kang, Bingyi; Deng, Yong

    2016-01-01

    Sensor data fusion plays an important role in fault diagnosis. Dempster–Shafer (D-S) evidence theory is widely used in fault diagnosis, since it is efficient to combine evidence from different sensors. However, when the evidence is highly conflicting, it may produce a counterintuitive result. To address the issue, a new method is proposed in this paper. Not only the static sensor reliability, but also the dynamic sensor reliability are taken into consideration. The evidence distance function and the belief entropy are combined to obtain the dynamic reliability of each sensor report. A weighted averaging method is adopted to modify the conflict evidence by assigning different weights to evidence according to sensor reliability. The proposed method has better performance in conflict management and fault diagnosis due to the fact that the information volume of each sensor report is taken into consideration. An application in fault diagnosis based on sensor fusion is illustrated to show the efficiency of the proposed method. The results show that the proposed method improves the accuracy of fault diagnosis from 81.19% to 89.48% compared to the existing methods. PMID:26797611
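
    A minimal sketch of the fusion scheme the abstract outlines, with hypothetical mass values: conflicting sensor reports are discounted by reliability-weighted averaging before Dempster's rule is applied. The entropy-based dynamic weights of the paper are replaced by fixed weights here.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts: frozenset -> mass) over the same
    frame of discernment with Dempster's rule, renormalizing conflict."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y            # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict; Dempster's rule is undefined")
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

def weighted_average(masses, weights):
    """Discount conflicting reports by averaging them with reliability
    weights (assumed to sum to 1) before combination."""
    avg = {}
    for m, w in zip(masses, weights):
        for s, v in m.items():
            avg[s] = avg.get(s, 0.0) + w * v
    return avg

A, B = frozenset("A"), frozenset("B")     # two candidate fault modes
m1 = {A: 0.9, B: 0.1}                     # reliable sensor points to fault A
m2 = {B: 1.0}                             # conflicting (possibly faulty) report
avg = weighted_average([m1, m2], [0.8, 0.2])   # {A: 0.72, B: 0.28}
# Two reports were averaged, so the average is combined with itself once.
fused = dempster_combine(avg, avg)
```

    Direct Dempster combination of m1 and m2 would put all mass on B; the weighted averaging keeps the verdict with the more reliable sensor.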

  1. Organic unity theory: an integrative mind-body theory for psychiatry.

    PubMed

    Goodman, A

    1997-12-01

    The potential of psychiatry as an integrative science has been impeded by an internal schism that derives from the duality of mental and physical. Organic unity theory is proposed as a conceptual framework that brings together the terms of the mind-body duality in one coherent perspective. Organic unity theory is braided of three strands: identity, which describes the relationship between mentally described events and corresponding physically described events; continuity, which describes the linguistic-conceptual system that contains both mental and physical terms; and dialectic, which describes the relationship between the empirical way of knowing that is associated with the physical domain of the linguistic-conceptual system and the hermeneutic way of knowing that is associated with the mental domain. Each strand represents an integrative formulation that resolves an aspect of mental-physical dualism into an underlying unity. After the theory is presented, its implications for psychiatry are briefly considered.

  2. Educational Management Organizations as High Reliability Organizations: A Study of Victory's Philadelphia High School Reform Work

    ERIC Educational Resources Information Center

    Thomas, David E.

    2013-01-01

    This executive position paper proposes recommendations for designing reform models between public and private sectors dedicated to improving school reform work in low performing urban high schools. It reviews scholarly research about for-profit educational management organizations, high reliability organizations, American high school reform, and…

  3. Henry's Constants of Persistent Organic Pollutants by a Group-Contribution Method Based on Scaled-Particle Theory.

    PubMed

    Razdan, Neil K; Koshy, David M; Prausnitz, John M

    2017-11-07

    A group-contribution method based on scaled-particle theory was developed to predict Henry's constants for six families of persistent organic pollutants: polychlorinated benzenes, polychlorinated biphenyls, polychlorinated dibenzodioxins, polychlorinated dibenzofurans, polychlorinated naphthalenes, and polybrominated diphenyl ethers. The group-contribution model uses limited experimental data to obtain group-interaction parameters for an easy-to-use method to predict Henry's constants for systems where reliable experimental data are scarce. By using group-interaction parameters obtained from data reduction, scaled-particle theory gives the partial molar Gibbs energy of dissolution, Δg̅₂, allowing calculation of Henry's constant, H₂, for more than 700 organic pollutants. The average deviation between predicted values of log H₂ and experiment is 4%. Application of an approximate van't Hoff equation gives the temperature dependence of Henry's constants for polychlorinated biphenyls, polychlorinated naphthalenes, and polybrominated diphenyl ethers in the environmentally relevant range 0-40 °C.
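
    The approximate van't Hoff extrapolation mentioned at the end of the abstract fits in a few lines. The sign convention and the numerical values below are illustrative assumptions, not values from the paper.

```python
import math

R = 8.314  # gas constant, J/(mol·K)

def henry_vant_hoff(H_ref, T_ref, T, dH_sol):
    """Extrapolate a Henry's constant from T_ref to T (kelvin) with the
    approximate van't Hoff equation, assuming a temperature-independent
    enthalpy of dissolution dH_sol (J/mol):

        ln(H / H_ref) = -(dH_sol / R) * (1/T - 1/T_ref)

    The returned H carries whatever units H_ref carries."""
    return H_ref * math.exp(-(dH_sol / R) * (1.0 / T - 1.0 / T_ref))

# Hypothetical numbers: with a positive dH_sol, this convention makes the
# Henry's constant grow as temperature rises from 25 °C to 35 °C.
H25 = 1.0e-4                                   # assumed value at 298.15 K
H35 = henry_vant_hoff(H25, 298.15, 308.15, 5.0e4)
```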

  4. 75 FR 14097 - Revision to Electric Reliability Organization Definition of Bulk Electric System

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-24

    ... Commission 18 CFR Part 40 [Docket No. RM09-18-000; 130 FERC ] 61,204] Revision to Electric Reliability... Reliability Organization (ERO) to revise its definition of the term ``bulk electric system'' to include all... compliance with mandatory Reliability Standards. The Commission believes that a 100 kV threshold for...

  5. Generalizability Theory Reliability of Written Expression Curriculum-Based Measurement in Universal Screening

    ERIC Educational Resources Information Center

    Keller-Margulis, Milena A.; Mercer, Sterett H.; Thomas, Erin L.

    2016-01-01

    The purpose of this study was to examine the reliability of written expression curriculum-based measurement (WE-CBM) in the context of universal screening from a generalizability theory framework. Students in second through fifth grade (n = 145) participated in the study. The sample included 54% female students, 49% White students, 23% African…

  6. Influencing organizations to promote health: applying stakeholder theory.

    PubMed

    Kok, Gerjo; Gurabardhi, Zamira; Gottlieb, Nell H; Zijlstra, Fred R H

    2015-04-01

    Stakeholder theory may help health promoters to make changes at the organizational and policy level to promote health. A stakeholder is any individual, group, or organization that can influence an organization. The organization that is the focus for influence attempts is called the focal organization. The more salient a stakeholder is and the more central in the network, the stronger the influence. As stakeholders, health promoters may use communicative, compromise, deinstitutionalization, or coercive methods through an ally or a coalition. A hypothetical case study, involving adolescent use of harmful legal products, illustrates the process of applying stakeholder theory to strategic decision making. © 2015 Society for Public Health Education.

  7. Are We Hoping For A Bounce A Study On Resilience And Human Relations In A High Reliability Organization

    DTIC Science & Technology

    2016-03-01

    Are We Hoping for a Bounce? A Study on Resilience and Human Relations in a High Reliability Organization, by Robert D. Johns, March 2016. This study analyzes the various resilience factors associated with a military high reliability organization (HRO). The data measuring...

  8. High Reliability Organizations--Medication Safety.

    PubMed

    Yip, Luke; Farmer, Brenna

    2015-06-01

    High reliability organizations (HROs), such as the aviation industry, successfully engage in high-risk endeavors and have a low incidence of adverse events. HROs have a preoccupation with failure and errors. They analyze each event to effect system-wide change in an attempt to mitigate the occurrence of similar errors. The healthcare industry can adapt HRO practices, specifically with regard to teamwork and communication. Crew resource management concepts can be adapted to healthcare with the use of certain tools such as checklists and the sterile cockpit to reduce medication errors. HROs also use the Swiss cheese model to evaluate risk and look for vulnerabilities in multiple protective barriers, instead of focusing on one failure. This model can be used in medication safety to evaluate medication management in addition to using the teamwork and communication tools of HROs.

  9. Human Resource Management, Computers, and Organization Theory.

    ERIC Educational Resources Information Center

    Garson, G. David

    In an attempt to provide a framework for research and theory building in public management information systems (PMIS), state officials responsible for computing in personnel operations were surveyed. The data were applied to hypotheses arising from a recent model by Bozeman and Bretschneider, attempting to relate organization theory to management…

  10. A Holistic Equilibrium Theory of Organization Development

    ERIC Educational Resources Information Center

    Yang, Baiyin; Zheng, Wei

    2005-01-01

    This paper proposes a holistic equilibrium theory of organizational development (OD). The theory states that there are three driving forces in organizational change and development--rationality, reality, and liberty. OD can be viewed as a planned process of change in an organization so as to establish equilibrium among these three interacting…

  11. Reliable change of the sensory organization test.

    PubMed

    Broglio, Steven P; Ferrara, Michael S; Sopiarz, Kay; Kelly, Michael S

    2008-03-01

    To establish the sensitivity and specificity of the NeuroCom Sensory Organization Test (SOT) and provide practitioners with cut-scores for clinical decision making using estimates of reliable change. Retrospective cohort study. Research laboratory. Healthy (n = 66) and concussed (n = 63) young adult participants. Postural control assessments on the NeuroCom SOT were completed twice (baseline and follow-up) for both groups. Postconcussion assessments were administered within 24 hours of injury diagnosis. The reliable change technique was used to calculate cut-scores for each SOT variable (composite balance; somatosensory, visual, and vestibular ratios) at the 95%, 90%, 85%, 80%, 75%, and 70% confidence interval levels. When cut-scores were applied to the post-concussion evaluations, sensitivity and specificity varied with SOT variable and confidence interval. An evaluation for change on one or more SOT variables resulted in the highest combined sensitivity (57%) and specificity (80%) at the 75% confidence interval. Use of reliable change scores to detect significant changes in performance on the SOT resulted in decreased sensitivity and improved specificity compared to a previous report. These findings indicate that some concussed athletes may not show large changes in postconcussion postural control, and this postural control evaluation should not be used to the exclusion of other assessment techniques. The postural control assessment should be combined with other evaluative measures to gain the highest sensitivity to concussive injuries.
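
    The reliable change cut-scores described above follow from the classical formula SEM = SD·sqrt(1 − r). The SD, retest correlation, and z value below are hypothetical, with z ≈ 1.15 standing in for the two-sided 75% confidence level at which the abstract reports the best sensitivity/specificity trade-off.

```python
import math

def reliable_change_cutoff(sd_baseline, test_retest_r, z):
    """Minimum baseline-to-follow-up difference needed to call a change
    'reliable' at a given confidence level:
        SEM     = SD * sqrt(1 - r)        (standard error of measurement)
        SE_diff = sqrt(2) * SEM           (standard error of the difference)
        cutoff  = z * SE_diff
    """
    sem = sd_baseline * math.sqrt(1.0 - test_retest_r)
    se_diff = math.sqrt(2.0) * sem
    return z * se_diff

# Hypothetical SOT-variable values: SD = 5.0, test-retest r = 0.80;
# z = 1.15 is approximately the two-sided 75% confidence level.
cut = reliable_change_cutoff(5.0, 0.80, 1.15)
```

    A follow-up score more than `cut` points below baseline would then be flagged as a reliable decline rather than measurement noise.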

  12. Application of SAW method for multiple-criteria comparative analysis of the reliability of heat supply organizations

    NASA Astrophysics Data System (ADS)

    Akhmetova, I. G.; Chichirova, N. D.

    2016-12-01

    Heat supply is the most energy-consuming sector of the economy. Approximately 30% of all primary fuel-and-energy resources used is spent on municipal heat-supply needs. One of the key indicators of activity of heat-supply organizations is the reliability of an energy facility. The reliability index of a heat supply organization is of interest to potential investors for assessing risks when investing in projects. The reliability indices established by the federal legislation are actually reduced to a single numerical factor, which depends on the number of heat-supply outages in connection with disturbances in operation of heat networks and the volume of their resource recovery in the calculation year. This factor is rather subjective and may change in a wide range during several years. A technique is proposed for evaluating the reliability of heat-supply organizations with the use of the simple additive weighting (SAW) method. The technique for integrated-index determination satisfies the following conditions: the reliability level of the evaluated heat-supply system is represented maximally fully and objectively; the information used for the reliability-index evaluation is easily available (it is published on the Internet in accordance with demands of data-disclosure standards). For reliability estimation of heat-supply organizations, the following indicators were selected: the wear of equipment of thermal energy sources, the wear of heat networks, the number of outages of supply of thermal energy (heat carrier) due to technological disturbances on heat networks, per 1 km of heat networks, the number of outages of supply of thermal energy (heat carrier) due to technological disturbances on thermal energy sources, per 1 Gcal/h of installed power, the share of expenditures in the cost of thermal energy aimed at recovery of the resource (renewal of fixed assets), the coefficient of renewal of fixed assets, and the coefficient of fixed asset retirement. A versatile program is developed
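
    The SAW computation itself is straightforward; below is a sketch with invented data for three organizations and three of the criteria listed above. The criterion weights are assumptions, not values from the paper.

```python
def saw_scores(matrix, weights, benefit):
    """Simple additive weighting over a decision matrix.
    matrix[i][j]: value of criterion j for organization i.
    weights[j]:   criterion weights, assumed to sum to 1.
    benefit[j]:   True if larger is better (e.g. renewal coefficient),
                  False if smaller is better (e.g. wear, outage counts).
    Values are normalized column-wise to [0, 1] before the weighted sum."""
    scores = [0.0] * len(matrix)
    for j, w in enumerate(weights):
        col = [row[j] for row in matrix]
        lo, hi = min(col), max(col)
        for i, v in enumerate(col):
            if hi == lo:
                norm = 1.0
            else:
                norm = (v - lo) / (hi - lo) if benefit[j] else (hi - v) / (hi - lo)
            scores[i] += w * norm
    return scores

# Three hypothetical heat-supply organizations; criteria: network wear (%),
# outages per km of network, coefficient of renewal of fixed assets.
matrix = [[60, 0.5, 0.02],
          [40, 0.2, 0.05],
          [70, 0.9, 0.01]]
scores = saw_scores(matrix, [0.4, 0.4, 0.2], [False, False, True])
best = scores.index(max(scores))   # organization 1 ranks highest
```

    Min-max normalization is one of several choices; normalizing against a fixed reference value instead would keep scores comparable across years.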

  13. Using G-Theory to Enhance Evidence of Reliability and Validity for Common Uses of the Paulhus Deception Scales.

    PubMed

    Vispoel, Walter P; Morris, Carrie A; Kilinc, Murat

    2018-01-01

    We applied a new approach to Generalizability theory (G-theory) involving parallel splits and repeated measures to evaluate common uses of the Paulhus Deception Scales based on polytomous and four types of dichotomous scoring. G-theory indices of reliability and validity accounting for specific-factor, transient, and random-response measurement error supported use of polytomous over dichotomous scores as contamination checks; as control, explanatory, and outcome variables; as aspects of construct validation; and as indexes of environmental effects on socially desirable responding. Polytomous scoring also provided results for flagging faking as dependable as those when using dichotomous scoring methods. These findings argue strongly against the nearly exclusive use of dichotomous scoring for the Paulhus Deception Scales in practice and underscore the value of G-theory in demonstrating this. We provide guidelines for applying our G-theory techniques to other objectively scored clinical assessments, for using G-theory to estimate how changes to a measure might improve reliability, and for obtaining software to conduct G-theory analyses free of charge.

  14. Reliability measures in item response theory: manifest versus latent correlation functions.

    PubMed

    Milanzi, Elasma; Molenberghs, Geert; Alonso, Ariel; Verbeke, Geert; De Boeck, Paul

    2015-02-01

    For item response theory (IRT) models, which belong to the class of generalized linear or non-linear mixed models, reliability at the scale of observed scores (i.e., manifest correlation) is more difficult to calculate than latent correlation based reliability, but usually of greater scientific interest. This is not least because it cannot be calculated explicitly when the logit link is used in conjunction with normal random effects. As such, approximations such as Fisher's information coefficient, Cronbach's α, or the latent correlation are calculated, allegedly because it is easy to do so. Cronbach's α has well-known and serious drawbacks, Fisher's information is not meaningful under certain circumstances, and there is an important but often overlooked difference between latent and manifest correlations. Here, manifest correlation refers to correlation between observed scores, while latent correlation refers to correlation between scores at the latent (e.g., logit or probit) scale. Thus, using one in place of the other can lead to erroneous conclusions. Taylor series based reliability measures, which are based on manifest correlation functions, are derived and a careful comparison of reliability measures based on latent correlations, Fisher's information, and exact reliability is carried out. The latent correlations are virtually always considerably higher than their manifest counterparts, Fisher's information measure shows no coherent behaviour (it is even negative in some cases), while the newly introduced Taylor series based approximations reflect the exact reliability very closely. Comparisons among the various types of correlations, for various IRT models, are made using algebraic expressions, Monte Carlo simulations, and data analysis. Given the light computational burden and the performance of Taylor series based reliability measures, their use is recommended. © 2014 The British Psychological Society.
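
    The latent-versus-manifest gap the abstract stresses is easy to reproduce by simulation: under a Rasch-type model, two parallel forms that share the same latent trait have a latent correlation of exactly 1, yet the correlation between observed sum scores is attenuated by response noise. A self-contained sketch with assumed item difficulties (not the paper's Taylor series approximation):

```python
import math
import random

def pearson(x, y):
    """Plain Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

def simulate(n_persons=2000, n_items=20, seed=7):
    """Each person answers two parallel 20-item forms with the same latent
    theta ~ N(0, 1); item difficulties are spread over [-2, 2]. Returns the
    manifest correlation between the two observed sum scores."""
    rng = random.Random(seed)
    items = [-2 + 4 * j / (n_items - 1) for j in range(n_items)]
    s1, s2 = [], []
    for _ in range(n_persons):
        theta = rng.gauss(0, 1)
        form = lambda: sum(rng.random() < 1 / (1 + math.exp(-(theta - b)))
                           for b in items)
        s1.append(form())
        s2.append(form())
    return pearson(s1, s2)

r_manifest = simulate()   # noticeably below the latent correlation of 1.0
```

    Treating the latent correlation (here 1.0) as if it were the manifest one would overstate the test-retest agreement of observed scores, which is the substitution error the paper warns against.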

  15. Influencing Organizations to Promote Health: Applying Stakeholder Theory

    ERIC Educational Resources Information Center

    Kok, Gerjo; Gurabardhi, Zamira; Gottlieb, Nell H.; Zijlstra, Fred R. H.

    2015-01-01

    Stakeholder theory may help health promoters to make changes at the organizational and policy level to promote health. A stakeholder is any individual, group, or organization that can influence an organization. The organization that is the focus for influence attempts is called the focal organization. The more salient a stakeholder is and the more…

  16. 18 CFR 39.4 - Funding of the Electric Reliability Organization.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Reliability Organization. 39.4 Section 39.4 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT RULES CONCERNING... interruption as it transitions from one method of funding to another. Any proposed transitional funding plan...

  17. 18 CFR 39.4 - Funding of the Electric Reliability Organization.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Reliability Organization. 39.4 Section 39.4 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT RULES CONCERNING... interruption as it transitions from one method of funding to another. Any proposed transitional funding plan...

  18. 18 CFR 39.4 - Funding of the Electric Reliability Organization.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... Reliability Organization. 39.4 Section 39.4 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT RULES CONCERNING... interruption as it transitions from one method of funding to another. Any proposed transitional funding plan...

  19. 18 CFR 39.4 - Funding of the Electric Reliability Organization.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Reliability Organization. 39.4 Section 39.4 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT RULES CONCERNING... interruption as it transitions from one method of funding to another. Any proposed transitional funding plan...

  20. Understanding Schools as High-Reliability Organizations: An Exploratory Examination of Teachers' and School Leaders' Perceptions of Success

    ERIC Educational Resources Information Center

    Lorton, Juli A.; Bellamy, G. Thomas; Reece, Anne; Carlson, Jill

    2013-01-01

    Drawing on research on high-reliability organizations, this interview-based qualitative case study employs four characteristics of such organizations as a lens for analyzing the operations of one very successful K-5 public school. Results suggest that the school had processes similar to those characteristic of high-reliability organizations: a…

  1. Reliability of the Measure of Acceptance of the Theory of Evolution (MATE) Instrument with University Students

    ERIC Educational Resources Information Center

    Rutledge, Michael L.; Sadler, Kim C.

    2007-01-01

    The Measure of Acceptance of the Theory of Evolution (MATE) instrument was initially designed to assess high school biology teachers' acceptance of evolutionary theory. To determine if the MATE instrument is reliable with university students, it was administered to students in a non-majors biology course (n = 61) twice over a 3-week period.…

  2. Measuring theory of mind across middle childhood: Reliability and validity of the Silent Films and Strange Stories tasks.

    PubMed

    Devine, Rory T; Hughes, Claire

    2016-09-01

    Recent years have seen a growth of research on the development of children's ability to reason about others' mental states (or "theory of mind") beyond the narrow confines of the preschool period. The overall aim of this study was to investigate the psychometric properties of a task battery composed of items from Happé's Strange Stories task and Devine and Hughes' Silent Film task. A sample of 460 ethnically and socially diverse children (211 boys) between 7 and 13 years of age completed the task battery at two time points separated by 1 month. The Strange Stories and Silent Film tasks were strongly correlated even when verbal ability and narrative comprehension were taken into account, and all items loaded onto a single theory-of-mind latent factor. The theory-of-mind latent factor provided reliable estimates of performance across a wide range of theory-of-mind ability and showed no evidence of differential item functioning across gender, ethnicity, or socioeconomic status. The theory-of-mind latent factor also exhibited strong 1-month test-retest reliability, and this stability did not vary as a function of child characteristics. Taken together, these findings provide evidence for the validity and reliability of the Strange Stories and Silent Film task battery as a measure of individual differences in theory of mind suitable for use across middle childhood. We consider the methodological and conceptual implications of these findings for research on theory of mind beyond the preschool years. Copyright © 2015 Elsevier Inc. All rights reserved.

  3. Organizations or Communities? Changing the Metaphor Changes the Theory.

    ERIC Educational Resources Information Center

    Sergiovanni, Thomas J.

    Educational administration has been shaped by the metaphor of organization. From organizational and management theory, and from economics, the parent of organizational theory, educational administration has borrowed definitions of quality, productivity, and efficiency; strategies to achieve them; and theories of human nature and motivation.…

  4. The "New Institutionalism" in Organization Theory: Bringing Society and Culture Back in

    ERIC Educational Resources Information Center

    Senge, Konstanze

    2013-01-01

    This investigation will discuss the emergence of an economistical perspective among the dominant approaches of organization theory in the United States since the inception of "organization studies" as an academic discipline. It maintains that Contingency theory, Resource Dependency theory, Population Ecology theory, and Transaction Cost theory…

  5. ERP Reliability Analysis (ERA) Toolbox: An open-source toolbox for analyzing the reliability of event-related brain potentials.

    PubMed

    Clayson, Peter E; Miller, Gregory A

    2017-01-01

    Generalizability theory (G theory) provides a flexible, multifaceted approach to estimating score reliability. G theory's approach to estimating score reliability has important advantages over classical test theory that are relevant for research using event-related brain potentials (ERPs). For example, G theory does not require parallel forms (i.e., equal means, variances, and covariances), can handle unbalanced designs, and provides a single reliability estimate for designs with multiple sources of error. This monograph provides a detailed description of the conceptual framework of G theory using examples relevant to ERP researchers, presents the algorithms needed to estimate ERP score reliability, and provides a detailed walkthrough of newly-developed software, the ERP Reliability Analysis (ERA) Toolbox, that calculates score reliability using G theory. The ERA Toolbox is open-source, Matlab software that uses G theory to estimate the contribution of the number of trials retained for averaging, group, and/or event types on ERP score reliability. The toolbox facilitates the rigorous evaluation of psychometric properties of ERP scores recommended elsewhere in this special issue. Copyright © 2016 Elsevier B.V. All rights reserved.
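
    A one-facet persons × trials G study, the simplest case of the designs such a toolbox handles, can be sketched from the ANOVA mean squares. This is a generic illustration, not the ERA Toolbox algorithm, and the ratings below are invented.

```python
def g_study(scores):
    """One-facet persons x trials G study. scores[p][t] is the score for
    person p on trial t. Returns the estimated variance components
    (persons, trials, residual) and the generalizability coefficient for
    the mean over the observed number of trials."""
    n_p, n_t = len(scores), len(scores[0])
    grand = sum(map(sum, scores)) / (n_p * n_t)
    p_means = [sum(row) / n_t for row in scores]
    t_means = [sum(scores[p][t] for p in range(n_p)) / n_p for t in range(n_t)]
    ss_p = n_t * sum((m - grand) ** 2 for m in p_means)
    ss_t = n_p * sum((m - grand) ** 2 for m in t_means)
    ss_tot = sum((scores[p][t] - grand) ** 2
                 for p in range(n_p) for t in range(n_t))
    ss_pt = ss_tot - ss_p - ss_t
    ms_p = ss_p / (n_p - 1)
    ms_t = ss_t / (n_t - 1)
    ms_pt = ss_pt / ((n_p - 1) * (n_t - 1))
    var_pt = ms_pt                          # residual (p x t interaction + error)
    var_p = max((ms_p - ms_pt) / n_t, 0.0)  # person (universe-score) variance
    var_t = max((ms_t - ms_pt) / n_p, 0.0)  # trial variance
    g_coef = var_p / (var_p + var_pt / n_t) if var_p + var_pt else 0.0
    return var_p, var_t, var_pt, g_coef

# Invented example: 4 persons (e.g. ERP amplitudes) scored on 3 trials.
ratings = [[8, 9, 8], [4, 5, 5], [6, 7, 6], [2, 3, 2]]
var_p, var_t, var_pt, g_coef = g_study(ratings)
```

    Re-evaluating `g_coef` with different divisors in `var_pt / n_t` is the D-study step: it projects how reliability would change if more or fewer trials were retained for averaging.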

  6. Building New Bridges: Linking Organization Theory with Other Educational Literatures

    ERIC Educational Resources Information Center

    Johnson, Bob L., Jr.; Owens, Michael

    2005-01-01

    Purpose: This paper provides an example of how organization theory can be linked with other literatures in a complementary and productive manner. Establishing a bridge between the organization theory and learning environment literatures, the authors seek to provide an example of how such literature-bridging can enrich our understanding of the…

  7. Reliability and validity of advanced theory-of-mind measures in middle childhood and adolescence.

    PubMed

    Hayward, Elizabeth O; Homer, Bruce D

    2017-09-01

    Although theory-of-mind (ToM) development is well documented for early childhood, there is increasing research investigating changes in ToM reasoning in middle childhood and adolescence. However, the psychometric properties of most advanced ToM measures for use with older children and adolescents have not been firmly established. We report on the reliability and validity of widely used, conventional measures of advanced ToM with this age group. Notable issues with both reliability and validity of several of the measures were evident in the findings. With regard to construct validity, results do not reveal a clear empirical commonality between tasks, and, after accounting for comprehension, developmental trends were evident in only one of the tasks investigated. Statement of contribution: What is already known on this subject? Second-order false belief tasks have acceptable internal consistency. The Eyes Test has poor internal consistency. Validity of advanced theory-of-mind tasks is often based on the ability to distinguish clinical from typical groups. What does this study add? This study examines internal consistency across six widely used advanced theory-of-mind tasks. It investigates validity of tasks based on comprehension of items by typically developing individuals. It further assesses construct validity, or commonality between tasks. © 2017 The British Psychological Society.

  8. [Reliability theory based on quality risk network analysis for Chinese medicine injection].

    PubMed

    Li, Zheng; Kang, Li-Yuan; Fan, Xiao-Hui

    2014-08-01

A new risk analysis method based on reliability theory is introduced in this paper for the quality risk management of Chinese medicine injection manufacturing plants. The risk events, including both cause and effect events, were derived in the framework as nodes with a Bayesian network analysis approach. It thus transforms the risk analysis results from failure mode and effect analysis (FMEA) into a Bayesian network platform. With its structure and parameters determined, the network can be used to evaluate the system reliability quantitatively with probabilistic analytical approaches. Using network analysis tools such as GeNie and AgenaRisk, we are able to find the nodes that are most critical to the system reliability. The importance of each node to the system can be quantitatively evaluated by calculating the effect of the node on the overall risk, and a minimization plan can be determined accordingly to reduce their influences and improve the system reliability. Using the Shengmai injection manufacturing plant of SZYY Ltd as a use case, we analyzed the quality risk with both static FMEA analysis and dynamic Bayesian network analysis. The potential risk factors for the quality of Shengmai injection manufacturing were identified with the network analysis platform. Quality assurance actions were further defined to reduce the risk and improve the product quality.
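The FMEA-to-Bayesian-network mapping described in this record can be sketched in miniature. The following pure-Python fragment is a hypothetical illustration, not the paper's actual model: a single binary cause node feeding a binary effect node, with the marginal failure probability computed by enumeration and a crude node-importance measure of the kind used to rank critical nodes.

```python
# Hypothetical two-node network: a binary FMEA cause feeding a
# binary quality-failure effect (illustrative numbers only).

def p_effect(p_cause, p_eff_given_cause, p_eff_given_no_cause):
    """Marginal failure probability, by enumeration over the cause."""
    return (p_cause * p_eff_given_cause
            + (1 - p_cause) * p_eff_given_no_cause)

def node_criticality(p_eff_given_cause, p_eff_given_no_cause):
    """Crude importance measure: how much forcing the cause on
    versus off moves the failure probability."""
    return p_eff_given_cause - p_eff_given_no_cause

baseline_risk = p_effect(0.05, 0.60, 0.02)
```

In a real plant model, tools such as GeNie or AgenaRisk perform this enumeration over many interconnected nodes; the ranking idea, however, is the same.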

  9. Inter-Observer, Intra-Observer and Intra-Individual Reliability of Uroflowmetry Tests in Aged Men: A Generalizability Theory Approach.

    PubMed

    Liu, Ying-Buh; Yang, Stephen S; Hsieh, Cheng-Hsing; Lin, Chia-Da; Chang, Shang-Jen

    2014-05-01

To evaluate the inter-observer, intra-observer and intra-individual reliability of uroflowmetry and post-void residual urine (PVR) tests in adult men. Healthy volunteers aged over 40 years were enrolled. Every participant underwent two sets of uroflowmetry and PVR tests with a 2-week interval between the tests. The uroflowmetry tests were interpreted by four urologists independently. Uroflowmetry curves were classified as bell-shaped, bell-shaped with tail, obstructive, restrictive, staccato, interrupted and tower-shaped and scored from 1 (highly abnormal) to 5 (absolutely normal). The agreements between the observers, interpretations and tests within individuals were analyzed using kappa statistics and intraclass correlation coefficients. Generalizability theory with decision analysis was used to determine how many observers, tests, and interpretations were needed to obtain an acceptable reliability (> 0.80). Of 108 volunteers, we randomly selected the uroflowmetry results from 25 participants for the evaluation of reliability. The mean age of the studied adults was 55.3 years. The intra-individual and intra-observer reliability on uroflowmetry tests ranged from good to very good. However, the inter-observer reliability on normalcy and on the specific type of flow pattern was relatively lower. In generalizability theory, three observers were needed to obtain an acceptable reliability on normalcy of uroflow pattern if the patient underwent uroflowmetry tests twice with one observation. The intra-individual and intra-observer reliability on uroflowmetry tests was good, while the inter-observer reliability was relatively lower. To improve inter-observer reliability, the definition of uroflowmetry should be clarified by the International Continence Society. © 2013 Wiley Publishing Asia Pty Ltd.
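The kappa statistic used in this record for observer agreement is easy to state in code. A minimal pure-Python implementation of Cohen's kappa for two observers follows; the curve-pattern labels are invented for the example and are not the study's data.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters classifying the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # observed proportion of agreement
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # chance agreement from each rater's marginal label frequencies
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    cats = set(freq_a) | set(freq_b)
    p_exp = sum((freq_a[c] / n) * (freq_b[c] / n) for c in cats)
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical curve classifications from two observers:
obs_1 = ["bell", "bell", "obstructive", "obstructive"]
obs_2 = ["bell", "bell", "obstructive", "bell"]
```

With these toy labels the observers agree on 3 of 4 curves, but kappa discounts the agreement expected by chance, which is why it is preferred for inter-observer reliability.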

  10. Carcinogenesis explained within the context of a theory of organisms.

    PubMed

    Sonnenschein, Carlos; Soto, Ana M

    2016-10-01

    For a century, the somatic mutation theory (SMT) has been the prevalent theory to explain carcinogenesis. According to the SMT, cancer is a cellular problem, and thus, the level of organization where it should be studied is the cellular level. Additionally, the SMT proposes that cancer is a problem of the control of cell proliferation and assumes that proliferative quiescence is the default state of cells in metazoa. In 1999, a competing theory, the tissue organization field theory (TOFT), was proposed. In contraposition to the SMT, the TOFT posits that cancer is a tissue-based disease whereby carcinogens (directly) and mutations in the germ-line (indirectly) alter the normal interactions between the diverse components of an organ, such as the stroma and its adjacent epithelium. The TOFT explicitly acknowledges that the default state of all cells is proliferation with variation and motility. When taking into consideration the principle of organization, we posit that carcinogenesis can be explained as a relational problem whereby release of the constraints created by cell interactions and the physical forces generated by cellular agency lead cells within a tissue to regain their default state of proliferation with variation and motility. Within this perspective, what matters both in morphogenesis and carcinogenesis is not only molecules, but also biophysical forces generated by cells and tissues. Herein, we describe how the principles for a theory of organisms apply to the TOFT and thus to the study of carcinogenesis. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Selecting Organization Development Theory from an HRD Perspective

    ERIC Educational Resources Information Center

    Lynham, Susan A.; Chermack, Thomas J.; Noggle, Melissa A.

    2004-01-01

    As is true for human resource development (HRD), the field of organization development (OD) draws from numerous disciplines to inform its theory base. However, the identification and selection of theory to inform improved practice remains a challenge and begs the question of what can be used to inform and guide one in the identification and…

  12. Two Prophecy Formulas for Assessing the Reliability of Item Response Theory-Based Ability Estimates

    ERIC Educational Resources Information Center

    Raju, Nambury S.; Oshima, T.C.

    2005-01-01

    Two new prophecy formulas for estimating item response theory (IRT)-based reliability of a shortened or lengthened test are proposed. Some of the relationships between the two formulas, one of which is identical to the well-known Spearman-Brown prophecy formula, are examined and illustrated. The major assumptions underlying these formulas are…
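The classical Spearman-Brown prophecy formula referenced in this record, which one of the two proposed formulas reduces to, can be sketched directly; the paper's second, IRT-based formula is not reproduced here.

```python
def spearman_brown(reliability, length_factor):
    """Predicted reliability after changing test length by
    length_factor (2.0 = doubled, 0.5 = halved)."""
    k, r = length_factor, reliability
    return k * r / (1 + (k - 1) * r)

doubled = spearman_brown(0.6, 2.0)   # lengthening raises reliability
halved = spearman_brown(0.75, 0.5)   # shortening lowers it
```

Note the two calls invert each other: doubling a test with reliability 0.6 predicts 0.75, and halving a 0.75-reliable test predicts 0.6 again.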

  13. Organization Theory and Memory for Prose: A Review of the Literature

    ERIC Educational Resources Information Center

    Shimmerlik, Susan M.

    1978-01-01

    Organization theory emphasizes groupings of items on the basis of a variety of characteristics, and the role of the learner as an active processor or encoder of information. Research on organization theory as it is applied to memory and recall of prose is reviewed here. (BW)

  14. Cross Cultural Perspectives of the Learning Organization: Assessing the Validity and Reliability of the DLOQ in Korea

    ERIC Educational Resources Information Center

Song, Ji Hoon; Kim, Jin Yong; Chermack, Thomas J.; Yang, Baiyin

    2008-01-01

    The primary purpose of this research was to adapt the Dimensions of Learning Organization Questionnaire (DLOQ) from Watkins and Marsick (1993, 1996) and examine its validity and reliability in a Korean context. Results indicate that the DLOQ produces valid and reliable scores of learning organization characteristics in a Korean cultural context.…

  15. Using Metaphors to Teach Organization Theory

    ERIC Educational Resources Information Center

    Taber, Tom D.

    2007-01-01

    Metaphors were used to teach systems thinking and to clarify concepts of organizational theory in an introductory MBA management course. Gareth Morgan's metaphors of organization were read by students and applied as frames to analyze a business case. In addition, personal metaphors were written by individual students in order to describe the…

  16. Dielectric properties of organic solvents from non-polarizable molecular dynamics simulation with electronic continuum model and density functional theory.

    PubMed

    Lee, Sanghun; Park, Sung Soo

    2011-11-03

Dielectric constants of electrolytic organic solvents are calculated employing the nonpolarizable Molecular Dynamics simulation with Electronic Continuum (MDEC) model and Density Functional Theory. The molecular polarizabilities are obtained at the B3LYP/6-311++G(d,p) level of theory to estimate high-frequency refractive indices, while the densities and dipole moment fluctuations are computed using nonpolarizable MD simulations. The dielectric constants reproduced by these procedures are shown to provide a reliable approach for estimating the experimental data. In addition, two representative solvents that have similar molecular weights but different dielectric properties, ethyl methyl carbonate and propylene carbonate, are compared using MD simulations, and distinctly different dielectric behaviors are observed at short times as well as at long times.
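The dipole-fluctuation route to a static dielectric constant mentioned in this record can be sketched with the standard fluctuation expression. The sketch below is an illustrative simplification, not the paper's exact MDEC procedure: it treats one Cartesian component of the total cell dipole and omits the MDEC electronic-continuum correction factors.

```python
KB = 1.380649e-23        # Boltzmann constant, J/K
EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m

def static_dielectric(dipole_samples, volume_m3, temp_k, eps_inf):
    """Static dielectric constant from fluctuations of one component
    of the total cell dipole (samples in C*m). Simplified textbook
    form; MDEC correction factors are deliberately omitted."""
    n = len(dipole_samples)
    mean = sum(dipole_samples) / n
    mean_sq = sum(m * m for m in dipole_samples) / n
    variance = mean_sq - mean * mean
    return eps_inf + variance / (EPS0 * volume_m3 * KB * temp_k)
```

The key qualitative point survives the simplification: a cell whose total dipole fluctuates more strongly yields a larger static dielectric constant, while a frozen dipole reduces to the high-frequency value eps_inf.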

  17. Validity and Reliability of Published Comprehensive Theory of Mind Tests for Normal Preschool Children: A Systematic Review.

    PubMed

    Ziatabar Ahmadi, Seyyede Zohreh; Jalaie, Shohreh; Ashayeri, Hassan

    2015-09-01

Theory of mind (ToM) or mindreading is an aspect of social cognition that evaluates mental states and beliefs of oneself and others. Validity and reliability are very important criteria when evaluating standard tests; and without them, these tests are not usable. The aim of this study was to systematically review the validity and reliability of published English comprehensive ToM tests developed for normal preschool children. We searched MEDLINE (PubMed interface), Web of Science, ScienceDirect, PsycINFO, and also evidence-based medicine (The Cochrane Library) databases from 1990 to June 2015. Search strategy was Latin transcription of 'Theory of Mind' AND test AND children. Also, we manually studied the reference lists of all final searched articles and carried out a search of their references. Inclusion criteria were as follows: Valid and reliable diagnostic ToM tests published from 1990 to June 2015 for normal preschool children; and exclusion criteria were as follows: the studies that only used ToM tests and single tasks (false belief tasks) for ToM assessment and/or had no description about structure, validity or reliability of their tests. Methodological quality of the selected articles was assessed using the Critical Appraisal Skills Programme (CASP). In primary searching, we found 1237 articles in total databases. After removing duplicates and applying all inclusion and exclusion criteria, we selected 11 tests for this systematic review. There were a few valid, reliable and comprehensive ToM tests for normal preschool children. However, we had limitations concerning the included articles. The defined ToM tests were different in populations, tasks, mode of presentations, scoring, mode of responses, times and other variables. Also, they had various validities and reliabilities. Therefore, it is recommended that the researchers and clinicians select the ToM tests according to their psychometric characteristics, validity and reliability.

  18. Validity and Reliability of Published Comprehensive Theory of Mind Tests for Normal Preschool Children: A Systematic Review

    PubMed Central

    Ziatabar Ahmadi, Seyyede Zohreh; Jalaie, Shohreh; Ashayeri, Hassan

    2015-01-01

Objective: Theory of mind (ToM) or mindreading is an aspect of social cognition that evaluates mental states and beliefs of oneself and others. Validity and reliability are very important criteria when evaluating standard tests; and without them, these tests are not usable. The aim of this study was to systematically review the validity and reliability of published English comprehensive ToM tests developed for normal preschool children. Method: We searched MEDLINE (PubMed interface), Web of Science, ScienceDirect, PsycINFO, and also evidence-based medicine (The Cochrane Library) databases from 1990 to June 2015. Search strategy was Latin transcription of ‘Theory of Mind’ AND test AND children. Also, we manually studied the reference lists of all final searched articles and carried out a search of their references. Inclusion criteria were as follows: Valid and reliable diagnostic ToM tests published from 1990 to June 2015 for normal preschool children; and exclusion criteria were as follows: the studies that only used ToM tests and single tasks (false belief tasks) for ToM assessment and/or had no description about structure, validity or reliability of their tests. Methodological quality of the selected articles was assessed using the Critical Appraisal Skills Programme (CASP). Result: In primary searching, we found 1237 articles in total databases. After removing duplicates and applying all inclusion and exclusion criteria, we selected 11 tests for this systematic review. Conclusion: There were a few valid, reliable and comprehensive ToM tests for normal preschool children. However, we had limitations concerning the included articles. The defined ToM tests were different in populations, tasks, mode of presentations, scoring, mode of responses, times and other variables. Also, they had various validities and reliabilities. Therefore, it is recommended that the researchers and clinicians select the ToM tests according to their psychometric characteristics, validity and reliability.

  19. Reliability Correction for Functional Connectivity: Theory and Implementation

    PubMed Central

    Mueller, Sophia; Wang, Danhong; Fox, Michael D.; Pan, Ruiqi; Lu, Jie; Li, Kuncheng; Sun, Wei; Buckner, Randy L.; Liu, Hesheng

    2016-01-01

    Network properties can be estimated using functional connectivity MRI (fcMRI). However, regional variation of the fMRI signal causes systematic biases in network estimates including correlation attenuation in regions of low measurement reliability. Here we computed the spatial distribution of fcMRI reliability using longitudinal fcMRI datasets and demonstrated how pre-estimated reliability maps can correct for correlation attenuation. As a test case of reliability-based attenuation correction we estimated properties of the default network, where reliability was significantly lower than average in the medial temporal lobe and higher in the posterior medial cortex, heterogeneity that impacts estimation of the network. Accounting for this bias using attenuation correction revealed that the medial temporal lobe’s contribution to the default network is typically underestimated. To render this approach useful to a greater number of datasets, we demonstrate that test-retest reliability maps derived from repeated runs within a single scanning session can be used as a surrogate for multi-session reliability mapping. Using data segments with different scan lengths between 1 and 30 min, we found that test-retest reliability of connectivity estimates increases with scan length while the spatial distribution of reliability is relatively stable even at short scan lengths. Finally, analyses of tertiary data revealed that reliability distribution is influenced by age, neuropsychiatric status and scanner type, suggesting that reliability correction may be especially important when studying between-group differences. Collectively, these results illustrate that reliability-based attenuation correction is an easily implemented strategy that mitigates certain features of fMRI signal nonuniformity. PMID:26493163

  20. 18 CFR 39.10 - Changes to an Electric Reliability Organization Rule or Regional Entity Rule.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 18 Conservation of Power and Water Resources 1 2014-04-01 2014-04-01 false Changes to an Electric Reliability Organization Rule or Regional Entity Rule. 39.10 Section 39.10 Conservation of Power and Water... ESTABLISHMENT, APPROVAL, AND ENFORCEMENT OF ELECTRIC RELIABILITY STANDARDS § 39.10 Changes to an Electric...

  1. 18 CFR 39.10 - Changes to an Electric Reliability Organization Rule or Regional Entity Rule.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Changes to an Electric Reliability Organization Rule or Regional Entity Rule. 39.10 Section 39.10 Conservation of Power and Water... ESTABLISHMENT, APPROVAL, AND ENFORCEMENT OF ELECTRIC RELIABILITY STANDARDS § 39.10 Changes to an Electric...

  2. 18 CFR 39.10 - Changes to an Electric Reliability Organization Rule or Regional Entity Rule.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Changes to an Electric Reliability Organization Rule or Regional Entity Rule. 39.10 Section 39.10 Conservation of Power and Water... ESTABLISHMENT, APPROVAL, AND ENFORCEMENT OF ELECTRIC RELIABILITY STANDARDS § 39.10 Changes to an Electric...

  3. 18 CFR 39.10 - Changes to an Electric Reliability Organization Rule or Regional Entity Rule.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Changes to an Electric Reliability Organization Rule or Regional Entity Rule. 39.10 Section 39.10 Conservation of Power and Water... ESTABLISHMENT, APPROVAL, AND ENFORCEMENT OF ELECTRIC RELIABILITY STANDARDS § 39.10 Changes to an Electric...

  4. High reliability organizing implementation at Sequoia and Kings Canyon National Parks

    Treesearch

    David A. Christenson; Mike DeGrosky; Anne E. Black; Brett Fay

    2008-01-01

    It is said that action often precedes cognition. For example, wildland fire management personnel already do things in the course of their work that they will later recognize as consistent with the principles of high reliability organizing (HRO), once they know about those principles. In the case of Sequoia and Kings Canyon National Parks (SEKI), the fire management...

  5. The Progress of Theory in Knowledge Organization.

    ERIC Educational Resources Information Center

    Smiraglia, Richard P.

    2002-01-01

    Presents a background on theory in knowledge organization, which has moved from an epistemic stance of pragmatism and rationalism (based on observation of the construction of retrieval tools), to empiricism (based on the results of empirical research). Discusses historicism, external validity, classification, user-interface design, and…

  6. Teaching organization theory for healthcare management: three applied learning methods.

    PubMed

    Olden, Peter C

    2006-01-01

Organization theory (OT) provides a way of seeing, describing, analyzing, understanding, and improving organizations based on patterns of organizational design and behavior (Daft 2004). It gives managers models, principles, and methods with which to diagnose and fix organization structure, design, and process problems. Health care organizations (HCOs) face serious problems such as fatal medical errors, harmful treatment delays, misuse of scarce nurses, costly inefficiency, and service failures. Some of health care managers' most critical work involves designing and structuring their organizations so their missions, visions, and goals can be achieved, and in some cases so their organizations can survive. Thus, it is imperative that graduate healthcare management programs develop effective approaches for teaching OT to students who will manage HCOs. Guided by principles of education, three applied teaching/learning activities/assignments were created to teach OT in a graduate healthcare management program. These educational methods develop students' competency with OT applied to HCOs. The teaching techniques in this article may be useful to faculty teaching graduate courses in organization theory and related subjects such as leadership, quality, and operations management.

  7. Compliance and High Reliability in a Complex Healthcare Organization.

    PubMed

    Simon, Maxine dellaBadia

    2018-01-01

    When considering the impact of regulation on healthcare, visualize a spider's web. The spider weaves sections together to create the whole, with each fiber adding to the structure to support its success or lead to its failure. Each section is dependent on the others, and all must be aligned to maintain the structure. Outside forces can cause a shift in the web's fragile equilibrium.The interdependence of the sections of the spider's web is similar to the way hospital departments and services work together. An organization's structure must be shaped to support its mission and vision. At the same time, the business of healthcare requires the development and achievement of operational objectives and financial performance goals. Establishing a culture that is flexible enough to permit creativity, provide resiliency, and manage complexity as the organization grows is fundamental to success. An organization must address each of these factors while maintaining stability, carrying out its mission, and fostering improvement.Nature's order maintains the spider's web. Likewise, regulation can strengthen healthcare organizations by initiating disruptive changes that can support efforts to achieve and sustain high reliability in the delivery of care. To that end, leadership must be willing to provide the necessary vision and resources.

  8. How Settings Change People: Applying Behavior Setting Theory to Consumer-Run Organizations

    ERIC Educational Resources Information Center

    Brown, Louis D.; Shepherd, Matthew D.; Wituk, Scott A.; Meissen, Greg

    2007-01-01

    Self-help initiatives stand as a classic context for organizational studies in community psychology. Behavior setting theory stands as a classic conception of organizations and the environment. This study explores both, applying behavior setting theory to consumer-run organizations (CROs). Analysis of multiple data sets from all CROs in Kansas…

  9. Content-oriented Approach to Organization of Theories and Its Utilization

    NASA Astrophysics Data System (ADS)

Hayashi, Yusuke; Bourdeau, Jacqueline; Mizoguchi, Riichiro

    In spite of the fact that the relation between theory and practice is a foundation of scientific and technological development, the trend of increasing the gap between theory and practice accelerates in these years. The gap embraces a risk of distrust of science and technology. Ontological engineering as the content-oriented research is expected to contribute to the resolution of the gap. This paper presents the feasibility of organization of theoretical knowledge on ontological engineering and new-generation intelligent systems based on it through an application of ontological engineering in the area of learning/instruction support. This area also has the problem of the gap between theory and practice, and its resolution is strongly required. So far we proposed OMNIBUS ontology, which is a comprehensive ontology that covers different learning/instructional theories and paradigms, and SMARTIES, which is a theory-aware and standard-compliant authoring system for making learning/instructional scenarios based on OMNIBUS ontology. We believe the theory-awareness and standard-compliance bridge the gap between theory and practice because it links theories to practical use of standard technologies and enables practitioners to easily enjoy theoretical support while using standard technologies in practice. The following goals are set in order to achieve it; computers (1) understand a variety of learning/instructional theories based on the organization of them, (2) utilize the understanding for helping authors' learning/instructional scenario making and (3) make such theoretically sound scenarios interoperable within the framework of standard technologies. This paper suggests an ontological engineering solution to the achievement of these three goals. 
Although the evaluation is far from complete in terms of practical use, we believe that the results of this study address high-level technical challenges from the viewpoint of the current state of the art in the research area.

  10. General Systems Theory Approaches to Organizations: Some Problems in Application

    ERIC Educational Resources Information Center

    Peery, Newman S., Jr.

    1975-01-01

    Considers the limitations of General Systems Theory (GST) as a major paradigm within administrative theory and concludes that most systems formulations overemphasize growth and show little appreciation for intraorganizational conflict, diversity of values, and political action within organizations. Suggests that these limitations are mainly due to…

  11. Patient safety in anesthesia: learning from the culture of high-reliability organizations.

    PubMed

    Wright, Suzanne M

    2015-03-01

    There has been an increased awareness of and interest in patient safety and improved outcomes, as well as a growing body of evidence substantiating medical error as a leading cause of death and injury in the United States. According to The Joint Commission, US hospitals demonstrate improvements in health care quality and patient safety. Although this progress is encouraging, much room for improvement remains. High-reliability organizations, industries that deliver reliable performances in the face of complex working environments, can serve as models of safety for our health care system until plausible explanations for patient harm are better understood. Copyright © 2015 Elsevier Inc. All rights reserved.

  12. Validity and reliability analysis of the planned behavior theory scale related to the testicular self-examination in a Turkish context.

    PubMed

    Iyigun, Emine; Tastan, Sevinc; Ayhan, Hatice; Kose, Gulsah; Acikel, Cengizhan

    2016-06-01

    This study aimed to determine the validity and reliability levels of the Planned Behavior Theory Scale as related to a testicular self-examination. The study was carried out in a health-profession higher-education school in Ankara, Turkey, from April to June 2012. The study participants comprised 215 male students. Study data were collected by using a questionnaire, a planned behavior theory scale related to testicular self-examination, and Champion's Health Belief Model Scale (CHBMS). The sub-dimensions of the planned behavior theory scale, namely those of intention, attitude, subjective norms and self-efficacy, were found to have Cronbach's alpha values of between 0.81 and 0.89. Exploratory factor analysis showed that items of the scale had five factors that accounted for 75% of the variance. Of these, the sub-dimension of intention was found to have the highest level of contribution. A significant correlation was found between the sub-dimensions of the testicular self-examination planned behavior theory scale and those of CHBMS (p < 0.05). The findings suggest that the Turkish version of the testicular self-examination Planned Behavior Theory Scale is a valid and reliable measurement for Turkish society.
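Cronbach's alpha, the internal-consistency statistic whose values of 0.81 to 0.89 are reported in this record, can be computed directly from an items-by-persons score table. A minimal pure-Python sketch using population variances (the scores below are invented for illustration, not the study's data):

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha from item_scores[i][p]: score of person p
    on item i. Population (biased) variances are used throughout."""
    k = len(item_scores)      # number of items
    n = len(item_scores[0])   # number of persons

    def pvar(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # each person's total score across all items
    totals = [sum(item[p] for item in item_scores) for p in range(n)]
    sum_item_var = sum(pvar(item) for item in item_scores)
    return k / (k - 1) * (1 - sum_item_var / pvar(totals))
```

Two items that rank three respondents identically give alpha of 1.0; any disagreement between items pushes the value down.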

  13. Molecular Electron Density Theory: A Modern View of Reactivity in Organic Chemistry.

    PubMed

    Domingo, Luis R

    2016-09-30

A new theory for the study of the reactivity in Organic Chemistry, named Molecular Electron Density Theory (MEDT), is proposed herein. MEDT is based on the idea that while the electron density distribution at the ground state is responsible for physical and chemical molecular properties, as proposed by the Density Functional Theory (DFT), the capability for changes in electron density is responsible for molecular reactivity. Within MEDT, the reactivity in Organic Chemistry is studied through a rigorous quantum chemical analysis of the changes of the electron density, as well as the energies associated with these changes, along the reaction path in order to understand experimental outcomes. Studies performed using MEDT establish a modern rationalisation of, and provide insight into, molecular mechanisms and reactivity in Organic Chemistry.

  14. A review of carrier thermoelectric-transport theory in organic semiconductors.

    PubMed

    Lu, Nianduan; Li, Ling; Liu, Ming

    2016-07-20

    Carrier thermoelectric-transport theory has recently become of growing interest and numerous thermoelectric-transport models have been proposed for organic semiconductors, due to pressing current issues involving energy production and the environment. The purpose of this review is to provide a theoretical description of the thermoelectric Seebeck effect in organic semiconductors. Special attention is devoted to the carrier concentration, temperature, polaron effect and dipole effect dependence of the Seebeck effect and its relationship to hopping transport theory. Furthermore, various theoretical methods are used to discuss carrier thermoelectric transport. Finally, an outlook of the remaining challenges ahead for future theoretical research is provided.

  15. Expert Reliability for the World Health Organization Standardized Ultrasound Classification of Cystic Echinococcosis

    PubMed Central

    Solomon, Nadia; Fields, Paul J.; Tamarozzi, Francesca; Brunetti, Enrico; Macpherson, Calum N. L.

    2017-01-01

    Cystic echinococcosis (CE), a parasitic zoonosis, results in cyst formation in the viscera. Cyst morphology depends on developmental stage. In 2003, the World Health Organization (WHO) published a standardized ultrasound (US) classification for CE, for use among experts as a standard of comparison. This study examined the reliability of this classification. Eleven international CE and US experts completed an assessment of eight WHO classification images and 88 test images representing cyst stages. Inter- and intraobserver reliability and observer performance were assessed using Fleiss' and Cohen's kappa. Interobserver reliability was moderate for WHO images (κ = 0.600, P < 0.0001) and substantial for test images (κ = 0.644, P < 0.0001), with substantial to almost perfect interobserver reliability for stages with pathognomonic signs (CE1, CE2, and CE3) for WHO (0.618 < κ < 0.904) and test images (0.642 < κ < 0.768). Comparisons of expert performances against the majority classification for each image were significant for WHO (0.413 < κ < 1.000, P < 0.005) and test images (0.718 < κ < 0.905, P < 0.0001); and intraobserver reliability was significant for WHO (0.520 < κ < 1.000, P < 0.005) and test images (0.690 < κ < 0.896, P < 0.0001). Findings demonstrate moderate to substantial interobserver and substantial to almost perfect intraobserver reliability for the WHO classification, with substantial to almost perfect interobserver reliability for pathognomonic stages. This confirms experts' abilities to reliably identify WHO-defined pathognomonic signs of CE, demonstrating that the WHO classification provides a reproducible way of staging CE. PMID:28070008
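Fleiss' kappa, used in this record for agreement among the eleven experts, generalizes Cohen's kappa to any fixed number of raters per subject. A minimal pure-Python sketch (the count table below is illustrative, not the study's data):

```python
def fleiss_kappa(counts):
    """Fleiss' kappa. counts[i][j] = number of raters assigning
    subject i to category j; every subject must be rated by the
    same number of raters m."""
    n = len(counts)      # number of subjects
    m = sum(counts[0])   # raters per subject
    n_cat = len(counts[0])
    # mean observed per-subject agreement
    p_bar = sum((sum(c * c for c in row) - m) / (m * (m - 1))
                for row in counts) / n
    # chance agreement from overall category prevalences
    p_j = [sum(row[j] for row in counts) / (n * m) for j in range(n_cat)]
    p_e = sum(p * p for p in p_j)
    return (p_bar - p_e) / (1 - p_e)

# two subjects, three raters, two cyst-stage categories (hypothetical)
kappa = fleiss_kappa([[2, 1], [0, 3]])
```

Values near 1 indicate almost perfect agreement; the moderate-to-substantial range reported above corresponds roughly to 0.4 to 0.8 on this scale.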

  16. Immodest Witnesses: Reliability and Writing Assessment

    ERIC Educational Resources Information Center

    Gallagher, Chris W.

    2014-01-01

    This article offers a survey of three reliability theories in writing assessment: positivist, hermeneutic, and rhetorical. Drawing on an interdisciplinary investigation of the notion of "witnessing," this survey emphasizes the kinds of readers and readings each theory of reliability produces and the epistemological grounds on which it…

  17. Using multivariate generalizability theory to assess the effect of content stratification on the reliability of a performance assessment.

    PubMed

    Keller, Lisa A; Clauser, Brian E; Swanson, David B

    2010-12-01

In recent years, demand for performance assessments has continued to grow. However, performance assessments are notorious for lower reliability, and in particular, low reliability resulting from task specificity. Since reliability analyses typically treat the performance tasks as randomly sampled from an infinite universe of tasks, these estimates of reliability may not be accurate. For tests built according to a table of specifications, tasks are randomly sampled from different strata (content domains, skill areas, etc.). If these strata remain fixed in the test construction process, ignoring this stratification in the reliability analysis results in an underestimate of "parallel forms" reliability, and an overestimate of the person-by-task component. This research explores the effect of correctly representing, or misrepresenting, the stratification in the estimation of reliability and the standard error of measurement. Both multivariate and univariate generalizability studies are reported. Results indicate that the proper specification of the analytic design is essential in yielding the proper information both about the generalizability of the assessment and the standard error of measurement. Further, illustrative D studies demonstrate the effect under a variety of situations and test designs. Additional benefits of multivariate generalizability theory in test design and evaluation are also discussed.
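The D-study logic described in this record reduces, in the simplest person-by-task design, to a small formula: the relative error variance shrinks as tasks are added, so the generalizability coefficient rises. A minimal sketch with hypothetical variance components (this one-facet form ignores the stratification issue the paper analyzes):

```python
def g_coefficient(var_person, var_pt_residual, n_tasks):
    """Generalizability (E-rho-squared) coefficient for a simple
    person-by-task design: persons are the object of measurement,
    and the person-by-task interaction (confounded with residual
    error) is averaged over n_tasks tasks."""
    relative_error = var_pt_residual / n_tasks
    return var_person / (var_person + relative_error)

# D study: how reliability grows as tasks are added
coefficients = [g_coefficient(1.0, 1.0, n) for n in (1, 2, 4, 8)]
```

With equal person and person-by-task variance, one task yields 0.5 and four tasks yield 0.8, illustrating why task specificity dominates short performance assessments.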

  18. Organic magnetoresistance based on hopping theory

    NASA Astrophysics Data System (ADS)

    Yang, Fu-Jiang; Xie, Shi-Jie

    2014-09-01

    For the organic magnetoresistance (OMAR) effect, we suggest a spin-related hopping of carriers (polarons) based on Marcus theory. The mobility of polarons is calculated with the master equation (ME) and then the magnetoresistance (MR) is obtained. The theoretical results are consistent with experimental observations. In particular, the sign inversion of the MR under different driving bias voltages found in experiments is predicted. In addition, the effects of molecular disorder, the hyperfine interaction (HFI), polaron localization, and temperature on the MR are investigated.

  19. Test-retest reliability of the scale of participation in organized activities among adolescents in the Czech Republic and Slovakia.

    PubMed

    Bosakova, Lucia; Kolarcik, Peter; Bobakova, Daniela; Sulcova, Martina; Van Dijk, Jitse P; Reijneveld, Sijmen A; Geckova, Andrea Madarasova

    2016-04-01

    Participation in organized activities is related to a range of positive outcomes, but the way such participation is measured has not been scrutinized. Test-retest reliability, an important indicator of a scale's reliability, has rarely been assessed and is completely lacking for "The scale of participation in organized activities". This test-retest study is based on the Health Behaviour in School-aged Children study and is consistent with its methodology. We obtained data from 353 Czech (51.9 % boys) and 227 Slovak (52.9 % boys) primary school pupils, grades five and nine, who participated in this study in 2013. We used Cohen's kappa statistic and single measures of the intraclass correlation coefficient to estimate the test-retest reliability of all selected items in the sample, stratified by gender, age and country. We mostly observed a large correlation between the test and retest in all of the examined variables (κ ranged from 0.46 to 0.68). Test-retest reliability of the sum score of individual items showed substantial agreement (ICC = 0.64). The scale of participation in organized activities has an acceptable level of agreement, indicating good reliability.
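    The kappa statistic used in this record corrects raw agreement for agreement expected by chance. A minimal sketch (the 2x2 test-retest table is hypothetical):

```python
def cohens_kappa(table):
    """Cohen's kappa from a square test-retest contingency table.

    table[i][j] counts respondents who chose category i at test
    and category j at retest.
    """
    n = sum(sum(row) for row in table)
    n_cat = len(table)
    p_observed = sum(table[i][i] for i in range(n_cat)) / n
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    p_expected = sum(row_totals[i] * col_totals[i] for i in range(n_cat)) / n**2
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical yes/no item: 80% raw agreement, 50% chance agreement
kappa = cohens_kappa([[40, 10], [10, 40]])  # -> 0.6, "substantial" agreement
```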

  20. Teaching Organization Theory and Practice: An Experiential and Reflective Approach

    ERIC Educational Resources Information Center

    Cameron, Mark; Turkiewicz, Rita M.; Holdaway, Britt A.; Bill, Jacqueline S.; Goodman, Jessica; Bonner, Aisha; Daly, Stacey; Cohen, Michael D.; Lorenz, Cassandra; Wilson, Paul R.; Rusk, James

    2009-01-01

    The organization is often the overlooked level in social work's ecological perspective. However, organizational realities exert a profound influence on human development and well-being as well as the nature and quality of social work practice. This article describes a model of teaching organization theory and practice which requires master's…

  1. 18 CFR 39.3 - Electric Reliability Organization certification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    Excerpt from 18 CFR 39.3, Conservation of Power and Water Resources (2010-04-01), on Electric Reliability Organization certification: the applicant must, among other criteria, have Reliability Standards that provide for an adequate level of reliability of the Bulk-Power System, and must provide for input from users, owners and operators of the Bulk-Power System, and other interested parties, for improvement of the Electric Reliability Organization.

  2. Theory of reliable systems. [systems analysis and design

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1973-01-01

    The analysis and design of reliable systems are discussed. The attributes of system reliability studied are fault tolerance, diagnosability, and reconfigurability. Objectives of the study include: to determine properties of system structure that are conducive to a particular attribute; to determine methods for obtaining reliable realizations of a given system; and to determine how properties of system behavior relate to the complexity of fault tolerant realizations. A list of 34 references is included.

  3. Reliability training

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Dillard, Richard B.; Wong, Kam L.; Barber, Frank J.; Barina, Frank J.

    1992-01-01

    Discussed here is failure physics, the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low cost reliable products. A review of reliability for the years 1940 to 2000 is given. Next, a review of mathematics is given as well as a description of what elements contribute to product failures. Basic reliability theory and the disciplines that allow us to control and eliminate failures are elucidated.
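    The basic reliability theory this training material covers rests on a few standard formulas: series systems multiply component reliabilities, redundant (parallel) systems fail only when every component fails, and a constant failure rate gives exponential survival. A minimal sketch (component values are hypothetical):

```python
from math import exp, prod  # math.prod requires Python 3.8+

def reliability_series(components):
    """A series system works only if every component works."""
    return prod(components)

def reliability_parallel(components):
    """A parallel (redundant) system fails only if every component fails."""
    return 1 - prod(1 - r for r in components)

def reliability_exponential(failure_rate, t):
    """Survival probability at time t under a constant failure rate."""
    return exp(-failure_rate * t)

# Two 0.9-reliable parts: 0.81 in series, 0.99 in parallel
r_s = reliability_series([0.9, 0.9])    # 0.81
r_p = reliability_parallel([0.9, 0.9])  # 0.99
```

    The series/parallel contrast is the quantitative core of the redundancy arguments that recur throughout these records.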

  4. The Development of the Functional Literacy Experience Scale Based upon Ecological Theory (FLESBUET) and Validity-Reliability Study

    ERIC Educational Resources Information Center

    Özenç, Emine Gül; Dogan, M. Cihangir

    2014-01-01

    This study aims to perform a validity-reliability test by developing the Functional Literacy Experience Scale based upon Ecological Theory (FLESBUET) for primary education students. The study group includes 209 fifth grade students at Sabri Taskin Primary School in the Kartal District of Istanbul, Turkey during the 2010-2011 academic year.…

  5. [Employees in high-reliability organizations: systematic selection of personnel as a final criterion].

    PubMed

    Oubaid, V; Anheuser, P

    2014-05-01

    Employees represent an important safety factor in high-reliability organizations. The combination of clear organizational structures, a nonpunitive safety culture, and psychological personnel selection guarantee a high level of safety. The cockpit personnel selection process of a major German airline is presented in order to demonstrate a possible transferability into medicine and urology.

  6. Inventing the future of reliability: FERC's recent orders and the consolidation of reliability authority

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skees, J. Daniel

    2010-06-15

    The Energy Policy Act of 2005 established mandatory reliability standard enforcement under a system in which the Federal Energy Regulatory Commission and the Electric Reliability Organization would have their own spheres of responsibility and authority. Recent orders, however, reflect the Commission's frustration with the reliability standard drafting process and suggest that the Electric Reliability Organization's discretion is likely to receive less deference in the future. (author)

  7. The Reliability of Criterion-Referenced Measures.

    ERIC Educational Resources Information Center

    Livingston, Samuel A.

    The assumptions of the classical test-theory model are used to develop a theory of reliability for criterion-referenced measures which parallels that for norm-referenced measures. It is shown that the Spearman-Brown formula holds for criterion-referenced measures and that the criterion-referenced reliability coefficient can be used to correct…
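    The Spearman-Brown formula mentioned here predicts the reliability of a test lengthened by a factor k from the reliability of the original test. A minimal sketch (the input reliability is hypothetical):

```python
def spearman_brown(reliability, k):
    """Predicted reliability after lengthening a test by factor k
    (Spearman-Brown prophecy formula)."""
    return k * reliability / (1 + (k - 1) * reliability)

# Doubling a test with reliability 0.50 predicts about 0.67
doubled = spearman_brown(0.50, 2)
```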

  8. Cliophysics: Socio-Political Reliability Theory, Polity Duration and African Political (In)stabilities

    PubMed Central

    Cherif, Alhaji; Barley, Kamal

    2010-01-01

    Quantification of historical sociological processes has recently gained attention among theoreticians in the effort to provide a solid theoretical understanding of the behaviors and regularities present in socio-political dynamics. Here we present a reliability theory of polity processes with emphasis on the individual political dynamics of African countries. We found that the structural properties of polity failure rates successfully capture the risk of political vulnerability and instability: countries with monotonically increasing, unimodal, U-shaped and monotonically decreasing polity failure rates, respectively, have high levels of state fragility indices. The quasi-U-shape relationship between average polity duration and regime types corroborates historical precedents and explains the stability of autocracies and democracies. PMID:21206911

  9. Reliable semiclassical computations in QCD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dine, Michael (Department of Physics, Stanford University, Stanford, California 94305-4060); Festuccia, Guido

    We revisit the question of whether or not one can perform reliable semiclassical QCD computations at zero temperature. We study correlation functions with no perturbative contributions, and organize the problem by means of the operator product expansion, establishing a precise criterion for the validity of a semiclassical calculation. For N_f > N, a systematic computation is possible; for N_f … theory computations in the chiral limit.

  10. Keeping patients safe in healthcare organizations: a structuration theory of safety culture.

    PubMed

    Groves, Patricia S; Meisenbach, Rebecca J; Scott-Cawiezell, Jill

    2011-08-01

    This paper presents a discussion of the use of structuration theory to facilitate understanding and improvement of safety culture in healthcare organizations. Patient safety in healthcare organizations is an important problem worldwide. Safety culture has been proposed as a means to keep patients safe. However, lack of appropriate theory limits understanding and improvement of safety culture. The proposed structuration theory of safety culture was based on a critique of available English-language literature, resulting in literature published from 1983 to mid-2009. CINAHL, Communication and Mass Media Complete, ABI/Inform and Google Scholar databases were searched using the following terms: nursing, safety, organizational culture and safety culture. When viewed through the lens of structuration theory, safety culture is a system involving both individual actions and organizational structures. Healthcare organization members, particularly nurses, share these values through communication and enact them in practice, (re)producing an organizational safety culture system that reciprocally constrains and enables the actions of the members in terms of patient safety. This structurational viewpoint illuminates multiple opportunities for safety culture improvement. Nurse leaders should be cognizant of competing value-based culture systems in the organization and attend to nursing agency and all forms of communication when attempting to create or strengthen a safety culture. Applying structuration theory to the concept of safety culture reveals a dynamic system of individual action and organizational structure constraining and enabling safety practice. Nurses are central to the (re)production of this safety culture system. © 2011 Blackwell Publishing Ltd.

  11. The quest for a general theory of aging and longevity.

    PubMed

    Gavrilov, Leonid A; Gavrilova, Natalia S

    2003-07-16

    Extensive studies of phenomena related to aging have produced many diverse findings, which require a general theoretical framework to be organized into a comprehensive body of knowledge. As demonstrated by the success of evolutionary theories of aging, quite general theoretical considerations can be very useful when applied to research on aging. In this theoretical study, we attempt to gain insight into aging by applying a general theory of systems failure known as reliability theory. Considerations of this theory lead to the following conclusions: (i) Redundancy is a concept of crucial importance for understanding aging, particularly the systemic nature of aging. Systems that are redundant in numbers of irreplaceable elements deteriorate (that is, age) over time, even if they are built of elements that do not themselves age. (ii) An apparent aging rate or expression of aging is higher for systems that have higher levels of redundancy. (iii) Redundancy exhaustion over the life course explains a number of observations about mortality, including mortality convergence at later life (when death rates are becoming relatively similar at advanced ages for different populations of the same species) as well as late-life mortality deceleration, leveling off, and mortality plateaus. (iv) Living organisms apparently contain a high load of initial damage from the early stages of development, and therefore their life span and aging patterns may be sensitive to early-life conditions that determine this initial damage load. Thus, the reliability theory provides a parsimonious explanation for many important aging-related phenomena and suggests a number of interesting testable predictions. We therefore suggest adding the reliability theory to the arsenal of methodological approaches applied to research on aging.
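    Conclusion (iii) above, that redundancy exhaustion produces late-life mortality deceleration and plateaus, can be illustrated for the simplest redundant system: n parallel, non-aging elements, each with constant hazard k. A minimal sketch (n and k are hypothetical):

```python
import math

def system_hazard(t, n, k):
    """Failure rate of n parallel non-aging elements, each with hazard k."""
    q = 1 - math.exp(-k * t)          # probability one element has failed
    density = n * k * math.exp(-k * t) * q ** (n - 1)
    survival = 1 - q ** n
    return density / survival

# The system hazard starts near zero (redundancy in reserve), rises with
# age as redundancy is exhausted, and levels off at the single-element
# hazard k: a mortality plateau from elements that do not themselves age.
early, mid, late = (system_hazard(t, n=5, k=1.0) for t in (0.01, 1.0, 30.0))
```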

  12. Research on High Reliability Organizations: Implications for School Effects Research, Policy, and Educational Practice.

    ERIC Educational Resources Information Center

    Stringfield, Sam

    Current theorizing in education, as in industry, is largely devoted to explaining trial-and-error, failure-tolerant, low-reliability organizations. This article examines changing societal demands on education and argues that effective responses to those demands require new and different organizational structures. Schools must abandon industrial…

  13. Magnetoelectroluminescence of organic heterostructures: Analytical theory and spectrally resolved measurements

    NASA Astrophysics Data System (ADS)

    Liu, Feilong; Kelley, Megan R.; Crooker, Scott A.; Nie, Wanyi; Mohite, Aditya D.; Ruden, P. Paul; Smith, Darryl L.

    2014-12-01

    The effect of a magnetic field on the electroluminescence of organic light emitting devices originates from the hyperfine interaction between the electron/hole polarons and the hydrogen nuclei of the host molecules. In this paper, we present an analytical theory of magnetoelectroluminescence for organic semiconductors. To be specific, we focus on bilayer heterostructure devices. In the case we are considering, light generation at the interface of the donor and acceptor layers results from the formation and recombination of exciplexes. The spin physics is described by a stochastic Liouville equation for the electron/hole spin density matrix. By finding the steady-state analytical solution using Bloch-Wangsness-Redfield theory, we explore how the singlet/triplet exciplex ratio is affected by the hyperfine interaction strength and by the external magnetic field. To validate the theory, spectrally resolved electroluminescence experiments on BPhen/m-MTDATA devices are analyzed. With increasing emission wavelength, the width of the magnetic field modulation curve of the electroluminescence increases while its depth decreases. These observations are consistent with the model.

  14. Magnetoelectroluminescence of organic heterostructures: Analytical theory and spectrally resolved measurements

    DOE PAGES

    Liu, Feilong; Kelley, Megan R.; Crooker, Scott A.; ...

    2014-12-22

    The effect of a magnetic field on the electroluminescence of organic light emitting devices originates from the hyperfine interaction between the electron/hole polarons and the hydrogen nuclei of the host molecules. In this paper, we present an analytical theory of magnetoelectroluminescence for organic semiconductors. To be specific, we focus on bilayer heterostructure devices. In the case we are considering, light generation at the interface of the donor and acceptor layers results from the formation and recombination of exciplexes. The spin physics is described by a stochastic Liouville equation for the electron/hole spin density matrix. By finding the steady-state analytical solution using Bloch-Wangsness-Redfield theory, we explore how the singlet/triplet exciplex ratio is affected by the hyperfine interaction strength and by the external magnetic field. In order to validate the theory, spectrally resolved electroluminescence experiments on BPhen/m-MTDATA devices are analyzed. With increasing emission wavelength, the width of the magnetic field modulation curve of the electroluminescence increases while its depth decreases. Furthermore, these observations are consistent with the model.

  15. A polarized light microscopy method for accurate and reliable grading of collagen organization in cartilage repair.

    PubMed

    Changoor, A; Tran-Khanh, N; Méthot, S; Garon, M; Hurtig, M B; Shive, M S; Buschmann, M D

    2011-01-01

    Collagen organization, a feature that is critical for cartilage load bearing and durability, is not adequately assessed in cartilage repair tissue by present histological scoring systems. Our objectives were to develop a new polarized light microscopy (PLM) score for collagen organization and to test its reliability. This PLM score uses an ordinal scale of 0-5 to rate the extent that collagen network organization resembles that of young adult hyaline articular cartilage (score of 5) vs a totally disorganized tissue (score of 0). Inter-reader reliability was assessed using Intraclass Correlation Coefficients (ICC) for Agreement, calculated from scores of three trained readers who independently evaluated blinded sections obtained from normal (n=4), degraded (n=2) and repair (n=22) human cartilage biopsies. The PLM score succeeded in distinguishing normal, degraded and repair cartilages, where the latter displayed greater complexity in collagen structure. Excellent inter-reader reproducibility was found with ICCs for Agreement of 0.90 [ICC(2,1)] (lower boundary of the 95% confidence interval is 0.83) and 0.96 [ICC(2,3)] (lower boundary of the 95% confidence interval is 0.94), indicating the reliability of a single reader's scores and the mean of all three readers' scores, respectively. This PLM method offers a novel means for systematically evaluating collagen organization in repair cartilage. We propose that it be used to supplement current gold standard histological scoring systems for a more complete assessment of repair tissue quality. Copyright © 2010 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.
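    The ICC for Agreement reported above, ICC(2,1) in the Shrout and Fleiss two-way random-effects scheme, can be computed from the mean squares of a subjects-by-readers rating matrix. A minimal sketch (the 3x2 rating matrices are hypothetical):

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1), absolute agreement, from a subjects-by-readers matrix."""
    n, k = ratings.shape
    grand = ratings.mean()
    ss_total = ((ratings - grand) ** 2).sum()
    ss_subjects = k * ((ratings.mean(axis=1) - grand) ** 2).sum()
    ss_readers = n * ((ratings.mean(axis=0) - grand) ** 2).sum()
    ss_error = ss_total - ss_subjects - ss_readers
    msr = ss_subjects / (n - 1)            # between-subjects mean square
    msc = ss_readers / (k - 1)             # between-readers mean square
    mse = ss_error / ((n - 1) * (k - 1))   # residual mean square
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Two readers who agree exactly give ICC = 1; a constant one-point offset
# between readers lowers agreement even though the ranking is perfect.
perfect = icc_2_1(np.array([[1.0, 1.0], [2.0, 2.0], [3.0, 3.0]]))  # 1.0
offset = icc_2_1(np.array([[1.0, 2.0], [2.0, 3.0], [3.0, 4.0]]))   # 2/3
```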

  16. Analysis of mass incident diffusion in Weibo based on self-organization theory

    NASA Astrophysics Data System (ADS)

    Pan, Jun; Shen, Huizhang

    2018-02-01

    This study applies theories and methods from self-organizing systems to the diffusion mechanism of mass incidents in Weibo (Chinese Twitter). Based on analysis of massive Weibo data from the Songjiang battery factory incident of 2013 and the Jiangsu Qidong OJI PAPER incident of 2012, we find that the diffusion system of a mass incident in Weibo exhibits a power law, Zipf's law, 1/f noise and self-similarity. This means the system is a self-organized criticality system, and dissemination bursts can be understood as a kind of self-organized behavior. Consequently, self-organized criticality (SOC) theory can be used to explain the evolution of mass incident diffusion, and the right control strategy may be devised if the key ingredients of self-organization are handled well. Such a study is of practical importance, offering policy makers opportunities for good management of these events.
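    Checking whether diffusion data satisfy a power law or Zipf's law is commonly done by regression on log-log scales; a power law f = C * r**s is a straight line there, and the slope estimates the exponent s. A minimal sketch with ideal synthetic data (real Weibo data would be noisy and need care in the tail):

```python
import numpy as np

# Ideal Zipf data: frequency proportional to 1/rank (exponent -1)
ranks = np.arange(1, 101)
freqs = 1000.0 / ranks

# Fit a line to (log rank, log frequency); the slope is the exponent
slope, intercept = np.polyfit(np.log(ranks), np.log(freqs), 1)
# slope is -1 for ideal Zipf data
```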

  17. Artificial organisms as tools for the development of psychological theory: Tolman's lesson.

    PubMed

    Miglino, Orazio; Gigliotta, Onofrio; Cardaci, Maurizio; Ponticorvo, Michela

    2007-12-01

    In the 1930s and 1940s, Edward Tolman developed a psychological theory of spatial orientation in rats and humans. He expressed his theory as an automaton (the "schematic sowbug") or what today we would call an "artificial organism." With the technology of the day, he could not implement his model. Nonetheless, he used it to develop empirical predictions, which he tested with animals in the laboratory. This way of proceeding was in line with scientific practice dating back to Galileo. The way psychologists use artificial organisms in their work today breaks with this tradition. Modern "artificial organisms" are constructed a posteriori, working from experimental or ethological observations. As a result, researchers can use them to confirm a theoretical model or to simulate its operation. But they make no contribution to the actual building of models. In this paper, we try to return to Tolman's original strategy: implementing his theory of "vicarious trial and error" in a simulated robot, forecasting the robot's behavior and conducting experiments that verify or falsify these predictions.

  18. Understanding organic photovoltaic cells: Electrode, nanostructure, reliability, and performance

    NASA Astrophysics Data System (ADS)

    Kim, Myung-Su

    My Ph.D. research has focused on alternative renewable energy using organic semiconductors. During my study, first, I established reliable characterization methods for organic photovoltaic devices. More specifically, less than 5% variation in the power conversion efficiency of fabricated organic blend photovoltaic cells (OBPC) was achieved after optimization. The reproducibility of organic photovoltaic cell performance is one of the essential issues that must be clarified before beginning serious investigations of the application of creative and challenging ideas. Second, the relationships between fill factor (FF) and process variables were demonstrated with series and shunt resistance, and this provided a chance to understand the electrical device behavior. In the blend layer, series resistance (Rs) and shunt resistance (Rsh) were varied by controlling the morphology of the blend layer, the regioregularity of the conjugated polymer, and the thickness of the blend layer. At the interface between the cathode including PEDOT:PSS and the blend layer, cathode conductivity was controlled by varying the structure of the cathode or adding an additive. Third, we thoroughly examined possible characterization mistakes in OPVC. One significant characterization mistake is observed when the crossbar electrode geometry of an OPVC using PEDOT:PSS is fabricated and characterized with illumination that is larger than the actual device area. The hypothesis to explain this overestimation was excess photo-current generated from the cell region outside the overlapped electrode area, where PEDOT:PSS acts as the anode; this was clearly supported by our investigations. Finally, I incorporated a creative idea, which enhances the exciton dissociation efficiency by increasing the interface area between donor and acceptor, to improve the power conversion efficiency of organic photovoltaic cells. To achieve this, nanoimprint lithography was applied to increase the interface area. To clarify the…

  19. Reliability assessment of different plate theories for elastic wave propagation analysis in functionally graded plates.

    PubMed

    Mehrkash, Milad; Azhari, Mojtaba; Mirdamadi, Hamid Reza

    2014-01-01

    The importance of elastic wave propagation problem in plates arises from the application of ultrasonic elastic waves in non-destructive evaluation of plate-like structures. However, precise study and analysis of acoustic guided waves especially in non-homogeneous waveguides such as functionally graded plates are so complicated that exact elastodynamic methods are rarely employed in practical applications. Thus, the simple approximate plate theories have attracted much interest for the calculation of wave fields in FGM plates. Therefore, in the current research, the classical plate theory (CPT), first-order shear deformation theory (FSDT) and third-order shear deformation theory (TSDT) are used to obtain the transient responses of flexural waves in FGM plates subjected to transverse impulsive loadings. Moreover, comparing the results with those based on a well recognized hybrid numerical method (HNM), we examine the accuracy of the plate theories for several plates of various thicknesses under excitations of different frequencies. The material properties of the plate are assumed to vary across the plate thickness according to a simple power-law distribution in terms of volume fractions of constituents. In all analyses, spatial Fourier transform together with modal analysis are applied to compute displacement responses of the plates. A comparison of the results demonstrates the reliability ranges of the approximate plate theories for elastic wave propagation analysis in FGM plates. Furthermore, based on various examples, it is shown that whenever the plate theories are used within the appropriate ranges of plate thickness and frequency content, solution process in wave number-time domain based on modal analysis approach is not only sufficient but also efficient for finding the transient waveforms in FGM plates. Copyright © 2013 Elsevier B.V. All rights reserved.

  20. Using the Reliability Theory for Assessing the Decision Confidence Probability for Comparative Life Cycle Assessments.

    PubMed

    Wei, Wei; Larrey-Lassalle, Pyrène; Faure, Thierry; Dumoulin, Nicolas; Roux, Philippe; Mathias, Jean-Denis

    2016-03-01

    Comparative decision making is widely used to identify which option (system, product, service, etc.) has the smaller environmental footprint and to provide recommendations that help stakeholders take future decisions. However, uncertainty complicates the comparison and the decision making. Probability-based decision support in LCA is a way to help stakeholders in their decision-making process. It calculates the decision confidence probability, which expresses the probability that one option has a smaller environmental impact than another. Here we apply reliability theory to approximate the decision confidence probability. We compare the traditional Monte Carlo method with a reliability method called FORM (first-order reliability method). The Monte Carlo method needs high computational time to calculate the decision confidence probability. The FORM method approximates the decision confidence probability with fewer simulations than the Monte Carlo method by approximating the response surface. Moreover, the FORM method calculates the associated importance factors, which correspond to a sensitivity analysis in relation to the probability. The importance factors allow stakeholders to determine which factors influence their decision. Our results clearly show that the reliability method provides additional useful information to stakeholders while also reducing the computational time.
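    The decision confidence probability itself is easy to estimate by plain Monte Carlo, which is the baseline the FORM method is compared against in this record. A sketch with hypothetical lognormal impact distributions for the two options (FORM itself, which approximates the response surface, is not shown):

```python
import numpy as np

rng = np.random.default_rng(42)
n_draws = 100_000

# Hypothetical uncertain impact scores (e.g. kg CO2-eq) for two options
impact_a = rng.lognormal(mean=1.0, sigma=0.3, size=n_draws)
impact_b = rng.lognormal(mean=1.2, sigma=0.3, size=n_draws)

# Decision confidence probability: P(option A has the smaller impact)
p_confidence = (impact_a < impact_b).mean()  # about 0.68 here
```

    The cost the article targets is visible in the draw count: Monte Carlo needs many samples for a stable estimate, which is what FORM avoids.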

  1. Using Rasch Analysis to Evaluate the Reliability and Validity of the Swallowing Quality of Life Questionnaire: An Item Response Theory Approach.

    PubMed

    Cordier, Reinie; Speyer, Renée; Schindler, Antonio; Michou, Emilia; Heijnen, Bas Joris; Baijens, Laura; Karaduman, Ayşe; Swan, Katina; Clavé, Pere; Joosten, Annette Veronica

    2018-02-01

    The Swallowing Quality of Life questionnaire (SWAL-QOL) is widely used clinically and in research to evaluate quality of life related to swallowing difficulties. It has been described as a valid and reliable tool, but was developed and tested using classical test theory. This study describes the reliability and validity of the SWAL-QOL using item response theory (IRT; Rasch analysis). SWAL-QOL data were gathered from 507 participants at risk of oropharyngeal dysphagia (OD) across four European countries. OD was confirmed in 75.7% of participants via videofluoroscopy and/or fiberoptic endoscopic evaluation, or a clinical diagnosis based on meeting selected criteria. Patients with esophageal dysphagia were excluded. Data were analysed using Rasch analysis. Item and person reliability was good for all the items combined. However, person reliability was poor for 8 subscales and item reliability was poor for one subscale. Eight subscales exhibited poor person separation and two exhibited poor item separation. Overall item and person fit statistics were acceptable. However, at the individual item level, fit results indicated unpredictable item responses for 28 items and item redundancy for 10 items. The item-person dimensionality map confirmed these findings. Results from the overall Rasch model fit and principal component analysis were suggestive of a second dimension. For all the items combined, none of the item categories were 'category', 'threshold' or 'step' disordered; however, all subscales demonstrated disordered category functioning. Findings suggest an urgent need to further investigate the underlying structure of the SWAL-QOL and its psychometric characteristics using IRT.
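    The Rasch model underlying this analysis predicts each item response from just two parameters: a person ability and an item difficulty on a shared logit scale. A minimal sketch of the dichotomous case for simplicity (SWAL-QOL items are polytomous, and the ability/difficulty values here are hypothetical):

```python
import math

def rasch_probability(ability, difficulty):
    """P(affirmative response) under the dichotomous Rasch model."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

# A respondent whose ability equals the item difficulty responds
# affirmatively with probability exactly 0.5; easier items score higher.
p_matched = rasch_probability(0.0, 0.0)   # 0.5
p_easy = rasch_probability(0.0, -2.0)     # about 0.88
```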

  2. Origins of life: a comparison of theories and application to Mars

    NASA Technical Reports Server (NTRS)

    Davis, W. L.; McKay, C. P.

    1996-01-01

    The field of study that deals with the origins of life does not have a consensus for a theory of life's origin. An analysis of the range of theories offered shows that they share some common features that may be reliable predictors when considering the possible origins of life on another planet. The fundamental datum dealing with the origins of life is that life appeared early in the history of the Earth, probably before 3.5 Ga and possibly before 3.8 Ga. What might be called the standard theory (the Oparin-Haldane theory) posits the production of organic molecules on the early Earth followed by chemical reactions that produced increased organic complexity leading eventually to organic life capable of reproduction, mutation, and selection using organic material as nutrients. A distinct class of other theories (panspermia theories) suggests that life was carried to Earth from elsewhere--these theories receive some support from recent work on planetary impact processes. Other alternatives to the standard model suggest that life arose as an inorganic (clay) form and/or that the initial energy source was not organic material but chemical energy or sunlight. We find that the entire range of current theories suggests that liquid water is the quintessential environmental criterion for both the origin and sustenance of life. It is therefore of interest that during the time that life appeared on Earth we have evidence for liquid water present on the surface of Mars.

  3. Reliability and validity of the German version of the Structured Interview of Personality Organization (STIPO)

    PubMed Central

    2013-01-01

    Background The assessment of personality organization and its observable behavioral manifestations, i.e. personality functioning, has a long tradition in psychodynamic psychiatry. Recently, the DSM-5 Levels of Personality Functioning Scale has moved it into the focus of psychiatric diagnostics. Based on Kernberg's concept of personality organization, the Structured Interview of Personality Organization (STIPO) was developed for diagnosing personality functioning. The STIPO covers seven dimensions: (1) identity, (2) object relations, (3) primitive defenses, (4) coping/rigidity, (5) aggression, (6) moral values, and (7) reality testing and perceptual distortions. The English version of the STIPO has previously revealed satisfactory psychometric properties. Methods Validity and reliability of the German version of the 100-item instrument were evaluated in 122 psychiatric patients. All patients were diagnosed according to the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV) and were assessed by means of the STIPO. Moreover, all patients completed eight questionnaires that served as criteria for the external validity of the STIPO. Results Interrater reliability varied between intraclass correlations of .89 and 1.0; Cronbach's α for the seven dimensions was .69 to .93. All a priori selected questionnaire scales correlated significantly with the corresponding STIPO dimensions. Patients with personality disorder (PD) revealed significantly higher STIPO scores (i.e. worse personality functioning) than patients without PD; patients with cluster B PD showed significantly higher STIPO scores than patients with cluster C PD. Conclusions Interrater reliability, Cronbach's α, concurrent validity, and differential validity of the STIPO are satisfactory. The STIPO represents an appropriate instrument for the assessment of personality functioning in clinical and research settings. PMID:23941404

  4. Toward a theory of organisms: Three founding principles in search of a useful integration

    PubMed Central

    SOTO, ANA M.; LONGO, GIUSEPPE; MIQUEL, PAUL-ANTOINE; MONTEVIL, MAËL; MOSSIO, MATTEO; PERRET, NICOLE; POCHEVILLE, ARNAUD; SONNENSCHEIN, CARLOS

    2016-01-01

    Organisms, be they uni- or multi-cellular, are agents capable of creating their own norms; they are continuously harmonizing their ability to create novelty and stability, that is, they combine plasticity with robustness. Here we articulate the three principles for a theory of organisms proposed in this issue, namely: the default state of proliferation with variation and motility, the principle of variation and the principle of organization. These principles profoundly change both biological observables and their determination with respect to the theoretical framework of physical theories. This radical change opens up the possibility of anchoring mathematical modeling in biologically proper principles. PMID:27498204

  5. Thermodynamics of organisms in the context of dynamic energy budget theory.

    PubMed

    Sousa, Tânia; Mota, Rui; Domingos, Tiago; Kooijman, S A L M

    2006-11-01

    We carry out a thermodynamic analysis of an organism. It is applicable to any type of organism because (1) it is based on a thermodynamic formalism applicable to all open thermodynamic systems and (2) it uses a general model to describe the internal structure of the organism--the dynamic energy budget (DEB) model. Our results on the thermodynamics of DEB organisms are the following. (1) Thermodynamic constraints for the following types of organisms: (a) aerobic and exothermic, (b) anaerobic and exothermic, and (c) anaerobic and endothermic; showing that anaerobic organisms have a higher thermodynamic flexibility. (2) A way to compute the changes in the enthalpy and in the entropy of living biomass that accompany changes in growth rate, solving the problem of evaluating the thermodynamic properties of biomass as a function of the amount of reserves. (3) Two expressions for Thornton's coefficient that explain its experimental variability and theoretically underpin its use in metabolic studies. (4) A mechanism that organisms in a non-steady state use to rid themselves of internal entropy production: "dilution of entropy production by growth." To demonstrate the practical applicability of DEB theory to quantify thermodynamic changes in organisms we use published data on Klebsiella aerogenes growing aerobically in a continuous culture. We obtain different values for molar entropies of the reserve and the structure of Klebsiella aerogenes, proving that the reserve density concept of DEB theory is essential in discussions concerning (a) the relationship between organization and entropy and (b) the mechanism of storing entropy in new biomass. Additionally, our results suggest that the entropy of dead biomass is significantly different from the entropy of living biomass.

  6. A novel evaluation method for building construction project based on integrated information entropy with reliability theory.

    PubMed

    Bai, Xiao-ping; Zhang, Xi-wei

    2013-01-01

    Selecting construction schemes of the building engineering project is a complex multiobjective optimization decision process, in which many indexes need to be selected to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes, uses the quantitative method for the cost index, uses integrated qualitative and quantitative methodologies for progress, quality, and safety indexes, and integrates engineering economics, reliability theories, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, this paper also presents detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing score values of all order indexes, computing the synthesis score, sorting all selected schemes, and performing analysis and decision making. The presented method can offer valuable references for risk computation in building construction projects.

  7. Probability interpretations of intraclass reliabilities.

    PubMed

    Ellis, Jules L

    2013-11-20

    Research where many organizations are rated by different samples of individuals such as clients, patients, or employees frequently uses reliabilities computed from intraclass correlations. Consumers of statistical information, such as patients and policy makers, may not have sufficient background for deciding which levels of reliability are acceptable. It is shown that the reliability is related to various probabilities that may be easier to understand, for example, the proportion of organizations that will be classed significantly above (or below) the mean and the probability that an organization is classed correctly given that it is classed significantly above (or below) the mean. One can view these probabilities as the amount of information of the classification and the correctness of the classification. These probabilities have an inverse relationship: given a reliability, one can 'buy' correctness at the cost of informativeness and conversely. This article discusses how this can be used to make judgments about the required level of reliabilities. Copyright © 2013 John Wiley & Sons, Ltd.
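
    The probabilities described above can be illustrated with a small simulation under the classical model (observed score = true score + error, with reliability as the true-score share of variance); the reliability value and the 5% one-sided significance cutoff below are illustrative assumptions, not values from the article:

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.8                                   # assumed reliability
n = 1_000_000                               # simulated organizations
true = rng.normal(0.0, np.sqrt(rho), n)     # true organization effects
err = rng.normal(0.0, np.sqrt(1 - rho), n)  # rating/sampling error
obs = true + err                            # observed scores (variance 1)

above = obs > 1.645                         # classed significantly above the mean
informativeness = above.mean()              # share of organizations so classed
correctness = (true[above] > 0).mean()      # P(truly above average | so classed)
print(informativeness, correctness)
```

    Raising the significance cutoff buys correctness at the cost of informativeness, which is the trade-off the article analyzes.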

  8. Influences on and Limitations of Classical Test Theory Reliability Estimates.

    ERIC Educational Resources Information Center

    Arnold, Margery E.

    It is incorrect to say "the test is reliable" because reliability is a function not only of the test itself, but of many factors. The present paper explains how different factors affect classical reliability estimates such as test-retest, interrater, internal consistency, and equivalent forms coefficients. Furthermore, the limits of classical test…

  9. Covariate-free and Covariate-dependent Reliability.

    PubMed

    Bentler, Peter M

    2016-12-01

    Classical test theory reliability coefficients are said to be population specific. Reliability generalization, a meta-analysis method, is the main procedure for evaluating the stability of reliability coefficients across populations. A new approach is developed to evaluate the degree of invariance of reliability coefficients to population characteristics. Factor or common variance of a reliability measure is partitioned into parts that are, and are not, influenced by control variables, resulting in a partition of reliability into a covariate-dependent and a covariate-free part. The approach can be implemented in a single sample and can be applied to a variety of reliability coefficients.

  10. User Data Spectrum Theory: Collecting, Interpreting, and Implementing User Data in Organizations

    ERIC Educational Resources Information Center

    Peer, Andrea Jo

    2017-01-01

    Organizations interested in increasing their user experience (UX) capacity lack the tools they need to know how to do so. This dissertation addresses this challenge via three major research efforts: 1) the creation of User Data Spectrum theory and a User Data Spectrum survey for helping organizations better invest resources to grow their UX…

  11. Gift Exchange Theory: a critique in relation to organ transplantation.

    PubMed

    Sque, M; Payne, S A

    1994-01-01

    Organ transplantation is becoming more important as a viable method of treatment for certain severe medical conditions. It is a complex and demanding process for all involved. Nursing as a developing science must respond to cultural and economic changes. Therefore, a need exists to develop a body of empirically based knowledge to understand and support the process of organ transplantation. This paper will argue that as trading in organs is unacceptable to the moral standards of western society and outlawed in many countries, an alternative framework must be considered for understanding the mechanisms through which organs are donated and utilized. The donating and receiving of organs may be equated with gift-giving, as there is no barter of commodities involved. Therefore, a useful framework to explore this phenomenon will be one that underpins the process of giving and receiving of gifts. Gift Exchange Theory will be evaluated and critically examined in relation to organ transplantation and the role of nurses in this process.

  12. Study on evaluation of construction reliability for engineering project based on fuzzy language operator

    NASA Astrophysics Data System (ADS)

    Shi, Yu-Fang; Ma, Yi-Yi; Song, Ping-Ping

    2018-03-01

    System Reliability Theory has been a research hotspot of management science and system engineering in recent years, and construction reliability is useful for quantitative evaluation of the project management level. Based on reliability theory and the target system of engineering project management, the definition of construction reliability is derived. Drawing on fuzzy mathematics theory and language operators, the value space of construction reliability is divided into seven fuzzy subsets; correspondingly, seven membership functions and fuzzy evaluation intervals are obtained through the operation of language operators, which provides the method and parameters for the evaluation of construction reliability. This method is shown to be scientific and reasonable for construction conditions and a useful attempt at the theory and methods of engineering project system reliability research.

  13. Predicting behavioural responses to novel organisms: state-dependent detection theory.

    PubMed

    Trimmer, Pete C; Ehlman, Sean M; Sih, Andrew

    2017-01-25

    Human activity alters natural habitats for many species. Understanding variation in animals' behavioural responses to these changing environments is critical. We show how signal detection theory can be used within a wider framework of state-dependent modelling to predict behavioural responses to a major environmental change: novel, exotic species. We allow thresholds for action to be a function of reserves, and demonstrate how optimal thresholds can be calculated. We term this framework 'state-dependent detection theory' (SDDT). We focus on behavioural and fitness outcomes when animals continue to use formerly adaptive thresholds following environmental change. In a simple example, we show that exposure to novel animals which appear dangerous, but are actually safe (e.g. ecotourists), can have catastrophic consequences for 'prey' (organisms that respond as if the new organisms are predators), significantly increasing mortality even when the novel species is not predatory. SDDT also reveals that the effect on reproduction can be greater than the effect on lifespan. We investigate factors that influence the effect of novel organisms, and address the potential for behavioural adjustments (via evolution or learning) to recover otherwise reduced fitness. Although effects of environmental change are often difficult to predict, we suggest that SDDT provides a useful route ahead. © 2017 The Author(s).
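
    The fixed-threshold baseline from classical signal detection theory can be sketched as follows (the paper's contribution is to make such thresholds depend on energetic reserves, which this static example omits). For equal-variance Gaussian signal detection, the optimal observation threshold shifts with the prior probability of danger and the relative costs of misses and false alarms; all numbers here are hypothetical:

```python
import math

def optimal_criterion(d_prime, p_signal, cost_miss, cost_fa):
    """Optimal observation threshold for equal-variance Gaussian signal
    detection: respond 'predator' when the observation exceeds this value."""
    beta = ((1 - p_signal) / p_signal) * (cost_fa / cost_miss)
    return d_prime / 2 + math.log(beta) / d_prime

# Predators are rare (10% prior), but a miss costs ten times a false
# alarm, so the flee threshold drops below the neutral point d'/2.
print(optimal_criterion(d_prime=2.0, p_signal=0.1, cost_miss=10.0, cost_fa=1.0))
```

    A formerly adaptive threshold like this becomes maladaptive when the true `p_signal` or cost structure changes (e.g. harmless ecotourists mistaken for predators), which is the mismatch the paper quantifies.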

  14. Reliability analysis in interdependent smart grid systems

    NASA Astrophysics Data System (ADS)

    Peng, Hao; Kan, Zhe; Zhao, Dandan; Han, Jianmin; Lu, Jianfeng; Hu, Zhaolong

    2018-06-01

    Complex network theory is a useful way to study many real complex systems. In this paper, a reliability analysis model based on complex network theory is introduced for interdependent smart grid systems. We focus on understanding the structure of smart grid systems, studying the underlying network model, the interactions and relationships among components, and how cascading failures occur in interdependent smart grid systems. We propose a practical model for interdependent smart grid systems using complex network theory. Based on percolation theory, we also study the cascading failure effect and present a detailed mathematical analysis of failure propagation in such systems. We analyze the reliability of our proposed model under random attacks or failures by calculating the size of the giant functioning component in interdependent smart grid systems. Our simulation results show that there exists a threshold for the proportion of faulty nodes, beyond which the smart grid systems collapse, and we determine the critical values for different system parameters. In this way, the reliability analysis model based on complex network theory can be effectively utilized for anti-attack and protection purposes in interdependent smart grid systems.
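
    A minimal sketch of the kind of percolation-style cascade experiment described above, assuming a simplified one-to-one dependency between two Erdos-Renyi networks (the paper's actual grid model and parameters may differ): nodes survive only while they belong to the giant component of both coupled networks.

```python
import random
from collections import defaultdict, deque

def er_graph(n, k_avg, rng):
    """Erdos-Renyi graph with average degree k_avg, as an adjacency dict."""
    p = k_avg / (n - 1)
    adj = defaultdict(set)
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                adj[i].add(j)
                adj[j].add(i)
    return adj

def giant_component(alive, adj):
    """Largest connected component restricted to the surviving nodes."""
    best, seen = set(), set()
    for s in alive:
        if s in seen:
            continue
        comp, queue = {s}, deque([s])
        seen.add(s)
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v in alive and v not in seen:
                    seen.add(v)
                    comp.add(v)
                    queue.append(v)
        if len(comp) > len(best):
            best = comp
    return best

def cascade(n, k_avg, p_keep, seed=0):
    """Fraction of nodes in the mutual giant component after removing a
    fraction 1 - p_keep of network A's nodes (node i of A depends on
    node i of B and vice versa)."""
    rng = random.Random(seed)
    a, b = er_graph(n, k_avg, rng), er_graph(n, k_avg, rng)
    alive = set(rng.sample(range(n), int(p_keep * n)))  # initial attack
    while True:
        alive = giant_component(alive, a)   # failures in A propagate to B
        nxt = giant_component(alive, b)     # failures in B propagate back
        if nxt == alive:
            return len(alive) / n
        alive = nxt

# Above the coupled percolation threshold a sizeable mutual component
# survives; below it the interdependent system collapses almost entirely.
print(cascade(n=2000, k_avg=4.0, p_keep=0.9),
      cascade(n=2000, k_avg=4.0, p_keep=0.3))
```

    The abrupt jump between the two printed fractions illustrates the threshold behaviour the abstract reports for the proportion of faulty nodes.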

  15. Changing Race Relations in Organizations: A Comparison of Theories.

    DTIC Science & Technology

    1985-03-01

    collective term, is used to characterize individuals whose behavior is strongly influenced by how it will affect others. In contrast, idiocentric is the...term for individuals who give more weight to how their behavior will affect themselves rather than others. Triandis (1983) refers to an allocentric...and applying them to the affect , cognitions, and behavior of investigators as well as of respondents. It means bringing organization theory to the

  16. Organize to manage reliability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ricketts, R.

    An analysis of maintenance costs in hydrocarbon processing industry (HPI) plants has revealed that attitudes and practices of personnel are the major single bottom line factor. In reaching this conclusion, Solomon Associates examined comparative analysis of plant records over the past decade. The authors learned that there was a wide range of performance independent of refinery age, capacity, processing complexity, and location. Facilities of all extremes in these attributes are included in both high-cost and low-cost categories. Those in the lowest quartile of performance posted twice the resource consumption of the best quartile. Furthermore, there was almost no similarity between refineries within a single company. The paper discusses cost versus availability, maintenance spending, two organizational approaches used (repair focused and reliability focused), and organizational style and structure.

  17. Generalizability Theory and Classical Test Theory

    ERIC Educational Resources Information Center

    Brennan, Robert L.

    2011-01-01

    Broadly conceived, reliability involves quantifying the consistencies and inconsistencies in observed scores. Generalizability theory, or G theory, is particularly well suited to addressing such matters in that it enables an investigator to quantify and distinguish the sources of inconsistencies in observed scores that arise, or could arise, over…

  18. Australian perioperative nurses' experiences of assisting in multi-organ procurement surgery: a grounded theory study.

    PubMed

    Smith, Zaneta; Leslie, Gavin; Wynaden, Dianne

    2015-03-01

    Multi-organ procurement surgical procedures through the generosity of deceased organ donors, have made an enormous impact on extending the lives of recipients. There is a dearth of in-depth knowledge relating to the experiences of perioperative nurses working closely with organ donors undergoing multi-organ procurement surgical procedures. The aim of this study was to address this gap by describing the perioperative nurses experiences of participating in multi-organ procurement surgical procedures and interpreting these findings as a substantive theory. This qualitative study used grounded theory methodology to generate a substantive theory of the experiences of perioperative nurses participating in multi-organ procurement surgery. Recruitment of participants took place after the study was advertised via a professional newsletter and journal. The study was conducted with participants from metropolitan, rural and regional areas of two Australian states; New South Wales and Western Australia. Thirty five perioperative nurse participants with three to 39 years of professional nursing experience informed the study. Semi structured in-depth interviews were undertaken from July 2009 to April 2010 with a mean interview time of 60 min. Interview data was transcribed verbatim and analysed using the constant comparative method. The study results draw attention to the complexities that exist for perioperative nurses when participating in multi-organ procurement surgical procedures reporting a basic social psychological problem articulated as hiding behind a mask and how they resolved this problem by the basic social psychological process of finding meaning. This study provides a greater understanding of how these surgical procedures impact on perioperative nurses by providing a substantive theory of this experience. The findings have the potential to guide further research into this challenging area of nursing practice with implications for clinical initiatives, management

  19. Reliability of a tool for measuring theory of planned behaviour constructs for use in evaluating research use in policymaking

    PubMed Central

    2011-01-01

    Background Although measures of knowledge translation and exchange (KTE) effectiveness based on the theory of planned behavior (TPB) have been used among patients and providers, no measure has been developed for use among health system policymakers and stakeholders. A tool that measures the intention to use research evidence in policymaking could assist researchers in evaluating the effectiveness of KTE strategies that aim to support evidence-informed health system decision-making. Therefore, we developed a 15-item tool to measure four TPB constructs (intention, attitude, subjective norm and perceived control) and assessed its face validity through key informant interviews. Methods We carried out a reliability study to assess the tool's internal consistency and test-retest reliability. Our study sample consisted of 62 policymakers and stakeholders that participated in deliberative dialogues. We assessed internal consistency using Cronbach's alpha and generalizability (G) coefficients, and we assessed test-retest reliability by calculating Pearson correlation coefficients (r) and G coefficients for each construct and the tool overall. Results The internal consistency of items within each construct was good with alpha ranging from 0.68 to alpha = 0.89. G-coefficients were lower for a single administration (G = 0.34 to G = 0.73) than for the average of two administrations (G = 0.79 to G = 0.89). Test-retest reliability coefficients for the constructs ranged from r = 0.26 to r = 0.77 and from G = 0.31 to G = 0.62 for a single administration, and from G = 0.47 to G = 0.86 for the average of two administrations. Test-retest reliability of the tool using G theory was moderate (G = 0.5) when we generalized across a single observation, but became strong (G = 0.9) when we averaged across both administrations. 
Conclusion This study provides preliminary evidence for the reliability of a tool that can be used to measure TPB constructs in relation to research use in policymaking.
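
    The gain from averaging two administrations can be sketched with the generic form of a generalizability coefficient for relative decisions, in which the error variance is divided by the number of administrations averaged over; the variance components below are illustrative and are not those estimated in the study (the study's actual gains depend on its full variance-component design):

```python
def g_coefficient(var_person, var_error, n_administrations=1):
    """Generalizability coefficient when scores are averaged over
    n_administrations, which divides the error variance accordingly."""
    return var_person / (var_person + var_error / n_administrations)

# With equal person and error variance, a single administration gives
# G = 0.5; averaging two administrations raises the coefficient.
print(g_coefficient(1.0, 1.0, 1), g_coefficient(1.0, 1.0, 2))
```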

  20. Application of fuzzy set and Dempster-Shafer theory to organic geochemistry interpretation

    NASA Technical Reports Server (NTRS)

    Kim, C. S.; Isaksen, G. H.

    1993-01-01

    An application of fuzzy sets and Dempster-Shafer Theory (DST) in modeling the interpretational process of organic geochemistry data for predicting the levels of maturity of oil and source rock samples is presented. This was accomplished by (1) representing linguistic imprecision and imprecision associated with experience by fuzzy set theory, (2) capturing the probabilistic nature of imperfect evidence by DST, and (3) combining multiple evidences by utilizing John Yen's generalized Dempster-Shafer Theory (GDST), which allows DST to deal with fuzzy information. The current prototype provides collective beliefs on the predicted levels of maturity by combining multiple evidences through GDST's rule of combination.

  1. First evidence on the validity and reliability of the Safety Organizing Scale-Nursing Home version (SOS-NH).

    PubMed

    Ausserhofer, Dietmar; Anderson, Ruth A; Colón-Emeric, Cathleen; Schwendimann, René

    2013-08-01

    The Safety Organizing Scale is a valid and reliable measure of safety behaviors and practices in hospitals. This study aimed to explore the psychometric properties of the Safety Organizing Scale-Nursing Home version (SOS-NH). In a cross-sectional analysis of staff survey data, we examined validity and reliability of the 9-item SOS-NH using American Educational Research Association guidelines. This substudy of a larger trial used baseline survey data collected from staff members (n = 627) in a variety of work roles in 13 nursing homes (NHs) in North Carolina and Virginia. Psychometric evaluation of the SOS-NH revealed good response patterns with a low average of missing values across all items (3.05%). Analyses of the SOS-NH's internal structure (eg, comparative fit indices = 0.929, standardized root mean square error of approximation = 0.045) and consistency (composite reliability = 0.94) suggested its unidimensionality. Significant between-facility variability, intraclass correlations, within-group agreement, and design effect confirmed the appropriateness of the SOS-NH for measurement at the NH level, justifying data aggregation. The SOS-NH showed discriminant validity from one related concept: communication openness. Initial evidence regarding validity and reliability of the SOS-NH supports its utility in measuring safety behaviors and practices among a wide range of NH staff members, including those with low literacy. Further psychometric evaluation should focus on testing concurrent and criterion validity, using resident outcome measures (eg, patient fall rates). Copyright © 2013 American Medical Directors Association, Inc. All rights reserved.

  2. Assuring reliability program effectiveness.

    NASA Technical Reports Server (NTRS)

    Ball, L. W.

    1973-01-01

    An attempt is made to provide simple identification and description of techniques that have proved to be most useful either in developing a new product or in improving reliability of an established product. The first reliability task is obtaining and organizing parts failure rate data. Other tasks are parts screening, tabulation of general failure rates, preventive maintenance, prediction of new product reliability, and statistical demonstration of achieved reliability. Five principal tasks for improving reliability involve the physics of failure research, derating of internal stresses, control of external stresses, functional redundancy, and failure effects control. A final task is the training and motivation of reliability specialist engineers.

  3. Current medical staff governance and physician sensemaking: a formula for resistance to high reliability.

    PubMed

    Flitter, Marc A; Riesenmy, Kelly Rouse; van Stralen, Daved

    2012-01-01

    To offer a theoretical explanation for observed physician resistance and rejection of high reliability patient safety initiatives. A grounded theoretical qualitative approach, utilizing the organizational theory of sensemaking, provided the foundation for inductive and deductive reasoning employed to analyze medical staff rejection of two successfully performing high reliability programs at separate hospitals. Physician behaviors resistant to patient-centric high reliability processes were traced to provider-centric physician sensemaking. Prospective research, free of the limitations of this retrospective investigation, is needed to evaluate the potential for overcoming physician resistance to innovation implementation, employing strategies based upon these findings and sensemaking theory in general. If hospitals are to emulate high reliability industries that successfully manage environments of extreme hazard, physicians must be fully integrated into the complex teams required to accomplish this goal. Reforming health care through high reliability organizing, with its attendant continuous focus on patient-centric processes, offers a distinct alternative to efforts directed primarily at reforming health care insurance. It is by changing how health care is provided that true cost efficiencies can be achieved. Technology and the insights of organizational science present the opportunity of replacing the current emphasis on privileged information with collective tools capable of providing quality and safety in health care. The fictions that have sustained a provider-centric health care system have been challenged. The benefits of patient-centric care should be obtainable.

  4. 76 FR 23222 - Electric Reliability Organization Interpretation of Transmission Operations Reliability

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-26

    ... applications or print-to-PDF format, and not in a scanned format, at http://www.ferc.gov/docs-filing/efiling....3d 1342 (DC Cir. 2009). \\5\\ Mandatory Reliability Standards for the Bulk-Power System, Order No. 693... applications or print-to-PDF format and not in a scanned format. Commenters filing electronically do not need...

  5. An examination of three theoretical models to explain the organ donation attitude--registration discrepancy among mature adults.

    PubMed

    Quick, Brian L; Anker, Ashley E; Feeley, Thomas Hugh; Morgan, Susan E

    2016-01-01

    An inconsistency in the research indicates that positive attitudes toward organ donation do not map reliably onto organ donor registrations. Various models have sought to explain this inconsistency, and the current analysis formally compared three of them: the Bystander Intervention Model (BIM), the Organ Donor Model (ODM), and Vested Interest Theory (VIT). Mature adults (N = 688) between the ages of 50 and 64 years completed surveys related to organ donation. Results revealed that VIT accounted for the most variance in organ donation registrations, followed by the BIM and ODM. The discussion emphasizes the importance of employing theories to explain a phenomenon as well as the practical implications of the findings.

  6. 75 FR 80391 - Electric Reliability Organization Interpretations of Interconnection Reliability Operations and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-22

    ... configuration to maintain system stability, acceptable voltage or power flows.\\12\\ \\12\\ In the Western... prevent system instability or cascading outages, and protect other facilities in response to transmission... nature used to address system reliability vulnerabilities to prevent system instability, cascading...

  7. Reliability Evaluation and Improvement Approach of Chemical Production Man - Machine - Environment System

    NASA Astrophysics Data System (ADS)

    Miao, Yongchun; Kang, Rongxue; Chen, Xuefeng

    2017-12-01

    In recent years, with the gradual extension of reliability research, the study of production system reliability has become a hot topic in various industries. A man-machine-environment system is a complex system composed of human factors, machinery equipment and the environment. The reliability of each individual factor must be analyzed first in order to transition gradually to the study of three-factor reliability. Meanwhile, the dynamic relationships among man, machine and environment should be considered to establish an effective fuzzy evaluation mechanism that can truly and effectively analyze the reliability of such systems. In this paper, based on systems engineering, fuzzy theory, reliability theory, human error, environmental impact and machinery equipment failure theory, the reliabilities of the human factor, machinery equipment and environment of a chemical production system were studied by the method of fuzzy evaluation. Finally, the reliability of the man-machine-environment system was calculated to obtain the weighted result, which indicated that the reliability value of this chemical production system was 86.29. From the given evaluation domain it can be seen that the reliability of the integrated man-machine-environment system is in a good status, and effective measures for further improvement were proposed according to the fuzzy calculation results.
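
    The final weighted-synthesis step of such a fuzzy comprehensive evaluation can be sketched as a weighted aggregation of factor-level scores; the scores and weights below are hypothetical and are not the values behind the paper's 86.29 (the membership-function stage that produces the factor scores is omitted):

```python
# Hypothetical factor-level reliability scores (0-100) and weights.
scores = {"human": 82.0, "machine": 90.0, "environment": 87.0}
weights = {"human": 0.40, "machine": 0.35, "environment": 0.25}

# Weighted synthesis of the three factor reliabilities into one system value.
system_reliability = sum(weights[f] * scores[f] for f in scores)
print(round(system_reliability, 2))
```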

  8. Service quality and maturity of health care organizations through the lens of Complexity Leadership Theory.

    PubMed

    Horvat, Ana; Filipovic, Jovan

    2018-02-01

    This research focuses on Complexity Leadership Theory and the relationship between leadership-examined through the lens of Complexity Leadership Theory-and organizational maturity as an indicator of the performance of health organizations. The research adopts a perspective that conceptualizes organizations as complex adaptive systems and draws upon a survey of opinion of 189 managers working in Serbian health organizations. As the results indicate a dependency between functions of leadership and levels of the maturity of health organizations, we propose a model that connects the two. The study broadens our understanding of the implications of complexity thinking and its reflection on leadership functions and overall organizational performance. The correlations between leadership functions and maturity could have practical applications in policy processing, thus improving the quality of outcomes and the overall level of service quality. © 2017 John Wiley & Sons, Ltd.

  9. Customer-organization relationships: development and test of a theory of extended identities.

    PubMed

    Bagozzi, Richard P; Bergami, Massimo; Marzocchi, Gian Luca; Morandin, Gabriele

    2012-01-01

    We develop a theory of personal, relational, and collective identities that links organizations and consumers. Four targets of identity are studied: small friendship groups of aficionados of Ducati motorcycles, virtual communities centered on Ducatis, the Ducati brand, and Ducati the company. The interplay amongst the identities is shown to order affective, cognitive, and evaluative reactions toward each target. Hypotheses are tested on a sample of 210 Ducati aficionados, and implications of these multiple, extended identities for organizations are examined.

  10. Conceptualizing Essay Tests' Reliability and Validity: From Research to Theory

    ERIC Educational Resources Information Center

    Badjadi, Nour El Imane

    2013-01-01

    The current paper on writing assessment surveys the literature on the reliability and validity of essay tests. The paper aims to examine the two concepts in relationship with essay testing as well as to provide a snapshot of the current understandings of the reliability and validity of essay tests as drawn in recent research studies. Bearing in…

  11. Accuracy of a Classical Test Theory-Based Procedure for Estimating the Reliability of a Multistage Test. Research Report. ETS RR-17-02

    ERIC Educational Resources Information Center

    Kim, Sooyeon; Livingston, Samuel A.

    2017-01-01

    The purpose of this simulation study was to assess the accuracy of a classical test theory (CTT)-based procedure for estimating the alternate-forms reliability of scores on a multistage test (MST) having 3 stages. We generated item difficulty and discrimination parameters for 10 parallel, nonoverlapping forms of the complete 3-stage test and…

  12. Applications of the Conceptual Density Functional Theory Indices to Organic Chemistry Reactivity.

    PubMed

    Domingo, Luis R; Ríos-Gutiérrez, Mar; Pérez, Patricia

    2016-06-09

    Theoretical reactivity indices based on the conceptual Density Functional Theory (DFT) have become a powerful tool for the semiquantitative study of organic reactivity. A large number of reactivity indices have been proposed in the literature. Herein, global quantities like the electronic chemical potential μ, the electrophilicity ω and the nucleophilicity N indices, and local condensed indices like the electrophilic P_k^+ and nucleophilic P_k^- Parr functions, as the most relevant indices for the study of organic reactivity, are discussed.

  13. Using chemical organization theory for model checking

    PubMed Central

    Kaleta, Christoph; Richter, Stephan; Dittrich, Peter

    2009-01-01

Motivation: The increasing number and complexity of biomodels make automatic procedures for checking the models' properties and quality necessary. Approaches like elementary mode analysis, flux balance analysis, deficiency analysis and chemical organization theory (OT) require only the stoichiometric structure of the reaction network for derivation of valuable information. In formalisms like Systems Biology Markup Language (SBML), however, information about the stoichiometric coefficients required for an analysis of chemical organizations can be hidden in kinetic laws. Results: First, we introduce an algorithm that uncovers stoichiometric information that might be hidden in the kinetic laws of a reaction network. This allows us to apply OT to SBML models using modifiers. Second, using the new algorithm, we performed a large-scale analysis of the 185 models contained in the manually curated BioModels Database. We found that for 41 models (22%) the set of organizations changes when modifiers are considered correctly. We discuss one of these models in detail (BIOMD149, a combined model of the ERK- and Wnt-signaling pathways), whose set of organizations drastically changes when modifiers are considered. Third, we found inconsistencies in 5 models (3%) and identified their characteristics. Compared with flux-based methods, OT is able to identify more accurately [in 26 cases (14%)] those species and reactions that can be present in a long-term simulation of the model. We conclude that our approach is a valuable tool that helps to improve the consistency of biomodels and their repositories. Availability: All data and a Java applet to check SBML models are available from http://www.minet.uni-jena.de/csb/prj/ot/tools Contact: dittrich@minet.uni-jena.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:19468053

  14. Phenotypes of antibody-mediated rejection in organ transplants.

    PubMed

    Mengel, Michael; Husain, Sufia; Hidalgo, Luis; Sis, Banu

    2012-06-01

    Antibody-mediated hyperacute rejection was the first rejection phenotype observed in human organ transplants. This devastating phenotype was eliminated by reliable crossmatch technologies. Since then, the focus was on T-cell-mediated rejection and de novo donor-specific antibodies were considered an epiphenomenon of cognate T-cell activation. The immune theory was that controlling the T-cell response would entail elimination of antibody-mediated rejection (ABMR). With modern immunosuppressive drugs, T-cell-mediated rejection is essentially treatable. However, this did not prevent ABMR from emerging as a significant phenotype in all types of organ transplants. It became obvious that both rejection types require distinct treatment and thus reliable diagnosis. This is the current challenge. ABMR, depending on stage, grade, time course, organ type or prior treatment, can present with a wide spectrum of phenotypes. This review summarizes the current diagnostic consensus for ABMR, describes unmet needs and challenges in diagnostics, and proposes new approaches for consideration. © 2012 The Authors. Transplant International © 2012 European Society for Organ Transplantation.

  15. The evolution of senescence through decelerating selection for system reliability.

    PubMed

    Laird, R A; Sherratt, T N

    2009-05-01

Senescence is a universal phenomenon in organisms, characterized by increasing mortality and decreasing fecundity with advancing chronological age. Most proximate agents of senescence, such as reactive oxygen species and UV radiation, are thought to operate by causing a gradual build-up of bodily damage. Yet most current evolutionary theories of senescence emphasize the deleterious effects of functioning genes in late life, leaving a gap between proximate and ultimate explanations. Here, we present an evolutionary model of senescence based on reliability theory, in which beneficial genes or gene products gradually get damaged and thereby fail, rather than actively cause harm. Specifically, the model allows organisms to evolve multiple redundant copies of a gene product (or gene) that performs a vital function, assuming that organisms can avoid condition-dependent death so long as at least one copy remains undamaged. We show that organisms with low levels of extrinsic mortality, and high levels of genetic damage, tend to evolve high levels of redundancy, and that mutation-selection balance results in a stable population distribution of the number of redundant elements. In contrast to previous evolutionary models of senescence, the mortality curves that emerge from such populations match empirical senescence patterns in three key respects: (1) an initially low, but rapidly increasing, mortality rate at young ages; (2) a plateau in mortality at advanced ages; and (3) 'mortality compensation', whereby the height of the mortality plateau is independent of the environmental conditions under which different populations evolved.
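The failure kinetics this abstract describes (condition-dependent death occurs only once every redundant copy of a vital element is damaged) can be sketched as a toy simulation. All parameter values below are hypothetical, chosen only to make the shape of the mortality curve visible; this is an illustration of the general idea, not the authors' model:

```python
import random

def simulate_mortality(n_individuals=20_000, redundancy=5,
                       p_damage=0.05, p_extrinsic=0.001,
                       t_max=100, seed=1):
    """Per-time-step death rate (hazard) for a cohort whose vital function
    is carried by `redundancy` independent copies; an individual dies when
    all copies are damaged, or by chance (extrinsic mortality)."""
    rng = random.Random(seed)
    alive = [redundancy] * n_individuals  # intact copies per individual
    hazard = []
    for _ in range(t_max):
        at_risk = len(alive)
        if at_risk == 0:
            break
        survivors, deaths = [], 0
        for copies in alive:
            # each intact copy independently fails this time step
            copies -= sum(rng.random() < p_damage for _ in range(copies))
            if copies == 0 or rng.random() < p_extrinsic:
                deaths += 1
            else:
                survivors.append(copies)
        hazard.append(deaths / at_risk)
        alive = survivors
    return hazard
```

With these settings the hazard starts near the extrinsic rate, climbs steeply as redundancy is eroded, and then levels off as survivors become concentrated at one intact copy, qualitatively reproducing features (1) and (2) described above.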

  16. High-Reliability Health Care: Getting There from Here

    PubMed Central

    Chassin, Mark R; Loeb, Jerod M

    2013-01-01

    Context Despite serious and widespread efforts to improve the quality of health care, many patients still suffer preventable harm every day. Hospitals find improvement difficult to sustain, and they suffer “project fatigue” because so many problems need attention. No hospitals or health systems have achieved consistent excellence throughout their institutions. High-reliability science is the study of organizations in industries like commercial aviation and nuclear power that operate under hazardous conditions while maintaining safety levels that are far better than those of health care. Adapting and applying the lessons of this science to health care offer the promise of enabling hospitals to reach levels of quality and safety that are comparable to those of the best high-reliability organizations. Methods We combined the Joint Commission's knowledge of health care organizations with knowledge from the published literature and from experts in high-reliability industries and leading safety scholars outside health care. We developed a conceptual and practical framework for assessing hospitals’ readiness for and progress toward high reliability. By iterative testing with hospital leaders, we refined the framework and, for each of its fourteen components, defined stages of maturity through which we believe hospitals must pass to reach high reliability. Findings We discovered that the ways that high-reliability organizations generate and maintain high levels of safety cannot be directly applied to today's hospitals. We defined a series of incremental changes that hospitals should undertake to progress toward high reliability. These changes involve the leadership's commitment to achieving zero patient harm, a fully functional culture of safety throughout the organization, and the widespread deployment of highly effective process improvement tools. Conclusions Hospitals can make substantial progress toward high reliability by undertaking several specific

  17. Highly Conductive and Reliable Copper-Filled Isotropically Conductive Adhesives Using Organic Acids for Oxidation Prevention

    NASA Astrophysics Data System (ADS)

    Chen, Wenjun; Deng, Dunying; Cheng, Yuanrong; Xiao, Fei

    2015-07-01

The easy oxidation of copper is one critical obstacle to high-performance copper-filled isotropically conductive adhesives (ICAs). In this paper, a facile method to prepare highly reliable, highly conductive, and low-cost ICAs is reported. The copper fillers were treated by organic acids for oxidation prevention. Compared with ICA filled with untreated copper flakes, the ICA filled with copper flakes treated by different organic acids exhibited much lower bulk resistivity. The lowest bulk resistivity achieved was 4.5 × 10⁻⁵ Ω cm, which is comparable to that of commercially available Ag-filled ICA. After 500 h of 85°C/85% relative humidity (RH) aging, the treated ICAs showed quite stable bulk resistivity and relatively stable contact resistance. Through analyzing the results of x-ray diffraction, x-ray photoelectron spectroscopy, and thermogravimetric analysis, we found that, with the assistance of organic acids, the treated copper flakes exhibited resistance to oxidation, thus guaranteeing good performance.

  18. 18 CFR 39.11 - Reliability reports.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Electric Reliability Organization shall conduct assessments of the adequacy of the Bulk-Power System in... assessments as determined by the Commission of the reliability of the Bulk-Power System in North America and... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Reliability reports. 39...

  19. Evaluating deceased organ donation: a programme theory approach.

    PubMed

    Manzano, Ana; Pawson, Ray

    2014-01-01

    Organ donation and transplantation services represent a microcosm of modern healthcare organisations. They are complex adaptive systems. They face perpetual problems of matching supply and demand. They operate under fierce time and resource constraints. And yet they have received relatively little attention from a systems perspective. The purpose of this paper is to consider some of the fundamental issues in evaluating, improving and policy reform in such complex systems. The paper advocates an approach based on programme theory evaluation. The paper explains how the death to donation to transplantation process depends on the accumulation of series of embedded, institutional sub-processes. Evaluators need to be concerned with this whole system rather than with its discrete parts or sectors. Policy makers may expect disappointment if they seek to improve donation rates by applying nudges or administrative reforms at a single point in the implementation chain. These services represent concentrated, perfect storms of complexity and the paper offers guidance to practitioners with bio-medical backgrounds on how such services might be evaluated and improved. For the methodological audience the paper caters for the burgeoning interest in programme theory evaluation while illustrating the design phase of this research strategy.

  20. The Estimation of the IRT Reliability Coefficient and Its Lower and Upper Bounds, with Comparisons to CTT Reliability Statistics

    ERIC Educational Resources Information Center

    Kim, Seonghoon; Feldt, Leonard S.

    2010-01-01

The primary purpose of this study is to investigate the mathematical characteristics of the test reliability coefficient ρ_XX′ as a function of item response theory (IRT) parameters and present the lower and upper bounds of the coefficient. Another purpose is to examine relative performances of the IRT reliability statistics and two…
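A related computation (not the authors' specific coefficient or bounds) is the standard marginal-reliability approximation from IRT, ρ ≈ σ²_θ / (σ²_θ + E[SE²]), where the squared standard error 1/I(θ) is averaged over the ability distribution. A minimal sketch for a hypothetical 2PL test under a standard-normal ability distribution:

```python
import math

def item_info(theta, a, b):
    """Fisher information of a 2PL item at ability theta."""
    p = 1.0 / (1.0 + math.exp(-a * (theta - b)))
    return a * a * p * (1.0 - p)

def marginal_reliability(items, grid=None):
    """Marginal IRT reliability rho = var(theta) / (var(theta) + E[SE^2]),
    with SE^2 = 1 / I(theta) averaged over N(0, 1) on a quadrature grid.
    `items` is a list of hypothetical (a, b) parameter pairs."""
    if grid is None:
        grid = [-4.0 + 0.1 * i for i in range(81)]
    weights = [math.exp(-t * t / 2.0) for t in grid]
    mean_se2 = sum(
        w / max(sum(item_info(t, a, b) for a, b in items), 1e-9)
        for t, w in zip(grid, weights)
    ) / sum(weights)
    return 1.0 / (1.0 + mean_se2)  # var(theta) = 1 for N(0, 1)
```

As expected, reliability grows with test length: a 40-item form with difficulties spread over [-2, 2] yields a higher coefficient than its first 10 items alone.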

  1. Influences on the Test-Retest Reliability of Functional Connectivity MRI and its Relationship with Behavioral Utility.

    PubMed

    Noble, Stephanie; Spann, Marisa N; Tokoglu, Fuyuze; Shen, Xilin; Constable, R Todd; Scheinost, Dustin

    2017-11-01

Best practices are currently being developed for the acquisition and processing of resting-state magnetic resonance imaging data used to estimate brain functional organization, or "functional connectivity." Standards have been proposed based on test-retest reliability, but open questions remain. These include how the amount of data per subject influences whole-brain reliability, the influence of increasing runs versus sessions, the spatial distribution of reliability, the reliability of multivariate methods, and, crucially, how reliability maps onto prediction of behavior. We collected a dataset of 12 extensively sampled individuals (144 min data each across 2 identically configured scanners) to assess test-retest reliability of whole-brain connectivity within the generalizability theory framework. We used Human Connectome Project data to replicate these analyses and relate reliability to behavioral prediction. Overall, the historical 5-min scan produced poor reliability averaged across connections. Increasing the number of sessions was more beneficial than increasing runs. Reliability was lowest for subcortical connections and highest for within-network cortical connections. Multivariate reliability was greater than univariate. Finally, reliability could not be used to improve prediction; these findings are among the first to underscore this distinction for functional connectivity. A comprehensive understanding of test-retest reliability, including its limitations, supports the development of best practices in the field. © The Author 2017. Published by Oxford University Press.

  2. A Contribution to a Theory of Organizations: An Examination of Student Protest.

    ERIC Educational Resources Information Center

    Norr, James L.

    Until recently most of the research on college student protest of the 1960's has taken either a political socialization or cultural-historical perspective. The research reported here takes an organizational perspective with the expectation that an examination of student protest should contribute to a theory of organizations. Two classes of…

  3. 78 FR 38851 - Electric Reliability Organization Proposal To Retire Requirements in Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-28

    ... 215 of the Federal Power Act, the Commission proposes to approve the retirement of 34 requirements... of the Reliability Standards. In addition, the Commission proposes to withdraw 41 outstanding...-Power System. This proposal is part of the Commission's ongoing effort to review its requirements and...

  4. Reliability and validity of a Chinese version of the Modified Body Image Scale in patients with symptomatic pelvic organ prolapse.

    PubMed

    Zhu, Lan; Wang, Xiaoqian; Shi, Honghui; Xu, Tao; Lang, Jinghe; Tang, Xiang

    2015-08-01

To validate a Chinese version of the Modified Body Image Scale (MBIS) among patients with symptomatic pelvic organ prolapse. As part of a validation study at a center in Beijing, China, women with symptomatic pelvic organ prolapse stage II or greater completed the Chinese version of the MBIS, the 12-item Short-Form Health Survey (SF-12), and the Pelvic Organ Prolapse/Urinary Incontinence Sexual Questionnaire (PISQ-12). A sample of 30 women was randomly chosen to return 2 weeks later to complete the questionnaires again. The reliability and validity of the MBIS were assessed. Overall, 52 patients participated. A Cronbach α of 0.926 demonstrated adequate internal consistency of the Chinese MBIS. Its reproducibility was demonstrated by intraclass correlation coefficient values of 0.554-0.963 (P<0.01 for all items). Confirmatory factor analysis supported its construct validity. The MBIS and SF-12 scores were negatively correlated (r=-0.390; P<0.001), and the MBIS and PISQ-12 scores were also negatively correlated (r=-0.709; P<0.001). The Chinese version of the MBIS is a reliable and valid tool to evaluate body image perception among patients with symptomatic pelvic organ prolapse. Copyright © 2015. Published by Elsevier Ireland Ltd.

  5. High-reliability health care: getting there from here.

    PubMed

    Chassin, Mark R; Loeb, Jerod M

    2013-09-01

    Despite serious and widespread efforts to improve the quality of health care, many patients still suffer preventable harm every day. Hospitals find improvement difficult to sustain, and they suffer "project fatigue" because so many problems need attention. No hospitals or health systems have achieved consistent excellence throughout their institutions. High-reliability science is the study of organizations in industries like commercial aviation and nuclear power that operate under hazardous conditions while maintaining safety levels that are far better than those of health care. Adapting and applying the lessons of this science to health care offer the promise of enabling hospitals to reach levels of quality and safety that are comparable to those of the best high-reliability organizations. We combined the Joint Commission's knowledge of health care organizations with knowledge from the published literature and from experts in high-reliability industries and leading safety scholars outside health care. We developed a conceptual and practical framework for assessing hospitals' readiness for and progress toward high reliability. By iterative testing with hospital leaders, we refined the framework and, for each of its fourteen components, defined stages of maturity through which we believe hospitals must pass to reach high reliability. We discovered that the ways that high-reliability organizations generate and maintain high levels of safety cannot be directly applied to today's hospitals. We defined a series of incremental changes that hospitals should undertake to progress toward high reliability. These changes involve the leadership's commitment to achieving zero patient harm, a fully functional culture of safety throughout the organization, and the widespread deployment of highly effective process improvement tools. Hospitals can make substantial progress toward high reliability by undertaking several specific organizational change initiatives. Further research

  6. Reliability Assessment of Graphite Specimens under Multiaxial Stresses

    NASA Technical Reports Server (NTRS)

    Sookdeo, Steven; Nemeth, Noel N.; Bratton, Robert L.

    2008-01-01

An investigation was conducted to predict the failure strength response of IG-100 nuclear grade graphite exposed to multiaxial stresses. As part of this effort, a review of failure criteria accounting for the stochastic strength response is provided. The experimental work was performed in the early 1990s at the Oak Ridge National Laboratory (ORNL) on hollow graphite tubes under the action of axial tensile loading and internal pressurization. As part of the investigation, finite-element analysis (FEA) was performed and compared with results of FEA from the original ORNL report. The new analysis generally compared well with the original analysis, although some discrepancies in the location of peak stresses were noted. The Ceramics Analysis and Reliability Evaluation of Structures Life prediction code (CARES/Life) was used with the FEA results to predict the quadrant I (tensile-tensile) and quadrant IV (compression-tension) strength response of the graphite tubes for the principle of independent action (PIA), the Weibull normal stress averaging (NSA), and the Batdorf multiaxial failure theories. The CARES/Life reliability analysis showed that all three failure theories gave similar results in quadrant I but that in quadrant IV, the PIA and Weibull normal stress-averaging theories were not conservative, whereas the Batdorf theory was able to correlate well with experimental results. The conclusion of the study was that the Batdorf theory should generally be used to predict the reliability response of graphite and brittle materials in multiaxial loading situations.
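The principle of independent action (PIA) named above treats each tensile principal stress as an independent Weibull failure mode and ignores compressive stresses entirely, which is one way to see why it can be non-conservative in quadrant IV. A minimal sketch with hypothetical Weibull parameters (not the IG-100 values from the study):

```python
import math

def pia_failure_probability(sigma1, sigma2, sigma3=0.0,
                            sigma0=30.0, m=10.0, volume=1.0):
    """Principle of independent action: failure probability under principal
    stresses (MPa) assuming each tensile component is an independent
    two-parameter Weibull failure mode with scale sigma0 and modulus m.
    Compressive (negative) stresses contribute nothing to the risk sum."""
    risk = volume * sum(
        (s / sigma0) ** m for s in (sigma1, sigma2, sigma3) if s > 0
    )
    return 1.0 - math.exp(-risk)
```

Because negative stresses drop out of the risk sum, `pia_failure_probability(25, -40)` equals `pia_failure_probability(25, 0)`: PIA predicts no effect of the compressive component in tension-compression loading, consistent with the non-conservatism reported for quadrant IV.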

  7. [Reliability and validity of the Japanese version of the Thinking Style Inventory].

    PubMed

    Ochiai, Jun; Maie, Yuko; Wada, Yuichi

    2016-06-01

This study examined the internal and external validity of the Japanese version of the Thinking Styles Inventory (TSI: Hiruma, 2000), which was originally developed by Sternberg and Wagner (1991) based on the framework of Sternberg's (1988) theory of mental self-government. The term "thinking style" refers to the concept that individuals differ in how they organize, direct, and manage their own thinking activities. We administered the Japanese version of the TSI to Japanese participants (N = 655; age range 20-84 years). The results of item analysis, reliability analysis, and factor analysis were consistent with the general ideas of the theory. In addition, there were significant relationships between certain thinking styles and 3 participant characteristics: age, gender, and working arrangement. Furthermore, some thinking styles were positively correlated with social skill. Implications of these results for the nature of Japanese thinking styles are discussed.

  8. Flexible organic TFT bio-signal amplifier using reliable chip component assembly process with conductive adhesive.

    PubMed

    Yoshimoto, Shusuke; Uemura, Takafumi; Akiyama, Mihoko; Ihara, Yoshihiro; Otake, Satoshi; Fujii, Tomoharu; Araki, Teppei; Sekitani, Tsuyoshi

    2017-07-01

This paper presents a flexible organic thin-film transistor (OTFT) amplifier for bio-signal monitoring and describes the chip component assembly process. Using a conductive adhesive and a chip mounter, the chip components are mounted on a flexible film substrate, which has OTFT circuits. This study first investigates the assembly technique reliability for chip components on the flexible substrate. This study also specifically examines heart pulse wave monitoring conducted using the proposed flexible amplifier circuit and a flexible piezoelectric film. We connected the amplifier to a Bluetooth device for a wearable device demonstration.

  9. Pioneering topological methods for network-based drug-target prediction by exploiting a brain-network self-organization theory.

    PubMed

    Durán, Claudio; Daminelli, Simone; Thomas, Josephine M; Haupt, V Joachim; Schroeder, Michael; Cannistraci, Carlo Vittorio

    2017-04-26

The bipartite network representation of the drug-target interactions (DTIs) in a biosystem enhances understanding of the drugs' multifaceted action modes, suggests therapeutic switching for approved drugs and unveils possible side effects. As experimental testing of DTIs is costly and time-consuming, computational predictors are of great aid. Here, for the first time, state-of-the-art DTI supervised predictors custom-made in network biology were compared, using standard and innovative validation frameworks, with unsupervised pure topological-based models designed for general-purpose link prediction in bipartite networks. Surprisingly, our results show that the bipartite topology alone, if adequately exploited by means of the recently proposed local-community-paradigm (LCP) theory (initially detected in brain-network topological self-organization and afterwards generalized to any complex network), is able to suggest highly reliable predictions, with performance comparable to that of the state-of-the-art supervised methods that exploit additional (non-topological, for instance biochemical) DTI knowledge. Furthermore, a detailed analysis of the novel predictions revealed that each class of methods prioritizes distinct true interactions; hence, combining methodologies based on diverse principles represents a promising strategy to improve drug-target discovery. To conclude, this study promotes the power of bio-inspired computing, demonstrating that simple unsupervised rules inspired by principles of topological self-organization and adaptiveness arising during learning in living intelligent systems (like the brain) can match the performance of complicated algorithms based on advanced, supervised and knowledge-based engineering. © The Author 2017. Published by Oxford University Press.

  10. Youth Purpose through the Lens of the Theory of Organizing Models of Thinking

    ERIC Educational Resources Information Center

    Arantes, Valeria; Araujo, Ulisses; Pinheiro, Viviane; Moreno Marimon, Montserrat; Sastre, Genoveva

    2017-01-01

    Purpose represents a unique opportunity for identifying and analyzing the complexity of human reasoning, considering that its constitution brings together cognitive, affective and social elements. In this article, we use the Theory of Organizing Models of Thinking (OMT), an epistemological and methodological approach based on developmental…

  11. An Examination of Test-Retest, Alternate Form Reliability, and Generalizability Theory Study of the easyCBM Reading Assessments: Grade 2. Technical Report #1217

    ERIC Educational Resources Information Center

Anderson, Daniel; Lai, Cheng-Fei; Park, Bitnara Jasmine; Alonzo, Julie; Tindal, Gerald

    2012-01-01

This technical report is one in a series of five describing the reliability (test/retest and alternate form) and G-Theory/D-Study research on the easyCBM reading measures, grades 1-5. Data were gathered in the spring of 2011 from a convenience sample of students nested within classrooms at a medium-sized school district in the Pacific Northwest. Due to…

  12. An Examination of Test-Retest, Alternate Form Reliability, and Generalizability Theory Study of the easyCBM Reading Assessments: Grade 1. Technical Report #1216

    ERIC Educational Resources Information Center

Anderson, Daniel; Park, Bitnara Jasmine; Lai, Cheng-Fei; Alonzo, Julie; Tindal, Gerald

    2012-01-01

This technical report is one in a series of five describing the reliability (test/retest and alternate form) and G-Theory/D-Study research on the easyCBM reading measures, grades 1-5. Data were gathered in the spring of 2011 from a convenience sample of students nested within classrooms at a medium-sized school district in the Pacific Northwest. Due…

  13. An Examination of Test-Retest, Alternate Form Reliability, and Generalizability Theory Study of the easyCBM Reading Assessments: Grade 5. Technical Report #1220

    ERIC Educational Resources Information Center

    Lai, Cheng-Fei; Park, Bitnara Jasmine; Anderson, Daniel; Alonzo, Julie; Tindal, Gerald

    2012-01-01

    This technical report is one in a series of five describing the reliability (test/retest and alternate form) and G-Theory/D-Study research on the easyCBM reading measures, grades 1-5. Data were gathered in the spring of 2011 from a convenience sample of students nested within classrooms at a medium-sized school district in the Pacific Northwest.…

  14. Environmental education curriculum evaluation questionnaire: A reliability and validity study

    NASA Astrophysics Data System (ADS)

    Minner, Daphne Diane

The intention of this research project was to bridge the gap between social science research and application to the environmental domain through the development of a theoretically derived instrument designed to give educators a template by which to evaluate environmental education curricula. The theoretical base for instrument development was provided by several developmental theories such as Piaget's theory of cognitive development, Developmental Systems Theory, Life-span Perspective, as well as curriculum research within the area of environmental education. This theoretical base fueled the generation of a list of components which were then translated into a questionnaire with specific questions relevant to the environmental education domain. The specific research question for this project is: Can a valid assessment instrument based largely on human development and education theory be developed that reliably discriminates high, moderate, and low quality in environmental education curricula? The types of analyses conducted to answer this question were interrater reliability (percent agreement, Cohen's Kappa coefficient, Pearson's Product-Moment correlation coefficient), test-retest reliability (percent agreement, correlation), and criterion-related validity (correlation). Face validity and content validity were also assessed through thorough reviews. Overall results indicate that 29% of the questions on the questionnaire demonstrated a high level of interrater reliability and 43% of the questions demonstrated a moderate level of interrater reliability. Seventy-one percent of the questions demonstrated a high test-retest reliability and 5% a moderate level. Fifty-five percent of the questions on the questionnaire were reliable (high or moderate) both across time and raters. Only eight questions (8%) did not show either interrater or test-retest reliability. The global overall rating of high, medium, or low quality was reliable across both coders and time, indicating
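The two interrater statistics this abstract reports, percent agreement and Cohen's kappa, are straightforward to compute; the ratings below are invented purely for illustration:

```python
from collections import Counter

def percent_agreement(r1, r2):
    """Fraction of items on which two raters gave the same rating."""
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def cohens_kappa(r1, r2):
    """Chance-corrected agreement between two raters over the same items:
    kappa = (p_o - p_e) / (1 - p_e)."""
    n = len(r1)
    po = percent_agreement(r1, r2)
    c1, c2 = Counter(r1), Counter(r2)
    # expected agreement if raters were independent, given their marginals
    pe = sum(c1[k] * c2[k] for k in set(r1) | set(r2)) / (n * n)
    return (po - pe) / (1.0 - pe)
```

For two raters scoring eight curricula high/medium/low, say "HHMLLHML" versus "HHMLLMML", percent agreement is 0.875 while kappa is about 0.81, lower because kappa discounts the agreement expected by chance alone.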

  15. The organization of verbs of knowing: evidence for cultural commonality and variation in theory of mind.

    PubMed

    Schwanenflugel, P J; Martin, M; Takahashi, T

    1999-09-01

    Cross-cultural commonality and variation in folk theories of knowing were studied by examining the organization of verbs of knowing in German and Japanese adults. German and Japanese adults performed one of two tasks: a similarity judgment task and an attribute rating task. Organizational structure was assessed for the similarity judgment task using multidimensional scaling and additive similarity tree analyses. The attribute rating task was used to describe the characteristics that organized the dimensions and clusters emerging from the scaling solutions. The folk theory of mind displayed was an information processing model with constructive components, although the constructive aspects were more salient for the Germans than for the Japanese.

  16. Energy-level alignment at organic heterointerfaces

    PubMed Central

    Oehzelt, Martin; Akaike, Kouki; Koch, Norbert; Heimel, Georg

    2015-01-01

    Today’s champion organic (opto-)electronic devices comprise an ever-increasing number of different organic-semiconductor layers. The functionality of these complex heterostructures largely derives from the relative alignment of the frontier molecular-orbital energies in each layer with respect to those in all others. Despite the technological relevance of the energy-level alignment at organic heterointerfaces, and despite continued scientific interest, a reliable model that can quantitatively predict the full range of phenomena observed at such interfaces is notably absent. We identify the limitations of previous attempts to formulate such a model and highlight inconsistencies in the interpretation of the experimental data they were based on. We then develop a theoretical framework, which we demonstrate to accurately reproduce experiment. Applying this theory, a comprehensive overview of all possible energy-level alignment scenarios that can be encountered at organic heterojunctions is finally given. These results will help focus future efforts on developing functional organic interfaces for superior device performance. PMID:26702447

  17. Reliability and Maintainability (RAM) Training

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Packard, Michael H. (Editor)

    2000-01-01

The theme of this manual is failure physics: the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low-cost reliable products. In a broader sense the manual should do more. It should underscore the urgent need for mature attitudes toward reliability. Five of the chapters were originally presented as a classroom course to over 1000 Martin Marietta engineers and technicians. Another four chapters and three appendixes have been added. We begin with a view of reliability from the years 1940 to 2000. Chapter 2 starts the training material with a review of mathematics and a description of what elements contribute to product failures. The remaining chapters elucidate basic reliability theory and the disciplines that allow us to control and eliminate failures.

  18. Elastic, not plastic species: frozen plasticity theory and the origin of adaptive evolution in sexually reproducing organisms.

    PubMed

    Flegr, Jaroslav

    2010-01-13

Darwin's evolutionary theory could easily explain the evolution of adaptive traits (organs and behavioral patterns) in asexual but not in sexual organisms. Two models, the selfish gene theory and the frozen plasticity theory, were suggested to explain the evolution of adaptive traits in sexual organisms in the past 30 years. The frozen plasticity theory suggests that sexual species can evolve new adaptations only when their members are genetically uniform, i.e. only after a portion of the population of the original species had split off, balanced on the edge of extinction for several generations, and then undergone rapid expansion. After a short period of time, estimated on the basis of paleontological data to correspond to 1-2% of the duration of the species, polymorphism accumulates in the gene pool due to frequency-dependent selection; and thus, in each generation, new mutations occur in the presence of different alleles and therefore change their selection coefficients from generation to generation. The species ceases to behave in an evolutionarily plastic manner and becomes evolutionarily elastic on a microevolutionary time-scale and evolutionarily frozen on a macroevolutionary time-scale. It then exists in this state until such changes accumulate in the environment that the species becomes extinct. Frozen plasticity theory, which includes the Darwinian model of evolution as a special case (the evolution of species in a plastic state), not only offers plenty of new predictions to be tested, but also provides explanations for a much broader spectrum of known biological phenomena than classic evolutionary theories. This article was reviewed by Rob Knight, Fyodor Kondrashov and Massimo Di Giulio (nominated by David H. Ardell).

  19. Decision-making regarding organ donation in Korean adults: A grounded-theory study.

    PubMed

    Yeun, Eun Ja; Kwon, Young Mi; Kim, Jung A

    2015-06-01

    The aim of this study was to identify the hidden patterns of behavior leading toward the decision to donate organs. Thirteen registrants at the Association for Organ Sharing in Korea were recruited. Data were collected using in-depth interviews, and the interview transcripts were analyzed using Glaserian grounded-theory methodology. The main problem of participants was "body attachment" and the core category (management process) was determined to be "pursuing life." The theme consisted of four phases: "hesitating," "investigating," "releasing," and "re-discovering." Therefore, to increase organ donations, it is important to find a strategy that will create positive attitudes about organ donation through education and public relations. These results explain and provide a deeper understanding of the main problem that Korean people have about organ donation and their management of decision-making processes. These findings can help care providers to facilitate the decision-making process and respond to public needs while taking into account the sociocultural context within which decisions are made. © 2014 Wiley Publishing Asia Pty Ltd.

  20. Examining Agency Theory in Training & Development: Understanding Self-Interest Behaviors in the Organization

    ERIC Educational Resources Information Center

    Azevedo, Ross E.; Akdere, Mesut

    2011-01-01

    Agency theory has been discussed widely in the business and management literature. However, to date there has been no investigation about its utility and implications for problems in training & development. Whereas organizations are still struggling to develop and implement effective training programs, there is little emphasis on the self-interest…

  1. Ultra Reliability Workshop Introduction

    NASA Technical Reports Server (NTRS)

    Shapiro, Andrew A.

    2006-01-01

    This plan is the accumulation of substantial work by a large number of individuals. The Ultra-Reliability team consists of representatives from each center who have agreed to champion the program and be the focal point for their center. A number of individuals from NASA, government agencies (including the military), universities, industry and non-governmental organizations also contributed significantly to this effort. Most of their names may be found on the Ultra-Reliability PBMA website.

  2. IRT-Estimated Reliability for Tests Containing Mixed Item Formats

    ERIC Educational Resources Information Center

    Shu, Lianghua; Schwarz, Richard D.

    2014-01-01

    As a global measure of precision, item response theory (IRT) estimated reliability is derived for four coefficients (Cronbach's alpha, Feldt-Raju, stratified alpha, and marginal reliability). Models with different underlying assumptions concerning test-part similarity are discussed. A detailed computational example is presented for the targeted…
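
    The first of the coefficients named above, Cronbach's alpha, is the classical internal-consistency measure α = k/(k−1) · (1 − Σσ²ᵢ/σ²_total). As a minimal sketch (the item scores below are invented for illustration, not from the study):

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha from item-score columns (one list of scores per item)."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent total score
    item_var_sum = sum(pvariance(col) for col in items)
    return k / (k - 1) * (1 - item_var_sum / pvariance(totals))

# Hypothetical 3-item test answered by 4 respondents
items = [[2, 4, 3, 5],
         [3, 5, 4, 5],
         [2, 5, 3, 4]]
print(round(cronbach_alpha(items), 3))
```

    Items that vary together (high inter-item covariance) inflate the total-score variance relative to the summed item variances, which is what pushes alpha toward 1.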

  3. Evaluating physical therapists' perception of empowerment using Kanter's theory of structural power in organizations.

    PubMed

    Miller, P A; Goddard, P; Spence Laschinger, H K

    2001-12-01

    Little is known about physical therapists' perceptions of empowerment. In this study, Kanter's theory of structural power in organizations was used to examine physical therapists' perceptions of empowerment in a large Canadian urban teaching hospital. Kanter's theory, which has been studied extensively in the nursing profession, proposes that power in organizations is derived from access to information, support, resources, opportunity, and proportions. A convenience sample of physical therapists who had been working in the hospital longer than 3 months was used to determine the scores for the physical therapists' ratings of empowerment using the Conditions of Work Effectiveness Questionnaire. Physical therapists' scores were similar to reported staff nurses' scores for access to support, information, resources, and opportunity (mean=2.89, 2.91, 2.62, 3.25, respectively). Physical therapists' scores were higher than the majority of reported staff nurses' and nurse managers' scores for access to sources of informal and formal power structures (mean=2.81 and 3.29, respectively). There was a relationship between the empowerment score and the physical therapists' global rating of empowerment. Unlike studies of nurses, there were no relationships when demographic attributes and empowerment scores were examined. Evidence for the validity of Kanter's theory of empowerment was found. Kanter's theory can provide physical therapists and their managers with a useful framework for examining critical organizational factors (access to information, support, opportunity, and resources) that contribute to employees' perceptions of empowerment. A baseline measure for comparing future empowerment scores of this sample is available. Further work to examine the application of Kanter's theory to other samples of physical therapists appears to be warranted.

  4. Critical assessment of density functional theory for computing vibrational (hyper)polarizabilities

    NASA Astrophysics Data System (ADS)

    Zaleśny, R.; Bulik, I. W.; Mikołajczyk, M.; Bartkowiak, W.; Luis, J. M.; Kirtman, B.; Avramopoulos, A.; Papadopoulos, M. G.

    2012-12-01

    Despite the undisputed success of density functional theory (DFT) in various branches of chemistry and physics, the application of DFT to reliable predictions of nonlinear optical properties of molecules was questioned a decade ago. As shown by Champagne et al. [1-3], most conventional DFT schemes were unable to qualitatively predict the response of conjugated oligomers to a static electric field. Long-range corrected (LRC) functionals, like LC-BLYP or CAM-B3LYP, have been proposed to alleviate this deficiency. The reliability of LRC functionals for evaluating molecular (hyper)polarizabilities is studied for various groups of organic systems, with a special focus on vibrational corrections to the electric properties.

  5. Reliability approach to rotating-component design. [fatigue life and stress concentration

    NASA Technical Reports Server (NTRS)

    Kececioglu, D. B.; Lalli, V. R.

    1975-01-01

    A probabilistic methodology for designing rotating mechanical components, using reliability to relate stress to strength, is explained. The experimental test machines and the data obtained for steel to verify this methodology are described. A sample mechanical rotating component design problem is solved by comparing a deterministic design method with the new design-by-reliability approach. The new method shows that a smaller size and weight can be obtained for a specified rotating shaft life and reliability, and it uses the statistical distortion-energy theory with statistical fatigue diagrams for optimum shaft design. Statistical methods are presented for (1) determining strength distributions for steel experimentally, (2) determining a failure theory for stress variations in a rotating shaft subjected to reversed bending and steady torque, and (3) relating strength to stress by reliability.
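
    The core of any design-by-reliability approach of this kind is stress-strength interference: for independent normal strength and stress, R = P(strength > stress) = Φ((μ_S − μ_s)/√(σ²_S + σ²_s)). A minimal sketch, with hypothetical shaft numbers not taken from the report:

```python
from math import erf, sqrt

def reliability_normal(mu_strength, sd_strength, mu_stress, sd_stress):
    """P(strength > stress) for independent, normally distributed strength and stress."""
    z = (mu_strength - mu_stress) / sqrt(sd_strength**2 + sd_stress**2)
    return 0.5 * (1 + erf(z / sqrt(2)))  # standard normal CDF evaluated at z

# Hypothetical shaft: strength ~ N(600, 40) MPa vs. bending stress ~ N(450, 30) MPa
R = reliability_normal(600, 40, 450, 30)
print(f"{R:.6f}")
```

    Shrinking the shaft raises the stress mean and lowers z, so the designer can trade size and weight directly against a target reliability instead of a fixed safety factor.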

  6. Signal verification can promote reliable signalling.

    PubMed

    Broom, Mark; Ruxton, Graeme D; Schaefer, H Martin

    2013-11-22

    The central question in communication theory is whether communication is reliable, and if so, which mechanisms select for reliability. The primary approach in the past has been to attribute reliability to strategic costs associated with signalling as predicted by the handicap principle. Yet, reliability can arise through other mechanisms, such as signal verification; but the theoretical understanding of such mechanisms has received relatively little attention. Here, we model whether verification can lead to reliability in repeated interactions that typically characterize mutualisms. Specifically, we model whether fruit consumers that discriminate among poor- and good-quality fruits within a population can select for reliable fruit signals. In our model, plants either signal or they do not; costs associated with signalling are fixed and independent of plant quality. We find parameter combinations where discriminating fruit consumers can select for signal reliability by abandoning unprofitable plants more quickly. This self-serving behaviour imposes costs upon plants as a by-product, rendering it unprofitable for unrewarding plants to signal. Thus, strategic costs to signalling are not a prerequisite for reliable communication. We expect verification to more generally explain signal reliability in repeated consumer-resource interactions that typify mutualisms but also in antagonistic interactions such as mimicry and aposematism.

  7. [Process design in high-reliability organizations].

    PubMed

    Sommer, K-J; Kranz, J; Steffens, J

    2014-05-01

    Modern medicine is a highly complex service industry in which individual care providers are linked in a complicated network. This complexity and interlinkedness is associated with risks concerning patient safety. Other highly complex industries like commercial aviation have succeeded in maintaining or even increasing their safety levels despite rapidly increasing passenger figures. Standard operating procedures (SOPs), crew resource management (CRM), and operational risk evaluation (ORE) are historically developed and trusted parts of a comprehensive and systemic safety program. If medicine wants to follow this quantum leap towards increased patient safety, it must intensively evaluate the results of other high-reliability industries and seek step-by-step implementation after critical assessment.

  8. Exploring the reliability and validity of the social-moral awareness test.

    PubMed

    Livesey, Alexandra; Dodd, Karen; Pote, Helen; Marlow, Elizabeth

    2012-11-01

    The aim of the study was to explore the validity of the social-moral awareness test (SMAT) a measure designed for assessing socio-moral rule knowledge and reasoning in people with learning disabilities. Comparisons between Theory of Mind and socio-moral reasoning allowed the exploration of construct validity of the tool. Factor structure, reliability and discriminant validity were also assessed. Seventy-one participants with mild-moderate learning disabilities completed the two scales of the SMAT and two False Belief Tasks for Theory of Mind. Reliability of the SMAT was very good, and the scales were shown to be uni-dimensional in factor structure. There was a significant positive relationship between Theory of Mind and both SMAT scales. There is early evidence of the construct validity and reliability of the SMAT. Further assessment of the validity of the SMAT will be required. © 2012 Blackwell Publishing Ltd.

  9. The Mochi project: a field theory approach to plasma dynamics and self-organization

    NASA Astrophysics Data System (ADS)

    You, Setthivoine; von der Linden, Jens; Lavine, Eric Sander; Card, Alexander; Carroll, Evan

    2016-10-01

    The Mochi project is designed to study the interaction between plasma flows and magnetic fields from the point-of-view of canonical flux tubes. The Mochi Labjet experiment is being commissioned after achieving first plasma. Analytical and numerical tools are being developed to visualize canonical flux tubes. One analytical tool described here is a field theory approach to plasma dynamics and self-organization. A redefinition of the Lagrangian of a multi-particle system in fields reformulates the single-particle, kinetic, and fluid equations governing fluid and plasma dynamics as a single set of generalized Maxwell's equations and Ohm's law for canonical force-fields. The Lagrangian includes new terms representing the coupling between the motion of particle distributions, between distributions and electromagnetic fields, with relativistic contributions. The formulation shows that the concepts of self-organization and canonical helicity transport are applicable across single-particle, kinetic, and fluid regimes, at classical and relativistic scales. The theory gives the basis for comparing canonical helicity change to energy change in general systems. This work is supported by US DOE Grant DE-SC0010340.

  10. Reliable measurement of the Seebeck coefficient of organic and inorganic materials between 260 K and 460 K

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beretta, D.; Lanzani, G. (Dipartimento di Fisica, Politecnico di Milano, P.zza Leonardo da Vinci 32, 20133 Milano)

    2015-07-15

    A new experimental setup for reliable measurement of the in-plane Seebeck coefficient of organic and inorganic thin films and bulk materials is reported. The system is based on the “Quasi-Static” approach and can measure the thermopower in the temperature range between 260 K and 460 K. The system has been tested on a pure nickel bulk sample and on a thin film of commercially available PEDOT:PSS deposited by spin coating on glass. Repeatability within 1.5% for the nickel sample is demonstrated, while accuracy in the measurement of both organic and inorganic samples is guaranteed by time interpolation of data and by operating with a temperature difference over the sample of less than 1 K.
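
    A quasi-static measurement of this kind ultimately reduces to extracting S from the slope of thermovoltage versus temperature difference while ΔT is ramped slowly. As a minimal sketch (using the common convention S = −ΔV/ΔT; sign conventions vary, and the data below are invented, loosely nickel-like):

```python
def seebeck_coefficient(dT, dV):
    """Seebeck coefficient S = -dV/dT, estimated as the least-squares slope
    of measured thermovoltage vs. temperature difference."""
    n = len(dT)
    mt, mv = sum(dT) / n, sum(dV) / n
    slope = (sum((t - mt) * (v - mv) for t, v in zip(dT, dV))
             / sum((t - mt) ** 2 for t in dT))
    return -slope

# Hypothetical quasi-static ramp: dT in K, dV in microvolts
dT = [0.2, 0.4, 0.6, 0.8, 1.0]
dV = [3.9, 7.8, 11.7, 15.6, 19.5]
print(round(seebeck_coefficient(dT, dV), 2), "uV/K")
```

    Fitting the slope over many interpolated (ΔT, ΔV) pairs, rather than dividing a single voltage by a single temperature difference, is what suppresses offset voltages and drift.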

  11. Quantitative reliability of the Migdal-Eliashberg theory for strong coupling superconductors

    NASA Astrophysics Data System (ADS)

    Bauer, Johannes; Han, Jong; Gunnarsson, Olle

    2012-02-01

    The Migdal-Eliashberg (ME) theory for strong electron-phonon coupling and retardation effects of the Morel-Anderson type forms the basis for the quantitative understanding of conventional superconductors. The validity of the ME theory for values of the electron-phonon coupling strength λ > 1 has been questioned by model studies. By distinguishing bare and effective parameters, and by comparing the ME theory with the dynamical mean field theory (DMFT), we clarify the range of applicability of the ME theory. Specifically, we show that ME theory is very accurate as long as the product of effective parameters, λω_ph/D, where ω_ph is an appropriate phonon scale and D an electronic scale, is small enough [1]. The effectiveness of retardation effects is usually considered based on the lowest order diagram in the perturbation theory. We analyze these effects to higher order and find modifications to the usual result for the Coulomb pseudo-potential μ*. Retardation effects are weakened due to a reduced effective bandwidth. Comparison with the non-perturbative DMFT corroborates our findings [2]. [1] J. Bauer, J. E. Han, and O. Gunnarsson, Phys. Rev. B 84, 184531 (2011). [2] J. Bauer, J. E. Han, and O. Gunnarsson, in preparation (2011).

  12. Leading Change: Transitioning the AFMS into a High Reliability Organization

    DTIC Science & Technology

    2016-02-16

    Belief and Drive Big Results (New York, NY: Free Press, 2012), 17. 61 William Riley, “High Reliability and Implications for Nursing Leaders,” Journal of... Nursing Management 17, no. 2 (March 2009): 241. 62 The Joint Commission, “About Us,” 25 November 2015, http://www.jointcommission.org/about_us...Air Force Surgeon General. Trusted Care Concept of Operations, October 2015. Riley, William. “High reliability and implications for nursing leaders

  13. Substance Use Stigma: Reliability and validity of a theory-based scale for substance-using populations*

    PubMed Central

    Smith, Laramie R.; Earnshaw, Valerie A.; Copenhaver, Michael M.; Cunningham, Chinazo O.

    2016-01-01

    Background Substance use disorders consistently rank among the most stigmatized conditions worldwide. Thus, substance use stigma fosters health inequities among persons with substance use disorders and remains a key barrier to successful screening and treatment efforts. Current efforts to measure substance use stigma are limited. This study aims to advance measurement efforts by drawing on stigma theory to develop and evaluate the Substance Use Stigma Mechanisms Scale (SU-SMS). The SU-SMS was designed to capture enacted, anticipated, and internalized substance use stigma mechanisms among persons with current and past substance use disorders, and distinguish between key stigma sources most likely to impact this target population. Methods This study was a cross-sectional evaluation of the validity, reliability, and generalizability of the SU-SMS across two independent samples with diverse substance use and treatment histories. Results Findings support the structural and construct validity of the SU-SMS, suggesting the scale was able to capture enacted, anticipated, and internalized stigma as distinct stigma experiences. It also further differentiated between two distinct stigma sources (family and healthcare providers). Analysis of these mechanisms and psychosocial metrics suggests that the scale is also associated with other health-related outcomes. Furthermore, the SU-SMS demonstrated high levels of internal reliability and generalizability across two independent samples of persons with diverse substance use disorders and treatment histories. Conclusion The SU-SMS may serve as a valuable tool for better understanding the processes through which substance use stigma serves to undermine key health behaviors and outcomes among persons with substance use disorders. PMID:26972790

  14. General Contingency Theory of Organizations: An Alternative to Open Systems Theory.

    DTIC Science & Technology

    1982-08-01

    genetic and mechanical open systems. We have recently proposed a general contingency theory (GCT) of management (Luthans and Stewart, 1977) which promises...developed in response to the need for an integrative theory of management that incorporates the environment (in the open systems sense) and begins to... management and desired performance outcomes. We will show that the GCT matrix can lead to organizational effectiveness. The Theory as a Basis for More

  15. Basic Concepts in Classical Test Theory: Tests Aren't Reliable, the Nature of Alpha, and Reliability Generalization as a Meta-analytic Method.

    ERIC Educational Resources Information Center

    Helms, LuAnn Sherbeck

    This paper discusses the fact that reliability is about scores and not tests and how reliability limits effect sizes. The paper also explores the classical reliability coefficients of stability, equivalence, and internal consistency. Stability is concerned with how stable test scores will be over time, while equivalence addresses the relationship…

  16. Software reliability experiments data analysis and investigation

    NASA Technical Reports Server (NTRS)

    Walker, J. Leslie; Caglayan, Alper K.

    1991-01-01

    The objectives are to investigate the fundamental reasons which cause independently developed software programs to fail dependently, and to examine fault tolerant software structures which maximize reliability gain in the presence of such dependent failure behavior. The authors used 20 redundant programs from a software reliability experiment to analyze the software errors causing coincident failures, to compare the reliability of N-version and recovery block structures composed of these programs, and to examine the impact of diversity on software reliability using subpopulations of these programs. The results indicate that both conceptually related and unrelated errors can cause coincident failures and that recovery block structures offer more reliability gain than N-version structures if acceptance checks that fail independently from the software components are available. The authors present a theory of general program checkers that have potential application for acceptance tests.
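
    The finding that coincident (related) failures erode the redundancy gain of N-version structures can be illustrated with a toy Monte Carlo: a 3-version majority vote whose versions fail independently, plus an optional common-cause mode that fails all versions at once. The failure probabilities are invented for the sketch, not taken from the experiment:

```python
import random

random.seed(1)

def majority_ok(p_fail, p_common, trials=100_000):
    """Monte Carlo sketch of 3-version majority-vote reliability.
    p_fail   -- independent per-version failure probability
    p_common -- probability of a coincident failure of all versions at once
    """
    ok = 0
    for _ in range(trials):
        if random.random() < p_common:   # related error: every version fails together
            continue
        fails = sum(random.random() < p_fail for _ in range(3))
        if fails <= 1:                   # at most one version wrong: vote succeeds
            ok += 1
    return ok / trials

print(majority_ok(0.05, 0.00))  # independent failures only
print(majority_ok(0.05, 0.01))  # 1% related failures cap the achievable gain
```

    With fully independent failures the vote reliability is 1 − 3p²(1−p) − p³ ≈ 0.993 for p = 0.05; a common-cause term multiplies that by (1 − p_common), which no amount of added versions can recover.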

  17. Investigating Postgraduate College Admission Interviews: Generalizability Theory Reliability and Incremental Predictive Validity

    ERIC Educational Resources Information Center

    Arce-Ferrer, Alvaro J.; Castillo, Irene Borges

    2007-01-01

    The use of face-to-face interviews is controversial for college admissions decisions in light of the lack of availability of validity and reliability evidence for most college admission processes. This study investigated reliability and incremental predictive validity of a face-to-face postgraduate college admission interview with a sample of…

  18. Theories of risk and safety: what is their relevance to nursing?

    PubMed

    Cooke, Hannah

    2009-03-01

    The aim of this paper is to review key theories of risk and safety and their implications for nursing. The concept of patient safety has only recently risen to prominence as an organising principle in healthcare. The paper considers the wider social context in which contemporary concepts of risk and safety have developed. In particular, it looks at sociological debates about the rise of risk culture and the risk society and their influence on the patient safety movement. The paper discusses three bodies of theory which have attempted to explain the management of risk and safety in organisations: normal accident theory, high reliability theory, and grid-group cultural theory. It examines debates between these theories and their implications for healthcare. It discusses reasons for the dominance of high reliability theory in healthcare and its strengths and limitations. The paper suggests that high reliability theory has particular difficulties in explaining some aspects of organisational culture. It also suggests that the implementation of high reliability theory in healthcare has involved over-reliance on numerical indicators. It suggests that patient safety could be improved by openness to a wider range of theoretical perspectives.

  19. Many-body perturbation theory for understanding optical excitations in organic molecules and solids

    NASA Astrophysics Data System (ADS)

    Sharifzadeh, Sahar

    2018-04-01

    Semiconductors composed of organic molecules are promising as components for flexible and inexpensive optoelectronic devices, with many recent studies aimed at understanding their electronic and optical properties. In particular, computational modeling of these complex materials has provided new understanding of the underlying properties which give rise to their excited-state phenomena. This article provides an overview of recent many-body perturbation theory (MBPT) studies of optical excitations within organic molecules and solids. We discuss the accuracy of MBPT within the GW/BSE approach in predicting excitation energies and absorption spectra, and assess the impact of two commonly used approximations, the DFT starting point and the Tamm–Dancoff approximation. Moreover, we summarize studies that elucidate the role of solid-state structure on the nature of excitons in organic crystals. These studies show that a rich physical understanding of organic materials can be obtained from GW/BSE.

  20. Adsorptive desulfurization with metal-organic frameworks: A density functional theory investigation

    NASA Astrophysics Data System (ADS)

    Chen, Zhiping; Ling, Lixia; Wang, Baojun; Fan, Huiling; Shangguan, Ju; Mi, Jie

    2016-11-01

    The contribution of each fragment of metal-organic frameworks (MOFs) to the adsorption of sulfur compounds was investigated using density functional theory (DFT). The sulfur compounds involved are dimethyl sulfide (CH3SCH3), ethyl mercaptan (CH3CH2SH) and hydrogen sulfide (H2S). MOFs with different organic ligands (NH2-BDC, BDC and NDC), metal center structures (M, M-M and M3O) and metal ions (Zn, Cu and Fe) were used to study their effects on sulfur species adsorption. The results revealed that MOFs with coordinatively unsaturated sites (CUS) have the strongest binding strength with sulfur compounds; MOFs with the NH2-BDC substituent-group ligand come second, followed by those with saturated metal centers; and organic ligands without a substituent group have the weakest adsorption strength. Moreover, it was also found that, among the different metal ions (Fe, Zn and Cu), MOFs with unsaturated Fe have the strongest adsorption strength for sulfur compounds. These results are consistent with our previous experimental observations, and therefore provide insights for the better design of MOFs for desulfurization applications.

  1. How Youth Get Engaged: Grounded-Theory Research on Motivational Development in Organized Youth Programs

    ERIC Educational Resources Information Center

    Dawes, Nickki Pearce; Larson, Reed

    2011-01-01

    For youth to benefit from many of the developmental opportunities provided by organized programs, they need to not only attend but become psychologically engaged in program activities. This research was aimed at formulating empirically based grounded theory on the processes through which this engagement develops. Longitudinal interviews were…

  2. The design organization test: further demonstration of reliability and validity as a brief measure of visuospatial ability.

    PubMed

    Killgore, William D S; Gogel, Hannah

    2014-01-01

    Neuropsychological assessments are frequently time-consuming and fatiguing for patients. Brief screening evaluations may reduce test duration and allow more efficient use of time by permitting greater attention toward neuropsychological domains showing probable deficits. The Design Organization Test (DOT) was initially developed as a 2-min paper-and-pencil alternative for the Block Design (BD) subtest of the Wechsler scales. Although initially validated for clinical neurologic patients, we sought to further establish the reliability and validity of this test in a healthy, more diverse population. Two alternate versions of the DOT and the Wechsler Abbreviated Scale of Intelligence (WASI) were administered to 61 healthy adult participants. The DOT showed high alternate forms reliability (r = .90-.92), and the two versions yielded equivalent levels of performance. The DOT was highly correlated with BD (r = .76-.79) and was significantly correlated with all subscales of the WASI. The DOT proved useful when used in lieu of BD in the calculation of WASI IQ scores. Findings support the reliability and validity of the DOT as a measure of visuospatial ability and suggest its potential worth as an efficient estimate of intellectual functioning in situations where lengthier tests may be inappropriate or unfeasible.

  3. Molecular Fingerprints in the Electronic Properties of Crystalline Organic Semiconductors: From Experiment to Theory

    NASA Astrophysics Data System (ADS)

    Ciuchi, S.; Hatch, R. C.; Höchst, H.; Faber, C.; Blase, X.; Fratini, S.

    2012-06-01

    By comparing photoemission spectroscopy with a nonperturbative dynamical mean field theory extension to many-body ab initio calculations, we show in the prominent case of pentacene crystals that an excellent agreement with experiment for the bandwidth, dispersion, and lifetime of the hole carrier bands can be achieved in organic semiconductors, provided that one properly accounts for the coupling to molecular vibrational modes and the presence of disorder. Our findings rationalize the growing experimental evidence that even the best band structure theories based on a many-body treatment of electronic interactions cannot reproduce the experimental photoemission data in this important class of materials.

  4. Assessing governance theory and practice in health-care organizations: a survey of UK hospices.

    PubMed

    Chambers, Naomi; Benson, Lawrence; Boyd, Alan; Girling, Jeff

    2012-05-01

    This paper sets out a theoretical framework for analyzing board governance, and describes an empirical study of corporate governance practices in a subset of non-profit organizations (hospices in the UK). It examines how practices in hospice governance compare with what is known about effective board working. We found that key strengths of hospice boards included a strong focus on the mission and the finances of the organizations, and common weaknesses included a lack of involvement in strategic matters and a lack of confidence, and some nervousness about challenging the organization on the quality of clinical care. Finally, the paper offers suggestions for theoretical development particularly in relation to board governance in non-profit organizations. It develops an engagement theory for boards which comprises a triadic proposition of high challenge, high support and strong grip.

  5. Predicting behavioural responses to novel organisms: state-dependent detection theory

    PubMed Central

    Sih, Andrew

    2017-01-01

    Human activity alters natural habitats for many species. Understanding variation in animals' behavioural responses to these changing environments is critical. We show how signal detection theory can be used within a wider framework of state-dependent modelling to predict behavioural responses to a major environmental change: novel, exotic species. We allow thresholds for action to be a function of reserves, and demonstrate how optimal thresholds can be calculated. We term this framework ‘state-dependent detection theory’ (SDDT). We focus on behavioural and fitness outcomes when animals continue to use formerly adaptive thresholds following environmental change. In a simple example, we show that exposure to novel animals which appear dangerous—but are actually safe—(e.g. ecotourists) can have catastrophic consequences for ‘prey’ (organisms that respond as if the new organisms are predators), significantly increasing mortality even when the novel species is not predatory. SDDT also reveals that the effect on reproduction can be greater than the effect on lifespan. We investigate factors that influence the effect of novel organisms, and address the potential for behavioural adjustments (via evolution or learning) to recover otherwise reduced fitness. Although effects of environmental change are often difficult to predict, we suggest that SDDT provides a useful route ahead. PMID:28100814
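
    The non-state-dependent building block of this framework is classical signal detection theory: for equal-variance Gaussian cue distributions separated by d′, the expected-cost-optimal criterion is c = d′/2 + ln(β)/d′ with β = ((1 − p)/p) · (C_FA/C_miss). A minimal sketch with invented prey-like numbers (not the paper's state-dependent model, which additionally makes the threshold depend on reserves):

```python
from math import log

def optimal_criterion(d_prime, p_signal, cost_miss, cost_fa):
    """Expected-cost-optimal criterion for equal-variance Gaussian signal detection.
    Respond 'signal present' (e.g. flee) when the observation exceeds this value."""
    beta = ((1 - p_signal) / p_signal) * (cost_fa / cost_miss)
    return d_prime / 2 + log(beta) / d_prime

# Hypothetical prey: predator vs. harmless cue distributions separated by d' = 1.5,
# predators present on 20% of encounters.
c_costly_miss = optimal_criterion(1.5, 0.2, cost_miss=10, cost_fa=1)
c_equal_costs = optimal_criterion(1.5, 0.2, cost_miss=1, cost_fa=1)
print(c_costly_miss, c_equal_costs)
```

    When missing a real predator is ten times costlier than fleeing needlessly, the threshold drops well below the equal-cost one, i.e. the animal flees more readily; this is exactly the formerly adaptive threshold that becomes costly when the "predators" are actually harmless ecotourists.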

  6. An Examination of Test-Retest, Alternate Form Reliability, and Generalizability Theory Study of the easyCBM Passage Reading Fluency Assessments: Grade 4. Technical Report #1219

    ERIC Educational Resources Information Center

    Park, Bitnara Jasmine; Anderson, Daniel; Alonzo, Julie; Lai, Cheng-Fei; Tindal, Gerald

    2012-01-01

    This technical report is one in a series of five describing the reliability (test/retest and alternate form) and G-Theory/D-Study research on the easyCBM reading measures, grades 1-5. Data were gathered in the spring of 2011 from a convenience sample of students nested within classrooms at a medium-sized school district in the Pacific Northwest.…

  7. Mass Media Theory, Leveraging Relationships, and Reliable Strategic Communication Effects

    DTIC Science & Technology

    2008-03-19

    other people who are in the same social and cultural groups. Families respond to patriarchs and matriarchs, congregations respond to pastors, and teens...media to self-correct behavior in order to make society seem more “normal.” Verbal and Written Message-Centric Theories Premise of Theory Magic...Effects Harmony and Balance People gravitate toward information they already believe. Structural Functionalism When society begins to seem

  8. Communication as a predictor of willingness to donate one's organs: an addition to the Theory of Reasoned Action.

    PubMed

    Jeffres, Leo W; Carroll, Jeanine A; Rubenking, Bridget E; Amschlinger, Joe

    2008-12-01

    Fishbein and Ajzen's theory of reasoned action has been used by many researchers, particularly in regard to health communication, to predict behavioral intentions and behavior. According to that theory, one's intention is the best predictor that one will engage in a behavior, and attitudes and social norms predict behavioral intentions. Other researchers have added different variables to the postulates of attitudes and social norms that Fishbein and Ajzen maintain are the best predictors of behavioral intention. Here we draw on data from a 2006 telephone survey (N = 420) gauging the awareness of an organ donation campaign in Northeast Ohio to examine the impact of communication on people's intentions. The current study supports the hypothesis that those who communicate with others are more likely to express a greater willingness to become an organ donor, but it expands the range of communication contexts. With demographics and attitudes toward organ donation controlled for, this study shows that communication with others about organ donation increases the willingness of individuals to have favorable attitudes about being an organ donor.

  9. Reliability Standards of Complex Engineering Systems

    NASA Astrophysics Data System (ADS)

    Galperin, E. M.; Zayko, V. A.; Gorshkalev, P. A.

    2017-11-01

    Production and manufacturing play an important role in today’s society. Industrial production is nowadays characterized by increasingly complex communications between its parts. The problem of preventing accidents in a large industrial enterprise therefore becomes especially relevant. In these circumstances, the reliability of enterprise functioning is of particular importance. Potential damage caused by an accident at such an enterprise may lead to substantial material losses and, in some cases, even loss of human lives. That is why industrial enterprise functioning reliability is immensely important. In terms of reliability, industrial facilities (objects) are divided into simple and complex. Simple objects are characterized by only two conditions: operable and non-operable. A complex object exists in more than two conditions, and the main characteristic here is the stability of its operation. This paper develops a reliability indicator combining set theory methodology with a state space method; both are widely used to analyze dynamically developing probabilistic processes. The research also introduces a set of reliability indicators for complex technical systems.
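
    For a simple two-condition (operable/non-operable) object, the state-space approach reduces to a textbook two-state Markov availability model. A minimal sketch with illustrative failure and repair rates (not values from the paper):

```python
def steady_state_availability(failure_rate, repair_rate):
    """Steady-state availability of a two-state (up/down) Markov model.

    A = mu / (lambda + mu), where lambda is the failure rate and mu the
    repair rate, both in events per unit time.
    """
    return repair_rate / (failure_rate + repair_rate)

# Illustrative rates: one failure per 100 h, repair completed in 10 h on average.
A = steady_state_availability(failure_rate=0.01, repair_rate=0.1)  # ~0.909
```

A complex object with more than two conditions generalizes this to a larger transition-rate matrix over its state space, but the steady-state idea is the same.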

  10. Benchmarking DFT and semi-empirical methods for a reliable and cost-efficient computational screening of benzofulvene derivatives as donor materials for small-molecule organic solar cells.

    PubMed

    Tortorella, Sara; Talamo, Maurizio Mastropasqua; Cardone, Antonio; Pastore, Mariachiara; De Angelis, Filippo

    2016-02-24

    A systematic computational investigation on the optical properties of a group of novel benzofulvene derivatives (Martinelli 2014 Org. Lett. 16 3424-7), proposed as possible donor materials in small molecule organic photovoltaic (smOPV) devices, is presented. A benchmark evaluation against experimental results on the accuracy of different exchange and correlation functionals and semi-empirical methods in predicting both reliable ground state equilibrium geometries and electronic absorption spectra is carried out. The benchmark of the geometry optimization level indicated that the best agreement with x-ray data is achieved by using the B3LYP functional. Concerning the optical gap prediction, we found that, among the employed functionals, MPW1K provides the most accurate excitation energies over the entire set of benzofulvenes. Similarly reliable results were also obtained for range-separated hybrid functionals (CAM-B3LYP and wB97XD) and for global hybrid methods incorporating a large amount of non-local exchange (M06-2X and M06-HF). Density functional theory (DFT) hybrids with a moderate (about 20-30%) extent of Hartree-Fock exchange (HFexc) (PBE0, B3LYP and M06) were also found to deliver HOMO-LUMO energy gaps which compare well with the experimental absorption maxima, thus representing a valuable alternative for a prompt and predictive estimation of the optical gap. The possibility of using completely semi-empirical approaches (AM1/ZINDO) is also discussed.

  11. Some Clinical Diagnoses are More Reliable than Others

    DTIC Science & Technology

    1989-03-29

    a more reliable measure than diagnostic type (e.g., schizophrenia versus personality disorder). Diagnostic type, in turn, was a more reliable...measure than diagnostic subtype (e.g., chronic catatonic schizophrenic). And, certain diagnostic types and subtypes were consistently more reliable across...diagnoses included the following diagnostic types: 1) organic psychoses, 2) schizophrenia, 3) affective psychoses, 4) paranoia, 5) other

  12. Automation of reliability evaluation procedures through CARE - The computer-aided reliability estimation program.

    NASA Technical Reports Server (NTRS)

    Mathur, F. P.

    1972-01-01

    Description of an on-line interactive computer program called CARE (Computer-Aided Reliability Estimation) which can model self-repair and fault-tolerant organizations and perform certain other functions. Essentially CARE consists of a repository of mathematical equations defining the various basic redundancy schemes. These equations, under program control, are then interrelated to generate the desired mathematical model to fit the architecture of the system under evaluation. The mathematical model is then supplied with ground instances of its variables and is then evaluated to generate values for the reliability-theoretic functions applied to the model.
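
    The kind of closed-form redundancy equations such a repository holds can be illustrated with the textbook series, parallel, and triple-modular-redundancy (TMR) formulas; this is a generic sketch, not CARE's actual equation library:

```python
def series(r, n):
    """Reliability of n identical components in series: all must work."""
    return r ** n

def parallel(r, n):
    """Reliability of n identical components in parallel: at least one works."""
    return 1.0 - (1.0 - r) ** n

def tmr(r):
    """Triple modular redundancy with a perfect voter: at least 2 of 3 work."""
    return 3 * r**2 - 2 * r**3

# With component reliability 0.9:
# series(0.9, 2) = 0.81, parallel(0.9, 2) = 0.99, tmr(0.9) = 0.972
```

Given ground instances of component reliabilities, a tool in the CARE style composes such building blocks into a model matching the system architecture and evaluates it numerically.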

  13. 75 FR 72664 - System Personnel Training Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-26

    ...Under section 215 of the Federal Power Act, the Commission approves two Personnel Performance, Training and Qualifications (PER) Reliability Standards, PER-004-2 (Reliability Coordination--Staffing) and PER-005-1 (System Personnel Training), submitted to the Commission for approval by the North American Electric Reliability Corporation, the Electric Reliability Organization certified by the Commission. The approved Reliability Standards require reliability coordinators, balancing authorities, and transmission operators to establish a training program for their system operators, verify each of their system operators' capability to perform tasks, and provide emergency operations training to every system operator. The Commission also approves NERC's proposal to retire two existing PER Reliability Standards that are replaced by the standards approved in this Final Rule.

  14. Using organization theory to understand the determinants of effective implementation of worksite health promotion programs.

    PubMed

    Weiner, Bryan J; Lewis, Megan A; Linnan, Laura A

    2009-04-01

    The field of worksite health promotion has moved toward the development and testing of comprehensive programs that target health behaviors with interventions operating at multiple levels of influence. Yet, observational and process evaluation studies indicate that such programs are challenging for worksites to implement effectively. Research has identified several organizational factors that promote or inhibit effective implementation of comprehensive worksite health promotion programs. However, no integrated theory of implementation has emerged from this research. This article describes a theory of the organizational determinants of effective implementation of comprehensive worksite health promotion programs. The model is adapted from theory and research on the implementation of complex innovations in manufacturing, education and health care settings. The article uses the Working Well Trial to illustrate the model's theoretical constructs. Although the article focuses on comprehensive worksite health promotion programs, the conceptual model may also apply to other types of complex health promotion programs. The result is an organization-level theory of the determinants of effective implementation of worksite health promotion programs.

  15. Reliability analysis of structural ceramics subjected to biaxial flexure

    NASA Technical Reports Server (NTRS)

    Chao, Luen-Yuan; Shetty, Dinesh K.

    1991-01-01

    The reliability of alumina disks subjected to biaxial flexure is predicted on the basis of statistical fracture theory using a critical strain energy release rate fracture criterion. Results on a sintered silicon nitride are consistent with reliability predictions based on pore-initiated penny-shaped cracks with preferred orientation normal to the maximum principal stress. Assumptions with regard to flaw types and their orientations in each ceramic can be justified by fractography. It is shown that there are no universal guidelines for selecting fracture criteria or assuming flaw orientations in reliability analyses.
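
    Statistical fracture theory for ceramics is most often cast in Weibull form; a minimal sketch of the two-parameter version with illustrative parameters (the paper itself uses a more elaborate strain-energy-release-rate criterion):

```python
import math

def weibull_failure_probability(stress, scale, modulus):
    """Two-parameter Weibull probability of failure at a given stress.

    scale (sigma_0) is the characteristic strength; modulus (m) controls
    the scatter -- higher m means less variable strength.
    """
    return 1.0 - math.exp(-((stress / scale) ** modulus))

# At stress equal to the characteristic strength, Pf = 1 - 1/e (about 0.632).
pf = weibull_failure_probability(stress=300.0, scale=300.0, modulus=10)
```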

  16. Review of Reliability-Based Design Optimization Approach and Its Integration with Bayesian Method

    NASA Astrophysics Data System (ADS)

    Zhang, Xiangnan

    2018-03-01

    Many uncertain factors arise in practical engineering, such as the external load environment, material properties, geometrical shape, initial conditions, and boundary conditions. Reliability methods measure the structural safety condition and determine the optimal design parameter combination based on probabilistic theory. Reliability-based design optimization (RBDO), which combines reliability theory with optimization, is the most commonly used approach to minimize structural cost or other performance measures under uncertain variables. However, it cannot handle various kinds of incomplete information. The Bayesian approach is utilized to incorporate this kind of incomplete information into the uncertainty quantification. In this paper, the RBDO approach and its integration with the Bayesian method are introduced.

  17. Assessing reliability of protein-protein interactions by integrative analysis of data in model organisms.

    PubMed

    Lin, Xiaotong; Liu, Mei; Chen, Xue-wen

    2009-04-29

    Protein-protein interactions play vital roles in nearly all cellular processes and are involved in the construction of biological pathways such as metabolic and signal transduction pathways. Although large-scale experiments have enabled the discovery of thousands of previously unknown linkages among proteins in many organisms, the high-throughput interaction data is often associated with high error rates. Since protein interaction networks have been utilized in numerous biological inferences, the inclusive experimental errors inevitably affect the quality of such prediction. Thus, it is essential to assess the quality of the protein interaction data. In this paper, a novel Bayesian network-based integrative framework is proposed to assess the reliability of protein-protein interactions. We develop a cross-species in silico model that assigns likelihood scores to individual protein pairs based on the information entirely extracted from model organisms. Our proposed approach integrates multiple microarray datasets and novel features derived from gene ontology. Furthermore, the confidence scores for cross-species protein mappings are explicitly incorporated into our model. Applying our model to predict protein interactions in the human genome, we are able to achieve 80% in sensitivity and 70% in specificity. Finally, we assess the overall quality of the experimentally determined yeast protein-protein interaction dataset. We observe that the more high-throughput experiments confirming an interaction, the higher the likelihood score, which confirms the effectiveness of our approach. This study demonstrates that model organisms certainly provide important information for protein-protein interaction inference and assessment. The proposed method is able to assess not only the overall quality of an interaction dataset, but also the quality of individual protein-protein interactions. 
We expect the method to continually improve as more high-quality interaction data from more…

  18. 78 FR 803 - Revisions to Electric Reliability Organization Definition of Bulk Electric System and Rules of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-04

    ...In this Final Rule, pursuant to section 215 of the Federal Power Act, the Federal Energy Regulatory Commission (Commission) approves modifications to the currently-effective definition of ``bulk electric system'' developed by the North American Electric Reliability Corporation (NERC), the Commission-certified Electric Reliability Organization. The Commission finds that the modified definition of ``bulk electric system'' removes language allowing for regional discretion in the currently-effective bulk electric system definition and establishes a bright-line threshold that includes all facilities operated at or above 100 kV. The modified definition also identifies specific categories of facilities and configurations as inclusions and exclusions to provide clarity in the definition of ``bulk electric system.'' In this Final Rule, the Commission also approves: NERC's revisions to its Rules of Procedure, which create an exception process to add elements to, or remove elements from, the definition of ``bulk electric system'' on a case-by-case basis; NERC's form entitled ``Detailed Information To Support an Exception Request'' that entities will use to support requests for exception from the ``bulk electric system'' definition; and NERC's implementation plan for the revised ``bulk electric system'' definition.

  19. An Investigation of the Impact of Guessing on Coefficient α and Reliability

    PubMed Central

    2014-01-01

    Guessing is known to influence the test reliability of multiple-choice tests. Although there are many studies that have examined the impact of guessing, they used rather restrictive assumptions (e.g., parallel test assumptions, homogeneous inter-item correlations, homogeneous item difficulty, and homogeneous guessing levels across items) to evaluate the relation between guessing and test reliability. Based on the item response theory (IRT) framework, this study investigated the extent of the impact of guessing on reliability under more realistic conditions where item difficulty, item discrimination, and guessing levels actually vary across items with three different test lengths (TL). By accommodating multiple item characteristics simultaneously, this study also focused on examining interaction effects between guessing and other variables entered in the simulation to be more realistic. The simulation of the more realistic conditions and calculations of reliability and classical test theory (CTT) item statistics were facilitated by expressing CTT item statistics, coefficient α, and reliability in terms of IRT model parameters. In addition to the general negative impact of guessing on reliability, results showed interaction effects between TL and guessing and between guessing and test difficulty.
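
    The simulation design described (3PL responses with and without guessing, coefficient α computed from the resulting scores) can be sketched as follows; the parameter ranges and sample size are illustrative assumptions, not those of the study:

```python
import numpy as np

def coefficient_alpha(scores):
    """Cronbach's alpha from a persons-by-items 0/1 score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_var = scores.var(axis=0).sum()   # sum of item variances
    total_var = scores.sum(axis=1).var()  # variance of total scores
    return k / (k - 1) * (1.0 - item_var / total_var)

def simulate_3pl(theta, a, b, c, rng):
    """Simulate 0/1 responses under the 3PL model with guessing parameter c."""
    p = c + (1 - c) / (1 + np.exp(-a * (theta[:, None] - b[None, :])))
    return (rng.random(p.shape) < p).astype(float)

rng = np.random.default_rng(0)
theta = rng.standard_normal(2000)   # person abilities
a = rng.uniform(0.8, 2.0, 20)       # item discriminations
b = rng.uniform(-2.0, 2.0, 20)      # item difficulties

alpha_no_guess = coefficient_alpha(simulate_3pl(theta, a, b, 0.00, rng))
alpha_guessing = coefficient_alpha(simulate_3pl(theta, a, b, 0.25, rng))
# Guessing inflates scores on hard items, compressing true-score variance
# and lowering alpha.
```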

  20. The Effects of Instructors' Autonomy Support and Students' Autonomous Motivation on Learning Organic Chemistry: A Self-Determination Theory Perspective.

    ERIC Educational Resources Information Center

    Black, Aaron E.; Deci, Edward L.

    2000-01-01

    Applies self-determination theory to investigate the effects of students' course-specific self-regulation and their perceptions of their instructors' autonomous support on adjustment and academic performance in a college-level organic chemistry course. Hypothesizes that students taking the organic chemistry course for relatively autonomous reasons…

  1. The Examination of Reliability According to Classical Test and Generalizability on a Job Performance Scale

    ERIC Educational Resources Information Center

    Yelboga, Atilla; Tavsancil, Ezel

    2010-01-01

    In this research, the classical test theory and generalizability theory analyses were carried out with the data obtained by a job performance scale for the years 2005 and 2006. The reliability coefficients obtained (estimated) from the classical test theory and generalizability theory analyses were compared. In classical test theory, test retest…

  2. Developing Reliable Life Support for Mars

    NASA Technical Reports Server (NTRS)

    Jones, Harry W.

    2017-01-01

    A human mission to Mars will require highly reliable life support systems. Mars life support systems may recycle water and oxygen using systems similar to those on the International Space Station (ISS). However, achieving sufficient reliability is less difficult for ISS than it will be for Mars. If an ISS system has a serious failure, it is possible to provide spare parts, or directly supply water or oxygen, or if necessary bring the crew back to Earth. Life support for Mars must be designed, tested, and improved as needed to achieve high demonstrated reliability. A quantitative reliability goal should be established and used to guide development. The designers should select reliable components and minimize interface and integration problems. In theory a system can achieve the component-limited reliability, but testing often reveals unexpected failures due to design mistakes or flawed components. Testing should extend long enough to detect any unexpected failure modes and to verify the expected reliability. Iterated redesign and retest may be required to achieve the reliability goal. If the reliability is less than required, it may be improved by providing spare components or redundant systems. The number of spares required to achieve a given reliability goal depends on the component failure rate. If the failure rate is underestimated, the number of spares will be insufficient and the system may fail. If the design is likely to have undiscovered design or component problems, it is advisable to use dissimilar redundancy, even though this multiplies the design and development cost. In the ideal case, a human tended closed system operational test should be conducted to gain confidence in operations, maintenance, and repair. The difficulty in achieving high reliability in unproven complex systems may require the use of simpler, more mature, intrinsically higher reliability systems. The limitations of budget, schedule, and technology may suggest accepting lower and…
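
    The dependence of spare count on component failure rate can be sketched with a Poisson failure model, a common assumption for random component failures; the rate, mission length, and goal below are illustrative, not values from the paper:

```python
import math

def spares_needed(failure_rate, mission_hours, reliability_goal):
    """Smallest spare count k with P(failures <= k) >= goal, assuming
    failures follow a Poisson process with constant rate."""
    m = failure_rate * mission_hours  # expected number of failures
    k, term = 0, math.exp(-m)
    cdf = term
    while cdf < reliability_goal:
        k += 1
        term *= m / k
        cdf += term
    return k

# One expected failure over the mission requires 4 spares for 99% confidence.
k = spares_needed(failure_rate=1e-4, mission_hours=10_000, reliability_goal=0.99)
```

Note how sensitive the answer is to the assumed rate: if the true failure rate is higher than estimated, the computed spare count is too low, which is the risk the abstract highlights.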

  3. Achieving High Reliability with People, Processes, and Technology.

    PubMed

    Saunders, Candice L; Brennan, John A

    2017-01-01

    High reliability as a corporate value in healthcare can be achieved by meeting the "Quadruple Aim" of improving population health, reducing per capita costs, enhancing the patient experience, and improving provider wellness. This drive starts with the board of trustees, CEO, and other senior leaders who ingrain high reliability throughout the organization. At WellStar Health System, the board developed an ambitious goal to become a top-decile health system in safety and quality metrics. To achieve this goal, WellStar has embarked on a journey toward high reliability and has committed to Lean management practices consistent with the Institute for Healthcare Improvement's definition of a high-reliability organization (HRO): one that is committed to the prevention of failure, early identification and mitigation of failure, and redesign of processes based on identifiable failures. In the end, a successful HRO can provide safe, effective, patient- and family-centered, timely, efficient, and equitable care through a convergence of people, processes, and technology.

  4. Correcting Fallacies in Validity, Reliability, and Classification

    ERIC Educational Resources Information Center

    Sijtsma, Klaas

    2009-01-01

    This article reviews three topics from test theory that continue to raise discussion and controversy and capture test theorists' and constructors' interest. The first topic concerns the discussion of the methodology of investigating and establishing construct validity; the second topic concerns reliability and its misuse, alternative definitions…

  5. Top-emitting white organic light-emitting devices with down-conversion phosphors: theory and experiment.

    PubMed

    Ji, Wenyu; Zhang, Letian; Gao, Ruixue; Zhang, Liming; Xie, Wenfa; Zhang, Hanzhuang; Li, Bin

    2008-09-29

    White top-emitting organic light-emitting devices (TEOLEDs) with down-conversion phosphors are investigated theoretically and experimentally. The theoretical simulation combines the microcavity model with the down-conversion model. A white TEOLED combining a blue TEOLED with the organic down-conversion phosphor 3-(4-(diphenylamino)phenyl)-1-phenylprop-2-en-1-one was fabricated to validate the simulated results. It is shown that this approach permits the generation of white light in TEOLEDs. The efficiency of the white TEOLED is twice that of the corresponding blue TEOLED. Feasible methods to improve the performance of such white TEOLEDs are discussed.

  6. Design of Oil-Lubricated Machine for Life and Reliability

    NASA Technical Reports Server (NTRS)

    Zaretsky, Erwin V.

    2007-01-01

    In the post-World War II era, the major technology drivers for improving the life, reliability, and performance of rolling-element bearings and gears have been the jet engine and the helicopter. By the late 1950s, most of the materials used for bearings and gears in the aerospace industry had been introduced into use. By the early 1960s, the life of most steels was increased over that experienced in the early 1940s, primarily by the introduction of vacuum degassing and vacuum melting processes in the late 1950s. The development of elastohydrodynamic (EHD) theory showed that most rolling bearings and gears have a thin film separating the contacting bodies during motion and it is that film which affects their lives. Computer programs modeling bearing and gear dynamics that incorporate probabilistic life prediction methods and EHD theory enable optimization of rotating machinery based on life and reliability. With improved manufacturing and processing, the potential improvement in bearing and gear life can be as much as 80 times that attainable in the early 1950s. The work presented summarizes the use of laboratory fatigue data for bearings and gears coupled with probabilistic life prediction and EHD theories to predict the life and reliability of a commercial turboprop gearbox. The resulting predictions are compared with field data.

  7. The Mathematics of Aggregation, Interdependence, Organizations and Systems of Nash Equilibria (NE): A Replacement for Game Theory

    DTIC Science & Technology

    2011-06-01

    there a free-market economist in the House?, American Journal of Economics and Sociology, 66(2): 309-334. Jasny, B. R., Zahn, L.M., & Marshall, E...The Mathematics of Aggregation, Interdependence, Organizations and Systems of Nash equilibria (NE): A replacement for Game Theory...level data to group, organization and systems levels, making it one of social science’s biggest challenges, if not the most important (Giles, 2011). For

  8. An Examination of Test-Retest, Alternate Form Reliability, and Generalizability Theory Study of the easyCBM Word and Passage Reading Fluency Assessments: Grade 3. Technical Report #1218

    ERIC Educational Resources Information Center

    Park, Bitnara Jasmine; Anderson, Daniel; Alonzo, Julie; Lai, Cheng-Fei; Tindal, Gerald

    2012-01-01

    This technical report is one in a series of five describing the reliability (test/retest and alternate form) and G-Theory/D-Study research on the easyCBM reading measures, grades 1-5. Data were gathered in the spring of 2011 from a convenience sample of students nested within classrooms at a medium-sized school district in the Pacific Northwest.…

  9. Investigation of a Diagnostic for Perturbation Theory: Comparison to the T(sub 1) Diagnostic of Coupled-Cluster Theory

    NASA Technical Reports Server (NTRS)

    Lee, Timothy J.; Head-Gordon, Martin; Rendell, Alistair P.; Langhoff, Stephen R. (Technical Monitor)

    1995-01-01

    A diagnostic for perturbation theory calculations, S(sub 2), is defined and numerical results are compared to the established T(sub 1) diagnostic from coupled-cluster theory. S(sub 2) is the lowest order non-zero contribution to a perturbation expansion of T(sub 1). S(sub 2) is a reasonable estimate of the importance of non-dynamical electron correlation, although not as reliable as T(sub 1). S(sub 2) values less than or equal to 0.012 suggest that low orders of perturbation theory should yield reasonable results; S(sub 2) values between 0.012-0.015 suggest that caution is required in interpreting results from low orders of perturbation theory; S(sub 2) values greater than or equal to 0.015 indicate that low orders of perturbation theory are not reliable for accurate results. Although not required mathematically, S(sub 2) is always less than T(sub 1) for the examples studied here.

  10. The Information Function for the One-Parameter Logistic Model: Is it Reliability?

    ERIC Educational Resources Information Center

    Doran, Harold C.

    2005-01-01

    The information function is an important statistic in item response theory (IRT) applications. Although the information function is often described as the IRT version of reliability, it differs from the classical notion of reliability from a critical perspective: replication. This article first explores the information function for the…

  11. Construct Definition Methodology and Generalizability Theory Applied to Career Education Measurement.

    ERIC Educational Resources Information Center

    Stenner, A. Jackson; Rohlf, Richard J.

    The merits of generalizability theory in the formulation of construct definitions and in the determination of reliability estimates are discussed. The broadened conceptualization of reliability brought about by Cronbach's generalizability theory is reviewed. Career Maturity Inventory data from a sample of 60 ninth grade students is used to…

  12. Firm Size, a Self-Organized Critical Phenomenon: Evidence from the Dynamical Systems Theory

    NASA Astrophysics Data System (ADS)

    Chandra, Akhilesh

    This research draws upon a recent innovation in the dynamical systems literature called the theory of self-organized criticality (SOC) (Bak, Tang, and Wiesenfeld 1988) to develop a computational model of a firm's size by relating its internal and the external sub-systems. As a holistic paradigm, the theory of SOC implies that a firm as a composite system of many degrees of freedom naturally evolves to a critical state in which a minor event starts a chain reaction that can affect either a part or the system as a whole. Thus, the global features of a firm cannot be understood by analyzing its individual parts separately. The causal framework builds upon a constant capital resource to support a volume of production at the existing level of efficiency. The critical size is defined as the production level at which the average product of a firm's factors of production attains its maximum value. The non-linearity is inferred by a change in the nature of relations at the border of criticality, between size and the two performance variables, viz., the operating efficiency and the financial efficiency. The effect of breaching the critical size is examined on the stock price reactions. Consistent with the theory of SOC, it is hypothesized that the temporal response of a firm breaching the level of critical size should behave as a flicker noise (1/f) process. The flicker noise is characterized by correlations extended over a wide range of time scales, indicating some sort of cooperative effect among a firm's degrees of freedom. It is further hypothesized that a firm's size evolves to a spatial structure with scale-invariant, self-similar (fractal) properties. The system is said to be self-organized inasmuch as it naturally evolves to the state of criticality without any detailed specifications of the initial conditions. In this respect, the critical state is an attractor of the firm's dynamics. Another set of hypotheses examines the relations between the size and the…

  13. A simple theory of molecular organization in fullerene-containing liquid crystals

    NASA Astrophysics Data System (ADS)

    Peroukidis, S. D.; Vanakaras, A. G.; Photinos, D. J.

    2005-10-01

    Systematic efforts to synthesize fullerene-containing liquid crystals have produced a variety of successful model compounds. We present a simple molecular theory, based on the interconverting shape approach [Vanakaras and Photinos, J. Mater. Chem. 15, 2002 (2005)], that relates the self-organization observed in these systems to their molecular structure. The interactions are modeled by dividing each molecule into a number of submolecular blocks to which specific interactions are assigned. Three types of blocks are introduced, corresponding to fullerene units, mesogenic units, and nonmesogenic linkage units. The blocks are constrained to move on a cubic three-dimensional lattice and molecular flexibility is allowed by retaining a number of representative conformations within the block representation of the molecule. Calculations are presented for a variety of molecular architectures including twin mesogenic branch monoadducts of C60, twin dendromesogenic branch monoadducts, and conical (badminton shuttlecock) multiadducts of C60. The dependence of the phase diagrams on the interaction parameters is explored. In spite of its many simplifications and the minimal molecular modeling used (three types of chemically distinct submolecular blocks with only repulsive interactions), the theory accounts remarkably well for the phase behavior of these systems.

  14. A Unifying Theory of Biological Function.

    PubMed

    van Hateren, J H

    2017-01-01

    A new theory that naturalizes biological function is explained and compared with earlier etiological and causal role theories. Etiological (or selected effects) theories explain functions from how they are caused over their evolutionary history. Causal role theories analyze how functional mechanisms serve the current capacities of their containing system. The new proposal unifies the key notions of both kinds of theories, but goes beyond them by explaining how functions in an organism can exist as factors with autonomous causal efficacy. The goal-directedness and normativity of functions exist in this strict sense as well. The theory depends on an internal physiological or neural process that mimics an organism's fitness, and modulates the organism's variability accordingly. The structure of the internal process can be subdivided into subprocesses that monitor specific functions in an organism. The theory matches well with each intuition on a previously published list of intuited ideas about biological functions, including intuitions that have posed difficulties for other theories.

  15. The Assessment of Reliability Under Range Restriction: A Comparison of [Alpha], [Omega], and Test-Retest Reliability for Dichotomous Data

    ERIC Educational Resources Information Center

    Fife, Dustin A.; Mendoza, Jorge L.; Terry, Robert

    2012-01-01

    Though much research and attention has been directed at assessing the correlation coefficient under range restriction, the assessment of reliability under range restriction has been largely ignored. This article uses item response theory to simulate dichotomous item-level data to assess the robustness of KR-20 ([alpha]), [omega], and test-retest…

  16. The self-perceived survival ability and reproductive fitness (SPFit) theory of substance use disorders.

    PubMed

    Newlin, David B

    2002-04-01

    A new theory of substance use disorders is proposed-the SPFit theory-that is based on evolutionary biology and adaptive systems. Self-perceived survival ability and reproductive fitness (SPFit) is proposed as a human psychobiological construct that prioritizes and organizes (i.e. motivates) behavior, but is highly vulnerable to temporary, artificial activation by drugs of abuse. Autoshaping/sign-tracking/feature positive phenomena are proposed to underlie the development of craving and expectations about drugs as the individual learns that abused drugs will easily and reliably inflate SPFit. The cortico-mesolimbic dopamine system and its modulating interconnections are viewed as the biological substrate of SPFit; it is proposed to be a survival and reproductive motivation system rather than a reward center or reward pathway. Finally, the concept of modularity of mind is applied to the SPFit construct. Although considerable empirical data are consistent with the theory, new research is needed to test specific hypotheses derived from SPFit theory.

  17. Using generalizability theory to develop clinical assessment protocols.

    PubMed

    Preuss, Richard A

    2013-04-01

    Clinical assessment protocols must produce data that are reliable, with a clinically attainable minimal detectable change (MDC). In a reliability study, generalizability theory has 2 advantages over classical test theory. These advantages provide information that allows assessment protocols to be adjusted to match individual patient profiles. First, generalizability theory allows the user to simultaneously consider multiple sources of measurement error variance (facets). Second, it allows the user to generalize the findings of the main study across the different study facets and to recalculate the reliability and MDC based on different combinations of facet conditions. In doing so, clinical assessment protocols can be chosen based on minimizing the number of measures that must be taken to achieve a realistic MDC, using repeated measures to minimize the MDC, or simply based on the combination that best allows the clinician to monitor an individual patient's progress over a specified period of time.
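
    A minimal numeric sketch of the two advantages described above, using hypothetical variance components from a persons-by-trials G study: the G coefficient and the MDC are recalculated for different numbers of repeated trials.

```python
import math

# Hypothetical variance components from a persons-x-trials G study
var_person = 4.0      # between-patient (universe score) variance
var_error  = 2.5      # person-by-trial residual (error) variance

def g_coefficient(n_trials):
    """Relative G coefficient when the score is the mean of n trials."""
    return var_person / (var_person + var_error / n_trials)

def mdc95(n_trials):
    """Minimal detectable change (95% confidence) for a mean of n trials."""
    sem = math.sqrt(var_error / n_trials)
    return 1.96 * math.sqrt(2) * sem

for n in (1, 2, 4):
    print(f"{n} trial(s): G = {g_coefficient(n):.2f}, MDC95 = {mdc95(n):.2f}")
```

    This is the decision-study logic the abstract describes: averaging over more facet conditions raises G and lowers the MDC, so a protocol can be tuned to a clinically attainable MDC.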

  18. Analysis of fatigue reliability for high temperature and high pressure multi-stage decompression control valve

    NASA Astrophysics Data System (ADS)

    Yu, Long; Xu, Juanjuan; Zhang, Lifang; Xu, Xiaogang

    2018-03-01

    A mathematical reliability model for the high temperature and high pressure multi-stage decompression control valve (HMDCV) is established based on stress-strength interference theory, and a temperature correction coefficient is introduced to revise the material fatigue limit at high temperature. The reliability of the key dangerous components and the fatigue sensitivity curve of each component are calculated and analyzed, combining a fatigue-life analysis of the control valve with reliability theory. The proportional impact of each component on fatigue failure of the control valve system was obtained. The results show that the temperature correction factor makes the theoretical reliability calculations more accurate, that the predicted life of the main pressure-bearing parts meets the technical requirements, and that the valve body and the sleeve have an obvious influence on control system reliability; stress concentration in key parts of the control valve can be reduced during design by improving the structure.
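
    The stress-strength interference calculation underlying this record can be sketched as follows for independent normal stress and strength; the strength, stress, and temperature correction values are hypothetical placeholders, not values from the study.

```python
import math
from statistics import NormalDist

# Illustrative stress-strength interference sketch; all numbers are hypothetical.
temp_factor = 0.85                      # assumed high-temperature derating of the fatigue limit
mu_strength = 520.0 * temp_factor       # mean derated strength (MPa)
sd_strength = 35.0
mu_stress, sd_stress = 310.0, 40.0      # mean and spread of the working stress (MPa)

def interference_reliability(mu_s, sd_s, mu_l, sd_l):
    """P(strength > stress) for independent normal strength and stress."""
    z = (mu_s - mu_l) / math.hypot(sd_s, sd_l)
    return NormalDist().cdf(z)

R = interference_reliability(mu_strength, sd_strength, mu_stress, sd_stress)
print(f"component reliability: {R:.4f}")
```

    The temperature factor enters exactly as the abstract suggests: derating the strength distribution shifts the interference margin and lowers the computed reliability.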

  19. A Comparison of the Approaches of Generalizability Theory and Item Response Theory in Estimating the Reliability of Test Scores for Testlet-Composed Tests

    ERIC Educational Resources Information Center

    Lee, Guemin; Park, In-Yong

    2012-01-01

    Previous assessments of the reliability of test scores for testlet-composed tests have indicated that item-based estimation methods overestimate reliability. This study was designed to address issues related to the extent to which item-based estimation methods overestimate the reliability of test scores composed of testlets and to compare several…

  20. Body without Organs: Notes on Deleuze & Guattari, Critical Race Theory and the Socius of Anti-Racism

    ERIC Educational Resources Information Center

    Ibrahim, Awad

    2015-01-01

    My aim in this article is to epistemologically read Deleuze and Guattari (D & G) against critical race theory (CRT) and simultaneously delineate how D & G's notion of "body without organs" can benefit from CRT. At first glance, especially for language instructors and researchers, these two epistemological frameworks not only…

  1. A pilot study to validate measures of the theory of reasoned action for organ donation behavior.

    PubMed

    Wong, Shui Hung; Chow, Amy Yin Man

    2018-04-01

    The present study is a first attempt to validate measures generated from the theory of reasoned action (TRA). A total of 211 university students participated in the study; 95 were included in the exploratory factor analysis and 116 in the confirmatory factor analysis. The TRA measures were established with adequate psychometric properties, internal consistency, and construct validity. Findings also suggested that attitudes toward organ donation have both a cognitive and an affective nature, while the subjective norm of the family appears to be important to students' views on organ donation.

  2. Reliable sex and strain discrimination in the mouse vomeronasal organ and accessory olfactory bulb.

    PubMed

    Tolokh, Illya I; Fu, Xiaoyan; Holy, Timothy E

    2013-08-21

    Animals modulate their courtship and territorial behaviors in response to olfactory cues produced by other animals. In rodents, detecting these cues is the primary role of the accessory olfactory system (AOS). We sought to systematically investigate the natural stimulus coding logic and robustness in neurons of the first two stages of accessory olfactory processing, the vomeronasal organ (VNO) and accessory olfactory bulb (AOB). We show that the firing rate responses of just a few well-chosen mouse VNO or AOB neurons can be used to reliably encode both the sex and strain of other mice from cues contained in urine. Additionally, we show that this population code can generalize to new concentrations of stimuli and appears to represent stimulus identity in terms of diverging paths in coding space. Together, the results indicate that a firing-rate code on the temporal order of seconds is sufficient for accurate classification of pheromonal patterns at different concentrations and may be used by AOS neural circuitry to discriminate among naturally occurring urine stimuli.
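
    The population-decoding claim (a few neurons suffice to classify sex and strain from firing rates) can be illustrated with a toy nearest-centroid decoder; the tuning values and class labels below are invented for illustration and are not the study's data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical mean firing rates (Hz) of 5 neurons to urine from 4 classes
# (2 sexes x 2 strains); all values are illustrative only.
centroids = {
    ("male", "B6"):     np.array([12.0,  3.0,  8.0,  1.0,  6.0]),
    ("female", "B6"):   np.array([ 2.0, 10.0,  7.0,  9.0,  1.0]),
    ("male", "BALB"):   np.array([ 9.0,  2.0,  1.0, 11.0,  4.0]),
    ("female", "BALB"): np.array([ 1.0,  8.0, 12.0,  2.0, 10.0]),
}

def classify(rates):
    """Nearest-centroid decoding in firing-rate space."""
    return min(centroids, key=lambda c: np.linalg.norm(rates - centroids[c]))

# Noisy trials: Poisson variability in spike counts around each class mean
correct = total = 0
for label, mu in centroids.items():
    for _ in range(200):
        trial = rng.poisson(mu).astype(float)
        correct += classify(trial) == label
        total += 1
accuracy = correct / total
print(f"decoding accuracy: {accuracy:.2f}")
```

    Even with Poisson trial-to-trial variability, well-separated rate vectors are decoded reliably, which is the intuition behind the record's claim.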

  3. The relationship between personality organization as assessed by theory-driven profiles of the Dutch Short Form of the MMPI and self-reported features of personality organization.

    PubMed

    Eurelings-Bontekoe, Elisabeth H M; Luyten, Patrick; Remijsen, Mila; Koelen, Jurrijn

    2010-11-01

    In this study, we investigated the relationships between features of personality organization (PO) as assessed by theory-driven profiles of the Dutch Short Form of the MMPI (DSFM; Luteijn & Kok, 1985) and 2 self-report measures of personality pathology, that is, the Dutch Inventory of Personality Organization (Berghuis, Kamphuis, Boedijn, & Verheul, 2009) and the Dutch Schizotypy Personality Questionnaire-Revised (Vollema & Hoijtink, 2000), in a sample of 190 outpatient psychiatric patients. Results showed that the single scales of all 3 measures segregated into 2 theoretically expected and meaningful dimensions, that is, a dimension assessing severity of personality pathology and an introversion/extraversion dimension. Theory-driven combinations of single DSFM subscales as a measure of level of PO distinguished characteristics of patients at various levels of PO in theoretically predicted ways. Results also suggest that structural personality pathology may not be fully captured by self-report measures.

  4. Improving the Reliability of Student Scores from Speeded Assessments: An Illustration of Conditional Item Response Theory Using a Computer-Administered Measure of Vocabulary.

    PubMed

    Petscher, Yaacov; Mitchell, Alison M; Foorman, Barbara R

    2015-01-01

    A growing body of literature suggests that response latency, the amount of time it takes an individual to respond to an item, may be an important factor to consider when using assessment data to estimate the ability of an individual. Considering that tests of passage and list fluency are being adapted to a computer administration format, it is possible that accounting for individual differences in response times may be an increasingly feasible option to strengthen the precision of individual scores. The present research evaluated the differential reliability of scores when using classical test theory and item response theory as compared to a conditional item response model which includes response time as an item parameter. Results indicated that the precision of student ability scores increased by an average of 5 % when using the conditional item response model, with greater improvements for those who were average or high ability. Implications for measurement models of speeded assessments are discussed.
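
    The conditional response-time model itself is beyond a short sketch, but the IRT machinery it extends is compact: test information I(θ) and the conditional standard error SEM(θ) = 1/√I(θ), which varies with ability just as the precision gains reported above do. The 2PL item parameters below are hypothetical.

```python
import math

# Hypothetical 2PL item bank: (discrimination a, difficulty b)
items = [(1.2, -1.0), (0.8, -0.5), (1.5, 0.0), (1.0, 0.5), (1.3, 1.2)]

def information(theta):
    """Fisher information of the test at ability theta under the 2PL model."""
    total = 0.0
    for a, b in items:
        p = 1 / (1 + math.exp(-a * (theta - b)))
        total += a * a * p * (1 - p)
    return total

# SEM is conditional on ability: precision differs across the theta range
for theta in (-2.0, 0.0, 2.0):
    sem = 1 / math.sqrt(information(theta))
    print(f"theta = {theta:+.1f}: I = {information(theta):.2f}, SEM = {sem:.2f}")
```

    Adding response time as an item parameter, as the study does, effectively raises I(θ) and thus tightens SEM(θ), most for examinees whose items are otherwise less informative.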

  5. Improving the Reliability of Student Scores from Speeded Assessments: An Illustration of Conditional Item Response Theory Using a Computer-Administered Measure of Vocabulary

    PubMed Central

    Petscher, Yaacov; Mitchell, Alison M.; Foorman, Barbara R.

    2016-01-01

    A growing body of literature suggests that response latency, the amount of time it takes an individual to respond to an item, may be an important factor to consider when using assessment data to estimate the ability of an individual. Considering that tests of passage and list fluency are being adapted to a computer administration format, it is possible that accounting for individual differences in response times may be an increasingly feasible option to strengthen the precision of individual scores. The present research evaluated the differential reliability of scores when using classical test theory and item response theory as compared to a conditional item response model which includes response time as an item parameter. Results indicated that the precision of student ability scores increased by an average of 5 % when using the conditional item response model, with greater improvements for those who were average or high ability. Implications for measurement models of speeded assessments are discussed. PMID:27721568

  6. 'Broken hospital windows': debating the theory of spreading disorder and its application to healthcare organizations.

    PubMed

    Churruca, Kate; Ellis, Louise A; Braithwaite, Jeffrey

    2018-03-22

    Research in criminology and social psychology supports the idea that visible signs of disorder, both physical and social, may perpetuate further disorder, leading to neighborhood incivilities, petty violations, and potentially criminal behavior. This theory of 'broken windows' has now also been applied to more enclosed environments, such as organizations. This paper debates whether the premise of broken windows theory, and the concept of 'disorder', might also have utility in the context of health services. There is already a body of work on system migration, which suggests a role for violations and workarounds in normalizing unwarranted deviations from safe practices in healthcare organizations. Studies of visible disorder may be needed in healthcare, where the risks of norm violations and disorderly environments, and the potential for harm to patients, are considerable. Everyday adjustments and flexibility are mostly beneficial, but in this paper we ask: how might deviations from the norm escalate from necessary workarounds to risky violations in care settings? Does physical or social disorder in healthcare contexts perpetuate further disorder, leading to downstream effects, including increased risk of harm to patients? We advance a model of broken windows in healthcare, and a proposal to study this phenomenon.

  7. Sample Size for Estimation of G and Phi Coefficients in Generalizability Theory

    ERIC Educational Resources Information Center

    Atilgan, Hakan

    2013-01-01

    Problem Statement: Reliability, which refers to the degree to which measurement results are free from measurement errors, as well as its estimation, is an important issue in psychometrics. Several methods for estimating reliability have been suggested by various theories in the field of psychometrics. One of these theories is the generalizability…

  8. Near-misses are an opportunity to improve patient safety: adapting strategies of high reliability organizations to healthcare.

    PubMed

    Van Spall, Harriette; Kassam, Alisha; Tollefson, Travis T

    2015-08-01

    Near-miss investigations in high reliability organizations (HROs) aim to mitigate risk and improve system safety. Healthcare settings have a higher rate of near-misses and subsequent adverse events than most high-risk industries, but near-misses are not systematically reported or analyzed. In this review, we will describe the strategies for near-miss analysis that have facilitated a culture of safety and continuous quality improvement in HROs. Near-miss analysis is routine and systematic in HROs such as aviation. Strategies implemented in aviation include the Commercial Aviation Safety Team, which undertakes systematic analyses of near-misses, so that findings can be incorporated into Standard Operating Procedures (SOPs). Other strategies resulting from incident analyses include Crew Resource Management (CRM) for enhanced communication, situational awareness training, adoption of checklists during operations, and built-in redundancy within systems. Health care organizations should consider near-misses as opportunities for quality improvement. The systematic reporting and analysis of near-misses, commonplace in HROs, can be adapted to health care settings to prevent adverse events and improve clinical outcomes.

  9. Assessment of physical server reliability in multi cloud computing system

    NASA Astrophysics Data System (ADS)

    Kalyani, B. J. D.; Rao, Kolasani Ramchand H.

    2018-04-01

    Business organizations nowadays function with more than one cloud provider. Spreading a cloud deployment across multiple service providers creates room for competitive prices that minimize the burden on an enterprise's spending budget. To assess the software reliability of a multi-cloud application, a layered software reliability assessment paradigm is considered with three levels of abstraction: the application layer, the virtualization layer, and the server layer. The reliability of each layer is assessed separately, and the results are combined to obtain the reliability of the multi-cloud computing application. In this paper, we focus on how to assess the reliability of the server layer, with the required algorithms, and explore the steps in the assessment of server reliability.
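
    The layered assessment described above reduces, in its simplest series form, to multiplying layer reliabilities, with multi-cloud redundancy modeled as parallel deployments. The per-layer numbers below are hypothetical.

```python
# Hypothetical per-layer reliabilities of one cloud deployment
def system_reliability(layers):
    """Series model: the application works only if every layer works."""
    r = 1.0
    for rel in layers.values():
        r *= rel
    return r

deployment = {"application": 0.999, "virtualization": 0.995, "server": 0.990}
single = system_reliability(deployment)

# Multi-cloud: the app survives if at least one of two independent deployments does
multi = 1 - (1 - single) ** 2
print(f"single cloud: {single:.4f}, two-cloud parallel: {multi:.6f}")
```

    The series/parallel decomposition is the standard reliability block diagram view; the paper's contribution sits inside the server-layer term, not in this combination step.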

  10. VFS interjudge reliability using a free and directed search.

    PubMed

    Bryant, Karen N; Finnegan, Eileen; Berbaum, Kevin

    2012-03-01

    Reports in the literature suggest that clinicians demonstrate poor reliability in rating videofluoroscopic swallow (VFS) variables. Contemporary perception theories suggest that the methods used in VFS reliability studies constrain subjects to make judgments in an abnormal way. The purpose of this study was to determine whether a directed search or a free search approach to rating swallow studies results in better interjudge reliability. Ten speech pathologists served as judges. Five clinical judges were assigned to the directed search group (using a checklist) and five to the free search group (unguided observations). Clinical judges interpreted 20 VFS examinations of swallowing. Interjudge reliability of ratings of dysphagia severity, affected stage of swallow, dysphagia symptoms, and attributes identified by clinical judges using a directed search was compared with that using a free search approach. Interjudge reliability for rating the presence of aspiration and penetration was significantly better using a free search ("substantial" to "almost perfect" agreement) compared to a directed search ("moderate" agreement). Reliability of dysphagia severity ratings ranged from "moderate" to "almost perfect" agreement for both methods of search. Reliability for reporting all other symptoms and attributes of dysphagia was variable and was not significantly different between the groups.

  11. Scaled CMOS Technology Reliability Users Guide

    NASA Technical Reports Server (NTRS)

    White, Mark

    2010-01-01

    The desire to assess the reliability of emerging scaled microelectronics technologies through faster reliability trials and more accurate acceleration models is the precursor for further research and experimentation in this relevant field. The effect of semiconductor scaling on microelectronics product reliability is an important aspect for the high-reliability application user. From the perspective of a customer or user, who in many cases must deal with very limited, if any, manufacturer's reliability data to assess the product for a highly reliable application, product-level testing is critical in the characterization and reliability assessment of advanced nanometer semiconductor scaling effects on microelectronics reliability. A methodology on how to accomplish this and techniques for deriving the expected product-level reliability of commercial memory products are provided. Competing mechanism theory and the multiple failure mechanism model are applied to the experimental results of scaled SDRAM products. Accelerated stress testing at multiple conditions is applied at the product level of several scaled memory products to assess the performance degradation and product reliability. Acceleration models are derived for each case. For several scaled SDRAM products, retention time degradation is studied and two distinct soft error populations are observed with each technology generation: early breakdown, characterized by randomly distributed weak bits with Weibull slope β = 1, and a main population breakdown with an increasing failure rate. Retention time soft error rates are calculated and a multiple failure mechanism acceleration model with parameters is derived for each technology. Defect densities are calculated and reflect a decreasing trend in the percentage of random defective bits for each successive product generation. A normalized soft error failure rate of the memory data retention time in FIT/Gb and FIT/cm² for several scaled SDRAM generations is
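
    The multiple failure mechanism picture in this record (an early random-failure population with Weibull slope β = 1 competing with a wear-out population with increasing failure rate) can be illustrated by multiplying the two Weibull survival functions; the scale and shape parameters below are invented for illustration.

```python
import math

def weibull_survival(t, eta, beta):
    """Two-parameter Weibull survival function."""
    return math.exp(-((t / eta) ** beta))

# Hypothetical competing mechanisms for retention-time soft errors:
# early random failures (beta = 1) and a wear-out population (beta > 1)
def combined_survival(t):
    early   = weibull_survival(t, eta=5e4, beta=1.0)   # randomly distributed weak bits
    wearout = weibull_survival(t, eta=1e4, beta=3.0)   # increasing failure rate
    return early * wearout   # competing mechanisms: survive both

for t in (1e3, 5e3, 1e4):
    print(f"t = {t:8.0f} h: S(t) = {combined_survival(t):.4f}")
```

    Early on the constant-rate population dominates; as t approaches the wear-out scale, the β > 1 population takes over, which is the two-population signature the record describes.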

  12. Estimating Between-Person and Within-Person Subscore Reliability with Profile Analysis.

    PubMed

    Bulut, Okan; Davison, Mark L; Rodriguez, Michael C

    2017-01-01

    Subscores are of increasing interest in educational and psychological testing due to their diagnostic function for evaluating examinees' strengths and weaknesses within particular domains of knowledge. Previous studies about the utility of subscores have mostly focused on the overall reliability of individual subscores and ignored the fact that subscores should be distinct and have added value over the total score. This study introduces a profile reliability approach that partitions the overall subscore reliability into within-person and between-person subscore reliability. The estimation of between-person reliability and within-person reliability coefficients is demonstrated using subscores from number-correct scoring, unidimensional and multidimensional item response theory scoring, and augmented scoring approaches via a simulation study and a real data study. The effects of various testing conditions, such as subtest length, correlations among subscores, and the number of subtests, are examined. Results indicate that there is a substantial trade-off between within-person and between-person reliability of subscores. Profile reliability coefficients can be useful in determining the extent to which subscores provide distinct and reliable information under various testing conditions.

  13. Reliably Modeling the Mechanical Stability of Rigid and Flexible Metal-Organic Frameworks.

    PubMed

    Rogge, Sven M J; Waroquier, Michel; Van Speybroeck, Veronique

    2018-01-16

    Over the past two decades, metal-organic frameworks (MOFs) have matured from interesting academic peculiarities toward a continuously expanding class of hybrid, nanoporous materials tuned for targeted technological applications such as gas storage and heterogeneous catalysis. These oft-times crystalline materials, composed of inorganic moieties interconnected by organic ligands, can be endowed with desired structural and chemical features by judiciously functionalizing or substituting these building blocks. As a result of this reticular synthesis, MOF research is situated at the intriguing intersection between chemistry and physics, and the building block approach could pave the way toward the construction of an almost infinite number of possible crystalline structures, provided that they exhibit stability under the desired operational conditions. However, this enormous potential is largely untapped to date, as MOFs have not yet found a major breakthrough in technological applications. One of the remaining challenges for this scale-up is the densification of MOF powders, which is generally achieved by subjecting the material to a pressurization step. However, application of an external pressure may substantially alter the chemical and physical properties of the material. A reliable theoretical guidance that can presynthetically identify the most stable materials could help overcome this technological challenge. In this Account, we describe recent research progress on the computational characterization of the mechanical stability of MOFs. So far, three complementary approaches have been proposed, focusing on different aspects of mechanical stability: (i) the Born stability criteria, (ii) the anisotropy in mechanical moduli such as the Young and shear moduli, and (iii) the pressure-versus-volume equations of state. As these three methods are grounded in distinct computational approaches, it is expected that their accuracy and efficiency will vary. To date

  14. Reliably Modeling the Mechanical Stability of Rigid and Flexible Metal–Organic Frameworks

    PubMed Central

    2017-01-01

    Conspectus Over the past two decades, metal–organic frameworks (MOFs) have matured from interesting academic peculiarities toward a continuously expanding class of hybrid, nanoporous materials tuned for targeted technological applications such as gas storage and heterogeneous catalysis. These oft-times crystalline materials, composed of inorganic moieties interconnected by organic ligands, can be endowed with desired structural and chemical features by judiciously functionalizing or substituting these building blocks. As a result of this reticular synthesis, MOF research is situated at the intriguing intersection between chemistry and physics, and the building block approach could pave the way toward the construction of an almost infinite number of possible crystalline structures, provided that they exhibit stability under the desired operational conditions. However, this enormous potential is largely untapped to date, as MOFs have not yet found a major breakthrough in technological applications. One of the remaining challenges for this scale-up is the densification of MOF powders, which is generally achieved by subjecting the material to a pressurization step. However, application of an external pressure may substantially alter the chemical and physical properties of the material. A reliable theoretical guidance that can presynthetically identify the most stable materials could help overcome this technological challenge. In this Account, we describe recent research progress on the computational characterization of the mechanical stability of MOFs. So far, three complementary approaches have been proposed, focusing on different aspects of mechanical stability: (i) the Born stability criteria, (ii) the anisotropy in mechanical moduli such as the Young and shear moduli, and (iii) the pressure-versus-volume equations of state. As these three methods are grounded in distinct computational approaches, it is expected that their accuracy and efficiency will vary. To date

  15. Push-Pull Receptive Field Organization and Synaptic Depression: Mechanisms for Reliably Encoding Naturalistic Stimuli in V1

    PubMed Central

    Kremkow, Jens; Perrinet, Laurent U.; Monier, Cyril; Alonso, Jose-Manuel; Aertsen, Ad; Frégnac, Yves; Masson, Guillaume S.

    2016-01-01

    Neurons in the primary visual cortex are known for responding vigorously but with high variability to classical stimuli such as drifting bars or gratings. By contrast, natural scenes are encoded more efficiently by sparse and temporally precise spiking responses. We used a conductance-based model of the visual system in higher mammals to investigate how two specific features of the thalamo-cortical pathway, namely push-pull receptive field organization and fast synaptic depression, can contribute to this contextual reshaping of V1 responses. By comparing cortical dynamics evoked respectively by natural vs. artificial stimuli in a comprehensive parametric space analysis, we demonstrate that the reliability and sparseness of the spiking responses during natural vision is not a mere consequence of the increased bandwidth in the sensory input spectrum. Rather, it results from the combined impacts of fast synaptic depression and push-pull inhibition, the latter acting for natural scenes as a form of “effective” feed-forward inhibition as demonstrated in other sensory systems. Thus, the combination of feedforward-like inhibition with fast thalamo-cortical synaptic depression by simple cells receiving a direct structured input from thalamus composes a generic computational mechanism for generating a sparse and reliable encoding of natural sensory events. PMID:27242445

  16. Identity theory and personality theory: mutual relevance.

    PubMed

    Stryker, Sheldon

    2007-12-01

    Some personality psychologists have found a structural symbolic interactionist frame and identity theory relevant to their work. This frame and theory, developed in sociology, are first reviewed. Emphasized in the review are a multiple-identity conception of self, identities as internalized expectations derived from roles embedded in organized networks of social interaction, and a view of social structures as facilitators in bringing people into networks or constraints in keeping them out. Subsequently, attention turns to a discussion of the mutual relevance of structural symbolic interactionism/identity theory and personality theory, looking to extensions of the current literature on these topics.

  17. The Impact of Multiple Master Patient Index Records on the Business Performance of Health Care Organizations: A Qualitative Grounded Theory Study

    ERIC Educational Resources Information Center

    Banton, Cynthia L.

    2014-01-01

    The purpose of this qualitative grounded theory study was to explore and examine the factors that led to the creation of multiple record entries, and present a theory on the impact the problem has on the business performance of health care organizations. A sample of 59 health care professionals across the United States participated in an online…

  18. Managing Reliability in the 21st Century

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dellin, T.A.

    1998-11-23

    The rapid pace of change at the end of the 20th Century should continue unabated well into the 21st Century. The driver will be the marketplace imperative of "faster, better, cheaper." This imperative has already stimulated a revolution-in-engineering in design and manufacturing. In contrast, to date, reliability engineering has not undergone a similar level of change. It is critical that we implement a corresponding revolution-in-reliability-engineering as we enter the new millennium. If we are still using 20th Century reliability approaches in the 21st Century, then reliability issues will be the limiting factor in faster, better, and cheaper. At the heart of this reliability revolution will be a science-based approach to reliability engineering. Science-based reliability will enable building-in reliability, application-specific products, virtual qualification, and predictive maintenance. The purpose of this paper is to stimulate a dialogue on the future of reliability engineering. We will try to gaze into the crystal ball and predict some key issues that will drive reliability programs in the new millennium. In the 21st Century, we will demand more of our reliability programs. We will need the ability to make accurate reliability predictions that will enable optimizing cost, performance, and time-to-market to meet the needs of every market segment. We will require that all of these new capabilities be in place prior to the start of a product development cycle. The management of reliability programs will be driven by quantifiable metrics of the value added to the organization's business objectives.

  19. 18 CFR 39.5 - Reliability Standards.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    .... 39.5 Section 39.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT RULES CONCERNING CERTIFICATION OF THE ELECTRIC RELIABILITY ORGANIZATION; AND PROCEDURES FOR THE ESTABLISHMENT, APPROVAL, AND ENFORCEMENT OF ELECTRIC...

  20. Many-Body Perturbation Theory for Understanding Optical Excitations in Organic Molecules and Solids

    NASA Astrophysics Data System (ADS)

    Sharifzadeh, Sahar

    Organic semiconductors are promising as light-weight, flexible, and strongly absorbing materials for next-generation optoelectronics. The advancement of such technologies relies on understanding the fundamental excited-state properties of organic molecules and solids, motivating the development of accurate computational approaches for this purpose. Here, I will present first-principles many-body perturbation theory (MBPT) calculations aimed at understanding the spectroscopic properties of select organic molecules and crystalline semiconductors, and improving these properties for enhanced photovoltaic performance. We show that for both gas-phase molecules and condensed-phase crystals, MBPT within the GW/BSE approximation provides quantitative accuracy of transport gaps extracted from photoemission spectroscopy and conductance measurements, as well as with measured polarization-dependent optical absorption spectra. We discuss the implications of standard approximations within GW/BSE on accuracy of these results. Additionally, we demonstrate significant exciton binding energies and charge-transfer character in the crystalline systems, which can be controlled through solid-state morphology or change of conjugation length, suggesting a new strategy for the design of optoelectronic materials. We acknowledge NSF for financial support; NERSC and Boston University for computational resources.

  1. Challenges for dynamic energy budget theory. Comment on "Physics of metabolic organization" by Marko Jusup et al.

    NASA Astrophysics Data System (ADS)

    Nisbet, Roger M.

    2017-03-01

    Jusup et al. [1] provide a comprehensive review of Dynamic Energy Budget (DEB) theory, a theory of metabolic organization that has its roots in a model by S.A.L.M. Kooijman [2] and has evolved over three decades into a remarkably general theory whose use appears to be growing exponentially. The definitive text on DEB theory [3] is a challenging (though exceptionally rewarding) read, and previous reviews (e.g. [4,5]) have provided focused summaries of some of its main themes, targeted at specific groups of readers. The strong case for a further review is well captured in the abstract: "Hitherto, the foundations were more accessible to physicists or mathematicians, and the applications to biologists, causing a dichotomy in what always should have been a single body of work." In response to this need, Jusup et al. provide a review that combines a lucid, rigorous exposition of the core components of DEB theory with a diverse collection of DEB applications. They also highlight some recent advances, notably the rapidly growing online database of DEB model parameters (451 species on 15 August 2016 according to [1]; now, just a few months later, over 500 species).

  2. CARES/LIFE Ceramics Analysis and Reliability Evaluation of Structures Life Prediction Program

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Powers, Lynn M.; Janosik, Lesley A.; Gyekenyesi, John P.

    2003-01-01

    This manual describes the Ceramics Analysis and Reliability Evaluation of Structures Life Prediction (CARES/LIFE) computer program. The program calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. CARES/LIFE is an extension of the CARES (Ceramic Analysis and Reliability Evaluation of Structures) computer program. The program uses results from MSC/NASTRAN, ABAQUS, and ANSYS finite element analysis programs to evaluate component reliability due to inherent surface and/or volume type flaws. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing the power law, Paris law, or Walker law. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled by using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. The probabilistic time-dependent theories used in CARES/LIFE, along with the input and output for CARES/LIFE, are described. Example problems to demonstrate various features of the program are also included.
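
The strength-scatter core of such an analysis is the two-parameter Weibull model named above. A minimal sketch of the failure-probability calculation for a component under a single uniform stress (function names and values are mine, illustrative only, and omit CARES/LIFE's multiaxial and time-dependent treatments):

```python
import math

def weibull_failure_probability(stress: float, scale: float, modulus: float) -> float:
    """Two-parameter Weibull CDF: probability that a component fails
    at or below the given uniform stress level."""
    return 1.0 - math.exp(-((stress / scale) ** modulus))

def weibull_reliability(stress: float, scale: float, modulus: float) -> float:
    """Survival probability: the complement of the failure probability."""
    return math.exp(-((stress / scale) ** modulus))
```

A higher Weibull modulus means less scatter in strength: at a stress equal to the scale parameter the failure probability is always 1 - 1/e, but below the scale parameter the failure probability falls off much faster for large moduli.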

  3. How does organic matter occurrence set limit onto the use of Ce anomaly as a reliable proxy of redox conditions in shallow groundwaters?

    NASA Astrophysics Data System (ADS)

    Dia, A.; Gruau, G.; Davranche, M.; Vidy, A.; Henin, O.; Petitjean, P.; Le Coz-Bouhnik, M.

    2003-04-01

    This study addresses the effects of organic matter on the hydrochemistry of Rare Earth Elements (REE) and the possibility of using the Ce anomaly as a reliable proxy of redox conditions in surface waters where organic matter occurs. The data include: (i) a two-year survey of ΣREE and Ce anomalies in organic-rich waters recovered from a catchment located in Brittany (western Europe); (ii) experimental incubation of organic soils from this catchment under controlled conditions; and (iii) REE speciation calculations for both the natural organic-rich wetland waters and the experimental solutions. Field and experimental data are highly coherent, displaying good correlation between ΣREE, Dissolved Organic Carbon (DOC) content, and redox state. The field data show a strong increase of the ΣREE and DOC concentrations in soil waters as the environment becomes more reducing. The rise in DOC and ΣREE contents is in phase with the increase of dissolved Fe and Mn. The role of Fe- and Mn-oxyhydroxides is confirmed by the experimental data, as the maximum DOC and ΣREE contents are reached when Fe2+ reaches a maximum in the soil solution, suggesting that reductive dissolution of Fe- and Mn-oxyhydroxides occurs. Despite the strong redox changes and the known redox-sensitive behaviour of Ce compared with the other REE, no variation of the Ce anomaly is observed during either the experiments or the field survey through time. Speciation calculations show that in this pH range, in moderately oxidizing, DOC-rich waters, the REE should be organically complexed. Such organic speciation prevents the formation of Ce(IV) and therefore the development of any Ce anomaly. However, since the studied waters are highly oxidizing (high nitrate contents), the nitrates should impose the formation of Ce(IV), and a Ce anomaly should appear. Therefore, Ce(IV) is not formed in these waters either because (i) the

  4. 18 CFR 39.6 - Conflict of a Reliability Standard with a Commission Order.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT RULES CONCERNING CERTIFICATION OF THE ELECTRIC RELIABILITY ORGANIZATION; AND PROCEDURES FOR THE ESTABLISHMENT, APPROVAL, AND ENFORCEMENT OF ELECTRIC RELIABILITY STANDARDS § 39.6 Conflict of a Reliability Standard with...

  5. Power counting and Wilsonian renormalization in nuclear effective field theory

    NASA Astrophysics Data System (ADS)

    Valderrama, Manuel Pavón

    2016-05-01

    Effective field theories are the most general tool for the description of low energy phenomena. They are universal and systematic: they can be formulated for any low energy systems we can think of and offer a clear guide on how to calculate predictions with reliable error estimates, a feature that is called power counting. These properties can be easily understood in Wilsonian renormalization, in which effective field theories are the low energy renormalization group evolution of a more fundamental — perhaps unknown or unsolvable — high energy theory. In nuclear physics they provide the possibility of a theoretically sound derivation of nuclear forces without having to solve quantum chromodynamics explicitly. However there is the problem of how to organize calculations within nuclear effective field theory: the traditional knowledge about power counting is perturbative but nuclear physics is not. Yet power counting can be derived in Wilsonian renormalization and there is already a fairly good understanding of how to apply these ideas to non-perturbative phenomena and in particular to nuclear physics. Here we review a few of these ideas, explain power counting in two-nucleon scattering and reactions with external probes and hint at how to extend the present analysis beyond the two-body problem.

  6. Reliability Evaluation for Clustered WSNs under Malware Propagation

    PubMed Central

    Shen, Shigen; Huang, Longjun; Liu, Jianhua; Champion, Adam C.; Yu, Shui; Cao, Qiying

    2016-01-01

    We consider a clustered wireless sensor network (WSN) under epidemic-malware propagation conditions and solve the problem of how to evaluate its reliability so as to ensure efficient, continuous, and dependable transmission of sensed data from sensor nodes to the sink. Facing the contradiction between malware intention and continuous-time Markov chain (CTMC) randomness, we introduce a strategic game that can predict malware infection in order to model a successful infection as a CTMC state transition. Next, we devise a novel measure to compute the Mean Time to Failure (MTTF) of a sensor node, which represents the reliability of a sensor node continuously performing tasks such as sensing, transmitting, and fusing data. Since clustered WSNs can be regarded as parallel-serial-parallel systems, the reliability of a clustered WSN can be evaluated via classical reliability theory. Numerical results show the influence of parameters such as the true positive rate and the false positive rate on a sensor node’s MTTF. Furthermore, we validate the method of reliability evaluation for a clustered WSN according to the number of sensor nodes in a cluster, the number of clusters in a route, and the number of routes in the WSN. PMID:27294934
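
The parallel-serial-parallel evaluation described above reduces to two identities from classical reliability theory: a series system requires every element to work, while a parallel system requires at least one to work. A minimal sketch (the topology parameters and numbers are mine, not from the paper; in practice the node reliabilities would come from the MTTF analysis):

```python
from math import prod

def series(reliabilities):
    """A series system works only if every element works."""
    return prod(reliabilities)

def parallel(reliabilities):
    """A parallel system works if at least one element works."""
    return 1.0 - prod(1.0 - r for r in reliabilities)

def clustered_wsn_reliability(node_r, nodes_per_cluster, clusters_per_route, n_routes):
    """Parallel-serial-parallel model of a clustered WSN:
    redundant nodes form a cluster, clusters in series form a route,
    and redundant routes connect the sensing field to the sink."""
    cluster_r = parallel([node_r] * nodes_per_cluster)
    route_r = series([cluster_r] * clusters_per_route)
    return parallel([route_r] * n_routes)
```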

  7. Reliability Evaluation for Clustered WSNs under Malware Propagation.

    PubMed

    Shen, Shigen; Huang, Longjun; Liu, Jianhua; Champion, Adam C; Yu, Shui; Cao, Qiying

    2016-06-10

    We consider a clustered wireless sensor network (WSN) under epidemic-malware propagation conditions and solve the problem of how to evaluate its reliability so as to ensure efficient, continuous, and dependable transmission of sensed data from sensor nodes to the sink. Facing the contradiction between malware intention and continuous-time Markov chain (CTMC) randomness, we introduce a strategic game that can predict malware infection in order to model a successful infection as a CTMC state transition. Next, we devise a novel measure to compute the Mean Time to Failure (MTTF) of a sensor node, which represents the reliability of a sensor node continuously performing tasks such as sensing, transmitting, and fusing data. Since clustered WSNs can be regarded as parallel-serial-parallel systems, the reliability of a clustered WSN can be evaluated via classical reliability theory. Numerical results show the influence of parameters such as the true positive rate and the false positive rate on a sensor node's MTTF. Furthermore, we validate the method of reliability evaluation for a clustered WSN according to the number of sensor nodes in a cluster, the number of clusters in a route, and the number of routes in the WSN.

  8. An Exploratory Study on University Students' Perceptions of Posthumous Organ Donation Based on the Theory of Reasoned Action.

    PubMed

    Wong, Shui Hung; Chow, Amy Yin Man

    2017-08-01

    In view of the general support for organ donation but low registration rate in Hong Kong, the present research attempted to understand the attitude-behavior inconsistency by identifying the underlying beliefs about organ donation through the theory of reasoned action. A qualitative approach using semi-structured focus groups was adopted, and 19 students from three universities in Hong Kong participated; 10 constructs were identified: attitude, subjective norm, helping, continuation, contribution, body intact, distrust of the medical system, indifference to organ donation, negative affect, and family burden. Findings suggested that students' attitudes toward organ donation were of both a cognitive and an affective nature; the subjective norms of family, friends, and people they respect were identified as influential to students' views on organ donation. The study provides insight into promoting organ donation: the cognitive concerns about keeping the body intact, as well as the negative affect involved, should also be addressed.

  9. Barrels, stripes, and fingerprints in the brain - implications for theories of cortical organization.

    PubMed

    Catania, Kenneth C

    2002-01-01

    In the last decade, improvements in the histological processing of cortical tissue, in conjunction with the investigation of additional mammalian species in comparative brain studies, have expanded the information available to guide theories of cortical organization. Here I review some of these recent findings in the somatosensory system, with an emphasis on modules related to specializations of the peripheral sensory surface. The diversity of modular representations, or cortical "isomorphs", suggests that information from the sensory sheet guides many of the features of cortical maps and that cortex is not constrained to form circular units in the form of a traditional cortical column.

  10. Two theories/a sharper lens: the staff nurse voice in the workplace.

    PubMed

    DeMarco, Rosanna

    2002-06-01

    This paper (1) introduces two theoretical frameworks, Silencing the Self and the Framework of Systemic Organization; (2) briefly describes the design and findings of a study exploring spillover in nurses using the two frameworks; and (3) discusses the process and value of theory triangulation when conducting research in the context of complex nursing-systems phenomena where gender, professional work, and gender identity merge. A research study was designed to analyse the actual workplace behaviours of nurses in the context of their lives at work and outside work. An exploration of theoretical frameworks that could direct the measurement of the phenomena in question led to the use of two frameworks, the Framework of Systemic Organization (Friedemann 1995) and the Silencing the Self Theory (Jack 1991), and the creation of a valid and reliable summative rating instrument, the Staff Nurse Workplace Behaviours Scale (SNWBS). A descriptive correlational design was used to measure behaviours between work and home. Statistically significant relationships were found between workplace behaviours, family behaviours, and silencing behaviours as measured by the two separate scales measuring framework concepts. Although the two theories had different origins and philosophical tenets, the findings of the study created an opportunity to integrate the concepts of each and unexpectedly broadened the understanding of spillover for women who are often nurses.

  11. [Mathematical apparatus of the circuit theory in modeling of heat transfer upon extreme heating of an organism].

    PubMed

    2010-01-01

    The mathematical model of heat transfer in whole-body hyperthermia, developed earlier by the author, has been refined using the mathematical apparatus of the circuit theory. The model can be used to calculate the temperature of each organ, which can increase the efficacy and safety of the immersion-convection technique of whole-body hyperthermia.
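
As a rough illustration of the circuit analogy (temperature as voltage, heat flow as current, thermal resistance and capacitance as R and C), a single organ can be treated as one lumped RC element driven by the environment temperature. The sketch below is a generic single-node version with invented parameter values, not the author's multi-organ model:

```python
def simulate_organ_temperature(t_env, t0, r_th, c_th, dt, steps):
    """Explicit Euler integration of the lumped thermal RC equation
    C * dT/dt = (T_env - T) / R for one body compartment."""
    temps = [t0]
    for _ in range(steps):
        temps.append(temps[-1] + dt * (t_env - temps[-1]) / (r_th * c_th))
    return temps
```

With time constant tau = R*C, the compartment temperature approaches the environment temperature exponentially; running the simulation for many time constants recovers the steady state.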

  12. The Reliability and Precision of Total Scores and IRT Estimates as a Function of Polytomous IRT Parameters and Latent Trait Distribution

    ERIC Educational Resources Information Center

    Culpepper, Steven Andrew

    2013-01-01

    A classic topic in the fields of psychometrics and measurement has been the impact of the number of scale categories on test score reliability. This study builds on previous research by further articulating the relationship between item response theory (IRT) and classical test theory (CTT). Equations are presented for comparing the reliability and…
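
One CTT ingredient in such comparisons is the Spearman-Brown prophecy formula, which predicts how reliability changes when a scale is lengthened by a factor k. The sketch below is the textbook formula, not an equation reproduced from this paper:

```python
def spearman_brown(rho_1: float, k: float) -> float:
    """Predicted reliability of a test lengthened by factor k,
    given the reliability rho_1 of the original test."""
    return k * rho_1 / (1.0 + (k - 1.0) * rho_1)
```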

  13. Complexity Theory

    USGS Publications Warehouse

    Lee, William H K.

    2016-01-01

    A complex system consists of many interacting parts, generates new collective behavior through self organization, and adaptively evolves through time. Many theories have been developed to study complex systems, including chaos, fractals, cellular automata, self organization, stochastic processes, turbulence, and genetic algorithms.

  14. Utilizing Generalizability Theory to Investigate the Reliability of the Grades Assigned to Undergraduate Research Papers

    ERIC Educational Resources Information Center

    Gugiu, Mihaiela R.; Gugiu, Paul C.; Baldus, Robert

    2012-01-01

    Background: Educational researchers have long espoused the virtues of writing with regard to student cognitive skills. However, research on the reliability of the grades assigned to written papers reveals a high degree of contradiction, with some researchers concluding that the grades assigned are very reliable whereas others suggesting that they…

  15. Conceptual Models and Theory-Embedded Principles on Effective Schooling.

    ERIC Educational Resources Information Center

    Scheerens, Jaap

    1997-01-01

    Reviews models and theories on effective schooling. Discusses four rationality-based organization theories and a fifth perspective, chaos theory, as applied to organizational functioning. Discusses theory-embedded principles flowing from these theories: proactive structuring, fit, market mechanisms, cybernetics, and self-organization. The…

  16. Some Characteristics of One Type of High Reliability Organization.

    ERIC Educational Resources Information Center

    Roberts, Karlene H.

    1990-01-01

    Attempts to define the organizational processes necessary to safely operate technologically complex organizations. Identifies nuclear-powered aircraft carriers as examples of potentially hazardous organizations with histories of excellent operations. Discusses how carriers deal with components of risk and antecedents to catastrophe cited by Perrow and…

  17. Exposing Students to the Idea that Theories Can Change

    ERIC Educational Resources Information Center

    Hoellwarth, Chance; Moelter, Matthew J.

    2011-01-01

    The scientific method is arguably the most reliable way to understand the physical world, yet this aspect of science is rarely addressed in introductory science courses. Students typically learn about the theory in its final, refined form, and seldom experience the experiment-to-theory cycle that goes into producing the theory. One exception to…

  18. Crystalline-silicon reliability lessons for thin-film modules

    NASA Technical Reports Server (NTRS)

    Ross, Ronald G., Jr.

    1985-01-01

    Key reliability and engineering lessons learned from the 10-year history of the Jet Propulsion Laboratory's Flat-Plate Solar Array Project are presented and analyzed. Particular emphasis is placed on lessons applicable to the evolving new thin-film cell and module technologies and the organizations involved with these technologies. The user-specific demand for reliability is a strong function of the application, its location, and its expected duration. Lessons relative to effective means of specifying reliability are described, and commonly used test requirements are assessed from the standpoint of which are the most troublesome to pass, and which correlate best with field experience. Module design lessons are also summarized, including the significance of the most frequently encountered failure mechanisms and the role of encapsulant and cell reliability in determining module reliability. Lessons pertaining to research, design, and test approaches include the historical role and usefulness of qualification tests and field tests.

  19. The Learning Organization: Theory into Practice.

    ERIC Educational Resources Information Center

    Otala, Matti

    1995-01-01

    Key elements of learning organizations are as follows: understanding strengths, weaknesses, threats, and opportunities; open-book management; streamlined processes; team spirit; lifelong learning and skill recycling; and removing anxiety. A learning organization consists of empowered, motivated people committed to improving continuously. (SK)

  20. Feasibility, reliability, and validity of the Japanese version of the 12-item World Health Organization Disability Assessment Schedule-2 in preoperative patients.

    PubMed

    Ida, Mitsuru; Naito, Yusuke; Tanaka, Yuu; Matsunari, Yasunori; Inoue, Satoki; Kawaguchi, Masahiko

    2017-08-01

    The avoidance of postoperative functional disability is one of the most important concerns of patients facing surgery, but methods to evaluate disability have not been definitively established. The aim of our study was to evaluate the feasibility, reliability, and validity of the Japanese version of the 12-item World Health Organization Disability Assessment Schedule-2 (WHODAS 2.0-J) in preoperative patients. Individuals aged ≥55 years who were scheduled to undergo surgery in a tertiary-care hospital in Japan between April 2016 and September 2016 were eligible for enrolment in the study. All patients were assessed preoperatively using the WHODAS 2.0-J, the 8-Item Short Form (SF-8) questionnaire, and the Tokyo Metropolitan Institute of Gerontology Index (TMIG Index). The feasibility, reliability, and validity of the WHODAS 2.0-J were evaluated using the response rate, Cronbach's alpha (a measure of reliability), and the correlations between the WHODAS 2.0-J and the SF-8 questionnaire and TMIG Index, respectively. A total of 934 patients were enrolled during the study period, of whom 930 completed the WHODAS 2.0-J preoperatively (response rate 99.5%). Reliability and validity were assessed in the 898 patients who completed all three assessment tools (WHODAS 2.0-J, SF-8 questionnaire, and TMIG Index) and for whom all demographic data were available. Cronbach's alpha was 0.92. The total score of the WHODAS 2.0-J showed a mild to moderate correlation with the SF-8 questionnaire and TMIG Index (r = -0.63 to -0.34). The WHODAS 2.0-J is a feasible, reliable, and valid instrument for evaluating preoperative functional disability in surgical patients.

  1. Monte Carlo Approach for Reliability Estimations in Generalizability Studies.

    ERIC Educational Resources Information Center

    Dimitrov, Dimiter M.

    A Monte Carlo approach is proposed, using the Statistical Analysis System (SAS) programming language, for estimating reliability coefficients in generalizability theory studies. Test scores are generated by a probabilistic model that considers the probability for a person with a given ability score to answer an item with a given difficulty…
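
A stripped-down version of such a simulation can be written with the standard library alone. The generating model below (normal abilities plus independent item-level noise, with reliability estimated as the squared correlation between true ability and the observed mean score) is a simplification of the probabilistic model described, and every name in it is mine rather than SAS code from the paper:

```python
import random

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

def simulate_reliability(n_persons=20000, n_items=10, sigma_e=1.0, seed=42):
    """Monte Carlo reliability estimate: generate true abilities, add
    item-level error, and square the true-score/observed-score correlation."""
    rng = random.Random(seed)
    thetas, means = [], []
    for _ in range(n_persons):
        theta = rng.gauss(0.0, 1.0)
        obs = sum(theta + rng.gauss(0.0, sigma_e) for _ in range(n_items)) / n_items
        thetas.append(theta)
        means.append(obs)
    return pearson(thetas, means) ** 2
```

For these parameters the classical expected value is 1 / (1 + sigma_e**2 / n_items) ≈ 0.909, and the Monte Carlo estimate should land close to it.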

  2. Autotrophs' challenge to Dynamic Energy Budget theory: Comment on "Physics of metabolic organization" by Marko Jusup et al.

    NASA Astrophysics Data System (ADS)

    Geček, Sunčana

    2017-03-01

    Jusup and colleagues in the recent review on physics of metabolic organization [1] discuss in detail motivational considerations and common assumptions of Dynamic Energy Budget (DEB) theory, supply readers with a practical guide to DEB-based modeling, demonstrate the construction and dynamics of the standard DEB model, and illustrate several applications. The authors make a step forward from the existing literature by seamlessly bridging over the dichotomy between (i) thermodynamic foundations of the theory (which are often more accessible and understandable to physicists and mathematicians), and (ii) the resulting bioenergetic models (mostly used by biologists in real-world applications).

  3. Organizational theory for dissemination and implementation research.

    PubMed

    Birken, Sarah A; Bunger, Alicia C; Powell, Byron J; Turner, Kea; Clary, Alecia S; Klaman, Stacey L; Yu, Yan; Whitaker, Daniel J; Self, Shannon R; Rostad, Whitney L; Chatham, Jenelle R Shanley; Kirk, M Alexis; Shea, Christopher M; Haines, Emily; Weiner, Bryan J

    2017-05-12

    Even under optimal internal organizational conditions, implementation can be undermined by changes in organizations' external environments, such as fluctuations in funding, adjustments in contracting practices, new technology, new legislation, changes in clinical practice guidelines and recommendations, or other environmental shifts. Internal organizational conditions are increasingly reflected in implementation frameworks, but nuanced explanations of how organizations' external environments influence implementation success are lacking in implementation research. Organizational theories offer implementation researchers a host of existing, highly relevant, and heretofore largely untapped explanations of the complex interaction between organizations and their environment. In this paper, we demonstrate the utility of organizational theories for implementation research. We applied four well-known organizational theories (institutional theory, transaction cost economics, contingency theories, and resource dependency theory) to published descriptions of efforts to implement SafeCare, an evidence-based practice for preventing child abuse and neglect. Transaction cost economics theory explained how frequent, uncertain processes for contracting for SafeCare may have generated inefficiencies and thus compromised implementation among private child welfare organizations. Institutional theory explained how child welfare systems may have been motivated to implement SafeCare because doing so aligned with expectations of key stakeholders within child welfare systems' professional communities. Contingency theories explained how efforts such as interagency collaborative teams promoted SafeCare implementation by facilitating adaptation to child welfare agencies' internal and external contexts. 
Resource dependency theory (RDT) explained how interagency relationships, supported by contracts, memoranda of understanding, and negotiations, facilitated SafeCare implementation by balancing

  4. Organism and artifact: Proper functions in Paley organisms.

    PubMed

    Holm, Sune

    2013-12-01

    In this paper I assess the explanatory powers of theories of function in the context of products that may result from synthetic biology. The aim is not to develop a new theory of functions, but to assess existing theories of function in relation to a new kind of biological and artifactual entity that might be produced in the not-too-distant future by means of synthetic biology. The paper thus investigates how to conceive of the functional nature of living systems that are not the result of evolution by natural selection, or instantly generated by cosmic coincidence, but which are products of intelligent design. The paper argues that the aetiological theory of proper functions in organisms and artifacts is inadequate as an account of proper functions in such 'Paley organisms' and defends an alternative organisational approach. The paper ends by considering the implications of the discussion of biological function for questions about the interests and moral status of non-sentient organisms. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. COGNITRON THEORY,

    DTIC Science & Technology

    (ARTIFICIAL INTELLIGENCE, THEORY), NERVE CELLS, SIMULATION, SENSE ORGANS, SENSES(PHYSIOLOGY), CONDITIONED RESPONSE, MATRICES(MATHEMATICS), MAPPING (TRANSFORMATIONS), MATHEMATICAL MODELS, FEEDBACK, BIONICS

  6. Humic First Theory: A New Theory on the Origin of Life

    NASA Astrophysics Data System (ADS)

    Daei, Mohammad Ali; Daei, Manijeh; Daei, Bijan

    2017-04-01

    In 1953, Miller & Urey demonstrated through a brilliant experiment that the building blocks of life could evolve under primitive-Earth conditions [1]. In recent years scientists have revealed that organic matter is not a rare component of the comets, asteroids, and meteorites that repeatedly bombarded the ancient Earth [2]. Simple organic molecules on the early Earth could therefore have been sufficient to start a chemical evolution that proceeded steadily toward very simple forms of life. Many theories have tried to explain how life emerged from non-life, but have failed, largely for lack of a distinct methodology. There is a huge gap between the simple building blocks, like amino acids, sugars, and lipid molecules, and a living cell with a very sophisticated structure and organization. It is unconvincing to bridge this great distance only by accidental reactions in a passive medium (a primitive soup), even over a very long time. The manufacture of a primitive cell would have required a natural factory with a rather firm and resistant basement, plenty of organic and inorganic raw materials, a qualified production line, and some sources of energy. There was plenty of solar energy and water on the early Earth, but what about the other factors? The availability of essential minerals was not at all guaranteed on a primitive Earth covered with bare, dead rock. Given that we are unable today to multiply any microorganism under the ideal conditions of a modern laboratory in the absence of even one essential nutrient or element, how can we expect primitive cells to have appeared under early-Earth conditions without the support of soluble minerals and organic matter? An ideal production line must be active and protective, have catalyzing ability, provide numerous opportunities for interaction between basic biomolecules (mainly RNA and proteins) and, above all, be capable of reacting with different sources of energy. There is strong evidence that only some form of stable, rich and active

  7. Chemical Applications of Graph Theory: Part II. Isomer Enumeration.

    ERIC Educational Resources Information Center

    Hansen, Peter J.; Jurs, Peter C.

    1988-01-01

    Discusses the use of graph theory to aid in the depiction of organic molecular structures. Gives a historical perspective of graph theory and explains graph theory terminology with organic examples. Lists applications of graph theory to current research projects. (ML)

  8. An Enhanced Backbone-Assisted Reliable Framework for Wireless Sensor Networks

    PubMed Central

    Tufail, Ali; Khayam, Syed Ali; Raza, Muhammad Taqi; Ali, Amna; Kim, Ki-Hyung

    2010-01-01

    Extremely reliable source-to-sink communication is required for most contemporary WSN applications, especially those pertaining to military, healthcare, and disaster-recovery uses. However, due to their intrinsic energy, bandwidth, and computational constraints, Wireless Sensor Networks (WSNs) encounter several challenges in reliable source-to-sink communication. In this paper, we present a novel reliable topology that uses reliable hotlines between sensor gateways to boost the reliability of end-to-end transmissions. This reliable and efficient routing alternative reduces the average number of hops from source to sink. We prove, with the help of analytical evaluation, that communication using hotlines is considerably more reliable than traditional WSN routing. We use reliability theory to analyze the cost and benefit of adding gateway nodes to a backbone-assisted WSN. However, in hotline-assisted routing, scenarios where the source and the sink are just a couple of hops apart might incur more latency; therefore, we present a Signature Based Routing (SBR) scheme. SBR enables the gateways to make intelligent routing decisions, based upon the derived signature, thus providing lower end-to-end delay in source-to-sink communication. Finally, we evaluate our proposed hotline-based topology with the help of a simulation tool and show that it provides a manifold increase in end-to-end reliability. PMID:22294890
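
The cost-benefit argument rests on a basic fact of reliability theory: the end-to-end reliability of a path is the product of its link reliabilities, so a route with fewer, more reliable hotline links beats a long chain of ordinary hops. A minimal sketch (hop counts and link reliabilities are invented for illustration):

```python
from math import prod

def path_reliability(link_reliabilities):
    """End-to-end success probability of a multi-hop path:
    every link along the path must succeed."""
    return prod(link_reliabilities)

# Traditional routing: eight ordinary sensor-to-sensor hops.
multi_hop = path_reliability([0.95] * 8)
# Hotline-assisted routing: two ordinary hops plus one reliable gateway hotline.
hotline_route = path_reliability([0.95, 0.95, 0.995])
```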

  9. Distributed collaborative response surface method for mechanical dynamic assembly reliability design

    NASA Astrophysics Data System (ADS)

    Bai, Guangchen; Fei, Chengwei

    2013-11-01

    Because of the randomness of the many factors influencing the dynamic assembly relationships of complex machinery, the reliability analysis of dynamic assembly relationships needs to account for this randomness from a probabilistic perspective. To improve the accuracy and efficiency of dynamic assembly relationship reliability analysis, mechanical dynamic assembly reliability (MDAR) theory and a distributed collaborative response surface method (DCRSM) are proposed. The mathematical model of the DCRSM is established based on a quadratic response surface function and verified by the assembly relationship reliability analysis of aeroengine high pressure turbine (HPT) blade-tip radial running clearance (BTRRC). A comparison of the DCRSM, the traditional response surface method (RSM), and the Monte Carlo method (MCM) shows that the DCRSM is not only able to accomplish computational tasks that are impossible for the other methods when the number of simulations exceeds 100,000, but its computational precision is also basically consistent with the MCM and improved by 0.40-4.63% relative to the RSM; furthermore, the computational efficiency of the DCRSM is up to about 188 times that of the MCM and 55 times that of the RSM at 10,000 simulations. The DCRSM is demonstrated to be a feasible and effective approach for markedly improving the computational efficiency and accuracy of MDAR analysis. The proposed research thus provides a promising theory and method for MDAR design and optimization, and opens a novel research direction of probabilistic analysis for developing high-performance, high-reliability aeroengines.
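
A single quadratic response surface of the kind the DCRSM decomposes can be fit by ordinary least squares. The sketch below (using NumPy, with an invented two-variable stand-in for an expensive limit-state response) shows only the single-surface building block, not the distributed collaborative decomposition itself:

```python
import numpy as np

def fit_quadratic_surface(x, y, z):
    """Least-squares coefficients of z ~ b0 + b1*x + b2*y + b3*x^2 + b4*y^2 + b5*x*y."""
    basis = np.column_stack([np.ones_like(x), x, y, x ** 2, y ** 2, x * y])
    coef, *_ = np.linalg.lstsq(basis, z, rcond=None)
    return coef

def evaluate_surface(coef, x, y):
    """Cheap surrogate evaluation of the fitted surface at one point."""
    return coef @ np.array([1.0, x, y, x ** 2, y ** 2, x * y])

def expensive_response(x, y):
    # Invented stand-in for a costly finite-element response function.
    return 2.0 + 3.0 * x - y + 0.5 * x ** 2 + x * y

# Sample the response on a small design grid and fit the surrogate.
xs, ys = np.meshgrid(np.linspace(-1.0, 1.0, 5), np.linspace(-1.0, 1.0, 5))
coef = fit_quadratic_surface(xs.ravel(), ys.ravel(),
                             expensive_response(xs.ravel(), ys.ravel()))
```

Because the stand-in response is itself quadratic, the surrogate reproduces it essentially exactly; for a real nonlinear response the fitted surface is only a local approximation around the design points.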

  10. Discovery of fairy circles in Australia supports self-organization theory

    PubMed Central

    Getzin, Stephan; Yizhaq, Hezi; Bell, Bronwyn; Erickson, Todd E.; Postle, Anthony C.; Katra, Itzhak; Tzuk, Omer; Zelnik, Yuval R.; Wiegand, Kerstin; Wiegand, Thorsten; Meron, Ehud

    2016-01-01

    Vegetation gap patterns in arid grasslands, such as the “fairy circles” of Namibia, are one of nature’s greatest mysteries and the subject of a lively debate on their origin. They are characterized by small-scale hexagonal ordering of circular bare-soil gaps that persists uniformly at the landscape scale to form a homogeneous distribution. Pattern-formation theory predicts that such highly ordered gap patterns should also be found in other water-limited systems across the globe, even if the mechanisms of their formation are different. Here we report that previously unknown fairy circles with the same spatial structure exist 10,000 km away from Namibia in the remote outback of Australia. Combining fieldwork, remote sensing, spatial pattern analysis, and process-based mathematical modeling, we demonstrate that these patterns emerge by self-organization, with no correlation with termite activity; the driving mechanism is a positive biomass–water feedback associated with water runoff and biomass-dependent infiltration rates. The remarkable match between the patterns of Australian and Namibian fairy circles and the model results indicates that both patterns emerge from a nonuniform stationary instability, supporting a central universality principle of pattern-formation theory. Applied to the context of dryland vegetation, this principle predicts that different systems that go through the same instability type will show similar vegetation patterns even if the feedback mechanisms and resulting soil–water distributions are different, as we indeed found by comparing the Australian and the Namibian fairy-circle ecosystems. These results suggest that biomass–water feedbacks and resultant vegetation gap patterns are likely more common in remote drylands than is currently known. PMID:26976567

  11. Theory of Work Adjustment Personality Constructs.

    ERIC Educational Resources Information Center

    Lawson, Loralie

    1993-01-01

    To measure Theory of Work Adjustment personality and adjustment style dimensions, content-based scales were analyzed for homogeneity and successively reanalyzed for reliability improvement. Three sound scales were developed: inflexibility, activeness, and reactiveness. (SK)

  12. Validity and reliability of four language mapping paradigms.

    PubMed

    Wilson, Stephen M; Bautista, Alexa; Yen, Melodie; Lauderdale, Stefanie; Eriksson, Dana K

    2017-01-01

    Language areas of the brain can be mapped in individual participants with functional MRI. We investigated the validity and reliability of four language mapping paradigms that may be appropriate for individuals with acquired aphasia: sentence completion, picture naming, naturalistic comprehension, and narrative comprehension. Five neurologically normal older adults were scanned on each of the four paradigms on four separate occasions. Validity was assessed in terms of whether activation patterns reflected the known typical organization of language regions, that is, lateralization to the left hemisphere, and involvement of the left inferior frontal gyrus and the left middle and/or superior temporal gyri. Reliability (test-retest reproducibility) was quantified in terms of the Dice coefficient of similarity, which measures overlap of activations across time points. We explored the impact of different absolute and relative voxelwise thresholds, a range of cluster size cutoffs, and limitation of analyses to a priori potential language regions. We found that the narrative comprehension and sentence completion paradigms offered the best balance of validity and reliability. However, even with optimal combinations of analysis parameters, there were many scans on which known features of typical language organization were not demonstrated, and test-retest reproducibility was only moderate for realistic parameter choices. These limitations in terms of validity and reliability may constitute significant limitations for many clinical or research applications that depend on identifying language regions in individual participants.
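    The Dice coefficient used here to quantify test-retest reproducibility is straightforward to compute. A minimal sketch over two hypothetical sets of suprathreshold voxel indices (the indices are illustrative, not the study's data):

```python
def dice_coefficient(mask_a, mask_b):
    """Dice similarity between two binary activation masks,
    given as collections of suprathreshold voxel indices."""
    a, b = set(mask_a), set(mask_b)
    if not a and not b:
        return 1.0  # two empty masks overlap perfectly by convention
    return 2 * len(a & b) / (len(a) + len(b))

# Hypothetical voxels surviving threshold at two scan sessions
session1 = [101, 102, 103, 104, 105]
session2 = [103, 104, 105, 106]
print(round(dice_coefficient(session1, session2), 3))  # → 0.667
```

    A value of 1 indicates identical activation maps across sessions; 0 indicates no overlap at all.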

  13. Modified personal interviews: resurrecting reliable personal interviews for admissions?

    PubMed

    Hanson, Mark D; Kulasegaram, Kulamakan Mahan; Woods, Nicole N; Fechtig, Lindsey; Anderson, Geoff

    2012-10-01

    Traditional admissions personal interviews provide flexible faculty-student interactions but are plagued by low inter-interview reliability. Axelson and Kreiter (2009) retrospectively showed that multiple independent sampling (MIS) may improve the reliability of personal interviews; thus, the authors incorporated MIS into the admissions process for medical students applying to the University of Toronto's Leadership Education and Development Program (LEAD). They examined the reliability and resource demands of this modified personal interview (MPI) format. In 2010-2011, LEAD candidates submitted written applications, which were used to screen for participation in the MPI process. Selected candidates completed four brief (10- to 12-minute) independent MPIs, each with a different interviewer. The authors blueprinted MPI questions to (i.e., aligned them with) leadership attributes, and interviewers assessed candidates' eligibility on a five-point Likert-type scale. The authors analyzed inter-interview reliability using generalizability theory. Sixteen candidates submitted applications; 10 proceeded to the MPI stage. Reliability of the written application components was 0.75. The MPI process had an overall inter-interview reliability of 0.79. Correlation between the written application and MPI scores was 0.49. A decision study showed acceptable reliability of 0.74 with only three MPIs scored using one global rating. Furthermore, a traditional admissions interview format would take 66% more time than the MPI format. The MPI format, used during the LEAD admissions process, achieved high reliability with minimal faculty resources. The MPI format's reliability and effective resource use were possible through MIS and employment of expert interviewers. MPIs may be useful for other admissions tasks.
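    In a one-facet design like this, the decision-study projection of reliability for the mean over k interviews reduces to the Spearman-Brown prophecy formula. A sketch, using a hypothetical single-MPI reliability chosen only to be consistent with the reported 0.74 (three MPIs) and 0.79 (four MPIs); the study did not report this value:

```python
def spearman_brown(single_reliability, k):
    """Projected reliability of the mean score over k parallel interviews."""
    r = single_reliability
    return k * r / (1 + (k - 1) * r)

r1 = 0.487  # hypothetical reliability of a single MPI (inferred, not reported)
print(round(spearman_brown(r1, 3), 2))  # → 0.74
print(round(spearman_brown(r1, 4), 2))  # → 0.79
```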

  14. Assessing Reliability of Student Ratings of Advisor: A Comparison of Univariate and Multivariate Generalizability Approaches.

    ERIC Educational Resources Information Center

    Sun, Anji; Valiga, Michael J.

    In this study, the reliability of the American College Testing (ACT) Program's "Survey of Academic Advising" (SAA) was examined using both univariate and multivariate generalizability theory approaches. The primary purpose of the study was to compare the results of three generalizability theory models (a random univariate model, a mixed…

  15. Reliability analysis of airship remote sensing system

    NASA Astrophysics Data System (ADS)

    Qin, Jun

    1998-08-01

    Airship Remote Sensing System (ARSS), used to obtain dynamic or real-time images in remote sensing of catastrophes and the environment, is a complex mixed system. Its sensor platform is a remote-controlled airship. The achievement of a remote sensing mission depends on a series of factors, so it is very important to analyze the reliability of ARSS. First, the system model was simplified from a multi-state system to a two-state system on the basis of the results of the failure mode and effects analysis and the failure mode, effects and criticality analysis. The failure tree was created after analyzing all factors and their interrelations. This failure tree includes four branches: the engine subsystem, the remote control subsystem, the airship construction subsystem, and the flight meteorology and climate subsystem. By way of failure tree analysis and classification of basic events, the weak links were discovered. The results of test runs showed no difference in comparison with the theoretical analysis. In accordance with the above conclusions, plans for reliability growth and reliability maintenance were proposed. The system's reliability was raised from 89 percent to 92 percent through the reformation of the man-machine interactive interface and the addition of secondary battery-group and secondary remote control equipment.
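    For a two-state series model like the one described, subsystem reliabilities multiply: the system succeeds only if every failure tree branch survives. A sketch with illustrative subsystem values (not the paper's figures):

```python
def series_reliability(reliabilities):
    """Reliability of a series system: the product of its subsystem reliabilities."""
    r = 1.0
    for ri in reliabilities:
        r *= ri
    return r

# Illustrative reliabilities for the four failure tree branches
subsystems = {"engine": 0.98, "remote_control": 0.97,
              "airship_structure": 0.99, "meteorology_climate": 0.95}
print(round(series_reliability(subsystems.values()), 3))  # → 0.894
```

    Improving the weakest branch yields the largest system-level gain, which is the rationale for targeting the weak links identified by the failure tree analysis.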

  16. Interpreting Variance Components as Evidence for Reliability and Validity.

    ERIC Educational Resources Information Center

    Kane, Michael T.

    The reliability and validity of measurement is analyzed by a sampling model based on generalizability theory. A model for the relationship between a measurement procedure and an attribute is developed from an analysis of how measurements are used and interpreted in science. The model provides a basis for analyzing the concept of an error of…

  17. Advancing methods for reliably assessing motivational interviewing fidelity using the Motivational Interviewing Skills Code

    PubMed Central

    Lord, Sarah Peregrine; Can, Doğan; Yi, Michael; Marin, Rebeca; Dunn, Christopher W.; Imel, Zac E.; Georgiou, Panayiotis; Narayanan, Shrikanth; Steyvers, Mark; Atkins, David C.

    2014-01-01

    The current paper presents novel methods for collecting MISC data and accurately assessing reliability of behavior codes at the level of the utterance. The MISC 2.1 was used to rate MI interviews from five randomized trials targeting alcohol and drug use. Sessions were coded at the utterance-level. Utterance-based coding reliability was estimated using three methods and compared to traditional reliability estimates of session tallies. Session-level reliability was generally higher compared to reliability using utterance-based codes, suggesting that typical methods for MISC reliability may be biased. These novel methods in MI fidelity data collection and reliability assessment provided rich data for therapist feedback and further analyses. Beyond implications for fidelity coding, utterance-level coding schemes may elucidate important elements in the counselor-client interaction that could inform theories of change and the practice of MI. PMID:25242192

  18. Reliability Abstracts and Technical Reviews January - December 1970. Volume 10, Nos. 1-12; R70-14805 - R70-15438

    NASA Technical Reports Server (NTRS)

    1970-01-01

    Reliability Abstracts and Technical Reviews is an abstract and critical analysis service covering published and report literature on reliability. The service is designed to provide information on theory and practice of reliability as applied to aerospace and an objective appraisal of the quality, significance, and applicability of the literature abstracted.

  19. Did Geomagnetic Activity Challenge Electric Power Reliability During Solar Cycle 23? Evidence from the PJM Regional Transmission Organization in North America

    NASA Technical Reports Server (NTRS)

    Forbes, Kevin F.; Cyr, Chris St

    2012-01-01

    During solar cycle 22, a very intense geomagnetic storm on 13 March 1989 contributed to the collapse of the Hydro-Quebec power system in Canada. This event clearly demonstrated that geomagnetic storms have the potential to lead to blackouts. This paper addresses whether geomagnetic activity challenged power system reliability during solar cycle 23. Operations by PJM Interconnection, LLC (hereafter PJM), a regional transmission organization in North America, are examined over the period 1 April 2002 through 30 April 2004. During this time PJM coordinated the movement of wholesale electricity in all or parts of Delaware, Maryland, New Jersey, Ohio, Pennsylvania, Virginia, West Virginia, and the District of Columbia in the United States. We examine the relationship between a proxy of geomagnetically induced currents (GICs) and a metric of challenged reliability. In this study, GICs are proxied using magnetometer data from a geomagnetic observatory located just outside the PJM control area. The metric of challenged reliability is the incidence of out-of-economic-merit order dispatching due to adverse reactive power conditions. The statistical methods employed make it possible to disentangle the effects of GICs on power system operations from purely terrestrial factors. The results of the analysis indicate that geomagnetic activity can significantly increase the likelihood that the system operator will dispatch generating units based on system stability considerations rather than economic merit.

  20. Optimization of life support systems and their systems reliability

    NASA Technical Reports Server (NTRS)

    Fan, L. T.; Hwang, C. L.; Erickson, L. E.

    1971-01-01

    The identification, analysis, and optimization of life support systems and subsystems have been investigated. For each system or subsystem considered, the procedure involves the establishment of a set of system equations (or a mathematical model) based on theory and experimental evidence; the analysis and simulation of the model; the optimization of the operation, control, and reliability; analysis of the sensitivity of the system based on the model; and, if possible, experimental verification of the theoretical and computational results. Research activities include: (1) modeling of air flow in a confined space; (2) review of several different gas-liquid contactors utilizing centrifugal force; (3) review of carbon dioxide reduction contactors in space vehicles and other enclosed structures; (4) application of modern optimal control theory to environmental control of confined spaces; (5) optimal control of a class of nonlinear diffusional distributed parameter systems; (6) optimization of the system reliability of life support systems and subsystems; (7) modeling, simulation, and optimal control of the human thermal system; and (8) analysis and optimization of the water-vapor electrolysis cell.

  1. Purposeful Program Theory: Effective Use of Theories of Change and Logic Models

    ERIC Educational Resources Information Center

    Funnell, Sue C.; Rogers, Patricia J.

    2011-01-01

    Between good intentions and great results lies a program theory--not just a list of tasks but a vision of what needs to happen, and how. Now widely used in government and not-for-profit organizations, program theory provides a coherent picture of how change occurs and how to improve performance. "Purposeful Program Theory" shows how to develop,…

  2. Motivation in later life: theory and assessment.

    PubMed

    Vallerand, R J; O'Connor, B P; Hamel, M

    1995-01-01

    A framework that has been found useful in research on young adults, Deci and Ryan's self-determination theory [1, 2], is suggested as a promising direction for research on motivation in later life. The theory proposes the existence of four types of motivation (intrinsic, self-determined extrinsic, nonself-determined extrinsic, and amotivation) which are assumed to have varying consequences for adaptation and well-being. A previously published French measure of motivational styles which is known to be reliable and valid was translated into English and was tested on seventy-seven nursing home residents (aged 60 to 98 years). It was found that the four motivational styles can be reliably measured; that the intercorrelations between the motivational styles are consistent with theoretical predictions; and that the four types of motivation are related to other important aspects of the lives of elderly people in a theoretically meaningful manner. Suggestions are made for further research using self-determination theory and the present scales.

  3. Reliability analysis of the Chinese version of the Functional Assessment of Cancer Therapy - Leukemia (FACT-Leu) scale based on multivariate generalizability theory.

    PubMed

    Meng, Qiong; Yang, Zheng; Wu, Yang; Xiao, Yuanyuan; Gu, Xuezhong; Zhang, Meixia; Wan, Chonghua; Li, Xiaosong

    2017-05-04

    The Functional Assessment of Cancer Therapy-Leukemia (FACT-Leu) scale, a leukemia-specific instrument for determining the health-related quality of life (HRQOL) in patients with leukemia, had been developed and validated, but there have been no reports on the development of a simplified Chinese version of this scale. This is a new exploration to analyze the reliability of the HRQOL measurement using multivariate generalizability theory (MGT). This study aimed to develop a Chinese version of the FACT-Leu scale and evaluate its reliability using MGT to provide evidence to support the revision and improvement of this scale. The Chinese version of the FACT-Leu scale was developed in four steps: forward translation, backward translation, cultural adaptation and pilot-testing. HRQOL was measured for eligible inpatients with leukemia using this scale to provide data. A single-facet multivariate Generalizability Study (G-study) design was used to estimate the variance-covariance components, and then several Decision Studies (D-studies) with varying numbers of items were analyzed to obtain reliability coefficients and to understand how much the measurement reliability could vary as the number of items changes. One-hundred and one eligible inpatients diagnosed with leukemia were recruited and completed the HRQOL measurement at the time of admission to the hospital. In the G-study, the variance component of the patient-item interaction was largest while the variance component of the item was the smallest for four of the five domains, except for the leukemia-specific (LEUS) domain. In the D-study, at the level of domain, the generalizability coefficients (G) and the indexes of dependability (Ф) for four of the five domains were approximately equal to or greater than 0.80, except for the Emotional Well-being (EWB) domain (>0.70 but <0.80). For the overall scale, the composite G and composite Ф coefficients were greater than 0.90. Based on the G
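    For a single-facet person x item design, the D-study coefficients reported here have closed forms: the generalizability coefficient G charges only relative-error variance, while the index of dependability Ф also charges the item variance. A univariate sketch with hypothetical variance components (the study's estimates are multivariate and not reproduced here):

```python
def g_coefficient(var_person, var_person_item, n_items):
    """Generalizability coefficient for relative decisions, p x i design."""
    return var_person / (var_person + var_person_item / n_items)

def phi_coefficient(var_person, var_item, var_person_item, n_items):
    """Index of dependability for absolute decisions, p x i design."""
    return var_person / (var_person + (var_item + var_person_item) / n_items)

# Hypothetical variance components for one domain
vp, vi, vpi = 0.40, 0.05, 0.60
for n in (4, 7, 10):  # candidate numbers of items in the D-study
    print(n, round(g_coefficient(vp, vpi, n), 2),
          round(phi_coefficient(vp, vi, vpi, n), 2))
```

    Because Ф includes the item variance in its error term, Ф is never larger than G for the same design, matching the usual pattern in D-study tables.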

  4. Organizing safety: conditions for successful information assurance programs.

    PubMed

    Collmann, Jeff; Coleman, Johnathan; Sostrom, Kristen; Wright, Willie

    2004-01-01

    Organizations must continuously seek safety. When considering computerized health information systems, "safety" includes protecting the integrity, confidentiality, and availability of information assets such as patient information, key components of the technical information system, and critical personnel. "High Reliability Theory" (HRT) argues that organizations with strong leadership support, continuous training, redundant safety mechanisms, and "cultures of high reliability" can deploy and safely manage complex, risky technologies such as nuclear weapons systems or computerized health information systems. In preparation for the Health Insurance Portability and Accountability Act (HIPAA) of 1996, the Office of the Assistant Secretary of Defense (Health Affairs), the Offices of the Surgeons General of the United States Army, Navy and Air Force, and the Telemedicine and Advanced Technology Research Center (TATRC), US Army Medical Research and Materiel Command sponsored organizational, doctrinal, and technical projects that individually and collectively promote conditions for a "culture of information assurance." These efforts include sponsoring the "P3 Working Group" (P3WG), an interdisciplinary, tri-service taskforce that reviewed all relevant Department of Defense (DoD), Military Health System (MHS), Army, Navy and Air Force policies for compliance with the HIPAA medical privacy and data security regulations; supporting development, training, and deployment of OCTAVE(sm), a self-directed information security risk assessment process; and sponsoring development of the Risk Information Management Resource (RIMR), a Web-enabled enterprise portal about health information assurance.

  5. Reliability Evaluation of Machine Center Components Based on Cascading Failure Analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Ying-Zhi; Liu, Jin-Tong; Shen, Gui-Xiang; Long, Zhe; Sun, Shu-Guang

    2017-07-01

    In order to rectify the problems that the component reliability model exhibits deviation and that the evaluation result is low because failure propagation is overlooked in traditional reliability evaluation of machine center components, a new reliability evaluation method based on cascading failure analysis and failure influence degree assessment is proposed. A directed graph model of cascading failure among components is established according to cascading failure mechanism analysis and graph theory. The failure influence degrees of the system components are assessed by the adjacency matrix and its transpose, combined with the PageRank algorithm. Based on the comprehensive failure probability function and the total probability formula, the inherent failure probability function is determined to realize the reliability evaluation of the system components. Finally, the method is applied to a machine center, and it shows the following: 1) The reliability evaluation values of the proposed method are at least 2.5% higher than those of the traditional method; 2) The difference between the comprehensive and inherent reliability of a system component presents a positive correlation with the failure influence degree of that component, which provides a theoretical basis for reliability allocation of the machine center system.
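    The failure influence scoring described here can be illustrated with a plain power-iteration PageRank over a small propagation graph; the adjacency matrix below is hypothetical, and the damping factor 0.85 is the conventional default rather than a value from the paper:

```python
def pagerank(adj, d=0.85, tol=1e-10):
    """Power-iteration PageRank on an adjacency matrix, where
    adj[i][j] = 1 means failure of component i propagates to component j."""
    n = len(adj)
    out_degree = [sum(row) or 1 for row in adj]  # guard dangling nodes
    rank = [1.0 / n] * n
    while True:
        new = [(1 - d) / n + d * sum(adj[i][j] * rank[i] / out_degree[i]
                                     for i in range(n))
               for j in range(n)]
        if max(abs(a - b) for a, b in zip(new, rank)) < tol:
            return new
        rank = new

# Hypothetical propagation graph: component 0 affects 1 and 2; 1 affects 2
adj = [[0, 1, 1],
       [0, 0, 1],
       [0, 0, 0]]
ranks = pagerank(adj)  # component 2 ends up the most failure-influenced
```

    Ranking on the adjacency matrix scores how strongly a component is influenced by propagated failures; running the same algorithm on the transpose scores how strongly it influences others, matching the paper's use of both matrices.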

  6. Theories of autism.

    PubMed

    Levy, Florence

    2007-11-01

    The purpose of the present paper was to review psychological theories of autism, and to integrate these theories with neurobiological findings. Cognitive, theory of mind, language and coherence theories were identified, and briefly reviewed. Psychological theories were found not to account for the rigid/repetitive behaviours universally described in autistic subjects, and underlying neurobiological systems were identified. When the developing brain encounters constrained connectivity, it evolves an abnormal organization, the features of which may be best explained by a developmental failure of neural connectivity, where high local connectivity develops in tandem with low long-range connectivity, resulting in constricted repetitive behaviours.

  7. Business of reliability

    NASA Astrophysics Data System (ADS)

    Engel, Pierre

    1999-12-01

    The presentation is organized around three themes: (1) The decrease in reception equipment costs allows non-remote-sensing organizations to access a technology until recently reserved to a scientific elite. What this means is the rise of 'operational' executive agencies considering space-based technology and operations as a viable input to their daily tasks. This is possible thanks to totally dedicated ground receiving entities focusing on one application for themselves, rather than serving a vast community of users. (2) The multiplication of earth observation platforms will form the base for reliable technical and financial solutions. One obstacle to the growth of the earth observation industry is the variety of policies (commercial versus non-commercial) ruling the distribution of data and value-added products. In particular, the high volume of data sales required for a return on investment conflicts with the traditionally low-volume data use of most applications. Constant access to data sources presupposes monitoring needs as well as technical proficiency. (3) Large-volume use of data coupled with low equipment costs is only possible when the technology has proven reliable, in terms of application results, financial risks, and data supply. Each of these factors is reviewed. The expectation is that international cooperation between agencies and private ventures will pave the way for future business models. As an illustration, the presentation proposes to use some recent non-traditional monitoring applications that may lead to significant use of earth observation data, value-added products, and services: flood monitoring, ship detection, marine oil pollution deterrent systems, and rice acreage monitoring.

  8. Using Multivariate Generalizability Theory to Assess the Effect of Content Stratification on the Reliability of a Performance Assessment

    ERIC Educational Resources Information Center

    Keller, Lisa A.; Clauser, Brian E.; Swanson, David B.

    2010-01-01

    In recent years, demand for performance assessments has continued to grow. However, performance assessments are notorious for lower reliability, and in particular, low reliability resulting from task specificity. Since reliability analyses typically treat the performance tasks as randomly sampled from an infinite universe of tasks, these estimates…

  9. An Investment Level Decision Method to Secure Long-term Reliability

    NASA Astrophysics Data System (ADS)

    Bamba, Satoshi; Yabe, Kuniaki; Seki, Tomomichi; Shibaya, Tetsuji

    The slowdown in power demand growth and facility replacement causes aging and lower reliability in power facilities, and this aging will be followed by a rapid increase in repair and replacement when many facilities reach their lifetimes. This paper describes a method to estimate future repair and replacement costs by applying a life-cycle cost model and renewal theory to historical data. This paper also describes a method to decide the optimum investment plan, which replaces facilities in order of cost-effectiveness by setting a replacement priority formula, and the minimum investment level needed to maintain reliability. Estimation examples applied to substation facilities show that a reasonable and leveled future cash-out can maintain reliability by lowering the percentage of replacements caused by fatal failures.
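    The renewal theory step can be sketched with the discrete renewal equation, which turns a lifetime distribution into an expected replacement rate per year; the lifetime distribution below is hypothetical:

```python
def renewal_density(failure_pmf, horizon):
    """Expected replacements per unit in each year t, via the discrete
    renewal equation u(t) = f(t) + sum_{k<t} f(k) * u(t - k)."""
    u = [0.0] * (horizon + 1)
    for t in range(1, horizon + 1):
        u[t] = failure_pmf.get(t, 0.0) + sum(
            failure_pmf.get(k, 0.0) * u[t - k] for k in range(1, t))
    return u

# Hypothetical lifetimes: half the units fail at year 2, half at year 3
u = renewal_density({2: 0.5, 3: 0.5}, 30)
# The long-run replacement rate approaches 1 / mean lifetime = 1 / 2.5 = 0.4
```

    Multiplying the yearly rate by the unit replacement cost and fleet size gives the future cash-out profile that the investment-leveling step works from.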

  10. Military Equal Opportunity Climate Survey: Reliability, Construct Validity, and Preliminary Field Test

    DTIC Science & Technology

    1990-01-10

    reason for the fairly low reliability of the fourth and fifth MEOCS factors), issues of sexism and more subtle forms of racism have come to the fore...psychological climate (for which the individual is the unit for theory ). One approach, described by Glick, would use the intraclass correlation from a...and outcome measures are forced to remain obscure. A major flaw in the measurement of organizational climate is the lack of theory which would serve

  11. Quantitative metal magnetic memory reliability modeling for welded joints

    NASA Astrophysics Data System (ADS)

    Xing, Haiyan; Dang, Yongbin; Wang, Ben; Leng, Jiancheng

    2016-03-01

    Metal magnetic memory (MMM) testing has been widely used to detect welded joints. However, load levels, the environmental magnetic field, and measurement noise make MMM data dispersive and bring difficulty to quantitative evaluation. In order to promote the development of quantitative MMM reliability assessment, a new MMM model is presented for welded joints. Steel Q235 welded specimens are tested along longitudinal and horizontal lines by a TSC-2M-8 instrument in tensile fatigue experiments. X-ray testing is carried out synchronously to verify the MMM results. It is found that MMM testing can detect hidden cracks earlier than X-ray testing. Moreover, the MMM gradient vector sum K_vs is sensitive to the damage degree, especially at the early and hidden damage stages. Considering the dispersion of MMM data, the statistical law of K_vs is investigated, which shows that K_vs obeys a Gaussian distribution. K_vs is therefore a suitable MMM parameter for establishing a reliability model of welded joints. Finally, a quantitative MMM reliability model is presented for the first time, based on improved stress-strength interference theory. It is shown that the reliability degree R gradually decreases as the residual life ratio T decreases, and the maximal error between the predicted reliability degree R_1 and the verification reliability degree R_2 is 9.15%. The presented method provides a novel tool for reliability testing and evaluation of welded joints in practical engineering.
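    Classical stress-strength interference, which the paper's improved model builds on, gives reliability in closed form when stress and strength are independent Gaussians. A sketch with hypothetical means and standard deviations (the paper's K_vs-based quantities are not reproduced here):

```python
import math

def interference_reliability(mu_strength, sd_strength, mu_stress, sd_stress):
    """R = P(strength > stress) for independent Gaussian strength and stress."""
    z = (mu_strength - mu_stress) / math.hypot(sd_strength, sd_stress)
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))  # standard normal CDF at z

# Hypothetical: strength ~ N(600, 40^2), stress ~ N(450, 30^2), in MPa
print(round(interference_reliability(600, 40, 450, 30), 4))  # → 0.9987
```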

  12. Advancing methods for reliably assessing motivational interviewing fidelity using the motivational interviewing skills code.

    PubMed

    Lord, Sarah Peregrine; Can, Doğan; Yi, Michael; Marin, Rebeca; Dunn, Christopher W; Imel, Zac E; Georgiou, Panayiotis; Narayanan, Shrikanth; Steyvers, Mark; Atkins, David C

    2015-02-01

    The current paper presents novel methods for collecting MISC data and accurately assessing reliability of behavior codes at the level of the utterance. The MISC 2.1 was used to rate MI interviews from five randomized trials targeting alcohol and drug use. Sessions were coded at the utterance-level. Utterance-based coding reliability was estimated using three methods and compared to traditional reliability estimates of session tallies. Session-level reliability was generally higher compared to reliability using utterance-based codes, suggesting that typical methods for MISC reliability may be biased. These novel methods in MI fidelity data collection and reliability assessment provided rich data for therapist feedback and further analyses. Beyond implications for fidelity coding, utterance-level coding schemes may elucidate important elements in the counselor-client interaction that could inform theories of change and the practice of MI. Copyright © 2015 Elsevier Inc. All rights reserved.
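    A common chance-corrected statistic for utterance-level agreement between two coders is Cohen's kappa (one of several options; the paper's three estimation methods are not specified in this abstract). A minimal sketch on hypothetical utterance labels:

```python
from collections import Counter

def cohens_kappa(coder1, coder2):
    """Chance-corrected agreement between two coders' utterance-level labels."""
    assert len(coder1) == len(coder2)
    n = len(coder1)
    observed = sum(a == b for a, b in zip(coder1, coder2)) / n
    c1, c2 = Counter(coder1), Counter(coder2)
    # Expected agreement under independent coding with each coder's base rates
    expected = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical behavior codes for four utterances from two coders
print(cohens_kappa(["REF", "REF", "QUO", "QUO"],
                   ["REF", "QUO", "QUO", "QUO"]))  # → 0.5
```

    Kappa of 1 means perfect agreement; 0 means agreement no better than chance, which is why it is preferred over raw percent agreement for coding reliability.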

  13. Nursing Services Delivery Theory: an open system approach.

    PubMed

    Meyer, Raquel M; O'Brien-Pallas, Linda L

    2010-12-01

    This paper is a discussion of the derivation of the Nursing Services Delivery Theory from the application of open system theory to large-scale organizations. The underlying mechanisms by which staffing indicators influence outcomes remain under-theorized and unmeasured, resulting in a 'black box' that masks the nature and organization of nursing work. Theory linking nursing work, staffing, work environments, and outcomes in different settings is urgently needed to inform management decisions about the allocation of nurse staffing resources in organizations. A search of CINAHL and Business Source Premier for the years 1980-2008 was conducted using the following terms: theory, models, organization, organizational structure, management, administration, nursing units, and nursing. Seminal works were included. The healthcare organization is conceptualized as an open system characterized by energy transformation, a dynamic steady state, negative entropy, event cycles, negative feedback, differentiation, integration and coordination, and equifinality. The Nursing Services Delivery Theory proposes that input, throughput, and output factors interact dynamically to influence the global work demands placed on nursing work groups at the point of care in production subsystems. The Nursing Services Delivery Theory can be applied to varied settings, cultures, and countries and supports the study of multi-level phenomena and cross-level effects. The Nursing Services Delivery Theory gives a relational structure for reconciling disparate streams of research related to nursing work, staffing, and work environments. The theory can guide future research and the management of nursing services in large-scale healthcare organizations. © 2010 Blackwell Publishing Ltd.

  14. Constructing a Grounded Theory of E-Learning Assessment

    ERIC Educational Resources Information Center

    Alonso-Díaz, Laura; Yuste-Tosina, Rocío

    2015-01-01

    This study traces the development of a grounded theory of assessment in e-learning environments, a field in need of research to establish the parameters of an assessment that is both reliable and worthy of higher learning accreditation. Using grounded theory as a research method, we studied an e-assessment model that does not require physical…

  15. Integrating High-Reliability Principles to Transform Access and Throughput by Creating a Centralized Operations Center.

    PubMed

    Davenport, Paul B; Carter, Kimberly F; Echternach, Jeffrey M; Tuck, Christopher R

    2018-02-01

    High-reliability organizations (HROs) demonstrate unique and consistent characteristics, including operational sensitivity and control, situational awareness, hyperacute use of technology and data, and actionable process transformation. System complexity and reliance on information-based processes challenge healthcare organizations to replicate HRO processes. This article describes a healthcare organization's 3-year journey to achieve key HRO features to deliver high-quality, patient-centric care via an operations center powered by the principles of high-reliability data and software to impact patient throughput and flow.

  16. Competing risk models in reliability systems, a weibull distribution model with bayesian analysis approach

    NASA Astrophysics Data System (ADS)

    Iskandar, Ismed; Satria Gondokaryono, Yudi

    2016-02-01

    In reliability theory, the most important problem is to determine the reliability of a complex system from the reliability of its components. The weakness of most reliability theories is that systems are described and explained as simply functioning or failed. In many real situations, failures may arise from many causes, depending upon the age and the environment of the system and its components. Another problem in reliability theory is that of estimating the parameters of the assumed failure models. The estimation may be based on data collected over censored or uncensored life tests. In many reliability problems, the failure data are simply quantitatively inadequate, especially in engineering design and maintenance systems. Bayesian analyses are more beneficial than classical ones in such cases. Bayesian estimation analyses allow us to combine past knowledge or experience, in the form of an a priori distribution, with life test data to make inferences about the parameter of interest. In this paper, we have investigated the application of Bayesian estimation analyses to competing risk systems. The cases are limited to models with independent causes of failure, using the Weibull distribution as our model. A simulation is conducted for this distribution with the objectives of verifying the models and the estimators and investigating the performance of the estimators for varying sample sizes. The simulation data are analyzed using Bayesian and maximum likelihood analyses. The simulation results show that a change in the true value of one parameter relative to another will change the standard deviation in the opposite direction. For perfect information on the prior distribution, the estimation methods of the Bayesian analyses are better than those of the maximum likelihood. The sensitivity analyses show some amount of sensitivity over the shifts of the prior locations. They also show the robustness of the Bayesian analysis within the range
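    Under the independence assumption used in the paper, competing Weibull risks combine by multiplying cause-specific survival functions. A sketch with hypothetical shape/scale parameters:

```python
import math

def weibull_reliability(t, shape, scale):
    """Weibull survival function R(t) = exp(-(t/scale)^shape)."""
    return math.exp(-((t / scale) ** shape))

def competing_risk_reliability(t, causes):
    """Independent competing risks: the system survives time t only if
    every failure cause 'survives' t, so cause-specific reliabilities multiply."""
    r = 1.0
    for shape, scale in causes:
        r *= weibull_reliability(t, shape, scale)
    return r

# Hypothetical causes: wear-out (shape > 1) and random shocks (shape = 1)
causes = [(2.0, 1000.0), (1.0, 5000.0)]
print(round(competing_risk_reliability(500.0, causes), 4))  # → 0.7047
```

    A shape parameter above 1 models an increasing hazard (wear-out), equal to 1 a constant hazard, and below 1 a decreasing hazard (infant mortality), which is why the Weibull family suits competing-cause models.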

  17. "High Stage" Organizing.

    ERIC Educational Resources Information Center

    Torbert, William R.

    Although a psychological theory of stages of transformation in human development currently exists, organizational researchers have yet to elaborate and test any theory of organizational transformation of comparable elegance. According to the organizational stage theory being developed since 1974 by William Torbert, bureaucratic organization, which…

  18. Using Ontological Engineering to Organize Learning/Instructional Theories and Build a Theory-Aware Authoring System

    ERIC Educational Resources Information Center

    Hayashi, Yusuke; Bourdeau, Jacqueline; Mizoguchi, Riichiro

    2009-01-01

    This paper describes the achievements of an innovative eight-year research program first introduced in Mizoguchi and Bourdeau (2000), which was aimed at building a theory-aware authoring system by using ontological engineering. To date, we have proposed OMNIBUS, an ontology that comprehensively covers different learning/instructional theories and…

  19. A new theory for X-ray diffraction

    PubMed Central

    Fewster, Paul F.

    2014-01-01

    This article proposes a new theory of X-ray scattering that has particular relevance to powder diffraction. The underlying concept of this theory is that the scattering from a crystal or crystallite is distributed throughout space: this leads to the effect that enhanced scatter can be observed at the ‘Bragg position’ even if the ‘Bragg condition’ is not satisfied. The scatter from a single crystal or crystallite, in any fixed orientation, has the fascinating property of contributing simultaneously to many ‘Bragg positions’. It also explains why diffraction peaks are obtained from samples with very few crystallites, which cannot be explained with the conventional theory. The intensity ratios for an Si powder sample are predicted with greater accuracy and the temperature factors are more realistic. Another consequence is that this new theory predicts a reliability in the intensity measurements which agrees much more closely with experimental observations compared to conventional theory that is based on ‘Bragg-type’ scatter. The role of dynamical effects (extinction etc.) is discussed and how they are suppressed with diffuse scattering. An alternative explanation for the Lorentz factor is presented that is more general and based on the capture volume in diffraction space. This theory, when applied to the scattering from powders, will evaluate the full scattering profile, including peak widths and the ‘background’. The theory should provide an increased understanding of the reliability of powder diffraction measurements, and may also have wider implications for the analysis of powder diffraction data, by increasing the accuracy of intensities predicted from structural models. PMID:24815975

  20. A new theory for X-ray diffraction.

    PubMed

    Fewster, Paul F

    2014-05-01

    This article proposes a new theory of X-ray scattering that has particular relevance to powder diffraction. The underlying concept of this theory is that the scattering from a crystal or crystallite is distributed throughout space: this leads to the effect that enhanced scatter can be observed at the `Bragg position' even if the `Bragg condition' is not satisfied. The scatter from a single crystal or crystallite, in any fixed orientation, has the fascinating property of contributing simultaneously to many `Bragg positions'. It also explains why diffraction peaks are obtained from samples with very few crystallites, which cannot be explained with the conventional theory. The intensity ratios for an Si powder sample are predicted with greater accuracy and the temperature factors are more realistic. Another consequence is that this new theory predicts a reliability in the intensity measurements which agrees much more closely with experimental observations compared to conventional theory that is based on `Bragg-type' scatter. The role of dynamical effects (extinction etc.) is discussed and how they are suppressed with diffuse scattering. An alternative explanation for the Lorentz factor is presented that is more general and based on the capture volume in diffraction space. This theory, when applied to the scattering from powders, will evaluate the full scattering profile, including peak widths and the `background'. The theory should provide an increased understanding of the reliability of powder diffraction measurements, and may also have wider implications for the analysis of powder diffraction data, by increasing the accuracy of intensities predicted from structural models.

  1. Describing the Climate of Student Organizations: The Student Organization Environment Scales.

    ERIC Educational Resources Information Center

    Winston, Roger B., Jr.; Bledsoe, Tyrone; Goldstein, Adam R.; Wisbey, Martha E.; Street, James L.; Brown, Steven R.; Goyen, Kenneth D.; Rounds, Linda E.

    1997-01-01

    Using M. R. Weisbord's model of organizational diagnosis, researchers developed the Student Organization Environment Scales to measure students' perceptions of the psychosocial environment or climate of college student organizations. Development of the instrument is described and estimates of its reliability and validity are reported. Describes…

  2. Reliability and Validity of the Sexual Pressure Scale for Women-Revised

    PubMed Central

    Jones, Rachel; Gulick, Elsie

    2008-01-01

    Sexual pressure among young urban women represents adherence to gender stereotypical expectations to engage in sex. Revision of the original 5-factor Sexual Pressure Scale was undertaken in two studies to improve reliabilities in two of the five factors. In Study 1 the reliability of the Sexual Pressure Scale for Women-Revised (SPSW-R) was tested, and principal components analysis was performed in a sample of 325 young, urban women. A parsimonious 18-item, 4-factor model explained 61% of the variance. In Study 2 the theory underlying sexual pressure was supported by confirmatory factor analysis using structural equation modeling in a sample of 181 women. Reliabilities of the SPSW-R total and subscales were very satisfactory, suggesting it may be used in intervention research. PMID:18666222

  3. A Bayesian approach to estimating variance components within a multivariate generalizability theory framework.

    PubMed

    Jiang, Zhehan; Skorupski, William

    2017-12-12

    In many behavioral research areas, multivariate generalizability theory (mG theory) has been typically used to investigate the reliability of certain multidimensional assessments. However, traditional mG-theory estimation-namely, using frequentist approaches-has limits, leading researchers to fail to take full advantage of the information that mG theory can offer regarding the reliability of measurements. Alternatively, Bayesian methods provide more information than frequentist approaches can offer. This article presents instructional guidelines on how to implement mG-theory analyses in a Bayesian framework; in particular, BUGS code is presented to fit commonly seen designs from mG theory, including single-facet designs, two-facet crossed designs, and two-facet nested designs. In addition to concrete examples that are closely related to the selected designs and the corresponding BUGS code, a simulated dataset is provided to demonstrate the utility and advantages of the Bayesian approach. This article is intended to serve as a tutorial reference for applied researchers and methodologists conducting mG-theory studies.
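    The single-facet (persons × items) design mentioned above can be illustrated without BUGS. The sketch below is a frequentist analogue (expected-mean-squares estimation on simulated data with assumed variance components), not the article's Bayesian code, but it shows what the variance components and the generalizability coefficient are.

    ```python
    import random

    random.seed(1)

    # Single-facet (persons x items) G-study on hypothetical simulated
    # scores with known variance components.
    n_p, n_i = 100, 8
    var_p_true, var_i_true, var_e_true = 1.0, 0.25, 0.5

    p_eff = [random.gauss(0, var_p_true ** 0.5) for _ in range(n_p)]
    i_eff = [random.gauss(0, var_i_true ** 0.5) for _ in range(n_i)]
    X = [[p_eff[p] + i_eff[i] + random.gauss(0, var_e_true ** 0.5)
          for i in range(n_i)] for p in range(n_p)]

    grand = sum(map(sum, X)) / (n_p * n_i)
    p_mean = [sum(row) / n_i for row in X]
    i_mean = [sum(X[p][i] for p in range(n_p)) / n_p for i in range(n_i)]

    ss_p = n_i * sum((m - grand) ** 2 for m in p_mean)
    ss_i = n_p * sum((m - grand) ** 2 for m in i_mean)
    ss_tot = sum((X[p][i] - grand) ** 2
                 for p in range(n_p) for i in range(n_i))
    ss_res = ss_tot - ss_p - ss_i

    ms_p = ss_p / (n_p - 1)
    ms_i = ss_i / (n_i - 1)
    ms_res = ss_res / ((n_p - 1) * (n_i - 1))

    # Variance components from expected mean squares
    var_e = ms_res
    var_p = (ms_p - ms_res) / n_i
    var_i = (ms_i - ms_res) / n_p
    # Generalizability (relative) coefficient for a mean over n_i items
    g_coef = var_p / (var_p + var_e / n_i)
    print(f"var_p={var_p:.2f} var_i={var_i:.2f} "
          f"var_e={var_e:.2f} G={g_coef:.2f}")
    ```

    A Bayesian version would place priors on the three variance components and sample their joint posterior, which is what the article's BUGS code does for these designs.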

  4. Collection-limited theory interprets the extraordinary response of single semiconductor organic solar cells

    PubMed Central

    Ray, Biswajit; Baradwaj, Aditya G.; Khan, Mohammad Ryyan; Boudouris, Bryan W.; Alam, Muhammad Ashraful

    2015-01-01

    The bulk heterojunction (BHJ) organic photovoltaic (OPV) architecture has dominated the literature due to its ability to be implemented in devices with relatively high efficiency values. However, a simpler device architecture based on a single organic semiconductor (SS-OPV) offers several advantages: it obviates the need to control the highly system-dependent nanoscale BHJ morphology, and therefore, would allow the use of broader range of organic semiconductors. Unfortunately, the photocurrent in standard SS-OPV devices is typically very low, which generally is attributed to inefficient charge separation of the photogenerated excitons. Here we show that the short-circuit current density from SS-OPV devices can be enhanced significantly (∼100-fold) through the use of inverted device configurations, relative to a standard OPV device architecture. This result suggests that charge generation may not be the performance bottleneck in OPV device operation. Instead, poor charge collection, caused by defect-induced electric field screening, is most likely the primary performance bottleneck in regular-geometry SS-OPV cells. We justify this hypothesis by: (i) detailed numerical simulations, (ii) electrical characterization experiments of functional SS-OPV devices using multiple polymers as active layer materials, and (iii) impedance spectroscopy measurements. Furthermore, we show that the collection-limited photocurrent theory consistently interprets typical characteristics of regular SS-OPV devices. These insights should encourage the design and OPV implementation of high-purity, high-mobility polymers, and other soft materials that have shown promise in organic field-effect transistor applications, but have not performed well in BHJ OPV devices, wherein they adopt less-than-ideal nanostructures when blended with electron-accepting materials. PMID:26290582

  5. Collection-limited theory interprets the extraordinary response of single semiconductor organic solar cells

    DOE PAGES

    Ray, Biswajit; Baradwaj, Aditya G.; Khan, Mohammad Ryyan; ...

    2015-08-19

    The bulk heterojunction (BHJ) organic photovoltaic (OPV) architecture has dominated the literature due to its ability to be implemented in devices with relatively high efficiency values. However, a simpler device architecture based on a single organic semiconductor (SS-OPV) offers several advantages: it obviates the need to control the highly system-dependent nanoscale BHJ morphology, and therefore, would allow the use of broader range of organic semiconductors. Unfortunately, the photocurrent in standard SS-OPV devices is typically very low, which generally is attributed to inefficient charge separation of the photogenerated excitons. In this paper, we show that the short-circuit current density from SS-OPV devices can be enhanced significantly (~100-fold) through the use of inverted device configurations, relative to a standard OPV device architecture. This result suggests that charge generation may not be the performance bottleneck in OPV device operation. Instead, poor charge collection, caused by defect-induced electric field screening, is most likely the primary performance bottleneck in regular-geometry SS-OPV cells. We justify this hypothesis by: (i) detailed numerical simulations, (ii) electrical characterization experiments of functional SS-OPV devices using multiple polymers as active layer materials, and (iii) impedance spectroscopy measurements. Furthermore, we show that the collection-limited photocurrent theory consistently interprets typical characteristics of regular SS-OPV devices. Finally, these insights should encourage the design and OPV implementation of high-purity, high-mobility polymers, and other soft materials that have shown promise in organic field-effect transistor applications, but have not performed well in BHJ OPV devices, wherein they adopt less-than-ideal nanostructures when blended with electron-accepting materials.

  6. Collection-limited theory interprets the extraordinary response of single semiconductor organic solar cells.

    PubMed

    Ray, Biswajit; Baradwaj, Aditya G; Khan, Mohammad Ryyan; Boudouris, Bryan W; Alam, Muhammad Ashraful

    2015-09-08

    The bulk heterojunction (BHJ) organic photovoltaic (OPV) architecture has dominated the literature due to its ability to be implemented in devices with relatively high efficiency values. However, a simpler device architecture based on a single organic semiconductor (SS-OPV) offers several advantages: it obviates the need to control the highly system-dependent nanoscale BHJ morphology, and therefore, would allow the use of broader range of organic semiconductors. Unfortunately, the photocurrent in standard SS-OPV devices is typically very low, which generally is attributed to inefficient charge separation of the photogenerated excitons. Here we show that the short-circuit current density from SS-OPV devices can be enhanced significantly (∼100-fold) through the use of inverted device configurations, relative to a standard OPV device architecture. This result suggests that charge generation may not be the performance bottleneck in OPV device operation. Instead, poor charge collection, caused by defect-induced electric field screening, is most likely the primary performance bottleneck in regular-geometry SS-OPV cells. We justify this hypothesis by: (i) detailed numerical simulations, (ii) electrical characterization experiments of functional SS-OPV devices using multiple polymers as active layer materials, and (iii) impedance spectroscopy measurements. Furthermore, we show that the collection-limited photocurrent theory consistently interprets typical characteristics of regular SS-OPV devices. These insights should encourage the design and OPV implementation of high-purity, high-mobility polymers, and other soft materials that have shown promise in organic field-effect transistor applications, but have not performed well in BHJ OPV devices, wherein they adopt less-than-ideal nanostructures when blended with electron-accepting materials.

  7. Improving patient safety: patient-focused, high-reliability team training.

    PubMed

    McKeon, Leslie M; Cunningham, Patricia D; Oswaks, Jill S Detty

    2009-01-01

    Healthcare systems are recognizing "human factor" flaws that result in adverse outcomes. Nurses work around system failures, although increasing healthcare complexity makes this harder to do without risk of error. Aviation and military organizations achieve ultrasafe outcomes through high-reliability practice. We describe how reliability principles were used to teach nurses to improve patient safety at the front line of care. Outcomes include safety-oriented, teamwork communication competency; reflections on safety culture and clinical leadership are discussed.

  8. Biological atomism and cell theory.

    PubMed

    Nicholson, Daniel J

    2010-09-01

    Biological atomism postulates that all life is composed of elementary and indivisible vital units. The activity of a living organism is thus conceived as the result of the activities and interactions of its elementary constituents, each of which individually already exhibits all the attributes proper to life. This paper surveys some of the key episodes in the history of biological atomism, and situates cell theory within this tradition. The atomistic foundations of cell theory are subsequently dissected and discussed, together with the theory's conceptual development and eventual consolidation. This paper then examines the major criticisms that have been waged against cell theory, and argues that these too can be interpreted through the prism of biological atomism as attempts to relocate the true biological atom away from the cell to a level of organization above or below it. Overall, biological atomism provides a useful perspective through which to examine the history and philosophy of cell theory, and it also opens up a new way of thinking about the epistemic decomposition of living organisms that significantly departs from the physicochemical reductionism of mechanistic biology. Copyright © 2010 Elsevier Ltd. All rights reserved.

  9. Theories of Levels in Organizational Science.

    ERIC Educational Resources Information Center

    Rousseau, Denise M.

    This paper presents concepts and principles pertinent to the development of cross-level and multilevel theory in organizational science by addressing a number of fundamental theoretical issues. It describes hierarchy theory, systems theory, and mixed-level models of organization developed by organizational scientists. Hierarchy theory derives from…

  10. Construction of a memory battery for computerized administration, using item response theory.

    PubMed

    Ferreira, Aristides I; Almeida, Leandro S; Prieto, Gerardo

    2012-10-01

    In accordance with Item Response Theory, a computer memory battery with six tests was constructed for use in the Portuguese adult population. A factor analysis was conducted to assess the internal structure of the tests (N = 547 undergraduate students). According to the literature, several confirmatory factor models were evaluated. Results showed better fit of a model with two independent latent variables corresponding to verbal and non-verbal factors, reproducing the initial battery organization. Internal consistency reliabilities for the six tests were alpha = .72 to .89. IRT analyses (Rasch and partial credit models) yielded good Infit and Outfit measures and high precision for parameter estimation. The potential utility of these memory tasks for psychological research and practice will be discussed.
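    A minimal sketch of the Rasch model underlying such analyses (all ability and difficulty values below are hypothetical):

    ```python
    import math

    # Rasch model: probability that a person with ability theta answers
    # an item of difficulty b correctly (both on the same logit scale).
    def rasch_p(theta: float, b: float) -> float:
        return 1.0 / (1.0 + math.exp(-(theta - b)))

    # When ability equals difficulty the probability is exactly 0.5;
    # higher ability (or an easier item) pushes it toward 1.
    print(rasch_p(0.0, 0.0))            # -> 0.5
    print(round(rasch_p(1.0, 0.0), 3))  # -> 0.731
    ```

    Item calibration then amounts to finding the theta and b values that best explain the observed response matrix, with Infit/Outfit statistics checking how well each person and item conform to this curve.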

  11. Overview of Management Theory

    DTIC Science & Technology

    1991-02-01

    theory orients command leadership for the enormous task of managing organizations in our environment fraught with volatility, uncertainty...performance and organizational ethics...the best way to manage in their theory of managerial leadership. To them, the 9,9 position on their model, "is acknowledged by managers as the

  12. Nursing Services Delivery Theory: an open system approach

    PubMed Central

    Meyer, Raquel M; O’Brien-Pallas, Linda L

    2010-01-01

    Meyer R.M. & O’Brien-Pallas L.L. (2010) Nursing services delivery theory: an open system approach. Journal of Advanced Nursing 66(12), 2828–2838. Aim This paper is a discussion of the derivation of the Nursing Services Delivery Theory from the application of open system theory to large-scale organizations. Background The underlying mechanisms by which staffing indicators influence outcomes remain under-theorized and unmeasured, resulting in a ‘black box’ that masks the nature and organization of nursing work. Theory linking nursing work, staffing, work environments, and outcomes in different settings is urgently needed to inform management decisions about the allocation of nurse staffing resources in organizations. Data sources A search of CINAHL and Business Source Premier for the years 1980–2008 was conducted using the following terms: theory, models, organization, organizational structure, management, administration, nursing units, and nursing. Seminal works were included. Discussion The healthcare organization is conceptualized as an open system characterized by energy transformation, a dynamic steady state, negative entropy, event cycles, negative feedback, differentiation, integration and coordination, and equifinality. The Nursing Services Delivery Theory proposes that input, throughput, and output factors interact dynamically to influence the global work demands placed on nursing work groups at the point of care in production subsystems. Implications for nursing The Nursing Services Delivery Theory can be applied to varied settings, cultures, and countries and supports the study of multi-level phenomena and cross-level effects. Conclusion The Nursing Services Delivery Theory gives a relational structure for reconciling disparate streams of research related to nursing work, staffing, and work environments. The theory can guide future research and the management of nursing services in large-scale healthcare organizations. PMID:20831573

  13. Field-induced metal-insulator transition in a two-dimensional organic superconductor.

    PubMed

    Wosnitza, J; Wanka, S; Hagel, J; Löhneysen, H; Qualls, J S; Brooks, J S; Balthes, E; Schlueter, J A; Geiser, U; Mohtasham, J; Winter, R W; Gard, G L

    2001-01-15

    The quasi-two-dimensional organic superconductor beta"-(BEDT-TTF)2SF5CH2CF2SO3 (Tc approximately 4.4 K) shows very strong Shubnikov-de Haas (SdH) oscillations which are superimposed on a highly anomalous steady background magnetoresistance, Rb. Comparison with de Haas-van Alphen oscillations allows a reliable estimate of Rb which is crucial for the correct extraction of the SdH signal. At low temperatures and high magnetic fields insulating behavior evolves. The magnetoresistance data violate Kohler's rule, i.e., cannot be described within the framework of semiclassical transport theory, but converge onto a universal curve appropriate for dynamical scaling at a metal-insulator transition.

  14. Can the second order multireference perturbation theory be considered a reliable tool to study mixed-valence compounds?

    PubMed

    Pastore, Mariachiara; Helal, Wissam; Evangelisti, Stefano; Leininger, Thierry; Malrieu, Jean-Paul; Maynau, Daniel; Angeli, Celestino; Cimiraglia, Renzo

    2008-05-07

    In this paper, the problem of the calculation of the electronic structure of mixed-valence compounds is addressed in the frame of multireference perturbation theory (MRPT). Using a simple mixed-valence compound (the 5,5′(4H,4H′)-spirobi[cyclopenta[c]pyrrole] 2,2′,6,6′ tetrahydro cation) and the n-electron valence state perturbation theory (NEVPT2) and CASPT2 approaches, it is shown that the ground state (GS) energy curve presents an unphysical "well" for nuclear coordinates close to the symmetric case, where a maximum is expected. For NEVPT, the correct shape of the energy curve is retrieved by applying the MRPT at the (computationally expensive) third order. This behavior is rationalized using a simple model (the ionized GS of two weakly interacting identical systems, each neutral system being described by two electrons in two orbitals), showing that the unphysical well is due to the canonical orbital energies which at the symmetric (delocalized) conformation lead to a sudden modification of the denominators in the perturbation expansion. In this model, the bias introduced in the second order correction to the energy is almost entirely removed going to the third order. With the results of the model in mind, one can predict that all MRPT methods in which the zero order Hamiltonian is based on canonical orbital energies are prone to present unreasonable energy profiles close to the symmetric situation. However, the model allows a strategy to be devised which can give a correct behavior even at the second order, by simply averaging the orbital energies of the two charge-localized electronic states. Such a strategy is adopted in a NEVPT2 scheme obtaining a good agreement with the third order results based on the canonical orbital energies. The answer to the question reported in the title (is this theoretical approach a reliable tool for a correct description of these systems?) is therefore positive, but care must be exercised, either in defining the orbital

  15. The Process by Which Black Male College Students Become Leaders of Predominantly White Organizations in Higher Education: A Grounded Theory

    ERIC Educational Resources Information Center

    Moschella, Eric J.

    2013-01-01

    This study sought to understand the process by which Black undergraduate men on predominately White college campuses become leaders of predominately White organizations. Using the theoretical frameworks of Black and White racial identity development (Helms, 1990), Critical Race Theory (Delgado & Stefancic, 2001), and Wijeyesinghe's (2001)…

  16. Noninteractive macroscopic reliability model for whisker-reinforced ceramic composites

    NASA Technical Reports Server (NTRS)

    Duffy, Stephen F.; Arnold, Steven M.

    1990-01-01

    Considerable research is underway in the field of material science focusing on incorporating silicon carbide whiskers into silicon nitride and alumina matrices. These composites show the requisite thermal stability and thermal shock resistance necessary for use as components in advanced gas turbines and heat exchangers. This paper presents a macroscopic noninteractive reliability model for whisker-reinforced ceramic composites. The theory is multiaxial and is applicable to composites that can be characterized as transversely isotropic. Enough processing data exists to suggest this idealization encompasses a significantly large class of fabricated components. A qualitative assessment of the model is made by presenting reliability surfaces in several different stress spaces and for different values of model parameters.

  17. Pattern activation/recognition theory of mind.

    PubMed

    du Castel, Bertrand

    2015-01-01

    In his 2012 book How to Create a Mind, Ray Kurzweil defines a "Pattern Recognition Theory of Mind" that states that the brain uses millions of pattern recognizers, plus modules to check, organize, and augment them. In this article, I further the theory to go beyond pattern recognition and include also pattern activation, thus encompassing both sensory and motor functions. In addition, I treat checking, organizing, and augmentation as patterns of patterns instead of separate modules, therefore handling them the same as patterns in general. Henceforth I put forward a unified theory I call "Pattern Activation/Recognition Theory of Mind." While the original theory was based on hierarchical hidden Markov models, this evolution is based on their precursor: stochastic grammars. I demonstrate that a class of self-describing stochastic grammars allows for unifying pattern activation, recognition, organization, consistency checking, metaphor, and learning, into a single theory that expresses patterns throughout. I have implemented the model as a probabilistic programming language specialized in activation/recognition grammatical and neural operations. I use this prototype to compute and present diagrams for each stochastic grammar and corresponding neural circuit. I then discuss the theory as it relates to artificial network developments, common coding, neural reuse, and unity of mind, concluding by proposing potential paths to validation.

  18. Applying Learning Theories and Instructional Design Models for Effective Instruction

    ERIC Educational Resources Information Center

    Khalil, Mohammed K.; Elkhider, Ihsan A.

    2016-01-01

    Faculty members in higher education are involved in many instructional design activities without formal training in learning theories and the science of instruction. Learning theories provide the foundation for the selection of instructional strategies and allow for reliable prediction of their effectiveness. To achieve effective learning…

  19. The Leadership of Groups in Organizations

    DTIC Science & Technology

    1985-07-01

    A theory of leadership that focusses specifically on task-performing groups in organizations is proposed. The theory takes a functional approach to leadership, exploring how leaders fulfill functions that are required for group effectiveness...that there are no theories of leadership around. There are theories of managerial leadership, from the classic statements of organization theorists

  20. Organization Theory: Implications for Design.

    ERIC Educational Resources Information Center

    Young, David A.

    1979-01-01

    This paper outlines the possibilities for scientific inquiry into the design of the university organization structure. In a theoretical context, bureaucratic management techniques were not refined enough to apply to university structures until the mid-twentieth century. Universities today are bureaucracies in that they have a formal division of…

  1. Lifetime Reliability Prediction of Ceramic Structures Under Transient Thermomechanical Loads

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Jadaan, Osama J.; Gyekenyesi, John P.

    2005-01-01

    An analytical methodology is developed to predict the probability of survival (reliability) of ceramic components subjected to harsh thermomechanical loads that can vary with time (transient reliability analysis). This capability enables more accurate prediction of ceramic component integrity against fracture in situations such as turbine startup and shutdown, operational vibrations, atmospheric reentry, or other rapid heating or cooling situations (thermal shock). The transient reliability analysis methodology developed herein incorporates the following features: fast-fracture transient analysis (reliability analysis without slow crack growth, SCG); transient analysis with SCG (reliability analysis with time-dependent damage due to SCG); a computationally efficient algorithm to compute the reliability for components subjected to repeated transient loading (block loading); cyclic fatigue modeling using a combined SCG and Walker fatigue law; proof testing for transient loads; and Weibull and fatigue parameters that are allowed to vary with temperature or time. Component-to-component variation in strength (stochastic strength response) is accounted for with the Weibull distribution, and either the principle of independent action or the Batdorf theory is used to predict the effect of multiaxial stresses on reliability. The reliability analysis can be performed either as a function of the component surface (for surface-distributed flaws) or component volume (for volume-distributed flaws). The transient reliability analysis capability has been added to the NASA CARES/ Life (Ceramic Analysis and Reliability Evaluation of Structures/Life) code. CARES/Life was also updated to interface with commercially available finite element analysis software, such as ANSYS, when used to model the effects of transient load histories. Examples are provided to demonstrate the features of the methodology as implemented in the CARES/Life program.
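    The principle of independent action mentioned above can be sketched in a few lines. The Weibull modulus, characteristic strength, and stress values below are hypothetical, and a real CARES/Life analysis integrates the failure risk over the finite element stress field rather than using a single volume term.

    ```python
    import math

    # Principle of Independent Action (PIA): each principal tensile stress
    # contributes an independent Weibull failure risk, so the component
    # survival probabilities multiply (risks add in the exponent).
    m = 10.0         # Weibull modulus (shape), assumed
    sigma_0 = 400.0  # characteristic strength in MPa (scale), assumed
    V = 2.0          # stressed volume relative to the unit volume, assumed

    def survival(principal_stresses):
        """P(survival): sum the Weibull risk of each tensile stress."""
        risk = sum(V * (max(s, 0.0) / sigma_0) ** m
                   for s in principal_stresses)
        return math.exp(-risk)

    uniaxial = survival([300.0, 0.0, 0.0])
    triaxial = survival([300.0, 200.0, 100.0])
    print(f"uniaxial R={uniaxial:.4f}  triaxial R={triaxial:.4f}")
    ```

    Adding tensile components can only lower reliability under PIA, which is why the triaxial case survives with lower probability than the uniaxial one; compressive stresses (clamped to zero here) contribute no risk.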

  2. Increasing Reliability of Direct Observation Measurement Approaches in Emotional and/or Behavioral Disorders Research Using Generalizability Theory

    ERIC Educational Resources Information Center

    Gage, Nicholas A.; Prykanowski, Debra; Hirn, Regina

    2014-01-01

    Reliability of direct observation outcomes ensures the results are consistent, dependable, and trustworthy. Typically, reliability of direct observation measurement approaches is assessed using interobserver agreement (IOA) and the calculation of observer agreement (e.g., percentage of agreement). However, IOA does not address intraobserver…
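    The interval-by-interval interobserver agreement (IOA) calculation that the abstract contrasts with generalizability theory is simple to state; the two binary observation records below are hypothetical.

    ```python
    # Interval-by-interval IOA: two observers record whether a target
    # behavior occurred (1) or not (0) in each of ten intervals.
    obs_a = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
    obs_b = [1, 0, 0, 1, 0, 1, 1, 0, 1, 1]

    agreements = sum(a == b for a, b in zip(obs_a, obs_b))
    ioa = 100.0 * agreements / len(obs_a)
    print(f"IOA = {ioa:.0f}%")  # 8 of 10 intervals agree -> 80%
    ```

    A generalizability-theory analysis would instead partition the score variance into person, observer, occasion, and interaction components, which is the intraobserver information that a single percent-agreement figure cannot capture.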

  3. Work organization and ergonomics.

    PubMed

    Carayon, P; Smith, M J

    2000-12-01

    This paper examines the impact of sociotechnical and business trends on work organization and ergonomics. This analysis is performed with the use of Balance Theory (Smith and Carayon-Sainfort, Int. J. Ind. Ergon. 1989, 4, 67-79). The impact on work organization and the work system of the following sociotechnical and business trends is discussed: re-structuring and re-organizing of companies, new forms of work organization, workforce diversity, and information and communication technology. An expansion of Balance Theory, from the design of work systems to the design of organizations, is discussed. Finally, the issue of change is examined. Several elements and methods are discussed for the design of change processes.

  4. Self-organizing biochemical cycles

    NASA Technical Reports Server (NTRS)

    Orgel, L. E.; Bada, J. L. (Principal Investigator)

    2000-01-01

    I examine the plausibility of theories that postulate the development of complex chemical organization without requiring the replication of genetic polymers such as RNA. One conclusion is that theories that involve the organization of complex, small-molecule metabolic cycles such as the reductive citric acid cycle on mineral surfaces make unreasonable assumptions about the catalytic properties of minerals and the ability of minerals to organize sequences of disparate reactions. Another conclusion is that data in the Beilstein Handbook of Organic Chemistry that have been claimed to support the hypothesis that the reductive citric acid cycle originated as a self-organized cycle can more plausibly be interpreted in a different way.

  5. 18 CFR 39.4 - Funding of the Electric Reliability Organization.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... business plan and organization chart, explaining the proposed collection of all dues, fees and charges and... budget and business plan no later than sixty (60) days in advance of the beginning of the Electric... Organization may include in the application a plan for a transitional funding mechanism that would allow such...

  6. Method and Theory of Intergroups in Organizations.

    DTIC Science & Technology

    1980-10-29

    Relations and Organizational Diagnosis" (T.R. #3) identified the philosophical formulations of clinical methods for studying organizations, reviewed the...earliest clinical studies of organizations that took a group and intergroup perspective, and critiqued existing approaches to organizational diagnosis. This

  7. Can theory of mind deficits be measured reliably in people with mild and moderate Alzheimer's dementia?

    PubMed

    Choong, Caroline Sm; Doody, Gillian A

    2013-01-01

    Patients suffering from Alzheimer's dementia develop difficulties in social functioning. This has led to an interest in the study of "theory of mind" in this population. However, difficulty has arisen because the associated cognitive demands of traditional short story theory of mind assessments result in failure per se in this population, making it challenging to test pure theory of mind ability. Simplified, traditional 1st and 2nd order theory of mind short story tasks and a battery of alternative theory of mind cartoon jokes and control slapstick cartoon jokes, without memory components, were administered to 16 participants with mild-moderate Alzheimer's dementia, and 11 age-matched healthy controls. No significant differences were detected between participants with Alzheimer's dementia and controls on the 1st or 2nd order traditional short story theory of mind tasks (p = 0.155 and p = 0.154 respectively). However, in the cartoon joke tasks there were significant differences in performance between the Alzheimer participants and the control group; this was evident for both theory of mind cartoons and the control 'slapstick' jokes. It remains very difficult to assess theory of mind as an isolated phenomenon in populations with global cognitive impairment, such as Alzheimer's dementia, as the tasks used to assess this cognition invariably depend on other cognitive functions. Although a limitation of this study is the small sample size, the results suggest that there is no measurable specific theory of mind deficit in people with Alzheimer's dementia, and that the use of theory of mind representational models to measure social cognitive ability may not be appropriate in this population.

  8. Time-dependent reliability analysis of ceramic engine components

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.

    1993-01-01

    The computer program CARES/LIFE calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. This program is an extension of the CARES (Ceramics Analysis and Reliability Evaluation of Structures) computer program. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing either the power or Paris law relations. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. Two example problems demonstrating proof testing and fatigue parameter estimation are given.
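    The power-law slow-crack-growth (SCG) relation the abstract mentions is often applied by collapsing a transient stress history into an equivalent static stress that causes the same damage over the same duration. A minimal sketch under that assumption follows; the segment durations, stress levels, and exponent `N` are hypothetical, and this is not the CARES/LIFE code.

```python
def equivalent_static_stress(stress_history, N):
    """Power-law SCG damage accumulation: reduce a piecewise-constant
    stress history sigma(t) to the constant stress producing the same
    slow-crack-growth damage over the same total duration:
    sigma_eq = [ (1/T) * sum(dt_i * sigma_i^N) ]^(1/N).
    stress_history: list of (duration, stress) segments; N: SCG exponent."""
    total_time = sum(dt for dt, _ in stress_history)
    damage_integral = sum(dt * s ** N for dt, s in stress_history)
    return (damage_integral / total_time) ** (1.0 / N)

# Hypothetical startup / steady-state / shutdown block (seconds, MPa)
history = [(10.0, 300.0), (3600.0, 150.0), (10.0, 280.0)]
sigma_eq = equivalent_static_stress(history, N=20)
```

    Because `N` is large for ceramics, brief high-stress transients such as startup dominate the damage integral even though they occupy a small fraction of the block.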

  9. Surface flaw reliability analysis of ceramic components with the SCARE finite element postprocessor program

    NASA Technical Reports Server (NTRS)

    Gyekenyesi, John P.; Nemeth, Noel N.

    1987-01-01

    The SCARE (Structural Ceramics Analysis and Reliability Evaluation) computer program on statistical fast fracture reliability analysis with quadratic elements for volume distributed imperfections is enhanced to include the use of linear finite elements and the capability of designing against concurrent surface flaw induced ceramic component failure. The SCARE code is presently coupled as a postprocessor to the MSC/NASTRAN general purpose, finite element analysis program. The improved version now includes the Weibull and Batdorf statistical failure theories for both surface and volume flaw based reliability analysis. The program uses the two-parameter Weibull fracture strength cumulative failure probability distribution model with the principle of independent action for poly-axial stress states, and Batdorf's shear-sensitive as well as shear-insensitive statistical theories. The shear-sensitive surface crack configurations include the Griffith crack and Griffith notch geometries, using the total critical coplanar strain energy release rate criterion to predict mixed-mode fracture. Weibull material parameters based on both surface and volume flaw induced fracture can also be calculated from modulus of rupture bar tests, using the least squares method with known specimen geometry and grouped fracture data. The statistical fast fracture theories for surface flaw induced failure, along with selected input and output formats and options, are summarized. An example problem to demonstrate various features of the program is included.

  10. School management and contingency theory: an emerging perspective.

    PubMed

    Hanson, E M

    1979-01-01

    In an article written for educational administrators, Hanson explains the assumptions, framework, and application of contingency theory. The author sees contingency theory as a way for organizations to adapt to uncertainty by developing a strategic plan with alternative scenarios. He urges school administrators to join businessmen and public managers in using a technique described as "the most powerful current sweeping over the organizational field." The theory assumes that: (1) a maze of goals governs the development of events; (2) different management approaches may be appropriate within the same organization; and (3) different leadership styles suit different situations. Contingency planning helps the organization to respond to uncertainty in the external environment by identifying possible events that may occur and by preparing alternative strategies to deal with them. Hanson describes the purpose of this process as providing "a more effective match between an organization and its environment." He explains that contingency theory analyzes the internal adjustments of the organization (e.g., decision making process, structure, technology, instructional techniques) as it seeks to meet the shifting demands of its external or internal environments. According to the author, the intent of contingency theory is to establish an optimal "match" between environmental demands (and support) and the response capabilities of the organization including its structure, planning process, and leadership style.

  11. 78 FR 14783 - Citizens Energy Task Force, Save Our Unique Lands (Complainants) v. Midwest Reliability...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-07

    ...) that the Midwest Reliability Organization has neglected its duty to preserve the reliability of the... [email protected] , or call (866) 208-3676 (toll free). For TTY, call (202) 502-8659. Comment Date...

  12. Older males signal more reliably.

    PubMed Central

    Proulx, Stephen R; Day, Troy; Rowe, Locke

    2002-01-01

    The hypothesis that females prefer older males because they have higher mean fitness than younger males has been the centre of recent controversy. These discussions have focused on the success of a female who prefers males of a particular age class when age cues, but not quality cues, are available. Thus, if the distribution of male quality changes with age, such that older males have on average genotypes with higher fitness than younger males, then a female who mates with older males has fitter offspring, which allows the female preference to spread through a genetic correlation. We develop a general model for male display in a species with multiple reproductive bouts that allows us to identify the conditions that promote reliable signalling within an age class. Because males have opportunities for future reproduction, they will reduce their levels of advertising compared with a semelparous species. In addition, because higher-quality males have more future reproduction, they will reduce their advertising more than low-quality males. Thus, the conditions for reliable signalling in a semelparous organism are generally not sufficient to produce reliable signalling in species with multiple reproductive bouts. This result is due to the possibility of future reproduction so that, as individuals age and the opportunities for future reproduction fade, signalling becomes more reliable. This provides a novel rationale for female preference for older mates; older males reveal more information in their sexual displays. PMID:12495495

  13. Interactive Reliability Model for Whisker-toughened Ceramics

    NASA Technical Reports Server (NTRS)

    Palko, Joseph L.

    1993-01-01

    Wider use of ceramic matrix composites (CMC) will require the development of advanced structural analysis technologies. The use of an interactive model to predict the time-independent reliability of a component subjected to multiaxial loads is discussed. The deterministic, three-parameter Willam-Warnke failure criterion serves as the theoretical basis for the reliability model. The strength parameters defining the model are assumed to be random variables, thereby transforming the deterministic failure criterion into a probabilistic criterion. The ability of the model to account for multiaxial stress states with the same unified theory is an improvement over existing models. The new model was coupled with a public-domain finite element program through an integrated design program. This allows a design engineer to predict the probability of failure of a component. A simple structural problem is analyzed using the new model, and the results are compared to existing models.

  14. CARES - CERAMICS ANALYSIS AND RELIABILITY EVALUATION OF STRUCTURES

    NASA Technical Reports Server (NTRS)

    Nemeth, N. N.

    1994-01-01

    The beneficial properties of structural ceramics include their high-temperature strength, light weight, hardness, and corrosion and oxidation resistance. For advanced heat engines, ceramics have demonstrated functional abilities at temperatures well beyond the operational limits of metals. This is offset by the fact that ceramic materials tend to be brittle. When a load is applied, their lack of significant plastic deformation causes the material to crack at microscopic flaws, destroying the component. CARES calculates the fast-fracture reliability or failure probability of macroscopically isotropic ceramic components. These components may be subjected to complex thermomechanical loadings. The program uses results from a commercial structural analysis program (MSC/NASTRAN or ANSYS) to evaluate component reliability due to inherent surface and/or volume type flaws. A multiple material capability allows the finite element model reliability to be a function of many different ceramic material statistical characterizations. The reliability analysis uses element stress, temperature, area, and volume output, which are obtained from two dimensional shell and three dimensional solid isoparametric or axisymmetric finite elements. CARES utilizes the Batdorf model and the two-parameter Weibull cumulative distribution function to describe the effects of multi-axial stress states on material strength. The shear-sensitive Batdorf model requires a user-selected flaw geometry and a mixed-mode fracture criterion. Flaws intersecting the surface and imperfections embedded in the volume can be modeled. The total strain energy release rate theory is used as a mixed mode fracture criterion for co-planar crack extension. Out-of-plane crack extension criteria are approximated by a simple equation with a semi-empirical constant that can model the maximum tangential stress theory, the minimum strain energy density criterion, the maximum strain energy release rate theory, or experimental

  15. Factor Structure and Reliability of Test Items for Saudi Teacher Licence Assessment

    ERIC Educational Resources Information Center

    Alsadaawi, Abdullah Saleh

    2017-01-01

    The Saudi National Assessment Centre administers the Computer Science Teacher Test for teacher certification. The aim of this study is to explore gender differences in candidates' scores, and investigate dimensionality, reliability, and differential item functioning using confirmatory factor analysis and item response theory. The confirmatory…

  16. Naive Theories of Social Groups

    ERIC Educational Resources Information Center

    Rhodes, Marjorie

    2012-01-01

    Four studies examined children's (ages 3-10, Total N = 235) naive theories of social groups, in particular, their expectations about how group memberships constrain social interactions. After introduction to novel groups of people, preschoolers (ages 3-5) reliably expected agents from one group to harm members of the other group (rather than…

  17. Modeling of a bubble-memory organization with self-checking translators to achieve high reliability.

    NASA Technical Reports Server (NTRS)

    Bouricius, W. G.; Carter, W. C.; Hsieh, E. P.; Wadia, A. B.; Jessep, D. C., Jr.

    1973-01-01

    Study of the design and modeling of a highly reliable bubble-memory system that has the capabilities of: (1) correcting a single 16-adjacent bit-group error resulting from failures in a single basic storage module (BSM), and (2) detecting with a probability greater than 0.99 any double errors resulting from failures in BSM's. The results of the study justify the design philosophy adopted of employing memory data encoding and a translator to correct single group errors and detect double group errors to enhance the overall system reliability.
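    The single-group-error correction described above can be illustrated, in greatly simplified form, by XOR parity across storage modules: when the failing BSM is identified, its 16-bit group is recoverable as the XOR of the parity word and the surviving groups. This sketch is only an analogy for the record's encoding scheme, not the actual design; the data words are hypothetical.

```python
def parity_group(groups):
    """Parity word = bitwise XOR of each module's 16-bit group."""
    p = 0
    for g in groups:
        p ^= g
    return p

def reconstruct(groups, parity, failed_index):
    """Recover the group stored in one failed module, given that the
    position of the failure is known (erasure correction)."""
    p = parity
    for i, g in enumerate(groups):
        if i != failed_index:
            p ^= g
    return p

data = [0xBEEF, 0x1234, 0xC0DE, 0x00FF]   # hypothetical 16-bit groups
par = parity_group(data)
recovered = reconstruct(data, par, failed_index=2)
assert recovered == 0xC0DE
```

    Detecting double errors, as in the record's design, requires additional check symbols beyond a single parity word; the translator described in the study combines both roles.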

  18. Structural Reliability Analysis and Optimization: Use of Approximations

    NASA Technical Reports Server (NTRS)

    Grandhi, Ramana V.; Wang, Liping

    1999-01-01

    This report is intended for the demonstration of function approximation concepts and their applicability in reliability analysis and design. Particularly, approximations in the calculation of the safety index, failure probability and structural optimization (modification of design variables) are developed. With this scope in mind, extensive details on probability theory are avoided. Definitions relevant to the stated objectives have been taken from standard text books. The idea of function approximations is to minimize the repetitive use of computationally intensive calculations by replacing them with simpler closed-form equations, which could be nonlinear. Typically, the approximations provide good accuracy around the points where they are constructed, and they need to be periodically updated to extend their utility. There are approximations in calculating the failure probability of a limit state function. The first one, which is most commonly discussed, is how the limit state is approximated at the design point. Most of the time this could be a first-order Taylor series expansion, also known as the First Order Reliability Method (FORM), or a second-order Taylor series expansion (paraboloid), also known as the Second Order Reliability Method (SORM). From the computational procedure point of view, this step comes after the design point identification; however, the order of approximation for the probability of failure calculation is discussed first, and it is denoted by either FORM or SORM. The other approximation of interest is how the design point, or the most probable failure point (MPP), is identified. For iteratively finding this point, again the limit state is approximated. The accuracy and efficiency of the approximations make the search process quite practical for analysis intensive approaches such as the finite element methods; therefore, the crux of this research is to develop excellent approximations for MPP identification and also different
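    For the simplest case, a linear limit state g = R - S with independent normal resistance R and load S, the safety index and first-order failure probability discussed in this record reduce to closed form: beta = (mu_R - mu_S) / sqrt(sd_R^2 + sd_S^2) and P_f = Phi(-beta). A sketch with hypothetical values:

```python
import math

def std_normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def form_linear(mu_r, sd_r, mu_s, sd_s):
    """First Order Reliability Method (FORM) for the linear limit state
    g = R - S with independent normal R (resistance) and S (load):
    returns the safety index beta and the failure probability Phi(-beta).
    For a linear limit state this first-order result is exact."""
    beta = (mu_r - mu_s) / math.hypot(sd_r, sd_s)
    return beta, std_normal_cdf(-beta)

beta, pf = form_linear(mu_r=500.0, sd_r=40.0, mu_s=350.0, sd_s=30.0)
```

    For nonlinear limit states the design point (MPP) must be found iteratively, which is where the approximations the report develops come into play.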

  19. Goal Theory and Individual Productivity.

    ERIC Educational Resources Information Center

    Frost, Peter J.

    The paper provides a review of goal theory as articulated by Edwin Locke. The theory is evaluated in terms of laboratory and field research, and its practical usefulness is explored as a means to improving individual productivity in "real world" organizations. Research findings provide support for some goal theory propositions but suggest also the…

  20. Pattern activation/recognition theory of mind

    PubMed Central

    du Castel, Bertrand

    2015-01-01

    In his 2012 book How to Create a Mind, Ray Kurzweil defines a “Pattern Recognition Theory of Mind” that states that the brain uses millions of pattern recognizers, plus modules to check, organize, and augment them. In this article, I further the theory to go beyond pattern recognition and include also pattern activation, thus encompassing both sensory and motor functions. In addition, I treat checking, organizing, and augmentation as patterns of patterns instead of separate modules, therefore handling them the same as patterns in general. Henceforth I put forward a unified theory I call “Pattern Activation/Recognition Theory of Mind.” While the original theory was based on hierarchical hidden Markov models, this evolution is based on their precursor: stochastic grammars. I demonstrate that a class of self-describing stochastic grammars allows for unifying pattern activation, recognition, organization, consistency checking, metaphor, and learning, into a single theory that expresses patterns throughout. I have implemented the model as a probabilistic programming language specialized in activation/recognition grammatical and neural operations. I use this prototype to compute and present diagrams for each stochastic grammar and corresponding neural circuit. I then discuss the theory as it relates to artificial network developments, common coding, neural reuse, and unity of mind, concluding by proposing potential paths to validation. PMID:26236228

  1. Generalizability Theory Analysis of CBM Maze Reliability in Third- through Fifth-Grade Students

    ERIC Educational Resources Information Center

    Mercer, Sterett H.; Dufrene, Brad A.; Zoder-Martell, Kimberly; Harpole, Lauren Lestremau; Mitchell, Rachel R.; Blaze, John T.

    2012-01-01

    Despite growing use of CBM Maze in universal screening and research, little information is available regarding the number of CBM Maze probes needed for reliable decisions. The current study extends existing research on the technical adequacy of CBM Maze by investigating the number of probes and assessment durations (1-3 min) needed for reliable…

  2. Training and Maintaining System-Wide Reliability in Outcome Management.

    PubMed

    Barwick, Melanie A; Urajnik, Diana J; Moore, Julia E

    2014-01-01

    The Child and Adolescent Functional Assessment Scale (CAFAS) is widely used for outcome management, for providing real-time client and program level data, and for the monitoring of evidence-based practices. Methods of reliability training and the assessment of rater drift are critical for service decision-making within organizations and systems of care. We assessed two approaches for CAFAS training: external technical assistance and internal technical assistance. To this end, we sampled 315 practitioners trained by the external technical assistance approach from 2,344 Ontario practitioners who had achieved reliability on the CAFAS. To assess the internal technical assistance approach as a reliable alternative training method, 140 practitioners trained internally were selected from the same pool of certified raters. Reliabilities were high for practitioners trained by both the external and internal technical assistance approaches (.909-.995 and .915-.997, respectively). One- and three-year estimates showed some drift on several scales. High and consistent reliabilities across time and training methods have implications for CAFAS training of behavioral health care practitioners, and for the maintenance of CAFAS as a global outcome management tool in systems of care.

  3. Functional Organization of the Action Observation Network in Autism: A Graph Theory Approach.

    PubMed

    Alaerts, Kaat; Geerlings, Franca; Herremans, Lynn; Swinnen, Stephan P; Verhoeven, Judith; Sunaert, Stefan; Wenderoth, Nicole

    2015-01-01

    The ability to recognize, understand and interpret others' actions and emotions has been linked to the mirror system or action-observation-network (AON). Although variations in these abilities are prevalent in the neuro-typical population, persons diagnosed with autism spectrum disorders (ASD) have deficits in the social domain and exhibit alterations in this neural network. Here, we examined functional network properties of the AON using graph theory measures and region-to-region functional connectivity analyses of resting-state fMRI-data from adolescents and young adults with ASD and typical controls (TC). Overall, our graph theory analyses provided convergent evidence that the network integrity of the AON is altered in ASD, and that reductions in network efficiency relate to reductions in overall network density (i.e., decreased overall connection strength). Compared to TC, individuals with ASD showed significant reductions in network efficiency and increased shortest path lengths and centrality. Importantly, when adjusting for overall differences in network density between ASD and TC groups, participants with ASD continued to display reductions in network integrity, suggesting that network-level organizational properties of the AON are also altered in ASD. While differences in empirical connectivity contributed to reductions in network integrity, graph theoretical analyses provided indications that changes in the high-level network organization also reduced the integrity of the AON.
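    The shortest-path and efficiency measures used in graph-theoretic studies like this one can be sketched for an unweighted toy graph; global efficiency is the mean of 1/d(i, j) over all ordered node pairs. This is illustrative only, not the study's analysis pipeline.

```python
from collections import deque

def shortest_paths(adj, src):
    """BFS distances from src in an unweighted graph (dict: node -> set)."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def global_efficiency(adj):
    """Mean of 1/d(i, j) over all ordered node pairs; unreachable pairs
    contribute 0. Higher values indicate a better-integrated network."""
    nodes = list(adj)
    n = len(nodes)
    total = 0.0
    for s in nodes:
        dist = shortest_paths(adj, s)
        total += sum(1.0 / d for t, d in dist.items() if t != s)
    return total / (n * (n - 1))

# Toy 4-node path graph a-b-c-d
adj = {"a": {"b"}, "b": {"a", "c"}, "c": {"b", "d"}, "d": {"c"}}
eff = global_efficiency(adj)
```

    In a connectivity study the nodes would be brain regions and edges would be thresholded functional correlations, so reduced connection strength (density) directly lowers efficiency, which is why the authors re-examine the metrics after adjusting for density.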

  4. Functional Organization of the Action Observation Network in Autism: A Graph Theory Approach

    PubMed Central

    Alaerts, Kaat; Geerlings, Franca; Herremans, Lynn; Swinnen, Stephan P.; Verhoeven, Judith; Sunaert, Stefan; Wenderoth, Nicole

    2015-01-01

    Background The ability to recognize, understand and interpret others’ actions and emotions has been linked to the mirror system or action-observation-network (AON). Although variations in these abilities are prevalent in the neuro-typical population, persons diagnosed with autism spectrum disorders (ASD) have deficits in the social domain and exhibit alterations in this neural network. Method Here, we examined functional network properties of the AON using graph theory measures and region-to-region functional connectivity analyses of resting-state fMRI-data from adolescents and young adults with ASD and typical controls (TC). Results Overall, our graph theory analyses provided convergent evidence that the network integrity of the AON is altered in ASD, and that reductions in network efficiency relate to reductions in overall network density (i.e., decreased overall connection strength). Compared to TC, individuals with ASD showed significant reductions in network efficiency and increased shortest path lengths and centrality. Importantly, when adjusting for overall differences in network density between ASD and TC groups, participants with ASD continued to display reductions in network integrity, suggesting that network-level organizational properties of the AON are also altered in ASD. Conclusion While differences in empirical connectivity contributed to reductions in network integrity, graph theoretical analyses provided indications that changes in the high-level network organization also reduced the integrity of the AON. PMID:26317222

  5. Factors that Affect Operational Reliability of Turbojet Engines

    NASA Technical Reports Server (NTRS)

    1956-01-01

    The problem of improving operational reliability of turbojet engines is studied in a series of papers. Failure statistics for this engine are presented, the theory and experimental evidence on how engine failures occur are described, and the methods available for avoiding failure in operation are discussed. The individual papers of the series are Objectives, Failure Statistics, Foreign-Object Damage, Compressor Blades, Combustor Assembly, Nozzle Diaphragms, Turbine Buckets, Turbine Disks, Rolling Contact Bearings, Engine Fuel Controls, and Summary Discussion.

  6. [An examination of "Minamata disease general investigation and research liaison council"--The process of making uncertain the organic mercury causal theory].

    PubMed

    Nakano, Hiroshi

    2010-01-01

    Minamata disease occurred because inhabitants consumed polluted seafood. The official confirmation of Minamata disease was in 1956. However, the material cause of the disease was uncertain at that time. The Minamata Food Poisoning Sub-committee, under authority of the Food Hygiene Investigation Committee of the Ministry of Health and Welfare, determined the material cause of Minamata disease to be a certain kind of organic mercury in 1959. The sub-committee was dissolved after its report. Further discussion of the cause took place in a conference initiated by the Economic Planning Agency, titled "Minamata Disease General Investigation and Research Liaison Council". The participants were eight scientists: four fishery scientists, two chemists, and only two medical scientists, which implied that only examination of the organic mercury was to be discussed. The conference was held four times from 1960 to 1961. In the first and second conferences, the organic mercury research from a medical perspective progressed in cooperation with fishery sciences. In the third conference, it was reported that UCHIDA Makio, professor of Kumamoto University, had found organic mercury crystals in the shellfish found in Minamata Bay. Authorities of biochemistry and medicine in the third conference criticized UCHIDA's research. At the fourth conference, reports contradicting his research were presented. Although those anti-UCHIDA reports were not verified, AKAHORI Shiro, the highest authority of biochemistry, not only accepted them but also expressed doubt about the organic mercury causal theory. Therefore, this theory was recognized as uncertain.

  7. Assessing Variations in Areal Organization for the Intrinsic Brain: From Fingerprints to Reliability

    PubMed Central

    Xu, Ting; Opitz, Alexander; Craddock, R. Cameron; Wright, Margaret J.; Zuo, Xi-Nian; Milham, Michael P.

    2016-01-01

    Resting state fMRI (R-fMRI) is a powerful in-vivo tool for examining the functional architecture of the human brain. Recent studies have demonstrated the ability to characterize transitions between functionally distinct cortical areas through the mapping of gradients in intrinsic functional connectivity (iFC) profiles. To date, this novel approach has primarily been applied to iFC profiles averaged across groups of individuals, or in one case, a single individual scanned multiple times. Here, we used a publicly available R-fMRI dataset, in which 30 healthy participants were scanned 10 times (10 min per session), to investigate differences in full-brain transition profiles (i.e., gradient maps, edge maps) across individuals, and their reliability. 10-min R-fMRI scans were sufficient to achieve high accuracies in efforts to “fingerprint” individuals based upon full-brain transition profiles. Regarding test–retest reliability, the image-wise intraclass correlation coefficient (ICC) was moderate, and vertex-level ICC varied depending on region; larger durations of data yielded higher reliability scores universally. Initial application of gradient-based methodologies to a recently published dataset obtained from twins suggested inter-individual variation in areal profiles might have genetic and familial origins. Overall, these results illustrate the utility of gradient-based iFC approaches for studying inter-individual variation in brain function. PMID:27600846
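    The intraclass correlation coefficient (ICC) used for test-retest reliability here can be sketched for the one-way random-effects case, ICC(1,1) = (MSB - MSW) / (MSB + (k - 1) * MSW), where MSB and MSW are the between- and within-subject mean squares. The subjects and scores below are hypothetical, not the study's data.

```python
def icc_oneway(ratings):
    """ICC(1,1): one-way random-effects intraclass correlation.
    ratings: list of subjects, each a list of k repeated measurements."""
    n = len(ratings)
    k = len(ratings[0])
    grand = sum(sum(r) for r in ratings) / (n * k)
    means = [sum(r) / k for r in ratings]
    # Between-subject and within-subject mean squares (one-way ANOVA)
    msb = k * sum((m - grand) ** 2 for m in means) / (n - 1)
    msw = sum((x - m) ** 2 for r, m in zip(ratings, means) for x in r) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical test-retest data: 4 subjects, 3 sessions each
scores = [[10.0, 11.0, 10.5], [14.0, 13.5, 14.2],
          [8.0, 8.4, 7.9], [12.0, 12.5, 11.8]]
icc = icc_oneway(scores)
```

    High ICC indicates that between-subject differences dominate session-to-session noise, which is the sense in which longer scan durations "yielded higher reliability scores" above.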

  8. The reliability of in-training assessment when performance improvement is taken into account.

    PubMed

    van Lohuizen, Mirjam T; Kuks, Jan B M; van Hell, Elisabeth A; Raat, A N; Stewart, Roy E; Cohen-Schotanus, Janke

    2010-12-01

    During in-training assessment students are frequently assessed over a longer period of time and therefore it can be expected that their performance will improve. We studied whether there really is a measurable performance improvement when students are assessed over an extended period of time and how this improvement affects the reliability of the overall judgement. In-training assessment results were obtained from 104 students on rotation at our university hospital or at one of the six affiliated hospitals. Generalisability theory was used in combination with multilevel analysis to obtain reliability coefficients and to estimate the number of assessments needed for reliable overall judgement, both including and excluding performance improvement. Students' clinical performance ratings improved significantly from a mean of 7.6 at the start to a mean of 7.8 at the end of their clerkship. When taking performance improvement into account, reliability coefficients were higher. The number of assessments needed to achieve a reliability of 0.80 or higher decreased from 17 to 11. Therefore, when studying reliability of in-training assessment, performance improvement should be considered.
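    The abstract's estimate of how many assessments are needed to reach a reliability of 0.80 follows the logic of a generalizability-theory decision study, which for a single facet parallels the Spearman-Brown projection. A sketch of that projection follows; the single-assessment reliability of 0.30 is hypothetical, not the study's value.

```python
def assessments_needed(r_single, target):
    """Smallest number of parallel assessments whose average reaches the
    target reliability, via the Spearman-Brown prophecy formula:
    reliability of the mean of n assessments = n*r / (1 + (n - 1)*r)."""
    n = 1
    while (n * r_single) / (1 + (n - 1) * r_single) < target:
        n += 1
    return n

n = assessments_needed(r_single=0.30, target=0.80)
```

    The study's point is that modeling performance improvement changes the variance components, and hence the single-assessment reliability fed into this kind of projection, lowering the required number of assessments.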

  9. Criteria for evaluating programme theory diagrams in quality improvement initiatives: a structured method for appraisal.

    PubMed

    Issen, Laurel; Woodcock, Thomas; McNicholas, Christopher; Lennox, Laura; Reed, Julie E

    2018-04-09

    Despite criticisms that many quality improvement (QI) initiatives fail due to incomplete programme theory, there is no defined way to evaluate how programme theory has been articulated. The objective of this research was to develop, and assess the usability and reliability of, scoring criteria to evaluate programme theory diagrams. Criteria development was informed by published literature and QI experts. Inter-rater reliability was tested between two evaluators. In total, 63 programme theory diagrams (42 driver diagrams and 21 action-effect diagrams) were reviewed to establish whether the criteria could support comparative analysis of different approaches to constructing diagrams. Components of the scoring criteria include: assessment of overall aim, logical overview, clarity of components, cause-effect relationships, evidence and measurement. Independent reviewers had 78% inter-rater reliability. Scoring enabled direct comparison of different approaches to developing programme theory; action-effect diagrams showed a statistically significant but moderate improvement in programme theory quality over driver diagrams; no significant differences were observed based on the setting in which driver diagrams were developed. The scoring criteria summarise the necessary components of programme theory that are thought to contribute to successful QI projects. The viability of the scoring criteria for practical application was demonstrated. Future uses include assessment of individual programme theory diagrams and comparison of different approaches (e.g. methodological, teaching or other QI support) to produce programme theory. The criteria can be used as a tool to guide the production of better programme theory diagrams, and also highlights where additional support for QI teams could be needed.
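    The 78% inter-rater figure above is an agreement-style statistic; chance-corrected agreement between two raters is often reported alongside raw agreement. A sketch of Cohen's kappa with hypothetical ratings (this is illustrative, not the study's computation):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters over the same items:
    kappa = (p_observed - p_expected) / (1 - p_expected), where
    p_expected is the agreement implied by each rater's marginal rates."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

a = ["pass", "pass", "fail", "pass", "fail", "pass"]
b = ["pass", "fail", "fail", "pass", "fail", "pass"]
kappa = cohens_kappa(a, b)
```

    Here raw agreement is 5/6 while kappa is lower, because some of that agreement is expected by chance given each rater's marginal rates.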

  10. Reliability of a Measure of Institutional Discrimination against Minorities

    DTIC Science & Technology

    1979-12-01

    …statistical measure of the … of institutional discrimination are discussed. Two methods of dealing with the problem of reliability of the measure in small samples are presented. The first is based upon classical statistical theory and the second derives from a series of computer-generated Monte Carlo…

  11. PV Reliability Development Lessons from JPL's Flat Plate Solar Array Project

    NASA Technical Reports Server (NTRS)

    Ross, Ronald G., Jr.

    2013-01-01

    Key reliability and engineering lessons learned from the 20-year history of the Jet Propulsion Laboratory's Flat-Plate Solar Array Project and thin film module reliability research activities are presented and analyzed. Particular emphasis is placed on lessons applicable to evolving new module technologies and the organizations involved with these technologies. The user-specific demand for reliability is a strong function of the application, its location, and its expected duration. Lessons relative to effective means of specifying reliability are described, and commonly used test requirements are assessed from the standpoint of which are the most troublesome to pass and which correlate best with field experience. Module design lessons are also summarized, including the significance of the most frequently encountered failure mechanisms and the role of encapsulant and cell reliability in determining module reliability. Lessons pertaining to research, design, and test approaches include the historical role and usefulness of qualification tests and field tests.

  12. Adsorption structures and energetics of molecules on metal surfaces: Bridging experiment and theory

    NASA Astrophysics Data System (ADS)

    Maurer, Reinhard J.; Ruiz, Victor G.; Camarillo-Cisneros, Javier; Liu, Wei; Ferri, Nicola; Reuter, Karsten; Tkatchenko, Alexandre

    2016-05-01

    Adsorption geometry and stability of organic molecules on surfaces are key parameters that determine the observable properties and functions of hybrid inorganic/organic systems (HIOSs). Despite many recent advances in precise experimental characterization and improvements in first-principles electronic structure methods, reliable databases of structures and energetics for large adsorbed molecules are largely amiss. In this review, we present such a database for a range of molecules adsorbed on metal single-crystal surfaces. The systems we analyze include noble-gas atoms, conjugated aromatic molecules, carbon nanostructures, and heteroaromatic compounds adsorbed on five different metal surfaces. The overall objective is to establish a diverse benchmark dataset that enables an assessment of current and future electronic structure methods, and motivates further experimental studies that provide ever more reliable data. Specifically, the benchmark structures and energetics from experiment are here compared with the recently developed van der Waals (vdW) inclusive density-functional theory (DFT) method, DFT + vdWsurf. In comparison to 23 adsorption heights and 17 adsorption energies from experiment we find a mean average deviation of 0.06 Å and 0.16 eV, respectively. This confirms the DFT + vdWsurf method as an accurate and efficient approach to treat HIOSs. A detailed discussion identifies remaining challenges to be addressed in future development of electronic structure methods, for which the here presented benchmark database may serve as an important reference.
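    The benchmark comparison described above boils down to a mean average deviation (MAD) between calculated and experimental values. A minimal sketch, with invented adsorption heights (not values from the paper):

    ```python
    def mean_average_deviation(calculated, experimental):
        """Average of absolute differences between paired values."""
        if len(calculated) != len(experimental):
            raise ValueError("paired datasets must have equal length")
        return sum(abs(c - e) for c, e in zip(calculated, experimental)) / len(calculated)

    # Illustrative (hypothetical) adsorption heights in angstroms
    dft_heights = [2.25, 3.05, 2.91, 3.32]
    exp_heights = [2.30, 3.00, 2.85, 3.38]
    mad = mean_average_deviation(dft_heights, exp_heights)
    ```

    The same statistic applied to 23 experimental heights and 17 energies yields the 0.06 Å and 0.16 eV figures quoted in the abstract.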

  13. Reliability Generalization (RG) Analysis: The Test Is Not Reliable

    ERIC Educational Resources Information Center

    Warne, Russell

    2008-01-01

    Literature shows that most researchers are unaware of some of the characteristics of reliability. This paper clarifies some misconceptions by describing the procedures, benefits, and limitations of reliability generalization while using it to illustrate the nature of score reliability. Reliability generalization (RG) is a meta-analytic method…

  14. Ferroelectric and reliability properties of metal-organic chemical vapor deposited Pb(Zr0.15Ti0.85)O3 thin films grown in the self-regulation process window

    NASA Astrophysics Data System (ADS)

    Zhao, Jin Shi; Lee, Hyun Ju; Sim, Joon Seop; Lee, Keun; Hwang, Cheol Seong

    2006-04-01

    Ferroelectric reliability of Pb(Zr0.15Ti0.85)O3 films grown by metal-organic chemical vapor deposition at 570°C on an Ir electrode in the self-regulation process window [constant Pb concentration irrespective of the precursor input ratio (Pb/(Zr+Ti), PIR)] was studied. Although the Pb composition and crystallinity of the films grown under different PIR were almost identical, the film grown under a PIR near the center of the process window showed the best ferroelectric performance. X-ray photoelectron spectroscopy showed that the films grown at lower and higher PIR have residual ZrO2 and metallic Pb, respectively, which resulted in reduced remanent polarization and reliability.

  15. Can Sex Differences in Science Be Tied to the Long Reach of Prenatal Hormones? Brain Organization Theory, Digit Ratio (2D/4D), and Sex Differences in Preferences and Cognition.

    PubMed

    Valla, Jeffrey; Ceci, Stephen J

    2011-03-01

    Brain organization theory posits a cascade of physiological and behavioral changes initiated and shaped by prenatal hormones. Recently, this theory has been associated with outcomes including gendered toy preference, 2D/4D digit ratio, personality characteristics, sexual orientation, and cognitive profile (spatial, verbal, and mathematical abilities). We examine the evidence for this claim, focusing on 2D/4D and its putative role as a biomarker for organizational features that influence cognitive abilities/interests predisposing males toward mathematically and spatially intensive careers. Although massive support exists for early brain organization theory overall, there are myriad inconsistencies, alternative explanations, and outright contradictions that must be addressed while still taking the entire theory into account. Like a fractal within the larger theory, the 2D/4D hypothesis mirrors this overall support on a smaller scale while likewise suffering from inconsistencies (positive, negative, and sex-dependent correlations), alternative explanations (2D/4D related to spatial preferences rather than abilities per se), and contradictions (feminine 2D/4D in men associated with higher spatial ability). Using the debate over brain organization theory as the theoretical stage, we focus on 2D/4D evidence as an increasingly important player on this stage, a demonstrative case in point of the evidential complexities of the broader debate, and an increasingly important topic in its own right.

  16. How youth get engaged: grounded-theory research on motivational development in organized youth programs.

    PubMed

    Dawes, Nickki Pearce; Larson, Reed

    2011-01-01

    For youth to benefit from many of the developmental opportunities provided by organized programs, they need to not only attend but become psychologically engaged in program activities. This research was aimed at formulating empirically based grounded theory on the processes through which this engagement develops. Longitudinal interviews were conducted with 100 ethnically diverse youth (ages 14–21) in 10 urban and rural arts and leadership programs. Qualitative analysis focused on narrative accounts from the 44 youth who reported experiencing a positive turning point in their motivation or engagement. For 38 of these youth, this change process involved forming a personal connection. Similar to processes suggested by self-determination theory (Ryan & Deci, 2000), forming a personal connection involved youth's progressive integration of personal goals with the goals of program activities. Youth reported developing a connection to 3 personal goals that linked the self with the activity: learning for the future, developing competence, and pursuing a purpose. The role of purpose for many youth suggests that motivational change can be driven by goals that transcend self-needs. These findings suggest that youth need not enter programs intrinsically engaged--motivation can be fostered--and that programs should be creative in helping youth explore ways to form authentic connections to program activities.

  17. Insightful practice: a reliable measure for medical revalidation

    PubMed Central

    Guthrie, Bruce; Sullivan, Frank M; Mercer, Stewart W; Russell, Andrew; Bruce, David A

    2012-01-01

    Background Medical revalidation decisions need to be reliable if they are to reassure on the quality and safety of professional practice. This study tested an innovative method in which general practitioners (GPs) were assessed on their reflection and response to a set of externally specified feedback. Setting and participants 60 GPs and 12 GP appraisers in the Tayside region of Scotland, UK. Methods A feedback dataset was specified as (1) GP-specific data collected by GPs themselves (patient and colleague opinion; open book self-evaluated knowledge test; complaints) and (2) externally collected practice-level data provided to GPs (clinical quality and prescribing safety). GPs' perceptions of whether the feedback covered UK General Medical Council specified attributes of a ‘good doctor’ were examined using a mapping exercise. GPs' professionalism was examined in terms of appraiser assessment of GPs' level of insightful practice, defined as: engagement with, insight into and appropriate action on feedback data. The reliability of assessment of insightful practice and of subsequent recommendations on GPs' revalidation by face-to-face and anonymous assessors was investigated using generalisability theory (G-theory). Main outcome measures Coverage of General Medical Council attributes by the specified feedback and reliability of assessor recommendations on doctors' suitability for revalidation. Results Face-to-face assessment proved unreliable. Anonymous global assessment by three appraisers of insightful practice was highly reliable (G=0.85), as were revalidation decisions using four anonymous assessors (G=0.83). Conclusions Unlike face-to-face appraisal, anonymous assessment of insightful practice offers a valid and reliable method to decide GP revalidation. Further validity studies are needed. PMID:22653078
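    The G coefficients quoted above express how reliability grows as assessors are added. A minimal sketch of the underlying one-facet (person-by-rater) calculation, using assumed variance components rather than the study's actual estimates:

    ```python
    def g_coefficient(var_person, var_error, n_raters):
        """Generalizability (G) coefficient for the mean of n_raters' ratings
        in a one-facet person-by-rater design: person variance over total
        variance of the averaged rating."""
        return var_person / (var_person + var_error / n_raters)

    # Assumed variance components, purely for illustration
    g_one = g_coefficient(1.0, 0.5, 1)    # single assessor
    g_three = g_coefficient(1.0, 0.5, 3)  # averaging three assessors
    ```

    Averaging over more raters shrinks the error term, which is why three or four anonymous assessors can reach G above the 0.8 level often used as a decision threshold.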

  18. Reliability of System Identification Techniques to Assess Standing Balance in Healthy Elderly

    PubMed Central

    Maier, Andrea B.; Aarts, Ronald G. K. M.; van Gerven, Joop M. A.; Arendzen, J. Hans; Schouten, Alfred C.; Meskers, Carel G. M.; van der Kooij, Herman

    2016-01-01

    Objectives System identification techniques have the potential to assess the contribution of the underlying systems involved in standing balance by applying well-known disturbances. We investigated the reliability of standing balance parameters obtained with multivariate closed loop system identification techniques. Methods In twelve healthy elderly balance tests were performed twice a day during three days. Body sway was measured during two minutes of standing with eyes closed and the Balance test Room (BalRoom) was used to apply four disturbances simultaneously: two sensory disturbances, to the proprioceptive and the visual system, and two mechanical disturbances applied at the leg and trunk segment. Using system identification techniques, sensitivity functions of the sensory disturbances and the neuromuscular controller were estimated. Based on the generalizability theory (G theory), systematic errors and sources of variability were assessed using linear mixed models and reliability was assessed by computing indexes of dependability (ID), standard error of measurement (SEM) and minimal detectable change (MDC). Results A systematic error was found between the first and second trial in the sensitivity functions. No systematic error was found in the neuromuscular controller and body sway. The reliability of 15 of 25 parameters and body sway were moderate to excellent when the results of two trials on three days were averaged. To reach an excellent reliability on one day in 7 out of 25 parameters, it was predicted that at least seven trials must be averaged. Conclusion This study shows that system identification techniques are a promising method to assess the underlying systems involved in standing balance in elderly. However, most of the parameters do not appear to be reliable unless a large number of trials are collected across multiple days. To reach an excellent reliability in one third of the parameters, a training session for participants is needed and at
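    The reliability indexes named above (SEM and MDC) follow from standard formulas once a dependability or reliability coefficient is in hand. A small sketch with invented numbers, not values from the study:

    ```python
    import math

    def sem(sd, reliability):
        """Standard error of measurement: score SD times sqrt(1 - reliability)."""
        return sd * math.sqrt(1 - reliability)

    def mdc95(sem_value):
        """Minimal detectable change at 95% confidence for a test-retest design."""
        return 1.96 * math.sqrt(2) * sem_value

    # Hypothetical example: score SD of 10 units, index of dependability 0.84
    s = sem(10.0, 0.84)
    change_threshold = mdc95(s)
    ```

    A change smaller than the MDC cannot be distinguished from measurement noise, which is why averaging several trials (lowering the SEM) matters for detecting real change in balance parameters.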

  19. Reliability of intracerebral hemorrhage classification systems: A systematic review.

    PubMed

    Rannikmäe, Kristiina; Woodfield, Rebecca; Anderson, Craig S; Charidimou, Andreas; Chiewvit, Pipat; Greenberg, Steven M; Jeng, Jiann-Shing; Meretoja, Atte; Palm, Frederic; Putaala, Jukka; Rinkel, Gabriel Je; Rosand, Jonathan; Rost, Natalia S; Strbian, Daniel; Tatlisumak, Turgut; Tsai, Chung-Fen; Wermer, Marieke Jh; Werring, David; Yeh, Shin-Joe; Al-Shahi Salman, Rustam; Sudlow, Cathie Lm

    2016-08-01

    Accurately distinguishing non-traumatic intracerebral hemorrhage (ICH) subtypes is important since they may have different risk factors, causal pathways, management, and prognosis. We systematically assessed the inter- and intra-rater reliability of ICH classification systems. We sought all available reliability assessments of anatomical and mechanistic ICH classification systems from electronic databases and personal contacts until October 2014. We assessed included studies' characteristics, reporting quality and potential for bias; summarized reliability with kappa value forest plots; and performed meta-analyses of the proportion of cases classified into each subtype. We included 8 of 2152 studies identified. Inter- and intra-rater reliabilities were substantial to perfect for anatomical and mechanistic systems (inter-rater kappa values: anatomical 0.78-0.97 [six studies, 518 cases], mechanistic 0.89-0.93 [three studies, 510 cases]; intra-rater kappas: anatomical 0.80-1 [three studies, 137 cases], mechanistic 0.92-0.93 [two studies, 368 cases]). Reporting quality varied but no study fulfilled all criteria and none was free from potential bias. All reliability studies were performed with experienced raters in specialist centers. Proportions of ICH subtypes were largely consistent with previous reports suggesting that included studies are appropriately representative. Reliability of existing classification systems appears excellent but is unknown outside specialist centers with experienced raters. Future reliability comparisons should be facilitated by studies following recently published reporting guidelines. © 2016 World Stroke Organization.
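    The inter-rater kappa values summarized above correct raw agreement for chance. A minimal sketch of Cohen's kappa for two raters; the subtype labels are hypothetical, not study data:

    ```python
    from collections import Counter

    def cohens_kappa(labels_a, labels_b):
        """Cohen's kappa: chance-corrected agreement between two raters."""
        if len(labels_a) != len(labels_b):
            raise ValueError("raters must classify the same cases")
        n = len(labels_a)
        observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
        freq_a, freq_b = Counter(labels_a), Counter(labels_b)
        # Expected agreement if both raters labelled cases independently
        expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
        return (observed - expected) / (1 - expected)

    # Invented ICH subtype calls from two raters
    rater_1 = ["lobar", "deep", "lobar", "deep", "cerebellar", "deep"]
    rater_2 = ["lobar", "deep", "deep", "deep", "cerebellar", "deep"]
    kappa = cohens_kappa(rater_1, rater_2)
    ```

    On the conventional scale, kappa of 0.61-0.80 is "substantial" and above 0.80 "almost perfect", which is the range reported for the anatomical and mechanistic systems here.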

  20. Missile and Space Systems Reliability versus Cost Trade-Off Study

    DTIC Science & Technology

    1983-01-01

    …reliability problems, which has the real bearing on program effectiveness. A well planned and funded reliability effort can prevent or ferret out…failure analysis, and the incorporation and verification of design corrections to prevent recurrence of failures. 302.2.2 A TMJ test plan shall be…

  1. Similarity or dissimilarity in the relations between human service organizations.

    PubMed

    Bruynooghe, Kevin; Verhaeghe, Mieke; Bracke, Piet

    2008-01-01

    Exchange theory and homophily theory give rise to counteracting expectations for the interaction between human service organizations. Based on arguments of exchange theory, more interaction is expected between dissimilar organizations having complementary resources. Based on arguments of homophily theory, organizations having similar characteristics are expected to interact more. Interorganizational relations between human service organizations in two regional networks in Flanders are examined in this study. Results indicate that human service organizations tend to cooperate more with similar organizations, since several homophily effects, but not a single effect of dissimilarity, were found to be significant. The results of this study contribute to the understanding of interorganizational networks of human service organizations and have implications for the development of integrated care.

  2. Evolutionary theory and teleology.

    PubMed

    O'Grady, R T

    1984-04-21

    The order within and among living systems can be explained rationally by postulating a process of descent with modification, effected by factors which are extrinsic or intrinsic to the organisms. Because at the time Darwin proposed his theory of evolution there was no concept of intrinsic factors which could evolve, he postulated a process of extrinsic effects--natural selection. Biological order was thus seen as an imposed, rather than an emergent, property. Evolutionary change was seen as being determined by the functional efficiency (adaptedness) of the organism in its environment, rather than by spontaneous changes in intrinsically generated organizing factors. The initial incompleteness of Darwin's explanatory model, and the axiomatization of its postulates in neo-Darwinism, has resulted in a theory of functionalism, rather than structuralism. As such, it introduces an unnecessary teleology which confounds evolutionary studies and reduces the usefulness of the theory. This problem cannot be detected from within the neo-Darwinian paradigm because the different levels of end-directed activity--teleomatic, teleonomic, and teleological--are not recognized. They are, in fact, considered to influence one another. The theory of nonequilibrium evolution avoids these problems by returning to the basic principles of biological order and developing a structuralist explanation of intrinsically generated change. Extrinsic factors may affect the resultant evolutionary pattern, but they are neither necessary nor sufficient for evolution to occur.

  3. Reliability analysis of a robotic system using hybridized technique

    NASA Astrophysics Data System (ADS)

    Kumar, Naveen; Komal; Lather, J. S.

    2017-09-01

    In this manuscript, the reliability of a robotic system has been analyzed using the available data (containing vagueness, uncertainty, etc.). Quantification of the involved uncertainties is done through data fuzzification using triangular fuzzy numbers with known spreads, as suggested by system experts. With fuzzified data, if the existing fuzzy lambda-tau (FLT) technique is employed, the computed reliability parameters have a wide range of predictions. Therefore, the decision-maker cannot suggest any specific and influential managerial strategy to prevent unexpected failures and consequently to improve complex system performance. To overcome this problem, the present study utilizes a hybridized technique: fuzzy set theory is utilized to quantify uncertainties, a fault tree is utilized for system modeling, the lambda-tau method is utilized to formulate mathematical expressions for failure/repair rates of the system, and a genetic algorithm is utilized to solve the established nonlinear programming problem. Different reliability parameters of the robotic system are computed and the results are compared with the existing technique. The components of the robotic system follow an exponential distribution, i.e., have constant failure rates. Sensitivity analysis is also performed and the impact on system mean time between failures (MTBF) is addressed by varying other reliability parameters. Based on the analysis, some influential suggestions are given to improve the system performance.
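    The fuzzification step above represents an uncertain failure rate as a triangular fuzzy number and propagates its spread through the reliability formulas via alpha-cut interval arithmetic. A minimal sketch under assumed spreads (the (left, modal, right) values are invented, not taken from the paper):

    ```python
    def alpha_cut(tfn, alpha):
        """Interval of a triangular fuzzy number (left, modal, right)
        at membership level alpha (0 = widest support, 1 = crisp value)."""
        left, modal, right = tfn
        return (left + alpha * (modal - left), right - alpha * (right - modal))

    # Hypothetical fuzzified failure rate lambda (per hour) with +/-15% spread
    lam = (0.85e-3, 1.0e-3, 1.15e-3)
    lam_lo, lam_hi = alpha_cut(lam, 0.5)
    # Interval arithmetic propagates the spread to MTBF = 1/lambda
    mtbf_interval = (1.0 / lam_hi, 1.0 / lam_lo)
    ```

    The abstract's complaint about the plain FLT technique is that such intervals come out too wide to act on; the hybridized method narrows them by solving a constrained optimization over the cuts.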

  4. CERTS: Consortium for Electric Reliability Technology Solutions - Research Highlights

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eto, Joseph

    2003-07-30

    Historically, the U.S. electric power industry was vertically integrated, and utilities were responsible for system planning, operations, and reliability management. As the nation moves to a competitive market structure, these functions have been disaggregated, and no single entity is responsible for reliability management. As a result, new tools, technologies, systems, and management processes are needed to manage the reliability of the electricity grid. However, a number of simultaneous trends prevent electricity market participants from pursuing development of these reliability tools: utilities are preoccupied with restructuring their businesses, research funding has declined, and the formation of Independent System Operators (ISOs) and Regional Transmission Organizations (RTOs) to operate the grid means that control of transmission assets is separate from ownership of these assets; at the same time, business uncertainty and changing regulatory policies have created a climate in which needed investment for transmission infrastructure and tools for reliability management has dried up. To address the resulting emerging gaps in reliability R&D, CERTS has undertaken much-needed public interest research on reliability technologies for the electricity grid. CERTS' vision is to: (1) Transform the electricity grid into an intelligent network that can sense and respond automatically to changing flows of power and emerging problems; (2) Enhance reliability management through market mechanisms, including transparency of real-time information on the status of the grid; (3) Empower customers to manage their energy use and reliability needs in response to real-time market price signals; and (4) Seamlessly integrate distributed technologies--including those for generation, storage, controls, and communications--to support the reliability needs of both the grid and individual customers.

  5. History of solid organ transplantation and organ donation.

    PubMed

    Linden, Peter K

    2009-01-01

    Solid organ transplantation is one of the most remarkable and dramatic therapeutic advances in medicine during the past 60 years. This field has progressed initially from what can accurately be termed a "clinical experiment" to routine and reliable practice, which has proven to be clinically effective, life-saving and cost-effective. This remarkable evolution stems from a serial confluence of: cultural acceptance; legal and political evolution to facilitate organ donation, procurement and allocation; technical and cognitive advances in organ preservation, surgery, immunology, immunosuppression; and management of infectious diseases. Some of the major milestones of this multidisciplinary clinical science are reviewed in this article.

  6. Reliability, Validity and Utility of a Multiple Intelligences Assessment for Career Planning.

    ERIC Educational Resources Information Center

    Shearer, C. Branton

    "The Multiple Intelligences Developmental Assessment Scales" (MIDAS) is a self- (or other-) completed instrument which is based upon the theory of multiple intelligences. The validity, reliability, and utility data regarding the MIDAS are reported here. The measure consists of 7 main scales and 24 subscales which summarize a person's intellectual…

  7. A predictive framework for evaluating models of semantic organization in free recall

    PubMed Central

    Morton, Neal W; Polyn, Sean M.

    2016-01-01

    Research in free recall has demonstrated that semantic associations reliably influence the organization of search through episodic memory. However, the specific structure of these associations and the mechanisms by which they influence memory search remain unclear. We introduce a likelihood-based model-comparison technique, which embeds a model of semantic structure within the context maintenance and retrieval (CMR) model of human memory search. Within this framework, model variants are evaluated in terms of their ability to predict the specific sequence in which items are recalled. We compare three models of semantic structure, latent semantic analysis (LSA), global vectors (GloVe), and word association spaces (WAS), and find that models using WAS have the greatest predictive power. Furthermore, we find evidence that semantic and temporal organization is driven by distinct item and context cues, rather than a single context cue. This finding provides an important constraint for theories of memory search. PMID:28331243

  8. Reliability of hospital cost profiles in inpatient surgery.

    PubMed

    Grenda, Tyler R; Krell, Robert W; Dimick, Justin B

    2016-02-01

    With increased policy emphasis on shifting risk from payers to providers through mechanisms such as bundled payments and accountable care organizations, hospitals are increasingly in need of metrics to understand their costs relative to peers. However, it is unclear whether Medicare payments for surgery can reliably compare hospital costs. We used national Medicare data to assess patients undergoing colectomy, pancreatectomy, and open incisional hernia repair from 2009 to 2010 (n = 339,882 patients). We first calculated risk-adjusted hospital total episode payments for each procedure. We then used hierarchical modeling techniques to estimate the reliability of total episode payments for each procedure and explored the impact of hospital caseload on payment reliability. Finally, we quantified the number of hospitals meeting published reliability benchmarks. Mean risk-adjusted total episode payments ranged from $13,262 (standard deviation [SD] $14,523) for incisional hernia repair to $25,055 (SD $22,549) for pancreatectomy. The reliability of hospital episode payments varied widely across procedures and depended on sample size. For example, mean episode payment reliability for colectomy (mean caseload, 157) was 0.80 (SD 0.18), whereas for pancreatectomy (mean caseload, 13) the mean reliability was 0.45 (SD 0.27). Many hospitals met published reliability benchmarks for each procedure. For example, 90% of hospitals met reliability benchmarks for colectomy, 40% for pancreatectomy, and 66% for incisional hernia repair. Episode payments for inpatient surgery are a reliable measure of hospital costs for commonly performed procedures, but are less reliable for lower volume operations. These findings suggest that hospital cost profiles based on Medicare claims data may be used to benchmark efficiency, especially for more common procedures. Copyright © 2016 Elsevier Inc. All rights reserved.
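    The hierarchical reliability estimate described above asks how much of the variation in a hospital's mean episode payment reflects true between-hospital differences rather than case-to-case noise, which is why caseload matters. A minimal sketch with invented variance components (the caseloads echo the abstract's means, but the variances are assumptions):

    ```python
    def profile_reliability(var_between, var_within, caseload):
        """Signal-to-noise reliability of a hospital's mean episode payment:
        between-hospital variance over total variance of the hospital mean."""
        return var_between / (var_between + var_within / caseload)

    # Assumed variance components, purely for illustration
    high_volume = profile_reliability(4.0e6, 2.25e8, 157)  # e.g. colectomy
    low_volume = profile_reliability(4.0e6, 2.25e8, 13)    # e.g. pancreatectomy
    ```

    With the same underlying variances, the low-caseload procedure's profile is far noisier, mirroring the 0.80 versus 0.45 mean reliabilities reported for colectomy and pancreatectomy.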

  9. Toward Predictive Theories of Nuclear Reactions Across the Isotopic Chart: Web Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Escher, J. E.; Blackmon, J.; Elster, C.

    Recent years have seen exciting new developments and progress in nuclear structure theory, reaction theory, and experimental techniques that allow us to move towards a description of exotic systems and environments, setting the stage for new discoveries. The purpose of the 5-week program was to bring together physicists from the low-energy nuclear structure and reaction communities to identify avenues for achieving reliable and predictive descriptions of reactions involving nuclei across the isotopic chart. The 4-day embedded workshop focused on connecting theory developments to experimental advances and data needs for astrophysics and other applications. Nuclear theory must address phenomena from laboratory experiments to stellar environments, from stable nuclei to weakly-bound and exotic isotopes. Expanding the reach of theory to these regimes requires a comprehensive understanding of the reaction mechanisms involved as well as detailed knowledge of nuclear structure. A recurring theme throughout the program was the desire to produce reliable predictions rooted in either ab initio or microscopic approaches. At the same time it was recognized that some applications involving heavy nuclei away from stability, e.g. those involving fission fragments, may need to rely on simple parameterizations of incomplete data for the foreseeable future. The goal here, however, is to subsequently improve and refine the descriptions, moving to phenomenological, then microscopic approaches. There was overarching consensus that future work should also focus on reliable estimates of errors in theoretical descriptions.

  10. Density Functional Theory Investigations of D-A-D' Structural Molecules as Donor Materials in Organic Solar Cell.

    PubMed

    Chen, Junxian; Liu, Qingyu; Li, Hao; Zhao, Zhigang; Lu, Zhiyun; Huang, Yan; Xu, Dingguo

    2018-01-01

    Squaraine core based small molecules in bulk heterojunction organic solar cells have received extensive attention due to their distinguished photochemical properties in the far red and infrared domain. In this paper, combining theoretical simulations with experimental syntheses and characterizations, three major factors (fill factor, short-circuit current and open-circuit voltage) were considered together to improve the power conversion efficiencies of solar cells. As model material systems with the D-A-D' framework, two asymmetric squaraines (CNSQ and CCSQ-Tol) as donor materials in a bulk heterojunction organic solar cell were synthesized and characterized. Intensive density functional theory computations were applied to identify direct connections between the three factors and the corresponding molecular structural properties. This helped us to predict a new molecule, CCSQ'-Ox, that matches all the requirements to improve the power conversion efficiency.

  11. Reliability model generator

    NASA Technical Reports Server (NTRS)

    Cohen, Gerald C. (Inventor); McMann, Catherine M. (Inventor)

    1991-01-01

    An improved method and system for automatically generating reliability models for use with a reliability evaluation tool is described. The reliability model generator of the present invention includes means for storing a plurality of low level reliability models which represent the reliability characteristics for low level system components. In addition, the present invention includes means for defining the interconnection of the low level reliability models via a system architecture description. In accordance with the principles of the present invention, a reliability model for the entire system is automatically generated by aggregating the low level reliability models based on the system architecture description.
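    The aggregation step described above, combining low-level component reliabilities according to an architecture description, can be sketched with the two classic combinators, series and parallel. The architecture and numbers below are hypothetical, not from the patent:

    ```python
    from math import prod

    def series(rels):
        """Series structure: the system works only if every component works."""
        return prod(rels)

    def parallel(rels):
        """Parallel (redundant) structure: fails only if all components fail."""
        return 1.0 - prod(1.0 - r for r in rels)

    # Hypothetical architecture description: two redundant sensors in
    # parallel, feeding a single processor in series.
    system_reliability = series([parallel([0.9, 0.9]), 0.99])
    ```

    Nesting these combinators to mirror the architecture description is the simplest form of the automatic aggregation the generator performs; real tools add repair rates, common-cause failures, and time dependence on top.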

  12. High reliable and stable organic field-effect transistor nonvolatile memory with a poly(4-vinyl phenol) charge trapping layer based on a pn-heterojunction active layer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiang, Lanyi; Ying, Jun; Han, Jinhua

    2016-04-25

    In this letter, we demonstrate a highly reliable and stable organic field-effect transistor (OFET) based nonvolatile memory (NVM) with the polymer poly(4-vinyl phenol) (PVP) as the charge trapping layer. In unipolar OFETs, the irreversible shifts of the turn-on voltage (V{sub on}) and severe degradation of the memory window (ΔV{sub on}) at programming (P) and erasing (E) voltages, respectively, block their application in NVMs. This obstacle is overcome by using a pn-heterojunction as the active layer in the OFET memory, which supplies a hole-accumulating and an electron-accumulating channel at the applied P and E voltages, respectively. Both holes and electrons transferring from the channels to the PVP layer and overwriting the trapped charges of opposite polarity result in reliable bidirectional shifts of V{sub on} at P and E voltages, respectively. The heterojunction OFET exhibits excellent nonvolatile memory characteristics, with a large ΔV{sub on} of 8.5 V, a desired reading (R) voltage at 0 V, reliable P/R/E/R dynamic endurance over 100 cycles and a long retention time of over 10 years.

  13. The Mini Questionnaire of Personal Organization (MQPO): preliminary validation of a new post-rationalist personality questionnaire.

    PubMed

    Nardi, Bernardo; Arimatea, Emidio; Giovagnoli, Sara; Blasi, Stefano; Bellantuono, Cesario; Rezzonico, Giorgio

    2012-01-01

    The Mini Questionnaire of Personal Organization (MQPO) has been constructed in order to comply with the inward/outward Personal Meaning Organization's (PMO) theory. According to Nardi's Adaptive Post-Rationalist approach, predictable and invariable caregivers' behaviours allow inward focus and a physical sight of reciprocity; non-predictable and variable caregivers' behaviours allow outward focus and a semantic sight of reciprocity. The 20 items of the MQPO were selected from 29 intermediate (n = 160) and 40 initial items (n = 204). Psychometric validation was conducted (n = 296), including internal validity (item-total correlation; factor analysis), internal coherence by factor analysis, two analyses of discriminant validity (n = 132 and n = 80) and reliability by test-retest analysis (n = 49). All subjects gave written informed consent before beginning the test. The validation of the MQPO shows that the final version is consistent with its post-rationalist paradigm. Four different factors were found, one for each PMO. Construct validity and the internal reliability index are satisfactory (Alpha = 0.73). Moreover, the test-retest results are stable (from r = 0.80 to r = 0.89). There is adequate agreement between the MQPO scales and the clinical evaluations (72.5%), as well as excellent agreement (80.0%) between the scores of the MQPO and those of the Personal Meaning Questionnaire. The MQPO is a tool for studying personality as a process by focusing on the relationships between personality and developmental process axes, which are the bases of the PMO theory, according to the APR approach. Copyright © 2011 John Wiley & Sons, Ltd.

  14. Creating and Sustaining Secondary Schools' Success: Sandfields, Cwmtawe, and the Neath-Port Talbot Local Authority's High Reliability Schools Reform

    ERIC Educational Resources Information Center

    Stringfield, Sam; Reynolds, David; Schaffer, Eugene

    2016-01-01

    This chapter presents data from a 15-year, mixed-methods school improvement effort. The High Reliability Schools (HRS) reform made use of previous research on school effects and on High Reliability Organizations (HROs). HROs are organizations in various parts of our cultures that are required to operate successfully "the first time, every…

  15. Organizational Learning Theory in Schools

    ERIC Educational Resources Information Center

    Fauske, Janice R.; Raybould, Rebecca

    2005-01-01

    Purpose: The paper's purposes are to establish organizational learning theory as evolving from the theoretical and empirical study of organizations and to build grounded theory explaining organizational learning in schools. Design/methodology/approach: Implementation of instructional technology as a process of organizational learning was explored…

  16. Theories of lean management: an empirical evaluation.

    PubMed

    Handel, Michael J

    2014-03-01

    Debates within organization theory traditionally argued the relative merits of bureaucracy but today there is broad agreement across different perspectives that bureaucratic organization is inefficient and outmoded. Despite their differences, post-bureaucratic and neo-liberal theories argue that organizations with relatively flat hierarchies and low management overhead are better adapted to current market requirements. Post-bureaucratic theory also argues that employees, as well as firms, benefit from leaner management structures. This paper investigates trends in managerial leanness, proposed explanations for such trends, and the consequences of leanness for firms and employees. Although there is a trend toward flatter management hierarchies, there is only weak support for current claims regarding both the causes and consequences of lean management. Copyright © 2013 Elsevier Inc. All rights reserved.

  17. SPSS and SAS programs for generalizability theory analyses.

    PubMed

    Mushquash, Christopher; O'Connor, Brian P

    2006-08-01

    The identification and reduction of measurement errors is a major challenge in psychological testing. Most investigators rely solely on classical test theory for assessing reliability, whereas most experts have long recommended using generalizability theory instead. One reason for the common neglect of generalizability theory is the absence of analytic facilities for this purpose in popular statistical software packages. This article provides a brief introduction to generalizability theory, describes easy to use SPSS, SAS, and MATLAB programs for conducting the recommended analyses, and provides an illustrative example, using data (N = 329) for the Rosenberg Self-Esteem Scale. Program output includes variance components, relative and absolute errors and generalizability coefficients, coefficients for D studies, and graphs of D study results.
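    As a rough illustration of the quantities such programs report, the variance components and relative generalizability coefficient for the simplest fully crossed persons × items (p × i) design can be computed directly from the ANOVA mean squares. This sketch is not the SPSS/SAS code the article describes; the score matrix is hypothetical:

```python
# Variance components and relative G coefficient for a fully crossed
# persons x items design (one-facet G study). Illustrative sketch only.

def g_study(scores):
    n_p, n_i = len(scores), len(scores[0])
    grand = sum(sum(row) for row in scores) / (n_p * n_i)
    p_means = [sum(row) / n_i for row in scores]
    i_means = [sum(row[j] for row in scores) / n_p for j in range(n_i)]

    ss_p = n_i * sum((m - grand) ** 2 for m in p_means)
    ss_i = n_p * sum((m - grand) ** 2 for m in i_means)
    ss_tot = sum((x - grand) ** 2 for row in scores for x in row)
    ms_p = ss_p / (n_p - 1)
    ms_i = ss_i / (n_i - 1)
    ms_res = (ss_tot - ss_p - ss_i) / ((n_p - 1) * (n_i - 1))

    var_res = ms_res                   # sigma^2(pi,e): residual
    var_p = (ms_p - ms_res) / n_i      # sigma^2(p): persons (object of measurement)
    var_i = (ms_i - ms_res) / n_p      # sigma^2(i): items
    # Relative G coefficient: person variance over person + relative error.
    e_rho2 = var_p / (var_p + var_res / n_i)
    return {"var_p": var_p, "var_i": var_i, "var_res": var_res, "E_rho2": e_rho2}

result = g_study([[7, 8, 9], [5, 5, 6], [8, 9, 9], [4, 5, 5]])  # made-up scores
```

A D study then re-evaluates `E_rho2` with different values of `n_i` to project reliability for longer or shorter forms.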

  18. Development, scoring, and reliability of the Microscale Audit of Pedestrian Streetscapes (MAPS)

    PubMed Central

    2013-01-01

    Background Streetscape (microscale) features of the built environment can influence people’s perceptions of their neighborhoods’ suitability for physical activity. Many microscale audit tools have been developed, but few have published systematic scoring methods. We present the development, scoring, and reliability of the Microscale Audit of Pedestrian Streetscapes (MAPS) tool and its theoretically-based subscales. Methods MAPS was based on prior instruments and was developed to assess details of streetscapes considered relevant for physical activity. MAPS sections (route, segments, crossings, and cul-de-sacs) were scored by two independent raters for reliability analyses. There were 290 route pairs, 516 segment pairs, 319 crossing pairs, and 53 cul-de-sac pairs in the reliability sample. Individual inter-rater item reliability analyses were computed using Kappa, intra-class correlation coefficient (ICC), and percent agreement. A conceptual framework for subscale creation was developed using theory, expert consensus, and policy relevance. Items were grouped into subscales, and subscales were analyzed for inter-rater reliability at tiered levels of aggregation. Results There were 160 items included in the subscales (out of 201 items total). Of those included in the subscales, 80 items (50.0%) had good/excellent reliability, 41 items (25.6%) had moderate reliability, and 18 items (11.3%) had low reliability, with limited variability in the remaining 21 items (13.1%). Seventeen of the 20 route section subscales, valence (positive/negative) scores, and overall scores (85.0%) demonstrated good/excellent reliability and 3 demonstrated moderate reliability. Of the 16 segment subscales, valence scores, and overall scores, 12 (75.0%) demonstrated good/excellent reliability, three demonstrated moderate reliability, and one demonstrated poor reliability. Of the 8 crossing subscales, valence scores, and overall scores, 6 (75.0%) demonstrated good/excellent reliability, and
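    For a single categorical item coded by two raters, the Kappa and percent-agreement statistics used in these inter-rater analyses reduce to a few lines. A minimal sketch, with invented rating vectors (the MAPS data themselves are not reproduced here):

```python
# Cohen's kappa and percent agreement for one item rated by two raters.
# Illustrative sketch; the rating vectors are hypothetical.
from collections import Counter

def kappa(rater_a, rater_b):
    n = len(rater_a)
    # Observed agreement: proportion of items the raters coded identically.
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement: product of the raters' marginal proportions per category.
    p_exp = sum(freq_a[c] / n * freq_b[c] / n for c in freq_a)
    return p_obs, (p_obs - p_exp) / (1 - p_exp)

agreement, k = kappa([1, 1, 0, 1, 0, 1, 1, 0], [1, 1, 0, 0, 0, 1, 1, 1])
```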

  19. Organizational Theories and Analysis: A Feminist Perspective

    NASA Astrophysics Data System (ADS)

    Irefin, Peace; Ifah, S. S.; Bwala, M. H.

    2012-06-01

    This paper is a critique of organization theories and their failure to come to terms with the reproduction of labour power within a particular form of the division of labour. It examines feminist theory, which aims to understand the nature of inequality and focuses on gender, power relations and sexuality. Part of the task of feminists, which organizational theories have neglected, is to offer an account of how the different treatments of the sexes operate in our culture. The paper concludes that gender has been completely neglected within organizational theory, which results in a rhetorical reproduction of males as the norm and women as others. It is argued that only a radical form of organization theory can account for the situation of women in organisational settings.

  20. Transformational Teaching: Connecting the Full-Range Leadership Theory and Graduate Teaching Practice

    ERIC Educational Resources Information Center

    Kim, Won J.

    2012-01-01

    Reliable measurements for effective teaching are lacking. In contrast, some theories of leadership (particularly transformational leadership) have been tested and found to have efficacy in a variety of organizational settings. In this study, the full-range leadership theory, which includes transformational leadership, was applied to the…

  1. 76 FR 16263 - Revision to Electric Reliability Organization Definition of Bulk Electric System

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-23

    ...'s Reliability Standards Development Process, to revise its definition of the term ``bulk electric... definition of ``bulk electric system'' through the NERC Standards Development Process to address the... undertake the process of revising the bulk electric system definition to address the Commission's concerns...

  2. An improved spanning tree approach for the reliability analysis of supply chain collaborative network

    NASA Astrophysics Data System (ADS)

    Lam, C. Y.; Ip, W. H.

    2012-11-01

    A higher degree of reliability in the collaborative network can increase the competitiveness and performance of an entire supply chain. As supply chain networks grow more complex, the consequences of unreliable behaviour become increasingly severe in terms of cost, effort and time. Moreover, it is computationally difficult to calculate the network reliability of a Non-deterministic Polynomial-time hard (NP-hard) all-terminal network using state enumeration, as this may require a huge number of iterations for topology optimisation. Therefore, this paper proposes an improved spanning tree approach for reliability analysis to help effectively evaluate and analyse the reliability of collaborative networks in supply chains and to reduce the comparative computational complexity of the algorithms. Set theory is employed to evaluate and model the all-terminal reliability of the improved spanning tree algorithm, and a case study of a supply chain used in lamp production is presented to illustrate the application of the proposed approach.
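    For context on why state enumeration is avoided: the brute-force calculation of all-terminal reliability sums the probability of every edge subset that leaves the network connected, which grows as 2^|E|. This sketch shows that baseline (not the paper's improved spanning-tree method) on a made-up three-node network with a uniform edge reliability:

```python
# Brute-force all-terminal reliability by state enumeration.
# Cost is exponential in the number of edges -- the complexity the improved
# spanning-tree approach is designed to reduce. Toy network for illustration.
from itertools import combinations

def connected(nodes, edges):
    """Depth-first search connectivity check over the surviving edges."""
    adj = {v: set() for v in nodes}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    seen, stack = set(), [next(iter(nodes))]
    while stack:
        v = stack.pop()
        if v not in seen:
            seen.add(v)
            stack.extend(adj[v] - seen)
    return seen == set(nodes)

def all_terminal_reliability(nodes, edges, p):
    """Sum P(state) over every edge subset that keeps all nodes connected."""
    total = 0.0
    for k in range(len(edges) + 1):
        for subset in combinations(edges, k):
            if connected(nodes, subset):
                total += p ** k * (1 - p) ** (len(edges) - k)
    return total

# Triangle network: closed form is R = 3 p^2 (1 - p) + p^3.
r = all_terminal_reliability({0, 1, 2}, [(0, 1), (1, 2), (0, 2)], 0.9)
```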

  3. Implications of complex adaptive systems theory for the design of research on health care organizations.

    PubMed

    McDaniel, Reuben R; Lanham, Holly Jordan; Anderson, Ruth A

    2009-01-01

    Because health care organizations (HCOs) are complex adaptive systems (CASs), phenomena of interest often are dynamic and unfold in unpredictable ways, and unfolding events are often unique. Researchers of HCOs may recognize that the subject of their research is dynamic; however, their research designs may not take this into account. Researchers may also know that unfolding events are often unique, but their design may not have the capacity to obtain information from meager evidence. These two concerns led us to examine two ideas from organizational theory: (a) the ideas of K. E. Weick (1993) on organizational design as a verb and (b) the ideas of J. G. March, L. S. Sproull, and M. Tamuz (1991) on learning from samples of one or fewer. In this article, we applied these ideas to develop an enriched perspective of research design for studying CASs. We conducted a theoretical analysis of organizations as CASs, identifying relevant characteristics for research designs. We then explored two ideas from organizational theory and discussed the implications for research designs. Weick's idea of "design as a verb" helps in understanding dynamic and process-oriented research design. The idea of "learning from samples of one or fewer" of March, Sproull, and Tamuz provides strategies for research design that enables learning from meager evidence. When studying HCOs, research designs are likely to be more effective when they (a) anticipate change, (b) include tension, (c) capitalize on serendipity, and (d) use an "act-then-look" mind set. Implications for practice are discussed. Practitioners who understand HCOs as CASs will be cautious in accepting findings from studies that treat HCOs mechanistically. They will consider the characteristics of CAS when evaluating the evidence base for practice. Practitioners can use the strategies proposed in this article to stimulate discussion with researchers seeking to conduct research in their HCO.

  4. The application of the statistical theory of extreme values to gust-load problems

    NASA Technical Reports Server (NTRS)

    Press, Harry

    1950-01-01

    An analysis is presented which indicates that the statistical theory of extreme values is applicable to the problems of predicting the frequency of encountering the larger gust loads and gust velocities, both for specific test conditions and for commercial transport operations. The extreme-value theory provides an analytic form for the distributions of maximum values of gust load and velocity. Methods of fitting the distribution are given, along with a method of estimating the reliability of the predictions. The theory of extreme values is applied to available load data from commercial transport operations. The results indicate that the estimates of the frequency of encountering the larger loads are more consistent with the data and more reliable than those obtained in previous analyses. (author)
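    The classical analytic form for distributions of maxima is the Gumbel distribution; a method-of-moments fit (assumed here as a stand-in, since the report's own fitting procedure is not reproduced) takes only a few lines. The per-flight maximum gust-load sample below is invented:

```python
# Method-of-moments fit of a Gumbel distribution to a sample of maxima,
# then an exceedance-probability query. Sketch only: the gust-load sample
# is hypothetical and the NACA report's procedures are not reproduced.
import math
import statistics

EULER_GAMMA = 0.5772156649015329

def fit_gumbel(maxima):
    beta = statistics.stdev(maxima) * math.sqrt(6) / math.pi  # scale
    mu = statistics.mean(maxima) - EULER_GAMMA * beta         # location (mode)
    return mu, beta

def exceedance(x, mu, beta):
    """P(maximum > x) under the fitted Gumbel distribution."""
    return 1.0 - math.exp(-math.exp(-(x - mu) / beta))

loads = [1.8, 2.1, 2.4, 1.9, 2.6, 2.2, 2.0, 2.8, 2.3, 2.5]  # hypothetical, in g
mu, beta = fit_gumbel(loads)
```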

  5. The reliability-quality relationship for quality systems and quality risk management.

    PubMed

    Claycamp, H Gregg; Rahaman, Faiad; Urban, Jason M

    2012-01-01

    Engineering reliability typically refers to the probability that a system, or any of its components, will perform a required function for a stated period of time and under specified operating conditions. As such, reliability is inextricably linked with time-dependent quality concepts, such as maintaining a state of control and predicting the chances of losses from failures for quality risk management. Two popular current good manufacturing practice (cGMP) and quality risk management tools, failure mode and effects analysis (FMEA) and root cause analysis (RCA) are examples of engineering reliability evaluations that link reliability with quality and risk. Current concepts in pharmaceutical quality and quality management systems call for more predictive systems for maintaining quality; yet, the current pharmaceutical manufacturing literature and guidelines are curiously silent on engineering quality. This commentary discusses the meaning of engineering reliability while linking the concept to quality systems and quality risk management. The essay also discusses the difference between engineering reliability and statistical (assay) reliability. The assurance of quality in a pharmaceutical product is no longer measured only "after the fact" of manufacturing. Rather, concepts of quality systems and quality risk management call for designing quality assurance into all stages of the pharmaceutical product life cycle. Interestingly, most assays for quality are essentially static and inform product quality over the life cycle only by being repeated over time. Engineering process reliability is the fundamental concept that is meant to anticipate quality failures over the life cycle of the product. Reliability is a well-developed theory and practice for other types of manufactured products and manufacturing processes. Thus, it is well known to be an appropriate index of manufactured product quality. 
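    The engineering definition quoted at the start of the abstract (the probability that a system performs a required function for a stated period) has a standard closed form when failure rates are constant over the period. A minimal sketch under that assumption, with made-up failure rates:

```python
# Reliability of components and of a series system under constant failure
# rates (exponential lifetimes). Illustrative sketch; rates are hypothetical.
import math

def reliability(lam, t):
    """P(no failure by time t) for a constant failure rate lam: R(t) = exp(-lam*t)."""
    return math.exp(-lam * t)

def series_reliability(lams, t):
    """A series system works only if every component works: multiply R(t)'s."""
    r = 1.0
    for lam in lams:
        r *= reliability(lam, t)
    return r

# One component at lam = 0.001 failures/hour, evaluated at t = 1000 h:
r_single = reliability(0.001, 1000)
r_series = series_reliability([0.001, 0.002], 1000)
```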

  6. Precise segmentation of multiple organs in CT volumes using learning-based approach and information theory.

    PubMed

    Lu, Chao; Zheng, Yefeng; Birkbeck, Neil; Zhang, Jingdan; Kohlberger, Timo; Tietjen, Christian; Boettger, Thomas; Duncan, James S; Zhou, S Kevin

    2012-01-01

    In this paper, we present a novel method that incorporates information theory into a learning-based approach for automatic and accurate pelvic organ segmentation (including the prostate, bladder and rectum). We target 3D CT volumes that are generated using different scanning protocols (e.g., contrast and non-contrast, with and without implant in the prostate, various resolutions and positions), and the volumes come from largely diverse sources (e.g., diseased in different organs). Three key ingredients are combined to solve this challenging segmentation problem. First, marginal space learning (MSL) is applied to efficiently and effectively localize the multiple organs in the largely diverse CT volumes. Second, learning-based boundary detectors using steerable features are applied for robust boundary detection, which enables the handling of highly heterogeneous texture patterns. Third, a novel information-theoretic scheme is incorporated into the boundary inference process. The incorporation of the Jensen-Shannon divergence further drives the mesh to the best fit of the image, thus improving segmentation performance. The proposed approach is tested on a challenging dataset containing 188 volumes from diverse sources. Our approach not only produces excellent segmentation accuracy, but also runs about eighty times faster than previous state-of-the-art solutions. The proposed method can be applied to CT images to provide visual guidance to physicians during computer-aided diagnosis, treatment planning and image-guided radiotherapy to treat cancers in the pelvic region.
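    The Jensen-Shannon divergence named in the abstract is a symmetrized, bounded variant of Kullback-Leibler divergence between two distributions; the paper applies it to image statistics during boundary inference, which is not reproduced here. A minimal standalone sketch of the quantity itself:

```python
# Jensen-Shannon divergence between two discrete distributions (in nats).
# Standalone sketch of the divergence only, not the paper's inference scheme.
import math

def kl(p, q):
    """Kullback-Leibler divergence, with the convention 0 * log(0/q) = 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js_divergence(p, q):
    """JSD(P, Q) = (KL(P || M) + KL(Q || M)) / 2, with M the midpoint of P and Q."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Identical distributions give 0; disjoint ones give the maximum, log 2.
same = js_divergence([0.5, 0.5], [0.5, 0.5])
disjoint = js_divergence([1.0, 0.0], [0.0, 1.0])
```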

  7. Reliability analysis of component-level redundant topologies for solid-state fault current limiter

    NASA Astrophysics Data System (ADS)

    Farhadi, Masoud; Abapour, Mehdi; Mohammadi-Ivatloo, Behnam

    2018-04-01

    Experience shows that semiconductor switches are the most vulnerable components in power electronics systems. One of the most common ways to address this reliability challenge is component-level redundant design, for which four configurations are possible. This article presents a comparative reliability analysis of the different component-level redundant designs for a solid-state fault current limiter, with the aim of determining the more reliable configuration. The mean time to failure (MTTF) is used as the reliability parameter. Considering both fault types (open circuit and short circuit), the MTTFs of the different configurations are calculated. It is demonstrated that the more reliable configuration depends on the steady-state junction temperature of the semiconductor switches, which is a function of (i) the ambient temperature, (ii) the power loss of the semiconductor switch and (iii) the thermal resistance of the heat sink. The sensitivity of the results to each parameter is also investigated. The results show that under different conditions, different configurations have higher reliability. Experimental results are presented to clarify the theory and the feasibility of the proposed approaches. Finally, the levelised costs of the different configurations are analysed for a fair comparison.
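    For exponential (constant-failure-rate) lifetimes, the MTTF comparison underlying such an analysis has simple closed forms for two identical switches. A sketch under that assumption (the failure rate is made up, and the junction-temperature dependence discussed in the article is not modelled):

```python
# MTTF under exponential lifetimes: a single switch, two switches where
# either failure brings the system down (series reliability structure),
# and two switches where one failure is tolerated (1-out-of-2 redundancy).
# Illustrative sketch; the failure rate is hypothetical and thermal
# effects on lam are not modelled.

def mttf_single(lam):
    return 1.0 / lam

def mttf_series(lam, n=2):
    # First failure of n units ends the mission: failure rates add.
    return 1.0 / (n * lam)

def mttf_redundant(lam, n=2):
    # System survives until the last unit fails: MTTF = sum over 1/(k*lam).
    return sum(1.0 / (k * lam) for k in range(1, n + 1))

lam = 1e-5  # failures per hour, hypothetical
single = mttf_single(lam)       # 1/lam
series = mttf_series(lam)       # 1/(2*lam)
redundant = mttf_redundant(lam) # 3/(2*lam)
```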

  8. Curriculum: From Theory to Practice

    ERIC Educational Resources Information Center

    Null, Wesley

    2011-01-01

    "Curriculum: From Theory to Practice" introduces readers to curriculum theory and how it relates to classroom practice. Wesley Null provides a unique organization of the curriculum field into five traditions: systematic, existential, radical, pragmatic, and deliberative. He discusses the philosophical foundations of curriculum as well as…

  9. Environmental control of sepalness and petalness in perianth organs of waterlilies: a new Mosaic Theory for the evolutionary origin of a differentiated perianth

    PubMed Central

    Warner, Kate A.; Rudall, Paula J.; Frohlich, Michael W.

    2009-01-01

    The conventional concept of an ‘undifferentiated perianth’, implying that all perianth organs of a flower are alike, obscures the fact that individual perianth organs are sometimes differentiated into sepaloid and petaloid regions, as in the early-divergent angiosperms Nuphar, Nymphaea, and Schisandra. In the waterlilies Nuphar and Nymphaea, sepaloid regions closely coincide with regions of the perianth that were exposed when the flower was in bud, whereas petaloid regions occur in covered regions, suggesting that their development is at least partly controlled by the environment of the developing tepal. Green and colourful areas differ from each other in trichome density and presence of papillae, features that often distinguish sepals and petals. Field experiments to test whether artificial exposure can induce sepalness in the inner tepals showed that development of sepaloid patches is initiated by exposure, at least in the waterlily species examined. Although light is an important environmental cue, other important factors include an absence of surface contact. Our interpretation contradicts the unspoken rule that ‘sepal’ and ‘petal’ must refer to whole organs. We propose a novel theory (the Mosaic theory), in which the distinction between sepalness and petalness evolved early in angiosperm history, but these features were not fixed to particular organs and were primarily environmentally controlled. At a later stage in angiosperm evolution, sepaloid and petaloid characteristics became fixed to whole organs in specific whorls, thus reducing or removing the need for environmental control in favour of fixed developmental control. PMID:19574253

  10. Extension of nanoconfined DNA: Quantitative comparison between experiment and theory

    NASA Astrophysics Data System (ADS)

    Iarko, V.; Werner, E.; Nyberg, L. K.; Müller, V.; Fritzsche, J.; Ambjörnsson, T.; Beech, J. P.; Tegenfeldt, J. O.; Mehlig, K.; Westerlund, F.; Mehlig, B.

    2015-12-01

    The extension of DNA confined to nanochannels has been studied intensively and in detail. However, quantitative comparisons between experiments and model calculations are difficult because most theoretical predictions involve undetermined prefactors, and because the model parameters (contour length, Kuhn length, effective width) are difficult to compute reliably, leading to substantial uncertainties. Here we use a recent asymptotically exact theory for the DNA extension in the "extended de Gennes regime" that allows us to compare experimental results with theory. For this purpose, we performed experiments measuring the mean DNA extension and its standard deviation while varying the channel geometry, dye intercalation ratio, and ionic strength of the buffer. The experimental results agree very well with theory at high ionic strengths, indicating that the model parameters are reliable. At low ionic strengths, the agreement is less good. We discuss possible reasons. In principle, our approach allows us to measure the Kuhn length and the effective width of a single DNA molecule and more generally of semiflexible polymers in solution.

  11. Critical Social Class Theory for Music Education

    ERIC Educational Resources Information Center

    Bates, Vincent C.

    2017-01-01

    This work of critical social theory explores how formal music education in modern capitalist societies mirrors the hierarchical, means-ends, one-dimensional structures of capitalism. So, rather than consistently or reliably empowering and emancipating children musically, school music can tend to marginalize, exploit, repress, and alienate. The…

  12. The Impact of Automation Reliability and Operator Fatigue on Performance and Reliance

    DTIC Science & Technology

    2016-09-23

    Reliability of automation is a key factor in an operator's reliance on automation. Previous work has shown that…

  13. Toward a Sociology of Criminological Theory

    ERIC Educational Resources Information Center

    Hauhart, Robert C.

    2012-01-01

    It is a truism to remind ourselves that scientific theory is a human product subject to many of the same social processes that govern other social acts. Science, however, whether social or natural, pretends to claim a higher mission, a more sophisticated methodology, and more consequential and reliable outcomes than human efforts arising from…

  14. Why do we need theories?

    PubMed Central

    Longo, Giuseppe; Soto, Ana M.

    2017-01-01

    Theories organize knowledge and construct objectivity by framing observations and experiments. The elaboration of theoretical principles is examined in the light of the rich interactions between physics and mathematics. These two disciplines share common principles of construction of concepts and of the proper objects of inquiry. Theory construction in physics relies on mathematical symmetries that preserve the key invariants observed and proposed by such theory; these invariants buttress the idea that the objects of physics are generic and thus interchangeable and they move along specific trajectories which are uniquely determined, in classical and relativistic physics. In contrast to physics, biology is a historical science that centers on the changes that organisms experience while undergoing ontogenesis and phylogenesis. Biological objects, namely organisms, are not generic but specific; they are individuals. The incessant changes they undergo represent the breaking of symmetries, and thus the opposite of symmetry conservation, a central component of physical theories. This instability corresponds to the changes of the environment and the phenotypes. Inspired by Galileo’s principle of inertia, the “default state” of inert matter, we propose a “default state” for biological dynamics following Darwin’s first principle, “descent with modification” that we transform into “proliferation with variation and motility” as a property that spans life, including cells in an organism. These dissimilarities between theories of the inert and of biology also apply to causality: biological causality is to be understood in relation to the distinctive role that constraints assume in this discipline. Consequently, the notion of cause will be reframed in a context where constraints to activity are seen as the core component of biological analyses. Finally, we assert that the radical materiality of life rules out distinctions such as “software vs. hardware

  15. Why do we need theories?

    PubMed

    Longo, Giuseppe; Soto, Ana M

    2016-10-01

    Theories organize knowledge and construct objectivity by framing observations and experiments. The elaboration of theoretical principles is examined in the light of the rich interactions between physics and mathematics. These two disciplines share common principles of construction of concepts and of the proper objects of inquiry. Theory construction in physics relies on mathematical symmetries that preserve the key invariants observed and proposed by such theory; these invariants buttress the idea that the objects of physics are generic and thus interchangeable and they move along specific trajectories which are uniquely determined, in classical and relativistic physics. In contrast to physics, biology is a historical science that centers on the changes that organisms experience while undergoing ontogenesis and phylogenesis. Biological objects, namely organisms, are not generic but specific; they are individuals. The incessant changes they undergo represent the breaking of symmetries, and thus the opposite of symmetry conservation, a central component of physical theories. This instability corresponds to the changes of the environment and the phenotypes. Inspired by Galileo's principle of inertia, the "default state" of inert matter, we propose a "default state" for biological dynamics following Darwin's first principle, "descent with modification" that we transform into "proliferation with variation and motility" as a property that spans life, including cells in an organism. These dissimilarities between theories of the inert and of biology also apply to causality: biological causality is to be understood in relation to the distinctive role that constraints assume in this discipline. Consequently, the notion of cause will be reframed in a context where constraints to activity are seen as the core component of biological analyses. Finally, we assert that the radical materiality of life rules out distinctions such as "software vs. hardware." Copyright © 2016

  16. The Evaluation Method of the Lightning Strike on Transmission Lines Aiming at Power Grid Reliability

    NASA Astrophysics Data System (ADS)

    Wen, Jianfeng; Wu, Jianwei; Huang, Liandong; Geng, Yinan; Yu, zhanqing

    2018-01-01

    Lightning protection of power systems focuses on reducing the flashover rate, distinguishing lines only by voltage level, without considering the functional differences between transmission lines or analysing the effect on power grid reliability. As a result, lightning protection design is surplus for general transmission lines but insufficient for key lines. To solve this problem, a method for analysing lightning strikes on transmission lines with respect to power grid reliability is given. Full-wave process theory is used to analyse lightning back striking; the leader propagation model is used to describe the process of shielding failure of transmission lines. An index of power grid reliability is introduced, and the effect of transmission line faults on the reliability of the power system is discussed in detail.

  17. The organic surface of 5145 Pholus: Constraints set by scattering theory

    NASA Technical Reports Server (NTRS)

    Wilson, Peter D.; Sagan, Carl; Thompson, W. Reid

    1994-01-01

    No known body in the Solar System has a spectrum redder than that of object 5145 Pholus. We use Hapke scattering theory and optical constants measured in this laboratory to examine the ability of mixtures of a number of organic solids and ices to reproduce the observed spectrum and phase variation. The primary materials considered are poly-HCN, kerogen, Murchison organic extract, Titan tholin, ice tholin, and water ice. In a computer grid search of over 10 million models, we find an intraparticle mixture of 15% Titan tholin, 10% poly-HCN, and 75% water ice with 10-micrometers particles to provide an excellent fit. Replacing water ice with ammonia ice improves the fits significantly while using a pure hydrocarbon tholin, Tholin alpha, instead of Titan tholin makes only modest improvements. All acceptable fits require Titan tholin or some comparable material to provide the steep slope in the visible, and poly-HCN or some comparable material to provide strong absorption in the near-infrared. A pure Titan tholin surface with 16-micrometers particles, as well as all acceptable Pholus models, fit the present spectrophotometric data for the transplutonian object 1992 QB(sub 1). The feasibility of gas-phase chemistry to generate material like Titan tholin on such small objects is examined. An irradiated transient atmosphere arising from sublimating ices may generate at most a few centimeters of tholin over the lifetime of the Solar System, but this is insignificant compared to the expected lag deposit of primordial contaminants left behind by the sublimating ice. Irradiation of subsurface N2/CH4 or NH3/CH4 ice by cosmic rays may generate approximately 20 cm of tholin in the upper 10 m of regolith in the same time scale but the identity of this tholin to its gas-phase equivalent has not been demonstrated.

  18. Comparing the Fit of Item Response Theory and Factor Analysis Models

    ERIC Educational Resources Information Center

    Maydeu-Olivares, Alberto; Cai, Li; Hernandez, Adolfo

    2011-01-01

    Linear factor analysis (FA) models can be reliably tested using test statistics based on residual covariances. We show that the same statistics can be used to reliably test the fit of item response theory (IRT) models for ordinal data (under some conditions). Hence, the fit of an FA model and of an IRT model to the same data set can now be…

  19. Neuropsychological Contributions to Theories of Part/Whole Organization.

    ERIC Educational Resources Information Center

    Robertson, Lynn C.; Lamb, Marvin R.

    1991-01-01

    It is proposed that there is a modular but interconnected system underlying the perceived hierarchical organization of objects. The discussion centers on neural and cognitive mechanisms of organizing objects within objects in at least four separate subsystems. (SLD)

  20. A conceptual framework for organismal biology: linking theories, models, and data.

    PubMed

    Zamer, William E; Scheiner, Samuel M

    2014-11-01

    Implicit or subconscious theory is especially common in the biological sciences. Yet, theory plays a variety of roles in scientific inquiry. First and foremost, it determines what does and does not count as a valid or interesting question or line of inquiry. Second, theory determines the background assumptions within which inquiries are pursued. Third, theory provides linkages among disciplines. For these reasons, it is important and useful to develop explicit theories for biology. A general theory of organisms is developed, which includes 10 fundamental principles that apply to all organisms, and 6 that apply to multicellular organisms only. The value of a general theory comes from its utility to help guide the development of more specific theories and models. That process is demonstrated by examining two domains: ecoimmunology and development. For the former, a constitutive theory of ecoimmunology is presented, and used to develop a specific model that explains energetic trade-offs that may result from an immunological response of a host to a pathogen. For the latter, some of the issues involved in trying to devise a constitutive theory that covers all of development are explored, and a more narrow theory of phenotypic novelty is presented. By its very nature, little of a theory of organisms will be new. Rather, the theory presented here is a formal expression of nearly two centuries of conceptual advances and practice in research. Any theory is dynamic and subject to debate and change. Such debate will occur as part of the present, initial formulation, as the ideas presented here are refined. The very process of debating the form of the theory acts to clarify thinking. The overarching goal is to stimulate debate about the role of theory in the study of organisms, and thereby advance our understanding of them. Published by Oxford University Press on behalf of the Society for Integrative and Comparative Biology 2014. 
This work is written by US Government employees.

  1. Hierarchical Recursive Organization and the Free Energy Principle: From Biological Self-Organization to the Psychoanalytic Mind

    PubMed Central

    Connolly, Patrick; van Deventer, Vasi

    2017-01-01

    The present paper argues that a systems theory epistemology (and particularly the notion of hierarchical recursive organization) provides the critical theoretical context within which the significance of Friston's (2010a) Free Energy Principle (FEP) for both evolution and psychoanalysis is best understood. Within this perspective, the FEP occupies a particular level of the hierarchical organization of the organism, which is the level of biological self-organization. This form of biological self-organization is in turn understood as foundational and pervasive to the higher levels of organization of the human organism that are of interest to both neuroscience as well as psychoanalysis. Consequently, central psychoanalytic claims should be restated, in order to be located in their proper place within a hierarchical recursive organization of the (situated) organism. In light of the FEP the realization of the psychoanalytic mind by the brain should be seen in terms of the evolution of different levels of systematic organization where the concepts of psychoanalysis describe a level of hierarchical recursive organization superordinate to that of biological self-organization and the FEP. The implication of this formulation is that while “psychoanalytic” mental processes are fundamentally subject to the FEP, they nonetheless also add their own principles of process over and above that of the FEP. A model found in Grobbelaar (1989) offers a recursive bottom-up description of the self-organization of the psychoanalytic ego as dependent on the organization of language (and affect), which is itself founded upon the tendency toward autopoiesis (self-making) within the organism, which is in turn described as formally similar to the FEP. Meaningful consilience between Grobbelaar's model and the hierarchical recursive description available in Friston's (2010a) theory is described. The paper concludes that the valuable contribution of the FEP to psychoanalysis underscores the

  2. Hierarchical Recursive Organization and the Free Energy Principle: From Biological Self-Organization to the Psychoanalytic Mind.

    PubMed

    Connolly, Patrick; van Deventer, Vasi

    2017-01-01

    The present paper argues that a systems theory epistemology (and particularly the notion of hierarchical recursive organization) provides the critical theoretical context within which the significance of Friston's (2010a) Free Energy Principle (FEP) for both evolution and psychoanalysis is best understood. Within this perspective, the FEP occupies a particular level of the hierarchical organization of the organism, which is the level of biological self-organization. This form of biological self-organization is in turn understood as foundational and pervasive to the higher levels of organization of the human organism that are of interest to both neuroscience as well as psychoanalysis. Consequently, central psychoanalytic claims should be restated, in order to be located in their proper place within a hierarchical recursive organization of the (situated) organism. In light of the FEP the realization of the psychoanalytic mind by the brain should be seen in terms of the evolution of different levels of systematic organization where the concepts of psychoanalysis describe a level of hierarchical recursive organization superordinate to that of biological self-organization and the FEP. The implication of this formulation is that while "psychoanalytic" mental processes are fundamentally subject to the FEP, they nonetheless also add their own principles of process over and above that of the FEP. A model found in Grobbelaar (1989) offers a recursive bottom-up description of the self-organization of the psychoanalytic ego as dependent on the organization of language (and affect), which is itself founded upon the tendency toward autopoiesis (self-making) within the organism, which is in turn described as formally similar to the FEP. Meaningful consilience between Grobbelaar's model and the hierarchical recursive description available in Friston's (2010a) theory is described. The paper concludes that the valuable contribution of the FEP to psychoanalysis underscores the

  3. Detecting Nonadditivity in Single-Facet Generalizability Theory Applications: Tukey's Test

    ERIC Educational Resources Information Center

    Lin, Chih-Kai; Zhang, Jinming

    2018-01-01

    Under the generalizability-theory (G-theory) framework, the estimation precision of variance components (VCs) is of significant importance in that they serve as the foundation of estimating reliability. Zhang and Lin advanced the discussion of nonadditivity in data from a theoretical perspective and showed the adverse effects of nonadditivity on…
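    Tukey's one-degree-of-freedom test named in the title can be sketched in a few lines: the F statistic compares the sum of squares captured by a multiplicative row-by-column term against the remaining residual. The persons-by-items tables below are synthetic (one additive, one with an injected interaction), not data from the study:

```python
import numpy as np

def tukey_nonadditivity_F(x):
    """Tukey's one-degree-of-freedom F statistic for nonadditivity in a
    two-way table (e.g. persons x items) with one observation per cell."""
    r, c = x.shape
    m = x.mean()
    a = x.mean(axis=1) - m                    # row effects
    b = x.mean(axis=0) - m                    # column effects
    resid = x - m - a[:, None] - b[None, :]
    num = float(a @ x @ b)                    # sum_ij a_i * x_ij * b_j
    ss_nonadd = num ** 2 / (np.sum(a ** 2) * np.sum(b ** 2))
    ss_resid = float(np.sum(resid ** 2))
    df = (r - 1) * (c - 1) - 1
    return ss_nonadd / ((ss_resid - ss_nonadd) / df)

# Synthetic tables: additive, and with a person-by-item interaction
# (exactly the multiplicative alternative Tukey's test targets).
rng = np.random.default_rng(0)
pers = np.linspace(-1.0, 1.0, 8)[:, None]
item = np.linspace(-1.0, 1.0, 5)[None, :]
additive = pers + item + rng.normal(scale=0.2, size=(8, 5))
nonadditive = additive + 0.8 * pers * item

F_add = tukey_nonadditivity_F(additive)
F_non = tukey_nonadditivity_F(nonadditive)
print(round(F_add, 2), round(F_non, 2))
```

    The injected interaction inflates the statistic sharply relative to the additive table, which is the signature the G-theory discussion builds on.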

  4. Metabolic theory predicts whole-ecosystem properties.

    PubMed

    Schramski, John R; Dell, Anthony I; Grady, John M; Sibly, Richard M; Brown, James H

    2015-02-24

    Understanding the effects of individual organisms on material cycles and energy fluxes within ecosystems is central to predicting the impacts of human-caused changes on climate, land use, and biodiversity. Here we present a theory that integrates metabolic (organism-based, bottom-up) and systems (ecosystem-based, top-down) approaches to characterize how the metabolism of individuals affects the flows and stores of materials and energy in ecosystems. The theory predicts how the average residence time of carbon molecules, total system throughflow (TST), and amount of recycling vary with the body size and temperature of the organisms and with trophic organization. We evaluate the theory by comparing theoretical predictions with outputs of numerical models designed to simulate diverse ecosystem types and with empirical data for real ecosystems. Although residence times within different ecosystems vary by orders of magnitude (from weeks in warm pelagic oceans with minute phytoplankton producers to centuries in cold forests with large tree producers), all ecosystems fall along a single line, as predicted: residence time increases linearly, with slope = 1.0, with the ratio of whole-ecosystem biomass to primary productivity (B/P). TST was affected predominantly by primary productivity, and recycling by the transfer of energy from microbial decomposers to animal consumers. The theory provides a robust basis for estimating the flux and storage of energy, carbon, and other materials in terrestrial, marine, and freshwater ecosystems and for quantifying the roles of different kinds of organisms and environments at scales from local ecosystems to the biosphere.
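    The predicted linear relation with slope 1.0 amounts to residence time ≈ B/P. A toy calculation with invented biomass and productivity values (matching carbon units assumed) illustrates the orders-of-magnitude spread the abstract describes:

```python
# Residence time tau ≈ B/P (slope 1.0); B and P values are invented,
# in matching carbon units, to mirror the ocean-vs-forest contrast.
ecosystems = {
    "warm pelagic ocean": (2.0, 100.0),      # tiny phytoplankton: B << P
    "cold forest": (20000.0, 100.0),         # large trees: B >> P
}
for name, (B, P) in ecosystems.items():
    print(f"{name}: residence time ≈ {B / P:g} time units")
```

    With equal productivity, the four-orders-of-magnitude difference in standing biomass alone produces the weeks-versus-centuries contrast.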

  5. Exploring the validity and reliability of a questionnaire for evaluating veterinary clinical teachers' supervisory skills during clinical rotations.

    PubMed

    Boerboom, T B B; Dolmans, D H J M; Jaarsma, A D C; Muijtjens, A M M; Van Beukelen, P; Scherpbier, A J J A

    2011-01-01

    Feedback to aid teachers in improving their teaching requires validated evaluation instruments. When implementing an evaluation instrument in a different context, it is important to collect validity evidence from multiple sources. We examined the validity and reliability of the Maastricht Clinical Teaching Questionnaire (MCTQ) as an instrument to evaluate individual clinical teachers during short clinical rotations in veterinary education. We examined four sources of validity evidence: (1) Content was examined based on theory of effective learning. (2) Response process was explored in a pilot study. (3) Internal structure was assessed by confirmatory factor analysis using 1086 student evaluations and reliability was examined utilizing generalizability analysis. (4) Relations with other relevant variables were examined by comparing factor scores with other outcomes. Content validity was supported by theory underlying the cognitive apprenticeship model on which the instrument is based. The pilot study resulted in an additional question about supervision time. A five-factor model showed a good fit with the data. Acceptable reliability was achievable with 10-12 questionnaires per teacher. Correlations between the factors and overall teacher judgement were strong. The MCTQ appears to be a valid and reliable instrument to evaluate clinical teachers' performance during short rotations.
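    The finding that 10-12 questionnaires per teacher suffice can be illustrated with a one-facet generalizability projection: the dependability of a teacher's mean score grows with the number of student raters, Spearman-Brown style. The variance components below are hypothetical, not the MCTQ estimates:

```python
def g_coefficient(var_teacher, var_residual, n_students):
    """Projected dependability of a teacher's mean score averaged over
    n_students questionnaires (one-facet G-study projection)."""
    return var_teacher / (var_teacher + var_residual / n_students)

# Hypothetical variance components (not the MCTQ estimates):
var_t, var_e = 0.2, 0.8
n = 1
while g_coefficient(var_t, var_e, n) < 0.70:
    n += 1
print(n)   # → 10
```

    With these assumed components, ten questionnaires push dependability past 0.70, consistent in spirit with the 10-12 reported in the abstract.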

  6. Theory and modeling of correlated ionic motions in hybrid organic-inorganic perovskites

    NASA Astrophysics Data System (ADS)

    Rappe, Andrew

    The perovskite crystal structure hosts a wealth of intriguing properties, and the renaissance of interest in halide (and hybrid organic-inorganic) perovskites (HOIPs) has further broadened the palette of exciting physical phenomena. Breakthroughs in HOIP synthesis, characterization, and solar cell design have led to remarkable increases in reported photovoltaic efficiency. However, the observed long carrier lifetime and PV performance have eluded comprehensive physical justification. The hybrid perovskites serve as an enigmatic crossroads of physics. Concepts from crystalline band theory, molecular physics, liquids, and phase transitions have been applied with some success, but the observations of HOIPs make it clear that none of these conceptual frameworks completely fits. In this talk, recent theoretical progress in understanding HOIPs will be reviewed and integrated with experimental findings. The large amplitude motions of HOIPs will be highlighted, including ionic diffusion, anharmonic phonons, and dynamic incipient order on various length and time scales. The intricate relationships between correlated structural fluctuations, polar order, and excited charge carrier dynamics will also be discussed. This work was supported by the Office of Naval Research, under Grant N00014-14-1-0761.

  7. Assessing transfer property and reliability of urban bus network based on complex network theory

    NASA Astrophysics Data System (ADS)

    Zhang, Hui; Zhuge, Cheng-Xiang; Zhao, Xiang; Song, Wen-Bo

    Transfer reliability has an important impact on the urban bus network. The proportion of trips requiring zero or one transfer is a key indicator of the connectivity of bus networks. However, it is hard to calculate the transfer times between nodes because of the complicated network structure. In this paper, the topological structures of the urban bus network in Jinan are constructed in space L and space P. A method to calculate transfer times between stations is proposed, based on the reachability matrix under space P. The result shows that it is efficient for calculating the transfer times between nodes in large networks. In order to test transfer reliability, a node-failure process is built according to degree, clustering coefficient, and betweenness centrality under space L and space P. The results show that a deliberate attack by betweenness centrality under space P is more effective than the other five attack modes. This research provides a powerful tool for finding hub stations in bus networks and helps traffic managers guarantee the normal operation of urban bus systems.
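    The reachability-matrix idea can be sketched as follows: in space P, stations sharing a route are adjacent, so the number of transfers between two stations is their shortest-path length minus one, obtainable from successive boolean powers of the adjacency matrix. The two-route toy network is invented for illustration:

```python
import numpy as np

def transfer_times(adj, max_transfers=5):
    """Transfer count between stations = shortest space-P path length
    minus 1 (a direct space-P link means zero transfers). adj is the
    boolean adjacency matrix of the space-P graph; -1 marks pairs not
    reachable within max_transfers."""
    n = adj.shape[0]
    result = np.full((n, n), -1)
    np.fill_diagonal(result, 0)
    frontier = adj.copy()              # paths of length k+1 at step k
    for k in range(max_transfers + 1):
        newly = frontier & (result == -1)
        result[newly] = k
        frontier = (frontier.astype(int) @ adj.astype(int)) > 0
    return result

# Toy network with two routes; space P links every station pair that
# shares a route.
routes = [{0, 1, 2}, {2, 3}]
adj = np.zeros((4, 4), dtype=bool)
for route in routes:
    for i in route:
        for j in route:
            if i != j:
                adj[i, j] = True

tt = transfer_times(adj)
print(tt[0, 3])   # → 1 (one transfer, at the shared station 2)
```

    Station pairs on the same route come out as zero transfers, and matrix powers extend the count to multi-transfer trips without explicit path enumeration.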

  8. The reliable solution and computation time of variable parameters logistic model

    NASA Astrophysics Data System (ADS)

    Wang, Pengfei; Pan, Xinnong

    2018-05-01

    The study investigates the reliable computation time (RCT, denoted T_c) of a double-precision computation of a variable-parameters logistic map (VPLM). First, using the proposed method, we obtain reliable solutions for the logistic map. Second, we construct 10,000 samples of reliable experiments from a time-dependent, non-stationary-parameters VPLM and calculate the mean T_c. The results indicate that the T_c values of the VPLM generally differ across initial values. However, the mean T_c tends to a constant value when the sample number is large enough. The maximum, minimum, and probability distribution functions of T_c are also obtained, which can help identify the robustness of applying nonlinear time-series theory to forecasting with the VPLM output. In addition, the T_c of fixed-parameter experiments of the logistic map is obtained, and the results suggest that this T_c matches the theoretically predicted value.
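    The notion of a reliable computation time can be illustrated with a crude proxy: iterate the fixed-parameter logistic map from two initial values one round-off apart and count the steps until the double-precision trajectories diverge beyond a tolerance. This is a simplification of the paper's method, which compares double-precision runs against reliable reference solutions:

```python
def logistic_divergence_time(x0, r=4.0, eps=1e-15, tol=1e-3, nmax=200):
    """Steps until two double-precision trajectories started eps apart
    diverge beyond tol; a rough proxy for the reliable computation time."""
    x, y = x0, x0 + eps
    for n in range(nmax):
        if abs(x - y) > tol:
            return n
        x = r * x * (1.0 - x)
        y = r * y * (1.0 - y)
    return nmax

t = logistic_divergence_time(0.3)
print(t)   # typically a few dozen steps in the chaotic regime r = 4
```

    Because round-off grows roughly exponentially at the Lyapunov rate, forecasts beyond this horizon reflect arithmetic noise rather than the map itself, which is the robustness question the abstract raises.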

  9. A Meta-Analysis of Institutional Theories

    DTIC Science & Technology

    1989-06-01

    Keywords: Institutional Theory, Isomorphism, Administrative Differentiation, Diffusion of Change, Rational, Unit of Analysis. ... institutional theory may lead to better decision making and evaluation criteria on the part of managers in the non-profit sector. C. SCOPE This paper... institutional theory: 1) Organizations evolving in environments with elaborated institutional rules create structure that conforms to those rules. 2

  10. Fatigue Reliability of Gas Turbine Engine Structures

    NASA Technical Reports Server (NTRS)

    Cruse, Thomas A.; Mahadevan, Sankaran; Tryon, Robert G.

    1997-01-01

    The results of an investigation are described for fatigue reliability in engine structures. The description consists of two parts. Part 1 is for method development. Part 2 is a specific case study. In Part 1, the essential concepts and practical approaches to damage tolerance design in the gas turbine industry are summarized. These have evolved over the years in response to flight safety certification requirements. The effect of Non-Destructive Evaluation (NDE) methods on these methods is also reviewed. Assessment methods based on probabilistic fracture mechanics, with regard to both crack initiation and crack growth, are outlined. Limit state modeling techniques from structural reliability theory are shown to be appropriate for application to this problem, for both individual failure mode and system-level assessment. In Part 2, the results of a case study for the high pressure turbine of a turboprop engine are described. The response surface approach is used to construct a fatigue performance function. This performance function is used with the First Order Reliability Method (FORM) to determine the probability of failure and the sensitivity of the fatigue life to the engine parameters for the first stage disk rim of the two stage turbine. A hybrid combination of regression and Monte Carlo simulation is used to incorporate time-dependent random variables. System reliability is used to determine the system probability of failure, and the sensitivity of the system fatigue life to the engine parameters of the high pressure turbine. The variation in the primary hot gas and secondary cooling air, the uncertainty of the complex mission loading, and the scatter in the material data are considered.
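    For a linear limit state with normal variables, the FORM calculation mentioned above reduces to a closed-form reliability index β, with probability of failure Φ(−β); a Monte Carlo run can cross-check it. The strength and load numbers below are illustrative, not engine data:

```python
import math
import random

# Linear limit state g = R - S with independent normal strength R and
# load S; the numbers are illustrative, not engine data.
mu_R, sd_R = 10.0, 1.0
mu_S, sd_S = 6.0, 1.5

# FORM is exact for a linear-normal limit state:
beta = (mu_R - mu_S) / math.hypot(sd_R, sd_S)     # reliability index
pf_form = 0.5 * math.erfc(beta / math.sqrt(2.0))  # Phi(-beta)

# Crude Monte Carlo cross-check
random.seed(1)
trials = 200_000
fails = sum(random.gauss(mu_R, sd_R) < random.gauss(mu_S, sd_S)
            for _ in range(trials))
pf_mc = fails / trials
print(f"beta={beta:.3f}  pf_FORM={pf_form:.5f}  pf_MC={pf_mc:.5f}")
```

    In the study the performance function is a fitted response surface rather than an explicit linear g, but the β-to-failure-probability mapping works the same way.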

  11. [The cell theory. Progress in studies on cell-cell communications].

    PubMed

    Brodskiĭ, V Ia

    2009-01-01

    Current data confirm the fundamental statement of the cell theory concerning the cell reproduction in a series of generations (omnis cellula e cellula). Cell communities or ensembles integrated by the signaling systems established in prokaryotes and protists and functioning in multicellular organisms including mammals are considered as the structural and functional unit of a multicellular organism. The cell is an elementary unit of life and basis of organism development and functioning. At the same time, the adult organism is not just a totality of cells. Multinucleated cells in some tissues, syncytial structure, and structural-functional units of organs are adaptations for optimal functioning of the multicellular organism and manifestations of cell-cell communications in development and definitive functioning. The cell theory was supplemented and developed by studies on cell-cell communications; however, these studies do not question the main generalizations of the theory.

  12. Interstellar organic chemistry.

    NASA Technical Reports Server (NTRS)

    Sagan, C.

    1972-01-01

    Most of the interstellar organic molecules have been found in the large radio source Sagittarius B2 toward the galactic center, and in such regions as W51 and the IR source in the Orion nebula. Questions of the reliability of molecular identifications are discussed together with aspects of organic synthesis in condensing clouds, degradational origin, synthesis on grains, UV natural selection, interstellar biology, and contributions to planetary biology.

  13. Career Development: Theory and Practice.

    ERIC Educational Resources Information Center

    Montross, David H., Ed.; Shinkman, Christopher J., Ed.

    This book explores the latest developments in the theory and practice of career development, as seen by 21 professionals in the field. The study is organized in four parts that cover the following areas: the latest thinking about career theory; the career stages of exploration, establishment, maintenance, and decline; current thinking about the…

  14. Toward a Unified Communication Theory.

    ERIC Educational Resources Information Center

    McMillan, Saundra

    After discussing the nature of theory itself, the author explains her concept of the Unified Communication Theory, which rests on the assumption that there exists in all living structures a potential communication factor which is delimited by species and ontogeny. An organism develops "symbol fixation" at the level where its perceptual abilities…

  15. Translating Theory into Practice: Implications of Japanese Management Theory for Student Personnel Administrators. NASPA Monograph Series Volume 3. First Edition.

    ERIC Educational Resources Information Center

    Deegan, William L.; And Others

    Japanese management theory was studied to identify specific models for consideration by student personnel administrators. The report is organized into three sections: major components of Japanese management theory, potential implications for student personnel administration, and three models, based on components of Japanese management theory, for…

  16. Notes on a Political Theory of Educational Organizations.

    ERIC Educational Resources Information Center

    Bacharach, Samuel B.

    This essay reviews major trends in methodological and theoretical approaches to the study of organizations since the mid-sixties and espouses the political analysis of organizations, a position representing a middle ground between comparative structuralism and the loosely coupled systems approach. This position emphasizes micropolitics as well as…

  17. Application of the IAS theory combining to a three compartments description of natural organic matter to the adsorption of atrazine or diuron on activated carbon.

    PubMed

    Baudu, M; Raveau, D; Guibaud, G

    2004-07-01

    The study of natural organic matter (NOM) adsorption on an activated carbon showed that the equilibrium cannot be described by a simple model such as a Freundlich isotherm, and confirms the need for a closer description of the organic matter to simulate its competitive adsorption with micropollutants. A representation of the organic matter in three fractions is chosen: non-adsorbable, weakly adsorbable, and strongly adsorbable. The Ideal Adsorbed Solution Theory (IAST) can, under restrictive conditions, be used to effectively predict the competition between the pesticides and the organic matter. It was noted that the model simulated with good precision the competition between atrazine or diuron and natural organic matter in aqueous solution for two activated carbons (A and B). The same parameters for the modeling of organic matter adsorption (Freundlich constants for the two adsorbable fractions) are used with the two pesticides. However, IAST does not correctly model pesticide adsorption onto two other activated carbons (C and D) in natural water. IAS theory does not reveal competition between diuron and NOM, and a pore-blockage mechanism by the NOM is proposed as the major cause of the reduction in adsorption capacity. The difference observed between the two pesticides could be due, in addition to the pore-blockage effect, to a phenomenon particular to diuron, especially with activated carbon D. We may suppose specific interactions between diuron and the adsorbed organic matter, and competition between adsorption sites on the NOM and the activated carbon surface.
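    The Freundlich isotherm q = K·C^(1/n) that the abstract takes as its single-solute baseline is linear in log-log space, so its constants can be fitted by ordinary least squares. The equilibrium data below are invented for illustration:

```python
import math

# Freundlich isotherm q = K * C**(1/n), linearized as
# ln q = ln K + (1/n) ln C; equilibrium data are invented.
C = [0.5, 1.0, 2.0, 4.0, 8.0]       # equilibrium concentration
q = [12.0, 16.0, 21.5, 28.5, 38.0]  # adsorbed amount

x = [math.log(c) for c in C]
y = [math.log(v) for v in q]
xbar = sum(x) / len(x)
ybar = sum(y) / len(y)
slope = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
         / sum((xi - xbar) ** 2 for xi in x))   # 1/n
K = math.exp(ybar - slope * xbar)
print(f"K = {K:.2f}, 1/n = {slope:.3f}")
```

    In the IAST treatment, a pair of such (K, 1/n) constants is fitted per adsorbable NOM fraction and then combined with the pesticide isotherm to predict competitive uptake.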

  18. Effects of Differential Item Functioning on Examinees' Test Performance and Reliability of Test

    ERIC Educational Resources Information Center

    Lee, Yi-Hsuan; Zhang, Jinming

    2017-01-01

    Simulations were conducted to examine the effect of differential item functioning (DIF) on measurement consequences such as total scores, item response theory (IRT) ability estimates, and test reliability in terms of the ratio of true-score variance to observed-score variance and the standard error of estimation for the IRT ability parameter. The…

  19. Neurodynamic system theory: scope and limits.

    PubMed

    Erdi, P

    1993-06-01

    This paper proposes that neurodynamic system theory may be used to connect structural and functional aspects of neural organization. The paper claims that generalized causal dynamic models are proper tools for describing the self-organizing mechanism of the nervous system. In particular, it is pointed out that ontogeny, development, normal performance, learning, and plasticity, can be treated by coherent concepts and formalism. Taking into account the self-referential character of the brain, autopoiesis, endophysics and hermeneutics are offered as elements of a poststructuralist brain (-mind-computer) theory.

  20. 75 FR 71613 - Mandatory Reliability Standards for Interconnection Reliability Operating Limits

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-24

    ... Reliability Standards. The proposed Reliability Standards were designed to prevent instability, uncontrolled... the SOLs, which if exceeded, could expose a widespread area of the bulk electric system to instability...

  1. The reliability paradox of the Parent-Child Conflict Tactics Corporal Punishment Subscale.

    PubMed

    Lorber, Michael F; Slep, Amy M Smith

    2018-02-01

    In the present investigation we consider and explain an apparent paradox in the measurement of corporal punishment with the Parent-Child Conflict Tactics Scale (CTS-PC): How can it have poor internal consistency and still be reliable? The CTS-PC was administered to a community sample of 453 opposite sex couples who were parents of 3- to 7-year-old children. Internal consistency was marginal, yet item response theory analyses revealed that reliability rose sharply with increasing corporal punishment, exceeding .80 in the upper ranges of the construct. The results suggest that the CTS-PC Corporal Punishment subscale reliably discriminates among parents who report average to high corporal punishment (64% of mothers and 56% of fathers in the present sample), despite low overall internal consistency. These results have straightforward implications for the use and reporting of the scale. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
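    The "reliable at the top, unreliable overall" pattern falls out of IRT's conditional reliability: with a unit-variance latent trait, reliability at θ is I(θ)/(I(θ)+1), and items with high thresholds are informative only at high trait levels. The 2PL item parameters below are hypothetical, chosen to mimic a scale whose items are endorsed mainly by high scorers:

```python
import math

def item_info_2pl(theta, a, b):
    """Fisher information of a 2PL item at trait level theta."""
    p = 1.0 / (1.0 + math.exp(-a * (theta - b)))
    return a * a * p * (1.0 - p)

def conditional_reliability(theta, items):
    """With a unit-variance latent trait, reliability at theta is
    I/(I+1), since the squared standard error is 1/I."""
    info = sum(item_info_2pl(theta, a, b) for a, b in items)
    return info / (info + 1.0)

# Hypothetical (discrimination a, difficulty b) pairs with high b:
# informative only in the upper trait range.
items = [(1.5, 1.0), (1.7, 1.5), (1.3, 2.0), (1.6, 2.5)]
for theta in (-1.0, 0.0, 1.0, 2.0):
    print(theta, round(conditional_reliability(theta, items), 2))
```

    Reliability climbs steeply with θ, so a scale can discriminate well among frequent users of corporal punishment while its overall internal consistency stays low.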

  2. From Theory to Practice: Measuring end-of-life communication quality using multiple goals theory.

    PubMed

    Van Scoy, L J; Scott, A M; Reading, J M; Chuang, C H; Chinchilli, V M; Levi, B H; Green, M J

    2017-05-01

    To describe how multiple goals theory can be used as a reliable and valid measure (i.e., coding scheme) of the quality of conversations about end-of-life issues. We analyzed 17 conversations in which 68 participants (mean age = 51 years) played a game that prompted discussion in response to open-ended questions about end-of-life issues. Conversations (mean duration = 91 min) were audio-recorded and transcribed. Communication quality was assessed by three coders who assigned numeric scores rating how well individuals accomplished task, relational, and identity goals in the conversation. The coding measure, which results in a quantifiable outcome, yielded strong reliability (intra-class correlation range = 0.73-0.89 and Cronbach's alpha range = 0.69-0.89 for each of the coded domains) and validity (using multilevel nonlinear modeling, we detected significant variability in scores between games for each of the coded domains, all p-values <0.02). Our coding scheme provides a theory-based measure of end-of-life conversation quality that is superior to other methods of measuring communication quality. Our description of the coding method enables researchers to adapt and apply this measure to communication interventions in other clinical contexts. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
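    One conventional complement to the intra-class correlations reported above is Cronbach's alpha computed across coders, treating each coder as an "item". A minimal sketch with invented ratings:

```python
import statistics as st

def cronbach_alpha(ratings):
    """ratings: one row per coder, one column per conversation.
    Treats coders as 'items' and computes alpha on their scores."""
    k = len(ratings)
    totals = [sum(col) for col in zip(*ratings)]
    item_var = sum(st.pvariance(r) for r in ratings)
    return k / (k - 1) * (1 - item_var / st.pvariance(totals))

# Three hypothetical coders rating six conversations on a 1-5 scale:
coders = [
    [4, 5, 3, 4, 2, 5],
    [4, 4, 3, 5, 2, 5],
    [3, 5, 4, 4, 1, 5],
]
alpha = cronbach_alpha(coders)
print(round(alpha, 2))   # → 0.93
```

    High agreement among coders drives alpha toward 1; values in the abstract's 0.69-0.89 range indicate moderate-to-strong consistency per coded domain.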

  3. Reliability Growth and Its Applications to Dormant Reliability

    DTIC Science & Technology

    1981-12-01

    ability to make projections about future reliability (Ref 9:41-42). Barlow and Scheuer Model. Richard E. Barlow and Ernest M. Scheuer, of the University... Reliability Growth Prediction Models," Operations Research, 18(1):52-65 (January/February 1970). 7. Bauer, John, William Hadley, and Robert Dietz... Texarkana, Texas, May 1973. (AD 768 119). 10. Bonis, Austin J. "Reliability Growth Curves for One Shot Devices," Proceedings 1977 Annual Reliability and

  4. Theory of mind predicts severity level in autism.

    PubMed

    Hoogenhout, Michelle; Malcolm-Smith, Susan

    2017-02-01

    We investigated whether theory of mind skills can indicate autism spectrum disorder severity. In all, 62 children with autism spectrum disorder completed a developmentally sensitive theory of mind battery. We used intelligence quotient, Diagnostic and Statistical Manual of Mental Disorders (4th ed.) diagnosis and level of support needed as indicators of severity level. Using hierarchical cluster analysis, we found three distinct clusters of theory of mind ability: early-developing theory of mind (Cluster 1), false-belief reasoning (Cluster 2) and sophisticated theory of mind understanding (Cluster 3). The clusters corresponded to severe, moderate and mild autism spectrum disorder. As an indicator of level of support needed, cluster grouping predicted the type of school children attended. All Cluster 1 children attended autism-specific schools; Cluster 2 was divided between autism-specific and special needs schools and nearly all Cluster 3 children attended general special needs and mainstream schools. Assessing theory of mind skills can reliably discriminate severity levels within autism spectrum disorder.

  5. Measuring the emotional climate of an organization.

    PubMed

    Yurtsever, Gülçimen; De Rivera, Joseph

    2010-04-01

    Emotional climate has gained increasing attention in the organizational climate literature. However, few studies have concentrated on adequately measuring the emotional climate of organizations. In this study, a reliable and valid scale was developed to measure the most important aspects of emotional climate in different organizations. This study presents evidence of reliability and validity for 28 items constructed to measure emotional climate in an organization, across four separate studies. The data were obtained from working people in four different organizations by self-administered questionnaires. The findings indicate that three factors (Trust, Hope, and Security) underlie the 28-item scale. Validation data also included correlations with duration of employment. Criterion validity was also assessed by comparing mean scores in organizations with differing productivity; results indicated that the organization with more productive members had a significantly higher mean score on emotional climate and its subscales. The generalizability of the results to private businesses was also assessed.

  6. General theory for multiple input-output perturbations in complex molecular systems. 1. Linear QSPR electronegativity models in physical, organic, and medicinal chemistry.

    PubMed

    González-Díaz, Humberto; Arrasate, Sonia; Gómez-SanJuan, Asier; Sotomayor, Nuria; Lete, Esther; Besada-Porto, Lina; Ruso, Juan M

    2013-01-01

    In general, perturbation methods start with a known exact solution of a problem and add "small" variation terms in order to approach the solution of a related problem without a known exact solution. Perturbation theory has been widely used in almost all areas of science. Bohr's quantum model, Heisenberg's matrix mechanics, Feynman diagrams, and Poincaré's chaos model or "butterfly effect" in complex systems are examples of perturbation theories. On the other hand, the study of Quantitative Structure-Property Relationships (QSPR) in molecular complex systems is an ideal area for the application of perturbation theory. There are several problems with exact experimental solutions (new chemical reactions, physicochemical properties, drug activity and distribution, metabolic networks, etc.) in public databases like CHEMBL. However, in all these cases, we have an even larger list of related problems without known solutions. We need to know the change in all these properties after a perturbation of the initial boundary conditions. That is, when we test large sets of similar, but different, compounds and/or chemical reactions under slightly different conditions (temperature, time, solvents, enzymes, assays, protein targets, tissues, partition systems, organisms, etc.). However, to the best of our knowledge, there is no QSPR general-purpose perturbation theory to solve this problem. In this work, we first review general aspects and applications of both perturbation theory and QSPR models. Second, we formulate a general-purpose perturbation theory for multiple-boundary QSPR problems. Last, we develop three new QSPR-perturbation theory models. The first model correctly classifies >100,000 pairs of intramolecular carbolithiations with 75-95% Accuracy (Ac), Sensitivity (Sn), and Specificity (Sp). The model predicts probabilities of variations in the yield and enantiomeric excess of reactions due to at least one perturbation in boundary conditions (solvent, temperature

  7. Targeting helicase-dependent amplification products with an electrochemical genosensor for reliable and sensitive screening of genetically modified organisms.

    PubMed

    Moura-Melo, Suely; Miranda-Castro, Rebeca; de-Los-Santos-Álvarez, Noemí; Miranda-Ordieres, Arturo J; Dos Santos Junior, J Ribeiro; da Silva Fonseca, Rosana A; Lobo-Castañón, Maria Jesús

    2015-08-18

    Cultivation of genetically modified organisms (GMOs) and their use in food and feed is constantly expanding; thus, the question of informing consumers about their presence in food has proven of significant interest. The development of sensitive, rapid, robust, and reliable methods for the detection of GMOs is crucial for proper food labeling. In response, we have experimentally characterized the helicase-dependent isothermal amplification (HDA) and sequence-specific detection of a transgene from the Cauliflower Mosaic Virus 35S Promoter (CaMV35S), inserted into most transgenic plants. HDA is one of the simplest approaches for DNA amplification, emulating the bacterial replication machinery, and resembling PCR but under isothermal conditions. However, it usually suffers from a lack of selectivity, which is due to the accumulation of spurious amplification products. To improve the selectivity of HDA, which makes the detection of amplification products more reliable, we have developed an electrochemical platform targeting the central sequence of HDA copies of the transgene. A binary monolayer architecture is built onto a thin gold film where, upon the formation of perfect nucleic acid duplexes with the amplification products, these are enzyme-labeled and electrochemically transduced. The resulting combined system increases genosensor detectability up to 10^6-fold, allowing Yes/No detection of GMOs with a limit of detection of ∼30 copies of the CaMV35S genomic DNA. A set of general utility rules in the design of genosensors for detection of HDA amplicons, which may assist in the development of point-of-care tests, is also included. The method provides a versatile tool for detecting nucleic acids with extremely low abundance not only for food safety control but also in the diagnostics and environmental control areas.

  8. Types and status of high reliability practices in the federal fire community (Abstract)

    Treesearch

    Anne Black; Brooke McBride

    2012-01-01

    Since emerging in the late 1980s, the paradigm of High Reliability Organizing (HRO) has sought to describe how and why certain organizations consistently function safely under hazardous conditions. Researchers have explored a variety of industrial situations, from the early observations of nuclear aircraft carrier operations to nursing units and even high-tech IPOs....

  9. A Practical Solution to Optimizing the Reliability of Teaching Observation Measures under Budget Constraints

    ERIC Educational Resources Information Center

    Meyer, J. Patrick; Liu, Xiang; Mashburn, Andrew J.

    2014-01-01

    Researchers often use generalizability theory to estimate relative error variance and reliability in teaching observation measures. They also use it to plan future studies and design the best possible measurement procedures. However, designing the best possible measurement procedure comes at a cost, and researchers must stay within their budget…

  10. Development, test-retest reliability and validity of the Pharmacy Value-Added Services Questionnaire (PVASQ)

    PubMed Central

    Tan, Christine L.; Hassali, Mohamed A.; Saleem, Fahad; Shafie, Asrul A.; Aljadhey, Hisham; Gan, Vincent B.

    2015-01-01

    Objective: (i) To develop the Pharmacy Value-Added Services Questionnaire (PVASQ) using emerging themes generated from interviews. (ii) To establish reliability and validity of the questionnaire instrument. Methods: Using an extended Theory of Planned Behavior as the theoretical model, face-to-face interviews generated salient beliefs of pharmacy value-added services. The PVASQ was constructed initially in English incorporating important themes and later translated into the Malay language with forward and backward translation. Intention (INT) to adopt pharmacy value-added services is predicted by attitudes (ATT), subjective norms (SN), perceived behavioral control (PBC), knowledge and expectations. Using a 7-point Likert-type scale and a dichotomous scale, test-retest reliability (N=25) was assessed by administering the questionnaire instrument twice at an interval of one week apart. Internal consistency was measured by Cronbach's alpha and construct validity between two administrations was assessed using the kappa statistic and the intraclass correlation coefficient (ICC). Confirmatory Factor Analysis, CFA (N=410) was conducted to assess construct validity of the PVASQ. Results: The kappa coefficients indicate a moderate to almost perfect strength of agreement between test and retest. The ICC for all scales tested for intra-rater (test-retest) reliability was good. The overall Cronbach's alpha (N=25) is 0.912 and 0.908 for the two time points. The result of CFA (N=410) showed most items loaded strongly and correctly into corresponding factors. Only one item was eliminated. Conclusions: This study is the first to develop and establish the reliability and validity of the Pharmacy Value-Added Services Questionnaire instrument using the Theory of Planned Behavior as the theoretical model. The translated Malay language version of PVASQ is reliable and valid to predict Malaysian patients' intention to adopt pharmacy value-added services to collect partial medicine

  11. Development, test-retest reliability and validity of the Pharmacy Value-Added Services Questionnaire (PVASQ).

    PubMed

    Tan, Christine L; Hassali, Mohamed A; Saleem, Fahad; Shafie, Asrul A; Aljadhey, Hisham; Gan, Vincent B

    2015-01-01

    (i) To develop the Pharmacy Value-Added Services Questionnaire (PVASQ) using emerging themes generated from interviews. (ii) To establish reliability and validity of the questionnaire instrument. Using an extended Theory of Planned Behavior as the theoretical model, face-to-face interviews generated salient beliefs of pharmacy value-added services. The PVASQ was constructed initially in English incorporating important themes and later translated into the Malay language with forward and backward translation. Intention (INT) to adopt pharmacy value-added services is predicted by attitudes (ATT), subjective norms (SN), perceived behavioral control (PBC), knowledge and expectations. Using a 7-point Likert-type scale and a dichotomous scale, test-retest reliability (N=25) was assessed by administering the questionnaire instrument twice at an interval of one week apart. Internal consistency was measured by Cronbach's alpha and construct validity between two administrations was assessed using the kappa statistic and the intraclass correlation coefficient (ICC). Confirmatory Factor Analysis, CFA (N=410) was conducted to assess construct validity of the PVASQ. The kappa coefficients indicate a moderate to almost perfect strength of agreement between test and retest. The ICC for all scales tested for intra-rater (test-retest) reliability was good. The overall Cronbach's alpha (N=25) is 0.912 and 0.908 for the two time points. The result of CFA (N=410) showed most items loaded strongly and correctly into corresponding factors. Only one item was eliminated. This study is the first to develop and establish the reliability and validity of the Pharmacy Value-Added Services Questionnaire instrument using the Theory of Planned Behavior as the theoretical model. The translated Malay language version of PVASQ is reliable and valid to predict Malaysian patients' intention to adopt pharmacy value-added services to collect partial medicine supply.
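    The internal-consistency statistic used in the two records above, Cronbach's alpha, can be computed directly from an item-score matrix. A minimal sketch in Python (NumPy assumed; the data below are illustrative only, not the PVASQ responses):

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """items: (n_respondents, n_items) score matrix.
        alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_vars / total_var)

    # perfectly parallel items give alpha = 1.0
    print(cronbach_alpha([[1, 1], [2, 2], [3, 3]]))  # → 1.0
    ```

    Values near the study's reported 0.91 indicate that most variance in total scores is shared across items rather than item-specific noise.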

  12. Computational methods for efficient structural reliability and reliability sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.

    1993-01-01

    This paper presents recent developments in efficient structural reliability analysis methods. The paper proposes an efficient, adaptive importance sampling (AIS) method that can be used to compute reliability and reliability sensitivities. The AIS approach uses a sampling density that is proportional to the joint PDF of the random variables. Starting from an initial approximate failure domain, sampling proceeds adaptively and incrementally with the goal of reaching a sampling domain that is slightly greater than the failure domain to minimize over-sampling in the safe region. Several reliability sensitivity coefficients are proposed that can be computed directly and easily from the above AIS-based failure points. These probability sensitivities can be used for identifying key random variables and for adjusting design to achieve reliability-based objectives. The proposed AIS methodology is demonstrated using a turbine blade reliability analysis problem.
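    The importance-sampling idea in the abstract above can be illustrated in a much simplified, non-adaptive form: sample from a density shifted toward the failure region and reweight by the likelihood ratio. The standard-normal limit state g(x) = 3 - x below is a made-up example, not the paper's turbine-blade problem:

    ```python
    import numpy as np

    def norm_pdf(x, mu=0.0):
        # unit-variance normal density
        return np.exp(-0.5 * (x - mu) ** 2) / np.sqrt(2.0 * np.pi)

    def failure_prob_is(g, mu_shift, n=200_000, seed=0):
        """Estimate P(g(X) < 0) for X ~ N(0,1) by sampling from N(mu_shift, 1)
        and reweighting each sample by the density ratio (importance sampling)."""
        rng = np.random.default_rng(seed)
        x = rng.normal(mu_shift, 1.0, size=n)
        w = norm_pdf(x) / norm_pdf(x, mu_shift)  # likelihood ratio weights
        return float(np.mean((g(x) < 0.0) * w))

    # failure when x > 3; exact P_f = 1 - Phi(3) ≈ 1.35e-3
    pf = failure_prob_is(lambda x: 3.0 - x, mu_shift=3.0)
    ```

    Centering the sampling density on the failure boundary concentrates samples where failures occur, which is the same motivation as the paper's adaptive scheme, there achieved by growing the sampling domain incrementally from an approximate failure domain.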

  13. A Note on the Reliability Coefficients for Item Response Model-Based Ability Estimates

    ERIC Educational Resources Information Center

    Kim, Seonghoon

    2012-01-01

    Assuming item parameters on a test are known constants, the reliability coefficient for item response theory (IRT) ability estimates is defined for a population of examinees in two different ways: as (a) the product-moment correlation between ability estimates on two parallel forms of a test and (b) the squared correlation between the true…

  14. Studies on Mathematical Models of Wet Adhesion and Lifetime Prediction of Organic Coating/Steel by Grey System Theory.

    PubMed

    Meng, Fandi; Liu, Ying; Liu, Li; Li, Ying; Wang, Fuhui

    2017-06-28

    A rapid degradation of wet adhesion is the key factor controlling coating lifetime, for the organic coatings under marine hydrostatic pressure. The mathematical models of wet adhesion have been studied by Grey System Theory (GST). Grey models (GM) (1, 1) of epoxy varnish (EV) coating/steel and epoxy glass flake (EGF) coating/steel have been established, and a lifetime prediction formula has been proposed on the basis of these models. The precision assessments indicate that the established models are accurate, and the prediction formula is capable of making precise lifetime forecasting of the coatings.
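    The GM(1,1) model named above is a standard grey-system construction: cumulate the series, fit the whitened equation dx1/dt + a·x1 = b by least squares over the background (mean) sequence, then forecast and difference back. A generic sketch on illustrative data (not the coating adhesion measurements):

    ```python
    import numpy as np

    def gm11_forecast(x0, horizon=1):
        """Fit a GM(1,1) grey model to a positive series x0 and return
        fitted values plus `horizon` forecast steps."""
        x0 = np.asarray(x0, dtype=float)
        n = len(x0)
        x1 = np.cumsum(x0)                    # accumulated generating operation
        z1 = 0.5 * (x1[1:] + x1[:-1])         # background (mean) sequence
        B = np.column_stack([-z1, np.ones(n - 1)])
        a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
        k = np.arange(n + horizon)
        x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
        return np.concatenate([[x1_hat[0]], np.diff(x1_hat)])

    # near-geometric data: the model recovers the growth trend
    series = 2.0 * 1.1 ** np.arange(5)
    pred = gm11_forecast(series, horizon=1)[-1]  # ≈ 2.0 * 1.1**5 ≈ 3.22
    ```

    Because GM(1,1) assumes an underlying exponential trend, it suits the monotonic degradation of wet adhesion described in the abstract, where only a short measurement series is available.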

  15. Lifetime Reliability Evaluation of Structural Ceramic Parts with the CARES/LIFE Computer Program

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Powers, Lynn M.; Janosik, Lesley A.; Gyekenyesi, John P.

    1993-01-01

    The computer program CARES/LIFE calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. This program is an extension of the CARES (Ceramics Analysis and Reliability Evaluation of Structures) computer program. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing the power law, Paris law, or Walker equation. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled using either the principle of independent action (PIA), Weibull's normal stress averaging method (NSA), or Batdorf's theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. Two example problems demonstrating cyclic fatigue parameter estimation and component reliability analysis with proof testing are included.
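    The two-parameter Weibull strength model and the principle of independent action (PIA) cited above are easy to state in code. A hedged sketch with generic scale/modulus values (not CARES/LIFE itself, which also handles volume integration and subcritical crack growth):

    ```python
    import math

    def weibull_survival(stress, scale, modulus):
        """Two-parameter Weibull probability of survival at a uniaxial stress."""
        return math.exp(-(stress / scale) ** modulus)

    def pia_survival(principal_stresses, scale, modulus):
        """Principle of independent action: each tensile principal stress
        contributes an independent Weibull failure risk; survival multiplies."""
        r = 1.0
        for s in principal_stresses:
            if s > 0.0:                  # compressive stresses are ignored
                r *= weibull_survival(s, scale, modulus)
        return r

    # at stress equal to the characteristic strength, survival is exp(-1)
    p = weibull_survival(300.0, 300.0, 10.0)
    ```

    A high Weibull modulus makes survival drop sharply near the characteristic strength, which is why modulus estimation from rupture data, as in the program's parameter-estimation step, matters so much.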

  16. Reliability analysis applied to structural tests

    NASA Technical Reports Server (NTRS)

    Diamond, P.; Payne, A. O.

    1972-01-01

    The application of reliability theory to predict, from structural fatigue test data, the risk of failure of a structure under service conditions because its load-carrying capability is progressively reduced by the extension of a fatigue crack, is considered. The procedure is applicable to both safe-life and fail-safe structures and, for a prescribed safety level, it will enable an inspection procedure to be planned or, if inspection is not feasible, it will evaluate the life to replacement. The theory has been further developed to cope with the case of structures with initial cracks, such as can occur in modern high-strength materials which are susceptible to the formation of small flaws during the production process. The method has been applied to a structure of high-strength steel and the results are compared with those obtained by the current life estimation procedures. This has shown that the conventional methods can be unconservative in certain cases, depending on the characteristics of the structure and the design operating conditions. The suitability of the probabilistic approach to the interpretation of the results from full-scale fatigue testing of aircraft structures is discussed and the assumptions involved are examined.

  17. Software Reliability 2002

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores R.

    2003-01-01

    In FY01 we learned that hardware reliability models need substantial changes to account for differences in software, thus making software reliability measurements more effective, accurate, and easier to apply. These reliability models are generally based on familiar distributions or parametric methods. An obvious question is "What new statistical and probability models can be developed using non-parametric and distribution-free methods instead of the traditional parametric method?" Two approaches to software reliability engineering appear somewhat promising. The first study, begun in FY01, is based in hardware reliability, a very well established science that has many aspects that can be applied to software. This research effort has investigated mathematical aspects of hardware reliability and has identified those applicable to software. Currently the research effort is applying and testing these approaches to software reliability measurement. These parametric models require much project data that may be difficult to apply and interpret. Projects at GSFC are often complex in both technology and schedules. Assessing and estimating reliability of the final system is extremely difficult when various subsystems are tested and completed long before others. Parametric and distribution free techniques may offer a new and accurate way of modeling failure time and other project data to provide earlier and more accurate estimates of system reliability.

  18. Implications of complex adaptive systems theory for the design of research on health care organizations

    PubMed Central

    McDaniel, Reuben R.; Lanham, Holly Jordan; Anderson, Ruth A.

    2013-01-01

    Background Because health care organizations (HCOs) are complex adaptive systems (CASs), phenomena of interest often are dynamic and unfold in unpredictable ways, and unfolding events are often unique. Researchers of HCOs may recognize that the subject of their research is dynamic; however, their research designs may not take this into account. Researchers may also know that unfolding events are often unique, but their design may not have the capacity to obtain information from meager evidence. Purpose These two concerns led us to examine two ideas from organizational theory: (a) the ideas of K. E. Weick (1993) on organizational design as a verb and (b) the ideas of J. G. March, L. S. Sproull, and M. Tamuz (1991) on learning from samples of one or fewer. In this article, we applied these ideas to develop an enriched perspective of research design for studying CASs. Methodology/Approach We conducted a theoretical analysis of organizations as CASs, identifying relevant characteristics for research designs. We then explored two ideas from organizational theory and discussed the implications for research designs. Findings Weick's idea of “design as a verb” helps in understanding dynamic and process-oriented research design. The idea of “learning from samples of one or fewer” of March, Sproull, and Tamuz provides strategies for research design that enables learning from meager evidence. When studying HCOs, research designs are likely to be more effective when they (a) anticipate change, (b) include tension, (c) capitalize on serendipity, and (d) use an “act-then-look” mind set. Implications for practice are discussed. Practice Implications Practitioners who understand HCOs as CASs will be cautious in accepting findings from studies that treat HCOs mechanistically. They will consider the characteristics of CAS when evaluating the evidence base for practice. Practitioners can use the strategies proposed in this article to stimulate discussion with researchers

  19. Reliable and valid assessment of point-of-care ultrasonography.

    PubMed

    Todsen, Tobias; Tolsgaard, Martin Grønnebæk; Olsen, Beth Härstedt; Henriksen, Birthe Merete; Hillingsø, Jens Georg; Konge, Lars; Jensen, Morten Lind; Ringsted, Charlotte

    2015-02-01

    To explore the reliability and validity of the Objective Structured Assessment of Ultrasound Skills (OSAUS) scale for point-of-care ultrasonography (POC US) performance. POC US is increasingly used by clinicians and is an essential part of the management of acute surgical conditions. However, the quality of performance is highly operator-dependent. Therefore, reliable and valid assessment of trainees' ultrasonography competence is needed to ensure patient safety. Twenty-four physicians, representing novices, intermediates, and experts in POC US, scanned 4 different surgical patient cases in a controlled set-up. All ultrasound examinations were video-recorded and assessed by 2 blinded radiologists using OSAUS. Reliability was examined using generalizability theory. Construct validity was examined by comparing performance scores between the groups and by correlating physicians' OSAUS scores with diagnostic accuracy. The generalizability coefficient was high (0.81) and a D-study demonstrated that 1 assessor and 5 cases would result in similar reliability. The construct validity of the OSAUS scale was supported by a significant difference in the mean scores between the novice group (17.0; SD 8.4) and the intermediate group (30.0; SD 10.1), P = 0.007, as well as between the intermediate group and the expert group (72.9; SD 4.4), P = 0.04, and by a high correlation between OSAUS scores and diagnostic accuracy (Spearman ρ correlation coefficient = 0.76; P < 0.001). This study demonstrates high reliability as well as evidence of construct validity of the OSAUS scale for assessment of POC US competence. Hence, the OSAUS scale may be suitable for both in-training as well as end-of-training assessment.
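    The generalizability coefficient and D-study reported above come from a variance-components decomposition; for a simple fully crossed persons × raters design it can be sketched as follows (toy data, not the OSAUS ratings, and a simpler design than the study's persons × cases × raters):

    ```python
    import numpy as np

    def g_coefficient(scores, n_raters_decision=None):
        """Relative G coefficient for a crossed persons x raters design.
        scores: (n_persons, n_raters); n_raters_decision sets the D-study size."""
        X = np.asarray(scores, dtype=float)
        n_p, n_r = X.shape
        if n_raters_decision is None:
            n_raters_decision = n_r
        grand = X.mean()
        ss_p = n_r * ((X.mean(axis=1) - grand) ** 2).sum()
        ss_r = n_p * ((X.mean(axis=0) - grand) ** 2).sum()
        ss_res = ((X - grand) ** 2).sum() - ss_p - ss_r
        ms_res = ss_res / ((n_p - 1) * (n_r - 1))
        var_p = max((ss_p / (n_p - 1) - ms_res) / n_r, 0.0)  # person variance
        return var_p / (var_p + ms_res / n_raters_decision)

    # raters differing only by a constant offset: relative G is 1.0
    g = g_coefficient([[0, 1], [1, 2], [2, 3]])
    ```

    Raising `n_raters_decision` shrinks the error term, which is how a D-study like the one above can conclude that one assessor and five cases suffice for a target reliability.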

  20. Molecular Theory of Detonation Initiation: Insight from First Principles Modeling of the Decomposition Mechanisms of Organic Nitro Energetic Materials.

    PubMed

    Tsyshevsky, Roman V; Sharia, Onise; Kuklja, Maija M

    2016-02-19

    This review presents a concept, which assumes that thermal decomposition processes play a major role in defining the sensitivity of organic energetic materials to detonation initiation. As a science and engineering community we are still far away from having a comprehensive molecular detonation initiation theory in a widely agreed upon form. However, recent advances in experimental and theoretical methods allow for a constructive and rigorous approach to design and test the theory or at least some of its fundamental building blocks. In this review, we analyzed a set of select experimental and theoretical articles, which were augmented by our own first principles modeling and simulations, to reveal new trends in energetic materials and to refine known existing correlations between their structures, properties, and functions. Our consideration is intentionally limited to the processes of thermally stimulated chemical reactions at the earliest stage of decomposition of molecules and materials containing defects.

  1. The effect of the labile organic fraction in food waste and the substrate/inoculum ratio on anaerobic digestion for a reliable methane yield.

    PubMed

    Kawai, Minako; Nagao, Norio; Tajima, Nobuaki; Niwa, Chiaki; Matsuyama, Tatsushi; Toda, Tatsuki

    2014-04-01

    Influence of the labile organic fraction (LOF) on anaerobic digestion of food waste was investigated at different S/I ratios of 0.33, 0.5, 1.0, 2.0 and 4.0 g-VS substrate/g-VS inoculum. Two types of substrate, standard food waste (Substrate 1) and standard food waste with the supernatant (containing LOF) removed (Substrate 2), were used. The highest methane yield of 435 ml-CH4 g-VS(-1) in Substrate 1 was observed at the lowest S/I ratio, while the methane yields at the other S/I ratios were 38-73% lower than the highest yield due to acidification. The methane yields in Substrate 2 were relatively stable in all S/I conditions, although the maximum methane yield was low compared with Substrate 1. These results showed that LOF in food waste causes acidification, but also contributes to high methane yields, suggesting that a low S/I ratio (<0.33) is required to obtain a reliable methane yield from food waste compared to other organic substrates. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. Work Design Theory: A Review and Critique with Implications for Human Resource Development

    ERIC Educational Resources Information Center

    Torraco, Richard J.

    2005-01-01

    Six theoretical perspectives on work design are examined for their contributions to our understanding of how work is organized and designed in organizations: sociotechnical systems theory, process improvement, adaptive structuration theory, the job characteristics model, technostructural change models, and activity theory. A critique of these…

  3. A comparison of reliability and conventional estimation of safe fatigue life and safe inspection intervals

    NASA Technical Reports Server (NTRS)

    Hooke, F. H.

    1972-01-01

    Both the conventional and reliability analyses for determining safe fatigue life are predicated on a population having a specified (usually log normal) distribution of life to collapse under a fatigue test load. Under a random service load spectrum, random occurrences of load larger than the fatigue test load may confront and cause collapse of structures which are weakened, though not yet to the fatigue test load. These collapses are included in reliability but excluded in conventional analysis. The theory of risk determination by each method is given, and several reasonably typical examples have been worked out, in which it transpires that if one excludes collapse through exceedance of the uncracked strength, the reliability and conventional analyses gave virtually identical probabilities of failure or survival.

  4. Creep-rupture reliability analysis

    NASA Technical Reports Server (NTRS)

    Peralta-Duran, A.; Wirsching, P. H.

    1984-01-01

    A probabilistic approach to the correlation and extrapolation of creep-rupture data is presented. Time temperature parameters (TTP) are used to correlate the data, and an analytical expression for the master curve is developed. The expression provides a simple model for the statistical distribution of strength and fits neatly into a probabilistic design format. The analysis focuses on the Larson-Miller and on the Manson-Haferd parameters, but it can be applied to any of the TTP's. A method is developed for evaluating material dependent constants for TTP's. It is shown that optimized constants can provide a significant improvement in the correlation of the data, thereby reducing modelling error. Attempts were made to quantify the performance of the proposed method in predicting long term behavior. Uncertainty in predicting long term behavior from short term tests was derived for several sets of data. Examples are presented which illustrate the theory and demonstrate the application of state of the art reliability methods to the design of components under creep.

  5. A Reappraisal of Leadership Theory and Training.

    ERIC Educational Resources Information Center

    Owens, James

    1981-01-01

    Reviews and organizes modern leadership theories. Notes the research supporting the main thesis of contingency theory and that effective leadership style is contingent upon situational factors. Characteristics of management training based on the contingency approach are identified. (Author/MLF)

  6. Software analysis handbook: Software complexity analysis and software reliability estimation and prediction

    NASA Technical Reports Server (NTRS)

    Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron

    1994-01-01

    This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.

  7. Subjective indicators as a gauge for improving organizational well-being. An attempt to apply the cognitive activation theory to organizations.

    PubMed

    Arnetz, Bengt B

    2005-11-01

    Globally, organizations are undergoing substantial changes, commonly resulting in significant employee stress. However, facing similar stressors and challenges, departments within an organization, as well as companies within the same area of business, vary in the way they cope with change. It was hypothesized that collective uncertainty about the future as well as unclear organizational goals contribute to chronic stress in organizations exposed to change. Applying the cognitive activation theory of stress (CATS) model of Ursin and Eriksen at an organizational level, support was found for the above hypothesis. Changes in chronic stress indicators between two assessments were related to clarity of organizational goals. It is suggested that the CATS model might be fruitful, not only in understanding variations in individual stress responses and experiences, but also in interpreting and managing organizational stress. By doing so, both organizational health and well-being will improve, creating enterprises with healthy employees and healthy productivity and economic results.

  8. Employment-Based Training in Japanese Firms in Japan and in the United States: Experiences of Automobile Manufacturers.

    ERIC Educational Resources Information Center

    Hashimoto, Masanori

    An economic theory of training holds that training in technical skills and training in employment relations (namely, information reliability or the ability to quickly and reliably disseminate information among the members of the firm) reinforce each other. This theory is an organizing framework for understanding some practices at Japanese firms in…

  9. Administrative Inservice and Theories of Groups.

    ERIC Educational Resources Information Center

    Wimpelberg, Robert K.

    Voluntary organizations providing inservice activities for principals are the newest in the administrative development field. This paper explores those organizations' prospects, particularly the voluntary, administrator-directed "principals' center," and borrows its analytical framework from theories of group formation. The Principals' Center in…

  10. Reliable, Low-Cost, Low-Weight, Non-Hermetic Coating for MCM Applications

    NASA Technical Reports Server (NTRS)

    Jones, Eric W.; Licari, James J.

    2000-01-01

    Through an Air Force Research Laboratory sponsored STM program, reliable, low-cost, low-weight, non-hermetic coatings for multi-chip-module (MCM) applications were developed. Using the combination of Sandia Laboratory ATC-01 test chips, AvanTeco's moisture sensor chips (MSCs), and silicon slices, we have shown that organic and organic/inorganic overcoatings are reliable and practical non-hermetic moisture and oxidation barriers. The use of the MSC and unpassivated ATC-01 test chips provided rapid test results and comparison of moisture barrier quality of the overcoatings. The organic coatings studied were Parylene and Cyclotene. The inorganic coatings were Al2O3 and SiO2. The choice of coating(s) is dependent on the environment that the device(s) will be exposed to. We have defined four (4) classes of environments: Class I (moderate temperature/moderate humidity), Class II (high temperature/moderate humidity), Class III (moderate temperature/high humidity), and Class IV (high temperature/high humidity). By subjecting the components to adhesion, FTIR, temperature-humidity (TH), pressure cooker (PCT), and electrical tests, we have determined that it is possible to reduce failures 50-70% for organic/inorganic coated components compared to organic coated components. All materials and equipment used are readily available commercially or are standard in most semiconductor fabrication lines. It is estimated that production cost for the developed technology would range from $1-10/module, compared to $20-200 for hermetically sealed packages.

  11. Gendered Organizations in the New Economy

    PubMed Central

    Williams, Christine L.; Muller, Chandra; Kilanski, Kristine

    2014-01-01

    Gender scholars draw on the “theory of gendered organizations” to explain persistent gender inequality in the workplace. This theory argues that gender inequality is built into work organizations in which jobs are characterized by long-term security, standardized career ladders and job descriptions, and management-controlled evaluations. Over the past few decades, this basic organizational logic has been transformed. In the so-called new economy, work is increasingly characterized by job insecurity, teamwork, career maps, and networking. Using a case study of geoscientists in the oil and gas industry, we apply a gender lens to this evolving organization of work. This article extends Acker's theory of gendered organizations by identifying the mechanisms that reproduce gender inequality in the twenty-first-century workplace, and by suggesting appropriate policy approaches to remedy these disparities. PMID:25419048

  12. Argumentation Theory. [A Selected Annotated Bibliography].

    ERIC Educational Resources Information Center

    Benoit, William L.

    Materials dealing with aspects of argumentation theory are cited in this annotated bibliography. The 50 citations are organized by topic as follows: (1) argumentation; (2) the nature of argument; (3) traditional perspectives on argument; (4) argument diagrams; (5) Chaim Perelman's theory of rhetoric; (6) the evaluation of argument; (7) argument…

  13. Validity evidence and reliability of a simulated patient feedback instrument

    PubMed Central

    2012-01-01

    Background In the training of healthcare professionals, one of the advantages of communication training with simulated patients (SPs) is the SP's ability to provide direct feedback to students after a simulated clinical encounter. The quality of SP feedback must be monitored, especially because it is well known that feedback can have a profound effect on student performance. Due to the current lack of valid and reliable instruments to assess the quality of SP feedback, our study examined the validity and reliability of one potential instrument, the 'modified Quality of Simulated Patient Feedback Form' (mQSF). Methods Content validity of the mQSF was assessed by inviting experts in the area of simulated clinical encounters to rate the importance of the mQSF items. Moreover, generalizability theory was used to examine the reliability of the mQSF. Our data came from videotapes of clinical encounters between six simulated patients and six students and the ensuing feedback from the SPs to the students. Ten faculty members judged the SP feedback according to the items on the mQSF. Three weeks later, this procedure was repeated with the same faculty members and recordings. Results All but two items of the mQSF received importance ratings of > 2.5 on a four-point rating scale. A generalizability coefficient of 0.77 was established with two judges observing one encounter. Conclusions The findings for content validity and reliability with two judges suggest that the mQSF is a valid and reliable instrument to assess the quality of feedback provided by simulated patients. PMID:22284898

  14. Validity evidence and reliability of a simulated patient feedback instrument.

    PubMed

    Schlegel, Claudia; Woermann, Ulrich; Rethans, Jan-Joost; van der Vleuten, Cees

    2012-01-27

    In the training of healthcare professionals, one of the advantages of communication training with simulated patients (SPs) is the SP's ability to provide direct feedback to students after a simulated clinical encounter. The quality of SP feedback must be monitored, especially because it is well known that feedback can have a profound effect on student performance. Due to the current lack of valid and reliable instruments to assess the quality of SP feedback, our study examined the validity and reliability of one potential instrument, the 'modified Quality of Simulated Patient Feedback Form' (mQSF). Content validity of the mQSF was assessed by inviting experts in the area of simulated clinical encounters to rate the importance of the mQSF items. Moreover, generalizability theory was used to examine the reliability of the mQSF. Our data came from videotapes of clinical encounters between six simulated patients and six students and the ensuing feedback from the SPs to the students. Ten faculty members judged the SP feedback according to the items on the mQSF. Three weeks later, this procedure was repeated with the same faculty members and recordings. All but two items of the mQSF received importance ratings of > 2.5 on a four-point rating scale. A generalizability coefficient of 0.77 was established with two judges observing one encounter. The findings for content validity and reliability with two judges suggest that the mQSF is a valid and reliable instrument to assess the quality of feedback provided by simulated patients.
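The generalizability analysis in this record can be sketched numerically. The estimator below is a minimal stdlib-only sketch for a fully crossed persons x judges design with one observation per cell; the decision-study formula then projects the reliability expected for a chosen number of judges (as in the reported 0.77 with two judges). Function names and the data layout are illustrative, not from the paper:

```python
def g_coefficient(ratings, n_judges_decision):
    """Relative generalizability coefficient for a fully crossed
    persons x judges table (rows = persons, columns = judges)."""
    n_p, n_j = len(ratings), len(ratings[0])
    grand = sum(map(sum, ratings)) / (n_p * n_j)
    person_means = [sum(row) / n_j for row in ratings]
    judge_means = [sum(row[j] for row in ratings) / n_p for j in range(n_j)]
    # Sums of squares for the two-way crossed design
    ss_p = n_j * sum((m - grand) ** 2 for m in person_means)
    ss_j = n_p * sum((m - grand) ** 2 for m in judge_means)
    ss_tot = sum((x - grand) ** 2 for row in ratings for x in row)
    ms_p = ss_p / (n_p - 1)
    ms_res = (ss_tot - ss_p - ss_j) / ((n_p - 1) * (n_j - 1))
    # Variance components: persons (universe score) and residual error
    var_p = max((ms_p - ms_res) / n_j, 0.0)
    # Decision study: error shrinks as more judges are averaged
    return var_p / (var_p + ms_res / n_judges_decision)
```

Averaging over more judges raises the coefficient, which is why the paper reports the value attainable with a specific number of judges.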

  15. 76 FR 42534 - Mandatory Reliability Standards for Interconnection Reliability Operating Limits; System...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-19

    ... Reliability Operating Limits; System Restoration Reliability Standards AGENCY: Federal Energy Regulatory... data necessary to analyze and monitor Interconnection Reliability Operating Limits (IROL) within its... Interconnection Reliability Operating Limits, Order No. 748, 134 FERC ¶ 61,213 (2011). \\2\\ The term ``Wide-Area...

  16. Managing corporate capabilities: theory and industry approaches.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slavin, Adam M.

    2007-02-01

    This study characterizes theoretical and industry approaches to organizational capabilities management and ascertains whether there is a distinct "best practice" in this regard. We consider both physical capabilities, such as technical disciplines and infrastructure, and non-physical capabilities such as corporate culture and organizational procedures. We examine Resource-Based Theory (RBT), which is the predominant organizational management theory focused on capabilities. RBT seeks to explain the effect of capabilities on competitiveness, and thus provide a basis for investment/divestment decisions. We then analyze industry approaches described to us in interviews with representatives from Goodyear, 3M, Intel, Ford, NASA, Lockheed Martin, and Boeing. We found diversity amongst the industry capability management approaches. Although all organizations manage capabilities and consider them to some degree in their strategies, no two approaches that we observed were identical. Furthermore, we observed that theory is not a strong driver in this regard. No organization used the term "Resource-Based Theory", nor did any organization mention any other guiding theory or practice from the organizational management literature when explaining their capabilities management approaches. As such, we concluded that there is no single best practice for capabilities management. Nevertheless, we believe that RBT and the diverse industry experiences described herein can provide useful insights to support development of capabilities management approaches.

  17. Reliability-Based Design Optimization of a Composite Airframe Component

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Coroneos, Rula; Patnaik, Surya N.

    2011-01-01

    A stochastic design optimization methodology (SDO) has been developed to design airframe structural components made of metallic and composite materials. The design method accommodates uncertainties in load, strength, and material properties that are defined by distribution functions with mean values and standard deviations. A response parameter, such as a failure mode, thereby becomes a function of reliability. The primitive variables, such as thermomechanical loads, material properties, and failure theories, as well as design variables such as beam depth or membrane thickness, are considered random parameters with specified distribution functions defined by mean values and standard deviations.

  18. A game theory-based trust measurement model for social networks.

    PubMed

    Wang, Yingjie; Cai, Zhipeng; Yin, Guisheng; Gao, Yang; Tong, Xiangrong; Han, Qilong

    2016-01-01

    In social networks, trust is a complex sociopsychological relationship. Participants in online social networks want to share information and experiences with as many reliable users as possible. However, the modeling of trust is complicated and application dependent; it must account for interaction history, recommendations, user behavior, and so on. Therefore, modeling trust is an important focus for online social networks. We propose a game theory-based trust measurement model for social networks. The trust degree is calculated from three aspects, service reliability, feedback effectiveness, and recommendation credibility, to obtain a more accurate result. In addition, to alleviate the free-riding problem, we propose a game theory-based punishment mechanism for specific trust and global trust, respectively. We prove that the proposed trust measurement model is effective, and that the free-riding problem can be resolved effectively by adding the proposed punishment mechanism.
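The kind of aggregation and punishment the abstract describes can be sketched as a weighted combination of the three trust aspects plus a penalty for under-contributing participants. The weights and the penalty rule below are illustrative assumptions, not the paper's calibrated game-theoretic values:

```python
def trust_degree(service_rel, feedback_eff, rec_cred,
                 weights=(0.4, 0.3, 0.3)):
    """Aggregate the three trust aspects (each assumed in [0, 1]) into a
    single trust score. The weights are illustrative placeholders."""
    w1, w2, w3 = weights
    return w1 * service_rel + w2 * feedback_eff + w3 * rec_cred

def punish_free_rider(trust, contribution_ratio, penalty=0.5):
    """Simple punishment rule: scale trust down for participants who
    consume more than they contribute (contribution_ratio < 1)."""
    if contribution_ratio >= 1.0:
        return trust
    return trust * (1.0 - penalty * (1.0 - contribution_ratio))
```

A participant who contributes as much as they consume keeps their full trust score; a pure free rider loses up to the penalty fraction of it.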

  19. Allometric scaling theory applied to FIA biomass estimation

    Treesearch

    David C. Chojnacky

    2002-01-01

    Tree biomass estimates in the Forest Inventory and Analysis (FIA) database are derived from numerous methodologies whose abundance and complexity raise questions about consistent results throughout the U.S. A new model based on allometric scaling theory ("WBE") offers simplified methodology and a theoretically sound basis for improving the reliability and...

  20. Reliability of two social cognition tests: The combined stories test and the social knowledge test.

    PubMed

    Thibaudeau, Élisabeth; Cellard, Caroline; Legendre, Maxime; Villeneuve, Karèle; Achim, Amélie M

    2018-04-01

    Deficits in social cognition are common in psychiatric disorders. Validated social cognition measures with good psychometric properties are necessary to assess and target social cognitive deficits. Two recent social cognition tests, the Combined Stories Test (COST) and the Social Knowledge Test (SKT), respectively assess theory of mind and social knowledge. Previous studies have shown good psychometric properties for these tests, but their test-retest reliability has never been documented. The aim of this study was to evaluate the test-retest reliability and the inter-rater reliability of the COST and the SKT. The COST and the SKT were administered twice to a group of forty-two healthy adults, with a delay of approximately four weeks between the assessments. Excellent test-retest reliability was observed for the COST, and good test-retest reliability was observed for the SKT. There was no evidence of a practice effect. Furthermore, excellent inter-rater reliability was observed for both tests. This study shows good reliability of the COST and the SKT that adds to the good validity previously reported for these two tests. These good psychometric properties thus support the COST and the SKT as adequate measures for the assessment of social cognition. Copyright © 2018. Published by Elsevier B.V.

  1. Reliability of the AMA Guides to the Evaluation of Permanent Impairment.

    PubMed

    Forst, Linda; Friedman, Lee; Chukwu, Abraham

    2010-12-01

    AMA's Guides to the Evaluation of Permanent Impairment is used to rate loss of function and determine compensation and ability to work after injury or illness; however, there are few studies that evaluate its reliability or construct validity. The objectives were to evaluate the reliability of the fifth and sixth editions for back injury and to determine the best methods for further study. Intra-class correlation coefficients within and between raters were relatively high. There was wider variability for individual cases. Impairment ratings were lower and correlated less well for the sixth edition, though confidence intervals overlapped. The sixth edition may not be an improvement over the fifth. A research agenda should include investigations of reliability and construct validity for different body sites and organ systems along the entire rating scale and among different categories of raters.

  2. Using minimal spanning trees to compare the reliability of network topologies

    NASA Technical Reports Server (NTRS)

    Leister, Karen J.; White, Allan L.; Hayhurst, Kelly J.

    1990-01-01

    Graph theoretic methods are applied to compute the reliability for several types of networks of moderate size. The graph theory methods used are minimal spanning trees for networks with bi-directional links and the related concept of strongly connected directed graphs for networks with uni-directional links. A comparison is conducted of ring networks and braided networks. The case is covered where just the links fail and the case where both links and nodes fail. Two different failure modes for the links are considered. For one failure mode, the link no longer carries messages. For the other failure mode, the link delivers incorrect messages. There is a description and comparison of link-redundancy versus path-redundancy as methods to achieve reliability. All the computations are carried out by means of a fault tree program.
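For networks of the moderate size discussed here, the reliability that a spanning-tree analysis certifies, the probability that the surviving links still connect every node, can be computed exactly by enumerating link states. A minimal sketch for bi-directional links with independent, identical link reliability (names and the union-find connectivity test are illustrative; the paper's fault-tree computation differs):

```python
from itertools import product

def network_reliability(n_nodes, links, p_link):
    """Exact all-terminal reliability of a small network by enumerating
    every up/down combination of links and testing connectivity."""
    def connected(up_links):
        # Union-find: nodes are connected iff one component remains
        parent = list(range(n_nodes))
        def find(a):
            while parent[a] != a:
                parent[a] = parent[parent[a]]
                a = parent[a]
            return a
        for u, v in up_links:
            parent[find(u)] = find(v)
        return len({find(i) for i in range(n_nodes)}) == 1
    total = 0.0
    for state in product([0, 1], repeat=len(links)):
        up = [l for l, s in zip(links, state) if s]
        prob = 1.0
        for s in state:
            prob *= p_link if s else (1.0 - p_link)
        if connected(up):
            total += prob
    return total
```

For a three-node ring with link reliability 0.9, the network survives any single link failure, giving 0.9^3 + 3(0.9^2)(0.1) = 0.972; a braided network adds redundant links to push this higher at the cost of more hardware.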

  3. The search for reliable aqueous solubility (Sw) and octanol-water partition coefficient (Kow) data for hydrophobic organic compounds; DDT and DDE as a case study

    USGS Publications Warehouse

    Pontolillo, James; Eganhouse, R.P.

    2001-01-01

    The accurate determination of an organic contaminant's physico-chemical properties is essential for predicting its environmental impact and fate. Approximately 700 publications (1944-2001) were reviewed and all known aqueous solubilities (Sw) and octanol-water partition coefficients (Kow) for the organochlorine pesticide, DDT, and its persistent metabolite, DDE, were compiled and examined. Two problems are evident with the available database: 1) egregious errors in reporting data and references, and 2) poor data quality and/or inadequate documentation of procedures. The published literature (particularly the collative literature such as compilation articles and handbooks) is characterized by a preponderance of unnecessary data duplication. Numerous data and citation errors are also present in the literature. The percentage of original Sw and Kow data in compilations has decreased with time, and in the most recent publications (1994-97) it composes only 6-26 percent of the reported data. The variability of original DDT/DDE Sw and Kow data spans 2-4 orders of magnitude, and there is little indication that the uncertainty in these properties has declined over the last 5 decades. A criteria-based evaluation of DDT/DDE Sw and Kow data sources shows that 95-100 percent of the database literature is of poor or unevaluatable quality. The accuracy and reliability of the vast majority of the data are unknown due to inadequate documentation of the methods of determination used by the authors. [For example, estimates of precision have been reported for only 20 percent of experimental Sw data and 10 percent of experimental Kow data.] Computational methods for estimating these parameters have been increasingly substituted for direct or indirect experimental determination despite the fact that the data used for model development and validation may be of unknown reliability. Because of the prevalence of errors, the lack of methodological documentation, and unsatisfactory data

  4. Adaptation as organism design

    PubMed Central

    Gardner, Andy

    2009-01-01

    The problem of adaptation is to explain the apparent design of organisms. Darwin solved this problem with the theory of natural selection. However, population geneticists, whose responsibility it is to formalize evolutionary theory, have long neglected the link between natural selection and organismal design. Here, I review the major historical developments in theory of organismal adaptation, clarifying what adaptation is and what it is not, and I point out future avenues for research. PMID:19793739

  5. Explicit polarization (X-Pol) potential using ab initio molecular orbital theory and density functional theory.

    PubMed

    Song, Lingchun; Han, Jaebeom; Lin, Yen-lin; Xie, Wangshen; Gao, Jiali

    2009-10-29

    The explicit polarization (X-Pol) method has been examined using ab initio molecular orbital theory and density functional theory. The X-Pol potential was designed to provide a novel theoretical framework for developing next-generation force fields for biomolecular simulations. Importantly, the X-Pol potential is a general method, which can be employed with any level of electronic structure theory. The present study illustrates the implementation of the X-Pol method using ab initio Hartree-Fock theory and hybrid density functional theory. The computational results are illustrated by considering a set of bimolecular complexes of small organic molecules and ions with water. The computed interaction energies and hydrogen bond geometries are in good accord with CCSD(T) calculations and B3LYP/aug-cc-pVDZ optimizations.

  6. Test-retest reliability of computer-based video analysis of general movements in healthy term-born infants.

    PubMed

    Valle, Susanne Collier; Støen, Ragnhild; Sæther, Rannei; Jensenius, Alexander Refsum; Adde, Lars

    2015-10-01

    A computer-based video analysis has recently been presented for quantitative assessment of general movements (GMs). This method's test-retest reliability, however, has not yet been evaluated. The aim of the current study was to evaluate the test-retest reliability of computer-based video analysis of GMs, and to explore the association between computer-based video analysis and the temporal organization of fidgety movements (FMs). Test-retest reliability study. 75 healthy, term-born infants were recorded twice the same day during the FMs period using a standardized video set-up. The computer-based movement variables "quantity of motion mean" (Qmean), "quantity of motion standard deviation" (QSD) and "centroid of motion standard deviation" (CSD) were analyzed, reflecting the amount of motion and the variability of the spatial center of motion of the infant, respectively. In addition, the association between the variable CSD and the temporal organization of FMs was explored. Intraclass correlation coefficients (ICC 1.1 and ICC 3.1) were calculated to assess test-retest reliability. The ICC values for the variables CSD, Qmean and QSD were 0.80, 0.80 and 0.86 for ICC (1.1), respectively; and 0.80, 0.86 and 0.90 for ICC (3.1), respectively. There were significantly lower CSD values in the recordings with continual FMs compared to the recordings with intermittent FMs (p<0.05). This study showed high test-retest reliability of computer-based video analysis of GMs, and a significant association between our computer-based video analysis and the temporal organization of FMs. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
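The ICC(3,1) statistic reported in this record can be computed from a subjects x sessions score table. A minimal stdlib-only sketch using the standard two-way mixed-effects, consistency definition (the study likely used a dedicated statistics package; the function name and data layout are illustrative):

```python
def icc_3_1(scores):
    """ICC(3,1): two-way mixed effects, consistency, single measurement.
    Rows = subjects, columns = sessions (e.g. test and retest)."""
    n, k = len(scores), len(scores[0])
    grand = sum(map(sum, scores)) / (n * k)
    subj_means = [sum(row) / k for row in scores]
    sess_means = [sum(row[j] for row in scores) / n for j in range(k)]
    ss_subj = k * sum((m - grand) ** 2 for m in subj_means)
    ss_sess = n * sum((m - grand) ** 2 for m in sess_means)
    ss_tot = sum((x - grand) ** 2 for row in scores for x in row)
    ms_subj = ss_subj / (n - 1)
    ms_err = (ss_tot - ss_subj - ss_sess) / ((n - 1) * (k - 1))
    return (ms_subj - ms_err) / (ms_subj + (k - 1) * ms_err)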

  7. [New conditions for organ donation in France].

    PubMed

    Antoine, Corinne; Maroudy, Daniel

    2016-09-01

    The procurement of organs from donors after circulatory death is a reliable technique which gives satisfactory posttransplant results and also represents a potential source of additional organs. In order to meet the growing need for organ donations, the 'anticipated organ donation approach' procedure is currently receiving renewed interest with new conditions for its implementation in France. Copyright © 2016 Elsevier Masson SAS. All rights reserved.

  8. Types and status of high reliability practices in the US federal wildland fire community

    Treesearch

    Anne E. Black

    2012-01-01

    Since emerging in the late 1980s, the paradigm of High Reliability Organizing (HRO) has sought to describe how and why certain organizations consistently function safely under hazardous conditions. Researchers have explored a variety of industrial situations, from the early observations of nuclear aircraft carrier operations to nursing units and even high-tech IPOs....

  9. Cultural Organization: Fragments of a Theory,

    DTIC Science & Technology

    1983-11-01

    In B. Staw & L.L. Cummings (eds.), Research in Organizational Behavior, Vol. 6, Greenwich, CT: JAI Press. Related technical reports include: TR-11, Bailyn, November 1982; TR-12, Schein, Edgar H., "The Role of the Founder in the Creation of Organizational Culture," Organizational Dynamics, Summer 1983; and TR-20, Van Maanen, John, forthcoming in J. Lorsch (ed.), Handbook of Organizational Behavior, Englewood Cliffs, NJ: Prentice-Hall, May 1983.

  10. Remarks on worldsheet theories dual to free large N gauge theories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aharony, Ofer; SITP, Department of Physics and SLAC, Stanford University, Stanford, California 94305; David, Justin R.

    2007-05-15

    We continue to investigate properties of the worldsheet conformal field theories (CFTs) which are conjectured to be dual to free large N gauge theories, using the mapping of Feynman diagrams to the worldsheet suggested in [R. Gopakumar, Phys. Rev. D 70, 025009 (2004); ibid. 70, 025010 (2004); C. R. Physique 5, 1111 (2004); Phys. Rev. D 72, 066008 (2005)]. The modular invariance of these CFTs is shown to be built into the formalism. We show that correlation functions in these CFTs which are localized on subspaces of the moduli space may be interpreted as delta-function distributions, and that this can be consistent with a local worldsheet description given some constraints on the operator product expansion coefficients. We illustrate these features by a detailed analysis of a specific four-point function diagram. To reliably compute this correlator, we use a novel perturbation scheme which involves an expansion in the large dimension of some operators.

  11. hfAIM: A reliable bioinformatics approach for in silico genome-wide identification of autophagy-associated Atg8-interacting motifs in various organisms

    PubMed Central

    Xie, Qingjun; Tzfadia, Oren; Levy, Matan; Weithorn, Efrat; Peled-Zehavi, Hadas; Van Parys, Thomas; Van de Peer, Yves; Galili, Gad

    2016-01-01

    ABSTRACT Most of the proteins that are specifically turned over by selective autophagy are recognized by the presence of short Atg8 interacting motifs (AIMs) that facilitate their association with the autophagy apparatus. Such AIMs can be identified by bioinformatics methods based on their defined degenerate consensus F/W/Y-X-X-L/I/V sequences in which X represents any amino acid. Achieving reliability and/or fidelity of the prediction of such AIMs on a genome-wide scale represents a major challenge. Here, we present a bioinformatics approach, high fidelity AIM (hfAIM), which uses additional sequence requirements—the presence of acidic amino acids and the absence of positively charged amino acids in certain positions—to reliably identify AIMs in proteins. We demonstrate that the use of the hfAIM method allows for in silico high fidelity prediction of AIMs in AIM-containing proteins (ACPs) on a genome-wide scale in various organisms. Furthermore, by using hfAIM to identify putative AIMs in the Arabidopsis proteome, we illustrate a potential contribution of selective autophagy to various biological processes. More specifically, we identified 9 peroxisomal PEX proteins that contain hfAIM motifs, among which AtPEX1, AtPEX6 and AtPEX10 possess evolutionary-conserved AIMs. Bimolecular fluorescence complementation (BiFC) results verified that AtPEX6 and AtPEX10 indeed interact with Atg8 in planta. In addition, we show that mutations occurring within or nearby hfAIMs in PEX1, PEX6 and PEX10 caused defects in the growth and development of various organisms. Taken together, the above results suggest that the hfAIM tool can be used to effectively perform genome-wide in silico screens of proteins that are potentially regulated by selective autophagy. The hfAIM system is a web tool that can be accessed at link: http://bioinformatics.psb.ugent.be/hfAIM/. PMID:27071037
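The core AIM consensus F/W/Y-X-X-L/I/V is a straightforward pattern scan; the hfAIM refinement adds charge requirements around the motif. The sketch below applies simplified filters, an acidic residue (D/E) in the X positions or just upstream, and no basic residue (K/R) in the X positions. The exact positional rules of hfAIM differ from this simplification, so treat it as illustrative:

```python
import re

# Core AIM consensus: F/W/Y - X - X - L/I/V; lookahead yields overlapping hits
CORE_AIM = re.compile(r'(?=([FWY]..[LIV]))')

def find_candidate_aims(seq, upstream=4):
    """Scan a protein sequence for core AIMs, then apply simplified
    hfAIM-style charge filters (illustrative, not the published rules)."""
    hits = []
    for m in CORE_AIM.finditer(seq):
        start = m.start()
        motif = m.group(1)
        xx = motif[1:3]                              # the two X positions
        context = seq[max(0, start - upstream):start]
        if any(c in 'KR' for c in xx):               # reject basic residues in X
            continue
        if not any(c in 'DE' for c in xx + context): # require an acidic residue
            continue
        hits.append((start, motif))
    return hits
```

The charge filters are what raise fidelity: they discard many of the spurious matches that the degenerate four-residue consensus alone would produce genome-wide.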

  12. Structural reliability calculation method based on the dual neural network and direct integration method.

    PubMed

    Li, Haibin; He, Yun; Nie, Xiaobo

    2018-01-01

    Structural reliability analysis under uncertainty receives wide attention from engineers and scholars because it reflects both the characteristics of a structure and its actual loading conditions. The direct integration method, which starts from the definition of reliability, is easy to understand, but mathematical difficulties remain in the calculation of the multiple integrals involved. Therefore, a dual neural network method is proposed in this paper for calculating those multiple integrals. The dual neural network consists of two networks: network A learns the integrand function, and network B simulates its antiderivative. Using the derivative relationship between the network output and the network input, network B is derived from network A. On this basis, a normalized performance function is employed to overcome the difficulty of multiple integration and to improve the accuracy of reliability calculations. Comparisons with the Monte Carlo simulation method, the Hasofer-Lind method, and the mean-value first-order second-moment method demonstrate that the proposed method is efficient and accurate for structural reliability problems.
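The quantity all of these methods target is the failure probability Pf = P(g(X) < 0). For a linear limit state g = R - S with independent normal resistance R and load S it has a closed form, which makes a convenient check on any numerical scheme. Below is that closed form together with a Monte Carlo baseline (one of the comparison methods named above, not the paper's dual-network method); names are illustrative:

```python
import math
import random

def pf_analytic(mu_r, sd_r, mu_s, sd_s):
    # Exact result for g = R - S with independent normal R, S:
    # beta = (mu_R - mu_S) / sqrt(sd_R^2 + sd_S^2),  Pf = Phi(-beta)
    beta = (mu_r - mu_s) / math.hypot(sd_r, sd_s)
    return 0.5 * math.erfc(beta / math.sqrt(2.0))

def pf_monte_carlo(mu_r, sd_r, mu_s, sd_s, n=100_000, seed=1):
    # Brute-force estimate of Pf = P(g(X) < 0) by sampling the basic
    # variables and counting limit-state violations.
    rng = random.Random(seed)
    fails = sum(1 for _ in range(n)
                if rng.gauss(mu_r, sd_r) - rng.gauss(mu_s, sd_s) < 0.0)
    return fails / n
```

Here beta is the Hasofer-Lind reliability index for this linear case; the difficulty the paper addresses is that for general, nonlinear g the integral has no closed form and direct integration becomes a genuinely multidimensional problem.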

  13. Measuring the Psychological Distance between an Organization and Its Members—The Construction and Validation of a New Scale

    PubMed Central

    Chen, Hong; Li, Shanshan

    2018-01-01

    Research in organizational management has long lacked specific methods for estimating the relationship between an organization and its employees. This article therefore introduces the concept of psychological distance into organizational behavior research, defining the psychological distance between employees and an organization as the level of perceived correspondence or interaction between subjects and objects. We developed an employee-organization psychological distance (EOPD) scale through both qualitative and quantitative analysis. As indicated by results based on grounded theory (10 in-depth employee interview records and 277 open questionnaires) and a formal investigation (544 questionnaires), the scale comprises 44 items across six dimensions: experiential distance, behavioral distance, emotional distance, cognitive distance, spatial-temporal distance, and objective social distance. Finally, confirmatory factor analysis showed that the EOPD scale exhibits acceptable reliability and validity. This research may establish a foundation for future measurement of the psychological relationship between employees and organizations. PMID:29375427

  14. Studies on Mathematical Models of Wet Adhesion and Lifetime Prediction of Organic Coating/Steel by Grey System Theory

    PubMed Central

    Meng, Fandi; Liu, Ying; Liu, Li; Li, Ying; Wang, Fuhui

    2017-01-01

    For organic coatings on steel under marine hydrostatic pressure, rapid degradation of wet adhesion is the key factor controlling coating lifetime. Mathematical models of wet adhesion have been studied using Grey System Theory (GST). GM(1,1) grey models of an epoxy varnish (EV) coating/steel system and an epoxy glass flake (EGF) coating/steel system were established, and a lifetime prediction formula is proposed on the basis of these models. Precision assessments indicate that the established models are accurate and that the formula is capable of precise lifetime forecasting for the coatings. PMID:28773073

  15. Universal Exciton Size in Organic Polymers is Determined by Nonlocal Orbital Exchange in Time-Dependent Density Functional Theory.

    PubMed

    Mewes, Stefanie A; Plasser, Felix; Dreuw, Andreas

    2017-03-16

    The exciton size of the lowest singlet excited state in a diverse set of organic π-conjugated polymers is studied and found to be a universal, system-independent quantity of approximately 7 Å in the single-chain picture. With time-dependent density functional theory (TDDFT), its value as well as the overall description of the exciton is almost exclusively governed by the amount of nonlocal orbital exchange. This is traced back to the lack of the Coulomb attraction between the electron and hole quasiparticles in pure TDDFT, which is reintroduced only with the admixture of nonlocal orbital exchange.

  16. The Reliability and Sources of Error of Using Rubrics-Based Assessment for Student Projects

    ERIC Educational Resources Information Center

    Menéndez-Varela, José-Luis; Gregori-Giralt, Eva

    2018-01-01

    Rubrics are widely used in higher education to assess performance in project-based learning environments. To date, the sources of error that may affect their reliability have not been studied in depth. Using generalisability theory as its starting-point, this article analyses the influence of the assessors and the criteria of the rubrics on the…

  17. Molecular Theory of Detonation Initiation: Insight from First Principles Modeling of the Decomposition Mechanisms of Organic Nitro Energetic Materials

    DOE PAGES

    Tsyshevsky, Roman; Sharia, Onise; Kuklja, Maija

    2016-02-19

    Our review presents a concept, which assumes that thermal decomposition processes play a major role in defining the sensitivity of organic energetic materials to detonation initiation. As a science and engineering community we are still far away from having a comprehensive molecular detonation initiation theory in a widely agreed upon form. However, recent advances in experimental and theoretical methods allow for a constructive and rigorous approach to design and test the theory or at least some of its fundamental building blocks. In this review, we analyzed a set of select experimental and theoretical articles, which were augmented by our own first principles modeling and simulations, to reveal new trends in energetic materials and to refine known existing correlations between their structures, properties, and functions. Lastly, our consideration is intentionally limited to the processes of thermally stimulated chemical reactions at the earliest stage of decomposition of molecules and materials containing defects.

  18. Reliable Thermoelectric Module Design under Opposing Requirements from Structural and Thermoelectric Considerations

    NASA Astrophysics Data System (ADS)

    Karri, Naveen K.; Mo, Changki

    2018-06-01

    Structural reliability of thermoelectric generation (TEG) systems still remains an issue, especially for applications such as large-scale industrial or automobile exhaust heat recovery, in which TEG systems are subject to dynamic loads and thermal cycling. Traditional thermoelectric (TE) system design and optimization techniques, focused on performance alone, could result in designs that may fail during operation as the geometric requirements for optimal performance (especially the power) are often in conflict with the requirements for mechanical reliability. This study focused on reducing the thermomechanical stresses in a TEG system without compromising the optimized system performance. Finite element simulations were carried out to study the effect of TE element (leg) geometry such as leg length and cross-sectional shape under constrained material volume requirements. Results indicated that the element length has a major influence on the element stresses whereas regular cross-sectional shapes have minor influence. The impact of TE element stresses on the mechanical reliability is evaluated using brittle material failure theory based on Weibull analysis. An alternate couple configuration that relies on the industry practice of redundant element design is investigated. Results showed that the alternate configuration considerably reduced the TE element and metallization stresses, thereby enhancing the structural reliability, with little trade-off in the optimized performance. The proposed alternate configuration could serve as a potential design modification for improving the reliability of systems optimized for thermoelectric performance.
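The Weibull-based brittle failure assessment mentioned above can be sketched with the two-parameter Weibull law plus a series-system assumption (the module survives only if every TE element survives, matching the redundant-element discussion). All parameter values below are illustrative placeholders, not material data from the paper:

```python
import math

def weibull_failure_probability(stress, sigma_0, m, volume_ratio=1.0):
    """Two-parameter Weibull probability of brittle failure at a uniform
    applied stress; sigma_0 is the characteristic strength, m the Weibull
    modulus, and volume_ratio an optional stressed-volume size scaling."""
    return 1.0 - math.exp(-volume_ratio * (stress / sigma_0) ** m)

def system_reliability(element_stresses, sigma_0, m):
    """Series-system view: element survival probabilities multiply, so
    adding stressed elements can only lower module reliability unless
    each element's stress drops (the trade-off the paper exploits)."""
    r = 1.0
    for s in element_stresses:
        r *= 1.0 - weibull_failure_probability(s, sigma_0, m)
    return r
```

This is why reducing per-element stress matters more than element count alone: with a high Weibull modulus, halving the stress cuts the failure probability by roughly 2^m.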

  19. Reliability and maintainability assessment factors for reliable fault-tolerant systems

    NASA Technical Reports Server (NTRS)

    Bavuso, S. J.

    1984-01-01

    A long-term goal of the NASA Langley Research Center is the development of a reliability assessment methodology of sufficient power to enable the credible comparison of the stochastic attributes of one ultrareliable system design against others. This methodology, developed over a 10 year period, is a combined analytic and simulative technique. An analytic component is the Computer Aided Reliability Estimation capability, third generation, or simply CARE III. A simulative component is the Gate Logic Software Simulator capability, or GLOSS. Discussed are the numerous factors that can degrade system reliability and the ways in which these factors, peculiar to highly reliable fault-tolerant systems, are accounted for in credible reliability assessments. Also presented are the modeling difficulties that result from their inclusion and the ways in which CARE III and GLOSS mitigate the intractability of the heretofore unworkable mathematics.

  20. A Proposed Model of Jazz Theory Knowledge Acquisition

    ERIC Educational Resources Information Center

    Ciorba, Charles R.; Russell, Brian E.

    2014-01-01

    The purpose of this study was to test a hypothesized model that proposes a causal relationship between motivation and academic achievement on the acquisition of jazz theory knowledge. A reliability analysis of the latent variables ranged from 0.92 to 0.94. Confirmatory factor analyses of the motivation (standardized root mean square residual…