Study on the supply chain of an enterprise based on axiomatic design
NASA Astrophysics Data System (ADS)
Fan, Shu-hai; Lin, Chao-qun; Ji, Chun; Zhou, Ce; Chen, Peng
2018-06-01
This paper first expounds the basic theory of axiomatic design and then designs and improves an enterprise supply chain using the two design axioms (the independence axiom and the information axiom). In applying the independence axiom, the user needs and the problems to be solved are identified, the top-level goal is determined and decomposed, and the corresponding design equations are established. In applying the information axiom, the cloud concept is used to quantify the amount of information, and two candidate schemes are evaluated and compared. Finally, axiomatic design yields the best solution for improving the supply chain design. Axiomatic design is a generic, systematic and rigorous approach to design that addresses the needs of different customers. Using this method to improve the level of supply chain management is novel, and as a mature method it makes the process efficient and convenient.
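The two axioms can be made concrete with a small sketch (illustrative only, not taken from the paper; the design matrices and success probabilities below are hypothetical):

```python
import numpy as np

def classify_design(A):
    """Classify a square FR-DP design matrix under the independence axiom.

    A[i][j] is nonzero when design parameter j affects functional
    requirement i. Diagonal -> uncoupled, triangular -> decoupled
    (acceptable with the right adjustment order), otherwise coupled.
    """
    A = np.asarray(A, dtype=float) != 0
    off_diag = A & ~np.eye(A.shape[0], dtype=bool)
    if not off_diag.any():
        return "uncoupled"
    if not np.triu(off_diag, 1).any() or not np.tril(off_diag, -1).any():
        return "decoupled"
    return "coupled"

def information_content(success_probs):
    """Information axiom: I = sum_i log2(1/p_i); lower is better."""
    return sum(np.log2(1.0 / p) for p in success_probs)

# Hypothetical comparison of two supply-chain designs.
print(classify_design([[1, 0], [1, 1]]))      # decoupled
print(information_content([0.9, 0.8]))        # ~0.474 bits
print(information_content([0.95, 0.95]))      # ~0.148 bits
```

Under the information axiom, the scheme with the lower total information content would be preferred.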
Hilbert's axiomatic method and Carnap's general axiomatics.
Stöltzner, Michael
2015-10-01
This paper compares the axiomatic method of David Hilbert and his school with Rudolf Carnap's general axiomatics, which was developed in the late 1920s and influenced his understanding of the logic of science throughout the 1930s, when his logical pluralism developed. The distinct perspectives become visible most clearly in how Richard Baldus, along the lines of Hilbert, and Carnap and Friedrich Bachmann analyzed the axiom system of Hilbert's Foundations of Geometry—the paradigmatic example for the axiomatization of science. Whereas Hilbert's axiomatic method started from a local analysis of individual axiom systems, in which the foundations of mathematics as a whole entered only when establishing the system's consistency, Carnap and his Vienna Circle colleague Hans Hahn instead advocated a global analysis of axiom systems in general. A primary goal was to evade, or formalize ex post, mathematicians' 'material' talk about axiom systems, for such talk was held to be error-prone and susceptible to metaphysics. Copyright © 2015 Elsevier Ltd. All rights reserved.
Axiomatic Evaluation Method and Content Structure for Information Appliances
ERIC Educational Resources Information Center
Guo, Yinni
2010-01-01
Extensive studies have been conducted to determine how best to present information in order to enhance usability, but not what information needs to be presented for effective decision making. Hence, this dissertation addresses the factor structure of the nature of information needed for presentation and proposes a more effective method than…
Impossibility Theorem in Proportional Representation Problem
NASA Astrophysics Data System (ADS)
Karpov, Alexander
2010-09-01
The study examines the general axiomatics of Balinski and Young and analyzes existing proportional representation methods using this approach. The second part of the paper provides a new axiomatics based on rational choice models. The new system of axioms is applied to study known proportional representation systems. It is shown that there is no proportional representation method satisfying even a minimal set of the axioms (monotonicity and neutrality).
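A concrete instance of the kind of monotonicity failure at stake is the classic Alabama paradox under Hamilton's largest-remainder method; the sketch below (with made-up populations, not data from the paper) shows a state losing a seat when the house size grows.

```python
from fractions import Fraction

def hamilton(populations, seats):
    """Largest-remainder (Hamilton) apportionment."""
    total = sum(populations)
    quotas = [Fraction(p * seats, total) for p in populations]
    alloc = [int(q) for q in quotas]            # lower quotas
    order = sorted(range(len(populations)),
                   key=lambda i: quotas[i] - alloc[i], reverse=True)
    for i in order[: seats - sum(alloc)]:       # hand out leftover seats
        alloc[i] += 1
    return alloc

# Enlarging the house from 10 to 11 seats costs the smallest state a
# seat -- a violation of house monotonicity (the Alabama paradox).
print(hamilton([6, 6, 2], 10))   # [4, 4, 2]
print(hamilton([6, 6, 2], 11))   # [5, 5, 1]
```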
ERIC Educational Resources Information Center
Nickerson, Carol A.; McClelland, Gary H.
1988-01-01
A methodology is developed based on axiomatic conjoint measurement to accompany a fertility decision-making model. The usefulness of the model is then demonstrated via an application to a study of contraceptive choice (N=100 male and female family-planning clinic clients). Finally, the validity of the model is evaluated. (TJH)
Hilbert's sixth problem: between the foundations of geometry and the axiomatization of physics.
Corry, Leo
2018-04-28
The sixth of Hilbert's famous 1900 list of 23 problems was a programmatic call for the axiomatization of the physical sciences. It was naturally and organically rooted at the core of Hilbert's conception of what axiomatization is all about. In fact, the axiomatic method which he applied at the turn of the twentieth century in his famous work on the foundations of geometry originated in a preoccupation with foundational questions related with empirical science in general. Indeed, far from a purely formal conception, Hilbert counted geometry among the sciences with strong empirical content, closely related to other branches of physics and deserving a treatment similar to that reserved for the latter. In this treatment, the axiomatization project was meant to play, in his view, a crucial role. Curiously, and contrary to a once-prevalent view, of all the problems in the list, the sixth is the only one that continually engaged Hilbert's efforts over a very long period of time, at least between 1894 and 1932. This article is part of the theme issue 'Hilbert's sixth problem'. © 2018 The Author(s).
Hilbert's sixth problem: between the foundations of geometry and the axiomatization of physics
NASA Astrophysics Data System (ADS)
Corry, Leo
2018-04-01
The sixth of Hilbert's famous 1900 list of 23 problems was a programmatic call for the axiomatization of the physical sciences. It was naturally and organically rooted at the core of Hilbert's conception of what axiomatization is all about. In fact, the axiomatic method which he applied at the turn of the twentieth century in his famous work on the foundations of geometry originated in a preoccupation with foundational questions related with empirical science in general. Indeed, far from a purely formal conception, Hilbert counted geometry among the sciences with strong empirical content, closely related to other branches of physics and deserving a treatment similar to that reserved for the latter. In this treatment, the axiomatization project was meant to play, in his view, a crucial role. Curiously, and contrary to a once-prevalent view, of all the problems in the list, the sixth is the only one that continually engaged Hilbert's efforts over a very long period of time, at least between 1894 and 1932. This article is part of the theme issue `Hilbert's sixth problem'.
A General Symbolic Method with Physical Applications
NASA Astrophysics Data System (ADS)
Smith, Gregory M.
2000-06-01
A solution to the problem of unifying the General Relativistic and Quantum Theoretical formalisms is given which introduces a new non-axiomatic symbolic method and an algebraic generalization of the Calculus to non-finite symbolisms without reference to the concept of a limit. An essential feature of the non-axiomatic method is the inadequacy of any (finite) statements: Identifying this aspect of the theory with the "existence of an external physical reality" both allows for the consistency of the method with the results of experiments and avoids the so-called "measurement problem" of quantum theory.
The Automatic Integration of Folksonomies with Taxonomies Using Non-axiomatic Logic
NASA Astrophysics Data System (ADS)
Geldart, Joe; Cummins, Stephen
Cooperative tagging systems such as folksonomies are powerful tools when used to annotate information resources. The inherent power of folksonomies is in their ability to allow casual users to easily contribute ad hoc, yet meaningful, resource metadata without any specialist training. Older folksonomies have begun to degrade due to their lack of internal structure and the use of many low-quality tags. This chapter describes a remedy for some of the problems associated with folksonomies. We introduce a method for the automatic integration and inference of the relationships between tags and resources in a folksonomy using non-axiomatic logic. We test this method on the CiteULike corpus of tags by comparing its precision and recall with those of standard keyword search. Our results show that non-axiomatic reasoning is a promising technique for integrating tagging systems with more structured knowledge representations.
Approaching the axiomatic enrichment of the Gene Ontology from a lexical perspective.
Quesada-Martínez, Manuel; Mikroyannidi, Eleni; Fernández-Breis, Jesualdo Tomás; Stevens, Robert
2015-09-01
The main goal of this work is to measure how lexical regularities in biomedical ontology labels can be used for the automatic creation of formal relationships between classes, and to evaluate the results of applying our approach to the Gene Ontology (GO). In recent years, we have developed a method for the lexical analysis of regularities in biomedical ontology labels, and we showed that the labels can present a high degree of regularity. In this work, we extend our method with a cross-products extension (CPE) metric, which estimates the potential interest of a specific regularity for axiomatic enrichment in the lexical analysis, using information on exact matches in external ontologies. The GO consortium recently enriched the GO by using so-called cross-product extensions. Cross-products are generated by establishing axioms that relate a given GO class with classes from the GO or other biomedical ontologies. We apply our method to the GO and study how its lexical analysis can identify and reconstruct the cross-products that are defined by the GO consortium. The labels of the GO classes are highly regular in lexical terms, and exact matches with labels of external ontologies affect 80% of the GO classes. The CPE metric reveals that 31.48% of the classes that exhibit regularities have fragments that are classes in the two external ontologies selected for our experiment, namely the Cell Ontology and the Chemical Entities of Biological Interest ontology, and that 18.90% of them are fully decomposable into smaller parts. Our results show that the CPE metric permits our method to detect GO cross-product extensions with a mean recall of 62% and a mean precision of 28%. The study is completed with an analysis of false positives to explain this precision value. We think that our results support the claim that our lexical approach can contribute to the axiomatic enrichment of biomedical ontologies and that it can provide new insights into the engineering of biomedical ontologies. Copyright © 2014 Elsevier B.V. All rights reserved.
Geometry and experience: Einstein's 1921 paper and Hilbert's axiomatic system
DOE Office of Scientific and Technical Information (OSTI.GOV)
De Gandt, Francois
2006-06-19
In his 1921 paper Geometrie und Erfahrung, Einstein describes the new epistemological status of geometry, divorced from any intuitive or a priori content. He calls this 'axiomatics', following Hilbert's theoretical developments on axiomatic systems, which started with the stimulus given by a talk by Hermann Wiener in 1891 and progressed until the Foundations of Geometry in 1899. Difficult questions arise: how is a theoretical system related to an intuitive empirical content?
Quantum probability and Hilbert's sixth problem
NASA Astrophysics Data System (ADS)
Accardi, Luigi
2018-04-01
With the birth of quantum mechanics, the two disciplines that Hilbert proposed to axiomatize, probability and mechanics, became entangled and a new probabilistic model arose in addition to the classical one. Thus, to meet Hilbert's challenge, an axiomatization should account deductively for the basic features of all three disciplines. This goal was achieved within the framework of quantum probability. The present paper surveys the quantum probabilistic axiomatization. This article is part of the themed issue `Hilbert's sixth problem'.
Upper entropy axioms and lower entropy axioms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guo, Jin-Li, E-mail: phd5816@163.com; Suo, Qi
2015-04-15
The paper suggests the concepts of an upper entropy and a lower entropy. We propose a new axiomatic definition, namely, upper entropy axioms, inspired by the axioms of metric spaces, and also formulate lower entropy axioms. We also develop weak upper entropy axioms and weak lower entropy axioms. Their conditions are weaker than those of the Shannon–Khinchin axioms and the Tsallis axioms, while being stronger than those of the axiomatics based on the first three Shannon–Khinchin axioms. The subadditivity and strong subadditivity of entropy are obtained in the new axiomatics. Tsallis statistics is a special case satisfying our axioms. Moreover, different forms of information measures, such as Shannon entropy, Daroczy entropy, Tsallis entropy and other entropies, can be unified under the same axiomatics.
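For reference, two of the measures that such an axiomatics is meant to unify are the Shannon and Tsallis entropies (standard definitions with the usual q → 1 limit; these formulas are not specific to this paper):

```latex
S_{\mathrm{Shannon}}(p) = -\sum_{i=1}^{n} p_i \log p_i,
\qquad
S_q^{\mathrm{Tsallis}}(p) = \frac{1}{q-1}\Bigl(1 - \sum_{i=1}^{n} p_i^{\,q}\Bigr),
\qquad
\lim_{q\to 1} S_q^{\mathrm{Tsallis}}(p) = S_{\mathrm{Shannon}}(p).
```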
An algebra of reversible computation.
Wang, Yong
2016-01-01
We design an axiomatization for reversible computation called reversible ACP (RACP). It has four extendible modules: basic reversible processes algebra, algebra of reversible communicating processes, recursion and abstraction. Just like process algebra ACP in classical computing, RACP can be treated as an axiomatization foundation for reversible computation.
The place of probability in Hilbert's axiomatization of physics, ca. 1900-1928
NASA Astrophysics Data System (ADS)
Verburgt, Lukas M.
2016-02-01
Although it has become commonplace to refer to the 'sixth problem' of Hilbert's (1900) Paris lecture as the starting point for modern axiomatized probability theory, his own views on probability have received comparatively little explicit attention. The central aim of this paper is to provide a detailed account of this topic in light of the central observation that the development of Hilbert's project of the axiomatization of physics went hand-in-hand with a redefinition of the status of probability theory and the meaning of probability. Where Hilbert first regarded the theory as a mathematizable physical discipline and later approached it as a 'vague' mathematical application in physics, he eventually understood probability, first, as a feature of human thought and, then, as an implicitly defined concept without a fixed physical interpretation. It thus becomes possible to suggest that Hilbert came to question, from the early 1920s on, the very possibility of achieving the goal of the axiomatization of probability as described in the 'sixth problem' of 1900.
Slow dynamics in glasses: A comparison between theory and experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Phillips, J. C.
Minimalist theories of complex systems are broadly of two kinds: mean field and axiomatic. So far, all theories of complex properties absent from simple systems and intrinsic to glasses are axiomatic. Stretched Exponential Relaxation (SER) is the prototypical complex temporal property of glasses, discovered by Kohlrausch 150 years ago, and now observed almost universally in microscopically homogeneous, complex nonequilibrium materials, including luminescent electronic Coulomb glasses. A critical comparison of alternative axiomatic theories with both numerical simulations and experiments strongly favors channeled dynamical trap models over static percolative or energy landscape models. The topics discussed cover those reported since the author's review article in 1996, with an emphasis on parallels between channel bifurcation in electronic and molecular relaxation.
NASA Technical Reports Server (NTRS)
Lee, Taesik; Jeziorek, Peter
2004-01-01
Large complex projects cost large sums of money throughout their life cycle for a variety of reasons and causes. For such large programs, the credible estimation of the project cost, a quick assessment of the cost of making changes, and the management of the project budget with effective cost reduction determine the viability of the project. Cost engineering that deals with these issues requires a rigorous method and systematic processes. This paper introduces a logical framework to achieve effective cost engineering. The framework is built upon the Axiomatic Design process. The structure in the Axiomatic Design process provides a good foundation to closely tie engineering design and cost information together. The cost framework presented in this paper is a systematic link between the functional domain (FRs), physical domain (DPs), cost domain (CUs), and a task/process-based model. The FR-DP map relates a system's functional requirements to design solutions across all levels and branches of the decomposition hierarchy. DPs are mapped into CUs, which provides a means to estimate the cost of design solutions - DPs - from the cost of the physical entities in the system - CUs. The task/process model describes the iterative process of developing each of the CUs, and is used to estimate the cost of CUs. By linking the four domains, this framework provides superior traceability from requirements to cost information.
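The FR-DP-CU linkage can be sketched as a simple traceability structure; the requirement, parameter and cost-unit names and the figures below are hypothetical, not from the paper, and the rollup is only one plausible way to propagate cost estimates.

```python
# Hypothetical FR -> DP -> CU traceability, with cost rolled up from
# cost units (CUs) to the design parameters (DPs) and functional
# requirements (FRs) that depend on them.
fr_to_dp = {"FR1": ["DP1"], "FR2": ["DP2", "DP3"]}
dp_to_cu = {"DP1": ["CU-a"], "DP2": ["CU-a", "CU-b"], "DP3": ["CU-c"]}
cu_cost = {"CU-a": 120.0, "CU-b": 45.0, "CU-c": 80.0}   # e.g. $k estimates

def dp_cost(dp):
    return sum(cu_cost[cu] for cu in dp_to_cu[dp])

def fr_cost(fr):
    return sum(dp_cost(dp) for dp in fr_to_dp[fr])

for fr in fr_to_dp:
    print(fr, fr_cost(fr))
# Revising one CU estimate propagates to every FR that traces to it,
# which is the kind of traceability the framework aims to provide.
```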
The grey Shapley value: an axiomatization
NASA Astrophysics Data System (ADS)
Ekici, M.; Palanci, O.; Alparslan Gök, S. Z.
2018-01-01
This study focuses on an interesting class of cooperative games where the coalitional values are interval grey numbers. These cooperative games are called cooperative grey games. In this paper, we deal with an axiomatization of the grey Shapley value. We also introduce the Banzhaf value and the egalitarian rule using cooperative grey game theory. Finally, concluding remarks are given.
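For context, the classical Shapley value that the grey Shapley value generalizes assigns player i in a game (N, v) the payoff below; in the grey setting the characteristic function takes interval grey numbers as values (the formula is the standard one, not reproduced from the paper):

```latex
\phi_i(v) \;=\; \sum_{S \subseteq N \setminus \{i\}}
\frac{|S|!\,\bigl(|N|-|S|-1\bigr)!}{|N|!}\,
\bigl(v(S \cup \{i\}) - v(S)\bigr).
```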
Axiomatic Design of a Framework for the Comprehensive Optimization of Patient Flows in Hospitals
Matt, Dominik T.
2017-01-01
Lean Management and Six Sigma are nowadays applied not only to the manufacturing industry but also to service industry and public administration. The manifold variables affecting the Health Care system minimize the effect of a narrow Lean intervention. Therefore, this paper aims to discuss a comprehensive, system-based approach to achieve a factual holistic optimization of patient flows. This paper debates the efficacy of Lean principles applied to the optimization of patient flows and related activities, structures, and resources, developing a theoretical framework based on the principles of the Axiomatic Design. The demand for patient-oriented and efficient health services leads to use these methodologies to improve hospital processes. In the framework, patients with similar characteristics are clustered in families to achieve homogeneous flows through the value stream. An optimization checklist is outlined as the result of the mapping between Functional Requirements and Design Parameters, with the right sequence of the steps to optimize the patient flow according to the principles of Axiomatic Design. The Axiomatic Design-based top-down implementation of Health Care evidence, according to Lean principles, results in a holistic optimization of hospital patient flows, by reducing the complexity of the system. PMID:29065578
Axiomatic Design of a Framework for the Comprehensive Optimization of Patient Flows in Hospitals.
Arcidiacono, Gabriele; Matt, Dominik T; Rauch, Erwin
2017-01-01
Lean Management and Six Sigma are nowadays applied not only to the manufacturing industry but also to service industry and public administration. The manifold variables affecting the Health Care system minimize the effect of a narrow Lean intervention. Therefore, this paper aims to discuss a comprehensive, system-based approach to achieve a factual holistic optimization of patient flows. This paper debates the efficacy of Lean principles applied to the optimization of patient flows and related activities, structures, and resources, developing a theoretical framework based on the principles of the Axiomatic Design. The demand for patient-oriented and efficient health services leads to use these methodologies to improve hospital processes. In the framework, patients with similar characteristics are clustered in families to achieve homogeneous flows through the value stream. An optimization checklist is outlined as the result of the mapping between Functional Requirements and Design Parameters, with the right sequence of the steps to optimize the patient flow according to the principles of Axiomatic Design. The Axiomatic Design-based top-down implementation of Health Care evidence, according to Lean principles, results in a holistic optimization of hospital patient flows, by reducing the complexity of the system.
Axiomatic Design of a Framework for the Comprehensive Optimization of Patient Flows in Hospitals
Arcidiacono, Gabriele; Matt, Dominik T.; Rauch, Erwin
2017-01-01
Lean Management and Six Sigma are nowadays applied not only to the manufacturing industry but also to service industry and public administration. The manifold variables affecting the Health Care system minimize the effect of a narrow Lean intervention. Therefore, this paper aims to discuss a comprehensive, system-based approach to achieve a factual holistic optimization of patient flows. This paper debates the efficacy of Lean principles applied to the optimization of patient flows and related activities, structures, and resources, developing a theoretical framework based on the principles of the Axiomatic Design. The demand for patient-oriented and efficient health services leads to use these methodologies to improve hospital processes. In the framework, patients with similar characteristics are clustered in families to achieve homogeneous flows through the value stream. An optimization checklist is outlined as the result of the mapping between Functional Requirements and Design Parameters, with the right sequence of the steps to optimize the patient flow according to the principles of Axiomatic Design. The Axiomatic Design-based top-down implementation of Health Care evidence, according to Lean principles, results in a holistic optimization of hospital patient flows, by reducing the complexity of the system. © 2017 Gabriele Arcidiacono et al.
An MEG signature corresponding to an axiomatic model of reward prediction error.
Talmi, Deborah; Fuentemilla, Lluis; Litvak, Vladimir; Duzel, Emrah; Dolan, Raymond J
2012-01-02
Optimal decision-making is guided by evaluating the outcomes of previous decisions. Prediction errors are theoretical teaching signals which integrate two features of an outcome: its inherent value and the prior expectation of its occurrence. To uncover the magnetic signature of prediction errors in the human brain we acquired magnetoencephalographic (MEG) data while participants performed a gambling task. Our primary objective was to use formal criteria, based upon an axiomatic model (Caplin and Dean, 2008a), to determine the presence and timing profile of MEG signals that express prediction errors. We report analyses at the sensor level, implemented in SPM8, time locked to outcome onset. We identified, for the first time, an MEG signature of prediction error, which emerged approximately 320 ms after an outcome and was expressed as an interaction between outcome valence and probability. This signal followed earlier, separate signals for outcome valence and probability, which emerged approximately 200 ms after an outcome. Strikingly, the time course of the prediction error signal, as well as the early valence signal, resembled the Feedback-Related Negativity (FRN). In simultaneously acquired EEG data we obtained a robust FRN, but the win and loss signals that comprised this difference wave did not comply with the axiomatic model. Our findings motivate an explicit examination of the critical issue of timing embodied in computational models of prediction errors as seen in human electrophysiological data. Copyright © 2011 Elsevier Inc. All rights reserved.
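In its simplest form, the reward prediction error tested here is the difference between the received outcome and its expectation under the gamble; the valence-by-probability interaction reported at roughly 320 ms is the signature of this quantity. The notation below is generic, not the authors':

```latex
\delta_t \;=\; r_t - \mathbb{E}\bigl[r_t \mid \text{gamble}_t\bigr],
\qquad
\mathbb{E}\bigl[r_t \mid \text{gamble}_t\bigr] \;=\; \sum_{k} p_k\, r_k .
```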
Applying axiomatic design to a medication distribution system
NASA Astrophysics Data System (ADS)
Raguini, Pepito B.
As the need to minimize medication errors drives many medical facilities to seek robust solutions to the most common errors affecting patient safety, these hospitals would be wise to put concerted effort into finding methodologies that can facilitate an optimized medication distribution system. If a hospital's upper management is looking for an optimization method that is an ideal fit, it is just as important that the right tool be selected for the application at hand. In the present work, we propose the application of Axiomatic Design (AD), a process that focuses on the generation and selection of functional requirements to meet the customer needs for product and/or process design. The appeal of the axiomatic approach is that it provides both a formal design process and a set of technical coefficients for meeting the customer's needs. Thus, AD offers a strategy for the effective integration of people, design methods, design tools and design data. Therefore, we propose applying the AD methodology to this medical application with the main objective of allowing nurses to provide cost-effective delivery of medications to inpatients, thereby improving the quality of patient care. The AD methodology will be implemented through the use of focused stores, where medications can be readily stored and conveniently located near patients, as well as a mobile apparatus commonly used by hospitals that can also store medications, the medication cart. Moreover, a robust approach called the focused store methodology will be introduced and developed for both the uncapacitated and capacitated case studies, setting up an appropriate AD framework and design problem for a medication distribution case study.
Atoms in molecules, an axiomatic approach. I. Maximum transferability
NASA Astrophysics Data System (ADS)
Ayers, Paul W.
2000-12-01
Central to chemistry is the concept of transferability: the idea that atoms and functional groups retain certain characteristic properties in a wide variety of environments. Providing a completely satisfactory mathematical basis for the concept of atoms in molecules, however, has proved difficult. The present article pursues an axiomatic basis for the concept of an atom within a molecule, with particular emphasis devoted to the definition of transferability and the atomic description of Hirshfeld.
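For context, the Hirshfeld ('stockholder') atomic description mentioned here partitions the molecular electron density using weights built from isolated-atom (promolecular) densities; in standard notation:

```latex
w_A(\mathbf{r}) \;=\; \frac{\rho_A^{0}(\mathbf{r})}{\sum_{B} \rho_B^{0}(\mathbf{r})},
\qquad
\rho_A(\mathbf{r}) \;=\; w_A(\mathbf{r})\,\rho_{\mathrm{mol}}(\mathbf{r}).
```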
Reflections on Heckman and Pinto’s Causal Analysis After Haavelmo
2013-11-01
Pearl, Judea. University of California, Los Angeles, Computer Science Department.
A Default Temporal Logic for Regulatory Conformance Checking
2008-04-01
In Section 4.3, we provide an axiomatization using Fitting's sequent calculus [25]; completeness is proved in Section 4.4. A sequent is a statement of the form Γ → Δ, where Γ and Δ are finite sets of formulas.
Macroinformational analysis of conditions for controllability of space-vehicle orbit
NASA Astrophysics Data System (ADS)
Glazov, B. I.
2011-12-01
The general axiomatics of information measures for the macro analysis of relations of an information-cybernetic system in control is introduced. The general structure of a semantically marked graph of open and closed relations of an information-cybernetic system between the participants in the environment, as well as the necessary axiomatic and technological information-cybernetic conditions for the controllability and observability of objects, are justified for the case of a space vehicle in orbit.
D'Ambrosio, Antonio; Heiser, Willem J
2016-09-01
Preference rankings usually depend on the characteristics of both the individuals judging a set of objects and the objects being judged. This topic has been handled in the literature with log-linear representations of the generalized Bradley-Terry model and, recently, with distance-based tree models for rankings. A limitation of these approaches is that they only work with full rankings or with a pre-specified pattern governing the presence of ties, and/or they are based on quite strict distributional assumptions. To overcome these limitations, we propose a new prediction tree method for ranking data that is totally distribution-free. It combines Kemeny's axiomatic approach to define a unique distance between rankings with the CART approach to find a stable prediction tree. Furthermore, our method is not limited by any particular design of the pattern of ties. The method is evaluated in an extensive full-factorial Monte Carlo study with a new simulation design.
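The Kemeny distance at the heart of this method counts the object pairs on which two rankings disagree. A minimal sketch for full rankings follows (the method itself also accommodates ties and any pattern of partial rankings):

```python
from itertools import combinations

def kemeny_distance(rank_a, rank_b):
    """Number of object pairs ordered differently by the two rankings.

    rank_a and rank_b map each object to its rank position (1 = best).
    """
    d = 0
    for x, y in combinations(list(rank_a), 2):
        a = rank_a[x] - rank_a[y]
        b = rank_b[x] - rank_b[y]
        if a * b < 0:          # the pair is ordered oppositely
            d += 1
    return d

print(kemeny_distance({"A": 1, "B": 2, "C": 3},
                      {"A": 2, "B": 1, "C": 3}))   # 1
```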
An Axiomatic Approach to Hyperconnectivity
NASA Astrophysics Data System (ADS)
Wilkinson, Michael H. F.
In this paper the notion of hyperconnectivity, first put forward by Serra as an extension of the notion of connectivity, is explored theoretically. Hyperconnectivity operators, the hyperconnected equivalents of connectivity openings, are defined; they support both hyperconnected reconstruction and attribute filters. The new axiomatics yields insight into the relationship between hyperconnectivity and structural morphology. The latter turns out to be a special case of the former, which means that a continuum of filters between connected and structural filters exists, all of which fall into the category of hyperconnected filters.
Hyperconnectivity, Attribute-Space Connectivity and Path Openings: Theoretical Relationships
NASA Astrophysics Data System (ADS)
Wilkinson, Michael H. F.
In this paper the relationship of hyperconnected filters with path openings and attribute-space connected filters is studied. Using a recently developed axiomatic framework based on hyperconnectivity operators, which are the hyperconnected equivalents of connectivity openings, it is shown that path openings are a special case of hyperconnected area openings. The new axiomatics also yields insight into the relationship between hyperconnectivity and attribute-space connectivity. It is shown that any hyperconnectivity is an attribute-space connectivity, but that the reverse is not true.
ERIC Educational Resources Information Center
Olesen, Mogens Noergaard
2010-01-01
In the history of mankind three important philosophical and scientific revolutions have taken place. The first of these revolutions was the mathematical-axiomatic revolution in ancient Greece, when the philosophers from Thales of Miletus to Archimedes built up the abstract deductive method used in pure mathematics. The second took place in the…
NASA Technical Reports Server (NTRS)
Tian, Jianhui; Porter, Adam; Zelkowitz, Marvin V.
1992-01-01
Identification of high cost modules has been viewed as one mechanism to improve overall system reliability, since such modules tend to produce more than their share of problems. A decision tree model was used to identify such modules. In the current paper, a previously developed axiomatic model of program complexity is merged with the previously developed decision tree process to improve the ability to identify such modules. This improvement was tested using data from the NASA Software Engineering Laboratory.
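The decision-tree step can be illustrated with a toy classifier; the features, data and threshold below are hypothetical stand-ins, not the metrics or model of the NASA Software Engineering Laboratory study.

```python
# Illustrative only: a decision tree flagging potentially high-cost
# modules from simple complexity metrics.
from sklearn.tree import DecisionTreeClassifier

# columns: [lines_of_code, cyclomatic_complexity, fan_out]
X = [[120, 4, 3], [950, 31, 17], [300, 9, 5], [1500, 48, 22], [80, 2, 1]]
y = [0, 1, 0, 1, 0]   # 1 = module exceeded its effort/defect budget

clf = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(clf.predict([[700, 25, 12]]))   # predicted risk class for a new module
```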
Atmospheric Transmittance and Radiance: Methods of Calculation
1975-06-01
plasma theory. There are many analogies and in many cases the mathematical procedures used in the analyses are quite similar. The axiomatic basis for the... Nevertheless, an almost complete compilation is provided by the Radiation Shielding Information Center at the Oak Ridge National Laboratory. The... E. Turner, "Atmospheric Effects in Remote Sensing," Remote Sensing of Earth Resources, Vol. II, F. Shahrokhi (ed.), University of Tennessee, 1973.
Dynamic Order Algebras as an Axiomatization of Modal and Tense Logics
NASA Astrophysics Data System (ADS)
Chajda, Ivan; Paseka, Jan
2015-12-01
The aim of the paper is to introduce and describe tense operators in every propositional logic which is axiomatized by means of an algebra whose underlying structure is a bounded poset or even a lattice. We introduce the operators G, H, P and F without regard to what propositional connectives the logic includes. For this we use the axiomatization of universal quantifiers as a starting point and modify these axioms for our purposes. At first, we show that the operators can be recognized as modal operators and we study the pairs (P, G) as so-called dynamic order pairs. Further, we obtain constructions of these operators in the corresponding algebra provided a time frame is given. Moreover, we solve the problem of finding a time frame in the case when the tense operators are given. In particular, any tense algebra is representable in its Dedekind-MacNeille completion. Our approach is fully general: we do not rely on the logic under consideration, and hence it is applicable in all cases known up to now.
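For orientation, in the classical Boolean setting the four operators are interdefinable and obey the minimal tense-logic laws below; the paper's point is to recover analogues of G, H, P and F over bounded posets and lattices, where these classical laws need not all be available. The formulas are the standard ones, not the paper's weakened axioms:

```latex
F\varphi \;\equiv\; \lnot G\lnot\varphi, \qquad
P\varphi \;\equiv\; \lnot H\lnot\varphi,
\\[4pt]
G(\varphi\to\psi)\to(G\varphi\to G\psi), \qquad
H(\varphi\to\psi)\to(H\varphi\to H\psi), \qquad
\varphi\to GP\varphi, \qquad
\varphi\to HF\varphi .
```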
2008-10-23
Analysis from DFSS axiomatic design methods indicates the... solutions. AA enables a comprehensive analysis across different force configurations and dynamic situations. Analysis results analyzed by the Software Engineering Institute and reviewed by the NDIA SE Effectiveness Committee; a public NDIA/SEI report was released.
Discrimination in a General Algebraic Setting
Fine, Benjamin; Lipschutz, Seymour; Spellman, Dennis
2015-01-01
Discriminating groups were introduced by G. Baumslag, A. Myasnikov, and V. Remeslennikov as an outgrowth of their theory of algebraic geometry over groups. Algebraic geometry over groups became the main method of attack on the solution of the celebrated Tarski conjectures. In this paper we explore the notion of discrimination in a general universal algebra context. As an application we provide a different proof of a theorem of Malcev on axiomatic classes of Ω-algebras. PMID:26171421
1977-02-01
usually with a ceiling height well in excess of head height. Formations (speleothems) include flowstone, stalagmites, stalactites, helictites, columns, onyx... It is axiomatic that endangered species require some form of management, as, by definition, most of them would become extinct otherwise.
NASA Technical Reports Server (NTRS)
Owre, Sam; Shankar, Natarajan; Butler, Ricky W. (Technical Monitor)
2001-01-01
The purpose of this task was to provide a mechanism for theory interpretations in the Prototype Verification System (PVS) so that it is possible to demonstrate the consistency of a theory by exhibiting an interpretation that validates the axioms. The mechanization makes it possible to show that one collection of theories is correctly interpreted by another collection of theories under a user-specified interpretation for the uninterpreted types and constants. A theory instance is generated and imported, while the axiom instances are generated as proof obligations to ensure that the interpretation is valid. Interpretations can be used to show that an implementation is a correct refinement of a specification, that an axiomatically defined specification is consistent, or that an axiomatically defined specification captures its intended models. In addition, the theory parameter mechanism has been extended with a notion of theory as parameter, so that a theory instance can be given as an actual parameter to an imported theory. Theory interpretations can thus be used to refine an abstract specification or to demonstrate the consistency of an axiomatic theory. In this report we describe the mechanism in detail. This extension is part of PVS version 3.0, which will be publicly released in mid-2001.
Hilbert's 'Foundations of Physics': Gravitation and electromagnetism within the axiomatic method
NASA Astrophysics Data System (ADS)
Brading, K. A.; Ryckman, T. A.
2008-01-01
In November and December 1915, Hilbert presented two communications to the Göttingen Academy of Sciences under the common title 'The Foundations of Physics'. Versions of each eventually appeared in the Nachrichten of the Academy. Hilbert's first communication has received significant reconsideration in recent years, following the discovery of printer's proofs of this paper, dated 6 December 1915. The focus has been primarily on the 'priority dispute' over the Einstein field equations. Our contention, in contrast, is that the discovery of the December proofs makes it possible to see the thematic linkage between the material that Hilbert cut from the published version of the first communication and the content of the second, as published in 1917. The latter has been largely either disregarded or misinterpreted, and our aim is to show that (a) Hilbert's two communications should be regarded as part of a wider research program within the overarching framework of 'the axiomatic method' (as Hilbert expressly stated was the case), and (b) the second communication is a fine and coherent piece of work within this framework, whose principal aim is to address an apparent tension between general invariance and causality (in the precise sense of Cauchy determination), pinpointed in Theorem I of the first communication. This is not the same problem as that found in Einstein's 'hole argument'-something that, we argue, never confused Hilbert.
Mind-body dualism and the compatibility of medical methods.
Burkhardt, Hans; Imaguire, Guido
2002-01-01
In this paper we analyse some misleading theses concerning the old controversy over the relation between mind and body presented in contemporary medical literature. We undertake an epistemological clarification of the axiomatic structure of medical methods. This clarification, in turn, requires a precise philosophical explanation of the presupposed concepts. This analysis will establish two results: (1) that the mind-body dualism cannot be understood as a kind of biological variation of the subject-object dichotomy in physics, and (2) that the thesis of the incompatibility between somatic and psychosomatic medicine held by naturalists and others lacks solid epistemological foundation.
NASA Technical Reports Server (NTRS)
Onwubiko, Chinyere; Onyebueke, Landon
1996-01-01
This program report is the final report covering all the work done on this project. The goal of this project is technology transfer of methodologies to improve the design process. The specific objectives are: 1. To learn and understand probabilistic design analysis using NESSUS. 2. To assign design projects to either undergraduate or graduate students on the application of NESSUS. 3. To integrate the application of NESSUS into selected senior-level courses in the Civil and Mechanical Engineering curricula. 4. To develop courseware in probabilistic design methodology to be included in a graduate-level design methodology course. 5. To study the relationship between the probabilistic design methodology and the axiomatic design methodology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
L.Y. Dodin and N.J. Fisch
2012-06-18
By restating geometrical optics within the field-theoretical approach, the classical concept of a photon in an arbitrary dispersive medium is introduced, and photon properties are calculated unambiguously. In particular, the canonical and kinetic momenta carried by a photon, as well as the two corresponding energy-momentum tensors of a wave, are derived straightforwardly from first principles of Lagrangian mechanics. The Abraham-Minkowski controversy pertaining to the definitions of these quantities is thereby resolved for linear waves of arbitrary nature, and corrections to the traditional formulas for the photon kinetic quantities are found. An application of axiomatic geometrical optics to electromagnetic waves is also presented as an example.
Scalable and Axiomatic Ranking of Network Role Similarity
Jin, Ruoming; Lee, Victor E.; Li, Longjie
2014-01-01
A key task in analyzing social networks and other complex networks is role analysis: describing and categorizing nodes according to how they interact with other nodes. Two nodes have the same role if they interact with equivalent sets of neighbors. The most fundamental role equivalence is automorphic equivalence. Unfortunately, the fastest algorithms known for graph automorphism are nonpolynomial. Moreover, since exact equivalence is rare, a more meaningful task is measuring the role similarity between any two nodes. This task is closely related to the structural or link-based similarity problem that SimRank addresses. However, SimRank and other existing similarity measures are not sufficient because they do not guarantee to recognize automorphically or structurally equivalent nodes. This paper makes two contributions. First, we present and justify several axiomatic properties necessary for a role similarity measure or metric. Second, we present RoleSim, a new similarity metric which satisfies these axioms and which can be computed with a simple iterative algorithm. We rigorously prove that RoleSim satisfies all these axiomatic properties. We also introduce Iceberg RoleSim, a scalable algorithm which discovers all pairs with RoleSim scores above a user-defined threshold θ. We demonstrate the interpretative power of RoleSim on both synthetic and real datasets. PMID:25383066
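The flavor of the iterative computation can be sketched as follows. This is a simplified, greedy approximation written from the general description above: the published algorithm uses an exact maximal weighted matching of the two neighborhoods, and the damping value and graph here are made up for illustration.

```python
# Simplified RoleSim-style iteration (greedy neighbor matching).
def rolesim_step(sim, adj, beta=0.15):
    new = {}
    for u in adj:
        for v in adj:
            Nu, Nv = adj[u], adj[v]
            if not Nu or not Nv:
                new[(u, v)] = beta
                continue
            pairs = sorted(((sim[(x, y)], x, y) for x in Nu for y in Nv),
                           reverse=True)
            used_x, used_y, total = set(), set(), 0.0
            for s, x, y in pairs:            # greedy one-to-one matching
                if x not in used_x and y not in used_y:
                    used_x.add(x); used_y.add(y); total += s
            new[(u, v)] = (1 - beta) * total / max(len(Nu), len(Nv)) + beta
    return new

adj = {"a": ["b", "c"], "b": ["a"], "c": ["a"]}
sim = {(u, v): 1.0 for u in adj for v in adj}   # start from all ones
for _ in range(5):
    sim = rolesim_step(sim, adj)
print(round(sim[("b", "c")], 3))   # structurally equivalent nodes stay at 1.0
```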
Axiomatic Design of Space Life Support Systems
NASA Technical Reports Server (NTRS)
Jones, Harry W.
2017-01-01
Systems engineering is an organized way to design and develop systems, but the initial system design concepts are usually seen as the products of unexplained but highly creative intuition. Axiomatic design is a mathematical approach to produce and compare system architectures. The two axioms are: (1) maintain the independence of the functional requirements; and (2) minimize the information content (or complexity) of the design. The first axiom generates good system design structures and the second axiom ranks them. The closed system human life support architecture now implemented in the International Space Station has been essentially unchanged for fifty years. In contrast, brief missions such as Apollo and Shuttle have used open loop life support. As mission length increases, greater system closure and increased recycling become more cost-effective. Closure can be gradually increased, first recycling humidity condensate, then hygiene wastewater, urine, carbon dioxide, and water recovery brine. A long term space station or planetary base could implement nearly full closure, including food production. Dynamic systems theory supports the axioms by showing that fewer requirements, fewer subsystems, and fewer interconnections all increase system stability. If systems are too complex and interconnected, reliability is reduced and operations and maintenance become more difficult. Using axiomatic design shows how the mission duration and other requirements determine the best life support system design, including the degree of closure.
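The second (information) axiom is usually scored with the standard axiomatic-design information content, summed over functional requirements, where p_i is the probability that the i-th requirement is satisfied (equivalently, the ratio of the design's system range to its common range); designs with lower I are ranked higher:

```latex
I \;=\; \sum_{i} \log_2 \frac{1}{p_i}
  \;=\; \sum_{i} \log_2 \frac{\text{system range}_i}{\text{common range}_i}.
```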
NASA Astrophysics Data System (ADS)
Jesudason, Christopher G.
2003-09-01
Recently, there have appeared interesting correctives or challenges [Entropy 1999, 1, 111-147] to the Second Law formulations, especially in the interpretation of the Clausius equivalent transformations, closely related in area to extensions of the Clausius principle to irreversible processes [Chem. Phys. Lett. 1988, 143(1), 65-70]. Since the traditional formulations are central to science, a brief analysis of some of these newer theories along traditional lines is attempted, based on well-attested axioms which have formed the basis of equilibrium thermodynamics. It is deduced that the Clausius analysis leading to the law of increasing entropy does not follow from the given axioms, but it can be proved that for irreversible transitions, the total entropy change of the system and thermal reservoirs (the "Universe") is not negative, even for the case when the reservoirs are not at the same temperature as the system during heat transfer. On the basis of two new simple theorems and three corollaries derived for the correlation between irreversible and reversible pathways and the traditional axiomatics, it is shown that a sequence of reversible states can never be used to describe a corresponding sequence of irreversible states for at least closed systems, thereby restricting the principle of local equilibrium. It is further shown that some of the newer irreversible entropy forms given exhibit some paradoxical properties relative to the standard axiomatics. It is deduced that any reconciliation between the traditional approach and novel theories lies in creating a well defined set of axioms on which all theoretical developments should attempt to be based, unless proven not to be useful, in which case there should be consensus in removing such axioms from theory. Clausius' theory of equivalent transformations does not contradict the traditional understanding of heat-work efficiency. It is concluded that the intuitively derived assumptions over the last two centuries seem to be reasonably well grounded, requiring perhaps some minor elaboration of the concepts of (i) the system, (ii) the mechanism of heat transfer, and (iii) the environment, which would be expected to evolve with time in any case. If new generalizations at variance with Clausius' concepts are presented, then these ideas could be expected to require a different axiomatic basis than the one for equilibrium theory, and this difference must be stated at the outset of any new development. So far such empirically self-consistent axiomatic developments are not very much in evidence.
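For reference, the traditional statements at issue are the Clausius inequality for a cycle and the resulting entropy statement for system plus reservoirs (standard textbook forms, not the paper's new theorems):

```latex
\oint \frac{\delta Q}{T} \;\le\; 0,
\qquad
\Delta S_{\text{system}} + \Delta S_{\text{reservoirs}}
\;=\; \Delta S_{\text{universe}} \;\ge\; 0 .
```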
Decision making in the short and long run: repeated gambles and rationality.
Aloysius, John A
2007-05-01
Experimental evidence indicates that decision makers who reject a single play of a gamble may accept repeated plays of that gamble. The rationality of this pattern of preference has been investigated beginning with Samuelson's colleague (SC), who gained notoriety in a well-known paper. SC's pattern of preference is commonly viewed as a behavioural anomaly. Researchers from branches of psychology and economics have analysed the choice and, despite much debate, there remains considerable confusion. An axiomatic analysis of SC's choice has been used to motivate experimental studies in several disciplines. This paper identifies the axiomatic violation as that of an assumed rather than a normative condition. Therefore, contrary to popular belief, SC's choice is consistent with expected utility theory.
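SC's choice can be made concrete with the bet usually cited in this literature (win $200 or lose $100 on a fair coin): a single play carries a 50% chance of a loss, while 100 independent plays almost never end behind. The sketch below is illustrative background, not the paper's analysis.

```python
from math import comb

def loss_probability(n, win=200, lose=-100, p=0.5):
    """P(total payoff < 0) after n independent plays of the gamble."""
    prob = 0.0
    for k in range(n + 1):                 # k = number of wins
        if k * win + (n - k) * lose < 0:
            prob += comb(n, k) * p**k * (1 - p)**(n - k)
    return prob

print(loss_probability(1))     # 0.5
print(loss_probability(100))   # ~0.0005 chance of ending behind
# Expected value per play is +$50 either way; what changes with n is the
# spread of outcomes, which is where the axiomatic debate begins.
```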
A manifold learning approach to data-driven computational materials and processes
NASA Astrophysics Data System (ADS)
Ibañez, Ruben; Abisset-Chavanne, Emmanuelle; Aguado, Jose Vicente; Gonzalez, David; Cueto, Elias; Duval, Jean Louis; Chinesta, Francisco
2017-10-01
Standard simulation in classical mechanics is based on the use of two very different types of equations. The first one, of axiomatic character, is related to balance laws (momentum, mass, energy, …), whereas the second one consists of models that scientists have extracted from collected, natural or synthetic data. In this work we propose a new method, able to directly link data to computers in order to perform numerical simulations. These simulations will employ universal laws while minimizing the need of explicit, often phenomenological, models. They are based on manifold learning methodologies.
ERIC Educational Resources Information Center
Rogers, Pat
1972-01-01
Criteria for a reasonable axiomatic system are discussed. A discussion of the historical attempts to prove the independence of Euclid's parallel postulate introduces non-Euclidean geometries. Poincare's model for a non-Euclidean geometry is defined and analyzed. (LS)
Comparison of specificity and information for fuzzy domains
NASA Technical Reports Server (NTRS)
Ramer, Arthur
1992-01-01
This paper demonstrates how an integrated theory can be built on the foundation of possibility theory. Information and uncertainty have been considered in the 'fuzzy' literature since 1982. Our point of departure is the model proposed by Klir for the discrete case. It was elaborated axiomatically by Ramer, who also introduced the continuous model. Specificity as a numerical function has been considered mostly within Dempster-Shafer evidence theory. An explicit definition was first given by Yager, who also introduced it in the context of possibility theory. The axiomatic approach and the continuous model have been developed very recently by Ramer and Yager. They also establish a close analytical correspondence between specificity and information. In the literature to date, specificity and uncertainty are defined only for discrete finite domains, with a sole exception. Our presentation removes these limitations. We define specificity measures for arbitrary measurable domains.
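One widely used possibilistic information measure of the kind discussed here is the U-uncertainty of a normalized discrete possibility distribution with values ordered as r_1 = 1 ≥ r_2 ≥ … ≥ r_n (and r_{n+1} = 0); specificity decreases as this quantity grows. The formula below is the standard discrete form, not the continuous extension developed in the paper:

```latex
U(r) \;=\; \sum_{i=2}^{n} (r_i - r_{i+1}) \log_2 i .
```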
Representation of aversive prediction errors in the human periaqueductal gray
Roy, Mathieu; Shohamy, Daphna; Daw, Nathaniel; Jepma, Marieke; Wimmer, Elliott; Wager, Tor D.
2014-01-01
Pain is a primary driver of learning and motivated action. It is also a target of learning, as nociceptive brain responses are shaped by learning processes. We combined an instrumental pain avoidance task with an axiomatic approach to assessing fMRI signals related to prediction errors (PEs), which drive reinforcement-based learning. We found that pain PEs were encoded in the periaqueductal gray (PAG), an important structure for pain control and learning in animal models. Axiomatic tests combined with dynamic causal modeling suggested that ventromedial prefrontal cortex, supported by putamen, provides an expected value-related input to the PAG, which then conveys PE signals to prefrontal regions important for behavioral regulation, including orbitofrontal, anterior mid-cingulate, and dorsomedial prefrontal cortices. Thus, pain-related learning involves distinct neural circuitry, with implications for behavior and pain dynamics. PMID:25282614
Reviving Campbell's paradigm for attitude research.
Kaiser, Florian G; Byrka, Katarzyna; Hartig, Terry
2010-11-01
Because people often say one thing and do another, social psychologists have abandoned the idea of a simple or axiomatic connection between attitude and behavior. Nearly 50 years ago, however, Donald Campbell proposed that the root of the seeming inconsistency between attitude and behavior lies in disregard of behavioral costs. According to Campbell, attitude-behavior gaps are empirical chimeras. Verbal claims and other overt behaviors regarding an attitude object all arise from one "behavioral disposition." In this article, the authors present the constituents of and evidence for a paradigm for attitude research that describes individual behavior as a function of a person's attitude level and the costs of the specific behavior involved. In the authors' version of Campbell's paradigm, they propose a formal and thus axiomatic rather than causal relationship between an attitude and its corresponding performances. The authors draw implications of their proposal for mainstream attitude theory, empirical research, and applications concerning attitudes.
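The "attitude level plus behavioral cost" idea is typically formalized in this line of work with a Rasch-type item response model, in which the two quantities enter additively on the logit scale; the notation below is generic and is offered as an interpretation, not a formula quoted from the article:

```latex
\Pr(x_{pi} = 1) \;=\; \frac{\exp(\theta_p - \delta_i)}{1 + \exp(\theta_p - \delta_i)},
```

where θ_p is person p's attitude level and δ_i is the cost (difficulty) of performing behavior i.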
Can a poverty-reducing and progressive tax and transfer system hurt the poor?
Higgins, Sean; Lustig, Nora
2016-09-01
To analyze anti-poverty policies in tandem with the taxes used to pay for them, comparisons of poverty before and after taxes and transfers are often used. We show that these comparisons, as well as measures of horizontal equity and progressivity, can fail to capture an important aspect: that a substantial proportion of the poor are made poorer (or non-poor made poor) by the tax and transfer system. We illustrate with data from seventeen developing countries: in fifteen, the fiscal system is poverty-reducing and progressive, but in ten of these at least one-quarter of the poor pay more in taxes than they receive in transfers. We call this fiscal impoverishment, and axiomatically derive a measure of its extent. An analogous measure of fiscal gains of the poor is also derived, and we show that changes in the poverty gap can be decomposed into our axiomatic measures of fiscal impoverishment and gains.
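The idea behind the fiscal impoverishment and fiscal gains measures can be sketched for a poverty line z as follows; this is a per-capita illustration under simple assumptions, and the paper's axiomatically derived measures may differ in normalization. The incomes are made up.

```python
def fiscal_impoverishment(pre, post, z):
    """Average amount by which the fiscal system pushes people below
    (or deeper below) the poverty line z; pre/post are incomes before
    and after taxes and transfers."""
    return sum(max(0.0, min(y0, z) - min(y1, z))
               for y0, y1 in zip(pre, post)) / len(pre)

def fiscal_gains_of_poor(pre, post, z):
    """Average amount by which pre-fisc poor incomes are lifted toward z."""
    return sum(max(0.0, min(y1, z) - min(y0, z))
               for y0, y1 in zip(pre, post)) / len(pre)

pre, post, z = [3.0, 6.0, 12.0], [4.5, 5.0, 11.0], 5.5
print(fiscal_impoverishment(pre, post, z))   # person 2 is made poor: ~0.167
print(fiscal_gains_of_poor(pre, post, z))    # person 1 gains: 0.5
# In this formulation the fall in the average poverty gap (0.333) equals
# gains minus impoverishment, mirroring the decomposition described above.
```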
NASA Astrophysics Data System (ADS)
Lee, Dai Gil; Suh, Nam Pyo
2005-11-01
The idea that materials can be designed to satisfy specific performance requirements is relatively new. With high-performance composites, however, the entire process of designing and fabricating a part can be worked out before manufacturing. The purpose of this book is to present an integrated approach to the design and manufacturing of products from advanced composites. It shows how the basic behavior of composites and their constitutive relationships can be used during the design stage, which minimizes the complexity of manufacturing composite parts and reduces the repetitive "design-build-test" cycle. Designing it right the first time is going to determine the competitiveness of a company, the reliability of the part, the robustness of fabrication processes, and ultimately, the cost and development time of composite parts. Most of all, it should expand the use of advanced composite parts in fields that use composites only to a limited extent at this time. To achieve these goals, this book presents the design and fabrication of novel composite parts made for machine tools and other applications like robots and automobiles. This book is suitable as a textbook for graduate courses in the design and fabrication of composites. It will also be of interest to practicing engineers learning about composites and axiomatic design. A CD-ROM is included in every copy of the book, containing Axiomatic CLPT software. This program, developed by the authors, will assist readers in calculating material properties from the microstructure of the composite. This book is part of the Oxford Series on Advanced Manufacturing.
The transition to formal thinking in mathematics
NASA Astrophysics Data System (ADS)
Tall, David
2008-09-01
This paper focuses on the changes in thinking involved in the transition from school mathematics to formal proof in pure mathematics at university. School mathematics is seen as a combination of visual representations, including geometry and graphs, together with symbolic calculations and manipulations. Pure mathematics in university shifts towards a formal framework of axiomatic systems and mathematical proof. In this paper, the transition in thinking is formulated within a framework of `three worlds of mathematics': the `conceptual-embodied' world based on perception, action and thought experiment; the `proceptual-symbolic' world of calculation and algebraic manipulation, compressing processes such as counting into concepts such as number; and the `axiomatic-formal' world of set-theoretic concept definitions and mathematical proof. Each `world' has its own sequence of development and its own forms of proof that may be blended together to give a rich variety of ways of thinking mathematically. This reveals mathematical thinking as a blend of differing knowledge structures; for instance, the real numbers blend together the embodied number line, symbolic decimal arithmetic and the formal theory of a complete ordered field. Theoretical constructs are introduced to describe how genetic structures set before birth enable the development of mathematical thinking, and how experiences that the individual has met before affect their personal growth. These constructs are used to consider how students negotiate the transition from school to university mathematics as embodiment and symbolism are blended with formalism. At a higher level, structure theorems proved in axiomatic theories link back to more sophisticated forms of embodiment and symbolism, revealing the intimate relationship between the three worlds.
NASA Astrophysics Data System (ADS)
Konopleva, Nelly
2017-03-01
Fundamental physical theory axiomatics is closely connected with methods of experimental measurement. The difference between theories using global and local symmetries is explained. It is shown that symmetry group localization leads not only to a change of the relativity principle, but to a fundamental modification of the experimental programs testing physical theory predictions. It is noted that any fundamental physical theory must be consistent with the measurement procedures employed for its testing. These ideas are illustrated by events of my biography connected with the transformation of Yang-Mills theory from an ordinary phenomenological model to a fundamental physical theory based on local symmetry principles, like Einsteinian General Relativity. Baldin's position in this situation is demonstrated.
The difference that difference makes: bioethics and the challenge of "disability".
Koch, Tom
2004-12-01
Two rival paradigms permeate bioethics. One generally favors eugenics, euthanasia, assisted suicide and other methods for those with severely restricting physical and cognitive attributes. The other typically opposes these and favors instead ample support for "persons of difference" and their caring families or loved ones. In an attempt to understand the relation between these two paradigms, this article analyzes a publicly reported debate between proponents of both paradigms, bioethicist Peter Singer and lawyer Harriet McBryde Johnson. At issue, the article concludes, are two distinct axiomatic sets of values resulting in not simply different styles of rhetoric but different vocabularies, in effect two different languages of ethics.
Criteria for Assessing Naturalistic Inquiries as Reports.
ERIC Educational Resources Information Center
Lincoln, Yvonna S.; Guba, Egon G.
Research on the assessment of naturalistic inquiries is reviewed, and criteria for assessment are outlined. Criteria reviewed include early foundational and non-foundational criteria, trustworthiness criteria, axiomatic criteria, rhetorical criteria, action criteria, and application/transferability criteria. Case studies that are reports of…
How to Obtain the Covariant Form of Maxwell's Equations from the Continuity Equation
ERIC Educational Resources Information Center
Heras, Jose A.
2009-01-01
The covariant Maxwell equations are derived from the continuity equation for the electric charge. This result provides an axiomatic approach to Maxwell's equations in which charge conservation is emphasized as the fundamental axiom underlying these equations.
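For readers unfamiliar with the notation, the two ingredients named in the abstract can be written compactly as follows; this is standard textbook notation (SI units, field tensor F and four-current J), not a reproduction of Heras's derivation.

```latex
% Continuity equation for electric charge (the starting point named above)
\partial_\mu J^\mu = 0, \qquad J^\mu = (c\rho,\ \mathbf{J})

% Covariant Maxwell equations: inhomogeneous pair (sources) and homogeneous pair,
% the latter written with the dual tensor
% \tilde{F}^{\mu\nu} = \tfrac{1}{2}\epsilon^{\mu\nu\alpha\beta} F_{\alpha\beta}
\partial_\mu F^{\mu\nu} = \mu_0 J^\nu, \qquad \partial_\mu \tilde{F}^{\mu\nu} = 0
```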
When Proofs Reflect More on Assumptions than Conclusions
ERIC Educational Resources Information Center
Dawkins, Paul Christian
2014-01-01
This paper demonstrates how questions of "provability" can help students engaged in reinvention of mathematical theory to understand the axiomatic game. While proof demonstrates how conclusions follow from assumptions, "provability" characterizes the dual relation that assumptions are "justified" when they afford…
Constructing a consumption model of fine dining from the perspective of behavioral economics.
Hsu, Sheng-Hsun; Hsiao, Cheng-Fu; Tsai, Sang-Bing
2018-01-01
Numerous factors affect how people choose a fine dining restaurant, including food quality, service quality, food safety, and hedonic value. A conceptual framework for evaluating restaurant selection behavior has not yet been developed. This study surveyed 150 individuals with fine dining experience and proposed the use of mental accounting and axiomatic design to construct a consumer economic behavior model. Linear and logistic regressions were employed to determine model correlations and the probability of each factor affecting behavior. The most crucial factor was food quality, followed by service and dining motivation, particularly regarding family dining. Safe ingredients, high cooking standards, and menu innovation all increased the likelihood of consumers choosing fine dining restaurants.
Implementation of Complexity Analyzing Based on Additional Effect
NASA Astrophysics Data System (ADS)
Zhang, Peng; Li, Na; Liang, Yanhong; Liu, Fang
According to Complexity Theory, complexity arises in a system when a functional requirement is not satisfied. Several studies have applied Axiomatic Design to Complexity Theory; however, they focus on reducing complexity, and none offers a method for analyzing the complexity present in a system. This paper therefore puts forth a method of analyzing complexity that is intended to make up for this deficiency. To frame the method of analyzing complexity based on additional effect, two concepts are introduced: ideal effect and additional effect. The method combines Complexity Theory with the Theory of Inventive Problem Solving (TRIZ) and helps designers analyze complexity by using additional effects. A case study shows the application of the process.
The Failure of Progressive Paradigm Reversal
ERIC Educational Resources Information Center
Guthrie, Gerard
2017-01-01
The student-centred, progressive paradigm has not had sustained success in changing teacher-centred, formalistic practices in "developing" country classrooms. Does "Gestalt-switch" and paradigm reversal demonstrate that progressive theory has realigned with formalistic reality, or has it remained axiomatic in the research and…
Communication and Noncompliance: An Axiomatic Framework.
ERIC Educational Resources Information Center
Powers, William G.; Gonzales, M. Christina
Patient noncompliance with medical advice is of major concern to physicians. Although many do not consider compliance their responsibility, research studies indicate that physicians can control many of the variables influencing compliance. Physicians' verbal and nonverbal communication habits that convey directiveness, coldness, complexity, and…
Nonlocal Quantum Information Transfer Without Superluminal Signalling and Communication
NASA Astrophysics Data System (ADS)
Walleczek, Jan; Grössing, Gerhard
2016-09-01
It is a frequent assumption that—via superluminal information transfers—superluminal signals capable of enabling communication are necessarily exchanged in any quantum theory that posits hidden superluminal influences. However, does the presence of hidden superluminal influences automatically imply superluminal signalling and communication? The non-signalling theorem mediates the apparent conflict between quantum mechanics and the theory of special relativity. However, as a `no-go' theorem there exist two opposing interpretations of the non-signalling constraint: foundational and operational. Concerning Bell's theorem, we argue that Bell employed both interpretations, and that he finally adopted the operational position, which is often associated with ontological quantum theory, e.g., de Broglie-Bohm theory. This position we refer to as "effective non-signalling". By contrast, associated with orthodox quantum mechanics is the foundational position referred to here as "axiomatic non-signalling". In search of a decisive communication-theoretic criterion for differentiating between "axiomatic" and "effective" non-signalling, we employ the operational framework offered by Shannon's mathematical theory of communication, whereby we distinguish between Shannon signals and non-Shannon signals. We find that an effective non-signalling theorem represents two sub-theorems: (1) Non-transfer-control (NTC) theorem, and (2) Non-signification-control (NSC) theorem. Employing NTC and NSC theorems, we report that effective, instead of axiomatic, non-signalling is entirely sufficient for prohibiting nonlocal communication. Effective non-signalling prevents the instantaneous, i.e., superluminal, transfer of message-encoded information through the controlled use, by a sender-receiver pair, of informationally-correlated detection events, e.g., in EPR-type experiments. An effective non-signalling theorem allows for nonlocal quantum information transfer yet—at the same time—effectively denies superluminal signalling and communication.
Reproducing the nonlinear dynamic behavior of a structured beam with a generalized continuum model
NASA Astrophysics Data System (ADS)
Vila, J.; Fernández-Sáez, J.; Zaera, R.
2018-04-01
In this paper we study the coupled axial-transverse nonlinear vibrations of a class of one-dimensional structured solids by application of the so-called Inertia Gradient Nonlinear continuum model. To show the accuracy of this axiomatic model, previously proposed by the authors, its predictions are compared with numerical results from a previously defined finite discrete chain of lumped masses and springs, for several numbers of particles. A continualization of the discrete model equations based on Taylor series allowed us to set equivalent values of the mechanical properties in both the discrete and axiomatic continuum models. Contrary to the classical continuum model, the inertia gradient nonlinear continuum model used herein is able to capture scale effects, which arise for modes in which the wavelength is comparable to the characteristic distance of the structured solid. The main conclusion of the work is that the proposed generalized continuum model captures the scale effects in both linear and nonlinear regimes, reproducing the behavior of the 1D nonlinear discrete model adequately.
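As a rough companion to the abstract, the sketch below integrates a finite chain of lumped masses and springs with a weak cubic nonlinearity using velocity Verlet. All parameters (m, k, k3, the chain length, and the initial pulse) are illustrative assumptions and are not taken from the paper, whose discrete reference model and gradient continuum model are more elaborate.

```python
import numpy as np

def simulate_chain(n=50, m=1.0, k=1.0, k3=0.1, dt=1e-3, steps=5000):
    # Illustrative lumped mass-spring chain with fixed ends (u_0 = u_{n+1} = 0).
    u = np.zeros(n)          # displacements
    v = np.zeros(n)          # velocities
    u[n // 2] = 0.1          # small initial pulse in the middle of the chain

    def accel(u):
        up = np.pad(u, 1)                  # zero padding encodes fixed boundaries
        stretch_r = up[2:] - up[1:-1]      # extension of the spring to the right
        stretch_l = up[1:-1] - up[:-2]     # extension of the spring to the left
        # linear plus weak cubic spring force; the cubic term mimics nonlinearity
        f = k * (stretch_r - stretch_l) + k3 * (stretch_r**3 - stretch_l**3)
        return f / m

    a = accel(u)
    for _ in range(steps):                 # velocity Verlet time stepping
        u += v * dt + 0.5 * a * dt**2
        a_new = accel(u)
        v += 0.5 * (a + a_new) * dt
        a = a_new
    return u, v

if __name__ == "__main__":
    u, v = simulate_chain()
    print("max |u| after run:", np.abs(u).max())
```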
Gkigkitzis, Ioannis; Haranas, Ioannis; Austerlitz, Carlos
2015-01-01
This study contains a discussion on the connection between current mathematical and biological modeling systems in response to the main research need for the development of a new mathematical theory for the study of cell survival after medical treatment and cell biological behavior in general. This is a discussion of suggested future research directions and relations with interdisciplinary science. In an effort to establish the foundations for a possible framework that may be adopted to study and analyze the process of cell survival during treatment, we investigate the organic connection among an axiomatic system foundation, a predator-prey rate equation, and information theoretic signal processing. A new set theoretic approach is also introduced through the definition of cell survival units, indicating the use of "proper classes" according to the Zermelo-Fraenkel set theory and the axiom of choice, as the mathematics appropriate for the development of a biological theory of cell survival.
NASA Technical Reports Server (NTRS)
Pike, Lee
2005-01-01
I describe some inconsistencies in John Rushby's axiomatization of time-triggered algorithms that he presents in these transactions and that he formally specifies and verifies in a mechanical theorem-prover. I also present corrections for these inconsistencies.
NASA Technical Reports Server (NTRS)
Ryjov, Alexander P.
1992-01-01
A model that makes use of fuzzy linguistic scales (FLS) is considered in this report. The definition of FLS fuzziness and its major properties are given in the report. Definitions that are concerned with information loss and noise are also presented.
Cultivating Deductive Thinking with Angle Chasing
ERIC Educational Resources Information Center
Edwards, Michael todd; Quinlan, James; Harper, Suzanne R.; Cox, Dana C.; Phelps, Steve
2014-01-01
Despite Common Core State Standards for Mathematics (CCSSI 2010) recommendations, too often students' introduction to proof consists of the study of formal axiomatic systems--for example, triangle congruence proofs--typically in an introductory geometry course with no connection back to previous work in earlier algebra courses. Van Hiele…
Strategic Approaches to Practice: An Action Research Project
ERIC Educational Resources Information Center
Burwell, Kim; Shipton, Matthew
2013-01-01
The importance of personal practice for instrumentalists and vocalists is well established among researchers, and axiomatic for practitioners. This paper reports on a phase of an action research project, investigating student approaches to personal practice. Following a preliminary questionnaire study, a residential clinic was conducted by…
Calculus of Elementary Functions, Part I. Teacher's Commentary. Revised Edition.
ERIC Educational Resources Information Center
Herriot, Sarah T.; And Others
This course is intended for students who have a thorough knowledge of college preparatory mathematics including algebra, axiomatic geometry, trigonometry, and analytic geometry. It does not assume they have acquired a background of elementary functions. This teacher's guide contains background information, suggested instructional procedures, and…
Karayiannis, N B
2000-01-01
This paper presents the development and investigates the properties of ordered weighted learning vector quantization (LVQ) and clustering algorithms. These algorithms are developed by using gradient descent to minimize reformulation functions based on aggregation operators. An axiomatic approach provides conditions for selecting aggregation operators that lead to admissible reformulation functions. Minimization of admissible reformulation functions based on ordered weighted aggregation operators produces a family of soft LVQ and clustering algorithms, which includes fuzzy LVQ and clustering algorithms as special cases. The proposed LVQ and clustering algorithms are used to perform segmentation of magnetic resonance (MR) images of the brain. The diagnostic value of the segmented MR images provides the basis for evaluating a variety of ordered weighted LVQ and clustering algorithms.
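The sketch below is a generic soft-competition clustering update (softmax memberships followed by prototype re-estimation), shown only to illustrate the family of soft LVQ/clustering algorithms the abstract refers to. It is not the ordered weighted reformulation-function algorithm developed in the paper, and the data, prototype count, and temperature beta are invented for the example.

```python
import numpy as np

def soft_lvq(X, n_prototypes=3, beta=2.0, iters=100, seed=0):
    # Generic soft clustering/LVQ-style sketch: prototypes move toward
    # softmax-weighted means of the data (NOT the paper's algorithm).
    rng = np.random.default_rng(seed)
    V = X[rng.choice(len(X), n_prototypes, replace=False)].copy()  # init prototypes
    for _ in range(iters):
        d2 = ((X[:, None, :] - V[None, :, :]) ** 2).sum(-1)   # squared distances
        W = np.exp(-beta * d2)                                 # soft memberships
        W /= W.sum(axis=1, keepdims=True)                      # normalize per sample
        V = (W.T @ X) / W.sum(axis=0)[:, None]                 # weighted-mean update
    return V, W

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # three synthetic 2D clusters, purely for demonstration
    X = np.vstack([rng.normal(c, 0.3, size=(50, 2)) for c in (0.0, 2.0, 4.0)])
    V, W = soft_lvq(X)
    print("learned prototypes:\n", V)
```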
ERIC Educational Resources Information Center
Kaufmann, Matthew L.; Bomer, Megan A.; Powell, Nancy Norem
2009-01-01
Students enter the geometry classroom with a strong concept of fairness and a sense of what it means to "play by the rules," yet many students have difficulty understanding the postulates, or rules, of geometry and their implications. Although they may never have articulated the properties of an axiomatic system, they have gained a practical…
Teaching Activity-Based Taxicab Geometry
ERIC Educational Resources Information Center
Ada, Tuba
2013-01-01
This study focused on the process of teaching taxicab geometry, a non-Euclidean geometry that is easy to understand and similar to Euclidean geometry in its axiomatic structure. In this regard, several teaching activities were designed, such as measuring taxicab distance, defining a taxicab circle, finding a geometric locus in taxicab geometry, and…
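A minimal sketch of the two objects mentioned in the activities, taxicab distance and the taxicab circle; the code is illustrative and not taken from the study.

```python
def taxicab_distance(p, q):
    # distance along grid lines rather than "as the crow flies"
    return abs(p[0] - q[0]) + abs(p[1] - q[1])

def taxicab_circle(center, r):
    # lattice points at taxicab distance r from the center: a diamond (rotated square)
    cx, cy = center
    return [(cx + dx, cy + dy)
            for dx in range(-r, r + 1)
            for dy in range(-r, r + 1)
            if abs(dx) + abs(dy) == r]

if __name__ == "__main__":
    print(taxicab_distance((0, 0), (3, 4)))   # 7, versus the Euclidean distance 5
    print(taxicab_circle((0, 0), 2))          # 8 lattice points forming a diamond
```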
Supporting Academic Honesty in Online Courses
ERIC Educational Resources Information Center
McGee, Patricia
2013-01-01
Ensuring academic honesty is a challenge for traditional classrooms, but more so for online courses, where technology use is axiomatic to learning and instruction. With the Higher Education Opportunity Act of 2008 (HEOA) requirement that online course providers reduce opportunities to cheat and verify student identity, all involved with course…
Standards and Criteria. Paper #10 in Occasional Paper Series.
ERIC Educational Resources Information Center
Glass, Gene V.
The logical and psychological bases for setting cutting scores for criterion-referenced tests are examined; they are found to be intrinsically arbitrary and are often examples of misdirected precision and axiomatization. The term, criterion referenced, originally referred to a technique for making test scores meaningful by controlling the test…
Experimental Course Report/Grade Nine.
ERIC Educational Resources Information Center
Davis, Robert B.
Described is the development of an approach to the algebra of real numbers which includes three areas of mathematics not commonly found in grade 9--the theory of limits of infinite sequences, a frequent use of Cartesian co-ordinates, and algebra of matrices. Seventy per cent of the course is abstract axiomatic algebra and the remaining portion…
The Mathematical Event: Mapping the Axiomatic and the Problematic in School Mathematics
ERIC Educational Resources Information Center
de Freitas, Elizabeth
2013-01-01
Traditional philosophy of mathematics has been concerned with the nature of mathematical objects rather than events. This traditional focus on reified objects is reflected in dominant theories of learning mathematics whereby the learner is meant to acquire familiarity with ideal mathematical objects, such as number, polygon, or tangent. I argue…
From Concrete to Abstract: A Story of Passion, Proof and Pedagogy
ERIC Educational Resources Information Center
Lawton, Fiona
2011-01-01
The author states her belief that mathematics is a human construct based on axiomatic systems, and that these constructs are both personal and social. She argues that to succeed in mathematics, learners' personal constructs need to be aligned with formal globally agreed mathematical conventions. Put more simply, she informs her students that…
Reliability of a Longitudinal Sequence of Scale Ratings
ERIC Educational Resources Information Center
Laenen, Annouschka; Alonso, Ariel; Molenberghs, Geert; Vangeneugden, Tony
2009-01-01
Reliability captures the influence of error on a measurement and, in the classical setting, is defined as one minus the ratio of the error variance to the total variance. Laenen, Alonso, and Molenberghs ("Psychometrika" 73:443-448, 2007) proposed an axiomatic definition of reliability and introduced the R_T coefficient, a measure of…
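For reference, the classical definition of reliability stated in the first sentence can be written as below; the axiomatic R_T coefficient itself is defined in the cited Psychometrika paper and is not reproduced here.

```latex
% Classical reliability: one minus the ratio of error variance to total variance
R \;=\; 1 - \frac{\sigma^2_{\text{error}}}{\sigma^2_{\text{total}}}
  \;=\; \frac{\sigma^2_{\text{true}}}{\sigma^2_{\text{true}} + \sigma^2_{\text{error}}}
```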
The Media as Voyeur: What Is Our "Right to Know?"
ERIC Educational Resources Information Center
Sayer, James E.
Current print and broadcast journalism is moving away from a concept of journalism as "information people need to know" towards a notion of the "Right to Know": everything conceivable about everyone is newsworthy. It is axiomatic that a well-informed public is a better electorate. However, the First Amendment guarantee of…
Peirce and Rationalism: Is Peirce a Fully Semiotic Philosopher?
ERIC Educational Resources Information Center
Stables, Andrew
2014-01-01
While Peirce is a seminal figure for contemporary semiotic philosophers, it is axiomatic of a fully semiotic perspective that no philosopher or philosophy (semiotics included) can provide any final answer, as signs are always interpreted and the context of interpretation always varies. Semiosis is evolutionary: it may or may not be construed as…
The Mechanism of Impact of Summative Assessment on Medical Students' Learning
ERIC Educational Resources Information Center
Cilliers, Francois J.; Schuwirth, Lambert W.; Adendorff, Hanelie J.; Herman, Nicoline; van der Vleuten, Cees P.
2010-01-01
It has become axiomatic that assessment impacts powerfully on student learning, but there is a surprising dearth of research on how. This study explored the mechanism of impact of summative assessment on the process of learning of theory in higher education. Individual, in-depth interviews were conducted with medical students and analyzed…
A Model of the Pre-Assessment Learning Effects of Summative Assessment in Medical Education
ERIC Educational Resources Information Center
Cilliers, Francois J.; Schuwirth, Lambert W. T.; Herman, Nicoline; Adendorff, Hanelie J.; van der Vleuten, Cees P. M.
2012-01-01
It has become axiomatic that assessment impacts powerfully on student learning. However, surprisingly little research has been published emanating from authentic higher education settings about the nature and mechanism of the pre-assessment learning effects of summative assessment. Less still emanates from health sciences education settings. This…
Investigating Image-Based Perception and Reasoning in Geometry
ERIC Educational Resources Information Center
Campbell, Stephen R.; Handscomb, Kerry; Zaparyniuk, Nicholas E.; Sha, Li; Cimen, O. Arda; Shipulina, Olga V.
2009-01-01
Geometry is required for many secondary school students, and is often learned, taught, and assessed more in a heuristic image-based manner than as a formal axiomatic deductive system. Students are required to prove general theorems, but diagrams are usually used. It follows that understanding how students engage in perceiving and reasoning about…
Developing Learning Communities: Using Communities of Practice within Community Psychology
ERIC Educational Resources Information Center
Lawthom, Rebecca
2011-01-01
The idea that communities need to be inclusive is almost axiomatic. The process whereby community members engage in inclusive practices is far less understood. Similarly, UK universities are being encouraged to include the wider community and extend campus boundaries. Here, I suggest a particular theoretical lens which sheds light on engagement…
Function-Based Interventions for Children with Challenging Behavior
ERIC Educational Resources Information Center
Dunlap, Glen; Fox, Lise
2011-01-01
It is now axiomatic that challenging behaviors are defined more profitably by their functions (their motivations) than by their topographies (what they look like). The notion that challenging behaviors can be defined on the basis of their function has led in the past 30 years to a dramatically reconfigured approach to assessment and intervention.…
Deleuze and the Queer Ethics of an Empirical Education
ERIC Educational Resources Information Center
Moran, Paul Andrew
2013-01-01
Axiomatic and problematic approaches to ontology are discussed, at first in relation to the work of Badiou and Deleuze in mathematics. This discussion is then broadened, focusing on problematics in Deleuze and Guattari's critiques of capitalism and psychoanalysis, which results in an analysis of the implications of this discussion for education.…
Calculus of Elementary Functions, Part II. Teacher's Commentary. Revised Edition.
ERIC Educational Resources Information Center
Herriot, Sarah T.; And Others
This course is intended for students who have a thorough knowledge of college preparatory mathematics, including algebra, axiomatic geometry, trigonometry, and analytic geometry. This teacher's guide is for Part II of the course. It is designed to follow Part I of the text. The guide contains background information, suggested instructional…
Calculus of Elementary Functions, Part I. Student Text. Revised Edition.
ERIC Educational Resources Information Center
Herriot, Sarah T.; And Others
This course is intended for students who have a thorough knowledge of college preparatory mathematics, including algebra, axiomatic geometry, trigonometry, and analytic geometry. This text, Part I, contains the first five chapters of the course and two appendices. Chapters included are: (1) Polynomial Functions; (2) The Derivative of a Polynomial…
Calculus of Elementary Functions, Part II. Student Text. Revised Edition.
ERIC Educational Resources Information Center
Herriot, Sarah T.; And Others
This course is intended for students who have a thorough knowledge of college preparatory mathematics, including algebra, axiomatic geometry, trigonometry, and analytic geometry. This text, Part II, contains material designed to follow Part I. Chapters included in this text are: (6) Derivatives of Exponential and Related Functions; (7) Area and…
DOT National Transportation Integrated Search
1994-12-01
It is now axiomatic that America's population is growing older. Primary indicators of this aging are the number of individuals age 65 years and over (which increased from about 26 million in 1980 to over 33 million by 1990) and the elderly perc...
ERIC Educational Resources Information Center
Cheshire, Daniel C.
2017-01-01
The introduction to general topology represents a challenging transition for students of advanced mathematics. It requires the generalization of their previous understanding of ideas from fields like geometry, linear algebra, and real or complex analysis to fit within a more abstract conceptual system. Students must adopt a new lexicon of…
Quantifying quantum coherence with quantum Fisher information.
Feng, X N; Wei, L F
2017-11-14
Quantum coherence is a long-standing but ever-important concept in quantum mechanics, and it has now been regarded as a necessary resource for quantum information processing and quantum metrology. However, the question of how to quantify quantum coherence has only recently received attention (see, e.g., Baumgratz et al., PRL 113, 140401 (2014)). In this paper we verify that the well-known quantum Fisher information (QFI) can be utilized to quantify quantum coherence, as it satisfies monotonicity under the typical incoherent operations and convexity under the mixing of quantum states. Differing from most of the purely axiomatic methods, quantifying quantum coherence by QFI could be experimentally testable, as the bound of the QFI is practically measurable. The validity of our proposal is specifically demonstrated with the typical phase-damping and depolarizing evolution processes of a generic single-qubit state, and also by comparing it with the other quantifying methods proposed previously.
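A small numerical sketch of the idea, under stated assumptions: it evaluates the standard spectral QFI formula for a single qubit with generator H = sigma_z/2 before and after a phase-damping channel. The Kraus parametrization of the channel and the choice of the |+> probe state are illustrative and not taken from the paper.

```python
import numpy as np

SZ = np.diag([1.0, -1.0])
H = SZ / 2.0   # generator of the phase rotation exp(-i * theta * H)

def qfi(rho, H):
    # Standard spectral QFI formula:
    # F_Q = 2 * sum_{i,j} (l_i - l_j)^2 / (l_i + l_j) * |<i|H|j>|^2
    vals, vecs = np.linalg.eigh(rho)
    F = 0.0
    for i in range(len(vals)):
        for j in range(len(vals)):
            s = vals[i] + vals[j]
            if s > 1e-12:
                hij = vecs[:, i].conj() @ H @ vecs[:, j]
                F += 2.0 * (vals[i] - vals[j]) ** 2 / s * abs(hij) ** 2
    return float(np.real(F))

def phase_damping(rho, gamma):
    # one common Kraus parametrization of the phase-damping channel
    K0 = np.array([[1.0, 0.0], [0.0, np.sqrt(1.0 - gamma)]])
    K1 = np.array([[0.0, 0.0], [0.0, np.sqrt(gamma)]])
    return K0 @ rho @ K0.conj().T + K1 @ rho @ K1.conj().T

if __name__ == "__main__":
    plus = np.array([1.0, 1.0]) / np.sqrt(2.0)   # maximally coherent |+> state
    rho = np.outer(plus, plus.conj())
    print("QFI of |+>:", qfi(rho, H))                             # 1.0 for H = sigma_z/2
    print("QFI after damping (gamma=0.5):", qfi(phase_damping(rho, 0.5), H))
```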
Historical Perspectives on the Crisis of the University
ERIC Educational Resources Information Center
Schapira, Michael
2014-01-01
The beginning of the 21st century has not been a particularly stable period for the university, at least if you trust the steady stream of books, articles, jeremiads and statements from public officials lamenting its fallen status and calling for bold reforms. Such a state of affairs has allowed critics and reformers alike to axiomatically evoke…
Quantum mechanics: why complex Hilbert space?
NASA Astrophysics Data System (ADS)
Cassinelli, G.; Lahti, P.
2017-10-01
We outline a programme for an axiomatic reconstruction of quantum mechanics based on the statistical duality of states and effects that combines the use of a theorem of Solér with the idea of symmetry. We also discuss arguments favouring the choice of the complex field. This article is part of the themed issue `Second quantum revolution: foundational questions'.
The Non-Linear Nature of Information and its Implications for Advanced Technology Forces
1998-05-18
anticipated tremendous benefits from the growth of information-based technology. It is now axiomatic that the ability to achieve information dominance against...the commercial world are mixed. To achieve the information dominance anticipated through advances in technology, military decision makers must understand and accommodate the non-linear nature of the information systems they employ.
ERIC Educational Resources Information Center
Johnson, Bonnie McD.; Leck, Glorianne M.
The philosophical proposition axiomatic in all gender difference research is examined in this paper. Research on gender differences is that which attempts to describe categorical differences between males and females, based on a designated potential for sexual reproduction. The methodological problems raised by this assumption include the…
An Axiomatic Theory of Cognition and Writing.
ERIC Educational Resources Information Center
Grunig, James E.; And Others
Noting that although a great deal of empirical research has been done to investigate the writing rules commonly taught, this paper points out that no one has yet constructed a deep theory of the relationship between cognition and writing that confirms the writing rules and explains how they work. The paper then uses theories and research in the…
School on Cloud: Transforming Education
ERIC Educational Resources Information Center
Koutsopoulos, Kostis C.; Papoutsis, Panos
2016-01-01
Nowadays, to deal appropriately with teaching and learning, there is an axiomatic need to accept an integrated-holistic approach, both in terms of the way we regard education and of how we practice it. This leads to a two-pronged position: first, that education constitutes a dialectic entity, and second, that approaches to education presently in…
What Difference Does a More In-Depth Programme Make to Learning?
ERIC Educational Resources Information Center
Reiss, Athene
2015-01-01
It is virtually axiomatic that a more extended learning experience will have more impact than a one-off experience. But how much difference does it make and is the extended time commitment justified by the results? The Berkshire, Buckinghamshire and Oxfordshire Wildlife Trust (BBOWT) conducted some research to explore this question with regard to…
ERIC Educational Resources Information Center
Goldberg, Adele; Suppes, Patrick
An interactive computer-assisted system for teaching elementary logic is described, which was designed to handle formalizations of first-order theories suitable for presentation in a computer-assisted instruction environment. The system provides tools with which the user can develop and then study a nonlogical axiomatic theory along whatever lines…
Charting an Alternate Pathway to Reaction Orders and Rate Laws in Introductory Chemistry Courses
ERIC Educational Resources Information Center
Rushton, Gregory T.; Criswell, Brett A.; McAllister, Nicole D.; Polizzi, Samuel J.; Moore, Lamesha A.; Pierre, Michelle S.
2014-01-01
Reaction kinetics is an axiomatic topic in chemistry that is often addressed as early as the high school course and serves as the foundation for more sophisticated conversations in college-level organic, physical, and biological chemistry courses. Despite the fundamental nature of reaction kinetics, students can struggle with transforming their…
Fundaments of plant cybernetics.
Zucconi, F
2001-01-01
A systemic approach is proposed for analyzing plants' physiological organization and cybernesis. To this end, the plant is inspected as a system, starting from the integration of crown and root systems, and its impact on a number of basic epigenetic events. The approach proves to be axiomatic and facilitates the definition of the principles behind the plant's autonomous control of growth and reproduction.
NASA Astrophysics Data System (ADS)
Cable, John
2014-01-01
This article offers a new interpretation of Piaget's decanting experiments, employing the mathematical notion of equivalence instead of conservation. Some reference is made to Piaget's theories and to his educational legacy, but the focus is on certain of the experiments. The key to the new analysis is the abstraction principle, which has been formally enunciated in mathematical philosophy but has universal application. It becomes necessary to identify fluid objects (both configured and unconfigured) and configured and unconfigured sets-of-objects. Issues emerge regarding the conflict between philosophic realism and anti-realism, including constructivism. Questions are asked concerning mathematics and mathematical philosophy, particularly over the nature of sets, the wisdom of the axiomatic method and aspects of the abstraction principle itself.
Quantum mechanics: why complex Hilbert space?
Cassinelli, G; Lahti, P
2017-11-13
We outline a programme for an axiomatic reconstruction of quantum mechanics based on the statistical duality of states and effects that combines the use of a theorem of Solér with the idea of symmetry. We also discuss arguments favouring the choice of the complex field. This article is part of the themed issue 'Second quantum revolution: foundational questions'. © 2017 The Author(s).
ERIC Educational Resources Information Center
Tapan, Menekse Seden; Arslan, Cigdem
2009-01-01
The main purpose of this research is to determine to what extent preservice teachers use visual elements and mathematical properties when they are dealing with a geometrical construction activity. The axiomatic structure of the Euclidian geometry forms a coherent field of objects and relations of a theoretical nature; and thus it constitutes a…
New Groups Give Teachers Alternative Voice: Organizations Help Educators Cut Policy Teeth
ERIC Educational Resources Information Center
Sawchuk, Steven
2012-01-01
In times of great uncertainty for U.S. teachers, who speaks for them? The question is almost axiomatic in its simplicity, but the answer is far less clear-cut. The teachers' unions remain the most visible, powerful, and probably the most important advocates for teachers. But over the past few years, a number of new efforts have sprung up…
ERIC Educational Resources Information Center
Bahrami, Bahador; Olsen, Karsten; Bang, Dan; Roepstorff, Andreas; Rees, Geraint; Frith, Chris
2012-01-01
That objective reference is necessary for formation of reliable beliefs about the external world is almost axiomatic. However, Condorcet (1785) suggested that purely subjective information--if shared and combined via social interaction--is enough for accurate understanding of the external world. We asked if social interaction and objective…
The Uplands after Neoliberalism?--The Role of the Small Farm in Rural Sustainability
ERIC Educational Resources Information Center
Shucksmith, Mark; Ronningen, Katrina
2011-01-01
The modernist project foresaw no role for small farms, but this can no longer be regarded as axiomatic as neoliberalism enters what Peck et al. call its "zombie phase". This paper asks what contribution small farms in the uplands can make to societies' goals, what role they might play in the sustainability of rural communities in such…
NASA Astrophysics Data System (ADS)
Mardi Safitri, Dian; Arfi Nabila, Zahra; Azmi, Nora
2018-03-01
Musculoskeletal disorders (MSDs) are among the ergonomic risks arising from manual activity, non-neutral posture and repetitive motion. The purpose of this study is to measure risk and implement ergonomic interventions to reduce the risk of MSDs at the paper pallet assembly work station. Work posture was measured using the Ovako Working Posture Analysis System (OWAS) and the Rapid Entire Body Assessment (REBA) method, while work repetitiveness was measured using the Strain Index (SI) method. The assembly process operators were identified as having the highest risk level: the OWAS score, Strain Index, and REBA values are 4, 20.25, and 11. Ergonomic improvements are needed to reduce that level of risk. Proposed improvements were developed using the Quality Function Deployment (QFD) method applied with the Axiomatic House of Quality (AHOQ) and a Morphological Chart. As a result, the risk levels based on the OWAS and REBA scores decreased from 4 and 11 to 1 and 2, respectively. A biomechanical analysis of the operator also shows decreased values for the L4-L5 moment, compression, joint shear, and joint moment strength.
Thermal response properties of protective clothing fabrics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baitinger, W.F.
1995-12-31
In the industrial workplace, it becomes increasingly incumbent upon employers to require employees to use suitable protective equipment and to wear protective apparel. When workers may be subjected to accidental radiant, flame, or electric arc heat sources, work clothing should be used that does not become involved in burning. It is axiomatic that work clothing should not become a primary fuel source, adding to the level of heat exposure, since clothing is usually in intimate contact with the skin. Further, clothing should provide sufficient insulation to protect the skin from severe burn injury. If the worker receives such protection from clothing, action then may be taken to escape the confronted thermal hazard. Published laboratory test methods are used to measure flame resistance and thermal responses of flame resistant fabrics in protective clothing. The purpose of this article is to review these test methods, to discuss certain limitations in application, and to suggest how flame resistant cotton fabrics may be used to enhance worker safety.
Decidability of formal theories and hyperincursivity theory
NASA Astrophysics Data System (ADS)
Grappone, Arturo G.
2000-05-01
This paper shows the limits of the Proof Standard Theory (briefly, PST) and gives some ideas of how to build a proof anticipatory theory (briefly, PAT) that has no such limits. Also, this paper considers that Gödel's proof of the undecidability of Principia Mathematica formal theory is not valid for axiomatic theories that use a PAT to build their proofs because the (hyper)incursive functions are self-representable.
ERIC Educational Resources Information Center
Tschoumy, Jacques-Andre
This document examines the trend of school partnership both inside and outside the educational system. The report asks three questions: what is motivating European partners?; is the phenomenon of partnership really European?; and is this the end of the school of Jules Ferry? School partnership history, strategy, and axiomatics or rules are…
Can Defense Spending Be Justified during a Period of Continual Peace?
1991-06-07
although this was clearly unsatisfactory from a strictly theoretical perspective. Revealed Preference is a technique used to explain consumer behavior ...insurgencies therefore a case of irrational behavior? In behavioral sciences, it is usually tempting to assume away deviations from the prediction of a model...as irrational behavior or an inadequacy of the model. Rationality is axiomatic. All nation-states always act according to what they perceive (as
Plastino, A; Rocca, M C
2017-06-01
Appealing to the 1902 Gibbs formalism for classical statistical mechanics (SM), the first axiomatic SM theory ever to successfully explain equilibrium thermodynamics, we show that already at the classical level there is a strong correlation between Renyi's exponent α and the number of particles for very simple systems. No reference to heat baths is needed for such a purpose.
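For notation only (not taken from the paper), Renyi's entropy with exponent α and its Gibbs-Shannon limit are:

```latex
% Renyi entropy with exponent \alpha; the \alpha \to 1 limit recovers the
% Gibbs--Shannon form used in the 1902 formalism
S_\alpha \;=\; \frac{1}{1-\alpha}\,\ln \sum_i p_i^{\alpha},
\qquad \lim_{\alpha \to 1} S_\alpha \;=\; -\sum_i p_i \ln p_i
```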
Clinical and Cultural Perspectives on Mental Illness in the U.S. Navy.
1987-09-09
psychiatric problems. In the latter instance, an individual may report for another medical condition (e.g., an accidental injury, somatic complaints) and be...to the somatic or physical symptoms which are recognized either by the patient, his supervisor, or his command’s medical officer. Self-referrals...mental patients presented a stereotyped clinical syndrome in which hypochondriasis and paranoia were prominent. It was equally axiomatic that
Not all (possibly) “random” sequences are created equal
Pincus, Steve; Kalman, Rudolf E.
1997-01-01
The need to assess the randomness of a single sequence, especially a finite sequence, is ubiquitous, yet is unaddressed by axiomatic probability theory. Here, we assess randomness via approximate entropy (ApEn), a computable measure of sequential irregularity, applicable to single sequences of both (even very short) finite and infinite length. We indicate the novelty and facility of the multidimensional viewpoint taken by ApEn, in contrast to classical measures. Furthermore and notably, for finite length, finite state sequences, one can identify maximally irregular sequences, and then apply ApEn to quantify the extent to which given sequences differ from maximal irregularity, via a set of deficit (defm) functions. The utility of these defm functions, which we show allows one to considerably refine the notions of probabilistic independence and normality, is featured in several studies, including (i) digits of e, π, √2, and √3, both in base 2 and in base 10, and (ii) sequences given by fractional parts of multiples of irrationals. We prove companion analytic results, which also feature in a discussion of the role and validity of the almost sure properties from axiomatic probability theory insofar as they apply to specified sequences and sets of sequences (in the physical world). We conclude by relating the present results and perspective to both previous and subsequent studies. PMID:11038612
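A textbook-style sketch of ApEn itself (template matching with Chebyshev distance), shown for orientation; the parameters m and r and the test sequences are illustrative, and the defm deficit functions of the paper are not implemented here.

```python
import numpy as np

def apen(u, m=2, r=0.2):
    # Approximate entropy ApEn(m, r): difference of the log-average template
    # match rates at embedding dimensions m and m + 1.
    u = np.asarray(u, dtype=float)
    N = len(u)

    def phi(m):
        # all overlapping templates of length m
        x = np.array([u[i:i + m] for i in range(N - m + 1)])
        # Chebyshev (max) distance between every pair of templates
        d = np.max(np.abs(x[:, None, :] - x[None, :, :]), axis=2)
        C = (d <= r).mean(axis=1)   # fraction of templates within tolerance r
        return np.mean(np.log(C))

    return phi(m) - phi(m + 1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    regular = np.tile([0.0, 1.0], 50)              # perfectly alternating sequence
    irregular = rng.integers(0, 2, 100).astype(float)
    print("ApEn regular  :", apen(regular))        # near 0
    print("ApEn irregular:", apen(irregular))      # noticeably larger
```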
Stochastic Gravity: Theory and Applications.
Hu, Bei Lok; Verdaguer, Enric
2004-01-01
Whereas semiclassical gravity is based on the semiclassical Einstein equation with sources given by the expectation value of the stress-energy tensor of quantum fields, stochastic semiclassical gravity is based on the Einstein-Langevin equation, which has in addition sources due to the noise kernel. The noise kernel is the vacuum expectation value of the (operator-valued) stress-energy bi-tensor which describes the fluctuations of quantum matter fields in curved spacetimes. In the first part, we describe the fundamentals of this new theory via two approaches: the axiomatic and the functional. The axiomatic approach is useful to see the structure of the theory from the framework of semiclassical gravity, showing the link from the mean value of the stress-energy tensor to its correlation functions. The functional approach uses the Feynman-Vernon influence functional and the Schwinger-Keldysh closed-time-path effective action methods which are convenient for computations. It also brings out the open systems concepts and the statistical and stochastic contents of the theory such as dissipation, fluctuations, noise, and decoherence. We then focus on the properties of the stress-energy bi-tensor. We obtain a general expression for the noise kernel of a quantum field defined at two distinct points in an arbitrary curved spacetime as products of covariant derivatives of the quantum field's Green function. In the second part, we describe three applications of stochastic gravity theory. First, we consider metric perturbations in a Minkowski spacetime. We offer an analytical solution of the Einstein-Langevin equation and compute the two-point correlation functions for the linearized Einstein tensor and for the metric perturbations. Second, we discuss structure formation from the stochastic gravity viewpoint, which can go beyond the standard treatment by incorporating the full quantum effect of the inflaton fluctuations. Third, we discuss the backreaction of Hawking radiation in the gravitational background of a quasi-static black hole (enclosed in a box). We derive a fluctuation-dissipation relation between the fluctuations in the radiation and the dissipative dynamics of metric fluctuations.
1978-09-01
which, it seems to us, can be naturally interpreted as directly supporting this contention is reported in Hochstein and Shapley (1976a, b), Levick ...are certainly positive aspects in many of these methodologies and, in particular, in what they are trying to obtain. To be effective, however, a...statement about control structures. Thus if y = A(x); we can say Where (y,x) are INTEGERS, A is a constant FUNCTION; in addition, if z = C(b); we
On axiomatizations of the Shapley value for bi-cooperative games
NASA Astrophysics Data System (ADS)
Meirong, Wu; Shaochen, Cao; Huazhen, Zhu
2016-06-01
In bi-cooperative games, which can depict real life more accurately, there are three decisions available to each participant. This paper studies the Shapley value of bi-cooperative games and completes its unique characterization. Axioms similar to those of classical cooperative games can be used to characterize the Shapley value of bi-cooperative games as well. Meanwhile, it introduces a structural axiom and a zero-excluded axiom in place of the effective axiom of classical cooperative games.
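As a reference point only, the sketch below computes the classical Shapley value, in which each player is either in or out of a coalition; the bi-cooperative value characterized in the paper generalizes this to three roles per player and uses different axioms. The glove game used in the demonstration is a standard toy example, not from the paper.

```python
from itertools import combinations
from math import factorial

def shapley_value(players, v):
    # Classical Shapley value: weighted average of marginal contributions
    # over all coalitions not containing player i.
    n = len(players)
    phi = {}
    for i in players:
        others = [p for p in players if p != i]
        total = 0.0
        for k in range(len(others) + 1):
            for S in combinations(others, k):
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                total += weight * (v(frozenset(S) | {i}) - v(frozenset(S)))
        phi[i] = total
    return phi

if __name__ == "__main__":
    # toy glove game: player 1 owns a left glove, players 2 and 3 own right gloves
    def v(S):
        return 1.0 if 1 in S and (2 in S or 3 in S) else 0.0
    print(shapley_value([1, 2, 3], v))   # {1: ~0.667, 2: ~0.167, 3: ~0.167}
```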
NASA Astrophysics Data System (ADS)
Heinert, G.; Mondorf, W.
1982-11-01
High speed image processing was used to analyse morphologic and metabolic characteristics of clinically relevant kidney tissue alterations. Qualitative computer-assisted histophotometry was performed to measure alterations in levels of the enzymes alkaline phosphatase (AP), alanine aminopeptidase (AAP), γ-glutamyltranspeptidase (GGTP) and β-glucuronidase (β-Gl), with AAP and GGTP immunologically determined in prepared renal and cancer tissue sections. A "Micro-Videomat 2" image analysis system with a "Tessovar" macroscope, a computer-assisted "Axiomat" photomicroscope and an "Interactive Image Analysis System (IBAS)" were employed for analysing changes in enzyme activities determined by changes in absorbance or transmission. Diseased kidney as well as renal neoplastic tissues could be distinguished by significantly (Wilcoxon test, p<0.05) decreased enzyme concentrations as compared to those found in normal human kidney tissues. This image analysis technique might be of potential use in the diagnostic and prognostic evaluation of renal cancer and diseased kidney tissues.
A two-stage DEA approach for environmental efficiency measurement.
Song, Malin; Wang, Shuhong; Liu, Wei
2014-05-01
The slacks-based measure (SBM) model based on constant returns to scale has achieved good results in addressing undesirable outputs, such as waste water and waste gas, in measuring environmental efficiency. However, the traditional SBM model cannot deal with the scenario in which desirable outputs are constant. Based on the axiomatic theory of productivity, this paper carries out systematic research on the SBM model considering undesirable outputs, and further expands the SBM model from the perspective of network analysis. The new model can not only perform efficiency evaluation considering undesirable outputs, but also calculate desirable and undesirable outputs separately. The latter advantage successfully solves the "dependence" problem of outputs, that is, that we cannot increase the desirable outputs without producing any undesirable outputs. The following illustration shows that the efficiency values obtained by the two-stage approach are smaller than those obtained by the traditional SBM model. Our approach provides a more profound analysis of how to improve the environmental efficiency of decision making units.
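For orientation, the sketch below solves the plain input-oriented CCR DEA envelopment problem with scipy; it is deliberately simpler than the paper's slacks-based and two-stage network models and does not handle undesirable outputs. The four DMUs and their input/output data are invented for the example.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    # Input-oriented CCR model for DMU o:
    #   min theta  s.t.  sum_j lam_j x_ij <= theta * x_io,  sum_j lam_j y_rj >= y_ro,  lam >= 0
    # X is (n_inputs, n_dmus); Y is (n_outputs, n_dmus). Decision vector: [theta, lam_1..lam_n].
    n_inputs, n_dmus = X.shape
    n_outputs = Y.shape[0]
    c = np.zeros(1 + n_dmus)
    c[0] = 1.0                                          # minimize theta
    A_ub, b_ub = [], []
    for i in range(n_inputs):                           # input constraints
        A_ub.append(np.concatenate(([-X[i, o]], X[i, :])))
        b_ub.append(0.0)
    for r in range(n_outputs):                          # output constraints (>= flipped to <=)
        A_ub.append(np.concatenate(([0.0], -Y[r, :])))
        b_ub.append(-Y[r, o])
    bounds = [(0, None)] * (1 + n_dmus)
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub), bounds=bounds,
                  method="highs")
    return res.x[0]

if __name__ == "__main__":
    # 4 hypothetical DMUs with 2 inputs and 1 (normalized) output each
    X = np.array([[2.0, 4.0, 3.0, 5.0],
                  [3.0, 2.0, 5.0, 4.0]])
    Y = np.array([[1.0, 1.0, 1.0, 1.0]])
    for o in range(X.shape[1]):
        print(f"DMU {o}: efficiency = {ccr_efficiency(X, Y, o):.3f}")
```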
NASA Astrophysics Data System (ADS)
Croon, Djuna; Sanz, Verónica; Setford, Jack
2015-10-01
Identifying the inflaton with a pseudo-Goldstone boson explains the flatness of its potential. Successful Goldstone Inflation should also be robust against UV corrections, such as from quantum gravity: in the language of the effective field theory this implies that all scales are sub-Planckian. In this paper we present scenarios which realise both requirements by examining the structure of Goldstone potentials arising from Coleman-Weinberg contributions. We focus on single-field models, for which we notice that both bosonic and fermionic contributions are required and that spinorial fermion representations can generate the right potential shape. We then evaluate the constraints on non-Gaussianity from higher-derivative interactions, finding that axiomatic constraints on Goldstone boson scattering prevail over the current CMB measurements. The fit to CMB data can be connected to the UV completions for Goldstone Inflation, finding relations in the spectrum of new resonances. Finally, we show how hybrid inflation can be realised in the same context, where both the inflaton and the waterfall fields share a common origin as Goldstones.
Quantum field theory on curved spacetimes: Axiomatic framework and examples
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fredenhagen, Klaus; Rejzner, Kasia
In this review article, we present a systematic development of quantum field theory on curved spacetimes. The leading principle is the emphasis on local properties. It turns out that this requires a reformulation of the QFT framework which also yields a new perspective for the theories on Minkowski space. The aim of the present work is to provide an almost self-contained introduction into the framework, which should be accessible for both mathematical physicists and mathematicians.
A Systematic Review of Quantitative Resilience Measures for Water Infrastructure Systems
Shin, Sangmin; Lee, Seungyub; Judi, David; ...
2018-02-07
Over the past few decades, the concept of resilience has emerged as an important consideration in the planning and management of water infrastructure systems. Accordingly, various resilience measures have been developed for the quantitative evaluation and decision-making of systems. There are, however, numerous considerations and no clear choice of which measure, if any, provides the most appropriate representation of resilience for a given application. This study provides a critical review of quantitative approaches to measure the resilience of water infrastructure systems, with a focus on water resources and distribution systems. A compilation of 11 criteria evaluating 21 selected resilience measures addressing major features of resilience is developed using the Axiomatic Design process. Existing gaps of resilience measures are identified based on the review criteria. The results show that resilience measures have generally paid less attention to cascading damage to interrelated systems, rapid identification of failure, physical damage of system components, and time variation of resilience. Concluding the paper, improvements to resilience measures are recommended. The findings contribute to our understanding of gaps and provide information to help further improve resilience measures of water infrastructure systems.
NASA Technical Reports Server (NTRS)
Kim, Hakil; Swain, Philip H.
1990-01-01
An axiomatic approach to interval-valued (IV) probabilities is presented, in which an IV probability is defined by a pair of set-theoretic functions that satisfy some pre-specified axioms. On the basis of this approach, the representation of statistical evidence and the combination of multiple bodies of evidence are emphasized. Although IV probabilities provide an innovative means for the representation and combination of evidential information, they make the decision process rather complicated and entail more intelligent strategies for making decisions. The development of decision rules over IV probabilities is discussed from the viewpoint of statistical pattern recognition. The proposed method, the so-called evidential reasoning method, is applied to the ground-cover classification of a multisource data set consisting of Multispectral Scanner (MSS) data, Synthetic Aperture Radar (SAR) data, and digital terrain data such as elevation, slope, and aspect. By treating the data sources separately, the method is able to capture both parametric and nonparametric information and to combine them. The method is then applied to two separate cases of classifying multiband data obtained by a single sensor. In each case a set of multiple sources is obtained by dividing the dimensionally huge data into smaller and more manageable pieces based on the global statistical correlation information. By a divide-and-combine process, the method is able to utilize more features than the conventional maximum likelihood method.
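As a familiar point of comparison (not the paper's IV-probability calculus), the sketch below combines two bodies of evidence with Dempster's rule over frozenset focal elements; the three-class land-cover frame and the mass assignments are invented for illustration.

```python
from itertools import product

def dempster_combine(m1, m2):
    # Dempster's rule: multiply masses of intersecting focal elements and
    # renormalize by the total non-conflicting mass.
    combined, conflict = {}, 0.0
    for (A, a), (B, b) in product(m1.items(), m2.items()):
        inter = A & B
        if inter:
            combined[inter] = combined.get(inter, 0.0) + a * b
        else:
            conflict += a * b
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    return {A: mass / (1.0 - conflict) for A, mass in combined.items()}

if __name__ == "__main__":
    frame = frozenset({"forest", "water", "urban"})
    m_optical = {frozenset({"forest"}): 0.6, frozenset({"forest", "urban"}): 0.3, frame: 0.1}
    m_radar   = {frozenset({"forest"}): 0.4, frozenset({"water"}): 0.3, frame: 0.3}
    for focal, mass in dempster_combine(m_optical, m_radar).items():
        print(sorted(focal), round(mass, 3))
```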
NASA Technical Reports Server (NTRS)
Ditto, Thomas
2017-01-01
This Report is not the latest word on an old idea but the first word on a new one. The new idea reverses the old one, the axiom that the best primary objective for an astronomical telescope exhibits the least chromatic aberration. That axiomatic distinction goes back to a young Isaac Newton who knew from experiments with prisms and mirrors in the 1660's that magnification with a reflection primary was completely free of the dispersion he saw with refraction. The superiority of reflection primary objectives for eyeball or photographic viewing is now considered obvious. It was this piece of wisdom on achromatic primary objectives that led to the dominance of the parabolic mirror as the means to collect star light. Newton was aware of the problem when he introduced his telescope to the scientific world in 1670. Actually, Newton's design innovation was in a secondary mirror, a plane mirror far more easily fabricated than Gregory's embodiment of 1663 which required two curved mirrors.
Some trends and proposals for the inclusion of sustainability in the design of manufacturing process
NASA Astrophysics Data System (ADS)
Fradinho, J.; Nedelcu, D.; Gabriel-Santos, A.; Gonçalves-Coelho, A.; Mourão, A.
2015-11-01
Production processes are designed to meet requirements of three different natures: quality, cost and time. Environmental concerns have expanded the field of conceptual design through the introduction of sustainability requirements that are driven by the growing societal thoughtfulness about environmental issues. One could say that the major concern has been the definition of metrics or indices for sustainability. However, those metrics usually lack consistency. More than ever, there is a need for an all-inclusive view at any level of decision-making, from the establishing of the design requirements to the implementation of the solutions. According to the Axiomatic Design Theory, sustainable designs are usually coupled designs, which should be avoided. This raises a concern related to the very nature of sustainability: the cross effects between the actions that should be considered in the attempt to decouple the design solutions. In terms of production, one should clarify the characterization of the sustainability of production systems. The objectives of this paper are: i) to analyse some trends for approaching the sustainability of production processes; ii) to define sustainability in terms of requirements for the design of production processes; iii) to make some proposals based on the Axiomatic Design Theory, in order to establish the principles with which the guidelines for designing production processes must comply; iv) to discuss how to introduce this matter in teaching both manufacturing technology and the design of production systems.
Quantum cellular automata and free quantum field theory
NASA Astrophysics Data System (ADS)
D'Ariano, Giacomo Mauro; Perinotti, Paolo
2017-02-01
In a series of recent papers [1-4] it has been shown how free quantum field theory can be derived without using mechanical primitives (including space-time, special relativity, quantization rules, etc.), but only by considering the simplest quantum algorithm encompassing a countable set of quantum systems whose network of interactions satisfies the simple principles of unitarity, homogeneity, locality, and isotropy. This has opened the route to extending the axiomatic information-theoretic derivation of the quantum theory of abstract systems [5, 6] to include quantum field theory. The inherent discrete nature of the informational axiomatization leads to an extension of quantum field theory to a quantum cellular automata theory, where the usual field theory is recovered in a regime where the discrete structure of the automata cannot be probed. A simple heuristic argument sets the scale of discreteness to the Planck scale, and the customary physical regime where discreteness is not visible is the relativistic one of small wavevectors. In this paper we provide a thorough derivation from principles showing that in the most general case the graph of the quantum cellular automaton is the Cayley graph of a finitely presented group, and we show how, for the case corresponding to a Euclidean emergent space (where the group reduces to an Abelian one), the automaton leads to Weyl, Dirac and Maxwell field dynamics in the relativistic limit. We conclude with some perspectives towards the more general scenario of non-linear automata for interacting quantum field theory.
Testing First-Order Logic Axioms in AutoCert
NASA Technical Reports Server (NTRS)
Ahn, Ki Yung; Denney, Ewen
2009-01-01
AutoCert [2] is a formal verification tool for machine generated code in safety critical domains, such as aerospace control code generated from MathWorks Real-Time Workshop. AutoCert uses Automated Theorem Provers (ATPs) [5] based on First-Order Logic (FOL) to formally verify safety and functional correctness properties of the code. These ATPs try to build proofs based on user-provided domain-specific axioms, which can be arbitrary First-Order Formulas (FOFs). These axioms are the most crucial part of the trusted base, since proofs can be submitted to a proof checker, removing the need to trust the prover, and AutoCert itself plays the part of checking the code generator. However, formulating axioms correctly (i.e. precisely as the user really intended) is non-trivial in practice. The challenge of axiomatization arises along several dimensions. First, the domain knowledge has its own complexity. AutoCert has been used to verify mathematical requirements on navigation software that carries out various geometric coordinate transformations involving matrices and quaternions. Axiomatic theories for such constructs are complex enough that mistakes are not uncommon. Second, adjusting axioms for ATPs can add even more complexity: the axioms frequently need to be modified in order to have them in a form suitable for use with ATPs, and such modifications tend to obscure the axioms further. Third, assessing the validity of the axioms from the output of existing ATPs is very hard, since theorem provers typically do not give any examples or counterexamples.
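As an informal illustration of why examples and counterexamples help when validating such axioms, the sketch below random-tests one candidate property of quaternion rotations (norm preservation); the quaternion routines are generic and are not taken from AutoCert.

```python
import numpy as np

rng = np.random.default_rng(0)

def quat_mul(a, b):
    """Hamilton product of quaternions given as (w, x, y, z) arrays."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    ])

def rotate(q, v):
    """Rotate 3-vector v by unit quaternion q via q * (0, v) * conj(q)."""
    q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])
    return quat_mul(quat_mul(q, np.concatenate(([0.0], v))), q_conj)[1:]

# Candidate "axiom": rotation by a unit quaternion preserves the Euclidean norm.
for _ in range(1000):
    q = rng.normal(size=4)
    q /= np.linalg.norm(q)
    v = rng.normal(size=3)
    assert abs(np.linalg.norm(rotate(q, v)) - np.linalg.norm(v)) < 1e-9
print("no counterexample found in 1000 random trials")
```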
Einstein's First Steps Toward General Relativity: Gedanken Experiments and Axiomatics
NASA Astrophysics Data System (ADS)
Miller, A. I.
1999-03-01
Albert Einstein's 1907 Jahrbuch paper is an extraordinary document because it contains his first steps toward generalizing the 1905 relativity theory to include gravitation. Ignoring the apparent experimental disconfirmation of the 1905 relativity theory and his unsuccessful attempts to generalize the mass-energy equivalence, Einstein boldly raises the mass-energy equivalence to an axiom, invokes equality between gravitational and inertial masses, and then postulates the equivalence between a uniform gravitational field and an oppositely directed constant acceleration, the equivalence principle. How did this come about? What is at issue is scientific creativity. This necessitates broadening historical analysis to include aspects of cognitive science such as the role of visual imagery in Einstein's thinking, and the relation between conscious and unconscious modes of thought in problem solving. This method reveals the catalysts that sparked a Gedanken experiment that occurred to Einstein while working on the Jahrbuch paper. A mental model is presented to further explore Einstein's profound scientific discovery.
NASA Astrophysics Data System (ADS)
Stöltzner, Michael
Answering to the double-faced influence of string theory on mathematical practice and rigour, the mathematical physicists Arthur Jaffe and Frank Quinn have contemplated the idea that there exists a `theoretical' mathematics (alongside `theoretical' physics) whose basic structures and results still require independent corroboration by mathematical proof. In this paper, I shall take the Jaffe-Quinn debate mainly as a problem of mathematical ontology and analyse it against the backdrop of two philosophical views that are appreciative towards informal mathematical development and conjectural results: Lakatos's methodology of proofs and refutations and John von Neumann's opportunistic reading of Hilbert's axiomatic method. The comparison of both approaches shows that mitigating Lakatos's falsificationism makes his insights about mathematical quasi-ontology more relevant to 20th century mathematics in which new structures are introduced by axiomatisation and not necessarily motivated by informal ancestors. The final section discusses the consequences of string theorists' claim to finality for the theory's mathematical make-up. I argue that ontological reductionism as advocated by particle physicists and the quest for mathematically deeper axioms do not necessarily lead to identical results.
An Alternate Approach to Axiomatizations of the Von Neumann/Morgenstern Characteristic Function.
1987-03-01
Lewis, A., et al., Technical Report TR-569, March 1987. Institute for Mathematical Studies in the Social Sciences (The Economics Series), Stanford University; National Science Foundation Grant DMS-84-10456. Only a fragment of the abstract survives in the source: the characteristic function of a game - that gives us an intuitive idea of the value of a coalition - is of central importance in the theory of N-person...
Automated Verification of Design Patterns with LePUS3
NASA Technical Reports Server (NTRS)
Nicholson, Jonathan; Gasparis, Epameinondas; Eden, Ammon H.; Kazman, Rick
2009-01-01
Specification and [visual] modelling languages are expected to combine strong abstraction mechanisms with rigour, scalability, and parsimony. LePUS3 is a visual, object-oriented design description language axiomatized in a decidable subset of the first-order predicate logic. We demonstrate how LePUS3 is used to formally specify a structural design pattern and prove (verify) whether any Java 1.4 program satisfies that specification. We also show how LePUS3 specifications (charts) are composed and how they are verified fully automatically in the Two-Tier Programming Toolkit.
NASA Technical Reports Server (NTRS)
Onwubiko, Chin-Yere; Onyebueke, Landon
1996-01-01
The structural design, or the design of machine elements, has traditionally been based on deterministic design methodology. The deterministic method considers all design parameters to be known with certainty. This methodology is, therefore, inadequate for designing complex structures that are subjected to a variety of complex, severe loading conditions. A nonlinear behavior that is dependent on stress, stress rate, temperature, number of load cycles, and time is observed in all components subjected to complex conditions. These complex conditions introduce uncertainties; hence, the actual factor-of-safety margin remains unknown. In the deterministic methodology, the contingency of failure is discounted; hence, a high factor of safety is used. It may be most useful in situations where the design structures are simple. The probabilistic method is concerned with the probability of non-failure performance of structures or machine elements. It is much more useful in situations where the design is characterized by complex geometry, possibility of catastrophic failure, or sensitive loads and material properties. Also included: Comparative Study of the use of AGMA Geometry Factors and Probabilistic Design Methodology in the Design of Compact Spur Gear Set.
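A minimal sketch of the contrast described above, with hypothetical stress and strength distributions: the deterministic factor of safety looks comfortable while the Monte Carlo estimate exposes a small but non-zero failure probability.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Hypothetical normally distributed load stress and material strength (MPa).
stress = rng.normal(300.0, 40.0, n)
strength = rng.normal(450.0, 50.0, n)

fos = 450.0 / 300.0                       # deterministic factor of safety on the means
p_fail = np.mean(stress > strength)       # probabilistic view: chance of stress exceeding strength

print(f"factor of safety on the means: {fos:.2f}")
print(f"estimated probability of failure: {p_fail:.4f}")
```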
Disclosures and refutations: clinical psychoanalysis as a logic of enquiry.
Ahumada, J L
1997-12-01
The author argues that the empirical status of psychoanalysis has been distorted by the controversy surrounding Aristotelian and Galilean-Newtonian schemes of science, whose core ideas, respectively, are scientific concepts or forms and scientific laws. Bound by the Euclidian axiomatic tradition, Newtonian-style 'laws' do not obtain in the social sciences, and in biology they apply only at near-molecular levels, not at the level of mind. Continuing the traditional Aristotelian reliance on 'exemplars', neither Freud's nor Darwin's work fits the deductivist, infallibilistic scientific criteria used by philosophers such as Tarski, Popper or Grünbaum. Rebutting the charge that the workings of psychoanalysis are not explicit enough, this paper unfolds the logic of enquiry it utilises. In contrast to Newtonian inductivism, Popper's method of conjectures and refutations and Lakatos's method of proofs and refutations, clinical psychoanalysis employs a 'practical logic' of disclosures and refutations, a constantly evolving enquiry of multi-layered and tentative evidence of discordance and analogy. Such epistemic fallibilism, relying on a build-up of ostensive evidences rather than on 'certainties', is illustrated in the text by Moore's famous 'two-hands' argument, is held to fit in with the workings of everyday discernment and, arguably, with Darwinian evolution.
NASA Astrophysics Data System (ADS)
Vallega, Adalberto
1999-08-01
In the mid-1980s the debate about the role of oceanography vis-à-vis the evolving demand for ocean research was initiated in the framework of the Intergovernmental Oceanographic Commission (IOC) of UNESCO. That discussion was basically triggered by the need to meet the demand for research generated by the United Nations Conference on the Human Environment (1972). More recently, also as a consequence of the inputs from the United Nations Conference on Environment and Development (UNCED, 1992), discussion of the role of oceanography in the framework of co-operation between physical and social disciplines was initiated, focusing on the prospect of building up ocean science. The prospect of ocean science as designed by IOC (1984) was aimed only at integrating the branches of oceanography. To discuss how that design could be implemented on the basis of progress in epistemology and ocean policy, the subject is considered at three levels: i) the epistemological level, where the option between positivism-based and constructivism-based epistemologies has arisen; ii) the logical level, where the option concerns disjunctive versus conjunctive logic; and iii) the methodological level, where the option regards the analytical-deductive versus the inductive-axiomatic methods. The thesis is advanced that, to meet the demand for management-oriented research, the pathway comprising constructivist epistemology, conjunctive logic and inductive-axiomatic methods could usefully be adopted as the cement of inter-disciplinarity. The second part of the paper is concerned with the Mediterranean, and considers how holistic, management-oriented investigations might be conducted by applying the above conceptual approach: i) presenting the individual emerging subject areas on which the demand for management patterns is expected to focus in the mid- and long-run; ii) illustrating the major aspects of the individual subject areas to be investigated; iii) deducing what leading role might be assigned to oceanography in building up inter-disciplinary approaches; and iv) identifying which disciplines might be stimulated to co-operate. The subject areas considered with reference to the Mediterranean include: i) integrated coastal management; ii) the deep-ocean coastal uses, with special consideration of living resources management; iii) the protection of biodiversity. Ocean Geographical Information Systems (OGIS), data management, and education and training are presented as intersecting research areas calling for inter-disciplinary approaches. As a conclusion, a breakdown of questions on which discussion might be concentrated is considered.
Fuzzy Versions of Epistemic and Deontic Logic
NASA Technical Reports Server (NTRS)
Gounder, Ramasamy S.; Esterline, Albert C.
1998-01-01
Epistemic and deontic logics are modal logics, respectively, of knowledge and of the normative concepts of obligation, permission, and prohibition. Epistemic logic is useful in formalizing systems of communicating processes and knowledge and belief in AI (Artificial Intelligence). Deontic logic is useful in computer science wherever we must distinguish between actual and ideal behavior, as in fault tolerance and database integrity constraints. We here discuss fuzzy versions of these logics. In the crisp versions, various axioms correspond to various properties of the structures used in defining the semantics of the logics. Thus, any axiomatic theory will be characterized not only by its axioms but also by the set of properties holding of the corresponding semantic structures. Fuzzy logic does not proceed with axiomatic systems, but fuzzy versions of the semantic properties exist and can be shown to correspond to some of the axioms for the crisp systems in special ways that support dependency networks among assertions in a modal domain. This in turn allows one to implement truth maintenance systems. We build on the standard technical developments of epistemic logic and of deontic logic. To our knowledge, we are the first to address fuzzy epistemic and fuzzy deontic logic explicitly and to consider the different systems and semantic properties available. We give the syntax and semantics of epistemic logic and discuss the correspondence between axioms of epistemic logic and properties of semantic structures. The same topics are covered for deontic logic. For fuzzy epistemic and fuzzy deontic logic, we discuss the relationship between axioms and semantic properties. Our results can be exploited in truth maintenance systems.
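A minimal sketch (one possible fuzzy semantics, not necessarily the authors' formulation) of the axiom-property correspondence in a fuzzy setting: on a reflexive frame, a knowledge operator defined as a minimum over accessible worlds satisfies K(phi) <= phi at every world, a graded analogue of axiom T.

```python
# One possible fuzzy Kripke semantics (illustrative): K(phi) at a world is the
# minimum truth degree of phi over the worlds accessible from it. On a
# reflexive frame this gives K(phi) <= phi everywhere, a graded counterpart of
# the crisp correspondence between axiom T (K phi -> phi) and reflexivity.
worlds = ["w0", "w1", "w2"]
access = {"w0": ["w0", "w1"], "w1": ["w1", "w2"], "w2": ["w2"]}   # reflexive relation
phi = {"w0": 0.9, "w1": 0.6, "w2": 0.3}                           # truth degrees of phi

def K(prop, w):
    return min(prop[v] for v in access[w])

for w in worlds:
    print(w, "K(phi) =", K(phi, w), " phi =", phi[w], " K(phi) <= phi:", K(phi, w) <= phi[w])
```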
Mathematics and complex systems.
Foote, Richard
2007-10-19
Contemporary researchers strive to understand complex physical phenomena that involve many constituents, may be influenced by numerous forces, and may exhibit unexpected or emergent behavior. Often such "complex systems" are macroscopic manifestations of other systems that exhibit their own complex behavior and obey more elemental laws. This article proposes that areas of mathematics, even ones based on simple axiomatic foundations, have discernible layers, entirely unexpected "macroscopic" outcomes, and both mathematical and physical ramifications profoundly beyond their historical beginnings. In a larger sense, the study of mathematics itself, which is increasingly surpassing the capacity of researchers to verify "by hand," may be the ultimate complex system.
Have we been wrong about ionizing radiation and chronic lymphocytic leukemia?
Hamblin, Terry J
2008-04-01
It is almost axiomatic that chronic lymphocytic leukemia (CLL) is not caused by ionizing radiation. This assumption has been challenged recently by a critical re-appraisal of existing data. A recent paper implicated radon exposure in Czech uranium miners as a possible cause of CLL and in this issue of Leukemia Research the first paper examining the incidence of CLL among those exposed to radiation from the accident at the nuclear power plant in Chernobyl is published. It suggests that CLL occurring among the clean-up workers was of a more aggressive form than is normally seen in the community.
Setting standards for product selection: allergy prevention.
White, I R
1997-01-01
It is axiomatic to state that if products made of natural rubber latex were not used in health care settings then there would be no problems of acquired hypersensitivity from such products. Although synthetic materials are available they do not currently possess the same technical qualities of elasticity and comfort, nor do they deliver the desired degree of protection against biological agents as gloves made out of natural rubber latex. Selection of gloves either for non-sterile procedures or sterile surgical use should be based on this understanding, and gloves with minimal levels of extractable latex proteins should be used.
Axiomatic foundations for cost-effectiveness analysis.
Canning, David
2013-12-01
We show that individual utilities can be measured in units of healthy life years. Social preferences over these life metric utilities are assumed to satisfy the Pareto principle, anonymity, and invariance to a change in origin. These axioms generate a utilitarian social welfare function implying the use of cost-effectiveness analysis in ordering health projects, based on maximizing the healthy years equivalents gained from a fixed health budget. For projects outside the health sector, our cost-effectiveness axioms imply a form of cost-benefit analysis where both costs and benefits are measured in equivalent healthy life years. Copyright © 2013 John Wiley & Sons, Ltd.
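A toy numerical illustration (hypothetical costs and health gains) of the decision rule the axioms lead to: rank projects by healthy life years gained per unit cost and fund them greedily from a fixed budget, which is optimal when projects are divisible.

```python
# Hypothetical projects: (name, cost, healthy life years gained if fully funded).
projects = [
    ("vaccination", 40.0, 120.0),
    ("screening", 60.0, 90.0),
    ("new clinic", 100.0, 110.0),
]
budget = 120.0

# Rank by healthy life years per unit cost and fund greedily; with divisible
# projects this maximizes total healthy life years from the fixed budget.
total_hly = 0.0
for name, cost, hly in sorted(projects, key=lambda p: p[2] / p[1], reverse=True):
    spend = min(cost, budget)
    total_hly += hly * spend / cost
    budget -= spend
    print(f"{name}: spend {spend:.0f}, cumulative healthy life years {total_hly:.1f}")
```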
Foundations of educational psychology: Howard Gardner's neoclassical psyche.
Diessner, R
2001-12-01
This article is a theoretical examination of the implications of Howard Gardner's work in developmental and educational psychology (1983, 1993, 1999a, 1999b) for the structure of the psyche. The author accepts as axiomatic, in the context of this article, Gardner's educational manifesto (1999a) that all students should be taught disciplinary understandings of truth, beauty, and goodness. Rational inferences are then made indicating that the psyche that Gardner intends to educate and help develop is in the form of a neoclassical psyche and that it is structured by the capacities to know truth, to love beauty, and to will goodness.
A multi-agent architecture for geosimulation of moving agents
NASA Astrophysics Data System (ADS)
Vahidnia, Mohammad H.; Alesheikh, Ali A.; Alavipanah, Seyed Kazem
2015-10-01
In this paper, a novel architecture is proposed in which an axiomatic derivation system in the form of first-order logic facilitates declarative explanation and spatial reasoning. Simulation of environmental perception and interaction between autonomous agents is designed with a geographic belief-desire-intention and a request-inform-query model. The architecture has a complementary quantitative component that supports collaborative planning based on the concept of equilibrium and game theory. This new architecture represents a departure from current best practices in geographic agent-based modelling. Implementation tasks are discussed in some detail, as well as scenarios for fleet management and disaster management.
Harm, hype and evidence: ELSI research and policy guidance
2013-01-01
There has been much investment in research on the ethical, legal and social issues (ELSI) associated with genetic and genomic research. This research should inform the development of the relevant policy. So far, much of the relevant policy - such as in the areas of patents, genetic testing and genetic discrimination - seems to be informed more by speculation of harm and anecdote than by available evidence. Although a quest for evidence cannot always be allowed to delay policy choice, it seems axiomatic to us that policy options are improved by the incorporation of evidence. PMID:23534337
The Nature of Quantum Truth: Logic, Set Theory, & Mathematics in the Context of Quantum Theory
NASA Astrophysics Data System (ADS)
Frey, Kimberly
The purpose of this dissertation is to construct a radically new type of mathematics whose underlying logic differs from the ordinary classical logic used in standard mathematics, and which we feel may be more natural for applications in quantum mechanics. Specifically, we begin by constructing a first order quantum logic, the development of which closely parallels that of ordinary (classical) first order logic --- the essential differences are in the nature of the logical axioms, which, in our construction, are motivated by quantum theory. After showing that the axiomatic first order logic we develop is sound and complete (with respect to a particular class of models), this logic is then used as a foundation on which to build (axiomatic) mathematical systems --- and we refer to the resulting new mathematics as "quantum mathematics." As noted above, the hope is that this form of mathematics is more natural than classical mathematics for the description of quantum systems, and will enable us to address some foundational aspects of quantum theory which are still troublesome --- e.g. the measurement problem --- as well as possibly even inform our thinking about quantum gravity. After constructing the underlying logic, we investigate properties of several mathematical systems --- e.g. axiom systems for abstract algebras, group theory, linear algebra, etc. --- in the presence of this quantum logic. In the process, we demonstrate that the resulting quantum mathematical systems have some strange, but very interesting features, which indicates a richness in the structure of mathematics that is classically inaccessible. Moreover, some of these features do indeed suggest possible applications to foundational questions in quantum theory. We continue our investigation of quantum mathematics by constructing an axiomatic quantum set theory, which we show satisfies certain desirable criteria. Ultimately, we hope that such a set theory will lead to a foundation for quantum mathematics in a sense which parallels the foundational role of classical set theory in classical mathematics. One immediate application of the quantum set theory we develop is to provide a foundation on which to construct quantum natural numbers, which are the quantum analog of the classical counting numbers. It turns out that in a special class of models, there exists a 1-1 correspondence between the quantum natural numbers and bounded observables in quantum theory whose eigenvalues are (ordinary) natural numbers. This 1-1 correspondence is remarkably satisfying, and not only gives us great confidence in our quantum set theory, but indicates the naturalness of such models for quantum theory itself. We go on to develop a Peano-like arithmetic for these new "numbers," as well as consider some of its consequences. Finally, we conclude by summarizing our results, and discussing directions for future work.
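The dissertation's construction is not reproduced here, but the structural point that quantum logic departs from classical logic can be seen in a few lines: in the lattice of subspaces of a two-dimensional Hilbert space the distributive law fails.

```python
import numpy as np

def dim_join(A, B):
    """Dimension of the smallest subspace containing both column spans."""
    return np.linalg.matrix_rank(np.column_stack([A, B]))

def dim_meet(A, B):
    """dim(A ∩ B) = dim A + dim B - dim(A ∨ B) for subspaces."""
    return np.linalg.matrix_rank(A) + np.linalg.matrix_rank(B) - dim_join(A, B)

a = np.array([[1.0], [0.0]])   # line spanned by |0>
b = np.array([[0.0], [1.0]])   # line spanned by |1>
c = np.array([[1.0], [1.0]])   # line spanned by |0> + |1>

lhs = dim_meet(c, np.column_stack([a, b]))     # c ∧ (a ∨ b): dimension 1, i.e. c itself
rhs = dim_meet(c, a) + dim_meet(c, b)          # (c ∧ a) ∨ (c ∧ b): both meets are {0}
print("dim of c ∧ (a ∨ b):", lhs)
print("dim of (c ∧ a) ∨ (c ∧ b):", rhs)
print("distributive law holds?", lhs == rhs)   # False
```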
The Role of Self-Assessment in Foundation of Mathematics Learning
NASA Astrophysics Data System (ADS)
Masriyah
2018-01-01
This research is motivated by the low performance of students who took the Foundations of Mathematics course. This study aimed to describe (1) the learning outcomes of students who learned Foundations of Mathematics through axiomatic learning in which self-assessment was applied; (2) the difficulties of students and the alternative solutions; and (3) the response of students toward Foundations of Mathematics learning taught by applying self-assessment. This research was a descriptive study. The subjects were 25 mathematics students who studied Foundations of Mathematics in the odd semester of the 2015/2016 academic year. Data collection was done using questionnaires and testing methods. Based on the results of data analysis, it can be concluded that the learning outcomes of students were categorized as “good.” Student responses were positive; the difficulties lay in the sub-materials: classification of axiom systems and their requirements, theorems and how they are formed, and finite geometry. The alternatives for dealing with these difficulties are to give emphasis and explanation as needed on these materials, as well as to provide some more exercises to reinforce students' understanding.
Lozada Aguilar, Miguel Ángel; Khrennikov, Andrei; Oleschko, Klaudia
2018-04-28
As was recently shown by the authors, quantum probability theory can be used for the modelling of the process of decision-making (e.g. probabilistic risk analysis) for macroscopic geophysical structures such as hydrocarbon reservoirs. This approach can be considered as a geophysical realization of Hilbert's programme on axiomatization of statistical models in physics (the famous sixth Hilbert problem). In this conceptual paper, we continue development of this approach to decision-making under uncertainty which is generated by complexity, variability, heterogeneity, anisotropy, as well as the restrictions to accessibility of subsurface structures. The belief state of a geological expert about the potential of exploring a hydrocarbon reservoir is continuously updated by outputs of measurements, and selection of mathematical models and scales of numerical simulation. These outputs can be treated as signals from the information environment E. The dynamics of the belief state can be modelled with the aid of the theory of open quantum systems: a quantum state (representing uncertainty in beliefs) is dynamically modified through coupling with E; stabilization to a steady state determines a decision strategy. In this paper, the process of decision-making about hydrocarbon reservoirs (e.g. 'explore or not?'; 'open new well or not?'; 'contaminated by water or not?'; 'double or triple porosity medium?') is modelled by using the Gorini-Kossakowski-Sudarshan-Lindblad equation. In our model, this equation describes the evolution of experts' predictions about a geophysical structure. We proceed with the information approach to quantum theory and the subjective interpretation of quantum probabilities (due to quantum Bayesianism). This article is part of the theme issue 'Hilbert's sixth problem'. © 2018 The Author(s).
NASA Astrophysics Data System (ADS)
Lozada Aguilar, Miguel Ángel; Khrennikov, Andrei; Oleschko, Klaudia
2018-04-01
As was recently shown by the authors, quantum probability theory can be used for the modelling of the process of decision-making (e.g. probabilistic risk analysis) for macroscopic geophysical structures such as hydrocarbon reservoirs. This approach can be considered as a geophysical realization of Hilbert's programme on axiomatization of statistical models in physics (the famous sixth Hilbert problem). In this conceptual paper, we continue development of this approach to decision-making under uncertainty which is generated by complexity, variability, heterogeneity, anisotropy, as well as the restrictions to accessibility of subsurface structures. The belief state of a geological expert about the potential of exploring a hydrocarbon reservoir is continuously updated by outputs of measurements, and selection of mathematical models and scales of numerical simulation. These outputs can be treated as signals from the information environment E. The dynamics of the belief state can be modelled with the aid of the theory of open quantum systems: a quantum state (representing uncertainty in beliefs) is dynamically modified through coupling with E; stabilization to a steady state determines a decision strategy. In this paper, the process of decision-making about hydrocarbon reservoirs (e.g. `explore or not?'; `open new well or not?'; `contaminated by water or not?'; `double or triple porosity medium?') is modelled by using the Gorini-Kossakowski-Sudarshan-Lindblad equation. In our model, this equation describes the evolution of experts' predictions about a geophysical structure. We proceed with the information approach to quantum theory and the subjective interpretation of quantum probabilities (due to quantum Bayesianism). This article is part of the theme issue `Hilbert's sixth problem'.
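The sketch below is a generic single-qubit Gorini-Kossakowski-Sudarshan-Lindblad evolution with arbitrary illustrative operators and rates, not the model fitted in the paper; it shows the mechanism described above, in which coupling to an information environment drives the belief state to a steady state that can be read off as a decision.

```python
import numpy as np

# Generic single-qubit GKSL (Lindblad) evolution with illustrative parameters:
# |0> is read as "explore", |1> as "do not explore". The Hamiltonian models
# coherent deliberation; the jump operator models information from the
# environment relaxing the belief toward |0>.
H = 0.5 * np.array([[0, 1], [1, 0]], dtype=complex)
L = np.array([[0, 1], [0, 0]], dtype=complex)
gamma, dt, steps = 0.8, 0.01, 2000

rho = 0.5 * np.eye(2, dtype=complex)            # maximally uncertain initial belief
for _ in range(steps):
    comm = H @ rho - rho @ H
    diss = L @ rho @ L.conj().T - 0.5 * (L.conj().T @ L @ rho + rho @ L.conj().T @ L)
    rho = rho + dt * (-1j * comm + gamma * diss)

# The steady-state population of |0> exceeds 1/2: the dissipation has tipped
# the balance of the belief state, which is read off as the decision.
print("P(explore) after relaxation:", rho[0, 0].real)
```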
Design process of the nanofluid injection mechanism in nuclear power plants
NASA Astrophysics Data System (ADS)
Kang, Myoung-Suk; Jee, Changhyun; Park, Sangjun; Bang, In Choel; Heo, Gyunyoung
2011-04-01
Nanofluids, which are engineered suspensions of nanoparticles in a solvent such as water, have been found to show enhanced coolant properties such as higher critical heat flux and surface wettability at modest concentrations, which is a useful characteristic in nuclear power plants (NPPs). This study attempted to provide an example of engineering applications in NPPs using nanofluid technology. From these motivations, conceptual designs of emergency core cooling systems (ECCSs) assisted by a nanofluid injection mechanism were proposed, following a design framework for developing complex engineering systems. We focused on the analysis of functional requirements for integrating the conventional ECCSs and the nanofluid injection mechanism without loss of performance and reliability. Three candidate nanofluid-engineered ECCSs proposed in previous research were investigated by applying axiomatic design (AD) in the manner of reverse engineering, which made it possible to identify the compatibility of the functional requirements and potential design vulnerabilities. Methods to resolve such vulnerabilities were drawn from TRIZ and concretized for the ECCS of the Korean nuclear power plant. The results show a method to decouple the ECCS designs by installing a separate nanofluid injection tank adjacent to the safety injection tanks, such that a low-pH environment for the nanofluids can be maintained at atmospheric pressure, which is favorable for their injection in a passive manner.
Design process of the nanofluid injection mechanism in nuclear power plants
2011-01-01
Nanofluids, which are engineered suspensions of nanoparticles in a solvent such as water, have been found to show enhanced coolant properties such as higher critical heat flux and surface wettability at modest concentrations, which is a useful characteristic in nuclear power plants (NPPs). This study attempted to provide an example of engineering applications in NPPs using nanofluid technology. From these motivations, conceptual designs of emergency core cooling systems (ECCSs) assisted by a nanofluid injection mechanism were proposed, following a design framework for developing complex engineering systems. We focused on the analysis of functional requirements for integrating the conventional ECCSs and the nanofluid injection mechanism without loss of performance and reliability. Three candidate nanofluid-engineered ECCSs proposed in previous research were investigated by applying axiomatic design (AD) in the manner of reverse engineering, which made it possible to identify the compatibility of the functional requirements and potential design vulnerabilities. Methods to resolve such vulnerabilities were drawn from TRIZ and concretized for the ECCS of the Korean nuclear power plant. The results show a method to decouple the ECCS designs by installing a separate nanofluid injection tank adjacent to the safety injection tanks, such that a low-pH environment for the nanofluids can be maintained at atmospheric pressure, which is favorable for their injection in a passive manner. PMID:21711896
Design process of the nanofluid injection mechanism in nuclear power plants.
Kang, Myoung-Suk; Jee, Changhyun; Park, Sangjun; Bang, In Choel; Heo, Gyunyoung
2011-04-27
Nanofluids, which are engineered suspensions of nanoparticles in a solvent such as water, have been found to show enhanced coolant properties such as higher critical heat flux and surface wettability at modest concentrations, which is a useful characteristic in nuclear power plants (NPPs). This study attempted to provide an example of engineering applications in NPPs using nanofluid technology. From these motivations, conceptual designs of emergency core cooling systems (ECCSs) assisted by a nanofluid injection mechanism were proposed, following a design framework for developing complex engineering systems. We focused on the analysis of functional requirements for integrating the conventional ECCSs and the nanofluid injection mechanism without loss of performance and reliability. Three candidate nanofluid-engineered ECCSs proposed in previous research were investigated by applying axiomatic design (AD) in the manner of reverse engineering, which made it possible to identify the compatibility of the functional requirements and potential design vulnerabilities. Methods to resolve such vulnerabilities were drawn from TRIZ and concretized for the ECCS of the Korean nuclear power plant. The results show a method to decouple the ECCS designs by installing a separate nanofluid injection tank adjacent to the safety injection tanks, such that a low-pH environment for the nanofluids can be maintained at atmospheric pressure, which is favorable for their injection in a passive manner.
Ferris, D Lance; Reb, Jochen; Lian, Huiwen; Sim, Samantha; Ang, Dionysius
2018-03-01
Past research on dynamic workplace performance evaluation has taken as axiomatic that temporal performance trends produce naïve extrapolation effects on performance ratings. That is, we naïvely assume that an individual whose performance has trended upward over time will continue to improve, and rate that individual more positively than an individual whose performance has trended downward over time, even if, on average, the 2 individuals have performed at an equivalent level. However, we argue that such naïve extrapolation effects are more pronounced in Western countries than Eastern countries, owing to Eastern countries having a more holistic cognitive style. To test our hypotheses, we examined the effect of performance trend on expectations of future performance and ratings of past performance across 2 studies: Study 1 compares the magnitude of naïve extrapolation effects among Singaporeans primed with either a more or less holistic cognitive style, and Study 2 examines holistic cognitive style as a mediating mechanism accounting for differences in the magnitude of naïve extrapolation effects between American and Chinese raters. Across both studies, we found support for our predictions that dynamic performance trends have less impact on the ratings of more holistic thinkers. Implications for the dynamic performance and naïve extrapolation literatures are discussed. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
A macro-physics model of depreciation rate in economic exchange
NASA Astrophysics Data System (ADS)
Marmont Lobo, Rui F.; de Sousa, Miguel Rocha
2014-02-01
This article aims at a new approach for a known fundamental result: barter or trade increases economic value. It successfully bridges the gap between the theory of value and the exchange process attached to the transition from endowments to the equilibrium in the core and contract curve. First, we summarise the theory of value; in Section 2, we present the Edgeworth (1881) box and an axiomatic approach and in Section 3, we apply our pure exchange model. Finally (in Section 4), using our open econo-physics pure barter (EPB) model, we derive an improvement in value, which means that pure barter leads to a decline in depreciation rate.
NASA Astrophysics Data System (ADS)
Skilling, John
2005-11-01
This tutorial gives a basic overview of Bayesian methodology, from its axiomatic foundation through the conventional development of data analysis and model selection to its rôle in quantum mechanics, and ending with some comments on inference in general human affairs. The central theme is that probability calculus is the unique language within which we can develop models of our surroundings that have predictive capability. These models are patterns of belief; there is no need to claim external reality. 1. Logic and probability 2. Probability and inference 3. Probability and model selection 4. Prior probabilities 5. Probability and frequency 6. Probability and quantum mechanics 7. Probability and fundamentalism 8. Probability and deception 9. Prediction and truth
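As a concrete instance of the model-selection step mentioned above (a textbook example, not taken from the tutorial itself), the following snippet computes the Bayes factor between a fair-coin model and a uniform-prior model from the marginal likelihoods of the data.

```python
from math import comb

# k heads in n tosses; compare M1 (fair coin, theta = 1/2) against M2
# (theta unknown, uniform prior on [0, 1]) via their marginal likelihoods.
n, k = 20, 15
evidence_m1 = 0.5 ** n                         # probability of the observed sequence under M1
evidence_m2 = 1.0 / ((n + 1) * comb(n, k))     # integral of theta^k (1-theta)^(n-k) dtheta

bayes_factor = evidence_m2 / evidence_m1
print(f"Bayes factor in favour of the unknown-bias model: {bayes_factor:.2f}")
```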
On the theory of Brownian motion with the Alder-Wainwright effect
NASA Astrophysics Data System (ADS)
Okabe, Yasunori
1986-12-01
The Stokes-Boussinesq-Langevin equation, which describes the time evolution of Brownian motion with the Alder-Wainwright effect, can be treated in the framework of the theory of KMO-Langevin equations which describe the time evolution of a real, stationary Gaussian process with T-positivity (reflection positivity) originating in axiomatic quantum field theory. After proving the fluctuation-dissipation theorems for KMO-Langevin equations, we obtain an explicit formula for the deviation from the classical Einstein relation that occurs in the Stokes-Boussinesq-Langevin equation with a white noise as its random force. We are interested in whether or not it can be measured experimentally.
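For orientation, the sketch below simulates only the memoryless Langevin equation, without the Alder-Wainwright hydrodynamic memory the paper is concerned with, and checks the classical Einstein relation D = kT/(m*gamma) that the Stokes-Boussinesq-Langevin analysis corrects.

```python
import numpy as np

rng = np.random.default_rng(2)
kT, m, gamma = 1.0, 1.0, 1.0
dt, steps, n = 1e-3, 20000, 4000         # time step, number of steps, independent particles

x = np.zeros(n)
v = rng.normal(0.0, np.sqrt(kT / m), n)  # start in thermal equilibrium
for _ in range(steps):
    noise = rng.normal(0.0, 1.0, n)
    v += -gamma * v * dt + np.sqrt(2.0 * gamma * kT / m * dt) * noise
    x += v * dt

t = steps * dt
D_est = np.mean(x ** 2) / (2.0 * t)      # carries an O(1/(gamma*t)) finite-time bias
print("estimated D:", D_est, " Einstein prediction kT/(m*gamma):", kT / (m * gamma))
```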
On q-non-extensive statistics with non-Tsallisian entropy
NASA Astrophysics Data System (ADS)
Jizba, Petr; Korbel, Jan
2016-02-01
We combine an axiomatics of Rényi with the q-deformed version of Khinchin axioms to obtain a measure of information (i.e., entropy) which accounts both for systems with embedded self-similarity and non-extensivity. We show that the entropy thus obtained is uniquely solved in terms of a one-parameter family of information measures. The ensuing maximal-entropy distribution is phrased in terms of a special function known as the Lambert W-function. We analyze the corresponding "high" and "low-temperature" asymptotics and reveal a non-trivial structure of the parameter space. Salient issues such as concavity and Schur concavity of the new entropy are also discussed.
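For reference, the two one-parameter families that the paper's axiomatics interpolates between are reproduced below (the hybrid entropy itself is not restated here; its maximizer involves the Lambert W-function); both reduce to the Shannon entropy as q approaches 1.

```python
import numpy as np

def renyi(p, q):
    p = np.asarray(p, dtype=float)
    return np.log(np.sum(p ** q)) / (1.0 - q)

def tsallis(p, q):
    p = np.asarray(p, dtype=float)
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def shannon(p):
    p = np.asarray(p, dtype=float)
    return -np.sum(p * np.log(p))

p = [0.5, 0.25, 0.125, 0.125]
for q in (0.999, 1.001, 2.0):
    print(f"q = {q}: Renyi = {renyi(p, q):.4f}, Tsallis = {tsallis(p, q):.4f}")
print(f"Shannon limit: {shannon(p):.4f}")
```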
Selecting reusable components using algebraic specifications
NASA Technical Reports Server (NTRS)
Eichmann, David A.
1992-01-01
A significant hurdle confronts the software reuser attempting to select candidate components from a software repository - discriminating between those components without resorting to inspection of the implementation(s). We outline a mixed classification/axiomatic approach to this problem based upon our lattice-based faceted classification technique and Guttag and Horning's algebraic specification techniques. This approach selects candidates by natural language-derived classification, by their interfaces, using signatures, and by their behavior, using axioms. We briefly outline our problem domain and related work. Lattice-based faceted classifications are described; the reader is referred to surveys of the extensive literature for algebraic specification techniques. Behavioral support for reuse queries is presented, followed by the conclusions.
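A minimal sketch of selection by behaviour in the spirit of the approach above, with an illustrative stack component and two classic stack axioms checked by random testing rather than by proof; the class and axiom choices are made up for the example.

```python
import random

# Behavioural check of a candidate component against two classic stack axioms,
# using random test cases in place of a formal proof of the specification.
class ListStack:
    def __init__(self, items=()):
        self.items = list(items)
    def push(self, x):
        return ListStack(self.items + [x])
    def pop(self):
        return ListStack(self.items[:-1])
    def top(self):
        return self.items[-1]
    def __eq__(self, other):
        return self.items == other.items

random.seed(0)
for _ in range(1000):
    s = ListStack(random.choices(range(10), k=random.randint(0, 5)))
    x = random.randint(0, 9)
    assert s.push(x).pop() == s       # axiom: pop(push(s, x)) = s
    assert s.push(x).top() == x       # axiom: top(push(s, x)) = x
print("candidate satisfies both stack axioms on 1000 random cases")
```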
Axiomatic Analysis of Co-occurrence Similarity Functions
2012-02-01
Formally, the similarity $\mathrm{COS}_W(q,u)$ of a target node $u$ to the query $q$ based on weight matrix $W$ is: $\mathrm{COS}_W(q,u) = \sum_{c \in \Gamma(q) \cap \Gamma(u)} \frac{W_{qc}\,W_{uc}}{\lVert W_{q:}\rVert_2\,\lVert W_{u:}\rVert_2}$, where $W_{q:}$ and $W_{u:}$ are the $q$th and $u$th rows of the matrix $W$, respectively. A symbol table ($q$ denotes the query item with respect to which similarities of other items are computed) and a table of per-common-neighbour terms follow in the source, partially garbled in extraction; the recoverable entries are: AA: $1/\log|\Gamma(c)|$; COS: $W_{qc}W_{uc}/(\lVert W_{q:}\rVert_2 \lVert W_{u:}\rVert_2)$; FRW: $\frac{W_{qc}}{\sum_j W_{qj}}\cdot\frac{W_{uc}}{\sum_i W_{ic}}$; JAC: $1/|\Gamma(q)\cup\Gamma(u)|$; BRW: $\frac{W_{uc}}{\sum_j W_{uj}}\cdot\frac{W_{qc}}{\sum_i W_{ic}}$; PMI: $1/(|\Gamma(q)||\Gamma(u)|)$; MMT: truncated in the source.
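A direct transcription of the cosine co-occurrence similarity defined above, using a small made-up weight matrix; zero entries contribute nothing to the dot product, so the sum effectively runs over the shared neighbours of q and u.

```python
import numpy as np

# Small made-up weight matrix W: rows are items, columns are co-occurring
# neighbours c. Zero entries drop out of the dot product, so the numerator is
# effectively a sum over the shared neighbours of q and u.
W = np.array([
    [1.0, 2.0, 0.0, 1.0],   # row q
    [0.5, 1.0, 1.0, 0.0],   # row u
    [0.0, 0.0, 3.0, 1.0],
])

def cos_w(W, q, u):
    numerator = float(W[q] @ W[u])
    denominator = np.linalg.norm(W[q]) * np.linalg.norm(W[u])
    return numerator / denominator

print("COS_W(q=0, u=1) =", cos_w(W, 0, 1))
```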
NASA Astrophysics Data System (ADS)
Schwabe, O.; Shehab, E.; Erkoyuncu, J.
2015-08-01
The lack of defensible methods for quantifying cost estimate uncertainty over the whole product life cycle of aerospace innovations such as propulsion systems or airframes poses a significant challenge to the creation of accurate and defensible cost estimates. Based on the axiomatic definition of uncertainty as the actual prediction error of the cost estimate, this paper provides a comprehensive overview of metrics used for the uncertainty quantification of cost estimates, based on a literature review, an evaluation of publicly funded projects such as those in the CORDIS or Horizon 2020 programs, and an analysis of established approaches used by organizations such as NASA, the U.S. Department of Defense, the ESA, and various commercial companies. The metrics are categorized based on their foundational character (foundations), their use in practice (state-of-practice), their availability for practice (state-of-art) and those suggested for future exploration (state-of-future). Insights gained were that a variety of uncertainty quantification metrics exist whose suitability depends on the volatility of the available relevant information, as defined by technical and cost readiness level, and on the number of whole product life cycle phases the estimate is intended to be valid for. Information volatility and the number of whole product life cycle phases can hereby be considered as defining multi-dimensional probability fields admitting various uncertainty quantification metric families with identifiable thresholds for transitioning between them. The key research gaps identified were the lack of theoretically grounded guidance for the selection of uncertainty quantification metrics and the lack of practical alternatives to metrics based on the Central Limit Theorem. An innovative uncertainty quantification framework consisting of a set-theory based typology, a data library, a classification system, and a corresponding input-output model is put forward to address this research gap as the basis for future work in this field.
High volcanic seismic b-values: Real or artefacts?
NASA Astrophysics Data System (ADS)
Roberts, Nick; Bell, Andrew; Main, Ian G.
2015-04-01
The b-value of the Gutenberg-Richter distribution quantifies the relative proportion of large to small magnitude earthquakes in a catalogue, in turn related to the population of fault rupture areas and the average slip or stress drop. Accordingly the b-value is an important parameter to consider when evaluating seismic catalogues as it has the potential to provide insight into the temporal or spatial evolution of the system, such as fracture development or changes in the local stress regime. The b-value for tectonic seismicity is commonly found to be close to 1, whereas much higher b-values are frequently reported for volcanic and induced seismicity. Understanding these differences is important for understanding the processes controlling earthquake occurrence in different settings. However, it is possible that anomalously high b-values could arise from small sample sizes, under-estimated completeness magnitudes, or other poorly applied methodologies. Therefore, it is important to establish a rigorous workflow for analyzing these datasets. Here we examine the frequency-magnitude distributions of volcanic earthquake catalogues in order to determine the significance of apparently high b-values. We first derive a workflow for computing the completeness magnitude of a seismic catalogue, using synthetic catalogues of varying shape, size, and known b-value. We find the best approach involves a combination of three methods: 'Maximum Curvature', 'b-value stability', and the 'Goodness-of-Fit test'. To calculate a reliable b-value with an error ≤0.25, the maximum curvature method is preferred for a 'sharp-peaked' discrete distribution. For a catalogue with a broader peak the b-value stability method is the most reliable with the Goodness-of-Fit test being an acceptable backup if the b-value stability method fails. We apply this workflow to earthquake catalogues from El Hierro (2011-2013) and Mt Etna (1999-2013) volcanoes. In general, we find the b-value to be equal to or slightly greater than 1. However, reliable high b-values of 1.5-2.4 at El Hierro and 1.5-1.8 at Mt Etna are observed for restricted time periods. We argue that many of the almost axiomatically 'high' b-values reported in the literature for volcanic and induced seismicity may be attributable to biases introduced by the methods of inference used and/or the relatively small sample sizes often available.
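A minimal version of the synthetic test described above: magnitudes are drawn from a Gutenberg-Richter distribution with a known b-value and completeness magnitude, and b is recovered with the standard maximum-likelihood (Aki) estimator; the catalogue parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
b_true, Mc, n = 1.0, 1.5, 5000            # illustrative catalogue parameters

# Gutenberg-Richter magnitudes above the completeness magnitude Mc.
beta = b_true * np.log(10.0)
mags = Mc + rng.exponential(1.0 / beta, n)

# Maximum-likelihood (Aki) estimator and its first-order standard error.
b_hat = np.log10(np.e) / (np.mean(mags) - Mc)
se = b_hat / np.sqrt(n)
print(f"true b = {b_true}, estimated b = {b_hat:.3f} +/- {se:.3f}")
```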
Can high seismic b-values be explained solely by poorly applied methodology?
NASA Astrophysics Data System (ADS)
Roberts, Nick; Bell, Andrew; Main, Ian
2015-04-01
The b-value of the Gutenberg-Richter distribution quantifies the relative proportion of large to small magnitude earthquakes in a catalogue, in turn related to the population of fault rupture areas and the average slip or stress drop. Accordingly the b-value is an important parameter to consider when evaluating seismic catalogues as it has the potential to provide insight into the temporal or spatial evolution of the system, such as fracture development or changes in the local stress regime. The b-value for tectonic seismicity is commonly found to be close to 1, whereas much higher b-values are frequently reported for volcanic and induced seismicity. Understanding these differences is important for understanding the processes controlling earthquake occurrence in different settings. However, it is possible that anomalously high b-values could arise from small sample sizes, under-estimated completeness magnitudes, or other poorly applied methodologies. Therefore, it is important to establish a rigorous workflow for analyzing these datasets. Here we examine the frequency-magnitude distributions of volcanic earthquake catalogues in order to determine the significance of apparently high b-values. We first derive a workflow for computing the completeness magnitude of a seismic catalogue, using synthetic catalogues of varying shape, size, and known b-value. We find the best approach involves a combination of three methods: 'Maximum Curvature', 'b-value stability', and the 'Goodness-of-Fit test'. To calculate a reliable b-value with an error ≤0.25, the maximum curvature method is preferred for a 'sharp-peaked' discrete distribution. For a catalogue with a broader peak the b-value stability method is the most reliable with the Goodness-of-Fit test being an acceptable backup if the b-value stability method fails. We apply this workflow to earthquake catalogues from El Hierro (2011-2013) and Mt Etna (1999-2013) volcanoes. In general, we find the b-value to be equal to or slightly greater than 1; however, reliably high b-values are reported in both catalogues. We argue that many of the almost axiomatically 'high' b-values reported in the literature for volcanic and induced seismicity may be attributable to biases introduced by the methods of inference used and/or the relatively small sample sizes often available. This new methodology, although focused towards volcanic catalogues, is applicable to all seismic catalogues.
van Damme, Philip; Quesada-Martínez, Manuel; Cornet, Ronald; Fernández-Breis, Jesualdo Tomás
2018-06-13
Ontologies and terminologies have been identified as key resources for the achievement of semantic interoperability in biomedical domains. The development of ontologies is performed as a joint work by domain experts and knowledge engineers. The maintenance and auditing of these resources is also the responsibility of such experts, and this is usually a time-consuming, mostly manual task. Manual auditing is impractical and ineffective for most biomedical ontologies, especially for larger ones. An example is SNOMED CT, a key resource in many countries for codifying medical information. SNOMED CT contains more than 300000 concepts. Consequently its auditing requires the support of automatic methods. Many biomedical ontologies contain natural language content for humans and logical axioms for machines. The 'lexically suggest, logically define' principle means that there should be a relation between what is expressed in natural language and as logical axioms, and that such a relation should be useful for auditing and quality assurance. Besides, the meaning of this principle is that the natural language content for humans could be used to generate the logical axioms for the machines. In this work, we propose a method that combines lexical analysis and clustering techniques to (1) identify regularities in the natural language content of ontologies; (2) cluster, by similarity, labels exhibiting a regularity; (3) extract relevant information from those clusters; and (4) propose logical axioms for each cluster with the support of axiom templates. These logical axioms can then be evaluated with the existing axioms in the ontology to check their correctness and completeness, which are two fundamental objectives in auditing and quality assurance. In this paper, we describe the application of the method to two SNOMED CT modules, a 'congenital' module, obtained using concepts exhibiting the attribute Occurrence - Congenital, and a 'chronic' module, using concepts exhibiting the attribute Clinical course - Chronic. We obtained a precision and a recall of respectively 75% and 28% for the 'congenital' module, and 64% and 40% for the 'chronic' one. We consider these results to be promising, so our method can contribute to the support of content editors by using automatic methods for assuring the quality of biomedical ontologies and terminologies. Copyright © 2018. Published by Elsevier Inc.
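A toy sketch of the 'lexically suggest, logically define' workflow; the labels, the regular expression and the axiom templates are illustrative stand-ins, not actual SNOMED CT content, although the two attributes echoed in the templates are the ones named above.

```python
import re
from collections import defaultdict

# Illustrative labels only (not actual SNOMED CT content): cluster labels by a
# lexical regularity and emit a candidate axiom per cluster from a template.
labels = [
    "Congenital anomaly of heart",
    "Congenital anomaly of kidney",
    "Congenital malformation of skull",
    "Chronic disease of liver",
]

clusters = defaultdict(list)
for label in labels:
    match = re.match(r"^(Congenital|Chronic)\b", label)   # the regularity probed here
    if match:
        clusters[match.group(1)].append(label)

templates = {                                             # hypothetical axiom templates
    "Congenital": "'{c}' SubClassOf (Occurrence some Congenital)",
    "Chronic": "'{c}' SubClassOf (Clinical course some Chronic)",
}
for key, members in clusters.items():
    for concept in members:
        print(templates[key].format(c=concept))
```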
Detecting trends in tree growth: not so simple.
Bowman, David M J S; Brienen, Roel J W; Gloor, Emanuel; Phillips, Oliver L; Prior, Lynda D
2013-01-01
Tree biomass influences biogeochemical cycles, climate, and biodiversity across local to global scales. Understanding the environmental control of tree biomass demands consideration of the drivers of individual tree growth over their lifespan. This can be achieved by studies of tree growth in permanent sample plots (prospective studies) and tree ring analyses (retrospective studies). However, identification of growth trends and attribution of their drivers demands statistical control of the axiomatic co-variation of tree size and age, and avoiding sampling biases at the stand, forest, and regional scales. Tracking and predicting the effects of environmental change on tree biomass requires well-designed studies that address the issues that we have reviewed. Copyright © 2012 Elsevier Ltd. All rights reserved.
Nonadditive entropies yield probability distributions with biases not warranted by the data.
Pressé, Steve; Ghosh, Kingshuk; Lee, Julian; Dill, Ken A
2013-11-01
Different quantities that go by the name of entropy are used in variational principles to infer probability distributions from limited data. Shore and Johnson showed that maximizing the Boltzmann-Gibbs form of the entropy ensures that probability distributions inferred satisfy the multiplication rule of probability for independent events in the absence of data coupling such events. Other types of entropies that violate the Shore and Johnson axioms, including nonadditive entropies such as the Tsallis entropy, violate this basic consistency requirement. Here we use the axiomatic framework of Shore and Johnson to show how such nonadditive entropy functions generate biases in probability distributions that are not warranted by the underlying data.
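The variational principle at stake can be made concrete with the classic maximum-entropy die example: maximizing the Boltzmann-Gibbs entropy under a mean constraint yields the exponential (Gibbs) family, computed below by bisection on the Lagrange multiplier.

```python
import numpy as np

# Maximize the Boltzmann-Gibbs entropy of a die distribution subject to a
# fixed mean: the maximizer is p_i proportional to exp(lam * x_i), with the
# Lagrange multiplier lam fixed here by bisection on the mean constraint.
x = np.arange(1, 7, dtype=float)
target_mean = 4.5

def mean_for(lam):
    w = np.exp(lam * x)
    p = w / w.sum()
    return p @ x

lo, hi = -10.0, 10.0
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if mean_for(mid) < target_mean:
        lo = mid
    else:
        hi = mid

lam = 0.5 * (lo + hi)
p = np.exp(lam * x)
p /= p.sum()
print("lambda =", round(lam, 4))
print("maximum-entropy distribution:", np.round(p, 4), " mean =", round(p @ x, 4))
```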
Hypothesis on the nature of time
NASA Astrophysics Data System (ADS)
Coumbe, D. N.
2015-06-01
We present numerical evidence that fictitious diffusing particles in the causal dynamical triangulation (CDT) approach to quantum gravity exceed the speed of light on small distance scales. We argue this superluminal behavior is responsible for the appearance of dimensional reduction in the spectral dimension. By axiomatically enforcing a scale invariant speed of light we show that time must dilate as a function of relative scale, just as it does as a function of relative velocity. By calculating the Hausdorff dimension of CDT diffusion paths we present a seemingly equivalent dual description in terms of a scale dependent Wick rotation of the metric. Such a modification to the nature of time may also have relevance for other approaches to quantum gravity.
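As a baseline for the diffusion argument above, the sketch below estimates the spectral dimension of an ordinary flat two-dimensional torus (not a CDT geometry) from the return probability of a lazy random walk, which should come out close to 2.

```python
import numpy as np

# Lazy random walk on a flat 2-d torus (NOT a CDT geometry): the spectral
# dimension follows from the return probability, d_s = -2 dlnP/dlnsigma,
# and should come out close to 2 here.
L, T = 61, 400
p = np.zeros((L, L))
p[0, 0] = 1.0

returns, times = [], []
for sigma in range(1, T + 1):
    moved = (np.roll(p, 1, 0) + np.roll(p, -1, 0) +
             np.roll(p, 1, 1) + np.roll(p, -1, 1)) / 4.0
    p = 0.5 * p + 0.5 * moved                 # laziness removes parity oscillations
    if sigma in (100, 400):
        returns.append(p[0, 0])
        times.append(sigma)

slope = (np.log(returns[1]) - np.log(returns[0])) / (np.log(times[1]) - np.log(times[0]))
print("estimated spectral dimension:", -2.0 * slope)
```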
Proving refinement transformations using extended denotational semantics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Winter, V.L.; Boyle, J.M.
1996-04-01
TAMPR is a fully automatic transformation system based on syntactic rewrites. Our approach in a correctness proof is to map the transformation into an axiomatized mathematical domain where formal (and automated) reasoning can be performed. This mapping is accomplished via an extended denotational semantic paradigm. In this approach, the abstract notion of a program state is distributed between an environment function and a store function. Such a distribution introduces properties that go beyond the abstract state that is being modeled. The reasoning framework needs to be aware of these properties in order to successfully complete a correctness proof. This paper discusses some of our experiences in proving the correctness of TAMPR transformations.
Carstensen, C.; Feischl, M.; Page, M.; Praetorius, D.
2014-01-01
This paper aims first at a simultaneous axiomatic presentation of the proof of optimal convergence rates for adaptive finite element methods and second at some refinements of particular questions like the avoidance of (discrete) lower bounds, inexact solvers, inhomogeneous boundary data, or the use of equivalent error estimators. Solely four axioms guarantee the optimality in terms of the error estimators. Compared to the state of the art in the temporary literature, the improvements of this article can be summarized as follows: First, a general framework is presented which covers the existing literature on optimality of adaptive schemes. The abstract analysis covers linear as well as nonlinear problems and is independent of the underlying finite element or boundary element method. Second, efficiency of the error estimator is neither needed to prove convergence nor quasi-optimal convergence behavior of the error estimator. In this paper, efficiency exclusively characterizes the approximation classes involved in terms of the best-approximation error and data resolution and so the upper bound on the optimal marking parameters does not depend on the efficiency constant. Third, some general quasi-Galerkin orthogonality is not only sufficient, but also necessary for the R-linear convergence of the error estimator, which is a fundamental ingredient in the current quasi-optimality analysis due to Stevenson 2007. Finally, the general analysis allows for equivalent error estimators and inexact solvers as well as different non-homogeneous and mixed boundary conditions. PMID:25983390
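One ingredient the optimality literature cited above leans on is the bulk (Dörfler) marking step; a minimal sketch is given below with made-up error indicators, selecting a smallest set of elements whose squared indicators reach a fixed fraction of the total.

```python
import numpy as np

def doerfler_mark(eta, theta=0.5):
    """Return indices of a smallest element set M with
    sum_{T in M} eta_T^2 >= theta * sum_T eta_T^2 (bulk criterion)."""
    eta2 = np.asarray(eta, dtype=float) ** 2
    order = np.argsort(eta2)[::-1]                 # largest indicators first
    cumulative = np.cumsum(eta2[order])
    k = int(np.searchsorted(cumulative, theta * eta2.sum())) + 1
    return order[:k]

eta = np.array([0.9, 0.1, 0.4, 0.05, 0.7, 0.2])    # made-up local error indicators
print("marked elements:", doerfler_mark(eta, theta=0.6))
```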
A Quantum Probability Model of Causal Reasoning
Trueblood, Jennifer S.; Busemeyer, Jerome R.
2012-01-01
People can often outperform statistical methods and machine learning algorithms in situations that involve making inferences about the relationship between causes and effects. While people are remarkably good at causal reasoning in many situations, there are several instances where they deviate from expected responses. This paper examines three situations where judgments related to causal inference problems produce unexpected results and describes a quantum inference model based on the axiomatic principles of quantum probability theory that can explain these effects. Two of the three phenomena arise from the comparison of predictive judgments (i.e., the conditional probability of an effect given a cause) with diagnostic judgments (i.e., the conditional probability of a cause given an effect). The third phenomenon is a new finding examining order effects in predictive causal judgments. The quantum inference model uses the notion of incompatibility among different causes to account for all three phenomena. Psychologically, the model assumes that individuals adopt different points of view when thinking about different causes. The model provides good fits to the data and offers a coherent account for all three causal reasoning effects thus proving to be a viable new candidate for modeling human judgment. PMID:22593747
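A toy demonstration of the incompatibility mechanism invoked above (not the paper's fitted model): with non-commuting projectors for two yes/no judgments, the probability of affirming both depends on the order in which they are evaluated.

```python
import numpy as np

theta = np.pi / 5
Pa = np.array([[1.0, 0.0], [0.0, 0.0]])            # projector for question A
u = np.array([np.cos(theta), np.sin(theta)])
Pb = np.outer(u, u)                                # projector for question B (rotated basis)

psi = np.array([np.cos(0.9), np.sin(0.9)])         # initial belief state

# Sequential "yes, yes" probabilities in the two possible question orders.
p_ab = np.linalg.norm(Pb @ Pa @ psi) ** 2
p_ba = np.linalg.norm(Pa @ Pb @ psi) ** 2
print(f"P(yes to A, then yes to B) = {p_ab:.3f}")
print(f"P(yes to B, then yes to A) = {p_ba:.3f}")   # differs: Pa and Pb do not commute
```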
Dual pricing algorithm in ISO markets
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Neill, Richard P.; Castillo, Anya; Eldridge, Brent
The challenge to create efficient market clearing prices in centralized day-ahead electricity markets arises from inherent non-convexities in unit commitment problems. When this aspect is ignored, marginal prices may result in economic losses to market participants who are part of the welfare maximizing solution. In this essay, we present an axiomatic approach to efficient prices and cost allocation for a revenue neutral and non-confiscatory day-ahead market. Current cost allocation practices do not adequately attribute costs based on transparent cost causation criteria. Instead we propose an ex post multi-part pricing scheme, which we refer to as the Dual Pricing Algorithm. Lastly, our approach can be incorporated into current day-ahead markets without altering the market equilibrium.
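A deliberately tiny numerical illustration of the confiscation problem described above (not the Dual Pricing Algorithm itself): a committed generator with a start-up cost recovers its energy cost at the single-part marginal price but not its start-up cost, so a side payment or a multi-part price is needed for revenue adequacy.

```python
# Hypothetical numbers for a single hour with one committed, price-setting unit.
demand_mwh = 100.0
marginal_cost = 20.0      # $/MWh
startup_cost = 1000.0     # $ (the non-convex commitment cost)

price = marginal_cost                      # single-part marginal (energy-only) price
revenue = price * demand_mwh
total_cost = marginal_cost * demand_mwh + startup_cost
shortfall = total_cost - revenue           # loss borne by the welfare-maximizing unit

print(f"energy revenue: ${revenue:.0f}, total cost: ${total_cost:.0f}")
print(f"side payment needed for revenue adequacy: ${shortfall:.0f}")
```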
NASA Technical Reports Server (NTRS)
1993-01-01
Under a NASA Small Business Innovation Research (SBIR) contract, Axiomatics Corporation developed a shunting Dielectric Sensor to determine the nutrient level and analyze plant nutrient solutions in the CELSS, NASA's space life support program. (CELSS is an experimental facility investigating closed-cycle plant growth and food processing for long duration manned missions.) The DiComp system incorporates a shunt electrode and is especially sensitive to changes in the dielectric properties of materials, at levels much lower than conventional sensors can measure. The analyzer has exceptional capabilities for predicting the composition of liquid streams or reactions. It measures concentrations and solids content up to 100 percent in applications like agricultural products, petrochemicals, food and beverages. The sensor is easily installed; maintenance is low, and it can be calibrated on line. The software automates data collection and analysis.
Research and exploration of product innovative design for function
NASA Astrophysics Data System (ADS)
Wang, Donglin; Wei, Zihui; Wang, Youjiang; Tan, Runhua
2009-07-01
Product innovation is premised on realizing a new function, and the realization of the new function must resolve a contradiction. A new process model of product innovative design is proposed based on Axiomatic Design (AD) Theory and Functional Structure Analysis (FSA), with the Principle of Solving Contradiction embedded. In this model, AD Theory is employed to guide FSA and to determine the contradiction to be resolved for the principle solution. To provide powerful tool support at the principle-solution stage, the Principle of Solving Contradiction is embedded in the model, so as to strengthen the innovativeness of the principle solution. As a case study, an innovative design of a button-battery separator-paper punching machine has been achieved with the application of the proposed model.
The mechanism of impact of summative assessment on medical students’ learning
Schuwirth, Lambert W.; Adendorff, Hanelie J.; Herman, Nicoline; van der Vleuten, Cees P.
2010-01-01
It has become axiomatic that assessment impacts powerfully on student learning, but there is a surprising dearth of research on how. This study explored the mechanism of impact of summative assessment on the process of learning of theory in higher education. Individual, in-depth interviews were conducted with medical students and analyzed qualitatively. The impact of assessment on learning was mediated through various determinants of action. Respondents’ learning behaviour was influenced by: appraising the impact of assessment; appraising their learning response; their perceptions of agency; and contextual factors. This study adds to scant extant evidence and proposes a mechanism to explain this impact. It should help enhance the use of assessment as a tool to augment learning. PMID:20455078
Growth Hormone and Reproduction: A Review of Endocrine and Autocrine/Paracrine Interactions
Hull, Kerry L.; Harvey, Steve
2014-01-01
The somatotropic axis, consisting of growth hormone (GH), hepatic insulin-like growth factor I (IGF-I), and assorted releasing factors, regulates growth and body composition. Axiomatically, since optimal body composition enhances reproductive function, general somatic actions of GH modulate reproductive function. A growing body of evidence supports the hypothesis that GH also modulates reproduction directly, exerting both gonadotropin-dependent and gonadotropin-independent actions in both males and females. Moreover, recent studies indicate GH produced within reproductive tissues differs from pituitary GH in terms of secretion and action. Accordingly, GH is increasingly used as a fertility adjunct in males and females, both humans and nonhumans. This review reconsiders reproductive actions of GH in vertebrates in respect to these new conceptual developments. PMID:25580121
On S.N. Bernstein's derivation of Mendel's Law and 'rediscovery' of the Hardy-Weinberg distribution.
Stark, Alan; Seneta, Eugene
2012-04-01
Around 1923 the soon-to-be famous Soviet mathematician and probabilist Sergei N. Bernstein started to construct an axiomatic foundation of a theory of heredity. He began from the premise of stationarity (constancy of type proportions) from the first generation of offspring. This led him to derive the Mendelian coefficients of heredity. It appears that he had no direct influence on the subsequent development of population genetics. A basic assumption of Bernstein was that parents coupled randomly to produce offspring. This paper shows that a simple model of non-random mating, which nevertheless embodies a feature of the Hardy-Weinberg Law, can produce Mendelian coefficients of heredity while maintaining the population distribution. How W. Johannsen's monograph influenced Bernstein is discussed. PMID:22888285
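To make the stationarity premise concrete, the following small Python sketch (standard Hardy-Weinberg bookkeeping under random mating, not Bernstein's more general axiomatic setup or the paper's non-random-mating model) checks that the genotype proportions reached after one round of random mating reproduce themselves in the next generation; the starting proportions are arbitrary illustrative values.
```python
def next_generation(freqs):
    """Genotype frequencies (AA, Aa, aa) in offspring under random mating."""
    AA, Aa, aa = freqs
    p = AA + 0.5 * Aa        # frequency of allele A in the gamete pool
    q = aa + 0.5 * Aa        # frequency of allele a
    return (p * p, 2 * p * q, q * q)

gen0 = (0.5, 0.2, 0.3)            # arbitrary starting genotype proportions
gen1 = next_generation(gen0)      # (0.36, 0.48, 0.16) for p = 0.6
gen2 = next_generation(gen1)
print(gen1, gen2)                 # gen2 matches gen1 (up to float rounding):
                                  # stationary from the first offspring generation
```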
Finsler-type modification of the Coulomb law
NASA Astrophysics Data System (ADS)
Itin, Yakov; Lämmerzahl, Claus; Perlick, Volker
2014-12-01
Finsler geometry is a natural generalization of pseudo-Riemannian geometry. It can be motivated e.g. by a modified version of the Ehlers-Pirani-Schild axiomatic approach to space-time theory. Also, some scenarios of quantum gravity suggest a modified dispersion relation which could be phrased in terms of Finsler geometry. On a Finslerian space-time, the universality of free fall is still satisfied but local Lorentz invariance is violated in a way not covered by standard Lorentz invariance violation schemes. In this paper we consider a Finslerian modification of Maxwell's equations. The corrections to the Coulomb potential and to the hydrogen energy levels are computed. We find that the Finsler metric corrections yield a splitting of the energy levels. Experimental data provide bounds for the Finsler parameters.
A market-based approach to share water and benefits in transboundary river basins
NASA Astrophysics Data System (ADS)
Arjoon, Diane; Tilmant, Amaury; Herrmann, Markus
2016-04-01
The equitable sharing of benefits in transboundary river basins is necessary to reach a consensus on basin-wide development and management activities. Benefit sharing arrangements must be collaboratively developed to be perceived as efficient, as well as equitable, in order to be considered acceptable to all riparian countries. The current literature falls short of providing practical, institutional arrangements that ensure maximum economic welfare as well as collaboratively developed methods for encouraging the equitable sharing of benefits. In this study we define an institutional arrangement that distributes welfare in a river basin by maximizing the economic benefits of water use and then sharing these benefits in an equitable manner using a method developed through stakeholder involvement. In this methodology (i) a hydro-economic model is used to efficiently allocate scarce water resources to water users in a transboundary basin, (ii) water users are obliged to pay for water, and (iii) the total of these water charges is equitably redistributed as monetary compensation to users. The amount of monetary compensation for each water user is determined through the application of a sharing method developed by stakeholder input, based on a stakeholder vision of fairness, using an axiomatic approach. The whole system is overseen by a river basin authority. The methodology is applied to the Eastern Nile River basin as a case study. The technique ensures economic efficiency and may lead to more equitable solutions in the sharing of benefits in transboundary river basins because the definition of the sharing rule is not in question, as would be the case if existing methods, such as game theory, were applied, with their inherent definitions of fairness.
Cerebral localization in the nineteenth century--the birth of a science and its modern consequences.
Steinberg, David A
2009-07-01
Although many individuals contributed to the development of the science of cerebral localization, its conceptual framework is the work of a single man--John Hughlings Jackson (1835-1911), a Victorian physician practicing in London. Hughlings Jackson's formulation of a neurological science consisted of an axiomatic basis, an experimental methodology, and a clinical neurophysiology. His axiom--that the brain is an exclusively sensorimotor machine--separated neurology from psychiatry and established a rigorous and sophisticated structure for the brain and mind. Hughlings Jackson's experimental method utilized the focal lesion as a probe of brain function and created an evolutionary structure of somatotopic representation to explain clinical neurophysiology. His scientific theory of cerebral localization can be described as a weighted ordinal representation. Hughlings Jackson's theory of weighted ordinal representation forms the scientific basis for modern neurology. Though this science is utilized daily by every neurologist and forms the basis of neuroscience, the consequences of Hughlings Jackson's ideas are still not generally appreciated. For example, they imply the intrinsic inconsistency of some modern fields of neuroscience and neurology. Thus, "cognitive imaging" and the "neurology of art"--two topics of modern interest--are fundamentally oxymoronic according to the science of cerebral localization. Neuroscientists, therefore, still have much to learn from John Hughlings Jackson.
Mean-deviation analysis in the theory of choice.
Grechuk, Bogdan; Molyboha, Anton; Zabarankin, Michael
2012-08-01
Mean-deviation analysis, along with the existing theories of coherent risk measures and dual utility, is examined in the context of the theory of choice under uncertainty, which studies rational preference relations for random outcomes based on different sets of axioms such as transitivity, monotonicity, continuity, etc. An axiomatic foundation of the theory of coherent risk measures is obtained as a relaxation of the axioms of the dual utility theory, and a further relaxation of the axioms is shown to lead to the mean-deviation analysis. Paradoxes arising from the sets of axioms corresponding to these theories and their possible resolutions are discussed, and application of the mean-deviation analysis to optimal risk sharing and portfolio selection in the context of rational choice is considered. © 2012 Society for Risk Analysis.
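As a deliberately simple illustration of the mean-deviation family examined above (not the paper's axiomatization), the following Python sketch scores a random outcome, given as equally likely sample payoffs, by its mean penalized by a multiple of its standard deviation; the penalty weight lam and the two hypothetical payoff lists are assumptions made purely for illustration.
```python
import statistics

def mean_deviation_score(outcomes, lam=0.5):
    """Score equally likely payoffs by mean minus lam times standard deviation,
    a simple member of the mean-deviation family (higher is preferred)."""
    mu = statistics.fmean(outcomes)
    sigma = statistics.pstdev(outcomes)
    return mu - lam * sigma

# Two hypothetical portfolios with equally likely payoffs and equal means.
safe  = [1.0, 1.1, 0.9, 1.0]
risky = [2.0, 0.1, 1.9, 0.0]
print(mean_deviation_score(safe), mean_deviation_score(risky))
# Equal means, but the risky portfolio is penalized for its larger deviation.
```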
Murach, Kevin A; Bagley, James R
2016-08-01
Over the last 30+ years, it has become axiomatic that performing aerobic exercise within the same training program as resistance exercise (termed concurrent exercise training) interferes with the hypertrophic adaptations associated with resistance exercise training. However, a close examination of the literature reveals that the interference effect of concurrent exercise training on muscle growth in humans is not as compelling as previously thought. Moreover, recent studies show that, under certain conditions, concurrent exercise may augment resistance exercise-induced hypertrophy in healthy human skeletal muscle. The purpose of this article is to outline the contrary evidence for an acute and chronic interference effect of concurrent exercise on skeletal muscle growth in humans and provide practical literature-based recommendations for maximizing hypertrophy when training concurrently.
NASA Technical Reports Server (NTRS)
Onwubiko, Chin-Yere; Onyebueke, Landon
1996-01-01
Structural failure is rarely a "sudden death" type of event; such sudden failures occur only under abnormal loadings like bomb or gas explosions and very strong earthquakes. In most cases, structures fail due to damage accumulated under normal loadings such as wind loads and dead and live loads. The consequence of cumulative damage is a reduction in the reliability of the surviving components and, finally, collapse of the system. The effects of cumulative damage on system reliability under time-invariant loadings are of practical interest in structural design and will therefore be investigated in this study. The scope of this study is, however, restricted to the consideration of damage accumulation as the increase in the number of failed components due to the violation of their strength limits.
Jaton, Florian
2017-01-01
This article documents the practical efforts of a group of scientists designing an image-processing algorithm for saliency detection. By following the actors of this computer science project, the article shows that the problems often considered to be the starting points of computational models are in fact provisional results of time-consuming, collective and highly material processes that engage habits, desires, skills and values. In the project being studied, problematization processes lead to the constitution of referential databases called ‘ground truths’ that enable both the effective shaping of algorithms and the evaluation of their performances. Working as important common touchstones for research communities in image processing, the ground truths are inherited from prior problematization processes and may be imparted to subsequent ones. The ethnographic results of this study suggest two complementary analytical perspectives on algorithms: (1) an ‘axiomatic’ perspective that understands algorithms as sets of instructions designed to solve given problems computationally in the best possible way, and (2) a ‘problem-oriented’ perspective that understands algorithms as sets of instructions designed to computationally retrieve outputs designed and designated during specific problematization processes. If the axiomatic perspective on algorithms puts the emphasis on the numerical transformations of inputs into outputs, the problem-oriented perspective puts the emphasis on the definition of both inputs and outputs. PMID:28950802
[Can the materialistic concept of reproduction be terminated? (author's transl)].
Maier, W
1980-06-01
Discussions of the presently low birth rate in the Federal Republic of Germany are ideological, and concepts such as birth rate and population trend are used more and more in partisan political fights. These emotional evaluations are based largely on the materialistic value system of the national economies of the 19th century and their interlacing with demographic-political theses. As soon as the political problem of increase or decrease in the population of a state is discussed, the discussants become embroiled in unsolvable contradictions. The reason for these contradictions is the search for an optimal population size according to the economic-theoretical ideas of a classical national economy, in which the search for optimal profit is axiomatic and these ideas have immediate demographic-political consequences. The materialistic concept of reproduction can be impeded when the identification of the individual with his nation is discontinued and when it is recognized that, in a capital-intensive economy, an increase in productivity leads to an increase in consumption but not, as if by natural law, to an increase in population. Human labor has less demographic impact in a modern technological economy. Therefore, a low birth rate is a positive contribution to economic prosperity. A low birth rate is the prerequisite for the excellence of the professional and general education of the population. This type of education, in times of modern technology, assures general prosperity and peace. (author's)
From classical to quantum mechanics: ``How to translate physical ideas into mathematical language''
NASA Astrophysics Data System (ADS)
Bergeron, H.
2001-09-01
Following previous works by E. Prugovečki [Physica A 91A, 202 (1978) and Stochastic Quantum Mechanics and Quantum Space-time (Reidel, Dordrecht, 1986)] on common features of classical and quantum mechanics, we develop a unified mathematical framework for classical and quantum mechanics (based on L²-spaces over classical phase space), in order to investigate to what extent quantum mechanics can be obtained as a simple modification of classical mechanics (on both logical and analytical levels). To obtain this unified framework, we split quantum theory into two parts: (i) general quantum axiomatics (a system is described by a state in a Hilbert space, observables are self-adjoint operators, and so on) and (ii) quantum mechanics proper that specifies the Hilbert space as L²(R^n); the Heisenberg rule [p_i, q_j] = -iℏδ_ij with p = -iℏ∇, the free Hamiltonian H = -ℏ²Δ/2m and so on. We show that general quantum axiomatics (up to a supplementary "axiom of classicity") can be used as a nonstandard mathematical ground to formulate physical ideas and equations of ordinary classical statistical mechanics. So, the question of a "true quantization" with "ℏ" must be seen as an independent physical problem not directly related with quantum formalism. At this stage, we show that this nonstandard formulation of classical mechanics exhibits a new kind of operation that has no classical counterpart: this operation is related to the "quantization process," and we show why quantization physically depends on group theory (the Galilei group). This analytical procedure of quantization replaces the "correspondence principle" (or canonical quantization) and allows us to map classical mechanics into quantum mechanics, giving all operators of quantum dynamics and the Schrödinger equation. The great advantage of this point of view is that quantization is based on concrete physical arguments and not derived from some "pure algebraic rule" (we also exhibit some limits of the correspondence principle). Moreover spins for particles are naturally generated, including an approximation of their interaction with magnetic fields. We also recover by this approach the semi-classical formalism developed by E. Prugovečki [Stochastic Quantum Mechanics and Quantum Space-time (Reidel, Dordrecht, 1986)].
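For reference, the quantum-mechanical ingredients named in the abstract can be written out as follows (a standard summary in conventional notation, not a quotation from the paper):
```latex
[\hat p_i, \hat q_j] = -\,i\hbar\,\delta_{ij}, \qquad
\hat p = -\,i\hbar\nabla, \qquad
\hat H_{\mathrm{free}} = -\frac{\hbar^2}{2m}\,\Delta, \qquad
i\hbar\,\frac{\partial \psi}{\partial t} = \hat H\,\psi .
```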
NASA Astrophysics Data System (ADS)
Herrera, I.; Herrera, G. S.
2015-12-01
Most geophysical systems are macroscopic physical systems. The behavior prediction of such systems is carried out by means of computational models whose basic models are partial differential equations (PDEs) [1]. Due to the enormous size of the discretized version of such PDEs it is necessary to apply highly parallelized super-computers. For them, at present, the most efficient software is based on non-overlapping domain decomposition methods (DDM). However, a limiting feature of the present state-of-the-art techniques is due to the kind of discretizations used in them. Recently, I. Herrera and co-workers using 'non-overlapping discretizations' have produced the DVS-Software which overcomes this limitation [2]. The DVS-Software can be applied to a great variety of geophysical problems and achieves very high parallel efficiencies (90%, or so [3]). It is therefore very suitable for effectively applying the most advanced parallel supercomputers available at present. In a parallel talk in this AGU Fall Meeting, Graciela Herrera Z. will present how this software is being applied to advance MODFLOW. Key Words: Parallel Software for Geophysics, High Performance Computing, HPC, Parallel Computing, Domain Decomposition Methods (DDM). REFERENCES: [1] Herrera, Ismael and George F. Pinder, "Mathematical Modelling in Science and Engineering: An axiomatic approach", John Wiley, 243p., 2012. [2] Herrera, I., de la Cruz L.M. and Rosas-Medina A. "Non Overlapping Discretization Methods for Partial Differential Equations". NUMER METH PART D E, 30: 1427-1454, 2014, DOI 10.1002/num 21852. (Open source) [3] Herrera, I., & Contreras Iván "An Innovative Tool for Effectively Applying Highly Parallelized Software To Problems of Elasticity". Geofísica Internacional, 2015 (In press)
Celik, Metin
2009-03-01
The International Safety Management (ISM) Code defines a broad framework for the safe management and operation of merchant ships, maintaining high standards of safety and environmental protection. On the other hand, ISO 14001:2004 provides a generic, worldwide environmental management standard that has been utilized by several industries. Both the ISM Code and ISO 14001:2004 have the practical goal of establishing a sustainable Integrated Environmental Management System (IEMS) for shipping businesses. This paper presents a hybrid design methodology that shows how requirements from both standards can be combined into a single execution scheme. Specifically, the Analytic Hierarchy Process (AHP) and Fuzzy Axiomatic Design (FAD) are used to structure an IEMS for ship management companies. This research provides decision aid to maritime executives in order to enhance the environmental performance in the shipping industry.
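As background on the AHP step mentioned above, here is a hedged numpy sketch of the standard eigenvector method for turning a pairwise-comparison matrix into priority weights, with a consistency check; the 3x3 matrix, the criteria it ranks, and the figures are illustrative assumptions, not data from the paper.
```python
import numpy as np

# Hypothetical AHP pairwise-comparison matrix for three IEMS criteria
# (values on Saaty's 1-9 scale; purely illustrative).
A = np.array([[1.0,  3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])

# Priority weights = principal right eigenvector, normalized to sum to 1.
vals, vecs = np.linalg.eig(A)
w = np.real(vecs[:, np.argmax(np.real(vals))])
w = w / w.sum()
print(np.round(w, 3))                  # roughly [0.65, 0.23, 0.12]

# Consistency ratio (random index RI = 0.58 for n = 3).
lam_max = np.real(vals).max()
CI = (lam_max - 3) / (3 - 1)
print("CR =", round(CI / 0.58, 3))     # CR < 0.1 is conventionally acceptable
```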
NASA Astrophysics Data System (ADS)
Carmesin, Hans-Otto
1992-06-01
Knowing about the axiomatic aspects of mathematics, Wittgenstein asked the more fundamental question: ‘But then what does the peculiar inexorability of mathematics consist in?’. He answers the question partially by saying: ‘Then do you want to say that “being true” means: being usable (or useful)? — No, not that; but that it can't be said of the series of natural numbers — any more than of our language —that it is true, but: that it is usable, and, above all, it is used’. Here it will be demonstrated that there is another aspect ‘to be said of the series of natural numbers’, besides the mere fact that they are used or usable, namely a biological one, as has been suggested, though not explicated, by Piaget.
Natural selection can favour 'irrational' behaviour.
McNamara, J M; Trimmer, P C; Houston, A I
2014-01-01
Understanding decisions is the fundamental aim of the behavioural sciences. The theory of rational choice is based on axiomatic principles such as transitivity and independence of irrelevant alternatives (IIA). Empirical studies have demonstrated that the behaviour of humans and other animals often seems irrational; there can be a lack of transitivity in choice and seemingly irrelevant alternatives can alter decisions. These violations of transitivity and IIA undermine rational choice theory. However, we show that an individual that is maximizing its rate of food gain can exhibit failure of transitivity and IIA. We show that such violations can be caused because a current option may disappear in the near future or a better option may reappear soon. Current food options can be indicative of food availability in the near future, and this key feature can result in apparently irrational behaviour.
NASA Astrophysics Data System (ADS)
Saveliev, M. V.; Vershik, A. M.
1989-12-01
We present an axiomatic formulation of a new class of infinite-dimensional Lie algebras: generalizations of Z-graded Lie algebras with, generally speaking, an infinite-dimensional Cartan subalgebra and a contiguous set of roots. We call such algebras “continuum Lie algebras.” The simple Lie algebras of constant growth are encapsulated in our formulation. We pay particular attention to the case when the local algebra is parametrized by a commutative algebra while the Cartan operator (the generalization of the Cartan matrix) is a linear operator. Special examples of these algebras are the Kac-Moody algebras, algebras of Poisson brackets, algebras of vector fields on a manifold, current algebras, and algebras with a differential or integro-differential Cartan operator. The nonlinear dynamical systems associated with the continuum contragredient Lie algebras are also considered.
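For orientation, the finite-dimensional contragredient relations that this construction generalizes, with the Cartan matrix (a_{ij}) promoted to a Cartan operator, read as follows (standard background, stated here in one common convention, not a quotation from the paper):
```latex
[h_i, h_j] = 0, \qquad
[h_i, e_j] = a_{ij}\, e_j, \qquad
[h_i, f_j] = -\,a_{ij}\, f_j, \qquad
[e_i, f_j] = \delta_{ij}\, h_i .
```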
Gleason-Busch theorem for sequential measurements
NASA Astrophysics Data System (ADS)
Flatt, Kieran; Barnett, Stephen M.; Croke, Sarah
2017-12-01
Gleason's theorem is a statement that, given some reasonable assumptions, the Born rule used to calculate probabilities in quantum mechanics is essentially unique [A. M. Gleason, Indiana Univ. Math. J. 6, 885 (1957), 10.1512/iumj.1957.6.56050]. We show that Gleason's theorem contains within it also the structure of sequential measurements, and along with this the state update rule. We give a small set of axioms, which are physically motivated and analogous to those in Busch's proof of Gleason's theorem [P. Busch, Phys. Rev. Lett. 91, 120403 (2003), 10.1103/PhysRevLett.91.120403], from which the familiar Kraus operator form follows. An axiomatic approach has practical relevance as well as fundamental interest, in making clear those assumptions which underlie the security of quantum communication protocols. Interestingly, the two-time formalism is seen to arise naturally in this approach.
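As a concrete reminder of the Kraus-operator state-update rule that the sequential-measurement argument recovers, here is a minimal numpy sketch (a generic textbook computation; the amplitude-damping Kraus pair and the damping parameter g are illustrative assumptions, not taken from the paper):
```python
import numpy as np

def measure(rho, kraus_ops):
    """Return outcome probabilities and post-measurement states
    for a measurement described by Kraus operators {K_i}."""
    results = []
    for K in kraus_ops:
        p = np.trace(K @ rho @ K.conj().T).real
        post = (K @ rho @ K.conj().T) / p if p > 0 else None
        results.append((p, post))
    return results

# Illustrative example: amplitude-damping Kraus pair acting on |+><+|.
g = 0.3  # damping probability (hypothetical)
K0 = np.array([[1, 0], [0, np.sqrt(1 - g)]], dtype=complex)
K1 = np.array([[0, np.sqrt(g)], [0, 0]], dtype=complex)
plus = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)  # |+><+|

for p, post in measure(plus, [K0, K1]):
    print(f"p = {p:.3f}")   # probabilities sum to 1 since K0†K0 + K1†K1 = I
```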
Jarzynski equality in the context of maximum path entropy
NASA Astrophysics Data System (ADS)
González, Diego; Davis, Sergio
2017-06-01
In the global framework of finding an axiomatic derivation of nonequilibrium Statistical Mechanics from fundamental principles, such as the maximum path entropy (also known as the Maximum Caliber principle), this work proposes an alternative derivation of the well-known Jarzynski equality, a nonequilibrium identity of great importance today due to its applications to irreversible processes: biological systems (protein folding), mechanical systems, among others. This equality relates the free energy differences between two equilibrium thermodynamic states with the work performed when going between those states, through an average over a path ensemble. In this work the analysis of Jarzynski's equality will be performed using the formalism of inference over path space. This derivation highlights the wide generality of Jarzynski's original result, which could even be used in non-thermodynamical settings such as social, financial and ecological systems.
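The identity in question has the standard form below, where β is the inverse temperature, W the work performed along a single realization, ΔF the equilibrium free-energy difference, and the angle brackets denote the average over the path ensemble:
```latex
\left\langle e^{-\beta W} \right\rangle = e^{-\beta \Delta F}.
```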
Toward a metric for patterned injury analysis
NASA Astrophysics Data System (ADS)
Oliver, William R.; Fritsch, Daniel S.
1997-02-01
An intriguing question in the matching of objects with patterned injuries in two and three dimensions is that of an appropriate metric for closeness: is it possible to objectively measure how well an object 'fits' a patterned injury? Many investigators have suggested an energy-based metric, and have used such metrics to analyze craniofacial growth and anatomic variation. A strict dependence on homology is the primary disadvantage of this energy functional for generalized biological structures; many shapes do not have obvious landmarks. Some tentative solutions to the problem of landmark dependency for patterned injury analysis are presented. One intriguing approach comes from recent work in axiomatic vision. This approach has resulted in the development of a multiresolution medial axis for the extraction of shape primitives which can be used as the basis for registration. A scale-based description of this process can be captured in structures called cores, which can describe object shape and position in a highly compact manner. Cores may provide a scale- and shape-based method of determining the correspondences necessary for determining the number and position of landmarks for some patterned injuries. Each of the approaches described is generalizable to higher dimensions, and can thus be used to analyze both two- and three-dimensional data. Together, they may represent a reasonable way of measuring shape distance for the purpose of matching objects and wounds, and can be combined with texture measures for a complete description.
Energy accounting and optimization for mobile systems
NASA Astrophysics Data System (ADS)
Dong, Mian
Energy accounting determines how much a software process contributes to the total system energy consumption. It is the foundation for evaluating software and has been widely used by operating-system-based energy management. While various energy accounting policies have been tried, there is no known way to evaluate them directly, simply because it is hard to track every hardware use by software in a heterogeneous multi-core system like modern smartphones and tablets. In this thesis, we provide the ground truth for energy accounting based on multi-player game theory and offer the first evaluation of existing energy accounting policies, revealing their important flaws. The proposed ground truth is based on the Shapley value, a single-value solution to multi-player games whose four axiomatic properties are natural and self-evident for energy accounting. To obtain the Shapley value-based ground truth, one only needs to know if a process is active during the time under question and the system energy consumption during the same time. We further provide a utility optimization formulation of energy management and show, surprisingly, that energy accounting does not matter for existing energy management solutions that control the energy use of a process by giving it an energy budget, or budget-based energy management (BEM). We show that an optimal energy management (OEM) framework can always outperform BEM. While OEM does not require any form of energy accounting, it is related to the Shapley value in that both require the system energy consumption for all possible combinations of processes under question. We provide a novel system solution that meets this requirement by acquiring system energy consumption in situ for an OS scheduler period, i.e., 10 ms. We report a prototype implementation of both Shapley value-based energy accounting and OEM-based scheduling. Using this prototype and smartphone workloads, we experimentally demonstrate how erroneous existing energy accounting policies can be, and show that existing BEM solutions are unnecessarily complicated yet underperform by 20% compared to OEM.
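For readers unfamiliar with the Shapley value invoked above as the ground truth, the following hedged Python sketch computes per-process attributions from the measured system energy of every subset of active processes; the shapley() helper, the process names and the energy numbers are illustrative assumptions, not data from the thesis.
```python
from itertools import combinations
from math import factorial

def shapley(players, energy_of):
    """Shapley attribution: energy_of(frozenset of active players) -> joules."""
    n = len(players)
    phi = {p: 0.0 for p in players}
    for p in players:
        others = [q for q in players if q != p]
        for k in range(len(others) + 1):
            for S in combinations(others, k):
                S = frozenset(S)
                w = factorial(k) * factorial(n - k - 1) / factorial(n)
                phi[p] += w * (energy_of(S | {p}) - energy_of(S))
    return phi

# Hypothetical measured system energy (J) for each subset of active processes.
measured = {
    frozenset(): 1.0,                      # idle baseline
    frozenset({"gps"}): 3.0,
    frozenset({"video"}): 4.0,
    frozenset({"gps", "video"}): 5.5,      # shared-hardware savings
}
print(shapley(["gps", "video"], lambda S: measured[frozenset(S)]))
# {'gps': 1.75, 'video': 2.75}
```
By construction the attributions sum to the measured energy of the full set minus the idle baseline, which is the efficiency property that makes the Shapley value a natural ground truth for accounting.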
[Taxonomic theory for non-classical systematics].
Pavlinov, I Ia
2012-01-01
Outlined briefly are basic principles of construing general taxonomic theory for biological systematics considered in the context of non-classical scientific paradigm. The necessity of such kind of theory is substantiated, and some key points of its elaboration are exposed: its interpretation as a framework concept for the partial taxonomic theories in various schools of systematics; elaboration of idea of cognitive situation including three interrelated components, namely subject, object, and epistemic ones; its construing as a content-wisely interpreted quasi-axiomatics, with strong structuring of its conceptual space including demarcation between axioms and inferring rules; its construing as a "conceptual pyramid" of concepts of various levels of generality; inclusion of a basic model into definition of the taxonomic system (classification) regulating its content. Two problems are indicated as fundamental: definition of taxonomic diversity as a subject domain for the systematics as a whole; definition of onto-epistemological status of taxonomic system (classification) in general and of taxa in particular.
[Regeneration of the ciliary beat of human ciliated cells].
Wolf, G; Koidl, B; Pelzmann, B
1991-10-01
The influence of an isotonic, alkaline saline solution (diluted "Emser Sole" or brine from the spa of Bad Ems) on the ciliary beat of isolated cultured human ciliated cells of the upper respiratory tract was investigated. The ciliary beat was observed via an inverted phase-contrast microscope (Zeiss Axiomat IDPC) and measured microphotometrically under physiological conditions and after the damaging influence of 1% propanal solution. Under physiological conditions the saline solution had a positive, although statistically not significant, influence on the frequency of the ciliary beat. After damage to the cultivated cells by 1% propanal solution, the saline solution had a significantly better influence on the regeneration of the cultured cells than a physiological sodium chloride solution. It is concluded that diluted brine from Bad Ems has a positive effect on the ciliary beat of the respiratory epithelium and accelerates its regeneration after damage by viral and bacterial infections, surgery or inhaled noxae.
Ontology-Driven Business Modelling: Improving the Conceptual Representation of the REA Ontology
NASA Astrophysics Data System (ADS)
Gailly, Frederik; Poels, Geert
Business modelling research is increasingly interested in exploring how domain ontologies can be used as reference models for business models. The Resource Event Agent (REA) ontology is a primary candidate for ontology-driven modelling of business processes because the REA point of view on business reality is close to the conceptual modelling perspective on business models. In this paper Ontology Engineering principles are employed to reengineer REA in order to make it more suitable for ontology-driven business modelling. The new conceptual representation of REA that we propose uses a single representation formalism, includes a more complete domain axiomatization (containing definitions of concepts, concept relations and ontological axioms), and is proposed as a generic model that can be instantiated to create valid business models. The effects of these proposed improvements on REA-driven business modelling are demonstrated using a business modelling example.
On the Boltzmann-Grad Limit for Smooth Hard-Sphere Systems
NASA Astrophysics Data System (ADS)
Tessarotto, Massimo; Cremaschini, Claudio; Mond, Michael; Asci, Claudio; Soranzo, Alessandro; Tironi, Gino
2018-03-01
The problem is posed of the prescription of the so-called Boltzmann-Grad limit operator (L_{BG}) for the N-body system of smooth hard spheres which undergo unary, binary as well as multiple elastic instantaneous collisions. It is proved that, despite the non-commutative property of the operator L_{BG}, the Boltzmann equation can nevertheless be uniquely determined. In particular, consistent with the claim of Uffink and Valente (Found Phys 45:404, 2015) that there is "no time-asymmetric ingredient" in its derivation, the Boltzmann equation is shown to be time-reversal symmetric. The proof is based on the "ab initio" axiomatic approach to classical statistical mechanics recently developed (Tessarotto et al. in Eur Phys J Plus 128:32, 2013). Implications relevant for the physical interpretation of the Boltzmann H-theorem and the phenomenon of decay to kinetic equilibrium are pointed out.
Dead simple OWL design patterns
DOE Office of Scientific and Technical Information (OSTI.GOV)
Osumi-Sutherland, David; Courtot, Melanie; Balhoff, James P.
2017-06-05
Bio-ontologies typically require multiple axes of classification to support the needs of their users. Development of such ontologies can only be made scalable and sustainable by the use of inference to automate classification via consistent patterns of axiomatization. Many bio-ontologies originating in OBO or OWL follow this approach. These patterns need to be documented in a form that requires minimal expertise to understand and edit and that can be validated and applied using any of the various programmatic approaches to working with OWL ontologies. We describe a system, Dead Simple OWL Design Patterns (DOS-DPs), which fulfills these requirements, illustrating the system with examples from the Gene Ontology. In conclusion, the rapid adoption of DOS-DPs by multiple ontology development projects illustrates both the ease of use and the pressing need for the simple design pattern system we have developed.
Changing the Direction of Suicide Prevention in the United States.
Reidenberg, Dan; Berman, Alan L
2017-08-01
It is axiomatic that the goal of suicide prevention is the prevention of suicide. Yet in spite of significant efforts to this end since the middle of the last century, and most notably in the last decade, the rate of suicide in the U.S. has not declined; rather, it has increased. To address this issue, Suicide Awareness Voices of Education (SAVE) brought together leading prevention specialists from other public health problems where successes have been achieved, representatives from countries where suicide rates have declined, and U.S. based suicide prevention researchers and program directors, to "think outside the box" and propose innovative, scalable approaches that might better drive success in achieving desired results from U.S. suicide prevention efforts. The recommendations should challenge our preconceptions and force us outside our own mental constraints to broaden our perspectives and suggest catalysts for real change in suicide prevention. © 2016 The American Association of Suicidology.
Electrodynamics and Spacetime Geometry: Foundations
NASA Astrophysics Data System (ADS)
Cabral, Francisco; Lobo, Francisco S. N.
2017-02-01
We explore the intimate connection between spacetime geometry and electrodynamics. This link is already implicit in the constitutive relations between the field strengths and excitations, which are an essential part of the axiomatic structure of electromagnetism, clearly formulated via integration theory and differential forms. We review the foundations of classical electromagnetism based on charge and magnetic flux conservation, the Lorentz force and the constitutive relations. These relations introduce the conformal part of the metric and allow the study of electrodynamics for specific spacetime geometries. At the foundational level, we discuss the possibility of generalizing the vacuum constitutive relations, by relaxing the fixed conditions of homogeneity and isotropy, and by assuming that the symmetry properties of the electro-vacuum follow the spacetime isometries. The implications of this extension are briefly discussed in the context of the intimate connection between electromagnetism and the geometry (and causal structure) of spacetime.
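In the differential-form language referred to above, the axiomatic core can be summarized as follows (a standard premetric formulation given here as background in one common convention, with λ₀ the vacuum admittance and ⋆ the Hodge dual supplied by the conformal part of the metric):
```latex
dF = 0 \ \ (\text{magnetic flux conservation}), \qquad
dH = J \ \ (\text{charge conservation}), \qquad
H = \lambda_0 \star F \ \ (\text{vacuum constitutive relation}).
```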
Feeling our way in the dark: the psychiatric nursing care of suicidal people--a literature review.
Cutcliffe, John R; Stevenson, Chris
2008-06-01
Psychiatric/Mental Health nurses have a long history of being front-line carers of suicidal people, and yet the international epidemiological literature, methodological problems notwithstanding, suggests that contemporary care practices for suicidal people have much room for improvement. As a result, this paper focuses on several areas/issues of care of the suicidal person, and in so doing, critiques the extant literature, such as it is. This critique illustrates that there is a disconcerting lack of empirically induced theory to guide practice and even less empirical evidence to support-specific interventions. The paper concludes, accepting the axiomatic complexity and multi-dimensionality of suicide, and the undeniable fact that suicide is a human drama, played out in the everyday lives of people, that for Psychiatric/Mental Health nurses, caring for suicidal people must be an interpersonal endeavor; and one personified by talking and listening.
Diamond and diamond-like carbon MEMS
NASA Astrophysics Data System (ADS)
Luo, J. K.; Fu, Y. Q.; Le, H. R.; Williams, J. A.; Spearing, S. M.; Milne, W. I.
2007-07-01
To generate complex cartilage/bone tissues, scaffolds must possess several structural features that are difficult to create using conventional scaffold design/fabrication technologies. Successful cartilage/bone regeneration depends on the ability to assemble chondrocytes/osteoblasts into three-dimensional (3D) scaffolds. Therefore, we developed a 3D scaffold fabrication system that applies the axiomatic approach to our microstereolithography system. The new system offers a reduced machine size by minimizing the optical components, and shows that the design matrix is decoupled. This analysis identified the key factors affecting microstructure fabrication and an improved scaffold fabrication system was constructed. The results demonstrate that precise, predesigned 3D structures can be fabricated. Using this 3D scaffold, cell adhesion behavior was observed. The use of 3D scaffolds might help determine key factors in the study of cell behavior in complex environments and could eventually lead to the optimal design of scaffolds for the regeneration of various tissues, such as cartilage and bone.
Budiyono, Agung; Rohrlich, Daniel
2017-11-03
Where does quantum mechanics part ways with classical mechanics? How does quantum randomness differ fundamentally from classical randomness? We cannot fully explain how the theories differ until we can derive them within a single axiomatic framework, allowing an unambiguous account of how one theory is the limit of the other. Here we derive non-relativistic quantum mechanics and classical statistical mechanics within a common framework. The common axioms include conservation of average energy and conservation of probability current. But two axioms distinguish quantum mechanics from classical statistical mechanics: an "ontic extension" defines a nonseparable (global) random variable that generates physical correlations, and an "epistemic restriction" constrains allowed phase space distributions. The ontic extension and epistemic restriction, with strength on the order of Planck's constant, imply quantum entanglement and uncertainty relations. This framework suggests that the wave function is epistemic, yet it does not provide an ontic dynamics for individual systems.
A Policy Language for Modelling Recommendations
NASA Astrophysics Data System (ADS)
Abou El Kalam, Anas; Balbiani, Philippe
While current and emergent applications become more and more complex, most existing security policies and models consider only a yes/no response to access requests. Consequently, modelling, formalizing and implementing permissions, obligations and prohibitions does not cover the richness of all possible scenarios. In fact, several applications have access rules with the recommendation access modality. In this paper we focus on the problem of formalizing security policies with recommendation needs. The aim is to provide a generic, domain-independent formal system for modelling not only permissions, prohibitions and obligations, but also recommendations. In this respect, we present our logic-based language, its semantics, the truth conditions, our axiomatics as well as our inference rules. We also give a representative use case with our specification of recommendation requirements. Finally, we explain how our logical framework could be used to query the security policy and to check its consistency.
p-adic stochastic hidden variable model
NASA Astrophysics Data System (ADS)
Khrennikov, Andrew
1998-03-01
We propose a stochastic hidden variable model in which the hidden variables have a p-adic probability distribution ρ(λ) and, at the same time, the conditional probability distributions P(U,λ), U=A,A',B,B', are ordinary probabilities defined on the basis of the Kolmogorov measure-theoretical axiomatics. A frequency definition of p-adic probability is quite similar to the ordinary frequency definition of probability. p-adic frequency probability is defined as the limit of relative frequencies ν_n, but in the p-adic metric. We study a model with p-adic stochastics on the level of the hidden-variable description. But, of course, responses of macroapparatuses have to be described by ordinary stochastics. Thus our model describes a mixture of p-adic stochastics of the microworld and ordinary stochastics of macroapparatuses. In this model probabilities for physical observables are the ordinary probabilities. At the same time Bell's inequality is violated.
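To illustrate what a limit "in the p-adic metric" means, here is a small self-contained Python sketch of the p-adic norm of a rational number; the choice p = 2 and the sample sequence are illustrative only and are not taken from the paper.
```python
from fractions import Fraction

def padic_norm(x, p):
    """p-adic norm |x|_p = p**(-v_p(x)) of a rational x (norm of 0 is 0)."""
    x = Fraction(x)
    if x == 0:
        return 0.0
    v = 0
    num, den = x.numerator, x.denominator
    while num % p == 0:
        num //= p
        v += 1
    while den % p == 0:
        den //= p
        v -= 1
    return float(p) ** (-v)

# The sequence 2**k blows up in the usual metric but tends to 0 2-adically.
for k in (1, 3, 5, 10):
    print(k, padic_norm(2**k, 2))   # 0.5, 0.125, 0.03125, ~0.00098
```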
Ensembles and Experiments in Classical and Quantum Physics
NASA Astrophysics Data System (ADS)
Neumaier, Arnold
A philosophically consistent axiomatic approach to classical and quantum mechanics is given. The approach realizes a strong formal implementation of Bohr's correspondence principle. In all instances, classical and quantum concepts are fully parallel: the same general theory has a classical realization and a quantum realization. Extending the "probability via expectation" approach of Whittle to noncommuting quantities, this paper defines quantities, ensembles, and experiments as mathematical concepts and shows how to model complementarity, uncertainty, probability, nonlocality and dynamics in these terms. The approach carries no connotation of unlimited repeatability; hence it can be applied to unique systems such as the universe. Consistent experiments provide an elegant solution to the reality problem, confirming the insistence of the orthodox Copenhagen interpretation that there is nothing but ensembles, while avoiding its elusive reality picture. The weak law of large numbers explains the emergence of classical properties for macroscopic systems.
Psychophysics of the probability weighting function
NASA Astrophysics Data System (ADS)
Takahashi, Taiki
2011-03-01
A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics have widely utilized probability weighting functions, the psychophysical foundations of these functions have remained unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^α) (0 < α < 1), which satisfies w(0) = 0 and w(1) = 1.
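A short evaluation of the Prelec form in Python (with an illustrative α = 0.65, not a value from the paper) shows the characteristic overweighting of small probabilities and underweighting of large ones:
```python
import math

def prelec_w(p, alpha=0.65):
    """Prelec (1998) one-parameter probability weighting function."""
    if p <= 0.0:
        return 0.0
    return math.exp(-((-math.log(p)) ** alpha))

for p in (0.01, 0.1, 0.5, 0.9, 0.99):
    print(p, round(prelec_w(p), 3))
# Small p are overweighted (w(p) > p), large p underweighted (w(p) < p).
```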
Pointwise Partial Information Decomposition Using the Specificity and Ambiguity Lattices
NASA Astrophysics Data System (ADS)
Finn, Conor; Lizier, Joseph
2018-04-01
What are the distinct ways in which a set of predictor variables can provide information about a target variable? When does a variable provide unique information, when do variables share redundant information, and when do variables combine synergistically to provide complementary information? The redundancy lattice from the partial information decomposition of Williams and Beer provided a promising glimpse at the answer to these questions. However, this structure was constructed using a much criticised measure of redundant information, and despite sustained research, no completely satisfactory replacement measure has been proposed. In this paper, we take a different approach, applying the axiomatic derivation of the redundancy lattice to a single realisation from a set of discrete variables. To overcome the difficulty associated with signed pointwise mutual information, we apply this decomposition separately to the unsigned entropic components of pointwise mutual information which we refer to as the specificity and ambiguity. This yields a separate redundancy lattice for each component. Then based upon an operational interpretation of redundancy, we define measures of redundant specificity and ambiguity enabling us to evaluate the partial information atoms in each lattice. These atoms can be recombined to yield the sought-after multivariate information decomposition. We apply this framework to canonical examples from the literature and discuss the results and the various properties of the decomposition. In particular, the pointwise decomposition using specificity and ambiguity satisfies a chain rule over target variables, which provides new insights into the so-called two-bit-copy example.
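As a reminder of the pointwise quantities being decomposed, the hedged Python sketch below computes, for one realisation (s, t) of a small discrete joint distribution, the pointwise mutual information together with its two unsigned entropic parts as described above: the specificity, taken here as -log p(s), and the ambiguity, taken here as -log p(s|t). The joint distribution is an illustrative assumption, and the sketch shows only this entropic split, not the full lattice construction.
```python
import math

# Illustrative joint distribution p(s, t) over a source s and target t.
p_joint = {
    ("s0", "t0"): 0.4, ("s0", "t1"): 0.1,
    ("s1", "t0"): 0.1, ("s1", "t1"): 0.4,
}

def marginal_s(s):
    return sum(v for (si, _), v in p_joint.items() if si == s)

def conditional_s_given_t(s, t):
    p_t = sum(v for (_, ti), v in p_joint.items() if ti == t)
    return p_joint[(s, t)] / p_t

def pointwise_terms(s, t):
    """Return (specificity, ambiguity, pointwise mutual information) in bits."""
    spec = -math.log2(marginal_s(s))                 # h(s)
    amb = -math.log2(conditional_s_given_t(s, t))    # h(s|t)
    return spec, amb, spec - amb

print(pointwise_terms("s0", "t0"))   # positive i: the realisation is informative
print(pointwise_terms("s0", "t1"))   # negative i: the realisation is misinformative
```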
Skolem and pessimism about proof in mathematics.
Cohen, Paul J
2005-10-15
Attitudes towards formalization and proof have gone through large swings during the last 150 years. We sketch the development from Frege's first formalization, to the debates over intuitionism and other schools, through Hilbert's program and the decisive blow of the Gödel Incompleteness Theorem. A critical role is played by the Skolem-Lowenheim Theorem, which showed that no first-order axiom system can characterize a unique infinite model. Skolem himself regarded this as a body blow to the belief that mathematics can be reliably founded only on formal axiomatic systems. In a remarkably prescient paper, he even sketches the possibility of interesting new models for set theory itself, something later realized by the method of forcing. This is in contrast to Hilbert's belief that mathematics could resolve all its questions. We discuss the role of new axioms for set theory, questions in set theory itself, and their relevance for number theory. We then look in detail at what the methods of the predicate calculus, i.e. mathematical reasoning, really entail. The conclusion is that there is no reasonable basis for Hilbert's assumption. The vast majority of questions even in elementary number theory, of reasonable complexity, are beyond the reach of any such reasoning. Of course this cannot be proved and we present only plausibility arguments. The great success of mathematics comes from considering 'natural problems', those which are related to previous work and offer a good chance of being solved. The great glories of human reasoning, beginning with the Greek discovery of geometry, are in no way diminished by this pessimistic view. We end by wishing good health to present-day mathematics and the mathematics of many centuries to come.
Science of the conscious mind.
Ascoli, Giorgio A; Samsonovich, Alexei V
2008-12-01
Human beings have direct access to their own mental states, but can only indirectly observe cosmic radiation and enzyme kinetics. Why then can we measure the temperature of far away galaxies and the activation constant of kinases to the third digit, yet we only gauge our happiness on a scale from 1 to 7? Here we propose a radical research paradigm shift to embrace the subjective conscious mind into the realm of objective empirical science. Key steps are the axiomatic acceptance of first-person experiences as scientific observables; the definition of a quantitative, reliable metric system based on natural language; and the careful distinction of subjective mental states (e.g., interpretation and intent) from physically measurable sensory and motor behaviors (input and output). Using this approach, we propose a series of reproducible experiments that may help define a still largely unexplored branch of science. We speculate that the development of this new discipline will be initially parallel to, and eventually converging with, neurobiology and physics.
A hierarchically distributed architecture for fault isolation expert systems on the space station
NASA Technical Reports Server (NTRS)
Miksell, Steve; Coffer, Sue
1987-01-01
The Space Station Axiomatic Fault Isolating Expert Systems (SAFTIES) system deals with the hierarchical distribution of control and knowledge among independent expert systems performing fault isolation and scheduling of Space Station subsystems. On the lower level, fault isolation is performed on individual subsystems. These fault isolation expert systems contain knowledge about the performance requirements of their particular subsystem and corrective procedures which may be invoked in response to certain performance errors. They can control the functions of equipment in their system and coordinate system task schedules. On a higher level, the Executive contains knowledge of all resources, task schedules for all systems, and the relative priority of all resources and tasks. The Executive can override any subsystem task schedule in order to resolve use conflicts or resolve errors that require resources from multiple subsystems. Interprocessor communication is implemented using the SAFTIES Communications Interface (SCI). The SCI is an application-layer protocol which supports the SAFTIES distributed multi-level architecture.
Lowe, Kimberly A
2015-10-01
The International Committee of the Red Cross (ICRC) is today a staunch proponent of the need for humanitarian organisations to remain independent of state interests, yet it deliberately solicited intergovernmental intervention in international relief after the First World War of 1914-18. This paper examines why an organisation committed to upholding the independence and impartiality of humanitarian action might still choose to partner with governmental bodies. It also highlights the historical beginnings of a linkage between international aid and geopolitics. To secure governmental funding for refugee relief during the 1920s, the ICRC argued that the humanitarian crises of the post-war years were a threat to the political and social stability of Europe. While this has become axiomatic, the interwar history of the ICRC demonstrates that the perceived connection between relief and geopolitical stability is historically constructed, and that it must continue to be asserted persuasively to be effective. © 2015 The Author(s). Disasters © Overseas Development Institute, 2015.
Future states: the axioms underlying prospective, future-oriented, health planning instruments.
Koch, T
2001-02-01
Prospective planning exercises are critical to and generally accepted as integral to health planning at varying scales. These require specific instruments designed to predict future actions on the basis of present knowledge. At the macro level of health economics, for example, a number of future-oriented Quality of Life (QL) instruments are commonly employed. At the level of individual decision making, on the other hand, Advance Directives (ADs) are advanced as a means by which healthy individuals can assure that their wishes will be carried out if at some future point they are incapacitated. As prospective tools, both instrument classes appear to share an axiomatic set whose individual parts have not been rigorously considered. This paper attempts first to identify and then to consider a set of five axioms underlying future-oriented health planning instruments. These axioms are then critiqued using data from a pre-test survey designed specifically to address their assumptions. Results appear to challenge the validity of the axioms underlying the prospective planning instruments.
From General Game Descriptions to a Market Specification Language for General Trading Agents
NASA Astrophysics Data System (ADS)
Thielscher, Michael; Zhang, Dongmo
The idea behind General Game Playing is to build systems that, instead of being programmed for one specific task, are intelligent and flexible enough to negotiate an unknown environment solely on the basis of the rules which govern it. In this paper, we argue that this principle has great potential to bring artificially intelligent systems in other application areas to a new level as well. Our specific interest lies in General Trading Agents, which are able to understand the rules of unknown markets and then to participate in them actively without human intervention. To this end, we extend the general Game Description Language into a language that allows arbitrary markets to be formally described in such a way that these specifications can be automatically processed by a computer. We present both the syntax and a transition-based semantics for this Market Specification Language and illustrate its expressive power by presenting axiomatizations of several well-known auction types.
Learning Physics from the Real World by Direct Observation
NASA Astrophysics Data System (ADS)
Shaibani, Saami J.
2012-03-01
It is axiomatic that hands-on experience provides many learning opportunities, which lectures and textbooks cannot match. Moreover, experiments involving the real world are beneficial in helping students to gain a level of understanding that they might not otherwise achieve. One practical limitation with the real world is that simplifications and approximations are sometimes necessary to make the material accessible; however, these types of adjustments can be viewed with misgiving when they appear arbitrary and/or convenience-based. The present work describes a very familiar feature of everyday life, whose underlying physics is examined without modifications to mitigate difficulties from the lack of control in a non-laboratory environment. In the absence of any immediate formula to process results, students are encouraged to reach ab initio answers with guidance provided by a structured series of worksheets. Many of the latter can be completed as homework assignments prior to activity in the field. This approach promotes thinking and inquiry as valuable attributes instead of unquestioningly following a prescribed path.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nikitin, N. V., E-mail: nnikit@mail.cern.ch; Sotnikov, V.P., E-mail: sotnikov@physics.msu.ru; Toms, K. S., E-mail: ktoms@mail.cern.ch
A radically new class of Bell inequalities in Wigner's form was obtained on the basis of Kolmogorov's axiomatization of probability theory and the hypothesis of locality. These inequalities take explicitly into account the dependence on time (time-dependent Bell inequalities in Wigner's form). By using these inequalities, one can propose a means for experimentally testing Bohr's complementarity principle in the relativistic region. The inequalities in question open broad possibilities for studying correlations of nonrelativistic and relativistic quantum systems in external fields. The violation of the time-dependent inequalities in quantum mechanics was studied by considering the behavior of a pair of anticorrelated spins in a constant external magnetic field and oscillations of neutral pseudoscalar mesons. The decay of a pseudoscalar particle to a fermion–antifermion pair is considered within quantum field theory. In order to test experimentally the inequalities proposed in the present study, it is not necessary to perform dedicated noninvasive measurements required in the Leggett–Garg approach, for example.
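For orientation, the standard (time-independent) Wigner form of the Bell inequality for a pair of perfectly anticorrelated spin-1/2 particles is recalled below; the time-dependent class described in the abstract adds explicit measurement times, indicated here only schematically, not in the paper's exact notation.

```latex
% Standard (time-independent) Wigner form of the Bell inequality for a pair of
% perfectly anticorrelated spin-1/2 particles; w(a+, b+) is the joint probability
% of "+" outcomes along directions a and b on the two particles.
\[
  w(\mathbf{a}+,\mathbf{b}+) \;\le\; w(\mathbf{a}+,\mathbf{c}+) + w(\mathbf{c}+,\mathbf{b}+) .
\]
% The time-dependent version described in the abstract carries explicit time
% arguments, schematically w(a+, t_a; b+, t_b), so that the constraint is
% imposed at chosen measurement times (a sketch of the idea only).
```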
Set-Theoretic Analysis of Ethical Systems for Off-Planet Future Engagement with Living Organisms
NASA Astrophysics Data System (ADS)
Helman, Daniel S.
2016-10-01
Living organisms are a conundrum. Their origin and provenance are open questions. An operational definition for their detection has been settled upon for practical reasons, i.e. in order to plan mission goals. The spirit of such undertakings is typically noble, and yet the question clearly arises of how humanity will engage with other living organisms. Prudence demands a pre-contact appraisal of ethical requirements towards other living organisms. To answer this question, an analogy with the number line in mathematics (integers versus the set of real numbers) will be presented to explore the structure of finite versus open-ended hierarchies. In this, the architecture of set theory will be used as a basis to describe the validity of systems hierarchies in general. Note that how numbers populate sets follows distinct rules when the elements of the sets or the sets themselves are unbounded. Principles of axiomatic versus observed conclusions will be emphasized. Results from mathematics will be used to inform analysis and dilemmas in ethical systems.
A legal market in organs: the problem of exploitation.
Greasley, Kate
2014-01-01
The article considers the objection to a commercial market in living donor organs for transplantation on the ground that such a market would be exploitative of the vendors. It examines a key challenge to that objection, to the effect that denying poor people the option to sell an organ is to withhold from them the best that a bad situation has to offer. The article casts serious doubt on this attempt at justifying an organ market, and its philosophical underpinning. Drawing, in part, from the catalogued consequences of a thriving kidney market in some parts of India, it is argued that the justification relies on conditions which are extremely unlikely to obtain, even in a regulated donor market: that organ selling meaningfully improves the material situation of the organ vendor. Far from being axiomatic, both logic and the extant empirical evidence point towards the unlikelihood of such an upshot. Finally, the article considers a few conventional counter-arguments in favour of a permissive stance on organ sales.
A Process Algebraic Approach to Software Architecture Design
NASA Astrophysics Data System (ADS)
Aldini, Alessandro; Bernardo, Marco; Corradini, Flavio
Process algebra is a formal tool for the specification and the verification of concurrent and distributed systems. It supports compositional modeling through a set of operators able to express concepts like sequential composition, alternative composition, and parallel composition of action-based descriptions. It also supports mathematical reasoning via a two-level semantics, which formalizes the behavior of a description by means of an abstract machine obtained from the application of structural operational rules and then introduces behavioral equivalences able to relate descriptions that are syntactically different. In this chapter, we present the typical behavioral operators and operational semantic rules for a process calculus in which no notion of time, probability, or priority is associated with actions. Then, we discuss the three most studied approaches to the definition of behavioral equivalences - bisimulation, testing, and trace - and we illustrate their congruence properties, sound and complete axiomatizations, modal logic characterizations, and verification algorithms. Finally, we show how these behavioral equivalences and some of their variants are related to each other on the basis of their discriminating power.
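To ground the notion of behavioral equivalence mentioned above, here is a minimal sketch (not taken from the chapter; the encoding and names are mine) of strong bisimilarity computed as a greatest fixpoint on a finite labelled transition system, applied to the classic pair of syntactically different descriptions a.(b + c) and a.b + a.c, which are trace-equivalent but not bisimilar.

```python
# A minimal sketch: strong bisimilarity on a finite labelled transition system
# computed by greatest-fixpoint refinement, used to compare two syntactically
# different process descriptions.

def bisimilarity(states, transitions):
    """transitions: set of (source, action, target) triples."""
    def succ(p, a):
        return {t for (s, act, t) in transitions if s == p and act == a}

    actions = {a for (_, a, _) in transitions}
    relation = {(p, q) for p in states for q in states}

    changed = True
    while changed:
        changed = False
        for (p, q) in set(relation):
            ok = True
            for a in actions:
                # every a-move of p must be matched by an a-move of q, and vice versa
                if not all(any((p1, q1) in relation for q1 in succ(q, a)) for p1 in succ(p, a)):
                    ok = False
                if not all(any((p1, q1) in relation for p1 in succ(p, a)) for q1 in succ(q, a)):
                    ok = False
            if not ok:
                relation.discard((p, q))
                changed = True
    return relation


if __name__ == "__main__":
    # a.(b + c) versus a.b + a.c: trace-equivalent but not strongly bisimilar.
    states = {"P", "P1", "Q", "Qb", "Qc", "0"}
    transitions = {
        ("P", "a", "P1"), ("P1", "b", "0"), ("P1", "c", "0"),
        ("Q", "a", "Qb"), ("Q", "a", "Qc"), ("Qb", "b", "0"), ("Qc", "c", "0"),
    }
    rel = bisimilarity(states, transitions)
    print(("P", "Q") in rel)  # False: the classic distinguishing example
```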
Towards a bioethics of innovation.
Lipworth, Wendy; Axler, Renata
2016-07-01
In recent years, it has become almost axiomatic that biomedical research and clinical practice should be 'innovative'-that is, that they should be always evolving and directed towards the production, translation and implementation of new technologies and practices. While this drive towards innovation in biomedicine might be beneficial, it also raises serious moral, legal, economic and sociopolitical questions that require further scrutiny. In this article, we argue that biomedical innovation needs to be accompanied by a dedicated 'bioethics of innovation' that attends systematically to the goals, process and outcomes of biomedical innovation as objects of critical inquiry. Using the example of personalised or precision medicine, we then suggest a preliminary framework for a bioethics of innovation, based on the research policy initiative of 'Responsible Innovation'. We invite and encourage critiques of this framework and hope that this will provoke a challenging and enriching new bioethical discourse. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
[Relations between equilibrium and dynamics at the turn of the 17th and 18th centuries].
Schmit, Christophe
2014-01-01
This article investigates the reception of Galileo and Descartes' principles of statics in the works of some French scientists in the second half of the seventeenth century, tracing their importance for the genesis of a concept of force. Through an examination of the link between statics and dynamics--especially concerning the phenomena of collision and the motion of falling bodies--it will be shown, first, that these principles of statics actually contributed to the genesis of dynamics; secondly, that the authors examined in this article managed to unify the various fields of mechanics by building a common axiomatic basis; and, thirdly, that there exists a conceptual identity between actions in engines and actions in dynamic phenomena. The evidence brought forth in this article challenges the view according to which statics, and more particularly the law of the lever, was an obstacle for the development of dynamics, and particularly for the conceptualization of force.
NASA Astrophysics Data System (ADS)
2014-01-01
It has recently been shown within a formal axiomatic framework using a definition of four-momentum based on the Stückelberg-Feynman-Sudarshan-Recami "switching principle" that Einstein's relativistic dynamics is logically consistent with the existence of interacting faster-than-light inertial particles. Our results here show, using only basic natural assumptions on dynamics, that this definition is the only possible way to get a consistent theory of such particles moving within the geometry of Minkowskian spacetime. We present a strictly formal proof from a streamlined axiom system that given any slow or fast inertial particle, all inertial observers agree on the value of $m \cdot \sqrt{|1-v^2|}$, where $m$ is the particle's relativistic mass and $v$ its speed. This confirms formally the widely held belief that the relativistic mass and momentum of a positive-mass faster-than-light particle must decrease as its speed increases.
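A quick check of the quoted invariant, not taken from the paper and using the usual convention c = 1, shows why the concluding statement follows.

```latex
% For an ordinary (slow) inertial particle with rest mass m_0,
\[
  m = \frac{m_0}{\sqrt{1-v^2}}
  \qquad\Longrightarrow\qquad
  m\,\sqrt{|1-v^2|} = m_0 ,
\]
% so the observer-independent quantity is just the rest mass. For a
% faster-than-light particle the same expression reads m\sqrt{v^2-1}; if that
% quantity is fixed, then m must decrease as v increases, which is the
% conclusion stated in the abstract.
```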
Sharing water and benefits in transboundary river basins
NASA Astrophysics Data System (ADS)
Arjoon, Diane; Tilmant, Amaury; Herrmann, Markus
2016-06-01
The equitable sharing of benefits in transboundary river basins is necessary to solve disputes among riparian countries and to reach a consensus on basin-wide development and management activities. Benefit-sharing arrangements must be collaboratively developed to be perceived not only as efficient, but also as equitable in order to be considered acceptable to all riparian countries. The current literature mainly describes what is meant by the term benefit sharing in the context of transboundary river basins and discusses this from a conceptual point of view, but falls short of providing practical, institutional arrangements that ensure maximum economic welfare as well as collaboratively developed methods for encouraging the equitable sharing of benefits. In this study, we define an institutional arrangement that distributes welfare in a river basin by maximizing the economic benefits of water use and then sharing these benefits in an equitable manner using a method developed through stakeholder involvement. We describe a methodology in which (i) a hydrological model is used to allocate scarce water resources, in an economically efficient manner, to water users in a transboundary basin, (ii) water users are obliged to pay for water, and (iii) the total of these water charges is equitably redistributed to users as monetary compensation, in amounts determined by a sharing method that was developed, using an axiomatic approach, through stakeholder input and that therefore reflects a stakeholder vision of fairness. With the proposed benefit-sharing mechanism, the efficiency-equity trade-off still exists, but the extent of the imbalance is reduced because benefits are maximized and redistributed according to a key that has been collectively agreed upon by the participants. The whole system is overseen by a river basin authority. The methodology is applied to the Eastern Nile River basin as a case study. The described technique not only ensures economic efficiency, but may also lead to more equitable solutions in the sharing of benefits in transboundary river basins because the definition of the sharing rule is not in question, as would be the case if existing methods, such as game theory, were applied, with their inherent definitions of fairness.
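The bookkeeping behind steps (ii) and (iii) can be illustrated very simply. The sketch below is not the paper's hydro-economic allocation model or its stakeholder-derived sharing rule; it only shows charges being pooled and redistributed according to a hypothetical agreed key.

```python
# Illustrative bookkeeping only (not the paper's model or sharing rule): users
# pay for the water they are allocated, and the pooled charges are redistributed
# as compensation according to an agreed sharing key.

def redistribute(allocations_m3, price_per_m3, sharing_key):
    """allocations_m3: allocated water per user; sharing_key: agreed weights summing to 1."""
    charges = {u: v * price_per_m3 for u, v in allocations_m3.items()}
    pool = sum(charges.values())
    compensation = {u: pool * sharing_key[u] for u in allocations_m3}
    # net position of each user after paying charges and receiving compensation
    return {u: compensation[u] - charges[u] for u in allocations_m3}


if __name__ == "__main__":
    allocations = {"upstream": 60.0, "midstream": 30.0, "downstream": 10.0}
    key = {"upstream": 0.2, "midstream": 0.3, "downstream": 0.5}  # hypothetical agreed key
    print(redistribute(allocations, price_per_m3=2.0, sharing_key=key))
    # With this key, downstream users, allocated little water, end up net recipients.
```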
Style in knitted textiles and fashion
NASA Astrophysics Data System (ADS)
Štemberger, M.; Pavko-Čuden, A.
2017-10-01
The presented research relates the basic elements of the art theory with the concept of style and fashion design. The objective of the research was to determine how style is manifested in knitting in different periods of fashion seasons. The collections of three designers were compared: Missoni, Issey Miyake and Sonia Rykiel, in four different seasons in three different years. The basic artistic elements used in the presented research were: point, line, light-dark and colour together with syntactic rules. A combination of different elements and syntactic rules refers to different artistic languages, which have their own artistic grammar, i.e. a different style. All three investigated fashion designers used knitting in their collections as a significant element which defined their style. Different knitting technologies as well as different yarns made of synthetic or natural fibres in all colour spectra significantly influence the surface of a knitted fabric. Even when the technology is the same, the use of different materials, structures, colours, etc. creates various unique surfaces. The method used in the presented research was a style matrix which is developed from the axiomatic system. Only the part dealing with the language of fine arts and the pictorial speech - the style of a certain designer and a certain work of art/knitted fabric was used. After the selected three designers were examined through all the periods, it was concluded that each designer can be characterised by his own style. Despite the influencing fashion trends, all the compared designers still retained their own style, their own techniques, their own inspirations.
Reassessing the trophic role of reef sharks as apex predators on coral reefs
NASA Astrophysics Data System (ADS)
Frisch, Ashley J.; Ireland, Matthew; Rizzari, Justin R.; Lönnstedt, Oona M.; Magnenat, Katalin A.; Mirbach, Christopher E.; Hobbs, Jean-Paul A.
2016-06-01
Apex predators often have strong top-down effects on ecosystem components and are therefore a priority for conservation and management. Due to their large size and conspicuous predatory behaviour, reef sharks are typically assumed to be apex predators, but their functional role is yet to be confirmed. In this study, we used stomach contents and stable isotopes to estimate diet, trophic position and carbon sources for three common species of reef shark ( Triaenodon obesus, Carcharhinus melanopterus and C. amblyrhynchos) from the Great Barrier Reef (Australia) and evaluated their assumed functional role as apex predators by qualitative and quantitative comparisons with other sharks and large predatory fishes. We found that reef sharks do not occupy the apex of coral reef food chains, but instead have functional roles similar to those of large predatory fishes such as snappers, emperors and groupers, which are typically regarded as high-level mesopredators. We hypothesise that a degree of functional redundancy exists within this guild of predators, potentially explaining why shark-induced trophic cascades are rare or subtle in coral reef ecosystems. We also found that reef sharks participate in multiple food webs (pelagic and benthic) and are sustained by multiple sources of primary production. We conclude that large conspicuous predators, be they elasmobranchs or any other taxon, should not axiomatically be regarded as apex predators without thorough analysis of their diet. In the case of reef sharks, our dietary analyses suggest they should be reassigned to an alternative trophic group such as high-level mesopredators. This change will facilitate improved understanding of how reef communities function and how removal of predators (e.g., via fishing) might affect ecosystem properties.
Crangle, Colleen E.; Perreau-Guimaraes, Marcos; Suppes, Patrick
2013-01-01
This paper presents a new method of analysis by which structural similarities between brain data and linguistic data can be assessed at the semantic level. It shows how to measure the strength of these structural similarities and so determine the relatively better fit of the brain data with one semantic model over another. The first model is derived from WordNet, a lexical database of English compiled by language experts. The second is given by the corpus-based statistical technique of latent semantic analysis (LSA), which detects relations between words that are latent or hidden in text. The brain data are drawn from experiments in which statements about the geography of Europe were presented auditorily to participants who were asked to determine their truth or falsity while electroencephalographic (EEG) recordings were made. The theoretical framework for the analysis of the brain and semantic data derives from axiomatizations of theories such as the theory of differences in utility preference. Using brain-data samples from individual trials time-locked to the presentation of each word, ordinal relations of similarity differences are computed for the brain data and for the linguistic data. In each case those relations that are invariant with respect to the brain and linguistic data, and are correlated with sufficient statistical strength, amount to structural similarities between the brain and linguistic data. Results show that many more statistically significant structural similarities can be found between the brain data and the WordNet-derived data than the LSA-derived data. The work reported here is placed within the context of other recent studies of semantics and the brain. The main contribution of this paper is the new method it presents for the study of semantics and the brain and the focus it permits on networks of relations detected in brain data and represented by a semantic model. PMID:23799009
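The core comparison described above can be sketched in a much-simplified form: given two dissimilarity matrices over the same words, one from brain data and one from a semantic model, compare the ordinal relations of similarity differences and count how often they agree. The code below is an assumed, toy reading of that idea, not the paper's statistical procedure.

```python
# A minimal, simplified sketch of the kind of comparison described: whether
# d(i,j) < d(k,l) holds in both the brain-derived and the model-derived
# dissimilarities, over all pairs of word pairs.

from itertools import combinations


def ordinal_agreement(d_brain, d_model, words):
    pairs = list(combinations(words, 2))
    agree = total = 0
    for (p1, p2) in combinations(pairs, 2):
        b = d_brain[p1] - d_brain[p2]
        m = d_model[p1] - d_model[p2]
        if b == 0 or m == 0:
            continue  # ignore ties in this toy version
        total += 1
        agree += (b > 0) == (m > 0)
    return agree / total if total else float("nan")


if __name__ == "__main__":
    words = ["Paris", "France", "Berlin"]
    d_brain = {("Paris", "France"): 0.2, ("Paris", "Berlin"): 0.7, ("France", "Berlin"): 0.6}
    d_model = {("Paris", "France"): 0.1, ("Paris", "Berlin"): 0.8, ("France", "Berlin"): 0.5}
    print(ordinal_agreement(d_brain, d_model, words))  # 1.0: all ordinal relations agree
```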
Biodiversity, conservation biology, and rational choice.
Frank, David
2014-03-01
This paper critically discusses two areas of Sahotra Sarkar's recent work in environmental philosophy: biodiversity and conservation biology and roles for decision theory in incorporating values explicitly in the environmental policy process. I argue that Sarkar's emphasis on the practices of conservation biologists, and especially the role of social and cultural values in the choice of biodiversity constituents, restricts his conception of biodiversity to particular practical conservation contexts. I argue that life scientists have many reasons to measure many types of diversity, and that biodiversity metrics could be value-free. I argue that Sarkar's emphasis on the limitations of normative decision theory is in tension with his statement that decision theory can "put science and ethics together." I also challenge his claim that multi-criteria decision tools lacking axiomatic foundations in preference and utility theory are "without a rational basis," by presenting a case of a simple "outranking" multi-criteria decision rule that can violate a basic normative requirement of preferences (transitivity) and ask whether there may nevertheless be contexts in which such a procedure might assist decision makers. Copyright © 2013 Elsevier Ltd. All rights reserved.
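The transitivity point is easy to demonstrate. The example below is my own construction, not Sarkar's or Frank's: a simple "outranking" rule, preferring one alternative to another when it scores higher on a strict majority of criteria, produces an intransitive cycle over three alternatives.

```python
# A small illustration: majority-of-criteria outranking can violate transitivity.

scores = {            # rows: alternatives (e.g. conservation plans), columns: three criteria
    "A": [3, 1, 2],
    "B": [2, 3, 1],
    "C": [1, 2, 3],
}

def outranks(x, y):
    wins = sum(sx > sy for sx, sy in zip(scores[x], scores[y]))
    return wins > len(scores[x]) / 2   # strict majority of criteria

for x, y in [("A", "B"), ("B", "C"), ("C", "A")]:
    print(f"{x} outranks {y}: {outranks(x, y)}")
# Prints True for all three comparisons: A > B > C > A, violating transitivity.
```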
Zieba, D A; Szczesna, M; Klocek-Gorka, B; Williams, G L
2008-12-01
Photoperiod and nutrition both exert major influences on reproduction. Thus, it seems axiomatic that seasonal rhythms in ovulation are influenced by nutrition. In this context, leptin is one of the most important hormonal signals involved in the control of energy homeostasis, feeding behavior and reproductive function in mammals. However, the number of published investigations establishing a functional interaction between leptin and photoperiodism in seasonal breeders is limited. In common with most seasonally-breeding mammals, sheep exhibit robust circannual cycles in body weight and reproduction, which are driven mainly by changes in day-length. Recently, attention has focused on the role of leptin in this process, particularly in its roles as a major peripheral signal controlling appetite, melatonin and prolactin secretion. The purpose herein is to review current concepts in the overall biology of leptin, to summarize its influence on the hypothalamic-pituitary axis, and to highlight recent developments in our understanding of its interaction with season in regulating appetite, body weight and reproduction in seasonally-breeding mammals. The latter observations may be important in delineating states of leptin resistance and obesity in humans.
Soto, Ana M; Longo, Giuseppe; Montévil, Maël; Sonnenschein, Carlos
2016-10-01
The principle of inertia is central to the modern scientific revolution. By postulating this principle Galileo at once identified a pertinent physical observable (momentum) and a conservation law (momentum conservation). He then could scientifically analyze what modifies inertial movement: gravitation and friction. Inertia, the default state in mechanics, represented a major theoretical commitment: there is no need to explain uniform rectilinear motion, rather, there is a need to explain departures from it. By analogy, we propose a biological default state of proliferation with variation and motility. From this theoretical commitment, what requires explanation is proliferative quiescence, lack of variation, lack of movement. That proliferation is the default state is axiomatic for biologists studying unicellular organisms. Moreover, it is implied in Darwin's "descent with modification". Although a "default state" is a theoretical construct and a limit case that does not need to be instantiated, conditions that closely resemble unrestrained cell proliferation are readily obtained experimentally. We will illustrate theoretical and experimental consequences of applying and of ignoring this principle. Copyright © 2016 Elsevier Ltd. All rights reserved.
SOTO, ANA M.; LONGO, GIUSEPPE; Montévil, Maël; SONNENSCHEIN, CARLOS
2017-01-01
The principle of inertia is central to the modern scientific revolution. By postulating this principle Galileo at once identified a pertinent physical observable (momentum) and a conservation law (momentum conservation). He then could scientifically analyze what modifies inertial movement: gravitation and friction. Inertia, the default state in mechanics, represented a major theoretical commitment: there is no need to explain uniform rectilinear motion, rather, there is a need to explain departures from it. By analogy, we propose a biological default state of proliferation with variation and motility. From this theoretical commitment, what requires explanation is proliferative quiescence, lack of variation, lack of movement. That proliferation is the default state is axiomatic for biologists studying unicellular organisms. Moreover, it is implied in Darwin’s “descent with modification”. Although a “default state” is a theoretical construct and a limit case that does not need to be instantiated, conditions that closely resemble unrestrained cell proliferation are readily obtained experimentally. We will illustrate theoretical and experimental consequences of applying and of ignoring this principle. PMID:27381480
NASA Technical Reports Server (NTRS)
Johnson, Charles S.
1986-01-01
It is nearly axiomatic that to take the greatest advantage of the useful features available in a development system, and to avoid the negative interactions of those features, requires the exercise of a design methodology which constrains their use. A major design support feature of the Ada language is abstraction: for data, functions, processes, resources, and system elements in general. Atomic abstract types can be created in packages defining those private types and all of the overloaded operators, functions, and hidden data required for their use in an application. Generically structured abstract types can be created in generic packages defining those structured private types, as buildups from the user-defined data types which are input as parameters. A study is made of the design constraints required for software incorporating either atomic or generically structured abstract types, if the integration of software components based on them is to be subsequently performed. The impact of these techniques on the reusability of software and the creation of project-specific software support environments is also discussed.
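The notion of an "atomic" abstract type can be illustrated outside Ada. The sketch below is only a rough Python analogue of the idea (the paper's setting is Ada packages and private types, not Python): a type whose representation is hidden and which clients manipulate solely through constructor functions and overloaded operators.

```python
# Rough analogue only: an "atomic" abstract type with a hidden representation,
# exposed through a small interface of constructors and overloaded operators.

import math

class Angle:
    """Clients build and combine angles only through the operations below."""

    __slots__ = ("_radians",)                 # hidden representation

    def __init__(self, radians: float):
        self._radians = radians

    @classmethod
    def from_degrees(cls, deg: float) -> "Angle":
        return cls(math.radians(deg))

    def __add__(self, other: "Angle") -> "Angle":   # overloaded operator
        return Angle(self._radians + other._radians)

    def __repr__(self) -> str:
        return f"Angle({math.degrees(self._radians):.1f} deg)"


if __name__ == "__main__":
    print(Angle.from_degrees(30) + Angle.from_degrees(60))   # Angle(90.0 deg)
```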
Operational Axioms for Quantum Mechanics
DOE Office of Scientific and Technical Information (OSTI.GOV)
D'Ariano, Giacomo Mauro; Department of Electrical and Computer Engineering, Northwestern University, Evanston, IL 60208
2007-02-21
The mathematical formulation of Quantum Mechanics in terms of complex Hilbert space is derived for finite dimensions, starting from a general definition of physical experiment and from five simple Postulates concerning experimental accessibility and simplicity. For the infinite dimensional case, on the other hand, a C*-algebra representation of physical transformations is derived, starting from just four of the five Postulates via a Gelfand-Naimark-Segal (GNS) construction. The present paper simplifies and sharpens the previous derivation in Ref. [1]. The main ingredient of the axiomatization is the postulated existence of faithful states that allows one to calibrate the experimental apparatus. Such a notion is at the basis of the operational definitions of the scalar product and of the transposed of a physical transformation. What is new in the present paper with respect to Ref. [1] is the operational deduction of an involution corresponding to the complex-conjugation for effects, whose extension to transformations allows one to define the adjoint of a transformation when the extension is composition-preserving. The existence of such a composition-preserving extension among possible extensions is analyzed.
Deductive Coordination of Multiple Geospatial Knowledge Sources
NASA Astrophysics Data System (ADS)
Waldinger, R.; Reddy, M.; Culy, C.; Hobbs, J.; Jarvis, P.; Dungan, J. L.
2002-12-01
Deductive inference is applied to choreograph the cooperation of multiple knowledge sources to respond to geospatial queries. When no one source can provide an answer, the response may be deduced from pieces of the answer provided by many sources. Examples of sources include (1) The Alexandria Digital Library Gazetteer, a repository that gives the locations for almost six million place names, (2) The CIA World Factbook, an online almanac with basic information about more than 200 countries, (3) The SRI TerraVision 3D Terrain Visualization System, which displays a flight-simulator-like interactive display of geographic data held in a database, (4) The NASA GDACC WebGIS client for searching satellite and other geographic data available through OpenGIS Consortium (OGC) Web Map Servers, and (5) The Northern Arizona University Latitude/Longitude Distance Calculator. Queries are phrased in English and are translated into logical theorems by the Gemini Natural Language Parser. The theorems are proved by SNARK, a first-order-logic theorem prover, in the context of an axiomatic geospatial theory. The theory embodies a representational scheme that takes into account the fact that the same place may have many names, and the same name may refer to many places. SNARK has built-in procedures (RCC8 and the Allen calculus, respectively) for reasoning about spatial and temporal concepts. External knowledge sources may be consulted by SNARK as the proof is in progress, so that most knowledge need not be stored axiomatically. The Open Agent Architecture (OAA) facilitates communication between sources that may be implemented on different machines in different computer languages. An answer to the query, in the form of text or an image, is extracted from the proof. Currently, three-dimensional images are displayed by TerraVision but other displays are possible. The combined system is called Geo-Logica. Some example queries that can be handled by Geo-Logica include: (1) show the petrified forests in Oregon north of Portland, (2) show the lake in Argentina with the highest elevation, and (3) show the IGBP land cover classification, derived using MODIS, of Montana for July, 2000. Use of a theorem prover allows sources to cooperate even if they adopt different notational conventions and representation schemes and have never been designed to work together. New sources can be added without reprogramming the system, by providing axioms that advertise their capabilities. Future directions include entering into a dialogue with the user to clarify ambiguities, elaborate on previous questions, or provide new information necessary to answer the question. In addition, of particular interest is to deal with temporally varying data, with answers displayed as animated images.
Dependence of the firearm-related homicide rate on gun availability: a mathematical analysis.
Wodarz, Dominik; Komarova, Natalia L
2013-01-01
In the USA, the relationship between the legal availability of guns and the firearm-related homicide rate has been debated. It has been argued that unrestricted gun availability promotes the occurrence of firearm-induced homicides. It has also been pointed out that gun possession can protect potential victims when attacked. This paper provides a first mathematical analysis of this tradeoff, with the goal to steer the debate towards arguing about assumptions, statistics, and scientific methods. The model is based on a set of clearly defined assumptions, which are supported by available statistical data, and is formulated axiomatically such that results do not depend on arbitrary mathematical expressions. According to this framework, two alternative scenarios can minimize the gun-related homicide rate: a ban of private firearms possession, or a policy allowing the general population to carry guns. Importantly, the model identifies the crucial parameters that determine which policy minimizes the death rate, and thus serves as a guide for the design of future epidemiological studies. The parameters that need to be measured include the fraction of offenders that illegally possess a gun, the degree of protection provided by gun ownership, and the fraction of the population who take up their right to own a gun and carry it when attacked. Limited data available in the literature were used to demonstrate how the model can be parameterized, and this preliminary analysis suggests that a ban of private firearm possession, or possibly a partial reduction in gun availability, might lower the rate of firearm-induced homicides. This, however, should not be seen as a policy recommendation, due to the limited data available to inform and parameterize the model. However, the model clearly defines what needs to be measured, and provides a basis for a scientific discussion about assumptions and data.
Generalized Quantum Theory and Mathematical Foundations of Quantum Field Theory
NASA Astrophysics Data System (ADS)
Maroun, Michael Anthony
This dissertation is divided into two main topics. The first is the generalization of quantum dynamics when the Schrödinger partial differential equation is not defined even in the weak mathematical sense because the potential function itself is a distribution in the spatial variable, the same variable that is used to define the kinetic energy operator, i.e. the Laplace operator. The procedure is an extension and broadening of the distributional calculus and offers spectral results as an alternative to the only other two known methods to date, namely a) the functional calculi; and b) non-standard analysis. Furthermore, the generalizations of quantum dynamics presented within give a resolution to the time asymmetry paradox created by multi-particle quantum mechanics due to the time evolution still being unitary. A consequence is the randomization of phases needed for the fundamental justification of the Pauli master equation. The second topic is foundations of the quantum theory of fields. The title is phrased as "foundations" to emphasize that there is no claim of uniqueness but rather a proposal is put forth, which is markedly different from that of constructive or axiomatic field theory. In particular, the space of fields is defined as a space of generalized functions with involutive symmetry maps (the CPT invariance) that affect the topology of the field space. The space of quantum fields is then endowed with the Fréchet property and interactions change the topology in such a way as to cause some field spaces to be incompatible with others. This is seen in the consequences of the Haag theorem. Various examples and discussions are given that elucidate a new view of the quantum theory of fields and its (lack of) mathematical structure.
Learning, Realizability and Games in Classical Arithmetic
NASA Astrophysics Data System (ADS)
Aschieri, Federico
2010-12-01
In this dissertation we provide mathematical evidence that the concept of learning can be used to give a new and intuitive computational semantics of classical proofs in various fragments of Predicative Arithmetic. First, we extend Kreisel's modified realizability to a classical fragment of first order Arithmetic, Heyting Arithmetic plus EM1 (the excluded middle axiom restricted to Sigma^0_1 formulas). We introduce a new realizability semantics we call "Interactive Learning-Based Realizability". Our realizers are self-correcting programs, which learn from their errors and evolve through time. Secondly, we extend the class of learning based realizers to a classical version PCFclass of PCF and, then, compare the resulting notion of realizability with Coquand game semantics and prove a full soundness and completeness result. In particular, we show there is a one-to-one correspondence between realizers and recursive winning strategies in the 1-Backtracking version of Tarski games. Third, we provide a complete and fully detailed constructive analysis of learning as it arises in learning based realizability for HA+EM1, Avigad's update procedures and the epsilon substitution method for Peano Arithmetic PA. We present new constructive techniques to bound the length of learning processes and we apply them to reprove - by means of our theory - the classic result of Gödel that the provably total functions of PA can be represented in Gödel's system T. Last, we give an axiomatization of the kind of learning that is needed to computationally interpret Predicative classical second order Arithmetic. Our work is an extension of Avigad's and generalizes the concept of update procedure to the transfinite case. Transfinite update procedures have to learn values of transfinite sequences of non computable functions in order to extract witnesses from classical proofs.
Air quality measurements-From rubber bands to tapping the rainbow.
Hidy, George M; Mueller, Peter K; Altshuler, Samuel L; Chow, Judith C; Watson, John G
2017-06-01
It is axiomatic that good measurements are integral to good public policy for environmental protection. The generalized term for "measurements" includes sampling and quantitation, data integrity, documentation, network design, sponsorship, operations, archiving, and accessing for applications. Each of these components has evolved and advanced over the last 200 years as knowledge of atmospheric chemistry and physics has matured. Air quality was first detected by what people could see and smell in contaminated air. Gaseous pollutants were found to react with certain materials or chemicals, changing the color of dissolved reagents such that their light absorption at selected wavelengths could be related to both the pollutant chemistry and its concentration. Airborne particles have challenged the development of a variety of sensory devices and laboratory assays for characterization of their enormous range of physical and chemical properties. Advanced electronics made possible the sampling, concentration, and detection of gases and particles, both in situ and in laboratory analysis of collected samples. Accurate and precise measurements by these methods have made possible advanced air quality management practices that led to decreasing concentrations over time. New technologies are leading to smaller and cheaper measurement systems that can further expand and enhance current air pollution monitoring networks. Ambient air quality measurement systems have a large influence on air quality management by determining compliance, tracking trends, elucidating pollutant transport and transformation, and relating concentrations to adverse effects. These systems consist of more than just instrumentation, and involve extensive support efforts for siting, maintenance, calibration, auditing, data validation, data management and access, and data interpretation. These requirements have largely been attained for criteria pollutants regulated by National Ambient Air Quality Standards, but they are rarely attained for nonroutine measurements and research studies.
A Note on the Problem of Proper Time in Weyl Space-Time
NASA Astrophysics Data System (ADS)
Avalos, R.; Dahia, F.; Romero, C.
2018-02-01
We discuss the question of whether or not a general Weyl structure is a suitable mathematical model of space-time. This is an issue that has been in debate since Weyl formulated his unified field theory for the first time. We do not present the discussion from the point of view of a particular unification theory, but instead from a more general standpoint, in which the viability of such a structure as a model of space-time is investigated. Our starting point is the well known axiomatic approach to space-time given by Ehlers, Pirani and Schild (EPS). In this framework, we carry out an exhaustive analysis of what is required for a consistent definition of proper time and show that such a definition leads to the prediction of the so-called "second clock effect". We take the view that if, based on experience, we were to reject space-time models predicting this effect, this could be incorporated as the last axiom in the EPS approach. Finally, we provide a proof that, in this case, we are led to a Weyl integrable space-time as the most general structure that would be suitable to model space-time.
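The origin of the second clock effect can be indicated with the standard textbook account of Weyl geometry (a sketch under that standard convention, not the paper's own derivation).

```latex
% With a suitable normalization of the Weyl 1-form \varphi_\mu, the norm l of a
% transported vector (and hence a clock's unit) obeys
\[
  \mathrm{d}l \;=\; l\,\varphi_\mu\,\mathrm{d}x^\mu
  \qquad\Longrightarrow\qquad
  l_{\text{final}} \;=\; l_{\text{initial}}\,
      \exp\!\Big(\oint_\gamma \varphi_\mu\,\mathrm{d}x^\mu\Big),
\]
% which is path-dependent in general: two clocks reunited after different
% histories need not agree ("second clock effect"). If \varphi_\mu = \partial_\mu \phi
% for some scalar \phi (a Weyl integrable space-time), the loop integral
% vanishes and the effect disappears, matching the paper's conclusion.
```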
Light and harmonicity: the golden section
NASA Astrophysics Data System (ADS)
Raftopoulos, Dionysios G.
2015-09-01
Adhering to the physical philosophy of Werner Heisenberg and of the Copenhagen school, we introduce the localized observer as an absolutely necessary element of a consistent physical description of nature. Thus we have synthesized the theory of the harmonicity of the field of light, which attempts to present a new approach to the events in the human perceptible space. It is an axiomatic theory based on the selection of the projective space as the geometrical space of choice, while its first fundamental hypothesis is none other than special relativity theory's second hypothesis, properly modified. The result is that all our observations and measurements of physical entities always refer not to their present state but rather to a previous one, a conclusion evocative of the "shadows" paradigm in Plato's cave allegory. In the kinematics of a material point, we call this previous state the "conjugate position", which Richard Feynman called the "retarded position". We prove that the relation of the present position with its conjugate is ruled by a harmonic tetrad. Thus the relation of the elements of the geometrical (noetic) and the perceptible space is harmonic. In this work we show a consequence of this harmonic relation: the golden section.
NASA Astrophysics Data System (ADS)
Luo, Shunlong; Sun, Yuan
2017-08-01
Quantifications of coherence have been intensively studied in recent years in the context of completely decoherent operations (i.e., von Neumann measurements, or equivalently, orthonormal bases). Here we investigate partial coherence (i.e., coherence in the context of partially decoherent operations such as Lüders measurements). A bona fide measure of partial coherence is introduced. As an application, we address the monotonicity problem of K-coherence (a quantifier of coherence in terms of Wigner-Yanase skew information) [Girolami, Phys. Rev. Lett. 113, 170401 (2014), 10.1103/PhysRevLett.113.170401], which was introduced to realize a measure of coherence as axiomatized by Baumgratz, Cramer, and Plenio [Phys. Rev. Lett. 113, 140401 (2014), 10.1103/PhysRevLett.113.140401]. Since K-coherence fails to meet the necessary requirement of monotonicity under incoherent operations, it is desirable to remedy this monotonicity problem. We show that if we modify the original measure by taking skew information with respect to the spectral decomposition of an observable, rather than the observable itself, as a measure of coherence, then the problem disappears, and the resultant coherence measure satisfies monotonicity. Some concrete examples are discussed and related open issues are indicated.
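For reference, the Wigner-Yanase skew information is recalled below, together with a schematic rendering of the modification described in the abstract; the exact normalization and properties follow the paper, not this sketch.

```latex
% Wigner-Yanase skew information of a state \rho with respect to an observable K:
\[
  I(\rho, K) \;=\; -\tfrac{1}{2}\,
     \operatorname{Tr}\!\big([\sqrt{\rho},\,K]^2\big).
\]
% Schematically, with the spectral decomposition K = \sum_j k_j \Pi_j, coherence
% is quantified through the spectral projections rather than K itself,
\[
  C(\rho) \;=\; \sum_j I(\rho, \Pi_j),
\]
% which, unlike I(\rho, K) alone, is argued to be monotone under incoherent
% operations.
```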
Varieties of Orthocomplemented Lattices Induced by Łukasiewicz-Groupoid-Valued Mappings
NASA Astrophysics Data System (ADS)
Matoušek, Milan; Pták, Pavel
2017-12-01
In the logico-algebraic approach to the foundation of quantum mechanics we sometimes identify the set of events of the quantum experiment with an orthomodular lattice ("quantum logic"). The states are then usually associated with (normalized) finitely additive measures ("states"). The conditions imposed on states then define classes of orthomodular lattices that are sometimes found to be universal-algebraic varieties. In this paper we adopt a conceptually different approach, we relax orthomodular to orthocomplemented and we replace the states with certain subadditive mappings that range in the Łukasiewicz groupoid. We then show that when we require a type of "fulness" of these mappings, we obtain varieties of orthocomplemented lattices. Some of these varieties contain the projection lattice in a Hilbert space so there is a link to quantum logic theories. Besides, on the purely algebraic side, we present a characterization of orthomodular lattices among the orthocomplemented ones. - The intention of our approach is twofold. First, we recover some of the Mayet varieties in a principally different way (indeed, we also obtain many other new varieties). Second, by introducing an interplay of the lattice, measure-theoretic and fuzzy-set notions we intend to add to the concepts of quantum axiomatics.
Water Conservation and Hydrological Transitions in Cities
NASA Astrophysics Data System (ADS)
Hornberger, G. M.; Gilligan, J. M.; Hess, D. J.
2014-12-01
A 2012 report by the National Research Council, Challenges and Opportunities in the Hydrologic Sciences, called for the development of "translational hydrologic science." Translational research in this context requires knowledge about the communication of science to decision makers and to the public but also improved understanding of the public by the scientists. This kind of knowledge is inherently interdisciplinary because it requires understanding of the complex sociotechnical dimensions of water, policy, and user relations. It is axiomatic that good governance of water resources and water infrastructure requires information about water resources themselves and about the institutions that govern water use. This "socio-hydrologic" or "hydrosociological" knowledge is often characterized by complex dynamics between and among human and natural systems. Water Resources Research has provided a forum for presentation of interdisciplinary research in coupled natural-human systems since its inception 50 years ago. The evolution of ideas presented in the journal provides a basis for framing new work, an example of which is water conservation in cities. In particular, we explore the complex interactions of political, sociodemographic, economic, and hydroclimatological factors in affecting decisions that either advance or retard the development of water conservation policies.
NASA Astrophysics Data System (ADS)
Tessarotto, Massimo; Asci, Claudio
2017-05-01
In this paper the problem is posed of determining the physically-meaningful asymptotic orderings holding for the statistical description of a large N-body system of hard spheres, i.e., one formed by N ≡ 1/ε ≫ 1 particles, which are allowed to undergo instantaneous and purely elastic unary, binary or multiple collisions. The starting point is the axiomatic treatment recently developed [Tessarotto et al., 2013-2016] and the related discovery of an exact kinetic equation realized by the Master equation which advances in time the 1-body probability density function (PDF) for such a system. As shown in the paper, the task involves introducing appropriate asymptotic orderings in terms of ε for all the physically-relevant parameters. The goal is that of identifying the relevant physically-meaningful asymptotic approximations applicable for the Master kinetic equation, together with their possible relationships with the Boltzmann and Enskog kinetic equations, and holding in appropriate asymptotic regimes. These correspond either to dilute or dense systems and are formed either by small-size or finite-size identical hard spheres, the distinction between the various cases depending on suitable asymptotic orderings in terms of ε.
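For orientation only, the classical dilute-gas ordering is recalled below; the paper's own orderings and regimes may differ, and this is merely the best-known reference case.

```latex
% Classical Boltzmann--Grad ordering for N hard spheres of diameter \sigma in a
% fixed container in three dimensions: the mean free path is kept of order unity,
\[
  N \to \infty, \qquad \sigma \to 0, \qquad N\sigma^{2} = O(1),
\]
% i.e., with N \equiv 1/\varepsilon as above, \sigma \sim \varepsilon^{1/2}.
% Dense or finite-size regimes correspond to different orderings of \sigma in
% powers of \varepsilon, which is the kind of classification the paper sets out
% to establish.
```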
Perceiving while producing: Modeling the dynamics of phonological planning
Roon, Kevin D.; Gafos, Adamantios I.
2016-01-01
We offer a dynamical model of phonological planning that provides a formal instantiation of how the speech production and perception systems interact during online processing. The model is developed on the basis of evidence from an experimental task that requires concurrent use of both systems, the so-called response-distractor task in which speakers hear distractor syllables while they are preparing to produce required responses. The model formalizes how ongoing response planning is affected by perception and accounts for a range of results reported across previous studies. It does so by explicitly addressing the setting of parameter values in representations. The key unit of the model is that of the dynamic field, a distribution of activation over the range of values associated with each representational parameter. The setting of parameter values takes place by the attainment of a stable distribution of activation over the entire field, stable in the sense that it persists even after the response cue in the above experiments has been removed. This and other properties of representations that have been taken as axiomatic in previous work are derived by the dynamics of the proposed model. PMID:27440947
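The basic machinery of a dynamic field can be sketched generically. The code below is an Amari-style field update with illustrative parameter values and kernel, not the paper's exact model or parameterization; it shows how a localized input can set up an activation peak that, with sufficiently strong lateral excitation, persists after the input is removed, which is the kind of stable distribution the account above relies on.

```python
# A generic dynamic-field sketch in the Amari tradition (illustrative parameters,
# not the paper's model): activation u(x) over a planning parameter evolves under
# decay toward a resting level, external input, and lateral excitation with
# global inhibition.

import numpy as np

def run_field(n=101, steps=600, dt=1.0, tau=20.0, h=-3.0):
    x = np.arange(n)
    u = np.full(n, h, dtype=float)                          # start at resting level
    d = np.subtract.outer(x, x)
    kernel = np.exp(-d**2 / (2 * 3.0**2)) - 0.05            # local excitation, global inhibition
    inp = 6.0 * np.exp(-(x - 50)**2 / (2 * 5.0**2))         # localized input (the "cue")

    peak_with_input = None
    for step in range(steps):
        f = 1.0 / (1.0 + np.exp(-u))                        # sigmoidal output
        u += (dt / tau) * (-u + h + inp + kernel @ f)
        if step == steps // 2:
            peak_with_input = u.max()
            inp = np.zeros(n)                               # remove the input halfway through
    return peak_with_input, u.max()

if __name__ == "__main__":
    before, after = run_field()
    print(f"peak activation with input: {before:.2f}, after input removed: {after:.2f}")
    # With these illustrative parameters the peak stays well above the resting
    # level after the input is withdrawn (a self-sustaining activation peak).
```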
The birth of the blues: how physics underlies music
NASA Astrophysics Data System (ADS)
Gibson, J. M.
2009-07-01
Art and science have intimate connections, although these are often underappreciated. Western music provides compelling examples. The sensation of harmony and related melodic development are rooted in physical principles that can be understood with simple mathematics. The focus of this review is not the better known acoustics of instruments, but the structure of music itself. The physical basis of the evolution of Western music in the last half millennium is discussed, culminating with the development of the 'blues'. The paper refers to a number of works which expand the connections, and introduces material specific to the development of the 'blues'. Several conclusions are made: (1) that music is axiomatic like mathematics and that to appreciate music fully listeners must learn the axioms; (2) that this learning does not require specific conscious study but relies on a linkage between the creative and quantitative brain and (3) that a key element of the musical 'blues' comes from recreating missing notes on the modern equal temperament scale. The latter is an example of 'art built on artifacts'. Finally, brief reference is made to the value of music as a tool for teaching physics, mathematics and engineering to non-scientists.
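The point about missing notes can be made numerically. The example below is my own and merely consistent with the claim above: some low-integer harmonic ratios fall "between the cracks" of the 12-tone equal-tempered scale, the harmonic seventh (7:4) being one candidate for a "blue" note that players recreate by bending pitches.

```python
# Comparing just/harmonic interval sizes with the nearest equal-tempered steps.

import math

def cents(ratio: float) -> float:
    """Interval size in cents (1200 cents = one octave)."""
    return 1200 * math.log2(ratio)

just_intervals = {
    "perfect fifth (3:2)": 3 / 2,
    "major third (5:4)": 5 / 4,
    "harmonic seventh (7:4)": 7 / 4,
}

for name, ratio in just_intervals.items():
    c = cents(ratio)
    nearest_et = round(c / 100) * 100          # nearest 12-tone equal-tempered step
    print(f"{name:24s} {c:7.1f} cents  (nearest ET step: {nearest_et}, off by {c - nearest_et:+.1f})")
# The fifth lies within about 2 cents of an ET note and the major third about
# 14 cents away, while the harmonic seventh misses the nearest ET step by
# roughly a third of a semitone (~31 cents).
```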
Memory-n strategies of direct reciprocity
Martinez-Vaquero, Luis A.; Chatterjee, Krishnendu; Nowak, Martin A.
2017-01-01
Humans routinely use conditionally cooperative strategies when interacting in repeated social dilemmas. They are more likely to cooperate if others cooperated before, and are ready to retaliate if others defected. To capture the emergence of reciprocity, most previous models consider subjects who can only choose from a restricted set of representative strategies, or who react to the outcome of the very last round only. As players memorize more rounds, the dimension of the strategy space increases exponentially. This increasing computational complexity renders simulations for individuals with higher cognitive abilities infeasible, especially if multiplayer interactions are taken into account. Here, we take an axiomatic approach instead. We propose several properties that a robust cooperative strategy for a repeated multiplayer dilemma should have. These properties naturally lead to a unique class of cooperative strategies, which contains the classical Win–Stay Lose–Shift rule as a special case. A comprehensive numerical analysis for the prisoner’s dilemma and for the public goods game suggests that strategies of this class readily evolve across various memory-n spaces. Our results reveal that successful strategies depend not only on how cooperative others were in the past but also on the respective context of cooperation. PMID:28420786
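For concreteness, the classical memory-1 member of the class mentioned above can be written out directly (a standard textbook strategy, not the paper's new strategy class): Win-Stay Lose-Shift repeats its previous move after a good payoff (R or T) and switches after a bad one (S or P), which amounts to cooperating exactly when both players chose the same action in the previous round.

```python
# Win-Stay Lose-Shift in the repeated prisoner's dilemma.

C, D = "C", "D"
PAYOFF = {(C, C): 3, (C, D): 0, (D, C): 5, (D, D): 1}   # R, S, T, P

def wsls(my_last, opp_last):
    return C if my_last == opp_last else D

def all_defect(my_last, opp_last):
    return D

def play(strategy_a, strategy_b, rounds=10, start=(C, C)):
    a_last, b_last = start
    score_a = score_b = 0
    for _ in range(rounds):
        a, b = strategy_a(a_last, b_last), strategy_b(b_last, a_last)
        score_a += PAYOFF[(a, b)]
        score_b += PAYOFF[(b, a)]
        a_last, b_last = a, b
    return score_a, score_b

if __name__ == "__main__":
    print("WSLS vs WSLS:      ", play(wsls, wsls))        # mutual cooperation throughout
    print("WSLS vs all-defect:", play(wsls, all_defect))  # WSLS alternates C/D against a defector
```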
A new look at the decomposition of agricultural productivity growth incorporating weather effects.
Njuki, Eric; Bravo-Ureta, Boris E; O'Donnell, Christopher J
2018-01-01
Random fluctuations in temperature and precipitation have substantial impacts on agricultural output. However, the contribution of these changing configurations in weather to total factor productivity (TFP) growth has not been addressed explicitly in econometric analyses. Thus, the key objective of this study is to quantify and to investigate the role of changing weather patterns in explaining yearly fluctuations in TFP. For this purpose, we define TFP to be a measure of total output divided by a measure of total input. We estimate a stochastic production frontier model using U.S. state-level agricultural data incorporating growing season temperature and precipitation, and intra-annual standard deviations of temperature and precipitation for the period 1960-2004. We use the estimated parameters of the model to compute a TFP index that has good axiomatic properties. We then decompose TFP growth in each state into weather effects, technological progress, technical efficiency, and scale-mix efficiency changes. This approach improves our understanding of the role of different components of TFP in agricultural productivity growth. We find that annual TFP growth averaged 1.56% between 1960 and 2004. Moreover, we observe substantial heterogeneity in weather effects across states and over time.
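Schematically, and following only the definitions given in the abstract rather than the paper's exact index formulas, the quantity being decomposed can be written as follows.

```latex
% TFP as an aggregate-output over aggregate-input ratio for state i in year t:
\[
  \mathrm{TFP}_{it} \;=\; \frac{Q_{it}}{X_{it}},
\]
% with its growth decomposed multiplicatively into the components named above,
\[
  \frac{\mathrm{TFP}_{i,t+1}}{\mathrm{TFP}_{it}}
  \;=\;
  \underbrace{\Delta W_{it}}_{\text{weather effects}}
  \times
  \underbrace{\Delta T_{it}}_{\text{technological progress}}
  \times
  \underbrace{\Delta \mathrm{TE}_{it}}_{\text{technical efficiency}}
  \times
  \underbrace{\Delta \mathrm{SME}_{it}}_{\text{scale-mix efficiency}} .
\]
```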
The birth of the blues : how physics underlies music.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gibson, J. M.
Art and science have intimate connections, although these are often underappreciated. Western music provides compelling examples. The sensation of harmony and related melodic development are rooted in physical principles that can be understood with simple mathematics. The focus of this review is not the better known acoustics of instruments, but the structure of music itself. The physical basis of the evolution of Western music in the last half millennium is discussed, culminating with the development of the 'blues'. The paper refers to a number of works which expand the connections, and introduces material specific to the development of the 'blues'. Several conclusions are made: (1) that music is axiomatic like mathematics and that to appreciate music fully listeners must learn the axioms; (2) that this learning does not require specific conscious study but relies on a linkage between the creative and quantitative brain and (3) that a key element of the musical 'blues' comes from recreating missing notes on the modern equal temperament scale. The latter is an example of 'art built on artifacts'. Finally, brief reference is made to the value of music as a tool for teaching physics, mathematics and engineering to non-scientists.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Friedman, N.; Koller, D.; Halpern, J.Y.
Conditional logics play an important role in recent attempts to investigate default reasoning. This paper investigates first-order conditional logic. We show that, as for first-order probabilistic logic, it is important not to confound statistical conditionals over the domain (such as "most birds fly") and subjective conditionals over possible worlds (such as "I believe that Tweety is unlikely to fly"). We then address the issue of ascribing semantics to first-order conditional logic. As in the propositional case, there are many possible semantics. To study the problem in a coherent way, we use plausibility structures. These provide us with a general framework in which many of the standard approaches can be embedded. We show that while these standard approaches are all the same at the propositional level, they are significantly different in the context of a first-order language. We show that plausibilities provide the most natural extension of conditional logic to the first-order case: we provide a sound and complete axiomatization that contains only the KLM properties and the standard axioms of first-order modal logic. We show that most of the other approaches have additional properties, which result in an inappropriate treatment of an infinitary version of the lottery paradox.
OPPL-Galaxy, a Galaxy tool for enhancing ontology exploitation as part of bioinformatics workflows
2013-01-01
Background Biomedical ontologies are key elements for building up the Life Sciences Semantic Web. Reusing and building biomedical ontologies requires flexible and versatile tools to manipulate them efficiently, in particular for enriching their axiomatic content. The Ontology Pre Processor Language (OPPL) is an OWL-based language for automating the changes to be performed in an ontology. OPPL augments the ontologists’ toolbox by providing a more efficient, and less error-prone, mechanism for enriching a biomedical ontology than that obtained by a manual treatment. Results We present OPPL-Galaxy, a wrapper for using OPPL within Galaxy. The functionality delivered by OPPL (i.e. automated ontology manipulation) can be combined with the tools and workflows devised within the Galaxy framework, resulting in an enhancement of OPPL. Use cases are provided in order to demonstrate OPPL-Galaxy’s capability for enriching, modifying and querying biomedical ontologies. Conclusions Coupling OPPL-Galaxy with other bioinformatics tools of the Galaxy framework results in a system that is more than the sum of its parts. OPPL-Galaxy opens a new dimension of analyses and exploitation of biomedical ontologies, including automated reasoning, paving the way towards advanced biological data analyses. PMID:23286517
A new look at the decomposition of agricultural productivity growth incorporating weather effects
Bravo-Ureta, Boris E.; O’Donnell, Christopher J.
2018-01-01
Random fluctuations in temperature and precipitation have substantial impacts on agricultural output. However, the contribution of these changing configurations in weather to total factor productivity (TFP) growth has not been addressed explicitly in econometric analyses. Thus, the key objective of this study is to quantify and to investigate the role of changing weather patterns in explaining yearly fluctuations in TFP. For this purpose, we define TFP to be a measure of total output divided by a measure of total input. We estimate a stochastic production frontier model using U.S. state-level agricultural data incorporating growing season temperature and precipitation, and intra-annual standard deviations of temperature and precipitation for the period 1960–2004. We use the estimated parameters of the model to compute a TFP index that has good axiomatic properties. We then decompose TFP growth in each state into weather effects, technological progress, technical efficiency, and scale-mix efficiency changes. This approach improves our understanding of the role of different components of TFP in agricultural productivity growth. We find that annual TFP growth averaged 1.56% between 1960 and 2004. Moreover, we observe substantial heterogeneity in weather effects across states and over time. PMID:29466461
Scientific explanations in Greek upper secondary physics textbooks
NASA Astrophysics Data System (ADS)
Velentzas, Athanasios; Halkia, Krystallia
2018-01-01
In this study, an analysis of the structure of scientific explanations included in physics textbooks of upper secondary schools in Greece was completed. In scientific explanations for specific phenomena found in the sample textbooks, the explanandum is a logical consequence of the explanans, which in all cases include at least one scientific law (and/or principle, model or rule) previously presented, as well as statements concerning a specific case or specific conditions. The same structure is also followed in most of the cases in which the textbook authors explain regularities (i.e. laws, rules) as consequences of one or more general law or principle of physics. Finally, a number of the physics laws and principles presented in textbooks are not deduced as consequences from other, more general laws, but they are formulated axiomatically or inductively derived and the authors argue for their validity. Since, as it was found, the scientific explanations presented in the textbooks used in the study have similar structures to the explanations in internationally known textbooks, the findings of the present work may be of interest not only to science educators in Greece, but also to the community of science educators in other countries.
Dependence of the Firearm-Related Homicide Rate on Gun Availability: A Mathematical Analysis
Wodarz, Dominik; Komarova, Natalia L.
2013-01-01
In the USA, the relationship between the legal availability of guns and the firearm-related homicide rate has been debated. It has been argued that unrestricted gun availability promotes the occurrence of firearm-induced homicides. It has also been pointed out that gun possession can protect potential victims when attacked. This paper provides a first mathematical analysis of this tradeoff, with the goal to steer the debate towards arguing about assumptions, statistics, and scientific methods. The model is based on a set of clearly defined assumptions, which are supported by available statistical data, and is formulated axiomatically such that results do not depend on arbitrary mathematical expressions. According to this framework, two alternative scenarios can minimize the gun-related homicide rate: a ban of private firearms possession, or a policy allowing the general population to carry guns. Importantly, the model identifies the crucial parameters that determine which policy minimizes the death rate, and thus serves as a guide for the design of future epidemiological studies. The parameters that need to be measured include the fraction of offenders that illegally possess a gun, the degree of protection provided by gun ownership, and the fraction of the population who take up their right to own a gun and carry it when attacked. Limited data available in the literature were used to demonstrate how the model can be parameterized, and this preliminary analysis suggests that a ban of private firearm possession, or possibly a partial reduction in gun availability, might lower the rate of firearm-induced homicides. This, however, should not be seen as a policy recommendation, due to the limited data available to inform and parameterize the model. However, the model clearly defines what needs to be measured, and provides a basis for a scientific discussion about assumptions and data. PMID:23923062
A fractal model of the Universe
NASA Astrophysics Data System (ADS)
Gottlieb, Ioan
The book represents a revised, extended, completed and translated version of the book "Superposed Universes. A scientific novel and a SF story" (1995). The book contains a hypothesis by the author concerning the complexity of Nature. An introduction to the theories of numbers, manifolds and topology is given. The possible connection with the theory of evolution of the Universe is discussed. The last chapter also contains an SF story based on the hypothesis presented. A connection with fractal theory is given. A part of his earlier studies (1955-1956) was subsequently published without citation by Ali Kyrala (Phys. Rev. vol.117, No.5, March 1, 1960). The book contains as an important appendix the early papers (some of which were published in co-authorship with his scientific advisors): 1) T.T. Vescan, A. Weiszmann and I.Gottlieb, Contributii la studiul problemelor geometrice ale teoriei relativitatii restranse. Academia R.P.R. Baza Timisoara. Lucrarile consfatuirii de geometrie diferentiala din 9-12 iunie 1955. In this paper the authors show a new method for the calculation of the metrics. 2) Jean Gottlieb, L'hyphotese d'un modele de la structure de la matiere, Revista Matematica y Fisica Teorica, Serie A, Volumen XY, No.1, y.2, 1964 3) I. Gottlieb, Some hypotheses on space, time and gravitation, Studies in Gravitation Theory, CIP Press, Bucharest, 1988, pp.227-234 as well as some recent papers (published in co-authorship with his disciples): 4) M. Agop, Gottlieb space-time. A fractal axiomatic model of the Universe. in Particles and Fields, Editors: M.Agop and P.D. Ioannou, Athens University Press, 2005, pp. 59-141 5) I. Gottlieb, M.Agop and V.Enache, Games with Cantor's dust. Chaos, Solitons and Fractals, vol.40 (2009) pp. 940-945 6) I. Gottlieb, My picture over the World, Bull. of the Polytechnic Institute of Iasi. Tom LVI)LX, Fasc. 1, 2010, pp. 1-18. The book also contains a dedication to father Vasile Gottlieb and wife Cleopatra Mociutchi.
Quantum cluster algebras and quantum nilpotent algebras.
Goodearl, Kenneth R; Yakimov, Milen T
2014-07-08
A major direction in the theory of cluster algebras is to construct (quantum) cluster algebra structures on the (quantized) coordinate rings of various families of varieties arising in Lie theory. We prove that all algebras in a very large axiomatically defined class of noncommutative algebras possess canonical quantum cluster algebra structures. Furthermore, they coincide with the corresponding upper quantum cluster algebras. We also establish analogs of these results for a large class of Poisson nilpotent algebras. Many important families of coordinate rings are subsumed in the class we are covering, which leads to a broad range of applications of the general results to the above-mentioned types of problems. As a consequence, we prove the Berenstein-Zelevinsky conjecture [Berenstein A, Zelevinsky A (2005) Adv Math 195:405-455] for the quantized coordinate rings of double Bruhat cells and construct quantum cluster algebra structures on all quantum unipotent groups, extending the theorem of Geiß et al. [Geiß C, et al. (2013) Selecta Math 19:337-397] for the case of symmetric Kac-Moody groups. Moreover, we prove that the upper cluster algebras of Berenstein et al. [Berenstein A, et al. (2005) Duke Math J 126:1-52] associated with double Bruhat cells coincide with the corresponding cluster algebras.
Physiological optics and physical geometry.
Hyder, D J
2001-09-01
Hermann von Helmholtz's distinction between "pure intuitive" and "physical" geometry must be counted as the most influential of his many contributions to the philosophy of science. In a series of papers from the 1860s and 70s, Helmholtz argued against Kant's claim that our knowledge of Euclidean geometry was an a priori condition for empirical knowledge. He claimed that geometrical propositions could be meaningful only if they were taken to concern the behaviors of physical bodies used in measurement, from which it followed that it was posterior to our acquaintance with this behavior. This paper argues that Helmholtz's understanding of geometry was fundamentally shaped by his work in sense-physiology, above all on the continuum of colors. For in the course of that research, Helmholtz was forced to realize that the color-space had no inherent metrical structure. The latter was a product of axiomatic definitions of color-addition and the empirical results of such additions. Helmholtz's development of these views is explained with detailed reference to the competing work of the mathematician Hermann Grassmann and that of the young James Clerk Maxwell. It is this separation between 1) essential properties of a continuum, 2) supplementary axioms concerning distance-measurement, and 3) the behaviors of the physical apparatus used to realize the axioms, which is definitive of Helmholtz's arguments concerning geometry.
Stress Induces Contextual Blindness in Lotteries and Coordination Games
Brocas, Isabelle; Carrillo, Juan D.; Kendall, Ryan
2017-01-01
In this paper, we study how stress affects risk taking in three tasks: individual lotteries, Stag Hunt (coordination) games, and Hawk-Dove (anti-coordination) games. Both control and stressed subjects take more risks in all three tasks when the value of the safe option is decreased and in lotteries when the expected gain is increased. Also, subjects take longer to make decisions when stakes are high, when the safe option is less attractive and in the conceptually more difficult Hawk-Dove game. Stress (weakly) increases reaction times in those cases. Finally, our main result is that the behaviors of stressed subjects in lotteries, Stag Hunt and Hawk-Dove are all highly predictive of each other (p-value < 0.001 for all three pairwise correlations). Such a strong relationship is not present in our control group. Our results illustrate a “contextual blindness” caused by stress. The mathematical and behavioral tensions of Stag Hunt and Hawk-Dove games are axiomatically different, and we should expect different behavior across these games, and also with respect to the individual task. A possible explanation for the highly significant connection across tasks in the stress condition is that stressed subjects habitually rely on one mechanism to make a decision in all contexts whereas unstressed subjects utilize a more cognitively flexible approach. PMID:29321733
Pragmatic turn in biology: From biological molecules to genetic content operators.
Witzany, Guenther
2014-08-26
Erwin Schrödinger's question "What is life?" received for decades the answer "physics + chemistry". The concepts of Alan Turing and John von Neumann introduced a third term: "information". This led to the understanding of nucleic acid sequences as a natural code. Manfred Eigen adapted the concept of Hamming's "sequence space". Similar to Hilbert space, in which every ontological entity could be defined by an unequivocal point in a mathematical axiomatic system, in the abstract "sequence space" concept each point represents a unique syntactic structure and the value of their separation represents their dissimilarity. In this concept molecular features of the genetic code evolve by means of self-organisation of matter. Biological selection determines the fittest types among varieties of replication errors of quasi-species. The quasi-species concept dominated evolution theory for many decades. In contrast to this, recent empirical data on the evolution of DNA and its forerunners, the RNA world and viruses, indicate cooperative agent-based interactions. Group behaviour of quasi-species consortia constitutes de novo, and arranges, available genetic content for adaptational purposes within real-life contexts that determine epigenetic markings. This review focuses on some fundamental changes in biology, discarding its traditional status as a subdiscipline of physics and chemistry.
Beyond the Shannon–Khinchin formulation: The composability axiom and the universal-group entropy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tempesta, Piergiulio, E-mail: p.tempesta@fis.ucm.es
2016-02-15
The notion of entropy is ubiquitous both in natural and social sciences. In the last two decades, a considerable effort has been devoted to the study of new entropic forms, which generalize the standard Boltzmann–Gibbs (BG) entropy and could be applicable in thermodynamics, quantum mechanics and information theory. In Khinchin (1957), by extending previous ideas of Shannon (1948) and Shannon and Weaver (1949), Khinchin proposed a characterization of the BG entropy, based on four requirements, nowadays known as the Shannon–Khinchin (SK) axioms. The purpose of this paper is twofold. First, we show that there exists an intrinsic group-theoretical structure behind the notion of entropy. It comes from the requirement of composability of an entropy with respect to the union of two statistically independent systems, that we propose in an axiomatic formulation. Second, we show that there exists a simple universal family of trace-form entropies. This class contains many well known examples of entropies and infinitely many new ones, a priori multi-parametric. Due to its specific relation with Lazard’s universal formal group of algebraic topology, the new general entropy introduced in this work will be called the universal-group entropy. A new example of multi-parametric entropy is explicitly constructed.
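The composability requirement discussed here can be checked numerically for familiar cases. The Python sketch below verifies additivity of the Boltzmann–Gibbs/Shannon entropy for two statistically independent systems and, as a contrasting composable-but-non-additive example, the Tsallis composition rule; it illustrates the axiom only and does not construct the universal-group entropy of the paper.

```python
import numpy as np

def shannon(p):
    p = np.asarray(p)
    return -np.sum(p * np.log(p))

def tsallis(p, q):
    p = np.asarray(p)
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

pA = np.array([0.2, 0.3, 0.5])
pB = np.array([0.6, 0.4])
pAB = np.outer(pA, pB).ravel()      # joint distribution of independent systems

# Boltzmann-Gibbs/Shannon: S(A+B) = S(A) + S(B)
print(np.isclose(shannon(pAB), shannon(pA) + shannon(pB)))

# Tsallis (q != 1): S(A+B) = S(A) + S(B) + (1-q) S(A) S(B)  (composable but non-additive)
q = 1.5
lhs = tsallis(pAB, q)
rhs = tsallis(pA, q) + tsallis(pB, q) + (1 - q) * tsallis(pA, q) * tsallis(pB, q)
print(np.isclose(lhs, rhs))
```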
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, J. D.; Oberkampf, William Louis; Helton, Jon Craig
2006-10-01
Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty in model predictions that derives from epistemic uncertainty in model inputs, where the descriptor epistemic is used to indicate uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. The potential benefit, and hence appeal, of evidence theory is that it allows a less restrictive specification of uncertainty than is possible within the axiomatic structure on which probability theory is based. Unfortunately, the propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than the propagation of a probabilistic representation for uncertainty, with this difficulty constituting a serious obstacle to the use of evidence theory in the representation of uncertainty in predictions obtained from computationally intensive models. This presentation describes and illustrates a sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory. Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost.
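A minimal sketch of the kind of sampling-based propagation described here is given below. It assumes inputs specified by basic probability assignments over interval focal elements, samples the model within each focal-element box to approximate its image, and accumulates belief and plausibility for an output threshold; the model function, intervals, masses and sample sizes are hypothetical placeholders, not the strategy's actual implementation details.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

def model(x1, x2):
    # stand-in for an expensive simulation code
    return x1 ** 2 + np.sin(x2)

# Basic probability assignments: focal elements (intervals) with masses, per input.
bpa_x1 = [((0.0, 1.0), 0.6), ((0.5, 2.0), 0.4)]
bpa_x2 = [((0.0, 3.0), 0.7), ((2.0, 4.0), 0.3)]

threshold = 2.0
bel = pl = 0.0
for (i1, m1), (i2, m2) in itertools.product(bpa_x1, bpa_x2):
    # Sample within the focal-element box to approximate the min/max of the model there
    x1 = rng.uniform(*i1, 200)
    x2 = rng.uniform(*i2, 200)
    y = model(x1, x2)
    mass = m1 * m2
    if y.min() > threshold:   # whole box maps above threshold -> contributes to belief
        bel += mass
    if y.max() > threshold:   # box possibly maps above threshold -> contributes to plausibility
        pl += mass

print(f"Bel(Y > {threshold}) ≈ {bel:.2f}, Pl(Y > {threshold}) ≈ {pl:.2f}")
```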
Valsiner, Jaan
2009-03-01
Since the new beginning in 2007 of Integrative Psychological & Behavioral Science we have brought into the open the reasons why the ever-widening research enterprise in psychology has largely failed to produce general knowledge, and have pointed to promising new directions in the field. The post-modernist turn in psychology is now over, and it is an interesting task to return to creating a universal science of psychology that is context-sensitive and culture-inclusive. The latter goal entails a renewed focus upon qualitative analyses of time-based processes, close attention to the phenomena under study, and a systematic (single-system-based, usually labeled idiographic) focus in empirical investigations. Through these three pathways the centrality of human experiencing of culturally constructed worlds is restored as the core of psychological science. Universal principles are evident in each and every single case. Transcending the post-modernist deconstruction of science happens through active international participation and a renewed focus on creating general theories. Contemporary psychology is global in ways that no longer allow any single country's socio-political world view to dominate the field. Such international equality of contributions sustains innovation at the core of the discipline, and safeguards it against assuming any single cultural myth-story as the axiomatic basis for the discipline.
Evolutionary theory and teleology.
O'Grady, R T
1984-04-21
The order within and among living systems can be explained rationally by postulating a process of descent with modification, effected by factors which are extrinsic or intrinsic to the organisms. Because at the time Darwin proposed his theory of evolution there was no concept of intrinsic factors which could evolve, he postulated a process of extrinsic effects--natural selection. Biological order was thus seen as an imposed, rather than an emergent, property. Evolutionary change was seen as being determined by the functional efficiency (adaptedness) of the organism in its environment, rather than by spontaneous changes in intrinsically generated organizing factors. The initial incompleteness of Darwin's explanatory model, and the axiomatization of its postulates in neo-Darwinism, has resulted in a theory of functionalism, rather than structuralism. As such, it introduces an unnecessary teleology which confounds evolutionary studies and reduces the usefulness of the theory. This problem cannot be detected from within the neo-Darwinian paradigm because the different levels of end-directed activity--teleomatic, teleonomic, and teleological--are not recognized. They are, in fact, considered to influence one another. The theory of nonequilibrium evolution avoids these problems by returning to the basic principles of biological order and developing a structuralist explanation of intrinsically generated change. Extrinsic factors may affect the resultant evolutionary pattern, but they are neither necessary nor sufficient for evolution to occur.
The principle of finiteness - a guideline for physical laws
NASA Astrophysics Data System (ADS)
Sternlieb, Abraham
2013-04-01
I propose a new principle in physics-the principle of finiteness (FP). It stems from the definition of physics as a science that deals with measurable dimensional physical quantities. Since measurement results including their errors, are always finite, FP postulates that the mathematical formulation of legitimate laws in physics should prevent exactly zero or infinite solutions. I propose finiteness as a postulate, as opposed to a statement whose validity has to be corroborated by, or derived theoretically or experimentally from other facts, theories or principles. Some consequences of FP are discussed, first in general, and then more specifically in the fields of special relativity, quantum mechanics, and quantum gravity. The corrected Lorentz transformations include an additional translation term depending on the minimum length epsilon. The relativistic gamma is replaced by a corrected gamma, that is finite for v=c. To comply with FP, physical laws should include the relevant extremum finite values in their mathematical formulation. An important prediction of FP is that there is a maximum attainable relativistic mass/energy which is the same for all subatomic particles, meaning that there is a maximum theoretical value for cosmic rays energy. The Generalized Uncertainty Principle required by Quantum Gravity is actually a necessary consequence of FP at Planck's scale. Therefore, FP may possibly contribute to the axiomatic foundation of Quantum Gravity.
Information flow and causality as rigorous notions ab initio
NASA Astrophysics Data System (ADS)
Liang, X. San
2016-11-01
Information flow, or information transfer, the widely applicable general physics notion, can be rigorously derived from first principles, rather than axiomatically proposed as an ansatz. Its logical association with causality is firmly rooted in the dynamical system that lies beneath. The principle of nil causality, which reads that an event is not causal to another if the evolution of the latter is independent of the former, and which transfer entropy analysis and the Granger causality test fail to verify in many situations, turns out to be a proven theorem here. Established in this study are the information flows among the components of time-discrete mappings and time-continuous dynamical systems, both deterministic and stochastic. They have been obtained explicitly in closed form, and put to application with benchmark systems such as the Kaplan-Yorke map, Rössler system, baker transformation, Hénon map, and stochastic potential flow. Besides unraveling the causal relations as expected from the respective systems, some of the applications show that the information flow structure underlying a complex trajectory pattern could be tractable. For linear systems, the resulting remarkably concise formula asserts analytically that causation implies correlation, while correlation does not imply causation, providing a mathematical basis for the long-standing philosophical debate over causation versus correlation.
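The closing claim that, for linear systems, causation implies correlation but not conversely can be illustrated with a toy simulation. The sketch below does not use the paper's closed-form information flow formula; it only exhibits the asymmetry that the formula formalizes, with invented coefficients.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

# Case 1: x2 drives x1 (causation) -> correlation appears
x2 = rng.normal(size=n)
x1 = 0.8 * x2 + 0.5 * rng.normal(size=n)

# Case 2: x3 and x4 share a common driver z but do not drive each other
z = rng.normal(size=n)
x3 = z + 0.5 * rng.normal(size=n)
x4 = z + 0.5 * rng.normal(size=n)

print(np.corrcoef(x1, x2)[0, 1])   # large: causation shows up as correlation
print(np.corrcoef(x3, x4)[0, 1])   # also large: correlation without direct causation
```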
NASA Astrophysics Data System (ADS)
Anisimov, M. P.
2016-12-01
A relatively recent idea in the scientific literature is the design of nucleation rate surfaces over diagrams of phase equilibria. That idea looks profitable for the development of nucleation theory and for various practical applications in which theoretical predictions are not yet accurate enough. Ordinary thermodynamics has no real ability to predict the parameters of a first-order phase transition. Nucleation experiments can probe only very local nucleation conditions, even though nucleation takes place anywhere from the critical line (in the two-component case) down to the absolute-zero temperature limit, and from zero nucleation rate at phase equilibrium up to spinodal conditions. Theoretical predictions have low reliability as a rule. Computational chemistry has a chance to make the solution of that problem easier once the set of axiomatic statements it uses adopts sufficiently progressive assumptions [1]. Semi-empirical design of nucleation rate surfaces over diagrams of phase equilibria has the potential to provide reasonable-quality information on the nucleation rate for each nucleation channel. Using the topologies of nucleation rate surfaces to optimize the synthesis of a given phase of a target material will become possible once a database of nucleation rates over diagrams of phase equilibria is created.
Entropy production and optimization of geothermal power plants
NASA Astrophysics Data System (ADS)
Michaelides, Efstathios E.
2012-09-01
Geothermal power plants are currently producing reliable and low-cost, base load electricity. Three basic types of geothermal power plants are currently in operation: single-flashing, dual-flashing, and binary power plants. Typically, the single-flashing and dual-flashing geothermal power plants utilize geothermal water (brine) at temperatures in the range of 550-430 K. Binary units utilize geothermal resources at lower temperatures, typically 450-380 K. The entropy production in the various components of the three types of geothermal power plants determines the efficiency of the plants. It is axiomatic that a lower entropy production would improve significantly the energy utilization factor of the corresponding power plant. For this reason, the entropy production in the major components of the three types of geothermal power plants has been calculated. It was observed that binary power plants generate the lowest amount of entropy and, thus, convert the highest rate of geothermal energy into mechanical energy. The single-flashing units generate the highest amount of entropy, primarily because they re-inject fluid at relatively high temperature. The calculations for entropy production provide information on the equipment where the highest irreversibilities occur, and may be used to optimize the design of geothermal processes in future geothermal power plants and thermal cycles used for the harnessing of geothermal energy.
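The component-by-component entropy accounting described here follows the standard steady-flow entropy balance. The sketch below applies that balance to a single adiabatic flash separator with assumed, illustrative property values; it is not a reproduction of the paper's calculations.

```python
# Steady-state entropy balance for a single plant component (e.g., a flash separator):
#   S_gen = sum(m_out * s_out) - sum(m_in * s_in) - Q / T_boundary  >= 0
# All numbers below are illustrative placeholders, not results from the paper.

def entropy_production(inlets, outlets, q_dot=0.0, t_boundary=298.15):
    """inlets/outlets: lists of (mass flow in kg/s, specific entropy in kJ/kg-K)."""
    s_out = sum(m * s for m, s in outlets)
    s_in = sum(m * s for m, s in inlets)
    return s_out - s_in - q_dot / t_boundary   # kW/K

# Adiabatic flash: 100 kg/s of hot brine enters; steam and separated liquid leave.
inlets = [(100.0, 2.10)]                  # assumed specific entropy of the brine
outlets = [(15.0, 6.80), (85.0, 1.60)]    # assumed steam and liquid entropies
print(f"S_gen ≈ {entropy_production(inlets, outlets):.1f} kW/K")
```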
D'Acremont, Mathieu; Bossaerts, Peter
2008-12-01
When modeling valuation under uncertainty, economists generally prefer expected utility because it has an axiomatic foundation, meaning that the resulting choices will satisfy a number of rationality requirements. In expected utility theory, values are computed by multiplying probabilities of each possible state of nature by the payoff in that state and summing the results. The drawback of this approach is that all state probabilities need to be dealt with separately, which becomes extremely cumbersome when it comes to learning. Finance academics and professionals, however, prefer to value risky prospects in terms of a trade-off between expected reward and risk, where the latter is usually measured in terms of reward variance. This mean-variance approach is fast and simple and greatly facilitates learning, but it impedes assigning values to new gambles on the basis of those of known ones. To date, it is unclear whether the human brain computes values in accordance with expected utility theory or with mean-variance analysis. In this article, we discuss the theoretical and empirical arguments that favor one or the other theory. We also propose a new experimental paradigm that could determine whether the human brain follows the expected utility or the mean-variance approach. Behavioral results of implementation of the paradigm are discussed.
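The two valuation rules contrasted here are easy to state side by side. The sketch below computes both for one illustrative gamble; the square-root utility function and the risk-aversion coefficient are assumptions chosen only to make the comparison concrete.

```python
import numpy as np

# A risky prospect: payoffs with their state probabilities (illustrative numbers).
payoffs = np.array([0.0, 50.0, 100.0])
probs = np.array([0.2, 0.5, 0.3])

# Expected utility: sum over states of probability * utility(payoff)
utility = np.sqrt          # an assumed concave (risk-averse) utility function
eu = np.sum(probs * utility(payoffs))

# Mean-variance valuation: expected reward penalized by reward variance
mean = np.sum(probs * payoffs)
var = np.sum(probs * (payoffs - mean) ** 2)
risk_aversion = 0.01       # assumed trade-off coefficient
mv = mean - risk_aversion * var

print(f"Expected utility: {eu:.2f}   Mean-variance value: {mv:.2f}")
```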
Utility measurement in healthcare: the things I never got to.
Torrance, George W
2006-01-01
The present article provides a brief historical background on the development of utility measurement and cost-utility analysis in healthcare. It then outlines a number of research ideas in this field that the author never got to. The first idea is extremely fundamental. Why is health economics the only application of economics that does not use the discipline of economics? And, more importantly, what discipline should it use? Research ideas are discussed to investigate precisely the underlying theory and axiom systems of both Paretian welfare economics and the decision-theoretical utility approach. Can the two approaches be integrated or modified in some appropriate way so that they better reflect the needs of the health field? The investigation is described both for the individual and societal levels. Constructing a 'Robinson Crusoe' society of only a few individuals with different health needs, preferences and willingness to pay is suggested as a method for gaining insight into the problem. The second idea concerns the interval property of utilities and, therefore, QALYs. It specifically concerns the important requirement that changes of equal magnitude anywhere on the utility scale, or alternatively on the QALY scale, should be equally desirable. Unfortunately, one of the original restrictions on utility theory states that such comparisons are not permitted by the theory. It is shown, in an important new finding, that while this restriction applies in a world of certainty, it does not in a world of uncertainty, such as healthcare. Further research is suggested to investigate this property under both certainty and uncertainty. Other research ideas that are described include: the development of a precise axiomatic basis for the time trade-off method; the investigation of chaining as a method of preference measurement with the standard gamble or time trade-off; the development and training of a representative panel of the general public to improve the completeness, coherence and consistency of measured preferences; and the investigation, using a model of a very small society, of the conflict between the patient perspective and the societal perspective regarding preferences. Finally, it is suggested that an important area of research, which the author never got to, would be to work closely with specific decision makers on specific decision problems, to help them formulate the problem, provide useful analyses, and to publish these as case studies to give the field a better understanding of the problems and the needs of decision makers.
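The time trade-off, standard gamble and QALY constructions mentioned here have standard textbook forms, sketched below with hypothetical indifference responses.

```python
def utility_time_tradeoff(x_full_health_years, t_state_years):
    """TTO: indifferent between t years in the health state and x years in full health."""
    return x_full_health_years / t_state_years

def utility_standard_gamble(p_indifference):
    """SG: indifferent between the state for certain and a gamble giving
    full health with probability p and death with probability 1 - p."""
    return p_indifference

def qalys(profile):
    """profile: list of (utility, years) episodes; QALYs weight time by utility."""
    return sum(u * years for u, years in profile)

u_tto = utility_time_tradeoff(7.0, 10.0)        # -> 0.70
u_sg = utility_standard_gamble(0.85)            # -> 0.85
print(u_tto, u_sg, qalys([(u_tto, 10), (1.0, 5)]))   # 0.7*10 + 1.0*5 = 12 QALYs
```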
Arguments concerning Relativity and Cosmology.
Klein, O
1971-01-29
In the first place I have reviewed the true foundation of Einstein's theory of general relativity, the so-called principle of equivalence, according to which there is no essential difference between "genuine" gravitation and inertial forces, well known from accelerated vehicles. By means of a comparison with Gaussian geometry of curved surfaces (the background of Riemannian geometry, the tool used by Einstein for the mathematical formulation of his theory), it is made clear that this principle is incompatible with the idea proposed by Mach and accepted by Einstein as an incitement to his attempt to describe the main situation in the universe as an analogy in three dimensions to the closed surface of a sphere. In the later attempts toward a mathematical description of the universe, where Einstein's cosmology was adapted to the discovery by Hubble that its observed part is expanding, the so-called cosmological postulate has been used as a kind of axiomatic background which, when analyzed, makes it probable that this expansion is shared by a very big, but still bounded system. This implies that our expanding metagalaxy is probably just one of a type of stellar objects in different phases of evolution, some expanding and some contracting. Some attempts toward the description of this evolution are sketched in the article with the hope that further investigation, theoretical and observational, may lead to an interesting advance in this part of astrophysics.
A Tangent Bundle Theory for Visual Curve Completion.
Ben-Yosef, Guy; Ben-Shahar, Ohad
2012-07-01
Visual curve completion is a fundamental perceptual mechanism that completes the missing parts (e.g., due to occlusion) between observed contour fragments. Previous research into the shape of completed curves has generally followed an "axiomatic" approach, where desired perceptual/geometrical properties are first defined as axioms, followed by mathematical investigation into curves that satisfy them. However, determining psychophysically such desired properties is difficult and researchers still debate what they should be in the first place. Instead, here we exploit the observation that curve completion is an early visual process to formalize the problem in the unit tangent bundle R(2) × S(1), which abstracts the primary visual cortex (V1) and facilitates exploration of basic principles from which perceptual properties are later derived rather than imposed. Exploring here the elementary principle of least action in V1, we show how the problem becomes one of finding minimum-length admissible curves in R(2) × S(1). We formalize the problem in variational terms, we analyze it theoretically, and we formulate practical algorithms for the reconstruction of these completed curves. We then explore their induced visual properties vis-à-vis popular perceptual axioms and show how our theory predicts many perceptual properties reported in the corresponding perceptual literature. Finally, we demonstrate a variety of curve completions and report comparisons to psychophysical data and other completion models.
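Read as "least action in V1", the completion problem described here amounts, schematically, to finding a shortest admissible curve in the unit tangent bundle. A schematic rendering of such a length functional is given below; the weighting parameter β and the exact normalization are assumptions, not necessarily the paper's formulation.

```latex
% Schematic: completion as a minimum-length admissible curve in R^2 x S^1.
% gamma(t) = (x(t), y(t), theta(t)) is admissible when the orientation coordinate
% theta matches the planar tangent direction; beta is an assumed weighting parameter.
\begin{aligned}
&\min_{\gamma}\ \int_{0}^{1}\sqrt{\dot{x}(t)^{2}+\dot{y}(t)^{2}+\beta^{2}\,\dot{\theta}(t)^{2}}\,\mathrm{d}t\\[2pt]
&\text{subject to}\quad \theta(t)=\arctan\!\bigl(\dot{y}(t)/\dot{x}(t)\bigr),\qquad
\gamma(0),\ \gamma(1)\ \text{fixed by the observed contour fragments.}
\end{aligned}
```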
Data-driven non-linear elasticity: constitutive manifold construction and problem discretization
NASA Astrophysics Data System (ADS)
Ibañez, Ruben; Borzacchiello, Domenico; Aguado, Jose Vicente; Abisset-Chavanne, Emmanuelle; Cueto, Elias; Ladeveze, Pierre; Chinesta, Francisco
2017-11-01
The use of constitutive equations calibrated from data has been implemented into standard numerical solvers for successfully addressing a variety of problems encountered in simulation-based engineering sciences (SBES). However, complexity keeps increasing due to the need for increasingly detailed models as well as the use of engineered materials. Data-driven simulation constitutes a potential change of paradigm in SBES. Standard simulation in computational mechanics is based on the use of two very different types of equations. The first one, of axiomatic character, is related to balance laws (momentum, mass, energy, ...), whereas the second one consists of models that scientists have extracted from collected, either natural or synthetic, data. Data-driven (or data-intensive) simulation consists of directly linking experimental data to computers in order to perform numerical simulations. These simulations will employ laws, universally recognized as epistemic, while minimizing the need for explicit, often phenomenological, models. The main drawback of such an approach is the large amount of required data, some of them inaccessible with today's testing facilities. Such difficulty can be circumvented in many cases, and in any case alleviated, by considering complex tests, collecting as many data as possible, and then using a data-driven inverse approach in order to generate the whole constitutive manifold from a few complex experimental tests, as discussed in the present work.
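A minimal one-dimensional illustration of the data-driven idea, keeping the axiomatic balance law and replacing the constitutive model by a sampled data set, might look as follows; the material data, geometry and loading are invented for the sketch, and it is not the constitutive-manifold construction of the paper.

```python
import numpy as np

# Synthetic "material data set": sampled (strain, stress) pairs playing the role of
# the constitutive manifold (no explicit constitutive model is fitted).
strain_data = np.linspace(0.0, 0.05, 200)
stress_data = 2.0e9 * strain_data - 8.0e9 * strain_data ** 2   # hypothetical nonlinear response
data = np.column_stack([strain_data, stress_data])

# A single bar under a prescribed axial force: equilibrium (an axiomatic balance law)
# fixes the stress exactly; the strain is then read off the nearest data point.
force, area = 30.0e6, 1.0          # N, m^2  (illustrative values)
sigma_eq = force / area            # Pa

nearest = data[np.argmin(np.abs(data[:, 1] - sigma_eq))]
print(f"equilibrium stress = {sigma_eq:.3e} Pa -> data-driven strain ≈ {nearest[0]:.4f}")
```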
Test of a hypothesis of realism in quantum theory using a Bayesian approach
NASA Astrophysics Data System (ADS)
Nikitin, N.; Toms, K.
2017-05-01
In this paper we propose a time-independent equality and time-dependent inequality, suitable for an experimental test of the hypothesis of realism. The derivation of these relations is based on the concept of conditional probability and on Bayes' theorem in the framework of Kolmogorov's axiomatics of probability theory. The equality obtained is intrinsically different from the well-known Greenberger-Horne-Zeilinger (GHZ) equality and its variants, because violation of the proposed equality might be tested in experiments with only two microsystems in a maximally entangled Bell state |Ψ->, while a test of the GHZ equality requires at least three quantum systems in a special state |ΨGHZ>. The obtained inequality differs from Bell's, Wigner's, and Leggett-Garg inequalities, because it deals with spin s = 1/2 projections onto only two nonparallel directions at two different moments of time, while a test of the Bell and Wigner inequalities requires at least three nonparallel directions, and a test of the Leggett-Garg inequalities requires at least three distinct moments of time. Hence, the proposed inequality seems to open an additional experimental possibility to avoid the "contextuality loophole." Violation of the proposed equality and inequality is illustrated with the behavior of a pair of anticorrelated spins in an external magnetic field and also with the oscillations of flavor-entangled pairs of neutral pseudoscalar mesons.
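For reference, the two ingredients named here, conditional probability in Kolmogorov's axiomatics and Bayes' theorem, take the standard forms below; these are the textbook definitions, not the paper's proposed equality or inequality.

```latex
% Conditional probability (Kolmogorov) and Bayes' theorem, as used in the derivation.
P(A \mid B) = \frac{P(A \cap B)}{P(B)}, \qquad P(B) > 0,
\qquad\text{and}\qquad
P(A \mid B) = \frac{P(B \mid A)\, P(A)}{P(B)}.
```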
Geary, Nori
2013-02-01
Analysis of the interactive effects of combinations of hormones or other manipulations with qualitatively similar individual effects is an important topic in basic and clinical endocrinology as well as other branches of basic and clinical research related to integrative physiology. Functional, as opposed to mechanistic, analyses of interactions rely on the concept of synergy, which can be defined qualitatively as a cooperative action or quantitatively as a supra-additive effect according to some metric for the addition of different dose-effect curves. Unfortunately, dose-effect curve addition is far from straightforward; rather, it requires the development of an axiomatic mathematical theory. I review the mathematical soundness, face validity, and utility of the most frequently used approaches to supra-additive synergy. These criteria highlight serious problems in the two most common synergy approaches, response additivity and Loewe additivity, which is the basis of the isobole and related response surface approaches. I conclude that there is no adequate, generally applicable, supra-additive synergy metric appropriate for endocrinology or any other field of basic and clinical integrative physiology. I recommend that these metrics be abandoned in favor of the simpler definition of synergy as a cooperative, i.e., nonantagonistic, effect. This simple definition avoids mathematical difficulties, is easily applicable, meets regulatory requirements for combination therapy development, and suffices to advance phenomenological basic research to mechanistic studies of interactions and clinical combination therapy research.
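Of the approaches reviewed here, Loewe additivity is the one usually reduced to a single interaction index. The sketch below computes that index for a hypothetical two-drug combination with assumed Hill dose-response parameters; it illustrates the metric being criticized, not a recommendation to use it.

```python
def hill_effect(dose, emax, ec50, h):
    """Hill dose-response curve for a single agent."""
    return emax * dose ** h / (ec50 ** h + dose ** h)

def hill_inverse(effect, emax, ec50, h):
    """Dose of a single agent needed to reach a given effect level."""
    return ec50 * (effect / (emax - effect)) ** (1.0 / h)

# Hypothetical single-agent dose-response parameters (same Emax for both agents).
A = dict(emax=1.0, ec50=2.0, h=1.0)
B = dict(emax=1.0, ec50=5.0, h=1.0)

# Observed: the combination (d_a, d_b) produces effect e_obs (made-up numbers).
d_a, d_b, e_obs = 0.8, 1.5, 0.5

# Loewe interaction index: d_a/D_A(E) + d_b/D_B(E); 1 = additive, < 1 = supra-additive.
index = d_a / hill_inverse(e_obs, **A) + d_b / hill_inverse(e_obs, **B)
print(f"Loewe interaction index ≈ {index:.2f}")
```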
Defining fitness in an uncertain world.
Crewe, Paul; Gratwick, Richard; Grafen, Alan
2018-04-01
The recently elucidated definition of fitness employed by Fisher in his fundamental theorem of natural selection is combined with reproductive values as appropriately defined in the context of both random environments and continuing fluctuations in the distribution over classes in a class-structured population. We obtain astonishingly simple results, generalisations of the Price Equation and the fundamental theorem, that show natural selection acting only through the arithmetic expectation of fitness over all uncertainties, in contrast to previous studies with fluctuating demography, in which natural selection looks rather complicated. Furthermore, our setting permits each class to have its characteristic ploidy, thus covering haploidy, diploidy and haplodiploidy at the same time; and allows arbitrary classes, including continuous variables such as condition. The simplicity is achieved by focussing just on the effects of natural selection on genotype frequencies: while other causes are present in the model, and the effect of natural selection is assessed in their presence, these causes will have their own further effects on genotype frequencies that are not assessed here. Also, Fisher's uses of reproductive value are shown to have two ambivalences, and a new axiomatic foundation for reproductive value is endorsed. The results continue the formal darwinism project, and extend support for the individual-as-maximising-agent analogy to finite populations with random environments and fluctuating class-distributions. The model may also lead to improved ways to measure fitness in real populations.
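The Price Equation that the paper generalizes can be checked on a toy population. The sketch below verifies that the selection and transmission terms reproduce the observed change in the mean trait; the trait values and fitnesses are made up, and the example is the classical deterministic Price Equation, not the paper's random-environment generalization.

```python
import numpy as np

# Toy population: trait value z_i and realized fitness w_i (offspring counts);
# offspring trait values z'_i (here equal to parental values: no transmission bias).
z = np.array([0.2, 0.5, 0.8, 1.0])
w = np.array([1.0, 2.0, 2.0, 3.0])
z_offspring = z.copy()

w_bar = w.mean()
delta_z = z_offspring - z

# Price Equation: change in mean trait = selection term + transmission term
selection = np.cov(w, z, bias=True)[0, 1] / w_bar
transmission = np.mean(w * delta_z) / w_bar
predicted = selection + transmission

observed = np.sum(w * z_offspring) / np.sum(w) - z.mean()
print(np.isclose(predicted, observed))   # True
```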
Development of the system of reactor thermophysical data on the basis of ontological modelling
NASA Astrophysics Data System (ADS)
Chusov, I. A.; Kirillov, P. L.; Bogoslovskaya, G. P.; Yunusov, L. K.; Obysov, N. A.; Novikov, G. E.; Pronyaev, V. G.; Erkimbaev, A. O.; Zitserman, V. Yu; Kobzev, G. A.; Trachtengerts, M. S.; Fokin, L. R.
2017-11-01
Compilation and processing of thermophysical data has always been an important task for the nuclear industry. The difficulties of the present stage of this activity stem from the sharp increase in data volume and in the number of new materials, as well as from the increased requirements on the reliability of the data used in the nuclear industry. The general trend in fields oriented predominantly toward work with data (materials science, chemistry and others) is a transition to a common infrastructure that integrates separate databases, Web portals and other resources. This infrastructure provides interoperability and the procedures for data exchange, storage and dissemination. A key element of this infrastructure is a domain-specific ontology, which provides a single information model and dictionary for semantic definitions. By formalizing the subject area, the ontology adapts the definitions to the different database schemes and provides the integration of heterogeneous data. An important property of ontologies is the possibility of continually adding new definitions, e.g. to the lists of materials and properties. The extension of the thermophysical data ontology to reactor materials includes the creation of taxonomic dictionaries for thermophysical properties; models for the presentation of data and their uncertainties; the inclusion, along with the parameters of state, of additional factors such as material porosity, burnup rate, irradiation rate and others; and an axiomatics of the properties applicable to the given class of materials.
Principle of Maximum Fisher Information from Hardy’s Axioms Applied to Statistical Systems
Frieden, B. Roy; Gatenby, Robert A.
2014-01-01
Consider a finite-sized, multidimensional system in a parameter state a. The system is in either a state of equilibrium or general non-equilibrium, and may obey either classical or quantum physics. L. Hardy’s mathematical axioms provide a basis for the physics obeyed by any such system. One axiom is that the number N of distinguishable states a in the system obeys N = max. This assumes that N is known as deterministic prior knowledge. However, most observed systems suffer statistical fluctuations, for which N is therefore only known approximately. Then what happens if the scope of the axiom N = max is extended to include such observed systems? It is found that the state a of the system must obey a principle of maximum Fisher information, I = Imax. This is important because many physical laws have been derived, assuming as a working hypothesis that I = Imax. These derivations include uses of the principle of Extreme physical information (EPI). Examples of such derivations were of the De Broglie wave hypothesis, quantum wave equations, Maxwell’s equations, new laws of biology (e.g. of Coulomb force-directed cell development, and of in situ cancer growth), and new laws of economic fluctuation and investment. That the principle I = Imax itself derives, from suitably extended Hardy axioms, thereby eliminates its need to be assumed in these derivations. Thus, uses of I = Imax and EPI express physics at its most fundamental level – its axiomatic basis in math. PMID:24229152
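For a location family, the Fisher information that the principle maximizes is I = ∫ (p')²/p dx. The sketch below evaluates it numerically for a Gaussian and compares with the exact value 1/σ²; it illustrates the quantity I only, not the derivation from Hardy's axioms.

```python
import numpy as np

# Fisher information of a location family, I = integral of (p'(x))^2 / p(x) dx,
# evaluated numerically for a Gaussian and compared with the exact value 1/sigma^2.
sigma = 1.5
x = np.linspace(-12, 12, 200_001)
dx = x[1] - x[0]
p = np.exp(-0.5 * (x / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
dp = np.gradient(p, x)

fisher_numeric = np.sum(dp ** 2 / p) * dx
print(fisher_numeric, 1 / sigma ** 2)    # both ≈ 0.444
```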
Walach, H
2003-08-01
Homeopathy is scientifically banned, both for lack of consistent empirical findings, but more so for lack of a sound theoretical model to explain its purported effects. This paper makes an attempt to introduce an explanatory idea based on a generalized version of quantum mechanics (QM), the weak quantum theory (WQT). WQT uses the algebraic formalism of QM proper, but drops some restrictions and definitions typical for QM. This results in a general axiomatic framework similar to QM, but more generalized and applicable to all possible systems. Most notably, WQT predicts entanglement, which in QM is known as Einstein-Podolsky-Rosen (EPR) correlatedness within quantum systems. According to WQT, this entanglement is not only tied to quantum systems, but is to be expected whenever a global and a local variable describing a system are complementary. This idea is used here to reconstruct homeopathy as an exemplification of generalized entanglement as predicted by WQT. It transpires that homeopathy uses two instances of generalized entanglement: one between the remedy and the original substance (potentiation principle) and one between the individual symptoms of a patient and the general symptoms of a remedy picture (similarity principle). By bringing these two elements together, double entanglement ensues, which is reminiscent of cryptographic and teleportation applications of entanglement in QM proper. Homeopathy could be a macroscopic analogue to quantum teleportation. This model is exemplified and some predictions are derived, which make it possible to test the model. Copyright 2003 S. Karger GmbH, Freiburg
Free Quantum Field Theory from Quantum Cellular Automata
NASA Astrophysics Data System (ADS)
Bisio, Alessandro; D'Ariano, Giacomo Mauro; Perinotti, Paolo; Tosini, Alessandro
2015-10-01
After leading to a new axiomatic derivation of quantum theory (see D'Ariano et al. in Found Phys, 2015), the new informational paradigm is entering the domain of quantum field theory, suggesting a quantum automata framework that can be regarded as an extension of quantum field theory that includes a hypothetical Planck scale, with the usual quantum field theory recovered in the relativistic limit of small wave-vectors. Being derived from simple principles (linearity, unitarity, locality, homogeneity, isotropy, and minimality of dimension), the automata theory is quantum ab initio, and does not assume Lorentz covariance or mechanical notions. Being discrete, it can describe localized states and measurements (unmanageable by quantum field theory), solving all the issues plaguing field theory that originate from the continuum. These features make the theory an ideal framework for quantum gravity, with relativistic covariance and space-time emergent solely from the interactions, and not assumed a priori. The paper presents a synthetic derivation of the automata theory, showing how the principles lead to a description in terms of a quantum automaton over a Cayley graph of a group. Restricting to Abelian groups, we show how the automata recover the Weyl, Dirac and Maxwell dynamics in the relativistic limit. We conclude with some new routes toward the more general scenario of non-Abelian Cayley graphs. The phenomenology arising from the automata theory in the ultra-relativistic domain and the analysis of the corresponding distorted Lorentz covariance are reviewed in Bisio et al. (Found Phys 2015, in this same issue).
The Role of the Community Nurse in Promoting Health and Human Dignity-Narrative Review Article
Muntean, Ana; Tomita, Mihaela; Ungureanu, Roxana
2013-01-01
Background: Population health, as defined by WHO in its constitution, is “a state of complete physical, mental and social wellbeing”. At the basis of human welfare is human dignity. This dimension requires an integrated vision of health care. Bronfenbrenner's ecosystemic vision highlights the unexpected connections between the social macrosystem, based on values, and the microsystem consisting of the individual and the family. The community nurse is expected to carry into the practice of education and care respect for human dignity, the bonds among the values and practices of the community, and the physical health of individuals. In Romania, the promotion of community nursing began in 2002 through a project of the National School of Public Health, Management and Education in Healthcare Bucharest that promoted social inclusion by developing human and institutional resources within community nursing. Community nurses became active in the 10 counties included in the project. Considering respect for human dignity as an axiomatic value for community nurse interventions, we stress the need to develop a primary care network in Romania. The argument is based on an analysis of the concept of human dignity within health care, as well as a secondary analysis of the 2010 health indicators of the 10 counties included in the project. Our conclusions draw attention to the need for community nurses and open directions for new research and developments needed to promote primary health care in Romania. PMID:26060614
A Formal Approach to Domain-Oriented Software Design Environments
NASA Technical Reports Server (NTRS)
Lowry, Michael; Philpot, Andrew; Pressburger, Thomas; Underwood, Ian; Lum, Henry, Jr. (Technical Monitor)
1994-01-01
This paper describes a formal approach to domain-oriented software design environments, based on declarative domain theories, formal specifications, and deductive program synthesis. A declarative domain theory defines the semantics of a domain-oriented specification language and its relationship to implementation-level subroutines. Formal specification development and reuse is made accessible to end-users through an intuitive graphical interface that guides them in creating diagrams denoting formal specifications. The diagrams also serve to document the specifications. Deductive program synthesis ensures that end-user specifications are correctly implemented. AMPHION has been applied to the domain of solar system kinematics through the development of a declarative domain theory, which includes an axiomatization of JPL's SPICELIB subroutine library. Testing over six months with planetary scientists indicates that AMPHION's interactive specification acquisition paradigm enables users to develop, modify, and reuse specifications at least an order of magnitude more rapidly than manual program development. Furthermore, AMPHION synthesizes one to two page programs consisting of calls to SPICELIB subroutines from these specifications in just a few minutes. Test results obtained by metering AMPHION's deductive program synthesis component are examined. AMPHION has been installed at JPL and is currently undergoing further refinement in preparation for distribution to hundreds of SPICELIB users worldwide. Current work to support end-user customization of AMPHION's specification acquisition subsystem is briefly discussed, as well as future work to enable domain-expert creation of new AMPHION applications through development of suitable domain theories.
A quantum probability account of order effects in inference.
Trueblood, Jennifer S; Busemeyer, Jerome R
2011-01-01
Order of information plays a crucial role in the process of updating beliefs across time. In fact, the presence of order effects makes a classical or Bayesian approach to inference difficult. As a result, the existing models of inference, such as the belief-adjustment model, merely provide an ad hoc explanation for these effects. We postulate a quantum inference model for order effects based on the axiomatic principles of quantum probability theory. The quantum inference model explains order effects by transforming a state vector with different sequences of operators for different orderings of information. We demonstrate this process by fitting the quantum model to data collected in a medical diagnostic task and a jury decision-making task. To further test the quantum inference model, a new jury decision-making experiment is developed. Using the results of this experiment, we compare the quantum inference model with two versions of the belief-adjustment model, the adding model and the averaging model. We show that both the quantum model and the adding model provide good fits to the data. To distinguish the quantum model from the adding model, we develop a new experiment involving extreme evidence. The results from this new experiment suggest that the adding model faces limitations when accounting for tasks involving extreme evidence, whereas the quantum inference model does not. Ultimately, we argue that the quantum model provides a more coherent account for order effects that was not possible before. Copyright © 2011 Cognitive Science Society, Inc.
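The core mechanism, order effects from non-commuting operators, can be reproduced in a few lines. The sketch below applies two rank-1 projectors to a state vector in both orders and shows that the resulting "yes, then yes" probabilities differ; the projector angles are arbitrary illustrations, not parameters fitted to the paper's data.

```python
import numpy as np

def projector(theta):
    """Rank-1 projector onto the qubit state cos(theta)|0> + sin(theta)|1>."""
    v = np.array([np.cos(theta), np.sin(theta)])
    return np.outer(v, v)

psi = np.array([1.0, 0.0])            # initial belief state (illustrative)
P_a = projector(np.pi / 5)            # "evidence A" projector
P_b = projector(np.pi / 3)            # "evidence B" projector

# Probability of answering yes to A then yes to B, and in the reverse order.
p_ab = np.linalg.norm(P_b @ P_a @ psi) ** 2
p_ba = np.linalg.norm(P_a @ P_b @ psi) ** 2
print(p_ab, p_ba)                     # differ because P_a and P_b do not commute
```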
Pastor-Bernier, Alexandre; Plott, Charles R.; Schultz, Wolfram
2017-01-01
Revealed preference theory provides axiomatic tools for assessing whether individuals make observable choices “as if” they are maximizing an underlying utility function. The theory evokes a tradeoff between goods whereby individuals improve themselves by trading one good for another good to obtain the best combination. Preferences revealed in these choices are modeled as curves of equal choice (indifference curves) and reflect an underlying process of optimization. These notions have far-reaching applications in consumer choice theory and impact the welfare of human and animal populations. However, they lack the empirical implementation in animals that would be required to establish a common biological basis. In a design using basic features of revealed preference theory, we measured in rhesus monkeys the frequency of repeated choices between bundles of two liquids. For various liquids, the animals’ choices were compatible with the notion of giving up a quantity of one good to gain one unit of another good while maintaining choice indifference, thereby implementing the concept of marginal rate of substitution. The indifference maps consisted of nonoverlapping, linear, convex, and occasionally concave curves with typically negative, but also sometimes positive, slopes depending on bundle composition. Out-of-sample predictions using homothetic polynomials validated the indifference curves. The animals’ preferences were internally consistent in satisfying transitivity. Change of option set size demonstrated choice optimality and satisfied the Weak Axiom of Revealed Preference (WARP). These data are consistent with a version of revealed preference theory in which preferences are stochastic; the monkeys behaved “as if” they had well-structured preferences and maximized utility. PMID:28202727
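A finite-option-set version of the WARP test used here is easy to state in code. The sketch below checks a small, invented choice data set for violations; it is a schematic of the consistency test, not the authors' analysis pipeline.

```python
from itertools import combinations

# Each observation: the option set offered and the bundle actually chosen
# (bundles are tuples of liquid quantities; all numbers are illustrative).
observations = [
    ({(2, 1), (1, 2), (1, 1)}, (2, 1)),
    ({(1, 2), (1, 1)},         (1, 2)),
    ({(2, 1), (1, 2)},         (2, 1)),
]

def satisfies_warp(obs):
    """WARP: if x is chosen when y is available, y must not be chosen when x is available."""
    for (set_i, x), (set_j, y) in combinations(obs, 2):
        if x != y and y in set_i and x in set_j:
            return False    # x revealed preferred to y, yet y chosen over x elsewhere
    return True

print(satisfies_warp(observations))   # True for this data set
```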
A computational framework for modeling targets as complex adaptive systems
NASA Astrophysics Data System (ADS)
Santos, Eugene; Santos, Eunice E.; Korah, John; Murugappan, Vairavan; Subramanian, Suresh
2017-05-01
Modeling large military targets is a challenge as they can be complex systems encompassing myriad combinations of human, technological, and social elements that interact, leading to complex behaviors. Moreover, such targets have multiple components and structures, extending across multiple spatial and temporal scales, and are in a state of change, either in response to events in the environment or changes within the system. Complex adaptive system (CAS) theory can help in capturing the dynamism, interactions, and more importantly various emergent behaviors, displayed by the targets. However, a key stumbling block is incorporating information from various intelligence, surveillance and reconnaissance (ISR) sources, while dealing with the inherent uncertainty, incompleteness and time criticality of real world information. To overcome these challenges, we present a probabilistic reasoning network based framework called complex adaptive Bayesian Knowledge Base (caBKB). caBKB is a rigorous, overarching and axiomatic framework that models two key processes, namely information aggregation and information composition. While information aggregation deals with the union, merger and concatenation of information and takes into account issues such as source reliability and information inconsistencies, information composition focuses on combining information components where such components may have well defined operations. Since caBKBs can explicitly model the relationships between information pieces at various scales, it provides unique capabilities such as the ability to de-aggregate and de-compose information for detailed analysis. Using a scenario from the Network Centric Operations (NCO) domain, we will describe how our framework can be used for modeling targets with a focus on methodologies for quantifying NCO performance metrics.
Some considerations on the definition of risk based on concepts of systems theory and probability.
Andretta, Massimo
2014-07-01
The concept of risk has been applied in many modern science and technology fields. Despite its successes in many applicative fields, there is still not a well-established vision and universally accepted definition of the principles and fundamental concepts of the risk assessment discipline. As emphasized recently, the risk fields suffer from a lack of clarity on their scientific bases that can define, in a unique theoretical framework, the general concepts in the different areas of application. The aim of this article is to make suggestions for another perspective of risk definition that could be applied and, in a certain sense, generalize some of the previously known definitions (at least in the fields of technical and scientific applications). By drawing on my experience of risk assessment in different applicative situations (particularly in the risk estimation for major industrial accidents, and in the health and ecological risk assessment for contaminated sites), I would like to revise some general and foundational concepts of risk analysis in as consistent a manner as possible from the axiomatic/deductive point of view. My proposal is based on the fundamental concepts of the systems theory and of the probability. In this way, I try to frame, in a single, broad, and general theoretical context some fundamental concepts and principles applicable in many different fields of risk assessment. I hope that this article will contribute to the revitalization and stimulation of useful discussions and new insights into the key issues and theoretical foundations of risk assessment disciplines. © 2013 Society for Risk Analysis.
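One widely used quantitative rendering of risk, compatible with the probabilistic framing discussed here, is the set of scenario/probability/consequence triplets summarized by an expected consequence. The sketch below illustrates that rendering with invented numbers; it is not necessarily the definition the article proposes.

```python
# Risk as a set of triplets (scenario, probability per year, consequence), summarized
# by the expected consequence (an illustration of one common quantitative definition,
# not necessarily the one proposed in the article).
scenarios = [
    ("tank rupture",        1e-4, 50.0),   # probability/yr, consequence (e.g., injuries)
    ("pipeline leak",       1e-3,  5.0),
    ("loading-arm failure", 5e-3,  0.5),
]

expected_consequence = sum(p * c for _, p, c in scenarios)
print(f"Expected consequence ≈ {expected_consequence:.4f} per year")
```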
Pastor-Bernier, Alexandre; Plott, Charles R; Schultz, Wolfram
2017-03-07
Revealed preference theory provides axiomatic tools for assessing whether individuals make observable choices "as if" they are maximizing an underlying utility function. The theory evokes a tradeoff between goods whereby individuals improve themselves by trading one good for another good to obtain the best combination. Preferences revealed in these choices are modeled as curves of equal choice (indifference curves) and reflect an underlying process of optimization. These notions have far-reaching applications in consumer choice theory and impact the welfare of human and animal populations. However, they lack the empirical implementation in animals that would be required to establish a common biological basis. In a design using basic features of revealed preference theory, we measured in rhesus monkeys the frequency of repeated choices between bundles of two liquids. For various liquids, the animals' choices were compatible with the notion of giving up a quantity of one good to gain one unit of another good while maintaining choice indifference, thereby implementing the concept of marginal rate of substitution. The indifference maps consisted of nonoverlapping, linear, convex, and occasionally concave curves with typically negative, but also sometimes positive, slopes depending on bundle composition. Out-of-sample predictions using homothetic polynomials validated the indifference curves. The animals' preferences were internally consistent in satisfying transitivity. Change of option set size demonstrated choice optimality and satisfied the Weak Axiom of Revealed Preference (WARP). These data are consistent with a version of revealed preference theory in which preferences are stochastic; the monkeys behaved "as if" they had well-structured preferences and maximized utility.
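As a rough illustration of the consistency test mentioned above, the sketch below checks a deterministic version of the Weak Axiom of Revealed Preference on a small set of bundle choices. It is not the authors' stochastic analysis pipeline; the bundle labels and option sets are invented, and repeated or stochastic choices would require a more careful treatment.

    def violates_warp(observations):
        """observations: list of (chosen_bundle, option_set) pairs.
        WARP is violated if x is chosen when y was available, and elsewhere
        y is chosen when x was available (with x != y)."""
        revealed = set()  # (x, y): x directly revealed preferred to y
        for chosen, options in observations:
            for other in options:
                if other != chosen:
                    revealed.add((chosen, other))
        return any((y, x) in revealed for (x, y) in revealed)

    # Bundles encoded as labels, e.g. "A2B1" = 2 units of liquid A, 1 unit of liquid B.
    obs = [("A2B1", ("A2B1", "A1B2")),
           ("A1B2", ("A1B2", "A0B3"))]
    print(violates_warp(obs))   # False: these choices are internally consistent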
Mockrin, Miranda H; Fishler, Hillary K; Stewart, Susan I
2018-05-15
Becoming a fire adapted community that can coexist with wildfire is envisioned as a continuous, iterative process of adaptation, but it is unclear how communities may pursue adaptation. Experience with wildfire and other natural hazards suggests that disasters may open a "window of opportunity" leading to local government policy changes. We examined how destructive wildfire affected progress toward becoming fire adapted in eight locations in the United States. We found that community-level adaptation following destructive fires is most common where destructive wildfire is novel and there is already government capacity and investment in wildfire regulation and land use planning. External funding, staff capacity, and the presence of issue champions combined to bring about change after wildfire. Locations with long histories of destructive wildfire, extensive previous investment in formal wildfire regulation and mitigation, or little government and community capacity to manage wildfire saw fewer changes. Across diverse settings, communities consistently used the most common tools and actions for wildfire mitigation and planning. Nearly all sites reported changes in wildfire suppression, emergency response, and hazard planning documents. Expansion in voluntary education and outreach programs to increase defensible space was also common, occurring in half of our sites, but land use planning and regulations remained largely unchanged. Adaptation at the community and local governmental level therefore may not axiomatically follow from each wildfire incident, nor easily incorporate formal approaches to minimizing land use and development in hazardous environments, but in many sites wildfire was a focusing event that inspired reflection and adaptation.
Mass media and rational domination: a critical review of a dominant paradigm.
Moemeka, A
1988-01-01
The mass media exert powerful influences on the way people perceive, think about, and ultimately act in their world. Despite agreement on this fact, communication scholars are divided into 2 opposing camps. The functionalists view the mass media as instruments for providing the framework for the education and enlightenment of the masses socially, economically, and politically. In contrast, the conflict and critical theorists see the mass media as instruments for rational domination and manipulation of the masses through ideological control. Because the mass media are part of the social system and their operators belong to the ruling elite class, they invariably support the ideology of the power structure through justifying the sociopolitical status quo. It is axiomatic that the mass media are capable of diverting people's attention and consciousness away from sociopolitical issues by filling their leisure time with escapist forms of entertainment. The political structure is fully aware of the potential of the mass media to effect cognitive changes among individuals and to structure their thinking. As long as social, political, and economic status determine who is important and who is not, the media will continue to be instruments of control. However, this control function can be weakened when media infrastructure and administration are decentralized and closer to the masses. Then, solutions to the problems of the masses are the priority targets of media contents. The democratic-participant media theory calls for the right of access to the mass media for citizens and the rights of the masses to be served by the media according to their own self-determined needs.
[PEDIATRIC GASTROENTEROLOGY: ORIGINS, PROBLEMS, AND PROSPECTS OF THE RESEARCH].
Zaprudnov, A M; Kharitonova, L A; Grigoriev, K I; Bogomaz, L V
2015-01-01
The nomenclature of digestive diseases in children has been supplemented by "new" diseases: of the esophagus--gastroesophageal reflux disease (GERD), Barrett's esophagus, Zenker's diverticulum; of the stomach and duodenum--gastroduodenitis, peptic ulcer disease, polyps, ectopic pancreas in the stomach wall; of the intestine--jejunitis, ileocolitis, Crohn's disease, celiac disease, bacterial overgrowth syndrome in the small intestine; of the biliary tract--cholelithiasis, gallbladder cholesterosis, anomalies of the biliary tract; of the pancreas--acute and chronic pancreatitis, annular pancreas (2). The features of gastrointestinal diseases were established in children exposed to factors that do not always positively affect the growing organism. These features include: the presence of an allergic background; a high level of neuro-autonomic and psycho-emotional changes in modern children, not only in schoolchildren but even in preschoolers; polymorbidity, or a combination (syntropy) of lesions of the digestive system; adverse outcomes of certain diseases, such as chronicization and the development of complications, and, as a consequence, a high risk of disability in children; and the "rejuvenation" of certain diseases of the digestive system (cholelithiasis, gallbladder cholesterosis, Crohn's disease) typical for adults. It is important to emphasize the clinical and social importance of gastroenterological diseases in childhood. It is axiomatic that the origins of many diseases of the digestive organs in adults lie in childhood. Early manifestation of certain diseases, such as peptic ulcer disease, gluten enteropathy, and Crohn's disease, significantly impacts the quality of life of sick children and their parents. It is also worth emphasizing the high costs of medical and prophylactic (tertiary prevention) activities using the latest generations of drugs. All this poses problems for both applied and scientific pediatric gastroenterology.
[Concepts of rational taxonomy].
Pavlinov, I Ia
2011-01-01
The problems are discussed related to the development of concepts of rational taxonomy and rational classifications (taxonomic systems) in biology. Rational taxonomy is based on the assumption that the key characteristic of rationality is deductive inference of certain partial judgments about the reality under study from other judgments taken as more general and a priori true. Accordingly, two forms of rationality are distinguished--ontological and epistemological. The former implies inference of classification properties from general (essential) properties of the reality being investigated. The latter implies inference of the partial rules of judgments about classifications from more general (formal) rules. The following principal concepts of ontologically rational biological taxonomy are considered: the "crystallographic" approach; inference of the orderliness of organismal diversity from general laws of Nature; inference of that orderliness from the orderliness of ontogenetic development programs; and approaches based on the concept of natural kind and Cassirer's series theory, on the systemic concept, and on the idea of periodic systems. Various concepts of ontologically rational taxonomy can be generalized by the idea of causal taxonomy, according to which any biologically sound classification is founded on a contentwise model of biological diversity that includes explicit indication of the general causes responsible for that diversity. It is asserted that each category of general causation and its respective background model may serve as a basis for a particular ontologically rational taxonomy as a distinctive research program. Concepts of epistemologically rational taxonomy and classifications (taxonomic systems) can be interpreted in terms of the application of certain epistemological criteria for substantiating the scientific status of taxonomy in general and of taxonomic systems in particular. These concepts include: consideration of the consistency of taxonomy from the standpoint of inductive and hypothetico-deductive argumentation schemes, and of such fundamental criteria of the naturalness of classifications as their prognostic capabilities; and the foundation of a theory of "general taxonomy" as a "general logic", including elements of the axiomatic method. The latter concept constitutes the core of the program of general classiology; it is inconsistent due to the absence of anything like a "general logic". It is asserted that elaboration of a theory of taxonomy as a biological discipline based on the formal principles of epistemological rationality is not feasible. Instead, it is to be elaborated as an ontologically rational one, based on biologically sound metatheories about the causes of biological diversity.
Idealized Computational Models for Auditory Receptive Fields
Lindeberg, Tony; Friberg, Anders
2015-01-01
We present a theory by which idealized models of auditory receptive fields can be derived in a principled axiomatic manner, from a set of structural properties to (i) enable invariance of receptive field responses under natural sound transformations and (ii) ensure internal consistency between spectro-temporal receptive fields at different temporal and spectral scales. For defining a time-frequency transformation of a purely temporal sound signal, it is shown that the framework allows for a new way of deriving the Gabor and Gammatone filters as well as a novel family of generalized Gammatone filters, with additional degrees of freedom to obtain different trade-offs between the spectral selectivity and the temporal delay of time-causal temporal window functions. When applied to the definition of a second layer of receptive fields from a spectrogram, it is shown that the framework leads to two canonical families of spectro-temporal receptive fields, in terms of spectro-temporal derivatives of either spectro-temporal Gaussian kernels for non-causal time or a cascade of time-causal first-order integrators over the temporal domain and a Gaussian filter over the log-spectral domain. For each filter family, the spectro-temporal receptive fields can be either separable over the time-frequency domain or be adapted to local glissando transformations that represent variations in logarithmic frequencies over time. Within each domain of either non-causal or time-causal time, these receptive field families are derived by uniqueness from the assumptions. It is demonstrated how the presented framework allows for computation of basic auditory features for audio processing and that it leads to predictions about auditory receptive fields with good qualitative similarity to biological receptive fields measured in the inferior colliculus (ICC) and primary auditory cortex (A1) of mammals. PMID:25822973
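For readers unfamiliar with the filter families named above, the snippet below generates the textbook forms of a gammatone and a Gabor filter in discrete time. These are the standard definitions, not the paper's generalized Gammatone family or its time-causal kernels; the centre frequency, bandwidth and window parameters are arbitrary.

    import numpy as np

    def gammatone(t, f, b, n=4):
        """Standard gammatone impulse response: t^(n-1) * exp(-2*pi*b*t) * cos(2*pi*f*t) for t >= 0."""
        return np.where(t >= 0, t**(n - 1) * np.exp(-2 * np.pi * b * t) * np.cos(2 * np.pi * f * t), 0.0)

    def gabor(t, f, sigma):
        """Gabor filter: Gaussian window times a sinusoid (non-causal in time)."""
        return np.exp(-t**2 / (2 * sigma**2)) * np.cos(2 * np.pi * f * t)

    fs = 16000.0
    t = np.arange(0, 0.05, 1 / fs)          # 50 ms at 16 kHz
    g1 = gammatone(t, f=1000.0, b=125.0)    # 1 kHz centre frequency, 125 Hz bandwidth parameter
    g2 = gabor(t - 0.025, f=1000.0, sigma=0.004)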
The Oedipus Cycle: developmental mythology, Greek tragedy, and the sociology of knowledge.
Datan, N
1988-01-01
The Oedipus complex of Freud is based on the inevitability of the tragic fate of a man who fled his home to escape the prophecy of parricide. Thus, he fulfilled it by killing a stranger who proved to be his father. As Freud does, this consideration of the tragedy of Oedipus takes as its point of departure the inevitability of the confrontation between father and son. Where Freud looks to the son, however, I look to the father, who set the tragedy in motion by attempting to murder his infant son. Themes ignored in developmental theory but axiomatic in gerontology are considered in this study of the elder Oedipus. The study begins by noting that Oedipus ascended the throne of Thebes not by parricide but by answering the riddle of the Sphinx and affirming the continuity of the life cycle which his father denied. In the second tragedy of the Oedipus Cycle of Sophocles, Oedipus at Colonus, this affirmation is maintained. As Oedipus the elder accepts the infirmities of old age and the support of his daughter Antigone, Oedipus the king proves powerful up to the very end of his life when he gives his blessing not to the sons who had exiled him from Thebes, but to King Theseus who shelters him in his old age. Thus, the Oedipus cycle, in contrast to the "Oedipus complex," represents not the unconscious passions of the small boy, but rather the awareness of the life cycle in the larger context of the succession of the generations and their mutual interdependence. These themes are illuminated by a fuller consideration of the tragedy of Oedipus.
Stochastic Geometry and Quantum Gravity: Some Rigorous Results
NASA Astrophysics Data System (ADS)
Zessin, H.
The aim of these lectures is to give a short introduction to some recent developments in stochastic geometry which have one of their origins in simplicial gravity theory (see Regge, Nuovo Cimento 19: 558-571, 1961). The aim is to define and construct rigorously point processes on spaces of Euclidean simplices in such a way that the configurations of these simplices are simplicial complexes. The main interest is then concentrated on their curvature properties. We illustrate certain basic ideas from a mathematical point of view. An excellent representation of this area can be found in Schneider and Weil (Stochastic and Integral Geometry, Springer, Berlin, 2008. German edition: Stochastische Geometrie, Teubner, 2000). In Ambjørn et al. (Quantum Geometry, Cambridge University Press, Cambridge, 1997) you will find a beautiful account from the physical point of view. More recent developments in this direction can be found in Ambjørn et al. ("Quantum gravity as sum over spacetimes", Lect. Notes Phys. 807, Springer, Heidelberg, 2010). After an informal axiomatic introduction into the conceptual foundations of Regge's approach, the first lecture recalls the concepts and notations used. It presents the fundamental zero-infinity law of stochastic geometry and the construction of cluster processes based on it. The second lecture presents the main mathematical object, i.e. Poisson-Delaunay surfaces possessing an intrinsic random metric structure. The third and fourth lectures discuss their ergodic behaviour and present the two-dimensional Regge model of pure simplicial quantum gravity. We terminate with the formulation of basic open problems. Proofs are given in detail only in a few cases. In general the main ideas are developed. Sufficiently complete references are given.
Mammen, Jens
2016-09-01
The paper is a reply to commentaries on "Activity theories and the Ontology of Psychology: Learning from Danish and Russian Experiences" (Mammen and Mironenko 2015). At the same time it is an attempt to reply to more general issues raised by the commentators and to further develop some general ideas from our paper, with a focus on the introduction of the new analytical concepts of sense and choice categories. These concepts have been elaborated in an axiomatic frame in (Mammen 2016), and the present paper thus also points forward to that work and supports it with examples from research on adult human relations of love and affection and on infant cognitive development. A few examples from myth and literature are referred to as well. The ambition is to introduce new analytical tools across schools and domains of psychology which open for theoretical inclusion of new phenomena and re-structuring of well-known ones. The hope is to surmount some problems, e.g. the dilemma between dualism and reductionism, which have been obstacles in the search for conceptual and methodological coherence in psychology. In the first place the hope is also to sharpen the analytical, critical and practical potential of psychology as a science. The ambition is not, here and now, to develop a comprehensive general theory as a container for the huge amount of empirical results collected using very heterogeneous criteria for what belongs to the domain of psychology and very heterogeneous conceptual frames. Here we still need some patience, following the lesson from natural science, step by step including new domains as the conceptual and practical frames expand, but on the other hand not excluding anything a priori.
Towards an ontological representation of morbidity and mortality in Description Logics.
Santana, Filipe; Freitas, Fred; Fernandes, Roberta; Medeiros, Zulma; Schober, Daniel
2012-09-21
Despite the high coverage of biomedical ontologies, very few sound definitions of death can be found. Nevertheless, this concept has its relevance in epidemiology, such as for data integration within mortality notification systems. We here introduce an ontological representation of the complex biological qualities and processes that inhere in organisms transitioning from life to death. We further characterize them by causal processes and their temporal borders. Several representational difficulties were faced, mainly regarding kinds of processes with blurred or fiat borders that change their type in a continuous rather than discrete mode. Examples of such hard to grasp concepts are life, death and its relationships with injuries and diseases. We illustrate an iterative optimization of definitions within four versions of the ontology, so as to stress the typical problems encountered in representing complex biological processes. We point out possible solutions for representing concepts related to biological life cycles, preserving identity of participating individuals, i.e. for a patient in transition from life to death. This solution however required the use of extended description logics not yet supported by tools. We also focus on the interdependencies and need to change further parts if one part is changed. The axiomatic definition of mortality we introduce allows the description of biologic processes related to the transition from healthy to diseased or injured, and up to a final death state. Exploiting such definitions embedded into descriptions of pathogen transmissions by arthropod vectors, the complete sequence of infection and disease processes can be described, starting from the inoculation of a pathogen by a vector, until the death of an individual, preserving the identity of the patient.
A gist of comprehensive review of hadronic chemistry and its applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tangde, Vijay M.
20th century theories of Quantum Mechanics and Quantum Chemistry are exactly valid only when considered to represent atomic structures. When considering the more general aspects of atomic combinations, these theories fail to explain all the related experimental data from first, unadulterated axiomatic principles. According to Quantum Chemistry, two valence electrons should repel each other, and as such there is no mathematical representation of a strong attractive force between such valence electrons. In view of these and other insufficiencies of Quantum Chemistry, the Italian-American scientist Professor Ruggero Maria Santilli, during his more than five decades of dedicated and sustained research, has denounced the fact that quantum chemistry is mostly based on mere nomenclatures. Professor R. M. Santilli first formulated the iso-, geno- and hyper-mathematics [1, 2, 3, 4] that helped in understanding numerous diversified problems and in removing inadequacies in most of the established and celebrated theories of 20th century physics and chemistry. This involves the isotopic, genotopic, etc. lifting of Lie algebra that generated Lie-admissible mathematics to properly describe irreversible processes. The studies on Hadronic Mechanics in general, and chemistry in particular, based on Santilli's mathematics [3, 4, 5] have for the first time removed very fundamental limitations of quantum chemistry [2, 6, 7, 8]. In the present discussion, a comprehensive review of Hadronic Chemistry is presented, covering the completion of Quantum Chemistry via the addition of effects at distances of the order of 1 fm (only), which are assumed to be non-linear, non-local, non-potential, non-Hamiltonian and thus non-unitary; the stepwise successes of Hadronic Chemistry; and its application in the development of a new chemical species called Magnecules.
Quantum theory and human perception of the macro-world.
Aerts, Diederik
2014-01-01
We investigate the question of 'why customary macroscopic entities appear to us humans as they do, i.e., as bounded entities occupying space and persisting through time', starting from our knowledge of quantum theory, how it affects the behavior of such customary macroscopic entities, and how it influences our perception of them. For this purpose, we approach the question from three perspectives. Firstly, we look at the situation from the standard quantum angle, more specifically the de Broglie wavelength analysis of the behavior of macroscopic entities, indicate how a problem with spin and identity arises, and illustrate how both play a fundamental role in well-established experimental quantum-macroscopical phenomena, such as Bose-Einstein condensates. Secondly, we analyze how the question is influenced by our result in axiomatic quantum theory, which proves that standard quantum theory is structurally incapable of describing separated entities. Thirdly, we put forward our new 'conceptual quantum interpretation', including a highly detailed reformulation of the question to confront the new insights and views that arise with the foregoing analysis. At the end of the final section, a nuanced answer is given that can be summarized as follows. The specific and very classical perception of human seeing (light as a geometric theory) and human touching (ruled only by Pauli's exclusion principle) plays a role in our perception of macroscopic entities as ontologically stable entities in space. To ascertain quantum behavior in such macroscopic entities, we will need measuring apparatuses capable of its detection. Future experimental research will have to show if sharp quantum effects, as they occur in smaller entities, appear to be ontological aspects of customary macroscopic entities. It remains a possibility that standard quantum theory is an incomplete theory, and hence incapable of coping ultimately with separated entities, meaning that a more general theory will be needed.
Geoscience Australia Publishes Sample Descriptions using W3C standards
NASA Astrophysics Data System (ADS)
Car, N. J.; Cox, S. J. D.; Bastrakova, I.; Wyborn, L. A.
2017-12-01
The recent revision of the W3C Semantic Sensor Network Ontology (SSN) has focused on three key concerns: (1) extending the scope of the ontology to include sampling and actuation as well as observation and sensing; (2) modularizing the ontology into a simple core with few classes and properties and little formal axiomatization, supplemented by additional modules that formalize the semantics and extend the scope; and (3) alignments with several existing applications and upper ontologies. These enhancements mean that SSN can now be used as the basis for publishing descriptions of geologic samples as Linked Data. Geoscience Australia maintains a database of about three million samples, collected over 50 years through projects ranging from ocean core to terrestrial rock and hydrochemistry borehole projects, almost all of which are held in the special-purpose GA samples repository. Access to descriptions of these samples as Linked Data has recently been enabled. The sample descriptions can be viewed in various machine-readable formalizations, including IGSN (XML & RDF), Dublin Core (XML & RDF) and SSN (RDF), as well as web landing pages for people. Of particular importance is the support for encoding relationships between samples, and between samples and the surveys, boreholes, and traverses to which they are related, as well as between samples processed for analytical purposes and their parents, siblings, and back to the original field samples. The SSN extension for Sample Relationships provides an extensible, semantically rich mechanism to capture any relationship necessary to explain the provenance of observation results obtained from samples. Sample citation is facilitated through the use of URI-based persistent identifiers which resolve to the samples' landing pages. The sample system also allows PROV pingbacks to be received for samples when users record provenance for their actions.
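A minimal sketch of what such a Linked Data sample description might look like, using rdflib and the SOSA/SSN vocabulary, is given below. The identifiers and the example namespace are placeholders, not Geoscience Australia's actual URI scheme, and the choice of sosa:isSampleOf to express the parent relationship is only one plausible modelling option, not necessarily the one used by GA or the Sample Relationships extension.

    from rdflib import Graph, Literal, Namespace
    from rdflib.namespace import RDF, RDFS

    SOSA = Namespace("http://www.w3.org/ns/sosa/")
    EX = Namespace("http://example.org/sample/")   # placeholder identifier scheme

    g = Graph()
    g.bind("sosa", SOSA)

    child = EX["IGSN-CHILD-001"]                   # hypothetical identifiers
    parent = EX["IGSN-PARENT-001"]

    g.add((child, RDF.type, SOSA.Sample))
    g.add((parent, RDF.type, SOSA.Sample))
    g.add((child, SOSA.isSampleOf, parent))        # provenance link between samples
    g.add((child, RDFS.label, Literal("Split of borehole core sample")))

    print(g.serialize(format="turtle"))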
Forghany, Saeed; Sadeghi-Demneh, Ebrahim; Trinler, Ursula; Onmanee, Pornsuree; Dillon, Michael P; Baker, Richard
2018-06-01
Education and training in prosthetics and orthotics typically comply with International Society for Prosthetics and Orthotics standards based on three categories of prosthetic and orthotic professionals. This scoping study sought to describe the evidence base available to answer the question, How are prosthetic and orthotic services influenced by the training of staff providing them? Scoping review. A structured search of the peer-reviewed literature catalogued in major electronic databases yielded 3039 papers. Following review of title and abstract, 93 articles were considered relevant. Full-text review reduced this number to 25. Only two articles were identified as providing direct evidence of the effects of training and education on service provision. While both suggested that there was an impact, it is difficult to see how the more specific conclusions of either could be generalised. The other 23 articles provide a useful background to a range of issues including the specification of competencies that training programmes should deliver (3 articles), descriptions of a range of training programmes and the effects of training and education on student knowledge and skills. Although it is considered axiomatic that service quality depends on practitioner education and training, there is insufficient evidence to establish whether levels of training and education in prosthetics and orthotics have an effect on the quality of prosthetic and orthotic services. Clinical relevance: There is very little evidence about the effects of training and education of prosthetists and orthotists on service quality. While this is a somewhat negative finding, we feel that it is important to bring this to the attention of the prosthetics and orthotics community.
Ecosystem services of soil biota: In what context is a focus on soil biota meaningful?
NASA Astrophysics Data System (ADS)
Baveye, Philippe C.
2016-04-01
Over the last few years, the topic of the ecosystem services of soils has attracted considerable attention, in particular among researchers working on soil biota. A direct link is established explicitly in numerous articles between soil biota and specific ecosystem services, or between soil biodiversity and ecosystem services. A careful review of the literature indicates however that these links are, more often than not, strictly axiomatic, rather than based on actual observations. In fact, there are still at the moment virtually no measurements of ecosystem services of soils at any scale, measurements that would be required to establish such links. Furthermore, at a conceptual level, it is not clear to what extent the effect of soil biota on the delivery of ecosystem services can be separated from the contribution of other components of soil systems. Soil microorganisms, in particular, proliferate and are metabolically active in a pore space whose characteristics and dynamics could in principle have a profound effect on their activity. So also could the composition and spatial distribution of soil organic matter, or the spatial pattern of plant root propagation. By emphasizing the role of soil biota, to the exclusion of other aspects of soil systems, there is a risk that important features of the provision of ecosystem services by soils will be missed. In this talk (based in part on a workshop organized recently in France and on a follow-up review article), an analysis of this general problem will be presented, as well as suggestions of how to avoid it by promoting truly interdisciplinary research involving not only soil ecologists but also physicists, hydrologists, and chemists.
A new evaluation method research for fusion quality of infrared and visible images
NASA Astrophysics Data System (ADS)
Ge, Xingguo; Ji, Yiguo; Tao, Zhongxiang; Tian, Chunyan; Ning, Chengda
2017-03-01
In order to objectively evaluate the fusion effect of infrared and visible images, a fusion evaluation method based on energy-weighted average structural similarity and an edge information retention value is proposed to address the drawbacks of existing evaluation methods. The evaluation index of this method is given, and fusion results obtained with different algorithms and in different environments are evaluated experimentally on the basis of this index. The experimental results show that the objective evaluation index is consistent with subjective evaluation results, which indicates that the method is a practical and effective fused-image quality evaluation method.
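The abstract does not give the formula for its edge information retention value, so the following is a deliberately simplified stand-in: a correlation between the Sobel gradient magnitudes of a source image and of the fused image, in the spirit of gradient-preservation metrics. Function and variable names are illustrative only.

    import numpy as np
    from scipy.ndimage import sobel

    def edge_retention(source, fused):
        """Crude edge-information retention: correlation between the gradient
        magnitudes of a source image and of the fused image."""
        def grad_mag(img):
            img = img.astype(float)
            gx = sobel(img, axis=0)
            gy = sobel(img, axis=1)
            return np.hypot(gx, gy)
        gs, gf = grad_mag(source).ravel(), grad_mag(fused).ravel()
        return float(np.corrcoef(gs, gf)[0, 1])

A full index along the lines described above would compute such a term for both the infrared and the visible source, combine the two, and weight local structural-similarity terms by local energy; those weighting details are not specified in the abstract and are therefore omitted here.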
Odendaal, Willem; Atkins, Salla; Lewin, Simon
2016-12-15
Formative programme evaluations assess intervention implementation processes, and are seen widely as a way of unlocking the 'black box' of any programme in order to explore and understand why a programme functions as it does. However, few critical assessments of the methods used in such evaluations are available, and there are especially few that reflect on how well the evaluation achieved its objectives. This paper describes a formative evaluation of a community-based lay health worker programme for TB and HIV/AIDS clients across three low-income communities in South Africa. It assesses each of the methods used in relation to the evaluation objectives, and offers suggestions on ways of optimising the use of multiple, mixed-methods within formative evaluations of complex health system interventions. The evaluation's qualitative methods comprised interviews, focus groups, observations and diary keeping. Quantitative methods included a time-and-motion study of the lay health workers' scope of practice and a client survey. The authors conceptualised and conducted the evaluation, and through iterative discussions, assessed the methods used and their results. Overall, the evaluation highlighted programme issues and insights beyond the reach of traditional single methods evaluations. The strengths of the multiple, mixed-methods in this evaluation included a detailed description and nuanced understanding of the programme and its implementation, and triangulation of the perspectives and experiences of clients, lay health workers, and programme managers. However, the use of multiple methods needs to be carefully planned and implemented as this approach can overstretch the logistic and analytic resources of an evaluation. For complex interventions, formative evaluation designs including multiple qualitative and quantitative methods hold distinct advantages over single method evaluations. However, their value is not in the number of methods used, but in how each method matches the evaluation questions and the scientific integrity with which the methods are selected and implemented.
Aquifer water abundance evaluation using a fuzzy- comprehensive weighting method
NASA Astrophysics Data System (ADS)
Wei, Z.
2016-08-01
Aquifer water abundance evaluation is a highly relevant issue that has been researched for many years. Despite prior research, problems with the conventional evaluation method remain. This paper establishes an aquifer water abundance evaluation method that combines fuzzy evaluation with a comprehensive weighting method to overcome both the subjectivity and the lack of conformity in determining weights by pure data analysis alone. First, this paper introduces the principle of the fuzzy-comprehensive weighting method. Second, the example of well field no. 3 (of a coalfield) is used to illustrate the method's process. The evaluation results show that this method can more suitably meet the real requirements of aquifer water abundance assessment, leading to more precise and accurate evaluations. Ultimately, this paper provides a new method for aquifer water abundance evaluation.
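A minimal sketch of the fuzzy synthesis step with comprehensively weighted indicators is shown below, assuming a weighted-average fuzzy operator and an equal blend of subjective and objective weights. The indicator names, membership values and weights are invented for illustration and do not come from the well field no. 3 case.

    import numpy as np

    # Membership matrix R: rows = indicators (e.g., aquifer thickness, core recovery,
    # unit inflow), columns = water-abundance grades (weak, medium, strong).
    R = np.array([[0.1, 0.3, 0.6],
                  [0.2, 0.5, 0.3],
                  [0.4, 0.4, 0.2]])

    # Comprehensive weights: a blend of subjective (expert) and objective (data-driven) weights.
    w_subj = np.array([0.5, 0.3, 0.2])
    w_obj = np.array([0.3, 0.4, 0.3])
    alpha = 0.5
    w = alpha * w_subj + (1 - alpha) * w_obj

    B = w @ R                                  # fuzzy synthesis (weighted-average operator)
    grades = ["weak", "medium", "strong"]
    print(grades[int(np.argmax(B))], B / B.sum())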
10 CFR 963.16 - Postclosure suitability evaluation method.
Code of Federal Regulations, 2014 CFR
2014-01-01
... § 963.16 Postclosure suitability evaluation method (Determination, Methods, and Criteria): (a) DOE will evaluate postclosure suitability using the total system performance assessment method. DOE will conduct a total system...
10 CFR 963.16 - Postclosure suitability evaluation method.
Code of Federal Regulations, 2011 CFR
2011-01-01
... § 963.16 Postclosure suitability evaluation method (Determination, Methods, and Criteria): (a) DOE will evaluate postclosure suitability using the total system performance assessment method. DOE will conduct a total system...
10 CFR 963.16 - Postclosure suitability evaluation method.
Code of Federal Regulations, 2012 CFR
2012-01-01
... § 963.16 Postclosure suitability evaluation method (Determination, Methods, and Criteria): (a) DOE will evaluate postclosure suitability using the total system performance assessment method. DOE will conduct a total system...
Designs and methods used in published Australian health promotion evaluations 1992-2011.
Chambers, Alana Hulme; Murphy, Kylie; Kolbe, Anthony
2015-06-01
To describe the designs and methods used in published Australian health promotion evaluation articles between 1992 and 2011. Using a content analysis approach, we reviewed 157 articles to analyse patterns and trends in designs and methods in Australian health promotion evaluation articles. The purpose was to provide empirical evidence about the types of designs and methods used. The most common type of evaluation conducted was impact evaluation. Quantitative designs were used exclusively in more than half of the articles analysed. Almost half the evaluations utilised only one data collection method. Surveys were the most common data collection method used. Few articles referred explicitly to an intended evaluation outcome or benefit and references to published evaluation models or frameworks were rare. This is the first time Australian-published health promotion evaluation articles have been empirically investigated in relation to designs and methods. There appears to be little change in the purposes, overall designs and methods of published evaluations since 1992. More methodologically transparent and sophisticated published evaluation articles might be instructional, and even motivational, for improving evaluation practice and result in better public health interventions and outcomes. © 2015 Public Health Association of Australia.
Flexible methods for segmentation evaluation: results from CT-based luggage screening.
Karimi, Seemeen; Jiang, Xiaoqian; Cosman, Pamela; Martz, Harry
2014-01-01
Imaging systems used in aviation security include segmentation algorithms in an automatic threat recognition pipeline. The segmentation algorithms evolve in response to emerging threats and changing performance requirements. Analysis of segmentation algorithms' behavior, including the nature of errors and feature recovery, facilitates their development. However, evaluation methods from the literature provide limited characterization of the segmentation algorithms. To develop segmentation evaluation methods that measure systematic errors such as oversegmentation and undersegmentation, outliers, and overall errors. The methods must measure feature recovery and allow us to prioritize segments. We developed two complementary evaluation methods using statistical techniques and information theory. We also created a semi-automatic method to define ground truth from 3D images. We applied our methods to evaluate five segmentation algorithms developed for CT luggage screening. We validated our methods with synthetic problems and an observer evaluation. Both methods selected the same best segmentation algorithm. Human evaluation confirmed the findings. The measurement of systematic errors and prioritization helped in understanding the behavior of each segmentation algorithm. Our evaluation methods allow us to measure and explain the accuracy of segmentation algorithms.
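The statistical and information-theoretic measures used by the authors are not detailed in the abstract, so the sketch below illustrates the general idea with off-the-shelf label-agreement scores (adjusted Rand index and normalized mutual information) plus a crude object-count check for over- and undersegmentation. The tiny arrays stand in for 3D CT label volumes.

    import numpy as np
    from sklearn.metrics import adjusted_rand_score, normalized_mutual_info_score

    def evaluate_segmentation(gt, seg):
        """Compare a candidate segmentation against ground truth on a per-voxel basis.
        gt, seg: integer label arrays of identical shape."""
        gt, seg = np.ravel(gt), np.ravel(seg)
        return {
            "adjusted_rand": adjusted_rand_score(gt, seg),        # overall agreement
            "nmi": normalized_mutual_info_score(gt, seg),         # information-theoretic agreement
            "n_gt_objects": len(np.unique(gt)),
            "n_seg_objects": len(np.unique(seg)),                 # >> n_gt suggests oversegmentation
        }

    gt = np.array([[1, 1, 2, 2],
                   [1, 1, 2, 2]])
    seg = np.array([[1, 1, 3, 4],
                    [1, 1, 3, 4]])        # object 2 split in two: oversegmentation
    print(evaluate_segmentation(gt, seg))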
Flexible methods for segmentation evaluation: Results from CT-based luggage screening
Karimi, Seemeen; Jiang, Xiaoqian; Cosman, Pamela; Martz, Harry
2017-01-01
BACKGROUND Imaging systems used in aviation security include segmentation algorithms in an automatic threat recognition pipeline. The segmentation algorithms evolve in response to emerging threats and changing performance requirements. Analysis of segmentation algorithms’ behavior, including the nature of errors and feature recovery, facilitates their development. However, evaluation methods from the literature provide limited characterization of the segmentation algorithms. OBJECTIVE To develop segmentation evaluation methods that measure systematic errors such as oversegmentation and undersegmentation, outliers, and overall errors. The methods must measure feature recovery and allow us to prioritize segments. METHODS We developed two complementary evaluation methods using statistical techniques and information theory. We also created a semi-automatic method to define ground truth from 3D images. We applied our methods to evaluate five segmentation algorithms developed for CT luggage screening. We validated our methods with synthetic problems and an observer evaluation. RESULTS Both methods selected the same best segmentation algorithm. Human evaluation confirmed the findings. The measurement of systematic errors and prioritization helped in understanding the behavior of each segmentation algorithm. CONCLUSIONS Our evaluation methods allow us to measure and explain the accuracy of segmentation algorithms. PMID:24699346
A method to evaluate process performance by integrating time and resources
NASA Astrophysics Data System (ADS)
Wang, Yu; Wei, Qingjie; Jin, Shuang
2017-06-01
The purpose of process mining is to improve an enterprise's existing processes, so measuring process performance is particularly important. However, current research on performance evaluation methods is still insufficient: the main approaches rely on time or resource statistics alone, and such basic statistics cannot evaluate process performance very well. In this paper, a method for evaluating process performance based on both the time dimension and the resource dimension is proposed. This method can be used to measure the utilization and redundancy of resources in the process. The paper introduces the design principle and formula of the evaluation algorithm, then describes the design and implementation of the evaluation method. Finally, the evaluation method is used to analyse the event log from a telephone maintenance process and to propose an optimization plan.
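As an illustration of combining the time and resource dimensions, the sketch below computes per-resource utilization and idle time from a toy event log with pandas. The column names and the utilization definition (busy time divided by the span between a resource's first start and last end) are assumptions for illustration, not the authors' formula.

    import pandas as pd

    # Toy event log: one row per activity instance, with resource and timestamps.
    log = pd.DataFrame({
        "case": [1, 1, 2, 2, 3],
        "activity": ["register", "repair", "register", "repair", "register"],
        "resource": ["R1", "R2", "R1", "R2", "R1"],
        "start": pd.to_datetime(["09:00", "09:30", "09:10", "10:15", "09:40"]),
        "end":   pd.to_datetime(["09:20", "10:10", "09:35", "10:45", "09:55"]),
    })

    log["busy"] = log["end"] - log["start"]
    per_resource = log.groupby("resource").agg(
        busy=("busy", "sum"),
        first_start=("start", "min"),
        last_end=("end", "max"),
    )
    per_resource["span"] = per_resource["last_end"] - per_resource["first_start"]
    per_resource["utilization"] = per_resource["busy"] / per_resource["span"]
    per_resource["idle"] = per_resource["span"] - per_resource["busy"]   # redundant capacity
    print(per_resource[["utilization", "idle"]])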
Pragmatism, Evidence, and Mixed Methods Evaluation
ERIC Educational Resources Information Center
Hall, Jori N.
2013-01-01
Mixed methods evaluation has a long-standing history of enhancing the credibility of evaluation findings. However, using mixed methods in a utilitarian way implicitly emphasizes convenience over engaging with its philosophical underpinnings (Denscombe, 2008). Because of this, some mixed methods evaluators and social science researchers have been…
Using a fuzzy comprehensive evaluation method to determine product usability: A test case
Zhou, Ronggang; Chan, Alan H. S.
2016-01-01
BACKGROUND: In order to take into account the inherent uncertainties during product usability evaluation, Zhou and Chan [1] proposed a comprehensive method of usability evaluation for products by combining the analytic hierarchy process (AHP) and fuzzy evaluation methods for synthesizing performance data and subjective response data. This method was designed to provide an integrated framework combining the inevitable vague judgments from the multiple stages of the product evaluation process. OBJECTIVE AND METHODS: In order to illustrate the effectiveness of the model, this study used a summative usability test case to assess the application and strength of the general fuzzy usability framework. To test the proposed fuzzy usability evaluation framework [1], a standard summative usability test was conducted to benchmark the overall usability of a specific network management software. Based on the test data, the fuzzy method was applied to incorporate both the usability scores and uncertainties involved in the multiple components of the evaluation. Then, with Monte Carlo simulation procedures, confidence intervals were used to compare the reliabilities among the fuzzy approach and two typical conventional methods combining metrics based on percentages. RESULTS AND CONCLUSIONS: This case study showed that the fuzzy evaluation technique can be applied successfully for combining summative usability testing data to achieve an overall usability quality for the network software evaluated. Greater differences of confidence interval widths between the method of averaging equally percentage and weighted evaluation method, including the method of weighted percentage averages, verified the strength of the fuzzy method. PMID:28035942
Image quality evaluation of full reference algorithm
NASA Astrophysics Data System (ADS)
He, Nannan; Xie, Kai; Li, Tong; Ye, Yushan
2018-03-01
Image quality evaluation is a classic research topic; the goal is to design algorithms whose evaluation values are consistent with subjective perception. This paper mainly introduces several typical full-reference objective evaluation methods: Mean Squared Error (MSE), Peak Signal-to-Noise Ratio (PSNR), Structural Similarity Image Metric (SSIM) and Feature Similarity (FSIM). The different evaluation methods are tested in Matlab, and their advantages and disadvantages are obtained by analysis and comparison. MSE and PSNR are simple, but they do not incorporate human visual system (HVS) characteristics into image quality evaluation, so their evaluation results are not ideal. SSIM correlates well with subjective judgments and is simple to compute because it brings human visual characteristics into the evaluation; however, the SSIM method rests on assumptions that limit its evaluation results. The FSIM method can be used to test both grayscale and color images, and its results are better. Experimental results show that the new image quality evaluation algorithm based on FSIM is more accurate.
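A quick way to reproduce the first three full-reference indices outside MATLAB is shown below using scikit-image; FSIM is not provided by that library and is therefore omitted. The synthetic reference and distorted images are placeholders.

    import numpy as np
    from skimage.metrics import mean_squared_error, peak_signal_noise_ratio, structural_similarity

    rng = np.random.default_rng(0)
    reference = rng.random((64, 64))
    distorted = np.clip(reference + 0.05 * rng.standard_normal((64, 64)), 0, 1)

    print("MSE :", mean_squared_error(reference, distorted))
    print("PSNR:", peak_signal_noise_ratio(reference, distorted, data_range=1.0))
    print("SSIM:", structural_similarity(reference, distorted, data_range=1.0))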
NASA Astrophysics Data System (ADS)
shunhe, Li; jianhua, Rao; lin, Gui; weimin, Zhang; degang, Liu
2017-11-01
The result of remanufacturing evaluation is the basis for judging whether a heavy-duty machine tool can be remanufactured at the end-of-life (EOL) stage of machine tool lifecycle management. The objectivity and accuracy of the evaluation are key to the evaluation method. In this paper, the catastrophe progression method is introduced into the quantitative evaluation of heavy-duty machine tool remanufacturing, and the results are modified by the comprehensive adjustment method, which makes the evaluation results accord with conventional human judgment. The catastrophe progression method is used to establish a quantitative evaluation model for heavy-duty machine tools and to evaluate the remanufacturing of a retired TK6916 CNC floor milling-boring machine. The evaluation process is simple and highly quantitative, and the result is objective.
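For reference, one aggregation step of the catastrophe progression method is sketched below in the form commonly used in the applied literature: normalized indicators, ordered by importance, are raised to powers 1/2, 1/3, 1/4, ... and then combined by averaging (complementary indicators) or taking the minimum (non-complementary). The indicator values and the two-level hierarchy are invented, and the comprehensive adjustment step mentioned in the abstract is not included.

    import numpy as np

    def catastrophe_level(indicators, complementary=True):
        """One aggregation step of the catastrophe progression method.
        indicators: normalized values in [0, 1], ordered by importance.
        Cusp (2 vars), swallowtail (3) and butterfly (4) use roots 1/2, 1/3, 1/4, 1/5."""
        roots = [1 / (i + 2) for i in range(len(indicators))]     # 1/2, 1/3, 1/4, ...
        x = [v ** r for v, r in zip(indicators, roots)]
        return float(np.mean(x)) if complementary else float(min(x))

    # Illustrative bottom-level scores for a retired machine tool (values are made up).
    mechanical = catastrophe_level([0.8, 0.6, 0.7])        # swallowtail
    electrical = catastrophe_level([0.7, 0.5])             # cusp
    overall = catastrophe_level([mechanical, electrical])  # cusp at the next level
    print(round(overall, 3))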
Economic evaluation of diagnostic methods used in dentistry. A systematic review.
Christell, Helena; Birch, Stephen; Horner, Keith; Lindh, Christina; Rohlin, Madeleine
2014-11-01
To review the literature of economic evaluations regarding diagnostic methods used in dentistry. Four databases (MEDLINE, Web of Science, The Cochrane Library, the NHS Economic Evaluation Database) were searched for studies, complemented by hand searching, until February 2013. Two authors independently screened all titles or abstracts and then applied inclusion and exclusion criteria to select full-text publications published in English which reported an economic evaluation comparing at least two alternative methods. Studies of diagnostic methods were assessed by four reviewers using a protocol based on the QUADAS tool for diagnostic methods and a checklist for economic evaluations. The results of the data extraction were summarized in a structured table and as a narrative description. From 476 identified full-text publications, 160 were considered to be economic evaluations. Only 12 studies (7%) were on diagnostic methods, whilst 78 studies (49%) were on prevention and 70 (40%) on treatment. Among the studies on diagnostic methods, there was methodological heterogeneity between studies regarding the diagnostic method analysed and the type of economic evaluation addressed. Generally, the choice of economic evaluation method was not justified and the perspective of the study was not stated. The costing of diagnostic methods varied. A small body of literature addresses economic evaluation of diagnostic methods in dentistry. Thus, there is a need for studies from various perspectives with well-defined research questions and measures of cost and effectiveness. Economic resources in healthcare are finite. For diagnostic methods, an understanding of efficacy provides only part of the information needed for evidence-based practice. This study highlighted a paucity of economic evaluations of diagnostic methods used in dentistry, indicating that much of what we practise lacks sufficient evidence.
Uncertainty Modeling and Evaluation of CMM Task Oriented Measurement Based on SVCMM
NASA Astrophysics Data System (ADS)
Li, Hongli; Chen, Xiaohuai; Cheng, Yinbao; Liu, Houde; Wang, Hanbin; Cheng, Zhenying; Wang, Hongtao
2017-10-01
Due to the variety of measurement tasks and the complexity of the errors of coordinate measuring machines (CMMs), it is very difficult to reasonably evaluate the uncertainty of CMM measurement results, which has limited the application of CMMs. Task-oriented uncertainty evaluation has become a difficult problem to be solved. Taking dimension measurement as an example, this paper puts forward a practical method for uncertainty modeling and evaluation of CMM task-oriented measurement (called the SVCMM method). This method makes full use of the CMM acceptance or reinspection report and the Monte Carlo computer simulation method (MCM). An evaluation example is presented, and the results are evaluated by the traditional method given in the GUM and by the proposed method, respectively. The SVCMM method is verified to be feasible and practical. It can help CMM users conveniently complete measurement uncertainty evaluation through a single measurement cycle.
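The SVCMM model itself is not given in the abstract, so the snippet below only illustrates the Monte Carlo (MCM) part of such an evaluation: propagating a few assumed error contributions (repeatability, a geometry term of the kind reported in an acceptance report, and thermal expansion) through to a coverage interval for a 100 mm length. All distributions and magnitudes are made up.

    import numpy as np

    rng = np.random.default_rng(42)
    n = 200_000

    nominal = 100.000                                    # mm, indicated length
    repeatability = rng.normal(0.0, 0.0015, n)           # mm, from repeated measurements
    geometry_error = rng.uniform(-0.002, 0.002, n)       # mm, bound taken from an acceptance report
    thermal = 100.0 * 11.5e-6 * rng.normal(0.0, 0.5, n)  # mm, steel expansion for 0.5 degC uncertainty

    length = nominal + repeatability + geometry_error + thermal
    u = length.std(ddof=1)                               # standard uncertainty
    lo, hi = np.percentile(length, [2.5, 97.5])          # 95 % coverage interval
    print(f"u = {u:.4f} mm, 95 % interval = [{lo:.4f}, {hi:.4f}] mm")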
NASA Astrophysics Data System (ADS)
Zhao, Shijia; Liu, Zongwei; Wang, Yue; Zhao, Fuquan
2017-01-01
Subjectivity usually causes large fluctuations in evaluation results. Many scholars attempt to establish new mathematical methods to make evaluation results consistent with actual objective situations. An improved catastrophe progression method (ICPM) is constructed to overcome the defects of the original method. The improved method combines the merits of principal component analysis (information coherence) and of the catastrophe progression method (no need for index weights), and has the advantage of highly objective comprehensive evaluation. Through systematic analysis of the factors influencing the automotive industry's core technology capacity, the comprehensive evaluation model is established according to the different roles that different indices play in evaluating the overall goal within a hierarchical structure. Moreover, the ICPM is applied to evaluate the automotive industry's core technology capacity for seven typical countries, which demonstrates the effectiveness of the method.
Using a fuzzy comprehensive evaluation method to determine product usability: A test case.
Zhou, Ronggang; Chan, Alan H S
2017-01-01
In order to take into account the inherent uncertainties during product usability evaluation, Zhou and Chan [1] proposed a comprehensive method of usability evaluation for products by combining the analytic hierarchy process (AHP) and fuzzy evaluation methods for synthesizing performance data and subjective response data. This method was designed to provide an integrated framework combining the inevitable vague judgments from the multiple stages of the product evaluation process. In order to illustrate the effectiveness of the model, this study used a summative usability test case to assess the application and strength of the general fuzzy usability framework. To test the proposed fuzzy usability evaluation framework [1], a standard summative usability test was conducted to benchmark the overall usability of a specific network management software. Based on the test data, the fuzzy method was applied to incorporate both the usability scores and uncertainties involved in the multiple components of the evaluation. Then, with Monte Carlo simulation procedures, confidence intervals were used to compare the reliabilities among the fuzzy approach and two typical conventional methods combining metrics based on percentages. This case study showed that the fuzzy evaluation technique can be applied successfully for combining summative usability testing data to achieve an overall usability quality for the network software evaluated. Greater differences of confidence interval widths between the method of averaging equally percentage and weighted evaluation method, including the method of weighted percentage averages, verified the strength of the fuzzy method.
The Evaluator's Perspective: Evaluating the State Capacity Building Program.
ERIC Educational Resources Information Center
Madey, Doren L.
A historical antagonism between the advocates of quantitative evaluation methods and the proponents of qualitative evaluation methods has stymied the recognition of the value to be gained by utilizing both methodologies in the same study. The integration of quantitative and qualitative methods within a single evaluation has synergistic effects in…
A hybrid method for evaluating enterprise architecture implementation.
Nikpay, Fatemeh; Ahmad, Rodina; Yin Kia, Chiam
2017-02-01
Enterprise Architecture (EA) implementation evaluation provides a set of methods and practices for evaluating the EA implementation artefacts within an EA implementation project. There are insufficient practices in existing EA evaluation models in terms of considering all EA functions and processes, using structured methods in developing EA implementation, employing matured practices, and using appropriate metrics to achieve proper evaluation. The aim of this research is to develop a hybrid evaluation method that supports achieving the objectives of EA implementation. To attain this aim, the first step is to identify EA implementation evaluation practices. To this end, a Systematic Literature Review (SLR) was conducted. Second, the proposed hybrid method was developed based on the foundation and information extracted from the SLR, semi-structured interviews with EA practitioners, program theory evaluation and Information Systems (ISs) evaluation. Finally, the proposed method was validated by means of a case study and expert reviews. This research provides a suitable foundation for researchers who wish to extend and continue this research topic with further analysis and exploration, and for practitioners who would like to employ an effective and lightweight evaluation method for EA projects. Copyright © 2016 Elsevier Ltd. All rights reserved.
A method for evaluating discoverability and navigability of recommendation algorithms.
Lamprecht, Daniel; Strohmaier, Markus; Helic, Denis
2017-01-01
Recommendations are increasingly used to support and enable discovery, browsing, and exploration of items. This is especially true for entertainment platforms such as Netflix or YouTube, where frequently, no clear categorization of items exists. Yet, the suitability of a recommendation algorithm to support these use cases cannot be comprehensively evaluated by any recommendation evaluation measures proposed so far. In this paper, we propose a method to expand the repertoire of existing recommendation evaluation techniques with a method to evaluate the discoverability and navigability of recommendation algorithms. The proposed method tackles this by means of first evaluating the discoverability of recommendation algorithms by investigating structural properties of the resulting recommender systems in terms of bow tie structure, and path lengths. Second, the method evaluates navigability by simulating three different models of information seeking scenarios and measuring the success rates. We show the feasibility of our method by applying it to four non-personalized recommendation algorithms on three data sets and also illustrate its applicability to personalized algorithms. Our work expands the arsenal of evaluation techniques for recommendation algorithms, extends from a one-click-based evaluation towards multi-click analysis, and presents a general, comprehensive method to evaluating navigability of arbitrary recommendation algorithms.
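A small illustration of the structural part of the proposed evaluation is given below: treating top-k recommendation lists as a directed graph and measuring how much of the catalogue is reachable from each item and over what path lengths, using networkx. The toy graph and the choice of summary statistics are simplifications; the paper additionally simulates information-seeking models and analyses the bow tie structure.

    import networkx as nx

    # Directed graph: an edge u -> v means v appears in u's top-k recommendation list.
    G = nx.DiGraph()
    G.add_edges_from([
        ("A", "B"), ("B", "C"), ("C", "A"),   # a strongly connected "core"
        ("C", "D"), ("D", "E"),               # items only reachable from the core
    ])

    # Discoverability proxy: how much of the catalogue is reachable from each item.
    reachable = {node: len(nx.descendants(G, node)) for node in G}
    print("reachable items:", reachable)

    # Navigability proxy: average shortest-path length over reachable pairs.
    lengths = dict(nx.all_pairs_shortest_path_length(G))
    pairs = [(u, v) for u in G for v in lengths[u] if u != v]
    avg_len = sum(lengths[u][v] for u, v in pairs) / len(pairs)
    print("average path length over reachable pairs:", round(avg_len, 2))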
Liu, Peigui; Elshall, Ahmed S.; Ye, Ming; ...
2016-02-05
Evaluating marginal likelihood is the most critical and computationally expensive task when conducting Bayesian model averaging to quantify parametric and model uncertainties. The evaluation is commonly done by using Laplace approximations to evaluate semianalytical expressions of the marginal likelihood or by using Monte Carlo (MC) methods to evaluate the arithmetic or harmonic mean of a joint likelihood function. This study introduces a new MC method, i.e., thermodynamic integration, which has not been attempted in environmental modeling. Instead of using samples only from the prior parameter space (as in arithmetic mean evaluation) or the posterior parameter space (as in harmonic mean evaluation), the thermodynamic integration method uses samples generated gradually from the prior to the posterior parameter space. This is done through a path sampling that conducts Markov chain Monte Carlo simulation with different power coefficient values applied to the joint likelihood function. The thermodynamic integration method is evaluated using three analytical functions by comparing the method with two variants of the Laplace approximation method and three MC methods, including the nested sampling method that has recently been introduced into environmental modeling. The thermodynamic integration method outperforms the other methods in terms of accuracy, convergence, and consistency. The thermodynamic integration method is also applied to a synthetic case of groundwater modeling with four alternative models. The application shows that model probabilities obtained using the thermodynamic integration method improve the predictive performance of Bayesian model averaging. The thermodynamic integration method is thus mathematically rigorous, and its MC implementation is computationally general for a wide range of environmental problems.
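A compact numerical sketch of thermodynamic integration is given below for a conjugate toy model (normal likelihood, normal prior) where the power posterior at each power coefficient beta can be sampled directly, so the path-sampling identity log Z = ∫_0^1 E_beta[log L(θ)] dβ can be checked against the exact marginal likelihood. In realistic models the per-beta expectations would come from MCMC, as in the study; all model settings here are arbitrary.

    import numpy as np
    from scipy.stats import multivariate_normal, norm

    rng = np.random.default_rng(1)

    # Toy model: y_i ~ N(theta, sigma^2), prior theta ~ N(mu0, tau0^2).
    sigma, mu0, tau0 = 1.0, 0.0, 2.0
    y = rng.normal(0.7, sigma, size=20)
    n, ybar = len(y), y.mean()

    def loglik(theta):
        # Joint log-likelihood of the data for each theta sample (broadcast over samples).
        return norm.logpdf(y[:, None], loc=theta, scale=sigma).sum(axis=0)

    # Thermodynamic integration: log Z = integral over beta of E_beta[log L(theta)],
    # where E_beta is taken under the power posterior prior(theta) * L(theta)^beta (normalized).
    betas = np.linspace(0.0, 1.0, 21)
    expected_loglik = []
    for beta in betas:
        prec = 1 / tau0**2 + beta * n / sigma**2                        # conjugate power posterior,
        mean = (mu0 / tau0**2 + beta * n * ybar / sigma**2) / prec      # so it can be sampled directly
        samples = rng.normal(mean, np.sqrt(1 / prec), size=5000)
        expected_loglik.append(loglik(samples).mean())
    ell = np.array(expected_loglik)
    log_z_ti = float(np.sum(np.diff(betas) * (ell[1:] + ell[:-1]) / 2))  # trapezoidal rule

    # Exact log marginal likelihood for comparison.
    cov = sigma**2 * np.eye(n) + tau0**2 * np.ones((n, n))
    log_z_exact = multivariate_normal.logpdf(y, mean=np.full(n, mu0), cov=cov)
    print(round(log_z_ti, 3), round(log_z_exact, 3))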
Ranking of physiotherapeutic evaluation methods as outcome measures of stifle functionality in dogs.
Hyytiäinen, Heli K; Mölsä, Sari H; Junnila, Jouni T; Laitinen-Vapaavuori, Outi M; Hielm-Björkman, Anna K
2013-04-08
Various physiotherapeutic evaluation methods are used to assess the functionality of dogs with stifle problems. Neither validity nor sensitivity of these methods has been investigated. This study aimed to determine the most valid and sensitive physiotherapeutic evaluation methods for assessing functional capacity in hind limbs of dogs with stifle problems and to serve as a basis for developing an indexed test for these dogs. A group of 43 dogs with unilateral surgically treated cranial cruciate ligament deficiency and osteoarthritic findings was used to test different physiotherapeutic evaluation methods. Twenty-one healthy dogs served as the control group and were used to determine normal variation in static weight bearing and range of motion. The protocol consisted of 14 different evaluation methods: visual evaluation of lameness, visual evaluation of diagonal movement, visual evaluation of functional active range of motion and difference in thrust of hind limbs via functional tests (sit-to-move and lie-to-move), movement in stairs, evaluation of hind limb muscle atrophy, manual evaluation of hind limb static weight bearing, quantitative measurement of static weight bearing of hind limbs with bathroom scales, and passive range of motion of hind limb stifle (flexion and extension) and tarsal (flexion and extension) joints using a universal goniometer. The results were compared with those from an orthopaedic examination, force plate analysis, radiographic evaluation, and a conclusive assessment. Congruity of the methods was assessed with a combination of three statistical approaches (Fisher's exact test and two differently calculated proportions of agreeing observations), and the components were ranked from best to worst. Sensitivities of all of the physiotherapeutic evaluation methods against each standard were calculated. Evaluation of asymmetry in a sitting and lying position, assessment of muscle atrophy, manual and measured static weight bearing, and measurement of stifle passive range of motion were the most valid and sensitive physiotherapeutic evaluation methods. Ranking of the various physiotherapeutic evaluation methods was accomplished. Several of these methods can be considered valid and sensitive when examining the functionality of dogs with stifle problems.
[Reconstituting evaluation methods based on both qualitative and quantitative paradigms].
Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro
2011-01-01
Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has resulted in the concepts becoming obscure and unrecognizable. In this study we conducted a content analysis regarding evaluation methods of qualitative healthcare research. We extracted descriptions on four types of evaluation paradigm (validity/credibility, reliability/dependability, objectivity/confirmability, and generalizability/transferability), and classified them into subcategories. In quantitative research, there have been many evaluation methods based on qualitative paradigms, and vice versa. Thus, it may not be useful to treat evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods based on the situation and prior conditions of each study is an important approach for researchers.
NASA Astrophysics Data System (ADS)
Mao, Chao; Chen, Shou
2017-01-01
Because the traditional entropy value method still has low accuracy when evaluating the performance of mining projects, a performance evaluation model for mineral projects founded on an improved entropy value method is proposed. First, a new weight assignment model is established, founded on the compatibility matrix analysis of the analytic hierarchy process (AHP) and the entropy value method: when the compatibility matrix analysis meets the consistency requirements but differences remain between the subjective and objective weights, the two proportions are adjusted moderately, and on this basis the fuzzy evaluation matrix for performance evaluation is constructed. The simulation experiments show that, compared with the traditional entropy value method and the compatibility matrix analysis method, the proposed performance evaluation model of mining projects based on the improved entropy value method has higher assessment accuracy.
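The entropy-weighting step and the blending of subjective and objective weights alluded to above could look roughly like the sketch below; the convex blending coefficient, the assumption that larger indicator values are better, and all names are illustrative rather than the paper's formulas.

```python
import numpy as np

def entropy_weights(X):
    """Objective indicator weights from an (alternatives x indicators) matrix X.

    Assumes all entries are positive and larger values are better.
    """
    P = X / X.sum(axis=0)                      # normalise each indicator column
    n = X.shape[0]
    P_safe = np.where(P > 0, P, 1.0)           # log(1) = 0, so zeros contribute nothing
    e = -(P_safe * np.log(P_safe)).sum(axis=0) / np.log(n)   # entropy per indicator
    d = 1.0 - e                                # degree of diversification
    return d / d.sum()

def blended_weights(w_subjective, w_objective, alpha=0.5):
    """Convex combination of subjective (e.g. AHP) and objective (entropy) weights."""
    w = alpha * np.asarray(w_subjective) + (1.0 - alpha) * np.asarray(w_objective)
    return w / w.sum()
```

In the paper's scheme the adjustment between the two weight sets is tied to the consistency check of the AHP compatibility matrix; here `alpha` simply stands in for that adjustment.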
Evaluating the utility of two gestural discomfort evaluation methods
Son, Minseok; Jung, Jaemoon; Park, Woojin
2017-01-01
Evaluating physical discomfort of designed gestures is important for creating safe and usable gesture-based interaction systems; yet, gestural discomfort evaluation has not been extensively studied in HCI, and few evaluation methods seem currently available whose utility has been experimentally confirmed. To address this, this study empirically demonstrated the utility of the subjective rating method after a small number of gesture repetitions (a maximum of four repetitions) in evaluating designed gestures in terms of physical discomfort resulting from prolonged, repetitive gesture use. The subjective rating method has been widely used in previous gesture studies but without empirical evidence on its utility. This study also proposed a gesture discomfort evaluation method based on an existing ergonomics posture evaluation tool (Rapid Upper Limb Assessment) and demonstrated its utility in evaluating designed gestures in terms of physical discomfort resulting from prolonged, repetitive gesture use. Rapid Upper Limb Assessment is an ergonomics postural analysis tool that quantifies the work-related musculoskeletal disorders risks for manual tasks, and has been hypothesized to be capable of correctly determining discomfort resulting from prolonged, repetitive gesture use. The two methods were evaluated through comparisons against a baseline method involving discomfort rating after actual prolonged, repetitive gesture use. Correlation analyses indicated that both methods were in good agreement with the baseline. The methods proposed in this study seem useful for predicting discomfort resulting from prolonged, repetitive gesture use, and are expected to help interaction designers create safe and usable gesture-based interaction systems. PMID:28423016
Comparative Evaluation of Two Methods to Estimate Natural Gas Production in Texas
2003-01-01
This report describes an evaluation conducted by the Energy Information Administration (EIA) in August 2003 of two methods that estimate natural gas production in Texas. The first method (parametric method) was used by EIA from February through August 2003 and the second method (multinomial method) replaced it starting in September 2003, based on the results of this evaluation.
ERIC Educational Resources Information Center
Deng, Nina
2011-01-01
Three decision consistency and accuracy (DC/DA) methods, the Livingston and Lewis (LL) method, LEE method, and the Hambleton and Han (HH) method, were evaluated. The purposes of the study were: (1) to evaluate the accuracy and robustness of these methods, especially when their assumptions were not well satisfied, (2) to investigate the "true"…
A Multi-level Fuzzy Evaluation Method for Smart Distribution Network Based on Entropy Weight
NASA Astrophysics Data System (ADS)
Li, Jianfang; Song, Xiaohui; Gao, Fei; Zhang, Yu
2017-05-01
The smart distribution network is considered the future trend of distribution networks. In order to comprehensively evaluate the construction level of smart distribution networks and give guidance to the practice of smart distribution construction, a multi-level fuzzy evaluation method based on entropy weight is proposed. Firstly, focusing on both the conventional characteristics of distribution networks and new characteristics of smart distribution networks such as self-healing and interaction, a multi-level evaluation index system covering power supply capability, power quality, economy, reliability and interaction is established. Then, a combination weighting method based on the Delphi method and the entropy weight method is put forward, which takes into account not only the importance of each evaluation index in the experts' subjective view, but also the objective information contained in the index values. Thirdly, a multi-level evaluation method based on fuzzy theory is put forward. Lastly, an example based on statistical data from several cities' distribution networks is conducted, and the evaluation method is shown to be effective and rational.
Zhou, Ronggang; Chan, Alan H. S.
2016-01-01
BACKGROUND: In order to compare existing usability data to ideal goals or to that for other products, usability practitioners have tried to develop a framework for deriving an integrated metric. However, most current usability methods with this aim rely heavily on human judgment about the various attributes of a product, but often fail to take into account the inherent uncertainties in these judgments in the evaluation process. OBJECTIVE: This paper presents a universal method of usability evaluation by combining the analytic hierarchical process (AHP) and the fuzzy evaluation method. By integrating multiple sources of uncertain information during product usability evaluation, the method proposed here aims to derive an index that is structured hierarchically in terms of the three usability components of effectiveness, efficiency, and user satisfaction of a product. METHODS: With consideration of the theoretical basis of fuzzy evaluation, a two-layer comprehensive evaluation index was first constructed. After the membership functions were determined by an expert panel, the evaluation appraisals were computed by using the fuzzy comprehensive evaluation technique model to characterize fuzzy human judgments. Then with the use of AHP, the weights of usability components were elicited from these experts. RESULTS AND CONCLUSIONS: Compared to traditional usability evaluation methods, the major strength of the fuzzy method is that it captures the fuzziness and uncertainties in human judgments and provides an integrated framework that combines the vague judgments from multiple stages of a product evaluation process. PMID:28035943
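A bare-bones sketch of the two-layer fuzzy comprehensive evaluation idea described above: an AHP-style weight vector aggregates a membership matrix (criteria by appraisal grades) into an overall grade vector. The membership values, weights and grade labels below are made-up placeholders, not data from the study.

```python
import numpy as np

# membership matrix R: rows are usability criteria (effectiveness, efficiency,
# satisfaction); columns are appraisal grades (poor, fair, good, excellent);
# each row holds the fraction of expert judgements falling into each grade
R = np.array([
    [0.1, 0.2, 0.5, 0.2],
    [0.0, 0.3, 0.4, 0.3],
    [0.2, 0.3, 0.3, 0.2],
])

# criterion weights, e.g. elicited with AHP (placeholder values)
w = np.array([0.40, 0.35, 0.25])

# weighted-average fuzzy operator: overall membership in each grade
B = w @ R
grades = ["poor", "fair", "good", "excellent"]
print(dict(zip(grades, np.round(B, 3))))             # {'poor': 0.09, 'fair': 0.26, ...}
print("overall grade:", grades[int(np.argmax(B))])   # 'good'
```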
Evaluating an Intelligent Tutoring System for Design Patterns: The DEPTHS Experience
ERIC Educational Resources Information Center
Jeremic, Zoran; Jovanovic, Jelena; Gasevic, Dragan
2009-01-01
The evaluation of intelligent tutoring systems (ITSs) is an important though often neglected stage of ITS development. There are many evaluation methods available but literature does not provide clear guidelines for the selection of evaluation method(s) to be used in a particular context. This paper describes the evaluation study of DEPTHS, an…
Issues in evaluation: evaluating assessments of elderly people using a combination of methods.
McEwan, R T
1989-02-01
In evaluating a health service, individuals will give differing accounts of its performance, according to their experiences of the service, and the evaluative perspective they adopt. The value of a service may also change through time, and according to the particular part of the service studied. Traditional health care evaluations have generally not accounted for this variability because of the approaches used. Studies evaluating screening or assessment programmes for the elderly have focused on programme effectiveness and efficiency, using relatively inflexible quantitative methods. Evaluative approaches must reflect the complexity of health service provision, and methods must vary to suit the particular research objective. Under these circumstances, this paper presents the case for the use of multiple triangulation in evaluative research, where differing methods and perspectives are combined in one study. Emphasis is placed on the applications and benefits of subjectivist approaches in evaluation. An example of combined methods is provided in the form of an evaluation of the Newcastle Care Plan for the Elderly.
Snodgrass, Jeffrey G; Lacy, Michael G; Upadhyay, Chakrapani
2017-08-01
We present a perspective to analyze mental health without either a) imposing Western illness categories or b) adopting local or "native" categories of mental distress. Our approach takes as axiomatic only that locals within any culture share a cognitive and verbal lexicon of salient positive and negative emotional experiences, which an appropriate and repeatable set of ethnographic procedures can elicit. Our approach is provisionally agnostic with respect to either Western or native nosological categories, and instead focuses on persons' relative frequency of experiencing emotions. Putting this perspective into practice in India, our ethnographic fieldwork (2006-2014) and survey analysis (N = 219) resulted in a 40-item Positive and Negative Affect Scale (PANAS), which we used to assess the mental well-being of Indigenous persons (the tribal Sahariya) in the Indian states of Rajasthan and Madhya Pradesh. Generated via standard cognitive anthropological procedures that can be replicated elsewhere, measures such as this possess features of psychiatric scales favored by leaders in global mental health initiatives. Though not capturing locally named distress syndromes, our scale is nonetheless sensitive to local emotional experiences, frames of meaning, and "idioms of distress." By sharing traits of both global and also locally-derived diagnoses, approaches like ours can help identify synergies between them. For example, employing data reduction techniques such as factor analysis-where diagnostic and screening categories emerge inductively ex post facto from emotional symptom clusters, rather than being deduced or assigned a priori by either global mental health experts or locals themselves-reveals hidden overlaps between local wellness idioms and global ones. Practically speaking, our perspective, which assesses both emotional frailty and also potential sources of emotional resilience and balance, while eschewing all named illness categories, can be deployed in mental health initiatives in ways that minimize stigma and increase both the acceptability and validity of assessment instruments. Copyright © 2017 Elsevier Ltd. All rights reserved.
Ultimate Data Hiding in Quantum Mechanics and Beyond
NASA Astrophysics Data System (ADS)
Lami, Ludovico; Palazuelos, Carlos; Winter, Andreas
2018-06-01
The phenomenon of data hiding, i.e. the existence of pairs of states of a bipartite system that are perfectly distinguishable via general entangled measurements yet almost indistinguishable under LOCC, is a distinctive signature of nonclassicality. The relevant figure of merit is the maximal ratio (called data hiding ratio) between the distinguishability norms associated with the two sets of measurements we are comparing, typically all measurements vs LOCC protocols. For a bipartite $n \times n$ quantum system, it is known that the data hiding ratio scales as $n$, i.e. the square root of the real dimension of the local state space of density matrices. We show that for bipartite $n_A \times n_B$ systems the maximum data hiding ratio against LOCC protocols is $\Theta(\min\{n_A, n_B\})$. This scaling is better than the previously obtained upper bounds $O(\sqrt{n_A n_B})$ and $\min\{n_A^2, n_B^2\}$, and moreover our intuitive argument yields constants close to optimal. In this paper, we investigate data hiding in the more general context of general probabilistic theories (GPTs), an axiomatic framework for physical theories encompassing only the most basic requirements about the predictive power of the theory. The main result of the paper is the determination of the maximal data hiding ratio obtainable in an arbitrary GPT, which is shown to scale linearly in the minimum of the local dimensions. We exhibit an explicit model achieving this bound up to additive constants, finding that the quantum mechanical data hiding ratio is only of the order of the square root of the maximal one. Our proof rests crucially on an unexpected link between data hiding and the theory of projective and injective tensor products of Banach spaces. Finally, we develop a body of techniques to compute data hiding ratios for a variety of restricted classes of GPTs that support further symmetries.
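For orientation, the figure of merit described verbally in the abstract can be written schematically as follows; this is a symbolic paraphrase of that description, not a transcription of the paper's exact definitions.

```latex
% data hiding ratio against LOCC for a bipartite system (schematic)
\[
  R_{\mathrm{LOCC}}
    \;=\; \max_{\rho \neq \sigma}
          \frac{\lVert \rho - \sigma \rVert_{\mathrm{ALL}}}
               {\lVert \rho - \sigma \rVert_{\mathrm{LOCC}}},
  \qquad
  R_{\mathrm{LOCC}} \;=\; \Theta\bigl(\min\{n_A,\, n_B\}\bigr)
  \quad \text{for } n_A \times n_B \text{ quantum systems.}
\]
```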
Reclamation of peat-based wetlands affected by Alberta, Canada's oil sands development
NASA Astrophysics Data System (ADS)
Foote, Lee; Ciborowski, Jan; Dixon, D. George; Liber, Karsten; Smits, Judit
2013-04-01
The ability to construct or reclaim functional peat-based wetlands as a replacement for those lost to development activity is uncertain. Oil sands development in northern Alberta, Canada will ultimately result in the removal of over 85 km2 of peat-based wetlands. To examine potential replacement of these lost peatlands we compared four treatments assigned to 16 known-age wetlands where we followed plant community, carbon dynamics, water quality, invertebrates and top predators for 5 years. Key questions followed by a synopsis of findings include: (1) Will wetland communities become more natural with age? - Yes, however industrial effluents of salinity and napthenates will slow succession and may truncate development compared to natural systems; (2) Can community succession be accelerated? - Yes, the addition of carbon-rich soils can facilitate development in some zones but cautions are raised about a "green desert" of vigorous plant stands with low insect and vertebrate diversity; (3) Is productivity sustainable? - Maybe, limitations of water chemistry (salinity and napthenates) and hydrologic regime appear to play large roles; (4) Will production support top predators? Sometimes; insectivorous birds, some small fish and a few amphibians persisted under all except the most saline and napthenate-enriched sites; (5) What is the role of the compromised water quality in reclamation? - Reduced diversity of plants, insects and vertebrates, reduced plant physiological efficiency and thus slower rates of reclamation. It is axiomatic and well demonstrated throughout Europe that it is easier and more cost effective to protect peatlands than it is to reclaim or create them. This is complicated, though, where mineral or property values soar to over 1 million per hectare. Industrial planners, governments and the public need to understand the options, possibilities, time frames and costs of peatland replacement to make the best land use decisions possible. Our research provides a quantifiable scientific basis for forecasting the future functions, conditions and replacement value of wetlands lost to development, while providing a basis for reclamation recommendations.
Quantum theory and human perception of the macro-world
Aerts, Diederik
2014-01-01
We investigate the question of ‘why customary macroscopic entities appear to us humans as they do, i.e., as bounded entities occupying space and persisting through time’, starting from our knowledge of quantum theory, how it affects the behavior of such customary macroscopic entities, and how it influences our perception of them. For this purpose, we approach the question from three perspectives. Firstly, we look at the situation from the standard quantum angle, more specifically the de Broglie wavelength analysis of the behavior of macroscopic entities, indicate how a problem with spin and identity arises, and illustrate how both play a fundamental role in well-established experimental quantum-macroscopical phenomena, such as Bose-Einstein condensates. Secondly, we analyze how the question is influenced by our result in axiomatic quantum theory, which proves that standard quantum theory is structurally incapable of describing separated entities. Thirdly, we put forward our new ‘conceptual quantum interpretation’, including a highly detailed reformulation of the question to confront the new insights and views that arise with the foregoing analysis. At the end of the final section, a nuanced answer is given that can be summarized as follows. The specific and very classical perception of human seeing—light as a geometric theory—and human touching—only ruled by Pauli's exclusion principle—plays a role in our perception of macroscopic entities as ontologically stable entities in space. To ascertain quantum behavior in such macroscopic entities, we will need measuring apparatuses capable of its detection. Future experimental research will have to show if sharp quantum effects—as they occur in smaller entities—appear to be ontological aspects of customary macroscopic entities. It remains a possibility that standard quantum theory is an incomplete theory, and hence incapable of coping ultimately with separated entities, meaning that a more general theory will be needed. PMID:25009510
The evolutionary nature of narratives about expansion and sustenance
NASA Astrophysics Data System (ADS)
Raupach, M. R.
2014-12-01
The 200 years since the start of the industrial era have been a period of rapid and almost unbroken economic growth in much of the world, based upon exponentially increasing use of energy and water resources and the atmospheric commons. It is axiomatic that exponential growth cannot continue forever on a finite planet, leading to an emerging collision between the presently irresistible force of economic growth and the immovable reality of the finitude of Planet Earth. This has led to a contest between two broad narratives about humans and their planet in the 21st century, an "expansion" narrative framed around the paramount need for economic growth, and a "sustenance" narrative framed around the paramount need to protect an increasingly fragile natural world. Many features of recent public discourse, including the acceleration of the news cycle and the echo-chamber effect of interactive social media, have driven these narratives to become progressively more mutually antagonistic and incompatible. Here I explore the idea that narratives (in the sense of stories that empower actions) are meme sequences that evolve through diversification, selection and adaptation. This memetic evolution can be understood and, to some extent, influenced. An analogy might be with the influence exerted by human selection over centuries on the gene pool of domesticated animals and plants. In shaping our shared future, the evolutionary contest between "expansion" and "sustenance" narratives is just as important as the dynamics of the natural world. The future therefore depends upon the evolution of more subtle and resilient narratives about human-earth interactions. A selection test for these narratives is their ability to empower a transition to a society that lives within the means of a finite planet and improves global wellbeing at the same time. My own recent experience is that scientists alone are not very good at shaping narratives to pass this fitness test, and the participation of other disciplines and approaches is urgently needed.
Linskey, M E
2000-12-01
By definition, the term "radiosurgery" refers to the delivery of a therapeutic radiation dose in a single fraction, not simply the use of stereotaxy. Multiple-fraction delivery is better termed "stereotactic radiotherapy." There are compelling radiobiological principles supporting the biological superiority of single-fraction radiation for achieving an optimal therapeutic response for the slowly proliferating, late-responding, tissue of a schwannoma. It is axiomatic that complication avoidance requires precise three-dimensional conformality between treatment and tumor volumes. This degree of conformality can only be achieved through complex multiisocenter planning. Alternative radiosurgery devices are generally limited to delivering one to four isocenters in a single treatment session. Although they can reproduce dose plans similar in conformality to early gamma knife dose plans by using a similar number of isocenters, they cannot reproduce the conformality of modern gamma knife plans based on magnetic resonance image-targeted localization and five to 30 isocenters. A disturbing trend is developing in which institutions without nongamma knife radiosurgery (GKS) centers are championing and/or shifting to hypofractionated stereotactic radiotherapy for vestibular schwannomas. This trend appears to be driven by a desire to reduce complication rates to compete with modern GKS results by using complex multiisocenter planning. Aggressive advertising and marketing from some of these centers even paradoxically suggests biological superiority of hypofractionation approaches over single-dose radiosurgery for vestibular schwannomas. At the same time these centers continue to use the term radiosurgery to describe their hypofractionated radiotherapy approach in an apparent effort to benefit from a GKS "halo effect." It must be reemphasized that as neurosurgeons our primary duty is to achieve permanent tumor control for our patients and not to eliminate complications at the expense of potential late recurrence. The answer to minimizing complications while maintaining maximum tumor control is improved conformality of radiosurgery dose planning and not resorting to homeopathic radiosurgery doses or hypofractionation radiotherapy schemes.
Quantum to Classical Transitions via Weak Measurements and Post-Selection
NASA Astrophysics Data System (ADS)
Cohen, Eliahu; Aharonov, Yakir
Alongside its immense empirical success, the quantum mechanical account of physical systems imposes a myriad of divergences from our thoroughly ingrained classical ways of thinking. These divergences, while striking, would have been acceptable if only a continuous transition to the classical domain was at hand. Strangely, this is not quite the case. The difficulties involved in reconciling the quantum with the classical have given rise to different interpretations, each with its own shortcomings. Traditionally, the two domains are sewn together by invoking an ad hoc theory of measurement, which has been incorporated in the axiomatic foundations of quantum theory. This work will incorporate a few related tools for addressing the above conceptual difficulties: deterministic operators, weak measurements, and post-selection. Weak measurement, based on a very weak von Neumann coupling, is a unique kind of quantum measurement with numerous theoretical and practical applications. In contrast to other measurement techniques, it allows one to gather a small amount of information regarding the quantum system, with only a negligible probability of collapsing it onto an eigenstate of the measured observable. A single weak measurement yields an almost random outcome, but when performed repeatedly over a large ensemble, the averaged outcome becomes increasingly robust and accurate. Importantly, a long sequence of weak measurements can be thought of as a single projective measurement. We claim in this work that classical variables appearing in the macro-world, such as center of mass, moment of inertia, pressure, and average forces, result from a multitude of quantum weak measurements performed in the micro-world. Here again, the quantum outcomes are highly uncertain, but the law of large numbers obliges their convergence to the definite quantities we know from our everyday lives. By augmenting this description with a final boundary condition and employing the notion of "classical robustness under time-reversal", we will draw a quantitative borderline between the classical and quantum regimes. We will conclude by analyzing the role of macroscopic systems in amplifying and recording quantum outcomes.
Self-organizing ontology of biochemically relevant small molecules
2012-01-01
Background The advent of high-throughput experimentation in biochemistry has led to the generation of vast amounts of chemical data, necessitating the development of novel analysis, characterization, and cataloguing techniques and tools. Recently, a movement to publically release such data has advanced biochemical structure-activity relationship research, while providing new challenges, the biggest being the curation, annotation, and classification of this information to facilitate useful biochemical pattern analysis. Unfortunately, the human resources currently employed by the organizations supporting these efforts (e.g. ChEBI) are expanding linearly, while new useful scientific information is being released in a seemingly exponential fashion. Compounding this, currently existing chemical classification and annotation systems are not amenable to automated classification, formal and transparent chemical class definition axiomatization, facile class redefinition, or novel class integration, thus further limiting chemical ontology growth by necessitating human involvement in curation. Clearly, there is a need for the automation of this process, especially for novel chemical entities of biological interest. Results To address this, we present a formal framework based on Semantic Web technologies for the automatic design of chemical ontology which can be used for automated classification of novel entities. We demonstrate the automatic self-assembly of a structure-based chemical ontology based on 60 MeSH and 40 ChEBI chemical classes. This ontology is then used to classify 200 compounds with an accuracy of 92.7%. We extend these structure-based classes with molecular feature information and demonstrate the utility of our framework for classification of functionally relevant chemicals. Finally, we discuss an iterative approach that we envision for future biochemical ontology development. Conclusions We conclude that the proposed methodology can ease the burden of chemical data annotators and dramatically increase their productivity. We anticipate that the use of formal logic in our proposed framework will make chemical classification criteria more transparent to humans and machines alike and will thus facilitate predictive and integrative bioactivity model development. PMID:22221313
A new evaluation tool to obtain practice-based evidence of worksite health promotion programs.
Dunet, Diane O; Sparling, Phillip B; Hersey, James; Williams-Piehota, Pamela; Hill, Mary D; Hanssen, Carl; Lawrenz, Frances; Reyes, Michele
2008-10-01
The Centers for Disease Control and Prevention developed the Swift Worksite Assessment and Translation (SWAT) evaluation method to identify promising practices in worksite health promotion programs. The new method complements research studies and evaluation studies of evidence-based practices that promote healthy weight in working adults. We used nationally recognized program evaluation standards of utility, feasibility, accuracy, and propriety as the foundation for our 5-step method: 1) site identification and selection, 2) site visit, 3) post-visit evaluation of promising practices, 4) evaluation capacity building, and 5) translation and dissemination. An independent, outside evaluation team conducted process and summative evaluations of SWAT to determine its efficacy in providing accurate, useful information and its compliance with evaluation standards. The SWAT evaluation approach is feasible in small and medium-sized workplace settings. The independent evaluation team judged SWAT favorably as an evaluation method, noting among its strengths its systematic and detailed procedures and service orientation. Experts in worksite health promotion evaluation concluded that the data obtained by using this evaluation method were sufficient to allow them to make judgments about promising practices. SWAT is a useful, business-friendly approach to systematic, yet rapid, evaluation that comports with program evaluation standards. The method provides a new tool to obtain practice-based evidence of worksite health promotion programs that help prevent obesity and, more broadly, may advance public health goals for chronic disease prevention and health promotion.
Khajouei, Reza; Hajesmaeel Gohari, Sadrieh; Mirzaee, Moghaddameh
2018-04-01
In addition to following the usual Heuristic Evaluation (HE) method, the usability of health information systems can also be evaluated using a checklist. The objective of this study is to compare the performance of these two methods in identifying usability problems of health information systems. Eight evaluators independently evaluated different parts of a Medical Records Information System using two methods of HE (usual and with a checklist). The two methods were compared in terms of the number of problems identified, problem type, and the severity of identified problems. In all, 192 usability problems were identified by two methods in the Medical Records Information System. This was significantly higher than the number of usability problems identified by the checklist and usual method (148 and 92, respectively) (p < 0.0001). After removing the duplicates, the difference between the number of unique usability problems identified by the checklist method (n = 100) and usual method (n = 44) was significant (p < 0.0001). Differences between the mean severity of the real usability problems (1.83) and those identified by only one of the methods (usual = 2.05, checklist = 1.74) were significant (p = 0.001). This study revealed the potential of the two HE methods for identifying usability problems of health information systems. The results demonstrated that the checklist method had significantly better performance in terms of the number of identified usability problems; however, the performance of the usual method for identifying problems of higher severity was significantly better. Although the checklist method can be more efficient for less experienced evaluators, wherever usability is critical, the checklist should be used with caution in usability evaluations. Copyright © 2018 Elsevier Inc. All rights reserved.
Wada, Yoichi; Hara, Takanori; Miyati, Tosiaki
2008-02-20
Many methods of measuring the contrast-to-noise ratio (CNR) in magnetic resonance imaging (MRI) have been proposed. However, it is not clear which method is best for evaluating clinical or phantom images. In this study we examined the characteristics of the evaluation methods proposed in the past, and we proposed a new CNR evaluation method that improves noise evaluation. We examined the relationship between the theoretical CNR value and the measured value when the measurement sensitivity was changed. We also measured the relationship between the number of signals averaged (NSA) and the CNR value. The CNR value changed greatly according to where the noise was measured. The measuring method that we proposed in this study was superior for the following reasons: the measurement points of noise and signal are the same; the influence of low-frequency components is slight; and the correlation between measurements and theoretical values is high. The method that we proposed in this study is useful for evaluating phantom images.
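Since the abstract does not spell out the authors' formula, the snippet below only illustrates the conventional two-acquisition, ROI-based contrast-to-noise computation against which such proposals are usually compared: signal from two tissue ROIs on the averaged image, noise from the subtraction image in the same location. All names are illustrative.

```python
import numpy as np

def cnr_from_repeated_scans(img1, img2, roi_a, roi_b):
    """Conventional CNR estimate from two repeated acquisitions of one slice.

    img1, img2: 2-D arrays of the same slice acquired twice (NSA = 1 each).
    roi_a, roi_b: boolean masks selecting the two tissues being contrasted.
    Noise is read from the subtraction image inside the same ROIs, so signal
    and noise are measured at the same location.
    """
    mean_img = 0.5 * (img1 + img2)
    diff = img1 - img2
    signal_a = mean_img[roi_a].mean()
    signal_b = mean_img[roi_b].mean()
    # std of the subtraction image, divided by sqrt(2) to recover single-image noise
    noise = diff[roi_a | roi_b].std(ddof=1) / np.sqrt(2.0)
    return abs(signal_a - signal_b) / noise
```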
Evaluation of temperament scoring methods for beef cattle
USDA-ARS?s Scientific Manuscript database
The objective of this study was to evaluate methods of temperament scoring. Crossbred (n=228) calves were evaluated for temperament by an individual evaluator at weaning by two methods of scoring: 1) pen score (1 to 5 scale, with higher scores indicating increasing degree of nervousness, aggressiven...
The acoustical design of vehicles-a challenge for qualitative evaluation
NASA Astrophysics Data System (ADS)
Schulte-Fortkamp, Brigitte; Genuit, Klaus; Fiebig, Andre
2005-09-01
Whenever the acoustical design of vehicles is explored, the crucial question about the appropriate method of evaluation arises. Research shows that not only acoustic but also non-acoustic parameters have a major influence on the way sounds are evaluated. Therefore, new methods of evaluation have to be implemented. Methods are needed which give the opportunity to test the quality of the given ambience and to register the effects and evaluations in their functional interdependence as well as the influence of personal and contextual factors. Moreover, new methods have to give insight into processes of evaluation and their contextual parameters. In other words, the task of evaluating acoustical ambiences consists of designating a set of social, psychological, and cultural conditions which are important to determine particular individual and collective behavior, attitudes, and also emotions relative to the given ambience. However, no specific recommendations exist yet which comprise particular descriptions of how to assess those specific sound effects. That is why there is a need to develop alternative methods of evaluation with whose help effects of acoustical ambiences can be better predicted. A method of evaluation will be presented which incorporates a new sensitive approach for the evaluation of vehicle sounds.
An evaluation method for nanoscale wrinkle
NASA Astrophysics Data System (ADS)
Liu, Y. P.; Wang, C. G.; Zhang, L. M.; Tan, H. F.
2016-06-01
In this paper, a spectrum-based wrinkling analysis method via two-dimensional Fourier transformation is proposed, aiming to solve the difficulty of nanoscale wrinkle evaluation. It evaluates the wrinkle characteristics, including wrinkling wavelength and direction, simply by using a single wrinkling image. Based on this method, the evaluation results of nanoscale wrinkle characteristics agree with the open experimental results within an error of 6%. The method is also verified to be appropriate for macro-scale wrinkle evaluation, without scale limitations. The spectrum-based wrinkling analysis is an effective method for nanoscale evaluation, which contributes to revealing the mechanism of nanoscale wrinkling.
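A compact sketch of the spectrum-based idea (assumed details, not the paper's code): take the two-dimensional Fourier transform of a wrinkle image, locate the dominant non-DC peak, and convert its position into a wrinkle wavelength and direction.

```python
import numpy as np

def wrinkle_wavelength_direction(image, pixel_size=1.0):
    """Estimate the dominant wrinkle wavelength and direction from one image.

    image: 2-D array of heights or intensities; pixel_size: physical size of
    one pixel, in the same unit as the returned wavelength.
    """
    img = np.asarray(image, dtype=float)
    img = img - img.mean()                          # suppress the DC component
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(img)))
    ny, nx = img.shape
    cy, cx = ny // 2, nx // 2
    spectrum[cy, cx] = 0.0                          # ignore any residual DC peak
    py, px = np.unravel_index(np.argmax(spectrum), spectrum.shape)
    fy = (py - cy) / (ny * pixel_size)              # spatial frequency, cycles/unit
    fx = (px - cx) / (nx * pixel_size)
    freq = np.hypot(fx, fy)
    wavelength = 1.0 / freq if freq > 0 else np.inf
    direction = np.degrees(np.arctan2(fy, fx))      # angle of the wavevector,
    return wavelength, direction                    # i.e. normal to the wrinkle crests
```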
Comparative study on the welded structure fatigue strength assessment method
NASA Astrophysics Data System (ADS)
Hu, Tao
2018-04-01
Because welded structures are widely applied in various industries, especially in pressure vessels, motorcycles, automobiles, aviation, shipbuilding, and large crane steel structures, the fatigue strength evaluation of welded structures is particularly important. There are mainly four kinds of fatigue strength evaluation methods for welded structures; the two most commonly used are the nominal stress method and the hot-spot stress method. This paper compares the two in terms of their principles, calculation procedures, and analysis processes, examines their similarities as well as their advantages and disadvantages, and analyzes practical engineering problems to provide a reference for various industries; an outlook on future methods for evaluating the fatigue strength and life of welded structures is also given.
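As a concrete point of reference for the hot-spot (structural stress) approach mentioned above, a widely used fine-mesh surface-extrapolation rule reads the stresses at 0.4t and 1.0t from the weld toe and extrapolates them linearly to the toe. This is general fatigue-design background (IIW-type practice), not the paper's own formulation.

```python
def hot_spot_stress(sigma_04t, sigma_10t):
    """Linear surface extrapolation of the structural stress to the weld toe.

    sigma_04t, sigma_10t: stresses evaluated at 0.4*t and 1.0*t from the weld
    toe (t = plate thickness); the rule is sigma_hs = 1.67*s(0.4t) - 0.67*s(1.0t).
    """
    return 1.67 * sigma_04t - 0.67 * sigma_10t

# example: 180 MPa at 0.4t and 150 MPa at 1.0t give about
# 1.67*180 - 0.67*150 = 200.1 MPa at the hot spot
```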
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-17
... for the Evaluation of Alternative Toxicological Methods (NICEATM); Availability of Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM) Test Method Evaluation Reports: In Vitro Ocular Safety Testing Methods and Strategies, and Routine Use of Topical Anesthetics, Systemic...
Study on the evaluation method for fault displacement based on characterized source model
NASA Astrophysics Data System (ADS)
Tonagi, M.; Takahama, T.; Matsumoto, Y.; Inoue, N.; Irikura, K.; Dalguer, L. A.
2016-12-01
IAEA Specific Safety Guide (SSG) 9 describes that probabilistic methods for evaluating fault displacement should be used if no sufficient basis is provided to decide conclusively, by the deterministic methodology, that the fault is not capable. In addition, the International Seismic Safety Centre has compiled an ANNEX on realizing the seismic hazard assessments for nuclear facilities described in SSG-9, which shows the utility of deterministic and probabilistic evaluation methods for fault displacement. In Japan, important nuclear facilities are required to be established on ground where fault displacement will not arise when earthquakes occur in the future. Under these circumstances, and based on these requirements, we need to develop evaluation methods for fault displacement to enhance the safety of nuclear facilities. We are studying deterministic and probabilistic methods through tentative analyses using observed records, such as surface fault displacements and near-fault strong ground motions, of inland crustal earthquakes in which fault displacements arose. In this study, we introduce the concept of evaluation methods for fault displacement. After that, we show parts of the tentative analysis results for the deterministic method as follows: (1) For the 1999 Chi-Chi earthquake, referring to the slip distribution estimated by waveform inversion, we construct a characterized source model (Miyake et al., 2003, BSSA) which can explain the observed near-fault broadband strong ground motions. (2) Referring to the characterized source model constructed in (1), we study an evaluation method for surface fault displacement using a hybrid method, which combines the particle method and the distinct element method. Finally, we suggest a deterministic method to evaluate fault displacement based on a characterized source model. This research was part of the 2015 research project `Development of evaluating method for fault displacement` by the Secretariat of the Nuclear Regulation Authority (S/NRA), Japan.
A usability evaluation toolkit for In-Vehicle Information Systems (IVISs).
Harvey, Catherine; Stanton, Neville A; Pickering, Carl A; McDonald, Mike; Zheng, Pengjun
2011-05-01
Usability must be defined specifically for the context of use of the particular system under investigation. This specific context of use should also be used to guide the definition of specific usability criteria and the selection of appropriate evaluation methods. There are four principles which can guide the selection of evaluation methods, relating to the information required in the evaluation, the stage at which to apply methods, the resources required and the people involved in the evaluation. This paper presents a framework for the evaluation of usability in the context of In-Vehicle Information Systems (IVISs). This framework guides designers through defining usability criteria for an evaluation, selecting appropriate evaluation methods and applying those methods. These stages form an iterative process of design-evaluation-redesign with the overall aim of improving the usability of IVISs and enhancing the driving experience, without compromising the safety of the driver. Copyright © 2010 Elsevier Ltd and The Ergonomics Society. All rights reserved.
NASA Astrophysics Data System (ADS)
Kang, Hong; Zhang, Yun; Hou, Haochen; Sun, Xiaoyang; Qin, Chenglu
2018-03-01
The textile industry has a high environmental impact, so implementing a cleaner production audit is an effective way to achieve energy conservation and emissions reduction. However, the evaluation method in current cleaner production audits divides the evaluation of cleaner production options (CPOs) into two separate parts: environment and economy. In this study, the evaluation index system was constructed from three criteria of environmental benefits, economic benefits and product performance; the weights of five indicators were determined by combining the entropy method and the factor weight sorting method. Then the options were evaluated comprehensively. The results showed that the best alkali recovery option was the nanofiltration membrane method (S=0.80).
Research on the Value Evaluation of Used Pure Electric Car Based on the Replacement Cost Method
NASA Astrophysics Data System (ADS)
Tan, zhengping; Cai, yun; Wang, yidong; Mao, pan
2018-03-01
In this paper, the value evaluation of used pure electric cars is carried out by the replacement cost method, which fills a gap in the value evaluation of electric vehicles. Based on the basic principle of the replacement cost method and the actual cost structure of pure electric cars, a calculation method for the newness (condition) rate of a used electric car is put forward; the AHP method is used to construct the weight matrix of the comprehensive adjustment coefficient of related factors, thereby improving the value evaluation system for used cars.
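To make the replacement-cost logic concrete, here is a toy calculation under assumed figures and factor names: assessed value = replacement cost × newness (condition) rate × a weighted adjustment for condition factors, with the weights standing in for AHP-derived ones. None of the numbers come from the paper.

```python
# toy replacement-cost valuation of a used battery-electric car (assumed figures)
replacement_cost = 200_000.0        # cost of an equivalent new vehicle

# base newness rate from service life already used (straight-line assumption)
base_newness = 1.0 - 3 / 10         # 3 years used of an assumed 10-year life

# condition scores in [0, 1] and illustrative AHP-style weights (sum to 1)
factors = {
    "battery_health": (0.85, 0.5),
    "mileage":        (0.70, 0.3),
    "body_condition": (0.90, 0.2),
}
adjustment = sum(score * weight for score, weight in factors.values())

assessed_value = replacement_cost * base_newness * adjustment
print(round(assessed_value, 2))     # 200000 * 0.7 * 0.815 = 114100.0
```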
Study on process evaluation model of students' learning in practical course
NASA Astrophysics Data System (ADS)
Huang, Jie; Liang, Pei; Shen, Wei-min; Ye, Youxiang
2017-08-01
In practical course teaching based on the project object method, the traditional evaluation methods of class attendance, assignments and exams fail to give undergraduate students incentives to learn innovatively and autonomously. In this paper, elements such as creative innovation, teamwork, and documentation and reporting were incorporated into the process evaluation method, and a process evaluation model was set up. Educational practice shows that the evaluation model makes the process evaluation of students' learning more comprehensive, accurate, and fair.
An Evaluation Method of Equipment Reliability Configuration Management
NASA Astrophysics Data System (ADS)
Wang, Wei; Feng, Weijia; Zhang, Wei; Li, Yuan
2018-01-01
At present, many equipment development companies are aware of the great significance of reliability in equipment development. However, due to the lack of an effective management evaluation method, it is very difficult for an equipment development company to manage its own reliability work. The evaluation method of equipment reliability configuration management is intended to determine the reliability management capabilities of an equipment development company. Reliability is achieved not only through design but also through management. This paper evaluates reliability management capabilities with a reliability configuration capability maturity model (RCM-CMM) evaluation method.
Usability Evaluation of a Web-Based Learning System
ERIC Educational Resources Information Center
Nguyen, Thao
2012-01-01
The paper proposes a contingent, learner-centred usability evaluation method and a prototype tool of such systems. This is a new usability evaluation method for web-based learning systems using a set of empirically-supported usability factors and can be done effectively with limited resources. During the evaluation process, the method allows for…
Land management planning: a method of evaluating alternatives
Andres Weintraub; Richard Adams; Linda Yellin
1982-01-01
A method is described for developing and evaluating alternatives in land management planning. A structured set of 15 steps provides a framework for such an evaluation when multiple objectives and uncertainty must be considered in the planning process. The method is consistent with other processes used in organizational evaluation, and allows for the interaction of...
Approaches for Evaluating the Usability of Assistive Technology Product Prototypes
ERIC Educational Resources Information Center
Choi, Young Mi; Sprigle, Stephen H.
2011-01-01
User input is an important component to help guide designers in producing a more usable product. Evaluation of prototypes is one method of obtaining this input, but methods for evaluating assistive technology prototypes during design have not been adequately described or evaluated. This project aimed to compare different methods of evaluating…
Duque-Ramos, Astrid; Boeker, Martin; Jansen, Ludger; Schulz, Stefan; Iniesta, Miguela; Fernández-Breis, Jesualdo Tomás
2014-01-01
Objective To (1) evaluate the GoodOD guideline for ontology development by applying the OQuaRE evaluation method and metrics to the ontology artefacts that were produced by students in a randomized controlled trial, and (2) informally compare the OQuaRE evaluation method with gold standard and competency questions based evaluation methods, respectively. Background In the last decades many methods for ontology construction and ontology evaluation have been proposed. However, none of them has become a standard and there is no empirical evidence of comparative evaluation of such methods. This paper brings together GoodOD and OQuaRE. GoodOD is a guideline for developing robust ontologies. It was previously evaluated in a randomized controlled trial employing metrics based on gold standard ontologies and competency questions as outcome parameters. OQuaRE is a method for ontology quality evaluation which adapts the SQuaRE standard for software product quality to ontologies and has been successfully used for evaluating the quality of ontologies. Methods In this paper, we evaluate the effect of training in ontology construction based on the GoodOD guideline within the OQuaRE quality evaluation framework and compare the results with those obtained for the previous studies based on the same data. Results Our results show a significant effect of the GoodOD training over developed ontologies by topics: (a) a highly significant effect was detected in three topics from the analysis of the ontologies of untrained and trained students; (b) both positive and negative training effects with respect to the gold standard were found for five topics. Conclusion The GoodOD guideline had a significant effect over the quality of the ontologies developed. Our results show that GoodOD ontologies can be effectively evaluated using OQuaRE and that OQuaRE is able to provide additional useful information about the quality of the GoodOD ontologies. PMID:25148262
Reliable clarity automatic-evaluation method for optical remote sensing images
NASA Astrophysics Data System (ADS)
Qin, Bangyong; Shang, Ren; Li, Shengyang; Hei, Baoqin; Liu, Zhiwen
2015-10-01
Image clarity, which reflects the sharpness degree at the edges of objects in images, is an important quality evaluation index for optical remote sensing images. Scholars at home and abroad have done a great deal of work on the estimation of image clarity. At present, common clarity-estimation methods for digital images mainly include frequency-domain function methods, statistical parametric methods, gradient function methods and edge acutance methods. The frequency-domain function method is an accurate clarity-measure approach; however, its calculation process is complicated and cannot be carried out automatically. Statistical parametric methods and gradient function methods are both sensitive to image clarity, but their results are easily affected by the complexity of the image content. The edge acutance method is an effective approach for clarity estimation, but it requires picking out the edges manually. Due to these limits in accuracy, consistency or automation, the existing methods are not applicable to quality evaluation of optical remote sensing images. In this article, a new clarity-evaluation method, based on the principle of the edge acutance algorithm, is proposed. In the new method, an edge detection algorithm and a gradient search algorithm are adopted to automatically search the object edges in images. Moreover, the calculation algorithm for edge sharpness has been improved. The new method has been tested with several groups of optical remote sensing images. Compared with the existing automatic evaluation methods, the new method performs better in both accuracy and consistency. Thus, the new method is an effective clarity-evaluation method for optical remote sensing images.
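For comparison with the gradient-function family of metrics mentioned above, a standard Tenengrad-style sharpness score (a generic sketch, not the authors' improved edge-acutance algorithm) is simply the mean squared Sobel gradient magnitude.

```python
import numpy as np
from scipy import ndimage

def tenengrad_sharpness(image):
    """Mean squared Sobel gradient magnitude as a simple clarity score.

    Higher values indicate sharper edges; scores are only comparable between
    images with similar content and dynamic range.
    """
    img = np.asarray(image, dtype=float)
    gx = ndimage.sobel(img, axis=1)   # horizontal gradient
    gy = ndimage.sobel(img, axis=0)   # vertical gradient
    return float(np.mean(gx ** 2 + gy ** 2))
```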
Core Professionalism Education in Surgery: A Systematic Review.
Sarıoğlu Büke, Akile; Karabilgin Öztürkçü, Özlem Sürel; Yılmaz, Yusuf; Sayek, İskender
2018-03-15
Professionalism education is one of the major elements of surgical residency education. To evaluate the studies on core professionalism education programs in surgical professionalism education. Systematic review. This systematic literature review was performed to analyze core professionalism programs for surgical residency education published in English with at least three of the following features: program developmental model/instructional design method, aims and competencies, methods of teaching, methods of assessment, and program evaluation model or method. A total of 27083 articles were retrieved using EBSCOHOST, PubMed, Science Direct, Web of Science, and manual search. Eight articles met the selection criteria. The instructional design method was presented in only one article, which described the Analysis, Design, Development, Implementation, and Evaluation model. Six articles were based on the Accreditation Council for Graduate Medical Education criterion, although there was significant variability in content. The most common teaching method was role modeling with scenario- and case-based learning. A wide range of assessment methods for evaluating professionalism education were reported. The Kirkpatrick model was reported in one article as a method for program evaluation. It is suggested that for a core surgical professionalism education program, developmental/instructional design model, aims and competencies, content, teaching methods, assessment methods, and program evaluation methods/models should be well defined, and the content should be comparable.
Umay, Ebru Karaca; Unlu, Ece; Saylam, Guleser Kılıc; Cakci, Aytul; Korkmaz, Hakan
2013-09-01
We aimed in this study to evaluate dysphagia in early stroke patients using a bedside screening test and flexible fiberoptic endoscopic evaluation of swallowing (FFEES) and electrophysiological evaluation (EE) methods and to compare the effectiveness of these methods. Twenty-four patients who were hospitalized in our clinic within the first 3 months after stroke were included in this study. Patients were evaluated using a bedside screening test [including bedside dysphagia score (BDS), neurological examination dysphagia score (NEDS), and total dysphagia score (TDS)] and FFEES and EE methods. Patients were divided into normal-swallowing and dysphagia groups according to the results of the evaluation methods. Patients with dysphagia as determined by any of these methods were compared to the patients with normal swallowing based on the results of the other two methods. Based on the results of our study, a high BDS was positively correlated with dysphagia identified by FFEES and EE methods. Moreover, the FFEES and EE methods were positively correlated. There was no significant correlation between NEDS and TDS levels and either EE or FFEES method. Bedside screening tests should be used mainly as an initial screening test; then FFEES and EE methods should be combined in patients who show risks. This diagnostic algorithm may provide a practical and fast solution for selected stroke patients.
NASA Astrophysics Data System (ADS)
Ma, Guosheng
2018-02-01
With the implementation of the personnel training mode of deep integration between production and education, the original evaluation method can no longer serve the goal of personnel training, so the traditional teaching evaluation methods urgently need to be reformed. This paper studies and analyzes the four main problems in the teaching evaluation of agricultural eco-environmental protection specialties, and puts forward three measures to reform the teaching evaluation methods: establishing diversified evaluation indexes, establishing diversified evaluation subjects, and establishing diversified evaluation feedback mechanisms.
Methods of Product Evaluation. Guide Number 10. Evaluation Guides Series.
ERIC Educational Resources Information Center
St. John, Mark
In this guide the logic of product evaluation is described in a framework that is meant to be general and adaptable to all kinds of evaluations. Evaluators should consider using the logic and methods of product evaluation when (1) the purpose of the evaluation is to aid evaluators in making a decision about purchases; (2) a comprehensive…
Wohlsen, T; Bates, J; Vesey, G; Robinson, W A; Katouli, M
2006-04-01
To use BioBall cultures as a precise reference standard to evaluate methods for enumeration of Escherichia coli and other coliform bacteria in water samples. Eight methods were evaluated including membrane filtration, standard plate count (pour and spread plate methods), defined substrate technology methods (Colilert and Colisure), the most probable number method and the Petrifilm disposable plate method. Escherichia coli and Enterobacter aerogenes BioBall cultures containing 30 organisms each were used. All tests were performed using 10 replicates. The mean recovery of both bacteria varied with the different methods employed. The best and most consistent results were obtained with Petrifilm and the pour plate method. Other methods either yielded a low recovery or showed significantly high variability between replicates. The BioBall is a very suitable quality control tool for evaluating the efficiency of methods for bacterial enumeration in water samples.
A Critical Review of Methods to Evaluate the Impact of FDA Regulatory Actions
Briesacher, Becky A.; Soumerai, Stephen B.; Zhang, Fang; Toh, Sengwee; Andrade, Susan E.; Wagner, Joann L.; Shoaibi, Azadeh; Gurwitz, Jerry H.
2013-01-01
Purpose To conduct a synthesis of the literature on methods to evaluate the impacts of FDA regulatory actions, and identify best practices for future evaluations. Methods We searched MEDLINE for manuscripts published between January 1948 and August 2011 that included terms related to FDA, regulatory actions, and empirical evaluation; the review additionally included FDA-identified literature. We used a modified Delphi method to identify preferred methodologies. We included studies with explicit methods to address threats to validity, and identified designs and analytic methods with strong internal validity that have been applied to other policy evaluations. Results We included 18 studies out of 243 abstracts and papers screened. Overall, analytic rigor in prior evaluations of FDA regulatory actions varied considerably; less than a quarter of studies (22%) included control groups. Only 56% assessed changes in the use of substitute products/services, and 11% examined patient health outcomes. Among studies meeting minimal criteria of rigor, 50% found no impact or weak/modest impacts of FDA actions and 33% detected unintended consequences. Among those studies finding significant intended effects of FDA actions, all cited the importance of intensive communication efforts. There are preferred methods with strong internal validity that have yet to be applied to evaluations of FDA regulatory actions. Conclusions Rigorous evaluations of the impact of FDA regulatory actions have been limited and infrequent. Several methods with strong internal validity are available to improve trustworthiness of future evaluations of FDA policies. PMID:23847020
NASA Astrophysics Data System (ADS)
Girard, Corentin; Rinaudo, Jean-Daniel; Pulido-Velázquez, Manuel
2015-04-01
Adaptation to global change is a key issue in the planning of water resource systems in a changing world. Adaptation has to be efficient, but also equitable in the sharing of the costs of joint adaptation at the river basin scale. Least-cost hydro-economic optimization models have been helpful for defining efficient adaptation strategies. However, they often rely on the assumption of a "perfect cooperation" among the stakeholders, required for reaching the optimal solution. Nowadays, most adaptation decisions have to be agreed among the different actors in charge of their implementation, thus challenging the validity of a perfect command-and-control solution. As a first attempt to overcome this limitation, our work presents a method to allocate the cost of an efficient adaptation programme of measures among the different stakeholders at the river basin scale. Principles of equity are used to define cost allocation scenarios from different perspectives, combining elements from cooperative game theory and axioms from social justice to bring some "food for thought" into the decision-making process of adaptation. To illustrate the type of interactions between stakeholders in a river basin, the method has been applied to a French case study, the Orb river basin. Located on the northern rim of the Mediterranean Sea, this river basin is experiencing changes in demand patterns, and its water resources will be impacted by climate change, calling for the design of an adaptation plan. A least-cost river basin optimization model (LCRBOM) has been developed in GAMS to select the combination of demand- and supply-side adaptation measures that allows meeting quantitative water management targets at the river basin scale in a global change context. The optimal adaptation plan encompasses measures in both the agricultural and urban sectors, upstream and downstream of the basin, disregarding the individual interests of the stakeholders. In order to ensure equity in the cost allocation of the adaptation plan, different allocation scenarios are considered. The LCRBOM allows defining a solution space based on economic rationality concepts from cooperative game theory (the core of the game), and then defining equitable allocations of the cost of the programme of measures (the Shapley value and the nucleolus). Moreover, alternative allocation scenarios have been considered based on axiomatic principles of social justice, such as "utilitarian", "prior rights" or "strict equality", applied to the case study area. The comparison of the cost allocation scenarios brings insight to inform the decision-making process at the river basin scale and potentially reap the efficiency gains from cooperation in the design of the adaptation plan. The study has been partially supported by the IMPADAPT project (CGL2013-48424-C2-1-R) from the Spanish ministry MINECO (Ministerio de Economía y Competitividad) and European FEDER funds. Corentin Girard is supported by a grant from the University Lecturer Training Program (FPU12/03803) of the Ministry of Education, Culture and Sports of Spain.
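For readers unfamiliar with the cooperative game theory concepts mentioned, here is a minimal sketch of a Shapley-value cost allocation; the stakeholder names and coalition costs are hypothetical, not Orb basin data.

```python
from itertools import permutations

def shapley_cost_allocation(players, coalition_cost):
    """Shapley value of a cost game: average marginal cost each player adds
    over all possible join orders. `coalition_cost` maps a frozenset of
    players to the least cost of serving that coalition alone.

    Exhaustive enumeration; fine for a handful of stakeholders.
    """
    shapley = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = frozenset()
        for p in order:
            marginal = coalition_cost[coalition | {p}] - coalition_cost.get(coalition, 0.0)
            shapley[p] += marginal
            coalition = coalition | {p}
    return {p: total / len(orders) for p, total in shapley.items()}

# Hypothetical adaptation costs (e.g., millions of euros), alone or jointly:
costs = {
    frozenset(): 0.0,
    frozenset({"agriculture"}): 10.0,
    frozenset({"urban"}): 8.0,
    frozenset({"agriculture", "urban"}): 14.0,
}
print(shapley_cost_allocation(["agriculture", "urban"], costs))
# -> agriculture pays 8.0, urban pays 6.0; joint cost 14.0 is fully shared.
```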
The Use of Mixed Methods in Randomized Control Trials
ERIC Educational Resources Information Center
White, Howard
2013-01-01
Evaluations should be issues driven, not methods driven. The starting point should be priority programs to be evaluated or policies to be tested. From this starting point, a list of evaluation questions is identified. For each evaluation question, the task is to identify the best available method for answering that question. Hence it is likely…
The State of Evaluation in Internal Medicine Residency
Holmboe, Eric; Beasley, Brent W.
2008-01-01
Background There are no nationwide data on the methods residency programs are using to assess trainee competence. The Accreditation Council for Graduate Medical Education (ACGME) has recommended tools that programs can use to evaluate their trainees. It is unknown if programs are adhering to these recommendations. Objective To describe evaluation methods used by our nation’s internal medicine residency programs and assess adherence to ACGME methodological recommendations for evaluation. Design Nationwide survey. Participants All internal medicine programs registered with the Association of Program Directors of Internal Medicine (APDIM). Measurements Descriptive statistics of programs and tools used to evaluate competence; compliance with ACGME recommended evaluative methods. Results The response rate was 70%. Programs were using an average of 4.2–6.0 tools to evaluate their trainees with heavy reliance on rating forms. Direct observation and practice and data-based tools were used much less frequently. Most programs were using at least 1 of the Accreditation Council for Graduate Medical Education (ACGME)’s “most desirable” methods of evaluation for all 6 measures of trainee competence. These programs had higher support staff to resident ratios than programs using less desirable evaluative methods. Conclusions Residency programs are using a large number and variety of tools for evaluating the competence of their trainees. Most are complying with ACGME recommended methods of evaluation especially if the support staff to resident ratio is high. PMID:18612734
2017-07-01
ERDC/CERL TR-17-25, Army Environmental Quality Technology: An Evaluation of Methods for Assessing Vulnerability of Army Installations to Impacts of Climate Change on Listed and At-Risk Species, by Matthew G. Hohmann... their suitability for informing BRAC-related evaluations. Three recently developed methods for assessing the vulnerability of Army installations to...
A Tool for the Automated Design and Evaluation of Habitat Interior Layouts
NASA Technical Reports Server (NTRS)
Simon, Matthew A.; Wilhite, Alan W.
2013-01-01
The objective of space habitat design is to minimize mass and system size while providing adequate space for all necessary equipment and a functional layout that supports crew health and productivity. Unfortunately, development and evaluation of interior layouts is often ignored during conceptual design because of the subjectivity and long times required using current evaluation methods (e.g., human-in-the-loop mockup tests and in-depth CAD evaluations). Early, more objective assessment could prevent expensive design changes that may increase vehicle mass and compromise functionality. This paper describes a new interior design evaluation method to enable early, structured consideration of habitat interior layouts. This interior layout evaluation method features a comprehensive list of quantifiable habitat layout evaluation criteria, automatic methods to measure these criteria from a geometry model, and application of systems engineering tools and numerical methods to construct a multi-objective value function measuring the overall habitat layout performance. In addition to a detailed description of this method, a C++/OpenGL software tool which has been developed to implement this method is also discussed. This tool leverages geometry modeling coupled with collision detection techniques to identify favorable layouts subject to multiple constraints and objectives (e.g., minimize mass, maximize contiguous habitable volume, maximize task performance, and minimize crew safety risks). Finally, a few habitat layout evaluation examples are described to demonstrate the effectiveness of this method and tool to influence habitat design.
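The multi-objective value function mentioned above can be illustrated with a minimal weighted-sum sketch; the criterion names, normalization and weights below are hypothetical stand-ins for the systems-engineering-derived criteria used by the tool.

```python
from dataclasses import dataclass

@dataclass
class LayoutScores:
    # Each criterion normalized to [0, 1], where 1 is best.
    habitable_volume: float
    task_performance: float
    safety: float
    mass_efficiency: float

def layout_value(s: LayoutScores, weights=None) -> float:
    """Weighted-sum value function over normalized layout criteria.

    Hypothetical criteria and weights; the tool derives its own criteria
    list and weights from systems-engineering methods.
    """
    if weights is None:
        weights = {"habitable_volume": 0.30, "task_performance": 0.30,
                   "safety": 0.25, "mass_efficiency": 0.15}
    return (weights["habitable_volume"] * s.habitable_volume
            + weights["task_performance"] * s.task_performance
            + weights["safety"] * s.safety
            + weights["mass_efficiency"] * s.mass_efficiency)

# Example: compare two candidate layouts by their aggregate value.
print(layout_value(LayoutScores(0.8, 0.7, 0.9, 0.6)))
print(layout_value(LayoutScores(0.6, 0.9, 0.7, 0.8)))
```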
Evaluating Health Information Systems Using Ontologies
Anderberg, Peter; Larsson, Tobias C; Fricker, Samuel A; Berglund, Johan
2016-01-01
Background There are several frameworks that attempt to address the challenges of evaluation of health information systems by offering models, methods, and guidelines about what to evaluate, how to evaluate, and how to report the evaluation results. Model-based evaluation frameworks usually suggest universally applicable evaluation aspects but do not consider case-specific aspects. On the other hand, evaluation frameworks that are case specific, by eliciting user requirements, limit their output to the evaluation aspects suggested by the users in the early phases of system development. In addition, these case-specific approaches extract different sets of evaluation aspects from each case, making it challenging to collectively compare, unify, or aggregate the evaluation of a set of heterogeneous health information systems. Objectives The aim of this paper is to find a method capable of suggesting evaluation aspects for a set of one or more health information systems—whether similar or heterogeneous—by organizing, unifying, and aggregating the quality attributes extracted from those systems and from an external evaluation framework. Methods On the basis of the available literature in semantic networks and ontologies, a method (called Unified eValuation using Ontology; UVON) was developed that can organize, unify, and aggregate the quality attributes of several health information systems into a tree-style ontology structure. The method was extended to integrate its generated ontology with the evaluation aspects suggested by model-based evaluation frameworks. An approach was developed to extract evaluation aspects from the ontology that also considers evaluation case practicalities such as the maximum number of evaluation aspects to be measured or their required degree of specificity. The method was applied and tested in Future Internet Social and Technological Alignment Research (FI-STAR), a project of 7 cloud-based eHealth applications that were developed and deployed across European Union countries. Results The relevance of the evaluation aspects created by the UVON method for the FI-STAR project was validated by the corresponding stakeholders of each case. These evaluation aspects were extracted from a UVON-generated ontology structure that reflects both the internally declared required quality attributes in the 7 eHealth applications of the FI-STAR project and the evaluation aspects recommended by the Model for ASsessment of Telemedicine applications (MAST) evaluation framework. The extracted evaluation aspects were used to create questionnaires (for the corresponding patients and health professionals) to evaluate each individual case and the whole of the FI-STAR project. Conclusions The UVON method can provide a relevant set of evaluation aspects for a heterogeneous set of health information systems by organizing, unifying, and aggregating the quality attributes through ontological structures. Those quality attributes can be either suggested by evaluation models or elicited from the stakeholders of those systems in the form of system requirements. The method continues to be systematic, context sensitive, and relevant across a heterogeneous set of health information systems. PMID:27311735
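As a rough illustration of the organizing and aggregating step that UVON describes (not its published algorithm), quality attributes declared by several systems can be merged into a simple tree keyed by a shared parent aspect:

```python
from collections import defaultdict

def build_aspect_tree(system_attributes):
    """Merge (parent_aspect, quality_attribute) pairs from several systems
    into one tree: parent aspect -> attribute -> systems that declared it.

    Illustrative sketch only; UVON's ontology construction is richer.
    """
    tree = defaultdict(lambda: defaultdict(set))
    for system, attributes in system_attributes.items():
        for parent, attribute in attributes:
            tree[parent][attribute].add(system)
    return {parent: {attr: sorted(systems) for attr, systems in children.items()}
            for parent, children in tree.items()}

# Hypothetical example with two eHealth applications:
apps = {
    "app_a": [("usability", "learnability"), ("reliability", "availability")],
    "app_b": [("usability", "satisfaction"), ("usability", "learnability")],
}
print(build_aspect_tree(apps))
```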
Core Professionalism Education in Surgery: A Systematic Review
Sarıoğlu Büke, Akile; Karabilgin Öztürkçü, Özlem Sürel; Yılmaz, Yusuf; Sayek, İskender
2018-01-01
Background: Professionalism education is one of the major elements of surgical residency education. Aims: To evaluate the studies on core professionalism education programs in surgical professionalism education. Study Design: Systematic review. Methods: This systematic literature review was performed to analyze core professionalism programs for surgical residency education published in English with at least three of the following features: program developmental model/instructional design method, aims and competencies, methods of teaching, methods of assessment, and program evaluation model or method. A total of 27083 articles were retrieved using EBSCOHOST, PubMed, Science Direct, Web of Science, and manual search. Results: Eight articles met the selection criteria. The instructional design method was presented in only one article, which described the Analysis, Design, Development, Implementation, and Evaluation model. Six articles were based on the Accreditation Council for Graduate Medical Education criterion, although there was significant variability in content. The most common teaching method was role modeling with scenario- and case-based learning. A wide range of assessment methods for evaluating professionalism education were reported. The Kirkpatrick model was reported in one article as a method for program evaluation. Conclusion: It is suggested that for a core surgical professionalism education program, developmental/instructional design model, aims and competencies, content, teaching methods, assessment methods, and program evaluation methods/models should be well defined, and the content should be comparable. PMID:29553464
Computerized Analysis of Digital Photographs for Evaluation of Tooth Movement
Toodehzaeim, Mohammad Hossein; Karandish, Maryam; Karandish, Mohammad Nabi
2015-01-01
Objectives: Various methods have been introduced for evaluation of tooth movement in orthodontics. The challenge is to adopt the most accurate and most beneficial method for patients. This study was designed to introduce analysis of digital photographs with AutoCAD software as a method to evaluate tooth movement and assess the reliability of this method. Materials and Methods: Eighteen patients were evaluated in this study. Three intraoral digital images from the buccal view were captured from each patient in half an hour interval. All the photos were sent to AutoCAD software 2011, calibrated and the distance between canine and molar hooks were measured. The data was analyzed using intraclass correlation coefficient. Results: Photographs were found to have high reliability coefficient (P > 0.05). Conclusion: The introduced method is an accurate, efficient and reliable method for evaluation of tooth movement. PMID:26622272
A human reliability based usability evaluation method for safety-critical software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boring, R. L.; Tran, T. Q.; Gertman, D. I.
2006-07-01
Boring and Gertman (2005) introduced a novel method that augments heuristic usability evaluation methods with the human reliability analysis method of SPAR-H. By assigning probabilistic modifiers to individual heuristics, it is possible to arrive at the usability error probability (UEP). Although this UEP is not a literal probability of error, it nonetheless provides a quantitative basis for heuristic evaluation. This method allows one to seamlessly prioritize and identify usability issues (i.e., a higher UEP requires more immediate fixes). However, the original version of this method required the usability evaluator to assign priority weights to the final UEP, thus allowing the priority of a usability issue to differ among usability evaluators. The purpose of this paper is to explore an alternative approach to standardize the priority weighting of the UEP in an effort to improve the method's reliability. (authors)
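A minimal sketch of the kind of calculation described, turning per-heuristic probabilistic modifiers into a single usability error probability, might look like this; the heuristic names, modifier values and the independence-style combination rule are assumptions, not the SPAR-H-derived values used by Boring and Gertman.

```python
def usability_error_probability(violations: dict) -> float:
    """Combine per-heuristic error modifiers into a single UEP.

    `violations` maps each violated heuristic to its probabilistic modifier
    (a pseudo-probability of user error attributable to that violation).
    The independence-style combination below is an illustrative choice.
    """
    p_no_error = 1.0
    for heuristic, modifier in violations.items():
        p_no_error *= (1.0 - modifier)
    return 1.0 - p_no_error

# Hypothetical assessment of one interface screen:
uep = usability_error_probability({
    "visibility_of_system_status": 0.01,
    "error_prevention": 0.05,
})
print(f"UEP = {uep:.4f}")  # not a literal error probability, a prioritization aid
```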
Cross-Continental Reflections on Evaluation Practice: Methods, Use, and Valuing
ERIC Educational Resources Information Center
Kallemeyn, Leanne M.; Hall, Jori; Friche, Nanna; McReynolds, Clifton
2015-01-01
The evaluation theory tree typology reflects the following three components of evaluation practice: (a) methods, (b) use, and (c) valuing. The purpose of this study was to explore how evaluation practice is conceived as reflected in articles published in the "American Journal of Evaluation" ("AJE") and "Evaluation," a…
Evaluating co-creation of knowledge: from quality criteria and indicators to methods
NASA Astrophysics Data System (ADS)
Schuck-Zöller, Susanne; Cortekar, Jörg; Jacob, Daniela
2017-11-01
Basic research in the natural sciences rests on a long tradition of evaluation. However, since the San Francisco Declaration on Research Assessment (DORA) came out in 2012, there has been intense discussion in the natural sciences, above all amongst researchers and funding agencies in the different fields of applied research and scientific service. This discussion intensified when climate services and other fields that routinely involve users in research and development activities (co-creation) demanded new evaluation methods appropriate to this new research mode. This paper starts by describing a comprehensive and interdisciplinary literature overview of indicators to evaluate the co-creation of knowledge, including the different fields of integrated knowledge production. The authors then harmonize the different elements of evaluation from the literature into an evaluation cascade that scales down from very general evaluation dimensions to tangible assessment methods. They describe evaluation indicators that are already documented and include a mixture of different assessment methods for two exemplary criteria. It is shown what can be deduced from the existing methodology for climate services, and how climate services can further develop their specific evaluation methods.
ERIC Educational Resources Information Center
Aucoin, Jennifer Mangrum
2013-01-01
The purpose of this mixed methods concurrent triangulation study was to examine the program evaluation practices of high school counselors. A total of 294 high school counselors in Texas were assessed using a mixed methods concurrent triangulation design. A researcher-developed survey, the School Counseling Program Evaluation Questionnaire…
Evaluation method based on the image correlation for laser jamming image
NASA Astrophysics Data System (ADS)
Che, Jinxi; Li, Zhongmin; Gao, Bo
2013-09-01
The jamming effectiveness evaluation of infrared imaging systems is an important part of electro-optical countermeasures. Infrared imaging devices are widely used in the military for searching, tracking, guidance and many other purposes. At the same time, with the continuous development of laser technology, research on laser interference and damage effects has advanced continuously, and lasers have been used to disturb infrared imaging devices. Therefore, evaluating the effect of laser jamming on infrared imaging systems has become a meaningful problem to be solved. The information that an infrared imaging system ultimately presents to the user is an image, so the jamming effect can be evaluated from the standpoint of image quality assessment. An image contains two kinds of information, the light amplitude and the light phase, so image correlation can accurately capture the difference between the original image and the disturbed image. In this paper, the evaluation method based on digital image correlation, the image quality assessment method based on the Fourier transform, the image quality estimation method based on error statistics, and the evaluation method based on peak signal-to-noise ratio are analysed, along with the advantages and disadvantages of each. Moreover, the disturbed infrared images from an experiment in which a thermal infrared imager was jammed by a laser were analysed using these methods. The results show that the methods can well reflect the jamming effect of a laser on an infrared imaging system, and that there is good consistency between the evaluation results obtained with these methods and the results of subjective visual evaluation. The methods also provide good repeatability and convenient quantitative analysis. The feasibility of the methods for evaluating the jamming effect was thereby demonstrated, which has some reference value for studying and developing electro-optical countermeasure equipment and effectiveness evaluation.
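Two of the measures discussed, digital image correlation and peak signal-to-noise ratio between the original and disturbed frames, can be sketched generically as follows; this is an illustrative implementation, not the authors' exact formulation.

```python
import numpy as np

def normalized_correlation(ref: np.ndarray, jammed: np.ndarray) -> float:
    """Zero-mean normalized cross-correlation; 1.0 means identical images."""
    a = ref.astype(float) - ref.mean()
    b = jammed.astype(float) - jammed.mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def psnr(ref: np.ndarray, jammed: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB; lower values mean stronger jamming."""
    mse = np.mean((ref.astype(float) - jammed.astype(float)) ** 2)
    return float("inf") if mse == 0 else float(10.0 * np.log10(peak ** 2 / mse))
```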
NASA Astrophysics Data System (ADS)
Riechers, Paul M.; Crutchfield, James P.
2018-06-01
Nonlinearities in finite dimensions can be linearized by projecting them into infinite dimensions. Unfortunately, the familiar linear operator techniques that one would then hope to use often fail since the operators cannot be diagonalized. The curse of nondiagonalizability also plays an important role even in finite-dimensional linear operators, leading to analytical impediments that occur across many scientific domains. We show how to circumvent it via two tracks. First, using the well-known holomorphic functional calculus, we develop new practical results about spectral projection operators and the relationship between left and right generalized eigenvectors. Second, we generalize the holomorphic calculus to a meromorphic functional calculus that can decompose arbitrary functions of nondiagonalizable linear operators in terms of their eigenvalues and projection operators. This simultaneously simplifies and generalizes functional calculus so that it is readily applicable to analyzing complex physical systems. Together, these results extend the spectral theorem of normal operators to a much wider class, including circumstances in which poles and zeros of the function coincide with the operator spectrum. By allowing the direct manipulation of individual eigenspaces of nonnormal and nondiagonalizable operators, the new theory avoids spurious divergences. As such, it yields novel insights and closed-form expressions across several areas of physics in which nondiagonalizable dynamics arise, including memoryful stochastic processes, open nonunitary quantum systems, and far-from-equilibrium thermodynamics. The technical contributions include the first full treatment of arbitrary powers of an operator, highlighting the special role of the zero eigenvalue. Furthermore, we show that the Drazin inverse, previously only defined axiomatically, can be derived as the negative-one power of singular operators within the meromorphic functional calculus and we give a new general method to construct it. We provide new formulae for constructing spectral projection operators and delineate the relations among projection operators, eigenvectors, and left and right generalized eigenvectors. By way of illustrating its application, we explore several, rather distinct examples. First, we analyze stochastic transition operators in discrete and continuous time. Second, we show that nondiagonalizability can be a robust feature of a stochastic process, induced even by simple counting. As a result, we directly derive distributions of the time-dependent Poisson process and point out that nondiagonalizability is intrinsic to it and the broad class of hidden semi-Markov processes. Third, we show that the Drazin inverse arises naturally in stochastic thermodynamics and that applying the meromorphic functional calculus provides closed-form solutions for the dynamics of key thermodynamic observables. Finally, we draw connections to the Ruelle-Frobenius-Perron and Koopman operators for chaotic dynamical systems and propose how to extract eigenvalues from a time-series.
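For reference, the axiomatic characterization of the Drazin inverse alluded to above is the standard one: for an operator $A$ of index $k$ (the smallest integer with $\operatorname{rank} A^{k+1} = \operatorname{rank} A^{k}$), the Drazin inverse $A^{D}$ is the unique operator satisfying

```latex
A^{D} A A^{D} = A^{D}, \qquad A A^{D} = A^{D} A, \qquad A^{k+1} A^{D} = A^{k}.
```

The paper's contribution on this point is to recover $A^{D}$ constructively, as the negative-one power of a singular operator within the meromorphic functional calculus, rather than through these defining axioms.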
Evaluation of Thermoelectric Devices by the Slope-Efficiency Method
2016-09-01
ARL-TR-7837, September 2016, US Army Research Laboratory: Evaluation of Thermoelectric Devices by the Slope-Efficiency Method, by Patrick J Taylor, Sensors and Electron Devices Directorate, ARL; Jay R...
DOT National Transportation Integrated Search
2000-01-01
Three nondestructive evaluation (NDE) methods for concrete pavements - surface ultrasonic pulse velocity measurements (UPV), the impact-echo (IE) method, and the use of a seismic pavement analyzer (SPA) - were tested on six sections of two continuous...
DOT National Transportation Integrated Search
2015-10-01
The overarching goal of the MoDOT Pavement Preservation Research Program, Task 3: Pavement Evaluation Tools Data Collection Methods, was to identify and evaluate methods to rapidly obtain network-level and project-level information relevant to...
Examining mixing methods in an evaluation of a smoking cessation program.
Betzner, Anne; Lawrenz, Frances P; Thao, Mao
2016-02-01
Three different methods were used in an evaluation of a smoking cessation study: surveys, focus groups, and phenomenological interviews. The results of each method were analyzed separately and then combined using both a pragmatic and dialectic stance to examine the effects of different approaches to mixing methods. Results show that the further apart the methods are philosophically, the more diverse the findings. Comparisons of decision maker opinions and costs of the different methods are provided along with recommendations for evaluators' uses of different methods. Copyright © 2015. Published by Elsevier Ltd.
A KARAOKE System Singing Evaluation Method that More Closely Matches Human Evaluation
NASA Astrophysics Data System (ADS)
Takeuchi, Hideyo; Hoguro, Masahiro; Umezaki, Taizo
KARAOKE is a popular amusement for both old and young, and many KARAOKE machines have a singing evaluation function. However, it is often said that the scores given by KARAOKE machines do not match human evaluation. In this paper, a KARAOKE scoring method strongly correlated with human evaluation is proposed. The paper proposes a way to evaluate songs based on the distance between the singing pitch and the musical scale, employing a vibrato extraction method based on template matching of the spectrum. The results show that the correlation coefficients between scores given by the proposed system and human evaluation are -0.76∼-0.89.
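A minimal sketch of the core scoring idea, measuring how far the sung pitch sits from the musical scale, could be the following; here the distance to the nearest equal-tempered semitone stands in for the song's scale, the mapping from distance to a score is an assumption, and the spectral template-matching vibrato extraction is omitted.

```python
import numpy as np

def pitch_to_cents(freq_hz: np.ndarray, ref_hz: float = 440.0) -> np.ndarray:
    """Convert frequencies to cents relative to A4 = 440 Hz."""
    return 1200.0 * np.log2(freq_hz / ref_hz)

def scale_distance_score(sung_hz: np.ndarray) -> float:
    """Score a pitch track by its mean distance (in cents) to the nearest
    equal-tempered semitone; a smaller distance yields a higher score."""
    cents = pitch_to_cents(sung_hz)
    dist_to_scale = np.abs(cents - np.round(cents / 100.0) * 100.0)  # 100 cents per semitone
    return float(100.0 - dist_to_scale.mean())  # illustrative 0-100-style score

# Hypothetical frame-wise fundamental-frequency estimates (Hz):
print(scale_distance_score(np.array([440.0, 446.0, 332.0, 261.0])))
```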
A strategy for evaluating pathway analysis methods.
Yu, Chenggang; Woo, Hyung Jun; Yu, Xueping; Oyama, Tatsuya; Wallqvist, Anders; Reifman, Jaques
2017-10-13
Researchers have previously developed a multitude of methods designed to identify biological pathways associated with specific clinical or experimental conditions of interest, with the aim of facilitating biological interpretation of high-throughput data. Before practically applying such pathway analysis (PA) methods, we must first evaluate their performance and reliability, using datasets where the pathways perturbed by the conditions of interest have been well characterized in advance. However, such 'ground truths' (or gold standards) are often unavailable. Furthermore, previous evaluation strategies that have focused on defining 'true answers' are unable to systematically and objectively assess PA methods under a wide range of conditions. In this work, we propose a novel strategy for evaluating PA methods independently of any gold standard, either established or assumed. The strategy involves the use of two mutually complementary metrics, recall and discrimination. Recall measures the consistency of the perturbed pathways identified by applying a particular analysis method to an original large dataset and those identified by the same method to a sub-dataset of the original dataset. In contrast, discrimination measures specificity-the degree to which the perturbed pathways identified by a particular method to a dataset from one experiment differ from those identifying by the same method to a dataset from a different experiment. We used these metrics and 24 datasets to evaluate six widely used PA methods. The results highlighted the common challenge in reliably identifying significant pathways from small datasets. Importantly, we confirmed the effectiveness of our proposed dual-metric strategy by showing that previous comparative studies corroborate the performance evaluations of the six methods obtained by our strategy. Unlike any previously proposed strategy for evaluating the performance of PA methods, our dual-metric strategy does not rely on any ground truth, either established or assumed, of the pathways perturbed by a specific clinical or experimental condition. As such, our strategy allows researchers to systematically and objectively evaluate pathway analysis methods by employing any number of datasets for a variety of conditions.
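A minimal sketch of the two metrics as described, with each pathway-analysis result represented as a set of significant pathway identifiers, might be the following; the exact normalizations in the paper may differ.

```python
def recall(full_data_pathways: set, subset_pathways: set) -> float:
    """Consistency: fraction of pathways found on the full dataset that are
    re-identified on a sub-dataset by the same PA method."""
    if not full_data_pathways:
        return 0.0
    return len(full_data_pathways & subset_pathways) / len(full_data_pathways)

def discrimination(pathways_exp1: set, pathways_exp2: set) -> float:
    """Specificity: degree to which results from two unrelated experiments
    differ (1.0 means completely different pathway sets)."""
    union = pathways_exp1 | pathways_exp2
    if not union:
        return 1.0
    return 1.0 - len(pathways_exp1 & pathways_exp2) / len(union)

# Hypothetical pathway identifiers from one method on different datasets:
print(recall({"P1", "P2", "P3"}, {"P1", "P3"}))          # 0.67
print(discrimination({"P1", "P2"}, {"P4", "P5"}))         # 1.0
```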
Computerized Analysis of Digital Photographs for Evaluation of Tooth Movement.
Toodehzaeim, Mohammad Hossein; Karandish, Maryam; Karandish, Mohammad Nabi
2015-03-01
Various methods have been introduced for evaluation of tooth movement in orthodontics. The challenge is to adopt the most accurate and most beneficial method for patients. This study was designed to introduce analysis of digital photographs with AutoCAD software as a method to evaluate tooth movement and assess the reliability of this method. Eighteen patients were evaluated in this study. Three intraoral digital images from the buccal view were captured from each patient in half an hour interval. All the photos were sent to AutoCAD software 2011, calibrated and the distance between canine and molar hooks were measured. The data was analyzed using intraclass correlation coefficient. Photographs were found to have high reliability coefficient (P > 0.05). The introduced method is an accurate, efficient and reliable method for evaluation of tooth movement.
Turbulent boundary layers over nonstationary plane boundaries
NASA Technical Reports Server (NTRS)
Roper, A. T.; Gentry, G. L., Jr.
1978-01-01
Methods of predicting integral parameters and skin friction coefficients of turbulent boundary layers developing over moving ground planes were evaluated. The three methods evaluated were: relative integral parameter method; relative power law method; and modified law of the wall method.
NASA Astrophysics Data System (ADS)
Wu, Linqin; Xu, Sheng; Jiang, Dezhi
2015-12-01
Industrial wireless networked control systems have been widely used, so how to evaluate the performance of the wireless network is of great significance. In this paper, considering the shortcomings of the existing performance evaluation methods, a comprehensive network performance evaluation method, the multi-index fuzzy analytic hierarchy process (MFAHP), which combines fuzzy mathematics with the traditional analytic hierarchy process (AHP), is presented. The method overcomes the problem that existing performance evaluations are neither comprehensive nor objective. Experiments show that the method can reflect the real network performance. It provides direct guidance for protocol selection, network cabling and node placement, and can meet the requirements of different applications by modifying the underlying parameters.
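As a hedged illustration of the classical AHP component that MFAHP builds on (the fuzzy extension and the specific network indexes are not shown), the eigenvector weighting of a pairwise-comparison matrix can be computed as:

```python
import numpy as np

def ahp_weights(pairwise: np.ndarray) -> np.ndarray:
    """Principal-eigenvector weights of an AHP pairwise-comparison matrix."""
    eigvals, eigvecs = np.linalg.eig(pairwise)
    principal = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, principal].real)
    return w / w.sum()

# Hypothetical comparison of three network indexes
# (delay vs. packet loss vs. throughput):
A = np.array([[1.0, 3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])
print(ahp_weights(A))
```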
Hoseini, Bibi Leila; Mazloum, Seyed Reza; Jafarnejad, Farzaneh; Foroughipour, Mohsen
2013-03-01
The clinical evaluation, as one of the most important elements in medical education, must measure students' competencies and abilities. The implementation of any assessment tool is basically dependent on the acceptance of students. This study tried to assess midwifery students' satisfaction with Direct Observation of Procedural Skills (DOPS) and current clinical evaluation methods. This quasi-experimental study was conducted in the university hospitals affiliated to Mashhad University of Medical Sciences. The subjects comprised 67 undergraduate midwifery students selected by convenience sampling and allocated to control and intervention groups according to the training transposition. Current method was performed in the control group, and DOPS was conducted in the intervention group. The applied tools included DOPS rating scales, logbook, and satisfaction questionnaires with clinical evaluation methods. Validity and reliability of these tools were approved. At the end of training, students' satisfaction with the evaluation methods was assessed by the mentioned tools. The data were analyzed by descriptive and analytical statistics. Satisfaction mean scores of midwifery students with DOPS and current methods were 76.7 ± 12.9 and 62.6 ± 14.7 (out of 100), respectively. DOPS students' satisfaction mean score was significantly higher than the score obtained in current method (P < 0.000). The most satisfactory domains in the current method were "consistence with learning objectives" (71.2 ± 14.9) and "objectiveness" in DOPS (87.9 ± 15.0). In contrast, the least satisfactory domains in the current method were "interested in applying the method" (57.8 ± 26.5) and "number of assessments for each skill" (58.8 ± 25.9) in DOPS method. This study showed that DOPS method is associated with greater students' satisfaction. Since the students' satisfaction with the current method was also acceptable, we recommend combining this new clinical evaluation method with the current method, which covers its weaknesses, to promote the students' satisfaction with clinical evaluation methods in a perfect manner.
EPA METHODS FOR EVALUATING WETLAND CONDITION, WETLANDS CLASSIFICATION
In 1999, the U.S. Environmental Protection Agency (EPA) began work on this series of reports entitled Methods for Evaluating Wetland Condition. The purpose of these reports is to help States and Tribes develop methods to evaluate 1) the overall ecological condition of wetlands us...
Duan, Xia; Shi, Yan
2014-01-01
Background: The quality evaluation of nursing care is a key link in medical quality management. It is important and worthwhile for nursing supervisors to know the shortcomings of the current process of quality evaluation of nursing care in order to improve overall nursing quality. This study aimed to provide insight from directors of Nursing Quality Control Centers (NQCCs) on the current status of quality evaluation of nursing care. Material and Methods: This qualitative study used a sample of 12 directors from NQCCs, recruited from 12 provinces in China, to evaluate the current status of quality evaluation of nursing care. Data were collected by in-depth interviews. The content analysis method was used to analyze the data. Results: Four themes emerged from the data: 1) lag of evaluation indexes; 2) limitations of evaluation content; 3) simplicity of evaluation methods; 4) excessive emphasis on terminal quality. Conclusion: It is of great practical significance to improve nursing quality evaluation criteria, modify the evaluation content based on a patient-needs-oriented approach, adopt scientific evaluation methods to evaluate nursing quality, and draw scientifically and reasonably founded horizontal comparisons of nursing quality between hospitals, as well as longitudinal comparisons of a hospital's nursing quality over time. These measures can all enhance a hospital's core competitiveness and benefit more patients. PMID:25419427
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ragan, Eric D; Goodall, John R
2014-01-01
Provenance tools can help capture and represent the history of analytic processes. In addition to supporting analytic performance, provenance tools can be used to support memory of the process and communication of the steps to others. Objective evaluation methods are needed to evaluate how well provenance tools support analysts' memory and communication of analytic processes. In this paper, we present several methods for the evaluation of process memory, and we discuss the advantages and limitations of each. We discuss methods for determining a baseline process for comparison, and we describe various methods that can be used to elicit process recall, step ordering, and time estimations. Additionally, we discuss methods for conducting quantitative and qualitative analyses of process memory. By organizing possible memory evaluation methods and providing a meta-analysis of the potential benefits and drawbacks of different approaches, this paper can inform study design and encourage objective evaluation of process memory and communication.
Chen, Yibin; Chen, Jiaxi; Chen, Xuan; Wang, Min; Wang, Wei
2015-01-01
A new method of uniform sampling is evaluated in this paper. Items and indexes were adopted to evaluate the rationality of the uniform sampling. The evaluation items included convenience of operation, uniformity of sampling-site distribution, and accuracy and precision of the measured results. The evaluation indexes included operational complexity, occupation rate of sampling sites in a row and column, relative accuracy of pill weight, and relative deviation of pill weight. They were obtained from three kinds of drugs with different shapes and sizes by four kinds of sampling methods. Gray correlation analysis was adopted to make a comprehensive evaluation by comparison with the standard method. The experimental results showed that the convenience of the uniform sampling method was 1 (100%), the odds ratio of the occupation rate in a row and column was infinity, the relative accuracy was 99.50-99.89%, the reproducibility RSD was 0.45-0.89%, and the weighted incidence degree exceeded that of the standard method. Hence, the uniform sampling method was easy to operate, and the selected samples were distributed uniformly. The experimental results demonstrated that the uniform sampling method has good accuracy and reproducibility and can be put into use in drug analysis.
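Gray correlation (relational) analysis, used here for the comprehensive evaluation, can be sketched generically as follows; the series are assumed to be pre-normalized, and the distinguishing coefficient of 0.5 is the conventional choice rather than anything reported in the study.

```python
import numpy as np

def gray_relational_grades(reference: np.ndarray, candidates: np.ndarray,
                           rho: float = 0.5) -> np.ndarray:
    """Gray relational grade of each candidate series against a reference
    series; values closer to 1 indicate closer agreement.

    `reference` has shape (n_items,), `candidates` shape (n_methods, n_items),
    both already normalized to comparable scales.
    """
    diff = np.abs(candidates - reference)
    d_min, d_max = diff.min(), diff.max()
    coeff = (d_min + rho * d_max) / (diff + rho * d_max)
    return coeff.mean(axis=1)

# Hypothetical normalized index values for a standard method (reference)
# and two sampling methods under comparison:
ref = np.array([1.0, 0.9, 0.95])
methods = np.array([[0.98, 0.88, 0.96],
                    [0.80, 0.70, 0.85]])
print(gray_relational_grades(ref, methods))
```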
Duque-Ramos, Astrid; Boeker, Martin; Jansen, Ludger; Schulz, Stefan; Iniesta, Miguela; Fernández-Breis, Jesualdo Tomás
2014-01-01
To (1) evaluate the GoodOD guideline for ontology development by applying the OQuaRE evaluation method and metrics to the ontology artefacts that were produced by students in a randomized controlled trial, and (2) informally compare the OQuaRE evaluation method with gold standard and competency questions based evaluation methods, respectively. In the last decades many methods for ontology construction and ontology evaluation have been proposed. However, none of them has become a standard and there is no empirical evidence of comparative evaluation of such methods. This paper brings together GoodOD and OQuaRE. GoodOD is a guideline for developing robust ontologies. It was previously evaluated in a randomized controlled trial employing metrics based on gold standard ontologies and competency questions as outcome parameters. OQuaRE is a method for ontology quality evaluation which adapts the SQuaRE standard for software product quality to ontologies and has been successfully used for evaluating the quality of ontologies. In this paper, we evaluate the effect of training in ontology construction based on the GoodOD guideline within the OQuaRE quality evaluation framework and compare the results with those obtained for the previous studies based on the same data. Our results show a significant effect of the GoodOD training over developed ontologies by topics: (a) a highly significant effect was detected in three topics from the analysis of the ontologies of untrained and trained students; (b) both positive and negative training effects with respect to the gold standard were found for five topics. The GoodOD guideline had a significant effect over the quality of the ontologies developed. Our results show that GoodOD ontologies can be effectively evaluated using OQuaRE and that OQuaRE is able to provide additional useful information about the quality of the GoodOD ontologies.
Methodological Pluralism: The Gold Standard of STEM Evaluation
ERIC Educational Resources Information Center
Lawrenz, Frances; Huffman, Douglas
2006-01-01
Nationally, there is continuing debate about appropriate methods for conducting educational evaluations. The U.S. Department of Education has placed a priority on "scientifically" based evaluation methods and has advocated a "gold standard" of randomized controlled experimentation. The priority suggests that randomized control methods are best,…
DOT National Transportation Integrated Search
2006-01-01
This study evaluated two half-cell mapping methods for nondestructive evaluation of epoxy-coated rebar (ECR) in concrete: the semi-fixed bi-electrode and the moving bi-electrode methods. These methods were expected to provide early detection of corro...
An Improved Image Ringing Evaluation Method with Weighted Sum of Gray Extreme Value
NASA Astrophysics Data System (ADS)
Yang, Ling; Meng, Yanhua; Wang, Bo; Bai, Xu
2018-03-01
Blind image restoration algorithms usually produce ringing that is most obvious at the edges. The ringing phenomenon is mainly affected by noise, the type of restoration algorithm, and the accuracy of the blur kernel estimation during restoration. Based on the physical mechanism of ringing, a method for evaluating the ringing in blindly restored images is proposed. The method extracts the overshoot and ripple regions of the ringing image and computes weighted statistics of the regional gradient values. With weights set through multiple experiments, the edge information is used to characterize the edge details, determine the weights and quantify the severity of the ringing effect, yielding an evaluation method for the ringing caused by blind restoration. The experimental results show that the method can effectively evaluate the ringing effect in restored images under different restoration algorithms and different restoration parameters. The evaluation results are consistent with the results of subjective visual evaluation.
Drug exposure in register-based research—An expert-opinion based evaluation of methods
Taipale, Heidi; Koponen, Marjaana; Tolppanen, Anna-Maija; Hartikainen, Sirpa; Ahonen, Riitta; Tiihonen, Jari
2017-01-01
Background In register-based pharmacoepidemiological studies, construction of drug exposure periods from drug purchases is a major methodological challenge. Various methods have been applied but their validity is rarely evaluated. Our objective was to conduct an expert-opinion based evaluation of the correctness of drug use periods produced by different methods. Methods Drug use periods were calculated with three fixed methods: time windows, assumption of one Defined Daily Dose (DDD) per day and one tablet per day, and with PRE2DUP that is based on modelling of individual drug purchasing behavior. Expert-opinion based evaluation was conducted with 200 randomly selected purchase histories of warfarin, bisoprolol, simvastatin, risperidone and mirtazapine in the MEDALZ-2005 cohort (28,093 persons with Alzheimer’s disease). Two experts reviewed purchase histories and judged which methods had joined correct purchases and gave correct duration for each of 1000 drug exposure periods. Results The evaluated correctness of drug use periods was 70–94% for PRE2DUP, and depending on grace periods and time window lengths 0–73% for tablet methods, 0–41% for DDD methods and 0–11% for time window methods. The highest rate of evaluated correct solutions for each method class were observed for 1 tablet per day with 180 days grace period (TAB_1_180, 43–73%), and 1 DDD per day with 180 days grace period (1–41%). Time window methods produced at maximum only 11% correct solutions. The best performing fixed method TAB_1_180 reached highest correctness for simvastatin 73% (95% CI 65–81%) whereas 89% (95% CI 84–94%) of PRE2DUP periods were judged as correct. Conclusions This study shows inaccuracy of fixed methods and the urgent need for new data-driven methods. In the expert-opinion based evaluation, the lowest error rates were observed with data-driven method PRE2DUP. PMID:28886089
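A minimal sketch of the best-performing fixed method described, one tablet per day with purchases joined when the gap does not exceed a 180-day grace period, might look like this; PRE2DUP itself models individual purchasing behaviour and is considerably more involved.

```python
from datetime import date, timedelta

def tablet_exposure_periods(purchases, grace_days: int = 180):
    """Build drug use periods from (purchase_date, n_tablets) records,
    assuming one tablet per day and joining gaps <= grace_days.

    Returns a list of (start_date, end_date) periods. Illustrative only.
    """
    periods = []
    for day, n_tablets in sorted(purchases):
        end = day + timedelta(days=int(n_tablets) - 1)
        if periods and (day - periods[-1][1]).days <= grace_days:
            # Within the grace period: extend the current use period.
            periods[-1] = (periods[-1][0], max(periods[-1][1], end))
        else:
            periods.append((day, end))
    return periods

# Hypothetical purchase history: 100 tablets in January, refill in May.
history = [(date(2020, 1, 1), 100), (date(2020, 5, 15), 100)]
print(tablet_exposure_periods(history))
```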
Evaluation Methods of The Text Entities
ERIC Educational Resources Information Center
Popa, Marius
2006-01-01
The paper highlights some evaluation methods to assess the quality characteristics of the text entities. The main concepts used in building and evaluation processes of the text entities are presented. Also, some aggregated metrics for orthogonality measurements are presented. The evaluation process for automatic evaluation of the text entities is…
ORE's GENeric Evaluation SYStem: GENESYS 1988-89.
ERIC Educational Resources Information Center
Baenen, Nancy; And Others
GENESYS--GENeric Evaluation SYStem--is a method of streamlining data collection and evaluation through the use of computer technology. GENESYS has allowed the Office of Research and Evaluation (ORE) of the Austin (Texas) Independent School District to evaluate a multitude of contrasting programs with limited resources. By standardizing methods and…
Evaluation as a Collaborative Activity to Learn Content Knowledge in a Graduate Course
ERIC Educational Resources Information Center
Hughes, Bob; Arbogast, Janet; Kafer, Lindsey; Chen, Julianna
2014-01-01
Teaching graduate students to conduct evaluations is typically relegated to evaluation methods courses. This approach misses an opportunity for students to collaboratively use evaluation skills to explore content. This article examines a graduate course, Issues in Adult Basic Education, in which students learned evaluation methods concurrently…
NASA Astrophysics Data System (ADS)
Zhu, Lianqing; Chen, Yunfang; Chen, Qingshan; Meng, Hao
2011-05-01
According to the minimum zone condition, a method for evaluating the profile error of an Archimedes helicoid surface based on a Genetic Algorithm (GA) is proposed. The mathematical model of the surface is provided, and the unknown parameters in the equation of the surface are acquired through the least squares method. The principle of the GA is explained. Then, the profile error of the Archimedes helicoid surface is obtained through the GA optimization method. To validate the proposed method, the profile error of an Archimedes helicoid surface, an Archimedes cylindrical worm (ZA worm) surface, is evaluated. The results show that the proposed method is capable of correctly evaluating the profile error of an Archimedes helicoid surface and satisfies the evaluation standard of the minimum zone method. It can be applied to process the measured profile error data of complex surfaces obtained by three-coordinate measuring machines (CMMs).
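A minimal sketch of the minimum-zone idea, searching for the ideal-surface parameters that minimize the peak-to-valley band of the measured deviations, could look like the following; SciPy's evolutionary optimizer stands in for the paper's genetic algorithm, and the deviation function and parameter bounds are left to the user.

```python
import numpy as np
from scipy.optimize import differential_evolution

def min_zone_profile_error(points: np.ndarray, deviation_fn, bounds) -> float:
    """Minimum-zone profile error: the smallest peak-to-valley band containing
    all measured deviations, over the admissible ideal-surface parameters.

    `deviation_fn(params, points)` must return one signed deviation per
    measured point for the candidate ideal surface defined by `params`.
    """
    def zone_width(params):
        d = deviation_fn(params, points)
        return d.max() - d.min()

    # Evolutionary search over the parameter bounds (GA-like global optimizer).
    result = differential_evolution(zone_width, bounds, seed=0)
    return float(result.fun)
```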
Chen, Jing; Wang, Shu-Mei; Meng, Jiang; Sun, Fei; Liang, Sheng-Wang
2013-05-01
To establish a new method for quality evaluation and validate its feasibility by the simultaneous quantitative assay of five alkaloids in Sophora flavescens. The new quality evaluation method, quantitative analysis of multi-components by single marker (QAMS), was established and validated with S. flavescens. Five main alkaloids, oxymatrine, sophocarpine, matrine, oxysophocarpine and sophoridine, were selected as analytes to evaluate the quality of the rhizome of S. flavescens, and the relative correction factors showed good repeatability. Their contents in 21 batches of samples, collected from different areas, were determined by both the external standard method and QAMS. The method was evaluated by comparing the quantitative results of the external standard method and QAMS. No significant differences were found in the quantitative results of the five alkaloids in the 21 batches of S. flavescens determined by the external standard method and QAMS. It is feasible and suitable to evaluate the quality of the rhizome of S. flavescens by QAMS.
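For orientation, one common way the QAMS calculation is written (conventions vary between papers, and the choice of internal marker is case specific) uses peak areas $A$ and concentrations $C$ of the single marker $s$ and another analyte $i$:

```latex
f_{i/s} = \frac{A_i / C_i}{A_s / C_s}\,, \qquad
C_i = \frac{A_i \, C_s}{f_{i/s} \, A_s}\,.
```

The relative correction factor $f_{i/s}$ is determined once from reference standards, after which only the single marker requires an external standard in routine analysis.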
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurnik, Charles W; Gowans, Dakers; Telarico, Chad
The Commercial and Industrial Lighting Evaluation Protocol (the protocol) describes methods to account for gross energy savings resulting from the programmatic installation of efficient lighting equipment in large populations of commercial, industrial, and other nonresidential facilities. This protocol does not address savings resulting from changes in codes and standards, or from education and training activities. A separate Uniform Methods Project (UMP) protocol, Chapter 3: Commercial and Industrial Lighting Controls Evaluation Protocol, addresses methods for evaluating savings resulting from lighting control measures such as adding time clocks, tuning energy management system commands, and adding occupancy sensors.
Duque, Gustavo; Finkelstein, Adam; Roberts, Ayanna; Tabatabai, Diana; Gold, Susan L; Winer, Laura R
2006-01-01
Background Electronic evaluation portfolios may play a role in learning and evaluation in clinical settings and may complement other traditional evaluation methods (bedside evaluations, written exams and tutor-led evaluations). Methods 133 third-year medical students used the McGill Electronic Evaluation Portfolio (MEEP) during their one-month clerkship rotation in Geriatric Medicine between September 2002 and September 2003. Students were divided into two groups, one who received an introductory hands-on session about the electronic evaluation portfolio and one who did not. Students' marks in their portfolios were compared between both groups. Additionally, students self-evaluated their performance and received feedback using the electronic portfolio during their mandatory clerkship rotation. Students were surveyed immediately after the rotation and at the end of the clerkship year. Tutors' opinions about this method were surveyed once. Finally, the number of evaluations/month was quantified. In all surveys, Likert scales were used and were analyzed using Chi-square tests and t-tests to assess significant differences in the responses from surveyed subjects. Results The introductory session had a significant effect on students' portfolio marks as well as on their comfort using the system. Both tutors and students reported positive notions about the method. Remarkably, an average (± SD) of 520 (± 70) evaluations/month was recorded with 30 (± 5) evaluations per student/month. Conclusion The MEEP showed a significant and positive effect on both students' self-evaluations and tutors' evaluations involving an important amount of self-reflection and feedback which may complement the more traditional evaluation methods. PMID:16409640
Comparative Evaluation of Quantitative Test Methods for Gases on a Hard Surface
2017-02-01
ECBC-TR-1426: Comparative Evaluation of Quantitative Test Methods for Gases on a Hard Surface, by Vipin Rastogi... 1. Introduction: Members of the U.S. Environmental... Generator... 2.4 Experimental Design: Each quantitative method was performed three times on three consecutive days. For the CD runs, three...
Evaluating Health Information Systems Using Ontologies.
Eivazzadeh, Shahryar; Anderberg, Peter; Larsson, Tobias C; Fricker, Samuel A; Berglund, Johan
2016-06-16
There are several frameworks that attempt to address the challenges of evaluation of health information systems by offering models, methods, and guidelines about what to evaluate, how to evaluate, and how to report the evaluation results. Model-based evaluation frameworks usually suggest universally applicable evaluation aspects but do not consider case-specific aspects. On the other hand, evaluation frameworks that are case specific, by eliciting user requirements, limit their output to the evaluation aspects suggested by the users in the early phases of system development. In addition, these case-specific approaches extract different sets of evaluation aspects from each case, making it challenging to collectively compare, unify, or aggregate the evaluation of a set of heterogeneous health information systems. The aim of this paper is to find a method capable of suggesting evaluation aspects for a set of one or more health information systems-whether similar or heterogeneous-by organizing, unifying, and aggregating the quality attributes extracted from those systems and from an external evaluation framework. On the basis of the available literature in semantic networks and ontologies, a method (called Unified eValuation using Ontology; UVON) was developed that can organize, unify, and aggregate the quality attributes of several health information systems into a tree-style ontology structure. The method was extended to integrate its generated ontology with the evaluation aspects suggested by model-based evaluation frameworks. An approach was developed to extract evaluation aspects from the ontology that also considers evaluation case practicalities such as the maximum number of evaluation aspects to be measured or their required degree of specificity. The method was applied and tested in Future Internet Social and Technological Alignment Research (FI-STAR), a project of 7 cloud-based eHealth applications that were developed and deployed across European Union countries. The relevance of the evaluation aspects created by the UVON method for the FI-STAR project was validated by the corresponding stakeholders of each case. These evaluation aspects were extracted from a UVON-generated ontology structure that reflects both the internally declared required quality attributes in the 7 eHealth applications of the FI-STAR project and the evaluation aspects recommended by the Model for ASsessment of Telemedicine applications (MAST) evaluation framework. The extracted evaluation aspects were used to create questionnaires (for the corresponding patients and health professionals) to evaluate each individual case and the whole of the FI-STAR project. The UVON method can provide a relevant set of evaluation aspects for a heterogeneous set of health information systems by organizing, unifying, and aggregating the quality attributes through ontological structures. Those quality attributes can be either suggested by evaluation models or elicited from the stakeholders of those systems in the form of system requirements. The method continues to be systematic, context sensitive, and relevant across a heterogeneous set of health information systems.
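For readers who want a concrete picture of the tree-style organization described above, the following is a minimal sketch and not the published UVON algorithm: quality attributes declared by several hypothetical systems are unified under shared category paths and then extracted at a requested level of specificity, ranked by how many systems support each aspect. All class names, category paths and the merging rule are assumptions for illustration.

```python
# Minimal sketch (not the published UVON algorithm): quality attributes from
# several systems are unified under shared categories in a tree, then the
# most widely supported aspects are extracted at a chosen depth.
class Node:
    def __init__(self, name):
        self.name = name
        self.systems = set()        # systems that declared this attribute
        self.children = {}          # child name -> Node

    def add(self, path, system):
        self.systems.add(system)
        if path:
            child = self.children.setdefault(path[0], Node(path[0]))
            child.add(path[1:], system)

    def extract(self, max_depth, depth=0):
        """Return (aspect, support) pairs no deeper than max_depth."""
        if depth == max_depth or not self.children:
            return [(self.name, len(self.systems))]
        aspects = []
        for child in self.children.values():
            aspects.extend(child.extract(max_depth, depth + 1))
        return aspects

root = Node("quality")
# Each system declares attributes as category paths (hypothetical examples).
declarations = {
    "app_A": [("usability", "learnability"), ("security", "privacy")],
    "app_B": [("usability", "efficiency"), ("security", "privacy")],
    "framework": [("effectiveness",), ("security",)],
}
for system, paths in declarations.items():
    for path in paths:
        root.add(list(path), system)

# Evaluation aspects at a coarse level of specificity, ranked by support.
print(sorted(root.extract(max_depth=1), key=lambda a: -a[1]))
```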
Extraction of GBH Film Medicine and Influence on Quality Evaluation of The Film
NASA Astrophysics Data System (ADS)
Ji, Y. B.; Lu, L.; Ru, X.; Guo, S. Z.; Qiao, A. N.; Wang, S. W.
2017-12-01
To determine the extraction effects of GBH film medicine and their influence on the quality evaluation of the film. Ultrasonic extraction and reflux extraction were compared with the traditional water decocting method, and the extracts were used to determine the content of total flavonoids. The same methods were also applied separately to the root of the main medicinal herb and to the water decoction of the powder of the other drugs, and the total flavonoid content of each was determined. The effect of the extraction method on the preparation of the membrane was then investigated: membranes were prepared and evaluated, with flexibility, film-forming property, smoothness and disintegration time used to separately evaluate the effect of each extraction method. The results showed that 70% ethanol gave the best extraction of total flavonoids, and the extraction method had no effect on the initial evaluation of film quality. This experiment provides a method for selecting the membrane extraction agent and has a certain practical significance.
Liu, Dinglin; Zhao, Xianglian
2013-01-01
In an effort to deal with more complicated evaluation situations, researchers have focused their efforts on dynamic comprehensive evaluation research. How to make full use of both subjective and objective information has become a noteworthy issue. In this paper, a dynamic comprehensive evaluation method using subjective and objective information is proposed. We use a combination weighting method to determine the index weights: the analytic hierarchy process is applied to handle the subjective information, and the criteria importance through intercriteria correlation (CRITIC) method is used to handle the objective information. For the time weight determination, we consider both time distance and information size to embody the principle of esteeming the present over the past. A linear weighted average model is then constructed to make the evaluation process more practicable. Finally, an example is presented to illustrate the effectiveness of this method. Overall, the results suggest that the proposed method is reasonable and effective. PMID:24386176
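As an illustration of the combination-weighting and time-weighting steps described above, here is a minimal numerical sketch with assumed data: the subjective weights stand in for AHP output, the objective weights follow a CRITIC-style calculation, and the geometric time weights and 50/50 combination are simple assumptions rather than the paper's exact formulas.

```python
# Minimal sketch (assumed data and simplified formulas): combine subjective
# (AHP-style) and objective (CRITIC-style) index weights, then aggregate
# scores over time with weights favouring recent periods.
import numpy as np

# scores[t, i]: normalised score of the evaluated object on index i at time t
scores = np.array([[0.60, 0.70, 0.55],
                   [0.65, 0.72, 0.60],
                   [0.70, 0.75, 0.68]])

w_subj = np.array([0.5, 0.3, 0.2])          # stand-in for AHP judgement weights

# CRITIC-style objective weights: contrast (std) times conflict (1 - r_ij)
std = scores.std(axis=0, ddof=1)
corr = np.corrcoef(scores, rowvar=False)
info = std * (1.0 - corr).sum(axis=0)
w_obj = info / info.sum()

w = 0.5 * w_subj + 0.5 * w_obj              # simple combination weighting
w /= w.sum()

# Geometric time weights: "esteem the present over the past"
alpha = 0.5
time_w = alpha ** np.arange(scores.shape[0] - 1, -1, -1)
time_w /= time_w.sum()

period_scores = scores @ w                  # linear weighted average per period
overall = time_w @ period_scores
print(period_scores, overall)
```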
Using Developmental Evaluation Methods with Communities of Practice
ERIC Educational Resources Information Center
van Winkelen, Christine
2016-01-01
Purpose: This paper aims to explore the use of developmental evaluation methods with community of practice programmes experiencing change or transition to better understand how to target support resources. Design/methodology/approach: The practical use of a number of developmental evaluation methods was explored in three organizations over a…
Assessing Student Perception of Practice Evaluation Knowledge in Introductory Research Methods
ERIC Educational Resources Information Center
Baker, Lisa R.; Pollio, David E.; Hudson, Ashley
2011-01-01
The authors explored the use of the Practice Evaluation Knowledge Scale (PEKS) to assess student perception of acquisition and retention of practice evaluation knowledge from an undergraduate research methods class. The authors sampled 2 semesters of undergraduate social work students enrolled in an introductory research methods course.…
Aeroelastic loads prediction for an arrow wing. Task 1: Evaluation of R. P. White's method
NASA Technical Reports Server (NTRS)
Borland, C. J.; Manro, M. E.
1983-01-01
The separated flow method is evaluated. This method was developed for moderately swept wings with multiple, constant strength vortex systems. The flow on the highly swept wing used in this evaluation is characterized by a single vortex system of continuously varying strength.
Enhancing Learning Outcomes through Evaluation of Serious Gaming: A Mixed Methods Study
ERIC Educational Resources Information Center
Douglas, Kerrie Anna
2012-01-01
This study compared the change in counseling students' self-efficacy and skill related to suicide assessment and intervention through the use of a novel intervention-oriented evaluation method, evaluation-focused discussion groups, in an experimental embedded mixed methods design. An innovative counselor pedagogical tool, Suicide Risk Assessment…
Prevalence of Evaluation Method Courses in Education Leader Doctoral Preparation
ERIC Educational Resources Information Center
Shepperson, Tara L.
2013-01-01
This exploratory study investigated the prevalence of single evaluation methods courses in doctoral education leadership programs. Analysis of websites of 132 leading U.S. university programs found 62 evaluation methods courses in 54 programs. Content analysis of 49 course catalog descriptions resulted in five categories: survey, planning and…
A Mixed-Methods Longitudinal Evaluation of a One-Day Mental Health Wellness Intervention
ERIC Educational Resources Information Center
Doyle, Louise; de Vries, Jan; Higgins, Agnes; Keogh, Brian; McBennett, Padraig; O'Shea, Marié T.
2017-01-01
Objectives: This study evaluated the impact of a one-day mental health Wellness Workshop on participants' mental health and attitudes towards mental health. Design: Convergent, longitudinal mixed-methods approach. Setting: The study evaluated Wellness Workshops which took place throughout the Republic of Ireland. Method: Questionnaires measuring…
Effectiveness Evaluation Method of Anti-Radiation Missile against Active Decoy
NASA Astrophysics Data System (ADS)
Tang, Junyao; Cao, Fei; Li, Sijia
2017-06-01
In the problem of an anti-radiation missile (ARM) engaging an active decoy, whether the ARM can effectively kill the target radiation source and the decoy is an important index for evaluating the operational effectiveness of the missile. To address this problem, this paper proposes a method for evaluating the effectiveness of an ARM against an active decoy: based on a calculation of the ARM's ability to resist the decoy, decoy resistance is evaluated in terms of whether the missile hits the key components of the radar. The method has the advantages of being scientific and reliable.
An Evaluation of the Method of Determining Parallax from Measured Phase Differences.
1977-12-01
The purpose of the report is to describe an evaluation of the method of determining parallax from measured phase differences using a digitized aerial image. The method was found to be not as accurate and not as efficient as conventional image matching techniques.
Assessing and evaluating multidisciplinary translational teams: a mixed methods approach.
Wooten, Kevin C; Rose, Robert M; Ostir, Glenn V; Calhoun, William J; Ameredes, Bill T; Brasier, Allan R
2014-03-01
A case report illustrates how multidisciplinary translational teams can be assessed using outcome, process, and developmental types of evaluation using a mixed-methods approach. Types of evaluation appropriate for teams are considered in relation to relevant research questions and assessment methods. Logic models are applied to scientific projects and team development to inform choices between methods within a mixed-methods design. Use of an expert panel is reviewed, culminating in consensus ratings of 11 multidisciplinary teams and a final evaluation within a team-type taxonomy. Based on team maturation and scientific progress, teams were designated as (a) early in development, (b) traditional, (c) process focused, or (d) exemplary. Lessons learned from data reduction, use of mixed methods, and use of expert panels are explored.
Using mixed methods to develop and evaluate complex interventions in palliative care research.
Farquhar, Morag C; Ewing, Gail; Booth, Sara
2011-12-01
There is increasing interest in combining qualitative and quantitative research methods to provide comprehensiveness and greater knowledge yield. Mixed methods are valuable in the development and evaluation of complex interventions. They are therefore particularly valuable in palliative care research where the majority of interventions are complex, and the identification of outcomes particularly challenging. This paper aims to introduce the role of mixed methods in the development and evaluation of complex interventions in palliative care, and how they may be used in palliative care research. The paper defines mixed methods and outlines why and how mixed methods are used to develop and evaluate complex interventions, with a pragmatic focus on design and data collection issues and data analysis. Useful texts are signposted and illustrative examples provided of mixed method studies in palliative care, including a detailed worked example of the development and evaluation of a complex intervention in palliative care for breathlessness. Key challenges to conducting mixed methods in palliative care research are identified in relation to data collection, data integration in analysis, costs and dissemination and how these might be addressed. The development and evaluation of complex interventions in palliative care benefit from the application of mixed methods. Mixed methods enable better understanding of whether and how an intervention works (or does not work) and inform the design of subsequent studies. However, they can be challenging: mixed method studies in palliative care will benefit from working with agreed protocols, multidisciplinary teams and engaging staff with appropriate skill sets.
NASA Astrophysics Data System (ADS)
Zhang, Qian; Feng, Minquan; Hao, Xiaoyan
2018-03-01
[Objective] Based on the water quality historical data from the Zhangze Reservoir over the last five years, the water quality was assessed by the integrated water quality identification index method and the Nemerow pollution index method. The results of the different evaluation methods were analyzed and compared, and the characteristics of each method were identified. [Methods] The suitability of the water quality assessment methods was compared and analyzed based on these results. [Results] The water quality tended to decrease over time, with 2016 being the year with the worst water quality. The sections with the worst water quality were the southern and northern sections. [Conclusion] The results produced by the traditional Nemerow index method fluctuated greatly at each water quality monitoring section and therefore could not effectively reveal the trend of water quality at each section. The combination of qualitative and quantitative measures in the comprehensive pollution index identification method meant it could evaluate the degree of water pollution as well as determine whether the river water was black and odorous; however, its evaluation results indicated a relatively low level of water pollution. The results from the improved Nemerow index evaluation were better, as the single indicators and the evaluation results are in strong agreement; therefore, the method is able to objectively reflect the water quality of each monitoring section and is more suitable for the water quality evaluation of the reservoir.
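For reference, the conventional Nemerow pollution index mentioned above combines the mean and the maximum of the single-indicator pollution ratios. The short sketch below uses assumed concentrations and standard limits and does not reproduce the paper's improved variant or the identification index method.

```python
# Minimal sketch of the conventional Nemerow pollution index (the paper's
# improved variant and the identification index are not reproduced here).
# Concentrations and class standards below are assumed example values.
import math

conc = {"COD": 22.0, "NH3-N": 1.2, "TP": 0.18}     # measured mg/L (assumed)
std  = {"COD": 20.0, "NH3-N": 1.0, "TP": 0.20}     # standard limits (assumed)

ratios = [conc[k] / std[k] for k in conc]           # single-indicator indices
p_mean = sum(ratios) / len(ratios)
p_max = max(ratios)
nemerow = math.sqrt((p_mean ** 2 + p_max ** 2) / 2.0)
print(round(nemerow, 3))   # > 1 indicates the section exceeds the standard
```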
Evaluation of equipment and methods to map lost circulation zones in geothermal wells
DOE Office of Scientific and Technical Information (OSTI.GOV)
McDonald, W.J.; Leon, P.A.; Pittard, G.
A study and evaluation of methods to locate, characterize, and quantify lost circulation zones are described. Twenty-five methods of mapping and quantifying lost circulation zones were evaluated, including electrical, acoustical, mechanical, radioactive, and optical systems. Each tool studied is described. The structured, numerical evaluation plan, used as the basis for comparing the 25 tools, and the resulting ranking among the tools is presented.
Using Mixed Methods and Collaboration to Evaluate an Education and Public Outreach Program (Invited)
NASA Astrophysics Data System (ADS)
Shebby, S.; Shipp, S. S.
2013-12-01
Traditional indicators (such as the number of participants or Likert-type ratings of participant perceptions) are often used to provide stakeholders with basic information about program outputs and to justify funding decisions. However, use of qualitative methods can strengthen the reliability of these data and provide stakeholders with more meaningful information about program challenges, successes, and ultimate impacts (Stern, Stame, Mayne, Forss, David & Befani, 2012). In this session, presenters will discuss how they used a mixed methods evaluation to determine the impact of an education and public outreach (EPO) program. EPO efforts were intended to foster more effective, sustainable, and efficient utilization of science discoveries and learning experiences through three main goals 1) increase engagement and support by leveraging of resources, expertise, and best practices; 2) organize a portfolio of resources for accessibility, connectivity, and strategic growth; and 3) develop an infrastructure to support coordination. The evaluation team used a mixed methods design to conduct the evaluation. Presenters will first discuss five potential benefits of mixed methods designs: triangulation of findings, development, complementarity, initiation, and value diversity (Greene, Caracelli & Graham, 2005). They will next demonstrate how a 'mix' of methods, including artifact collection, surveys, interviews, focus groups, and vignettes, was included in the EPO project's evaluation design, providing specific examples of how alignment between the program theory and the evaluation plan was best achieved with a mixed methods approach. The presentation will also include an overview of different mixed methods approaches and information about important considerations when using a mixed methods design, such as selection of data collection methods and sources, and the timing and weighting of quantitative and qualitative methods (Creswell, 2003). Ultimately, this presentation will provide insight into how a mixed methods approach was used to provide stakeholders with important information about progress toward program goals. Creswell, J.W. (2003). Research design: Qualitative, quantitative, and mixed approaches. Thousand Oaks, CA: Sage. Greene, J. C., Caracelli, V. J., & Graham, W. D. (1989). Toward a conceptual framework for mixed-method evaluation designs. Educational Evaluation and Policy Analysis, 11(3), 255-274. Stern, E; Stame, N; Mayne, J; Forss, K; Davis, R & Befani, B (2012) Broadening the range of designs and methods for impact evaluation. Department for International Development.
Schriver, Brittany; Mandal, Mahua; Muralidharan, Arundati; Nwosu, Anthony; Dayal, Radhika; Das, Madhumita; Fehringer, Jessica
2017-11-01
As a result of new global priorities, there is a growing need for high-quality evaluations of gender-integrated health programmes. This systematic review examined 99 peer-reviewed articles on evaluations of gender-integrated (accommodating and transformative) health programmes with regard to their theory of change (ToC), study design, gender integration in data collection, analysis, and gender measures used. Half of the evaluations explicitly described a ToC or conceptual framework (n = 50) that guided strategies for their interventions. Over half (61%) of the evaluations used quantitative methods exclusively; 11% used qualitative methods exclusively; and 28% used mixed methods. Qualitative methods were not commonly detailed. Evaluations of transformative interventions were less likely than those of accommodating interventions to employ randomised control trials. Two-thirds of the reviewed evaluations reported including at least one specific gender-related outcome (n = 18 accommodating, n = 44 transformative). To strengthen evaluations of gender-integrated programmes, we recommend use of ToCs, explicitly including gender in the ToC, use of gender-sensitive measures, mixed-method designs, in-depth descriptions of qualitative methods, and attention to gender-related factors in data collection logistics. We also recommend further research to develop valid and reliable gender measures that are globally relevant.
Evaluating the evaluation of cancer driver genes
Tokheim, Collin J.; Papadopoulos, Nickolas; Kinzler, Kenneth W.; Vogelstein, Bert; Karchin, Rachel
2016-01-01
Sequencing has identified millions of somatic mutations in human cancers, but distinguishing cancer driver genes remains a major challenge. Numerous methods have been developed to identify driver genes, but evaluation of the performance of these methods is hindered by the lack of a gold standard, that is, bona fide driver gene mutations. Here, we establish an evaluation framework that can be applied to driver gene prediction methods. We used this framework to compare the performance of eight such methods. One of these methods, described here, incorporated a machine-learning–based ratiometric approach. We show that the driver genes predicted by each of the eight methods vary widely. Moreover, the P values reported by several of the methods were inconsistent with the uniform values expected, thus calling into question the assumptions that were used to generate them. Finally, we evaluated the potential effects of unexplained variability in mutation rates on false-positive driver gene predictions. Our analysis points to the strengths and weaknesses of each of the currently available methods and offers guidance for improving them in the future. PMID:27911828
Evaluating care from a care ethical perspective:: A pilot study.
Kuis, Esther E; Goossensen, Anne
2017-08-01
Care ethical theories provide an excellent opening for evaluation of healthcare practices since searching for (moments of) good care from a moral perspective is central to care ethics. However, a fruitful way to translate care ethical insights into measurable criteria and how to measure these criteria has as yet been unexplored: this study describes one of the first attempts. To investigate whether the emotional touchpoint method is suitable for evaluating care from a care ethical perspective. An adapted version of the emotional touchpoint interview method was used. Touchpoints represent the key moments to the experience of receiving care, where the patient recalls being touched emotionally or cognitively. Participants and research context: Interviews were conducted at three different care settings: a hospital, mental healthcare institution and care facility for older people. A total of 31 participants (29 patients and 2 relatives) took part in the study. Ethical considerations: The research was found not to be subject to the (Dutch) Medical Research Involving Human Subjects Act. A three-step care ethical evaluation model was developed and described using two touchpoints as examples. A focus group meeting showed that the method was considered of great value for partaking institutions in comparison with existing methods. Reflection and discussion: Considering existing methods to evaluate quality of care, the touchpoint method belongs to the category of instruments which evaluate the patient experience. The touchpoint method distinguishes itself because no pre-defined categories are used but the values of patients are followed, which is an essential issue from a care ethical perspective. The method portrays the insider perspective of patients and thereby contributes to humanizing care. The touchpoint method is a valuable instrument for evaluating care; it generates evaluation data about the core care ethical principle of responsiveness.
Evaluation of Low-Voltage Distribution Network Index Based on Improved Principal Component Analysis
NASA Astrophysics Data System (ADS)
Fan, Hanlu; Gao, Suzhou; Fan, Wenjie; Zhong, Yinfeng; Zhu, Lei
2018-01-01
In order to evaluate the development level of the low-voltage distribution network objectively and scientifically, chromatography analysis method is utilized to construct an evaluation index model of the low-voltage distribution network. Based on principal component analysis and the logarithmic distribution characteristic of the index data, a logarithmic centralization method is adopted to improve the principal component analysis algorithm. The algorithm decorrelates and reduces the dimensions of the evaluation model, and the comprehensive score has a better degree of dispersion. Because the comprehensive scores of the courts are concentrated, a clustering method is adopted to analyse them, and a stratified evaluation of the courts is thereby realized. An example is given to verify the objectivity and scientific validity of the evaluation method.
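The following is a minimal sketch, with randomly generated stand-in data, of the kind of pipeline the abstract describes: logarithmic centralization before principal component analysis, a variance-weighted comprehensive score, and a simple quantile tiering step that stands in for the clustering used in the paper. It is not the authors' implementation, and all parameters are assumptions.

```python
# Minimal sketch (assumed data): logarithmic centralisation before PCA, a
# variance-weighted comprehensive score, and a simple tiering step that
# stands in for the clustering used in the paper.
import numpy as np

rng = np.random.default_rng(0)
X = rng.lognormal(mean=1.0, sigma=0.6, size=(60, 5))   # 60 districts x 5 indices

Z = np.log(X)
Z -= Z.mean(axis=0)                       # logarithmic centralisation

cov = np.cov(Z, rowvar=False)
eigval, eigvec = np.linalg.eigh(cov)      # eigenvalues in ascending order
order = np.argsort(eigval)[::-1]
eigval, eigvec = eigval[order], eigvec[:, order]

k = np.searchsorted(np.cumsum(eigval) / eigval.sum(), 0.85) + 1  # keep ~85% variance
pcs = Z @ eigvec[:, :k]
score = pcs @ (eigval[:k] / eigval[:k].sum())          # comprehensive score

tiers = np.digitize(score, np.quantile(score, [0.25, 0.75]))     # three tiers
print(k, np.bincount(tiers))
```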
Planning Evaluation through the Program Life Cycle
ERIC Educational Resources Information Center
Scheirer, Mary Ann; Mark, Melvin M.; Brooks, Ariana; Grob, George F.; Chapel, Thomas J.; Geisz, Mary; McKaughan, Molly; Leviton, Laura
2012-01-01
Linking evaluation methods to the several phases of a program's life cycle can provide evaluation planners and funders with guidance about what types of evaluation are most appropriate over the trajectory of social and educational programs and other interventions. If methods are matched to the needs of program phases, evaluation can and should…
Evaluation and comparison of predictive individual-level general surrogates.
Gabriel, Erin E; Sachs, Michael C; Halloran, M Elizabeth
2018-07-01
An intermediate response measure that accurately predicts efficacy in a new setting at the individual level could be used both for prediction and personalized medical decisions. In this article, we define a predictive individual-level general surrogate (PIGS), which is an individual-level intermediate response that can be used to accurately predict individual efficacy in a new setting. While methods for evaluating trial-level general surrogates, which are predictors of trial-level efficacy, have been developed previously, few, if any, methods have been developed to evaluate individual-level general surrogates, and no methods have formalized the use of cross-validation to quantify the expected prediction error. Our proposed method uses existing methods of individual-level surrogate evaluation within a given clinical trial setting in combination with cross-validation over a set of clinical trials to evaluate surrogate quality and to estimate the absolute prediction error that is expected in a new trial setting when using a PIGS. Simulations show that our method performs well across a variety of scenarios. We use our method to evaluate and to compare candidate individual-level general surrogates over a set of multi-national trials of a pentavalent rotavirus vaccine.
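A schematic sketch of the cross-validation idea, not the authors' estimator, is given below: a simple linear predictor is fit on all but one simulated trial, and the absolute prediction error in the held-out trial is averaged over folds to estimate the error expected when predicting outcomes in a new trial from the surrogate.

```python
# Schematic sketch only (not the authors' estimator): leave-one-trial-out
# cross-validation of an individual-level surrogate, estimating the absolute
# error expected when predicting outcomes in a new trial from the surrogate.
import numpy as np

rng = np.random.default_rng(1)
trials = []
for t in range(6):                              # six simulated trials
    s = rng.normal(size=120)                    # surrogate response
    y = 0.8 * s + rng.normal(scale=0.5, size=120) + rng.normal(scale=0.2)
    trials.append((s, y))

errors = []
for held_out in range(len(trials)):
    s_tr = np.concatenate([s for i, (s, y) in enumerate(trials) if i != held_out])
    y_tr = np.concatenate([y for i, (s, y) in enumerate(trials) if i != held_out])
    slope, intercept = np.polyfit(s_tr, y_tr, deg=1)
    s_te, y_te = trials[held_out]
    errors.append(np.mean(np.abs(y_te - (slope * s_te + intercept))))

print("expected absolute prediction error in a new trial:", np.mean(errors))
```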
2016-11-01
Evaluation of Visualization Tools for Computer Network Defense Analysts: Display Design, Methods, and Results for a User Study, by Christopher J Garneau and Robert F Erbacher, US Army Research Laboratory (approved for public release; reporting period January 2013–September 2015).
Efficacy Evaluation of Current and Future Naval Mine Warfare Neutralization Method
2016-12-01
Efficacy Evaluation of Current and Future Naval Mine Warfare Neutralization Method, by Team MIW, Systems Engineering Cohort SE311-152O; submitted in partial fulfillment of degree requirements. Distribution is unlimited.
Validation and Verification (V and V) Testing on Midscale Flame Resistant (FR) Test Method
2016-12-16
Validation and verification (V&V) testing on a midscale flame resistant (FR) test method developed by the Natick Soldier Research, Development and Engineering Center (NSRDEC) to complement (not replace) the capabilities of the ASTM F1930 Standard Test Method for Evaluation of Flame Resistant Clothing for Protection against Fire Simulations Using an Instrumented Manikin.
Berns, U; Hemprich, L
2001-01-01
In No. 8 of volume 48 (August 1998) of the journal "Psychotherapie--Psychosomatik--Medizinische Psychologie", the transcript of a tape recording of the 290th session of a long-term analysis was studied by three methods (BIP, Frames, ZBKT). The paper presented here was stimulated by that publication. From the author's viewpoint, substantial clinical aspects of evaluation can be added by applying a clinical evaluation method developed by R. Langs and his corresponding concept of interpretation. Clinical vignettes exemplify the possibility of resolving pathological countertransference by using this evaluation method. With the help of this method, the presented transcript of the 290th session is partially evaluated.
Risk Evaluation of Bogie System Based on Extension Theory and Entropy Weight Method
Du, Yanping; Zhang, Yuan; Zhao, Xiaogang; Wang, Xiaohui
2014-01-01
A bogie system is the key equipment of railway vehicles. Rigorous practical evaluation of bogies is still a challenge. Presently, there is overreliance on part-specific experiments in practice. In the present work, a risk evaluation index system of a bogie system has been established based on the inspection data and experts' evaluation. Then, considering quantitative and qualitative aspects, the risk state of a bogie system has been evaluated using an extension theory and an entropy weight method. Finally, the method has been used to assess the bogie system of four different samples. Results show that this method can assess the risk state of a bogie system exactly. PMID:25574159
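The entropy weight step lends itself to a short illustration. The sketch below uses an assumed inspection-data matrix and does not reproduce the extension-theory (matter-element) part of the evaluation.

```python
# Minimal sketch of the entropy weight step (assumed inspection data); the
# extension-theory matter-element evaluation is not reproduced here.
import numpy as np

# rows: bogie samples, columns: risk indicators (larger = worse, assumed)
X = np.array([[0.20, 0.35, 0.10, 0.50],
              [0.25, 0.30, 0.15, 0.40],
              [0.40, 0.45, 0.30, 0.60],
              [0.15, 0.25, 0.05, 0.35]])

P = X / X.sum(axis=0)                               # column-normalised proportions
k = 1.0 / np.log(X.shape[0])
entropy = -k * (P * np.log(P)).sum(axis=0)          # assumes no zero entries
weights = (1 - entropy) / (1 - entropy).sum()       # low entropy -> high weight
risk_score = X @ weights                            # weighted risk per sample
print(weights.round(3), risk_score.round(3))
```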
New knowledge network evaluation method for design rationale management
NASA Astrophysics Data System (ADS)
Jing, Shikai; Zhan, Hongfei; Liu, Jihong; Wang, Kuan; Jiang, Hao; Zhou, Jingtao
2015-01-01
Current design rationale (DR) systems have not demonstrated the value of the approach in practice because little attention has been paid to methods for evaluating DR knowledge. To systematize the knowledge management process for future computer-aided DR applications, a prerequisite is to provide a measure for DR knowledge. In this paper, a new knowledge network evaluation method for DR management is presented. The method characterizes the value of DR knowledge from four perspectives, namely, the design rationale structure scale, association knowledge and reasoning ability, degree of design justification support, and degree of knowledge representation conciseness. The comprehensive value of DR knowledge is also measured by the proposed method. To validate the proposed method, different styles of DR knowledge network and the performance of the proposed measure are discussed. The evaluation method has been applied in two realistic design cases and compared with structural measures. The research proposes a DR knowledge evaluation method that can provide an objective metric and a selection basis for DR knowledge reuse during the product design process. In addition, the method provides more effective guidance and support for the application and management of DR knowledge.
Research on evaluation methods for water regulation ability of dams in the Huai River Basin
NASA Astrophysics Data System (ADS)
Shan, G. H.; Lv, S. F.; Ma, K.
2016-08-01
Water environment protection is a global and urgent problem that requires correct and precise evaluation. Evaluation methods have been studied for many years; however, there is a lack of research on the methods of assessing the water regulation ability of dams. Currently, evaluating the ability of dams has become a practical and significant research orientation because of the global water crisis, and the lack of effective ways to manage a dam's regulation ability has only compounded this. This paper firstly constructs seven evaluation factors and then develops two evaluation approaches to implement the factors according to the features of the problem. Dams of the Yin Shang ecological control section in the Huai He River basin are selected as an example to demonstrate the method. The results show that the evaluation approaches can produce better and more practical suggestions for dam managers.
Evaluation of Visualization Software
NASA Technical Reports Server (NTRS)
Globus, Al; Uselton, Sam
1995-01-01
Visualization software is widely used in scientific and engineering research. But computed visualizations can be very misleading, and the errors are easy to miss. We feel that the software producing the visualizations must be thoroughly evaluated and the evaluation process as well as the results must be made available. Testing and evaluation of visualization software is not a trivial problem. Several methods used in testing other software are helpful, but these methods are (apparently) often not used. When they are used, the description and results are generally not available to the end user. Additional evaluation methods specific to visualization must also be developed. We present several useful approaches to evaluation, ranging from numerical analysis of mathematical portions of algorithms to measurement of human performance while using visualization systems. Along with this brief survey, we present arguments for the importance of evaluations and discussions of appropriate use of some methods.
An Examination of Final Evaluation Methods Used in Master's Level Counseling Programs.
ERIC Educational Resources Information Center
Carney, Jamie S.; Cobia, Debra C.; Shannon, David M.
1998-01-01
Reports the findings of a national study examining methods used for final evaluation in master's level counseling programs. Suggests that as faculty review their policies and procedures with regard to student evaluation, these data may provide valuable information concerning methods selection, content, and delivery of feedback to students.…
Analyzing Empirical Evaluations of Non-Experimental Methods in Field Settings
ERIC Educational Resources Information Center
Steiner, Peter M.; Wong, Vivian
2016-01-01
Despite recent emphasis on the use of randomized control trials (RCTs) for evaluating education interventions, in most areas of education research, observational methods remain the dominant approach for assessing program effects. Over the last three decades, the within-study comparison (WSC) design has emerged as a method for evaluating the…
ERIC Educational Resources Information Center
Patalino, Marianne
Problems in current course evaluation methods are discussed and an alternative method is described for the construction, analysis, and interpretation of a test to evaluate instructional programs. The method presented represents a different approach to the traditional overreliance on standardized achievement tests and the total scores they provide.…
The MIXED framework: A novel approach to evaluating mixed-methods rigor.
Eckhardt, Ann L; DeVon, Holli A
2017-10-01
Evaluation of rigor in mixed-methods (MM) research is a persistent challenge due to the combination of inconsistent philosophical paradigms, the use of multiple research methods which require different skill sets, and the need to combine research at different points in the research process. Researchers have proposed a variety of ways to thoroughly evaluate MM research, but each method fails to provide a framework that is useful for the consumer of research. In contrast, the MIXED framework is meant to bridge the gap between an academic exercise and practical assessment of a published work. The MIXED framework (methods, inference, expertise, evaluation, and design) borrows from previously published frameworks to create a useful tool for the evaluation of a published study. The MIXED framework uses an experimental eight-item scale that allows for comprehensive integrated assessment of MM rigor in published manuscripts. Mixed methods are becoming increasingly prevalent in nursing and healthcare research requiring researchers and consumers to address issues unique to MM such as evaluation of rigor. © 2017 John Wiley & Sons Ltd.
2016-09-01
Toxicity Test Methods for Marine Water Quality Evaluations, by Alan J Kennedy, Guilherme Lotufo, Jennifer G. Laird, and J. Daniel Farrar. The organisms used in these test methods are not planktonic for most of their life cycles (juveniles and adults) and are used for MPRSA evaluations in some regions. Cites: International Organization for Standardization (ISO). 2015. Water quality - calanoid copepod early-life stage test with Acartia tonsa. ISO 16778:2015.
1992-06-01
The survey addressed criteria to hire civilians, professional qualification standards, classroom observation, and other methods to evaluate instruction. Question 22 asked whether respondents use classroom observation to evaluate instruction (17 affirmative responses); Question 23 asked what other methods are used to evaluate classroom instruction.
System and method for evaluating a wire conductor
Panozzo, Edward; Parish, Harold
2013-10-22
A method of evaluating an electrically conductive wire segment having an insulated intermediate portion and non-insulated ends includes passing the insulated portion of the wire segment through an electrically conductive brush. According to the method, an electrical potential is established on the brush by a power source. The method also includes determining a value of electrical current that is conducted through the wire segment by the brush when the potential is established on the brush. The method additionally includes comparing the value of electrical current conducted through the wire segment with a predetermined current value to thereby evaluate the wire segment. A system for evaluating an electrically conductive wire segment is also disclosed.
Dental students' evaluations of an interactive histology software.
Rosas, Cristian; Rubí, Rafael; Donoso, Manuel; Uribe, Sergio
2012-11-01
This study assessed dental students' evaluations of a new Interactive Histology Software (IHS) developed by the authors and compared students' assessment of the extent to which this new software, as well as other histology teaching methods, supported their learning. The IHS is a computer-based tool for histology learning that presents high-resolution images of histology basics as well as specific oral histologies at different magnifications and with text labels. Survey data were collected from 204 first-year dental students at the Universidad Austral de Chile. The survey consisted of questions for the respondents to evaluate the characteristics of the IHS and the contribution of various teaching methods to their histology learning. The response rate was 85 percent. Student evaluations were positive for the design, usability, and theoretical-practical integration of the IHS, and the students reported they would recommend the method to future students. The students continued to value traditional teaching methods for histological lab work and did not think this new technology would replace traditional methods. With respect to the contribution of each teaching method to students' learning, no statistically significant differences (p>0.05) were found for an evaluation of IHS, light microscopy, and slide presentations. However, these student assessments were significantly more positive than the evaluations of other digital or printed materials. Overall, the students evaluated the IHS very positively in terms of method quality and contribution to their learning; they also evaluated use of light microscopy and teacher slide presentations positively.
Physical Evaluation of Cleaning Performance: We Are Only Fooling Ourselves
NASA Technical Reports Server (NTRS)
Pratz, Earl; McCool, A. (Technical Monitor)
2000-01-01
Surface cleaning processes are normally evaluated using visual physical properties such as discolorations, streaking, staining and water-break-free conditions. There is an assumption that these physical methods will evaluate all surfaces all the time for all subsequent operations. We have found that these physical methods are lacking in sensitivity and selectivity with regard to surface residues and subsequent process performance. We will report several conditions where evaluations using visual physical properties are lacking. We will identify possible alternative methods and future needs for surface evaluations.
Evaluation Measures and Methods: Some Intersections.
ERIC Educational Resources Information Center
Elliott, John
The literature is reviewed for four combinations of evaluation measures and methods: traditional methods with traditional measures (T-Meth/T-Mea), nontraditional methods with traditional measures (N-Meth/T-Mea), traditional measures with nontraditional measures (T-Meth/N-Mea), and nontraditional methods with nontraditional measures (N-Meth/N-Mea).…
Methods for comparative evaluation of propulsion system designs for supersonic aircraft
NASA Technical Reports Server (NTRS)
Tyson, R. M.; Mairs, R. Y.; Halferty, F. D., Jr.; Moore, B. E.; Chaloff, D.; Knudsen, A. W.
1976-01-01
The propulsion system comparative evaluation study was conducted to define a rapid, approximate method for evaluating the effects of propulsion system changes for an advanced supersonic cruise airplane, and to verify the approximate method by comparing its mission performance results with those from a more detailed analysis. A table look-up computer program was developed to determine nacelle drag increments for a range of parametric nacelle shapes and sizes. Aircraft sensitivities to propulsion parameters were defined. Nacelle shapes, installed weights, and installed performance were determined for four study engines selected from the NASA supersonic cruise aircraft research (SCAR) engine studies program. Both the rapid evaluation method (using sensitivities) and traditional preliminary design methods were then used to assess the four engines. The method was found to compare well with the more detailed analyses.
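The table look-up idea can be illustrated with a small bilinear-interpolation sketch; the grid parameters and drag-increment values below are assumptions for illustration only and are not the study's data or program.

```python
# Illustrative sketch of a table look-up for nacelle drag increments
# (assumed grid and values; not the actual study's data or program).
import numpy as np

fineness = np.array([4.0, 6.0, 8.0])          # nacelle fineness-ratio grid
diameter = np.array([1.0, 1.5, 2.0])          # nacelle diameter grid, m
# delta_CD[i, j]: drag-coefficient increment at fineness[i], diameter[j] (assumed)
delta_CD = np.array([[0.0012, 0.0018, 0.0026],
                     [0.0009, 0.0014, 0.0020],
                     [0.0007, 0.0011, 0.0016]])

def drag_increment(f, d):
    """Bilinear interpolation of the drag-increment table."""
    i = np.clip(np.searchsorted(fineness, f) - 1, 0, len(fineness) - 2)
    j = np.clip(np.searchsorted(diameter, d) - 1, 0, len(diameter) - 2)
    tf = (f - fineness[i]) / (fineness[i + 1] - fineness[i])
    td = (d - diameter[j]) / (diameter[j + 1] - diameter[j])
    top = (1 - td) * delta_CD[i, j] + td * delta_CD[i, j + 1]
    bot = (1 - td) * delta_CD[i + 1, j] + td * delta_CD[i + 1, j + 1]
    return (1 - tf) * top + tf * bot

print(drag_increment(5.0, 1.2))
```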
NASA Astrophysics Data System (ADS)
Okabe, Shigemitsu; Tsuboi, Toshihiro; Takami, Jun
Power-frequency withstand voltage tests on electric power equipment are regulated in JEC by evaluating the lifetime reliability with a Weibull distribution function. The evaluation method is still controversial in terms of how a plural number of faults is considered, and some alternative methods have been proposed on this subject. The present paper first discusses the physical meanings of the various evaluation methods and secondly examines their effects on the power-frequency withstand voltage tests. Further, an appropriate method is investigated for an oil-filled transformer and a gas-insulated switchgear, taking note of the dielectric breakdown or partial discharge mechanisms under various insulating material and structure conditions; the tentative conclusion is that the conventional method would be most pertinent under the present conditions.
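One way to see why the treatment of a plural number of faults matters is a weakest-link Weibull sketch like the one below; the scale and shape parameters are assumed, and the formulation is illustrative rather than the JEC method itself.

```python
# Illustrative weakest-link Weibull sketch (assumed parameters; not the JEC
# formulation itself): breakdown probability of a single insulation element
# and of equipment containing n nominally identical elements.
import math

def weibull_cdf(v, scale, shape):
    """Probability of breakdown at test voltage v for one insulation element."""
    return 1.0 - math.exp(-((v / scale) ** shape))

def equipment_cdf(v, scale, shape, n):
    """Weakest-link combination over n elements (plural possible fault sites)."""
    return 1.0 - (1.0 - weibull_cdf(v, scale, shape)) ** n

v_test = 1.0                 # test voltage in per-unit of system rating (assumed)
for n in (1, 10, 100):
    print(n, round(equipment_cdf(v_test, scale=1.5, shape=12.0, n=n), 4))
```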
Simms, Leonard J; Zelazny, Kerry; Yam, Wern How; Gros, Daniel F
2010-05-01
Little attention typically is paid to the way self-report measures are translated for use in self-informant agreement studies. We studied two possible methods for creating informant measures: (a) the traditional method in which self-report items were translated from the first- to the third-person and (b) an alternative meta-perceptual method in which informants were directed to rate their perception of the targets' self-perception. We hypothesized that the latter method would yield stronger self-informant agreement for evaluative personality dimensions measured by indirect item markers. We studied these methods in a sample of 303 undergraduate friendship dyads. Results revealed mean-level differences between methods, similar self-informant agreement across methods, stronger agreement for Big Five dimensions than for evaluative dimensions, and incremental validity for meta-perceptual informant rating methods. Limited power reduced the interpretability of several sparse acquaintanceship effects. We conclude that traditional informant methods are appropriate for most personality traits, but meta-perceptual methods may be more appropriate when personality questionnaire items reflect indirect indicators of the trait being measured, which is particularly likely for evaluative traits.
Reisfeld, S; Assaly, M; Tannous, E; Amarney, K; Stein, M
2018-06-01
Approximately 20-50% of antimicrobial therapy in hospitalized patients is considered inappropriate, which may be associated with increased morbidity and mortality. The best method for evaluation of appropriateness is not well defined. To evaluate the rate of appropriate antimicrobial therapy in a secondary hospital using three different methods, and determine the rate of agreement between the different methods. A point prevalence study included all adult hospitalized patients receiving systemic antimicrobial therapy during 2016, screened on a single day. Clinical, laboratory and therapeutic data were collected from patient files, and appropriateness was rated with a qualitative evaluation by expert opinion. In addition, a quantitative evaluation was performed according to 11 quality indicators (QIs) rated for each patient. A strict definition of appropriateness was fulfilled if six essential QIs were met, and a lenient definition was fulfilled if at least five QIs were met. Agreement between methods was analysed using kappa statistic. Among 106 patients included, rates of appropriateness of antimicrobial therapy ranged from 20% to 75%, depending on the method of evaluation. Very low agreement was found between the strict definition and expert opinion (kappa=0.068), and medium agreement was found between the lenient definition and expert opinion (kappa=0.45). Rates of appropriateness of antimicrobial therapy varied between evaluation methods, with low to moderate agreement between the different methods. Copyright © 2017 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.
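Agreement between two rating methods is typically summarized with Cohen's kappa, as in the abstract above. The sketch below computes kappa for two hypothetical binary appropriateness ratings; the data are invented for illustration and are not the study's.

```python
# Minimal sketch of Cohen's kappa for agreement between two binary
# appropriateness ratings (illustrative data, not the study's).
def cohens_kappa(r1, r2):
    n = len(r1)
    p_obs = sum(a == b for a, b in zip(r1, r2)) / n        # observed agreement
    p_yes1 = sum(r1) / n
    p_yes2 = sum(r2) / n
    p_exp = p_yes1 * p_yes2 + (1 - p_yes1) * (1 - p_yes2)  # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)

# 1 = therapy rated appropriate, 0 = inappropriate (hypothetical 20 patients)
expert    = [1, 1, 0, 1, 0, 1, 1, 1, 0, 1, 1, 0, 1, 1, 0, 1, 0, 1, 1, 0]
strict_qi = [1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0]
print(round(cohens_kappa(expert, strict_qi), 3))
```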
ERIC Educational Resources Information Center
Christie, Christina A.; Fleischer, Dreolin Nesbitt
2010-01-01
To describe the recent practice of evaluation, specifically method and design choices, the authors performed a content analysis on 117 evaluation studies published in eight North American evaluation-focused journals for a 3-year period (2004-2006). The authors chose this time span because it follows the scientifically based research (SBR)…
MRBrainS Challenge: Online Evaluation Framework for Brain Image Segmentation in 3T MRI Scans.
Mendrik, Adriënne M; Vincken, Koen L; Kuijf, Hugo J; Breeuwer, Marcel; Bouvy, Willem H; de Bresser, Jeroen; Alansary, Amir; de Bruijne, Marleen; Carass, Aaron; El-Baz, Ayman; Jog, Amod; Katyal, Ranveer; Khan, Ali R; van der Lijn, Fedde; Mahmood, Qaiser; Mukherjee, Ryan; van Opbroek, Annegreet; Paneri, Sahil; Pereira, Sérgio; Persson, Mikael; Rajchl, Martin; Sarikaya, Duygu; Smedby, Örjan; Silva, Carlos A; Vrooman, Henri A; Vyas, Saurabh; Wang, Chunliang; Zhao, Liang; Biessels, Geert Jan; Viergever, Max A
2015-01-01
Many methods have been proposed for tissue segmentation in brain MRI scans. The multitude of methods proposed complicates the choice of one method above others. We have therefore established the MRBrainS online evaluation framework for evaluating (semi)automatic algorithms that segment gray matter (GM), white matter (WM), and cerebrospinal fluid (CSF) on 3T brain MRI scans of elderly subjects (65-80 y). Participants apply their algorithms to the provided data, after which their results are evaluated and ranked. Full manual segmentations of GM, WM, and CSF are available for all scans and used as the reference standard. Five datasets are provided for training and fifteen for testing. The evaluated methods are ranked based on their overall performance to segment GM, WM, and CSF and evaluated using three evaluation metrics (Dice, H95, and AVD) and the results are published on the MRBrainS13 website. We present the results of eleven segmentation algorithms that participated in the MRBrainS13 challenge workshop at MICCAI, where the framework was launched, and three commonly used freeware packages: FreeSurfer, FSL, and SPM. The MRBrainS evaluation framework provides an objective and direct comparison of all evaluated algorithms and can aid in selecting the best performing method for the segmentation goal at hand.
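Of the three evaluation metrics mentioned, the Dice overlap is the simplest to illustrate. The sketch below computes per-tissue Dice scores on synthetic label volumes; H95 and AVD, which require surface distances, are omitted, and the label arrays are assumed.

```python
# Minimal sketch of the Dice overlap used to rank segmentations (H95 and AVD
# are omitted); the label volumes below are synthetic stand-ins.
import numpy as np

def dice(seg, ref, label):
    """Dice coefficient for one tissue class (e.g. GM, WM or CSF)."""
    a = (seg == label)
    b = (ref == label)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

rng = np.random.default_rng(2)
reference = rng.integers(0, 4, size=(5, 64, 64))     # 0=background, 1=GM, 2=WM, 3=CSF
automatic = reference.copy()
flip = rng.random(reference.shape) < 0.05            # perturb 5% of voxels
automatic[flip] = rng.integers(0, 4, size=flip.sum())

for name, label in (("GM", 1), ("WM", 2), ("CSF", 3)):
    print(name, round(dice(automatic, reference, label), 3))
```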
ERIC Educational Resources Information Center
Youtie, Jan; Bozeman, Barry; Shapira, Philip
1999-01-01
Describes an evaluability assessment of the Georgia Research Alliance (GRA), a technology development program. Presents the steps involved in conducting an evaluability assessment, including development of an understanding of the program and its stakeholders. Analyzes and compares different methods by which the GRA could be evaluated. (SLD)
ERIC Educational Resources Information Center
Kimball, Steven M.; Milanowski, Anthony
2009-01-01
Purpose: The article reports on a study of school leader decision making that examined variation in the validity of teacher evaluation ratings in a school district that has implemented a standards-based teacher evaluation system. Research Methods: Applying mixed methods, the study used teacher evaluation ratings and value-added student achievement…
Data Collection Methods for Evaluating Museum Programs and Exhibitions
ERIC Educational Resources Information Center
Nelson, Amy Crack; Cohn, Sarah
2015-01-01
Museums often evaluate various aspects of their audiences' experiences, be it what they learn from a program or how they react to an exhibition. Each museum program or exhibition has its own set of goals, which can drive what an evaluator studies and how an evaluation evolves. When designing an evaluation, data collection methods are purposefully…
Comparison of Methods for Evaluating Urban Transportation Alternatives
DOT National Transportation Integrated Search
1975-02-01
The objective of the report was to compare five alternative methods for evaluating urban transportation improvement options: unaided judgmental evaluation cost-benefit analysis, cost-effectiveness analysis based on a single measure of effectiveness, ...
Evaluation Guidelines for Service and Methods Demonstration Projects
DOT National Transportation Integrated Search
1976-02-01
The document consists of evaluation guidelines for planning, implementing, and reporting the findings of the evaluation of Service and Methods Demonstration (SMD) projects sponsored by the Urban Mass Transportation Administration (UMTA). The objectiv...
Reduction of bias and variance for evaluation of computer-aided diagnostic schemes.
Li, Qiang; Doi, Kunio
2006-04-01
Computer-aided diagnostic (CAD) schemes have been developed to assist radiologists in detecting various lesions in medical images. In addition to the development, an equally important problem is the reliable evaluation of the performance levels of various CAD schemes. It is good to see that more and more investigators are employing more reliable evaluation methods such as leave-one-out and cross validation, instead of less reliable methods such as resubstitution, for assessing their CAD schemes. However, the common applications of leave-one-out and cross-validation evaluation methods do not necessarily imply that the estimated performance levels are accurate and precise. Pitfalls often occur in the use of leave-one-out and cross-validation evaluation methods, and they lead to unreliable estimation of performance levels. In this study, we first identified a number of typical pitfalls for the evaluation of CAD schemes, and conducted a Monte Carlo simulation experiment for each of the pitfalls to demonstrate quantitatively the extent of bias and/or variance caused by the pitfall. Our experimental results indicate that considerable bias and variance may exist in the estimated performance levels of CAD schemes if one employs various flawed leave-one-out and cross-validation evaluation methods. In addition, for promoting and utilizing a high standard for reliable evaluation of CAD schemes, we attempt to make recommendations, whenever possible, for overcoming these pitfalls. We believe that, with the recommended evaluation methods, we can considerably reduce the bias and variance in the estimated performance levels of CAD schemes.
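One classic pitfall of this kind can be reproduced in a few lines: performing feature selection on the full dataset before leave-one-out evaluation inflates the apparent accuracy even when the labels are pure noise, whereas selecting features inside each fold does not. The Monte Carlo sketch below is illustrative only and is not the paper's exact experiment.

```python
# Monte Carlo sketch of one leave-one-out pitfall (illustrative, not the
# paper's experiments): feature selection done outside vs inside the LOO loop
# on pure-noise data.
import numpy as np

rng = np.random.default_rng(3)

def loo_accuracy(X, y, select_inside):
    n = len(y)
    correct = 0
    for i in range(n):
        tr = np.arange(n) != i
        pool = tr if select_inside else np.ones(n, dtype=bool)
        # pick the feature most correlated with the label on the chosen pool
        corrs = [abs(np.corrcoef(X[pool, j], y[pool])[0, 1]) for j in range(X.shape[1])]
        j = int(np.argmax(corrs))
        # classify the held-out case by the nearer class mean on that feature
        m0 = X[tr & (y == 0), j].mean()
        m1 = X[tr & (y == 1), j].mean()
        pred = 1 if abs(X[i, j] - m1) < abs(X[i, j] - m0) else 0
        correct += int(pred == y[i])
    return correct / n

acc_biased, acc_proper = [], []
for _ in range(20):
    X = rng.normal(size=(30, 100))             # 30 cases, 100 noise features
    y = rng.integers(0, 2, size=30)            # random labels: no real signal
    acc_biased.append(loo_accuracy(X, y, select_inside=False))
    acc_proper.append(loo_accuracy(X, y, select_inside=True))

print("biased LOO accuracy :", round(np.mean(acc_biased), 3))   # typically well above 0.5
print("proper LOO accuracy :", round(np.mean(acc_proper), 3))   # typically close to 0.5
```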
ERIC Educational Resources Information Center
Daigneault, Pierre-Marc; Jacob, Steve
2014-01-01
Although combining methods is nothing new, more contributions about why and how to mix methods for validation purposes are needed. This article presents a case of validating the inferences drawn from the Participatory Evaluation Measurement Instrument, an instrument that purports to measure stakeholder participation in evaluation. Although the…
NASA Astrophysics Data System (ADS)
Tatebe, Hironobu; Kato, Kunihito; Yamamoto, Kazuhiko; Katsuta, Yukio; Nonaka, Masahiko
2005-12-01
Nowadays, many evaluation methods for the food industry based on image processing have been proposed. These methods are becoming new evaluation tools alongside the sensory test and the solid-state measurements that are used for quality evaluation. An advantage of image processing is that it allows objective evaluation. The goal of our research is structure evaluation of sponge cake by using image processing. In this paper, we propose a feature extraction method for the bubble structure of sponge cake. Analysis of the bubble structure is one of the important properties for understanding the characteristics of the cake from the image. To acquire the cake images, we first cut the cakes and scanned their surfaces with a CIS scanner. Because the depth of field of this type of scanner is very shallow, the bubble regions of the surface have low gray-scale values and appear blurred. We extracted bubble regions from the surface images based on these features. First, the input image is binarized, and the bubble features are extracted by morphological analysis. To evaluate the extraction results, we computed the correlation with the "size of the bubble" scores from the sensory test. The results show that bubble extraction based on morphological analysis gives good correlation, indicating that our method performs as well as the subjective evaluation.
Portfolio assessment and evaluation: implications and guidelines for clinical nursing education.
Chabeli, M M
2002-08-01
With the advent of Outcomes-Based Education in South Africa, the quality of nursing education is debatable, especially with regard to the assessment and evaluation of clinical nursing education, which is complex and renders the validity and reliability of the methods used questionable. This paper seeks to explore and describe the use of portfolio assessment and evaluation, its implications and guidelines for its effective use in nursing education. Firstly, the concepts of assessment, evaluation, portfolio and alternative methods of evaluation are defined. Secondly, a comparison of the characteristics of the old (traditional) methods and the new alternative methods of evaluation is made. Thirdly, through deductive analysis, synthesis and inference, implications and guidelines for the effective use of portfolio assessment and evaluation are described. In view of the qualitative, descriptive and exploratory nature of the study, a focus group interview with twenty students following a post-basic degree at a university in Gauteng regarding their perceptions of the use of the portfolio assessment and evaluation method in clinical nursing education was used. A descriptive method of qualitative data analysis of open coding in accordance with Tesch's protocol (in Creswell 1994:155) was used. Resultant implications and guidelines were conceptualised and described within the existing theoretical framework. Principles of trustworthiness were maintained as described by Lincoln and Guba (1985:290-327). Ethical considerations were in accordance with DENOSA's standards of research (1998:7).
Mansoorian, Mohammad Reza; Hosseiny, Marzeih Sadat; Khosravan, Shahla; Alami, Ali; Alaviani, Mehri
2015-06-01
Despite the benefits of the objective structured assessment of technical skills (OSATS) and its appropriateness for evaluating the clinical abilities of nursing students, few studies are available on the application of this method in nursing education. The purpose of this study was to compare the effect of using OSATS and traditional methods on the students' learning. We also aimed to describe students' views about these two methods and about the scores they received in them in a medical emergency course. A quasi-experimental study was performed on 45 first-semester students in nursing and medical emergencies taking a course on fundamentals of practice. The students were selected by a census method and evaluated by both the OSATS and traditional methods. Data collection was performed using checklists prepared on the basis of the 'text book of nursing procedures checklists' published by the Iranian nursing organization and a questionnaire covering learning rate and students' estimation of their received scores. Descriptive statistics as well as paired t-tests and independent-samples t-tests were used in data analysis. The mean of students' scores in OSATS was significantly higher than their mean score in the traditional method (P = 0.01). Moreover, the mean self-evaluation score after the traditional method was roughly the same as the score the students received in that exam, whereas the mean self-evaluation score after the OSATS was somewhat lower than the scores the students received in the OSATS exam. Most students believed that OSATS can evaluate a wider range of students' knowledge and skills than the traditional method. The results of this study indicated the better effect of OSATS on learning and its relative superiority in precise assessment of clinical skills compared with the traditional evaluation method. Therefore, we recommend using this method in the evaluation of students in practical courses.
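As a minimal sketch of the statistical comparison reported above, a paired t-test can be run as follows; the score vectors are hypothetical stand-ins, not the study's checklist data from the 45 students.

```python
import numpy as np
from scipy import stats

# Hypothetical per-student scores under the two evaluation methods.
osats = np.array([16.5, 17.0, 15.5, 18.0, 16.0, 17.5, 15.0, 18.5])
traditional = np.array([15.0, 16.0, 14.5, 16.5, 15.5, 16.0, 14.0, 17.0])

# Paired t-test: each student is scored under both methods.
t_stat, p_value = stats.ttest_rel(osats, traditional)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```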
Multi-criteria evaluation methods in the production scheduling
NASA Astrophysics Data System (ADS)
Kalinowski, K.; Krenczyk, D.; Paprocka, I.; Kempa, W.; Grabowik, C.
2016-08-01
The paper presents a discussion on the practical application of different methods of multi-criteria evaluation in the process of scheduling in manufacturing systems. Among the methods, two main groups are distinguished: methods based on a distance function (using a metacriterion) and methods that create a Pareto set of possible solutions. The basic criteria used for scheduling are also described. The overall procedure of the evaluation process in production scheduling is presented. It takes into account the actions in the whole scheduling process and the participation of the human decision maker (HDM). The specified HDM decisions are related to creating and editing the set of evaluation criteria, selecting the multi-criteria evaluation method, interacting in the search process, applying informal criteria, and making final changes to the schedule before implementation. Depending on need, process scheduling may be completely or partially automated. Full automation is possible in the case of a metacriterion-based objective function; if a Pareto set is used, the final decision has to be made by the HDM.
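The two evaluation routes named above, a distance-based metacriterion versus a Pareto set handed to the HDM, can be illustrated with a short sketch. The candidate schedules, the two criteria (makespan and total tardiness, both to be minimised), and the weights below are invented for illustration and are not taken from the paper.

```python
import numpy as np

# Each row: (makespan, total tardiness) for one candidate schedule.
schedules = np.array([[120.0, 30.0],
                      [110.0, 45.0],
                      [130.0, 10.0],
                      [125.0, 35.0]])

def metacriterion(scores, weights, ideal):
    """Distance-to-ideal metacriterion (weighted Euclidean distance)."""
    return np.sqrt((((scores - ideal) * weights) ** 2).sum(axis=1))

def pareto_front(scores):
    """Indices of non-dominated schedules (minimisation on all criteria)."""
    keep = []
    for i, s in enumerate(scores):
        dominated = any(np.all(o <= s) and np.any(o < s)
                        for j, o in enumerate(scores) if j != i)
        if not dominated:
            keep.append(i)
    return keep

best = metacriterion(schedules, weights=np.array([0.6, 0.4]),
                     ideal=schedules.min(axis=0)).argmin()
print("metacriterion pick:", best)            # fully automatic choice
print("Pareto set:", pareto_front(schedules)) # final pick left to the HDM
```

The metacriterion route yields a single schedule automatically, while the Pareto route returns the whole non-dominated set for the human decision maker to choose from.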
NASA Astrophysics Data System (ADS)
Viacheslav, Ilyin; Lana, Moukhamedieva; Georgy, Osipov; Aleksey, Batov; Zoya, Soloviova; Robert, Mardanov; Yana, Panina; Anna, Gegenava
2011-05-01
Routine control of human microflora is a major problem not only for space medicine but also for practical health care. For many reasons, carrying it out with the classical bacteriological method is difficult or impossible in practice. To assess non-culture methods of microbial control of crews in a confined habitat, we evaluated two different methods. The first method is based on digital processing of microbial images obtained after Gram staining of microbial material from a natural sample. In this way, the ratio of gram-positive to gram-negative microbes can be obtained, and rods can be differentiated from cocci, which is necessary for a primary evaluation of human microbial cenosis in remote confined habitats. The other non-culture method of human microflora evaluation is gas chromatography-mass spectrometry (GC-MS) analysis of swabs gathered from different body sites. GC-MS testing of swabs makes it possible to assess the microflora quantitatively and by species composition on the basis of specific lipid markers.
Water Quality Evaluation of the Yellow River Basin Based on Gray Clustering Method
NASA Astrophysics Data System (ADS)
Fu, X. Q.; Zou, Z. H.
2018-03-01
The water quality of 12 monitoring sections in the Yellow River Basin was evaluated comprehensively by the gray clustering method, based on the water quality monitoring data published by the Ministry of Environmental Protection of China in May 2016 and the environmental quality standard for surface water. The results reflect the water quality of the Yellow River Basin objectively. Furthermore, the evaluation results are basically the same as those obtained with the fuzzy comprehensive evaluation method. The results also show that the overall water quality of the Yellow River Basin is good and consistent with the actual situation of the basin. Overall, the gray clustering method for water quality evaluation is reasonable and feasible, and it is also convenient to calculate.
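A much simplified sketch of gray clustering for a single monitoring section follows; the two indicators (COD and NH3-N), the class centres, the equal weights, and the triangular whitenization functions are illustrative assumptions, not the configuration used in the paper.

```python
def triangular(x, left, center, right):
    """Whitenization weight: peak 1 at `center`, falling to 0 at the neighbouring centres."""
    if left is None and x <= center:
        return 1.0                      # lowest grade keeps full weight below its centre
    if right is None and x >= center:
        return 1.0                      # highest grade keeps full weight above its centre
    if left is not None and left < x <= center:
        return (x - left) / (center - left)
    if right is not None and center < x < right:
        return (right - x) / (right - center)
    return 0.0

centers = {"COD": [10.0, 15.0, 20.0],   # assumed grade I / II / III centres (mg/L)
           "NH3-N": [0.15, 0.50, 1.00]}
weights = {"COD": 0.5, "NH3-N": 0.5}    # equal indicator weights
sample = {"COD": 14.0, "NH3-N": 0.60}   # one section's measured concentrations

coeffs = []
for g in range(3):                      # clustering coefficient for each grade
    sigma = 0.0
    for k, x in sample.items():
        c = centers[k]
        left = c[g - 1] if g > 0 else None
        right = c[g + 1] if g < 2 else None
        sigma += weights[k] * triangular(x, left, c[g], right)
    coeffs.append(round(sigma, 2))

print("clustering coefficients:", coeffs)
print("assigned grade:", coeffs.index(max(coeffs)) + 1)
```

The section is assigned the grade whose clustering coefficient is largest, which is the decision rule the gray clustering evaluation relies on.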
A new state evaluation method of oil pump unit based on AHP and FCE
NASA Astrophysics Data System (ADS)
Lin, Yang; Liang, Wei; Qiu, Zeyang; Zhang, Meng; Lu, Wenqing
2017-05-01
In order to make an accurate state evaluation of an oil pump unit, a comprehensive evaluation index should be established. A multi-parameter state evaluation method for oil pump units is proposed in this paper. The oil pump unit is analyzed by Failure Mode and Effect Analysis (FMEA), so the evaluation index can be derived from the FMEA conclusions. The weights of the different parameters in the evaluation index are determined using the Analytic Hierarchy Process (AHP) together with expert experience. According to the evaluation index and the weight of each parameter, the state evaluation is carried out by Fuzzy Comprehensive Evaluation (FCE), and the state is divided into five levels depending on the status value, an approach inspired by human health assessment. In order to verify the effectiveness and feasibility of the proposed method, the state evaluation of an oil pump used in a pump station is taken as an example.
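An illustrative sketch of the AHP-plus-FCE pipeline is given below with made-up numbers; the three parameters, the pairwise comparison matrix, and the membership matrix are assumptions, whereas the paper derives its index and memberships from the FMEA of the pump unit.

```python
import numpy as np

# AHP: pairwise comparison of three hypothetical parameters
# (vibration, bearing temperature, discharge pressure).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
w = w / w.sum()                       # AHP weights from the principal eigenvector

# Consistency check: CI = (lambda_max - n) / (n - 1), compared against RI.
n = A.shape[0]
CI = (np.max(np.real(eigvals)) - n) / (n - 1)
CR = CI / 0.58                        # Saaty's random index RI = 0.58 for n = 3
print("weights:", np.round(w, 3), "CR:", round(CR, 3))

# FCE: each row gives one parameter's membership in the five state levels.
R = np.array([[0.6, 0.3, 0.1, 0.0, 0.0],
              [0.2, 0.5, 0.2, 0.1, 0.0],
              [0.1, 0.4, 0.3, 0.2, 0.0]])
B = w @ R                             # fuzzy comprehensive evaluation vector
print("state vector:", np.round(B, 3), "-> level", int(np.argmax(B)) + 1)
```

The weighted membership vector B is read off by maximum membership to place the pump in one of the five health-inspired state levels.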
Systematic evaluation of non-animal test methods for skin sensitisation safety assessment.
Reisinger, Kerstin; Hoffmann, Sebastian; Alépée, Nathalie; Ashikaga, Takao; Barroso, Joao; Elcombe, Cliff; Gellatly, Nicola; Galbiati, Valentina; Gibbs, Susan; Groux, Hervé; Hibatallah, Jalila; Keller, Donald; Kern, Petra; Klaric, Martina; Kolle, Susanne; Kuehnl, Jochen; Lambrechts, Nathalie; Lindstedt, Malin; Millet, Marion; Martinozzi-Teissier, Silvia; Natsch, Andreas; Petersohn, Dirk; Pike, Ian; Sakaguchi, Hitoshi; Schepky, Andreas; Tailhardat, Magalie; Templier, Marie; van Vliet, Erwin; Maxwell, Gavin
2015-02-01
The need for non-animal data to assess the skin sensitisation properties of substances, especially cosmetics ingredients, has spawned the development of many in vitro methods. As it is widely believed that no single method can provide a solution, the Cosmetics Europe Skin Tolerance Task Force has defined a three-phase framework for the development of a non-animal testing strategy for skin sensitisation potency prediction. The results of the first phase, a systematic evaluation of 16 test methods, are presented here. This evaluation involved the generation of data on a common set of ten substances in all methods and the systematic collation of information, including the level of standardisation, existing test data, potential for throughput, transferability and accessibility, in cooperation with the test method developers. A workshop was held with the test method developers to review the outcome of this evaluation and to discuss the results. The evaluation informed the prioritisation of test methods for the next phase of the non-animal testing strategy development framework. Ultimately, the testing strategy, combined with bioavailability and skin metabolism data and exposure considerations, is envisaged to allow establishment of a data integration approach for skin sensitisation safety assessment of cosmetic ingredients.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-23
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Nomination of In Vitro Test Methods for Detection and... Evaluated by These Test Methods AGENCY: Division of National Toxicology Program (NTP), National Institute of... Methods (ICCVAM), the NTP Interagency Center for the Evaluation of Alternative Toxicological Methods...
Bubble structure evaluation method of sponge cake by using image morphology
NASA Astrophysics Data System (ADS)
Kato, Kunihito; Yamamoto, Kazuhiko; Nonaka, Masahiko; Katsuta, Yukiyo; Kasamatsu, Chinatsu
2007-01-01
Nowadays, many evaluation methods for the food industry based on image processing have been proposed. These methods are becoming new evaluation tools alongside the sensory test and the solid-state measurements that have been used for quality evaluation. The goal of our research is structure evaluation of sponge cake by using image processing. In this paper, we propose a feature extraction method for the bubble structure of sponge cake. Analysis of the bubble structure is one of the important properties for understanding the characteristics of the cake from the image. To acquire the cake images, we first cut the cakes and scanned their surfaces with a CIS scanner. Because the depth of field of this type of scanner is very shallow, the bubble regions of the surface have low gray-scale values and appear blurred. We extracted bubble regions from the surface images based on these features. The input image is binarized, and the bubble features are extracted by morphological analysis. To evaluate the extraction results, we computed the correlation with the "size of the bubble" scores from the sensory test. The results show that bubble extraction based on morphological analysis gives good correlation, indicating that our method performs as well as the subjective evaluation.
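The threshold-plus-morphology pipeline described above can be sketched roughly as follows; the synthetic image stands in for a scanned cake surface, and the specific thresholding and opening choices are assumptions rather than the authors' exact processing.

```python
import numpy as np
from skimage import filters, measure, morphology

# Build a synthetic "cake surface": bright matrix with dark circular bubbles.
rng = np.random.default_rng(0)
surface = rng.normal(0.7, 0.05, size=(200, 200))
for _ in range(40):
    r, c = rng.integers(10, 190, size=2)
    rr, cc = np.ogrid[:200, :200]
    surface[(rr - r) ** 2 + (cc - c) ** 2 < rng.integers(3, 8) ** 2] = 0.3

# 1) Binarise: bubble regions have low grey values.
binary = surface < filters.threshold_otsu(surface)
# 2) Morphological opening removes speckle that is not a real bubble.
cleaned = morphology.opening(binary, morphology.disk(2))
# 3) Label connected components and measure bubble sizes.
labels = measure.label(cleaned)
areas = [region.area for region in measure.regionprops(labels)]
print(f"bubbles found: {len(areas)}, mean area: {np.mean(areas):.1f} px")
```

The resulting per-image bubble statistics are what would then be correlated against the sensory-test "size of the bubble" scores.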
Efficacy of evaluation of rooster sperm morphology using different staining methods.
Lukaszewicz, E; Jerysz, A; Partyka, A; Siudzińska, A
2008-12-01
This work focused on inexpensive methods of evaluating fowl sperm morphology, based on eosin-nigrosin smears, which can reveal disorders in spermatogenesis and can be recommended for evaluating fertilising potency and selecting males in flocks reproduced by artificial insemination. Four fowl breeds (Black Minorca, Italian Partridge, Forwerk and Greenleg Partridge) were used to determine the efficacy of sperm morphology evaluation using four eosin-nigrosin staining methods (according to Blom, Bakst and Cecil, Morisson, and Jaśkowski) and three examiners of different experience (high, medium, novice). There were significant (P ≤ 0.01) differences in sperm morphology between Blom's staining method and those of Bakst and Cecil, Morisson or Jaśkowski, irrespective of fowl breed and examiner experience. The Blom stain caused sperm head swelling and showed a drastic reduction in the proportion of live spermatozoa with normal morphology. The staining method had a greater influence on sperm morphology evaluation than the experience of the examiners.
Evaluation of a new ultrasensitive assay for cardiac troponin I.
Casals, Gregori; Filella, Xavier; Bedini, Josep Lluis
2007-12-01
We evaluated the analytical and clinical performance of a new ultrasensitive cardiac troponin I (cTnI) assay on the ADVIA Centaur system (TnI-Ultra). The evaluation included determination of the detection limit, within-assay and between-assay variation, and comparison with two other non-ultrasensitive methods. Moreover, cTnI was determined with the three methods in 120 patients with acute chest pain. To evaluate the ability of the new method to detect MI earlier, it was assayed in 8 MI patients who first tested negative and then positive by the other methods. The detection limit was 0.009 microg/L, and imprecision was <10% at all concentrations evaluated. In comparison with the two other methods, 10% of the diagnosed anginas were reclassified as MI. The ADVIA Centaur TnI-Ultra assay showed high reproducibility and high sensitivity. Use of the recommended lower cutpoint (0.044 microg/L) resulted in increased and earlier identification of MI.
Cautions on the Use of Investigative Case Studies in Meta-Evaluation.
ERIC Educational Resources Information Center
Smith, Nick L.
1990-01-01
A meta-analysis combining expert evaluation with naturalistic case study methods indicates that such investigations must use special methods to render evaluative judgments of worth. It is demonstrated that descriptive, interpretive, and evaluative aspects of such a study must be combined to yield justifiable conclusions. (TJH)
Evaluation Methods for Intelligent Tutoring Systems Revisited
ERIC Educational Resources Information Center
Greer, Jim; Mark, Mary
2016-01-01
The 1993 paper in "IJAIED" on evaluation methods for Intelligent Tutoring Systems (ITS) still holds up well today. Basic evaluation techniques described in that paper remain in use. Approaches such as kappa scores, simulated learners and learning curves are refinements on past evaluation techniques. New approaches have also arisen, in…
Are We There Yet? Evaluating Library Collections, Reference Services, Programs, and Personnel.
ERIC Educational Resources Information Center
Robbins-Carter, Jane; Zweizig, Douglas L.
1985-01-01
This second in a five-lesson tutorial on library evaluation focuses on the evaluation of library collections. Highlights include the seven-step evaluation process described in lesson one; quantitative methods (total size, unfilled requests, circulation, turnover rate); and qualitative methods (impressionistic, list-checking). One required and…
A methodology for evaluating the usability of audiovisual consumer electronic products.
Kwahk, Jiyoung; Han, Sung H
2002-09-01
Usability evaluation is now considered an essential procedure in consumer product development. Many studies have been conducted to develop various techniques and methods of usability evaluation in the hope of helping evaluators choose appropriate methods. However, planning and conducting a usability evaluation requires consideration of a number of factors surrounding the evaluation process, including the product, user, activity, and environmental characteristics. From this perspective, this study suggested a new methodology of usability evaluation through a simple, structured framework. The framework was outlined by three major components: the interface features of a product as design variables, the evaluation context consisting of user, product, activity, and environment as context variables, and the usability measures as dependent variables. Based on this framework, this study established methods to specify the product interface features, to define the evaluation context, and to measure usability. The effectiveness of this methodology was demonstrated through case studies in which the usability of audiovisual products was evaluated by using the methods developed in this study. This study is expected to help usability practitioners in the consumer electronics industry in various ways. Most directly, it supports evaluators in planning and conducting usability evaluation sessions in a systematic and structured manner. In addition, it can be applied to other categories of consumer products (such as appliances, automobiles, communication devices, etc.) with minor modifications as necessary.
Sadeghi, Tabandeh; Seyed Bagheri, Seyed Hamid
2017-01-01
Clinical evaluation is very important in the educational system of nursing. One of the most common methods of clinical evaluation is evaluation by the teacher, but the challenges that students face in this evaluation method have not been described. Thus, this study aimed to explore the experiences and views of nursing students about the challenges of teacher-based clinical evaluation. This was a descriptive qualitative study with a qualitative content analysis approach. Data were gathered through semi-structured focus group sessions with undergraduate nursing students who were in their 8th semester at Rafsanjan University of Medical Sciences. Data were analyzed using Graneheim and Lundman's proposed method. Data collection and analysis were concurrent. According to the findings, "factitious evaluation" was the main theme of the study, consisting of three categories: "personal preferences," "unfairness," and "shirking responsibility." These categories are explained using quotes derived from the data. According to the results of this study, teacher-based clinical evaluation leads to factitious evaluation. Thus, changing this approach toward modern methods of evaluation is suggested. The findings can help nursing instructors gain a better understanding of the nursing students' point of view toward this evaluation approach and, as a result, plan for changing it.
NASA Astrophysics Data System (ADS)
Ando, Yoshinobu; Eguchi, Yuya; Mizukawa, Makoto
In this research, we proposed and evaluated a management method for college mechatronics education by applying project management to it. We practiced our management method in the seminar "Microcomputer Seminar" for 3rd-grade students in the Department of Electrical Engineering, Shibaura Institute of Technology. We succeeded in managing the Microcomputer Seminar in 2006 and obtained a favorable evaluation of our management method by means of a questionnaire.
Evaluation of Athletic Training Students' Clinical Proficiencies
Walker, Stacy E; Weidner, Thomas G; Armstrong, Kirk J
2008-01-01
Context: Appropriate methods for evaluating clinical proficiencies are essential in ensuring entry-level competence. Objective: To investigate the common methods athletic training education programs use to evaluate student performance of clinical proficiencies. Design: Cross-sectional design. Setting: Public and private institutions nationwide. Patients or Other Participants: All program directors of athletic training education programs accredited by the Commission on Accreditation of Allied Health Education Programs as of January 2006 (n = 337); 201 (59.6%) program directors responded. Data Collection and Analysis: The institutional survey consisted of 11 items regarding institutional and program demographics. The 14-item Methods of Clinical Proficiency Evaluation in Athletic Training survey consisted of respondents' demographic characteristics and Likert-scale items regarding clinical proficiency evaluation methods and barriers, educational content areas, and clinical experience settings. We used analyses of variance and independent t tests to assess differences among athletic training education program characteristics and the barriers, methods, content areas, and settings regarding clinical proficiency evaluation. Results: Of the 3 methods investigated, simulations (n = 191, 95.0%) were the most prevalent method of clinical proficiency evaluation. An independent-samples t test revealed that more opportunities existed for real-time evaluations in the college or high school athletic training room (t(189) = 2.866, P = .037) than in other settings. Orthopaedic clinical examination and diagnosis (4.37 ± 0.826) and therapeutic modalities (4.36 ± 0.738) content areas were scored the highest in sufficient opportunities for real-time clinical proficiency evaluations. An inadequate volume of injuries or conditions (3.99 ± 1.033) and injury/condition occurrence not coinciding with the clinical proficiency assessment timetable (4.06 ± 0.995) were barriers to real-time evaluation. One-way analyses of variance revealed no difference between athletic training education program characteristics and the opportunities for and barriers to real-time evaluations among the various clinical experience settings. Conclusions: No one primary barrier hindered real-time clinical proficiency evaluation. To determine athletic training students' clinical proficiency for entry-level employment, athletic training education programs must incorporate standardized patients or take a disciplined approach to using simulation for instruction and evaluation. PMID:18668172
Development of Methods of Evaluating Abilities to Make Plans in New Group Work
NASA Astrophysics Data System (ADS)
Kiriyama, Satoshi
The ability to evaluate something vague, such as originality, can be regarded as one of the important elements that constitute the ability to make plans. The author has made use of cooperative activities, in which every member undertakes one stage of a plan-do-check cycle, in order to develop training methods and methods of evaluating this ability. The members of a CHECK team evaluated the activities of a PLAN team and a DO team, and the author tried to grasp the abilities of the CHECK team members by analyzing the results of that evaluation. In addition, the author had some teachers evaluate a sample in order to study the accuracy of the criteria, and extracted some remaining challenges.
Radiologic methods of evaluating generalized osteopenia
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schneider, R.
1984-10-01
Noninvasive methods of evaluating generalized osteopenia include radiography, radionuclide studies, and various quantitative studies. These methods differ in availability, cost, accuracy, precision, radiation dose, and information supplied about bony change. A combination of methods is necessary to detect and follow the course and treatment of osteopenia.
Color image definition evaluation method based on deep learning method
NASA Astrophysics Data System (ADS)
Liu, Di; Li, YingChun
2018-01-01
In order to evaluate different blurring levels of color images and improve image definition evaluation, this paper proposes a no-reference color image clarity evaluation method based on a deep learning framework and a BP neural network classification model. First, VGG16 is used as the feature extractor to obtain 4,096-dimensional features from the images; the extracted features and the image labels are then used to train a BP neural network, which finally performs the color image definition evaluation. The method is tested on images from the CSIQ database, blurred at different levels, giving 4,000 images after processing. The 4,000 images are divided into three categories, each representing a blur level. 300 out of 400 high-dimensional feature vectors are used for training the VGG16 and BP neural network pipeline, and the remaining 100 samples are used for testing. The experimental results show that the method takes full advantage of the learning and representation capability of deep learning. In contrast to most existing image clarity evaluation methods, which rely on manually designed and extracted features, the proposed method extracts image features automatically and achieves excellent image quality classification accuracy on the test data set, with an accuracy rate of 96%. Moreover, the predicted quality levels of the original color images are similar to the perception of the human visual system.
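A sketch of the described pipeline follows, assuming TensorFlow/Keras is available: the 4,096-dimensional features are read from VGG16's fc2 layer, and a small fully connected network plays the role of the BP-network classifier over three blur levels. The dummy training arrays stand in for the blurred CSIQ-derived images; the layer sizes and optimizer are assumptions, not the paper's exact settings.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.applications import VGG16
from tensorflow.keras.applications.vgg16 import preprocess_input

# Feature extractor: VGG16 up to the 4,096-dim fc2 layer (weights downloaded once).
base = VGG16(weights="imagenet", include_top=True)
extractor = tf.keras.Model(base.input, base.get_layer("fc2").output)

def extract_features(images):
    """images: float array of shape (N, 224, 224, 3) with values in 0..255."""
    return extractor.predict(preprocess_input(images), verbose=0)

# BP-style classifier: one hidden layer, softmax over the three blur levels.
clf = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(4096,)),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(3, activation="softmax"),
])
clf.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
            metrics=["accuracy"])

# Dummy data stands in for the blurred images and their blur-level labels.
x_train = np.random.rand(12, 224, 224, 3).astype("float32") * 255
y_train = np.random.randint(0, 3, size=12)
clf.fit(extract_features(x_train), y_train, epochs=2, verbose=0)
```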
ERIC Educational Resources Information Center
Adams, Adrienne E.; Nnawulezi, Nkiru A.; Vandenberg, Lela
2015-01-01
From a utilization-focused evaluation perspective, the success of an evaluation is rooted in the extent to which the evaluation was used by stakeholders. This paper details the "Expectations to Change" (E2C) process, an interactive, workshop-based method designed to engage primary users with their evaluation findings as a means of…
Quality Evaluation of Raw Moutan Cortex Using the AHP and Gray Correlation-TOPSIS Method
Zhou, Sujuan; Liu, Bo; Meng, Jiang
2017-01-01
Background: Raw Moutan cortex (RMC) is an important Chinese herbal medicine. Comprehensive and objective quality evaluation of Chinese herbal medicines has been one of the most important issues in modern herbal medicine development. Objective: To evaluate and compare the quality of RMC using the weighted gray correlation-Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS) method. Materials and Methods: The percentage composition of gallic acid, catechin, oxypaeoniflorin, paeoniflorin, quercetin, benzoylpaeoniflorin and paeonol in different batches of RMC was determined, and MATLAB was then used to construct the gray correlation-TOPSIS assessment model for quality evaluation of RMC. Results: The quality evaluation results of the model and the objective evaluation were consistent, reliable, and stable. Conclusion: The gray correlation-TOPSIS model can be applied to the quality evaluation of traditional Chinese medicines with multiple components and has broad application prospects. SUMMARY The experiment constructs a model to evaluate the quality of RMC using the weighted gray correlation-TOPSIS method. The results show that the model is reliable and provide a feasible way of evaluating the quality of traditional Chinese medicines with multiple components. PMID:28839384
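A simplified sketch of a gray-correlation-TOPSIS ranking is shown below (the paper used MATLAB; Python is used here for consistency with the other sketches). The three batches, the three component columns (a subset of the seven assayed constituents), the weights, and the benefit-type treatment of all criteria are hypothetical assumptions.

```python
import numpy as np

# Rows: hypothetical RMC batches; columns: component contents (%), all benefit-type.
X = np.array([[1.20, 0.80, 2.10],
              [1.05, 0.95, 1.90],
              [1.30, 0.70, 2.30]])
w = np.array([0.3, 0.3, 0.4])                  # assumed criterion weights

V = X / np.sqrt((X ** 2).sum(axis=0)) * w      # weighted, vector-normalised matrix
ideal, anti = V.max(axis=0), V.min(axis=0)     # positive / negative ideal solutions

def grey_grade(V, ref, rho=0.5):
    """Mean grey relational coefficient of each row to a reference row."""
    d = np.abs(V - ref)
    coeff = (d.min() + rho * d.max()) / (d + rho * d.max())
    return coeff.mean(axis=1)

g_pos, g_neg = grey_grade(V, ideal), grey_grade(V, anti)
closeness = g_pos / (g_pos + g_neg)            # higher = closer to the ideal batch
print("relative closeness:", np.round(closeness, 3))
print("ranking (best first):", np.argsort(-closeness) + 1)
```

Replacing the usual Euclidean distances of TOPSIS with grey relational grades is what gives the combined gray correlation-TOPSIS ranking its name.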
[Research progress on identification and quality evaluation of glues medicines].
Li, Hui-Hu; Ren, Gang; Chen, Li-Min; Zhong, Guo-Yue
2018-01-01
Glue medicines are a special kind of traditional Chinese medicine. Because market demand is large, the raw materials are in short supply and proper quality evaluation technology is lacking, the quality of products on the market is inconsistent, and their authentic identification and evaluation remain a problem to be solved. In this paper, the research progress on methods and techniques for the identification and quality evaluation of glue medicines is reviewed. Research on the identification and quality evaluation of glue medicines has mainly concentrated on four aspects: physical and chemical properties of the medicinal materials, trace elements, organic chemical constituents, and biological genetic methods and techniques. The methods based on physicochemical properties include thermal analysis, gel electrophoresis, isoelectric focusing electrophoresis, infrared spectroscopy, gel exclusion chromatography, and circular dichroism. Atomic absorption spectrometry, X-ray fluorescence spectrometry, plasma emission spectrometry and visible spectrophotometry have been used to study the trace elements of glue medicines. The organic chemical composition has been studied through amino acid composition, content detection, odor detection, lipid-soluble components, and organic acid detection. Methods based on biogenetics include DNA, polypeptide and amino acid sequence difference analysis. Overall, because the components of the various glue medicines are similar (such as amino acids, proteins and peptides), authenticity and quality evaluation indices are difficult to judge objectively; the various identification and evaluation methods have their own characteristics but also their limitations. This indicates that further study should focus on the identification of evaluation indices and the integrated application of various techniques, combined with the characteristics of the production process. Copyright© by the Chinese Pharmaceutical Association.
Jerrold E. Winandy; Douglas Herdman
2003-01-01
The purpose of this work was to evaluate the effects of a new boron-nitrogen, phosphate-free fire-retardant (FR) formulation on the initial strength of No. 1 southern pine 2 by 4 lumber and its potential for in-service thermal degradation. The lumber was evaluated according to Method C of the D 5664 standard test method. The results indicated that for lumber exposed at...
Valderrama, Joaquin T; de la Torre, Angel; Alvarez, Isaac; Segura, Jose Carlos; Thornton, A Roger D; Sainz, Manuel; Vargas, Jose Luis
2014-05-01
The recording of the auditory brainstem response (ABR) is used worldwide for hearing screening purposes. In this process, a precise estimation of the most relevant components is essential for an accurate interpretation of these signals. This evaluation is usually carried out subjectively by an audiologist. However, the use of automatic methods for this purpose is being encouraged nowadays in order to reduce human evaluation biases and ensure uniformity among test conditions, patients, and screening personnel. This article describes a new method, the fitted parametric peaks (FPP), that performs automatic quality assessment and identification of the peaks. This method is based on the use of synthesized peaks that are adjusted to the ABR response. The FPP is validated, on the one hand, by an analysis of amplitudes and latencies measured manually by an audiologist and automatically by the FPP method in ABR signals recorded at different stimulation rates; and, on the other hand, by contrasting the performance of the FPP method with automatic evaluation techniques based on the correlation coefficient, FSP, and cross-correlation with a predefined template waveform, comparing the automatic quality evaluations of these methods with subjective evaluations provided by five experienced evaluators on a set of ABR signals of different quality. The results of this study suggest (a) that the FPP method can be used to provide an accurate parameterization of the peaks in terms of amplitude, latency, and width, and (b) that the FPP is the method that best approaches the averaged subjective quality evaluation, as well as providing the best results in terms of sensitivity and specificity in ABR signal validation. The significance of these findings and the clinical value of the FPP method are highlighted in this paper. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
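A hedged sketch of the core idea, fitting a synthesized parametric peak to an averaged response to read off amplitude, latency, and width, is given below. The Gaussian peak shape, the synthetic waveform, and the correlation-based quality score are illustrative assumptions and not the published FPP definition.

```python
import numpy as np
from scipy.optimize import curve_fit

def peak(t, amp, latency, width):
    """Parametric peak described by amplitude, latency (ms) and width (ms)."""
    return amp * np.exp(-0.5 * ((t - latency) / width) ** 2)

t = np.linspace(0, 10, 500)                       # ms after stimulus
rng = np.random.default_rng(1)
abr = peak(t, 0.4, 5.6, 0.35) + rng.normal(0, 0.05, t.size)  # noisy synthetic wave

# Adjust the synthesized peak to the recorded response.
popt, _ = curve_fit(peak, t, abr, p0=[0.3, 5.0, 0.5])
amp, latency, width = popt
print(f"amplitude={amp:.2f}, latency={latency:.2f} ms, width={width:.2f} ms")

# Simple quality score: correlation between the fitted peak and the data
# around the estimated latency (higher = cleaner response).
win = (t > latency - 1) & (t < latency + 1)
quality = np.corrcoef(abr[win], peak(t[win], *popt))[0, 1]
print(f"fit quality: {quality:.2f}")
```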