The four principles: can they be measured and do they predict ethical decision making?
Page, Katie
2012-05-20
Background: The four principles of Beauchamp and Childress - autonomy, non-maleficence, beneficence and justice - have been extremely influential in the field of medical ethics and are fundamental to understanding the current approach to ethical assessment in health care. This study tests whether these principles can be quantitatively measured on an individual level, and subsequently whether they are used in the decision-making process when individuals face ethical dilemmas. Methods: The Analytic Hierarchy Process (AHP) was used as a tool for measuring the principles. Four scenarios involving conflicts between the medical ethical principles were presented to participants, who then judged the ethicality of the action in each scenario and their intention to act in the same manner were they in that situation. Results: Individual preferences for these medical ethical principles can be measured using the AHP, which provides a useful tool with which to highlight individual medical ethical values. On average, individuals show a significant preference for non-maleficence over the other principles; however, and perhaps counter-intuitively, this preference does not appear to relate to applied ethical judgements in specific ethical dilemmas. Conclusions: People state that they value these medical ethical principles but do not appear to use them directly in the decision-making process. The reasons for this are explained through the lack of a behavioural model that accounts for the relevant situational factors not captured by the principles. The limitations of the principles in predicting ethical decision making are discussed. PMID:22606995
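As an illustration of the measurement technique named in the abstract, the sketch below computes AHP priority weights from a pairwise-comparison matrix via Saaty's principal-eigenvector method. The comparison matrix and the printed labels are hypothetical; only the eigenvector recipe and the n = 4 random index (0.90) are the standard textbook ingredients, not the paper's exact implementation.

```python
import numpy as np

# Hypothetical pairwise comparisons of the four principles on Saaty's 1-9
# scale; A[i, j] > 1 means principle i is preferred over principle j.
A = np.array([
    [1.0, 1/3, 1.0, 2.0],
    [3.0, 1.0, 3.0, 4.0],
    [1.0, 1/3, 1.0, 2.0],
    [0.5, 1/4, 0.5, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                    # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                   # priority weights, sum to 1

# Consistency ratio; CR < 0.1 is the usual acceptance threshold.
n = A.shape[0]
cr = ((eigvals[k].real - n) / (n - 1)) / 0.90  # 0.90 = random index for n = 4
labels = ["autonomy", "non-maleficence", "beneficence", "justice"]
print(dict(zip(labels, w.round(3))), "CR =", round(cr, 3))
```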
Introductory review on `Flying Triangulation': a motion-robust optical 3D measurement principle
NASA Astrophysics Data System (ADS)
Ettl, Svenja
2015-04-01
'Flying Triangulation' (FlyTri) is a recently developed principle which allows for motion-robust optical 3D measurement of rough surfaces. It combines a simple sensor with sophisticated algorithms: a single-shot sensor acquires 2D camera images, and from each camera image a 3D profile is generated. The resulting series of 3D profiles is aligned algorithmically, without relying on any external tracking device. The system delivers real-time feedback during the measurement process, which enables an all-around measurement of objects. The principle has great potential for small-space acquisition environments, such as the measurement of the interior of a car, and for motion-sensitive measurement tasks, such as the intraoral measurement of teeth. This article gives an overview of the basic ideas and applications of FlyTri. The main challenges and their solutions are discussed, and measurement examples are given to demonstrate the potential of the measurement principle.
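Aligning successive 3D profiles without an external tracking device is at heart a rigid registration problem. The sketch below shows only the classic Kabsch/Procrustes step for already-corresponding point sets; it is a generic building block under that assumption, not the authors' algorithm, which must also establish correspondences between overlapping profiles.

```python
import numpy as np

def rigid_align(P, Q):
    """Least-squares rotation R and translation t with R @ P[i] + t ~= Q[i]
    for corresponding 3D point sets P, Q of shape (N, 3) (Kabsch method)."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)                 # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cQ - R @ cP

# Toy check: recover a known 5-degree rotation about z plus a translation.
rng = np.random.default_rng(0)
P = rng.normal(size=(40, 3))
a = np.deg2rad(5.0)
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0,        0.0,       1.0]])
Q = P @ R_true.T + np.array([0.1, -0.2, 0.05])
R, t = rigid_align(P, Q)
print(np.allclose(R, R_true, atol=1e-8), t.round(3))
```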
NASA Technical Reports Server (NTRS)
Tsang, Pamela S.; Hart, Sandra G.; Vidulich, Michael A.
1987-01-01
The utility of speech technology was evaluated in terms of three dual-task principles: resource competition between the time-shared tasks, stimulus-central processing-response compatibility, and task integrality. Empirical support for these principles was reviewed, and two studies investigating their interactive effects were described. Objective performance and subjective workload ratings for both single and dual tasks were examined. It was found that the single-task measures were not necessarily good predictors of the dual-task measures, and that all three principles played an important role in determining an optimal task configuration; this was reflected in both the performance measures and the subjective measures. Therefore, consideration of all three principles is required to ensure proper use of speech technology in a complex environment.
Structured light optical microscopy for three-dimensional reconstruction of technical surfaces
NASA Astrophysics Data System (ADS)
Kettel, Johannes; Reinecke, Holger; Müller, Claas
2016-04-01
In microsystems technology, quality control of microstructured surfaces with different surface properties plays an ever more important role. The process of quality control incorporates three-dimensional (3D) reconstruction of specular- and diffusively reflecting technical surfaces. Owing to the demand for high measurement accuracy and high data acquisition rates, structured light optical microscopy has become a valuable solution to this problem, providing high vertical and lateral resolution. However, 3D reconstruction of specular reflecting technical surfaces still remains a challenge for optical measurement principles. In this paper we present a measurement principle based on structured light optical microscopy which enables 3D reconstruction of specular- and diffusively reflecting technical surfaces. It is realized using the two light paths of a stereo microscope equipped with different magnification levels. The right optical path of the stereo microscope is used to project structured light onto the object surface; the left optical path is used to capture the structured illuminated object surface with a camera. Structured light patterns are generated by a Digital Light Processing (DLP) device in combination with a high-power Light Emitting Diode (LED) and are realized as a matrix of discrete light spots illuminating defined areas on the object surface. The introduced measurement principle is thus based on multiple, parallel-processed point measurements. Analysis of the measured Point Spread Function (PSF) by pattern recognition and model fitting algorithms enables the precise calculation of 3D coordinates. Using exemplary technical surfaces we demonstrate the successful application of our measurement principle.
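The abstract's "model fitting" of the measured PSF can be pictured as sub-pixel localization of each projected light spot. A minimal sketch, assuming a Gaussian spot model; the window size, spot parameters, and noise level are all invented for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def gauss2d(coords, x0, y0, sigma, amp, offset):
    # Rotationally symmetric Gaussian spot model, flattened for curve_fit.
    x, y = coords
    return (offset + amp * np.exp(-((x - x0)**2 + (y - y0)**2)
                                  / (2.0 * sigma**2))).ravel()

# Synthetic 15 x 15 pixel window containing one projected spot plus noise.
yy, xx = np.mgrid[0:15, 0:15].astype(float)
spot = gauss2d((xx, yy), 7.3, 6.8, 2.0, 100.0, 10.0).reshape(15, 15)
spot += np.random.default_rng(0).normal(0.0, 1.0, spot.shape)

p0 = (7.0, 7.0, 2.0, spot.max() - spot.min(), spot.min())  # initial guess
(x0, y0, *_), _ = curve_fit(gauss2d, (xx, yy), spot.ravel(), p0=p0)
print(x0, y0)   # sub-pixel spot centre, ~ (7.3, 6.8)
```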
Instructors' Use of the Principles of Teaching and Learning during College Class Sessions
ERIC Educational Resources Information Center
Foster, Daniel D.; Whittington, M. Susie
2017-01-01
The purpose of this study was to measure the frequency of utilization of the Principles of Teaching and Learning (Newcomb, McCracken, Warmbrod, & Whittington, 2004) during college class sessions. Process-product research was implemented (Gage, 1972; Rosenshine & Furst, 1973) using the Principles of Teaching and Learning Assessment (PTLA)…
Definitely maybe: can unconscious processes perform the same functions as conscious processes?
Hesselmann, Guido; Moors, Pieter
2015-01-01
Hassin recently proposed the “Yes It Can” (YIC) principle to describe the division of labor between conscious and unconscious processes in human cognition. According to this principle, unconscious processes can carry out every fundamental high-level cognitive function that conscious processes can perform. In our commentary, we argue that the author presents an overly idealized review of the literature in support of the YIC principle. Furthermore, we point out that the dissimilar trends observed in social and cognitive psychology with respect to published evidence of strong unconscious effects can better be explained by how awareness is defined and measured in the two research fields. Finally, we show that the experimental paradigm chosen by Hassin to rule out remaining objections against the YIC principle is unsuited to verify the new default notion that all high-level cognitive functions can unfold unconsciously. PMID:25999896
Quality of Life and its Measurement: Important Principles and Guidelines
ERIC Educational Resources Information Center
Verdugo, M. A.; Schalock, R. L.; Keith, K. D.; Stancliffe, R. J.
2005-01-01
Background: The importance of the valid assessment of quality of life (QOL) is heightened with the increased use of the QOL construct as a basis for policies and practices in the field of intellectual disability (ID). Method: This article discusses the principles that should guide the measurement process, the major interrogatories (i.e. who, what,…
Ethical principles of informed consent: exploring nurses' dual role of care provider and researcher.
Judkins-Cohn, Tanya M; Kielwasser-Withrow, Kiersten; Owen, Melissa; Ward, Jessica
2014-01-01
This article describes the ethical principles of autonomy, beneficence, and justice within the nurse researcher-participant relationship as these principles relate to the informed consent process for research. Within this process, the nurse is confronted with a dual role. This article describes how nurses, who are in the dual role of care provider and researcher, can apply these ethical principles to their practice in conjunction with the American Nurses Association's code of ethics for nurses. This article also describes, as an element of ethical practice, the importance of using participant-centered quality measures to aid informed decision making of participants in research. In addition, the article provides strategies for improving the informed consent process in nursing research. Finally, case scenarios are discussed, along with the application of ethical principles within the awareness of the dual role of the nurse as care provider and researcher.
1981-03-01
systems, subsystems, equipment, weapons, tactics, missions, etc. Concepts and Principles - Fundamental truths, ideas, opinions and thoughts formed from...verification, etc. Grasping the meaning of concepts and principles, i.e., understanding the basic principles of infrared and radar detection. Understanding...concepts, principles, procedures, etc.). Analysis - A demonstration of a learned process of breaking down material (i.e., data, other information) into
Development of a surface topography instrument for automotive textured steel plate
NASA Astrophysics Data System (ADS)
Wang, Zhen; Wang, Shenghuai; Chen, Yurong; Xie, Tiebang
2010-08-01
The surface topography of automotive steel plate is decisive for its stamping, painting and image clarity performance. For measuring this kind of surface topography, an instrument has been developed based on the principle of vertical-scanning white-light interference microscopy. The microscopic interference system of this instrument is designed around the structure of a Linnik interference microscope. The 1D worktable for the Z direction is designed and introduced in detail, and the working principle of the instrument is analyzed. During measurement, the interference microscope is driven as a whole and the measured surface is scanned in the vertical direction. The measurement accuracy and validity are verified with standard templates. The surface topography of textured steel plate is also measured with this instrument.
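The core of vertical-scanning white-light interferometry is locating, for each pixel, the peak of the coherence envelope along the scan. A toy sketch with an assumed surface height, mean wavelength, and coherence length, using Hilbert-transform demodulation (one common envelope-detection choice among several):

```python
import numpy as np
from scipy.signal import hilbert

# One pixel's white-light correlogram along the vertical scan: fringes under
# a Gaussian coherence envelope centred at the surface height z0 (all values
# below are assumed for illustration).
z = np.linspace(0.0, 10.0, 4000)           # scan position, micrometres
z0, lam, lc = 4.2, 0.6, 1.5                # height, mean wavelength, coherence length
I = 1.0 + np.exp(-((z - z0) / lc) ** 2) * np.cos(4 * np.pi * (z - z0) / lam)

envelope = np.abs(hilbert(I - I.mean()))   # demodulate the fringe signal
print(z[np.argmax(envelope)])              # estimated height, ~4.2 micrometres
```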
Guelpa, Valérian; Laurent, Guillaume J; Sandoz, Patrick; Zea, July Galeano; Clévy, Cédric
2014-03-12
This paper presents a visual measurement method able to sense 1D rigid body displacements with very high resolutions, large ranges and high processing rates. Sub-pixelic resolution is obtained thanks to a structured pattern placed on the target. The pattern is made of twin periodic grids with slightly different periods. The periodic frames are suited for Fourier-like phase calculations-leading to high resolution-while the period difference allows the removal of phase ambiguity and thus a high range-to-resolution ratio. The paper presents the measurement principle as well as the processing algorithms (source files are provided as supplementary materials). The theoretical and experimental performances are also discussed. The processing time is around 3 µs for a line of 780 pixels, which means that the measurement rate is mostly limited by the image acquisition frame rate. A 3-σ repeatability of 5 nm is experimentally demonstrated which has to be compared with the 168 µm measurement range. PMID:24625736
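A toy numerical sketch of the twin-grid idea: each grid yields a high-resolution but ambiguous phase, and the phase difference between the two slightly different periods has a long synthetic (beat) period that removes the ambiguity. The periods, the sinusoidal profile model, and all names are illustrative, not the authors' code.

```python
import numpy as np

def grid_phase(profile, period_px):
    # Fourier-like phase calculation: project the intensity profile onto the
    # complex exponential at the grid's spatial frequency.
    n = np.arange(profile.size)
    return np.angle(np.sum(profile * np.exp(-2j * np.pi * n / period_px)))

def wrap(phi):
    return np.angle(np.exp(1j * phi))      # wrap into (-pi, pi]

def displacement(ref1, cur1, p1, ref2, cur2, p2):
    dphi1 = wrap(grid_phase(ref1, p1) - grid_phase(cur1, p1))  # fine, ambiguous
    dphi2 = wrap(grid_phase(ref2, p2) - grid_phase(cur2, p2))
    synthetic = p1 * p2 / abs(p2 - p1)     # long beat period of the twin grids
    coarse = wrap(dphi1 - dphi2) / (2 * np.pi) * synthetic     # unambiguous
    fine = dphi1 / (2 * np.pi) * p1
    k = np.round((coarse - fine) / p1)     # integer fringe order
    return k * p1 + fine                   # displacement in pixels

# Demo: two sinusoidal grid profiles shifted by the same 23.4-pixel amount.
x = np.arange(780)
p1, p2 = 9.0, 10.0
d = 23.4
ref1, ref2 = np.cos(2 * np.pi * x / p1), np.cos(2 * np.pi * x / p2)
cur1, cur2 = np.cos(2 * np.pi * (x - d) / p1), np.cos(2 * np.pi * (x - d) / p2)
print(displacement(ref1, cur1, p1, ref2, cur2, p2))   # ~23.4
```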
Reverse engineering of the homogeneous-entity product profiles based on CCD
NASA Astrophysics Data System (ADS)
Gan, Yong; Zhong, Jingru; Sun, Ning; Sun, Aoran
2011-08-01
This measurement system uses a layer-by-layer (delaminated) measurement principle and measures values of the entity in three perpendicular directions. As the measured entity is immersed in the liquid layer by layer, each layer's image is collected by a CCD and digitally processed. The basic measuring principle and the working process of the method are introduced. According to Archimedes' law, the buoyancy and immersed volume associated with each layer's depth are measured with an electronic balance, and the corresponding mathematical models are established. By computing each layer's weight and centre of gravity using artificial-intelligence methods, the 3D coordinates of every minute entity cell in the different layers can be reckoned and a 3D contour picture constructed. The experimental results show that the method can measure any homogeneous entity insoluble in water. The measurement is fast and non-destructive, and entities with internal holes can be measured.
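The volume bookkeeping behind the method is plain Archimedes' law: each increment in measured buoyant force between successive immersion depths gives the newly immersed volume. A minimal sketch, assuming water as the liquid; the force readings are invented.

```python
RHO_LIQUID = 1000.0   # kg/m^3, assumed water
G = 9.81              # m/s^2

def layer_volumes(buoyant_forces_N):
    """Volume newly immersed in each layer, from the increase in buoyant
    force between successive immersion depths (Archimedes' law)."""
    vols, prev = [], 0.0
    for fb in buoyant_forces_N:
        vols.append((fb - prev) / (RHO_LIQUID * G))   # m^3 for this layer
        prev = fb
    return vols

print(layer_volumes([0.0981, 0.2943, 0.3924]))   # -> 10, 20, 10 cm^3 layers
```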
40 CFR 63.488 - Methods and procedures for batch front-end process vent group determination.
Code of Federal Regulations, 2014 CFR
2014-07-01
... engineering principles, measurable process parameters, or physical or chemical laws or properties. Examples of... direct measurement as specified in paragraph (b)(5) of this section. Engineering assessment may also be... obtained through direct measurement, as defined in paragraph (b)(5) of this section, through engineering...
40 CFR 63.488 - Methods and procedures for batch front-end process vent group determination.
Code of Federal Regulations, 2012 CFR
2012-07-01
... engineering principles, measurable process parameters, or physical or chemical laws or properties. Examples of... direct measurement as specified in paragraph (b)(5) of this section. Engineering assessment may also be... obtained through direct measurement, as defined in paragraph (b)(5) of this section, through engineering...
40 CFR 63.488 - Methods and procedures for batch front-end process vent group determination.
Code of Federal Regulations, 2013 CFR
2013-07-01
... engineering principles, measurable process parameters, or physical or chemical laws or properties. Examples of... direct measurement as specified in paragraph (b)(5) of this section. Engineering assessment may also be... obtained through direct measurement, as defined in paragraph (b)(5) of this section, through engineering...
40 CFR 63.488 - Methods and procedures for batch front-end process vent group determination.
Code of Federal Regulations, 2011 CFR
2011-07-01
... engineering principles, measurable process parameters, or physical or chemical laws or properties. Examples of... direct measurement as specified in paragraph (b)(5) of this section. Engineering assessment may also be... obtained through direct measurement, as defined in paragraph (b)(5) of this section, through engineering...
Monitoring of laser material processing using machine integrated low-coherence interferometry
NASA Astrophysics Data System (ADS)
Kunze, Rouwen; König, Niels; Schmitt, Robert
2017-06-01
Laser material processing has become an indispensable tool in modern production. With the availability of high-power pico- and femtosecond laser sources, laser material processing is advancing into applications which demand the highest accuracies, such as laser micro milling or laser drilling. In order to enable narrow tolerance windows, closed-loop monitoring of the geometrical properties of the processed workpiece is essential for achieving a robust manufacturing process. Low-coherence interferometry (LCI) is a high-precision measuring principle well known from surface metrology. In recent years, we demonstrated successful integrations of LCI into several different laser material processing methods. Within this paper, we give an overview of the different machine integration strategies, which always aim at a complete and ideally telecentric integration of the measurement device into the existing beam path of the processing laser. Thus, highly accurate depth measurements within machine coordinates and subsequent process control and quality assurance are possible. First products using this principle have already found their way to the market, which underlines the potential of this technology for the monitoring of laser material processing.
NASA Astrophysics Data System (ADS)
Korepanov, Alexey
2017-12-01
Let {T : M \\to M} be a nonuniformly expanding dynamical system, such as logistic or intermittent map. Let {v : M \\to R^d} be an observable and {v_n = \\sum_{k=0}^{n-1} v circ T^k} denote the Birkhoff sums. Given a probability measure {μ} on M, we consider v n as a discrete time random process on the probability space {(M, μ)} . In smooth ergodic theory there are various natural choices of {μ} , such as the Lebesgue measure, or the absolutely continuous T-invariant measure. They give rise to different random processes. We investigate relation between such processes. We show that in a large class of measures, it is possible to couple (redefine on a new probability space) every two processes so that they are almost surely close to each other, with explicit estimates of "closeness". The purpose of this work is to close a gap in the proof of the almost sure invariance principle for nonuniformly hyperbolic transformations by Melbourne and Nicol.
Spatial and Temporal Patterns of Impervious Cover Relative to Watershed Stream Location
The influence of spatial pattern on ecological processes is a guiding principle of landscape ecology. The guiding principle of spatial pattern was used for a U.S. nationwide assessment of impervious cover (IC). Spatial pattern was measured by comparing IC concentration near strea...
Hayes, A Wallace
2005-06-01
The Precautionary Principle in its simplest form states: "When an activity raises threats of harm to human health or the environment, precautionary measures should be taken even if some cause-and-effect relationships are not fully established scientifically". This Principle is the basis for European environmental law, and plays an increasing role in developing environmental health policies as well. It also is used in environmental decision-making in Canada and in several European countries, especially in Denmark, Sweden, and Germany. The Precautionary Principle has been used in the environmental decision-making process and in regulating drugs and other consumer products in the United States. The Precautionary Principle enhances the collection of risk information for, among other items, high production volume chemicals and risk-based analyses in general. It does not eliminate the need for good science or for science-based risk assessments. Public participation is encouraged in both the review process and the decision-making process. The Precautionary Principle encourages, and in some cases may require, transparency of the risk assessment process on health risk of chemicals both for public health and the environment. A debate continues on whether the Principle should embrace the "polluter pays" directive and place the responsibility for providing risk assessment on industry. The best elements of a precautionary approach demand good science and challenge the scientific community to improve methods used for risk assessment.
76 FR 45271 - Review and Qualification of Clinical Outcome Assessments; Public Workshop
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-28
... announcing a public workshop to discuss measurement principles for clinical outcome assessments (COAs) for... appropriate drug development program. Because the qualification process is separate from the drug marketing... other DDTs. This workshop will focus on FDA review principles specific to all type of COAs, i.e., PRO...
The physics of bat echolocation: Signal processing techniques
NASA Astrophysics Data System (ADS)
Denny, Mark
2004-12-01
The physical principles and signal processing techniques underlying bat echolocation are investigated. It is shown, by calculation and simulation, how the measured echolocation performance of bats can be achieved.
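One standard signal-processing building block in such analyses is matched filtering: cross-correlating an emitted FM chirp with the returned echo to estimate the delay, hence the range. A toy sketch; the sample rate, sweep parameters, and delay are invented for illustration.

```python
import numpy as np

fs = 250_000.0                                   # sample rate, Hz (invented)
t = np.arange(0, 0.003, 1 / fs)
chirp = np.cos(2 * np.pi * (80_000 * t - 10e6 * t**2))   # 80 -> 20 kHz FM sweep

# Received signal: attenuated echo delayed by 1.2 ms, buried in noise.
rng = np.random.default_rng(0)
delay_s = 0.0012
rx = rng.normal(0.0, 0.05, 4096)
i0 = int(delay_s * fs)
rx[i0:i0 + chirp.size] += 0.2 * chirp

corr = np.correlate(rx, chirp, mode="valid")     # matched filter
est_delay = np.argmax(corr) / fs
print("range ~", 343.0 * est_delay / 2, "m")     # c_air * delay / 2, ~0.21 m
```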
Influence of measurement error on Maxwell's demon
NASA Astrophysics Data System (ADS)
Sørdal, Vegard; Bergli, Joakim; Galperin, Y. M.
2017-06-01
In any general cycle of measurement, feedback, and erasure, the measurement will reduce the entropy of the system when information about the state is obtained, while erasure, according to Landauer's principle, is accompanied by a corresponding increase in entropy due to the compression of logical and physical phase space. The total process can in principle be fully reversible. A measurement error reduces the information obtained and the entropy decrease in the system. The erasure still gives the same increase in entropy, and the total process is irreversible. Another consequence of measurement error is that suboptimal feedback is applied, which further increases the entropy production if a protocol adapted to the expected error rate is not applied. We consider the effect of measurement error on a realistic single-electron-box Szilard engine, and we find the optimal protocol for the cycle as a function of the desired power P and error ε.
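The information-work bookkeeping described here can be illustrated with the generalized second law W ≤ kT ln 2 · I, where I is the mutual information (in bits) between the electron's true state and an error-prone binary measurement. A sketch with illustrative numbers, not the paper's protocol:

```python
import numpy as np

kT = 4.11e-21                      # J, thermal energy at ~300 K

def binary_entropy(p):
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def max_extractable_work(eps):
    # Mutual information (bits) of a symmetric binary measurement with
    # error rate eps bounds the work per cycle: W <= kT * ln(2) * I.
    info_bits = 1.0 - binary_entropy(eps)
    return kT * np.log(2) * info_bits

for eps in (0.0, 0.05, 0.2):
    print(f"error {eps:4.2f}: W_max = {max_extractable_work(eps):.2e} J")
```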
Application of participatory ergonomics to the redesign of the family-centred rounds process.
Xie, Anping; Carayon, Pascale; Cox, Elizabeth D; Cartmill, Randi; Li, Yaqiong; Wetterneck, Tosha B; Kelly, Michelle M
2015-01-01
Participatory ergonomics (PE) can promote the application of human factors and ergonomics (HFE) principles to healthcare system redesign. This study applied a PE approach to redesigning the family-centred rounds (FCR) process to improve family engagement. Various FCR stakeholders (e.g. patients and families, physicians, nurses, hospital management) were involved in different stages of the PE process. HFE principles were integrated in both the content (e.g. shared mental model, usability, workload consideration, systems approach) and process (e.g. top management commitment, stakeholder participation, communication and feedback, learning and training, project management) of FCR redesign. We describe activities of the PE process (e.g. formation and meetings of the redesign team, data collection activities, intervention development, intervention implementation) and present data on PE process evaluation. To demonstrate the value of PE-based FCR redesign, future research should document its impact on FCR process measures (e.g. family engagement, round efficiency) and patient outcome measures (e.g. patient satisfaction). PMID:25777042
40 CFR 63.115 - Process vent provisions-methods and procedures for process vent group determination.
Code of Federal Regulations, 2011 CFR
2011-07-01
... accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or... From the Synthetic Organic Chemical Manufacturing Industry for Process Vents, Storage Vessels, Transfer... (d)(3) of this section. (1) Engineering assessment may be used to determine vent stream flow rate...
40 CFR 63.115 - Process vent provisions-methods and procedures for process vent group determination.
Code of Federal Regulations, 2013 CFR
2013-07-01
... accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or... From the Synthetic Organic Chemical Manufacturing Industry for Process Vents, Storage Vessels, Transfer... (d)(3) of this section. (1) Engineering assessment may be used to determine vent stream flow rate...
40 CFR 63.115 - Process vent provisions-methods and procedures for process vent group determination.
Code of Federal Regulations, 2014 CFR
2014-07-01
... accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or... From the Synthetic Organic Chemical Manufacturing Industry for Process Vents, Storage Vessels, Transfer... (d)(3) of this section. (1) Engineering assessment may be used to determine vent stream flow rate...
40 CFR 63.115 - Process vent provisions-methods and procedures for process vent group determination.
Code of Federal Regulations, 2012 CFR
2012-07-01
... accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or... From the Synthetic Organic Chemical Manufacturing Industry for Process Vents, Storage Vessels, Transfer... (d)(3) of this section. (1) Engineering assessment may be used to determine vent stream flow rate...
2014-04-01
wettability of diamond is not an issue. Moreover, the solid-state processing can, in principle, be carried out at relatively low temperatures even for non...capacity. q was measured using Archimedes' method, and D was measured with the laser flash technique per ASTM E1461. The specimens for D measurement... principle, attainable by changing the interfacial Cr3C2 layer characteristics. In an earlier study [3], for a given diamond particle size and volume
Moutel, G; Hergon, E; Duchange, N; Bellier, L; Rouger, P; Hervé, C
2005-02-01
The precautionary principle first appeared in France during the health crisis following the contamination of patients with HIV via blood transfusion. This study analyses whether the risk associated with blood transfusion was taken into account early enough considering the context of scientific uncertainty between 1982 and 1985. The aim was to evaluate whether a precautionary principle was applied and whether it was relevant. First, we investigated the context of scientific uncertainty and controversies prevailing between 1982 and 1985. Then we analysed the attitude and decisions of the French authorities in this situation to determine whether a principle of precaution was applied. Finally, we explored the reasons at the origin of the delay in controlling the risk. Despite the scientific uncertainties associated with the potential risk of HIV contamination by transfusion in 1983, we found that a list of recommendations aiming to reduce this risk was published in June of that year. In the prevailing climate of uncertainty, these measures could be seen as precautionary. However, the recommended measures were not widely applied. Cultural, structural and economic factors hindered their implementation. Our analysis provides insight into the use of precautionary principle in the domain of blood transfusion and, more generally, medicine. It also sheds light on the expectations that health professionals should have of this principle. The aim of the precautionary principle is to manage rather than to reduce scientific uncertainty. The principle is not a futile search for zero risk. Rather, it is a principle for action allowing precautionary measures to be taken. However, we show that these measures must appear legitimate to be applied. This legitimacy requires an adapted decision-making process, involving all those concerned in the management of collective risks.
Development and Validation of Instruments to Measure Learning of Expert-Like Thinking
ERIC Educational Resources Information Center
Adams, Wendy K.; Wieman, Carl E.
2011-01-01
This paper describes the process for creating and validating an assessment test that measures the effectiveness of instruction by probing how well that instruction causes students in a class to think like experts about specific areas of science. The design principles and process are laid out and it is shown how these align with professional…
USDA-ARS?s Scientific Manuscript database
In this chapter, definitions of dielectric properties, or permittivity, of materials and a brief discussion of the fundamental principles governing their behavior with respect to influencing factors are presented. The basic physics of the influence of frequency of the electric fields and temperatur...
Evaluation and correction of laser-scanned point clouds
NASA Astrophysics Data System (ADS)
Teutsch, Christian; Isenberg, Tobias; Trostmann, Erik; Weber, Michael; Berndt, Dirk; Strothotte, Thomas
2005-01-01
The digitalization of real-world objects is of great importance in various application domains, e.g. quality assurance in industrial processes, where geometric properties of workpieces have to be measured. Traditionally, this is done with gauges, which is somewhat subjective and time-consuming. We developed a robust optical laser scanner for the digitalization of arbitrary objects, primarily industrial workpieces. As the measuring principle we use triangulation with structured lighting and a multi-axis locomotor system. Measurements on the generated data lead to incorrect results if the contained error is too high. Therefore, processes for geometric inspection under non-laboratory conditions are needed that are robust in permanent use and provide high accuracy as well as high operation speed. The many existing methods for polygonal mesh optimization produce very aesthetic 3D models but often require user interaction and are limited in processing speed and/or accuracy. Furthermore, operations on optimized meshes consider the entire model and pay only little attention to individual measurements. However, many measurements contribute to parts or single scans, and possibly strong differences between neighboring scans are lost during mesh construction. Also, most algorithms consider unsorted point clouds, although the scanned data is structured through device properties and measuring principles. We use this underlying structure to achieve high processing speeds, and we extract intrinsic system parameters and use them for fast pre-processing.
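The triangulation principle mentioned above reduces, for a single laser sheet, to intersecting a camera ray with the laser plane. A minimal sketch with a hypothetical baseline, focal length, and laser angle (not the scanner's actual geometry):

```python
import numpy as np

def depth_from_pixel(u_mm, baseline_mm=100.0, focal_mm=25.0,
                     laser_angle_rad=np.deg2rad(30.0)):
    """Sheet-of-light triangulation: the camera ray x = z*u/f intersects the
    laser plane x = b - z*tan(theta); solving for z gives the depth."""
    return baseline_mm * focal_mm / (u_mm + focal_mm * np.tan(laser_angle_rad))

# A 1 mm shift of the laser line on the sensor maps to a depth change:
print(depth_from_pixel(0.0), depth_from_pixel(1.0))   # ~173.2 mm vs ~162.0 mm
```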
40 CFR 63.11925 - What are my initial and continuous compliance requirements for process vents?
Code of Federal Regulations, 2013 CFR
2013-07-01
... accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or... scale. (iv) Engineering assessment including, but not limited to, the following: (A) Previous test..., and procedures used in the engineering assessment shall be documented. (3) For miscellaneous process...
40 CFR 63.11925 - What are my initial and continuous compliance requirements for process vents?
Code of Federal Regulations, 2014 CFR
2014-07-01
... accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or... scale. (iv) Engineering assessment including, but not limited to, the following: (A) Previous test..., and procedures used in the engineering assessment shall be documented. (3) For miscellaneous process...
40 CFR 63.11925 - What are my initial and continuous compliance requirements for process vents?
Code of Federal Regulations, 2012 CFR
2012-07-01
... accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or... scale. (iv) Engineering assessment including, but not limited to, the following: (A) Previous test..., and procedures used in the engineering assessment shall be documented. (3) For miscellaneous process...
Innovative model of business process reengineering at machine building enterprises
NASA Astrophysics Data System (ADS)
Nekrasov, R. Yu; Tempel, Yu A.; Tempel, O. A.
2017-10-01
The paper considers business process reengineering viewed as a managerial innovation adopted by present-day machine building enterprises, as well as ways to improve its procedure. A developed innovative model of reengineering measures is described, based on the process approach and other principles of company management.
Code of Federal Regulations, 2014 CFR
2014-07-01
... permit limit applicable to the process vent. (4) Design analysis based on accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or properties. (5) All data... tested for vapor tightness. (b) Engineering assessment. Engineering assessment to determine if a vent...
Code of Federal Regulations, 2012 CFR
2012-07-01
... permit limit applicable to the process vent. (4) Design analysis based on accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or properties. (5) All data... tested for vapor tightness. (b) Engineering assessment. Engineering assessment to determine if a vent...
Code of Federal Regulations, 2011 CFR
2011-07-01
... permit limit applicable to the process vent. (4) Design analysis based on accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or properties. (5) All data... tested for vapor tightness. (b) Engineering assessment. Engineering assessment to determine if a vent...
Code of Federal Regulations, 2010 CFR
2010-07-01
... permit limit applicable to the process vent. (4) Design analysis based on accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or properties. (5) All data... tested for vapor tightness. (b) Engineering assessment. Engineering assessment to determine if a vent...
Code of Federal Regulations, 2013 CFR
2013-07-01
... permit limit applicable to the process vent. (4) Design analysis based on accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or properties. (5) All data... tested for vapor tightness. (b) Engineering assessment. Engineering assessment to determine if a vent...
Ultrasound-enhanced bioscouring of greige cotton: regression analysis of process factors
USDA-ARS?s Scientific Manuscript database
Ultrasound-enhanced bioscouring process factors for greige cotton fabric are examined using custom experimental design utilizing statistical principles. An equation is presented which predicts bioscouring performance based upon percent reflectance values obtained from UV-Vis measurements of rutheniu...
Method for measuring target rotation angle by theodolites
NASA Astrophysics Data System (ADS)
Sun, Zelin; Wang, Zhao; Zhai, Huanchun; Yang, Xiaoxu
2013-05-01
To overcome the disadvantages of current theodolite measurement methods in environments with shock, long working hours and similar conditions, this paper proposes a new method for 3D coordinate measurement based on an immovable measuring coordinate system. From the measuring principle, the mathematical model is established and the measurement uncertainty is analysed. The measurement uncertainty of the new method is a function of the theodolite observation angles and their uncertainty, and can be reduced by optimizing the theodolites' placement. Compared to other methods, this method allows the theodolite positions to be changed during the measuring process, and mutual collimation between the theodolites is not required. The experimental results show that the measurement model and the optimal placement principle are correct, and the measurement error is less than 0.01° after optimizing the theodolites' placement.
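The underlying two-theodolite 3D fix can be sketched as intersecting, in the least-squares sense, two sight rays built from the observed angles. The station positions and target below are invented, and the midpoint-of-common-perpendicular construction is the standard textbook one, not necessarily the paper's exact model.

```python
import numpy as np

def sight_direction(az_rad, el_rad):
    # Unit line-of-sight vector from a theodolite's azimuth/elevation angles.
    return np.array([np.cos(el_rad) * np.sin(az_rad),
                     np.cos(el_rad) * np.cos(az_rad),
                     np.sin(el_rad)])

def triangulate(p1, d1, p2, d2):
    # Midpoint of the common perpendicular of two nearly intersecting rays
    # p1 + t*d1 and p2 + s*d2: the classic two-theodolite 3D fix.
    A = np.array([[d1 @ d1, -(d1 @ d2)],
                  [d1 @ d2, -(d2 @ d2)]])
    b = np.array([(p2 - p1) @ d1, (p2 - p1) @ d2])
    t, s = np.linalg.solve(A, b)
    return ((p1 + t * d1) + (p2 + s * d2)) / 2.0

# Demo with invented stations and a target; directions are exact here.
p1, p2 = np.array([0.0, 0.0, 0.0]), np.array([5.0, 0.0, 0.0])
target = np.array([2.0, 10.0, 1.5])
d1 = (target - p1) / np.linalg.norm(target - p1)
d2 = (target - p2) / np.linalg.norm(target - p2)
print(triangulate(p1, d1, p2, d2))   # ~ [2.0, 10.0, 1.5]
```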
40 CFR 63.1412 - Continuous process vent applicability assessment procedures and methods.
Code of Federal Regulations, 2010 CFR
2010-07-01
... engineering principles, measurable process parameters, or physical or chemical laws or properties. Examples of... values, and engineering assessment control applicability assessment requirements are to be determined... by using the engineering assessment procedures in paragraph (k) of this section. (f) Volumetric flow...
40 CFR 63.1412 - Continuous process vent applicability assessment procedures and methods.
Code of Federal Regulations, 2012 CFR
2012-07-01
... engineering principles, measurable process parameters, or physical or chemical laws or properties. Examples of... values, and engineering assessment control applicability assessment requirements are to be determined... by using the engineering assessment procedures in paragraph (k) of this section. (f) Volumetric flow...
40 CFR 63.1412 - Continuous process vent applicability assessment procedures and methods.
Code of Federal Regulations, 2014 CFR
2014-07-01
... engineering principles, measurable process parameters, or physical or chemical laws or properties. Examples of... values, and engineering assessment control applicability assessment requirements are to be determined... by using the engineering assessment procedures in paragraph (k) of this section. (f) Volumetric flow...
40 CFR 63.1412 - Continuous process vent applicability assessment procedures and methods.
Code of Federal Regulations, 2013 CFR
2013-07-01
... engineering principles, measurable process parameters, or physical or chemical laws or properties. Examples of... values, and engineering assessment control applicability assessment requirements are to be determined... by using the engineering assessment procedures in paragraph (k) of this section. (f) Volumetric flow...
40 CFR 63.1412 - Continuous process vent applicability assessment procedures and methods.
Code of Federal Regulations, 2011 CFR
2011-07-01
... engineering principles, measurable process parameters, or physical or chemical laws or properties. Examples of... values, and engineering assessment control applicability assessment requirements are to be determined... by using the engineering assessment procedures in paragraph (k) of this section. (f) Volumetric flow...
Scientific data processing for the MICROSCOPE space experiment
NASA Astrophysics Data System (ADS)
Hardy, Emilie; Metris, Gilles; Santos Rodrigues, Manuel; Touboul, Pierre; Chhun, Ratana; Baghi, Quentin; Berge, Joel
The MICROSCOPE space mission aims at testing the Equivalence Principle, which states that the acceleration of a test object due to gravitation is independent of its mass and internal composition. The Equivalence Principle is at the basis of General Relativity and has been tested on the ground with a record accuracy of a few parts in 10^13. However, most theories for the unification of gravitation with the three other fundamental interactions predict that it will be violated at a level between 10^-18 and 10^-13. This range cannot be reached on Earth because of the numerous perturbations in the terrestrial environment. Being performed in space, the MICROSCOPE experiment will be able to overcome these limitations and test the Equivalence Principle with an accuracy of 10^-15. The instrument will be embarked on board a drag-free microsatellite orbiting the Earth and consists of a differential electrostatic accelerometer composed of two cylindrical test masses made of different materials. The position of the masses is detected by capacitive sensors, while control loops with electrostatic actuation keep them concentric, so that both are submitted to the same gravitational field. The electrostatic accelerations applied to the masses to keep them relatively motionless are measured and will demonstrate a violation of the Equivalence Principle if found unequal. The potential Equivalence Principle violation signal is expected at a well-identified frequency, f_EP. However, the raw measurement is affected by systematic instrumental errors, which are calibrated in orbit during dedicated sessions. The data processing therefore includes a correction of the measurement in order to reduce the contribution of these errors at f_EP. Other perturbations must be considered during the data analysis: numerical effects arise from the finite time span of the measurement. A procedure has thus been determined to extract the Equivalence Principle violation parameter with minimal numerical perturbations, in the nominal situation as well as in the case of missing data, which may amplify these effects. A numerical simulator has been developed to validate the protocol of measurement correction and analysis. The simulator architecture as well as current results will be presented. The scientific data of the MICROSCOPE mission, to be launched in 2016, will be processed by the MICROSCOPE Scientific Mission Center (CMSM). The presentation will focus on the CMSM data management and describe the ground segment architecture, including the interactions between the CMSM and the other entities.
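The kind of extraction this data processing performs, pulling a tone of known frequency f_EP out of a long noisy time series while co-fitting slow drifts, can be sketched as ordinary linear least squares. All amplitudes, rates, and noise levels below are invented for illustration; the mission's actual pipeline is far more elaborate.

```python
import numpy as np

rng = np.random.default_rng(0)
fs, f_ep = 4.0, 1.8e-3                    # sampling rate and signal frequency, Hz
t = np.arange(500_000) / fs               # ~35 h of data

amp_true = 8e-15                          # injected EP-like amplitude, m/s^2
y = (amp_true * np.sin(2 * np.pi * f_ep * t)     # putative violation signal
     + 2e-12 * (t / t[-1])                       # slow instrumental drift
     + 1e-13 * rng.standard_normal(t.size))      # measurement noise

# Linear model: in-phase/quadrature terms at f_EP plus offset and drift.
X = np.column_stack([np.sin(2 * np.pi * f_ep * t),
                     np.cos(2 * np.pi * f_ep * t),
                     np.ones_like(t),
                     t / t[-1]])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.hypot(coef[0], coef[1]))         # recovered amplitude, ~8e-15
```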
Large-aperture space optical system testing based on the scanning Hartmann.
Wei, Haisong; Yan, Feng; Chen, Xindong; Zhang, Hao; Cheng, Qiang; Xue, Donglin; Zeng, Xuefeng; Zhang, Xuejun
2017-03-10
Based on the Hartmann testing principle, this paper proposes a novel image-quality testing technology that applies to large-aperture space optical systems. Compared with the traditional testing method using a large-aperture collimator, the scanning Hartmann testing technology has great advantages owing to its simple structure, low cost, and ability to perform wavefront measurement of an optical system. The basic testing principle of the scanning Hartmann technology, the data processing method, and the simulation process are presented in this paper, together with simulation results that verify the feasibility of the technology. Furthermore, a measuring system is developed to conduct a wavefront measurement experiment on a 200 mm aperture optical system. The small root-mean-square (RMS) deviation (6.3%) between the experimental and interferometric results indicates that the testing system can measure low-order aberration correctly, which means that the scanning Hartmann testing technology is able to test the imaging quality of a large-aperture space optical system.
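In any Hartmann-type test, each spot displacement measures a local wavefront slope, and a least-squares fit of basis-function gradients recovers the wavefront. A minimal sketch using just tip, tilt, and defocus modes with synthetic slopes; the mode set, sampling, and noise are assumptions, not the authors' configuration.

```python
import numpy as np

rng = np.random.default_rng(2)
x, y = rng.uniform(-1.0, 1.0, (2, 200))
inside = x**2 + y**2 <= 1.0               # keep samples on the unit pupil
x, y = x[inside], y[inside]

# Gradients of three low-order modes: tip, tilt, defocus (W = 2r^2 - 1).
Gx = np.column_stack([np.ones_like(x), np.zeros_like(x), 4.0 * x])
Gy = np.column_stack([np.zeros_like(y), np.ones_like(y), 4.0 * y])

c_true = np.array([0.10, -0.05, 0.30])    # wavefront coefficients (waves)
sx = Gx @ c_true + 1e-3 * rng.standard_normal(x.size)   # measured x-slopes
sy = Gy @ c_true + 1e-3 * rng.standard_normal(y.size)   # measured y-slopes

A = np.vstack([Gx, Gy])                   # stack both slope components
s = np.concatenate([sx, sy])
c_hat, *_ = np.linalg.lstsq(A, s, rcond=None)
print(c_hat)                              # ~ [0.10, -0.05, 0.30]
```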
40 CFR 63.1323 - Batch process vents-methods and procedures for group determination.
Code of Federal Regulations, 2010 CFR
2010-07-01
... accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or... recovering monomer, reaction products, by-products, or solvent from a stripper operated in batch mode, and the primary condenser recovering monomer, reaction products, by-products, or solvent from a...
Ku-band signal design study. [space shuttle orbiter data processing network
NASA Technical Reports Server (NTRS)
Rubin, I.
1978-01-01
Analytical tools, methods and techniques for assessing the design and performance of the space shuttle orbiter data processing system (DPS) are provided. The computer data processing network is evaluated in the key areas of queueing behavior, synchronization, and network reliability. The structure of the data processing network is described, as well as the system operation principles and the network configuration. The characteristics of the computer systems are indicated. System reliability measures are defined and studied, and system and network invulnerability measures are computed. Communication path and network failure analysis techniques are included.
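For the queueing-behavior side of such an evaluation, the simplest reference point is the steady-state M/M/1 model. A sketch with invented arrival and service rates; the shuttle DPS analysis itself used more elaborate models.

```python
def mm1_metrics(arrival_rate, service_rate):
    """Steady-state M/M/1 formulas of the kind used as a first cut when
    assessing queueing behavior in a data processing network."""
    rho = arrival_rate / service_rate            # utilization, must be < 1
    assert rho < 1.0, "queue is unstable"
    mean_in_system = rho / (1.0 - rho)           # L, mean number in system
    mean_response = 1.0 / (service_rate - arrival_rate)   # W, mean time in system
    return rho, mean_in_system, mean_response

print(mm1_metrics(80.0, 100.0))   # e.g. 80 msgs/s offered, 100 msgs/s served
```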
Optical versus tactile geometry measurement: alternatives or counterparts
NASA Astrophysics Data System (ADS)
Lehmann, Peter
2003-05-01
This contribution deals with measuring strategies and methods for the determination of several geometrical features, covering the surface micro-topography and the form of mechanical objects. The measuring principles used in optical surface metrology include optical focusing profilers, confocal point-measuring and areal-measuring sensors, as well as interferometric principles such as white-light interferometry and speckle techniques. In comparison with stylus instruments, optical techniques provide certain advantages such as fast data acquisition, in-process applicability and contactless measurement. However, the frequency response characteristics of optical and tactile measurement differ significantly. In addition, optical sensors are commonly more influenced by critical geometrical conditions and by the optical properties of an object. For precise form measurement, mechanical instruments dominate to date. One reason may be that the complete 360° geometry of the measuring object commonly has to be analyzed. Another is that optical principles such as form-measuring interferometry fail for complex object geometries or rougher object surfaces, while other methods, e.g. fringe projection or digital holography, do not yet meet the accuracy demands of precision-engineered workpieces. Hence, a combination of mechanical concepts and optical sensors represents an interesting potential for current and future measuring tasks which require high accuracy and maximum flexibility.
The Measurement Process in Local Quantum Physics and the EPR Paradox
NASA Astrophysics Data System (ADS)
Doplicher, Sergio
2018-01-01
We describe in a qualitative way a possible picture of the Measurement Process in Quantum Mechanics which takes into account the finite, nonzero time duration T of the interaction between the observed system and the microscopic part of the measurement apparatus; the finite spatial size R of that apparatus; and the fact that the macroscopic part of the measurement apparatus, whose role is to amplify the effect of that interaction to a macroscopic scale, is composed of a very large but finite number N of particles. The Schrödinger evolution of the composed system can be expected to deform into the conventional picture of the measurement, as an instantaneous action turning a pure state into a mixture, only in the limit N → ∞, T → 0, R → ∞. Our main point is to discuss this picture for the measurement of local observables in Quantum Field Theory, where the dynamics of the theory and the measurement itself are described by the same time evolution complying with the Principle of Locality. We comment on the Einstein-Podolsky-Rosen thought experiment, reformulated here only in terms of local observables (rather than global ones, such as one-particle or polarization observables). The local picture of the measurement process helps to make it clear that there is no conflict with the Principle of Locality.
40 CFR 63.488 - Methods and procedures for batch front-end process vent group determination.
Code of Federal Regulations, 2010 CFR
2010-07-01
... engineering principles, measurable process parameters, or physical or chemical laws or properties. Examples of... primary condenser recovering monomer, reaction products, by-products, or solvent from a stripper operated in batch mode, and the primary condenser recovering monomer, reaction products, by-products, or...
Beyond bipolar conceptualizations and measures: the case of attitudes and evaluative space.
Cacioppo, J T; Gardner, W L; Berntson, G G
1997-01-01
All organisms must be capable of differentiating hostile from hospitable stimuli to survive. Typically, this evaluative discrimination is conceptualized as being bipolar (hostile-hospitable). This conceptualization is certainly evident in the area of attitudes, where the ubiquitous bipolar attitude measure, by gauging the net affective predisposition toward a stimulus, treats positive and negative evaluative processes as equivalent, reciprocally activated, and interchangeable. Contrary to conceptualizations of this evaluative process as bipolar, recent evidence suggests that distinguishable motivational systems underlie assessments of the positive and negative significance of a stimulus. Thus, a stimulus may vary in terms of the strength of positive evaluative activation and the strength of negative evaluative activation it evokes. Low activation of positive and negative evaluative processes by a stimulus reflects attitude neutrality or indifference, whereas high activation of positive and negative evaluative processes reflects attitude ambivalence. As such, attitudes can be represented more completely within a bivariate space than along a bipolar continuum. Evidence is reviewed showing that the positive and negative evaluative processes underlying many attitudes are distinguishable (stochastically and functionally independent), are characterized by distinct activation functions (positivity offset and negativity bias principles), are related differentially to attitude ambivalence (corollary of ambivalence asymmetries), have distinguishable antecedents (heteroscedasticity principle), and tend to gravitate from a bivariate toward a bipolar structure when the underlying beliefs are the target of deliberation or a guide for behavior (principle of motivational certainty). The implications for societal phenomena such as political elections and democratic structures are discussed.
Ensuring the reliability of stable isotope ratio data--beyond the principle of identical treatment.
Carter, J F; Fry, B
2013-03-01
The need for inter-laboratory comparability is crucial to facilitate the globalisation of scientific networks and the development of international databases to support scientific and criminal investigations. This article considers what lessons can be learned from a series of inter-laboratory comparison exercises organised by the Forensic Isotope Ratio Mass Spectrometry (FIRMS) network in terms of reference materials (RMs), the management of data quality, and technical limitations. The results showed that within-laboratory precision (repeatability) was generally good but between-laboratory accuracy (reproducibility) called for improvements. This review considers how stable isotope laboratories can establish a system of quality control (QC) and quality assurance (QA), emphasising issues of repeatability and reproducibility. For results to be comparable between laboratories, measurements must be traceable to the international δ-scales and, because isotope ratio measurements are reported relative to standards, a key aspect is the correct selection, calibration, and use of international and in-house RMs. The authors identify four principles which promote good laboratory practice. The principle of identical treatment by which samples and RMs are processed in an identical manner and which incorporates three further principles; the principle of identical correction (by which necessary corrections are identified and evenly applied), the principle of identical scaling (by which data are shifted and stretched to the international δ-scales), and the principle of error detection by which QC and QA results are monitored and acted upon. To achieve both good repeatability and good reproducibility it is essential to obtain RMs with internationally agreed δ-values. These RMs will act as the basis for QC and can be used to calibrate further in-house QC RMs tailored to the activities of specific laboratories. In-house QA standards must also be developed to ensure that QC-based calibrations and corrections lead to accurate results for samples. The δ-values assigned to RMs must be recorded and reported with all data. Reference materials must be used to determine what corrections are necessary for measured data. Each analytical sequence of samples must include both QC and QA materials which are subject to identical treatment during measurement and data processing. Results for these materials must be plotted, monitored, and acted upon. Periodically international RMs should be analysed as an in-house proficiency test to demonstrate results are accurate.
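The "principle of identical scaling" described above, shifting and stretching measured values onto the international δ-scale using two reference materials, is a two-point linear normalization. A minimal sketch; the example anchors are the real VSMOW (0‰) and SLAP (-427.5‰) δ2H values, while the measured numbers are invented.

```python
def normalize_delta(measured, rm1_meas, rm1_true, rm2_meas, rm2_true):
    """Two-point 'shift and stretch' normalization of measured delta values
    onto the international scale, anchored by two reference materials."""
    stretch = (rm2_true - rm1_true) / (rm2_meas - rm1_meas)
    return rm1_true + stretch * (measured - rm1_meas)

# Hydrogen-isotope example: VSMOW = 0 per mil, SLAP = -427.5 per mil.
print(normalize_delta(-120.0, 1.5, 0.0, -425.0, -427.5))   # ~ -121.8 per mil
```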
Toward Development of a Generalized Instrument to Measure Andragogy
ERIC Educational Resources Information Center
Holton, Elwood F., III; Wilson, Lynda Swanson; Bates, Reid A.
2009-01-01
Andragogy has emerged as one of the dominant frameworks for teaching adults during the past 40 years. A major and glaring gap in andragogy research is the lack of a measurement instrument that adequately measures both andragogical principles and process design elements. As a result, no definitive empirical test of the theory has been possible. The…
NASA Astrophysics Data System (ADS)
Thurner, Stefan; Corominas-Murtra, Bernat; Hanel, Rudolf
2017-09-01
There are at least three distinct ways to conceptualize entropy: entropy as an extensive thermodynamic quantity of physical systems (Clausius, Boltzmann, Gibbs), entropy as a measure for information production of ergodic sources (Shannon), and entropy as a means for statistical inference on multinomial processes (Jaynes maximum entropy principle). Even though these notions represent fundamentally different concepts, the functional form of the entropy for thermodynamic systems in equilibrium, for ergodic sources in information theory, and for independent sampling processes in statistical systems, is degenerate, H(p) = −∑_i p_i log p_i. For many complex systems, which are typically history-dependent, nonergodic, and nonmultinomial, this is no longer the case. Here we show that for such processes, the three entropy concepts lead to different functional forms of entropy, which we will refer to as SEXT for extensive entropy, SIT for the source information rate in information theory, and SMEP for the entropy functional that appears in the so-called maximum entropy principle, which characterizes the most likely observable distribution functions of a system. We explicitly compute these three entropy functionals for three concrete examples: for Pólya urn processes, which are simple self-reinforcing processes, for sample-space-reducing (SSR) processes, which are simple history-dependent processes that are associated with power-law statistics, and finally for multinomial mixture processes.
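As a concrete handle on the objects named above, the sketch below evaluates the degenerate functional H(p) on the outcome frequencies of a Pólya urn, the simple self-reinforcing, history-dependent process the authors analyze. The snippet only simulates the process; it does not reproduce the paper's three distinct functionals.

```python
import numpy as np

def shannon(p):
    # The degenerate functional H(p) = -sum_i p_i log p_i shared by the
    # three entropy concepts in the multinomial/ergodic case.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def polya_frequencies(n_draws, reinforcement=1.0, seed=3):
    # Polya urn: draw a colour, return it with `reinforcement` extra balls
    # of the same colour; a simple self-reinforcing, history-dependent process.
    rng = np.random.default_rng(seed)
    counts = np.array([1.0, 1.0])
    for _ in range(n_draws):
        i = rng.choice(2, p=counts / counts.sum())
        counts[i] += reinforcement
    return counts / counts.sum()

print(shannon(polya_frequencies(10_000)))   # entropy of the limiting frequencies
```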
Measuring and assessing maintainability at the end of high level design
NASA Technical Reports Server (NTRS)
Briand, Lionel C.; Morasca, Sandro; Basili, Victor R.
1993-01-01
Software architecture appears to be one of the main factors affecting software maintainability. Therefore, in order to predict and assess maintainability early in the development process we need to be able to measure the high-level design characteristics that affect the change process. To this end, we propose a measurement approach that is based on precise assumptions derived from the change process, is grounded in Object-Oriented Design principles, and is partially language independent. We define metrics for cohesion, coupling, and visibility in order to capture the difficulty of isolating, understanding, designing and validating changes.
Measurement for Work. Teaching Guide and Sample Learning Activities.
ERIC Educational Resources Information Center
Angel, Margo; Bolton, Chris
This document is intended to help Australian technical and further education instructors in New South Wales (TAFE NSW) identify teaching principles and learning activities that they can use to help adult learners master the mathematics processes, knowledge, and skills needed to perform basic measurement tasks in today's workplace. The materials…
Warner, Courtney J; Walsh, Daniel B; Horvath, Alexander J; Walsh, Teri R; Herrick, Daniel P; Prentiss, Steven J; Powell, Richard J
2013-11-01
Lean process improvement techniques are used in industry to improve efficiency and quality while controlling costs. These techniques are less commonly applied in health care. This study assessed the effectiveness of Lean principles on first case on-time operating room starts and quantified effects on resident work hours. Standard process improvement techniques (DMAIC methodology: define, measure, analyze, improve, control) were used to identify causes of delayed vascular surgery first case starts. Value stream maps and process flow diagrams were created. Process data were analyzed with Pareto and control charts. High-yield changes were identified and simulated in computer and live settings prior to implementation. The primary outcome measure was the proportion of on-time first case starts; secondary outcomes included hospital costs, resident rounding time, and work hours. Data were compared with existing benchmarks. Prior to implementation, 39% of first cases started on time. Process mapping identified late resident arrival in preoperative holding as a cause of delayed first case starts. Resident rounding process inefficiencies were identified and changed through the use of checklists, standardization, and elimination of nonvalue-added activity. Following implementation of process improvements, first case on-time starts improved to 71% at 6 weeks (P = .002). Improvement was sustained with an 86% on-time rate at 1 year (P < .001). Resident rounding time was reduced by 33% (from 70 to 47 minutes). At 9 weeks following implementation, these changes generated an opportunity cost potential of $12,582. Use of Lean principles allowed rapid identification and implementation of perioperative process changes that improved efficiency and resulted in significant cost savings. This improvement was sustained at 1 year. Downstream effects included improved resident efficiency with decreased work hours. Copyright © 2013 Society for Vascular Surgery. Published by Mosby, Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Gao, Peng
2018-06-01
This work concerns the problem associated with the averaging principle for a higher order nonlinear Schrödinger equation perturbed by an oscillating term arising as the solution of a stochastic reaction-diffusion equation evolving with respect to the fast time. This model can be translated into a multiscale stochastic partial differential equation. The stochastic averaging principle is a powerful tool for the qualitative analysis of stochastic dynamical systems with different time scales. To be more precise, under suitable conditions, we prove that there is a limit process, in which the fast varying process is averaged out, and that this limit process, which takes the form of the higher order nonlinear Schrödinger equation, is an average with respect to the stationary measure of the fast varying process. Finally, by using the Khasminskii technique we obtain the rate of strong convergence of the slow component towards the solution of the averaged equation; as a consequence, the system can be reduced to a single higher order nonlinear Schrödinger equation with a modified coefficient.
Modified Universal Design Survey: Enhancing Operability of Launch Vehicle Ground Crew Worksites
NASA Technical Reports Server (NTRS)
Blume, Jennifer L.
2010-01-01
Operability is a driving requirement for next generation space launch vehicles. Launch site ground operations include numerous operator tasks to prepare the vehicle for launch or to perform preflight maintenance. Ensuring that components requiring operator interaction at the launch site are designed for optimal human use is a high priority for operability. To promote operability, a Design Quality Evaluation Survey based on the Universal Design framework was developed to support Human Factors Engineering (HFE) evaluation for NASA's launch vehicles. Universal Design per se is not a priority for launch vehicle processing; however, applying principles of Universal Design will increase the probability of an error-free and efficient design which promotes operability. The Design Quality Evaluation Survey incorporates and tailors the seven Universal Design Principles and adds new measures for Safety and Efficiency. Adapting an approach proven to measure Universal Design performance in products, each principle is associated with multiple performance measures which are rated with the degree to which the statement is true. The Design Quality Evaluation Survey was employed for several launch vehicle ground processing worksite analyses. The tool was found to be most useful for comparative judgments as opposed to the assessment of a single design option. It provided a useful piece of additional data when assessing possible operator interfaces or worksites for operability.
The neural circuits for arithmetic principles.
Liu, Jie; Zhang, Han; Chen, Chuansheng; Chen, Hui; Cui, Jiaxin; Zhou, Xinlin
2017-02-15
Arithmetic principles are the regularities underlying arithmetic computation. Little is known about how the brain supports the processing of arithmetic principles. The current fMRI study examined neural activation and functional connectivity during the processing of verbalized arithmetic principles, as compared to numerical computation and general language processing. As expected, arithmetic principles elicited stronger activation in bilateral horizontal intraparietal sulcus and right supramarginal gyrus than did language processing, and stronger activation in left middle temporal lobe and left orbital part of inferior frontal gyrus than did computation. In contrast, computation elicited greater activation in bilateral horizontal intraparietal sulcus (extending to posterior superior parietal lobule) than did either arithmetic principles or language processing. Functional connectivity analysis with the psychophysiological interaction approach (PPI) showed that left temporal-parietal (MTG-HIPS) connectivity was stronger during the processing of arithmetic principle and language than during computation, whereas parietal-occipital connectivities were stronger during computation than during the processing of arithmetic principles and language. Additionally, the left fronto-parietal (orbital IFG-HIPS) connectivity was stronger during the processing of arithmetic principles than during computation. The results suggest that verbalized arithmetic principles engage a neural network that overlaps but is distinct from the networks for computation and language processing. Copyright © 2016 Elsevier Inc. All rights reserved.
Surface tension determination using liquid sample micromirror property
NASA Astrophysics Data System (ADS)
Hošek, Jan
2007-05-01
This paper presents an application of the adaptive optics principle to the measurement of surface tension on small liquid samples. The experimental method, devised by Ferguson (1924), is based on measuring the pressure difference across a liquid sample placed in a small-diameter capillary under the condition that one meniscus of the liquid sample is flat. The planarity or curvature radius of the capillary tip meniscus has to be measured and controlled in order to fulfill this condition during measurement. Two different optical set-ups using the micromirror property of the liquid meniscus are presented and their suitability for meniscus profile determination is compared. The optical measurement of the meniscus radius, the data processing, and the control algorithm for setting the adaptive micromirror profile are presented as well. The presented adaptive optics system can also be used for focal length control of microsystems based on liquid micromirrors or microlenses, especially those with long focal distances.
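The relation behind such measurements is the Young-Laplace equation, Δp = 2γ/R: with one meniscus held flat, the measured pressure difference gives the surface tension once the curved meniscus radius is known. A minimal sketch assuming a hemispherical meniscus and illustrative values:

```python
# Young-Laplace estimate of surface tension from the pressure difference
# across a curved meniscus: delta_p = 2 * gamma / R. Assumes the meniscus
# is hemispherical with radius equal to the capillary radius; values are
# illustrative, not from the paper.
def surface_tension(delta_p_pa, radius_m):
    return delta_p_pa * radius_m / 2.0

# Assumed example: 0.25 mm capillary radius, 580 Pa measured difference.
print(surface_tension(580.0, 0.25e-3))  # ~0.0725 N/m, close to water
```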
Direct optical sensors: principles and selected applications.
Gauglitz, Guenter
2005-01-01
In the field of bio- and chemosensors a large number of detection principles have been published within the last decade. These detection principles are based either on the observation of fluorescence-labelled systems or on direct optical detection in the heterogeneous phase. Direct optical detection can be measured by remission (absorption of reflected radiation, opt(r)odes), by measuring micro-refractivity, or by measuring interference. In the last case, either Mach-Zehnder interferometers or measurement of changes in the physical thickness of the layer (measuring micro-reflectivity) caused, e.g., by swelling effects in polymers (due to interaction with analytes) or in bioassays (due to affinity reactions) also play an important role. Here, an overview of microrefractometric and microreflectometric methods is given, and the benefits and drawbacks of the various approaches are demonstrated using samples from the chemo- and biosensor field. The quality of sensors does not depend on transduction principles alone but on the total sensor system defined by this transduction, the sensitive layer, data acquisition electronics, and evaluation software. The intention of this article is, therefore, to demonstrate the essentials of the interaction of these parts within the system, and the focus is on optical sensing using planar transducers, because fibre optical sensors have been reviewed in this journal only recently. The lack of selectivity of chemosensors can be compensated either by the use of sensor arrays or by evaluating time-resolved measurements of the analyte/sensitive-layer interaction. In both cases chemometrics enables the quantification of analyte mixtures. These data-processing methods have also been successfully applied to antibody/antigen interactions, even using cross-reactive antibodies. Because miniaturisation and parallelisation have been essential approaches in recent years, some aspects and current trends, especially for bio-applications, are discussed. Miniaturisation is especially well covered in the literature.
Practical use of video imagery in nearshore oceanographic field studies
Holland, K.T.; Holman, R.A.; Lippmann, T.C.; Stanley, J.; Plant, N.
1997-01-01
An approach was developed for using video imagery to quantify, in terms of both spatial and temporal dimensions, a number of naturally occurring (nearshore) physical processes. The complete method is presented, including the derivation of the geometrical relationships relating image and ground coordinates, principles to be considered when working with video imagery and the two-step strategy for calibration of the camera model. The techniques are founded on the principles of photogrammetry, account for difficulties inherent in the use of video signals, and have been adapted to allow for flexibility of use in field studies. Examples from field experiments indicate that this approach is both accurate and applicable under the conditions typically experienced when sampling in coastal regions. Several applications of the camera model are discussed, including the measurement of nearshore fluid processes, sand bar length scales, foreshore topography, and drifter motions. Although we have applied this method to the measurement of nearshore processes and morphologic features, these same techniques are transferable to studies in other geophysical settings.
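For points lying on a plane such as the mean water surface, the image-to-ground relationship reduces to a plane-to-plane projective transformation. A minimal sketch, with a homography matrix assumed for illustration rather than derived from a real camera calibration:

```python
import numpy as np

def image_to_ground(H, u, v):
    # Map pixel (u, v) to ground (x, y) through a plane-to-plane projective
    # transformation; H would come from the two-step camera calibration.
    x, y, w = H @ np.array([u, v, 1.0])
    return x / w, y / w

# Assumed homography, for illustration only.
H = np.array([[0.05,  0.002, -30.0],
              [0.001, 0.08,  -45.0],
              [0.0,   0.0001,  1.0]])
print(image_to_ground(H, 640.0, 360.0))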
Code of Federal Regulations, 2010 CFR
2010-07-01
... accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or... engineering assessment, as described in paragraph (c)(2) of this section. (1) The owner or operator may choose... run. (2) The owner or operator may use engineering assessment to demonstrate compliance with the...
Code of Federal Regulations, 2013 CFR
2013-07-01
... accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or... engineering assessment, as described in paragraph (c)(2) of this section. (1) The owner or operator may choose... run. (2) The owner or operator may use engineering assessment to demonstrate compliance with the...
Code of Federal Regulations, 2012 CFR
2012-07-01
... accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or... engineering assessment, as described in paragraph (c)(2) of this section. (1) The owner or operator may choose... run. (2) The owner or operator may use engineering assessment to demonstrate compliance with the...
40 CFR 63.645 - Test methods and procedures for miscellaneous process vents.
Code of Federal Regulations, 2012 CFR
2012-07-01
... analysis based on accepted chemical engineering principles, measurable process parameters, or physical or... minute, at a temperature of 20 °C. (g) Engineering assessment may be used to determine the TOC emission...) Engineering assessment includes, but is not limited to, the following: (i) Previous test results provided the...
40 CFR 63.1323 - Batch process vents-methods and procedures for group determination.
Code of Federal Regulations, 2014 CFR
2014-07-01
... accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or... paragraph (b)(5) of this section. Engineering assessment may be used to estimate emissions from a batch... defined in paragraph (b)(5) of this section, through engineering assessment, as defined in paragraph (b)(6...
40 CFR 63.645 - Test methods and procedures for miscellaneous process vents.
Code of Federal Regulations, 2013 CFR
2013-07-01
... analysis based on accepted chemical engineering principles, measurable process parameters, or physical or... minute, at a temperature of 20 °C. (g) Engineering assessment may be used to determine the TOC emission...) Engineering assessment includes, but is not limited to, the following: (i) Previous test results provided the...
40 CFR 63.645 - Test methods and procedures for miscellaneous process vents.
Code of Federal Regulations, 2011 CFR
2011-07-01
... analysis based on accepted chemical engineering principles, measurable process parameters, or physical or... minute, at a temperature of 20 °C. (g) Engineering assessment may be used to determine the TOC emission...) Engineering assessment includes, but is not limited to, the following: (i) Previous test results provided the...
40 CFR 63.1323 - Batch process vents-methods and procedures for group determination.
Code of Federal Regulations, 2011 CFR
2011-07-01
... accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or... paragraph (b)(5) of this section. Engineering assessment may be used to estimate emissions from a batch... defined in paragraph (b)(5) of this section, through engineering assessment, as defined in paragraph (b)(6...
40 CFR 63.1323 - Batch process vents-methods and procedures for group determination.
Code of Federal Regulations, 2013 CFR
2013-07-01
... accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or... paragraph (b)(5) of this section. Engineering assessment may be used to estimate emissions from a batch... defined in paragraph (b)(5) of this section, through engineering assessment, as defined in paragraph (b)(6...
Code of Federal Regulations, 2011 CFR
2011-07-01
... accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or... engineering assessment, as described in paragraph (c)(2) of this section. (1) The owner or operator may choose... run. (2) The owner or operator may use engineering assessment to demonstrate compliance with the...
40 CFR 63.1323 - Batch process vents-methods and procedures for group determination.
Code of Federal Regulations, 2012 CFR
2012-07-01
... accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or... paragraph (b)(5) of this section. Engineering assessment may be used to estimate emissions from a batch... defined in paragraph (b)(5) of this section, through engineering assessment, as defined in paragraph (b)(6...
Demetrius, L
2000-09-07
The science of thermodynamics is concerned with understanding the properties of inanimate matter in so far as they are determined by changes in temperature. The Second Law asserts that in irreversible processes there is a uni-directional increase in thermodynamic entropy, a measure of the degree of uncertainty in the thermal energy state of a randomly chosen particle in the aggregate. The science of evolution is concerned with understanding the properties of populations of living matter in so far as they are regulated by changes in generation time. Directionality theory, a mathematical model of the evolutionary process, establishes that in populations subject to bounded growth constraints, there is a uni-directional increase in evolutionary entropy, a measure of the degree of uncertainty in the age of the immediate ancestor of a randomly chosen newborn. This article reviews the mathematical basis of directionality theory and analyses the relation between directionality theory and statistical thermodynamics. We exploit an analytic relation between temperature and generation time to show that the directionality principle for evolutionary entropy is a non-equilibrium extension of the principle of a uni-directional increase of thermodynamic entropy. The analytic relation between these directionality principles is consistent with the hypothesis of the equivalence of fundamental laws as one moves up the hierarchy, from a molecular ensemble where the thermodynamic laws apply, to a population of replicating entities (molecules, cells, higher organisms), where evolutionary principles prevail. Copyright 2000 Academic Press.
Priority-setting in New Zealand: translating principles into practice.
Ashton, T; Cumming, J; Devlin, N
2000-07-01
In May 1998 the New Zealand Health Funding Authority released a discussion paper which proposed a principles-based approach to setting purchasing priorities that incorporates the economic methods of programme budgeting and marginal analysis, and cost-utility analysis. The principles upon which the process was to be based are effectiveness, cost, equity of health outcomes, Maori health and acceptability. This essay describes and critiques issues associated with translating the principles into practice, most particularly the proposed methods for evaluating the effectiveness and measuring the cost of services. It is argued that the proposals make an important contribution towards the development of a method for prioritizing services which challenges our thinking about those services and their goals, and which is systematic, explicit, and transparent. The shift towards 'thinking at the margin' and systematically reviewing the value for money of competing claims on resources is likely to improve the quality of decision-making compared with the status quo. This does not imply that prioritization can, or should, be undertaken by means of any simple formula. Any prioritization process should always be guided by informed judgement. The approach is more appropriate for some services than for others. Key methodological issues that need further consideration include the choice of instrument for measuring health gains, the identification of marginal services, how to combine qualitative and quantitative information, and how to ensure consistency across different levels of decision-making.
Validation of a multi-criteria evaluation model for animal welfare.
Martín, P; Czycholl, I; Buxadé, C; Krieter, J
2017-04-01
The aim of this paper was to validate an alternative multi-criteria evaluation system to assess animal welfare on farms based on the Welfare Quality® (WQ) project, using an example of welfare assessment of growing pigs. This alternative methodology aimed to be more transparent for stakeholders and more flexible than the methodology proposed by WQ. The WQ assessment protocol for growing pigs was implemented to collect data in different farms in Schleswig-Holstein, Germany. In total, 44 observations were carried out. The aggregation system proposed in the WQ protocol follows a three-step aggregation process. Measures are aggregated into criteria, criteria into principles and principles into an overall assessment. This study focussed on the first two steps of the aggregation. Multi-attribute utility theory (MAUT) was used to produce a value of welfare for each criterion and principle. The utility functions and the aggregation function were constructed in two separated steps. The MACBETH (Measuring Attractiveness by a Categorical-Based Evaluation Technique) method was used for utility function determination and the Choquet integral (CI) was used as an aggregation operator. The WQ decision-makers' preferences were fitted in order to construct the utility functions and to determine the CI parameters. The validation of the MAUT model was divided into two steps, first, the results of the model were compared with the results of the WQ project at criteria and principle level, and second, a sensitivity analysis of our model was carried out to demonstrate the relative importance of welfare measures in the different steps of the multi-criteria aggregation process. Using the MAUT, similar results were obtained to those obtained when applying the WQ protocol aggregation methods, both at criteria and principle level. Thus, this model could be implemented to produce an overall assessment of animal welfare in the context of the WQ protocol for growing pigs. Furthermore, this methodology could also be used as a framework in order to produce an overall assessment of welfare for other livestock species. Two main findings are obtained from the sensitivity analysis, first, a limited number of measures had a strong influence on improving or worsening the level of welfare at criteria level and second, the MAUT model was not very sensitive to an improvement in or a worsening of single welfare measures at principle level. The use of weighted sums and the conversion of disease measures into ordinal scores should be reconsidered.
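The Choquet integral aggregates criterion utilities with respect to a capacity (a monotone set function) rather than fixed weights, which lets interactions between criteria count. A minimal sketch with an illustrative capacity over three made-up criteria, not the actual WQ parameters:

```python
def choquet(values, capacity):
    # values: dict criterion -> utility in [0, 1]
    # capacity: dict frozenset -> weight in [0, 1], monotone, full set -> 1
    order = sorted(values, key=values.get)   # ascending utilities
    total, prev = 0.0, 0.0
    for i, crit in enumerate(order):
        upper = frozenset(order[i:])         # criteria at least this good
        total += (values[crit] - prev) * capacity[upper]
        prev = values[crit]
    return total

# Illustrative capacity on {feeding, housing, health}.
cap = {
    frozenset({"feeding", "housing", "health"}): 1.0,
    frozenset({"housing", "health"}): 0.8,
    frozenset({"feeding", "health"}): 0.7,
    frozenset({"feeding", "housing"}): 0.5,
    frozenset({"health"}): 0.5,
    frozenset({"feeding"}): 0.3,
    frozenset({"housing"}): 0.3,
}
print(choquet({"feeding": 0.6, "housing": 0.4, "health": 0.9}, cap))  # 0.69
```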
Design and Development of a Real-Time Model Attitude Measurement System for Hypersonic Facilities
NASA Technical Reports Server (NTRS)
Jones, Thomas W.; Lunsford, Charles B.
2005-01-01
A series of wind tunnel tests have been conducted to evaluate a multi-camera videogrammetric system designed to measure model attitude in hypersonic facilities. The technique utilizes processed video data and applies photogrammetric principles for point tracking to compute model position including pitch, roll and yaw variables. A discussion of the constraints encountered during the design, development, and testing process, including lighting, vibration, operational range and optical access is included. Initial measurement results from the NASA Langley Research Center (LaRC) 31-Inch Mach 10 tunnel are presented.
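The core photogrammetric step, recovering attitude from tracked image points, is a resection problem. A generic sketch (not the authors' multi-camera system) using OpenCV's solvePnP, with intrinsics, target layout, and Euler convention all assumed:

```python
import numpy as np
import cv2

# Six reference targets on the model (metres) and assumed camera intrinsics.
obj = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [0.0, 0.1, 0.0],
                [0.1, 0.1, 0.02], [0.05, 0.0, 0.05], [0.0, 0.05, 0.05]])
K = np.array([[1200.0, 0.0, 640.0], [0.0, 1200.0, 480.0], [0.0, 0.0, 1.0]])
dist = np.zeros(5)

# Synthesize an observation at a known attitude, then recover it by resection.
rvec_true = np.array([0.05, -0.02, 0.01])
tvec_true = np.array([0.0, 0.0, 2.0])
img_pts, _ = cv2.projectPoints(obj, rvec_true, tvec_true, K, dist)

ok, rvec, tvec = cv2.solvePnP(obj, img_pts, K, dist)
R, _ = cv2.Rodrigues(rvec)
# One common Euler decomposition; the convention is an assumption here.
pitch = np.degrees(np.arctan2(-R[2, 0], np.hypot(R[2, 1], R[2, 2])))
roll = np.degrees(np.arctan2(R[2, 1], R[2, 2]))
yaw = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
print(pitch, roll, yaw)
```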
High-speed measurements of steel-plate deformations during laser surface processing.
Jezersek, Matija; Gruden, Valter; Mozina, Janez
2004-10-04
In this paper we present a novel approach to monitoring the deformations of a steel plate's surface during various types of laser processing, e.g., engraving, marking, cutting, bending, and welding. The measuring system is based on a laser triangulation principle, where the laser projector generates multiple lines simultaneously. This enables us to measure the shape of the surface with a high sampling rate (80 Hz with our camera) and high accuracy (±7 μm). The measurements of steel-plate deformations for plates of different thickness and with different illumination patterns are presented graphically and in an animation.
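To first order, a height change of the surface shifts the imaged laser line laterally in a triangulation geometry. A minimal sketch with assumed optical parameters, not the system described above:

```python
import math

def height_from_shift(shift_px, pixel_pitch_m, magnification, angle_deg):
    # First-order triangulation: a surface height change displaces the imaged
    # laser line laterally. All geometry parameters below are assumed.
    shift_on_object = shift_px * pixel_pitch_m / magnification
    return shift_on_object / math.tan(math.radians(angle_deg))

# Example: 1.5 px shift, 5 um pixels, 0.5x magnification, 30 deg triangulation.
print(height_from_shift(1.5, 5e-6, 0.5, 30.0))  # ~26 um height change
```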
1982-06-01
process) pertain to the second. Time or cost factors sometimes preclude the use of product measures, leaving measures of task process as the only… Keywords: training effectiveness evaluation, product evaluation, training effectiveness, air defense training, training… requirements. The TEE system described in this report incorporates the principles of instructional system development and provides for both product evaluation…
The role of fiberoptics in remote temperature measurement
NASA Technical Reports Server (NTRS)
Vanzetti, Riccardo
1988-01-01
The use of optical fibers in conjunction with infrared detectors and signal processing electronics represents the latest advance in the field of non-contact temperature measurement and control. The operating principles and design of fiber-optic radiometric systems are discussed and the advantages and disadvantages of using optical fibers are addressed. Signal processing requirements and various infrared detector types are also described. Several areas in which infrared fiber-optic instrumentation is used for temperature monitoring and control are discussed.
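Non-contact temperature measurement of this kind rests on inverting Planck's law over the detector's wavelength band. A single-wavelength sketch assuming unit emissivity and an idealized detector:

```python
import math

H = 6.626e-34   # Planck constant, J s
C = 2.998e8     # speed of light, m/s
KB = 1.381e-23  # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    # Spectral radiance of a blackbody at one wavelength.
    a = 2 * H * C**2 / wavelength_m**5
    return a / math.expm1(H * C / (wavelength_m * KB * temp_k))

def temperature_from_radiance(wavelength_m, radiance):
    # Invert Planck's law at one wavelength; assumes emissivity = 1.
    a = 2 * H * C**2 / wavelength_m**5
    return (H * C / (wavelength_m * KB)) / math.log1p(a / radiance)

L_meas = planck_radiance(2e-6, 600.0)           # simulate a 600 K target at 2 um
print(temperature_from_radiance(2e-6, L_meas))  # recovers ~600 K
```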
NASA Astrophysics Data System (ADS)
Mitter, Thomas; Grün, Hubert; Roither, Jürgen; Betz, Andreas; Bozorgi, Salar; Reitinger, Bernhard; Burgholzer, Peter
2014-05-01
In the continuous casting process the avoidance and rapid detection of solidification cracks occurring in the slab is a crucial issue, in particular for the maintenance of a high quality level in further production processes. Due to the elevated temperatures of the slab surface, a remote-sensing non-destructive tool for quality inspection is required which is also applicable in the harsh industrial environment. In this work the application of the laser ultrasound (LUS) technique during the continuous casting process in an industrial environment is shown. The proof of principle of the detection of centered solidification cracks is demonstrated by pulse-echo measurements with laser ultrasonic equipment for inline quality inspection. Preliminary examinations in the lab on different cast samples have shown that slabs with and without solidification cracks can be distinguished. Furthermore, the damping of the bulk wave has been used to predict the dimensions of the crack. With an adapted "synthetic aperture focusing technique" (SAFT) algorithm, the image reconstruction of multiple measurements at different positions around the circumference has provided enough information to estimate the localization and extension of the centered solidification cracks. First measurements using this laser ultrasonic setup during the continuous casting of aluminum were subsequently carried out and showed the proof of principle in an industrial environment with elevated temperatures, dust, cooling water and vibrations.
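At its core, the SAFT reconstruction referred to is a delay-and-sum over A-scans recorded at different positions. A minimal 2-D sketch on synthetic data; the geometry, wave speed, and sampling rate are assumed:

```python
import numpy as np

def saft(a_scans, positions, xs, zs, c, fs):
    # Delay-and-sum: for each image pixel, sum each A-scan's sample at the
    # round-trip time from that scan position to the pixel and back.
    image = np.zeros((len(zs), len(xs)))
    for scan, x0 in zip(a_scans, positions):
        for j, x in enumerate(xs):
            for i, z in enumerate(zs):
                t = 2.0 * np.hypot(x - x0, z) / c   # round-trip delay
                k = int(t * fs + 0.5)
                if k < len(scan):
                    image[i, j] += scan[k]
    return image

# Synthetic example: one point scatterer at x = 0, z = 20 mm in aluminium.
c, fs = 6320.0, 50e6                      # m/s, Hz (assumed)
positions = np.linspace(-0.01, 0.01, 11)  # scan positions, m
a_scans = []
for x0 in positions:
    scan = np.zeros(2048)
    scan[int(2.0 * np.hypot(x0, 0.02) / c * fs + 0.5)] = 1.0  # idealized echo
    a_scans.append(scan)
img = saft(a_scans, positions, np.linspace(-0.01, 0.01, 41),
           np.linspace(0.01, 0.03, 41), c, fs)
print(np.unravel_index(img.argmax(), img.shape))  # peaks near the scatterer
```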
ERIC Educational Resources Information Center
Hahn, William G.; Bart, Barbara D.
2003-01-01
Business students were taught a total quality management-based outlining process for course readings and a tally method to measure learning efficiency. Comparison of 233 who used the process and 99 who did not showed that the group means of users' test scores were 12.4 points higher than those of nonusers. (Contains 25 references.) (SK)
Margaritelis, Nikos V; Cobley, James N; Paschalis, Vassilis; Veskoukis, Aristidis S; Theodorou, Anastasios A; Kyparos, Antonios; Nikolaidis, Michalis G
2016-04-01
The equivocal role of reactive species and redox signaling in exercise responses and adaptations is an example clearly showing the inadequacy of current redox biology research to shed light on fundamental biological processes in vivo. Part of the answer probably relies on the extreme complexity of the in vivo redox biology and the limitations of the currently applied methodological and experimental tools. We propose six fundamental principles that should be considered in future studies to mechanistically link reactive species production to exercise responses or adaptations: 1) identify and quantify the reactive species, 2) determine the potential signaling properties of the reactive species, 3) detect the sources of reactive species, 4) locate the domain modified and verify the (ir)reversibility of post-translational modifications, 5) establish causality between redox and physiological measurements, 6) use selective and targeted antioxidants. Fulfilling these principles requires an idealized human experimental setting, which is certainly a utopia. Thus, researchers should choose to satisfy those principles, which, based on scientific evidence, are most critical for their specific research question. Copyright © 2015 Elsevier Inc. All rights reserved.
GUIDING PRINCIPLES FOR GOOD PRACTICES IN HOSPITAL-BASED HEALTH TECHNOLOGY ASSESSMENT UNITS.
Sampietro-Colom, Laura; Lach, Krzysztof; Pasternack, Iris; Wasserfallen, Jean-Blaise; Cicchetti, Americo; Marchetti, Marco; Kidholm, Kristian; Arentz-Hansen, Helene; Rosenmöller, Magdalene; Wild, Claudia; Kahveci, Rabia; Ulst, Margus
2015-01-01
Health technology assessment (HTA) carried out for policy decision making has well-established principles unlike hospital-based HTA (HB-HTA), which differs from the former in the context characteristics and ways of operation. This study proposes principles for good practices in HB-HTA units. A framework for good practice criteria was built inspired by the EFQM excellence business model and information from six literature reviews, 107 face-to-face interviews, forty case studies, large-scale survey, focus group, Delphi survey, as well as local and international validation. In total, 385 people from twenty countries have participated in defining the principles for good practices in HB-HTA units. Fifteen guiding principles for good practices in HB-HTA units are grouped in four dimensions. Dimension 1 deals with principles of the assessment process aimed at providing contextualized information for hospital decision makers. Dimension 2 describes leadership, strategy and partnerships of HB-HTA units which govern and facilitate the assessment process. Dimension 3 focuses on adequate resources that ensure the operation of HB-HTA units. Dimension 4 deals with measuring the short- and long-term impact of the overall performance of HB-HTA units. Finally, nine core guiding principles were selected as essential requirements for HB-HTA units based on the expertise of the HB-HTA units participating in the project. Guiding principles for good practices set up a benchmark for HB-HTA because they represent the ideal performance of HB-HTA units; nevertheless, when performing HTA at hospital level, context also matters; therefore, they should be adapted to ensure their applicability in the local context.
The action uncertainty principle for continuous measurements
NASA Astrophysics Data System (ADS)
Mensky, Michael B.
1996-02-01
The action uncertainty principle (AUP) for the specification of the most probable readouts of continuous quantum measurements is proved, formulated in different forms and analyzed (for nonlinear as well as linear systems). Continuous monitoring of an observable A(p,q,t) with resolution Δa(t) is considered. The influence of the measurement process on the evolution of the measured system (quantum measurement noise) is presented by an additional term δF(t)A(p,q,t) in the Hamiltonian, where the function δF (generalized fictitious force) is restricted by the AUP, ∫|δF(t)| Δa(t) dt ≲ ℏ, and arbitrary otherwise. Quantum-nondemolition (QND) measurements are analyzed with the help of the AUP. A simple uncertainty relation for continuous quantum measurements is derived. It states that the area of a certain band in the phase space should be of the order of ℏ. The width of the band depends on the measurement resolution while its length is determined by the deviation of the system, due to the measurement, from classical behavior.
Effect of Automatic Processing on Specification of Problem Solutions for Computer Programs.
1981-03-01
Number 7 ± 2" item limitaion on human short-term memory capability (Miller, 1956) should be a guiding principle in program design. Yourdon and...input either a single example solution or multiple exam’- le solutions in sequence. If a participant’s P1 has a low value - near 0 - it may be concluded... Principles in Experimental Design, Winer ,1971). 55 Table 12 ANOVA Resultt, For Performance Measure 2 Sb DF MS F Source of Variation Between Subjects
Solarsh, Barbara; Alant, Erna
2006-01-01
A culturally appropriate test, The Test of Ability To Explain for Zulu-speaking Children (TATE-ZC), was developed to measure verbal problem solving skills of rural, Zulu-speaking, primary school children. Principles of 'non-biased' assessment, as well as emic (culture specific) and etic (universal) aspects of intelligence formed the theoretical backdrop. In addition, specific principles relating to test translation; test content; culturally appropriate stimulus material; scoring procedures and test administration were applied. Five categories of abstract thinking skills formed the basis of the TATE-ZC. These were: (a) Explaining Inferences, (b) Determining Cause, (c) Negative Why Questions, (d) Determining Solutions and (e) Avoiding Problem. The process of test development underwent three pilot studies. Results indicate that the TATE-ZC is a reliable and valid test for the target population. A critical analysis of the efficacy of creating a test of verbal reasoning for children from the developing world concludes the article. As a result of this activity (1) the participant will have a clearer understanding of the principles that need to be followed when developing culturally appropriate test material; (2) the participant will understand the process of developing culturally appropriate test material for non-mainstream cultures; (3) the participant will be able to apply the process and principles to other cross-cultural testing situations.
[Thermal energy utilization analysis and energy conservation measures of fluidized bed dryer].
Xing, Liming; Zhao, Zhengsheng
2012-07-01
To propose measures for enhancing thermal energy utilization by analyzing the drying process and operating principle of fluidized bed dryers, in order to guide the optimization and upgrading of fluidized bed drying equipment. Through a systematic analysis of the drying process and operating principle of fluidized beds, the energy conservation law was adopted to calculate the thermal energy of dryers. The thermal energy of fluidized bed dryers is mainly used to make up for the thermal consumption of water evaporation (Qw), hot air leaving the equipment outlet (Qe), the thermal consumption of heating and drying wet materials (Qm), and heat dissipation to the surroundings through hot air pipelines and cyclone separators. Effective measures and major approaches to enhance the thermal energy utilization of fluidized bed dryers are to reduce the heat lost with the exhaust gas (Qe), recycle the heat in the dryer outlet air, insulate the drying towers, hot air pipes and cyclone separators, dehumidify the clean inlet air, and reasonably control the drying time and air temperature. Technical parameters such as the air supply rate, inlet air temperature and humidity, material temperature, and outlet temperature and humidity should be set and controlled to effectively save energy during the drying process and reduce production costs.
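The heat balance described amounts to Q_supplied = Qw + Qe + Qm + Q_loss. A minimal bookkeeping sketch with assumed values, useful only to illustrate where the energy goes:

```python
# Simple thermal-energy bookkeeping for a fluidized bed dryer; all inputs
# are illustrative, not measured plant data.
LATENT_HEAT = 2.26e6  # J/kg, water near 100 degC

def dryer_balance(water_evaporated_kg, q_exhaust_j, q_material_j, q_loss_j):
    q_w = water_evaporated_kg * LATENT_HEAT   # evaporation load
    q_total = q_w + q_exhaust_j + q_material_j + q_loss_j
    return q_w / q_total                      # fraction doing useful drying

# Example: 50 kg of water removed; exhaust, material and wall losses assumed.
print(dryer_balance(50, q_exhaust_j=6.0e7, q_material_j=1.5e7, q_loss_j=0.8e7))
```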
Meanings, mechanisms, and measures of holistic processing.
Richler, Jennifer J; Palmeri, Thomas J; Gauthier, Isabel
2012-01-01
Few concepts are more central to the study of face recognition than holistic processing. Progress toward understanding holistic processing is challenging because the term "holistic" has many meanings, with different researchers addressing different mechanisms and favoring different measures. While in principle the use of different measures should provide converging evidence for a common theoretical construct, convergence has been slow to emerge. We explore why this is the case. One challenge is that "holistic processing" is often used to describe both a theoretical construct and a measured effect, which may not have a one-to-one mapping. Progress requires more than greater precision in terminology regarding different measures of holistic processing or different hypothesized mechanisms of holistic processing. Researchers also need to be explicit about what meaning of holistic processing they are investigating so that it is clear whether different researchers are describing the same phenomenon or not. Face recognition differs from object recognition, and not all meanings of holistic processing are equally suited to help us understand that important difference.
A real-time measurement system for parameters of live biology metabolism process with fiber optics
NASA Astrophysics Data System (ADS)
Tao, Wei; Zhao, Hui; Liu, Zemin; Cheng, Jinke; Cai, Rong
2010-08-01
Energy metabolism is one of the basic life activities of cells, in which lactate, O2 and CO2 are released into the extracellular environment. By monitoring these parameters, the mitochondrial performance can be assessed. A continuous measurement system for the concentrations of O2 and CO2 and the pH value is introduced in this paper. The system is made up of several small-sized fiber-optic biosensors matched to the culture container. The setup of the system and the measurement principle for the several parameters are explained. The setup of the fiber-optic pH sensor, based on the principle of light absorption, is also introduced in detail and some experimental results are given. The results show that the system can measure the pH value with a precision suitable for cell cultivation. The linearity and repeatability accuracies are 3.6% and 6.7% respectively, which fulfills the measurement task.
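Absorption-based pH sensing of this type typically combines the Beer-Lambert law with the Henderson-Hasselbalch relation for an indicator dye. A minimal sketch with an assumed indicator pKa and calibration absorbances (the two-form dye model is an assumption, not the paper's calibration):

```python
import math

def ph_from_absorbance(a_meas, a_acid, a_base, pka):
    # Henderson-Hasselbalch with the base/acid ratio inferred from the
    # indicator absorbance at a single wavelength (idealized two-form dye).
    ratio = (a_meas - a_acid) / (a_base - a_meas)
    return pka + math.log10(ratio)

# Assumed calibration: fully acidic/basic absorbances and indicator pKa.
print(ph_from_absorbance(0.55, a_acid=0.10, a_base=0.90, pka=7.0))  # ~7.11
```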
How did Archimedes discover the law of buoyancy by experiment?
NASA Astrophysics Data System (ADS)
Kuroki, Hidetaka
2016-03-01
After the era of Archimedes and Vitruvius, for more than 2000 years, it was believed that measuring the water displaced by the golden crown is impossible, and that at his Eureka moment Archimedes discovered the law of buoyancy (Proposition 7 of his principles) and proved the theft of the goldsmith by weighing the golden crown in water. A previous study showed that a small amount of displaced water could be measured with sufficient accuracy by the method introduced there. Archimedes measured the weight of displaced water. He did not find the law of buoyancy at that moment, but rather the specific gravity of things. Afterwards, Archimedes continued to measure the specific gravity of various solids and fluids. Through these measurements, he reached the discovery of the law of buoyancy directly by experiment. In this paper, the process leading to the discovery of Archimedes' principle (Proposition 5) is presented.
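The weighing-in-water procedure yields specific gravity directly: by the buoyancy law, the loss of weight in water equals the weight of the displaced water, so SG = W_air / (W_air - W_water). A worked sketch with assumed weights:

```python
# Specific gravity from hydrostatic weighing: the loss of weight in water
# equals the weight of the displaced water. Weights below are assumed.
def specific_gravity(weight_in_air, weight_in_water):
    return weight_in_air / (weight_in_air - weight_in_water)

# A crown of pure gold would give ~19.3; a lower value betrays an alloy.
print(specific_gravity(1.000, 0.940))  # ~16.7: silver-alloyed, not pure gold
```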
Code of Federal Regulations, 2014 CFR
2014-07-01
... accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or... engineering assessment, as described in paragraph (c)(2) of this section. (1) The owner or operator may choose... sample run. (2) The owner or operator may use engineering assessment to demonstrate compliance with the...
ERIC Educational Resources Information Center
Bloom, Robert; And Others
A study of the processes for establishing the principles and policies of measurement and disclosure in preparing financial reports examines differences in these processes in the United States, Canada, and England. Information was drawn from international accounting literature on standard setting. The differences and similarities in the…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Isselhardt, B. H.; Prussin, S. G.; Savina, M. R.
2016-01-01
Resonance Ionization Mass Spectrometry (RIMS) has been developed as a method to measure uranium isotope abundances. In this approach, RIMS is used as an element-selective ionization process between uranium atoms and potential isobars without the aid of chemical purification and separation. The use of broad bandwidth lasers with automated feedback control of wavelength was applied to the measurement of the U-235/U-238 ratio to decrease laser-induced isotopic fractionation. In application, isotope standards are used to identify and correct bias in measured isotope ratios, but understanding laser-induced bias from first principles can improve the precision and accuracy of experimental measurements. A rate equation model for predicting the relative ionization probability has been developed to study the effect of variations in laser parameters on the measured isotope ratio. The model uses atomic data and empirical descriptions of laser performance to estimate the laser-induced bias expected in experimental measurements of the U-235/U-238 ratio. Empirical corrections are also included to account for ionization processes that are difficult to calculate from first principles with the available atomic data. Development of this model has highlighted several important considerations for properly interpreting experimental results.
Isselhardt, B. H.; Prussin, S. G.; Savina, M. R.; ...
2015-12-07
Resonance Ionization Mass Spectrometry (RIMS) has been developed as a method to measure uranium isotope abundances. In this approach, RIMS is used as an element-selective ionization process between uranium atoms and potential isobars without the aid of chemical purification and separation. The use of broad bandwidth lasers with automated feedback control of wavelength was applied to the measurement of the 235U/238U ratio to decrease laser-induced isotopic fractionation. In application, isotope standards are used to identify and correct bias in measured isotope ratios, but understanding laser-induced bias from first principles can improve the precision and accuracy of experimental measurements. A rate equation model for predicting the relative ionization probability has been developed to study the effect of variations in laser parameters on the measured isotope ratio. The model uses atomic data and empirical descriptions of laser performance to estimate the laser-induced bias expected in experimental measurements of the 235U/238U ratio. Empirical corrections are also included to account for ionization processes that are difficult to calculate from first principles with the available atomic data. As a result, development of this model has highlighted several important considerations for properly interpreting experimental results.
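A rate-equation model of this kind propagates level populations under laser excitation. A minimal ground-excited-ion sketch with assumed rate coefficients, not the atomic data or level scheme used by the authors:

```python
import numpy as np
from scipy.integrate import solve_ivp

def rates(t, y, pump, ionize, decay):
    # y = [ground, excited, ion]; pump couples ground <-> excited
    # (absorption and stimulated emission), decay returns excited population
    # to ground, ionize removes excited population to the ion continuum.
    g, e, ion = y
    return [-pump * g + (pump + decay) * e,
            pump * g - (pump + decay + ionize) * e,
            ionize * e]

# Assumed rate coefficients (s^-1) over a ~10 ns laser pulse.
sol = solve_ivp(rates, (0.0, 10e-9), [1.0, 0.0, 0.0],
                args=(5e8, 2e8, 1e7), t_eval=[10e-9])
print("ionization probability:", sol.y[2, -1])
```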
Calestani, D; Culiolo, M; Villani, M; Delmonte, D; Solzi, M; Kim, Tae-Yun; Kim, Sang-Woo; Marchini, L; Zappettini, A
2018-08-17
The physical and operating principle of a stress sensor, based on two crossing carbon fibers functionalized with ZnO nanorod-shaped nanostructures, was recently demonstrated. The functionalization process has here been extended to tows made of one thousand fibers, like those commonly used in industrial processing, to prove the idea that the same working principle can be exploited in the creation of smart sensing carbon fiber composites. A stress-sensing device made of two functionalized tows, fixed with epoxy resin and crossing as in a typical carbon fiber texture, was successfully tested. The piezoelectric properties of single nanorods, as well as those of the test device, were measured and discussed.
40 CFR 63.526 - Monitoring requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... vent. (D) Design analysis based on accepted chemical engineering principles, measurable process.... (i) For the purpose of determining de minimis status for emission points, engineering assessment may... operating conditions expected to yield the highest flow rate and concentration. Engineering assessment...
40 CFR 63.526 - Monitoring requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... vent. (D) Design analysis based on accepted chemical engineering principles, measurable process.... (i) For the purpose of determining de minimis status for emission points, engineering assessment may... operating conditions expected to yield the highest flow rate and concentration. Engineering assessment...
40 CFR 63.526 - Monitoring requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... vent. (D) Design analysis based on accepted chemical engineering principles, measurable process.... (i) For the purpose of determining de minimis status for emission points, engineering assessment may... operating conditions expected to yield the highest flow rate and concentration. Engineering assessment...
Consolidated principles for screening based on a systematic review and consensus process.
Dobrow, Mark J; Hagens, Victoria; Chafe, Roger; Sullivan, Terrence; Rabeneck, Linda
2018-04-09
In 1968, Wilson and Jungner published 10 principles of screening that often represent the de facto starting point for screening decisions today; 50 years on, are these principles still the right ones? Our objectives were to review published work that presents principles for population-based screening decisions since Wilson and Jungner's seminal publication, and to conduct a Delphi consensus process to assess the review results. We conducted a systematic review and modified Delphi consensus process. We searched multiple databases for articles published in English in 1968 or later that were intended to guide population-based screening decisions, described development and modification of principles, and presented principles as a set or list. Identified sets were compared for basic characteristics (e.g., number, categorization), a citation analysis was conducted, and principles were iteratively synthesized and consolidated into categories to assess evolution. Participants in the consensus process assessed the level of agreement with the importance and interpretability of the consolidated screening principles. We identified 41 sets and 367 unique principles. Each unique principle was coded to 12 consolidated decision principles that were further categorized as disease/condition, test/intervention or program/system principles. Program or system issues were the focus of 3 of Wilson and Jungner's 10 principles, but comprised almost half of all unique principles identified in the review. The 12 consolidated principles were assessed through 2 rounds of the consensus process, leading to specific refinements to improve their relevance and interpretability. No gaps or missing principles were identified. Wilson and Jungner's principles are remarkably enduring, but increasingly reflect a truncated version of contemporary thinking on screening that does not fully capture subsequent focus on program or system principles. Ultimately, this review and consensus process provides a comprehensive and iterative modernization of guidance to inform population-based screening decisions. © 2018 Joule Inc. or its licensors.
Consolidated principles for screening based on a systematic review and consensus process
Hagens, Victoria; Chafe, Roger; Sullivan, Terrence; Rabeneck, Linda
2018-01-01
BACKGROUND: In 1968, Wilson and Jungner published 10 principles of screening that often represent the de facto starting point for screening decisions today; 50 years on, are these principles still the right ones? Our objectives were to review published work that presents principles for population-based screening decisions since Wilson and Jungner’s seminal publication, and to conduct a Delphi consensus process to assess the review results. METHODS: We conducted a systematic review and modified Delphi consensus process. We searched multiple databases for articles published in English in 1968 or later that were intended to guide population-based screening decisions, described development and modification of principles, and presented principles as a set or list. Identified sets were compared for basic characteristics (e.g., number, categorization), a citation analysis was conducted, and principles were iteratively synthesized and consolidated into categories to assess evolution. Participants in the consensus process assessed the level of agreement with the importance and interpretability of the consolidated screening principles. RESULTS: We identified 41 sets and 367 unique principles. Each unique principle was coded to 12 consolidated decision principles that were further categorized as disease/condition, test/intervention or program/system principles. Program or system issues were the focus of 3 of Wilson and Jungner’s 10 principles, but comprised almost half of all unique principles identified in the review. The 12 consolidated principles were assessed through 2 rounds of the consensus process, leading to specific refinements to improve their relevance and interpretability. No gaps or missing principles were identified. INTERPRETATION: Wilson and Jungner’s principles are remarkably enduring, but increasingly reflect a truncated version of contemporary thinking on screening that does not fully capture subsequent focus on program or system principles. Ultimately, this review and consensus process provides a comprehensive and iterative modernization of guidance to inform population-based screening decisions. PMID:29632037
Precision optical device of freeform defects inspection
NASA Astrophysics Data System (ADS)
Meguellati, S.
2015-09-01
The optical scanning method presented in this paper is used for precision measurement of shape deformation, or of absolute form in comparison with a reference component, of optical or mechanical components, over surface areas on the order of a few mm2 and larger. The principle of the method is to project the image of a source grating to palpate optically the surface to be inspected; after reflection, the image of the source grating carries the imprint of the object topography and is then projected onto the plane of a reference grating to generate moiré fringes for defect detection. The optical device used allows a significant dimensional magnification of the surface, up to 1000 times the inspected area for micro-surfaces, which allows easy processing and reaches an exceptional nanometric measurement uncertainty. According to the measurement principle, the sensitivity of displacement measurement using the moiré technique depends on the grating frequency, which can be increased to raise the detection resolution. This measurement technique can advantageously be used to measure the deformations generated by the production process or by stresses on functional parts, and the influence of these variations on function. The optical device, and the optical principle on which it is based, can be used for automated inspection of industrially produced goods. It can also be used for dimensional control when, for example, quantifying error to decide whether a part is acceptable or scrap. It then suffices to compare a moiré fringe figure with another previously recorded from a part considered as the standard, which saves time and money without sacrificing accuracy. The technique has found various applications in diverse fields, from biomedical to industrial and scientific applications.
NASA Astrophysics Data System (ADS)
Liu, Wei; Li, Ying-jun; Jia, Zhen-yuan; Zhang, Jun; Qian, Min
2011-01-01
In the working process of huge heavy-load manipulators, such as free forging machines, hydraulic die-forging presses, forging manipulators, heavy grasping manipulators and large-displacement manipulators, the measurement of six-dimensional heavy force/torque and real-time force feedback at the operation interface are the basis for realizing coordinated operation control and force compliance control. They are also an effective way to raise control accuracy and achieve highly efficient manufacturing. To solve the dynamic measurement problem for six-dimensional time-varying heavy loads in extreme manufacturing processes, a novel principle of parallel load sharing for six-dimensional heavy force/torque is put forward. The measuring principle of the six-dimensional force sensor is analyzed, and its spatial model is built and decoupled. The load-sharing ratios in the vertical and horizontal directions are analyzed and calculated. The mapping relationship between the six-dimensional heavy force/torque to be measured and the output force values is established. The finite element model of the parallel piezoelectric six-dimensional heavy force/torque sensor is set up, and its static characteristics are analyzed with the ANSYS software. The main parameters affecting the load-sharing ratio are analyzed. Load-sharing experiments with different diameters of the parallel axis were designed. The results show that the six-dimensional heavy force/torque sensor has good linearity, with non-linearity errors of less than 1%. The parallel axis provides a good load-sharing effect: the larger the diameter, the better the effect. The experimental results are in accordance with the FEM analysis. The sensor has the advantages of a large measuring range, good linearity, high natural frequency and high rigidity. It can be widely used in extreme environments for real-time, accurate measurement of six-dimensional time-varying huge loads on manipulators.
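Decoupling such a parallel sensor amounts to inverting the calibration mapping from the six-dimensional wrench to the individual transducer outputs. A least-squares sketch with an assumed eight-channel calibration matrix, not the sensor's actual calibration:

```python
import numpy as np

rng = np.random.default_rng(0)
C = rng.normal(size=(8, 6))  # assumed calibration matrix: outputs = C @ wrench

def decouple(outputs, C):
    # Recover the six-dimensional force/torque vector from redundant
    # channel outputs by linear least squares.
    wrench, *_ = np.linalg.lstsq(C, outputs, rcond=None)
    return wrench

true_wrench = np.array([1e5, -2e4, 5e5, 3e3, -1e3, 2e3])    # N and N*m, assumed
outputs = C @ true_wrench + rng.normal(scale=10.0, size=8)  # noisy channels
print(decouple(outputs, C))  # close to true_wrench
```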
Meanings, Mechanisms, and Measures of Holistic Processing
Richler, Jennifer J.; Palmeri, Thomas J.; Gauthier, Isabel
2012-01-01
Few concepts are more central to the study of face recognition than holistic processing. Progress toward understanding holistic processing is challenging because the term “holistic” has many meanings, with different researchers addressing different mechanisms and favoring different measures. While in principle the use of different measures should provide converging evidence for a common theoretical construct, convergence has been slow to emerge. We explore why this is the case. One challenge is that “holistic processing” is often used to describe both a theoretical construct and a measured effect, which may not have a one-to-one mapping. Progress requires more than greater precision in terminology regarding different measures of holistic processing or different hypothesized mechanisms of holistic processing. Researchers also need to be explicit about what meaning of holistic processing they are investigating so that it is clear whether different researchers are describing the same phenomenon or not. Face recognition differs from object recognition, and not all meanings of holistic processing are equally suited to help us understand that important difference. PMID:23248611
NASA Astrophysics Data System (ADS)
Nath, Sunil
2018-05-01
Metabolic energy obtained from the coupled chemical reactions of oxidative phosphorylation (OX PHOS) is harnessed in the form of ATP by cells. We experimentally measured thermodynamic forces and fluxes during ATP synthesis, and calculated the thermodynamic efficiency, η and the rate of free energy dissipation, Φ. We show that the OX PHOS system is tuned such that the coupled nonequilibrium processes operate at optimal η. This state does not coincide with the state of minimum Φ but is compatible with maximum Φ under the imposed constraints. Conditions that must hold for species concentration in order to satisfy the principle of optimal efficiency are derived analytically and a molecular explanation based on Nath's torsional mechanism of energy transduction and ATP synthesis is suggested. Differences of the proposed principle with Prigogine's principle are discussed.
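In the linear nonequilibrium-thermodynamics notation commonly used for such coupled processes, with the input (oxidation) flow J_o driven by force X_o and the output (phosphorylation) flow J_p working against force X_p, the two quantities measured in the study are conventionally defined as follows (textbook convention, not necessarily the paper's exact notation):

\eta = \frac{-J_p X_p}{J_o X_o}, \qquad \Phi = J_o X_o + J_p X_p \ge 0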
Nano-sized graphene flakes: insights from experimental synthesis and first principles calculations.
Lin, Pin-Chun; Chen, Yi-Rui; Hsu, Kuei-Ting; Lin, Tzu-Neng; Tung, Kuo-Lun; Shen, Ji-Lin; Liu, Wei-Ren
2017-03-01
In this study, we proposed a cost-effective method for preparing graphene nano-flakes (GNFs) derived from carbon nanotubes (CNTs) via three steps (pressing, homogenization and sonication exfoliation processes). Scanning electron microscopy (SEM), transmission electron microscopy (TEM), atomic force microscopy (AFM), laser scattering, as well as ultraviolet-visible and photoluminescence (PL) measurements were carried out. The results indicated that the size of as-synthesized GNFs was approximately 40-50 nm. Furthermore, we also used first principles calculations to understand the transformation from CNTs to GNFs from the viewpoints of the edge formation energies of GNFs in different shapes and sizes. The corresponding photoluminescence measurements of GNFs were carried out in this work.
Martinez, Johanna; Phillips, Erica; Harris, Christina
2014-01-01
For many educators it has been challenging to meet the Accreditation Council for Graduate Medical Education's requirements for teaching systems-based practice (SBP). An additional layer of complexity for educators is evaluating competency in SBP, despite milestones and entrustable professional activities (EPAs). In order to address this challenge, the authors present the results of a literature review for how SBP is currently being taught and a series of recommendations on how to achieve competency in SBP for graduate medical trainees with the use of milestones. The literature review included 29 articles and demonstrated that only 28% of the articles taught more than one of the six core principles of SBP in a meaningful way. Only 7% of the articles received the highest grade of A. The authors summarize four guiding principles for creating a competency-based curriculum that is in alignment with the Next Accreditation System (NAS): 1) the curriculum needs to include all of the core principles in that competency, 2) the objectives of the curriculum should be driven by clinical outcomes, 3) the teaching modalities need to be interactive and clinically relevant, and 4) the evaluation process should be able to measure competency and be directly reflective of pertinent milestones and/or EPAs. This literature review and the provided guiding principles can guide other residency educators in their development of competency-based curricula that meets the standards of the NAS.
40 CFR 63.526 - Monitoring requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
.... (D) Design analysis based on accepted chemical engineering principles, measurable process parameters... purpose of determining de minimis status for emission points, engineering assessment may be used to... expected to yield the highest flow rate and concentration. Engineering assessment includes, but is not...
40 CFR 63.526 - Monitoring requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
.... (D) Design analysis based on accepted chemical engineering principles, measurable process parameters... purpose of determining de minimis status for emission points, engineering assessment may be used to... expected to yield the highest flow rate and concentration. Engineering assessment includes, but is not...
Young, John Q; Wachter, Robert M
2009-09-01
Health care organizations have increasingly embraced industrial methods, such as the Toyota Production System (TPS), to improve quality, safety, timeliness, and efficiency. However, the use of such methods in psychiatric hospitals has been limited. A psychiatric hospital applied TPS principles to patient transfers to the outpatient medication management clinics (MMCs) from all other inpatient and outpatient services within the hospital's system. Sources of error and delay were identified, and a new process was designed to improve timely access (measured by elapsed time from request for transfer to scheduling of an appointment and to the actual visit) and patient safety by decreasing communication errors (measured by number of failed transfers). Complexity was substantially reduced, with one streamlined pathway replacing five distinct and more complicated pathways. To assess sustainability, the postintervention period was divided into Period 1 (first 12 months) and Period 2 (next 24 months). Time required to process the transfer and schedule the first appointment was reduced by 74.1% in Period 1 (p < .001) and by an additional 52.7% in Period 2 (p < .0001) for an overall reduction of 87% (p < .0001). Similarly, time to the actual appointment was reduced 31.2% in Period 1 (p < .0001), but was stable in Period 2 (p = .48). The number of transfers per month successfully processed and scheduled increased 95% in the postintervention period compared with the pre-implementation period (p = .015). Finally, data for failed transfers were only available for the postintervention period, and the rate decreased 89% in Period 2 compared with Period 1 (p = .017). The application of TPS principles enhanced access and safety through marked and sustained improvements in the transfer process's timeliness and reliability. Almost all transfer processes have now been standardized.
[Welding arc temperature field measurements based on Boltzmann spectrometry].
Si, Hong; Hua, Xue-Ming; Zhang, Wang; Li, Fang; Xiao, Xiao
2012-09-01
Arc plasma is a non-uniform plasma with complicated internal energy and mass transport processes, so measurement of the plasma temperature is of great significance. Compared with the absolute spectral line intensity method and the standard temperature method, the Boltzmann plot method is more accurate and convenient. Based on Boltzmann theory, the present paper calculates the temperature distribution of the plasma and analyzes the principles of spectral line selection, using real-time scanning of the space of the TIG arc for the measurements.
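The Boltzmann plot method infers the excitation temperature from the slope of ln(I*lambda/(g*A)) against upper-level energy over several spectral lines. A minimal sketch in Python, with purely illustrative line data rather than the paper's TIG-arc measurements:

import numpy as np

k_B = 8.617e-5  # Boltzmann constant, eV/K

# Per line: intensity I (arb.), wavelength lam (m), statistical weight g,
# transition probability A (1/s), upper-level energy E (eV). Illustrative values.
I   = np.array([1000.0, 560.0, 313.0, 175.0])
lam = np.array([690e-9, 700e-9, 710e-9, 720e-9])
g   = np.array([5.0, 5.0, 5.0, 5.0])
A   = np.array([1.0e7, 1.0e7, 1.0e7, 1.0e7])
E   = np.array([13.0, 13.5, 14.0, 14.5])

# Boltzmann plot: ln(I*lam/(g*A)) = -E/(k_B*T) + const, so T = -1/(k_B*slope).
y = np.log(I * lam / (g * A))
slope, _ = np.polyfit(E, y, 1)
T = -1.0 / (k_B * slope)
print(f"excitation temperature ~ {T:.0f} K")  # roughly 1e4 K for these numbers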
A guide to Ussing chamber studies of mouse intestine
Clarke, Lane L.
2009-01-01
The Ussing chamber provides a physiological system to measure the transport of ions, nutrients, and drugs across various epithelial tissues. One of the most studied epithelia is the intestine, which has provided several landmark discoveries regarding the mechanisms of ion transport processes. Adaptation of this method to mouse intestine adds the dimension of investigating genetic loss or gain of function as a means to identify proteins or processes affecting transepithelial transport. In this review, the principles underlying the use of Ussing chambers are outlined including limitations and advantages of the technique. With an emphasis on mouse intestinal preparations, the review covers chamber design, commercial equipment sources, tissue preparation, step-by-step instruction for operation, troubleshooting, and examples of interpretation difficulties. Specialized uses of the Ussing chamber such as the pH stat technique to measure transepithelial bicarbonate secretion and isotopic flux methods to measure net secretion or absorption of substrates are discussed in detail, and examples are given for the adaptation of Ussing chamber principles to other measurement systems. The purpose of the review is to provide a practical guide for investigators who are new to the Ussing chamber method. PMID:19342508
Lagos, Maureen J; Batson, Philip E
2018-06-13
We measure phonon energy gain and loss down to 20 meV in a single nanostructure using an atom-wide monochromatic electron beam. We show that the bulk and surface, energy loss and energy gain processes obey the principle of detailed balancing in nanostructured systems at thermal equilibrium. By plotting the logarithm of the ratio of the loss and gain bulk/surface scattering as a function of the excitation energy, we find a linear behavior, expected from detailed balance arguments. Since that universal linearity scales with the inverse of the nanosystem temperature only, we can measure the temperature of the probed object with precision down to about 1 K without reference to the nanomaterial. We also show that subnanometer spatial resolution (down to ∼2 Å) can be obtained using highly localized acoustic phonon scattering. The surface phonon polariton signal can also be used to measure the temperature near the nanostructure surfaces, but with unavoidable averaging over several nanometers. Comparison between transmission and aloof probe configurations suggests that our method exhibits noninvasive characteristics. Our work demonstrates the validity of the principle of detailed balancing within nanoscale materials at thermal equilibrium, and it describes a transparent method to measure nanoscale temperature, thus representing an advance in the development of a noninvasive method for measurements with angstrom resolution.
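The linear behavior reported above follows from the detailed-balance ratio I_loss/I_gain = exp(E/(k_B*T)), so the local temperature drops out of a straight-line fit of the log-ratio versus phonon energy. A minimal sketch with synthetic data (not the authors' EELS measurements):

import numpy as np

k_B = 8.617e-5   # Boltzmann constant, eV/K
T_true = 300.0   # K, used only to synthesize the data below

E = np.linspace(0.005, 0.020, 8)       # phonon energies, eV (5-20 meV)
ratio = np.exp(E / (k_B * T_true))     # ideal loss/gain intensity ratio

# ln(I_loss/I_gain) = E/(k_B*T): the slope of the fit gives the temperature.
slope = np.polyfit(E, np.log(ratio), 1)[0]
T_est = 1.0 / (k_B * slope)
print(f"estimated temperature ~ {T_est:.1f} K")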
Phonon-Assisted Optical Absorption in Silicon from First Principles
NASA Astrophysics Data System (ADS)
Noffsinger, Jesse; Kioupakis, Emmanouil; Van de Walle, Chris G.; Louie, Steven G.; Cohen, Marvin L.
2012-04-01
The phonon-assisted interband optical absorption spectrum of silicon is calculated at the quasiparticle level entirely from first principles. We make use of the Wannier interpolation formalism to determine the quasiparticle energies, as well as the optical transition and electron-phonon coupling matrix elements, on fine grids in the Brillouin zone. The calculated spectrum near the onset of indirect absorption is in very good agreement with experimental measurements for a range of temperatures. Moreover, our method can accurately determine the optical absorption spectrum of silicon in the visible range, an important process for optoelectronic and photovoltaic applications that cannot be addressed with simple models. The computational formalism is quite general and can be used to understand the phonon-assisted absorption processes in general.
Labarta, T
2007-01-01
Operational radiation protection of workers during the dismantling of nuclear facilities is based on the same radiation protection principles as that applied in its exploitation period with the objective of ensuring proper implementation of the as-low-as-reasonably-achievable (ALARA) principle. These principles are: prior determination of the nature and magnitude of radiological risk; classification of workplaces and workers depending on the risks; implementation of control measures; monitoring of zones and working conditions, including, if necessary, individual monitoring. From the experiences and the lessons learned during the dismantling processes carried out in Spain, several important aspects in the practical implementation of these principles that directly influence and ensure an adequate prevention of exposures and the estimation of internal doses are pointed out, with special emphasis on the estimation of internal doses due to transuranic intakes.
Viscosity Measurement Technique for Metal Fuels
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ban, Heng; Kennedy, Rory
2015-02-09
Metallic fuels have exceptional transient behavior, excellent thermal conductivity, and a more straightforward reprocessing path, which does not separate out pure plutonium from the process stream. Fabrication of fuel containing minor actinides and rare earth (RE) elements for irradiation tests, for instance U-20Pu-3Am-2Np-1.0RE-15Zr samples at the Idaho National Laboratory, is generally done by melt casting in an inert atmosphere. For the design of a casting system and further scale-up development, computational modeling of the casting process is needed to provide information on melt flow and solidification for process optimization. There is therefore a need for melt viscosity data, viscosity being the most important melt property controlling the melt flow. The goal of the project was to develop a measurement technique that uses a fully sealed melt sample, with no americium vapor loss, to determine the viscosity of metallic melts at temperatures relevant to the casting process. The specific objectives of the project were to develop mathematical models to establish the principle of the measurement method; to design and build a viscosity measurement prototype system based on the established principle; and to calibrate the system and quantify the uncertainty range. The result of the project indicates that the oscillating-cup technique is applicable to melt viscosity measurement. Detailed mathematical models of innovative sample ampoule designs were developed to determine not only melt viscosity but also, under certain designs, melt density. Measurement uncertainties were analyzed and quantified. The result of this project can be used as the initial step toward the eventual goal of establishing a viscosity measurement system for radioactive melts.
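The oscillating-cup principle mentioned above extracts viscosity from the damping of a torsional oscillation of the sealed ampoule. A minimal sketch of the first processing step, estimating the logarithmic decrement from a (synthetic) decay record; the link from decrement and period to viscosity then goes through a working equation such as Roscoe's, which depends on ampoule geometry and melt density:

import numpy as np

# Synthetic angular-displacement record of a damped torsional oscillation.
t = np.linspace(0.0, 60.0, 6000)
delta_true, period = 0.08, 2.5   # logarithmic decrement and period (s), illustrative
theta = np.exp(-delta_true * t / period) * np.cos(2 * np.pi * t / period)

# Successive positive peaks of the record give the decrement directly.
peaks = [i for i in range(1, len(theta) - 1)
         if theta[i] > theta[i - 1] and theta[i] > theta[i + 1]]
amps = theta[peaks]
delta_est = np.mean(np.log(amps[:-1] / amps[1:]))
print(f"logarithmic decrement ~ {delta_est:.3f}")   # ~0.08 for this record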
NASA Astrophysics Data System (ADS)
Griffiths, Robert B.
2001-11-01
Quantum mechanics is one of the most fundamental yet difficult subjects in physics. Nonrelativistic quantum theory is presented here in a clear and systematic fashion, integrating Born's probabilistic interpretation with Schrödinger dynamics. Basic quantum principles are illustrated with simple examples requiring no mathematics beyond linear algebra and elementary probability theory. The quantum measurement process is consistently analyzed using fundamental quantum principles without referring to measurement. These same principles are used to resolve several of the paradoxes that have long perplexed physicists, including the double slit and Schrödinger's cat. The consistent histories formalism used here was first introduced by the author, and extended by M. Gell-Mann, J. Hartle and R. Omnès. Essential for researchers yet accessible to advanced undergraduate students in physics, chemistry, mathematics, and computer science, this book is supplementary to standard textbooks. It will also be of interest to physicists and philosophers working on the foundations of quantum mechanics. A comprehensive account written by one of the main figures in the field; this is a paperback edition of a successful work on the philosophy of quantum mechanics.
Applying activity-based costing to healthcare settings.
Canby, J B
1995-02-01
Activity-based costing (ABC) focuses on processes that drive cost. By tracing healthcare activities back to events that generate cost, a more accurate measurement of financial performance is possible. This article uses ABC principles and techniques to determine costs associated with the x-ray process in a midsized outpatient clinic. The article also provides several tips for initiating an ABC cost system for an entire healthcare organization.
Rail inspection system based on iGPS
NASA Astrophysics Data System (ADS)
Fu, Xiaoyan; Wang, Mulan; Wen, Xiuping
2018-05-01
Track parameters include gauge, superelevation, cross level and so on, which can be calculated from the three-dimensional coordinates of the track. The rail inspection system based on iGPS (indoor/infrared GPS) is composed of base stations, receivers, a rail inspection frame, a wireless communication unit, a display and control unit, and a data processing unit. With the continuous movement of the inspection frame, the system can accurately measure the coordinates of the rail, realizing intelligent detection and precision measurement. The inspection model was constructed according to the principle of angle intersection measurement, and the detection process is given.
40 CFR 65.64 - Group determination procedures.
Code of Federal Regulations, 2011 CFR
2011-07-01
... accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or... stream volumetric flow shall be corrected to 2.3 percent moisture; or (2) The engineering assessment... section or by using the engineering assessment procedures in paragraph (i) of this section. (1) The net...
40 CFR 65.64 - Group determination procedures.
Code of Federal Regulations, 2013 CFR
2013-07-01
... accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or... stream volumetric flow shall be corrected to 2.3 percent moisture; or (2) The engineering assessment... section or by using the engineering assessment procedures in paragraph (i) of this section. (1) The net...
40 CFR 65.64 - Group determination procedures.
Code of Federal Regulations, 2014 CFR
2014-07-01
... accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or... stream volumetric flow shall be corrected to 2.3 percent moisture; or (2) The engineering assessment... section or by using the engineering assessment procedures in paragraph (i) of this section. (1) The net...
40 CFR 65.64 - Group determination procedures.
Code of Federal Regulations, 2012 CFR
2012-07-01
... accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or... stream volumetric flow shall be corrected to 2.3 percent moisture; or (2) The engineering assessment... section or by using the engineering assessment procedures in paragraph (i) of this section. (1) The net...
Six Key Principles for Music Assessment
ERIC Educational Resources Information Center
Hale, Connie L.; Green, Susan K.
2009-01-01
Evaluating students' performance and measuring growth are ongoing foundational activities in the educational process. This article evolved from conversations between the authors about essential information that preservice teachers need to be able to assess their students fairly and effectively. Although the authors' expertise is in different…
Applying quantum principles to psychology
NASA Astrophysics Data System (ADS)
Busemeyer, Jerome R.; Wang, Zheng; Khrennikov, Andrei; Basieva, Irina
2014-12-01
This article starts out with a detailed example illustrating the utility of applying quantum probability to psychology. Then it describes several alternative mathematical methods for mapping fundamental quantum concepts (such as state preparation, measurement, state evolution) to fundamental psychological concepts (such as stimulus, response, information processing). For state preparation, we consider both pure states and densities with mixtures. For measurement, we consider projective measurements and positive operator valued measurements. The advantages and disadvantages of each method with respect to applications in psychology are discussed.
Acausal measurement-based quantum computing
NASA Astrophysics Data System (ADS)
Morimae, Tomoyuki
2014-07-01
In measurement-based quantum computing, there is a natural "causal cone" among qubits of the resource state, since the measurement angle on a qubit has to depend on previous measurement results in order to correct the effect of by-product operators. If we respect the no-signaling principle, by-product operators cannot be avoided. Here we study the possibility of acausal measurement-based quantum computing by using the process matrix framework [Oreshkov, Costa, and Brukner, Nat. Commun. 3, 1092 (2012), 10.1038/ncomms2076]. We construct a resource process matrix for acausal measurement-based quantum computing restricting local operations to projective measurements. The resource process matrix is an analog of the resource state of the standard causal measurement-based quantum computing. We find that if we restrict local operations to projective measurements the resource process matrix is (up to a normalization factor and trivial ancilla qubits) equivalent to the decorated graph state created from the graph state of the corresponding causal measurement-based quantum computing. We also show that it is possible to consider a causal game whose causal inequality is violated by acausal measurement-based quantum computing.
In-process and post-process measurements of drill wear for control of the drilling process
NASA Astrophysics Data System (ADS)
Liu, Tien-I.; Liu, George; Gao, Zhiyu
2011-12-01
Optical inspection was used in this research for the post-process measurement of drill wear, using a precision toolmaker's microscope. An indirect index, the cutting force, is used for in-process drill wear measurement. Using in-process measurements to estimate drill wear for control purposes can decrease operating cost and enhance product quality and safety. The challenge is to correlate the in-process cutting-force measurements with the post-process optical inspection of drill wear. To find the most important feature, the energy principle was used in this research: only the cutting-force feature showing the highest sensitivity to drill wear should be selected. The best feature selected is the peak torque in the drilling process. Neuro-fuzzy systems were used for correlation purposes. The Adaptive-Network-Based Fuzzy Inference System (ANFIS) can construct fuzzy rules with membership functions to generate an input-output mapping. A 1x6 ANFIS architecture with product-of-sigmoid membership functions can measure drill wear in-process with an error as low as 0.15%, which is extremely important for control of the drilling process. Furthermore, the measurement of drill wear was performed under different drilling conditions, showing that ANFIS has the capability of generalization.
Design of an intelligent instrument for large direct-current measurement
NASA Astrophysics Data System (ADS)
Zhang, Rong; Zhang, Gang; Zhang, Zhipeng
2000-05-01
The principle and structure of an intelligent instrument for large direct-current measurement are presented in this paper. The instrument is of the reflective type and detects the signal with a high-direct-current sensor. The single-chip microcomputer of the system provides powerful control and processing functions and greatly improves the degree of intelligence. The measured value can be displayed and printed automatically or manually.
Spectral structure of electron antineutrinos from nuclear reactors.
Dwyer, D A; Langford, T J
2015-01-09
Recent measurements of the positron energy spectrum obtained from inverse beta decay interactions of reactor electron antineutrinos show an excess in the 4 to 6 MeV region relative to current predictions. First-principles calculations of fission and beta decay processes within a typical pressurized water reactor core identify prominent fission daughter isotopes as a possible origin for this excess. These calculations also predict percent-level substructures in the antineutrino spectrum due to Coulomb effects in beta decay. Precise measurement of these substructures can elucidate the nuclear processes occurring within reactors. These substructures can be a systematic issue for measurements utilizing the detailed spectral shape.
Metrology in physics, chemistry, and biology: differing perceptions.
Iyengar, Venkatesh
2007-04-01
The association of physics and chemistry with metrology (the science of measurement) is well documented. For practical purposes, basic metrological measurements in physics are governed by two components, namely the measure (i.e., the unit of measurement) and the measurand (i.e., the entity measured), which together account for the integrity of a measurement process. In simple words, in the case of measuring the length of a room (the measurand), the SI unit meter (the measure) provides a direct answer sustained by metrological concepts. Metrology in chemistry, as observed through physical chemistry (measures used to express molar relationships, volume, pressure, temperature, and surface tension, among others), follows the same principles of metrology as in physics. The same basis percolates to classical analytical chemistry (gravimetry for preparing high-purity standards, related definitive analytical techniques, among others). However, a certain transition takes place in extending the metrological principles to chemical measurements in complex chemical matrices (e.g., food samples), as a third component is added, namely indirect measurements (e.g., AAS determination of Zn in foods). This practice is frequently used in field assays and calls for additional steps to establish the traceability of such chemical measurements and safeguard their reliability. Hence the assessment that metrology in chemistry is still evolving.
Smart spectroscopy sensors: II. Narrow-band laser systems
NASA Astrophysics Data System (ADS)
Matharoo, Inderdeep; Peshko, Igor
2013-03-01
This paper describes the principles of operation of a miniature multifunctional optical sensory system based on laser technology and spectroscopic principles of analysis. The operation of the system as a remote oxygen sensor has been demonstrated. The multi-component alarm sensor has been designed to recognise gases and to measure gas concentration (O2, CO2, CO, CH4, N2O, C2H2, HI, OH radicals and H2O vapour, including semi-heavy water), temperature, pressure, humidity, and background radiation from the environment. Besides gas sensing, the same diode lasers are used for range-finding and to provide sensor self-calibration. The complete system operates as an inhomogeneous sensory network: the laser sensors are capable of using information received from environmental sensors for improving accuracy and reliability of gas concentration measurement. The sources of measurement errors associated with hardware and algorithms of operation and data processing have been analysed in detail.
Control volume based hydrocephalus research; analysis of human data
NASA Astrophysics Data System (ADS)
Cohen, Benjamin; Wei, Timothy; Voorhees, Abram; Madsen, Joseph; Anor, Tomer
2010-11-01
Hydrocephalus is a neuropathophysiological disorder primarily diagnosed by increased cerebrospinal fluid volume and pressure within the brain. To date, utilization of clinical measurements has been limited to understanding the relative amplitude and timing of flow, volume and pressure waveforms: qualitative approaches without a clear framework for meaningful quantitative comparison. Pressure-volume models and electric circuit analogs enforce volume conservation principles in terms of pressure. Control volume analysis, through the integral mass and momentum conservation equations, ensures that pressure and volume are accounted for using first-principles fluid physics. This approach can directly incorporate the diverse measurements obtained by clinicians into a simple, direct and robust mechanics-based framework. The clinical data obtained for analysis are discussed, along with the data processing techniques used to extract terms in the conservation equations. Control volume analysis provides a non-invasive, physics-based approach to extracting pressure information from magnetic resonance velocity data that cannot be measured directly by pressure instrumentation.
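The integral mass-conservation statement at the heart of the control-volume framework can be written explicitly; for a control volume CV bounded by surface CS, with fluid density rho and velocity field u taken, for example, from MR velocimetry (standard notation, not the authors'):

\frac{d}{dt}\int_{CV} \rho\, dV \;+\; \oint_{CS} \rho\,\mathbf{u}\cdot\mathbf{n}\, dA \;=\; 0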
Development of an electromechanical principle for wet and dry milling
NASA Astrophysics Data System (ADS)
Halbedel, Bernd; Kazak, Oleg
2018-05-01
The paper presents a novel electromechanical principle for wet and dry milling of different materials, in which the milling beads are moved under a time- and local-variable magnetic field. A possibility to optimize the milling process in such a milling machine by simulation of the vector gradient distribution of the electromagnetic field in the process room is presented. The mathematical model and simulation methods based on standard software packages are worked out. The results of numerical simulations and experimental measurements of the electromagnetic field in the working chamber of a developed and manufactured laboratory plant correlate well with each other. Using the obtained operating parameters, dry milling experiments with crushed cement clinker and wet milling experiments of organic agents in the laboratory plant are performed and the results are discussed here.
Scientific data processing for Hipparcos
NASA Astrophysics Data System (ADS)
van der Marel, H.
1988-04-01
The scientific aims of the ESA Hipparcos astrometric satellite are reviewed, and the fundamental principles and practical implementation of the data-analysis and data-reduction procedures are discussed in detail. Hipparcos is to determine the positions and proper motions of a catalog of 110,000 stars to a limit of 12 mag with an accuracy of a few milliarcseconds, and to obtain photometric observations of 400,000 stars (the Tycho mission). Consideration is given to the organization of the data-processing consortia FAST, NDAC, and TDAC; the basic problems of astrometry; the measurement principle; the large amounts of data to be generated during the 2.5-year mission; and the three-step iterative method to be applied (positional reconstruction and reduction to a reference great circle, spherical reconstruction, and extraction of the astrometric parameters). Diagrams and a flow chart are provided.
The Process of Suicide Risk Assessment: Twelve Core Principles
ERIC Educational Resources Information Center
Granello, Darcy Haag
2010-01-01
Suicide risk assessment requires counselors to determine client risk factors, warning signs, and protective factors. The content of suicide assessment has received attention in the literature. The guiding principles of the process of suicide assessment, however, have not yet been articulated. This article contains 12 core process principles that…
Lee, Robert H; Bott, Marjorie J; Forbes, Sarah; Redford, Linda; Swagerty, Daniel L; Taunton, Roma Lee
2003-01-01
Understanding how quality improvement affects costs is important. Unfortunately, low-cost, reliable ways of measuring direct costs are scarce. This article builds on the principles of process improvement to develop a costing strategy that meets both criteria. Process-based costing has 4 steps: developing a flowchart, estimating resource use, valuing resources, and calculating direct costs. To illustrate the technique, this article uses it to cost the care planning process in 3 long-term care facilities. We conclude that process-based costing is easy to implement; generates reliable, valid data; and allows nursing managers to assess the costs of new or modified processes.
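The four-step strategy maps naturally onto a small calculation: each flowchart step consumes resources, each resource has a unit cost, and the direct cost is the sum of the products. A minimal sketch with hypothetical care-planning numbers (the steps, times and rates below are placeholders, not the study's data):

# Steps 2-4 of process-based costing: resource use per process step,
# unit costs of the resources, and the resulting direct cost.
resource_use = {             # minutes of staff time per care-planning step
    "assessment":      {"RN": 45, "aide": 15},
    "care_conference": {"RN": 30, "social_worker": 30},
    "documentation":   {"RN": 20},
}
hourly_rate = {"RN": 38.0, "aide": 16.0, "social_worker": 29.0}  # $/hour

direct_cost = sum(minutes / 60.0 * hourly_rate[role]
                  for step in resource_use.values()
                  for role, minutes in step.items())
print(f"Direct cost per care plan: ${direct_cost:.2f}")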
1991-04-23
in this section. In our investigation of higher order processing methods for remote acoustic sensing we sought to understand the principles of laser...magnitude less than those presently detected in laboratory measurements. An initial study of several potential higher order processing techniques was...incoherent. The use of higher order processing methods to provide some level of discrimination against noise thus appears tractable. Finally, the effects
NASA Astrophysics Data System (ADS)
Zou, Yanbiao; Chen, Tao
2018-06-01
To address the problem of low welding precision caused by the poor real-time tracking performance of common welding robots, a novel seam tracking system with excellent real-time performance and high accuracy is designed, based on morphological image processing and the continuous convolution operator tracker (CCOT) object tracking algorithm. The system consists of a six-axis welding robot, a line laser sensor, and an industrial computer. This work also studies the measurement principle involved in the designed system. Through the CCOT algorithm, the weld feature points are determined in real time from the noisy image during the welding process, and the 3D coordinate values of these points are obtained according to the measurement principle to control the movement of the robot and the torch in real time. Experimental results show that the sensor operates at a frequency of 50 Hz, that tracking remains smooth under strong arc light and spatter interference, that the tracking error is within ±0.2 mm, and that the minimal distance between the laser stripe and the welding molten pool can reach 15 mm, which significantly fulfills actual welding requirements.
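For a line-laser sensor, recovering the 3D coordinates of a detected feature point is a standard laser-triangulation computation: intersect the camera ray through the feature pixel with the calibrated laser light plane. A sketch under assumed geometry (the camera intrinsics and the plane parameters below are illustrative placeholders, not the paper's calibration):

import numpy as np

# Assumed calibration: pinhole camera (focal lengths in pixels, principal point)
# and the line laser's light plane expressed as n . X = d in the camera frame.
fx, fy, cx, cy = 1200.0, 1200.0, 320.0, 240.0
n = np.array([0.0, -0.42, 0.91])   # laser-plane normal (illustrative)
d = 95.0                            # plane offset, mm (illustrative)

def pixel_to_3d(u, v):
    """Intersect the camera ray through pixel (u, v) with the laser plane."""
    ray = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])  # ray direction, camera frame
    t = d / np.dot(n, ray)                               # ray parameter at the plane
    return t * ray                                       # 3D point, mm, camera frame

print(pixel_to_3d(350.0, 260.0))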
40 CFR 63.1414 - Test methods and emission estimation equations.
Code of Federal Regulations, 2010 CFR
2010-07-01
... (D) Design analysis based on accepted chemical engineering principles, measurable process parameters.... Engineering assessment may be used to estimate organic HAP emissions from a batch emission episode only under... (d)(5) of this section; through engineering assessment, as defined in paragraph (d)(6)(ii) of this...
Autobiographical Reflections for Teacher Professional Learning
ERIC Educational Resources Information Center
Choi, Tat Heung
2013-01-01
This article is based on the principle that teacher development is a life-long process when seeking to develop professional competencies. With the changing views of teacher education as background, the benefits to teachers associated with practice-oriented knowledge are predicated on a measure of empowerment through narration, self-expression and…
Development of Measures of Success for Corporate Level Air Force Acquisition Initiatives
2006-04-30
initiative. Customer satisfaction is described as the extent to which a process or product meets a customer's expectations (Kotler and Armstrong ...ADA366787). Kotler, P. and G. Armstrong. Principles of Marketing (9th Edition). Upper Saddle River NJ: Prentice Hall, 2001. Lambert, D. and T
Development of Measures of Success for Corporate Level Air Force Acquisition Initiatives
2004-03-01
has failed. Customer satisfaction is described as the extent to which a process or product meets a customer's expectations (Kotler and Armstrong ...ADA366787). Kotler, P. and G. Armstrong. Principles of Marketing (9th Edition). Upper Saddle River NJ: Prentice Hall, 2001. Lambert, D. and T
40 CFR 63.1414 - Test methods and emission estimation equations.
Code of Federal Regulations, 2012 CFR
2012-07-01
... (D) Design analysis based on accepted chemical engineering principles, measurable process parameters.... Engineering assessment may be used to estimate organic HAP emissions from a batch emission episode only under... (d)(5) of this section; through engineering assessment, as defined in paragraph (d)(6)(ii) of this...
40 CFR 63.1414 - Test methods and emission estimation equations.
Code of Federal Regulations, 2011 CFR
2011-07-01
... (D) Design analysis based on accepted chemical engineering principles, measurable process parameters.... Engineering assessment may be used to estimate organic HAP emissions from a batch emission episode only under... (d)(5) of this section; through engineering assessment, as defined in paragraph (d)(6)(ii) of this...
40 CFR 63.1414 - Test methods and emission estimation equations.
Code of Federal Regulations, 2014 CFR
2014-07-01
... (D) Design analysis based on accepted chemical engineering principles, measurable process parameters... paragraph (d)(5) of this section. Engineering assessment may be used to estimate organic HAP emissions from... defined in paragraph (d)(5) of this section; through engineering assessment, as defined in paragraph (d)(6...
40 CFR 63.1414 - Test methods and emission estimation equations.
Code of Federal Regulations, 2013 CFR
2013-07-01
... (D) Design analysis based on accepted chemical engineering principles, measurable process parameters... paragraph (d)(5) of this section. Engineering assessment may be used to estimate organic HAP emissions from... defined in paragraph (d)(5) of this section; through engineering assessment, as defined in paragraph (d)(6...
Processing and Testing RE2Si2O7 Matrix Composites (Preprint)
2012-07-01
using the Archimedes method. 2.3. Indentation and Characterization. The hardnesses of the sintered pellets were measured by Vickers indentation at...J. Mechanical Properties and Atomistic Deformation Mechanism of γ-Y2Si2O7 from First-Principles Investigations. Acta Mater. 55, 6019-6026 (2007). 10
2012-06-01
this report. The property measurements that have been focused on were the assessment of density (Archimedes), grain structure (optical and SEM...Scintillator", Materials Letters 60 1960-1963 (2006). [15] J.S. Reed, Forming Processes, Chapter 20 in Introduction to the Principles of Ceramic
Total quality management in American industry.
Widtfeldt, A K; Widtfeldt, J R
1992-07-01
The definition of total quality management is conformance to customer requirements and specifications, fitness for use, buyer satisfaction, and value at an affordable price. The three individuals who have developed the total quality management concepts in the United States are W.E. Deming, J.M. Juran, and Philip Crosby. The universal principles of total quality management are (a) a customer focus, (b) management commitment, (c) training, (d) process capability and control, and (e) measurement through quality improvement tools. Results from the National Demonstration Project on Quality Improvement in Health Care showed the principles of total quality management could be applied to healthcare.
Research on motor rotational speed measurement in regenerative braking system of electric vehicle
NASA Astrophysics Data System (ADS)
Pan, Chaofeng; Chen, Liao; Chen, Long; Jiang, Haobin; Li, Zhongxing; Wang, Shaohua
2016-01-01
Rotational speed signal acquisition and processing techniques are widely used in rotating machinery. In order to realize precise, real-time control of the motor drive and the regenerative braking process, rotational speed measurement techniques are needed in electric vehicles. Obtaining an accurate motor rotational speed signal contributes to steady control of the regenerative braking force and a higher energy recovery rate. This paper aims to develop a method that provides instantaneous speed information in the form of motor rotation. It first addresses the principles of motor rotational speed measurement in the regenerative braking systems of electric vehicles. The paper then presents the characteristics of ideal and actual Hall position sensor signals, revealing the relation between the motor rotational speed and the Hall position sensor signals. Finally, the Hall position sensor signal conditioning and processing circuit and the program for motor rotational speed measurement are presented, based on measurement error analysis.
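For a motor with p pole pairs, each Hall sensor signal completes p electrical periods per mechanical revolution, so speed follows directly from edge timestamps. A minimal sketch (the pole-pair count and timestamps are illustrative, not taken from the paper):

# Rotational speed from timestamps of rising edges of one Hall sensor signal.
pole_pairs = 4  # illustrative; one electrical Hall period per pole pair

def rpm_from_edges(edge_times_s):
    """Estimate motor speed (rpm) from consecutive rising-edge timestamps."""
    periods = [t2 - t1 for t1, t2 in zip(edge_times_s, edge_times_s[1:])]
    mean_period = sum(periods) / len(periods)      # seconds per electrical cycle
    mech_freq = 1.0 / (mean_period * pole_pairs)   # mechanical revolutions per second
    return 60.0 * mech_freq

print(rpm_from_edges([0.000, 0.005, 0.010, 0.015]))  # -> 3000 rpm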
NASA Astrophysics Data System (ADS)
Hess, Holger; Albrecht, Martin; Grothof, Markus; Hussmann, Stephan; Schwarte, Rudolf
2004-01-01
Working on optical distance measurement, a new optical correlator was developed at the Institute for Data Processing of the University of Siegen in recent years. The so-called Photonic Mixer Device (PMD), originally intended for laser ranging systems, offers many advantages for wireless optical data communication, such as high-speed spatial light demodulation up to the GHz range and inherent backlight suppression. This contribution describes the application of such PMDs in a free-space interconnect based on the principle of Multi-Dimensional Multiple Access (MDMA) and the advantages of this new approach, starting from the MDMA principle followed by the fundamental functionality of PMDs. After that, an Optical MDMA (O-MDMA) demonstrator and first measurement results are presented.
NASA Astrophysics Data System (ADS)
Wang, Shuang; Yin, Zhen-Qiang; Chau, H. F.; Chen, Wei; Wang, Chao; Guo, Guang-Can; Han, Zheng-Fu
2018-04-01
In comparison to qubit-based protocols, qudit-based quantum key distribution ones generally allow two cooperative parties to share unconditionally secure keys under a higher channel noise. However, it is very hard to prepare and measure the required quantum states in qudit-based protocols in general. One exception is the recently proposed highly error tolerant qudit-based protocol known as the Chau15 (Chau 2015 Phys. Rev. A 92 062324). Remarkably, the state preparation and measurement in this protocol can be done relatively easily since the required states are phase encoded almost like the diagonal basis states of a qubit. Here we report the first proof-of-principle demonstration of the Chau15 protocol. One highlight of our experiment is that its post-processing is based on practical one-way manner, while the original proposal in Chau (2015 Phys. Rev. A 92 062324) relies on complicated two-way post-processing, which is a great challenge in experiment. In addition, by manipulating time-bin qudit and measurement with a variable delay interferometer, our realization is extensible to qudit with high-dimensionality and confirms the experimental feasibility of the Chau15 protocol.
Free-form surface measuring method based on optical theodolite measuring system
NASA Astrophysics Data System (ADS)
Yu, Caili
2012-10-01
The measurement of single-point coordinates, lengths and large-dimension curved surfaces in industrial metrology can be achieved through forward intersection with a theodolite measuring system composed of several optical theodolites and one computer. The measuring principle of a flexible large-dimension three-coordinate measuring system made up of multiple (two or more) optical theodolites, along with the composition and functions of the system, is introduced in this paper. For the measurement of curved surfaces in particular, 3D data of a spatial free-form surface are acquired through the theodolite measuring system, and a CAD model is formed through surface fitting to directly generate CAM processing data.
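Forward intersection from two theodolites reduces, in the horizontal plane, to intersecting two azimuth rays from stations with a known baseline (heights then follow from the vertical angles). A minimal 2D sketch; the station coordinates and angles are illustrative:

import numpy as np

def intersect(p1, az1, p2, az2):
    """Intersect two horizontal sight rays given station positions and azimuths (rad).

    Azimuth is measured clockwise from north (the +y axis), so a ray's
    direction vector is (sin az, cos az).
    """
    d1 = np.array([np.sin(az1), np.cos(az1)])
    d2 = np.array([np.sin(az2), np.cos(az2)])
    # Solve p1 + t1*d1 = p2 + t2*d2 for the ray parameters t1, t2.
    A = np.column_stack([d1, -d2])
    t = np.linalg.solve(A, np.array(p2) - np.array(p1))
    return np.array(p1) + t[0] * d1

# Two theodolites 5 m apart sighting the same target point.
print(intersect((0.0, 0.0), np.radians(45.0), (5.0, 0.0), np.radians(-45.0)))
# -> [2.5 2.5]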
NASA Astrophysics Data System (ADS)
Heffron, E.; Lurton, X.; Lamarche, G.; Brown, C.; Lucieer, V.; Rice, G.; Schimel, A.; Weber, T.
2015-12-01
Backscatter data acquired with multibeam sonars are now commonly used for the remote geological interpretation of the seabed. The systems' hardware, software, and processing methods and tools have grown in number and improved over the years, yet many issues linger: there are no standard procedures for acquisition, calibration is poor or absent, and processing methods are incompletely understood and documented. A workshop organized at the 2013 annual meeting of GeoHab (a community of geoscientists and biologists around the topic of marine habitat mapping) was dedicated to seafloor backscatter data from multibeam sonars and concluded that there was an overwhelming need for better coherence and agreement on the topics of acquisition, processing and interpretation of data. The GeoHab Backscatter Working Group (BSWG) was subsequently created with the purpose of documenting and synthesizing the state of the art in sensors and techniques available today and proposing methods for best practice in the acquisition and processing of backscatter data. Two years later, the resulting document "Backscatter measurements by seafloor-mapping sonars: Guidelines and Recommendations" was completed [1]. The document provides: an introduction to backscatter measurements by seafloor-mapping sonars; a background on the physical principles of sonar backscatter; a discussion of users' needs from a wide spectrum of community end-users; a review of backscatter measurement; an analysis of best practices in data acquisition; a review of data-processing principles with details on present software implementations; and finally a synthesis and key recommendations. This presentation reviews the BSWG mandate and structure and the development of this document. It details the contents of the various chapters, the document's recommendations to sonar manufacturers, operators, data-processing software developers and end-users, and its implications for the marine geology community. [1] Downloadable at https://www.niwa.co.nz/coasts-and-oceans/research-projects/backscatter-measurement-guidelines
A Photogrammetric System for Model Attitude Measurement in Hypersonic Wind Tunnels
NASA Technical Reports Server (NTRS)
Jones, Thomas W.; Lunsford, Charles B.
2007-01-01
A series of wind tunnel tests have been conducted to evaluate a multi-camera videogrammetric system designed to measure model attitude in hypersonic facilities. The technique utilizes processed video data and photogrammetric principles for point tracking to compute model position including pitch, roll and yaw. A discussion of the constraints encountered during the design, and a review of the measurement results obtained from the NASA Langley Research Center (LaRC) 31-Inch Mach 10 tunnel are presented.
Chauvenet, B; Bobin, C; Bouchard, J
2017-12-01
Dead-time correction formulae are established in the general case of superimposed non-homogeneous Poisson processes. Based on the same principles as conventional live-timed counting, this method exploits the additional information made available using digital signal processing systems, and especially the possibility to store the time stamps of live-time intervals. No approximation needs to be made to obtain those formulae. Estimates of the variances of corrected rates are also presented. This method is applied to the activity measurement of short-lived radionuclides. Copyright © 2017 Elsevier Ltd. All rights reserved.
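For reference, conventional live-timed counting, which the paper generalizes to superimposed non-homogeneous Poisson processes, estimates the corrected rate and its variance from the N counts recorded during the accumulated live time T_live (standard formulas, not the paper's generalized ones):

\hat{\rho} = \frac{N}{T_{\mathrm{live}}}, \qquad \operatorname{Var}(\hat{\rho}) \approx \frac{N}{T_{\mathrm{live}}^{2}}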
Causality, Measurement, and Elementary Interactions
NASA Astrophysics Data System (ADS)
Gillis, Edward J.
2011-12-01
Signal causality, the prohibition of superluminal information transmission, is the fundamental property shared by quantum measurement theory and relativity, and it is the key to understanding the connection between nonlocal measurement effects and elementary interactions. To prevent those effects from transmitting information between the generating and observing process, they must be induced by the kinds of entangling interactions that constitute measurements, as implied in the Projection Postulate. They must also be nondeterministic as reflected in the Born Probability Rule. The nondeterminism of entanglement-generating processes explains why the relevant types of information cannot be instantiated in elementary systems, and why the sequencing of nonlocal effects is, in principle, unobservable. This perspective suggests a simple hypothesis about nonlocal transfers of amplitude during entangling interactions, which yields straightforward experimental consequences.
432- μm laser's beam-waist measurement for the polarimeter/interferometer on the EAST tokamak
NASA Astrophysics Data System (ADS)
Wang, Z. X.; Liu, H. Q.; Jie, Y. X.; Wu, M. Q.; Lan, T.; Zhu, X.; Zou, Z. Y.; Yang, Y.; Wei, X. C.; Zeng, L.; Li, G. S.; Gao, X.
2014-10-01
A far-infrared (FIR) polarimeter/interferometer (PI) system is under development for measurements of the current-density and electron-density profiles in the EAST tokamak. The system will utilize three identical 432-μm HCOOH lasers pumped by a CO2 laser. Measurement of the laser beam's waist size and position is basic preparatory work. This paper introduces three methods using a beam profiler and several focusing optical elements. The beam profiler shows the spatial energy distribution of the laser beam; its active area is 12.4 × 12.4 mm2. Focusing optical elements are needed so that the profiler receives the entire laser beam. Two principles and three methods are used in the measurements: the first and third methods are based on the same principle, while the second adopts another. Because its measurement is fast and convenient, the first method, although a special form of the third that yields only the waist size, is essential to the development of the experiment and guides the choice of optical element sizes in the next step. A concave mirror, a high-density polyethylene (HDPE) lens and a polymethylpentene (TPX) lens are each used in the measurement process. The results of the three methods agree closely enough for the design of the PI system's optical path.
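All such methods ultimately reduce to sampling the beam radius w(z) at several axial positions and fitting the Gaussian-beam propagation law to locate the waist. A minimal sketch with synthetic profiler readings; the numbers are illustrative, only the wavelength is set to the 432-μm line:

import numpy as np
from scipy.optimize import curve_fit

lam = 432e-6  # wavelength of the FIR laser line, m

def w(z, w0, z0):
    """Gaussian-beam 1/e^2 radius versus axial position; z0 is the waist location."""
    zR = np.pi * w0**2 / lam              # Rayleigh range
    return w0 * np.sqrt(1.0 + ((z - z0) / zR)**2)

# Synthetic beam-profiler readings of the beam radius (m) at several positions.
rng = np.random.default_rng(1)
z_meas = np.linspace(-0.5, 0.5, 9)
w_meas = w(z_meas, 2.5e-3, 0.1) * (1.0 + 0.01 * rng.standard_normal(9))

(w0_fit, z0_fit), _ = curve_fit(w, z_meas, w_meas, p0=[2e-3, 0.0])
print(f"waist radius {w0_fit*1e3:.2f} mm at z = {z0_fit*1e3:.0f} mm")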
Organic food processing: a framework for concept, starting definitions and evaluation.
Kahl, Johannes; Alborzi, Farnaz; Beck, Alexander; Bügel, Susanne; Busscher, Nicolaas; Geier, Uwe; Matt, Darja; Meischner, Tabea; Paoletti, Flavio; Pehme, Sirli; Ploeger, Angelika; Rembiałkowska, Ewa; Schmid, Otto; Strassner, Carola; Taupier-Letage, Bruno; Załęcka, Aneta
2014-10-01
In 2007, EU Regulation (EC) 834/2007 introduced principles and criteria for organic food processing. These regulations have been analysed and discussed in several scientific publications and research project reports. Recently, organic food quality was described by principles, aspects and criteria. These principles from organic agriculture were verified and adapted for organic food processing, and different levels for evaluation were suggested. In another document, the underlying paradigms and consumer perception of organic food were reviewed against functional food, identifying integral product identity as the underlying paradigm and a holistic quality view connected to naturalness as consumers' perception of organic food quality. In a European study, the quality concept was applied to the organic food chain, revealing a problem: clear principles and related criteria for evaluating processing methods were missing. Therefore, the goal of this paper is to describe and discuss the topic of organic food processing to make it operational. A conceptual background for organic food processing is given by verifying the underlying paradigms and principles of organic farming and organic food as well as of organic processing. The proposed definition connects organic processing to related systems such as minimal, sustainable, careful and gentle processing, and describes clear principles and related criteria. The concept and definitions were verified with food examples, such as milk subjected to different heat treatments. Organic processing can thus be defined by clear paradigms and principles and evaluated according to criteria within a multidimensional approach. Further work has to be done on developing indicators and parameters for the assessment of organic food quality. © 2013 Society of Chemical Industry.
Design of Moisture Content Detection System
NASA Astrophysics Data System (ADS)
Wang, W. C.; Wang, L.
In this paper, a method for measuring the moisture content of grain is presented based on a single-chip microcomputer and a capacitive sensor. The working principle of the moisture measurement is introduced, a concentric-cylinder capacitive sensor is designed, and the signal processing circuits of the system are described in detail. The system was tested in practice, and the various factors affecting the capacitive measurement of grain moisture are discussed on the basis of practical experiments. Experimental results showed that the system has high measuring accuracy and good control capability.
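For a concentric-cylinder cell, the measured capacitance maps to the grain's effective relative permittivity through the coaxial-capacitor formula, and permittivity in turn tracks moisture. A minimal sketch; the cell dimensions, the measured capacitance, and the moisture calibration mentioned in the comment are illustrative assumptions:

import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def rel_permittivity(C_farad, height_m, r_inner_m, r_outer_m):
    """Effective relative permittivity of grain filling a coaxial cylinder cell."""
    return C_farad * math.log(r_outer_m / r_inner_m) / (2 * math.pi * EPS0 * height_m)

# Example: 200 mm tall cell, 20 mm and 60 mm electrode radii, 95 pF measured.
eps_r = rel_permittivity(95e-12, 0.200, 0.020, 0.060)
# Moisture content then follows from an empirical calibration curve, e.g. a
# polynomial in eps_r fitted against oven-dried reference samples.
print(f"relative permittivity ~ {eps_r:.2f}")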
NASA Technical Reports Server (NTRS)
1992-01-01
A Small Business Innovation Research (SBIR) contract led to a commercially available instrument used to measure the shape profile of mirror surfaces in scientific instruments. Bauer Associates, Inc.'s Bauer Model 200 Profilometer is based upon a different measurement concept. The local curvature of the mirror's surface is measured at many points, and the collection of data is computer processed to yield the desired shape profile. (Earlier profilometers are based on the principle of interferometry.) The system is accurate and immune to problems like vibration and turbulence. Two profilometers are currently marketed, and a third will soon be commercialized.
Reflection measurement of waveguide-injected high-power microwave antennas.
Yuan, Chengwei; Peng, Shengren; Shu, Ting; Zhang, Qiang; Zhao, Xuelong
2015-12-01
A method for reflection measurements of high-power microwave (HPM) antennas excited with overmoded waveguides is proposed and studied systematically. In theory, the principle of the method is presented and the data-processing formulas are developed. In simulations, a horn antenna excited by a TE11 mode exciter is examined and its reflection is calculated both by CST Microwave Studio and by the method proposed in this article. In experiments, reflection measurements of two HPM antennas are conducted, and the measured results agree well with the theoretical expectations.
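The article's data-processing formulas are not reproduced here, but the basic quantities such processing yields are standard. A minimal sketch computing the reflection coefficient magnitude, VSWR and return loss from measured incident and reflected powers (the function and values are placeholders):

import math

def reflection(p_incident_w, p_reflected_w):
    """Return |Gamma|, VSWR, and return loss (dB) from measured powers."""
    gamma = math.sqrt(p_reflected_w / p_incident_w)  # reflection coefficient magnitude
    vswr = (1 + gamma) / (1 - gamma)
    return_loss_db = -20.0 * math.log10(gamma)
    return gamma, vswr, return_loss_db

print(reflection(100.0, 2.5))  # -> |Gamma| ~ 0.158, VSWR ~ 1.38, RL ~ 16 dB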
Elvan, Osman Devrim; Turker, Y Ozhan
2014-01-01
Water resources have shaped the destinies of societies and affected the settlement choices of civilizations for centuries. Demand for water is constantly increasing, and this surge, together with the variety of usage types, has become an important threat to water resources; at the same time, balancing the protection and use of ground and surface waters has become more difficult. Progress in legal and corporate structures for water management was too slow for a long time. In this study, the principles of international conventions on groundwater are compared with the relevant Turkish groundwater legislation, which is in the process of harmonization with the European Union (EU) acquis under the scope of Turkey's nomination for EU membership. The purpose of this study is to measure the compliance of Turkish legislation on groundwater with the relevant international principles and conventions, and to analyze legal loopholes in Turkish legislation in light of those principles and conventions.
A method to evaluate process performance by integrating time and resources
NASA Astrophysics Data System (ADS)
Wang, Yu; Wei, Qingjie; Jin, Shuang
2017-06-01
The purpose of process mining is to improve the existing processes of an enterprise, so how to measure process performance is particularly important. However, current research on performance evaluation methods is still insufficient: the main evaluation methods use only time or resources, and such basic statistics cannot evaluate process performance well. In this paper, a method for evaluating process performance based on both the time dimension and the resource dimension is proposed, which can be used to measure the utilization and redundancy of resources in the process. The paper introduces the design principle and formulas of the evaluation algorithm, then the design and implementation of the evaluation method. Finally, the method is used to analyse an event log from a telephone maintenance process, and an optimization plan is proposed.
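As an illustration of combining the two dimensions, the sketch below computes per-resource utilization and redundancy from a toy event log; the log fields and the metric definitions are assumptions for illustration, not the paper's exact formulas:

from collections import defaultdict

# Event log: (case_id, activity, resource, start_time, end_time) in hours.
log = [
    ("c1", "register", "alice", 0.0, 0.5),
    ("c1", "repair",   "bob",   0.5, 2.5),
    ("c2", "register", "alice", 1.0, 1.25),
    ("c2", "repair",   "bob",   3.0, 4.0),
]
horizon = 5.0  # length of the observed period, hours

busy = defaultdict(float)
for _, _, resource, start, end in log:
    busy[resource] += end - start

for resource, hours in busy.items():
    utilization = hours / horizon
    redundancy = 1.0 - utilization   # idle share of the resource over the period
    print(f"{resource}: utilization {utilization:.0%}, redundancy {redundancy:.0%}")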
2002-12-01
Accounting and Reporting System-Field Level; SWOT: Strengths, Weaknesses, Opportunities, Threats; TMA: Tricare Management Activity; TOA: Total Obligational...progression of the four principles. [Ref 3] The organization uses SWOT analysis to assist in developing the mission and business strategy. SWOT stands for the strengths and weaknesses of the organization and the opportunities for and threats to the organization.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wei, D. T.
Ion beam interference coating (IBIC) is a sputter-deposition process for multiple layers of optical thin films employing a Kaufman gun. It has achieved coatings of extremely low optical loss and high mechanical strength, and it has many potential applications over a wide spectral range. The coating process is described in terms of its principle, fabrication procedure, and optical measurements. Some discussion follows on the history and outlook of IBIC, with emphasis on how to achieve low loss and on throughput improvements.
Teaching the foundational principles of rehabilitation psychology.
Stiers, William
2016-02-01
Wright (1983) described 20 "value-laden beliefs and principles" that form the foundational principles of rehabilitation psychology, and the education and training of rehabilitation psychologists necessitates that they acquire the specialty-specific knowledge and attitudes/values related to these principles. This article addresses 2 questions about how these principles can be taught in rehabilitation psychology training: (a) What are the core theories and evidence supporting these foundational principles, and what should be the content of a "core curriculum" for teaching these?; and (b) What is known about the most effective methods for teaching these foundational principles, including questions of how to teach values? The foundational principles were grouped into 3 categories: individual psychological processes, social psychological processes, and values related to social integration. A literature review was conducted in these 3 categories, and the results are summarized and discussed. A core curriculum is discussed for teaching about disability-specific individual psychological processes, social psychological processes, and values related to social integration, including methods to reduce group prejudice and promote values relevant to the foundational principles. Specific suggestions for training program content and methods are provided. It is hoped that effective teaching of Wright's (1983) value-laden beliefs and principles will help rehabilitation psychology trainers and trainees focus on the key knowledge and attitude-value competencies that are to be acquired in training. (c) 2016 APA, all rights reserved.
A novel weld seam detection method for space weld seam of narrow butt joint in laser welding
NASA Astrophysics Data System (ADS)
Shao, Wen Jun; Huang, Yu; Zhang, Yong
2018-02-01
Structured light measurement is widely used for weld seam detection owing to its high measurement precision and robustness. However, for a butt joint whose seam width is less than 0.1 mm and which has no misalignment, the stripe projected onto the weld face shows almost no geometrical deformation, so it is very difficult to ensure exact retrieval of the seam feature. This issue has become prominent as laser welding of thin-metal-plate butt joints is widely applied. Moreover, simultaneous measurement of the seam width, seam center and the normal vector of the weld face during the welding process is of great importance to welding quality but is rarely reported. Consequently, a measurement method based on a vision sensor for the space weld seam of a narrow butt joint is proposed in this article. Three laser stripes with different wavelengths are projected onto the weldment: two red laser stripes are used to measure the three-dimensional profile of the weld face by the principle of optical triangulation, and the third, green laser stripe is used as the light source to measure the edge and the centerline of the seam by the principle of passive vision sensing. A corresponding image processing algorithm is proposed to extract the centerlines of the red laser stripes as well as the seam feature. All three laser stripes are captured and processed in a single image so that the three-dimensional position of the space weld seam can be obtained simultaneously. Finally, experimental results reveal that the proposed method can meet the precision demands of space narrow butt joints.
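For readers unfamiliar with the optical triangulation step invoked above, the minimal sketch below shows how a lateral stripe shift in the image maps to a surface height change. The projection angle, pixel pitch and magnification are illustrative assumptions, not the authors' calibrated setup.

    import math

    def height_from_stripe_shift(dx_pixels, pixel_pitch_mm, magnification, theta_deg):
        """Height change dz producing an image-plane stripe shift dx.

        For a laser sheet inclined at theta to the viewing axis:
        dx_image = M * dz * tan(theta)  =>  dz = dx_image / (M * tan(theta))
        """
        dx_image = dx_pixels * pixel_pitch_mm
        return dx_image / (magnification * math.tan(math.radians(theta_deg)))

    # Example: a 12-pixel shift on a 0.005 mm/pixel sensor at 0.5x magnification,
    # laser sheet at 45 degrees to the viewing axis.
    print(height_from_stripe_shift(12, 0.005, 0.5, 45.0))  # ~0.12 mm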
Microwave/Sonic Apparatus Measures Flow and Density in Pipe
NASA Technical Reports Server (NTRS)
Arndt, G. D.; Ngo, Phong; Carl, J. R.; Byerly, Kent A.
2004-01-01
An apparatus for measuring the rate of flow and the mass density of a liquid or slurry includes a special section of pipe instrumented with microwave and sonic sensors, and a computer that processes digitized readings taken by the sensors. The apparatus was conceived specifically for monitoring a flow of oil-well-drilling mud, but the basic principles of its design and operation are also applicable to monitoring flows of other liquids and slurries.
Read, Gemma J M; Salmon, Paul M; Lenné, Michael G
2016-09-01
The Cognitive Work Analysis Design Toolkit (CWA-DT) is a recently developed approach that provides guidance and tools to assist in applying the outputs of CWA to design processes that incorporate the values and principles of sociotechnical systems theory. In this paper, the CWA-DT is evaluated based on an application to improve safety at rail level crossings (RLXs). The evaluation considered the extent to which the CWA-DT met pre-defined methodological criteria and aligned with sociotechnical values and principles. Both process and outcome measures were taken based on the ratings of workshop participants and human factors experts. Overall, workshop participants were positive about the process and indicated that it met the methodological criteria and sociotechnical values. However, expert ratings suggested that the CWA-DT achieved only limited success in producing RLX designs that fully aligned with the sociotechnical approach. Discussion about the appropriateness of the sociotechnical approach in a public safety context is provided. Practitioner Summary: Human factors and ergonomics practitioners need evidence of the effectiveness of methods. A design toolkit for cognitive work analysis, incorporating values and principles from sociotechnical systems theory, was applied to create innovative designs for rail level crossings. Evaluation results based on the application are provided and discussed.
Complexity and compositionality in fluid intelligence.
Duncan, John; Chylinski, Daphne; Mitchell, Daniel J; Bhandari, Apoorva
2017-05-16
Compositionality, or the ability to build complex cognitive structures from simple parts, is fundamental to the power of the human mind. Here we relate this principle to the psychometric concept of fluid intelligence, traditionally measured with tests of complex reasoning. Following the principle of compositionality, we propose that the critical function in fluid intelligence is splitting a complex whole into simple, separately attended parts. To test this proposal, we modify traditional matrix reasoning problems to minimize requirements on information integration, working memory, and processing speed, creating problems that are trivial once effectively divided into parts. Performance remains poor in participants with low fluid intelligence, but is radically improved by problem layout that aids cognitive segmentation. In line with the principle of compositionality, we suggest that effective cognitive segmentation is important in all organized behavior, explaining the broad role of fluid intelligence in successful cognition.
Al-Araidah, Omar; Momani, Amer; Khasawneh, Mohammad; Momani, Mohammed
2010-01-01
The healthcare arena, much like the manufacturing industry, benefits from many aspects of the Toyota lean principles. Lean thinking contributes to reducing or eliminating nonvalue-added time, money, and energy in healthcare. In this paper, we apply selected principles of lean management aiming at reducing the wasted time associated with drug dispensing at an inpatient pharmacy at a local hospital. Thorough investigation of the drug dispensing process revealed unnecessary complexities that contribute to delays in delivering medications to patients. We utilize DMAIC (Define, Measure, Analyze, Improve, Control) and 5S (Sort, Set-in-order, Shine, Standardize, Sustain) principles to identify and reduce wastes that contribute to increasing the lead-time in healthcare operations at the pharmacy under study. The results obtained from the study revealed potential savings of > 45% in the drug dispensing cycle time.
Popova, A Yu; Trukhina, G M; Mikailova, O M
This article considers the quality control and safety system implemented in one of the largest flight catering plants producing food for airline passengers and flight crews. The control system is based on Hazard Analysis and Critical Control Points (HACCP) principles and on purpose-developed hygienic and anti-epidemic measures. The identification of hazard factors at each stage of the production process is considered, and results of monitoring data for six critical control points over a five-year period are presented. The quality control and safety system reduces the risk of food contamination during acceptance, preparation and supply of in-flight meals. The efficiency of the implemented system was demonstrated, and further ways to harmonize and implement HACCP principles in the plant are identified.
Multimedia Approach and Its Effect in Teaching Mathematics for the Prospective Teachers
ERIC Educational Resources Information Center
Joan, D. R. Robert; Denisia, S. P.
2012-01-01
Multimedia improves the effectiveness of the teaching-learning process, in both formal and informal settings, by utilizing scientific principles. It allows information to be sorted and analysed to make meaning for conceptualization and application suited to individual learners. The objective of the study was to measure the…
A glacier runoff extension to the Precipitation Runoff Modeling System
A. E. Van Beusekom; R. J. Viger
2016-01-01
A module to simulate glacier runoff, PRMSglacier, was added to PRMS (Precipitation Runoff Modeling System), a distributed-parameter, physical-process hydrological simulation code. The extension does not require extensive on-glacier measurements or computational expense but still relies on physical principles over empirical relations as much as is feasible while...
Environmental Science Curriculum Guide, 1987. Bulletin 1792.
ERIC Educational Resources Information Center
Louisiana State Dept. of Education, Baton Rouge. Div. of Academic Programs.
This guide for environmental science is intended to make students aware of the problems they will be facing in their environment, and of alternative measures to solve these problems. The course is designed to use scientific principles to study the processes of the environment; examine changes within the environment from a broad perspective;…
40 CFR 63.11950 - What emissions calculations must I use for an emission profile?
Code of Federal Regulations, 2012 CFR
2012-07-01
... chemical engineering principles, measurable process parameters, or physical or chemical laws or properties... stream. i = Identifier for a HAP compound. (i) Engineering assessments. You must conduct an engineering... drying or empty vessel purging. An engineering assessment may also be used to support a finding that the...
40 CFR 63.11950 - What emissions calculations must I use for an emission profile?
Code of Federal Regulations, 2013 CFR
2013-07-01
... chemical engineering principles, measurable process parameters, or physical or chemical laws or properties... stream. i = Identifier for a HAP compound. (i) Engineering assessments. You must conduct an engineering... drying or empty vessel purging. An engineering assessment may also be used to support a finding that the...
40 CFR 63.11950 - What emissions calculations must I use for an emission profile?
Code of Federal Regulations, 2014 CFR
2014-07-01
... chemical engineering principles, measurable process parameters, or physical or chemical laws or properties... stream. i = Identifier for a HAP compound. (i) Engineering assessments. You must conduct an engineering... drying or empty vessel purging. An engineering assessment may also be used to support a finding that the...
Application of lean manufacturing techniques in the Emergency Department.
Dickson, Eric W; Singh, Sabi; Cheung, Dickson S; Wyatt, Christopher C; Nugent, Andrew S
2009-08-01
"Lean" is a set of principles and techniques that drive organizations to continually add value to the product they deliver by enhancing process steps that are necessary, relevant, and valuable while eliminating those that fail to add value. Lean has been used in manufacturing for decades and has been associated with enhanced product quality and overall corporate success. To evaluate whether the adoption of Lean principles by an Emergency Department (ED) improves the value of emergency care delivered. Beginning in December 2005, we implemented a variety of Lean techniques in an effort to enhance patient and staff satisfaction. The implementation followed a six-step process of Lean education, ED observation, patient flow analysis, process redesign, new process testing, and full implementation. Process redesign focused on generating improvement ideas from frontline workers across all departmental units. Value-based and operational outcome measures, including patient satisfaction, expense per patient, ED length of stay (LOS), and patient volume were compared for calendar year 2005 (pre-Lean) and periodically after 2006 (post-Lean). Patient visits increased by 9.23% in 2006. Despite this increase, LOS decreased slightly and patient satisfaction increased significantly without raising the inflation adjusted cost per patient. Lean improved the value of the care we delivered to our patients. Generating and instituting ideas from our frontline providers have been the key to the success of our Lean program. Although Lean represents a fundamental change in the way we think of delivering care, the specific process changes we employed tended to be simple, small procedure modifications specific to our unique people, process, and place. We, therefore, believe that institutions or departments aspiring to adopt Lean should focus on the core principles of Lean rather than on emulating specific process changes made at other institutions.
NASA Astrophysics Data System (ADS)
Noda, Toshihiko; Takao, Hidekuni; Ashiki, Mitsuaki; Ebi, Hiroyuki; Sawada, Kazuaki; Ishida, Makoto
2004-04-01
In this study, a microchip for the measurement of hemoglobin in human blood has been proposed, fabricated and evaluated. The measurement principle is based on the “cyanmethemoglobin method”, which determines the cyanmethemoglobin concentration by absorption photometry. A glass/silicon/silicon structure was used for the microchip. The middle silicon layer includes the flow channels, with 45° mirrors formed at each end of the flow channels. Photodiodes and metal oxide semiconductor (MOS) integrated circuits were fabricated on the bottom silicon layer. The performance of the microchip for hemoglobin measurement was evaluated using a solution of red food color instead of a real blood sample. The fabricated microchip exhibited a performance similar to that of a nonminiaturized absorption cell with the same optical path length. The signal processing output varied with solution concentration from 5.32 V to 5.55 V with very high stability due to differential signal processing.
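The absorption-photometry step behind the cyanmethemoglobin method can be sketched with the Beer-Lambert law. The molar absorptivity, path length and the use of raw photodiode voltages as intensity proxies below are illustrative assumptions, not the chip's actual calibration.

    import math

    def concentration(v_reference, v_sample, eps_l_per_mmol_cm=11.0, path_cm=0.5):
        """Beer-Lambert: A = eps * c * l, with A = log10(I0 / I).

        Photodiode voltages stand in for intensities I0 (blank) and I (sample);
        eps = 11.0 L/(mmol*cm) is the textbook cyanmethemoglobin value at 540 nm.
        """
        absorbance = math.log10(v_reference / v_sample)
        return absorbance / (eps_l_per_mmol_cm * path_cm)  # mmol/L

    # Using the reported signal extremes as blank/sample readings:
    print(concentration(5.55, 5.32))  # ~0.003 mmol/L under these assumptions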
Brink, Adrian J; Messina, Angeliki P; Feldman, Charles; Richards, Guy A; van den Bergh, Dena
2017-04-01
Few data exist on the implementation of process measures to facilitate adherence to peri-operative antibiotic prophylaxis (PAP) guidelines in Africa. To implement an improvement model for PAP utilizing existing resources, in order to achieve a reduction in surgical site infections (SSIs) across a heterogeneous group of 34 urban and rural South African hospitals. A pharmacist-driven, prospective audit and feedback strategy involving change management and improvement principles was utilized. This 2.5-year intervention involved a pre-implementation phase to test a PAP guideline and a 'toolkit' at pilot sites. Following antimicrobial stewardship committee and clinician endorsement, the model was introduced in all institutions and a survey of baseline SSI and compliance rates with four process measures (antibiotic choice, dose, administration time and duration) was performed. The post-implementation phase involved audit, intervention and monthly feedback to facilitate improvements in compliance. Over 70 weeks of standardized measurements and feedback, 24 206 surgical cases were reviewed. There was a significant improvement in compliance with all process measures (composite compliance) from 66.8% (95% CI 64.8-68.7) to 83.3% (95% CI 80.8-85.8), representing a 24.7% increase (P < 0.0001). The SSI rate decreased by 19.7% from a mean group rate of 2.46 (95% CI 2.18-2.73) pre-intervention to 1.97 post-intervention (95% CI 1.79-2.15) (P = 0.0029). The implementation of process improvement initiatives and principles targeted to institutional needs utilizing pharmacists can effectively improve PAP guideline compliance and sustainable patient outcomes.
NASA Astrophysics Data System (ADS)
van Veenstra, Anne Fleur; Janssen, Marijn
One of the main challenges for e-government is to create coherent services for citizens and businesses. Realizing Integrated Service Delivery (ISD) requires government agencies to collaborate across their organizational boundaries. The coordination of processes across multiple organizations to realize ISD is called orchestration. One way of achieving orchestration is to formalize processes using architecture. In this chapter we identify architectural principles for orchestration by looking at three case studies of cross-organizational service delivery chain formation in the Netherlands. In total, six generic principles were formulated and subsequently validated in two workshops with experts. These principles are: (i) build an intelligent front office, (ii) give processes a clear starting point and end, (iii) build a central workflow application keeping track of the process, (iv) differentiate between simple and complex processes, (v) ensure that the decision-making responsibility and the overview of the process are not performed by the same process role, and (vi) create a central point where risk profiles are maintained. Further research should focus on how organizations can adapt these principles to their own situation.
Introduction to Color Imaging Science
NASA Astrophysics Data System (ADS)
Lee, Hsien-Che
2005-04-01
Color imaging technology has become almost ubiquitous in modern life in the form of monitors, liquid crystal screens, color printers, scanners, and digital cameras. This book is a comprehensive guide to the scientific and engineering principles of color imaging. It covers the physics of light and color, how the eye and physical devices capture color images, how color is measured and calibrated, and how images are processed. It stresses physical principles and includes a wealth of real-world examples. The book will be of value to scientists and engineers in the color imaging industry and, with homework problems, can also be used as a text for graduate courses on color imaging.
Vargas-Leguás, H; Rodríguez Garrido, V; Lorite Cuenca, R; Pérez-Portabella, C; Redecillas Ferreiro, S; Campins Martí, M
2009-06-01
This guide for the preparation of powdered infant formulae in hospital environments is a collaborative work between several hospital services and is based on national and European regulations, international expert meetings and the recommendations of scientific societies. The guide also uses the Hazard Analysis and Critical Control Point principles proposed by the Codex Alimentarius and emphasises effective verification measures, microbiological controls of the process and corrective actions when monitoring indicates that a critical control point is not under control. It is a dynamic guide and specifies the evaluation procedures that allow it to be constantly adapted.
Basic design principles of colorimetric vision systems
NASA Astrophysics Data System (ADS)
Mumzhiu, Alex M.
1998-10-01
Color measurement is an important part of overall production quality control in the textile, coating, plastics, food, paper and other industries. The color measurement instruments, such as colorimeters and spectrophotometers, used for production quality control have many limitations. In many applications they cannot be used for a variety of reasons and have to be replaced with human operators. Machine vision has great potential for color measurement. The components for color machine vision systems, such as broadcast-quality 3-CCD cameras, fast and inexpensive PCI frame grabbers, and sophisticated image processing software packages, are available. However, the machine vision industry has only started to approach the color domain. The few color machine vision systems on the market, produced by the largest machine vision manufacturers, have very limited capabilities. A lack of understanding that a vision-based color measurement system can fail if it ignores the basic principles of colorimetry is the main reason for the slow progress of color vision systems. The purpose of this paper is to clarify how color measurement principles have to be applied to vision systems and how the electro-optical design features of colorimeters have to be modified in order to implement them in vision systems. The subject far exceeds the limits of a journal paper, so only the most important aspects are discussed, together with an overview of the major application areas for colorimetric vision systems. Finally, the reasons why some customers are happy with their vision systems and some are not are analyzed.
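As a concrete example of the colorimetric principles a vision system must respect, the sketch below linearizes gamma-encoded sRGB values and maps them to device-independent CIE XYZ. The standard sRGB/D65 matrix is used here as an assumption; a real camera needs its own calibrated matrix.

    # Standard sRGB (D65) linear-RGB -> XYZ matrix
    SRGB_TO_XYZ = [
        [0.4124, 0.3576, 0.1805],
        [0.2126, 0.7152, 0.0722],
        [0.0193, 0.1192, 0.9505],
    ]

    def linearize(u):
        """Invert the sRGB gamma encoding (u in 0..1)."""
        return u / 12.92 if u <= 0.04045 else ((u + 0.055) / 1.055) ** 2.4

    def rgb_to_xyz(r, g, b):
        lin = [linearize(u) for u in (r, g, b)]
        return [sum(row[i] * lin[i] for i in range(3)) for row in SRGB_TO_XYZ]

    print(rgb_to_xyz(0.8, 0.4, 0.2))  # X, Y, Z tristimulus values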
Optical Fiber On-Line Detection System for Non-Touch Monitoring Roller Shape
NASA Astrophysics Data System (ADS)
Guo, Y.; Wang, Y. T.
2006-10-01
Based on the principle of the reflective displacement fiber-optic sensor, a high-accuracy non-contact on-line optical fiber measurement system for roller shape is presented, and the principle, composition and operation of the detection system are described. By using a novel probe of three optical fibers with equal transverse spacing, the effects of fluctuations in the light source, changes in the reflectivity of the target surface and intensity losses in the fiber lines are automatically compensated. In addition, an optical fiber sensor model for correcting static error, based on a back-propagation (BP) artificial neural network (ANN), is set up, and interpolation and filtering of the signals effectively reduce the influence of random noise and roller-bearing vibration, markedly enhancing accuracy and resolution. Experiments prove that the accuracy of the system meets the demands of the practical production process, providing a new method for high-speed, accurate and automatic on-line detection of mill roller shape.
Mid-Level Planning and Control for Articulated Locomoting Systems
2017-02-12
accelerometers and gyros into each module of our snake robots. Prior work from our group has already used an extended Kalman filter (EKF) to fuse these distributed...body frame is performed as part of the measurement model at every iteration of the filter, using an SVD to identify the principal components of the...addition to the conventional EKF, although we found that all three methods worked equally well. All three filters used the same process and measurement
Inauen, A; Jenny, G J; Bauer, G F
2012-06-01
This article focuses on organizational analysis in workplace health promotion (WHP) projects. It shows how this analysis can be designed such that it provides rational data relevant to the further context-specific and goal-oriented planning of WHP and equally supports individual and organizational change processes implied by WHP. Design principles for organizational analysis were developed on the basis of a narrative review of the guiding principles of WHP interventions and organizational change as well as the scientific principles of data collection. Further, the practical experience of WHP consultants who routinely conduct organizational analysis was considered. This resulted in a framework with data-oriented and change-oriented design principles, addressing the following elements of organizational analysis in WHP: planning the overall procedure, data content, data-collection methods and information processing. Overall, the data-oriented design principles aim to produce valid, reliable and representative data, whereas the change-oriented design principles aim to promote motivation, coherence and a capacity for self-analysis. We expect that the simultaneous consideration of data- and change-oriented design principles for organizational analysis will strongly support the WHP process. We finally illustrate the applicability of the design principles to health promotion within a WHP case study.
Search for geo-neutrinos and rare nuclear processes with Borexino
NASA Astrophysics Data System (ADS)
Caminata, Alessio; Davini, Stefano; di Noto, Lea; Pallavicini, Marco; Testera, Gemma; Zavatarelli, Sandra
2018-03-01
Borexino was designed to measure solar neutrinos in the MeV or sub-MeV energy range. The unprecedented radiopurity of the detector has allowed the detection of geo-neutrinos and the determination of competitive limits on the rate of rare or forbidden processes. In this paper, we review the basic principles of neutrino and antineutrino detection in Borexino and we describe the results of the geo-neutrino measurements and their implications. Then we summarize the search for Borexino events correlated with gamma-ray bursts and for axion-induced signals, and the limits achieved on Pauli-forbidden transitions and on electron charge conservation.
Non-Contact Detection of Breathing Using a Microwave Sensor
Dei, Devis; Grazzini, Gilberto; Luzi, Guido; Pieraccini, Massimiliano; Atzeni, Carlo; Boncinelli, Sergio; Camiciottoli, Gianna; Castellani, Walter; Marsili, Massimo; Dico, Juri Lo
2009-01-01
In this paper the use of a continuous-wave microwave sensor as a non-contact tool for quantitative measurement of respiratory tidal volume has been evaluated by experimentation in seventeen healthy volunteers. The sensor's working principle is reported and several causes that can affect its response are analyzed. A suitable data-processing scheme has been devised that is able to reject the majority of breath measurements taken under unsuitable conditions. Furthermore, a relationship between microwave sensor measurements and the volume inspired and expired at quiet breathing (tidal volume) has been found. PMID:22574033
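A minimal sketch of the continuous-wave sensing principle involved: chest-wall motion dx modulates the phase of the reflected microwave signal by dphi = 4*pi*dx/lambda. The carrier frequency below is an assumption chosen for illustration; the paper's empirical calibration from displacement to tidal volume is not reproduced here.

    import math

    C = 3.0e8  # speed of light, m/s

    def displacement_from_phase(dphi_rad, carrier_hz=10.5e9):
        """Round-trip phase shift -> chest-wall displacement for a CW sensor."""
        wavelength = C / carrier_hz
        return wavelength * dphi_rad / (4.0 * math.pi)

    # A 1-radian phase swing at an assumed 10.5 GHz carrier is ~2.3 mm of motion:
    print(displacement_from_phase(1.0) * 1e3, "mm")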
Thermal Remote Sensing and the Thermodynamics of Ecosystem Development
NASA Technical Reports Server (NTRS)
Luvall, Jeffrey C.; Kay, James J.; Fraser, Roydon F.
2000-01-01
Thermal remote sensing can provide environmental measuring tools with capabilities for measuring ecosystem development and integrity. Recent advances in applying principles of nonequilibrium thermodynamics to ecology provide fundamental insights into energy partitioning in ecosystems. Ecosystems are nonequilibrium systems, open to material and energy flows, which grow and develop structures and processes to increase energy degradation. More developed terrestrial ecosystems will be more effective at dissipating the solar gradient (degrading its energy content). This can be measured by the effective surface temperature of the ecosystem on a landscape scale.
Absolute Distance Measurement with the MSTAR Sensor
NASA Technical Reports Server (NTRS)
Lay, Oliver P.; Dubovitsky, Serge; Peters, Robert; Burger, Johan; Ahn, Seh-Won; Steier, William H.; Fetterman, Harrold R.; Chang, Yian
2003-01-01
The MSTAR sensor (Modulation Sideband Technology for Absolute Ranging) is a new system for measuring absolute distance, capable of resolving the integer cycle ambiguity of standard interferometers, and making it possible to measure distance with sub-nanometer accuracy. The sensor uses a single laser in conjunction with fast phase modulators and low frequency detectors. We describe the design of the system - the principle of operation, the metrology source, beam-launching optics, and signal processing - and show results for target distances up to 1 meter. We then demonstrate how the system can be scaled to kilometer-scale distances.
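The integer-cycle ambiguity resolution described above can be sketched generically: a coarse absolute range, accurate to a fraction of the fine channel's period, pins down the integer cycle count of the fine interferometric phase. All numbers below are assumptions for illustration, not MSTAR's actual parameters.

    import math

    def resolve_range(coarse_m, fine_phase_rad, period_m):
        """Combine a coarse range with a fine phase that repeats every period_m."""
        frac = fine_phase_rad / (2.0 * math.pi)        # fractional cycle, 0..1
        n = round(coarse_m / period_m - frac)          # integer cycle count
        return (n + frac) * period_m

    # Coarse range good to a few mm, fine phase good to ~1e-4 cycle,
    # assumed 10 mm fine-channel period:
    print(resolve_range(coarse_m=0.998705, fine_phase_rad=4.2, period_m=0.01))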
How quantitative measures unravel design principles in multi-stage phosphorylation cascades.
Frey, Simone; Millat, Thomas; Hohmann, Stefan; Wolkenhauer, Olaf
2008-09-07
We investigate design principles of linear multi-stage phosphorylation cascades by using quantitative measures for signaling time, signal duration and signal amplitude. We compare alternative pathway structures by varying the number of phosphorylations and the length of the cascade. We show that a model for a weakly activated pathway does not reflect the biological context well, unless it is restricted to certain parameter combinations. Focusing therefore on a more general model, we compare alternative structures with respect to a multivariate optimization criterion. We test the hypothesis that the structure of a linear multi-stage phosphorylation cascade is the result of an optimization process aiming for a fast response, defined by the minimum of the product of signaling time and signal duration. It is then shown that certain pathway structures minimize this criterion. Several popular models of MAPK cascades form the basis of our study. These models represent different levels of approximation, which we compare and discuss with respect to the quantitative measures.
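For reference, the quantitative measures named above are commonly defined (following Heinrich et al. 2002, the framework on which such cascade studies build) as moments of the time course X(t) of the activated kinase:

    \tau = \frac{\int_0^\infty t\,X(t)\,dt}{\int_0^\infty X(t)\,dt}, \qquad
    \vartheta = \sqrt{\frac{\int_0^\infty t^2\,X(t)\,dt}{\int_0^\infty X(t)\,dt} - \tau^2}, \qquad
    S = \frac{\int_0^\infty X(t)\,dt}{2\vartheta}

where τ is the signaling time, ϑ the signal duration and S the signal amplitude; the fast-response criterion discussed in the abstract is then the minimum of the product τ·ϑ.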
Silicon solar cell process development, fabrication and analysis
NASA Technical Reports Server (NTRS)
Leung, D. C.; Iles, P. A.
1983-01-01
Measurements of minority carrier diffusion lengths were made on the small mesa diodes from HEM Si and SILSO Si. The results were consistent with previous Voc and Isc measurements. Only the medium-grain SILSO had a distinct advantage for the non-grain-boundary diodes. Substantial variations were observed for the HEM ingot 4141C. Also, a quantitatively scaled light spot scan was being developed for localized diffusion length measurements in polycrystalline silicon solar cells. A change to a more monochromatic input for the light spot scan results in greater sensitivity and, in principle, quantitative measurement of local material quality is now possible.
ERIC Educational Resources Information Center
Resing, Wilma C. M.; Tunteler, Erika
2007-01-01
In this article, time effects on intelligence test scores have been investigated. In particular, we examined whether the "Flynn effect" is manifest in children from the middle and higher IQ distribution range, measured with a child intelligence test based on information processing principles--the Leiden Diagnostic Test. The test was administered…
Successful affiliations: principles and practices.
Rice, Ann Madden
2011-01-01
An affiliation can help a healthcare provider prepare for the challenges of healthcare reform, the rapidly changing landscapes of the commercial insurance industry, and the public's expectations about service and quality. UC Davis Medical Center, a 645-bed tertiary hospital in Sacramento, California, with many hospital-based clinics and a community-based group of primary care clinics, has developed a number of principles for affiliation. These principles are based on its experience in legal and financial affiliations with an academic practice group, with individual and small groups of primary care physicians, and with community hospitals around oncology services linked with UC Davis' National Cancer Institute-designated cancer center. This article offers a process for evaluating the appropriateness of an affiliation. The chances for a successful affiliation improve if each party has indicated the value it hopes to derive and how to measure that value, has communicated with all affected constituents, and has an agreed-upon method for resolving disputes.
Applying Lean Six Sigma to improve medication management.
Nayar, Preethy; Ojha, Diptee; Fetrick, Ann; Nguyen, Anh T
2016-01-01
A significant proportion of veterans use dual care or health care services within and outside the Veterans Health Administration (VHA). In this study conducted at a VHA medical center in the USA, the authors used Lean Six Sigma principles to develop recommendations to eliminate wasteful processes and implement a more efficient and effective process to manage medications for dual care veteran patients. The purpose of this study is to: assess compliance with the VHA's dual care policy; collect data and describe the current process for co-management of dual care veterans' medications; and draft recommendations to improve the current process for dual care medications co-management. Input was obtained from the VHA patient care team members to draw a process map to describe the current process for filling a non-VHA prescription at a VHA facility. Data were collected through surveys and direct observation to measure the current process and to develop recommendations to redesign and improve the process. A key bottleneck in the process that was identified was the receipt of the non-VHA medical record which resulted in delays in filling prescriptions. The recommendations of this project focus on the four domains of: documentation of dual care; veteran education; process redesign; and outreach to community providers. This case study describes the application of Lean Six Sigma principles in one urban Veterans Affairs Medical Center (VAMC) in the Mid-Western USA to solve a specific organizational quality problem. Therefore, the findings may not be generalizable to other organizations. The Lean Six Sigma general principles applied in this project to develop recommendations to improve medication management for dual care veterans are applicable to any process improvement or redesign project and has valuable lessons for other VAMCs seeking to improve care for their dual care veteran patients. The findings of this project will be of value to VA providers and policy makers and health care managers who plan to apply Lean Six Sigma techniques in their organizations to improve the quality of care for their patients.
NASA Astrophysics Data System (ADS)
Shanahan, Daniel
2008-05-01
The memory loophole supposes that the measurement of an entangled pair is influenced by the measurements of earlier pairs in the same run of measurements. To assert the memory loophole is thus to deny that measurement is intrinsically random. It is argued that measurement might instead involve a process of recovery and equilibrium in the measuring apparatus akin to that described in thermodynamics by Le Chatelier's principle. The predictions of quantum mechanics would then arise from conservation of the measured property in the combined system of apparatus and measured ensemble. Measurement would be consistent with classical laws of conservation, not simply in the classical limit of large numbers, but whatever the size of the ensemble. However variances from quantum mechanical predictions would be self-correcting and centripetal, rather than Markovian and increasing as under the standard theory. Entanglement correlations would persist, not because the entangled particles act in concert (which would entail nonlocality), but because the measurements of the particles were influenced by the one fluctuating state of imbalance in the process of measurement.
NASA Astrophysics Data System (ADS)
Selle, B.; Schwientek, M.
2012-04-01
Water quality of ground and surface waters in catchments is typically driven by many complex and interacting processes. While small-scale processes are often studied in great detail, their relevance and interplay at catchment scales often remain poorly understood. For many catchments, extensive monitoring data on water quality have been collected for different purposes. These heterogeneous data sets contain valuable information on catchment-scale processes but are rarely analysed using integrated methods. Principal component analysis (PCA) has previously been applied to this kind of data set. However, a detailed analysis of scores, which are an important result of a PCA, is often missing. Mathematically, PCA expresses measured variables on water quality, e.g. nitrate concentrations, as linear combinations of independent, not directly observable key processes. These computed key processes are represented by principal components. Their scores are interpretable as process intensities which vary in space and time. Subsequently, scores can be correlated with other key variables and catchment characteristics, such as water travel times and land use, that were not considered in the PCA. This detailed analysis of scores represents an extension of the commonly applied PCA which could considerably improve the understanding of processes governing water quality at catchment scales. In this study, we investigated the 170 km2 Ammer catchment in SW Germany, which is characterised by an above-average proportion of agricultural (71%) and urban (17%) areas. The Ammer River is mainly fed by karstic springs. For PCA, we separately analysed concentrations from (a) surface waters of the Ammer River and its tributaries, (b) spring waters from the main aquifers and (c) deep groundwater from production wells. This analysis was extended by a detailed analysis of scores. We analysed measured concentrations of major ions and selected organic micropollutants. Additionally, redox-sensitive variables and environmental tracers indicating groundwater age were analysed for deep groundwater from production wells. For deep groundwater, we found that microbial turnover was more strongly influenced by the local availability of energy sources than by travel times of groundwater to the wells. Groundwater quality primarily reflected the input of pollutants determined by land use, e.g. agrochemicals. We concluded that for water quality in the Ammer catchment, conservative mixing of waters with different origins is more important than reactive transport processes along the flow path.
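A minimal sketch of the extended workflow described above: standardize the water-quality variables, extract principal components, then correlate per-sample scores with a catchment variable that was external to the PCA. All data here are synthetic placeholders, not the Ammer measurements.

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(50, 6))        # 50 samples x 6 solute concentrations (made up)
    travel_time = rng.normal(size=50)   # external catchment variable (made up)

    Z = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize
    cov = np.cov(Z, rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)       # PCA via eigendecomposition
    order = np.argsort(eigval)[::-1]           # sort components by variance
    scores = Z @ eigvec[:, order]              # per-sample "process intensities"

    # Correlate the first component's scores with the external variable:
    r = np.corrcoef(scores[:, 0], travel_time)[0, 1]
    print(f"correlation of PC1 scores with travel time: r = {r:.2f}")
    # With synthetic data r is near zero; with real data a strong correlation
    # would tie the component to a driver such as groundwater travel time.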
Student Engagement: A Principle-Based Concept Analysis.
Bernard, Jean S
2015-08-04
A principle-based concept analysis of student engagement was used to examine the state of the science across disciplines. Four major perspectives of philosophy of science guided analysis and provided a framework for study of interrelationships and integration of conceptual components which then resulted in formulation of a theoretical definition. Findings revealed student engagement as a dynamic reiterative process marked by positive behavioral, cognitive, and affective elements exhibited in pursuit of deep learning. This process is influenced by a broader sociocultural environment bound by contextual preconditions of self-investment, motivation, and a valuing of learning. Outcomes of student engagement include satisfaction, sense of well-being, and personal development. Findings of this analysis prove relevant to nursing education as faculty transition from traditional teaching paradigms, incorporate learner-centered strategies, and adopt innovative pedagogical methodologies. The analysis lends support for curricula reform, development of more accurate evaluative measures, and creation of meaningful teaching-learning environments within the discipline.
First-Principles Prediction of Liquid/Liquid Interfacial Tension.
Andersson, M P; Bennetzen, M V; Klamt, A; Stipp, S L S
2014-08-12
The interfacial tension between two liquids is the free energy per unit surface area required to create that interface. Interfacial tension is a determining factor for two-phase liquid behavior in a wide variety of systems ranging from water flooding in oil recovery processes and remediation of groundwater aquifers contaminated by chlorinated solvents to drug delivery and a host of industrial processes. Here, we present a model for predicting interfacial tension from first principles using density functional theory calculations. Our model requires no experimental input and is applicable to liquid/liquid systems of arbitrary compositions. The consistency of the predictions with experimental data is significant for binary, ternary, and multicomponent water/organic compound systems, which offers confidence in using the model to predict behavior where no data exists. The method is fast and can be used as a screening technique as well as to extend experimental data into conditions where measurements are technically too difficult, time consuming, or impossible.
Model-based review of Doppler global velocimetry techniques with laser frequency modulation
NASA Astrophysics Data System (ADS)
Fischer, Andreas
2017-06-01
Optical measurements of flow velocity fields are of crucial importance for understanding the behavior of complex flows. One flow field measurement technique is Doppler global velocimetry (DGV). A large variety of different DGV approaches exist, e.g., applying different kinds of laser frequency modulation. In order to investigate the measurement capabilities especially of the newer DGV approaches with laser frequency modulation, a model-based review of all DGV measurement principles is performed. The DGV principles can be categorized by the respective number of required time steps. The systematic review of all DGV principles reveals drawbacks and benefits of the different measurement approaches with respect to the temporal resolution, the spatial resolution and the measurement range. Furthermore, the Cramér-Rao bound for photon shot noise is calculated and discussed, which represents a fundamental limit of the achievable measurement uncertainty. As a result, all DGV techniques provide similar minimal uncertainty limits. With N_photons as the number of scattered photons, the minimal standard deviation of the flow velocity is about 10^6 (m/s)/√N_photons, calculated for a perpendicular arrangement of the illumination and observation directions and a laser wavelength of 895 nm. As a further result, the signal processing efficiencies are determined with a Monte-Carlo simulation. Except for the newest correlation-based DGV method, the signal processing algorithms are already optimal or near the optimum. Finally, the different DGV approaches are compared regarding errors due to temporal variations of the scattered light intensity and the flow velocity. The influence of a linear variation of the scattered light intensity can be reduced by maximizing the number of time steps, because this means acquiring more information for the correction of this systematic effect. However, more time steps can result in a flow velocity measurement with a lower temporal resolution when operating at the maximal frame rate of the camera. DGV without laser frequency modulation then provides the highest temporal resolution and is not sensitive to temporal variations but to spatial variations of the scattered light intensity. In contrast, all DGV variants suffer from velocity variations during the measurement. In summary, the experimental conditions and the measurement task finally decide the ideal choice among the reviewed DGV methods.
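Taking the shot-noise bound quoted above at face value, a quick calculation shows why DGV needs large photon budgets; the function below simply inverts sigma = 1e6 (m/s)/sqrt(N_photons).

    def photons_needed(sigma_target_m_s, bound=1.0e6):
        """Photons required to reach a target velocity standard deviation."""
        return (bound / sigma_target_m_s) ** 2

    print(f"{photons_needed(1.0):.0e}")   # ~1e12 photons for 1 m/s
    print(f"{photons_needed(0.1):.0e}")   # ~1e14 photons for 0.1 m/s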
The Development of a Gas–Liquid Two-Phase Flow Sensor Applicable to CBM Wellbore Annulus
Wu, Chuan; Wen, Guojun; Han, Lei; Wu, Xiaoming
2016-01-01
The measurement of wellbore annulus gas–liquid two-phase flow in CBM (coalbed methane) wells is of great significance for reasonably developing gas drainage and extraction processes, estimating CBM output, judging the operating conditions of CBM wells and analyzing stratum conditions. Hence, a specially designed sensor is urgently needed for real-time measurement of gas–liquid two-phase flow in CBM wellbore annulus. Existing flow sensors fail to meet the requirements of the operating conditions of CBM wellbore annulus due to such factors as an inapplicable measurement principle, larger size, poor sealability, high installation accuracy, and higher requirements for fluid media. Therefore, based on the principle of a target flowmeter, this paper designs a new two-phase flow sensor that can identify and automatically calibrate different flow patterns of two-phase flows. Upon the successful development of the new flow sensor, lab and field tests were carried out, and the results show that the newly designed sensor, with a measurement accuracy of ±2.5%, can adapt to the operating conditions of CBM wells and is reliable for long-term work. PMID:27869708
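The target-flowmeter principle the sensor is based on can be sketched from the drag equation F = 0.5*rho*v^2*Cd*A: velocity follows from the force measured on a target plate. The drag coefficient, plate area and fluid density below are illustrative assumptions, not the sensor's calibration.

    import math

    def velocity_from_drag(force_n, rho_kg_m3, cd=1.1, area_m2=1.0e-4):
        """Invert the drag equation for the flow velocity past the target."""
        return math.sqrt(2.0 * force_n / (rho_kg_m3 * cd * area_m2))

    # Water-like liquid phase, 0.05 N measured on a 1 cm^2 target:
    print(velocity_from_drag(0.05, 1000.0))  # ~0.95 m/s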
Principles of survey development for telemedicine applications.
Demiris, George
2006-01-01
Surveys can be used in the evaluation of telemedicine applications but they must be properly designed, consistent and accurate. The purpose of the survey and the resources available will determine the extent of testing that a survey instrument should undergo prior to its use. The validity of an instrument is the correspondence between what is being measured and what was intended to be measured. The reliability of an instrument describes the 'consistency' or 'repeatability' of the measurements made with it. Survey instruments should be designed and tested following basic principles of survey development. The actual survey administration also requires consideration, for example data collection and processing, as well as the interpretation of the findings. Surveys are of two different types. Either they are self-administered, or they are administered by interview. In the latter case, they may be administered by telephone or in a face-to-face meeting. It is important to design a survey instrument based on a detailed definition of what it intends to measure and to test it before administering it to the larger sample.
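As one concrete example of the reliability ('consistency') checks described above, the sketch below computes Cronbach's alpha over respondent-by-item survey scores; the data are made up for illustration.

    import numpy as np

    def cronbach_alpha(items):
        """alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_var = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1.0 - item_var / total_var)

    # Rows = respondents, columns = items of a hypothetical 3-item scale:
    scores = np.array([[4, 5, 4], [2, 3, 2], [5, 5, 4], [3, 3, 3], [4, 4, 5]])
    print(f"alpha = {cronbach_alpha(scores):.2f}")  # ~0.91 for this toy data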
Weeks, James L
2006-06-01
The Mine Safety and Health Administration (MSHA) proposes to issue citations for non-compliance with the exposure limit for respirable coal mine dust when measured exposure exceeds the exposure limit with a "high degree of confidence." This criterion threshold value (CTV) is derived from the sampling and analytical error of the measurement method. This policy is based on a combination of statistical and legal reasoning: the one-tailed 95% confidence limit of the sampling method, the apparent principle of due process and a standard of proof analogous to "beyond a reasonable doubt." This policy raises the effective exposure limit, it is contrary to the precautionary principle, it is not a fair sharing of the burden of uncertainty, and it employs an inappropriate standard of proof. Its own advisory committee and NIOSH have advised against this policy. For longwall mining sections, it results in a failure to issue citations for approximately 36% of the measured values that exceed the statutory exposure limit. Citations for non-compliance with the respirable dust standard should be issued for any measured exposure that exceeds the exposure limit.
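A hedged illustration of the criticized policy: a citation is issued only when a measurement exceeds the limit by the one-tailed 95% bound of the sampling/analytical error. The formula and the coefficient of variation below are assumptions for illustration, not MSHA's published values.

    def criterion_threshold(limit_mg_m3, cv=0.09, z95=1.645):
        """Illustrative CTV: limit inflated by the one-tailed 95% error bound."""
        return limit_mg_m3 * (1.0 + z95 * cv)

    limit = 2.0  # mg/m^3 respirable coal mine dust limit at the time
    ctv = criterion_threshold(limit)
    print(f"CTV = {ctv:.2f} mg/m^3")  # ~2.30: readings in (2.0, 2.30] draw no citation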
Patient involvement in clinical research: why, when, and how
Sacristán, José A; Aguarón, Alfonso; Avendaño-Solá, Cristina; Garrido, Pilar; Carrión, Juan; Gutiérrez, Alipio; Kroes, Robert; Flores, Angeles
2016-01-01
The development of a patient-centered approach to medicine is gradually allowing more patients to be involved in their own medical decisions. However, this change is not happening at the same rate in clinical research, where research generally continues to be carried out on patients, but not with patients. This work describes the why, when, and how of more active patient participation in the research process. Specific measures are proposed to improve patient involvement in 1) setting priorities, 2) study leadership and design, 3) improved access to clinical trials, 4) preparation and oversight of the information provided to participants, 5) post-study evaluation of the patient experience, and 6) the dissemination and application of results. In order to achieve these aims, the relative emphases on the ethical principles underlying research need to be changed. The current model based on the principle of beneficence must be left behind, and one that upholds the ethical principles of autonomy and non-maleficence should be embraced. There is a need to improve the level of information that patients and society as a whole have on research objectives and processes; the goal is to promote the gradual emergence of the expert patient. PMID:27175063
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hourdequin, Marion, E-mail: Marion.Hourdequin@ColoradoCollege.edu; Department of Philosophy, Colorado College, 14 E. Cache La Poudre St., Colorado Springs, CO 80903; Landres, Peter
Traditional mechanisms for public participation in environmental impact assessment under U.S. federal law have been criticized as ineffective and unable to resolve conflict. As these mechanisms are modified and new approaches developed, we argue that participation should be designed and evaluated not only on practical grounds of cost-effectiveness and efficiency, but also on ethical grounds based on democratic ideals. In this paper, we review and synthesize modern democratic theory to develop and justify four ethical principles for public participation: equal opportunity to participate, equal access to information, genuine deliberation, and shared commitment. We then explore several tensions that are inherent in applying these ethical principles to public participation in EIA. We next examine traditional NEPA processes and newer collaborative approaches in light of these principles. Finally, we explore the circumstances that argue for more in-depth participatory processes. While improved EIA participatory processes do not guarantee improved outcomes in environmental management, processes informed by these four ethical principles derived from democratic theory may lead to increased public engagement and satisfaction with government agency decisions. Highlights: • Four ethical principles based on democratic theory for public participation in EIA. • NEPA and collaboration offer different strengths in meeting these principles. • We explore tensions inherent in applying these principles. • Improved participatory processes may improve public acceptance of agency decisions.
Teaching about Due Process of Law. ERIC Digest.
ERIC Educational Resources Information Center
Vontz, Thomas S.
Fundamental constitutional and legal principles are central to effective instruction in the K-12 social studies curriculum. To become competent citizens, students need to develop an understanding of the principles on which their society and government are based. Few principles are as important in the social studies curriculum as due process of…
Technology integration performance assessment using lean principles in health care.
Rico, Florentino; Yalcin, Ali; Eikman, Edward A
2015-01-01
This study assesses the impact of automated infusion system (AIS) integration at a positron emission tomography (PET) center based on "lean thinking" principles. The authors propose a systematic measurement system that evaluates improvement in terms of the "8 wastes." This adaptation to the health care context consisted of performance measurement before and after integration of the AIS in terms of time, utilization of resources, amount of materials wasted/saved, system variability, distances traveled, and worker strain. The authors' observations indicate that the AIS stands to be very effective in a busy PET department, such as the one at Moffitt Cancer Center, owing to its accuracy, pace, and reliability, especially after the necessary adjustments are made to reduce or eliminate the sources of error. This integration must be accompanied by a process reengineering exercise to realize the full potential of the AIS in reducing waste and improving patient care and worker satisfaction.
Design Principles for Rapid Prototyping Forces Sensors using 3D Printing.
Kesner, Samuel B; Howe, Robert D
2011-07-21
Force sensors provide critical information for robot manipulators, manufacturing processes, and haptic interfaces. Commercial force sensors, however, are generally not adapted to specific system requirements, resulting in sensors with excess size, cost, and fragility. To overcome these issues, 3D printers can be used to create components for the quick and inexpensive development of force sensors. Limitations of this rapid prototyping technology, however, require specialized design principles. In this paper, we discuss techniques for rapidly developing simple force sensors, including selecting and attaching metal flexures, using inexpensive and simple displacement transducers, and 3D printing features to aid in assembly. These design methods are illustrated through the design and fabrication of a miniature force sensor for the tip of a robotic catheter system. The resulting force sensor prototype can measure forces with an error as low as 2% of the 10 N measurement range.
Exotic looped trajectories of photons in three-slit interference
Magaña-Loaiza, Omar S; De Leon, Israel; Mirhosseini, Mohammad; Fickler, Robert; Safari, Akbar; Mick, Uwe; McIntyre, Brian; Banzer, Peter; Rodenburg, Brandon; Leuchs, Gerd; Boyd, Robert W.
2016-01-01
The validity of the superposition principle and of Born's rule are well-accepted tenets of quantum mechanics. Surprisingly, it has been predicted that the intensity pattern formed in a three-slit experiment is seemingly in contradiction with the most conventional form of the superposition principle when exotic looped trajectories are taken into account. However, the probability of observing such paths is typically very small, thus rendering them extremely difficult to measure. Here we confirm the validity of Born's rule and present the first experimental observation of exotic trajectories as additional paths for the light by directly measuring their contribution to the formation of optical interference fringes. We accomplish this by enhancing the electromagnetic near-fields in the vicinity of the slits through the excitation of surface plasmons. This process increases the probability of occurrence of these exotic trajectories, demonstrating that they are related to the near-field component of the photon's wavefunction. PMID:28008907
Using offsets to mitigate environmental impacts of major projects: A stakeholder analysis.
Martin, Nigel; Evans, Megan; Rice, John; Lodhia, Sumit; Gibbons, Philip
2016-09-01
Global patterns of development suggest that as more projects are initiated, business will need to find acceptable measures to conserve biodiversity. The application of environmental offsets allows firms to combine their economic interests with the environment and society. This article presents the results of a multi-stakeholder analysis related to the design of offsets principles, policies, and regulatory processes in the context of large infrastructure projects. The results indicate that business was primarily interested in using direct offsets and other compensatory measures, known internationally as indirect offsets, to acquit their environmental management obligations. In contrast, the environmental sector argued that highly principled and scientifically robust offsets programs should be implemented and maintained for enduring environmental protection. Stakeholder consensus stressed the importance of offsets registers with commensurate monitoring and enforcement. Our findings provide instructive insights into the countervailing views of offsets policy stakeholders.
Principles for the dynamic maintenance of cortical polarity
Marco, Eugenio; Wedlich-Soldner, Roland; Li, Rong; Altschuler, Steven J.; Wu, Lani F.
2007-01-01
Diverse cell types require the ability to dynamically maintain polarized membrane protein distributions through balancing transport and diffusion. However, design principles underlying dynamically maintained cortical polarity are not well understood. Here we constructed a mathematical model for characterizing the morphology of dynamically polarized protein distributions. We developed analytical approaches for measuring all model parameters from single-cell experiments. We applied our methods to a well-characterized system for studying polarized membrane proteins: budding yeast cells expressing activated Cdc42. We found that balanced diffusion and colocalized transport to and from the plasma membrane were sufficient for accurately describing polarization morphologies. Surprisingly, the model predicts that polarized regions are defined with a precision that is nearly optimal for measured transport rates, and that polarity can be dynamically stabilized through positive feedback with directed transport. Our approach provides a step towards understanding how biological systems shape spatially precise, unambiguous cortical polarity domains using dynamic processes. PMID:17448998
Langeland, Eva; Riise, Trond; Hanestad, Berit R; Nortvedt, Monica W; Kristoffersen, Kjell; Wahl, Astrid K
2006-08-01
Although the theory of salutogenesis provides generic understanding of how coping may be created, this theoretical perspective has not been explored sufficiently within research among people suffering from mental health problems. The aim of this study is to investigate the effect of talk-therapy groups based on salutogenic treatment principles on coping with mental health problems. In an experimental design, the participants (residents in the community) were randomly allocated to a coping-enhancing experimental group (n=59) and a control group (n=47) receiving standard care. Coping was measured using the sense of coherence (SOC) questionnaire. Coping improved significantly in the experimental group (+6 points) compared with the control group (-2 points). The manageability component contributed most to this improvement. Talk-therapy groups based on salutogenic treatment principles improve coping among people with mental health problems. Such groups may be helpful in increasing coping in the recovery process and seem to be applicable to people with various mental health problems.
Review of surface steam sterilization for validation purposes.
van Doornmalen, Joost; Kopinga, Klaas
2008-03-01
Sterilization is an essential step in the process of producing sterile medical devices. To guarantee sterility, the process of sterilization must be validated. Because there is no direct way to measure sterility, the techniques applied to validate the sterilization process are based on statistical principles. Steam sterilization is the most frequently applied sterilization method worldwide and can be validated either by indicators (chemical or biological) or by physical measurements. The steam sterilization conditions are described in the literature. Starting from these conditions, criteria for the validation of steam sterilization are derived and can be described in terms of physical parameters. Physical validation of steam sterilization appears to be an adequate and efficient validation method that can be considered an alternative to indicator validation. Moreover, physical validation can be used for effective troubleshooting in steam sterilizing processes.
Fischer, Andreas
2016-11-01
Optical flow velocity measurements are important for understanding the complex behavior of flows. Although a huge variety of methods exist, they are either based on a Doppler or a time-of-flight measurement principle. Doppler velocimetry evaluates the velocity-dependent frequency shift of light scattered at a moving particle, whereas time-of-flight velocimetry evaluates the traveled distance of a scattering particle per time interval. Regarding the aim of achieving a minimal measurement uncertainty, it is unclear whether one principle allows one to achieve lower uncertainties or whether both principles can achieve equal uncertainties. For this reason, the natural, fundamental uncertainty limit according to Heisenberg's uncertainty principle is derived for the Doppler and time-of-flight measurement principles, respectively. The obtained limits of the velocity uncertainty are qualitatively identical, showing, e.g., a direct proportionality to the absolute value of the velocity to the power of 3/2 and an inverse proportionality to the square root of the scattered light power. Hence, both measurement principles have identical potentials regarding the fundamental uncertainty limit due to the quantum mechanical behavior of photons. This fundamental limit can be attained (at least asymptotically) in reality either with Doppler or time-of-flight methods, because the respective Cramér-Rao bounds for dominating photon shot noise, which is modeled as white Poissonian noise, are identical with the conclusions from Heisenberg's uncertainty principle.
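The scaling stated in this abstract can be written compactly as a proportionality; the set-up-dependent prefactors are omitted here:

```latex
% Velocity-uncertainty scaling as described above: the fundamental limit
% grows with the velocity magnitude as |v|^{3/2} and falls with the square
% root of the scattered light power P.
\begin{equation}
  \sigma_v \;\propto\; \frac{|v|^{3/2}}{\sqrt{P}}
\end{equation}
```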
A sub-ensemble theory of ideal quantum measurement processes
NASA Astrophysics Data System (ADS)
Allahverdyan, Armen E.; Balian, Roger; Nieuwenhuizen, Theo M.
2017-01-01
In order to elucidate the properties currently attributed to ideal measurements, one must explain how the concept of an individual event with a well-defined outcome may emerge from quantum theory, which deals with statistical ensembles, and how different runs issued from the same initial state may end up with different final states. This so-called "measurement problem" is tackled with two guidelines. On the one hand, the dynamics of the macroscopic apparatus A coupled to the tested system S is described mathematically within a standard quantum formalism, where "q-probabilities" remain devoid of interpretation. On the other hand, interpretative principles, aimed to be minimal, are introduced to account for the expected features of ideal measurements. Most of the five principles stated here, which relate the quantum formalism to physical reality, are straightforward and refer to macroscopic variables. The process can be identified with a relaxation of S + A to thermodynamic equilibrium, not only for a large ensemble E of runs but even for its sub-ensembles. The different mechanisms of quantum statistical dynamics that ensure these types of relaxation are exhibited, and the required properties of the Hamiltonian of S + A are indicated. The additional theoretical information provided by the study of sub-ensembles removes Schrödinger's quantum ambiguity of the final density operator for E, which hinders its direct interpretation, and brings out a commutative behaviour of the pointer observable at the final time. The latter property supports the introduction of a last interpretative principle, needed to switch from the statistical ensembles and sub-ensembles described by quantum theory to individual experimental events. It amounts to identifying some formal "q-probabilities" with ordinary frequencies, but only those which refer to the final indications of the pointer. The desired properties of ideal measurements, in particular the uniqueness of the result for each individual run of the ensemble and von Neumann's reduction, are thereby recovered with economic interpretations. The status of Born's rule involving both A and S is re-evaluated, and contextuality of quantum measurements is made obvious.
Measurement standards for interdisciplinary medical rehabilitation.
Johnston, M V; Keith, R A; Hinderer, S R
1992-12-01
Rehabilitation must address problems inherent in the measurement of human function and health-related quality of life, as well as problems in diagnosis and measurement of impairment. This educational document presents an initial set of standards to be used as guidelines for development and use of measurement and evaluation procedures and instruments for interdisciplinary, health-related rehabilitation. Part I covers general measurement principles and technical standards, beginning with validity, the central consideration for use of measures. Subsequent sections focus on reliability and errors of measurement, norms and scaling, development of measures, and technical manuals and guides. Part II covers principles and standards for use of measures. General principles of application of measures in practice are discussed first, followed by standards to protect persons being measured and then by standards for administrative applications. Many explanations, examples, and references are provided to help professionals understand measurement principles. Improved measurement will ensure the basis of rehabilitation as a science and nourish its success as a clinical service.
Peterson, J P S; Sarthour, R S; Souza, A M; Oliveira, I S; Goold, J; Modi, K; Soares-Pinto, D O; Céleri, L C
2016-04-01
Landauer's principle sets fundamental thermodynamical constraints for classical and quantum information processing, thus affecting not only various branches of physics, but also computer science and engineering. Despite its importance, this principle was only recently experimentally considered for classical systems. Here we employ a nuclear magnetic resonance set-up to experimentally address the information to energy conversion in a quantum system. Specifically, we consider a molecule with three nuclear spins (spin-1/2 qubits)-the system, the reservoir and the ancilla-to measure the heat dissipated during the implementation of a global system-reservoir unitary interaction that changes the information content of the system. By employing an interferometric technique, we were able to reconstruct the heat distribution associated with the unitary interaction. Then, through quantum state tomography, we measured the relative change in the entropy of the system. In this way, we were able to verify that an operation that changes the information content of the system must necessarily generate heat in the reservoir, exactly as predicted by Landauer's principle. The scheme presented here allows for the detailed study of irreversible entropy production in quantum information processors.
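For reference, the bound verified in this experiment has the standard textbook form below; this is the generic statement, not a form specific to the NMR set-up:

```latex
% Heat dissipated into the reservoir is bounded below by the temperature
% times the decrease \Delta S of the system's entropy; erasing one bit of
% information costs at least k_B T ln 2.
\begin{equation}
  \langle Q \rangle \;\ge\; T\,\Delta S,
  \qquad
  \langle Q \rangle_{\mathrm{1\,bit}} \;\ge\; k_B T \ln 2
\end{equation}
```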
NASA Astrophysics Data System (ADS)
Murali, Swetha; Ponmalar, V.
2017-07-01
To make innovation and continuous improvement the norm, some traditional practices must be unlearnt. Change for growth and competitiveness is required for the sustainability of any profitable business, including the construction industry. Leading companies are willing to implement Total Quality Management (TQM) principles to realise potential advantages and improve growth and efficiency. Research has repeatedly identified quality as the most significant contributor to competitive advantage and industry leadership. The two objectives of this paper are to 1) identify TQM effectiveness in residential projects and 2) identify areas of client satisfaction/dissatisfaction using the Analytical Hierarchy Process (AHP) and suggest effective mitigation measures. Using statistical survey techniques, including a questionnaire survey, it was observed that total quality management is applied to some extent in leading successful organizations. The main attributes for quality achievement can be defined as teamwork and better communication around a single agreed goal between client and contractor. Onsite safety is a paramount attribute in identifying quality within residential projects. Process-based quality methods, such as safe onsite working conditions, safety management systems, and modern engineering process safety controls, were observed to be interlinked functions. Training and effective communication with all stakeholders on quality management principles are essential for effective quality work. Only through effective TQM principles can companies avoid contract litigation and increase the client satisfaction index.
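As a hedged sketch of how AHP turns pairwise comparisons into priority weights via the principal-eigenvector method; the comparison matrix below is invented for illustration and is not data from this study:

```python
# Hypothetical AHP sketch: priority weights from a pairwise comparison matrix.
import numpy as np

A = np.array([
    [1,   3,   5  ],   # illustrative: teamwork vs. communication vs. safety
    [1/3, 1,   2  ],
    [1/5, 1/2, 1  ],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                 # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                # normalized priority weights

# Consistency check; RI = 0.58 is the standard random index for n = 3
ci = (eigvals.real[k] - len(A)) / (len(A) - 1)
cr = ci / 0.58
print(w, cr)                                # CR < 0.1 is conventionally acceptable
```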
NASA Astrophysics Data System (ADS)
Schmidt, J. B.
1985-09-01
This thesis investigates ways of improving the real-time performance of the Stockpoint Logistics Integrated Communication Environment (SPLICE). Performance evaluation through continuous monitoring activities and performance studies are the principal vehicles discussed. The method for implementing this performance evaluation process is the measurement of predefined performance indexes. Performance indexes for SPLICE are offered that would measure these areas. Existing SPLICE capability to carry out performance evaluation is explored, and recommendations are made to enhance that capability.
New type of measuring and intelligent instrument for curing tobacco
NASA Astrophysics Data System (ADS)
Yi, Chui-Jie; Huang, Xieqing; Chen, Tianning; Xia, Hong
1993-09-01
A new type of intelligent measuring instrument for tobacco curing is presented in this paper. Based on fuzzy linguistic control principles, the instrument controls the temperature and humidity during tobacco curing, using an 8031 single-chip computer as the central controller. Fuzzy weighting factors are used to decouple the cross-coupling in the curing procedure. Results produced by the instrument indicate that its fuzzy controller performs well for the tobacco curing process.
Brain tissues volume measurements from 2D MRI using parametric approach
NASA Astrophysics Data System (ADS)
L'vov, A. A.; Toropova, O. A.; Litovka, Yu. V.
2018-04-01
The purpose of the paper is to propose a fully automated method for volume assessment of structures within the human brain. Our statistical approach uses the maximum interdependency principle in the decision-making process for assessing measurement consistency and unequal observations. Outlier detection is performed using the maximum normalized residual test. We propose a statistical model which utilizes knowledge of the tissue distribution in the human brain and applies partial data restoration to improve precision. The proposed approach is computationally efficient and independent of the segmentation algorithm used in the application.
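The maximum normalized residual test mentioned above is commonly known as Grubbs' test; a minimal sketch, with invented data:

```python
# Minimal Grubbs' (maximum normalized residual) test; alpha and the sample
# values are illustrative, not data from the paper.
import numpy as np
from scipy import stats

def grubbs_outlier(x, alpha=0.05):
    """Return index of the most extreme point if it is an outlier, else None."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    i = np.argmax(np.abs(x - x.mean()))
    g = np.abs(x[i] - x.mean()) / x.std(ddof=1)   # max normalized residual
    t = stats.t.ppf(1 - alpha / (2 * n), n - 2)   # critical t value
    g_crit = ((n - 1) / np.sqrt(n)) * np.sqrt(t**2 / (n - 2 + t**2))
    return i if g > g_crit else None

print(grubbs_outlier([10.1, 9.8, 10.0, 10.2, 14.9]))  # flags index 4
```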
Basic Skills Resource Center: Report on the Preliminary Research Findings
1985-01-01
indicates that the higher the level of processing, the greater the comprehension and recall. This is true of word lists (Craik & Lockhart, 1972) as well as… [Remainder of the excerpt is illegible OCR; the recoverable headings are Principle 8 (Levels of Processing), Principle 9 (Content-Driven Strategy/Skills Instruction), Principle 10 (Instruction, Content, and Prior Knowledge), and Principle 11 (Sequencing).]
Laser pulse coded signal frequency measuring device based on DSP and CPLD
NASA Astrophysics Data System (ADS)
Zhang, Hai-bo; Cao, Li-hua; Geng, Ai-hui; Li, Yan; Guo, Ru-hai; Wang, Ting-feng
2011-06-01
Laser pulse coding is an anti-jamming measure used in semi-active laser-guided weapons. Because the laser-guided signal adopts a pulse-coding mode and the signal is weak, the frequency measurement process requires complex calculations exploiting the time correlation of the coded pulse train to meet the demands of optoelectronic countermeasures against semi-active laser-guided weapons. To complete an accurate frequency measurement in a short time, the device performs an autocorrelation on the series of pulse arrival times, calculates the signal repetition period, and then identifies the code type, achieving signal decoding from the time values, their number, and their rank within a signal cycle. Using a CPLD and a DSP as the signal processing chips, we designed a laser-guided-signal frequency measurement device and improved its signal processing capability through appropriate software algorithms. In this article, we introduce the frequency measurement principle of the device, describe its hardware components, system operation, and software, and analyze the impact of several system factors on measurement accuracy. The experimental results indicate that this system improves measurement accuracy while satisfying the requirements of small volume, real-time operation, anti-interference, and low power for a laser pulse frequency measuring device.
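A hedged sketch of the arrival-time correlation idea: histogramming pairwise differences of pulse arrival times makes the repetition period stand out as the dominant lag. The pulse pattern and resolution below are invented for illustration:

```python
# Estimate the repetition period of a coded pulse train from arrival times.
import numpy as np

def repetition_period(arrival_times, resolution=1e-6):
    """Histogram all positive pairwise time differences; return dominant lag."""
    t = np.sort(np.asarray(arrival_times))
    diffs = (t[None, :] - t[:, None]).ravel()
    diffs = diffs[diffs > 0]
    bins = np.arange(0, diffs.max() + resolution, resolution)
    hist, edges = np.histogram(diffs, bins=bins)
    return edges[np.argmax(hist)]        # lag with the most coincidences

# Three cycles of a 3-pulse code repeating every 10 ms
pattern = np.array([0.0, 1.3e-3, 4.1e-3])
times = np.concatenate([pattern, pattern + 10e-3, pattern + 20e-3])
print(repetition_period(times, resolution=0.5e-3))  # ~10 ms
```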
Disturbance, the uncertainty principle and quantum optics
NASA Technical Reports Server (NTRS)
Martens, Hans; Demuynck, Willem M.
1993-01-01
It is shown how a disturbance-type uncertainty principle can be derived from an uncertainty principle for joint measurements. To achieve this, we first clarify the meaning of 'inaccuracy' and 'disturbance' in quantum mechanical measurements. The case of photon number and phase is treated as an example, and it is applied to a quantum non-demolition measurement using the optical Kerr effect.
NASA Astrophysics Data System (ADS)
Orloff, Nathan D.; Long, Christian J.; Obrzut, Jan; Maillaud, Laurent; Mirri, Francesca; Kole, Thomas P.; McMichael, Robert D.; Pasquali, Matteo; Stranick, Stephan J.; Alexander Liddle, J.
2015-11-01
Advances in roll-to-roll processing of graphene and carbon nanotubes have at last led to the continuous production of high-quality coatings and filaments, ushering in a wave of applications for flexible and wearable electronics, woven fabrics, and wires. These applications often require specific electrical properties, and hence precise control over material micro- and nanostructure. While such control can be achieved, in principle, by closed-loop processing methods, there are relatively few noncontact and nondestructive options for quantifying the electrical properties of materials on a moving web at the speed required in modern nanomanufacturing. Here, we demonstrate a noncontact microwave method for measuring the dielectric constant and conductivity (or geometry for samples of known dielectric properties) of materials in a millisecond. Such measurement times are compatible with current and future industrial needs, enabling real-time materials characterization and in-line control of processing variables without disrupting production.
Orloff, Nathan D.; Long, Christian J.; Obrzut, Jan; Maillaud, Laurent; Mirri, Francesca; Kole, Thomas P.; McMichael, Robert D.; Pasquali, Matteo; Stranick, Stephan J.; Alexander Liddle, J.
2015-01-01
Advances in roll-to-roll processing of graphene and carbon nanotubes have at last led to the continuous production of high-quality coatings and filaments, ushering in a wave of applications for flexible and wearable electronics, woven fabrics, and wires. These applications often require specific electrical properties, and hence precise control over material micro- and nanostructure. While such control can be achieved, in principle, by closed-loop processing methods, there are relatively few noncontact and nondestructive options for quantifying the electrical properties of materials on a moving web at the speed required in modern nanomanufacturing. Here, we demonstrate a noncontact microwave method for measuring the dielectric constant and conductivity (or geometry for samples of known dielectric properties) of materials in a millisecond. Such measurement times are compatible with current and future industrial needs, enabling real-time materials characterization and in-line control of processing variables without disrupting production. PMID:26592441
SPM interferometer with large range for micro-vibration measurement
NASA Astrophysics Data System (ADS)
Fu, Mingyi; Tang, Chaowei; He, Guotian; Hu, Jun; Wang, Li
2007-12-01
Measuring range and precision are two conflicting parameters of traditional optical interferometry. In this paper, an interferometer measuring vibration with high precision and large range is proposed and its measuring principle is analyzed in detail. The interferometer obtains phase information by processing the interference signals with two real-time phase discriminators, and the vibration displacement is obtained by unwrapping this phase. The measuring range is thereby enlarged from half a wavelength to the millimeter scale. Meanwhile, the measuring precision is independent of external disturbance, and high-precision vibration displacement measurement is realized. In experiments, a vibration displacement of 6000.5 nm was measured with a repeatable precision of 5.72 nm. The feasibility of the measuring method was validated by experiments.
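A minimal sketch of the phase-unwrapping step that extends the range beyond half a wavelength; the wavelength, displacement ramp, and sampling are assumed values, not those of the instrument:

```python
# Quadrature interference signals -> wrapped phase -> unwrapped displacement.
import numpy as np

wavelength = 632.8e-9                      # assumed He-Ne-like wavelength, m
d_true = np.linspace(0, 3e-6, 2000)        # true displacement ramp, 0..3 um

phase = 4 * np.pi * d_true / wavelength    # Michelson: phi = 4*pi*d/lambda
i_sig, q_sig = np.cos(phase), np.sin(phase)   # quadrature signals

wrapped = np.arctan2(q_sig, i_sig)         # phase discriminator output, +-pi
unwrapped = np.unwrap(wrapped)             # remove 2*pi jumps

d_est = unwrapped * wavelength / (4 * np.pi)
print(np.max(np.abs(d_est - d_true)))      # ~0 for adequate sampling
```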
Attention in aviation. [to aircraft design and pilot performance
NASA Technical Reports Server (NTRS)
Wickens, Christopher D.
1987-01-01
The relevance of four principles or mechanisms of human attention to the design of aviation systems and the performance of pilots in multitask environments, including workload prediction and measurement, control-display integration, and the use of voice and head-up displays is discussed. The principles are: the mental energy that supplies task performance (resources), the resulting cross-talk between tasks as they are made more similar (confusion), the combination of different task elements (integration), and the way in which one task is processed and another is ignored (selection or tunneling). The introduction of greater levels of complexity into the validation of attentional theories in order to approach the demands of the cockpit or ATC console is proposed.
NASA Astrophysics Data System (ADS)
Rausch, J.; Hatzfeld, C.; Karsten, R.; Kraus, R.; Millitzer, J.; Werthschützky, R.
2012-06-01
This paper presents an experimental evaluation of three different strain measuring principles. Mounted on a steel beam resembling a car engine mount, metal foil strain gauges, piezoresistive silicon strain gauges and piezoelectric patches are investigated to measure structure-borne forces to control an active mounting structure. FEA simulation determines the strains to be measured to lie in the range of 10⁻⁸ up to 10⁻⁵ m·m⁻¹. These low strains cannot be measured with conventional metal foil strain gauges, as shown in the experiment conducted. Both piezoresistive and piezoelectric gauges show good results compared to a conventional piezoelectric force sensor. Depending on bandwidth, overload capacity and primary electronics costs, these principles seem to be worth considering in an adaptronic system design. These parameters are described in detail for the principles investigated.
NASA Astrophysics Data System (ADS)
Wu, Weibin; Dai, Yifan; Zhou, Lin; Xu, Mingjin
2016-09-01
Material removal accuracy has a direct impact on the machining precision and efficiency of ion beam figuring. By analyzing the factors suppressing the improvement of material removal accuracy, we conclude that correcting the removal function deviation and reducing the amount of material removed during each iterative process can help to improve material removal accuracy. The removal function correcting principle can effectively compensate the removal function deviation between the actual figuring and simulated processes, while experiments indicate that material removal accuracy decreases with a long machining time, so removing only a small amount of material in each iterative process is suggested. However, more clamping and measuring steps will be introduced in this way, which will also generate machining errors and suppress the improvement of material removal accuracy. On this account, a measurement-free iterative process method is put forward to improve material removal accuracy and figuring efficiency by using fewer measuring and clamping steps. Finally, an experiment on a φ100-mm Zerodur planar is performed, which shows that, in similar figuring time, three measurement-free iterative processes could improve the material removal accuracy and the surface error convergence rate by 62.5% and 17.6%, respectively, compared with a single iterative process.
Machine tools error characterization and compensation by on-line measurement of artifact
NASA Astrophysics Data System (ADS)
Wahid Khan, Abdul; Chen, Wuyi; Wu, Lili
2009-11-01
Most manufacturing machine tools are utilized for mass production or batch production with high accuracy under a deterministic manufacturing principle. The volumetric accuracy of machine tools depends on the positional accuracy of the cutting tool, probe or end effector relative to the workpiece in the workspace volume. In this research paper, a methodology is presented for volumetric calibration of machine tools by on-line measurement of an artifact or an object of a similar type. The machine tool geometric error characterization was carried out with a standard or an artifact having geometry similar to the mass production or batch production product. The artifact was measured at an arbitrary position in the volumetric workspace with a calibrated Renishaw touch trigger probe system. Positional errors were stored in a computer for compensation purposes, so that the manufacturing batch can then be run with compensated codes. This methodology was found to be quite effective for manufacturing high-precision components with greater dimensional accuracy and reliability. Calibration by on-line measurement offers the advantage of improving the manufacturing process through the deterministic manufacturing principle and was found efficient and economical, but it is limited to the workspace or envelope surface of the measured artifact's geometry or profile.
The Baby Care Questionnaire: A measure of parenting principles and practices during infancy☆
Winstanley, Alice; Gattis, Merideth
2013-01-01
The current report provides a new framework to explore the role of parenting practices and principles during infancy. We identify structure and attunement as key parenting principles during infancy. Structure represents reliance on regularity and routines in daily life. Attunement represents reliance on infant cues and close physical contact. We suggest parents’ relative endorsement of these parenting principles is related to their choices about practices such as feeding, holding and night-time sleeping. We designed the Baby Care Questionnaire to measure parents’ endorsement of structure and attunement, as well as their daily parenting practices. We report data demonstrating the factor structure, reliability and validity of the BCQ. The BCQ, to our knowledge, is the first comprehensive measure of parenting practices and principles during infancy. We conclude with a discussion of future directions for the measure. PMID:24050932
ERIC Educational Resources Information Center
Mi, Fangqiong
2010-01-01
A growing number of residency programs are instituting curricula to include the component of evidence-based medicine (EBM) principles and process. However, these curricula may not be able to achieve the optimal learning outcomes, perhaps because various contextual factors are often overlooked when EBM training is being designed, developed, and…
Jain, Rahi; Venkatasubramanian, Padma
2014-01-01
Quality Ayurvedic herbal medicines are potential, low-cost solutions for addressing contemporary healthcare needs of both Indian and global community. Correlating Ayurvedic herbal preparations with modern processing principles (MPPs) can help develop new and use appropriate technology for scaling up production of the medicines, which is necessary to meet the growing demand. Understanding the fundamental Ayurvedic principles behind formulation and processing is also important for improving the dosage forms. Even though Ayurvedic industry has adopted technologies from food, chemical and pharmaceutical industries, there is no systematic study to correlate the traditional and modern processing methods. This study is an attempt to provide a possible correlation between the Ayurvedic processing methods and MPPs. A systematic literature review was performed to identify the Ayurvedic processing methods by collecting information from English editions of classical Ayurveda texts on medicine preparation methods. Correlation between traditional and MPPs was done based on the techniques used in Ayurvedic drug processing. It was observed that in Ayurvedic medicine preparations there were two major types of processes, namely extraction, and separation. Extraction uses membrane rupturing and solute diffusion principles, while separation uses volatility, adsorption, and size-exclusion principles. The study provides systematic documentation of methods used in Ayurveda for herbal drug preparation along with its interpretation in terms of MPPs. This is the first step which can enable improving or replacing traditional techniques. New technologies or use of existing technologies can be used to improve the dosage forms and scaling up while maintaining the Ayurvedic principles similar to traditional techniques.
NASA Astrophysics Data System (ADS)
Musaitif, Linda M.
Purpose. The purpose of this study was to determine the degree to which undergraduate full-time and adjunct faculty members in the health and science programs at community colleges in Southern California utilize the seven principles of good practice as measured by the Faculty Inventory of the Seven Principles for Good Practice in Undergraduate Education. A second purpose was to compare the degree of utilization by gender and class size. Methodology. This is a quantitative study involving a systematic, mathematical assessment of data gathered through a Likert-scale survey to model the use of the principles by the target population of full-time and adjunct faculty in health/science programs of community colleges in Southern California. Findings. Examination of the data revealed that both full-time and adjunct faculty members of Southern California community colleges perceive themselves as utilizing the seven principles of good practice to a high degree. There were no statistically significant data to suggest a discrepancy between full-time and adjunct professors' perceptions of their utilization of the seven principles. Overall, male faculty members perceived themselves as utilizing the principles to a greater degree than female faculty. The data suggest that faculty with class sizes of 60 or larger utilized the seven principles more frequently than professors with smaller class sizes. Conclusions. Full-time and adjunct professors of the health and sciences in Southern California community colleges perceive themselves as utilizing the seven principles of good practice to a high degree. Recommendations. This study suggests several directions for future research, including the degree to which negative economic factors such as budget cuts affect the utilization of the seven principles. Also recommended is a study comparing students' perceptions of faculty's utilization of the seven principles of good practice in the classroom with faculty's self-perception.
Complementarity and Compensation: Bridging the Gap between Writing and Design.
ERIC Educational Resources Information Center
Killingsworth, M. Jimmie; Sanders, Scott P.
1990-01-01
Outlines two rhetorical principles for producing iconic-mosaic texts--the principle of complementarity and the principle of compensation. Shows how these principles can be applied to practical problems in coordinating the writing and design processes in student projects. (RS)
Approaches on calibration of bolometer and establishment of bolometer calibration device
NASA Astrophysics Data System (ADS)
Xia, Ming; Gao, Jianqiang; Ye, Jun'an; Xia, Junwen; Yin, Dejin; Li, Tiecheng; Zhang, Dong
2015-10-01
Bolometers are mainly used for measuring thermal radiation in public places and in the fields of labor hygiene, heating and ventilation, and building energy conservation. The working principle of a bolometer is as follows: under exposure to thermal radiation, the temperature of the detector's black absorbing layer rises as it absorbs the radiation, producing a thermoelectric electromotive force. The detector's white reflective layer does not absorb thermal radiation, so its thermoelectric electromotive force is almost zero. Comparing the electromotive forces of the black absorbing layer and the white reflective layer eliminates the influence of the potential produced by changes in the substrate background temperature. The electromotive force produced by the thermal radiation is processed by the signal processing unit, and the reading is presented by the indication display unit. The measurement unit of thermal radiation intensity is usually W/m2 or kW/m2. Accurate and reliable values are important for high-temperature operation and for graded management of labor safety and hygiene. The bolometer calibration device is mainly composed of an absolute radiometer, a reference light source, and electrical measuring instruments. The absolute radiometer is a self-calibrating radiometer: its working principle is to substitute electrical power, which can be accurately measured, for radiation power, thereby measuring the radiation power absolutely. The absolute radiometer is the standard apparatus of the low-power laser standard facility, so measurement traceability is guaranteed. Using a comparison calibration method, the absolute radiometer and the bolometer alternately measure the reference light source at the same position, which yields a correction factor for the irradiance indication. This paper mainly describes the design and calibration method of the bolometer calibration device. The uncertainty of the calibration result is also evaluated.
[The General Principles of Suicide Prevention Policy from the perspective of clinical psychiatry].
Cho, Yoshinori; Inagaki, Masatoshi
2014-01-01
In view of the fact that the suicide rate in Japan has remained high since 1998, the Basic Act on Suicide Prevention was implemented in 2006 with the objective of comprehensively promoting suicide prevention measures on a national scale. Based on this Basic Act, in 2007, the Japanese government formulated the General Principles of Suicide Prevention Policy as a guideline for recommended suicide prevention measures. These General Principles were revised in 2012 in accordance with the initial plan of holding a review after five years. The Basic Act places an emphasis on the various social factors that underlie suicides and takes the perspective that suicide prevention measures are also social measures. The slogan of the revised General Principles is "Toward Realization of a Society in which Nobody is Driven to Commit Suicide". The General Principles list various measures that are able to be used universally. These contents would be sufficient if the objective of the General Principles were "realization of a society that is easy to live in"; however, the absence of information on the effectiveness and order of priority for each measure may limit the specific effectiveness of the measures in relation to the actual prevention of suicide. In addition, considering that nearly 90% of suicide victims are in a state at the time of committing suicide in which a psychiatric disorder would be diagnosed, it would appear from a psychiatric standpoint that measures related to mental health, including expansion of psychiatric services, should be the top priority in suicide prevention measures. However, this is not the case in the General Principles, in either its original or revised form. Revisions to the General Principles related to clinical psychiatry provide more detailed descriptions of measures for individuals who unsuccessfully attempt suicide and identify newly targeted mental disorders other than depression; however, the overall proportion of contents relating to psychiatric care remains small. In particular, it must be noted that almost no measures are provided for individuals with chronic psychiatric disorders. We believe that the role of academic societies involved in suicide prevention, including our own, is to organize the contents of the General Principles based on evidence, to advance research in areas lacking in evidence, and to promote support for implementation of activities in areas with clear evidence.
17 CFR 37.400 - Core Principle 4-Monitoring of trading and trade processing.
Code of Federal Regulations, 2014 CFR
2014-04-01
... trading and trade processing. 37.400 Section 37.400 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION SWAP EXECUTION FACILITIES Monitoring of Trading and Trade Processing § 37.400 Core Principle 4—Monitoring of trading and trade processing. The swap execution facility shall: (a) Establish and...
Evaluation of Preanalytical Quality Indicators by Six Sigma and Pareto's Principle.
Kulkarni, Sweta; Ramesh, R; Srinivasan, A R; Silvia, C R Wilma Delphine
2018-01-01
Preanalytical steps are the major source of error in the clinical laboratory. Analytical errors can be corrected by quality control procedures, but there is a need for stringent quality checks in the preanalytical phase, as these processes are done outside the laboratory. The sigma value depicts the performance of a laboratory and its quality measures. Hence, in the present study, Six Sigma and the Pareto principle were applied to preanalytical quality indicators to evaluate the performance of the clinical biochemistry laboratory. This observational study was carried out over a period of 1 year, from November 2015 to November 2016. A total of 1,44,208 samples and 54,265 test requisition forms were screened for preanalytical errors such as missing patient information, missing sample collection details on the forms, and hemolysed, lipemic, inappropriate, or insufficient samples; the total number of errors was calculated and converted into defects per million and onto the sigma scale. A Pareto chart was drawn using the total number of errors and the cumulative percentage. In 75% of test requisition forms the diagnosis was not mentioned, giving a sigma value of 0.9; for other errors such as sample receiving time, stat requests, and type of sample, the sigma values were 2.9, 2.6, and 2.8, respectively. For insufficient sample and improper ratio of blood to anticoagulant, the sigma value was 4.3. The Pareto chart shows that, of the errors in requisition forms, about 80% are contributed by roughly 20% of the error types, such as missing diagnosis information. The development of quality indicators and the application of Six Sigma and the Pareto principle are quality measures by which not only the preanalytical phase but the total testing process can be improved.
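A small sketch of the defects-per-million to sigma-level conversion such evaluations use; this assumes the conventional short-term sigma with a 1.5 shift, and the counts are illustrative, so the exact convention in the paper may differ:

```python
# DPMO and sigma level from defect counts (conventional 1.5-sigma shift).
from scipy.stats import norm

def sigma_level(defects, opportunities, shift=1.5):
    dpmo = 1e6 * defects / opportunities
    return norm.ppf(1 - dpmo / 1e6) + shift, dpmo

# e.g. missing diagnosis on 75% of 54,265 requisition forms
level, dpmo = sigma_level(0.75 * 54265, 54265)
print(round(level, 2), int(dpmo))   # ~0.83 sigma, 750000 DPMO
```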
Voltage Imaging of Waking Mouse Cortex Reveals Emergence of Critical Neuronal Dynamics
Scott, Gregory; Fagerholm, Erik D.; Mutoh, Hiroki; Leech, Robert; Sharp, David J.; Shew, Woodrow L.
2014-01-01
Complex cognitive processes require neuronal activity to be coordinated across multiple scales, ranging from local microcircuits to cortex-wide networks. However, multiscale cortical dynamics are not well understood because few experimental approaches have provided sufficient support for hypotheses involving multiscale interactions. To address these limitations, we used, in experiments involving mice, genetically encoded voltage indicator imaging, which measures cortex-wide electrical activity at high spatiotemporal resolution. Here we show that, as mice recovered from anesthesia, scale-invariant spatiotemporal patterns of neuronal activity gradually emerge. We show for the first time that this scale-invariant activity spans four orders of magnitude in awake mice. In contrast, we found that the cortical dynamics of anesthetized mice were not scale invariant. Our results bridge empirical evidence from disparate scales and support theoretical predictions that the awake cortex operates in a dynamical regime known as criticality. The criticality hypothesis predicts that small-scale cortical dynamics are governed by the same principles as those governing larger-scale dynamics. Importantly, these scale-invariant principles also optimize certain aspects of information processing. Our results suggest that during the emergence from anesthesia, criticality arises as information processing demands increase. We expect that, as measurement tools advance toward larger scales and greater resolution, the multiscale framework offered by criticality will continue to provide quantitative predictions and insight on how neurons, microcircuits, and large-scale networks are dynamically coordinated in the brain. PMID:25505314
Application of Six Sigma towards improving surgical outcomes.
Shukla, P J; Barreto, S G; Nadkarni, M S
2008-01-01
Six Sigma is a 'process excellence' tool targeting continuous improvement, achieved by providing a methodology for improving key steps of a process. It is ripe for application in health care, since almost all health care processes require near-zero tolerance for mistakes. The aim of this study is to apply the Six Sigma methodology to a clinical surgical process and to assess the improvement (if any) in outcomes and patient care. The guiding principles of Six Sigma, namely DMAIC (Define, Measure, Analyze, Improve, Control), were used to analyze the impact of the double stapling technique (DST) on improving sphincter preservation rates for rectal cancer. The analysis using the Six Sigma methodology revealed a sigma score of 2.10 in relation to successful sphincter preservation. This score demonstrates an improvement over the previous technique (73% versus the previous 54%). This study represents one of the first clinical applications of Six Sigma in the surgical field. By understanding, accepting, and applying the principles of Six Sigma, we have an opportunity to transfer a very successful management philosophy to facilitate the identification of key steps that can improve outcomes and ultimately patient safety and the quality of surgical care provided.
Ahern, Tracey; Gardner, Anne; Gardner, Glenn; Middleton, Sandy; Della, Phillip
2013-05-01
The final phase of a three phase study analysing the implementation and impact of the nurse practitioner role in Australia (the Australian Nurse Practitioner Project or AUSPRAC) was undertaken in 2009, requiring nurse telephone interviewers to gather information about health outcomes directly from patients and their treating nurse practitioners. A team of several registered nurses was recruited and trained as telephone interviewers. The aim of this paper is to report on development and evaluation of the training process for telephone interviewers. The training process involved planning the content and methods to be used in the training session; delivering the session; testing skills and understanding of interviewers post-training; collecting and analysing data to determine the degree to which the training process was successful in meeting objectives and post-training follow-up. All aspects of the training process were informed by established educational principles. Interrater reliability between interviewers was high for well-validated sections of the survey instrument resulting in 100% agreement between interviewers. Other sections with unvalidated questions showed lower agreement (between 75% and 90%). Overall the agreement between interviewers was 92%. Each interviewer was also measured against a specifically developed master script or gold standard and for this each interviewer achieved a percentage of correct answers of 94.7% or better. This equated to a Kappa value of 0.92 or better. The telephone interviewer training process was very effective and achieved high interrater reliability. We argue that the high reliability was due to the use of well validated instruments and the carefully planned programme based on established educational principles. There is limited published literature on how to successfully operationalise educational principles and tailor them for specific research studies; this report addresses this knowledge gap. Copyright © 2012 Elsevier Ltd. All rights reserved.
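A hedged illustration of the agreement statistics reported above, computing percent agreement and Cohen's kappa for two raters; the answer vectors are invented for the example:

```python
# Percent agreement and Cohen's kappa for two raters on categorical answers.
import numpy as np

def cohens_kappa(a, b):
    a, b = np.asarray(a), np.asarray(b)
    p_o = np.mean(a == b)                       # observed agreement
    cats = np.union1d(a, b)
    p_e = sum(np.mean(a == c) * np.mean(b == c) for c in cats)  # chance agreement
    return (p_o - p_e) / (1 - p_e)

rater1 = ["yes", "yes", "no", "yes", "no", "no", "yes", "yes"]
rater2 = ["yes", "yes", "no", "yes", "no", "yes", "yes", "yes"]
print(np.mean(np.asarray(rater1) == np.asarray(rater2)))  # 0.875 agreement
print(round(cohens_kappa(rater1, rater2), 2))             # ~0.71
```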
Energy and Uncertainty in General Relativity
NASA Astrophysics Data System (ADS)
Cooperstock, F. I.; Dupre, M. J.
2018-03-01
The issue of energy and its potential localizability in general relativity has challenged physicists for more than a century. Many non-invariant measures were proposed over the years but an invariant measure was never found. We discovered the invariant localized energy measure by expanding the domain of investigation from space to spacetime. We note from relativity that the finiteness of the velocity of propagation of interactions necessarily induces indefiniteness in measurements. This is because the elements of actual physical systems being measured as well as their detectors are characterized by entire four-velocity fields, which necessarily leads to information from a measured system being processed by the detector in a spread of time. General relativity adds additional indefiniteness because of the variation in proper time between elements. The uncertainty is encapsulated in a generalized uncertainty principle, in parallel with that of Heisenberg, which incorporates the localized contribution of gravity to energy. This naturally leads to a generalized uncertainty principle for momentum as well. These generalized forms and the gravitational contribution to localized energy would be expected to be of particular importance in the regimes of ultra-strong gravitational fields. We contrast our invariant spacetime energy measure with the standard 3-space energy measure which is familiar from special relativity, appreciating why general relativity demands a measure in spacetime as opposed to 3-space. We illustrate the misconceptions by certain authors of our approach.
Onsager's variational principle in soft matter.
Doi, Masao
2011-07-20
In the celebrated paper on the reciprocal relation for the kinetic coefficients in irreversible processes, Onsager (1931 Phys. Rev. 37 405) extended Rayleigh's principle of the least energy dissipation to general irreversible processes. In this paper, I shall show that this variational principle gives us a very convenient framework for deriving many established equations which describe nonlinear and non-equilibrium phenomena in soft matter, such as phase separation kinetics in solutions, gel dynamics, molecular modeling for viscoelasticity, nemato-hydrodynamics, etc. Onsager's variational principle can therefore be regarded as a solid general basis for soft matter physics.
Rinaldi-Miles, Anna; Quick, Brian L; LaVoie, Nicole R
2014-01-01
Cialdini's (1984) principles of influence were employed to inform the decision-making process with respect to using condoms during casual sex. In the current study, focus groups (n = 9) were conducted to understand the relationship between the six principles of influence (authority, consistency, liking, reciprocity, scarcity, and social proof) and condom use in casual sex relationships. Results revealed that authority, consistency, and social proof were endorsed often as influencing condom use. Gender differences in the endorsement of the principles were also observed. The results speak to how these principles of influence aide the condom decision-making process during these often spontaneous sexual encounters and are discussed with an emphasis on the theoretical and practical implications for using these principles in future health campaigns.
Wavefunction Collapse via a Nonlocal Relativistic Variational Principle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harrison, Alan K.
2012-06-18
Since the origin of quantum theory in the 1920's, some of its practitioners (and founders) have been troubled by some of its features, including indeterminacy, nonlocality and entanglement. The 'collapse' process described in the Copenhagen Interpretation is suspect for several reasons, and the act of 'measurement,' which is supposed to delimit its regime of validity, has never been unambiguously defined. In recent decades, nonlocality and entanglement have been studied energetically, both theoretically and experimentally, and the theory has been reinterpreted in imaginative ways, but many mysteries remain. We propose that it is necessary to replace the theory by one that is explicitly nonlinear and nonlocal, and does not distinguish between measurement and non-measurement regimes. We have constructed such a theory, for which the phase of the wavefunction plays the role of a hidden variable via the process of zitterbewegung. To capture this effect, the theory must be relativistic, even when describing nonrelativistic phenomena. It is formulated as a variational principle, in which Nature attempts to minimize the sum of two spacetime integrals. The first integral tends to drive the solution toward a solution of the standard quantum mechanical wave equation, and also enforces the Born rule of outcome probabilities. The second integral drives the collapse process. We demonstrate that the new theory correctly predicts the possible outcomes of the electron two-slit experiment, including the infamous 'delayed-choice' variant. We observe that it appears to resolve some long-standing mysteries, but introduces new ones, including possible retrocausality (a cause later than its effect). It is not clear whether the new theory is deterministic.
Proceedings of the Second Noncontact Temperature Measurement Workshop
NASA Technical Reports Server (NTRS)
Hale, Robert R. (Editor)
1989-01-01
The state of the art in noncontact temperature measurement (NCTM) technology was reviewed and the NCTM requirements of the microgravity materials processing community were identified. The workshop included technical presentations and discussions which ranged from research on advanced concepts for temperature measurement to laboratory research and development regarding measurement principles and state-of-the-art engineering practices for NCTM methodology in commercial and industrial applications. Technical presentations were made concerning: NCTM needs as perceived by several NASA centers; recent ground-based NCTM research and development in industry, NASA, academia, and selected national laboratories; work-in-progress communications; and technical issues in the implementation of temperature measurement in the space environment to facilitate future U.S. materials science investigations.
Contributions of Science Principles to Teaching: How Science Principles Can Be Used
ERIC Educational Resources Information Center
Henson, Kenneth T.
1974-01-01
Describes the steps involved in using the "principles" approach in teaching science, illustrates the process of using science principles with an example relating to rock formation, and discusses the relevance of this approach to contemporary trends in science teaching. (JR)
Remote detection of carbon monoxide by FTIR for simulating field detection in industrial process
NASA Astrophysics Data System (ADS)
Gao, Qiankun; Liu, Wenqing; Zhang, Yujun; Gao, Mingguang; Xu, Liang; Li, Xiangxian; Jin, Ling
2016-10-01
In order to monitor carbon monoxide in industrial production, we developed a passive gas radiation measurement system based on Fourier transform infrared spectroscopy and carried out an infrared radiation measurement experiment for carbon monoxide detection in a simulated industrial production environment using this system. The principle, conditions, device and data processing method of the experiment are introduced in this paper. To address the problem of light-path jitter in an actual industrial field, we simulated the noise of the industrial environment. We applied the strengths of MATHEMATICA software in graph processing and symbolic computation to the data processing in order to improve the signal-to-noise ratio and suppress noise. Based on the HITRAN database, nonlinear least-squares fitting was used to calculate the CO concentration from the spectra before and after data processing. Comparison of the calculated concentrations shows that processing the data with MATHEMATICA is reliable and necessary in the industrial production environment.
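A hedged sketch of the nonlinear least-squares retrieval step, fitting a toy Beer-Lambert transmittance model with a single Lorentzian line as a stand-in for a HITRAN line list; all parameters are invented for illustration:

```python
# Fit a synthetic CO absorption spectrum to retrieve a column concentration.
import numpy as np
from scipy.optimize import curve_fit

nu = np.linspace(2140, 2180, 400)          # wavenumber axis, cm^-1 (CO band region)

def transmittance(nu, n_col, nu0=2160.0, gamma=0.8, s=1e-3):
    """Toy model: T = exp(-n_col * s * Lorentzian(nu))."""
    line = (gamma / np.pi) / ((nu - nu0) ** 2 + gamma ** 2)
    return np.exp(-n_col * s * line)

rng = np.random.default_rng(0)
measured = transmittance(nu, 500.0) + rng.normal(0, 0.002, nu.size)  # synthetic data

popt, pcov = curve_fit(transmittance, nu, measured, p0=[100.0])
print(popt[0])                              # retrieved concentration, ~500
```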
THE DEVELOPMENT OF AN INSTRUMENT FOR MEASURING THE UNDERSTANDING OF PROFIT-MAXIMIZING PRINCIPLES.
ERIC Educational Resources Information Center
MCCORMICK, FLOYD G.
THE PURPOSE OF THE STUDY WAS TO DEVELOP AN INSTRUMENT FOR MEASURING PROFIT-MAXIMIZING PRINCIPLES IN FARM MANAGEMENT WITH IMPLICATIONS FOR VOCATIONAL AGRICULTURE. PRINCIPLES WERE IDENTIFIED FROM LITERATURE SELECTED BY AGRICULTURAL ECONOMISTS. FORTY-FIVE MULTIPLE-CHOICE QUESTIONS WERE REFINED ON THE BASIS OF RESULTS OF THREE PRETESTS AND…
A compact semiconductor digital interferometer and its applications
NASA Astrophysics Data System (ADS)
Britsky, Oleksander I.; Gorbov, Ivan V.; Petrov, Viacheslav V.; Balagura, Iryna V.
2015-05-01
The possibility of using semiconductor laser interferometers to measure displacements at the nanometer scale was demonstrated. The creation principles of miniature digital Michelson interferometers based on semiconductor lasers were proposed. The advanced processing algorithm for the interferometer quadrature signals was designed. It enabled to reduce restrictions on speed of measured movements. A miniature semiconductor digital Michelson interferometer was developed. Designing of the precision temperature stability system for miniature low-cost semiconductor laser with 0.01ºС accuracy enabled to use it for creation of compact interferometer rather than a helium-neon one. Proper firmware and software was designed for the interferometer signals real-time processing and conversion in to respective shifts. In the result the relative displacement between 0-500 mm was measured with a resolution of better than 1 nm. Advantages and disadvantages of practical use of the compact semiconductor digital interferometer in seismometers for the measurement of shifts were shown.
Approaches to a Quantitative Analytical Description of Low Frequency Sound Absorption in Sea Water,
1980-09-01
medium and is a measure of its chemical compressibility under the influence of the perturbing process. When f = f_r, α = α_r = (K_r·f_r)/2 … If v is the … kHz, and it is a tribute to Thorp and Browning's perspicacity that, in their original report under Thorp's direction [4], they recognized that the … K_i having their usual significance (Section 2) with respect to the particular relaxation process, i, under consideration. While K_i, in principle, can…
Relativistic corrections to electromagnetic heavy quarkonium production
NASA Astrophysics Data System (ADS)
Shtabovenko, Vladyslav
2017-03-01
We report on the calculation [1] of the relativistic O(α_s^0 v^2) corrections to the quarkonium production process e+e- → χcJ + γ in non-relativistic QCD (NRQCD). In our work we incorporate effects from operators that contribute through the sub-leading Fock state |QQ̄g⟩, which were not taken into account by previous studies. We determine the corresponding matching coefficients that should be included in theoretical predictions for the electromagnetic production cross-section of χcJ. This process could, in principle, be measured by the Belle II experiment.
Intelligent Sensors for Atomization Processing of Molten Metals and Alloys
1988-06-01
20ff. 12. Hirleman, Dan E. Particle Sizing by Optical, Nonimaging Techniques. Liquid Particle Size Measurement Techniques, ASTM, 1984, pp. 35ff. 13. … sensors are based on electric, electromagnetic or optical principles, the latter being most developed in fields obviously related to atomization. Optical … beams to observe various interference, diffraction, and heterodyning effects, and to observe, with high signal-to-noise ratio, even weak optical…
ERIC Educational Resources Information Center
Hunnicutt, Sally S.; Grushow, Alexander; Whitnell, Rob
2017-01-01
The principles of process-oriented guided inquiry learning (POGIL) are applied to a binary solid-liquid mixtures experiment. Over the course of two learning cycles, students predict, measure, and model the phase diagram of a mixture of fatty acids. The enthalpy of fusion of each fatty acid is determined from the results. This guided inquiry…
Uncertainty Analysis Principles and Methods
2007-09-01
error source. The Data Processor converts binary-coded numbers to values, performs D/A curve fitting and applies any correction factors that may be … describes the stages or modules involved in the measurement process. We now need to identify all relevant error sources and develop the mathematical…
A New Principle in Physiscs: the Principle "Finiteness", and Some Consequences
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abraham Sternlieb
2010-06-25
In this paper I propose a new principle in physics: the principle of "finiteness". It stems from the definition of physics as a science that deals (among other things) with measurable dimensional physical quantities. Since measurement results, including their errors, are always finite, the principle of finiteness postulates that the mathematical formulation of "legitimate" laws of physics should prevent exactly zero or infinite solutions. Some consequences of the principle of finiteness are discussed, in general, and then more specifically in the fields of special relativity, quantum mechanics, and quantum gravity. The consequences are derived independently of any other theory or principle in physics. I propose "finiteness" as a postulate (like the constancy of the speed of light in vacuum, "c"), as opposed to a notion whose validity has to be corroborated by, or derived theoretically or experimentally from, other facts, theories, or principles.
Code of Federal Regulations, 2013 CFR
2013-07-01
...measurement interference from water vapor, carbon dioxide (CO2), or other species. Also, various schemes may... 3.0 Interferences. 3.1 The NDIR measurement principle is potentially susceptible to interference from water vapor and...
Basic principles of stability.
Egan, William; Schofield, Timothy
2009-11-01
An understanding of the principles of degradation, as well as the statistical tools for measuring product stability, is essential to management of product quality. Key to this is management of vaccine potency. Vaccine shelf life is best managed through determination of a minimum potency release requirement, which helps assure adequate potency throughout expiry. Statistical tools such as least squares regression analysis should be employed to model potency decay. The use of such tools provides an incentive to properly design vaccine stability studies, whereas holding stability measurements to specification presents a disincentive for collecting valuable data. The laws of kinetics, such as Arrhenius behavior, help practitioners design effective accelerated stability programs, which can be utilized to manage stability after a process change. The design of stability studies should be carefully considered, with an eye to minimizing the variability of the stability parameter. In the case of measuring the degradation rate, testing at the beginning and the end of the study improves the precision of this estimate. Additional design considerations such as bracketing and matrixing improve the efficiency of stability evaluation of vaccines.
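The abstract's pairing of least squares regression with a minimum potency release requirement lends itself to a small worked example. The sketch below is a minimal illustration under stated assumptions, not the authors' procedure: the potency data and the minimum specification are hypothetical, and a simple linear decay model is fitted.

```python
import numpy as np

# Hypothetical stability data: months on stability vs. measured potency (%).
# Testing concentrated at the start and end of the study, as the abstract
# recommends, improves the precision of the slope (degradation rate) estimate.
months = np.array([0.0, 1.0, 2.0, 22.0, 23.0, 24.0])
potency = np.array([101.2, 100.8, 100.5, 93.9, 93.6, 93.1])

# Least squares fit of a linear decay model: potency = intercept + slope * t.
slope, intercept = np.polyfit(months, potency, 1)

# Assumed minimum potency release requirement (illustrative value only).
min_potency = 90.0

# Projected time at which mean potency reaches the minimum specification.
shelf_life = (min_potency - intercept) / slope

print(f"degradation rate: {slope:.3f} %/month")
print(f"projected shelf life: {shelf_life:.1f} months")
```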
Application of the coplanar principle to dynamic epidural pressure measurements.
Beck, J; Schettini, A; Salton, R
1984-10-01
The application of the coplanar principle to dynamic epidural pressure measurements was investigated in vitro. The authors used a coplanar pressure-displacement transducer, commonly employed to measure the viscoelastic properties of brain tissue in vivo. The present studies were performed using canine dura and a specially constructed fluid-filled chamber. The accuracy of the technique was assessed by comparing the pressure in the chamber recorded by the coplanar transducer to the pressure measured by a transducer directly vented to the chamber. The results show that the coplanar principle remained valid for dynamic measurements with the transducer under a variety of conditions.
Factual Approach in Decision Making - the Prerequisite of Success in Quality Management
NASA Astrophysics Data System (ADS)
Kučerová, Marta; Škůrková Lestyánszka, Katarína
2013-12-01
In quality management systems, as in other managerial systems, effective decisions must always be based on the analysis of data and information, i.e. based on facts, in accordance with the factual approach principle in quality management. It is therefore necessary to measure and collect data and information about processes. The article presents the results of a survey focused on the application of the factual approach in decision making. It also offers suggestions for improving the application of the principle in business practice. This article was prepared using the research results of VEGA project No. 1/0229/08 "Perspectives of the quality management development in relation to the requirements of market in the Slovak Republic".
First-principles multiple-barrier diffusion theory. The case study of interstitial diffusion in CdTe
Yang, Ji -Hui; Park, Ji -Sang; Kang, Joongoo; ...
2015-02-17
The diffusion of particles in solid-state materials generally involves several sequential thermal-activation processes. However, presently, diffusion coefficient theory only deals with a single barrier, i.e., it lacks an accurate description of multiple-barrier diffusion. Here, we develop a general diffusion coefficient theory for multiple-barrier diffusion. Using our diffusion theory and first-principles calculated hopping rates for each barrier, we calculate the diffusion coefficients of Cd, Cu, Te, and Cl interstitials in CdTe for their full multiple-barrier diffusion pathways. We find that the calculated diffusivity agrees well with experimental measurements, thus justifying our theory, which is general for many other systems.
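The abstract describes combining per-barrier hopping rates into one diffusion coefficient. The paper's general theory is not reproduced in the abstract, so the sketch below shows only a common toy approximation under explicit assumptions: Arrhenius hopping rates for each barrier, combined resistor-like for a sequential pathway; all numerical values are hypothetical.

```python
import numpy as np

K_B = 8.617e-5  # Boltzmann constant, eV/K

def hopping_rate(prefactor_hz: float, barrier_ev: float, temp_k: float) -> float:
    """Arrhenius hopping rate for a single barrier: k = nu * exp(-E / kT)."""
    return prefactor_hz * np.exp(-barrier_ev / (K_B * temp_k))

def series_effective_rate(rates: list) -> float:
    """Toy series combination of sequential hops (resistor-like):
    1/k_eff = sum(1/k_i). The paper's actual theory is more general."""
    return 1.0 / sum(1.0 / k for k in rates)

temp = 600.0                      # K, hypothetical annealing temperature
barriers_ev = [0.45, 0.30, 0.52]  # hypothetical sequential barriers
prefactors = [1e13, 1e13, 1e13]   # typical attempt frequencies, Hz

rates = [hopping_rate(nu, e, temp) for nu, e in zip(prefactors, barriers_ev)]
k_eff = series_effective_rate(rates)

# Random-walk estimate D ~ a^2 * k_eff / 6 with a hypothetical hop length a.
hop_length_cm = 3e-8
diffusivity = hop_length_cm**2 * k_eff / 6.0
print(f"effective rate: {k_eff:.3e} 1/s, D ~ {diffusivity:.3e} cm^2/s")
```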
NASA Astrophysics Data System (ADS)
Qiu, Liming; Shen, Rongxi; Song, Dazhao; Wang, Enyuan; Liu, Zhentang; Niu, Yue; Jia, Haishan; Xia, Shankui; Zheng, Xiangxin
2017-12-01
An accurate and non-destructive method for evaluating the impact range of hydraulic measures in coal seams is urgently needed. To meet this need, a theoretical study and field test are presented using the direct current (DC) method to evaluate the impact range of coal seam hydraulic measures. We first analyzed the law of the apparent resistivity response of an abnormal conductive zone in a coal seam, then investigated the principle of non-destructive testing of the coal seam hydraulic measure impact range using the DC method, and used an accurate evaluation method based on the apparent resistivity cloud chart. Finally, taking hydraulic fracturing and hydraulic flushing as examples, field experiments were carried out in coal mines to evaluate the impact ranges. The results showed that: (1) during hydraulic fracturing, coal conductivity was enhanced by high-pressure water in the coal seam, and after hydraulic fracturing, the boundary of the apparent resistivity decrease area marked the boundary of the impact range. (2) During hydraulic flushing, coal conductivity was reduced by holes and cracks in the coal seam, and after hydraulic flushing, the boundary of the apparent resistivity increase area marked the boundary of the impact range. (3) After the implementation of the hydraulic measures, there may be some blind zones in the coal seam; in hydraulic fracturing blind zones, the apparent resistivity increased or stayed constant, while in hydraulic flushing blind zones, the apparent resistivity decreased or stayed constant. The DC method realized a comprehensive and non-destructive evaluation of the impact range of the hydraulic measures, and greatly reduced the time and cost of evaluation.
Noncontact vibration measurements using magnetoresistive sensing elements
NASA Astrophysics Data System (ADS)
Tomassini, R.; Rossi, G.
2016-06-01
Contactless instrumentation is increasingly used in turbomachinery testing thanks to its non-intrusive character and the possibility of monitoring all components of the machine at the same time. The performance of blade tip timing (BTT) measurement systems, used for noncontact turbine blade vibration measurements, in terms of uncertainty and resolution is strongly affected by sensor characteristics and processing methods. The sensors used for BTT generate pulses used for precise measurement of turbine blade time of arrival. Nowadays, proximity sensors used in this application are based on optical, capacitive, eddy current and microwave measuring principles. Pressure sensors have also been tried. This paper summarizes the results achieved using a novel instrumentation based on magnetoresistive sensing elements. The characterization of the novel probe has already been published. The measurement system was validated in test benches and in a real jet engine, comparing different sensor technologies, and the whole instrumentation was improved. The work presented in this paper focuses on current developments; in particular, attention is given to the data processing software and new sensor configurations.
Calibrator measurement system for motor vehicle headlamp testers based on machine vision
NASA Astrophysics Data System (ADS)
Pan, Yue; Zhang, Fan; Xu, Xi-ping; Zheng, Zhe
2014-09-01
With the development of photoelectric detection technology, machine vision is finding wider use in industry. This paper introduces a calibrator measuring system for motor vehicle headlamp testers, the core of which is a CCD image sampling system. It presents the measuring principle for the optical axis angle and light intensity, and establishes the linear relationship between the calibrator's facula (spot) illumination and the image plane illumination. The paper provides an important specification of the CCD imaging system. Image processing with MATLAB yields the flare's geometric midpoint and average gray level. By fitting the data with the method of least squares, a regression equation relating illumination and gray level is obtained. The paper analyzes the errors in the experimental results of the measurement system and gives the combined standard uncertainty and the error sources for the optical axis angle. The average measuring accuracy of the optical axis angle is controlled within 40''. The whole testing process uses digital means instead of relying on artificial factors, giving higher accuracy and better repeatability than other measuring systems.
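The regression of illumination against gray level can be made concrete with a short sketch. The calibration points below are hypothetical; the fit follows the least squares approach the abstract names and relies on the stated linear relationship between spot illumination and image-plane illumination.

```python
import numpy as np

# Hypothetical calibration points: known illuminance (lux) vs. the mean
# gray level extracted from the CCD image of the flare.
illuminance = np.array([20.0, 40.0, 60.0, 80.0, 100.0])
mean_gray = np.array([31.0, 58.5, 87.2, 114.8, 143.1])

# Least squares line: gray = a * illuminance + b (the regression equation).
a, b = np.polyfit(illuminance, mean_gray, 1)

# Invert the regression to estimate illuminance from a new measurement.
measured_gray = 95.0
estimated_lux = (measured_gray - b) / a
print(f"gray = {a:.3f} * lux + {b:.3f}; estimated illuminance: {estimated_lux:.1f} lux")
```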
NASA Astrophysics Data System (ADS)
Putov, A. V.; Kopichev, M. M.; Ignatiev, K. V.; Putov, V. V.; Stotckaia, A. D.
2017-01-01
This paper discusses a technique that realizes a new method of runway friction coefficient measurement based on the proposed principle of measuring wheel braking control, which imitates antilock braking modes close to the real braking modes of aircraft chassis during landing as realized by aircraft anti-skid systems. A towed measuring device realizing this technique is also described. To increase the repeatability of the electromechanical braking imitation system, an adaptive sideslip (brake) control system is proposed. Based on the Burckhardt model and additive random processes, several mathematical models were created that describe the friction coefficient distribution along the airstrip with different qualitative properties. Computer models of friction coefficient measurement were designed, and for the first time the correlation between friction coefficient measurement results and the shape, intensity and cycle frequency of the measuring wheel's antilock braking modes was studied. Draft engineering documentation was prepared, and a prototype of the latest generation measuring device is ready for use. The measuring device was tested on an autonomous electromechanical laboratory treadmill bench. The experiments confirmed the effectiveness of imitating antilock braking modes for solving the problem of correlating runway friction coefficient measurements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kegel, T.M.
Calibration laboratories are faced with the need to become accredited or registered to one or more quality standards. One requirement common to all of these standards is the need to have in place a measurement assurance program. What is a measurement assurance program? Brian Belanger, in Measurement Assurance Programs: Part 1, describes it as a "quality assurance program for a measurement process that quantifies the total uncertainty of the measurements (both random and systematic components of error) with respect to national or designated standards and demonstrates that the total uncertainty is sufficiently small to meet the user's requirements." Rolf Schumacher is more specific in Measurement Assurance in Your Own Laboratory. He states, "Measurement assurance is the application of broad quality control principles to measurements of calibrations." Here, the focus is on one important part of any measurement assurance program: implementation of statistical process control (SPC). Paraphrasing Juran's Quality Control Handbook, a process is in statistical control if the only observed variations are those that can be attributed to random causes. Conversely, a process that exhibits variations due to assignable causes is not in a state of statistical control. Finally, Carrol Croarkin states, "In the measurement assurance context the measurement algorithm including instrumentation, reference standards and operator interactions is the process that is to be controlled, and its direct product is the measurement per se. The measurements are assumed to be valid if the measurement algorithm is operating in a state of control." Implicit in this statement is the important fact that an out-of-control process cannot produce valid measurements.
NASA Astrophysics Data System (ADS)
Fahy, Stephen; Murray, Eamonn
2015-03-01
Using first-principles electronic structure methods, we calculate the induced force on the Eg (zone-centre transverse optical) phonon mode in bismuth immediately after absorption of an ultrafast pulse of polarized light. To compare the results with recent ultrafast, time-resolved x-ray diffraction experiments, we include the decay of the force due to carrier scattering, as measured in optical Raman scattering experiments, and simulate the optical absorption process, depth-dependent atomic driving forces, and x-ray diffraction in the experimental geometry. We find excellent agreement between the theoretical predictions and the observed oscillations of the x-ray diffraction signal, indicating that first-principles theory of optical absorption is well suited to the calculation of initial atomic driving forces in photo-excited materials following ultrafast excitation. This work is supported by Science Foundation Ireland (Grant No. 12/IA/1601) and the EU Commission under the Marie Curie Incoming International Fellowships (Grant No. PIIF-GA-2012-329695).
Analysis of multiple pulse NMR in solids. III
NASA Technical Reports Server (NTRS)
Burum, D. P.; Rhim, W. K.
1979-01-01
The paper introduces principles which greatly simplify the process of designing and analyzing compound pulse cycles. These principles are demonstrated by applying them to the design and analysis of several cycles, including a 52-pulse cycle; this pulse cycle combines six different REV-8 cycles and has substantially more resolving power than previously available techniques. Also, a new 24-pulse cycle is introduced which combines three different REV-8 cycles and has a resolving ability equivalent to that of the 52-pulse cycle. The principle of pulse-cycle decoupling provides a method for systematically combining pulse groups into compound cycles in order to achieve enhanced performance. This method is illustrated by a logical development from the two-pulse solid echo sequence to the WAHUHA (Waugh et al., 1968), the REV-8, and the new 24-pulse and 52-pulse cycles, along with the 14-pulse and 12-pulse cycles. Proton chemical shift tensor components for several organic solids, measured by using the 52-pulse cycle, are reported without detailed discussion.
Emergency medicine: an operations management view.
Soremekun, Olan A; Terwiesch, Christian; Pines, Jesse M
2011-12-01
Operations management (OM) is the science of understanding and improving business processes. For the emergency department (ED), OM principles can be used to reduce and alleviate the effects of crowding. A fundamental principle of OM is the waiting time formula, which has clear implications in the ED given that waiting time is fundamental to patient-centered emergency care. The waiting time formula consists of the activity time (how long it takes to complete a process), the utilization rate (the proportion of time a particular resource, such as staff, is working), and two measures of variation: the variation in patient interarrival times and the variation in patient processing times. Understanding the waiting time formula is important because it presents the fundamental parameters that can be managed to reduce waiting times and length of stay; a standard form is sketched below. An additional useful OM principle that is applicable to the ED is the efficient frontier. The efficient frontier compares the performance of EDs with respect to two dimensions: responsiveness (i.e., 1/wait time) and utilization rates. Some EDs may be "on the frontier," maximizing their responsiveness at their given utilization rates. However, most EDs likely have opportunities to move toward the frontier. Increasing capacity is a movement along the frontier; to truly move toward the frontier (i.e., improve responsiveness at a fixed capacity), we articulate three possible options: eliminating waste, reducing variability, or increasing flexibility. When conceptualizing ED crowding interventions, these are the major strategies to consider. © 2011 by the Society for Academic Emergency Medicine.
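The abstract names the ingredients of the waiting time formula without writing it out. A standard form, under the usual queueing assumptions, is Kingman's approximation for a single queue; the sketch below is illustrative only, and the ED numbers are hypothetical.

```python
def kingman_wait(activity_time: float, utilization: float,
                 cv_interarrival: float, cv_processing: float) -> float:
    """Approximate time in queue for a single-server queue (Kingman's formula):
    wait ~ [u / (1 - u)] * [(CVa^2 + CVp^2) / 2] * activity_time."""
    assert 0 < utilization < 1, "formula requires utilization below 100%"
    variability = (cv_interarrival**2 + cv_processing**2) / 2.0
    return (utilization / (1.0 - utilization)) * variability * activity_time

# Hypothetical ED step: 20-minute activity time, 85% staff utilization,
# high variability in both patient arrivals and processing.
wait = kingman_wait(activity_time=20.0, utilization=0.85,
                    cv_interarrival=1.0, cv_processing=1.2)
print(f"expected wait: {wait:.0f} minutes")

# Reducing variability (one of the abstract's three strategies) shortens
# the wait at the same utilization:
print(f"with lower variability: {kingman_wait(20.0, 0.85, 0.7, 0.8):.0f} minutes")
```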
Nearfield acoustic holography. I - Theory of generalized holography and the development of NAH
NASA Technical Reports Server (NTRS)
Maynard, J. D.; Williams, E. G.; Lee, Y.
1985-01-01
Because its underlying principles are so fundamental, holography has been studied and applied in many areas of science. Recently, a technique has been developed which takes the maximum advantage of the fundamental principles and extracts much more information from a hologram than is customarily associated with such a measurement. In this paper the fundamental principles of holography are reviewed, and a sound radiation measurement system, called nearfield acoustic holography (NAH), which fully exploits the fundamental principles, is described.
The development of a quality appraisal tool for studies of diagnostic reliability (QAREL).
Lucas, Nicholas P; Macaskill, Petra; Irwig, Les; Bogduk, Nikolai
2010-08-01
In systematic reviews of the reliability of diagnostic tests, no quality assessment tool has been used consistently. The aim of this study was to develop a specific quality appraisal tool for studies of diagnostic reliability. Key principles for the quality of studies of diagnostic reliability were identified with reference to epidemiologic principles, existing quality appraisal checklists, and the Standards for Reporting of Diagnostic Accuracy (STARD) and Quality Assessment of Diagnostic Accuracy Studies (QUADAS) resources. Specific items that encompassed each of the principles were developed. Experts in diagnostic research provided feedback on the items that were to form the appraisal tool. This process was iterative and continued until consensus among experts was reached. The Quality Appraisal of Reliability Studies (QAREL) checklist includes 11 items that explore seven principles. Items cover the spectrum of subjects, spectrum of examiners, examiner blinding, order effects of examination, suitability of the time interval among repeated measurements, appropriate test application and interpretation, and appropriate statistical analysis. QAREL has been developed as a specific quality appraisal tool for studies of diagnostic reliability. The reliability of this tool in different contexts needs to be evaluated. Copyright (c) 2010 Elsevier Inc. All rights reserved.
A Multistep Equilibria-Redox-Complexation Demonstration to Illustrate Le Chatelier's Principle.
ERIC Educational Resources Information Center
Berger, Tomas G.; Mellon, Edward K.
1996-01-01
Describes a process that can be used to illustrate a number of chemical principles including Le Chatelier's principle, redox chemistry, equilibria versus steady state situations, and solubility of species. (JRH)
Improving preanalytic processes using the principles of lean production (Toyota Production System).
Persoon, Thomas J; Zaleski, Sue; Frerichs, Janice
2006-01-01
The basic technologies used in preanalytic processes for chemistry tests have been mature for a long time, and improvements in preanalytic processes have lagged behind improvements in analytic and postanalytic processes. We describe our successful efforts to improve chemistry test turnaround time from a central laboratory by improving preanalytic processes, using existing resources and the principles of lean production. Our goal is to report 80% of chemistry tests in less than 1 hour and to no longer recognize a distinction between expedited and routine testing. We used principles of lean production (the Toyota Production System) to redesign preanalytic processes. The redesigned preanalytic process has fewer steps and uses 1-piece flow to move blood samples through the accessioning, centrifugation, and aliquoting processes. Median preanalytic processing time was reduced from 29 to 19 minutes, and the laboratory met the goal of reporting 80% of chemistry results in less than 1 hour for 11 consecutive months.
NASA Astrophysics Data System (ADS)
Docktor, Jennifer L.; Dornfeld, Jay; Frodermann, Evan; Heller, Kenneth; Hsu, Leonardo; Jackson, Koblar Alan; Mason, Andrew; Ryan, Qing X.; Yang, Jie
2016-06-01
Problem solving is a complex process valuable in everyday life and crucial for learning in the STEM fields. To support the development of problem-solving skills it is important for researchers and curriculum developers to have practical tools that can measure the difference between novice and expert problem-solving performance in authentic classroom work. It is also useful if such tools can be employed by instructors to guide their pedagogy. We describe the design, development, and testing of a simple rubric to assess written solutions to problems given in undergraduate introductory physics courses. In particular, we present evidence for the validity, reliability, and utility of the instrument. The rubric identifies five general problem-solving processes and defines the criteria to attain a score in each: organizing problem information into a Useful Description, selecting appropriate principles (Physics Approach), applying those principles to the specific conditions in the problem (Specific Application of Physics), using Mathematical Procedures appropriately, and displaying evidence of an organized reasoning pattern (Logical Progression).
Calibration and Data Processing in Gas Chromatography Combustion Isotope Ratio Mass Spectrometry
Zhang, Ying; Tobias, Herbert J.; Sacks, Gavin L.; Brenna, J. Thomas
2013-01-01
Compound-specific isotope analysis (CSIA) by gas chromatography combustion isotope ratio mass spectrometry (GCC-IRMS) is a powerful technique for the sourcing of substances, such as determination of the geographic or chemical origin of drugs and food adulteration, and it is especially invaluable as a confirmatory tool for detection of the use of synthetic steroids in competitive sport. We review here principles and practices for data processing and calibration of GCC-IRMS data with consideration to anti-doping analyses, with a focus on carbon isotopic analysis (13C/12C). After a brief review of peak definition, the isotopologue signal reduction methods of summation, curve-fitting, and linear regression are described and reviewed. Principles for isotopic calibration are considered in the context of the Δ13C = δ13C_M − δ13C_E difference measurements required for establishing adverse analytical findings for metabolites relative to endogenous reference compounds. Considerations for the anti-doping analyst are reviewed. PMID:22362612
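The Δ13C difference measurement builds on standard delta notation, which the sketch below spells out. The reference ratio is the commonly used VPDB value, and the sample ratios for the metabolite and the endogenous reference compound are hypothetical.

```python
R_VPDB = 0.0111802  # commonly used 13C/12C ratio of the VPDB standard

def delta13c_permil(r_sample: float) -> float:
    """Delta notation: deviation of a 13C/12C ratio from VPDB, in per mil."""
    return (r_sample / R_VPDB - 1.0) * 1000.0

# Hypothetical isotope ratios for a metabolite (M) and an endogenous
# reference compound (E) recovered from chromatographic peaks.
delta_m = delta13c_permil(0.0108900)
delta_e = delta13c_permil(0.0110600)

# The difference measurement described in the abstract.
big_delta = delta_m - delta_e
print(f"d13C_M = {delta_m:.2f} permil, d13C_E = {delta_e:.2f} permil")
print(f"Delta13C = {big_delta:.2f} permil")
```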
Merino, Giselle Schmidt Alves Díaz; Teixeira, Clarissa Stefani; Schoenardie, Rodrigo Petry; Merino, Eugenio Andrés Diáz; Gontijo, Leila Amaral
2012-01-01
In product design, human factors are considered an element of differentiation, given that today's consumer demands are increasing. Safety, wellbeing, satisfaction, health, effectiveness, efficiency, and other aspects must be effectively incorporated into the product development process. This work proposes a usability assessment model that can be incorporated as an assessment tool. The methodological approach is organized in two stages. First, a literature review focuses specifically on usability and developing user-centred products. Then, a usability model named Usa-Design (U-D©) is presented. It consists of four phases: understanding the use context; a preliminary usability assessment (efficiency/effectiveness/satisfaction); assessment of usability principles; and results. U-D© is modular and flexible, allowing the principles used in Phase 3 to be changed according to the needs and scenario of each situation. With qualitative/quantitative measurement scales that are easy to understand and apply, the model's results are viable and applicable throughout the product development process.
Developing an ultrasound correlation velocimetry system
NASA Astrophysics Data System (ADS)
Surup, Gerrit; White, Christopher; UNH Team
2011-11-01
The process of building an ultrasound correlation velocimetry (UCV) system by integrating a commercial medical ultrasound with a PC running commercial PIV software is described, and preliminary validation measurements in pipe flow using UCV and optical particle image velocimetry (PIV) are reported. In its principle of operation, UCV is similar to the technique of PIV, differing only in the image acquisition process. The benefits of UCV are that it does not require optical access to the flow field and can be used for measuring flows of opaque fluids. The limitations of UCV are the inherently low frame rates (limited by the imaging capabilities of the commercial ultrasound system) and low spatial resolution, which limit the range of velocities and transient flow behavior that can be measured. The support of the NSF (CBET0846359, grant monitor Horst Henning Winter) is gratefully acknowledged.
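Since UCV differs from PIV only in image acquisition, the displacement estimation step is the same: cross-correlate an interrogation window between two frames and locate the correlation peak. The sketch below is a minimal, hypothetical version of that step, not the commercial software's implementation.

```python
import numpy as np
from scipy.signal import fftconvolve

def window_displacement(win_a: np.ndarray, win_b: np.ndarray):
    """Estimate integer-pixel displacement of win_b relative to win_a by
    locating the peak of their cross-correlation (the core PIV/UCV step)."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    # Cross-correlation computed as convolution with a flipped kernel.
    corr = fftconvolve(b, a[::-1, ::-1], mode="full")
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    dy = peak[0] - (win_a.shape[0] - 1)
    dx = peak[1] - (win_a.shape[1] - 1)
    return dy, dx

# Synthetic test: a speckle pattern shifted by (3, 5) pixels.
rng = np.random.default_rng(0)
frame = rng.random((64, 64))
shifted = np.roll(frame, shift=(3, 5), axis=(0, 1))
print(window_displacement(frame, shifted))  # expected: (3, 5)
```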
Zhang, Jiwei; Di, Jianglei; Li, Ying; Xi, Teli; Zhao, Jianlin
2015-10-19
We present a method for dynamically measuring refractive index distributions over a large range based on the combination of digital holographic interferometry and total internal reflection. A series of holograms, carrying the index information of mixed liquids adhered to a total reflection prism surface, is recorded with a CCD during the diffusion process. Phase shift differences of the reflected light are reconstructed using the principle of double-exposure holographic interferometry. From the relationship between the reflection phase shift difference and the liquid index, two-dimensional index distributions can be obtained directly, assuming that the index of the air near the prism surface is constant. The proposed method can also be applied to measure the index of solid media and to monitor index variations during chemical reaction processes.
Eremenco, Sonya; Pease, Sheryl; Mann, Sarah; Berry, Pamela
2017-01-01
This paper describes the rationale and goals of the Patient-Reported Outcome (PRO) Consortium's instrument translation process. The PRO Consortium has developed a number of novel PRO measures which are in the process of qualification by the U.S. Food and Drug Administration (FDA) for use in clinical trials where endpoints based on these measures would support product labeling claims. Given the importance of FDA qualification of these measures, the PRO Consortium's Process Subcommittee determined that a detailed linguistic validation (LV) process was necessary to ensure that all translations of Consortium-developed PRO measures are performed using a standardized approach with the rigor required to meet regulatory and pharmaceutical industry expectations, as well as having a clearly defined instrument translation process that the translation industry can support. The consensus process involved gathering information about current best practices from 13 translation companies with expertise in LV, consolidating the findings to generate a proposed process, and obtaining iterative feedback from the translation companies and PRO Consortium member firms on the proposed process in two rounds of review in order to update existing principles of good practice in LV and to provide sufficient detail for the translation process to ensure consistency across PRO Consortium measures, sponsors, and translation companies. The consensus development resulted in a 12-step process that outlines universal and country-specific new translation approaches, as well as country-specific adaptations of existing translations. The PRO Consortium translation process will play an important role in maintaining the validity of the data generated through these measures by ensuring that they are translated by qualified linguists following a standardized and rigorous process that reflects best practice.
1994-07-10
TEMPUS, an electromagnetic levitation facility that allows containerless processing of metallic samples in microgravity, first flew on the IML-2 Spacelab mission. The principle of electromagnetic levitation is commonly used in ground-based experiments to melt metallic samples and then cool the melts below their freezing points without solidification occurring. The TEMPUS operation is controlled by its own microprocessor system, although commands may be sent remotely from the ground and real-time adjustments may be made by the crew. Two video cameras, a two-color pyrometer for measuring sample temperatures, and a fast infrared detector for monitoring solidification spikes will be mounted on the process chamber to facilitate observation and analysis. In addition, a dedicated high-resolution video camera can be attached to TEMPUS to measure the sample volume precisely.
Analysis of one dimension migration law from rainfall runoff on urban roof
NASA Astrophysics Data System (ADS)
Weiwei, Chen
2017-08-01
Research was conducted on the hydrological and water quality processes under natural rainfall conditions, and water samples were collected and analyzed. The pollutants included SS, COD and TN. Based on the mass balance principle, a one-dimensional migration model was built for rainfall runoff pollution on the surface. The difference equation was developed according to the finite difference method and solved by applying the Newton iteration method. The simulated pollutant concentration process was consistent with the measured values, and the Nash-Sutcliffe coefficient was higher than 0.80. The model has good practicability, providing evidence for effectively utilizing urban rainfall resources, developing management technologies and measures for non-point source pollution, sponge city construction, and so on.
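The abstract judges model fit with the Nash-Sutcliffe coefficient, which the sketch below computes from its standard definition; the concentration series are hypothetical.

```python
import numpy as np

def nash_sutcliffe(observed, simulated) -> float:
    """Nash-Sutcliffe efficiency: 1 - sum((obs - sim)^2) / sum((obs - mean)^2).
    1.0 is a perfect fit; values above ~0.8 indicate a good hydrological model."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    residual = np.sum((observed - simulated) ** 2)
    variance = np.sum((observed - observed.mean()) ** 2)
    return 1.0 - residual / variance

# Hypothetical runoff pollutant concentrations (mg/L) over one event.
obs = np.array([120.0, 95.0, 70.0, 52.0, 40.0, 33.0])
sim = np.array([115.0, 99.0, 73.0, 49.0, 42.0, 30.0])
print(f"NSE = {nash_sutcliffe(obs, sim):.3f}")
```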
Development and Validation of Instruments to Measure Learning of Expert-Like Thinking
NASA Astrophysics Data System (ADS)
Adams, Wendy K.; Wieman, Carl E.
2011-06-01
This paper describes the process for creating and validating an assessment test that measures the effectiveness of instruction by probing how well that instruction causes students in a class to think like experts about specific areas of science. The design principles and process are laid out and it is shown how these align with professional standards that have been established for educational and psychological testing and the elements of assessment called for in a recent National Research Council study on assessment. The importance of student interviews for creating and validating the test is emphasized, and the appropriate interview procedures are presented. The relevance and use of standard psychometric statistical tests are discussed. Additionally, techniques for effective test administration are presented.
Applying Statistical Process Control to Clinical Data: An Illustration.
ERIC Educational Resources Information Center
Pfadt, Al; And Others
1992-01-01
Principles of statistical process control are applied to a clinical setting through the use of control charts to detect changes, as part of treatment planning and clinical decision-making processes. The logic of control chart analysis is derived from principles of statistical inference. Sample charts offer examples of evaluating baselines and…
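A minimal illustration of the control chart logic the abstract describes: compute limits from a baseline and flag later points outside them. The data are hypothetical, and the sketch uses simple three-sigma limits on individual observations rather than any particular charting convention from the article.

```python
import numpy as np

def three_sigma_limits(baseline: np.ndarray):
    """Center line and control limits: mean +/- 3 standard deviations
    of the baseline observations."""
    center = baseline.mean()
    sigma = baseline.std(ddof=1)
    return center, center - 3 * sigma, center + 3 * sigma

# Hypothetical baseline of a weekly clinical measure, then new observations.
baseline = np.array([14.0, 12.0, 15.0, 13.0, 14.0, 12.0, 13.0, 15.0])
new_points = np.array([13.0, 16.0, 22.0, 12.0])

center, lcl, ucl = three_sigma_limits(baseline)
for week, x in enumerate(new_points, start=1):
    flag = "signal" if (x < lcl or x > ucl) else "in control"
    print(f"week {week}: {x:.1f} -> {flag}")
```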
Single-Atom Demonstration of the Quantum Landauer Principle
NASA Astrophysics Data System (ADS)
Yan, L. L.; Xiong, T. P.; Rehan, K.; Zhou, F.; Liang, D. F.; Chen, L.; Zhang, J. Q.; Yang, W. L.; Ma, Z. H.; Feng, M.
2018-05-01
One of the outstanding challenges to information processing is the eloquent suppression of energy consumption in the execution of logic operations. The Landauer principle sets an energy constraint in deletion of a classical bit of information. Although some attempts have been made to experimentally approach the fundamental limit restricted by this principle, exploring the Landauer principle in a purely quantum mechanical fashion is still an open question. Employing a trapped ultracold ion, we experimentally demonstrate a quantum version of the Landauer principle, i.e., an equality associated with the energy cost of information erasure in conjunction with the entropy change of the associated quantized environment. Our experimental investigation substantiates an intimate link between information thermodynamics and quantum candidate systems for information processing.
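The classical bound referenced by the abstract is easy to state numerically: erasing one bit costs at least k_B T ln 2 of energy. The sketch below evaluates this limit; the temperatures are illustrative.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)

def landauer_limit_joules(temp_k: float) -> float:
    """Minimum energy to erase one bit of information: k_B * T * ln 2."""
    return K_B * temp_k * math.log(2)

# Room temperature, liquid helium, and a millikelvin trapped-ion regime.
for temp in (300.0, 4.2, 1e-3):
    print(f"T = {temp:g} K: {landauer_limit_joules(temp):.3e} J per bit")
```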
WKB analysis of relativistic Stern–Gerlach measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palmer, Matthew C., E-mail: m.palmer@physics.usyd.edu.au; Takahashi, Maki, E-mail: m.takahashi@physics.usyd.edu.au; Westman, Hans F., E-mail: hwestman74@gmail.com
2013-09-15
Spin is an important quantum degree of freedom in relativistic quantum information theory. This paper provides a first-principles derivation of the observable corresponding to a Stern–Gerlach measurement with relativistic particle velocity. The specific mathematical form of the Stern–Gerlach operator is established using the transformation properties of the electromagnetic field. To confirm that this is indeed the correct operator we provide a detailed analysis of the Stern–Gerlach measurement process. We do this by applying a WKB approximation to the minimally coupled Dirac equation describing an interaction between a massive fermion and an electromagnetic field. Making use of the superposition principle we show that the +1 and −1 spin eigenstates of the proposed spin operator are split into separate packets due to the inhomogeneity of the Stern–Gerlach magnetic field. The operator we obtain is dependent on the momentum between particle and Stern–Gerlach apparatus, and is mathematically distinct from two other commonly used operators. The consequences for quantum tomography are considered. Highlights: •Derivation of the spin observable for a relativistic Stern–Gerlach measurement. •Relativistic model of spin measurement using WKB approximation of Dirac equation. •The derived spin operator is distinct from two other commonly used operators. •Consequences for quantum tomography are considered.
Point-of-care optical tool to detect early stage of hemorrhage and shock
NASA Astrophysics Data System (ADS)
Gurjar, Rajan S.; Riccardi, Suzannah L.; Johnson, Blair D.; Johnson, Christopher P.; Paradis, Norman A.; Joyner, Michael J.; Wolf, David E.
2014-02-01
There is a critical unmet clinical need for a device that can monitor and predict the onset of shock: hemorrhagic shock or bleeding to death, septic shock or systemic infection, and cardiogenic shock or blood flow and tissue oxygenation impairment due to heart attack. Together these represent 141 M patients per year. We have developed a monitor for shock based on measuring blood flow in peripheral (skin) capillary beds using diffuse correlation spectroscopy, a form of dynamic light scattering, and have demonstrated proof-of-principle both in pigs and humans. Our results show that skin blood flow measurement, either alone or in conjunction with other hemodynamic properties such as heart rate variability, pulse pressure variability, and tissue oxygenation, can meet this unmet need in a small self-contained patch-like device in conjunction with a hand-held processing unit. In this paper we describe and discuss the experimental work and the multivariate statistical analysis performed to demonstrate proof-of-principle of the concept.
No Quantum Realization of Extremal No-Signaling Boxes
NASA Astrophysics Data System (ADS)
Ramanathan, Ravishankar; Tuziemski, Jan; Horodecki, Michał; Horodecki, Paweł
2016-07-01
The study of quantum correlations is important for fundamental reasons as well as for quantum communication and information processing tasks. On the one hand, it is of tremendous interest to derive the correlations produced by measurements on separated composite quantum systems from within the set of all correlations obeying the no-signaling principle of relativity, by means of information-theoretic principles. On the other hand, an important ongoing research program concerns the formulation of device-independent cryptographic protocols based on quantum nonlocal correlations for the generation of secure keys, and the amplification and expansion of random bits against general no-signaling adversaries. In both these research programs, a fundamental question arises: Can any measurements on quantum states realize the correlations present in pure extremal no-signaling boxes? Here, we answer this question in full generality showing that no nontrivial (not local realistic) extremal boxes of general no-signaling theories can be realized in quantum theory. We then explore some important consequences of this fact.
The Microscope Space Mission and the In-Orbit Calibration Plan for its Instrument
NASA Astrophysics Data System (ADS)
Levy, Agnès; Touboul, Pierre; Rodrigues, Manuel; Hardy, Émilie; Métris, Gilles; Robert, Alain
2015-01-01
The MICROSCOPE space mission aims at testing the Equivalence Principle (EP) with an accuracy of 10^-15. This principle is one of the foundations of the theory of General Relativity; it states the equivalence between gravitational and inertial mass. The test is based on the precise measurement of a gravitational signal by a differential electrostatic accelerometer which includes two cylindrical test masses made of different materials. The accelerometers constitute the payload accommodated on board a drag-free microsatellite which is controlled to be inertial or rotating about the normal to the orbital plane. The acceleration estimates used for the EP test are disturbed by the instrument's physical parameters and by the instrument environment conditions on board the satellite. These parameters are partially measured in ground tests or during the integration of the instrument in the satellite (alignment). Nevertheless, the ground evaluations are not sufficient with respect to the EP test accuracy objectives. An in-orbit calibration is therefore needed to characterize them finely. The calibration process for each parameter has been defined.
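For context on the 10^-15 target, EP tests are conventionally quoted in terms of the Eötvös parameter, the normalized difference of the free-fall accelerations of the two test masses. The sketch below computes it from hypothetical differential accelerometer readings; this is standard EP bookkeeping, not the mission's actual processing chain.

```python
def eotvos_parameter(accel_a: float, accel_b: float) -> float:
    """Eötvös parameter: eta = 2 * (a_A - a_B) / (a_A + a_B), the normalized
    difference of the free-fall accelerations of two test masses."""
    return 2.0 * (accel_a - accel_b) / (accel_a + accel_b)

# Hypothetical accelerations (m/s^2) of the two cylindrical test masses
# toward Earth at the satellite's altitude.
g_orbit = 7.9
delta = 4.0e-15 * g_orbit  # hypothetical tiny EP-violating difference
eta = eotvos_parameter(g_orbit + delta / 2, g_orbit - delta / 2)
print(f"eta = {eta:.2e}")  # ~4e-15, just above the 1e-15 mission target
```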
Design for the Maintainer: Projecting Maintenance Performance from Design Characteristics.
1981-07-01
of Kahneman and Tversky (Tversky & Kahneman, 1974; Kahneman & Tversky, 1979). They have observed some general principles to which human decision...makers tend to adhere. The first of these is the "representativeness heuristic". According to this principle, the question "will event A be generated by...process B?" will be decided affirmatively to the extent that the event A resembles process B. According to this principle, if failure in a computer
Time-Frequency Representations for Speech Signals.
1987-06-01
and subsequent processing can take these weights into account. This is, in principle, safer, but practically it is much harder to think about processing...and frequency along the other. But how should this idea be made precise (the well-known uncertainty principle of Fourier analysis is one of the thorny...produce similar results. 2.3. Non-stationarity...it is the unique shape that meets the uncertainty principle with equality. 2.2. The quasi-stationary
Quinn, Paul C; Bhatt, Ramesh S
2009-08-01
Previous research has demonstrated that organizational principles become functional over different time courses of development: Lightness similarity is available at 3 months of age, but form similarity is not readily in evidence until 6 months of age. We investigated whether organization would transfer across principles and whether perceptual scaffolding can occur from an already functional principle to a not-yet-operational principle. Six- to 7-month-old infants (Experiment 1) and 3- to 4-month-old infants (Experiment 2) who were familiarized with arrays of elements organized by lightness similarity displayed a subsequent visual preference for a novel organization defined by form similarity. Results with the older infants demonstrate transfer in perceptual grouping: The organization defined by one grouping principle can direct a visual preference for a novel organization defined by a different grouping principle. Findings with the younger infants suggest that learning based on an already functional organizational process enables an organizational process that is not yet functional through perceptual scaffolding.
Curvature sensor for ocular wavefront measurement.
Díaz-Doutón, Fernando; Pujol, Jaume; Arjona, Montserrat; Luque, Sergio O
2006-08-01
We describe a new wavefront sensor for ocular aberration determination, based on the curvature sensing principle, which adapts the classical system used in astronomy for the living eye's measurements. The actual experimental setup is presented and designed following a process guided by computer simulations to adjust the design parameters for optimal performance. We present results for artificial and real young eyes, compared with the Hartmann-Shack estimations. Both methods show a similar performance for these cases. This system will allow for the measurement of higher order aberrations than the currently used wavefront sensors in situations in which they are supposed to be significant, such as postsurgery eyes.
Surface roughness measurement in the submicrometer range using laser scattering
NASA Astrophysics Data System (ADS)
Wang, S. H.; Quan, Chenggen; Tay, C. J.; Shang, H. M.
2000-06-01
A technique for measuring surface roughness in the submicrometer range is developed. The principle of the method is based on laser scattering from a rough surface. A telecentric optical setup that uses a laser diode as a light source is used to record the light field scattered from the surface of a rough object. The light intensity distribution of the scattered band, which is correlated to the surface roughness, is recorded by a linear photodiode array and analyzed using a single-chip microcomputer. Several sets of test surfaces prepared by different machining processes are measured and a method for the evaluation of surface roughness is proposed.
Towards Fast Tracking of the Keyhole Geometry
NASA Astrophysics Data System (ADS)
Brock, C.; Hohenstein, R.; Schmidt, M.
We describe a sensor principle permitting the fast online measurement of the position of the optical process emissions in deep penetration laser welding. Experiments show a strong correlation between the position of the vapour plume and the keyhole geometry, demonstrated here by varying the penetration depth of the weld. In order to achieve an absolute position measurement, the sensor was calibrated using a light source with well defined characteristics. The setup for the calibration measurements and the corresponding data evaluation methods are discussed. The precision of the calibration with a green LED is 6 μm in lateral and 55 μm in axial direction, for a working distance of 200 mm.
Visual Display Principles for C3I System Tasks
1993-06-01
early in the design process is now explicitly recognized in military R&D policy, as evidenced by the Navy's HARDMAN and the Army's MANPRINT programs...information): required sampling rate for each battlefield area, target type, and sensor type, etc.? - Change detection aids - Where is the enemy...increasing load and sophistication for operators and decisionmakers - Automated measurement and scoring (% hits, miss distances, attrition rates, etc.
NASA total quality management 1989 accomplishments report
NASA Technical Reports Server (NTRS)
1990-01-01
Described here are the accomplishments of NASA as a result of the use of Total Quality Management (TQM). The principles in practice which led to these process refinements are important cultural elements to any organization's productivity and quality efforts. The categories of TQM discussed here are top management leadership and support, strategic planning, focus on the customer, employee training and recognition, employee empowerment and teamwork, measurement and analysis, and quality assurance.
Blood transfusion safety: a new philosophy.
Franklin, I M
2012-12-01
Blood transfusion safety has had a chequered history, and there are current and future challenges. Internationally, there is no clear consensus for many aspects of the provision of safe blood, although pan-national legislation does provide a baseline framework in the European Union. Costs are rising, and new safety measures can appear expensive, especially when tested against some other medical interventions, such as cancer treatment and vaccination programmes. In this article, it is proposed that a comprehensive approach is taken to the issue of blood transfusion safety that considers all aspects of the process rather than considering only new measures. The need for an agreed level of safety for specified and unknown risks is also suggested. The importance of providing care and support for those inadvertently injured as a result of transfusion problems is also emphasized. Given that the current blood safety decision process often uses a utilitarian principle for decision making (through the calculation of Quality Adjusted Life Years), an alternative philosophy is proposed. A social contract for blood safety, based on the principles of 'justice as fairness' developed by John Rawls, is recommended as a means of providing an agreed level of safety, containing costs and providing support for any adverse outcomes. © 2012 The Author. Transfusion Medicine © 2012 British Blood Transfusion Society.
Meslin, Eric M; Schwartz, Peter H
2015-01-01
Ethics should guide the design of electronic health records (EHR), and recognized principles of bioethics can play an important role. This approach was recently adopted by a team of informaticists who are designing and testing a system where patients exert granular control over who views their personal health information. While this method of building ethics in from the start of the design process has significant benefits, questions remain about how useful the application of bioethics principles can be in this process, especially when principles conflict. For instance, while the ethical principle of respect for autonomy supports a robust system of granular control, the principles of beneficence and nonmaleficence counsel restraint due to the danger of patients being harmed by restrictions on provider access to data. Conflict between principles has long been recognized by ethicists and has even motivated attacks on approaches that state and apply principles. In this paper, we show how using ethical principles can help in the design of EHRs by first explaining how ethical principles can and should be used generally, and then by discussing how attention to details in specific cases can show that the tension between principles is not as bad as it initially appeared. We conclude by suggesting ways in which the application of these (and other) principles can add value to the ongoing discussion of patient involvement in their health care. This is a new approach to linking principles to informatics design that we expect will stimulate further interest.
Flores, Walter
2010-01-01
Governance refers to decision-making processes in which power relationships and actors and institutions' particular interests converge. Situations of consensus and conflict are inherent to such processes. Furthermore, decision-making happens within a framework of ethical principles, motivations and incentives which could be explicit or implicit. Health systems in most Latin-American and Caribbean countries take the principles of equity, solidarity, social participation and the right to health as their guiding principles; such principles must thus rule governance processes. However, this is not always the case and this is where the importance of investigating governance in health systems lies. Making advances in investigating governance involves conceptual and methodological implications. Clarifying and integrating normative and analytical approaches is relevant at conceptual level as both are necessary for an approach seeking to investigate and understand social phenomena's complexity. In relation to methodological level, there is a need to expand the range of variables, sources of information and indicators for studying decision-making aimed to greater equity, health citizenship and public policy efficiency.
Introduction of a pyramid guiding process for general musculoskeletal physical rehabilitation.
Stark, Timothy W
2006-06-08
Successful instruction of a subject as complicated as physical rehabilitation demands organization. Understanding the principles and processes of such a field demands a hierarchy of steps to achieve the intended outcome. This paper is intended to be an introduction to a proposed pyramid scheme of general physical rehabilitation principles. The purpose of the pyramid scheme is to allow for greater understanding by the student and patient. Much as the respected Food Guide Pyramid does, the pyramid will help the student further appreciate and apply supported physical rehabilitation principles, and help the patient understand that there is a progressive method to their functional healing process.
Du, Baoqiang; Dong, Shaofeng; Wang, Yanfeng; Guo, Shuting; Cao, Lingzhi; Zhou, Wei; Zuo, Yandi; Liu, Dan
2013-11-01
A wide-range, high-resolution frequency measurement method based on the quantized phase step law is presented in this paper. Utilizing the variation law of the phase differences, direct different-frequency phase processing, and the phase group synchronization phenomenon, and combining an A/D converter with the adaptive phase-shifting principle, a counter gate is established at the phase coincidences occurring at one-group intervals, which eliminates the ±1 counter error of the traditional frequency measurement method. More importantly, direct phase comparison, measurement, and control between any periodic signals have been realized without frequency normalization. Experimental results show that sub-picosecond resolution can be easily obtained in frequency measurement, frequency standard comparison, and phase-locked control based on the phase quantization processing technique. The method may be widely used in navigation and positioning, space techniques, communication, radar, astronomy, atomic frequency standards, and other high-tech fields.
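To see why eliminating the ±1 counter error matters, the toy simulation below measures a frequency with a plain gated counter: the count truncates to an integer, so the estimate is quantized in steps of 1/T. It illustrates only the classical limitation, not the phase-coincidence method of the paper, and the frequency and gate time are hypothetical.

```python
import numpy as np

def gated_count_estimate(freq_hz: float, gate_s: float, phase: float) -> float:
    """Classical frequency counter: count whole cycles in a gate time.
    The floor introduces the well-known +/-1 count ambiguity."""
    cycles = np.floor(freq_hz * gate_s + phase)  # phase in [0, 1) cycles
    return cycles / gate_s

true_freq = 10_000_123.4  # Hz, hypothetical
gate = 0.1                # s -> resolution limited to 1/gate = 10 Hz

rng = np.random.default_rng(1)
estimates = [gated_count_estimate(true_freq, gate, rng.random()) for _ in range(5)]
print([f"{e:.1f}" for e in estimates])  # estimates jump in 10 Hz steps
```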
NASA Astrophysics Data System (ADS)
Miao, Changyun; Shi, Boya; Li, Hongqiang
2008-12-01
Intelligent clothing for measuring human physiological parameters is developed based on FBG sensor technology. In this paper, the principles and methods for measuring human physiological parameters, including body temperature and heart rate, in intelligent clothing with distributed FBGs are studied, and mathematical models of the physiological parameter measurements are built. The processing method for body temperature and heart rate detection signals is presented; a physiological parameter detection module is designed; interference signals are filtered out, improving the measurement accuracy; and the integration of the intelligent clothing is described. The intelligent clothing can implement real-time measurement, processing, storage and output of body temperature and heart rate. It offers accurate measurement, portability, low cost, and real-time monitoring, among other advantages. The intelligent clothing enables non-contact monitoring between doctors and patients and the timely detection of diseases such as cancer and infectious diseases, so that patients receive prompt treatment. It has great significance and value for safeguarding the health of the elderly and of children with language dysfunction.
Stern-Gerlach-like approach to electron orbital angular momentum measurement
Harvey, Tyler R.; Grillo, Vincenzo; McMorran, Benjamin J.
2017-02-28
Many methods now exist to prepare free electrons into orbital-angular-momentum states, and the predicted applications of these electron states as probes of materials and scattering processes are numerous. The development of electron orbital-angular-momentum measurement techniques has lagged behind. We show that coupling between electron orbital angular momentum and a spatially varying magnetic field produces an angular-momentum-dependent focusing effect. We propose a design for an orbital-angular-momentum measurement device built on this principle. As the method of measurement is noninterferometric, the device works equally well for mixed, superposed, and pure final orbital-angular-momentum states. The energy and orbital-angular-momentum distributions of inelastically scattered electrons may be simultaneously measurable with this technique.
New method of noncontact temperature measurement in on-line textile production
NASA Astrophysics Data System (ADS)
Cheng, Xianping; Song, Xing-Li; Deng, Xing-Zhong
1993-09-01
Based on the conditions of textile production, the method of infrared non-contact temperature measurement is adopted in the heat-setting and drying heat-treatment process. This method is used to monitor the moving cloth, and the temperature of the cloth is displayed rapidly and exactly. The principle of the temperature measurement is analysed theoretically in this paper, and mathematical analysis and calculation are used to introduce the signal transmission method. By combining software with hardware, the temperature is corrected and compensated with the aid of a single-chip microcomputer. The test results indicate that the application of the temperature measurement instrument provides reliable parameters for quality control and is an important measure for improving product quality.
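One standard correction in non-contact thermometry, which the "corrected and compensated" step may include in some form, is emissivity compensation. The sketch below shows the textbook relation for a total-radiation sensor under a gray-body assumption; real band-limited IR instruments need more involved calibration curves, and the fabric emissivity value is hypothetical.

```python
def true_temperature_k(apparent_temp_k: float, emissivity: float) -> float:
    """Gray-body correction for a total-radiation sensor: the instrument
    reads the blackbody-equivalent temperature, so by the Stefan-Boltzmann
    law T_true = T_apparent / emissivity**0.25."""
    return apparent_temp_k / emissivity**0.25

# Hypothetical reading on moving cloth during heat setting.
apparent = 430.0   # K, as indicated assuming blackbody behaviour
emissivity = 0.85  # hypothetical emissivity of the fabric
print(f"corrected temperature: {true_temperature_k(apparent, emissivity):.1f} K")
```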
Study on the high-frequency laser measurement of slot surface difference
NASA Astrophysics Data System (ADS)
Bing, Jia; Lv, Qiongying; Cao, Guohua
2017-10-01
For the measurement of slot surface difference in the large-scale mechanical assembly process, a double-galvanometer pulsed laser scanning system is designed based on high-frequency laser scanning technology and the laser detection imaging principle. The laser probe scanning system architecture consists of three parts: laser ranging, mechanical scanning, and data acquisition and processing. The laser ranging part uses a high-frequency laser rangefinder to measure distance information over the target's shape, producing large amounts of point cloud data. The mechanical scanning part includes a high-speed rotary table, a high-speed transit and the related structural design, so that the whole system can perform three-dimensional laser scanning of the target along the designed scanning path. The data processing part is designed around FPGA hardware and LabVIEW software; it processes the point cloud data collected by the laser rangefinder at high speed and performs fitting calculations on the point cloud data to establish a three-dimensional model of the target, thereby realizing laser scanning imaging.
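The fitting step can be illustrated with a toy step-height estimate: fit a plane to the reference surface points by least squares, then take the mean distance of the slot-side points from that plane. This is a hypothetical sketch, not the paper's FPGA/LabVIEW pipeline, and all point data are synthetic.

```python
import numpy as np

def fit_plane(points: np.ndarray) -> np.ndarray:
    """Least squares plane z = a*x + b*y + c through an N x 3 point cloud."""
    A = np.column_stack([points[:, 0], points[:, 1], np.ones(len(points))])
    coeffs, *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    return coeffs  # [a, b, c]

def surface_difference(ref_points: np.ndarray, slot_points: np.ndarray) -> float:
    """Mean signed distance of the slot-side points from the plane fitted
    to the reference surface: a simple step-height estimate."""
    a, b, c = fit_plane(ref_points)
    predicted = a * slot_points[:, 0] + b * slot_points[:, 1] + c
    residuals = slot_points[:, 2] - predicted
    normal_norm = np.sqrt(a**2 + b**2 + 1.0)
    return float(np.mean(residuals) / normal_norm)

# Synthetic demo: a flat reference surface and a surface 0.2 mm lower.
rng = np.random.default_rng(2)
xy = rng.uniform(0, 50, size=(200, 2))
ref = np.column_stack([xy, 0.00 + 0.01 * rng.standard_normal(200)])
slot = np.column_stack([xy, -0.20 + 0.01 * rng.standard_normal(200)])
print(f"estimated surface difference: {surface_difference(ref, slot):.3f} mm")
```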
Fluid flow measurements by means of vibration monitoring
NASA Astrophysics Data System (ADS)
Campagna, Mauro M.; Dinardo, Giuseppe; Fabbiano, Laura; Vacca, Gaetano
2015-11-01
The achievement of accurate fluid flow measurements is fundamental whenever the control and monitoring of physical quantities governing an industrial process are required. In such cases, non-intrusive devices are preferable, but these are often more sophisticated and expensive than more common devices (such as nozzles, diaphragms, Coriolis flowmeters and so on). In this paper, a novel, non-intrusive, simple and inexpensive methodology is presented to measure the fluid flow rate (in a turbulent regime) whose physical principle is based on the acquisition of transversal vibrational signals induced by the fluid itself onto the walls of the pipe it is flowing through. Such a principle of operation would permit the use of micro-accelerometers capable of acquiring and transmitting the signals, even by means of wireless technology, to a control room for the monitoring of the process under control. A possible application of this technology (whose feasibility will be investigated by the authors in a further study) is the deployment of a net of micro-accelerometers on the pipeline networks of aqueducts. This apparatus could lead to the faster and easier detection and location of possible fluid leaks affecting the pipeline network, at more affordable cost. The authors, who have previously proven the linear dependency of the acceleration harmonics amplitude on the flow rate, here discuss an experimental analysis of this functional relation under variation of the physical properties of the pipe, in terms of its diameter and constituent material, to find the possible limits to the practical application of the measurement methodology.
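Given the linear amplitude-flow relation the authors previously established, a calibration-based estimator is straightforward. The sketch below is a hypothetical illustration of that idea, with made-up calibration data: fit the line on known flow rates, then invert it for new vibration readings.

```python
from scipy.stats import linregress

# Hypothetical calibration: reference flow rates (m^3/h) vs. the amplitude
# of the monitored acceleration harmonic (arbitrary units).
flow_rates = [5.0, 10.0, 15.0, 20.0, 25.0]
amplitudes = [0.21, 0.39, 0.62, 0.80, 1.01]

fit = linregress(flow_rates, amplitudes)  # amplitude = slope * flow + intercept

def flow_from_amplitude(amp: float) -> float:
    """Invert the linear calibration to estimate flow rate from vibration."""
    return (amp - fit.intercept) / fit.slope

print(f"estimated flow: {flow_from_amplitude(0.70):.1f} m^3/h")
print(f"calibration r^2 = {fit.rvalue**2:.4f}")
```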
Three Principles of Water Flow in Soils
NASA Astrophysics Data System (ADS)
Guo, L.; Lin, H.
2016-12-01
Knowledge of water flow in soils is crucial to understanding the terrestrial hydrological cycle, surface energy balance, biogeochemical dynamics, ecosystem services, contaminant transport, and many other Critical Zone processes. However, due to the complex and dynamic nature of non-uniform flow, reconstruction and prediction of water flow in natural soils remain challenging. This study synthesizes three principles of water flow in soils that can improve modeling of water flow in soils of varied complexity. The first principle, known as Darcy's law, came to light in the 19th century and suggested a linear relationship between water flux density and hydraulic gradient, which was modified by Buckingham for unsaturated soils. Combining mass balance and the Buckingham-Darcy law, L.A. Richards quantitatively described soil water change in space and time, i.e., the Richards equation. The second principle was proposed by L.A. Richards in the 20th century and describes the minimum pressure potential needed to overcome the surface tension of fluid and initiate water flow through a soil-air interface. This study extends this principle to encompass soil hydrologic phenomena related to varied interfaces and microscopic features and provides a more cohesive explanation of hysteresis, hydrophobicity, and threshold behavior when water moves through layered soils. The third principle is emerging in the 21st century and highlights the complex and evolving flow networks embedded in heterogeneous soils. This principle is summarized as follows: water moves non-uniformly in natural soils in a dual-flow regime, i.e., it follows the least-resistant or preferred paths when "pushed" (e.g., by storms), "attracted" (e.g., by plants), or "restricted" (e.g., by bedrock), but moves diffusively into the matrix when "relaxed" (e.g., at rest) or "touched" (e.g., by adsorption). The first principle is a macroscopic view of steady-state water flow, the second is a microscopic view of interface-based dynamics of water flow, and the third combines macroscopic and microscopic considerations to explain a mosaic-like flow regime in soils. Integration of the above principles can advance flow theory, measurement, and modeling and can improve management of soil and water resources.
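For reference, the first principle can be written compactly. These are the standard textbook forms of the Buckingham-Darcy flux law and the Richards equation, not equations quoted from the abstract; θ is volumetric water content, h matric potential head, z elevation head, and K(h) unsaturated hydraulic conductivity.

```latex
q = -K(h)\,\nabla(h + z),
\qquad
\frac{\partial \theta}{\partial t} = \nabla \cdot \left[ K(h)\,\nabla(h + z) \right].
```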
Other ways of measuring `Big G'
NASA Astrophysics Data System (ADS)
Rothleitner, Christian
2016-03-01
In 1798, the British scientist Henry Cavendish performed the first laboratory experiment to determine the gravitational force between two massive bodies. From his result, Newton's gravitational constant, G, was calculated. Cavendish's measurement principle was the torsion balance, invented by John Michell some 15 years before. During the following two centuries, more than 300 new measurements followed. Although technology and physics developed rapidly during this time, surprisingly, most experiments were still based on the same principle. In fact, the most accurate determination of G to date is a measurement based on the torsion balance principle. Despite the fact that G was one of the first fundamental physical constants ever measured, and despite the huge number of experiments performed on it to this day, its CODATA recommended value still has the highest standard measurement uncertainty of all the fundamental physical constants. Even more serious is the fact that even measurements based on the same principle often do not overlap within their attributed standard uncertainties. It must be assumed that various experiments are subject to one or more unknown biases. In this talk I will present some alternative experimental setups to the torsion balance which have been performed or proposed to measure G. Although their estimated uncertainties are often higher than those of most torsion balance experiments, revisiting such ideas is worthwhile. Advances in technology could offer solutions to problems which were previously insurmountable; these solutions could result in lower measurement uncertainties. New measurement principles could also help to uncover hidden systematic effects.
Lessons from a broad view of science: a response to Dr Robergs’ article
Pires, Flavio Oliveira
2018-01-01
Dr Robergs suggested that the central governor model (CGM) is not a well-worded theory, as it deviated from the tenet of falsification criteria. According to his view of science, exercise research conducted with the intent to prove rather than disprove a theory contributes little new knowledge and condemns the theory to the label of pseudoscience. However, exercise scientists should be aware of the limitations of falsification criteria. First, the number of potential falsifiers for a given hypothesis is always infinite, so there is no means to ensure asymmetric comparison between theories. Thus, assuming a competition between the CGM and dichotomized central versus peripheral fatigue theories, scientists guided by the falsification principle would have to know, a priori, all possible falsifiers for these two theories in order to choose the best one, thereby leading to an oversimplification of the theories. Second, the failure to formulate refutable hypotheses may be a simple consequence of the lack of instruments to make crucial measurements. The use of refutation principles to test the CGM theory requires technology capable of online measurement of the feedback and feedforward signals integrated in the central nervous system during real-time exercise. Consequently, the falsification principle is currently impracticable for testing the CGM theory. The falsification principle must be applied with equilibrium, as we should apply the positive induction process; otherwise Popperian philosophy will be incompatible with the actual practice of science. Rather than driving the scientific debate from a biased single view of science, researchers in the field of exercise sciences may benefit more from different views of science. PMID:29629188
New principle for measuring arterial blood oxygenation, enabling motion-robust remote monitoring.
van Gastel, Mark; Stuijk, Sander; de Haan, Gerard
2016-12-07
Finger-oximeters are ubiquitously used for patient monitoring in hospitals worldwide. Recently, remote measurement of arterial blood oxygenation (SpO2) with a camera has been demonstrated. Both contact and remote measurements, however, require the subject to remain static for accurate SpO2 values. This is due to the use of the common ratio-of-ratios measurement principle that measures the relative pulsatility at different wavelengths. Since the amplitudes are small, they are easily corrupted by motion-induced variations. We introduce a new principle that allows accurate remote measurements even during significant subject motion. We demonstrate the main advantage of the principle, i.e. that the optimal signature remains the same even when the SNR of the PPG signal drops significantly due to motion or limited measurement area. The evaluation uses recordings with breath-holding events, which induce hypoxemia in healthy moving subjects. The events lead to clinically relevant SpO2 levels in the range 80-100%. The new principle is shown to greatly outperform current remote ratio-of-ratios based methods. The mean absolute SpO2 error (MAE) is about 2 percentage points during head movements, where the benchmark method shows a MAE of 24 percentage points. Consequently, we claim ours to be the first method to reliably measure SpO2 remotely during significant subject motion.
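For context, the conventional ratio-of-ratios benchmark that the new principle outperforms can be sketched in a few lines. This is a generic textbook implementation with illustrative calibration coefficients; it is neither the paper's calibration nor its new motion-robust method.

```python
import numpy as np

def ratio_of_ratios(red, ir):
    """Relative pulsatility at two wavelengths; std/mean is a crude
    AC/DC proxy for the PPG waveforms."""
    r_red = np.std(red) / np.mean(red)
    r_ir = np.std(ir) / np.mean(ir)
    return r_red / r_ir

def spo2_from_r(r, a=110.0, b=25.0):
    """Empirical linear calibration SpO2 = a - b*R; a and b are
    illustrative placeholders and device-specific in practice."""
    return a - b * r
```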
Integrating automated support for a software management cycle into the TAME system
NASA Technical Reports Server (NTRS)
Sunazuka, Toshihiko; Basili, Victor R.
1989-01-01
Software managers are interested in the quantitative management of software quality, cost and progress. An integrated software management methodology, which can be applied throughout the software life cycle for any number of purposes, is required. The TAME (Tailoring A Measurement Environment) methodology is based on the improvement paradigm and the goal/question/metric (GQM) paradigm. This methodology helps generate a software engineering process and measurement environment based on the project characteristics. SQMAR (software quality measurement and assurance technology) is a software quality metric system and methodology applied to the development processes. It is based on the feed-forward control principle. Quality target setting is carried out before the plan-do-check-act activities are performed. These methodologies are integrated to realize goal-oriented measurement, process control and visual management. A metric setting procedure based on the GQM paradigm, a management system called the software management cycle (SMC), and its application to a case study based on NASA/SEL data are discussed. The expected effects of SMC are quality improvement, managerial cost reduction, accumulation and reuse of experience, and a highly visual management reporting system.
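A goal/question/metric tree of the kind the GQM paradigm prescribes can be represented very simply; the example below is invented for illustration and is not taken from TAME or SQMAR.

```python
# A minimal GQM tree: one goal refined into questions, each answered
# by concrete metrics. All entries are hypothetical.
gqm = {
    "goal": "Improve reliability of the release process",
    "questions": [
        {"text": "How many defects escape to production?",
         "metrics": ["post-release defect count", "defect density per KLOC"]},
        {"text": "Is the review process effective?",
         "metrics": ["review coverage (%)", "defects found per review hour"]},
    ],
}
```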
THE MAXIMUM POWER PRINCIPLE: AN EMPIRICAL INVESTIGATION
The maximum power principle is a potential guide to understanding the patterns and processes of ecosystem development and sustainability. The principle predicts the selective persistence of ecosystem designs that capture a previously untapped energy source. This hypothesis was in...
[Study on culture and philosophy of processing of traditional Chinese medicines].
Yang, Ming; Zhang, Ding-Kun; Zhong, Ling-Yun; Wang, Fang
2013-07-01
From the standpoint of cultural views and philosophical thought, this paper studies the cultural origins, thinking modes, core principles, and general rules and methods of the processing of traditional Chinese medicines. It traces the culture and history of processing, covering its generation and evolution, its accumulated experience and development, and its core values, and it summarizes the basic principles of processing, which are guided by holistic, objective, dynamic, balanced and appropriate thinking. The aim is to propagate the cultural characteristics and philosophical wisdom of traditional Chinese medicine processing, to promote the inheritance and development of processing, and to ensure the maximum therapeutic value of Chinese medicine in clinical use.
HEALTH TECHNOLOGY ASSESSMENT FOR DECISION MAKING IN LATIN AMERICA: GOOD PRACTICE PRINCIPLES.
Pichon-Riviere, Andrés; Soto, Natalie C; Augustovski, Federico Ariel; García Martí, Sebastián; Sampietro-Colom, Laura
2018-06-11
The aim of this study was to identify good practice principles for health technology assessment (HTA) that are the most relevant and of highest priority for application in Latin America and to identify potential barriers to their implementation in the region. HTA good practice principles proposed at the international level were identified and then explored during a deliberative process in a forum of assessors, funders, and product manufacturers. Forty-two representatives from ten Latin American countries participated. Good practice principles proposed at the international level were considered valid and potentially relevant to Latin America. Five principles were identified as priority and with the greatest potential to be strengthened at this time: transparency in the production of HTA, involvement of relevant stakeholders in the HTA process, mechanisms to appeal decisions, clear priority-setting processes in HTA, and a clear link between HTA and decision making. The main challenge identified was to find a balance between the application of these principles and the available resources in a way that would not detract from the production of reports and adaptation to the needs of decision makers. The main recommendation was to progress gradually in strengthening HTA and its link to decision making by developing appropriate processes for each country, without trying to impose, in the short-term, standards taken from examples at the international level without adequate adaptation of these to local contexts.
Strategic planning in a complex academic environment: lessons from one academic health center.
Levinson, Wendy; Axler, Helena
2007-08-01
Leaders in academic health centers (AHCs) must create a vision for their academic unit embedded in a complex environment. A formal strategic planning process can be valuable to help shape a clear vision taking advantage of potential collaborations and to develop specific achievable long- and short-term goals. The authors describe the steps in a formal strategic planning process and illustrate it with the example of the Department of Medicine at the University of Toronto Faculty of Medicine beginning in 2004. The process included the active participation of over 300 faculty members, trainees, and stakeholders of the department and resulted in broad-based support and leadership for the resulting plan. The authors describe the steps, which include getting started, committing to planning principles, establishing the work plan, understanding the environment, pulling it all together, shaping the vision, testing strategic directions, building effective implementation, and promoting the plan. Articulation of vision, mission, and values informed the plan's development, as well as 10 key principles integral to the plan. Challenges and lessons learned are also described. The final strategic plan is an active core activity of the department, guiding decisions and resource allocation and facilitating measurement of success or shortcomings. The process the authors describe is applicable to multiple academic units, including divisions/sections, departments, or thematic programs in AHCs.
Stochastic Averaging Principle for Spatial Birth-and-Death Evolutions in the Continuum
NASA Astrophysics Data System (ADS)
Friesen, Martin; Kondratiev, Yuri
2018-06-01
We study a spatial birth-and-death process on the phase space of locally finite configurations Γ^+ × Γ^- over R^d. The dynamics is described by a non-equilibrium evolution of states obtained from the Fokker-Planck equation and associated with the Markov operator L^+(γ^-) + (1/ε)L^-, ε > 0. Here L^- describes the environment process on Γ^- and L^+(γ^-) describes the system process on Γ^+, where γ^- indicates that the corresponding birth-and-death rates depend on another locally finite configuration γ^- ∈ Γ^-. We prove that, for a certain class of birth-and-death rates, the corresponding Fokker-Planck equation is well-posed, i.e. there exists a unique evolution of states μ_t^ε on Γ^+ × Γ^-. Moreover, we give a sufficient condition for the environment to be ergodic with exponential rate. Let μ_inv be the invariant measure for the environment process on Γ^-. In the main part of this work we establish the stochastic averaging principle, i.e. we prove that the marginal of μ_t^ε onto Γ^+ converges weakly to an evolution of states on Γ^+ associated with the averaged Markov birth-and-death operator \overline{L} = ∫_{Γ^-} L^+(γ^-) dμ_inv(γ^-).
NASA Technical Reports Server (NTRS)
Schlegel, R. G.
1982-01-01
It is important for industry and NASA to assess the status of acoustic design technology for predicting and controlling helicopter external noise so that a meaningful research program addressing this problem can be formulated. Three prediction methodologies are available to the designer and the acoustic engineer. The first is what has been described as a first-principles analysis; this approach attempts to remove any empiricism from the analysis process and predicts the noise from a theoretical treatment of the mechanisms. The second approach attempts to combine first-principles methodology (when available) with empirical data to formulate source predictors which can be combined to predict vehicle levels. The third is an empirical analysis, which attempts to generalize measured trends into a vehicle noise prediction method. This paper briefly addresses each.
Energy driven self-organization in nanoscale metallic liquid films.
Krishna, H; Shirato, N; Favazza, C; Kalyanaraman, R
2009-10-01
Nanometre thick metallic liquid films on inert substrates can spontaneously dewet and self-organize into complex nanomorphologies and nanostructures with well-defined length scales. Nanosecond pulses of an ultraviolet laser can capture the dewetting evolution and ensuing nanomorphologies, as well as introduce dramatic changes to dewetting length scales due to the nanoscopic nature of film heating. Here, we show theoretically that the self-organization principle, based on equating the rate of transfer of thermodynamic free energy to rate of loss in liquid flow, accurately describes the spontaneous dewetting. Experimental measurements of laser dewetting of Ag and Co liquid films on SiO(2) substrates confirm this principle. This energy transfer approach could be useful for analyzing the behavior of nanomaterials and chemical processes in which spontaneous changes are important.
The heuristic-analytic theory of reasoning: extension and evaluation.
Evans, Jonathan St B T
2006-06-01
An extensively revised heuristic-analytic theory of reasoning is presented incorporating three principles of hypothetical thinking. The theory assumes that reasoning and judgment are facilitated by the formation of epistemic mental models that are generated one at a time (singularity principle) by preconscious heuristic processes that contextualize problems in such a way as to maximize relevance to current goals (relevance principle). Analytic processes evaluate these models but tend to accept them unless there is good reason to reject them (satisficing principle). At a minimum, analytic processing of models is required so as to generate inferences or judgments relevant to the task instructions, but more active intervention may result in modification or replacement of default models generated by the heuristic system. Evidence for this theory is provided by a review of a wide range of literature on thinking and reasoning.
NASA Astrophysics Data System (ADS)
Niedermeier, Dennis; Voigtländer, Jens; Siebert, Holger; Desai, Neel; Shaw, Raymond; Chang, Kelken; Krueger, Steven; Schumacher, Jörg; Stratmann, Frank
2017-11-01
Turbulence-cloud droplet interaction processes have been investigated primarily through numerical simulation and field measurements over the last ten years. However, only in the laboratory can we be confident in our knowledge of initial and boundary conditions and measure for extended times under statistically stationary and repeatable conditions. The newly built turbulent wind tunnel LACIS-T (Turbulent Leipzig Aerosol Cloud Interaction Simulator) is therefore an ideal facility for pursuing mechanistic understanding of these processes. Within the tunnel we can adjust precisely controlled turbulent temperature and humidity fields so as to achieve supersaturation levels that allow detailed investigation of the interactions between cloud microphysical processes (e.g., cloud droplet activation) and the turbulent flow, under well-defined and reproducible laboratory conditions. We will present the fundamental operating principle, first results from ongoing characterization efforts, numerical simulations, as well as first droplet activation experiments.
Automated absolute phase retrieval in across-track interferometry
NASA Technical Reports Server (NTRS)
Madsen, Soren N.; Zebker, Howard A.
1992-01-01
Discussed is a key element in the processing of topographic radar maps acquired by the NASA/JPL airborne synthetic aperture radar configured as an across-track interferometer (TOPSAR). TOPSAR utilizes a single transmit and two receive antennas; the three-dimensional target location is determined by triangulation based on a known baseline and two measured slant ranges. The slant range difference is determined very accurately from the phase difference between the signals received by the two antennas. This phase is measured modulo 2π, whereas it is the absolute phase which relates directly to the difference in slant range. It is shown that splitting the range bandwidth into two subbands in the processor and processing each individually allows the absolute phase to be determined. The underlying principles and the system errors which must be considered are discussed, together with the implementation and results from processing data acquired during the summer of 1991.
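The split-spectrum idea rests on the interferometric phase scaling linearly with carrier frequency, so the small, unambiguous phase difference between the two sub-bands can be scaled up to the absolute phase at the band centre. A minimal sketch under that assumption (not the TOPSAR processor itself):

```python
import numpy as np

def absolute_phase(phi1, phi2, f1, f2):
    """Estimate the absolute interferometric phase at the band-centre
    frequency from wrapped sub-band phases phi1, phi2 (rad) at carrier
    frequencies f1, f2 (Hz). Valid while the true sub-band difference
    stays within (-pi, pi)."""
    dphi = np.angle(np.exp(1j * (phi1 - phi2)))   # wrapped difference
    return dphi * 0.5 * (f1 + f2) / (f1 - f2)     # scale up to band centre
```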
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harvey, Dustin Yewell
Echo™ is a MATLAB-based software package designed for robust and scalable analysis of complex data workflows. An alternative to tedious, error-prone conventional processes, Echo is based on three transformative principles for data analysis: self-describing data, name-based indexing, and dynamic resource allocation. The software takes an object-oriented approach to data analysis, intimately connecting measurement data with associated metadata. Echo operations in an analysis workflow automatically track and merge metadata and computation parameters to provide a complete history of the process used to generate final results, while automated figure and report generation tools eliminate the potential to mislabel those results. History reporting and visualization methods provide straightforward auditability of analysis processes. Furthermore, name-based indexing on metadata greatly improves code readability for analyst collaboration and reduces opportunities for errors to occur. Echo efficiently manages large data sets using a framework that seamlessly allocates resources such that only the necessary computations to produce a given result are executed. Echo provides a versatile and extensible framework, allowing advanced users to add their own tools and data classes tailored to their own specific needs. Applying these transformative principles and powerful features, Echo greatly improves analyst efficiency and quality of results in many application areas.
Diffracted diffraction radiation and its application to beam diagnostics
NASA Astrophysics Data System (ADS)
Goponov, Yu. A.; Shatokhin, R. A.; Sumitani, K.; Syshchenko, V. V.; Takabayashi, Y.; Vnukov, I. E.
2018-03-01
We present theoretical considerations for diffracted diffraction radiation and also propose an application of this process to diagnosing ultra-relativistic electron (positron) beams for the first time. Diffraction radiation is produced when relativistic particles move near a target. If the target is a crystal or X-ray mirror, diffraction radiation in the X-ray region is expected to be diffracted at the Bragg angle and therefore be detectable. We present a scheme for applying this process to measurements of the beam angular spread, and consider how to conduct a proof-of-principle experiment for the proposed method.
S-193 scatterometer transfer function analysis for data processing
NASA Technical Reports Server (NTRS)
Johnson, L.
1974-01-01
A mathematical model for converting raw data measurements of the S-193 scatterometer into processed values of the radar scattering coefficient is presented. The argument is based on an approximation derived from the radar equation and the actual operating principles of the S-193 scatterometer hardware. Possible error sources are inaccuracies in transmitted wavelength, range, antenna illumination integrals, and the instrument itself. The dominant source of error in the calculation of the scattering coefficient is the accuracy of the range. All other factors, with the possible exception of the illumination integral, are not considered to cause significant error in the calculation of the scattering coefficient.
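The dominance of range error follows directly from the fourth-power range dependence in the radar equation. A simplified point-target form, solved for the scattering coefficient, is sketched below; it is illustrative only, since the actual S-193 model includes the antenna illumination integrals.

```python
import numpy as np

def sigma0(p_r, p_t, gain, wavelength, r, area):
    """sigma0 = P_r (4*pi)^3 R^4 / (P_t G^2 lambda^2 A),
    a simplified point-target radar equation solved for sigma0."""
    return (p_r * (4 * np.pi) ** 3 * r ** 4
            / (p_t * gain ** 2 * wavelength ** 2 * area))

# Sensitivity: d(sigma0)/sigma0 = 4 * dR/R, so a 1% range error maps
# to roughly a 4% error in the scattering coefficient.
```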
Balasubramanian, Bijal A; Cohen, Deborah J; Davis, Melinda M; Gunn, Rose; Dickinson, L Miriam; Miller, William L; Crabtree, Benjamin F; Stange, Kurt C
2015-03-10
In healthcare change interventions, on-the-ground learning about the implementation process is often lost because of a primary focus on outcome improvements. This paper describes the Learning Evaluation, a methodological approach that blends quality improvement and implementation research methods to study healthcare innovations. Learning Evaluation is an approach to multi-organization assessment. Qualitative and quantitative data are collected to conduct real-time assessment of implementation processes while also assessing changes in context, facilitating quality improvement using run charts and audit and feedback, and generating transportable lessons. Five principles are the foundation of this approach: (1) gather data to describe changes made by healthcare organizations and how changes are implemented; (2) collect process and outcome data relevant to healthcare organizations and to the research team; (3) assess multi-level contextual factors that affect implementation, process, outcome, and transportability; (4) assist healthcare organizations in using data for continuous quality improvement; and (5) operationalize common measurement strategies to generate transportable results. Learning Evaluation principles are applied across organizations by the following: (1) establishing a detailed understanding of the baseline implementation plan; (2) identifying target populations and tracking relevant process measures; (3) collecting and analyzing real-time quantitative and qualitative data on important contextual factors; (4) synthesizing data and emerging findings and sharing with stakeholders on an ongoing basis; and (5) harmonizing and fostering learning from process and outcome data. Application to a multi-site program focused on primary care and behavioral health integration shows the feasibility and utility of Learning Evaluation for generating real-time insights into evolving implementation processes. Learning Evaluation generates systematic and rigorous cross-organizational findings about implementing healthcare innovations while also enhancing organizational capacity and accelerating translation of findings by facilitating continuous learning within individual sites. Researchers evaluating change initiatives and healthcare organizations implementing improvement initiatives may benefit from a Learning Evaluation approach.
Zachariah, Marianne; Seidling, Hanna M; Neri, Pamela M; Cresswell, Kathrin M; Duke, Jon; Bloomrosen, Meryl; Volk, Lynn A; Bates, David W
2011-01-01
Background Medication-related decision support can reduce the frequency of preventable adverse drug events. However, the design of current medication alerts often results in alert fatigue and high over-ride rates, thus reducing any potential benefits. Methods The authors previously reviewed human-factors principles for relevance to medication-related decision support alerts. In this study, instrument items were developed for assessing the appropriate implementation of these human-factors principles in drug–drug interaction (DDI) alerts. User feedback regarding nine electronic medical records was considered during the development process. Content validity, construct validity through correlation analysis, and inter-rater reliability were assessed. Results The final version of the instrument included 26 items associated with nine human-factors principles. Content validation on three systems resulted in the addition of one principle (Corrective Actions) to the instrument and the elimination of eight items. Additionally, the wording of eight items was altered. Correlation analysis suggests a direct relationship between system age and performance of DDI alerts (p=0.0016). Inter-rater reliability indicated substantial agreement between raters (κ=0.764). Conclusion The authors developed and gathered preliminary evidence for the validity of an instrument that measures the appropriate use of human-factors principles in the design and display of DDI alerts. Designers of DDI alerts may use the instrument to improve usability and increase user acceptance of medication alerts, and organizations selecting an electronic medical record may find the instrument helpful in meeting their clinicians' usability needs. PMID:21946241
Helland, Turid; Tjus, Tomas; Hovden, Marit; Ofte, Sonja; Heimann, Mikael
2011-01-01
This longitudinal study focused on the effects of two different principles of intervention in children at risk of developing dyslexia from 5 to 8 years old. The children were selected on the basis of a background questionnaire given to parents and preschool teachers, with cognitive and functional magnetic resonance imaging results substantiating group differences in neuropsychological processes associated with phonology, orthography, and phoneme-grapheme correspondence (i.e., alphabetic principle). The two principles of intervention were bottom-up (BU), "from sound to meaning", and top-down (TD), "from meaning to sound." Thus, four subgroups were established: risk/BU, risk/TD, control/BU, and control/TD. Computer-based training took place for 2 months every spring, and cognitive assessments were performed each fall of the project period. Measures of preliteracy skills for reading and spelling were phonological awareness, working memory, verbal learning, and letter knowledge. Literacy skills were assessed by word reading and spelling. At project end the control group scored significantly above age norm, whereas the risk group scored within the norm. In the at-risk group, training based on the BU principle had the strongest effects on phonological awareness and working memory scores, whereas training based on the TD principle had the strongest effects on verbal learning, letter knowledge, and literacy scores. It was concluded that appropriate, specific, data-based intervention starting in preschool can mitigate literacy impairment and that interventions should contain BU training for preliteracy skills and TD training for literacy training.
Kampmann, Peter; Kirchner, Frank
2014-01-01
With the increasing complexity of robotic missions and the development towards long-term autonomous systems, the need for multi-modal sensing of the environment increases. Until now, the use of tactile sensor systems has mostly been based on sensing one modality of force in the robotic end-effector. We motivate the use of a multi-modal tactile sensory system that combines static and dynamic force sensor arrays with an absolute force measurement system. This publication focuses on the development of a compact sensor interface for a fiber-optic sensor array, as optical measurement principles tend to have bulky interfaces. Mechanical, electrical and software approaches are combined to realize an integrated structure that provides decentralized pre-processing of the tactile measurements. Local behaviors are implemented using this setup to show the effectiveness of the approach. PMID:24743158
Social network analysis for program implementation.
Valente, Thomas W; Palinkas, Lawrence A; Czaja, Sara; Chu, Kar-Hai; Brown, C Hendricks
2015-01-01
This paper introduces the use of social network analysis theory and tools for implementation research. The social network perspective is useful for understanding, monitoring, influencing, or evaluating the implementation process when programs, policies, practices, or principles are designed and scaled up or adapted to different settings. We briefly describe common barriers to implementation success and relate them to the social networks of implementation stakeholders. We introduce a few simple measures commonly used in social network analysis and discuss how these measures can be used in program implementation. Using the four stage model of program implementation (exploration, adoption, implementation, and sustainment) proposed by Aarons and colleagues [1] and our experience in developing multi-sector partnerships involving community leaders, organizations, practitioners, and researchers, we show how network measures can be used at each stage to monitor, intervene, and improve the implementation process. Examples are provided to illustrate these concepts. We conclude with expected benefits and challenges associated with this approach.
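The simple measures referred to above are straightforward to compute on a stakeholder network; the sketch below applies networkx to an invented collaboration graph for one hypothetical site.

```python
import networkx as nx

# Hypothetical stakeholder ties at one implementation site.
G = nx.Graph()
G.add_edges_from([
    ("community_leader", "clinic_A"), ("community_leader", "researcher"),
    ("clinic_A", "practitioner_1"), ("clinic_A", "practitioner_2"),
    ("researcher", "clinic_B"), ("clinic_B", "practitioner_3"),
])

degree = nx.degree_centrality(G)             # breadth of direct ties
betweenness = nx.betweenness_centrality(G)   # brokerage between groups
density = nx.density(G)                      # overall cohesion

# At the adoption stage, low-centrality adopters might flag sites
# needing extra facilitation before scale-up.
```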
Applying the cognitive theory of multimedia learning: an analysis of medical animations.
Yue, Carole; Kim, Jessie; Ogawa, Rikke; Stark, Elena; Kim, Sara
2013-04-01
Instructional animations play a prominent role in medical education, but the degree to which these teaching tools follow empirically established learning principles, such as those outlined in the cognitive theory of multimedia learning (CTML), is unknown. These principles provide guidelines for designing animations in a way that promotes optimal cognitive processing and facilitates learning, but the application of these learning principles in current animations has not yet been investigated. A large-scale review of existing educational tools in the context of this theoretical framework is necessary to examine if and how instructional medical animations adhere to these principles and where improvements can be made. We conducted a comprehensive review of instructional animations in the health sciences domain and examined whether these animations met the three main goals of CTML: managing essential processing, minimising extraneous processing, and facilitating generative processing. We also identified areas for pedagogical improvement. Through Google keyword searches, we identified 4455 medical animations for review. After the application of exclusion criteria, 860 animations from 20 developers were retained. We randomly sampled and reviewed 50% of the identified animations. Many animations did not follow the recommended multimedia learning principles, particularly those that support the management of essential processing. We also noted an excess of extraneous visual and auditory elements and few opportunities for learner interactivity. Many unrealised opportunities exist for improving the efficacy of animations as learning tools in medical education; instructors can look to effective examples to select or design animations that incorporate the established principles of CTML. © Blackwell Publishing Ltd 2013.
Supercritical Fluid Fractionation of JP-8
1991-12-26
Supercritical fluid processing has long been used commercially in applications such as coffee decaffeination, spice extraction, and lipid purification, and its processing principles have also long been well known and practiced. The remainder of this fragment is table-of-contents residue; the recoverable headings are: Principles of Supercritical Fluid Extraction (Background on Supercritical Fluid Solubility; Supercritical Fluid Extraction Process Operation: Batch Extraction of Solid Materials, Counter-Current Continuous SCF Processing of Liquid Products, Supercritical Fluid Extraction vs ...).
Read, S J; Vanman, E J; Miller, L C
1997-01-01
We argue that recent work in connectionist modeling, in particular the parallel constraint satisfaction processes that are central to many of these models, has great importance for understanding issues of both historical and current concern for social psychologists. We first provide a brief description of connectionist modeling, with particular emphasis on parallel constraint satisfaction processes. Second, we examine the tremendous similarities between parallel constraint satisfaction processes and the Gestalt principles that were the foundation for much of modern social psychology. We propose that parallel constraint satisfaction processes provide a computational implementation of the principles of Gestalt psychology that were central to the work of such seminal social psychologists as Asch, Festinger, Heider, and Lewin. Third, we describe how parallel constraint satisfaction processes have been applied to three areas that were key to the beginnings of modern social psychology and remain central today: impression formation and causal reasoning, cognitive consistency (balance and cognitive dissonance), and goal-directed behavior. We conclude by discussing implications of parallel constraint satisfaction principles for a number of broader issues in social psychology, such as the dynamics of social thought and the integration of social information within the narrow time frame of social interaction.
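A parallel constraint satisfaction network can be sketched as a simple settling loop: units exchange activation over weighted links until a stable interpretation emerges. The toy impression-formation example below is a generic illustration, not a model from the paper.

```python
import numpy as np

def settle(weights, bias, steps=50):
    """Iteratively settle the network; each unit takes the tanh of its
    weighted input plus external evidence (bias)."""
    a = np.zeros(len(bias))
    for _ in range(steps):
        a = np.tanh(weights @ a + bias)
    return a

# Two trait units, 'friendly' and 'hostile', mutually inhibit; observed
# evidence enters as bias. The better-supported interpretation wins.
W = np.array([[0.0, -0.8],
              [-0.8, 0.0]])
evidence = np.array([0.6, 0.1])
print(settle(W, evidence))   # 'friendly' settles high, 'hostile' is suppressed
```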
[Design of an HACCP program for a cocoa processing facility].
López D'Sola, Patrizia; Sandia, María Gabriela; Bou Rached, Lizet; Hernández Serrano, Pilar
2012-12-01
The HACCP plan is a food safety management tool used to control physical, chemical and biological hazards associated with food processing throughout the processing chain. The aim of this work is to design an HACCP plan for a Venezuelan cocoa processing facility. The production of safe food products requires that the HACCP system be built upon a solid foundation of prerequisite programs such as Good Manufacturing Practices (GMP) and Sanitation Standard Operating Procedures (SSOP). The existence and effectiveness of these prerequisite programs were previously assessed. Good Agricultural Practices (GAP) audits of cocoa nib suppliers were performed. To develop the HACCP plan, the five preliminary tasks and the seven HACCP principles were accomplished according to Codex Alimentarius procedures. Three critical control points (CCPs) were identified using a decision tree: winnowing (control of ochratoxin A), roasting (Salmonella control) and metallic particle detection. For each CCP, critical limits, monitoring procedures, corrective actions, and procedures for verification and documentation concerning all procedures and records appropriate to these principles and their application were established. Implementation and maintenance of a HACCP plan for this processing plant are suggested. Recently, ochratoxin A (OTA) has been associated with cocoa beans. Although separation of the shell from the nib has been reported as an effective measure to control this chemical hazard, a study of ochratoxin prevalence in cocoa beans produced in the country is recommended, as is validation of the winnowing step.
Magasi, Susan; Harniss, Mark; Heinemann, Allen W
2018-01-01
Principles of fairness in testing require that all test takers, including people with disabilities, have an equal opportunity to demonstrate their capacity on the construct being measured. Measurement design features and assessment protocols can pose barriers for people with disabilities. Fairness in testing is a fundamental validity issue at all phases in the design, administration, and interpretation of measurement instruments in clinical practice and research. There is limited guidance for instrument developers on how to develop and evaluate the accessibility and usability of measurement instruments. This article describes a 6-stage iterative process for developing accessible computer-administered measurement instruments grounded in the procedures implemented across several major measurement initiatives. A key component of this process is interdisciplinary teams of accessibility experts, content and measurement experts, information technology experts, and people with disabilities working together to ensure that measurement instruments are accessible and usable by a wide range of users. The development of accessible measurement instruments is not only an ethical requirement, it also ensures better science by minimizing measurement bias, missing data, and attrition due to mismatches between the target population and test administration platform and protocols. Copyright © 2017 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
Launch vehicle systems design analysis
NASA Technical Reports Server (NTRS)
Ryan, Robert; Verderaime, V.
1993-01-01
Current launch vehicle design emphasis is on low life-cycle cost. This paper applies total quality management (TQM) principles to a conventional systems design analysis process to provide low-cost, high-reliability designs. Suggested TQM techniques include Steward's systems information flow matrix method, quality leverage principle, quality through robustness and function deployment, Pareto's principle, Pugh's selection and enhancement criteria, and other design process procedures. TQM quality performance at least-cost can be realized through competent concurrent engineering teams and brilliance of their technical leadership.
NASA Astrophysics Data System (ADS)
Karpeshin, F. F.
2002-11-01
The main principles of the resonance effect arising in electron shells during the interaction of nuclei with electromagnetic radiation are analyzed and presented in historical perspective. The principles of NEET are considered from a more general standpoint than is usual. Characteristic features of NEET and its reverse, TEEN, as internal conversion processes are analyzed, and ways of inducing them by laser radiation are proposed. The ambivalent role of the Pauli exclusion principle in NEET and TEEN processes is investigated.
The TAME Project: Towards improvement-oriented software environments
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Rombach, H. Dieter
1988-01-01
Experience from a dozen years of analyzing software engineering processes and products is summarized as a set of software engineering and measurement principles that argue for software engineering process models that integrate sound planning and analysis into the construction process. In the TAME (Tailoring A Measurement Environment) project at the University of Maryland, such an improvement-oriented software engineering process model was developed that uses the goal/question/metric paradigm to integrate the constructive and analytic aspects of software development. The model provides a mechanism for formalizing the characterization and planning tasks, controlling and improving projects based on quantitative analysis, learning in a deeper and more systematic way about the software process and product, and feeding the appropriate experience back into the current and future projects. The TAME system is an instantiation of the TAME software engineering process model as an ISEE (integrated software engineering environment). The first in a series of TAME system prototypes has been developed. An assessment of experience with this first limited prototype is presented including a reassessment of its initial architecture.
Introduction of a pyramid guiding process for general musculoskeletal physical rehabilitation
Stark, Timothy W
2006-01-01
Successful instruction in a subject as complicated as physical rehabilitation demands organization. Understanding the principles and processes of such a field demands a hierarchy of steps to achieve the intended outcome. This paper is an introduction to a proposed pyramid scheme of general physical rehabilitation principles. The purpose of the pyramid scheme is to allow a greater understanding for the student and the patient. Much as the respected Food Guide Pyramid does, it helps the student further appreciate and apply supported physical rehabilitation principles, while the patient comes to understand that there is a progressive method to their functional healing process. PMID:16759396
Maximum caliber inference of nonequilibrium processes
NASA Astrophysics Data System (ADS)
Otten, Moritz; Stock, Gerhard
2010-07-01
Thirty years ago, Jaynes suggested a general theoretical approach to nonequilibrium statistical mechanics, called maximum caliber (MaxCal) [Annu. Rev. Phys. Chem. 31, 579 (1980)]. MaxCal is a variational principle for dynamics in the same spirit that maximum entropy is a variational principle for equilibrium statistical mechanics. Motivated by the success of maximum entropy inference methods for equilibrium problems, in this work the MaxCal formulation is applied to the inference of nonequilibrium processes. That is, given some time-dependent observables of a dynamical process, one constructs a model that reproduces these input data and moreover, predicts the underlying dynamics of the system. For example, the observables could be some time-resolved measurements of the folding of a protein, which are described by a few-state model of the free energy landscape of the system. MaxCal then calculates the probabilities of an ensemble of trajectories such that on average the data are reproduced. From this probability distribution, any dynamical quantity of the system can be calculated, including population probabilities, fluxes, or waiting time distributions. After briefly reviewing the formalism, the practical numerical implementation of MaxCal in the case of an inference problem is discussed. Adopting various few-state models of increasing complexity, it is demonstrated that the MaxCal principle indeed works as a practical method of inference: The scheme is fairly robust and yields correct results as long as the input data are sufficient. As the method is unbiased and general, it can deal with any kind of time dependency such as oscillatory transients and multitime decays.
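In its standard form, MaxCal maximizes the path entropy (the caliber) over trajectories Γ subject to reproducing the measured averages ⟨A_i⟩, which yields an exponential distribution over trajectories. The following is the generic variational statement, not a formula quoted from the paper:

```latex
\mathcal{C} = -\sum_{\Gamma} p_\Gamma \ln p_\Gamma
  + \sum_i \lambda_i \Big( \sum_{\Gamma} p_\Gamma A_i(\Gamma) - \langle A_i \rangle \Big)
\;\;\Longrightarrow\;\;
p_\Gamma \propto \exp\Big( \sum_i \lambda_i A_i(\Gamma) \Big).
```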
A simple method for processing data with least square method
NASA Astrophysics Data System (ADS)
Wang, Chunyan; Qi, Liqun; Chen, Yongxiang; Pang, Guangning
2017-08-01
The least squares method is widely used in data processing and error estimation. This mathematical method has become an essential technique for parameter estimation, data processing, regression analysis and experimental data fitting, and has become a criterion tool for statistical inference. In measurement data analysis, fitting is usually carried out on the least squares principle, using matrix methods to solve for the final estimate and to improve its accuracy. In this paper, a new way of carrying out the least squares solution is presented which is based on algebraic computation and is relatively straightforward and easy to understand. The practicability of this method is demonstrated with a concrete example.
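The paper's specific algebraic scheme is not reproduced here, but the flavor of the comparison can be shown: the closed-form normal-equation solution for a straight-line fit checked against the matrix solution. The data are invented for illustration.

```python
import numpy as np

def line_fit_algebraic(x, y):
    """Closed-form least-squares line y = a*x + b from the normal equations."""
    n = len(x)
    sx, sy = x.sum(), y.sum()
    sxx, sxy = (x * x).sum(), (x * y).sum()
    a = (n * sxy - sx * sy) / (n * sxx - sx ** 2)
    b = (sy - a * sx) / n
    return a, b

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])
a, b = line_fit_algebraic(x, y)

# Cross-check against the matrix solution the abstract alludes to.
A = np.column_stack([x, np.ones_like(x)])
a_m, b_m = np.linalg.lstsq(A, y, rcond=None)[0]
assert np.allclose([a, b], [a_m, b_m])
```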
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cox, Jordan M.; Walton, Ian M.; Bateman, Gage
2017-07-25
Understanding the processes by which porous solid-state materials adsorb and release guest molecules would represent a significant step towards developing rational design principles for functional porous materials. To elucidate the process of liquid exchange in these materials, dynamic in situ X-ray diffraction techniques have been developed which utilize liquid-phase chemical stimuli. Using these time-resolved diffraction techniques, the ethanol solvation process in a flexible metal-organic framework [Co(AIP)(bpy)0.5(H2O)]·2H2O was examined. The measurements provide important insight into the nature of the chemical transformation in this system, including the presence of a previously unreported neat ethanol solvate structure.
Using the balanced scorecard in the development of community partnerships.
Tsasis, Peter; Owen, Susan M
2009-02-01
The benefits of community partnerships have been well established in the health service literature. However, measuring these benefits and associated outcomes is relatively new. This paper presents an innovative initiative in the application of a balanced scorecard framework for measuring and monitoring partnership activity at the community level, while adopting principles of evidence-based practice in the partnership process. In addition, it serves as an excellent example of how organizations can apply scorecard methodology to move away from relationship-based partnerships and into new collaborations that they can select using a formal skill and competency assessment for partnership success.
How measurement science can improve confidence in research results.
Plant, Anne L; Becker, Chandler A; Hanisch, Robert J; Boisvert, Ronald F; Possolo, Antonio M; Elliott, John T
2018-04-01
The current push for rigor and reproducibility is driven by a desire for confidence in research results. Here, we suggest a framework for a systematic process, based on consensus principles of measurement science, to guide researchers and reviewers in assessing, documenting, and mitigating the sources of uncertainty in a study. All study results have associated ambiguities that are not always clarified by simply establishing reproducibility. By explicitly considering sources of uncertainty, noting aspects of the experimental system that are difficult to characterize quantitatively, and proposing alternative interpretations, the researcher provides information that enhances comparability and reproducibility.
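One consensus tool from measurement science for documenting such sources is the root-sum-square combination of independent standard uncertainties, as in the GUM. The components and values below are placeholders, not from the paper.

```python
import numpy as np

# Illustrative uncertainty budget (all in the measurand's units).
components = {
    "instrument calibration": 0.8,
    "day-to-day variation": 1.5,
    "operator/protocol": 0.6,
}
u_combined = np.sqrt(sum(u ** 2 for u in components.values()))
U_expanded = 2.0 * u_combined   # coverage factor k = 2, roughly 95% coverage
```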
46th Annual Gun and Missile Systems Conference and Exhibition. Volume 3 - Thursday
2011-09-01
This line is conference-program residue; the recoverable items are: "...Grade Sensors Through Use of Accelerated Aging Principles" (Mr. Scott Gift), "11657 Modeling of the Autofrettage Processes of a Gun Barrel" (Mr. Sudhir ...), "Emissions Measured on the Outer Portion of a Composite Barrel" (Ms. Rushie Ghimire), "Gun & Missile Systems Additional Authors" headings, and fuze-safety notes: transportation, loading, gun fire to barrel exit, after barrel exit; passing: fuze safety devices remain safe, safe for disposal or safe for ...
Ensuring Food Integrity by Metrology and FAIR Data Principles
Rychlik, Michael; Zappa, Giovanna; Añorga, Larraitz; Belc, Nastasia; Castanheira, Isabel; Donard, Olivier F. X.; Kouřimská, Lenka; Ogrinc, Nives; Ocké, Marga C.; Presser, Karl; Zoani, Claudia
2018-01-01
Food integrity is a general term for sound, nutritive, healthy, tasty, safe, authentic, traceable, as well as ethically, safely, environment-friendly, and sustainably produced foods. In order to verify these properties, analytical methods with a higher degree of accuracy, sensitivity, standardization and harmonization and a harmonized system for their application in analytical laboratories are required. In this view, metrology offers the opportunity to achieve these goals. In this perspective article the current global challenges in food analysis and the principles of metrology to fill these gaps are presented. Therefore, the pan-European project METROFOOD-RI within the framework of the European Strategy Forum on Research Infrastructures (ESFRI) was developed to establish a strategy to allow reliable and comparable analytical measurements in foods along the whole process line starting from primary producers until consumers and to make all data findable, accessible, interoperable, and re-usable according to the FAIR data principles. The initiative currently consists of 48 partners from 18 European Countries and concluded its “Early Phase” as research infrastructure by organizing its future structure and presenting a proof of concept by preparing, distributing and comprehensively analyzing three candidate Reference Materials (rice grain, rice flour, and oyster tissue) and establishing a system how to compile, process, and store the generated data and how to exchange, compare them and make them accessible in data bases. PMID:29872651
Guiding principles for evaluating the impacts of conservation interventions on human well-being
Woodhouse, Emily; Homewood, Katherine M.; Beauchamp, Emilie; Clements, Tom; McCabe, J. Terrence; Wilkie, David; Milner-Gulland, E. J.
2015-01-01
Measures of socio-economic impacts of conservation interventions have largely been restricted to externally defined indicators focused on income, which do not reflect people's priorities. Using a holistic, locally grounded conceptualization of human well-being instead provides a way to understand the multi-faceted impacts of conservation on aspects of people's lives that they value. Conservationists are engaging with well-being for both pragmatic and ethical reasons, yet current guidance on how to operationalize the concept is limited. We present nine guiding principles based around a well-being framework incorporating material, relational and subjective components, and focused on gaining knowledge needed for decision-making. The principles relate to four key components of an impact evaluation: (i) defining well-being indicators, giving primacy to the perceptions of those most impacted by interventions through qualitative research, and considering subjective well-being, which can affect engagement with conservation; (ii) attributing impacts to interventions through quasi-experimental designs, or alternative methods such as theory-based, case study and participatory approaches, depending on the setting and evidence required; (iii) understanding the processes of change including evidence of causal linkages, and consideration of trajectories of change and institutional processes; and (iv) data collection with methods selected and applied with sensitivity to research context, consideration of heterogeneity of impacts along relevant societal divisions, and conducted by evaluators with local expertise and independence from the intervention. PMID:26460137
Ensuring Food Integrity by Metrology and FAIR Data Principles.
Rychlik, Michael; Zappa, Giovanna; Añorga, Larraitz; Belc, Nastasia; Castanheira, Isabel; Donard, Olivier F X; Kouřimská, Lenka; Ogrinc, Nives; Ocké, Marga C; Presser, Karl; Zoani, Claudia
2018-01-01
Food integrity is a general term for foods that are sound, nutritive, healthy, tasty, safe, authentic, and traceable, as well as ethically, safely, environmentally friendly, and sustainably produced. In order to verify these properties, analytical methods with a higher degree of accuracy, sensitivity, standardization and harmonization, and a harmonized system for their application in analytical laboratories, are required. In this view, metrology offers the opportunity to achieve these goals. In this perspective article, the current global challenges in food analysis and the principles of metrology to fill these gaps are presented. To this end, the pan-European project METROFOOD-RI, within the framework of the European Strategy Forum on Research Infrastructures (ESFRI), was developed to establish a strategy for reliable and comparable analytical measurements in foods along the whole process line, from primary producers to consumers, and to make all data findable, accessible, interoperable, and re-usable according to the FAIR data principles. The initiative currently consists of 48 partners from 18 European countries and concluded its "Early Phase" as a research infrastructure by organizing its future structure and presenting a proof of concept: preparing, distributing, and comprehensively analyzing three candidate Reference Materials (rice grain, rice flour, and oyster tissue) and establishing a system for compiling, processing, and storing the generated data, and for exchanging and comparing them and making them accessible in databases.
Theory of sampling: four critical success factors before analysis.
Wagner, Claas; Esbensen, Kim H
2015-01-01
Food and feed materials characterization, risk assessment, and safety evaluations can only be ensured if QC measures are based on valid analytical data, stemming from representative samples. The Theory of Sampling (TOS) is the only comprehensive theoretical framework that fully defines all requirements to ensure sampling correctness and representativity, and to provide the guiding principles for sampling in practice. TOS also defines the concept of material heterogeneity and its impact on the sampling process, including the effects from all potential sampling errors. TOS's primary task is to eliminate bias-generating errors and to minimize sampling variability. Quantitative measures are provided to characterize material heterogeneity, on which an optimal sampling strategy should be based. Four critical success factors preceding analysis to ensure a representative sampling process are presented here.
Low-cost optical data acquisition system for blade vibration measurement
NASA Technical Reports Server (NTRS)
Posta, Stephen J.
1988-01-01
A low cost optical data acquisition system was designed to measure deflection of vibrating rotor blade tips. The basic principle of the new design is to record raw data, which is a set of blade arrival times, in memory and to perform all processing by software following a run. This approach yields a simple and inexpensive system with the least possible hardware. Functional elements of the system were breadboarded and operated satisfactorily during rotor simulations on the bench, and during a data collection run with a two-bladed rotor in the Lewis Research Center Spin Rig. Software was written to demonstrate the sorting and processing of data stored in the system control computer, after retrieval from the data acquisition system. The demonstration produced an accurate graphical display of deflection versus time.
Use of Invertebrate Animals to Teach Physiological Principles.
ERIC Educational Resources Information Center
Deyrup-Olsen, Ingrith; Linder, Thomas M.
1991-01-01
The advantages of using invertebrates in teaching physiological principles are discussed. The ability to illustrate with greater clarity physiological principles, the range and variety of physiological processes available for examination, and the unlimited possibilities for student research are topics of discussion. (KR)
Principles and Techniques of Radiation Chemistry.
ERIC Educational Resources Information Center
Dorfman, Leon M.
1981-01-01
Discusses the physical processes involved in the deposition of energy from ionizing radiation in the absorber system. Identifies principles relevant to these processes which are responsible for ionization and excitation of the components of the absorber system. Briefly describes some experimental techniques in use in radiation chemical studies.…
Integrating Leadership Processes: Redefining the Principles Course.
ERIC Educational Resources Information Center
Neff, Bonita Dostal
2002-01-01
Revamps the principles of a public relations course, the first professional course in the public relations sequence, by integrating a leadership process and a service-learning component. Finds that more students are reflecting the interpersonal and team skills desired in the 1998 national study on public relations. (SG)
Traditional Chinese medicine on the effects of low-intensity laser irradiation on cells
NASA Astrophysics Data System (ADS)
Liu, Timon C.; Duan, Rui; Li, Yan; Cai, Xiongwei
2002-04-01
In a previous paper, process-specific times (PSTs) were defined using molecular reaction dynamics and the time quantum theory established by TCY Liu et al., and the changes of the PSTs of two weakly nonlinearly coupled bio-processes were shown to be parallel, which is called the time parallel principle (TPP). The PST of a physiological process (PP) is called its physiological time (PT). After the PTs of two PPs were compared with their Yin-Yang properties in traditional Chinese medicine (TCM), the PST model of Yin and Yang (YPTM) was put forward: of two related processes, the process with the smaller PST is Yin, and the other is Yang. The Yin-Yang parallel principle (YPP), the fundamental principle of TCM in this framework, was put forward in terms of the YPTM and the TPP. In this paper, we apply it to study TCM accounts of the effects of low-intensity laser irradiation on cells and successfully explain the observed phenomena.
Romano, Francesco; Gustén, Jan; De Antonellis, Stefano; Joppolo, Cesare M
2017-01-30
Air cleanliness in operating theatres (OTs) is an important factor in preserving the health of both the patient and the medical staff. Particle contamination in OTs depends mainly on the surgery process, the ventilation principle, personnel clothing systems and working routines. In many open surgical operations, electrosurgical tools (ESTs) are used for tissue cauterization. ESTs generate significant airborne contamination in the form of surgical smoke, which is a work-environment quality problem: ordinary surgical masks and OT ventilation systems are inadequate to control it. This research work is based on numerous monitoring campaigns of ultrafine particle concentrations in OTs equipped with upward displacement ventilation or with a downward unidirectional airflow system. Measurements performed during ten real surgeries highlight that the use of ESTs generates a sharp and substantial increase in particle concentration in the surgical area as well as within the entire OT. The measured contamination levels in the OTs are linked to the surgical operation, the ventilation principle, and the ESTs used. A better knowledge of airborne contamination is crucial for limiting personnel exposure to surgical smoke. The results highlight that downward unidirectional OTs can provide better conditions for adequate ventilation and contaminant removal than OTs equipped with upward displacement ventilation systems.
Chew, Gina; Walczyk, Thomas
2013-04-02
Subtle variations in the isotopic composition of elements carry unique information about physical and chemical processes in nature and are now exploited widely in diverse areas of research. Reliable measurement of natural isotope abundance variations is among the biggest challenges in inorganic mass spectrometry, as such measurements are highly sensitive to methodological bias. For decades, double spiking of the sample with a mix of two stable isotopes has been considered the reference technique for measuring such variations, both by multicollector inductively coupled plasma mass spectrometry (MC-ICPMS) and by multicollector thermal ionization mass spectrometry (MC-TIMS). However, this technique can only be applied to elements having at least four stable isotopes. Here we present a novel approach that requires measurement of three isotope signals only and is more robust than the conventional double spiking technique. This became possible by gravimetric mixing of the sample with an isotopic spike in different proportions and by applying principles of isotope dilution for data analysis (GS-IDA). The potential and principal use of the technique are demonstrated for Mg in human urine using MC-TIMS for isotopic analysis. Mg is an element inaccessible to double spiking methods, as it consists of three stable isotopes only, and it shows great potential for metabolically induced isotope effects waiting to be explored.
Marino, Christopher J; Mahan, Robert R
2005-01-01
The nutrition label format currently used by consumers to make dietary-related decisions presents significant information-processing demands for integration-based decisions; however, those demands were not considered as primary factors when the format was adopted. Labels designed in accordance with known principles of cognitive psychology might enhance the kind of decision making that food labeling was intended to facilitate. Three experiments were designed on the basis of the proximity compatibility principle (PCP) to investigate the relationship between nutrition label format and decision making; the experiments involved two types of integration decisions and one type of filtering decision. Based on the PCP, decision performance was measured to test the overall hypothesis that matched task-display tandems would result in better decision performance than would mismatched tandems. In each experiment, a statistically significant increase in mean decision performance was found when the display design was cognitively matched to the demands of the task. Combined, the results from all three experiments support the general hypothesis that task-display matching is a design principle that may enhance the utility of nutrition labeling in nutrition-related decision making. Actual or potential applications of this research include developing robust display solutions that aid in less effortful assimilation of nutrition-related information for consumers.
De Vincenzi, M
1996-01-01
This paper presents three experiments on the parsing of Italian wh-questions that manipulate the wh-type (who vs. which-N) and the wh-extraction site (main clause, or dependent clause with or without complementizer). The aim of these manipulations is to see whether the parser is sensitive to the type of dependencies being processed and whether the processing effects can be explained by a single processing principle, the minimal chain principle (MCP; De Vincenzi, 1991). The results show that the parser, following the MCP, prefers structures with fewer and less complex chains. In particular: (1) there is a processing advantage for wh-subject extractions, the structures with less complex chains; (2) there is a processing dissociation between who and which questions; and (3) the parser respects the principle that governs the well-formedness of empty categories (the ECP).
Online devices and measuring systems for the automatic control of newspaper printing
NASA Astrophysics Data System (ADS)
Marszalec, Elzbieta A.; Heikkila, Ismo; Juhola, Helene; Lehtonen, Tapio
1999-09-01
The paper reviews state-of-the-art color measuring systems used for the control of newspaper printing. The printing process requirements are specified, and different off-line and on-line color quality control systems, both commercially available and under development, are evaluated. Recent market trends in newspaper printing are discussed on the basis of the survey. The study drew on information from conference proceedings (TAGA, IARIGAI, SPIE and IS&T), journals (American Printer, Applied Optics), discussions with experts (GMI, QTI, HONEYWELL, TOBIAS, GretagMacbeth), IFRA Expo'98/Quality Measuring Technologies, commercial brochures, and the Internet. Against the background of this review, three different measuring principles currently under investigation at VTT Information Technology are described and their applicability to newspaper printing is evaluated.
Dynamics of non-stationary processes that follow the maximum of the Rényi entropy principle.
Shalymov, Dmitry S; Fradkov, Alexander L
2016-01-01
We propose dynamics equations which describe the behaviour of non-stationary processes that follow the maximum Rényi entropy principle. The equations are derived on the basis of the speed-gradient principle originating in control theory. The maximum Rényi entropy principle is analysed for discrete and continuous cases, using both a discrete random variable and a probability density function (PDF). We consider mass conservation and energy conservation constraints and demonstrate the uniqueness of the limit distribution and the asymptotic convergence of the PDF in both cases. The coincidence of the limit distribution of the proposed equations with the Rényi distribution is examined.
Dynamics of non-stationary processes that follow the maximum of the Rényi entropy principle
2016-01-01
We propose dynamics equations which describe the behaviour of non-stationary processes that follow the maximum Rényi entropy principle. The equations are derived on the basis of the speed-gradient principle originating in control theory. The maximum Rényi entropy principle is analysed for discrete and continuous cases, using both a discrete random variable and a probability density function (PDF). We consider mass conservation and energy conservation constraints and demonstrate the uniqueness of the limit distribution and the asymptotic convergence of the PDF in both cases. The coincidence of the limit distribution of the proposed equations with the Rényi distribution is examined. PMID:26997886
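As a numerical illustration of the principle (this is not the paper's speed-gradient derivation; the distribution, α, and step size are all illustrative), projected gradient ascent on the Rényi entropy of a discrete distribution, with total mass conserved, converges to the uniform distribution, the constrained maximiser:

```python
import numpy as np

def renyi_entropy(p, alpha):
    # H_alpha(p) = log(sum_i p_i^alpha) / (1 - alpha), for alpha != 1
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def grad_renyi(p, alpha):
    # dH/dp_i = alpha * p_i^(alpha - 1) / ((1 - alpha) * sum_j p_j^alpha)
    return (alpha / (1.0 - alpha)) * p ** (alpha - 1) / np.sum(p ** alpha)

p, alpha, dt = np.array([0.7, 0.2, 0.1]), 2.0, 0.05
for _ in range(2000):
    g = grad_renyi(p, alpha)
    g -= g.mean()                        # project onto the simplex (mass conservation)
    p = np.clip(p + dt * g, 1e-12, None)
    p /= p.sum()                         # guard against numerical drift
print(p, renyi_entropy(p, alpha))        # p approaches uniform, H -> log(3)
```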
Common computational properties found in natural sensory systems
NASA Astrophysics Data System (ADS)
Brooks, Geoffrey
2009-05-01
Throughout the animal kingdom there are many existing sensory systems with capabilities desired by the human designers of new sensory and computational systems. There are a few basic design principles constantly observed among these natural mechano-, chemo-, and photo-sensory systems, principles that have been proven by the test of time. Such principles include non-uniform sampling and processing, topological computing, contrast enhancement by localized signal inhibition, graded localized signal processing, spiked signal transmission, and coarse coding, which is the computational transformation of raw data using broadly overlapping filters. These principles are outlined here with references to natural biological sensory systems as well as successful biomimetic sensory systems exploiting these natural design concepts.
French, Katy E; Albright, Heidi W; Frenzel, John C; Incalcaterra, James R; Rubio, Augustin C; Jones, Jessica F; Feeley, Thomas W
2013-12-01
The value and impact of process improvement initiatives are difficult to quantify. We describe the use of time-driven activity-based costing (TDABC) in a clinical setting to quantify the value of process improvements in terms of cost, time and personnel resources. The problem: difficulty in identifying and measuring the cost savings of process improvement initiatives in a Preoperative Assessment Center (PAC). The aim: use TDABC to measure the value of process improvement initiatives that reduce the costs of performing a preoperative assessment while maintaining the quality of the assessment. The approach: apply the principles of TDABC in a PAC to measure the value, from baseline, of two phases of performance improvement initiatives and determine the impact of each implementation in terms of cost, time and efficiency. Through two rounds of performance improvements, we quantified an overall reduction of 33% in the time spent by patients and personnel, which resulted in a 46% reduction in the costs of providing care in the center. The performance improvements resulted in a 17% decrease in the total number of full-time equivalents (FTEs) needed to staff the center and a 19% increase in the number of patients assessed in the center. Quality of care, as assessed by the rate of cancellations on the day of surgery, was not adversely impacted by the process improvements. © 2013 Published by Elsevier Inc.
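The core TDABC arithmetic is simple: the cost of an activity equals the capacity cost rate of each resource times the time it consumes. A minimal sketch with hypothetical rates and times (not the paper's data):

```python
# Hypothetical capacity cost rates ($/min) and per-assessment times (min).
rate = {"nurse": 1.20, "anesthesiologist": 4.50}
baseline = {"nurse": 45, "anesthesiologist": 15}
improved = {"nurse": 30, "anesthesiologist": 10}

cost = lambda times: sum(rate[r] * t for r, t in times.items())
before, after = cost(baseline), cost(improved)
print(before, after, 1 - after / before)   # 121.5, 81.0, ~33% cost reduction
```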
Improving quality of care in substance abuse treatment using five key process improvement principles
Hoffman, Kim A.; Green, Carla A.; Ford, James H.; Wisdom, Jennifer P.; Gustafson, David H.; McCarty, Dennis
2012-01-01
Process and quality improvement techniques have been successfully applied in health care arenas, but efforts to institute these strategies in alcohol and drug treatment are underdeveloped. The Network for the Improvement of Addiction Treatment (NIATx) teaches participating substance abuse treatment agencies to use process improvement strategies to increase client access to, and retention in, treatment. NIATx recommends five principles to promote organizational change: 1) Understand and involve the customer; 2) Fix key problems; 3) Pick a powerful change leader; 4) Get ideas from outside the organization; and 5) Use rapid-cycle testing. Using case studies, supplemented with cross-agency analyses of interview data, this paper profiles participating NIATx treatment agencies that illustrate application of each principle. Results suggest that the most successful organizations integrate and apply most, if not all, of the five principles as they develop and test change strategies. PMID:22282129
NASA Astrophysics Data System (ADS)
Li, Tianxing; Zhou, Junxiang; Deng, Xiaozhong; Li, Jubo; Xing, Chunrong; Su, Jianxin; Wang, Huiliang
2018-07-01
A manufacturing error of a cycloidal gear is the key factor affecting the transmission accuracy of a robot rotary vector (RV) reducer. A methodology is proposed to realize the digitized measurement and data processing of cycloidal gear manufacturing error based on a gear measuring center, which can quickly and accurately measure and evaluate the manufacturing error of the cycloidal gear using both whole-tooth-profile and single-tooth-profile measurements. By analyzing the particularity of the cycloidal profile and its effect on the actual meshing characteristics of the RV transmission, the cycloid profile measurement strategy is planned, and the theoretical profile model and error measurement model of the cycloid-pin gear transmission are established. Through digital processing technology, the theoretical trajectory of the probe and the normal vector of the measured point are calculated. By means of precision measurement principles and error compensation theory, a mathematical model for the accurate calculation and processing of the manufacturing error data is constructed, and the actual manufacturing error of the cycloidal gear is obtained by an iterative optimization solution. Finally, measurement experiments on the cycloidal gear tooth profile are carried out on the gear measuring center and on a HEXAGON coordinate measuring machine, respectively. The measurement results verify the correctness and validity of the measurement theory and method. This methodology provides a basis for the accurate evaluation and effective control of the manufacturing precision of cycloidal gears in robot RV reducers.
Transmission loss measurement of acoustic material using time-domain pulse-separation method (L).
Sun, Liang; Hou, Hong
2011-04-01
An alternative method for measuring normal incidence sound transmission loss (nSTL) is presented in this paper, based on the time-domain separation of a so-called Butterworth pulse with a short duration of about 1 ms in a standing wave tube. During generation of the pulse, the inverse filter principle is adopted to compensate for the loudspeaker response; in addition, the effect of the characteristics of the tube termination can be eliminated through the pulse generation process, so that a single plane pulse wave is obtained in the standing wave tube, which makes the nSTL measurement very simple. A polyurethane foam material with low transmission loss and a rubber material with relatively high transmission loss are used to verify the proposed method. Compared with the traditional two-load method, relatively good agreement between the two methods is observed. The main error of this method results from the accuracy with which the amplitude of the transmission coefficient is measured.
NASA Astrophysics Data System (ADS)
Ströhl, Florian; Wong, Hovy H. W.; Holt, Christine E.; Kaminski, Clemens F.
2018-01-01
Fluorescence anisotropy imaging microscopy (FAIM) measures the depolarization properties of fluorophores to deduce molecular changes in their environment. For successful FAIM, several design principles have to be considered, and a thorough system-specific calibration protocol is paramount. One important calibration parameter is the G factor, which describes the system-induced errors for different polarization states of light. The determination and calibration of the G factor are discussed in detail in this article. We present a novel measurement strategy, which is particularly suitable for FAIM with high numerical aperture objectives operating in TIRF illumination mode. The method makes use of evanescent fields that excite the sample with a polarization direction perpendicular to the image plane. Furthermore, we have developed an ImageJ/Fiji plugin, AniCalc, for FAIM data processing. We demonstrate the capabilities of our TIRF-FAIM system by measuring β-actin polymerization in human embryonic kidney cells and in retinal neurons.
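For context, the G factor enters the standard steady-state anisotropy formula as a correction to the perpendicular detection channel; this is the generic textbook relation, not the article's TIRF-specific calibration procedure:

```python
def anisotropy(I_par, I_perp, G):
    # r = (I_par - G * I_perp) / (I_par + 2 * G * I_perp)
    # G compensates the detection system's polarization-dependent sensitivity.
    return (I_par - G * I_perp) / (I_par + 2.0 * G * I_perp)

print(anisotropy(1200.0, 800.0, 1.1))   # example intensities in counts
```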
Fluorescence lifetime as a new parameter in analytical cytology measurements
NASA Astrophysics Data System (ADS)
Steinkamp, John A.; Deka, Chiranjit; Lehnert, Bruce E.; Crissman, Harry A.
1996-05-01
A phase-sensitive flow cytometer has been developed to quantify fluorescence decay lifetimes on fluorochrome-labeled cells/particles. This instrument combines flow cytometry (FCM) and frequency-domain fluorescence spectroscopy measurement principles to provide unique capabilities for making phase-resolved lifetime measurements, while preserving conventional FCM capabilities. Cells are analyzed as they intersect a high-frequency, intensity-modulated (sine wave) laser excitation beam. Fluorescence signals are processed by conventional and phase-sensitive signal detection electronics and displayed as frequency distribution histograms. In this study we describe results of fluorescence intensity and lifetime measurements on fluorescently labeled particles, cells, and chromosomes. Examples of measurements on intrinsic cellular autofluorescence, cells labeled with immunofluorescence markers for cell- surface antigens, mitochondria stains, and on cellular DNA and protein binding fluorochromes will be presented to illustrate unique differences in measured lifetimes and changes caused by fluorescence quenching. This innovative technology will be used to probe fluorochrome/molecular interactions in the microenvironment of cells/chromosomes as a new parameter and thus expand the researchers' understanding of biochemical processes and structural features at the cellular and molecular level.
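In frequency-domain instruments of this kind, a single-exponential lifetime follows from the phase shift of the modulated emission via tan(φ) = ωτ; a minimal sketch (the instrument's actual signal chain is more elaborate):

```python
import math

def phase_lifetime(phase_deg, mod_freq_hz):
    # tau = tan(phi) / omega for a single-exponential decay
    omega = 2.0 * math.pi * mod_freq_hz
    return math.tan(math.radians(phase_deg)) / omega

# a 45-degree shift at 10 MHz modulation implies tau of about 15.9 ns
print(phase_lifetime(45.0, 10e6) * 1e9)
```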
76 FR 13101 - Requirements for Processing, Clearing, and Transfer of Customer Positions
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-10
... organization (DCO) for clearing. Proposed regulations also would facilitate compliance with DCO Core Principle.... The Commission is further proposing related regulations implementing SEF Core Principle 7 (Financial Integrity of Transactions) and DCM Core Principle 11 (Financial Integrity of Transactions), requiring...
Streamlining the Acquisition Process: Should Program Directors be Granted Contracting Authority
1989-09-01
relationship between program directors and contracting officers contradicts basic management principles. One of Fayol’s principles of management is that...Franklin, Stephen G. Principles of Management , Eighth Edition. Homewood, II: Richard D. Irwin, Inc., 1982. Thybony, William W. Government Contracting based
Mobley, Kenyon B; Jones, Adam G
2013-03-01
The genetic mating system is a key component of the sexual selection process, yet methods for the quantification of mating systems remain controversial. One approach involves metrics derived from Bateman's principles, which are based on the variances in mating and reproductive success and the relationship between them. However, these metrics are extremely difficult to estimate for both sexes in open populations, because missing data can result in biased estimates. Here, we develop a novel approach for the estimation of mating system metrics based on Bateman's principles and apply it to a microsatellite-based parentage analysis of a natural population of the dusky pipefish, Syngnathus floridae. Our results show that both male and female dusky pipefish have significantly positive Bateman gradients. However, females exhibit larger values of the opportunity for sexual selection and the opportunity for selection than males. These differences translate into a maximum intensity of sexual selection (S'max) for females three times larger than that for males. Overall, this study identifies a critical source of bias that affects studies of mating systems in open populations, presents a novel method for overcoming this bias, and applies this method for the first time in a sex-role-reversed pipefish. © 2012 The Author(s). Evolution © 2012 The Society for the Study of Evolution.
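The metrics based on Bateman's principles are straightforward to compute once mating success (MS) and reproductive success (RS) are known for each individual; a sketch with hypothetical parentage data (the paper's bias-correcting estimator for open populations is not reproduced here):

```python
import numpy as np

ms = np.array([0, 1, 1, 2, 3, 3, 4])         # mates per individual (hypothetical)
rs = np.array([0, 8, 5, 14, 22, 19, 30])     # offspring per individual

bateman_gradient = np.polyfit(ms, rs, 1)[0]  # slope of RS regressed on MS
I_s = ms.var() / ms.mean() ** 2              # opportunity for sexual selection
I = rs.var() / rs.mean() ** 2                # opportunity for selection
print(bateman_gradient, I_s, I)
```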
Disaster Management: Mental Health Perspective
Math, Suresh Bada; Nirmala, Maria Christine; Moirangthem, Sydney; Kumar, Naveen C.
2015-01-01
Disaster mental health is based on the principles of ‘preventive medicine’. This principle has necessitated a paradigm shift from relief-centered post-disaster management to a holistic, multi-dimensional, integrated community approach of health promotion, disaster prevention, preparedness and mitigation, moving disaster management from its curative to its preventive aspects. This can be understood in terms of six ‘R’s: Readiness (preparedness), Response (immediate action), Relief (sustained rescue work), Rehabilitation (long-term remedial measures using community resources), Recovery (returning to normalcy) and Resilience (fostering). The prevalence of mental health problems in disaster-affected populations is two to three times higher than in the general population. Along with diagnosable mental disorders, the affected community also harbours a large number of sub-syndromal symptoms. The majority of acute-phase reactions and disorders are self-limiting, whereas long-term-phase disorders require assistance from mental health professionals. The role of psychotropic medication in preventing mental health morbidity is very limited, whereas the role of cognitive behaviour therapy (CBT) in mitigating mental health morbidity appears promising. The role of Psychological First Aid (PFA) and debriefing is not well established. Disaster management is a continuous and integrated cyclical process of planning, organising, coordinating and implementing measures to prevent and to manage disaster effectively. It is therefore time to integrate public health principles into disaster mental health. PMID:26664073
75 FR 78198 - Proposed Final Policy on Consultation and Coordination With Indian Tribes
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-15
... promote consistency in, and coordination of, the consultation process; and establish a management... found in Executive Order 13175. The Policy reflects the principles expressed in the 1984 EPA Policy for.... Definitions IV. Guiding Principles V. Consultation A. The Consultation Process B. What Activities May Involve...
[Factor Analysis: Principles to Evaluate Measurement Tools for Mental Health].
Campo-Arias, Adalberto; Herazo, Edwin; Oviedo, Heidi Celina
2012-09-01
The validation of a measurement tool in mental health is a complex process that usually starts by estimating reliability before approaching validity. Factor analysis is a way to determine the number of dimensions, domains or factors of a measurement tool, and is generally related to the construct validity of the scale. The analysis can be exploratory or confirmatory, and it helps in selecting the items with the best performance. For an acceptable factor analysis, it is necessary to follow a number of steps and recommendations, to conduct certain statistical tests, and to rely on a proper sample of participants. Copyright © 2012 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.
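A minimal exploratory factor analysis sketch on simulated scale items, using scikit-learn (illustrative only; the article discusses the methodology, not any particular software):

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)
latent = rng.normal(size=(300, 2))                 # two underlying dimensions
loadings = rng.normal(size=(2, 10))                # how the 10 items load on them
items = latent @ loadings + rng.normal(scale=0.5, size=(300, 10))

fa = FactorAnalysis(n_components=2).fit(items)
print(fa.components_)                              # estimated loadings (2 x 10)
```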
Processing data, for improved, accuracy, from device for measuring speed of sound in a gas
Owen, Thomas E.
2006-09-19
A method, used in connection with a pulse-echo type sensor for determining the speed of sound in a gas, for improving the accuracy of speed of sound measurements. The sensor operates on the principle that the speed of sound can be derived from the difference between the two-way travel times of signals reflected from two different target faces of the sensor. This time difference is derived by computing the cross correlation between the two reflections. The cross correlation function may be fitted to a parabola whose vertex represents the optimum time coordinate of the coherence peak, thereby providing an accurate measure of the two-way time difference.
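A sketch of the described processing, assuming digitized echo waveforms sampled at rate fs and a known target-face separation d (function and variable names are illustrative):

```python
import numpy as np

def subsample_lag(x, y, fs):
    # Lag of y relative to x: argmax of the cross correlation, refined by
    # fitting a parabola through the peak and its two neighbours (the vertex
    # gives the sub-sample time coordinate of the coherence peak).
    c = np.correlate(y, x, mode="full")
    k = int(np.argmax(c))
    y0, y1, y2 = c[k - 1], c[k], c[k + 1]
    delta = 0.5 * (y0 - y2) / (y0 - 2.0 * y1 + y2)
    return (k - (len(x) - 1) + delta) / fs

# speed of sound from the two-way path difference between the target faces:
# c_gas = 2 * d / subsample_lag(echo_near, echo_far, fs)
```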
NASA Technical Reports Server (NTRS)
Oran, W. A.; Reiss, D. A.; Berge, L. H.; Parker, H. W.
1979-01-01
The acoustic fields and levitation forces produced along the axis of a single-axis resonance system were measured. The system consisted of a St. Clair generator and a planar reflector. The levitation force was measured for bodies of various sizes and geometries (i.e., spheres, cylinders, and discs). The force was found to be roughly proportional to the volume of the body until the characteristic body radius reaches approximately 2/k (k = wave number). The acoustic pressures along the axis were modeled using Huygens principle and a method of imaging to approximate multiple reflections. The modeled pressures were found to be in reasonable agreement with those measured with a calibrated microphone.
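The quoted r ≈ 2/k threshold is easy to evaluate for a given drive frequency; for example, assuming a 20 kHz source in room-temperature air (the frequency is illustrative, not taken from the paper):

```python
import math

f, c = 20e3, 343.0              # drive frequency (Hz) and speed of sound (m/s)
k = 2.0 * math.pi * f / c       # wave number
print(2.0 / k * 1e3)            # characteristic radius, roughly 5.5 mm
```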
Kleidon, Axel
2009-06-01
The Earth system is maintained in a unique state far from thermodynamic equilibrium, as, for instance, reflected in the high concentration of reactive oxygen in the atmosphere. The myriad of processes that transform energy, that result in the motion of mass in the atmosphere, in oceans, and on land, processes that drive the global water, carbon, and other biogeochemical cycles, all have in common that they are irreversible in their nature. Entropy production is a general consequence of these processes and measures their degree of irreversibility. The proposed principle of maximum entropy production (MEP) states that systems are driven to steady states in which they produce entropy at the maximum possible rate given the prevailing constraints. In this review, the basics of nonequilibrium thermodynamics are described, as well as how these apply to Earth system processes. Applications of the MEP principle are discussed, ranging from the strength of the atmospheric circulation, the hydrological cycle, and biogeochemical cycles to the role that life plays in these processes. Nonequilibrium thermodynamics and the MEP principle have potentially wide-ranging implications for our understanding of Earth system functioning, how it has evolved in the past, and why it is habitable. Entropy production allows us to quantify an objective direction of Earth system change (closer to vs further away from thermodynamic equilibrium, or, equivalently, towards a state of MEP). When a maximum in entropy production is reached, MEP implies that the Earth system reacts to perturbations primarily with negative feedbacks. In conclusion, this nonequilibrium thermodynamic view of the Earth system shows great promise to establish a holistic description of the Earth as one system. This perspective is likely to allow us to better understand and predict its function as one entity, how it has evolved in the past, and how it is modified by human activities in the future.
Endoscopic fringe projection for in-situ inspection of a sheet-bulk metal forming process
NASA Astrophysics Data System (ADS)
Matthias, Steffen; Kästner, Markus; Reithmeier, Eduard
2015-05-01
Sheet-bulk metal forming is a new production process capable of performing deep-drawing and massive forming steps in a single operation. However, due to the high forming forces of the process, continuous process control is required in order to detect wear on the forming tool before production quality is impacted. To be able to measure the geometry of the forming tool in the limited space of forming presses, a new inspection system is being developed within the SFB/TR 73 collaborative research center. In addition to the limited space, the process restricts the amount of time available for inspection. Existing areal optical measurement systems suffer from shadowing when measuring the tool's inner elements, as they cannot be placed in the limited space next to the tool, while tactile measurement systems cannot meet the time restrictions for measuring the areal geometries. The new inspection system uses the fringe projection optical measurement principle to capture areal geometry data from relevant parts of the forming tool in a short time. High-resolution image fibers connect the system's compact sensor head to a base unit containing both the camera and the projector of the fringe projection system, which can be positioned outside the moving parts of the press. To enable short measurement times, a high-intensity laser source is used in the projector in combination with a digital micro-mirror device. Gradient index lenses are featured in the sensor head to allow for a very compact design that can be used in the narrow space above the forming tool inside the press. The sensor head is attached to an extended arm, which also guides the image fibers to the base unit. A rotation stage offers the possibility of capturing measurements of different functional elements on the circular forming tool by changing the orientation of the sensor head next to the tool. During operation of the press, the arm can be moved out of the path of the press's moving parts. To further reduce the measurement times of the fringe projection system, the inverse fringe projection principle has been adapted to the system to detect geometry deviations in a single camera image. Challenges arise from vibrations of both the forming machine and the positioning stages, which are transferred via the extended arm to the sensor head. Vibrations interfere with the analysis algorithms of both encoded and inverse fringe projection and thus impair measurement accuracy. To evaluate the impact of vibrations on the endoscopic system, results of measurements of simple geometries under the influence of vibrations are discussed. The effect of vibrations is imitated by displacing the measurement specimen during the measurement with a linear positioning stage. The concept of the new inspection system is presented within the scope of the TR 73 demonstrational sheet-bulk metal forming process. Finally, the capabilities of the endoscopic fringe projection system are shown by measurements of gearing structures on a forming tool compared to a CAD reference.
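For background, fringe projection systems commonly recover the wrapped phase from four fringe images shifted by 90 degrees each; the relation below is that standard building block, not the paper's encoded or inverse algorithms:

```python
import numpy as np

def wrapped_phase(I1, I2, I3, I4):
    # Intensities I_k = A + B*cos(phi + (k-1)*pi/2), so
    # I4 - I2 = 2B*sin(phi) and I1 - I3 = 2B*cos(phi).
    return np.arctan2(I4 - I2, I1 - I3)   # wrapped to (-pi, pi]
```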
Herens, Marion; Wagemakers, Annemarie; Vaandrager, Lenneke; Koelen, Maria
2015-11-25
Physical inactivity is a core risk factor for non-communicable diseases. In the Netherlands, socially vulnerable groups are relatively less active than groups with higher socio-economic status. Community-based health-enhancing physical activity (CBHEPA) programs aim to empower socially vulnerable groups by improving participants' health and wellbeing through physical activity. CBHEPA programs often revolve around group-based principles for action, such as active participation, enjoyment, and fostering group processes. As such principles are rarely made explicit, our study aims to identify which of the group-based principles for action are considered important by participants. Respondents (n = 76) from ten focus groups scored their individual appreciation of group-based principles for action - active participation, enjoyment, and fostering group processes - on a three-point, statement-based scale. Opinions were further discussed in the focus group. Focus group discussions were transcribed and analysed by a team of investigators. The coding procedures, identifying elements appreciated in group-based principles for action, were thematic and data driven. Statements about participatory programming generated much less consensus in appreciation among respondents than statements about enjoyment and fostering group processes. To some extent, group members participated in the development of program content. Participation in group formation or community initiatives was less frequently perceived as something within group members' control. Enjoyment, expressed as physical and emotional experiences, was found to be an individual driver of group exercise. Fostering group processes, expressed as social support, was found to contribute to enjoyment and learning achievements. Responsive leadership, ensuring responsive guidance, by an enthusiastic exercise trainer acting as a role model, were identified as additional necessary principles for action. Group-based principles for action in CBHEPA programs are not clearly demarcated. Fostering group processes is an overarching principle, conditional for the spin-off in terms of enjoyment and active participation. This, in turn, leads to a sense of ownership among participants, who take up responsibility for the exercise group as well as their individual activity behaviour. CBHEPA programs thrive on participants having fun together and exercise trainers' leadership skills. A professional, competent, responsive exercise trainer plays a key role in the organisation and maintenance of CBHEPA programs.
The balance principle in scientific research.
Hu, Liang-ping; Bao, Xiao-lei; Wang, Qi
2012-05-01
The principles of balance, randomization, control and repetition, which are closely related, constitute the four principles of scientific research. The balance principle is the kernel of the four and runs through the other three. In scientific research, however, the balance principle is often overlooked. If it is not properly applied, the research conclusions are easily challenged, which may lead to the failure of the whole study. It is therefore essential to have a good command of the balance principle in scientific research. This article stresses the definition and function of the balance principle, strategies and detailed measures for improving balance in scientific research, and an analysis of common mistakes in the use of the balance principle in scientific research.
Assessing the Infusion of Sustainability Principles into University Curricula
ERIC Educational Resources Information Center
Biasutti, Michele; De Baz, Theodora; Alshawa, Hala
2016-01-01
The current paper presents the assessment of the infusion of sustainability principles into university curricula at two Jordanian universities. The peer review process of revising the curricula infusing sustainability principles is also discussed. The research methodology involved quantitative methods to assess the revised courses. The results…
ERIC Educational Resources Information Center
MacBeath, John; Swaffield, Sue; Frost, David
2009-01-01
This article provides an overview of the "Carpe Vitam: Leadership for Learning" project, accounting for its provenance and purposes, before focusing on the principles for practice that constitute an important part of the project's legacy. These principles framed the dialogic process that was a dominant feature of the project and are presented,…
Research to Assembly Scheme for Satellite Deck Based on Robot Flexibility Control Principle
NASA Astrophysics Data System (ADS)
Guo, Tao; Hu, Ruiqin; Xiao, Zhengyi; Zhao, Jingjing; Fang, Zhikai
2018-03-01
Deck assembly is a critical quality control point in the final satellite assembly process, and cable extrusion and structure collision problems during assembly directly affect the development quality and schedule of the satellite. Aimed at these problems, an assembly scheme for the satellite deck based on the robot flexibility control principle is proposed in this paper. The scheme is introduced first; key technologies for end-force perception and flexible docking control within the scheme are then studied; the implementation process of the assembly scheme is described in detail; and finally an actual application case of the scheme is given. The results show that, compared with the traditional assembly scheme, the assembly scheme based on the robot flexibility control principle has obvious advantages in work efficiency, reliability, universality and other aspects.
The action uncertainty principle and quantum gravity
NASA Astrophysics Data System (ADS)
Mensky, Michael B.
1992-02-01
Results of the path-integral approach to the quantum theory of continuous measurements were formulated in a preceding paper in the form of an inequality of the type of the uncertainty principle. The new inequality was called the action uncertainty principle (AUP). It was shown that the AUP allows one to find in a simple way which outputs of a continuous measurement will occur with high probability. Here a simpler form of the AUP is formulated, δS ≳ ħ. When applied to quantum gravity, it leads in a very simple way to the Rosenfeld inequality for the measurability of the average curvature.
Synchronization invariance under network structural transformations
NASA Astrophysics Data System (ADS)
Arola-Fernández, Lluís; Díaz-Guilera, Albert; Arenas, Alex
2018-06-01
Synchronization processes are ubiquitous despite the many connectivity patterns that complex systems can show. Usually, the emergence of synchrony is a macroscopic observable; however, the microscopic details of the system, such as the underlying network of interactions, are often partially or totally unknown. We already know that different interaction structures can give rise to a common functionality, understood as a common macroscopic observable. Building upon this fact, here we propose network transformations that keep the collective behavior of a large system of Kuramoto oscillators invariant. We derive a method based on information theory principles that allows us to adjust the weights of the structural interactions so as to map random homogeneous in-degree networks into random heterogeneous networks and vice versa, keeping synchronization values invariant. The results of the proposed transformations reveal an interesting principle: heterogeneous networks can be mapped to homogeneous ones with local information, but the reverse process needs to exploit higher-order information. The formalism provides analytical insight for tackling real complex scenarios when dealing with uncertainty in measurements of the underlying connectivity structure.
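A minimal Kuramoto sketch showing the macroscopic observable at stake, the order parameter r = |⟨e^{iθ}⟩|; the paper's information-theoretic weight transformations (not reproduced here) aim to keep this value invariant across network structures. All parameter values are illustrative:

```python
import numpy as np

def kuramoto_step(theta, omega, W, dt=0.01):
    # dtheta_i/dt = omega_i + sum_j W_ij * sin(theta_j - theta_i)
    coupling = (W * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
    return theta + dt * (omega + coupling)

rng = np.random.default_rng(0)
n = 100
theta = rng.uniform(0.0, 2.0 * np.pi, n)   # initial phases
omega = rng.normal(0.0, 1.0, n)            # natural frequencies
W = np.full((n, n), 2.0 / n)               # homogeneous all-to-all weights
for _ in range(5000):
    theta = kuramoto_step(theta, omega, W)
print(abs(np.exp(1j * theta).mean()))      # order parameter r in [0, 1]
```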
Understanding linear measurement: A comparison of filipino and new zealand children
NASA Astrophysics Data System (ADS)
Irwin, Kathryn C.; Ell, Fiona R.; Vistro-Yu, Catherine P.
2004-06-01
An understanding of linear measurement depends on principles that include standard unit size, iteration of units, numbering of a unit at its end, and partial units for measuring continuous length. Children may learn these principles at school, for example through experience with informal measurement, or they may learn them through use of measurement in society. This study compared the application of these principles by children aged 8 and 9 from the Philippines and New Zealand. These countries were selected because they have quite different curricula, societal influences and economies. Ninety-one children were interviewed individually on a common set of unusual tasks that were designed to tap underlying principles. Results showed many similarities and some differences between countries. Most tasks requiring visualisation and informal units were done more accurately by New Zealand children. Some tasks involving the use of a conventional ruler were done more accurately by Filipino children. These differences appear to be related to differences in curricula and possibly to differences in societal use of measurement. We suggest that these results, like those of other writers cited, demonstrate the need for extensive work on the underlying concepts in measurement through work on informal measurement and a careful transition between informal and formal measurement.
The Basic Principles and Methods of the System Approach to Compression of Telemetry Data
NASA Astrophysics Data System (ADS)
Levenets, A. V.
2018-01-01
The task of compressing measurement data remains urgent for information-measurement systems. In this paper, the basic principles necessary for designing highly effective systems for the compression of telemetric information are proposed. The basis of the proposed principles is the representation of a telemetry frame as a single information space in which existing correlations can be found. Data transformation methods and compression algorithms realizing the proposed principles are described. The compression ratio of the proposed algorithm is about 1.8 times higher than that of a classic algorithm. The results of this study of the methods and algorithms thus show their good prospects.
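An illustrative transform in the spirit described: treat the telemetry frame as one correlated space and decorrelate channels against a reference before entropy coding, so the residuals stay small. This toy does not reproduce the paper's transforms or its 1.8x figure:

```python
import numpy as np

frame = np.array([[1000, 1002, 1003],      # channel 0 (reference)
                  [ 998, 1001, 1004],      # channel 1, correlated with 0
                  [1001, 1003, 1002]],     # channel 2, correlated with 0
                 dtype=np.int32)

residual = frame - frame[0]                # inter-channel decorrelation
print(residual)                            # small values: cheap to entropy-code
```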
On-line identification of fermentation processes for ethanol production.
Câmara, M M; Soares, R M; Feital, T; Naomi, P; Oki, S; Thevelein, J M; Amaral, M; Pinto, J C
2017-07-01
A strategy for monitoring fermentation processes, specifically the simultaneous saccharification and fermentation (SSF) of corn mash, was developed. The strategy covers the development and use of a first-principles, semimechanistic, unstructured process model based on the major kinetic phenomena, along with mass and energy balances. The model is then used as a reference model within an identification procedure capable of running on-line. The on-line identification procedure consists of updating the reference model by estimating corrective parameters for certain reaction rates using the most recent process measurements. The strategy makes use of standard laboratory measurements for sugar quantification together with in situ temperature and liquid level data. The model, along with the on-line identification procedure, has been tested against real industrial data and has been able to accurately predict the main variables of operational interest, i.e., the state variables and their dynamics, as well as key process indicators. The results demonstrate that the strategy is capable of monitoring, in real time, this complex industrial biomass fermentation. This new tool provides strong support for decision-making and opens a new range of opportunities for industrial optimization.
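A toy version of the on-line identification idea: a reference rate model is kept fixed, and a multiplicative correction to its rate constant is re-estimated from the most recent measurements. The one-parameter model, names, and values are assumptions for illustration only:

```python
import numpy as np
from scipy.optimize import least_squares

k_nom = 0.30                                   # nominal rate in the reference model
t = np.linspace(0.0, 5.0, 20)                  # recent sampling times (h)
y_meas = 10.0 * np.exp(-0.36 * t)              # measurements: plant behaves like k = 0.36

def model(c):
    return 10.0 * np.exp(-c * k_nom * t)       # reference model with corrective factor c

fit = least_squares(lambda c: model(c[0]) - y_meas, x0=[1.0])
print(fit.x[0])                                # ~1.2, i.e. corrected k ~ 0.36
```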
Moore, Alex M.; vanMarle, Kristy; Geary, David C.
2016-01-01
Fluency in first graders’ processing of the magnitudes associated with Arabic numerals, collections of objects, and mixtures of objects and numerals predicts current and future mathematics achievement. The quantitative competencies that support the development of fluent processing of magnitude are not fully understood, however. At the beginning and end of preschool (M = 3 years, 9 months at first assessment; range 3 years, 3 months to 4 years, 3 months), 112 (51 boys) children completed tasks measuring numeral recognition and comparison, acuity of the approximate number system, and knowledge of counting principles, cardinality, and implicit arithmetic, and completed a magnitude processing task (number sets test) in kindergarten. Use of Bayesian and linear regression techniques revealed that two measures of preschoolers’ cardinal knowledge and their competence at implicit arithmetic predicted later fluency of magnitude processing, controlling domain general factors, preliteracy skills, and parental education. The results help to narrow the search for the early foundation of children’s emerging competence with symbolic mathematics and provide direction for early interventions. PMID:27236038
Moore, Alex M; vanMarle, Kristy; Geary, David C
2016-10-01
Fluency in first graders' processing of the magnitudes associated with Arabic numerals, collections of objects, and mixtures of objects and numerals predicts current and future mathematics achievement. The quantitative competencies that support the development of fluent processing of magnitude, however, are not fully understood. At the beginning and end of preschool (M = 3 years 9 months at first assessment, range = 3 years 3 months to 4 years 3 months), 112 children (51 boys) completed tasks measuring numeral recognition and comparison, acuity of the approximate number system, and knowledge of counting principles, cardinality, and implicit arithmetic and also completed a magnitude processing task (number sets test) in kindergarten. Use of Bayesian and linear regression techniques revealed that two measures of preschoolers' cardinal knowledge and their competence at implicit arithmetic predicted later fluency of magnitude processing, controlling domain-general factors, preliteracy skills, and parental education. The results help to narrow the search for the early foundation of children's emerging competence with symbolic mathematics and provide direction for early interventions. Copyright © 2016 Elsevier Inc. All rights reserved.
Monitoring Earth Surface Dynamics With Optical Imagery
NASA Astrophysics Data System (ADS)
Leprince, Sébastien; Berthier, Etienne; Ayoub, François; Delacourt, Christophe; Avouac, Jean-Philippe
2008-01-01
The increasing availability of high-quality optical satellite images should allow, in principle, continuous monitoring of Earth's surface changes due to geologic processes, climate change, or anthropic activity. For instance, sequential optical images have been used to measure displacements at Earth's surface due to coseismic ground deformation [e.g., Van Puymbroeck et al., 2000], ice flow [Scambos et al., 1992; Berthier et al., 2005], sand dune migration [Crippen, 1992], and landslides [Kääb, 2002; Delacourt et al., 2004]. Surface changes related to agriculture, deforestation, urbanization, and erosion, which do not involve ground displacement, might also be monitored, provided that the images can be registered with sufficient accuracy. Although the approach is simple in principle, its use is still limited, mainly because of geometric distortions of the images induced by the imaging system, biased correlation techniques, and implementation difficulties.
Anomalous pressure dependence of thermal conductivities of large mass ratio compounds
Lindsay, Lucas R; Broido, David A.; Carrete, Jesus; ...
2015-03-27
The lattice thermal conductivities (k) of binary compound materials are examined as a function of hydrostatic pressure P using a first-principles approach. Compound materials with relatively small mass ratios, such as MgO, show an increase in k with P, consistent with measurements. Conversely, compounds with large mass ratios (e.g., BSb, BAs, BeTe, BeSe) exhibit decreasing k with increasing P, a behavior that cannot be understood using simple theories of k. This anomalous P dependence of k arises from the fundamentally different nature of the intrinsic scattering processes for heat-carrying acoustic phonons in large mass ratio compounds compared to those with small mass ratios. This work demonstrates the power of first-principles methods for thermal properties and advances the understanding of thermal transport in non-metals.
Palliative care, public health and justice: setting priorities in resource poor countries.
Blinderman, Craig
2009-12-01
Many countries have not considered palliative care a public health problem. With limited resources, disease-oriented therapies and prevention measures take priority. In this paper, I intend to describe the moral framework for considering palliative care as a public health priority in resource-poor countries. A distributive theory of justice for health care should consider integrative palliative care as morally required as it contributes to improving normal functioning and preserving opportunities for the individual. For patients requiring terminal care, we are guided less by principles of justice and more by the duty to relieve suffering and society's commitment to protecting the professional's obligation to uphold principles of beneficence, compassion and non-abandonment. A fair deliberation process is necessary to allow these strong moral commitments to serve as reasons when setting priorities in resource poor countries.
The NASA planning process, appendix D. [as useful planning approach for solving urban problems
NASA Technical Reports Server (NTRS)
Annett, H. A.
1973-01-01
The planning process that NASA used in making some fundamental post-Apollo decisions concerning the reusable space shuttle and the orbiting laboratory is outlined. It is suggested that the basic elements and principles of the process, when combined, form a useful planning approach for solving urban problems. These elements and principles are defined, along with the basic strengths of the planning model.
Process improvement of pap smear tracking in a women's medicine center clinic in residency training.
Calhoun, Byron C; Goode, Jeff; Simmons, Kathy
2011-11-01
Application of Six-Sigma methodology and Change Acceleration Process (CAP)/Work-Out (WO) tools to track pap smear results in an outpatient clinic in a hospital-based residency-training program. This was an observational study of the impact of changes obtained through the application of Six-Sigma principles to the clinic process, with particular attention to the prevention of sentinel events. Using cohort analysis and applying Six-Sigma principles to an interactive electronic medical record (Soarian) workflow engine, we designed a system for the timely accession and reporting of pap smear and pathology results. We compared manual processes from January 1, 2007 to February 28, 2008 with automated processes from March 1, 2008 to December 31, 2009. Using the Six-Sigma principles and CAP/WO tools, including the "voice of the customer" and a team-focused approach, no outlier events went untracked. Applying the Soarian workflow engine to track the prescribed 7-day turnaround time for completion, we identified 148 of 3,936 pap results, 3 of 15 non-gynecological results, and 41 of 246 surgical results. We applied Six-Sigma principles to an outpatient clinic, facilitating an interdisciplinary team approach to improve the clinic's reporting system. Through focused problem assessment, verification of process, and validation of outcomes, we improved patient care for pap smears and critical pathology. © 2011 National Association for Healthcare Quality.
An Overview of Legal Principles and Issues Affecting Postsecondary Athletics.
ERIC Educational Resources Information Center
Kaplin, William A.
1977-01-01
Discussions of procedural due process, first amendment rights, sex discrimination, tort law, discrimination on the basis of handicap, and legal principles regarding athletic associations and conferences indicate the wide range of legal principles to which postsecondary athletic programs are subject. Sex discrimination is noted as a major issue in…
Doctrine Development Process in the Kenya Army: Bridging the Gap
2014-06-13
concepts, and principles. It must broadly follow three doctrine development phases: the collection/information gathering phase; the formulation and...a capable lead organization. The organization must eliminate terminological and utility confusion among doctrine, concepts, and principles. It must...15 The Relationship Between Military Doctrine, Concept and Principle
Strong van der Waals attractive forces in nanotechnology
NASA Astrophysics Data System (ADS)
Reimers, Jeffrey
The Dobson classification scheme for failures of London-like expressions in describing dispersion is reviewed. New ways to measure, using STM data, and to calculate, by first principles, the free energies of organic self-assembly processes from solution will be discussed, considering tetraalkylporphyrins on graphite. How strong van der Waals forces can compete against covalent bonding to produce new molecular isomers and reaction pathways will also be demonstrated, focusing on gold-sulfur bonds for sensors and stabilizing nanoparticles.
46th Annual Gun and Missile Systems Conference and Exhibition
2011-09-01
Conference program excerpts: ...Grade Sensors Through Use of Accelerated Aging Principles, Mr. Scott Gift; Modeling of the Autofrettage Processes of a Gun Barrel, Mr. Sudhir...; ...Emissions Measured on the Outer Portion of a Composite Barrel, Ms. Rushie Ghimire.
2016-03-04
summary of the linear algebra involved. As we have seen, the RSC process begins with the interferometric phase measurement β, which due to wrapping will... (Elementary Divisors) in Section 2 and the following definition of the matrix determinant. This definition is given in many linear algebra texts (see... principle solve for a particular solution of this system by arbitrarily setting two object phases (whose spatial frequencies are not collinear) and one
Hayes, Steven C.; Levin, Michael E.; Plumb-Vilardaga, Jennifer; Villatte, Jennifer L.; Pistorello, Jacqueline
2012-01-01
A number of recent authors have compared acceptance and commitment therapy (ACT) and traditional cognitive behavior therapy (CBT). The present article describes ACT as a distinct and unified model of behavior change, linked to a specific strategy of scientific development, which we term “contextual behavioral science.” We outline the empirical progress of ACT and describe its distinctive development strategy. A contextual behavioral science approach is an inductive attempt to build more adequate psychological systems based on philosophical clarity; the development of basic principles and theories; the development of applied theories linked to basic ones; techniques and components linked to these processes and principles; measurement of theoretically key processes; an emphasis on mediation and moderation in the analysis of applied impact; an interest in effectiveness, dissemination, and training; empirical testing of the research program across a broad range of areas and levels of analysis; and the creation of a more effective scientific and clinical community. We argue that this is a reasonable approach, focused on long-term progress, and that in broad terms it seems to be working. ACT is not hostile to traditional CBT, and is not directly buoyed by whatever weaknesses traditional CBT may have. ACT should be measured at least in part against its own goals as specified by its own developmental strategy. PMID:23611068
Martínez-Pernía, David; González-Castán, Óscar; Huepe, David
2017-02-01
The development of rehabilitation has traditionally focused on measurements of motor disorders and of the improvements produced during the therapeutic process; however, the physical rehabilitation sciences have not focused on understanding the philosophical and scientific principles underlying clinical intervention and how they are interrelated. The main aim of this paper is to explain the foundation stones of the disciplines of physical therapy, occupational therapy, and speech/language therapy in recovery from motor disorder. To reach our goals, the mechanistic view and how it is integrated into physical rehabilitation will first be explained. Next, a classification of mechanistic therapy into an old version (automaton model) and a technological version (cyborg model) will be presented. Then, it will be shown how the physical rehabilitation sciences found a new perspective on motor recovery, based on functionalism, during the cognitive revolution of the 1960s. Through this cognitive theory, physical rehabilitation incorporated into motor recovery those therapeutic strategies that solicit activation of the brain and/or symbolic processing, aspects that were not taken into account in mechanistic therapy. In addition, a classification of functionalist rehabilitation into a computational therapy and a brain therapy will be presented. At the end of the article, the methodological principles of the physical rehabilitation sciences will be explained. This will allow us to go deeper into the differences and similarities between therapeutic mechanism and therapeutic functionalism.
Hayes, Steven C; Levin, Michael E; Plumb-Vilardaga, Jennifer; Villatte, Jennifer L; Pistorello, Jacqueline
2013-06-01
A number of recent authors have compared acceptance and commitment therapy (ACT) and traditional cognitive behavior therapy (CBT). The present article describes ACT as a distinct and unified model of behavior change, linked to a specific strategy of scientific development, which we term "contextual behavioral science." We outline the empirical progress of ACT and describe its distinctive development strategy. A contextual behavioral science approach is an inductive attempt to build more adequate psychological systems based on philosophical clarity; the development of basic principles and theories; the development of applied theories linked to basic ones; techniques and components linked to these processes and principles; measurement of theoretically key processes; an emphasis on mediation and moderation in the analysis of applied impact; an interest in effectiveness, dissemination, and training; empirical testing of the research program across a broad range of areas and levels of analysis; and the creation of a more effective scientific and clinical community. We argue that this is a reasonable approach, focused on long-term progress, and that in broad terms it seems to be working. ACT is not hostile to traditional CBT, and is not directly buoyed by whatever weaknesses traditional CBT may have. ACT should be measured at least in part against its own goals as specified by its own developmental strategy. Copyright © 2011. Published by Elsevier Ltd.
The minimal work cost of information processing
NASA Astrophysics Data System (ADS)
Faist, Philippe; Dupuis, Frédéric; Oppenheim, Jonathan; Renner, Renato
2015-07-01
Irreversible information processing cannot be carried out without some inevitable thermodynamic work cost. This fundamental restriction, known as Landauer's principle, is increasingly relevant today, as the energy dissipation of computing devices impedes further improvement of their performance. Here we determine the minimal work required to carry out any logical process, for instance a computation. It is given by the entropy of the discarded information conditioned on the output of the computation. Our formula takes precise account of the statistically fluctuating work requirement of the logical process. It enables the explicit calculation of practical scenarios, such as computational circuits or quantum measurements. On the conceptual level, our result gives a precise and operational connection between thermodynamic and information entropy, and explains the emergence of the entropy state function in macroscopic thermodynamics.
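A compact, simplified statement of that headline result (the symbols here are illustrative assumptions, not the paper's notation: X the discarded information, Y the computation's output, H the conditional entropy in bits; the paper itself works with more refined, one-shot entropy measures):

\[
W_{\min} \;=\; k_B T \ln 2 \; H(X \mid Y)
\]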
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gonçalves, L.A.; Olavo, L.S.F., E-mail: olavolsf@gmail.com
Dissipation in Quantum Mechanics took some time to become a robust field of investigation after the birth of the field. The main issue hindering developments in the field is that the quantization process was always tightly connected to the Hamiltonian formulation of Classical Mechanics. In this paper we present a quantization process that does not depend upon the Hamiltonian formulation of Classical Mechanics (although it still departs from Classical Mechanics) and thus overcomes the problem of finding, from first principles, a completely general Schrödinger equation encompassing dissipation. This generalized process of quantization is shown to be nothing but an extension of a more restricted version that is shown to produce the Schrödinger equation for Hamiltonian systems from first principles (even for Hamiltonian velocity-dependent potentials). Highlights: • A quantization process independent of the Hamiltonian formulation of Classical Mechanics is proposed. • This quantization method is applied to dissipative or absorptive systems. • A dissipative Schrödinger equation is derived from first principles.
NASA Astrophysics Data System (ADS)
Santos, José; Janeiro, Fernando M.; Ramos, Pedro M.
2015-10-01
This paper presents an embedded liquid viscosity measurement system based on a vibrating wire sensor. Although multiple viscometers based on different working principles are commercially available, there is still a market demand for a dedicated measurement system capable of performing accurate, fast measurements and requiring little or no operator training for simple systems and solution monitoring. The developed embedded system is based on a vibrating wire sensor that works by measuring the impedance response of the sensor, which depends on the viscosity and density of the liquid in which the sensor is immersed. The core of the embedded system is a digital signal processor (DSP) which controls the waveform generation and acquisitions for the measurement of the impedance frequency response. The DSP also processes the acquired waveforms and estimates the liquid viscosity. The user can interact with the measurement system through a keypad and an LCD or through a computer with a USB connection for data logging and processing. The presented system is tested on a set of viscosity standards and the estimated values are compared with the standard manufacturer specified viscosity values. A stability study of the measurement system is also performed.
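The estimation step described above (fit the measured impedance resonance, then map the damping to a viscosity value) can be sketched as follows. This is a minimal illustration under stated assumptions: the Lorentzian line shape and the calibration relation via `k_cal` are invented placeholders, not the authors' actual DSP algorithm.

```python
# Hedged sketch: fit the impedance magnitude near resonance, then convert
# the relative damping to a viscosity estimate via a calibration constant.
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(f, f0, df, h, c):
    """Resonance peak: centre f0, full width df, height h, baseline c."""
    return h * (df / 2) ** 2 / ((f - f0) ** 2 + (df / 2) ** 2) + c

def estimate_viscosity(freqs, z_mag, density, k_cal=1.0):
    """Fit the resonance; map damping ratio to viscosity (placeholder rule)."""
    p0 = [freqs[np.argmax(z_mag)], 10.0, z_mag.max() - z_mag.min(), z_mag.min()]
    (f0, df, _, _), _ = curve_fit(lorentzian, freqs, z_mag, p0=p0)
    return k_cal * density * (df / f0) ** 2  # illustrative relation only

# Synthetic demo data standing in for a DSP frequency sweep.
rng = np.random.default_rng(0)
f = np.linspace(900.0, 1100.0, 400)
z = lorentzian(f, 1000.0, 25.0, 3.0, 0.5) + rng.normal(0.0, 0.02, f.size)
print(f"estimated viscosity (arb. units): {estimate_viscosity(f, z, 998.0):.4g}")
```

In a real instrument the calibration constant would be fixed against viscosity standards, much as the paper validates its estimates against manufacturer-specified values.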
The equivalence of a human observer and an ideal observer in binary diagnostic tasks
NASA Astrophysics Data System (ADS)
He, Xin; Samuelson, Frank; Gallas, Brandon D.; Sahiner, Berkman; Myers, Kyle
2013-03-01
The Ideal Observer (IO) is "ideal" for given data populations. In the image perception process, as the raw images are degraded by factors such as display and eye optics, there is an equivalent IO (EIO). The EIO uses as its data the statistical information that exits the perceptual/cognitive degradations. We assume a human observer who has received sufficient training, e.g., a radiologist, and hypothesize that such a human observer can be modeled as if he or she were an EIO. To measure the likelihood ratio (LR) distributions of an EIO, we formalize experimental design principles that encourage rationality based on von Neumann and Morgenstern's (vNM) axioms. We present examples to show that many observer study design refinements, although motivated explicitly by empirical principles, implicitly encourage rationality. Our hypothesis is supported by a recent review paper on ROC curve convexity by Pesce, Metz, and Berbaum. We also provide additional evidence based on a collection of observer studies in medical imaging. EIO theory shows that the "sub-optimal" performance of a human observer can be mathematically formalized in the form of an IO, and measured through rationality encouragement.
Quantitative measures for redox signaling.
Pillay, Ché S; Eagling, Beatrice D; Driscoll, Scott R E; Rohwer, Johann M
2016-07-01
Redox signaling is now recognized as an important regulatory mechanism for a number of cellular processes including the antioxidant response, phosphokinase signal transduction and redox metabolism. While there has been considerable progress in identifying the cellular machinery involved in redox signaling, quantitative measures of redox signals have been lacking, limiting efforts aimed at understanding and comparing redox signaling under normoxic and pathogenic conditions. Here we have outlined some of the accepted principles for redox signaling, including the description of hydrogen peroxide as a signaling molecule and the role of kinetics in conferring specificity to these signaling events. Based on these principles, we then develop a working definition for redox signaling and review a number of quantitative methods that have been employed to describe signaling in other systems. Using computational modeling and published data, we show how time- and concentration-dependent analyses, in particular, could be used to quantitatively describe redox signaling and therefore provide important insights into the functional organization of redox networks. Finally, we consider some of the key challenges with implementing these methods. Copyright © 2016 Elsevier Inc. All rights reserved.
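As a minimal sketch of the time- and concentration-dependent analyses described above, assuming an invented two-species kinetic scheme (the rate constants and concentrations are placeholders, not values from the cited work):

```python
# Toy kinetics: an H2O2 signal oxidizes a target protein T (second order),
# while the oxidized form T_ox is recycled back (first order). Integrating
# the ODEs yields the time- and concentration-dependent response curves on
# which a quantitative redox-signaling measure could be built.
import numpy as np
from scipy.integrate import solve_ivp

K_OX = 1e5    # M^-1 s^-1, hypothetical oxidation rate constant
K_RED = 0.05  # s^-1, hypothetical re-reduction rate constant

def rhs(t, y, h2o2):
    t_red, t_ox = y
    ox = K_OX * h2o2 * t_red   # oxidation by the H2O2 signal
    red = K_RED * t_ox         # recycling of the oxidized form
    return [red - ox, ox - red]

t_total = 120.0
for h2o2 in (1e-7, 1e-6, 1e-5):  # stepped signal concentrations, M
    sol = solve_ivp(rhs, (0.0, t_total), [1e-6, 0.0], args=(h2o2,),
                    t_eval=np.linspace(0.0, t_total, 241))
    frac_ox = sol.y[1, -1] / 1e-6
    print(f"[H2O2] = {h2o2:.0e} M -> fraction oxidized at {t_total:.0f} s: {frac_ox:.3f}")
```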
Møller, Cecilie; Højlund, Andreas; Bærentsen, Klaus B; Hansen, Niels Chr; Skewes, Joshua C; Vuust, Peter
2018-05-01
Perception is fundamentally a multisensory experience. The principle of inverse effectiveness (PoIE) states that multisensory gain is maximal when responses to the unisensory constituents of the stimuli are weak. It is one of the basic principles underlying multisensory processing of spatiotemporally corresponding crossmodal stimuli, well established at behavioral as well as neural levels. It is not yet clear, however, how modality-specific stimulus features influence discrimination of subtle changes in a crossmodally corresponding feature belonging to another modality. Here, we tested the hypothesis that reliance on visual cues to pitch discrimination follows the PoIE at the interindividual level (i.e., varies with varying levels of auditory-only pitch discrimination abilities). Using an oddball pitch discrimination task, we measured the effect of varying visually perceived vertical position in participants exhibiting a wide range of pitch discrimination abilities (i.e., musicians and nonmusicians). Visual cues significantly enhanced pitch discrimination as measured by the sensitivity index d', and more so in the crossmodally congruent than incongruent condition. The magnitude of gain caused by compatible visual cues was associated with individual pitch discrimination thresholds, as predicted by the PoIE. This was not the case for the magnitude of the congruence effect, which was unrelated to individual pitch discrimination thresholds, indicating that the pitch-height association is robust to variations in auditory skills. Our findings shed light on individual differences in multisensory processing by suggesting that relevant multisensory information that crucially aids some perceivers' performance may be of less importance to others, depending on their unisensory abilities.
The Spirit of OMERACT: Q Methodology Analysis of Conference Characteristics Valued by Delegates.
Flurey, Caroline A; Kirwan, John R; Hadridge, Phillip; Richards, Pamela; Grosskleg, Shawna; Tugwell, Peter S
2015-10-01
To identify the major features of OMERACT meetings as valued by frequent participants and to explore whether there are groups of participants with different opinions. Using Q methodology (a qualitative and quantitative approach to grouping people according to subjective opinion), participants (who attended more than 1 OMERACT conference) sorted 66 statements relating to the "spirit of OMERACT" according to level of agreement across a normal distribution grid. Data were examined using Q factor analysis. Of 226 potential participants, 105 responded (46%). All participants highly ranked the focus on global standardization of methods, outcome measures, data-driven research, methodological discussion, and international collaboration. Four factors describing the "spirit of OMERACT" were identified: "Evidence not eminence" (n = 31) valued the data- and evidence-driven research above personality and status; "Collaboration and collegiality" (n = 19) valued the international and cross-stakeholder collaboration, interaction, and collegiality; "Equal voices, equal votes, common goals" (n = 12) valued equality in discussion and voting, with everyone striving toward the same goal; "Principles and product, not process" (n = 8) valued the principles of focusing on outcome measures and the product of guiding clinical trials, but were unsure whether the process is necessary to reach this. The factors did not segregate different stakeholder groups. Delegates value different elements of OMERACT, and thus the "spirit of OMERACT" encompasses evidence-based research, collaboration, and equality, although a small group are unsure whether the process is necessary to achieve the end result. Q methodology may prove useful for conference organizers to identify their delegates' different needs to tailor conference content.
Lourenço, Vera; Herdling, Thorsten; Reich, Gabriele; Menezes, José C; Lochmann, Dirk
2011-08-01
A set of 192 fluid bed granulation batches at industrial scale were monitored in-line using microwave resonance technology (MRT) to determine moisture, temperature and density of the granules. Multivariate data analysis techniques such as multiway partial least squares (PLS), multiway principal component analysis (PCA) and multivariate batch control charts were applied to the collected batch data sets. The combination of all these techniques, along with off-line particle size measurements, led to significantly increased process understanding. A seasonality effect was brought to light that impacted further processing through its influence on the final granule size. Moreover, it was demonstrated by means of a PLS that a relation between the particle size and the MRT measurements can be quantitatively defined, highlighting a potential ability of the MRT sensor to predict information about the final granule size. This study has contributed to improving a fluid bed granulation process, and the process knowledge obtained shows that product quality can be built in by process design, following Quality by Design (QbD) and Process Analytical Technology (PAT) principles. Copyright © 2011. Published by Elsevier B.V.
Hill, Ryley; Masui, Kiyoshi W; Scott, Douglas
2018-05-01
Cosmic background (CB) radiation, encompassing the sum of emission from all sources outside our own Milky Way galaxy across the entire electromagnetic spectrum, is a fundamental phenomenon in observational cosmology. Many experiments have been conceived to measure it (or its constituents) since the extragalactic Universe was first discovered; in addition to estimating the bulk (cosmic monopole) spectrum, directional variations have also been detected over a wide range of wavelengths. Here we gather the most recent of these measurements and discuss the current status of our understanding of the CB from radio to γ-ray energies. Using available data in the literature, we piece together the sky-averaged intensity spectrum and discuss the emission processes responsible for what is observed. We examine the effect of perturbations to the continuum spectrum from atomic and molecular line processes and comment on the detectability of these signals. We also discuss how one could, in principle, obtain a complete census of the CB by measuring the full spectrum of each spherical harmonic expansion coefficient. This set of spectra of multipole moments effectively encodes the entire statistical history of nuclear, atomic, and molecular processes in the Universe.
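The proposed census can be written compactly (the notation here is assumed for illustration, not taken from the authors): expanding the sky intensity at each frequency ν in spherical harmonics,

\[
I_\nu(\hat{n}) \;=\; \sum_{\ell=0}^{\infty} \sum_{m=-\ell}^{\ell} a_{\ell m}(\nu)\, Y_{\ell m}(\hat{n}),
\]

so that measuring the full frequency spectrum of every coefficient a_{ℓm}(ν) would, in principle, capture the complete statistical content of the CB.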
NASA Astrophysics Data System (ADS)
Hill, Ryley; Masui, Kiyoshi W.; Scott, Douglas
2018-05-01
The cosmic background (CB) radiation, encompassing the sum of emission from all sources outside our own Milky Way galaxy across the entire electromagnetic spectrum, is a fundamental phenomenon in observational cosmology. Many experiments have been conceived to measure it (or its constituents) since the extragalactic Universe was first discovered; in addition to estimating the bulk (cosmic monopole) spectrum, directional variations have also been detected over a wide range of wavelengths. Here we gather the most recent of these measurements and discuss the current status of our understanding of the CB from radio to γ-ray energies. Using available data in the literature we piece together the sky-averaged intensity spectrum, and discuss the emission processes responsible for what is observed. We examine the effect of perturbations to the continuum spectrum from atomic and molecular line processes and comment on the detectability of these signals. We also discuss how one could in principle obtain a complete census of the CB by measuring the full spectrum of each spherical harmonic expansion coefficient. This set of spectra of multipole moments effectively encodes the entire statistical history of nuclear, atomic and molecular processes in the Universe.
Modelling non-equilibrium thermodynamic systems from the speed-gradient principle.
Khantuleva, Tatiana A; Shalymov, Dmitry S
2017-03-06
The application of the speed-gradient (SG) principle to the non-equilibrium distribution systems far away from thermodynamic equilibrium is investigated. The options for applying the SG principle to describe the non-equilibrium transport processes in real-world environments are discussed. Investigation of a non-equilibrium system's evolution at different scale levels via the SG principle allows for a fresh look at the thermodynamics problems associated with the behaviour of the system entropy. Generalized dynamic equations for finite and infinite number of constraints are proposed. It is shown that the stationary solution to the equations, resulting from the SG principle, entirely coincides with the locally equilibrium distribution function obtained by Zubarev. A new approach to describe time evolution of systems far from equilibrium is proposed based on application of the SG principle at the intermediate scale level of the system's internal structure. The problem of the high-rate shear flow of viscous fluid near the rigid plane plate is discussed. It is shown that the SG principle allows closed mathematical models of non-equilibrium processes to be constructed. This article is part of the themed issue 'Horizons of cybernetical physics'. © 2017 The Author(s).
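For orientation, Fradkov's generic speed-gradient law can be sketched in its standard finite and differential forms (Q is the goal functional, u the control, Γ a positive-definite gain; this is background on the method, not the paper's specific construction):

\[
u = -\Gamma \,\nabla_u \dot{Q}(x,u)
\qquad \text{or} \qquad
\dot{u} = -\Gamma \,\nabla_u \dot{Q}(x,u),
\qquad \Gamma = \Gamma^{\top} > 0 ,
\]

i.e., the control moves against the gradient of the rate of change of the goal functional, driving Q toward its minimum.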
Modelling non-equilibrium thermodynamic systems from the speed-gradient principle
NASA Astrophysics Data System (ADS)
Khantuleva, Tatiana A.; Shalymov, Dmitry S.
2017-03-01
The application of the speed-gradient (SG) principle to the non-equilibrium distribution systems far away from thermodynamic equilibrium is investigated. The options for applying the SG principle to describe the non-equilibrium transport processes in real-world environments are discussed. Investigation of a non-equilibrium system's evolution at different scale levels via the SG principle allows for a fresh look at the thermodynamics problems associated with the behaviour of the system entropy. Generalized dynamic equations for finite and infinite number of constraints are proposed. It is shown that the stationary solution to the equations, resulting from the SG principle, entirely coincides with the locally equilibrium distribution function obtained by Zubarev. A new approach to describe time evolution of systems far from equilibrium is proposed based on application of the SG principle at the intermediate scale level of the system's internal structure. The problem of the high-rate shear flow of viscous fluid near the rigid plane plate is discussed. It is shown that the SG principle allows closed mathematical models of non-equilibrium processes to be constructed. This article is part of the themed issue 'Horizons of cybernetical physics'.
Modelling non-equilibrium thermodynamic systems from the speed-gradient principle
Khantuleva, Tatiana A.
2017-01-01
The application of the speed-gradient (SG) principle to the non-equilibrium distribution systems far away from thermodynamic equilibrium is investigated. The options for applying the SG principle to describe the non-equilibrium transport processes in real-world environments are discussed. Investigation of a non-equilibrium system's evolution at different scale levels via the SG principle allows for a fresh look at the thermodynamics problems associated with the behaviour of the system entropy. Generalized dynamic equations for finite and infinite number of constraints are proposed. It is shown that the stationary solution to the equations, resulting from the SG principle, entirely coincides with the locally equilibrium distribution function obtained by Zubarev. A new approach to describe time evolution of systems far from equilibrium is proposed based on application of the SG principle at the intermediate scale level of the system's internal structure. The problem of the high-rate shear flow of viscous fluid near the rigid plane plate is discussed. It is shown that the SG principle allows closed mathematical models of non-equilibrium processes to be constructed. This article is part of the themed issue ‘Horizons of cybernetical physics’. PMID:28115617
de Lusignan, Simon; Krause, Paul
2010-01-01
There has been much criticism of the NHS national programme for information technology (IT); it has been an expensive programme and some elements appear to have achieved little. The Hayes report was written as an independent review of health and social care IT in England. To identify key principles for health IT implementation which may have relevance beyond the critique of NHS IT. We elicit ten principles from the Hayes report, which, if followed, may result in more effective IT implementation in health care. They divide into patient-centred, subsidiarity and strategic principles. The patient-centred principles are: 1) the patient must be at the centre of all information systems; 2) the provision of patient-level operational data should form the foundation - avoid the dataset mentality; 3) store health data as close to the patient as possible; 4) enable the patient to take a more active role with their health data within a trusted doctor-patient relationship. The subsidiarity principles set out to balance the local and health-system-wide needs: 5) standardise centrally - patients must be able to benefit from interoperability; 6) provide a standard procurement package and an approved process that ensures safety standards and provision of interoperable systems; 7) authorise a range of local suppliers so that health providers can select the system best meeting local needs; 8) allow local migration from legacy systems, as and when improved functionality for patients is available. And finally the strategic principles: 9) evaluate health IT systems in terms of measurable benefits to patients; 10) strategic planning of systems should reflect strategic goals for the health of patients/the population. Had the Hayes principles been embedded within our approach to health IT, and in particular to medical record implementation, we might have avoided many of the costly mistakes with the UK national programme. However, these principles need application within the modern IT environment. Closeness to the patient must not be interpreted as physical but instead as a virtual patient-centred space; data will be secure within the cloud and we should dump the vault and infrastructure mentality. Health IT should be developed as an adaptive ecosystem.
Accumulating Data to Optimally Predict Obesity Treatment (ADOPT) Core Measures: Psychosocial Domain.
Sutin, Angelina R; Boutelle, Kerri; Czajkowski, Susan M; Epel, Elissa S; Green, Paige A; Hunter, Christine M; Rice, Elise L; Williams, David M; Young-Hyman, Deborah; Rothman, Alexander J
2018-04-01
Within the Accumulating Data to Optimally Predict obesity Treatment (ADOPT) Core Measures Project, the psychosocial domain addresses how psychosocial processes underlie the influence of obesity treatment strategies on weight loss and weight maintenance. The subgroup for the psychosocial domain identified an initial list of high-priority constructs and measures that ranged from relatively stable characteristics about the person (cognitive function, personality) to dynamic characteristics that may change over time (motivation, affect). This paper describes (a) how the psychosocial domain fits into the broader model of weight loss and weight maintenance as conceptualized by ADOPT; (b) the guiding principles used to select constructs and measures for recommendation; (c) the high-priority constructs recommended for inclusion; (d) domain-specific issues for advancing the science; and (e) recommendations for future research. The inclusion of similar measures across trials will help to better identify how psychosocial factors mediate and moderate the weight loss and weight maintenance process, facilitate research into dynamic interactions with factors in the other ADOPT domains, and ultimately improve the design and delivery of effective interventions. © 2018 The Obesity Society.
Building healthy communities. Six steps for the board.
Goodspeed, S W
1998-01-01
Many trustees believe that health care reform must begin in the communities that their organizations serve. To become the visionary leaders that health care needs, trustees must reexamine many long-held beliefs and values, adopt 21st-century principles of governance, embrace the concept of a healthy community, and develop a systematic plan for change. Based on the collective knowledge of boards that have successfully led their organizations through change, the plan described here consists of a systematic six-step process. The process begins with techniques for creating awareness of the need to change and ends with techniques for measuring and sustaining gains.
NASA Astrophysics Data System (ADS)
Averin, Dmitri V.; Pekola, Jukka P.
2017-03-01
According to Landauer's principle, erasure of information is the only part of a computation process that unavoidably involves energy dissipation. If done reversibly, such an erasure generates the minimal heat of k_B T ln 2 per erased bit of information. The goal of this work is to discuss the actual reversal of the optimal erasure, which can serve as the basis for the Maxwell's demon operating with ultimate thermodynamic efficiency as dictated by the second law of thermodynamics. The demon extracts k_B T ln 2 of heat from an equilibrium reservoir at temperature T per bit of information obtained about the measured system used by the demon. We have analyzed this Maxwell's demon in the situation when it uses a general quantum system with a discrete spectrum of energy levels as its working body. In the case of the effectively two-level system, which has been realized experimentally based on tunneling of an individual electron in a single-electron box [J.V. Koski et al., PNAS 111, 13786 (2014)], we also studied and minimized corrections to the ideal reversible operation of the demon. These corrections include, in particular, the non-adiabatic terms which are described by a version of the classical fluctuation-dissipation theorem. The overall reversibility of the Maxwell's demon requires, besides the reversibility of the intrinsic working body dynamics, the reversibility of the measurement and feedback processes. The single-electron demon can, in principle, be made fully reversible by developing a thermodynamically reversible single-electron charge detector for measurements of the individual charge states of the single-electron box.
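For a numerical feel of the k_B T ln 2 scale quoted above (a quick arithmetic sketch; the temperatures are arbitrary examples):

```python
# Numerical value of the Landauer bound k_B * T * ln(2) at a few temperatures.
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact, SI 2019)
for T in (0.05, 4.2, 300.0):  # dilution-fridge, liquid-helium, room temperature
    E = k_B * T * math.log(2)
    print(f"T = {T:7.2f} K  ->  k_B T ln 2 = {E:.3e} J per bit")
```

At room temperature this comes to roughly 2.9e-21 J per bit, which sets the scale of the heat exchanged by the demon per bit of information.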
A Conceptual Framework and Principles for Trusted Pervasive Health
Blobel, Bernd Gerhard; Seppälä, Antto Veikko; Sorvari, Hannu Olavi; Nykänen, Pirkko Anneli
2012-01-01
Background Ubiquitous computing technology, sensor networks, wireless communication and the latest developments of the Internet have enabled the rise of a new concept—pervasive health—which takes place in an open, unsecure, and highly dynamic environment (ie, in the information space). To be successful, pervasive health requires implementable principles for privacy and trustworthiness. Objective This research has two interconnected objectives. The first is to define pervasive health as a system and to understand its trust and privacy challenges. The second goal is to build a conceptual model for pervasive health and use it to develop principles and policies which can make pervasive health trustworthy. Methods In this study, a five-step system analysis method is used. Pervasive health is defined using a metaphor of digital bubbles. A conceptual framework model focused on trustworthiness and privacy is then developed for pervasive health. On that model, principles and rules for trusted information management in pervasive health are defined. Results In the first phase of this study, a new definition of pervasive health was created. Using this model, differences between pervasive health and health care are stated. Reviewed publications demonstrate that the widely used principles of predefined and static trust cannot guarantee trustworthiness and privacy in pervasive health. Instead, such an environment requires personal dynamic and context-aware policies, awareness, and transparency. A conceptual framework model focused on information processing in pervasive health is developed. Using features of pervasive health and relations from the framework model, new principles for trusted pervasive health have been developed. The principles propose that personal health data should be under control of the data subject. The person shall have the right to verify the level of trust of any system which collects or processes his or her health information. Principles require that any stakeholder or system collecting or processing health data must support transparency and shall publish its trust and privacy attributes and even its domain specific policies. Conclusions The developed principles enable trustworthiness and guarantee privacy in pervasive health. The implementation of principles requires new infrastructural services such as trust verification and policy conflict resolution. After implementation, the accuracy and usability of principles should be analyzed. PMID:22481297
A conceptual framework and principles for trusted pervasive health.
Ruotsalainen, Pekka Sakari; Blobel, Bernd Gerhard; Seppälä, Antto Veikko; Sorvari, Hannu Olavi; Nykänen, Pirkko Anneli
2012-04-06
Ubiquitous computing technology, sensor networks, wireless communication and the latest developments of the Internet have enabled the rise of a new concept-pervasive health-which takes place in an open, unsecure, and highly dynamic environment (ie, in the information space). To be successful, pervasive health requires implementable principles for privacy and trustworthiness. This research has two interconnected objectives. The first is to define pervasive health as a system and to understand its trust and privacy challenges. The second goal is to build a conceptual model for pervasive health and use it to develop principles and policies which can make pervasive health trustworthy. In this study, a five-step system analysis method is used. Pervasive health is defined using a metaphor of digital bubbles. A conceptual framework model focused on trustworthiness and privacy is then developed for pervasive health. On that model, principles and rules for trusted information management in pervasive health are defined. In the first phase of this study, a new definition of pervasive health was created. Using this model, differences between pervasive health and health care are stated. Reviewed publications demonstrate that the widely used principles of predefined and static trust cannot guarantee trustworthiness and privacy in pervasive health. Instead, such an environment requires personal dynamic and context-aware policies, awareness, and transparency. A conceptual framework model focused on information processing in pervasive health is developed. Using features of pervasive health and relations from the framework model, new principles for trusted pervasive health have been developed. The principles propose that personal health data should be under control of the data subject. The person shall have the right to verify the level of trust of any system which collects or processes his or her health information. Principles require that any stakeholder or system collecting or processing health data must support transparency and shall publish its trust and privacy attributes and even its domain specific policies. The developed principles enable trustworthiness and guarantee privacy in pervasive health. The implementation of principles requires new infrastructural services such as trust verification and policy conflict resolution. After implementation, the accuracy and usability of principles should be analyzed.
Unifying decoherence and the Heisenberg Principle
NASA Astrophysics Data System (ADS)
Janssens, Bas
2017-08-01
We exhibit three inequalities involving quantum measurement, all of which are sharp and state independent. The first inequality bounds the performance of joint measurement. The second quantifies the trade-off between the measurement quality and the disturbance caused on the measured system. Finally, the third inequality provides a sharp lower bound on the amount of decoherence in terms of the measurement quality. This gives a unified description of both the Heisenberg uncertainty principle and the collapse of the wave function.
NASA Astrophysics Data System (ADS)
Mukhopadhyay, Saikat; Bansal, Dipanshu; Delaire, Olivier; Perrodin, Didier; Bourret-Courchesne, Edith; Singh, David J.; Lindsay, Lucas
2017-09-01
Strongly anharmonic phonon properties of CuCl are investigated with inelastic neutron-scattering measurements and first-principles simulations. An unusual quasiparticle spectral peak emerges in the phonon density of states with increasing temperature, in both simulations and measurements, emanating from exceptionally strong coupling between conventional phonon modes. Associated with this strong anharmonicity, the lattice thermal conductivity of CuCl is extremely low and exhibits anomalous, nonmonotonic pressure dependence. We show how this behavior arises from the structure of the phonon dispersions augmenting the phase space available for anharmonic three-phonon scattering processes, and contrast this mechanism with common arguments based on negative Grüneisen parameters. These results demonstrate the importance of considering intrinsic phonon-dispersion structure toward understanding scattering processes and designing new ultralow thermal conductivity materials.
Testing two principles of the Health Action Process Approach in individuals with type 2 diabetes.
Lippke, Sonia; Plotnikoff, Ronald C
2014-01-01
The Health Action Process Approach (HAPA) proposes principles that can be translated into testable hypotheses. This is one of the first studies to have explicitly tested the HAPA's first 2 principles, which are (1) the health behavior change process can be subdivided into motivation and volition, and (2) volition can be grouped into intentional and action stages. The 3 stage groups are labeled preintenders, intenders, and actors. The hypotheses of the HAPA model were investigated in a sample of 1,193 individuals with Type 2 diabetes. Study participants completed a questionnaire assessing the HAPA variables. The hypotheses were evaluated by examining mean differences of test variables and by the use of multigroup structural equation modeling (MSEM). Findings support the HAPA's 2 principles and 3 distinct stages. The 3 HAPA stages were significantly different in several stage-specific variables, and discontinuity patterns were found in terms of nonlinear trends across means. In terms of predicting goals, action planning, and behavior, differences transpired between the 2 motivational stages (preintenders and intenders), and between the 2 volitional stages (intenders and actors). Results indicate implications for supporting behavior change processes depending on the stage a person is in: all individuals should be helped to increase self-efficacy; preintenders and intenders require interventions targeting outcome expectancies; actors benefit from an improvement in action planning to maintain and increase their previous behavior. Overall, the first 2 principles of the HAPA were supported and some evidence for the other principles was found. Future research should experimentally test these conclusions. © 2014 APA, all rights reserved.
Al transmon qubits on silicon-on-insulator for quantum device integration
NASA Astrophysics Data System (ADS)
Keller, Andrew J.; Dieterle, Paul B.; Fang, Michael; Berger, Brett; Fink, Johannes M.; Painter, Oskar
2017-07-01
We present the fabrication and characterization of an aluminum transmon qubit on a silicon-on-insulator substrate. Key to the qubit fabrication is the use of an anhydrous hydrofluoric vapor process which selectively removes the lossy silicon oxide buried underneath the silicon device layer. For a 5.6 GHz qubit measured dispersively by a 7.1 GHz resonator, we find T1 = 3.5 μs and T2* = 2.2 μs. This process in principle permits the co-fabrication of silicon photonic and mechanical elements, providing a route towards chip-scale integration of electro-opto-mechanical transducers for quantum networking of superconducting microwave quantum circuits. The additional processing steps are compatible with established fabrication techniques for aluminum transmon qubits on silicon.
Photosynthetic Energy Transfer at the Quantum/Classical Border.
Keren, Nir; Paltiel, Yossi
2018-06-01
Quantum mechanics diverges from the classical description of our world when very small scales or very fast processes are involved. Unlike classical mechanics, quantum effects cannot be easily related to our everyday experience and are often counterintuitive to us. Nevertheless, the dimensions and time scales of the photosynthetic energy transfer processes put them close to the quantum/classical border, bringing them into the range of measurable quantum effects. Here we review recent advances in the field and suggest that photosynthetic processes can take advantage of the sensitivity of quantum effects to the environmental 'noise' as a means of tuning exciton energy transfer efficiency. If true, this design principle could be a basis for 'nontrivial' coherent wave property nano-devices. Copyright © 2018 Elsevier Ltd. All rights reserved.
Corbie-Smith, Giselle; Bryant, Angela R; Walker, Deborah J; Blumenthal, Connie; Council, Barbara; Courtney, Dana; Adimora, Ada
2015-01-01
In health research, investigators and funders are emphasizing the importance of collaboration between communities and academic institutions to achieve health equity. Although the principles underlying community-academic partnered research have been well-articulated, the processes by which partnerships integrate these principles when working across cultural differences are not as well described. We present how Project GRACE (Growing, Reaching, Advocating for Change and Empowerment) integrated participatory research principles with the process of building individual and partnership capacity. We worked with Vigorous Interventions In Ongoing Natural Settings (VISIONS) Inc., a process consultant and training organization, to develop a capacity building model. We present the conceptual framework and multicultural process of change (MPOC) that was used to build individual and partnership capacity to address health disparities. The process and capacity building model provides a common language, approach, and toolset to understand differences and the dynamics of inequity. These tools can be used by other partnerships in the conduct of research to achieve health equity.
Synchronization in spread spectrum laser radar systems based on PMD-DLL
NASA Astrophysics Data System (ADS)
Buxbaum, Bernd; Schwarte, Rudolf; Ringbeck, Thorsten; Luan, Xuming; Zhang, Zhigang; Xu, Zhanping; Hess, H.
2000-09-01
This paper proposes a new optoelectronic delay-locked loop (OE-DLL) and its use in optical ranging systems. The so-called PMD-DLL receiver module is based on a novel electro-optical modulator (EOM), called the Photonic Mixer Device (PMD). This sensor element is a semiconductor device which combines fast optical sensing and mixing of incoherent light signals in one component by its unique and powerful principle of operation. Integration of a few simple additional on-chip components yields a highly integrated electro-optical correlation unit. Simulations and experimental results have already impressively verified the operating principle of PMD structures, all realized in CMOS technology so far. Although other technologies are also promising candidates for the PMD realization, they are not discussed further in this contribution. The principle of the new DLL approach is discussed in depth in this paper. Theoretical analysis as well as experimental results of a realized PMD-DLL system are presented and assessed. Due to the operating principle of sophisticated PMD devices and their unique features, a correlation process may be realized in order to synchronize a reflected incoherent light wave with an electronic reference signal. The phase shift between the two signals represents the distance to an obstacle and may be determined by means of the synchronization process. This new approach, which avoids critical components needed until now, such as broadband amplifiers and mixers for the detection of small photocurrents in optical distance measurement, offers extremely fast and precise phase determination in ranging applications based on the time-of-flight (TOF) principle. The optical measurement signal may be incoherent, so a laser source is not strictly required. The waveform used for the modulation of the light signal is variable and depends on the demands of each specific application. Even though there are plenty of other alternatives (e.g., heterodyne techniques), in this contribution only so-called quasi-heterodyne techniques, also known as phase-shifting methods, are discussed and used for the implementation. The light modulation schemes described in this contribution are square-wave and pseudo-noise modulation. The latter approach, inspired by its widespread use in communication as well as in position detection (e.g., IS-95 and GPS), offers essential advantages and is the most promising modulation method for the ranging approach. So-called CDMA (code division multiple access) systems form a major topic in communication technology research, since the third-generation mobile phone standard is also partly based on this principle. Fast and reliable synchronization in direct-sequence spread spectrum (DSSS) communication systems hardly differs from the ranging approach already mentioned and is also discussed. The possibility of integrating all components in a monolithic PMD-based DLL design is also presented and discussed. This method might offer the possibility of integrating complete lines or matrices of PMD-based DLLs for highly parallel, multidimensional ranging. Finally, an outlook is given with regard to further optimized PMD front ends, concluding with an estimate of the expected accuracy and speed of the distance measurement.
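The phase-to-distance step at the core of the TOF principle sketched above can be illustrated in a few lines (a minimal sketch; the modulation frequency and measured phase are invented example values):

```python
# Distance from the phase shift of a continuous-wave modulated light signal.
# A round-trip delay appears as a phase shift phi = 2*pi*f_mod*(2d/c),
# so the distance follows as d = c*phi / (4*pi*f_mod).
import math

C = 299_792_458.0  # speed of light, m/s

def distance_from_phase(phi_rad: float, f_mod_hz: float) -> float:
    """Unambiguous only within half the modulation wavelength, c/(2*f_mod)."""
    return C * phi_rad / (4.0 * math.pi * f_mod_hz)

# Example: 20 MHz modulation and a measured phase of 1.0 rad -> about 1.19 m.
print(distance_from_phase(1.0, 20e6))
```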
Does the Relative Strength of Grouping Principles Modulate the Interactions between them?
Montoro, Pedro R; Luna, Dolores
2015-06-05
This study examines the influence of the relative strength of grouping principles on interactions between the intrinsic principle of proximity and the extrinsic principle of common region in the process of perceptual organization. Cooperation and competition between intrinsic and extrinsic principles were examined by presenting the principle either alone or conjoined with another principle. The relative grouping strength of the principles operating alone was varied in two different groups of participants so that it was similar for one group and very different for the other group. Results showed that, when principles acting alone had different strengths, the grouping effect of the strongest principle was similar to that of the cooperation condition, and the effect of the weakest principle was similar to that of competing conjoined principles. In contrast, when the strength of principles acting alone was similar, the effect of conjoined cooperating principles was greater than that of either principle acting alone. Moreover, the effect of conjoined competing principles was smaller than that of either principle operating alone. Results show that cooperation and competition between intrinsic and extrinsic principles are modulated by the relative grouping strength of principles acting alone. Furthermore, performance in these conditions could be predicted on the basis of performance in single-principle conditions.
The Gestalt Principle of Similarity Benefits Visual Working Memory
Peterson, Dwight J.; Berryhill, Marian E.
2013-01-01
Visual working memory (VWM) is essential for many cognitive processes yet it is notably limited in capacity. Visual perception processing is facilitated by Gestalt principles of grouping, such as connectedness, similarity, and proximity. This introduces the question: do these perceptual benefits extend to VWM? If so, can this be an approach to enhance VWM function by optimizing the processing of information? Previous findings demonstrate that several Gestalt principles (connectedness, common region, and spatial proximity) do facilitate VWM performance in change detection tasks (Woodman, Vecera, & Luck, 2003; Xu, 2002a, 2006; Xu & Chun, 2007; Jiang, Olson & Chun, 2000). One prevalent Gestalt principle, similarity, has not been examined with regard to facilitating VWM. Here, we investigated whether grouping by similarity benefits VWM. Experiment 1 established the basic finding that VWM performance could benefit from grouping. Experiment 2 replicated and extended this finding by showing that similarity was only effective when the similar stimuli were proximal. In short, the VWM performance benefit derived from similarity was constrained by spatial proximity such that similar items need to be near each other. Thus, the Gestalt principle of similarity benefits visual perception, but it can provide benefits to VWM as well. PMID:23702981
The Gestalt principle of similarity benefits visual working memory.
Peterson, Dwight J; Berryhill, Marian E
2013-12-01
Visual working memory (VWM) is essential for many cognitive processes, yet it is notably limited in capacity. Visual perception processing is facilitated by Gestalt principles of grouping, such as connectedness, similarity, and proximity. This introduces the question, do these perceptual benefits extend to VWM? If so, can this be an approach to enhance VWM function by optimizing the processing of information? Previous findings have demonstrated that several Gestalt principles (connectedness, common region, and spatial proximity) do facilitate VWM performance in change detection tasks (Jiang, Olson, & Chun, 2000; Woodman, Vecera, & Luck, 2003; Xu, 2002, 2006; Xu & Chun, 2007). However, one prevalent Gestalt principle, similarity, has not been examined with regard to facilitating VWM. Here, we investigated whether grouping by similarity benefits VWM. Experiment 1 established the basic finding that VWM performance could benefit from grouping. Experiment 2 replicated and extended this finding by showing that similarity was only effective when the similar stimuli were proximal. In short, the VWM performance benefit derived from similarity was constrained by spatial proximity, such that similar items need to be near each other. Thus, the Gestalt principle of similarity benefits visual perception, but it can provide benefits to VWM as well.
Ninio, Jacques
2014-01-01
Geometrical illusions are known through a small core of classical illusions that were discovered in the second half of the nineteenth century. Most experimental studies and most theoretical discussions revolve around this core of illusions, as though all other illusions were obvious variants of these. Yet, many illusions, mostly described by German authors at the same time or at the beginning of the twentieth century have been forgotten and are awaiting their rehabilitation. Recently, several new illusions were discovered, mainly by Italian authors, and they do not seem to take place into any current classification. Among the principles that are invoked to explain the illusions, there are principles relating to the metric aspects (contrast, assimilation, shrinkage, expansion, attraction of parallels) principles relating to orientations (regression to right angles, orthogonal expansion) or, more recently, to gestalt effects. Here, metric effects are discussed within a measurement framework, in which the geometric illusions are the outcome of a measurement process. There would be a main "convexity" bias in the measures: the measured value m(x) of an extant x would grow more than proportionally with x. This convexity principle, completed by a principle of compromise for conflicting measures can replace, for a large number of patterns, both the assimilation and the contrast effects. We know from evolutionary theory that the most pertinent classification criteria may not be the most salient ones (e.g., a dolphin is not a fish). In order to obtain an objective classification of illusions, I initiated with Kevin O'Regan systematic work on "orientation profiles" (describing how the strength of an illusion varies with its orientation in the plane). We showed first that the Zöllner illusion already exists at the level of single stacks, and that it does not amount to a rotation of the stacks. Later work suggested that it is best described by an "orthogonal expansion"-an expansion of the stacks applied orthogonally to the oblique segments of the stacks, generating an apparent rotation effect. We showed that the Poggendorff illusion was mainly a misangulation effect. We explained the hierarchy of the illusion magnitudes found among variants of the Poggendorff illusion by the existence of control devices that counteract the loss of parallelism or the loss of collinearity produced by the biased measurements. I then studied the trapezium illusion. The oblique sides, but not the bases, were essential to the trapezium illusion, suggesting the existence of a common component between the trapezium and the Zöllner illusion. Unexpectedly, the trapeziums sometimes appeared as twisted surfaces in 3d. It also appeared impossible, using a nulling procedure, to make all corresponding sides of two trapeziums simultaneously equal. The square-diamond illusion is usually presented with one apex of the diamond pointing toward the square. I found that when the figures were displayed more symmetrically, the illusion was significantly reduced. Furthermore, it is surpassed, for all subjects, by an illusion that goes in the opposite direction, in which the diagonal of a small diamond is underestimated with respect to the side of a larger square. In general, the experimental work generated many unexpected results. Each illusory stimulus was compared to a number of control variants, and often, I measured larger distortions in a variant than in the standard stimulus. 
In the Discussion, I will stress what I think are the main ordering principle in the metric and the orientation domains for illusory patterns. The convexity bias principle and the orthogonal expansion principles help to establish unsuspected links between apparently unrelated stimuli, and reduce their apparently extreme heterogeneity. However, a number of illusions (e.g., those of the twisted cord family, or the Poggendorff illusions) remain unpredicted by the above principles. Finally, I will develop the idea that the brain is constructing several representations, and the one that is commonly used for the purpose of shape perception generates distortions inasmuch as it must satisfy a number of conflicting constraints, such as the constraint of producing a stable shape despite the changing perspectives produced by eye movements.
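As a toy illustration of the convexity bias (the functional form here is an assumption for illustration, not Ninio's): if the measured value grows more than proportionally with the extent, say

\[
m(x) \propto x^{\gamma}, \qquad \gamma > 1,
\]

then for two extents x_1 < x_2 the perceived ratio m(x_2)/m(x_1) = (x_2/x_1)^γ exceeds the true ratio x_2/x_1, so larger extents are overweighted, which is exactly the expansion-type distortion the principle is meant to capture.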
Reproducibility of the cutoff probe for the measurement of electron density
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, D. W.; Oh, W. Y.; You, S. J., E-mail: sjyou@cnu.ac.kr
2016-06-15
Since plasma processing control based on plasma diagnostics has attracted considerable attention in industry, the reproducibility of the diagnostics used in this application has become of great interest. Because the cutoff probe is one of the potential candidates for this application, knowing the reproducibility of the cutoff probe measurement becomes quite important in cutoff probe application research. To test the reproducibility of the cutoff probe measurement, in this paper, a comparative study among different cutoff probe measurements was performed. The comparative study revealed a remarkable result: the cutoff probe has great reproducibility for the electron density measurement, i.e., there is little difference among measurements by different probes made by different experimenters. The reason for this result is discussed in this paper using the basic measurement principle of the cutoff probe and a comparative experiment with a Langmuir probe.
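The measurement principle invoked above is the standard plasma-frequency cutoff relation: the frequency f_c at which the probe's transmission spectrum shows a cutoff is identified with the electron plasma frequency, giving (in SI units)

\[
f_c \approx f_{pe} = \frac{1}{2\pi}\sqrt{\frac{n_e e^2}{\varepsilon_0 m_e}}
\quad\Longrightarrow\quad
n_e = \frac{4\pi^2 \varepsilon_0 m_e}{e^2}\, f_c^{2},
\]

so the electron density follows directly from a single frequency reading, which is one reason the method is so reproducible across probes and operators.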
Principles of the Organization of the Global Economic System
ERIC Educational Resources Information Center
Dyatlov, Sergey A.; Bulavko, Olga A.; Balanovskaya, Anna V.; Nikitina, Natalia V.; Chudaeva, Alexandra A.
2016-01-01
The development of the economic system is not a spontaneous process but a programmed and controlled one. The economy is always a controlled system in which there is an appropriate subject of management. The article considers principles of the organization of the global economic system. The characteristic of the principle of "hierarchy of…
Developing Principles of Physical Education Teacher Education Practice through Self-Study
ERIC Educational Resources Information Center
Fletcher, Tim
2016-01-01
Background: The articulation of specific principles of teacher education practice allows teacher educators to make explicit the beliefs, values, and actions that shape their practice. Engaging in processes to articulate the principles that guide practice is beneficial not only for teacher educators and their colleagues but also for students. There…
77 FR 13098 - Multistakeholder Process To Develop Consumer Data Privacy Codes of Conduct
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-05
... Rights, which is a set of principles the Administration believes should govern the handling of personal... Commission's jurisdiction. Enforceable codes of conduct based on the principles set forth in the Consumer... businesses greater certainty about how agreed-upon privacy principles apply to them. Companies will build...
The duty of care: an update. Current legal principles.
Fullbrook, Suzanne
2005-02-01
The author had a twofold purpose in writing this article: firstly to inform readers of the current legal principles that inform the duty of care, and secondly to discuss these principles with reference to two recent cases. Practitioners giving consideration to these issues will be assisted in their professional decision-making processes.
Integrating UDL Principles and Practices into the Online Course Development Process: A Delphi Study
ERIC Educational Resources Information Center
Singleton, Korey J.
2017-01-01
The literature shows that both faculty and students hold favorable opinions about UDL principles and practices, and that students benefit from such practices when implemented in the higher education classroom. Despite this, faculty members remain resistant to implementing UDL principles and practices. Few studies have examined the barriers impacting…
Differential pricing of drugs: a role for cost-effectiveness analysis?
Lopert, Ruth; Lang, Danielle L; Hill, Suzanne R; Henry, David A
2002-06-15
Internationally, the high costs of pharmaceutical products limit access to treatment. The principle of differential pricing is that drug prices should vary according to some measure of affordability. How differential prices should be determined is, however, unclear. Here we describe a method whereby differential prices for essential drugs could be derived in countries of variable national wealth, and, using angiotensin-converting enzyme inhibitors, provide an example of how the process might work. Indicative prices for drugs can be derived by cost-effectiveness analysis that incorporates a measure of national wealth. Such prices could be used internationally as a basis for differential price negotiations.
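A minimal sketch of how an indicative price might be derived from a wealth-indexed cost-effectiveness threshold (the function, the threshold rule and all numbers are my illustrative assumptions, not the authors' published method):

    def indicative_price(qalys_gained: float,
                         other_costs: float,
                         gni_per_capita: float,
                         threshold_multiplier: float = 1.0) -> float:
        """Highest drug price at which the incremental cost per QALY
        stays within a willingness-to-pay threshold indexed to wealth."""
        willingness_to_pay = threshold_multiplier * gni_per_capita  # per QALY
        return willingness_to_pay * qalys_gained - other_costs

    # Same drug, two countries of different national wealth (hypothetical):
    print(indicative_price(qalys_gained=0.2, other_costs=50.0, gni_per_capita=40000))
    print(indicative_price(qalys_gained=0.2, other_costs=50.0, gni_per_capita=2000))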
NASA Astrophysics Data System (ADS)
Laubscher, Markus; Bourquin, Stéphane; Froehly, Luc; Karamata, Boris; Lasser, Theo
2004-07-01
Current spectroscopic optical coherence tomography (OCT) methods rely on a posteriori numerical calculation. We present an experimental alternative for accessing spectroscopic information in OCT without post-processing, based on wavelength de-multiplexing and parallel detection using a diffraction grating and a smart pixel detector array. Both a conventional A-scan with high axial resolution and the spectrally resolved measurement are acquired simultaneously. A proof-of-principle demonstration is given on a dynamically changing absorbing sample. The method's potential for fast spectroscopic OCT imaging is discussed. The spectral measurements obtained with this approach are insensitive to scan non-linearities or sample movements.
Photonic-based liquid level transmitter using Mach-Zehnder interferometer for industrial application
NASA Astrophysics Data System (ADS)
Singh, Yadvendra; Raghuwanshi, Sanjeev K.; Kumar, Manish
2018-02-01
At present, process control industries mainly use the 1-5 V or 4-20 mA protocol for transmitting the measured signal to remote operators. These protocols are prone to interference and offer a limited data transfer rate. To overcome these limitations, we propose a photonic-based transmitter for liquid level measurement that enhances the data transfer rate and reduces interference, eliminating noise in the channel during transmission and making transmission more reliable, accurate and consistent in performance. The required mathematical derivation and the principle of operation of the transmitter are presented in the paper.
A Method for Assessing the Accuracy of a Photogrammetry System for Precision Deployable Structures
NASA Technical Reports Server (NTRS)
Moore, Ashley
2005-01-01
The measurement techniques used to validate analytical models of large deployable structures are an integral part of the technology development process and must be precise and accurate. Photogrammetry and videogrammetry are viable, accurate, and unobtrusive methods for measuring such large structures. Photogrammetry uses software to determine the three-dimensional position of a target from camera images. Videogrammetry is based on the same principle, except that a series of timed images is analyzed. This work addresses the accuracy of a digital photogrammetry system used for measurement of large, deployable space structures at JPL. First, photogrammetry tests are performed on a precision space truss test article, and the images are processed using Photomodeler software. The accuracy of the Photomodeler results is determined through comparison with measurements of the test article taken by an external testing group using the VSTARS photogrammetry system. These two measurements are then compared with results from Australis photogrammetry software, which simulates a measurement test to predict its accuracy. The software is then used to study how particular factors, such as camera resolution and placement, affect system accuracy, to help design the setup for the videogrammetry system that will offer the highest level of accuracy for measurement of deploying structures.
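One simple way to express such a cross-system comparison is the RMS discrepancy between corresponding 3D target coordinates reported by two systems (a generic sketch, not JPL's actual procedure; it assumes both coordinate sets are already expressed in a common frame):

    import numpy as np

    def rms_discrepancy(coords_a: np.ndarray, coords_b: np.ndarray) -> float:
        """RMS of point-to-point distances between two (N, 3) arrays of
        corresponding target coordinates measured by two systems."""
        d = np.linalg.norm(coords_a - coords_b, axis=1)
        return float(np.sqrt(np.mean(d**2)))

    # Hypothetical targets: one system's coordinates vs. the other's, in mm.
    a = np.array([[0.0, 0.0, 0.0], [100.0, 0.0, 0.0], [0.0, 100.0, 50.0]])
    b = a + np.random.default_rng(0).normal(scale=0.05, size=a.shape)
    print(f"RMS discrepancy: {rms_discrepancy(a, b):.3f} mm")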
Framework for the quality assurance of 'omics technologies considering GLP requirements.
Kauffmann, Hans-Martin; Kamp, Hennicke; Fuchs, Regine; Chorley, Brian N; Deferme, Lize; Ebbels, Timothy; Hackermüller, Jörg; Perdichizzi, Stefania; Poole, Alan; Sauer, Ursula G; Tollefsen, Knut E; Tralau, Tewes; Yauk, Carole; van Ravenzwaay, Ben
2017-12-01
'Omics technologies are gaining importance to support regulatory toxicity studies. Prerequisites for performing 'omics studies considering GLP principles were discussed at the European Centre for Ecotoxicology and Toxicology of Chemicals (ECETOC) Workshop Applying 'omics technologies in Chemical Risk Assessment. A GLP environment comprises a standard operating procedure system, proper pre-planning and documentation, and inspections of independent quality assurance staff. To prevent uncontrolled data changes, the raw data obtained in the respective 'omics data recording systems have to be specifically defined. Further requirements include transparent and reproducible data processing steps, and safe data storage and archiving procedures. The software for data recording and processing should be validated, and data changes should be traceable or disabled. GLP-compliant quality assurance of 'omics technologies appears feasible for many GLP requirements. However, challenges include (i) defining, storing, and archiving the raw data; (ii) transparent descriptions of data processing steps; (iii) software validation; and (iv) ensuring complete reproducibility of final results with respect to raw data. Nevertheless, 'omics studies can be supported by quality measures (e.g., GLP principles) to ensure quality control, reproducibility and traceability of experiments. This enables regulators to use 'omics data in a fit-for-purpose context, which enhances their applicability for risk assessment. Copyright © 2017 Elsevier Inc. All rights reserved.
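As one concrete illustration of making raw data tamper-evident (my example; the paper does not prescribe a specific mechanism), a cryptographic checksum can be recorded at archiving time and re-verified later:

    import hashlib
    from pathlib import Path

    def sha256_of(path: Path) -> str:
        """Hex SHA-256 digest of a raw data file, computed in chunks."""
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def verify(path: Path, recorded_digest: str) -> bool:
        """True if the file still matches the digest recorded at archiving."""
        return sha256_of(path) == recorded_digest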
ERIC Educational Resources Information Center
Setari, Anthony Philip
2016-01-01
The purpose of this study was to construct a holistic education school evaluation tool using Montessori Erdkinder principles, and begin the validation process of examining the proposed tool. This study addresses a vital need in the holistic education community for a school evaluation tool. The tool construction process included using Erdkinder…
ERIC Educational Resources Information Center
Cheng, Ming-Chang; Chou, Pei-I; Wang, Ya-Ting; Lin, Chih-Ho
2015-01-01
This study investigates how the illustrations in a science textbook, with their design modified according to cognitive process principles, affected students' learning performance. The quasi-experimental design recruited two Grade 5 groups (N = 58) as the research participants. The treatment group (n = 30) used the modified version of the textbook,…
ERIC Educational Resources Information Center
Villacañas de Castro, Luis S.
2016-01-01
This article aims to apply Stenhouse's process model of curriculum to foreign language (FL) education, a model characterized by enacting "principles of procedure" that are specific to the discipline to which the school subject belongs. Rather than replace or dissolve current approaches to FL teaching and curriculum…
Uncertainty estimation of the self-thinning process by Maximum-Entropy Principle
Shoufan Fang; George Z. Gertner
2000-01-01
When available information is scarce, the Maximum-Entropy Principle can estimate the distributions of parameters. In our case study, we estimated the distributions of the parameters of the forest self-thinning process based on literature information, and we derived the conditional distribution functions and estimated the 95 percent confidence interval (CI) of the self-...
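A minimal generic sketch of the Maximum-Entropy step (a discrete distribution constrained only by a known mean, not the authors' forestry-specific setup):

    import numpy as np
    from scipy.optimize import minimize

    x = np.linspace(0.0, 10.0, 21)   # support of the parameter (illustrative)
    target_mean = 3.0                # the only information assumed known

    def neg_entropy(p):
        p = np.clip(p, 1e-12, None)
        return float(np.sum(p * np.log(p)))

    cons = ({"type": "eq", "fun": lambda p: np.sum(p) - 1.0},
            {"type": "eq", "fun": lambda p: np.dot(p, x) - target_mean})
    p0 = np.full_like(x, 1.0 / x.size)
    res = minimize(neg_entropy, p0, constraints=cons, bounds=[(0.0, 1.0)] * x.size)
    p = res.x  # maximum-entropy distribution given the mean constraint
    print(f"mean = {np.dot(p, x):.3f}")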
ERIC Educational Resources Information Center
LaFleur, Elizabeth K.; Babin, Laurie A.; Lopez, Tara Burnthorne
2009-01-01
This article describes the process one marketing faculty followed to demonstrate assurance of learning for marketing students and presents longitudinal results associated with a course-embedded direct assessment device in the Principles of Marketing course. The process follows closely the Association to Advance Collegiate Schools of Business…
Assessing Students' Ability to Trace Matter in Dynamic Systems in Cell Biology
ERIC Educational Resources Information Center
Wilson, Christopher D.; Anderson, Charles W.; Heidemann, Merle; Merrill, John E.; Merritt, Brett W.; Richmond, Gail; Sibley, Duncan F.; Parker, Joyce M.
2006-01-01
College-level biology courses contain many complex processes that are often taught and learned as detailed narratives. These processes can be better understood by perceiving them as dynamic systems that are governed by common fundamental principles. Conservation of matter is such a principle, and thus tracing matter is an essential step in…
ERIC Educational Resources Information Center
Paulson, Eric J.
2005-01-01
This theoretical article examines reading processes using chaos theory as an analogy. Three principles of chaos theory are identified and discussed, then related to reading processes as revealed through eye movement research. Used as an analogy, the chaos theory principle of sensitive dependence contributes to understanding the difficulty in…
ERIC Educational Resources Information Center
Habecker, Eugene B.
A brief historical review of the student disciplinary process in private colleges and universities, as well as a discussion of current practices and principles of student discipline, provide background for discussion of future possibilities. The analysis of current practices and principles includes a brief theoretical discussion about the legal…
ERIC Educational Resources Information Center
Kies, Cosette N.
A brief overview of the functions of public relations in libraries introduces this manual, which provides an explanation of the public relations (PR) process, including fact-finding, planning, communicating, evaluating, and marketing; some PR principles; a 10-step program that could serve as a model for planning a PR program; a discussion of PR…
ERIC Educational Resources Information Center
Burton, Laura J.; Mazerolle, Stephanie M.
2011-01-01
Context: Instrument validation is an important facet of survey research methods and athletic trainers must be aware of the important underlying principles. Objective: To discuss the process of survey development and validation, specifically the process of construct validation. Background: Athletic training researchers frequently employ the use of…
Ethical Standards of the American Association for Counseling and Development.
ERIC Educational Resources Information Center
Journal of Counseling and Development, 1988
1988-01-01
Presents principles that define the ethical behavior of American Association for Counseling and Development members. In addition to 11 general principles, includes principles on the counseling relationship, measurement and evaluation, research and publication, consulting, private practice, personnel administration, and preparation standards. (ABL)
Munakata, Y; McClelland, J L; Johnson, M H; Siegler, R S
1997-10-01
Infants seem sensitive to hidden objects in habituation tasks at 3.5 months but fail to retrieve hidden objects until 8 months. The authors first consider principle-based accounts of these successes and failures, in which early successes imply knowledge of principles and failures are attributed to ancillary deficits. One account is that infants younger than 8 months have the object permanence principle but lack means-ends abilities. To test this, 7-month-olds were trained on means-ends behaviors and were tested on retrieval of visible and occluded toys. Means-ends demands were the same, yet infants made more toy-guided retrievals in the visible case. The authors offer an adaptive process account in which knowledge is graded and embedded in specific behavioral processes. Simulation models that learn gradually to represent occluded objects show how this approach can account for success and failure in object permanence tasks without assuming principles and ancillary deficits.
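A deliberately toy rendering of the graded-knowledge idea, far simpler than the authors' connectionist simulations (the learning curve, thresholds and rate below are invented for illustration): a single representation strength builds gradually with experience and must clear a higher bar to drive reaching than to drive looking.

    def representation_strength(trials: int, rate: float = 0.05) -> float:
        """Gradually learned strength of an occluded-object representation,
        saturating at 1.0 (toy learning curve, not the authors' model)."""
        return 1.0 - (1.0 - rate) ** trials

    LOOKING_THRESHOLD = 0.3   # toy assumption: weak traces suffice for looking
    REACHING_THRESHOLD = 0.8  # toy assumption: reaching needs a stronger trace

    for trials in (10, 30, 60):
        s = representation_strength(trials)
        print(trials, round(s, 2), s > LOOKING_THRESHOLD, s > REACHING_THRESHOLD)

With these numbers, looking-based success appears well before reaching-based success, mirroring the dissociation without invoking principles plus ancillary deficits.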
Looby, Mairead; Ibarra, Neysi; Pierce, James J; Buckley, Kevin; O'Donovan, Eimear; Heenan, Mary; Moran, Enda; Farid, Suzanne S; Baganz, Frank
2011-01-01
This study describes the application of quality by design (QbD) principles to the development and implementation of a major manufacturing process improvement for a commercially distributed therapeutic protein produced in Chinese hamster ovary cell culture. The intent of this article is to focus on QbD concepts, and provide guidance and understanding on how the various components combine together to deliver a robust process in keeping with the principles of QbD. A fed-batch production culture and a virus inactivation step are described as representative examples of upstream and downstream unit operations that were characterized. A systematic approach incorporating QbD principles was applied to both unit operations, involving risk assessment of potential process failure points, small-scale model qualification, design and execution of experiments, definition of operating parameter ranges and process validation acceptance criteria followed by manufacturing-scale implementation and process validation. Statistical experimental designs were applied to the execution of process characterization studies evaluating the impact of operating parameters on product quality attributes and process performance parameters. Data from process characterization experiments were used to define the proven acceptable range and classification of operating parameters for each unit operation. Analysis of variance and Monte Carlo simulation methods were used to assess the appropriateness of process design spaces. Successful implementation and validation of the process in the manufacturing facility and the subsequent manufacture of hundreds of batches of this therapeutic protein verifies the approaches taken as a suitable model for the development, scale-up and operation of any biopharmaceutical manufacturing process. Copyright © 2011 American Institute of Chemical Engineers (AIChE).
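A minimal sketch of the Monte Carlo assessment of a design space (the response model, parameter names, ranges and specification limits are invented for illustration, not the published process):

    import numpy as np

    rng = np.random.default_rng(42)
    N = 100_000

    # Operating parameters sampled across hypothetical proven acceptable ranges.
    temp = rng.uniform(35.0, 39.0, N)   # culture temperature, degC
    ph = rng.uniform(6.8, 7.2, N)       # culture pH

    # Hypothetical fitted response model for a product quality attribute.
    attribute = 50.0 + 2.0 * (temp - 37.0) - 8.0 * (ph - 7.0) + rng.normal(0, 1.0, N)

    # Fraction of simulated batches meeting the acceptance criterion.
    in_spec = np.mean((attribute >= 45.0) & (attribute <= 55.0))
    print(f"P(attribute in spec) = {in_spec:.3f}")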
Thermal Remote Sensing and the Thermodynamics of Ecosystems Development
NASA Technical Reports Server (NTRS)
Luvall, Jeffrey C.; Kay, James J.; Fraser, Roydon F.; Goodman, H. Michael (Technical Monitor)
2001-01-01
Thermal remote sensing can provide environmental measuring tools with capabilities for measuring ecosystem development and integrity. Recent advances in applying principles of nonequilibrium thermodynamics to ecology provide fundamental insights into energy partitioning in ecosystems. Ecosystems are nonequilibrium systems, open to material and energy flows, which grow and develop structures and processes to increase energy degradation. More developed terrestrial ecosystems will be more effective at dissipating the solar gradient (degrading its energy content). This can be measured by the effective surface temperature of the ecosystem on a landscape scale. A series of airborne thermal infrared multispectral scanner data were collected from several forested ecosystems, ranging from a western US Douglas-fir forest to a tropical rain forest in Costa Rica. Agricultural systems were also measured. These data were used to develop measures of ecosystem development and integrity based on surface temperature.
Computational principles of working memory in sentence comprehension.
Lewis, Richard L; Vasishth, Shravan; Van Dyke, Julie A
2006-10-01
Understanding a sentence requires a working memory of the partial products of comprehension, so that linguistic relations between temporally distal parts of the sentence can be rapidly computed. We describe an emerging theoretical framework for this working memory system that incorporates several independently motivated principles of memory: a sharply limited attentional focus, rapid retrieval of item (but not order) information subject to interference from similar items, and activation decay (forgetting over time). A computational model embodying these principles provides an explanation of the functional capacities and severe limitations of human processing, as well as accounts of reading times. The broad implication is that the detailed nature of cross-linguistic sentence processing emerges from the interaction of general principles of human memory with the specialized task of language comprehension.
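The listed memory principles map naturally onto ACT-R-style activation equations; a minimal sketch (the constants are illustrative, not the paper's fitted values):

    import math

    def base_activation(retrieval_times: list[float], now: float, d: float = 0.5) -> float:
        """Base-level activation with power-law decay over the times since
        each past retrieval: A = ln(sum (now - t_j)^-d)."""
        return math.log(sum((now - t) ** -d for t in retrieval_times))

    def retrieval_latency(activation: float, f: float = 0.14) -> float:
        """Retrieval time falls exponentially with activation: T = F * exp(-A)."""
        return f * math.exp(-activation)

    # An item retrieved recently and often is retrieved faster than a decayed one.
    recent = base_activation([1.0, 2.0, 2.5], now=3.0)
    distal = base_activation([1.0], now=3.0)
    print(retrieval_latency(recent), retrieval_latency(distal))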
Principles of recruitment and retention in clinical trials.
Aitken, Leanne; Gallagher, Robyn; Madronio, Christine
2003-12-01
Efficient and effective recruitment and retention of participants is the largest single component of the study workload and forms an essential component in the conduct of clinical trials. In this paper, we present five principles to guide the processes of both recruitment and retention. These principles include the selection of an appropriate population to adequately answer the research question, followed by the establishment of a sampling process that accurately represents that population. Creation of systematic and effective recruitment mechanisms should be supported by implementation of follow-up mechanisms that promote participant retention. Finally, all activities related to recruitment and retention must be conducted within the framework of ethics and privacy regulations. Adherence to these principles will assist the researcher in achieving the goals of the study within the available resources.
The geometry of three-dimensional measurement from paired coplanar x-ray images.
Baumrind, S; Moffitt, F H; Curry, S
1983-10-01
This article outlines the geometric principles which underlie the process of making craniofacial measurements in three dimensions by combining information from pairs of coplanar x-ray images. The main focus is upon the rationale of the method rather than upon the computational details. We stress particularly the importance of having available accurate measurements as to the relative positions of the x-ray tubes and the film plane. The use of control arrays of radiopaque "points" whose projected images upon the film plane allow the retrospective calculation of the spatial relationship between the x-ray tubes and the film plane is explained. Finally, the question of correcting for movement of the subject between two films of an image pair is considered briefly.
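A minimal sketch of the triangulation at the heart of the method: a least-squares intersection of the rays from each x-ray source through the corresponding film-plane image point (geometry values are hypothetical, and the article's correction for subject movement is omitted):

    import numpy as np

    def triangulate(sources: np.ndarray, images: np.ndarray) -> np.ndarray:
        """Least-squares 3D point closest to the rays from each x-ray source
        through the corresponding projected point on the film plane.
        sources, images: (K, 3) arrays; returns the (3,) landmark estimate."""
        a = np.zeros((3, 3))
        b = np.zeros(3)
        for s, p in zip(sources, images):
            u = (p - s) / np.linalg.norm(p - s)   # unit ray direction
            proj = np.eye(3) - np.outer(u, u)     # projector off the ray
            a += proj
            b += proj @ s
        return np.linalg.solve(a, b)

    # Two tube positions and the projections of one landmark onto the film z=0:
    sources = np.array([[-200.0, 0.0, 1000.0], [200.0, 0.0, 1000.0]])
    images = np.array([[200.0, 20.0, 0.0], [-200.0, 20.0, 0.0]])
    print(triangulate(sources, images))  # recovers approximately (0, 10, 500)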
A New Multifunctional Sensor for Measuring Concentrations of Ternary Solution
NASA Astrophysics Data System (ADS)
Wei, Guo; Shida, Katsunori
This paper presents a multifunctional sensor with a novel structure, capable of directly sensing temperature and two physical parameters of solutions, namely ultrasonic velocity and conductivity. By combined measurement of these three parameters, the concentrations of the various components in a ternary solution can be simultaneously determined. The structure and operating principle of the sensor are described, and a regression algorithm based on natural cubic spline interpolation and the least squares method is adopted to estimate the concentrations. The performance of the proposed sensor is experimentally tested using a ternary aqueous solution of sodium chloride and sucrose, which is widely encountered in the food and beverage industries. This sensor could prove valuable as a process control sensor in industrial settings.
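A minimal sketch of the calibration-and-estimation idea (a plain linear least-squares map is used here for brevity, where the paper uses natural cubic spline interpolation; all numbers are hypothetical):

    import numpy as np

    # Hypothetical calibration set: (ultrasonic velocity m/s, conductivity S/m, temp degC).
    measured = np.array([[1500.0, 1.2, 20.0],
                         [1510.0, 2.1, 20.0],
                         [1525.0, 1.3, 25.0],
                         [1540.0, 2.3, 25.0]])
    # Known (NaCl %, sucrose %) concentrations for each calibration sample.
    conc = np.array([[0.5, 2.0], [1.5, 2.0], [0.5, 5.0], [1.5, 5.0]])

    # Fit a linear map with intercept from measurements to concentrations.
    design = np.hstack([measured, np.ones((len(measured), 1))])
    coeffs, *_ = np.linalg.lstsq(design, conc, rcond=None)

    # Estimate concentrations for a new reading (velocity, conductivity, temp, 1).
    new = np.array([1520.0, 1.8, 22.0, 1.0])
    print(new @ coeffs)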
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sterczewski, L. A., E-mail: lukasz.sterczewski@pwr.edu.pl; Grzelczak, M. P.; Plinski, E. F.
In this paper, we present an electronic circuit used to bias a photoconductive antenna that generates terahertz radiation. The working principles and the design process for the device are discussed in detail. Noise and waveform measurements for a constructed device are considered, and their impact on the terahertz pulse and its spectrum is examined. The proposed implementation is simple to build, robust, and offers a real improvement to THz instrumentation due to its frequency tuning. Additionally, it provides galvanic isolation and ESD protection.
Laser anemometry for hot flows
NASA Astrophysics Data System (ADS)
Kugler, P.; Langer, G.
1987-07-01
The fundamental principles, instrumentation, and practical operation of LDA and laser-transit-anemometry systems for measuring velocity profiles and the degree of turbulence in high-temperature flows are reviewed and illustrated with diagrams, drawings and graphs of typical data. Consideration is given to counter, tracker, spectrum-analyzer and correlation methods of LDA signal processing; multichannel analyzer and cross correlation methods for LTA data; LTA results for a small liquid fuel rocket motor; and experiments demonstrating the feasibility of an optoacoustic demodulation scheme for LDA signals from unsteady flows.
Boyle, Cynthia J.; Janke, Kristin K.
2013-01-01
Objective. To assist administrators and faculty members in colleges and schools of pharmacy by gathering expert opinion to frame, direct, and support investments in student leadership development. Methods. Twenty-six leadership instructors participated in a 3-round, online, modified Delphi process to define doctor of pharmacy (PharmD) student leadership instruction. Round 1 asked open-ended questions about leadership knowledge, skills, and attitudes to begin the generation of student leadership development guiding principles and competencies. Statements were identified as guiding principles when they were perceived as foundational to the instructional approach. Round 2 grouped responses for agreement rating and comment. Group consensus with a statement as a guiding principle was set prospectively at 80%. Round 3 allowed rating and comment on guidelines, modified from feedback in round 2, that did not meet consensus. The principles were verified by identifying common contemporary leadership development approaches in the literature. Results. Twelve guiding principles, related to concepts of leadership and educational philosophy, were defined and could be linked to contemporary leadership development thought. These guiding principles describe the motivation for teaching leadership, the fundamental precepts of student leadership development, and the core tenets for leadership instruction. Conclusions. Expert opinion gathered using a Delphi process resulted in guiding principles that help to address many of the fundamental questions that arise when implementing or refining leadership curricula. The principles identified are supported by common contemporary leadership development thought. PMID:24371345
Traynor, Andrew P; Boyle, Cynthia J; Janke, Kristin K
2013-12-16
To assist administrators and faculty members in colleges and schools of pharmacy by gathering expert opinion to frame, direct, and support investments in student leadership development. Twenty-six leadership instructors participated in a 3-round, online, modified Delphi process to define doctor of pharmacy (PharmD) student leadership instruction. Round 1 asked open-ended questions about leadership knowledge, skills, and attitudes to begin the generation of student leadership development guiding principles and competencies. Statements were identified as guiding principles when they were perceived as foundational to the instructional approach. Round 2 grouped responses for agreement rating and comment. Group consensus with a statement as a guiding principle was set prospectively at 80%. Round 3 allowed rating and comment on guidelines, modified from feedback in round 2, that did not meet consensus. The principles were verified by identifying common contemporary leadership development approaches in the literature. Twelve guiding principles, related to concepts of leadership and educational philosophy, were defined and could be linked to contemporary leadership development thought. These guiding principles describe the motivation for teaching leadership, the fundamental precepts of student leadership development, and the core tenets for leadership instruction. Expert opinion gathered using a Delphi process resulted in guiding principles that help to address many of the fundamental questions that arise when implementing or refining leadership curricula. The principles identified are supported by common contemporary leadership development thought.
76 FR 16345 - Net Worth and Equity Ratio
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-23
... acquisition must be measured under generally accepted accounting principles as referenced in the Act. 12 U.S.C... equity or member interest in the acquirer. Generally accepted accounting principles require this excess... generally accepted accounting principles. For low income-designated credit unions, net worth also includes...
[Improving a hospital's supply chain through lean management].
Aguilar-Escobar, V G; Garrido-Vega, P; Godino-Gallego, N
2013-01-01
Supply management is an area where hospitals have significant opportunities for improvement. The main objective of this paper is to analyze how the application of Lean principles can improve logistics costs and user satisfaction. In connection with satisfaction, it also aims to examine which aspects of the service define it, and to check for differences between different groups of users. The results of an experience of reorganizing the hospital logistics system based on Lean principles have been studied. This is therefore a case study, which combines different methods of data collection. The logistics cost calculation was carried out using the full costing method. To measure the satisfaction of healthcare personnel, the internal users of the logistics service, an anonymous survey was conducted. Processing of the survey data included exploratory analysis, factor analysis and ANOVAs. The data showed an improvement in logistics management after the implementation of Lean principles: logistics costs were reduced and the satisfaction of internal users with the new logistics system increased. Some differences in the degree of satisfaction between different groups of users were also detected, although users did not seem to distinguish between different aspects of the logistics service. The experience analyzed shows the applicability and suitability of Lean principles for improving logistics operating costs and increasing user satisfaction. Copyright © 2012 SECA. Published by Elsevier Espana. All rights reserved.
Ignition and combustion characteristics of metallized propellants
NASA Technical Reports Server (NTRS)
Mueller, D. C.; Turns, Stephen R.
1991-01-01
Over the past six months, experimental investigations were continued and theoretical work on the secondary atomization process was begun. Final shakedown of the sizing/velocity measuring system was completed, and the aluminum combustion detection system was modified and tested. Atomizer operation was improved to allow steady-state operation over long periods of time for several slurries. To validate the theoretical modeling, work involving carbon slurry atomization and combustion was begun and qualitative observations were made. Simultaneous measurements of aluminum slurry droplet size distributions and detection of burning aluminum particles were performed at several axial locations above the burner. The principal theoretical effort was the application of a rigid shell formation model to aluminum slurries and an investigation of the effects of various parameters on the shell formation process. This shell formation model was extended to include the processes leading up to droplet disruption, and previously developed analytical models were applied to yield theoretical aluminum agglomerate ignition and combustion times. These theoretical times were compared with the experimental results.
Quantifying Reinforcement Value and Demand for Psychoactive Substances in Humans
Heinz, Adrienne J.; Lilje, Todd C.; Kassel, Jon D.; de Wit, Harriet
2013-01-01
Behavioral economics is an emerging cross-disciplinary field that is providing an exciting new contextual framework for researchers to study addictive processes. New initiatives to study addiction under a behavioral economic rubric have yielded variable terminology and differing methods and theoretical approaches that are consistent with the multidimensional nature of addiction. The present article is intended to provide an integrative overview of the behavioral economic nomenclature and to describe relevant theoretical models, principles and concepts. Additionally, we present measures derived from behavioral economic theories that quantify demand for substances and assess decision making processes surrounding substance use. The sensitivity of these measures to different contextual elements (e.g., drug use status, acute drug effects, deprivation) is also addressed. The review concludes with discussion of the validity of these approaches and their potential for clinical application and highlights areas that warrant further research. Overall, behavioral economics offers a compelling framework to help explicate complex addictive processes and it is likely to provide a translational platform for clinical intervention. PMID:23062106
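A demand equation commonly used in this literature to quantify demand for substances is Hursh and Silberberg's exponential model; a minimal sketch (the parameter values are illustrative):

    import math

    def log10_consumption(cost: float, q0: float, alpha: float, k: float = 2.0) -> float:
        """Hursh & Silberberg exponential demand:
        log10 Q = log10 Q0 + k * (exp(-alpha * Q0 * cost) - 1).
        q0: consumption at zero cost; alpha: rate of decline (elasticity)."""
        return math.log10(q0) + k * (math.exp(-alpha * q0 * cost) - 1.0)

    # Consumption falls with unit price; larger alpha means more elastic demand.
    for price in (0.0, 1.0, 5.0, 10.0):
        print(price, round(10 ** log10_consumption(price, q0=10.0, alpha=0.05), 2))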
Large deviations and mixing for dissipative PDEs with unbounded random kicks
NASA Astrophysics Data System (ADS)
Jakšić, V.; Nersesyan, V.; Pillet, C.-A.; Shirikyan, A.
2018-02-01
We study the problem of exponential mixing and large deviations for discrete-time Markov processes associated with a class of random dynamical systems. Under some dissipativity and regularisation hypotheses for the underlying deterministic dynamics and a non-degeneracy condition for the driving random force, we discuss the existence and uniqueness of a stationary measure and its exponential stability in the Kantorovich-Wasserstein metric. We next turn to the large deviations principle (LDP) and establish its validity for the occupation measures of the Markov processes in question. The proof is based on Kifer's criterion for non-compact spaces, a result on large-time asymptotics for generalised Markov semigroups, and a coupling argument. These tools combined constitute a new approach to the LDP for infinite-dimensional processes without the strong Feller property in a non-compact space. The results obtained can be applied to the two-dimensional Navier-Stokes system in a bounded domain and to the complex Ginzburg-Landau equation.
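For orientation, an LDP for occupation measures asserts, schematically (suppressing the topology and the separate upper and lower bounds),

    L_N = \frac{1}{N}\sum_{k=1}^{N} \delta_{X_k}, \qquad
    \mathbb{P}\bigl(L_N \in \Gamma\bigr) \asymp \exp\Bigl(-N \inf_{\mu \in \Gamma} I(\mu)\Bigr),

where the rate function I is non-negative and vanishes exactly at the stationary measure, so deviations of the empirical measure from it are exponentially unlikely.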
Principle of minimal work fluctuations.
Xiao, Gaoyang; Gong, Jiangbin
2015-08-01
Understanding and manipulating work fluctuations in microscale and nanoscale systems are of both fundamental and practical interest. For example, in considering the Jarzynski equality ⟨e^(-βW)⟩ = e^(-βΔF), a change in the fluctuations of e^(-βW) may impact how rapidly the statistical average of e^(-βW) converges towards the theoretical value e^(-βΔF), where W is the work, β is the inverse temperature, and ΔF is the free energy difference between two equilibrium states. Motivated by our previous study aiming at the suppression of work fluctuations, here we obtain a principle of minimal work fluctuations. In brief, adiabatic processes as treated in quantum and classical adiabatic theorems yield the minimal fluctuations in e^(-βW). In the quantum domain, if a system initially prepared at thermal equilibrium is subjected to a work protocol but isolated from a bath during the time evolution, then a quantum adiabatic process without energy level crossing (or an assisted adiabatic process reaching the same final states as in a conventional adiabatic process) yields the minimal fluctuations in e^(-βW), where W is the quantum work defined by two energy measurements at the beginning and at the end of the process. In the classical domain, where the classical work protocol is realizable by an adiabatic process, the classical adiabatic process also yields the minimal fluctuations in e^(-βW). Numerical experiments based on a Landau-Zener process confirm our theory in the quantum domain, and our theory in the classical domain explains our previous numerical findings regarding the suppression of classical work fluctuations [G. Y. Xiao and J. B. Gong, Phys. Rev. E 90, 052132 (2014)].
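A quick numerical illustration of the convergence issue mentioned above (a Gaussian work distribution, for which the Jarzynski equality fixes ΔF = μ - βσ²/2 exactly; all numbers illustrative):

    import numpy as np

    rng = np.random.default_rng(1)
    beta, mu = 1.0, 2.0

    for sigma in (0.5, 2.0):
        delta_f = mu - beta * sigma**2 / 2.0       # exact for Gaussian work
        w = rng.normal(mu, sigma, 10_000)
        estimate = -np.log(np.mean(np.exp(-beta * w))) / beta
        print(f"sigma={sigma}: estimate={estimate:.3f}, exact={delta_f:.3f}")

The larger-fluctuation case converges visibly more slowly for the same sample size, which is the practical motivation for suppressing fluctuations in e^(-βW).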
Ethical principles in health research and review process.
Tangwa, Godfrey B
2009-11-01
In this paper I want to reflect on the fundamental ethical principles and their application in different particular contexts, especially in health research and the ethics review process. Four fundamental ethical principles have been identified and widely discussed in the bioethics literature. These principles are: autonomy or respect for others, beneficence, non-maleficence and justice. These principles have cross-cultural validity, relevance and applicability. Every real-life situation and every concrete particular case in which ethical decision-making is called for is unique and different from all others; but the same fundamental ethical principles are relevant and used in addressing all such cases and situations. Very often ethical problems will present themselves in the form of dilemmas, and it is then necessary to use the same fundamental principles to analyze the situations and to argue persuasively, cogently and competently for the best options or choices in such situations. The issues I will be dealing with in this paper are necessarily abstract and theoretical, but we will be discussing them from a very practical viewpoint and impulse, with a view to application in concrete real-life situations. The paper ends with some practical sample cases that the reader can use to test his/her grasp of the principles, how to apply them, how to balance them in differing situations and contexts, and how to adjudicate between them when they seem to be in conflict.
Pethica, Brian A
2007-12-21
As indicated by Gibbs and made explicit by Guggenheim, the electrical potential difference between two regions of different chemical composition cannot be measured. The Gibbs-Guggenheim Principle restricts the use of classical electrostatics in electrochemical theories as thermodynamically unsound, with some few approximate exceptions, notably for dilute electrolyte solutions and concomitant low potentials where the linear limit for the exponential of the relevant Boltzmann distribution applies. The Principle invalidates the widespread use of forms of the Poisson-Boltzmann equation which do not include the non-electrostatic components of the chemical potentials of the ions. From a thermodynamic analysis of the parallel plate electrical condenser, employing only measurable electrical quantities and taking into account the chemical potentials of the components of the dielectric and their adsorption at the surfaces of the condenser plates, an experimental procedure to provide exceptions to the Principle has been proposed. This procedure is now reconsidered and rejected. No other related experimental procedures circumvent the Principle. Widely used theoretical descriptions of electrolyte solutions, charged surfaces and colloid dispersions which neglect the Principle are briefly discussed. MD methods avoid the limitations of the Poisson-Boltzmann equation. Theoretical models which include the non-electrostatic components of the inter-ion and ion-surface interactions in solutions and colloid systems assume the additivity of dispersion and electrostatic forces. An experimental procedure to test this assumption is identified from the thermodynamics of condensers at microscopic plate separations. The available experimental data from Kelvin probe studies are preliminary, but tend against additivity. A corollary to the Gibbs-Guggenheim Principle is enunciated, and the Principle is restated: that for any charged species, neither the difference in electrostatic potential nor the sum of the differences in the non-electrostatic components of the thermodynamic potential difference between regions of different chemical compositions can be measured.
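The dilute-solution exception mentioned above is the Debye-Hückel linear limit; schematically (my summary, in SI-style notation), when z_i eψ ≪ k_B T the Poisson-Boltzmann equation reduces to

    \nabla^{2}\psi = \kappa^{2}\psi, \qquad
    \kappa^{2} = \frac{e^{2}\sum_{i} n_{i}^{0} z_{i}^{2}}{\varepsilon k_{B} T},

which is the low-potential regime in which, per the abstract, classical electrostatic treatments remain approximately sound.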
Ultrasonic flow measurements for irrigation process monitoring
NASA Astrophysics Data System (ADS)
Ziani, Elmostafa; Bennouna, Mustapha; Boissier, Raymond
2004-02-01
This paper presents the state of the art of the general principle of liquid flow measurement by ultrasonic methods, and the problems of flow measurement. We present an ultrasonic flowmeter designed according to the smart sensor concept, for the measurement of irrigation water flowing through pipelines or open channels, using the ultrasonic transit-time approach. The new flowmeter works on the principle of measuring the time delay difference between sound pulses transmitted upstream and downstream in the flowing liquid. The speed of sound in the flowing medium is eliminated as a variable because the flowrate calculations are based on the reciprocals of the transmission times. The transit time difference is digitally measured by means of suitable, microprocessor-controlled logic. This type of ultrasonic flowmeter is well suited to industry and water management; it is studied in detail in this work, followed by some experimental results. For pressurized channels, we use one pair of ultrasonic transducers arranged in proper positions and orientations on the pipe; in this case, to determine the liquid velocity, a real-time on-line analysis taking into account the geometry of the hydraulic system is applied to the ultrasonic data. In open channels, we use a single pair or two pairs of ultrasonic emitter-receivers, according to the desired performance. Finally, the goals of this work are to integrate the smart sensor into irrigation system monitoring in order to evaluate potential advantages and demonstrate performance, and to understand and use the ultrasonic approach for determining flow characteristics and improving flow measurements by reducing errors caused by disturbances of the flow profiles.
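A minimal sketch of the reciprocal transit-time computation described above (the geometry values are illustrative):

    import math

    def flow_velocity(t_down: float, t_up: float, path_len: float, angle_deg: float) -> float:
        """Axial flow velocity from downstream/upstream transit times:
        v = L / (2 cos(theta)) * (1/t_down - 1/t_up).
        Using reciprocals, the speed of sound cancels exactly."""
        theta = math.radians(angle_deg)
        return path_len / (2.0 * math.cos(theta)) * (1.0 / t_down - 1.0 / t_up)

    # Consistency check: water (c = 1480 m/s), v = 2 m/s, L = 0.2 m, 45 deg path.
    c, v, L, th = 1480.0, 2.0, 0.2, math.radians(45.0)
    t_down = L / (c + v * math.cos(th))
    t_up = L / (c - v * math.cos(th))
    print(flow_velocity(t_down, t_up, 0.2, 45.0))  # recovers ~2.0 m/s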
Arnold, Katrin; Scheibe, Madlen; Müller, Olaf; Schmitt, Jochen
2016-11-01
The limited number of telemedicine applications transferred to standard medical care in Germany may to some extent be explained by deficits in current evaluation practice. Effectiveness and cost-effectiveness can only be demonstrated to decision makers and potential users with methodologically sound and fully published evaluations. There is a lack of well-founded and mandatory standards for adequate, comparable evaluations of telemedicine applications. As part of the project CCS Telehealth Eastern Saxony (CCS THOS), a systematic review on evaluation concepts for telemedicine applications (search period until September 2014; databases Medline, Embase, HTA Database, DARE, NHS EED) as well as an additional selective literature search were conducted. Suggestions for evaluation fundamentals were derived from the results. These suggestions were subjected to a formal consensus process (nominal group process) with relevant stakeholder groups (healthcare payers, healthcare providers, health policy representatives, researchers). 19 papers were included in the systematic review. In accordance with the predefined inclusion criteria, each presented an evaluation concept for telemedicine applications that was based upon a systematic review and/or a consensus process. Via the formal consensus process, the suggestions for evaluation principles derived from the review and the selective literature search (23 papers) resulted in ten agreed evaluation principles. Eight of them were agreed unanimously; two were arrived at with one abstention each. The principles comprise criteria for the planning, conduct and reporting of telemedicine evaluations. Adherence to them is obligatory for users of the telemedicine infrastructure provided by CCS THOS. Furthermore, from the beginning the intention was for these principles to be taken up by other projects and initiatives. The agreed evaluation principles for telemedicine applications are the first in Germany to be based both upon evidence and consensus. Due to the methodology of their development, they have strong scientific and health policy legitimation. Therefore, and because of their general applicability, adherence to these principles is recommended beyond the context of the telemedicine platform developed within CCS THOS, namely throughout the German telemedicine scene. Copyright © 2016. Published by Elsevier GmbH.
How can we identify and communicate the ecological value of deep-sea ecosystem services?
Jobstvogt, Niels; Townsend, Michael; Witte, Ursula; Hanley, Nick
2014-01-01
Submarine canyons are considered biodiversity hotspots which have been identified for their important roles in connecting the deep sea with shallower waters. To date, a huge gap exists between the high importance that scientists associate with deep-sea ecosystem services and the communication of this knowledge to decision makers and to the wider public, who remain largely ignorant of the importance of these services. The connectivity and complexity of marine ecosystems makes knowledge transfer very challenging, and new communication tools are necessary to increase understanding of ecological values beyond the science community. We show how the Ecosystem Principles Approach, a method that explains the importance of ocean processes via easily understandable ecological principles, might overcome this challenge for deep-sea ecosystem services. Scientists were asked to help develop a list of clear and concise ecosystem principles for the functioning of submarine canyons through a Delphi process to facilitate future transfers of ecological knowledge. These ecosystem principles describe ecosystem processes, link such processes to ecosystem services, and provide spatial and temporal information on the connectivity between deep and shallow waters. They also elucidate unique characteristics of submarine canyons. Our Ecosystem Principles Approach was successful in integrating ecological information into the ecosystem services assessment process. It therefore has a high potential to be the next step towards a wider implementation of ecological values in marine planning. We believe that successful communication of ecological knowledge is the key to a wider public support for ocean conservation, and that this endeavour has to be driven by scientists in their own interest as major deep-sea stakeholders.