On the diagnostic emulation technique and its use in the AIRLAB
NASA Technical Reports Server (NTRS)
Migneault, Gerard E.
1988-01-01
An aid is presented for understanding and judging the relevance of the diagnostic emulation technique to studies of highly reliable, digital computing systems for aircraft. A short review is presented of the need for and the use of the technique as well as an explanation of its principles of operation and implementation. Details that would be needed for operational control or modification of existing versions of the technique are not described.
Principles of Dataset Versioning: Exploring the Recreation/Storage Tradeoff
Bhattacherjee, Souvik; Chavan, Amit; Huang, Silu; Deshpande, Amol; Parameswaran, Aditya
2015-01-01
The relative ease of collaborative data science and analysis has led to a proliferation of thousands or millions of versions of the same datasets in many scientific and commercial domains, acquired or constructed at various stages of data analysis across many users, and often over long periods of time. Managing, storing, and recreating these dataset versions is a non-trivial task. The fundamental challenge here is the storage-recreation trade-off: the more storage we use, the faster it is to recreate or retrieve versions, while the less storage we use, the slower recreation or retrieval becomes. Despite the fundamental nature of this problem, there has been surprisingly little work on it. In this paper, we study this trade-off in a principled manner: we formulate six problems under various settings, trading off these quantities in various ways, demonstrate that most of the problems are intractable, and propose a suite of inexpensive heuristics, drawing on techniques from the delay-constrained scheduling and spanning tree literature, to solve these problems. We have built a prototype version management system that aims to serve as a foundation for our DataHub system for facilitating collaborative data science. We demonstrate, via extensive experiments, that our proposed heuristics provide efficient solutions in practical dataset versioning scenarios. PMID:28752014
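The storage-recreation trade-off described above can be made concrete with a small sketch (our illustration, not the paper's system or its heuristics): versions form a chain or tree, each stored either as a full snapshot or as a delta against its parent, and materializing more snapshots trades storage for faster recreation.

```python
# Illustrative sketch of the storage-recreation trade-off: each version is stored
# either as a full snapshot or as a delta against its parent. More snapshots cost
# more storage but shorten the delta chains that must be replayed on recreation.

def plan_costs(versions, materialized):
    """versions: dict id -> (parent, full_size, delta_size); materialized: ids
    stored as full snapshots. Returns (total_storage, total_recreation_cost),
    where recreating a version walks its delta chain back to a snapshot."""
    storage = 0
    for vid, (parent, full, delta) in versions.items():
        storage += full if vid in materialized else delta
    recreation = 0
    for vid in versions:
        cost, cur = 0, vid
        while cur not in materialized:
            parent, full, delta = versions[cur]
            cost += delta          # applying one delta during recreation
            cur = parent
        recreation += cost
    return storage, recreation

# A linear chain v0 <- v1 <- v2 <- v3: each full copy is 100 units, each delta 10.
chain = {"v0": (None, 100, 100), "v1": ("v0", 100, 10),
         "v2": ("v1", 100, 10), "v3": ("v2", 100, 10)}
print(plan_costs(chain, {"v0"}))          # minimal storage, longest delta chains
print(plan_costs(chain, {"v0", "v2"}))    # one extra snapshot shortens the chains
```

In the second plan, materializing v2 cuts total recreation cost from 60 to 20 units at the price of one extra snapshot; the paper's six problem formulations constrain or optimize exactly these two quantities.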
Unit: Science and Safety, Inspection Set, National Trial Print.
ERIC Educational Resources Information Center
Australian Science Education Project, Toorak, Victoria.
This unit, a trial version prepared by the Australian Science Education Project, is intended to create in students an awareness of the potential hazards of a science room, to help build confidence by teaching safe techniques of apparatus manipulation, and to demonstrate the utility of planning work thoroughly. The safety principles are extended to…
Technology Allows Engineers to Make Solid Objects from Computer Designs.
ERIC Educational Resources Information Center
Wheeler, David L.
1992-01-01
Computer operators using the technique of three-dimensional printing or rapid prototyping may soon be able to sculpt an object on the screen and within minutes, have a paper, plastic, or ceramic version of the object in hand. The process uses the principle that physical objects can be created in layers. (MSE)
Morrison, Geoffrey Stewart
2014-05-01
In this paper it is argued that one should not attempt to directly assess whether a forensic analysis technique is scientifically acceptable. Rather one should first specify what one considers to be appropriate principles governing acceptable practice, then consider any particular approach in light of those principles. This paper focuses on one principle: the validity and reliability of an approach should be empirically tested under conditions reflecting those of the case under investigation using test data drawn from the relevant population. Versions of this principle have been key elements in several reports on forensic science, including forensic voice comparison, published over the last four-and-a-half decades. The aural-spectrographic approach to forensic voice comparison (also known as "voiceprint" or "voicegram" examination) and the currently widely practiced auditory-acoustic-phonetic approach are considered in light of this principle (these two approaches do not appear to be mutually exclusive). Approaches based on data, quantitative measurements, and statistical models are also considered in light of this principle. © 2013.
Radically questioning the principle of the least restrictive alternative: a reply to Nir Eyal
Saghai, Yashar
2014-01-01
In his insightful editorial, Nir Eyal explores the connections between nudging and shaming. One upshot of his argument is that we should question the principle of the least restrictive alternative in public health and health policy. In this commentary, I maintain that Eyal’s argument undermines only a rather implausible version of the principle of the least restrictive alternative and I sketch two reasons for rejecting the mainstream and more plausible version of this principle. PMID:25396212
Is Memory Search Governed by Universal Principles or Idiosyncratic Strategies?
Healey, M. Karl; Kahana, Michael J.
2013-01-01
Laboratory paradigms have provided an empirical foundation for much of psychological science. Some have argued, however, that such paradigms are highly susceptible to idiosyncratic strategies and that rather than reflecting fundamental cognitive principles, many findings are artifacts of averaging across participants who employ different strategies. We develop a set of techniques to rigorously test the extent to which average data are distorted by such strategy differences and apply these techniques to free recall data from the Penn Electrophysiology of Encoding and Retrieval Study (PEERS). Recall initiation showed evidence of subgroups: the majority of participants initiate recall from the last item in the list, but one subgroup showed elevated initiation probabilities for items 2–4 back from the end of the list and another showed elevated probabilities for the beginning of the list. By contrast, serial position curves and temporal and semantic clustering functions were remarkably consistent, with almost every participant exhibiting a recognizable version of the average function, suggesting that these functions reflect fundamental principles of the memory system. The approach taken here can serve as a model for evaluating the extent to which other laboratory paradigms are influenced by individual differences in strategy use. PMID:23957279
The precautionary principle is incoherent.
Peterson, Martin
2006-06-01
This article argues that no version of the precautionary principle can be reasonably applied to decisions that may lead to fatal outcomes. In support of this strong claim, a number of desiderata are proposed, which reasonable rules for rational decision making ought to satisfy. Thereafter, two impossibility theorems are proved, showing that no version of the precautionary principle can satisfy the proposed desiderata. These theorems are directly applicable to recent discussions of the precautionary principle in medicine, biotechnology, environmental management, and related fields. The impossibility theorems do not imply, however, that the precautionary principle is of no relevance at all in policy discussions. Even if it is not a reasonable rule for rational decision making, it is possible to interpret the precautionary principle in other ways, e.g., as an argumentative tool or as an epistemic principle favoring a reversed burden of proof.
Reynoso, G. A.; March, A. D.; Berra, C. M.; Strobietto, R. P.; Barani, M.; Iubatti, M.; Chiaradio, M. P.; Serebrisky, D.; Kahn, A.; Vaccarezza, O. A.; Leguiza, J. L.; Ceitlin, M.; Luna, D. A.; Bernaldo de Quirós, F. G.; Otegui, M. I.; Puga, M. C.; Vallejos, M.
2000-01-01
This presentation features linguistic and terminology management issues related to the development of the Spanish version of the Systematized Nomenclature of Medicine (SNOMED). It aims at describing the aspects of translating, and the difficulties encountered in delivering, a natural and consistent medical nomenclature. Bunge's three-layered model is referenced to analyze the sequence of symbolic concept representations. It further explains how a communicative translation based on a concept-to-concept approach was used to achieve the highest degree of fidelity and naturalness in the Spanish rendition of SNOMED. Translation procedures and techniques are described and exemplified. Both computer-aided and human translation methods are portrayed. The scientific and translation team tasks are detailed, with a focus on Newmark's four-level principle for the translation process, extended with a fifth level relevant to the ontology to control the consistency of the typology of concepts. Finally, the adoption of a common methodology for developing non-English versions of SNOMED is suggested. PMID:11079973
A portable instrument for measuring emissivities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perinic, G.; Schulz, K.; Scherber, W.
1995-12-01
The quality control of surface emissivities is an important aspect of the manufacturing of cryopumps and other cryogenic equipment. It is particularly important in fusion reactor applications, where standard coating techniques cannot be applied to the cryocondensation panels and the thermal shielding baffles. The paper describes the working principle of a table-top instrument developed by Dornier for measuring the mean emissivity in the spectral range 0.6-40 μm at ambient temperature, and the further development of the instrument into a portable version that can be used for on-site measurements.
The method of 'principlism': a critique of the critique.
Lustig, B A
1992-10-01
Several scholars have recently criticized the dominant emphasis upon mid-level principles in bioethics, best exemplified by Beauchamp and Childress's Principles of Biomedical Ethics. In Part I of this essay, I assess the fairness and cogency of three broad criticisms raised against 'principlism' as an approach: (1) that principlism, as an exercise in applied ethics, is insufficiently attentive to the dialectical relations between ethical theory and moral practice; (2) that principlism fails to offer a systematic account of the principles of non-maleficence, beneficence, respect for autonomy, and justice; and (3) that principlism, as a version of moral pluralism, is fatally flawed by its theoretical agnosticism. While acknowledging that Beauchamp and Childress's reliance upon Ross's version of intuitionism is problematic, I conclude that the critics of principlism have failed to make a compelling case against its theoretical or practical adequacy as an ethical approach. In Part II, I assess the moral theory developed by Bernard Gert in Morality: A New Justification of the Moral Rules, because Gert has recommended his approach as a systematic alternative to principlism. I judge Gert's theory to be seriously incomplete and, in contrast to principlism, unable to generate coherent conclusions about cases of active euthanasia and paternalism.
Synthetic Biology Open Language (SBOL) Version 2.2.0.
Cox, Robert Sidney; Madsen, Curtis; McLaughlin, James Alastair; Nguyen, Tramy; Roehner, Nicholas; Bartley, Bryan; Beal, Jacob; Bissell, Michael; Choi, Kiri; Clancy, Kevin; Grünberg, Raik; Macklin, Chris; Misirli, Goksel; Oberortner, Ernst; Pocock, Matthew; Samineni, Meher; Zhang, Michael; Zhang, Zhen; Zundel, Zach; Gennari, John H; Myers, Chris; Sauro, Herbert; Wipat, Anil
2018-04-02
Synthetic biology builds upon the techniques and successes of genetics, molecular biology, and metabolic engineering by applying engineering principles to the design of biological systems. The field still faces substantial challenges, including long development times, high rates of failure, and poor reproducibility. One method to ameliorate these problems would be to improve the exchange of information about designed systems between laboratories. The Synthetic Biology Open Language (SBOL) has been developed as a standard to support the specification and exchange of biological design information in synthetic biology, filling a need not satisfied by other pre-existing standards. This document details version 2.2.0 of SBOL, which builds upon version 2.1.0 published in last year's JIB special issue. In particular, SBOL 2.2.0 includes improved description and validation rules for genetic design provenance, an extension to support combinatorial genetic designs, a new class to add non-SBOL data as attachments, a new class for genetic design implementations, and a description of a methodology to describe the entire design-build-test-learn cycle within the SBOL data model.
Synthetic Biology Open Language (SBOL) Version 2.1.0.
Beal, Jacob; Cox, Robert Sidney; Grünberg, Raik; McLaughlin, James; Nguyen, Tramy; Bartley, Bryan; Bissell, Michael; Choi, Kiri; Clancy, Kevin; Macklin, Chris; Madsen, Curtis; Misirli, Goksel; Oberortner, Ernst; Pocock, Matthew; Roehner, Nicholas; Samineni, Meher; Zhang, Michael; Zhang, Zhen; Zundel, Zach; Gennari, John H; Myers, Chris; Sauro, Herbert; Wipat, Anil
2016-09-01
Synthetic biology builds upon the techniques and successes of genetics, molecular biology, and metabolic engineering by applying engineering principles to the design of biological systems. The field still faces substantial challenges, including long development times, high rates of failure, and poor reproducibility. One method to ameliorate these problems would be to improve the exchange of information about designed systems between laboratories. The Synthetic Biology Open Language (SBOL) has been developed as a standard to support the specification and exchange of biological design information in synthetic biology, filling a need not satisfied by other pre-existing standards. This document details version 2.1 of SBOL that builds upon version 2.0 published in last year's JIB special issue. In particular, SBOL 2.1 includes improved rules for what constitutes a valid SBOL document, new role fields to simplify the expression of sequence features and how components are used in context, and new best practices descriptions to improve the exchange of basic sequence topology information and the description of genetic design provenance, as well as miscellaneous other minor improvements.
Department of Defense Information Enterprise Architecture Version 1.2
2010-05-07
… mission. Principles express an organization's intentions so that design and investment decisions can be made from a common basis of understanding. … Business rules are definitive statements that constrain operations to implement the principle and associated policies. The vision, principles, and …
Polar Wavelet Transform and the Associated Uncertainty Principles
NASA Astrophysics Data System (ADS)
Shah, Firdous A.; Tantary, Azhar Y.
2018-06-01
The polar wavelet transform, a generalized form of the classical wavelet transform, has been extensively used in science and engineering for finding directional representations of signals in higher dimensions. The aim of this paper is to establish new uncertainty principles associated with the polar wavelet transforms in L2(R2). Firstly, we study some basic properties of the polar wavelet transform and then derive the associated generalized version of the Heisenberg-Pauli-Weyl inequality. Finally, following the idea of Beckner (Proc. Amer. Math. Soc. 123, 1897-1905, 1995), we derive the logarithmic version of the uncertainty principle for the polar wavelet transforms in L2(R2).
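For orientation, the classical Heisenberg-Pauli-Weyl inequality in L2(R2), of which the paper derives a polar-wavelet analogue, has the schematic form (the constant C depends on the Fourier normalization; the authors' exact statement and measure may differ):

```latex
\left( \int_{\mathbb{R}^2} |x|^2\,|f(x)|^2\,dx \right)^{1/2}
\left( \int_{\mathbb{R}^2} |\xi|^2\,|\hat{f}(\xi)|^2\,d\xi \right)^{1/2}
\;\ge\; C\,\|f\|_{2}^{2}.
```

The generalized version replaces the first factor by the spread of the polar wavelet transform of f in its translation variable, integrated against the transform's natural measure.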
Robotic radical cystectomy and intracorporeal urinary diversion: The USC technique.
Abreu, Andre Luis de Castro; Chopra, Sameer; Azhar, Raed A; Berger, Andre K; Miranda, Gus; Cai, Jie; Gill, Inderbir S; Aron, Monish; Desai, Mihir M
2014-07-01
Radical cystectomy is the gold-standard treatment for muscle-invasive and refractory non-muscle-invasive bladder cancer. We describe our technique for robotic radical cystectomy (RRC) and intracorporeal urinary diversion (ICUD), which replicates open surgical principles, and present our preliminary results. Specific descriptions of preoperative planning, surgical technique, and postoperative care are provided. Demographic, perioperative, and 30-day complication data were collected prospectively and analyzed retrospectively. Learning curve trends were analyzed separately for ileal conduits (IC) and neobladders (NB). SAS® Software Version 9.3 was used for statistical analyses, with statistical significance set at P < 0.05. Between July 2010 and September 2013, RRC and lymph node dissection with ICUD were performed in 103 consecutive patients (orthotopic NB = 46, IC = 57). All procedures were completed robotically, replicating the open surgical principles. The learning curve trends showed a significant reduction in hospital stay for both IC (11 vs. 6 days, P < 0.01) and orthotopic NB (13 vs. 7.5 days, P < 0.01) when comparing the first third of the cohort with the rest of the group. Overall median (range) operative time and estimated blood loss were 7 h (4.8-13) and 200 mL (50-1200), respectively. Within 30 days postoperatively, complications occurred in 61 (59%) patients, the majority being low grade (n = 43), and no patient died. Median (range) node yield was 36 (0-106), and 4 (3.9%) specimens had positive surgical margins. Robotic radical cystectomy with totally ICUD is safe and feasible. It can be performed using established open surgical principles with encouraging perioperative outcomes.
How you ask is what you get: Framing effects in willingness-to-pay for a QALY.
Ahlert, Marlies; Breyer, Friedrich; Schwettmann, Lars
2016-02-01
In decisions on financing new and innovative health care technologies, a central question is how to determine the value citizens place on the gains in health and life expectancy that result from the respective medical treatments. We report results of surveys of four representative samples of the German population. In 2010 and 2012, in total about 5000 respondents were asked for their willingness-to-pay (WTP) for either an extension of their life or an improvement in their health corresponding to a gain of one quality-adjusted life year (QALY). Specific changes in the study design allow for ceteris paribus comparisons of different survey versions. While the initial version exactly copied a questionnaire used in the EuroVaQ (European Value of a QALY) project, which was conducted in nine European countries and Palestine, but not in Germany, in other versions the wording and the survey technique were modified. The findings show that the technique of posing the questions plays an important role when respondents are asked to imagine being in hypothetical situations. This refers clearly to the wording of the questions and the survey setting (personal or online interview). But even simple design elements, such as putting a yes/no filter in front, greatly affect the answers in terms of both the frequency of zero WTP and the distribution of positive amounts. From these different results, we conclude that it is essential to conduct studies comprising a broad variety of versions when trying to elicit WTP for a specific type of QALY, in order to obtain an array of values combined with insights into the principles of their sensitivity. Copyright © 2015 Elsevier Ltd. All rights reserved.
Geuens, Jonas; Swinnen, Thijs Willem; Westhovens, Rene; de Vlam, Kurt; Geurts, Luc; Vanden Abeele, Vero
2016-10-13
Chronic arthritis (CA), an umbrella term for inflammatory rheumatic and other musculoskeletal diseases, is highly prevalent. Effective disease-modifying antirheumatic drugs for CA are available, with the exception of osteoarthritis, but require a long-term commitment of patients to comply with the medication regimen and management program as well as a tight follow-up by the treating physician and health professionals. Additionally, patients are advised to participate in physical exercise programs. Adherence to exercises and physical activity programs is often very low. Patients would benefit from support to increase medication compliance as well as compliance to the physical exercise programs. To address these shortcomings, health apps for CA patients have been created. These mobile apps assist patients in self-management of overall health measures, health prevention, and disease management. By including persuasive principles designed to reinforce, change, or shape attitudes or behaviors, health apps can transform into support tools that motivate and stimulate users to achieve or keep up with target behavior, also called persuasive systems. However, the extent to which health apps for CA patients consciously and successfully employ such persuasive principles remains unknown. The objective of this study was to evaluate the number and type of persuasive principles present in current health apps for CA patients. A review of apps for arthritis patients was conducted across the three major app stores (Google Play, Apple App Store, and Windows Phone Store). Collected apps were coded according to 37 persuasive principles, based on an altered version of the Persuasive System Design taxonomy of Oinas-Kukkonen and Harjumaa and the taxonomy of Behavior Change Techniques of Michie and Abraham. In addition, user ratings, number of installs, and price of the apps were also coded. We coded 28 apps. On average, 5.8 out of 37 persuasive principles were used in each app.
The most used category of persuasive principles was System Credibility, with an average of 2.6 principles. Task Support was the second most used, with an average of 2.3 persuasive principles. Next was Dialogue Support, with an average of 0.5 principles. Social Support came last, with an average of only 0.01 persuasive principles. Current health apps for CA patients would benefit from adding Social Support techniques (eg, social media, user fora) and extending Dialogue Support techniques (eg, rewards, praise). The addition of automated tracking of health-related parameters (eg, physical activity, step count) could further reduce the effort for CA patients to manage their disease and thus increase Task Support. Finally, apps for health could benefit from a more evidence-based approach, both in developing the app and in ensuring that content can be verified as scientifically proven, which would result in enhanced System Credibility.
A knowledge based system for scientific data visualization
NASA Technical Reports Server (NTRS)
Senay, Hikmet; Ignatius, Eve
1992-01-01
A knowledge-based system, called the visualization tool assistant (VISTA), which was developed to assist scientists in the design of scientific data visualization techniques, is described. The system derives its knowledge from several sources, which provide information about data characteristics, visualization primitives, and effective visual perception. The design methodology employed by the system is based on a sequence of transformations that decomposes a data set into a set of data partitions, maps this set of partitions to visualization primitives, and combines these primitives into a composite visualization technique design. Although the primary function of the system is to generate an effective visualization technique design for a given data set by using principles of visual perception, the system also allows users to modify the design interactively and renders the resulting image using a variety of rendering algorithms. The current version of the system primarily supports visualization techniques applicable in the earth and space sciences, although it may easily be extended to include techniques useful in other disciplines such as computational fluid dynamics, finite-element analysis, and medical imaging.
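The transformation sequence described above can be caricatured in a few lines (a toy of our own construction, not VISTA's actual knowledge base or algorithm): rank data partitions by importance, then assign each the most perceptually effective remaining visual primitive and compose the result.

```python
# Toy sketch of a VISTA-style design step: partitions of a dataset are mapped to
# visualization primitives using a perceptual effectiveness ranking, then the
# assignments are combined into one composite design. The ranking below loosely
# follows common perceptual guidelines (position > color > size > texture) and
# is purely illustrative.

PRIMITIVE_RANK = ["position", "color", "size", "texture"]

def design(fields):
    """fields: list of (name, importance) pairs. Returns a composite design dict
    giving the most effective remaining primitive to the most important field."""
    ordered = sorted(fields, key=lambda nv: -nv[1])
    return {name: PRIMITIVE_RANK[min(i, len(PRIMITIVE_RANK) - 1)]
            for i, (name, _) in enumerate(ordered)}

print(design([("temperature", 0.9), ("pressure", 0.5), ("humidity", 0.7)]))
```

The most important field is encoded by spatial position, the next by color, and so on; the real system additionally reasons over data characteristics and lets the user override the mapping.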
Sala-Sastre, Nohemi; Herdman, Mike; Navarro, Lidia; de la Prada, Miriam; Pujol, Ramón M; Serra, Consol; Alonso, Jordi; Flyvholm, Mari-Ann; Giménez-Arnau, Ana M
2009-08-01
Occupational skin diseases are among the most frequent work-related diseases in industrialized countries. The Nordic Occupational Skin Questionnaire (NOSQ-2002), developed in English, is a useful tool for screening of occupational skin diseases. To culturally adapt the NOSQ-2002 to Spanish and Catalan and to assess the clarity, comprehension, cultural relevance and appropriateness of the translated versions. The International Society for Pharmacoeconomics and Outcomes Research (ISPOR) principles of good practice for the translation and cultural adaptation of patient-reported outcomes were followed. After translation into the target language, a first consensus version of the questionnaire was evaluated in multiple cognitive debriefing interviews. The expert panel introduced some modifications in 39 (68%) and 27 (47%) items in the Spanish and Catalan version, respectively (e.g. addition of examples and definitions, reformulation of instructions and use of direct question format). This version was back translated and submitted to the original authors, who suggested a further seven and two modifications in the Spanish and Catalan versions, respectively. A second set of cognitive interviews were performed. A consensus version of both questionnaires was obtained after final modifications based on comments by the patients. The final versions of the Spanish and Catalan NOSQ-2002 questionnaires are now available at www.NRCWE.dk/NOSQ.
Analysis of an epigenetic argument against human reproductive cloning.
Nordgren, Anders
2006-08-01
Human reproductive cloning is a much disputed ethical issue. This technology is often condemned as being contrary to human dignity. However, there are also risk arguments. An ethical argument that is often put forward by scientists, but seldom developed in more detail, focuses on health risks in animal cloning. There is a high risk that animal clones exhibit abnormalities, and these are increasingly believed to be due to errors in epigenetic reprogramming. The argument is that human reproductive cloning should not be carried out because human clones are also likely to exhibit abnormalities due to inappropriate epigenetic reprogramming. Different versions of this epigenetic argument are analysed: a categorical version and a non-categorical one. The non-categorical version is suggested to be the better considered of the two. With regard to policy making on human reproductive cloning, the categorical version can be used to prescribe a permanent ban, while the non-categorical version can be used to prescribe a temporary ban. The implications of the precautionary principle, as interpreted in the European Union, are investigated. The conclusion is that it seems possible to support a temporary ban by reference to this principle.
Kostopoulos, Epameinondas; Agiannidis, Christos; Konofaos, Petros; Kotsakis, Ioannis; Hatzigianni, Panagiota; Georgopoulos, Gerasimos; Papadatou, Zoe; Konstantinidou, Chara; Champsas, Gregorios; Papadopoulos, Othon; Casoli, Vincent
2018-03-08
The medial canthus is a common site of skin cancer. Defects in this region represent a challenging reconstructive task. The nasal version of the keystone perforator island flap (KPIF) has proven its versatility. The aim of the present study was to extend its use to the neighboring medial canthus area. A modified, croissant-like KPIF (CKPIF) was used to resolve problems related to the inner convexity. The presence of the procerus in the glabellar area, bridging the surface from the nasalis up to the frontalis, changed the traditional flap dissection technique. Thus, the authors introduce the bridge principle, which consists of the indirect transfer of the flap to the defect site through a muscular "bridge" (the procerus). The authors report their experience in medial canthal reconstruction combining a modified KPIF with a new dissection "principle." From November 2016 to July 2017, a series of patients presenting soft tissue defects of various dimensions in the medial canthus, secondary to tumor extirpation, underwent reconstruction with a CKPIF dissected according to the bridge principle. A total of 15 patients were treated with this new technique. Their mean age was 75.3 years. The mean size of the defect was 2.08 cm (length) × 1.5 cm (width). All flaps survived without any sign of venous congestion. A transient epiphora presented in 4 patients (4/15, or 26.6%) and subsided 2 months later. A new approach following a novel paradigm was introduced to resolve an old problem. Initial outcomes are encouraging. However, longer series are needed to draw definitive and safer conclusions.
Reusable Launch Vehicle Attitude Control Using a Time-Varying Sliding Mode Control Technique
NASA Technical Reports Server (NTRS)
Shtessel, Yuri B.; Zhu, J. Jim; Daniels, Dan; Jackson, Scott (Technical Monitor)
2002-01-01
In this paper, we present a time-varying sliding mode control (TVSMC) technique for reusable launch vehicle (RLV) attitude control in the ascent and entry flight phases. In ascent flight the guidance commands the Euler roll, pitch, and yaw angles, and in entry flight it commands the aerodynamic angles of bank, attack, and sideslip. The controller employs a body-rate inner loop and an attitude outer loop, which are separated in time scale by the singular perturbation principle. The novelty of the TVSMC is that both the sliding surface and the boundary layer dynamics can be varied in real time using the PD-eigenvalue assignment technique. This salient feature is used to cope with control command saturation and integrator windup in the presence of severe disturbance or control effector failure, which enhances the robustness and fault tolerance of the controller. The TVSMC ascent and descent designs are currently being tested with high-fidelity, 6-DOF dispersion simulations. The test results will be presented in the final version of this paper.
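The core idea, a sliding surface whose slope and boundary layer width vary in time, can be sketched on a double integrator (an illustrative toy, not the RLV controller; the gains and schedules below are invented for the example, not the paper's PD-eigenvalue design):

```python
# Minimal sketch of a time-varying sliding mode controller on a double integrator
# x'' = u + d. The sliding-surface slope lam(t) and boundary-layer width phi(t)
# are scheduled in time, loosely echoing the idea of varying both in real time.

import math

def sat(x):
    """Saturation used to smooth the switching term inside the boundary layer."""
    return max(-1.0, min(1.0, x))

def simulate(T=10.0, dt=0.001):
    x, v = 1.0, 0.0                        # tracking error and its rate
    for i in range(int(T / dt)):
        t = i * dt
        lam = 2.0 + 3.0 * min(t, 1.0)      # surface slope stiffens over 1 s
        phi = 0.1 * math.exp(-t) + 0.01    # boundary layer shrinks over time
        s = v + lam * x                    # time-varying sliding surface
        u = -lam * v - 10.0 * sat(s / phi) # equivalent + switching control
        d = 0.5 * math.sin(5.0 * t)        # bounded disturbance
        v += (u + d) * dt                  # explicit Euler integration
        x += v * dt
    return x, v

xf, vf = simulate()
print(xf, vf)  # error and rate driven near zero despite the disturbance
```

Once the state reaches the surface s = 0, the error obeys ẋ = −λ(t)x, so stiffening λ speeds convergence, while shrinking φ tightens the residual set left by the bounded disturbance.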
On the use of The Bio-Impedance technique for Body Composition Measurements
NASA Astrophysics Data System (ADS)
Huerta-Franco, R.; Vargas-Luna, M.; González-Solís, J. L.; Gutiérrez-Juárez, G.
2003-09-01
A review of the methods and physical principles used in body composition measurements (BCM) makes it evident that more accurate, reliable, and easily handled methods are required. Bio-impedance analysis (BIA) has been very useful in BCM, and in the single-frequency mode some commercial instruments perform BCM. However, these instruments show significant variability in the BCM values they report. The multi-frequency option of the bio-impedance technique still has many challenges to overcome. We studied the variability of the body impedance spectrum (from 1 Hz to 1 MHz) in a group of subjects and compared it to the values obtained from commercial instruments. We compared different anatomical body regions, some with less subcutaneous body fat (frontal, anterior tibial, and knee regions) and others with more subcutaneous body fat (pectoral, abdominal, and internal calf regions). In order to model the bio-impedance spectrum, we analyzed layered samples of different thicknesses and material compositions.
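A common way to model a tissue impedance spectrum of the kind measured here is the Cole model, which interpolates between a low-frequency resistance R0 and a high-frequency resistance Rinf. The sketch below evaluates that model; the parameter values are illustrative assumptions, not fits to any subject, and the abstract does not state which model the authors used.

```python
import math

def cole_impedance(f, r0=800.0, rinf=300.0, tau=1e-6, alpha=0.8):
    """Cole model of tissue impedance:
    Z(f) = Rinf + (R0 - Rinf) / (1 + (j*2*pi*f*tau)**alpha).
    Parameter values are illustrative only."""
    jw = 1j * 2.0 * math.pi * f
    return rinf + (r0 - rinf) / (1.0 + (jw * tau) ** alpha)

# |Z| falls monotonically from ~R0 at low frequency toward Rinf at high frequency,
# which is the qualitative shape of a 1 Hz - 1 MHz body impedance spectrum.
zs = [abs(cole_impedance(f)) for f in (1.0, 1e3, 1e6)]
```

Fitting such a model at many frequencies is what distinguishes multi-frequency BIA from the single-frequency commercial instruments mentioned above, which sample only one point of this curve.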
Action Principle Derivation of Magnetofluid Models
NASA Astrophysics Data System (ADS)
Wurm, Alexander; Morrison, P. J.
2003-10-01
As is well known, ideal MHD possesses an action principle formulation when it is expressed in terms of Lagrangian (or material) variables.^1 Starting with a general magneto-two-fluid Lagrangian, we derive action principles both for MHD approximations and for generalizations that contain more complete versions of Ohm's law. ^1 W. A. Newcomb, Nuclear Fusion: 1962 Suppl., Part 2, p. 451
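Schematically, the Lagrangian-variable action referred to here (Newcomb's formulation) has the following form, where $q(a,t)$ is the fluid trajectory labeled by $a$, $\mathcal{J}=\det(\partial q/\partial a)$ is its Jacobian, subscript-0 fields are initial (attribute) data, and $U$ is the internal energy per unit mass. Sign and index conventions vary between references, so this display is indicative only:

```latex
S[q] \;=\; \int dt \int d^3a\,
\Bigg[\, \frac{\rho_0}{2}\,|\dot{q}|^2
\;-\; \rho_0\, U\!\left(\frac{\rho_0}{\mathcal{J}},\, s_0\right)
\;-\; \frac{1}{2\mathcal{J}}\,
      \frac{\partial q^i}{\partial a^k}\,
      \frac{\partial q^i}{\partial a^l}\, B_0^k B_0^l \Bigg]
```

The three terms are kinetic energy, internal energy, and magnetic energy; varying $q$ while holding the attributes $\rho_0$, $s_0$, $B_0$ fixed recovers the ideal MHD equations of motion.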
Swinnen, Thijs Willem; Westhovens, Rene; de Vlam, Kurt; Geurts, Luc; Vanden Abeele, Vero
2016-01-01
Background: Chronic arthritis (CA), an umbrella term for inflammatory rheumatic and other musculoskeletal diseases, is highly prevalent. Effective disease-modifying antirheumatic drugs are available for CA, with the exception of osteoarthritis, but they require a long-term commitment of patients to comply with the medication regimen and management program, as well as tight follow-up by the treating physician and health professionals. Additionally, patients are advised to participate in physical exercise programs, yet adherence to exercise and physical activity programs is often very low. Patients would benefit from support to increase compliance with medication as well as with physical exercise programs. To address these shortcomings, health apps for CA patients have been created. These mobile apps assist patients in the self-management of overall health measures, health prevention, and disease management. By including persuasive principles designed to reinforce, change, or shape attitudes or behaviors, health apps can become support tools that motivate and stimulate users to achieve or keep up a target behavior, also called persuasive systems. However, the extent to which health apps for CA patients consciously and successfully employ such persuasive principles remains unknown. Objective: The objective of this study was to evaluate the number and type of persuasive principles present in current health apps for CA patients. Methods: A review of apps for arthritis patients was conducted across the three major app stores (Google Play, Apple App Store, and Windows Phone Store). Collected apps were coded according to 37 persuasive principles, based on an altered version of the Persuasive Systems Design taxonomy of Oinas-Kukkonen and Harjumaa and the taxonomy of Behavior Change Techniques of Michie and Abraham. In addition, the user ratings, number of installs, and price of the apps were coded. Results: We coded 28 apps.
On average, 5.8 of the 37 persuasive principles were used in each app. The most used category of persuasive principles was System Credibility, with an average of 2.6 principles per app. Task Support was the second most used, with an average of 2.3 persuasive principles, followed by Dialogue Support with an average of 0.5 principles. Social Support was last, with an average of only 0.01 persuasive principles. Conclusions: Current health apps for CA patients would benefit from adding Social Support techniques (eg, social media, user fora) and extending Dialogue Support techniques (eg, rewards, praise). The addition of automated tracking of health-related parameters (eg, physical activity, step count) could further reduce the effort required for CA patients to manage their disease and thus increase Task Support. Finally, health apps could benefit from a more evidence-based approach, both in developing the app and in ensuring that content can be verified as scientifically proven, which would result in enhanced System Credibility. PMID:27742604
Simons, M; King, S; Edgar, D
2003-01-01
Clinical practice guidelines are a tool to assist with clinical decision making. They provide information about the care for a condition and make recommendations based on research evidence, which can be adapted locally. A focus group within the Allied Health Interest Group of the Australian and New Zealand Burn Association has compiled the "Occupational Therapy and Physiotherapy for the Patient with Burns--Principles and Management Guidelines." These guidelines are designed as a practical guide to the relevant clinical knowledge and therapy intervention techniques required for effective patient management. Content areas include respiratory management, edema management, splinting and positioning, physical function (mobility, function, exercise), scar management, and psychosocial and mutual elements. The document has undergone extensive review by members of the Australian and New Zealand Burn Association to ensure clarity, internal consistency, and acceptability. The guidelines have been endorsed by the Australian and New Zealand Burn Association. An abridged version of the guidelines is included in this article, with the full document available from www.anzba.org.au.
Communicating with Muslim parents: "the four principles" are not as culturally neutral as suggested.
Westra, Anna E; Willems, Dick L; Smit, Bert J
2009-11-01
The "four principles approach" has been popularly accepted as a set of universal guidelines for biomedical ethics. Based on four allegedly trans-cultural principles (respect for autonomy, nonmaleficence, beneficence and justice), it is supposed to fulfil the need of a 'culturally neutral approach to thinking about ethical issues in health care'. On the basis of a case-history, this paper challenges the appropriateness of communicating in terms of these four principles with patients with a different background. The case describes the situation in which Muslim parents bring forward that their religion keeps them from consenting to end-of-life decisions by non-religious paediatricians. In a literature analysis, the different meanings and roles of the relevant principles in non-religious and Islamic ethics are compared. In non-religious ethics, the principle of nonmaleficence may be used to justify withholding or withdrawing futile or damaging treatments, whereas Islamic ethics applies this principle to forbid all actions that may harm life. And while the non-religious version of the principle of respect for autonomy emphasises the need for informed consent, the Islamic version focuses on "respect for the patient". We conclude that the parties involved in the described disagreement may feel committed to seemingly similar, but actually quite different principles. In such cases, communication in terms of these principles may create a conflict within an apparently common conceptual framework. The four principles approach may be very helpful in analysing ethical dilemmas, but when communicating with patients with different backgrounds, an alternative approach is needed that pays genuine attention to the different backgrounds.
Moral absolutism and ectopic pregnancy.
Kaczor, C
2001-02-01
If one accepts a version of absolutism that excludes the intentional killing of any innocent human person from conception to natural death, ectopic pregnancy poses vexing difficulties. Given that the embryonic life almost certainly will die anyway, how can one retain one's moral principle and yet adequately respond to a situation that gravely threatens the life of the mother and her future fertility? The four options of treatment most often discussed in the literature are non-intervention, salpingectomy (removal of tube with embryo), salpingostomy (removal of embryo alone), and use of methotrexate (MXT). In this essay, I review these four options and introduce a fifth (the milking technique). In order to assess these options in terms of the absolutism mentioned, it will also be necessary to discuss various accounts of the intention/foresight distinction. I conclude that salpingectomy, salpingostomy, and the milking technique are compatible with absolutist presuppositions, but not the use of methotrexate.
The Basic Principle of Calculus?
ERIC Educational Resources Information Center
Hardy, Michael
2011-01-01
A simple partial version of the Fundamental Theorem of Calculus can be presented on the first day of the first-year calculus course, and then relied upon repeatedly in assigned problems throughout the course. With that experience behind them, students can use the partial version to understand the full-fledged Fundamental Theorem, with further…
Principles of Radio: A Laboratory Experiment
ERIC Educational Resources Information Center
Kraftmakher, Yaakov
2002-01-01
An experiment is proposed for learning the principles of radio. A simple radio receiver illustrates amplitude modulation and demodulation, the selectivity of a receiver and the features of a directional antenna. Both normal and computerized versions of the experiment are described. The computerized experiment employs the "ScienceWorkshop"…
The Uncertainty Principle, Virtual Particles and Real Forces
ERIC Educational Resources Information Center
Jones, Goronwy Tudor
2002-01-01
This article provides a simple practical introduction to wave-particle duality, including the energy-time version of the Heisenberg Uncertainty Principle. It has been successful in leading students to an intuitive appreciation of "virtual particles" and the role they play in describing the way ordinary particles, like electrons and protons, exert…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-23
... Exchange notes that a version of the instant filing requesting an extension of the Pilot was formally filed... May 27, 2010, due to technical deficiencies in that filing. The instant version corrects those... the instant filing is consistent with these principles. Specifically an extension will allow the...
Myhr, Anne Ingeborg; Myskja, Bjørn K
2011-04-01
Nanoparticles have multifaceted advantages in drug administration, such as vaccine delivery, and hence hold promise for improving the protection of farmed fish against diseases caused by pathogens. However, there are concerns that the benefits associated with the distribution of nanoparticles may be accompanied by risks to the environment and health. The complexity of the natural and social systems involved implies that the information acquired in quantified risk assessments may be inadequate for evidence-based decisions. One controversial strategy for dealing with this kind of uncertainty is the precautionary principle. A few years ago, a UNESCO expert group suggested a new approach for implementation of the principle. Here we compare the UNESCO principle with earlier versions and explore its advantages and disadvantages by applying the UNESCO version to the use of PLGA nanoparticles for the delivery of vaccines in aquaculture. Finally, we discuss whether a combined scientific and ethical analysis that involves the concept of responsibility can enable approaches that provide a supplement to the precautionary principle as a basis for decision-making in areas of scientific uncertainty, such as the application of nanoparticles in the vaccination of farmed fish.
How not to criticize the precautionary principle.
Hughes, Jonathan
2006-10-01
The precautionary principle has its origins in debates about environmental policy, but is increasingly invoked in bioethical contexts. John Harris and Søren Holm argue that the principle should be rejected as incoherent, irrational, and representing a fundamental threat to scientific advance and technological progress. This article argues that while there are problems with standard formulations of the principle, Harris and Holm's rejection of all its forms is mistaken. In particular, they focus on strong versions of the principle and fail to recognize that weaker forms, which may escape their criticisms, are both possible and advocated in the literature.
A general derivation and quantification of the third law of thermodynamics.
Masanes, Lluís; Oppenheim, Jonathan
2017-03-14
The most accepted version of the third law of thermodynamics, the unattainability principle, states that no process can reach absolute zero temperature in a finite number of steps and within a finite time. Here, we provide a derivation of the principle that applies to arbitrary cooling processes, even those exploiting the laws of quantum mechanics or involving an infinite-dimensional reservoir. We quantify the resources needed to cool a system to any temperature and translate these resources into the minimal time or number of steps by considering the notion of a thermal machine that obeys restrictions similar to those on universal computers. We generally find that the obtainable temperature can scale as an inverse power of the cooling time. Our results also clarify the connection between two versions of the third law (the unattainability principle and the heat theorem) and place ultimate bounds on the speed at which information can be erased.
NASA Astrophysics Data System (ADS)
Mattingly, James
2014-05-01
I argue that the key principle of microgravity is what I have called elsewhere the Lorentzian strategy. This strategy may be seen either as a reverse-engineering approach or as a descent-with-modification approach; however one sees it, the method works neither by attempting to propound a theory that is the quantum version of an extant or generalized gravitation theory, nor by attempting to propound a theory that is the final version of quantum mechanics and finding gravity within it. Instead, the method works by beginning with what we are pretty sure is a good approximation to the low-energy limit of whatever the real microprocesses are that generate what we experience as gravitation. This method is powerful and fruitful, and it is not committed to principles for which we have, as yet, only scant evidence; the method begins with what we do know and teases out what we can know next. The principle is methodological, not ontological.
A Spectral Approach for Quenched Limit Theorems for Random Expanding Dynamical Systems
NASA Astrophysics Data System (ADS)
Dragičević, D.; Froyland, G.; González-Tokman, C.; Vaienti, S.
2018-06-01
We prove quenched versions of (i) a large deviations principle (LDP), (ii) a central limit theorem (CLT), and (iii) a local central limit theorem for non-autonomous dynamical systems. A key advance is the extension of the spectral method, commonly used in limit laws for deterministic maps, to the general random setting. We achieve this via multiplicative ergodic theory and the development of a general framework to control the regularity of Lyapunov exponents of twisted transfer operator cocycles with respect to a twist parameter. While some versions of the LDP and CLT have previously been proved with other techniques, the local central limit theorem is, to our knowledge, a completely new result, and one that demonstrates the strength of our method. Applications include non-autonomous (piecewise) expanding maps, defined by random compositions of the form $T_{\sigma^{n-1}\omega} \circ \cdots \circ T_{\sigma\omega} \circ T_{\omega}$. An important aspect of our results is that we only assume ergodicity and invertibility of the random driving $\sigma : \Omega \to \Omega$; in particular, no expansivity or mixing properties are required.
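To sketch the flavor of these results (schematically; the precise regularity and fiberwise-centering hypotheses are those of the paper), write $T^{(k)}_\omega := T_{\sigma^{k-1}\omega}\circ\cdots\circ T_\omega$ for the random composition. The quenched CLT then asserts that for almost every realization $\omega$ of the driving,

```latex
\frac{1}{\sqrt{n}} \sum_{k=0}^{n-1} \phi\big(\sigma^{k}\omega,\; T^{(k)}_{\omega} x\big)
\;\xrightarrow{\;d\;}\; \mathcal{N}\big(0,\,\Sigma^{2}\big)
\qquad \text{as } n \to \infty,
```

with a variance $\Sigma^2$ that does not depend on the particular realization $\omega$. "Quenched" means the statement holds along almost every fixed sequence of maps, rather than on average over the driving (the "annealed" version).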
Teaching Keynes's Principle of Effective Demand Using the Aggregate Labor Market Diagram.
ERIC Educational Resources Information Center
Dalziel, Paul; Lavoie, Marc
2003-01-01
Suggests a method to teach John Keynes's principle of effective demand using a standard aggregate labor market diagram familiar to students taking advanced undergraduate macroeconomics courses. States the analysis incorporates Michal Kalecki's version to show Keynesian unemployment as a point on the aggregate labor demand curve inside the…
NASA Technical Reports Server (NTRS)
1975-01-01
A system is presented that processes FORTRAN-based software systems to surface potential problems before they become execution malfunctions. The system complements the diagnostic capabilities of compilers, loaders, and execution monitors rather than duplicating these functions, and it emphasizes frequent sources of FORTRAN problems that require inordinate manual effort to identify. The principal value of the system is extracting small sections of unusual code from the bulk of normal sequences. Code structures likely to cause immediate or future problems are brought to the user's attention. These messages stimulate timely corrective action on solid errors and promote identification of 'tricky' code. Corrective action may require recoding or simply extending the software documentation to explain the unusual technique.
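The idea of pulling small sections of unusual code out of the bulk of normal sequences can be sketched as a pattern scanner. This is a hypothetical miniature, not the system described above: the rule names and the three example patterns (real-valued equality tests, EQUIVALENCE statements, assigned GO TOs) are illustrative choices of classic FORTRAN trouble spots.

```python
import re

# Hypothetical rule table: each rule names a FORTRAN construct that often
# signals "tricky" code deserving human review.
RULES = [
    ("real-equality-test", re.compile(r"\.EQ\.\s*\d+\.\d*", re.IGNORECASE)),
    ("equivalence-stmt", re.compile(r"^\s*EQUIVALENCE\b", re.IGNORECASE)),
    ("assigned-goto", re.compile(r"^\s*GO\s*TO\s+\w+\s*,", re.IGNORECASE)),
]

def scan(source):
    """Return (line number, rule name, line) for each flagged line."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), 1):
        for name, pat in RULES:
            if pat.search(line):
                findings.append((lineno, name, line.strip()))
    return findings

demo = """      EQUIVALENCE (A, B)
      IF (X .EQ. 0.0) GO TO 10
      Y = X + 1.0
"""
report = scan(demo)  # flags lines 1 and 2; the ordinary assignment passes
```

A real tool of this kind would parse rather than pattern-match, but the output shape is the same: a short list of suspect lines for the programmer to recode or document.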
Effective use of remote sensing products in litigation
NASA Technical Reports Server (NTRS)
Jaynes, R. A.
1983-01-01
A boiled-down version of the major legal principles affecting the admissibility of data and products from remote sensing devices is presented. It is suggested that enhancements or classifications of digital data (from scanning devices or from digitized aerial photography) be proffered as evidence in a fashion similar to the manner in which maps produced by photogrammetric techniques are introduced as evidence. Every effort should be made to elucidate the processes by which digital data are analytically treated or manipulated. Remote sensing expert witnesses should be practiced in providing concise and clear explanations of both data and methods. Special emphasis should be placed on being prepared to provide a detailed accounting of the steps taken to calibrate and verify spectral characteristics with ground truth.
French, Lauren; Gerrie, Matthew P; Garry, Maryanne; Mori, Kazuo
2009-11-01
The MORI technique provides a unique way to research social influences on memory. The technique allows people to watch different movies on the same screen at the same time without realizing that each of them sees something different. As a result, researchers can create a situation in which people feel as though they share an experience, but systematic differences are introduced into their memories, and the effect of those differences can be tracked through a discussion. Despite its methodological advances, the MORI technique has been met with criticism, mostly because reviewers are worried that the MORI technique might not completely block the alternate movie version from view, leading people in these studies to see their partner's version of the movie as well as their own. We addressed these concerns in two experiments. We found no evidence that subjects noticed the alternate movie version while watching a movie via the MORI technique (Experiment 1) and no evidence that subjects remembered details from the alternate movie version (Experiment 2). Taken together, the results provide support for the MORI technique as a valuable research tool.
Flawed Applications of Bernoulli's Principle
ERIC Educational Resources Information Center
Koumaras, Panagiotis; Primerakis, Georgios
2018-01-01
One of the most popular demonstration experiments pertaining to Bernoulli's principle is the production of a water spray by using a vertical plastic straw immersed in a glass of water and a horizontal straw to blow air towards the top edge of the vertical one. A more general version of this phenomenon, appearing also in school physics problems, is…
ERIC Educational Resources Information Center
Cheng, Ming-Chang; Chou, Pei-I; Wang, Ya-Ting; Lin, Chih-Ho
2015-01-01
This study investigates how the illustrations in a science textbook, with their design modified according to cognitive process principles, affected students' learning performance. The quasi-experimental design recruited two Grade 5 groups (N = 58) as the research participants. The treatment group (n = 30) used the modified version of the textbook,…
ERIC Educational Resources Information Center
Brancaccio-Taras, Loretta; Pape-Lindstrom, Pamela; Peteroy-Kelly, Marcy; Aguirre, Karen; Awong-Taylor, Judy; Balser, Teri; Cahill, Michael J.; Frey, Regina F.; Jack, Thomas; Kelrick, Michael; Marley, Kate; Miller, Kathryn G.; Osgood, Marcy; Romano, Sandra; Uzman, J. Akif; Zhao, Jiuqing
2016-01-01
The PULSE Vision & Change Rubrics, version 1.0, assess life sciences departments' progress toward implementation of the principles of the "Vision and Change report." This paper reports on the development of the rubrics, their validation, and their reliability in measuring departmental change aligned with the "Vision and…
ERIC Educational Resources Information Center
Siegle, Del, Ed.
This pamphlet (Practitioner's Guide), in both an English version and Spanish version, is intended for parents of precocious readers. Research facts on early reading are briefly summarized. Implications for the classroom and home are offered and include a discussion of early school entrance, principles of reading instruction, and ways the parent…
Self-consistency in the phonon space of the particle-phonon coupling model
NASA Astrophysics Data System (ADS)
Tselyaev, V.; Lyutorovich, N.; Speth, J.; Reinhard, P.-G.
2018-04-01
In this paper a nonlinear generalization of the time blocking approximation (TBA) is presented. The TBA is one version of the extended random-phase approximation (RPA) developed within the Green-function method and the particle-phonon coupling model. In the generalized version of the TBA, the self-consistency principle is extended to the phonon space of the model. Numerical examples show that this nonlinear version of the TBA leads to convergence of the results with respect to enlarging the phonon space of the model.
Development and Application of the p-Version of the Finite Element Method.
1987-12-30
The finite element method has been the subject of intensive study since the early 1950's and perhaps even earlier. Study of the p-version of the finite element method, on the other hand, began at Washington University in St. Louis in the early 1970's and led to a more recent study of the h-p version. Research... infinite strip to a bounded domain. 3.3 A Numerical Argument Principle. In order to assure that all roots have indeed been obtained, we have studied the...
NASA Astrophysics Data System (ADS)
Salleh, Khalijah Mohd; Abdullah, Abu Bakar Bin
2008-05-01
An exploratory study was carried out to confirm Malaysian physics teachers' perception that Archimedes' principle is a difficult topic for secondary-level students. The interview method was used for data collection. The study sample comprised nine teachers from national secondary schools in Miri, Sarawak. The data were analysed qualitatively using the ATLAS.ti version 5.2 software. The findings of the study showed that (i) Archimedes' principle, compared to Bernoulli's and Pascal's, is the most difficult principle of hydrodynamics for students; (ii) more time was given to the teaching and learning (TL) of Archimedes' principle than to the other two principles; and (iii) the major TL problems include conceptual understanding, application of physics principles and ideas, and lack of mathematical skills. These findings indicate the need to develop corresponding instructional materials and learning kits that can assist students' understanding of Archimedes' principle.
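The conceptual core that students reportedly struggle with is compact enough to state as a worked example: the upthrust equals the weight of displaced fluid, so an object floats exactly when its average density is below the fluid's. The numbers below are an illustrative classroom example, not data from the study.

```python
RHO_WATER = 1000.0  # density of water, kg/m^3
G = 9.81            # gravitational acceleration, m/s^2

def buoyant_force(volume_submerged_m3, fluid_density=RHO_WATER):
    """Archimedes' principle: upthrust = weight of the displaced fluid."""
    return fluid_density * G * volume_submerged_m3

def floats(mass_kg, volume_m3, fluid_density=RHO_WATER):
    """An object floats if its average density is below the fluid's."""
    return mass_kg / volume_m3 < fluid_density

# A 0.5 kg block of volume 1 litre: average density 500 kg/m^3 < 1000 kg/m^3,
# so it floats; fully submerged it would feel ~9.81 N of upthrust.
f = buoyant_force(1e-3)
```

Working such a calculation both ways (given the force, find the displaced volume) targets the two problem areas the teachers named: conceptual understanding and mathematical skills.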
Synthetic Biology Open Language (SBOL) Version 2.0.0.
Bartley, Bryan; Beal, Jacob; Clancy, Kevin; Misirli, Goksel; Roehner, Nicholas; Oberortner, Ernst; Pocock, Matthew; Bissell, Michael; Madsen, Curtis; Nguyen, Tramy; Zhang, Zhen; Gennari, John H; Myers, Chris; Wipat, Anil; Sauro, Herbert
2015-09-04
Synthetic biology builds upon the techniques and successes of genetics, molecular biology, and metabolic engineering by applying engineering principles to the design of biological systems. The field still faces substantial challenges, including long development times, high rates of failure, and poor reproducibility. One method to ameliorate these problems would be to improve the exchange of information about designed systems between laboratories. The Synthetic Biology Open Language (SBOL) has been developed as a standard to support the specification and exchange of biological design information in synthetic biology, filling a need not satisfied by other pre-existing standards. This document details version 2.0 of SBOL, introducing a standardized format for the electronic exchange of information on the structural and functional aspects of biological designs. The standard has been designed to support the explicit and unambiguous description of biological designs by means of a well defined data model. The standard also includes rules and best practices on how to use this data model and populate it with relevant design details. The publication of this specification is intended to make these capabilities more widely accessible to potential developers and users in the synthetic biology community and beyond.
Single-Atom Demonstration of the Quantum Landauer Principle
NASA Astrophysics Data System (ADS)
Yan, L. L.; Xiong, T. P.; Rehan, K.; Zhou, F.; Liang, D. F.; Chen, L.; Zhang, J. Q.; Yang, W. L.; Ma, Z. H.; Feng, M.
2018-05-01
One of the outstanding challenges to information processing is the eloquent suppression of energy consumption in the execution of logic operations. The Landauer principle sets an energy constraint in deletion of a classical bit of information. Although some attempts have been made to experimentally approach the fundamental limit restricted by this principle, exploring the Landauer principle in a purely quantum mechanical fashion is still an open question. Employing a trapped ultracold ion, we experimentally demonstrate a quantum version of the Landauer principle, i.e., an equality associated with the energy cost of information erasure in conjunction with the entropy change of the associated quantized environment. Our experimental investigation substantiates an intimate link between information thermodynamics and quantum candidate systems for information processing.
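The classical bound that the experiment probes in its quantum form is simple to state numerically: erasing one bit into a reservoir at temperature T dissipates at least k_B T ln 2 of heat. The sketch below just evaluates that bound; it is a textbook relation, not the equality for quantized environments demonstrated in the paper.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the SI)

def landauer_bound(temperature_k, bits=1):
    """Minimum heat dissipated when erasing `bits` of information
    into a reservoir at the given temperature: k_B * T * ln 2 per bit."""
    return bits * K_B * temperature_k * math.log(2)

e_room = landauer_bound(300.0)  # ~2.9e-21 J per bit at room temperature
```

The tiny scale of this number (orders of magnitude below what present logic gates dissipate) is why approaching the limit experimentally requires systems as clean as a single trapped ultracold ion.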
Identification of stars and digital version of the catalogue of 1958 by Brodskaya and Shajn
NASA Astrophysics Data System (ADS)
Gorbunov, M. A.; Shlyapnikov, A. A.
2017-12-01
The following topics are considered: the identification of objects on search maps, the determination of their coordinates at the epoch 2000.0, and the conversion of the published 1958 catalogue of Brodskaya and Shajn into a machine-readable format. Statistics for the photometric and spectral data from the original catalogue are presented. A digital version of the catalogue is described, along with its presentation in HTML, VOTable, and AJS formats and the basic principles of working with it in the Aladin Sky Atlas, an interactive application of the International Virtual Observatory.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gokaltun, Seckin; Munroe, Norman; Subramaniam, Shankar
2014-12-31
This study presents a new drag model, based on cohesive inter-particle forces, implemented in the MFIX code. The new drag model combines an existing standard model in MFIX with a particle-based drag model through a switching principle: switches between the models occur in regions of the computational domain where strong particle-to-particle cohesion potential is detected. Three versions of the new model were obtained by using a different standard drag model in each version. The performance of each version was then compared against available experimental data for a fluidized bed, published in the literature and used extensively by other researchers for validation purposes. In our analysis of the results, we first observed that the standard models used in this research were incapable of producing closely matching results. We then showed, for a simple case, that a threshold needs to be set on the solid volume fraction; this modification was applied to avoid non-physical clustering predictions when the governing equation of the solid granular temperature was solved. Using our hybrid technique, we then observed that our approach improves the numerical results significantly; however, the improvement depended on the threshold of the cohesive index used in the switching procedure. Our results showed that small values of the threshold for the cohesive index could result in a significant reduction of the computational error for all versions of the proposed drag model. In addition, we redesigned an existing circulating fluidized bed (CFB) test facility in order to create validation cases for the clustering regime of Geldart A type particles.
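The switching principle described above can be sketched as follows. This is a hypothetical illustration, not the MFIX implementation: the Wen-Yu-style correlation is one common "standard" drag choice, and the cluster correction is reduced here to a simple scale factor applied when a cohesive index exceeds a threshold (the actual particle-based model is more elaborate).

```python
def wen_yu_beta(eps_g, rho_g, d_p, mu_g, u_slip):
    """Standard Wen-Yu style gas-solid drag coefficient (illustrative form).
    eps_g: gas volume fraction; rho_g, mu_g: gas density and viscosity;
    d_p: particle diameter; u_slip: gas-solid slip velocity."""
    re_p = max(eps_g * rho_g * d_p * abs(u_slip) / mu_g, 1e-12)
    cd = 24.0 / re_p * (1.0 + 0.15 * re_p ** 0.687) if re_p < 1000 else 0.44
    eps_s = 1.0 - eps_g
    return 0.75 * cd * eps_g * eps_s * rho_g * abs(u_slip) / d_p * eps_g ** -2.65

def hybrid_beta(eps_g, rho_g, d_p, mu_g, u_slip, cohesive_index,
                threshold=0.5, reduction=0.3):
    """Hypothetical switching rule in the spirit of the study: use the
    standard correlation where cohesion is weak, and a cluster-corrected
    (here: simply scaled-down) drag where cohesion is strong."""
    beta = wen_yu_beta(eps_g, rho_g, d_p, mu_g, u_slip)
    if cohesive_index > threshold:
        beta *= reduction  # clusters experience less drag than isolated particles
    return beta
```

The sensitivity the study reports then has a clear reading: lowering `threshold` switches more of the domain onto the cluster-corrected branch, which is exactly the knob found to control the error reduction.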
Yesterday's Extraordinary Research Yields Today's Ordinary Principles
ERIC Educational Resources Information Center
Thomas, Mary Norris
2005-01-01
Ordinary performance improvement tips, techniques, and principles that are taken for granted today have their roots in extraordinary research. Today, the learning principle that states that things that occur together tend to be recalled together is widely accepted, and this principle of association as an instructional technique is often used. How…
Electrochemical Biosensors - Sensor Principles and Architectures
Grieshaber, Dorothee; MacKenzie, Robert; Vörös, Janos; Reimhult, Erik
2008-01-01
Quantification of biological or biochemical processes is of utmost importance for medical, biological and biotechnological applications. However, converting the biological information to an easily processed electronic signal is challenging due to the complexity of connecting an electronic device directly to a biological environment. Electrochemical biosensors provide an attractive means to analyze the content of a biological sample due to the direct conversion of a biological event to an electronic signal. Over the past decades, several sensing concepts and related devices have been developed. In this review, the most common traditional techniques, such as cyclic voltammetry, chronoamperometry, chronopotentiometry, impedance spectroscopy, and various field-effect transistor-based methods are presented along with selected promising novel approaches, such as nanowire or magnetic nanoparticle-based biosensing. Additional measurement techniques, which have been shown to be useful in combination with electrochemical detection, are also summarized, such as the electrochemical versions of surface plasmon resonance, optical waveguide lightmode spectroscopy, ellipsometry, quartz crystal microbalance, and scanning probe microscopy. The signal transduction and the general performance of electrochemical sensors are often determined by the surface architectures that connect the sensing element to the biological sample at the nanometer scale. The most common surface modification techniques, the various electrochemical transduction mechanisms, and the choice of the recognition receptor molecules all influence the ultimate sensitivity of the sensor. New nanotechnology-based approaches, such as the use of engineered ion-channels in lipid bilayers, the encapsulation of enzymes into vesicles, polymersomes, or polyelectrolyte capsules provide additional possibilities for signal amplification.
In particular, this review highlights the importance of the precise control over the delicate interplay between surface nano-architectures, surface functionalization and the chosen sensor transducer principle, as well as the usefulness of complementary characterization tools to interpret and to optimize the sensor response. PMID:27879772
Guidelines for chemical peeling in Japan (3rd edition).
2012-04-01
Chemical peeling may be defined as the therapies, procedures and techniques used for the treatment of certain cutaneous diseases or conditions, and for aesthetic improvement. The procedures include the application of one or more chemical agents to the skin. Chemical peeling has been very popular in both medical and aesthetic fields. Because its scientific background is not well understood and no systematic approach has been established, medical and social problems have arisen. This prompted us to establish and distribute a standard guideline of care for chemical peeling. Previous guidelines such as the 2001 and 2004 versions included minimum standards of care such as indications, chemicals, applications, and any associated precautions, including post-peeling care. The principles in this updated version of the guidelines are as follows: (i) chemical peeling should be performed under the strict technical control and responsibility of a physician; (ii) the physician should have sufficient knowledge of the structure and physiology of the skin and subcutaneous tissues, and understand the mechanisms of wound-healing induced by chemical peeling; (iii) the physician should be board-certified in an appropriate specialty such as dermatology; and (iv) the ultimate judgment regarding the appropriateness of any specific chemical peeling procedure must be made by the physician while considering all standard therapeutic protocols, which should be presented to each individual patient. Keeping these concepts in mind, this new version of the guidelines includes a more scientific and detailed approach from the viewpoint of evidence-based medicine. © 2011 Japanese Dermatological Association.
Grudniewicz, Agnes; Bhattacharyya, Onil; McKibbon, K Ann; Straus, Sharon E
2015-11-04
Printed educational materials (PEMs) are a frequently used tool to disseminate clinical information and attempt to change behavior within primary care. However, their effect on clinician behavior is limited. In this study, we explored how PEMs can be redesigned to better meet the needs of primary care physicians (PCPs) and whether usability and selection can be increased when design principles and user preferences are used. We redesigned a publicly available PEM using physician preferences, design principles, and graphic designer support. We invited PCPs to select their preferred document between the redesigned and original versions in a discrete choice experiment, followed by an assessment of usability with the System Usability Scale and a think-aloud process. We conducted this study in both a controlled and an opportunistic setting to determine whether usability testing results vary by study location. Think-aloud data were thematically analyzed, and results were interpreted using the Technology Acceptance Model. One hundred and eighty-four PCPs participated in the discrete choice experiment at the 2014 Family Medicine Forum, a large Canadian conference for family physicians. Of these, 87.7% preferred the redesigned version. Follow-up interviews were held with a randomly selected group of seven participants. We repeated this in a controlled setting in Toronto, Canada, with a set of 14 participants. Using the System Usability Scale, we found that usability scores were significantly increased with the redesign (p < 0.001). We also found that when PCPs were given the choice between the two versions, they selected the redesigned version as their preferred PEM more often than the original (p < 0.001). Results did not appear to differ between the opportunistic and controlled settings. We used the results of the think-aloud process to add to a list of end user preferences developed in a previous study.
We found that redesigning a PEM with user preferences and design principles can improve its usability and result in the PEM being selected more often than the original. We feel this finding supports the involvement of the user, application of design principles, and the assistance of a graphic designer in the development of PEMs.
Vapourisers: Physical Principles and Classification
Dhulkhed, Vithal; Shetti, Akshaya; Naik, Shraddha; Dhulkhed, Pavan
2013-01-01
Vapourisers have evolved from rudimentary inhalers to the microprocessor-controlled, temperature-compensated and flow-sensing devices that are universal today. The improvements in design were influenced by the development of potent inhalational anaesthetics, the unique properties of some agents, a deeper understanding of their mechanism of action, inherent flaws in the older vapourisers, mechanical problems due to thymol deposition, and factors influencing output such as temperature and pressure variations. It is important to review the principles governing the design of vapourisers to gain insight into their working. It is fascinating to know how some of the older vapourisers, popularly used in the past, functioned. The Triservice vapouriser, a descendant of the Oxford Miniature Vapouriser, is still part of the military draw-over anaesthesia equipment meant for field use, whereas the Copper Kettle, the first precision device, is the forerunner of the Tec 6 and Aladdin cassette vapourisers. Anaesthesia trainees exposed to draw-over techniques gain a deeper understanding of equipment and improved skills for disaster situations. In recent advanced versions of the vapouriser, a central processing unit in the anaesthetic machine controls the operation by continuously monitoring and adjusting fresh gas flow through the vapouriser to maintain the desired concentration of vapour. PMID:24249878
Automated first-principles mapping for phase-change materials.
Esser, Marc; Maintz, Stefan; Dronskowski, Richard
2017-04-05
Plotting materials on bi-coordinate maps according to physically meaningful descriptors has a successful tradition in computational solid-state science spanning more than four decades. Equipped with new ab initio techniques introduced in this work, we generate an improved version of the treasure map for phase-change materials (PCMs) as introduced previously by Lencer et al. which, other than before, charts all industrially used PCMs correctly. Furthermore, we suggest seven new PCM candidates, namely SiSb4Te7, Si2Sb2Te5, SiAs2Te4, PbAs2Te4, SiSb2Te4, Sn2As2Te5, and PbAs4Te7, to be used as synthetic targets. To realize the aforementioned maps based on orbital mixing (or "hybridization") and ionicity coordinates, structural information was first included into an ab initio numerical descriptor for sp3 orbital mixing and then generalized beyond high-symmetry structures. In addition, a simple, yet powerful quantum-mechanical ionization measure also including structural information was introduced. Taken together, these tools allow for (automatically) generating materials maps solely relying on first-principles calculations. © 2017 Wiley Periodicals, Inc.
1998-08-07
cognitive flexibility theory and generative learning theory, which focus primarily on the individual student's cognitive development, collaborative... develop "Handling Transfusion Hazards," a computer program based upon cognitive flexibility theory principles. The Program: Handling Transfusion Hazards... computer program was developed according to cognitive flexibility theory principles. A generative version was then developed by embedding
Advanced stress analysis methods applicable to turbine engine structures
NASA Technical Reports Server (NTRS)
Pian, T. H. H.
1985-01-01
Advanced stress analysis methods applicable to turbine engine structures are investigated. The construction of special elements which contain traction-free circular boundaries is investigated. New versions of the mixed variational principle and of hybrid stress elements are formulated. A method is established for the suppression of kinematic deformation modes. SemiLoof plate and shell elements are constructed by the assumed-stress hybrid method. An elastic-plastic analysis is conducted by viscoplasticity theory using the mechanical subelement model.
ERIC Educational Resources Information Center
ROSEN, ELLEN F.; STOLUROW, LAWRENCE M.
IN ORDER TO FIND A GOOD PREDICTOR OF EMPIRICAL DIFFICULTY, AN OPERATIONAL DEFINITION OF STEP SIZE, TEN PROGRAMER-JUDGES RATED CHANGE IN COMPLEXITY IN TWO VERSIONS OF A MATHEMATICS PROGRAM, AND THESE RATINGS WERE THEN COMPARED WITH MEASURES OF EMPIRICAL DIFFICULTY OBTAINED FROM STUDENT RESPONSE DATA. THE TWO VERSIONS, A 54 FRAME BOOKLET AND A 35…
NASA Astrophysics Data System (ADS)
Capelli, Riccardo; Tiana, Guido; Camilloni, Carlo
2018-05-01
Inferential methods can be used to integrate experimental information and molecular simulations. The maximum entropy principle provides a framework for using equilibrium experimental data, and it has been shown that replica-averaged simulations, restrained using a static potential, are a practical and powerful implementation of such a principle. Here we show that replica-averaged simulations restrained using a time-dependent potential are equivalent to the principle of maximum caliber, the dynamic version of the principle of maximum entropy, and thus may allow us to integrate time-resolved data in molecular dynamics simulations. We provide an analytical proof of the equivalence as well as a computational validation making use of simple models and synthetic data. Some limitations and possible solutions are also discussed.
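The central object here, a replica-averaged restraint, is commonly written as a harmonic penalty on the deviation between the replica average of an observable and its experimental value. A schematic form of the static restraint and its time-dependent generalization, consistent with the maximum-caliber reading described in the abstract, would be (notation assumed, not taken from the paper):

```latex
% Static restraint (maximum entropy), N replicas, observable f:
V_{\mathrm{restr}}(\mathbf{x}_1,\dots,\mathbf{x}_N) =
  \frac{k}{2}\left(\frac{1}{N}\sum_{i=1}^{N} f(\mathbf{x}_i) - f^{\mathrm{exp}}\right)^{\!2}

% Time-dependent restraint (maximum caliber):
V_{\mathrm{restr}}(\mathbf{x}_1,\dots,\mathbf{x}_N; t) =
  \frac{k}{2}\left(\frac{1}{N}\sum_{i=1}^{N} f(\mathbf{x}_i(t)) - f^{\mathrm{exp}}(t)\right)^{\!2}
```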
Growth studies at bulk III-Vs by image processing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Donecker, J.; Hempel, G.; Kluge, J.
1996-12-01
The patterns of inhomogeneities in GaAs and InP are studied by scattering and diffraction of light. An adapted version of laser scattering tomography is used for observations with short exposure times and large fields. The information about the three-dimensional distribution of the scatterers in GaAs is evaluated by video travels through the crystal and by images of intensities added in directions of interest. Near-infrared transmission and striation distance mapping act like special data compression techniques due to their optical principles. In general, columnar extension of cellular patterns and striations could not be detected in s.i. GaAs. Long-range correlations exist for lineages and slip lines. The comparison with the behavior of striations in doped InP cannot confirm the idea that cellular patterns in GaAs originate from constitutional supercooling during solidification.
On the Shock-Response-Spectrum Recursive Algorithm of Kelly and Richman
NASA Technical Reports Server (NTRS)
Martin, Justin N.; Sinclair, Andrew J.; Foster, Winfred A.
2010-01-01
The monograph Principles and Techniques of Shock Data Analysis written by Kelly and Richman in 1969 has become a seminal reference on the shock response spectrum (SRS) [1]. Because of its clear physical descriptions and mathematical presentation of the SRS, it has been cited in multiple handbooks on the subject [2, 3] and research articles [4-10]. Because of continued interest, two additional versions of the monograph have been published: a second edition by Scavuzzo and Pusey in 1996 [11] and a reprint of the original edition in 2008 [12]. The main purpose of this note is to correct several typographical errors in the monograph's presentation of a recursive algorithm for SRS calculations. These errors are consistent across all three editions of the monograph. The secondary purpose of this note is to present a Matlab implementation of the corrected algorithm.
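The corrected recursion itself is not reproduced in this abstract. As a general illustration of how recursive SRS computation works, the following Python sketch implements a widely used recursive formulation, the Smallwood ramp-invariant digital filter, rather than the monograph's own algorithm; the default Q of 10 and the test signal are assumptions.

```python
import math

def srs(signal, dt, freqs, Q=10.0):
    """Maximax shock response spectrum of a base-acceleration time history,
    computed per natural frequency with the Smallwood ramp-invariant
    recursive filter (absolute-acceleration model, unity DC gain)."""
    zeta = 1.0 / (2.0 * Q)
    out = []
    for fn in freqs:
        wn = 2.0 * math.pi * fn
        wd = wn * math.sqrt(1.0 - zeta * zeta)
        E = math.exp(-zeta * wn * dt)
        K = wd * dt
        C = E * math.cos(K)
        S = E * math.sin(K)
        Sp = S / K
        # Ramp-invariant filter coefficients
        b0 = 1.0 - Sp
        b1 = 2.0 * (Sp - C)
        b2 = E * E - Sp
        a1 = -2.0 * C
        a2 = E * E
        x1 = x2 = y1 = y2 = 0.0
        peak = 0.0
        for x0 in signal:
            y0 = b0 * x0 + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2
            peak = max(peak, abs(y0))
            x2, x1 = x1, x0
            y2, y1 = y1, y0
        out.append(peak)
    return out
```

For a classical half-sine base pulse, this recursion reproduces the familiar SRS shape: amplification near the pulse's characteristic frequency and an asymptote at the peak input level for high natural frequencies.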
What is Your Cosmic Connection to the Elements?
NASA Technical Reports Server (NTRS)
Lochner, J.
2003-01-01
This booklet provides information and classroom activities covering topics in astronomy, physics, and chemistry. Chemistry teachers will find information about the cosmic origin of the chemical elements. The astronomy topics include the big bang, life cycles of small and large stars, supernovae, and cosmic rays. Physics teachers will find information on fusion processes, and physical principles important in stellar evolution. While not meant to replace a textbook, the information provided here is meant to give the necessary background for the theme of "our cosmic connection to the elements." The activities can be used to reinforce the material across a number of disciplines, using a variety of techniques, and to engage and excite students about the topic. Additional activities, and on-line versions of the activities published here, are available at http://imagine.gsfc.nasa.gov/docs/teachers/elements/.
Deterring watermark collusion attacks using signal processing techniques
NASA Astrophysics Data System (ADS)
Lemma, Aweke N.; van der Veen, Michiel
2007-02-01
Collusion attack is a malicious watermark removal attack in which the hacker has access to multiple copies of the same content with different watermarks and tries to remove the watermark using averaging. In the literature, several solutions to collusion attacks have been reported. The mainstream solutions aim at designing watermark codes that are inherently resistant to collusion attacks. The other approaches propose signal-processing-based solutions that aim at modifying the watermarked signals in such a way that averaging multiple copies of the content leads to a significant degradation of the content quality. In this paper, we present a signal-processing-based technique that may be deployed for deterring collusion attacks. We formulate the problem in the context of electronic music distribution, where the content is generally available in the compressed domain. Thus, we first extend the collusion resistance principles to bit stream signals and then present an experiment-based analysis to estimate a bound on the maximum number of modified versions of a content that satisfy a good-perceptibility requirement on the one hand and a destructive-averaging property on the other.
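As a toy illustration of the attack itself (not of the paper's counter-measure), the following Python sketch shows how sample-wise averaging of copies carrying independent spread-spectrum watermarks dilutes each watermark's correlation statistic roughly by the number of colluders. All names, the embedding strength, and the detector are illustrative assumptions.

```python
import random

def embed(content, watermark, strength=0.05):
    """Add a low-amplitude +/-1 spread-spectrum watermark to a host signal."""
    return [c + strength * w for c, w in zip(content, watermark)]

def detect(signal, content, watermark):
    """Correlation detector: subtract the known host, correlate with the mark."""
    return sum((s - c) * w for s, c, w in zip(signal, content, watermark)) / len(signal)

random.seed(0)
n, colluders = 10_000, 5
content = [random.gauss(0, 1) for _ in range(n)]
marks = [[random.choice((-1, 1)) for _ in range(n)] for _ in range(colluders)]
copies = [embed(content, m) for m in marks]

# Collusion: sample-wise average of the differently marked copies.
avg = [sum(vals) / colluders for vals in zip(*copies)]

single_score = detect(copies[0], content, marks[0])   # ~ strength
colluded_score = detect(avg, content, marks[0])       # ~ strength / colluders
```

The counter-measures surveyed in the paper aim to make this averaging step destroy the perceptual quality of `avg`, so the dilution can no longer be exploited cheaply.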
Detection of faults and software reliability analysis
NASA Technical Reports Server (NTRS)
Knight, John C.
1987-01-01
Multi-version or N-version programming is proposed as a method of providing fault tolerance in software. The approach requires the separate, independent preparation of multiple versions of a piece of software for some application. These versions are executed in parallel in the application environment; each receives identical inputs and each produces its version of the required outputs. The outputs are collected by a voter and, in principle, they should all be the same. In practice there may be some disagreement. If this occurs, the results of the majority are taken to be the correct output, and that is the output used by the system. A total of 27 programs were produced. Each of these programs was then subjected to one million randomly-generated test cases. The experiment yielded a number of programs containing faults that are useful for general studies of software reliability as well as studies of N-version programming. Fault tolerance through data diversity and analytic models of comparison testing are discussed.
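The voting step described above can be sketched minimally. This hypothetical helper is illustrative only and is not the experiment's actual voter; it implements the majority rule the abstract describes.

```python
from collections import Counter

def majority_vote(outputs):
    """Collect the outputs of N independently developed versions and
    return the value produced by a strict majority; raise on disagreement
    with no majority, which a fault-tolerant system must handle."""
    value, count = Counter(outputs).most_common(1)[0]
    if count > len(outputs) // 2:
        return value
    raise ValueError("no majority among version outputs")
```

With three versions, one faulty output is tolerated (`majority_vote([42, 42, 41])` returns 42); if all versions disagree, the voter signals failure instead of guessing.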
Brst-Bfv Quantization and the Schwinger Action Principle
NASA Astrophysics Data System (ADS)
Garcia, J. Antonio; Vergara, J. David; Urrutia, Luis F.
We introduce an operator version of the BRST-BFV effective action for arbitrary systems with first class constraints. Using the Schwinger action principle we calculate the propagators corresponding to: (i) the parametrized nonrelativistic free particle, (ii) the relativistic free particle and (iii) the spinning relativistic free particle. Our calculation correctly imposes the BRST invariance at the end points. The precise use of the additional boundary terms required in the description of fermionic variables is incorporated.
Flawed Applications of Bernoulli's Principle
NASA Astrophysics Data System (ADS)
Koumaras, Panagiotis; Primerakis, Georgios
2018-04-01
One of the most popular demonstration experiments pertaining to Bernoulli's principle is the production of a water spray by using a vertical plastic straw immersed in a glass of water and a horizontal straw to blow air towards the top edge of the vertical one. A more general version of this phenomenon, appearing also in school physics problems, is the determination of the rise of the water level h in the straw (see Fig. 1).
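The textbook treatment the paper scrutinizes runs roughly as follows (this is a sketch of the standard, arguably flawed, argument, not the authors' own analysis): blowing with speed $v$ across the top of the straw is taken to lower the pressure there by $\tfrac{1}{2}\rho_{\mathrm{air}}v^{2}$, and the water rises until the hydrostatic pressure balances this deficit:

```latex
\rho_{\mathrm{water}}\, g\, h \;=\; \tfrac{1}{2}\,\rho_{\mathrm{air}}\, v^{2}
\quad\Longrightarrow\quad
h \;=\; \frac{\rho_{\mathrm{air}}\, v^{2}}{2\,\rho_{\mathrm{water}}\, g}
```

For $v \approx 10~\mathrm{m/s}$, this gives $h \approx \frac{(1.2)(10)^2}{2(1000)(9.8)} \approx 6\times10^{-3}~\mathrm{m}$, i.e. about 6 mm of rise.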
Supporting ontology adaptation and versioning based on a graph of relevance
NASA Astrophysics Data System (ADS)
Sassi, Najla; Jaziri, Wassim; Alharbi, Saad
2016-11-01
Ontologies have recently become a topic of interest in computer science, since they are seen as a semantic support for making data models explicit and enriching them, as well as for ensuring the interoperability of data. Supporting ontology adaptation is therefore essential and extremely important, mainly when ontologies are used in changing environments. An important issue when dealing with ontology adaptation is the management of several versions. Ontology versioning is a complex and multifaceted problem as it should take into account change management, version storage and access, consistency issues, etc. The purpose of this paper is to propose an approach and tool for ontology adaptation and versioning. A series of techniques are proposed to 'safely' evolve a given ontology and produce a new consistent version. The ontology versions are ordered in a graph according to their relevance. The relevance is computed based on four criteria: conceptualisation, usage frequency, abstraction and completeness. The techniques to carry out the versioning process are implemented in the Consistology tool, which has been developed to assist users in expressing adaptation requirements and managing ontology versions.
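The relevance-based ordering of versions can be sketched as a weighted aggregation of the four criteria. The weights, the equal-weighting default, and the [0, 1] normalisation below are assumptions for illustration; the paper's actual scoring function is not given in the abstract.

```python
def relevance(criteria, weights=(0.25, 0.25, 0.25, 0.25)):
    """Aggregate the four criteria (conceptualisation, usage frequency,
    abstraction, completeness), each assumed normalised to [0, 1],
    into a single relevance score."""
    return sum(w * c for w, c in zip(weights, criteria))

def order_versions(versions):
    """Return ontology versions sorted by decreasing relevance,
    i.e. the order in which they would appear in the relevance graph."""
    return sorted(versions, key=lambda v: relevance(v["criteria"]), reverse=True)
```

Tuning the weights lets a curator privilege, say, usage frequency over completeness when deciding which versions to keep most accessible.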
Self-completeness and the generalized uncertainty principle
NASA Astrophysics Data System (ADS)
Isi, Maximiliano; Mureika, Jonas; Nicolini, Piero
2014-03-01
The generalized uncertainty principle discloses a self-complete characteristic of gravity, namely the possibility of masking any curvature singularity behind an event horizon as a result of matter compression at the Planck scale. In this paper we extend the above reasoning in order to overcome some current limitations to the framework, including the absence of a consistent metric describing such Planck-scale black holes. We implement a minimum-size black hole in terms of the extremal configuration of a neutral non-rotating metric, which we derived by mimicking the effects of the generalized uncertainty principle via a short scale modified version of Einstein gravity. In such a way, we find a self-consistent scenario that reconciles the self-complete character of gravity and the generalized uncertainty principle.
Deformation of second and third quantization
NASA Astrophysics Data System (ADS)
Faizal, Mir
2015-03-01
In this paper, we will deform the second and third quantized theories by deforming the canonical commutation relations in such a way that they become consistent with the generalized uncertainty principle. Thus, we will first deform the second quantized commutator and obtain a deformed version of the Wheeler-DeWitt equation. Then we will further deform the third quantized theory by deforming the third quantized canonical commutation relation. This way we will obtain a deformed version of the third quantized theory for the multiverse.
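A common deformation used in this literature takes the following form (assumed here as a representative example; the paper's exact deformation is not given in the abstract). The modified commutator implies a generalized uncertainty relation with a minimal length:

```latex
[\hat{x}, \hat{p}] = i\hbar\left(1 + \beta\,\hat{p}^{2}\right)
\quad\Longrightarrow\quad
\Delta x\,\Delta p \;\geq\; \frac{\hbar}{2}\left(1 + \beta\,(\Delta p)^{2}\right)
```

Minimizing the right-hand side over $\Delta p$ (at $\Delta p = 1/\sqrt{\beta}$) yields a minimal position uncertainty $\Delta x_{\min} = \hbar\sqrt{\beta}$, which is the feature such deformations are designed to encode.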
Advanced stress analysis methods applicable to turbine engine structures
NASA Technical Reports Server (NTRS)
Pian, Theodore H. H.
1991-01-01
The following tasks on the study of advanced stress analysis methods applicable to turbine engine structures are described: (1) construction of special elements which contain traction-free circular boundaries; (2) formulation of new versions of mixed variational principles and of hybrid stress elements; (3) establishment of methods for suppression of kinematic deformation modes; (4) construction of semiLoof plate and shell elements by the assumed-stress hybrid method; and (5) elastic-plastic analysis by viscoplasticity theory using the mechanical subelement model.
ERIC Educational Resources Information Center
O'Toole, Brian
The document, in both English and French, describes a 2-year project in Guyana based on the principle of community-based rehabilitation (CBR), which stresses training local community residents as supervisors to provide training of rehabilitation workers and direct services to children with disabilities living in rural areas. The program provided…
Logic-based assessment of the compatibility of UMLS ontology sources
2011-01-01
Background The UMLS Metathesaurus (UMLS-Meta) is currently the most comprehensive effort for integrating independently-developed medical thesauri and ontologies. UMLS-Meta is being used in many applications, including PubMed and ClinicalTrials.gov. The integration of new sources combines automatic techniques, expert assessment, and auditing protocols. The automatic techniques currently in use, however, are mostly based on lexical algorithms and often disregard the semantics of the sources being integrated. Results In this paper, we argue that UMLS-Meta’s current design and auditing methodologies could be significantly enhanced by taking into account the logic-based semantics of the ontology sources. We provide empirical evidence suggesting that UMLS-Meta in its 2009AA version contains a significant number of errors; these errors become immediately apparent if the rich semantics of the ontology sources is taken into account, manifesting themselves as unintended logical consequences that follow from the ontology sources together with the information in UMLS-Meta. We then propose general principles and specific logic-based techniques to effectively detect and repair such errors. Conclusions Our results suggest that the methodologies employed in the design of UMLS-Meta are not only very costly in terms of human effort, but also error-prone. The techniques presented here can be useful for both reducing human effort in the design and maintenance of UMLS-Meta and improving the quality of its contents. PMID:21388571
Holographic wavefront sensor, based on diffuse Fourier holography
NASA Astrophysics Data System (ADS)
Gorelaya, Alina; Orlov, Vyacheslav; Venediktov, Vladimir
2017-09-01
Many areas of optical science and technology require fast and accurate measurement of the radiation wavefront shape. Many wavefront sensor (WFS) techniques are known today, and their number keeps growing. Recent years have brought growing interest in several WFS schemes employing holography principles and holographic optical elements (HOE). Some of these devices are improved versions of the standard and most popular Shack-Hartmann WFS, while others are based on the intrinsic features of HOE. A holographic mode wavefront sensor is proposed, which makes it possible to measure up to several tens of wavefront modes. The increase in the number of measured modes is implemented by converting the light wave entering the sensor into a wide diffuse light beam, which allows one to record a large number of holograms, each intended for measuring one of the modes.
Konstantinidis, Georgios; Anastassopoulos, George C; Karakos, Alexandros S; Anagnostou, Emmanouil; Danielides, Vasileios
2012-04-01
The aim of this study is to present our perspectives on healthcare analysis and design and the lessons learned from our experience with the development of a distributed, object-oriented Clinical Information System (CIS). In order to overcome known issues regarding development, implementation and finally acceptance of a CIS by the physicians we decided to develop a novel object-oriented methodology by integrating usability principles and techniques in a simplified version of a well established software engineering process (SEP), the Unified Process (UP). A multilayer architecture has been defined and implemented with the use of a vendor application framework. Our first experiences from a pilot implementation of our CIS are positive. This approach allowed us to gain a socio-technical understanding of the domain and enabled us to identify all the important factors that define both the structure and the behavior of a Health Information System.
Zhang, Ao; Yan, Xing-Ke; Liu, An-Guo
2016-12-25
In the present paper, the authors introduce a newly developed "Acupuncture Needle Manipulation Training-evaluation System" based on an optical motion capture technique. It is composed of two parts, a sensor and software, and overcomes some shortcomings of mechanical motion capture techniques. The device is able to analyze data on the operations of the pressing hand and the needle-insertion hand during acupuncture performance, and its software is available in personal computer (PC), Android, and Apple iOS versions. It is competent in recording and analyzing information on any operator's needling manipulations and is quite helpful for teachers in teaching, training and examining students in clinical practice.
Educational principles and techniques for interpreters.
F. David Boulanger; John P. Smith
1973-01-01
Interpretation is in large part education, since it attempts to convey information, concepts, and principles while creating attitude changes and such emotional states as wonder, delight, and appreciation. Although interpreters might profit greatly by formal training in the principles and techniques of teaching, many have not had such training. Some means of making the...
Abdolmanafi, Atefe; Azadfallah, Parviz; Fata, Ladan; Roosta, Mohsen; Peixoto, Maria Manuela; Nobre, Pedro
2015-08-01
The Sexual Dysfunctional Beliefs Questionnaire (SDBQ) is a validated measure for assessing dysfunctional sexual beliefs. The aim of this study was to translate and validate the SDBQ for the Iranian context. In order to translate the questionnaire from English into Persian, a forward-backward procedure was applied. After linguistic validation, the psychometric properties of the Iranian version were assessed for both men and women. A total of 387 participants (226 women and 161 men) completed the SDBQ. A principal component analysis with varimax rotation was performed for both the male and female samples. Reliability was evaluated by calculating Cronbach's alpha (internal consistency) and test-retest coefficients (intraclass correlation coefficient). The results from the principal component analysis identified six factors in the female version: sexual conservatism and female sexual passivity, beliefs about masturbation, body image beliefs, sexual desire and pleasure as a sin, age-related beliefs, and denying affection primacy. In the male version, six factors were also identified: sex as an abuse of men's power, beliefs related to women's satisfaction, sexual conservatism, female sexual power, "macho" beliefs, and restrictive attitudes toward sex. Findings support the original six-factor solution for the male sample. For the female sample, although a six-factor solution was found, original motherhood-related beliefs were included in the sexual conservatism and female sexual passivity factor, and a new dimension emerged, related to masturbation beliefs. Additionally, results indicated that the SDBQ had good internal consistency and test-retest reliability in both male and female versions. Current findings support the reliability and validity of the SDBQ in an Iranian sample and suggest its applicability for assessing sexual beliefs in both clinical samples and the general population in Iran. © 2015 International Society for Sexual Medicine.
Quantum Theory of Jaynes' Principle, Bayes' Theorem, and Information
NASA Astrophysics Data System (ADS)
Haken, Hermann
2014-12-01
After a reminder of Jaynes' maximum entropy principle and of my quantum theoretical extension, I consider two coupled quantum systems A, B and formulate a quantum version of Bayes' theorem. The application of Feynman's disentangling theorem allows me to calculate the conditional density matrix ρ(A|B) if system A is an oscillator (or a set of them) linearly coupled to an arbitrary quantum system B. Expectation values can be calculated simply by means of the normalization factor of ρ(A|B), which is derived.
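For orientation, the classical identities that this quantum construction generalizes are Bayes' theorem and Jaynes' maximum entropy functional (the operator-valued form of ρ(A|B) itself is derived in the paper and not reproduced here):

```latex
% Classical Bayes' theorem:
p(A \mid B) = \frac{p(B \mid A)\, p(A)}{p(B)}

% Jaynes' maximum entropy principle: maximize the entropy
S = -\sum_i p_i \ln p_i
% subject to normalization and the known expectation-value constraints.
```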
Techniques for Teaching Conservation Education.
ERIC Educational Resources Information Center
Brown, Robert E.; Mouser, G. W.
Conservation principles, field methods and techniques, and specific field learning activities are included in this reference volume for teachers. Conservation principles include statements pertaining to (1) soil, (2) water, (3) forest, and (4) wildlife. Field methods and techniques include (1) preparing for a field trip, (2) getting student…
Gang, Wei-juan; Wang, Xin; Wang, Fang; Dong, Guo-feng; Wu, Xiao-dong
2015-08-01
The national standard "Regulations of Acupuncture-needle Manipulating Techniques" is one of the national criteria of acupuncturology, of which a total of 22 items have already been established. In the process of formulation, a series of common and specific problems were encountered. In the present paper, the authors discuss these problems from three aspects: principles for formulation, methods for formulating criteria, and considerations regarding outstanding problems. The formulating principles cover the selection and regulation of technique classification and of technique-related key factors. The main methods for formulating criteria are (1) taking the literature as the theoretical foundation, (2) taking clinical practice as the supporting evidence, and (3) refining suggestions and conclusions through peer review.
ERIC Educational Resources Information Center
Coles, Mike; Nelms, Rick
1996-01-01
Describes a study that explores the depth and breadth of scientific facts, principles, and procedures which are required in the Advanced General National Vocational Qualifications (GNVQ) science through comparison with GCE Advanced level. The final report takes account of the updated 1996 version of GNVQ science. (DDR)
The Expanding Universe and the Large-Scale Geometry of Spacetime.
ERIC Educational Resources Information Center
Shu, Frank
1983-01-01
Presents a condensed version of textbook account of cosmological theory and principles. Topics discussed include quasars, general and special relativity, relativistic cosmology, and the curvature of spacetime. Some philosophical assumptions necessary to the theory are also discussed. (JM)
Simplifying Chemical Reactor Design by using Molar Quantities Instead of Fractional Conversion.
ERIC Educational Resources Information Center
Brown, Lee F.; Falconer, John L.
1987-01-01
Explains the advantages of using molar quantities in chemical reactor design. Advocates the use of differential versions of reactor mass balances rather than the integrated forms. Provides specific examples and cases to illustrate the principles. (ML)
Regression Verification Using Impact Summaries
NASA Technical Reports Server (NTRS)
Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana
2013-01-01
Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve the scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking.

Various reduction, abstraction, and compositional techniques have been developed to help scale software verification to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce the analysis cost of the current version.
Regression verification addresses the problem of proving equivalence of closely related program versions [19]. These techniques compare two programs with a large degree of syntactic similarity to prove that portions of one program version are equivalent to the other. Regression verification can be used for guaranteeing backward compatibility, and for showing behavioral equivalence in programs with syntactic differences, e.g., when a program is refactored to improve its performance, maintainability, or readability. Existing regression verification techniques leverage similarities between program versions by using abstraction and decomposition techniques to improve scalability of the analysis [10, 12, 19]. The abstractions and decomposition in these techniques, e.g., summaries of unchanged code [12] or semantically equivalent methods [19], compute an over-approximation of the program behaviors. The equivalence checking results of these techniques are sound but not complete: they may characterize programs as not functionally equivalent when, in fact, they are equivalent. In this work we describe a novel approach that leverages the impact of the differences between two programs for scaling regression verification. We partition the program behaviors of each version into (a) behaviors impacted by the changes and (b) behaviors not impacted (unimpacted) by the changes. Only the impacted program behaviors are used during equivalence checking. We then prove that checking equivalence of the impacted program behaviors is equivalent to checking equivalence of all program behaviors for a given depth bound. In this work we use symbolic execution to generate the program behaviors and leverage control- and data-dependence information to facilitate the partitioning of program behaviors. The impacted program behaviors are termed impact summaries.
The dependence analyses that facilitate the generation of the impact summaries could, we believe, be used in conjunction with other abstraction- and decomposition-based approaches [10, 12] as a complementary reduction technique. An evaluation of our regression verification technique shows that our approach is capable of leveraging similarities between program versions to reduce the size of the queries and the time required to check for logical equivalence. The main contributions of this work are:
- A regression verification technique to generate impact summaries that can be checked for functional equivalence using an off-the-shelf decision procedure.
- A proof that our approach is sound and complete with respect to the depth bound of symbolic execution.
- An implementation of our technique using the LLVM compiler infrastructure, the KLEE symbolic virtual machine [4], and a variety of Satisfiability Modulo Theories (SMT) solvers, e.g., STP [7] and Z3 [6].
- An empirical evaluation on a set of C artifacts which shows that the use of impact summaries can reduce the cost of regression verification.
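The core question, whether two syntactically different program versions are behaviorally equivalent, can be illustrated with a toy bounded check over a finite input domain. This is a crude stand-in for depth-bounded symbolic execution plus a decision-procedure query, not the authors' tool:

```python
def check_equiv(f, g, inputs):
    """Bounded equivalence check: compare two program versions on a
    finite input domain; return a counterexample input, or None if
    the versions agree everywhere in the domain."""
    for x in inputs:
        if f(x) != g(x):
            return x  # behaviors diverge here
    return None

# Two syntactically different but semantically equivalent versions,
# e.g. one produced by refactoring the other.
def abs_v1(x):
    return x if x >= 0 else -x

def abs_v2(x):
    return max(x, -x)
```

A real regression verifier reasons symbolically over all inputs up to a depth bound rather than enumerating them, but the sound-and-complete claim above has the same shape: no counterexample within the bound.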
Schema Versioning for Multitemporal Relational Databases.
ERIC Educational Resources Information Center
De Castro, Cristina; Grandi, Fabio; Scalas, Maria Rita
1997-01-01
Investigates new design options for extended schema versioning support for multitemporal relational databases. Discusses the improved functionalities they may provide. Outlines options and basic motivations for the new design solutions, as well as techniques for the management of proposed schema versioning solutions, includes algorithms and…
Comparison of bone density measurement techniques: DXA and Archimedes' principle.
Keenan, M J; Hegsted, M; Jones, K L; Delany, J P; Kime, J C; Melancon, L E; Tulley, R T; Hong, K D
1997-11-01
The standard method for determination of density (g/cm3) of bones from small animals has been the application of Archimedes' principle. A recent development has been software for the determination of "density" (g/cm2) of small animal bones with dual-energy X-ray absorptiometry (DXA). We compared Archimedes' principle and DXA (Hologic QDR-2000) in the measurement of the densities of whole and hollowed femurs of 5- to 6-month-old retired female breeder rats. In an attempt to ensure detectable treatment differences, rats were used from a low-vitamin D Holtzman and a supplemental-vitamin D Sprague-Dawley colony. Whole femur densities were higher for supplemental-vitamin D colony rats than for low vitamin D rats using both techniques (Archimedes' principle, p < 0.002; DXA, p < 0.005), and the densities from the two techniques were highly correlated (r = 0.82, p < 0.0001). Actual density values were higher for Archimedes' principle than for DXA. Other variables such as femur ash weight and calcium content were also highly correlated to densities with both techniques. Hollowed femur density values were higher than whole femur values with Archimedes' principle but lower with DXA. Colony effects for hollowed femur densities were diminished with Archimedes' principle (p < 0.03) and eliminated with DXA (p < 0.53). Investigation of whole bones is more biologically relevant, and both techniques were effective in detecting differences between whole femurs from low-vitamin D and supplemental-vitamin D colony rats.
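Archimedes' principle gives volumetric density (g/cm³) from two weighings: the dry mass in air and the apparent mass when submerged, since the buoyant mass loss divided by the water density equals the volume. A sketch of the calculation (the default water density is an assumed value for ~20 °C, not taken from the study):

```python
def archimedes_density(mass_air_g, mass_submerged_g, rho_water=0.9982):
    """Density (g/cm^3) of a specimen from dry and submerged weighings.

    Buoyancy gives the volume: V = (m_air - m_submerged) / rho_water.
    rho_water defaults to water at roughly 20 C (an assumption here).
    """
    volume_cm3 = (mass_air_g - mass_submerged_g) / rho_water
    return mass_air_g / volume_cm3
```

Note the units make clear why the two techniques cannot agree numerically: DXA reports an areal "density" in g/cm², whereas Archimedes' principle yields a true volumetric density in g/cm³.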
EFSUMB Statement on Medical Student Education in Ultrasound [long version]
Cantisani, V.; Dietrich, C. F.; Badea, R.; Dudea, S.; Prosch, H.; Cerezo, E.; Nuernberg, D.; Serra, A. L.; Sidhu, P. S.; Radzina, M.; Piscaglia, F.; Bachmann Nielsen, M.; Ewertsen, C.; Săftoiu, A.; Calliada, F.; Gilja, O. H.
2016-01-01
The European Federation of Societies for Ultrasound in Medicine and Biology (EFSUMB) recommends that ultrasound be used systematically as an easily accessible and instructive educational tool in the curriculum of modern medical schools. Medical students should acquire theoretical knowledge of the modality, and hands-on training should be implemented and adhere to evidence-based principles. In this paper we report EFSUMB policy statements on medical student education in ultrasound, a short version of which has already been published in Ultraschall in der Medizin [1]. PMID:27689163
A Framework for Global Electronic Commerce: An Executive Summary.
ERIC Educational Resources Information Center
Office of the Press Secretary of the White House
1997-01-01
An abbreviated version of a longer policy document on electronic commerce released by the Clinton Administration, this article examines principles and recommendations on tariffs, taxes, electronic payment systems, uniform commercial code for electronic commerce, intellectual property protection, privacy, security, telecommunications infrastructure…
Dobson ozone spectrophotometer modification.
NASA Technical Reports Server (NTRS)
Komhyr, W. D.; Grass, R. D.
1972-01-01
Description of a modified version of the Dobson ozone spectrophotometer in which several outdated electronic design features have been replaced by circuitry embodying more modern design concepts. The resulting improvement in performance characteristics has been obtained without changing the principle of operation of the original instrument.
Authentic Assessment through Rich Tasks
ERIC Educational Resources Information Center
Wrigley, Terry
2017-01-01
This short article explains the key principles of "rich tasks," a version of authentic assessment developed in Queensland, Australia, as part of a major curriculum development called the "New Basics." In various documents, the project leaders recognised the danger that inappropriate assessment would undermine the proposed…
Investment, regulation, and uncertainty: managing new plant breeding techniques.
Smyth, Stuart J; McDonald, Jillian; Falck-Zepeda, Jose
2014-01-01
As with any technological innovation, time refines the technology, improving upon the original version of the innovative product. The initial GM crops had single traits for either herbicide tolerance or insect resistance. Current varieties have both of these traits stacked together, and in many cases other abiotic and biotic traits have also been stacked. This innovation requires investment. While this is relatively straightforward, certain conditions need to exist such that investments can be facilitated. The principal requirement for investment is that regulatory frameworks render consistent and timely decisions. If the certainty of regulatory outcomes weakens, the potential for changes in investment patterns increases. This article provides a summary background to the leading plant breeding technologies that are either currently being used to develop new crop varieties or are in the pipeline to be applied to plant breeding within the next few years. Challenges for existing regulatory systems are highlighted. Utilizing an option value approach from the investment literature, an assessment of uncertainty regarding regulatory approval for these varying techniques is undertaken. This research highlights which technology development options have the greatest degree of uncertainty and hence which ones might be expected to see an investment decline.
A portable meter for measuring low frequency currents in the human body.
Niple, J C; Daigle, J P; Zaffanella, L E; Sullivan, T; Kavet, R
2004-07-01
A portable meter has been developed for measuring low frequency currents that flow in the human body. Although the present version of the meter was specifically designed to measure 50/60 Hz "contact currents," the principles involved can be used with other low frequency body currents. Contact currents flow when the human body provides a conductive path between objects in the environment at different electrical potentials. The range of currents the meter detects is approximately 0.4-800 µA. This provides measurements of currents from the threshold of human perception (approximately 500 µA RMS) down to single-microampere levels. The meter has a unique design, which utilizes the human subject's body impedance as the sensing element. Some of the advantages of this approach are high sensitivity, the ability to measure current flow in the majority of the body, and relative insensitivity to the current path connection points. Current measurement accuracy varies with the accuracy of the body impedance (resistance) measurement, and different techniques can be used to obtain a desired level of accuracy. Techniques are available to achieve an estimated ±20% accuracy. Copyright 2004 Wiley-Liss, Inc.
Verma, Mudita; Meena, N.; Kumari, R. Anitha; Mallandur, Sudhanva; Vikram, R.; Gowda, Vishwas
2017-01-01
Aims: The aim of this study was to quantify the debris extruded apically from teeth using rotary and reciprocating instrumentation systems. Subjects and Methods: Eighty extracted human mandibular premolars with single canals and similar lengths were instrumented using ProTaper Universal (40, 06; Dentsply Maillefer, Ballaigues, Switzerland), ProTaper Next (40, 06; Dentsply Maillefer, Ballaigues, Switzerland), WaveOne (40, 06; Dentsply Maillefer, Ballaigues, Switzerland), and Reciproc (R40; VDW GmbH, Munich, Germany). Debris extruded during instrumentation was collected into preweighed Eppendorf tubes, which were then stored in an incubator at 70°C for 5 days. The final weight of each Eppendorf tube with the extruded debris was calculated as the mean of three consecutive weighings. Statistical Analysis Used: Statistical analysis was performed using SPSS version 16.0 software. The groups were compared using the Kruskal–Wallis test for all variables. Results: There was no statistically significant difference between the groups (P = 0.1114). However, the ProTaper Universal group produced the most extrusion and the ProTaper Next group the least among the instrument groups (P > 0.05). Conclusions: All instrumentation techniques were associated with extruded debris. PMID:28855755
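The Kruskal–Wallis comparison used here rank-transforms the pooled measurements and compares mean ranks across groups. Its H statistic (ignoring the tie correction, and not the SPSS implementation the authors used) can be sketched as:

```python
def kruskal_h(*groups):
    """Kruskal-Wallis H statistic, pure Python, no tie correction.

    H = 12 / (N(N+1)) * sum_i n_i * rbar_i^2 - 3(N+1),
    where rbar_i is the mean rank of group i. Assumes distinct values.
    """
    pooled = sorted(v for g in groups for v in g)
    rank = {v: i + 1 for i, v in enumerate(pooled)}  # 1-based ranks
    n = len(pooled)
    weighted = sum(
        len(g) * (sum(rank[v] for v in g) / len(g)) ** 2 for g in groups
    )
    return 12.0 / (n * (n + 1)) * weighted - 3 * (n + 1)
```

H is then referred to a chi-squared distribution with (number of groups − 1) degrees of freedom to obtain the P value reported in the abstract.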
Sala-Sastre, N; Herdman, M; Navarro, L; de la Prada, M; Pujol, R; Serra, C; Alonso, J; Flyvholm, M A; Giménez-Arnau, A M
2009-10-01
Eczema of the hands and urticaria are very common occupational dermatoses. The Nordic Occupational Skin Questionnaire (NOSQ-2002), developed in English, is an essential tool for the study of occupational skin diseases. The short version of the questionnaire is useful for screening and the long version is used to study risk factors. Objective: The aim of this study was to culturally adapt the long version of the NOSQ to Spanish and Catalan and to ensure comprehension, semantic validity, and equivalence with the original. The principles of the International Society for Pharmacoeconomics and Outcomes Research for good research practices were applied. A 4-phase method was used, with direct, revised translation, back translation, and cognitive interviews. After direct translation, a first version was issued by the Spanish Working Group. This version was evaluated in cognitive interviews. Modifications were made to 39 questions (68%) in the Spanish version and 27 questions (47%) in the Catalan version. Changes included addition of examples to improve understanding, reformulation of instructions, change to a direct question format, and addition of certain definitions. The back translation was evaluated by the original authors, leading to a further 7 changes in the Spanish version and 2 in the Catalan version. The third consensus version underwent a second round of cognitive interviews, after which the definitive version in each language was issued. Conclusion: Spanish and Catalan versions of the NOSQ-2002 questionnaire are available at www.ami.dk/NOSQ and www.arbejdsmiljoforskning.dk.
Zander, Steffi; Wetzel, Stefanie; Kühl, Tim; Bertel, Sven
2017-01-01
One of the frequently examined design principles in multimedia learning is the personalization principle. Based on empirical evidence, this principle states that using personalized messages in multimedia learning is more beneficial than using formal language (e.g., using 'you' instead of 'the'). Although there is evidence that these slight changes in language style affect learning, motivation, and perceived cognitive load, it remains unclear (1) whether the positive effects of personalized language transfer to all kinds of learning content (e.g., specific, potentially aversive health issues) and (2) what the underlying processes (e.g., attention allocation) of the personalization effect are. German university students (N = 37) learned symptoms and causes of cerebral hemorrhages with either a formal or a personalized version of the learning material. Analysis revealed results comparable to the few existing previous studies, indicating an inverted personalization effect for potentially aversive learning material. This effect was specifically revealed in decreased average fixation duration and a reduced number of fixations exclusively on the images in the personalized compared to the formal version. These results can be seen as indicators of an inverted effect of personalization at the level of visual attention. PMID:29326630
Structural dynamics payload loads estimates
NASA Technical Reports Server (NTRS)
Engels, R. C.
1982-01-01
Methods for the prediction of loads on large space structures are discussed. Existing approaches to the problem of loads calculation are surveyed. A full-scale version of an alternative numerical integration technique to solve the response part of a load cycle is presented, and a set of shortcut versions of the algorithm is developed. The implementation of these techniques using the software package developed is discussed.
Simulation of a Moving Elastic Beam Using Hamilton’s Weak Principle
2006-03-01
versions were limited to two-dimensional systems with open tree configurations (where a cut in any component separates the system in half) [48]. This…whose components experienced large angular rotations (turbomachinery, camshafts, flywheels, etc.). More complex systems required the simultaneous…
Severe traumatic brain injury management and clinical outcome using the Lund concept.
Koskinen, L-O D; Olivecrona, M; Grände, P O
2014-12-26
This review covers the main principles of the Lund concept for treatment of severe traumatic brain injury. This is followed by a description of results of clinical studies in which this therapy or a modified version of the therapy has been used. Unlike other guidelines, which are based on meta-analytical approaches, important components of the Lund concept are based on physiological mechanisms for regulation of brain volume and brain perfusion and to reduce transcapillary plasma leakage and the need for plasma volume expanders. There have been nine non-randomized and two randomized outcome studies with the Lund concept or modified versions of the concept. The non-randomized studies indicated that the Lund concept is beneficial for outcome. The two randomized studies were small but showed better outcome in the groups of patients treated according to the modified principles of the Lund concept than in the groups given a more conventional treatment. Copyright © 2014 IBRO. Published by Elsevier Ltd. All rights reserved.
Fabrication Techniques and Principles for Flat Plate Antennas
DOT National Transportation Integrated Search
1973-09-01
The report documents the fabrication techniques and principles selected to produce one and ten million flat plate antennas per year. An engineering analysis of the reliability, electrical integrity, and repeatability is made, and a cost analysis summ...
Fast secant methods for the iterative solution of large nonsymmetric linear systems
NASA Technical Reports Server (NTRS)
Deuflhard, Peter; Freund, Roland; Walter, Artur
1990-01-01
A family of secant methods based on general rank-1 updates was revisited in view of the construction of iterative solvers for large non-Hermitian linear systems. As it turns out, both Broyden's good and bad update techniques play a special role, but should be associated with two different line search principles. For Broyden's bad update technique, a minimum residual principle is natural, thus making it theoretically comparable with a series of well known algorithms like GMRES. Broyden's good update technique, however, is shown to be naturally linked with a minimum next correction principle, which asymptotically mimics a minimum error principle. The two minimization principles differ significantly for sufficiently large system dimension. Numerical experiments on discretized partial differential equations of convection diffusion type in 2-D with integral layers give a first impression of the possible power of the derived good Broyden variant.
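A minimal numerical sketch of a rank-1 secant iteration, here Broyden's "good" update applied to a small linear system, may help fix ideas. This is illustrative only; the paper's line-search principles and GMRES comparison are not reproduced, and the test system below is an assumption:

```python
import numpy as np

def broyden_good(F, x0, tol=1e-10, iters=50):
    """Broyden's 'good' secant method: solve F(x) = 0 without forming
    the true Jacobian, using rank-1 updates of an approximation J."""
    x = np.asarray(x0, dtype=float)
    J = np.eye(len(x))                 # initial Jacobian approximation
    f = F(x)
    for _ in range(iters):
        if np.linalg.norm(f) < tol:
            break
        s = np.linalg.solve(J, -f)     # quasi-Newton step
        x_new = x + s
        f_new = F(x_new)
        # good Broyden rank-1 update: J s_new matches the secant condition
        J += np.outer(f_new - f - J @ s, s) / (s @ s)
        x, f = x_new, f_new
    return x

# Illustrative non-Hermitian-friendly setup: a small linear system A x = b.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = broyden_good(lambda v: A @ v - b, np.zeros(2))
```

On linear systems such rank-1 secant iterations terminate in finitely many steps in exact arithmetic, which is part of what makes them competitive with Krylov methods like GMRES.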
Optimization technique for problems with an inequality constraint
NASA Technical Reports Server (NTRS)
Russell, K. J.
1972-01-01
The general technique uses a modified version of an existing method termed the pattern search technique. A new procedure, called the parallel move strategy, permits the pattern search technique to be used with problems involving a constraint.
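The basic pattern (compass) search loop, without the parallel move strategy or constraint handling described in the report, can be sketched as follows; the quadratic objective is illustrative:

```python
def pattern_search(f, x0, step=1.0, tol=1e-6):
    """Minimal derivative-free pattern search: probe each coordinate in
    both directions, accept improving moves, shrink the step otherwise."""
    x = list(x0)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                trial = x[:]
                trial[i] += d
                if f(trial) < f(x):
                    x, improved = trial, True
        if not improved:
            step *= 0.5   # shrink the pattern when no probe helps
    return x

# Illustrative unconstrained objective with minimum at (1, -2).
best = pattern_search(lambda p: (p[0] - 1) ** 2 + (p[1] + 2) ** 2, [0.0, 0.0])
```

Constraints would be handled on top of this loop, e.g. by rejecting infeasible trial points, which is roughly where a strategy like the report's parallel move would plug in.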
1994-03-01
evaluation of its anticipated value. If the program can be accomplished using conventional techniques, this should be seriously considered. Development or…the direct frequency generating principles such as pulse tachos, turbine flowmeters, and encoders, also Doppler and laser techniques used for… Figure 5.3. The basic concepts of the laser ring gyro (LRG). The principle depends upon the guidance of two beams of laser light around an…
The Man Made World, Teacher's Manual.
ERIC Educational Resources Information Center
Commission on Engineering Education, Washington, DC.
This teacher's manual for the Engineering Concepts Curriculum Project's high school course, "The Man Made World," is the third draft of the experimental version. The material, written by engineers, scientists, and educators, emphasizes engineering: man's application of scientific principles to the control and utilization of his environment.…
Steps toward Promoting Consistency in Educational Decisions
ERIC Educational Resources Information Center
Klein, Joseph
2010-01-01
Purpose: The literature indicates the advantages of decisions formulated through intuition, as well as the limitations, such as lack of consistency in similar situations. The principle of consistency (invariance), requiring that two equivalent versions of choice-problems will produce the same preference, is violated in intuitive judgment. This…
ERIC Educational Resources Information Center
Schenk, Robert
2003-01-01
Describes CyberEconomics, a complete, free, two-semester principles of economics textbook available on the World Wide Web. Contains chapters, sections, a table of contents, a set of learning objectives, and links to chapter introductions and sections. Offers a CD-ROM version available for a fee that contains interactive review questions. (JEH)
ERIC Educational Resources Information Center
Schattschneider, Doris
1991-01-01
Provided are examples from many domains of mathematics that illustrate the Fubini Principle in its discrete version: the value of a summation over a rectangular array is independent of the order of summation. Included are: counting using partitions as in proof by pictures, combinatorial arguments, indirect counting as in the inclusion-exclusion…
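The discrete Fubini principle the article illustrates, that a double sum over a rectangular array is independent of the order of summation, is easy to state in code (the grid below is an illustrative example, not from the article):

```python
# A 4x5 array of products r*c; summing rows-first and columns-first agree.
grid = [[r * c for c in range(1, 6)] for r in range(1, 5)]
by_rows = sum(sum(row) for row in grid)
by_cols = sum(sum(row[c] for row in grid) for c in range(5))
# Both totals equal (1+2+3+4) * (1+2+3+4+5) = 150.
```

Counting one set of objects in two different ways, as in the article's proof-by-pictures and inclusion-exclusion examples, is exactly this identity in disguise.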
Do You Hear More Piano or Drum Sounds? An Auditory Version of the Solitaire Illusion.
Prpic, Valter; Luccio, Riccardo
2016-10-03
The solitaire illusion is an illusion of numerosity proposed by Frith and Frith. In the original version, the apparent number of elements was determined by the spatial arrangement of two kinds of elements (black and white marbles). In our study, an auditory version of the solitaire illusion was demonstrated. Participants were asked to judge whether they perceived more drum or piano sounds. When half of the piano tones were perceived as lower in pitch than a drum sound and the other half higher, piano tones appeared to be arranged in small units, leading to numerosity underestimation. Conversely, when all piano tones were perceived to be higher in pitch than the drum sounds, they appeared to be arranged in a single large unit, leading to numerosity overestimation. Comparable to the visual version of the solitaire illusion, the clustering seems to be determined by Gestalt principles. In our auditory version, a clear reversal of the illusion (numerosity overestimation or underestimation) was observed when piano tones appeared to be arranged in a single large cluster or in several small clusters, respectively. © The Author(s) 2016.
NASA Technical Reports Server (NTRS)
Sandell, N. R., Jr.; Athans, M.
1975-01-01
The development of the theory of the finite-state, finite-memory (FSFM) stochastic control problem is discussed. The sufficiency of the FSFM minimum principle (which is in general only a necessary condition) was investigated. By introducing the notion of a signaling strategy as defined in the literature on games, conditions under which the FSFM minimum principle is sufficient were determined. This result explicitly interconnects the information structure of the FSFM problem with its optimality conditions. The min-H algorithm for the FSFM problem was studied. It is demonstrated that a version of the algorithm always converges to a particular type of local minimum termed a person-by-person extremal.
Swine influenza and vaccines: an alternative approach for decision making about pandemic prevention.
Basili, Marcello; Ferrini, Silvia; Montomoli, Emanuele
2013-08-01
During the global pandemic of A/H1N1/California/07/2009 (A/H1N1/Cal) influenza, many governments signed contracts with vaccine producers for a universal influenza immunization program and bought hundreds of millions of vaccine doses. We argue that, as Health Ministers assumed the occurrence of the worst possible scenario (generalized pandemic influenza) and followed the strong version of the Precautionary Principle, they undervalued the possibility of a mild or weak pandemic wave. An alternative decision rule, based on the non-extensive entropy principle, is introduced, and a different characterization of the Precautionary Principle is applied. This approach values extreme negative results (catastrophic events) differently and predicts more plausible and mild events. It introduces less pessimistic forecasts in the case of uncertain influenza pandemic outbreaks. A simplified application is presented using seasonal data on morbidity and severity of influenza-like illness among Italian children for the period 2003-10. Established results in the literature predict an average attack rate of not less than 15% for the next pandemic influenza [Meltzer M, Cox N, Fukuda K. The economic impact of pandemic influenza in the United States: implications for setting priorities for interventions. Emerg Infect Dis 1999;5:659-71; Meltzer M, Cox N, Fukuda K. Modeling the Economic Impact of Pandemic Influenza in the United States: Implications for Setting Priorities for Intervention. Background paper. Atlanta, GA: CDC, 1999. Available at: http://www.cdc.gov/ncidod/eid/vol5no5/melt_back.htm (7 January 2011, date last accessed)]. The strong version of the Precautionary Principle would suggest using this prediction for vaccination campaigns. On the contrary, the non-extensive maximum entropy principle predicts a lower attack rate, which induces a 20% saving in public funding for vaccine doses.
The need for an effective influenza pandemic prevention program, coupled with an efficient use of public funding, calls for a rethinking of the Precautionary Principle. The non-extensive maximum entropy principle, which incorporates the vague and incomplete information available to decision makers, produces a more coherent forecast of a possible influenza pandemic and more conservative spending of public funds.
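The "non-extensive entropy" invoked here is, presumably, the Tsallis form; for reference (this formula is standard background, not taken from the abstract):

```latex
% Tsallis (non-extensive) entropy of order q:
S_q = \frac{1 - \sum_i p_i^{\,q}}{q - 1},
% which recovers the Boltzmann-Gibbs-Shannon entropy as q -> 1:
\lim_{q \to 1} S_q = -\sum_i p_i \ln p_i .
```

For q < 1 the functional weights rare, extreme events less heavily than the Shannon form, which is what allows the maximum entropy forecast to discount catastrophic scenarios relative to the strong Precautionary Principle.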
Sofaer, Neema
2014-01-01
A common reason for giving research participants post-trial access (PTA) to the trial intervention appeals to reciprocity, the principle, stated most generally, that if one person benefits a second, the second should reciprocate: benefit the first in return. Many authors consider it obvious that reciprocity supports PTA. Yet their reciprocity principles differ, with many authors apparently unaware of alternative versions. This article is the first to gather the range of reciprocity principles. It finds that: (1) most are false. (2) The most plausible principle, which is also problematic, applies only when participants experience significant net risks or burdens. (3) Seldom does reciprocity support PTA for participants or give researchers stronger reason to benefit participants than equally needy non-participants. (4) Reciprocity fails to explain the common view that it is bad when participants in a successful trial have benefited from the trial intervention but lack PTA to it. PMID:24602060
The Precautionary Principle and the Tolerability of Blood Transfusion Risks.
Kramer, Koen; Zaaijer, Hans L; Verweij, Marcel F
2017-03-01
Tolerance for blood transfusion risks is very low, as evidenced by the implementation of expensive blood tests and the rejection of gay men as blood donors. Is this low risk tolerance supported by the precautionary principle, as defenders of such policies claim? We discuss three constraints on applying (any version of) the precautionary principle and show that respecting these implies tolerating certain risks. Consistency means that the precautionary principle cannot prescribe precautions that it must simultaneously forbid taking, considering the harms they might cause. Avoiding counterproductivity requires rejecting precautions that cause more harm than they prevent. Proportionality forbids taking precautions that are more harmful than adequate alternatives. When applying these constraints, we argue, attention should not be restricted to harms that are human caused or that affect human health or the environment. Tolerating transfusion risks can be justified if available precautions have serious side effects, such as high social or economic costs.
Study of fault tolerant software technology for dynamic systems
NASA Technical Reports Server (NTRS)
Caglayan, A. K.; Zacharias, G. L.
1985-01-01
The major aim of this study is to investigate the feasibility of using systems-based failure detection, isolation, and compensation (FDIC) techniques in building fault-tolerant software and extending them, whenever possible, to the domain of software fault tolerance. First, it is shown that systems-based FDIC methods can be extended to develop software error detection techniques by using system models for software modules. In particular, it is demonstrated that systems-based FDIC techniques can yield consistency checks that are easier to implement than acceptance tests based on software specifications. Next, it is shown that systems-based failure compensation techniques can be generalized to the domain of software fault tolerance in developing software error recovery procedures. Finally, the feasibility of using fault-tolerant software in flight software is investigated: possible system and version instabilities and functional performance degradation that may occur in N-Version programming applications to flight software are illustrated, and a comparative analysis of N-Version and recovery block techniques in the context of generic blocks in flight software is presented.
Histopathology of fish: I. Techniques and principles
Wood, E.M.; Yasutake, W.T.
1955-01-01
The techniques of histopathology have been used for many years in the study of human and animal diseases. Until very recent times, however, histology has been applied to fish studies only very infrequently. This brief discussion is intended to acquaint the reader with the techniques and principles involved and to explain how histological studies may help to overcome fish diseases and nutritional problems.
Matrix Wings: Continuous Process Improvement an Operator Can Love
2016-09-01
could employ the principles in AFSO21 but preferably by employing a version that does not take time away from mission preparation/execution and that...results, the DOD's director of quality management sent a certified public accountant to Yokota to verify the claims. The CPA validated that
A Nuclear Reactions Primer with Computers.
ERIC Educational Resources Information Center
Calle, Carlos I.; Roach, Jennifer A.
1987-01-01
Described is a microcomputer software program NUCLEAR REACTIONS designed for college level students and in use at Sweet Briar College (Sweet Briar, VA). The program is written in Microsoft Basic Version 2.1 for the Apple Macintosh Microcomputer. It introduces two conservation principles: (1) conservation of charge; and (2) conservation of nucleon…
Reflections on the New Russian Education.
ERIC Educational Resources Information Center
Boe, Barbara L.
1993-01-01
Describes a trip to Russia by a delegation of educators from the United States to learn about educational reform, examining the 10 principles currently guiding reform efforts in Russia. Participants learned that education there has already changed from the version that existed under the U.S.S.R. governmental structure. (SM)
Facilitating Family Group Inquiry at Science Museum Exhibits
ERIC Educational Resources Information Center
Gutwill, Joshua P.; Allen, Sue
2010-01-01
We describe a study of programs to deepen families' scientific inquiry practices in a science museum setting. The programs incorporated research-based learning principles from formal and informal educational environments. In a randomized experimental design, two versions of the programs, called "inquiry games," were compared to two control…
On the Inclusion of Externally Controlled Actions in Action Planning
ERIC Educational Resources Information Center
Tsai, Jessica Chia-Chin; Knoblich, Gunther; Sebanz, Natalie
2011-01-01
According to ideomotor theories, perceiving action effects produced by others triggers corresponding action representations in the observer. We tested whether this principle extends to actions performed by externally controlled limbs and tools. Participants performed a go-no-go version of a spatial compatibility task in which their own actions…
Effects of Knowledge and Display Design on Comprehension of Complex Graphics
ERIC Educational Resources Information Center
Canham, Matt; Hegarty, Mary
2010-01-01
In two experiments, participants made inferences from weather maps, before and after they received instruction about relevant meteorological principles. Different versions of the maps showed either task-relevant information alone, or both task-relevant and task-irrelevant information. Participants improved on the inference task after instruction,…
A Philosophical Item Analysis of the Right-Wing Authoritarianism Scale.
ERIC Educational Resources Information Center
Eigenberger, Marty
Items of Altemeyer's 1986 version of the "Right-Wing Authoritarianism Scale" (RWA Scale) were analyzed as philosophical propositions in an effort to establish each item's suggestive connotation and denotation. The guiding principle of the analysis was the way in which the statements reflected authoritarianism's defining characteristics…
Software Fault Tolerance: A Tutorial
NASA Technical Reports Server (NTRS)
Torres-Pomales, Wilfredo
2000-01-01
Because of our present inability to produce error-free software, software fault tolerance is and will continue to be an important consideration in software systems. The root cause of software design errors is the complexity of the systems. Compounding the problems in building correct software is the difficulty in assessing the correctness of software for highly complex systems. After a brief overview of the software development processes, we note how hard-to-detect design faults are likely to be introduced during development and how software faults tend to be state-dependent and activated by particular input sequences. Although component reliability is an important quality measure for system level analysis, software reliability is hard to characterize and the use of post-verification reliability estimates remains a controversial issue. For some applications software safety is more important than reliability, and fault tolerance techniques used in those applications are aimed at preventing catastrophes. Single version software fault tolerance techniques discussed include system structuring and closure, atomic actions, inline fault detection, exception handling, and others. Multiversion techniques are based on the assumption that software built differently should fail differently and thus, if one of the redundant versions fails, it is expected that at least one of the other versions will provide an acceptable output. Recovery blocks, N-version programming, and other multiversion techniques are reviewed.
Cessford, Tara; Meneilly, Graydon S; Arishenkoff, Shane; Eddy, Christopher; Chen, Luke Y C; Kim, Daniel J; Ma, Irene W Y
2017-12-08
To determine whether sonographic versions of physical examination techniques can accurately identify splenomegaly, Castell's method (Ann Intern Med 1967; 67:1265-1267), the sonographic Castell's method, spleen tip palpation, and the sonographic spleen tip technique were compared with reference measurements. Two clinicians trained in bedside sonography examined patients recruited from an urban hematology clinic. Each patient was examined for splenomegaly using conventional percussion and palpation techniques (Castell's method and spleen tip palpation, respectively), as well as the sonographic versions of these maneuvers (sonographic Castell's method and sonographic spleen tip technique). Results were compared with a reference standard based on professional sonographer measurements. The sonographic Castell's method had greater sensitivity (91.7% [95% confidence interval, 61.5% to 99.8%]) than the traditional Castell's method (83.3% [95% confidence interval, 51.6% to 97.9%]) but took longer to perform (mean ± SD, 28.8 ± 18.6 versus 18.8 ± 8.1 seconds; P = .01). Palpable and positive sonographic spleen tip results were both 100% specific, but the sonographic spleen tip method was more sensitive (58.3% [95% confidence interval, 27.7% to 84.8%] versus 33.3% [95% confidence interval, 9.9% to 65.1%]). Sonographic versions of traditional physical examination maneuvers have greater diagnostic accuracy than the physical examination maneuvers from which they are derived but may take longer to perform. We recommend a combination of traditional physical examination and sonographic techniques when evaluating for splenomegaly at the bedside. © 2017 by the American Institute of Ultrasound in Medicine.
Reconstructing high-dimensional two-photon entangled states via compressive sensing
Tonolini, Francesco; Chan, Susan; Agnew, Megan; Lindsay, Alan; Leach, Jonathan
2014-01-01
Accurately establishing the state of large-scale quantum systems is an important tool in quantum information science; however, the large number of unknown parameters hinders the rapid characterisation of such states, and reconstruction procedures can become prohibitively time-consuming. Compressive sensing, a procedure for solving inverse problems by incorporating prior knowledge about the form of the solution, provides an attractive alternative to the problem of high-dimensional quantum state characterisation. Using a modified version of compressive sensing that incorporates the principles of singular value thresholding, we reconstruct the density matrix of a high-dimensional two-photon entangled system. The dimension of each photon is equal to d = 17, corresponding to a system of 83521 unknown real parameters. Accurate reconstruction is achieved with approximately 2500 measurements, only 3% of the total number of unknown parameters in the state. The algorithm we develop is fast, computationally inexpensive, and applicable to a wide range of quantum states, thus demonstrating compressive sensing as an effective technique for measuring the state of large-scale quantum systems. PMID:25306850
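The low-rank step named in this abstract, singular value thresholding, can be sketched in a few lines. This is a generic SVT proximal operator in NumPy, a minimal illustration rather than the authors' reconstruction code; the matrix size and threshold are illustrative:

```python
import numpy as np

def singular_value_threshold(M, tau):
    """Soft-threshold the singular values of M by tau.

    This is the core proximal step of singular value thresholding (SVT),
    the low-rank prior combined with compressive sensing in the abstract.
    """
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)  # soft-threshold each singular value
    return U @ np.diag(s_shrunk) @ Vt

# Toy demo: a rank-1 matrix plus small noise is pushed back toward rank 1.
rng = np.random.default_rng(0)
u = rng.normal(size=(8, 1))
X = u @ u.T + 0.01 * rng.normal(size=(8, 8))
X_hat = singular_value_threshold(X, tau=0.5)
print(np.linalg.matrix_rank(X_hat, tol=1e-6))
```

Iterating this operator on the measured entries (alternating with a data-consistency projection) is the usual way an SVT-based reconstruction recovers a density matrix from far fewer measurements than parameters.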
Benifla, J L; Goffinet, F; Darai, E; Madelenat, P
1994-12-01
Transabdominal amnioinfusion can be used to facilitate external cephalic version. Our technique involves filling the uterine cavity with 700 or 900 mL of 37 °C saline under continuous echographic monitoring. External cephalic version is done the next morning. We have used this procedure in six women, all of whom had previous unsuccessful attempts at external cephalic version. After amnioinfusion, all six patients were converted to cephalic presentation and delivered normally, without obstetric or neonatal complications.
Sound Medication Therapy Management Programs, Version 2.0 with validation study.
2008-01-01
The Academy of Managed Care Pharmacy (AMCP, the Academy) contracted with the National Committee for Quality Assurance (NCQA) to conduct a field study to validate and assess the 2006 Sound Medication Therapy Management Programs, Version 1.0 document. Version 1.0 posits several principles of sound medication therapy management (MTM) programs: they (1) recruit patients whose data show they may need assistance with managing medications; (2) have health professionals who intervene with patients and their physicians to improve medication regimens; and (3) measure their results. The validation study determined the extent to which the principles identified in version 1.0 are incorporated in MTM programs. The method was designed to determine to what extent the important features and operational elements of sound MTM programs as described in version 1.0 are (1) acceptable and seen as comprehensive to users, (2) incorporated into MTM programs in the field, (3) reflective of the consensus group's intentions, and (4) in need of modification or updating. NCQA first conducted Phase One, in which NCQA gathered perspectives on the principles in the consensus document from a mixed group of stakeholders representing both providers and users of MTM programs. Phase Two involved a deeper analysis of existing programs related to the consensus document, in which NCQA conducted a Web-based survey of 20 varied MTM programs and conducted in-depth site visits with 5 programs. NCQA selected programs offered by a range of MTM-providing organizations -- health plans, pharmacy benefit management companies, disease management organizations, and stand-alone MTM providers. NCQA analyzed the results of both phases. The Phase Two survey asked specific questions of the programs and found that some programs perform beyond the principles listed in version 1.0. 
NCQA found that none of the elements of the consensus document should be eliminated because programs cannot perform them, although NCQA suggested some areas where the document could be more expansive or more specific, given the state of MTM operations in the field. The important features and operational elements in the document were categorized into the following 3 overall categories, which NCQA used to structure the survey and conduct the site visits in Phase Two: (1) eligibility and enrollment, (2) operations, and (3) quality management. NCQA found that the original consensus document was realistic in identifying the elements of sound MTM. In the current project, NCQA's purpose was not to make judgments about the effectiveness of MTM programs in general or any individual program in particular. NCQA recommended that the consensus document could be made stronger and more specific in 3 areas: (1) specifically state that the Patient Identification and Recruitment section advocates use of various eligibility criteria that may include, but are not limited to, Medicare-defined MTM eligibility criteria; (2) reframe or remove the statement in Appendix A of the consensus document that the preferred modality for MTM is face-to-face interaction between patient and pharmacist, unless there are comparative data to support it as currently written; and (3) specifically recommend that programs measure performance across the entire populations in their plans in addition to measuring results for those patients selected into MTM. This will make benchmarking among programs possible and will lead to substantiated best practices in this growing field.
Virtue ethics - an old answer to a new dilemma? Part 1. Problems with contemporary medical ethics.
Misselbrook, David
2015-02-01
The commonest practical model used in contemporary medical ethics is Principlism. Yet, while Principlism is a widely accepted consensus statement for ethics, the moral theory that underpins it faces serious challenges in its attempt to provide a coherent and accepted system of moral analysis. This inevitably challenges the stability of such a consensus statement and makes it vulnerable to attack by competitors such as preference consequentialism. This two-part paper proposes an inclusive version of virtue theory as a more grounded system of moral analysis. © The Royal Society of Medicine.
Virtue ethics – an old answer to a new dilemma? Part 1. Problems with contemporary medical ethics
2015-01-01
The commonest practical model used in contemporary medical ethics is Principlism. Yet, while Principlism is a widely accepted consensus statement for ethics, the moral theory that underpins it faces serious challenges in its attempt to provide a coherent and accepted system of moral analysis. This inevitably challenges the stability of such a consensus statement and makes it vulnerable to attack by competitors such as preference consequentialism. This two-part paper proposes an inclusive version of virtue theory as a more grounded system of moral analysis. PMID:25721113
Principles and Techniques of Radiation Chemistry.
ERIC Educational Resources Information Center
Dorfman, Leon M.
1981-01-01
Discusses the physical processes involved in the deposition of energy from ionizing radiation in the absorber system. Identifies principles relevant to these processes which are responsible for ionization and excitation of the components of the absorber system. Briefly describes some experimental techniques in use in radiation chemical studies.…
A Monte Carlo Simulation of Brownian Motion in the Freshman Laboratory
ERIC Educational Resources Information Center
Anger, C. D.; Prescott, J. R.
1970-01-01
Describes a "dry-lab" experiment for the college freshman laboratory in which the essential features of Brownian motion are demonstrated from first principles using the Monte Carlo technique. Calculations are carried out by a computation scheme based on a computer language. Bibliography. (LC)
Estimation and enhancement of real-time software reliability through mutation analysis
NASA Technical Reports Server (NTRS)
Geist, Robert; Offutt, A. J.; Harris, Frederick C., Jr.
1992-01-01
A simulation-based technique for obtaining numerical estimates of the reliability of N-version, real-time software is presented. An extended stochastic Petri net is employed to represent the synchronization structure of N versions of the software, where dependencies among versions are modeled through correlated sampling of module execution times. Test results utilizing specifications for NASA's planetary lander control software indicate that mutation-based testing could hold greater potential for enhancing reliability than the desirable but perhaps unachievable goal of independence among N versions.
Total Quality Management (TQM) in Higher Education.
ERIC Educational Resources Information Center
Sullivan, Michael F.
This document consists largely of paper versions of the transparencies used by the author to give his conference paper on Total Quality Management (TQM) in the college and university setting. An introduction lists a series of definitional phrases, a list of what TQM is not, and 11 fundamental principles describing what TQM is. The three major…
Translation of the Holy Quran: A Call for Standardization
ERIC Educational Resources Information Center
Halimah, Ahmad Mustafa
2014-01-01
The recent increase in the number of English translations of the Quran has led to problematic misrepresentations, misinterpretations and even textual discrepancies in the translations of a number of Islamic concepts, principles and norms. This paper is an attempt to evaluate five different English versions of the translation of the Quran using…
Survey of Library and Information Manpower Needs in the Caribbean. (Preliminary Version).
ERIC Educational Resources Information Center
Moore, Nick
In order to provide a base for national information planning and the restructuring of existing training institutions, a detailed study was conducted of manpower needs--at professional, paraprofessional, and technician levels--for information systems and services in the Caribbean region. A paper setting out the basic principles underlying manpower…
On the Status of Logic in Piaget
ERIC Educational Resources Information Center
Reginensi, Luc
2004-01-01
This article analyses the way in which Piaget links the analogy between the child and the primitive with a theory of the history of the sciences, that is, it analyses Piaget's version of Haeckel's principle in which ontogenesis recapitulates phylogenesis. From this analysis, we reconstitute the operations through which Piaget forms and expresses…
David Riesman and the Problem of Diversity in American Education
ERIC Educational Resources Information Center
McClay, Wilfred M.
2005-01-01
Americans are increasingly drawn to their own version of "muddling through," and are likely to view the process of reasoning from intellectual or moral principles with grave suspicion, if not outright hostility, as a form of undemocratic confinement. One sees this with especial clarity in today's institutions of higher education, in…
"Is This Ethical?" A Survey of Opinion on Principles and Practices of Document Design.
ERIC Educational Resources Information Center
Dragga, Sam
1996-01-01
Reprints a corrected version of an article originally published in the volume 43, number 1 issue of this journal. Presents results of a national survey of technical communicators and technical communication teachers assessing the ethics of seven document design cases involving manipulation of typography, illustrations, and photographs. Offers…
Sources of History for "A Psychology of Verbal Communication"
ERIC Educational Resources Information Center
O'Connell, Daniel C.; Kowal, Sabine
2011-01-01
There is a standard version of the history of modern mainstream psycholinguistics that emphasizes an extraordinary explosion of research in mid twentieth century under the guidance and leadership of George A. Miller and Noam Chomsky. The narrative is cast as a dramatic shift away from behavioristic principles and toward mentalistic principles…
The Third Edition of the Test of Understanding in College Economics.
ERIC Educational Resources Information Center
Saunders, Phillip
1991-01-01
Discusses the content and cognitive specification of the third edition of the Test of Understanding in College Economics. Presents examples of the construction and sampling criteria employed in the latest and previous versions of the test. Explains that the test emphasizes recognition and understanding of basic terms, concepts, and principles with…
Arguments for a Common Set of Principles for Collaborative Inquiry in Evaluation
ERIC Educational Resources Information Center
Cousins, J. Bradley; Whitmore, Elizabeth; Shulha, Lyn
2013-01-01
In this article, we critique two recent theoretical developments about collaborative inquiry in evaluation--using logic models as a means to understand theory, and efforts to compartmentalize versions of collaborative inquiry into discrete genres--as a basis for considering future direction for the field. We argue that collaborative inquiry in…
Pedestrian detection in crowded scenes with the histogram of gradients principle
NASA Astrophysics Data System (ADS)
Sidla, O.; Rosner, M.; Lypetskyy, Y.
2006-10-01
This paper describes a close-to-real-time, scale-invariant implementation of a pedestrian detector system based on the Histogram of Oriented Gradients (HOG) principle. Salient HOG features are first selected from a manually created, very large database of samples with an evolutionary optimization procedure that directly trains a polynomial Support Vector Machine (SVM). Real-time operation is achieved by a cascaded 2-step classifier which first uses a very fast linear SVM (with the same features as the polynomial SVM) to reject most of the irrelevant detections and then computes the decision function with a polynomial SVM on the remaining set of candidate detections. Scale invariance is achieved by running the detector at constant size on scaled versions of the original input images and by clustering the results over all resolutions. The pedestrian detection system has been implemented in two versions: i) full-body detection, and ii) upper-body-only detection. The latter is especially suited for very busy and crowded scenarios. On a state-of-the-art PC it is able to run at 8-20 frames/sec.
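The cascaded 2-step classifier described above can be sketched as follows. The weights, support vectors, and thresholds here are synthetic stand-ins (the paper's SVMs are trained on HOG features of real image windows); only the reject-then-confirm control flow is the point:

```python
import numpy as np

# Stage 1: a cheap linear score rejects most windows.
# Stage 2: an expensive polynomial decision function runs only on survivors.
rng = np.random.default_rng(1)
D = 64                         # HOG feature dimension (illustrative)
w_lin = rng.normal(size=D)     # stand-in for stage-1 linear SVM weights
sv = rng.normal(size=(10, D))  # stand-in support vectors for stage 2
alpha = rng.normal(size=10)    # stand-in dual coefficients for stage 2

def linear_score(x):
    return w_lin @ x

def poly_score(x, degree=2, bias=1.0):
    # Kernel SVM decision function: sum_i alpha_i * K(sv_i, x)
    return alpha @ (sv @ x + bias) ** degree

def cascade_detect(windows, t1=0.0, t2=0.0):
    hits = []
    for i, x in enumerate(windows):
        if linear_score(x) < t1:   # fast reject: most windows stop here
            continue
        if poly_score(x) >= t2:    # accurate confirm on the survivors
            hits.append(i)
    return hits

windows = rng.normal(size=(200, D))   # synthetic candidate windows
hits = cascade_detect(windows)
print(len(hits), "of", len(windows), "windows survived the cascade")
```

Because both stages share the same feature vector, the linear stage adds almost no extra feature-extraction cost; the expensive kernel evaluation is paid only for the small surviving fraction, which is what makes near-real-time operation possible.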
NASA Astrophysics Data System (ADS)
Balin Talamba, D.; Higy, C.; Joerin, C.; Musy, A.
The paper presents an application of hydrological modelling to the Haute-Mentue catchment, located in western Switzerland. A simplified version of Topmodel, developed in a Labview programming environment, was applied with the aim of modelling the hydrological processes on this catchment. Previous research carried out in this region outlined the importance of environmental tracers in studying hydrological behaviour, and considerable knowledge has been accumulated concerning the mechanisms responsible for runoff generation. In conformity with the theoretical constraints, Topmodel was applied to a Haute-Mentue sub-catchment where tracing experiments showed consistently low contributions of soil water during flood events. The model was applied for two humid periods in 1998. First, the model was calibrated to provide the best estimates of total runoff. However, the simulated components (groundwater and rapid flow) deviated far from the reality indicated by the tracing experiments. Thus, a new calibration was performed that included the additional information given by environmental tracing. The model was calibrated using simulated annealing (SA) techniques, which are easy to implement and statistically allow convergence to a global minimum. The only problem is that the method is time- and computer-consuming. To improve this, a version of SA known as very fast simulated annealing (VFSA) was used. The principles are the same as for SA: the random search is guided by a certain probability distribution and the acceptance criterion is unchanged, but VFSA better takes into account the range of variation of each parameter. Practice with Topmodel showed that the energy function has different sensitivities along different dimensions of the parameter space. 
The VFSA algorithm allows the search to be differentiated according to the sensitivity of the parameters. Environmental tracing was used to constrain the parameter space in order to better simulate the hydrological behaviour of the catchment. VFSA also highlighted issues in characterising the significance of Topmodel input parameters, as well as their uncertainty, for hydrological modelling.
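The VFSA scheme described in this abstract can be sketched as follows. The generating function and cooling schedule follow Ingber's standard formulation of very fast simulated annealing; the objective is a toy stand-in for Topmodel's error function, and all parameter values are illustrative:

```python
import math
import random

def vfsa_step(u, T):
    # Ingber's generating function: maps uniform u in (0,1) to a step
    # in [-1, 1] whose distribution concentrates near 0 as T cools,
    # while keeping long tails that can still reach the full range.
    return math.copysign(1.0, u - 0.5) * T * ((1.0 + 1.0 / T) ** abs(2.0 * u - 1.0) - 1.0)

def vfsa_minimize(f, lo, hi, iters=2000, T0=1.0, c=1.0, seed=0):
    rng = random.Random(seed)
    n = len(lo)
    x = [(l + h) / 2.0 for l, h in zip(lo, hi)]  # start mid-range
    fx = f(x)
    best, fbest = list(x), fx
    for k in range(1, iters + 1):
        T = T0 * math.exp(-c * k ** (1.0 / n))   # VFSA cooling schedule
        # Per-parameter move, scaled by each parameter's own range (h - l)
        # and clipped to its bounds.
        y = [min(h, max(l, xi + vfsa_step(rng.random(), T) * (h - l)))
             for xi, l, h in zip(x, lo, hi)]
        fy = f(y)
        if fy < fx or rng.random() < math.exp(-(fy - fx) / max(T, 1e-12)):
            x, fx = y, fy                        # SA-style acceptance
            if fx < fbest:
                best, fbest = list(x), fx
    return best, fbest

# Toy objective with minimum at (1, 2) inside the parameter ranges.
obj = lambda p: (p[0] - 1.0) ** 2 + (p[1] - 2.0) ** 2
best, fbest = vfsa_minimize(obj, lo=[0.0, 0.0], hi=[4.0, 4.0])
print(round(fbest, 3))
```

The per-parameter scaling by `(h - l)` is what lets the search respect each parameter's range of variation, the feature of VFSA that the abstract emphasizes for differentiating the search along dimensions of unequal sensitivity.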
NASA Astrophysics Data System (ADS)
Christensen, David B.; Basaeri, Hamid; Roundy, Shad
2017-12-01
In acoustic power transfer systems, a receiver is displaced from a transmitter by an axial depth, a lateral offset (alignment), and a rotation angle (orientation). In systems where the receiver’s position is not fixed, such as a receiver implanted in biological tissue, slight variations in depth, orientation, or alignment can cause significant variations in the received voltage and power. To address this concern, this paper presents a computationally efficient technique to model the effects of depth, orientation, and alignment via ray tracing (DOART) on received voltage and power in acoustic power transfer systems. DOART combines transducer circuit equivalent models, a modified version of Huygens principle, and ray tracing to simulate pressure wave propagation and reflection between a transmitter and a receiver in a homogeneous medium. A reflected grid method is introduced to calculate propagation distances, reflection coefficients, and initial vectors between a point on the transmitter and a point on the receiver for an arbitrary number of reflections. DOART convergence and simulation time per data point is discussed as a function of the number of reflections and elements chosen. Finally, experimental data is compared to DOART simulation data in terms of magnitude and shape of the received voltage signal.
Initial singularity and pure geometric field theories
NASA Astrophysics Data System (ADS)
Wanas, M. I.; Kamal, Mona M.; Dabash, Tahia F.
2018-01-01
In the present article we use a modified version of the geodesic equation, together with a modified version of the Raychaudhuri equation, to study initial singularities. These modified equations are used to account for the effect of the spin-torsion interaction on the existence of initial singularities in cosmological models. Such models are the results of solutions of the field equations of a class of field theories termed pure geometric. The geometric structure used in this study is an absolute parallelism structure satisfying the cosmological principle. It is shown that the existence of initial singularities is subject to some mathematical (geometric) conditions. The scheme suggested for this study can be easily generalized.
Quantum Common Causes and Quantum Causal Models
NASA Astrophysics Data System (ADS)
Allen, John-Mark A.; Barrett, Jonathan; Horsman, Dominic C.; Lee, Ciarán M.; Spekkens, Robert W.
2017-07-01
Reichenbach's principle asserts that if two observed variables are found to be correlated, then there should be a causal explanation of these correlations. Furthermore, if the explanation is in terms of a common cause, then the conditional probability distribution over the variables given the complete common cause should factorize. The principle is generalized by the formalism of causal models, in which the causal relationships among variables constrain the form of their joint probability distribution. In the quantum case, however, the observed correlations in Bell experiments cannot be explained in the manner Reichenbach's principle would seem to demand. Motivated by this, we introduce a quantum counterpart to the principle. We demonstrate that under the assumption that quantum dynamics is fundamentally unitary, if a quantum channel with input A and outputs B and C is compatible with A being a complete common cause of B and C, then it must factorize in a particular way. Finally, we show how to generalize our quantum version of Reichenbach's principle to a formalism for quantum causal models and provide examples of how the formalism works.
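In its classical form, the factorization that Reichenbach's principle demands of a complete common cause \(A\) of \(B\) and \(C\) is the standard conditional-independence condition:

```latex
P(B, C \mid A) = P(B \mid A)\, P(C \mid A).
```

It is exactly this factorization that no classical common cause can satisfy for Bell correlations, which motivates the unitarity-based quantum counterpart the abstract describes.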
The Uncertainty Principle in the Presence of Quantum Memory
NASA Astrophysics Data System (ADS)
Renes, Joseph M.; Berta, Mario; Christandl, Matthias; Colbeck, Roger; Renner, Renato
2010-03-01
One consequence of Heisenberg's uncertainty principle is that no observer can predict the outcomes of two incompatible measurements performed on a system to arbitrary precision. However, this implication is invalid if the observer possesses a quantum memory, a distinct possibility in light of recent technological advances. Entanglement between the system and the memory is responsible for the breakdown of the uncertainty principle, as illustrated by the EPR paradox. In this work we present an improved uncertainty principle which takes this entanglement into account. By quantifying uncertainty using entropy, we show that the sum of the entropies associated with incompatible measurements must exceed a quantity which depends on the degree of incompatibility and the amount of entanglement between system and memory. Apart from its foundational significance, the uncertainty principle motivated the first proposals for quantum cryptography, though the possibility of an eavesdropper having a quantum memory rules out using the original version to argue that these proposals are secure. The uncertainty relation introduced here alleviates this problem and paves the way for its widespread use in quantum cryptography.
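The relation described here, in its now-standard entropic form (notation assumed, not given in the abstract), bounds the conditional von Neumann entropies of the outcomes of two measurements \(X\) and \(Z\) on system \(A\) given the memory \(B\):

```latex
H(X \mid B) + H(Z \mid B) \;\ge\; \log_2 \frac{1}{c} + H(A \mid B),
\qquad
c = \max_{x,z} \left| \langle \psi_x | \phi_z \rangle \right|^2 ,
```

where \(c\) quantifies the incompatibility of the two measurement bases, and \(H(A \mid B)\), which can be negative for entangled states, quantifies the entanglement between system and memory, so strong entanglement weakens the lower bound exactly as the EPR paradox suggests.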
Increased anteversion of press-fit femoral stems compared with anatomic femur.
Emerson, Roger H
2012-02-01
With contemporary canal-filling press-fit stems, there is no adjustability of stem position in the canal and therefore the canal anatomy determines stem version. Stem version will affect head/neck impingement, polyethylene wear from edge loading, and hip stability, but despite this, the postoperative version of a canal-filling press-fit stem is unclear. Is there a difference between the version of the nonoperated femur and the final version of a canal-filling press-fit femoral component? Could a difference create an alignment problem for the hip replacement? Sixty-four hips were studied with fluoroscopy and 46 nonarthritic and 41 arthritic hips were studied with MRI. A standardized fluoroscopic technique for determining preoperative and postoperative femoral version was developed with the patient supine on a fracture table undergoing supine total hip arthroplasty. To validate the methods, the results were compared with two selected series of axial MRI views of the hip comparing the version of the head with the version of the canal at the base of the neck. For the operated hips, the mean anatomic hip version was less than the stem version: 18.9° versus 27.0°. The difference on average was 8.1° of increased anteversion (SD, 7.4°). Both MRI series showed the femoral neck was more anteverted on average than the femoral head, thereby explaining the operative findings. With a canal-filling press-fit femoral component there is wide variation of postoperative component anteversion with most stems placed in increased anteversion compared with the anatomic head. The surgical technique may need to adjust for this if causing intraoperative impingement or instability.
ERIC Educational Resources Information Center
Poole, Harrison Grant
2018-01-01
Fred Rogers's television program, "Mister Rogers' Neighborhood", connected with young children and educated them about difficult concepts for more than 30 years. The author analyzes and discusses several principles and pedagogical techniques that were used in Rogers's television program, including communicating with children,…
Sofaer, Neema
2014-11-01
A common reason for giving research participants post-trial access (PTA) to the trial intervention appeals to reciprocity, the principle, stated most generally, that if one person benefits a second, the second should reciprocate: benefit the first in return. Many authors consider it obvious that reciprocity supports PTA. Yet their reciprocity principles differ, with many authors apparently unaware of alternative versions. This article is the first to gather the range of reciprocity principles. It finds that: (1) most are false. (2) The most plausible principle, which is also problematic, applies only when participants experience significant net risks or burdens. (3) Seldom does reciprocity support PTA for participants or give researchers stronger reason to benefit participants than equally needy non-participants. (4) Reciprocity fails to explain the common view that it is bad when participants in a successful trial have benefited from the trial intervention but lack PTA to it. © 2013 John Wiley & Sons Ltd.
Guiding principles of subcutaneous immunotherapy for allergic rhinitis in Japan.
Okamoto, Yoshitaka; Ohta, Nobuo; Okano, Mitsuhiro; Kamijo, Atsushi; Gotoh, Minoru; Suzuki, Motohiko; Takeno, Sachio; Terada, Tetsuya; Hanazawa, Toyoyuki; Horiguchi, Shigetoshi; Honda, Kohei; Matsune, Shoji; Yamada, Takechiyo; Yuta, Atsushi; Nakayama, Takeo; Fujieda, Shigeharu
2014-02-01
In anticipation of the development of guidelines for antigen-specific subcutaneous immunotherapy (SCIT), we present recommendations that can serve as guiding principles based on a review of the scientific literature. Clinical questions (CQs) concerning SCIT were prepared. Literature searches for publications between January 1990 and February 2011 were performed in PubMed, the Cochrane Library, and Japana Centra Revuo Medicina Web version 4. Qualified studies were analyzed and the results were evaluated, consolidated, and codified. We present answers for 13 CQs on the indications, methods, effectiveness and mechanisms of SCIT, with evidence-based recommendations. The guiding principles are intended to be applied to children (≤15 years old) and adults (≥16 years old) with allergic rhinitis (AR). These principles can be used by otorhinolaryngologists for diagnosis of AR, evaluation of severity and rhinoscopic findings, performance of antigen challenge tests, and management of systemic anaphylactic reactions associated with SCIT. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Craig, Lorraine; Krewski, Dan; Samet, Jonathan; Shortreed, John; van Bree, Leendert; Krupnick, Alan J
2008-01-01
This statement is the result of discussions held at the 2005 NERAM IV Colloquium "International Perspectives on Air Quality: Risk Management Principles for Policy Development" and represents the collective views of 35 delegates, including international air quality policy analysts, academics, nongovernmental organizations, industry representatives, and decision makers from Mexico, Canada, the United States, the United Kingdom, Brazil, Hong Kong, and The Netherlands on principles for global air quality management. The objective of the colloquium was to "establish principles for air quality management based on the identification of international best practice in air quality policy development and implementation." This statement represents the main findings of a breakout group discussion session, presentations of an international panel of speakers from Canada, the United States, Mexico, and Hong Kong and views of the delegates expressed in plenary discussions. NERAM undertook a transparent process to try to ensure that the statement would accurately reflect the conference discussions, including documenting the proceedings and inviting delegates' comments on draft versions of the statement.
Sachs, Benjamin
2010-01-01
Norman Daniels's new book, Just Health, brings together his decades of work on the problem of justice and health. It improves on earlier writings by discussing how we can meet health needs fairly when we cannot meet them all and by attending to the implications of the socioeconomic determinants of health. In this article I return to the core idea around which the entire theory is built: that the principle of equality of opportunity grounds a societal obligation to meet health needs. I point out, first, that nowhere does Daniels say just what version of that principle he accepts. I then proceed to construct a principle on his behalf, based on a faithful reading of Just Health. Once we actually nail down the principle, I argue, we will find that there are two problems: it is implausible in itself, and it fails to ground a societal obligation to meet health needs. PMID:20634271
Consolidated principles for screening based on a systematic review and consensus process.
Dobrow, Mark J; Hagens, Victoria; Chafe, Roger; Sullivan, Terrence; Rabeneck, Linda
2018-04-09
In 1968, Wilson and Jungner published 10 principles of screening that often represent the de facto starting point for screening decisions today; 50 years on, are these principles still the right ones? Our objectives were to review published work that presents principles for population-based screening decisions since Wilson and Jungner's seminal publication, and to conduct a Delphi consensus process to assess the review results. We conducted a systematic review and modified Delphi consensus process. We searched multiple databases for articles published in English in 1968 or later that were intended to guide population-based screening decisions, described development and modification of principles, and presented principles as a set or list. Identified sets were compared for basic characteristics (e.g., number, categorization), a citation analysis was conducted, and principles were iteratively synthesized and consolidated into categories to assess evolution. Participants in the consensus process assessed the level of agreement with the importance and interpretability of the consolidated screening principles. We identified 41 sets and 367 unique principles. Each unique principle was coded to 12 consolidated decision principles that were further categorized as disease/condition, test/intervention or program/system principles. Program or system issues were the focus of 3 of Wilson and Jungner's 10 principles, but comprised almost half of all unique principles identified in the review. The 12 consolidated principles were assessed through 2 rounds of the consensus process, leading to specific refinements to improve their relevance and interpretability. No gaps or missing principles were identified. Wilson and Jungner's principles are remarkably enduring, but increasingly reflect a truncated version of contemporary thinking on screening that does not fully capture subsequent focus on program or system principles. 
Ultimately, this review and consensus process provides a comprehensive and iterative modernization of guidance to inform population-based screening decisions. © 2018 Joule Inc. or its licensors.
Frontiers in Chemical Sensors: Novel Principles and Techniques
NASA Astrophysics Data System (ADS)
Orellana, Guillermo; Moreno-Bondi, Maria Cruz
This third volume of the Springer Series on Chemical Sensors and Biosensors aims to enable the researcher or technologist to become acquainted with the latest principles and techniques that continue to enlarge the applications of this fascinating field. It deals with novel luminescence lifetime-based techniques for the interrogation of sensor arrays in high-throughput screening, cataluminescence, chemical sensing with hollow waveguides, and new ways of designing and fabricating sensors by means of either combinatorial methods or engineered indicator/support couples.
A new approach to non-invasive oxygenated mixed venous PCO2
NASA Technical Reports Server (NTRS)
Fisher, Joseph A.; Ansel, Clifford A.
1986-01-01
A clinically practical technique was developed to estimate the mixed venous CO2 partial pressure needed to calculate cardiac output by the Fick technique. The Fick principle states that cardiac output equals the CO2 production divided by the arteriovenous CO2 content difference across the pulmonary vessels. A review of the principles involved in the various techniques used to estimate venous CO2 partial pressure is presented.
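As a minimal numeric illustration of the Fick principle stated above (the physiological values below are hypothetical textbook-style numbers, not taken from the report):

```python
def fick_cardiac_output(vco2_ml_per_min, cv_co2_ml_per_dl, ca_co2_ml_per_dl):
    """Fick principle: cardiac output (L/min) = CO2 production rate
    divided by the arteriovenous CO2 content difference.

    vco2_ml_per_min  -- CO2 production (mL CO2 / min)
    cv_co2_ml_per_dl -- mixed venous CO2 content (mL CO2 / dL blood)
    ca_co2_ml_per_dl -- arterial CO2 content (mL CO2 / dL blood)
    """
    # Convert the content difference from mL/dL to mL per litre of blood
    av_diff_ml_per_l = (cv_co2_ml_per_dl - ca_co2_ml_per_dl) * 10.0
    return vco2_ml_per_min / av_diff_ml_per_l

# Illustrative values: VCO2 = 200 mL/min, venous content 54 mL/dL,
# arterial content 50 mL/dL -> a 40 mL/L difference -> 5 L/min output.
q = fick_cardiac_output(200.0, 54.0, 50.0)
```

The difficulty the report addresses is, of course, obtaining the mixed venous term non-invasively; the arithmetic itself is this simple.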
Energy 101: Wind Turbines - 2014 Update
None
2018-05-11
See how wind turbines generate clean electricity from the power of wind. The video highlights the basic principles at work in wind turbines, and illustrates how the various components work to capture and convert wind energy to electricity. This updated version also includes information on the Energy Department's efforts to advance offshore wind power. Offshore wind energy footage courtesy of Vestas.
Improvements In A Laser-Speckle Surface-Strain Gauge
NASA Technical Reports Server (NTRS)
Lant, Christian T.
1996-01-01
Compact optical subsystem incorporates several improvements over optical subsystems of previous versions of laser-speckle surface-strain gauge: faster acquisition of data, faster response to transients, reduced size and weight, lower cost, and less complexity. Principle of operation described previously in "Laser System Measures Two-Dimensional Strain" (LEW-15046), and "Two-Dimensional Laser-Speckle Surface-Strain Gauge" (LEW-15337).
Developing a Health-Related Quality-of-Life Measure for People with Intellectual Disability
ERIC Educational Resources Information Center
Clark, Lauren; Pett, Marjorie A.; Cardell, Elizabeth M.; Guo, Jia-Wen; Johnson, Erin
2017-01-01
Using principles of community-based participatory research we developed a new theory-based measure of health-related quality of life (HRQOL) for individuals with intellectual disability (ID). We recruited adults with ID (n = 129) to take part in interviews and review successive versions of HRQOL items. Critical input about content and…
ERIC Educational Resources Information Center
Haslam, C.; Wills, A. J.; Haslam, S. A.; Kay, J.; Baron, R.; McNab, F.
2007-01-01
Recent neuropsychological evidence, supporting a strong version of Whorfian principles of linguistic relativity, has reinvigorated debate about the role of language in colour categorisation. This paper questions the methodology used in this research and uses a novel approach to examine the unique contribution of language to categorisation…
Airborne Dial Remote Sensing of the Arctic Ozone Layer
NASA Technical Reports Server (NTRS)
Wirth, Martin; Renger, Wolfgang; Ehret, Gerhard
1992-01-01
A combined ozone and aerosol LIDAR was developed at the Institute of Physics of the Atmosphere at the DLR in Oberpfaffenhofen. It is an airborne version that, based on the DIAL principle, permits the recording of two-dimensional ozone profiles. This presentation will focus on the ozone part; the aerosol subsection will be treated later.
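In schematic form, the DIAL principle retrieves the absorber number density from the range derivative of the log ratio of the backscatter returns at an absorbed ("on") and a reference ("off") wavelength. A minimal sketch on synthetic data (differential aerosol terms neglected; the cross-section and ozone profile are invented for illustration, not DLR instrument values):

```python
import numpy as np

dz = 100.0          # range-gate spacing (m)
z = np.arange(0.0, 20000.0, dz)
dsigma = 1.0e-22    # assumed differential O3 absorption cross-section (m^2)

# Synthetic ozone layer centred at 10 km (number density, m^-3)
n_true = 5.0e18 * np.exp(-((z - 10000.0) / 3000.0) ** 2)

# Two-way differential optical depth accumulated along the path;
# all wavelength-independent terms cancel in the on/off ratio.
tau = 2.0 * dsigma * np.cumsum(n_true) * dz
ratio = np.exp(-tau)    # P_on(z) / P_off(z)

# DIAL equation: n(z) = ln[ratio(z) / ratio(z + dz)] / (2 * dsigma * dz)
n_ret = np.log(ratio[:-1] / ratio[1:]) / (2.0 * dsigma * dz)
```

With real returns, the same log-ratio derivative is applied to measured signals, and the neglected aerosol terms are what the combined aerosol channel helps correct.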
Pavlovian, Skinner, and Other Behaviourists' Contributions to AI. Chapter 9
NASA Technical Reports Server (NTRS)
Kosinski, Withold; Zaczek-Chrzanowska, Dominika
2007-01-01
A version of the definition of intelligent behaviour will be supplied in the context of real and artificial systems. A short presentation of the principles of learning will be given, starting with Pavlov's classical conditioning, proceeding through the reinforced response and operant conditioning of Thorndike and Skinner, and finishing with the cognitive learning of Tolman and Bandura. The most important figures within behaviourism, especially those who contributed to AI, will be described. Some tools of artificial intelligence that act according to those principles will be presented. An attempt will be made to show how some simple rules for behaviour modification can lead to complex intelligent behaviour.
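As a toy illustration of how operant conditioning surfaces in AI (this sketch is not from the chapter; the action names, probabilities, and parameters are invented), a simple agent can strengthen whichever response is most often reinforced, in the spirit of Thorndike's law of effect:

```python
import random

random.seed(0)
values = {"lever_A": 0.0, "lever_B": 0.0}        # learned action values
alpha = 0.1                                      # learning rate
reward_prob = {"lever_A": 0.8, "lever_B": 0.2}   # hidden reinforcement contingencies

for trial in range(1000):
    if random.random() < 0.1:                    # occasional exploratory response
        action = random.choice(list(values))
    else:                                        # otherwise emit the strongest response
        action = max(values, key=values.get)
    reward = 1.0 if random.random() < reward_prob[action] else 0.0
    # Law of effect: rewarded responses are "stamped in" incrementally
    values[action] += alpha * (reward - values[action])
```

After enough trials the agent emits the more frequently reinforced response almost exclusively, which is the behavioural shaping that reinforcement-learning tools in AI formalize.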
Quantum memories and Landauer's principle
NASA Astrophysics Data System (ADS)
Alicki, Robert
2011-10-01
Two types of arguments concerning (im)possibility of constructing a scalable, exponentially stable quantum memory equipped with Hamiltonian controls are discussed. The first type concerns ergodic properties of open Kitaev models which are considered as promising candidates for such memories. It is shown that, although the 4D Kitaev model provides stable qubit observables, the Hamiltonian control is not possible. The thermodynamical approach leads to the new proposal of the revised version of Landauer's principle and suggests that the existence of quantum memory implies the existence of the perpetuum mobile of the second kind. Finally, a discussion of the stability property of information and its implications is presented.
NASA Astrophysics Data System (ADS)
Zimmerman, Timothy David
2005-11-01
Students and citizens need to apply science to important issues every day. Yet the design of science curricula that foster integration of science and everyday decisions is not well understood. For example, can curricula be designed that help learners apply scientific reasons for choosing only environmentally sustainable seafood for dinner? Learners must develop integrated understandings of scientific principles, prior experiences, and current decisions in order to comprehend how everyday decisions impact environmental resources. In order to investigate how such integrated understandings can be promoted within school science classes, research was conducted with an inquiry-oriented curriculum that utilizes technology and a visit to an informal learning environment (aquarium) to promote the integration of scientific principles (adaptation) with environmental stewardship. This research used a knowledge integration approach to teaching and learning that provided a framework for promoting the application of science to environmental issues. Marine biology, often forsaken in classrooms for terrestrial biology, served as the scientific context for the curriculum. The curriculum design incorporated a three-phase pedagogical strategy and new technology tools to help students integrate knowledge and experiences across the classroom and aquarium learning environments. The research design and assessment protocols included comparisons among and within student populations using two versions of the curriculum: an issue-based version and a principle-based version. These inquiry curricula were tested with sophomore biology students attending a marine-focused academy within a coastal California high school. Pretest-posttest outcomes were compared between and within the curricular treatments. Additionally, comparisons were made between the inquiry groups and seniors in an Advanced Placement biology course who attend the same high school. 
Results indicate that the inquiry curricula enabled students to integrate and apply knowledge of evolutionary biology to real-world environmental stewardship issues. Over the course of the curriculum, students' ideas became more scientifically normative and tended to focus around concepts of natural selection. Students using the inquiry curricula outperformed the Advanced Placement biology students on several measures, including knowledge of evolutionary biology. These results have implications for designing science curricula that seek to promote the application of science to environmental stewardship and integrate formal and informal learning environments.
ERIC Educational Resources Information Center
Abd-El-Fattah, Sabry M.; AL-Sinani, Yousra; El Shourbagi, Sahar; Fakhroo, Hessa A.
2014-01-01
This study uses the Rasch model technique to examine the dimensionality structure and differential item functioning of the Arabic version of the Perceived Physical Ability Scale for Children (PPASC). A sample of 220 Omani fourth graders (120 males and 100 females) responded to an Arabic translated version of the PPASC. Data on students'…
Pandey, Shilpa; Hakky, Michael; Kwak, Ellie; Jara, Hernan; Geyer, Carl A; Erbay, Sami H
2013-05-01
Neurovascular imaging studies are routinely used for the assessment of headaches and changes in mental status, stroke workup, and evaluation of the arteriovenous structures of the head and neck. These imaging studies are being performed with greater frequency as the aging population continues to increase. Magnetic resonance (MR) angiographic imaging techniques are helpful in this setting. However, mastering these techniques requires an in-depth understanding of the basic principles of physics, complex flow patterns, and the correlation of MR angiographic findings with conventional MR imaging findings. More than one imaging technique may be used to solve difficult cases, with each technique contributing unique information. Unfortunately, incorporating findings obtained with multiple imaging modalities may add to the diagnostic challenge. To ensure diagnostic accuracy, it is essential that the radiologist carefully evaluate the details provided by these modalities in light of basic physics principles, the fundamentals of various imaging techniques, and common neurovascular imaging pitfalls. ©RSNA, 2013.
[MLPA technique--principles and use in practice].
Rusu, Cristina; Sireteanu, Adriana; Puiu, Maria; Skrypnyk, Cristina; Tomescu, E; Csep, Katalin; Creţ, Victoria; Barbarii, Ligia
2007-01-01
MLPA (Multiplex Ligation-dependent Probe Amplification) is a recently introduced method, based on the PCR principle, that is useful for detecting various genetic abnormalities (aneuploidies, gene deletions/duplications, subtelomeric rearrangements, methylation status, etc.). The technique is simple, reliable and cheap. We present this method to discuss its importance for a modern genetic service and to underline its multiple advantages.
Equivalence principle for quantum systems: dephasing and phase shift of free-falling particles
NASA Astrophysics Data System (ADS)
Anastopoulos, C.; Hu, B. L.
2018-02-01
We ask the question of how the (weak) equivalence principle established in classical gravitational physics should be reformulated and interpreted for massive quantum objects that may also have internal degrees of freedom (dof). This inquiry is necessary because even elementary concepts like a classical trajectory are not well defined in quantum physics—trajectories originating from quantum histories become viable entities only under stringent decoherence conditions. From this investigation we posit two logically and operationally distinct statements of the equivalence principle for quantum systems. Version A: the probability distribution of position for a free-falling particle is the same as the probability distribution of a free particle, modulo a mass-independent shift of its mean. Version B: any two particles with the same velocity wave-function behave identically in free fall, irrespective of their masses. Both statements apply to all quantum states, including those without a classical correspondence, and also for composite particles with quantum internal dof. We also investigate the consequences of the interaction between internal and external dof induced by free fall. For a class of initial states, we find dephasing occurs for the translational dof, namely, the suppression of the off-diagonal terms of the density matrix, in the position basis. We also find a gravitational phase shift in the reduced density matrix of the internal dof that does not depend on the particle’s mass. For classical states, the phase shift has a natural classical interpretation in terms of gravitational red-shift and special relativistic time-dilation.
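Version A can be written schematically (the notation here is assumed, not quoted from the paper): the free-fall position distribution is the free-particle one, rigidly displaced by the classical drop,

```latex
p^{\,\mathrm{fall}}_t(x) \;=\; p^{\,\mathrm{free}}_t\!\left(x - \tfrac{1}{2}\,g\,t^{2}\right),
```

where the shift $\tfrac{1}{2}gt^{2}$ contains no mass; since the shift is the only effect of gravity on the distribution, any two particles prepared with the same velocity wave-function fall identically, which is Version B.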
Three versions of an ethics of care.
Edwards, Steven D
2009-10-01
The ethics of care still appeals to many in spite of penetrating criticisms of it which have been presented over the past 15 years or so. This paper tries to offer an explanation for this, and then to critically engage with three versions of an ethics of care. The explanation consists chiefly in the close affinities between nursing and care. The three versions identified below are a first by Gilligan (1982), a second by Tronto (1993), and a third by Gastmans (2006); see also Little (1998). Each version is described and then subjected to criticism. It is concluded that where the ethics of care is presented in a distinctive way, it is at its least plausible; where it is stated in more plausible forms, it is neither sufficiently distinct from nor superior to at least one other common approach to nursing ethics, namely the much-maligned 'four principles' approach. What is added by this paper to what is already known: as the article tries to explain, in spite of its being subjected to sustained criticism the ethics of care retains its appeal to many scholars. The paper tries to explain why, partly by distinguishing three different versions of an ethics of care. It is also shown that all three versions are beset with problems, the least serious of which is distinctiveness from other approaches to moral problems in health care.
Performance study of LMS based adaptive algorithms for unknown system identification
NASA Astrophysics Data System (ADS)
Javed, Shazia; Ahmad, Noor Atinah
2014-07-01
Adaptive filtering techniques have gained much popularity in modeling the unknown system identification problem. These techniques can be classified as either iterative or direct. Iterative techniques include the stochastic descent method and its improved versions in affine space. In this paper we present a comparative study of the least mean square (LMS) algorithm and some improved versions of LMS, more precisely the normalized LMS (NLMS), LMS-Newton, transform domain LMS (TDLMS) and affine projection algorithm (APA). The performance evaluation of these algorithms is carried out using an adaptive system identification (ASI) model with random input signals, in which the unknown (measured) signal is assumed to be contaminated by output noise. Simulation results are recorded to compare the performance in terms of convergence speed, robustness, misalignment, and sensitivity to the spectral properties of input signals. The main objective of this comparative study is to observe the effects of the fast convergence rate of the improved versions of LMS on their robustness and misalignment.
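A minimal sketch of the ASI setup described above, comparing plain LMS with its normalized variant (the filter length, step sizes, and noise level are illustrative assumptions, not the paper's experimental settings):

```python
import numpy as np

rng = np.random.default_rng(1)
N, M = 5000, 8
h = rng.standard_normal(M)        # unknown FIR system to be identified
x = rng.standard_normal(N)        # random (white) input signal

def identify(mu, normalized, eps=1e-8):
    """Adapt weights w toward h from noisy system output; return misalignment."""
    w = np.zeros(M)
    for n in range(M, N):
        xn = x[n - M:n][::-1]                          # current input tap vector
        d = h @ xn + 0.01 * rng.standard_normal()      # measured output + noise
        e = d - w @ xn                                 # a priori error
        step = mu / (xn @ xn + eps) if normalized else mu
        w += step * e * xn                             # (N)LMS weight update
    return np.linalg.norm(w - h) / np.linalg.norm(h)   # normalized misalignment

mis_lms = identify(mu=0.02, normalized=False)
mis_nlms = identify(mu=0.5, normalized=True)
```

The normalization by the instantaneous input power is what makes NLMS insensitive to input scaling, the kind of spectral-sensitivity effect the study measures.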
Analgesia/anesthesia for external cephalic version.
Weiniger, Carolyn F
2013-06-01
Professional society guidelines recommend that women with breech presentation be delivered surgically due to a higher incidence of fetal risks compared with vaginal delivery. An alternative is attempted external cephalic version, which if successful, enables attempted vaginal delivery. Attitudes towards external cephalic version (ECV) will be considered in this review, along with pain relief methods and their impact on ECV success rates. Articles suggest that ECV is infrequently offered, due to both physician and patient factors. Success of ECV is higher in multiparous women, complete breech, posterior placenta, or smaller fetus. Preterm ECV performance does not increase vaginal delivery rates. Neuraxial techniques (spinal or epidural) significantly increase ECV success rates, as do moxibustion and hypnosis. Four reviews summarized studies considering ECV and neuraxial techniques. These reviews suggest that neuraxial techniques using high (surgical) doses of local anesthetic are efficacious compared with control groups not using anesthesia, whereas techniques using low-doses are not. Low-dose versus high-dose neuraxial analgesia/anesthesia has not been directly compared in a single study. Based on currently available data, the rate of cephalic presentation is not increased using neuraxial techniques, but vaginal delivery rates are higher. ECV appears to be a low-risk procedure. The logistics of routine ECV and provision of optimal neuraxial techniques for successful ECV require additional research. Safety aspects of neuraxial anesthesia for ECV require further investigation.
Bailey, Stephanie L.; Bono, Rose S.; Nash, Denis; Kimmel, April D.
2018-01-01
Background: Spreadsheet software is increasingly used to implement systems science models informing health policy decisions, both in academia and in practice where technical capacity may be limited. However, spreadsheet models are prone to unintentional errors that may not always be identified using standard error-checking techniques. Our objective was to illustrate, through a methodologic case study analysis, the impact of unintentional errors on model projections by implementing parallel model versions. Methods: We leveraged a real-world need to revise an existing spreadsheet model designed to inform HIV policy. We developed three parallel versions of a previously validated spreadsheet-based model; versions differed by the spreadsheet cell-referencing approach (named single cells; column/row references; named matrices). For each version, we implemented three model revisions (re-entry into care; guideline-concordant treatment initiation; immediate treatment initiation). After standard error-checking, we identified unintentional errors by comparing model output across the three versions. Concordant model output across all versions was considered error-free. We calculated the impact of unintentional errors as the percentage difference in model projections between model versions with and without unintentional errors, using +/-5% difference to define a material error. Results: We identified 58 original and 4,331 propagated unintentional errors across all model versions and revisions. Over 40% (24/58) of original unintentional errors occurred in the column/row reference model version; most (23/24) were due to incorrect cell references. Overall, >20% of model spreadsheet cells had material unintentional errors. When examining error impact along the HIV care continuum, the percentage difference between versions with and without unintentional errors ranged from +3% to +16% (named single cells), +26% to +76% (column/row reference), and 0% (named matrices). 
Conclusions: Standard error-checking techniques may not identify all errors in spreadsheet-based models. Comparing parallel model versions can aid in identifying unintentional errors and promoting reliable model projections, particularly when resources are limited. PMID:29570737
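The comparison strategy above can be sketched as follows; the cell names and values are hypothetical, and the +/-5% threshold follows the study's definition of a material error:

```python
MATERIAL_THRESHOLD = 0.05  # +/-5% difference defines a material error

def material_errors(reference, candidate):
    """Flag cells where a candidate model version departs materially
    (>5% relative difference) from a reference version's output."""
    flagged = {}
    for cell, ref_val in reference.items():
        cand_val = candidate.get(cell)
        if cand_val is None or ref_val == 0:
            continue  # cannot compute a relative difference
        diff = (cand_val - ref_val) / abs(ref_val)
        if abs(diff) > MATERIAL_THRESHOLD:
            flagged[cell] = diff
    return flagged

# Hypothetical outputs of two parallel versions of the same model
named_matrices  = {"in_care": 1000.0, "on_treatment": 800.0}
column_row_refs = {"in_care": 1260.0, "on_treatment": 820.0}  # bad cell reference
flags = material_errors(named_matrices, column_row_refs)
```

Here only the "in_care" cell exceeds the threshold (a +26% departure), so it would be flagged for inspection while the 2.5% discrepancy in "on_treatment" would not.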
Bailey, Stephanie L; Bono, Rose S; Nash, Denis; Kimmel, April D
2018-01-01
Spreadsheet software is increasingly used to implement systems science models informing health policy decisions, both in academia and in practice where technical capacity may be limited. However, spreadsheet models are prone to unintentional errors that may not always be identified using standard error-checking techniques. Our objective was to illustrate, through a methodologic case study analysis, the impact of unintentional errors on model projections by implementing parallel model versions. We leveraged a real-world need to revise an existing spreadsheet model designed to inform HIV policy. We developed three parallel versions of a previously validated spreadsheet-based model; versions differed by the spreadsheet cell-referencing approach (named single cells; column/row references; named matrices). For each version, we implemented three model revisions (re-entry into care; guideline-concordant treatment initiation; immediate treatment initiation). After standard error-checking, we identified unintentional errors by comparing model output across the three versions. Concordant model output across all versions was considered error-free. We calculated the impact of unintentional errors as the percentage difference in model projections between model versions with and without unintentional errors, using +/-5% difference to define a material error. We identified 58 original and 4,331 propagated unintentional errors across all model versions and revisions. Over 40% (24/58) of original unintentional errors occurred in the column/row reference model version; most (23/24) were due to incorrect cell references. Overall, >20% of model spreadsheet cells had material unintentional errors. When examining error impact along the HIV care continuum, the percentage difference between versions with and without unintentional errors ranged from +3% to +16% (named single cells), +26% to +76% (column/row reference), and 0% (named matrices). 
Standard error-checking techniques may not identify all errors in spreadsheet-based models. Comparing parallel model versions can aid in identifying unintentional errors and promoting reliable model projections, particularly when resources are limited.
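The parallel-versions comparison described above can be sketched in a few lines; the version names and model outputs below are hypothetical, and the ±5% threshold follows the abstract's definition of a material error:

```python
# Hypothetical outputs from three parallel implementations of the same
# spreadsheet model (named single cells, column/row refs, named matrices).
versions = {
    "named_single_cells": [103.0, 58.0, 210.0],
    "column_row_refs":    [126.0, 58.0, 210.0],
    "named_matrices":     [100.0, 58.0, 210.0],
}

def material_errors(versions, reference, threshold=0.05):
    """Flag cells whose value differs from the reference version by more
    than the threshold (+/-5% defines a material error in the study)."""
    ref = versions[reference]
    flags = {}
    for name, values in versions.items():
        if name == reference:
            continue
        flags[name] = [abs(v - r) / r > threshold
                       for v, r in zip(values, ref)]
    return flags

flags = material_errors(versions, reference="named_matrices")
```

Discordant cells point to an unintentional error in at least one version; concordant output across all versions is taken as error-free.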
Children's understanding of the addition/subtraction complement principle.
Torbeyns, Joke; Peters, Greet; De Smedt, Bert; Ghesquière, Pol; Verschaffel, Lieven
2016-09-01
In the last decades, children's understanding of mathematical principles has become an important research topic. Different from the commutativity and inversion principles, only a few studies have focused on children's understanding of the addition/subtraction complement principle (if a - b = c, then c + b = a), mainly relying on verbal techniques. This contribution aimed at deepening our understanding of children's knowledge of the addition/subtraction complement principle, combining verbal and non-verbal techniques. Participants were 67 third and fourth graders (9- to 10-year-olds). Children solved two tasks in which verbal reports as well as accuracy and speed data were collected. These two tasks differed only in the order of the problems and the instructions. In the looking-back task, children were told that sometimes the preceding problem might help to answer the next problem. In the baseline task, no helpful preceding items were offered. The looking-back task included 10 trigger-target problem pairs on the complement relation. Children verbally reported looking back on about 40% of all target problems in the looking-back task; the target problems were also solved faster and more accurately than in the baseline task. These results suggest that children used their understanding of the complement principle. The verbal and non-verbal data were highly correlated. This study complements previous work on children's understanding of mathematical principles by highlighting interindividual differences in 9- to 10-year-olds' understanding of the complement principle and indicating the potential of combining verbal and non-verbal techniques to investigate (the acquisition of) this understanding. © 2016 The British Psychological Society.
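The complement principle the tasks probe can be stated directly in code; the trigger-target pair below is a hypothetical example of the kind used in the looking-back task:

```python
def solve_target_via_complement(trigger, target):
    """If the trigger a - b = c has been solved, the target c + b can be
    answered as a directly, by the complement principle."""
    a, b = trigger
    c = a - b
    assert target == (c, b), "target is not the complement of the trigger"
    return a  # c + b = a

# Hypothetical pair: trigger 14 - 6 = 8, then target 8 + 6 = ?
answer = solve_target_via_complement(trigger=(14, 6), target=(8, 6))
```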
Qualitative biomechanical principles for application in coaching.
Knudson, Duane
2007-01-01
Many aspects of human movement in sport can be readily understood through Newtonian rigid-body mechanics. Many of these laws and biomechanical principles, however, are counterintuitive to many people, and several difficulties in applying biomechanics to sports have limited its use in the qualitative analysis of sport skills by coaches. Biomechanics scholars have long been interested in developing principles that facilitate the qualitative application of biomechanics to improve movement performance and reduce the risk of injury. This paper summarizes the major North American efforts to establish a set of general biomechanical principles of movement, and illustrates how these principles can be used to improve the application of biomechanics in the qualitative analysis of sport technique. A coach helping a player with a tennis serve is presented as an example. The standardization of terminology for biomechanical principles is proposed as an important first step in improving the application of biomechanics in sport. There is also a need for international cooperation and for research on the effectiveness of applying biomechanical principles in the coaching of sport techniques.
Multigrid techniques for unstructured meshes
NASA Technical Reports Server (NTRS)
Mavriplis, D. J.
1995-01-01
An overview of current multigrid techniques for unstructured meshes is given. The basic principles of the multigrid approach are first outlined. Application of these principles to unstructured mesh problems is then described, illustrating various different approaches, and giving examples of practical applications. Advanced multigrid topics, such as the use of algebraic multigrid methods, and the combination of multigrid techniques with adaptive meshing strategies are dealt with in subsequent sections. These represent current areas of research, and the unresolved issues are discussed. The presentation is organized in an educational manner, for readers familiar with computational fluid dynamics, wishing to learn more about current unstructured mesh techniques.
Fabric-based active electrode design and fabrication for health monitoring clothing.
Merritt, Carey R; Nagle, H Troy; Grant, Edward
2009-03-01
In this paper, two versions of fabric-based active electrodes are presented to provide a wearable solution for ECG monitoring clothing. The first version of active electrode involved direct attachment of surface-mountable components to a textile screen-printed circuit using polymer thick film techniques. The second version involved attaching a much smaller, thinner, and less obtrusive interposer containing the active electrode circuitry to a simplified textile circuit. These designs explored techniques for electronic textile interconnection, chip attachment to textiles, and packaging of circuits on textiles for durability. The results from ECG tests indicate that the performance of each active electrode is comparable to commercial Ag/AgCl electrodes. The interposer-based active electrodes survived a five-cycle washing test while maintaining good signal integrity.
ERIC Educational Resources Information Center
Hammonds, S. J.
1990-01-01
A technique for the numerical identification of bacteria using normalized likelihoods calculated from a probabilistic database is described, and the principles of the technique are explained. The listing of the computer program is included. Specimen results from the program, and examples of how they should be interpreted, are given. (KR)
NASA Astrophysics Data System (ADS)
Pagliarini, G.; Vocale, P.; Mocerino, A.; Rainieri, S.
2017-01-01
Passive convective heat transfer enhancement techniques are well-known and widespread tools for increasing the efficiency of heat transfer equipment. Although a first-principle analysis based on the local conservation equations for energy, momentum and mass can forecast the macroscopic effects of passive enhancement techniques, namely the increase in both the overall heat exchanged and the head losses, it is hardly able to give a comprehensive explanation of how local modifications in the boundary layers contribute to the overall effect. A deeper insight into the enhancement mechanisms can instead be obtained within a second-principle approach, through the analysis of the local exergy dissipation phenomena related to heat transfer and fluid flow. To this aim, a second-principle analysis implemented through careful consideration of the local entropy generation rate seems the most suitable, since it identifies more precisely the causes of the loss of efficiency in the heat transfer process, thus providing a useful guide in the choice of the most suitable heat transfer enhancement techniques.
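For a Newtonian fluid, the local volumetric entropy generation rate referred to above is commonly split (following Bejan's classical analysis; the notation here is assumed, not taken from the abstract) into heat-transfer and fluid-friction contributions:

```latex
\dot{S}'''_{\mathrm{gen}}
  = \underbrace{\frac{k}{T^{2}}\,(\nabla T)^{2}}_{\text{heat transfer}}
  + \underbrace{\frac{\mu}{T}\,\Phi}_{\text{fluid friction}}
  \;\ge\; 0,
```

where $k$ is the thermal conductivity, $\mu$ the dynamic viscosity and $\Phi$ the viscous dissipation function; comparing the two terms locally shows where an enhancement technique pays for extra heat transfer with extra head loss.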
Application of Pilates principles increases paraspinal muscle activation.
Andrade, Letícia Souza; Mochizuki, Luís; Pires, Flávio Oliveira; da Silva, Renato André Sousa; Mota, Yomara Lima
2015-01-01
To analyze the effect of Pilates principles on the EMG activity of abdominal and paraspinal muscles on stable and unstable surfaces. Surface EMG data for the rectus abdominis (RA), iliocostalis (IL) and lumbar multifidus (MU) of 19 participants were collected while performing three repetitions of a crunch exercise in the following conditions: 1) no Pilates technique, stable surface (nP + S); 2) no Pilates technique, unstable surface (nP + U); 3) Pilates technique, stable surface (P + S); 4) Pilates technique, unstable surface (P + U). The EMG analysis was conducted using custom-made Matlab® 10 code. There was no condition effect on the RA iEMG with stable and unstable surfaces (F(1,290) = 0, p = 0.98) or with and without principles (F(1,290) = 1.2, p = 0.27). IL iEMG was higher for the stable surface condition (F(1,290) = 32.3, p < 0.001) and with Pilates principles (F(1,290) = 21.9, p < 0.001). The MU iEMG was higher for the stable surface condition both with and without Pilates principles (F(1,290) = 84.9, p < 0.001). Copyright © 2014 Elsevier Ltd. All rights reserved.
Radio techniques for probing the terrestrial ionosphere.
NASA Astrophysics Data System (ADS)
Hunsucker, R. D.
The subject of the book is a description of the basic principles of operation, plus the capabilities and limitations, of all generic radio techniques employed to investigate the terrestrial ionosphere. The purpose of the book is to present the reader with a balanced treatment of each technique, so that readers can understand how to interpret ionospheric data and decide which techniques are most effective for studying specific phenomena. The first two chapters outline the basic theory underlying the techniques, and each following chapter discusses a separate technique. This monograph is entirely devoted to techniques in aeronomy and space physics. The approach is unique in its presentation of the principles, capabilities and limitations of the most important radio techniques presently in use. Typical examples of data are shown for the various techniques, and a brief historical account of each technique's development is presented. An extended annotated bibliography of the salient papers in the field is included.
Ethical Principles of Psychologists (Amended June 2, 1989).
ERIC Educational Resources Information Center
American Psychologist, 1990
1990-01-01
Reports the amended ethical principles of psychologists (June 2, 1989). The following principles are covered: (1) responsibility; (2) competence; (3) moral and legal standards; (4) public statements; (5) confidentiality; (6) welfare of the consumer; (7) professional relationships; (8) assessment techniques; (9) research with human participants;…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baer, Donald R.
2005-04-22
An updated version of ASTM guide E1523 on charge control and charge referencing techniques in x-ray photoelectron spectroscopy has been released by ASTM. The guide is meant to acquaint x-ray photoelectron spectroscopy (XPS) users with the various charge control and charge referencing techniques that are and have been used in the acquisition and interpretation of XPS data from the surfaces of insulating specimens. The current guide has been expanded to include new references as well as recommendations for reporting information on charge control and charge referencing. The previous version of the document was published in 1997.
ERIC Educational Resources Information Center
Hosker, Bill S.
2018-01-01
A highly simplified variation on the do-it-yourself spectrophotometer using a smartphone's light sensor as a detector and an app to calculate and display absorbance values was constructed and tested. This simple version requires no need for electronic components or postmeasurement spectral analysis. Calibration graphs constructed from two…
Chemistry of potentially prebiological natural products
NASA Astrophysics Data System (ADS)
Eschenmoser, Albert
1994-09-01
A relationship between what might be called a kinetic version of Le Chatelier's principle and chemical self-organization is considered. Some aspects of the search for a pre-RNA genetic system are discussed. Results of an experimental investigation on the pairing properties of alternative nucleic acid systems — including those of pyranosyl-RNA (‘p-RNA’), a constitutional isomer of RNA — are summarized.
Ten Precepts about the Circumstance of Rural Education. Occasional Paper No. 11
ERIC Educational Resources Information Center
Howley, Craig B.
2004-01-01
This paper is a slightly revised version of a formal lecture given on July 29, 2004, to the second cohort of ACCLAIM doctoral students on the final night of a course titled "Rural Education: Historical Perspective." This essay shares the following ten precepts of rural education, which are principles intended as teachings: (1) Rural areas and…
Nebula observations. Catalogues and archive of photoplates
NASA Astrophysics Data System (ADS)
Shlyapnikov, A. A.; Smirnova, M. A.; Elizarova, N. V.
2017-12-01
A process of data systematization based on "Academician G.A. Shajn's Plan" for studying the structure of the Galaxy through nebula observations is considered. The creation of digital versions of the catalogues of observations and publications is described, as well as their presentation in HTML, VOTable and AJS formats and the basic principles of working with them in the Aladin Sky Atlas, an interactive application of the International Virtual Observatory.
ERIC Educational Resources Information Center
Horspool, Agi; Lange, Carsten
2012-01-01
This study compares student perceptions, learning behaviours and success in online and face-to-face versions of a Principles of Microeconomics course. It follows a Scholarship of Teaching and Learning (SoTL) approach by using a cycle of empirical analysis, reflection and action to improve the learning experience for students. The online course…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-01
...-based assessment approaches and quality by design principles. These efforts will also be considered part... processes being referenced. DATES: Although you can comment on any guidance at any time (see 21 CFR 10.115(g... on the final version of the guidance and on any other part of the SUPAC guidance series, submit...
A Critique of the New Statement on Labeling
ERIC Educational Resources Information Center
Hitchcock, Leonard A.
2006-01-01
In this paper, the 2005 revision of ALA's position document on labeling and rating systems is closely examined and assessed, not only in comparison with the previous version of the document, but also in terms of its adequacy as a statement of library principles and as a practical guide for library practice. It is found to be ambiguous in meaning,…
Existence of weak solutions to degenerate p-Laplacian equations and integral formulas
NASA Astrophysics Data System (ADS)
Chua, Seng-Kee; Wheeden, Richard L.
2017-12-01
We study the problem of solving some general integral formulas and then apply the conclusions to obtain results about the existence of weak solutions of various degenerate p-Laplacian equations. We adapt Variational Calculus methods and the Mountain Pass Lemma without the Palais-Smale condition, and we use an abstract version of Lions' Concentration Compactness Principle II.
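For reference, the operator underlying the equations studied above is the p-Laplacian, with a degenerate (weighted) form typically written as

```latex
\Delta_p u \;=\; \operatorname{div}\!\left(|\nabla u|^{p-2}\,\nabla u\right),
\qquad
\operatorname{div}\!\left(w(x)\,|\nabla u|^{p-2}\,\nabla u\right) = f(x,u),
```

where the weight $w \ge 0$ may vanish or blow up, which is what makes the equation degenerate; the precise structural assumptions are those of the paper and are not reproduced here.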
NASA Astrophysics Data System (ADS)
Werner, E.
In 1876, Alexander Graham Bell described his first telephone, with a microphone using magnetic induction to convert the voice input into an electric output signal. The basic principle led to a variety of designs optimized for different needs, from hearing-impaired users to singers and broadcast announcers. Of the various sound-pressure versions, only the moving-coil design is still in mass production for speech and music applications.
The Equivalence Principle Comes to School--Falling Objects and Other Middle School Investigations
ERIC Educational Resources Information Center
Pendrill, Ann-Marie; Ekström, Peter; Hansson, Lena; Mars, Patrik; Ouattara, Lassana; Ryan, Ulrika
2014-01-01
Comparing two objects falling together is a small-scale version of Galileo's classical experiment, demonstrating the equivalence between gravitational and inertial mass. We present here investigations by a group of ten-year-olds, who used iPads to record the drops. The movie recordings were essential in the follow-up discussions, enabling the…
MERCHANT OF VENICE. LITERATURE CURRICULUM III, TEACHER VERSION.
ERIC Educational Resources Information Center
KITZHABER, ALBERT R.
This teaching guide on "The Merchant of Venice" was prepared for use in a ninth-grade literature curriculum. The purpose of the guide was to illuminate the play as a whole, and to suggest to the teacher some useful principles for framing questions and guiding discussions in the classroom. The guide was not to be used, however, as a base…
Mechanical-Kinetic Modeling of a Molecular Walker from a Modular Design Principle
NASA Astrophysics Data System (ADS)
Hou, Ruizheng; Loh, Iong Ying; Li, Hongrong; Wang, Zhisong
2017-02-01
Artificial molecular walkers beyond burnt-bridge designs are complex nanomachines that potentially replicate biological walkers in mechanisms and functionalities. Bringing man-made walkers up to the performance required for widespread applications remains difficult, largely because their biomimetic design principles involve entangled kinetic and mechanical effects that complicate the link between a walker's construction and its ultimate performance. Here, a synergic mechanical-kinetic model is developed for a recently reported DNA bipedal walker, which is based on a modular design principle potentially enabling many directional walkers driven by a length-switching engine. The model reproduces the experimental data for the walker and identifies its performance-limiting factors. The model also captures features common to the underlying design principle, including counterintuitive performance-construction relations that are explained by detailed balance, entropy production, and bias cancellation. While indicating a low directional fidelity for the present walker, the model suggests the possibility of improving the fidelity above 90% with a more powerful engine, which may be an improved version of the present engine or an entirely new engine motif, thanks to the flexible design principle. The model is readily adaptable to aid these experimental developments towards high-performance molecular walkers.
Leslie, Julian C
2006-01-01
Herbert Spencer's Principles of Psychology (1855, first edition) was regarded by his contemporaries, including William James and John Dewey, as a major contribution to what was then a very new discipline. In this book he first expounded his ideas about both evolution of species and how behavior of the individual organism adapts through interaction with the environment. His formulation of the principle that behavior changes in adaptation to the environment is closely related to the version of the law of effect propounded some years later by Thorndike. He can thus be seen as the first proponent of selectionism, a key tenet of behavior analysis. He also explicitly attacked the then prevailing view of free will as being incompatible with the biologically grounded view of psychological processes that he was advocating, and thus put forward ideas that were precursors of B. F. Skinner's in this important area of debate. PMID:16903496
On classical mechanical systems with non-linear constraints
NASA Astrophysics Data System (ADS)
Terra, Gláucio; Kobayashi, Marcelo H.
2004-03-01
In the present work, we analyze classical mechanical systems with non-linear constraints in the velocities. We prove that the d'Alembert-Chetaev trajectories of a constrained mechanical system satisfy both Gauss' principle of least constraint and Hölder's principle. In the case of a free mechanics, they also satisfy Hertz's principle of least curvature if the constraint manifold is a cone. We show that the Gibbs-Maggi-Appell (GMA) vector field (i.e. the second-order vector field which defines the d'Alembert-Chetaev trajectories) conserves energy for any potential energy if, and only if, the constraint is homogeneous (i.e. if the Liouville vector field is tangent to the constraint manifold). We introduce the Jacobi-Carathéodory metric tensor and prove Jacobi-Carathéodory's theorem assuming that the constraint manifold is a cone. Finally, we present a version of Liouville's theorem on the conservation of volume for the flow of the GMA vector field.
A preliminary study of a cryogenic equivalence principle experiment on Shuttle
NASA Technical Reports Server (NTRS)
Everitt, C. W. F.; Worden, P. W., Jr.
1985-01-01
The Weak Equivalence Principle is the hypothesis that all test bodies fall with the same acceleration in the same gravitational field. The current limit on violations of the Weak Equivalence Principle, measured by the ratio of the difference in acceleration of two test masses to their average acceleration, is about 3 parts in one-hundred billion. It is anticipated that this can be improved in a shuttle experiment to a part in one quadrillion. Topics covered include: (1) studies of the shuttle environment, including interference with the experiment, interfacing to the experiment, and possible alternatives; (2) numerical simulations of the proposed experiment, including analytic solutions for special cases of the mass motion and preliminary estimates of sensitivity and time required; (3) error analysis of several noise sources such as thermal distortion, gas and radiation pressure effects, and mechanical distortion; and (4) development and performance tests of a laboratory version of the instrument.
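The figure of merit quoted above, the ratio of the difference in acceleration of two test masses to their average acceleration, is the Eötvös ratio, commonly written as

```latex
\eta \;=\; \frac{a_1 - a_2}{\tfrac{1}{2}\,(a_1 + a_2)},
```

so the quoted current limit corresponds to $|\eta| \approx 3 \times 10^{-11}$ and the anticipated shuttle sensitivity to $|\eta| \approx 10^{-15}$.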
Applying Brain-Based Learning Principles to Athletic Training Education
ERIC Educational Resources Information Center
Craig, Debbie I.
2007-01-01
Objective: To present different concepts and techniques related to the application of brain-based learning principles to Athletic Training clinical education. Background: The body of knowledge concerning how our brains physically learn continues to grow. Brain-based learning principles, developed by numerous authors, offer advice on how to…
A Technique of Teaching the Principle of Equivalence at Ground Level
ERIC Educational Resources Information Center
Lubrica, Joel V.
2016-01-01
This paper presents one way of demonstrating the Principle of Equivalence in the classroom. Teaching the Principle of Equivalence involves someone experiencing acceleration through empty space, juxtaposed with the daily encounter with gravity. This classroom activity is demonstrated with a water-filled bottle containing glass marbles and…
NASA Technical Reports Server (NTRS)
Lohn, Jason; Smith, David; Frank, Jeremy; Globus, Al; Crawford, James
2007-01-01
JavaGenes is a general-purpose, evolutionary software system written in Java. It implements several versions of a genetic algorithm, simulated annealing, stochastic hill climbing, and other search techniques. This software has been used to evolve molecules, atomic force field parameters, digital circuits, Earth Observing Satellite schedules, and antennas. This version differs from version 0.7.28 in that it includes the molecule evolution code and other improvements. Except for the antenna code, JavaGenes is available for NASA Open Source distribution.
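The flavor of genetic-algorithm search that JavaGenes generalizes can be sketched with a minimal bit-string GA. This is a generic textbook sketch in Python, not JavaGenes' actual Java implementation:

```python
import random

random.seed(0)  # reproducible toy run

def evolve(fitness, n_bits=16, pop_size=30, generations=60, p_mut=0.02):
    """Minimal generational GA: tournament selection, one-point crossover,
    bit-flip mutation."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]

    def tournament():
        a, b = random.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b

    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = tournament(), tournament()
            cut = random.randrange(1, n_bits)      # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [g ^ 1 if random.random() < p_mut else g
                     for g in child]               # bit-flip mutation
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# Toy objective: maximize the number of 1-bits ("OneMax").
best = evolve(fitness=sum)
```

Swapping the genome representation and fitness function is what lets the same loop evolve molecules, schedules, or circuit parameters.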
A Case Study on Multiple-Choice Testing in Anatomical Sciences
ERIC Educational Resources Information Center
Golda, Stephanie DuPont
2011-01-01
Objective testing techniques, such as multiple-choice examinations, are a widely accepted method of assessment in gross anatomy. In order to deter cheating on these types of examinations, instructors often design several versions of an examination to distribute. These versions usually involve the rearrangement of questions and their corresponding…
A Survey Version of Full-Profile Conjoint Analysis.
ERIC Educational Resources Information Center
Chrzan, Keith
Two studies were conducted to test the viability of a survey version of full-profile conjoint analysis. Conjoint analysis describes a variety of analytic techniques for measuring subjects' "utilities," or preferences for the individual attributes or levels of attributes that constitute objects under study. The first study compared the…
Basic principles of management for cervical spine trauma.
O'Dowd, J K
2010-03-01
This article reviews the basic principles of management of cervical trauma. The technique and critical importance of careful assessment are described. Instability is defined, and the incidence of a second injury is highlighted. The concept of spinal clearance is discussed. Early reduction and stabilisation techniques are described, and the indications for and approach to surgery are reviewed. The importance of the role of post-injury rehabilitation is identified.
Case mix measures and diagnosis-related groups: opportunities and threats for inpatient dermatology.
Hensen, P; Fürstenberg, T; Luger, T A; Steinhoff, M; Roeder, N
2005-09-01
The changing healthcare environment worldwide is leading to extensive use of per-case payment systems based on diagnosis-related groups (DRG). The aim of this study was to examine the impact of applying the different DRG systems used in the German healthcare system. We retrospectively analysed 2334 clinical data sets of inpatients discharged from an academic dermatological inpatient unit in 2003. Data were regarded as providing high coding quality in compliance with the diagnosis and procedure classifications as well as coding standards. The application of the Australian AR-DRG version 4.1, the German G-DRG version 1.0, and the German G-DRG version 2004 was considered in detail. To evaluate more specific aspects, data were broken down into 11 groups based on the principal diagnosis. DRG cost weights and case mix index were used to compare coverage of inpatient dermatological services. Economic impacts were illustrated by case mix volumes and calculation of DRG payments. Case mix index results and the pending prospective revenues vary considerably depending on which DRG system is applied. The G-DRG version 2004 provides increased levels of case mix index that encourage, in particular, medical dermatology. The AR-DRG version 4.1 and the first German DRG version 1.0 appear less suitable for adequately covering inpatient dermatology. The G-DRG version 2004 has been greatly improved, probably owing to improved calculation standards and DRG adjustments. The future of inpatient dermatology is subject to appropriate depiction of well-established treatment standards.
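The case-mix quantities used in the comparison are simple aggregates of DRG cost weights; a minimal sketch with hypothetical weights and a hypothetical base rate:

```python
# Hypothetical DRG cost weights for five discharged inpatient cases.
cost_weights = [0.8, 1.2, 1.0, 2.5, 0.9]
base_rate = 2900.0  # hypothetical per-case base rate

case_mix = sum(cost_weights)                      # case mix volume
case_mix_index = case_mix / len(cost_weights)     # average economic severity
payments = [w * base_rate for w in cost_weights]  # per-case DRG payment
```

Under a different DRG system the same cases map to different cost weights, which is why the case mix index, and hence revenue, shifts with the grouper version.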
Del Carratore, Francesco; Jankevics, Andris; Eisinga, Rob; Heskes, Tom; Hong, Fangxin; Breitling, Rainer
2017-09-01
The Rank Product (RP) is a statistical technique widely used to detect differentially expressed features in molecular profiling experiments such as transcriptomics, metabolomics and proteomics studies. An implementation of the RP and the closely related Rank Sum (RS) statistics has been available in the RankProd Bioconductor package for several years. However, several recent advances in the understanding of the statistical foundations of the method have made a complete refactoring of the existing package desirable. We implemented a completely refactored version of the RankProd package, which provides a more principled implementation of the statistics for unpaired datasets. Moreover, the permutation-based P-value estimation methods have been replaced by exact methods, providing faster and more accurate results. RankProd 2.0 is available at Bioconductor ( https://www.bioconductor.org/packages/devel/bioc/html/RankProd.html ) and as part of the mzMatch pipeline ( http://www.mzmatch.sourceforge.net ). rainer.breitling@manchester.ac.uk. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
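The Rank Product statistic itself is just the geometric mean of a feature's ranks across replicate comparisons; a minimal sketch with hypothetical rank data (the RankProd package additionally estimates P-values, exactly in version 2.0):

```python
# Ranks of each feature (rows) across k = 3 replicate comparisons
# (columns); rank 1 = most strongly changed. Values are hypothetical.
ranks = [
    [1, 2, 1],   # feature A: consistently near the top
    [3, 1, 2],   # feature B
    [2, 3, 3],   # feature C
]

def rank_product(feature_ranks):
    """RP = (prod of ranks)^(1/k), the geometric mean of the ranks."""
    k = len(feature_ranks)
    prod = 1.0
    for r in feature_ranks:
        prod *= r
    return prod ** (1.0 / k)

rps = [rank_product(r) for r in ranks]
```

A small RP means a feature was consistently highly ranked across replicates, which is unlikely under the null hypothesis of random rank assignment.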
To cut or not to cut? Assessing the modular structure of brain networks.
Chang, Yu-Teng; Pantazis, Dimitrios; Leahy, Richard M
2014-05-01
A wealth of methods has been developed to identify natural divisions of brain networks into groups or modules, with one of the most prominent being modularity. Compared with the popularity of methods to detect community structure, only a few methods exist to statistically control for spurious modules, relying almost exclusively on resampling techniques. It is well known that even random networks can exhibit high modularity because of incidental concentration of edges, even though they have no underlying organizational structure. Consequently, interpretation of community structure is confounded by the lack of principled and computationally tractable approaches to statistically control for spurious modules. In this paper we show that the modularity of random networks follows a transformed version of the Tracy-Widom distribution, providing for the first time a link between module detection and random matrix theory. We compute parametric formulas for the distribution of modularity for random networks as a function of network size and edge variance, and show that we can efficiently control for false positives in brain and other real-world networks. Copyright © 2014 Elsevier Inc. All rights reserved.
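Newman's modularity, the quantity whose null distribution the paper links to the Tracy-Widom law, can be computed directly from the adjacency matrix; a from-scratch sketch on a toy graph (not the paper's parametric test):

```python
def modularity(adj, communities):
    """Newman's Q = (1/2m) * sum_ij [A_ij - k_i k_j / 2m] * delta(c_i, c_j),
    computed from a dense adjacency matrix given as a list of lists."""
    n = len(adj)
    degree = [sum(row) for row in adj]
    two_m = sum(degree)  # twice the number of edges
    q = 0.0
    for i in range(n):
        for j in range(n):
            if communities[i] == communities[j]:
                q += adj[i][j] - degree[i] * degree[j] / two_m
    return q / two_m

# Two triangles joined by a single bridge edge; splitting along the
# bridge should score well above lumping everything together.
adj = [
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 1, 0, 0],
    [0, 0, 1, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
]
q_split = modularity(adj, [0, 0, 0, 1, 1, 1])
```

Even an Erdős-Rényi random graph can yield a positive Q for its best partition, which is exactly why a null distribution for Q is needed before interpreting detected modules.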
Ultrasound elastography: principles, techniques, and clinical applications.
Dewall, Ryan J
2013-01-01
Ultrasound elastography comprises an emerging set of imaging modalities used to image tissue elasticity, often referred to as virtual palpation. These techniques have proven effective in detecting and assessing many different pathologies, because tissue mechanical changes often correlate with tissue pathological changes. This article reviews the principles of ultrasound elastography, many of the ultrasound-based techniques, and popular clinical applications. Originally, elastography was a technique that imaged tissue strain by comparing pre- and postcompression ultrasound images. However, new techniques have been developed that use different excitation methods such as external vibration or acoustic radiation force. Some techniques track transient phenomena such as shear waves to quantitatively measure tissue elasticity. Clinical use of elastography is increasing, with applications including lesion detection and classification, fibrosis staging, treatment monitoring, vascular imaging, and musculoskeletal applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gonçalves, L.A.; Olavo, L.S.F., E-mail: olavolsf@gmail.com
Dissipation in Quantum Mechanics took some time to become a robust area of investigation after the birth of the field. The main issue hindering developments is that the quantization process has always been tightly connected to the Hamiltonian formulation of Classical Mechanics. In this paper we present a quantization process that does not depend upon the Hamiltonian formulation of Classical Mechanics (although it still departs from Classical Mechanics) and thus overcome the problem of finding, from first principles, a completely general Schrödinger equation encompassing dissipation. This generalized process of quantization is shown to be nothing but an extension of a more restricted version that is shown to produce the Schrödinger equation for Hamiltonian systems from first principles (even for Hamiltonian velocity-dependent potentials). - Highlights: • A quantization process independent of the Hamiltonian formulation of Quantum Mechanics is proposed. • This quantization method is applied to dissipative or absorptive systems. • A dissipative Schrödinger equation is derived from first principles.
The Stanford equivalence principle program
NASA Technical Reports Server (NTRS)
Worden, Paul W., Jr.; Everitt, C. W. Francis; Bye, M.
1989-01-01
The Stanford Equivalence Principle Program (Worden, Jr. 1983) is intended to test the uniqueness of free fall to the ultimate possible accuracy. The program is being conducted in two phases: first, a ground-based version of the experiment, which should have a sensitivity to differences in rate of fall of one part in 10(exp 12); followed by an orbital experiment with a sensitivity of one part in 10(exp 17) or better. The ground-based experiment, although a sensitive equivalence principle test in its own right, is being used for technology development for the orbital experiment. A secondary goal of the experiment is a search for exotic forces. The instrument is very well suited for this search, which would be conducted mostly with the ground-based apparatus. The short range predicted for these forces means that forces originating in the Earth would not be detectable in orbit. But detection of Yukawa-type exotic forces from a nearby large satellite (such as Space Station) is feasible, and gives a very sensitive and controllable test for little more effort than the orbiting equivalence principle test itself.
Pachankis, John E.
2014-01-01
Gay and bisexual men disproportionately experience depression, anxiety, and related health risks at least partially because of their exposure to sexual minority stress. This paper describes the adaptation of an evidence-based intervention capable of targeting the psychosocial pathways through which minority stress operates. Interviews with key stakeholders, including gay and bisexual men with depression and anxiety and expert providers, suggested intervention principles and techniques for improving minority stress coping. These principles and techniques are consistent with general cognitive behavioral therapy approaches, the empirical tenets of minority stress theory, and professional guidelines for LGB-affirmative mental health practice. If found to be efficacious, the psychosocial intervention described here would be one of the first to improve the mental health of gay and bisexual men by targeting minority stress. PMID:25554721
In Silico PCR Tools for a Fast Primer, Probe, and Advanced Searching.
Kalendar, Ruslan; Muterko, Alexandr; Shamekova, Malika; Zhambakin, Kabyl
2017-01-01
The polymerase chain reaction (PCR) is fundamental to molecular biology and is the most important practical molecular technique for the research laboratory. The principle of this technique has been further used and applied in many other simple or complex nucleic acid amplification technologies (NAAT). In parallel to laboratory "wet bench" experiments for nucleic acid amplification technologies, in silico or virtual (bioinformatics) approaches have been developed, among which is in silico PCR analysis. In silico NAAT analysis is a useful and efficient complementary method to ensure the specificity of primers or probes for an extensive range of PCR applications, from homology gene discovery and molecular diagnosis to DNA fingerprinting and repeat searching. Predicting the sensitivity and specificity of primers and probes requires a database search to determine whether they match with an acceptable number of mismatches, similarity, and stability. In developing in silico bioinformatics tools for nucleic acid amplification technologies, the prospects for new NAAT or similar approaches should be taken into account, including forward-looking, comprehensive analysis that is not limited to only one PCR technique variant. The software FastPCR and the online Java web tool are integrated tools for in silico PCR of linear and circular DNA, for multiple primer or probe searches in large or small databases, and for advanced searching. These tools are suitable for processing batch files, which is essential for automation when working with large amounts of data. The FastPCR software is available for download at http://primerdigital.com/fastpcr.html and the online Java version at http://primerdigital.com/tools/pcr.html.
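As a sketch of the core matching step such tools perform (an illustrative toy, not FastPCR's actual algorithm or API; the function names and the simple mismatch-counting scheme are my own), a primer can be slid along both strands of a template while counting mismatches:

```python
def revcomp(seq):
    """Reverse complement of a DNA sequence."""
    comp = {"A": "T", "T": "A", "G": "C", "C": "G"}
    return "".join(comp[b] for b in reversed(seq.upper()))

def find_binding_sites(template, primer, max_mismatches=2):
    """Return (position, strand, mismatches) for every near-match of
    the primer on either strand of a linear template."""
    template = template.upper()
    hits = []
    for strand, query in (("+", primer.upper()), ("-", revcomp(primer))):
        for i in range(len(template) - len(query) + 1):
            window = template[i:i + len(query)]
            mm = sum(a != b for a, b in zip(window, query))
            if mm <= max_mismatches:
                hits.append((i, strand, mm))
    return hits

template = "ATGGCGTACGTTAGCATGCAAATTTGCATGCTAACGTACGCCAT"
print(find_binding_sites(template, "GCATGCAA", max_mismatches=1))
```

A real in silico PCR tool would additionally weight 3'-end mismatches more heavily and estimate duplex stability, as the abstract notes.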
NASA Technical Reports Server (NTRS)
Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.
1991-01-01
The Finite Difference Time Domain Electromagnetic Scattering Code Version A is a three-dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). This manual provides a description of the code and corresponding results for the default scattering problem. In addition to the description, the manual covers the operation, resource requirements, and Version A code capabilities, and includes a description of each subroutine, a brief discussion of the radar cross section computations, and a discussion of the scattering results.
Principles and application of shock-tubes and shock tunnels
NASA Technical Reports Server (NTRS)
Ried, R. C.; Clauss, H. G., Jr.
1963-01-01
The principles, theoretical flow equations, calculation techniques, limitations and practical performance characteristics of basic and high performance shock tubes and shock tunnels are presented. Selected operating curves are included.
NASA Astrophysics Data System (ADS)
Makhov, Dmitry V.; Symonds, Christopher; Fernandez-Alberti, Sebastian; Shalashilin, Dmitrii V.
2017-08-01
The Multiconfigurational Ehrenfest (MCE) method is a quantum dynamics technique which allows treatment of a large number of quantum nuclear degrees of freedom. This paper presents a review of MCE and its recent applications, providing a summary of the formalisms, including its ab initio direct dynamics versions and also giving a summary of recent results. Firstly, we describe the Multiconfigurational Ehrenfest version 2 (MCEv2) method and its applicability to direct dynamics and report new calculations which show that the approach converges to the exact result in model systems with tens of degrees of freedom. Secondly, we review previous "on the fly" ab initio Multiple Cloning (AIMC-MCE) MCE dynamics results obtained for systems of a similar size, in which the calculations treat every electron and every nucleus of a polyatomic molecule on a fully quantum basis. We also review the Time Dependent Diabatic Basis (TDDB) version of the technique and give an example of its application. We summarise the details of the sampling techniques and interpolations used for calculation of the matrix elements, which make our approach efficient. Future directions of work are outlined.
An objective isobaric/isentropic technique for upper air analysis
NASA Technical Reports Server (NTRS)
Mancuso, R. L.; Endlich, R. M.; Ehernberger, L. J.
1981-01-01
An objective meteorological analysis technique is presented whereby both horizontal and vertical upper air analyses are performed. The process used to interpolate grid-point values from the upper-air station data is the same for grid points on an isobaric surface as for those on a vertical cross-sectional plane. The nearby data surrounding each grid point are used in the interpolation by means of an anisotropic weighting scheme, which is described. The interpolation for a grid-point potential temperature is performed isobarically, whereas wind, mixing-ratio, and pressure height values are interpolated from data that lie on the isentropic surface that passes through the grid point. Two versions (A and B) of the technique are evaluated by qualitatively comparing computer analyses with subjective hand-drawn analyses. The objective products of version A generally have fair correspondence with the subjective analyses and with the station data, and depict the structure of the upper fronts, tropopauses, and jet streams fairly well. The version B objective products correspond more closely to the subjective analyses, and show the same strong gradients across the upper front with only minor smoothing.
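The anisotropic weighting idea can be sketched as follows (a minimal illustration; the weight form and the scale parameters are my own invention, not the actual scheme of Mancuso et al.): stations displaced along a preferred axis, such as the flow direction, are treated as if they were closer to the grid point than stations displaced across it.

```python
def anisotropic_weight(dx, dy, along=3.0, across=1.0):
    """Toy anisotropic weight: displacement along the preferred axis is
    penalized less than displacement across it, via different scale
    lengths in an elliptical distance measure."""
    r2 = (dx / along) ** 2 + (dy / across) ** 2
    return 1.0 / (1.0 + r2)

def interpolate(grid_pt, stations, **kw):
    """Weighted mean of station values at one grid point."""
    gx, gy = grid_pt
    num = den = 0.0
    for (sx, sy, value) in stations:
        w = anisotropic_weight(sx - gx, sy - gy, **kw)
        num += w * value
        den += w
    return num / den

# Three hypothetical stations: (x, y, observed value)
stations = [(1.0, 0.0, 10.0), (0.0, 1.0, 20.0), (-2.0, 0.0, 30.0)]
print(round(interpolate((0.0, 0.0), stations), 3))
```

Note how the station at distance 2 along the preferred axis still receives a larger weight (9/13) than the station at distance 1 across it (1/2).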
Adaptive partially hidden Markov models with application to bilevel image coding.
Forchhammer, S; Rasmussen, T S
1999-01-01
Partially hidden Markov models (PHMMs) have previously been introduced. The transition and emission/output probabilities from hidden states, as known from the HMMs, are conditioned on the past. This way, the HMM may be applied to images introducing the dependencies of the second dimension by conditioning. In this paper, the PHMM is extended to multiple sequences with a multiple token version and adaptive versions of PHMM coding are presented. The different versions of the PHMM are applied to lossless bilevel image coding. To reduce and optimize the model cost and size, the contexts are organized in trees and effective quantization of the parameters is introduced. The new coding methods achieve results that are better than the JBIG standard on selected test images, although at the cost of increased complexity. By the minimum description length principle, the methods presented for optimizing the code length may apply as guidance for training (P)HMMs for, e.g., segmentation or recognition purposes. Thereby, the PHMM models provide a new approach to image modeling.
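The key idea of conditioning emission probabilities on the past can be illustrated with a toy adaptive context model for bilevel images (this sketch is in the spirit of (P)HMM/JBIG-style coding but is not the PHMM itself; the 3-pixel causal context and the Laplace smoothing are my own choices):

```python
import math

def adaptive_code_length(image):
    """Ideal code length (bits) of a bilevel image under an adaptive
    model whose bit probabilities are conditioned on a 3-pixel causal
    context (west, north, north-west), with Laplace-smoothed counts."""
    h, w = len(image), len(image[0])
    counts = {}  # context -> [count of 0s, count of 1s], incl. prior
    bits = 0.0
    for y in range(h):
        for x in range(w):
            ctx = (image[y][x - 1] if x > 0 else 0,
                   image[y - 1][x] if y > 0 else 0,
                   image[y - 1][x - 1] if x > 0 and y > 0 else 0)
            c0, c1 = counts.get(ctx, [1, 1])        # Laplace prior
            p1 = c1 / (c0 + c1)
            bit = image[y][x]
            bits += -math.log2(p1 if bit else 1.0 - p1)
            counts[ctx] = [c0 + (bit == 0), c1 + (bit == 1)]
    return bits

img = [[0, 0, 1, 1]] * 4   # a highly predictable 4x4 image
print(round(adaptive_code_length(img), 2))
```

Because the contexts quickly become predictive on this regular image, the total is well below the 16 bits a memoryless uniform model would spend.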
Collister, Barbara; Stein, Glenda; Katz, Deborah; DeBruyn, Joan; Andrusiw, Linda; Cloutier, Sheila
2012-01-01
Increasing costs and budget reductions combined with increasing demand from our growing, aging population support the need to ensure that the scarce resources allocated to home care clients match client needs. This article details how Integrated Home Care for the Calgary Zone of Alberta Health Services considered ethical and economic principles and used data from the Resident Assessment Instrument for Home Care (RAI-HC) and case mix indices from the Resource Utilization Groups Version III for Home Care (RUG-III/HC) to formulate service guidelines. These explicit service guidelines formalize and support individual resource allocation decisions made by case managers and provide a consistent and transparent method of allocating limited resources.
Life on the arc: principle-centered comprehensive care.
Fohey, T; Cassidy, J L
1998-01-01
Today's dental practice is experiencing an evolution in the manner through which new materials and techniques are marketed and introduced. An increasing concern among the patient population regarding aesthetics contributes to the acceptance of a commodity dental philosophy, without questioning the reliability of the technique or new material. A principle-centered practice differentiates the product marketing from the viability of a restorative material in vivo. This article discusses the concept of a principle-centered practice and describes how to place quality products in a balanced system in which harmony exists between all components of the masticatory system: the teeth, the muscles, and the temporomandibular joints.
Classical closure theory and Lam's interpretation of epsilon-RNG
NASA Technical Reports Server (NTRS)
Zhou, YE
1995-01-01
Lam's phenomenological epsilon-renormalization group (RNG) model is quite different from the other members of that group. It does not make use of the correspondence principle and the epsilon-expansion procedure. We demonstrate that Lam's epsilon-RNG model is essentially the physical space version of the classical closure theory in spectral space and consider the corresponding treatment of the eddy viscosity and energy backscatter.
Evaluations of Some Scheduling Algorithms for Hard Real-Time Systems
1990-06-01
construct because the mechanism is a dispatching procedure. Since all nonpreemptive schedules are contained in the set of all preemptive schedules, the optimal value of Tmax in the preemptive case is at least a lower bound on the optimal Tmax for the nonpreemptive schedules. This principle is the basis for evaluating the nonpreemptable versions of the scheduling algorithms, including one that minimizes maximum tardiness with earliest start times.
Optical Fibers Would Sense Local Strains
NASA Technical Reports Server (NTRS)
Egalon, Claudio O.; Rogowski, Robert S.
1994-01-01
Proposed fiber-optic transducers measure local strains. Includes lead-in and lead-out lengths producing no changes in phase shifts, plus short sensing length in which phase shift is sensitive to strain. Phase shifts in single-mode fibers vary with strains. In alternative version, multiple portions of optical fiber sensitive to strains characteristic of specific vibrational mode of object. Same principle also used with two-mode fiber.
Basic principles of management for cervical spine trauma
2009-01-01
This article reviews the basic principles of management of cervical trauma. The technique and critical importance of careful assessment is described. Instability is defined, and the incidence of a second injury is highlighted. The concept of spinal clearance is discussed. Early reduction and stabilisation techniques are described, and the indications, and approach for surgery reviewed. The importance of the role of post-injury rehabilitation is identified. PMID:19701655
Acar, Nihat; Karakasli, Ahmet; Karaarslan, Ahmet; Mas, Nermin Ng; Hapa, Onur
2017-01-01
Volumetric measurements of benign tumors enable surgeons to trace volume changes during follow-up periods. For a volumetric measurement technique to be applicable, it should be easy, rapid, and inexpensive and should carry a high interobserver reliability. We aimed to assess the interobserver reliability of a volumetric measurement technique using Cavalieri's principle of stereological methods. The computerized tomography (CT) scans of 15 patients with a histopathologically confirmed diagnosis of enchondroma, with varying tumor sizes and localizations, were retrospectively reviewed for interobserver reliability evaluation of the volumetric stereological measurement with Cavalieri's principle, V = t × [(SU × d)/SL]² × ΣP. The volumes of the 15 tumors collected by the observers are demonstrated in Table 1. There was no statistically significant difference between the first and second observers (p = 0.000; intraclass correlation coefficient = 0.970) or between the first and third observers (p = 0.000; intraclass correlation coefficient = 0.981). No statistically significant difference was detected between the second and third observers (p = 0.000; intraclass correlation coefficient = 0.976). Cavalieri's principle with the stereological technique using CT scans is an easy, rapid, and inexpensive technique for volumetric evaluation of enchondromas, with trustworthy interobserver reliability.
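Stated as code, the Cavalieri estimator multiplies the slice spacing, the area represented by one grid point (the [(SU × d)/SL]² term in the abstract, which corrects the grid spacing for image scale), and the total point count over all slices. The numbers below are hypothetical, not from the study:

```python
def cavalieri_volume(point_counts, slice_thickness, area_per_point):
    """Cavalieri point-counting estimator: V ≈ t · (a/p) · ΣP, where
    t is the slice spacing, a/p the area represented by one grid point,
    and ΣP the total number of grid points hitting the object."""
    return slice_thickness * area_per_point * sum(point_counts)

# Hypothetical tumor seen on 5 CT slices 2 mm apart, counted with a
# grid in which each point represents 4 mm^2:
print(cavalieri_volume([12, 18, 22, 17, 9], 2.0, 4.0))  # volume in mm^3
```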
Judo principles and practices: applications to conflict-solving strategies in psychotherapy.
Gleser, J; Brown, P
1988-07-01
Jigoro Kano created judo from ju-jitsu techniques. He realized that the Ju principle of both judo and ju-jitsu, the art of yielding, was also a principle of living and changing. The principle of yielding has been applied in dynamic and directive psychotherapies for many years and was recently linked to the Ju principle in martial arts. After several years of using a modified judo practice as a therapeutic tool and applying the principle of yielding as a dynamic conflict-solving strategy, the authors discovered judo principles applicable to conflict solving, particularly for regressed and violent psychotic patients.
MATLAB Toolboxes for Reference Electrode Standardization Technique (REST) of Scalp EEG
Dong, Li; Li, Fali; Liu, Qiang; Wen, Xin; Lai, Yongxiu; Xu, Peng; Yao, Dezhong
2017-01-01
Reference electrode standardization technique (REST) has been increasingly acknowledged and applied in recent years as a re-referencing technique to transform actual multi-channel recordings to approximately zero-reference ones in the electroencephalography/event-related potentials (EEG/ERPs) community around the world. However, an easy-to-use toolbox for re-referencing scalp EEG data to zero reference has been lacking. We have therefore developed two open-source MATLAB toolboxes for REST of scalp EEG. One version of REST is closely integrated into EEGLAB, a popular MATLAB toolbox for processing EEG data; the other is a batch version intended to be more convenient and efficient for experienced users. Both are designed to provide ease of use for novice researchers and flexibility for experienced researchers. All versions of the REST toolboxes can be freely downloaded at http://www.neuro.uestc.edu.cn/rest/Down.html, where detailed information including publications, comments, and documents on REST can also be found. An example of usage is given with comparative results of REST and average reference. We hope these user-friendly REST toolboxes will make the relatively novel technique of REST easier to study, especially for applications in various EEG studies. PMID:29163006
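For context, the average reference that the toolboxes compare REST against is simple to state in code (a plain-Python sketch of the baseline only; REST itself additionally requires a head-model lead field to approximate a reference at infinity, which is what the toolboxes provide):

```python
def average_reference(recordings):
    """Re-reference multi-channel EEG to the common average: subtract
    the instantaneous mean across channels from every channel, so the
    channels sum to zero at each time point."""
    n_ch = len(recordings)
    n_samples = len(recordings[0])
    out = [list(ch) for ch in recordings]
    for t in range(n_samples):
        mean = sum(ch[t] for ch in recordings) / n_ch
        for ch in out:
            ch[t] -= mean
    return out

# Three channels, two time samples (arbitrary microvolt values)
eeg = [[10.0, 12.0], [20.0, 18.0], [30.0, 30.0]]
print(average_reference(eeg))
```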
History Matters: Incremental Ontology Reasoning Using Modules
NASA Astrophysics Data System (ADS)
Cuenca Grau, Bernardo; Halaschek-Wiener, Christian; Kazakov, Yevgeny
The development of ontologies involves continuous but relatively small modifications. Existing ontology reasoners, however, do not take advantage of the similarities between different versions of an ontology. In this paper, we propose a technique for incremental reasoning—that is, reasoning that reuses information obtained from previous versions of an ontology—based on the notion of a module. Our technique does not depend on a particular reasoning calculus and thus can be used in combination with any reasoner. We have applied our results to incremental classification of OWL DL ontologies and found significant improvement over regular classification time on a set of real-world ontologies.
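The reuse idea can be caricatured in a few lines (a deliberately simplified sketch; real locality-based modules are extracted from the axioms themselves rather than given as a fixed partition, and the module names here are invented):

```python
def affected_modules(modules, changed_axioms):
    """Given a partition of an ontology into modules (name -> set of
    axioms) and the set of axioms changed since the last version,
    return the modules whose cached classification results must be
    recomputed; all other modules' results can be reused as-is."""
    return {name for name, axioms in modules.items()
            if axioms & changed_axioms}

modules = {
    "anatomy":  {"Heart subClassOf Organ", "Organ subClassOf Structure"},
    "diseases": {"Flu subClassOf Disease"},
}
print(affected_modules(modules, {"Flu subClassOf Disease"}))
```

Since ontology edits are typically small, the affected set is usually a small fraction of the whole, which is where the reported speedup comes from.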
Buteyko technique use to control asthma symptoms.
Austin, Gillian
The Buteyko breathing technique is recommended in national guidance for control of asthma symptoms. This article explores the evidence base for the technique, outlines its main principles and includes two case studies.
Supercritical fluid extraction. Principles and practice
DOE Office of Scientific and Technical Information (OSTI.GOV)
McHugh, M.A.; Krukonis, V.J.
This book is a presentation of the fundamentals and application of supercritical fluid solvents (SCF). The authors cover virtually every facet of SCF technology: the history of SCF extraction, its underlying thermodynamic principles, process principles, industrial applications, and analysis of SCF research and development efforts. The thermodynamic principles governing SCF extraction are covered in depth. The often complex three-dimensional pressure-temperature-composition (PTx) phase diagrams for SCF-solute mixtures are constructed in a coherent step-by-step manner using the more familiar two-dimensional Px diagrams. The experimental techniques used to obtain high-pressure phase behavior information are described in detail, and the advantages and disadvantages of each technique are explained. Finally, the equations used to model SCF-solute mixtures are developed, and modeling results are presented to highlight the correlational strengths of a cubic equation of state.
Enabling devices, empowering people: the design and evaluation of Trackball EdgeWrite.
Wobbrock, Jacob O; Myers, Brad A
2008-01-01
To describe the research and development that led to Trackball EdgeWrite, a gestural text entry method that improves desktop input for some people with motor impairments. To compare the character-level version of this technique with a new word-level version. Further, to compare the technique with competitor techniques that use on-screen keyboards. A rapid and iterative design-and-test approach was used to generate working prototypes and elicit quantitative and qualitative feedback from a veteran trackball user. In addition, theoretical modelling based on the Steering law was used to compare competing designs. One result is a refined software artifact, Trackball EdgeWrite, which represents the outcome of this investigation. A theoretical result shows the speed benefit of word-level stroking compared to character-level stroking, which resulted in a 45.0% improvement. Empirical results of a trackball user with a spinal cord injury indicate a peak performance of 8.25 wpm with the character-level version of Trackball EdgeWrite and 12.09 wpm with the word-level version, a 46.5% improvement. Log file analysis of extended real-world text entry shows stroke savings of 43.9% with the word-level version. Both versions of Trackball EdgeWrite were better than on-screen keyboards, particularly regarding user preferences. Follow-up correspondence shows that the veteran trackball user with a spinal cord injury still uses Trackball EdgeWrite on a daily basis 2 years after his initial exposure to the software. Trackball EdgeWrite is a successful new method for desktop text entry and may have further implications for able-bodied users of mobile technologies. Theoretical modelling is useful in combination with empirical testing to explore design alternatives. Single-user lab and field studies can be useful for driving a rapid iterative cycle of innovation and development.
Moxibustion for Cephalic Version of Breech Presentation.
Schlaeger, Judith M; Stoffel, Cynthia L; Bussell, Jeanie L; Cai, Hui Yan; Takayama, Miho; Yajima, Hiroyoshi; Takakura, Nobuari
2018-05-01
Moxibustion, a form of traditional Chinese medicine (TCM), is the burning of the herb moxa (Folium Artemisiae argyi or mugwort) over acupuncture points. It is often used in China to facilitate cephalic version of breech presentation. This article reviews the history, philosophy, therapeutic use, possible mechanisms of action, and literature pertaining to its use for this indication. For moxibustion, moxa can be rolled into stick form, placed directly on the skin, or placed on an acupuncture needle and ignited to warm acupuncture points. Studies have demonstrated that moxibustion may promote cephalic version of breech presentation and may facilitate external cephalic version. However, there is currently a paucity of research on the effects of moxibustion on cephalic version of breech presentation, and thus there is a need for further studies. Areas needing more investigation include efficacy, safety, optimal technique, and best protocol for cephalic version of breech presentation. © 2018 by the American College of Nurse-Midwives.
Stable aesthetic standards delusion: changing 'artistic quality' by elaboration.
Carbon, Claus-Christian; Hesslinger, Vera M
2014-01-01
The present study challenges the notion that judgments of artistic quality are based on stable aesthetic standards. We propose that such standards are a delusion and that judgments of artistic quality are the combined result of exposure, elaboration, and discourse. We ran two experiments using elaboration tasks based on the repeated evaluation technique in which different versions of the Mona Lisa had to be elaborated deeply. During the initial task either the version known from the Louvre or an alternative version owned by the Prado was elaborated; during the second task both versions were elaborated in a comparative fashion. After both tasks multiple blends of the two versions had to be evaluated concerning several aesthetic key variables. Judgments of artistic quality of the blends were significantly different depending on the initially elaborated version of the Mona Lisa, indicating experience-based aesthetic processing, which contradicts the notion of stable aesthetic standards.
Aircraft noise prediction program propeller analysis system IBM-PC version user's manual version 2.0
NASA Technical Reports Server (NTRS)
Nolan, Sandra K.
1988-01-01
The IBM-PC version of the Aircraft Noise Prediction Program (ANOPP) Propeller Analysis System (PAS) is a set of computational programs for predicting the aerodynamics, performance, and noise of propellers. The ANOPP-PAS is a subset of a larger version of ANOPP which can be executed on CDC or VAX computers. This manual provides a description of the IBM-PC version of the ANOPP-PAS and its prediction capabilities, and instructions on how to use the system on an IBM-XT or IBM-AT personal computer. Sections within the manual document installation, system design, ANOPP-PAS usage, data entry preprocessors, and ANOPP-PAS functional modules and procedures. Appendices to the manual include a glossary of ANOPP terms and information on error diagnostics and recovery techniques.
NASA Technical Reports Server (NTRS)
Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.
1992-01-01
The Penn State Finite Difference Time Domain Electromagnetic Code Version B is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). The supplied version of the code is one version of our current three dimensional FDTD code set. This manual provides a description of the code and corresponding results for several scattering problems. The manual is organized into 14 sections: introduction, description of the FDTD method, operation, resource requirements, Version B code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file, a discussion of radar cross section computations, a discussion of some scattering results, a sample problem setup section, a new problem checklist, references and figure titles.
Harrison, Peter M C; Collins, Tom; Müllensiefen, Daniel
2017-06-15
Modern psychometric theory provides many useful tools for ability testing, such as item response theory, computerised adaptive testing, and automatic item generation. However, these techniques have yet to be integrated into mainstream psychological practice. This is unfortunate, because modern psychometric techniques can bring many benefits, including sophisticated reliability measures, improved construct validity, avoidance of exposure effects, and improved efficiency. In the present research we therefore use these techniques to develop a new test of a well-studied psychological capacity: melodic discrimination, the ability to detect differences between melodies. We calibrate and validate this test in a series of studies. Studies 1 and 2 respectively calibrate and validate an initial test version, while Studies 3 and 4 calibrate and validate an updated test version incorporating additional easy items. The results support the new test's viability, with evidence for strong reliability and construct validity. We discuss how these modern psychometric techniques may also be profitably applied to other areas of music psychology and psychological science in general.
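As an illustration of the item response theory machinery such tests build on (the two-parameter logistic model shown is standard IRT, but its use here as an example is mine, not taken from the melodic discrimination test itself):

```python
import math

def p_correct(theta, difficulty, discrimination=1.0):
    """Two-parameter logistic IRT model: probability that a person of
    ability theta answers an item of the given difficulty correctly."""
    return 1.0 / (1.0 + math.exp(-discrimination * (theta - difficulty)))

# In computerised adaptive testing, the next item is typically chosen
# to be maximally informative at the current ability estimate; for
# this model, information peaks where difficulty matches ability,
# i.e. where the success probability is near 0.5.
print(round(p_correct(0.0, 0.0), 2))   # matched item
print(round(p_correct(1.5, 0.0), 2))   # item too easy for this person
```

Calibration, in this vocabulary, means estimating each item's difficulty (and discrimination) from response data before the items are used adaptively.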
NASA Technical Reports Server (NTRS)
Mcmanus, John W.; Goodrich, Kenneth H.
1989-01-01
A research program investigating the use of Artificial Intelligence (AI) programming techniques to aid in the development of a Tactical Decision Generator (TDG) for Within-Visual-Range (WVR) air combat engagements is discussed. The application of AI methods for development and implementation of the TDG is presented. The history of the Adaptive Maneuvering Logic (AML) program is traced, and current versions of the AML program are compared and contrasted with the TDG system. The Knowledge-Based Systems (KBS) used by the TDG to aid in the decision-making process are outlined and example rules are presented. The results of tests to evaluate the performance of the TDG against a version of AML and against human pilots in the Langley Differential Maneuvering Simulator (DMS) are presented. To date, these results have shown significant performance gains in one-versus-one air combat engagements.
Lucena, E; Lucena, C; Gómez, M; Ortiz, J A; Ruiz, J; Arango, A; Diaz, C; Beuerman, C
1989-02-01
Sperm washing techniques based on the swim-up principle, used before inseminating the human oocyte in in-vitro fertilization and embryo transfer (IVF-ET) programmes, usually require prior centrifugation, which damages the sperm cell. A technique is described for separating sperm at laboratory temperature based on sperm migration-sedimentation principles, using two concentric tubes and recovering 70-90% forward-moving cells. A group of 17 patients managed with this method is presented. The results were an 85% fertilization rate, 4% polyspermia, and six clinical pregnancies.
A Test Reliability Analysis of an Abbreviated Version of the Pupil Control Ideology Form.
ERIC Educational Resources Information Center
Gaffney, Patrick V.
A reliability analysis was conducted of an abbreviated, 10-item version of the Pupil Control Ideology Form (PCI), using Cronbach's alpha (L. J. Cronbach, 1951) and the computation of the standard error of measurement. The PCI measures a teacher's orientation toward pupil control. Subjects were 168 preservice teachers from one private…
[Alienation and adaptation in English translation of traditional Chinese medicinal literature].
Liang, Jun-xiong; Wang, Guan-jun
2006-10-01
Alienation and adaptation are two principles and methods in translation, each with its own value. Alienation should be applied in translating linguistic content, in order to transfer the embedded cultural messages faithfully; the principle of adaptation should be followed in translating linguistic structure, because the different thinking patterns of Chinese and English result in great differences in linguistic structure. The translator must therefore express the original meaning faithfully while, on the other hand, transforming the linguistic structure to conform to the thinking habits of the readers, so as to make the translated version smoother. In brief, alienation and adaptation should complement each other in translation, making it a bridge connecting different cultures.
Juridical and ethical peculiarities in doping policy.
McNamee, Mike J; Tarasti, Lauri
2010-03-01
Criticisms of the ethical justification of antidoping legislation are not uncommon in the literatures of medical ethics, sports ethics and sports medicine. Critics of antidoping point to inconsistencies of principle in the application of legislation and the unjustifiability of ethical postures enshrined in the World Anti-Doping Code, a new version of which came into effect in January 2009. This article explores the arguments concerning the apparent legal peculiarities of antidoping legislation and their ethically salient features in terms of: notions of culpability, liability and guilt; aspects of potential duplication of punishments and the limitations of athlete privacy in antidoping practice and policy. It is noted that tensions still exist between legal and ethical principles and norms that require further critical attention.
Variational principle for scattering of light by dielectric particles
NASA Technical Reports Server (NTRS)
Yung, Y. L.
1978-01-01
Consideration is given to the work of Purcell and Pennypacker (1973), in which a dielectric particle is treated as an aggregate of N polarizable elements mounted on a cubic lattice. The simultaneous equations which result from the scattering problem are presented. The theory has been discussed for nonspherical and inhomogeneous objects whose dimensions are smaller than or comparable to the wavelength of the incident light. A more precise numerical treatment is derived to permit further progress. The variational principle is invoked, and the practical limit for the current version of the scheme is a dipole array on the order of 10,000 atoms. Limits on the scattering parameter due to the phase difference between neighboring atoms are discussed.
Statistical Evaluation of Time Series Analysis Techniques
NASA Technical Reports Server (NTRS)
Benignus, V. A.
1973-01-01
The performance of a modified version of NASA's multivariate spectrum analysis program is discussed. A multiple regression model was used to make the revisions. Performance improvements were documented and compared to the standard fast Fourier transform by Monte Carlo techniques.
Principles for system level electrochemistry
NASA Technical Reports Server (NTRS)
Thaller, L. H.
1986-01-01
The higher power and higher voltage levels anticipated for future space missions have required a careful review of the techniques currently in use to preclude battery problems that are related to the dispersion characteristics of the individual cells. Not only are the out-of-balance problems accentuated in these larger systems, but the thermal management considerations also require a greater degree of accurate design. Newer concepts which employ active cooling techniques are being developed which permit higher rates of discharge and tighter packing densities for the electrochemical components. This paper will put forward six semi-independent principles relating to battery systems. These principles will progressively address cell, battery and finally system related aspects of large electrochemical storage systems.
NASA Technical Reports Server (NTRS)
Smith, Terence R.; Menon, Sudhakar; Star, Jeffrey L.; Estes, John E.
1987-01-01
This paper provides a brief survey of the history, structure and functions of 'traditional' geographic information systems (GIS), and then suggests a set of requirements that large-scale GIS should satisfy, together with a set of principles for their satisfaction. These principles, which include the systematic application of techniques from several subfields of computer science to the design and implementation of GIS and the integration of techniques from computer vision and image processing into standard GIS technology, are discussed in some detail. In particular, the paper provides a detailed discussion of questions relating to appropriate data models, data structures and computational procedures for the efficient storage, retrieval and analysis of spatially-indexed data.
A Brief History of Limb Lengthening.
Birch, John G
2017-09-01
In the last 35 years, orthopaedic surgeons have witnessed 3 major advances in the technique of limb lengthening: "distraction osteogenesis" facilitated by Gavriil Ilizarov's method and his infinitely adaptable circular fixator with fine-wire bone fragment fixation; the introduction of "6-strut" computer-program-assisted circular fixators to effect complex deformity correction simultaneously; and the development of motorized intramedullary lengthening nails. However, the principles and associated complications of these techniques are based on observations by Codivilla, Putti, and Abbott from as much as 110 years ago. This review notes the contributions of these pioneers in limb lengthening, and the contribution of Thor Heyerdahl's principles of tolerance and diversity to the dissemination of Ilizarov's principles to the Western world.
Theoretical Approaches to Dealing with Somalia
2012-05-17
…because of a lack of assistance from the international community. To use Thomas Friedman's term, in The World Is Flat, Kaplan champions glocalization …and board games. Wal-Mart makes the global local: glocalization. …Russett, Grasping the Democratic Peace: Principles for a Post-Cold War World… Russett presents, as fact, that democracies do not war against each other. …Seth Kaplan calls his version of glocalization an enmeshing
ERIC Educational Resources Information Center
VANDERMEER, A. W.; AND OTHERS
This research involved the selection of two extant teaching films, development of tests on their content, revision of the films, and a comparison of the two versions. The films selected were "Why Foods Spoil" and "Atoms and Molecules." The subjects consisted of students in small-town schools from 5th to 12th grades.…
Soltani, Esmail; Bahrainian, Seyed Abdolmajid; Masjedi Arani, Abbas; Farhoudian, Ali; Gachkar, Latif
2016-06-01
Social anxiety disorder is often related to specific impairment or distress in different areas of life, including occupational, social and family settings. The purpose of the present study was to examine the psychometric properties of the Persian version of the social anxiety-acceptance and action questionnaire (SA-AAQ) in university students. In this descriptive cross-sectional study, 324 students from Shahid Beheshti University of Medical Sciences participated via the cluster sampling method during year 2015. Factor analysis by the principal component analysis method, internal consistency analysis, and convergent and divergent validity were conducted to examine the validity of the SA-AAQ. To calculate the reliability of the SA-AAQ, Cronbach's alpha and test-retest reliability were used. The results from factor analysis by the principal component analysis method yielded three factors that were named acceptance, action and non-judging of experience. The three-factor solution explained 51.82% of the variance. Evidence for the internal consistency of the SA-AAQ was obtained via calculating correlations between the SA-AAQ and its subscales. Support for convergent and discriminant validity of the SA-AAQ via its correlations with the acceptance and action questionnaire - II, social interaction anxiety scale, cognitive fusion questionnaire, believability of anxious feelings and thoughts questionnaire, valued living questionnaire and WHOQOL-BREF was obtained. The reliability of the SA-AAQ via calculating Cronbach's alpha and test-retest coefficients yielded values of 0.84 and 0.84, respectively. The Iranian version of the SA-AAQ has acceptable levels of psychometric properties in university students. The SA-AAQ is a valid and reliable measure to be utilized in research investigations and therapeutic interventions.
Soltani, Esmail; Bahrainian, Seyed Abdolmajid; Masjedi Arani, Abbas; Farhoudian, Ali; Gachkar, Latif
2016-01-01
Background Social anxiety disorder is often related to specific impairment or distress in different areas of life, including occupational, social and family settings. Objective The purpose of the present study was to examine the psychometric properties of the Persian version of the social anxiety-acceptance and action questionnaire (SA-AAQ) in university students. Materials and Methods In this descriptive cross-sectional study, 324 students from Shahid Beheshti University of Medical Sciences participated via the cluster sampling method during year 2015. Factor analysis by the principal component analysis method, internal consistency analysis, and convergent and divergent validity were conducted to examine the validity of the SA-AAQ. To calculate the reliability of the SA-AAQ, Cronbach’s alpha and test-retest reliability were used. Results The results from factor analysis by the principal component analysis method yielded three factors that were named acceptance, action and non-judging of experience. The three-factor solution explained 51.82% of the variance. Evidence for the internal consistency of the SA-AAQ was obtained via calculating correlations between the SA-AAQ and its subscales. Support for convergent and discriminant validity of the SA-AAQ via its correlations with the acceptance and action questionnaire - II, social interaction anxiety scale, cognitive fusion questionnaire, believability of anxious feelings and thoughts questionnaire, valued living questionnaire and WHOQOL-BREF was obtained. The reliability of the SA-AAQ via calculating Cronbach’s alpha and test-retest coefficients yielded values of 0.84 and 0.84, respectively. Conclusions The Iranian version of the SA-AAQ has acceptable levels of psychometric properties in university students. The SA-AAQ is a valid and reliable measure to be utilized in research investigations and therapeutic interventions. PMID:27803719
Al-Musawi, Nu'man M
2003-04-01
Using confirmatory factor analytic techniques on data generated from 200 students enrolled at the University of Bahrain, we obtained some construct validity and reliability data for the Arabic Version of the 1961 Group Personality Projective Test by Cassel and Khan. In contrast to the 5-factor model proposed for the Group Personality Projective Test, a 6-factor solution appeared justified for the Arabic Version of this test, suggesting some variance between the cultural groups in the United States and in Bahrain.
Integrating Agronomic Principles with Management Experience in Introductory Agronomy.
ERIC Educational Resources Information Center
Vorst, J. J.
1989-01-01
Explains the use of a cropping systems project to teach agronomic principles and crop management techniques, and to enhance communication skills. Provides a sample progress report instructions sheet which was used for the project. (Author/RT)
Klemm, Matthias; Blum, Johannes; Link, Dietmar; Hammer, Martin; Haueisen, Jens; Schweitzer, Dietrich
2016-09-01
Fluorescence lifetime imaging ophthalmoscopy (FLIO) is a new technique to detect changes in the human retina. The autofluorescence decay over time, generated by endogenous fluorophores, is measured in vivo. The strong autofluorescence of the crystalline lens, however, superimposes the intensity decay of the retina fluorescence, as the confocal principle is not able to suppress it sufficiently. Thus, the crystalline lens autofluorescence causes artifacts in the retinal fluorescence lifetimes determined from the intensity decays. Here, we present a new technique to suppress the autofluorescence of the crystalline lens by introducing an annular stop into the detection light path, which we call Schweitzer's principle. The efficacy of annular stops with an outer diameter of 7 mm and inner diameters of 1 to 5 mm is analyzed in an experimental setup using a model eye based on fluorescent dyes. Compared to the confocal principle, Schweitzer's principle with an inner diameter of 3 mm is able to reduce the simulated crystalline lens fluorescence to 4%, while 42% of the simulated retina fluorescence is preserved. Thus, we recommend the implementation of Schweitzer's principle in scanning laser ophthalmoscopes used for fundus autofluorescence measurements, especially the FLIO device, for improved image quality.
Cloutier, Jacinthe; Lafrance, Josée; Michallet, Bernard; Marcoux, Lyson; Cloutier, France
2015-03-01
The Canadian Interprofessional Health Collaborative recommends that future professionals be prepared for collaborative practice. To do so, they must learn the principles of interprofessional collaboration; to ascertain whether students are so predisposed, their attitude toward interprofessional learning must be assessed. In the French-Canadian context such a measuring tool has not yet been published. The purpose of this study is to translate into French an adapted version of the RIPLS questionnaire and to validate it for use with undergraduate students from seven health and social care programmes at a Canadian university. Following Vallerand's methodology for translating measuring instruments: (i) the forward-backward translation indicated that six items of the experimental French version of the RIPLS needed to be more specific; (ii) the experimental French version of the RIPLS seemed clear according to the pre-test assessing item clarity; (iii) evaluation of the content validity indicated that the experimental French version of the RIPLS presents good content validity and (iv) a very good internal consistency was obtained (α = 0.90; n = 141). Results indicate that the psychometric properties of the RIPLS in French are comparable to the English version, although a different factorial structure was found. The relevance of three of the 19 items on the RIPLS scale is questionable, resulting in a revised 16-item scale. Future research aimed at validating the translated French version of the RIPLS could also be conducted in another francophone cultural context.
Optimal startup control of a jacketed tubular reactor.
NASA Technical Reports Server (NTRS)
Hahn, D. R.; Fan, L. T.; Hwang, C. L.
1971-01-01
The optimal startup policy of a jacketed tubular reactor, in which a first-order, reversible, exothermic reaction takes place, is presented. A distributed maximum principle is presented for determining weak necessary conditions for optimality of a diffusional distributed parameter system. A numerical technique is developed for practical implementation of the distributed maximum principle. This involves the sequential solution of the state and adjoint equations, in conjunction with a functional gradient technique for iteratively improving the control function.
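The sequential state/adjoint solution with a functional gradient update described above can be illustrated on a toy scalar problem. The sketch below is not the reactor model from the paper; it minimizes J = ∫(x² + u²)dt subject to dx/dt = u, x(0) = 1, with all discretization choices (forward Euler, step size, gradient step α) purely illustrative.

```python
import numpy as np

# Toy functional-gradient iteration for min J = ∫ (x^2 + u^2) dt,
# subject to dx/dt = u, x(0) = 1 (a scalar stand-in for the paper's
# distributed-parameter reactor problem; all numbers are illustrative).
N, T = 100, 1.0
dt = T / N
alpha = 0.1                      # gradient step size
u = np.zeros(N)                  # initial guess for the control

def cost(u):
    x, J = 1.0, 0.0
    for k in range(N):
        J += (x**2 + u[k]**2) * dt
        x += u[k] * dt           # forward Euler state integration
    return J

for it in range(200):
    # 1) integrate the state equation forward
    x = np.empty(N + 1); x[0] = 1.0
    for k in range(N):
        x[k + 1] = x[k] + u[k] * dt
    # 2) integrate the adjoint equation backward: dλ/dt = -2x, λ(T) = 0
    lam = np.empty(N + 1); lam[-1] = 0.0
    for k in range(N - 1, -1, -1):
        lam[k] = lam[k + 1] + 2.0 * x[k + 1] * dt
    # 3) improve the control along the negative gradient ∂H/∂u = 2u + λ
    u -= alpha * (2.0 * u + lam[:N])

print(cost(u))  # lower than the cost of the initial guess u = 0
```

Each pass mirrors the paper's scheme: a forward sweep of the state equation, a backward sweep of the adjoint equation, then a gradient correction of the control function.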
Quantum Gauss-Jordan Elimination and Simulation of Accounting Principles on Quantum Computers
NASA Astrophysics Data System (ADS)
Diep, Do Ngoc; Giang, Do Hoang; Van Minh, Nguyen
2017-06-01
The paper is devoted to a version of Quantum Gauss-Jordan Elimination and its applications. In the first part, we construct the Quantum Gauss-Jordan Elimination (QGJE) Algorithm and estimate the complexity of computing the Reduced Row Echelon Form (RREF) of N × N matrices. The main result asserts that QGJE has computation time of order 2^(N/2). The second part is devoted to a new idea of simulating accounting by quantum computing. We first express the actual accounting principles in a purely mathematical language. Then, we simulate the accounting principles on quantum computers. We show that all accounting actions are exhausted by the described basic actions. The main problems of accounting reduce to systems of linear equations in the economic model of Leontief. In this simulation, we use our Quantum Gauss-Jordan Elimination to solve the problems, and the quantum computation is faster than its classical counterpart by a square-root factor.
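For orientation, the classical counterpart of the reduction the paper quantizes can be sketched in a few lines; this is ordinary Gauss-Jordan elimination to RREF (the quantum speedup itself is not reproduced here, and the small Leontief-style system is an invented example).

```python
from fractions import Fraction

def rref(matrix):
    """Reduce a matrix to Reduced Row Echelon Form by classical
    Gauss-Jordan elimination (the operation QGJE performs on a
    quantum register)."""
    m = [[Fraction(x) for x in row] for row in matrix]
    rows, cols = len(m), len(m[0])
    pivot_row = 0
    for col in range(cols):
        # find a row with a nonzero entry in this column
        pr = next((r for r in range(pivot_row, rows) if m[r][col] != 0), None)
        if pr is None:
            continue
        m[pivot_row], m[pr] = m[pr], m[pivot_row]
        # scale the pivot row so the pivot is 1
        pivot = m[pivot_row][col]
        m[pivot_row] = [x / pivot for x in m[pivot_row]]
        # eliminate the column from all other rows
        for r in range(rows):
            if r != pivot_row and m[r][col] != 0:
                factor = m[r][col]
                m[r] = [a - factor * b for a, b in zip(m[r], m[pivot_row])]
        pivot_row += 1
        if pivot_row == rows:
            break
    return m

# Solve a small linear system via the augmented matrix [A | b]:
# 2x + y = 5, x + 3y = 10
aug = rref([[2, 1, 5], [1, 3, 10]])
print(aug)  # last column holds the solution x = 1, y = 3
```

Reading off the last column of the RREF gives the solution, which is exactly how the paper reduces accounting problems to linear algebra.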
How Do We See Art: An Eye-Tracker Study
Quiroga, Rodrigo Quian; Pedreira, Carlos
2011-01-01
We describe the pattern of fixations of subjects looking at figurative and abstract paintings from different artists (Molina, Mondrian, Rembrandt, della Francesca) and at modified versions in which different aspects of these art pieces were altered with simple digital manipulations. We show that the fixations of the subjects followed some general common principles (e.g., being attracted to saliency regions) but with a large variability for the figurative paintings, according to the subject’s personal appreciation and knowledge. In particular, we found different gazing patterns depending on whether the subject saw the original or the modified version of the painting first. We conclude that the study of gazing patterns obtained by using the eye-tracker technology gives a useful approach to quantify how subjects observe art. PMID:21941476
Boosted Schwarzschild metrics from a Kerr–Schild perspective
NASA Astrophysics Data System (ADS)
Mädler, Thomas; Winicour, Jeffrey
2018-02-01
The Kerr–Schild version of the Schwarzschild metric contains a Minkowski background which provides a definition of a boosted black hole. There are two Kerr–Schild versions corresponding to ingoing or outgoing principal null directions. We show that the two corresponding Minkowski backgrounds and their associated boosts have an unexpected difference. We analyze this difference and discuss the implications in the nonlinear regime for the gravitational memory effect resulting from the ejection of massive particles from an isolated system. We show that the nonlinear effect agrees with the linearized result based upon the retarded Green function only if the velocity of the ejected particle corresponds to a boost symmetry of the ingoing Minkowski background. A boost with respect to the outgoing Minkowski background is inconsistent with the absence of ingoing radiation from past null infinity.
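For reference, the Kerr–Schild form splits the metric into a flat background plus a term along a null direction; for Schwarzschild it reads, schematically (in units with G = c = 1):

```latex
g_{ab} \;=\; \eta_{ab} \;+\; \frac{2M}{r}\, l_a l_b,
\qquad \eta^{ab}\, l_a l_b = 0,
```

where $\eta_{ab}$ is the Minkowski background and $l_a$ is a principal null direction; the ingoing and outgoing choices of $l_a$ yield the two versions whose backgrounds the abstract compares.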
Collaborating with human factors when designing an electronic textbook
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ratner, J.A.; Zadoks, R.I.; Attaway, S.W.
The development of on-line engineering textbooks presents new challenges to authors to effectively integrate text and tools in an electronic environment. By incorporating human factors principles of interface design and cognitive psychology early in the design process, a team at Sandia National Laboratories was able to make the end product more usable and shorten the prototyping and editing phases. A critical issue was simultaneous development of paper and on-line versions of the textbook. In addition, interface consistency presented difficulties with distinct goals and limitations for each medium. Many of these problems were resolved swiftly with human factors input using templates, style guides and iterative usability testing of both paper and on-line versions. Writing style continuity was also problematic with numerous authors contributing to the text.
Nearfield acoustic holography. I - Theory of generalized holography and the development of NAH
NASA Technical Reports Server (NTRS)
Maynard, J. D.; Williams, E. G.; Lee, Y.
1985-01-01
Because its underlying principles are so fundamental, holography has been studied and applied in many areas of science. Recently, a technique has been developed which takes the maximum advantage of the fundamental principles and extracts much more information from a hologram than is customarily associated with such a measurement. In this paper the fundamental principles of holography are reviewed, and a sound radiation measurement system, called nearfield acoustic holography (NAH), which fully exploits the fundamental principles, is described.
Jain, Rahi; Venkatasubramanian, Padma
2014-01-01
Quality Ayurvedic herbal medicines are potential, low-cost solutions for addressing the contemporary healthcare needs of both the Indian and global communities. Correlating Ayurvedic herbal preparations with modern processing principles (MPPs) can help develop new technology, and apply appropriate existing technology, for scaling up production of the medicines, which is necessary to meet growing demand. Understanding the fundamental Ayurvedic principles behind formulation and processing is also important for improving the dosage forms. Even though the Ayurvedic industry has adopted technologies from the food, chemical and pharmaceutical industries, there has been no systematic study correlating the traditional and modern processing methods. This study is an attempt to provide a possible correlation between the Ayurvedic processing methods and MPPs. A systematic literature review was performed to identify the Ayurvedic processing methods, collecting information on medicine preparation methods from English editions of classical Ayurveda texts. Correlation between traditional methods and MPPs was done based on the techniques used in Ayurvedic drug processing. It was observed that Ayurvedic medicine preparation involves two major types of processes, namely extraction and separation. Extraction uses membrane-rupturing and solute-diffusion principles, while separation uses volatility, adsorption, and size-exclusion principles. The study provides systematic documentation of the methods used in Ayurveda for herbal drug preparation, along with their interpretation in terms of MPPs. This is a first step toward improving or replacing traditional techniques. New technologies, or existing ones, can then be used to improve the dosage forms and scale up production while maintaining the Ayurvedic principles embodied in the traditional techniques.
NASA Technical Reports Server (NTRS)
Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.
1991-01-01
The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Versions TEA and TMA are two dimensional numerical electromagnetic scattering codes based upon the Finite Difference Time Domain Technique (FDTD) first proposed by Yee in 1966. The supplied codes are two versions of our current two dimensional FDTD code set. This manual provides a description of the codes and corresponding results for the default scattering problem. The manual is organized into eleven sections: introduction, Version TEA and TMA code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include files (TEACOM.FOR, TMACOM.FOR), a section briefly discussing scattering width computations, a section discussing the scattering results, a sample problem setup section, a new problem checklist, references and figure titles.
NASA Technical Reports Server (NTRS)
Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.
1991-01-01
The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Versions TEA and TMA are two dimensional electromagnetic scattering codes based on the Finite Difference Time Domain Technique (FDTD) first proposed by Yee in 1966. The supplied codes are two versions of our current FDTD code set. This manual provides a description of the codes and corresponding results for the default scattering problem. The manual is organized into eleven sections: introduction, Version TEA and TMA code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include files (TEACOM.FOR, TMACOM.FOR), a section briefly discussing scattering width computations, a section discussing the scattering results, a sample problem setup section, a new problem checklist, references, and figure titles.
NASA Technical Reports Server (NTRS)
Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.
1991-01-01
The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Version C is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). The supplied version of the code is one version of our current three dimensional FDTD code set. This manual provides a description of the code and corresponding results for several scattering problems. The manual is organized into fourteen sections: introduction, description of the FDTD method, operation, resource requirements, Version C code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONC.FOR), a section briefly discussing Radar Cross Section (RCS) computations, a section discussing some scattering results, a sample problem setup section, a new problem checklist, references and figure titles.
NASA Technical Reports Server (NTRS)
Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.
1991-01-01
The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Version D is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). The supplied version of the code is one version of our current three dimensional FDTD code set. This manual provides a description of the code and corresponding results for several scattering problems. The manual is organized into fourteen sections: introduction, description of the FDTD method, operation, resource requirements, Version D code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMOND.FOR), a section briefly discussing Radar Cross Section (RCS) computations, a section discussing some scattering results, a sample problem setup section, a new problem checklist, references and figure titles.
NASA Technical Reports Server (NTRS)
Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.
1992-01-01
The Penn State Finite Difference Time Domain (FDTD) Electromagnetic Scattering Code Version A is a three dimensional numerical electromagnetic scattering code based on the Finite Difference Time Domain technique. The supplied version of the code is one version of our current three dimensional FDTD code set. The manual provides a description of the code and the corresponding results for the default scattering problem. The manual is organized into 14 sections: introduction, description of the FDTD method, operation, resource requirements, Version A code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONA.FOR), a section briefly discussing radar cross section (RCS) computations, a section discussing the scattering results, a sample problem setup section, a new problem checklist, references, and figure titles.
NASA Technical Reports Server (NTRS)
Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.
1992-01-01
The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Version C is a three-dimensional numerical electromagnetic scattering code based on the Finite Difference Time Domain (FDTD) technique. The supplied version of the code is one version of our current three-dimensional FDTD code set. The manual given here provides a description of the code and corresponding results for several scattering problems. The manual is organized into 14 sections: introduction, description of the FDTD method, operation, resource requirements, Version C code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONC.FOR), a section briefly discussing radar cross section computations, a section discussing some scattering results, a new problem checklist, references, and figure titles.
NASA Technical Reports Server (NTRS)
Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.
1991-01-01
The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Version B is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). The supplied version of the code is one version of our current three dimensional FDTD code set. This manual provides a description of the code and corresponding results for several scattering problems. The manual is organized into fourteen sections: introduction, description of the FDTD method, operation, resource requirements, Version B code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONB.FOR), a section briefly discussing Radar Cross Section (RCS) computations, a section discussing some scattering results, a sample problem setup section, a new problem checklist, references and figure titles.
Building Structure Housing: Case Study of Community Housing in Kendari City
NASA Astrophysics Data System (ADS)
Umar, M. Z.; Faslih, A.; Arsyad, M.; Sjamsu, A. S.; Kadir, I.
2017-11-01
Housing development has been pioneered through a simple home construction program to reduce production costs. A simple housing program was developed in Kendari City. The purpose of this study is to show the principles used to reduce production costs for type 36 houses in Kendari City. The selected architectural objects are the lower, middle and upper structures of the type 36 house. Data were collected through observation and in-depth discussion with construction workers. The analysis technique used in this research was descriptive narrative analysis in the form of tabulated data. This study concluded that several cost-reduction principles are applied in the structure of public housing buildings. Principles of speed appear in construction techniques, such as using cigarette packs as foundation pads, using mortar so that walls can be stood up quickly, and spacing the mortar manually with two fingers of the hand. Principles of economy apply to materials, such as eliminating gravel from the concrete, using sand to fill the soil, building the foundation without sand or empty stone, and forming the ring beam with triangular reinforcement.
A Low Power Digital Accumulation Technique for Digital-Domain CMOS TDI Image Sensor.
Yu, Changwei; Nie, Kaiming; Xu, Jiangtao; Gao, Jing
2016-09-23
In this paper, an accumulation technique suitable for digital-domain CMOS time delay integration (TDI) image sensors is proposed to reduce power consumption without degrading the imaging rate. Because the quantization codes obtained from different pixel exposures of the same object vary only slightly, the pixel array is divided into two groups: one performs coarse quantization of the high bits only, and the other performs fine quantization of the low bits. The complete quantization codes are then composed from the results of both the coarse and the fine quantization. This equivalent operation reduces the total number of bits that must be quantized. In the 0.18 µm CMOS process, two versions of 16-stage digital-domain CMOS TDI image sensor chains based on a 10-bit successive approximation register (SAR) analog-to-digital converter (ADC), with and without the proposed technique, are designed. The simulation results show that the average power consumption per slice of the two versions is 6.47 × 10⁻⁸ J/line and 7.4 × 10⁻⁸ J/line, respectively, while the linearity of the two versions is 99.74% and 99.99%, respectively.
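The coarse-and-fine composition can be sketched as bit arithmetic. The split below (5 coarse + 5 fine bits of a 10-bit code) is illustrative only, not the sensor's actual partition:

```python
def compose_code(sample, coarse_bits=5, total_bits=10):
    """Toy model of the coarse-and-fine composition described above.

    One pixel group quantizes only the high (coarse) bits, the other
    only the low (fine) bits; the complete code is their concatenation.
    Bit widths here are illustrative, not taken from the sensor design.
    """
    fine_bits = total_bits - coarse_bits
    coarse = sample >> fine_bits             # high bits, from the coarse group
    fine = sample & ((1 << fine_bits) - 1)   # low bits, from the fine group
    return (coarse << fine_bits) | fine      # recomposed complete code

# The recomposition reproduces the full 10-bit code exactly.
assert compose_code(0b1011001110) == 0b1011001110
```

Since each group resolves only part of the word, each conversion needs fewer comparator cycles than a full 10-bit SAR conversion, which is where the power saving comes from.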
The principle of superposition and its application in ground-water hydraulics
Reilly, Thomas E.; Franke, O. Lehn; Bennett, Gordon D.
1987-01-01
The principle of superposition, a powerful mathematical technique for analyzing certain types of complex problems in many areas of science and technology, has important applications in ground-water hydraulics and modeling of ground-water systems. The principle of superposition states that problem solutions can be added together to obtain composite solutions. This principle applies to linear systems governed by linear differential equations. This report introduces the principle of superposition as it applies to ground-water hydrology and provides background information, discussion, illustrative problems with solutions, and problems to be solved by the reader.
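The additivity property is easy to verify numerically for any linear system. The sketch below uses a generic second-difference matrix as an illustrative stand-in for a discretized 1-D steady ground-water-flow equation; the "well" stresses are invented:

```python
import numpy as np

# Discretized linear system A h = q: h = heads, q = stresses (sources).
# A is a standard second-difference operator, an illustrative stand-in
# for any linear differential equation governing the system.
n = 5
A = (np.diag([-2.0] * n)
     + np.diag([1.0] * (n - 1), 1)
     + np.diag([1.0] * (n - 1), -1))

q1 = np.array([1.0, 0.0, 0.0, 0.0, 0.0])   # stress from well 1
q2 = np.array([0.0, 0.0, 0.0, 2.0, 0.0])   # stress from well 2

h1 = np.linalg.solve(A, q1)                # response to well 1 alone
h2 = np.linalg.solve(A, q2)                # response to well 2 alone
h_both = np.linalg.solve(A, q1 + q2)       # response to both wells

# Superposition: the composite response equals the sum of the
# individual responses, because the governing system is linear.
assert np.allclose(h_both, h1 + h2)
```

This is the report's central statement in miniature: solutions to the individual problems, added together, solve the composite problem.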
The principle of superposition and its application in ground-water hydraulics
Reilly, T.E.; Franke, O.L.; Bennett, G.D.
1984-01-01
The principle of superposition, a powerful mathematical technique for analyzing certain types of complex problems in many areas of science and technology, has important applications in ground-water hydraulics and modeling of ground-water systems. The principle of superposition states that solutions to individual problems can be added together to obtain solutions to complex problems. This principle applies to linear systems governed by linear differential equations. This report introduces the principle of superposition as it applies to ground-water hydrology and provides background information, discussion, illustrative problems with solutions, and problems to be solved by the reader. (USGS)
Integrated Science Syllabus for Malaysia, Forms I-III, Revised Version.
ERIC Educational Resources Information Center
Ministry of Education, Kuala Lumpur (Malaysia).
As a revised version of the Scottish Integrated Science, an outline of the Malaysian science course is presented in this volume for use as a guideline for science teaching at the secondary level. A total of 16 sections is included in three forms which are intended to be covered in three years. The topics include: lab techniques, unit systems,…
NASA Astrophysics Data System (ADS)
Panther Bishoff, Jennifer
In recent years, higher education has undergone many changes. The advent of assessment, accountability, and a newfound focus on teaching has required faculty to examine how they are teaching. Administrators and faculty are beginning to recognize that learning is not a "one size fits all" enterprise. To this end, Chickering and Gamson developed an inventory that examined faculty utilization of the Seven Principles of Good Practice in Undergraduate Education. The seven principles are encouraging faculty-student interaction, cooperative learning, active learning, giving prompt feedback, emphasizing time on task, communicating high expectations, and respecting diverse talents and ways of learning. Chickering and Gamson, as well as many other researchers, determined that these seven principles are hallmarks of successful undergraduate education. Community colleges are important institutions to study, as many students begin their higher education at two-year colleges. Most students are also required to take one or more science classes for their general education requirements; therefore, many students must take at least one general chemistry course. Both community colleges and chemistry are rarely studied in the literature, which makes this study important. Community college general chemistry instructors were surveyed using an online version of Chickering and Gamson's Faculty Inventory for the Seven Principles of Good Practice in Undergraduate Education. Responses were analyzed, and it was discovered that not only did instructors utilize the principles to different extents, but there were also differences between genders as well as between the specific actions related to each principle.
[How to be prudent with synthetic biology. Synthetic Biology and the precautionary principle].
Rodríguez López, Blanca
2014-01-01
Synthetic biology is a new discipline with two faces: on the one hand, it promises benefits that could alleviate some of the ills that plague mankind; on the other, like all technologies, it carries risks. In view of these risks, the most critical and concerned observers invoke the precautionary principle, commonly applied when an activity or new technology creates risks to the environment and/or human health; yet, far from being universally accepted, it is currently one of the most controversial principles. This paper analyzes the risks and benefits of synthetic biology and the relevance of applying the precautionary principle to it, proceeding as follows. The first part focuses on synthetic biology. It first characterizes the discipline, with special attention to what is novel compared with so-called "genetic engineering". It then discusses both the benefits and the risks associated with it, and concludes with a review of the efforts currently being made to control or minimize the risks. The second part analyzes the precautionary principle and its possible relevance to synthetic biology. The different versions and interpretations of the principle, and the various criticisms leveled at it, are reviewed. Finally, after discarding the precautionary principle as a useful tool, some recent proposals that assess technologies by taking into account not only their risks but also their benefits are judged more appropriate.
Separation and sorting of cells in microsystems using physical principles
NASA Astrophysics Data System (ADS)
Lee, Gi-Hun; Kim, Sung-Hwan; Ahn, Kihoon; Lee, Sang-Hoon; Park, Joong Yull
2016-01-01
In the last decade, microfabrication techniques have been combined with microfluidics and applied to cell biology. Utilizing such new techniques, various cell studies have been performed for the research of stem cells, immune cells, cancer, neurons, etc. Among the various biological applications of microtechnology-based platforms, cell separation technology has been highly regarded in biological and clinical fields for sorting different types of cells, finding circulating tumor cells (CTCs), and blood cell separation, amongst other things. Many cell separation methods have been created using various physical principles. Representatively, these include hydrodynamic, acoustic, dielectrophoretic, magnetic, optical, and filtering methods. In this review, each of these methods will be introduced, and their physical principles and sample applications described. Each physical principle has its own advantages and disadvantages. The engineers who design the systems and the biologists who use them should understand the pros and cons of each method or principle, to broaden the use of microsystems for cell separation. Continuous development of microsystems for cell separation will lead to new opportunities for diagnosing CTCs and cancer metastasis, as well as other elements in the bloodstream.
High-Performance Java Codes for Computational Fluid Dynamics
NASA Technical Reports Server (NTRS)
Riley, Christopher; Chatterjee, Siddhartha; Biswas, Rupak; Biegel, Bryan (Technical Monitor)
2001-01-01
The computational science community is reluctant to write large-scale, computationally intensive applications in Java due to concerns over Java's poor performance, despite the claimed software engineering advantages of its object-oriented features. Naive Java implementations of numerical algorithms can perform poorly compared to corresponding Fortran or C implementations. To achieve high performance, Java applications must be designed with good performance as a primary goal. This paper presents the object-oriented design and implementation of two real-world applications from the field of Computational Fluid Dynamics (CFD): a finite-volume fluid flow solver (LAURA, from NASA Langley Research Center), and an unstructured mesh adaptation algorithm (2D_TAG, from NASA Ames Research Center). This work builds on our previous experience with the design of high-performance numerical libraries in Java. We examine the performance of the applications using the currently available Java infrastructure and show that the Java version of the flow solver LAURA performs almost within a factor of 2 of the original procedural version. Our Java version of the mesh adaptation algorithm 2D_TAG performs within a factor of 1.5 of its original procedural version on certain platforms. Our results demonstrate that object-oriented software design principles are not necessarily inimical to high performance.
NASA Technical Reports Server (NTRS)
Bi, Lei; Yang, Ping; Liu, Chao; Yi, Bingqi; Baum, Bryan A.; Van Diedenhoven, Bastiaan; Iwabuchi, Hironobu
2014-01-01
A fundamental problem in remote sensing and radiative transfer simulations involving ice clouds is the ability to compute accurate optical properties for individual ice particles. While relatively simple and intuitively appealing, the conventional geometric-optics method (CGOM) is used frequently for the solution of light scattering by ice crystals. Due to the approximations in the ray-tracing technique, the CGOM accuracy is not well quantified; as a result, uncertainties are introduced that can impact many applications. Improvements in the Invariant Imbedding T-matrix method (II-TM) and the Improved Geometric-Optics Method (IGOM) provide a mechanism to assess the aforementioned uncertainties. The results computed by the II-TM+IGOM are considered a benchmark because the II-TM solves Maxwell's equations from first principles and is applicable to particle size parameters ranging into the domain in which the IGOM has reasonable accuracy. To assess the uncertainties of the CGOM in remote sensing and radiative transfer simulations, two independent optical property datasets of hexagonal columns were developed for sensitivity studies using the CGOM and the II-TM+IGOM, respectively. Ice cloud bulk optical properties obtained from the two datasets are compared and subsequently applied to retrieve the optical thickness and effective diameter from Moderate Resolution Imaging Spectroradiometer (MODIS) measurements. Additionally, the bulk optical properties are tested in broadband radiative transfer (RT) simulations using the general circulation model (GCM) version of the Rapid Radiative Transfer Model (RRTMG) that is adopted in the National Center for Atmospheric Research (NCAR) Community Atmosphere Model (CAM, version 5.1).
For MODIS retrievals, the mean bias introduced by applying the CGOM in the shortwave bands (0.86 and 2.13 micrometers) can be up to 5% in the optical thickness and as high as 20% in the effective diameter, depending on cloud optical thickness and effective diameter. In the MODIS infrared window bands centered at 8.5, 11, and 12 micrometers, the biases in the optical thickness and effective diameter are up to 12% and 10%, respectively. The CGOM-based simulation errors in ice cloud radiative forcing calculations are on the order of 10 W/m².
What's so different about Lacan's approach to psychoanalysis?
Fink, Bruce
2011-12-01
Clinical work based on Lacanian principles is rarely compared in the psychoanalytic literature with that based on other principles. The author attempts to highlight a few important theoretical differences regarding language, desire, affect, and time between a Lacanian approach and certain others that lead to differences in focus and technique, related, for example, to interpretation, scansion, and countertransference. Lacanian techniques are illustrated with brief clinical vignettes. In the interest of confidentiality, identifying information and certain circumstances have been changed or omitted in the material presented.
Chemistry of vaporization of refractory materials
NASA Technical Reports Server (NTRS)
Gilles, P. W.
1975-01-01
A discussion is given of the principles of physical chemistry important in vaporization studies, notably the concepts of equilibrium, phase behavior, thermodynamics, solid solution, and kinetics. The important factors influencing equilibrium vaporization phenomena are discussed and illustrated. A proper course of a vaporization study consisting of 9 stages is proposed. The important experimental techniques of Knudsen effusion, Langmuir vaporization and mass spectrometry are discussed. The principles, the factors, the course of a study and the experimental techniques and procedures are illustrated by recent work on the Ti-O system.
Principles of ESCA and application to metal corrosion, coating and lubrication
NASA Technical Reports Server (NTRS)
Wheeler, D. R.
1978-01-01
The principles of ESCA (electron spectroscopy for chemical analysis) were described by comparison with other spectroscopic techniques. The advantages and disadvantages of ESCA as compared to other surface sensitive analytical techniques were evaluated. The use of ESCA was illustrated by actual applications to oxidation of steel and Rene 41, the chemistry of lubricant additives on steel, and the composition of sputter deposited hard coatings. A bibliography of material that was useful for further study of ESCA was presented and commented upon.
Teaching General Principles and Applications of Dendrogeomorphology.
ERIC Educational Resources Information Center
Butler, David R.
1987-01-01
Tree-ring analysis in geomorphology can be incorporated into a number of undergraduate methods in order to reconstruct the history of a variety of geomorphic processes. Discusses dendrochronology, general principles of dendrogeomorphology, field sampling methods, laboratory techniques, and examples of applications. (TW)
Banerjee, Arindam; Ghosh, Joydeep
2004-05-01
Competitive learning mechanisms for clustering, in general, suffer from poor performance for very high-dimensional (>1000) data because of "curse of dimensionality" effects. In applications such as document clustering, it is customary to normalize the high-dimensional input vectors to unit length, and it is sometimes also desirable to obtain balanced clusters, i.e., clusters of comparable sizes. The spherical kmeans (spkmeans) algorithm, which normalizes the cluster centers as well as the inputs, has been successfully used to cluster normalized text documents in 2000+ dimensional space. Unfortunately, like regular kmeans and its soft expectation-maximization-based version, spkmeans tends to generate extremely imbalanced clusters in high-dimensional spaces when the desired number of clusters is large (tens or more). This paper first shows that the spkmeans algorithm can be derived from a certain maximum likelihood formulation using a mixture of von Mises-Fisher distributions as the generative model, and in fact, it can be considered a batch-mode version of (normalized) competitive learning. The proposed generative model is then adapted in a principled way to yield three frequency-sensitive competitive learning variants that are applicable to static data and produce high-quality, well-balanced clusters for high-dimensional data. Like kmeans, each iteration is linear in the number of data points and in the number of clusters for all three algorithms. A frequency-sensitive algorithm to cluster streaming data is also proposed. Experimental results on clustering of high-dimensional text data sets are provided to show the effectiveness and applicability of the proposed techniques. Index Terms: balanced clustering, expectation maximization (EM), frequency-sensitive competitive learning (FSCL), high-dimensional clustering, kmeans, normalized data, scalable clustering, streaming data, text clustering.
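The spkmeans step described above (normalize inputs and centers, assign by cosine similarity) can be sketched as follows. This is a minimal illustration, not the authors' code; the synthetic data and parameters are assumptions.

```python
import numpy as np

def spkmeans(X, k, iters=20, seed=0):
    """Minimal spherical kmeans: unit-length inputs and centers,
    assignment by maximum cosine similarity (dot product)."""
    rng = np.random.default_rng(seed)
    X = X / np.linalg.norm(X, axis=1, keepdims=True)      # unit-length inputs
    C = X[rng.choice(len(X), k, replace=False)]           # initial centers
    for _ in range(iters):
        labels = (X @ C.T).argmax(axis=1)                 # cosine-similarity assignment
        for j in range(k):
            members = X[labels == j]
            if len(members):                              # keep old center if cluster empties
                c = members.sum(axis=0)
                C[j] = c / np.linalg.norm(c)              # renormalize the center
    return labels, C

# Two well-separated synthetic clusters on the unit sphere.
X = np.vstack([np.random.default_rng(1).normal(m, 0.1, (30, 5)) for m in (1.0, -1.0)])
labels, C = spkmeans(X, 2)
assert np.allclose(np.linalg.norm(C, axis=1), 1.0)        # centers stay on the sphere
```

The frequency-sensitive variants in the paper modify the assignment step to penalize clusters that have already absorbed many points, which is what yields the balanced clusters; that penalty is not shown here.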
Magnetic Levitation Coupled with Portable Imaging and Analysis for Disease Diagnostics.
Knowlton, Stephanie M; Yenilmez, Bekir; Amin, Reza; Tasoglu, Savas
2017-02-19
Currently, many clinical diagnostic procedures are complex, costly, inefficient, and inaccessible to a large population in the world. The requirements for specialized equipment and trained personnel require that many diagnostic tests be performed at remote, centralized clinical laboratories. Magnetic levitation is a simple yet powerful technique that can be applied to levitate cells, which are suspended in a paramagnetic solution and placed in a magnetic field, at a position determined by equilibrium between a magnetic force and a buoyancy force. Here, we present a versatile platform technology designed for point-of-care diagnostics which uses magnetic levitation coupled to microscopic imaging and automated analysis to determine the density distribution of a patient's cells as a useful diagnostic indicator. We present two platforms operating on this principle: (i) a smartphone-compatible version of the technology, where the built-in smartphone camera is used to image cells in the magnetic field and a smartphone application processes the images and measures the density distribution of the cells, and (ii) a self-contained version where a camera board is used to capture images and an embedded processing unit with attached thin-film-transistor (TFT) screen measures and displays the results. Demonstrated applications include: (i) measuring the altered distribution of a cell population with a disease phenotype compared to a healthy phenotype, which is applied to sickle cell disease diagnosis, and (ii) separation of different cell types based on their characteristic densities, which is applied to separate white blood cells from red blood cells for white blood cell cytometry. These applications, as well as future extensions of the essential density-based measurements enabled by this portable, user-friendly platform technology, will significantly enhance disease diagnostic capabilities at the point of care.
Murphy, Devin; Sawczyn, Kelly K; Quinn, Gwendolyn P
2012-04-01
Most pediatric education materials are designed for a parent audience. Social marketing techniques rely on the principles called the "4 P's": product, price, place, and promotion. The objective of this study was to test the design, readability, likelihood to read, and overall opinion of a pediatric fertility preservation brochure with patients, parents, and providers. Qualitative face-to-face interviews were conducted at the Children's Cancer Center in Tampa, FL, and All Children's Hospital in St. Petersburg, FL, with male and female cancer patients and survivors aged 12-21 (N = 7), their parents (N = 11), and healthcare providers (N = 6). Patients, survivors, parents, and healthcare providers were given two versions of gender-concordant brochures on fertility preservation designed for both pediatric oncology patients and their parents. Design, readability, likelihood to read, and overall opinion were assessed from the interviews in order to identify facilitators of involving patients in fertility preservation discussions. Parents and teens differed on the design, readability, and likelihood to read, with the highest discord being preferences for the medical terminology used in the brochures. While parents remarked that much of the language was "too advanced," the majority of teens explained that they understood the terminology and preferred that it remain in the brochure. Overall feedback from all three groups was utilized to revise the brochures into final versions to increase the likelihood of reading. Information about the development of the 4 P's of social marketing highlights needs from the intended audience. Barriers to patient education in pediatrics can be ameliorated when using the social marketing approach. Copyright © 2012 North American Society for Pediatric and Adolescent Gynecology. Published by Elsevier Inc. All rights reserved.
A technique for fast and accurate measurement of hand volumes using Archimedes' principle.
Hughes, S; Lau, J
2008-03-01
A new technique for measuring hand volumes using Archimedes' principle is described. The technique involves the immersion of a hand in a water container placed on an electronic balance. The volume is given by the change in weight divided by the density of water. This technique was compared with the more conventional technique of immersing an object in a container with an overflow spout and collecting and weighing the volume of overflow water. The hand volume of two subjects was measured. Hand volumes were 494 +/- 6 ml and 312 +/- 7 ml for the immersion method and 476 +/- 14 ml and 302 +/- 8 ml for the overflow method for the two subjects respectively. Using plastic test objects, the mean difference between the actual and measured volume was -0.3% and 2.0% for the immersion and overflow techniques respectively. This study shows that hand volumes can be obtained more quickly with the immersion method than with the overflow method. The technique could find an application in clinics where frequent hand volumes are required.
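The calculation underlying the immersion method is a one-liner: volume equals the change in the balance reading divided by the density of water. A minimal sketch, with illustrative numbers (the density value assumes room-temperature water; none of the figures are the study's data):

```python
WATER_DENSITY_G_PER_ML = 0.998  # assumed, for water at ~21 degrees C

def immersed_volume_ml(weight_before_g: float, weight_after_g: float,
                       density_g_per_ml: float = WATER_DENSITY_G_PER_ML) -> float:
    """Volume of the immersed object from the balance's change in reading
    (Archimedes' principle: the displaced water's weight registers on the balance)."""
    return (weight_after_g - weight_before_g) / density_g_per_ml

# A 494 g increase in the balance reading corresponds to ~495 ml.
print(round(immersed_volume_ml(2000.0, 2494.0), 1))
```

In practice one would also subtract any weight change from the wrist entering the water and control for surface tension effects, which the abstract's +/- figures reflect.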
High Performance Object-Oriented Scientific Programming in Fortran 90
NASA Technical Reports Server (NTRS)
Norton, Charles D.; Decyk, Viktor K.; Szymanski, Boleslaw K.
1997-01-01
We illustrate how Fortran 90 supports object-oriented concepts by example of plasma particle computations on the IBM SP. Our experience shows that Fortran 90 and object-oriented methodology give high performance while providing a bridge from Fortran 77 legacy codes to modern programming principles. All of our object-oriented Fortran 90 codes execute more quickly than the equivalent C++ versions, yet the abstraction modelling capabilities used for scientific programming are comparably powerful.
The p-version of the finite element method in incremental elasto-plastic analysis
NASA Technical Reports Server (NTRS)
Holzer, Stefan M.; Yosibash, Zohar
1993-01-01
Whereas the higher-order versions of the finite element method (the p- and hp-versions) are fairly well established as highly efficient methods for monitoring and controlling the discretization error in linear problems, little has been done to exploit their benefits in elasto-plastic structural analysis. Aspects of incremental elasto-plastic finite element analysis which are particularly amenable to improvements by the p-version are discussed. These theoretical considerations are supported by several numerical experiments. First, an example for which an analytical solution is available is studied. It is demonstrated that the p-version performs very well even in cycles of elasto-plastic loading and unloading, not only compared to the traditional h-version but also with respect to the exact solution. Finally, an example of considerable practical importance - the analysis of a cold-worked lug - is presented, which demonstrates how the modeling tools offered by higher-order finite element techniques can contribute to an improved approximation of practical problems.
Applying Evolutionary Terminology Auditing to SNOMED CT
Ceusters, Werner
2010-01-01
Evolutionary Terminology Auditing is a technique designed to measure quality improvements of terminologies over successive versions. It uses the most recent version of a terminology as a benchmark and assumes that changes in the underlying ontology correspond to changes either in that part of reality that is covered by the terminology, or in the authors' understanding thereof, if not in the 'state of the art' in general. Applied to SNOMED CT over 18 versions, it reveals that at the concept level only minimal improvements are obtained and that the second assumption holds for far fewer changes than one would expect. It is recommended that future versions of SNOMED CT provide more explicit documentation for each introduced change. PMID:21346948
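The benchmarking idea above (score an older version against the most recent one) can be sketched with set arithmetic over concept identifiers. This is an illustrative simplification, not the ETA implementation; the version contents are invented.

```python
# Illustrative sketch: concepts retained into the benchmark count as
# coverage that survived; concepts later removed suggest earlier errors
# or changes in understanding; concepts added measure coverage gained.

def audit(old_version: set, benchmark: set) -> dict:
    return {
        "retained": len(old_version & benchmark),   # survived to the benchmark
        "removed": len(old_version - benchmark),    # later judged wrong or obsolete
        "added": len(benchmark - old_version),      # coverage gained since
    }

v1 = {"C1", "C2", "C3", "C4"}                 # hypothetical early version
v18 = {"C1", "C2", "C5", "C6", "C7"}          # hypothetical benchmark (latest)
print(audit(v1, v18))  # {'retained': 2, 'removed': 2, 'added': 3}
```

A real audit would additionally classify each change (was reality changed, or only the authors' understanding?), which requires the per-change documentation the paper recommends SNOMED CT provide.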
The seasonal-cycle climate model
NASA Technical Reports Server (NTRS)
Marx, L.; Randall, D. A.
1981-01-01
The seasonal cycle run, which will become the control run for comparison with runs utilizing codes and parameterizations developed by outside investigators, is discussed. The climate model currently exists in two parallel versions: one running on the Amdahl and the other running on the CYBER 203. These two versions are as nearly identical as machine capability and the requirement for high-speed performance allow. Developmental changes are made on the Amdahl/CMS version for ease of testing and rapidity of turnaround. The changes are subsequently incorporated into the CYBER 203 version using vectorization techniques where speed improvement can be realized. The 400 day seasonal cycle run serves as a control run for both medium- and long-range climate forecasts and sensitivity studies.
Lakes and reservoirs—Guidelines for study design and sampling
2015-09-29
The “National Field Manual for the Collection of Water-Quality Data” (NFM) is an online report with separately published chapters that provides the protocols and guidelines by which U.S. Geological Survey personnel obtain the data used to assess the quality of the Nation’s surface-water and groundwater resources. Chapter A10 reviews limnological principles, describes the characteristics that distinguish lakes from reservoirs, and provides guidance for developing temporal and spatial sampling strategies and data-collection approaches to be used in lake and reservoir environmental investigations. Within this chapter are references to other chapters of the NFM that provide more detailed guidelines related to specific topics and more detailed protocols for the quality assurance and assessment of the lake and reservoir data. Protocols and procedures to address and document the quality of lake and reservoir investigations are adapted from, or referenced to, the protocols and standard operating procedures contained in related chapters of this NFM. Before 2017, the U.S. Geological Survey (USGS) “National Field Manual for the Collection of Water-Quality Data” (NFM) chapters were released in the USGS Techniques of Water-Resources Investigations series. Effective in 2018, new and revised NFM chapters are being released in the USGS Techniques and Methods series; this series change does not affect the content and format of the NFM. More information is in the general introduction to the NFM (USGS Techniques and Methods, book 9, chapter A0, 2018) at https://doi.org/10.3133/tm9A0. The authoritative current versions of NFM chapters are available in the USGS Publications Warehouse at https://pubs.er.usgs.gov. Comments, questions, and suggestions related to the NFM can be addressed to nfm-owq@usgs.gov.
Sadhu, Kalyan K; Mizukami, Shin; Watanabe, Shuji; Kikuchi, Kazuya
2011-05-01
Development of protein labeling techniques with small molecules is attractive because this approach promises to overcome the limitations of fluorescent proteins in live cell imaging. The technology functionalizes proteins with small molecules and is anticipated to facilitate the expansion of various protein assay methods. A new, straightforward aggregation- and elimination-based technique for a protein labeling system has been developed with a versatile emissive range of fluorophores. These fluorophores have been applied to demonstrate their efficiency for protein labeling by exploiting the same basic principle. A genetically modified version of a class A type β-lactamase has been used as the tag protein (BL-tag). The strength of the aggregation interaction between a fluorophore and a quencher governs the elimination step of the quencher from the probes, which ultimately controls the swiftness of the protein labeling strategy. The elimination process can be modulated by varying the nature of the fluorophore. This diversity facilitates the study of the competitive binding order among the synthesized probes toward the BL-tag labeling method. The aggregation- and elimination-based BL-tag technique has been explored to establish an order of color labeling from an equimolar mixture of the labeling probes in solution. The qualitative and quantitative determination of the ordering of the probes toward labeling has been executed through SDS-PAGE and time-dependent fluorescence intensity enhancement measurements, respectively. The desirable multiple-wavelength fluorescence labeling probes for the BL-tag technology have been developed and demonstrate the broad applicability of this labeling technology to live cell imaging with coumarin and fluorescein derivatives using confocal microscopy.
Data Citation Concept for CMIP6
NASA Astrophysics Data System (ADS)
Stockhause, M.; Toussaint, F.; Lautenschlager, M.; Lawrence, B.
2015-12-01
There is a broad consensus among data centers and scientific publishers on Force 11's 'Joint Declaration of Data Citation Principles'. Putting these principles into operation is not always straightforward. The focus for CMIP6 data citations lies on the citation of data created by others and used in an analysis underlying the article; for this source data, usually no article by the data creators is available ('stand-alone data publication'). The planned data citation granularities are model data (data collections containing all datasets provided for the project by a single model) and experiment data (data collections containing all datasets for a scientific experiment run by a single model). In the case of large international projects or activities like CMIP, the data is commonly stored and disseminated by multiple repositories in a federated data infrastructure such as the Earth System Grid Federation (ESGF). The individual repositories are subject to different institutional and national policies. A Data Management Plan (DMP) will define a certain standard for the repositories, including data handling procedures. Another aspect of CMIP data relevant for data citations is its dynamic nature. For such large data collections, datasets are added, revised, and retracted for years before the data collection becomes stable enough for a data citation entity covering all model or simulation data. Thus, a critical issue for ESGF is data consistency, requiring thorough dataset versioning to enable the identification of the data collection in the cited version. Currently, the ESGF is designed for accessing the latest dataset versions. Data citation introduces the necessity to support older and retracted dataset versions by storing metadata even beyond data availability (data unpublished in ESGF). Apart from ESGF, other infrastructure components exist for CMIP which provide information that has to be connected to the CMIP6 data, e.g.
ES-DOC providing information on models and simulations and the IPCC Data Distribution Centre (DDC) storing a subset of data together with available metadata (ES-DOC) for the long-term reuse of the interdisciplinary community. Other connections exist to standard project vocabularies, to personal identifiers (e.g. ORCID), or to data products (including provenance information).
NASA Astrophysics Data System (ADS)
Bogusz, Michael
1993-01-01
The need for a systematic methodology for the analysis of aircraft electromagnetic compatibility (EMC) problems is examined. The available computer aids used in aircraft EMC analysis are assessed, and a theoretical basis is established for the complex algorithms which identify and quantify electromagnetic interactions. An overview is presented of one particularly well-established aircraft antenna-to-antenna EMC analysis code, the Aircraft Inter-Antenna Propagation with Graphics (AAPG) Version 07 software. The specific new algorithms created to compute cone geodesics and their associated path losses and to graph the physical coupling path are discussed. These algorithms are validated against basic principles. Loss computations apply the uniform geometrical theory of diffraction and are subsequently compared to measurement data. The increased modelling and analysis capabilities of the newly developed AAPG Version 09 are compared to those of Version 07. Several models of real aircraft, namely the Electronic Systems Trainer Challenger, are generated and provided as a basis for this preliminary comparative assessment. Issues such as software reliability, algorithm stability, and quality of hardcopy output are also discussed.
Weston, Michele; Haudek, Kevin C.; Prevost, Luanna; Urban-Lurain, Mark; Merrill, John
2015-01-01
One challenge in science education assessment is that students often focus on surface features of questions rather than the underlying scientific principles. We investigated how student written responses to constructed-response questions about photosynthesis vary based on two surface features of the question: the species of plant and the order of two question prompts. We asked four versions of the question with different combinations of the two plant species and order of prompts in an introductory cell biology course. We found that there was not a significant difference in the content of student responses to versions of the question stem with different species or order of prompts, using both computerized lexical analysis and expert scoring. We conducted 20 face-to-face interviews with students to further probe the effects of question wording on student responses. During the interviews, we found that students thought that the plant species was neither relevant nor confusing when answering the question. Students identified the prompts as both relevant and confusing. However, this confusion was not specific to a single version. PMID:25999312
Enhanced speed in fluorescence imaging using beat frequency multiplexing
NASA Astrophysics Data System (ADS)
Mikami, Hideharu; Kobayashi, Hirofumi; Wang, Yisen; Hamad, Syed; Ozeki, Yasuyuki; Goda, Keisuke
2016-03-01
Fluorescence imaging using radiofrequency-tagged emission (FIRE) is an emerging technique that enables higher imaging speed (namely, temporal resolution) in fluorescence microscopy than conventional fluorescence imaging techniques such as confocal microscopy and wide-field microscopy. It works by using multiple intensity-modulated fields in an interferometric setup as excitation fields and applying frequency-division multiplexing to the fluorescence signals. Unfortunately, despite its high potential, FIRE's imaging speed is constrained by two practical limitations: signal bandwidth and signal detection efficiency. The signal bandwidth is limited by that of the acousto-optic deflector (AOD) employed in the setup, which is typically 100-200 MHz over the spectral range of fluorescence excitation (400-600 nm). The signal detection efficiency is limited by poor spatial mode-matching between the two interfering fields that produce a modulated excitation field. Here we present a method to overcome these limitations and thus achieve higher imaging speed than the prior version of FIRE. Our method doubles the signal bandwidth and achieves nearly optimal mode matching, enabling an imaging speed limited by the lifetime of the target fluorophore rather than by the imaging system itself. The higher bandwidth and better signal detection efficiency work synergistically, because higher bandwidth requires higher signal levels to avoid contributions of shot noise and amplifier noise to the fluorescence signal. Owing to its unprecedentedly high-speed performance, our method has a wide variety of applications in cancer detection, drug discovery, and regenerative medicine.
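The frequency-division multiplexing idea behind FIRE can be demonstrated with a toy signal: several channels are tagged with distinct beat frequencies, summed into one detector trace, and recovered individually by lock-in demodulation. The frequencies, amplitudes, and sample rate below are illustrative assumptions, not instrument parameters.

```python
import numpy as np

fs = 1e6                                    # sample rate (Hz), assumed
t = np.arange(0, 0.01, 1 / fs)              # 10 ms record
tags = {1e4: 0.8, 2.3e4: 0.5, 4.1e4: 0.2}   # beat frequency -> channel amplitude

# Single detector trace: sum of amplitude-tagged carriers.
signal = sum(a * np.cos(2 * np.pi * f * t) for f, a in tags.items())

# Lock-in demodulation: multiply by each carrier and average; orthogonality
# of the carriers over the record isolates each channel's amplitude.
for f, a in tags.items():
    demod = 2 * np.mean(signal * np.cos(2 * np.pi * f * t))
    assert abs(demod - a) < 0.01
```

The same averaging-window trade-off that makes this demo work is what ties FIRE's imaging speed to signal bandwidth: shorter integration windows allow faster pixel rates but require better-separated (higher-bandwidth) beat frequencies.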
Toroidal Optical Microresonators as Single-Particle Absorption Spectrometers
NASA Astrophysics Data System (ADS)
Heylman, Kevin D.
Single-particle and single-molecule measurements are invaluable tools for characterizing structural and energetic properties of molecules and nanomaterials. Photothermal microscopy in particular is an ultrasensitive technique capable of single-molecule resolution. In this thesis I introduce a new form of photothermal spectroscopy involving toroidal optical microresonators as detectors and a pair of non-interacting lasers as pump and probe for performing single-target absorption spectroscopy. The first three chapters discuss the motivation, design principles, underlying theory, and fabrication process for the microresonator absorption spectrometer. With an early version of the spectrometer, I demonstrate photothermal mapping and all-optical tuning with toroids of different geometries in Chapter 4. In Chapter 5, I discuss photothermal mapping and measurement of the absolute absorption cross-sections of individual carbon nanotubes. For the next generation of measurements I incorporate all of the advances described in Chapter 2, including a double-modulation technique to improve detection limits and a tunable pump laser for spectral measurements on single gold nanoparticles. In Chapter 6 I observe sharp Fano resonances in the spectra of gold nanoparticles and describe them with a theoretical model. In Chapter 7, I continue studying this photonic-plasmonic hybrid system, exploring the thermal tuning of the Fano resonance phase while quantifying the Fisher information. The new method of photothermal single-particle absorption spectroscopy discussed in this thesis has reached record detection limits for microresonator sensing and is within striking distance of becoming the first single-molecule room-temperature absorption spectrometer.
Suggested criteria for evaluating systems engineering methodologies
NASA Technical Reports Server (NTRS)
Gates, Audrey; Paul, Arthur S.; Gill, Tepper L.
1989-01-01
Systems engineering is the application of mathematical and scientific principles to practical ends in the life-cycle of a system. A methodology for systems engineering is a carefully developed, relatively complex procedure or process for applying these mathematical and scientific principles. There are many systems engineering methodologies (or possibly many versions of a few methodologies) currently in use in government and industry. These methodologies are usually designed to meet the needs of a particular organization. It has been observed, however, that many technical and non-technical problems arise when inadequate systems engineering methodologies are applied by organizations to their systems development projects. Various criteria for evaluating systems engineering methodologies are discussed. Such criteria are developed to assist methodology-users in identifying and selecting methodologies that best fit the needs of the organization.
Assessment of Control Techniques for Reducing Emissions from Locomotive Engines
DOT National Transportation Integrated Search
1973-04-01
The primary objective of this study was to determine the most effective method of reducing emissions of oxides of nitrogen from a two-cylinder version of an EMD series 567C locomotive engine. The NOx control techniques selected for use in this study ...
The principles of the Brazilian Unified Health System, studied based on similitude analysis
de Pontes, Ana Paula Munhen; de Oliveira, Denize Cristina; Gomes, Antonio Marcos Tosoli
2014-01-01
Objectives: to analyze and compare the incorporation of the ethical-doctrinal and organizational principles into the social representations of the Unified Health System (SUS) among health professionals. Method: a study grounded in Social Representations Theory, undertaken with 125 subjects in eight health institutions in Rio de Janeiro. The free word association technique was applied to the induction term "SUS", and the words evoked were analyzed using the Vergès matrix and similitude analysis techniques. Results: the professionals' social representations were found to vary with their level of education, and those with higher education represent a subgroup responsible for the process of representational change identified. This result was confirmed through similitude analysis. Conclusion: a process of representational change is ongoing, in which the professionals have incorporated the principles of the SUS into their symbolic constructions. Similitude analysis was shown to be a fruitful technique for research in nursing. PMID:24553704
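A minimal sketch of the Vergès-style prototypical analysis mentioned above, using invented evocation data (not the study's): freely evoked words are cross-classified by evocation frequency and mean evocation rank, and high-frequency, early-rank words form the candidate central core of the representation.

```python
from collections import defaultdict

evocations = [                       # one ordered answer list per participant
    ["health", "universality", "queue"],
    ["health", "equity", "queue"],
    ["health", "universality", "delay"],
    ["equity", "health", "queue"],
]

freq = defaultdict(int)
rank_sum = defaultdict(float)
for answer in evocations:
    for rank, word in enumerate(answer, start=1):
        freq[word] += 1
        rank_sum[word] += rank

mean_rank = {w: rank_sum[w] / freq[w] for w in freq}
freq_cut = sum(freq.values()) / len(freq)          # simple frequency cutoff
rank_cut = sum(mean_rank.values()) / len(mean_rank)  # simple rank cutoff

def quadrant(word):
    """Classify a word into one of the four Vergès quadrants."""
    high_freq = freq[word] >= freq_cut
    early = mean_rank[word] <= rank_cut
    if high_freq and early:
        return "central core"
    if high_freq:
        return "first periphery"
    if early:
        return "contrast zone"
    return "second periphery"
```

The cutoffs here are plain means for illustration; actual studies choose them from the evocation distribution.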
Development Context Driven Change Awareness and Analysis Framework
NASA Technical Reports Server (NTRS)
Sarma, Anita; Branchaud, Josh; Dwyer, Matthew B.; Person, Suzette; Rungta, Neha
2014-01-01
Recent work on workspace monitoring allows conflict prediction early in the development process; however, these approaches mostly use syntactic differencing techniques to compare different program versions. In contrast, traditional change-impact analysis techniques analyze related versions of the program only after the code has been checked into the master repository. We propose a novel approach, DeCAF (Development Context Analysis Framework), that leverages the development context to scope a change impact analysis technique. The goal is to characterize the impact of each developer on other developers in the team. There are various client applications such as task prioritization, early conflict detection, and providing advice on testing that can benefit from such a characterization. The DeCAF framework leverages information from the development context to bound the iDiSE change impact analysis technique to analyze only the parts of the code base that are of interest. Bounding the analysis can enable DeCAF to efficiently compute the impact of changes using a combination of program dependence and symbolic execution based approaches.
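A minimal sketch of the core idea behind a program-dependence-based change impact analysis (the entity names are invented, and DeCAF's actual iDiSE analysis additionally uses symbolic execution): model the program as a dependence graph and take the impact of a change as forward reachability from the changed nodes.

```python
from collections import deque

# Edges point from an entity to the entities that depend on it.
dependents = {
    "parse": ["validate", "report"],
    "validate": ["save"],
    "report": [],
    "save": [],
    "log": [],
}

def impact_set(changed):
    """All entities transitively affected by the changed ones (BFS)."""
    seen = set(changed)
    work = deque(changed)
    while work:
        node = work.popleft()
        for dep in dependents.get(node, []):
            if dep not in seen:
                seen.add(dep)
                work.append(dep)
    return seen
```

Bounding the analysis, as DeCAF does, amounts to restricting this traversal to the subgraph relevant to the current development context.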
Development Context Driven Change Awareness and Analysis Framework
NASA Technical Reports Server (NTRS)
Sarma, Anita; Branchaud, Josh; Dwyer, Matthew B.; Person, Suzette; Rungta, Neha; Wang, Yurong; Elbaum, Sebastian
2014-01-01
Recent work on workspace monitoring allows conflict prediction early in the development process; however, these approaches mostly use syntactic differencing techniques to compare different program versions. In contrast, traditional change-impact analysis techniques analyze related versions of the program only after the code has been checked into the master repository. We propose a novel approach, DeCAF (Development Context Analysis Framework), that leverages the development context to scope a change impact analysis technique. The goal is to characterize the impact of each developer on other developers in the team. There are various client applications such as task prioritization, early conflict detection, and providing advice on testing that can benefit from such a characterization. The DeCAF framework leverages information from the development context to bound the iDiSE change impact analysis technique to analyze only the parts of the code base that are of interest. Bounding the analysis can enable DeCAF to efficiently compute the impact of changes using a combination of program dependence and symbolic execution based approaches.
NASA Technical Reports Server (NTRS)
da Silva, Arlindo; Redder, Christopher
2010-01-01
MERRA is a NASA reanalysis for the satellite era using a major new version of the Goddard Earth Observing System Data Assimilation System Version 5 (GEOS-5). The project focuses on historical analyses of the hydrological cycle on a broad range of weather and climate time scales and places the NASA EOS suite of observations in a climate context. The characterization of uncertainty in reanalysis fields is a commonly requested feature by users of such data. While intercomparison with reference data sets is common practice for ascertaining the realism of the datasets, such studies are typically restricted to long-term climatological statistics and seldom provide state-dependent measures of the uncertainties involved. In principle, variational data assimilation algorithms can produce error estimates for the analysis variables (typically surface pressure, winds, temperature, moisture, and ozone) consistent with the assumed background and observation error statistics. However, these "perceived error estimates" are expensive to obtain and are limited by the somewhat simplistic errors assumed in the algorithm. The observation-minus-forecast residuals (innovations), a by-product of any assimilation system, constitute a powerful tool for estimating the systematic and random errors in the analysis fields. Unfortunately, such data are usually not readily available with reanalysis products, often requiring the tedious decoding of large datasets in not-so-user-friendly file formats. With MERRA we have introduced a gridded version of the observations/innovations used in the assimilation process, using the same grid and data formats as the regular datasets. This dataset enables the user to conveniently perform observing-system-related analyses and error estimates. The scope of this dataset will be briefly described.
We will present a systematic analysis of MERRA innovation time series for the conventional observing system, including maximum-likelihood estimates of background and observation errors, as well as global bias estimates. Starting with the joint PDF of innovations and analysis increments at observation locations we propose a technique for diagnosing bias among the observing systems, and document how these contextual biases have evolved during the satellite era covered by MERRA.
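As a toy numerical illustration (synthetic data and invented error parameters, not MERRA output) of why innovation statistics are informative: when background and observation errors are mutually uncorrelated, the innovation mean estimates the systematic bias and the innovation variance estimates the sum of the background and observation error variances.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
sigma_b, sigma_o, bias = 1.5, 1.0, 0.3        # assumed "true" values

truth = rng.normal(0.0, 5.0, n)               # unknown true state
background = truth + rng.normal(0.0, sigma_b, n)   # model forecast
obs = truth + bias + rng.normal(0.0, sigma_o, n)   # biased observations

# Observation-minus-forecast residuals (innovations).
innovations = obs - background
est_bias = innovations.mean()                 # -> systematic error
est_total_var = innovations.var()             # -> sigma_b**2 + sigma_o**2
```

Separating the two variance contributions requires further assumptions (e.g., maximum-likelihood fitting of spatial innovation covariances, as the abstract mentions); the sum, however, falls out directly.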
NASA Astrophysics Data System (ADS)
da Silva, A.; Redder, C. R.
2010-12-01
MERRA is a NASA reanalysis for the satellite era using a major new version of the Goddard Earth Observing System Data Assimilation System Version 5 (GEOS-5). The project focuses on historical analyses of the hydrological cycle on a broad range of weather and climate time scales and places the NASA EOS suite of observations in a climate context. The characterization of uncertainty in reanalysis fields is a commonly requested feature by users of such data. While intercomparison with reference data sets is common practice for ascertaining the realism of the datasets, such studies are typically restricted to long-term climatological statistics and seldom provide state-dependent measures of the uncertainties involved. In principle, variational data assimilation algorithms can produce error estimates for the analysis variables (typically surface pressure, winds, temperature, moisture, and ozone) consistent with the assumed background and observation error statistics. However, these "perceived error estimates" are expensive to obtain and are limited by the somewhat simplistic errors assumed in the algorithm. The observation-minus-forecast residuals (innovations), a by-product of any assimilation system, constitute a powerful tool for estimating the systematic and random errors in the analysis fields. Unfortunately, such data are usually not readily available with reanalysis products, often requiring the tedious decoding of large datasets in not-so-user-friendly file formats. With MERRA we have introduced a gridded version of the observations/innovations used in the assimilation process, using the same grid and data formats as the regular datasets. This dataset enables the user to conveniently perform observing-system-related analyses and error estimates. The scope of this dataset will be briefly described.
We will present a systematic analysis of MERRA innovation time series for the conventional observing system, including maximum-likelihood estimates of background and observation errors, as well as global bias estimates. Starting with the joint PDF of innovations and analysis increments at observation locations we propose a technique for diagnosing bias among the observing systems, and document how these contextual biases have evolved during the satellite era covered by MERRA.
Business Management for Independent Schools. Third Edition.
ERIC Educational Resources Information Center
National Association of Independent Schools, Boston, MA.
This business management manual discusses school accounting and reporting principles, in particular financial management, computerization, and records retention techniques. Described first are basic accounting principles, plant funds, endowment funds, operational funds, the chart of accounts, and financial statements of the school's annual financial…
Curricular Guidelines for Dental Auxiliary Radiology.
ERIC Educational Resources Information Center
Journal of Dental Education, 1981
1981-01-01
AADS curricular guidelines suggest objectives for these areas of dental auxiliary radiology: physical principles of X-radiation in dentistry, related radiobiological concepts, principles of radiologic health, radiographic technique, x-ray films and intensifying screens, factors contributing to film quality, darkroom, and normal variations in…
Laboratory reptile surgery: principles and techniques.
Alworth, Leanne C; Hernandez, Sonia M; Divers, Stephen J
2011-01-01
Reptiles used for research and instruction may require surgical procedures, including biopsy, coelomic device implantation, ovariectomy, orchidectomy, and esophagostomy tube placement, to accomplish research goals. Providing veterinary care for unanticipated clinical problems may require surgical techniques such as amputation, bone or shell fracture repair, and coeliotomy. Although many principles of surgery are common between mammals and reptiles, important differences in anatomy and physiology exist. Veterinarians who provide care for these species should be aware of these differences. Most reptiles undergoing surgery are small and require specific instrumentation and positioning. In addition, because of the wide variety of unique physiologic and anatomic characteristics among snakes, chelonians, and lizards, different techniques may be necessary for different reptiles. This overview describes many common reptile surgery techniques and their application for research purposes or to provide medical care to research subjects.
NASA Technical Reports Server (NTRS)
Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.
1992-01-01
The Penn State Finite Difference Time Domain Electromagnetic Scattering Code version D is a 3-D numerical electromagnetic scattering code based upon the finite difference time domain (FDTD) technique. The manual provides a description of the code and corresponding results for several scattering problems. The manual is organized into 14 sections: introduction; description of the FDTD method; operation; resource requirements; version D code capabilities; a brief description of the default scattering geometry; a brief description of each subroutine; a description of the include file; a section briefly discussing radar cross section computations; a section discussing some scattering results; a sample problem setup section; a new problem checklist; references; and figure titles. The FDTD technique models transient electromagnetic scattering and interactions with objects of arbitrary shape and/or material composition. In the FDTD method, Maxwell's curl equations are discretized in time and space, and all derivatives (temporal and spatial) are approximated by central differences.
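A minimal one-dimensional sketch of the FDTD update step described above (normalized units and an invented Gaussian source, not the Penn State code): Maxwell's curl equations on a staggered grid, with all derivatives replaced by central differences.

```python
import numpy as np

nx, nt = 200, 300
ez = np.zeros(nx)          # electric field samples
hy = np.zeros(nx - 1)      # magnetic field, staggered half a cell
courant = 0.5              # c*dt/dx, below the stability limit of 1

for step in range(nt):
    # Central-difference approximations of the two curl equations.
    hy += courant * (ez[1:] - ez[:-1])            # dH/dt = curl E
    ez[1:-1] += courant * (hy[1:] - hy[:-1])      # dE/dt = curl H
    # Soft Gaussian source injected at one grid cell.
    ez[50] += np.exp(-((step - 30) / 10.0) ** 2)
```

The untouched end cells (`ez[0]`, `ez[-1]`) act as perfectly conducting walls; the 3-D code generalizes the same leapfrog update to the full Yee lattice with absorbing boundaries.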
Simple System for Isothermal DNA Amplification Coupled to Lateral Flow Detection
Roskos, Kristina; Hickerson, Anna I.; Lu, Hsiang-Wei; Ferguson, Tanya M.; Shinde, Deepali N.; Klaue, Yvonne; Niemz, Angelika
2013-01-01
Infectious disease diagnosis in point-of-care settings can be greatly improved through integrated, automated nucleic acid testing devices. We have developed an early prototype for a low-cost system which executes isothermal DNA amplification coupled to nucleic acid lateral flow (NALF) detection in a mesofluidic cartridge attached to a portable instrument. Fluid handling inside the cartridge is facilitated through one-way passive valves, flexible pouches, and electrolysis-driven pumps, which promotes a compact and inexpensive instrument design. The closed-system disposable prevents workspace amplicon contamination. The cartridge design is based on standard scalable manufacturing techniques such as injection molding. Nucleic acid amplification occurs in a two-layer pouch that enables efficient heat transfer. We have demonstrated as proof of principle the amplification and detection of Mycobacterium tuberculosis (M.tb) genomic DNA in the cartridge, using either Loop Mediated Amplification (LAMP) or the Exponential Amplification Reaction (EXPAR), both coupled to NALF detection. We envision that a refined version of this cartridge, including upstream sample preparation coupled to amplification and detection, will enable fully-automated sample-in to answer-out infectious disease diagnosis in primary care settings of low-resource countries with high disease burden. PMID:23922706
Investment, regulation, and uncertainty
Smyth, Stuart J; McDonald, Jillian; Falck-Zepeda, Jose
2014-01-01
As with any technological innovation, time refines the technology, improving upon the original version of the innovative product. The initial GM crops had single traits for either herbicide tolerance or insect resistance. Current varieties have both of these traits stacked together, and in many cases other abiotic and biotic traits have also been stacked. This innovation requires investment. While this is relatively straightforward, certain conditions need to exist such that investments can be facilitated. The principal requirement for investment is that regulatory frameworks render consistent and timely decisions. If the certainty of regulatory outcomes weakens, the potential for changes in investment patterns increases. This article provides a summary background to the leading plant breeding technologies that are either currently being used to develop new crop varieties or are in the pipeline to be applied to plant breeding within the next few years. Challenges for existing regulatory systems are highlighted. Utilizing an option value approach from the investment literature, an assessment of uncertainty regarding the regulatory approval for these varying techniques is undertaken. This research highlights which technology development options have the greatest degree of uncertainty and hence which ones might be expected to see an investment decline. PMID:24499745
Automatic structural matching of 3D image data
NASA Astrophysics Data System (ADS)
Ponomarev, Svjatoslav; Lutsiv, Vadim; Malyshev, Igor
2015-10-01
A new image matching technique is described. It is implemented as an object-independent hierarchical structural juxtaposition algorithm based on an alphabet of simple object-independent contour structural elements. The structural matching applied implements an optimized method of walking through a truncated tree of all possible juxtapositions of two sets of structural elements. The algorithm was initially developed for dealing with 2D images such as aerospace photographs, and it turned out to be sufficiently robust and reliable for successfully matching pictures of natural landscapes taken in different seasons, from different aspect angles, and by different sensors (visible optical, IR, and SAR pictures, as well as depth maps and geographical vector-type maps). In the version reported here, the algorithm is enhanced by additionally using the third spatial coordinate of observed points on object surfaces. Thus, it is now capable of matching images of 3D scenes in the tasks of automatic navigation of extremely low flying unmanned vehicles or autonomous terrestrial robots. The basic principles of 3D structural description and matching of images are described, and examples of image matching are presented.
Schueller, Stephen M; Riley, William T; Brown, C Hendricks; Cuijpers, Pim; Duan, Naihua; Kwasny, Mary J; Stiles-Shields, Colleen; Cheung, Ken
2015-01-01
In recent years, there has been increasing discussion of the limitations of traditional randomized controlled trial (RCT) methodologies for the evaluation of eHealth and mHealth interventions, and in particular, the requirement that these interventions be locked down during evaluation. Locking down these interventions locks in defects and eliminates the opportunities for quality improvement and adaptation to the changing technological environment, often leading to validation of tools that are outdated by the time that trial results are published. Furthermore, because behavioral intervention technologies change frequently during real-world deployment, even if a tested intervention were deployed in the real world, its shelf life would be limited. We argue that RCTs will have greater scientific and public health value if they focus on the evaluation of intervention principles (rather than a specific locked-down version of the intervention), allowing for ongoing quality improvement modifications to the behavioral intervention technology based on the core intervention principles, while continuously improving the functionality and maintaining technological currency. This paper is an initial proposal of a framework and methodology for the conduct of trials of intervention principles (TIPs) aimed at minimizing the risks of in-trial changes to intervention technologies and maximizing the potential for knowledge acquisition. The focus on evaluation of intervention principles using clinical and usage outcomes has the potential to provide more generalizable and durable information than trials focused on a single intervention technology. PMID:26155878
Mohr, David C; Schueller, Stephen M; Riley, William T; Brown, C Hendricks; Cuijpers, Pim; Duan, Naihua; Kwasny, Mary J; Stiles-Shields, Colleen; Cheung, Ken
2015-07-08
In recent years, there has been increasing discussion of the limitations of traditional randomized controlled trial (RCT) methodologies for the evaluation of eHealth and mHealth interventions, and in particular, the requirement that these interventions be locked down during evaluation. Locking down these interventions locks in defects and eliminates the opportunities for quality improvement and adaptation to the changing technological environment, often leading to validation of tools that are outdated by the time that trial results are published. Furthermore, because behavioral intervention technologies change frequently during real-world deployment, even if a tested intervention were deployed in the real world, its shelf life would be limited. We argue that RCTs will have greater scientific and public health value if they focus on the evaluation of intervention principles (rather than a specific locked-down version of the intervention), allowing for ongoing quality improvement modifications to the behavioral intervention technology based on the core intervention principles, while continuously improving the functionality and maintaining technological currency. This paper is an initial proposal of a framework and methodology for the conduct of trials of intervention principles (TIPs) aimed at minimizing the risks of in-trial changes to intervention technologies and maximizing the potential for knowledge acquisition. The focus on evaluation of intervention principles using clinical and usage outcomes has the potential to provide more generalizable and durable information than trials focused on a single intervention technology.
Klemm, Matthias; Blum, Johannes; Link, Dietmar; Hammer, Martin; Haueisen, Jens; Schweitzer, Dietrich
2016-01-01
Fluorescence lifetime imaging ophthalmoscopy (FLIO) is a new technique to detect changes in the human retina. The autofluorescence decay over time, generated by endogenous fluorophores, is measured in vivo. The strong autofluorescence of the crystalline lens, however, is superimposed on the intensity decay of the retinal fluorescence, as the confocal principle cannot suppress it sufficiently. Thus, the crystalline lens autofluorescence causes artifacts in the retinal fluorescence lifetimes determined from the intensity decays. Here, we present a new technique to suppress the autofluorescence of the crystalline lens by introducing an annular stop into the detection light path, which we call Schweitzer’s principle. The efficacy of annular stops with an outer diameter of 7 mm and inner diameters of 1 to 5 mm is analyzed in an experimental setup using a model eye based on fluorescent dyes. Compared to the confocal principle, Schweitzer’s principle with an inner diameter of 3 mm reduces the simulated crystalline lens fluorescence to 4%, while preserving 42% of the simulated retina fluorescence. Thus, we recommend the implementation of Schweitzer’s principle in scanning laser ophthalmoscopes used for fundus autofluorescence measurements, especially the FLIO device, for improved image quality. PMID:27699092
Human error identification for laparoscopic surgery: Development of a motion economy perspective.
Al-Hakim, Latif; Sevdalis, Nick; Maiping, Tanaphon; Watanachote, Damrongpan; Sengupta, Shomik; Dissaranan, Charuspong
2015-09-01
This study postulates that traditional human error identification techniques fail to consider motion economy principles and, accordingly, their applicability in operating theatres may be limited. This study addresses this gap in the literature with a dual aim. First, it identifies the principles of motion economy that suit the operative environment and second, it develops a new error mode taxonomy for human error identification techniques which recognises motion economy deficiencies affecting the performance of surgeons and predisposing them to errors. A total of 30 principles of motion economy were developed and categorised into five areas. A hierarchical task analysis was used to break down main tasks of a urological laparoscopic surgery (hand-assisted laparoscopic nephrectomy) to their elements and the new taxonomy was used to identify errors and their root causes resulting from violation of motion economy principles. The approach was prospectively tested in 12 observed laparoscopic surgeries performed by 5 experienced surgeons. A total of 86 errors were identified and linked to the motion economy deficiencies. Results indicate the developed methodology is promising. Our methodology allows error prevention in surgery and the developed set of motion economy principles could be useful for training surgeons on motion economy principles. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
The physics of bat echolocation: Signal processing techniques
NASA Astrophysics Data System (ADS)
Denny, Mark
2004-12-01
The physical principles and signal processing techniques underlying bat echolocation are investigated. It is shown, by calculation and simulation, how the measured echolocation performance of bats can be achieved.
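One signal processing idea commonly attributed to bat echolocation, and the kind of technique such simulations examine, is pulse compression: a frequency-swept call (chirp) is matched-filtered against the returning echo, producing a sharp correlation peak at the echo delay even in noise. A hedged sketch with invented parameters:

```python
import numpy as np

fs = 100_000.0
t = np.arange(0, 0.005, 1 / fs)                       # 5 ms call
chirp = np.sin(2 * np.pi * (20_000 + 2e6 * t) * t)    # 20-40 kHz upward sweep

delay = 123                                    # true echo delay in samples
echo = np.zeros(2000)
echo[delay:delay + chirp.size] += 0.2 * chirp  # weak echo...
rng = np.random.default_rng(1)
echo += rng.normal(0.0, 0.05, echo.size)       # ...buried in noise

# Matched filter = correlation of the received signal with the emitted call.
out = np.correlate(echo, chirp, mode="valid")
estimated_delay = int(np.argmax(out))
```

The long call's energy is compressed into a peak whose width is set by the sweep bandwidth, which is why chirped calls give fine range resolution without requiring short, high-power pulses.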
A scale space feature based registration technique for fusion of satellite imagery
NASA Technical Reports Server (NTRS)
Raghavan, Srini; Cromp, Robert F.; Campbell, William C.
1997-01-01
Feature based registration is one of the most reliable methods to register multi-sensor images (both active and passive imagery), since features are often more reliable than intensity or radiometric values. The only situation where a feature based approach will fail is when the scene is completely homogeneous or densely textural, in which case a combination of feature and intensity based methods may yield better results. In this paper, we present some preliminary results of testing our scale space feature based registration technique, a modified version of the feature based method developed earlier for classification of multi-sensor imagery. The proposed approach removes the sensitivity to parameter selection experienced in the earlier version, as explained later.
Total Physical Response: A Technique for Teaching All Skills in Spanish.
ERIC Educational Resources Information Center
Glisan, Eileen W.
1986-01-01
Presents a strategy for using an expanded version of Total Physical Response (TPR) as one tool for teaching listening, speaking, reading, and writing in Spanish. Variations of TPR are suggested for the purpose of implementing the technique within the foreign language curriculum. (Author/CB)
DOT National Transportation Integrated Search
1985-01-01
Using a modified version of the Delphi technique, a panel of transportation safety experts developed the following list of legislative priorities for submission to the Department of Motor Vehicles (DMV) Legislative Package for the 1986 session of the...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-23
... principles of best practice. (2) Enhance Intrinsic Motivation--Research strongly suggests that "motivational interviewing" techniques, rather than persuasion tactics, effectively enhance motivation for initiating and.... c. Responsivity Principle--Be responsive to temperament, learning style, motivation, gender, and...
21 CFR 809.10 - Labeling for in vitro diagnostic products.
Code of Federal Regulations, 2010 CFR
2010-04-01
... principles of the procedure. Explain concisely, with chemical reactions and techniques involved, if...) Instruments: (i) Use or function. (ii) Installation procedures and special requirements. (iii) Principles of... product testing prior to full commercial marketing (for example, for use on specimens derived from humans...
A Theoretical Model for the Practice of Residential Treatment.
ERIC Educational Resources Information Center
Miskimins, R. W.
1990-01-01
Presents theoretical model describing practice of psychiatric residential treatment for children and adolescents. Emphasis is on 40 practice principles, guiding concepts which dictate specific treatment techniques and administrative procedures for Southern Oregon Adolescent Study and Treatment Center. Groups principles into six clusters: program…
Self-Managed Studying in College Courses.
ERIC Educational Resources Information Center
Edwards, K. Anthony
In an introductory psychology course, students were taught some principles of "adjustment" using self-management techniques and were required to conduct a self-management project. The four student projects reported herein were specifically designed to improve study skills through use of Premack's principle and stimulus control. Course…
Typewriting Methodology 1977: Eight Basic Principles for Good Results
ERIC Educational Resources Information Center
Winger, Fred E.
1977-01-01
The eight basic principles of teaching methodology discussed are as follows: Stress position and technique, stress skill building, stress the pretest/practice/posttest method, stress action research, stress true production skills, stress good proofreading skills, stress performance goals, and stress individualized instruction. (TA)
Nakayama, Masataka; Saito, Satoru
2015-08-01
The present study investigated principles of phonological planning, a common serial ordering mechanism for speech production and phonological short-term memory. Nakayama and Saito (2014) investigated these principles using a speech-error induction technique, in which participants were exposed to an auditory distractor word immediately before uttering a target word. They demonstrated within-word adjacent mora exchanges and serial position effects on error rates. These findings support, respectively, the temporal distance and the edge principles at a within-word level. As this previous study induced errors using word distractors created by exchanging adjacent morae in the target words, it is possible that the speech errors are expressions of lexical intrusions reflecting interactive activation of phonological and lexical/semantic representations. To eliminate this possibility, the present study used nonword distractors that had no lexical or semantic representations. This approach successfully replicated the error patterns identified in the abovementioned study, further confirming that the temporal distance and edge principles are organizing precepts in phonological planning.
NASA Technical Reports Server (NTRS)
Steinthorsson, E.; Shih, T. I-P.; Roelke, R. J.
1991-01-01
In order to generate good-quality grid systems for complicated three-dimensional spatial domains, the grid-generation method used must be able to exert rather precise control over grid-point distributions. Several techniques are presented that enhance control of grid-point distribution for a class of algebraic grid-generation methods known as the two-, four-, and six-boundary methods. These techniques include variable stretching functions from bilinear interpolation, interpolating functions based on tension splines, and normalized K-factors. The techniques developed in this study were incorporated into a new version of GRID3D called GRID3D-v2. The usefulness of GRID3D-v2 was demonstrated by using it to generate a three-dimensional grid system in the coolant passage of a radial turbine blade with serpentine channels and pin fins.
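The boundary-interpolation idea behind such algebraic methods is easy to sketch. The following is a generic illustration, not the GRID3D-v2 implementation: a tanh stretching function (the parameter `beta` and the corner-based bilinear form are assumptions for illustration) clusters grid points near one boundary, and the interior is filled by bilinear interpolation of four corner points.

```python
import math

def tanh_stretch(n, beta=2.0):
    """One-sided tanh stretching: returns n values in [0, 1] with points
    clustered near s = 0. beta > 1 controls the clustering strength
    (hypothetical illustrative parameter)."""
    return [1.0 + math.tanh(beta * (i / (n - 1) - 1.0)) / math.tanh(beta)
            for i in range(n)]

def bilinear_grid(corners, ni, nj, beta=2.0):
    """Algebraic grid by bilinear interpolation of 4 corner points, with a
    stretched parameter in the i-direction and a uniform one in j."""
    (x00, y00), (x10, y10), (x01, y01), (x11, y11) = corners
    s = tanh_stretch(ni, beta)               # stretched coordinate in i
    t = [j / (nj - 1) for j in range(nj)]    # uniform coordinate in j
    return [[((1 - s[i]) * (1 - t[j]) * x00 + s[i] * (1 - t[j]) * x10
              + (1 - s[i]) * t[j] * x01 + s[i] * t[j] * x11,
              (1 - s[i]) * (1 - t[j]) * y00 + s[i] * (1 - t[j]) * y10
              + (1 - s[i]) * t[j] * y01 + s[i] * t[j] * y11)
             for j in range(nj)] for i in range(ni)]
```

The same pattern extends to the two-boundary method by interpolating between full boundary curves rather than corner points.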
ERIC Educational Resources Information Center
Stolurow, Lawrence M.; And Others
Coding systems need to be developed to account for computer decisions on every frame of a self-instructional program. In flow charts of the UICSM high school math programmed series, each frame or page is represented by a diagrammatic convention: a diamond if a mainline frame, a rectangle if a quiz frame, a bottom-heavy trapezoid if a review or…
Privacy Protection through pseudonymisation in eHealth.
De Meyer, F; De Moor, G; Reed-Fourquet, L
2008-01-01
The ISO TC215 WG4 pseudonymisation task group produced in 2008 a first version of a technical specification for the application of pseudonymisation in healthcare informatics. This paper investigates the principles set out in the technical specification as well as its implications in eHealth. The technical specification starts out with a conceptual model and evolves from a theoretical model to a real-life model by adding assumptions on the observability of personal data.
A Summary of the Naval Postgraduate School Research Program and Recent Publications
1990-09-01
…principles to divide the spectrum of a wide-band spread-spectrum signal into sub-bands… a MATLAB computer program on a 386-type computer. Because of the high rf… a large data sample was required… effects due to the fiber optic pickup array… an extended version of MATLAB that allows… applications such as orbital mechanics and weather prediction. Professor Gragg has also developed numerous MATLAB programs for linear programming problems.
Information Visualization: The State of the Art for Maritime Domain Awareness
2006-08-01
…the authors and quality assessments of certain documents, keywords and links. The paper version of this database is found at… interface design principles, psychological studies, and perception research • include a review of visualization theory including current visualization… builds on that a theory of how maps are understood (knowledge schemata and cognitive representations), and then analyses the use of symbols and…
NASA Astrophysics Data System (ADS)
Sørensen, H.; Nordskov, A.; Sass, B.; Visler, T.
1987-12-01
A simplified version of a deuterium pellet gun based on the pipe gun principle is described. The pipe gun is made from a continuous tube of stainless steel and gas is fed in from the muzzle end only. It is indicated that the pellet length is determined by the temperature gradient along the barrel right outside the freezing cell. Velocities of around 1000 m/s with a scatter of ±2% are obtained with a propellant gas pressure of 40 bar.
ERIC Educational Resources Information Center
New York State Education Dept., Albany. Office of Elementary, Middle and Secondary Education.
This Spanish translation of the abridged version of "A New Compact for Learning" acknowledges the current U.S. education system's inadequacy to educate U.S. citizenry and introduces New York State's New Compact for Learning, intended as a plan to reorganize New York's own system. The compact's fundamental principles are: (1) recognizing…
User guide for MODPATH Version 7—A particle-tracking model for MODFLOW
Pollock, David W.
2016-09-26
MODPATH is a particle-tracking post-processing program designed to work with MODFLOW, the U.S. Geological Survey (USGS) finite-difference groundwater flow model. MODPATH version 7 is the fourth major release since its original publication. Previous versions were documented in USGS Open-File Reports 89–381 and 94–464 and in USGS Techniques and Methods 6–A41. MODPATH version 7 works with MODFLOW-2005 and MODFLOW–USG. Support for unstructured grids in MODFLOW–USG is limited to smoothed, rectangular-based quadtree and quadpatch grids. A software distribution package containing the computer program and supporting documentation, such as input instructions, output file descriptions, and example problems, is available from the USGS over the Internet (http://water.usgs.gov/ogw/modpath/).
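MODPATH's tracking rests on Pollock's semi-analytical scheme: cell-face velocities are interpolated linearly across each cell, which yields a closed-form exit time per coordinate direction. A one-dimensional sketch of that idea (illustrative only, not USGS code; it assumes both face velocities are positive):

```python
import math

def pollock_exit_time(x1, x2, v1, v2, xp):
    """Time for a particle at xp to exit the cell [x1, x2] through face x2,
    under the linear interpolation v(x) = v1 + A*(x - x1) used by Pollock's
    semi-analytical method. v1, v2 are the face velocities (both > 0)."""
    A = (v2 - v1) / (x2 - x1)        # velocity gradient within the cell
    vp = v1 + A * (xp - x1)          # interpolated velocity at the particle
    if abs(A) < 1e-12:               # uniform velocity: simple linear travel
        return (x2 - xp) / vp
    return math.log(v2 / vp) / A     # exponential-in-time trajectory
```

With a uniform velocity of 1 across a unit cell the exit time is 1; with velocities accelerating from 1 to 2 it is ln 2, because the trajectory is exponential in time.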
NASA Technical Reports Server (NTRS)
Warren, Wayne H., Jr.
1989-01-01
The machine-readable version of the catalog, as it is currently being distributed from the Astronomical Data Center, is described. The catalog is a compilation of measurements of binary- and multiple-star systems obtained by speckle interferometric techniques; this version supersedes a previous edition of the catalog published in 1985. Stars that have been examined for multiplicity with negative results are included, in which case upper limits for the separation are given. The second version is expanded from the first in that a file of newly resolved systems and six cross-index files of alternate designations are included. The data file contains alternate identifications for the observed systems, epochs of observation, reported errors in position angles and separation, and bibliographical references.
Martínez-Pernía, David; González-Castán, Óscar; Huepe, David
2017-02-01
The development of rehabilitation has traditionally focused on measurements of motor disorders and of the improvements produced during the therapeutic process; however, the physical rehabilitation sciences have not focused on understanding the philosophical and scientific principles in clinical intervention and how they are interrelated. The main aim of this paper is to explain the foundation stones of the disciplines of physical therapy, occupational therapy, and speech/language therapy in recovery from motor disorder. To reach our goals, the mechanistic view and how it is integrated into physical rehabilitation will first be explained. Next, a classification of mechanistic therapy into an old version (automaton model) and a technological version (cyborg model) will be shown. Then, it will be shown how the physical rehabilitation sciences found a new perspective on motor recovery, based on functionalism, during the cognitive revolution of the 1960s. Through this cognitive theory, physical rehabilitation incorporated into motor recovery those therapeutic strategies that solicit the activation of the brain and/or symbolic processing, aspects that were not taken into account in mechanistic therapy. In addition, a classification of functionalist rehabilitation into a computational therapy and a brain therapy will be shown. At the end of the article, the methodological principles of the physical rehabilitation sciences are explained, allowing us to go deeper into the differences and similarities between therapeutic mechanism and therapeutic functionalism.
[Confidentiality in medical oaths: (When the white crow becomes gray...)].
Gelpi, R J; Pérez, M L; Rancich, A M; Mainetti, J A
2000-01-01
Confidentiality, together with the ethical principles of beneficence and non-maleficence, is the most important rule in Medical Oaths at the present time. However, scientific-technical advances in medicine have made this rule one of the most controversial because of its exceptions. Consequently, the aim of the present paper is to comparatively analyze the rule of confidentiality in Medical Oaths of different places, times, and origins, and in different versions of the Hippocratic Oath, in order to determine what should be kept secret and with what degree of commitment (absolute or "prima facie"). Of the thirty-six analyzed Oaths, twenty-seven manifest this rule and nine do not. No relation was found between the manifestation of this rule and the place, time, origin, or version of the Hippocratic Oath. Most pledges suggest not revealing what has been seen or heard during the medical act, the same as in the Hippocratic Oath. Seven texts point out that confidentiality should be absolute, and four give exceptions in connection with the principles of beneficence and justice and the moral duty of causing no damage to third parties. Two pledges specify protection of privacy. In conclusion, today confidentiality is considered a moral duty for the benefit of the patient and out of consideration for his autonomy; however, in medicine the duty of keeping absolute secrecy is currently being reconsidered.
High Performance Liquid Chromatography Experiments to Undergraduate Laboratories
ERIC Educational Resources Information Center
Kissinger, Peter T.; And Others
1977-01-01
Reviews the principles of liquid chromatography with electrochemical detection (LCEC), an analytical technique that incorporates the advantages of both liquid chromatography and electrochemistry. Also suggests laboratory experiments using this technique. (MLH)
NASA Technical Reports Server (NTRS)
Wells, Jeffrey M.; Jones, Thomas W.; Danehy, Paul M.
2005-01-01
Techniques for enhancing photogrammetric measurement of reflective surfaces by reducing noise were developed utilizing principles of light polarization. Signal selectivity with polarized light was also compared to signal selectivity using chromatic filters. Combining principles of linear cross polarization and color selectivity enhanced signal-to-noise ratios by as much as 800 fold. More typical improvements with combining polarization and color selectivity were about 100 fold. We review polarization-based techniques and present experimental results comparing the performance of traditional retroreflective targeting materials, cornercube targets returning depolarized light, and color selectivity.
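The multiplicative effect of combining the two selectivity mechanisms can be illustrated with Malus's law. All numbers below are illustrative assumptions, not the paper's measurements: specular glare retains its polarization, so a crossed analyzer passes only a residual `leakage`, while a chromatic filter further rejects broadband background by `color_rejection`.

```python
import math

def malus_transmission(theta_deg):
    """Malus's law: fraction of linearly polarized light passed by an
    analyzer rotated theta degrees from the polarization axis."""
    return math.cos(math.radians(theta_deg)) ** 2

def snr_gain(analyzer_deg=90.0, color_rejection=10.0, leakage=1e-2):
    """Sketch of the combined effect: glare noise is suppressed by the
    crossed analyzer (down to a practical leakage floor), and the chromatic
    filter independently rejects the remaining background; noise power
    drops by the product, so SNR scales up by the same factor."""
    glare_pass = max(malus_transmission(analyzer_deg), leakage)
    return color_rejection / glare_pass
```

With a 1% polarization leakage floor and a 10-fold chromatic rejection, the sketch predicts a combined gain of order 1000, in the same regime as the up-to-800-fold improvement reported above.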
NASA Technical Reports Server (NTRS)
Wheeler, D. R.
1978-01-01
The principles of ESCA (electron spectroscopy for chemical analysis) are described by comparison with other spectroscopic techniques. The advantages and disadvantages of ESCA as compared to other surface sensitive analytical techniques are evaluated. The use of ESCA is illustrated by actual applications to oxidation of steel and Rene 41, the chemistry of lubricant additives on steel, and the composition of sputter deposited hard coatings. Finally, a bibliography of material that is useful for further study of ESCA is presented and commented upon.
Part-Task Training Strategies in Simulated Carrier Landing Final Approach Training
1983-11-01
…received a large amount of attention in the recent past. However, the notion that the value of flight simulation may be enhanced when principles of… as training devices through the application of principles of learning. The research proposed here is based on this point of view. … tracking. Following Goldstein's suggestion, one should look for training techniques suggested by learning principles developed from research on…
Accelerated Simulation of Kinetic Transport Using Variational Principles and Sparsity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Caflisch, Russel
This project is centered on the development and application of techniques of sparsity and compressed sensing for variational principles, PDEs, and physics problems, in particular kinetic transport. This included the derivation of sparse modes for elliptic and parabolic problems arising from variational principles. The research results concern methods for sparsity in differential equations and their applications, including the application of sparsity ideas to the kinetic transport of plasmas.
On hydrostatic flows in isentropic coordinates
NASA Astrophysics Data System (ADS)
Bokhove, Onno
2000-01-01
The hydrostatic primitive equations of motion which have been used in large-scale weather prediction and climate modelling over the last few decades are analysed with variational methods in an isentropic Eulerian framework. The use of material isentropic coordinates for the Eulerian hydrostatic equations is known to have distinct conceptual advantages since fluid motion is, under inviscid and statically stable circumstances, confined to take place on quasi-horizontal isentropic surfaces. First, an Eulerian isentropic Hamilton's principle, expressed in terms of fluid parcel variables, is therefore derived by transformation of a Lagrangian Hamilton's principle to an Eulerian one. This Eulerian principle explicitly describes the boundary dynamics of the time-dependent domain in terms of advection of boundary isentropes sB; these are the values the isentropes have at their intersection with the (lower) boundary. A partial Legendre transform for only the interior variables yields an Eulerian ‘action’ principle. Secondly, Noether's theorem is used to derive energy and potential vorticity conservation from the Eulerian Hamilton's principle. Thirdly, these conservation laws are used to derive a wave-activity invariant which is second-order in terms of small-amplitude disturbances relative to a resting or moving basic state. Linear stability criteria are derived but only for resting basic states. In mid-latitudes a time-scale separation between gravity and vortical modes occurs. Finally, this time-scale separation suggests that conservative geostrophic and ageostrophic approximations can be made to the Eulerian action principle for hydrostatic flows. Approximations to Eulerian variational principles may be more advantageous than approximations to Lagrangian ones because non-dimensionalization and scaling tend to be based on Eulerian estimates of the characteristic scales involved.
These approximations to the stratified hydrostatic formulation extend previous approximations to the shallow-water equations. An explicit variational derivation is given of an isentropic version of Hoskins & Bretherton's model for atmospheric fronts.
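The variational machinery referred to above rests on the generic form of Hamilton's principle; a schematic reminder (generic only, not the paper's isentropic functional, which carries additional boundary terms in the boundary isentropes sB):

```latex
\delta \mathcal{A}
  \;=\; \delta \int_{t_0}^{t_1} L(q,\dot q)\,dt \;=\; 0
  \quad\Longrightarrow\quad
  \frac{d}{dt}\,\frac{\partial L}{\partial \dot q}
  \;-\; \frac{\partial L}{\partial q} \;=\; 0 .
```

By Noether's theorem, invariance of the action under time translation yields energy conservation, and for fluids the particle-relabelling symmetry yields potential vorticity conservation, which is the correspondence used in the abstract's second step.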
Dr. Goddard and a 1918 version of 'Bazooka'
NASA Technical Reports Server (NTRS)
2004-01-01
Dr. Robert H. Goddard loading a 1918 version of the Bazooka of World War II. From 1930 to 1941, Dr. Goddard made substantial progress in the development of progressively larger rockets, which attained altitudes of 2400 meters, and refined his equipment for guidance and control, his techniques of welding, and his insulation, pumps, and other associated equipment. In many respects, Dr. Goddard laid the essential foundations of practical rocket technology.
Operational Cyber Testing Recommendations- Version 1
2014-05-02
…be used to verify 5 representative pieces of the environment via sample test runs. Also, ideally, an early version of each test case can also be stood… (extraneous effort). Comparing the sample results collected from the scripts with expected results can reveal deficiencies in the data collection techniques…, the reporting mechanisms, and the system components themselves. The sample results can also be used for confirming that data is collected with high…
Chalifoux, Laurie A; Bauchat, Jeanette R; Higgins, Nicole; Toledo, Paloma; Peralta, Feyce M; Farrer, Jason; Gerber, Susan E; McCarthy, Robert J; Sullivan, John T
2017-10-01
Breech presentation is a leading cause of cesarean delivery. The use of neuraxial anesthesia increases the success rate of external cephalic version procedures for breech presentation and reduces cesarean delivery rates for fetal malpresentation. Meta-analysis suggests that higher-dose neuraxial techniques increase external cephalic version success to a greater extent than lower-dose techniques, but no randomized study has evaluated the dose-response effect. We hypothesized that increasing the intrathecal bupivacaine dose would be associated with increased external cephalic version success. We conducted a randomized, double-blind trial to assess the effect of four intrathecal bupivacaine doses (2.5, 5.0, 7.5, 10.0 mg) combined with fentanyl 15 μg on the success rate of external cephalic version for breech presentation. Secondary outcomes included mode of delivery, indication for cesarean delivery, and length of stay. A total of 240 subjects were enrolled, and 239 received the intervention. External cephalic version was successful in 123 (51.5%) of 239 patients. Compared with bupivacaine 2.5 mg, the odds (99% CI) for a successful version were 1.0 (0.4 to 2.6), 1.0 (0.4 to 2.7), and 0.9 (0.4 to 2.4) for bupivacaine 5.0, 7.5, and 10.0 mg, respectively (P = 0.99). There were no differences in the cesarean delivery rate (P = 0.76) or indication for cesarean delivery (P = 0.82). Time to discharge was increased 60 min (16 to 116 min) with bupivacaine 7.5 mg or higher as compared with 2.5 mg (P = 0.004). A dose of intrathecal bupivacaine greater than 2.5 mg does not lead to an additional increase in external cephalic procedural success or a reduction in cesarean delivery.
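The dose-group comparisons above are reported as odds ratios with 99% confidence intervals. A minimal sketch of that computation using a Wald interval on the log odds ratio (the 2×2 counts in the example are invented for illustration; they are not the trial's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=2.576):
    """Odds ratio comparing success odds (a/b) in group 1 against (c/d)
    in group 2, with a Wald confidence interval on the log scale.
    z = 2.576 gives a 99% CI.
      a = successes group 1, b = failures group 1
      c = successes group 2, d = failures group 2"""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)        # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, (lo, hi)
```

With invented counts of 31/60 versus 30/60 successes, the interval comfortably spans 1.0, the same qualitative picture as the trial's null dose-response finding.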
The Application of Gestalt Principles in Classroom Teaching
ERIC Educational Resources Information Center
Phillips, Mark
1976-01-01
Discusses the application of principles and techniques derived from Gestalt therapy to education. Initial investigations of the results of these applications have noted significant benefits to both teachers and students, including personal control, self-knowledge and self-esteem. For journal availability, see SO 504 730. (Author/DB)
Retail Training and Education.
ERIC Educational Resources Information Center
Bloodworth, Margaret
The book provides insight into training aims, principles, techniques, aids, and courses designed to meet the training needs of specific groups of staff, such as management staff, training staff, new staff, and established staff. It also covers the training and educational elements of the Certificate of Retail Management Principles. Training…
Desert Operations Tactics, Techniques, and Procedures. Southwest Asia Focus
1990-11-01
Contents excerpts: Contamination Transfer (6-7); Principles of Contamination Avoidance; Decontamination Principles (6-14); Aircraft Spraydown. …moisture or food, lice, mites, and flies can be extremely unpleasant and carry diseases such as scrub typhus and dysentery. The stings of many…
Kropacheva, Marya; Melgunov, Mikhail; Makarova, Irina
2017-02-01
The study of migration pathways of artificial isotopes in the flood-plain biogeocoenoses impacted by nuclear fuel cycle plants requires determination of isotope speciations in the biomass of higher terrestrial plants. The optimal method for their determination is the sequential elution technique (SET). The technique was originally developed to study atmospheric pollution by metals and has been applied to lichens and to terrestrial and aquatic bryophytes. Due to morphological and physiological differences, it was necessary to adapt SET for new objects: coastal macrophytes growing on the banks of the Yenisei flood-plain islands in the near impact zone of the Krasnoyarsk Mining and Chemical Combine (KMCC). In the first version of SET, 20 mM Na2EDTA was used as the reagent at the first stage; in the second version of SET, it was 1 M CH3COONH4. Four fractions were extracted. Fraction I included elements from the intercellular space and those connected with the outer side of the cell wall. Fraction II contained intracellular elements; fraction III contained elements firmly bound in the cell wall and associated structures; fraction IV contained insoluble residue. Adaptation of SET has shown that the first stage should be performed immediately after sampling. Separation of fractions III and IV can be neglected, since the output of isotopes into fraction IV is at the level of the detection error. The most adequate version of SET for terrestrial vascular plants is the version using 20 mM Na2EDTA at the first stage. The isotope 90Sr is the most sensitive to changes in the technique; its distribution depends strongly on both the extractant used at stage 1 and the duration of the first stage. Distribution of artificial radionuclides in the biomass of terrestrial vascular plants can vary from year to year and depends significantly on the age of the plant.
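The bookkeeping in such a sequential elution protocol reduces to normalizing each fraction's measured activity against the total. A minimal sketch (the activities below are invented for illustration; the small fraction IV mirrors the below-detection observation above):

```python
def fraction_distribution(activities):
    """Percentage distribution of a radionuclide across SET fractions.
    `activities` maps fraction name -> measured activity (e.g. Bq)."""
    total = sum(activities.values())
    return {k: round(100.0 * v / total, 1) for k, v in activities.items()}

# Invented example activities for fractions I-IV (not measured data):
dist = fraction_distribution({"I": 40.0, "II": 35.0, "III": 24.0, "IV": 1.0})
```

Comparing such distributions between the Na2EDTA and CH3COONH4 variants, isotope by isotope, is how the sensitivity of 90Sr to the protocol would show up.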
Laboratory Reptile Surgery: Principles and Techniques
Alworth, Leanne C; Hernandez, Sonia M; Divers, Stephen J
2011-01-01
Reptiles used for research and instruction may require surgical procedures, including biopsy, coelomic device implantation, ovariectomy, orchidectomy, and esophagostomy tube placement, to accomplish research goals. Providing veterinary care for unanticipated clinical problems may require surgical techniques such as amputation, bone or shell fracture repair, and coeliotomy. Although many principles of surgery are common between mammals and reptiles, important differences in anatomy and physiology exist. Veterinarians who provide care for these species should be aware of these differences. Most reptiles undergoing surgery are small and require specific instrumentation and positioning. In addition, because of the wide variety of unique physiologic and anatomic characteristics among snakes, chelonians, and lizards, different techniques may be necessary for different reptiles. This overview describes many common reptile surgery techniques and their application for research purposes or to provide medical care to research subjects. PMID:21333158
Neuronavigation. Principles. Surgical technique.
Ivanov, Marcel; Vlad Ciurea, Alexandru
2009-01-01
Neuronavigation and stereotaxy are techniques designed to help neurosurgeons precisely localize different intracerebral pathological processes by using a set of preoperative images (CT, MRI, fMRI, PET, SPECT, etc.). The development of computer-assisted surgery was possible only after significant technological progress, especially in the areas of informatics and imaging. The main indications of neuronavigation are the targeting of small and deep intracerebral lesions and choosing the best way to treat them, in order to preserve neurological function. Stereotaxy also allows lesioning or stimulation of the basal ganglia for the treatment of movement disorders. These techniques can bring an important amount of comfort both to the patient and to the neurosurgeon. Neuronavigation was introduced in Romania around 2003, in four neurosurgical centers. We present our five years of experience in neuronavigation and describe the main principles and surgical techniques. PMID:20108488
Zachariah, Marianne; Seidling, Hanna M; Neri, Pamela M; Cresswell, Kathrin M; Duke, Jon; Bloomrosen, Meryl; Volk, Lynn A; Bates, David W
2011-01-01
Background Medication-related decision support can reduce the frequency of preventable adverse drug events. However, the design of current medication alerts often results in alert fatigue and high over-ride rates, thus reducing any potential benefits. Methods The authors previously reviewed human-factors principles for relevance to medication-related decision support alerts. In this study, instrument items were developed for assessing the appropriate implementation of these human-factors principles in drug–drug interaction (DDI) alerts. User feedback regarding nine electronic medical records was considered during the development process. Content validity, construct validity through correlation analysis, and inter-rater reliability were assessed. Results The final version of the instrument included 26 items associated with nine human-factors principles. Content validation on three systems resulted in the addition of one principle (Corrective Actions) to the instrument and the elimination of eight items. Additionally, the wording of eight items was altered. Correlation analysis suggests a direct relationship between system age and performance of DDI alerts (p=0.0016). Inter-rater reliability indicated substantial agreement between raters (κ=0.764). Conclusion The authors developed and gathered preliminary evidence for the validity of an instrument that measures the appropriate use of human-factors principles in the design and display of DDI alerts. Designers of DDI alerts may use the instrument to improve usability and increase user acceptance of medication alerts, and organizations selecting an electronic medical record may find the instrument helpful in meeting their clinicians' usability needs. PMID:21946241
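The inter-rater reliability figure above (κ = 0.764) is Cohen's kappa, which corrects raw agreement for the agreement expected by chance. A minimal sketch of the statistic on toy ratings (the labels below are illustrative, not the study's data):

```python
def cohens_kappa(labels1, labels2):
    """Cohen's kappa for two raters' categorical judgments:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert labels1 and len(labels1) == len(labels2)
    n = len(labels1)
    cats = set(labels1) | set(labels2)
    po = sum(a == b for a, b in zip(labels1, labels2)) / n   # observed
    pe = sum((labels1.count(c) / n) * (labels2.count(c) / n)
             for c in cats)                                  # by chance
    return (po - pe) / (1 - pe)
```

Perfect agreement gives κ = 1, while agreement at exactly the chance rate gives κ = 0; values above about 0.6 are conventionally read as substantial agreement, which is how the study characterizes its result.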
Konradi, Rupert; Textor, Marcus; Reimhult, Erik
2012-01-01
The great wealth of different surface sensitive techniques used in biosensing, most of which claim to measure adsorbed mass, can at first glance look unnecessary. However, with each technique relying on a different transducer principle there is something to be gained from a comparison. In this tutorial review, different optical and acoustic evanescent techniques are used to illustrate how an understanding of the transducer principle of each technique can be exploited for further interpretation of hydrated and extended polymer and biological films. Some of the most commonly used surface sensitive biosensor techniques (quartz crystal microbalance, optical waveguide spectroscopy and surface plasmon resonance) are briefly described and five case studies are presented to illustrate how different biosensing techniques can and often should be combined. The case studies deal with representative examples of adsorption of protein films, polymer brushes and lipid membranes, and describe e.g., how to deal with strongly vs. weakly hydrated films, large conformational changes and ordered layers of biomolecules. The presented systems and methods are compared to other representative examples from the increasing literature on the subject. PMID:25586027
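For the quartz crystal microbalance mentioned above, the acoustic transducer principle is commonly summarized by the Sauerbrey relation, which converts a resonance-frequency shift into areal mass. A sketch (17.7 ng cm⁻² Hz⁻¹ is the standard sensitivity constant for a 5 MHz AT-cut crystal; the relation holds only for thin, rigid films, which is exactly why strongly hydrated films motivate the multi-technique comparisons discussed in the review):

```python
def sauerbrey_mass(delta_f_hz, n=1, c=17.7):
    """Sauerbrey relation for a QCM: areal mass change in ng/cm^2 from the
    frequency shift (Hz) of overtone n. c is the mass sensitivity constant
    in ng cm^-2 Hz^-1 (17.7 for a 5 MHz AT-cut crystal). A negative
    frequency shift corresponds to mass uptake."""
    return -c * delta_f_hz / n
```

Note that a QCM senses the film plus coupled water, while optical techniques such as SPR sense the "dry" polarizable mass; the difference between the two readings estimates film hydration, one of the comparisons the case studies exploit.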
Bochove, Erik J; Rao Gudimetla, V S
2017-01-01
We propose a self-consistency condition based on the extended Huygens-Fresnel principle, which we apply to the propagation kernel of the mutual coherence function of a partially coherent laser beam propagating through a turbulent atmosphere. The assumption of statistical independence of turbulence in neighboring propagation segments leads to an integral equation in the propagation kernel. This integral equation is satisfied by a Gaussian function, with dependence on the transverse coordinates that is identical to the previous Gaussian formulation by Yura [Appl. Opt. 11, 1399 (1972)], but differs in the transverse coherence length's dependence on propagation distance, so that this established version violates our self-consistency principle. Our formulation has one free parameter, which in the context of Kolmogorov's theory is independent of turbulence strength and propagation distance. We determined its value by numerical fitting to the rigorous beam propagation theory of Yura and Hanson [J. Opt. Soc. Am. A 6, 564 (1989)], demonstrating in addition a significant improvement over other Gaussian models.
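For context, the widely quoted plane-wave form of the extended Huygens-Fresnel attenuation of the mutual coherence function in Kolmogorov turbulence is (standard literature form, not the paper's self-consistent kernel, which modifies the distance dependence of the coherence length):

```latex
\Gamma_2(\Delta r, L) \;\propto\;
\exp\!\left[-\left(\frac{\Delta r}{\rho_0}\right)^{5/3}\right]
\;\approx\;
\exp\!\left[-\left(\frac{\Delta r}{\rho_0}\right)^{2}\right],
\qquad
\rho_0 = \left(1.46\, C_n^2\, k^2\, L\right)^{-3/5},
```

where $\Delta r$ is the transverse separation, $k$ the wavenumber, $L$ the path length, and $C_n^2$ the refractive-index structure constant. Replacing the 5/3 exponent by 2 is precisely the Gaussian approximation whose self-consistency the paper examines.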
Variational Principles, Occam Razor and Simplicity Paradox
NASA Astrophysics Data System (ADS)
Berezin, Alexander A.
2004-05-01
Variational minimum principles (VMP) refer to energy (statics, Thomson and Earnshaw theorems in electrostatics), action (Maupertuis, Euler, Lagrange, Hamilton), light (Fermat), quantum paths (Feynman), etc. Historically, VMP appeal to some economy in nature, similarly to the Occam Razor Parsimony (ORP) principle. Versions of ORP are the "best world" (Leibniz), Panglossianism (Voltaire), and the "most interesting world" (Dyson). Conceptually, VMP exemplify the curious fact that an infinite set is often simpler than its subsets (e.g., the set of all integers is simpler than the set of primes). The algorithmically very simple number 0.1234567... (Champernowne constant) contains the Library of Babel of "all books" (Borges) and codes (infinitely many times) everything countably possible. Likewise, the full Megaverse (Everett, Deutsch, Guth, Linde) is simpler than our specific ("Big Bang") universe. Dynamically, VMP imply memory effects akin to hysteresis. Similar ideas are "water memory" (Benveniste, Josephson) and isotopic biology (Berezin). Paradoxically, while ORP calls for economy (simplicity), the unfolding of ORP in VMP seemingly works in the opposite direction, allowing for the emergence of complexity (e.g., symmetry breaking in the Jahn-Teller effect). Metaphysical extrapolation of this complementarity may lead to the "it-from-bit" (Wheeler) reflection on why there is something rather than nothing.
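The Champernowne constant invoked above is simply the concatenation of the positive integers after a decimal point, which is why every finite digit string eventually appears in it (it is normal in base 10). A minimal generator:

```python
from itertools import count

def champernowne_digits(n):
    """First n decimal digits of the Champernowne constant
    0.123456789101112..., built by concatenating 1, 2, 3, ..."""
    digits = []
    for i in count(1):
        digits.extend(str(i))
        if len(digits) >= n:
            return "".join(digits[:n])

# Any finite digit string eventually occurs in the expansion, which is
# the "Library of Babel" point made in the abstract:
prefix = champernowne_digits(200)
```

For instance, the string "101112" already appears within the first twenty digits, where the two-digit integers begin.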
Can there be science-based precaution?
NASA Astrophysics Data System (ADS)
Weiss, Charles
2006-10-01
'Science-based precaution' is possible in logic if not in politics, and should be a normal part of risk management. It should balance the risks and benefits of innovation, or equivalently, specify the price one is willing to pay to avoid risk. The Precaution Principle states that the absence of scientific proof does not preclude precautionary action—or, in its stronger version, that it requires such action. This principle is a useful counterweight to the insistence on rigorous scientific proof, but focuses on costs and risks to the exclusion of benefits. It expresses 'look before you leap', but not 'nothing ventured, nothing gained'. To facilitate adaptive management, we propose a complementary principle: 'precautionary action should not unreasonably interfere with innovation that promises major benefits, until its dangers and benefits are well understood'. In international trade law, we propose that scientific evidence presented in support of discriminatory measures that would otherwise violate the world trade regime—such as the de facto European Union moratorium on importing genetically modified crops—be required to suffice to support a 'reasonable belief' of danger to human health or the environment.
Teaching Economic Principles Interactively: A Cannibal's Dinner Party
ERIC Educational Resources Information Center
Bergstrom, Theodore C.
2009-01-01
The author describes techniques that he uses to interactively teach economics principles. He describes an experiment on market entry and gives examples of applications of classroom clickers. Clicker applications include (a) collecting data about student preferences that can be used to construct demand curves and supply curves, (b) checking…
Thomas Gordon's Communicative Pedagogy in Modern Educational Realities
ERIC Educational Resources Information Center
Leshchenko, Maria; Isaieva, Svitlana
2014-01-01
In the article the principles, strategies, methods, techniques of communicative pedagogy of American scientist Thomas Gordon and system components of effective communication training for parents, teachers and administrators are enlightened. It has been determined that the main principle of Thomas Gordon's pedagogy is an interactive way of knowing…
Principles of Teaching. Module.
ERIC Educational Resources Information Center
Rhoades, Joseph W.
This module on principles of teaching is 1 in a series of 10 modules written for vocational education teacher education programs. It is designed to enable the teacher to do the following: (1) identify subject matter and integrate that subject matter with thought-provoking questions; (2) organize and demonstrate good questioning techniques; and (3)…
ERIC Educational Resources Information Center
Verschaffel, Lieven; Van Dooren, W.; Star, J.
2017-01-01
This special issue comprises contributions that address the breadth of current lines of recent research from cognitive psychology that appear promising for positively impacting students' learning of mathematics. More specifically, we included contributions (a) that refer to cognitive psychology based principles and techniques, such as explanatory…
Nursing Principles & Skills. Teacher Edition.
ERIC Educational Resources Information Center
Oklahoma State Dept. of Vocational and Technical Education, Stillwater. Curriculum and Instructional Materials Center.
This curriculum guide contains 14 units for a course on nursing principles and skills needed by practical nurses. The 14 units of instruction cover the following: (1) using medical terminology; (2) practicing safety procedures; (3) using the nursing process for care planning; (4) using infection control techniques; (5) preparing a patient…
Behavior Modification: Basic Principles. Third Edition
ERIC Educational Resources Information Center
Lee, David L.; Axelrod, Saul
2005-01-01
This classic book presents the basic principles of behavior emphasizing the use of preventive techniques as well as consequences naturally available in the home, business, or school environment to change important behaviors. This book, and its companion piece, "Measurement of Behavior," represents more than 30 years of research and strategies in…
UAV Digital Tracking Array Design, Development and Testing
2009-12-01
This report presents tracking principles and different modes for a tracking antenna. Different tracking techniques, such as sequential lobing, conical scan, and monopulse tracking, are also discussed.
The Art of Evaluation: A Handbook for Educators and Trainers.
ERIC Educational Resources Information Center
Fenwick, Tara J.; Parsons, Jim
This book introduces adult educators and trainers to the principles and techniques of learner evaluation in the various contexts of adult education. The following are among the topics discussed: (1) the purposes of evaluation (the importance of authentic evaluation; principles of evaluation; traps in evaluation); (2) evaluating one's philosophy…
Sign Language and the Brain: A Review
ERIC Educational Resources Information Center
Campbell, Ruth; MacSweeney, Mairead; Waters, Dafydd
2008-01-01
How are signed languages processed by the brain? This review briefly outlines some basic principles of brain structure and function and the methodological principles and techniques that have been used to investigate this question. We then summarize a number of different studies exploring brain activity associated with sign language processing…
NASA Technical Reports Server (NTRS)
Clarke, R.; Lintereur, L.; Bahm, C.
2016-01-01
A desire for more complete documentation of the National Aeronautics and Space Administration (NASA) Armstrong Flight Research Center (AFRC), Edwards, California legacy code used in the core simulation has led to this effort to fully document the oblate Earth six-degree-of-freedom equations of motion and integration algorithm. The authors of this report have taken much of the earlier work of the simulation engineering group and used it as a jumping-off point for this report. The largest addition this report makes is that each element of the equations of motion is traced back to first principles and at no point is the reader forced to take an equation on faith alone. There are no discoveries of previously unknown principles contained in this report; this report is a collection and presentation of textbook principles. The value of this report is that those textbook principles are herein documented in standard nomenclature that matches the form of the computer code DERIVC. Previous handwritten notes are much of the backbone of this work; however, in almost every area, derivations are explicitly shown to assure the reader that the equations which make up the oblate Earth version of the computer routine, DERIVC, are correct.
The explanatory structure of unexplainable events: Causal constraints on magical reasoning.
Shtulman, Andrew; Morgan, Caitlin
2017-10-01
A common intuition, often captured in fiction, is that some impossible events (e.g., levitating a stone) are "more impossible" than others (e.g., levitating a feather). We investigated the source of this intuition, hypothesizing that graded notions of impossibility arise from explanatory considerations logically precluded by the violation at hand but still taken into account. Studies 1-4 involved college undergraduates (n = 357), and Study 5 involved preschool-aged children (n = 32). In Studies 1 and 2, participants saw pairs of magical spells that violated one of 18 causal principles (six physical, six biological, and six psychological) and were asked to indicate which spell would be more difficult to learn. Both spells violated the same causal principle but differed in their relation to a subsidiary principle. Participants' judgments of spell difficulty honored the subsidiary principle, even when participants were given the option of judging the two spells equally difficult. Study 3 replicated those effects with Likert-type ratings; Study 4 replicated them in an open-ended version of the task in which participants generated their own causal violations; and Study 5 replicated them with children. Taken together, these findings suggest that events that defy causal explanation are interpreted in terms of explanatory considerations that hold in the absence of such violations.
NASA Astrophysics Data System (ADS)
Pakuliak, L. K.; Andruk, V. M.; Golovnia, V. V.; Shatokhina, S. V.; Yizhakevych, O. M.; Ivanov, G. A.; Yatsenko, A. I.; Sergeeva, T. P.
The almost 40-year history of the FON project ended with the creation of a catalog of objects over the whole northern sky down to B ≤ 16.5m. The original idea of a 4-fold overlap of the northern sky with 6 wide-field astrographs was not realized in full. For historical reasons it was transformed into the 2-fold overlap observational program of the MAO NAS of Ukraine, resulting in three versions of a multimillion-star catalog of positions, proper motions, and B-magnitudes. The first version, of 1.2 million stars, was finished before the 2000s and is based on the AC object list. The plates were measured with the automatic measuring complex PARSEC, developed specifically for massive photographic surveys. Because the input list was limited to AC objects, most of the stars on the FON plates remained unmeasured. The workflow principles of these efforts formed the basis for the further development of the project using modern information technologies. For the second and third versions of the catalog, the object list was obtained by fully digitizing the plates and processing their images. The final third version contains 19.5 million stars and galaxies, with the maximum accuracy possible for photographic astrometry. The plate collections obtained at the other observatories participating in the project are partially preserved and can be used for the same astrometric tasks.
Bschir, Karim
2017-04-01
Environmental risk assessment is often affected by severe uncertainty. The frequently invoked precautionary principle helps to guide risk assessment and decision-making in the face of scientific uncertainty. In many contexts, however, uncertainties play a role not only in the application of scientific models but also in their development. Building on recent literature in the philosophy of science, this paper argues that precaution should be exercised at the stage when tools for risk assessment are developed as well as when they are used to inform decision-making. The relevance and consequences of this claim are discussed in the context of the threshold of toxicological concern (TTC) approach in food toxicology. I conclude that the approach does not meet the standards of an epistemic version of the precautionary principle.
What Are We Doing When We Translate from Quantitative Models?
Critchfield, Thomas S; Reed, Derek D
2009-01-01
Although quantitative analysis (in which behavior principles are defined in terms of equations) has become common in basic behavior analysis, translational efforts often examine everyday events through the lens of narrative versions of laboratory-derived principles. This approach to translation, although useful, is incomplete because equations may convey concepts that are difficult to capture in words. To support this point, we provide a nontechnical introduction to selected aspects of quantitative analysis; consider some issues that translational investigators (and, potentially, practitioners) confront when attempting to translate from quantitative models; and discuss examples of relevant translational studies. We conclude that, where behavior-science translation is concerned, the quantitative features of quantitative models cannot be ignored without sacrificing conceptual precision, scientific and practical insights, and the capacity of the basic and applied wings of behavior analysis to communicate effectively. PMID:22478533
Climate change and Norman Daniels' theory of just health: an essay on basic needs.
Lacey, Joseph
2012-02-01
Norman Daniels, in applying Rawls' theory of justice to the issue of human health, ideally presupposes that society exists in a state of moderate scarcity. However, faced with problems like climate change, many societies find that their state of moderate scarcity is increasingly under threat. The first part of this essay aims to determine the consequences for Daniels' theory of just health when we incorporate into Rawls' understanding of justice the idea that the condition of moderate scarcity can fail. Most significantly, I argue for a generation-neutral principle of basic needs that is lexically prior to Rawls' familiar principles of justice. The second part of this paper aims to demonstrate how my reformulated version of Daniels' conception of just health can help to justify action on climate change and guide climate policy within liberal-egalitarian societies.
The role of biomechanics in maximising distance and accuracy of golf shots.
Hume, Patria A; Keogh, Justin; Reid, Duncan
2005-01-01
Golf biomechanics applies the principles and techniques of mechanics to the structure and function of the golfer in an effort to improve golf technique and performance. A common recommendation for technical correction is maintaining a single fixed centre hub of rotation with a two-lever one-hinge moment arm to impart force on the ball. The primary and secondary spinal angles are important for conservation of angular momentum using the kinetic link principle to generate high club-head velocity. When the golfer wants to maximise the distance of their drives, relatively large ground reaction forces (GRF) need to be produced. However, during the backswing, a greater proportion of the GRF will be observed on the back foot, with transfer of the GRF on to the front foot during the downswing/acceleration phase. Rapidly stretching hip, trunk and upper limb muscles during the backswing, maximising the X-factor early in the downswing, and uncocking the wrists when the lead arm is about 30 degrees below the horizontal will take advantage of the summation of force principle. This will help generate large angular velocity of the club head, and ultimately ball displacement. Physical conditioning will help to recruit the muscles in the correct sequence and to optimum effect. To maximise the accuracy of chipping and putting shots, the golfer should produce a lower grip on the club and a slower/shorter backswing. Consistent patterns of shoulder and wrist movements and temporal patterning result in successful chip shots. Qualitative and quantitative methods are used to biomechanically assess golf techniques. Two- and three-dimensional videography, force plate analysis and electromyography techniques have been employed.
The common golf biomechanics principles necessary to understand golf technique are stability, Newton's laws of motion (inertia, acceleration, action reaction), lever arms, conservation of angular momentum, projectiles, the kinetic link principle and the stretch-shorten cycle. Biomechanics has a role in maximising the distance and accuracy of all golf shots (swing and putting) by providing both qualitative and quantitative evidence of body angles, joint forces and muscle activity patterns. The quantitative biomechanical data needs to be interpreted by the biomechanist and translated into coaching points for golf professionals and coaches. An understanding of correct technique will help the sports medicine practitioner provide sound technical advice and should help reduce the risk of golfing injury.
Information Requirements Specification II: Brainstorming Collective Decision-Making Technique.
ERIC Educational Resources Information Center
Telem, Moshe
1988-01-01
Information requirements specification (IRS) constitutes an Achilles heel in the system life cycle of management information systems. This article establishes a systematic overall IRS technique applicable to organizations of all types and sizes. The technique's integration of brainstorming and theory Z principles creates an effective, stimulating,…
USDA-ARS?s Scientific Manuscript database
Most studies assessing chlorophyll fluorescence (ChlF) have examined leaf responses to environmental stress conditions using active techniques. Alternatively, passive techniques are able to measure ChlF at both leaf and canopy scales. However, although the measurement principles of both techniques a...
ERIC Educational Resources Information Center
Jaffe, C. Carl
1982-01-01
Describes principal imaging techniques, their applications, and their limitations in terms of diagnostic capability and possible adverse biological effects. Techniques include film radiography, computed tomography, nuclear medicine, positron emission tomography (PET), ultrasonography, nuclear magnetic resonance, and digital radiography. PET has…
NASA Technical Reports Server (NTRS)
Wolf, S. F.; Lipschutz, M. E.
1993-01-01
Multivariate statistical analysis techniques (linear discriminant analysis and logistic regression) can provide powerful discrimination tools which are generally unfamiliar to the planetary science community. Fall parameters were used to identify a group of 17 H chondrites (Cluster 1) that were part of a coorbital stream which intersected Earth's orbit in May, from 1855 to 1895, and can be distinguished from all other H chondrite falls. Using multivariate statistical techniques, it was demonstrated that by a totally different criterion, labile trace element contents (hence thermal histories), 13 Cluster 1 meteorites are distinguishable from 45 non-Cluster 1 H chondrites. Here, we focus upon the principles of multivariate statistical techniques and illustrate their application using non-meteoritic and meteoritic examples.
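As an illustration of the first of these techniques, here is a minimal numpy sketch of two-class Fisher linear discriminant analysis. The feature values are invented stand-ins, not the meteoritic measurements used in the paper:

```python
import numpy as np

# Toy stand-in data: two groups of samples described by two
# trace-element-like features (hypothetical values, illustration only).
rng = np.random.default_rng(0)
cluster1 = rng.normal(loc=[2.0, 5.0], scale=0.5, size=(20, 2))
others = rng.normal(loc=[4.0, 3.0], scale=0.5, size=(30, 2))

# Fisher linear discriminant: w = Sw^{-1} (m1 - m2), where Sw is the
# pooled within-class scatter matrix and m1, m2 are the class means.
m1, m2 = cluster1.mean(axis=0), others.mean(axis=0)
Sw = (np.cov(cluster1, rowvar=False) * (len(cluster1) - 1)
      + np.cov(others, rowvar=False) * (len(others) - 1))
w = np.linalg.solve(Sw, m1 - m2)

# Classify by projecting onto w and thresholding at the midpoint of
# the two projected class means.
threshold = 0.5 * ((cluster1 @ w).mean() + (others @ w).mean())
correct = ((cluster1 @ w) > threshold).sum() + ((others @ w) <= threshold).sum()
accuracy = correct / (len(cluster1) + len(others))
```

With well-separated groups like these, the single projection direction `w` discriminates almost perfectly; with real meteorite data the same machinery applies, only the separation is less dramatic.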
Kaltenbacher, Barbara; Kaltenbacher, Manfred; Sim, Imbo
2013-01-01
We consider the second order wave equation in an unbounded domain and propose an advanced perfectly matched layer (PML) technique for its efficient and reliable simulation. In doing so, we concentrate on the time domain case and use the finite-element (FE) method for the space discretization. Our un-split-PML formulation requires four auxiliary variables within the PML region in three space dimensions. For a reduced version (rPML), we present a long time stability proof based on an energy analysis. The numerical case studies and an application example demonstrate the good performance and long time stability of our formulation for treating open domain problems. PMID:23888085
Computational Control of Flexible Aerospace Systems
NASA Technical Reports Server (NTRS)
Sharpe, Lonnie, Jr.; Shen, Ji Yao
1994-01-01
The main objective of this project is to establish a distributed parameter modeling technique for structural analysis, parameter estimation, vibration suppression and control synthesis of large flexible aerospace structures. This report concentrates on the research outputs produced in the last two years of the project. The main accomplishments can be summarized as follows. A new version of the PDEMOD Code has been completed. A theoretical investigation of the NASA MSFC two-dimensional ground-based manipulator facility using the distributed parameter modelling technique has been conducted. A new mathematical treatment for dynamic analysis and control of large flexible manipulator systems has been conceived, which may provide an embryonic form of a more sophisticated mathematical model for future modified versions of the PDEMOD Code.
A Wave Diagnostics in Geophysics: Algorithmic Extraction of Atmosphere Disturbance Modes
NASA Astrophysics Data System (ADS)
Leble, S.; Vereshchagin, S.
2018-04-01
The problem of diagnostics in geophysics is discussed, and a proposal based on the technique of dynamic projection operators is formulated. The general exposition is demonstrated with a symbolic algorithm for the wave and entropy modes in an exponentially stratified atmosphere. The novel technique is developed as a discrete version of the evolution operator and the corresponding projectors via the discrete Fourier transformation. Its explicit realization for directed modes in an exponential one-dimensional atmosphere is presented via the corresponding projection operators in discrete form, as matrices with a prescribed action on arrays formed from observation tables. A simulation based on oppositely directed (upward and downward) wave-train solutions is performed, and the extraction of the modes from a mixture is illustrated.
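As a schematic analogue (not the paper's actual atmospheric operators), the following numpy sketch shows a discrete Fourier projection that splits a sampled 1-D field into oppositely signed wavenumber components whose sum reconstructs the original array:

```python
import numpy as np

# Sampled 1-D field: a mixture of a rightward (+k) and a leftward (-k)
# complex wave component (an illustrative stand-in for observation data).
n = 64
x = np.arange(n)
field = 1.0 * np.exp(2j * np.pi * 3 * x / n) + 0.5 * np.exp(-2j * np.pi * 5 * x / n)

# Apply diagonal projectors in Fourier space (realized here as masks on
# the spectrum) that keep only positive or only negative wavenumbers,
# then map back to real space.
k = np.fft.fftfreq(n)                          # signed wavenumbers
spec = np.fft.fft(field)
up = np.fft.ifft(np.where(k > 0, spec, 0))     # +k ("upward") part
down = np.fft.ifft(np.where(k < 0, spec, 0))   # -k ("downward") part

# The projections are complementary: up + down (+ the k = 0 mean part)
# reconstructs the field exactly.
mean_part = np.fft.ifft(np.where(k == 0, spec, 0))
recon = up + down + mean_part
```

The masks play the role of the diagonal projector matrices; composing them with the FFT and inverse FFT gives the matrices "with a prescribed action on arrays" in this toy setting.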
Soft magnetic tweezers: a proof of principle.
Mosconi, Francesco; Allemand, Jean François; Croquette, Vincent
2011-03-01
We present here the principle of soft magnetic tweezers which improve the traditional magnetic tweezers allowing the simultaneous application and measurement of an arbitrary torque to a deoxyribonucleic acid (DNA) molecule. They take advantage of a nonlinear coupling regime that appears when a fast rotating magnetic field is applied to a superparamagnetic bead immersed in a viscous fluid. In this work, we present the development of the technique and we compare it with other techniques capable of measuring the torque applied to the DNA molecule. In this proof of principle, we use standard electromagnets to achieve our experiments. Despite technical difficulties related to the present implementation of these electromagnets, the agreement of measurements with previous experiments is remarkable. Finally, we propose a simple way to modify the experimental design of electromagnets that should bring the performances of the device to a competitive level.
Complete denture impression techniques: evidence-based or philosophical.
Singla, Shefali
2007-01-01
A rigid code of practice is dangerous in today's ever-changing world. Relating this to complete denture impression technique, we have been provided with a set of philosophies: "no pressure, minimal pressure, definite pressure and selective pressure". The objectives and principles of impression-making have been clearly defined. Can any single philosophy satisfy every operator in working on these principles and achieving these objectives? These philosophies take into consideration only the tissue part and not the complete basal seat, which comprises the periphery, the tissues and the bone structure. Under such circumstances, should we follow a code of practice we consider dangerous, or should we develop an evidence-based approach with a scientific background that follows certain principles while providing the flexibility to adapt to clinical procedures and to normal biological variation in patients, rather than the rigidity imposed by strict laws?
Design and Development of Basic Physical Layer WiMAX Network Simulation Models
2009-01-01
Wide Web. The third software version was developed during the period of 22 August to 4 November 2008. The software version developed during the... researched on the Web. The mathematics of some fundamental concepts, such as Fourier transforms and convolutional coding techniques, was also reviewed... MathWorks Matlab users' website. A simulation model was found, entitled "Estudio y Simulación de la capa física de la norma 802.16 (Sistema WiMAX)" (Study and Simulation of the Physical Layer of the 802.16 Standard (WiMAX System)), developed
NASA Astrophysics Data System (ADS)
Tawfik, Sherif A.; El-Sheikh, S. M.; Salem, N. M.
2016-09-01
Recently we have become aware that the description of the quantum wave functions in Sec. 2.1 is incorrect. In the published version of the paper, we have stated that the states are expanded in terms of plane waves. However, the correct description of the quantum states in the context of the real space implementation (using the Octopus code) is that states are represented by discrete points in a real space grid.
Modulational stability of periodic solutions of the Kuramoto-Sivashinsky equation
NASA Technical Reports Server (NTRS)
Papageorgiou, Demetrios T.; Papanicolaou, George C.; Smyrlis, Yiorgos S.
1993-01-01
We study the long-wave, modulational, stability of steady periodic solutions of the Kuramoto-Sivashinsky equation. The analysis is fully nonlinear at first, and can in principle be carried out to all orders in the small parameter, which is the ratio of the spatial period to a characteristic length of the envelope perturbations. In the linearized regime, we recover a high-order version of the results of Frisch, She, and Thual, which shows that the periodic waves are much more stable than previously expected.
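For context (the equation itself is not stated in the abstract), the one-dimensional Kuramoto-Sivashinsky equation in its standard rescaled form is

```latex
u_t + u\,u_x + u_{xx} + u_{xxxx} = 0,
```

where the second-derivative term injects energy at long wavelengths and the fourth-derivative term damps short wavelengths; the steady periodic (cellular) solutions arising from this balance are the waves whose modulational stability the paper analyzes.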
2015-09-21
and metal organic chemical vapor deposition (MOCVD) [18]. In the former case, carbon can contaminate the material during air exposure in standard... gallium. In addition, carbon can be found as a contaminant in the source gases, or it can be etched off the susceptor that transfers heat to the substrate... split interstitial. Figure 1: Split interstitials of carbon (yellow) and nitrogen (blue) surrounded by four gallium atoms (red). Energy differences of
2013-06-01
2014%20Oct%2009%20Rel1.pdf Mankiw, N. G. (2009). Principles of economics (5th ed.) [PDF version]. Mason, OH: South-Western Cengage Learning... putting in the time and resources to make this research possible. I really enjoyed your class, MN4970 Energy Economics, because it taught me to think... contributor to our economic prosperity. If we do not develop the policies that encourage the private sector to seize the opportunity, the United States
Virtue ethics – an old answer to a new dilemma? Part 2. The case for inclusive virtue ethics
2015-01-01
Summary While Principlism is a widely accepted consensus statement for ethics, the moral theory that underpins it faces serious challenges. This two-part paper proposes a version of virtue theory as a more grounded system of moral analysis. Part 2 examines the role of basic moral theory as the foundation to ethics and suggests how virtue theory can be used as a central framework for ethics while being inclusive of insights from deontology and consequentialism. PMID:25792615
A legal version of the nanoworld
NASA Astrophysics Data System (ADS)
Lacour, Stéphanie
2011-09-01
Nanosciences and nanotechnologies come into a pre-existing legal system. Their arrival, and how they are received, are worthy of analysis. Such an effort must first include purely lexical considerations, in order to trace, through their origins, the entry of these specific objects into the territory of law. The goal of this article is to explore the effects of "nanos" in various legal fields, including their relevance to the precautionary principle, patent law, and the laws applicable to chemical substances.
1993-06-09
within the framework of an update for the computer database "DiaNIK", which has been developed at the Vernadsky Institute of Geochemistry and Analytical Chemistry... chemical thermodynamic data for minerals and mineral-forming substances. The structure of the thermodynamic database "DiaNIK" is based on the principles... in the database. A substantial portion of the thermodynamic values recommended by "DiaNIK" experts for the substances in User Version 3.1 resulted from
Analysis of Learning Curve Fitting Techniques.
1987-09-01
1986. 15. Neter, John, and others. Applied Linear Regression Models. Homewood, IL: Irwin, 19-33. 16. SAS User's Guide: Basics, Version 5 Edition. SAS... Linear Regression Techniques (15:23-52). Random errors are assumed to be normally distributed when using ordinary least-squares, according to Johnston... lot estimated by the improvement curve formula. For a more detailed explanation of the ordinary least-squares technique, see Neter, et al., Applied
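As background to the ordinary least-squares fitting these snippets reference, here is a minimal numpy sketch of a log-linear learning-curve fit (Wright's model y = a·x^b; the cost values are invented for illustration):

```python
import numpy as np

# Hypothetical unit-cost data following Wright's learning-curve model
# y = a * x**b, where x is the cumulative unit number.
x = np.array([1, 2, 4, 8, 16, 32], dtype=float)
a_true, b_true = 100.0, np.log2(0.80)   # an 80% learning curve
y = a_true * x**b_true

# Taking logs linearizes the model: log y = log a + b * log x,
# so the parameters can be recovered by ordinary least squares.
X = np.column_stack([np.ones_like(x), np.log(x)])
coef, *_ = np.linalg.lstsq(X, np.log(y), rcond=None)
a_hat, b_hat = np.exp(coef[0]), coef[1]
slope_pct = 2.0**b_hat   # learning-curve "slope": cost ratio per doubling
```

The normality assumption mentioned in the text applies to the residuals of this log-space regression; with noisy real cost data the recovered `a_hat` and `slope_pct` are estimates rather than exact values.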
Procedure for Adapting Direct Simulation Monte Carlo Meshes
NASA Technical Reports Server (NTRS)
Woronowicz, Michael S.; Wilmoth, Richard G.; Carlson, Ann B.; Rault, Didier F. G.
1992-01-01
A technique is presented for adapting computational meshes used in the G2 version of the direct simulation Monte Carlo method. The physical ideas underlying the technique are discussed, and adaptation formulas are developed for use on solutions generated from an initial mesh. The effect of statistical scatter on adaptation is addressed, and results demonstrate the ability of this technique to achieve more accurate results without increasing necessary computational resources.
High-informative version of nonlinear transformation of Langmuir waves to electromagnetic waves
NASA Astrophysics Data System (ADS)
Erofeev, Vasily I.; Erofeev
2014-04-01
The concept of the informativeness of a nonlinear plasma physical scenario is discussed. Basic principles for heightening the informativeness of plasma kinetic models are explained. The earlier high-informative correlation analysis of plasma kinetics (Erofeev, V. 2011 High-Informative Plasma Theory, Saarbrücken: LAP) is generalized for studies of weakly turbulent plasmas that contain fields of solenoidal plasma waves apart from the former potential ones. The respective machinery of plasma kinetic modeling is applied to an analysis of the fusion of Langmuir waves with transformation to electromagnetic waves. It is shown that the customary version of this phenomenon (Terashima, Y. and Yajima, N. 1963 Prog. Theor. Phys. 30, 443; Akhiezer, I. A., Danelia, I. A. and Tsintsadze, N. L. 1964 Sov. Phys. JETP 19, 208; Al'tshul', L. M. and Karpman, V. I. 1965 Sov. Phys. JETP 20, 1043) substantially distorts the picture of the merging of Langmuir waves with long wavelengths (λ ≳ c/ω_pe).
Distributed and parallel Ada and the Ada 9X recommendations
NASA Technical Reports Server (NTRS)
Volz, Richard A.; Goldsack, Stephen J.; Theriault, R.; Waldrop, Raymond S.; Holzbacher-Valero, A. A.
1992-01-01
Recently, the DoD has sponsored work towards a new version of Ada, intended to support the construction of distributed systems. The revised version, often called Ada 9X, will become the new standard sometime in the 1990s. It is intended that Ada 9X should provide language features giving limited support for distributed system construction. The requirements for such features are given. Many of the most advanced computer applications involve embedded systems that are comprised of parallel processors or networks of distributed computers. If Ada is to become the widely adopted language envisioned by many, it is essential that suitable compilers and tools be available to facilitate the creation of distributed and parallel Ada programs for these applications. The major language issues impacting distributed and parallel programming are reviewed, and some principles upon which distributed/parallel language systems should be built are suggested. Based upon these, alternative language concepts for distributed/parallel programming are analyzed.
Gascón-Cánovas, Juan J; Russo de Leon, Jessica Roxanna; Cózar Fernandez, Antonio; Heredia Calzado, Jose M
2017-07-01
School bullying is a growing problem. The current study is aimed at culturally adapting and assessing the psychometric properties of a brief scale to measure bullying. A cross-cultural adaptation of the brief scale, the Adolescent Peer Relations Instrument-Bullying (APRI), was performed using the translation and back-translation technique. The Spanish version of the APRI questionnaire was administered to a sample of 1,428 schoolchildren aged 12-14 years in the region of Mar Menor in Murcia (Spain). Exploratory factor analysis with oblique rotation was used to assess the validity of the internal structure, Cronbach's alpha to analyse internal consistency, and the Kruskal-Wallis test to check the scale's ability to discriminate between subjects with varying degrees of bullying according to the Kidscreen-52 scale of social acceptability. Two factors were identified in the adapted version of the APRI (physical victimisation and verbal/social victimisation), similar to those in the original scale. The questionnaire has high internal consistency (Cronbach's alpha=0.94) and discrimination capacity (P<.01), with significant effect sizes between degrees of bullying. The internal structure of the APRI Spanish version is similar to the original, and its scores confirm high reliability and construct validity. Further studies need to be performed with broader age ranges and confirmatory analysis techniques to ratify the equivalence of the adapted version with the original. Copyright © 2015 Asociación Española de Pediatría. Publicado por Elsevier España, S.L.U. All rights reserved.
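For reference, the Cronbach's alpha statistic reported here can be computed directly from an item-score matrix; a minimal numpy sketch on invented response data (not the study's data):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)          # per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)      # variance of row totals
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Illustrative made-up responses from 6 respondents to 4 items.
data = [[3, 3, 4, 3],
        [2, 2, 2, 3],
        [4, 4, 5, 4],
        [1, 2, 1, 1],
        [3, 4, 4, 3],
        [2, 3, 2, 2]]
alpha = cronbach_alpha(data)
```

Because the invented items covary strongly, alpha comes out high; the study's reported value of 0.94 indicates a similarly consistent item set.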
ERIC Educational Resources Information Center
Otani, Akira
1989-01-01
Examines several basic hypnotherapeutic techniques (rapport building, problem assessment, resistance management, and behavior change) based on Milton H. Erickson's hypnotherapeutic principles that can be translated into the general framework of counseling. (Author/CM)
Ensuring safety of implanted devices under MRI using reversed RF polarization.
Overall, William R; Pauly, John M; Stang, Pascal P; Scott, Greig C
2010-09-01
Patients with long-wire medical implants are currently prevented from undergoing magnetic resonance imaging (MRI) scans due to the risk of radio frequency (RF) heating. We have developed a simple technique for determining the heating potential for these implants using reversed radio frequency (RF) polarization. This technique could be used on a patient-to-patient basis as a part of the standard prescan procedure to ensure that the subject's device does not pose a heating risk. By using reversed quadrature polarization, the MR scan can be sensitized exclusively to the potentially dangerous currents in the device. Here, we derive the physical principles governing the technique and explore the primary sources of inaccuracy. These principles are verified through finite-difference simulations and through phantom scans of implant leads. These studies demonstrate the potential of the technique for sensitively detecting potentially dangerous coupling conditions before they can do any harm. 2010 Wiley-Liss, Inc.
The Golosiiv on-line plate archive database, management and maintenance
NASA Astrophysics Data System (ADS)
Pakuliak, L.; Sergeeva, T.
2007-08-01
We intend to create an online version of the database of the MAO NASU plate archive as VO-compatible structures, in accordance with the principles developed by the International Virtual Observatory Alliance, in order to make it available to the world astronomical community. The online version of the log-book database is built with MySQL and PHP. The data management system provides a user interface that supports detailed, traditional form-filling radial searches of plates, auxiliary samplings, and listings of each collection, and permits browsing of the detailed descriptions of the collections. The administrative tool allows the database administrator to correct data, to add new data sets, and to control the integrity and consistency of the database as a whole. The VO-compatible database is currently being constructed to the demands, and in accordance with the principles, of international data archives, and has to be strongly generalized in order to support data mining through standard interfaces and to best fit the demands of the WFPDB Group for databases of plate catalogues. Ongoing enhancement of the database toward the WFPDB brings the problem of data verification to the forefront, as it demands a high degree of data reliability. The process of data verification is practically endless and inseparable from data management, owing to the diverse nature of data errors and, consequently, the variety of approaches needed to identify and fix them. The current status of the MAO NASU glass archive forces activity in both directions simultaneously: enhancement of the log-book database with new sets of observational data, as well as creation of the generalized database and cross-identification between the two. The VO-compatible version of the database is being supplied with digitized data from plates obtained with a MicroTek ScanMaker 9800 XL TMA.
Scanning is not exhaustive but is conducted selectively within the framework of special projects.
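The form-filling radial search of plates mentioned above can be illustrated with a minimal sketch. The plate identifiers, coordinates, and the in-memory list below are hypothetical stand-ins; the abstract does not describe the actual MAO NASU schema or field names.

```python
import math

# Hypothetical plate centers (identifier, RA, Dec in degrees)
plates = [
    ("plate-001", 10.68, 41.27),   # near M31
    ("plate-002", 201.37, -43.02),
    ("plate-003", 11.00, 41.00),
]

def ang_sep_deg(ra1, dec1, ra2, dec2):
    """Angular separation on the sphere (spherical law of cosines)."""
    ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
    c = (math.sin(dec1) * math.sin(dec2)
         + math.cos(dec1) * math.cos(dec2) * math.cos(ra1 - ra2))
    # clamp against rounding before acos
    return math.degrees(math.acos(min(1.0, max(-1.0, c))))

def radial_search(plates, ra, dec, radius_deg):
    """Return identifiers of plates whose centers fall within the cone."""
    return [pid for pid, pra, pdec in plates
            if ang_sep_deg(pra, pdec, ra, dec) <= radius_deg]

hits = radial_search(plates, 10.68, 41.27, 1.0)
```

In a MySQL-backed implementation the same separation formula would typically be evaluated in the WHERE clause of the query rather than in application code.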
Estimating skin sensitization potency from a single dose LLNA.
Roberts, David W
2015-04-01
Skin sensitization is an important aspect of safety assessment. The mouse local lymph node assay (LLNA), developed in the 1990s, is an in vivo test used for skin sensitization hazard identification and characterization. More recently, a reduced version of the LLNA (rLLNA) has been developed as a means of identifying, but not quantifying, sensitization hazard. The work presented here is aimed at enabling rLLNA data to be used to give quantitative potency information that can be used, inter alia, in modeling and read-across approaches to non-animal-based potency estimation. A probit function has been derived enabling estimation of EC3 from a single dose. This has led to the development of a modified version of the rLLNA, whereby, as a general principle, the SI value at 10%, or at a lower concentration if 10% is not testable, is used to calculate the EC3. This version of the rLLNA has been evaluated against a selection of chemicals for which full LLNA data are available, and has been shown to give EC3 values in good agreement with those derived from the full LLNA. Copyright © 2015 Elsevier Inc. All rights reserved.
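The abstract does not give the coefficients of the paper's probit function, so the single-dose estimate cannot be reproduced here. As background, the conventional full-LLNA EC3 calculation linearly interpolates the stimulation index (SI) between the two tested concentrations that bracket SI = 3; the dose-response values below are hypothetical.

```python
def ec3(doses_si):
    """Conventional full-LLNA EC3 estimate.

    doses_si: list of (concentration %, stimulation index) pairs in
    ascending concentration order. Returns the interpolated concentration
    at which SI = 3, or None if SI = 3 is never reached.
    """
    for (c_a, d_a), (c_b, d_b) in zip(doses_si, doses_si[1:]):
        if d_a < 3 <= d_b:
            # linear interpolation across the bracketing doses
            return c_a + (3 - d_a) / (d_b - d_a) * (c_b - c_a)
    return None   # not classified as a sensitizer in the assay

# hypothetical dose-response: SI crosses 3 between 5% and 10%
estimate = ec3([(2.5, 1.4), (5.0, 2.0), (10.0, 4.0)])
```

The paper's contribution is precisely to replace the need for this multi-dose bracketing with a fitted probit relation evaluated at a single dose.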
Advances in Liposuction: Five Key Principles with Emphasis on Patient Safety and Outcomes
Tabbal, Geo N.; Ahmad, Jamil; Lista, Frank
2013-01-01
Summary: Since Illouz’s presentation of a technique for lipoplasty at the 1982 Annual Meeting of the American Society of Plastic and Reconstructive Surgeons, liposuction has become one of the most commonly performed aesthetic surgery procedures. The evolution of liposuction has seen refinements in technique and improvement of patient safety-related standards of care. Based on long-term experience with body contouring surgery, 5 principles of advanced liposuction are presented: preoperative evaluation and planning, intraoperative monitoring—safety measures, the role of wetting solutions and fluid resuscitation, circumferential contouring and complication prevention, and outcomes measurement. PMID:25289270
Teaching ethical analysis in occupational therapy.
Haddad, A M
1988-05-01
Ethical decision making is a cognitive skill requiring education in ethical principles and an understanding of specific ethical issues. It is also a psychodynamic process involving personalities, values, opinions, and perceptions. This article proposes the use of case studies and role-playing techniques in teaching ethics in occupational therapy to supplement conventional methods of presenting ethical theories and principles. These two approaches invite students to discuss and analyze crucial issues in occupational therapy from a variety of viewpoints. The methodology for developing case studies and role-playing exercises is discussed. The techniques are evaluated and their application to the teaching of ethics is examined.
The quantum limit for gravitational-wave detectors and methods of circumventing it
NASA Technical Reports Server (NTRS)
Thorne, K. S.; Caves, C. M.; Sandberg, V. D.; Zimmermann, M.; Drever, R. W. P.
1979-01-01
The Heisenberg uncertainty principle prevents the monitoring of the complex amplitude of a mechanical oscillator more accurately than a certain limit value. This 'quantum limit' is a serious obstacle to the achievement of a gravitational-wave detection sensitivity of 10^-21. This paper examines the principles of the back-action evasion technique and finds that this technique may be able to overcome the problem of the quantum limit. Back-action evasion does not solve, however, other problems of detection, such as weak coupling, large amplifier noise, and large Nyquist noise.
Basic principles of Hasse diagram technique in chemistry.
Brüggemann, Rainer; Voigt, Kristina
2008-11-01
Principles of partial order applied to ranking are explained. The Hasse diagram technique (HDT) is the application of partial order theory based on a data matrix. In this paper, HDT is introduced in a stepwise procedure, and some elementary theorems are exemplified. The focus is to show how the multivariate character of a data matrix is realized by HDT and in which cases one should apply other mathematical or statistical methods. Many simple examples illustrate the basic theoretical ideas. Finally, it is shown that HDT is a useful alternative for the evaluation of antifouling agents, which was originally performed by amoeba diagrams.
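The core of HDT, ordering the rows of a data matrix by simultaneous comparison over all columns, can be sketched briefly. The data matrix below is a toy example, not taken from the paper.

```python
import numpy as np

# Toy data matrix: rows = objects (e.g. chemicals), columns = criteria
# (values are illustrative only)
X = np.array([
    [1, 2, 1],   # object 0
    [2, 3, 2],   # object 1: >= object 0 in every criterion
    [3, 1, 2],   # object 2: incomparable with both others
])

def hasse_order(X):
    """leq[i, j] is True when object i <= object j in every column."""
    n = len(X)
    return np.array([[np.all(X[i] <= X[j]) for j in range(n)]
                     for i in range(n)])

def hasse_edges(leq):
    """Cover relation: the edges actually drawn in the Hasse diagram,
    i.e. i < j with no third object k strictly between them."""
    n = len(leq)
    return [(i, j) for i in range(n) for j in range(n)
            if i != j and leq[i][j]
            and not any(leq[i][k] and leq[k][j]
                        for k in range(n) if k not in (i, j))]

leq = hasse_order(X)
edges = hasse_edges(leq)
```

The multivariate character of the data matrix shows up exactly in the incomparable pairs: objects 1 and 2 each "win" on different criteria, so neither dominates the other and no edge connects them.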
Surgical treatment of osteoporotic fractures: An update on the principles of management.
Yaacobi, Eyal; Sanchez, Daniela; Maniar, Hemil; Horwitz, Daniel S
2017-12-01
The treatment of osteoporotic fractures continues to challenge orthopedic surgeons. The fragility of the underlying bone, in conjunction with the need for specific implants, has led to the development of explicit surgical techniques in order to minimize implant-failure-related complications, morbidity, and mortality. From the patient's perspective, the existence of frailty, dementia, and other medical co-morbidities creates a complex situation necessitating high vigilance during the perioperative and post-operative period. This update reviews current principles and techniques essential to successful surgical treatment of these injuries. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Sheffner, E. J.; Hlavka, C. A.; Bauer, E. M.
1984-01-01
Two techniques have been developed for the mapping and area estimation of small grains in California from Landsat digital data: Band Ratio Thresholding, a semi-automated version of a manual procedure, and LCLS, a layered classification technique which can be fully automated and is based on established clustering and classification technology. Preliminary evaluation results indicate that both techniques have the potential to provide map products which can be incorporated into existing inventory procedures, as well as automated alternatives to traditional inventory techniques and to those which currently employ Landsat imagery.
Special feature on imaging systems and techniques
NASA Astrophysics Data System (ADS)
Yang, Wuqiang; Giakos, George
2013-07-01
The IEEE International Conference on Imaging Systems and Techniques (IST'2012) was held in Manchester, UK, on 16-17 July 2012. The participants came from 26 countries or regions: Austria, Brazil, Canada, China, Denmark, France, Germany, Greece, India, Iran, Iraq, Italy, Japan, Korea, Latvia, Malaysia, Norway, Poland, Portugal, Sweden, Switzerland, Taiwan, Tunisia, UAE, UK and USA. The technical program of the conference consisted of a series of scientific and technical sessions, exploring physical principles, engineering and applications of new imaging systems and techniques, as reflected by the diversity of the submitted papers. Following a rigorous review process, a total of 123 papers were accepted, and they were organized into 30 oral presentation sessions and a poster session. In addition, six invited keynotes were arranged. The conference not only provided the participants with a unique opportunity to exchange ideas and disseminate research outcomes but also paved the way to establish global collaboration. Following the IST'2012, a total of 55 papers, which were technically extended substantially from their versions in the conference proceedings, were submitted as regular papers to this special feature of Measurement Science and Technology. Following a rigorous review process, 25 papers have been finally accepted for publication in this special feature and they are organized into three categories: (1) industrial tomography, (2) imaging systems and techniques and (3) image processing. These papers not only present the latest developments in the field of imaging systems and techniques but also offer potential solutions to existing problems. We hope that this special feature provides a good reference for researchers who are active in the field and will serve as a catalyst to trigger further research. It has been our great pleasure to be the guest editors of this special feature.
We would like to thank the authors for their contributions, without which it would not be possible to have this special feature published. We are grateful to all reviewers, who devoted their time and effort, on a voluntary basis, to ensure that all submissions were reviewed rigorously and fairly. The publishing staff of Measurement Science and Technology are particularly acknowledged for giving us timely advice on guest-editing this special feature.
Launch vehicle systems design analysis
NASA Technical Reports Server (NTRS)
Ryan, Robert; Verderaime, V.
1993-01-01
Current launch vehicle design emphasis is on low life-cycle cost. This paper applies total quality management (TQM) principles to a conventional systems design analysis process to provide low-cost, high-reliability designs. Suggested TQM techniques include Steward's systems information flow matrix method, quality leverage principle, quality through robustness and function deployment, Pareto's principle, Pugh's selection and enhancement criteria, and other design process procedures. TQM quality performance at least-cost can be realized through competent concurrent engineering teams and brilliance of their technical leadership.
Finite-temperature Gutzwiller approximation from the time-dependent variational principle
NASA Astrophysics Data System (ADS)
Lanatà, Nicola; Deng, Xiaoyu; Kotliar, Gabriel
2015-08-01
We develop an extension of the Gutzwiller approximation to finite temperatures based on the Dirac-Frenkel variational principle. Our method does not rely on any entropy inequality, and is substantially more accurate than the approaches proposed in previous works. We apply our theory to the single-band Hubbard model at different fillings, and show that our results compare quantitatively well with dynamical mean field theory in the metallic phase. We discuss potential applications of our technique within the framework of first-principles calculations.
The Value of the Operational Principle in Instructional Design
ERIC Educational Resources Information Center
Gibbons, Andrew S.
2009-01-01
Formal design studies are increasing our insight into design processes, including those of instructional design. Lessons are being learned from other design fields, and new techniques and concepts can be imported as they are demonstrated effective. The purpose of this article is to introduce a design concept--the "operational principle"--for…
Modeling Success: Using Preenrollment Data to Identify Academically At-Risk Students
ERIC Educational Resources Information Center
Gansemer-Topf, Ann M.; Compton, Jonathan; Wohlgemuth, Darin; Forbes, Greg; Ralston, Ekaterina
2015-01-01
Improving student success and degree completion is one of the core principles of strategic enrollment management. To address this principle, institutional data were used to develop a statistical model to identify academically at-risk students. The model employs multiple linear regression techniques to predict students at risk of earning below a…
Personality Plus. (Your Guide to Better Health and Personal Development).
ERIC Educational Resources Information Center
de la Concha, Hector
This manual is designed to acquaint individuals with basic principles of personal health, grooming, and personality development. Addressed in the individual chapters of the guide are the following topics: self-evaluation and completion of a personal inventory; proper diet and principles of maintaining a sound body; relaxation techniques; key words…
The Relative Performance of Female and Male Students in Accounting Principles Classes.
ERIC Educational Resources Information Center
Bouillon, Marvin L.; Doran, B. Michael
1992-01-01
The performance of female and male students in Accounting Principles (AP) I and II was compared by using multiple regression techniques to assess the incremental explanatory effects of gender. Males significantly outperformed females in AP I, contradicting earlier studies. Similar gender of instructor and student was insignificant. (JOW)
2012-03-01
by Toyota in the manufacturing world, such as just-in-time, kaizen, one-piece flow, jidoka, and heijunka. These techniques helped spawn the “lean...relentless reflection (hansei) and continuous improvement (kaizen) Similar to Principle 2, this principle shares similarity with space program
Computational techniques in tribology and material science at the atomic level
NASA Technical Reports Server (NTRS)
Ferrante, J.; Bozzolo, G. H.
1992-01-01
Computations in tribology and material science at the atomic level present considerable difficulties. Computational techniques ranging from first-principles to semi-empirical and their limitations are discussed. Example calculations of metallic surface energies using semi-empirical techniques are presented. Finally, application of the methods to calculation of adhesion and friction are presented.
Data gap filling techniques are commonly used to predict hazard in the absence of empirical data. The most established techniques are read-across, trend analysis and quantitative structure-activity relationships (QSARs). Toxic equivalency factors (TEFs) are less frequently used d...
A new electron microscope technique for the study of living materials.
Kálmán, E
1979-07-01
In order to gain information on the real structure of biological specimens, the "wet technique" for electron microscopy has been developed. The construction and the working principle of a special microchamber are described. Applications of this technique for the investigation of blood cells, gametes, and various bacteria are demonstrated by micrographs.
Fantini, Sergio; Sassaroli, Angelo; Tgavalekos, Kristen T.; Kornbluth, Joshua
2016-01-01
Abstract. Cerebral blood flow (CBF) and cerebral autoregulation (CA) are critically important to maintain proper brain perfusion and supply the brain with the necessary oxygen and energy substrates. Adequate brain perfusion is required to support normal brain function, to achieve successful aging, and to navigate acute and chronic medical conditions. We review the general principles of CBF measurements and the current techniques to measure CBF based on direct intravascular measurements, nuclear medicine, X-ray imaging, magnetic resonance imaging, ultrasound techniques, thermal diffusion, and optical methods. We also review techniques for arterial blood pressure measurements as well as theoretical and experimental methods for the assessment of CA, including recent approaches based on optical techniques. The assessment of cerebral perfusion in the clinical practice is also presented. The comprehensive description of principles, methods, and clinical requirements of CBF and CA measurements highlights the potentially important role that noninvasive optical methods can play in the assessment of neurovascular health. In fact, optical techniques have the ability to provide a noninvasive, quantitative, and continuous monitor of CBF and autoregulation. PMID:27403447
Optimal control of information epidemics modeled as Maki Thompson rumors
NASA Astrophysics Data System (ADS)
Kandhway, Kundan; Kuri, Joy
2014-12-01
We model the spread of information in a homogeneously mixed population using the Maki Thompson rumor model. We formulate an optimal control problem, from the perspective of a single campaigner, to maximize the spread of information when the campaign budget is fixed. Control signals, such as advertising in the mass media, attempt to convert ignorants and stiflers into spreaders. We show the existence of a solution to the optimal control problem when the campaigning incurs non-linear costs under the isoperimetric budget constraint. The solution employs Pontryagin's Minimum Principle and a modified version of the forward-backward sweep technique for numerical computation to accommodate the isoperimetric budget constraint. The techniques developed in this paper are general and can be applied to similar optimal control problems in other areas. We have allowed the spreading rate of the information epidemic to vary over the campaign duration to model practical situations when the interest level of the population in the subject of the campaign changes with time. The shape of the optimal control signal is studied for different model parameters and spreading rate profiles. We have also studied the variation of the optimal campaigning costs with respect to various model parameters. Results indicate that, for some model parameters, significant improvements can be achieved by the optimal strategy compared to the static control strategy. The static strategy respects the same budget constraint as the optimal strategy and has a constant value throughout the campaign horizon. This work finds application in election and social awareness campaigns, product advertising, movie promotion and crowdfunding campaigns.
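The forward-backward sweep machinery can be sketched on a toy linear-quadratic problem rather than the paper's rumor model, whose dynamics and constraint handling are not reproduced here: minimize J = ∫₀¹ (x² + u²) dt subject to x' = -x + u, x(0) = 1. Pontryagin's principle gives the Hamiltonian H = x² + u² + λ(-x + u), the optimality condition 2u + λ = 0 (so u = -λ/2), and the costate equation λ' = -∂H/∂x = -2x + λ with λ(1) = 0.

```python
import numpy as np

N = 1000
t = np.linspace(0.0, 1.0, N + 1)
h = t[1] - t[0]
u = np.zeros(N + 1)                      # initial control guess

for _ in range(200):
    # forward sweep: integrate the state under the current control
    x = np.empty(N + 1)
    x[0] = 1.0
    for k in range(N):
        x[k + 1] = x[k] + h * (-x[k] + u[k])
    # backward sweep: integrate the costate from its terminal condition
    lam = np.empty(N + 1)
    lam[N] = 0.0
    for k in range(N, 0, -1):
        lam[k - 1] = lam[k] - h * (-2.0 * x[k] + lam[k])
    # control update from the minimum condition, with relaxation
    u_new = -lam / 2.0
    if np.max(np.abs(u_new - u)) < 1e-8:
        break
    u = 0.5 * u + 0.5 * u_new

# trapezoidal estimate of the achieved cost J
integrand = x**2 + u**2
J = float(h * (0.5 * integrand[0] + integrand[1:-1].sum() + 0.5 * integrand[-1]))
```

The paper's modification adds a second outer loop that adjusts a multiplier so the isoperimetric budget constraint is met; the core alternation of forward state sweep, backward costate sweep, and control update is as above.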
NASA Technical Reports Server (NTRS)
Burgin, G. H.; Fogel, L. J.; Phelps, J. P.
1975-01-01
A technique for computer simulation of air combat is described. Volume 1 describes the computer program and its development in general terms. Two versions of the program exist. Both incorporate a logic for selecting and executing air combat maneuvers with performance models of specific fighter aircraft. In the batch processing version, the flight paths of two aircraft engaged in interactive aerial combat and controlled by the same logic are computed. The real-time version permits human pilots to fly air-to-air combat against the adaptive maneuvering logic (AML) in the Langley Differential Maneuvering Simulator (DMS). Volume 2 consists of a detailed description of the computer programs.
1996-09-01
Generalized Likelihood Ratio (GLR) and voting techniques. The third class consisted of multiple hypothesis filter detectors, specifically the MMAE. The...vector version, versus a tensor if we use the matrix version of the power spectral density estimate. Using this notation, we will derive an...as MATLAB, have an intrinsic sample covariance computation available, which makes this method quite easy to implement. In practice, the mean for the
Multigrid Techniques for Highly Indefinite Equations
NASA Technical Reports Server (NTRS)
Shapira, Yair
1996-01-01
A multigrid method for the solution of finite difference approximations of elliptic PDEs is introduced. A parallelizable version of it, suitable for two-level and multilevel analysis, is also defined, and serves as a theoretical tool for deriving a suitable implementation for the main version. For indefinite Helmholtz equations, this analysis provides a suitable mesh size for the coarsest grid used. Numerical experiments show that the method is applicable to diffusion equations with discontinuous coefficients and highly indefinite Helmholtz equations.
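The basic multigrid mechanism the abstract builds on can be illustrated with a two-grid cycle for the definite model problem -u'' = f on (0,1) with u(0) = u(1) = 0. This is only a sketch of the standard components (smoothing, restriction, coarse solve, prolongation); the paper's method targets indefinite Helmholtz problems and uses multiple levels.

```python
import numpy as np

def residual(u, f, h):
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] - (2 * u[1:-1] - u[:-2] - u[2:]) / h**2
    return r

def jacobi(u, f, h, sweeps, omega=2.0 / 3.0):
    # weighted-Jacobi smoothing: damps oscillatory error components
    for _ in range(sweeps):
        u = u + omega * (h**2 / 2.0) * residual(u, f, h)
    return u

def two_grid(u, f, h):
    u = jacobi(u, f, h, 3)                        # pre-smoothing
    r = residual(u, f, h)
    nc = (u.size + 1) // 2                        # coarse grid size
    rc = np.zeros(nc)                             # full-weighting restriction
    rc[1:-1] = 0.25 * (r[1:-2:2] + 2.0 * r[2:-1:2] + r[3::2])
    H = 2.0 * h                                   # coarse mesh size
    A = (2.0 * np.eye(nc - 2) - np.eye(nc - 2, k=1)
         - np.eye(nc - 2, k=-1)) / H**2
    ec = np.zeros(nc)
    ec[1:-1] = np.linalg.solve(A, rc[1:-1])       # exact coarse-grid solve
    e = np.interp(np.arange(u.size), np.arange(0, u.size, 2), ec)
    u = u + e                                     # coarse-grid correction
    return jacobi(u, f, h, 3)                     # post-smoothing

n = 65
h = 1.0 / (n - 1)
x = np.linspace(0.0, 1.0, n)
f = np.pi**2 * np.sin(np.pi * x)                  # exact solution: sin(pi x)
u = np.zeros(n)
r0 = np.linalg.norm(residual(u, f, h))
for _ in range(5):
    u = two_grid(u, f, h)
rn = np.linalg.norm(residual(u, f, h))
```

The abstract's point about the coarsest mesh size is visible here: for an indefinite Helmholtz operator the coarse matrix A can misrepresent (or even flip the sign of) near-kernel modes unless the coarsest grid is kept fine enough.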
A Course in Heterogeneous Catalysis: Principles, Practice, and Modern Experimental Techniques.
ERIC Educational Resources Information Center
Wolf, Eduardo E.
1981-01-01
Outlines a multidisciplinary course which comprises fundamental, practical, and experimental aspects of heterogeneous catalysis. The course structure is a combination of lectures and demonstrations dealing with the use of spectroscopic techniques for surface analysis. (SK)
Chemical Principles Revisited: Archaeological Dating.
ERIC Educational Resources Information Center
Rowe, M. W.
1986-01-01
Discusses methods used to date archaeological artifacts and other remains. They include: (1) nuclear dating techniques (radiocarbon dating, accelerator radiocarbon dating, thermoluminescence, and others); (2) chemical dating techniques (amino acid racemization, obsidian hydration dating, elemental content changes, and thermal analysis dating); and…
Experimental Study of Residual Stresses in Rail by Moire Interferometry
DOT National Transportation Integrated Search
1993-09-01
The residual stresses in rails produced by rolling cycles are studied experimentally by moire interferometry. The dissection technique is adopted for this investigation. The basic principle of the dissection technique is that the residual stress is r...
Methods for coherent lensless imaging and X-ray wavefront measurements
NASA Astrophysics Data System (ADS)
Guizar Sicairos, Manuel
X-ray diffractive imaging is set apart from other high-resolution imaging techniques (e.g. scanning electron or atomic force microscopy) for its high penetration depth, which enables tomographic 3D imaging of thick samples and buried structures. Furthermore, using short x-ray pulses, it enables the capability to take ultrafast snapshots, giving a unique opportunity to probe nanoscale dynamics at femtosecond time scales. In this thesis we present improvements to phase retrieval algorithms, assess their performance through numerical simulations, and develop new methods for both imaging and wavefront measurement. Building on the original work by Faulkner and Rodenburg, we developed an improved reconstruction algorithm for phase retrieval with transverse translations of the object relative to the illumination beam. Based on gradient-based nonlinear optimization, this algorithm is capable of estimating the object, and at the same time refining the initial knowledge of the incident illumination and the object translations. The advantages of this algorithm over the original iterative transform approach are shown through numerical simulations. Phase retrieval has already shown substantial success in wavefront sensing at optical wavelengths. Although in principle the algorithms can be used at any wavelength, in practice the focus-diversity mechanism that makes optical phase retrieval robust is not practical to implement for x-rays. In this thesis we also describe the novel application of phase retrieval with transverse translations to the problem of x-ray wavefront sensing. This approach allows the characterization of the complex-valued x-ray field in-situ and at-wavelength and has several practical and algorithmic advantages over conventional focused beam measurement techniques. 
A few of these advantages include improved robustness through diverse measurements, reconstruction from far-field intensity measurements only, and significant relaxation of experimental requirements over other beam characterization approaches. Furthermore, we show that a one-dimensional version of this technique can be used to characterize an x-ray line focus produced by a cylindrical focusing element. We provide experimental demonstrations of the latter at hard x-ray wavelengths, where we have characterized the beams focused by a kinoform lens and an elliptical mirror. In both experiments the reconstructions exhibited good agreement with independent measurements, and in the latter a small mirror misalignment was inferred from the phase retrieval reconstruction. These experiments pave the way for the application of robust phase retrieval algorithms for in-situ alignment and performance characterization of x-ray optics for nanofocusing. We also present a study on how transverse translations help with the well-known uniqueness problem of one-dimensional phase retrieval. We also present a novel method for x-ray holography that is capable of reconstructing an image using an off-axis extended reference in a non-iterative computation, greatly generalizing an earlier approach by Podorov et al. The approach, based on the numerical application of derivatives on the field autocorrelation, was developed from first mathematical principles. We conducted a thorough theoretical study to develop technical and intuitive understanding of this technique and derived sufficient separation conditions required for an artifact-free reconstruction. We studied the effects of missing information in the Fourier domain, and of an imperfect reference, and we provide a signal-to-noise ratio comparison with the more traditional approach of Fourier transform holography. 
We demonstrated this new holographic approach through proof-of-principle optical experiments and later experimentally at soft x-ray wavelengths, where we compared its performance to Fourier transform holography, iterative phase retrieval and state-of-the-art zone-plate x-ray imaging techniques (scanning and full-field). Finally, we present a demonstration of the technique using a single 20 fs pulse from a high-harmonic table-top source. Holography with an extended reference is shown to provide fast, good quality images that are robust to noise and artifacts that arise from missing information due to a beam stop. (Abstract shortened by UMI.)
Principle of the electrically induced Transient Current Technique
NASA Astrophysics Data System (ADS)
Bronuzzi, J.; Moll, M.; Bouvet, D.; Mapelli, A.; Sallese, J. M.
2018-05-01
In the field of detector development for High Energy Physics, the so-called Transient Current Technique (TCT) is used to characterize the electric field profile and the charge trapping inside silicon radiation detectors, in which particles or photons create electron-hole pairs in the bulk of a semiconductor device such as a PiN diode. In the standard approach, the TCT signal originates from the free carriers generated close to the surface of a silicon detector by short pulses of light or by alpha particles. This work proposes a new principle of charge injection by means of lateral PN junctions implemented in one of the detector electrodes, called the electrical TCT (el-TCT). This technique is fully compatible with CMOS technology and therefore opens new perspectives for the assessment of radiation detector performance.
Neuroimaging Techniques: a Conceptual Overview of Physical Principles, Contribution and History
NASA Astrophysics Data System (ADS)
Minati, Ludovico
2006-06-01
This paper is meant to provide a brief overview of the techniques currently used to image the brain and to study non-invasively its anatomy and function. After a historical summary in the first section, general aspects are outlined in the second section. The subsequent six sections survey, in order, computed tomography (CT), morphological magnetic resonance imaging (MRI), functional magnetic resonance imaging (fMRI), diffusion-tensor magnetic resonance imaging (DWI/DTI), positron emission tomography (PET), and electro- and magneto-encephalography (EEG/MEG) based imaging. Underlying physical principles, modelling and data processing approaches, as well as clinical and research relevance are briefly outlined for each technique. Given the breadth of the scope, there has been no attempt to be comprehensive. The ninth and final section outlines some aspects of active research in neuroimaging.
Optimization of radio telemetry receiving systems: Chapter 5.2
Evans, Scott D.; Stevenson, John R.; Adams, Noah S.; Beeman, John W.; Eiler, John H.
2012-01-01
Telemetry provides a powerful and flexible tool for studying fish and other aquatic animals, and its use has become increasingly commonplace. However, telemetry is gear intensive and typically requires more specialized knowledge and training than many other field techniques. As with other scientific methods, collecting good data is dependent on an understanding of the underlying principles behind the approach, knowing how to use the equipment and techniques properly, and recognizing what to do with the data collected. This book provides a road map for using telemetry to study aquatic animals, and provides the basic information needed to plan, implement, and conduct a telemetry study under field conditions. Topics include acoustic or radio telemetry study design, tag implantation techniques, radio and acoustic telemetry principles and case studies, and data management and analysis.
A history of telemetry in fishery research: Chapter 2
Hockersmith, Eric; Beeman, John W.; Adams, Noah S.; Beeman, John W.; Eiler, John H.
2012-01-01
Telemetry provides a powerful and flexible tool for studying fish and other aquatic animals, and its use has become increasingly commonplace. However, telemetry is gear intensive and typically requires more specialized knowledge and training than many other field techniques. As with other scientific methods, collecting good data is dependent on an understanding of the underlying principles behind the approach, knowing how to use the equipment and techniques properly, and recognizing what to do with the data collected. This book provides a road map for using telemetry to study aquatic animals, and provides the basic information needed to plan, implement, and conduct a telemetry study under field conditions. Topics include acoustic or radio telemetry study design, tag implantation techniques, radio and acoustic telemetry principles and case studies, and data management and analysis.
Measuring liquid density using Archimedes' principle
NASA Astrophysics Data System (ADS)
Hughes, Stephen W.
2006-09-01
A simple technique is described for measuring absolute and relative liquid density based on Archimedes' principle. The technique involves placing a container of the liquid under test on an electronic balance and suspending a probe (e.g. a glass marble) attached to a length of line beneath the surface of the liquid. If the volume of the probe is known, the density of liquid is given by the difference between the balance reading before and after immersion of the probe divided by the volume of the probe. A test showed that the density of water at room temperature could be measured to an accuracy and precision of 0.01 ± 0.1%. The probe technique was also used to measure the relative density of milk, Coca-Cola, fruit juice, olive oil and vinegar.
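The calculation described above is a one-liner; the numbers below are illustrative and not taken from the paper.

```python
# Probe (e.g. a glass marble) of known volume suspended beneath the
# liquid surface while the container sits on an electronic balance.
V_probe = 5.0e-6            # probe volume in m^3 (5 cm^3)
m_before = 0.25000          # balance reading before immersion, kg
m_after = 0.25500           # balance reading with probe immersed, kg

# The balance registers the buoyant reaction force, equal to the weight
# of the displaced liquid, so the liquid density follows directly:
rho = (m_after - m_before) / V_probe   # kg/m^3
```

A 5.00 g increase in the reading for a 5 cm^3 probe corresponds to the density of water, 1000 kg/m^3.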
Poulakis, V; Witzsch, U; Schultheiss, D; Rathert, P; Becht, E
2004-12-01
The first reconstructive procedure for ureteropelvic junction (UPJ) obstruction was performed by Trendelenburg in 1886. The important milestones in the reconstruction of the UPJ are discussed and all available historical papers and reports since 1886 are reviewed. Kuster published the first successful dismembered pyeloplasty 5 years later, but his technique was prone to strictures. In 1892, the application of the Heineke-Mickulicz principle by Fenger resulted in bulking and kinking with obstruction. Plication of the renal pelvis, first introduced by Israel in 1896, was modified by Kelly in 1906. After the principle of the Finney pyloroplasty, von Lichtenberg designed his pyeloplasty in 1921, best suited to cases of high implantation of the ureter. Foley modified flap techniques, first introduced by Schwyzer in 1923 after the application of the Durante pyloroplasty principle, successfully to Y-V pyeloplasty in 1937. Culp and de-Weerd introduced the spiral flap in 1951. Scardino and Prince reported on the vertical flap in 1953. Patel published the extra-long spiral flap technique in 1982. In order to decrease the likelihood of stricture, Nesbit, in 1949, modified Kuster's procedure by utilizing an elliptic anastomosis. In the same year, Anderson and Hynes published their technique. With the advent of endourology, several minimally invasive procedures were applied: antegrade or retrograde endopyelotomy, balloon dilation, and laparoscopic pyeloplasty. The concept of full-thickness incision of the narrow segment followed by prolonged stenting was first described in 1903 by Albarran and was popularized by Davis in 1943. Several basic principles must be applied in order to ensure successful repair: the resultant anastomosis should be widely patent and performed in a watertight fashion, without tension. Endopyelotomy represents an alternative to open surgery.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baer, D.R.
2005-05-01
An updated version of the American Society for Testing and Materials (ASTM) guide E 1523 to the methods of charge control and charge referencing techniques in x-ray photoelectron spectroscopy has been released by ASTM [Annual Book of ASTM Standards Surface Analysis (American Society for Testing and Materials, West Conshohocken, PA, 2004), Vol. 03.06]. The guide is meant to acquaint x-ray photoelectron spectroscopy (XPS) users with the various charge control and charge referencing techniques that are and have been used in the acquisition and interpretation of XPS data from surfaces of insulating specimens. The current guide has been expanded to include new references as well as recommendations for reporting information on charge control and charge referencing. The previous version of the document had been published in 1997 [D. R. Baer and K. D. Bomben, J. Vac. Sci. Technol. A 16, 754 (1998)].
An Overview Of Wideband Signal Analysis Techniques
NASA Astrophysics Data System (ADS)
Speiser, Jeffrey M.; Whitehouse, Harper J.
1989-11-01
This paper provides a unifying perspective for several narrowband and wideband signal processing techniques. It considers narrowband ambiguity functions and Wigner-Ville distributions, together with the wideband ambiguity function and several proposed approaches to a wideband version of the Wigner-Ville distribution (WVD). The unifying framework is the methodology of unitary representations and ray representations of transformation groups.
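The narrowband WVD mentioned above is computable directly: it is the Fourier transform, over the lag variable, of the instantaneous autocorrelation of the signal. The sketch below is a minimal pseudo-WVD for a sampled analytic signal, offered only as an illustration of the definition, not of the paper's group-representation framework; note that on this discrete grid a complex tone at frequency bin k produces a ridge at bin 2k.

```python
import numpy as np

def wigner_ville(x):
    """Discrete pseudo Wigner-Ville distribution of an analytic signal:
    for each time index, form the instantaneous autocorrelation over the
    admissible lags, then FFT over the lag variable."""
    x = np.asarray(x, dtype=complex)
    n = len(x)
    wvd = np.zeros((n, n))
    for t in range(n):
        tau_max = min(t, n - 1 - t)          # lags that stay inside the record
        acf = np.zeros(n, dtype=complex)
        for tau in range(-tau_max, tau_max + 1):
            acf[tau % n] = x[t + tau] * np.conj(x[t - tau])
        wvd[:, t] = np.real(np.fft.fft(acf))  # frequency axis is doubled
    return wvd

# A pure tone at bin 4 of a 32-sample grid concentrates at bin 8 (doubled axis)
tone = np.exp(2j * np.pi * 4 * np.arange(32) / 32)
w = wigner_ville(tone)
```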
Information System Design Methodology Based on PERT/CPM Networking and Optimization Techniques.
ERIC Educational Resources Information Center
Bose, Anindya
The dissertation attempts to demonstrate that the program evaluation and review technique (PERT)/Critical Path Method (CPM) or some modified version thereof can be developed into an information system design methodology. The methodology utilizes PERT/CPM which isolates the basic functional units of a system and sets them in a dynamic time/cost…
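The CPM computation underlying such a methodology is a forward pass (earliest finish times) and a backward pass (latest finish times) over the activity network; activities with zero slack form the critical path. The sketch below uses hypothetical task data and is a generic illustration, not the dissertation's design system.

```python
from collections import defaultdict

def critical_path(tasks):
    """CPM over an activity network. tasks maps name -> (duration, [predecessors]).
    Returns the project length and the list of zero-slack (critical) activities."""
    # Topological order: repeatedly pick tasks whose predecessors are all scheduled
    order, done = [], set()
    while len(order) < len(tasks):
        for name, (_, preds) in tasks.items():
            if name not in done and all(p in done for p in preds):
                order.append(name)
                done.add(name)
    # Forward pass: earliest finish times
    ef = {}
    for name in order:
        dur, preds = tasks[name]
        ef[name] = dur + max((ef[p] for p in preds), default=0)
    project_len = max(ef.values())
    # Backward pass: latest finish times, driven by each task's successors
    succs = defaultdict(list)
    for name, (_, preds) in tasks.items():
        for p in preds:
            succs[p].append(name)
    lf = {}
    for name in reversed(order):
        lf[name] = min((lf[s] - tasks[s][0] for s in succs[name]), default=project_len)
    critical = [n for n in order if lf[n] - ef[n] == 0]
    return project_len, critical

tasks = {"A": (3, []), "B": (2, ["A"]), "C": (4, ["A"]), "D": (1, ["B", "C"])}
length, crit = critical_path(tasks)   # longest chain is A -> C -> D, length 8
```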
CMMI(Registered) for Services, Version 1.3
2010-11-01
ISO 2008b] ISO/IEC 27001:2005 Information technology – Security techniques – Information Security Management Systems – Requirements [ISO/IEC 2005...Commission. ISO/IEC 27001 Information Technology – Security Techniques – Information Security Management Systems – Requirements, 2005. http...CMM or International Organization for Standardization (ISO) 9001, you will immediately recognize many similarities in their structure and content
CMMI(Registered) for Acquisition, Version 1.3. CMMI-ACQ, V1.3
2010-11-01
and Software Engineering – System Life Cycle Processes [ISO 2008b] ISO/IEC 27001:2005 Information technology – Security techniques – Information...International Organization for Standardization and International Electrotechnical Commission. ISO/IEC 27001 Information Technology – Security Techniques...International Organization for Standardization/International Electrotechnical Commission (ISO/IEC) body of standards. CMMs focus on improving processes
ERIC Educational Resources Information Center
Crino, Michael D.; And Others
1985-01-01
The random response technique was compared to a direct questionnaire, administered to college students, to investigate whether or not the responses predicted the social desirability of the item. Results suggest support for the hypothesis. A 33-item version of the Marlowe-Crowne Social Desirability Scale which was used is included. (GDC)
NASA Astrophysics Data System (ADS)
Ward, A. J.; Pendry, J. B.
2000-06-01
In this paper we present an updated version of our ONYX program for calculating photonic band structures using a non-orthogonal finite difference time domain method. This new version employs the same transparent formalism as the first version, with the same capabilities for calculating photonic band structures or causal Green's functions, but also includes extra subroutines for the calculation of transmission and reflection coefficients. Both the electric and magnetic fields are placed onto a discrete lattice by approximating the spatial and temporal derivatives with finite differences. This results in discrete versions of Maxwell's equations which can be used to integrate the fields forwards in time. The time required for a calculation using this method scales linearly with the number of real space points used in the discretization, so the technique is ideally suited to handling systems with large and complicated unit cells.
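The leapfrog integration of the discretized Maxwell equations can be illustrated in one dimension with fields on a staggered grid. This is a minimal sketch in normalized units (dt = dx, c = 1) with a soft Gaussian source, not the ONYX program's non-orthogonal three-dimensional formulation.

```python
import numpy as np

def fdtd_1d(steps=400, n=200, src=50):
    """1D FDTD (Yee) scheme: E lives on integer grid points, H on the
    half-integer points between them; the two are updated alternately."""
    ez = np.zeros(n)   # electric field
    hy = np.zeros(n)   # magnetic field (staggered half a cell)
    for t in range(steps):
        # H update from the finite-difference curl of E
        hy[:-1] += ez[1:] - ez[:-1]
        # E update from the finite-difference curl of H
        ez[1:] += hy[1:] - hy[:-1]
        # Soft source: inject a Gaussian pulse at one grid point
        ez[src] += np.exp(-((t - 30) / 10.0) ** 2)
    return ez

fields = fdtd_1d(steps=200)   # field snapshot after 200 normalized time steps
```

With this normalization the Courant number is 1, the "magic time step" at which the 1D scheme propagates pulses without numerical dispersion.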
[Fetal version as ambulatory intervention].
Nohe, G; Hartmann, W; Klapproth, C E
1996-06-01
The external cephalic version (ECV) of the fetus at term reduces the maternal and fetal risks of intrapartum breech presentation and Caesarean delivery. Since 1986, over 800 external cephalic versions have been performed in the outpatient Department of Obstetrics and Gynaecology of the Städtische Frauenklinik Stuttgart; 60.5% were successful. No severe complications occurred. Sufficient amniotic fluid and a mobile fetal breech are major criteria for the success of the ECV. Management requires a technique that is safe for mother and fetus. This includes ultrasonography, electronic fetal monitoring and the ability to perform immediate Caesarean delivery, as well as the performance of ECV without analgesics and sedatives. More than 70% of the ECVs were successful without tocolysis. In unsuccessful cases the additional use of tocolysis improves the success rate only slightly; therefore, routine use of tocolysis does not appear necessary. External cephalic version can be recommended as an outpatient treatment without tocolysis.
NASA Astrophysics Data System (ADS)
Murray, Natalie; Bourne, Neil; Field, John
1997-07-01
Brar and Bless pioneered the use of plate impact upon bars as a technique for investigating the 1D stress loading of glass. We wish to extend this technique by applying VISAR and embedded stress gauge measurements to a symmetrical version of the test. In this configuration two rods impact one upon the other in a symmetrical version of the Taylor test geometry, in which the impact is perfectly rigid in the centre-of-mass frame. Previous work in the laboratory has characterised the three glass types (float, borosilicate and a high-density lead glass). These experiments will identify the 1D stress failure mechanisms from high-speed photography, and the stress and particle velocity histories will be interpreted in the light of these results. The differences in response of the three glasses will be highlighted.
Remote sensing at the University of Kansas in radar systems
NASA Technical Reports Server (NTRS)
Moore, R. K.
1970-01-01
Demonstration that a spectral response across an octave bandwidth in the microwave region is as variable as the comparable response in the visible and infrared region is a major milestone and indicates that the potential of polypanchromatic radar systems is analogous to that of color photography. Averaging of the returns from a target element appears necessary to obtain a grey scale adequate for many earth-science applications of radar systems. This result can be obtained either by azimuth averaging or by the use of panchromatic techniques (range averaging). Improvement with panchromatic techniques has been demonstrated both with a land-based electromagnetic system and with an ultrasonic simulator. The advantage of the averaging achieved in azimuth with the real-aperture version of the DPD-2, when compared with the synthetic-aperture version, confirms the concept.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tucker, T.C.
1980-06-01
The implementation of a version of the Rutherford Laboratory's magnetostatic computer code GFUN3D on the CDC 7600 at the National Magnetic Fusion Energy Computer Center is reported. A new iteration technique that greatly increases the probability of convergence and reduces computation time by about 30% for calculations with nonlinear, ferromagnetic materials is included. The use of GFUN3D on the NMFE network is discussed, and suggestions for future work are presented. Appendix A consists of revisions to the GFUN3D User Guide (published by Rutherford Laboratory) that are necessary to use this version. Appendix B contains input and output for some sample calculations. Appendix C is a detailed discussion of the old and new iteration techniques.
Feedback power control strategies in wireless sensor networks with joint channel decoding.
Abrardo, Andrea; Ferrari, Gianluigi; Martalò, Marco; Perna, Fabio
2009-01-01
In this paper, we derive feedback power control strategies for block-faded multiple access schemes with correlated sources and joint channel decoding (JCD). In particular, upon the derivation of the feasible signal-to-noise ratio (SNR) region for the considered multiple access schemes, i.e., the multidimensional SNR region where error-free communications are, in principle, possible, two feedback power control strategies are proposed: (i) a classical feedback power control strategy, which aims at equalizing all link SNRs at the access point (AP), and (ii) an innovative optimized feedback power control strategy, which tries to make the network operational point fall in the feasible SNR region at the lowest overall transmit energy consumption. These strategies will be referred to as "balanced SNR" and "unbalanced SNR," respectively. While they require, in principle, an unlimited power control range at the sources, we also propose practical versions with a limited power control range. We first consider a scenario with orthogonal links and ideal feedback. Then, we analyze the robustness of the proposed power control strategies to possible non-idealities, in terms of residual multiple access interference and noisy feedback channels. Finally, we successfully apply the proposed feedback power control strategies to a limiting case of the class of considered multiple access schemes, namely a central estimating officer (CEO) scenario, where the sensors observe noisy versions of a common binary information sequence and the AP's goal is to estimate this sequence by properly fusing the soft-output information generated by the JCD algorithm.
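The "balanced SNR" idea, equalizing all link SNRs at the access point, can be sketched for the orthogonal-link, ideal-feedback case. The multiplicative update below is an illustrative assumption standing in for the paper's control law, and the variable names (`gains`, `target_snr`) are hypothetical.

```python
import numpy as np

def balance_snr(gains, target_snr, noise=1.0, iters=50):
    """Iterative feedback power control: each source scales its transmit
    power so that its SNR at the access point approaches a common target.
    gains: channel power gains of the N sources (orthogonal links)."""
    p = np.ones_like(gains, dtype=float)
    for _ in range(iters):
        snr = gains * p / noise
        p *= target_snr / snr      # multiplicative feedback correction
    return p

gains = np.array([0.5, 1.0, 2.0])
p = balance_snr(gains, target_snr=4.0)
# after convergence every link satisfies gains * p / noise == target_snr
```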
Hommerson, Paul; Khan, Amjad M; de Jong, Gerhardus J; Somsen, Govert W
2011-01-01
A major step forward in the development and application of capillary electrophoresis (CE) was its coupling to ESI-MS, first reported in 1987. More than two decades later, ESI has remained the principal ionization technique in CE-MS, but a number of other ionization techniques have also been implemented. In this review the state-of-the-art in the employment of soft ionization techniques for CE-MS is presented. First the fundamentals and general challenges of hyphenating conventional CE and microchip electrophoresis with MS are outlined. After elaborating on the characteristics and role of ESI, emphasis is put on alternative ionization techniques including sonic spray ionization (SSI), thermospray ionization (TSI), atmospheric pressure chemical ionization (APCI), atmospheric pressure photoionization (APPI), matrix-assisted laser desorption ionization (MALDI) and continuous-flow fast atom bombardment (CF-FAB). The principle of each ionization technique is outlined and the experimental set-ups of the CE-MS couplings are described. The strengths and limitations of each ionization technique with respect to CE-MS are discussed and the applicability of the various systems is illustrated by a number of typical examples.
Application of majority voting and consensus voting algorithms in N-version software
NASA Astrophysics Data System (ADS)
Tsarev, R. Yu; Durmuş, M. S.; Üstoglu, I.; Morozov, V. A.
2018-05-01
N-version programming is one of the most common techniques used to improve the reliability of software by building in fault tolerance and redundancy and by decreasing common cause failures. N different equivalent software versions are developed by N different and isolated workgroups from the same software specifications. The versions solve the same task and return results that have to be compared to determine the correct result. Decisions of the N different versions are evaluated by a voting algorithm, the so-called voter. In this paper, two of the most commonly used software voting algorithms, the majority voting algorithm and the consensus voting algorithm, are studied. The distinctive features of N-version programming with majority voting and N-version programming with consensus voting are described. These two algorithms make a decision about the correct result on the basis of the agreement matrix. However, if the equivalence relation on the agreement matrix is not satisfied, it is impossible to make a decision. It is shown that the agreement matrix can be transformed, by using Boolean compositions, into a form in which the equivalence relation is satisfied.
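The two voters can be sketched directly. Here pairwise agreement between version outputs plays the role of the agreement matrix; the exact-equality `agree` predicate and tie-breaking by order are simplifying assumptions, not the paper's formulation.

```python
def majority_vote(outputs, agree=lambda a, b: a == b):
    """Majority voter for N-version programming: return the result backed
    by more than N/2 of the versions, or None if no absolute majority."""
    n = len(outputs)
    for candidate in outputs:
        support = sum(1 for o in outputs if agree(candidate, o))
        if support > n / 2:
            return candidate
    return None

def consensus_vote(outputs, agree=lambda a, b: a == b):
    """Consensus voter: return the result of the largest agreement class,
    even when it is not an absolute majority (ties broken by order)."""
    best, best_support = None, 0
    for candidate in outputs:
        support = sum(1 for o in outputs if agree(candidate, o))
        if support > best_support:
            best, best_support = candidate, support
    return best

# Three of five versions agree: majority decides. On a 2-2-1 split the
# majority voter abstains, but the consensus voter still returns a result.
assert majority_vote([7, 7, 7, 3, 5]) == 7
assert majority_vote([7, 7, 3, 3, 5]) is None
assert consensus_vote([7, 7, 3, 3, 5]) == 7
```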
Protein Signaling Networks from Single Cell Fluctuations and Information Theory Profiling
Shin, Young Shik; Remacle, F.; Fan, Rong; Hwang, Kiwook; Wei, Wei; Ahmad, Habib; Levine, R.D.; Heath, James R.
2011-01-01
Protein signaling networks among cells play critical roles in a host of pathophysiological processes, from inflammation to tumorigenesis. We report on an approach that integrates microfluidic cell handling, in situ protein secretion profiling, and information theory to determine an extracellular protein-signaling network and the role of perturbations. We assayed 12 proteins secreted from human macrophages that were subjected to lipopolysaccharide challenge, which emulates the macrophage-based innate immune response against Gram-negative bacteria. We characterize the fluctuations in protein secretion of single cells, and of small cell colonies (n = 2, 3,···), as a function of colony size. Measuring the fluctuations permits a validation of the conditions required for the application of a quantitative version of Le Chatelier's principle, as derived using information theory. This principle provides a quantitative prediction of the role of perturbations and allows a characterization of a protein-protein interaction network. PMID:21575571
Investigation of a low-cost magneto-inductive magnetometer for space science applications
NASA Astrophysics Data System (ADS)
Regoli, Leonardo H.; Moldwin, Mark B.; Pellioni, Matthew; Bronner, Bret; Hite, Kelsey; Sheinker, Arie; Ponder, Brandon M.
2018-03-01
A new sensor for measuring low-amplitude magnetic fields that is ideal for small spacecraft is presented. The novel measurement principle enables the fabrication of a low-cost sensor with low power consumption and with measuring capabilities that are comparable to recent developments for CubeSat applications. The current magnetometer, a software-modified version of a commercial sensor, is capable of detecting fields with amplitudes as low as 8.7 nT at 40 Hz and 2.7 nT at 1 Hz, with a noise floor of 4 pT/
NASA Astrophysics Data System (ADS)
YagnaSri, P.; Siddiqui, Maimuna; Vijaya Nirmala, M.
2018-03-01
The objective of this work is to develop a higher-order theory for piezoelectric composite laminated plates with a zigzag function, to determine the thermal characteristics of the piezoelectric laminated plate with the zigzag function for different aspect ratios (a/h), thickness ratios (z/h) and voltages, and to evaluate the electric potential function by solving a second-order differential equation satisfying the electric boundary conditions along the thickness direction of the piezoelectric layer. The related functions and the derivation of the equations of motion are obtained using the dynamic version of the principle of virtual work, or Hamilton's principle. The solutions are obtained using Navier's method for anti-symmetric angle-ply laminates with a specific type of simply supported boundary conditions. Computer programs have been developed for realistic prediction of stresses and deflections for various side-to-thickness ratios (a/h) and voltages.
Nomura, Yasunori; Salzetta, Nico
2016-08-04
The firewall paradox for black holes is often viewed as indicating a conflict between unitarity and the equivalence principle. We elucidate how the paradox manifests as a limitation of semiclassical theory, rather than presenting a conflict between fundamental principles. Two principal features of the fundamental and semiclassical theories address two versions of the paradox: the entanglement and typicality arguments. First, the physical Hilbert space describing excitations on a fixed black hole background in the semiclassical theory is exponentially smaller than the number of physical states in the fundamental theory of quantum gravity. Second, in addition to the Hilbert space for physical excitations, the semiclassical theory possesses an unphysically large Fock space built by creation and annihilation operators on the fixed black hole background. Understanding these features not only eliminates the necessity of firewalls but also leads to a new picture of Hawking emission contrasting pair creation at the horizon.
Permaculture: Dreamworld or Breakthrough?
ERIC Educational Resources Information Center
Johns, Maria
1992-01-01
Compares present agriculture practices to permaculture farming techniques, presents a historical perspective of permaculture and where these techniques are being successfully practiced around the world. Inserts (vignettes) enumerate the principles of permaculture and the background of Bill Mollison who conceptualized this farming practice. (MCO)
Dudzińska, Marta; Tarach, Jerzy S; Burroughs, Thomas E; Zwolak, Agnieszka; Matuszek, Beata; Smoleń, Agata; Nowakowski, Andrzej
2014-10-27
The aim of the study was to develop a Polish version of the Diabetes Quality of Life Brief Clinical Inventory (DQL-BCI) and to perform validating evaluation of selected psychometric aspects. The translation process was performed in accordance with generally accepted international principles of translation and cultural adaptation of measurement tools. Two hundred and seventy-four subjects with type 2 diabetes completed the Polish version of DQL-BCI, the generic EQ-5D questionnaire and the diabetes-specific DSC-R. The examination provides information about the reliability (internal consistency, test-retest) and the construct validity of the studied tool (the relationship between the DQL-BCI score and EQ-5D and DSC-R scales, as well as selected clinical patient characteristics). Cronbach's α (internal consistency) for the translated version of DQL-BCI was 0.76. Test-retest Pearson correlation coefficient was 0.96. Spearman's coefficient correlation between DQL-BCI score and EQ-5D index and EQ-VAS were 0.6 (p = 0.0000001) and 0.61 (p = 0.0000001) respectively. The correlation between scores of the examined tool and DSC-R total score was -0.6 (p = 0.0000001). Quality of life was lower among patients with microvascular as well as macrovascular complications and with occurring hypoglycemic episodes. The result of this study is the Polish scale used to test the quality of life of patients with diabetes, which includes the range of problems faced by patients while maintaining a patient-friendly form. High reliability of the scale and good construct validity qualify the Polish version of DQL-BCI as a reliable tool in both research and individual diagnostics.
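The internal-consistency figure reported above, Cronbach's alpha, is computed from the item variances and the variance of the total score. A minimal sketch with illustrative data (not the study's responses):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency.
    items: 2-D array-like, rows = respondents, columns = scale items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)

# Perfectly correlated items give alpha = 1.0
alpha = cronbach_alpha([[1, 1], [2, 2], [3, 3]])
```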
Using the Movies to Illustrate the Principles of Experimental Design
ERIC Educational Resources Information Center
Strelan, Peter
2018-01-01
This article presents an innovative technique for teaching the principles of experimental design in a way that is entertaining and engaging for students. Following a lecture on experimental design, students participate in an experiment in which the teacher uses a funny segment from a movie to test the influence of implicit social norms. Randomly…
Cooperative Learning Instructional Methods for CS1: Design, Implementation, and Evaluation
ERIC Educational Resources Information Center
Beck, Leland; Chizhik, Alexander
2013-01-01
Cooperative learning is a well-known instructional technique that has been applied with a wide variety of subject matter and a broad spectrum of populations. This article briefly reviews the principles of cooperative learning, and describes how these principles were incorporated into a comprehensive set of cooperative learning activities for a CS1…
To Design and Evaluate a 12th Grade Course in the Principles of Economics; Final Report.
ERIC Educational Resources Information Center
Wiggins, Suzanne E.; Sperling, John G.
Reported is the design, development, and evaluation of a one-semester course on the principles of economics for twelfth grade students. The course is intended to develop students' capacity for economic reasoning through economic theory and empirical research. To do this, teaching materials and innovative techniques for teacher training were…
ERIC Educational Resources Information Center
Kies, Cosette N.
A brief overview of the functions of public relations in libraries introduces this manual, which provides an explanation of the public relations (PR) process, including fact-finding, planning, communicating, evaluating, and marketing; some PR principles; a 10-step program that could serve as a model for planning a PR program; a discussion of PR…
SPIRAL-SPRITE: a rapid single point MRI technique for application to porous media.
Szomolanyi, P; Goodyear, D; Balcom, B; Matheson, D
2001-01-01
This study presents the application of a new, rapid, single point MRI technique which samples k space with spiral trajectories. The general principles of the technique are outlined along with application to porous concrete samples, solid pharmaceutical tablets and gas phase imaging. Each sample was chosen to highlight specific features of the method.
Cardiovascular magnetic resonance physics for clinicians: part II
2012-01-01
This is the second of two reviews that is intended to cover the essential aspects of cardiovascular magnetic resonance (CMR) physics in a way that is understandable and relevant to clinicians using CMR in their daily practice. Starting with the basic pulse sequences and contrast mechanisms described in part I, it briefly discusses further approaches to accelerate image acquisition. It then continues by showing in detail how the contrast behaviour of black blood fast spin echo and bright blood cine gradient echo techniques can be modified by adding rf preparation pulses to derive a number of more specialised pulse sequences. The simplest examples described include T2-weighted oedema imaging, fat suppression and myocardial tagging cine pulse sequences. Two further important derivatives of the gradient echo pulse sequence, obtained by adding preparation pulses, are used in combination with the administration of a gadolinium-based contrast agent for myocardial perfusion imaging and the assessment of myocardial tissue viability using a late gadolinium enhancement (LGE) technique. These two imaging techniques are discussed in more detail, outlining the basic principles of each pulse sequence, the practical steps required to achieve the best results in a clinical setting and, in the case of perfusion, explaining some of the factors that influence current approaches to perfusion image analysis. The key principles of contrast-enhanced magnetic resonance angiography (CE-MRA) are also explained in detail, especially focusing on timing of the acquisition following contrast agent bolus administration, and current approaches to achieving time resolved MRA. Alternative MRA techniques that do not require the use of an endogenous contrast agent are summarised, and the specialised pulse sequence used to image the coronary arteries, using respiratory navigator gating, is described in detail. 
The article concludes by explaining the principle behind phase contrast imaging techniques which create images that represent the phase of the MR signal rather than the magnitude. It is shown how this principle can be used to generate velocity maps by designing gradient waveforms that give rise to a relative phase change that is proportional to velocity. Choice of velocity encoding range and key pitfalls in the use of this technique are discussed. PMID:22995744
Mahmud, Wan Mohd Rushidi Wan; Awang, Amir; Mohamed, Mahmood Nazar
2004-01-01
The Malay version of the Medical Outcome Study (MOS) Social Support Survey was validated among a sample of postpartum Malay women attending selected health centers in Kedah, North West of Peninsular Malaysia. 215 women between 4 and 12 weeks postpartum were recruited for the validation study. They were given questionnaires on socio-demography, the Malay versions of the MOS Social Support Survey, the Edinburgh Postnatal Depression Scale (EPDS) and the 21-item Beck Depression Inventory-II (BDI-II). 30 of the women, who were bilingual, were also given the original English version of the instrument. A week later, these women were again given the Malay version of the MOS Social Support Survey. The scale displayed good internal consistency (Cronbach's alpha = 0.93), parallel form reliability (0.98) and test-retest reliability (0.97) (Spearman's rho; p<0.01). The negative correlations of the overall support index (total social support measure) with the Malay versions of the EPDS and BDI-II confirmed its validity. Extraction of the 19 items (item 2 to item 20) from the MOS Social Support Survey using principal axis factoring with direct oblimin rotation converged into 3 dimensions of functional social support (informational, affectionate / positive social interaction and instrumental support) with reliability coefficients of 0.91, 0.83 and 0.75 respectively. The overall support index also displayed low but significant correlations with item 1, which represents a single measure of structural social support in the instrument (p <0.01). The Malay version of the MOS Social Support Survey demonstrated good psychometric properties in measuring social support among a sample of postpartum Malay women attending selected health centers in Kedah, North West of Peninsular Malaysia, and it could be used as a simple instrument in primary care settings. PMID:22973124
NASA Astrophysics Data System (ADS)
Riza, Nabeel A.; Khan, Sajjad A.
2006-03-01
These errata are related to minor typographic errors that were present in the print of Ref. [1]. In these errata, we present the correct versions of Figs. 9 and 12 of Ref. [1]. Specifically, equations in block number 5 and 8 in Fig. 9 and block number 6 in Fig. 12 were incorrectly printed in Ref. [1]. The correct versions of Figs. 9 and 12 are shown next (see Figs. 1 and 2).
Cristiano, Bárbara F G; Delgado, José Ubiratan; da Silva, José Wanderley S; de Barros, Pedro D; de Araújo, Radier M S; Dias, Fábio C; Lopes, Ricardo T
2012-09-01
The potentiometric titration method was used for characterization of uranium compounds to be applied in intercomparison programs. The method is applied with traceability assured using a potassium dichromate primary standard. A semi-automatic version was developed to reduce the analysis time and the operator variation. The standard uncertainty in determining the total concentration of uranium was around 0.01%, which is suitable for uranium characterization and compatible with those obtained by manual techniques.
Refinements to the Graves and Pitarka (2010) Broadband Ground-Motion Simulation Method
Graves, Robert; Pitarka, Arben
2014-12-17
This brief article describes refinements to the Graves and Pitarka (2010) broadband ground-motion simulation methodology (GP2010 hereafter) that have been implemented in version 14.3 of the Southern California Earthquake Center (SCEC) Broadband Platform (BBP). The updated version of our method on the current SCEC BBP is referred to as GP14.3. Here, our simulation technique is a hybrid approach that combines low- and high-frequency motions computed with different methods into a single broadband response.
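Hybrid broadband simulation merges a deterministic low-frequency seismogram with a stochastic high-frequency one at a crossover frequency. The complementary-filter sketch below is a generic illustration of that merging step; the specific filter shape, crossover frequency, and function names are assumptions, not the GP14.3 implementation.

```python
import numpy as np

def hybrid_broadband(lf, hf, dt, f_merge=1.0, order=4):
    """Combine low-frequency (lf) and high-frequency (hf) traces of equal
    length into one broadband trace by complementary filtering in the
    frequency domain around a crossover frequency f_merge (Hz)."""
    n = len(lf)
    f = np.fft.rfftfreq(n, dt)
    lp = 1.0 / (1.0 + (f / f_merge) ** (2 * order))   # smooth low-pass weight
    hp = 1.0 - lp                                      # complementary high-pass
    spec = np.fft.rfft(lf) * lp + np.fft.rfft(hf) * hp
    return np.fft.irfft(spec, n)

dt = 0.01
t = np.arange(1024) * dt
lf = np.sin(2 * np.pi * 0.2 * t)   # slow component, kept below the crossover
hf = np.sin(2 * np.pi * 5.0 * t)   # fast component, kept above the crossover
bb = hybrid_broadband(lf, hf, dt)
```

The complementary weights guarantee that the two inputs sum to a flat response at every frequency, so neither band is double-counted at the crossover.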
GrammarViz 3.0: Interactive Discovery of Variable-Length Time Series Patterns
Senin, Pavel; Lin, Jessica; Wang, Xing; ...
2018-02-23
The problems of recurrent and anomalous pattern discovery in time series, e.g., motifs and discords, respectively, have received a lot of attention from researchers in the past decade. However, since the pattern search space is usually intractable, most existing detection algorithms require that the patterns have discriminative characteristics and have their length known in advance and provided as input, which is an unreasonable requirement for many real-world problems. In addition, patterns of similar structure but of different lengths may co-exist in a time series. In order to address these issues, we have developed algorithms for variable-length time series pattern discovery that are based on symbolic discretization and grammar inference—two techniques whose combination enables the structured reduction of the search space and discovery of the candidate patterns in linear time. In this work, we present GrammarViz 3.0—a software package that provides implementations of the proposed algorithms and a graphical user interface for interactive variable-length time series pattern discovery. The current version of the software provides an alternative grammar inference algorithm that improves the time series motif discovery workflow, and introduces an experimental procedure for automated discretization parameter selection that builds upon the minimum cardinality maximum cover principle and aids time series recurrent and anomalous pattern discovery.
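The symbolic discretization step behind this line of work is SAX: z-normalize the series, reduce it by piecewise aggregate approximation (PAA), then map each segment mean to a letter using breakpoints that make the letters equiprobable under a standard Gaussian. A minimal sketch, with a fixed small breakpoint table and the simplifying assumption that the series length is divisible by the segment count:

```python
import numpy as np

def sax(series, segments, alphabet="abcd"):
    """SAX word for a time series: z-normalize, PAA-reduce to `segments`
    means, then discretize against equiprobable N(0,1) breakpoints."""
    x = np.asarray(series, dtype=float)
    x = (x - x.mean()) / x.std()                    # z-normalization
    paa = x.reshape(segments, -1).mean(axis=1)      # assumes len % segments == 0
    # Equiprobable Gaussian breakpoints for 3- and 4-letter alphabets
    breakpoints = {3: [-0.43, 0.43], 4: [-0.67, 0.0, 0.67]}[len(alphabet)]
    idx = np.searchsorted(breakpoints, paa)
    return "".join(alphabet[i] for i in idx)

word = sax(range(8), 4)   # a rising ramp maps to "abcd"
```

Grammar inference then runs over sequences of such words, which is what collapses the search space to linear time.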
2017-01-01
Recent advances in understanding protein folding have benefitted from coarse-grained representations of protein structures. Empirical energy functions derived from these techniques occasionally succeed in distinguishing native structures from their corresponding ensembles of nonnative folds or decoys which display varying degrees of structural dissimilarity to the native proteins. Here we utilized atomic coordinates of single protein chains, comprising a large diverse training set, to develop and evaluate twelve all-atom four-body statistical potentials obtained by exploring alternative values for a pair of inherent parameters. Delaunay tessellation was performed on the atomic coordinates of each protein to objectively identify all quadruplets of interacting atoms, and atomic potentials were generated via statistical analysis of the data and implementation of the inverted Boltzmann principle. Our potentials were evaluated using benchmarking datasets from Decoys-‘R'-Us, and comparisons were made with twelve other physics- and knowledge-based potentials. Ranking 3rd, our best potential tied CHARMM19 and surpassed AMBER force field potentials. We illustrate how a generalized version of our potential can be used to empirically calculate binding energies for target-ligand complexes, using HIV-1 protease-inhibitor complexes for a practical application. The combined results suggest an accurate and efficient atomic four-body statistical potential for protein structure prediction and assessment. PMID:29119109
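Delaunay tessellation of 3-D atomic coordinates produces tetrahedra whose four vertices are exactly the objectively identified quadruplets described above. The sketch below shows only the quadruplet-enumeration and counting step; the inverted-Boltzmann scoring is omitted, and the coordinates and atom labels are illustrative, not the training set's.

```python
import numpy as np
from collections import Counter
from scipy.spatial import Delaunay

def four_body_counts(coords, labels):
    """Tessellate atomic coordinates and count quadruplet compositions.
    Each 3-D Delaunay simplex is a tetrahedron of four interacting atoms;
    compositions are order-independent multisets of atom labels."""
    tri = Delaunay(np.asarray(coords, dtype=float))
    counts = Counter()
    for simplex in tri.simplices:          # rows of 4 vertex indices
        key = tuple(sorted(labels[i] for i in simplex))
        counts[key] += 1
    return counts

# Toy example: a tetrahedron with an interior point yields several tetrahedra
coords = [[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1], [0.25, 0.25, 0.25]]
labels = ["C", "N", "O", "H", "S"]
counts = four_body_counts(coords, labels)
```

The statistical potential would then score each composition as the negative log of its observed frequency relative to a reference expectation.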
Statistical inference of protein structural alignments using information and compression.
Collier, James H; Allison, Lloyd; Lesk, Arthur M; Stuckey, Peter J; Garcia de la Banda, Maria; Konagurthu, Arun S
2017-04-01
Structural molecular biology depends crucially on computational techniques that compare protein three-dimensional structures and generate structural alignments (the assignment of one-to-one correspondences between subsets of amino acids based on atomic coordinates). Despite its importance, the structural alignment problem has not been formulated, much less solved, in a consistent and reliable way. To overcome these difficulties, we present here a statistical framework for the precise inference of structural alignments, built on the Bayesian and information-theoretic principle of Minimum Message Length (MML). The quality of any alignment is measured by its explanatory power: the amount of lossless compression achieved to explain the protein coordinates using that alignment. We have implemented this approach in MMLigner, the first program able to infer statistically significant structural alignments. We also demonstrate the reliability of MMLigner's alignment results when compared with the state of the art. Importantly, MMLigner can also discover different structural alignments of comparable quality, a challenging problem for oligomers and protein complexes. Source code, binaries and an interactive web version are available at http://lcb.infotech.monash.edu.au/mmligner. Contact: arun.konagurthu@monash.edu. Supplementary data are available at Bioinformatics online.
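The MML criterion underlying MMLigner can be illustrated with a toy two-part message-length comparison: a hypothesis is accepted only if stating it plus encoding the data under it is shorter than the null encoding. The coin model and the 6-bit parameter cost below are illustrative choices, not MMLigner's actual codes:

```python
# Toy two-part MML model selection: is a coin biased?
# Null hypothesis: fair coin (nothing extra to state).
# Alternative: state an estimated bias (costing param_cost_bits), then
# encode the data under it. Prefer whichever total message is shorter.
import math

def code_len_bits(k_heads, n, p):
    """Shannon code length (bits) of a head/tail sequence under Bernoulli(p)."""
    return -(k_heads * math.log2(p) + (n - k_heads) * math.log2(1 - p))

def mml_prefers_bias(k_heads, n, param_cost_bits=6.0):
    """Two-part comparison: accept the bias hypothesis only if it compresses
    the data enough to pay for stating its parameter."""
    null_len = code_len_bits(k_heads, n, 0.5)
    alt_len = param_cost_bits + code_len_bits(k_heads, n, k_heads / n)
    return alt_len < null_len
```

A strongly biased sample (90 heads in 100) pays for its parameter easily; a mild imbalance (55 in 100) does not, so the null is retained. MMLigner applies the same logic with alignments as hypotheses and atomic coordinates as data.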
Zaikin, Alexey; Míguez, Joaquín
2017-01-01
We compare three state-of-the-art Bayesian inference methods for the estimation of the unknown parameters in a stochastic model of a genetic network. In particular, we introduce a stochastic version of the paradigmatic synthetic multicellular clock model proposed by Ullner et al. (2007). By introducing dynamical noise in the model and assuming that the partial observations of the system are contaminated by additive noise, we enable a principled mechanism to represent experimental uncertainties in the synthesis of the multicellular system and pave the way for the design of probabilistic methods for the estimation of any unknowns in the model. Within this setup, we tackle the Bayesian estimation of a subset of the model parameters. Specifically, we compare three Monte Carlo based numerical methods for the approximation of the posterior probability density function of the unknown parameters given a set of partial and noisy observations of the system. The schemes we assess are the particle Metropolis-Hastings (PMH) algorithm, the nonlinear population Monte Carlo (NPMC) method and the approximate Bayesian computation sequential Monte Carlo (ABC-SMC) scheme. We present an extensive numerical simulation study, which shows that while the three techniques can effectively solve the problem, there are significant differences both in estimation accuracy and computational efficiency. PMID:28797087
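Of the three schemes, the ABC idea is the simplest to sketch. The toy below uses plain rejection ABC (the ABC-SMC scheme compared in the paper adds a sequence of shrinking tolerances with importance weighting), on a made-up exponential waiting-time model rather than the multicellular clock:

```python
# Minimal ABC rejection sampler: keep parameter draws whose simulated
# summary statistic (here the sample mean) lands close to the observed one.
import random
import statistics

def abc_rejection(data, prior_sample, simulate, tolerance, n_accept=100):
    target = statistics.mean(data)
    accepted = []
    while len(accepted) < n_accept:
        theta = prior_sample()
        sim = simulate(theta, len(data))
        if abs(statistics.mean(sim) - target) < tolerance:
            accepted.append(theta)
    return accepted

random.seed(1)
true_rate = 2.0
data = [random.expovariate(true_rate) for _ in range(300)]
posterior = abc_rejection(
    data,
    prior_sample=lambda: random.uniform(0.1, 10.0),   # vague prior on the rate
    simulate=lambda rate, n: [random.expovariate(rate) for _ in range(n)],
    tolerance=0.05,
)
```

The accepted draws concentrate around the true rate of 2.0; tightening the tolerance sharpens the approximate posterior at the cost of more rejections, which is exactly the trade-off ABC-SMC manages sequentially.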
GrammarViz 3.0: Interactive Discovery of Variable-Length Time Series Patterns
DOE Office of Scientific and Technical Information (OSTI.GOV)
Senin, Pavel; Lin, Jessica; Wang, Xing
The problems of recurrent and anomalous pattern discovery in time series, e.g., motifs and discords, respectively, have received a lot of attention from researchers in the past decade. However, since the pattern search space is usually intractable, most existing detection algorithms require that the patterns have discriminative characteristics and that their length be known in advance and provided as input, which is an unreasonable requirement for many real-world problems. In addition, patterns of similar structure but of different lengths may co-exist in a time series. To address these issues, we have developed algorithms for variable-length time series pattern discovery that are based on symbolic discretization and grammar inference, two techniques whose combination enables the structured reduction of the search space and discovery of the candidate patterns in linear time. In this work, we present GrammarViz 3.0, a software package that provides implementations of the proposed algorithms and a graphical user interface for interactive variable-length time series pattern discovery. The current version of the software provides an alternative grammar inference algorithm that improves the time series motif discovery workflow, and introduces an experimental procedure for automated discretization parameter selection that builds upon the minimum cardinality/maximum cover principle and aids recurrent and anomalous pattern discovery in time series.
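The symbolic discretization step builds on SAX, which can be sketched in a few lines: z-normalize the series, average it over segments, and map each segment mean to a symbol via fixed Gaussian breakpoints. The 4-symbol alphabet below is an illustrative choice:

```python
# SAX-style discretization sketch: the symbolic words produced this way are
# what a grammar-inference step (e.g. Sequitur) compresses to find
# variable-length candidate patterns.
import statistics

BREAKPOINTS = [-0.6745, 0.0, 0.6745]   # N(0,1) quartile boundaries, 4 symbols
ALPHABET = "abcd"

def sax(series, n_segments):
    """z-normalize, piecewise-aggregate, then map segment means to symbols."""
    mu = statistics.mean(series)
    sd = statistics.pstdev(series) or 1.0   # guard against a constant series
    z = [(x - mu) / sd for x in series]
    seg = len(z) // n_segments
    word = []
    for i in range(n_segments):
        m = statistics.mean(z[i * seg:(i + 1) * seg])
        word.append(ALPHABET[sum(m > b for b in BREAKPOINTS)])
    return "".join(word)
```

For example, a steadily rising series maps to `"abcd"`, so structurally similar subsequences of different lengths can collapse to the same short word, which is what makes the grammar-based search space reduction possible.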
NASA Technical Reports Server (NTRS)
Key, David L.; Heffley, Robert K.
2002-01-01
The purpose of the study was to develop generic design principles for obtaining attitude command response in moderate to aggressive maneuvers without increasing SCAS series servo authority beyond the existing +/- 10%, and in particular to develop a scheme that would work on the UH-60 helicopter so that it could be considered for incorporation in future upgrades. The basic math model was a UH-60A version of GENHEL. The simulation facility was the NASA-Ames Vertical Motion Simulator (VMS). Evaluation tasks were Hover, Acceleration-Deceleration, and Sidestep, as defined in ADS-33D-PRF for the Degraded Visual Environment (DVE). The DVE was adjusted to provide a Usable Cue Environment (UCE) rating of 2. The basic concept investigated was the extent to which the limited attitude command authority achievable by the series servo could be supplemented by a 10%/sec trim servo. The architecture used provided angular rate feedback to only the series servo, shared the attitude feedback between the series and trim servos, and, when the series servo approached saturation, slowly phased out the attitude feedback. Results show that modest use of the trim servo does improve pilot ratings, especially in and around hover. This improvement can be achieved with little degradation in response predictability during moderately aggressive maneuvers.
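The feedback-sharing architecture described above can be sketched schematically. All gains, the attitude-feedback share, and the linear fade law below are hypothetical illustration values, not the study's design:

```python
# Schematic sketch of the control-sharing logic: rate feedback drives only the
# limited-authority series servo; attitude feedback is split between series and
# trim servos, and the series servo's attitude share fades out as it nears its
# +/-10% authority limit. Gains and the fade law are illustrative only.
def servo_commands(rate_err, att_err, series_pos,
                   k_rate=0.5, k_att=0.8, share=0.6, limit=0.10):
    # fade factor: 1 well inside authority, 0 at the authority limit
    fade = max(0.0, 1.0 - abs(series_pos) / limit)
    series_cmd = k_rate * rate_err + fade * share * k_att * att_err
    trim_rate_cmd = (1.0 - share * fade) * k_att * att_err  # slow 10%/s servo
    # series servo output hard-limited to its authority
    series_cmd = max(-limit, min(limit, series_cmd))
    return series_cmd, trim_rate_cmd
```

At saturation the fade factor reaches zero, so the attitude loop is carried entirely by the slow trim servo while the series servo retains only the rate feedback, mirroring the phase-out behaviour described in the abstract.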
MEMS deformable mirror embedded wavefront sensing and control system
NASA Astrophysics Data System (ADS)
Owens, Donald; Schoen, Michael; Bush, Keith
2006-01-01
Electrostatic Membrane Deformable Mirror (MDM) technology developed using silicon bulk micro-machining techniques offers the potential of providing low-cost, compact wavefront control systems for diverse optical system applications. Electrostatic mirror construction using bulk micro-machining allows for custom designs to satisfy wavefront control requirements for most optical systems. An electrostatic MDM consists of a thin membrane, generally with a thin metal or multi-layer high-reflectivity coating, suspended over an actuator pad array that is connected to a high-voltage driver. Voltages applied to the array elements deflect the membrane to provide an optical surface capable of correcting for measured optical aberrations in a given system. Electrostatic membrane DM designs are derived from well-known principles of membrane mechanics and electrostatics, the desired optical wavefront control requirements, and the current limitations of mirror fabrication and actuator drive electronics. MDM performance is strongly dependent on mirror diameter and air damping in meeting desired spatial and temporal frequency requirements. In this paper, we present wavefront control results from an embedded wavefront control system developed around a commercially available high-speed camera and an AgilOptics Unifi MDM driver using USB 2.0 communications and the Linux development environment. This new product, ClariFast™, combines our previous Clarifi™ product offering into a faster, more streamlined version dedicated strictly to Hartmann wavefront sensing.
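The membrane mechanics invoked above reduce, in the small-deflection limit, to a Poisson equation: membrane tension times the Laplacian of the deflection balances the electrostatic pressure. A 1-D finite-difference sketch under the parallel-plate approximation (all values illustrative, not a mirror design):

```python
# Small-deflection membrane model: T * w'' = -P on [0, L], w(0) = w(L) = 0,
# with electrostatic pressure P = eps0 * V^2 / (2 d^2), solved by Jacobi
# iteration on a finite-difference grid.
EPS0 = 8.854e-12          # vacuum permittivity, F/m

def electrostatic_pressure(v, gap):
    """Parallel-plate approximation: P = eps0 * V^2 / (2 d^2)."""
    return EPS0 * v * v / (2.0 * gap * gap)

def membrane_deflection_1d(pressure, tension, length, n=51, iters=6000):
    """Solve T w'' = -P with fixed edges by Jacobi iteration."""
    h = length / (n - 1)
    w = [0.0] * n
    for _ in range(iters):
        new = w[:]
        for i in range(1, n - 1):
            new[i] = 0.5 * (w[i - 1] + w[i + 1] + h * h * pressure / tension)
        w = new
    return w
```

The computed peak deflection matches the analytic value P L^2 / (8 T) for a uniformly loaded strip, which is a convenient sanity check before moving to the full 2-D actuator-array problem.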
Kashkouli, Mohsen Bahmani; Karimi, Nasser; Aghamirsalim, Mohamadreza; Abtahi, Mohammad Bagher; Nojomi, Marzieh; Shahrad-Bejestani, Hadi; Salehi, Masoud
2017-02-01
To determine the measurement properties of the Persian-language version of the Graves orbitopathy quality of life questionnaire (GO-QOL). Following a systematic translation and cultural adaptation process, 141 consecutive unselected thyroid eye disease (TED) patients answered the Persian GO-QOL and underwent complete ophthalmic examination. The questionnaire was again completed by 60 patients on the second visit, 2-4 weeks later. Construct validity (cross-cultural validity, structural validity and hypotheses testing), reliability (internal consistency and test-retest reliability), and floor and ceiling effects of the Persian version of the GO-QOL were evaluated. Furthermore, Rasch analysis was used to assess its psychometric properties. Cross-cultural validity was established by back-translation techniques, committee review and pretesting techniques. Bi-dimensionality of the questionnaire was confirmed by factor analysis. Construct validity was also supported through confirmation of 6 out of 8 predefined hypotheses. Cronbach's α and intraclass correlation coefficient (ICC) were 0.650 and 0.859 for visual functioning and 0.875 and 0.896 for the appearance subscale, respectively. Mean quality of life (QOL) scores for visual functioning and appearance were 78.18 (standard deviation, SD, 21.57) and 56.25 (SD 26.87), respectively. Person reliabilities from the Rasch rating scale model for both visual functioning and appearance revealed an acceptable internal consistency for the Persian GO-QOL. The Persian GO-QOL questionnaire is a valid and reliable tool with good psychometric properties for evaluating Persian-speaking patients with TED. Applying Rasch analysis to future versions of the GO-QOL is recommended in order to perform tests for linearity between the estimated item measures in different versions.
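The internal-consistency figures quoted above are Cronbach's alpha values; the statistic is alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). The tiny response matrix in the sketch is invented purely for illustration:

```python
# Cronbach's alpha from a list of per-item response vectors
# (same respondents, in the same order, for every item).
import statistics

def cronbach_alpha(items):
    """alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    k = len(items)
    totals = [sum(resp) for resp in zip(*items)]         # per-respondent totals
    item_var = sum(statistics.variance(it) for it in items)
    return k / (k - 1) * (1 - item_var / statistics.variance(totals))
```

Perfectly correlated items push alpha toward 1, while unrelated items push it toward 0, which is why the 0.875 reported for the appearance subscale indicates stronger internal consistency than the 0.650 for visual functioning.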
NASA Astrophysics Data System (ADS)
Gaunaa, Mac; Heinz, Joachim; Skrzypiński, Witold
2016-09-01
The crossflow principle is one of the key elements used in engineering models for prediction of the aerodynamic loads on wind turbine blades in standstill or blade installation situations, where the flow relative to the wind turbine blade has a component in the spanwise direction. In the present work, the performance of the crossflow principle is assessed on the DTU 10MW reference blade using extensive 3D CFD calculations. Analysis of the computational results shows that there is only a relatively narrow region in which the crossflow principle describes the aerodynamic loading well. In some conditions the deviation of the predicted loadings can be quite significant, with a large influence on, for instance, the integral aerodynamic moments around the blade centre of mass, which are very important for single-blade installation applications. The main features of these deviations, however, behave systematically across all force components, which is exploited in this paper to formulate the first version of an engineering correction method to the crossflow principle applicable to wind turbine blades. The new correction model improves the agreement with CFD results for the key aerodynamic loads in crossflow situations. The general validity of this model for other blade shapes should be investigated in subsequent works.
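In its textbook form, the crossflow (independence) principle states that only the velocity component normal to the span generates the sectional loading, so the effective dynamic pressure scales with the squared cosine of the spanwise flow angle. A minimal sketch with illustrative numbers:

```python
# Crossflow principle in its simplest sectional form: the spanwise velocity
# component is ignored, so the normal force scales with cos^2 of the
# spanwise flow angle. All inputs below are illustrative.
import math

def crossflow_normal_force(v, sweep_deg, rho, chord, cn_2d):
    """Sectional normal force per unit span under the crossflow principle."""
    v_n = v * math.cos(math.radians(sweep_deg))   # only the normal component loads
    return 0.5 * rho * v_n ** 2 * chord * cn_2d
```

At a 60-degree spanwise flow angle the predicted loading drops to a quarter of the unswept value; it is the systematic departures from exactly this cos^2 scaling that the paper's correction model targets.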
Singh, Jay P.; Desmarais, Sarah L.; Sellers, Brian G.; Hylton, Tatiana; Tirotti, Melissa; Van Dorn, Richard A.
2013-01-01
Though considerable research has examined the validity of risk assessment tools in predicting adverse outcomes in justice-involved adolescents, the extent to which risk assessments are translated into risk management strategies and, importantly, the association between this link and adverse outcomes has gone largely unexamined. To address these shortcomings, the Risk-Need-Responsivity (RNR) model was used to examine associations between identified strengths and vulnerabilities, interventions, and institutional outcomes for justice-involved youth. Data were collected from risk assessments completed using the Short-Term Assessment of Risk and Treatability: Adolescent Version (START:AV) for 120 adolescent offenders (96 boys and 24 girls). Interventions and outcomes were extracted from institutional records. Mixed evidence of adherence to RNR principles was found. In accordance with the risk principle, adolescent offenders judged to have more strengths had more strength-based interventions in their service plans, though adolescent offenders with more vulnerabilities did not have more interventions targeting their vulnerabilities. With respect to the need and responsivity principles, vulnerabilities and strengths identified as particularly relevant to the individual youth's risk of adverse outcomes were addressed in the service plans about half and a quarter of the time, respectively. Greater adherence to the risk and need principles was found to significantly predict the likelihood of externalizing outcomes. Findings suggest some gaps between risk assessment and risk management and highlight the potential usefulness of strength-based approaches to intervention. PMID:25346561
Duque-Ramos, Astrid; Quesada-Martínez, Manuel; Iniesta-Moreno, Miguela; Fernández-Breis, Jesualdo Tomás; Stevens, Robert
2016-10-17
The biomedical community has now developed a significant number of ontologies. The curation of biomedical ontologies is a complex task and biomedical ontologies evolve rapidly, so new versions are regularly and frequently published in ontology repositories. This has the implication of there being a high number of ontology versions over a short time span. Given this level of activity, ontology designers need to be supported in the effective management of the evolution of biomedical ontologies as the different changes may affect the engineering and quality of the ontology. This is why there is a need for methods that contribute to the analysis of the effects of changes and evolution of ontologies. In this paper we approach this issue from the ontology quality perspective. In previous work we have developed an ontology evaluation framework based on quantitative metrics, called OQuaRE. Here, OQuaRE is used as a core component in a method that enables the analysis of the different versions of biomedical ontologies using the quality dimensions included in OQuaRE. Moreover, we describe and use two scales for evaluating the changes between the versions of a given ontology. The first one is the static scale used in OQuaRE and the second one is a new, dynamic scale, based on the observed values of the quality metrics of a corpus defined by all the versions of a given ontology (life-cycle). In this work we explain how OQuaRE can be adapted for understanding the evolution of ontologies. Its use has been illustrated with the ontology of bioinformatics operations, types of data, formats, and topics (EDAM). The two scales included in OQuaRE provide complementary information about the evolution of the ontologies. The application of the static scale, which is the original OQuaRE scale, to the versions of the EDAM ontology reveals a design based on good ontological engineering principles. 
The application of the dynamic scale has enabled a more detailed analysis of the evolution of the ontology, measured through differences between versions. The statistics of change based on the OQuaRE quality scores make it possible to identify key versions where changes in the engineering of the ontology triggered a change from the OQuaRE quality perspective. In the case of EDAM, this study allowed us to identify that the fifth version of the ontology had the largest impact on the quality metrics of the ontology when comparative analyses between pairs of consecutive versions were performed.
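The dynamic scale can be sketched as a simple re-binning of each version's metric value against the range observed over the ontology's life-cycle. The five-bin rule below is an illustrative choice, not OQuaRE's exact definition:

```python
# Illustrative "dynamic scale": instead of fixed (static) 1-5 boundaries,
# score each version's metric value against the min-max range observed
# across all versions of the ontology.
def dynamic_scale(values):
    """Map each version's metric value to 1..5 using the observed min-max."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0        # guard: identical values across versions
    return [1 + min(4, int(4 * (v - lo) / span)) for v in values]
```

With such a scale, a metric that barely moves under the static boundaries can still reveal which version produced the largest relative jump, which is the kind of version-to-version signal used to pinpoint EDAM's fifth version.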
1994-08-01
AGARD-AG-300 Vol. 12, Advisory Group for Aerospace Research & Development, 7 Rue Ancelle, 92200 Neuilly-sur-Seine, France. AGARD Flight Test Techniques Series, Volume 12: The Principles of Flight Test Assessment of Flight-Safety-Critical Systems in Helicopters.
Mitigating road impacts on animals through learning principles.
Proppe, D S; McMillan, N; Congdon, J V; Sturdy, C B
2017-01-01
Roads are a nearly ubiquitous feature of the developed world, but their presence does not come without consequences. Many mammals, birds, reptiles, and amphibians suffer high rates of mortality through collision with motor vehicles, while other species treat roads as barriers that reduce gene flow between populations. Road effects extend beyond the pavement, where traffic noise is altering communities of songbirds, insects, and some mammals. Traditional methods of mitigation along roads include the creation of quieter pavement and tires and the construction of physical barriers to reduce sound transmission and movement. While effective, these forms of mitigation are costly and time-consuming. One alternative is the use of learning principles to create or extinguish aversive behaviors in animals living near roads. Classical and operant conditioning are well-documented techniques for altering behavior in response to novel cues and signals. Behavioral ecologists have used conditioning techniques to mitigate human-wildlife conflict challenges, alter predator-prey interactions, and facilitate reintroduction efforts. Yet, these principles have rarely been applied in the context of roads. We suggest that the field of road ecology is ripe with opportunity for experimentation with learning principles. We present tangible ways that learning techniques could be utilized to mitigate negative roadside behaviors, address the importance of evaluating fitness within these contexts, and evaluate the longevity of learned behaviors. This review serves as an invitation for empirical studies that test the effectiveness of learning paradigms as a mitigation tool in the context of roads.
48 CFR 9905.505-50 - Techniques for application.
Code of Federal Regulations, 2010 CFR
2010-10-01
Section 9905.505-50 (Federal Acquisition Regulations System, Cost Accounting Standards Board, Cost Accounting Standards for Educational Institutions), Techniques for application: "... this cost accounting principle does not require that allocation of unallowable costs to final cost..."
Analysis of magnetic fields using variational principles and CELAS2 elements
NASA Technical Reports Server (NTRS)
Frye, J. W.; Kasper, R. G.
1977-01-01
Prospective techniques for analyzing magnetic fields using NASTRAN are reviewed. A variational principle utilizing a vector potential function is presented which has as its Euler equations the required field equations and boundary conditions for static magnetic fields, including current sources. The need to augment this variational principle with a constraint condition is discussed. Some results using the Lagrange multiplier method to apply the constraint and CELAS2 elements to simulate the matrices are given. Practical considerations of using large numbers of CELAS2 elements are discussed.
Zhao, Hongxia; Yang, Yong; Shu, Xin; Wang, Yanwei; Ran, Qianping
2018-04-09
First-principles calculations, especially with density functional theory (DFT) methods, have become a powerful technique for studying the molecular structure and properties of organic/inorganic interfaces. This review introduces some recent examples of the study of adsorption models of organic molecules or oligomers on mineral surfaces, and of interfacial properties obtained from first-principles calculations. The aim of this contribution is to inspire scientists to benefit from first-principles calculations and to apply similar strategies when studying and tailoring interfacial properties at the atomistic scale, especially those interested in the design and development of new molecules and new products.
[Conceptual approach to formation of a modern system of medical provision].
Belevitin, A B; Miroshnichenko, Iu V; Bunin, S A; Goriachev, A B; Krasavin, K D
2009-09-01
Within the framework of shaping the new structure of the medical service of the Armed Forces, principal approaches to optimizing the development of the medical supply system were determined. The following principles were proposed: hierarchical structuring, purposeful orientation, vertical task sharing, horizontal task sharing, complex simulation, and permanent improvement. The main directions for optimizing the structure and composition of the medical supply system of the Armed Forces are: forming modern medical supply institutions (centers for support with equipment and materiel) on the basis of central and regional storehouses, with certain functions of military administration bodies attached to them; and creating medical supply offices on the basis of military hospitals serving as base treatment-and-prophylaxis institutions within assigned territorial zones of responsibility, in order to carry out the full range of tasks of supplying medical equipment to the units and institutions attached to them for medical support. The medical supply system is built on three levels: Center - military region (Navy region) - territorial zone of responsibility.
Ross, David A; Rohrbaugh, Robert
2014-04-01
The authors describe the development and implementation of a new adult psychiatry residency didactic curriculum based on adult learning principles and an integrative, patient-centered approach that includes a progressive 4-year neuroscience curriculum. The authors describe the process of conducting a needs assessment, engaging stakeholders and developing guiding principles for the new curriculum. The curriculum was evaluated using qualitative measures, a resident survey, course evaluations, and a pilot version of a specialized assessment tool. Feedback from the resident survey and from course evaluations was positive, and residents indicated interest in receiving additional training in neuroscience. Residents self-reported not incorporating neuroscience into formulation and treatment planning as often as other perspectives. They also reported that neuroscience was reinforced less by clinical faculty than other perspectives. Performance on the curriculum assessment corroborated that clinical application of neuroscience may benefit from additional reinforcement. Residents responded well to the design and content of the new didactic curriculum. The neuroscience component appears to have achieved its primary objective of enhancing attitudes to the field. Continued work including enhancing the culture of neuroscience at the clinical sites may be required to achieve broader behavioral goals.
NASA Technical Reports Server (NTRS)
Muravyov, Alexander A.; Turner, Travis L.; Robinson, Jay H.; Rizzi, Stephen A.
1999-01-01
In this paper, the problem of random vibration of geometrically nonlinear MDOF structures is considered. The solutions obtained by application of two different versions of a stochastic linearization method are compared with exact (Fokker-Planck-Kolmogorov) solutions. The formulation of a relatively new version of the stochastic linearization method (the energy-based version) is generalized to the MDOF case. Also, a new method for determination of nonlinear stiffness coefficients for MDOF structures is demonstrated. This method, in combination with the equivalent linearization technique, is implemented in a new computer program. Results in terms of root-mean-square (RMS) displacements obtained using the new program and an existing in-house code are compared for two examples of beam-like structures.
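For a single-DOF Duffing-type system, the equivalent linearization idea reduces to a short fixed-point iteration, which may help fix intuitions before the MDOF generalization. This is the standard textbook scheme under Gaussian closure, not the paper's energy-based MDOF formulation:

```python
# Equivalent (stochastic) linearization for a Duffing-type SDOF system
#   x'' + c x' + k (x + eps x^3) = w(t),  w white noise with two-sided PSD S0.
# Gaussian closure gives an equivalent stiffness k_eq = k (1 + 3 eps sigma^2),
# and the linear-system result sigma^2 = pi S0 / (c k_eq) is iterated to a
# fixed point.
import math

def rms_equivalent_linearization(c, k, eps, s0, tol=1e-12):
    sigma2 = math.pi * s0 / (c * k)          # linear (eps = 0) starting guess
    while True:
        k_eq = k * (1.0 + 3.0 * eps * sigma2)
        new = math.pi * s0 / (c * k_eq)
        if abs(new - sigma2) < tol:
            return math.sqrt(new)
        sigma2 = new
```

With eps = 0 the iteration returns the exact linear RMS immediately; a hardening nonlinearity (eps > 0) stiffens the equivalent system and lowers the predicted RMS, mirroring the F-P-K trend the paper uses as its benchmark.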
ERIC Educational Resources Information Center
Commission on Engineering Education, Washington, DC.
This students' manual for the Engineering Concepts Curriculum Project's (ECCP) high school course, "The Man Made World," is the third draft of the experimental version. The material, written by scientists, engineers, and educators, emphasizes the theories and techniques which contribute to our technological civilization. Resources of the man-made…
ERIC Educational Resources Information Center
Laner, S.; And Others
This report is a critical evaluation based on extended field trials and theoretical analysis of the time-span technique of measuring level of work in organizational hierarchies. It is broadly concluded that the technique does possess many of the desirable features claimed by its originator, but that earlier, less highly structured versions based…
"PowerPoint[R] Engagement" Techniques to Foster Deep Learning
ERIC Educational Resources Information Center
Berk, Ronald A.
2011-01-01
The purpose of this article is to describe a set of strategies with which teachers may already be familiar and, perhaps, use regularly, but not always in the context of a formal PowerPoint[R] presentation. Here are the author's top 10 engagement techniques that fit neatly within any version of PowerPoint[R]. Some of these may also be used with…
ERIC Educational Resources Information Center
Martin, Clessen J.
Volume 2, the appendix to the final report of Project FAST, consists of prose selections used to study the effects of text reduction techniques on the comprehension and recall of written materials among visually handicapped and hearing impaired subjects. Each selection is presented in various versions such as 10 percent subjective deleted, 20…
Modern Display Technologies for Airborne Applications.
1983-04-01
In the case of LED head-down direct-view displays, this requires that special attention be paid to the optical filtering, the electrical drive/address... effectively attenuates the LED specular reflectance component, the colour and neutral density filtering attenuate the diffuse component and the... filter techniques are planned for use with video, multi-colour and advanced versions of numeric, alphanumeric and graphic displays; this technique
ViBe: a universal background subtraction algorithm for video sequences.
Barnich, Olivier; Van Droogenbroeck, Marc
2011-06-01
This paper presents a technique for motion detection that incorporates several innovative mechanisms. For example, our proposed technique stores, for each pixel, a set of values taken in the past at the same location or in the neighborhood. It then compares this set to the current pixel value in order to determine whether that pixel belongs to the background, and adapts the model by choosing randomly which values to substitute from the background model. This approach differs from those based upon the classical belief that the oldest values should be replaced first. Finally, when the pixel is found to be part of the background, its value is propagated into the background model of a neighboring pixel. We describe our method in full detail (including pseudo-code and the parameter values used) and compare it to other background subtraction techniques. Efficiency figures show that our method outperforms recent and proven state-of-the-art methods in terms of both computation speed and detection rate. We also analyze the performance of a version of our algorithm downscaled to the absolute minimum of one comparison and one byte of memory per pixel. It appears that even such a simplified version of our algorithm performs better than mainstream techniques.
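The per-pixel mechanisms described above (a sample set, a match test against it, and conservative random replacement) can be sketched for a single grayscale pixel as follows; the parameter values are the defaults commonly quoted for ViBe, and propagation into neighboring pixels is omitted for brevity:

```python
# Single-pixel sketch of the ViBe model: keep N past samples, call the pixel
# background if at least min_matches samples lie within `radius` of the new
# value, and, with probability 1/phi, overwrite a *random* sample rather
# than the oldest one.
import random

class VibePixel:
    def __init__(self, init_value, n=20, radius=20, min_matches=2, phi=16):
        self.samples = [init_value] * n
        self.radius, self.min_matches, self.phi = radius, min_matches, phi

    def classify_and_update(self, value):
        """Return True if background; update the model conservatively."""
        matches = sum(abs(value - s) < self.radius for s in self.samples)
        is_background = matches >= self.min_matches
        if is_background and random.randrange(self.phi) == 0:
            # replace a random sample, not the oldest one
            self.samples[random.randrange(len(self.samples))] = value
        return is_background
```

Because updates happen only for background-classified values and replace random samples, old background values can survive indefinitely, which is exactly the departure from oldest-first replacement the paper argues for.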
Creating stimuli for the study of biological-motion perception.
Dekeyser, Mathias; Verfaillie, Karl; Vanrie, Jan
2002-08-01
In the perception of biological motion, the stimulus information is confined to a small number of lights attached to the major joints of a moving person. Despite this drastic degradation of the stimulus information, the human visual apparatus organizes the swarm of moving dots into a vivid percept of a moving biological creature. Several techniques have been proposed to create point-light stimuli: placing dots at strategic locations on photographs or films, video recording a person with markers attached to the body, computer animation based on artificial synthesis, and computer animation based on motion-capture data. A description is given of the technique we are currently using in our laboratory to produce animated point-light figures. The technique is based on a combination of motion capture and three-dimensional animation software (Character Studio, Autodesk, Inc., 1998). Some of the advantages of our approach are that the same actions can be shown from any viewpoint, that point-light versions, as well as versions with a full-fleshed character, can be created of the same actions, and that point lights can indicate the center of a joint (thereby eliminating several disadvantages associated with other techniques).
[Use of indocyanine green angiography in reconstructive surgery: Brief review].
Echalier, C; Pluvy, I; Pauchot, J
2016-12-01
The success of flap surgery is highly dependent on vascularization, according to the principle of dermal and subdermal perfusion. This principle requires compatible dimensions for the survival of the flap. Indocyanine green (ICG) angiography, a technique enabling an assessment of vascularization by fluorescence, has received considerable impetus during the last two decades. The purpose of this article was to conduct a review of this technique and to evaluate its relevance in flap surgery. We reviewed all articles referenced on PubMed from 1995 to 2015 using a search combining the terms 'indocyanine green', 'flap', 'near-infrared', 'fluorescence', 'imaging' OR 'angiography'. One hundred fifty-five articles were found, and thirty-four of these were selected. ICG angiography is a reliable technique to locate perforator vessels, to determine the outlines of the flap, to evaluate its per- and postoperative viability, and to appraise anastomoses. This technique allows a reliable and real-time assessment of potential necrotic areas and an improvement in the detection of complications compared to conventional techniques.
NASA Astrophysics Data System (ADS)
Mahaboob, B.; Venkateswarlu, B.; Sankar, J. Ravi; Balasiddamuni, P.
2017-11-01
This paper uses matrix calculus techniques to obtain the Nonlinear Least Squares Estimator (NLSE), the Maximum Likelihood Estimator (MLE), and a linear pseudo-model for the nonlinear regression model. David Pollard and Peter Radchenko [1] explained analytic techniques to compute the NLSE; the present research paper introduces an innovative method to compute the NLSE using principles of multivariate calculus. This study is concerned with very new optimization techniques used to compute the MLE and NLSE. Anh [2] derived the NLSE and MLE of a heteroscedastic regression model. Lemcoff [3] discussed a procedure to obtain a linear pseudo-model for the nonlinear regression model. In this research article a new technique is developed to obtain the linear pseudo-model for the nonlinear regression model using multivariate calculus. The linear pseudo-model of Edmond Malinvaud [4] has been explained in a very different way in this paper. David Pollard et al. used empirical process techniques to study the asymptotics of the LSE (least-squares estimator) for the fitting of nonlinear regression functions in 2006. In Jae Myung [13] provided a good conceptual introduction to maximum likelihood estimation in his work "Tutorial on maximum likelihood estimation".
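The NLSE itself is commonly computed by Gauss-Newton iteration on the normal equations (J^T J) d = J^T r. The sketch below fits a toy exponential model; the model, data, and step-halving damping are illustrative choices, not the paper's derivation:

```python
# Damped Gauss-Newton for the NLSE of y = a * exp(b * x):
# iterate theta <- theta + step * (J^T J)^{-1} J^T r, halving `step`
# whenever the full update fails to reduce the sum of squared residuals.
import math

def gauss_newton_exp(xs, ys, a=1.0, b=0.0, iters=100):
    def sse(a_, b_):
        return sum((y - a_ * math.exp(b_ * x)) ** 2 for x, y in zip(xs, ys))
    for _ in range(iters):
        ja = [math.exp(b * x) for x in xs]                 # df/da
        jb = [a * x * math.exp(b * x) for x in xs]         # df/db
        r = [y - a * math.exp(b * x) for x, y in zip(xs, ys)]
        saa = sum(v * v for v in ja); sbb = sum(v * v for v in jb)
        sab = sum(u * v for u, v in zip(ja, jb))
        ra = sum(u * v for u, v in zip(ja, r)); rb = sum(u * v for u, v in zip(jb, r))
        det = saa * sbb - sab * sab
        if det == 0:
            break
        da = (sbb * ra - sab * rb) / det                   # solve 2x2 normal eqs
        db = (saa * rb - sab * ra) / det
        step = 1.0
        while step > 1e-8 and sse(a + step * da, b + step * db) > sse(a, b):
            step *= 0.5                                    # damp the update
        a, b = a + step * da, b + step * db
    return a, b
```

On noise-free data generated with a = 2, b = 0.3, the iteration recovers the true parameters to high precision; the local linearization inside the loop is exactly the linear pseudo-model idea the abstract refers to.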