Multi-valued logic gates based on ballistic transport in quantum point contacts.
Seo, M; Hong, C; Lee, S-Y; Choi, H K; Kim, N; Chung, Y; Umansky, V; Mahalu, D
2014-01-22
Multi-valued logic gates, which can handle quaternary numbers as inputs, are developed by exploiting the ballistic transport properties of quantum point contacts in series. The principle of a logic gate that finds the minimum of two quaternary number inputs is demonstrated. The device is scalable to allow multiple inputs, which makes it possible to find the minimum of multiple inputs in a single gate operation. Also, the principle of a half-adder for quaternary number inputs is demonstrated. First, an adder that adds up two quaternary numbers and outputs the sum of inputs is demonstrated. Second, a device to express the sum of the adder into two quaternary digits [Carry (first digit) and Sum (second digit)] is demonstrated. All the logic gates presented in this paper can in principle be extended to allow decimal number inputs with high quality QPCs.
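The gate behaviour described in this abstract can be summarized with a small software analogue; the sketch below only illustrates the logic (the MIN operation and the Carry/Sum decomposition of a quaternary half-adder), not the quantum point contact hardware itself.
```python
# Software analogue of the quaternary logic described above (illustration only,
# not a model of the quantum point contact device).

def quaternary_min(a: int, b: int) -> int:
    """MIN gate: output the smaller of two quaternary digits (0-3)."""
    assert a in range(4) and b in range(4)
    return min(a, b)

def quaternary_half_adder(a: int, b: int) -> tuple[int, int]:
    """Half-adder: add two quaternary digits and split the result into
    Carry (first digit) and Sum (second digit), both base-4."""
    assert a in range(4) and b in range(4)
    carry, total = divmod(a + b, 4)
    return carry, total

if __name__ == "__main__":
    print(quaternary_min(2, 3))          # -> 2
    print(quaternary_half_adder(3, 2))   # 3 + 2 = 11 in base 4 -> (1, 1)
```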
Teaching Structure-Property Relationships: Investigating Molecular Structure and Boiling Point
ERIC Educational Resources Information Center
Murphy, Peter M.
2007-01-01
A concise, well-organized table of the boiling points of 392 organic compounds has facilitated inquiry-based instruction in multiple scientific principles. Many individual or group learning activities can be derived from the tabulated data of molecular structure and boiling point based on the instructor's education objectives and the students'…
Multiple-point principle with a scalar singlet extension of the standard model
Haba, Naoyuki; Ishida, Hiroyuki; Okada, Nobuchika; ...
2017-01-21
Here, we suggest a scalar singlet extension of the standard model in which the multiple-point principle (MPP) condition of a vanishing Higgs potential at the Planck scale is realized. Although there have been many attempts to realize the MPP at the Planck scale, doing so while preserving naturalness is quite difficult. This model can easily achieve the MPP at the Planck scale without large Higgs mass corrections. It is worth noting that the electroweak symmetry can be radiatively broken in our model. From the naturalness point of view, the singlet scalar mass should be of O(1 TeV) or less. We also consider a right-handed neutrino extension of the model for neutrino mass generation. The extension does not affect the MPP scenario and may preserve naturalness with a new particle mass scale beyond a TeV, thanks to an accidental cancellation of Higgs mass corrections.
NASA Astrophysics Data System (ADS)
van Veenstra, Anne Fleur; Janssen, Marijn
One of the main challenges for e-government is to create coherent services for citizens and businesses. Realizing Integrated Service Delivery (ISD) requires government agencies to collaborate across their organizational boundaries. The coordination of processes across multiple organizations to realize ISD is called orchestration. One way of achieving orchestration is to formalize processes using architecture. In this chapter we identify architectural principles for orchestration by looking at three case studies of cross-organizational service delivery chain formation in the Netherlands. In total, six generic principles were formulated and subsequently validated in two workshops with experts. These principles are: (i) build an intelligent front office, (ii) give processes a clear starting point and end, (iii) build a central workflow application keeping track of the process, (iv) differentiate between simple and complex processes, (v) ensure that the decision-making responsibility and the overview of the process are not performed by the same process role, and (vi) create a central point where risk profiles are maintained. Further research should focus on how organizations can adapt these principles to their own situation.
A wavefront orientation method for precise numerical determination of tsunami travel time
NASA Astrophysics Data System (ADS)
Fine, I. V.; Thomson, R. E.
2013-04-01
We present a highly accurate and computationally efficient method (herein, the "wavefront orientation method") for determining the travel time of oceanic tsunamis. Based on Huygens' principle, the method uses an eight-point grid-point pattern and the most recent information on the orientation of the advancing wave front to determine the time for a tsunami to travel to a specific oceanic location. The method is shown to provide improved accuracy and reduced anisotropy compared with the conventional multiple grid-point method presently in widespread use.
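For orientation, the conventional grid-based travel-time computation that the paper improves upon can be sketched as a Dijkstra-style shortest-travel-time expansion over an eight-neighbour grid; the grid size, spacing and slowness field below are illustrative assumptions, not values from the paper.
```python
# Minimal sketch of a conventional grid-based tsunami travel-time computation
# (Dijkstra-style expansion over an 8-neighbour grid). Grid size, spacing and
# slowness values are illustrative assumptions, not taken from the paper.
import heapq
import math
import numpy as np

def travel_times(slowness: np.ndarray, dx: float, src: tuple[int, int]) -> np.ndarray:
    """Return first-arrival travel times from src for a 2D slowness grid [s/m]."""
    ny, nx = slowness.shape
    t = np.full((ny, nx), np.inf)
    t[src] = 0.0
    heap = [(0.0, src)]
    neighbours = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
                  (0, 1), (1, -1), (1, 0), (1, 1)]
    while heap:
        t0, (i, j) = heapq.heappop(heap)
        if t0 > t[i, j]:
            continue
        for di, dj in neighbours:
            ni, nj = i + di, j + dj
            if 0 <= ni < ny and 0 <= nj < nx:
                step = dx * math.hypot(di, dj)               # grid distance
                s_avg = 0.5 * (slowness[i, j] + slowness[ni, nj])
                cand = t0 + step * s_avg
                if cand < t[ni, nj]:
                    t[ni, nj] = cand
                    heapq.heappush(heap, (cand, (ni, nj)))
    return t

# Example: uniform 4000 m ocean depth, long-wave speed c = sqrt(g*h)
depth = 4000.0
c = math.sqrt(9.81 * depth)
grid = np.full((200, 200), 1.0 / c)      # slowness [s/m]
tt = travel_times(grid, dx=1000.0, src=(100, 100))
```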
Point specificity in acupuncture
2012-01-01
The existence of point specificity in acupuncture is controversial, because many acupuncture studies using this principle to select control points have found that sham acupoints have similar effects to those of verum acupoints. Furthermore, the results of pain-related studies based on visual analogue scales have not supported the concept of point specificity. In contrast, hemodynamic, functional magnetic resonance imaging and neurophysiological studies evaluating the responses to stimulation of multiple points on the body surface have shown that point-specific actions are present. This review article focuses on clinical and laboratory studies supporting the existence of point specificity in acupuncture and also addresses studies that do not support this concept. Further research is needed to elucidate the point-specific actions of acupuncture. PMID:22373514
Writing Multiple Choice Outcome Questions to Assess Knowledge and Competence.
Brady, Erik D
2015-11-01
Few articles contemplate the need for good guidance in question item-writing in the continuing education (CE) space. Although many of the core principles of sound item design translate to the CE health education team, the need exists for specific examples for nurse educators that clearly describe how to measure changes in competence and knowledge using multiple choice items. In this article, some key points and specific examples for nursing CE providers are shared. Copyright 2015, SLACK Incorporated.
Dynamic Debates: An Analysis of Group Polarization over Time on Twitter
ERIC Educational Resources Information Center
Yardi, Sarita; Boyd, Danah
2010-01-01
The principle of homophily says that people tend to associate with others who are mostly like themselves. Many online communities are structured around groups of socially similar individuals. On Twitter, however, people are exposed to multiple, diverse points of view through the public timeline. The authors captured 30,000 tweets about the…
ERIC Educational Resources Information Center
McMullen, Victoria B.
This curriculum provides a sequence of activities designed to help develop cognitive and communication skills in severely and profoundly multi-handicapped individuals who are functioning between 0 and 24 months. Based on the principles that communication begins at birth and that educational programming must begin at the point where the handicapped…
Feedback power control strategies in wireless sensor networks with joint channel decoding.
Abrardo, Andrea; Ferrari, Gianluigi; Martalò, Marco; Perna, Fabio
2009-01-01
In this paper, we derive feedback power control strategies for block-faded multiple access schemes with correlated sources and joint channel decoding (JCD). In particular, upon the derivation of the feasible signal-to-noise ratio (SNR) region for the considered multiple access schemes, i.e., the multidimensional SNR region where error-free communications are, in principle, possible, two feedback power control strategies are proposed: (i) a classical feedback power control strategy, which aims at equalizing all link SNRs at the access point (AP), and (ii) an innovative optimized feedback power control strategy, which tries to make the network operational point fall in the feasible SNR region at the lowest overall transmit energy consumption. These strategies will be referred to as "balanced SNR" and "unbalanced SNR," respectively. While they require, in principle, an unlimited power control range at the sources, we also propose practical versions with a limited power control range. We first consider a scenario with orthogonal links and ideal feedback. Then, we analyze the robustness of the proposed power control strategies to possible non-idealities, in terms of residual multiple access interference and noisy feedback channels. Finally, we successfully apply the proposed feedback power control strategies to a limiting case of the class of considered multiple access schemes, namely a central estimating officer (CEO) scenario, where the sensors observe noisy versions of a common binary information sequence and the AP's goal is to estimate this sequence by properly fusing the soft-output information produced by the JCD algorithm.
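The first ("balanced SNR") strategy amounts to choosing each source's transmit power so that every link arrives at the access point with the same SNR; a minimal open-loop sketch of that computation follows, with the channel gains, noise power and SNR target as illustrative assumptions rather than values from the paper.
```python
# Minimal sketch of the "balanced SNR" idea: pick transmit powers so that every
# link reaches the access point with the same SNR. Channel gains, noise power
# and the SNR target are illustrative assumptions, not values from the paper.
import numpy as np

def balanced_snr_powers(gains: np.ndarray, noise_power: float,
                        target_snr_db: float, p_max: float | None = None) -> np.ndarray:
    """Transmit powers equalizing received SNR: P_i = SNR_target * N0 / g_i."""
    target = 10.0 ** (target_snr_db / 10.0)
    powers = target * noise_power / gains
    if p_max is not None:                 # practical limited power-control range
        powers = np.minimum(powers, p_max)
    return powers

gains = np.array([1.0e-6, 4.0e-7, 2.5e-6])   # block-fading channel gains
print(balanced_snr_powers(gains, noise_power=1e-9, target_snr_db=10.0, p_max=0.1))
```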
Equation of state of solid, liquid and gaseous tantalum from first principles
Miljacic, Ljubomir; Demers, Steven; Hong, Qi-Jun; ...
2015-09-18
Here, we present ab initio calculations of the phase diagram and the equation of state of Ta in a wide range of volumes and temperatures, with volumes from 9 to 180 Å³/atom, temperature as high as 20000 K, and pressure up to 7 Mbar. The calculations are based on first principles, in combination with techniques of molecular dynamics, thermodynamic integration, and statistical modeling. Multiple phases are studied, including the solid, fluid, and gas single phases, as well as two-phase coexistences. We calculate the critical point by direct molecular dynamics sampling, and extend the equation of state to very low density through virial series fitting. The accuracy of the equation of state is assessed by comparing both the predicted melting curve and the critical point with previous experimental and theoretical investigations.
ERIC Educational Resources Information Center
Gayle, Barbara Mae
2004-01-01
Learning to engage in civil discourse requires students to maintain an openness to new points of view and attitude change. In a public speaking course based on principles of civil discourse, classroom procedures were designed to foster subjective reframing by engaging students in the disorienting exercise of supporting multiple perspectives on the…
Brain MRI volumetry in a single patient with mild traumatic brain injury.
Ross, David E; Castelvecchi, Cody; Ochs, Alfred L
2013-01-01
This letter to the editor describes the case of a 42-year-old man with mild traumatic brain injury and multiple neuropsychiatric symptoms which persisted for a few years after the injury. Initial CT scans and MRI scans of the brain showed no signs of atrophy. Brain volume was measured using NeuroQuant®, an FDA-approved, commercially available software method. Volumetric cross-sectional (one point in time) analysis also showed no atrophy. However, volumetric longitudinal (two points in time) analysis showed progressive atrophy in several brain regions. This case illustrated in a single patient the principle discovered in multiple previous group studies, namely that the longitudinal design is more powerful than the cross-sectional design for finding atrophy in patients with traumatic brain injury.
NASA Technical Reports Server (NTRS)
Banyukevich, A.; Ziolkovski, K.
1975-01-01
A number of hybrid methods for solving Cauchy problems are described on the basis of an evaluation of advantages of single and multiple-point numerical integration methods. The selection criterion is the principle of minimizing computer time. The methods discussed include the Nordsieck method, the Bulirsch-Stoer extrapolation method, and the method of recursive Taylor-Steffensen power series.
Tonomura, W; Moriguchi, H; Jimbo, Y; Konishi, S
2008-01-01
This paper describes an advanced Micro Channel Array (MCA) for recording neuronal network activity at multiple points simultaneously. The developed MCA is designed for neuronal network analysis, which the co-authors have studied using a Micro Electrode Array (MEA) system, and employs the principle of extracellular recording. The presented MCA has the following advantages. First, the electrodes integrated around the individual micro channels are electrically isolated for parallel multipoint recording. Sucking and clamping cells through the micro channels is expected to improve cellular selectivity and the S/N ratio. In this study, hippocampal neurons were cultured on the developed MCA. As a result, spontaneous and evoked spike potentials could be recorded by sucking and clamping the cells at multiple points. Herein, we describe these successful experimental results together with the design and fabrication of the advanced MCA toward on-chip analysis of neuronal networks.
The dynamics and limits of corporate growth in health care.
Robinson, J C
1996-01-01
This paper analyzes the economic dynamics of five forms of organizational growth in health care: horizontal integration within a single geographic market; horizontal integration across different geographic markets; diversification among multiple products and types of service; diversification among multiple distribution channels; and vertical integration with suppliers. These principles are illustrated through brief case studies of three firms that have grown by way of internal expansion, mergers, acquisitions, and diversification: WellPoint Health Networks, UniHealth America, and Mullikin Medical Enterprises. The paper analyzes the potential limits of organizational growth in health care and explores the implications of integration and diversification for antitrust policy.
41 CFR Appendix A to Subpart D of... - 3-Key Points and Principles
Code of Federal Regulations, 2010 CFR
2010-07-01
Appendix A to Subpart D of Part 102-3—Key Points and Principles. This appendix provides additional guidance in the form of answers to frequently asked questions and identifies key points and principles that may be applied...
41 CFR Appendix A to Subpart C of... - 3-Key Points and Principles
Code of Federal Regulations, 2010 CFR
2010-07-01
Appendix A to Subpart C of Part 102-3—Key Points and Principles. This appendix provides additional guidance in the form of answers to frequently asked questions and identifies key points and principles that may be applied to situations not covered elsewhere in this subpart. The guidance follows: Key...
Prendergast, Michael L.; Pearson, Frank S.; Podus, Deborah; Hamilton, Zachary K.; Greenwell, Lisa
2013-01-01
Objectives The purpose of the present meta-analysis was to answer the question: Can the Andrews principles of risk, needs, and responsivity, originally developed for programs that treat offenders, be extended to programs that treat drug abusers? Methods Drawing from a dataset that included 243 independent comparisons, we conducted random-effects meta-regression and ANOVA-analog meta-analyses to test the Andrews principles by averaging crime and drug use outcomes over a diverse set of programs for drug abuse problems. Results For crime outcomes, in the meta-regressions the point estimates for each of the principles were substantial, consistent with previous studies of the Andrews principles. There was also a substantial point estimate for programs exhibiting a greater number of the principles. However, almost all of the 95% confidence intervals included the zero point. For drug use outcomes, in the meta-regressions the point estimates for each of the principles were approximately zero; however, the point estimate for programs exhibiting a greater number of the principles was somewhat positive. All of the estimates for the drug use principles had confidence intervals that included the zero point. Conclusions This study supports previous findings from primary research studies targeting the Andrews principles that those principles are effective in reducing crime outcomes, here in meta-analytic research focused on drug treatment programs. By contrast, programs that follow the principles appear to have very little effect on drug use outcomes. Primary research studies that experimentally test the Andrews principles in drug treatment programs are recommended. PMID:24058325
Effect of multiple circular holes Fraunhofer diffraction for the infrared optical imaging
NASA Astrophysics Data System (ADS)
Lu, Chunlian; Lv, He; Cao, Yang; Cai, Zhisong; Tan, Xiaojun
2014-11-01
With the development of infrared optics, infrared optical imaging systems play an increasingly important role in modern optical imaging. Infrared optical imaging is used in industry, agriculture, medicine, the military and transportation. For infrared optical imaging systems exposed for long periods, however, contamination degrades the imaging. When contaminants settle on a lens surface of the optical system, they affect diffraction: the contaminated lens can be treated as the screen complementary to an array of multiple circular holes undergoing Fraunhofer diffraction, so by Babinet's principle the diffraction of the imaging system can be obtained. Therefore, by studying multiple-circular-hole Fraunhofer diffraction, conclusions can be drawn about its effect on infrared imaging. This paper mainly studies the effect of multiple-circular-hole Fraunhofer diffraction on optical imaging. First, we introduce the theory of Fraunhofer diffraction and the Point Spread Function (PSF), the basic tool for evaluating the image quality of an optical system; Fraunhofer diffraction modifies the PSF. Then, results for multiple-circular-hole Fraunhofer diffraction are given for different hole sizes and hole spacings: hole sizes from 0.1 mm to 1 mm, hole spacings from 0.3 mm to 0.8 mm, and infrared wavebands from 1 μm to 5 μm. We use MATLAB to simulate the light intensity distribution of the multiple-circular-hole Fraunhofer diffraction. Finally, three-dimensional diffraction maps of the light intensity are given for comparison.
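The far-field (Fraunhofer) intensity of such an aperture pattern is proportional to the squared magnitude of the Fourier transform of the aperture function, which is straightforward to simulate numerically; the sketch below is in Python rather than the MATLAB used by the authors, and the hole radius, spacing and sampling are illustrative assumptions.
```python
# Fraunhofer diffraction of several circular holes, simulated as the squared
# magnitude of the FFT of the aperture function. Written in Python (the paper
# uses MATLAB); hole radius, spacing and sampling below are illustrative.
import numpy as np

n, pitch = 1024, 2e-6                    # samples per side, sample pitch [m]
y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2] * pitch
radius, spacing = 0.25e-3, 0.5e-3        # hole radius and centre spacing [m]

aperture = np.zeros((n, n))
for cx in (-spacing, 0.0, spacing):      # a row of three circular holes
    aperture[(x - cx) ** 2 + y ** 2 <= radius ** 2] = 1.0

field = np.fft.fftshift(np.fft.fft2(aperture))
intensity = np.abs(field) ** 2
intensity /= intensity.max()             # normalized far-field intensity map
# By Babinet's principle, the complementary (contaminated-lens) screen gives the
# same diffracted intensity away from the central, undiffracted spot.
```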
Quantized conductance operation near a single-atom point contact in a polymer-based atomic switch
NASA Astrophysics Data System (ADS)
Krishnan, Karthik; Muruganathan, Manoharan; Tsuruoka, Tohru; Mizuta, Hiroshi; Aono, Masakazu
2017-06-01
Highly-controlled conductance quantization is achieved near a single-atom point contact in a redox-based atomic switch device, in which a poly(ethylene oxide) (PEO) film is sandwiched between Ag and Pt electrodes. Current-voltage measurements revealed reproducible quantized conductance of ~1 G₀ for more than 10² continuous voltage sweep cycles under a specific condition, indicating the formation of a well-defined single-atom point contact of Ag in the PEO matrix. The device exhibited a conductance state distribution centered at 1 G₀, with distinct half-integer multiples of G₀ and small fractional variations. First-principles density functional theory simulations showed that the experimental observations could be explained by the existence of a tunneling gap and the structural rearrangement of an atomic point contact.
Managing Multiple Health Problems: Living with Multiple Health Problems
... treatments affect people with multiple health problems. Guiding Principles on Caring for Older Adults with Multiple Health ... interactions and other side effects. Each of the principles above is intended to help improve the health ...
Framework for assessing causality in disease management programs: principles.
Wilson, Thomas; MacDowell, Martin
2003-01-01
To credibly state that a disease management (DM) program "caused" a specific outcome it is required that metrics observed in the DM population be compared with metrics that would have been expected in the absence of a DM intervention. That requirement can be very difficult to achieve, and epidemiologists and others have developed guiding principles of causality by which credible estimates of DM impact can be made. This paper introduces those key principles. First, DM program metrics must be compared with metrics from a "reference population." This population should be "equivalent" to the DM intervention population on all factors that could independently impact the outcome. In addition, the metrics used in both groups should use the same defining criteria (ie, they must be "comparable" to each other). The degree to which these populations fulfill the "equivalent" assumption and metrics fulfill the "comparability" assumption should be stated. Second, when "equivalence" or "comparability" is not achieved, the DM managers should acknowledge this fact and, where possible, "control" for those factors that may impact the outcome(s). Finally, it is highly unlikely that one study will provide definitive proof of any specific DM program value for all time; thus, we strongly recommend that studies be ongoing, at multiple points in time, and at multiple sites, and, when observational study designs are employed, that more than one type of study design be utilized. Methodologically sophisticated studies that follow these "principles of causality" will greatly enhance the reputation of the important and growing efforts in DM.
Structured light optical microscopy for three-dimensional reconstruction of technical surfaces
NASA Astrophysics Data System (ADS)
Kettel, Johannes; Reinecke, Holger; Müller, Claas
2016-04-01
In microsystems technology, quality control of micro structured surfaces with different surface properties is playing an ever more important role. The process of quality control incorporates three-dimensional (3D) reconstruction of specular- and diffusive reflecting technical surfaces. Due to the demand for high measurement accuracy and data acquisition rates, structured light optical microscopy has become a valuable solution to this problem, providing high vertical and lateral resolution. However, 3D reconstruction of specular reflecting technical surfaces still remains a challenge for optical measurement principles. In this paper we present a measurement principle based on structured light optical microscopy which enables 3D reconstruction of specular- and diffusive reflecting technical surfaces. It is realized using two light paths of a stereo microscope equipped with different magnification levels. The right optical path of the stereo microscope is used to project structured light onto the object surface. The left optical path is used to capture the structured illuminated object surface with a camera. Structured light patterns are generated by a Digital Light Processing (DLP) device in combination with a high power Light Emitting Diode (LED). Structured light patterns are realized as a matrix of discrete light spots to illuminate defined areas on the object surface. The introduced measurement principle is based on multiple and parallel processed point measurements. Analysis of the measured Point Spread Function (PSF) by pattern recognition and model fitting algorithms enables the precise calculation of 3D coordinates. Using exemplary technical surfaces, we demonstrate the successful application of our measurement principle.
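The per-spot analysis step described above, locating each projected light spot by fitting a model to its measured point spread function, can be illustrated with a small 2D Gaussian fit; the Gaussian model and the synthetic patch below are assumptions for illustration, not the authors' actual PSF model or data.
```python
# Illustrative sub-pixel localisation of one projected light spot by fitting a
# 2D Gaussian model to its measured intensity patch. The Gaussian PSF model and
# the synthetic patch are assumptions, not the authors' actual algorithm.
import numpy as np
from scipy.optimize import curve_fit

def gauss2d(coords, amp, x0, y0, sigma, offset):
    x, y = coords
    return (amp * np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2))
            + offset).ravel()

# Synthetic 21x21 pixel patch containing one noisy spot
yy, xx = np.mgrid[0:21, 0:21].astype(float)
rng = np.random.default_rng(0)
patch = gauss2d((xx, yy), 1.0, 10.3, 9.7, 2.0, 0.05).reshape(21, 21)
patch += rng.normal(scale=0.02, size=patch.shape)

p0 = (patch.max(), 10, 10, 2, 0)         # rough initial guess
popt, _ = curve_fit(gauss2d, (xx, yy), patch.ravel(), p0=p0)
print("fitted spot centre (x, y):", popt[1], popt[2])
```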
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-24
... 0584-AD65. School Food Safety Program Based on Hazard Analysis and Critical Control Point Principles (HACCP) was published on... The Office of Management and Budget (OMB) cleared the associated information collection requirements (ICR) on...
Consolidated principles for screening based on a systematic review and consensus process.
Dobrow, Mark J; Hagens, Victoria; Chafe, Roger; Sullivan, Terrence; Rabeneck, Linda
2018-04-09
In 1968, Wilson and Jungner published 10 principles of screening that often represent the de facto starting point for screening decisions today; 50 years on, are these principles still the right ones? Our objectives were to review published work that presents principles for population-based screening decisions since Wilson and Jungner's seminal publication, and to conduct a Delphi consensus process to assess the review results. We conducted a systematic review and modified Delphi consensus process. We searched multiple databases for articles published in English in 1968 or later that were intended to guide population-based screening decisions, described development and modification of principles, and presented principles as a set or list. Identified sets were compared for basic characteristics (e.g., number, categorization), a citation analysis was conducted, and principles were iteratively synthesized and consolidated into categories to assess evolution. Participants in the consensus process assessed the level of agreement with the importance and interpretability of the consolidated screening principles. We identified 41 sets and 367 unique principles. Each unique principle was coded to 12 consolidated decision principles that were further categorized as disease/condition, test/intervention or program/system principles. Program or system issues were the focus of 3 of Wilson and Jungner's 10 principles, but comprised almost half of all unique principles identified in the review. The 12 consolidated principles were assessed through 2 rounds of the consensus process, leading to specific refinements to improve their relevance and interpretability. No gaps or missing principles were identified. Wilson and Jungner's principles are remarkably enduring, but increasingly reflect a truncated version of contemporary thinking on screening that does not fully capture subsequent focus on program or system principles. Ultimately, this review and consensus process provides a comprehensive and iterative modernization of guidance to inform population-based screening decisions. © 2018 Joule Inc. or its licensors.
Achieving Integration in Mixed Methods Designs—Principles and Practices
Fetters, Michael D; Curry, Leslie A; Creswell, John W
2013-01-01
Mixed methods research offers powerful tools for investigating complex processes and systems in health and health care. This article describes integration principles and practices at three levels in mixed methods research and provides illustrative examples. Integration at the study design level occurs through three basic mixed method designs—exploratory sequential, explanatory sequential, and convergent—and through four advanced frameworks—multistage, intervention, case study, and participatory. Integration at the methods level occurs through four approaches. In connecting, one database links to the other through sampling. With building, one database informs the data collection approach of the other. When merging, the two databases are brought together for analysis. With embedding, data collection and analysis link at multiple points. Integration at the interpretation and reporting level occurs through narrative, data transformation, and joint display. The fit of integration describes the extent the qualitative and quantitative findings cohere. Understanding these principles and practices of integration can help health services researchers leverage the strengths of mixed methods. PMID:24279835
Computer-assisted 3D kinematic analysis of all leg joints in walking insects.
Bender, John A; Simpson, Elaine M; Ritzmann, Roy E
2010-10-26
High-speed video can provide fine-scaled analysis of animal behavior. However, extracting behavioral data from video sequences is a time-consuming, tedious, subjective task. These issues are exacerbated where accurate behavioral descriptions require analysis of multiple points in three dimensions. We describe a new computer program written to assist a user in simultaneously extracting three-dimensional kinematics of multiple points on each of an insect's six legs. Digital video of a walking cockroach was collected in grayscale at 500 fps from two synchronized, calibrated cameras. We improved the legs' visibility by painting white dots on the joints, similar to techniques used for digitizing human motion. Compared to manual digitization of 26 points on the legs over a single, 8-second bout of walking (or 106,496 individual 3D points), our software achieved approximately 90% of the accuracy with 10% of the labor. Our experimental design reduced the complexity of the tracking problem by tethering the insect and allowing it to walk in place on a lightly oiled glass surface, but in principle, the algorithms implemented are extensible to free walking. Our software is free and open-source, written in the free language Python and including a graphical user interface for configuration and control. We encourage collaborative enhancements to make this tool both better and widely utilized.
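The core geometric step in this kind of 3D digitization, triangulating a point from its pixel coordinates in two calibrated cameras, can be sketched with standard linear (DLT) triangulation; the projection matrices and pixel coordinates below are illustrative assumptions, not the calibration used in the paper.
```python
# Standard linear (DLT) triangulation of one 3D point from two calibrated
# cameras, as used in this kind of kinematic digitization. The 3x4 projection
# matrices and pixel coordinates below are illustrative assumptions.
import numpy as np

def triangulate(P1: np.ndarray, P2: np.ndarray, uv1, uv2) -> np.ndarray:
    """Return the 3D point minimizing the algebraic reprojection error."""
    u1, v1 = uv1
    u2, v2 = uv2
    A = np.vstack([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]                  # de-homogenize

# Toy example: two cameras looking at the point (0.1, 0.2, 1.5)
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.2], [0.0], [0.0]])])
X_true = np.array([0.1, 0.2, 1.5, 1.0])
uv1 = (P1 @ X_true)[:2] / (P1 @ X_true)[2]
uv2 = (P2 @ X_true)[:2] / (P2 @ X_true)[2]
print(triangulate(P1, P2, uv1, uv2))     # ~ [0.1, 0.2, 1.5]
```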
41 CFR Appendix A to Subpart B of... - 3-Key Points and Principles
Code of Federal Regulations, 2010 CFR
2010-07-01
Appendix A to Subpart B of Part 102-3—Key Points and Principles. This appendix provides additional guidance in the form of answers to frequently asked questions and identifies key points and principles that may be applied to situations not covered elsewhere in this subpart. The guidance follows: Key...
Efficient Mutagenesis Independent of Ligation (EMILI).
Füzik, Tibor; Ulbrich, Pavel; Ruml, Tomáš
2014-11-01
Site-directed mutagenesis is one of the most widely used techniques in life sciences. Here we describe an improved and simplified method for introducing mutations at desired sites. It consists of an inverse PCR using a plasmid template and two partially complementary primers. The synthesis step is followed by annealing of the PCR product's sticky ends, which are generated by exonuclease digestion. This method is fast, extremely efficient and cost-effective. It can be used to introduce large insertions and deletions, but also for multiple point mutations in a single step. To show the principle and to prove the efficiency of the method, we present a series of basic mutations (insertions, deletions, point mutations) on pUC19 plasmid DNA. Copyright © 2014 Elsevier B.V. All rights reserved.
Multiple steady states in atmospheric chemistry
NASA Technical Reports Server (NTRS)
Stewart, Richard W.
1993-01-01
The equations describing the distributions and concentrations of trace species are nonlinear and may thus possess more than one solution. This paper develops methods for searching for multiple physical solutions to chemical continuity equations and applies these to subsets of equations describing tropospheric chemistry. The calculations are carried out with a box model and use two basic strategies. The first strategy is a 'search' method. This involves fixing model parameters at specified values, choosing a wide range of initial guesses at a solution, and using a Newton-Raphson technique to determine if different initial points converge to different solutions. The second strategy involves a set of techniques known as homotopy methods. These do not require an initial guess, are globally convergent, and are guaranteed, in principle, to find all solutions of the continuity equations. The first method is efficient but essentially 'hit or miss' in the sense that it cannot guarantee that all solutions which may exist will be found. The second method is computationally burdensome but can, in principle, determine all the solutions of a photochemical system. Multiple solutions have been found for models that contain a basic complement of photochemical reactions involving O(x), HO(x), NO(x), and CH4. In the present calculations, transitions occur between stable branches of a multiple solution set as a control parameter is varied. These transitions are manifestations of hysteresis phenomena in the photochemical system and may be triggered by increasing the NO flux or decreasing the CH4 flux from current mean tropospheric levels.
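The first ("search") strategy described above can be sketched generically: run a Newton-type root finder from many initial guesses and keep the distinct converged solutions. The two-variable toy system below stands in for the actual chemical continuity equations, which are not reproduced here.
```python
# Sketch of the 'search' strategy: run a Newton-type solver from many initial
# guesses and collect distinct converged roots. The toy two-variable system
# below is a stand-in for the actual chemical continuity equations.
import numpy as np
from scipy.optimize import fsolve

def residuals(x):
    # Toy nonlinear system with more than one root (not the tropospheric model)
    return [x[0] ** 2 + x[1] ** 2 - 4.0,
            x[0] * x[1] - 1.0]

rng = np.random.default_rng(1)
solutions = []
for guess in rng.uniform(-3.0, 3.0, size=(200, 2)):
    root, info, ier, _ = fsolve(residuals, guess, full_output=True)
    if ier == 1 and not any(np.allclose(root, s, atol=1e-6) for s in solutions):
        solutions.append(root)

for s in solutions:
    print(np.round(s, 4))                # the distinct steady states found
```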
Relevance and limits of the principle of "equivalence of care" in prison medicine.
Niveau, Gérard
2007-10-01
The principle of "equivalence of care" in prison medicine is a principle by which prison health services are obliged to provide prisoners with care of a quality equivalent to that provided for the general public in the same country. It is cited in numerous national and international directives and recommendations. The principle of equivalence is extremely relevant from the point of view of normative ethics but requires adaptation from the point of view of applied ethics. From a clinical point of view, the principle of equivalence is often insufficient to take account of the adaptations necessary for the organization of care in a correctional setting. The principle of equivalence is cost-effective in general, but has to be overstepped to ensure the humane management of certain special cases.
NASA Astrophysics Data System (ADS)
Leyva, R.; Artillan, P.; Cabal, C.; Estibals, B.; Alonso, C.
2011-04-01
The article studies the dynamic performance of a family of maximum power point tracking (MPPT) circuits used for photovoltaic generation. It revisits the sinusoidal extremum seeking control (ESC) technique, which can be considered a particular subgroup of the Perturb and Observe algorithms. The sinusoidal ESC technique consists of adding a small sinusoidal disturbance to the input and processing the perturbed output to drive the operating point to its maximum. The output processing involves a synchronous multiplication and a filtering stage. The filter instance determines the dynamic performance of the MPPT based on the sinusoidal ESC principle. The approach uses the well-known root-locus method to give insight into the damping and settling time of the maximum-seeking waveforms. This article shows the transient waveforms for three different filter instances to illustrate the approach. Finally, an experimental prototype corroborates the dynamic analysis.
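The sinusoidal ESC loop described above (dither, synchronous multiplication, low-pass filtering, integration of the gradient estimate) can be sketched in discrete time; the photovoltaic curve, dither amplitude, gains and filter constants below are illustrative assumptions, not parameters from the article.
```python
# Discrete-time sketch of sinusoidal extremum seeking control (ESC) for MPPT:
# dither the operating voltage, demodulate the measured power with the same
# sinusoid, low-pass filter, and integrate the gradient estimate. The PV curve,
# dither amplitude, gains and filter constants are illustrative assumptions.
import math

def pv_power(v):
    return -0.5 * (v - 17.0) ** 2 + 60.0    # toy power-voltage curve, peak at 17 V

v_hat, p_avg, lpf = 10.0, 0.0, 0.0          # operating point, washout and LPF states
a, w, k, dt = 0.3, 2 * math.pi * 100.0, 40.0, 1e-4
alpha_hp, alpha_lp = 0.005, 0.01            # washout and low-pass smoothing factors

for n in range(200_000):                    # 20 s of simulated time
    t = n * dt
    dither = a * math.sin(w * t)
    p = pv_power(v_hat + dither)            # perturbed power measurement
    p_avg += alpha_hp * (p - p_avg)         # washout: remove the slow mean power
    demod = (p - p_avg) * math.sin(w * t)   # synchronous multiplication
    lpf += alpha_lp * (demod - lpf)         # low-pass filter -> gradient estimate
    v_hat += k * lpf * dt                   # integrator drives v toward the maximum

print(round(v_hat, 2))                      # settles near the 17 V maximum
```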
ERIC Educational Resources Information Center
Lockwood, Elise; Reed, Zackery; Caughman, John S.
2017-01-01
The multiplication principle serves as a cornerstone in enumerative combinatorics. The principle underpins many basic counting formulas and provides students with a critical element of combinatorial justification. Given its importance, the way in which it is presented in textbooks is surprisingly varied. In this paper, we analyze a number of…
Calculating the n-point correlation function with general and efficient python code
NASA Astrophysics Data System (ADS)
Genier, Fred; Bellis, Matthew
2018-01-01
There are multiple approaches to understanding the evolution of large-scale structure in our universe and with it the role of baryonic matter, dark matter, and dark energy at different points in history. One approach is to calculate the n-point correlation function estimator for galaxy distributions, sometimes choosing a particular type of galaxy, such as luminous red galaxies. The standard way to calculate these estimators is with pair counts (for the 2-point correlation function) and with triplet counts (for the 3-point correlation function). These are O(n²) and O(n³) problems, respectively, and with the number of galaxies that will be characterized in future surveys, having efficient and general code will be of increasing importance. Here we show a proof-of-principle approach to the 2-point correlation function that relies on pre-calculating galaxy locations in coarse “voxels”, thereby reducing the total number of necessary calculations. The code is written in python, making it easily accessible and extensible, and is open-sourced to the community. Basic results and performance tests using SDSS/BOSS data will be shown and we discuss the application of this approach to the 3-point correlation function.
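The voxel idea in the abstract, pre-binning galaxies into coarse cells so that pair counting only needs to examine neighbouring cells, can be sketched briefly; the catalogue, voxel size and separation bins below are illustrative, and only the data-data (DD) histogram is shown rather than a full correlation-function estimator.
```python
# Sketch of voxelized pair counting for the 2-point correlation function:
# galaxies are pre-binned into coarse voxels so only neighbouring voxels need
# to be searched for pairs. Catalogue, voxel size and bins are illustrative,
# and only the data-data (DD) histogram is computed here.
import itertools
from collections import defaultdict
import numpy as np

rng = np.random.default_rng(2)
galaxies = rng.uniform(0.0, 100.0, size=(5000, 3))    # toy 3D positions [Mpc]
voxel = 10.0                                           # voxel edge length [Mpc]
bins = np.linspace(0.0, 10.0, 21)                      # separation bins [Mpc]

cells = defaultdict(list)
for p in galaxies:
    cells[tuple((p // voxel).astype(int))].append(p)

dd = np.zeros(len(bins) - 1)
for key, pts in cells.items():
    pts = np.asarray(pts)
    for off in itertools.product((-1, 0, 1), repeat=3):   # this cell + 26 neighbours
        nbr = cells.get(tuple(np.add(key, off)))
        if nbr is None:
            continue
        d = np.linalg.norm(pts[:, None, :] - np.asarray(nbr)[None, :, :], axis=-1)
        dd += np.histogram(d[d > 0], bins=bins)[0]

dd /= 2.0                                              # each pair was counted twice
print(dd.astype(int))
```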
NASA Astrophysics Data System (ADS)
Wang, Hongling
2011-10-01
This article places the comprehensive quality improvement of undergraduates against the background of elite culture and mass culture, analyzes the influences and challenges that elite culture and mass culture bring to undergraduate education from the perspectives of philosophy, ethics, economics, education, sociology and other disciplines, and, drawing on the experience of several developed countries, proposes the principles that institutions of higher education should uphold in the context of elite culture and mass culture. As times develop, undergraduate education should also advance to new historical starting points and thoroughly reform itself, from content to essence and from perception to format, with a globalized horizon, so as to reflect the characteristics of the times and better promote the overall development of undergraduates. Based precisely on this view, and on the premise of fully recognizing that the flourishing of elite culture and mass culture has moved China into a multicultural situation, this article proposes principles for university moral education, such as promoting the integration of undergraduates' multiple values, combining unified guidance with diverse development, and seeking common ground while reserving differences and pursuing harmony with diversity.
Engineering topological phases in the Luttinger semimetal α-Sn
NASA Astrophysics Data System (ADS)
Zhang, Dongqin; Wang, Huaiqiang; Ruan, Jiawei; Yao, Ge; Zhang, Haijun
2018-05-01
α-Sn is well known as a typical Luttinger semimetal with a quadratic band touching at the Γ point. Based on the effective k·p analysis as well as first-principles calculations, we demonstrate that multiple topological phases with a rich diagram, including topological insulator, Dirac semimetal, and Weyl semimetal phases, can be induced and engineered in α-Sn by external strains, magnetic fields, and circularly polarized light (CPL). Intriguingly, not only the conventional type-I Weyl nodes but also type-II Weyl nodes and double-Weyl nodes can be generated directly from the quadratic semimetal by applying a magnetic field or CPL. Our results apply equally well to other Luttinger semimetals with similar crystal and electronic structures, and thus open an avenue for realizing and engineering multiple topological phases on a versatile platform.
NASA Astrophysics Data System (ADS)
Leberl, F.; Gruber, M.; Ponticelli, M.; Wiechert, A.
2012-07-01
The UltraCam-project created a novel Large Format Digital Aerial Camera. It was inspired by the ISPRS Congress 2000 in Amsterdam. The search for a promising imaging idea succeeded in May 2001, defining a tiling approach with multiple lenses and multiple area CCD arrays to assemble a seamless and geometrically stable monolithic photogrammetric aerial large format image. First resources were spent on the project in September 2001. The initial UltraCam-D was announced and demonstrated in May 2003. By now the imaging principle has resulted in a 4th generation UltraCam Eagle, increasing the original swath width from 11,500 pixels to beyond 20,000. Inspired by the original imaging principle, alternatives have been investigated, and the UltraCam-G carries the swath width even further, namely to a frame image with nearly 30,000 pixels, however, with a modified tiling concept and optimized for orthophoto production. We explain the advent of digital aerial large format imaging and how it benefits from improvements in computing technology to cope with data flows at a rate of 3 Gigabits per second and a need to deal with Terabytes of imagery within a single aerial sortie. We also address the many benefits of a transition to a fully digital workflow with a paradigm shift away from minimizing a project's number of aerial photographs and towards maximizing the automation of photogrammetric workflows by means of high redundancy imaging strategies. The instant gratification from near-real-time aerial triangulations and dense image matching has led to a reassessment of the value of photogrammetric point clouds to successfully compete with direct point cloud measurements by LiDAR.
A Rejection Principle for Sequential Tests of Multiple Hypotheses Controlling Familywise Error Rates
BARTROFF, JAY; SONG, JINLIN
2015-01-01
We present a unifying approach to multiple testing procedures for sequential (or streaming) data by giving sufficient conditions for a sequential multiple testing procedure to control the familywise error rate (FWER). Together we call these conditions a “rejection principle for sequential tests,” which we then apply to some existing sequential multiple testing procedures to give simplified understanding of their FWER control. Next the principle is applied to derive two new sequential multiple testing procedures with provable FWER control, one for testing hypotheses in order and another for closed testing. Examples of these new procedures are given by applying them to a chromosome aberration data set and to finding the maximum safe dose of a treatment. PMID:26985125
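One of the two new procedures the authors derive tests hypotheses in a prespecified order; purely as an illustration of order-based FWER control, a non-sequential (fixed-sample) analogue, the classical fixed-sequence test, is sketched below. This is not the paper's sequential procedure.
```python
# Fixed-sequence testing as a simple illustration of FWER control when
# hypotheses are tested in a prespecified order: test each hypothesis at the
# full level alpha and stop at the first non-rejection. This is a fixed-sample
# analogue for illustration, not the paper's sequential procedure.

def fixed_sequence_test(p_values_in_order: list[float], alpha: float = 0.05) -> list[bool]:
    rejections = []
    for p in p_values_in_order:
        if p <= alpha:
            rejections.append(True)
        else:
            # Stop at the first non-rejection; remaining hypotheses are retained.
            rejections.extend([False] * (len(p_values_in_order) - len(rejections)))
            break
    return rejections

print(fixed_sequence_test([0.001, 0.020, 0.300, 0.010]))  # [True, True, False, False]
```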
Sultan, Mehwish; Kuluski, Kerry; McIsaac, Warren J; Cafazzo, Joseph A; Seto, Emily
2018-01-01
People with multiple chronic conditions often struggle with managing their health. The purpose of this research was to identify specific challenges of patients with multiple chronic conditions and to use the findings to form design principles for a telemonitoring system tailored for these patients. Semi-structured interviews with 15 patients with multiple chronic conditions and 10 clinicians were conducted to gain an understanding of their needs and preferences for a smartphone-based telemonitoring system. The interviews were analyzed using a conventional content analysis technique, resulting in six themes. Design principles developed from the themes included that the system must be modular to accommodate various combinations of conditions, reinforce a routine, consolidate record keeping, as well as provide actionable feedback to the patients. Designing an application for multiple chronic conditions is complex due to variability in patient conditions, and therefore, design principles developed in this study can help with future innovations aimed to help manage this population.
Tonomura, Wataru; Moriguchi, Hiroyuki; Jimbo, Yasuhiko; Konishi, Satoshi
2010-08-01
This paper describes an advanced Micro Channel Array (MCA) for recording electrophysiological signals of neuronal networks at multiple points simultaneously. The developed MCA is designed for neuronal network analysis which has been studied by the co-authors using the Micro Electrode Arrays (MEA) system, and employs the principles of extracellular recordings. A prerequisite for extracellular recordings with good signal-to-noise ratio is a tight contact between cells and electrodes. The MCA described herein has the following advantages. The electrodes integrated around individual micro channels are electrically isolated to enable parallel multipoint recording. Reliable clamping of a targeted cell through micro channels is expected to improve the cellular selectivity and the attachment between the cell and the electrode toward steady electrophysiological recordings. We cultured hippocampal neurons on the developed MCA. As a result, the spontaneous and evoked spike potentials could be recorded by sucking and clamping the cells at multiple points. In this paper, we describe the design and fabrication of the MCA and the successful electrophysiological recordings leading to the development of an effective cellular network analysis device.
Adjustment of multi-CCD-chip-color-camera heads
NASA Astrophysics Data System (ADS)
Guyenot, Volker; Tittelbach, Guenther; Palme, Martin
1999-09-01
The principle of beam-splitter multi-chip cameras consists in splitting an image into multiple images of different spectral ranges and distributing these onto separate black-and-white CCD sensors. The resulting electrical signals from the chips are recombined to produce a high-quality color picture on the monitor. Because this principle guarantees higher resolution and sensitivity in comparison to conventional single-chip camera heads, the greater effort is acceptable. Furthermore, multi-chip cameras obtain the complete spectral information for each individual object point, while single-chip systems must rely on interpolation. In a joint project, Fraunhofer IOF and STRACON GmbH (and, in the future, COBRA electronic GmbH) are developing methods for designing the optics and dichroic mirror system of such prism color beam-splitter devices. Additionally, techniques and equipment for the alignment and assembly of color beam-splitter multi-CCD devices on the basis of gluing with UV-curable adhesives have been developed.
Linder, Regina
2012-01-01
Health care occupies a distinct niche in an economy struggling to recover from recession. Professions related to the care of patients are thought to be relatively resistant to downturns, and thus become attractive to students typically drawn to more lucrative pursuits. Currently, a higher profile for clinical laboratory technology among college students and those considering career change results in larger and better prepared applicant pools. However, after decades of contraction marked by the closing of programs, prospective students encounter an educational system without the capacity or vigor to meet their needs. Discussed here are some principles and proposals to allow universities, partnering with health-care providers, government agencies, and other stakeholders, to develop new programs or reenergize existing ones to serve our students and patients. Principles include academic rigor in biomedical and clinical science, multiple points of entry for students, flexibility in format, cost effectiveness, career ladders and robust partnerships. PMID:23653802
Principles of dynamical modularity in biological regulatory networks
Deritei, Dávid; Aird, William C.; Ercsey-Ravasz, Mária; Regan, Erzsébet Ravasz
2016-01-01
Intractable diseases such as cancer are associated with breakdown in multiple individual functions, which conspire to create unhealthy phenotype-combinations. An important challenge is to decipher how these functions are coordinated in health and disease. We approach this by drawing on dynamical systems theory. We posit that distinct phenotype-combinations are generated by interactions among robust regulatory switches, each in control of a discrete set of phenotypic outcomes. First, we demonstrate the advantage of characterizing multi-switch regulatory systems in terms of their constituent switches by building a multiswitch cell cycle model which points to novel, testable interactions critical for early G2/M commitment to division. Second, we define quantitative measures of dynamical modularity, namely that global cell states are discrete combinations of switch-level phenotypes. Finally, we formulate three general principles that govern the way coupled switches coordinate their function. PMID:26979940
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bannikov, V.V.; Shein, I.R.; Ivanovskii, A.L., E-mail: ivanovskii@ihim.uran.ru
2012-12-15
Employing first-principles band structure calculations, we have examined the electronic and optical properties and the peculiarities of the chemical bonding for six newly synthesized layered quaternary 1111-like chalcogenide fluorides SrAgSF, SrAgSeF, SrAgTeF, BaAgSF, BaAgSeF, and SrCuTeF, which are discussed in comparison with some isostructural 1111-like chalcogenide oxides. We found that all of the studied phases AMChF (A=Sr, Ba; M=Cu, Ag; Ch=S, Se, Te) are semiconductors whose fitted 'experimental' gaps lie in the interval from 2.23 eV (for SrAgSeF) to 3.07 eV (for SrCuTeF). The near-Fermi states of AMChF are formed exclusively by the valence orbitals of the atoms from the (MCh) blocks; thus, these phases belong to the layered materials with 'natural multiple quantum wells'. The bonding in these new AMChF phases is described as a highly anisotropic mixture of ionic and covalent contributions, where ionic M-Ch bonds together with covalent M-Ch and Ch-Ch bonds occur inside the (MCh) blocks, while inside the (AF) blocks and between adjacent (MCh)/(AF) blocks mainly ionic bonds emerge. Graphical Abstract: Isoelectronic surface for SrAgSeF and atom-resolved densities of states for SrAgTeF and SrCuTeF. Highlights: Very recently, six new layered 1111-like chalcogenide fluorides AMChF were synthesized. Electronic and optical properties of the AMChF phases were examined from first principles. All these materials are characterized as non-magnetic semiconductors. Bonding is highly anisotropic and includes ionic and covalent contributions. The introduction of magnetic ions into AMChF is proposed for the search for novel magnetic materials.
An Overview on Perception and Its Principles from Avicenna's Point of View
ERIC Educational Resources Information Center
Soltani, Ali Reza
2015-01-01
The main purpose of this paper is to identify the principles of perception, together with its dimensions and types, from Avicenna's point of view. This is a qualitative study conducted using descriptive-analytical methods. Resources are first reviewed, and the principles of perception, along with its process, are extracted from his perspective.…
Lindenberg, Robert; Zhu, Lin L; Schlaug, Gottfried
2012-06-01
Proof-of-principle studies have demonstrated transient beneficial effects of transcranial direct current stimulation (tDCS) on motor function in stroke patients, mostly after single treatment sessions. To assess the efficacy of multiple treatment sessions on motor outcome. The authors examined the effects of two 5-day intervention periods of bihemispheric tDCS and simultaneous occupational/physical therapy on motor function in a group of 10 chronic stroke patients. The first 5-day period yielded an increase in Upper-Extremity Fugl-Meyer (UE-FM) scores by 5.9 ± 2.4 points (16.6% ± 10.6%). The second 5-day period resulted in further meaningful, although significantly lower, gains with an additional improvement of 2.3 ± 1.4 points in UE-FM compared with the end of the first 5-day period (5.5% ± 4.2%). The overall mean change after the 2 periods was 8.2 ± 2.2 points (22.9% ± 11.4%). The results confirm the efficacy of bihemispheric tDCS in combination with peripheral sensorimotor stimulation. Furthermore, they demonstrate that the effects of multiple treatment sessions in chronic stroke patients may not necessarily lead to a linear response function, which is of relevance for the design of experimental neurorehabilitation trials.
NASA Astrophysics Data System (ADS)
Thomson, C. J.
2004-12-01
Pseudodifferential operators (PSDOs) yield in principle exact one-way seismic wave equations, which are attractive both conceptually and for their promise of computational efficiency. The one-way operators can be extended to include multiple-scattering effects, again in principle exactly. In practice approximations must be made and, as an example, the variable-wavespeed Helmholtz equation for scalar waves in two space dimensions is here factorized to give the one-way wave equation. This simple case permits clear identification of a sequence of physically reasonable approximations to be used when the mathematically exact PSDO one-way equation is implemented on a computer. As intuition suggests, these approximations hinge on the medium gradients in the direction transverse to the main propagation direction. A key point is that narrow-angle approximations are to be avoided in the interests of accuracy. Another key consideration stems from the fact that the so-called 'standard-ordering' PSDO indicates how lateral interpolation of the velocity structure can significantly reduce computational costs associated with the Fourier or plane-wave synthesis lying at the heart of the calculations. The decision on whether a slow or a fast Fourier transform code should be used rests upon how many lateral model parameters are truly distinct. A third important point is that the PSDO theory shows what approximations are necessary in order to generate an exponential one-way propagator for the laterally varying case, representing the intuitive extension of classical integral-transform solutions for a laterally homogeneous medium. This exponential propagator suggests the use of larger discrete step sizes, and it can also be used to approach phase-screen like approximations (though the latter are not the main interest here). Numerical comparisons with finite-difference solutions will be presented in order to assess the approximations being made and to gain an understanding of computation time differences. The ideas described extend to the three-dimensional, generally anisotropic case and to multiple scattering by invariant embedding.
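As a rough illustration of the factorization idea described above (a textbook-style sketch, not the author's exact PSDO formulation), the 2-D variable-wavespeed Helmholtz equation can be split into forward- and backward-going parts:

[\partial_x^2 + \partial_z^2 + \omega^2/c^2(x,z)]\,u = (\partial_x - i\Lambda)(\partial_x + i\Lambda)\,u - i\,[\partial_x, \Lambda]\,u, \qquad \Lambda = \sqrt{\partial_z^2 + \omega^2/c^2(x,z)},

so the one-way equation \partial_x u = i\Lambda u propagates waves in the +x direction. The square-root operator \Lambda is the pseudodifferential operator, the commutator term is what the exact factorization neglects, and further approximations enter when \Lambda is evaluated numerically in a laterally varying medium.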
ERIC Educational Resources Information Center
Petrowsky, Michael C.
This paper analyzes the results of a pilot study at Glendale Community College (Arizona) to assess the effectiveness of a comprehensive multiple choice final exam in the macroeconomic principles course. The "pilot project" involved the administration of a 50-question multiple choice exam to 71 students in three macroeconomics sections.…
Multi-viewpoint clustering analysis
NASA Technical Reports Server (NTRS)
Mehrotra, Mala; Wild, Chris
1993-01-01
In this paper, we address the feasibility of partitioning rule-based systems into a number of meaningful units to enhance the comprehensibility, maintainability and reliability of expert systems software. Preliminary results have shown that no single structuring principle or abstraction hierarchy is sufficient to understand complex knowledge bases. We therefore propose the Multi View Point - Clustering Analysis (MVP-CA) methodology to provide multiple views of the same expert system. We present the results of using this approach to partition a deployed knowledge-based system that navigates the Space Shuttle's entry. We also discuss the impact of this approach on verification and validation of knowledge-based systems.
Multiple hot-carrier collection in photo-excited graphene Moiré superlattices
Wu, Sanfeng; Wang, Lei; Lai, You; Shan, Wen-Yu; Aivazian, Grant; Zhang, Xian; Taniguchi, Takashi; Watanabe, Kenji; Xiao, Di; Dean, Cory; Hone, James; Li, Zhiqiang; Xu, Xiaodong
2016-01-01
In conventional light-harvesting devices, the absorption of a single photon only excites one electron, which sets the standard limit of power-conversion efficiency, such as the Shockley-Queisser limit. In principle, generating and harnessing multiple carriers per absorbed photon can improve efficiency and possibly overcome this limit. We report the observation of multiple hot-carrier collection in graphene/boron-nitride Moiré superlattice structures. A record-high zero-bias photoresponsivity of 0.3 A/W (equivalently, an external quantum efficiency exceeding 50%) is achieved using graphene’s photo-Nernst effect, which demonstrates a collection of at least five carriers per absorbed photon. We reveal that this effect arises from the enhanced Nernst coefficient through a Lifshitz transition at low-energy Van Hove singularities, which is an emergent phenomenon due to the formation of Moiré minibands. Our observation points to a new means for extremely efficient and flexible optoelectronics based on van der Waals heterostructures. PMID:27386538
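For context, the standard conversion between responsivity R and external quantum efficiency (EQE) used in such estimates is EQE = R·hν/e ≈ R[A/W] × 1240/λ[nm]; at R = 0.3 A/W this corresponds to EQE ≈ 50% for photon energies near 1.65 eV (λ ≈ 750 nm). The wavelength here is only an illustrative value consistent with the quoted numbers, not necessarily the excitation used in the experiment.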
Small RNA biology is systems biology.
Jost, Daniel; Nowojewski, Andrzej; Levine, Erel
2011-01-01
During the last decade, small regulatory RNAs (srRNAs) emerged as central players in the regulation of gene expression in all kingdoms of life. Multiple pathways for srRNA biogenesis and diverse mechanisms of gene regulation may indicate that srRNA regulation evolved independently multiple times. However, small RNA pathways share numerous properties, including the ability of a single srRNA to regulate multiple targets. Some of the mechanisms of gene regulation by srRNAs have a significant effect on the abundance of free srRNAs that are ready to interact with new targets. This results in indirect interactions among seemingly unrelated genes, as well as in a crosstalk between different srRNA pathways. Here we briefly review and compare the major srRNA pathways, and argue that the impact of srRNA is always at the system level. We demonstrate how a simple mathematical model can ease the discussion of governing principles. To demonstrate these points we review a few examples from bacteria and animals.
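A minimal model of the kind alluded to above (a generic sketch of coupled sRNA-mRNA co-degradation with hypothetical rate constants, not the authors' specific equations) is

dm/dt = \alpha_m - \beta_m m - k\,m\,s, \qquad ds/dt = \alpha_s - \beta_s s - k\,m\,s,

where m and s are target mRNA and sRNA levels, \alpha and \beta are synthesis and turnover rates, and the bilinear term k m s captures stoichiometric co-degradation. Because each regulatory event consumes an sRNA molecule, a strongly expressed target can titrate the free sRNA pool and thereby indirectly affect all other targets of that sRNA, which is the system-level crosstalk emphasized in the review.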
NASA Technical Reports Server (NTRS)
Chin, S.; Lan, C. Edward
1988-01-01
An inviscid discrete vortex model, with newly derived expressions for the tangential velocity imposed at the separation points, is used to investigate the symmetric and asymmetric vortex separation on cones and tangent ogives. The circumferential locations of separation are taken from experimental data. Based on a slender body theory, the resulting simultaneous nonlinear algebraic equations in a cross-flow plane are solved with Broyden's modified Newton-Raphson method. Total force coefficients are obtained through momentum principle with new expressions for nonconical flow. It is shown through the method of function deflation that multiple solutions exist at large enough angles of attack, even with symmetric separation points. These additional solutions are asymmetric in vortex separation and produce side force coefficients which agree well with data for cones and tangent ogives.
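For readers unfamiliar with the root-finding step mentioned above, the following is a minimal Python sketch of Broyden's quasi-Newton update applied to a toy 2-D nonlinear system (illustrative only; it is not the cross-flow-plane equations of the paper):

import numpy as np

def fd_jacobian(F, x, h=1e-7):
    """Forward-difference Jacobian used to initialize the Broyden iteration."""
    f0 = F(x)
    J = np.empty((f0.size, x.size))
    for j in range(x.size):
        xp = x.copy()
        xp[j] += h
        J[:, j] = (F(xp) - f0) / h
    return J

def broyden_solve(F, x0, tol=1e-10, max_iter=50):
    """Solve F(x) = 0 using Broyden's rank-one quasi-Newton update."""
    x = np.asarray(x0, dtype=float)
    f = F(x)
    J = fd_jacobian(F, x)
    for _ in range(max_iter):
        dx = np.linalg.solve(J, -f)      # quasi-Newton step
        x = x + dx
        f_new = F(x)
        if np.linalg.norm(f_new) < tol:
            break
        # secant (rank-one) update keeps J consistent with the newest step
        J += np.outer((f_new - f) - J @ dx, dx) / (dx @ dx)
        f = f_new
    return x

# toy system: intersection of the circle x^2 + y^2 = 4 with the parabola y = x^2
F = lambda v: np.array([v[0]**2 + v[1]**2 - 4.0, v[1] - v[0]**2])
print(broyden_solve(F, [1.0, 1.0]))      # approx [1.2496, 1.5616]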
Improving Planck calibration by including frequency-dependent relativistic corrections
NASA Astrophysics Data System (ADS)
Quartin, Miguel; Notari, Alessio
2015-09-01
The Planck satellite detectors are calibrated in the 2015 release using the "orbital dipole", which is the time-dependent dipole generated by the Doppler effect due to the motion of the satellite around the Sun. Such an effect has also relativistic time-dependent corrections of relative magnitude 10^-3, due to coupling with the "solar dipole" (the motion of the Sun compared to the CMB rest frame), which are included in the data calibration by the Planck collaboration. We point out that such corrections are subject to a frequency-dependent multiplicative factor. This factor differs from unity especially at the highest frequencies, relevant for the HFI instrument. Since currently Planck calibration errors are dominated by systematics, to the point that polarization data is currently unreliable at large scales, such a correction can in principle be highly relevant for future data releases.
Translating Theory Into Practice: Implementing a Program of Assessment.
Hauer, Karen E; O'Sullivan, Patricia S; Fitzhenry, Kristen; Boscardin, Christy
2018-03-01
A program of assessment addresses challenges in learner assessment using a centrally planned, coordinated approach that emphasizes assessment for learning. This report describes the steps taken to implement a program of assessment framework within a medical school. A literature review on best practices in assessment highlighted six principles that guided implementation of the program of assessment in 2016-2017: (1) a centrally coordinated plan for assessment aligns with and supports a curricular vision; (2) multiple assessment tools used longitudinally generate multiple data points; (3) learners require ready access to information-rich feedback to promote reflection and informed self-assessment; (4) mentoring is essential to facilitate effective data use for reflection and learning planning; (5) the program of assessment fosters self-regulated learning behaviors; and (6) expert groups make summative decisions about grades and readiness for advancement. Implementation incorporated stakeholder engagement, use of multiple assessment tools, design of a coaching program, and creation of a learner performance dashboard. The assessment team monitors adherence to principles defining the program of assessment and gathers and responds to regular feedback from key stakeholders, including faculty, staff, and students. Next steps include systematically collecting evidence for validity of individual assessments and the program overall. Iterative review of student performance data informs curricular improvements. The program of assessment also highlights technology needs that will be addressed with information technology experts. The outcome ultimately will entail showing evidence of validity that the program produces physicians who engage in lifelong learning and provide high-quality patient care.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hourdequin, Marion, E-mail: Marion.Hourdequin@ColoradoCollege.edu; Department of Philosophy, Colorado College, 14 E. Cache La Poudre St., Colorado Springs, CO 80903; Landres, Peter
Traditional mechanisms for public participation in environmental impact assessment under U.S. federal law have been criticized as ineffective and unable to resolve conflict. As these mechanisms are modified and new approaches developed, we argue that participation should be designed and evaluated not only on practical grounds of cost-effectiveness and efficiency, but also on ethical grounds based on democratic ideals. In this paper, we review and synthesize modern democratic theory to develop and justify four ethical principles for public participation: equal opportunity to participate, equal access to information, genuine deliberation, and shared commitment. We then explore several tensions that are inherent in applying these ethical principles to public participation in EIA. We next examine traditional NEPA processes and newer collaborative approaches in light of these principles. Finally, we explore the circumstances that argue for more in-depth participatory processes. While improved EIA participatory processes do not guarantee improved outcomes in environmental management, processes informed by these four ethical principles derived from democratic theory may lead to increased public engagement and satisfaction with government agency decisions. - Highlights: • Four ethical principles based on democratic theory for public participation in EIA. • NEPA and collaboration offer different strengths in meeting these principles. • We explore tensions inherent in applying these principles. • Improved participatory processes may improve public acceptance of agency decisions.
Benda, Norbert; Brandt, Andreas
2018-01-01
Recently, new draft guidelines on multiplicity issues in clinical trials have been issued by the European Medicines Agency (EMA) and the Food and Drug Administration (FDA), respectively. Multiplicity is an issue in clinical trials if the probability of a false-positive decision is increased by insufficiently accounting for testing multiple hypotheses. We outline the regulatory principles related to multiplicity issues in confirmatory clinical trials intended to support a marketing authorization application in the EU, describe the reasons for an increasing complexity regarding multiple hypothesis testing and discuss the specific multiplicity issues emerging within the regulatory context and being relevant for drug approval.
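To make the underlying error inflation concrete (a textbook illustration, not part of the EMA/FDA guidance itself): for m independent hypotheses each tested at level \alpha, the familywise error rate is FWER = 1 - (1 - \alpha)^m, e.g. 1 - 0.95^{10} \approx 0.40 for ten tests at \alpha = 0.05, which is why adjustments such as Bonferroni (testing each hypothesis at \alpha/m) or hierarchical/gatekeeping procedures are required in confirmatory settings.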
Adjustable long duration high-intensity point light source
NASA Astrophysics Data System (ADS)
Krehl, P.; Hagelweide, J. B.
1981-06-01
A new long duration high-intensity point light source with adjustable light duration and a small light spot locally stable in time has been developed. The principle involved is a stationary high-temperature plasma flow inside a partly constrained capillary of a coaxial spark gap which is viewed end on through a terminating Plexiglas window. The point light spark gap is operated via a resistor by an artificial transmission line. Using two exchangeable inductance sets in the line, two ranges of photoduration, 10-130 μs and 100-600 μs, can be covered. For a light spot size of 1.5 mm diameter the corresponding peak light output amounts to 5×10^6 and 1.6×10^6 candelas, respectively. Within these ranges the duration is controlled by an ignitron crowbar to extinguish the plasma. The adjustable photoduration is very useful for the application of continuous writing rotating mirror cameras, thus preventing multiple exposures. The essentially uniform exposure within the visible spectral range makes the new light source suitable for color cinematography.
Observation of topological nodal fermion semimetal phase in ZrSiS
Neupane, Madhab; Belopolski, Ilya; Hosen, M. Mofazzel; ...
2016-05-11
Unveiling new topological phases of matter is one of the current objectives in condensed matter physics. Recent experimental discoveries of Dirac and Weyl semimetals prompt the search for other exotic phases of matter. Here we present a systematic angle-resolved photoemission spectroscopy study of ZrSiS, a prime topological nodal semimetal candidate. Our wider Brillouin zone (BZ) mapping shows multiple Fermi surface pockets, such as a diamond-shaped Fermi surface, an elliptical Fermi surface, and a small electron pocket encircling the zone center (Γ) point, the M point, and the X point of the BZ, respectively. We experimentally establish the spinless nodal fermion semimetal phase in ZrSiS, which is supported by our first-principles calculations. Our findings evidence that the ZrSiS-type material family is a new platform on which to explore exotic states of quantum matter; these materials are expected to provide an avenue for engineering two-dimensional topological insulator systems.
NASA Astrophysics Data System (ADS)
Weng, Tongfeng; Zhang, Jie; Small, Michael; Harandizadeh, Bahareh; Hui, Pan
2018-03-01
We propose a unified framework to evaluate and quantify the search time of multiple random searchers traversing independently and concurrently on complex networks. We find that the intriguing behaviors of multiple random searchers are governed by two basic principles—the logarithmic growth pattern and the harmonic law. Specifically, the logarithmic growth pattern characterizes how the search time increases with the number of targets, while the harmonic law explores how the search time of multiple random searchers varies relative to that needed by individual searchers. Numerical and theoretical results demonstrate these two universal principles established across a broad range of random search processes, including generic random walks, maximal entropy random walks, intermittent strategies, and persistent random walks. Our results reveal two fundamental principles governing the search time of multiple random searchers, which are expected to facilitate investigation of diverse dynamical processes like synchronization and spreading.
NASA Astrophysics Data System (ADS)
Miehe, Christian; Mauthe, Steffen; Teichtmeister, Stephan
2015-09-01
This work develops new minimization and saddle point principles for the coupled problem of Darcy-Biot-type fluid transport in porous media at fracture. It shows that the quasi-static problem of elastically deforming, fluid-saturated porous media is related to a minimization principle for the evolution problem. This two-field principle determines the rate of deformation and the fluid mass flux vector. It provides a canonically compact model structure, where the stress equilibrium and the inverse Darcy's law appear as the Euler equations of a variational statement. A Legendre transformation of the dissipation potential relates the minimization principle to a characteristic three field saddle point principle, whose Euler equations determine the evolutions of deformation and fluid content as well as Darcy's law. A further geometric assumption results in modified variational principles for a simplified theory, where the fluid content is linked to the volumetric deformation. The existence of these variational principles underlines inherent symmetries of Darcy-Biot theories of porous media. This can be exploited in the numerical implementation by the construction of time- and space-discrete variational principles, which fully determine the update problems of typical time stepping schemes. Here, the proposed minimization principle for the coupled problem is advantageous with regard to a new unconstrained stable finite element design, while space discretizations of the saddle point principles are constrained by the LBB condition. The variational principles developed provide the most fundamental approach to the discretization of nonlinear fluid-structure interactions, showing symmetric systems in algebraic update procedures. They also provide an excellent starting point for extensions towards more complex problems. This is demonstrated by developing a minimization principle for a phase field description of fracture in fluid-saturated porous media. It is designed for an incorporation of alternative crack driving forces, such as a convenient criterion in terms of the effective stress. The proposed setting provides a modeling framework for the analysis of complex problems such as hydraulic fracture. This is demonstrated by a spectrum of model simulations.
NASA Astrophysics Data System (ADS)
Froggatt, C. D.
2003-01-01
The quark-lepton mass problem and the ideas of mass protection are reviewed. The hierarchy problem and suggestions for its resolution, including Little Higgs models, are discussed. The Multiple Point Principle (MPP) is introduced and used within the Standard Model (SM) to predict the top quark and Higgs particle masses. Mass matrix ansätze are considered; in particular we discuss the lightest family mass generation model, in which all the quark mixing angles are successfully expressed in terms of simple expressions involving quark mass ratios. It is argued that an underlying chiral flavour symmetry is responsible for the hierarchical texture of the fermion mass matrices. The phenomenology of neutrino mass matrices is briefly discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li Sheng; Piao Yunsong; Liu Yang
2009-12-15
In a given path with multiple branches, in principle, it can be expected that there are some fork points, where one branch is bifurcated into different branches, or various branches converge into one or several branches. In this paper, it is shown that if there is a web formed by such branches in a given field space, in which each branch can be responsible for a period of slow roll inflation, a multiverse separated by a domain wall network will come into being, some of which might correspond to our observable universe. We discuss this scenario and show possible observations of a given observer at late time.
Classification of polytype structures of zinc sulfide
DOE Office of Scientific and Technical Information (OSTI.GOV)
Laptev, V.I.
1994-12-31
It is suggested that the existing classification of polytype structures of zinc sulfide be supplemented with an additional criterion: the characteristic of regular point systems (Wyckoff positions) including their type, number, and multiplicity. The consideration of the Wyckoff positions allowed the establishment of construction principles of known polytype series of different symmetries and the systematization (for the first time) of the polytypes with the same number of differently packed layers. The classification suggested for polytype structures of zinc sulfide is compact and provides a basis for creating search systems. The classification table obtained can also be used for numerous silicon carbide polytypes. 8 refs., 4 tabs.
Free-form surface measuring method based on optical theodolite measuring system
NASA Astrophysics Data System (ADS)
Yu, Caili
2012-10-01
In industrial measurement, single-point coordinates, lengths and large-dimension curved surfaces can be measured through forward intersection using a theodolite measuring system composed of several optical theodolites and one computer. This paper introduces the measuring principle of a flexible large-dimension three-coordinate measuring system made up of multiple (more than two) optical theodolites, together with the composition and functions of the system. For curved-surface measurement in particular, 3D measured data of a spatial free-form surface are acquired through the theodolite measuring system and a CAD model is formed through surface fitting to directly generate CAM processing data.
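A minimal sketch of the forward-intersection computation on which such systems are based (2-D horizontal case, two stations and hypothetical coordinates; a simplified illustration rather than the paper's implementation):

import numpy as np

def forward_intersection(p1, az1, p2, az2):
    """Intersect two horizontal sight rays given station positions and azimuths (radians)."""
    d1 = np.array([np.sin(az1), np.cos(az1)])   # unit direction from station 1
    d2 = np.array([np.sin(az2), np.cos(az2)])   # unit direction from station 2
    # solve p1 + t1*d1 = p2 + t2*d2 for the ray parameters t1, t2
    A = np.column_stack((d1, -d2))
    t = np.linalg.solve(A, np.asarray(p2, float) - np.asarray(p1, float))
    return np.asarray(p1, float) + t[0] * d1

# hypothetical stations 10 m apart sighting the same target point
print(forward_intersection([0.0, 0.0], np.radians(45.0),
                           [10.0, 0.0], np.radians(-45.0)))   # -> [5., 5.]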
Phased Array Beamforming and Imaging in Composite Laminates Using Guided Waves
NASA Technical Reports Server (NTRS)
Tian, Zhenhua; Leckey, Cara A. C.; Yu, Lingyu
2016-01-01
This paper presents phased array beamforming and imaging using guided waves in anisotropic composite laminates. A generic phased array beamforming formula is presented, based on the classic delay-and-sum principle. The generic formula considers direction-dependent guided wave properties induced by the anisotropic material properties of composites. Moreover, the array beamforming and imaging are performed in the frequency domain, where the guided wave dispersion effect is considered. The presented phased array method is implemented with a non-contact scanning laser Doppler vibrometer (SLDV) to detect multiple defects at different locations in an anisotropic composite plate. The array is constructed of scan points in a small area rapidly scanned by the SLDV. Using the phased array method, multiple defects at different locations are successfully detected. Our study shows that the guided wave phased array method is a potentially effective method for rapid inspection of large composite structures.
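As a generic sketch of the delay-and-sum principle the method builds on (a narrowband, non-dispersive toy version in Python; the paper's formulation works in the frequency domain with direction-dependent, dispersive wave speeds):

import numpy as np

def delay_and_sum(signals, fs, element_x, theta, c):
    """Steer a linear array to angle theta by delaying and summing element signals.

    signals: array of shape (n_elements, n_samples); fs: sampling rate [Hz];
    element_x: element positions along the array [m]; c: assumed wave speed [m/s].
    """
    tau = element_x * np.sin(theta) / c          # per-element geometric delays
    shifts = np.round(tau * fs).astype(int)
    out = np.zeros(signals.shape[1])
    for sig, s in zip(signals, shifts):
        out += np.roll(sig, -s)                  # advance each trace by its delay
    return out / len(signals)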
Spectral imaging: principles and applications.
Garini, Yuval; Young, Ian T; McNamara, George
2006-08-01
Spectral imaging extends the capabilities of biological and clinical studies to simultaneously study multiple features such as organelles and proteins qualitatively and quantitatively. Spectral imaging combines two well-known scientific methodologies, namely spectroscopy and imaging, to provide a new advantageous tool. The need to measure the spectrum at each point of the image requires combining dispersive optics with the more common imaging equipment, and introduces constraints as well. The principles of spectral imaging and a few representative applications are described. Spectral imaging analysis is necessary because the complex data structure cannot be analyzed visually. A few of the algorithms are discussed with emphasis on their usage for different experimental modes (fluorescence and bright field). Finally, spectral imaging, like any method, should be evaluated in light of its advantages to specific applications, a selection of which is described. Spectral imaging is a relatively new technique and its full potential is yet to be exploited. Nevertheless, several applications have already shown its potential. (c) 2006 International Society for Analytical Cytology.
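One widely used analysis step for such data is linear spectral unmixing; the following minimal sketch with synthetic spectra is only a generic example and not necessarily one of the algorithms discussed in the review:

import numpy as np

rng = np.random.default_rng(0)
wavelengths = np.linspace(500, 700, 40)                     # nm, synthetic axis
# two synthetic reference ("endmember") emission spectra
ref = np.stack([np.exp(-((wavelengths - 540) / 20.0) ** 2),
                np.exp(-((wavelengths - 610) / 25.0) ** 2)], axis=1)
true_abund = np.array([0.7, 0.3])
pixel = ref @ true_abund + 0.01 * rng.normal(size=wavelengths.size)

# least-squares unmixing: find abundances minimizing ||ref @ a - pixel||
a, *_ = np.linalg.lstsq(ref, pixel, rcond=None)
print(np.round(a, 3))    # close to [0.7, 0.3]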
NASA Astrophysics Data System (ADS)
Roberts, Andrew; Appleby-Thomas, Gareth; Hazell, Paul
2011-06-01
Following multiple loading events the resultant shock state of a material will lie away from the principal Hugoniot. Prediction of such states requires knowledge of a material's equation of state. The material-specific variable Grüneisen gamma (Γ) defines the shape of 'off-Hugoniot' points in energy-volume-pressure space. Experimentally, the shock-reverberation technique (based on the principle of impedance matching) has previously allowed estimation of the first-order Grüneisen gamma term (Γ1) for a silicone elastomer. Here, this approach was employed to calculate Γ1 for two dissimilar materials, polyether ether ketone (PEEK) and the armour-grade aluminium alloy 5083 (H32), thereby allowing discussion of the limitations of this technique in the context of plate-impact experiments employing manganin stress gauges. Finally, the experimentally determined values for Γ1 were further refined by comparison between experimental records and numerical simulations carried out using the commercial code ANSYS Autodyn®.
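For reference, the textbook Mie-Grüneisen relation that underlies such off-Hugoniot estimates is

P(V, E) - P_H(V) = \frac{\Gamma(V)}{V}\,[E - E_H(V)],

where P_H and E_H are the pressure and specific internal energy on the principal Hugoniot at specific volume V; knowledge of \Gamma therefore fixes how states reached by multiple shocks or reverberations deviate from the single-shock curve (quoted here in its generic volume-dependent form, not the specific first-order fit discussed above).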
Scalets, wavelets and (complex) turning point quantization
NASA Astrophysics Data System (ADS)
Handy, C. R.; Brooks, H. A.
2001-05-01
Despite the many successes of wavelet analysis in image and signal processing, the incorporation of continuous wavelet transform theory within quantum mechanics has lacked a compelling, first principles, motivating analytical framework, until now. For arbitrary one-dimensional rational fraction Hamiltonians, we develop a simple, unified formalism, which clearly underscores the complementary, and mutually interdependent, role played by moment quantization theory (i.e. via scalets, as defined herein) and wavelets. This analysis involves no approximation of the Hamiltonian within the (equivalent) wavelet space, and emphasizes the importance of (complex) multiple turning point contributions in the quantization process. We apply the method to three illustrative examples. These include the (double-well) quartic anharmonic oscillator potential problem, V(x) = Z2x2 + gx4, the quartic potential, V(x) = x4, and the very interesting and significant non-Hermitian potential V(x) = -(ix)3, recently studied by Bender and Boettcher.
Assessing and managing stressors in a changing marine environment.
Chapman, Peter M
2017-11-30
We are facing a dynamic future in the face of multiple stressors acting individually and in combination: climate change; habitat change/loss; overfishing; invasive species; harmful algal blooms/eutrophication; and chemical contaminants. Historic assessment and management approaches will be inadequate for addressing risks from climate change and other stressors. Wicked problems (non-linear, complex, competing risks and benefits, not easily solvable) will become increasingly common. We are facing irreversible changes to our planetary living conditions. Agreed protection goals, and consideration of both the negatives (risks) and the positives (benefits) of any and all actions, are required, as is judicious and appropriate use of the Precautionary Principle. Researchers and managers need to focus on: determining tipping points (alternative stable points); maintaining ecosystem services; and managing competing ecosystem services. Marine (and other) scientists are urged to focus their research on wicked problems to allow for informed decision-making on a planetary basis. Copyright © 2016 Elsevier Ltd. All rights reserved.
Wong, Rebecca; Levi, Angelique W; Harigopal, Malini; Schofield, Kevin; Chhieng, David C
2012-02-01
Our cytology laboratory, like many others, is under pressure to improve quality and provide test results faster while decreasing costs. We sought to address these issues by introducing new technology and lean principles. To determine the combined impact of the FocalPoint Guided Screener (GS) Imaging System (BD Diagnostics-TriPath, Burlington, North Carolina) and lean manufacturing principles on the turnaround time (TAT) and productivity of the gynecologic cytology operation. We established a baseline measure of the TAT for Papanicolaou tests. We then compared that to the performance after implementing the FocalPoint GS Imaging System and lean principles. The latter included value-stream mapping, workflow modification, and a first-in, first-out policy. The mean (SD) TAT for Papanicolaou tests before and after the implementation of the FocalPoint GS Imaging System and lean principles was 4.38 (1.28) days and 3.20 (1.32) days, respectively. This represented a 27% improvement in the average TAT, which was statistically significant (P < .001). In addition, the productivity of staff improved 17%, as evidenced by the increase in slides screened from 8.85/h to 10.38/h. The false-negative fraction decreased from 1.4% to 0.9%, representing a 36% improvement. In our laboratory, the implementation of the FocalPoint GS Imaging System in conjunction with lean principles resulted in a significant decrease in the average TAT for Papanicolaou tests and a substantial increase in the productivity of cytotechnologists while maintaining the diagnostic quality of gynecologic cytology.
An Exploratory Review of Design Principles in Constructivist Gaming Learning Environments
ERIC Educational Resources Information Center
Rosario, Roberto A. Munoz; Widmeyer, George R.
2009-01-01
Creating a design theory for Constructivist Gaming Learning Environment necessitates, among other things, the establishment of design principles. These principles have the potential to help designers produce games, where users achieve higher levels of learning. This paper focuses on twelve design principles: Probing, Distributed, Multiple Routes,…
Set Partitions and the Multiplication Principle
ERIC Educational Resources Information Center
Lockwood, Elise; Caughman, John S., IV
2016-01-01
To further understand student thinking in the context of combinatorial enumeration, we examine student work on a problem involving set partitions. In this context, we note some key features of the multiplication principle that were often not attended to by students. We also share a productive way of thinking that emerged for several students who…
Multiple point statistical simulation using uncertain (soft) conditional data
NASA Astrophysics Data System (ADS)
Hansen, Thomas Mejer; Vu, Le Thanh; Mosegaard, Klaus; Cordua, Knud Skou
2018-05-01
Geostatistical simulation methods have been used to quantify spatial variability of reservoir models since the 1980s. In the last two decades, state-of-the-art simulation methods have changed from being based on covariance-based two-point statistics to multiple-point statistics (MPS), which allow simulation of more realistic Earth structures. In addition, increasing amounts of geo-information (geophysical, geological, etc.) from multiple sources are being collected. This poses the problem of integrating these different sources of information, such that decisions related to reservoir models can be taken on as informed a basis as possible. In principle, though difficult in practice, this can be achieved using computationally expensive Monte Carlo methods. Here we investigate the use of sequential-simulation-based MPS methods conditional to uncertain (soft) data, as a computationally efficient alternative. First, it is demonstrated that current implementations of sequential simulation based on MPS (e.g. SNESIM, ENESIM and Direct Sampling) do not account properly for uncertain conditional information, due to a combination of using only co-located information and a random simulation path. Then, we suggest two approaches that better account for the available uncertain information. The first makes use of a preferential simulation path, where more informed model parameters are visited preferentially to less informed ones. The second approach involves using non co-located uncertain information. For different types of available data, these approaches are demonstrated to produce simulation results similar to those obtained by the general Monte Carlo based approach. These methods allow MPS simulation to condition properly to uncertain (soft) data, and hence provide a computationally attractive approach for integration of information about a reservoir model.
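A minimal sketch of the preferential-path idea (hypothetical grid and soft probabilities; not the SNESIM, ENESIM or Direct Sampling implementations themselves) is to order the simulation path by how informative the soft data are at each cell, for example by the entropy of the local facies probabilities:

import numpy as np

rng = np.random.default_rng(1)
n_cells, n_facies = 100, 3
# hypothetical soft data: a probability vector over facies at every grid cell
soft = rng.dirichlet(alpha=np.full(n_facies, 0.5), size=n_cells)

# Shannon entropy per cell: low entropy = well-informed cell
entropy = -(soft * np.log(soft + 1e-12)).sum(axis=1)

# preferential path: visit the most informed (lowest-entropy) cells first,
# instead of the fully random path used in standard sequential simulation
path = np.argsort(entropy)

sim = np.empty(n_cells, dtype=int)
for cell in path:
    # placeholder: a real MPS code would combine the training-image conditional
    # distribution with the soft probability at this cell before sampling
    sim[cell] = rng.choice(n_facies, p=soft[cell])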
Using Adult Learning Principles in Adult Basic and Literacy Education. Practice Application Brief.
ERIC Educational Resources Information Center
Imel, Susan
Adult basic and literacy education (ABLE) is a complex undertaking that serves diverse learners with a variety of needs. Although no definitive list of adult education principles exists in the literature, the following principles have been identified in multiple sources devoted to principles of effective adult education: involve learners in…
ERIC Educational Resources Information Center
Heald, M.; Allen, D.; Villa, D.; Oliver, C.
2013-01-01
This proof of principle study was designed to evaluate whether excessively high rates of social approach behaviors in children with Angelman syndrome (AS) can be modified using a multiple schedule design. Four children with AS were exposed to a multiple schedule arrangement, in which social reinforcement and extinction, cued using a novel…
The probability density function (PDF) of Lagrangian Turbulence
NASA Astrophysics Data System (ADS)
Birnir, B.
2012-12-01
The statistical theory of Lagrangian turbulence is derived from the stochastic Navier-Stokes equation. Assuming that the noise in fully-developed turbulence is a generic noise determined by the general theorems in probability, the central limit theorem and the large deviation principle, we are able to formulate and solve the Kolmogorov-Hopf equation for the invariant measure of the stochastic Navier-Stokes equations. The intermittency corrections to the scaling exponents of the structure functions require a multiplicative (multiplying the fluid velocity) noise in the stochastic Navier-Stokes equation. We let this multiplicative noise in the equation consist of a simple (Poisson) jump process and then show how the Feynman-Kac formula produces the log-Poissonian processes found by She and Leveque, Waymire and Dubrulle. These log-Poissonian processes give the intermittency corrections that agree with modern direct Navier-Stokes simulations (DNS) and experiments. The probability density function (PDF) plays a key role when direct Navier-Stokes simulations or experimental results are compared to theory. The statistical theory of turbulence is determined, including the scaling of the structure functions of turbulence, by the invariant measure of the Navier-Stokes equation, and the PDFs for the various statistics (one-point, two-point, N-point) can be obtained by taking the trace of the corresponding invariant measures. Hopf derived in 1952 a functional equation for the characteristic function (Fourier transform) of the invariant measure. In distinction to the nonlinear Navier-Stokes equation, this is a linear functional differential equation. The PDFs obtained from the invariant measures for the velocity differences (two-point statistics) are shown to be the four-parameter generalized hyperbolic distributions found by Barndorff-Nielsen. These PDFs have heavy tails and a convex peak at the origin. A suitable projection of the Kolmogorov-Hopf equations is the differential equation determining the generalized hyperbolic distributions. Then we compare these PDFs with DNS results and experimental data.
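For orientation, the log-Poisson intermittency corrections referred to above are usually quoted in the She-Leveque form for the structure-function exponents (the standard form of that model class, not a result re-derived here):

\zeta_p = \frac{p}{9} + 2\left[1 - \left(\tfrac{2}{3}\right)^{p/3}\right],

which gives \zeta_3 = 1 exactly and \zeta_2 \approx 0.70, compared with the non-intermittent Kolmogorov value \zeta_p = p/3.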
Toward an Ethical Framework for Climate Services
NASA Astrophysics Data System (ADS)
Wilby, R.; Adams, P.; Eitland, E.; Hewitson, B.; Shumake, J.; Vaughan, C.; Zebiak, S. E.
2015-12-01
Climate services offer information and tools to help stakeholders anticipate and/or manage risks posed by climate change. However, climate services lack a cohesive ethical framework to govern their development and application. This paper describes a prototype, open-ended process to form a set of ethical principles to ensure that climate services are effectively deployed to manage climate risks, realize opportunities, and advance human security. We begin by acknowledging the multiplicity of competing interests and motivations across individuals and institutions. Growing awareness of potential climate impacts has raised interest and investments in climate services and led to the entrance of new providers. User demand for climate services is also rising, as are calls for new types of services. Meanwhile, there is growing pressure from funders to operationalize climate research. Our proposed ethical framework applies reference points founded on diverse experiences in western and developing countries, fundamental and applied climate research, different sectors, gender, and professional practice (academia, private sector, government). We assert that climate service providers should be accountable for both their practices and products by upholding values of integrity, transparency, humility, and collaboration. Principles of practice include: communicating all value judgements; eschewing climate change as a singular threat; engaging in the co-exploration of knowledge; establishing mechanisms for monitoring/evaluating procedures and products; declaring any conflicts of interest. Examples of principles of products include: clear and defensible provenance of information; descriptions of the extent and character of uncertainties using terms that are meaningful to intended users; tools and information that are tailored to the context of the user; and thorough documentation of methods and meta-data. We invite the community to test and refine these points.
Ten Anchor Points for Teaching Principles of Marketing
ERIC Educational Resources Information Center
Tomkovick, Chuck
2004-01-01
Effective marketing instructors commonly share a love for their students, an affinity for the subject matter, and a devotion to continuous quality improvement. The purpose of this article is to highlight 10 anchor points for teaching Principles of Marketing, which are designed to better engage students in the learning process. These anchor…
41 CFR Appendix A to Subpart A of... - 3-Key Points and Principles
Code of Federal Regulations, 2010 CFR
2010-07-01
Fiber sensor network with multipoint sensing using double-pass hybrid LPFG-FBG sensor configuration
NASA Astrophysics Data System (ADS)
Yong, Yun-Thung; Lee, Sheng-Chyan; Rahman, Faidz Abd
2017-03-01
This is a study of a double-pass intensity-based hybrid Long Period Fiber Grating (LPFG) and Fiber Bragg Grating (FBG) sensor configuration in which a fiber sensor network with multiple sensing capability was constructed. The sensing principle is based on interrogation of intensity changes of the reflected signal from an FBG caused by the LPFG spectral response to the surrounding perturbations. The sensor network developed was tested in monitoring diesel adulteration over a distance of up to 8 km. Kerosene concentrations from 0% to 50% were added as adulterant into diesel. The sensitivity of the double-pass hybrid LPFG-FBG sensor over multiple points was >0.21 dB/% (for the adulteration range of 0-30%) and >0.45 dB/% from 30% to 50% adulteration. It is found that the sensitivity can drop by up to 35% when the fiber length is increased from 0 km to 8 km (for the case of adulteration of 0-30%). With the multiple sensing capabilities, the normalized reflected power of the FBGs can be demodulated at the same time for comparison of sensitivity performance across various fiber sensors.
Use of General Principles in Teaching Biochemistry.
ERIC Educational Resources Information Center
Fernandez, Rolando Hernandez; Tomey, Agustin Vicedo
1991-01-01
Presents Principles of Biochemistry for use as main focus of a biochemistry course. The nine guiding ideas are the principles of continual turnover, macromolecular organization, molecular recognition, multiplicity of utilization, maximum efficiency, gradual change, interrelationship, transformational reciprocity, and information transfer. In use…
2017-05-22
[Fragment of a report's list of figures and tables: feasibility test, Bellman's Principle and its validation, distributions and PDFs of observed and simulated ISR traffic, and adversary security states and hypothesis testing at test point #10.]
Pump-Probe Spectroscopy Using the Hadamard Transform.
Beddard, Godfrey S; Yorke, Briony A
2016-08-01
A new method of performing pump-probe experiments is proposed and experimentally demonstrated by a proof of concept on the millisecond scale. The idea behind this method is to measure the total probe intensity arising from several time points as a group, instead of measuring each time point separately. These multiplexed measurements are then transformed into the true signal via multiplication with a binary Hadamard S matrix. Each group of probe pulses is determined by using the pattern of a row of the Hadamard S matrix, and the experiment is completed by rotating this pattern by one step for each sample excitation until the original pattern is again produced. Thus, to measure n time points, n excitation events are needed and n probe patterns, each taken from the n × n S matrix. The time resolution is determined by the shortest time between the probe pulses. In principle, this method could be used over all timescales, instead of the conventional pump-probe method which uses delay lines for picosecond and faster time resolution, or fast detectors and oscilloscopes on longer timescales. This new method is particularly suitable for situations where the probe intensity is weak and/or the detector is noisy. When the detector is noisy, there is in principle a signal-to-noise advantage over conventional pump-probe methods. © The Author(s) 2016.
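A minimal numerical sketch of the multiplexing scheme described above (synthetic decay signal and an 8x8 Hadamard-derived S matrix; illustrative only, not the experimental processing chain):

import numpy as np
from scipy.linalg import hadamard

n = 8                                    # Hadamard order (power of 2)
H = hadamard(n)
S = (1 - H[1:, 1:]) // 2                 # binary S matrix of order n-1 (entries 0/1)

t = np.arange(n - 1)                     # 7 time points
x = np.exp(-t / 3.0)                     # true pump-probe signal (synthetic decay)

# each excitation measures the summed probe intensity of the time points
# selected by one (cyclically rotated) row of S
y = S @ x

x_rec = np.linalg.solve(S, y)            # recover the time-resolved signal
print(np.allclose(x_rec, x))             # True

For detector-noise-limited measurements, S-matrix multiplexing of m time points is commonly quoted as giving a signal-to-noise improvement of roughly (m+1)/(2\sqrt{m}) over measuring each point separately.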
Cognition of an expert tackling an unfamiliar conceptual physics problem
NASA Astrophysics Data System (ADS)
Schuster, David; Undreiu, Adriana
2009-11-01
We have investigated and analyzed the cognition of an expert tackling a qualitative conceptual physics problem of an unfamiliar type. Our goal was to elucidate the detailed cognitive processes and knowledge elements involved, irrespective of final solution form, and consider implications for instruction. The basic but non-trivial problem was to find qualitatively the direction of acceleration of a pendulum bob at various stages of its motion, a problem originally studied by Reif and Allen. Methodology included interviews, introspection, retrospection and self-reported metacognition. Multiple facets of cognition were revealed, with different reasoning strategies used at different stages and for different points on the path. An account is given of the zigzag thinking paths and interplay of reasoning modes and schema elements involved. We interpret the cognitive processes in terms of theoretical concepts that emerged, namely: case-based, principle-based, experiential-intuitive and practical-heuristic reasoning; knowledge elements and schemata; activation; metacognition and epistemic framing. The complexity of cognition revealed in this case study contrasts with the tidy principle-based solutions we present to students. The pervasive role of schemata, case-based reasoning, practical heuristic strategies, and their interplay with physics principles is noteworthy, since these aspects of cognition are generally neither recognized nor taught. The schema/reasoning-mode perspective has direct application in science teaching, learning and problem-solving.
Fashion, Paper Dolls and Multiplicatives
ERIC Educational Resources Information Center
Ura, Suzana Kaori; Stein-Barana, Alzira C. M.; Munhoz, Deisy P.
2011-01-01
The multiplicative principle is the tool allowing the counting of groups that can be described by a sequence of events. An event is a subset of sample space, i.e. a collection of possible outcomes, which may be equal to or smaller than the sample space as a whole. It is important that students understand this basic principle early on and know how…
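As a one-line illustration of the principle (a generic textbook example, not taken from the article): an outfit built by choosing one of 3 skirts, one of 4 blouses and one of 2 hats can be assembled in 3 × 4 × 2 = 24 ways, because the choices form a sequence of independent events.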
Spaceborne receivers: Basic principles
NASA Technical Reports Server (NTRS)
Stacey, J. M.
1984-01-01
The underlying principles of operation of microwave receivers for space observations of planetary surfaces were examined. The design philosophy of the receiver as it is applied to operate functionally as an efficient receiving system, the principle of operation of the key components of the receiver, and the important differences among receiver types are explained. The operating performance and the sensitivity expectations for both the modulated and total power receiver configurations are outlined. The expressions are derived from first principles and are developed through the important intermediate stages to form practical and easily applied equations. The transfer of thermodynamic energy from point to point within the receiver is illustrated. The language of microwave receivers is applied statistics.
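The sensitivity expectations referred to above reduce, in their standard textbook form, to the radiometer equation (quoted generically here; the report derives the expressions from first principles for each configuration):

\Delta T_{rms} = \kappa\,\frac{T_{sys}}{\sqrt{B\,\tau}},

where T_sys is the system noise temperature, B the predetection bandwidth, \tau the integration time, and \kappa a factor of order unity set by the receiver type (\kappa = 1 for an ideal total-power receiver, \kappa = 2 for a Dicke-switched, i.e. modulated, receiver).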
Kamstrupp's wow-effect: re-examined and expanded
NASA Astrophysics Data System (ADS)
King, Elizabeth M.; Dickmann, Ellyn M.; Johnson, Barbara Z.
2016-12-01
This review examines Anne Katrine Kamstrupp's article "The wow-effect in science teacher education; technology; sociomateriality." In the discussion below we explore three key areas of her ethnographic research. First, we reconsider Kamstrupp's article through the lens of technology as a pedagogical choice and philosophy. This is followed by our discussion of aspects of her study within the context of a basic understanding that entry-level pre-service teachers need to fully understand both the process of learning and scientific principles as these are important foundational factors in determining whether or not the wow-effect will occur as expected. Finally, our review team presents multiple areas in Kamstrupp's article as potential points for further elaboration.
Liquid crystal nanoparticles for delivery of photosensitizers for photodynamic therapy
NASA Astrophysics Data System (ADS)
Nag, Okhil K.; Naciri, Jawad; Delehanty, James B.
2018-02-01
The main principle of photodynamic therapy (PDT) is to kill malignant cells by generation of reactive oxygen species (ROS). PDT is particularly effective when ROS can be produced at subcellular locations such as the plasma membrane. The plasma membrane, which maintains the structural integrity of the cell and regulates multiple important cellular processes such as endocytosis, trafficking, and apoptotic pathways, is therefore one of the best targets for killing cancer cells. Previously, we developed a plasma membrane-targeted liquid crystal nanoparticle (LCNP) formulation that can be loaded with dyes or drugs. Here we highlight the utility of this LCNP for membrane-targeted delivery and imaging of a photosensitizer (PS) for PDT applications.
Reconfiguration of a smart surface using heteroclinic connections
McInnes, Colin R.; Xu, Ming
2017-01-01
A reconfigurable smart surface with multiple equilibria is presented, modelled using discrete point masses and linear springs with geometric nonlinearity. An energy-efficient reconfiguration scheme is then investigated to connect equal-energy unstable (but actively controlled) equilibria. In principle, zero net energy input is required to transition the surface between these unstable states, compared to transitions between stable equilibria across a potential barrier. These transitions between equal-energy unstable states, therefore, form heteroclinic connections in the phase space of the problem. Moreover, the smart surface model developed can be considered as a unit module for a range of applications, including modules which can aggregate together to form larger distributed smart surface systems. PMID:28265191
Quadratic grating apodized photon sieves for simultaneous multiplane microscopy
NASA Astrophysics Data System (ADS)
Cheng, Yiguang; Zhu, Jiangping; He, Yu; Tang, Yan; Hu, Song; Zhao, Lixin
2017-10-01
We present a new type of imaging device, named quadratic grating apodized photon sieve (QGPS), used as the objective for simultaneous multiplane imaging in X-rays. The proposed QGPS is structured based on the combination of two concepts: photon sieves and quadratic gratings. Its design principles are also expounded in detail. Analysis of imaging properties of QGPS in terms of point-spread function shows that QGPS can image multiple layers within an object field onto a single image plane. Simulated and experimental results in visible light both demonstrate the feasibility of QGPS for simultaneous multiplane imaging, which is extremely promising to detect dynamic specimens by X-ray microscopy in the physical and life sciences.
Maternity care and Human Rights: what do women think?
Solnes Miltenburg, Andrea; Lambermon, Fleur; Hamelink, Cees; Meguid, Tarek
2016-07-02
A human rights approach to maternal health is considered a useful framework in international efforts to reduce maternal mortality. Although fundamental human rights principles are incorporated into legal and medical frameworks, human rights have to be translated into measurable actions and outcomes. So far, their substantive applications remain unclear. The aim of this study is to explore women's perspectives and experiences of maternal health services through a human rights perspective in Magu District, Tanzania. This study is a qualitative exploration of perspectives and experiences of women regarding maternity services in government health facilities. The point of departure is a Human Rights perspective. A total of 36 semi-structured interviews were held with 17 women, between the ages of 31 and 63, supplemented with one focus group discussion of a selection of the interviewed women, in three rural villages and the town centre in Magu District. Data analysis was performed using a coding scheme based on four human rights principles: dignity, autonomy, equality and safety. Women's experiences of maternal health services reflect several sub-standard care factors relating to violations of multiple human rights principles. Women were aware that substandard care was present and described a range of ways in which the services could be delivered that would uphold human rights principles. Prominent themes included: 'being treated well and equal', 'being respected' and 'being given the appropriate information and medical treatment'. Women in this rural Tanzanian setting are aware that their experiences of maternity care reflect violations of their basic rights and are able to voice what basic human rights principles mean to them as well as their desired applications in maternal health service provision.
The Jet Principle: Technologies Provide Border Conditions for Global Learning
ERIC Educational Resources Information Center
Ahamer, Gilbert
2012-01-01
Purpose: The purpose of this paper is to first define the "jet principle" of (e-)learning as providing dynamically suitable framework conditions for enhanced learning procedures that combine views from multiple cultures of science. Second it applies this principle to the case of the "Global Studies" curriculum, a unique…
Developing an Asteroid Rotational Theory
NASA Astrophysics Data System (ADS)
Geis, Gena; Williams, Miguel; Linder, Tyler; Pakey, Donald
2018-01-01
The goal of this project is to develop an asteroid rotational theory from first principles. Starting from first principles provides a firm foundation for computer simulations, which can be used to analyze multiple variables at once, such as size, rotation period, tensile strength, and density. The initial theory will be presented along with early models of applying the theory to the asteroid population. Early results confirm previous work by Pravec et al. (2002) showing that the majority of asteroids larger than 200 m have negligible tensile strength and have spin rates close to their critical breakup point. Additionally, results show that an object with zero tensile strength has a maximum rotational rate determined by the object's density, not its size. Therefore, an iron asteroid with a density of 8000 kg/m^3 would have a minimum spin period of 1.16 h if the only forces were gravitational and centrifugal. The short-term goal is to include material forces in the simulations to determine what tensile strength will allow the high spin rates of asteroids smaller than 150 m.
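The quoted ~1.16 h limit follows from the standard critical-spin condition for a strengthless body, in which centrifugal acceleration at the equator equals self-gravity (a back-of-envelope check consistent with the numbers above, not a calculation from the paper):

P_{crit} = \sqrt{\frac{3\pi}{G\rho}} \approx \sqrt{\frac{3\pi}{(6.67\times10^{-11})(8000)}}\ \text{s} \approx 4.2\times10^{3}\ \text{s} \approx 1.17\ \text{h},

independent of the object's size, which is the density-only scaling stated in the abstract.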
Risk-Sensitive Control of Pure Jump Process on Countable Space with Near Monotone Cost
DOE Office of Scientific and Technical Information (OSTI.GOV)
Suresh Kumar, K., E-mail: suresh@math.iitb.ac.in; Pal, Chandan, E-mail: cpal@math.iitb.ac.in
2013-12-15
In this article, we study a risk-sensitive control problem with a controlled continuous-time pure jump process on a countable space as the state dynamics. We prove a multiplicative dynamic programming principle and elliptic and parabolic Harnack's inequalities. Using the multiplicative dynamic programming principle and the Harnack's inequalities, we prove the existence and a characterization of an optimal risk-sensitive control under the near monotone condition.
Thermodynamic resource theories, non-commutativity and maximum entropy principles
NASA Astrophysics Data System (ADS)
Lostaglio, Matteo; Jennings, David; Rudolph, Terry
2017-04-01
We discuss some features of thermodynamics in the presence of multiple conserved quantities. We prove a generalisation of Landauer principle illustrating tradeoffs between the erasure costs paid in different ‘currencies’. We then show how the maximum entropy and complete passivity approaches give different answers in the presence of multiple observables. We discuss how this seems to prevent current resource theories from fully capturing thermodynamic aspects of non-commutativity.
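For reference, the single-quantity statement being generalised above is the standard Landauer bound (quoted generically; the paper's result concerns how this cost can be distributed among several conserved charges):

W_{erase} \geq k_B T \ln 2 \ \text{per bit},

i.e. erasing one bit of information in contact with a bath at temperature T costs at least k_B T ln 2 of work; with multiple conserved quantities the erasure cost can, per the abstract, be paid in different combinations of those 'currencies' subject to a generalised trade-off.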
A Point-Like Picture of the Hydrogen Atom
NASA Astrophysics Data System (ADS)
Faghihi, F.; Jangjoo, A.; Khani, M.
A point-like picture of the Schrödinger solution for the hydrogen atom is worked out to emphasize that "point-like particles" may be described by a "probability wave function". In each case, the three-dimensional shape of |Ψ_nlm(r_n, cos θ)|^2 is plotted and the paths of the point-like electron (more precisely, of the reduced mass of the particle pair) are described in each closed shell. Finally, the orbital shapes of the molecules are given according to the present simple model. In our opinion, the "interpretations of the Correspondence Principle", a basic principle in all elementary quantum texts, seem to deserve review again!
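For the simplest case of the densities referred to above, the quantity being plotted is (standard hydrogenic ground state, given only as a concrete example of the |Ψ_nlm|^2 surfaces mentioned):

|\Psi_{100}(r)|^2 = \frac{1}{\pi a_0^3}\, e^{-2r/a_0},

with a_0 the Bohr radius; the density is spherically symmetric and peaks at the nucleus, while the radial probability 4\pi r^2 |\Psi_{100}|^2 peaks at r = a_0.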
The network and transmission of based on the principle of laser multipoint communication
NASA Astrophysics Data System (ADS)
Fu, Qiang; Liu, Xianzhu; Jiang, Huilin; Hu, Yuan; Jiang, Lun
2014-11-01
Space laser communication is an ideal choice for the future earth-integrated information backbone network. This paper introduces the structure of the earth-integrated information network: a large-capacity, high-speed broadband information network in which a variety of communication platforms, such as land, sea, air and deep-space users or aircraft, are densely interconnected, and in which technologies for intelligent high-speed processing, switching and routing are adopted. According to the principle of maximizing the effective comprehensive utilization of information resources, information is acquired accurately, processed quickly and transmitted efficiently through inter-satellite, satellite-earth, sky and ground-station links; in other words, it will be a space-based, air-based and ground-based integrated information network. Starting from the trends in laser communication, the current situation of laser multi-point communication is reviewed, transmission schemes for dynamic multi-point wireless laser communication networks are studied carefully, and the characteristics and scope of the various schemes are described in detail: an optical multiplexer based on a multiport communication form is applied to relay backbone links; an optical multiplexer based on splitting the receiver field of view is applied to small-angle links; an optical multiplexer based on a three-concentric-sphere structure is applied to short-distance, mobile occasions; and a multi-point stitching structure based on a rotated paraboloid is applied to inter-satellite communications. The multi-point laser communication terminal consists of the transmitting and receiving antennas, a relay optical system, a spectroscopic system, and the communication transmitter and receiver systems. For optical multiplexers serving four or more communication targets, the ratio of received power to volume and weight shows obvious advantages, and multiple moving targets can be tracked flexibly. This work provides a reference for the construction of earth-integrated information networks.
THE DEVELOPMENT OF AN INSTRUMENT FOR MEASURING THE UNDERSTANDING OF PROFIT-MAXIMIZING PRINCIPLES.
ERIC Educational Resources Information Center
MCCORMICK, FLOYD G.
THE PURPOSE OF THE STUDY WAS TO DEVELOP AN INSTRUMENT FOR MEASURING PROFIT-MAXIMIZING PRINCIPLES IN FARM MANAGEMENT WITH IMPLICATIONS FOR VOCATIONAL AGRICULTURE. PRINCIPLES WERE IDENTIFIED FROM LITERATURE SELECTED BY AGRICULTURAL ECONOMISTS. FORTY-FIVE MULTIPLE-CHOICE QUESTIONS WERE REFINED ON THE BASIS OF RESULTS OF THREE PRETESTS AND…
The Aristotelian Principle and Education.
ERIC Educational Resources Information Center
Pekarsky, Daniel
1980-01-01
For the Aristotelian principle to become operative, certain conditions must be met. Individuals should be free from anxiety about satisfying basic needs. They cannot enjoy complexity in their lives beyond a certain point. To have satisfying lives, individuals should engage in both passive and active enjoyments, leading from a lower to a higher…
On the correspondence between quantum and classical variational principles
Ruiz, D. E.; Dodin, I. Y.
2015-06-10
Here, classical variational principles can be deduced from quantum variational principles via formal reparameterization of the latter. It is shown that such reparameterization is possible without invoking any assumptions other than classicality and without appealing to dynamical equations. As examples, first-principles variational formulations of classical point-particle and cold-fluid motion are derived from their quantum counterparts for Schrödinger, Pauli, and Klein-Gordon particles.
Components for the Global Digital Object Cloud
NASA Astrophysics Data System (ADS)
Glaves, Helen; Hanahoe, Hilary; Weigel, Tobias; Lannom, Larry; Wittenburg, Peter; Koureas, Dimitris; Almas, Bridget
2017-04-01
We are at a tipping point in the development of a common conceptual framework and set of tools and components which will revolutionize the management of scientific data. It is widely acknowledged that the current volumes and complexity of data now being collected, and the inevitable and enormous increase in that volume and complexity, have reached the point where action is required. Around 80% of the data generated is being lost after short time periods and a corresponding amount of time is being wasted by researchers on routine data management tasks. At the same time, and largely in response to this perceived crisis, a number of principles (G8, RDA DFT, FAIR) for the management of scientific data have arisen and been widely endorsed. The danger now is that agreement will stop at the level of principles and that multiple non-interoperable domain and technology specific silos will continue to arise, all based on the abstract principles. If this happens, we will lose the opportunity to create a common set of low-level tools and components based on an agreed conceptual approach. The Research Data Alliance (RDA) is now combining recommendations from its individual working and interest groups, such as suggestions for proper citation of dynamic data or how to assess the quality of repositories, to design configurations of core components (as specified by RDA and other initiatives such as W3C) and stimulate their implementation. Together with a few global communities such as climate modeling, biodiversity and material science, experts involved in RDA are developing a concept called Global Digital Object Cloud (GDOC) which has the potential to overcome the huge fragmentation which hampers efficient data management and re-use. It is compliant with the FAIR principles in so far as a) it puts Digital Objects (DOs) at its center, b) has all DOs assigned PIDs which are resolvable to useful state information, c) has all DOs associated with metadata, and d) has all DO bit sequences stored in trustworthy repositories. The presentation will give an overview of the types of components involved, the corresponding specifications of RDA, and the concept of the GDOC.
Popova, A Yu; Trukhina, G M; Mikailova, O M
The article considers the quality control and safety system implemented in one of the largest flight-catering food production plants for airline passengers and flight crews. The control system was based on the Hazard Analysis and Critical Control Points (HACCP) principles and on the hygienic and anti-epidemic measures developed. The identification of hazard factors at the stages of the technical process is considered. Results of the analysis of monitoring data for 6 critical control points over a five-year period are presented. The quality control and safety system permits a reduction in the risk of food contamination during the acceptance, preparation and supply of in-flight meals. The efficiency of the implemented system was demonstrated, and further ways of harmonizing and implementing HACCP principles at the plant are determined.
Integral imaging with multiple image planes using a uniaxial crystal plate.
Park, Jae-Hyeung; Jung, Sungyong; Choi, Heejin; Lee, Byoungho
2003-08-11
Integral imaging has been attracting much attention recently for its several advantages such as full parallax, continuous viewpoints, and real-time full-color operation. However, the thickness of the displayed three-dimensional image is limited to a relatively small value due to the degradation of the image resolution. In this paper, we propose a method to provide observers with enhanced perception of depth without severe resolution degradation by the use of the birefringence of a uniaxial crystal plate. The proposed integral imaging system can display images integrated around three central depth planes by dynamically altering the polarization and controlling both the elemental images and the dynamic slit array mask accordingly. We explain the principle of the proposed method and verify it experimentally.
NASA Astrophysics Data System (ADS)
dall'Acqua, Luisa
2011-08-01
The teleology of our research is to propose a solution to the demand for "innovative, creative teaching", proposing a methodology to educate creative students in a society characterized by multiple reference points and hyper-dynamic knowledge, continuously subject to review and discussion. We apply a multi-perspective Instructional Design Model (PENTHA ID Model), defined and developed by our research group, which adopts a hybrid pedagogical approach consisting of elements of didactical connectivism intertwined with aspects of social constructivism and enactivism. The contribution proposes an e-course structure and approach, applying the theoretical design principles of the above-mentioned ID Model, and describes methods, techniques, technologies and assessment criteria for the definition of lesson modes in an e-course.
Chou, Ting-Chao
2011-01-01
The mass-action law based system analysis via mathematical induction and deduction leads to a generalized theory and algorithm that allow computerized simulation of dose-effect dynamics with small-size experiments using a small number of data points in vitro, in animals, and in humans. The median-effect equation of the mass-action law, deduced from over 300 mechanism-specific equations, has been shown to be the unified theory that serves as the common link for complicated biomedical systems. After using the median-effect principle as the common denominator, its applications are mechanism-independent, drug unit-independent, and dynamic order-independent, and can be used generally for single-drug analysis or for multiple drug combinations in constant or non-constant ratios. Since the "median" is the common link and universal reference point in biological systems, these general features enable computerized quantitative bio-informatics for econo-green bio-research in broad disciplines. Specific applications of the theory, especially relevant to drug discovery, drug combination, and clinical trials, have been cited or illustrated in terms of algorithms, experimental design and computerized simulation for data analysis. Lessons learned from cancer research during the past fifty years provide a valuable opportunity to reflect, to improve the conventional divergent approach, and to introduce a new convergent avenue, based on the mass-action law principle, for efficient cancer drug discovery and low-cost drug development.
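For reference, the median-effect equation mentioned in the abstract has the compact form below (Chou), where the symbols follow the usual convention; this is background, not a restatement of the paper's derivations.

```latex
% f_a and f_u are the fractions of the system affected and unaffected
% (f_a + f_u = 1), D is the dose, D_m the median-effect dose, and m the
% sigmoidicity (shape) coefficient.
\[
  \frac{f_a}{f_u} \;=\; \left(\frac{D}{D_m}\right)^{m},
  \qquad\text{so}\qquad
  f_a \;=\; \frac{1}{1 + (D_m/D)^{m}} .
\]
```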
Reflector automatic acquisition and pointing based on auto-collimation theodolite.
Luo, Jun; Wang, Zhiqian; Wen, Zhuoman; Li, Mingzhu; Liu, Shaojin; Shen, Chengwu
2018-01-01
An auto-collimation theodolite (ACT) for reflector automatic acquisition and pointing is designed based on the principle of autocollimators and theodolites. First, the principle of auto-collimation and theodolites is reviewed, and then the coaxial ACT structure is developed. Subsequently, the acquisition and pointing strategies for reflector measurements are presented, which first quickly acquire the target over a wide range and then point the laser spot to the charge-coupled device (CCD) zero position. Finally, experiments are conducted to verify the acquisition and pointing performance, including the calibration of the ACT, the comparison of the acquisition mode and pointing mode, and the accuracy measurement in the horizontal and vertical directions. In both directions, a measurement accuracy of ±3″ is achieved. The presented ACT is suitable for automatic pointing and monitoring of a reflector over a small scanning area and can be used in a wide range of applications such as bridge structure monitoring and cooperative target aiming.
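A minimal sketch of the pointing step (driving the returned spot toward the CCD zero position) is given below; the interfaces, gain and loop structure are assumptions for illustration and do not describe the authors' actual control implementation.

```python
def point_to_ccd_zero(read_spot_offset, command_axes, gain=0.5,
                      tol_arcsec=3.0, max_iter=50):
    """Drive the reflected spot toward the CCD zero position.

    read_spot_offset(): returns (dx, dy), the spot offset in arcseconds.
    command_axes(d_az, d_el): moves the horizontal/vertical axes by the
    requested increments. Both callables are hypothetical placeholders.
    """
    for _ in range(max_iter):
        dx, dy = read_spot_offset()
        if abs(dx) < tol_arcsec and abs(dy) < tol_arcsec:
            return True          # within the +/-3 arcsec accuracy quoted
        command_axes(-gain * dx, -gain * dy)  # simple proportional correction
    return False

# Toy usage with stub hardware interfaces: the spot starts 40"/-25" off zero.
spot = [40.0, -25.0]

def move(d_az, d_el):
    spot[0] += d_az
    spot[1] += d_el

print(point_to_ccd_zero(lambda: tuple(spot), move))  # -> True
```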
Turning patient-centeredness from ideal to real: lessons from 2 success stories.
Millenson, Michael L; DiGioia, Anthony M; Greenhouse, Pamela K; Swieskowski, David
2013-01-01
The Institute of Medicine's 2001 Crossing the Quality Chasm report established patient-centeredness as 1 of 6 core principles for health system redesign. Yet, turning aspiration into accomplishment has proven arduous. Patient-centered care has components that challenge established professional norms, and the term itself has not always been clearly defined. However, these barriers can be overcome using Rogers' principles of diffusion of innovation, as is shown by 2 case histories. One involves care at an urban academic medical center, the other outpatient care at multiple physician sites located in urban, suburban, and rural locations. At the University of Pittsburgh Medical Center, the Patient- and Family-Centered Care Methodology and Practice has become the new "operating system" in 60 clinical areas, using a 6-step approach to engage patients and families as codesigners of ideal care. Meanwhile, the Health Coach Program at Mercy Clinics, Inc, Des Moines, Iowa, has used a "high-tech/high-touch" combined approach to change the organizational culture through patient-centered initiatives. By doing so, it has put the organization in a position to accept risk for populations of patients. Importantly, both programs have been financially and clinically successful, are accepted by frontline physicians and senior management, and are nationally recognized. Common principles include physician leadership, comfort with uncertainty during innovation, organizational structures that send a consistent message about expectations, and quality improvement as a constant cycle with no end point.
Robinson, Maren N; Tansil, Kristin A; Elder, Randy W; Soler, Robin E; Labre, Magdala P; Mercer, Shawna L; Eroglu, Dogan; Baur, Cynthia; Lyon-Daniel, Katherine; Fridinger, Fred; Sokler, Lynn A; Green, Lawrence W; Miller, Therese; Dearing, James W; Evans, William D; Snyder, Leslie B; Kasisomayajula Viswanath, K; Beistle, Diane M; Chervin, Doryn D; Bernhardt, Jay M; Rimer, Barbara K
2014-09-01
Health communication campaigns including mass media and health-related product distribution have been used to reduce mortality and morbidity through behavior change. The intervention is defined as having two core components reflecting two social marketing principles: (1) promoting behavior change through multiple communication channels, one being mass media, and (2) distributing a free or reduced-price product that facilitates adoption and maintenance of healthy behavior change, sustains cessation of harmful behaviors, or protects against behavior-related disease or injury. Using methods previously developed for the Community Guide, a systematic review (search period, January 1980-December 2009) was conducted to evaluate the effectiveness of health communication campaigns that use multiple channels, including mass media, and distribute health-related products. The primary outcome of interest was use of distributed health-related products. Twenty-two studies that met Community Guide quality criteria were analyzed in 2010. Most studies showed favorable behavior change effects on health-related product use (a median increase of 8.4 percentage points). By product category, median increases in desired behaviors ranged from 4.0 percentage points for condom promotion and distribution campaigns to 10.0 percentage points for smoking-cessation campaigns. Health communication campaigns that combine mass media and other communication channels with distribution of free or reduced-price health-related products are effective in improving healthy behaviors. This intervention is expected to be applicable across U.S. demographic groups, with appropriate population targeting. The ability to draw more specific conclusions about other important social marketing practices is constrained by limited reporting of intervention components and characteristics. Published by Elsevier Inc.
Critical and maximally informative encoding between neural populations in the retina
Kastner, David B.; Baccus, Stephen A.; Sharpee, Tatyana O.
2015-01-01
Computation in the brain involves multiple types of neurons, yet the organizing principles for how these neurons work together remain unclear. Information theory has offered explanations for how different types of neurons can maximize the transmitted information by encoding different stimulus features. However, recent experiments indicate that separate neuronal types exist that encode the same filtered version of the stimulus, but then the different cell types signal the presence of that stimulus feature with different thresholds. Here we show that the emergence of these neuronal types can be quantitatively described by the theory of transitions between different phases of matter. The two key parameters that control the separation of neurons into subclasses are the mean and standard deviation (SD) of noise affecting neural responses. The average noise across the neural population plays the role of temperature in the classic theory of phase transitions, whereas the SD is equivalent to pressure or magnetic field, in the case of liquid–gas and magnetic transitions, respectively. Our results account for properties of two recently discovered types of salamander Off retinal ganglion cells, as well as the absence of multiple types of On cells. We further show that, across visual stimulus contrasts, retinal circuits continued to operate near the critical point whose quantitative characteristics matched those expected near a liquid–gas critical point and described by the nearest-neighbor Ising model in three dimensions. By operating near a critical point, neural circuits can maximize information transmission in a given environment while retaining the ability to quickly adapt to a new environment. PMID:25675497
The holographic principle, the equipartition of energy and Newton’s gravity
NASA Astrophysics Data System (ADS)
Sadiq, M.
2017-12-01
Assuming the equipartition of energy to hold on a holographic sphere, Erik Verlinde demonstrated that Newton's gravity follows as an entropic force. Some comments are in order about Verlinde's assumptions in his derivation. It is pointed out that the holographic principle allows for freedom up to a free scale factor in the choice of the Planck-scale area while still leading to classical gravity. The similarity of this free parameter with the Immirzi parameter of loop quantum gravity is discussed. We point out that the equipartition of energy is built into the holographic principle and, therefore, need not be assumed from the outset.
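For orientation, a compressed version of the entropic-force argument the abstract refers to is sketched below (standard Verlinde-style reasoning, not the paper's own modified derivation): N holographic bits on a sphere of area A, equipartition of the enclosed energy, and the Bekenstein entropy shift for a test mass m displaced by Δx.

```latex
\[
  N = \frac{A c^{3}}{G\hbar}, \qquad
  M c^{2} = \tfrac{1}{2}\, N k_{\mathrm{B}} T, \qquad
  \Delta S = 2\pi k_{\mathrm{B}}\, \frac{m c}{\hbar}\,\Delta x ,
\]
\[
  F\,\Delta x = T\,\Delta S
  \;\Longrightarrow\;
  F = \frac{G M m}{R^{2}} \qquad (A = 4\pi R^{2}).
\]
```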
First-principles multiple-barrier diffusion theory. The case study of interstitial diffusion in CdTe
Yang, Ji -Hui; Park, Ji -Sang; Kang, Joongoo; ...
2015-02-17
The diffusion of particles in solid-state materials generally involves several sequential thermal-activation processes. However, present diffusion coefficient theory only deals with a single barrier, i.e., it lacks an accurate description of multiple-barrier diffusion. Here, we develop a general diffusion coefficient theory for multiple-barrier diffusion. Using our diffusion theory and first-principles calculated hopping rates for each barrier, we calculate the diffusion coefficients of Cd, Cu, Te, and Cl interstitials in CdTe for their full multiple-barrier diffusion pathways. We found that the calculated diffusivity agrees well with the experimental measurements, thus justifying our theory, which is general for many other systems.
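As a crude illustration of why a single-barrier picture is insufficient, the sketch below combines Arrhenius hopping rates for a sequence of barriers by simply adding mean waiting times; this series-like approximation is only illustrative and is not the general multiple-barrier theory derived in the paper.

```python
import math

KB_EV = 8.617e-5  # Boltzmann constant, eV/K

def hop_rate(barrier_ev, attempt_hz, temp_k):
    """Arrhenius hopping rate over a single barrier."""
    return attempt_hz * math.exp(-barrier_ev / (KB_EV * temp_k))

def effective_rate(barriers_ev, attempt_hz=1e13, temp_k=600.0):
    """Effective rate for barriers traversed one after another.

    Illustrative only: assumes the mean waiting times of the sequential hops
    add (like resistors in series), so the highest barrier dominates.
    """
    return 1.0 / sum(1.0 / hop_rate(eb, attempt_hz, temp_k) for eb in barriers_ev)

print(effective_rate([0.3, 0.5, 0.4]))  # dominated by the 0.5 eV barrier
```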
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-15
... Critical Control Point (HACCP) principles to reduce the risk of foodborne illness in the operation of... Control Number 0910-0578)--Extension HACCP principles are designed to reduce the occurrence of foodborne... manuals that interpret and promote the application of HACCP principles to reduce the risk of foodborne...
A restricted proof that the weak equivalence principle implies the Einstein equivalence principle
NASA Technical Reports Server (NTRS)
Lightman, A. P.; Lee, D. L.
1973-01-01
Schiff has conjectured that the weak equivalence principle (WEP) implies the Einstein equivalence principle (EEP). A proof is presented of Schiff's conjecture, restricted to: (1) test bodies made of electromagnetically interacting point particles, that fall from rest in a static, spherically symmetric gravitational field; (2) theories of gravity within a certain broad class - a class that includes almost all complete relativistic theories that have been found in the literature, but with each theory truncated to contain only point particles plus electromagnetic and gravitational fields. The proof shows that every nonmetric theory in the class (every theory that violates EEP) must violate WEP. A formula is derived for the magnitude of the violation. It is shown that WEP is a powerful theoretical and experimental tool for constraining the manner in which gravity couples to electromagnetism in gravitation theories.
Dew point measurement technique utilizing fiber cut reflection
NASA Astrophysics Data System (ADS)
Kostritskii, S. M.; Dikevich, A. A.; Korkishko, Yu. N.; Fedorov, V. A.
2009-05-01
A fiber-optic dew point hygrometer based on the change of the reflection coefficient at a fiber cut has been developed and examined. We propose and verify a model of the functioning principle of the condensation detector. Experimental frost point measurements in air with different frost points have been performed.
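The detection principle can be illustrated with the normal-incidence Fresnel reflectance at the fiber end face: condensate on the cut replaces the glass-air interface with a glass-water one and sharply reduces the returned power. The refractive indices below are generic illustrative values, not parameters taken from the paper.

```python
def fresnel_reflectance(n1, n2):
    """Normal-incidence Fresnel power reflectance at an interface."""
    return ((n1 - n2) / (n1 + n2)) ** 2

# A cleaved silica fiber end (n ~ 1.45) reflects ~3.4% into air but only
# ~0.2% once water (n ~ 1.33) condenses on it, so the drop in returned
# power can signal the dew/frost point.
print(fresnel_reflectance(1.45, 1.00))  # dry fiber cut   -> ~0.034
print(fresnel_reflectance(1.45, 1.33))  # condensate-covered cut -> ~0.002
```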
A three dimensional point cloud registration method based on rotation matrix eigenvalue
NASA Astrophysics Data System (ADS)
Wang, Chao; Zhou, Xiang; Fei, Zixuan; Gao, Xiaofei; Jin, Rui
2017-09-01
In traditional optical three-dimensional measurement, an object usually needs to be measured from multiple angles because of occlusion, and point cloud registration methods are then used to obtain the complete three-dimensional shape of the object. Point cloud registration based on a turntable essentially requires calculating the coordinate transformation matrix between the camera coordinate system and the turntable coordinate system. In the traditional method, this transformation matrix is usually calculated by fitting the rotation center and the rotation axis normal of the turntable, which is limited by the measurement field of view: the exact feature points used for fitting the rotation center and the rotation axis normal are distributed within an arc of less than about 120 degrees, resulting in low fitting accuracy. In this paper, we propose a better method, based on the principle that the eigenvalues of the rotation matrix are invariant in the turntable coordinate system, together with the coordinate transformation matrix of the corresponding coordinate points. First, we control the rotation angle of the calibration plate with the turntable and calibrate the coordinate transformation matrix of the corresponding coordinate points using the least-squares method. Then we use eigendecomposition to calculate the coordinate transformation matrix between the camera coordinate system and the turntable coordinate system. Compared with the traditional method, it has higher accuracy and better robustness, and it is not affected by the camera field of view. In this method, the coincidence error of the corresponding points on the calibration plate after registration is less than 0.1 mm.
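The eigenvalue idea underlying the method can be sketched briefly: a proper rotation matrix always has one eigenvalue equal to 1, and the corresponding eigenvector is the (invariant) rotation axis. The snippet below shows only that basic principle, not the authors' full turntable calibration pipeline.

```python
import numpy as np

def rotation_axis(R):
    """Rotation axis of a 3x3 rotation matrix via eigendecomposition.

    A proper rotation matrix has one eigenvalue equal to 1; the associated
    eigenvector is unchanged by the rotation and therefore points along
    the turntable axis.
    """
    vals, vecs = np.linalg.eig(R)
    k = np.argmin(np.abs(vals - 1.0))
    axis = np.real(vecs[:, k])
    return axis / np.linalg.norm(axis)

# Example: a 30-degree rotation about the z axis.
t = np.radians(30.0)
R = np.array([[np.cos(t), -np.sin(t), 0.0],
              [np.sin(t),  np.cos(t), 0.0],
              [0.0,        0.0,       1.0]])
print(rotation_axis(R))  # ~ [0, 0, 1]
```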
Multiple testing and power calculations in genetic association studies.
So, Hon-Cheong; Sham, Pak C
2011-01-01
Modern genetic association studies typically involve multiple single-nucleotide polymorphisms (SNPs) and/or multiple genes. With the development of high-throughput genotyping technologies and the reduction in genotyping cost, investigators can now assay up to a million SNPs for direct or indirect association with disease phenotypes. In addition, some studies involve multiple disease or related phenotypes and use multiple methods of statistical analysis. The combination of multiple genetic loci, multiple phenotypes, and multiple methods of evaluating associations between genotype and phenotype means that modern genetic studies often involve the testing of an enormous number of hypotheses. When multiple hypothesis tests are performed in a study, there is a risk of inflation of the type I error rate (i.e., the chance of falsely claiming an association when there is none). Several methods for multiple-testing correction are in popular use, and they all have strengths and weaknesses. Because no single method is universally adopted or always appropriate, it is important to understand the principles, strengths, and weaknesses of the methods so that they can be applied appropriately in practice. In this article, we review the three principal methods for multiple-testing correction and provide guidance for calculating statistical power.
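Two widely used corrections of the kind this review discusses are sketched below (Bonferroni for family-wise error control and Benjamini-Hochberg for false-discovery-rate control); the review itself is not limited to these, and the p-values used here are made up for illustration.

```python
import numpy as np

def bonferroni(pvals, alpha=0.05):
    """Reject H0 where p <= alpha / m (controls the family-wise error rate)."""
    p = np.asarray(pvals, dtype=float)
    return p <= alpha / p.size

def benjamini_hochberg(pvals, alpha=0.05):
    """Step-up Benjamini-Hochberg procedure (controls the false discovery rate)."""
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)
    thresh = alpha * (np.arange(1, m + 1) / m)
    below = p[order] <= thresh
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])   # largest i with p_(i) <= i*alpha/m
        reject[order[:k + 1]] = True
    return reject

pvals = [0.001, 0.008, 0.039, 0.041, 0.20]
print(bonferroni(pvals))          # rejects the two smallest p-values
print(benjamini_hochberg(pvals))  # rejects the two smallest p-values here
```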
Principles and Heuristics for Designing Minimalist Instruction.
ERIC Educational Resources Information Center
van der Meij, Hans; Carroll, John M.
1995-01-01
Presents an overview of principles and heuristics for designing minimalist instruction, with examples and theoretical or empirical arguments. Provides a starting point from which to create minimalist instruction to suit a variety of uses. (SR)
Music-evoked emotions: principles, brain correlates, and implications for therapy.
Koelsch, Stefan
2015-03-01
This paper describes principles underlying the evocation of emotion with music: evaluation, resonance, memory, expectancy/tension, imagination, understanding, and social functions. Each of these principles includes several subprinciples, and the framework on music-evoked emotions emerging from these principles and subprinciples is supposed to provide a starting point for a systematic, coherent, and comprehensive theory on music-evoked emotions that considers both reception and production of music, as well as the relevance of emotion-evoking principles for music therapy. © 2015 New York Academy of Sciences.
Multi Dimensional Honey Bee Foraging Algorithm Based on Optimal Energy Consumption
NASA Astrophysics Data System (ADS)
Saritha, R.; Vinod Chandra, S. S.
2017-10-01
In this paper a new nature-inspired algorithm is proposed based on the natural foraging behavior of multi-dimensional honey bee colonies. This method handles issues that arise when food is shared from multiple sources by multiple swarms at multiple destinations. The self-organizing nature of natural honey bee swarms in multiple colonies is based on the principle of energy consumption. Swarms of multiple colonies select a food source to optimally fulfill the requirements of their colonies, based on the energy required for transporting food between a source and a destination. Minimum use of energy leads to maximum profit in each colony. The mathematical model proposed here is based on this principle. It has been successfully evaluated by applying it to a multi-objective transportation problem for optimizing cost and time. The algorithm optimizes the needs at each destination in linear time.
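A toy version of the energy-based source selection is sketched below; the cost model (energy proportional to distance and load) and the data layout are assumptions for illustration, not the authors' exact formulation.

```python
def pick_source(sources, destination, demand):
    """Choose the food source that minimizes transport energy for a colony."""
    def energy(src):
        distance = abs(src["position"] - destination)
        return distance * demand * src["unit_cost"]  # assumed cost model
    return min(sources, key=energy)

sources = [{"name": "A", "position": 2.0, "unit_cost": 1.0},
           {"name": "B", "position": 9.0, "unit_cost": 0.5}]
print(pick_source(sources, destination=5.0, demand=10.0)["name"])  # -> "B"
```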
Multi- and unisensory visual flash illusions.
Courtney, Jon R; Motes, Michael A; Hubbard, Timothy L
2007-01-01
The role of stimulus structure in multisensory and unisensory interactions was examined. When a flash (17 ms) was accompanied by multiple tones (each 7 ms, SOA ≤ 100 ms) multiple flashes were reported, and this effect has been suggested to reflect the role of stimulus continuity in multisensory interactions. In experiments 1 and 2 we examined if stimulus continuity would affect concurrently presented stimuli. When a relatively longer flash (317 ms) was accompanied by multiple tones (each 7 ms), observers reported perceiving multiple flashes. In experiment 3 we tested whether a flash presented near fixation would induce an illusory flash further in the periphery. One flash (17 ms) presented 5 degrees below fixation was reported as multiple flashes if presented with two flashes (each 17 ms, SOA = 100 ms) 2 degrees above fixation. The extent to which these data support a phenomenological continuity principle and whether this principle applies to unisensory perception is discussed.
Principles of estimation of Radiative danger
NASA Astrophysics Data System (ADS)
Korogodin, V. I.
1990-08-01
The main principles of the estimation of radiative danger are discussed. Two main particularities of the danger are pointed out: the negative consequences of small doses, which do not lead to radiation sickness but do lead to dysfunctions of the blood organs and the small intestine; and the absolute estimation of biological anomalies, which was put forward by A.D. Sakharov (1921-1989). The ethical aspects of the use of nuclear weapons for the fate of human civilization were also pointed out by A.D. Sakharov (1921-1989).
Vargas-Leguás, H; Rodríguez Garrido, V; Lorite Cuenca, R; Pérez-Portabella, C; Redecillas Ferreiro, S; Campins Martí, M
2009-06-01
This guide for the preparation of powdered infant formulae in hospital environments is a collaborative work between several hospital services and is based on national and European regulations, international experts meetings and the recommendations of scientific societies. This guide also uses the Hazard Analysis and Critical Control Point principles proposed by Codex Alimentarius and emphasises effective verifying measures, microbiological controls of the process and the corrective actions when monitoring indicates that a critical control point is not under control. It is a dynamic guide and specifies the evaluation procedures that allow it to be constantly adapted.
Aids to Computer-Based Multimedia Learning.
ERIC Educational Resources Information Center
Mayer, Richard E.; Moreno, Roxana
2002-01-01
Presents a cognitive theory of multimedia learning that draws on dual coding theory, cognitive load theory, and constructivist learning theory and derives some principles of instructional design for fostering multimedia learning. These include principles of multiple representation, contiguity, coherence, modality, and redundancy. (SLD)
Acceleration of neutrons in a scheme of a tautochronous mathematical pendulum (physical principles)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rivlin, Lev A
We consider the physical principles of neutron acceleration through a multiple synchronous interaction with a gradient rf magnetic field in a scheme of a tautochronous mathematical pendulum. (laser applications and other aspects of quantum electronics)
A new adaptive light beam focusing principle for scanning light stimulation systems.
Bitzer, L A; Meseth, M; Benson, N; Schmechel, R
2013-02-01
In this article a novel principle for achieving optimal focusing conditions, i.e., the smallest possible beam diameter, in scanning light stimulation systems is presented. It is based on the following methodology: First, a reference point on a camera sensor is introduced at which optimal focusing conditions are adjusted, and the distance between the light-focusing optic and the reference point is determined using a laser displacement sensor. In a second step, this displacement sensor is used to map the topography of the sample under investigation. Finally, the actual measurement is conducted using optimal focusing conditions at each measurement point on the sample surface, determined by the height difference between the camera sensor and the sample topography. This principle is independent of the measurement values, the optical or electrical properties of the sample, the light source used, or the selected wavelength. Furthermore, the samples can be tilted, rough, bent, or of different surface materials. In the following the principle is implemented using an optical beam induced current system, but it can in principle be applied to any other scanning light stimulation system. Measurements demonstrating its operation are shown, using a polycrystalline silicon solar cell.
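The per-point refocusing step reduces to a simple height-difference calculation, sketched below; the units, sign convention and the idea of returning a list of stage corrections are assumptions for illustration rather than the authors' implementation.

```python
def focus_corrections(reference_height_mm, topography_mm):
    """Per-point focus corrections for a scanning light stimulation system.

    Optimal focus is first established at a camera-sensor reference point
    whose distance to the focusing optic is measured with a laser
    displacement sensor; each scanned point is then refocused by the height
    difference between that reference and the mapped sample topography.
    """
    return [reference_height_mm - z for z in topography_mm]

# Example topography map (mm) of a tilted, rough sample.
print(focus_corrections(12.00, [11.95, 12.10, 12.03]))
```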
Superadiabatic driving of a three-level quantum system
NASA Astrophysics Data System (ADS)
Theisen, M.; Petiziol, F.; Carretta, S.; Santini, P.; Wimberger, S.
2017-07-01
We study superadiabatic quantum control of a three-level quantum system whose energy spectrum exhibits multiple avoided crossings. In particular, we investigate the possibility of treating the full control task in terms of independent two-level Landau-Zener problems. We first show that the time profiles of the elements of the full control Hamiltonian are characterized by peaks centered around the crossing times. These peaks decay algebraically for large times. In principle, such a power-law scaling invalidates the hypothesis of perfect separability. Nonetheless, we address the problem from a pragmatic point of view by studying the fidelity obtained through separate control as a function of the intercrossing separation. This procedure may be a good approach to achieve approximate adiabatic driving of a specific instantaneous eigenstate in realistic implementations.
Methane source identification in Boston, Massachusetts using isotopic and ethane measurements
NASA Astrophysics Data System (ADS)
Down, A.; Jackson, R. B.; Plata, D.; McKain, K.; Wofsy, S. C.; Rella, C.; Crosson, E.; Phillips, N. G.
2012-12-01
Methane has substantial greenhouse warming potential and is the principal component of natural gas. Fugitive natural gas emissions could be a significant source of methane to the atmosphere. However, the cumulative magnitude of natural gas leaks is not yet well constrained. We used a combination of point-source measurements and ambient monitoring to characterize the methane sources in the Boston urban area. We developed distinct fingerprints for natural gas and multiple biogenic methane sources based on hydrocarbon concentration and isotopic composition. We combine these data with periodic measurements of atmospheric methane and ethane concentrations to estimate the fractional contribution of natural gas and biogenic methane sources to the cumulative urban methane flux in Boston. These results are used to inform an inverse model of urban methane concentration and emissions.
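Source apportionment of this kind often reduces to a two-endmember mixing calculation; the sketch below shows the generic form, with endmember δ13C values that are illustrative assumptions rather than the fingerprints measured in this study.

```python
def natural_gas_fraction(delta_obs, delta_gas=-40.0, delta_bio=-60.0):
    """Two-endmember isotopic mixing estimate of the natural-gas share of methane.

    delta values are d13C-CH4 in permil; the endmember values here are
    illustrative placeholders, not the study's measured fingerprints.
    """
    return (delta_obs - delta_bio) / (delta_gas - delta_bio)

print(natural_gas_fraction(-45.0))  # -> 0.75, i.e. ~75% natural gas
```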
A slippery molecular assembly allows water as a self-erasable security marker
Thirumalai, Rajasekaran; Mukhopadhyay, Rahul Dev; Praveen, Vakayil K.; Ajayaghosh, Ayyappanpillai
2015-01-01
Protection of currency and valuable documents from counterfeit continues to be a challenge. While there are many embedded security features available for document safety, they are not immune to forgery. Fluorescence is a sensitive property, which responds to external stimuli such as solvent polarity, temperature or mechanical stress; however, its practical use in security applications is hampered for several reasons. Therefore, a simple and specific stimuli-responsive security feature that is difficult to duplicate is in great demand. Herein we report the design of a fluorescent molecular assembly on which water behaves as a self-erasable security marker for checking the authenticity of documents at point of care. The underlying principle involves the disciplined self-assembly of a tailor-made fluorescent molecule, which initially forms a weak blue fluorescence (λem = 425 nm, Φf = 0.13) and changes to cyan emission (λem = 488 nm, Φf = 0.18) in contact with water due to a reversible molecular slipping motion. This simple chemical tool, based on the principles of molecular self-assembly and fluorescence modulation, allows creation of security labels and optically masked barcodes for multiple document authentication. PMID:25940779
Ten guiding principles for youth mental health services.
Hughes, Frank; Hebel, Lisa; Badcock, Paul; Parker, Alexandra G
2018-06-01
Guiding principles are arguably central to the development of any health service. The aim of this article is to report on the outcomes of a youth mental health (YMH) community of practice (CoP), which identified a range of guiding principles that provide a clear point of comparison for the only other set of principles for YMH service delivery proposed to date. A YMH CoP was established in 2010 as part of the Victorian State Government approach to improving YMH care. An initial literature search was undertaken to locate articles on YMH service delivery. A number of common themes were identified, which the YMH community of practice (YMHCoP) members then elaborated upon by drawing from their collective experience of the YMH sector. The resultant themes were then refined through subsequent group discussions to derive a definitive set of guiding principles. These principles were then augmented by a second literature search conducted in July 2015. Fifteen key themes were derived from the initial literature search and YMH CoP discussions. These were refined by the YMH CoP to produce 10 guiding principles for YMH service development. These are discussed through reference to the relevant literature, using the only other article on principles of YMH service delivery as a notable point of comparison. The 10 principles identified may be useful for quality improvement and are likely to have international relevance. We suggest the timely pursuit of an international consensus on guiding principles for service delivery under the auspices of a peak body for YMH. © 2017 John Wiley & Sons Australia, Ltd.
Innovations and Improvements in Pharmacokinetic Models Based on Physiology.
Abbiati, Roberto Andrea; Manca, Davide
2017-01-01
Accompanied by significant improvements of modeling techniques and computational methods in medical sciences, the last thirty years saw the flourishing of pharmacokinetic models for applications in the pharmacometric field. In particular, physiologically based pharmacokinetic (PBPK) models, grounded on a mechanistic foundation, have been applied to explore a multiplicity of aspects with possible applications in patient care and new drug development, as in the case of siRNA therapies. This article summarizes the features we recently introduced in PBPK modeling within a three-year research project funded by the Italian Research Ministry. Four major points are detailed: (i) the mathematical formulation of the model, which allows modulating its complexity as a function of the administration route and active principle; (ii) a dedicated parameter of the PBPK model quantifies the drug-protein binding, which affects the active principle distribution; (iii) the gall bladder compartment and the bile enterohepatic circulation process; (iv) the coupling of the pharmacokinetic and pharmacodynamic models to produce an overall understanding of the drug effects on the mammalian body. The proposed model is applied to two separate intravenous (remifentanil) and oral (sorafenib) drug administrations. The resulting PBPK simulations are consistent with the literature experimental data. Blood concentration predictability is confirmed in multiple reference subjects. Furthermore, in the case of sorafenib administration in mice, it is possible to evaluate the drug concentration in the liver and reproduce the effects of the enterohepatic circulation. Finally, a preliminary application of the coupling of the pharmacokinetic/pharmacodynamic models is presented and discussed. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
Study on the Spatial Resolution of Single and Multiple Coincidences Compton Camera
NASA Astrophysics Data System (ADS)
Andreyev, Andriy; Sitek, Arkadiusz; Celler, Anna
2012-10-01
In this paper we study the image resolution that can be obtained from the Multiple Coincidences Compton Camera (MCCC). The principle of MCCC is based on the simultaneous acquisition of several gamma-rays emitted in cascade from a single nucleus. Contrary to a standard Compton camera, MCCC can theoretically provide the exact location of a radioactive source (based only on the identification of the intersection point of three cones created by a single decay), without complicated tomographic reconstruction. However, practical implementation of the MCCC approach encounters several problems, such as the low detection sensitivity, which results in a very low probability of the coincident triple gamma-ray detection that is necessary for source localization. It is also important to evaluate how the detection uncertainties (finite energy and spatial resolution) influence identification of the intersection of three cones, and thus the resulting image quality. In this study we investigate how the spatial resolution of images reconstructed using the triple-cone reconstruction (TCR) approach compares to images reconstructed from the same data using a standard iterative method based on single cones. Results show that the FWHM for a point source reconstructed with TCR was 20-30% higher than that obtained from a standard iterative reconstruction based on the expectation maximization (EM) algorithm and conventional single-cone Compton imaging. The finite energy and spatial resolutions of the MCCC detectors lead to errors in the definition of the conical surfaces ("thick" conical surfaces), which are only amplified in image reconstruction when the intersection of three cones is sought. Our investigations show that, in spite of being conceptually appealing, the identification of the triple-cone intersection constitutes yet another restriction of the multiple coincidence approach, which limits the image resolution that can be obtained with MCCC and the TCR algorithm.
Homeostasis of exercise hyperpnea and optimal sensorimotor integration: the internal model paradigm.
Poon, Chi-Sang; Tin, Chung; Yu, Yunguo
2007-10-15
Homeostasis is a basic tenet of biomedicine and an open problem for many physiological control systems. Among them, none has been more extensively studied and intensely debated than the dilemma of exercise hyperpnea - a paradoxical homeostatic increase of respiratory ventilation that is geared to metabolic demands instead of the normal chemoreflex mechanism. Classical control theory has led to a plethora of "feedback/feedforward control" or "set point" hypotheses for homeostatic regulation, yet so far none of them has proved satisfactory in explaining exercise hyperpnea and its interactions with other respiratory inputs. Instead, the available evidence points to a far more sophisticated respiratory controller capable of integrating multiple afferent and efferent signals in adapting the ventilatory pattern toward optimality relative to conflicting homeostatic, energetic and other objectives. This optimality principle parsimoniously mimics exercise hyperpnea, chemoreflex and a host of characteristic respiratory responses to abnormal gas exchange or mechanical loading/unloading in health and in cardiopulmonary diseases - all without resorting to a feedforward "exercise stimulus". Rather, an emergent controller signal encoding the projected metabolic level is predicted by the principle as an exercise-induced 'mental percept' or 'internal model', presumably engendered by associative learning (operant conditioning or classical conditioning) which achieves optimality through continuous identification of, and adaptation to, the causal relationship between respiratory motor output and resultant chemical-mechanical afferent feedbacks. This internal model self-tuning adaptive control paradigm opens a new challenge and exciting opportunity for experimental and theoretical elucidations of the mechanisms of respiratory control - and of homeostatic regulation and sensorimotor integration in general.
Elementary Concepts and Fundamental Laws of the Theory of Heat
NASA Astrophysics Data System (ADS)
de Oliveira, Mário J.
2018-06-01
The elementary concepts and fundamental laws concerning the science of heat are examined from the point of view of its development with special attention to its theoretical structure. The development is divided into four periods, each one characterized by the concept that was attributed to heat. The transition from one to the next period was marked by the emergence of new concepts and new laws, and by singular events. We point out that thermodynamics, as it emerged, is founded on the elementary concepts of temperature and adiabatic wall, and on the fundamental laws: Mayer-Joule principle, or law of conservation of energy; Carnot principle, which leads to the definition of entropy; and the Clausius principle, or law of increase in entropy.
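The three named laws can be written compactly as below; this is standard textbook notation added for reference, not a quotation from the article.

```latex
% Mayer-Joule principle (conservation of energy), Carnot principle (entropy
% defined via reversible heat), and Clausius principle (increase of entropy
% in an isolated system).
\[
  \Delta U = Q - W, \qquad
  dS = \frac{\delta Q_{\mathrm{rev}}}{T}, \qquad
  dS \ge 0 \ \ \text{(isolated system)}.
\]
```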
Slowness based CCP stacking technique in suppressing crustal multiples
NASA Astrophysics Data System (ADS)
Guan, Z.; Niu, F.
2016-12-01
Common-conversion-point (CCP) stacking of receiver functions is a widely used technique to image velocity discontinuities in the mantle, such as the lithosphere-asthenosphere boundary (LAB) in the upper mantle and the 410-km and 660-km discontinuities in the mantle transition zone. In a layered medium, a teleseismic record can be considered as the summation of the direct arrival and a series of conversions and reflections at boundaries below the station. Receiver functions are an attempt to approximate the Green's function associated with structure beneath the receiver by deconvolving one component of a teleseismic signal from another to remove source signals from the seismograms. The CCP technique assumes that receiver functions are composed solely of P-to-S conversions at velocity boundaries, whose depths can be mapped out through their arrival times. Multiple reflections at shallow boundaries with large velocity contrasts, such as the base of unconsolidated sediments and the Moho, can pose significant challenges to the accuracy of CCP imaging. In principle, the P-to-S conversions and multiples originating from deep and shallow boundaries arrive at a seismic station with incident angles that are, respectively, smaller and larger than that of the direct P wave. Therefore the corresponding slowness can be used to separate the conversions from the multiples, allowing multiple-induced artifacts to be minimized. We developed a refined CCP stacking method that uses relative slowness as a weighting factor to suppress the multiples. We performed extensive numerical tests with synthetic data to seek the best weighting scheme and to verify the robustness of the images. We applied the refined technique to the NECESSArray data, and found that the complicated low-velocity structures in the depth range of 200-400 km shown in the CCP images of previous studies are mostly artifacts resulting from crustal multiples.
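One way to picture the slowness-based weighting is a weight that leaves conversion-like arrivals (slowness smaller than direct P) untouched and decays rapidly for multiple-like arrivals (slowness larger than direct P); the functional form and width below are illustrative assumptions, not the weighting scheme selected in the paper.

```python
import math

def slowness_weight(p_phase, p_direct, sigma=0.01):
    """Down-weight arrivals whose slowness exceeds that of the direct P wave.

    P-to-S conversions from depth arrive with smaller slowness than direct P,
    while crustal multiples arrive with larger slowness, so a weight that
    decays for positive relative slowness suppresses multiples in the stack.
    sigma is an assumed decay width in s/km.
    """
    dp = p_phase - p_direct
    return 1.0 if dp <= 0 else math.exp(-(dp / sigma) ** 2)

print(slowness_weight(0.055, 0.060))  # conversion-like arrival -> 1.0
print(slowness_weight(0.075, 0.060))  # multiple-like arrival   -> ~0.1
```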
Improving the discoverability, accessibility, and citability of omics datasets: a case report.
Darlington, Yolanda F; Naumov, Alexey; McOwiti, Apollo; Kankanamge, Wasula H; Becnel, Lauren B; McKenna, Neil J
2017-03-01
Although omics datasets represent valuable assets for hypothesis generation, model testing, and data validation, the infrastructure supporting their reuse lacks organization and consistency. Using nuclear receptor signaling transcriptomic datasets as proof of principle, we developed a model to improve the discoverability, accessibility, and citability of published omics datasets. Primary datasets were retrieved from archives, processed to extract data points, then subjected to metadata enrichment and gap filling. The resulting secondary datasets were exposed on responsive web pages to support mining of gene lists, discovery of related datasets, and single-click citation integration with popular reference managers. Automated processes were established to embed digital object identifier-driven links to the secondary datasets in associated journal articles, small molecule and gene-centric databases, and a dataset search engine. Our model creates multiple points of access to reprocessed and reannotated derivative datasets across the digital biomedical research ecosystem, promoting their visibility and usability across disparate research communities. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Pauli Principle and Pion Scattering
DOE R&D Accomplishments Database
Bethe, H. A.
1972-10-01
It is pointed out that if the Pauli principle is taken into account in the discussion of pion scattering by complex nuclei (as it ought, of course, to be) some rather implausible consequences of some earlier treatments of this problem can be avoided. (auth)
A Note on the Conservation of Mechanical Energy and the Galilean Principle of Relativity
ERIC Educational Resources Information Center
Santos, F. C.; Soares, V.; Tort, A. C.
2010-01-01
A reexamination of simple examples that we usually teach to our students in introductory courses is the starting point for a discussion about the principle of conservation of energy and Galilean invariance. (Contains 5 figures.)
Le Chatelier's principle with multiple relaxation channels
NASA Astrophysics Data System (ADS)
Gilmore, R.; Levine, R. D.
1986-05-01
Le Chatelier's principle is discussed within the constrained variational approach to thermodynamics. The formulation is general enough to encompass systems not in thermal (or chemical) equilibrium. Particular attention is given to systems with multiple constraints which can be relaxed. The moderation of the initial perturbation increases as additional constraints are removed. This result is studied in particular when the (coupled) relaxation channels have widely different time scales. A series of inequalities is derived which describes the successive moderation as each successive relaxation channel opens up. These inequalities are interpreted within the metric-geometry representation of thermodynamics.
Partial Verbal Redundancy in Multimedia Presentations for Writing Strategy Instruction
ERIC Educational Resources Information Center
Roscoe, Rod D.; Jacovina, Matthew E.; Harry, Danielle; Russell, Devin G.; McNamara, Danielle S.
2015-01-01
Multimedia instructional materials require learners to select, organize, and integrate information across multiple modalities. To facilitate these comprehension processes, a variety of multimedia design principles have been proposed. This study further explores the redundancy principle by manipulating the degree of partial redundancy between…
"Essential Principles of Economics:" A Hypermedia Textbook.
ERIC Educational Resources Information Center
McCain, Roger A.
2000-01-01
Discusses an electronic textbook called "Essential Principles of Economics." Explains that economic concepts are found by following links from the table of contents, while each chapter includes both expository information and interactive material including online multiple-choice drill questions. States that the textbook is a "work…
Lee, Peter; Bollensdorff, Christian; Quinn, T. Alexander; Wuskell, Joseph P.; Loew, Leslie M.; Kohl, Peter
2011-01-01
Background Simultaneous optical mapping of multiple electrophysiologically relevant parameters in living myocardium is desirable for integrative exploration of mechanisms underlying heart rhythm generation under normal and pathophysiologic conditions. Current multiparametric methods are technically challenging, usually involving multiple sensors and moving parts, which contributes to high logistic and economic thresholds that prevent easy application of the technique. Objective The purpose of this study was to develop a simple, affordable, and effective method for spatially resolved, continuous, simultaneous, and multiparametric optical mapping of the heart, using a single camera. Methods We present a new method to simultaneously monitor multiple parameters using inexpensive off-the-shelf electronic components and no moving parts. The system comprises a single camera, commercially available optical filters, and light-emitting diodes (LEDs), integrated via microcontroller-based electronics for frame-accurate illumination of the tissue. For proof of principle, we illustrate measurement of four parameters, suitable for ratiometric mapping of membrane potential (di-4-ANBDQPQ) and intracellular free calcium (fura-2), in an isolated Langendorff-perfused rat heart during sinus rhythm and ectopy, induced by local electrical or mechanical stimulation. Results The pilot application demonstrates suitability of this imaging approach for heart rhythm research in the isolated heart. In addition, locally induced excitation, whether stimulated electrically or mechanically, gives rise to similar ventricular propagation patterns. Conclusion Combining an affordable camera with suitable optical filters and microprocessor-controlled LEDs, single-sensor multiparametric optical mapping can be practically implemented in a simple yet powerful configuration and applied to heart rhythm research. The moderate system complexity and component cost is destined to lower the threshold to broader application of functional imaging and to ease implementation of more complex optical mapping approaches, such as multiparametric panoramic imaging. A proof-of-principle application confirmed that although electrically and mechanically induced excitation occur by different mechanisms, their electrophysiologic consequences downstream from the point of activation are not dissimilar. PMID:21459161
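The single-camera, frame-synchronized illumination scheme boils down to deinterleaving the recorded movie by excitation channel; the sketch below assumes a simple two-channel (voltage dye / calcium dye) alternation for illustration, which is not necessarily the exact LED sequence used by the authors.

```python
import numpy as np

def demultiplex(frames, n_channels=2):
    """Split a frame-interleaved movie into per-parameter channels.

    A microcontroller fires a different excitation LED on each camera frame,
    so frame i belongs to channel i % n_channels (e.g. voltage and calcium
    channels for ratiometric mapping).
    """
    return [frames[c::n_channels] for c in range(n_channels)]

movie = np.random.rand(100, 64, 64)                # 100 frames, 64x64 pixels
voltage_frames, calcium_frames = demultiplex(movie)
print(voltage_frames.shape, calcium_frames.shape)  # (50, 64, 64) each
```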
Modeling the lake eutrophication stochastic ecosystem and the research of its stability.
Wang, Bo; Qi, Qianqian
2018-06-01
In reality, a lake system is disturbed by both external and internal stochastic factors. By adding additive and multiplicative noise to the right-hand side of the model equation, an additive stochastic model and a multiplicative stochastic model are established, respectively, to reduce model errors induced by the absence of some physical processes. For both kinds of stochastic ecosystem, the authors study the bifurcation characteristics with the FPK equation and the Lyapunov exponent method, based on the Stratonovich-Khasminskii stochastic averaging principle. Results show that, for the additive stochastic model, when the control parameter (the nutrient loading rate) falls into the interval [0.388644, 0.66003825], the ecosystem is bistable and the additive noise intensity cannot shift the bifurcation points. In the bistable region, the external stochastic disturbance, one of the main triggers of lake eutrophication, may destabilize the ecosystem and induce a transition. When the control parameter falls into the intervals (0, 0.388644) and (0.66003825, 1.0), only a single stable equilibrium state exists, and the additive noise intensity cannot change it. The multiplicative stochastic model shows more complex bifurcation behavior, and the ecosystem can be disrupted by the multiplicative noise. The multiplicative noise also reduces the extent of the bistable region; ultimately, the bistable region vanishes for sufficiently large noise. Moreover, both the nutrient loading rate and the multiplicative noise can drive a regime shift of the ecosystem. For both stochastic ecosystems, the authors also examine the evolution of the ecological variable in detail using a four-stage Runge-Kutta method of strong order γ = 1.5. The numerical method effectively reproduces the regime shift behavior and agrees with the analytical results. These conclusions also confirm the two paths by which the system can move from one stable state to another proposed by Beisner et al. [3], which may help to explain the mechanism of lake eutrophication from the viewpoint of stochastic modeling and mathematical analysis. Copyright © 2018 Elsevier Inc. All rights reserved.
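The abstract does not reproduce the model equation, so the sketch below simulates an assumed Carpenter-type nutrient-recycling model with additive noise using the Euler-Maruyama method, purely to illustrate how noise-induced transitions between two stable branches can be explored numerically; the model form and all parameter values are assumptions, not taken from the paper.

```python
# Minimal sketch of an additive-noise lake model, simulated with Euler-Maruyama.
# Assumed stand-in model (not the paper's equation):
#   dP = (a - b*P + r*P^q / (m^q + P^q)) dt + sigma dW
import numpy as np

def simulate(a=0.5, b=1.0, r=1.0, m=1.0, q=8, sigma=0.05,
             P0=0.2, dt=0.01, steps=100_000, seed=0):
    rng = np.random.default_rng(seed)
    P = np.empty(steps + 1)
    P[0] = P0
    for k in range(steps):
        drift = a - b * P[k] + r * P[k]**q / (m**q + P[k]**q)
        step = drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        P[k + 1] = max(P[k] + step, 0.0)   # keep the nutrient level non-negative
    return P

# Sweeping the nutrient loading rate `a` through the bistable window and watching
# for jumps between the low- and high-P branches mimics the noise-induced regime
# shifts discussed above.
traj = simulate(a=0.55)
```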
DOE Office of Scientific and Technical Information (OSTI.GOV)
Muralidhar, K Raja; Komanduri, K
2014-06-01
Purpose: The objective of this work is to present a mechanism for calculating inflection points on profiles at various depths and field sizes, together with a study of the percentage dose at the inflection points for various field sizes and depths for 6XFFF and 10XFFF energy profiles. Methods: Percentage dose was plotted against inflection point position, and, using a polynomial function, the authors formulated equations for locating the inflection point on the profiles for 6X FFF and 10X FFF energies for all field sizes and at various depths. Results: In a flattening-filter-free radiation beam, unlike in flattened beams, the dose at the inflection point of the profile decreases as field size increases for 10XFFF, whereas for 6XFFF the dose at the inflection point initially increases up to 10x10 cm2 and then decreases. The polynomial function was fitted for both FFF beams for all field sizes and depths. For small fields of less than 5x5 cm2, the inflection point and FWHM are almost the same, so the analysis can be done as for flattened beams. A change of 10% in dose can change the field width by 1 mm. Conclusion: Deriving equations based on a polynomial fit to define the inflection point is a precise and accurate way to obtain the inflection point dose on any FFF beam profile at any depth with less than 1% error. Corrections can be made in future studies based on data from multiple machines. A brief study was also carried out to evaluate the inflection point positions with respect to dose in FFF energies for various field sizes and depths for 6XFFF and 10XFFF energy profiles.
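As a hedged illustration of the polynomial approach summarized above, the following sketch fits a polynomial to a synthetic penumbra profile and locates the inflection point where the second derivative vanishes; the profile values, fitting window, and polynomial order are placeholders, not data or equations from the study.

```python
# Sketch: locate the inflection point of a profile's penumbra by fitting a
# polynomial and finding where its second derivative changes sign.
import numpy as np

x = np.linspace(4.0, 6.0, 21)                      # off-axis distance (cm), assumed window
dose = 100.0 / (1.0 + np.exp(4.0 * (x - 5.0)))     # synthetic FFF-like falloff (percent)

coeffs = np.polyfit(x, dose, deg=5)                # polynomial fit, as in the abstract
profile = np.poly1d(coeffs)
d2 = np.polyder(np.polyder(profile))               # second derivative of the fit
candidates = [r.real for r in d2.roots
              if abs(r.imag) < 1e-9 and x.min() < r.real < x.max()]
x_infl = min(candidates, key=lambda r: abs(r - x.mean()))
print(f"inflection point at {x_infl:.3f} cm, dose {profile(x_infl):.1f}%")
```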
Total Quality Management in Higher Education: Applying Deming's Fourteen Points.
ERIC Educational Resources Information Center
Masters, Robert J.; Leiker, Linda
1992-01-01
This article presents guidelines to aid administrators of institutions of higher education in applying the 14 principles of Total Quality Management. The principles stress understanding process improvements, handling variation, fostering prediction, and using psychology to capitalize on human resources. (DB)
[The ethics of principles and ethics of responsibility].
Cembrani, Fabio
2016-01-01
In this brief comment, the author asks whether ethics still has a practical sense in the health-care relationship. The essay points out the difference between the ethics of principles and the ethics of responsibility, argues for the latter, and tries to highlight its constitutive dimensions.
A Point System Approach to Secondary Classroom Management
ERIC Educational Resources Information Center
Xenos, Anthony J.
2012-01-01
This article presents guiding principles governing the design, implementation, and management of a point system to promote discipline and academic rigor in a secondary classroom. Four considerations are discussed: (1) assigning appropriate point values to integral classroom behaviors and tasks; (2) determining the relationship among consequences,…
Fermion systems in discrete space-time
NASA Astrophysics Data System (ADS)
Finster, Felix
2007-05-01
Fermion systems in discrete space-time are introduced as a model for physics on the Planck scale. We set up a variational principle which describes a non-local interaction of all fermions. This variational principle is symmetric under permutations of the discrete space-time points. We explain how for minimizers of the variational principle, the fermions spontaneously break this permutation symmetry and induce on space-time a discrete causal structure.
Principles Underlying the Use of Multiple Informants’ Reports
De Los Reyes, Andres; Thomas, Sarah A.; Goodman, Kimberly L.; Kundey, Shannon M.A.
2014-01-01
Researchers use multiple informants’ reports to assess and examine behavior. However, informants’ reports commonly disagree. Informants’ reports often disagree in their perceived levels of a behavior (“low” vs. “elevated” mood), and examining multiple reports in a single study often results in inconsistent findings. Although researchers often espouse taking a multi-informant assessment approach, they frequently address informant discrepancies using techniques that treat discrepancies as measurement error. Yet, recent work indicates that researchers in a variety of fields often may be unable to justify treating informant discrepancies as measurement error. In this paper, the authors advance a framework (Operations Triad Model) outlining general principles for using and interpreting informants’ reports. Using the framework, researchers can test whether or not they can extract meaningful information about behavior from discrepancies among multiple informants’ reports. The authors provide supportive evidence for this framework and discuss its implications for hypothesis testing, study design, and quantitative review. PMID:23140332
Huygens-Feynman-Fresnel principle as the basis of applied optics.
Gitin, Andrey V
2013-11-01
The main relationships of wave optics are derived from a combination of the Huygens-Fresnel principle and the Feynman integral over all paths. The stationary-phase approximation of the wave relations gives the corresponding relations from the point of view of geometrical optics.
A judging principle of crucial vibrational transmission paths in plates
NASA Astrophysics Data System (ADS)
Wang, Bin; Li, Dong-Xu; Jiang, Jian-Ping; Liao, Yi-Huan
2016-10-01
This paper develops a judging principle for crucial vibrational transmission paths (VTPs) in plates. Novel generalized definitions of VTPs are given with reference to the concept of streamlines. By comparing governing equations, a similarity between energy flow and fluid motion is established, so that an analytic method for VTPs in plates is proposed by analogy with fluid motion. The crucial VTP is then defined for the energy flow at an objective point, and the corresponding judging criteria are given. Finally, based on two numerical experiments on passive control, the judging principle is indirectly verified by comparing the reduction of energy flow at the focused points with the corresponding judgments of the crucial VTPs. This work is useful for analyzing VTPs in plates and applying them to guide future control design.
Novel Principles and Techniques to Create a Natural Design in Female Hairline Correction Surgery.
Park, Jae Hyun
2015-12-01
Female hairline correction surgery is becoming increasingly popular. However, no guidelines or methods of female hairline design have been introduced to date. The purpose of this study was to create an initial framework based on the novel principles of female hairline design and then use artistic ability and experience to fine tune this framework. An understanding of the concept of 5 areas (frontal area, frontotemporal recess area, temporal peak, infratemple area, and sideburns) and 5 points (C, A, B, T, and S) is required for female hairline correction surgery (the 5A5P principle). The general concepts of female hairline correction surgery and natural design methods are, herein, explained with a focus on the correlations between these 5 areas and 5 points. A natural and aesthetic female hairline can be created with application of the above-mentioned concepts. The 5A5P principle of forming the female hairline is very useful in female hairline correction surgery.
Novel Principles and Techniques to Create a Natural Design in Female Hairline Correction Surgery
2015-01-01
Abstract Background: Female hairline correction surgery is becoming increasingly popular. However, no guidelines or methods of female hairline design have been introduced to date. Methods: The purpose of this study was to create an initial framework based on the novel principles of female hairline design and then use artistic ability and experience to fine tune this framework. An understanding of the concept of 5 areas (frontal area, frontotemporal recess area, temporal peak, infratemple area, and sideburns) and 5 points (C, A, B, T, and S) is required for female hairline correction surgery (the 5A5P principle). The general concepts of female hairline correction surgery and natural design methods are, herein, explained with a focus on the correlations between these 5 areas and 5 points. Results: A natural and aesthetic female hairline can be created with application of the above-mentioned concepts. Conclusion: The 5A5P principle of forming the female hairline is very useful in female hairline correction surgery. PMID:26894014
Learning from Comparing Multiple Examples: On the Dilemma of "Similar" or "Different"
ERIC Educational Resources Information Center
Guo, Jian-Peng; Pang, Ming Fai; Yang, Ling-Yan; Ding, Yi
2012-01-01
Although researchers have demonstrated that studying multiple examples is more effective than studying one example to facilitate learning, the principles found in the literature for designing multiple examples remain ambiguous. This paper reviews variation theory research on example design which sheds light on unclear issues regarding the effects…
Small Area Indices of Multiple Deprivation in South Africa
ERIC Educational Resources Information Center
Noble, Michael; Barnes, Helen; Wright, Gemma; Roberts, Benjamin
2010-01-01
This paper presents the Provincial Indices of Multiple Deprivation that were constructed by the authors at ward level using 2001 Census data for each of South Africa's nine provinces. The principles adopted in conceptualising the indices are described and multiple deprivation is defined as a weighted combination of discrete dimensions of…
Counseling Families Using Principles of Re-EDucation
ERIC Educational Resources Information Center
Shepard, Lisa
2011-01-01
When Nicholas Hobbs created the Re-EDucation model, he envisioned that this philosophy would inform multiple disciplines. Today, Re-ED is widely applied to work with troubled children in day treatment, school-based services, residential settings, and therapeutic wilderness programs. Hobbs outlined a dozen Principles of Re-EDucation that are…
Cunningham, Thomas V
2016-01-01
Three common ethical principles for establishing the limits of parental authority in pediatric treatment decision-making are the harm principle, the principle of best interest, and the threshold view. This paper considers how these principles apply to a case of a premature neonate with multiple significant co-morbidities whose mother wanted all possible treatments, and whose health care providers wondered whether it would be ethically permissible to allow him to die comfortably despite her wishes. Whether and how these principles help in understanding what was morally right for the child is questioned. The paper concludes that the principles were of some value in understanding the moral geography of the case; however, this case reveals that common bioethical principles for medical decision-making are problematically value-laden because they are inconsistent with the widespread moral value of medical vitalism.
NASA Astrophysics Data System (ADS)
Zurek, Wojciech Hubert
2007-11-01
Measurements transfer information about a system to the apparatus and then, further on, to observers and (often inadvertently) to the environment. I show that even imperfect copying essential in such situations restricts possible unperturbed outcomes to an orthogonal subset of all possible states of the system, thus breaking the unitary symmetry of its Hilbert space implied by the quantum superposition principle. Preferred outcome states emerge as a result. They provide a framework for “wave-packet collapse,” designating terminal points of quantum jumps and defining the measured observable by specifying its eigenstates. In quantum Darwinism, they are the progenitors of multiple copies spread throughout the environment—the fittest quantum states that not only survive decoherence, but subvert the environment into carrying information about them—into becoming a witness.
Vortex line topology during vortex tube reconnection
NASA Astrophysics Data System (ADS)
McGavin, P.; Pontin, D. I.
2018-05-01
This paper addresses reconnection of vortex tubes, with particular focus on the topology of the vortex lines (field lines of the vorticity). This analysis of vortex line topology reveals key features of the reconnection process, such as the generation of many small flux rings, formed when reconnection occurs in multiple locations in the vortex sheet between the tubes. Consideration of three-dimensional reconnection principles leads to a robust measurement of the reconnection rate, even once instabilities break the symmetry. It also allows us to identify internal reconnection of vortex lines within the individual vortex tubes. Finally, the introduction of a third vortex tube is shown to render the vortex reconnection process fully three-dimensional, leading to a fundamental change in the topological structure of the process. An additional interesting feature is the generation of vorticity null points.
Dielectric Spectroscopy in Biomaterials: Agrophysics
El Khaled, Dalia; Castellano, Nuria N.; Gázquez, Jose A.; Perea-Moreno, Alberto-Jesus; Manzano-Agugliaro, Francisco
2016-01-01
Dielectric properties depend on temperature and frequency and are related to various types of food. Predicting multiple physical characteristics of agri-food products has been the main objective of the non-destructive assessment methods applied in many studies on horticultural products and food materials. This review covers the fundamentals of dielectric properties, including their concepts and principles. The different factors affecting the behavior of dielectric properties are dissected, and applications on different products aimed at characterizing a range of chemical and physical properties are pointed out and referenced with their conclusions. Throughout the review, a detailed description of the various measurement techniques and the most popular equipment is presented. This compiled review serves as an updated reference for dielectric spectroscopy as applied in the agrophysics field. PMID:28773438
Cloud Effects in Hyperspectral Imagery from First-Principles Scene Simulations
2009-01-01
The simulation models scattering and absorption, scattering events, surface scattering with material-dependent bidirectional reflectances, and multiple surface adjacency effects. Photons may scatter off aerosols or clouds, they may be absorbed, or they may reflect off the ground or an object; a given photon may undergo multiple scattering events.
Floyd A. Johnson
1961-01-01
This report assumes a knowledge of the principles of point sampling as described by Grosenbaugh, Bell and Alexander, and others. Whenever trees are counted at every point in a sample of points (large sample) and measured for volume at a portion (small sample) of these points, the sampling design could be called ratio double sampling. If the large...
Multiple-Point Temperature Gradient Algorithm for Ring Laser Gyroscope Bias Compensation
Li, Geng; Zhang, Pengfei; Wei, Guo; Xie, Yuanping; Yu, Xudong; Long, Xingwu
2015-01-01
To further improve ring laser gyroscope (RLG) bias stability, a multiple-point temperature gradient algorithm is proposed for RLG bias compensation in this paper. Based on the multiple-point temperature measurement system, a complete thermo-image of the RLG block is developed. Combined with the multiple-point temperature gradients between different points of the RLG block, the particle swarm optimization algorithm is used to tune the support vector machine (SVM) parameters, and an optimized design for selecting the thermometer locations is also discussed. The experimental results validate the superiority of the introduced method and enhance the precision and generalizability in the RLG bias compensation model. PMID:26633401
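The regression step described above can be sketched as follows; the synthetic readings, the pairwise gradient features, and the SVR hyperparameters are placeholders, and the particle swarm optimization used in the paper to tune the SVM is omitted here.

```python
# Minimal sketch: predict RLG bias from multiple-point temperature gradients
# with a support vector machine (PSO hyperparameter tuning not shown).
import numpy as np
from itertools import combinations
from sklearn.svm import SVR

rng = np.random.default_rng(1)
temps = 20 + 5 * rng.random((500, 6))                       # 6 thermometer readings per sample
grads = np.array([[t[i] - t[j] for i, j in combinations(range(6), 2)] for t in temps])
bias = 0.02 * grads[:, 0] - 0.01 * grads[:, 4] + 0.001 * rng.standard_normal(500)

model = SVR(kernel="rbf", C=10.0, epsilon=1e-3).fit(grads[:400], bias[:400])
print("held-out R^2:", model.score(grads[400:], bias[400:]))
```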
Sucila, Antanas
2002-01-01
The aim of the study is to recall surgeons' deontological principles and errors. The article demonstrates some specific deontological errors committed by surgeons against patients and colleagues, and points out the painful sequelae of these errors. CONCLUSION. The surgeon should rigorously take deontological principles into account in routine daily practice.
Hospital employs TQM principles to rework its evaluation system.
Burda, D
1992-02-24
One Kansas hospital has taken the traditional employee evaluation process--with all its performance criteria, point systems and rankings--and turned it on its head. The new system employs total quality management principles and promotes personal development, education and teamwork. And everyone gets the same raise.
The Human Activity of Evaluation Theorizing.
ERIC Educational Resources Information Center
Alkin, Marvin C.; Ellett, Frederick, Jr.
Theorizing about evaluation should be conceptualized as a human activity governed by certain strategies and principles. The theories advanced by various evaluators have changed over the years, thus illustrating ten principles of evaluation. The starting point for theory development or modification is self-reflection and review of one's own…
A propulsion-mass tensor coupling in relativistic rocket motion
NASA Astrophysics Data System (ADS)
Brito, Hector Hugo
1998-01-01
Following earlier speculations about antigravity machines and works on the relativistic dynamics of constant and variable rest mass point particles, a mass tensor is found in connection with the closed system consisting of the rocket-driven spaceship and its propellant mass, provided a "solidification" point other than the system center of mass is considered. Therefore, the mass tensor form depends on whether the system is open or closed, and upon where the "solidification" point is located. An alternative propulsion principle is subsequently derived from the tensor mass approach. The new principle, the covariant equivalent of Newton's Third Law for the physical interpretation of the relativistic rocket motion, reads: A spaceship undergoes a propulsion effect when the whole system mass 4-ellipsoid warps.
A point of view: why point-of-care places are not free marketplaces.
Rambur, B; Mooney, M M
1998-01-01
Current wisdom holds that health care is a business and "as such must abide by market principles." Most nurses are not well enough versed in economic theories to credibly critique health care delivery decisions based on economic theories. The relationship of market principles to health care realities is described in basic terms to encourage nurses to "optimize patient care and influence health care policy." Physicians, who control all access points to the health care system, have enjoyed a 40-year market dominance that is "rapidly being replaced by insurance companies and for-profit investors." Providers' decisions to treat or not to treat are strongly influenced by whether the patient is in a fee-for-service or capitated payment environment.
Part-Task Training Strategies in Simulated Carrier Landing Final Approach Training
1983-11-01
received a large amount of attention in the recent past. However, the notion that the value of flight simulation may be enhanced when principles of...as training devices through the application of principles of learning. The research proposed here is based on this point of view. THIS EXPERIMENT The...tracking. Following Goldstein's suggestion, one should look for training techniques suggested by learning principles developed from research on
Common pitfalls in statistical analysis: The perils of multiple testing
Ranganathan, Priya; Pramesh, C. S.; Buyse, Marc
2016-01-01
Multiple testing refers to situations where a dataset is subjected to statistical testing multiple times - either at multiple time-points or through multiple subgroups or for multiple end-points. This amplifies the probability of a false-positive finding. In this article, we look at the consequences of multiple testing and explore various methods to deal with this issue. PMID:27141478
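The inflation of the false-positive rate described above can be made concrete with a short calculation; the numbers below assume independent tests, the simplest case, and the Bonferroni correction is only one of the several remedies the article discusses.

```python
# Worked example: with 20 independent tests at alpha = 0.05, the chance of at
# least one false positive is about 64%; Bonferroni restores the family-wise rate.
alpha, m = 0.05, 20
fwer_uncorrected = 1 - (1 - alpha) ** m
fwer_bonferroni = 1 - (1 - alpha / m) ** m
print(f"uncorrected FWER ~ {fwer_uncorrected:.2f}")   # ~0.64
print(f"Bonferroni FWER  ~ {fwer_bonferroni:.3f}")    # ~0.049
```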
RandomSpot: A web-based tool for systematic random sampling of virtual slides.
Wright, Alexander I; Grabsch, Heike I; Treanor, Darren E
2015-01-01
This paper describes work presented at the Nordic Symposium on Digital Pathology 2014, Linköping, Sweden. Systematic random sampling (SRS) is a stereological tool, which provides a framework to quickly build an accurate estimation of the distribution of objects or classes within an image, whilst minimizing the number of observations required. RandomSpot is a web-based tool for SRS in stereology, which systematically places equidistant points within a given region of interest on a virtual slide. Each point can then be visually inspected by a pathologist in order to generate an unbiased sample of the distribution of classes within the tissue. Further measurements can then be derived from the distribution, such as the ratio of tumor to stroma. RandomSpot replicates the fundamental principle of traditional light microscope grid-shaped graticules, with the added benefits associated with virtual slides, such as facilitated collaboration and automated navigation between points. Once the sample points have been added to the region(s) of interest, users can download the annotations and view them locally using their virtual slide viewing software. Since its introduction, RandomSpot has been used extensively for international collaborative projects, clinical trials and independent research projects. So far, the system has been used to generate over 21,000 sample sets, and has been used to generate data for use in multiple publications, identifying significant new prognostic markers in colorectal, upper gastro-intestinal and breast cancer. Data generated using RandomSpot also has significant value for training image analysis algorithms using sample point coordinates and pathologist classifications.
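The core systematic random sampling idea behind the tool can be sketched in a few lines; the sketch below handles only a rectangular region of interest with a single random grid offset, whereas RandomSpot itself works on arbitrary regions in virtual slides.

```python
# Sketch of systematic random sampling: a regular grid of points with one
# random offset, clipped to a rectangular region of interest.
import numpy as np

def srs_points(x0, y0, width, height, spacing, seed=None):
    rng = np.random.default_rng(seed)
    off_x, off_y = rng.uniform(0, spacing, size=2)   # one random offset for the whole grid
    xs = np.arange(x0 + off_x, x0 + width, spacing)
    ys = np.arange(y0 + off_y, y0 + height, spacing)
    return [(x, y) for y in ys for x in xs]

pts = srs_points(0, 0, 2000, 1500, spacing=250, seed=42)
print(len(pts), "sample points")
```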
21 CFR 320.27 - Guidelines on the design of a multiple-dose in vivo bioavailability study.
Code of Federal Regulations, 2011 CFR
2011-04-01
... vivo bioavailability study. 320.27 Section 320.27 Food and Drugs FOOD AND DRUG ADMINISTRATION... Guidelines on the design of a multiple-dose in vivo bioavailability study. (a) Basic principles. (1) In... labeling of the test product. (3) A multiple-dose study may be required to determine the bioavailability of...
Enhancing the Therapy Experience Using Principles of Video Game Design.
Folkins, John Wm; Brackenbury, Tim; Krause, Miriam; Haviland, Allison
2016-02-01
This article considers the potential benefits that applying design principles from contemporary video games may have on enhancing therapy experiences. Six principles of video game design are presented, and their relevance for enriching clinical experiences is discussed. The motivational and learning benefits of each design principle have been discussed in the education literature as having positive impacts on student motivation and learning and are related here to aspects of clinical practice. The essential experience principle suggests connecting all aspects of the experience around a central emotion or cognitive connection. The discovery principle promotes indirect learning in focused environments. The risk-taking principle addresses the uncertainties clients face when attempting newly learned skills in novel situations. The generalization principle encourages multiple opportunities for skill transfer. The reward system principle directly relates to the scaffolding of frequent and varied feedback in treatment. Last, the identity principle can assist clients in using their newly learned communication skills to redefine self-perceptions. These principles highlight areas for research and interventions that may be used to reinforce or advance current practice.
WATER QUALITY IN SOURCE WATER, TREATMENT, AND DISTRIBUTION SYSTEMS
Most drinking water utilities practice the multiple-barrier concept as the guiding principle for providing safe water. This chapter discusses multiple barriers as they relate to the basic criteria for selecting and protecting source waters, including known and potential sources ...
Protein folding and misfolding: mechanism and principles
Englander, S. Walter; Mayne, Leland; Krishna, Mallela M. G.
2012-01-01
Two fundamentally different views of how proteins fold are now being debated. Do proteins fold through multiple unpredictable routes directed only by the energetically downhill nature of the folding landscape or do they fold through specific intermediates in a defined pathway that systematically puts predetermined pieces of the target native protein into place? It has now become possible to determine the structure of protein folding intermediates, evaluate their equilibrium and kinetic parameters, and establish their pathway relationships. Results obtained for many proteins have serendipitously revealed a new dimension of protein structure. Cooperative structural units of the native protein, called foldons, unfold and refold repeatedly even under native conditions. Much evidence obtained by hydrogen exchange and other methods now indicates that cooperative foldon units and not individual amino acids account for the unit steps in protein folding pathways. The formation of foldons and their ordered pathway assembly systematically puts native-like foldon building blocks into place, guided by a sequential stabilization mechanism in which prior native-like structure templates the formation of incoming foldons with complementary structure. Thus the same propensities and interactions that specify the final native state, encoded in the amino-acid sequence of every protein, determine the pathway for getting there. Experimental observations that have been interpreted differently, in terms of multiple independent pathways, appear to be due to chance misfolding errors that cause different population fractions to block at different pathway points, populate different pathway intermediates, and fold at different rates. This paper summarizes the experimental basis for these three determining principles and their consequences. Cooperative native-like foldon units and the sequential stabilization process together generate predetermined stepwise pathways. Optional misfolding errors are responsible for 3-state and heterogeneous kinetic folding. PMID:18405419
NASA Astrophysics Data System (ADS)
Brown, Nancy Melamed
This qualitative investigation extends the study of teacher learning within a reform-based community of practice model of professional development. This long-term, multiple case study examined three experienced teachers' transformations in thinking about science instruction. Data were collected during the three years of the Guided Inquiry supporting Multiple Literacies research project, designed to develop instructional practices informed by a socio-cultural, inquiry-based orientation. Data sources included: transcripts of semi-structured interviews collected at strategic points, the teacher's journals, initial application information, and teachers' written case studies. Using an interpretive case study approach, tenets of the teachers' orientations were identified through a recursive process. Results are organized to reflect two principles that were integral to the design of the professional development community. The first principle describes changes in teachers' orientations about the goals and characteristics of science instruction in the elementary grades. The second describes changes about teachers' knowledge about themselves as learners and the influence of this knowledge on their thinking about science instruction and student learning. Illustrative findings indicate that: (a) it is possible for teachers' language regarding conceptions of their practice to change with only superficial change in their orientations, (b) teachers can hold dualistic ways of thinking about their practice, (c) in some cases, teachers use a significant amount of autobiography about their own learning to explain their practice; over time, this was replaced with warrants using the language that developed within the professional development community, and (d) long-term case studies revealed differences in orientations that emerged and were refined over time. These findings provide strong support for communities of practice as a model of professional development and hold implications for advancing teacher learning.
Notes on TQM (Total Quality Management) and Education
ERIC Educational Resources Information Center
Daniel, Carter A.
2005-01-01
Application of Deming's TQM principles to education is long overdue. Principles that have proven their worth in businesses for decades could revolutionize our thinking about education. But they require a total commitment, from the highest to the lowest level. Deming's 14 points, and Gray Rinehart's suggestions, are presented, discussed, and…
Making Marketing Principles Tangible: Online Auctions as Living Case Studies
ERIC Educational Resources Information Center
Wood, Charles M.; Suter, Tracy A.
2004-01-01
This article presents an effective course supplement for Principles of Marketing classes. An experiential project involving online auctions is offered to instructors seeking to create a more participatory student environment and an interactive teaching style. A number of learning points are illustrated that allow instructors to use an auction…
Study on the intrinsic defects in tin oxide with first-principles method
NASA Astrophysics Data System (ADS)
Sun, Yu; Liu, Tingyu; Chang, Qiuxiang; Ma, Changmin
2018-04-01
First-principles and thermodynamic methods are used to study the contribution of vibrational entropy to the defect formation energy and the stability of intrinsic point defects in the SnO2 crystal. According to the thermodynamic calculations, the contribution of vibrational entropy to the defect formation energy is significant and should not be neglected, especially at high temperatures. The calculated results indicate that the oxygen vacancy is the major point defect in the undoped SnO2 crystal, with a higher concentration than the other point defects. Negative-U behavior is identified in the SnO2 crystal. To determine the most stable defects more clearly under different conditions, the most stable intrinsic defect as a function of Fermi level, oxygen partial pressure, and temperature is described in three-dimensional defect formation enthalpy diagrams. The diagrams visually identify the most stable point defects under different conditions.
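For readers unfamiliar with the bookkeeping behind such studies, the sketch below shows the standard neutral-defect formation-energy expression; the formula is the conventional one used in first-principles point-defect work, the numbers are purely illustrative, and charged-defect and vibrational-entropy terms are not included.

```python
# Standard neutral-defect formation energy:
#   E_f = E(defective cell) - E(perfect cell) + sum_i n_i * mu_i
# with n_i > 0 for atoms removed to a reservoir of chemical potential mu_i and
# n_i < 0 for atoms added. Charged defects would add q*(E_F + E_VBM) terms.
def formation_energy(e_defect, e_perfect, removed=None, added=None, mu=None):
    mu = mu or {}
    removed = removed or {}
    added = added or {}
    e_f = e_defect - e_perfect
    for species, n in removed.items():
        e_f += n * mu[species]          # atoms removed to the reservoir
    for species, n in added.items():
        e_f -= n * mu[species]          # atoms taken from the reservoir
    return e_f

# Illustrative numbers only (eV): an oxygen vacancy in an O-rich limit.
print(formation_energy(e_defect=-855.3, e_perfect=-861.2, removed={"O": 1}, mu={"O": -4.9}))
```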
Insight into point defects and impurities in titanium from first principles
NASA Astrophysics Data System (ADS)
Nayak, Sanjeev K.; Hung, Cain J.; Sharma, Vinit; Alpay, S. Pamir; Dongare, Avinash M.; Brindley, William J.; Hebert, Rainer J.
2018-03-01
Titanium alloys find extensive use in the aerospace and biomedical industries due to a unique combination of strength, density, and corrosion resistance. Decades of mostly experimental research has led to a large body of knowledge of the processing-microstructure-properties linkages. But much of the existing understanding of point defects that play a significant role in the mechanical properties of titanium is based on semi-empirical rules. In this work, we present the results of a detailed self-consistent first-principles study that was developed to determine formation energies of intrinsic point defects including vacancies, self-interstitials, and extrinsic point defects, such as, interstitial and substitutional impurities/dopants. We find that most elements, regardless of size, prefer substitutional positions, but highly electronegative elements, such as C, N, O, F, S, and Cl, some of which are common impurities in Ti, occupy interstitial positions.
First-principles investigation of point defect and atomic diffusion in Al2Ca
NASA Astrophysics Data System (ADS)
Tian, Xiao; Wang, Jia-Ning; Wang, Ya-Ping; Shi, Xue-Feng; Tang, Bi-Yu
2017-04-01
Point defects and atomic diffusion in Al2Ca have been studied from first-principles calculations within density functional framework. After formation energy and relative stability of point defects are investigated, several predominant diffusion processes in Al2Ca are studied, including sublattice one-step mechanism, 3-jump vacancy cycles and antistructure sublattice mechanism. The associated energy profiles are calculated with climbing image nudged elastic band (CI-NEB) method, then the saddle points and activation barriers during atomic diffusion are further determined. The resulted activation barriers show that both Al and Ca can diffuse mainly mediated by neighbor vacancy on their own sublattice. 3-jump cycle mechanism mediated by VCa may make some contribution to the overall Al diffusion. And antistructure (AS) sublattice mechanism can also play an important role in Ca atomic diffusion owing to the moderate activation barrier.
Applying Universal Design for Learning in Online Courses: Pedagogical and Practical Considerations
ERIC Educational Resources Information Center
Dell, Cindy Ann; Dell, Thomas F.; Blackwell, Terry L.
2015-01-01
Inclusion of the universal design for learning (UDL) model as a guiding set of principles for online curriculum development in higher education is discussed. Fundamentally, UDL provides the student with multiple means of accessing the course based on three overarching principles: presentation; action and expression; and engagement and interaction.…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-01
... principles similar to the principles applied in determining whether employees of two or more trades or... marketing or sale of a MEWA. Section 6605 enables the Secretary to issue administrative cease and desist... but not limited to marketing, soliciting, providing, or offering to provide benefits consisting of...
Designing future landscapes from principles of form and function
Larry D. Harris; Patrick Kangas
1979-01-01
Future landscapes will consist of a gradient of types ranging from wilderness areas to totally "humanized" environments. The man-dominated landscapes will be required to fulfill multiple functions only one of which is aesthetic enjoyment. It is suggested that basic principles of form and function may contribute to design criteria. Applications to the...
Leave islands as refugia for low-mobility species in managed forest mosaics
Stephanie J. Wessell-Kelly; Deanna H. Olson
2013-01-01
In recent years, forest management in the Pacific Northwest has shifted from one based largely on resource extraction to one based on ecosystem management principles. Forest management based on these principles involves simultaneously balancing and sustaining multiple forest resource values, including silvicultural, social, economic, and ecological objectives. Leave...
Modeling Success: Using Preenrollment Data to Identify Academically At-Risk Students
ERIC Educational Resources Information Center
Gansemer-Topf, Ann M.; Compton, Jonathan; Wohlgemuth, Darin; Forbes, Greg; Ralston, Ekaterina
2015-01-01
Improving student success and degree completion is one of the core principles of strategic enrollment management. To address this principle, institutional data were used to develop a statistical model to identify academically at-risk students. The model employs multiple linear regression techniques to predict students at risk of earning below a…
The Relative Performance of Female and Male Students in Accounting Principles Classes.
ERIC Educational Resources Information Center
Bouillon, Marvin L.; Doran, B. Michael
1992-01-01
The performance of female and male students in Accounting Principles (AP) I and II was compared by using multiple regression techniques to assess the incremental explanatory effects of gender. Males significantly outperformed females in AP I, contradicting earlier studies. Similar gender of instructor and student was insignificant. (JOW)
ERIC Educational Resources Information Center
Levy, Sharona T.; Peleg, Ran; Ofeck, Eyal; Tabor, Naamit; Dubovi, Ilana; Bluestein, Shiri; Ben-Zur, Hadar
2018-01-01
We propose and evaluate a framework supporting collaborative discovery learning of complex systems. The framework blends five design principles: (1) individual action: amidst (2) social interactions; challenged with (3) multiple tasks; set in (4) a constrained interactive learning environment that draws attention to (5) highlighted target…
Alexander, Nick; Rowe, Sylvia; Brackett, Robert E; Burton-Freeman, Britt; Hentges, Eric J; Kretser, Alison; Klurfeld, David M; Meyers, Linda D; Mukherjea, Ratna; Ohlhorst, Sarah
2015-06-01
Officers and other representatives of more than a dozen food-, nutrition-, and health-related scientific societies and organizations, food industry scientists, and staff of the USDA, the CDC, the Food and Drug Administration, and the NIH convened on 8 December 2014 in Washington, DC, to reach a consensus among individuals participating on guiding principles for the development of research-oriented, food- and nutrition-related public-private partnerships. During the daylong working meeting, participants discussed and revised 12 previously published guidelines to ensure integrity in the conduct of food and nutrition research collaborations among public, nonprofit, and private sectors. They agreed to reconvene periodically to reassess the public-private partnership principles. This article presents the guiding principles and potential benefits, outlines key discussion points, and articulates points of agreement and reservation. © 2015 American Society for Nutrition.
NASA Astrophysics Data System (ADS)
Müller, Benjamin; Bernhardt, Matthias; Jackisch, Conrad; Schulz, Karsten
2016-09-01
For understanding water and solute transport processes, knowledge of the respective hydraulic properties is necessary. Commonly, hydraulic parameters are estimated via pedo-transfer functions using soil texture data, to avoid cost-intensive laboratory measurements of hydraulic parameters. However, current soil texture information is only available at a coarse spatial resolution of 250 to 1000 m. Here, a method is presented to derive high-resolution (15 m) spatial topsoil texture patterns for the meso-scale Attert catchment (Luxembourg, 288 km2) from 28 images of ASTER (advanced spaceborne thermal emission and reflection radiometer) thermal remote sensing. A principal component analysis of the images reveals the most dominant thermal patterns (principal components, PCs), which are related to 212 fractional soil texture samples. Within a multiple linear regression framework, distributed soil texture information is estimated and the related uncertainties are assessed. An overall root mean squared error (RMSE) of 12.7 percentage points (pp) lies well within, and even below, the range of recent studies on soil texture estimation, while requiring sparser sample setups and a less diverse set of basic spatial input. This approach will improve the generation of spatially distributed topsoil maps, particularly for hydrologic modeling purposes, and will expand the usage of thermal remote sensing products.
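The two-stage workflow described above (principal component analysis of the image stack, then multiple linear regression against the field samples) can be sketched as follows; all array shapes, the number of retained components, and the data are placeholders, and scikit-learn stands in for whatever software the authors used.

```python
# Sketch: PCA of a stack of thermal scenes, then a multiple linear regression
# relating PC scores at sampled pixels to measured topsoil texture fractions.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

n_pixels, n_images, n_samples = 50_000, 28, 212
stack = np.random.rand(n_pixels, n_images)            # each column: one ASTER thermal scene

pca = PCA(n_components=5)
scores = pca.fit_transform(stack)                      # dominant thermal patterns per pixel

sample_idx = np.random.choice(n_pixels, n_samples, replace=False)
sand_fraction = np.random.uniform(0, 100, n_samples)   # placeholder lab measurements (percent)

reg = LinearRegression().fit(scores[sample_idx], sand_fraction)
pred = reg.predict(scores)                             # wall-to-wall texture estimate
rmse = mean_squared_error(sand_fraction, pred[sample_idx]) ** 0.5
print(f"in-sample RMSE: {rmse:.1f} pp")
```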
Analyzing the dynamics of cell cycle processes from fixed samples through ergodic principles
Wheeler, Richard John
2015-01-01
Tools to analyze cyclical cellular processes, particularly the cell cycle, are of broad value for cell biology. Cell cycle synchronization and live-cell time-lapse observation are widely used to analyze these processes but are not available for many systems. Simple mathematical methods built on the ergodic principle are a well-established, widely applicable, and powerful alternative analysis approach, although they are less widely used. These methods extract data about the dynamics of a cyclical process from a single time-point “snapshot” of a population of cells progressing through the cycle asynchronously. Here, I demonstrate application of these simple mathematical methods to analysis of basic cyclical processes—cycles including a division event, cell populations undergoing unicellular aging, and cell cycles with multiple fission (schizogony)—as well as recent advances that allow detailed mapping of the cell cycle from continuously changing properties of the cell such as size and DNA content. This includes examples using existing data from mammalian, yeast, and unicellular eukaryotic parasite cell biology. Through the ongoing advances in high-throughput cell analysis by light microscopy, electron microscopy, and flow cytometry, these mathematical methods are becoming ever more important and are a powerful complementary method to traditional synchronization and time-lapse cell cycle analysis methods. PMID:26543196
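The core ergodic-principle calculation can be illustrated with the standard relation for an asynchronous, exponentially growing population in which every division yields two cells; this textbook formula is stated here as an assumption about the methods summarized above rather than as the paper's exact notation.

```python
# Ergodic mapping from a snapshot fraction to a position in the cell cycle:
# the cumulative fraction F of cells that lie before a landmark maps to the
# landmark's time via  t = -T * log2(1 - F/2),  where T is the cycle length.
import math

def landmark_time(fraction_before, cycle_length_h):
    """Convert a snapshot fraction into a position (hours) in the cell cycle."""
    return -cycle_length_h * math.log2(1 - fraction_before / 2)

# Example: if 40% of cells in a snapshot have not yet reached the landmark and
# the cycle takes 24 h, the landmark sits roughly 7.7 h into the cycle.
print(f"{landmark_time(0.40, 24):.1f} h")
```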
Svyatsky, Daniil; Lipnikov, Konstantin
2017-03-18
Richards' equation describes steady-state or transient flow in a variably saturated medium. For a medium having multiple layers of soil that are not aligned with the coordinate axes, a mesh fitted to these layers is no longer orthogonal, and the classical two-point flux approximation finite volume scheme is no longer accurate. Here, we propose new second-order accurate nonlinear finite volume (NFV) schemes for the head and pressure formulations of Richards' equation. We prove that the discrete maximum principles hold for both formulations at steady state, which mimics the corresponding properties of the continuum solution. The second-order accuracy is achieved using high-order upwind algorithms for the relative permeability. Numerical simulations of water infiltration into a dry soil show a significant advantage of the second-order NFV schemes over the first-order NFV schemes, even on coarse meshes. Since explicit calculation of the Jacobian matrix becomes prohibitively expensive for high-order schemes due to the built-in reconstruction and slope-limiting algorithms, we study numerically the preconditioning strategy introduced recently in Lipnikov et al. (2016) that uses a stable approximation of the continuum Jacobian. Lastly, numerical simulations show that the new preconditioner reduces the computational cost by up to a factor of 2–3 in comparison with conventional preconditioners.
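As background only, the sketch below shows the classical two-point flux approximation that the abstract says loses accuracy on meshes not aligned with the soil layers, written for a 1D grid where it is exact; it is not the paper's nonlinear finite volume scheme, and the data are placeholders.

```python
# Classical two-point flux approximation (TPFA) on a 1D grid: the face flux
# between neighboring cells uses a distance-weighted harmonic mean of the
# hydraulic conductivities.
import numpy as np

def tpfa_fluxes(head, K, dx):
    """head, K, dx: per-cell arrays; returns the Darcy flux across each interior face."""
    d = 0.5 * (dx[:-1] + dx[1:])                                   # center-to-center distances
    K_face = d / (0.5 * dx[:-1] / K[:-1] + 0.5 * dx[1:] / K[1:])   # harmonic average
    return -K_face * (head[1:] - head[:-1]) / d

head = np.array([10.0, 9.0, 7.5, 7.0])
K = np.array([1.0, 0.5, 0.5, 2.0])
dx = np.full(4, 0.25)
print(tpfa_fluxes(head, K, dx))
```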
NASA Astrophysics Data System (ADS)
Eisenbach, Markus; Larkin, Jeff; Lutjens, Justin; Rennich, Steven; Rogers, James H.
2017-02-01
The Locally Self-consistent Multiple Scattering (LSMS) code solves the first principles Density Functional theory Kohn-Sham equation for a wide range of materials with a special focus on metals, alloys and metallic nano-structures. It has traditionally exhibited near perfect scalability on massively parallel high performance computer architectures. We present our efforts to exploit GPUs to accelerate the LSMS code to enable first principles calculations of O(100,000) atoms and statistical physics sampling of finite temperature properties. We reimplement the scattering matrix calculation for GPUs with a block matrix inversion algorithm that only uses accelerator memory. Using the Cray XK7 system Titan at the Oak Ridge Leadership Computing Facility we achieve a sustained performance of 14.5 PFlop/s and a speedup of 8.6 compared to the CPU-only code.
Eisenbach, Markus; Larkin, Jeff; Lutjens, Justin; ...
2016-07-12
The Locally Self-consistent Multiple Scattering (LSMS) code solves the first principles Density Functional theory Kohn–Sham equation for a wide range of materials with a special focus on metals, alloys and metallic nano-structures. It has traditionally exhibited near perfect scalability on massively parallel high performance computer architectures. In this paper, we present our efforts to exploit GPUs to accelerate the LSMS code to enable first principles calculations of O(100,000) atoms and statistical physics sampling of finite temperature properties. We reimplement the scattering matrix calculation for GPUs with a block matrix inversion algorithm that only uses accelerator memory. Finally, using the Cray XK7 system Titan at the Oak Ridge Leadership Computing Facility we achieve a sustained performance of 14.5 PFlop/s and a speedup of 8.6 compared to the CPU-only code.
An Exemplar-Based Multi-View Domain Generalization Framework for Visual Recognition.
Niu, Li; Li, Wen; Xu, Dong; Cai, Jianfei
2018-02-01
In this paper, we propose a new exemplar-based multi-view domain generalization (EMVDG) framework for visual recognition by learning robust classifiers that are able to generalize well to an arbitrary target domain based on training samples with multiple types of features (i.e., multi-view features). In this framework, we aim to address two issues simultaneously. First, the distribution of training samples (i.e., the source domain) is often considerably different from that of testing samples (i.e., the target domain), so the performance of classifiers learnt on the source domain may drop significantly on the target domain. Moreover, the testing data are often unseen during the training procedure. Second, when the training data are associated with multi-view features, the recognition performance can be further improved by exploiting the relation among multiple types of features. To address the first issue, considering that it has been shown that fusing multiple SVM classifiers can enhance the domain generalization ability, we build our EMVDG framework upon exemplar SVMs (ESVMs), in which a set of ESVM classifiers are learnt, with each one trained on one positive training sample and all the negative training samples. When the source domain contains multiple latent domains, the learnt ESVM classifiers are expected to be grouped into multiple clusters. To address the second issue, we propose two approaches under the EMVDG framework based on the consensus principle and the complementary principle, respectively. Specifically, we propose an EMVDG_CO method by adding a co-regularizer to enforce the cluster structures of ESVM classifiers on different views to be consistent, based on the consensus principle. Inspired by multiple kernel learning, we also propose another EMVDG_MK method by fusing the ESVM classifiers from different views based on the complementary principle. In addition, we further extend our EMVDG framework to an exemplar-based multi-view domain adaptation (EMVDA) framework when unlabeled target domain data are available during the training procedure. The effectiveness of our EMVDG and EMVDA frameworks for visual recognition is clearly demonstrated by comprehensive experiments on three benchmark data sets.
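The exemplar-SVM building block on which the framework rests can be sketched as follows; the feature data are synthetic, scikit-learn's LinearSVC stands in for whatever solver the authors used, and the clustering of classifiers and the multi-view co-regularization are not shown.

```python
# Sketch of the exemplar-SVM building block: one linear SVM per positive
# training sample, trained against all negative samples.
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
positives = rng.normal(1.0, 1.0, size=(20, 64))    # 20 positive exemplars
negatives = rng.normal(-1.0, 1.0, size=(200, 64))  # shared negative pool

exemplar_svms = []
for exemplar in positives:
    X = np.vstack([exemplar[None, :], negatives])
    y = np.array([1] + [0] * len(negatives))
    clf = LinearSVC(C=1.0, class_weight="balanced", max_iter=5000).fit(X, y)
    exemplar_svms.append(clf)

# At test time the ensemble's decision values can be fused (e.g., averaged).
test = rng.normal(1.0, 1.0, size=(1, 64))
scores = np.array([clf.decision_function(test)[0] for clf in exemplar_svms])
print("mean exemplar score:", scores.mean())
```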
[Comment on the ruling about the appeal against the Directive on biotechnological inventions].
Fernando Magarzo, M R
2001-01-01
The author examines the content of the European Court of Justice ruling which dismisses the appeal lodged by the Netherlands against Directive 98/44 concerning the legal protection of biotechnological inventions. The main grounds for the appeal were as follows: inappropriate choice of points of law; breach of the principle of subsidiarity; violation of the principle of legal certainty; breach of International Law obligations; undermining of human dignity; breach of the principle of collegiality.
Classification and data acquisition with incomplete data
NASA Astrophysics Data System (ADS)
Williams, David P.
In remote-sensing applications, incomplete data can result when only a subset of sensors (e.g., radar, infrared, acoustic) are deployed at certain regions. The limitations of single sensor systems have spurred interest in employing multiple sensor modalities simultaneously. For example, in land mine detection tasks, different sensor modalities are better-suited to capture different aspects of the underlying physics of the mines. Synthetic aperture radar sensors may be better at detecting surface mines, while infrared sensors may be better at detecting buried mines. By employing multiple sensor modalities to address the detection task, the strengths of the disparate sensors can be exploited in a synergistic manner to improve performance beyond that which would be achievable with either single sensor alone. When multi-sensor approaches are employed, however, incomplete data can be manifested. If each sensor is located on a separate platform ( e.g., aircraft), each sensor may interrogate---and hence collect data over---only partially overlapping areas of land. As a result, some data points may be characterized by data (i.e., features) from only a subset of the possible sensors employed in the task. Equivalently, this scenario implies that some data points will be missing features. Increasing focus in the future on using---and fusing data from---multiple sensors will make such incomplete-data problems commonplace. In many applications involving incomplete data, it is possible to acquire the missing data at a cost. In multi-sensor remote-sensing applications, data is acquired by deploying sensors to data points. Acquiring data is usually an expensive, time-consuming task, a fact that necessitates an intelligent data acquisition process. Incomplete data is not limited to remote-sensing applications, but rather, can arise in virtually any data set. In this dissertation, we address the general problem of classification when faced with incomplete data. We also address the closely related problem of active data acquisition, which develops a strategy to acquire missing features and labels that will most benefit the classification task. We first address the general problem of classification with incomplete data, maintaining the view that all data (i.e., information) is valuable. We employ a logistic regression framework within which we formulate a supervised classification algorithm for incomplete data. This principled, yet flexible, framework permits several interesting extensions that allow all available data to be utilized. One extension incorporates labeling error, which permits the usage of potentially imperfectly labeled data in learning a classifier. A second major extension converts the proposed algorithm to a semi-supervised approach by utilizing unlabeled data via graph-based regularization. Finally, the classification algorithm is extended to the case in which (image) data---from which features are extracted---are available from multiple resolutions. Taken together, this family of incomplete-data classification algorithms exploits all available data in a principled manner by avoiding explicit imputation. Instead, missing data is integrated out analytically with the aid of an estimated conditional density function (conditioned on the observed features). This feat is accomplished by invoking only mild assumptions. We also address the problem of active data acquisition by determining which missing data should be acquired to most improve performance. 
Specifically, we examine this data acquisition task when the data to be acquired can be either labels or features. The proposed approach is based on a criterion that accounts for the expected benefit of the acquisition. This approach, which is applicable for any general missing data problem, exploits the incomplete-data classification framework introduced in the first part of this dissertation. This data acquisition approach allows for the acquisition of both labels and features. Moreover, several types of feature acquisition are permitted, including the acquisition of individual or multiple features for individual or multiple data points, which may be either labeled or unlabeled. Furthermore, if different types of data acquisition are feasible for a given application, the algorithm will automatically determine the most beneficial type of data to acquire. Experimental results on both benchmark machine learning data sets and real (i.e., measured) remote-sensing data demonstrate the advantages of the proposed incomplete-data classification and active data acquisition algorithms.
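The dissertation integrates missing features out analytically under an estimated conditional density; the sketch below substitutes a Monte Carlo approximation under a multivariate Gaussian assumption, purely to illustrate the idea of predicting from observed features without explicit imputation. The classifier weights, density parameters, and data are placeholders.

```python
# Illustrative substitute for the marginalization step:
#   p(y=1 | x_obs) = E_{x_mis | x_obs}[ sigmoid(w . x + b) ]
# approximated by Monte Carlo sampling from a fitted Gaussian conditional.
import numpy as np

def predict_with_missing(w, b, x, missing_mask, mean, cov, n_samples=2000, seed=0):
    rng = np.random.default_rng(seed)
    obs, mis = ~missing_mask, missing_mask
    # Conditional Gaussian of the missing features given the observed ones.
    cov_oo, cov_mo = cov[np.ix_(obs, obs)], cov[np.ix_(mis, obs)]
    cov_mm, cov_om = cov[np.ix_(mis, mis)], cov[np.ix_(obs, mis)]
    solve = np.linalg.solve(cov_oo, x[obs] - mean[obs])
    cond_mean = mean[mis] + cov_mo @ solve
    cond_cov = cov_mm - cov_mo @ np.linalg.solve(cov_oo, cov_om)
    draws = rng.multivariate_normal(cond_mean, cond_cov, size=n_samples)
    x_full = np.tile(x, (n_samples, 1))
    x_full[:, mis] = draws                          # fill the missing slots with samples
    return float(np.mean(1.0 / (1.0 + np.exp(-(x_full @ w + b)))))

# Toy usage with a 3-feature classifier and one missing feature.
w, b = np.array([0.8, -0.5, 0.3]), 0.1
mean, cov = np.zeros(3), np.eye(3)
x = np.array([1.2, 0.0, np.nan])
print(predict_with_missing(w, b, x, np.isnan(x), mean, cov))
```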
Connecting and Using Multiple Representations
ERIC Educational Resources Information Center
Nielsen, Maria E.; Bostic, Jonathan D.
2018-01-01
"Principles to Actions: Ensuring Mathematical Success for All" (NCTM 2014) emphasizes eight teaching practices for effective mathematics teaching, one of which is to "use and connect multiple representations" (NCTM 2014, p. 24). An action that describes how teachers might promote this practice is to "allocate substantial…
Pointright: a system to redirect mouse and keyboard control among multiple machines
Johanson, Bradley E [Palo Alto, CA; Winograd, Terry A [Stanford, CA; Hutchins, Gregory M [Mountain View, CA
2008-09-30
The present invention provides a software system, PointRight, that allows for smooth and effortless control of pointing and input devices among multiple displays. With PointRight, a single free-floating mouse and keyboard can be used to control multiple screens. When the cursor reaches the edge of a screen it seamlessly moves to the adjacent screen and keyboard control is simultaneously redirected to the appropriate machine. Laptops may also redirect their keyboard and pointing device, and multiple pointers are supported simultaneously. The system automatically reconfigures itself as displays go on, go off, or change the machine they display.
Ethical principles in federal regulations: the case of children and research risks.
Williams, P C
1996-04-01
Ethical principles play an important part not only in the promulgation of regulations but also in their application, i.e., enforcement and adjudication. While traditional ethical principles--promotion of welfare, freedom, and fairness--play an important role in both elements of regulation, some other kinds of ethical principles are significant as well. Principles governing the structure of decision processes should shape the structure and actions of agencies; principles of wise application should govern the work of those whose responsibility it is to apply regulatory language to particular situations. These points are demonstrated by investigating a case study: federal regulations designed to protect children involved in scientific research, as applied to a placebo study of the effects of recombinant human growth hormone on children of extremely short stature.
Estimating vehicle height using homographic projections
Cunningham, Mark F; Fabris, Lorenzo; Gee, Timothy F; Ghebretati, Jr., Frezghi H; Goddard, James S; Karnowski, Thomas P; Ziock, Klaus-peter
2013-07-16
Multiple homography transformations corresponding to different heights are generated in the field of view. A group of salient points within a common estimated height range is identified in a time series of video images of a moving object. Inter-salient point distances are measured for the group of salient points under the multiple homography transformations corresponding to the different heights. Variations in the inter-salient point distances under the multiple homography transformations are compared. The height of the group of salient points is estimated to be the height corresponding to the homography transformation that minimizes the variations.
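A minimal sketch of the height-selection step described above may make the procedure concrete. It assumes the candidate homographies (one per hypothesized height) and the tracked salient points are already available, and simply picks the height whose homography yields the most time-consistent inter-point distances; the function names and array shapes are assumptions, not the authors' code.

```python
import numpy as np

def apply_homography(H, pts):
    """Map Nx2 pixel points through a 3x3 homography (homogeneous divide)."""
    ph = np.hstack([pts, np.ones((len(pts), 1))])
    q = ph @ H.T
    return q[:, :2] / q[:, 2:3]

def estimate_height(homographies, heights, tracks):
    """homographies: list of 3x3 arrays, one per candidate height.
    tracks: array of shape (n_frames, n_points, 2) with salient-point pixels.
    Returns the candidate height whose homography makes pairwise distances
    most consistent across frames (smallest total variance over time)."""
    scores = []
    for H in homographies:
        dists = []
        for frame in tracks:
            world = apply_homography(H, frame)
            # Pairwise distances between mapped salient points.
            diff = world[:, None, :] - world[None, :, :]
            d = np.sqrt((diff ** 2).sum(-1))
            dists.append(d[np.triu_indices(len(world), k=1)])
        dists = np.array(dists)                 # (n_frames, n_pairs)
        scores.append(dists.var(axis=0).sum())  # variation over time
    return heights[int(np.argmin(scores))]
```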
Many-Worlds Interpretation of Quantum Theory and Mesoscopic Anthropic Principle
NASA Astrophysics Data System (ADS)
Kamenshchik, A. Yu.; Teryaev, O. V.
2008-10-01
We suggest combining the Anthropic Principle with the Many-Worlds Interpretation of Quantum Theory. By recognizing the multiplicity of worlds, this combination offers an opportunity to explain some important events that are assumed to be extremely improbable. The Mesoscopic Anthropic Principle suggested here aims to explain the appearance of such events, which are necessary for the emergence of Life and Mind. It is complementary to the Cosmological Anthropic Principle, which explains the fine tuning of fundamental constants. We briefly discuss various possible applications of the Mesoscopic Anthropic Principle, including solar eclipses and the assembling of complex molecules. Besides, we address the problem of Time's Arrow in the framework of the Many-Worlds Interpretation. We suggest a recipe for disentangling quantities defined by fundamental physical laws from those defined by anthropic selection.
NASA Technical Reports Server (NTRS)
Dekorvin, Andre
1989-01-01
The main purpose is to develop a theory for multiple knowledge systems. A knowledge system could be a sensor or an expert system, but it must specialize in one feature. The problem is that we have an exhaustive list of possible answers to some query (such as identifying what an object is). By collecting different feature values, in principle, it should be possible to give an answer to the query, or at least narrow down the list. Since a sensor, or for that matter an expert system, does not in most cases yield a precise value for the feature, uncertainty must be built into the model. Also, researchers must have a formal mechanism to be able to put the information together. Researchers chose to use the Dempster-Shafer approach to handle the problems mentioned above. Researchers introduce the concept of a state of recognition and point out that there is a relation between receiving updates and defining a set-valued Markov chain. Also, deciding the value of the next set-valued variable can be phrased in terms of classical decision-making theory, such as minimizing the maximum regret. Other related problems are examined.
The aetiopathogenesis of fatigue: unpredictable, complex and persistent
Clark, James E.; Fai Ng, W.; Watson, Stuart; Newton, Julia L.
2016-01-01
Background: Chronic fatigue syndrome is a common condition characterized by severe fatigue with post-exertional malaise, impaired cognitive ability, poor sleep quality, muscle pain, multi-joint pain, tender lymph nodes, sore throat or headache. Its defining symptom, fatigue, is common to several diseases. Areas of agreement: Research has established a broad picture of impairment across autonomic, endocrine and inflammatory systems, though progress seems to have reached an impasse. Areas of controversy: The absence of a clear consensus view of the pathophysiology of fatigue suggests the need to switch from a focus on abnormalities in one system to an experimental and clinical approach which integrates findings across multiple systems and their constituent parts, and to consider multiple environmental factors. Growing points: We discuss this with reference to three key factors, non-determinism, non-reductionism and self-organization, and suggest that an approach based on these principles may afford a coherent explanatory framework for much of the observed phenomena in fatigue and offers promising avenues for future research. Areas timely for developing research: By adopting this approach, the field can examine issues regarding aetiopathogenesis and treatment, with relevance for future research and clinical practice. PMID:26872857
Development of Advanced Methods of Structural and Trajectory Analysis for Transport Aircraft
NASA Technical Reports Server (NTRS)
Ardema, Mark D.; Windhorst, Robert; Phillips, James
1998-01-01
This paper develops a near-optimal guidance law for generating minimum fuel, time, or cost fixed-range trajectories for supersonic transport aircraft. The approach uses a choice of new state variables along with singular perturbation techniques to time-scale decouple the dynamic equations into multiple equations of single order (second order for the fast dynamics). Application of the maximum principle to each of the decoupled equations, as opposed to application to the original coupled equations, avoids the two-point boundary-value problem and transforms the problem from one of functional optimization to one of multiple function optimizations. It is shown that such an approach produces well-known aircraft performance results such as minimizing the Breguet factor for minimum fuel consumption and the energy climb path. Furthermore, the new state variables produce a consistent calculation of flight path angle along the trajectory, eliminating one of the deficiencies in the traditional energy state approximation. In addition, jumps in the energy climb path are smoothed out by integration of the original dynamic equations at constant load factor. Numerical results performed for a supersonic transport design show that a pushover dive followed by a pullout at nominal load factors are sufficient maneuvers to smooth the jump.
Optimization of Supersonic Transport Trajectories
NASA Technical Reports Server (NTRS)
Ardema, Mark D.; Windhorst, Robert; Phillips, James
1998-01-01
This paper develops a near-optimal guidance law for generating minimum fuel, time, or cost fixed-range trajectories for supersonic transport aircraft. The approach uses a choice of new state variables along with singular perturbation techniques to time-scale decouple the dynamic equations into multiple equations of single order (second order for the fast dynamics). Application of the maximum principle to each of the decoupled equations, as opposed to application to the original coupled equations, avoids the two-point boundary-value problem and transforms the problem from one of functional optimization to one of multiple function optimizations. It is shown that such an approach produces well-known aircraft performance results such as minimizing the Breguet factor for minimum fuel consumption and the energy climb path. Furthermore, the new state variables produce a consistent calculation of flight path angle along the trajectory, eliminating one of the deficiencies in the traditional energy state approximation. In addition, jumps in the energy climb path are smoothed out by integration of the original dynamic equations at constant load factor. Numerical results performed for a supersonic transport design show that a pushover dive followed by a pullout at nominal load factors are sufficient maneuvers to smooth the jump.
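For reference, one standard form of the Breguet range relation that underlies the "Breguet factor" result mentioned in the two abstracts above is sketched in LaTeX below. The notation is generic (jet aircraft, quasi-steady cruise), and the exact grouping of terms, and whether the resulting factor is minimized or maximized, depends on how the factor is defined in the papers, so this is offered only as orientation.

```latex
% Breguet range equation (jet aircraft, quasi-steady cruise):
%   R   - range,  V - true airspeed,  c - thrust-specific fuel consumption,
%   L/D - lift-to-drag ratio,  W_i, W_f - initial and final aircraft weights.
R \;=\; \frac{V}{c}\,\frac{L}{D}\,\ln\!\frac{W_i}{W_f}
\qquad\Longrightarrow\qquad
\frac{W_f}{W_i} \;=\; \exp\!\left(-\,\frac{R\,c}{V\,(L/D)}\right),
% so for a fixed range R, fuel burn is minimized by optimizing the grouping
% V (L/D) / c, i.e. the Breguet (range) factor, along the cruise.
```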
Olfactory Deficits in MCI as Predictor of Improved Cognition on Donepezil
2014-04-01
... rivastigmine, and are being followed at the scheduled time-points as per the intent-to-treat principle employed in this study. The data obtained will be ... the intent-to-treat principle, and in secondary analyses we will combine the data in patients who receive donepezil or galantamine or rivastigmine ...
Teaching Keynes's Principle of Effective Demand Using the Aggregate Labor Market Diagram.
ERIC Educational Resources Information Center
Dalziel, Paul; Lavoie, Marc
2003-01-01
Suggests a method to teach John Keynes's principle of effective demand using a standard aggregate labor market diagram familiar to students taking advanced undergraduate macroeconomics courses. States the analysis incorporates Michal Kalecki's version to show Keynesian unemployment as a point on the aggregate labor demand curve inside the…
A Response from Japan to TLRP's Ten Principles for Effective Pedagogy
ERIC Educational Resources Information Center
Abiko, Tadahiko
2011-01-01
This article comments upon James and Pollard's contribution in comparison with perspectives on pedagogy in Japan, where the concept has tended to be discredited by academics. TLRP's clusters of 10 principles are reviewed and found to be persuasive and meaningful, especially in relation to the following points: the emphasis on recognising…
Using Physics Principles in the Teaching of Chemistry.
ERIC Educational Resources Information Center
Gulden, Warren
1996-01-01
Presents three examples that show how students can use traditional physics principles or laws for the purpose of understanding chemistry better. Examples include Coulomb's Law and melting points, the Faraday Constant, and the Rydberg Constant. Presents a list of some other traditional topics in a chemistry course that could be enhanced by the…
Meta-Analyses of Seven of NIDA’s Principles of Drug Addiction Treatment
Pearson, Frank S.; Prendergast, Michael L.; Podus, Deborah; Vazan, Peter; Greenwell, Lisa; Hamilton, Zachary
2011-01-01
Seven of the 13 Principles of Drug Addiction Treatment disseminated by the National Institute on Drug Abuse (NIDA) were meta-analyzed as part of the Evidence-based Principles of Treatment (EPT) project. By averaging outcomes over the diverse programs included in EPT, we found that five of the NIDA principles examined are supported: matching treatment to the client’s needs; attending to the multiple needs of clients; behavioral counseling interventions; treatment plan reassessment; and counseling to reduce risk of HIV. Two of the NIDA principles are not supported: remaining in treatment for an adequate period of time and frequency of testing for drug use. These weak effects could be the result of the principles being stated too generally to apply to the diverse interventions and programs that exist or of unmeasured moderator variables being confounded with the moderators that measured the principles. Meta-analysis should be a standard tool for developing principles of effective treatment for substance use disorders. PMID:22119178
Pearson, Frank S; Prendergast, Michael L; Podus, Deborah; Vazan, Peter; Greenwell, Lisa; Hamilton, Zachary
2012-07-01
Of the 13 principles of drug addiction treatment disseminated by the National Institute on Drug Abuse (NIDA), 7 were meta-analyzed as part of the Evidence-based Principles of Treatment (EPT) project. By averaging outcomes over the diverse programs included in the EPT, we found that 5 of the NIDA principles examined are supported: matching treatment to the client's needs, attending to the multiple needs of clients, behavioral counseling interventions, treatment plan reassessment, and counseling to reduce risk of HIV. Two of the NIDA principles are not supported: remaining in treatment for an adequate period and frequency of testing for drug use. These weak effects could be the result of the principles being stated too generally to apply to the diverse interventions and programs that exist or unmeasured moderator variables being confounded with the moderators that measured the principles. Meta-analysis should be a standard tool for developing principles of effective treatment for substance use disorders. Copyright © 2012 Elsevier Inc. All rights reserved.
Values and principles evident in current health promotion practice.
Gregg, Jane; O'Hara, Lily
2007-04-01
Modern health promotion practice needs to respond to complex health issues that have multiple interrelated determinants. This requires an understanding of the values and principles of health promotion. A literature review was undertaken to explore the values and principles evident in current health promotion theory and practice. A broad range of values and principles are espoused as being integral to modern health promotion theory and practice. Although there are some commonalities across these lists, there is no recognised, authoritative set of values and principles accepted as fundamental and applicable to modern health promotion. There is a continuum of values and principles evident in health promotion practice from those associated with holistic, ecological, salutogenic health promotion to those more in keeping with conventional health promotion. There is a need for a system of values and principles consistent with modern health promotion that enables practitioners to purposefully integrate these values and principles into their understanding of health, as well as their needs assessment, planning, implementation and evaluation practice.
Langeland, Eva; Riise, Trond; Hanestad, Berit R; Nortvedt, Monica W; Kristoffersen, Kjell; Wahl, Astrid K
2006-08-01
Although the theory of salutogenesis provides a generic understanding of how coping may be created, this theoretical perspective has not been explored sufficiently within research among people suffering from mental health problems. The aim of this study is to investigate the effect of talk-therapy groups based on salutogenic treatment principles on coping with mental health problems. In an experimental design, the participants (residents in the community) were randomly allocated to a coping-enhancing experimental group (n=59) and a control group (n=47) receiving standard care. Coping was measured using the sense of coherence (SOC) questionnaire. Coping improved significantly in the experimental group (+6 points) compared with the control group (-2 points). The manageability component contributed most to this improvement. Talk-therapy groups based on salutogenic treatment principles improve coping among people with mental health problems; they may be helpful in increasing coping in the recovery process and seem to be applicable to people with various mental health problems.
[MLPA technique--principles and use in practice].
Rusu, Cristina; Sireteanu, Adriana; Puiu, Maria; Skrypnyk, Cristina; Tomescu, E; Csep, Katalin; Creţ, Victoria; Barbarii, Ligia
2007-01-01
MLPA (Multiplex Ligation-dependent Probe Amplification) is a recently introduced method, based on the PCR principle, that is useful for the detection of different genetic abnormalities (aneuploidies, gene deletions/duplications, subtelomeric rearrangements, methylation status, etc.). The technique is simple, reliable and cheap. We present this method to discuss its importance for a modern genetic service and to underline its multiple advantages.
NASA Astrophysics Data System (ADS)
Utochnikova, V. V.; Grishko, A. Yu.; Koshelev, D. S.; Averin, A. A.; Lepnev, L. S.; Kuzmina, N. P.
2017-12-01
The principles of the "multiphotonic emission", i.e. multiple emission from one lanthanide ion, in heterometallic lanthanide terephthalates were determined. Thanks to it, another system with the same effect, namely EuxY1-x(dbm)3(Phen) (Hdbm - dibenzoylmethanate, Phen - o-phenanthroline (mistape)) was found. The criteria for concentration quenching appearance were formulated and demonstrated.
ERIC Educational Resources Information Center
Capraro, Robert M.; Capraro, Mary Margaret; Barroso, Luciana R.; Morgan, James R.
2016-01-01
In this article the principal investigators of the various projects that comprise Aggie STEM at Texas A&M University discuss the impact and cross-pollination of having graduate students from Turkey working and conducting their research as part of the multi-college Aggie STEM project. Turkish students have been engaged in instrumental roles…
Principled Approaches to Missing Data in Epidemiologic Studies
Perkins, Neil J; Cole, Stephen R; Harel, Ofer; Tchetgen Tchetgen, Eric J; Sun, BaoLuo; Mitchell, Emily M; Schisterman, Enrique F
2018-01-01
Principled methods with which to appropriately analyze missing data have long existed; however, broad implementation of these methods remains challenging. In this and 2 companion papers (Am J Epidemiol. 2018;187(3):576–584 and Am J Epidemiol. 2018;187(3):585–591), we discuss issues pertaining to missing data in the epidemiologic literature. We provide details regarding missing-data mechanisms and nomenclature and encourage the conduct of principled analyses through a detailed comparison of multiple imputation and inverse probability weighting. Data from the Collaborative Perinatal Project, a multisite US study conducted from 1959 to 1974, are used to create a masked data-analytical challenge with missing data induced by known mechanisms. We illustrate the deleterious effects of missing data with naive methods and show how principled methods can sometimes mitigate such effects. For example, when data were missing at random, naive methods showed a spurious protective effect of smoking on the risk of spontaneous abortion (odds ratio (OR) = 0.43, 95% confidence interval (CI): 0.19, 0.93), while implementation of principled methods multiple imputation (OR = 1.30, 95% CI: 0.95, 1.77) or augmented inverse probability weighting (OR = 1.40, 95% CI: 1.00, 1.97) provided estimates closer to the “true” full-data effect (OR = 1.31, 95% CI: 1.05, 1.64). We call for greater acknowledgement of and attention to missing data and for the broad use of principled missing-data methods in epidemiologic research. PMID:29165572
Principled Approaches to Missing Data in Epidemiologic Studies.
Perkins, Neil J; Cole, Stephen R; Harel, Ofer; Tchetgen Tchetgen, Eric J; Sun, BaoLuo; Mitchell, Emily M; Schisterman, Enrique F
2018-03-01
Principled methods with which to appropriately analyze missing data have long existed; however, broad implementation of these methods remains challenging. In this and 2 companion papers (Am J Epidemiol. 2018;187(3):576-584 and Am J Epidemiol. 2018;187(3):585-591), we discuss issues pertaining to missing data in the epidemiologic literature. We provide details regarding missing-data mechanisms and nomenclature and encourage the conduct of principled analyses through a detailed comparison of multiple imputation and inverse probability weighting. Data from the Collaborative Perinatal Project, a multisite US study conducted from 1959 to 1974, are used to create a masked data-analytical challenge with missing data induced by known mechanisms. We illustrate the deleterious effects of missing data with naive methods and show how principled methods can sometimes mitigate such effects. For example, when data were missing at random, naive methods showed a spurious protective effect of smoking on the risk of spontaneous abortion (odds ratio (OR) = 0.43, 95% confidence interval (CI): 0.19, 0.93), while implementation of principled methods multiple imputation (OR = 1.30, 95% CI: 0.95, 1.77) or augmented inverse probability weighting (OR = 1.40, 95% CI: 1.00, 1.97) provided estimates closer to the "true" full-data effect (OR = 1.31, 95% CI: 1.05, 1.64). We call for greater acknowledgement of and attention to missing data and for the broad use of principled missing-data methods in epidemiologic research.
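As a small, self-contained illustration of one of the principled methods compared in the two records above (inverse probability weighting under data missing at random), the NumPy sketch below estimates a mean from incomplete outcomes. The simulated data, the binary stratification and the per-stratum weight estimates are assumptions made for the example; they do not reproduce the Collaborative Perinatal Project analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000

# Fully observed binary covariate X and an outcome Y that depends on X.
x = rng.binomial(1, 0.5, n)
y = rng.normal(loc=2.0 + 1.5 * x, scale=1.0)

# Make Y missing at random (MAR): missingness depends only on the observed X.
p_obs = np.where(x == 1, 0.9, 0.4)
observed = rng.random(n) < p_obs

naive = y[observed].mean()          # complete-case estimate, biased under MAR

# IPW: weight each complete case by 1 / P(observed | X), estimated per stratum.
p_hat = np.array([observed[x == g].mean() for g in (0, 1)])
w = 1.0 / p_hat[x[observed]]
ipw = np.average(y[observed], weights=w)

print(f"true mean {2.0 + 1.5 * 0.5:.3f}  naive {naive:.3f}  IPW {ipw:.3f}")
```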
Detection and quantification of MS lesions using fuzzy topological principles
NASA Astrophysics Data System (ADS)
Udupa, Jayaram K.; Wei, Luogang; Samarasekera, Supun; Miki, Yukio; van Buchem, M. A.; Grossman, Robert I.
1996-04-01
Quantification of the severity of the multiple sclerosis (MS) disease through estimation of lesion volume via MR imaging is vital for understanding and monitoring the disease and its treatment. This paper presents a novel methodology and a system that can be routinely used for segmenting and estimating the volume of MS lesions via dual-echo spin-echo MR imagery. An operator indicates a few points in the images by pointing to the white matter, the gray matter, and the CSF. Each of these objects is then detected as a fuzzy connected set. The holes in the union of these objects correspond to potential lesion sites which are utilized to detect each potential lesion as a fuzzy connected object. These 3D objects are presented to the operator who indicates acceptance/rejection through the click of a mouse button. The volume of accepted lesions is then computed and output. Based on several evaluation studies and over 300 3D data sets that were processed, we conclude that the methodology is highly reliable and consistent, with a coefficient of variation (due to subjective operator actions) of less than 1.0% for volume.
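The fuzzy-connectedness computation at the heart of the method above can be sketched compactly. The Python code below is a generic 2-D illustration, not the authors' system: it propagates a max-min path strength from seed pixels with a Dijkstra-style priority queue, using an intensity-similarity affinity chosen purely for illustration (the parameter sigma and the 4-neighbourhood are assumptions).

```python
import heapq
import numpy as np

def fuzzy_connectedness(image, seeds, sigma=0.1):
    """Return a map of fuzzy connectedness from a set of seed pixels: the
    strength of the best path, where a path's strength is the weakest
    affinity along it, computed by Dijkstra-like max-min propagation."""
    h, w = image.shape
    conn = np.zeros((h, w))
    heap = []
    for r, c in seeds:
        conn[r, c] = 1.0
        heapq.heappush(heap, (-1.0, r, c))
    while heap:
        neg_k, r, c = heapq.heappop(heap)
        k = -neg_k
        if k < conn[r, c]:            # stale queue entry
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < h and 0 <= cc < w:
                # Illustrative affinity: high when neighbouring intensities are similar.
                aff = np.exp(-((image[r, c] - image[rr, cc]) ** 2) / (2 * sigma ** 2))
                strength = min(k, aff)
                if strength > conn[rr, cc]:
                    conn[rr, cc] = strength
                    heapq.heappush(heap, (-strength, rr, cc))
    return conn
```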
A double-correlation tremor-location method
NASA Astrophysics Data System (ADS)
Li, Ka Lok; Sgattoni, Giulia; Sadeghisorkhani, Hamzeh; Roberts, Roland; Gudmundsson, Olafur
2017-02-01
A double-correlation method is introduced to locate tremor sources based on stacks of complex, doubly-correlated tremor records of multiple triplets of seismographs back projected to hypothetical source locations in a geographic grid. Peaks in the resulting stack of moduli are inferred source locations. The stack of the moduli is a robust measure of energy radiated from a point source or point sources even when the velocity information is imprecise. Application to real data shows how double correlation focuses the source mapping compared to the common single correlation approach. Synthetic tests demonstrate the robustness of the method and its resolution limitations which are controlled by the station geometry, the finite frequency of the signal, the quality of the used velocity information and noise level. Both random noise and signal or noise correlated at time shifts that are inconsistent with the assumed velocity structure can be effectively suppressed. Assuming a surface wave velocity, we can constrain the source location even if the surface wave component does not dominate. The method can also in principle be used with body waves in 3-D, although this requires more data and seismographs placed near the source for depth resolution.
Application of the principle of corresponding states to two phase choked flow
NASA Technical Reports Server (NTRS)
Hendricks, R. C.; Simoneau, R. J.
1973-01-01
It is pointed out that several fluids including methane, oxygen, and nitrogen appear to form an average parametric plot which indicates that the isenthalpic Joule-Thomson coefficient must nearly obey the principle of corresponding states. With this as a basis, it was assumed that there could be several thermodynamic flow processes which nearly obey the principle. An examination was made to determine whether two-phase choked flow could be one of them. The analysis is described and the results are given.
2007-06-01
information than another party, such as in the market for used cars. Mankiw, in his book "The Principles of Economics," defines adverse selection as ... Options," PowerPoint presentation e-mailed to the author, 18 May 2007, 1-9. 37 Ibid., 2. 38 Gregory N. Mankiw, Principles of Economics, Third ... the Internet." The American Economic Review. December 1999. 89, No. 5, 1066-1078. Mankiw, Gregory N. Principles of Economics, Third Edition (Mason
Breaking the acoustic diffraction barrier with localization optoacoustic tomography
NASA Astrophysics Data System (ADS)
Deán-Ben, X. Luís.; Razansky, Daniel
2018-02-01
Diffraction causes blurring of high-resolution features in images and has traditionally been associated with the resolution limit in light microscopy and other imaging modalities. The resolution of an imaging system can be generally assessed via its point spread function, corresponding to the image acquired from a point source. However, the precision in determining the position of an isolated source can greatly exceed the diffraction limit. By combining the estimated positions of multiple sources, localization-based imaging has resulted in groundbreaking methods such as super-resolution fluorescence optical microscopy and has also enabled ultrasound imaging of microvascular structures with unprecedented spatial resolution in deep tissues. Herein, we introduce localization optoacoustic tomography (LOT) and discuss the prospects of using localization imaging principles in optoacoustic imaging. LOT was experimentally implemented by real-time imaging of flowing particles in 3D with a recently-developed volumetric optoacoustic tomography system. Provided the particles were separated by a distance larger than the diffraction-limited resolution, their individual locations could be accurately determined in each frame of the acquired image sequence, and the localization image was formed by superimposing a set of points corresponding to the localized positions of the absorbers. The presented results demonstrate that LOT can significantly enhance the well-established advantages of optoacoustic imaging by breaking the acoustic diffraction barrier in deep tissues and mitigating artifacts due to limited-view tomographic acquisitions.
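The localization step that the abstract describes, finding isolated absorbers frame by frame and superimposing their positions on a finer grid, can be illustrated with a short NumPy sketch. The threshold, window size and upsampling factor are assumptions for the example and do not correspond to the reported LOT implementation.

```python
import numpy as np

def localize_frame(frame, threshold, win=2):
    """Find local maxima above a threshold and refine each to sub-pixel
    precision with an intensity-weighted centroid over a small window."""
    points = []
    h, w = frame.shape
    for r in range(win, h - win):
        for c in range(win, w - win):
            patch = frame[r - win:r + win + 1, c - win:c + win + 1]
            if frame[r, c] >= threshold and frame[r, c] == patch.max():
                ys, xs = np.mgrid[-win:win + 1, -win:win + 1]
                m = patch.sum()
                points.append((r + (ys * patch).sum() / m,
                               c + (xs * patch).sum() / m))
    return points

def localization_image(frames, threshold, shape, upsample=4):
    """Superimpose localized positions from all frames onto a finer grid."""
    out = np.zeros((shape[0] * upsample, shape[1] * upsample))
    for frame in frames:
        for r, c in localize_frame(frame, threshold):
            out[int(round(r * upsample)), int(round(c * upsample))] += 1
    return out
```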
Formation Energies of Native Point Defects in Strained-Layer Superlattices (Postprint)
2017-06-05
AFRL-RX-WP-JA-2017-0217: Formation Energies of Native Point Defects in Strained-Layer Superlattices (Postprint), Zhi-Gang Yu... Interim report, 11 September 2013 - 5 November 2016. The work uses first-principles calculations to study native point defect (NPD) formation energies and the absence of mid-gap levels in strained-layer superlattices.
Multiple comparisons in drug efficacy studies: scientific or marketing principles?
Leo, Jonathan
2004-01-01
When researchers design an experiment to compare a given medication to another medication, a behavioral therapy, or a placebo, the experiment often involves numerous comparisons. For instance, there may be several different evaluation methods, raters, and time points. Although scientifically justified, such comparisons can be abused in the interests of drug marketing. This article provides two recent examples of such questionable practices. The first involves the case of the arthritis drug celecoxib (Celebrex), where the study lasted 12 months but the authors only presented 6 months of data. The second case involves the NIMH Multimodal Treatment Study (MTA), which evaluated the efficacy of stimulant medication for attention-deficit hyperactivity disorder and in which ratings made by several groups were reported in contradictory fashion. The MTA authors have not clarified the confusion, at least in print, suggesting that the actual findings of the study may have played little role in the authors' reported conclusions.
An iterative method for obtaining the optimum lightning location on a spherical surface
NASA Technical Reports Server (NTRS)
Chao, Gao; Qiming, MA
1991-01-01
A brief introduction to the basic principles of an eigen method used to obtain the optimum source location of lightning is presented. The location of the optimum source is obtained by using multiple direction finders (DFs) on a spherical surface. An improvement of this method, which takes the source-DF distance as a constant, is presented. It is pointed out that using a weight factor based on signal strength is not ideal because of the inexact inverse relation between signal strength and distance and the inaccuracy of the signal amplitude. An iterative calculation method is presented that uses the distance from the source to the DF as a weight factor. This improved method has higher accuracy and needs only a little more calculation time. Some computer simulations for a 4-DF system are presented to show the improvement in location obtained through use of the iterative method.
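A planar (rather than spherical) toy version of the iterative weighting idea is sketched below: each pass solves a weighted least-squares intersection of the DF bearing lines and then re-weights each DF by the reciprocal of its distance to the current estimate. The geometry, the weight floor and the fixed iteration count are assumptions for illustration only.

```python
import numpy as np

def locate_source(df_pos, bearings, n_iter=5):
    """Planar illustration: df_pos is an (N, 2) array of direction-finder
    positions, bearings are angles (radians, from the x-axis) toward the
    source. Each iteration solves a weighted least-squares intersection of
    the bearing lines, then re-weights each DF by 1/distance to the estimate."""
    normals = np.stack([-np.sin(bearings), np.cos(bearings)], axis=1)  # line normals
    w = np.ones(len(df_pos))
    s = df_pos.mean(axis=0)                  # initial guess: centroid of the DFs
    for _ in range(n_iter):
        # Minimize sum_i w_i (n_i . (s - p_i))^2 via its normal equations.
        A = (w[:, None, None] * normals[:, :, None] * normals[:, None, :]).sum(axis=0)
        b = (w[:, None] * normals * (normals * df_pos).sum(axis=1, keepdims=True)).sum(axis=0)
        s = np.linalg.solve(A, b)
        d = np.linalg.norm(df_pos - s, axis=1)
        w = 1.0 / np.maximum(d, 1e-6)        # distance-based re-weighting
    return s
```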
Non-Equlibrium Driven Dynamics of Continuous Attractors in Place Cell Networks
NASA Astrophysics Data System (ADS)
Zhong, Weishun; Kim, Hyun Jin; Schwab, David; Murugan, Arvind
Attractors have found much use in neuroscience as a means of information processing and decision making. Examples include associative memory with point and continuous attractors, spatial navigation and planning using place cell networks, and dynamic pattern recognition, among others. The functional use of such attractors requires the action of spatially and temporally varying external driving signals, and yet most theoretical work on attractors has been in the limit of small or no drive. We take steps towards understanding the non-equilibrium driven dynamics of continuous attractors in place cell networks. We establish an 'equivalence principle' that relates fluctuations under a time-dependent external force to equilibrium fluctuations in a 'co-moving' frame with only static forces, much like in Newtonian physics. Consequently, we analytically derive a network's capacity to encode multiple attractors as a function of the driving signal size and rate of change.
Entangled trajectories Hamiltonian dynamics for treating quantum nuclear effects
NASA Astrophysics Data System (ADS)
Smith, Brendan; Akimov, Alexey V.
2018-04-01
A simple and robust methodology, dubbed Entangled Trajectories Hamiltonian Dynamics (ETHD), is developed to capture quantum nuclear effects such as tunneling and zero-point energy through the coupling of multiple classical trajectories. The approach reformulates the classically mapped second-order Quantized Hamiltonian Dynamics (QHD-2) in terms of coupled classical trajectories. The method partially enforces the uncertainty principle and facilitates tunneling. The applicability of the method is demonstrated by studying the dynamics in symmetric double well and cubic metastable state potentials. The methodology is validated using exact quantum simulations and is compared to QHD-2. We illustrate its relationship to the rigorous Bohmian quantum potential approach, from which ETHD can be derived. Our simulations show a remarkable agreement of the ETHD calculation with the quantum results, suggesting that ETHD may be a simple and inexpensive way of including quantum nuclear effects in molecular dynamics simulations.
Fuel-optimal trajectories of aeroassisted orbital transfer with plane change
NASA Technical Reports Server (NTRS)
Naidu, Desineni Subbaramaiah; Hibey, Joseph L.
1989-01-01
The problem of minimization of fuel consumption during the atmospheric portion of an aeroassisted, orbital transfer with plane change is addressed. The complete mission has required three characteristic velocities, a deorbit impulse at high earth orbit (HEO), a boost impulse at the atmospheric exit, and a reorbit impulse at low earth orbit (LEO). A performance index has been formulated as the sum of these three impulses. Application of optimal control principles has led to a nonlinear, two-point, boundary value problem which was solved by using a multiple shooting algorithm. The strategy for the atmospheric portion of the minimum-fuel transfer is to start initially with the maximum positive lift in order to recover from the downward plunge, and then to fly with a gradually decreasing lift such that the vehicle skips out of the atmosphere with a flight path angle near zero degrees.
National neonatal data to support specialist care and improve infant outcomes.
Spencer, Andrew; Modi, Neena
2013-03-01
'Liberating the NHS' and the new Outcomes Framework make information central to the management of the UK National Health Service (NHS). The principles of patient choice and government policy on the transparency of outcomes for public services are key drivers for improving the performance. Specialist neonatal care is able to respond positively to these challenges owing to the development of a well-defined dataset and comprehensive national data collection. When combined with analysis, audit and feedback at the national level, this is proving to be an effective means to harness the potential of clinical data. Other key characteristics have been an integrated approach to ensure that data are captured once and serve multiple needs, collaboration between professional organisations, parents, academic institutions, the commercial sector and NHS managers, and responsiveness to changing requirements. The authors discuss these aspects of national neonatal specialist data and point to future developments.
Phonon-defect scattering and thermal transport in semiconductors: developing guiding principles
NASA Astrophysics Data System (ADS)
Polanco, Carlos; Lindsay, Lucas
First principles calculations of thermal conductivity have shown remarkable agreement with measurements for high-quality crystals. Nevertheless, most materials contain defects that provide significant extrinsic resistance and lower the conductivity from that of a perfect sample. This effect is usually accounted for with simplified analytical models that neglect the atomistic details of the defect and the exact dynamical properties of the system, which limits prediction capabilities. Recently, a method based on Green's functions was developed to calculate phonon-defect scattering rates from first principles. This method has shown the important role of point defects in determining thermal transport in diamond and boron arsenide, two competitors for the highest bulk thermal conductivity. Here, we study the role of point defects in other relatively high thermal conductivity semiconductors, e.g., BN, BeSe, SiC, GaN and Si. We compare their first-principles defect-phonon scattering rates and effects on transport properties with those from simplified models and explore common principles that determine these. Efforts will focus on basic vibrational properties that vary from system to system, such as density of states, interatomic force constants and defect deformation. Research supported by the U.S. Department of Energy, Basic Energy Sciences, Materials Sciences and Engineering Division.
A Proof of the Occupancy Principle and the Mean-Transit-Time Theorem for Compartmental Models
RAMAKRISHNAN, RAJASEKHAR; LEONARD, EDWARD F.; DELL, RALPH B.
2012-01-01
The occupancy principle and the mean-transit-time theorem are derived for the passage of a tracer through a system that can be described by a general pool model. It is proved, using matrix theory, that if (and only if) tracer entering the system labels equally all tracee fluxes into the system, then the integral of the tracer concentration is the same in all the pools. It is also proved that if, in addition, all flow out of the system is through the observation point, the first moment of the tracer concentration at the observation point can be used to calculate the total amount of tracee in the system. The necessity of this condition is analyzed. Examples are given of models in which the occupancy principle and the mean-transit-time theorem hold or do not hold. PMID:22328793
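In a common notation (chosen here, and not necessarily the paper's), the two results summarized above can be written as follows.

```latex
% Occupancy principle: if incoming tracer labels all tracee inflows equally,
% the time-integral of the tracer concentration c_i(t) is the same in every pool i:
\int_0^{\infty} c_i(t)\,dt \;=\; \int_0^{\infty} c_j(t)\,dt
\qquad \text{for all pools } i, j.

% Mean-transit-time theorem: if in addition all outflow passes the observation
% point, the mean transit time \bar{t} obtained from the observed concentration
% c(t) links the steady-state flux F to the total tracee mass M in the system:
\bar{t} \;=\; \frac{\int_0^{\infty} t\,c(t)\,dt}{\int_0^{\infty} c(t)\,dt},
\qquad
M \;=\; F\,\bar{t}.
```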
Saeedi, Ehsan; Kong, Yinan
2017-01-01
In this paper, we propose a novel parallel architecture for fast hardware implementation of elliptic curve point multiplication (ECPM), which is the key operation of an elliptic curve cryptography processor. The point multiplication over binary fields is synthesized on both FPGA and ASIC technology by designing fast elliptic curve group operations in Jacobian projective coordinates. A novel combined point doubling and point addition (PDPA) architecture is proposed for group operations to achieve high speed and low hardware requirements for ECPM. It has been implemented over the binary field which is recommended by the National Institute of Standards and Technology (NIST). The proposed ECPM supports two Koblitz and random curves for the key sizes 233 and 163 bits. For group operations, a finite-field arithmetic operation, e.g. multiplication, is designed on a polynomial basis. The delay of a 233-bit point multiplication is only 3.05 and 3.56 μs, in a Xilinx Virtex-7 FPGA, for Koblitz and random curves, respectively, and 0.81 μs in an ASIC 65-nm technology, which are the fastest hardware implementation results reported in the literature to date. In addition, a 163-bit point multiplication is also implemented in FPGA and ASIC for fair comparison which takes around 0.33 and 0.46 μs, respectively. The area-time product of the proposed point multiplication is very low compared to similar designs. The performance (1/(Area × Time) = 1/AT) and Area × Time × Energy (ATE) product of the proposed design are far better than the most significant studies found in the literature. PMID:28459831
Hossain, Md Selim; Saeedi, Ehsan; Kong, Yinan
2017-01-01
In this paper, we propose a novel parallel architecture for fast hardware implementation of elliptic curve point multiplication (ECPM), which is the key operation of an elliptic curve cryptography processor. The point multiplication over binary fields is synthesized on both FPGA and ASIC technology by designing fast elliptic curve group operations in Jacobian projective coordinates. A novel combined point doubling and point addition (PDPA) architecture is proposed for group operations to achieve high speed and low hardware requirements for ECPM. It has been implemented over the binary field which is recommended by the National Institute of Standards and Technology (NIST). The proposed ECPM supports two Koblitz and random curves for the key sizes 233 and 163 bits. For group operations, a finite-field arithmetic operation, e.g. multiplication, is designed on a polynomial basis. The delay of a 233-bit point multiplication is only 3.05 and 3.56 μs, in a Xilinx Virtex-7 FPGA, for Koblitz and random curves, respectively, and 0.81 μs in an ASIC 65-nm technology, which are the fastest hardware implementation results reported in the literature to date. In addition, a 163-bit point multiplication is also implemented in FPGA and ASIC for fair comparison which takes around 0.33 and 0.46 μs, respectively. The area-time product of the proposed point multiplication is very low compared to similar designs. The performance ([Formula: see text]) and Area × Time × Energy (ATE) product of the proposed design are far better than the most significant studies found in the literature.
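For orientation, the sketch below shows what elliptic curve point multiplication computes, using a plain affine double-and-add over a toy prime-field curve. It deliberately does not model the papers' design: they target NIST binary-field curves in Jacobian projective coordinates with a combined point-doubling/point-addition (PDPA) datapath, and the curve parameters below are assumptions chosen only so the example runs.

```python
# Minimal affine double-and-add point multiplication over a toy curve
# y^2 = x^3 + A*x + B (mod P_MOD). Illustrative only.
P_MOD, A, B = 97, 2, 3        # toy curve parameters (assumed, not from the papers)

def inv_mod(x, p=P_MOD):
    return pow(x, p - 2, p)   # Fermat inverse; p is prime

def point_add(P, Q):
    """Add two affine points; None represents the identity element."""
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None           # P + (-P) = identity
    if P == Q:
        lam = (3 * x1 * x1 + A) * inv_mod(2 * y1) % P_MOD   # tangent slope
    else:
        lam = (y2 - y1) * inv_mod(x2 - x1) % P_MOD          # chord slope
    x3 = (lam * lam - x1 - x2) % P_MOD
    return (x3, (lam * (x1 - x3) - y1) % P_MOD)

def scalar_mult(k, P):
    """Left-to-right double-and-add: computes k*P."""
    R = None
    for bit in bin(k)[2:]:
        R = point_add(R, R)           # double
        if bit == "1":
            R = point_add(R, P)       # add
    return R

# Example: (3, 6) lies on the toy curve (6^2 = 3^3 + 2*3 + 3 mod 97).
print(scalar_mult(20, (3, 6)))
```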
[Causation in the court: the complex case of malignant mesothelioma].
Lageard, Giovanni
2011-01-01
The aim of this paper is to carry out an analysis of the legal evolution in Italy of the assessment of causation i.e. cause and effect, in oncological diseases, a question taken into consideration by the High Court almost exclusively with reference to pleural mesothelioma. The most debated question when defining the causal association between asbestos exposure and mesothelioma is the possible role that any multiple potentially causative exposures could assume in the induction and development of the disease, and in particular the role of any asbestos exposure over the successive employment periods. Indeed, this is a subject on which, to date, no agreement has yet been reached in scientific doctrine: these divergences bear important practical significance from a legal point of view, since sustaining one thesis or another may constitute determining factors when ascertaining responsibility for individuals who, in the past, had decisional statuses in the workplace. Jurisprudence in the High Court took on an oscillating position on this question as from the early 2000s, which was divided into those who sustained the thesis of the relevance of any asbestos exposure over the successive employment periods and those who were of a different opinion, i.e. only the first exposure period has relevant causative effect. The point under discussion concerns, in particular, the adequacy of a probabilistic law only governing such a question. An important turning point was made in the year 2010 when two sentences were announced in the High Court, reiterating, in strict compliance with the principles affirmed by the United Sections in 2002, that a judge cannot, and must not, be satisfied with a general causation, but must rather reach a judgment on the basis of an individual causation. In particular, not only did the second of these two sentences recognise the multifactorial nature of mesothelioma, something which had almost always been denied in jurisprudence in the past, but it also established some very clear legal principles of law. Essentially, when ascertaining the causation, a judge should verify whether or not there is a sufficiently well established scientific law covering the question and whether such a law is universal or probabilistic. Should the latter be the case, then it is necessary to establish if the accelerating effect has been determined in the case in question, on the basis of the factual acquisitions. We must now wait for the concrete application of these principles by juridical bodies.
PubMed on Tap: discovering design principles for online information delivery to handheld computers.
Hauser, Susan E; Demner-Fushman, Dina; Ford, Glenn; Thoma, George R
2004-01-01
Online access to biomedical information from handheld computers will be a valuable adjunct to other popular medical applications if information delivery systems are designed with handheld computers in mind. The goal of this project is to discover design principles to facilitate practitioners' access to online medical information at the point of care. A prototype system was developed to serve as a testbed for this research. Using the testbed, an initial evaluation has yielded several user interface design principles. Continued research is expected to discover additional user interface design principles as well as guidelines for results organization and system performance.
ERIC Educational Resources Information Center
McNamee, Paul; Madden, Dave; McNamee, Frank; Wall, John; Hurst, Alan; Vrasidas, Charalambos; Chanquoy, Lucile; Baccino, Thierry; Acar, Emrah; Onwy-Yazici, Ela; Jordan, Ann
2009-01-01
This paper describes an ongoing EU project concerned with developing an instructional design framework for virtual classes (VC) that is based on the theory of Multiple Intelligences (MI) (1983). The psychological theory of Multiple Intelligences (Gardner 1983) has received much credence within instructional design since its inception and has been…
1991-11-01
Tilted Rough Disc," Donald J. Schertler and Nicholas George "Image Deblurring for Multiple-Point Impulse Responses," Bryan J. Stossel and Nicholas George...Rough Disc Donald J. Schertler Nicholas George Image Deblurring for Multiple-Point Impulse Bryan J. Stossel Responses Nicholas George z 0 zw V) w LU 0...number of impulses present in the degradation. IMAGE DEBLURRING FOR MULTIPLE-POINT IMPULSE RESPONSESt Bryan J. Stossel Nicholas George Institute of Optics
ERIC Educational Resources Information Center
Loehr, Peter
This paper presents W. Edwards Deming's 14 management points, 7 deadly diseases, and 4 obstacles that thwart productivity, and discusses how these principles relate to teaching and learning. Application of these principles is expected to increase the quality of learning in classrooms from kindergarten through graduate level. Examples of the…
41 CFR Appendix A to Subpart E of... - 3-Key Points and Principles
Code of Federal Regulations, 2012 CFR
2012-01-01
... Academy of Sciences or the National Academy of Public Administration? Pt. 102-3, Subpt. E, App. A Appendix... principles Section(s) Question(s) Guidance I. Section 15 of the Act allows the National Academy of Sciences... these circumstances, neither the existence of the funding agreement nor the fact that it contemplates...
41 CFR Appendix A to Subpart E of... - 3-Key Points and Principles
Code of Federal Regulations, 2011 CFR
2011-01-01
... Academy of Sciences or the National Academy of Public Administration? Pt. 102-3, Subpt. E, App. A Appendix... principles Section(s) Question(s) Guidance I. Section 15 of the Act allows the National Academy of Sciences... these circumstances, neither the existence of the funding agreement nor the fact that it contemplates...
41 CFR Appendix A to Subpart E of... - 3-Key Points and Principles
Code of Federal Regulations, 2010 CFR
2010-07-01
... Academy of Sciences or the National Academy of Public Administration? Pt. 102-3, Subpt. E, App. A Appendix... principles Section(s) Question(s) Guidance I. Section 15 of the Act allows the National Academy of Sciences... these circumstances, neither the existence of the funding agreement nor the fact that it contemplates...
41 CFR Appendix A to Subpart E of... - 3-Key Points and Principles
Code of Federal Regulations, 2014 CFR
2014-01-01
... Academy of Sciences or the National Academy of Public Administration? Pt. 102-3, Subpt. E, App. A Appendix... principles Section(s) Question(s) Guidance I. Section 15 of the Act allows the National Academy of Sciences... these circumstances, neither the existence of the funding agreement nor the fact that it contemplates...
41 CFR Appendix A to Subpart E of... - 3-Key Points and Principles
Code of Federal Regulations, 2013 CFR
2013-07-01
... Academy of Sciences or the National Academy of Public Administration? Pt. 102-3, Subpt. E, App. A Appendix... principles Section(s) Question(s) Guidance I. Section 15 of the Act allows the National Academy of Sciences... these circumstances, neither the existence of the funding agreement nor the fact that it contemplates...
Faculty Opinions Regarding the Philosophical Principles of Total Quality Management (TQM).
ERIC Educational Resources Information Center
Aliff, John Vincent
The 14 points of the Total Quality Management (TQM) model can be distilled into the following 5 main guiding principles: establish a moral purpose for the institution, use cooperative efforts instead of individual efforts, stop the use of inspection (testing) to improve students and teachers, continuously improve the system and its products, and…
Three Principles to Improve Outcomes for Children and Families. Science to Policy and Practice
ERIC Educational Resources Information Center
Cohen, Steven D.
2017-01-01
The science of child development and the core capabilities of adults point to a set of "design principles" that policymakers and practitioners in many different sectors can use to improve outcomes for children and families. That is, to be maximally effective, policies and services should: (1) support responsive relationships for children…
Reflections Catalyzed by an Assault on a Favorite Principle
ERIC Educational Resources Information Center
Schultz, Emeric
2010-01-01
This commentary disagrees with a recent submission (Cheung, D. J. Chem. Educ. 2009, 86, 514-518) questioning the value of the Le Châtelier principle (LCP). Cheung points out that the LCP fails to predict the proper change in a small set of chemical equilibria. This commentary argues that the LCP has great qualitative utility in correctly…
Henry, C Jeya K; Xin, Janice Lim Wen
2014-06-01
The local manufacture of ready-to-use therapeutic foods (RUTFs) is increasing, and there is a need to develop methods to ensure their safe production. We propose the application of Hazard Analysis Critical Control Point (HACCP) principles to achieve this goal. The basic principles of HACCP in the production of RUTFs are outlined. It is concluded that the implementation of an HACCP system in the manufacture of RUTFs is not only feasible but also attainable. The introduction of good manufacturing practices, coupled with an effective HACCP system, will ensure that RUTFs are produced in a cost-effective, safe, and hygienic manner.
Wiesmeth, Hans; Häckl, Dennis
2011-09-01
This paper investigates the concept of extended producer responsibility (EPR) from an economic point of view. Particular importance will be placed on the concept of 'economic feasibility' of an EPR policy, which should guide decision-making in this context. Moreover, the importance of the core EPR principle of 'integrating signals throughout the product chain' into the incentive structure will be demonstrated with experiences from Germany. These examples refer to sales packaging consumption, refillable drinks packages and waste electrical and electronic equipment collection. As a general conclusion, the interaction between economic principles and technological development needs to be observed carefully when designing incentive-compatible EPR policies.
Principles for a Successful Computerized Physician Order Entry Implementation
Ash, Joan S.; Fournier, Lara; Stavri, P. Zoë; Dykstra, Richard
2003-01-01
To identify success factors for implementing computerized physician order entry (CPOE), our research team took both a top-down and a bottom-up approach and reconciled the results to develop twelve overarching principles to guide implementation. A consensus panel of experts produced ten Considerations with nearly 150 sub-considerations, and a three-year project using qualitative methods at multiple successful sites for a grounded theory approach yielded ten general themes with 24 sub-themes. After reconciliation using a meta-matrix approach, twelve Principles emerged, which cluster into groups forming the mnemonic CPOE. Computer technology principles include: temporal concerns; technology and meeting information needs; multidimensional integration; and costs. Personal principles are: value to users and tradeoffs; essential people; and training and support. Organizational principles include: foundational underpinnings; collaborative project management; terms, concepts and connotations; and improvement through evaluation and learning. Finally, Environmental issues include the motivation and context for implementing such systems. PMID:14728129
Basing Science Ethics on Respect for Human Dignity.
Aközer, Mehmet; Aközer, Emel
2016-12-01
A "no ethics" principle has long been prevalent in science and has demotivated deliberation on scientific ethics. This paper argues the following: (1) An understanding of a scientific "ethos" based on actual "value preferences" and "value repugnances" prevalent in the scientific community permits and demands critical accounts of the "no ethics" principle in science. (2) The roots of this principle may be traced to a repugnance of human dignity, which was instilled at a historical breaking point in the interrelation between science and ethics. This breaking point involved granting science the exclusive mandate to pass judgment on the life worth living. (3) By contrast, respect for human dignity, in its Kantian definition as "the absolute inner worth of being human," should be adopted as the basis to ground science ethics. (4) The pathway from this foundation to the articulation of an ethical duty specific to scientific practice, i.e., respect for objective truth, is charted by Karl Popper's discussion of the ethical principles that form the basis of science. This also permits an integrated account of the "external" and "internal" ethical problems in science. (5) Principles of the respect for human dignity and the respect for objective truth are also safeguards of epistemic integrity. Plain defiance of human dignity by genetic determinism has compromised integrity of claims to knowledge in behavioral genetics and other behavioral sciences. Disregard of the ethical principles that form the basis of science threatens epistemic integrity.
Omery, A
1991-09-01
The purposes of this article were to provide insight into the process of ethics and ethical inquiry and to explore the ethical issues of culpability and pain management/control. Critical care nurses who currently care for vascular patients identified these issues as occurring frequently in their practice. Authors in critical care nursing generally have limited the process of ethical inquiry to a theoretical framework built around an ethic of principles. The message many critical care nurses heard was that this one type of theoretical ethical framework was the totality of ethics. The application of these principles was ethical inquiry. For some nurses, the ethic of principles is sufficient. For others, an ethic of principles is either incomplete or foreign. This second group of nurses may believe that they have no moral voice if the language of ethics is only the language of principles. The language of principles, however, is not the only theoretical framework available. There is also the ethic of care, and ethical inquiry can include the application of that framework. Indeed, the language of the ethic of care may give a voice to nurses who previously felt morally mute. In fact, these two theoretical frameworks are not the only frameworks available to nurses. There is also virtue ethics, a framework not discussed in this article. A multiplicity of ethical frameworks is available for nurses to use in analyzing their professional and personal dilemmas. Recognizing that multiplicity, nurses can analyze their ethical dilemmas more comprehensively and effectively. Applying differing ethical frameworks can result in the same conclusions. This was the case for the issue of culpability.(ABSTRACT TRUNCATED AT 250 WORDS)
ERIC Educational Resources Information Center
Park, Sanghoon
2015-01-01
Animated pedagogical agents have become popular in multimedia learning with combined delivery of verbal and non-verbal forms of information. In order to reduce unnecessary cognitive load caused by such multiple forms of information and also to foster generative cognitive processing, multimedia design principles with social cues are suggested…
ERIC Educational Resources Information Center
Sahney, Sangeeta
2016-01-01
Purpose: Educational institutes must embrace the principles of total quality management (TQM) if they seek to remain competitive, and survive and succeed in the long run. An educational institution must embrace the principles of quality management and incorporate them into all of their activities. Starting with a theoretical background, the paper…
NASA Astrophysics Data System (ADS)
Saha, Ashim Kumar; Yoshiya, Masato
2018-03-01
Stability of native point defect species and optical properties are quantitatively examined through first principles calculations in order to identify possible native point defect species in MoS2 and their influences on electronic structures and the resultant optical properties. Possible native point defect species are identified as functions of the thermodynamic environment and the location of the Fermi level in MoS2. It is found that sulphur vacancies can be introduced more easily than other point defect species and will create impurity levels both in the bandgap and in the valence band. Additionally, antisite Mo and/or Mo vacancies can be created depending on the chemical potential of sulphur, both of which will create impurity levels in the bandgap and in the valence band. Those impurity levels result in pronounced photon absorption in the visible-light region, though each of these point defects alone has limited impact on the optical properties as long as its concentration remains low. Thus, attention must be paid when intentional impurity doping is made to MoS2 to avoid unwanted modification of the optical properties of MoS2. These impurity levels may enable further exploitation of photovoltaic energy conversion at longer wavelengths.
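For context, defect formation energies of the kind referred to above are normally computed with the standard supercell expression sketched below; the notation is generic, and any finite-size or potential-alignment corrections used in the actual study are folded into a single term here as an assumption.

```latex
% Formation energy of defect D in charge state q (standard supercell formalism):
%   E_tot[D^q]  - total energy of the defective supercell
%   E_tot[bulk] - total energy of the perfect supercell
%   n_i         - number of atoms of species i added (n_i > 0) or removed (n_i < 0)
%   mu_i        - chemical potential of species i (e.g., Mo-rich vs. S-rich conditions)
%   E_F         - Fermi level, referenced to the valence-band maximum E_VBM
%   E_corr      - any finite-size / alignment correction
E_f[D^{q}] \;=\; E_{\mathrm{tot}}[D^{q}] - E_{\mathrm{tot}}[\mathrm{bulk}]
\;-\; \sum_i n_i \mu_i \;+\; q\,(E_{\mathrm{F}} + E_{\mathrm{VBM}}) \;+\; E_{\mathrm{corr}}.
```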
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blengini, Gian Andrea, E-mail: blengini@polito.it; CNR-IGAG, Institute of Environmental Geology and Geo-Engineering, Corso Duca degli Abruzzi 24, 10129 Turin; Busto, Mirko, E-mail: mirko.busto@polito.it
Highlights: A new eco-efficient recycling route for post-consumer waste glass was implemented. Integrated waste management and industrial production are crucial to green products. Most of the waste glass rejects are sent back to the glass industry. Recovered co-products give more environmental gains than does avoided landfill. Energy-intensive recycling must be limited to waste that cannot be closed-loop recycled. Abstract: As part of the EU Life+ NOVEDI project, a new eco-efficient recycling route has been implemented to maximise resources and energy recovery from post-consumer waste glass, through integrated waste management and industrial production. Life cycle assessment (LCA) has been used to identify engineering solutions to sustainability during the development of green building products. The new process and the related LCA are framed within a meaningful case of industrial symbiosis, where multiple waste streams are utilised in a multi-output industrial process. The input is a mix of rejected waste glass from conventional container glass recycling and waste special glass such as monitor glass, bulbs and glass fibres. The green building product is a recycled foam glass (RFG) to be used in high efficiency thermally insulating and lightweight concrete. The environmental gains have been contrasted against induced impacts and improvements have been proposed. Recovered co-products, such as glass fragments/powders, plastics and metals, correspond to environmental gains that are higher than those related to landfill avoidance, whereas the latter is cancelled due to increased transportation distances. In accordance with an eco-efficiency principle, it has been highlighted that recourse to highly energy-intensive recycling should be limited to waste that cannot be closed-loop recycled.
Learning with Multiple Representations: Extending Multimedia Learning beyond the Lab
ERIC Educational Resources Information Center
Eilam, Billie; Poyas, Yael
2008-01-01
The present study extended multimedia learning principles beyond the lab to an ecologically valid setting (homework). Eighteen information cards were used to perform three homework tasks. The control group students learned from single representation (SR) cards that presented all information as printed text. The multiple representation (MR) group…
Integrating Systems-Based Practice, Community Psychiatry, and Recovery into Residency Training
ERIC Educational Resources Information Center
LeMelle, Stephanie; Arbuckle, Melissa R.; Ranz, Jules M.
2013-01-01
Background: Behavioral health services involving multiple systems of care are increasingly being provided in community as well as hospital settings. Residents therefore should be familiar with multiple systems and the role of the psychiatrist in these systems. The authors describe a curriculum incorporating principles of systems-based practice…
Multiple Family Group Counseling
ERIC Educational Resources Information Center
Sauber, S. Richard
1971-01-01
This article describes the innovative, short term approach of multiple family group counseling in which the counseling applies the principles and dynamics found in family and group counseling to the treatment of the student and his family. Several family units met together to discuss the problems that adversely affect the adolescent and result in…
Surrogate end points in women's health research: science, protoscience, and pseudoscience.
Grimes, David A; Schulz, Kenneth F; Raymond, Elizabeth G
2010-04-01
A surrogate end point (e.g., a laboratory test or image) serves as a proxy for a clinical end point of importance (e.g., fracture, thrombosis, or death). Adoption and use of surrogate end points lacking validation, especially in cardiovascular medicine, have caused thousands of patients' deaths, a serious violation of the ethical principle of beneficence. Copyright 2010 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
Experiments To Demonstrate Chemical Process Safety Principles.
ERIC Educational Resources Information Center
Dorathy, Brian D.; Mooers, Jamisue A.; Warren, Matthew M.; Mich, Jennifer L.; Murhammer, David W.
2001-01-01
Points out the need to educate undergraduate chemical engineering students on chemical process safety and introduces the content of a chemical process safety course offered at the University of Iowa. Presents laboratory experiments demonstrating flammability limits, flash points, electrostatics, runaway reactions, explosions, and relief design.…
Multiple descriptions based on multirate coding for JPEG 2000 and H.264/AVC.
Tillo, Tammam; Baccaglini, Enrico; Olmo, Gabriella
2010-07-01
Multiple description coding (MDC) makes use of redundant representations of multimedia data to achieve resiliency. Descriptions should be generated so that the quality obtained when decoding a subset of them only depends on their number and not on the particular received subset. In this paper, we propose a method based on the principle of encoding the source at several rates, and properly blending the data encoded at different rates to generate the descriptions. The aim is to achieve efficient redundancy exploitation, and easy adaptation to different network scenarios by means of fine tuning of the encoder parameters. We apply this principle to both JPEG 2000 images and H.264/AVC video data. We consider as the reference scenario the distribution of contents on application-layer overlays with multiple-tree topology. The experimental results reveal that our method favorably compares with state-of-art MDC techniques.
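As a rough illustration of the blending idea described above (not the authors' JPEG 2000/H.264 pipeline), the sketch below interleaves blocks coded at a high and a low rate into two descriptions, so that either description alone decodes every block, half of them at reduced quality. The block structure and the codec callables are assumptions made for the example.

```python
# Minimal sketch of two-description generation by blending blocks coded at two
# rates. "encode_high"/"encode_low" stand in for any codec pass (e.g., two quality
# layers); a receiver holding both descriptions keeps the higher-rate copy of each block.

def make_descriptions(blocks, encode_high, encode_low):
    d1, d2 = [], []
    for i, b in enumerate(blocks):
        hi, lo = encode_high(b), encode_low(b)
        if i % 2 == 0:
            d1.append(hi); d2.append(lo)   # description 1 carries block i at high rate
        else:
            d1.append(lo); d2.append(hi)   # description 2 carries block i at high rate
    return d1, d2

blocks = ["b0", "b1", "b2", "b3"]
d1, d2 = make_descriptions(blocks, lambda b: (b, "hi"), lambda b: (b, "lo"))
print(d1)   # [('b0', 'hi'), ('b1', 'lo'), ('b2', 'hi'), ('b3', 'lo')]
print(d2)   # [('b0', 'lo'), ('b1', 'hi'), ('b2', 'lo'), ('b3', 'hi')]
```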
NASA Astrophysics Data System (ADS)
Werner, C. L.; Wegmuller, U.; Strozzi, T.; Wiesmann, A.
2006-12-01
Principal contributors to the noise in differential SAR interferograms are the temporal phase stability of the surface, geometry relating to baseline and surface slope, and propagation path delay variations due to tropospheric water vapor and the ionosphere. Time series analysis of multiple interferograms generated from a stack of SAR SLC images seeks to determine the deformation history of the surface while reducing errors. Only those scatterers within a resolution element that are stable and coherent for each interferometric pair contribute to the desired deformation signal. Interferograms with baselines exceeding 1/3 of the critical baseline have substantial geometrical decorrelation for distributed targets. Short-baseline pairs with multiple reference scenes can be combined using least-squares estimation to obtain a global deformation solution. Alternatively, point-like persistent scatterers, which do not exhibit the geometrical decorrelation associated with large baselines, can be identified; in this approach interferograms are formed from a stack of SAR complex images using a single reference scene, but stable distributed-scatterer pixels are excluded due to the presence of large baselines. We apply both point-based and short-baseline methodologies and compare results for a stack of fine-beam Radarsat data acquired in 2002-2004 over a rapidly subsiding oil field near Lost Hills, CA. We also investigate the density of point-like scatterers with respect to image resolution. The primary difficulty encountered when applying time series methods is phase unwrapping errors due to spatial and temporal gaps. Phase unwrapping requires sufficient spatial and temporal sampling. Increasing the SAR range bandwidth increases the range resolution as well as the critical interferometric baseline that defines the required satellite orbital tube diameter. Sufficient spatial sampling also permits unwrapping because of the reduced phase-per-pixel gradient. Short time intervals further reduce the differential phase due to deformation when the deformation is continuous. Lower-frequency systems (L- vs. C-band) substantially improve the ability to unwrap the phase correctly by directly reducing both the interferometric phase amplitude and temporal decorrelation.
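The short-baseline least-squares combination step mentioned above can be sketched as follows. The acquisition dates, interferometric pairs, and unwrapped phases are invented, and the sketch omits atmospheric filtering and any weighting of the observations.

```python
# Toy least-squares combination of short-baseline interferograms into a phase
# (deformation) history, as in SBAS-style processing. Dates and phases are made up.
import numpy as np

dates = [0, 35, 70, 105]                      # acquisition times (days)
pairs = [(0, 1), (1, 2), (0, 2), (2, 3)]      # interferometric pairs (indices into dates)
dphi  = np.array([0.6, 0.5, 1.1, 0.4])        # unwrapped phase of each interferogram (rad)

# Design matrix: phase of pair (i, j) = phi[j] - phi[i]; phi[0] fixed to 0 as reference.
A = np.zeros((len(pairs), len(dates) - 1))
for row, (i, j) in enumerate(pairs):
    if j > 0: A[row, j - 1] += 1.0
    if i > 0: A[row, i - 1] -= 1.0

phi, *_ = np.linalg.lstsq(A, dphi, rcond=None)  # phase history at dates 1..N relative to date 0
print("recovered phase history (rad):", np.round(phi, 3))
```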
DOE Office of Scientific and Technical Information (OSTI.GOV)
MacGregor, B.R.; McCoy, A.E.; Wickramasekara, S., E-mail: wickrama@grinnell.edu
2012-09-15
We present a formalism of Galilean quantum mechanics in non-inertial reference frames and discuss its implications for the equivalence principle. This extension of quantum mechanics rests on the Galilean line group, the semidirect product of the real line and the group of analytic functions from the real line to the Euclidean group in three dimensions. This group provides transformations between all inertial and non-inertial reference frames and contains the Galilei group as a subgroup. We construct a certain class of unitary representations of the Galilean line group and show that these representations determine the structure of quantum mechanics in non-inertial reference frames. Our representations of the Galilean line group contain the usual unitary projective representations of the Galilei group, but have a more intricate cocycle structure. The transformation formula for the Hamiltonian under the Galilean line group shows that in a non-inertial reference frame it acquires a fictitious potential energy term that is proportional to the inertial mass, suggesting the equivalence of inertial mass and gravitational mass in quantum mechanics. - Highlights: A formulation of Galilean quantum mechanics in non-inertial reference frames is given. The key concept is the Galilean line group, an infinite-dimensional group. Unitary, cocycle representations of the Galilean line group are constructed. A non-central extension of the group underlies these representations. The quantum equivalence principle and gravity emerge from these representations.
Goldhaber-Fiebert, Sara N; Macrae, Carl
2018-03-01
How can teams manage critical events more effectively? There are commonly gaps in performance during perioperative crises, and emergency manuals are recently available tools that can improve team performance under stress, via multiple mechanisms. This article examines how the principles of implementation science and quality improvement were applied by multiple teams in the development, testing, and systematic implementations of emergency manuals in perioperative care. The core principles of implementation have relevance for future patient safety innovations perioperatively and beyond, and the concepts of emergency manuals and interprofessional teamwork are applicable for diverse fields throughout health care. Copyright © 2017 Sara N. Goldhaber-Fiebert, Carl Macrae. Published by Elsevier Inc. All rights reserved.
The principle of proportionality revisited: interpretations and applications.
Hermerén, Göran
2012-11-01
The principle of proportionality is used in many different contexts. Some of these uses and contexts are first briefly indicated. This paper focusses on the use of this principle as a moral principle. I argue that under certain conditions the principle of proportionality is helpful as a guide in decision-making. But it needs to be clarified and to be used with some flexibility as a context-dependent principle. Several interpretations of the principle are distinguished, using three conditions as a starting point: importance of objective, relevance of means, and most favourable option. The principle is then tested against an example, which suggests that a fourth condition, focusing on non-excessiveness, needs to be added. I will distinguish between three main interpretations of the principle, some primarily with uses in research ethics, others with uses in other areas of bioethics, for instance in comparisons of therapeutic means and ends. The relations between the principle of proportionality and the precautionary principle are explored in the following section. It is concluded that the principles are different and may even clash. In the next section the principle of proportionality is applied to some medical examples drawn from research ethics and bioethics. In concluding, the status of the principle of proportionality as a moral principle is discussed. What has been achieved so far and what remains to be done is finally summarized.
A multiple pointing-mount control strategy for space platforms
NASA Technical Reports Server (NTRS)
Johnson, C. D.
1992-01-01
A new disturbance-adaptive control strategy for multiple pointing-mount space platforms is proposed and illustrated by consideration of a simplified 3-link dynamic model of a multiple pointing-mount space platform. Simulation results demonstrate the effectiveness of the new platform control strategy. The simulation results also reveal a system 'destabilization phenomenon' that can occur if the set of individual platform-mounted experiment controllers is 'too responsive.'
Vana, Kimberly D; Silva, Graciela E; Muzyka, Diann; Hirani, Lorraine M
2011-06-01
It has been proposed that students' use of an audience response system, commonly called clickers, may promote comprehension and retention of didactic material. Whether this method actually improves students' grades, however, is still not determined. The purpose of this study was to evaluate whether a lecture format utilizing multiple-choice PowerPoint slides and an audience response system was more effective than a lecture format using only multiple-choice PowerPoint slides in the comprehension and retention of pharmacological knowledge in baccalaureate nursing students. The study also assessed whether the additional use of clickers positively affected students' satisfaction with their learning. Results from 78 students who attended lecture classes with multiple-choice PowerPoint slides plus clickers were compared with those of 55 students who utilized multiple-choice PowerPoint slides only. Test scores between these two groups were not significantly different. A satisfaction questionnaire showed that 72.2% of the control students did not desire the opportunity to use clickers. Of the group utilizing the clickers, 92.3% recommended the use of this system in future courses. The use of multiple-choice PowerPoint slides and an audience response system did not seem to improve the students' comprehension or retention of pharmacological knowledge as compared with those who used solely multiple-choice PowerPoint slides.
Second thoughts about privacy, safety and deception
NASA Astrophysics Data System (ADS)
Sorell, Tom; Draper, Heather
2017-07-01
In this paper, we point out some difficulties with interpreting three of five principles formulated at a retreat on robot ethics sponsored by the Arts and Humanities Council and the Engineering and Physical Sciences Research Council. We also attempt to iron out some conflicts between the principles. Some of the difficulties arise from the way that the autonomy of robot users - their capacity to live by their own choices - can be a goal in the design of care robots. We discuss (a) problems for Principle 2 that arise from competing legal and philosophical understandings of privacy; (b) a tension between privacy and safety (Principles 2 and 3) and (c) some scepticism about the application of Principle 4, which addresses robot design that might result in the deception of vulnerable users.
The "fashion-form" of modern society and its relationship to psychology.
Fuentes, Juan Bautista; Quiroga, Ernesto
2009-05-01
In this work, we present a new way of understanding psychology, which emerges as a result of relating it to the three principles of the theory of fashion of Gilles Lipovetsky: "the principle of the ephemeral," "the principle of the marginal differentiation of individuals," and "the principle of seduction." We relate the first principle to the plurality of the diverse and changing "schools and systems" that have existed throughout the history of psychology. We apply the second to the figure of the psychologist, considered individually, revealing his or her leading role in the generation of the changing plurality of the systems. By means of the third principle, we point out that the diverse psychologies are forms of seduction. We conclude by stating that psychology has the form of fashion and we analyze how this form can help us to better understand it.
A recovery principle provides insight into auxin pattern control in the Arabidopsis root
Moore, Simon; Liu, Junli; Zhang, Xiaoxian; Lindsey, Keith
2017-01-01
Regulated auxin patterning provides a key mechanism for controlling root growth and development. We have developed a data-driven mechanistic model using realistic root geometry and formulated a principle to theoretically investigate quantitative auxin pattern recovery following auxin transport perturbation. This principle reveals that auxin patterning is potentially controlled by multiple combinations of interlinked levels and localisation of influx and efflux carriers. We demonstrate that (1) when efflux carriers maintain polarity but change levels, maintaining the same auxin pattern requires non-uniform and polar distribution of influx carriers; (2) the emergence of the same auxin pattern, from different levels of influx carriers with the same nonpolar localisation, requires simultaneous modulation of efflux carrier level and polarity; and (3) multiple patterns of influx and efflux carriers for maintaining an auxin pattern do not have spatially proportional correlation. This reveals that auxin pattern formation requires coordination between influx and efflux carriers. We further show that the model makes various predictions that can be experimentally validated. PMID:28220889
ERIC Educational Resources Information Center
Shih, Ching-Hsiang; Cheng, Hsiao-Fen; Li, Chia-Chun; Shih, Ching-Tien; Chiang, Ming-Shan
2010-01-01
This study evaluated whether four persons (two groups) with developmental disabilities would be able to improve their collaborative pointing performance through a Multiple Cursor Automatic Pointing Assistive Program (MCAPAP) with a newly developed mouse driver (i.e., a new mouse driver replaces standard mouse driver, and is able to…
[Ethical considerations about research with women in situations of violence].
Rafael, Ricardo de Mattos Russo; Soares de Moura, Anna Tereza Miranda
2013-01-01
This essay aims at reflecting on the ethical and methodological principles involved in research with women in situations of violence. The text raises the discussion of the application of the principles of beneficence and non-maleficence during research involving this issue, pointing to recommendations towards privacy, autonomy and immediate contributions for volunteers. Then, taking as theoretical reference the principles of justice and equity, the authors propose a debate on methodological aspects involved in the protection of respondents, with a view to improving the quality of the data obtained and possible social contributions.
Wang, Lu; Qu, Haibin
2016-03-01
A method combining solid phase extraction, high performance liquid chromatography, and ultraviolet/evaporative light scattering detection (SPE-HPLC-UV/ELSD) was developed according to Quality by Design (QbD) principles and used to assay nine bioactive compounds within a botanical drug, Shenqi Fuzheng Injection. Risk assessment and a Plackett-Burman design were utilized to evaluate the impact of 11 factors on the resolutions and signal-to-noise of chromatographic peaks. Multiple regression and Pareto ranking analysis indicated that the sorbent mass, sample volume, flow rate, column temperature, evaporator temperature, and gas flow rate were statistically significant (p < 0.05) in this procedure. Furthermore, a Box-Behnken design combined with response surface analysis was employed to study the relationships between the quality of SPE-HPLC-UV/ELSD analysis and four significant factors, i.e., flow rate, column temperature, evaporator temperature, and gas flow rate. An analytical design space of SPE-HPLC-UV/ELSD was then constructed from calculated Monte Carlo probabilities. In the presented approach, the operating parameters of sample preparation, chromatographic separation, and compound detection were investigated simultaneously. Eight elements of method validation, i.e., system-suitability tests, method robustness/ruggedness, sensitivity, precision, repeatability, linearity, accuracy, and stability, were completed at a selected working point. These results revealed that the QbD principles were suitable for the development of analytical procedures for samples in complex matrices. Meanwhile, the analytical quality and method robustness were validated by the analytical design space. The presented strategy provides a tutorial on the development of a robust QbD-compliant quantitative method for samples in complex matrices.
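A minimal sketch of the final design-space step, assuming a fitted quadratic response-surface model for one critical response (a peak resolution) and a simple residual-noise model; the coefficients, coded factors, and acceptance limit below are invented, not taken from the paper.

```python
# Toy Monte Carlo probability map over two factors (flow rate, column temperature)
# using an invented quadratic response-surface model for a peak resolution Rs.
# The design space is the region where P(Rs >= 1.5) exceeds a chosen threshold.
import numpy as np

rng = np.random.default_rng(0)

def rs_model(flow, temp):
    # hypothetical fitted response surface in coded units, not from the paper
    return 1.8 - 0.4 * flow + 0.2 * temp - 0.15 * flow * temp - 0.1 * flow**2

def prob_pass(flow, temp, sigma=0.12, n=5000):
    noise = rng.normal(0.0, sigma, n)          # residual variability of the fitted model
    return np.mean(rs_model(flow, temp) + noise >= 1.5)

for flow in (-1.0, 0.0, 1.0):
    for temp in (-1.0, 0.0, 1.0):
        print(f"flow={flow:+.0f} temp={temp:+.0f}  P(Rs>=1.5)={prob_pass(flow, temp):.2f}")
```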
ERIC Educational Resources Information Center
Shih, Ching-Hsiang; Shih, Ching-Tien; Wu, Hsiao-Ling
2010-01-01
The latest research adopted software technology to redesign the mouse driver, turning a mouse into a useful pointing assistive device for people with multiple disabilities who cannot easily, or cannot at all, use a standard mouse, to improve their pointing performance through a new operation method, Extended Dynamic Pointing Assistive Program (EDPAP),…
ERIC Educational Resources Information Center
Stinson, Wendy Bounds; Carr, Deborah; Nettles, Mary Frances; Johnson, James T.
2011-01-01
Purpose/Objectives: The objectives of this study were to assess the extent to which school nutrition (SN) programs have implemented food safety programs based on Hazard Analysis and Critical Control Point (HACCP) principles, as well as factors, barriers, and practices related to implementation of these programs. Methods: An online survey was…
ERIC Educational Resources Information Center
Office of Civil Defense (DOD), Washington, DC.
This handbook contains suggestions for teaching the facts, principles, and behaviors relevant to civil defense in social studies classes, grades 1-12. These classes were chosen as the entry point for civil defense education because the core of the civil defense concept is government in action with other community agencies to save lives and…
Morality as the Substructure of Social Justice: Religion in Education as a Case in Point
ERIC Educational Resources Information Center
Potgieter, Ferdinand J.
2011-01-01
Moral issues and principles do not only emerge in cases of conflict among, for instance, religious communities or political parties; indeed they form the moral substructure of notions of social justice. During periods of conflict each opponent claims justice for his/her side and bases the claim on certain principles. In this article, reference is…
ERIC Educational Resources Information Center
Nistor, Nicolae; Dehne, Anina; Drews, Frank Thomas
2010-01-01
In search of methods that improve the efficiency of teaching and training in organizations, several authors point out that mass customization (MC) is a principle that covers individual needs of knowledge and skills and, at the same time limits the development costs of customized training to those of mass training. MC is proven and established in…
Principle Study of Head Meridian Acupoint Massage to Stress Release via Grey Data Model Analysis.
Lee, Ya-Ting
2016-01-01
This paper presents a scientific study of the effectiveness and action principle of head meridian acupoint massage by applying the grey data model analysis approach. First, the head massage procedure for massaging the important head meridian acupuncture points including Taiyang, Fengfu, Tianzhu, Fengqi, and Jianjing is formulated in a standard manner. Second, the status of the autonomic nervous system of each subject is evaluated by using a heart rate variability analyzer before and after four weeks of head massage. Afterward, the physiological factors of the autonomic nerves are quantitatively analyzed by using the grey data modeling theory. The grey data analysis shows that the status of the autonomic nervous system is greatly improved after the massage. The order change of the grey relationship weighting of physiological factors shows the action principle of the sympathetic and parasympathetic nerves when performing head massage. In other words, the grey data model is able to distinguish the detailed interaction of the autonomic nervous system and the head meridian acupoint massage. Thus, the stress-relaxing effect of massaging head meridian acupoints is proved, which has been lacking in the literature. The results can serve as a reference principle for massage health care in practice.
Origin of Transitions between Metallic and Insulating States in Simple Metals
Naumov, Ivan I.; Hemley, Russell J.
2015-04-17
Unifying principles that underlie recently discovered transitions between metallic and insulating states in elemental solids under pressure are developed. Using group theory arguments and first principles calculations, we show that the electronic properties of the phases involved in these transitions are controlled by symmetry principles not previously recognized. The valence bands in these systems are described by simple and composite band representations constructed from localized Wannier functions centered on points unoccupied by atoms, and which are not necessarily all symmetrical. The character of the Wannier functions is closely related to the degree of s-p(-d) hybridization and reflects multi-center chemical bonding in these insulating states. The conditions under which an insulating state is allowed for structures having an integer number of atoms per primitive unit cell as well as re-entrant (i.e., metal-insulator-metal) transition sequences are detailed, resulting in predictions of novel behavior such as phases having three-dimensional Dirac-like points. The general principles developed are tested and applied to the alkali and alkaline earth metals, including elements where high-pressure insulating phases have been identified or reported (e.g., Li, Na, and Ca).
Dyke, Elizabeth; Edwards, Nancy; McDowell, Ian; Muga, Richard; Brown, Stephen
2014-10-08
Addressing inequities is a key role for international non-governmental organizations (INGOs) working in health and development. Yet, putting equity principles into practice can prove challenging. In-depth empirical research examining what influences INGOs' implementation of equity principles is limited. This study examined the influences on one INGO's implementation of equity principles in its HIV/AIDS programs. This research employed a case study with nested components (an INGO operating in Kenya, with offices in North America). We used multiple data collection methods, including document reviews, interviews (with staff, partners and clients of the INGO in Kenya), and participant observation (with Kenyan INGO staff). Participant observation was conducted with 10 people over three months. Forty-one interviews were completed, and 127 documents analyzed. Data analysis followed Auerbach and Silverstein's analytic process (2003), with qualitative coding conducted in multiple stages, using descriptive matrices, visual displays and networks (Miles and Huberman, 1994). There was a gap between the INGO's intent to implement equity principles and actual practice due to multiple influences from various players, including donors and country governments. The INGO was reliant on donor funding and needed permission from the Kenyan government to work in-country. Major influences included donor agendas and funding, donor country policies, and Southern country government priorities and legislation. The INGO privileged particular vulnerable populations (based on its reputation, its history, and the priorities of the Kenyan government and the donors). To balance its equity commitment with the influences from other players, the INGO aligned with the system as well as pushed back incrementally on the donors and the Kenyan government to influence these organizations' equity agendas. By moving its equity agenda forward incrementally and using its reputational advantage, the INGO avoided potential negative repercussions that might result from pushing too fast or working outside the system. The INGO aligned the implementation of equity principles in its HIV/AIDS initiatives by working within a system characterized by asymmetrical interdependence. Influences from the donors and Kenyan government contributed to an implementation gap between what the INGO intended to accomplish in implementing equity principles in HIV/AIDS work and actual practice.
Protein Multifunctionality: Principles and Mechanisms
Zaretsky, Joseph Z.; Wreschner, Daniel H.
2008-01-01
In the review, the nature of protein multifunctionality is analyzed. In the first part of the review, the principles of the structural/functional organization of proteins are discussed. In the second part, the main mechanisms involved in the development of multiple functions in a single gene product are analyzed. The last part presents a number of examples showing that multifunctionality is a basic feature of biologically active proteins. PMID:21566747
Applying Lean principles and Kaizen rapid improvement events in public health practice.
Smith, Gene; Poteat-Godwin, Annah; Harrison, Lisa Macon; Randolph, Greg D
2012-01-01
This case study describes a local home health and hospice agency's effort to implement Lean principles and Kaizen methodology as a rapid improvement approach to quality improvement. The agency created a cross-functional team, followed Lean Kaizen methodology, and made significant improvements in scheduling time for home health nurses that resulted in reduced operational costs, improved working conditions, and multiple organizational efficiencies.
Nie, Xiaobing; Zheng, Wei Xing; Cao, Jinde
2016-12-01
In this paper, the coexistence and dynamical behaviors of multiple equilibrium points are discussed for a class of memristive neural networks (MNNs) with unbounded time-varying delays and nonmonotonic piecewise linear activation functions. By means of the fixed point theorem, nonsmooth analysis theory and rigorous mathematical analysis, it is proven that under some conditions, such n-neuron MNNs can have 5^n equilibrium points located in ℜ^n, and 3^n of them are locally μ-stable. As a direct application, some criteria are also obtained on the multiple exponential stability, multiple power stability, multiple log-stability and multiple log-log-stability. All these results reveal that the addressed neural networks with the activation functions introduced in this paper can generate greater storage capacity than the ones with a Mexican-hat-type activation function. Numerical simulations are presented to substantiate the theoretical results. Copyright © 2016 Elsevier Ltd. All rights reserved.
New principle for measuring arterial blood oxygenation, enabling motion-robust remote monitoring.
van Gastel, Mark; Stuijk, Sander; de Haan, Gerard
2016-12-07
Finger-oximeters are ubiquitously used for patient monitoring in hospitals worldwide. Recently, remote measurement of arterial blood oxygenation (SpO2) with a camera has been demonstrated. Both contact and remote measurements, however, require the subject to remain static for accurate SpO2 values. This is due to the use of the common ratio-of-ratios measurement principle that measures the relative pulsatility at different wavelengths. Since the amplitudes are small, they are easily corrupted by motion-induced variations. We introduce a new principle that allows accurate remote measurements even during significant subject motion. We demonstrate the main advantage of the principle, i.e. that the optimal signature remains the same even when the SNR of the PPG signal drops significantly due to motion or limited measurement area. The evaluation uses recordings with breath-holding events, which induce hypoxemia in healthy moving subjects. The events lead to clinically relevant SpO2 levels in the range 80-100%. The new principle is shown to greatly outperform current remote ratio-of-ratios based methods. The mean-absolute SpO2-error (MAE) is about 2 percentage-points during head movements, where the benchmark method shows a MAE of 24 percentage-points. Consequently, we claim ours to be the first method to reliably measure SpO2 remotely during significant subject motion.
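For contrast with the new principle, the conventional ratio-of-ratios computation can be sketched as below. The linear calibration SpO2 ≈ 110 - 25R is the commonly quoted textbook approximation, not this paper's calibration, and the PPG traces are synthetic.

```python
# Conventional ratio-of-ratios SpO2 estimate from red and infrared PPG traces.
# The calibration line SpO2 ~ 110 - 25*R is a textbook approximation; real
# oximeters use device-specific calibration curves.
import numpy as np

def ratio_of_ratios(red, ir):
    ac_red, dc_red = np.std(red), np.mean(red)   # pulsatile (AC) and baseline (DC) components
    ac_ir,  dc_ir  = np.std(ir),  np.mean(ir)
    return (ac_red / dc_red) / (ac_ir / dc_ir)

def spo2_estimate(red, ir):
    return 110.0 - 25.0 * ratio_of_ratios(red, ir)

# Synthetic example: small pulsatile component on a large baseline.
t = np.linspace(0, 10, 1000)
red = 1.00 + 0.010 * np.sin(2 * np.pi * 1.2 * t)
ir  = 1.20 + 0.020 * np.sin(2 * np.pi * 1.2 * t)
print(f"SpO2 estimate: {spo2_estimate(red, ir):.1f} %")
```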
USDA-ARS?s Scientific Manuscript database
A novel technique named multiple-particle tracking (MPT) was used to investigate the micro-structural heterogeneities of Z-trim, a zero-calorie cellulosic fiber biopolymer produced from corn hulls. The principle of the MPT technique is to monitor the thermally driven motion of inert micro-spheres, which...
ERIC Educational Resources Information Center
Kezar, Adrianna
The purpose of this paper is to explore avenues for achieving pluralistic leadership cultures and present three principles: (1) awareness of identity, positionality, and power conditions; (2) acknowledgment of multiple descriptions of campus leadership and personal philosophies of leadership; and (3) negotiation among multiple descriptions of…
Where Adults Go: A Multiple Case Study of Adult Serving Undergraduate Colleges and Universities
ERIC Educational Resources Information Center
Dixon-Williams, Shelley B.
2010-01-01
This research is an exploratory multiple case study of adult serving undergraduate colleges and universities. Using the Council of Adult and Experiential Learning (CAEL) Principles of Effective Practice for Serving Adult Learners, this study examines the differences of adult serving undergraduate colleges across the three sectors of higher…
Myth 6: Cosmetic Use of Multiple Selection Criteria
ERIC Educational Resources Information Center
Friedman-Nimz, Reva
2009-01-01
Twenty-five years ago, armed with the courage of her convictions and a respectable collection of empirical evidence, the author articulated what she considered to be a compelling argument against the cosmetic use of multiple selection criteria as a guiding principle for identifying children and youth with high potential. To assess the current…
Exploring undergraduates' understanding of photosynthesis using diagnostic question clusters.
Parker, Joyce M; Anderson, Charles W; Heidemann, Merle; Merrill, John; Merritt, Brett; Richmond, Gail; Urban-Lurain, Mark
2012-01-01
We present a diagnostic question cluster (DQC) that assesses undergraduates' thinking about photosynthesis. This assessment tool is not designed to identify individual misconceptions. Rather, it is focused on students' abilities to apply basic concepts about photosynthesis by reasoning with a coordinated set of practices based on a few scientific principles: conservation of matter, conservation of energy, and the hierarchical nature of biological systems. Data on students' responses to the cluster items and uses of some of the questions in multiple-choice, multiple-true/false, and essay formats are compared. A cross-over study indicates that the multiple-true/false format shows promise as a machine-gradable format that identifies students who have a mixture of accurate and inaccurate ideas. In addition, interviews with students about their choices on three multiple-choice questions reveal the fragility of students' understanding. Collectively, the data show that many undergraduates lack both a basic understanding of the role of photosynthesis in plant metabolism and the ability to reason with scientific principles when learning new content. Implications for instruction are discussed.
41 CFR Appendix A to Subpart C of... - 3-Key Points and Principles
Code of Federal Regulations, 2011 CFR
2011-01-01
... Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION GENERAL 3-FEDERAL ADVISORY COMMITTEE... or agency management directives; (iv) The applicability of conflict of interest statutes and other... 41 Public Contracts and Property Management 3 2011-01-01 3-Key Points and...
The Neuroscience of PowerPoint[TM]
ERIC Educational Resources Information Center
Horvath, Jared Cooney
2014-01-01
Many concepts have been published relevant to improving the design of PowerPoint[TM] (PP) presentations for didactic purposes, including the redundancy, modality, and signaling principles of multimedia learning. In this article, we review the recent neuroimaging findings that have emerged elucidating the neural structures involved in many of these…
Enhancing the Impact of Quality Points in Interteaching
ERIC Educational Resources Information Center
Rosales, Rocío; Soldner, James L.; Crimando, William
2014-01-01
Interteaching is a classroom instruction approach based on behavioral principles that offers increased flexibility to instructors. There are several components of interteaching that may contribute to its demonstrated efficacy. In a prior analysis of one of these components, the quality points contingency, no significant difference was reported in…
Peng, Nie; Bang-Fa, Ni; Wei-Zhi, Tian
2013-02-01
Application of the effective interaction depth (EID) principle for parametric normalization of full energy peak efficiencies at different counting positions, originally developed for quasi-point sources, has been extended to bulky sources (within ∅30 mm×40 mm) with arbitrary matrices. It is also proved that the EID function for a quasi-point source can be directly used for cylindrical bulky sources (within ∅30 mm×40 mm), with the geometric center taken as the effective point source, for low atomic number (Z) and low density (D) media and high-energy γ-rays. It is also found that in general the EID for bulky sources is dependent upon the Z and D of the medium and the energy of the γ-rays in question. In addition, the EID principle was theoretically verified by MCNP calculations. Copyright © 2012 Elsevier Ltd. All rights reserved.
Role of zero-point effects in stabilizing the ground state structure of bulk Fe2P
NASA Astrophysics Data System (ADS)
Bhat, Soumya S.; Gupta, Kapil; Bhattacharjee, Satadeep; Lee, Seung-Cheol
2018-05-01
Structural stability of Fe2P is investigated in detail using first-principles calculations based on density functional theory. While the orthorhombic C23 phase is found to be energetically more stable, experiments suggest the hexagonal C22 phase. In the present study, we show that obtaining the correct ground state structure of Fe2P from first-principles methods requires consideration of zero-point effects such as zero-point vibrations and spin fluctuations. This study demonstrates an exceptional case in which a bulk material is stabilized by quantum effects, which are usually important in low-dimensional materials. Our results also indicate the possibility of a magnetic-field-induced structural quantum phase transition in Fe2P, which should form the basis for further theoretical and experimental efforts.
No scanning depth imaging system based on TOF
NASA Astrophysics Data System (ADS)
Sun, Rongchun; Piao, Yan; Wang, Yu; Liu, Shuo
2016-03-01
To quickly obtain a 3D model of real-world objects, multi-point ranging is very important. However, the traditional measuring method usually adopts the principle of point-by-point or line-by-line measurement, which is slow and inefficient. In this paper, a non-scanning depth imaging system based on TOF (time of flight) is proposed. The system is composed of a light source circuit, a special infrared image sensor module, an image data processor and controller, a data cache circuit, a communication circuit, and so on. According to the working principle of TOF measurement, an image sequence was collected by the high-speed CMOS sensor, the distance information was obtained by identifying the phase difference, and the amplitude image was also calculated. Experiments were conducted, and the results show that the depth imaging system achieves non-scanning depth imaging with good performance.
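A minimal sketch of the phase-to-distance relation such a TOF system relies on, assuming a continuous-wave sensor with the common four-sample ("four-bucket") phase estimate; the modulation frequency and pixel samples below are invented, and sign conventions for the phase estimate vary between devices.

```python
# Continuous-wave time-of-flight: distance from the measured phase shift between
# emitted and received modulation, using the common four-sample phase estimate.
import math

C = 299_792_458.0          # speed of light, m/s
F_MOD = 20e6               # modulation frequency, Hz (hypothetical)

def phase_from_samples(a0, a1, a2, a3):
    # samples of the correlation at 0, 90, 180, 270 degrees of the modulation period
    return math.atan2(a3 - a1, a0 - a2) % (2 * math.pi)

def distance(phase):
    return C * phase / (4 * math.pi * F_MOD)   # unambiguous up to c/(2*F_MOD) = 7.5 m here

phi = phase_from_samples(1.8, 0.8, 0.2, 1.2)   # hypothetical pixel samples
print(f"phase = {phi:.3f} rad, distance = {distance(phi):.2f} m")
```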
Optically inactive defects in monolayer and bilayer phosphorene: A first-principles study
NASA Astrophysics Data System (ADS)
Huang, Ling-yi; Zhang, Xu; Zhang, Mingliang; Lu, Gang
2018-05-01
Many-body excitonic effect is crucial in two-dimensional (2D) materials and can significantly impact their optoelectronic properties. Because defects are inevitable in 2D materials, understanding how they influence the optical and excitonic properties of the 2D materials is of significant scientific and technological importance. Here we focus on intrinsic point defects in monolayer and bilayer phosphorene and examine whether and how their optoelectronic properties may be modified by the defects. Based on large-scale first-principles calculations, we have systematically explored the optical and excitonic properties of phosphorene in the presence and absence of the point defects. We find that the optical properties of bilayer phosphorene depend on the stacking order of the layers. More importantly, we reveal that the dominant point defects in few-layer phosphorene are optically inactive, which renders phosphorene particularly attractive in optoelectronic applications.
Principles of protein targeting to the nucleolus.
Martin, Robert M; Ter-Avetisyan, Gohar; Herce, Henry D; Ludwig, Anne K; Lättig-Tünnemann, Gisela; Cardoso, M Cristina
2015-01-01
The nucleolus is the hallmark of nuclear compartmentalization and has been shown to exert multiple roles in cellular metabolism besides its main function as the place of rRNA synthesis and assembly of ribosomes. Nucleolar proteins dynamically localize and accumulate in this nuclear compartment relative to the surrounding nucleoplasm. In this study, we have assessed the molecular requirements that are necessary and sufficient for the localization and accumulation of peptides and proteins inside the nucleoli of living cells. The data showed that positively charged peptide entities composed of arginines alone and with an isoelectric point at and above 12.6 are necessary and sufficient for mediating significant nucleolar accumulation. A threshold of 6 arginines is necessary for peptides to accumulate in nucleoli, but already 4 arginines are sufficient when fused within 15 amino acid residues of a nuclear localization signal of a protein. Using a pH sensitive dye, we found that the nucleolar compartment is particularly acidic when compared to the surrounding nucleoplasm and, hence, provides the ideal electrochemical environment to bind poly-arginine containing proteins. In fact, we found that oligo-arginine peptides and GFP fusions bind RNA in vitro. Consistent with RNA being the main binding partner for arginines in the nucleolus, we found that the same principles apply to cells from insects to man, indicating that this mechanism is highly conserved throughout evolution.
First-Principles Prediction of Densities of Amorphous Materials: The Case of Amorphous Silicon
NASA Astrophysics Data System (ADS)
Furukawa, Yoritaka; Matsushita, Yu-ichiro
2018-02-01
A novel approach to predict the atomic densities of amorphous materials is explored on the basis of Car-Parrinello molecular dynamics (CPMD) in density functional theory. Despite the determination of the atomic density of matter being crucial in understanding its physical properties, no first-principles method has ever been proposed for amorphous materials until now. We have extended the conventional method for crystalline materials in a natural manner and pointed out the importance of the canonical ensemble of the total energy in the determination of the atomic densities of amorphous materials. To take into account the canonical distribution of the total energy, we generate multiple amorphous structures with several different volumes by CPMD simulations and average the total energies at each volume. The density is then determined as the one that minimizes the averaged total energy. In this study, this approach is implemented for amorphous silicon (a-Si) to demonstrate its validity, and we have determined the density of a-Si to be 4.1% lower and its bulk modulus to be 28 GPa smaller than those of the crystal, which are in good agreement with experiments. We have also confirmed that generating samples through classical molecular dynamics simulations produces a comparable result. The findings suggest that the presented method is applicable to other amorphous systems, including those for which experimental knowledge is lacking.
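The density-determination step described above can be sketched as follows, assuming a simple parabolic fit to the sample-averaged E(V) curve (the paper does not specify the fit form); all volumes and energies below are invented, chosen only to give a-Si-like magnitudes.

```python
# Toy version of the density-determination step: average total energies of several
# amorphous samples at each trial volume, fit E(V), and read off the equilibrium
# volume V0 and bulk modulus B = V0 * d2E/dV2. All data are invented.
import numpy as np

volumes = np.array([18.0, 19.0, 20.0, 21.0, 22.0])        # A^3 per atom
# rows: independent amorphous samples, columns: volumes (eV per atom, hypothetical)
energies = np.array([
    [-4.640, -4.676, -4.689, -4.679, -4.645],
    [-4.645, -4.680, -4.692, -4.676, -4.641],
    [-4.644, -4.678, -4.690, -4.679, -4.643],
])
e_mean = energies.mean(axis=0)                             # canonical-average of the total energy

a, b, c = np.polyfit(volumes, e_mean, 2)                   # E(V) ~ a*V^2 + b*V + c
v0 = -b / (2 * a)                                          # minimum of the fitted parabola
bulk_modulus = v0 * 2 * a * 160.2177                       # eV/A^3 -> GPa
print(f"V0 = {v0:.2f} A^3/atom, B = {bulk_modulus:.0f} GPa")
```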
Microfluidic Surface Plasmon Resonance Sensors: From Principles to Point-of-Care Applications
Wang, Da-Shin; Fan, Shih-Kang
2016-01-01
Surface plasmon resonance (SPR) is a label-free, highly-sensitive, and real-time sensing technique. Conventional SPR sensors, which involve a planar thin gold film, have been widely exploited in biosensing; various miniaturized formats have been devised for portability purposes. Another type of SPR sensor which utilizes localized SPR (LSPR), is based on metal nanostructures with surface plasmon modes at the structural interface. The resonance condition is sensitive to the refractive index change of the local medium. The principles of these two types of SPR sensors are reviewed and their integration with microfluidic platforms is described. Further applications of microfluidic SPR sensors to point-of-care (POC) diagnostics are discussed. PMID:27472340
Topological photonic crystal with ideal Weyl points
NASA Astrophysics Data System (ADS)
Wang, Luyang; Jian, Shao-Kai; Yao, Hong
Weyl points in three-dimensional photonic crystals behave as monopoles of Berry flux in momentum space. Here, based on symmetry analysis, we show that a minimal number of symmetry-related Weyl points can be realized in time-reversal invariant photonic crystals. We propose to realize these ``ideal'' Weyl points in modified double-gyroid photonic crystals, which is confirmed by our first-principles photonic band-structure calculations. Photonic crystals with ideal Weyl points are qualitatively advantageous in applications such as angular and frequency selectivity, broadband invisibility cloaking, and broadband 3D-imaging.
Nagmoti, Jyoti Mahantesh
2017-01-01
PowerPoint (PPT™) presentation has become an integral part of day-to-day teaching in medicine. Most often, PPT™ is used in its default mode, which is known to cause boredom and ineffective learning. Research has shown improved short-term memory by applying multimedia principles for designing and delivering lectures. However, such evidence in medical education is scarce. Therefore, we attempted to evaluate the effect of multimedia principles on enhanced learning of parasitology. Second-year medical students received a series of lectures; half of the lectures used traditionally designed PPT™ and the rest used slides designed according to Mayer's multimedia principles. Students answered pre- and post-tests at the end of each lecture (test I) and an essay test after six months (test II), which assessed their short- and long-term knowledge retention respectively. Students' feedback on the quality and content of lectures was collected. A statistically significant difference was found between the post-test scores of traditional and modified lectures (P = 0.019), indicating improved short-term memory after modified lectures. Similarly, students scored better in test II on the content learnt through modified lectures, indicating enhanced comprehension and improved long-term memory (P < 0.001). Many students appreciated learning through multimedia-designed PPT™ and suggested their continued use. It is time to depart from default PPT™ and adopt multimedia principles to enhance comprehension and improve short- and long-term knowledge retention. Further, medical educators may be trained and encouraged to apply multimedia principles for designing and delivering effective lectures.
Motor Synergies and the Equilibrium-Point Hypothesis
Latash, Mark L.
2010-01-01
The article offers a way to unite three recent developments in the field of motor control and coordination: (1) The notion of synergies is introduced based on the principle of motor abundance; (2) The uncontrolled manifold hypothesis is described as offering a computational framework to identify and quantify synergies; and (3) The equilibrium-point hypothesis is described for a single muscle, single joint, and multi-joint systems. Merging these concepts into a single coherent scheme requires focusing on control variables rather than performance variables. The principle of minimal final action is formulated as the guiding principle within the referent configuration hypothesis. Motor actions are associated with setting two types of variables by a controller, those that ultimately define average performance patterns and those that define associated synergies. Predictions of the suggested scheme are reviewed, such as the phenomenon of anticipatory synergy adjustments, quick actions without changes in synergies, atypical synergies, and changes in synergies with practice. A few models are briefly reviewed. PMID:20702893
Effective intermolecular potential and critical point for C60 molecule
NASA Astrophysics Data System (ADS)
Ramos, J. Eloy
2017-07-01
The approximate nonconformal (ANC) theory is applied to the C60 molecule. A new binary potential function is developed for C60, which has three parameters only and is obtained by averaging the site-site carbon interactions on the surface of two C60 molecules. It is shown that the C60 molecule follows, to a good approximation, the corresponding states principle with n-C8H18, n-C4F10 and n-C5F12. The critical point of C60 is estimated in two ways: first by applying the corresponding states principle under the framework of the ANC theory, and then by using previous computer simulations. The critical parameters obtained by applying the corresponding states principle, although very different from those reported in the literature, are consistent with the previous results of the ANC theory. It is shown that the Girifalco potential does not correspond to an average of the site-site carbon-carbon interaction.
ERIC Educational Resources Information Center
Wida, Sam
1992-01-01
Uses extremely strong neodymium magnets to demonstrate several principles of physics including electromagnetic induction, Lenz's Law, domain theory, demagnetization, the Curie point, and magnetic flux lines. (MDH)
Roots of polynomials by ratio of successive derivatives
NASA Technical Reports Server (NTRS)
Crouse, J. E.; Putt, C. W.
1972-01-01
An order of magnitude study of the ratios of successive polynomial derivatives yields information about the number of roots at an approached root point and the approximate location of a root point from a nearby point. The location approximation improves as a root is approached, so a powerful convergence procedure becomes available. These principles are developed into a computer program which finds the roots of polynomials with real number coefficients.
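A hedged reconstruction of the idea: near a root r of multiplicity m, p/p' is approximately (x-r)/m and p'/p'' is approximately (x-r)/(m-1), so the ratio of these successive ratios estimates m, and a multiplicity-corrected step estimates r. The example polynomial and starting point are invented; this illustrates the principle rather than the NASA program itself.

```python
# Sketch of estimating root multiplicity and location from ratios of successive
# derivatives: [p/p'] / [p'/p''] ~ (m-1)/m near a root of multiplicity m, and
# r ~ x - m * p(x)/p'(x). Example polynomial invented.
import numpy as np

p = np.poly1d([1, -7, 16, -12])      # (x - 2)^2 (x - 3): double root at x = 2
dp, d2p = p.deriv(), p.deriv(2)

x = 2.05                             # a point near the double root
r1, r2 = p(x) / dp(x), dp(x) / d2p(x)
m = round(1.0 / (1.0 - r1 / r2))     # multiplicity estimate from the ratio of ratios
root = x - m * p(x) / dp(x)          # multiplicity-corrected Newton step
print(f"estimated multiplicity {m}, root ~ {root:.4f}")
```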
1977-05-15
February through 15 May 1977 PUBLISHED REPORTS Journal Articles JA No. 4621 Minority Carriers in Graphite and the H-Point Magnetoreflection... point, the light at the output face must emerge from the coupled guide. In principle, both switch states can be achieved using the A/3... Fermi level moves downward with increasing proton dose until it becomes pinned at a position designated as the high-dose Fermi level. At this point
Liquid-vapor equilibrium and interfacial properties of square wells in two dimensions
NASA Astrophysics Data System (ADS)
Armas-Pérez, Julio C.; Quintana-H, Jacqueline; Chapela, Gustavo A.
2013-01-01
Liquid-vapor coexistence and interfacial properties of square wells in two dimensions are calculated. Orthobaric densities, vapor pressures, surface tensions, and interfacial thicknesses are reported. Results are presented for a series of potential widths λ* = 1.4, 1.5, 2, 2.5, 3, 3.5, 4, 4.5, and 5, where λ* is given in units of the hard core diameter σ. Critical and triple points are explored. No critical point was found for λ* < 1.4. Corresponding states principle analysis is performed for the whole series. For λ* = 1.4 and 1.5 evidence is presented that at an intermediate temperature between the critical and the triple point temperatures the liquid branch becomes an amorphous solid. This point is recognized in Armas-Pérez et al. [unpublished] as a hexatic phase transition. It is located at reduced temperatures T* = 0.47 and 0.35 for λ* = 1.4 and 1.5, respectively. Properties such as the surface tension, vapor pressure, and interfacial thickness do not present any discontinuity at these points. This amorphous solid branch does not follow the corresponding state principle, which is only applied to liquids and gases.
The melting point of lithium: an orbital-free first-principles molecular dynamics study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Mohan; Hung, Linda; Huang, Chen
2013-08-25
The melting point of liquid lithium near zero pressure is studied with large-scale orbital-free first-principles molecular dynamics (OF-FPMD) in the isobaric-isothermal ensemble. Here, we adopt the Wang-Govind-Carter (WGC) functional as our kinetic energy density functional (KEDF) and construct a bulk-derived local pseudopotential (BLPS) for Li. Our simulations employ both the 'heat-until-melts' method and the coexistence method. We predict 465 K as an upper bound of the melting point of Li from the 'heat-until-melts' method, while we predict 434 K as the melting point of Li from the coexistence method. These values compare well with the experimental melting point of 453 K at zero pressure. Furthermore, we calculate a few important properties of liquid Li including the diffusion coefficients, pair distribution functions, static structure factors, and compressibilities of Li at 470 K and 725 K in the canonical ensemble. These theoretically obtained results show good agreement with known experimental results, suggesting that OF-FPMD using a non-local KEDF and a BLPS is capable of accurately describing liquid metals.
Lean Manufacturing Principles Improving the Targeting Process
2012-06-08
author has familiarity with Lean manufacturing principles. Third, Lean methods have been used in different industries and have proven adaptable to the... The case study also demonstrates the multi-organizational application of VSM, JIT and the 5S method... new members not knowing the process, this will serve as a start point for developing understanding. Within the Food industry we observed "the
Variational principle for the Navier-Stokes equations.
Kerswell, R R
1999-05-01
A variational principle is presented for the Navier-Stokes equations in the case of a contained boundary-driven, homogeneous, incompressible, viscous fluid. Based upon making the fluid's total viscous dissipation over a given time interval stationary subject to the constraint of the Navier-Stokes equations, the variational problem looks overconstrained and intractable. However, introducing a nonunique velocity decomposition, u(x,t) = φ(x,t) + ν(x,t), "opens up" the variational problem so that what is presumed a single allowable point over the velocity domain u corresponding to the unique solution of the Navier-Stokes equations becomes a surface with a saddle point over the extended domain (φ,ν). Complementary or dual variational problems can then be constructed to estimate this saddle point value strictly from above as part of a minimization process or below via a maximization procedure. One of these reduced variational principles is the natural and ultimate generalization of the upper bounding problem developed by Doering and Constantin. The other corresponds to the ultimate Busse problem which now acts to lower bound the true dissipation. Crucially, these reduced variational problems require only the solution of a series of linear problems to produce bounds even though their unique intersection is conjectured to correspond to a solution of the nonlinear Navier-Stokes equations.
NASA Astrophysics Data System (ADS)
Hanyu, Ryosuke; Tsuji, Toshiaki
This paper proposes a whole-body haptic sensing system that has multiple supporting points between the body frame and the end-effector. The system consists of an end-effector and multiple force sensors. Using this mechanism, the position of a contact force on the surface can be calculated without any sensor array. A haptic sensing system with a single supporting point structure has previously been developed by the present authors. However, that system has drawbacks such as low stiffness and low strength. Therefore, in this study, a mechanism with multiple supporting points was proposed and its performance was verified. In this paper, the basic concept of the mechanism is first introduced. Next, an experimental evaluation of the proposed method is presented.
Comparison of two stand-alone CADe systems at multiple operating points
NASA Astrophysics Data System (ADS)
Sahiner, Berkman; Chen, Weijie; Pezeshk, Aria; Petrick, Nicholas
2015-03-01
Computer-aided detection (CADe) systems are typically designed to work at a given operating point: The device displays a mark if and only if the level of suspiciousness of a region of interest is above a fixed threshold. To compare the standalone performances of two systems, one approach is to select the parameters of the systems to yield a target false-positive rate that defines the operating point, and to compare the sensitivities at that operating point. Increasingly, CADe developers offer multiple operating points, so comparing two CADe systems involves multiple comparisons. To control the Type I error, multiple-comparison correction is needed to keep the family-wise error rate (FWER) below a given alpha level. The sensitivities of a single modality at different operating points are correlated. In addition, the sensitivities of the two modalities at the same or different operating points are also likely to be correlated. It has been shown in the literature that when test statistics are correlated, well-known methods for controlling the FWER are conservative. In this study, we compared the FWER and power of three methods, namely the Bonferroni, step-up, and adjusted step-up methods, in comparing the sensitivities of two CADe systems at multiple operating points, where the adjusted step-up method uses the estimated correlations. Our results indicate that the adjusted step-up method has a substantial advantage over the other two methods in terms of both the FWER and power.
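For readers who want to see what the first two of those corrections amount to in practice, the following is a minimal sketch (not the authors' code, and without their correlation-adjusted variant): Bonferroni and a standard Hochberg-type step-up procedure applied to hypothetical per-operating-point p-values for the sensitivity difference between two systems.

```python
# Illustrative sketch only: controlling the family-wise error rate when comparing
# two CADe systems at several operating points. The p-values below are hypothetical
# and assumed to come from some earlier per-operating-point test.
import numpy as np

def bonferroni(pvals, alpha=0.05):
    """Reject H0_i if p_i <= alpha / m (most conservative correction)."""
    m = len(pvals)
    return np.asarray(pvals) <= alpha / m

def hochberg_step_up(pvals, alpha=0.05):
    """Standard step-up procedure (no correlation adjustment): find the largest
    rank k with p_(k) <= alpha / (m - k + 1) and reject that hypothesis together
    with all hypotheses that have smaller p-values."""
    p = np.asarray(pvals)
    order = np.argsort(p)                       # indices sorted by ascending p
    m = len(p)
    reject = np.zeros(m, dtype=bool)
    for rank in range(m, 0, -1):                # walk from the largest p downward
        idx = order[rank - 1]
        if p[idx] <= alpha / (m - rank + 1):
            reject[order[:rank]] = True
            break
    return reject

p_per_operating_point = [0.030, 0.012, 0.049, 0.20]   # hypothetical values
print(bonferroni(p_per_operating_point))
print(hochberg_step_up(p_per_operating_point))
```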
Secure Multiparty Quantum Computation for Summation and Multiplication.
Shi, Run-hua; Mu, Yi; Zhong, Hong; Cui, Jie; Zhang, Shun
2016-01-21
As a fundamental primitive, Secure Multiparty Summation and Multiplication can be used to build complex secure protocols for other multiparty computations, especially numerical computations. However, there is still a lack of systematic and efficient quantum methods to compute Secure Multiparty Summation and Multiplication. In this paper, we present a novel and efficient quantum approach to securely compute the summation and multiplication of multiparty private inputs, respectively. Compared to classical solutions, our proposed approach can ensure unconditional security and perfect privacy protection based on the physical principles of quantum mechanics.
Point defect induced segregation of alloying solutes in α-Fe
NASA Astrophysics Data System (ADS)
You, Yu-Wei; Zhang, Yange; Li, Xiangyan; Xu, Yichun; Liu, C. S.; Chen, J. L.; Luo, G.-N.
2016-10-01
Segregation of alloying solutes toward clusters and precipitates can result in hardening and embrittlement of ferritic and ferritic/martensitic steels in aging nuclear power plants. Thus, it is essential to study the segregation of solutes in α-Fe. In this study, the segregation of eight kinds of alloying solutes (Al, Si, P, S, Ga, Ge, As, Se) in a defect-free system and at vacancies, divacancies, and self-interstitial atoms in α-Fe has been systematically studied by first-principles calculations. We find that it is energetically favorable for multiple solute S or Se atoms to segregate in the defect-free system to form solute clusters, whereas it is very difficult for the other solute atoms to form similar clusters. In the presence of vacancies and divacancies, the segregation of all the solutes is significantly promoted, forming vacancy-solute and divacancy-solute clusters. The divacancy-solute cluster is more stable than the vacancy-solute cluster. The most stable self-interstitial atom 〈110〉 dumbbell is also found to bind tightly with multiple solute atoms. The 〈110〉-S complex is even more stable than the divacancy-S cluster. Meanwhile, the law of mass action is employed to predict the concentration evolution of vacancy-Si, vacancy-P, and vacancy-S clusters versus temperature and vacancy concentration.
Light and shadows of the Korean healthcare system.
Moon, Tai Joon
2012-05-01
This article reviewed achievements and challenges of the National Health Insurance of the Republic of Korea and shared thoughts on its future directions. Starting with large workplaces of 500 or more employees in 1977, Korea's National Health Insurance successfully achieved universal coverage within just 12 yr in 1989. This amazing pace of growth was possible due to a positive combination of strong political will and rapid economic growth. Key features of Korea's experience in achieving universal coverage include 1) gradual expansion of coverage, 2) careful consideration to maintain sound insurance finances, and 3) introducing multiple health insurance societies (multiple payer system) at the initial stage. Introduction of the health insurance has dramatically improved Korea's health indicators and has fueled the rapid growth of basic medical infrastructure including medical institutions and professionals. On the other hand, the successful expansion was not free from side-effects. Although coverage has gradually expanded, benefits are still relatively low. The current situation warrants concern because coverage expansion is driven by welfare populism asserted by irresponsible political slogans and lacks a social consensus on basic principles and philosophy regarding the expansion. Concentration of patients to a few large prestigious hospitals as well as the inefficiencies resulting from a colossal single-payer system should also be pointed out.
Composite analysis for Escherichia coli at coastal beaches
Bertke, E.E.
2007-01-01
At some coastal beaches, concentrations of fecal-indicator bacteria can differ substantially between multiple points at the same beach at the same time. Because of this spatial variability, the recreational water quality at beaches is sometimes determined by stratifying a beach into several areas and collecting a sample from each area to analyze for the concentration of fecal-indicator bacteria. The average concentration of bacteria from those points is often compared to the recreational standard for advisory postings. Alternatively, if funds are limited, a single sample is collected to represent the beach. Compositing the samples collected from each section of the beach may yield data as accurate as averaging concentrations from multiple points, at reduced cost. In the study described herein, water samples were collected at multiple points from three Lake Erie beaches and analyzed for Escherichia coli on modified mTEC agar (EPA Method 1603). From the multiple-point samples, a composite sample (n = 116) was formed at each beach by combining equal aliquots of well-mixed water from each point. Results from this study indicate that E. coli concentrations from the arithmetic average of multiple-point samples and from composited samples are not significantly different (t = 1.59, p = 0.1139) and yield similar measures of recreational water quality; additionally, composite samples could result in significant cost savings.
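A minimal sketch of the statistical comparison reported above, using hypothetical data rather than the study's measurements: a paired t-test between log10 concentrations from the multi-point average and the matched composite sample.

```python
# Minimal sketch (hypothetical data, not the study's measurements): compare the
# arithmetic average of multiple-point E. coli concentrations with the matched
# composite-sample concentration using a paired t-test on log10 values.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_events = 30                                   # hypothetical number of sampling events
true_log = rng.normal(2.0, 0.5, n_events)       # "true" beach-wide log10 CFU/100 mL
avg_of_points = true_log + rng.normal(0, 0.15, n_events)   # average of point samples
composite     = true_log + rng.normal(0, 0.15, n_events)   # composite sample

t_stat, p_val = stats.ttest_rel(avg_of_points, composite)
print(f"t = {t_stat:.2f}, p = {p_val:.4f}")     # a large p suggests no detectable difference
```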
Connection-based and object-based grouping in multiple-object tracking: A developmental study.
Van der Hallen, Ruth; Reusens, Julie; Evers, Kris; de-Wit, Lee; Wagemans, Johan
2018-03-30
Developmental research on Gestalt laws has previously revealed that, even as young as infancy, we are bound to group visual elements into unitary structures in accordance with a variety of organizational principles. Here, we focus on the developmental trajectory of both connection-based and object-based grouping, and investigate their impact on object formation in participants, aged 9-21 years old (N = 113), using a multiple-object tracking paradigm. Results reveal a main effect of both age and grouping type, indicating that 9- to 21-year-olds are sensitive to both connection-based and object-based grouping interference, and tracking ability increases with age. In addition to its importance for typical development, these results provide an informative baseline to understand clinical aberrations in this regard. Statement of contribution What is already known on this subject? The origin of the Gestalt principles is still an ongoing debate: Are they innate, learned over time, or both? Developmental research has revealed how each Gestalt principle has its own trajectory and unique relationship to visual experience. Both connectedness and object-based grouping play an important role in object formation during childhood. What does this study add? The study identifies how sensitivity to connectedness and object-based grouping evolves in individuals, aged 9-21 years old. Using multiple-object tracking, results reveal that the ability to track multiple objects increases with age. These results provide an informative baseline to understand clinical aberrations in different types of grouping. © 2018 The Authors. British Journal of Developmental Psychology published by John Wiley & Sons Ltd on behalf of British Psychological Society.
Multiple-Choice Question Tests: A Convenient, Flexible and Effective Learning Tool? A Case Study
ERIC Educational Resources Information Center
Douglas, Mercedes; Wilson, Juliette; Ennis, Sean
2012-01-01
The research presented in this paper is part of a project investigating assessment practices, funded by the Scottish Funding Council. Using established principles of good assessment and feedback, the use of online formative and summative multiple-choice tests (MCTs) was piloted to support independent and self-directed learning and improve…
ERIC Educational Resources Information Center
Chen, Qi; Mirman, Daniel
2012-01-01
One of the core principles of how the mind works is the graded, parallel activation of multiple related or similar representations. Parallel activation of multiple representations has been particularly important in the development of theories and models of language processing, where coactivated representations ("neighbors") have been shown to…
Note: Toward multiple addressable optical trapping
Faustov, Alexei R.; Webb, Michael R.; Walt, David R.
2010-01-01
We describe a setup for addressable optical trapping in which a laser source is focused on a digital micromirror device and generates an optical trap in a microfluidic cell. In this paper, we report a proof-of-principle single beam/single micromirror/single three-dimensional trap arrangement that should serve as the basis for a multiple-trap instrument. PMID:20192526
Babinet's principle in double-refraction systems
NASA Astrophysics Data System (ADS)
Ropars, Guy; Le Floch, Albert
2014-06-01
Babinet's principle applied to systems with double refraction is shown to involve spatial interchanges between the ordinary and extraordinary patterns observed through two complementary screens. As in the case of metamaterials, the extraordinary beam does not follow the Snell-Descartes refraction law, and the superposition principle has to be applied simultaneously at two points. Surprisingly, and contrary to intuition, in the presence of the screen with an opaque region we observe that the emerging extraordinary photon pattern, which has nevertheless undergone a deviation, remains fixed when a natural birefringent crystal is rotated, while the ordinary pattern rotates with the crystal. The twofold application of Babinet's principle implies intensity and polarization interchanges but also spatial and dynamic interchanges which should occur in birefringent metamaterials.
Implications of the Fourteen Points of Total Quality Management (TQM) for Science Education.
ERIC Educational Resources Information Center
Aliff, John Vincent
The management theories of W. Edwards Deming are known as Total Quality Management (TQM) and advocate building quality into organizational processes rather than analyzing outcomes. Although TQM was originally developed for the workplace, educational reformers have been applying its principles to higher education. The original 14 points of Deming's…
NASA Technical Reports Server (NTRS)
Rimskiy-Korsakov, A. V.; Belousov, Y. I.
1973-01-01
A program was compiled for calculating the acoustical pressure levels that might be created by vibrations of complex structures (an assembly of shells and rods) under the influence of a given force, for cases in which these fields cannot be measured directly. The acoustical field is determined from the frequency and pulse characteristics of the structure in the projection mode. For vibrating systems in which the reciprocity principle holds true, the projection characteristics are equal to the reception characteristics. Characteristics in the receiving mode are calculated on the basis of experimental data on a point pulse space velocity source (input signal) and the vibration response of the structure (output signal). The space velocity of a pulse source, placed at the point in space r where the sound field of the structure p(r,t) is to be calculated, is determined by measurements of the acoustic pressure created by a point source at a distance R. The vibration response is measured at the point where the forces F and f exciting the system act.
Using concept mapping principles in PowerPoint.
Kinchin, I M; Cabot, L B
2007-11-01
The use of linear PowerPoint templates to support lectures may inadvertently encourage dental students to adopt a passive approach to learning and a narrow appreciation of the field of study. Such presentations may support short-term learning gains and validate assessment regimes that promote surface learning approaches at the expense of developing a wider appreciation of the field that is necessary for development of clinical expertise. Exploitation of concept mapping principles can provide a balance for the negative learning behaviour that is promoted by the unreflective use of PowerPoint. This increases the opportunities for students to access holistic knowledge structures that are indicators of expertise. We illustrate this using the example of partial denture design and show that undergraduates' grasp of learning and teaching issues is sufficiently sophisticated for them to appreciate the implications of varying the mode of presentation. Our findings indicate that students understand the strategic value of bullet-pointed presentations for short-term assessment goals and the benefits of deep learning mediated by concept mapping that may support longer term professional development. Students are aware of the tension between these competing agendas.
NASA Astrophysics Data System (ADS)
Krynkin, A.; Dolcetti, G.; Hunting, S.
2017-02-01
Accurate reconstruction of the surface roughness is of high importance to various areas of science and engineering. One important application of this technology is the remote monitoring of open channel flows through observation of their dynamic surface roughness. In this paper a novel airborne acoustic method of roughness reconstruction is proposed and tested with a static rigid rough surface. The method is based on the acoustic holography principle and the Kirchhoff approximation, which make use of acoustic pressure data collected at multiple receiver points spread along an arch. Tikhonov regularisation and the generalised cross validation technique are used to solve the underdetermined system of equations for the acoustic pressures. The experimental data are collected above a roughness created with a 3D printer. For the given surface, it is shown that the proposed method works well with various numbers of receiver positions. The tested ratios between the number of surface points at which the surface elevation can be reconstructed and the number of receiver positions are 2.5, 5, and 7.5. It is shown that, in a region comparable with the projected size of the main directivity lobe, the method is able to reconstruct the spatial spectrum density of the actual surface elevation with an accuracy of 20%.
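The Tikhonov step described above can be illustrated generically. The sketch below is not the paper's code: the propagation matrix is a random placeholder standing in for the Kirchhoff-approximation kernel, and the regularisation parameter is fixed by hand instead of by generalised cross validation; only the ratio of surface points to receiver positions (5) mirrors the paper's tests.

```python
# Generic Tikhonov-regularised inversion sketch: recover more surface unknowns
# than receiver positions from a linear forward model p = A @ h.
import numpy as np

rng = np.random.default_rng(1)
n_receivers, n_surface_points = 24, 120                 # ratio 5, as in one of the tested cases
A = rng.normal(size=(n_receivers, n_surface_points))    # hypothetical propagation kernel
h_true = np.sin(np.linspace(0, 6 * np.pi, n_surface_points))
p = A @ h_true + 0.01 * rng.normal(size=n_receivers)    # noisy "measured" pressures

lam = 1e-2                                              # regularisation parameter (chosen by hand)
h_est = np.linalg.solve(A.T @ A + lam * np.eye(n_surface_points), A.T @ p)
print(np.linalg.norm(h_est - h_true) / np.linalg.norm(h_true))   # relative error of the estimate
```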
An evolutionary reduction principle for mutation rates at multiple Loci.
Altenberg, Lee
2011-06-01
A model of mutation rate evolution for multiple loci under arbitrary selection is analyzed. Results are obtained using techniques from Karlin (Evolutionary Biology, vol. 14, pp. 61-204, 1982) that overcome the weak selection constraints needed for tractability in prior studies of multilocus event models. A multivariate form of the reduction principle is found: reduction results at individual loci combine topologically to produce a surface of mutation rate alterations that are neutral for a new modifier allele. New mutation rates survive if and only if they fall below this surface, a generalization of the hyperplane found by Zhivotovsky et al. (Proc. Natl. Acad. Sci. USA 91, 1079-1083, 1994) for a multilocus recombination modifier. Increases in mutation rates at some loci may evolve if compensated for by decreases at other loci. The strength of selection on the modifier scales in proportion to the number of germline cell divisions, and increases with the number of loci affected. Loci that do not make a difference to marginal fitnesses at equilibrium are not subject to the reduction principle, and under fine tuning of mutation rates would be expected to have higher mutation rates than loci in mutation-selection balance. Other results include the nonexistence of 'viability analogous, Hardy-Weinberg' modifier polymorphisms under multiplicative mutation, and the sufficiency of average transmission rates to encapsulate the effect of modifier polymorphisms on the transmission of loci under selection. A conjecture is offered regarding situations, like recombination in the presence of mutation, that exhibit departures from the reduction principle. Constraints for tractability are: tight linkage of all loci, initial fixation at the modifier locus, and mutation distributions comprising transition probabilities of reversible Markov chains.
Pointing with Power or Creating with Chalk
ERIC Educational Resources Information Center
Rudow, Sasha R.; Finck, Joseph E.
2015-01-01
This study examines the attitudes of students on the use of PowerPoint and chalk/white boards in college science lecture classes. Students were asked to complete a survey regarding their experiences with PowerPoint and chalk/white boards in their science classes. Both multiple-choice and short answer questions were used. The multiple-choice…
Casper, Andrew; Liu, Dalong; Ebbini, Emad S
2012-01-01
A system for the realtime generation and control of multiple-focus ultrasound phased-array heating patterns is presented. The system employs a 1-MHz, 64-element array and driving electronics capable of fine spatial and temporal control of the heating pattern. The driver is integrated with a realtime 2-D temperature imaging system implemented on a commercial scanner. The coordinates of the temperature control points are defined on B-mode guidance images from the scanner, together with the temperature set points and controller parameters. The temperature at each point is controlled by an independent proportional, integral, and derivative controller that determines the focal intensity at that point. Optimal multiple-focus synthesis is applied to generate the desired heating pattern at the control points. The controller dynamically reallocates the power available among the foci from the shared power supply upon reaching the desired temperature at each control point. Furthermore, anti-windup compensation is implemented at each control point to improve the system dynamics. In vitro experiments in tissue-mimicking phantom demonstrate the robustness of the controllers for short (2-5 s) and longer multiple-focus high-intensity focused ultrasound exposures. Thermocouple measurements in the vicinity of the control points confirm the dynamics of the temperature variations obtained through noninvasive feedback. © 2011 IEEE
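A conceptual sketch of the control structure described above, not the device firmware: one PID loop per control point requests a focal intensity, and the shared power budget is rescaled whenever the total request exceeds it. The plant model, gains, set points, and budget below are hypothetical.

```python
# Conceptual sketch: independent PID temperature control at several points that
# share a limited power supply, with a simple anti-windup clamp.
class PID:
    def __init__(self, kp, ki, kd, out_max):
        self.kp, self.ki, self.kd, self.out_max = kp, ki, kd, out_max
        self.integral, self.prev_err = 0.0, 0.0

    def step(self, setpoint, measured, dt):
        err = setpoint - measured
        self.integral += err * dt
        deriv = (err - self.prev_err) / dt
        self.prev_err = err
        out = self.kp * err + self.ki * self.integral + self.kd * deriv
        if out > self.out_max:           # anti-windup: clamp output and bleed integral
            self.integral -= err * dt
            out = self.out_max
        return max(out, 0.0)

def allocate(requests, budget):
    """Rescale requested intensities when the shared power budget is exceeded."""
    total = sum(requests)
    scale = min(1.0, budget / total) if total > 0 else 0.0
    return [r * scale for r in requests]

controllers = [PID(2.0, 0.5, 0.1, out_max=10.0) for _ in range(3)]
temps, setpts, budget, dt = [37.0, 37.0, 37.0], [45.0, 43.0, 44.0], 18.0, 0.1
for _ in range(200):                     # toy first-order thermal response of each point
    req = [c.step(sp, t, dt) for c, sp, t in zip(controllers, setpts, temps)]
    powers = allocate(req, budget)
    temps = [t + dt * (0.8 * p - 0.1 * (t - 37.0)) for t, p in zip(temps, powers)]
print([round(t, 1) for t in temps])      # temperatures after 20 s of simulated heating
```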
Rogers, Geoffrey
2018-06-01
The Yule-Nielsen effect is an influence on halftone color caused by the diffusion of light within the paper upon which the halftone ink is printed. The diffusion can be characterized by a point spread function. In this paper, a point spread function for paper is derived using the multiple-path model of reflection. This model treats the interaction of light with turbid media as a random walk. Using the multiple-path point spread function, a general expression is derived for the average reflectance of light from a frequency-modulated halftone, in which dot size is constant and the number of dots is varied, with the arrangement of dots random. It is also shown that the line spread function derived from the multiple-path model has the form of a Lorentzian function.
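As a small illustration of the functional form mentioned in the last sentence (not the paper's derivation), the following evaluates a Lorentzian line spread function with a hypothetical half-width and checks its normalisation numerically.

```python
# Small illustration: a Lorentzian line spread function, normalised so that it
# integrates to one over the real line; the half-width b is a hypothetical value.
import numpy as np

def lorentzian_lsf(x, b):
    """LSF(x) = (b / pi) / (x**2 + b**2)."""
    return (b / np.pi) / (x ** 2 + b ** 2)

x = np.linspace(-2.0, 2.0, 4001)          # mm, hypothetical lateral coordinate
b = 0.1                                   # mm, hypothetical optical spread in the paper substrate
lsf = lorentzian_lsf(x, b)
print(lsf.sum() * (x[1] - x[0]))          # close to 1 (tails extend beyond the window)
```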
Hanel, Rudolf; Thurner, Stefan; Gell-Mann, Murray
2014-05-13
The maximum entropy principle (MEP) is a method for obtaining the most likely distribution functions of observables from statistical systems by maximizing entropy under constraints. The MEP has found hundreds of applications in ergodic and Markovian systems in statistical mechanics, information theory, and statistics. For several decades there has been an ongoing controversy over whether the notion of the maximum entropy principle can be extended in a meaningful way to nonextensive, nonergodic, and complex statistical systems and processes. In this paper we start by reviewing how Boltzmann-Gibbs-Shannon entropy is related to multiplicities of independent random processes. We then show how the relaxation of independence naturally leads to the most general entropies that are compatible with the first three Shannon-Khinchin axioms, the (c,d)-entropies. We demonstrate that the MEP is a perfectly consistent concept for nonergodic and complex statistical systems if their relative entropy can be factored into a generalized multiplicity and a constraint term. The problem of finding such a factorization reduces to finding an appropriate representation of relative entropy in a linear basis. In a particular example we show that path-dependent random processes with memory naturally require specific generalized entropies. The example is to our knowledge the first exact derivation of a generalized entropy from the microscopic properties of a path-dependent random process.
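The ordinary (Boltzmann-Gibbs-Shannon) case of the MEP discussed above can be illustrated numerically. The sketch below maximises Shannon entropy under a normalisation and a mean constraint; it does not attempt the paper's generalised (c,d)-entropies, and the target mean is a made-up constraint.

```python
# Textbook sketch of the ordinary maximum entropy principle: find the distribution
# on states 0..4 maximising Shannon entropy subject to normalisation and a mean.
import numpy as np
from scipy.optimize import minimize

states = np.arange(5)
target_mean = 1.5                                  # hypothetical constraint

def neg_entropy(p):
    p = np.clip(p, 1e-12, None)                    # avoid log(0)
    return np.sum(p * np.log(p))

constraints = [
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},
    {"type": "eq", "fun": lambda p: p @ states - target_mean},
]
res = minimize(neg_entropy, x0=np.full(5, 0.2), bounds=[(0, 1)] * 5,
               constraints=constraints)
print(np.round(res.x, 4))    # numerically close to the exponential (Gibbs) form
```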
What really separates casuistry from principlism in biomedical ethics.
Cudney, Paul
2014-06-01
Since the publication of the first edition of Tom Beauchamp and James Childress's Principles of Biomedical Ethics there has been much debate about what a proper method in medical ethics should look like. The main rival for Beauchamp and Childress's account, principlism, has consistently been casuistry, an account that recommends argument by analogy from paradigm cases. Admirably, Beauchamp and Childress have modified their own view in successive editions of Principles of Biomedical Ethics in order to address the concerns proponents of casuistry and others have had about principlism. Given these adjustments to their view, some have claimed that principlism and casuistry no longer count as distinct methods. Even so, many still consider these two conceptions of bioethical methodologies as rivals. Both accounts of the relationship between casuistry and principlism are wrong. These two conceptions of methodology in biomedical ethics are significantly different, but the differences are not the ones pointed out by those who still claim that they are distinct positions. In this article, I explain where the real similarities and differences lie between these two views.
Motion control of a gantry crane with a container
NASA Astrophysics Data System (ADS)
Shugailo, T. S.; Yushkov, M. P.
2018-05-01
The transportation of a container by a gantry crane in a given time from one point of space to another is considered. The system is at rest at the end of the motion. A maximum admissible speed is taken into account. The control force is found using either the Pontryagin maximum principle or the generalized Gauss principle. The advantages of the second method over the first are demonstrated.
Detailed gravity anomalies from GEOS-3 satellite altimetry data
NASA Technical Reports Server (NTRS)
Gopalapillai, G. S.; Mourad, A. G.
1978-01-01
A technique for deriving mean gravity anomalies from dense altimetry data was developed. A combination of both deterministic and statistical techniques was used. The basic mathematical model was based on Stokes' equation, which describes the analytical relationship between mean gravity anomalies and the geoid undulation at a point; this undulation is a linear function of the altimetry data at that point. The overdetermined problem resulting from the excessive altimetry data available was solved using least-squares principles. These principles enable the simultaneous estimation of the associated standard deviations reflecting the internal consistency, based on the accuracy estimates provided for the altimetry data as well as for the terrestrial anomaly data. Several test computations were made of the anomalies and their accuracy estimates using GEOS-3 data.
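A generic illustration of the least-squares step described above: the design matrix below is a random placeholder (in the report it would come from discretising Stokes' equation relating mean anomalies to altimetry-derived undulations), and the stated observation accuracies are propagated into standard deviations of the estimates via the normal-matrix inverse.

```python
# Weighted least squares for an overdetermined linear model y = A x + noise,
# with per-observation accuracies sigma; all numbers here are hypothetical.
import numpy as np

rng = np.random.default_rng(2)
n_obs, n_anom = 200, 10
A = rng.normal(size=(n_obs, n_anom))              # hypothetical design matrix
sigma = np.full(n_obs, 0.05)                      # stated observation accuracies
x_true = rng.normal(size=n_anom)
y = A @ x_true + rng.normal(0, sigma)

W = np.diag(1.0 / sigma**2)                       # weight = inverse variance
N = A.T @ W @ A                                   # normal matrix
x_hat = np.linalg.solve(N, A.T @ W @ y)           # estimated mean anomalies
std_hat = np.sqrt(np.diag(np.linalg.inv(N)))      # internal-consistency standard deviations
print(np.round(x_hat - x_true, 3), np.round(std_hat, 3))
```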
Point defects in thorium nitride: A first-principles study
NASA Astrophysics Data System (ADS)
Pérez Daroca, D.; Llois, A. M.; Mosca, H. O.
2016-11-01
Thorium and its compounds (carbides and nitrides) are being investigated as possible materials to be used as nuclear fuels for Generation-IV reactors. As a first step in the research of these materials under irradiation, we study the formation energies and stability of point defects in thorium nitride by means of first-principles calculations within the framework of density functional theory. We focus on vacancies, interstitials, Frenkel pairs and Schottky defects. We find that N and Th vacancies have almost the same formation energy and that the most energetically favorable defects of all those studied in this work are N interstitials. To the best of the authors' knowledge, results of this kind for ThN have not been obtained previously, either experimentally or theoretically.
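For orientation, the bookkeeping behind such formation energies is shown below with purely hypothetical placeholder numbers (not the paper's DFT values): a vacancy formation energy adds back the chemical potential of the removed atom, while an interstitial formation energy subtracts the chemical potential of the added atom.

```python
# Illustrative bookkeeping only; all energies and chemical potentials are hypothetical.
#   E_f(vacancy)      = E(defective) - E(perfect) + mu(removed atom)
#   E_f(interstitial) = E(defective) - E(perfect) - mu(added atom)
E_perfect = -1000.00        # eV, hypothetical perfect-supercell total energy
mu_N  = -8.30               # eV, hypothetical nitrogen chemical potential
mu_Th = -7.40               # eV, hypothetical thorium chemical potential

E_N_vacancy      = -989.50  # hypothetical defective-supercell total energies
E_Th_vacancy     = -990.40
E_N_interstitial = -1007.00

print("E_f(V_N)  =", round(E_N_vacancy - E_perfect + mu_N, 2), "eV")
print("E_f(V_Th) =", round(E_Th_vacancy - E_perfect + mu_Th, 2), "eV")
print("E_f(N_i)  =", round(E_N_interstitial - E_perfect - mu_N, 2), "eV")
```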
NASA Technical Reports Server (NTRS)
Clarke, R.; Lintereur, L.; Bahm, C.
2016-01-01
A desire for more complete documentation of the National Aeronautics and Space Administration (NASA) Armstrong Flight Research Center (AFRC), Edwards, California legacy code used in the core simulation has led to this effort to fully document the oblate Earth six-degree-of-freedom equations of motion and integration algorithm. The authors of this report have taken much of the earlier work of the simulation engineering group and used it as a jumping-off point for this report. The largest addition this report makes is that each element of the equations of motion is traced back to first principles and at no point is the reader forced to take an equation on faith alone. There are no discoveries of previously unknown principles contained in this report; this report is a collection and presentation of textbook principles. The value of this report is that those textbook principles are herein documented in standard nomenclature that matches the form of the computer code DERIVC. Previous handwritten notes are much of the backbone of this work; however, in almost every area, derivations are explicitly shown to assure the reader that the equations which make up the oblate Earth version of the computer routine, DERIVC, are correct.
Learning viewpoint invariant object representations using a temporal coherence principle.
Einhäuser, Wolfgang; Hipp, Jörg; Eggert, Julian; Körner, Edgar; König, Peter
2005-07-01
Invariant object recognition is arguably one of the major challenges for contemporary machine vision systems. In contrast, the mammalian visual system performs this task virtually effortlessly. How can we exploit our knowledge of the biological system to improve artificial systems? Our understanding of the mammalian early visual system has been augmented by the discovery that general coding principles could explain many aspects of neuronal response properties. How can such schemes be transferred to system-level performance? In the present study we train cells on a particular variant of the general principle of temporal coherence, the "stability" objective. These cells are trained on unlabeled real-world images without a teaching signal. We show that after training, the cells form a representation that is largely independent of the viewpoint from which the stimulus is viewed. This finding includes generalization to previously unseen viewpoints. The achieved representation is better suited for viewpoint-invariant object classification than the cells' input patterns. This ability to facilitate viewpoint-invariant classification is maintained even if training and classification take place in the presence of a distractor object, which is also unlabeled. In summary, here we show that unsupervised learning using a general coding principle facilitates the classification of real-world objects that are not segmented from the background and undergo complex, non-isomorphic transformations.
Schmidt, Eric; Ros, Maxime; Moyse, Emmanuel; Lorthois, Sylvie; Swider, Pascal
2016-01-01
In line with the first law of thermodynamics, Bernoulli's principle states that the total energy in a fluid is the same at all points. We applied Bernoulli's principle to understand the relationship between intracranial pressure (ICP) and intracranial fluids. We analyzed simple fluid physics along a tube to describe the interplay between pressure and velocity. Bernoulli's equation demonstrates that a fluid does not flow along a gradient of pressure or velocity; a fluid flows along a gradient of energy from a high-energy region to a low-energy region. A fluid can even flow against a pressure gradient or a velocity gradient. Pressure and velocity represent part of the total energy. Cerebral blood perfusion is not driven by pressure but by energy: the blood flows from high-energy to lower-energy regions. Hydrocephalus is related to increased cerebrospinal fluid (CSF) resistance (i.e., energy transfer) at various points. Identification of the energy transfer within the CSF circuit is important in understanding and treating CSF-related disorders. Bernoulli's principle is not an abstract concept far from clinical practice. We should be aware that pressure is easy to measure, but it does not induce resumption of fluid flow. Even at the bedside, energy is the key to understanding ICP and fluid dynamics.
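A tiny numerical illustration of the point being made (all values hypothetical): computing the total mechanical energy per unit volume at two points shows that flow can run toward the point of higher static pressure as long as that point has lower total energy.

```python
# Total mechanical energy per unit volume: E = p + 0.5*rho*v**2 + rho*g*h.
# Flow runs from the higher-E point to the lower-E point, even when the
# downstream static pressure is higher. Values are hypothetical.
rho, g = 1000.0, 9.81                      # water density (kg/m^3), gravity (m/s^2)

def energy(p, v, h):
    return p + 0.5 * rho * v ** 2 + rho * g * h

E_upstream   = energy(p=10_000.0, v=3.0, h=0.0)   # lower pressure, faster flow
E_downstream = energy(p=12_000.0, v=1.0, h=0.0)   # higher pressure, slower flow
direction = ("upstream -> downstream" if E_upstream > E_downstream
             else "downstream -> upstream")
print(E_upstream, E_downstream, direction)
```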
Quantum Mechanics predicts evolutionary biology.
Torday, J S
2018-07-01
Nowhere are the shortcomings of conventional descriptive biology more evident than in the literature on Quantum Biology. In the ongoing effort to apply Quantum Mechanics to evolutionary biology, merging Quantum Mechanics with the fundamentals of evolution as the First Principles of Physiology (namely negentropy, chemiosmosis, and homeostasis) offers an authentic opportunity to understand how and why physics constitutes the basic principles of biology. Negentropy and chemiosmosis confer determinism on the unicell, whereas homeostasis constitutes Free Will because it offers a probabilistic range of physiologic set points. Similarly, on this basis several principles of Quantum Mechanics also apply directly to biology. The Pauli Exclusion Principle is both deterministic and probabilistic, whereas non-localization and the Heisenberg Uncertainty Principle are both probabilistic, providing the long-sought-after ontologic and causal continuum from physics to biology and evolution as the holistic integration recognized as consciousness for the first time. Copyright © 2018 Elsevier Ltd. All rights reserved.
36 CFR 200.1 - Central organization.
Code of Federal Regulations, 2010 CFR
2010-07-01
... National Forest System lands and management of natural resources within the principle of multiple use and... engineering, forest economics and marketing, watersheds, wildlife and fish habitat, range, recreation and...
NASA Astrophysics Data System (ADS)
Gedik, Z.; Çakmak, B.
2013-04-01
Special relativity forbids superluminal influences. Using only the no-signaling principle and an assumption about the form of the Schmidt decomposition, we show that for any allowed fidelity there is a unique approximate qubit cloner which can be written explicitly. We introduce the prime cloners whose fidelities have multiplicative property and show that the fidelity of the prime cloners for the infinite copy limit is 1/2.
Demonstration of Human-Autonomy Teaming Principles
NASA Technical Reports Server (NTRS)
Shively, Robert Jay
2016-01-01
Known problems with automation include lack of mode awareness, automation brittleness, and risk of miscalibrated trust. Human-Autonomy Teaming (HAT) is essential for improving these problems. We have identified some critical components of HAT and ran a part-task study to introduce these components to a ground station that supports flight following of multiple aircraft. Our goal was to demonstrate, evaluate, and refine HAT principles. This presentation provides a brief summary of the study and initial findings.
Correlation complementarity yields Bell monogamy relations.
Kurzyński, P; Paterek, T; Ramanathan, R; Laskowski, W; Kaszlikowski, D
2011-05-06
We present a method to derive Bell monogamy relations by connecting the complementarity principle with quantum nonlocality. The resulting monogamy relations are stronger than those obtained from the no-signaling principle alone. In many cases, they yield tight quantum bounds on the amount of violation of single and multiple qubit correlation Bell inequalities. In contrast with the two-qubit case, a rich structure of possible violation patterns is shown to exist in the multipartite scenario.
A first principles calculation and statistical mechanics modeling of defects in Al-H system
NASA Astrophysics Data System (ADS)
Ji, Min; Wang, Cai-Zhuang; Ho, Kai-Ming
2007-03-01
The behavior of defects and hydrogen in Al was investigated by first-principles calculations and statistical mechanics modeling. The formation energies of different defects in the Al+H system, such as the Al vacancy, interstitial H, and multiple H in an Al vacancy, were calculated by first-principles methods. Defect concentrations in thermodynamic equilibrium were studied by total free-energy calculations including configurational entropy and defect-defect interactions, from the low-concentration limit to the hydride limit. In our grand canonical ensemble model, the hydrogen chemical potential under different environments plays an important role in determining the defect concentrations and properties in the Al-H system.
NASA Astrophysics Data System (ADS)
Boerner, S.; Funke, H. H.-W.; Hendrick, P.; Recker, E.; Elsing, R.
2013-03-01
The use of alternative fuels in the aircraft industry plays an important role in current aero engine research and development. The micromix burning principle allows secure, low-NOx combustion of gaseous hydrogen. The combustion principle is based on the fluid phenomenon of a jet in cross flow and achieves a significant reduction in NOx formation by using multiple miniaturized flames. The paper highlights the development and integration of a combustion chamber based on the micromix combustion principle into an Auxiliary Power Unit (APU) GTCP 36-300, with regard to the necessary modifications to the gas turbine and the engine controller.
Basic principles of fracture treatment in children.
Ömeroğlu, Hakan
2018-04-01
This review aims to summarize the basic treatment principles of fractures according to their types and general management principles of special conditions including physeal fractures, multiple fractures, open fractures, and pathologic fractures in children. Definition of the fracture is needed for better understanding the injury mechanism, planning a proper treatment strategy, and estimating the prognosis. As the healing process is less complicated, remodeling capacity is higher and non-union is rare, the fractures in children are commonly treated by non-surgical methods. Surgical treatment is preferred in children with multiple injuries, in open fractures, in some pathologic fractures, in fractures with coexisting vascular injuries, in fractures which have a history of failed initial conservative treatment and in fractures in which the conservative treatment has no/little value such as femur neck fractures, some physeal fractures, displaced extension and flexion type humerus supracondylar fractures, displaced humerus lateral condyle fractures, femur, tibia and forearm shaft fractures in older children and adolescents and unstable pelvis and acetabulum fractures. Most of the fractures in children can successfully be treated by non-surgical methods.
Topological photonic crystal with equifrequency Weyl points
NASA Astrophysics Data System (ADS)
Wang, Luyang; Jian, Shao-Kai; Yao, Hong
2016-06-01
Weyl points in three-dimensional photonic crystals behave as monopoles of Berry flux in momentum space. Here, based on general symmetry analysis, we show that a minimal number of four symmetry-related (consequently equifrequency) Weyl points can be realized in time-reversal invariant photonic crystals. We further propose an experimentally feasible way to modify double-gyroid photonic crystals to realize four equifrequency Weyl points, which is explicitly confirmed by our first-principles photonic band-structure calculations. Remarkably, photonic crystals with equifrequency Weyl points are qualitatively advantageous in applications including angular selectivity, frequency selectivity, invisibility cloaking, and three-dimensional imaging.
The Multiple Control of Verbal Behavior
Michael, Jack; Palmer, David C; Sundberg, Mark L
2011-01-01
Amid the novel terms and original analyses in Skinner's Verbal Behavior, the importance of his discussion of multiple control is easily missed, but multiple control of verbal responses is the rule rather than the exception. In this paper we summarize and illustrate Skinner's analysis of multiple control and introduce the terms convergent multiple control and divergent multiple control. We point out some implications for applied work and discuss examples of the role of multiple control in humor, poetry, problem solving, and recall. Joint control and conditional discrimination are discussed as special cases of multiple control. We suggest that multiple control is a useful analytic tool for interpreting virtually all complex behavior, and we consider the concepts of derived relations and naming as cases in point. PMID:22532752
38 CFR 3.303 - Principles relating to service connection.
Code of Federal Regulations, 2010 CFR
2010-07-01
... permit service connection of arthritis, disease of the heart, nephritis, or pulmonary disease, first..., tuberculosis, multiple sclerosis, etc.), there is no requirement of evidentiary showing of continuity...
Photographs and Committees: Activities That Help Students Discover Permutations and Combinations.
ERIC Educational Resources Information Center
Szydlik, Jennifer Earles
2000-01-01
Presents problem situations that support students when discovering the multiplication principle, permutations, combinations, Pascal's triangle, and relationships among those objects in a concrete context. (ASK)
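The counting rules behind such activities can be stated compactly; the classroom numbers in the sketch below are made up for illustration.

```python
# Worked example of the counting rules the activities target: the multiplication
# principle, permutations, and combinations (numbers are illustrative only).
import math

# Multiplication principle: 4 photograph poses x 3 backgrounds = 12 outcomes
outcomes = 4 * 3

# Permutations: ordered committees (president, secretary) from 10 students
p_10_2 = math.perm(10, 2)          # 10 * 9 = 90

# Combinations: unordered 2-person committees from 10 students
c_10_2 = math.comb(10, 2)          # 45, an entry of row 10 of Pascal's triangle

print(outcomes, p_10_2, c_10_2)
```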
Havener, Robin W; Liang, Yufeng; Brown, Lola; Yang, Li; Park, Jiwoong
2014-06-11
We report a systematic study of the optical conductivity of twisted bilayer graphene (tBLG) across a large energy range (1.2-5.6 eV) for various twist angles, combined with first-principles calculations. At previously unexplored high energies, our data show signatures of multiple van Hove singularities (vHSs) in the tBLG bands as well as the nonlinearity of the single layer graphene bands and their electron-hole asymmetry. Our data also suggest that excitonic effects play a vital role in the optical spectra of tBLG. Including electron-hole interactions in first-principles calculations is essential to reproduce the shape of the conductivity spectra, and we find evidence of coherent interactions between the states associated with the multiple vHSs in tBLG.
NASA Astrophysics Data System (ADS)
Hess, Holger; Albrecht, Martin; Grothof, Markus; Hussmann, Stephan; Schwarte, Rudolf
2004-01-01
In the course of work on optical distance measurement, a new optical correlator has been developed at the Institute for Data Processing of the University of Siegen in recent years. The so-called Photonic Mixer Device (PMD), originally intended for laser ranging systems, offers many advantages for wireless optical data communication, such as high-speed spatial light demodulation up to the GHz range and inherent backlight suppression. This contribution describes the application of such PMDs in a free-space interconnect based on the principle of Multi Dimensional Multiple Access (MDMA) and the advantages of this new approach, starting from the MDMA principle followed by the fundamental functionality of PMDs. Finally, an Optical MDMA (O-MDMA) demonstrator and first measurement results are presented.
Monitoring system of multiple fire fighting based on computer vision
NASA Astrophysics Data System (ADS)
Li, Jinlong; Wang, Li; Gao, Xiaorong; Wang, Zeyong; Zhao, Quanke
2010-10-01
With the high demand for fire control in spacious buildings, computer vision is playing an increasingly important role. This paper presents a new monitoring system for multiple fire fighting based on computer vision and color detection. The system can orient itself to the fire position and then extinguish the fire by itself. The system structure, working principle, fire localization, hydrant angle adjustment, and system calibration are described in detail, and the design of the relevant hardware and software is introduced. The principle and process of color detection and image processing are given as well. The system performed well in testing, with high reliability, low cost, and easy node expansion, giving it good prospects for application and popularization.
Group-multicast capable optical virtual private ring with contention avoidance
NASA Astrophysics Data System (ADS)
Peng, Yunfeng; Du, Shu; Long, Keping
2008-11-01
A ring based optical virtual private network (OVPN) employing contention sensing and avoidance is proposed to deliver multiple-to-multiple group-multicast traffic. The network architecture is presented and its operation principles as well as performance are investigated. The main contribution of this article is the presentation of an innovative group-multicast capable OVPN architecture with technologies available today.
An analysis of the multiple model adaptive control algorithm. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Greene, C. S.
1978-01-01
Qualitative and quantitative aspects of the multiple model adaptive control method are detailed. The method is a cascade of a structure resembling a maximum a posteriori probability identifier (basically a bank of Kalman filters) and a bank of linear quadratic regulators. Major qualitative properties of the MMAC method are examined and principal reasons for unacceptable behavior are explored.
ERIC Educational Resources Information Center
Hodkowski, Nicola M.; Gardner, Amber; Jorgensen, Cody; Hornbein, Peter; Johnson, Heather L.; Tzur, Ron
2016-01-01
In this paper we examine the application of Tzur's (2007) fine-grained assessment to the design of an assessment measure of a particular multiplicative scheme so that non-interview, good-enough data can be obtained (on a large scale) to make inferences about elementary students' reasoning. We outline three design principles that surfaced through our recent…
The Value of Friction, Tension, and Disparity in Global Collaboration (Invited)
NASA Astrophysics Data System (ADS)
Parsons, M. A.
2013-12-01
Misunderstandings; conflicting goals; competition for limited funds; differing worldviews, agendas, ideals... These types of 'friction' are inevitable in national and global collaboration. And while friction can create tension and conflict, it is not inherently bad. It is at these points of interaction and tension where we can sometimes gain the most insight. Common understanding comes not only through agreed universal principles but also through multiple lines of evidence that wind through disparate views and describe a greater story. Collaboration is not straightforward in an environment of friction, tension, and disparity. Collaborators do not necessarily have common goals. Dynamic, coalition-style politics emerge. How can we align these disparities to achieve standards and common knowledge while still valuing and understanding differing perspectives? Achieving the understanding that comes through both unity and disparity is a central goal of the Research Data Alliance. RDA is emerging as a "neutral place" or "social gateway" where frictions can be identified, addressed, and understood but not necessarily removed.
Keith, M M; Cann, B; Brophy, J T; Hellyer, D; Day, M; Egan, S; Mayville, K; Watterson, A
2001-01-01
This research was prompted by the clinical presentation of workers from a variety of gaming occupations with injuries and illnesses and multiple health and safety concerns. Using participatory action research principles, 51 gaming workers in Ontario and 20 gaming workers in Manitoba were consulted during a series of focus group sessions. Mapping exercises were used to survey the participants about their health concerns, perceived occupational hazards and the impact of working conditions on their personal lives. Participants were then asked to prioritize their concerns and make recommendations for improvements. Gaming workers from both provinces identified similar health, hazard and psycho-social concerns. They prioritized the issues of stress, ergonomics, indoor air quality (including second-hand smoke and temperature), biological hazards, physical hazards and noise. This study points to a need to more fully investigate and address health and safety issues in the gaming industry. It also demonstrates the effectiveness of a worker-driven, participatory consultation. Copyright 2001 Wiley-Liss, Inc.
Electronic defects in the halide antiperovskite semiconductor Hg3Se2I2
NASA Astrophysics Data System (ADS)
Kim, Joon-Il; Peters, John A.; He, Yihui; Liu, Zhifu; Das, Sanjib; Kontsevoi, Oleg Y.; Kanatzidis, Mercouri G.; Wessels, Bruce W.
2017-10-01
Halide perovskites have emerged as a potential photoconducting material for photovoltaics and hard radiation detection. We investigate the nature of charge transport in the semi-insulating chalcohalide Hg3Se2I2 compound using the temperature dependence of dark current, thermally stimulated current (TSC) spectroscopy, and photoconductivity measurements as well as first-principles density functional theory (DFT) calculations. Dark conductivity measurements and TSC spectroscopy indicate the presence of multiple shallow and deep level traps that have relatively low concentrations of the order of 10^13-10^15 cm^-3 and capture cross sections of ~10^-16 cm^2. A distinct persistent photoconductivity is observed at both low temperatures (<170 K) and high temperatures (>230 K), with major implications for room-temperature compound semiconductor radiation detection. From preliminary DFT calculations, the origin of the traps is attributed to intrinsic vacancy defects (V_Hg, V_Se, and V_I) and interstitials (Se_int) or other extrinsic impurities. The results point the way for future improvements in crystal quality and detector performance.
From scientific literacy to sustainability literacy: An ecological framework for education
NASA Astrophysics Data System (ADS)
Colucci-Gray, Laura; Camino, Elena; Barbiero, Giuseppe; Gray, Donald
2006-03-01
In this paper, we report some reflections on science and education, in relation to teaching and research in the field of complex and controversial socio-environmental issues. Starting from an examination of the literature on the epistemological aspects of the science of controversial issues, and introducing the perspective of complexity, the article argues for a complexity of content, context, and method in understanding current problems. Focusing on a model of learning which includes dialogical and reflective approaches, the final part of the article reports on aspects of the authors' experimental practice with role-play for dealing with complex issues. The review of the literature and our experience of action research introduce a view of education which promotes young people's awareness of multiple points of view, an ability to establish relationships between processes, scales, and contexts which may be nonlinearly related, and practice with creative and nonviolent forms of interrelations with others. Such an approach in science education is coherent with a scenario of planet sustainability based on ecological webs and equity principles.
Stepping Stones to Research: Providing Pipelines from Middle School through PhD
NASA Astrophysics Data System (ADS)
Noel-Storr, Jacob; Baum, S. A.; RIT Insight Lab SSR Team; Carlson CenterImaging Science Faculty, Chester F.
2014-01-01
We present a decade's worth of strategies, "Stepping Stones to Research", designed to provide a realistic pipeline of educational opportunities, with multiple gateways and exit points, for students moving toward STEM careers along the "STEM pipeline". We also illustrate how the Stepping Stones are designed to coincide with related external opportunities through which we can also guide and support our mentees on their paths. We present programs such as middle school family science programs, high school research opportunities, high school internships, undergraduate research pathways, research experiences for undergraduates, and other opportunities. We highlight the presentations being made at this very meeting, from the first presentation of a high school student to a dissertation presentation of a PhD graduate, that have benefited from this stepping-stone principle. We also reflect on the essential nature of building a "researcher-trust", even as a young student, of advocates and mentors who can support the continuation of a scientific career.
The Electrophysiological MEMS Device with Micro Channel Array for Cellular Network Analysis
NASA Astrophysics Data System (ADS)
Tonomura, Wataru; Kurashima, Toshiaki; Takayama, Yuzo; Moriguchi, Hiroyuki; Jimbo, Yasuhiko; Konishi, Satoshi
This paper describes a new type of MCA (Micro Channel Array) for simultaneous multipoint measurement of cellular networks. The presented MCA, which employs the measurement principles of the patch-clamp technique, is designed for advanced neural network analysis of the kind previously studied by the co-authors using a 64-channel MEA (Micro Electrode Array) system. First, suction and clamping of cells through the channels of the developed MCA is expected to improve electrophysiological signal detection. Electrophysiological sensing electrodes, integrated around individual channels of the MCA using MEMS (Micro Electro Mechanical System) technologies, are electrically isolated for simultaneous multipoint measurement. In this study, we tested the developed MCA using non-cultured rat cerebral cortical slices and hippocampal neurons. We could measure the spontaneous action potentials of the slice simultaneously at multiple points and culture the neurons on the developed MCA. Herein, we describe the experimental results together with the design and fabrication of the electrophysiological MEMS device with MCA for cellular network analysis.
Yuan, Tiezhu; Wang, Hongqiang; Cheng, Yongqiang; Qin, Yuliang
2017-01-01
Radar imaging based on electromagnetic vortices can achieve azimuth resolution without relative motion. The present paper investigates this imaging technique with a single receiving antenna through theoretical analysis and experimental results. In contrast with the case of multiple receiving antennas, the echoes from a single receiver cannot be used directly for image reconstruction by the Fourier method. The reason is revealed by using the point spread function. An additional phase is therefore compensated for each mode before the imaging process, based on the array parameters and the elevation of the targets. A proof-of-concept imaging system based on a circular phased array is created, and imaging experiments of corner-reflector targets are performed in an anechoic chamber. The azimuthal image is reconstructed by the use of Fourier transform and spectral estimation methods. The azimuth resolution of the two methods is analyzed and compared through experimental data. The experimental results verify the principle of azimuth resolution and the proposed phase compensation method. PMID:28335487
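A highly schematic sketch of the compensate-then-transform idea described above. This is not the paper's processing chain: the compensation phase here is an arbitrary assumed function of the mode number, whereas in the paper it is computed from the array parameters and the target elevation, and the scatterer geometry is hypothetical.

```python
# Schematic sketch: single-receiver echoes collected over OAM modes l acquire an
# extra mode-dependent phase; after subtracting an assumed compensation phase,
# an inverse DFT over l gives an azimuthal profile with peaks at the scatterers.
import numpy as np

modes = np.arange(-8, 9)                           # transmitted OAM mode numbers l
scatterer_azimuths = np.array([0.5, 2.0])          # rad, hypothetical point targets
comp_phase = 0.3 * modes                           # assumed mode-dependent phase offset

# Echo per mode: sum over scatterers of exp(j*l*phi_k), plus the extra offset
echo = np.array([np.exp(1j * l * scatterer_azimuths).sum() for l in modes])
echo = echo * np.exp(1j * comp_phase)              # simulate the uncompensated single-receiver phase

echo_corrected = echo * np.exp(-1j * comp_phase)   # apply the compensation
phi_grid = np.linspace(0, 2 * np.pi, 360, endpoint=False)
profile = np.abs(np.exp(-1j * np.outer(phi_grid, modes)) @ echo_corrected)
print(round(phi_grid[np.argmax(profile)], 2))      # strongest response lies near a target azimuth
```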
The Advanced Gamma-ray Imaging System (AGIS): Real Time Stereoscopic Array Trigger
NASA Astrophysics Data System (ADS)
Byrum, K.; Anderson, J.; Buckley, J.; Cundiff, T.; Dawson, J.; Drake, G.; Duke, C.; Haberichter, B.; Krawzcynski, H.; Krennrich, F.; Madhavan, A.; Schroedter, M.; Smith, A.
2009-05-01
Future large arrays of Imaging Atmospheric Cherenkov Telescopes (IACTs) such as AGIS and CTA are conceived to comprise 50-100 individual telescopes, each having a camera with 10^3 to 10^4 pixels. To maximize the capabilities of such IACT arrays with a low energy threshold, a wide field of view and a low background rate, a sophisticated array trigger is required. We describe the design of a stereoscopic array trigger that calculates image parameters and then correlates them across a subset of telescopes. Fast Field Programmable Gate Array technology allows the use of lookup tables at the array trigger level to form a real-time pattern-recognition trigger that capitalizes on the multiple view points of the shower at different shower core distances. A proof of principle system is currently under construction. It is based on 400 MHz FPGAs and the goal is for camera trigger rates of up to 10 MHz and a tunable cosmic-ray background suppression at the array level.
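As a toy illustration of the lookup-table idea, the sketch below applies a camera-level decision from a precomputed table of discretized image parameters and then forms an array-level coincidence among the triggered telescopes. The parameterisation, the table contents, and the multiplicity and time-window values are assumptions chosen only to show the mechanism, not AGIS design values.

import numpy as np

# Hypothetical lookup table indexed by (log image size, image width) bins.
BINS = 16
lut = np.zeros((BINS, BINS), dtype=bool)
for i in range(BINS):          # i ~ log2(image size)
    for j in range(BINS):      # j ~ image width bin
        lut[i, j] = (i >= 4) and (j <= i)   # assumed acceptance region: bright and compact

def telescope_trigger(size, width):
    """Camera-level decision read from the pre-computed lookup table."""
    i = min(int(np.log2(size + 1)), BINS - 1)
    j = min(int(width * BINS), BINS - 1)
    return lut[i, j]

def array_trigger(events, window_ns=50.0, multiplicity=2):
    """Array-level decision: at least `multiplicity` triggered telescopes in time."""
    times = sorted(t for t, size, width in events if telescope_trigger(size, width))
    return any(times[k + multiplicity - 1] - times[k] <= window_ns
               for k in range(len(times) - multiplicity + 1))

# (time ns, image size p.e., reduced width) for three telescopes viewing one shower
shower = [(10.0, 400, 0.2), (18.0, 250, 0.3), (500.0, 30, 0.8)]
print(array_trigger(shower))   # True: two bright, narrow images within 50 ns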
What Comes Beyond the Standard Models, Proceedings to the 9th Workshop held in Bled, Slovenia.
NASA Astrophysics Data System (ADS)
Mankoc Borstnik, Norma; Nielsen, Holger Bech; Froggatt, Colin D.; Lukman, Dragan
2006-12-01
Contents: 1. Child Universes in the Laboratory (S. Ansoldi and E.I. Guendelman) 2. Relation between Finestructure Constants at the Planck Scale from Multiple Point Principle (D.L. Bennett, L.V. Laperashvili and H.B. Nielsen) 3. On the Origin of Families of Fermions and Their Mass Matrices -- Approximate Analyses of Properties of Four Families Within Approach Unifying Spins and Charges (M. Breskvar, D. Lukman and N.S. Mankoc Borstnik) 4. Cosmoparticle Physics: Cross-disciplinary Study of Physics Beyond the Standard Model (M.Yu. Khlopov) 5. Discussion Section on 4th Generation (M.Yu. Khlopov) 6. Involution Requirement on a Boundary Makes Massless Fermions Compactified on a Finite Flat Disk Mass Protected (N.S. Mankoc Borstnik and H.B. Nielsen) 7. How Can Group Theory be Generalized so Perhaps Providing Further Information About Our Universe? (R. Mirman) 8. Future Dependent Initial Conditions from Imaginary Part in Lagrangian (H.B. Nielsen and M. Ninomiya) 9. Coupling Self-tuning to Critical Lines From Highly Compact Extra Dimensions (K. Petrov)
Ethical considerations in industry-sponsored multiregional clinical trials.
Ibia, Ekopimo; Binkowitz, Bruce; Saillot, Jean-Louis; Talerico, Steven; Koerner, Chin; Ferreira, Irene; Agarwal, Anupam; Metz, Craig; Maman, Marianne
2010-01-01
During the last several decades, the scientific and ethics communities have addressed important ethical issues in medical research, resulting in the elaboration and adoption of concepts, guidelines, and codes. Ethical issues in the conduct of Multiregional Clinical Trials have attracted significant attention mainly in the last two decades. With the globalization of clinical research and the rapid expansion to countries with a limited tradition of biomedical research, sponsors must proactively address local ethical issues, the adequacy of oversight as well as the applicability and validity of data, and scientific conclusions drawn from diverse patient populations. This paper highlights some core ethical principles and milestones in medical research, and, from an industry perspective, it discusses ethical issues that the clinical trial team may face when conducting Multiregional Clinical Trials (MRCT, clinical trials conducted at sites located across multiple geographic regions of the world). This paper further highlights the areas of consensus and controversies and proposes points to consider. Copyright © 2010 John Wiley & Sons, Ltd.
New fast DCT algorithms based on Loeffler's factorization
NASA Astrophysics Data System (ADS)
Hong, Yoon Mi; Kim, Il-Koo; Lee, Tammy; Cheon, Min-Su; Alshina, Elena; Han, Woo-Jin; Park, Jeong-Hoon
2012-10-01
This paper proposes a new 32-point fast discrete cosine transform (DCT) algorithm based on Loeffler's 16-point transform. Fast integer realizations of 16-point and 32-point transforms are also provided based on the proposed transform. For the recent development of High Efficiency Video Coding (HEVC), simplified quantization and de-quantization processes are proposed. Three different forms of implementation with essentially the same performance, namely matrix multiplication, partial butterfly, and full factorization, can be chosen according to the given platform. In terms of the number of multiplications required for the realization, our proposed full factorization is 3~4 times faster than a partial butterfly, and about 10 times faster than direct matrix multiplication.
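For reference, the slowest of the three implementation forms mentioned above, direct matrix multiplication, can be written in a few lines; the sketch below builds the orthonormal 32-point DCT-II matrix and applies it to a vector and, separably, to a 32 x 32 block. The Loeffler-style factorization itself, which reduces the multiplication count, is not reproduced here.

import numpy as np

def dct_matrix(N):
    # Orthonormal DCT-II matrix: C[k, n] = s_k * cos(pi*(2n+1)*k / (2N))
    n = np.arange(N)
    C = np.cos(np.pi * (2 * n[None, :] + 1) * n[:, None] / (2 * N))
    C[0, :] *= 1 / np.sqrt(2)
    return C * np.sqrt(2.0 / N)

x = np.random.randn(32)
X = dct_matrix(32) @ x                 # direct form: N*N = 1024 multiplications

# Separable use of the same matrix on a 32x32 block (rows then columns),
# as a transform stage in a video codec would apply it.
block = np.random.randn(32, 32)
coeffs = dct_matrix(32) @ block @ dct_matrix(32).T
print(X.shape, coeffs.shape)           # (32,) (32, 32)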
SPIRAL-SPRITE: a rapid single point MRI technique for application to porous media.
Szomolanyi, P; Goodyear, D; Balcom, B; Matheson, D
2001-01-01
This study presents the application of a new, rapid, single point MRI technique which samples k space with spiral trajectories. The general principles of the technique are outlined along with application to porous concrete samples, solid pharmaceutical tablets and gas phase imaging. Each sample was chosen to highlight specific features of the method.
The Ranking of Higher Education Institutions in Russia: Some Methodological Problems.
ERIC Educational Resources Information Center
Filinov, Nikolay B.; Ruchkina, Svetlana
2002-01-01
The ranking of higher education institutions in Russia is examined from two points of view: as a social phenomenon and as a multi-criteria decision-making problem. The first point of view introduces the idea of interested and involved parties; the second introduces certain principles on which a rational ranking methodology should be based.…
ERIC Educational Resources Information Center
Rowland, D. R.
2007-01-01
The physical analysis of a uniformly accelerating point charge provides a rich problem to explore in advanced courses in electrodynamics and relativity since it brings together fundamental concepts in relation to electromagnetic radiation, Einstein's equivalence principle and the inertial mass of field energy in ways that reveal subtleties in each…
The problem of 'thick in status, thin in content' in Beauchamp and Childress' principlism.
Lee, Marvin J H
2010-09-01
For many, Thomas Beauchamp and James Childress have elaborated moral reasoning by using the four principles whereby all substantive problems of medical ethics (and of ethics more generally) can be properly analysed and cogent philosophical solutions for the problems can be found. It seems that their 'principlism' gets updated, with better features being added during the course of the six editions of Principles of Biomedical Ethics. Nonetheless, Beauchamp and Childress seem to have been losing their way when it comes to the common-morality justification, which is the epistemological (and perhaps metaphysical) backbone of their method, and this is shown more vividly in their most recent (2009) edition of Principles of Biomedical Ethics. The author points out what he calls the problem of 'thick in status, thin in content' in principlism. The problem exists because principlism cannot adequately explain how the prescriptive sense of common morality it supports is consistent with the existence of what Beauchamp and Childress call the 'legitimate moral diversity in the world'. Because of this problem, first, the practical end that principlism allegedly accomplishes (ie, providing practical moral guidelines in a relatively 'thick' content, based on common morality) is frustrated, and, second, principlism makes itself the method of common morality de jure and of moral pluralism de facto.
Project Management Using Modern Guidance, Navigation and Control Theory
NASA Technical Reports Server (NTRS)
Hill, Terry R.
2011-01-01
Implementing guidance, navigation, and control (GN&C) theory principles and applying them to the human element of project management and control is not a new concept. As both the literature on the subject and the real-world applications are neither readily available nor comprehensive with regard to how such principles might be applied, this paper has been written to educate the project manager on the "laws of physics" of his or her project (not to teach a GN&C engineer how to become a project manager) and to provide an intuitive, mathematical explanation as to the control and behavior of projects. This paper will also address how the fundamental principles of modern GN&C were applied to the National Aeronautics and Space Administration's (NASA) Constellation Program (CxP) space suit project, ensuring the project was managed within cost, schedule, and budget. A project that is akin to a physical system can be modeled and managed using the same overarching principles of GN&C that would be used if that project were a complex vehicle, a complex system(s), or complex software with time-varying processes (at times nonlinear) containing multiple data inputs of varying accuracy and a range of operating points. The classic GN&C theory approach could thus be applied to small, well-defined projects; yet when working with larger, multiyear projects necessitating multiple organizational structures, numerous external influences, and a multitude of diverse resources, modern GN&C principles are required to model and manage the project. The fundamental principles of a GN&C system incorporate these basic concepts: State, Behavior, Feedback Control, Navigation, Guidance and Planning Logic systems. The State of a system defines the aspects of the system that can change over time; e.g., position, velocity, acceleration, coordinate-based attitude, and temperature, etc. The Behavior of the system focuses more on what changes are possible within the system; this is denoted in the state of the system. The behavior of a system, as captured in the system modeling, when properly done will aid in accurately predicting future system performance. The Feedback Control system understands the state and behavior of the system and uses feedback to adjust control inputs into the system. The feedback, which is the right arm of the Control system, allows change to be effected in the overall system; it therefore is important to not only correctly identify the system feedback inputs, but also the system response to the feedback inputs. The Navigation system takes multiple data inputs and, based on a priori knowledge of the inputs, develops a statistically based weighting of the inputs and measurements to determine the system's state. Guidance and Planning Logic of the system, complete with an understanding of where the system is (provided by the Navigation system), will in turn determine where the system needs to be and how to get it there. With any system/project, it is critical that the objective of the system/project be clearly defined -- not only to plan but to measure performance and to aid in guiding the system or the project. The system principles discussed above, which can be and have been applied to the current CxP space suit development project, can also be mapped to real-world constituents, thus allowing project managers to apply systems theories that are well defined in engineering and mathematics to a discipline (i.e., Project Management) that historically has been based in personal experience and intuition.
This mapping of GN&C theory to Project Management will, in turn, permit a direct, methodical approach to Project Management, planning, and control, providing a tool to help predict (and guide) performance and an understanding of the project constraints, how the project can be controlled, and the impacts of external influences and inputs. This approach, to a project manager, flows down to the three bottom-line variables of cost, schedule, and scope and the needed control of these three variables to successfully perform and complete a project.
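A toy sketch of the analogy is given below: the project state is the remaining work, the planned burn-down acts as the guidance trajectory, measured progress plays the role of navigation, and a proportional feedback law adjusts effort. All numbers and the control law are illustrative assumptions, not a model drawn from the CxP space suit project.

# Minimal sketch of the "project as a controlled system" analogy (illustrative only).
weeks = 40
total_work = 400.0                    # person-weeks of scope
plan_rate = total_work / weeks        # guidance: planned completion rate

remaining = total_work                # state: work remaining
effort = 10.0                         # people currently assigned (control input)
k_p = 0.8                             # assumed feedback gain on schedule error

for week in range(1, weeks + 1):
    productivity = 0.9                # actual output per person-week (disturbance)
    remaining -= effort * productivity
    planned_remaining = total_work - plan_rate * week
    schedule_error = remaining - planned_remaining        # positive = behind plan
    effort = max(0.0, effort + k_p * schedule_error / plan_rate)  # feedback update
    if remaining <= 0:
        print(f"scope completed in week {week}")
        break
else:
    print(f"work remaining at week {weeks}: {remaining:.1f}")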
NASA Technical Reports Server (NTRS)
Deepak, A.; Fluellen, A.
1978-01-01
An efficient numerical method of multiple quadratures, the Conroy method, is applied to the problem of computing multiple scattering contributions in the radiative transfer through realistic planetary atmospheres. A brief error analysis of the method is given and comparisons are drawn with the more familiar Monte Carlo method. Both methods are stochastic problem-solving models of a physical or mathematical process and utilize the sampling scheme for points distributed over a definite region. In the Monte Carlo scheme the sample points are distributed randomly over the integration region. In the Conroy method, the sample points are distributed systematically, such that the point distribution forms a unique, closed, symmetrical pattern which effectively fills the region of the multidimensional integration. The methods are illustrated by two simple examples: one, of multidimensional integration involving two independent variables, and the other, of computing the second order scattering contribution to the sky radiance.
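The contrast between the two sampling schemes can be illustrated with a two-variable integral, as in the first example mentioned above. In the sketch below a simple Korobov/Fibonacci-type lattice stands in for the systematically constructed Conroy point set (whose specific construction is not reproduced here) and is compared against plain Monte Carlo sampling.

import numpy as np
from math import erf, pi, sqrt

def f(x, y):                      # smooth test integrand on the unit square
    return np.exp(-(x**2 + y**2))

N = 987                           # a Fibonacci number, used by the lattice rule
exact = (sqrt(pi) / 2 * erf(1.0)) ** 2

# Monte Carlo: sample points distributed randomly over the region.
rng = np.random.default_rng(0)
xs, ys = rng.random(N), rng.random(N)
mc = f(xs, ys).mean()

# Systematic sampling: a deterministic lattice that fills the square in a
# regular, space-filling pattern (610 is the Fibonacci number preceding 987).
k = np.arange(N)
xl = (k / N) % 1.0
yl = (k * 610 / N) % 1.0
lattice = f(xl, yl).mean()

print(f"exact {exact:.6f}  monte-carlo {mc:.6f}  lattice {lattice:.6f}")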
Design principles for radiation-resistant solid solutions
NASA Astrophysics Data System (ADS)
Schuler, Thomas; Trinkle, Dallas R.; Bellon, Pascal; Averback, Robert
2017-05-01
We develop a multiscale approach to quantify the increase in the recombined fraction of point defects under irradiation resulting from dilute solute additions to a solid solution. This methodology provides design principles for radiation-resistant materials. Using an existing database of solute diffusivities, we identify Sb as one of the most efficient solutes for this purpose in a Cu matrix. We perform density-functional-theory calculations to obtain binding and migration energies of Sb atoms, vacancies, and self-interstitial atoms in various configurations. The computed data informs the self-consistent mean-field formalism to calculate transport coefficients, allowing us to make quantitative predictions of the recombined fraction of point defects as a function of temperature and irradiation rate using homogeneous rate equations. We identify two different mechanisms according to which solutes lead to an increase in the recombined fraction of point defects; at low temperature, solutes slow down vacancies (kinetic effect), while at high temperature, solutes stabilize vacancies in the solid solution (thermodynamic effect). Extensions to other metallic matrices and solutes are discussed.
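The homogeneous rate equations referred to above can be sketched as a two-species balance of production, mutual recombination, and loss to sinks; the steady-state solution then gives the recombined fraction. The coefficients below are illustrative assumptions, not the DFT-informed transport coefficients used in the paper.

import numpy as np
from scipy.integrate import solve_ivp

G = 1e-6                 # defect production rate (per site per second), assumed
K_iv = 1e6               # vacancy-interstitial recombination coefficient, assumed
k2_s = 1e10              # sink strength factor, assumed
D_v, D_i = 1e-12, 1e-8   # vacancy / interstitial diffusivities, assumed

def rates(t, c):
    cv, ci = c
    recomb = K_iv * cv * ci
    dcv = G - recomb - D_v * k2_s * cv    # production - recombination - sinks
    dci = G - recomb - D_i * k2_s * ci
    return [dcv, dci]

sol = solve_ivp(rates, [0.0, 1e6], [0.0, 0.0], method="LSODA",
                rtol=1e-8, atol=1e-18)
cv, ci = sol.y[:, -1]
recombined_fraction = (K_iv * cv * ci) / G   # fraction of produced defects that recombine
print(f"steady-state cv={cv:.3e}, ci={ci:.3e}, recombined fraction={recombined_fraction:.2f}")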
NASA Astrophysics Data System (ADS)
Reimberg, Paulo; Bernardeau, Francis
2018-01-01
We present a formalism based on the large deviation principle (LDP) applied to cosmological density fields, and more specifically to the arbitrary functional of density profiles, and we apply it to the derivation of the cumulant generating function and one-point probability distribution function (PDF) of the aperture mass (Map ), a common observable for cosmic shear observations. We show that the LDP can indeed be used in practice for a much larger family of observables than previously envisioned, such as those built from continuous and nonlinear functionals of density profiles. Taking advantage of this formalism, we can extend previous results, which were based on crude definitions of the aperture mass, with top-hat windows and the use of the reduced shear approximation (replacing the reduced shear with the shear itself). We were precisely able to quantify how this latter approximation affects the Map statistical properties. In particular, we derive the corrective term for the skewness of the Map and reconstruct its one-point PDF.
Abgrall, N.; Arnquist, I. J.; Avignone, F. T.; ...
2016-11-11
Here, a search for Pauli-exclusion-principle-violating Kα electron transitions was performed using 89.5 kg-d of data collected with a p-type point contact high-purity germanium detector operated at the Kimballton Underground Research Facility. A lower limit on the transition lifetime of 5.8 × 10^30 s at 90% C.L. was set by looking for a peak at 10.6 keV resulting from the X-ray and Auger electrons present following the transition. A similar analysis was done to look for the decay of atomic K-shell electrons into neutrinos, resulting in a lower limit of 6.8 × 10^30 s at 90% C.L. It is estimated that the Majorana Demonstrator, a 44 kg array of p-type point contact detectors that will search for the neutrinoless double-beta decay of 76Ge, could improve upon these exclusion limits by an order of magnitude after three years of operation.
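The way a null peak search converts into a lifetime limit can be sketched as tau > N x efficiency x T_live / S_up, with N the number of candidate K-shell electrons, T_live the live time, and S_up the upper limit on signal counts at the search energy. The inputs below are rough illustrative assumptions, not the analysis values of this work; they only show that exposures of order 100 kg-d lead to limits of order 10^30-10^31 s.

AVOGADRO = 6.022e23
exposure_kg_d = 89.5
mass_kg = 1.0                       # placeholder detector mass; only the product
T_live_s = exposure_kg_d / mass_kg * 86400.0   # atoms x live time (i.e. the exposure) matters

atoms = mass_kg * 1000.0 / 72.6 * AVOGADRO     # Ge atoms (molar mass ~72.6 g/mol)
k_electrons = 2 * atoms                        # two K-shell electrons per atom
efficiency = 0.8                               # assumed peak detection efficiency
S_up = 10.0                                    # assumed 90% C.L. upper limit on counts

tau_limit = k_electrons * efficiency * T_live_s / S_up
print(f"tau > {tau_limit:.2e} s (90% C.L., illustrative inputs)")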
[The anthropic principle in biology and radiobiology].
Akif'ev, A P; Degtiarev, S V
1999-01-01
According to the anthropic principle of the Universe, the physical constants of the fundamental particles of matter and the laws of their interaction are such that the appearance of man and mind becomes possible and necessary. It is suggested that some biological constants be added to the set of fundamental constants. Using the repair of DNA as an example, it is shown how a cell controls certain parameters of the Watson-Crick double helix. It is pointed out that the concept of the anthropic principle of the Universe in its full form, including biological constants, is a key to developing a unified theory of evolution of the Universe within the limits of scientific creationism.
Brst-Bfv Quantization and the Schwinger Action Principle
NASA Astrophysics Data System (ADS)
Garcia, J. Antonio; Vergara, J. David; Urrutia, Luis F.
We introduce an operator version of the BRST-BFV effective action for arbitrary systems with first class constraints. Using the Schwinger action principle we calculate the propagators corresponding to: (i) the parametrized nonrelativistic free particle, (ii) the relativistic free particle and (iii) the spinning relativistic free particle. Our calculation correctly imposes the BRST invariance at the end points. The precise use of the additional boundary terms required in the description of fermionic variables is incorporated.
Operational Symbols: Can a Picture Be Worth a Thousand Words?
1991-04-01
internal visualization, because forms are to visual communication what words are to verbal communication. From a psychological point of view, the process... Captions guide what is learned from a picture or graphic.
A game plan: Gamification design principles in mHealth applications for chronic disease management.
Miller, Aaron S; Cafazzo, Joseph A; Seto, Emily
2016-06-01
Effective chronic disease management is essential to improve positive health outcomes, and incentive strategies are useful in promoting self-care with longevity. Gamification, applied with mHealth (mobile health) applications, has the potential to better facilitate patient self-management. This review article addresses a knowledge gap around the effective use of gamification design principles, or mechanics, in developing mHealth applications. Badges, leaderboards, points and levels, challenges and quests, social engagement loops, and onboarding are mechanics that comprise gamification. These mechanics are defined and explained from a design and development perspective. Health and fitness applications with gamification mechanics include: bant which uses points, levels, and social engagement, mySugr which uses challenges and quests, RunKeeper which uses leaderboards as well as social engagement loops and onboarding, Fitocracy which uses badges, and Mango Health, which uses points and levels. Specific design considerations are explored, an example of the efficacy of a gamified mHealth implementation in facilitating improved self-management is provided, limitations to this work are discussed, a link between the principles of gaming and gamification in health and wellness technologies is provided, and suggestions for future work are made. We conclude that gamification could be leveraged in developing applications with the potential to better facilitate self-management in persons with chronic conditions. © The Author(s) 2014.
Huang, Rongyong; Zheng, Shunyi; Hu, Kun
2018-06-01
Registration of large-scale optical images with airborne LiDAR data is the basis of the integration of photogrammetry and LiDAR. However, geometric misalignments still exist between some aerial optical images and airborne LiDAR point clouds. To eliminate such misalignments, we extended a method for registering close-range optical images with terrestrial LiDAR data to a variety of large-scale aerial optical images and airborne LiDAR data. The fundamental principle is to minimize the distances from the photogrammetric matching points to the LiDAR data surface. In addition to the satisfactory efficiency of about 79 s per 6732 × 8984 image, the experimental results also show that the unit weighted root mean square (RMS) of the image points is able to reach a sub-pixel level (0.45 to 0.62 pixel), and that the actual horizontal and vertical accuracy can be greatly improved to a high level of 1/4-1/2 (0.17-0.27 m) and 1/8-1/4 (0.10-0.15 m) of the average LiDAR point distance, respectively. Finally, the method is shown to be more accurate, feasible, efficient, and practical for a variety of large-scale aerial optical images and LiDAR data.
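The minimisation principle can be sketched on synthetic data: photogrammetric tie points are registered to a LiDAR surface by iteratively reducing their point-to-local-plane distances. For clarity only a rigid 3D shift is estimated here, whereas the paper adjusts the full image orientation parameters; the surface, the normals, and all numbers are assumptions.

import numpy as np

rng = np.random.default_rng(1)

def surf(x, y):                       # synthetic terrain surface
    return 5.0 * np.sin(x / 10.0) + 3.0 * np.cos(y / 15.0)

xy = rng.uniform(0, 100, size=(8000, 2))
lidar = np.column_stack([xy, surf(xy[:, 0], xy[:, 1])])

true_shift = np.array([0.8, -0.5, 0.4])       # misalignment to be recovered
tie = lidar[rng.choice(len(lidar), 150, replace=False)] + true_shift

shift = np.zeros(3)
for _ in range(6):                            # Gauss-Newton iterations
    moved = tie - shift
    d2 = ((moved[:, None, :] - lidar[None, :, :]) ** 2).sum(-1)
    nearest = lidar[d2.argmin(axis=1)]        # nearest LiDAR point (brute force)
    # analytic surface normals at the nearest LiDAR points
    nx = -0.5 * np.cos(nearest[:, 0] / 10.0)
    ny = 0.2 * np.sin(nearest[:, 1] / 15.0)
    n = np.column_stack([nx, ny, np.ones_like(nx)])
    n /= np.linalg.norm(n, axis=1, keepdims=True)
    r = np.einsum('ij,ij->i', moved - nearest, n)   # signed point-to-plane distances
    shift += np.linalg.solve(n.T @ n, n.T @ r)      # normal-equation update

print("estimated shift:", np.round(shift, 3), "true:", true_shift)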
Protection Relaying Scheme Based on Fault Reactance Operation Type
NASA Astrophysics Data System (ADS)
Tsuji, Kouichi
The theories of operation of existing relays are roughly divided into two types: one is the current differential type based on Kirchhoff's first law, and the other is the impedance type based on the second law. Kirchhoff's laws can be applied to strictly formulate fault phenomena, so the circuit equations are represented as nonlinear simultaneous equations in the variables fault point k and fault resistance Rf. This approach has the following two defects: 1) a heavy computational burden for the iterative calculation of the Newton-Raphson method, and 2) relay operators cannot easily understand the principle behind the numerical matrix operations. The new protection relay principle proposed in this paper focuses on the fact that the reactance component at the fault point is almost zero. Two reactances Xf(S) and Xf(R) at the two ends of the branch are calculated by solving linear equations. If the signs of Xf(S) and Xf(R) are not the same, it can be judged that the fault point lies within the branch. This reactance Xf corresponds to the difference in branch reactance between the actual fault point and an imaginary fault point, so the relay engineer can understand the fault location through the concept of "distance". The simulation results using this new method indicate highly precise estimation of fault locations compared with the inspected fault locations on operating transmission lines.
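The sign-check idea can be illustrated on a small two-source network with a bolted fault: the reactance from each terminal to the estimated fault point, expressed in a common S-to-R direction, changes sign only when the fault lies inside the protected branch. The network values and the simplification to zero fault resistance are assumptions; the paper's formulation solves linear equations that also include the fault resistance.

import numpy as np

Zs_S, Zs_R = 5j, 6j            # assumed source impedances behind buses S and R (ohm)
Z_line = 10j                   # protected line reactance S-R (ohm)
E_S, E_R = 100.0, 100.0 * np.exp(-1j * 0.1)

def internal_fault(k):
    """Relay quantities for a bolted fault at per-unit position k on the line."""
    I_S = E_S / (Zs_S + k * Z_line)            # S-end current into the line
    I_R = E_R / (Zs_R + (1 - k) * Z_line)      # R-end current into the line
    return I_S * k * Z_line, I_S, I_R * (1 - k) * Z_line, I_R

def external_fault(Z_adj):
    """Relay quantities for a bolted fault on an adjacent line beyond bus R."""
    Y = 1 / (Zs_S + Z_line) + 1 / Zs_R + 1 / Z_adj
    V_R = (E_S / (Zs_S + Z_line) + E_R / Zs_R) / Y   # nodal solution at bus R
    I_S = (E_S - V_R) / (Zs_S + Z_line)              # line current S -> R
    V_S = E_S - Zs_S * I_S
    return V_S, I_S, V_R, -I_S                 # R-end current into the line is -I_S

def classify(V_S, I_S, V_R, I_R):
    Xf_S = (V_S / I_S).imag                    # fault position seen from S (toward R)
    Xf_R = -(V_R / I_R).imag                   # fault position seen from R (toward R)
    inside = np.sign(Xf_S) != np.sign(Xf_R)
    return round(Xf_S, 2), round(Xf_R, 2), "internal" if inside else "external"

print(classify(*internal_fault(0.4)))          # expected: internal
print(classify(*external_fault(4j)))           # expected: external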
Winstanley, A.; Sperotto, R.G.; Putnick, D.L.; Cherian, S.; Bornstein, M.H.; Gattis, M.
2014-01-01
The aims of this study were to examine and compare the development of parenting cognitions and principles in mothers following preterm and term deliveries. Parenting cognitions about child development, including thinking that is restricted to single causes and single outcomes (categorical thinking) and thinking that takes into account multiple perspectives (perspectivist thinking), have been shown to relate to child outcomes. Parenting principles about using routines (structure) or infant cues (attunement) to guide daily caregiving have been shown to relate to caregiving practices. We investigated the continuity and stability of parenting cognitions and principles in the days following birth to 5 months postpartum for mothers of infants born term and preterm. All parenting cognitions were stable across time. Categorical thinking increased at a group level across time in mothers of preterm, but not term, infants. Perspectivist thinking increased at a group level for first-time mothers (regardless of birth status) and tended to be lower in mothers of preterm infants. Structure at birth did not predict later structure (and so was unstable) in mothers of preterm, but not term, infants and neither group changed in mean level across time. Attunement was consistent across time in both groups of mothers. These results indicate that prematurity has multiple, diverse effects on parenting beliefs, which may in turn influence maternal behavior and child outcomes. PMID:25459794
NASA Astrophysics Data System (ADS)
Chen, Xi; Zhong, Jiaqi; Song, Hongwei; Zhu, Lei; Wang, Jin; Zhan, Mingsheng
2014-08-01
Vibrational noise is one of the most important noise sources limiting the performance of weak-equivalence-principle (WEP) tests based on non-isotope atom interferometers (AIs). By analyzing the vibration-induced phases, we find that, although the induced phases are not completely common, their ratio is always a constant at every experimental data point, a fact that is not fully utilized in the traditional ellipse-fitting method. From this point, we propose a strategy that can greatly suppress the vibration-induced phase noise by stabilizing the Raman laser frequencies at high precision and controlling the scanning-phase ratio. The noise rejection ratio can be as high as 10^15 with arbitrary dual-species AIs. Our method produces a Lissajous curve, and the shape of the curve indicates the breakdown of the weak-equivalence-principle signal. We then derive an estimator for the differential phase of the Lissajous curve. This strategy could be helpful in extending the candidate atomic species for high-precision AI-based WEP-test experiments.
Radical constructivism: Between realism and solipsism
NASA Astrophysics Data System (ADS)
Martínez-Delgado, Alberto
2002-11-01
This paper criticizes radical constructivism of the Glasersfeld type, pointing out some contradictions between the declared radical principles and their theoretical and practical development. These contradictions manifest themselves in a frequent oscillation between solipsism and realism, despite constructivist claims to be an anti-realist theory. The paper also points out the contradiction between the relativism of the radical constructivist principles and the constructivist exclusion of other epistemological or educational paradigms. It also disputes the originality and importance of the radical constructivist paradigm, suggesting the idea of an isomorphism between radical constructivist theory and contemplative realism. In addition, some pedagogical and scientific methodological aspects of the radical constructivist model are examined. Although radical constructivism claims to be a rational theory and advocates deductive thinking, it is argued that there is no logical deductive connection between the radical principles of constructivism and the radical constructivist ideas about scientific research and learning. The paper suggests the possibility of an ideological substratum in the construction and hegemonic success of subjective constructivism and, finally, briefly advances an alternative realist model to epistemological and educational radical constructivism.
… so that future decisions about the management of terrestrial environments will be consistent with sustainable development principles. The World Summit on Sustainable Development pointed to the …
Moeller, Mary Pat; Carr, Gwen; Seaver, Leeanne; Stredler-Brown, Arlene; Holzinger, Daniel
2013-10-01
A diverse panel of experts convened in Bad Ischl, Austria, in June of 2012 for the purpose of coming to consensus on essential principles that guide family-centered early intervention with children who are deaf or hard of hearing (D/HH). The consensus panel included parents, deaf professionals, early intervention program leaders, early intervention specialists, and researchers from 10 nations. All participants had expertise in working with families of children who are D/HH, and focus was placed on identifying family-centered practice principles that are specific to partnering with these families. Panel members reported that the implementation of family-centered principles was uneven or inconsistent in their respective nations. During the consensus meeting, they identified 10 agreed-upon foundational principles. Following the conference, they worked to refine the principles and to develop a document that described the principles themselves, related program and provider behaviors, and evidence supporting their use (drawing upon studies from multiple disciplines and nations). The goal of this effort was to promote widespread implementation of validated, evidence-based principles for family-centered early intervention with children who are deaf and hard of hearing and their families.
Effects of comprehensive educational reforms on academic success in a diverse student body.
Lieberman, Steven A; Ainsworth, Michael A; Asimakis, Gregory K; Thomas, Lauree; Cain, Lisa D; Mancuso, Melodee G; Rabek, Jeffrey P; Zhang, Ni; Frye, Ann W
2010-12-01
Calls for medical curriculum reform and increased student diversity in the USA have seen mixed success: performance outcomes following curriculum revisions have been inconsistent and national matriculation of under-represented minority (URM) students has not met aspirations. Published innovations in curricula, academic support and pipeline programmes usually describe isolated interventions that fail to affect curriculum-level outcomes. United States Medical Licensing Examination (USMLE) Step 1 performance and graduation rates were analysed for three classes of medical students before (matriculated 1995-1997, n=517) and after (matriculated 2003-2005, n=597) implementing broad-based reforms in our education system. The changes in pipeline recruitment and preparation programmes, instructional methods, assessment systems, academic support and board preparation were based on sound educational principles and best practices. Post-reform classes were diverse with respect to ethnicity (25.8% URM students), gender (51.8% female), and Medical College Admissions Test (MCAT) score (range 20-40; 24.1% scored ≤ 25). Mean±standard deviation MCAT scores were minimally changed (from 27.2±4.7 to 27.8±3.6). The Step 1 failure rate decreased by 69.3% and mean score increased by 14.0 points (effect size: d=0.67) overall. Improvements were greater among women (failure rate decreased by 78.9%, mean score increased by 15.6 points; d=0.76) and URM students (failure rate decreased by 76.5%, mean score increased by 14.6 points; d=0.74), especially African-American students (failure rate decreased by 93.6%, mean score increased by 20.8 points; d=1.12). Step 1 scores increased across the entire MCAT range. Four- and 5-year graduation rates increased by 7.1% and 5.8%, respectively. The effect sizes in these performance improvements surpassed those previously reported for isolated interventions in curriculum and student support. This success is likely to have resulted from the broad-based, mutually reinforcing nature of reforms in multiple components of the education system. The results suggest that a narrow reductionist view of educational programme reform is less likely to result in improved educational outcomes than a system perspective that addresses the coordinated functioning of multiple aspects of the academic enterprise. © Blackwell Publishing Ltd 2010.
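The effect sizes quoted above follow the usual standardized-mean-difference definition; a minimal sketch is shown below. Only the 14.0-point mean gain and the class sizes come from the abstract; the means and standard deviations are illustrative assumptions chosen to reproduce an effect size near d = 0.67.

from math import sqrt

def cohens_d(mean_post, mean_pre, sd_post, sd_pre, n_post, n_pre):
    # Pooled-standard-deviation form of Cohen's d.
    pooled_sd = sqrt(((n_post - 1) * sd_post**2 + (n_pre - 1) * sd_pre**2)
                     / (n_post + n_pre - 2))
    return (mean_post - mean_pre) / pooled_sd

# A 14-point gain with a pooled SD near 21 gives d of about 0.67 (illustrative values).
print(round(cohens_d(229.0, 215.0, 21.0, 21.0, 597, 517), 2))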
NASA Astrophysics Data System (ADS)
Zhu, Tao; Ren, Ji-Rong; Mo, Shu-Fan
2009-12-01
In this paper, by making use of Duan's topological current theory, the evolution of vortex filaments in excitable media is discussed in detail. The vortex filaments are found to generate or annihilate at the limit points and to encounter, split, or merge at the bifurcation points of a complex function Z(x, t). It is also shown that the Hopf invariant of knotted scroll wave filaments is preserved in the branch processes (splitting, merging, or encountering) during the evolution of these knotted scroll wave filaments. Furthermore, it is also revealed that the "exclusion principle" in some chemical media is just a special case of the Hopf invariant constraint, and that during the branch processes the "exclusion principle" is also protected by topology.
Multiple excitation nano-spot generation and confocal detection for far-field microscopy.
Mondal, Partha Pratim
2010-03-01
An imaging technique is developed for the controlled generation of multiple excitation nano-spots for far-field microscopy. The system point spread function (PSF) is obtained by interfering two counter-propagating extended depth-of-focus PSF (DoF-PSF), resulting in highly localized multiple excitation spots along the optical axis. The technique permits (1) simultaneous excitation of multiple planes in the specimen; (2) control of the number of spots by confocal detection; and (3) overcoming the point-by-point based excitation. Fluorescence detection from the excitation spots can be efficiently achieved by Z-scanning the detector/pinhole assembly. The technique complements most of the bioimaging techniques and may find potential application in high resolution fluorescence microscopy and nanoscale imaging.
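The axial structure produced by interfering two counter-propagating extended depth-of-focus fields can be sketched in one dimension: the standing wave concentrates the excitation into spots spaced by half the wavelength in the medium, within the depth-of-focus envelope. The Gaussian envelope and all numbers below are illustrative assumptions, not the system parameters of the paper.

import numpy as np

wavelength = 0.488          # excitation wavelength (um), assumed
n_medium = 1.33
k = 2 * np.pi * n_medium / wavelength

z = np.linspace(-3.0, 3.0, 4001)            # optical axis (um)
envelope = np.exp(-(z / 2.0) ** 2)          # assumed extended depth-of-focus envelope

field_fwd = envelope * np.exp(1j * k * z)   # forward-propagating field
field_bwd = envelope * np.exp(-1j * k * z)  # counter-propagating field
intensity = np.abs(field_fwd + field_bwd) ** 2   # = 4 * envelope^2 * cos^2(k z)

spot_spacing = wavelength / (2 * n_medium)  # distance between excitation spots
n_spots = np.sum((intensity[1:-1] > intensity[:-2]) &
                 (intensity[1:-1] > intensity[2:]) &
                 (intensity[1:-1] > 0.1 * intensity.max()))
print(f"spot spacing ~{spot_spacing:.3f} um, spots above threshold: {n_spots}")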
NASA Astrophysics Data System (ADS)
Wu, Qiang; Zhao, Dekang; Wang, Yang; Shen, Jianjun; Mu, Wenping; Liu, Honglei
2017-11-01
Water inrush from coal-seam floors greatly threatens mining safety in North China and is a complex process controlled by multiple factors. This study presents a mathematical assessment system for coal-floor water-inrush risk based on the variable-weight model (VWM) and unascertained measure theory (UMT). In contrast to the traditional constant-weight model (CWM), which assigns a fixed weight to each factor, the VWM weight varies with the factor-state value. The UMT employs the confidence principle, which is more effective in ordered partition problems than the maximum membership principle adopted in earlier mathematical treatments. The method is applied to the Datang Tashan Coal Mine in North China. First, eight main controlling factors are selected to construct the comprehensive evaluation index system. Subsequently, an incentive-penalty variable-weight model is built to calculate the variable weights of each factor. Then, the VWM-UMT model is established using the quantitative risk-grade division of each factor according to the UMT. On this basis, the risk of coal-floor water inrush in Tashan Mine No. 8 is divided into five grades. For comparison, the CWM is also adopted for the risk assessment, and a difference distribution map between the two methods is obtained. Finally, verification against recorded water-inrush points indicates that the VWM-UMT model is more feasible and reasonable. The model has great potential and practical significance in future engineering applications.
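The incentive-penalty variable-weight idea can be sketched as a state-dependent amplification of the base weights, re-normalised before aggregation: factors whose state exceeds a risk threshold receive a larger weight than under the constant-weight model. The state-variable-weight function and all numbers below are illustrative assumptions, not the calibrated model for the Tashan mine.

import numpy as np

base_w = np.array([0.20, 0.15, 0.15, 0.10, 0.10, 0.10, 0.10, 0.10])   # constant weights
x = np.array([0.9, 0.3, 0.2, 0.7, 0.4, 0.1, 0.5, 0.2])   # normalised factor states, 1 = worst

def state_weight(x, threshold=0.6, alpha=4.0):
    """Penalise (amplify) factors whose state exceeds the assumed risk threshold."""
    return np.where(x > threshold, np.exp(alpha * (x - threshold)), 1.0)

var_w = base_w * state_weight(x)
var_w /= var_w.sum()                      # variable weights, re-normalised

constant_index = base_w @ x               # constant-weight risk index
variable_index = var_w @ x                # variable-weight risk index
print(np.round(var_w, 3))
print(f"constant-weight index {constant_index:.3f}, variable-weight index {variable_index:.3f}")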
Toward the full and proper implementation of Jordan's Principle: An elusive goal to date.
Blackstock, Cindy
2016-01-01
First Nations children experience service delays, disruptions and denials due to jurisdictional payment disputes within and between federal and provincial/territorial governments. The House of Commons sought to ensure First Nations children could access government services on the same terms as other children when it unanimously passed a private members motion in support of Jordan's Principle in 2007. Jordan's Principle states that when a jurisdictional dispute arises regarding public services for a First Nations child that are otherwise available to other children, the government of first contact pays for the service and addresses payment disputes later. Unfortunately, the federal government adopted a definition of Jordan's Principle that was so narrow (complex medical needs with multiple service providers) that no child ever qualified. This narrow definition has been found to be unlawful by the Federal Court of Canada and the Canadian Human Rights Tribunal. The present commentary describes Jordan's Principle, the legal cases that have considered it and the implications of those decisions for health care providers.
Time Series Model Identification by Estimating Information.
1982-11-01
ERIC Educational Resources Information Center
Chin, Doris B.; Chi, Min; Schwartz, Daniel L.
2016-01-01
A common approach for introducing students to a new science concept is to present them with multiple cases of the phenomenon and ask them to explore. The expectation is that students will naturally take advantage of the multiple cases to support their learning and seek an underlying principle for the phenomenon. However, the success of such tasks…
Microsurgical principles and postoperative adhesions: lessons from the past.
Gomel, Victor; Koninckx, Philippe R
2016-10-01
"Microsurgery" is a set of principles developed to improve fertility surgery outcomes. These principles were developed progressively based on common sense and available evidence, under control of clinical feedback obtained with the use of second-look laparoscopy. Fertility outcome was the end point; significant improvement in fertility rates validated the concept clinically. Postoperative adhesion formation being a major cause of failure in fertility surgery, the concept of microsurgery predominantly addresses prevention of postoperative adhesions. In this concept, magnification with a microscope or laparoscope plays a minor role as technical facilitator. Not surprisingly, the principles to prevent adhesion formation are strikingly similar to our actual understanding: gentle tissue handling, avoiding desiccation, irrigation at room temperature, shielding abdominal contents from ambient air, meticulous hemostasis and lavage, avoiding foreign body contamination and infection, administration of dexamethasone postoperatively, and even the concept of keeping denuded areas separated by temporary adnexal or ovarian suspension. The actual concepts of peritoneal conditioning during surgery and use of dexamethasone and a barrier at the end of surgery thus confirm without exception the tenets of microsurgery. Although recent research helped to clarify the pathophysiology of adhesion formation, refined its prevention and the relative importance of each factor, the clinical end point of improvement of fertility rates remains demonstrated for only the microsurgical tenets as a whole. In conclusion, the principles of microsurgery remain fully valid as the cornerstones of reproductive microsurgery, whether performed by means of open access or laparoscopy. Copyright © 2016 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
1991-11-01
Nicholas George "Image Deblurring for Multiple-Point Impulse Responses," Bryan J. Stossel and Nicholas George 14. SUBJECT TERMS 15. NUMBER OF PAGES...Keith B. Farr Nicholas George Backscatter from a Tilted Rough Disc Donald J. Schertler Nicholas George Image Deblurring for Multiple-Point Impulse ...correlation components. Uf) c)z 0 CL C/) Ix I- z 0 0 LL C,z -J a 0l IMAGE DEBLURRING FOR MULTIPLE-POINT IMPULSE RESPONSES Bryan J. Stossel and Nicholas George
Control of Finite-State, Finite Memory Stochastic Systems
NASA Technical Reports Server (NTRS)
Sandell, Nils R.
1974-01-01
A generalized problem of stochastic control is discussed in which multiple controllers with different data bases are present. The vehicle for the investigation is the finite state, finite memory (FSFM) stochastic control problem. Optimality conditions are obtained by deriving an equivalent deterministic optimal control problem. A FSFM minimum principle is obtained via the equivalent deterministic problem. The minimum principle suggests the development of a numerical optimization algorithm, the min-H algorithm. The relationship between the sufficiency of the minimum principle and the informational properties of the problem are investigated. A problem of hypothesis testing with 1-bit memory is investigated to illustrate the application of control theoretic techniques to information processing problems.
State transfer in highly connected networks and a quantum Babinet principle
NASA Astrophysics Data System (ADS)
Tsomokos, D. I.; Plenio, M. B.; de Vega, I.; Huelga, S. F.
2008-12-01
The transfer of a quantum state between distant nodes in two-dimensional networks is considered. The fidelity of state transfer is calculated as a function of the number of interactions in networks that are described by regular graphs. It is shown that perfect state transfer is achieved in a network of size N, whose structure is that of an (N/2)-cross polytope graph, if N is a multiple of 4. The result is reminiscent of the Babinet principle of classical optics. A quantum Babinet principle is derived, which allows for the identification of complementary graphs leading to the same fidelity of state transfer, in analogy with complementary screens providing identical diffraction patterns.
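The stated result is easy to check numerically: with A the adjacency matrix of the (N/2)-cross polytope (cocktail-party) graph, the transfer amplitude between antipodal vertices under exp(-iAt) reaches unit modulus when N is a multiple of 4. The evolution time t = pi/2 used below is an assumption that happens to realise the transfer for this family of graphs.

import numpy as np
from scipy.linalg import expm

def cross_polytope_adjacency(N):
    """Complete graph on N vertices minus a perfect matching (i <-> i + N/2)."""
    A = np.ones((N, N)) - np.eye(N)
    for i in range(N // 2):
        A[i, i + N // 2] = A[i + N // 2, i] = 0.0
    return A

def transfer_fidelity(N, t=np.pi / 2):
    U = expm(-1j * cross_polytope_adjacency(N) * t)
    return abs(U[N // 2, 0])        # amplitude from vertex 0 to its antipode

for N in (4, 6, 8, 10, 12):
    print(N, round(transfer_fidelity(N), 4))
# fidelity 1.0 is expected exactly when N is a multiple of 4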
Rediscovering the Kernels of Truth in the Urban Legends of the Freshman Composition Classroom
ERIC Educational Resources Information Center
Lovoy, Thomas
2004-01-01
English teachers, as well as teachers within other disciplines, often boil down abstract principles to easily explainable bullet points. Students often pick up and retain these points but fail to grasp the broader contexts that make them relevant. It is therefore sometimes helpful to revisit some of the more common of these "rules of thumb" to…
About the International System of Units (SI) Part II. Organization and General Principles
ERIC Educational Resources Information Center
Aubrecht, Gordon J., II; French, Anthony P.; Iona, Mario
2011-01-01
As all physicists know, all units are arbitrary. The numbering system is anthropocentric; for example, the Celsius scale of temperature has 100 degrees between the boiling point of water at STP and the freezing point of water. The number 100 is chosen because human beings have 10 fingers. The best units might be based on physical constants, for…
ERIC Educational Resources Information Center
Shih, Ching-Hsiang; Chang, Man-Ling; Shih, Ching-Tien
2009-01-01
This study evaluated whether two people with multiple disabilities and minimal motor behavior would be able to improve their pointing performance using finger poke ability with a mouse wheel through a Dynamic Pointing Assistive Program (DPAP) and a newly developed mouse driver (i.e., a new mouse driver replaces standard mouse driver, changes a…
ERIC Educational Resources Information Center
Shih, Ching-Hsiang; Chiu, Sheng-Kai; Chu, Chiung-Ling; Shih, Ching-Tien; Liao, Yung-Kun; Lin, Chia-Chen
2010-01-01
This study evaluated whether two people with multiple disabilities would be able to improve their pointing performance using hand swing with a standard mouse through an Extended Dynamic Pointing Assistive Program (EDPAP) and a newly developed mouse driver (i.e., a new mouse driver replaces standard mouse driver, and changes a mouse into a precise…
Hatton, Anna L; Dixon, John; Rome, Keith; Brauer, Sandra G; Williams, Katrina; Kerr, Graham
2016-04-21
Many people with multiple sclerosis experience problems with walking, which can make daily activities difficult and often leads to falls. Foot sensation plays an important role in keeping the body balanced whilst walking; however, people with multiple sclerosis often have poor sensation on the soles of their feet. Wearing a specially designed shoe insole, which enhances plantar sensory information, could help people with multiple sclerosis to walk better. This study will explore whether long-term wear of a textured insole can improve walking in people with multiple sclerosis. A prospective randomised controlled trial with two parallel groups will be conducted aiming to recruit 176 people with multiple sclerosis living in the community (Brisbane, Australia). Adults with a clinical diagnosis of multiple sclerosis, Disease Steps score 1-4, who are ambulant over 100 m and who meet specific inclusion criteria will be recruited. Participants will be randomised to a smooth control insole (n = 88) or textured insole (n = 88) group. The allocated insole will be worn for 12-weeks within participants' own footwear, with self-report wear diaries and falls calendars being completed over this period. Blinded assessors will conduct two baseline assessments and one post-intervention assessment. Gait tasks will be completed barefoot, wearing standardised footwear only, and wearing standardised footwear with smooth and textured insoles. The primary outcome measure will be mediolateral base of support when walking over even and uneven surfaces. Secondary measures include spatiotemporal gait parameters (stride length, stride time variability, double-limb support time, velocity), gait kinematics (hip, knee, and ankle joint angles, toe clearance, trunk inclination, arm swing, mediolateral pelvis/head displacement), foot sensation (light touch-pressure, vibration, two-point discrimination) and proprioception (ankle joint position sense). Group allocation will be concealed and all analyses will be based on an intention-to-treat principle. This study will explore the effects of wearing textured insoles over 12-weeks on gait, foot sensation and proprioception in people with multiple sclerosis. The study has the potential to identify a new, evidence-based footwear intervention which has the capacity to enhance mobility and independent living in people with multiple sclerosis. Australian New Zealand Clinical Trials Registry ACTRN12615000421538 . Registered 4 May 2015.
Exploring point-cloud features from partial body views for gender classification
NASA Astrophysics Data System (ADS)
Fouts, Aaron; McCoppin, Ryan; Rizki, Mateen; Tamburino, Louis; Mendoza-Schrock, Olga
2012-06-01
In this paper we extend a previous exploration of histogram features extracted from 3D point cloud images of human subjects for gender discrimination. Feature extraction used a collection of concentric cylinders to define volumes for counting 3D points. The histogram features are characterized by a rotational axis and a selected set of volumes derived from the concentric cylinders. The point cloud images are drawn from the CAESAR anthropometric database provided by the Air Force Research Laboratory (AFRL) Human Effectiveness Directorate and SAE International. This database contains approximately 4400 high resolution LIDAR whole body scans of carefully posed human subjects. Success from our previous investigation was based on extracting features from full body coverage which required integration of multiple camera images. With the full body coverage, the central vertical body axis and orientation are readily obtainable; however, this is not the case with a one-camera view providing less than one half body coverage. Assuming that the subjects are upright, we need to determine or estimate the position of the vertical axis and the orientation of the body about this axis relative to the camera. In past experiments the vertical axis was located through the center of mass of torso points projected on the ground plane and the body orientation derived using principal component analysis. In a natural extension of our previous work to partial body views, the absence of rotational invariance about the cylindrical axis greatly increases the difficulty for gender classification. Even the problem of estimating the axis is no longer simple. We describe some simple feasibility experiments that use partial image histograms. Here, the cylindrical axis is assumed to be known. We also discuss experiments with full body images that explore the sensitivity of classification accuracy relative to displacements of the cylindrical axis. Our initial results provide the basis for further investigation of more complex partial body viewing problems and new methods for estimating the two position coordinates for the axis location and the unknown body orientation angle.
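A minimal sketch of the concentric-cylinder histogram feature is given below: points are binned by radial distance from an estimated vertical axis and by height, giving a feature vector that is invariant to rotation about that axis. The synthetic point cloud, the centroid-based axis estimate, and the bin counts are illustrative assumptions; the CAESAR scans themselves are not used here.

import numpy as np

rng = np.random.default_rng(0)
points = rng.normal(scale=[0.15, 0.10, 0.50], size=(20000, 3))   # torso-like blob (m)
points[:, 2] += 1.0                                              # stand it above the ground

# Vertical axis location estimated from the centroid of the projected points.
axis_xy = points[:, :2].mean(axis=0)

# Principal component analysis of the horizontal spread gives a body-orientation
# estimate (usable only with full-body coverage, as discussed in the text).
cov = np.cov((points[:, :2] - axis_xy).T)
orientation = np.linalg.eigh(cov)[1][:, -1]

r = np.linalg.norm(points[:, :2] - axis_xy, axis=1)              # radial distance to axis
radial_edges = np.linspace(0.0, 0.5, 6)                          # 5 concentric cylinders
height_edges = np.linspace(0.0, 2.0, 11)                         # 10 height slabs

hist, _, _ = np.histogram2d(r, points[:, 2], bins=[radial_edges, height_edges])
feature = (hist / hist.sum()).ravel()                            # 50-dimensional feature vector
print(feature.shape, np.round(orientation, 3))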
43 CFR 419.3 - What general principles govern implementation of the TROA?
Code of Federal Regulations, 2011 CFR
2011-10-01
... possible, multiple beneficial purposes, including municipal and industrial, irrigation, fish, wildlife... the extent that water is lawfully available. This includes, but is not limited to, the exercise of...
How Many Principles for Public Health Ethics?
Coughlin, Steven S.
2009-01-01
General moral (ethical) principles play a prominent role in certain methods of moral reasoning and ethical decision-making in bioethics and public health. Examples include the principles of respect for autonomy, beneficence, nonmaleficence, and justice. Some accounts of ethics in public health have pointed to additional principles related to social and environmental concerns, such as the precautionary principle and principles of solidarity or social cohesion. This article provides an overview of principle-based methods of moral reasoning as they apply to public health ethics including a summary of advantages and disadvantages of methods of moral reasoning that rely upon general principles of moral reasoning. Drawing upon the literature on public health ethics, examples are provided of additional principles, obligations, and rules that may be useful for analyzing complex ethical issues in public health. A framework is outlined that takes into consideration the interplay of ethical principles and rules at individual, community, national, and global levels. Concepts such as the precautionary principle and solidarity are shown to be useful to public health ethics to the extent that they can be shown to provide worthwhile guidance and information above and beyond principles of beneficence, nonmaleficence, and justice, and the clusters of rules and maxims that are linked to these moral principles. Future directions likely to be productive include further work on areas of public health ethics such as public trust, community empowerment, the rights of individuals who are targeted (or not targeted) by public health interventions, individual and community resilience and wellbeing, and further clarification of principles, obligations, and rules in public health disciplines such as environmental science, prevention and control of chronic and infectious diseases, genomics, and global health. PMID:20072707
Observation of non-Hermitian degeneracies in a chaotic exciton-polariton billiard.
Gao, T; Estrecho, E; Bliokh, K Y; Liew, T C H; Fraser, M D; Brodbeck, S; Kamp, M; Schneider, C; Höfling, S; Yamamoto, Y; Nori, F; Kivshar, Y S; Truscott, A G; Dall, R G; Ostrovskaya, E A
2015-10-22
Exciton-polaritons are hybrid light-matter quasiparticles formed by strongly interacting photons and excitons (electron-hole pairs) in semiconductor microcavities. They have emerged as a robust solid-state platform for next-generation optoelectronic applications as well as for fundamental studies of quantum many-body physics. Importantly, exciton-polaritons are a profoundly open (that is, non-Hermitian) quantum system, which requires constant pumping of energy and continuously decays, releasing coherent radiation. Thus, the exciton-polaritons always exist in a balanced potential landscape of gain and loss. However, the inherent non-Hermitian nature of this potential has so far been largely ignored in exciton-polariton physics. Here we demonstrate that non-Hermiticity dramatically modifies the structure of modes and spectral degeneracies in exciton-polariton systems, and, therefore, will affect their quantum transport, localization and dynamical properties. Using a spatially structured optical pump, we create a chaotic exciton-polariton billiard--a two-dimensional area enclosed by a curved potential barrier. Eigenmodes of this billiard exhibit multiple non-Hermitian spectral degeneracies, known as exceptional points. Such points can cause remarkable wave phenomena, such as unidirectional transport, anomalous lasing/absorption and chiral modes. By varying parameters of the billiard, we observe crossing and anti-crossing of energy levels and reveal the non-trivial topological modal structure exclusive to non-Hermitian systems. We also observe mode switching and a topological Berry phase for a parameter loop encircling the exceptional point. Our findings pave the way to studies of non-Hermitian quantum dynamics of exciton-polaritons, which may uncover novel operating principles for polariton-based devices.
Miller, Benjamin F; Seals, Douglas R; Hamilton, Karyn L
2017-09-01
Adaptation to stress is identified as one of the seven pillars of aging research. Our viewpoint discusses the importance of the distinction between stress resistance and resilience, highlights how integration of physiological principles is critical for further understanding in vivo stress resistance and resilience, and advocates for the use of early warning signs to prevent a tipping point in stress resistance and resilience. Copyright © 2017 Elsevier B.V. All rights reserved.
Research status of wave energy conversion (WEC) device of raft structure
NASA Astrophysics Data System (ADS)
Dong, Jianguo; Gao, Jingwei; Tao, Liang; Zheng, Peng
2017-10-01
This paper briefly describes the concept of wave energy generation and six typical conversion devices. For the raft structure, a detailed analysis is provided, from its development process to typical devices. Taking the design process and working principle of Pelamis as an example, the general principle of the raft structure is briefly described. After that, a variety of raft structure models are introduced. Finally, the advantages, disadvantages, and development trends of the raft structure are pointed out.
Effect of Automatic Processing on Specification of Problem Solutions for Computer Programs.
1981-03-01
The "Number 7 ± 2" item limitation on human short-term memory capability (Miller, 1956) should be a guiding principle in program design. Yourdon and... input either a single example solution or multiple example solutions in sequence. If a participant's P1 has a low value - near 0 - it may be concluded...
The missing link in Aboriginal care: resource accounting.
Ashton, C W; Duffie-Ashton, Denise
2008-01-01
Resource accounting principles provide more effective planning for Aboriginal healthcare delivery through driving best management practices, efficacious techniques for long-term resource allocation, transparency of information and performance measurement. Major improvements to Aboriginal health in New Zealand and Australia were facilitated in the context of this public finance paradigm, rather than cash accounting systems that remain the current method for public departments in Canada. Multiple funding sources and fragmented delivery of Aboriginal healthcare can be remedied through similar adoption of such principles.
THE SCREENING AND RANKING ALGORITHM FOR CHANGE-POINTS DETECTION IN MULTIPLE SAMPLES
Song, Chi; Min, Xiaoyi; Zhang, Heping
2016-01-01
The chromosome copy number variation (CNV) is the deviation of genomic regions from their normal copy number states, which may associate with many human diseases. Current genetic studies usually collect hundreds to thousands of samples to study the association between CNV and diseases. CNVs can be called by detecting the change-points in mean for sequences of array-based intensity measurements. Although multiple samples are of interest, the majority of the available CNV calling methods are single sample based. Only a few multiple sample methods have been proposed using scan statistics that are computationally intensive and designed toward either common or rare change-points detection. In this paper, we propose a novel multiple sample method by adaptively combining the scan statistic of the screening and ranking algorithm (SaRa), which is computationally efficient and is able to detect both common and rare change-points. We prove that asymptotically this method can find the true change-points with almost certainty and show in theory that multiple sample methods are superior to single sample methods when shared change-points are of interest. Additionally, we report extensive simulation studies to examine the performance of our proposed method. Finally, using our proposed method as well as two competing approaches, we attempt to detect CNVs in the data from the Primary Open-Angle Glaucoma Genes and Environment study, and conclude that our method is faster and requires less information while our ability to detect the CNVs is comparable or better. PMID:28090239
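The local diagnostic at the heart of the SaRa screening step compares the means of the h observations to the left and right of each position; the sketch below computes it per sample, combines samples by summing squared diagnostics, and ranks the local maxima. The combination rule shown is one simple choice and is an assumption, not necessarily the adaptive rule proposed in the paper; the data are synthetic.

import numpy as np

rng = np.random.default_rng(2)
n, n_samples, h = 500, 20, 15
y = rng.normal(size=(n_samples, n))
y[:5, 200:260] += 1.0                  # a CNV shared by a few samples (rare variant)

def local_diagnostic(x, h):
    # Difference of means over windows of length h left and right of each position.
    c = np.cumsum(np.concatenate([[0.0], x]))
    d = np.full(len(x), np.nan)
    for t in range(h, len(x) - h):
        left = (c[t] - c[t - h]) / h
        right = (c[t + h] - c[t]) / h
        d[t] = (right - left) * np.sqrt(h / 2.0)   # ~N(0,1) under no change
    return d

D = np.nansum(np.vstack([local_diagnostic(s, h) for s in y]) ** 2, axis=0)

# Screening: keep local maxima; ranking: sort by the combined statistic.
is_peak = (D > np.roll(D, 1)) & (D > np.roll(D, -1))
candidates = np.argsort(-np.where(is_peak, D, -np.inf))[:5]
print("top change-point candidates:", sorted(candidates.tolist()))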
The Mark, the Thing, and the Object: On What Commands Repetition in Freud and Lacan.
Van de Vijver, Gertrudis; Bazan, Ariane; Detandt, Sandrine
2017-01-01
In Logique du Fantasme, Lacan argues that the compulsion to repeat does not obey the same discharge logic as homeostatic processes. Repetition installs a realm that is categorically different from the one related to homeostatic pleasure seeking, a properly subjective one, one in which the mark "stands for," "takes the place of," what we have ventured to call "an event," and what only in the movement of return, in what Lacan calls a "thinking of repetition," confirms and ever reconfirms this point of no return, which is also a qualitative cut and a structural loss. The kind of "standing for" Lacan intends here with the concept of repetition is certainly not something like an image or a faithful description. No, what Lacan wishes to stress is that this mark is situated at another level, at another place, it is "entstellt," and as such, it is punctually impinging upon the bodily dynamics without rendering the event, without having an external meta-point of view, but cutting across registers according to a logics that is not the homeostatic memory logics. This paper elaborates on this distinction on the basis of a confrontation with what Freud says about the pleasure principle and its beyond in Beyond the Pleasure Principle, and also takes inspiration from Freud's Project for a Scientific Psychology. We argue that Lacan's theory of enjoyment takes up and generalizes what Freud was after in Beyond the Pleasure Principle with the Wiederholungszwang, and pushes Freud's thoughts to a more articulated point: to the point where a subject is considered to speak only when it has allowed the other, through discourse, to have impacted and cut into his bodily pleasure dynamics.
Mapping of bird distributions from point count surveys
Sauer, J.R.; Pendleton, G.W.; Orsillo, Sandra; Ralph, C.J.; Sauer, J.R.; Droege, S.
1995-01-01
Maps generated from bird survey data are used for a variety of scientific purposes, but little is known about their bias and precision. We review methods for preparing maps from point count data and appropriate sampling methods for maps based on point counts. Maps based on point counts can be affected by bias associated with incomplete counts, primarily due to changes in proportion counted as a function of observer or habitat differences. Large-scale surveys also generally suffer from regional and temporal variation in sampling intensity. A simulated surface is used to demonstrate sampling principles for maps.
A Consumer Protection Model for Regulating Lawyers.
ERIC Educational Resources Information Center
Chalfie, Deborah M.
1992-01-01
Describes and critiques the "discipline model" of lawyer regulation from a consumer point of view and outlines an alternative model for regulating lawyers that is grounded in consumer protection principles. (JOW)
Core principles of assessment in competency-based medical education.
Lockyer, Jocelyn; Carraccio, Carol; Chan, Ming-Ka; Hart, Danielle; Smee, Sydney; Touchie, Claire; Holmboe, Eric S; Frank, Jason R
2017-06-01
The meaningful assessment of competence is critical for the implementation of effective competency-based medical education (CBME). Timely ongoing assessments are needed along with comprehensive periodic reviews to ensure that trainees continue to progress. New approaches are needed to optimize the use of multiple assessors and assessments; to synthesize the data collected from multiple assessors and multiple types of assessments; to develop faculty competence in assessment; and to ensure that relationships between the givers and receivers of feedback are appropriate. This paper describes the core principles of assessment for learning and assessment of learning. It addresses several ways to ensure the effectiveness of assessment programs, including using the right combination of assessment methods and conducting careful assessor selection and training. It provides a reconceptualization of the role of psychometrics and articulates the importance of a group process in determining trainees' progress. In addition, it notes that, to reach its potential as a driver in trainee development, quality care, and patient safety, CBME requires effective information management and documentation as well as ongoing consideration of ways to improve the assessment system.
First-Principles Prediction of Spin-Polarized Multiple Dirac Rings in Manganese Fluoride
NASA Astrophysics Data System (ADS)
Jiao, Yalong; Ma, Fengxian; Zhang, Chunmei; Bell, John; Sanvito, Stefano; Du, Aijun
2017-07-01
Spin-polarized materials with Dirac features have sparked great scientific interest due to their potential applications in spintronics. But such a type of structure is very rare and none has been fabricated. Here, we investigate the already experimentally synthesized manganese fluoride (MnF3) as a novel spin-polarized Dirac material by using first-principles calculations. MnF3 exhibits multiple Dirac cones in one spin orientation, while it behaves like a large gap semiconductor in the other spin channel. The estimated Fermi velocity for each cone is of the same order of magnitude as that in graphene. The 3D band structure further reveals that MnF3 possesses rings of Dirac nodes in the Brillouin zone. Such a spin-polarized multiple Dirac ring feature is reported for the first time in an experimentally realized material. Moreover, similar band dispersions can also be found in other transition metal fluorides (e.g., CoF3, CrF3, and FeF3). Our results highlight a new interesting single-spin Dirac material with promising applications in spintronics and information technologies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hirt, P.W.
1991-01-01
This study focuses on two core national forest management policies: sustained yield and multiple use. Public and elected officials attempt to apply principles of sustainable development to publicly-owned forest lands to ensure that a wide variety of both market and nonmarket forest values are preserved for the benefit of present and future generations. Interest groups, the Forest Service, and policy makers have conceived of sustained yield and multiple use in different and evolving ways over the years. This study explores how these principles have been variously defined and either implemented or thwarted. After World War Two, with escalating demands on national forest resources, the US Forest Service turned to intensive management as a technological method of enhancing natural forest productivity and mitigating the environmental effects of increased use. But the agency's optimistic vision of efficient, sustained production of forest commodities through technical mastery over nature has met overwhelming fiscal, environmental, technical, and political obstacles. Changing public values since the 1960s and popularization of ecology have initiated a growing skepticism toward the premises of intensive management.
Guiding principles of safety as a basis for developing a pharmaceutical safety culture.
Edwards, Brian; Olsen, Axel K; Whalen, Matthew D; Gold, Marla J
2007-05-01
Despite the best efforts of industry and regulatory authorities, the trust of society in the process of medicine development and communication of pharmaceutical risk has ebbed away. In response, the US government has called for a culture of compliance while the EU regulators talk of a 'culture of scientific excellence'. However, one of the fundamental problems hindering progress towards rebuilding trust based on a pharmaceutical safety culture is the lack of agreement and transparency between all stakeholders as to what is meant by 'Safety of Medicines'. For that reason, we propose that 'Guiding Principles of Safety for Pharmaceuticals' be developed, analogous to the way that Chemical Safety has been tackled. A logical starting point would be to examine the Principles outlined by the US Institute of Medicine, although we acknowledge that these Principles require further extensive debate and definition. Nevertheless, the Principles should take centre stage in the reform of pharmaceutical development required to restore society's trust.
Churkin, Alexander; Barash, Danny
2008-01-01
Background: RNAmute is an interactive Java application which, given an RNA sequence, calculates the secondary structure of all single point mutations and organizes them into categories according to their similarity to the predicted structure of the wild type. The secondary structure predictions are performed using the Vienna RNA package. A more efficient implementation of RNAmute is needed, however, to extend from the case of single point mutations to the general case of multiple point mutations, which may often be desired for computational predictions alongside mutagenesis experiments. But analyzing multiple point mutations, a process that requires traversing all possible mutations, becomes highly expensive since the running time is O(n^m) for a sequence of length n with m-point mutations. Using Vienna's RNAsubopt, we present a method that selects only those mutations, based on stability considerations, which are likely to be conformationally rearranging. The approach is best examined using the dot plot representation for RNA secondary structure. Results: Using RNAsubopt, the suboptimal solutions for a given wild-type sequence are calculated once. Then, specific mutations are selected that are most likely to cause a conformational rearrangement. For an RNA sequence of about 100 nts and 3-point mutations (n = 100, m = 3), for example, the proposed method reduces the running time from several hours or even days to several minutes, thus enabling the practical application of RNAmute to the analysis of multiple-point mutations. Conclusion: A highly efficient addition to RNAmute that is as user friendly as the original application but that facilitates the practical analysis of multiple-point mutations is presented. Such an extension can now be exploited prior to site-directed mutagenesis experiments by virologists, for example, who investigate the change of function in an RNA virus via mutations that disrupt important motifs in its secondary structure. A complete explanation of the application, called MultiRNAmute, is available at [1]. PMID:18445289
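The combinatorial burden and the filtering idea can be illustrated schematically. In the sketch below, fold and is_rearranged are hypothetical placeholders standing in for the Vienna RNA package calls and the dot-plot stability criterion, so only the restriction of the O(n^m) enumeration to a pre-selected candidate position set is shown.

```python
from itertools import combinations, product

BASES = "ACGU"

def m_point_mutants(seq, candidate_positions, m):
    """Enumerate all m-point mutants restricted to the candidate positions.

    Exhaustive enumeration over all n positions grows as O(n^m); restricting it
    to a small candidate set (selected, e.g., from suboptimal-structure dot
    plots) keeps the enumeration tractable.
    """
    for positions in combinations(candidate_positions, m):
        choices = [[b for b in BASES if b != seq[p]] for p in positions]
        for subst in product(*choices):
            mutant = list(seq)
            for p, b in zip(positions, subst):
                mutant[p] = b
            yield "".join(mutant)

def select_rearranging(seq, candidate_positions, m, fold, is_rearranged):
    """Keep only mutants whose predicted structure differs from the wild type.

    fold() and is_rearranged() are hypothetical callables supplied by the
    caller; they are not real library functions in this sketch.
    """
    wild_structure = fold(seq)
    return [mut for mut in m_point_mutants(seq, candidate_positions, m)
            if is_rearranged(fold(mut), wild_structure)]
```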
NASA Astrophysics Data System (ADS)
Eisenbach, Markus
The Locally Self-consistent Multiple Scattering (LSMS) code solves the first-principles density functional theory Kohn-Sham equation for a wide range of materials, with a special focus on metals, alloys and metallic nano-structures. It has traditionally exhibited near perfect scalability on massively parallel high performance computer architectures. We present our efforts to exploit GPUs to accelerate the LSMS code to enable first-principles calculations of O(100,000) atoms and statistical physics sampling of finite temperature properties. Using the Cray XK7 system Titan at the Oak Ridge Leadership Computing Facility we achieve a sustained performance of 14.5 PFlop/s and a speedup of 8.6 compared to the CPU-only code. This work has been sponsored by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, Material Sciences and Engineering Division and by the Office of Advanced Scientific Computing. This work used resources of the Oak Ridge Leadership Computing Facility, which is supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC05-00OR22725.
Psychiatric rehabilitation education for physicians.
Rudnick, Abraham; Eastwood, Diane
2013-06-01
As part of a rapidly spreading reform toward recovery-oriented services, mental health care systems are adopting Psychiatric/Psychosocial Rehabilitation (PSR). Accordingly, PSR education and training programs are now available and accessible. Although psychiatrists and sometimes other physicians (such as family physicians) provide important services to people with serious mental illnesses and may, therefore, need knowledge and skill in PSR, it seems that the medical profession has been slow to participate in PSR education. Based on our experience working in Canada as academic psychiatrists who are also Certified Psychiatric Rehabilitation Practitioners (CPRPs), we offer descriptions of several Canadian initiatives that involve physicians in PSR education. Multiple frameworks guide PSR education for physicians. First, guidance is provided by published PSR principles, such as the importance of self-determination (www.psrrpscanada.ca). Second, guidance is provided by adult education (andragogy) principles, emphasizing the importance of addressing attitudes in addition to knowledge and skills (Knowles, Holton, & Swanson, 2011). Third, guidance in Canada is provided by Canadian Medical Education Directives for Specialists (CanMEDS) principles, which delineate the multiple roles of physicians beyond that of medical expert (Frank, 2005) and have recently been adopted in Australia (Boyce, Spratt, Davies, & McEvoy, 2011). (PsycINFO Database Record (c) 2013 APA, all rights reserved).
NASA Astrophysics Data System (ADS)
Andries, Jesse
2010-11-01
The frequencies of the normal modes of oscillation of linear magnetohydrodynamic perturbations of a stationary equilibrium are related to the stationary points of a quadratic functional over the Hilbert space of Lagrangian displacement vectors, which is subject to a constraint. In the absence of a background flow (or of a uniform flow), the relation reduces to the well-known Rayleigh-Ritz variational principle. In contrast to the existing variational principles for perturbations of stationary equilibria, the present treatment does neither impose additional symmetry restrictions on the equilibrium, nor does it involve the generalization to bilinear functionals instead of quadratic forms. This allows a more natural interpretation of the quadratic forms as energy functionals.
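For reference, in the no-flow limit mentioned above the variational statement reduces to the familiar Rayleigh quotient of the ideal-MHD energy principle; the normalization below follows the standard convention and is not taken from the paper.

```latex
% Static ideal MHD: normal modes \xi e^{-i\omega t} obey -\rho\omega^2\boldsymbol{\xi} = \mathbf{F}(\boldsymbol{\xi}),
% and the eigenfrequencies are stationary points of the Rayleigh quotient
\omega^{2}[\boldsymbol{\xi}] = \frac{\delta W[\boldsymbol{\xi}]}{K[\boldsymbol{\xi}]},
\qquad
\delta W[\boldsymbol{\xi}] = -\tfrac{1}{2}\int \boldsymbol{\xi}^{*}\!\cdot\mathbf{F}(\boldsymbol{\xi})\,\mathrm{d}V,
\qquad
K[\boldsymbol{\xi}] = \tfrac{1}{2}\int \rho\,|\boldsymbol{\xi}|^{2}\,\mathrm{d}V .
```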
NASA Astrophysics Data System (ADS)
Yang, Bisheng; Dong, Zhen; Liu, Yuan; Liang, Fuxun; Wang, Yongjun
2017-04-01
In recent years, updating the inventory of road infrastructures based on field work is labor intensive, time consuming, and costly. Fortunately, vehicle-based mobile laser scanning (MLS) systems provide an efficient solution to rapidly capture three-dimensional (3D) point clouds of road environments with high flexibility and precision. However, robust recognition of road facilities from huge volumes of 3D point clouds is still a challenging issue because of complicated and incomplete structures, occlusions and varied point densities. Most existing methods utilize point or object based features to recognize object candidates, and can only extract limited types of objects with a relatively low recognition rate, especially for incomplete and small objects. To overcome these drawbacks, this paper proposes a semantic labeling framework by combining multiple aggregation levels (point-segment-object) of features and contextual features to recognize road facilities, such as road surfaces, road boundaries, buildings, guardrails, street lamps, traffic signs, roadside-trees, power lines, and cars, for highway infrastructure inventory. The proposed method first identifies ground and non-ground points, and extracts road surface facilities from ground points. Non-ground points are segmented into individual candidate objects based on the proposed multi-rule region growing method. Then, the multiple aggregation levels of features and the contextual features (relative positions, relative directions, and spatial patterns) associated with each candidate object are calculated and fed into an SVM classifier to label the corresponding candidate object. The recognition performance of combining multiple aggregation levels and contextual features was compared with single level (point, segment, or object) based features using large-scale highway scene point clouds. Comparative studies demonstrated that the proposed semantic labeling framework significantly improves road facilities recognition precision (90.6%) and recall (91.2%), particularly for incomplete and small objects.
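A schematic of such a pipeline, aggregating features at the point, segment and object levels together with contextual features and feeding them to an SVM, might look as follows; the feature functions and classifier settings are illustrative assumptions, with scikit-learn's SVC merely standing in for the SVM used in the paper.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

def object_feature_vector(points, point_feats, segment_feats, context_feats):
    """Concatenate multiple aggregation levels of features for one candidate object.

    points:        (N, 3) array of the object's 3D points
    point_feats:   (N, d1) per-point features (e.g. normals, local density)
    segment_feats: (d2,) features of the segment the object was grown from
    context_feats: (d3,) contextual features (relative position/direction, spatial pattern)
    The feature computations upstream of this function are assumed, not shown.
    """
    point_level = np.concatenate([point_feats.mean(axis=0), point_feats.std(axis=0)])
    geometry = points.max(axis=0) - points.min(axis=0)   # bounding-box extents
    return np.concatenate([point_level, geometry, segment_feats, context_feats])

def train_labeler(feature_vectors, labels):
    """Train a multi-class SVM on the aggregated object features."""
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
    clf.fit(np.vstack(feature_vectors), labels)
    return clf

# usage sketch with labels such as "road surface", "guardrail", "street lamp", ...
# clf = train_labeler(train_vectors, train_labels)
# predicted = clf.predict(np.vstack(test_vectors))
```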
NASA Astrophysics Data System (ADS)
Sebesta, Mikael; Egelberg, Peter J.; Langberg, Anders; Lindskov, Jens-Henrik; Alm, Kersti; Janicke, Birgit
2016-03-01
Live-cell imaging enables studying dynamic cellular processes that cannot be visualized in fixed-cell assays. An increasing number of scientists in academia and the pharmaceutical industry are choosing live-cell analysis over, or in addition to, traditional fixed-cell assays. We have developed a time-lapse, label-free imaging cytometer, HoloMonitor M4. HoloMonitor M4 assists researchers in overcoming inherent disadvantages of fluorescent analysis, specifically effects of chemical labels or genetic modifications which can alter cellular behavior. Additionally, label-free analysis is simple and eliminates the costs associated with staining procedures. The underlying technology principle is based on digital off-axis holography. While multiple alternatives exist for this type of analysis, we prioritized our developments to achieve the following: a) All-inclusive system - hardware and sophisticated cytometric analysis software; b) Ease of use, enabling utilization of the instrumentation by expert- and entry-level researchers alike; c) Validated quantitative assay end-points tracked over time, such as optical path length shift, optical volume and multiple derived imaging parameters; d) Reliable digital autofocus; e) Robust long-term operation in the incubator environment; f) High throughput and walk-away capability; and finally g) Data management suitable for single- and multi-user networks. We provide examples of HoloMonitor applications of label-free cell viability measurements and monitoring of cell cycle phase distribution.
Micro/Nano-scale Strain Distribution Measurement from Sampling Moiré Fringes.
Wang, Qinghua; Ri, Shien; Tsuda, Hiroshi
2017-05-23
This work describes the measurement procedure and principles of a sampling moiré technique for full-field micro/nano-scale deformation measurements. The developed technique can be performed in two ways: using the reconstructed multiplication moiré method or the spatial phase-shifting sampling moiré method. When the specimen grid pitch is around 2 pixels, 2-pixel sampling moiré fringes are generated to reconstruct a multiplication moiré pattern for a deformation measurement. Both the displacement and strain sensitivities are twice as high as in the traditional scanning moiré method in the same wide field of view. When the specimen grid pitch is around or greater than 3 pixels, multi-pixel sampling moiré fringes are generated, and a spatial phase-shifting technique is combined for a full-field deformation measurement. The strain measurement accuracy is significantly improved, and automatic batch measurement is easily achievable. Both methods can measure the two-dimensional (2D) strain distributions from a single-shot grid image without rotating the specimen or scanning lines, as in traditional moiré techniques. As examples, the 2D displacement and strain distributions, including the shear strains of two carbon fiber-reinforced plastic specimens, were measured in three-point bending tests. The proposed technique is expected to play an important role in the non-destructive quantitative evaluations of mechanical properties, crack occurrences, and residual stresses of a variety of materials.
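The spatial phase-shifting variant can be sketched in one dimension as follows, assuming a grating whose pitch is close to an integer number T of pixels; the interpolation scheme and sign convention are simplified illustrative choices, not the exact procedure of the paper.

```python
import numpy as np

def sampling_moire_phase(intensity, T):
    """Spatial phase-shifting sampling moire, 1D sketch.

    intensity: 1D grating intensity profile whose pitch is close to T pixels.
    Down-sampling with T different start offsets yields T phase-shifted moire
    fringes; a discrete Fourier sum over the shift index recovers the wrapped
    moire phase at every pixel.
    """
    n = len(intensity)
    x = np.arange(n)
    shifted = np.empty((T, n))
    for k in range(T):
        samples = np.arange(k, n, T)                             # every T-th pixel, offset k
        shifted[k] = np.interp(x, samples, intensity[samples])   # back to the full grid
    ks = np.arange(T)[:, None]
    num = np.sum(shifted * np.sin(2 * np.pi * ks / T), axis=0)
    den = np.sum(shifted * np.cos(2 * np.pi * ks / T), axis=0)
    return -np.arctan2(num, den)          # wrapped phase; sign convention assumed

# displacement from the phase difference between deformed and reference states
# (carrier phase cancels in the difference; unwrap as needed):
# u(x) = pitch * (phase_deformed - phase_reference) / (2 * np.pi)
```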
The Evolution of TMD Diagnosis
Ohrbach, R.; Dworkin, S.F.
2016-01-01
This review explores the principles and process associated with the diagnosis of temporomandibular disorders (TMDs). TMD diagnosis has evolved substantially over the past 25 y. Previously, diagnosis focused solely on aberrations in oral structures, largely without empirical evidence. The Research Diagnostic Criteria for TMD (RDC/TMD) were developed on core principles of 1) a dual-axis system reflecting the biopsychosocial model, 2) a clear operationalization for reliability, and 3) the allowance of multiple diagnoses. These principles were retained in the subsequent validation research of the RDC/TMD, and the current diagnostic system—the Diagnostic Criteria for TMD (DC/TMD)—has improved on those principles as well as on diagnostic validity and protocols for assessing the psychosocial domain. Further investigations into etiology and its potential contribution to taxonomy revision are described, particularly within the context of complex disease. The review concludes with an outline of major research areas already underway that will support future revisions of the DC/TMD. PMID:27313164
Winkler, Mirko S; Utzinger, Jürg
2014-07-01
Health Impact Assessment (HIA) is a relatively young field of endeavour, and hence, future progress will depend on the planning, implementation and rigorous evaluation of additional HIAs of projects, programmes and policies the world over. In the June 2014 issue of the International Journal of Health Policy and Management, Fakhri and colleagues investigated underlying principles of HIA through a comprehensive review of the literature and expert consultation. With an emphasis on the Islamic Republic of Iran, the authors identified multiple issues that are relevant for guiding HIA practice. At the same time, the study unravelled current shortcomings in the understanding and definition of HIA principles and best practice at national, regional, and global levels. In this commentary we scrutinise the research presented, highlight strengths and limitations, and discuss the findings in the context of other recent attempts to guide HIA.
Integrality: life principle and right to health.
Viegas, Selma Maria Fonseca; Penna, Cláudia Maria de Mattos
2015-01-01
To understand health integrality in the daily work of the Family Health Strategy (FHS) and its concept according to managers in the Jequitinhonha Valley, Minas Gerais, Brazil. This is a multiple case study with a holistic, qualitative approach based on the Comprehensive Sociology of the Quotidian. The subjects were workers of the Family Health Strategy teams, the support team and managers, 48 in total. The results show integrality as a principle of life and a right to health; when it is contemplated in the everyday practice of health care, other principles of the Unified Health System may be addressed in turn. The universal right to health care is affirmed through attention to the integrity of the person and the ideal of subject-centered care, which is our aim in health care and signals a step towards a change of attitude in the pursuit of comprehensive care. It is considered that the principle of integrality is difficult to accomplish in all its dimensions.
Analytical Dynamics and Nonrigid Spacecraft Simulation
NASA Technical Reports Server (NTRS)
Likins, P. W.
1974-01-01
Application to the simulation of idealized spacecraft are considered both for multiple-rigid-body models and for models consisting of combination of rigid bodies and elastic bodies, with the elastic bodies being defined either as continua, as finite-element systems, or as a collection of given modal data. Several specific examples are developed in detail by alternative methods of analytical mechanics, and results are compared to a Newton-Euler formulation. The following methods are developed from d'Alembert's principle in vector form: (1) Lagrange's form of d'Alembert's principle for independent generalized coordinates; (2) Lagrange's form of d'Alembert's principle for simply constrained systems; (3) Kane's quasi-coordinate formulation of D'Alembert's principle; (4) Lagrange's equations for independent generalized coordinates; (5) Lagrange's equations for simply constrained systems; (6) Lagrangian quasi-coordinate equations (or the Boltzmann-Hamel equations); (7) Hamilton's equations for simply constrained systems; and (8) Hamilton's equations for independent generalized coordinates.
AEROSAT Access Control Summary
DOT National Transportation Integrated Search
1976-10-01
The report consists of three basic sections. Section 2 is a discussion of the communications concepts germane to AEROSAT access control. It defines and reviews the principles of multiplexing, multiple access, demand access, and access control and rel...
An Institutional Perspective on Accountable Care Organizations.
Goodrick, Elizabeth; Reay, Trish
2016-12-01
We employ aspects of institutional theory to explore how Accountable Care Organizations (ACOs) can effectively manage the multiplicity of ideas and pressures within which they are embedded and consequently better serve patients and their communities. More specifically, we draw on the concept of institutional logics to highlight the importance of understanding the conflicting principles upon which ACOs were founded. Based on previous research conducted both inside and outside health care settings, we argue that ACOs can combine attention to these principles (or institutional logics) in different ways; the options fall on a continuum from (a) segregating the effects of multiple logics from each other by compartmentalizing responses to multiple logics to (b) fully hybridizing the different logics. We suggest that the most productive path for ACOs is to situate their approach between the two extremes of "segregating" and "fully hybridizing." This strategic approach allows ACOs to develop effective responses that combine logics without fully integrating them. We identify three ways that ACOs can embrace institutional complexity short of fully hybridizing disparate logics: (1) reinterpreting practices to make them compatible with other logics; (2) engaging in strategies that take advantage of existing synergy between conflicting logics; (3) creating opportunities for people at the front line to develop innovative ways of working that combine multiple logics. © The Author(s) 2016.
NASA Astrophysics Data System (ADS)
Hudoklin, Domen; Drnovšek, Janko
2008-10-01
In the field of hygrometry, a primary dew-point standard can be realized according to several proven principles, such as single-pressure (1-P), two-pressure (2-P), or divided flow. Different realizations have been introduced by various national laboratories, each resulting in a stand-alone complex generation system. Recent trends in generator design favor the single-pressure principle without recirculation because it promises theoretically lower uncertainty and because it avoids problems regarding the leak tightness of the recirculation. Instead of recirculation, the efficiency of saturation, the key factor, is increased by preconditioning the inlet gas entering the saturator. For preconditioning, a presaturator or purifier is used to bring the dew point of the inlet stream close to the saturator temperature. The purpose of the paper is to identify the minimum requirements for the preconditioning system and the main saturator to assure efficient saturation for the LMK generator. Moreover, the aim is also to find out if the preconditioning system can be avoided despite the rather simple construction of the main saturator. If this proves to be the case, the generator design can be simplified while maintaining an accurate value of the generated dew point. Experiments were carried out within the scope of improving our existing primary generator in the above-ambient dew-point range up to +70°C. These results show the generated dew point is within the measurement uncertainty for any dew-point value of the inlet gas. Thus, the preconditioning subsystem can be avoided, which leads to a simplified generator design.
Development of a High-Average-Power Compton Gamma Source for Lepton Colliders
NASA Astrophysics Data System (ADS)
Pogorelsky, Igor; Polyanskiy, Mikhail N.; Yakimenko, Vitaliy; Platonenko, Viktor T.
2009-01-01
Gamma- (γ-) ray beams of high average power and peak brightness are in demand for a number of applications in high-energy physics, material processing, medicine, etc. One such example is gamma conversion into polarized positrons and muons, which is under consideration for projected lepton colliders. A γ-source based on Compton backscattering from a relativistic electron beam is a promising candidate for this application. Our approach to the high-repetition γ-source places the Compton interaction point inside a CO2 laser cavity. A laser pulse interacts with periodic electron bunches on each round-trip inside the laser cavity, producing a corresponding train of γ-pulses. The round-trip optical losses can be compensated by amplification in the active laser medium. The major challenge for this approach is in maintaining a stable amplification rate for a picosecond CO2-laser pulse during multiple resonator round-trips without significant deterioration of its temporal and transverse profiles. Addressing this task, we developed a computer code that allows identifying the directions and priorities in the development of such a multi-pass picosecond CO2 laser. Proof-of-principle experiments help to verify the model and show the viability of the concept. In these tests we demonstrated extended trains of picosecond CO2 laser pulses circulating inside the cavity that incorporates the Compton interaction point.
NASA Astrophysics Data System (ADS)
Laloy, Eric; Hérault, Romain; Lee, John; Jacques, Diederik; Linde, Niklas
2017-12-01
Efficient and high-fidelity prior sampling and inversion for complex geological media is still a largely unsolved challenge. Here, we use a deep neural network of the variational autoencoder type to construct a parametric low-dimensional base model parameterization of complex binary geological media. For inversion purposes, it has the attractive feature that random draws from an uncorrelated standard normal distribution yield model realizations with spatial characteristics that are in agreement with the training set. In comparison with the most commonly used parametric representations in probabilistic inversion, we find that our dimensionality reduction (DR) approach outperforms principal component analysis (PCA), optimization-PCA (OPCA) and discrete cosine transform (DCT) DR techniques for unconditional geostatistical simulation of a channelized prior model. For the considered examples, substantial compression ratios (200-500) are achieved. Given that the construction of our parameterization requires a training set of several tens of thousands of prior model realizations, our DR approach is more suited for probabilistic (or deterministic) inversion than for unconditional (or point-conditioned) geostatistical simulation. Probabilistic inversions of 2D steady-state and 3D transient hydraulic tomography data are used to demonstrate the DR-based inversion. For the 2D case study, the performance is superior compared to current state-of-the-art multiple-point statistics inversion by sequential geostatistical resampling (SGR). Inversion results for the 3D application are also encouraging.
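A minimal sketch of the kind of variational autoencoder used for such a parameterization is given below, assuming PyTorch; the fully connected layers, the latent dimension and the Bernoulli reconstruction loss are illustrative choices rather than the architecture used in the study.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GeoVAE(nn.Module):
    """Toy VAE mapping flattened binary facies images to a low-dimensional latent space."""
    def __init__(self, n_pixels, latent_dim=20, hidden=512):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(n_pixels, hidden), nn.ReLU())
        self.mu = nn.Linear(hidden, latent_dim)
        self.logvar = nn.Linear(hidden, latent_dim)
        self.dec = nn.Sequential(nn.Linear(latent_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, n_pixels))

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)  # reparameterization trick
        return self.dec(z), mu, logvar

def vae_loss(recon_logits, x, mu, logvar):
    """Bernoulli reconstruction loss plus KL divergence to the standard normal prior."""
    bce = F.binary_cross_entropy_with_logits(recon_logits, x, reduction="sum")
    kld = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return bce + kld

# after training, model realizations for inversion are drawn by decoding
# uncorrelated standard-normal latent vectors:
# z = torch.randn(1, 20); realization = torch.sigmoid(model.dec(z)) > 0.5
```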
Relativistic Transverse Gravitational Redshift
NASA Astrophysics Data System (ADS)
Mayer, A. F.
2012-12-01
The parametrized post-Newtonian (PPN) formalism is a tool for quantitative analysis of the weak gravitational field based on the field equations of general relativity. This formalism and its ten parameters provide the practical theoretical foundation for the evaluation of empirical data produced by space-based missions designed to map and better understand the gravitational field (e.g., GRAIL, GRACE, GOCE). Accordingly, mission data is interpreted in the context of the canonical PPN formalism; unexpected, anomalous data are explained as similarly unexpected but apparently real physical phenomena, which may be characterized as "gravitational anomalies," or by various sources contributing to the total error budget. Another possibility, which is typically not considered, is a small modeling error in canonical general relativity. The concept of the idealized point-mass spherical equipotential surface, which originates with Newton's law of gravity, is preserved in Einstein's synthesis of special relativity with accelerated reference frames in the form of the field equations. It was not previously realized that the fundamental principles of relativity invalidate this concept and with it the idea that the gravitational field is conservative (i.e., zero net work is done on any closed path). The ideal radial free fall of a material body from arbitrarily-large range to a point on such an equipotential surface (S) determines a unique escape-velocity vector of magnitude v collinear to the acceleration vector of magnitude g at this point. For two such points on S separated by angle dφ, the Equivalence Principle implies distinct reference frames experiencing inertial acceleration of identical magnitude g in different directions in space. The complete equivalence of these inertially-accelerated frames to their analogous frames at rest on S requires evaluation at instantaneous velocity v relative to a local inertial observer. Because these velocity vectors are not parallel, a symmetric energy potential exists between the frames that is quantified by the instantaneous Δv = v·dφ between them; in order for either frame to become indistinguishable from the other, such that their respective velocity and acceleration vectors are parallel, a change in velocity is required. While the qualitative features of general relativity imply this phenomenon (i.e., a symmetric potential difference between two points on a Newtonian 'equipotential surface' that is similar to a friction effect), it is not predicted by the field equations due to a modeling error concerning time. This is an error of omission; time has fundamental geometric properties implied by the principles of relativity that are not reflected in the field equations. Where b is the radius and g is the gravitational acceleration characterizing a spherical geoid S of an ideal point-source gravitational field, an elegant derivation that rests on first principles shows that for two points at rest on S separated by a distance d << b, a symmetric relativistic redshift exists between these points of magnitude z = gd²/(bc²), which over 1 km at Earth sea level yields z ≈ 10⁻¹⁷. It can be tested with a variety of methods, in particular laser interferometry. A more sophisticated derivation yields a considerably more complex predictive formula for any two points in a gravitational field.
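The quoted order of magnitude follows directly from the stated formula; the numbers below (standard sea-level gravity and mean Earth radius) are generic illustrative inputs, not values taken from the abstract.

```latex
z = \frac{g\,d^{2}}{b\,c^{2}}
\approx \frac{(9.8\ \mathrm{m\,s^{-2}})\,(10^{3}\ \mathrm{m})^{2}}
{(6.37\times10^{6}\ \mathrm{m})\,(3.0\times10^{8}\ \mathrm{m\,s^{-1}})^{2}}
\approx 1.7\times10^{-17}.
```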
NASA Astrophysics Data System (ADS)
Hamburg, S.
2016-12-01
Environmental Defense Fund (EDF) launched a series of 16 research studies in 2012 to quantify methane emissions from the U.S. oil and gas (O&G) supply chain. In addition to EDF's funding from philanthropic individuals and foundations and in-kind contributions from universities, over forty O&G companies contributed money to the studies. For a subset of studies that required partner companies to provide site access to measure their equipment, five common principles were followed to assure that research was objective and scientifically rigorous. First, academic scientists were selected as principal investigators (PIs) to lead the studies. In line with EDF's policy of not accepting money from corporate partners, O&G companies provided funding directly to academic PIs. Technical work groups and steering committees consisting of EDF and O&G partner staff advised the PIs in the planning and implementation of research, but PIs had the final authority in scientific decisions including publication content. Second, scientific advisory panels of independent experts advised the PIs in the study design, data analysis, and interpretation. Third, studies employed multiple methodologies when possible, including top-down and bottom-up measurements. This helped overcome the limitations of individual approaches to decrease the uncertainty of emission estimates and minimize concerns with data being "cherry-picked". Fourth, studies were published in peer-reviewed journals to undergo an additional round of independent review. Fifth, transparency of data was paramount. Study data were released after publication, although operator and site names of individual data points were anonymized to ensure transparency and allow independent analysis. Following these principles allowed an environmental organization, O&G companies, and academic scientists to collaborate in scientific research while minimizing conflicts of interest. This approach can serve as a model for a scientifically rigorous process minimally influenced by study partners.
Fleet management performance monitoring.
DOT National Transportation Integrated Search
2013-05-01
The principal goal of this project was to enhance and expand the analytical modeling methodology previously developed as part of the Fleet Management Criteria: Disposal Points and Utilization Rates project completed in 2010. The enhanced and ex...
Non-linear optical probing of strain-enabled ferroelectricity in CaTiO3 thin films
NASA Astrophysics Data System (ADS)
Vlahos, Eftihia; Brooks, Charles; Ecklund, Carl Johan; Biegalski, Mike; Rabe, Karin; Schlom, Darrell; Gopalan, Venkatraman
2010-03-01
First principles calculations predict CaTiO3, under tensile strain, to become ferroelectric with a spontaneous polarization of up to 0.5 C/m². Comparative second harmonic generation (SHG) studies of a series of strained CaTiO3 thin films were undertaken in order to determine their transition temperature and point group symmetry. The epitaxial strain ranged from -1.7% to 3.3%. Symmetry analysis of the SHG polar plots confirms that for the samples under tensile strain, the polarization is along the <110>p directions and the point group of the ferroelectric phase is mm2. SHG "hysteresis" loops were also obtained; these show clear switching. The experimental results are in excellent agreement with the predictions of the first principles calculations and with low temperature dielectric measurements performed on the same samples.
First-principles study of point defects in thorium carbide
NASA Astrophysics Data System (ADS)
Pérez Daroca, D.; Jaroszewicz, S.; Llois, A. M.; Mosca, H. O.
2014-11-01
Thorium-based materials are currently being investigated in relation with their potential utilization in Generation-IV reactors as nuclear fuels. One of the most important issues to be studied is their behavior under irradiation. A first approach to this goal is the study of point defects. By means of first-principles calculations within the framework of density functional theory, we study the stability and formation energies of vacancies, interstitials and Frenkel pairs in thorium carbide. We find that isolated C vacancies are the most likely defects, while C interstitials are energetically favored as compared to Th ones. These kinds of results for ThC have not, to the best of the authors' knowledge, been obtained previously, either experimentally or theoretically. For this reason, we compare with results on other compounds with the same NaCl-type structure.
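For orientation, defect formation energies of this kind are conventionally obtained from supercell total-energy differences; the expressions below follow the standard DFT bookkeeping (X = Th or C, with μ_X a reference chemical potential) and are not necessarily the exact convention adopted in the paper.

```latex
E_{f}^{\mathrm{vac}}(X)  = E[\text{supercell with one } X \text{ vacancy}] - E[\text{perfect supercell}] + \mu_{X},
\qquad
E_{f}^{\mathrm{int}}(X)  = E[\text{supercell with one } X \text{ interstitial}] - E[\text{perfect supercell}] - \mu_{X},
\qquad
E_{f}^{\mathrm{FP}}(X) \approx E_{f}^{\mathrm{vac}}(X) + E_{f}^{\mathrm{int}}(X)\ \ (\text{well-separated defects}).
```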
NASA Astrophysics Data System (ADS)
Chou, Jyh-Pin; Bodrog, Zoltán; Gali, Adam
2018-03-01
Solid-state qubits from paramagnetic point defects in solids are promising platforms to realize quantum networks and novel nanoscale sensors. Recent advances in materials engineering make it possible to create proximate qubits in solids that might interact with each other, leading to electron spin or charge fluctuation. Here we develop a method to calculate the tunneling-mediated charge diffusion between point defects from first principles and apply it to nitrogen-vacancy (NV) qubits in diamond. The calculated tunneling rates are in quantitative agreement with previous experimental data. Our results suggest that proximate neutral and negatively charged NV defect pairs can form a NV-NV molecule. A tunneling-mediated model for the source of decoherence of the near-surface NV qubits is developed based on our findings on the interacting qubits in diamond.
A systems approach to theoretical fluid mechanics: Fundamentals
NASA Technical Reports Server (NTRS)
Anyiwo, J. C.
1978-01-01
A preliminary application of the underlying principles of the investigator's general system theory to the description and analyses of the fluid flow system is presented. An attempt is made to establish practical models, or elements of the general fluid flow system from the point of view of the general system theory fundamental principles. Results obtained are applied to a simple experimental fluid flow system, as test case, with particular emphasis on the understanding of fluid flow instability, transition and turbulence.
Silicon micromachined vibrating gyroscopes
NASA Astrophysics Data System (ADS)
Voss, Ralf
1997-09-01
This work gives an overview of silicon micromachined vibrating gyroscopes. Market perspectives and fields of application are pointed out. The advantage of using silicon micromachining is discussed, and estimations of the desired performance, especially for automobiles, are given. The general principle of vibrating gyroscopes is explained. Vibrating silicon gyroscopes can be divided into seven classes. For each class the characteristic principle is presented and examples are given. Finally, a specific sensor for automotive applications, based on a tuning fork and with a sensitivity of 250 μV/degree, is described in detail.
Single service point: it's all in the design.
Bradigan, Pamela S; Rodman, Ruey L
2008-01-01
"Design thinking" principles from a leading design firm, IDEO, were key elements in the planning process for a one-desk service model, the ASK Desk, at the John A. Prior Health Sciences Library. The library administration and staff employed the methodology to enhance customer experiences, meet technology challenges, and compete in a changing education environment. The most recent renovations demonstrate how the principles were applied. The concept of "continuous design thinking" is important in the library's daily operations to serve customers most effectively.
Triple point temperature of neon isotopes: Dependence on nitrogen impurity and sealed-cell model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pavese, F.; Steur, P. P. M.; Giraudi, D.
2013-09-11
This paper illustrates a study conducted at INRIM, to further check how some quantities influence the value of the triple point temperature of the high-purity neon isotopes ²⁰Ne and ²²Ne. The influence of nitrogen as a chemical impurity in neon is critical with regard to the present best total uncertainty achieved in the measurement of these triple points, but only one determination is available in the literature. Checks are reported, performed on two different samples of ²²Ne known to contain a N₂ amount of 157·10⁻⁶, using two different models of sealed cells. The model of the cell can, in principle, have some effects on the shape of the melting plateau or on the triple point temperature observed for the sample sealed in it. This can be due to cell thermal parameters, or because the INRIM cell element mod. c contains many copper wires closely packed, which can, in principle, constrain the interface and induce a premelting-like effect. The reported results on a cell mod. Bter show no evident effect from the cell model and provide a value for the effect of N₂ on the Ne liquidus point of 8.6(1.9) μK per ppm N₂, only slightly different from the literature datum.
Adaptive force produced by stress-induced regulation of random variation intensity.
Shimansky, Yury P
2010-08-01
The Darwinian theory of life evolution is capable of explaining the majority of related phenomena. At the same time, the mechanisms of optimizing traits beneficial to a population as a whole but not directly to an individual remain largely unclear. There are also significant problems with explaining the phenomenon of punctuated equilibrium. From another perspective, multiple mechanisms for the regulation of the rate of genetic mutations according to the environmental stress have been discovered, but their precise functional role is not well understood yet. Here a novel mathematical paradigm called a Kinetic-Force Principle (KFP), which can serve as a general basis for biologically plausible optimization methods, is introduced and its rigorous derivation is provided. Based on this principle, it is shown that, if the rate of random changes in a biological system is proportional, even only roughly, to the amount of environmental stress, a virtual force is created, acting in the direction of stress relief. It is demonstrated that KFP can provide important insights into solving the above problems. Evidence is presented in support of the hypothesis that nature employs KFP for accelerating adaptation in biological systems. A detailed comparison between KFP and the principle of variation and natural selection is presented and their complementarity is revealed. It is concluded that KFP is not a competing alternative, but a powerful addition to the principle of variation and natural selection. It is also shown that KFP can be used in multiple ways for the adaptation of individual biological organisms.
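The qualitative claim, that stress-proportional random variation by itself generates a drift toward stress relief, can be illustrated with a toy stochastic simulation; the quadratic stress landscape, the step-size constant and the bounded state interval are arbitrary choices made for this illustration and are not part of the paper's formalism.

```python
import numpy as np

def simulate_kfp(steps=200_000, x0=5.0, seed=1):
    """Zero-mean random walk whose step size is proportional to the local 'stress' U(x).

    No deterministic force is applied, yet the walker spends most of its time
    near the stress minimum (x = 0) simply because the variation intensity is
    small there; this is the 'virtual force' described in the abstract.
    The state is clipped to [-10, 10] purely for numerical stability.
    """
    stress = lambda x: 1.0 + x ** 2          # toy stress landscape, minimum at x = 0
    rng = np.random.default_rng(seed)
    x, trace = x0, np.empty(steps)
    for i in range(steps):
        x = np.clip(x + 0.01 * stress(x) * rng.normal(), -10.0, 10.0)
        trace[i] = x
    return trace

trace = simulate_kfp()
print("time-averaged |x| over the second half of the run:", np.abs(trace[100_000:]).mean())
```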
FireProt: Energy- and Evolution-Based Computational Design of Thermostable Multiple-Point Mutants.
Bednar, David; Beerens, Koen; Sebestova, Eva; Bendl, Jaroslav; Khare, Sagar; Chaloupkova, Radka; Prokop, Zbynek; Brezovsky, Jan; Baker, David; Damborsky, Jiri
2015-11-01
There is great interest in increasing proteins' stability to enhance their utility as biocatalysts, therapeutics, diagnostics and nanomaterials. Directed evolution is a powerful, but experimentally strenuous approach. Computational methods offer attractive alternatives. However, due to the limited reliability of predictions and potentially antagonistic effects of substitutions, only single-point mutations are usually predicted in silico, experimentally verified and then recombined in multiple-point mutants. Thus, substantial screening is still required. Here we present FireProt, a robust computational strategy for predicting highly stable multiple-point mutants that combines energy- and evolution-based approaches with smart filtering to identify additive stabilizing mutations. FireProt's reliability and applicability was demonstrated by validating its predictions against 656 mutations from the ProTherm database. We demonstrate that thermostability of the model enzymes haloalkane dehalogenase DhaA and γ-hexachlorocyclohexane dehydrochlorinase LinA can be substantially increased (ΔTm = 24°C and 21°C) by constructing and characterizing only a handful of multiple-point mutants. FireProt can be applied to any protein for which a tertiary structure and homologous sequences are available, and will facilitate the rapid development of robust proteins for biomedical and biotechnological applications.
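The overall strategy of filtering single-point predictions before combining them into a multiple-point design can be sketched as follows; the thresholds, the input fields and the crude distance-based additivity check are illustrative assumptions and do not reproduce FireProt's actual energy- and evolution-based pipeline.

```python
def design_multiple_point_mutant(candidates, min_ddg=1.0, min_conservation=0.5, min_separation=8.0):
    """Greedy combination of predicted stabilizing single-point mutations.

    candidates: list of dicts with keys (all hypothetical field names)
        'mutation'     e.g. "A154P"
        'ddg'          predicted stabilization (positive = stabilizing)
        'conservation' evolutionary support score in [0, 1]
        'ca_coord'     (x, y, z) of the mutated residue's C-alpha
    Mutations are filtered by both criteria and combined greedily, skipping
    residues too close to an already accepted one as a crude proxy for
    avoiding antagonistic, non-additive pairs.
    """
    def distance(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5

    passed = [c for c in candidates
              if c["ddg"] >= min_ddg and c["conservation"] >= min_conservation]
    passed.sort(key=lambda c: c["ddg"], reverse=True)

    accepted = []
    for cand in passed:
        if all(distance(cand["ca_coord"], a["ca_coord"]) >= min_separation for a in accepted):
            accepted.append(cand)
    return [c["mutation"] for c in accepted]
```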
Gschwind, Michael K [Chappaqua, NY
2011-03-01
Mechanisms for implementing a floating point only single instruction multiple data instruction set architecture are provided. A processor is provided that comprises an issue unit, an execution unit coupled to the issue unit, and a vector register file coupled to the execution unit. The execution unit has logic that implements a floating point (FP) only single instruction multiple data (SIMD) instruction set architecture (ISA). The floating point vector registers of the vector register file store both scalar and floating point values as vectors having a plurality of vector elements. The processor may be part of a data processing system.
NASA Astrophysics Data System (ADS)
Black, Randy; Bai, Haowei; Michalicek, Andrew; Shelton, Blaine; Villela, Mark
2008-01-01
Currently, autonomy in space applications is limited by a variety of technology gaps. Innovative application of wireless technology and avionics architectural principles drawn from the Orion crew exploration vehicle provide solutions for several of these gaps. The Vision for Space Exploration envisions extensive use of autonomous systems. Economic realities preclude continuing the level of operator support currently required of autonomous systems in space. In order to decrease the number of operators, more autonomy must be afforded to automated systems. However, certification authorities have been notoriously reluctant to certify autonomous software in the presence of humans or when costly missions may be jeopardized. The Orion avionics architecture, drawn from advanced commercial aircraft avionics, is based upon several architectural principles including partitioning in software. Robust software partitioning provides "brick wall" separation between software applications executing on a single processor, along with controlled data movement between applications. Taking advantage of these attributes, non-deterministic applications can be placed in one partition and a "Safety" application created in a separate partition. This "Safety" partition can track the position of astronauts or critical equipment and prevent any unsafe command from executing. Only the Safety partition need be certified to a human rated level. As a proof-of-concept demonstration, Honeywell has teamed with the Ultra WideBand (UWB) Working Group at NASA Johnson Space Center to provide tracking of humans, autonomous systems, and critical equipment. Using UWB the NASA team can determine positioning to within less than one inch resolution, allowing a Safety partition to halt operation of autonomous systems in the event that an unplanned collision is imminent. Another challenge facing autonomous systems is the coordination of multiple autonomous agents. Current approaches address the issue as one of networking and coordination of multiple independent units, each with its own mission. As a proof-of-concept Honeywell is developing and testing various algorithms that lead to a deterministic, fault tolerant, reliable wireless backplane. Just as advanced avionics systems control several subsystems, actuators, sensors, displays, etc.; a single "master" autonomous agent (or base station computer) could control multiple autonomous systems. The problem is simplified to controlling a flexible body consisting of several sensors and actuators, rather than one of coordinating multiple independent units. By filling technology gaps associated with space based autonomous system, wireless technology and Orion architectural principles provide the means for decreasing operational costs and simplifying problems associated with collaboration of multiple autonomous systems.
NASA Astrophysics Data System (ADS)
Du, Zhaohui; Chen, Xuefeng; Zhang, Han; Zi, Yanyang; Yan, Ruqiang
2017-09-01
The gearbox of a wind turbine (WT) has dominant failure rates and highest downtime loss among all WT subsystems. Thus, gearbox health assessment for maintenance cost reduction is of paramount importance. The concurrence of multiple faults in gearbox components is a common phenomenon due to fault induction mechanism. This problem should be considered before planning to replace the components of the WT gearbox. Therefore, the key fault patterns should be reliably identified from noisy observation data for the development of an effective maintenance strategy. However, most of the existing studies focusing on multiple fault diagnosis always suffer from inappropriate division of fault information in order to satisfy various rigorous decomposition principles or statistical assumptions, such as the smooth envelope principle of ensemble empirical mode decomposition and the mutual independence assumption of independent component analysis. Thus, this paper presents a joint subspace learning-based multiple fault detection (JSL-MFD) technique to construct different subspaces adaptively for different fault patterns. Its main advantage is its capability to learn multiple fault subspaces directly from the observation signal itself. It can also sparsely concentrate the feature information into a few dominant subspace coefficients. Furthermore, it can eliminate noise by simply performing coefficient shrinkage operations. Consequently, multiple fault patterns are reliably identified by utilizing the maximum fault information criterion. The superiority of JSL-MFD in multiple fault separation and detection is comprehensively investigated and verified by the analysis of a data set of a 750 kW WT gearbox. Results show that JSL-MFD is superior to a state-of-the-art technique in detecting hidden fault patterns and enhancing detection accuracy.
First principles design of a core bioenergetic transmembrane electron-transfer protein
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goparaju, Geetha; Fry, Bryan A.; Chobot, Sarah E.
Here we describe the design, Escherichia coli expression and characterization of a simplified, adaptable and functionally transparent single chain 4-α-helix transmembrane protein frame that binds multiple heme and light activatable porphyrins. Such man-made cofactor-binding oxidoreductases, designed from first principles with minimal reference to natural protein sequences, are known as maquettes. This design is an adaptable frame aiming to uncover core engineering principles governing bioenergetic transmembrane electron-transfer function and recapitulate protein archetypes proposed to represent the origins of photosynthesis. This article is part of a Special Issue entitled Biodesign for Bioenergetics — the design and engineering of electronic transfer cofactors, proteins and protein networks, edited by Ronald L. Koder and J.L. Ross Anderson.
[Research on fast classification based on LIBS technology and principle component analyses].
Yu, Qi; Ma, Xiao-Hong; Wang, Rui; Zhao, Hua-Feng
2014-11-01
Laser-induced breakdown spectroscopy (LIBS) and principal component analysis (PCA) were combined to study aluminum alloy classification in the present article. Classification experiments were done on thirteen different standard samples of aluminum alloy belonging to 4 different types, and the results suggested that the LIBS-PCA method can be used for fast classification of aluminum alloys. PCA was used to analyze the spectrum data from the LIBS experiments; the three principal components that contribute the most were identified, the principal component scores of the spectra were calculated, and the scores were plotted in three-dimensional coordinates. It was found that the spectrum sample points show a clear convergence phenomenon according to the type of aluminum alloy they belong to. This result established the three principal components and a preliminary zoning of aluminum alloy types. In order to verify its accuracy, 20 different aluminum alloy samples were used in the same experiments to test the type zoning. The experimental results showed that the spectrum sample points all located in the corresponding area of their aluminum alloy type, which proved the correctness of the earlier type zoning based on standard samples. On this basis, the identification of an unknown type of aluminum alloy can be performed. All the experimental results showed that the accuracy of the principal component analysis method based on laser-induced breakdown spectroscopy is more than 97.14%, and it can classify the different types effectively. Compared to commonly used chemical methods, laser-induced breakdown spectroscopy can perform detection of the sample in situ and quickly, with little sample preparation; therefore, using the combination of LIBS and PCA in areas such as quality testing and on-line industrial control can save a lot of time and cost, and greatly improve the efficiency of detection.
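The described workflow, projecting LIBS spectra onto the first three principal components and assigning alloy types by proximity in that space, can be sketched with scikit-learn; the nearest-centroid rule below is an illustrative stand-in for the type-zoning procedure described in the abstract.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import NearestCentroid

def train_alloy_classifier(spectra, labels, n_components=3):
    """Reduce LIBS spectra to three principal components and fit a centroid rule.

    spectra: (n_samples, n_channels) array of emission spectra
    labels:  alloy type of each standard sample
    """
    pca = PCA(n_components=n_components)
    scores = pca.fit_transform(spectra)
    clf = NearestCentroid().fit(scores, labels)
    return pca, clf

def classify(pca, clf, new_spectra):
    """Project unknown spectra into PCA space and assign the nearest alloy type."""
    return clf.predict(pca.transform(new_spectra))

# usage sketch with the 13 standard samples described above:
# pca, clf = train_alloy_classifier(standard_spectra, standard_types)
# predicted_types = classify(pca, clf, unknown_spectra)
```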
Labarta, T
2007-01-01
Operational radiation protection of workers during the dismantling of nuclear facilities is based on the same radiation protection principles as those applied during the facility's operational period, with the objective of ensuring proper implementation of the as-low-as-reasonably-achievable (ALARA) principle. These principles are: prior determination of the nature and magnitude of the radiological risk; classification of workplaces and workers according to the risks; implementation of control measures; and monitoring of zones and working conditions, including, if necessary, individual monitoring. From the experience and lessons learned during the dismantling projects carried out in Spain, several important aspects of the practical implementation of these principles that directly influence the prevention of exposures and the estimation of internal doses are pointed out, with special emphasis on the estimation of internal doses due to transuranic intakes.
Fermat's principle of least time predicts refraction of ant trails at substrate borders.
Oettler, Jan; Schmid, Volker S; Zankl, Niko; Rey, Olivier; Dress, Andreas; Heinze, Jürgen
2013-01-01
Fermat's principle of least time states that light rays passing through different media follow the fastest (and not the most direct) path between two points, leading to refraction at medium borders. Humans intuitively employ this rule, e.g., when a lifeguard has to infer the fastest way to traverse both beach and water to reach a swimmer in need. Here, we tested whether foraging ants also follow Fermat's principle when forced to travel on two surfaces that differentially affected the ants' walking speed. Workers of the little fire ant, Wasmannia auropunctata, established "refracted" pheromone trails to a food source. These trails deviated from the most direct path, but were not different to paths predicted by Fermat's principle. Our results demonstrate a new aspect of decentralized optimization and underline the versatility of the simple yet robust rules governing the self-organization of group-living animals.
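A worked sketch of the prediction being tested: minimizing total travel time across two media with different walking speeds reproduces Snell's law of refraction. The geometry and speeds below are arbitrary illustrative values, not the experimental parameters:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Start point A on substrate 1, food at B on substrate 2; the border is the line y = 0.
A, B = np.array([0.0, 1.0]), np.array([3.0, -1.0])
v1, v2 = 1.0, 0.5                 # walking speeds on the two substrates (arbitrary units)

def travel_time(x):
    """Total travel time via the crossing point (x, 0) on the border."""
    cross = np.array([x, 0.0])
    return np.hypot(*(cross - A)) / v1 + np.hypot(*(B - cross)) / v2

x_opt = minimize_scalar(travel_time, bounds=(0.0, 3.0), method="bounded").x

# Snell's law check: sin(theta1)/sin(theta2) equals v1/v2 at the time-minimizing path.
sin1 = x_opt / np.hypot(x_opt, A[1])
sin2 = (B[0] - x_opt) / np.hypot(B[0] - x_opt, B[1])
print(x_opt, sin1 / sin2, v1 / v2)   # the last two numbers agree
```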
Definitely maybe: can unconscious processes perform the same functions as conscious processes?
Hesselmann, Guido; Moors, Pieter
2015-01-01
Hassin recently proposed the “Yes It Can” (YIC) principle to describe the division of labor between conscious and unconscious processes in human cognition. According to this principle, unconscious processes can carry out every fundamental high-level cognitive function that conscious processes can perform. In our commentary, we argue that the author presents an overly idealized review of the literature in support of the YIC principle. Furthermore, we point out that the dissimilar trends observed in social and cognitive psychology with respect to published evidence of strong unconscious effects can be better explained by how awareness is defined and measured in the two research fields. Finally, we show that the experimental paradigm chosen by Hassin to rule out remaining objections against the YIC principle is unsuited to verifying the new default notion that all high-level cognitive functions can unfold unconsciously. PMID:25999896
NASA Astrophysics Data System (ADS)
Bjorklund, E.
1994-12-01
In the 1970s, when computers were memory limited, operating system designers created the concept of "virtual memory", which gave users the ability to address more memory than physically existed. In the 1990s, many large control systems have the potential of becoming data limited. We propose that many of the principles behind virtual memory systems (working sets, locality, caching and clustering) can also be applied to data-limited systems, creating, in effect, "virtual data systems". At the Los Alamos National Laboratory's Clinton P. Anderson Meson Physics Facility (LAMPF), we have applied these principles to a moderately sized (10 000 data points) data acquisition and control system. To test the principles, we measured the system's performance during tune-up, production, and maintenance periods. In this paper, we present a general discussion of the principles of a virtual data system along with some discussion of our own implementation and the results of our performance measurements.
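An illustrative sketch of the caching/working-set idea applied to control-system data points; the channel names and fetch function are hypothetical and this is not the LAMPF implementation:

```python
from collections import OrderedDict

class DataPointCache:
    """Keep the current working set of channel readings in memory and evict the
    least recently used entries, in analogy with virtual-memory paging."""
    def __init__(self, fetch, capacity=1024):
        self.fetch = fetch                  # callable that reads a point from the slow source
        self.capacity = capacity
        self._cache = OrderedDict()

    def read(self, channel):
        if channel in self._cache:
            self._cache.move_to_end(channel)        # mark as recently used
            return self._cache[channel]
        value = self.fetch(channel)                 # "page fault": go to the front end
        self._cache[channel] = value
        if len(self._cache) > self.capacity:
            self._cache.popitem(last=False)         # evict the least recently used point
        return value

# Hypothetical usage:
# cache = DataPointCache(fetch=read_from_frontend, capacity=10_000)
# beam_current = cache.read("LINAC:BEAM_CURRENT")
```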
Bryophyllum pinnatum: A Great Teaching Aid.
ERIC Educational Resources Information Center
Martin, Francis L.
1983-01-01
Suggests using Bryophyllum pinnatum to illustrate botanical principles. Includes tips for keeping and maintaining the plant in the classroom and suggests several student activities, including observing root/shoot growth, investigating apical dominance, exploring multiple leaf development, and others. (JN)
Potential scenarios of concern for high speed rail operations
DOT National Transportation Integrated Search
2011-03-16
Currently, multiple operating authorities are proposing the introduction of high-speed rail service in the United States. While high-speed rail service shares a number of basic principles with conventional-speed rail service, the operational ...
NASA Astrophysics Data System (ADS)
Pyne, Moinak
This thesis aims to model and control the flow of power in a DC microgrid. Specifically, the energy sources are a photovoltaic system and the utility grid, with a lead-acid battery based energy storage system and twenty PEV charging stations as the loads. Theoretical principles of large-scale state-space modeling are applied to model the considerable number of power electronic converters needed for controlling voltage and current thresholds. The energy storage system is modeled using neural networks to provide a stable and uncomplicated model of the lead-acid battery. Power flow control is structured as a hierarchical problem with multiple interactions between individual components of the microgrid. The control is implemented with fuzzy logic that schedules maximum use of available solar energy, compensates demand or excess power with the energy storage system, and minimizes utility grid use, while providing multiple charging speeds for the PEVs.
Analysis of Classes of Singular Steady State Reaction Diffusion Equations
NASA Astrophysics Data System (ADS)
Son, Byungjae
We study positive radial solutions to classes of steady state reaction diffusion problems on the exterior of a ball with both Dirichlet and nonlinear boundary conditions. We study both Laplacian as well as p-Laplacian problems with reaction terms that are p-sublinear at infinity. We consider both positone and semipositone reaction terms and establish existence, multiplicity and uniqueness results. Our existence and multiplicity results are achieved by a method of sub-supersolutions and uniqueness results via a combination of maximum principles, comparison principles, energy arguments and a-priori estimates. Our results significantly enhance the literature on p-sublinear positone and semipositone problems. Finally, we provide exact bifurcation curves for several one-dimensional problems. In the autonomous case, we extend and analyze a quadrature method, and in the nonautonomous case, we employ shooting methods. We use numerical solvers in Mathematica to generate the bifurcation curves.
Robust and efficient overset grid assembly for partitioned unstructured meshes
NASA Astrophysics Data System (ADS)
Roget, Beatrice; Sitaraman, Jayanarayanan
2014-03-01
This paper presents a method to perform efficient and automated Overset Grid Assembly (OGA) on a system of overlapping unstructured meshes in a parallel computing environment where all meshes are partitioned into multiple mesh-blocks and processed on multiple cores. The main task of the overset grid assembler is to identify, in parallel, among all points in the overlapping mesh system, at which points the flow solution should be computed (field points), interpolated (receptor points), or ignored (hole points). Point containment search or donor search, an algorithm to efficiently determine the cell that contains a given point, is the core procedure necessary for accomplishing this task. Donor search is particularly challenging for partitioned unstructured meshes because of the complex irregular boundaries that are often created during partitioning.
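A minimal two-dimensional sketch of the donor-search step (a bounding-box prefilter followed by a barycentric containment test on triangles); this illustrates only the core operation, not the paper's parallel, partitioned algorithm:

```python
import numpy as np

def barycentric_contains(tri, p, tol=1e-12):
    """Return True if point p lies inside triangle tri (a 3x2 array of vertices)."""
    a, b, c = tri
    T = np.column_stack((b - a, c - a))
    l1, l2 = np.linalg.solve(T, p - a)
    return l1 >= -tol and l2 >= -tol and l1 + l2 <= 1.0 + tol

def find_donor(point, cells, vertices):
    """Donor search: return the index of the cell containing `point`, or None.
    `cells` is an (n, 3) array of vertex indices, `vertices` an (m, 2) array."""
    tris = vertices[cells]                          # (n, 3, 2)
    lo, hi = tris.min(axis=1), tris.max(axis=1)     # per-cell bounding boxes
    candidates = np.where(np.all((point >= lo) & (point <= hi), axis=1))[0]
    for idx in candidates:                          # exact test only on candidates
        if barycentric_contains(tris[idx], point):
            return idx
    return None                                     # no donor: hole or orphan point
```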
Fessenden, S W; Hackmann, T J; Ross, D A; Foskolos, A; Van Amburgh, M E
2017-09-01
Microbial samples from 4 independent experiments in lactating dairy cattle were obtained and analyzed for nutrient composition, AA digestibility, and AA profile after multiple hydrolysis times ranging from 2 to 168 h. Similar bacterial and protozoal isolation techniques were used for all isolations. Omasal bacteria and protozoa samples were analyzed for AA digestibility using a new in vitro technique. Multiple time point hydrolysis and least squares nonlinear regression were used to determine the AA content of omasal bacteria and protozoa, and equivalency comparisons were made against single time point hydrolysis. Formalin was used in 1 experiment, which negatively affected AA digestibility and likely limited the complete release of AA during acid hydrolysis. The mean AA digestibility was 87.8 and 81.6% for non-formalin-treated bacteria and protozoa, respectively. Preservation of microbe samples in formalin likely decreased recovery of several individual AA. Results from the multiple time point hydrolysis indicated that Ile, Val, and Met hydrolyzed at a slower rate compared with other essential AA. Single time point hydrolysis was found to be nonequivalent to multiple time point hydrolysis when considering biologically important changes in estimated microbial AA profiles. Several AA, including Met, Ile, and Val, were underpredicted using AA determination after a single 24-h hydrolysis. Models for predicting postruminal supply of AA might need to consider potential bias present in postruminal AA flow literature when AA determinations are performed after single time point hydrolysis and when using formalin as a preservative for microbial samples. Copyright © 2017 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
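A hedged sketch of the multiple-time-point idea: fit a simple release-and-degradation model to the AA recovered at several hydrolysis times and take the fitted asymptote as the estimated content. The model form and the data points are illustrative assumptions, not the paper's regression or measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

def release_degrade(t, a_true, k_rel, k_deg):
    """Illustrative model: AA is released from the protein at rate k_rel and
    slowly destroyed at rate k_deg; a_true is the content being estimated."""
    return a_true * (1.0 - np.exp(-k_rel * t)) * np.exp(-k_deg * t)

# Hypothetical recoveries (g/100 g) at hydrolysis times of 2-168 h.
t_obs = np.array([2.0, 4.0, 8.0, 24.0, 48.0, 72.0, 120.0, 168.0])
aa_obs = np.array([1.1, 1.9, 2.9, 3.8, 3.9, 3.8, 3.6, 3.4])

params, _ = curve_fit(release_degrade, t_obs, aa_obs,
                      p0=[4.0, 0.2, 0.001], bounds=(0, np.inf))
a_true, k_rel, k_deg = params
print(f"estimated content: {a_true:.2f}; single 24 h point: {aa_obs[3]:.2f}")
```

For a slowly hydrolyzing AA, the fitted asymptote exceeds the single 24 h recovery, which is the bias the multiple-time-point approach is meant to remove.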
Yu, Zhizhou; Chen, Jian; Zhang, Lei; Wang, Jian
2013-12-11
We report an investigation of Coulomb blockade transport through an endohedral N@C60 weakly coupled with aluminum leads, employing the first-principles method combined with the Keldysh non-equilibrium Green's function derived from the equation of motion beyond the Hartree-Fock approximation. The differential conductance characteristics of the molecular device are calculated within the Coulomb blockade regime, which shows the Coulomb diamond as observed experimentally. When the gate voltage is less than that of the degeneracy point, there are two peaks in the differential conductance with an excited state induced by the change of the exchange interaction between the spin of C60 and the encapsulated nitrogen atom due to the transition from N@C60(1-) to N@C60(2-), while for a gate voltage larger than that of the degeneracy point, no excited state is available due to the quenching of exchange energy. As a result, there is only one Coulomb blockade peak in the differential conductance from the electron tunneling through the highest energy level below the Fermi level. Our first-principles results are in good agreement with experimental data obtained by an endohedral N@C60 molecular device.
Thermodynamic framework to assess low abundance DNA mutation detection by hybridization.
Willems, Hanny; Jacobs, An; Hadiwikarta, Wahyu Wijaya; Venken, Tom; Valkenborg, Dirk; Van Roy, Nadine; Vandesompele, Jo; Hooyberghs, Jef
2017-01-01
The knowledge of genomic DNA variations in patient samples has a high and increasing value for human diagnostics in its broadest sense. Although many methods and sensors to detect or quantify these variations are available or under development, the number of underlying physico-chemical detection principles is limited. One of these principles is the hybridization of sample target DNA versus nucleic acid probes. We introduce a novel thermodynamics approach and develop a framework to exploit the specific detection capabilities of nucleic acid hybridization, using generic principles applicable to any platform. As a case study, we detect point mutations in the KRAS oncogene on a microarray platform. For the given platform and hybridization conditions, we demonstrate the multiplex detection capability of hybridization and assess the detection limit using thermodynamic considerations; DNA containing point mutations in a background of wild type sequences can be identified down to at least 1% relative concentration. In order to show the clinical relevance, the detection capabilities are confirmed on challenging formalin-fixed paraffin-embedded clinical tumor samples. This enzyme-free detection framework contains the accuracy and efficiency to screen for hundreds of mutations in a single run with many potential applications in molecular diagnostics and the field of personalised medicine.
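For orientation, hybridization-based detection of this kind rests on two-state duplex thermodynamics; in a standard Langmuir-type description (a textbook form, not the specific framework developed in the paper), the fraction of probes hybridized is

```latex
\theta \;=\; \frac{K\,c}{1 + K\,c}, \qquad K \;=\; \exp\!\left(-\frac{\Delta G}{RT}\right),
```

where c is the target concentration and ΔG the duplex hybridization free energy; a point mutation destabilizes the duplex (makes ΔG less favorable), lowering θ relative to the perfect-match probe.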
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tweardy, Matthew C.; McConchie, Seth; Hayward, Jason P.
2017-06-13
An extension of the point kinetics model is developed in this paper to describe the neutron multiplicity response of a bare uranium object under interrogation by an associated particle imaging deuterium-tritium (D-T) measurement system. This extended model is used to estimate the total neutron multiplication of the uranium. Both MCNPX-PoliMi simulations and data from active interrogation measurements of highly enriched and depleted uranium geometries are used to evaluate the potential of this method and to identify the sources of systematic error. The detection efficiency correction for measured coincidence response is identified as a large source of systematic error. If the detection process is not considered, results suggest that the method can estimate total multiplication to within 13% of the simulated value. Values for multiplicity constants in the point kinetics equations are sensitive to enrichment due to (n, xn) interactions by D-T neutrons and can introduce another significant source of systematic bias. This can theoretically be corrected if isotopic composition is known a priori. Finally, the spatial dependence of multiplication is also suspected of introducing further systematic bias for high multiplication uranium objects.
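For reference, the total neutron multiplication being estimated is, in the simplest point model, a geometric sum over fission generations governed by the effective multiplication factor k (a textbook relation, not the extended point kinetics model developed in the paper):

```latex
M \;=\; 1 + k + k^{2} + \cdots \;=\; \frac{1}{1-k}, \qquad 0 \le k < 1 .
```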
ERIC Educational Resources Information Center
Porter, Kristin E.
2018-01-01
Researchers are often interested in testing the effectiveness of an intervention on multiple outcomes, for multiple subgroups, at multiple points in time, or across multiple treatment groups. The resulting multiplicity of statistical hypothesis tests can lead to spurious findings of effects. Multiple testing procedures (MTPs) are statistical…
Gschwind, Michael K
2013-04-16
Mechanisms for generating and executing programs for a floating point (FP) only single instruction multiple data (SIMD) instruction set architecture (ISA) are provided. A computer program product comprising a computer recordable medium having a computer readable program recorded thereon is provided. The computer readable program, when executed on a computing device, causes the computing device to receive one or more instructions and execute the one or more instructions using logic in an execution unit of the computing device. The logic implements a floating point (FP) only single instruction multiple data (SIMD) instruction set architecture (ISA), based on data stored in a vector register file of the computing device. The vector register file is configured to store both scalar and floating point values as vectors having a plurality of vector elements.
Yi, Faliu; Lee, Jieun; Moon, Inkyu
2014-05-01
The reconstruction of multiple depth images with a ray back-propagation algorithm in three-dimensional (3D) computational integral imaging is computationally burdensome. Further, a reconstructed depth image consists of a focus and an off-focus area. Focus areas are 3D points on the surface of an object that are located at the reconstructed depth, while off-focus areas include 3D points in free-space that do not belong to any object surface in 3D space. Generally, without being removed, the presence of an off-focus area would adversely affect the high-level analysis of a 3D object, including its classification, recognition, and tracking. Here, we use a graphics processing unit (GPU) that supports parallel processing with multiple processors to simultaneously reconstruct multiple depth images using a lookup table containing the shifted values along the x and y directions for each elemental image in a given depth range. Moreover, each 3D point on a depth image can be measured by analyzing its statistical variance with its corresponding samples, which are captured by the two-dimensional (2D) elemental images. These statistical variances can be used to classify depth image pixels as either focus or off-focus points. At this stage, the measurement of focus and off-focus points in multiple depth images is also implemented in parallel on a GPU. Our proposed method is conducted based on the assumption that there is no occlusion of the 3D object during the capture stage of the integral imaging process. Experimental results have demonstrated that this method is capable of removing off-focus points in the reconstructed depth image. The results also showed that using a GPU to remove the off-focus points could greatly improve the overall computational speed compared with using a CPU.
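A minimal NumPy sketch of the variance test described above: for one reconstruction depth, gather the samples that the shifted elemental images contribute to each pixel and classify low-variance pixels as in-focus. The data layout and threshold are illustrative assumptions; the paper runs these computations in parallel on a GPU:

```python
import numpy as np

def classify_focus(samples, threshold):
    """`samples` has shape (n_elemental_images, H, W): for one depth, the pixel
    values contributed by each shifted elemental image.  In-focus surface points
    agree across elemental images (low variance); off-focus free-space points
    do not (high variance)."""
    depth_image = samples.mean(axis=0)                 # back-propagated reconstruction
    variance = samples.var(axis=0)                     # per-pixel spread across samples
    focus_mask = variance <= threshold                 # True where the point lies on a surface
    cleaned = np.where(focus_mask, depth_image, 0.0)   # suppress off-focus points
    return cleaned, focus_mask
```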
The boiling point of stratospheric aerosols.
NASA Technical Reports Server (NTRS)
Rosen, J. M.
1971-01-01
A photoelectric particle counter was used for the measurement of aerosol boiling points. The operational principle involves raising the temperature of the aerosol by vigorously heating a portion of the intake tube. At or above the boiling point, the particles disintegrate rather quickly, and a noticeable effect on the size distribution and concentration is observed. Stratospheric aerosols appear to have the same volatility as a solution of 75% sulfuric acid. Chemical analysis of the aerosols indicates that there are other substances present, but that the sulfate radical is apparently the major constituent.
1983-12-01
national gateway closest to an MCI interconnection point would be chosen.) Another significant principle is that mobile users are to be addressed the ... duplication with E.16n. It was agreed that, from an addressing viewpoint, mobile subscribers are like fixed subscribers; i.e., mobile subscribers have TEs ... reference points S and T, NT1, and may have NT2. Therefore, an ISDN number has the same ability to unambiguously identify points in mobile subscriber
Estimating Statistical Power When Making Adjustments for Multiple Tests
ERIC Educational Resources Information Center
Porter, Kristin E.
2016-01-01
In recent years, there has been increasing focus on the issue of multiple hypotheses testing in education evaluation studies. In these studies, researchers are typically interested in testing the effectiveness of an intervention on multiple outcomes, for multiple subgroups, at multiple points in time or across multiple treatment groups. When…
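A brief sketch of two common multiple testing procedures of the kind this line of work evaluates (Bonferroni and Benjamini-Hochberg); these are illustrative, not the specific MTPs or power calculations in the report:

```python
import numpy as np

def bonferroni(p_values, alpha=0.05):
    """Reject H0 when p <= alpha / m; controls the familywise error rate."""
    p = np.asarray(p_values)
    return p <= alpha / p.size

def benjamini_hochberg(p_values, alpha=0.05):
    """Step-up procedure controlling the false discovery rate: find the largest
    rank i with p_(i) <= (i/m)*alpha and reject the i smallest p-values."""
    p = np.asarray(p_values)
    order = np.argsort(p)
    thresholds = alpha * np.arange(1, p.size + 1) / p.size
    below = p[order] <= thresholds
    k = np.max(np.nonzero(below)[0]) + 1 if below.any() else 0
    reject = np.zeros(p.size, dtype=bool)
    reject[order[:k]] = True
    return reject

# Example: p-values from tests across multiple outcomes, subgroups, or time points.
pvals = [0.001, 0.012, 0.03, 0.04, 0.20, 0.55]
print(bonferroni(pvals), benjamini_hochberg(pvals))
```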
Multi-point laser coherent detection system and its application on vibration measurement
NASA Astrophysics Data System (ADS)
Fu, Y.; Yang, C.; Xu, Y. J.; Liu, H.; Yan, K.; Guo, M.
2015-05-01
Laser Doppler vibrometry (LDV) is a well-known interferometric technique for measuring the motions, vibrations and mode shapes of machine components and structures. The drawback of commercial LDV is that it offers only pointwise measurement. To build up a vibrometric image, a scanning device is normally adopted to scan the laser point along two spatial axes. These scanning laser Doppler vibrometers (SLDVs) assume that the measurement conditions remain invariant while multiple, identical, sequential measurements are performed; this assumption makes SLDVs impractical for measuring transient events. In this paper, we introduce a new multiple-point laser coherent detection system based on spatial-encoding technology and a fiber configuration. Simultaneous vibration measurement at multiple points is realized using a single photodetector. A prototype 16-point laser coherent detection system is built and applied to measure the vibration of various objects, such as the body of a car or a motorcycle with the engine running and under shock tests. The results show the promise of multi-point laser coherent detection systems in the areas of nondestructive testing and precise dynamic measurement.
Nonemergency PCI at hospitals with or without on-site cardiac surgery.
Jacobs, Alice K; Normand, Sharon-Lise T; Massaro, Joseph M; Cutlip, Donald E; Carrozza, Joseph P; Marks, Anthony D; Murphy, Nancy; Romm, Iyah K; Biondolillo, Madeleine; Mauri, Laura
2013-04-18
Emergency surgery has become a rare event after percutaneous coronary intervention (PCI). Whether having cardiac-surgery services available on-site is essential for ensuring the best possible outcomes during and after PCI remains uncertain. We enrolled patients with indications for nonemergency PCI who presented at hospitals in Massachusetts without on-site cardiac surgery and randomly assigned these patients, in a 3:1 ratio, to undergo PCI at that hospital or at a partner hospital that had cardiac surgery services available. A total of 10 hospitals without on-site cardiac surgery and 7 with on-site cardiac surgery participated. The coprimary end points were the rates of major adverse cardiac events--a composite of death, myocardial infarction, repeat revascularization, or stroke--at 30 days (safety end point) and at 12 months (effectiveness end point). The primary end points were analyzed according to the intention-to-treat principle and were tested with the use of multiplicative noninferiority margins of 1.5 (for safety) and 1.3 (for effectiveness). A total of 3691 patients were randomly assigned to undergo PCI at a hospital without on-site cardiac surgery (2774 patients) or at a hospital with on-site cardiac surgery (917 patients). The rates of major adverse cardiac events were 9.5% in hospitals without on-site cardiac surgery and 9.4% in hospitals with on-site cardiac surgery at 30 days (relative risk, 1.00; 95% one-sided upper confidence limit, 1.22; P<0.001 for noninferiority) and 17.3% and 17.8%, respectively, at 12 months (relative risk, 0.98; 95% one-sided upper confidence limit, 1.13; P<0.001 for noninferiority). The rates of death, myocardial infarction, repeat revascularization, and stroke (the components of the primary end point) did not differ significantly between the groups at either time point. Nonemergency PCI procedures performed at hospitals in Massachusetts without on-site surgical services were noninferior to procedures performed at hospitals with on-site surgical services with respect to the 30-day and 1-year rates of clinical events. (Funded by the participating hospitals without on-site cardiac surgery; MASS COM ClinicalTrials.gov number, NCT01116882.).
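A hedged sketch of the type of noninferiority comparison reported: compute the relative risk and its one-sided upper 95% confidence limit on the log scale (Katz approximation) and compare it with the prespecified margin. The event counts below are invented for illustration and are not the trial data:

```python
import numpy as np
from scipy.stats import norm

def noninferiority_rr(events_test, n_test, events_ctrl, n_ctrl, margin, alpha=0.05):
    """One-sided noninferiority test for a relative risk using the
    log-normal (Katz) approximation for the confidence limit."""
    p1, p2 = events_test / n_test, events_ctrl / n_ctrl
    rr = p1 / p2
    se_log_rr = np.sqrt(1/events_test - 1/n_test + 1/events_ctrl - 1/n_ctrl)
    upper = np.exp(np.log(rr) + norm.ppf(1 - alpha) * se_log_rr)   # one-sided upper CL
    return rr, upper, upper < margin      # noninferior if the upper CL is below the margin

# Invented counts for illustration only.
print(noninferiority_rr(events_test=264, n_test=2774,
                        events_ctrl=86, n_ctrl=917, margin=1.5))
```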
78 FR 33327 - Announcement of Grant and Loan Application Deadlines and Funding Levels
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-04
... principles and general administrative requirements for grants pertaining to their organizational type in... years--30 points (2) Extent to which the work plan demonstrates a well thought out, comprehensive...
ERIC Educational Resources Information Center
Matthews, E. G.
1975-01-01
Describes an approach to the study of entomology directed at people with no special knowledge of insects. The aim of this approach is to reveal some biological principles by studying insects from an ecological point of view. (GS)
[Four seasons acupuncture and Five Shu Points].
Zhao, Jing-sheng; Shi, Xin-de
2009-10-01
The method of four seasons acupuncture was first recorded in the Internal Classic. It is important for understanding the content of acupoints, especially the Five Shu Points. Its original meaning is that the site of acupuncture differs with each season, so that the qi reached by the needling depth corresponds with the qi of the four seasons. Through interpretation from different perspectives, the depth of qi came to be understood mainly as a layer location and later as specific acupoints, and the principle of using the Five Shu Points gradually took shape. Following this line of reasoning, the content of the Five Shu Points can be correctly understood.