Interface-Resolving Simulation of Collision Efficiency of Cloud Droplets
NASA Astrophysics Data System (ADS)
Wang, Lian-Ping; Peng, Cheng; Rosa, Bogdan; Onishi, Ryo
2017-11-01
Small-scale air turbulence could enhance the geometric collision rate of cloud droplets, while large-scale air turbulence could augment their diffusional growth. Air turbulence could also enhance the collision efficiency of cloud droplets. Accurate simulation of collision efficiency, however, requires capturing the multi-scale droplet-turbulence and droplet-droplet interactions, which has only been partially achieved in the recent past using the hybrid direct numerical simulation (HDNS) approach, in which a Stokes disturbance flow is assumed. The HDNS approach has two major drawbacks: (1) the short-range droplet-droplet interaction is not treated rigorously; (2) the finite-Reynolds-number correction to the collision efficiency is not included. In this talk, using two independent numerical methods, we will develop an interface-resolved simulation approach in which the disturbance flows are directly resolved numerically, combined with a rigorous lubrication correction model for near-field droplet-droplet interaction. This multi-scale approach is first used to study the effect of finite flow Reynolds numbers on the droplet collision efficiency in still air. Our simulation results show a significant finite-Re effect on collision efficiency when the droplets are of similar sizes. Preliminary results on integrating this approach into a turbulent flow laden with droplets will also be presented. This work is partially supported by the National Science Foundation.
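For orientation, collision efficiency for a droplet pair is conventionally defined from the far-field critical impact offset. The abstract does not spell out its notation, so the symbols below are the standard ones from cloud microphysics, supplied here only as a hedged reference:

```latex
% Collision efficiency of a droplet pair with radii a_1 and a_2:
% y_c is the largest far-field horizontal offset of the droplet centers
% that still results in a grazing collision.
E_{12} \;=\; \frac{y_c^{2}}{\left(a_1 + a_2\right)^{2}}
```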
DOE Office of Scientific and Technical Information (OSTI.GOV)
Havu, V.; Fritz Haber Institute of the Max Planck Society, Berlin; Blum, V.
2009-12-01
We consider the problem of developing O(N) scaling grid-based operations needed in many central operations when performing electronic structure calculations with numeric atom-centered orbitals as basis functions. We outline the overall formulation of localized algorithms, and specifically the creation of localized grid batches. The choice of the grid partitioning scheme plays an important role in the performance and memory consumption of the grid-based operations. Three different top-down partitioning methods are investigated and compared with formally more rigorous yet much more expensive bottom-up algorithms. We show that a conceptually simple top-down grid partitioning scheme achieves essentially the same efficiency as the more rigorous bottom-up approaches.
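As an illustration of what a top-down grid partitioning can look like, here is a minimal sketch using recursive coordinate bisection; the function name, batch-size threshold, and bisection rule are illustrative assumptions, not the specific schemes evaluated in the paper.

```python
import numpy as np

def partition_grid(points, max_batch=128):
    """Top-down recursive coordinate bisection of integration-grid points into
    spatially compact batches (illustrative sketch, not the paper's exact scheme)."""
    batches = []

    def split(idx):
        if len(idx) <= max_batch:
            batches.append(idx)
            return
        pts = points[idx]
        # Split along the coordinate with the largest spatial extent, at the
        # median, so that both halves stay spatially compact.
        axis = np.argmax(pts.max(axis=0) - pts.min(axis=0))
        order = np.argsort(pts[:, axis])
        half = len(idx) // 2
        split(idx[order[:half]])
        split(idx[order[half:]])

    split(np.arange(len(points)))
    return batches

# Example: 10,000 random grid points in a 20 Bohr cube
rng = np.random.default_rng(0)
grid = rng.uniform(0.0, 20.0, size=(10_000, 3))
batches = partition_grid(grid)
print(len(batches), "batches, max size", max(len(b) for b in batches))
```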
Symmetry Properties of Potentiometric Titration Curves.
ERIC Educational Resources Information Center
Macca, Carlo; Bombi, G. Giorgio
1983-01-01
Demonstrates how the symmetry properties of titration curves can be efficiently and rigorously treated by means of a simple method, assisted by the use of logarithmic diagrams. Discusses the symmetry properties of several typical titration curves, comparing the graphical approach and an explicit mathematical treatment. (Author/JM)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fox, Zachary; Neuert, Gregor; Department of Pharmacology, School of Medicine, Vanderbilt University, Nashville, Tennessee 37232
2016-08-21
Emerging techniques now allow for precise quantification of distributions of biological molecules in single cells. These rapidly advancing experimental methods have created a need for more rigorous and efficient modeling tools. Here, we derive new bounds on the likelihood that observations of single-cell, single-molecule responses come from a discrete stochastic model, posed in the form of the chemical master equation. These strict upper and lower bounds are based on a finite state projection approach, and they converge monotonically to the exact likelihood value. These bounds allow one to discriminate rigorously between models and with a minimum level of computational effort. In practice, these bounds can be incorporated into stochastic model identification and parameter inference routines, which improve the accuracy and efficiency of endeavors to analyze and predict single-cell behavior. We demonstrate the applicability of our approach using simulated data for three example models as well as for experimental measurements of a time-varying stochastic transcriptional response in yeast.
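To make the finite state projection (FSP) idea concrete, the following minimal sketch evaluates a truncated chemical master equation for a toy birth-death process and reports a lower/upper bracket on the likelihood of a single observation. The model, rate constants, and the crude upper bound are illustrative assumptions, not the bounds derived in the paper.

```python
import numpy as np
from scipy.linalg import expm

# Birth-death process: production rate k, degradation rate g (illustrative values).
k, g, N = 10.0, 1.0, 60          # N = truncation size of the finite state projection
A = np.zeros((N, N))             # generator, columns index the source state
for n in range(N):
    if n + 1 < N:
        A[n + 1, n] += k         # birth: n -> n+1
    A[n, n] -= k                 # outflow (including flow past the truncation)
    if n > 0:
        A[n - 1, n] += g * n     # death: n -> n-1
    A[n, n] -= g * n

p0 = np.zeros(N); p0[0] = 1.0    # start with zero molecules
t, x_obs = 2.0, 12               # observation time and observed copy number

p = expm(A * t) @ p0             # probability mass retained inside the projection
eps = max(0.0, 1.0 - p.sum())    # mass that leaked out of the truncated space
lower = p[x_obs]                 # FSP lower bound on the observation likelihood
upper = p[x_obs] + eps           # crude upper bound: leaked mass could sit on x_obs
print(f"likelihood in [{lower:.6f}, {upper:.6f}] (truncation error {eps:.2e})")
```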
Efficient Integrative Multi-SNP Association Analysis via Deterministic Approximation of Posteriors.
Wen, Xiaoquan; Lee, Yeji; Luca, Francesca; Pique-Regi, Roger
2016-06-02
With the increasing availability of functional genomic data, incorporating genomic annotations into genetic association analysis has become a standard procedure. However, the existing methods often lack rigor and/or computational efficiency and consequently do not maximize the utility of functional annotations. In this paper, we propose a rigorous inference procedure to perform integrative association analysis incorporating genomic annotations for both traditional GWASs and emerging molecular QTL mapping studies. In particular, we propose an algorithm, named deterministic approximation of posteriors (DAP), which enables highly efficient and accurate joint enrichment analysis and identification of multiple causal variants. We use a series of simulation studies to highlight the power and computational efficiency of our proposed approach and further demonstrate it by analyzing the cross-population eQTL data from the GEUVADIS project and the multi-tissue eQTL data from the GTEx project. In particular, we find that genetic variants predicted to disrupt transcription factor binding sites are enriched in cis-eQTLs across all tissues. Moreover, the enrichment estimates obtained across the tissues are correlated with the cell types for which the annotations are derived. Copyright © 2016 American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.
Phillips, Christine B; Dwan, Kathryn; Hepworth, Julie; Pearce, Christopher; Hall, Sally
2014-11-19
The primary health care sector delivers the majority of health care in western countries through small, community-based organizations. However, research into these healthcare organizations is limited by the time constraints and pressure facing them, and the concern by staff that research is peripheral to their work. We developed Q-RARA (Qualitative Rapid Appraisal, Rigorous Analysis) to study small, primary health care organizations in a way that is efficient, acceptable to participants and methodologically rigorous. Q-RARA comprises a site visit, semi-structured interviews, structured and unstructured observations, photographs, floor plans, and social scanning data. Data were collected over the course of one day per site and the qualitative analysis was integrated and iterative. We found Q-RARA to be acceptable to participants and effective in collecting data on organizational function in multiple sites without disrupting the practice, while maintaining a balance between speed and trustworthiness. The Q-RARA approach is capable of providing a richly textured, rigorous understanding of the processes of the primary care practice while also allowing researchers to develop an organizational perspective. For these reasons the approach is recommended for use in small-scale organizations both within and outside the primary health care sector.
Efficient anharmonic vibrational spectroscopy for large molecules using local-mode coordinates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheng, Xiaolu; Steele, Ryan P., E-mail: ryan.steele@utah.edu
This article presents a general computational approach for efficient simulations of anharmonic vibrational spectra in chemical systems. An automated local-mode vibrational approach is presented, which borrows techniques from localized molecular orbitals in electronic structure theory. This approach generates spatially localized vibrational modes, in contrast to the delocalization exhibited by canonical normal modes. The method is rigorously tested across a series of chemical systems, ranging from small molecules to large water clusters and a protonated dipeptide. It is interfaced with exact, grid-based approaches, as well as vibrational self-consistent field methods. Most significantly, this new set of reference coordinates exhibits a well-behaved spatial decay of mode couplings, which allows for a systematic, a priori truncation of mode couplings and increased computational efficiency. Convergence can typically be reached by including modes within only about 4 Å. The local nature of this truncation suggests particular promise for the ab initio simulation of anharmonic vibrational motion in large systems, where connection to experimental spectra is currently most challenging.
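A hedged sketch of the distance-based screening step: given localized-mode centers, keep only pairwise couplings within a cutoff (the roughly 4 Å figure quoted above). The coupling criterion actually used in the paper may differ; this only illustrates the a priori truncation.

```python
import numpy as np

def coupled_pairs(mode_centers, cutoff=4.0):
    """Keep only pairwise mode couplings whose localized-mode centers lie within
    `cutoff` (Angstrom). Illustrative of an a priori distance truncation, not the
    paper's actual coupling metric."""
    centers = np.asarray(mode_centers)
    d = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
    i, j = np.where(np.triu(d <= cutoff, k=1))   # unique pairs i < j within cutoff
    return list(zip(i.tolist(), j.tolist()))

# Example: four hypothetical mode centers (Angstrom)
pairs = coupled_pairs([(0, 0, 0), (1.5, 0, 0), (3.0, 2.0, 0), (8.0, 8.0, 8.0)])
print(pairs)
```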
Spectrum splitting using multi-layer dielectric meta-surfaces for efficient solar energy harvesting
NASA Astrophysics Data System (ADS)
Yao, Yuhan; Liu, He; Wu, Wei
2014-06-01
We designed a high-efficiency dispersive mirror based on multi-layer dielectric meta-surfaces. By replacing the secondary mirror of a dome solar concentrator with this dispersive mirror, the solar concentrator can be converted into a spectrum-splitting photovoltaic system with higher energy harvesting efficiency and potentially lower cost. The meta-surfaces consist of high-index-contrast gratings (HCGs). The structures and parameters of the dispersive mirror (i.e., stacked HCGs) are optimized using finite-difference time-domain and rigorous coupled-wave analysis methods. Our numerical study shows that the dispersive mirror can direct light of different wavelengths into different angles across the entire solar spectrum while maintaining very low energy loss. Our approach will not only improve the energy harvesting efficiency, but also lower the cost by using single-junction cells instead of multi-layer tandem solar cells. Moreover, this approach causes minimal disruption to existing solar concentrator infrastructure.
Advanced EUV mask and imaging modeling
NASA Astrophysics Data System (ADS)
Evanschitzky, Peter; Erdmann, Andreas
2017-10-01
The exploration and optimization of image formation in partially coherent EUV projection systems with complex source shapes requires flexible, accurate, and efficient simulation models. This paper reviews advanced mask diffraction and imaging models for the highly accurate and fast simulation of EUV lithography systems, addressing important aspects of the current technical developments. The simulation of light diffraction from the mask employs an extended rigorous coupled wave analysis (RCWA) approach, which is optimized for EUV applications. In order to be able to deal with current EUV simulation requirements, several additional models are included in the extended RCWA approach: a field decomposition and a field stitching technique enable the simulation of larger complex structured mask areas. An EUV multilayer defect model including a database approach makes the fast and fully rigorous defect simulation and defect repair simulation possible. A hybrid mask simulation approach combining real and ideal mask parts allows the detailed investigation of the origin of different mask 3-D effects. The image computation is done with a fully vectorial Abbe-based approach. Arbitrary illumination and polarization schemes and adapted rigorous mask simulations guarantee a high accuracy. A fully vectorial sampling-free description of the pupil with Zernikes and Jones pupils and an optimized representation of the diffraction spectrum enable the computation of high-resolution images with high accuracy and short simulation times. A new pellicle model supports the simulation of arbitrary membrane stacks, pellicle distortions, and particles/defects on top of the pellicle. Finally, an extension for highly accurate anamorphic imaging simulations is included. The application of the models is demonstrated by typical use cases.
Using GIS to generate spatially balanced random survey designs for natural resource applications.
Theobald, David M; Stevens, Don L; White, Denis; Urquhart, N Scott; Olsen, Anthony R; Norman, John B
2007-07-01
Sampling of a population is frequently required to understand trends and patterns in natural resource management because financial and time constraints preclude a complete census. A rigorous probability-based survey design specifies where to sample so that inferences from the sample apply to the entire population. Probability survey designs should be used in natural resource and environmental management situations because they provide the mathematical foundation for statistical inference. Development of long-term monitoring designs demands survey designs that achieve statistical rigor and are efficient but remain flexible to inevitable logistical or practical constraints during field data collection. Here we describe an approach to probability-based survey design, called the Reversed Randomized Quadrant-Recursive Raster, based on the concept of spatially balanced sampling and implemented in a geographic information system. This provides environmental managers with a practical tool to generate flexible and efficient survey designs for natural resource applications. Factors commonly used to modify sampling intensity, such as categories, gradients, or accessibility, can be readily incorporated into the spatially balanced sample design.
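The sketch below illustrates the spirit of quadrant-recursive, spatially balanced sampling: points receive hierarchical quadrant addresses with a random relabelling of sub-quadrants at each level, are ordered by that address, and are then sampled systematically along the ordering. It is a simplified stand-in for the Reversed Randomized Quadrant-Recursive Raster, with illustrative function names and parameters.

```python
import random

def make_addresser(levels=8, seed=1):
    """Map a point in the unit square to a hierarchical quadrant address.
    The four sub-quadrants of every parent cell are relabelled by a random
    permutation drawn once per parent, the key ingredient of randomized
    quadrant-recursive (GRTS-style) spatially balanced designs.
    Simplified stand-in for RRQRR; parameters are illustrative."""
    rng = random.Random(seed)
    perms = {}                                    # (level, parent prefix) -> permutation

    def address(point):
        x, y = point
        digits = []
        for level in range(levels):
            qx, qy = int(x >= 0.5), int(y >= 0.5)
            quadrant = 2 * qy + qx
            key = (level, tuple(digits))
            if key not in perms:
                perms[key] = rng.sample(range(4), 4)
            digits.append(perms[key][quadrant])
            x, y = 2 * x - qx, 2 * y - qy         # recurse into the chosen quadrant
        return digits

    return address

def spatially_balanced_sample(points, n, seed=1):
    """Order points by randomized address, then take an evenly spaced
    (systematic) sample along that order."""
    address = make_addresser(seed=seed)
    ordered = sorted(points, key=address)
    rng = random.Random(seed + 1)
    step = len(ordered) / n
    start = rng.random() * step
    return [ordered[int(start + i * step)] for i in range(n)]

# Example: 20 spatially balanced sites from 1,000 candidate locations
rng = random.Random(0)
candidates = [(rng.random(), rng.random()) for _ in range(1000)]
sites = spatially_balanced_sample(candidates, 20)
```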
Accuracy and performance of 3D mask models in optical projection lithography
NASA Astrophysics Data System (ADS)
Agudelo, Viviana; Evanschitzky, Peter; Erdmann, Andreas; Fühner, Tim; Shao, Feng; Limmer, Steffen; Fey, Dietmar
2011-04-01
Different mask models have been compared: rigorous electromagnetic field (EMF) modeling, rigorous EMF modeling with decomposition techniques and the thin mask approach (Kirchhoff approach) to simulate optical diffraction from different mask patterns in projection systems for lithography. In addition, each rigorous model was tested for two different formulations for partially coherent imaging: The Hopkins assumption and rigorous simulation of mask diffraction orders for multiple illumination angles. The aim of this work is to closely approximate results of the rigorous EMF method by the thin mask model enhanced with pupil filtering techniques. The validity of this approach for different feature sizes, shapes and illumination conditions is investigated.
NASA Astrophysics Data System (ADS)
Bahl, Mayank; Zhou, Gui-Rong; Heller, Evan; Cassarly, William; Jiang, Mingming; Scarmozzino, Rob; Gregory, G. Groot
2014-09-01
Over the last two decades there has been extensive research done to improve the design of Organic Light Emitting Diodes (OLEDs) so as to enhance light extraction efficiency, improve beam shaping, and allow color tuning through techniques such as the use of patterned substrates, photonic crystal (PCs) gratings, back reflectors, surface texture, and phosphor down-conversion. Computational simulation has been an important tool for examining these increasingly complex designs. It has provided insights for improving OLED performance as a result of its ability to explore limitations, predict solutions, and demonstrate theoretical results. Depending upon the focus of the design and scale of the problem, simulations are carried out using rigorous electromagnetic (EM) wave optics based techniques, such as finite-difference time-domain (FDTD) and rigorous coupled wave analysis (RCWA), or through ray optics based technique such as Monte Carlo ray-tracing. The former are typically used for modeling nanostructures on the OLED die, and the latter for modeling encapsulating structures, die placement, back-reflection, and phosphor down-conversion. This paper presents the use of a mixed-level simulation approach which unifies the use of EM wave-level and ray-level tools. This approach uses rigorous EM wave based tools to characterize the nanostructured die and generate both a Bidirectional Scattering Distribution function (BSDF) and a far-field angular intensity distribution. These characteristics are then incorporated into the ray-tracing simulator to obtain the overall performance. Such mixed-level approach allows for comprehensive modeling of the optical characteristic of OLEDs and can potentially lead to more accurate performance than that from individual modeling tools alone.
On the Modeling of Shells in Multibody Dynamics
NASA Technical Reports Server (NTRS)
Bauchau, Olivier A.; Choi, Jou-Young; Bottasso, Carlo L.
2000-01-01
Energy preserving/decaying schemes are presented for the simulation of nonlinear multibody systems involving shell components. The proposed schemes are designed to meet four specific requirements: unconditional nonlinear stability of the scheme, a rigorous treatment of both geometric and material nonlinearities, exact satisfaction of the constraints, and the presence of high-frequency numerical dissipation. The kinematic nonlinearities associated with arbitrarily large displacements and rotations of shells are treated in a rigorous manner, and the material nonlinearities can be handled when the constitutive laws stem from the existence of a strain energy density function. The efficiency and robustness of the proposed approach are illustrated with specific numerical examples that also demonstrate the need for integration schemes possessing high-frequency numerical dissipation.
Scaling up to address data science challenges
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wendelberger, Joanne R.
2017-04-27
Statistics and Data Science provide a variety of perspectives and technical approaches for exploring and understanding Big Data. Partnerships between scientists from different fields such as statistics, machine learning, computer science, and applied mathematics can lead to innovative approaches for addressing problems involving increasingly large amounts of data in a rigorous and effective manner that takes advantage of advances in computing. Here, this article will explore various challenges in Data Science and will highlight statistical approaches that can facilitate analysis of large-scale data, including sampling and data reduction methods, techniques for effective analysis and visualization of large-scale simulations, and algorithms and procedures for efficient processing.
Efficiency versus speed in quantum heat engines: Rigorous constraint from Lieb-Robinson bound
NASA Astrophysics Data System (ADS)
Shiraishi, Naoto; Tajima, Hiroyasu
2017-08-01
The long-standing open problem of whether a heat engine with finite power can achieve the Carnot efficiency is investigated. We rigorously prove a general trade-off inequality between thermodynamic efficiency and the time interval of a cyclic process for quantum heat engines. As a first step, employing the Lieb-Robinson bound, we establish an inequality on the change in a local observable caused by an operation far from the support of that observable. This inequality provides a rigorous characterization of the intuitive picture that most of the energy emitted from the engine to the cold bath remains near the engine when the cyclic process is finished. Using this characterization, we prove an upper bound on efficiency with the aid of quantum information geometry. Our result generally excludes the possibility of a process with finite speed at the Carnot efficiency in quantum heat engines. In particular, the obtained constraint covers engines evolving with non-Markovian dynamics, which almost all previous studies on this topic fail to address.
NASA Astrophysics Data System (ADS)
Dimitrakopoulos, Panagiotis
2018-03-01
The calculation of polytropic efficiencies is a very important task, especially during the development of new compression units, like compressor impellers, stages and stage groups. Such calculations are also crucial for the determination of the performance of a whole compressor. As processors and computational capacities have been substantially improved in recent years, the need has emerged for a new, rigorous, robust, accurate, and at the same time standardized method for computing polytropic efficiencies, especially one based on the thermodynamics of real gases. The proposed method is based on the rigorous definition of the polytropic efficiency. The input consists of pressure and temperature values at the end points of the compression path (suction and discharge), for a given working fluid. The average relative error for the studied cases was 0.536 %. Thus, this high-accuracy method is proposed for efficiency calculations related to turbocompressors and their compression units, especially when they are operating at high power levels, for example in jet engines and high-power plants.
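For orientation only, the ideal-gas form of the polytropic efficiency computed from the same endpoint data (suction and discharge pressure and temperature) is sketched below. The paper's method is specifically built on real-gas thermodynamics, so this is a simplified point of comparison, not the proposed procedure; the numerical values are placeholders.

```python
import math

def polytropic_efficiency_ideal_gas(p1, T1, p2, T2, gamma=1.4):
    """Ideal-gas polytropic (small-stage) efficiency of a compression from
    suction (p1, T1) to discharge (p2, T2). Textbook ideal-gas simplification,
    not the real-gas procedure of the paper."""
    return ((gamma - 1.0) / gamma) * math.log(p2 / p1) / math.log(T2 / T1)

# Example: air compressed from 1 bar, 293 K to 4 bar, 460 K
eta_p = polytropic_efficiency_ideal_gas(1e5, 293.0, 4e5, 460.0)
print(f"polytropic efficiency ~ {eta_p:.3f}")
```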
Lee, Kyu-Tae; Jang, Ji-Yun; Park, Sang Jin; Ok, Song Ah; Park, Hui Joon
2017-09-28
See-through perovskite solar cells with high efficiency and iridescent colors are demonstrated by employing a multilayer dielectric mirror. A certain amount of visible light is used for wide color gamut semitransparent color generation, which can be easily tuned by changing an angle of incidence, and a wide range of visible light is efficiently reflected back toward a photoactive layer of the perovskite solar cells by the dielectric mirror for highly efficient light-harvesting performance, thus achieving 10.12% power conversion efficiency. We also rigorously examine how the number of pairs in the multilayer dielectric mirror affects optical properties of the colored semitransparent perovskite solar cells. The described approach can open the door to a large number of applications such as building-integrated photovoltaics, self-powered wearable electronics and power-generating color filters for energy-efficient display systems.
Ebalunode, Jerry O; Zheng, Weifan; Tropsha, Alexander
2011-01-01
Optimization of chemical library composition affords more efficient identification of hits from biological screening experiments. The optimization could be achieved through rational selection of reagents used in combinatorial library synthesis. However, with a rapid advent of parallel synthesis methods and availability of millions of compounds synthesized by many vendors, it may be more efficient to design targeted libraries by means of virtual screening of commercial compound collections. This chapter reviews the application of advanced cheminformatics approaches such as quantitative structure-activity relationships (QSAR) and pharmacophore modeling (both ligand and structure based) for virtual screening. Both approaches rely on empirical SAR data to build models; thus, the emphasis is placed on achieving models of the highest rigor and external predictive power. We present several examples of successful applications of both approaches for virtual screening to illustrate their utility. We suggest that the expert use of both QSAR and pharmacophore models, either independently or in combination, enables users to achieve targeted libraries enriched with experimentally confirmed hit compounds.
A generative, probabilistic model of local protein structure.
Boomsma, Wouter; Mardia, Kanti V; Taylor, Charles C; Ferkinghoff-Borg, Jesper; Krogh, Anders; Hamelryck, Thomas
2008-07-01
Despite significant progress in recent years, protein structure prediction maintains its status as one of the prime unsolved problems in computational biology. One of the key remaining challenges is an efficient probabilistic exploration of the structural space that correctly reflects the relative conformational stabilities. Here, we present a fully probabilistic, continuous model of local protein structure in atomic detail. The generative model makes efficient conformational sampling possible and provides a framework for the rigorous analysis of local sequence-structure correlations in the native state. Our method represents a significant theoretical and practical improvement over the widely used fragment assembly technique by avoiding the drawbacks associated with a discrete and nonprobabilistic approach.
Adjoint-Based Algorithms for Adaptation and Design Optimizations on Unstructured Grids
NASA Technical Reports Server (NTRS)
Nielsen, Eric J.
2006-01-01
Schemes based on discrete adjoint algorithms present several exciting opportunities for significantly advancing the current state of the art in computational fluid dynamics. Such methods provide an extremely efficient means for obtaining discretely consistent sensitivity information for hundreds of design variables, opening the door to rigorous, automated design optimization of complex aerospace configurations using the Navier-Stokes equations. Moreover, the discrete adjoint formulation provides a mathematically rigorous foundation for mesh adaptation and systematic reduction of spatial discretization error. Error estimates are also an inherent by-product of an adjoint-based approach, valuable information that is virtually non-existent in today's large-scale CFD simulations. An overview of the adjoint-based algorithm work at NASA Langley Research Center is presented, with examples demonstrating the potential impact on complex computational problems related to design optimization as well as mesh adaptation.
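The discrete adjoint construction referred to above is commonly written as follows; this is the standard textbook notation, supplied for reference rather than quoted from this overview:

```latex
% Discrete adjoint sensitivity: R(Q,D)=0 is the discretized flow residual,
% f(Q,D) the objective, D the design variables, Q the flow solution, psi the adjoint.
\left(\frac{\partial R}{\partial Q}\right)^{\!T}\!\psi
   = -\left(\frac{\partial f}{\partial Q}\right)^{\!T},
\qquad
\frac{df}{dD} = \frac{\partial f}{\partial D}
   + \psi^{T}\,\frac{\partial R}{\partial D}.
```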
When is good, good enough? Methodological pragmatism for sustainable guideline development.
Browman, George P; Somerfield, Mark R; Lyman, Gary H; Brouwers, Melissa C
2015-03-06
Continuous escalation in methodological and procedural rigor for evidence-based processes in guideline development is associated with increasing costs and production delays that threaten sustainability. While health research methodologists are appropriately responsible for promoting increasing rigor in guideline development, guideline sponsors are responsible for funding such processes. This paper acknowledges that other stakeholders in addition to methodologists should be more involved in negotiating trade-offs between methodological procedures and efficiency in guideline production to produce guidelines that are 'good enough' to be trustworthy and affordable under specific circumstances. The argument for reasonable methodological compromise to meet practical circumstances is consistent with current implicit methodological practice. This paper proposes a conceptual tool as a framework to be used by different stakeholders in negotiating, and explicitly reporting, reasonable compromises for trustworthy as well as cost-worthy guidelines. The framework helps fill a transparency gap in how methodological choices in guideline development are made. The principle, 'when good is good enough' can serve as a basis for this approach. The conceptual tool 'Efficiency-Validity Methodological Continuum' acknowledges trade-offs between validity and efficiency in evidence-based guideline development and allows for negotiation, guided by methodologists, of reasonable methodological compromises among stakeholders. Collaboration among guideline stakeholders in the development process is necessary if evidence-based guideline development is to be sustainable.
Meeting the information system demands of the future through outsourcing.
Goldman, S J
1994-05-01
As managed care organizations work to meet the rigorous data and information requirements of a rapidly evolving health care system, many are recognizing the need to outsource their computer operations. Developing a cost-effective, efficient approach to outsourcing is a challenge to many organizations. This article offers an in-depth view of outsourcing as it relates to the managed health care industry as well as criteria for selecting an outsourcing consultant or vendor.
Molecular approaches to third generation photovoltaics: photochemical up-conversion
NASA Astrophysics Data System (ADS)
Cheng, Yuen Yap; Fückel, Burkhard; Roberts, Derrick A.; Khoury, Tony; Clady, Raphaël G. C. R.; Tayebjee, Murad J. Y.; Piper, Roland; Ekins-Daukes, N. J.; Crossley, Maxwell J.; Schmidt, Timothy W.
2010-08-01
We have investigated a photochemical up-conversion system comprising a molecular mixture of a palladium porphyrin to harvest light, and a polycyclic aromatic hydrocarbon to emit light. The energy of harvested photons is stored as molecular triplet states which then annihilate to bring about up-converted fluorescence. The limiting efficiency of such triplet-triplet annihilation up-conversion has been believed to be 11% for some time. However, by rigorously investigating the kinetics of delayed fluorescence following pulsed excitation, we demonstrate instantaneous annihilation efficiencies exceeding 40%, and limiting efficiencies for the current system of ~60%. We attribute the high efficiencies obtained to the electronic structure of the emitting molecule, which exhibits an exceptionally high T2 molecular state. We utilize the kinetic data obtained to model an up-converting layer irradiated with broadband sunlight, finding that ~3% efficiencies can be obtained with the current system, with this improving dramatically upon optimization of various parameters.
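The delayed-fluorescence kinetics mentioned above are commonly modelled with a triplet rate equation of the following standard form; the symbols are assumed for illustration and are not quoted from the paper:

```latex
% [T]: emitter triplet concentration; k_1: first-order (unimolecular) decay;
% k_2: second-order triplet-triplet annihilation rate constant.
% The second expression is the instantaneous fraction of triplets consumed by
% annihilation, i.e. an instantaneous annihilation efficiency.
\frac{d[T]}{dt} = -k_1\,[T] - k_2\,[T]^2,
\qquad
\Phi_{\mathrm{TTA}}(t) \;\propto\; \frac{k_2\,[T](t)}{k_1 + k_2\,[T](t)}.
```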
Second-harmonic generation from a positive-negative index material heterostructure.
Mattiucci, Nadia; D'Aguanno, Giuseppe; Bloemer, Mark J; Scalora, Michael
2005-12-01
Resonant cavities have been widely used in the past to enhance a material's nonlinear response. Traditional mirrors include metallic films and distributed Bragg reflectors. In this paper we propose negative index material mirrors as a third alternative. With the help of a rigorous Green function approach, we investigate second harmonic generation from single and coupled cavities, and theoretically prove that negative index material mirrors can raise the nonlinear conversion efficiency by at least four orders of magnitude compared to a bulk medium.
Efficient shortcut techniques in evanescently coupled waveguides
NASA Astrophysics Data System (ADS)
Paul, Koushik; Sarma, Amarendra K.
2016-10-01
The Shortcut to Adiabatic Passage (SHAPE) technique, in the context of coherent control of atomic systems, has gained considerable attention in the last few years, primarily because of its ability to manipulate population among quantum states arbitrarily fast compared to adiabatic processes. Two methods in this regard have been explored rigorously, namely transitionless quantum driving and the Lewis-Riesenfeld invariant approach. We have applied these two methods to realize SHAPE in an adiabatic waveguide coupler. Waveguide couplers are integral components of photonic circuits, primarily used as switching devices. Our study shows that with appropriate engineering of the coupling coefficient and propagation constants of the coupler it is possible to achieve efficient and complete power switching. We also observe that the coupler length can be reduced significantly without affecting the coupling efficiency of the system.
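A minimal numerical sketch of the underlying coupled-mode picture: two evanescently coupled waveguides with a z-dependent coupling and detuning, integrated along the propagation direction. The Gaussian coupling and linear chirp below are illustrative choices, not the SHAPE-engineered profiles reported in the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Standard two-waveguide coupled-mode equations:
#   dA1/dz = -i*delta(z)*A1 - i*kappa(z)*A2
#   dA2/dz = +i*delta(z)*A2 - i*kappa(z)*A1
# kappa: coupling coefficient, 2*delta: propagation-constant mismatch.
L = 10.0                                              # coupler length (arbitrary units)
kappa = lambda z: 2.0 * np.exp(-((z - L / 2) / 2.0) ** 2)   # illustrative profile
delta = lambda z: 0.6 * (z - L / 2)                          # illustrative chirp

def rhs(z, A):
    A1, A2 = A
    return [-1j * delta(z) * A1 - 1j * kappa(z) * A2,
            +1j * delta(z) * A2 - 1j * kappa(z) * A1]

sol = solve_ivp(rhs, (0.0, L), [1.0 + 0j, 0.0 + 0j], rtol=1e-8, atol=1e-10)
P1, P2 = np.abs(sol.y[:, -1]) ** 2
print(f"power remaining in guide 1: {P1:.3f}, transferred to guide 2: {P2:.3f}")
```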
Rigorous force field optimization principles based on statistical distance minimization
Vlcek, Lukas; Chialvo, Ariel A.
2015-10-12
We use the concept of statistical distance to define a measure of distinguishability between a pair of statistical mechanical systems, i.e., a model and its target, and show that its minimization leads to general convergence of the model's static measurable properties to those of the target. Here we exploit this feature to define a rigorous basis for the development of accurate and robust effective molecular force fields that are inherently compatible with coarse-grained experimental data. The new model optimization principles and their efficient implementation are illustrated through selected examples, whose outcome demonstrates the higher robustness and predictive accuracy of the approach compared to other currently used methods, such as force matching and relative entropy minimization. We also discuss relations between the newly developed principles and established thermodynamic concepts, which include the Gibbs-Bogoliubov inequality and the thermodynamic length.
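For reference, one standard definition of the statistical distance between two discrete distributions (the Wootters form, based on the Bhattacharyya coefficient) is given below; whether this is precisely the measure used in the paper is not asserted here.

```latex
% Statistical distance between discrete probability distributions p and q:
s(p, q) \;=\; \arccos\!\left( \sum_{i} \sqrt{p_i \, q_i} \right)
```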
Systems approach used in the Gas Centrifuge Enrichment Plant
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rooks, W.A. Jr.
A requirement exists for effective and efficient transfer of technical knowledge from the design engineering team to the production work force. Performance-Based Training (PBT) is a systematic approach to the design, development, and implementation of technical training. This approach has been successfully used by the US Armed Forces, industry, and other organizations. The advantages of the PBT approach are: cost-effectiveness (lowest life-cycle training cost), learning effectiveness, reduced implementation time, and ease of administration. The PBT process comprises five distinctive and rigorous phases: Analysis of Job Performance, Design of Instructional Strategy, Development of Training Materials and Instructional Media, Validation of Materials and Media, and Implementation of the Instructional Program. Examples from the Gas Centrifuge Enrichment Plant (GCEP) are used to illustrate the application of PBT.
Solar energy enhancement using down-converting particles: A rigorous approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abrams, Ze’ev R.; Niv, Avi; Zhang, Xiang
2011-06-01
The efficiency of a single band-gap solar cell is specified by the Shockley-Queisser limit, which defines the maximal output power as a function of the solar cell's band-gap. One way to overcome this limit is by using a down-conversion process whereupon a high energy photon is split into two lower energy photons, thereby increasing the current of the cell. Here, we provide a full analysis of the possible efficiency increase when placing a down-converting material on top of a pre-existing solar cell. We show that a total 7% efficiency improvement is possible for a perfectly efficient down-converting material. Our analysis covers both lossless and lossy theoretical limits, as well as a thermodynamic evaluation. Finally, we describe the advantages of nanoparticles as a possible choice for a down-converting material.
Benassi, Enrico
2017-01-15
A number of programs and tools that simulate ¹H and ¹³C nuclear magnetic resonance (NMR) chemical shifts using empirical approaches are available. These tools are user-friendly, but they provide a very rough (and sometimes misleading) estimation of the NMR properties, especially for complex systems. Rigorous and reliable ways to predict and interpret NMR properties of simple and complex systems are available in many popular computational program packages. Nevertheless, experimentalists keep relying on these "unreliable" tools in their daily work because, to reach a sufficiently high accuracy, these rigorous quantum mechanical methods need high levels of theory. An alternative, efficient, semi-empirical approach has been proposed by Bally, Rablen, Tantillo, and coworkers. This idea consists of creating linear calibration models based on the application of different combinations of functionals and basis sets. Following this approach, the predictive capability of a wider range of popular functionals was systematically investigated and tested. The NMR chemical shifts were computed in the solvated phase at the density functional theory level, using 30 different functionals coupled with three different triple-ζ basis sets. © 2016 Wiley Periodicals, Inc.
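A minimal sketch of the linear-calibration idea described above: computed isotropic shieldings are regressed against experimental shifts, and the fitted line is then used to predict shifts for new compounds. The numerical values are placeholders, not data from the study.

```python
import numpy as np

# Isotropic shieldings sigma (ppm) computed for a set of calibration compounds at
# one functional/basis-set combination, paired with experimental shifts delta_exp
# (ppm). The numbers below are placeholders, not data from the paper.
sigma_calc = np.array([181.5, 175.2, 120.3, 60.7, 32.1, 25.4])
delta_exp  = np.array([  2.1,   8.5,  65.0, 128.0, 155.0, 162.0])

# Linear calibration: delta_exp ~ slope * sigma_calc + intercept
slope, intercept = np.polyfit(sigma_calc, delta_exp, 1)

def predict_shift(sigma):
    """Empirically scaled chemical shift from a computed shielding."""
    return slope * sigma + intercept

print(f"slope = {slope:.4f}, intercept = {intercept:.2f}")
print(f"predicted shift for sigma = 100 ppm: {predict_shift(100.0):.1f} ppm")
```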
Thiel, Kati; Mulaku, Edita; Dandapani, Hariharan; Nagy, Csaba; Aro, Eva-Mari; Kallio, Pauli
2018-03-02
Photosynthetic cyanobacteria have been studied as potential host organisms for direct solar-driven production of different carbon-based chemicals from CO2 and water, as part of the development of sustainable future biotechnological applications. The engineering approaches, however, are still limited by the lack of comprehensive information on optimal expression strategies and validated species-specific genetic elements, which are essential for increasing the intricacy, predictability and efficiency of the systems. This study focused on the systematic evaluation of the key translational control elements, ribosome binding sites (RBS), in the cyanobacterial host Synechocystis sp. PCC 6803, with the objective of expanding the palette of tools for more rigorous engineering approaches. An expression system was established for the comparison of 13 selected RBS sequences in Synechocystis, using several alternative reporter proteins (sYFP2, codon-optimized GFPmut3 and ethylene forming enzyme) as quantitative indicators of the relative translation efficiencies. The set-up was shown to yield highly reproducible expression patterns in independent analytical series with low variation between biological replicates, thus allowing statistical comparison of the activities of the different RBSs in vivo. While the RBSs covered a relatively broad overall expression level range, the downstream gene sequence was demonstrated in a rigorous manner to have a clear impact on the resulting translational profiles. This was expected to reflect interfering sequence-specific mRNA-level interaction between the RBS and the coding region, yet correlation between potential secondary structure formation and observed translation levels could not be resolved with existing in silico prediction tools. The study expands our current understanding of the potential and limitations associated with the regulation of protein expression at the translational level in engineered cyanobacteria. The acquired information can be used for selecting appropriate RBSs for optimizing over-expression constructs or multicistronic pathways in Synechocystis, while underlining the complications in predicting the activity due to gene-specific interactions which may reduce the translational efficiency for a given RBS-gene combination. Ultimately, the findings emphasize the need for additional characterized insulator sequence elements to decouple the interaction between the RBS and the coding region for future engineering approaches.
Jonas, Wayne B; Crawford, Cindy; Hilton, Lara; Elfenbaum, Pamela
2017-01-01
Answering the question of "what works" in healthcare can be complex and requires the careful design and sequential application of systematic methodologies. Over the last decade, the Samueli Institute has, along with multiple partners, developed a streamlined, systematic, phased approach to this process called the Scientific Evaluation and Review of Claims in Health Care (SEaRCH™). The SEaRCH process provides an approach for rigorously, efficiently, and transparently making evidence-based decisions about healthcare claims in research and practice with minimal bias. SEaRCH uses three methods combined in a coordinated fashion to help determine what works in healthcare. The first, the Claims Assessment Profile (CAP), seeks to clarify the healthcare claim and question, and its ability to be evaluated in the context of its delivery. The second method, the Rapid Evidence Assessment of the Literature (REAL©), is a streamlined, systematic review process conducted to determine the quantity, quality, and strength of evidence and risk/benefit for the treatment. The third method involves the structured use of expert panels (EPs). There are several types of EPs, depending on the purpose and need. Together, these three methods (CAP, REAL, and EP) can be integrated into a strategic approach to help answer the question "what works in healthcare?" and what it means in a comprehensive way. SEaRCH is a systematic, rigorous approach for evaluating healthcare claims of therapies, practices, programs, or products in an efficient and stepwise fashion. It provides an iterative, protocol-driven process that is customized to the intervention, consumer, and context. Multiple communities, including those involved in health service and policy, can benefit from this organized framework, assuring that evidence-based principles determine which healthcare practices with the greatest promise are used for improving the public's health and wellness.
Evaluating Rigor in Qualitative Methodology and Research Dissemination
ERIC Educational Resources Information Center
Trainor, Audrey A.; Graue, Elizabeth
2014-01-01
Despite previous and successful attempts to outline general criteria for rigor, researchers in special education have debated the application of rigor criteria, the significance or importance of small n research, the purpose of interpretivist approaches, and the generalizability of qualitative empirical results. Adding to these complications, the…
Zhang, Yiming; Jin, Quan; Wang, Shuting; Ren, Ren
2011-05-01
The mobile behavior of 1481 peptides in ion mobility spectrometry (IMS), generated by protease digestion of the Drosophila melanogaster proteome, is modeled and predicted based on two different types of characterization methods, i.e., a sequence-based approach and a structure-based approach. In this procedure, the sequence-based approach considers both the amino acid composition of a peptide and the local environment profile of each amino acid in the peptide; the structure-based approach is performed with the CODESSA protocol, which regards a peptide as a common organic compound and generates more than 200 statistically significant variables to characterize the whole structure profile of a peptide molecule. Subsequently, nonlinear support vector machine (SVM) and Gaussian process (GP) regression as well as linear partial least squares (PLS) regression are employed to correlate the structural parameters of the characterizations with the IMS drift times of these peptides. The obtained quantitative structure-spectrum relationship (QSSR) models are evaluated rigorously and investigated systematically via both one-deep and two-deep cross-validations as well as rigorous Monte Carlo cross-validation (MCCV). We also give a comprehensive comparison of the resulting statistics arising from the different combinations of variable types with modeling methods and find that the sequence-based approach gives QSSR models with better fitting ability and predictive power but worse interpretability than the structure-based approach. In addition, because the sequence-based approach does not require preparing energy-minimized peptide structures before modeling, it is considerably more efficient than the structure-based approach. Copyright © 2011 Elsevier Ltd. All rights reserved.
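A hedged sketch of the modeling-and-validation loop described above, using scikit-learn: a support vector regressor is scored by Monte Carlo cross-validation (repeated random train/test splits). Descriptors, targets, and hyperparameters are placeholders, not the study's data.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import ShuffleSplit, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# X: per-peptide descriptors (e.g., amino-acid composition features),
# y: measured IMS drift times. Random placeholders stand in for the real data.
rng = np.random.default_rng(0)
X = rng.normal(size=(1481, 40))
y = X[:, :5].sum(axis=1) + 0.1 * rng.normal(size=1481)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))

# Monte Carlo cross-validation: repeated random 80/20 splits.
mccv = ShuffleSplit(n_splits=20, test_size=0.2, random_state=1)
r2_scores = cross_val_score(model, X, y, cv=mccv, scoring="r2")
print(f"MCCV mean R^2 = {r2_scores.mean():.3f} +/- {r2_scores.std():.3f}")
```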
Liu, Jianfeng; Laird, Carl Damon
2017-09-22
Optimal design of a gas detection system is challenging because of the numerous sources of uncertainty, including weather and environmental conditions, leak location and characteristics, and process conditions. Rigorous CFD simulations of dispersion scenarios combined with stochastic programming techniques have been successfully applied to the problem of optimal gas detector placement; however, rigorous treatment of sensor failure and nonuniform unavailability has received less attention. To improve reliability of the design, this paper proposes a problem formulation that explicitly considers nonuniform unavailabilities and all backup detection levels. The resulting sensor placement problem is a large-scale mixed-integer nonlinear programming (MINLP) problem that requires a tailored solution approach for efficient solution. We have developed a multitree method which depends on iteratively solving a sequence of upper-bounding master problems and lower-bounding subproblems. The tailored global solution strategy is tested on a real data problem and the encouraging numerical results indicate that our solution framework is promising in solving sensor placement problems. This study was selected for the special issue in JLPPI from the 2016 International Symposium of the MKO Process Safety Center.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-02
... two phases: (1) Development and (2) research on effectiveness. Abstracts of projects funded under... approaches. Phase 2 projects must subject technology-based approaches to rigorous field-based research to... disabilities; (2) Present a justification, based on scientifically rigorous research or theory, that supports...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-07
... two phases: (1) Development and (2) research on effectiveness. Abstracts of projects funded under... approaches. Phase 2 projects must subject technology-based approaches to rigorous field-based research to... scientifically rigorous research or theory, that demonstrates the potential effectiveness of the technology-based...
Multi-template polymerase chain reaction.
Kalle, Elena; Kubista, Mikael; Rensing, Christopher
2014-12-01
PCR is a formidable and potent technology that serves as an indispensable tool in a wide range of biological disciplines. However, due to the ease of use and often lack of rigorous standards, many PCR applications can lead to highly variable, inaccurate, and ultimately meaningless results. Thus, rigorous method validation must precede its broad adoption to any new application. Multi-template samples possess particular features which make their PCR analysis prone to artifacts and biases: multiple homologous templates present in copy numbers that vary within several orders of magnitude. Such conditions are a breeding ground for chimeras and heteroduplexes. Differences in template amplification efficiencies and template competition for reaction compounds undermine correct preservation of the original template ratio. In addition, the presence of inhibitors aggravates all of the above-mentioned problems. Inhibitors might also have ambivalent effects on the different templates within the same sample. Yet, no standard approaches exist for monitoring inhibitory effects in multi-template PCR, which is crucial for establishing compatibility between samples.
SCOTCH: Secure Counting Of encrypTed genomiC data using a Hybrid approach.
Chenghong, Wang; Jiang, Yichen; Mohammed, Noman; Chen, Feng; Jiang, Xiaoqian; Al Aziz, Md Momin; Sadat, Md Nazmus; Wang, Shuang
2017-01-01
As genomic data are usually at large scale and highly sensitive, it is essential to enable both efficient and secure analysis, by which the data owner can securely delegate both computation and storage on untrusted public cloud. Counting query of genotypes is a basic function for many downstream applications in biomedical research (e.g., computing allele frequency, calculating chi-squared statistics, etc.). Previous solutions show promise on secure counting of outsourced data but the efficiency is still a big limitation for real world applications. In this paper, we propose a novel hybrid solution to combine a rigorous theoretical model (homomorphic encryption) and the latest hardware-based infrastructure (i.e., Software Guard Extensions) to speed up the computation while preserving the privacy of both data owners and data users. Our results demonstrated efficiency by using the real data from the personal genome project.
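A toy illustration of the homomorphic-encryption half of such a hybrid design, assuming the third-party python-paillier (phe) package; the Software Guard Extensions component and all genomics-specific handling are omitted.

```python
# Toy sketch of additively homomorphic counting, assuming the `phe`
# (python-paillier) package is installed; not the paper's full hybrid protocol.
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

# Each record contributes an encrypted 0/1 indicator for "carries genotype G".
indicators = [1, 0, 1, 1, 0, 1]                       # plaintext known only to the owner
encrypted = [public_key.encrypt(b) for b in indicators]

# The untrusted server sums ciphertexts without ever seeing the plaintext bits.
encrypted_count = encrypted[0]
for c in encrypted[1:]:
    encrypted_count = encrypted_count + c

# Only the key holder can decrypt the aggregate count.
print("genotype count:", private_key.decrypt(encrypted_count))
```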
Limit analysis of hollow spheres or spheroids with Hill orthotropic matrix
NASA Astrophysics Data System (ADS)
Pastor, Franck; Pastor, Joseph; Kondo, Djimedo
2012-03-01
Recent theoretical studies in the literature have been concerned with the hollow sphere or spheroid (confocal) problems with an orthotropic Hill-type matrix. They have been developed in the framework of the limit analysis kinematical approach by using very simple trial velocity fields. The present Note provides, through numerical upper and lower bounds, a rigorous assessment of the approximate criteria derived in these theoretical works. To this end, existing static 3D codes for a von Mises matrix have been easily extended to the orthotropic case. Conversely, instead of the non-obvious extension of the existing kinematic codes, a new, original mixed approach has been elaborated on the basis of the plane strain structure formulation earlier developed by F. Pastor (2007). Indeed, such a formulation does not need the expressions of the unit dissipated powers. Interestingly, it delivers a numerical code better conditioned and notably more rapid than the previous one, while preserving the rigorous upper bound character of the corresponding numerical results. The efficiency of the whole approach is first demonstrated through comparisons of the results to the analytical upper bounds of Benzerga and Besson (2001) or Monchiet et al. (2008) in the case of spherical voids in the Hill matrix. Moreover, we provide upper and lower bound results for the hollow spheroid with the Hill matrix, which are compared to those of Monchiet et al. (2008).
The Relationship between Project-Based Learning and Rigor in STEM-Focused High Schools
ERIC Educational Resources Information Center
Edmunds, Julie; Arshavsky, Nina; Glennie, Elizabeth; Charles, Karen; Rice, Olivia
2016-01-01
Project-based learning (PjBL) is an approach often favored in STEM classrooms, yet some studies have shown that teachers struggle to implement it with academic rigor. This paper explores the relationship between PjBL and rigor in the classrooms of ten STEM-oriented high schools. Utilizing three different data sources reflecting three different…
Mašín, Ivan
2016-01-01
One of the important sources of biomass-based fuel is Jatropha curcas L. Great attention is paid to the biofuel produced from the oil extracted from the Jatropha curcas L. seeds. Mechanised extraction is the most efficient and feasible method of oil extraction for small-scale farmers, but there is a need to extract oil in a more efficient manner which would increase labour productivity, decrease production costs, and increase the benefits to small-scale farmers. On the other hand, innovators should be aware that further machine development is possible only when applying a systematic approach and design methodology in all stages of engineering design. A systematic approach in this case means that designers and development engineers rigorously apply scientific knowledge, integrate different constraints and user priorities, carefully plan product and activities, and systematically solve technical problems. This paper therefore deals with a comprehensive approach to determining design specifications that can bring new innovative concepts to the design of mechanical machines for oil extraction. The presented case study, as the main part of the paper, is focused on a new screw concept for a machine mechanically extracting oil from Jatropha curcas L. seeds. PMID:27668259
Bruno, Oscar P.; Turc, Catalin; Venakides, Stephanos
2016-01-01
This work, part I in a two-part series, presents: (i) a simple and highly efficient algorithm for evaluation of quasi-periodic Green functions, as well as (ii) an associated boundary-integral equation method for the numerical solution of problems of scattering of waves by doubly periodic arrays of scatterers in three-dimensional space. Except for certain ‘Wood frequencies’ at which the quasi-periodic Green function ceases to exist, the proposed approach, which is based on smooth windowing functions, gives rise to tapered lattice sums which converge superalgebraically fast to the Green function—that is, faster than any power of the number of terms used. This is in sharp contrast to the extremely slow convergence exhibited by the lattice sums in the absence of smooth windowing. (The Wood-frequency problem is treated in part II.) This paper establishes rigorously the superalgebraic convergence of the windowed lattice sums. A variety of numerical results demonstrate the practical efficiency of the proposed approach. PMID:27493573
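Schematically, a smoothly windowed lattice sum of the kind described above has the following form; the notation (window χ, truncation radius A, lattice translations R_n) is assumed for illustration, and sign conventions may differ from the paper's precise definition.

```latex
% Schematic windowed lattice sum for a doubly periodic quasi-periodic Green
% function: R_n are lattice translations, k the Bloch wavevector, G_0 the
% free-space Green function, and chi a smooth window equal to 1 near the
% origin and decaying to 0 at radius A.
G^{q}_{A}(\mathbf{x}) \;=\;
   \sum_{\mathbf{n}\in\mathbb{Z}^{2}}
   \chi\!\left(\frac{|\mathbf{R}_{\mathbf{n}}|}{A}\right)
   e^{\,i\,\mathbf{k}\cdot\mathbf{R}_{\mathbf{n}}}\,
   G_{0}\!\left(\mathbf{x}-\mathbf{R}_{\mathbf{n}}\right)
```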
Time Series Expression Analyses Using RNA-seq: A Statistical Approach
Oh, Sunghee; Song, Seongho; Grabowski, Gregory; Zhao, Hongyu; Noonan, James P.
2013-01-01
RNA-seq is becoming the de facto standard approach for transcriptome analysis with ever-reducing cost. It has considerable advantages over conventional technologies (microarrays) because it allows for direct identification and quantification of transcripts. Many time series RNA-seq datasets have been collected to study the dynamic regulations of transcripts. However, statistically rigorous and computationally efficient methods are needed to explore the time-dependent changes of gene expression in biological systems. These methods should explicitly account for the dependencies of expression patterns across time points. Here, we discuss several methods that can be applied to model timecourse RNA-seq data, including statistical evolutionary trajectory index (SETI), autoregressive time-lagged regression (AR(1)), and hidden Markov model (HMM) approaches. We use three real datasets and simulation studies to demonstrate the utility of these dynamic methods in temporal analysis. PMID:23586021
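A toy illustration of the AR(1) time-lagged regression mentioned above: expression at time t is regressed on expression at time t-1. Real analyses would model read counts with appropriate error distributions and replicates; the numbers here are placeholders.

```python
import numpy as np
import statsmodels.api as sm

# Toy time-course expression for one gene: log counts at successive time points
# (placeholder values, not real RNA-seq data).
log_expr = np.array([2.1, 2.4, 3.0, 3.8, 4.1, 4.3, 4.2, 4.4])

# AR(1) time-lagged regression: expression at time t regressed on time t-1.
y = log_expr[1:]
X = sm.add_constant(log_expr[:-1])
fit = sm.OLS(y, X).fit()
print(fit.params)        # intercept and lag-1 coefficient
print(fit.pvalues[1])    # evidence for temporal dependence via the lag term
```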
2014-01-01
Background: Recent innovations in sequencing technologies have provided researchers with the ability to rapidly characterize the microbial content of an environmental or clinical sample with unprecedented resolution. These approaches are producing a wealth of information that is providing novel insights into the microbial ecology of the environment and human health. However, these sequencing-based approaches produce large and complex datasets that require efficient and sensitive computational analysis workflows. Many recent tools for analyzing metagenomic-sequencing data have emerged; however, these approaches often suffer from issues of specificity and efficiency, and typically do not include a complete metagenomic analysis framework. Results: We present PathoScope 2.0, a complete bioinformatics framework for rapidly and accurately quantifying the proportions of reads from individual microbial strains present in metagenomic sequencing data from environmental or clinical samples. The pipeline performs all necessary computational analysis steps, including reference genome library extraction and indexing, read quality control and alignment, strain identification, and summarization and annotation of results. We rigorously evaluated PathoScope 2.0 using simulated data and data from the 2011 outbreak of Shiga-toxigenic Escherichia coli O104:H4. Conclusions: The results show that PathoScope 2.0 is a complete, highly sensitive, and efficient approach for metagenomic analysis that outperforms alternative approaches in scope, speed, and accuracy. The PathoScope 2.0 pipeline software is freely available for download at: http://sourceforge.net/projects/pathoscope/. PMID:25225611
Comparison of holographic lens and filter systems for lateral spectrum splitting
NASA Astrophysics Data System (ADS)
Vorndran, Shelby; Chrysler, Benjamin; Kostuk, Raymond K.
2016-09-01
Spectrum splitting is an approach to increasing the conversion efficiency of a photovoltaic (PV) system. Several methods can be used to perform this function, which requires efficient spatial separation of different spectral bands of the incident solar radiation. In this paper, several holographic methods for implementing spectrum splitting are reviewed, along with the benefits and disadvantages associated with each approach. The review indicates that a volume holographic lens has many advantages for spectrum splitting in terms of both power conversion efficiency and energy yield. A specific design for a volume holographic spectrum splitting lens is discussed for use with high bandgap InGaP and low bandgap silicon PV cells. The holographic lenses are modeled using rigorous coupled wave analysis, and the optical efficiency is evaluated using non-sequential raytracing. A proof-of-concept off-axis holographic lens is also recorded in dichromated gelatin film and the spectral diffraction efficiency of the hologram is measured with multiple laser sources across the diffracted spectral band. The experimental volume holographic lens (VHL) characteristics are compared to an ideal spectrum splitting filter in terms of power conversion efficiency and energy yield in environments with high direct normal incidence (DNI) illumination and high levels of diffuse illumination. The results show that the experimental VHL can achieve 62.5% of the ideal filter power conversion efficiency, 64.8% of the ideal filter DNI environment energy yield, and 57.7% of the ideal diffuse environment energy yield performance.
Rigorous Numerical Study of Low-Period Windows for the Quadratic Map
NASA Astrophysics Data System (ADS)
Galias, Zbigniew
An efficient method to find all low-period windows for the quadratic map is proposed. The method is used to obtain very accurate rigorous bounds on the positions of all periodic windows with periods p ≤ 32. The contribution of period-doubling windows to the total width of periodic windows is discussed. Properties of periodic windows are studied numerically.
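A non-rigorous, floating-point sketch of the underlying idea is shown below: scan the parameter c of the quadratic map x → x² + c and flag ranges where an attracting low-period cycle is detected. The paper's contribution is precisely to replace this kind of heuristic detection with rigorous interval bounds, which the sketch does not attempt; all tolerances and grid sizes are arbitrary choices.

```python
import numpy as np

def attracting_period(c, max_period=8, n_transient=1000, tol=1e-9):
    """Period of the attracting cycle of x -> x^2 + c reached from x0 = 0,
    or None if the orbit escapes or no period <= max_period is detected."""
    x = 0.0
    for _ in range(n_transient):            # discard the transient
        x = x * x + c
        if abs(x) > 4.0:                     # orbit escapes to infinity
            return None
    x0 = x
    for p in range(1, max_period + 1):
        x = x * x + c
        if abs(x - x0) < tol:
            return p
    return None

cs = np.linspace(-2.0, 0.25, 5001)           # crude parameter scan
periods = [attracting_period(c) for c in cs]

start, current = None, None
for i, p in enumerate(periods + [None]):     # sentinel closes the last run
    if p is not None and start is None:
        start, current = i, p
    elif start is not None and p != current:
        if i - start > 5:                    # ignore runs too short to trust
            print(f"period-{current} window roughly c in [{cs[start]:+.4f}, {cs[i-1]:+.4f}]")
        start, current = (i, p) if p is not None else (None, None)
```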
NASA Astrophysics Data System (ADS)
Mustac, M.; Kim, S.; Tkalcic, H.; Rhie, J.; Chen, Y.; Ford, S. R.; Sebastian, N.
2015-12-01
Conventional approaches to inverse problems suffer from non-linearity and non-uniqueness in estimations of seismic structures and source properties. Estimated results and associated uncertainties are often biased by applied regularizations and additional constraints, which are commonly introduced to solve such problems. Bayesian methods, however, provide statistically meaningful estimations of models and their uncertainties constrained by data information. In addition, hierarchical and trans-dimensional (trans-D) techniques are inherently implemented in the Bayesian framework to account for involved error statistics and model parameterizations, and, in turn, allow more rigorous estimations of the same. Here, we apply Bayesian methods throughout the entire inference process to estimate seismic structures and source properties in Northeast Asia including east China, the Korean peninsula, and the Japanese islands. Ambient noise analysis is first performed to obtain a base three-dimensional (3-D) heterogeneity model using continuous broadband waveforms from more than 300 stations. As for the tomography of surface wave group and phase velocities in the 5-70 s band, we adopt a hierarchical and trans-D Bayesian inversion method using Voronoi partition. The 3-D heterogeneity model is further improved by joint inversions of teleseismic receiver functions and dispersion data using a newly developed high-efficiency Bayesian technique. The obtained model is subsequently used to prepare 3-D structural Green's functions for the source characterization. A hierarchical Bayesian method for point source inversion using regional complete waveform data is applied to selected events from the region. The seismic structure and source characteristics with rigorously estimated uncertainties from the novel Bayesian methods provide enhanced monitoring and discrimination of seismic events in northeast Asia.
Vallecillo-Viejo, Isabel C; Liscovitch-Brauer, Noa; Montiel-Gonzalez, Maria Fernanda; Eisenberg, Eli; Rosenthal, Joshua J C
2018-01-02
Site-directed RNA editing (SDRE) is a general strategy for making targeted base changes in RNA molecules. Although the approach is relatively new, several groups, including our own, have been working on its development. The basic strategy has been to couple the catalytic domain of an adenosine (A) to inosine (I) RNA editing enzyme to a guide RNA that is used for targeting. Although highly efficient on-target editing has been reported, off-target events have not been rigorously quantified. In this report we target premature termination codons (PTCs) in messages encoding both a fluorescent reporter protein and the Cystic Fibrosis Transmembrane Conductance Regulator (CFTR) protein transiently transfected into human epithelial cells. We demonstrate that while on-target editing is efficient, off-target editing is extensive, both within the targeted message and across the entire transcriptome of the transfected cells. By redirecting the editing enzymes from the cytoplasm to the nucleus, off-target editing is reduced without compromising the on-target editing efficiency. The addition of the E488Q mutation to the editing enzymes, a common strategy for increasing on-target editing efficiency, causes a tremendous increase in off-target editing. These results underscore the need to reduce promiscuity in current approaches to SDRE.
Mathematical Rigor vs. Conceptual Change: Some Early Results
NASA Astrophysics Data System (ADS)
Alexander, W. R.
2003-05-01
Results from two different pedagogical approaches to teaching introductory astronomy at the college level will be presented. The first of these approaches is a descriptive, conceptually based approach that emphasizes conceptual change. This descriptive class is typically an elective for non-science majors. The other approach is a mathematically rigorous treatment that emphasizes problem solving and is designed to prepare students for further study in astronomy. The mathematically rigorous class is typically taken by science majors, and it also fulfills an elective science requirement for them. The Astronomy Diagnostic Test version 2 (ADT 2.0) was used as an assessment instrument since its validity and reliability have been investigated by previous researchers. The ADT 2.0 was administered as both a pre-test and post-test to both groups. Initial results show no significant difference between the two groups on the post-test. However, there is a slightly greater improvement for the descriptive class between pre- and post-testing compared to the mathematically rigorous course. Great care was taken to account for variables, including selection of text, class format, and instructor differences. Results indicate that the mathematically rigorous model does not improve conceptual understanding any more than the conceptual change model. Additional results indicate a gender bias in favor of males similar to that measured by previous investigators. This research has been funded by the College of Science and Mathematics at James Madison University.
Nano-photonic light trapping near the Lambertian limit in organic solar cell architectures.
Biswas, Rana; Timmons, Erik
2013-09-09
A critical step to achieving higher-efficiency solar cells is the broadband harvesting of solar photons. Although considerable progress has recently been achieved in improving the power conversion efficiency of organic solar cells, these cells still do not absorb up to ~50% of the solar spectrum. We have designed and developed an organic solar cell architecture that can boost the absorption of photons by 40% and the photo-current by 50% for organic P3HT-PCBM absorber layers of typical device thicknesses. Our solar cell architecture is based on all layers of the solar cell being patterned in a conformal two-dimensionally periodic photonic crystal architecture. This results in very strong diffraction of photons, which increases the photon path length in the absorber layer, and in plasmonic light concentration near the patterned organic-metal cathode interface. The absorption approaches the Lambertian limit. The simulations utilize a rigorous scattering matrix approach and provide bounds on the fundamental limits of nano-photonic light absorption in periodically textured organic solar cells. This solar cell architecture has the potential to increase the power conversion efficiency to 10% for single band gap organic solar cells utilizing long-wavelength absorbers.
Numerical Approach to Spatial Deterministic-Stochastic Models Arising in Cell Biology.
Schaff, James C; Gao, Fei; Li, Ye; Novak, Igor L; Slepchenko, Boris M
2016-12-01
Hybrid deterministic-stochastic methods provide an efficient alternative to a fully stochastic treatment of models which include components with disparate levels of stochasticity. However, general-purpose hybrid solvers for spatially resolved simulations of reaction-diffusion systems are not widely available. Here we describe fundamentals of a general-purpose spatial hybrid method. The method generates realizations of a spatially inhomogeneous hybrid system by appropriately integrating capabilities of a deterministic partial differential equation solver with a popular particle-based stochastic simulator, Smoldyn. Rigorous validation of the algorithm is detailed, using a simple model of calcium 'sparks' as a testbed. The solver is then applied to a deterministic-stochastic model of spontaneous emergence of cell polarity. The approach is general enough to be implemented within biologist-friendly software frameworks such as Virtual Cell.
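A drastically simplified, non-spatial sketch of the hybrid idea (deterministic update for an abundant species, stochastic update for a rare one) is given below. It is an operator-splitting/tau-leaping toy with made-up rate constants, not the Smoldyn-coupled solver described in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model (rates are made up):
#   abundant species A, deterministic:  dA/dt = k_prod - k_deg * A
#   rare species B, stochastic:         A --k_conv--> A + B,   B --k_loss--> 0
k_prod, k_deg, k_conv, k_loss = 50.0, 0.1, 0.02, 0.5

def hybrid_step(A, B, dt):
    # deterministic part: explicit Euler update of the abundant species
    A = A + dt * (k_prod - k_deg * A)
    # stochastic part: tau-leaping (Poisson) update of the rare species
    births = rng.poisson(k_conv * A * dt)
    deaths = rng.poisson(k_loss * B * dt)
    return A, max(B + births - deaths, 0)

A, B, dt = 0.0, 0, 0.01
for _ in range(20000):                        # simulate 200 time units
    A, B = hybrid_step(A, B, dt)
print(f"final A = {A:.1f} (continuous), final B = {B} (discrete copy number)")
```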
Developing a space network interface simulator: The NTS approach
NASA Technical Reports Server (NTRS)
Hendrzak, Gary E.
1993-01-01
This paper describes the approach used to redevelop the Network Control Center (NCC) Test System (NTS), a hardware and software facility designed to make testing of the NCC Data System (NCCDS) software efficient, effective, and as rigorous as possible prior to operational use. The NTS transmits and receives network message traffic in real-time. Data transfer rates and message content are strictly controlled and are identical to that of the operational systems. NTS minimizes the need for costly and time-consuming testing with the actual external entities (e.g., the Hubble Space Telescope (HST) Payload Operations Control Center (POCC) and the White Sands Ground Terminal). Discussed are activities associated with the development of the NTS, lessons learned throughout the project's lifecycle, and resulting productivity and quality increases.
Nanophotonic light-trapping theory for solar cells
NASA Astrophysics Data System (ADS)
Yu, Zongfu; Raman, Aaswath; Fan, Shanhui
2011-11-01
Conventional light-trapping theory, based on a ray-optics approach, was developed for standard thick photovoltaic cells. The classical theory established an upper limit for possible absorption enhancement in this context and provided a design strategy for reaching this limit. This theory has become the foundation for light management in bulk silicon PV cells, and has had enormous influence on the optical design of solar cells in general. This theory, however, is not applicable in the nanophotonic regime. Here we develop a statistical temporal coupled-mode theory of light trapping based on a rigorous electromagnetic approach. Our theory reveals that the standard limit can be substantially surpassed when optical modes in the active layer are confined to deep-subwavelength scale, opening new avenues for highly efficient next-generation solar cells.
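For context, a commonly quoted ray-optics (Lambertian/Yablonovitch) estimate of absorption in a textured film is A ≈ α/(α + 1/(4n²d)), i.e. a path-length enhancement of about 4n² over a single pass. The sketch below simply evaluates that classical estimate for a thin film with illustrative numbers; it does not reproduce the paper's coupled-mode theory.

```python
import numpy as np

def single_pass_absorption(alpha, d):
    """Absorption of one straight pass through a film of thickness d."""
    return 1.0 - np.exp(-alpha * d)

def lambertian_limit_absorption(alpha, d, n):
    """Commonly quoted ray-optics estimate with ideal Lambertian light trapping:
    path-length enhancement of roughly 4 n^2."""
    return alpha / (alpha + 1.0 / (4.0 * n**2 * d))

n_si = 3.5                      # rough refractive index of silicon
d = 2e-6                        # 2-micron film
for alpha in (1e3, 1e4, 1e5):   # absorption coefficients in 1/m (weak to strong)
    a1 = single_pass_absorption(alpha, d)
    a2 = lambertian_limit_absorption(alpha, d, n_si)
    print(f"alpha={alpha:.0e} 1/m:  single pass {a1:.3f},  Lambertian-limit estimate {a2:.3f}")
```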
Local electric dipole moments for periodic systems via density functional theory embedding.
Luber, Sandra
2014-12-21
We describe a novel approach for the calculation of local electric dipole moments for periodic systems. Since the position operator is ill-defined in periodic systems, maximally localized Wannier functions based on the Berry-phase approach are usually employed for the evaluation of local contributions to the total electric dipole moment of the system. We propose an alternative approach: within a subsystem-density functional theory based embedding scheme, subset electric dipole moments are derived without any additional localization procedure, both for hybrid and non-hybrid exchange-correlation functionals. This opens the way to a computationally efficient evaluation of local electric dipole moments in (molecular) periodic systems as well as their rigorous splitting into atomic electric dipole moments. As examples, infrared spectra of liquid ethylene carbonate and dimethyl carbonate, which are commonly employed as solvents in lithium-ion batteries, are presented.
Computational Approaches to the Chemical Equilibrium Constant in Protein-ligand Binding.
Montalvo-Acosta, Joel José; Cecchini, Marco
2016-12-01
The physiological role played by protein-ligand recognition has motivated the development of several computational approaches to the ligand binding affinity. Some of them, termed rigorous, have a strong theoretical foundation but involve too much computation to be generally useful. Some others alleviate the computational burden by introducing strong approximations and/or empirical calibrations, which also limit their general use. Most importantly, there is no straightforward correlation between the predictive power and the level of approximation introduced. Here, we present a general framework for the quantitative interpretation of protein-ligand binding based on statistical mechanics. Within this framework, we re-derive self-consistently the fundamental equations of some popular approaches to the binding constant and pinpoint the inherent approximations. Our analysis represents a first step towards the development of variants with optimum accuracy/efficiency ratio for each stage of the drug discovery pipeline. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
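As a small numerical aside (not one of the reviewed methods), the quantity all of these approaches ultimately target is tied to measurable affinities through the standard-state relation ΔG° = RT ln(K_d/C°):

```python
import math

R = 8.314462618e-3   # gas constant in kJ/(mol K)
T = 298.15           # temperature in K
C0 = 1.0             # standard-state concentration, 1 mol/L

def kd_from_binding_free_energy(dG_kJ_per_mol):
    """Dissociation constant (mol/L) from the standard binding free energy,
    using dG = R T ln(Kd / C0)  <=>  Kd = C0 * exp(dG / (R T))."""
    return C0 * math.exp(dG_kJ_per_mol / (R * T))

for dG in (-20.0, -35.0, -50.0):   # illustrative binding free energies in kJ/mol
    kd = kd_from_binding_free_energy(dG)
    print(f"dG = {dG:6.1f} kJ/mol  ->  Kd ~ {kd:.2e} mol/L")
```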
Robust electromagnetic absorption by graphene/polymer heterostructures
NASA Astrophysics Data System (ADS)
Lobet, Michaël; Reckinger, Nicolas; Henrard, Luc; Lambin, Philippe
2015-07-01
Polymer/graphene heterostructures present good shielding efficiency against GHz electromagnetic perturbations. Theory and experiments demonstrate that there is an optimum number of graphene planes, separated by thin polymer spacers, leading to maximum absorption for millimeter waves (Batrakov et al 2014 Sci. Rep. 4 7191). Here, the electrodynamics of an ideal polymer/graphene multilayered material is first approached with a well-adapted continued-fraction formalism. In a second stage, rigorous coupled wave analysis is used to account for the presence of defects in graphene that are typical of samples produced by chemical vapor deposition, namely microscopic holes, microscopic dots (embryos of a second layer) and grain boundaries. It is shown that the optimum absorbance of graphene/polymer multilayers does not weaken to first order in defect concentration. This finding testifies to the robustness of the shielding efficiency of the proposed absorption device.
Graphical Descriptives: A Way to Improve Data Transparency and Methodological Rigor in Psychology.
Tay, Louis; Parrigon, Scott; Huang, Qiming; LeBreton, James M
2016-09-01
Several calls have recently been issued to the social sciences for enhanced transparency of research processes and enhanced rigor in the methodological treatment of data and data analytics. We propose the use of graphical descriptives (GDs) as one mechanism for responding to both of these calls. GDs provide a way to visually examine data. They serve as quick and efficient tools for checking data distributions, variable relations, and the potential appropriateness of different statistical analyses (e.g., do data meet the minimum assumptions for a particular analytic method). Consequently, we believe that GDs can promote increased transparency in the journal review process, encourage best practices for data analysis, and promote a more inductive approach to understanding psychological data. We illustrate the value of potentially including GDs as a step in the peer-review process and provide a user-friendly online resource (www.graphicaldescriptives.org) for researchers interested in including data visualizations in their research. We conclude with suggestions on how GDs can be expanded and developed to enhance transparency. © The Author(s) 2016.
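A minimal example of the kind of one-glance visual summary the authors have in mind (invented variables and data; the authors' own tool lives at www.graphicaldescriptives.org) could look like this:

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

# invented example data: three psychological measures for 200 participants
rng = np.random.default_rng(42)
conscientiousness = rng.normal(3.5, 0.6, 200)
job_satisfaction = 0.5 * conscientiousness + rng.normal(0, 0.5, 200)
performance = 0.3 * conscientiousness + 0.4 * job_satisfaction + rng.normal(0, 0.4, 200)
df = pd.DataFrame({"conscientiousness": conscientiousness,
                   "job_satisfaction": job_satisfaction,
                   "performance": performance})

# one-screen graphical descriptives: marginal distributions plus all pairwise relations
pd.plotting.scatter_matrix(df, diagonal="hist", figsize=(6, 6))
plt.suptitle("Graphical descriptives: distributions and pairwise relations")
plt.savefig("graphical_descriptives.png", dpi=150)
```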
NASA Technical Reports Server (NTRS)
Manford, J. S.; Bennett, G. R.
1985-01-01
The Space Station Program will incorporate analysis of operations constraints and considerations in the early design phases to avoid the need for later modifications to the Space Station for operations. The application of modern tools and administrative techniques to minimize the cost of performing effective orbital operations planning and design analysis in the preliminary design phase of the Space Station Program is discussed. Tools and techniques discussed include: approach for rigorous analysis of operations functions, use of the resources of a large computer network, and providing for efficient research and access to information.
Risk Management Structured for Today's Environment
NASA Technical Reports Server (NTRS)
Greenfield, Michael A.
1998-01-01
In NPG (NASA Procedures and Guidelines) 7120.5A, we define risk management as "an organized, systematic decision-making process that efficiently identifies, analyzes, plans, tracks, controls, and communicates and documents risk in order to increase the likelihood of achieving program/project goals." Effective risk management depends upon a thorough understanding of the concept of risk, the principles of risk management and the formation of a disciplined risk management process. In human spaceflight programs, NASA has always maintained a rigorous and highly structured risk management effort. When lives are at stake, NASA's missions must be 100% safe; the risk management approach used in human spaceflight has always been comprehensive.
Are computational models of any use to psychiatry?
Huys, Quentin J M; Moutoussis, Michael; Williams, Jonathan
2011-08-01
Mathematically rigorous descriptions of key hypotheses and theories are becoming more common in neuroscience and are beginning to be applied to psychiatry. In this article two fictional characters, Dr. Strong and Mr. Micawber, debate the use of such computational models (CMs) in psychiatry. We present four fundamental challenges to the use of CMs in psychiatry: (a) the applicability of mathematical approaches to core concepts in psychiatry such as subjective experiences, conflict and suffering; (b) whether psychiatry is mature enough to allow informative modelling; (c) whether theoretical techniques are powerful enough to approach psychiatric problems; and (d) the issue of communicating clinical concepts to theoreticians and vice versa. We argue that CMs have yet to influence psychiatric practice, but that they help psychiatric research in two fundamental ways: (a) to build better theories integrating psychiatry with neuroscience; and (b) to enforce explicit, global and efficient testing of hypotheses through more powerful analytical methods. CMs allow the complexity of a hypothesis to be rigorously weighed against the complexity of the data. The paper concludes with a discussion of the path ahead. It points to stumbling blocks, like the poor communication between theoretical and medical communities. But it also identifies areas in which the contributions of CMs will likely be pivotal, like an understanding of social influences in psychiatry, and of the co-morbidity structure of psychiatric diseases. Copyright © 2011 Elsevier Ltd. All rights reserved.
Scientific impact: opportunity and necessity.
Cohen, Marlene Z; Alexander, Gregory L; Wyman, Jean F; Fahrenwald, Nancy L; Porock, Davina; Wurzbach, Mary E; Rawl, Susan M; Conn, Vicki S
2010-08-01
Recent National Institutes of Health changes have focused attention on the potential scientific impact of research projects. Research with the excellent potential to change subsequent science or health care practice may have high scientific impact. Only rigorous studies that address highly significant problems can generate change. Studies with high impact may stimulate new research approaches by changing understanding of a phenomenon, informing theory development, or creating new research methods that allow a field of science to move forward. Research with high impact can transition health care to more effective and efficient approaches. Studies with high impact may propel new policy developments. Research with high scientific impact typically has both immediate and sustained influence on the field of study. The article includes ideas to articulate potential scientific impact in grant applications as well as possible dissemination strategies to enlarge the impact of completed projects.
A system-approach to the elastohydrodynamic lubrication point-contact problem
NASA Technical Reports Server (NTRS)
Lim, Sang Gyu; Brewe, David E.
1991-01-01
The classical EHL (elastohydrodynamic lubrication) point contact problem is solved using a new system-approach, similar to that introduced by Houpert and Hamrock for the line-contact problem. Introducing a body-fitted coordinate system, the troublesome free-boundary is transformed to a fixed domain. The Newton-Raphson method can then be used to determine the pressure distribution and the cavitation boundary subject to the Reynolds boundary condition. This method provides an efficient and rigorous way of solving the EHL point contact problem with the aid of a supercomputer and a promising method to deal with the transient EHL point contact problem. A typical pressure distribution and film thickness profile are presented and the minimum film thicknesses are compared with the solution of Hamrock and Dowson. The details of the cavitation boundaries for various operating parameters are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kalaria, P. C., E-mail: parth.kalaria@partner.kit.edu; Avramidis, K. A.; Franck, J.
High frequency (>230 GHz) megawatt-class gyrotrons are planned as RF sources for electron cyclotron resonance heating and current drive in DEMOnstration fusion power plants (DEMOs). In this paper, for the first time, a feasibility study of a 236 GHz DEMO gyrotron is presented by considering all relevant design goals and the possible technical limitations. A mode-selection procedure is proposed in order to satisfy the multi-frequency and frequency-step tunability requirements. An effective systematic design approach for the optimal design of a gradually tapered cavity is presented. The RF-behavior of the proposed cavity is verified rigorously, supporting 920 kW of stable output power with an interaction efficiency of 36%, including consideration of realistic beam parameters.
Efficient steady-state solver for hierarchical quantum master equations
NASA Astrophysics Data System (ADS)
Zhang, Hou-Dao; Qiao, Qin; Xu, Rui-Xue; Zheng, Xiao; Yan, YiJing
2017-07-01
Steady states play pivotal roles in many equilibrium and non-equilibrium open system studies. Their accurate evaluations call for exact theories with rigorous treatment of system-bath interactions. Therein, the hierarchical equations-of-motion (HEOM) formalism is a nonperturbative and non-Markovian quantum dissipation theory, which can faithfully describe the dissipative dynamics and nonlinear response of open systems. Nevertheless, solving the steady states of open quantum systems via HEOM is often a challenging task, due to the vast number of dynamical quantities involved. In this work, we propose a self-consistent iteration approach that quickly solves the HEOM steady states. We demonstrate its high efficiency with accurate and fast evaluations of low-temperature thermal equilibrium of a model Fenna-Matthews-Olson pigment-protein complex. Numerically exact evaluation of thermal equilibrium Rényi entropies and stationary emission line shapes is presented with detailed discussion.
An efficient hybrid technique in RCS predictions of complex targets at high frequencies
NASA Astrophysics Data System (ADS)
Algar, María-Jesús; Lozano, Lorena; Moreno, Javier; González, Iván; Cátedra, Felipe
2017-09-01
Most computer codes for Radar Cross Section (RCS) prediction use Physical Optics (PO) and the Physical Theory of Diffraction (PTD) combined with Geometrical Optics (GO) and the Geometrical Theory of Diffraction (GTD). The latter approaches are computationally cheaper and much more accurate for curved surfaces, but they are not applicable to the computation of the RCS of all surfaces of a complex object due to the presence of caustic problems in the analysis of concave surfaces or flat surfaces in the far field. The main contribution of this paper is the development of a hybrid method based on a new combination of two asymptotic techniques, GTD and PO, considering the advantages and avoiding the disadvantages of each of them. A very efficient and accurate method to analyze the RCS of complex structures at high frequencies is obtained with the new combination. The proposed method has been validated by comparing RCS results obtained for some simple cases using the proposed approach against results from the rigorous Method of Moments (MoM) technique. Some complex cases have been examined at high frequencies, contrasting the results with PO. This study shows the accuracy and efficiency of the hybrid method and its suitability for the computation of the RCS of very large and complex targets at high frequencies.
Systemic Planning: An Annotated Bibliography and Literature Guide. Exchange Bibliography No. 91.
ERIC Educational Resources Information Center
Catanese, Anthony James
Systemic planning is an operational approach to using scientific rigor and qualitative judgment in a complementary manner. It integrates rigorous techniques and methods from systems analysis, cybernetics, decision theory, and work programing. The annotated reference sources in this bibliography include those works that have been most influential…
Solar Energy Enhancement Using Down-converting Particles: A Rigorous Approach
2011-06-06
Abrams, Ze'ev R.; Niv, Avi; Zhang, Xiang. J. Appl. Phys. 109, 114905 (2011).
Short-cut Methods versus Rigorous Methods for Performance-evaluation of Distillation Configurations
Ramapriya, Gautham Madenoor; Selvarajah, Ajiththaa; Jimenez Cucaita, Luis Eduardo; ...
2018-05-17
Here, this study demonstrates the efficacy of a short-cut method such as the Global Minimization Algorithm (GMA), which uses assumptions of ideal mixtures, constant molar overflow (CMO) and pinched columns, in pruning the search-space of distillation column configurations for zeotropic multicomponent separation, to provide a small subset of attractive configurations with low minimum heat duties. The short-cut method, due to its simplifying assumptions, is computationally efficient, yet reliable in identifying the small subset of useful configurations for further detailed process evaluation. This two-tier approach allows expedient search of the configuration space containing hundreds to thousands of candidate configurations for a given application.
Promoting a Culture of Tailoring for Systems Engineering Policy Expectations
NASA Technical Reports Server (NTRS)
Blankenship, Van A.
2016-01-01
NASA's Marshall Space Flight Center (MSFC) has developed an integrated systems engineering approach to promote a culture of tailoring for program and project policy requirements. MSFC's culture encourages and supports tailoring, with an emphasis on risk-based decision making, for enhanced affordability and efficiency. MSFC's policy structure integrates the various Agency requirements into a single, streamlined implementation approach, which serves as a "one-stop-shop" for our programs and projects to follow. The engineers gain an enhanced understanding of policy and technical expectations, as well as lessons learned from MSFC's history of spaceflight and science missions, to enable them to make appropriate, risk-based tailoring recommendations. The tailoring approach utilizes a standard methodology to classify projects into predefined levels using selected mission and programmatic scaling factors related to risk tolerance. Policy requirements are then selectively applied and tailored, with appropriate rationale, and approved by the governing authorities, to support risk-informed decisions to achieve the desired cost and schedule efficiencies. The policy is further augmented by implementation tools and lifecycle planning aids which help promote and support the cultural shift toward more tailoring. The MSFC Customization Tool is an integrated spreadsheet that ties together everything projects need in order to understand, navigate, and tailor the policy. It helps them classify their project, understand the intent of the requirements, determine their tailoring approach, and document the necessary governance approvals. It also helps them plan for and conduct technical reviews throughout the lifecycle. Policy tailoring is thus established as a normal part of project execution, with the tools provided to facilitate and enable the tailoring process. MSFC's approach to changing the culture emphasizes risk-based tailoring of policy to achieve increased flexibility, efficiency, and effectiveness in project execution, while maintaining appropriate rigor to ensure mission success.
Dynamic programming algorithms for biological sequence comparison.
Pearson, W R; Miller, W
1992-01-01
Efficient dynamic programming algorithms are available for a broad class of protein and DNA sequence comparison problems. These algorithms require computer time proportional to the product of the lengths of the two sequences being compared [O(N²)] but require memory space proportional only to the sum of these lengths [O(N)]. Although the requirement for O(N²) time limits use of the algorithms to the largest computers when searching protein and DNA sequence databases, many other applications of these algorithms, such as calculation of distances for evolutionary trees and comparison of a new sequence to a library of sequence profiles, are well within the capabilities of desktop computers. In particular, the results of library searches with rapid searching programs, such as FASTA or BLAST, should be confirmed by performing a rigorous optimal alignment. Whereas rapid methods do not overlook significant sequence similarities, FASTA limits the number of gaps that can be inserted into an alignment, so that a rigorous alignment may extend the alignment substantially in some cases. BLAST does not allow gaps in the local regions that it reports; a calculation that allows gaps is very likely to extend the alignment substantially. Although a Monte Carlo evaluation of the statistical significance of a similarity score with a rigorous algorithm is much slower than the heuristic approach used by the RDF2 program, the dynamic programming approach should take less than 1 hr on a 386-based PC or desktop Unix workstation. For descriptive purposes, we have limited our discussion to methods for calculating similarity scores and distances that use gap penalties of the form g = rk. Nevertheless, programs for the more general case (g = q + rk) are readily available. Versions of these programs that run either on Unix workstations, IBM-PC class computers, or the Macintosh can be obtained from either of the authors.
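A minimal score-only global alignment with the gap penalty g = rk discussed above is sketched below; it keeps only two rows of the dynamic-programming matrix, which is what gives the O(N²)-time, O(N)-memory behaviour. The match/mismatch scores are toy values, not a real substitution matrix.

```python
def global_alignment_score(a, b, match=5, mismatch=-4, r=8):
    """Score-only Needleman-Wunsch with gap penalty g = r * k.

    Time is O(len(a) * len(b)); memory is O(len(b)) because only two
    rows of the dynamic-programming matrix are kept at a time.
    """
    prev = [-r * j for j in range(len(b) + 1)]      # row for the empty prefix of a
    for i, ai in enumerate(a, start=1):
        curr = [-r * i]                              # first column: i residues aligned to gaps
        for j, bj in enumerate(b, start=1):
            sub = match if ai == bj else mismatch
            curr.append(max(prev[j - 1] + sub,       # align ai with bj
                            prev[j] - r,             # gap opposite ai
                            curr[j - 1] - r))        # gap opposite bj
        prev = curr
    return prev[-1]

print(global_alignment_score("HEAGAWGHEE", "PAWHEAE"))
```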
Efficiency and formalism of quantum games
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, C.F.; Johnson, Neil F.
We show that quantum games are more efficient than classical games and provide a saturated upper bound for this efficiency. We also demonstrate that the set of finite classical games is a strict subset of the set of finite quantum games. Our analysis is based on a rigorous formulation of quantum games, from which quantum versions of the minimax theorem and the Nash equilibrium theorem can be deduced.
Double Dutch: A Tool for Designing Combinatorial Libraries of Biological Systems.
Roehner, Nicholas; Young, Eric M; Voigt, Christopher A; Gordon, D Benjamin; Densmore, Douglas
2016-06-17
Recently, semirational approaches that rely on combinatorial assembly of characterized DNA components have been used to engineer biosynthetic pathways. In practice, however, it is not practical to assemble and test millions of pathway variants in order to elucidate how different DNA components affect the behavior of a pathway. To address this challenge, we apply a rigorous mathematical approach known as design of experiments (DOE) that can be used to construct empirical models of system behavior without testing all variants. To support this approach, we have developed a tool named Double Dutch, which uses a formal grammar and heuristic algorithms to automate the process of DOE library design. Compared to designing by hand, Double Dutch enables users to more efficiently and scalably design libraries of pathway variants that can be used in a DOE framework and uniquely provides a means to flexibly balance design considerations of statistical analysis, construction cost, and risk of homologous recombination, thereby demonstrating the utility of automating decision making when faced with complex design trade-offs.
Reconciling the Rigor-Relevance Dilemma in Intellectual Capital Research
ERIC Educational Resources Information Center
Andriessen, Daniel
2004-01-01
This paper raises the issue of research methodology for intellectual capital and other types of management research by focusing on the dilemma of rigour versus relevance. The more traditional explanatory approach to research often leads to rigorous results that are not of much help to solve practical problems. This paper describes an alternative…
Spatial scaling and multi-model inference in landscape genetics: Martes americana in northern Idaho
Tzeidle N. Wasserman; Samuel A. Cushman; Michael K. Schwartz; David O. Wallin
2010-01-01
Individual-based analyses relating landscape structure to genetic distances across complex landscapes enable rigorous evaluation of multiple alternative hypotheses linking landscape structure to gene flow. We utilize two extensions to increase the rigor of the individual-based causal modeling approach to inferring relationships between landscape patterns and gene flow...
Collisional damping rates for plasma waves
NASA Astrophysics Data System (ADS)
Tigik, S. F.; Ziebell, L. F.; Yoon, P. H.
2016-06-01
The distinction between plasma dynamics dominated by collisional transport and dynamics dominated by collective processes had never been rigorously addressed until recently. A recent paper [P. H. Yoon et al., Phys. Rev. E 93, 033203 (2016)] formulates, for the first time, a unified kinetic theory in which collective processes and collisional dynamics are systematically incorporated from first principles. One of the outcomes of such a formalism is the rigorous derivation of collisional damping rates for Langmuir and ion-acoustic waves, which can be contrasted with the customary heuristic approach. However, the results are given only as formal mathematical expressions. The present brief communication numerically evaluates the rigorous collisional damping rates for plasma particles with a Maxwellian velocity distribution function, so as to assess the consequences of the rigorous formalism in a quantitative manner. Comparison with the heuristic ("Spitzer") formula shows that the accurate damping rates are much lower in magnitude than the conventional expression, which implies that the traditional approach over-estimates the importance of the attenuation of plasma waves by the collisional relaxation process. Such a finding may have wide applicability ranging from laboratory to space and astrophysical plasmas.
Phonon-tunnelling dissipation in mechanical resonators
Cole, Garrett D.; Wilson-Rae, Ignacio; Werbach, Katharina; Vanner, Michael R.; Aspelmeyer, Markus
2011-01-01
Microscale and nanoscale mechanical resonators have recently emerged as ubiquitous devices for use in advanced technological applications, for example, in mobile communications and inertial sensors, and as novel tools for fundamental scientific endeavours. Their performance is in many cases limited by the deleterious effects of mechanical damping. In this study, we report a significant advancement towards understanding and controlling support-induced losses in generic mechanical resonators. We begin by introducing an efficient numerical solver, based on the 'phonon-tunnelling' approach, capable of predicting the design-limited damping of high-quality mechanical resonators. Further, through careful device engineering, we isolate support-induced losses and perform a rigorous experimental test of the strong geometric dependence of this loss mechanism. Our results are in excellent agreement with the theory, demonstrating the predictive power of our approach. In combination with recent progress on complementary dissipation mechanisms, our phonon-tunnelling solver represents a major step towards accurate prediction of the mechanical quality factor. PMID:21407197
NASA Astrophysics Data System (ADS)
Frizyuk, Kristina; Hasan, Mehedi; Krasnok, Alex; Alú, Andrea; Petrov, Mihail
2018-02-01
Resonantly enhanced Raman scattering in dielectric nanostructures has been recently proven to be an efficient tool for nanothermometry and for the experimental determination of their mode composition. In this paper we develop a rigorous analytical theory based on the Green's function approach to calculate the Raman emission from crystalline high-index dielectric nanoparticles. As an example, we consider silicon nanoparticles which have a strong Raman response due to active optical phonon modes. We relate enhancement of Raman signal emission to the Purcell effect due to the excitation of Mie modes inside the nanoparticles. We also employ our numerical approach to calculate inelastic Raman emission in more sophisticated geometries, which do not allow a straightforward analytical form of the Green's function. The Raman response from a silicon nanodisk has been analyzed with the proposed method, and the contribution of various Mie modes has been revealed.
Useful Material Efficiency Green Metrics Problem Set Exercises for Lecture and Laboratory
ERIC Educational Resources Information Center
Andraos, John
2015-01-01
A series of pedagogical problem set exercises are posed that illustrate the principles behind material efficiency green metrics and their application in developing a deeper understanding of reaction and synthesis plan analysis and strategies to optimize them. Rigorous, yet simple, mathematical proofs are given for some of the fundamental concepts,…
Nakayama, Y; Aoki, Y; Niitsu, H; Saigusa, K
2001-04-15
Forensic dentistry plays an essential role in personal identification procedures. An adequate interincisal space in cadavers with rigor mortis is required to obtain detailed dental findings. We have developed intraoral and two-directional approaches for myotomy of the temporal muscles. The intraoral approach, in which the temporalis was dissected with scissors inserted via an intraoral incision, was adopted for elderly cadavers, females and emaciated or exhausted bodies, and had the merit of requiring no incision on the face. The two-directional approach, in which myotomy was performed with a thread-wire saw from behind and with scissors via the intraoral incision, was designed for muscular young males. Both approaches were effective in obtaining the desired degree of interincisal opening without facial damage.
NASA Astrophysics Data System (ADS)
Shao, Feng; Evanschitzky, Peter; Fühner, Tim; Erdmann, Andreas
2009-10-01
This paper employs the Waveguide decomposition method as an efficient rigorous electromagnetic field (EMF) solver to investigate three-dimensional mask-induced imaging artifacts in EUV lithography. The major mask-diffraction-induced imaging artifacts are first identified by applying the Zernike analysis of the mask nearfield spectrum of 2D lines/spaces. Three-dimensional mask features like 22nm semidense/dense contacts/posts, isolated elbows and line-ends are then investigated in terms of lithographic results. After that, the 3D mask-induced imaging artifacts such as feature orientation dependent best focus shift, process window asymmetries, and other aberration-like phenomena are explored for the studied mask features. The simulation results can help lithographers to understand the reasons for EUV-specific imaging artifacts and to devise illumination and feature dependent strategies for their compensation in the optical proximity correction (OPC) for EUV masks. Finally, an efficient approach using the Zernike analysis together with the Waveguide decomposition technique is proposed to characterize the impact of mask properties for the future OPC process.
Ju, Feng; Lee, Hyo Kyung; Yu, Xinhua; Faris, Nicholas R; Rugless, Fedoria; Jiang, Shan; Li, Jingshan; Osarogiagbon, Raymond U
2017-12-01
The process of lung cancer care from initial lesion detection to treatment is complex, involving multiple steps, each introducing the potential for substantial delays. Identifying the steps with the greatest delays enables a focused effort to improve the timeliness of care-delivery, without sacrificing quality. We retrospectively reviewed clinical events from initial detection, through histologic diagnosis, radiologic and invasive staging, and medical clearance, to surgery for all patients who had an attempted resection of a suspected lung cancer in a community healthcare system. We used a computer process modeling approach to evaluate delays in care delivery, in order to identify potential 'bottlenecks' in waiting time, the reduction of which could produce greater care efficiency. We also conducted 'what-if' analyses to predict the relative impact of simulated changes in the care delivery process to determine the most efficient pathways to surgery. The waiting time between radiologic lesion detection and diagnostic biopsy, and the waiting time from radiologic staging to surgery were the two most critical bottlenecks impeding efficient care delivery (more than 3 times larger compared to reducing other waiting times). Additionally, instituting surgical consultation prior to cardiac consultation for medical clearance and decreasing the waiting time between CT scans and diagnostic biopsies, were potentially the most impactful measures to reduce care delays before surgery. Rigorous computer simulation modeling, using clinical data, can provide useful information to identify areas for improving the efficiency of care delivery by process engineering, for patients who receive surgery for lung cancer.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhaoyuan Liu; Kord Smith; Benoit Forget
2016-05-01
A new method for computing homogenized assembly neutron transport cross sections and diffusion coefficients that is both rigorous and computationally efficient is proposed in this paper. In the limit of a homogeneous hydrogen slab, the new method is equivalent to the long-used, and only recently published, CASMO transport method. The rigorous method is used to demonstrate the sources of inaccuracy in the commonly applied “out-scatter” transport correction. It is also demonstrated that the newly developed method is directly applicable to lattice calculations performed by Monte Carlo and is capable of computing rigorous homogenized transport cross sections for arbitrarily heterogeneous lattices. Comparisons of several common transport cross section approximations are presented for a simple problem of infinite-medium hydrogen. The new method has also been applied in computing 2-group diffusion data for an actual PWR lattice from the BEAVRS benchmark.
Bardhan, Jaydeep P; Knepley, Matthew G; Anitescu, Mihai
2009-03-14
The importance of electrostatic interactions in molecular biology has driven extensive research toward the development of accurate and efficient theoretical and computational models. Linear continuum electrostatic theory has been surprisingly successful, but the computational costs associated with solving the associated partial differential equations (PDEs) preclude the theory's use in most dynamical simulations. Modern generalized-Born models for electrostatics can reproduce PDE-based calculations to within a few percent and are extremely computationally efficient but do not always faithfully reproduce interactions between chemical groups. Recent work has shown that a boundary-integral-equation formulation of the PDE problem leads naturally to a new approach called boundary-integral-based electrostatics estimation (BIBEE) to approximate electrostatic interactions. In the present paper, we prove that the BIBEE method can be used to rigorously bound the actual continuum-theory electrostatic free energy. The bounds are validated using a set of more than 600 proteins. Detailed numerical results are presented for structures of the peptide met-enkephalin taken from a molecular-dynamics simulation. These bounds, in combination with our demonstration that the BIBEE methods accurately reproduce pairwise interactions, suggest a new approach toward building a highly accurate yet computationally tractable electrostatic model.
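For readers unfamiliar with the generalized-Born models mentioned above, the classic Still et al. pairwise expression is easy to evaluate directly; the sketch below uses made-up charges and Born radii and is unrelated to the BIBEE bounds themselves.

```python
import numpy as np

def gb_polarization_energy(q, pos, born_radii, eps_solvent=78.5):
    """Solvation (polarization) free energy from the Still et al. generalized-Born formula:
        dG_pol = -0.5 * (1 - 1/eps) * sum_ij q_i q_j / f_GB(r_ij)
        f_GB   = sqrt(r_ij^2 + R_i R_j exp(-r_ij^2 / (4 R_i R_j)))
    Charges in elementary charges, distances/radii in Angstrom; result in kcal/mol
    (332.06 is the usual electrostatic conversion constant)."""
    q = np.asarray(q, float)
    pos = np.asarray(pos, float)
    R = np.asarray(born_radii, float)
    energy = 0.0
    for i in range(len(q)):
        for j in range(len(q)):
            r2 = np.sum((pos[i] - pos[j]) ** 2)
            f_gb = np.sqrt(r2 + R[i] * R[j] * np.exp(-r2 / (4.0 * R[i] * R[j])))
            energy += q[i] * q[j] / f_gb
    return -0.5 * 332.06 * (1.0 - 1.0 / eps_solvent) * energy

# toy "salt bridge": +1 and -1 charges 4 Angstrom apart with 2 Angstrom Born radii
print(gb_polarization_energy([1.0, -1.0], [[0, 0, 0], [4, 0, 0]], [2.0, 2.0]))
```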
All Rigor and No Play Is No Way to Improve Learning
ERIC Educational Resources Information Center
Wohlwend, Karen; Peppler, Kylie
2015-01-01
The authors propose and discuss their Playshop curricular model, which they developed with teachers. Their studies suggest a playful approach supports even more rigor than the Common Core State Standards require for preschool and early grade children. Children keep their attention longer when learning comes in the form of something they can play…
Duan, Lili; Liu, Xiao; Zhang, John Z H
2016-05-04
Efficient and reliable calculation of protein-ligand binding free energy is a grand challenge in computational biology and is of critical importance in drug design and many other molecular recognition problems. The main challenge lies in the calculation of entropic contribution to protein-ligand binding or interaction systems. In this report, we present a new interaction entropy method which is theoretically rigorous, computationally efficient, and numerically reliable for calculating entropic contribution to free energy in protein-ligand binding and other interaction processes. Drastically different from the widely employed but extremely expensive normal mode method for calculating entropy change in protein-ligand binding, the new method calculates the entropic component (interaction entropy or -TΔS) of the binding free energy directly from molecular dynamics simulation without any extra computational cost. Extensive study of over a dozen randomly selected protein-ligand binding systems demonstrated that this interaction entropy method is both computationally efficient and numerically reliable and is vastly superior to the standard normal mode approach. This interaction entropy paradigm introduces a novel and intuitive conceptual understanding of the entropic effect in protein-ligand binding and other general interaction systems as well as a practical method for highly efficient calculation of this effect.
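In this formulation the entropic term has a closed form, -TΔS = k_BT ln⟨exp(βΔE_int)⟩, with ΔE_int the fluctuation of the protein-ligand interaction energy along the trajectory; a minimal evaluation from an array of sampled interaction energies (synthetic numbers standing in for MD snapshots, not the authors' data) might look like this:

```python
import numpy as np

def interaction_entropy_term(E_int_kcal, T=298.15):
    """-T*dS (kcal/mol) from sampled protein-ligand interaction energies via
       -T*dS = k_B T ln < exp(beta * dE_int) >,  dE_int = E_int - <E_int>."""
    kT = 0.0019872041 * T                  # Boltzmann constant in kcal/(mol K) times T
    dE = np.asarray(E_int_kcal) - np.mean(E_int_kcal)
    # average exp(dE/kT) in a numerically safe way (shift by the largest exponent)
    x = dE / kT
    log_avg = np.max(x) + np.log(np.mean(np.exp(x - np.max(x))))
    return kT * log_avg

# synthetic interaction-energy samples (kcal/mol), standing in for MD snapshots
rng = np.random.default_rng(7)
E_int = rng.normal(loc=-45.0, scale=2.5, size=5000)
print(f"-T*dS ~ {interaction_entropy_term(E_int):.2f} kcal/mol")
```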
Energy sustainability: consumption, efficiency, and ...
One of the critical challenges in achieving sustainability is finding a way to meet the energy consumption needs of a growing population in the face of increasing economic prosperity and finite resources. According to ecological footprint computations, global resource consumption began exceeding planetary supply in 1977, and by 2030 global energy demand, population, and gross domestic product are projected to greatly increase over 1977 levels. With the aim of finding sustainable energy solutions, we present a simple yet rigorous procedure for assessing and counterbalancing the relationship between energy demand, environmental impact, population, GDP, and energy efficiency. Our analyses indicated that infeasible increases in energy efficiency (over 100%) would be required by 2030 to return to 1977 environmental impact levels, and that annual reductions (2 and 3%) in energy demand resulted in physically possible yet impractical requirements; hence, a combination of policy and technology approaches is needed to tackle this critical challenge. This work emphasizes the difficulty of moving toward energy sustainability and helps to frame possible solutions useful for policy and management. Based on projected energy consumption, environmental impact, human population, gross domestic product (GDP), and energy efficiency, for this study we explore the increase in energy-use efficiency and the decrease in energy-use intensity required to achieve sustainable environmental impact levels.
Design of telehealth trials--introducing adaptive approaches.
Law, Lisa M; Wason, James M S
2014-12-01
The field of telehealth and telemedicine is expanding as the need to improve efficiency of health care becomes more pressing. The decision to implement a telehealth system is generally an expensive undertaking that impacts a large number of patients and other stakeholders. It is therefore extremely important that the decision is fully supported by accurate evaluation of telehealth interventions. Numerous reviews of telehealth have described the evidence base as inconsistent. In response they call for larger, more rigorously controlled trials, and trials which go beyond evaluation of clinical effectiveness alone. The aim of this paper is to discuss various ways in which evaluation of telehealth could be improved by the use of adaptive trial designs. We discuss various adaptive design options, such as sample size reviews and changing the study hypothesis to address uncertain parameters, group sequential trials and multi-arm multi-stage trials to improve efficiency, and enrichment designs to maximise the chances of obtaining clear evidence about the telehealth intervention. There is potential to address the flaws discussed in the telehealth literature through the adoption of adaptive approaches to trial design. Such designs could lead to improvements in efficiency, allow the evaluation of multiple telehealth interventions in a cost-effective way, or accurately assess a range of endpoints that are important in the overall success of a telehealth programme. Copyright © 2014 The Authors. Published by Elsevier Ireland Ltd.. All rights reserved.
NASA Astrophysics Data System (ADS)
Zhang, George Z.; Myers, Kyle J.; Park, Subok
2013-03-01
Digital breast tomosynthesis (DBT) has shown promise for improving the detection of breast cancer, but it has not yet been fully optimized due to a large space of system parameters to explore. A task-based statistical approach [1] is a rigorous method for evaluating and optimizing this promising imaging technique with the use of optimal observers such as the Hotelling observer (HO). However, the high data dimensionality found in DBT has been the bottleneck for the use of a task-based approach in DBT evaluation. To reduce data dimensionality while extracting salient information for performing a given task, efficient channels have to be used for the HO. In the past few years, 2D Laguerre-Gauss (LG) channels, which are a complete basis for stationary backgrounds and rotationally symmetric signals, have been utilized for DBT evaluation [2, 3]. But since background and signal statistics from DBT data are neither stationary nor rotationally symmetric, LG channels may not be efficient in providing reliable performance trends as a function of system parameters. Recently, partial least squares (PLS) has been shown to generate efficient channels for the Hotelling observer in detection tasks involving random backgrounds and signals [4]. In this study, we investigate the use of PLS as a method for extracting salient information from DBT in order to better evaluate such systems.
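Once a channel matrix is chosen (whether LG or PLS channels), the channelized Hotelling observer reduces to a small linear-algebra exercise. A minimal sketch follows; the array shapes are assumptions for illustration, not the authors' code:

import numpy as np

def cho_snr(signal_absent, signal_present, U):
    """Channelized Hotelling observer detectability.
    signal_absent/present: (n_images, n_pixels) arrays of ROI data.
    U: (n_pixels, n_channels) channel matrix (e.g., LG or PLS channels)."""
    v0 = signal_absent @ U                         # channel outputs, signal-absent class
    v1 = signal_present @ U                        # channel outputs, signal-present class
    dv = v1.mean(axis=0) - v0.mean(axis=0)         # mean channelized signal
    S = 0.5 * (np.cov(v0, rowvar=False) + np.cov(v1, rowvar=False))
    w = np.linalg.solve(S, dv)                     # Hotelling template in channel space
    return float(np.sqrt(dv @ w))                  # observer SNR (detectability index)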
NASA Astrophysics Data System (ADS)
Gholizadeh, Hamed
Photosynthesis in aquatic and terrestrial ecosystems is the key component of the food chain and the most important driver of the global carbon cycle. Therefore, estimation of photosynthesis at large spatial scales is of great scientific importance and can only practically be achieved by remote sensing data and techniques. In this dissertation, remotely sensed information and techniques, as well as field measurements, are used to improve current approaches of assessing photosynthetic processes. More specifically, three topics are the focus here: (1) investigating the application of spectral vegetation indices as proxies for terrestrial chlorophyll in a mangrove ecosystem, (2) evaluating and improving one of the most common empirical ocean-color algorithms (OC4), and (3) developing an improved approach based on sunlit-to-shaded scaled photochemical reflectance index (sPRI) ratios for detecting drought signals in a deciduous forest in the eastern United States. The results indicated that although the green normalized difference vegetation index (GNDVI) is an efficient proxy for terrestrial chlorophyll content, there are opportunities to improve the performance of vegetation indices by optimizing the band weights. With regard to the second topic, we concluded that the parameters of the OC4 algorithm and similar empirical models should be tuned regionally and that the addition of sea-surface temperature improves the validity of global ocean-color approaches. Results obtained from the third topic showed that considering shaded and sunlit portions of the canopy (i.e., two-leaf models instead of single big-leaf models) and taking into account the divergent stomatal behavior of the species (i.e., isohydric and anisohydric) can improve the capability of sPRI in detecting drought. In addition to investigating the photosynthetic processes, the other common theme of the three research topics is the evaluation of "off-the-shelf" solutions to remote-sensing problems. Although widely used approaches such as the normalized difference vegetation index (NDVI) are easy to apply and are often efficient choices in remote sensing applications, the use of these approaches should be justified and their shortcomings need to be considered in the context of the research application. When developing new remote sensing approaches, special attention should be paid to (1) initial data analysis such as statistical data transformations (e.g., Tukey ladder-of-powers transformation) and (2) rigorous validation design by creating separate training and validation data sets, preferably using both field measurements and satellite-based data. Developing a sound approach and applying a rigorous validation methodology go hand in hand. In sum, all approaches have advantages and disadvantages, or, as George Box puts it, "all models are wrong but some are useful".
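As a concrete reminder of the indices mentioned above, GNDVI simply substitutes the green band for the red band used in NDVI. A minimal sketch, where the band arrays are assumed to be reflectance values:

import numpy as np

def gndvi(nir, green):
    """Green NDVI from near-infrared and green reflectance arrays."""
    nir, green = np.asarray(nir, float), np.asarray(green, float)
    return (nir - green) / (nir + green + 1e-12)   # small epsilon avoids divide-by-zero

def ndvi(nir, red):
    """Standard NDVI, shown for comparison with GNDVI."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red + 1e-12)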
Identifying Vulnerabilities and Hardening Attack Graphs for Networked Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saha, Sudip; Vullinati, Anil K.; Halappanavar, Mahantesh
We investigate efficient security control methods for protecting against vulnerabilities in networked systems. A large number of interdependent vulnerabilities typically exist in the computing nodes of a cyber-system; as vulnerabilities get exploited, starting from low level ones, they open up the doors to more critical vulnerabilities. These cannot be understood just by a topological analysis of the network, and we use the attack graph abstraction of Dewri et al. to study these problems. In contrast to earlier approaches based on heuristics and evolutionary algorithms, we study rigorous methods for quantifying the inherent vulnerability and hardening cost for the system. We develop algorithms with provable approximation guarantees, and evaluate them for real and synthetic attack graphs.
Consistent Chemical Mechanism from Collaborative Data Processing
Slavinskaya, Nadezda; Starcke, Jan-Hendrik; Abbasi, Mehdi; ...
2016-04-01
The numerical toolset of the Process Informatics Model (PrIMe) is a mathematically rigorous and numerically efficient approach for analysis and optimization of chemical systems. It handles heterogeneous data and is scalable to a large number of parameters. The Bound-to-Bound Data Collaboration module of the automated data-centric infrastructure of PrIMe was used for systematic uncertainty and data consistency analyses of the H2/CO reaction model (73/17) and 94 experimental targets (ignition delay times). An empirical rule for evaluating the shock-tube experimental data is proposed. The initial results demonstrate clear benefits of the PrIMe methods for evaluating kinetic data quality and consistency and for developing predictive kinetic models.
Public Health Surveillance Systems: Recent Advances in Their Use and Evaluation.
Groseclose, Samuel L; Buckeridge, David L
2017-03-20
Surveillance is critical for improving population health. Public health surveillance systems generate information that drives action, and the data must be of sufficient quality and with a resolution and timeliness that matches objectives. In the context of scientific advances in public health surveillance, changing health care and public health environments, and rapidly evolving technologies, the aim of this article is to review public health surveillance systems. We consider their current use to increase the efficiency and effectiveness of the public health system, the role of system stakeholders, the analysis and interpretation of surveillance data, approaches to system monitoring and evaluation, and opportunities for future advances in terms of increased scientific rigor, outcomes-focused research, and health informatics.
Efficiency bounds for nonequilibrium heat engines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mehta, Pankaj; Polkovnikov, Anatoli, E-mail: asp@bu.edu
2013-05-15
We analyze the efficiency of thermal engines (either quantum or classical) working with a single heat reservoir such as an atmosphere. The engine first gets an energy intake, which can occur in an arbitrary nonequilibrium way, e.g., combustion of fuel. Then the engine performs the work and returns to the initial state. We distinguish two general classes of engines: those where the working body first equilibrates within itself and then performs the work (ergodic engines) and those where it performs the work before equilibrating (non-ergodic engines). We show that in both cases the second law of thermodynamics limits their efficiency. For ergodic engines we find a rigorous upper bound for the efficiency, which is strictly smaller than the equivalent Carnot efficiency. That is, the Carnot efficiency can never be achieved in single-reservoir heat engines. For non-ergodic engines the efficiency can be higher and can exceed the equilibrium Carnot bound. By extending the fundamental thermodynamic relation to nonequilibrium processes, we find a rigorous thermodynamic bound for the efficiency of both ergodic and non-ergodic engines and show that it is given by the relative entropy of the nonequilibrium and initial equilibrium distributions. These results suggest a new general strategy for designing more efficient engines. We illustrate our ideas by using simple examples. -- Highlights: ► Derived efficiency bounds for heat engines working with a single reservoir. ► Analyzed both ergodic and non-ergodic engines. ► Showed that non-ergodic engines can be more efficient. ► Extended fundamental thermodynamic relation to arbitrary nonequilibrium processes.
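The structure of such a bound can be stated compactly. The relation below is a standard nonequilibrium free-energy statement of the kind invoked here, written in our own notation and offered only as an illustration of how a relative entropy controls the extractable work (and hence the efficiency once divided by the energy intake); it is not claimed to be the paper's exact result.

% A standard nonequilibrium free-energy relation (notation is ours):
\begin{align}
  W_{\max} &= F(\rho) - F(\rho_{\mathrm{eq}})
            = k_B T \, D\!\left(\rho \,\middle\|\, \rho_{\mathrm{eq}}\right), &
  D(\rho \,\|\, \sigma) &= \operatorname{Tr}\!\left[\rho\left(\ln\rho - \ln\sigma\right)\right],
\end{align}
% where D reduces to \int \rho \ln(\rho/\sigma) for classical distributions.
% Dividing the maximum extractable work by the energy intake then yields an
% efficiency bound governed by this relative entropy.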
Rahbar, Mohammad H.; Wyatt, Gwen; Sikorskii, Alla; Victorson, David; Ardjomand-Hessabi, Manouchehr
2011-01-01
Background: Multisite randomized clinical trials allow for increased research collaboration among investigators and expedite data collection efforts. As a result, government funding agencies typically look favorably upon this approach. As the field of complementary and alternative medicine (CAM) continues to evolve, so do increased calls for the use of more rigorous study design and trial methodologies, which can present challenges for investigators. Purpose: To describe the processes involved in the coordination and management of a multisite randomized clinical trial of a CAM intervention. Methods: Key aspects related to the coordination and management of a multisite CAM randomized clinical trial are presented, including organizational and site selection considerations, recruitment concerns and issues related to data collection and randomization to treatment groups. Management and monitoring of data, as well as quality assurance procedures are described. Finally, a real world perspective is shared from a recently conducted multisite randomized clinical trial of reflexology for women diagnosed with advanced breast cancer. Results: The use of multiple sites in the conduct of CAM-based randomized clinical trials can provide an efficient, collaborative and robust approach to study coordination and data collection that maximizes efficiency and ensures the quality of results. Conclusions: Multisite randomized clinical trial designs can offer the field of CAM research a more standardized and efficient approach to examine the effectiveness of novel therapies and treatments. Special attention must be given to intervention fidelity, consistent data collection and ensuring data quality. Assessment and reporting of quantitative indicators of data quality should be required. PMID:21664296
NASA Technical Reports Server (NTRS)
Huck, Friedrich O.; Fales, Carl L.
1990-01-01
Researchers are concerned with the end-to-end performance of image gathering, coding, and processing. The applications range from high-resolution television to vision-based robotics, wherever the resolution, efficiency and robustness of visual information acquisition and processing are critical. For the presentation at this workshop, it is convenient to divide research activities into the following two overlapping areas: The first is the development of focal-plane processing techniques and technology to effectively combine image gathering with coding, with an emphasis on low-level vision processing akin to the retinal processing in human vision. The approach includes the familiar Laplacian pyramid, the new intensity-dependent spatial summation, and parallel sensing/processing networks. Three-dimensional image gathering is attained by combining laser ranging with sensor-array imaging. The second is the rigorous extension of information theory and optimal filtering to visual information acquisition and processing. The goal is to provide a comprehensive methodology for quantitatively assessing the end-to-end performance of image gathering, coding, and processing.
Large Deviations for Nonlocal Stochastic Neural Fields
2014-01-01
We study the effect of additive noise on integro-differential neural field equations. In particular, we analyze an Amari-type model driven by a Q-Wiener process, and focus on noise-induced transitions and escape. We argue that proving a sharp Kramers’ law for neural fields poses substantial difficulties, but that one may transfer techniques from stochastic partial differential equations to establish a large deviation principle (LDP). Then we demonstrate that an efficient finite-dimensional approximation of the stochastic neural field equation can be achieved using a Galerkin method and that the resulting finite-dimensional rate function for the LDP can have a multiscale structure in certain cases. These results form the starting point for an efficient practical computation of the LDP. Our approach also provides the technical basis for further rigorous study of noise-induced transitions in neural fields based on Galerkin approximations. Mathematics Subject Classification (2000): 60F10, 60H15, 65M60, 92C20. PMID:24742297
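To make the Galerkin idea concrete, here is a toy spectral (Fourier) discretization of an Amari-type field on a ring with additive smoothed noise standing in for a Q-Wiener process; the kernel, firing-rate function, and parameters are our own illustrative choices, not the paper's setup:

import numpy as np

def simulate_neural_field(n=128, steps=2000, dt=1e-3, sigma=0.05, seed=0):
    """Euler-Maruyama simulation of an Amari-type field du/dt = -u + w*f(u) + noise
    on a ring, discretized with a Fourier (Galerkin) basis; toy kernel and nonlinearity."""
    rng = np.random.default_rng(seed)
    x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    w = np.cos(x)                                   # toy lateral-connectivity kernel
    w_hat = np.fft.fft(w) * (2.0 * np.pi / n)       # kernel in Fourier space
    k = np.fft.fftfreq(n) * n                       # integer wavenumbers
    noise_filter = np.exp(-np.abs(k) / 10.0)        # spatially smooth noise (Q-Wiener surrogate)
    f = lambda u: 1.0 / (1.0 + np.exp(-5.0 * (u - 0.5)))   # sigmoidal firing rate
    u = 0.1 * rng.standard_normal(n)
    for _ in range(steps):
        conv = np.fft.ifft(w_hat * np.fft.fft(f(u))).real   # nonlocal coupling term
        xi = np.fft.ifft(noise_filter * np.fft.fft(rng.standard_normal(n))).real
        u = u + dt * (-u + conv) + sigma * np.sqrt(dt) * xi
    return x, u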
Beam splitting of low-contrast binary gratings under second Bragg angle incidence.
Zheng, Jiangjun; Zhou, Changhe; Wang, Bo; Feng, Jijun
2008-05-01
Beam splitting of low-contrast rectangular gratings under second Bragg angle incidence is studied. The grating period is between λ and 2λ. The diffraction behaviors of the three transmitted propagating orders are illustrated by analyzing the first three propagating grating modes. From a simplified modal approach, the design conditions of gratings as a high-efficiency element with most of its energy concentrated in the -2nd transmitted order (~90%) and of gratings as a 1 x 2 beam splitter with a total efficiency over 90% are derived. Verification of the grating parameters for achieving exactly this splitting pattern by rigorous coupled-wave analysis confirmed the design method. A 1 x 3 beam splitter is also demonstrated. Moreover, the polarization-dependent diffraction behaviors are investigated, which suggest the possibility of designing polarization-selective elements under such a configuration. The proposed concept of using the second Bragg angle should be helpful for developing new grating-based devices.
OLED emission zone measurement with high accuracy
NASA Astrophysics Data System (ADS)
Danz, N.; MacCiarnain, R.; Michaelis, D.; Wehlus, T.; Rausch, A. F.; Wächter, C. A.; Reusch, T. C. G.
2013-09-01
Highly efficient, state-of-the-art organic light-emitting diodes (OLEDs) comprise thin emitting layers with thicknesses on the order of 10 nm. The spatial distribution of the photon generation rate, i.e. the profile of the emission zone, inside these layers is of interest for both device efficiency analysis and characterization of charge recombination processes. It can be accessed experimentally by reverse simulation of far-field emission pattern measurements. Such a far-field pattern is the sum of individual emission patterns associated with the corresponding positions inside the active layer. Based on rigorous electromagnetic theory, the relation between far-field pattern and emission zone is modeled as a linear problem. This enables a mathematical analysis to be applied to the cases of single and double emitting layers in the OLED stack as well as to pattern measurements in air or inside the substrate. From the results, guidelines for optimum emitter-cathode separation and for selecting the best experimental approach are obtained. Limits for the maximum spatial resolution can be derived.
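Because the measured far-field pattern is modeled as a linear superposition of per-position emission patterns, recovering the emission zone amounts to a constrained linear inversion. A minimal sketch, assuming the per-position basis patterns have already been computed from an optical model (the inputs are hypothetical):

import numpy as np
from scipy.optimize import nnls

def recover_emission_zone(far_field, basis_patterns, reg=1e-3):
    """Estimate the emission-zone profile rho (one weight per emitter position)
    from a measured far-field pattern modeled as far_field ~ basis_patterns @ rho.
    basis_patterns: (n_angles, n_positions) simulated single-position patterns.
    A small Tikhonov term stabilizes the non-negative least-squares fit."""
    n_angles, n_pos = basis_patterns.shape
    A = np.vstack([basis_patterns, np.sqrt(reg) * np.eye(n_pos)])
    b = np.concatenate([far_field, np.zeros(n_pos)])
    rho, _ = nnls(A, b)                          # emission profiles are physically non-negative
    return rho / max(rho.sum(), 1e-12)           # normalize to a relative photon-generation profile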
The MR-Base platform supports systematic causal inference across the human phenome
Wade, Kaitlin H; Haberland, Valeriia; Baird, Denis; Laurin, Charles; Burgess, Stephen; Bowden, Jack; Langdon, Ryan; Tan, Vanessa Y; Yarmolinsky, James; Shihab, Hashem A; Timpson, Nicholas J; Evans, David M; Relton, Caroline; Martin, Richard M; Davey Smith, George
2018-01-01
Results from genome-wide association studies (GWAS) can be used to infer causal relationships between phenotypes, using a strategy known as 2-sample Mendelian randomization (2SMR) and bypassing the need for individual-level data. However, 2SMR methods are evolving rapidly and GWAS results are often insufficiently curated, undermining efficient implementation of the approach. We therefore developed MR-Base (http://www.mrbase.org): a platform that integrates a curated database of complete GWAS results (no restrictions according to statistical significance) with an application programming interface, web app and R packages that automate 2SMR. The software includes several sensitivity analyses for assessing the impact of horizontal pleiotropy and other violations of assumptions. The database currently comprises 11 billion single nucleotide polymorphism-trait associations from 1673 GWAS and is updated on a regular basis. Integrating data with software ensures more rigorous application of hypothesis-driven analyses and allows millions of potential causal relationships to be efficiently evaluated in phenome-wide association studies. PMID:29846171
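The core 2SMR calculation that such a platform automates is simple once the per-SNP summary statistics are harmonized. Below is a minimal sketch of the standard fixed-effect inverse-variance-weighted estimator; it is illustrative Python, not the MR-Base R packages themselves:

import numpy as np

def ivw_mr(beta_exp, beta_out, se_out):
    """Inverse-variance-weighted 2-sample MR estimate from per-SNP
    exposure effects (beta_exp) and outcome effects (beta_out, se_out)."""
    beta_exp, beta_out, se_out = map(np.asarray, (beta_exp, beta_out, se_out))
    ratio = beta_out / beta_exp                   # per-SNP Wald ratio estimates
    weight = (beta_exp / se_out) ** 2             # first-order inverse-variance weights
    est = np.sum(weight * ratio) / np.sum(weight)
    se = 1.0 / np.sqrt(np.sum(weight))
    return est, se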
When Assessment Data Are Words: Validity Evidence for Qualitative Educational Assessments.
Cook, David A; Kuper, Ayelet; Hatala, Rose; Ginsburg, Shiphra
2016-10-01
Quantitative scores fail to capture all important features of learner performance. This awareness has led to increased use of qualitative data when assessing health professionals. Yet the use of qualitative assessments is hampered by incomplete understanding of their role in forming judgments, and lack of consensus in how to appraise the rigor of judgments therein derived. The authors articulate the role of qualitative assessment as part of a comprehensive program of assessment, and translate the concept of validity to apply to judgments arising from qualitative assessments. They first identify standards for rigor in qualitative research, and then use two contemporary assessment validity frameworks to reorganize these standards for application to qualitative assessment. Standards for rigor in qualitative research include responsiveness, reflexivity, purposive sampling, thick description, triangulation, transparency, and transferability. These standards can be reframed using Messick's five sources of validity evidence (content, response process, internal structure, relationships with other variables, and consequences) and Kane's four inferences in validation (scoring, generalization, extrapolation, and implications). Evidence can be collected and evaluated for each evidence source or inference. The authors illustrate this approach using published research on learning portfolios. The authors advocate a "methods-neutral" approach to assessment, in which a clearly stated purpose determines the nature of and approach to data collection and analysis. Increased use of qualitative assessments will necessitate more rigorous judgments of the defensibility (validity) of inferences and decisions. Evidence should be strategically sought to inform a coherent validity argument.
Mobile mental health: a challenging research agenda.
Olff, Miranda
2015-01-01
The field of mobile health ("m-Health") is evolving rapidly and there is an explosive growth of psychological tools on the market. Exciting high-tech developments may identify symptoms, help individuals manage their own mental health, encourage help seeking, and provide both preventive and therapeutic interventions. This development has the potential to be an efficient, cost-effective approach that reduces waiting lists and serves a considerable portion of people globally ("g-Health"). However, few of the mobile applications (apps) have been rigorously evaluated. There is little information on how valid screening and assessment tools are, which of the mobile intervention apps are effective, or how well mobile apps compare to face-to-face treatments. But how feasible is rigorous scientific evaluation with the rising demands from policy makers, business partners, and users for their quick release? In this paper, developments in m-Health tools targeting screening, assessment, prevention, and treatment are reviewed, with examples from the field of trauma and posttraumatic stress disorder. The academic challenges in developing and evaluating m-Health tools are addressed. Evidence-based guidance is needed on appropriate research designs that may overcome some of the public and ethical challenges (e.g., equity, availability) and the market-driven wish to have mobile apps in the "App Store" yesterday rather than tomorrow.
Clinical decision making-a functional medicine perspective.
Pizzorno, Joseph E
2012-09-01
As 21st century health care moves from a disease-based approach to a more patient-centric system that can address biochemical individuality to improve health and function, clinical decision making becomes more complex. Accentuating the problem is the lack of a clear standard for this more complex functional medicine approach. While there is relatively broad agreement in Western medicine for what constitutes competent assessment of disease and identification of related treatment approaches, the complex functional medicine model posits multiple and individualized diagnostic and therapeutic approaches, most or many of which have reasonable underlying science and principles, but which have not been rigorously tested in a research or clinical setting. This has led to non-rigorous thinking and sometimes to uncritical acceptance of both poorly documented diagnostic procedures and ineffective therapies, resulting in less than optimal clinical care.
Principles to Products: Toward Realizing MOS 2.0
NASA Technical Reports Server (NTRS)
Bindschadler, Duane L.; Delp, Christopher L.
2012-01-01
This is a report on the Operations Revitalization Initiative, part of the ongoing NASA-funded Advanced Multi-Mission Operations Systems (AMMOS) program. We are implementing products that significantly improve efficiency and effectiveness of Mission Operations Systems (MOS) for deep-space missions. We take a multi-mission approach, in keeping with our organization's charter to "provide multi-mission tools and services that enable mission customers to operate at a lower total cost to NASA." Focusing first on architectural fundamentals of the MOS, we review the effort's progress. In particular, we note the use of stakeholder interactions and consideration of past lessons learned to motivate a set of Principles that guide the evolution of the AMMOS. Thus guided, we have created essential patterns and connections (detailed in companion papers) that are explicitly modeled and support elaboration at multiple levels of detail (system, sub-system, element...) throughout a MOS. This architecture is realized in design and implementation products that provide lifecycle support to a Mission at the system and subsystem level. The products include adaptable multi-mission engineering documentation that describes essentials such as operational concepts and scenarios, requirements, interfaces and agreements, information models, and mission operations processes. Because we have adopted a model-based system engineering method, these documents and their contents are meaningfully related to one another and to the system model. This means they are both more rigorous and reusable (from mission to mission) than standard system engineering products. The use of models also enables detailed, early (e.g., formulation phase) insight into the impact of changes (e.g., to interfaces or to software) that is rigorous and complete, allowing better decisions on cost or technical trades. Finally, our work provides clear and rigorous specification of operations needs to software developers, further enabling significant gains in productivity.
An exergy approach to efficiency evaluation of desalination
NASA Astrophysics Data System (ADS)
Ng, Kim Choon; Shahzad, Muhammad Wakil; Son, Hyuk Soo; Hamed, Osman A.
2017-05-01
This paper presents an evaluation of process efficiency based on the consumption of primary energy for all types of practical desalination methods available hitherto. The conventional performance ratio has, thus far, been defined with respect to the consumption of derived energy, such as electricity or steam, which is susceptible to the conversion losses of the power plants and boilers that burn the input primary fuels. As derived energies are usually expressed in units of kWh or joules, these units cannot accurately differentiate the grade of energy supplied to the processes. In this paper, the specific energy consumption is revisited to assess the efficacy of all large-scale desalination plants. In today's combined production of electricity and desalinated water, accomplished with an advanced cogeneration concept, the input exergy of fuels is utilized optimally and efficiently in a temperature-cascaded manner. By discerning the exergy destruction successively in the turbines and desalination processes, the relative contribution of primary energy to the processes can be accurately apportioned. Although efficiency itself is not a law of thermodynamics, a common platform for expressing figures of merit specific to the efficacy of desalination processes can be developed meaningfully, one with thermodynamic rigor up to the ideal (thermodynamic) limit of seawater desalination for all scientists and engineers to aspire to.
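The apportionment idea can be illustrated with a back-of-the-envelope conversion of derived-energy inputs into a common primary-energy figure using an exergy (quality) weighting. The conversion efficiency and temperatures below are placeholders, and this is a sketch of the concept rather than the paper's formal exergy-destruction accounting:

# Illustrative conversion of derived-energy inputs (electricity, low-grade steam)
# into a common primary-energy figure via exergy weighting (placeholder values).
def primary_energy_kwh_per_m3(elec_kwh_m3, steam_kwh_m3,
                              eta_powerplant=0.45,
                              t_steam_K=343.0, t_ambient_K=303.0):
    exergy_factor = 1.0 - t_ambient_K / t_steam_K        # quality of the thermal input
    primary_elec = elec_kwh_m3 / eta_powerplant          # fuel energy behind the electricity
    primary_steam = steam_kwh_m3 * exergy_factor / eta_powerplant
    return primary_elec + primary_steam

# e.g., a hypothetical thermal plant drawing 1.5 kWh_e and 60 kWh_th per cubic meter
print(primary_energy_kwh_per_m3(1.5, 60.0))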
Ganger, Michael T; Dietz, Geoffrey D; Ewing, Sarah J
2017-12-01
qPCR has established itself as the technique of choice for the quantification of gene expression. Procedures for conducting qPCR have received significant attention; however, more rigorous approaches to the statistical analysis of qPCR data are needed. Here we develop a mathematical model, termed the Common Base Method, for analysis of qPCR data based on threshold cycle values (Cq) and efficiencies of reactions (E). The Common Base Method keeps all calculations in the log scale as long as possible by working with log10(E)·Cq, which we call the efficiency-weighted Cq value; subsequent statistical analyses are then applied in the log scale. We show how efficiency-weighted Cq values may be analyzed using a simple paired or unpaired experimental design and develop blocking methods to help reduce unexplained variation. The Common Base Method has several advantages. It allows for the incorporation of well-specific efficiencies and multiple reference genes. The method does not necessitate the pairing of samples that must be performed using traditional analysis methods in order to calculate relative expression ratios. Our method is also simple enough to be implemented in any spreadsheet or statistical software without additional scripts or proprietary components.
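To illustrate the quantity the method works with, the sketch below forms efficiency-weighted Cq values and differences them in log10 space to obtain a reference-normalized expression ratio. It is a minimal illustration consistent with the description above, not the authors' spreadsheet implementation, and the replicate numbers are made up:

import numpy as np

def weighted_cq(e, cq):
    """Efficiency-weighted Cq value: log10(E) * Cq (kept in the log10 scale)."""
    return np.log10(np.asarray(e, float)) * np.asarray(cq, float)

def log10_expression_ratio(goi_ctrl, goi_trt, ref_ctrl, ref_trt):
    """log10 relative expression (treated vs. control) for a gene of interest,
    normalized to a reference gene; each argument is an array of
    efficiency-weighted Cq values from replicate reactions."""
    d_goi = np.mean(goi_ctrl) - np.mean(goi_trt)    # fewer cycles -> more template
    d_ref = np.mean(ref_ctrl) - np.mean(ref_trt)
    return d_goi - d_ref                            # still in log10 units

# hypothetical replicate data: per-well efficiencies and Cq values
goi_ctrl = weighted_cq([1.95, 1.93, 1.96], [24.1, 24.3, 24.0])
goi_trt  = weighted_cq([1.94, 1.95, 1.92], [22.0, 22.2, 21.9])
ref_ctrl = weighted_cq([1.98, 1.97, 1.99], [18.5, 18.4, 18.6])
ref_trt  = weighted_cq([1.97, 1.98, 1.96], [18.6, 18.5, 18.7])
print(10 ** log10_expression_ratio(goi_ctrl, goi_trt, ref_ctrl, ref_trt))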
n-D shape/texture optimal synthetic description and modeling by GEOGINE
NASA Astrophysics Data System (ADS)
Fiorini, Rodolfo A.; Dacquino, Gianfranco F.
2004-12-01
GEOGINE (GEOmetrical enGINE), a state-of-the-art OMG (Ontological Model Generator) based on n-D Tensor Invariants for multidimensional shape/texture optimal synthetic description and learning, is presented. Robust characterization of elementary geometric shapes subjected to geometric transformations, on a rigorous mathematical level, is a key problem in many computer applications in different areas of interest. The past four decades have seen solutions based almost entirely on the use of n-Dimensional Moment and Fourier descriptor invariants. The present paper introduces a new approach for automatic model generation based on n-Dimensional Tensor Invariants as a formal dictionary. An ontological model is the kernel used for specifying ontologies, so that how close an ontology can be to the real world depends on the possibilities offered by the ontological model. By this approach, even chromatic information content can be easily and reliably decoupled from target geometric information and computed into robust colour shape parameter attributes. The main GEOGINE operational advantages over previous approaches are: 1) Automated Model Generation, 2) Invariant Minimal Complete Set for computational efficiency, 3) Arbitrary Model Precision for robust object description.
[Sustainable process improvement with application of 'lean philosophy'].
Rouppe van der Voort, Marc B V; van Merode, G G Frits; Veraart, Henricus G N
2013-01-01
Process improvement is increasingly being implemented, particularly with the aid of 'lean philosophy'. This management philosophy aims to improve quality by reducing 'wastage'. Local improvements can produce negative effects elsewhere due to interdependence of processes. An 'integrated system approach' is required to prevent this. Some hospitals claim that this has been successful. Research into process improvement with the application of lean philosophy has reported many positive effects, defined as improved safety, quality and efficiency. Due to methodological shortcomings and lack of rigorous evaluations it is, however, not yet possible to determine the impact of this approach. It is, however, obvious that the investigated applications are fragmentary, with a dominant focus on the instrumental aspect of the philosophy and a lack of integration in a total system, and with insufficient attention to human aspects. Process improvement is required to achieve better and more goal-oriented healthcare. To achieve this, hospitals must develop integrated system approaches that combine methods for process design with continuous improvement of processes and with personnel management. It is crucial that doctors take the initiative to guide and improve processes in an integral manner.
Efficient numerical method for analyzing optical bistability in photonic crystal microcavities.
Yuan, Lijun; Lu, Ya Yan
2013-05-20
Nonlinear optical effects can be enhanced by photonic crystal microcavities and be used to develop practical ultra-compact optical devices with low power requirements. The finite-difference time-domain method is the standard numerical method for simulating nonlinear optical devices, but it has limitations in terms of accuracy and efficiency. In this paper, a rigorous and efficient frequency-domain numerical method is developed for analyzing nonlinear optical devices where the nonlinear effect is concentrated in the microcavities. The method replaces the linear problem outside the microcavities by a rigorous and numerically computed boundary condition, then solves the nonlinear problem iteratively in a small region around the microcavities. Convergence of the iterative method is much easier to achieve since the size of the problem is significantly reduced. The method is presented for a specific two-dimensional photonic crystal waveguide-cavity system with a Kerr nonlinearity, using numerical methods that can take advantage of the geometric features of the structure. The method is able to calculate multiple solutions exhibiting the optical bistability phenomenon in the strongly nonlinear regime.
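As an illustration of why the nonlinear problem admits multiple solutions (the bistability referred to above), consider a toy temporal coupled-mode model of a Kerr cavity. This conveys the flavor of the multi-solution solve; it is our own illustrative model, not the paper's boundary-condition formulation:

import numpy as np

def kerr_cavity_solutions(p_in, delta=-2.0, kappa=1.0, g=1.0):
    """Steady-state intracavity intensities u = |a|^2 of a toy Kerr cavity
    (temporal coupled-mode model):  u * ((delta + g*u)**2 + kappa**2) = kappa * p_in.
    Returns all physical (real, non-negative) roots; more than one root signals bistability."""
    # cubic in u:  g^2 u^3 + 2 g delta u^2 + (delta^2 + kappa^2) u - kappa p_in = 0
    coeffs = [g**2, 2.0 * g * delta, delta**2 + kappa**2, -kappa * p_in]
    roots = np.roots(coeffs)
    return sorted(r.real for r in roots if abs(r.imag) < 1e-9 and r.real >= 0.0)

for p in (0.5, 1.9, 6.0):                       # scan input power (arbitrary units)
    print(p, kerr_cavity_solutions(p))          # middle power yields three coexisting states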
Rigorous Electromagnetic Analysis of the Focusing Action of Refractive Cylindrical Microlens
NASA Astrophysics Data System (ADS)
Liu, Juan; Gu, Ben-Yuan; Dong, Bi-Zhen; Yang, Guo-Zhen
The focusing action of refractive cylindrical microlens is investigated based on the rigorous electromagnetic theory with the use of the boundary element method. The focusing behaviors of these refractive microlenses with continuous and multilevel surface-envelope are characterized in terms of total electric-field patterns, the electric-field intensity distributions on the focal plane, and their diffractive efficiencies at the focal spots. The obtained results are also compared with the ones obtained by Kirchhoff's scalar diffraction theory. The present numerical and graphical results may provide useful information for the analysis and design of refractive elements in micro-optics.
Non-adiabatic molecular dynamics by accelerated semiclassical Monte Carlo
White, Alexander J.; Gorshkov, Vyacheslav N.; Tretiak, Sergei; ...
2015-07-07
Non-adiabatic dynamics, where systems non-radiatively transition between electronic states, plays a crucial role in many photo-physical processes, such as fluorescence, phosphorescence, and photoisomerization. Methods for the simulation of non-adiabatic dynamics are typically either numerically impractical, highly complex, or based on approximations which can result in failure for even simple systems. Recently, the Semiclassical Monte Carlo (SCMC) approach was developed in an attempt to combine the accuracy of rigorous semiclassical methods with the efficiency and simplicity of widely used surface hopping methods. However, while SCMC was found to be more efficient than other semiclassical methods, it is not yet as efficient as is needed for use with large molecular systems. Here, we have developed two new methods: the accelerated-SCMC and the accelerated-SCMC with re-Gaussianization, which reduce the cost of the SCMC algorithm by up to two orders of magnitude for certain systems. In many cases shown here, the new procedures are nearly as efficient as the commonly used surface hopping schemes, with little to no loss of accuracy. This implies that these modified SCMC algorithms will provide practical numerical solutions for simulating non-adiabatic dynamics in realistic molecular systems.
Rigorous diffraction analysis using geometrical theory of diffraction for future mask technology
NASA Astrophysics Data System (ADS)
Chua, Gek S.; Tay, Cho J.; Quan, Chenggen; Lin, Qunying
2004-05-01
Advanced lithographic techniques such as phase shift masks (PSM) and optical proximity correction (OPC) result in a more complex mask design and technology. In contrast to binary masks, which have only transparent and nontransparent regions, phase shift masks also include transparent features with a different optical thickness and hence a modified phase of the transmitted light. PSM are well known to show prominent diffraction effects, which cannot be described by the assumption of an infinitely thin mask (Kirchhoff approach) that is used in many commercial photolithography simulators. Correct prediction of sidelobe printability, process windows, and linearity of OPC masks requires the application of rigorous diffraction theory. The problem of aerial image intensity imbalance through focus with alternating phase shift masks (altPSMs) is analyzed and compared between a time-domain finite-difference (TDFD) algorithm (TEMPEST) and the geometrical theory of diffraction (GTD). Using GTD, with the solutions to the canonical problems, we obtained a relationship between an edge on the mask and the disturbance in image space. The main interest is to develop useful formulations that can be readily applied to solve rigorous diffraction problems for future mask technology. Analysis of rigorous diffraction effects for altPSMs using the GTD approach will be discussed.
Community, state, and federal approaches to conventional and cumulative risk assessment (CRA) were described and compared to assess similarities and differences, and develop recommendations for a consistent CRA approach, acceptable across each level as a rigorous scientific metho...
Doshi, Urmi; Hamelberg, Donald
2015-05-01
Accelerated molecular dynamics (aMD) has been proven to be a powerful biasing method for enhanced sampling of biomolecular conformations on general-purpose computational platforms. Biologically important long timescale events that are beyond the reach of standard molecular dynamics can be accessed without losing the detailed atomistic description of the system in aMD. Over other biasing methods, aMD offers the advantages of tuning the level of acceleration to access the desired timescale without any advance knowledge of the reaction coordinate. Recent advances in the implementation of aMD and its applications to small peptides and biological macromolecules are reviewed here along with a brief account of all the aMD variants introduced in the last decade. In comparison to the original implementation of aMD, the recent variant in which all the rotatable dihedral angles are accelerated (RaMD) exhibits faster convergence rates and significant improvement in statistical accuracy of retrieved thermodynamic properties. RaMD in conjunction with accelerating diffusive degrees of freedom, i.e. dual boosting, has been rigorously tested for the most difficult conformational sampling problem, protein folding. It has been shown that RaMD with dual boosting is capable of efficiently sampling multiple folding and unfolding events in small fast folding proteins. RaMD with the dual boost approach opens exciting possibilities for sampling multiple timescales in biomolecules. While equilibrium properties can be recovered satisfactorily from aMD-based methods, directly obtaining dynamics and kinetic rates for larger systems presents a future challenge. This article is part of a Special Issue entitled Recent developments of molecular dynamics. Copyright © 2014 Elsevier B.V. All rights reserved.
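For orientation, the original aMD bias takes a simple closed form: whenever the potential drops below a threshold E, a boost ΔV = (E - V)²/(α + E - V) is added, and canonical averages are recovered by exponential reweighting. A minimal sketch of that bookkeeping follows (generic aMD form; the threshold and α are simulation-specific inputs, and this is not the RaMD implementation itself):

import numpy as np

def amd_boost(v, e_threshold, alpha):
    """Standard aMD boost potential, applied only where the potential energy v
    lies below the threshold E; energies in kcal/mol."""
    v = np.asarray(v, float)
    return np.where(v < e_threshold,
                    (e_threshold - v) ** 2 / (alpha + e_threshold - v),
                    0.0)

def reweight(observable, dv, temperature=300.0):
    """Exponentially reweight an observable sampled on the boosted surface back
    to the canonical ensemble (simple exponential-average estimator)."""
    kT = 0.0019872041 * temperature            # Boltzmann constant, kcal/(mol*K)
    w = np.exp(np.asarray(dv, float) / kT)
    return np.sum(w * np.asarray(observable, float)) / np.sum(w)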
Fast Exact Search in Hamming Space With Multi-Index Hashing.
Norouzi, Mohammad; Punjani, Ali; Fleet, David J
2014-06-01
There is growing interest in representing image data and feature descriptors using compact binary codes for fast near neighbor search. Although binary codes are motivated by their use as direct indices (addresses) into a hash table, codes longer than 32 bits are not being used as such, as it was thought to be ineffective. We introduce a rigorous way to build multiple hash tables on binary code substrings that enables exact k-nearest neighbor search in Hamming space. The approach is storage efficient and straight-forward to implement. Theoretical analysis shows that the algorithm exhibits sub-linear run-time behavior for uniformly distributed codes. Empirical results show dramatic speedups over a linear scan baseline for datasets of up to one billion codes of 64, 128, or 256 bits.
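The pigeonhole argument behind the method: if two codes differ in at most r bits and each is split into m disjoint substrings, then at least one pair of corresponding substrings differs in at most floor(r/m) bits, so candidates can be gathered from per-substring hash tables and then verified. A minimal sketch (illustrative, not the authors' optimized implementation):

from collections import defaultdict
from itertools import combinations

def split(code, m, bits):
    """Split an integer binary code into m disjoint substrings."""
    step = bits // m
    mask = (1 << step) - 1
    return [(code >> (i * step)) & mask for i in range(m)]

class MultiIndexHash:
    """Exact r-neighbor search in Hamming space via substring hash tables."""
    def __init__(self, codes, bits=64, m=4):
        self.codes, self.bits, self.m = list(codes), bits, m
        self.tables = [defaultdict(list) for _ in range(m)]
        for idx, c in enumerate(self.codes):
            for t, sub in zip(self.tables, split(c, m, bits)):
                t[sub].append(idx)

    def _flips(self, sub, radius, step):
        """All substrings within Hamming distance `radius` of `sub`."""
        yield sub
        for d in range(1, radius + 1):
            for pos in combinations(range(step), d):
                x = sub
                for p in pos:
                    x ^= (1 << p)
                yield x

    def query(self, q, r):
        """Return indices of all indexed codes within Hamming distance r of q."""
        step = self.bits // self.m
        cand = set()
        for t, sub in zip(self.tables, split(q, self.m, self.bits)):
            for s in self._flips(sub, r // self.m, step):    # pigeonhole radius
                cand.update(t.get(s, ()))
        return [i for i in cand if bin(self.codes[i] ^ q).count("1") <= r]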
NASA Astrophysics Data System (ADS)
Bansal, Dipanshu; Aref, Amjad; Dargush, Gary; Delaire, Olivier
2016-09-01
Based on thermodynamic principles, we derive expressions quantifying the non-harmonic vibrational behavior of materials, which are rigorous yet easily evaluated from experimentally available data for the thermal expansion coefficient and the phonon density of states. These experimentally-derived quantities are valuable to benchmark first-principles theoretical predictions of harmonic and non-harmonic thermal behaviors using perturbation theory, ab initio molecular-dynamics, or Monte-Carlo simulations. We illustrate this analysis by computing the harmonic, dilational, and anharmonic contributions to the entropy, internal energy, and free energy of elemental aluminum and the ordered compound FeSi over a wide range of temperature. Results agree well with previous data in the literature and provide an efficient approach to estimate anharmonic effects in materials.
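One experimentally accessible ingredient in such a decomposition is the harmonic (quasiharmonic) vibrational entropy computed from the measured phonon density of states. The following is the standard expression, written in our own notation and given only as orientation, not necessarily the paper's exact formulation:

% Vibrational entropy from a phonon density of states g(\varepsilon)
% normalized to unity (standard quasiharmonic expression; notation is ours):
\begin{equation}
  S_{\mathrm{vib}}(T) = 3 k_B \int_0^{\infty} g(\varepsilon)
    \left[ (n_\varepsilon + 1)\ln(n_\varepsilon + 1)
           - n_\varepsilon \ln n_\varepsilon \right] d\varepsilon ,
  \qquad
  n_\varepsilon = \frac{1}{e^{\varepsilon/k_B T} - 1} .
\end{equation}
% The dilational contribution follows from the thermal expansion coefficient
% and elastic data; the remainder of the measured total is then attributed
% to explicit anharmonicity.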
Fundamental limit of nanophotonic light trapping in solar cells.
Yu, Zongfu; Raman, Aaswath; Fan, Shanhui
2010-10-12
Establishing the fundamental limit of nanophotonic light-trapping schemes is of paramount importance and is becoming increasingly urgent for current solar cell research. The standard theory of light trapping demonstrated that absorption enhancement in a medium cannot exceed a factor of 4n²/sin²θ, where n is the refractive index of the active layer, and θ is the angle of the emission cone in the medium surrounding the cell. This theory, however, is not applicable in the nanophotonic regime. Here we develop a statistical temporal coupled-mode theory of light trapping based on a rigorous electromagnetic approach. Our theory reveals that the conventional limit can be substantially surpassed when optical modes exhibit deep-subwavelength-scale field confinement, opening new avenues for highly efficient next-generation solar cells.
2013-06-01
The American Society of Clinical Oncology's (ASCO's) new conflict of interest policy reflects a commitment to transparency and independence in the development and presentation of scientific and educational content. ASCO supports thorough and accessible disclosure of financial relationships with companies at institutional and individual levels and calls for rigorous evaluation of content in light of the information disclosed. For abstracts and articles presenting original research, ASCO holds first, last, and corresponding authors to a clear standard of independence. In imposing restrictions, the new policy focuses on the role of these authors rather than of the principal investigator(s) as in the previous policy. ASCO remains actively engaged with the broader scientific community in seeking and implementing efficient, effective approaches to conflict of interest management.
Oral feeding readiness assessment in premature infants.
Gennattasio, Annmarie; Perri, Elizabeth A; Baranek, Donna; Rohan, Annie
2015-01-01
Oral feeding readiness is a complex concept. More evidence is needed on how to approach beginning oral feedings in premature hospitalized infants. This article provides a review of literature related to oral feeding readiness in the premature infant and strategies for promoting safe and efficient progression to full oral intake. Oral feeding readiness assessment tools, clinical pathways, and feeding advancement protocols have been developed to assist with oral feeding initiation and progression. Recognition and support of oral feeding readiness may decrease length of hospital stay and have a positive impact on reducing healthcare costs. Supporting effective cue-based oral feeding through use of rigorous assessment or evidence-based care guidelines can also optimize the hospital experience for infants and caregivers, which, in turn, can promote attachment and parent satisfaction.
NASA Astrophysics Data System (ADS)
Tuckerman, Mark
2006-03-01
One of the computational grand challenge problems is to develop methodology capable of sampling conformational equilibria in systems with rough energy landscapes. If met, many important problems, most notably protein folding, could be significantly impacted. In this talk, two new approaches for addressing this problem will be presented. First, it will be shown how molecular dynamics can be combined with a novel variable transformation designed to warp configuration space in such a way that barriers are reduced and attractive basins stretched. This method rigorously preserves equilibrium properties while leading to very large enhancements in sampling efficiency. Extensions of this approach to the calculation/exploration of free energy surfaces will be discussed. Next, a new very large time-step molecular dynamics method will be introduced that overcomes the resonances which plague many molecular dynamics algorithms. The performance of the methods is demonstrated on a variety of systems including liquid water, long polymer chains, simple protein models, and oligopeptides.
Phases, phase equilibria, and phase rules in low-dimensional systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frolov, T., E-mail: timfrol@berkeley.edu; Mishin, Y., E-mail: ymishin@gmu.edu
2015-07-28
We present a unified approach to thermodynamic description of one, two, and three dimensional phases and phase transformations among them. The approach is based on a rigorous definition of a phase applicable to thermodynamic systems of any dimensionality. Within this approach, the same thermodynamic formalism can be applied for the description of phase transformations in bulk systems, interfaces, and line defects separating interface phases. For both lines and interfaces, we rigorously derive an adsorption equation, the phase coexistence equations, and other thermodynamic relations expressed in terms of generalized line and interface excess quantities. As a generalization of the Gibbs phase rule for bulk phases, we derive phase rules for lines and interfaces and predict the maximum number of phases that may coexist in systems of the respective dimensionality.
Structural efficiency studies of corrugated compression panels with curved caps and beaded webs
NASA Technical Reports Server (NTRS)
Davis, R. C.; Mills, C. T.; Prabhakaran, R.; Jackson, L. R.
1984-01-01
Curved cross-sectional elements are employed in structural concepts for minimum-mass compression panels. Corrugated panel concepts with curved caps and beaded webs are optimized by using a nonlinear mathematical programming procedure and a rigorous buckling analysis. These panel geometries are shown to have superior structural efficiencies compared with known concepts published in the literature. Fabrication of these efficient corrugation concepts became possible through advances in the art of superplastic forming of metals. Results of the mass optimization studies of the concepts are presented as structural efficiency charts for axial compression.
Kim, Minseok; Eleftheriades, George V
2016-10-15
We propose a highly efficient (nearly lossless and impedance-matched) all-dielectric optical tensor impedance metasurface that mimics chiral effects at optical wavelengths. By cascading an array of rotated crossed silicon nanoblocks, we realize chiral optical tensor impedance metasurfaces that operate as circular polarization selective surfaces. Their efficiencies are maximized through a nonlinear numerical optimization process in which the tensor impedance metasurfaces are modeled via multi-conductor transmission line theory. From rigorous full-wave simulations that include all material losses, we show field transmission efficiencies of 94% for right- and left-handed circular polarization selective surfaces at 800 nm.
Beyond Composite Scores and Cronbach's Alpha: Advancing Methodological Rigor in Recreation Research
ERIC Educational Resources Information Center
Gagnon, Ryan J.; Stone, Garrett A.; Garst, Barry A.
2017-01-01
Critically examining common statistical approaches and their strengths and weaknesses is an important step in advancing recreation and leisure sciences. To continue this critical examination and to inform methodological decision making, this study compared three approaches to determine how alternative approaches may result in contradictory…
EVALUATION OF THE COLD PIPE PRECHARGER
The article gives results of an evaluation of the performance of the cold pipe precharger, taking a more rigorous approach than had been previously taken. The approach required detailed descriptions of electrical characteristics, electro-hydrodynamics, and charging theory. The co...
NASA Astrophysics Data System (ADS)
Tibuleac, Sorin
In this dissertation, new reflection and transmission filters are developed and characterized in the optical and microwave spectral regions. These guided-mode resonance (GMR) filters are implemented by integrating diffraction gratings into classical thin-film multilayers to produce high efficiency filter response and low sidebands extended over a large spectral range. Diffraction from phase-shifted gratings and gratings with different periods is analyzed using rigorous coupled-wave theory yielding a new approach to filter linewidth broadening, line-shaping, and multi-line filters at normal incidence. New single-grating transmission filters presented have narrow linewidth, high peak transmittance, and low sideband reflectance. A comparison with classical thin-film filters shows that GMR devices require significantly fewer layers to obtain narrow linewidth and high peak response. All-dielectric microwave frequency- selective surfaces operating in reflection or transmission are shown to be realizable with only a few layers using common microwave materials. Single-layer and multilayer waveguide gratings operating as reflection and transmission filters, respectively, were built and tested in the 4-20 GHz frequency range. The presence of GMR notches and peaks is clearly established by the experimental results, and their spectral location and lineshape found to be in excellent agreement with the theoretical predictions. A new computer program using genetic algorithms and rigorous coupled-wave analysis was developed for optimization of multilayer structures containing homogeneous and diffractive layers. This program was utilized to find GMR filters possessing features not previously known. Thus, numerous examples of transmission filters with peaks approaching 100%, narrow linewidths (~0.03%), and low sidebands have been found in structures containing only 1-3 layers. A new type of GMR device integrating a waveguide grating with subwavelength period on the endface of an optical fiber is developed for high-resolution biomedical or chemical sensors and spectral filtering applications. Diffraction gratings with submicron periods exhibiting high efficiencies have been recorded for the first time on coated and uncoated endfaces of single-mode and multimode fibers. Guided-mode resonance transmittance notches of ~18% were experimentally obtained with structures consisting of photoresist gratings on thin films of Si3N4 deposited on optical fiber endfaces.
Rigorous coupled wave analysis of acousto-optics with relativistic considerations.
Xia, Guoqiang; Zheng, Weijian; Lei, Zhenggang; Zhang, Ruolan
2015-09-01
A relativistic analysis of acousto-optics is presented, and a rigorous coupled wave analysis is generalized for the diffraction of the acousto-optical effect. An acoustic wave generates a grating with temporally and spatially modulated permittivity, hindering direct application of the rigorous coupled wave analysis to the acousto-optical effect. In a reference frame which moves with the acoustic wave, the grating is static, the medium moves, and the coupled wave equations for the static grating may be derived. Floquet's theorem is then applied to cast these equations into an eigenproblem. Using a Lorentz transformation, the electromagnetic fields in the grating region are transformed to the lab frame where the medium is at rest, and relativistic Doppler frequency shifts are introduced into the various diffraction orders. In the lab frame, the boundary conditions are considered and the diffraction efficiencies of the various orders are determined. This method is rigorous and general, and the plane waves in the resulting expansion satisfy the dispersion relation of the medium and are propagation modes. Properties of various Bragg diffractions are results, rather than preconditions, of this method. Simulations of an acousto-optical tunable filter made of paratellurite, TeO2, are given as examples.
Accelerating Biomedical Discoveries through Rigor and Transparency.
Hewitt, Judith A; Brown, Liliana L; Murphy, Stephanie J; Grieder, Franziska; Silberberg, Shai D
2017-07-01
Difficulties in reproducing published research findings have garnered a lot of press in recent years. As a funder of biomedical research, the National Institutes of Health (NIH) has taken measures to address underlying causes of low reproducibility. Extensive deliberations resulted in a policy, released in 2015, to enhance reproducibility through rigor and transparency. We briefly explain what led to the policy, describe its elements, provide examples and resources for the biomedical research community, and discuss the potential impact of the policy on translatability with a focus on research using animal models. Importantly, while increased attention to rigor and transparency may lead to an increase in the number of laboratory animals used in the near term, it will lead to more efficient and productive use of such resources in the long run. The translational value of animal studies will be improved through more rigorous assessment of experimental variables and data, leading to better assessments of the translational potential of animal models, for the benefit of the research community and society. Published by Oxford University Press on behalf of the Institute for Laboratory Animal Research 2017. This work is written by (a) US Government employee(s) and is in the public domain in the US.
Tailoring Systems Engineering Projects for Small Satellite Missions
NASA Technical Reports Server (NTRS)
Horan, Stephen; Belvin, Keith
2013-01-01
NASA maintains excellence in its spaceflight systems by utilizing rigorous engineering processes based on over 50 years of experience. The NASA systems engineering process for flight projects described in NPR 7120.5E was initially developed for major flight projects. The design and development of low-cost small satellite systems does not entail the financial and risk consequences traditionally associated with spaceflight projects. Consequently, an approach is offered to tailoring of the processes such that the small satellite missions will benefit from the engineering rigor without overly burdensome overhead. In this paper we will outline the approaches to tailoring the standard processes for these small missions and describe how it will be applied in a proposed small satellite mission.
A Rigorous Framework for Optimization of Expensive Functions by Surrogates
NASA Technical Reports Server (NTRS)
Booker, Andrew J.; Dennis, J. E., Jr.; Frank, Paul D.; Serafini, David B.; Torczon, Virginia; Trosset, Michael W.
1998-01-01
The goal of the research reported here is to develop rigorous optimization algorithms to apply to some engineering design problems for which direct application of traditional optimization approaches is not practical. This paper presents and analyzes a framework for generating a sequence of approximations to the objective function and managing the use of these approximations as surrogates for optimization. The result is to obtain convergence to a minimizer of an expensive objective function subject to simple constraints. The approach is widely applicable because it does not require, or even explicitly approximate, derivatives of the objective. Numerical results are presented for a 31-variable helicopter rotor blade design example and for a standard optimization test example.
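As a toy illustration of the surrogate-management idea (fit a cheap approximation to all evaluations so far, optimize the approximation, evaluate the expensive function at the suggested point, and refit), here is a one-dimensional sketch. It deliberately omits the trust-region and pattern-search machinery that a rigorous framework uses to guarantee convergence; the test function and parameters are placeholders:

import numpy as np

def surrogate_optimize(expensive_f, bounds, n_init=5, n_iter=15, seed=0):
    """Toy surrogate-managed search in 1-D: fit a polynomial surrogate to all
    evaluations so far, minimize it on a dense grid, then evaluate the true
    (expensive) function at that candidate and refit."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, n_init)
    y = np.array([expensive_f(v) for v in x])
    grid = np.linspace(lo, hi, 2001)
    for _ in range(n_iter):
        deg = min(5, len(x) - 1)
        coeffs = np.polyfit(x, y, deg)             # cheap polynomial surrogate
        candidate = grid[np.argmin(np.polyval(coeffs, grid))]
        x = np.append(x, candidate)                # true evaluation at the surrogate minimizer
        y = np.append(y, expensive_f(candidate))
    best = np.argmin(y)
    return x[best], y[best]

# usage with a stand-in "expensive" function
print(surrogate_optimize(lambda v: (v - 1.3) ** 2 + np.sin(5 * v), bounds=(-2.0, 3.0)))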
Herbal medicine development: a plea for a rigorous scientific foundation.
Lietman, Paul S
2012-09-01
Science, including rigorous basic scientific research and rigorous clinical research, must underlie both the development and the clinical use of herbal medicines. Yet almost none of the hundreds or thousands of articles published each year on some aspect of herbal medicines adheres to three simple but profound scientific principles that must underlie all herbal drug development and clinical use. Three fundamental principles should underlie everyone's thinking about the development and/or clinical use of any herbal medicine. (1) There must be standardization and regulation (rigorously enforced) of the product being studied or being used clinically. (2) There must be scientific proof of a beneficial clinical effect for something of value to the patient, established by rigorous clinical research. (3) There must be scientific proof of safety (acceptable toxicity) for the patient, established by rigorous clinical research. These fundamental principles of science have ramifications for both the scientist and the clinician. It is critically important that both the investigator and the prescriber know exactly what is in the studied or recommended product and how effective and toxic it is. We will find new and useful drugs from natural sources. However, we will have to learn how to study herbal medicines rigorously, and we will have to try to convince the believers in herbal medicines of the wisdom and even the necessity of a rigorous scientific approach to herbal medicine development. Both biomedical science and practicing physicians must enthusiastically accept the responsibility for searching for truth in the discovery and development of new herbal medicines, in the truthful teaching about herbal medicines from a scientific perspective, and in the scientifically proven clinical use of herbal medicines.
Mohammad, Nabil; Wang, Peng; Friedman, Daniel J.; ...
2014-09-17
We report the enhancement of photovoltaic output power by separating the incident spectrum into 3 bands and concentrating these bands onto 3 different photovoltaic cells. The spectrum-splitting and concentration is achieved via a thin, planar micro-optical element that demonstrates high optical efficiency over the entire spectrum of interest. The optic (which we call a polychromat) was designed using a modified version of the direct-binary-search algorithm. The polychromat was fabricated using grayscale lithography. Rigorous optical characterization demonstrates excellent agreement with simulation results. Electrical characterization of the solar cells made from GaInP, GaAs and Si indicates an increase in the peak output power density of 43.63%, 30.84% and 30.86%, respectively, when compared to normal operation without the polychromat. This represents an overall increase of 35.52% in output power density. As a result, the potential for cost-effective large-area manufacturing and for high system efficiencies makes our approach a strong candidate for low-cost solar power.
Tidball, Andrew M; Dang, Louis T; Glenn, Trevor W; Kilbane, Emma G; Klarr, Daniel J; Margolis, Joshua L; Uhler, Michael D; Parent, Jack M
2017-09-12
Specifically ablating genes in human induced pluripotent stem cells (iPSCs) allows for studies of gene function as well as disease mechanisms in disorders caused by loss-of-function (LOF) mutations. While techniques exist for engineering such lines, we have developed and rigorously validated a method of simultaneous iPSC reprogramming while generating CRISPR/Cas9-dependent insertions/deletions (indels). This approach allows for the efficient and rapid formation of genetic LOF human disease cell models with isogenic controls. The rate of mutagenized lines was strikingly consistent across experiments targeting four different human epileptic encephalopathy genes and a metabolic enzyme-encoding gene, and was more efficient and consistent than using CRISPR gene editing of established iPSC lines. The ability of our streamlined method to reproducibly generate heterozygous and homozygous LOF iPSC lines with passage-matched isogenic controls in a single step provides for the rapid development of LOF disease models with ideal control lines, even in the absence of patient tissue. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Xing, F.; Masson, R.; Lopez, S.
2017-09-01
This paper introduces a new discrete fracture model accounting for non-isothermal compositional multiphase Darcy flows and complex networks of fractures with intersecting, immersed and non-immersed fractures. The so-called hybrid-dimensional model, using a 2D model in the fractures coupled with a 3D model in the matrix, is first derived rigorously starting from the equi-dimensional matrix-fracture model. Then, it is discretized using a fully implicit time integration combined with the Vertex Approximate Gradient (VAG) finite volume scheme, which is adapted to polyhedral meshes and anisotropic heterogeneous media. The fully coupled systems are assembled and solved in parallel using the Single Program Multiple Data (SPMD) paradigm with one layer of ghost cells. This strategy allows for a local assembly of the discrete systems. An efficient preconditioner is implemented to solve the linear systems at each time step and each Newton-type iteration of the simulation. The numerical efficiency of our approach is assessed on different meshes, fracture networks, and physical settings in terms of parallel scalability, nonlinear convergence and linear convergence.
Rosenfeld, Richard M; Shiffman, Richard N
2009-06-01
Guidelines translate best evidence into best practice. A well-crafted guideline promotes quality by reducing health-care variations, improving diagnostic accuracy, promoting effective therapy, and discouraging ineffective-or potentially harmful-interventions. Despite a plethora of published guidelines, methodology is often poorly defined and varies greatly within and among organizations. This manual describes the principles and practices used successfully by the American Academy of Otolaryngology-Head and Neck Surgery to produce quality-driven, evidence-based guidelines using efficient and transparent methodology for action-ready recommendations with multidisciplinary applicability. The development process, which allows moving from conception to completion in 12 months, emphasizes a logical sequence of key action statements supported by amplifying text, evidence profiles, and recommendation grades that link action to evidence. As clinical practice guidelines become more prominent as a key metric of quality health care, organizations must develop efficient production strategies that balance rigor and pragmatism. Equally important, clinicians must become savvy in understanding what guidelines are-and are not-and how they are best utilized to improve care. The information in this manual should help clinicians and organizations achieve these goals.
Turner, Tari; Green, Sally; Tovey, David; McDonald, Steve; Soares-Weiser, Karla; Pestridge, Charlotte; Elliott, Julian
2017-08-01
Producing high-quality, relevant systematic reviews and keeping them up to date is challenging. Cochrane is a leading provider of systematic reviews in health. For Cochrane to continue to contribute to improvements in health, Cochrane Reviews must be rigorous, reliable and up to date. We aimed to explore existing models of Cochrane Review production and emerging opportunities to improve the efficiency and sustainability of these processes. To inform discussions about how to best achieve this, we conducted 26 interviews and an online survey with 106 respondents. Respondents highlighted the importance and challenge of creating reliable, timely systematic reviews. They described the challenges and opportunities presented by current production models, and they shared what they are doing to improve review production. They particularly highlighted significant challenges with the increasing complexity of review methods, difficulty keeping authors on board and on track, and the length of time required to complete the process. Strong themes emerged about the roles of authors and Review Groups, the central actors in the review production process. The results suggest that improvements to Cochrane's systematic review production models could come from improving clarity of roles and expectations, ensuring continuity and consistency of input, enabling active management of the review process, centralising some review production steps, breaking reviews into smaller "chunks", and improving approaches to building capacity of and sharing information between authors and Review Groups. Respondents noted the important role new technologies have to play in enabling these improvements. The findings of this study will inform the development of new Cochrane Review production models and may provide valuable data for other systematic review producers as they consider how best to produce rigorous, reliable, up-to-date reviews.
Del Fiol, Guilherme; Michelson, Matthew; Iorio, Alfonso; Cotoi, Chris; Haynes, R Brian
2018-06-25
A major barrier to the practice of evidence-based medicine is efficiently finding scientifically sound studies on a given clinical topic. To investigate a deep learning approach to retrieve scientifically sound treatment studies from the biomedical literature. We trained a Convolutional Neural Network using a noisy dataset of 403,216 PubMed citations with title and abstract as features. The deep learning model was compared with state-of-the-art search filters, such as PubMed's Clinical Query Broad treatment filter, McMaster's textword search strategy (no Medical Subject Heading, MeSH, terms), and Clinical Query Balanced treatment filter. A previously annotated dataset (Clinical Hedges) was used as the gold standard. The deep learning model obtained significantly lower recall than the Clinical Queries Broad treatment filter (96.9% vs 98.4%; P<.001); and equivalent recall to McMaster's textword search (96.9% vs 97.1%; P=.57) and Clinical Queries Balanced filter (96.9% vs 97.0%; P=.63). Deep learning obtained significantly higher precision than the Clinical Queries Broad filter (34.6% vs 22.4%; P<.001) and McMaster's textword search (34.6% vs 11.8%; P<.001), but was significantly lower than the Clinical Queries Balanced filter (34.6% vs 40.9%; P<.001). Deep learning performed well compared to state-of-the-art search filters, especially when citations were not indexed. Unlike previous machine learning approaches, the proposed deep learning model does not require feature engineering, or time-sensitive or proprietary features, such as MeSH terms and bibliometrics. Deep learning is a promising approach to identifying reports of scientifically rigorous clinical research. Further work is needed to optimize the deep learning model and to assess generalizability to other areas, such as diagnosis, etiology, and prognosis. ©Guilherme Del Fiol, Matthew Michelson, Alfonso Iorio, Chris Cotoi, R Brian Haynes. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 25.06.2018.
Fish-Eye Observing with Phased Array Radio Telescopes
NASA Astrophysics Data System (ADS)
Wijnholds, S. J.
The radio astronomical community is currently developing and building several new radio telescopes based on phased array technology. These telescopes provide a large field of view that may in principle span a full hemisphere. This makes calibration and imaging very challenging tasks due to the complex source structures and direction-dependent radio wave propagation effects. In this thesis, calibration and imaging methods are developed based on least squares estimation of instrument and source parameters. Monte Carlo simulations and actual observations with several prototypes show that this model-based approach provides statistically and computationally efficient solutions. The error analysis provides a rigorous mathematical framework to assess the imaging performance of current and future radio telescopes in terms of the effective noise, which is the combined effect of propagated calibration errors, noise in the data and source confusion.
Diffraction efficiency calculations of polarization diffraction gratings with surface relief
NASA Astrophysics Data System (ADS)
Nazarova, D.; Sharlandjiev, P.; Berberova, N.; Blagoeva, B.; Stoykova, E.; Nedelchev, L.
2018-03-01
In this paper, we evaluate the optical response of a stack of two diffraction gratings of equal one-dimensional periodicity. The first one is a surface-relief grating structure; the second, a volume polarization grating. This model is based on our experimental results from polarization holographic recordings in azopolymer films. We used films of a commercially available azopolymer (poly[1-[4-(3-carboxy-4-hydroxyphenylazo) benzenesulfonamido]-1,2-ethanediyl, sodium salt]), denoted PAZO for short. During the recording process, a polarization grating in the volume of the material and a relief grating on the film surface are formed simultaneously. To evaluate numerically the optical response of this “hybrid” diffraction structure, we used the rigorous coupled-wave approach (RCWA). It yields stable numerical solutions of Maxwell’s vector equations using the algebraic eigenvalue method.
The importance of knowledge-based technology.
Cipriano, Pamela F
2012-01-01
Nurse executives are responsible for a workforce that can provide safer and more efficient care in a complex sociotechnical environment. National quality priorities rely on technologies to provide data collection, share information, and leverage analytic capabilities to interpret findings and inform approaches to care that will achieve better outcomes. As a key steward for quality, the nurse executive exercises leadership to provide the infrastructure to build and manage nursing knowledge and instill accountability for following evidence-based practices. These actions contribute to a learning health system where new knowledge is captured as a by-product of care delivery enabled by knowledge-based electronic systems. The learning health system also relies on rigorous scientific evidence embedded into practice at the point of care. The nurse executive optimizes use of knowledge-based technologies, integrated throughout the organization, that have the capacity to help transform health care.
Bayesian operational modal analysis with asynchronous data, part I: Most probable value
NASA Astrophysics Data System (ADS)
Zhu, Yi-Chen; Au, Siu-Kui
2018-01-01
In vibration tests, multiple sensors are used to obtain detailed mode shape information about the tested structure. Time synchronisation among data channels is required in conventional modal identification approaches. Modal identification can be more flexibly conducted if this is not required. Motivated by the potential gain in feasibility and economy, this work proposes a Bayesian frequency domain method for modal identification using asynchronous 'output-only' ambient data, i.e. 'operational modal analysis'. It provides a rigorous means for identifying the global mode shape taking into account the quality of the measured data and their asynchronous nature. This paper (Part I) proposes an efficient algorithm for determining the most probable values of modal properties. The method is validated using synthetic and laboratory data. The companion paper (Part II) investigates identification uncertainty and challenges in applications to field vibration data.
Bansal, Dipanshu; Aref, Amjad; Dargush, Gary; ...
2016-07-20
Based on thermodynamic principles, we derive expressions quantifying the non-harmonic vibrational behavior of materials, which are rigorous yet easily evaluated from experimentally available data for the thermal expansion coefficient and the phonon density of states. These experimentally-derived quantities are valuable to benchmark first-principles theoretical predictions of harmonic and non-harmonic thermal behaviors using perturbation theory, ab initio molecular-dynamics, or Monte-Carlo simulations. In this study, we illustrate this analysis by computing the harmonic, dilational, and anharmonic contributions to the entropy, internal energy, and free energy of elemental aluminum and the ordered compound FeSi over a wide range of temperature. Our results agree well with previous data in the literature and provide an efficient approach to estimate anharmonic effects in materials.
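As a hedged illustration of how such experimentally derived quantities are evaluated, the sketch below computes the harmonic vibrational entropy from a phonon density of states by numerical quadrature; the Debye-like DOS, energy grid, and units are assumptions for this sketch, not data from the study.

```python
# Minimal sketch: harmonic vibrational entropy from a normalized phonon DOS g(E),
# S_h(T) = 3 N kB * integral of g(E) [(n+1)ln(n+1) - n ln n] dE, n the Bose factor.
import numpy as np

kB = 0.08617333  # Boltzmann constant in meV/K

def harmonic_entropy(E, g, T, n_atoms=1):
    E, g = E[1:], g[1:]                  # drop E = 0 to avoid division by zero
    n = 1.0 / np.expm1(E / (kB * T))     # Bose-Einstein occupation
    integrand = g * ((n + 1.0) * np.log(n + 1.0) - n * np.log(n))
    return 3.0 * n_atoms * kB * np.trapz(integrand, E)   # meV/K per atom

# Synthetic Debye-like DOS up to 40 meV, normalized to unity (assumed, not measured).
E = np.linspace(0.0, 40.0, 401)
g = E ** 2
g /= np.trapz(g, E)

for T in (100.0, 300.0, 600.0):
    print(T, "K ->", harmonic_entropy(E, g, T), "meV/K per atom")
```

The dilational and anharmonic contributions discussed in the abstract require the thermal expansion coefficient and temperature-dependent DOS data, which are not reproduced here.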
Photonic jet reconstruction for particle refractive index measurement by digital in-line holography.
Sentis, Matthias P L; Onofri, Fabrice R A; Lamadie, Fabrice
2017-01-23
A new and computationally efficient approach is proposed for determining the refractive index of spherical and transparent particles, in addition to their size and 3D position, using digital in-line holography. The method is based on the localization of the maximum intensity position of the photonic jet with respect to the particle center retrieved from the back propagation of recorded holograms. Rigorous electromagnetic calculations and experimental results demonstrate that for liquid-liquid systems and droplets with a radius > 30 µm, a refractive index measurement with a resolution finer than 4 × 10⁻³ is achievable, revealing a significant potential for the use of this method to investigate multiphase flows. The resolution for solid or liquid particles in gas is expected to be lower but sufficient for the recognition of particle material.
New algorithms for processing time-series big EEG data within mobile health monitoring systems.
Serhani, Mohamed Adel; Menshawy, Mohamed El; Benharref, Abdelghani; Harous, Saad; Navaz, Alramzana Nujum
2017-10-01
Recent advances in miniature biomedical sensors, mobile smartphones, wireless communications, and distributed computing technologies provide promising techniques for developing mobile health systems. Such systems are capable of reliably monitoring epileptic seizures, which are classified as chronic diseases. Three challenging issues arise in this context with regard to the transformation, compression, storage, and visualization of the big data that result from continuous recording of epileptic seizures using mobile devices. In this paper, we address the above challenges by developing three new algorithms to process and analyze big electroencephalography data in a rigorous and efficient manner. The first algorithm transforms the standard European Data Format (EDF) into the standard JavaScript Object Notation (JSON) and compresses the transformed JSON data to decrease the size and time of the transfer process and to increase the network transfer rate. The second algorithm focuses on collecting and storing the compressed files generated by the transformation and compression algorithm. The collection process is performed on the fly after decompressing files. The third algorithm provides relevant real-time interaction with signal data by prospective users. It particularly features the following capabilities: visualization of single or multiple signal channels on a smartphone device and querying of data segments. We tested and evaluated the effectiveness of our approach through a software architecture model implementing a mobile health system to monitor epileptic seizures. The experimental findings from 45 experiments are promising and efficiently satisfy the approach's objectives at the price of linearity. Moreover, the size of compressed JSON files and transfer times are reduced by 10% and 20%, respectively, while the average total time is remarkably reduced by 67% across all performed experiments. Our approach successfully develops efficient algorithms in terms of processing time, memory usage, and energy consumption while maintaining high scalability of the proposed solution. Our approach efficiently supports data partitioning and parallelism relying on the MapReduce platform, which can help in monitoring and automatic detection of epileptic seizures. Copyright © 2017 Elsevier B.V. All rights reserved.
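A hedged sketch of the transform-and-compress step described above: a signal segment is serialized to JSON and gzip-compressed, and the size reduction is reported. The field names, sampling rate, and synthetic signal are placeholders, not the EDF schema or the paper's exact algorithms.

```python
# Toy transform-and-compress step: EEG segment -> JSON -> gzip, report size savings.
import gzip, json
import numpy as np

rng = np.random.default_rng(1)
fs = 256                                    # sampling rate in Hz (assumed)
segment = rng.normal(size=fs * 10)          # 10 s of synthetic single-channel EEG

record = {
    "channel": "Fp1",                       # hypothetical channel label
    "fs": fs,
    "samples": [round(float(v), 3) for v in segment],
}
raw_json = json.dumps(record).encode("utf-8")
compressed = gzip.compress(raw_json)

print("JSON size:      ", len(raw_json), "bytes")
print("gzipped size:   ", len(compressed), "bytes")
print("size reduction: ", round(100 * (1 - len(compressed) / len(raw_json)), 1), "%")
```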
Efficient shortest-path-tree computation in network routing based on pulse-coupled neural networks.
Qu, Hong; Yi, Zhang; Yang, Simon X
2013-06-01
Shortest path tree (SPT) computation is a critical issue for routers using link-state routing protocols, such as the most commonly used open shortest path first and intermediate system to intermediate system. Each router needs to recompute a new SPT rooted at itself whenever a change happens in the link state. Most commercial routers do this computation by deleting the current SPT and building a new one from scratch using static algorithms such as Dijkstra's algorithm. Such recomputation of an entire SPT is inefficient, which may consume a considerable amount of CPU time and result in a time delay in the network. Some dynamic updating methods using the information in the updated SPT have been proposed in recent years. However, there are still many limitations in those dynamic algorithms. In this paper, a new modified model of pulse-coupled neural networks (M-PCNNs) is proposed for the SPT computation. It is rigorously proved that the proposed model is capable of solving some optimization problems, such as the SPT. A static algorithm is proposed based on the M-PCNNs to compute the SPT efficiently for large-scale problems. In addition, a dynamic algorithm that makes use of the structure of the previously computed SPT is proposed, which significantly improves the efficiency of the algorithm. Simulation results demonstrate the effective and efficient performance of the proposed approach.
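For reference, the static baseline the abstract contrasts against is a full SPT recomputation with Dijkstra's algorithm; a minimal version on a toy graph is sketched below. The M-PCNN model itself is not reproduced here.

```python
# Classical static baseline: recompute the shortest-path tree (SPT) with Dijkstra.
import heapq

def dijkstra_spt(graph, root):
    """Return (distance, parent) maps forming the SPT rooted at `root`.
    `graph` maps node -> list of (neighbor, nonnegative link cost)."""
    dist = {root: 0.0}
    parent = {root: None}
    heap = [(0.0, root)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                      # stale heap entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], parent[v] = nd, u
                heapq.heappush(heap, (nd, v))
    return dist, parent

toy = {                                   # toy topology (assumed, not from the paper)
    "A": [("B", 2), ("C", 5)],
    "B": [("C", 1), ("D", 4)],
    "C": [("D", 1)],
    "D": [],
}
dist, parent = dijkstra_spt(toy, "A")
print(dist)    # {'A': 0.0, 'B': 2.0, 'C': 3.0, 'D': 4.0}
print(parent)  # {'A': None, 'B': 'A', 'C': 'B', 'D': 'C'}
```

The dynamic methods discussed in the abstract avoid repeating this full computation by reusing the structure of the previous SPT after a link-state change.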
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2014-05-02
All programs with the U.S. Department of Energy's (DOE) Office of Energy Efficiency and Renewable Energy (EERE) are required to undertake rigorous, objective peer review of their funded projects on a yearly basis in order to ensure and enhance the management, relevance, effectiveness, and productivity of those projects.
A New Approach to an Old Order.
ERIC Educational Resources Information Center
Rambhia, Sanjay
2002-01-01
Explains the difficulties middle school students face in algebra regarding the order of operations. Describes a more visual approach to teaching the order of operations so that students can better solve complex problems and be better prepared for the rigors of algebra. (YDS)
Agent-Centric Approach for Cybersecurity Decision-Support with Partial Observability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tipireddy, Ramakrishna; Chatterjee, Samrat; Paulson, Patrick R.
Generating automated cyber resilience policies for real-world settings is a challenging research problem that must account for uncertainties in system state over time and dynamics between attackers and defenders. In addition to understanding attacker and defender motives and tools, and identifying “relevant” system and attack data, it is also critical to develop rigorous mathematical formulations representing the defender’s decision-support problem under uncertainty. Game-theoretic approaches involving cyber resource allocation optimization with Markov decision processes (MDP) have been previously proposed in the literature. Moreover, advancements in reinforcement learning approaches have motivated the development of partially observable stochastic games (POSGs) in various multi-agent problem domains with partial information. Recent advances in cyber-system state space modeling have also generated interest in potential applicability of POSGs for cybersecurity. However, as is the case in strategic card games such as poker, research challenges using game-theoretic approaches for practical cyber defense applications include: 1) solving for equilibrium and designing efficient algorithms for large-scale, general problems; 2) establishing mathematical guarantees that equilibrium exists; 3) handling possible existence of multiple equilibria; and 4) exploitation of opponent weaknesses. Inspired by advances in solving strategic card games while acknowledging practical challenges associated with the use of game-theoretic approaches in cyber settings, this paper proposes an agent-centric approach for cybersecurity decision-support with partial system state observability.
The Tailoring of Traditional Systems Engineering for the Morpheus Project
NASA Technical Reports Server (NTRS)
Devolites, Jennifer L.; Hart, Jeremy J.
2013-01-01
NASA's Morpheus Project has developed and tested a prototype planetary lander capable of vertical takeoff and landing that is designed to serve as a testbed for advanced spacecraft technologies. The lander vehicle, propelled by a LOX/Methane engine and sized to carry a 500 kg payload to the lunar surface, provides a platform for bringing technologies from the laboratory into an integrated flight system at relatively low cost. From the beginning, one of the goals for the Morpheus Project was to streamline agency processes and practices. The Morpheus project accepted a challenge to tailor the traditional NASA systems engineering approach in a way that would be appropriate for a lower cost, rapid prototype engineering effort, but retain the essence of the guiding principles. The team has produced innovative ways to create an infrastructure and approach that would challenge existing systems engineering processes while still enabling successful implementation of the current Morpheus Project. This paper describes the tailored systems engineering approach for the Morpheus project, including the processes, tools, and amount of rigor employed over the project's multiple lifecycles since the project began in FY11. Lessons learned from these trials have the potential to be scaled up and improve efficiency on larger projects or programs.
NASA Astrophysics Data System (ADS)
Asplund, Erik; Klüner, Thorsten
2012-03-01
In this paper, control of open quantum systems with emphasis on the control of surface photochemical reactions is presented. A quantum system in a condensed phase undergoes strong dissipative processes. From a theoretical viewpoint, it is important to model such processes in a rigorous way. In this work, the description of open quantum systems is realized within the surrogate Hamiltonian approach [R. Baer and R. Kosloff, J. Chem. Phys. 106, 8862 (1997)], 10.1063/1.473950. An efficient and accurate method to find control fields is optimal control theory (OCT) [W. Zhu, J. Botina, and H. Rabitz, J. Chem. Phys. 108, 1953 (1998), 10.1063/1.475576; Y. Ohtsuki, G. Turinici, and H. Rabitz, J. Chem. Phys. 120, 5509 (2004)], 10.1063/1.1650297. To gain control of open quantum systems, the surrogate Hamiltonian approach and OCT, with time-dependent targets, are combined. Three open quantum systems are investigated by the combined method, a harmonic oscillator immersed in an ohmic bath, CO adsorbed on a platinum surface, and NO adsorbed on a nickel oxide surface. Throughout this paper, atomic units, i.e., ℏ = me = e = a0 = 1, have been used unless otherwise stated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Guohong; Liu, Yong; Li, Baojun
2015-06-07
We investigate experimentally and theoretically the influence of electron transport layer (ETL) thickness on properties of typical N,N′-diphenyl-N,N′-bis(1-naphthyl)-[1,1′-biphenyl]-4,4′-diamine (NPB)/tris-(8-hydroxyquinoline) aluminum (Alq3) heterojunction based organic light-emitting diodes (OLEDs), where the thickness of ETL is varied to adjust the distance between the emitting zone and the metal electrode. The devices showed a maximum current efficiency of 3.8 cd/A when the ETL thickness is around 50 nm, corresponding to an emitter-cathode distance of 80 nm, and a second maximum current efficiency of 2.6 cd/A when the ETL thickness is around 210 nm, corresponding to an emitter-cathode distance of 240 nm. We adopt a rigorous electromagnetic approach that takes parameters such as dipole orientation, polarization, light emitting angle, exciton recombination zone, and diffusion length into account to model the optical properties of devices as a function of varying ETL thickness. Our simulation results are accurately consistent with the experimental results with a widely varying thickness of ETL, indicating that the theoretical model may be helpful to design high efficiency OLEDs.
76 FR 17654 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-30
... OMB Review; Comment Request Title: Evaluation of Adolescent Pregnancy Prevention Approaches-- First... as part of the Evaluation of Adolescent Pregnancy Prevention Approaches (PPA). PPA is a random assignment evaluation designed to result in rigorous evidence on effective ways to reduce teen pregnancy. The...
Quasi-experimental study designs series-paper 6: risk of bias assessment.
Waddington, Hugh; Aloe, Ariel M; Becker, Betsy Jane; Djimeu, Eric W; Hombrados, Jorge Garcia; Tugwell, Peter; Wells, George; Reeves, Barney
2017-09-01
Rigorous and transparent bias assessment is a core component of high-quality systematic reviews. We assess modifications to existing risk of bias approaches to incorporate rigorous quasi-experimental approaches with selection on unobservables. These are nonrandomized studies using design-based approaches to control for unobservable sources of confounding such as difference studies, instrumental variables, interrupted time series, natural experiments, and regression-discontinuity designs. We review existing risk of bias tools. Drawing on these tools, we present domains of bias and suggest directions for evaluation questions. The review suggests that existing risk of bias tools provide, to different degrees, incomplete transparent criteria to assess the validity of these designs. The paper then presents an approach to evaluating the internal validity of quasi-experiments with selection on unobservables. We conclude that tools for nonrandomized studies of interventions need to be further developed to incorporate evaluation questions for quasi-experiments with selection on unobservables. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Hidayat, D.; Nurlaelah, E.; Dahlan, J. A.
2017-09-01
Mathematical creative thinking and critical thinking are two abilities that need to be developed in the learning of mathematics. Therefore, efforts need to be made in the design of learning that is capable of developing both abilities. The purpose of this research is to examine the mathematical creative and critical thinking abilities of students who receive a rigorous mathematical thinking (RMT) approach and students who receive an expository approach. This research was a quasi-experiment with a control group pretest-posttest design. The population was all 11th-grade students in one of the senior high schools in Bandung. The results showed that the achievement of mathematical creative and critical thinking abilities of students who receive RMT is better than that of students who receive the expository approach. The use of psychological tools and mediation, with the criteria of intentionality, reciprocity, and mediation of meaning in RMT, helps students develop their critical and creative processes. This achievement contributes to the development of integrated learning design for students’ critical and creative thinking processes.
NASA Technical Reports Server (NTRS)
Chen, Wei; Tsui, Kwok-Leung; Allen, Janet K.; Mistree, Farrokh
1994-01-01
In this paper we introduce a comprehensive and rigorous robust design procedure to overcome some limitations of the current approaches. A comprehensive approach is general enough to model the two major types of robust design applications, namely, robust design associated with the minimization of the deviation of performance caused by the deviation of noise factors (uncontrollable parameters), and robust design due to the minimization of the deviation of performance caused by the deviation of control factors (design variables). We achieve mathematical rigor by using, as a foundation, principles from the design of experiments and optimization. Specifically, we integrate the Response Surface Method (RSM) with the compromise Decision Support Problem (DSP). Our approach is especially useful for design problems where there are no closed-form solutions and system performance is computationally expensive to evaluate. The design of a solar powered irrigation system is used as an example. Our focus in this paper is on illustrating our approach rather than on the results per se.
Learning Strategy Instruction Innovation Configuration
ERIC Educational Resources Information Center
Schumaker, Jean B.
2011-01-01
One way of helping students with learning disabilities and other struggling students to be independent life-long learners is to teach them how to use learning strategies in efficient ways. Learning strategy instruction can provide students the opportunity to succeed in today's schools and meet rigorous standards, transforming ineffective learners…
Reporting Guidelines: Optimal Use in Preventive Medicine and Public Health
Popham, Karyn; Calo, William A.; Carpentier, Melissa Y.; Chen, Naomi E.; Kamrudin, Samira A.; Le, Yen-Chi L.; Skala, Katherine A.; Thornton, Logan R.; Mullen, Patricia Dolan
2012-01-01
Numerous reporting guidelines are available to help authors write higher quality manuscripts more efficiently. Almost 200 are listed on the EQUATOR (Enhancing the Quality and Transparency of Health Research) Network’s website and they vary in authority, usability, and breadth, making it difficult to decide which one(s) to use. This paper provides consistent information about guidelines for preventive medicine and public health and a framework and sequential approach for selecting them. EQUATOR guidelines were reviewed for relevance to target audiences; selected guidelines were classified as “core” (frequently recommended) or specialized, and the latter were grouped by their focus. Core and specialized guidelines were coded for indicators of authority (simultaneous publication in multiple journals, rationale, scientific background supporting each element, expertise of designers, permanent website/named group), usability (presence of checklists and examples of good reporting), and breadth (manuscript sections covered). Discrepancies were resolved by consensus. Selected guidelines are presented in four tables arranged to facilitate selection: core guidelines, all of which pertain to major research designs; guidelines for additional study designs, topical guidelines, and guidelines for particular manuscript sections. A flow diagram provides an overview. The framework and sequential approach will enable authors as well as editors, peer reviewers, researchers, and systematic reviewers to make optimal use of available guidelines to improve the transparency, clarity, and rigor of manuscripts and research protocols and the efficiency of conducing systematic reviews and meta-analyses. PMID:22992369
Ma, Chao; Ouyang, Jihong; Chen, Hui-Ling; Zhao, Xue-Hua
2014-01-01
A novel hybrid method named SCFW-KELM, which integrates effective subtractive clustering features weighting and a fast classifier kernel-based extreme learning machine (KELM), has been introduced for the diagnosis of PD. In the proposed method, SCFW is used as a data preprocessing tool, which aims at decreasing the variance in features of the PD dataset, in order to further improve the diagnostic accuracy of the KELM classifier. The impact of the type of kernel functions on the performance of KELM has been investigated in detail. The efficiency and effectiveness of the proposed method have been rigorously evaluated against the PD dataset in terms of classification accuracy, sensitivity, specificity, area under the receiver operating characteristic (ROC) curve (AUC), f-measure, and kappa statistics value. Experimental results have demonstrated that the proposed SCFW-KELM significantly outperforms SVM-based, KNN-based, and ELM-based approaches and other methods in the literature, and achieved the highest classification results reported so far via a 10-fold cross-validation scheme, with a classification accuracy of 99.49%, a sensitivity of 100%, a specificity of 99.39%, an AUC of 99.69%, an f-measure value of 0.9964, and a kappa value of 0.9867. Promisingly, the proposed method might serve as a new candidate of powerful methods for the diagnosis of PD with excellent performance.
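A minimal kernel ELM (KELM) classifier of the kind used above can be sketched as follows, assuming an RBF kernel and synthetic two-class data; the subtractive clustering feature weighting (SCFW) stage and the PD dataset are omitted, so this is only an illustrative sketch of the classifier.

```python
# Minimal kernel extreme learning machine: output weights beta = (K + I/C)^-1 T.
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kelm_fit(X, y, C=10.0, gamma=1.0):
    classes = np.unique(y)
    T = (y[:, None] == classes[None, :]).astype(float)   # one-hot targets
    K = rbf_kernel(X, X, gamma)
    beta = np.linalg.solve(K + np.eye(len(X)) / C, T)    # output weights
    return X, beta, classes, gamma

def kelm_predict(model, Xnew):
    Xtr, beta, classes, gamma = model
    scores = rbf_kernel(Xnew, Xtr, gamma) @ beta
    return classes[np.argmax(scores, axis=1)]

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, size=(50, 4)), rng.normal(+1.0, size=(50, 4))])
y = np.array([0] * 50 + [1] * 50)                         # synthetic labels

model = kelm_fit(X, y)
print("training accuracy:", (kelm_predict(model, X) == y).mean())
```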
Fully implicit moving mesh adaptive algorithm
NASA Astrophysics Data System (ADS)
Serazio, C.; Chacon, L.; Lapenta, G.
2006-10-01
In many problems of interest, the numerical modeler is faced with the challenge of dealing with multiple time and length scales. The former is best dealt with by fully implicit methods, which are able to step over fast frequencies to resolve the dynamical time scale of interest. The latter requires grid adaptivity for efficiency. Moving-mesh grid adaptive methods are attractive because they can be designed to minimize the numerical error for a given resolution. However, the required grid governing equations are typically very nonlinear and stiff, and their numerical treatment is considerably difficult. Not surprisingly, fully coupled, implicit approaches where the grid and the physics equations are solved simultaneously are rare in the literature, and circumscribed to 1D geometries. In this study, we present a fully implicit algorithm for moving mesh methods that is feasible for multidimensional geometries. Crucial elements are the development of an effective multilevel treatment of the grid equation, and a robust, rigorous error estimator. For the latter, we explore the effectiveness of a coarse grid correction error estimator, which faithfully reproduces spatial truncation errors for conservative equations. We will show that the moving mesh approach is competitive vs. uniform grids both in accuracy (due to adaptivity) and efficiency. Results for a variety of models in 1D and 2D geometries will be presented. L. Chacón, G. Lapenta, J. Comput. Phys., 212 (2), 703 (2006); G. Lapenta, L. Chacón, J. Comput. Phys., accepted (2006).
Butler Ellis, M Clare; Kennedy, Marc C; Kuster, Christian J; Alanis, Rafael; Tuck, Clive R
2018-05-28
The BREAM (Bystander and Resident Exposure Assessment Model) (Kennedy et al. in BREAM: A probabilistic bystander and resident exposure assessment model of spray drift from an agricultural boom sprayer. Comput Electron Agric 2012;88:63-71) for bystander and resident exposure to spray drift from boom sprayers has recently been incorporated into the European Food Safety Authority (EFSA) guidance for determining non-dietary exposures of humans to plant protection products. The component of BREAM that relates airborne spray concentrations to bystander and resident dermal exposure has been reviewed to identify whether it is possible to improve it and the description of variability captured in the model. Two approaches have been explored: a more rigorous statistical analysis of the empirical data and a semi-mechanistic model based on established studies combined with new data obtained in a wind tunnel. A statistical comparison between field data and model outputs was used to determine which approach gave the better prediction of exposures. The semi-mechanistic approach gave the better prediction of experimental data and resulted in a reduction in the proposed regulatory values for the 75th and 95th percentiles of the exposure distribution.
Realizing Scientific Methods for Cyber Security
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carroll, Thomas E.; Manz, David O.; Edgar, Thomas W.
There is little doubt among cyber security researchers about the lack of scientific rigor that underlies much of the literature. The issues are manifold and are well documented. Further complicating the problem are insufficient scientific methods to address these issues. Cyber security melds man and machine: we inherit the challenges of computer science, sociology, psychology, and many other fields and create new ones where these fields interface. In this paper we detail a partial list of challenges imposed by rigorous science and survey how other sciences have tackled them, in the hope of applying a similar approach to cyber security science. This paper is by no means comprehensive: its purpose is to foster discussion in the community on how we can improve rigor in cyber security science.
Recent Developments: PKI Square Dish for the Soleras Project
NASA Technical Reports Server (NTRS)
Rogers, W. E.
1984-01-01
The Square Dish solar collectors are subjected to rigorous design attention regarding corrosion at the site, and certification of the collector structure. The microprocessor controls and tracking mechanisms are improved in the areas of fail-safe operations, durability, and low parasitic power requirements. Prototype testing demonstrates performance efficiency of approximately 72% at 730 F outlet temperature. Studies are conducted that include developing formal engineering design studies, developing formal engineering design drawings and fabrication details, establishing subcontracts for fabrication of major components, and developing a rigorous quality control system. The improved design is more cost effective to produce, and the extensive manuals developed for assembly and operation/maintenance result in faster field assembly and ease of operation.
A deep convolutional neural network using directional wavelets for low-dose X-ray CT reconstruction.
Kang, Eunhee; Min, Junhong; Ye, Jong Chul
2017-10-01
Due to the potential risk of inducing cancer, radiation exposure by X-ray CT devices should be reduced for routine patient scanning. However, in low-dose X-ray CT, severe artifacts typically occur due to photon starvation, beam hardening, and other causes, all of which decrease the reliability of the diagnosis. Thus, a high-quality reconstruction method from low-dose X-ray CT data has become a major research topic in the CT community. Conventional model-based de-noising approaches are, however, computationally very expensive, and image-domain de-noising approaches cannot readily remove CT-specific noise patterns. To tackle these problems, we want to develop a new low-dose X-ray CT algorithm based on a deep-learning approach. We propose an algorithm which uses a deep convolutional neural network (CNN) which is applied to the wavelet transform coefficients of low-dose CT images. More specifically, using a directional wavelet transform to extract the directional component of artifacts and exploit the intra- and inter- band correlations, our deep network can effectively suppress CT-specific noise. In addition, our CNN is designed with a residual learning architecture for faster network training and better performance. Experimental results confirm that the proposed algorithm effectively removes complex noise patterns from CT images derived from a reduced X-ray dose. In addition, we show that the wavelet-domain CNN is efficient when used to remove noise from low-dose CT compared to existing approaches. Our results were rigorously evaluated by several radiologists at the Mayo Clinic and won second place at the 2016 "Low-Dose CT Grand Challenge." To the best of our knowledge, this work is the first deep-learning architecture for low-dose CT reconstruction which has been rigorously evaluated and proven to be effective. In addition, the proposed algorithm, in contrast to existing model-based iterative reconstruction (MBIR) methods, has considerable potential to benefit from large data sets. Therefore, we believe that the proposed algorithm opens a new direction in the area of low-dose CT research. © 2017 American Association of Physicists in Medicine.
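To illustrate why the wavelet domain is attractive for this task, the sketch below applies plain soft-thresholding to multilevel wavelet coefficients of a synthetic noisy image using the PyWavelets package; it is a stand-in for intuition only, not the paper's directional-wavelet CNN or its training procedure.

```python
# Simple wavelet-domain denoising by soft-thresholding detail coefficients.
# Requires the PyWavelets package; the phantom image and noise level are assumed.
import numpy as np
import pywt

rng = np.random.default_rng(0)
clean = np.zeros((128, 128))
clean[32:96, 32:96] = 1.0                       # simple synthetic phantom
noisy = clean + 0.2 * rng.normal(size=clean.shape)

coeffs = pywt.wavedec2(noisy, "db2", level=3)   # multi-level 2-D DWT
thr = 0.15                                      # threshold chosen by hand here
denoised_coeffs = [coeffs[0]] + [
    tuple(pywt.threshold(c, thr, mode="soft") for c in detail)
    for detail in coeffs[1:]
]
denoised = pywt.waverec2(denoised_coeffs, "db2")[:128, :128]

print("noisy RMSE:   ", np.sqrt(np.mean((noisy - clean) ** 2)))
print("denoised RMSE:", np.sqrt(np.mean((denoised - clean) ** 2)))
```

In the approach described above, a trained CNN replaces the fixed thresholding rule and operates on directional wavelet coefficients, which is what allows CT-specific noise patterns to be suppressed.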
NASA Astrophysics Data System (ADS)
Sun, Hui; Wen, Jiayi; Zhao, Yanxiang; Li, Bo; McCammon, J. Andrew
2015-12-01
Dielectric boundary based implicit-solvent models provide efficient descriptions of coarse-grained effects, particularly the electrostatic effect, of aqueous solvent. Recent years have seen the initial success of a new such model, variational implicit-solvent model (VISM) [Dzubiella, Swanson, and McCammon Phys. Rev. Lett. 96, 087802 (2006) and J. Chem. Phys. 124, 084905 (2006)], in capturing multiple dry and wet hydration states, describing the subtle electrostatic effect in hydrophobic interactions, and providing qualitatively good estimates of solvation free energies. Here, we develop a phase-field VISM to the solvation of charged molecules in aqueous solvent to include more flexibility. In this approach, a stable equilibrium molecular system is described by a phase field that takes one constant value in the solute region and a different constant value in the solvent region, and smoothly changes its value on a thin transition layer representing a smeared solute-solvent interface or dielectric boundary. Such a phase field minimizes an effective solvation free-energy functional that consists of the solute-solvent interfacial energy, solute-solvent van der Waals interaction energy, and electrostatic free energy described by the Poisson-Boltzmann theory. We apply our model and methods to the solvation of single ions, two parallel plates, and protein complexes BphC and p53/MDM2 to demonstrate the capability and efficiency of our approach at different levels. With a diffuse dielectric boundary, our new approach can describe the dielectric asymmetry in the solute-solvent interfacial region. Our theory is developed based on rigorous mathematical studies and is also connected to the Lum-Chandler-Weeks theory (1999). We discuss these connections and possible extensions of our theory and methods.
LaClair, Tim; Gao, Zhiming; Fu, Joshua; ...
2014-12-01
Quantifying the fuel savings and emissions reductions that can be achieved from truck fuel efficiency technologies for a fleet's specific usage allows the fleet to select a combination of technologies that will yield the greatest operational efficiency and profitability. An accurate characterization of usage for the fleet is critical for such an evaluation; however, short-term measured drive cycle data do not generally reflect overall usage very effectively. This study presents a detailed analysis of vehicle usage in a commercial vehicle fleet and demonstrates the development of a short-duration synthetic drive cycle with measured drive cycle data collected over an extended period of time. The approach matched statistical measures of the vehicle speed with acceleration history and integrated measured grade data to develop a compressed drive cycle that accurately represents total usage. Drive cycle measurements obtained during a full year from six tractor trailers in normal operations in a less-than-truckload carrier were analyzed to develop a synthetic drive cycle. The vehicle mass was also estimated to account for the variation of loads that the fleet experienced. These drive cycle and mass data were analyzed with a tractive energy analysis to quantify the benefits in terms of fuel efficiency and reduced carbon dioxide emissions that can be achieved on Class 8 tractor trailers by using advanced efficiency technologies, either individually or in combination. Although differences exist between Class 8 tractor trailer fleets, this study provides valuable insight into the energy and emissions reduction potential that various technologies can bring in this important trucking application. Finally, the methodology employed for generating the synthetic drive cycle serves as a rigorous approach to develop an accurate usage characterization that can be used to effectively compress large quantities of drive cycle data.
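The tractive energy analysis mentioned above can be sketched as a sum of positive tractive work over a speed trace; the vehicle parameters and toy drive cycle below are assumptions for illustration, not the fleet data or the study's model.

```python
# Back-of-the-envelope tractive-energy estimate over a drive cycle:
# positive part of (m*a + aerodynamic drag + rolling resistance) * v, summed over time.
import numpy as np

def tractive_energy(speed_mps, dt, mass=30000.0, cd=0.6, area=10.0,
                    crr=0.006, rho=1.2, g=9.81):
    accel = np.gradient(speed_mps, dt)
    force = (mass * accel
             + 0.5 * rho * cd * area * speed_mps ** 2
             + crr * mass * g * np.sign(speed_mps))
    power = force * speed_mps
    return np.sum(np.clip(power, 0.0, None)) * dt      # joules; braking excluded

dt = 1.0                                               # 1 Hz speed trace (assumed)
t = np.arange(0, 600, dt)
speed = np.clip(25.0 * np.sin(2 * np.pi * t / 600.0), 0.0, None)   # toy cycle, m/s

E = tractive_energy(speed, dt)
print("tractive energy:", round(E / 3.6e6, 2), "kWh over a 10-minute toy cycle")
```

Grade effects, drivetrain losses, and the statistical matching used to build the synthetic cycle are omitted from this sketch.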
ERIC Educational Resources Information Center
Garcia, Stephan Ramon; Ross, William T.
2017-01-01
We hope to initiate a discussion about various methods for introducing Cauchy's Theorem. Although Cauchy's Theorem is the fundamental theorem upon which complex analysis is based, there is no "standard approach." The appropriate choice depends upon the prerequisites for the course and the level of rigor intended. Common methods include…
The US Environmental Protection Agency (EPA) is revising its strategy to obtain the information needed to answer questions pertinent to water-quality management efficiently and rigorously at national scales. One tool of this revised strategy is use of statistically based surveys ...
Gender bias affects forests worldwide
Marlène Elias; Susan S Hummel; Bimbika S Basnett; Carol J.P. Colfer
2017-01-01
Gender biases persist in forestry research and practice. These biases result in reduced scientific rigor and inequitable, ineffective, and less efficient policies, programs, and interventions. Drawing from a two-volume collection of current and classic analyses on gender in forests, we outline five persistent and inter-related themes: gendered governance, tree tenure,...
Educating Part-Time MBAs for the Global Business Environment
ERIC Educational Resources Information Center
Randolph, W. Alan
2008-01-01
To be successful managers in the business world of the 21st century, MBA students must acquire global skills of business acumen, reflection, cultural sensitivity, and multi-cultural teamwork. Developing these skills requires international experience, but educating part-time MBAs creates a special challenge demanding both rigor and efficiency. This…
Studying technology use as social practice: the untapped potential of ethnography
2011-01-01
Information and communications technologies (ICTs) in healthcare are often introduced with expectations of higher-quality, more efficient, and safer care. Many fail to meet these expectations. We argue here that the well-documented failures of ICTs in healthcare are partly attributable to the philosophical foundations of much health informatics research. Positivistic assumptions underpinning the design, implementation and evaluation of ICTs (in particular the notion that technology X has an impact which can be measured and reproduced in new settings), and the deterministic experimental and quasi-experimental study designs which follow from these assumptions, have inherent limitations when ICTs are part of complex social practices involving multiple human actors. We suggest that while experimental and quasi-experimental studies have an important place in health informatics research overall, ethnography is the preferred methodological approach for studying ICTs introduced into complex social systems. But for ethnographic approaches to be accepted and used to their full potential, many in the health informatics community will need to revisit their philosophical assumptions about what counts as research rigor. PMID:21521535
Rigorous RG Algorithms and Area Laws for Low Energy Eigenstates in 1D
NASA Astrophysics Data System (ADS)
Arad, Itai; Landau, Zeph; Vazirani, Umesh; Vidick, Thomas
2017-11-01
One of the central challenges in the study of quantum many-body systems is the complexity of simulating them on a classical computer. A recent advance (Landau et al. in Nat Phys, 2015) gave a polynomial time algorithm to compute a succinct classical description for unique ground states of gapped 1D quantum systems. Despite this progress many questions remained unsolved, including whether there exist efficient algorithms when the ground space is degenerate (and of polynomial dimension in the system size), or for the polynomially many lowest energy states, or even whether such states admit succinct classical descriptions or area laws. In this paper we give a new algorithm, based on a rigorously justified RG type transformation, for finding low energy states for 1D Hamiltonians acting on a chain of n particles. In the process we resolve some of the aforementioned open questions, including giving a polynomial time algorithm for poly(n) degenerate ground spaces and an n^O(log n) algorithm for the poly(n) lowest energy states (under a mild density condition). For these classes of systems the existence of a succinct classical description and area laws were not rigorously proved before this work. The algorithms are natural and efficient, and for the case of finding unique ground states for frustration-free Hamiltonians the running time is Õ(n M(n)), where M(n) is the time required to multiply two n × n matrices.
Normalization and extension of single-collector efficiency correlation equation
NASA Astrophysics Data System (ADS)
Messina, Francesca; Marchisio, Daniele; Sethi, Rajandrea
2015-04-01
Colloidal transport and deposition are important phenomena involved in many engineering problems. In the environmental engineering field, the use of micro- and nano-scale zerovalent iron (M-NZVI) is one of the most promising technologies for groundwater remediation. Colloid deposition is normally studied from a micro-scale point of view, and the results are then implemented in macro-scale models that are used to design field-scale applications. The single-collector efficiency concept predicts particle deposition onto a single grain of a complex porous medium in terms of the probability that an approaching particle will be retained on the solid grain. In the literature, many different approaches and equations exist to predict it, but most of them fail under specific conditions (e.g., very small or very large particle sizes and very low fluid velocities) because they predict efficiency values exceeding unity. By analysing particle fluxes and deposition mechanisms and performing a mass balance on the entire domain, the traditional definition of efficiency was reformulated, and a novel total-flux-normalized correlation equation is proposed for predicting single-collector efficiency under a broad range of parameters. It was formulated starting from a combination of Eulerian and Lagrangian numerical simulations, performed under Smoluchowski-Levich conditions, in a geometry consisting of a sphere enveloped by a control volume. In order to guarantee the independence of each term, the correlation equation is derived through a rigorous hierarchical parameter estimation process, accounting for single and mutually interacting transport mechanisms. The correlation equation provides efficiency values lower than one over a wide range of parameters and is valid both for point and finite-size particles. A reduced form is also proposed by elimination of the less relevant terms. References: 1. Yao, K. M.; Habibian, M. M.; O'Melia, C. R., Water and Waste Water Filtration - Concepts and Applications. Environ Sci Technol 1971, 5(11), 1105-1112. 2. Tufenkji, N.; Elimelech, M., Correlation equation for predicting single-collector efficiency in physicochemical filtration in saturated porous media. Environ Sci Technol 2004, 38(2), 529-536. 3. Boccardo, G.; Marchisio, D. L.; Sethi, R., Microscale simulation of particle deposition in porous media. J Colloid Interface Sci 2014, 417, 227-237.
NASA Astrophysics Data System (ADS)
Messina, F.; Tosco, T.; Sethi, R.
2017-12-01
Colloidal transport and deposition in saturated porous media are phenomena of considerable importance in a large number of natural processes and engineering applications, such as contaminant and microorganism propagation in aquifer systems, the development of innovative groundwater remediation technologies, air and water filtration, and many others. Therefore, a thorough understanding of particle filtration is essential for predicting the transport and fate of colloids in the subsurface environment. The removal efficiency of a filter is a key aspect of colloid transport in porous media. Several efforts have been devoted to deriving accurate correlations for the single-collector efficiency, one of the key concepts in filtration theory. However, upscaling this parameter to the entire porous medium is still a challenge. The common upscaling approach assumes deposition to be independent of the transport history, which means that the collector efficiency is considered uniform along the porous medium. However, previous works showed that this approach is inadequate under unfavorable deposition conditions. This study demonstrates that it is not adequate even in the simplest case of favorable deposition. Computational fluid dynamics simulations were run for a simplified porous medium geometry, composed of a vertical array of 50 identical spherical collectors. A combination of Lagrangian and Eulerian simulations was performed to analyze particle transport under a broad range of parameters (i.e., particle size, particle density, water velocity). The results show the limits of the existing models in interpreting the experimental data. Indeed, when particle deposition is not controlled by Brownian diffusion, non-exponential concentration profiles are retrieved, in contrast with the assumption of uniform efficiency. Moreover, when the deposition mechanisms of sedimentation and interception dominate, the efficiency of the first sphere of the column is significantly higher than that of the others, and it then declines along the array down to an asymptotic value. A more rigorous procedure to evaluate filtration processes in the presence of a series of collectors was developed, and a new correlation for the up-scaled removal efficiency of the entire array was derived and proposed.
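A minimal sketch of the point made above about non-uniform efficiency along an array of collectors: with an assumed efficiency that declines from the first sphere to an asymptotic value (the decay shape and the numbers below are hypothetical, not taken from the study), the resulting concentration profile deviates from the profile implied by a uniform collector efficiency.

```python
# Illustrative sketch: concentration profile across a vertical array of
# collectors when the single-collector efficiency varies along the array,
# compared with the uniform-efficiency assumption.
import numpy as np

def profile(etas, c_in=1.0):
    """Concentration leaving each collector, C_i = C_{i-1} * (1 - eta_i)."""
    c = [c_in]
    for eta in etas:
        c.append(c[-1] * (1.0 - eta))
    return np.array(c[1:])

n = 50
eta_first, eta_asymptotic = 0.12, 0.04          # assumed values, not from the paper
etas = eta_asymptotic + (eta_first - eta_asymptotic) * np.exp(-np.arange(n) / 5.0)

c_nonuniform = profile(etas)                     # efficiency declines along the array
c_uniform = profile(np.full(n, etas.mean()))     # uniform-efficiency counterpart

print(f"outlet C/C0: non-uniform={c_nonuniform[-1]:.3f}, uniform={c_uniform[-1]:.3f}")
```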
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramanathan, Arvind; Steed, Chad A; Pullum, Laura L
Compartmental models in epidemiology are widely used as a means to model disease spread mechanisms and understand how one can best control the disease in case an outbreak of a widespread epidemic occurs. However, a significant challenge within the community is in the development of approaches that can be used to rigorously verify and validate these models. In this paper, we present an approach to rigorously examine and verify the behavioral properties of compartmental epidemiological models under several common modeling scenarios including birth/death rates and multi-host/pathogen species. Using metamorphic testing, a novel visualization tool and model checking, we build a workflow that provides insights into the functionality of compartmental epidemiological models. Our initial results indicate that metamorphic testing can be used to verify the implementation of these models and provide insights into special conditions where these mathematical models may fail. The visualization front-end allows the end-user to scan through a variety of parameters commonly used in these models to elucidate the conditions under which an epidemic can occur. Further, specifying these models using a process algebra allows one to automatically construct behavioral properties that can be rigorously verified using model checking. Taken together, our approach allows for detecting implementation errors as well as handling conditions under which compartmental epidemiological models may fail to provide insights into disease spread dynamics.
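As a hedged illustration of the metamorphic-testing idea, the sketch below checks two simple metamorphic relations against a toy SIR implementation; the model, the relations, and the tolerances are illustrative and are not taken from the authors' workflow or tooling.

```python
# Hedged sketch of metamorphic testing applied to a simple SIR implementation.
# A metamorphic relation states how a transformed input must change the
# output, without needing an exact oracle for the output itself.
import numpy as np

def sir(S0, I0, R0, beta, gamma, days=160, dt=0.1):
    """Frequency-dependent SIR model integrated with forward Euler."""
    S, I, R = float(S0), float(I0), float(R0)
    out = []
    for _ in range(int(days / dt)):
        N = S + I + R
        dS = -beta * S * I / N
        dI = beta * S * I / N - gamma * I
        dR = gamma * I
        S, I, R = S + dS * dt, I + dI * dt, R + dR * dt
        out.append((S, I, R))
    return np.array(out)

base = sir(999, 1, 0, beta=0.3, gamma=0.1)

# Relation 1: with no births or deaths the total population is conserved.
assert np.allclose(base.sum(axis=1), 1000.0, rtol=1e-6)

# Relation 2: scaling all initial compartments by k scales the whole
# trajectory by k when transmission is frequency dependent.
scaled = sir(9990, 10, 0, beta=0.3, gamma=0.1)
assert np.allclose(scaled, 10 * base, rtol=1e-6)

print("metamorphic relations satisfied")
```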
An Interaction-Based Approach to Enhancing Secondary School Instruction and Student Achievement
ERIC Educational Resources Information Center
Allen, Joseph; Pianta, Robert; Gregory, Anne; Mikami, Amori; Lun, Janetta
2011-01-01
Improving teaching quality is widely recognized as critical to addressing deficiencies in secondary school education, yet the field has struggled to identify rigorously evaluated teacher-development approaches that can produce reliable gains in student achievement. A randomized controlled trial of My Teaching Partner-Secondary--a Web-mediated…
A Practical Guide to Regression Discontinuity
ERIC Educational Resources Information Center
Jacob, Robin; Zhu, Pei; Somers, Marie-Andrée; Bloom, Howard
2012-01-01
Regression discontinuity (RD) analysis is a rigorous nonexperimental approach that can be used to estimate program impacts in situations in which candidates are selected for treatment based on whether their value for a numeric rating exceeds a designated threshold or cut-point. Over the last two decades, the regression discontinuity approach has…
A Qualitative Approach to Enzyme Inhibition
ERIC Educational Resources Information Center
Waldrop, Grover L.
2009-01-01
Most general biochemistry textbooks present enzyme inhibition by showing how the basic Michaelis-Menten parameters K[subscript m] and V[subscript max] are affected mathematically by a particular type of inhibitor. This approach, while mathematically rigorous, does not lend itself to understanding how inhibition patterns are used to determine the…
Treatment Controversies in Autism
ERIC Educational Resources Information Center
Schreibman, Laura
2008-01-01
With the increasing numbers of children who are being diagnosed with an autism spectrum disorder there are a wide variety of treatment approaches being marketed to vulnerable parents who are desperate to help their child. However, many of these approaches have not been rigorously evaluated and can lead to false hopes, unreasonable fears, or…
Three Views of Systems Theories and Their Implications for Sustainability Education
ERIC Educational Resources Information Center
Porter, Terry; Cordoba, Jose
2009-01-01
Worldwide, there is an emerging interest in sustainability and sustainability education. A popular and promising approach is the use of systems thinking. However, the systems approach to sustainability has neither been clearly defined nor has its practical application followed any systematic rigor, resulting in confounded and underspecified…
1985-02-01
Energy Analysis, a branch of dynamic modal analysis developed for analyzing acoustic vibration problems, its present stage of development embodies a... Maximum Entropy Stochastic Modelling and Reduced-Order Design Synthesis is a rigorous new approach to this class of problems. Inspired by Statistical
Araújo, Luciano V; Malkowski, Simon; Braghetto, Kelly R; Passos-Bueno, Maria R; Zatz, Mayana; Pu, Calton; Ferreira, João E
2011-12-22
Recent medical and biological technology advances have stimulated the development of new testing systems that have been providing huge, varied amounts of molecular and clinical data. Growing data volumes pose significant challenges for information processing systems in research centers. Additionally, the routines of genomics laboratory are typically characterized by high parallelism in testing and constant procedure changes. This paper describes a formal approach to address this challenge through the implementation of a genetic testing management system applied to human genome laboratory. We introduced the Human Genome Research Center Information System (CEGH) in Brazil, a system that is able to support constant changes in human genome testing and can provide patients updated results based on the most recent and validated genetic knowledge. Our approach uses a common repository for process planning to ensure reusability, specification, instantiation, monitoring, and execution of processes, which are defined using a relational database and rigorous control flow specifications based on process algebra (ACP). The main difference between our approach and related works is that we were able to join two important aspects: 1) process scalability achieved through relational database implementation, and 2) correctness of processes using process algebra. Furthermore, the software allows end users to define genetic testing without requiring any knowledge about business process notation or process algebra. This paper presents the CEGH information system that is a Laboratory Information Management System (LIMS) based on a formal framework to support genetic testing management for Mendelian disorder studies. We have proved the feasibility and showed usability benefits of a rigorous approach that is able to specify, validate, and perform genetic testing using easy end user interfaces.
2011-01-01
Background Recent medical and biological technology advances have stimulated the development of new testing systems that have been providing huge, varied amounts of molecular and clinical data. Growing data volumes pose significant challenges for information processing systems in research centers. Additionally, the routines of genomics laboratory are typically characterized by high parallelism in testing and constant procedure changes. Results This paper describes a formal approach to address this challenge through the implementation of a genetic testing management system applied to human genome laboratory. We introduced the Human Genome Research Center Information System (CEGH) in Brazil, a system that is able to support constant changes in human genome testing and can provide patients updated results based on the most recent and validated genetic knowledge. Our approach uses a common repository for process planning to ensure reusability, specification, instantiation, monitoring, and execution of processes, which are defined using a relational database and rigorous control flow specifications based on process algebra (ACP). The main difference between our approach and related works is that we were able to join two important aspects: 1) process scalability achieved through relational database implementation, and 2) correctness of processes using process algebra. Furthermore, the software allows end users to define genetic testing without requiring any knowledge about business process notation or process algebra. Conclusions This paper presents the CEGH information system that is a Laboratory Information Management System (LIMS) based on a formal framework to support genetic testing management for Mendelian disorder studies. We have proved the feasibility and showed usability benefits of a rigorous approach that is able to specify, validate, and perform genetic testing using easy end user interfaces. PMID:22369688
Chiu, Grace S; Wu, Margaret A; Lu, Lin
2013-01-01
The ability to quantitatively assess ecological health is of great interest to those tasked with monitoring and conserving ecosystems. For decades, biomonitoring research and policies have relied on multimetric health indices of various forms. Although indices are numbers, many are constructed based on qualitative procedures, thus limiting the quantitative rigor of the practical interpretations of such indices. The statistical modeling approach to construct the latent health factor index (LHFI) was recently developed. With ecological data that otherwise are used to construct conventional multimetric indices, the LHFI framework expresses such data in a rigorous quantitative model, integrating qualitative features of ecosystem health and preconceived ecological relationships among such features. This hierarchical modeling approach allows unified statistical inference of health for observed sites (along with prediction of health for partially observed sites, if desired) and of the relevance of ecological drivers, all accompanied by formal uncertainty statements from a single, integrated analysis. Thus far, the LHFI approach has been demonstrated and validated in a freshwater context. We adapt this approach to modeling estuarine health, and illustrate it on the previously unassessed system in Richibucto in New Brunswick, Canada, where active oyster farming is a potential stressor through its effects on sediment properties. Field data correspond to health metrics that constitute the popular AZTI marine biotic index and the infaunal trophic index, as well as abiotic predictors preconceived to influence biota. Our paper is the first to construct a scientifically sensible model that rigorously identifies the collective explanatory capacity of salinity, distance downstream, channel depth, and silt-clay content (all regarded a priori as qualitatively important abiotic drivers) towards site health in the Richibucto ecosystem. This suggests the potential effectiveness of the LHFI approach for assessing not only freshwater systems but aquatic ecosystems in general.
ERIC Educational Resources Information Center
Gibson, Matthew
2014-01-01
The Signs of Safety approach to child protection has been gaining prominence around the world and this approach has developed through learning from good practice. Generally, examples of good practice are derived from adults who pose a risk to children, while this paper outlines an example of good practice that engages an adolescent in building a…
Rigor force responses of permeabilized fibres from fast and slow skeletal muscles of aged rats.
Plant, D R; Lynch, G S
2001-09-01
1. Ageing is generally associated with a decline in skeletal muscle mass and strength and a slowing of muscle contraction, factors that impact upon the quality of life for the elderly. The mechanisms underlying this age-related muscle weakness have not been fully resolved. The purpose of the present study was to determine whether the decrease in muscle force as a consequence of age could be attributed partly to a decrease in the number of cross-bridges participating during contraction. 2. Given that the rigor force is proportional to the approximate total number of interacting sites between the actin and myosin filaments, we tested the null hypothesis that the rigor force of permeabilized muscle fibres from young and old rats would not be different. 3. Permeabilized fibres from the extensor digitorum longus (fast-twitch; EDL) and soleus (predominantly slow-twitch) muscles of young (6 months of age) and old (27 months of age) male F344 rats were activated in Ca2+-buffered solutions to determine force-pCa characteristics (where pCa = -log(10)[Ca2+]) and then in solutions lacking ATP and Ca2+ to determine rigor force levels. 4. The rigor forces for EDL and soleus muscle fibres were not different between young and old rats, indicating that the approximate total number of cross-bridges that can be formed between filaments did not decline with age. We conclude that the age-related decrease in force output is more likely attributed to a decrease in the force per cross-bridge and/or decreases in the efficiency of excitation-contraction coupling.
The MIXED framework: A novel approach to evaluating mixed-methods rigor.
Eckhardt, Ann L; DeVon, Holli A
2017-10-01
Evaluation of rigor in mixed-methods (MM) research is a persistent challenge due to the combination of inconsistent philosophical paradigms, the use of multiple research methods which require different skill sets, and the need to combine research at different points in the research process. Researchers have proposed a variety of ways to thoroughly evaluate MM research, but each method fails to provide a framework that is useful for the consumer of research. In contrast, the MIXED framework is meant to bridge the gap between an academic exercise and practical assessment of a published work. The MIXED framework (methods, inference, expertise, evaluation, and design) borrows from previously published frameworks to create a useful tool for the evaluation of a published study. The MIXED framework uses an experimental eight-item scale that allows for comprehensive integrated assessment of MM rigor in published manuscripts. Mixed methods are becoming increasingly prevalent in nursing and healthcare research requiring researchers and consumers to address issues unique to MM such as evaluation of rigor. © 2017 John Wiley & Sons Ltd.
Elf, Johan
2016-04-27
A new, game-changing approach makes it possible to rigorously disprove models without making assumptions about the unknown parts of the biological system. Copyright © 2016 Elsevier Inc. All rights reserved.
On the total bandwidth for the rational Harper's equation
NASA Astrophysics Data System (ADS)
Helffer, Bernard; Kerdelhué, Phillippe
1995-10-01
In recent years several contributions have been made concerning the total bandwidth of the spectrum of Harper's operator. In particular, an interesting conjecture has been proposed by Thouless, who also gives strong convincing arguments for the proof in special cases. On the other hand, in the study of the Cantor structure of the spectrum, B. Helffer and J. Sjöstrand have justified a heuristic semiclassical approach proposed by M. Wilkinson. The aim of this article is to explain how one can use the first step of this approach to give a rigorous semi-classical proof of the Thouless formula in some of the simplest cases. We shall also indicate how one can hope, with more effort, to rigorously prove recent results of Last and Wilkinson on the same conjecture.
Brandt, Heather M.; Freedman, Darcy A.; Adams, Swann Arp; Young, Vicki M.; Ureda, John R.; McCracken, James Lyndon; Hébert, James R.
2014-01-01
The South Carolina Cancer Prevention and Control Research Network (SC-CPCRN) is 1 of 10 networks funded by the Centers for Disease Control and Prevention and the National Cancer Institute (NCI) that works to reduce cancer-related health disparities. In partnership with federally qualified health centers and community stakeholders, the SC-CPCRN uses evidence-based approaches (eg, NCI Research-tested Intervention Programs) to disseminate and implement cancer prevention and control messages, programs, and interventions. We describe the innovative stakeholder- and community-driven communication efforts conducted by the SC-CPCRN to improve overall health and reduce cancer-related health disparities among high-risk and disparate populations in South Carolina. We describe how our communication efforts are aligned with 5 core values recommended for dissemination and implementation science: 1) rigor and relevance, 2) efficiency and speed, 3) collaboration, 4) improved capacity, and 5) cumulative knowledge. PMID:25058673
NASA Technical Reports Server (NTRS)
Arnold, S. M.; Binienda, W. K.; Tan, H. Q.; Xu, M. H.
1992-01-01
Analytical derivations of stress intensity factors (SIF's) of a multicracked plate can be complex and tedious. Recent advances, however, in intelligent application of symbolic computation can overcome these difficulties and provide the means to rigorously and efficiently analyze this class of problems. Here, the symbolic algorithm required to implement the methodology described in Part 1 is presented. The special problem-oriented symbolic functions to derive the fundamental kernels are described, and the associated automatically generated FORTRAN subroutines are given. As a result, a symbolic/FORTRAN package named SYMFRAC, capable of providing accurate SIF's at each crack tip, was developed and validated. Simple illustrative examples using SYMFRAC show the potential of the present approach for predicting the macrocrack propagation path due to existing microcracks in the vicinity of a macrocrack tip, when the influence of the microcrack's location, orientation, size, and interaction are taken into account.
Towards lifetime electronic health record implementation.
Gand, Kai; Richter, Peggy; Esswein, Werner
2015-01-01
Integrated care concepts can help to diminish demographic challenges. Hereof, the use of eHealth, esp. overarching electronic health records, is recognized as an efficient approach. The article aims at rigorously defining the concept of lifetime electronic health records (LEHRs) and the identification of core factors that need to be fulfilled in order to implement such. A literature review was conducted. Existing definitions were identified and relevant factors were categorized. The derived assessment categories are demonstrated by a case study on Germany. Seven dimensions to differentiate types of electronic health records were found. The analysis revealed, that culture, regulation, informational self-determination, incentives, compliance, ICT infrastructure and standards are important preconditions to successfully implement LEHRs. The article paves the way for LEHR implementation and therewith for integrated care. Besides the expected benefits of LEHRs, there are a number of ethical, legal and social concerns, which need to be balanced.
Overarching framework for data-based modelling
NASA Astrophysics Data System (ADS)
Schelter, Björn; Mader, Malenka; Mader, Wolfgang; Sommerlade, Linda; Platt, Bettina; Lai, Ying-Cheng; Grebogi, Celso; Thiel, Marco
2014-02-01
One of the main modelling paradigms for complex physical systems is the network. When estimating the network structure from measured signals, several assumptions, such as stationarity, are typically made in the estimation process. Violating these assumptions renders standard analysis techniques fruitless. Here we propose a framework for estimating the network structure from measurements of arbitrary non-linear, non-stationary, stochastic processes. To this end, we develop a rigorous mathematical theory that underlies this framework. Based on this theory, we present a highly efficient algorithm and the corresponding statistics, which are directly applicable to measured signals. We demonstrate its performance in a simulation study. In experiments on transitions between vigilance stages in rodents, we infer small network structures with complex, time-dependent interactions; this suggests biomarkers for such transitions, the key to understanding and diagnosing numerous diseases such as dementia. We argue that the suggested framework combines features that other approaches followed so far lack.
Stochastic competitive learning in complex networks.
Silva, Thiago Christiano; Zhao, Liang
2012-03-01
Competitive learning is an important machine learning approach that is widely employed in artificial neural networks. In this paper, we present a rigorous definition of a new type of competitive learning scheme realized on large-scale networks. The model consists of several particles walking within the network and competing with each other to occupy as many nodes as possible, while attempting to reject intruder particles. Each particle's walking rule is composed of a stochastic combination of random and preferential movements. The model has been applied to solve community detection and data clustering problems. Computer simulations reveal that the proposed technique presents high precision in community and cluster detection, as well as low computational complexity. Moreover, we have developed an efficient method for estimating the most likely number of clusters by using an evaluator index that monitors the information generated by the competition process itself. We hope this paper will provide an alternative approach to the study of competitive learning.
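A simplified sketch of the particle-competition scheme described above, assuming visit counts as the domination measure and omitting the intruder-rejection rule; the mixing probability, step count, and toy two-clique graph are illustrative choices rather than the paper's settings.

```python
# Minimal sketch of particle-competition community detection: K particles walk
# a graph, mixing random moves with "preferential" moves toward nodes they
# already dominate; each node is finally labeled by its most frequent visitor.
import numpy as np

rng = np.random.default_rng(0)

def particle_competition(adj, n_particles=2, n_steps=20000, p_pref=0.6):
    n = adj.shape[0]
    neighbors = [np.flatnonzero(adj[i]) for i in range(n)]
    counts = np.zeros((n_particles, n))            # visit counts = domination levels
    pos = rng.integers(0, n, size=n_particles)     # random starting nodes
    for _ in range(n_steps):
        for k in range(n_particles):
            nbrs = neighbors[pos[k]]
            if rng.random() < p_pref:
                w = counts[k, nbrs] + 1.0          # prefer already-dominated nodes
                pos[k] = rng.choice(nbrs, p=w / w.sum())
            else:
                pos[k] = rng.choice(nbrs)          # purely random move
            counts[k, pos[k]] += 1
    return counts.argmax(axis=0)                   # community label per node

# Two weakly connected cliques as a toy benchmark.
A = np.zeros((10, 10), int)
A[:5, :5] = 1
A[5:, 5:] = 1
A[4, 5] = A[5, 4] = 1
np.fill_diagonal(A, 0)
print(particle_competition(A))
```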
Space station dynamics, attitude control and momentum management
NASA Technical Reports Server (NTRS)
Sunkel, John W.; Singh, Ramen P.; Vengopal, Ravi
1989-01-01
The Space Station Attitude Control System software test-bed provides a rigorous environment for the design, development and functional verification of GN and C algorithms and software. The approach taken for the simulation of the vehicle dynamics and environmental models using a computationally efficient algorithm is discussed. The simulation includes capabilities for docking/berthing dynamics, prescribed motion dynamics associated with the Mobile Remote Manipulator System (MRMS) and microgravity disturbances. The vehicle dynamics module interfaces with the test-bed through the central Communicator facility which is in turn driven by the Station Control Simulator (SCS) Executive. The Communicator addresses issues such as the interface between the discrete flight software and the continuous vehicle dynamics, and multi-programming aspects such as the complex flow of control in real-time programs. Combined with the flight software and redundancy management modules, the facility provides a flexible, user-oriented simulation platform.
Friedman, Daniela B; Brandt, Heather M; Freedman, Darcy A; Adams, Swann Arp; Young, Vicki M; Ureda, John R; McCracken, James Lyndon; Hébert, James R
2014-07-24
The South Carolina Cancer Prevention and Control Research Network (SC-CPCRN) is 1 of 10 networks funded by the Centers for Disease Control and Prevention and the National Cancer Institute (NCI) that works to reduce cancer-related health disparities. In partnership with federally qualified health centers and community stakeholders, the SC-CPCRN uses evidence-based approaches (eg, NCI Research-tested Intervention Programs) to disseminate and implement cancer prevention and control messages, programs, and interventions. We describe the innovative stakeholder- and community-driven communication efforts conducted by the SC-CPCRN to improve overall health and reduce cancer-related health disparities among high-risk and disparate populations in South Carolina. We describe how our communication efforts are aligned with 5 core values recommended for dissemination and implementation science: 1) rigor and relevance, 2) efficiency and speed, 3) collaboration, 4) improved capacity, and 5) cumulative knowledge.
NASA Astrophysics Data System (ADS)
Syusina, O. M.; Chernitsov, A. M.; Tamarov, V. A.
2011-07-01
Simple and mathematically rigorous methods for calculating nonlinearity coefficients are proposed. These coefficients allow a least squares problem to be classified as strongly or weakly nonlinear. Advice is given on how to reduce a concrete estimation problem to a weakly nonlinear one, for which a more efficient linear approach can be used.
Excitation of multiple surface-plasmon-polariton waves using a compound surface-relief grating
NASA Astrophysics Data System (ADS)
Faryad, Muhammad; Lakhtakia, Akhlesh
2012-01-01
The excitation of multiple surface-plasmon-polariton waves, all of the same frequency but different polarization states, phase speeds, spatial profiles and degrees of localization, by a compound surface-relief grating formed by a metal and a rugate filter, both of finite thickness, was studied using the rigorous coupled-wave approach. Each period of the compound surface-relief grating was chosen to have an integral number of periods of two different simple surface-relief gratings. The excitation of different SPP waves was inferred from the absorptance peaks that were independent of the thickness of the rugate filter. The excitation of each SPP wave could be attributed to either a simple surface-relief grating present in the compound surface-relief grating or to the compound surface-relief grating itself. However, the excitation of SPP waves was found to be less efficient with the compound surface-relief grating than with a simple surface-relief grating.
NASA Astrophysics Data System (ADS)
Schwarz, Karsten; Rieger, Heiko
2013-03-01
We present an efficient Monte Carlo method to simulate reaction-diffusion processes with spatially varying particle annihilation or transformation rates as it occurs for instance in the context of motor-driven intracellular transport. Like Green's function reaction dynamics and first-passage time methods, our algorithm avoids small diffusive hops by propagating sufficiently distant particles in large hops to the boundaries of protective domains. Since for spatially varying annihilation or transformation rates the single particle diffusion propagator is not known analytically, we present an algorithm that generates efficiently either particle displacements or annihilations with the correct statistics, as we prove rigorously. The numerical efficiency of the algorithm is demonstrated with an illustrative example.
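The sketch below is not the protective-domain algorithm of this abstract; it only illustrates, under stated assumptions, how a spatially varying annihilation rate can be sampled with correct statistics while a particle diffuses, here by Poisson thinning against an upper-bound rate.

```python
# Hedged illustration (a different, simpler technique than the paper's):
# simulate annihilation of a Brownian particle with a spatially varying rate
# k(x) by Poisson thinning, using an upper bound k_max on the rate.
import numpy as np

rng = np.random.default_rng(1)

def survival_time(x0, k, k_max, D=1.0, t_end=10.0):
    """Return the annihilation time (or t_end if the particle survives).
    Candidate events are drawn from a Poisson process with rate k_max and
    accepted with probability k(x)/k_max evaluated at the current position."""
    t, x = 0.0, x0
    while t < t_end:
        tau = rng.exponential(1.0 / k_max)                   # next candidate event
        dt = min(tau, t_end - t)
        x += np.sqrt(2.0 * D * dt) * rng.standard_normal()   # exact diffusive jump
        t += dt
        if dt == tau and rng.random() < k(x) / k_max:
            return t                                         # accepted: annihilation
    return t_end

k = lambda x: 0.5 + 0.5 * np.exp(-x**2)                      # example space-dependent rate
times = [survival_time(0.0, k, k_max=1.0) for _ in range(5000)]
print("mean annihilation/censoring time:", np.mean(times))
```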
Das, Ashok Kumar; Goswami, Adrijit
2014-06-01
Recently, Awasthi and Srivastava proposed a novel biometric remote user authentication scheme with nonce for the telecare medicine information system (TMIS). Their scheme is very efficient, as it is based on an efficient chaotic one-way hash function and bitwise XOR operations. In this paper, we first analyze Awasthi-Srivastava's scheme and then show that it has several drawbacks: (1) an incorrect password change phase, (2) failure to preserve the user anonymity property, (3) failure to establish a secret session key between a legal user and the server, (4) failure to protect against a strong replay attack, and (5) lack of rigorous formal security analysis. We then propose a novel and secure biometric-based remote user authentication scheme in order to withstand the security flaws found in Awasthi-Srivastava's scheme and enhance the features required for an ideal user authentication scheme. Through rigorous informal and formal security analysis, we show that our scheme is secure against possible known attacks. In addition, we simulate our scheme for formal security verification using the widely accepted AVISPA (Automated Validation of Internet Security Protocols and Applications) tool and show that our scheme is secure against passive and active attacks, including replay and man-in-the-middle attacks. Our scheme is also efficient compared to Awasthi-Srivastava's scheme.
Ensuring Effective Curriculum Approval Processes: A Guide for Local Senates
ERIC Educational Resources Information Center
Academic Senate for California Community Colleges, 2016
2016-01-01
Curriculum is the heart of the mission of every college. College curriculum approval processes have been established to ensure that rigorous, high quality curriculum is offered that meets the needs of students. While some concerns may exist regarding the effectiveness and efficiency of local curriculum processes, all participants in the process…
Educational Technology: A Theoretical Discussion
ERIC Educational Resources Information Center
Andrews, Barbara; Hakken, David
1977-01-01
Views educational technology in relation to the pattern of technological change, argues that the new technology must be rigorously evaluated, and suggests it is best understood as a business approach to education. (DD)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ahn, Hye-Kyung; Kim, Byoung Chan; Jun, Seung-Hyun
2010-12-15
Efficient protein digestion in proteomic analysis requires the stabilization of proteases such as trypsin. In the present work, trypsin was stabilized in the form of an enzyme coating on electrospun polymer nanofibers (EC-TR), which crosslinks additional trypsin molecules onto covalently attached trypsin (CA-TR). EC-TR showed better stability than CA-TR under rigorous conditions, such as at high temperatures of 40 °C and 50 °C, in the presence of organic co-solvents, and at various pH values. For example, the half-lives of CA-TR and EC-TR were 0.24 and 163.20 hours at 40 °C, respectively. The improved stability of EC-TR can be explained by covalent linkages on the surface of trypsin molecules, which effectively inhibit the denaturation, autolysis, and leaching of trypsin. Protein digestion was performed at 40 °C using both CA-TR and EC-TR to digest a model protein, enolase. EC-TR showed better performance and stability than CA-TR, maintaining good enolase digestion under repeated use for a period of one week. Under the same conditions, CA-TR showed poor performance from the beginning and could not be used for digestion at all after a few uses. The enzyme coating approach is anticipated to be successfully employed not only for protein digestion in proteomic analysis, but also in various other fields where poor enzyme stability presently hampers the practical application of enzymes.
Optimal design and evaluation of a color separation grating using rigorous coupled wave analysis
NASA Astrophysics Data System (ADS)
Nagayoshi, Mayumi; Oka, Keiko; Klaus, Werner; Komai, Yuki; Kodate, Kashiko
2006-02-01
In recent years, technology that separates white light into the three primary colors of red (R), green (G), and blue (B), adjusts the optical intensity of each, and recombines R, G, and B to display various colors has become essential to the development and spread of color visual equipment. Various color separation devices have been proposed and put to practical use in such equipment. We have focused on a small, lightweight grating-type device that offers the possibility of cost reduction and large-scale production and that generates only the three primary colors R, G, and B, so that a high saturation level can be obtained. To perform rigorous analysis and design of color separation gratings, our group has developed a program based on rigorous coupled-wave analysis (RCWA). We then calculated the parameters needed to obtain a diffraction efficiency higher than 70% and a color gamut of about 70%. We report on the design, fabrication, and evaluation of color separation gratings optimized for fabrication by laser drawing.
Facility Design and Health Management Program at the Sinnhuber Aquatic Research Laboratory
Barton, Carrie L.; Johnson, Eric W.
2016-01-01
The number of researchers and institutions moving to the utilization of zebrafish for biomedical research continues to increase because of the recognized advantages of this model. Numerous factors should be considered before building a new or retooling an existing facility. Design decisions will directly impact the management and maintenance costs. We and others have advocated for more rigorous approaches to zebrafish health management to support and protect an increasingly diverse portfolio of important research. The Sinnhuber Aquatic Research Laboratory (SARL) is located ∼3 miles from the main Oregon State University campus in Corvallis, Oregon. This facility supports several research programs that depend heavily on the use of adult, larval, and embryonic zebrafish. The new zebrafish facility of the SARL began operation in 2007 with a commitment to build and manage an efficient facility that diligently protects human and fish health. An important goal was to ensure that the facility was free of Pseudoloma neurophilia (Microsporidia), which is very common in zebrafish research facilities. We recognize that there are certain limitations in space, resources, and financial support that are institution dependent, but in this article, we describe the steps taken to build and manage an efficient specific pathogen-free facility. PMID:26981844
Quantifying Differential Privacy under Temporal Correlations.
Cao, Yang; Yoshikawa, Masatoshi; Xiao, Yonghui; Xiong, Li
2017-04-01
Differential Privacy (DP) has received increasing attention as a rigorous privacy framework. Many existing studies employ traditional DP mechanisms (e.g., the Laplace mechanism) as primitives, which assume that the data are independent, or that adversaries do not have knowledge of the data correlations. However, continuously generated data in the real world tend to be temporally correlated, and such correlations can be acquired by adversaries. In this paper, we investigate the potential privacy loss of a traditional DP mechanism under temporal correlations in the context of continuous data release. First, we model the temporal correlations using a Markov model and analyze the privacy leakage of a DP mechanism when adversaries have knowledge of such temporal correlations. Our analysis reveals that the privacy loss of a DP mechanism may accumulate and increase over time; we call this temporal privacy leakage. Second, to measure such privacy loss, we design an efficient algorithm for calculating it in polynomial time. Although the temporal privacy leakage may increase over time, we also show that its supremum may exist in some cases. Third, to bound the privacy loss, we propose mechanisms that convert any existing DP mechanism into one against temporal privacy leakage. Experiments with synthetic data confirm that our approach is efficient and effective.
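For orientation, the sketch below shows only the traditional primitive this abstract starts from, a Laplace mechanism applied to a stream of counts, together with the naive sequential-composition bound on cumulative privacy loss; the paper's Markov-correlation analysis and its polynomial-time leakage algorithm are not reproduced here, and the counts and budgets are invented.

```python
# Minimal sketch of the Laplace-mechanism primitive and the naive
# composition bound on cumulative privacy loss over a continuous release.
import numpy as np

rng = np.random.default_rng(7)

def laplace_release(true_count, eps, sensitivity=1.0):
    """One epsilon-DP release of a count via the Laplace mechanism."""
    return true_count + rng.laplace(scale=sensitivity / eps)

eps_per_step = 0.1
true_counts = [42, 45, 44, 47, 50]                    # hypothetical stream of counts
noisy = [laplace_release(c, eps_per_step) for c in true_counts]

naive_total_eps = eps_per_step * len(true_counts)     # sequential composition bound
print("releases:", [round(x, 1) for x in noisy])
print("naive cumulative privacy budget:", naive_total_eps)
```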
Facility Design and Health Management Program at the Sinnhuber Aquatic Research Laboratory.
Barton, Carrie L; Johnson, Eric W; Tanguay, Robert L
2016-07-01
The number of researchers and institutions moving to the utilization of zebrafish for biomedical research continues to increase because of the recognized advantages of this model. Numerous factors should be considered before building a new or retooling an existing facility. Design decisions will directly impact the management and maintenance costs. We and others have advocated for more rigorous approaches to zebrafish health management to support and protect an increasingly diverse portfolio of important research. The Sinnhuber Aquatic Research Laboratory (SARL) is located ∼3 miles from the main Oregon State University campus in Corvallis, Oregon. This facility supports several research programs that depend heavily on the use of adult, larval, and embryonic zebrafish. The new zebrafish facility of the SARL began operation in 2007 with a commitment to build and manage an efficient facility that diligently protects human and fish health. An important goal was to ensure that the facility was free of Pseudoloma neurophilia (Microsporidia), which is very common in zebrafish research facilities. We recognize that there are certain limitations in space, resources, and financial support that are institution dependent, but in this article, we describe the steps taken to build and manage an efficient specific pathogen-free facility.
Rosenfeld, Richard M.; Shiffman, Richard N.
2010-01-01
Background Guidelines translate best evidence into best practice. A well-crafted guideline promotes quality by reducing healthcare variations, improving diagnostic accuracy, promoting effective therapy, and discouraging ineffective – or potentially harmful – interventions. Despite a plethora of published guidelines, methodology is often poorly defined and varies greatly within and among organizations. Purpose This manual describes the principles and practices used successfully by the American Academy of Otolaryngology – Head and Neck Surgery to produce quality-driven, evidence-based guidelines using efficient and transparent methodology for action-ready recommendations with multi-disciplinary applicability. The development process, which allows moving from conception to completion in twelve months, emphasizes a logical sequence of key action statements supported by amplifying text, evidence profiles, and recommendation grades that link action to evidence. Conclusions As clinical practice guidelines become more prominent as a key metric of quality healthcare, organizations must develop efficient production strategies that balance rigor and pragmatism. Equally important, clinicians must become savvy in understanding what guidelines are – and are not – and how they are best utilized to improve care. The information in this manual should help clinicians and organizations achieve these goals. PMID:19464525
Back to the Future: The Expanding Communities Curriculum in Geography Education
ERIC Educational Resources Information Center
Halvorsen, Anne-Lise
2009-01-01
This article traces the history of the expanding communities approach, the leading organizational structure for elementary social studies education since the 1930s. Since its introduction into the curriculum, educators have argued about the approach's effectiveness and suitability. Critics claim it lacks intellectual rigor and is redundant in that…
Special Teaching for Special Children? Pedagogies for Inclusion. Inclusive Education
ERIC Educational Resources Information Center
Lewis, Ann, Ed.; Norwich, Brahm, Ed.
2004-01-01
Some special needs groups (for example dyslexia) have argued strongly for the need for particular specialist approaches. In contrast, many proponents of inclusion have argued that "good teaching is good teaching for all" and that all children benefit from similar approaches. Both positions fail to scrutinise this issue rigorously and…
ERIC Educational Resources Information Center
Chang, Benjamin
2017-01-01
This article discusses the use of critical and sociocultural approaches to more dynamically 'internationalise' higher education in the Hong Kong Special Administrative Region of China. The article explores the integration of critical pedagogy and sociocultural learning theory in developing more engaging and rigorous education practices for…
NASA Astrophysics Data System (ADS)
Toman, Blaza; Nelson, Michael A.; Bedner, Mary
2017-06-01
Chemical measurement methods are designed to promote accurate knowledge of a measurand or system. As such, these methods often allow elicitation of latent sources of variability and correlation in experimental data. They typically implement measurement equations that support quantification of effects associated with calibration standards and other known or observed parametric variables. Additionally, multiple samples and calibrants are usually analyzed to assess accuracy of the measurement procedure and repeatability by the analyst. Thus, a realistic assessment of uncertainty for most chemical measurement methods is not purely bottom-up (based on the measurement equation) or top-down (based on the experimental design), but inherently contains elements of both. Confidence in results must be rigorously evaluated for the sources of variability in all of the bottom-up and top-down elements. This type of analysis presents unique challenges due to various statistical correlations among the outputs of measurement equations. One approach is to use a Bayesian hierarchical (BH) model which is intrinsically rigorous, thus making it a straightforward method for use with complex experimental designs, particularly when correlations among data are numerous and difficult to elucidate or explicitly quantify. In simpler cases, careful analysis using GUM Supplement 1 (MC) methods augmented with random effects meta analysis yields similar results to a full BH model analysis. In this article we describe both approaches to rigorous uncertainty evaluation using as examples measurements of 25-hydroxyvitamin D3 in solution reference materials via liquid chromatography with UV absorbance detection (LC-UV) and liquid chromatography mass spectrometric detection using isotope dilution (LC-IDMS).
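A minimal GUM-Supplement-1-style Monte Carlo sketch in the spirit of the MC approach mentioned above, propagating a toy calibration-ratio measurement equation with an added between-run random effect; all distributions and numbers are invented for illustration and are unrelated to the 25-hydroxyvitamin D3 measurements.

```python
# Hedged sketch of Monte Carlo uncertainty propagation for a toy measurement
# equation c_sample = c_std * (A_sample / A_std) plus a between-run effect.
import numpy as np

rng = np.random.default_rng(2024)
M = 200_000                                          # Monte Carlo trials

c_std   = rng.normal(25.0, 0.10, M)                  # calibrant concentration
A_samp  = rng.normal(1.050, 0.004, M)                # sample peak area (repeatability)
A_std   = rng.normal(1.000, 0.004, M)                # calibrant peak area
between = rng.normal(0.0, 0.15, M)                   # between-run random effect

c_sample = c_std * A_samp / A_std + between          # measurement equation

mean = c_sample.mean()
u = c_sample.std(ddof=1)                             # standard uncertainty
lo, hi = np.percentile(c_sample, [2.5, 97.5])        # 95 % coverage interval
print(f"c = {mean:.2f}, u = {u:.2f}, 95% interval = [{lo:.2f}, {hi:.2f}]")
```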
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beau, Mathieu, E-mail: mbeau@stp.dias.ie; Savoie, Baptiste, E-mail: baptiste.savoie@gmail.com
2014-05-15
In this paper, we rigorously investigate the reduced density matrix (RDM) associated with the ideal Bose gas in harmonic traps. We present a method based on a sum-decomposition of the RDM that allows us to treat not only the isotropic trap, but also general anisotropic traps. When focusing on the isotropic trap, the method is analogous to the loop-gas approach developed by Mullin [“The loop-gas approach to Bose-Einstein condensation for trapped particles,” Am. J. Phys. 68(2), 120 (2000)]. Turning to the case of anisotropic traps, we examine the RDM for some anisotropic trap models corresponding to some quasi-1D and quasi-2D regimes. For such models, we bring out an additional contribution in the local density of particles which arises from the mesoscopic loops. The close connection with the occurrence of generalized-Bose-Einstein condensation is discussed. Our loop-gas-like approach provides relevant information which can help guide numerical investigations on highly anisotropic systems based on the Path Integral Monte Carlo method.
NASA Technical Reports Server (NTRS)
Hornstein, Rhoda S.; Wunderlich, Dana A.; Willoughby, John K.
1992-01-01
New and innovative software technology is presented that provides a cost effective bridge for smoothly transitioning prototype software, in the field of planning and scheduling, into an operational environment. Specifically, this technology mixes the flexibility and human design efficiency of dynamic data typing with the rigor and run-time efficiencies of static data typing. This new technology provides a very valuable tool for conducting the extensive, up-front system prototyping that leads to specifying the correct system and producing a reliable, efficient version that will be operationally effective and will be accepted by the intended users.
Determinants of the Rigor of State Protection Policies for Persons With Dementia in Assisted Living.
Nattinger, Matthew C; Kaskie, Brian
2017-01-01
Continued growth in the number of individuals with dementia residing in assisted living (AL) facilities raises concerns about their safety and protection. However, unlike federally regulated nursing facilities, AL facilities are state-regulated and there is a high degree of variation among policies designed to protect persons with dementia. Despite the important role these protection policies have in shaping the quality of life of persons with dementia residing in AL facilities, little is known about their formation. In this research, we examined the adoption of AL protection policies pertaining to staffing, the physical environment, and the use of chemical restraints. For each protection policy type, we modeled policy rigor using an innovative point-in-time approach, incorporating variables associated with state contextual, institutional, political, and external factors. We found that the rate of state AL protection policy adoptions remained steady over the study period, with staffing policies becoming less rigorous over time. Variables reflecting institutional policy making, including legislative professionalism and bureaucratic oversight, were associated with the rigor of state AL dementia protection policies. As we continue to evaluate the mechanisms contributing to the rigor of AL protection policies, it seems that organized advocacy efforts might expand their role in educating state policy makers about the importance of protecting persons with dementia residing in AL facilities and moving to advance appropriate policies.
All That Glitters: A Glimpse into the Future of Cancer Screening
Developing new screening approaches and rigorously establishing their validity is challenging. Researchers are actively searching for new screening tests that improve the benefits of screening while limiting the harms.
Testability of evolutionary game dynamics based on experimental economics data
NASA Astrophysics Data System (ADS)
Wang, Yijia; Chen, Xiaojie; Wang, Zhijian
2017-11-01
Understanding the dynamic processes of a real game system requires an appropriate dynamics model, and rigorously testing a dynamics model is nontrivial. In our methodological research, we develop an approach to testing the validity of game dynamics models that considers the dynamic patterns of angular momentum and speed as measurement variables. Using Rock-Paper-Scissors (RPS) games as an example, we illustrate the geometric patterns in the experiment data. We then derive the related theoretical patterns from a series of typical dynamics models. By testing the goodness-of-fit between the experimental and theoretical patterns, we show that the validity of these models can be evaluated quantitatively. Our approach establishes a link between dynamics models and experimental systems, which is, to the best of our knowledge, the most effective and rigorous strategy for ascertaining the testability of evolutionary game dynamics models.
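As a hedged illustration of the measurement variables used in this approach, the sketch below integrates one candidate model (standard replicator dynamics on an RPS game, chosen only as an example) and computes the angular momentum about the simplex centroid and the speed along the trajectory; the same statistics computed from experimental state sequences could then be compared against such theoretical patterns.

```python
# Sketch: angular momentum and speed patterns for one candidate dynamics model
# (replicator dynamics on Rock-Paper-Scissors), used as theoretical reference
# quantities; the payoff matrix, step size, and initial point are assumptions.
import numpy as np

A = np.array([[0., -1., 1.], [1., 0., -1.], [-1., 1., 0.]])   # RPS payoffs
dt = 0.01

def replicator_step(x):
    f = A @ x
    return x + dt * x * (f - x @ f)

# Project the 3-simplex onto 2D coordinates centered at (1/3, 1/3, 1/3).
P = np.array([[1.0, -0.5, -0.5], [0.0, np.sqrt(3) / 2, -np.sqrt(3) / 2]])
centroid = np.full(3, 1 / 3)

x = np.array([0.6, 0.25, 0.15])
L_vals, speeds = [], []
for _ in range(5000):
    x_new = replicator_step(x)
    r = P @ (x - centroid)                       # position relative to centroid
    v = P @ (x_new - x) / dt                     # finite-difference velocity
    L_vals.append(r[0] * v[1] - r[1] * v[0])     # z-component of r x v
    speeds.append(np.linalg.norm(v))
    x = x_new

print(f"mean angular momentum: {np.mean(L_vals):+.4f}, mean speed: {np.mean(speeds):.4f}")
```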
Fetisova, Z G
2004-01-01
In accordance with our concept of rigorous optimization of photosynthetic machinery by a functional criterion, this series of papers continues a purposeful search in natural photosynthetic units (PSUs) for the basic principles of their organization that we predicted theoretically for optimal model light-harvesting systems. This approach allowed us to determine the basic principles for the organization of a PSU of any fixed size. This series of papers deals with the problem of structural optimization of a light-harvesting antenna of variable size, controlled in vivo by the light intensity during the growth of organisms, which accentuates the problem of antenna structure optimization because optimization requirements become more stringent as the PSU increases in size. In this work, using mathematical modeling of the functioning of natural PSUs, we have shown that the aggregation of pigments in a model light-harvesting antenna, being one of the universal optimizing factors, also allows the antenna efficiency to be controlled if the extent of pigment aggregation is a variable parameter. In this case, the efficiency of the antenna increases with the size of the elementary antenna aggregate, thus ensuring the high efficiency of the PSU irrespective of its size; i.e., variation in the extent of pigment aggregation, controlled by the size of the light-harvesting antenna, is biologically expedient.
Statistical issues in signal extraction from microarrays
NASA Astrophysics Data System (ADS)
Bergemann, Tracy; Quiaoit, Filemon; Delrow, Jeffrey J.; Zhao, Lue Ping
2001-06-01
Microarray technologies are increasingly used in biomedical research to study genome-wide expression profiles in the post genomic era. Their popularity is largely due to their high throughput and economical affordability. For example, microarrays have been applied to studies of cell cycle, regulatory circuitry, cancer cell lines, tumor tissues, and drug discovery. One obstacle facing the continued success of applying microarray technologies, however, is the random variation present on microarrays: within signal spots, between spots and among chips. In addition, signals extracted by available software packages seem to vary significantly. Despite a variety of software packages, it appears that there are two major approaches to signal extraction. One approach is to focus on the identification of signal regions and hence estimation of signal levels above background levels. The other approach is to use the distribution of intensity values as a way of identifying relevant signals. Building upon both approaches, the objective of our work is to develop a method that is statistically rigorous and also efficient and robust. Statistical issues to be considered here include: (1) how to refine grid alignment so that the overall variation is minimized, (2) how to estimate the signal levels relative to the local background levels as well as the variance of this estimate, and (3) how to integrate red and green channel signals so that the ratio of interest is stable, simultaneously relaxing distributional assumptions.
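One simple way to make issue (2) concrete, offered only as a hedged sketch rather than the estimator developed in this work: estimate the local background from the border of a pixel window, subtract it from the spot interior, and attach a rough standard error to the background-subtracted signal.

```python
# Illustrative sketch of extracting a spot signal relative to its local
# background from a small pixel window; the window size, radii, and synthetic
# data below are assumptions for demonstration only.
import numpy as np

def extract_spot(window, spot_radius):
    """window: 2-D array of pixel intensities centered on a putative spot."""
    h, w = window.shape
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - (h - 1) / 2, xx - (w - 1) / 2)
    spot_px = window[r <= spot_radius]               # spot interior pixels
    bkg_px = window[r >= min(h, w) / 2 - 1]          # ring of border pixels
    bkg = np.median(bkg_px)                          # robust local background
    signal = spot_px.mean() - bkg                    # background-subtracted signal
    se = np.sqrt(spot_px.var(ddof=1) / spot_px.size + bkg_px.var(ddof=1) / bkg_px.size)
    return signal, se

rng = np.random.default_rng(3)
win = rng.normal(100, 5, (15, 15))                   # synthetic local background
win[3:12, 3:12] += 60                                # synthetic spot signal
print(extract_spot(win, spot_radius=4))
```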
Highly efficient color filter array using resonant Si3N4 gratings.
Uddin, Mohammad Jalal; Magnusson, Robert
2013-05-20
We demonstrate the design and fabrication of a highly efficient guided-mode resonant color filter array. The device is designed using numerical methods based on rigorous coupled-wave analysis and is patterned using UV-laser interferometric lithography. It consists of a 60-nm-thick subwavelength silicon nitride grating along with a 105-nm-thick homogeneous silicon nitride waveguide on a glass substrate. The fabricated device exhibits blue, green, and red color response for grating periods of 274, 327, and 369 nm, respectively. The pixels have a spectral bandwidth of ~12 nm with efficiencies of 94%, 96%, and 99% at the center wavelength of blue, green, and red color filter, respectively. These are higher efficiencies than reported in the literature previously.
Effective grating theory for resonance domain surface-relief diffraction gratings.
Golub, Michael A; Friesem, Asher A
2005-06-01
An effective grating model, which generalizes effective-medium theory to the case of resonance domain surface-relief gratings, is presented. In addition to the zero order, it takes into account the first diffraction order, which obeys the Bragg condition. Modeling the surface-relief grating as an effective grating with two diffraction orders provides closed-form analytical relationships between efficiency and grating parameters. The aspect ratio, the grating period, and the required incidence angle that would lead to high diffraction efficiencies are predicted for TE and TM polarization and verified by rigorous numerical calculations.
Evolution Of USDOE Performance Assessments Over 20 Years
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seitz, Roger R.; Suttora, Linda C.
2013-02-26
Performance assessments (PAs) have been used for many years for the analysis of post-closure hazards associated with a radioactive waste disposal facility and to provide a reasonable expectation of the ability of the site and facility design to meet objectives for the protection of members of the public and the environment. The use of PA to support decision-making for LLW disposal facilities has been mandated in United States Department of Energy (USDOE) directives governing radioactive waste management since 1988 (currently DOE Order 435.1, Radioactive Waste Management). Prior to that time, PAs were also used in a less formal role. Over the past 20+ years, the USDOE approach to conduct, review and apply PAs has evolved into an efficient, rigorous and mature process that includes specific requirements for continuous improvement and independent reviews. The PA process has evolved through refinement of a graded and iterative approach designed to help focus efforts on those aspects of the problem expected to have the greatest influence on the decision being made. Many of the evolutionary changes to the PA process are linked to the refinement of the PA maintenance concept that has proven to be an important element of USDOE PA requirements in the context of supporting decision-making for safe disposal of LLW. The PA maintenance concept represents the evolution of the graded and iterative philosophy and has helped to drive the evolution of PAs from a deterministic compliance calculation into a systematic approach that helps to focus on critical aspects of the disposal system in a manner designed to provide a more informed basis for decision-making throughout the life of a disposal facility (e.g., monitoring, research and testing, waste acceptance criteria, design improvements, data collection, model refinements). A significant evolution in PA modeling has been associated with improved use of uncertainty and sensitivity analysis techniques to support efficient implementation of the graded and iterative approach. Rather than attempt to exactly predict the migration of radionuclides in a disposal unit, the best PAs have evolved into tools that provide a range of results to guide decision-makers in planning the most efficient, cost effective, and safe disposal of radionuclides.
Lichen elements as pollution indicators: evaluation of methods for large monitoring programmes
Susan Will-Wolf; Sarah Jovan; Michael C. Amacher
2017-01-01
Lichen element content is a reliable indicator for relative air pollution load in research and monitoring programmes requiring both efficiency and representation of many sites. We tested the value of costly rigorous field and handling protocols for sample element analysis using five lichen species. No relaxation of rigour was supported; four relaxed protocols generated...
Validation of the ROMI-RIP rough mill simulator
Edward R. Thomas; Urs Buehlmann
2002-01-01
The USDA Forest Service's ROMI-RIP rough mill rip-first simulation program is a popular tool for analyzing rough mill conditions, determining more efficient rough mill practices, and finding optimal lumber board cut-up patterns. However, until now, the results generated by ROMI-RIP have not been rigorously compared to those of an actual rough mill. Validating the...
ERIC Educational Resources Information Center
Vineberg, Robert; Joyner, John N.
Instructional System Development (ISD) methodologies and practices were examined in the Army, Navy, Marine Corps, and Air Force, each of which prescribes the ISD system involving rigorous derivation of training requirements from job requirements, selection of instructional strategies to maximize training efficiency, and revision of instruction…
Ghosh, Monisankar; Saha, Suchandrima; Dutta, Samir Kumar
2016-02-07
Herein, we synthesize and elucidate the potential of a novel 'dual hit' molecule, LDCA, to constitutively block lactate dehydrogenase isoform-A (LDH-A) to selectively subvert apoptosis and rigorously attenuate breast tumor progression in a mouse model, comprehensively delineating the therapeutic prospectus of LDCA in the field of cancer metabolics.
A Curricular-Sampling Approach to Progress Monitoring: Mathematics Concepts and Applications
ERIC Educational Resources Information Center
Fuchs, Lynn S.; Fuchs, Douglas; Zumeta, Rebecca O.
2008-01-01
Progress monitoring is an important component of effective instructional practice. Curriculum-based measurement (CBM) is a form of progress monitoring that has been the focus of rigorous research. Two approaches for formulating CBM systems exist. The first is to assess performance regularly on a task that serves as a global indicator of competence…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-11
... tests will help to ensure that control systems are maintained properly over time and a more rigorous... approach, industry is expected to save time in the performance test submittal process. Additionally this... pulping vent gas control at mills where the CCA approach would be adversely affected. Our revised cost...
A Review of the Application of Lifecycle Analysis to Renewable Energy Systems
ERIC Educational Resources Information Center
Lund, Chris; Biswas, Wahidul
2008-01-01
The lifecycle concept is a "cradle to grave" approach to thinking about products, processes, and services, recognizing that all stages have environmental and economic impacts. Any rigorous and meaningful comparison of energy supply options must be done using a lifecycle analysis approach. It has been applied to an increasing number of conventional…
Optimization of Wireless Power Transfer Systems Enhanced by Passive Elements and Metasurfaces
NASA Astrophysics Data System (ADS)
Lang, Hans-Dieter; Sarris, Costas D.
2017-10-01
This paper presents a rigorous optimization technique for wireless power transfer (WPT) systems enhanced by passive elements, ranging from simple reflectors and intermediate relays all the way to general electromagnetic guiding and focusing structures, such as metasurfaces and metamaterials. At its core is a convex semidefinite relaxation formulation of the otherwise nonconvex optimization problem, whose tightness and optimality can be confirmed by a simple test of its solutions. The resulting method is rigorous, versatile, and general -- it does not rely on any assumptions. As shown in various examples, it is able to efficiently and reliably optimize such WPT systems in order to find their physical limitations on performance and optimal operating parameters, and to inspect their working principles, even for a large number of active transmitters and passive elements.
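As a generic illustration of the convex semidefinite relaxation idea (a sketch under assumed problem data, not the paper's actual formulation), a nonconvex quadratic design problem max_x x^H C x subject to x^H A_i x <= b_i can be lifted to a matrix variable X standing in for x x^H and solved as a semidefinite program; if the optimal X is (numerically) rank one, the relaxation is tight and the optimal excitation is recovered from its leading eigenvector:

import cvxpy as cp
import numpy as np

# Assumed, illustrative problem data: C encodes the power-transfer objective,
# the single constraint bounds the total excitation power.
n = 4
rng = np.random.default_rng(0)
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
C = (M + M.conj().T) / 2          # Hermitian objective matrix
A, b = [np.eye(n)], [1.0]

X = cp.Variable((n, n), hermitian=True)
constraints = [X >> 0] + [cp.real(cp.trace(Ai @ X)) <= bi for Ai, bi in zip(A, b)]
prob = cp.Problem(cp.Maximize(cp.real(cp.trace(C @ X))), constraints)
prob.solve()

w, V = np.linalg.eigh(X.value)    # tightness test: is X rank one?
x_opt = np.sqrt(max(w[-1], 0.0)) * V[:, -1]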
Advances in knowledge-based software engineering
NASA Technical Reports Server (NTRS)
Truszkowski, Walt
1991-01-01
The underlying hypothesis of this work is that a rigorous and comprehensive software reuse methodology can bring about a more effective and efficient utilization of constrained resources in the development of large-scale software systems by both government and industry. It is also believed that correct use of this type of software engineering methodology can significantly contribute to the higher levels of reliability that will be required of future operational systems. An overview and discussion of current research in the development and application of two systems that support a rigorous reuse paradigm are presented: the Knowledge-Based Software Engineering Environment (KBSEE) and the Knowledge Acquisition for the Preservation of Tradeoffs and Underlying Rationales (KAPTUR) systems. Emphasis is on a presentation of operational scenarios which highlight the major functional capabilities of the two systems.
Capalbo, Antonio; Romanelli, Valeria; Cimadomo, Danilo; Girardi, Laura; Stoppa, Marta; Dovere, Lisa; Dell'Edera, Domenico; Ubaldi, Filippo Maria; Rienzi, Laura
2016-10-01
For an IVF clinic that wishes to implement preimplantation genetic diagnosis for monogenic diseases (PGD) and for aneuploidy testing (PGD-A), a global improvement is required through all the steps of an IVF treatment and patient care. At present, CCS (Comprehensive Chromosome Screening)-based trophectoderm (TE) biopsy has been demonstrated to be a safe, accurate and reproducible approach for conducting PGD-A and possibly also PGD from the same biopsy. Key challenges in PGD/PGD-A implementation cover genetic and reproductive counselling, selection of the most efficient approach for blastocyst biopsy, and selection of the best performing molecular technique for conducting CCS and monogenic disease analysis. Three different approaches for TE biopsy can be compared; among them, the approaches that entail opening the zona only once the expanded blastocyst stage is reached are the only biopsy methods compatible with a totally undisturbed embryo culture strategy (time lapse-based incubation in a single medium). Moreover, contemporary CCS technologies show a different spectrum of capabilities and limits that potentially impact the clinical outcomes, the management and the applicability of PGD-A itself. In general, CCS approaches that avoid the use of whole genome amplification (WGA) can provide higher reliability of results with lower costs and turnaround time of analysis. Future perspectives are focused on the scrupulous and rigorous clinical validation of novel CCS methods based on targeted approaches that avoid the use of WGA, such as targeted next-generation sequencing technology, to further improve the throughput of analysis and the overall cost-effectiveness of PGD/PGD-A.
Biomedical text mining for research rigor and integrity: tasks, challenges, directions.
Kilicoglu, Halil
2017-06-13
An estimated quarter of a trillion US dollars is invested in the biomedical research enterprise annually. There is growing alarm that a significant portion of this investment is wasted because of problems in reproducibility of research findings and in the rigor and integrity of research conduct and reporting. Recent years have seen a flurry of activities focusing on standardization and guideline development to enhance the reproducibility and rigor of biomedical research. Research activity is primarily communicated via textual artifacts, ranging from grant applications to journal publications. These artifacts can be both the source and the manifestation of practices leading to research waste. For example, an article may describe a poorly designed experiment, or the authors may reach conclusions not supported by the evidence presented. In this article, we pose the question of whether biomedical text mining techniques can assist the stakeholders in the biomedical research enterprise in doing their part toward enhancing research integrity and rigor. In particular, we identify four key areas in which text mining techniques can make a significant contribution: plagiarism/fraud detection, ensuring adherence to reporting guidelines, managing information overload and accurate citation/enhanced bibliometrics. We review the existing methods and tools for specific tasks, if they exist, or discuss relevant research that can provide guidance for future work. With the exponential increase in biomedical research output and the ability of text mining approaches to perform automatic tasks at large scale, we propose that such approaches can support tools that promote responsible research practices, providing significant benefits for the biomedical research enterprise. Published by Oxford University Press 2017. This work is written by a US Government employee and is in the public domain in the US.
Bikson, Marom; Brunoni, Andre R.; Charvet, Leigh E.; Clark, Vincent P.; Cohen, Leonardo G.; Deng, Zhi-De; Dmochowski, Jacek; Edwards, Dylan J.; Frohlich, Flavio; Kappenman, Emily S.; Lim, Kelvin O.; Loo, Colleen; Mantovani, Antonio; McMullen, David P.; Parra, Lucas C.; Pearson, Michele; Richardson, Jessica D.; Rumsey, Judith M.; Sehatpour, Pejman; Sommers, David; Unal, Gozde; Wassermann, Eric M.; Woods, Adam J.; Lisanby, Sarah H.
2018-01-01
Background Neuropsychiatric disorders are a leading source of disability and require novel treatments that target mechanisms of disease. As such disorders are thought to result from aberrant neuronal circuit activity, neuromodulation approaches are of increasing interest given their potential for manipulating circuits directly. Low intensity transcranial electrical stimulation (tES) with direct currents (transcranial direct current stimulation, tDCS) or alternating currents (transcranial alternating current stimulation, tACS) represent novel, safe, well-tolerated, and relatively inexpensive putative treatment modalities. Objective This report seeks to promote the science, technology and effective clinical applications of these modalities, identify research challenges, and suggest approaches for addressing these needs in order to achieve rigorous, reproducible findings that can advance clinical treatment. Methods The National Institute of Mental Health (NIMH) convened a workshop in September 2016 that brought together experts in basic and human neuroscience, electrical stimulation biophysics and devices, and clinical trial methods to examine the physiological mechanisms underlying tDCS/tACS, technologies and technical strategies for optimizing stimulation protocols, and the state of the science with respect to therapeutic applications and trial designs. Results Advances in understanding mechanisms, methodological and technological improvements (e.g., electronics, computational models to facilitate proper dosing), and improved clinical trial designs are poised to advance rigorous, reproducible therapeutic applications of these techniques. A number of challenges were identified and meeting participants made recommendations to address them. Conclusions These recommendations align with requirements in NIMH funding opportunity announcements to, among other needs, define dosimetry, demonstrate dose/response relationships, implement rigorous blinded trial designs, employ computational modeling, and demonstrate target engagement when testing stimulation-based interventions for the treatment of mental disorders. PMID:29398575
NASA Astrophysics Data System (ADS)
Gillam, Thomas P. S.; Lester, Christopher G.
2014-11-01
We consider current and alternative approaches to setting limits on new physics signals having backgrounds from misidentified objects; for example jets misidentified as leptons, b-jets or photons. Many ATLAS and CMS analyses have used a heuristic "matrix method" for estimating the background contribution from such sources. We demonstrate that the matrix method suffers from statistical shortcomings that can adversely affect its ability to set robust limits. A rigorous alternative method is discussed, and is seen to produce fake rate estimates and limits with better qualities, but is found to be too costly to use. Having investigated the nature of the approximations used to derive the matrix method, we propose a third strategy that is seen to marry the speed of the matrix method to the performance and physicality of the more rigorous approach.
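For context, a minimal single-bin sketch of the heuristic matrix method (the two-category loose/tight setup and all names are assumptions; realistic analyses use many categories and bins): with a real-object efficiency eps_r and a misidentification (fake) rate eps_f measured in control samples, the observed loose and tight counts are inverted to estimate the fake contribution passing the tight selection.

import numpy as np

def fake_background_tight(n_loose, n_tight, eps_r, eps_f):
    # Solve  n_loose = n_real + n_fake
    #        n_tight = eps_r * n_real + eps_f * n_fake
    # for the two unknown components, then scale the fake component
    # into the tight selection.
    A = np.array([[1.0, 1.0],
                  [eps_r, eps_f]])
    n_real, n_fake = np.linalg.solve(A, np.array([n_loose, n_tight]))
    return eps_f * n_fake

# Illustrative numbers only:
print(fake_background_tight(n_loose=1000, n_tight=450, eps_r=0.9, eps_f=0.2))

One known practical issue with such inversions is that they can return negative or unstable estimates when counts fluctuate, which is the kind of shortcoming a fully probabilistic treatment avoids.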
Effectiveness of Culturally Appropriate Adaptations to Juvenile Justice Services
Vergara, Andrew T.; Kathuria, Parul; Woodmass, Kyler; Janke, Robert; Wells, Susan J.
2017-01-01
Despite efforts to increase cultural competence of services within juvenile justice systems, disproportional minority contact (DMC) persists throughout Canada and the United States. Commonly cited approaches to decreasing DMC include large-scale systemic changes as well as enhancement of the cultural relevance and responsiveness of services delivered. Cultural adaptations to service delivery focus on prevention, decision-making, and treatment services to reduce initial contact, minimize unnecessary restraint, and reduce recidivism. Though locating rigorous testing of these approaches compared to standard interventions is difficult, this paper identifies and reports on such research. The Cochrane guidelines for systematic literature reviews and meta-analyses served as a foundation for study methodology. Databases such as Legal Periodicals and Books were searched through June 2015. Three studies were sufficiently rigorous to identify the effect of the cultural adaptations, and three studies that are making potentially important contributions to the field were also reviewed. PMID:29468092
Model-based safety analysis of human-robot interactions: the MIRAS walking assistance robot.
Guiochet, Jérémie; Hoang, Quynh Anh Do; Kaaniche, Mohamed; Powell, David
2013-06-01
Robotic systems have to cope with various execution environments while guaranteeing safety, and in particular when they interact with humans during rehabilitation tasks. These systems are often critical since their failure can lead to human injury or even death. However, such systems are difficult to validate due to their high complexity and the fact that they operate within complex, variable and uncertain environments (including users), in which it is difficult to foresee all possible system behaviors. Because of the complexity of human-robot interactions, rigorous and systematic approaches are needed to assist the developers in the identification of significant threats and the implementation of efficient protection mechanisms, and in the elaboration of a sound argumentation to justify the level of safety that can be achieved by the system. For threat identification, we propose a method called HAZOP-UML based on a risk analysis technique adapted to system description models, focusing on human-robot interaction models. The output of this step is then injected in a structured safety argumentation using the GSN graphical notation. Those approaches have been successfully applied to the development of a walking assistant robot which is now in clinical validation.
NASA Astrophysics Data System (ADS)
Zhou, X.; Albertson, J. D.
2016-12-01
Natural gas is considered a bridge fuel towards clean energy due to its potentially lower greenhouse gas emissions compared with other fossil fuels. Despite numerous efforts, an efficient and cost-effective approach to monitor fugitive methane emissions along the natural gas production-supply chain has not yet been developed. Recently, mobile methane measurement has been introduced, which applies a Bayesian approach to probabilistically infer methane emission rates and update the estimates recursively as new measurements become available. However, the likelihood function, especially the error term which determines the shape of the estimate uncertainty, has not been rigorously defined and evaluated with field data. To address this issue, we performed a series of near-source (< 30 m) controlled methane release experiments using a specialized vehicle mounted with fast response methane analyzers and a GPS unit. Methane concentrations were measured at two different heights along mobile traversals downwind of the sources, and concurrent wind and temperature data were recorded by nearby 3-D sonic anemometers. With known methane release rates, the measurements were used to determine the functional form and the parameterization of the likelihood function in the Bayesian inference scheme under different meteorological conditions.
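A minimal sketch of the kind of recursive Bayesian update described above (all names, the single coupling factor standing in for a dispersion-model run, and the lognormal error term are assumptions, since the point of the work is precisely that the error term must be calibrated against field data):

import numpy as np

def update_posterior(prior, q_grid, c_obs, coupling, sigma_log):
    # prior     : probabilities over candidate emission rates q_grid
    # c_obs     : one observed downwind concentration enhancement
    # coupling  : modeled concentration per unit emission rate for the
    #             current meteorology (e.g., from a Gaussian plume run)
    # sigma_log : assumed log-space standard deviation of model-data error
    c_pred = np.maximum(q_grid * coupling, 1e-12)
    log_like = -0.5 * ((np.log(c_obs) - np.log(c_pred)) / sigma_log) ** 2
    post = prior * np.exp(log_like - log_like.max())
    return post / post.sum()

q_grid = np.linspace(0.01, 5.0, 500)                 # candidate rates (g/s)
posterior = np.full(q_grid.size, 1.0 / q_grid.size)  # flat prior
for c_obs, coupling in [(0.8, 0.5), (1.1, 0.6), (0.7, 0.4)]:  # illustrative traverses
    posterior = update_posterior(posterior, q_grid, c_obs, coupling, sigma_log=0.5)
print(q_grid[np.argmax(posterior)])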
The Applied Mathematics for Power Systems (AMPS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chertkov, Michael
2012-07-24
Increased deployment of new technologies, e.g., renewable generation and electric vehicles, is rapidly transforming electrical power networks by crossing previously distinct spatiotemporal scales and invalidating many traditional approaches for designing, analyzing, and operating power grids. This trend is expected to accelerate over the coming years, bringing the disruptive challenge of complexity, but also opportunities to deliver unprecedented efficiency and reliability. Our Applied Mathematics for Power Systems (AMPS) Center will discover, enable, and solve emerging mathematics challenges arising in power systems and, more generally, in complex engineered networks. We will develop foundational applied mathematics resulting in rigorous algorithms and simulation toolboxes for modern and future engineered networks. The AMPS Center deconstruction/reconstruction approach 'deconstructs' complex networks into sub-problems within non-separable spatiotemporal scales, a missing step in 20th century modeling of engineered networks. These sub-problems are addressed within the appropriate AMPS foundational pillar - complex systems, control theory, and optimization theory - and merged or 'reconstructed' at their boundaries into more general mathematical descriptions of complex engineered networks where important new questions are formulated and attacked. These two steps, iterated multiple times, will bridge the growing chasm between the legacy power grid and its future as a complex engineered network.
Sun, Xin; Zhang, Feng; Ding, Yinhuan; Davies, Thomas W; Li, Yu; Wu, Donghui
2017-08-15
Species delimitation remains a significant challenge when the diagnostic morphological characters are limited. Integrative taxonomy was applied to the genus Protaphorura (Collembola: Onychiuridae), one of the most taxonomically difficult groups of soil animals. Three delimitation approaches (morphology, molecular markers and geography) were applied, providing rigorous species validation criteria with an acceptably low error rate. Multiple molecular approaches, including distance- and evolutionary model-based methods, were used to determine species boundaries based on 144 standard barcode sequences. Twenty-two putative molecular species were consistently recovered across molecular and geographical analyses. Geographic criteria proved to be an efficient delimitation method for onychiurids. Further morphological examination, based on the combination of the number of pseudocelli, parapseudocelli and ventral mesothoracic chaetae, confirmed 18 taxa of the 22 molecular units, with six of them described as new species. These characters were found to be of high taxonomical value. This study highlights the potential benefits of integrative taxonomy, particularly the simultaneous use of molecular and geographical tools, as a powerful way of ascertaining the true diversity of the Onychiuridae. Our study also highlights that discovering new morphological characters remains central to achieving a full understanding of collembolan taxonomy.
Growing organizational capacity through a systems approach: one health network's experience.
MacKenzie, Richard; Capuano, Terry; Durishin, Linda Drexinger; Stern, Glen; Burke, James B
2008-02-01
Hospitals are reporting unexpected surges in demand for services. Lehigh Valley Hospital challenged its clinical and administrative staff to increase capacity by at least 4% per year using an interdepartmental, systemwide initiative, Growing Organizational Capacity (GOC). Following a systemwide leadership retreat that yielded more than 1,000 ideas, the initiative's principal sponsor convened a cross-functional improvement team. During a two-year period, 17 projects were implemented. Using a complex systems approach, improvement ideas "emerged" from microsystems at the points of care. Through rigorous reporting and testing of process adaptations, need, data, and people drove innovation. Hundreds of multilevel clinical and administrative staff redesigned processes and roles to increase organizational capacity. Admissions rose by 6.1%, 5.5%, 8.7%, 5.0%, and 3.8% in fiscal years 2003 through 2007, respectively. Process enhancements cost approximately $1 million, while increased revenues attributable to increased capacity totaled $2.5 million. Multiple, coordinated, and concurrent projects created a greater impact than that possible with a single project. GOC and its success, best explained in the context of complex adaptive systems and microsystem theories, are transferable to throughput issues that challenge efficiency and effectiveness in other health care systems.
Thermal conduction in particle packs via finite elements
NASA Astrophysics Data System (ADS)
Lechman, Jeremy B.; Yarrington, Cole; Erikson, William; Noble, David R.
2013-06-01
Conductive transport in heterogeneous materials composed of discrete particles is a fundamental problem for a number of applications. While analytical results and rigorous bounds on effective conductivity in mono-sized particle dispersions are well established in the literature, the methods used to arrive at these results often fail when the average size of particle clusters becomes large (i.e., near the percolation transition where particle contact networks dominate the bulk conductivity). Our aim is to develop general, efficient numerical methods that would allow us to explore this behavior and compare to a recent microstructural description of conduction in this regime. To this end, we present a finite element analysis approach to modeling heat transfer in granular media with the goal of predicting effective bulk thermal conductivities of particle-based heterogeneous composites. Our approach is verified against theoretical predictions for random isotropic dispersions of mono-disperse particles at various volume fractions up to close packing. Finally, we present results for the probability distribution of the effective conductivity in particle dispersions generated by Brownian dynamics, and suggest how this might be useful in developing stochastic models of effective properties based on the dynamical process involved in creating heterogeneous dispersions.
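For reference, one classical pair of rigorous bounds alluded to above is the Hashin-Shtrikman bounds for an isotropic two-phase dispersion; the short sketch below uses the standard textbook expressions (not taken from the paper; variable names are assumptions):

def hashin_shtrikman_bounds(k_matrix, k_particle, phi_particle):
    # 3-D Hashin-Shtrikman bounds on effective thermal conductivity for
    # particles of conductivity k_particle at volume fraction phi_particle
    # embedded in a matrix of conductivity k_matrix (k_particle > k_matrix).
    phi_m = 1.0 - phi_particle
    lower = k_matrix + phi_particle / (
        1.0 / (k_particle - k_matrix) + phi_m / (3.0 * k_matrix))
    upper = k_particle + phi_m / (
        1.0 / (k_matrix - k_particle) + phi_particle / (3.0 * k_particle))
    return lower, upper

print(hashin_shtrikman_bounds(k_matrix=1.0, k_particle=10.0, phi_particle=0.4))

Bounds of this type become loose in the contact-network-dominated regime near the percolation transition, which is exactly the regime the finite element approach above is designed to probe.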
NASA Astrophysics Data System (ADS)
Liu, Changying; Iserles, Arieh; Wu, Xinyuan
2018-03-01
The Klein-Gordon equation with nonlinear potential occurs in a wide range of application areas in science and engineering. Its computation represents a major challenge. The main theme of this paper is the construction of symmetric and arbitrarily high-order time integrators for the nonlinear Klein-Gordon equation by integrating Birkhoff-Hermite interpolation polynomials. To this end, under the assumption of periodic boundary conditions, we begin with the formulation of the nonlinear Klein-Gordon equation as an abstract second-order ordinary differential equation (ODE) and its operator-variation-of-constants formula. We then derive a symmetric and arbitrarily high-order Birkhoff-Hermite time integration formula for the nonlinear abstract ODE. Accordingly, the stability, convergence and long-time behaviour are rigorously analysed once the spatial differential operator is approximated by an appropriate positive semi-definite matrix, subject to suitable temporal and spatial smoothness. A remarkable characteristic of this new approach is that the requirement of temporal smoothness is reduced compared with the traditional numerical methods for PDEs in the literature. Numerical results demonstrate the advantage and efficiency of our time integrators in comparison with the existing numerical approaches.
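For orientation, the operator-variation-of-constants formula referred to above has (in a standard form, with notation assumed here rather than copied from the paper) the shape

\[ u(t) = \cos(t\Omega)\,u(0) + \Omega^{-1}\sin(t\Omega)\,u'(0) + \int_{0}^{t} \Omega^{-1}\sin\big((t-s)\Omega\big)\, f\big(u(s)\big)\,\mathrm{d}s, \qquad \Omega = \mathcal{A}^{1/2}, \]

for the abstract second-order ODE u'' + \mathcal{A}u = f(u); the time integrators described above are obtained by replacing f(u(s)) under the integral with a Birkhoff-Hermite interpolation polynomial and integrating the result.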
Bloom, Guillaume; Larat, Christian; Lallier, Eric; Lee-Bouhours, Mane-Si Laure; Loiseaux, Brigitte; Huignard, Jean-Pierre
2011-02-10
We have designed a high-efficiency array generator composed of subwavelength grooves etched in a GaAs substrate for operation at 4.5 μm. The method used combines rigorous coupled wave analysis with an optimization algorithm. The optimized beam splitter has both a high efficiency (∼96%) and a good intensity uniformity (∼0.2%). The fabrication error tolerances are numerically calculated, and it is shown that this subwavelength array generator could be fabricated with current electron beam writers and inductively coupled plasma etching. Finally, we studied the effect of a simple and realistic antireflection coating on the performance of the beam splitter.
DISCUSSION ON THE EFFECTS OF MIDDLE-EAR DISEASE ON EFFICIENCY IN CIVIL AND MILITARY LIFE
1928-01-01
(1) Introductory.—(2) Symptoms and complications of middle-ear disease and their effects on efficiency.—(3) Service views on the disposal of men suffering from middle-ear disease.—(4) Comparison of the disability caused by middle-ear disease in civil and military life.—(5) Are the aural recruiting and invaliding standards of to-day too exacting?—(6) Aural disease in recruiting and invaliding—tables.—(7) Advantages of rigorous standards of recruiting.—(8) Attainment of aural efficiency in the Services.—(9) Is any relaxation of standards ever justifiable?—(10) Importance of civil and military co-operation. PMID:19986558
NASA Astrophysics Data System (ADS)
Šprlák, M.; Han, S.-C.; Featherstone, W. E.
2017-12-01
Rigorous modelling of the spherical gravitational potential spectra from the volumetric density and geometry of an attracting body is discussed. Firstly, we derive mathematical formulas for the spatial analysis of spherical harmonic coefficients. Secondly, we present a numerically efficient algorithm for rigorous forward modelling. We consider the finite-amplitude topographic modelling methods as special cases, with additional postulates on the volumetric density and geometry. Thirdly, we implement our algorithm in the form of computer programs and test their correctness with respect to the finite-amplitude topography routines. For this purpose, synthetic and realistic numerical experiments, applied to the gravitational field and geometry of the Moon, are performed. We also investigate the optimal choice of input parameters for the finite-amplitude modelling methods. Fourth, we exploit the rigorous forward modelling for the determination of the spherical gravitational potential spectra inferred by lunar crustal models with uniform, laterally variable, radially variable, and spatially (3D) variable bulk density. Also, we analyse these four different crustal models in terms of their spectral characteristics and band-limited radial gravitation. We demonstrate applicability of the rigorous forward modelling using currently available computational resources up to degree and order 2519 of the spherical harmonic expansion, which corresponds to a resolution of 2.2 km on the surface of the Moon. Computer codes, a user manual and scripts developed for the purposes of this study are publicly available to potential users.
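For reference, the spherical gravitational potential spectra discussed above are the coefficients of the standard solid spherical harmonic expansion (standard geodetic notation, assumed here rather than reproduced from the paper):

\[ V(r,\varphi,\lambda) = \frac{GM}{r}\sum_{n=0}^{N}\left(\frac{R}{r}\right)^{n}\sum_{m=0}^{n}\left(\bar{C}_{nm}\cos m\lambda + \bar{S}_{nm}\sin m\lambda\right)\bar{P}_{nm}(\sin\varphi), \]

where the forward-modelling task is to compute \bar{C}_{nm} and \bar{S}_{nm} from the body's volumetric density and geometry; truncation at degree and order N = 2519 corresponds to the roughly 2.2 km lunar surface resolution quoted above.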
Chiu, Grace S.; Wu, Margaret A.; Lu, Lin
2013-01-01
The ability to quantitatively assess ecological health is of great interest to those tasked with monitoring and conserving ecosystems. For decades, biomonitoring research and policies have relied on multimetric health indices of various forms. Although indices are numbers, many are constructed based on qualitative procedures, thus limiting the quantitative rigor of the practical interpretations of such indices. The statistical modeling approach to construct the latent health factor index (LHFI) was recently developed. With ecological data that otherwise are used to construct conventional multimetric indices, the LHFI framework expresses such data in a rigorous quantitative model, integrating qualitative features of ecosystem health and preconceived ecological relationships among such features. This hierarchical modeling approach allows unified statistical inference of health for observed sites (along with prediction of health for partially observed sites, if desired) and of the relevance of ecological drivers, all accompanied by formal uncertainty statements from a single, integrated analysis. Thus far, the LHFI approach has been demonstrated and validated in a freshwater context. We adapt this approach to modeling estuarine health, and illustrate it on the previously unassessed system in Richibucto in New Brunswick, Canada, where active oyster farming is a potential stressor through its effects on sediment properties. Field data correspond to health metrics that constitute the popular AZTI marine biotic index and the infaunal trophic index, as well as abiotic predictors preconceived to influence biota. Our paper is the first to construct a scientifically sensible model that rigorously identifies the collective explanatory capacity of salinity, distance downstream, channel depth, and silt-clay content (all regarded a priori as qualitatively important abiotic drivers) towards site health in the Richibucto ecosystem. This suggests the potential effectiveness of the LHFI approach for assessing not only freshwater systems but aquatic ecosystems in general. PMID:23785443
Controlled-Root Approach To Digital Phase-Locked Loops
NASA Technical Reports Server (NTRS)
Stephens, Scott A.; Thomas, J. Brooks
1995-01-01
Performance can be tailored more flexibly and directly to satisfy design requirements. The controlled-root approach is an improved method for the analysis and design of digital phase-locked loops (DPLLs). It is developed rigorously from first principles for fully digital loops, making DPLL theory and design simpler and more straightforward (particularly for third- or fourth-order DPLLs) and controlling performance more accurately in the case of high gain.
Case Method Teaching as Science and Art: A Metaphoric Approach and Curricular Application
ERIC Educational Resources Information Center
Greenhalgh, Anne M.
2007-01-01
The following article takes a metaphoric approach to case method teaching to shed light on one of our most important practices. The article hinges on the dual comparison of case method as science and as art. The dominant, scientific view of cases is that they are neutral descriptions of real-life business problems, subject to rigorous analysis.…
ERIC Educational Resources Information Center
Dvir, Assaf; Tabach, Michal
2017-01-01
High schools commonly use a differential approach to teach minima and maxima geometric problems. Although calculus serves as a systematic and powerful technique, this rigorous instrument might hinder students' ability to understand the behavior and constraints of the objective function. The proliferation of digital environments allowed us to adopt…
ERIC Educational Resources Information Center
Lotan, Meir; Gold, Christian
2009-01-01
Background: The Snoezelen[R] is a multisensory intervention approach that has been implemented with various populations. Due to an almost complete absence of rigorous research in this field, the confirmation of this approach as an effective therapeutic intervention is warranted. Method: To evaluate the therapeutic influence of the…
Precisely and Accurately Inferring Single-Molecule Rate Constants
Kinz-Thompson, Colin D.; Bailey, Nevette A.; Gonzalez, Ruben L.
2017-01-01
The kinetics of biomolecular systems can be quantified by calculating the stochastic rate constants that govern the biomolecular state versus time trajectories (i.e., state trajectories) of individual biomolecules. To do so, the experimental signal versus time trajectories (i.e., signal trajectories) obtained from observing individual biomolecules are often idealized to generate state trajectories by methods such as thresholding or hidden Markov modeling. Here, we discuss approaches for idealizing signal trajectories and calculating stochastic rate constants from the resulting state trajectories. Importantly, we provide an analysis of how the finite length of signal trajectories restrict the precision of these approaches, and demonstrate how Bayesian inference-based versions of these approaches allow rigorous determination of this precision. Similarly, we provide an analysis of how the finite lengths and limited time resolutions of signal trajectories restrict the accuracy of these approaches, and describe methods that, by accounting for the effects of the finite length and limited time resolution of signal trajectories, substantially improve this accuracy. Collectively, therefore, the methods we consider here enable a rigorous assessment of the precision, and a significant enhancement of the accuracy, with which stochastic rate constants can be calculated from single-molecule signal trajectories. PMID:27793280
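As a minimal illustration of the thresholding-plus-rate-estimation workflow described above (a sketch; the two-state assumption, the Gamma conjugate prior, and all names are assumptions rather than the article's specific method):

import numpy as np

def dwell_times(signal, threshold, dt):
    # Idealize a signal trajectory into a two-state trajectory by thresholding
    # and collect the dwell times spent in the high state.
    states = (np.asarray(signal) > threshold).astype(int)
    dwells, run = [], 0
    for s in states:
        if s == 1:
            run += 1
        elif run > 0:
            dwells.append(run * dt)
            run = 0
    return np.array(dwells)

def rate_posterior(dwells, alpha0=1.0, beta0=1e-3):
    # Gamma posterior for the exit rate constant, assuming exponentially
    # distributed dwell times and a Gamma(alpha0, beta0) prior on the rate.
    alpha = alpha0 + len(dwells)
    beta = beta0 + dwells.sum()
    return alpha / beta, np.sqrt(alpha) / beta   # posterior mean and std. dev.

The width of such a posterior shrinks roughly as one over the square root of the number of observed dwells, which is one way the finite length of signal trajectories limits the attainable precision.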
Stakeholder-Driven Quality Improvement: A Compelling Force for Clinical Practice Guidelines.
Rosenfeld, Richard M; Wyer, Peter C
2018-01-01
Clinical practice guideline development should be driven by rigorous methodology, but what is less clear is where quality improvement enters the process: should it be a priority-guiding force, or should it enter only after recommendations are formulated? We argue for a stakeholder-driven approach to guideline development, with an overriding goal of quality improvement based on stakeholder perceptions of needs, uncertainties, and knowledge gaps. In contrast, the widely used topic-driven approach, which often makes recommendations based only on randomized controlled trials, is driven by epidemiologic purity and evidence rigor, with quality improvement a downstream consideration. The advantages of a stakeholder-driven versus a topic-driven approach are highlighted by comparisons of guidelines for otitis media with effusion, thyroid nodules, sepsis, and acute bacterial rhinosinusitis. These comparisons show that stakeholder-driven guidelines are more likely to address the quality improvement needs and pressing concerns of clinicians and patients, including understudied populations and patients with multiple chronic conditions. Conversely, a topic-driven approach often addresses "typical" patients, based on research that may not reflect the needs of high-risk groups excluded from studies because of ethical issues or a desire for purity of research design.
Rigorous theory of graded thermoelectric converters including finite heat transfer coefficients
NASA Astrophysics Data System (ADS)
Gerstenmaier, York Christian; Wachutka, Gerhard
2017-11-01
Maximization of thermoelectric (TE) converter performance with an inhomogeneous material and electric current distribution has been investigated in previous literature neglecting thermal contact resistances to the heat reservoirs. The heat transfer coefficients (HTCs), defined as inverse thermal contact resistances per unit area, are thus infinite, whereas in reality, always parasitic thermal resistances, i.e., finite HTCs, are present. Maximization of the generated electric power and of cooling power in the refrigerator mode with respect to Seebeck coefficients and heat conductivity for a given profile of the material's TE figure of merit Z are mathematically ill-posed problems in the presence of infinite HTCs. As will be shown in this work, a fully self consistent solution is possible for finite HTCs, and in many respects, the results are fundamentally different. A previous theory for 3D devices will be extended to include finite HTCs and is applied to 1D devices. For the heat conductivity profile, an infinite number of solutions exist leading to the same device performance. Cooling power maximization for finite HTCs in 1D will lead to a strongly enhanced corresponding efficiency (coefficient of performance), whereas results with infinite HTCs lead to a non-monotonous temperature profile and coefficient of performance tending to zero for the prescribed heat conductivities. For maximized generated electric power, the corresponding generator efficiency is nearly a constant independent from the finite HTC values. The maximized efficiencies in the generator and cooling mode are equal to the efficiencies for the infinite HTC, provided that the corresponding powers approach zero. These and more findings are condensed in 4 theorems in the conclusions.
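For background, the homogeneous, ideal-contact textbook relations (quoted here only for orientation; they are not results of the paper) connect the figure of merit to the maximum generator efficiency through

\[ Z = \frac{S^{2}\sigma}{\kappa}, \qquad \eta_{\max} = \frac{T_h - T_c}{T_h}\,\frac{\sqrt{1+Z\bar{T}}-1}{\sqrt{1+Z\bar{T}}+T_c/T_h}, \qquad \bar{T} = \frac{T_h+T_c}{2}, \]

with Seebeck coefficient S, electrical conductivity \sigma and thermal conductivity \kappa; the contribution of the paper is to show how such optima change once the material profiles are graded and the heat transfer coefficients to the reservoirs are finite.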
ERIC Educational Resources Information Center
McCready, John W.
2010-01-01
The purpose of this study was to examine use of decision-making tools and feedback in strategic planning in order to develop a rigorous process that would promote the efficiency of strategic planning for acquisitions in the United States Coast Guard (USCG). Strategic planning is critical to agencies such as the USCG in order to be effective…
ERIC Educational Resources Information Center
Glassman, Jill R.; Potter, Susan C.; Baumler, Elizabeth R.; Coyle, Karin K.
2015-01-01
Introduction: Group-randomized trials (GRTs) are one of the most rigorous methods for evaluating the effectiveness of group-based health risk prevention programs. Efficiently designing GRTs with a sample size that is sufficient for meeting the trial's power and precision goals while not wasting resources exceeding them requires estimates of the…
Bayesian Inference: with ecological applications
Link, William A.; Barker, Richard J.
2010-01-01
This text provides a mathematically rigorous yet accessible and engaging introduction to Bayesian inference, with relevant examples that will be of interest to biologists working in the fields of ecology, wildlife management and environmental studies, as well as to students in advanced undergraduate statistics. This text opens the door to Bayesian inference, taking advantage of modern computational efficiencies and easily accessible software to evaluate complex hierarchical models.
Hu, Kexiang; Ding, Enjie; Wangyang, Peihua; Wang, Qingkang
2016-06-01
The electromagnetic spectrum and the photoelectric conversion efficiency of solar cells based on silicon hexagonal nanoconical hole (SiHNH) arrays are systematically analyzed using Rigorous Coupled Wave Analysis (RCWA) and Modal Transmission Line (MTL) theory. The ultimate efficiency of the optimized SiHNH-array solar cell, computed from the absorption spectrum, reaches 31.92%, which is 4.52% higher than that of silicon hexagonal nanoconical frustum (SiHNF) arrays. The absorption enhancement of the SiHNH arrays is due to their lower reflectance and the larger number of guided-mode resonances they support, and the enhanced ultimate efficiency is insensitive to the bottom diameter (D(bot)) of the nanoconical hole and to the incident angle. The result provides an additional guideline for nanostructured surface texturing design for photovoltaic applications.
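For reference, the 'ultimate efficiency' quoted above is conventionally computed from the absorptance A(\lambda) weighted by the solar spectrum I(\lambda) (standard definition, assumed here; the paper may use a variant of the integration limits):

\[ \eta_{\mathrm{ult}} = \frac{\int_{\lambda_{\min}}^{\lambda_{g}} I(\lambda)\,A(\lambda)\,\dfrac{\lambda}{\lambda_{g}}\,\mathrm{d}\lambda}{\int_{\lambda_{\min}}^{\lambda_{\max}} I(\lambda)\,\mathrm{d}\lambda}, \]

with \lambda_g the wavelength corresponding to the silicon band gap (about 1100 nm), i.e., each absorbed photon is assumed to yield one electron-hole pair at the band-gap energy.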
NASA Technical Reports Server (NTRS)
Zhang, Z.; Meyer, K.; Platnick, S.; Oreopoulos, L.; Lee, D.; Yu, H.
2013-01-01
This paper describes an efficient and unique method for computing the shortwave direct radiative effect (DRE) of aerosol residing above low-level liquid-phase clouds using CALIOP and MODIS data. It accounts for the overlapping of aerosol and cloud rigorously by utilizing the joint histogram of cloud optical depth and cloud top pressure. Effects of sub-grid scale cloud and aerosol variations on DRE are accounted for. It is computationally efficient through using grid-level cloud and aerosol statistics, instead of pixel-level products, and a pre-computed look-up table in radiative transfer calculations. We verified that for smoke over the southeast Atlantic Ocean the method yields a seasonal mean instantaneous shortwave DRE that generally agrees with more rigorous pixel-level computation within 4%. We have also computed the annual mean instantaneous shortwave DRE of light-absorbing aerosols (i.e., smoke and polluted dust) over global ocean based on 4 yr of CALIOP and MODIS data. We found that the variability of the annual mean shortwave DRE of above-cloud light-absorbing aerosol is mainly driven by the optical depth of the underlying clouds.
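A minimal sketch of the grid-level calculation described above (names are assumptions; the actual look-up table also spans solar geometry and aerosol properties): the grid-mean DRE is the joint-histogram-weighted sum of pre-computed DRE values over the cloud optical depth and cloud top pressure bins.

import numpy as np

def grid_mean_dre(joint_hist, cod_bins, ctp_bins, aod, dre_lut):
    # joint_hist : 2-D array of cloudy fractions over (COD, CTP) bins
    # dre_lut    : callable returning a pre-computed DRE (W m-2) for one
    #              (aerosol optical depth, cloud optical depth, cloud top pressure)
    dre = 0.0
    for i, cod in enumerate(cod_bins):
        for j, ctp in enumerate(ctp_bins):
            dre += joint_hist[i, j] * dre_lut(aod, cod, ctp)
    return dre

# Illustrative use with a toy look-up table:
toy_lut = lambda aod, cod, ctp: 50.0 * aod * (1.0 - np.exp(-0.1 * cod))
hist = np.array([[0.2, 0.1],
                 [0.3, 0.1]])   # fractions of the grid box in each bin
print(grid_mean_dre(hist, cod_bins=[5, 20], ctp_bins=[850, 700], aod=0.3, dre_lut=toy_lut))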
Methods for the guideline-based development of quality indicators--a systematic review
2012-01-01
Background Quality indicators (QIs) are used in many healthcare settings to measure, compare, and improve quality of care. For the efficient development of high-quality QIs, rigorous, approved, and evidence-based development methods are needed. Clinical practice guidelines are a suitable source to derive QIs from, but no gold standard for guideline-based QI development exists. This review aims to identify, describe, and compare methodological approaches to guideline-based QI development. Methods We systematically searched medical literature databases (Medline, EMBASE, and CINAHL) and grey literature. Two researchers selected publications reporting methodological approaches to guideline-based QI development. In order to describe and compare methodological approaches used in these publications, we extracted detailed information on common steps of guideline-based QI development (topic selection, guideline selection, extraction of recommendations, QI selection, practice test, and implementation) to predesigned extraction tables. Results From 8,697 hits in the database search and several grey literature documents, we selected 48 relevant references. The studies were of heterogeneous type and quality. We found no randomized controlled trial or other studies comparing the ability of different methodological approaches to guideline-based development to generate high-quality QIs. The relevant publications featured a wide variety of methodological approaches to guideline-based QI development, especially regarding guideline selection and extraction of recommendations. Only a few studies reported patient involvement. Conclusions Further research is needed to determine which elements of the methodological approaches identified, described, and compared in this review are best suited to constitute a gold standard for guideline-based QI development. For this research, we provide a comprehensive groundwork. PMID:22436067
Computation of elementary modes: a unifying framework and the new binary approach
Gagneur, Julien; Klamt, Steffen
2004-01-01
Background Metabolic pathway analysis has been recognized as a central approach to the structural analysis of metabolic networks. The concept of elementary (flux) modes provides a rigorous formalism to describe and assess pathways and has proven to be valuable for many applications. However, computing elementary modes is a hard computational task. In recent years, a number of algorithms dedicated to it have appeared, calling for a unifying point of view and continued improvement of the current methods. Results We show that computing the set of elementary modes is equivalent to computing the set of extreme rays of a convex cone. This standard mathematical representation provides a unified framework that encompasses the most prominent algorithmic methods that compute elementary modes and allows a clear comparison between them. Taking lessons from this benchmark, we here introduce a new method, the binary approach, which computes the elementary modes as binary patterns of participating reactions from which the respective stoichiometric coefficients can be computed in a post-processing step. We implemented the binary approach in FluxAnalyzer 5.1, a software that is free for academics. The binary approach decreases the memory demand up to 96% without loss of speed, giving the most efficient method available for computing elementary modes to date. Conclusions The equivalence between elementary modes and extreme ray computations offers opportunities for employing tools from polyhedral computation for metabolic pathway analysis. The new binary approach introduced herein was derived from this general theoretical framework and facilitates the computation of elementary modes in considerably larger networks. PMID:15527509
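A minimal sketch of the post-processing step mentioned above (names and the toy network are assumptions; the enumeration of the binary patterns themselves is the hard combinatorial part and is not shown): for a valid elementary mode, the kernel of the stoichiometric matrix restricted to the participating reactions is one-dimensional, so the flux coefficients follow from a small null-space computation.

import numpy as np
from scipy.linalg import null_space

def coefficients_from_pattern(S, pattern):
    # S       : stoichiometric matrix (internal metabolites x reactions)
    # pattern : binary participation vector over reactions
    idx = np.flatnonzero(pattern)
    K = null_space(S[:, idx])
    if K.shape[1] != 1:          # the support must pin the flux to a single ray
        return None
    v = np.zeros(S.shape[1])
    v[idx] = K[:, 0]
    if np.all(v[idx] < 0):       # fix the overall sign
        v = -v
    return v

# Toy network with one internal metabolite B:
#   R1: A_ext -> B,   R2: B -> C_ext,   R3: A_ext -> C_ext
S = np.array([[1.0, -1.0, 0.0]])
print(coefficients_from_pattern(S, [1, 1, 0]))   # mode through R1 and R2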
Effective recruitment of minority populations through community-led strategies.
Horowitz, Carol R; Brenner, Barbara L; Lachapelle, Susanne; Amara, Duna A; Arniella, Guedy
2009-12-01
Traditional research approaches frequently fail to yield representative numbers of people of color in research. Community-based participatory research (CBPR) may be an important strategy for partnering with and reaching populations that bear a greater burden of illness but have been historically difficult to engage. The Community Action Board, consisting of 20 East Harlem residents, leaders, and advocates, used CBPR to compare the effectiveness of various strategies in recruiting and enrolling adults with prediabetes into a peer-led, diabetes prevention intervention. The board created five recruitment strategies: recruiting through clinicians; recruiting at large public events such as farmers markets; organizing special local recruitment events; recruiting at local organizations; and recruiting through a partner-led approach, in which community partners developed and managed the recruitment efforts at their sites. In 3 months, 555 local adults were approached; 249 were appropriate candidates for further evaluation (overweight, nonpregnant, East Harlem residents without known diabetes); 179 consented and returned in a fasting state for 1/2 day of prediabetes testing; 99 had prediabetes and enrolled in a pilot randomized trial. The partner-led approach was highly successful, recruiting 68% of those enrolled. This strategy was also the most efficient; 34% of those approached through partners were ultimately enrolled, versus 0%-17% enrolled through the other four strategies. Participants were predominantly low-income, uninsured, undereducated, Spanish-speaking women. This CBPR approach highlights the value of partner-led recruitment to identify, reach out to, and motivate a vulnerable population into participation in research, using techniques that may be unfamiliar to researchers but are nevertheless rigorous and effective.
Scientific approaches to science policy.
Berg, Jeremy M
2013-11-01
The development of robust science policy depends on use of the best available data, rigorous analysis, and inclusion of a wide range of input. While director of the National Institute of General Medical Sciences (NIGMS), I took advantage of available data and emerging tools to analyze training time distribution by new NIGMS grantees, the distribution of the number of publications as a function of total annual National Institutes of Health support per investigator, and the predictive value of peer-review scores on subsequent scientific productivity. Rigorous data analysis should be used to develop new reforms and initiatives that will help build a more sustainable American biomedical research enterprise.
Griffith Edwards' rigorous sympathy with Alcoholics Anonymous.
Humphreys, Keith
2015-07-01
Griffith Edwards made empirical contributions early in his career to the literature on Alcoholics Anonymous (AA), but the attitude he adopted towards AA and other peer-led mutual help initiatives constitutes an even more important legacy. Unlike many treatment professionals who dismissed the value of AA or were threatened by its non-professional approach, Edwards was consistently respectful of the organization. However, he never became an uncritical booster of AA or overgeneralized what could be learnt from it. Future scholarly and clinical endeavors concerning addiction-related mutual help initiatives will benefit by continuing Edwards' tradition of 'rigorous sympathy'. © 2015 Society for the Study of Addiction.
Shaver, Aaron C; Greig, Bruce W; Mosse, Claudio A; Seegmiller, Adam C
2015-05-01
Optimizing a clinical flow cytometry panel can be a subjective process dependent on experience. We develop a quantitative method to make this process more rigorous and apply it to B lymphoblastic leukemia/lymphoma (B-ALL) minimal residual disease (MRD) testing. We retrospectively analyzed our existing three-tube, seven-color B-ALL MRD panel and used our novel method to develop an optimized one-tube, eight-color panel, which was tested prospectively. The optimized one-tube, eight-color panel resulted in greater efficiency of time and resources with no loss in diagnostic power. Constructing a flow cytometry panel using a rigorous, objective, quantitative method permits optimization and avoids problems of interdependence and redundancy in a large, multiantigen panel. Copyright© by the American Society for Clinical Pathology.
The impact of rigorous mathematical thinking as learning method toward geometry understanding
NASA Astrophysics Data System (ADS)
Nugraheni, Z.; Budiyono, B.; Slamet, I.
2018-05-01
To reach higher-order thinking skills, conceptual understanding must first be mastered. RMT is a unique realization of the cognitive conceptual construction approach based on Feuerstein's Mediated Learning Experience (MLE) theory and Vygotsky's sociocultural theory. This was a quasi-experimental study comparing an experimental class taught with Rigorous Mathematical Thinking (RMT) as the learning method and a control class taught with Direct Learning (DL) as the conventional learning activity. The study examined whether the two learning methods had different effects on the geometry conceptual understanding of junior high school students. The data were analyzed using an independent t-test, which showed a significant difference in mean geometry conceptual understanding between the experimental and control classes. Further, semi-structured interviews indicated that students taught with RMT had a deeper conceptual understanding than students taught in the conventional way. These results indicate that Rigorous Mathematical Thinking (RMT) as a learning method has a positive impact on geometry conceptual understanding.
Optical properties of electrohydrodynamic convection patterns: rigorous and approximate methods.
Bohley, Christian; Heuer, Jana; Stannarius, Ralf
2005-12-01
We analyze the optical behavior of two-dimensionally periodic structures that occur in electrohydrodynamic convection (EHC) patterns in nematic sandwich cells. These structures are anisotropic, locally uniaxial, and periodic on the scale of micrometers. For the first time, the optics of these structures is investigated with a rigorous method. The method used for the description of the electromagnetic waves interacting with EHC director patterns is a numerical approach that discretizes directly the Maxwell equations. It works as a space-grid-time-domain method and computes electric and magnetic fields in time steps. This so-called finite-difference-time-domain (FDTD) method is able to generate the fields with arbitrary accuracy. We compare this rigorous method with earlier attempts based on ray-tracing and analytical approximations. Results of optical studies of EHC structures made earlier based on ray-tracing methods are confirmed for thin cells, when the spatial periods of the pattern are sufficiently large. For the treatment of small-scale convection structures, the FDTD method is without alternatives.
ERIC Educational Resources Information Center
Tsaparlis, Georgios; Kolioulis, Dimitrios; Pappa, Eleni
2010-01-01
We present a programme for a novel introductory lower-secondary chemistry course (seventh or eighth grade) that aims at the application of theories of science education, and in particular of conceptual/meaningful learning and of teaching methodology that encourages active and inquiry forms of learning. The approach is rigorous, with careful use of…
2018-01-01
Signaling pathways represent parts of the global biological molecular network which connects them into a seamless whole through complex direct and indirect (hidden) crosstalk whose structure can change during development or in pathological conditions. We suggest a novel methodology, called Googlomics, for the structural analysis of directed biological networks using spectral analysis of their Google matrices, using parallels with quantum scattering theory, developed for nuclear and mesoscopic physics and quantum chaos. We introduce analytical “reduced Google matrix” method for the analysis of biological network structure. The method allows inferring hidden causal relations between the members of a signaling pathway or a functionally related group of genes. We investigate how the structure of hidden causal relations can be reprogrammed as a result of changes in the transcriptional network layer during cancerogenesis. The suggested Googlomics approach rigorously characterizes complex systemic changes in the wiring of large causal biological networks in a computationally efficient way. PMID:29370181
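To make the Google matrix idea concrete, the sketch below (a generic construction, not the authors' Googlomics code) builds the standard Google matrix of a small directed network and extracts its PageRank vector by power iteration; the damping factor and the toy adjacency matrix are assumptions for illustration only.

```python
# Minimal sketch: Google matrix of a small directed network and its PageRank.
import numpy as np

def google_matrix(adj: np.ndarray, alpha: float = 0.85) -> np.ndarray:
    """Column-stochastic Google matrix G = alpha*S + (1-alpha)/N * ones."""
    n = adj.shape[0]
    s = adj.astype(float).T.copy()       # s[i, j] = 1 if node j links to node i
    col_sums = s.sum(axis=0)
    for j in range(n):
        if col_sums[j] == 0:             # dangling node: spread uniformly
            s[:, j] = 1.0 / n
        else:
            s[:, j] /= col_sums[j]
    return alpha * s + (1.0 - alpha) / n * np.ones((n, n))

def pagerank(g: np.ndarray, tol: float = 1e-12) -> np.ndarray:
    """Power iteration for the leading eigenvector (PageRank) of G."""
    p = np.full(g.shape[0], 1.0 / g.shape[0])
    while True:
        p_next = g @ p
        if np.abs(p_next - p).sum() < tol:
            return p_next
        p = p_next

# Toy 4-node directed network (assumption, for illustration only).
adj = np.array([[0, 1, 1, 0],
                [0, 0, 1, 0],
                [1, 0, 0, 1],
                [0, 0, 1, 0]])
print(pagerank(google_matrix(adj)))
```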
Mechanical transduction via a single soft polymer
NASA Astrophysics Data System (ADS)
Hou, Ruizheng; Wang, Nan; Bao, Weizhu; Wang, Zhisong
2018-04-01
Molecular machines from biology and nanotechnology often depend on soft structures to perform mechanical functions, but the underlying mechanisms and advantages or disadvantages over rigid structures are not fully understood. We report here a rigorous study of mechanical transduction along a single soft polymer based on exact solutions to the realistic three-dimensional wormlike-chain model and augmented with analytical relations derived from simpler polymer models. The results reveal surprisingly that a soft polymer with vanishingly small persistence length below a single chemical bond still transduces biased displacement and mechanical work up to practically significant amounts. This "soft" approach possesses unique advantages over the conventional wisdom of rigidity-based transduction, and potentially leads to a unified mechanism for effective allosterylike transduction and relay of mechanical actions, information, control, and molecules from one position to another in molecular devices and motors. This study also identifies an entropy limit unique to the soft transduction, and thereby suggests a possibility of detecting higher efficiency for kinesin motor and mutants in future experiments.
Simultaneous Multi-Scale Diffusion Estimation and Tractography Guided by Entropy Spectrum Pathways
Galinsky, Vitaly L.; Frank, Lawrence R.
2015-01-01
We have developed a method for the simultaneous estimation of local diffusion and the global fiber tracts based upon the information entropy flow that computes the maximum entropy trajectories between locations and depends upon the global structure of the multi-dimensional and multi-modal diffusion field. Computation of the entropy spectrum pathways requires only solving a simple eigenvector problem for the probability distribution, for which efficient numerical routines exist, and a straightforward integration of the probability conservation through ray tracing of the convective modes guided by a global structure of the entropy spectrum coupled with a small-scale local diffusion. The intervoxel diffusion is sampled by multi-b-shell, multi-q-angle DWI data expanded in spherical waves. This novel approach to fiber tracking incorporates global information about multiple fiber crossings in every individual voxel and ranks it in the most scientifically rigorous way. This method has potential significance for a wide range of applications, including studies of brain connectivity. PMID:25532167
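The eigenvector construction at the heart of entropy-spectrum-pathway methods can be illustrated with the maximum-entropy random walk: transition probabilities are formed from the dominant eigenpair of a non-negative coupling matrix. The sketch below is a generic illustration under that assumption (symmetric, connected toy coupling matrix), not the authors' implementation.

```python
# Maximum-entropy transition probabilities from the Perron eigenpair of A.
import numpy as np

def max_entropy_transitions(a: np.ndarray) -> np.ndarray:
    """P[i, j] = A[i, j] * psi[j] / (lam * psi[i]) for the leading eigenpair of A."""
    vals, vecs = np.linalg.eigh(a)
    lam, psi = vals[-1], np.abs(vecs[:, -1])     # Perron eigenvalue and eigenvector
    return a * psi[np.newaxis, :] / (lam * psi[:, np.newaxis])

# Toy symmetric coupling matrix (assumption, e.g. voxel-to-voxel connectivity).
a = np.array([[0., 1., 1., 0.],
              [1., 0., 1., 0.],
              [1., 1., 0., 1.],
              [0., 0., 1., 0.]])
p = max_entropy_transitions(a)
print(p.sum(axis=1))    # each row sums to 1, i.e. a valid transition matrix
```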
NASA Astrophysics Data System (ADS)
Choi, Youngsun; Hahn, Choloong; Yoon, Jae Woong; Song, Seok Ho; Berini, Pierre
2017-01-01
Time-asymmetric state-evolution properties while encircling an exceptional point are presently of great interest in search of new principles for controlling atomic and optical systems. Here, we show that encircling-an-exceptional-point interactions that are essentially reciprocal in the linear interaction regime make a plausible nonlinear integrated optical device architecture highly nonreciprocal over an extremely broad spectrum. In the proposed strategy, we describe an experimentally realizable coupled-waveguide structure that supports an encircling-an-exceptional-point parametric evolution under the influence of a gain saturation nonlinearity. Using an intuitive time-dependent Hamiltonian and rigorous numerical computations, we demonstrate strictly nonreciprocal optical transmission with a forward-to-backward transmission ratio exceeding 10 dB and high forward transmission efficiency (~100%) persisting over an extremely broad bandwidth approaching 100 THz. This predicted performance strongly encourages experimental realization of the proposed concept to establish a practical on-chip optical nonreciprocal element for ultra-short laser pulses and broadband high-density optical signal processing.
Modal-pushover-based ground-motion scaling procedure
Kalkan, Erol; Chopra, Anil K.
2011-01-01
Earthquake engineering is increasingly using nonlinear response history analysis (RHA) to demonstrate the performance of structures. This rigorous method of analysis requires selection and scaling of ground motions appropriate to design hazard levels. This paper presents a modal-pushover-based scaling (MPS) procedure to scale ground motions for use in a nonlinear RHA of buildings. In the MPS method, the ground motions are scaled to match, to a specified tolerance, a target value of the inelastic deformation of the first-mode inelastic single-degree-of-freedom (SDF) system whose properties are determined by first-mode pushover analysis. Appropriate for first-mode dominated structures, this approach is extended for structures with significant contributions of higher modes by considering elastic deformation of second-mode SDF systems in selecting a subset of the scaled ground motions. Based on results presented for three actual buildings (4-, 6-, and 13-story), the accuracy and efficiency of the MPS procedure are established and its superiority over the ASCE/SEI 7-05 scaling procedure is demonstrated.
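The core of the MPS idea, scaling a record until the peak deformation of a first-mode inelastic SDF system matches a target, can be sketched as a one-dimensional search over the scale factor. The code below is a conceptual illustration only; the crude elastoplastic integrator, damping ratio, yield force and toy record are all assumptions rather than the authors' procedure.

```python
# Conceptual sketch: bisect the scale factor so a first-mode elastoplastic SDF
# system reaches a target peak deformation under the scaled record.
import numpy as np

def peak_sdf_deformation(accel, dt, period, fy, zeta=0.05, scale=1.0):
    """Peak displacement of a unit-mass elastic-perfectly-plastic SDF system."""
    k = (2.0 * np.pi / period) ** 2          # stiffness for unit mass
    c = 2.0 * zeta * np.sqrt(k)
    u = v = fs = u_peak = 0.0
    for ag in accel:
        a = -scale * ag - c * v - fs         # equation of motion, m = 1
        v += a * dt
        du = v * dt
        u += du
        fs = float(np.clip(fs + k * du, -fy, fy))   # EPP spring force
        u_peak = max(u_peak, abs(u))
    return u_peak

def mps_scale_factor(accel, dt, period, fy, target, lo=0.05, hi=20.0):
    """Bisection on the scale factor so the peak SDF deformation hits the target."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if peak_sdf_deformation(accel, dt, period, fy, scale=mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

dt = 0.01
t = np.arange(0.0, 20.0, dt)
record = 0.3 * 9.81 * np.sin(2.0 * np.pi * 1.5 * t) * np.exp(-0.1 * t)  # toy record
print(mps_scale_factor(record, dt, period=1.0, fy=2.0, target=0.05))
```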
Lin, Yunyue; Wu, Qishi; Cai, Xiaoshan; ...
2010-01-01
Data transmission from sensor nodes to a base station or a sink node often incurs significant energy consumption, which critically affects network lifetime. We generalize and solve the problem of deploying multiple base stations to maximize network lifetime in terms of two different metrics under one-hop and multihop communication models. In the one-hop communication model, the sensors far away from base stations always deplete their energy much faster than others. We propose an optimal solution and a heuristic approach based on the minimal enclosing circle algorithm to deploy a base station at the geometric center of each cluster. In the multihop communication model, both base station location and data routing mechanism need to be considered in maximizing network lifetime. We propose an iterative algorithm based on rigorous mathematical derivations and use linear programming to compute the optimal routing paths for data transmission. Simulation results show the distinguished performance of the proposed deployment algorithms in maximizing network lifetime.
Multivariate localization methods for ensemble Kalman filtering
NASA Astrophysics Data System (ADS)
Roh, S.; Jun, M.; Szunyogh, I.; Genton, M. G.
2015-05-01
In ensemble Kalman filtering (EnKF), the small number of ensemble members that is feasible to use in a practical data assimilation application leads to sampling variability of the estimates of the background error covariances. The standard approach to reducing the effects of this sampling variability, which has also been found to be highly efficient in improving the performance of EnKF, is the localization of the estimates of the covariances. One family of localization techniques is based on taking the Schur (entry-wise) product of the ensemble-based sample covariance matrix and a correlation matrix whose entries are obtained by the discretization of a distance-dependent correlation function. While the proper definition of the localization function for a single state variable has been extensively investigated, a rigorous definition of the localization function for multiple state variables has seldom been considered. This paper introduces two strategies for the construction of localization functions for multiple state variables. The proposed localization functions are tested in experiments assimilating simulated observations into the bivariate Lorenz 95 model.
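For reference, the single-variable Schur-product localization that the paper generalizes can be sketched as an entry-wise product of the ensemble sample covariance with a Gaspari-Cohn correlation matrix; the toy ensemble, grid spacing and localization radius below are assumptions.

```python
# Sketch: damp an ensemble sample covariance entry-wise with Gaspari-Cohn.
import numpy as np

def gaspari_cohn(r: np.ndarray, c: float) -> np.ndarray:
    """Gaspari-Cohn (1999) compactly supported correlation, support radius 2c."""
    z = np.abs(r) / c
    out = np.zeros_like(z)
    m1 = z <= 1.0
    m2 = (z > 1.0) & (z <= 2.0)
    z1, z2 = z[m1], z[m2]
    out[m1] = -0.25*z1**5 + 0.5*z1**4 + 0.625*z1**3 - (5/3)*z1**2 + 1.0
    out[m2] = (1/12)*z2**5 - 0.5*z2**4 + 0.625*z2**3 + (5/3)*z2**2 \
              - 5.0*z2 + 4.0 - 2.0/(3.0*z2)
    return out

rng = np.random.default_rng(0)
n, members = 40, 10
ensemble = rng.standard_normal((members, n))                  # toy ensemble
sample_cov = np.cov(ensemble, rowvar=False)
dist = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))  # grid-point distance
localized_cov = sample_cov * gaspari_cohn(dist, c=5.0)        # Schur product
print(localized_cov[0, :6])
```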
NASA Astrophysics Data System (ADS)
D'Ambrosio, Raffaele; Moccaldi, Martina; Paternoster, Beatrice
2018-05-01
In this paper, an adapted numerical scheme for reaction-diffusion problems generating periodic wavefronts is introduced. Adapted numerical methods for such evolutionary problems are specially tuned to follow prescribed qualitative behaviors of the solutions, making the numerical scheme more accurate and efficient as compared with traditional schemes already known in the literature. Adaptation through the so-called exponential fitting technique leads to methods whose coefficients depend on unknown parameters related to the dynamics and aimed to be numerically computed. Here we propose a strategy for a cheap and accurate estimation of such parameters, which consists essentially in minimizing the leading term of the local truncation error whose expression is provided in a rigorous accuracy analysis. In particular, the presented estimation technique has been applied to a numerical scheme based on combining an adapted finite difference discretization in space with an implicit-explicit time discretization. Numerical experiments confirming the effectiveness of the approach are also provided.
Societal challenges of precision medicine: Bringing order to chaos.
Salgado, Roberto; Moore, Helen; Martens, John W M; Lively, Tracy; Malik, Shakun; McDermott, Ultan; Michiels, Stefan; Moscow, Jeffrey A; Tejpar, Sabine; McKee, Tawnya; Lacombe, Denis
2017-10-01
The increasing number of drugs targeting specific proteins implicated in tumourigenesis and the commercial promotion of relatively affordable genome-wide analyses has led to an increasing expectation among patients with cancer that they can now receive effective personalised treatment based on the often complex genomic signature of their tumour. For such approaches to work in routine practice, the development of correspondingly complex biomarker assays through an appropriate and rigorous regulatory framework will be required. It is becoming increasingly evident that a re-engineering of clinical research is necessary so that regulatory considerations and procedures facilitate the efficient translation of these required biomarker assays from the discovery setting through to clinical application. This article discusses the practical requirements and challenges of developing such new precision medicine strategies, based on leveraging complex genomic profiles, as discussed at the Innovation and Biomarkers in Cancer Drug Development meeting (8th-9th September 2016, Brussels, Belgium). Copyright © 2017 Elsevier Ltd. All rights reserved.
Joe, Jonathan; Chaudhuri, Shomir; Le, Thai; Thompson, Hilaire; Demiris, George
2015-08-01
While health information technologies have become increasingly popular, many have not been formally tested to ascertain their usability. Traditional rigorous methods take significant amounts of time and manpower to evaluate the usability of a system. In this paper, we evaluate the use of instant data analysis (IDA) as developed by Kjeldskov et al. to perform usability testing on a tool designed for older adults and caregivers. The IDA method is attractive because it takes significantly less time and manpower than the traditional usability testing methods. In this paper we demonstrate how IDA was used to evaluate usability of a multifunctional wellness tool, discuss study results and lessons learned while using this method. We also present findings from an extension of the method which allows the grouping of similar usability problems in an efficient manner. We found that the IDA method is a quick, relatively easy approach to identifying and ranking usability issues among health information technologies. Copyright © 2015 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Sun, Tai; Zhang, Zheye; Xiao, Junwu; Chen, Chen; Xiao, Fei; Wang, Shuai; Liu, Yunqi
2013-08-01
We report a facile and green method to synthesize a new type of catalyst by coating Pd nanoparticles (NPs) on reduced graphene oxide (rGO)-carbon nanotube (CNT) nanocomposite. An rGO-CNT nanocomposite with three-dimensional microstructures was obtained by hydrothermal treatment of an aqueous dispersion of graphene oxide (GO) and CNTs. After the rGO-CNT composites had been dipped in K2PdCl4 solution, the spontaneous redox reaction between the GO-CNT and PdCl42- led to the formation of nanohybrid materials consisting of rGO-CNT decorated with 4 nm Pd NPs, which exhibited excellent and stable catalytic activity: the reduction of 4-nitrophenol to 4-aminophenol using NaBH4 as the reducing agent was completed in only 20 s at room temperature, even when the Pd content of the catalyst was 1.12 wt%. This method does not require rigorous conditions or toxic agents and thus is a rapid, efficient, and green approach to the fabrication of highly active catalysts.
NASA Astrophysics Data System (ADS)
Yan, Zilin; Kim, Yongtae; Hara, Shotaro; Shikazono, Naoki
2017-04-01
The Potts Kinetic Monte Carlo (KMC) model, proven to be a robust tool to study all stages of the sintering process, is an ideal tool to analyze the microstructure evolution of electrodes in solid oxide fuel cells (SOFCs). Due to the nature of this model, the input parameters of KMC simulations, such as simulation temperatures and attempt frequencies, are difficult to identify. We propose a rigorous and efficient approach to facilitate the input parameter calibration process using artificial neural networks (ANNs). The trained ANN drastically reduces the number of trial-and-error KMC simulations. The KMC simulation using the calibrated input parameters predicts the microstructures of a La0.6Sr0.4Co0.2Fe0.8O3 cathode material during sintering, showing both qualitative and quantitative congruence with real 3D microstructures obtained by focused ion beam scanning electron microscopy (FIB-SEM) reconstruction.
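A rough sketch of the surrogate-based calibration idea is given below: a small neural network is fitted to the map from KMC inputs to a microstructure metric and then searched for inputs that reproduce a target value, avoiding repeated trial KMC runs. The parameter names, the toy forward model and the sklearn usage are assumptions, not the authors' code.

```python
# Sketch: ANN surrogate for KMC input calibration against a target metric.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

def toy_kmc_grain_size(temperature, attempt_frequency):
    """Stand-in for an expensive KMC run returning a mean grain size (assumption)."""
    return 0.8 * np.log1p(attempt_frequency) + 0.002 * temperature

# Training set from a handful of "KMC runs" over the input space.
temps = rng.uniform(800, 1400, 60)
freqs = rng.uniform(1, 100, 60)
x = np.column_stack([temps, freqs])
y = toy_kmc_grain_size(temps, freqs)

ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000,
                                 random_state=0)).fit(x, y)

# Calibrate: pick the candidate inputs whose surrogate prediction is closest
# to the target metric obtained from the reconstructed 3D microstructure.
target = 4.0
cand_t, cand_f = np.meshgrid(np.linspace(800, 1400, 50), np.linspace(1, 100, 50))
cand = np.column_stack([cand_t.ravel(), cand_f.ravel()])
best = cand[np.argmin(np.abs(ann.predict(cand) - target))]
print("calibrated (temperature, attempt frequency):", best)
```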
Data Assimilation - Advances and Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Brian J.
2014-07-30
This presentation provides an overview of data assimilation (model calibration) for complex computer experiments. Calibration refers to the process of probabilistically constraining uncertain physics/engineering model inputs to be consistent with observed experimental data. An initial probability distribution for these parameters is updated using the experimental information. Utilization of surrogate models and empirical adjustment for model form error in code calibration form the basis for the statistical methodology considered. The role of probabilistic code calibration in supporting code validation is discussed. Incorporation of model form uncertainty in rigorous uncertainty quantification (UQ) analyses is also addressed. Design criteria used within a batch sequential design algorithm are introduced for efficiently achieving predictive maturity and improved code calibration. Predictive maturity refers to obtaining stable predictive inference with calibrated computer codes. These approaches allow for augmentation of initial experiment designs for collecting new physical data. A standard framework for data assimilation is presented and techniques for updating the posterior distribution of the state variables based on particle filtering and the ensemble Kalman filter are introduced.
Bioremediation of the Exxon Valdez oil in Prince William Sound beaches.
Boufadel, Michel C; Geng, Xiaolong; Short, Jeff
2016-12-15
Oil from the Exxon Valdez laden with polycyclic aromatic hydrocarbons (PAH) has persisted on some beaches in Prince William Sound, Alaska, >20 years after these beaches became contaminated. The degradation rate of the total PAH (TPAH) is estimated at 1% per year. Low oxygen concentrations were found to be the major factor causing oil persistence, and bioremediation through the injection of hydrogen peroxide and nutrients deep into four beaches in PWS was conducted in the summers of 2011 and 2012. It was found that due to the treatment, the TPAH biodegradation rate was between 13% and 70% during summer 2011 and summer 2012. The results also showed high efficiency in the delivery of oxygen and nutrients to the contaminated areas of the beach. However, the approach has an environmental cost associated with it, and stakeholders would need to conduct a rigorous net environmental benefit analysis (NEBA) for pursuing the bioremediation of submerged contaminated sediments, especially in higher latitudes. Copyright © 2016 Elsevier Ltd. All rights reserved.
Application of systematic review methodology to the field of nutrition
USDA-ARS?s Scientific Manuscript database
Systematic reviews represent a rigorous and transparent approach of synthesizing scientific evidence that minimizes bias. They evolved within the medical community to support development of clinical and public health practice guidelines, set research agendas and formulate scientific consensus state...
Peer Assessment with Online Tools to Improve Student Modeling
NASA Astrophysics Data System (ADS)
Atkins, Leslie J.
2012-11-01
Introductory physics courses often require students to develop precise models of phenomena and represent these with diagrams, including free-body diagrams, light-ray diagrams, and maps of field lines. Instructors expect that students will adopt a certain rigor and precision when constructing these diagrams, but we want that rigor and precision to be an aid to sense-making rather than meeting seemingly arbitrary requirements set by the instructor. By giving students the authority to develop their own models and establish requirements for their diagrams, the sense that these are arbitrary requirements diminishes and students are more likely to see modeling as a sense-making activity. The practice of peer assessment can help students take ownership; however, it can be difficult for instructors to manage. Furthermore, it is not without risk: students can be reluctant to critique their peers, they may view this as the job of the instructor, and there is no guarantee that students will employ greater rigor and precision as a result of peer assessment. In this article, we describe one approach for peer assessment that can establish norms for diagrams in a way that is student driven, where students retain agency and authority in assessing and improving their work. We show that such an approach does indeed improve students' diagrams and abilities to assess their own work, without sacrificing students' authority and agency.
Quantifying Differential Privacy under Temporal Correlations
Cao, Yang; Yoshikawa, Masatoshi; Xiao, Yonghui; Xiong, Li
2017-01-01
Differential Privacy (DP) has received increasing attention as a rigorous privacy framework. Many existing studies employ traditional DP mechanisms (e.g., the Laplace mechanism) as primitives, which assume that the data are independent, or that adversaries do not have knowledge of the data correlations. However, continuously generated data in the real world tend to be temporally correlated, and such correlations can be acquired by adversaries. In this paper, we investigate the potential privacy loss of a traditional DP mechanism under temporal correlations in the context of continuous data release. First, we model the temporal correlations using a Markov model and analyze the privacy leakage of a DP mechanism when adversaries have knowledge of such temporal correlations. Our analysis reveals that the privacy loss of a DP mechanism may accumulate and increase over time. We call it temporal privacy leakage. Second, to measure such privacy loss, we design an efficient algorithm for calculating it in polynomial time. Although the temporal privacy leakage may increase over time, we also show that its supremum may exist in some cases. Third, to bound the privacy loss, we propose mechanisms that convert any existing DP mechanism into one against temporal privacy leakage. Experiments with synthetic data confirm that our approach is efficient and effective. PMID:28883711
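The Laplace mechanism that such analyses take as a primitive can be sketched in a few lines; the query, sensitivity and epsilon below are assumptions. The paper's point is that repeated releases over temporally correlated data may leak more than the nominal epsilon per time step.

```python
# Sketch: one epsilon-DP release of a scalar query via the Laplace mechanism.
import numpy as np

def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float,
                      rng: np.random.Generator) -> float:
    """Release true_value + Lap(sensitivity/epsilon): epsilon-DP for a single,
    independent release (the guarantee the paper re-examines under correlation)."""
    return true_value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

rng = np.random.default_rng(42)
counts = [120, 123, 121, 119]                  # toy time series of counts
releases = [laplace_mechanism(c, sensitivity=1.0, epsilon=0.5, rng=rng)
            for c in counts]
print(releases)
```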
Xie, Yinghao; Wu, Fangfang; Sun, Xiaoqin; Chen, Hongmei; Lv, Meilin; Ni, Shuang; Liu, Gang; Xu, Xiaoxiang
2016-01-01
Wurtzite solid solutions between GaN and ZnO highlight an intriguing paradigm for water splitting into hydrogen and oxygen using solar energy. However, large composition discrepancy often occurs inside the compound owing to the volatile nature of Zn, thereby prescribing rigorous terms on synthetic conditions. Here we demonstrate the merits of constituting quinary Zn-Ga-Ge-N-O solid solutions by introducing Ge into the wurtzite framework. The presence of Ge not only mitigates the vaporization of Zn but also strongly promotes particle crystallization. Synthetic details for these quinary compounds were systematically explored and their photocatalytic properties were thoroughly investigated. Proper starting molar ratios of Zn/Ga/Ge are of primary importance for single phase formation, high particle crystallinity and good photocatalytic performance. Efficient photocatalytic hydrogen and oxygen production from water were achieved for these quinary solid solutions, which is strongly correlated with the Ge content in the structure. The apparent quantum efficiency for the optimized sample approaches 1.01% for hydrogen production and 1.14% for oxygen production. Theoretical calculation reveals the critical role of Zn for the band gap reduction in these solid solutions, and their superior photocatalytic activity can be understood by the preservation of Zn in the structure as well as a good crystallinity after introducing Ge. PMID:26755070
NASA Astrophysics Data System (ADS)
Vanicek, Jiri
2014-03-01
Rigorous quantum-mechanical calculations of coherent ultrafast electronic spectra remain difficult. I will present several approaches developed in our group that increase the efficiency and accuracy of such calculations: First, we justified the feasibility of evaluating time-resolved spectra of large systems by proving that the number of trajectories needed for convergence of the semiclassical dephasing representation/phase averaging is independent of dimensionality. Recently, we further accelerated this approximation with a cellular scheme employing inverse Weierstrass transform and optimal scaling of the cell size. The accuracy of potential energy surfaces was increased by combining the dephasing representation with accurate on-the-fly ab initio electronic structure calculations, including nonadiabatic and spin-orbit couplings. Finally, the inherent semiclassical approximation was removed in the exact quantum Gaussian dephasing representation, in which semiclassical trajectories are replaced by communicating frozen Gaussian basis functions evolving classically with an average Hamiltonian. Among other examples I will present an on-the-fly ab initio semiclassical dynamics calculation of the dispersed time-resolved stimulated emission spectrum of the 54-dimensional azulene. This research was supported by EPFL and by the Swiss National Science Foundation NCCR MUST (Molecular Ultrafast Science and Technology) and Grant No. 200021124936/1.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tong, Dudu; Yang, Sichun; Lu, Lanyuan
2016-06-20
Structure modelling via small-angle X-ray scattering (SAXS) data generally requires intensive computations of scattering intensity from any given biomolecular structure, where the accurate evaluation of SAXS profiles using coarse-grained (CG) methods is vital to improve computational efficiency. To date, most CG SAXS computing methods have been based on a single-bead-per-residue approximation but have neglected structural correlations between amino acids. To improve the accuracy of scattering calculations, accurate CG form factors of amino acids are now derived using a rigorous optimization strategy, termed electron-density matching (EDM), to best fit electron-density distributions of protein structures. This EDM method is compared with and tested against other CG SAXS computing methods, and the resulting CG SAXS profiles from EDM agree better with all-atom theoretical SAXS data. By including the protein hydration shell represented by explicit CG water molecules and the correction of protein excluded volume, the developed CG form factors also reproduce the selected experimental SAXS profiles with very small deviations. Taken together, these EDM-derived CG form factors present an accurate and efficient computational approach for SAXS computing, especially when higher molecular details (represented by the q range of the SAXS data) become necessary for effective structure modelling.
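The underlying coarse-grained SAXS computation can be sketched with the Debye formula, summing over bead pairs with per-bead form factors; the constant form factors and random coordinates below are placeholders rather than the EDM-optimized values (real form factors also depend on q).

```python
# Sketch: Debye-formula SAXS intensity for one-bead-per-residue coordinates.
import numpy as np

def debye_intensity(coords: np.ndarray, form_factors: np.ndarray,
                    q_values: np.ndarray) -> np.ndarray:
    """I(q) = sum_ij f_i f_j sin(q r_ij)/(q r_ij), with the i == j term -> f_i^2."""
    diff = coords[:, None, :] - coords[None, :, :]
    r = np.linalg.norm(diff, axis=2)
    intensity = np.empty_like(q_values)
    for k, q in enumerate(q_values):
        x = q * r
        sinc = np.where(x > 0.0, np.sin(x) / np.where(x > 0.0, x, 1.0), 1.0)
        intensity[k] = form_factors @ sinc @ form_factors
    return intensity

rng = np.random.default_rng(0)
coords = rng.uniform(0.0, 30.0, size=(100, 3))   # toy CG bead coordinates (Angstrom)
f = np.full(100, 5.0)                            # placeholder bead form factors
q = np.linspace(0.01, 0.5, 50)                   # Angstrom^-1
print(debye_intensity(coords, f, q)[:5])
```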
Mandal, Aninda; Datta, Animesh K
2014-01-01
A "thick stem" mutant of Corchorus olitorius L. was induced at M2 (0.50%, 4 h, EMS) and the true breeding mutant is assessed across generations (M5 to M7) considering morphometric traits as well as SEM analysis of pollen grains and raw jute fibres, stem anatomy, cytogenetical attributes, and lignin content in relation to control. Furthermore, single fibre diameter and tensile strength are also analysed. The objective is to assess the stability of mutant for its effective exploration for raising a new plant type in tossa jute for commercial exploitation and efficient breeding. The mutant trait is monogenic recessive to normal. Results indicate that "thick stem" mutant is stable across generations (2n = 14) with distinctive high seed and fibre yield and significantly low lignin content. Stem anatomy of the mutant shows significant enhancement in fibre zone, number of fibre pyramids and fibre bundles per pyramid, and diameter of fibre cell in relation to control. Moreover, tensile strength of mutant fibre is significantly higher than control fibre and the trait is inversely related to fibre diameter. However the mutant is associated with low germination frequency, poor seed viability, and high pollen sterility, which may be eliminated through mutational approach followed by rigorous selection and efficient breeding.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Phadke, Amol; Shah, Nihar; Abhyankar, Nikit
Improving efficiency of air conditioners (ACs) typically involves improving the efficiency of various components such as compressors, heat exchangers, expansion valves, refrigerant, and fans. We estimate the incremental cost of improving the efficiency of room ACs based on the cost of improving the efficiency of its key components. Further, we estimate the retail price increase required to cover the cost of efficiency improvement, compare it with electricity bill savings, and calculate the payback period for consumers to recover the additional price of a more efficient AC. The finding that significant efficiency improvement is cost effective from a consumer perspective is robust over a wide range of assumptions. If we assume a 50% higher incremental price compared to our baseline estimate, the payback period for the efficiency level of 3.5 ISEER is 1.1 years. Given the findings of this study, establishing more stringent minimum efficiency performance criteria (one-star level) should be evaluated rigorously considering significant benefits to consumers, energy security, and the environment.
NASA Astrophysics Data System (ADS)
Vidal, A.; San-Blas, A. A.; Quesada-Pereira, F. D.; Pérez-Soler, J.; Gil, J.; Vicente, C.; Gimeno, B.; Boria, V. E.
2015-07-01
A novel technique for the full-wave analysis of 3-D complex waveguide devices is presented. This new formulation, based on the Boundary Integral-Resonant Mode Expansion (BI-RME) method, allows the rigorous full-wave electromagnetic characterization of 3-D arbitrarily shaped metallic structures making use of extremely low CPU resources (both time and memory). The unknown electric current density on the surface of the metallic elements is represented by means of Rao-Wilton-Glisson basis functions, and an algebraic procedure based on a singular value decomposition is applied to transform such functions into the classical solenoidal and nonsolenoidal basis functions needed by the original BI-RME technique. The developed tool also provides an accurate computation of the electromagnetic fields at an arbitrary observation point of the considered device, so it can be used for predicting high-power breakdown phenomena. In order to validate the accuracy and efficiency of this novel approach, several new designs of band-pass waveguide filters are presented. The obtained results (S-parameters and electromagnetic fields) are successfully compared both to experimental data and to numerical simulations provided by a commercial software package based on the finite element technique. The results obtained show that the new technique is especially suitable for the efficient full-wave analysis of complex waveguide devices considering an integrated coaxial excitation, where the coaxial probes may be in contact with the metallic insets of the component.
Mechanical properties of frog skeletal muscles in iodoacetic acid rigor.
Mulvany, M J
1975-01-01
1. Methods have been developed for describing the length: tension characteristics of frog skeletal muscles which go into rigor at 4 degrees C following iodoacetic acid poisoning either in the presence of Ca2+ (Ca-rigor) or its absence (Ca-free-rigor). 2. Such rigor muscles showed less resistance to slow stretch (slow rigor resistance) than to fast stretch (fast rigor resistance). The slow and fast rigor resistances of Ca-free-rigor muscles were much lower than those of Ca-rigor muscles. 3. The slow rigor resistance of Ca-rigor muscles was proportional to the amount of overlap between the contractile filaments present when the muscles were put into rigor. 4. Withdrawing Ca2+ from Ca-rigor muscles (induced-Ca-free rigor) reduced their slow and fast rigor resistances. Readdition of Ca2+ (but not Mg2+, Mn2+ or Sr2+) reversed the effect. 5. The slow and fast rigor resistances of Ca-rigor muscles (but not of Ca-free-rigor muscles) decreased with time. 6. The sarcomere structure of Ca-rigor and induced-Ca-free rigor muscles stretched by 0.2 lo was destroyed in proportion to the amount of stretch, but the lengths of the remaining intact sarcomeres were essentially unchanged. This suggests that there had been a successive yielding of the weakest sarcomeres. 7. The difference between the slow and fast rigor resistance and the effect of calcium on these resistances are discussed in relation to possible variations in the strength of crossbridges between the thick and thin filaments. PMID:1082023
NASA Astrophysics Data System (ADS)
Moortgat, Joachim; Firoozabadi, Abbas
2013-10-01
Numerical simulation of multiphase compositional flow in fractured porous media, when all the species can transfer between the phases, is a real challenge. Despite the broad applications in hydrocarbon reservoir engineering and hydrology, a compositional numerical simulator for three-phase flow in fractured media has not appeared in the literature, to the best of our knowledge. In this work, we present a three-phase fully compositional simulator for fractured media, based on higher-order finite element methods. To achieve computational efficiency, we invoke the cross-flow equilibrium (CFE) concept between discrete fractures and a small neighborhood in the matrix blocks. We adopt the mixed hybrid finite element (MHFE) method to approximate convective Darcy fluxes and the pressure equation. This approach is the most natural choice for flow in fractured media. The mass balance equations are discretized by the discontinuous Galerkin (DG) method, which is perhaps the most efficient approach to capture physical discontinuities in phase properties at the matrix-fracture interfaces and at phase boundaries. In this work, we account for gravity and Fickian diffusion. The modeling of capillary effects is discussed in a separate paper. We present the mathematical framework, using the implicit-pressure-explicit-composition (IMPEC) scheme, which facilitates rigorous thermodynamic stability analyses and the computation of phase behavior effects to account for transfer of species between the phases. A deceptively simple CFL condition is implemented to improve numerical stability and accuracy. We provide six numerical examples at both small and larger scales and in two and three dimensions, to demonstrate powerful features of the formulation.
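The flavor of the CFL-type restriction such an IMPEC scheme enforces can be sketched generically (this is not the authors' specific criterion): the explicit composition update must not remove more fluid from a cell than a fraction of its pore volume per time step. The grid values below are illustrative assumptions.

```python
# Generic sketch of a CFL-limited time step for an explicit composition update.
import numpy as np

def cfl_time_step(cell_volume, porosity, outflux, cfl=0.5):
    """Largest dt such that outflux*dt <= cfl * pore volume in every cell."""
    pore_volume = cell_volume * porosity
    with np.errstate(divide="ignore"):
        dt_cells = np.where(outflux > 0.0, cfl * pore_volume / outflux, np.inf)
    return dt_cells.min()

cell_volume = np.full(5, 10.0)                   # m^3, toy grid
porosity = np.full(5, 0.2)
outflux = np.array([0.4, 0.8, 0.6, 0.2, 0.0])    # m^3/day leaving each cell
print(cfl_time_step(cell_volume, porosity, outflux))
```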
ERIC Educational Resources Information Center
Coryn, Chris L. S.; Schroter, Daniela C.; Hanssen, Carl E.
2009-01-01
Brinkerhoff's Success Case Method (SCM) was developed with the specific purpose of assessing the impact of organizational interventions (e.g., training and coaching) on business goals by analyzing extreme groups using case study techniques and storytelling. As an efficient and cost-effective method of evaluative inquiry, SCM is attractive in other…
An analysis of phenotypic selection in natural stands of northern red oak (Quercus rubra L.)
Jeffery W. Stringer; David B. Wagner; Scott E. Schlarbaum; Daniel B. Houston
1995-01-01
Comparison of growth and stem quality parameters of 19-year-old progeny from superior and comparison trees indicates that rigorous phenotypic selection of trees in natural stands may not be an efficient method of parent tree selection for Quercus rubra L. Total tree height, dbh, number of branches in the butt log, fork height, and number of mainstem...
Library design practices for success in lead generation with small molecule libraries.
Goodnow, R A; Guba, W; Haap, W
2003-11-01
The generation of novel structures amenable to rapid and efficient lead optimization comprises an emerging strategy for success in modern drug discovery. Small molecule libraries of sufficient size and diversity to increase the chances of discovery of novel structures make the high throughput synthesis approach the method of choice for lead generation. Despite an industry trend for smaller, more focused libraries, the need to generate novel lead structures makes larger libraries a necessary strategy. For libraries of a several thousand or more members, solid phase synthesis approaches are the most suitable. While the technology and chemistry necessary for small molecule library synthesis continue to advance, success in lead generation requires rigorous consideration in the library design process to ensure the synthesis of molecules possessing the proper characteristics for subsequent lead optimization. Without proper selection of library templates and building blocks, solid phase synthesis methods often generate molecules which are too heavy, too lipophilic and too complex to be useful for lead optimization. The appropriate filtering of virtual library designs with multiple computational tools allows the generation of information-rich libraries within a drug-like molecular property space. An understanding of the hit-to-lead process provides a practical guide to molecular design characteristics. Examples of leads generated from library approaches also provide a benchmarking of successes as well as aspects for continued development of library design practices.
Cutting More than Metal: Breaking the Development Cycle
NASA Technical Reports Server (NTRS)
Singer, Chris
2014-01-01
New technology is changing the way we do business at NASA. The ability to use these new tools is made possible by a learning culture able to embrace innovation, flexibility, and prudent risk tolerance, while retaining the hard-won lessons learned of other successes and failures. Technologies such as 3-D manufacturing and structured light scanning are re-shaping the entire product life cycle, from design and analysis, through production, verification, logistics and operations. New fabrication techniques, verification techniques, integrated analysis, and models that follow the hardware from initial concept through operation are reducing the cost and time of building space hardware. Using these technologies to be more efficient, reliable and affordable requires that we bring them to a level safe for NASA systems, maintain appropriate rigor in testing and acceptance, and transition new technology. Maximizing these technologies also requires cultural acceptance and understanding and balancing rules with creativity. Evolved systems engineering processes at NASA are increasingly more flexible than they have been in the past, enabling the implementation of new techniques and approaches. This paper provides an overview of NASA Marshall Space Flight Center's new approach to development, as well as examples of how that approach has been incorporated into NASA's Space Launch System (SLS) Program, which counts among its key tenets safety, affordability, and sustainability. One of the 3D technologies that will be discussed in this paper is the design and testing of various rocket engine components.
Wide-field-of-view nanoscale Bragg liquid crystal polarization gratings
NASA Astrophysics Data System (ADS)
Xiang, Xiao; Kim, Jihwan; Escuti, Michael J.
2018-02-01
Here, we demonstrate a liquid crystal (LC) polymer Bragg polarization grating (PG) with large angular bandwidth and high efficiency in transmission-mode for 532 nm wavelength and 400 nm period. The field-of-view (FOV) is increased significantly while preserving high diffraction efficiency by realizing a monolithic grating comprising two different slants. Using rigorous coupled-wave analysis simulation, we identified a structure with 48° FOV and 70% average first-order efficiency. We then experimentally fabricated and characterized the grating with a photo-aligned LC polymer network, also known as reactive mesogens. We measured 40° FOV and nearly 80% average diffraction efficiency. With this broadened and fairly uniform angular response, this wide-FOV Bragg PG is a compelling option for large deflection-angle applications, including near-eye display in augmented reality systems, waveguide based illumination, and beam steering.
Landscape metrics, scales of resolution
Samuel A. Cushman; Kevin McGarigal
2008-01-01
Effective implementation of the "multiple path" approach to managing green landscapes depends fundamentally on rigorous quantification of the composition and structure of the landscapes of concern at present, modelling landscape structure trajectories under alternative management paths, and monitoring landscape structure into the future to confirm...
Autism and Pervasive Developmental Disorders
ERIC Educational Resources Information Center
Volkmar, Fred R.; Lord, Catherine; Bailey, Anthony; Schultz, Robert T.; Klin, Ami
2004-01-01
The quantity and quality of research into autism and related conditions have increased dramatically in recent years. Consequently we selectively review key accomplishments and highlight directions for future research. More consistent approaches to diagnosis and more rigorous assessment methods have significantly advanced research, although the…
High-order computer-assisted estimates of topological entropy
NASA Astrophysics Data System (ADS)
Grote, Johannes
The concept of Taylor Models is introduced, which offers highly accurate C0-estimates for the enclosures of functional dependencies, combining high-order Taylor polynomial approximation of functions and rigorous estimates of the truncation error, performed using verified interval arithmetic. The focus of this work is on the application of Taylor Models in algorithms for strongly nonlinear dynamical systems. A method to obtain sharp rigorous enclosures of Poincare maps for certain types of flows and surfaces is developed and numerical examples are presented. Differential algebraic techniques allow the efficient and accurate computation of polynomial approximations for invariant curves of certain planar maps around hyperbolic fixed points. Subsequently we introduce a procedure to extend these polynomial curves to verified Taylor Model enclosures of local invariant manifolds with C0-errors of size 10-10--10 -14, and proceed to generate the global invariant manifold tangle up to comparable accuracy through iteration in Taylor Model arithmetic. Knowledge of the global manifold structure up to finite iterations of the local manifold pieces enables us to find all homoclinic and heteroclinic intersections in the generated manifold tangle. Combined with the mapping properties of the homoclinic points and their ordering we are able to construct a subshift of finite type as a topological factor of the original planar system to obtain rigorous lower bounds for its topological entropy. This construction is fully automatic and yields homoclinic tangles with several hundred homoclinic points. As an example rigorous lower bounds for the topological entropy of the Henon map are computed, which to the best knowledge of the authors yield the largest such estimates published so far.
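The verified-interval-arithmetic idea underlying Taylor Models can be illustrated with a toy interval class; a real implementation adds a high-order polynomial part and outward (directed) rounding, both omitted here for brevity.

```python
# Toy sketch of interval arithmetic: every operation returns an interval that
# contains the true result, so composite expressions yield enclosures
# (rigorous in exact arithmetic; a verified implementation would also round
# interval endpoints outward to absorb floating-point error).
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float

    def __add__(self, other: "Interval") -> "Interval":
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other: "Interval") -> "Interval":
        products = (self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi)
        return Interval(min(products), max(products))

def enclose_quadratic(x: Interval) -> Interval:
    """Enclosure of f(x) = x*x + x over the input interval."""
    return x * x + x

# Prints an interval that contains the true range [-0.25, 0.75] of f on [-0.5, 0.5];
# the overestimation illustrates why Taylor Models carry a polynomial part.
print(enclose_quadratic(Interval(-0.5, 0.5)))
```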
Endobiogeny: a global approach to systems biology (part 1 of 2).
Lapraz, Jean-Claude; Hedayat, Kamyar M
2013-01-01
Endobiogeny is a global systems approach to human biology that may offer an advancement in clinical medicine based in scientific principles of rigor and experimentation and the humanistic principles of individualization of care and alleviation of suffering with minimization of harm. Endobiogeny is neither a movement away from modern science nor an uncritical embracing of pre-rational methods of inquiry but a synthesis of quantitative and qualitative relationships reflected in a systems-approach to life and based on new mathematical paradigms of pattern recognition.
NASA Astrophysics Data System (ADS)
Bi, Lei; Yang, Ping
2016-07-01
The accuracy of the physical-geometric optics (PG-O) approximation is examined for the simulation of electromagnetic scattering by nonspherical dielectric particles. This study seeks a better understanding of the tunneling effect on the phase matrix by employing the invariant imbedding method to rigorously compute the zeroth-order Debye series, from which the tunneling efficiency and the phase matrix corresponding to the diffraction and external reflection are obtained. The tunneling efficiency is shown to be a factor quantifying the relative importance of the tunneling effect over the Fraunhofer diffraction near the forward scattering direction. Due to the tunneling effect, different geometries with the same projected cross section might have different diffraction patterns, which are traditionally assumed to be identical according to the Babinet principle. For particles with a fixed orientation, the PG-O approximation yields the external reflection pattern with reasonable accuracy, but ordinarily fails to predict the locations of peaks and minima in the diffraction pattern. The larger the tunneling efficiency, the worse the PG-O accuracy is at scattering angles less than 90°. If the particles are assumed to be randomly oriented, the PG-O approximation yields the phase matrix close to the rigorous counterpart, primarily due to error cancellations in the orientation-average process. Furthermore, the PG-O approximation based on an electric field volume-integral equation is shown to usually be much more accurate than the Kirchhoff surface integral equation at side-scattering angles, particularly when the modulus of the complex refractive index is close to unity. Finally, tunneling efficiencies are tabulated for representative faceted particles.
Augmented assessment as a means to augmented reality.
Bergeron, Bryan
2006-01-01
Rigorous scientific assessment of educational technologies typically lags behind the availability of the technologies by years because of the lack of validated instruments and benchmarks. Even when the appropriate assessment instruments are available, they may not be applied because of time and monetary constraints. Work in augmented reality, instrumented mannequins, serious gaming, and similar promising educational technologies that haven't undergone timely, rigorous evaluation, highlights the need for assessment methodologies that address the limitations of traditional approaches. The most promising augmented assessment solutions incorporate elements of rapid prototyping used in the software industry, simulation-based assessment techniques modeled after methods used in bioinformatics, and object-oriented analysis methods borrowed from object oriented programming.
Rigorous Numerics for ill-posed PDEs: Periodic Orbits in the Boussinesq Equation
NASA Astrophysics Data System (ADS)
Castelli, Roberto; Gameiro, Marcio; Lessard, Jean-Philippe
2018-04-01
In this paper, we develop computer-assisted techniques for the analysis of periodic orbits of ill-posed partial differential equations. As a case study, our proposed method is applied to the Boussinesq equation, which has been investigated extensively because of its role in the theory of shallow water waves. The idea is to use the symmetry of the solutions and a Newton-Kantorovich type argument (the radii polynomial approach) to obtain rigorous proofs of existence of the periodic orbits in a weighted ℓ1 Banach space of space-time Fourier coefficients with exponential decay. We present several computer-assisted proofs of the existence of periodic orbits at different parameter values.
Peter, Trevor; Zeh, Clement; Katz, Zachary; Elbireer, Ali; Alemayehu, Bereket; Vojnov, Lara; Costa, Alex; Doi, Naoko; Jani, Ilesh
2017-11-01
The scale-up of effective HIV viral load (VL) testing is an urgent public health priority. Implementation of testing is supported by the availability of accurate, nucleic acid based laboratory and point-of-care (POC) VL technologies and strong WHO guidance recommending routine testing to identify treatment failure. However, test implementation faces challenges related to the developing health systems in many low-resource countries. The purpose of this commentary is to review the challenges and solutions from the large-scale implementation of other diagnostic tests, namely nucleic-acid based early infant HIV diagnosis (EID) and CD4 testing, and identify key lessons to inform the scale-up of VL. Experience with EID and CD4 testing provides many key lessons to inform VL implementation and may enable more effective and rapid scale-up. The primary lessons from earlier implementation efforts are to strengthen linkage to clinical care after testing, and to improve the efficiency of testing. Opportunities to improve linkage include data systems to support the follow-up of patients through the cascade of care and test delivery, rapid sample referral networks, and POC tests. Opportunities to increase testing efficiency include improvements to procurement and supply chain practices, well connected tiered laboratory networks with rational deployment of test capacity across different levels of health services, routine resource mapping and mobilization to ensure adequate resources for testing programs, and improved operational and quality management of testing services. If applied to VL testing programs, these approaches could help improve the impact of VL on ART failure management and patient outcomes, reduce overall costs, help ensure sustainable access to reduced pricing for test commodities, and improve supportive health systems through more efficient and more rigorous quality assurance. These lessons draw from traditional laboratory practices as well as fields such as logistics, operations management and business. The lessons and innovations from large-scale EID and CD4 programs described here can be adapted to inform more effective scale-up approaches for VL. They demonstrate the value of an integrated approach to health system strengthening that focuses on key levers for test access such as data systems, supply efficiencies and network management. They also highlight the challenges with implementation and the need for more innovative approaches and effective partnerships to achieve equitable and cost-effective test access. © 2017 The Authors. Journal of the International AIDS Society published by John Wiley & sons Ltd on behalf of the International AIDS Society.
ERIC Educational Resources Information Center
Marsh, Herbert W.; Hau, Kit-Tai; Wen, Zhonglin
2004-01-01
Goodness-of-fit (GOF) indexes provide "rules of thumb" (recommended cutoff values) for assessing fit in structural equation modeling. Hu and Bentler (1999) proposed a more rigorous approach to evaluating decision rules based on GOF indexes and, on this basis, proposed new and more stringent cutoff values for many indexes. This article discusses…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Falsaperla, P.; Fonte, G.
1993-05-01
Applying a method based on some results due to Kato [Proc. Phys. Soc. Jpn. 4, 334 (1949)], we show that series of Rydberg eigenvalues and Rydberg eigenfunctions of hydrogen in a uniform magnetic field can be calculated with a rigorous error estimate. The efficiency of the method decreases as the eigenvalue density increases and as γn³ → 1, where γ is the magnetic-field strength in units of 2.35×10⁹ G and n is the principal quantum number of the unperturbed hydrogenic manifold from which the diamagnetic Rydberg states evolve. Fixing γ at the laboratory value 2×10⁻⁵ and confining our calculations to the region γn³ < 1 (weak-field regime), we obtain extremely accurate results up to states corresponding to the n = 32 manifold.
Predictive QSAR modeling workflow, model applicability domains, and virtual screening.
Tropsha, Alexander; Golbraikh, Alexander
2007-01-01
Quantitative Structure Activity Relationship (QSAR) modeling has been traditionally applied as an evaluative approach, i.e., with the focus on developing retrospective and explanatory models of existing data. Model extrapolation was considered if only in hypothetical sense in terms of potential modifications of known biologically active chemicals that could improve compounds' activity. This critical review re-examines the strategy and the output of the modern QSAR modeling approaches. We provide examples and arguments suggesting that current methodologies may afford robust and validated models capable of accurate prediction of compound properties for molecules not included in the training sets. We discuss a data-analytical modeling workflow developed in our laboratory that incorporates modules for combinatorial QSAR model development (i.e., using all possible binary combinations of available descriptor sets and statistical data modeling techniques), rigorous model validation, and virtual screening of available chemical databases to identify novel biologically active compounds. Our approach places particular emphasis on model validation as well as the need to define model applicability domains in the chemistry space. We present examples of studies where the application of rigorously validated QSAR models to virtual screening identified computational hits that were confirmed by subsequent experimental investigations. The emerging focus of QSAR modeling on target property forecasting brings it forward as predictive, as opposed to evaluative, modeling approach.
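One common, simple way to define an applicability domain, in the spirit the review advocates, is a k-nearest-neighbor distance cutoff in descriptor space; the descriptors, k and threshold rule in the sketch below are assumptions, not the authors' specific protocol.

```python
# Sketch: flag query compounds as inside/outside a distance-based applicability domain.
import numpy as np

def applicability_domain(train_x: np.ndarray, query_x: np.ndarray,
                         k: int = 3, z: float = 2.0) -> np.ndarray:
    """Return a boolean mask: True where a query compound is inside the domain."""
    def knn_mean_dist(points, ref):
        d = np.linalg.norm(ref[:, None, :] - points[None, :, :], axis=2)
        d.sort(axis=1)
        # skip the zero self-distance when ref and points are the same array
        return d[:, 1:k + 1].mean(axis=1) if points is ref else d[:, :k].mean(axis=1)

    train_d = knn_mean_dist(train_x, train_x)          # leave-self-out distances
    threshold = train_d.mean() + z * train_d.std()
    return knn_mean_dist(train_x, query_x) <= threshold

rng = np.random.default_rng(1)
train = rng.standard_normal((50, 4))                   # toy descriptor matrix
queries = np.vstack([rng.standard_normal((3, 4)), [[8, 8, 8, 8]]])
print(applicability_domain(train, queries))            # the outlier is flagged False
```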
NASA Astrophysics Data System (ADS)
Vassena, G.; Clerici, A.
2018-05-01
The state of the art of 3D surveying technologies, if correctly applied, makes it possible to obtain 3D coloured models of large open pit mines using different technologies such as terrestrial laser scanning (TLS) with images, combined with UAV-based digital photogrammetry. GNSS and/or total stations are also commonly used to georeference the model. The University of Brescia has carried out a project to map in 3D an open pit mine located in Botticino, a well-known marble extraction site close to Brescia in northern Italy. Terrestrial laser scanner 3D point clouds combined with RGB images and digital photogrammetry from a UAV have been used to map a large part of the quarry. Using rigorous and well-known procedures, a 3D point cloud and mesh model have been obtained through an easy and rigorous approach. After describing the combined mapping process, the paper presents the innovative procedure proposed for the daily/weekly update of the model itself. To realize this task, a SLAM-based approach is described, using an instrument capable of running an automatic localization process and performing real-time, on-the-field change detection analysis.
Bayesian Reconstruction of Disease Outbreaks by Combining Epidemiologic and Genomic Data
Jombart, Thibaut; Cori, Anne; Didelot, Xavier; Cauchemez, Simon; Fraser, Christophe; Ferguson, Neil
2014-01-01
Recent years have seen progress in the development of statistically rigorous frameworks to infer outbreak transmission trees (“who infected whom”) from epidemiological and genetic data. Making use of pathogen genome sequences in such analyses remains a challenge, however, with a variety of heuristic approaches having been explored to date. We introduce a statistical method exploiting both pathogen sequences and collection dates to unravel the dynamics of densely sampled outbreaks. Our approach identifies likely transmission events and infers dates of infections, unobserved cases and separate introductions of the disease. It also proves useful for inferring numbers of secondary infections and identifying heterogeneous infectivity and super-spreaders. After testing our approach using simulations, we illustrate the method with the analysis of the beginning of the 2003 Singaporean outbreak of Severe Acute Respiratory Syndrome (SARS), providing new insights into the early stage of this epidemic. Our approach is the first tool for disease outbreak reconstruction from genetic data widely available as free software, the R package outbreaker. It is applicable to various densely sampled epidemics, and improves previous approaches by detecting unobserved and imported cases, as well as allowing multiple introductions of the pathogen. Because of its generality, we believe this method will become a tool of choice for the analysis of densely sampled disease outbreaks, and will form a rigorous framework for subsequent methodological developments. PMID:24465202
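As a purely illustrative toy (not the outbreaker model itself), the idea of scoring a candidate infector by combining genetic and temporal evidence can be written as the product of a mutation likelihood and a generation-time likelihood; all parameters and case data below are invented.

```python
# Toy scoring of candidate infectors for one case: combine a Poisson likelihood on the
# number of mutations separating two genomes with a gamma-distributed generation time.
# Parameters and case data are invented for illustration only.
import numpy as np
from scipy import stats

mu = 0.8                          # assumed mutations per genome per day (hypothetical)
gen_shape, gen_scale = 2.0, 4.0   # assumed generation-time distribution (days)

# delay between collection dates and pairwise SNP distance for two candidate infectors
candidates = {1: {"delay_days": 9, "snp_dist": 6},
              2: {"delay_days": 2, "snp_dist": 14}}

def score(delay_days, snp_dist):
    genetic = stats.poisson.pmf(snp_dist, mu * delay_days)
    temporal = stats.gamma.pdf(delay_days, a=gen_shape, scale=gen_scale)
    return genetic * temporal

for cid, d in candidates.items():
    print(cid, score(d["delay_days"], d["snp_dist"]))
```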
Qualitative Methods in Field Research: An Indonesian Experience in Community Based Practice.
ERIC Educational Resources Information Center
Lysack, Catherine L.; Krefting, Laura
1994-01-01
Cross-cultural evaluation of a community-based rehabilitation project in Indonesia used three methods: focus groups, questionnaires, and key informant interviews. A continuous cyclical approach to data collection and concern for cultural sensitivity increased the rigor of the research. (SK)
Accurate Biomass Estimation via Bayesian Adaptive Sampling
NASA Technical Reports Server (NTRS)
Wheeler, Kevin R.; Knuth, Kevin H.; Castle, Joseph P.; Lvov, Nikolay
2005-01-01
The following concepts were introduced: a) Bayesian adaptive sampling for solving biomass estimation; b) characterization of MISR Rahman model parameters conditioned upon MODIS landcover; c) a rigorous non-parametric Bayesian approach to analytic mixture model determination; d) a unique U.S. asset for science product validation and verification.
Blended Learning: An Innovative Approach
ERIC Educational Resources Information Center
Lalima; Dangwal, Kiran Lata
2017-01-01
Blended learning is an innovative concept that embraces the advantages of both traditional teaching in the classroom and ICT supported learning including both offline learning and online learning. It has scope for collaborative learning; constructive learning and computer assisted learning (CAI). Blended learning needs rigorous efforts, right…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burgard, K.G.
This Configuration Management Implementation Plan was developed to assist in the management of systems, structures, and components, to facilitate the effective control and statusing of changes to systems, structures, and components; and to ensure technical consistency between design, performance, and operational requirements. Its purpose is to describe the approach Project W-464 will take in implementing a configuration management control, to determine the rigor of control, and to identify the mechanisms for imposing that control.
Chan, T M Simon; Teram, Eli; Shaw, Ian
2017-01-01
Despite growing consideration of the needs of research participants in studies related to sensitive issues, discussions of alternative ways to design sensitive research are scarce. Structured as an exchange between two researchers who used different approaches in their studies with childhood sexual abuse survivors, in this article, we seek to advance understanding of methodological and ethical issues in designing sensitive research. The first perspective, which is termed protective, promotes the gradual progression of participants from a treatment phase into a research phase, with the ongoing presence of a researcher and a social worker in both phases. In the second perspective, which is termed minimalist, we argue for clear boundaries between research and treatment processes, limiting the responsibility of researchers to ensuring that professional support is available to participants who experience emotional difficulties. Following rebuttals, lessons are drawn for ethical balancing between methodological rigor and the needs of participants. © The Author(s) 2015.
Phytoremediation of hazardous wastes. Technical report, 23--26 July 1995
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCutcheon, S.C.; Wolfe, N.L.; Carreria, L.H.
1995-07-26
A new and innovative approach to phytoremediation (the use of plants to degrade hazardous contaminants) was developed. The new approach to phytoremediation involves rigorous pathway analyses, mass balance determinations, and identification of specific enzymes that break down trinitrotoluene (TNT), other explosives (RDX and HMX), nitrobenzene, and chlorinated solvents (e.g., TCE and PCE) (EPA 1994). As a good example, TNT is completely and rapidly degraded by nitroreductase and laccase enzymes. The aromatic ring is broken and the carbon in the ring fragments is incorporated into new plant fiber, as part of the natural lignification process. Half lives for TNT degradation approach 1 hr or less under ideal laboratory conditions. Continuous-flow pilot studies indicate that scale up residence times in created wetlands may be two to three times longer than in laboratory batch studies. The use of created wetlands and land farming techniques guided by rigorous field biochemistry and ecology promises to be a vital part of a newly evolving field, ecological engineering.
Rohwer, Anke; Schoonees, Anel; Young, Taryn
2014-11-02
This paper describes the process, our experience and the lessons learnt in doing document reviews of health science curricula. Since we could not find relevant literature to guide us on how to approach these reviews, we feel that sharing our experience would benefit researchers embarking on similar projects. We followed a rigorous, transparent, pre-specified approach that included the preparation of a protocol, a pre-piloted data extraction form and coding schedule. Data were extracted, analysed and synthesised. Quality checks were included at all stages of the process. The main lessons we learnt related to time and project management, continuous quality assurance, selecting the software that meets the needs of the project, involving experts as needed and disseminating the findings to relevant stakeholders. A complete curriculum evaluation comprises, apart from a document review, interviews with students and lecturers to assess the learnt and taught curricula respectively. Rigorous methods must be used to ensure an objective assessment.
Shear-induced opening of the coronal magnetic field
NASA Technical Reports Server (NTRS)
Wolfson, Richard
1995-01-01
This work describes the evolution of a model solar corona in response to motions of the footpoints of its magnetic field. The mathematics involved is semianalytic, with the only numerical solution being that of an ordinary differential equation. This approach, while lacking the flexibility and physical details of full MHD simulations, allows for very rapid computation along with complete and rigorous exploration of the model's implications. We find that the model coronal field bulges upward, at first slowly and then more dramatically, in response to footpoint displacements. The energy in the field rises monotonically from that of the initial potential state, and the field configuration and energy approach asymptotically those of a fully open field. Concurrently, electric currents develop and concentrate into a current sheet as the limiting case of the open field is approached. Examination of the equations shows rigorously that in the asymptotic limit of the fully open field, the current layer becomes a true ideal MHD singularity.
Including Magnetostriction in Micromagnetic Models
NASA Astrophysics Data System (ADS)
Conbhuí, Pádraig Ó.; Williams, Wyn; Fabian, Karl; Nagy, Lesleis
2016-04-01
The magnetic anomalies that identify crustal spreading are predominantly recorded by basalts formed at the mid-ocean ridges, whose magnetic signals are dominated by iron-titanium-oxides (Fe3-xTixO4), so called "titanomagnetites", of which the Fe2.4Ti0.6O4 (TM60) phase is the most common. With sufficient quantities of titanium present, these minerals exhibit strong magnetostriction. To date, models of these grains in the pseudo-single domain (PSD) range have failed to accurately account for this effect. In particular, a popular analytic treatment provided by Kittel (1949) for describing the magnetostrictive energy as an effective increase of the anisotropy constant can produce unphysical strains for non-uniform magnetizations. I will present a rigorous approach based on work by Brown (1966) and by Kroner (1958) for including magnetostriction in micromagnetic codes which is suitable for modelling hysteresis loops and finding remanent states in the PSD regime. Preliminary results suggest the more rigorously defined micromagnetic models exhibit higher coercivities and extended single domain ranges when compared to more simplistic approaches.
Li, Jia; Lam, Edmund Y
2014-04-21
Mask topography effects need to be taken into consideration for a more accurate solution of source mask optimization (SMO) in advanced optical lithography. However, rigorous 3D mask models generally involve intensive computation and conventional SMO fails to manipulate the mask-induced undesired phase errors that degrade the usable depth of focus (uDOF) and process yield. In this work, an optimization approach incorporating pupil wavefront aberrations into SMO procedure is developed as an alternative to maximize the uDOF. We first design the pupil wavefront function by adding primary and secondary spherical aberrations through the coefficients of the Zernike polynomials, and then apply the conjugate gradient method to achieve an optimal source-mask pair under the condition of aberrated pupil. We also use a statistical model to determine the Zernike coefficients for the phase control and adjustment. Rigorous simulations of thick masks show that this approach provides compensation for mask topography effects by improving the pattern fidelity and increasing uDOF.
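To make the pupil-wavefront idea concrete, the sketch below adds primary and secondary spherical aberration to a scalar pupil function through the corresponding Zernike radial polynomials. The coefficients, grid, and aperture are placeholders, and the SMO/conjugate-gradient machinery of the paper is not reproduced.

```python
# Minimal sketch: circular pupil with primary (n=4, m=0) and secondary (n=6, m=0)
# spherical aberration added as Zernike phase terms. Coefficients are hypothetical.
import numpy as np

N = 256
y, x = np.mgrid[-1:1:1j * N, -1:1:1j * N]
rho = np.hypot(x, y)
aperture = (rho <= 1.0).astype(float)

def zernike_spherical_primary(r):      # radial polynomial R_4^0(rho)
    return 6 * r**4 - 6 * r**2 + 1

def zernike_spherical_secondary(r):    # radial polynomial R_6^0(rho)
    return 20 * r**6 - 30 * r**4 + 12 * r**2 - 1

c_primary, c_secondary = 0.05, -0.02   # hypothetical coefficients (waves)
phase = 2 * np.pi * (c_primary * zernike_spherical_primary(rho)
                     + c_secondary * zernike_spherical_secondary(rho))
pupil = aperture * np.exp(1j * phase)  # aberrated pupil wavefront
```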
Designing High-Efficiency Thin Silicon Solar Cells Using Parabolic-Pore Photonic Crystals
NASA Astrophysics Data System (ADS)
Bhattacharya, Sayak; John, Sajeev
2018-04-01
We demonstrate the efficacy of wave-interference-based light trapping and carrier transport in parabolic-pore photonic-crystal, thin-crystalline silicon (c-Si) solar cells to achieve above 29% power conversion efficiencies. Using a rigorous solution of Maxwell's equations through a standard finite-difference time domain scheme, we optimize the design of the vertical-parabolic-pore photonic crystal (PhC) on a 10-μm-thick c-Si solar cell to obtain a maximum achievable photocurrent density (MAPD) of 40.6 mA/cm2 beyond the ray-optical, Lambertian light-trapping limit. For a slanted-parabolic-pore PhC that breaks x-y symmetry, improved light trapping occurs due to better coupling into parallel-to-interface refraction modes. We achieve the optimum MAPD of 41.6 mA/cm2 for a tilt angle of 10° with respect to the vertical axis of the pores. This MAPD is further improved to 41.72 mA/cm2 by introducing a 75-nm SiO2 antireflective coating on top of the solar cell. We use this MAPD and the associated charge-carrier generation profile as input for a numerical solution of Poisson's equation coupled with semiconductor drift-diffusion equations using a Shockley-Read-Hall and Auger recombination model. Using experimentally achieved surface recombination velocities of 10 cm/s, we identify semiconductor doping profiles that yield power conversion efficiencies over 29%. Practical considerations of additional upper-contact losses suggest efficiencies close to 28%. This improvement beyond the current world record is largely due to an open-circuit voltage approaching 0.8 V enabled by reduced bulk recombination in our thin silicon architecture while maintaining a high short-circuit current through wave-interference-based light trapping.
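Schematically, a maximum achievable photocurrent density follows from the simulated absorption spectrum weighted by the incident photon flux, J = q ∫ A(λ) Φ(λ) dλ. The sketch below shows that bookkeeping only; the absorption curve and the solar spectrum are flat placeholders, not the paper's FDTD results or the tabulated AM1.5G data.

```python
# Schematic MAPD computation: J = q * integral of A(lambda) * photon_flux(lambda) dlambda.
import numpy as np

q = 1.602176634e-19          # elementary charge, C
h = 6.62607015e-34           # Planck constant, J s
c = 2.99792458e8             # speed of light, m/s

wl_nm = np.linspace(300, 1100, 801)                       # wavelength grid, nm
irradiance = np.full_like(wl_nm, 1.5)                     # placeholder, W m^-2 nm^-1
absorption = np.clip(1.0 - (wl_nm - 300) / 900, 0, 1)     # placeholder A(lambda)

photon_flux = irradiance * (wl_nm * 1e-9) / (h * c)       # photons m^-2 s^-1 nm^-1
J = q * np.trapz(absorption * photon_flux, wl_nm)         # A/m^2
print(J * 0.1, "mA/cm^2")                                 # 1 A/m^2 = 0.1 mA/cm^2
```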
Multifunctional ZnO Nanomaterials for Efficient Energy Conversion and Sensing
2015-09-01
plasmonic response in the nanostructure in a rigorous manner in all three dimensions. We examine a silver nanoparticle with an ellipsoid-like... and reproducible nanomaterials growth/synthesis with control of nanostructure size, shape, and functionality, in uniform functionalization with both
Evaluation of automatic video summarization systems
NASA Astrophysics Data System (ADS)
Taskiran, Cuneyt M.
2006-01-01
Compact representations of video, or video summaries, greatly enhance efficient video browsing. However, rigorous evaluation of video summaries generated by automatic summarization systems is a complicated process. In this paper we examine the summary evaluation problem. Text summarization is the oldest and most successful summarization domain. We show some parallels between these two domains and introduce methods and terminology. Finally, we present results for a comprehensive summary evaluation that we have performed.
Developing a Decision Support System for Flood Response: NIMS/ICS Fundamentals
NASA Astrophysics Data System (ADS)
Gutenson, J. L.; Zhang, X.; Ernest, A. N. S.; Oubeidillah, A.; Zhu, L.
2015-12-01
Effective response to regional disasters such as floods requires a multipronged, non-linear approach to reduce loss of life, damage to property, and harm to the environment. These coordinated response actions are typically undertaken by multiple jurisdictions, levels of government, functional agencies and other responsible entities. A successful response is highly dependent on the effectiveness and efficiency of each coordinated response action undertaken across a broad spectrum of organizations and activities. In order to provide a unified framework for those responding to incidents or planned events, FEMA provides a common and flexible approach for managing incidents, regardless of cause, size, location or complexity, referred to as the National Incident Management System (NIMS). Integral to NIMS is the Incident Command System (ICS), which establishes a common, pre-defined organizational structure to ensure coordination and management of procedures, resources and communications, for efficient incident management. While being both efficient and rigorous, NIMS, and ICS to a lesser extent, is an inherently complex framework that requires a significant amount of training for planners, responders and managers to master, especially considering the wide array of incident types that Local Emergency Planning Committees (LEPCs) must be prepared to respond to. The existing Water-Wizard Decision Support System (DSS), developed to support water distribution system recovery operations for Decontamination (Decon), Operational Optimization (WDS), and Economic Consequence Assessment (Econ), is being evolved to integrate incident response functions. Water-Wizard runs on both mobile and desktop devices, and is being extended to utilize smartphone- and mobile-device-specific data streams (e.g., GPS location) to augment its fact-base in real-time for situation-aware DSS recommendations. In addition, the structured NIMS and ICS frameworks for incident management and response are being incorporated into the Water-Wizard knowledgebase, with a mid-term goal of integrating flood-specific emergency response domain knowledge to provide real-time flood-responder decision support.
Darius, H T; Drescher, O; Ponton, D; Pawlowiez, R; Laurent, D; Dewailly, E; Chinain, M
2013-01-01
Ciguatera fish poisoning is a seafood intoxication commonly afflicting island communities in the Pacific. These populations, which are strongly dependent on fish resources, have developed over centuries various strategies to decrease the risk of intoxication, including the use of folk tests to detect ciguateric fish. This study aims to evaluate the effectiveness of two folk tests commonly used in Raivavae Island (Australes, French Polynesia): the rigor mortis test (RMT) and the bleeding test (BT). A total of 107 fish were collected in Raivavae Lagoon, among which 80 were tested by five testers using the RMT versus 107 tested by four testers using BT. First, performance was compared between testers. Second, the efficiency of these tests was compared with toxicity data obtained via the receptor binding assay (RBA) by assessing various parameter values such as sensitivity (Se), specificity (Sp), positive predictive value (PPV) and negative predictive value (NPV). Comparisons of outcomes between the folk tests and the RBA analyses considered the tests used separately and combined by each tester in parallel versus in series. The overall efficiency of the RMT and BT tests was also evaluated when the judgments of all testers were "pooled". The results demonstrate that efficiencies varied between testers, with one showing the best scores in detecting toxic fish: 55% with RMT and 69.2% with BT. BT gave better results than RMT in detecting toxic fish, and also gave better agreement between testers. If high NPV and Se values were to be privileged, the data also suggest that the best way to limit cases of intoxication would be to use RMT and BT tests in a parallel approach. The use of traditional knowledge and a good knowledge of risky versus healthy fishing areas may help reduce the risk of intoxication among communities where ciguatera fish poisoning is highly prevalent.
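For readers less familiar with the screening-test vocabulary used above, the snippet below defines Se, Sp, PPV and NPV from a 2×2 table and shows the usual parallel/series combination rules. The counts are invented, and the combination formulas assume the two tests are conditionally independent, which is an extra assumption rather than something established by the study.

```python
# Screening-test quantities with invented counts; parallel = positive if either test
# is positive, series = positive only if both are. Independence is assumed below.
def metrics(tp, fp, fn, tn):
    se = tp / (tp + fn)            # sensitivity
    sp = tn / (tn + fp)            # specificity
    ppv = tp / (tp + fp)           # positive predictive value
    npv = tn / (tn + fn)           # negative predictive value
    return se, sp, ppv, npv

se_rmt, sp_rmt, *_ = metrics(tp=22, fp=18, fn=18, tn=22)   # hypothetical RMT counts
se_bt,  sp_bt,  *_ = metrics(tp=28, fp=15, fn=12, tn=25)   # hypothetical BT counts

se_parallel = 1 - (1 - se_rmt) * (1 - se_bt)   # parallel use raises Se, lowers Sp
sp_parallel = sp_rmt * sp_bt
se_series   = se_rmt * se_bt                   # series use raises Sp, lowers Se
sp_series   = 1 - (1 - sp_rmt) * (1 - sp_bt)
```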
Evolution of US DOE Performance Assessments Over 20 Years - 13597
DOE Office of Scientific and Technical Information (OSTI.GOV)
Suttora, Linda C.; Seitz, Roger R.
2013-07-01
Performance assessments (PAs) have been used for many years for the analysis of post-closure hazards associated with a radioactive waste disposal facility and to provide a reasonable expectation of the ability of the site and facility design to meet objectives for the protection of members of the public and the environment. The use of PA to support decision-making for LLW disposal facilities has been mandated in United States Department of Energy (US DOE) directives governing radioactive waste management since 1988 (currently DOE Order 435.1, Radioactive Waste Management). Prior to that time, PAs were also used in a less formal role. Over the past 20+ years, the US DOE approach to conduct, review and apply PAs has evolved into an efficient, rigorous and mature process that includes specific requirements for continuous improvement and independent reviews. The PA process has evolved through refinement of a graded and iterative approach designed to help focus efforts on those aspects of the problem expected to have the greatest influence on the decision being made. Many of the evolutionary changes to the PA process are linked to the refinement of the PA maintenance concept that has proven to be an important element of US DOE PA requirements in the context of supporting decision-making for safe disposal of LLW. The PA maintenance concept is central to the evolution of the graded and iterative philosophy and has helped to drive the evolution of PAs from a deterministic compliance calculation into a systematic approach that helps to focus on critical aspects of the disposal system in a manner designed to provide a more informed basis for decision-making throughout the life of a disposal facility (e.g., monitoring, research and testing, waste acceptance criteria, design improvements, data collection, model refinements). A significant evolution in PA modeling has been associated with improved use of uncertainty and sensitivity analysis techniques to support efficient implementation of the graded and iterative approach. Rather than attempt to exactly predict the migration of radionuclides in a disposal unit, the best PAs have evolved into tools that provide a range of results to guide decision-makers in planning the most efficient, cost effective, and safe disposal of radionuclides. (authors)
A six-legged rover for planetary exploration
NASA Technical Reports Server (NTRS)
Simmons, Reid; Krotkov, Eric; Bares, John
1991-01-01
To survive the rigors and isolation of planetary exploration, an autonomous rover must be competent, reliable, and efficient. This paper presents the Ambler, a six-legged robot featuring orthogonal legs and a novel circulating gait, which has been designed for traversal of rugged, unknown environments. An autonomous software system that integrates perception, planning, and real-time control has been developed to walk the Ambler through obstacle strewn terrain. The paper describes the information and control flow of the walking system, and how the design of the mechanism and software combine to achieve competent walking, reliable behavior in the face of unexpected failures, and efficient utilization of time and power.
Automatic Target Recognition Based on Cross-Plot
Wong, Kelvin Kian Loong; Abbott, Derek
2011-01-01
Automatic target recognition that relies on rapid feature extraction of real-time target from photo-realistic imaging will enable efficient identification of target patterns. To achieve this objective, Cross-plots of binary patterns are explored as potential signatures for the observed target by high-speed capture of the crucial spatial features using minimal computational resources. Target recognition was implemented based on the proposed pattern recognition concept and tested rigorously for its precision and recall performance. We conclude that Cross-plotting is able to produce a digital fingerprint of a target that correlates efficiently and effectively to signatures of patterns having its identity in a target repository. PMID:21980508
Structural reliability analysis under evidence theory using the active learning kriging model
NASA Astrophysics Data System (ADS)
Yang, Xufeng; Liu, Yongshou; Ma, Panke
2017-11-01
Structural reliability analysis under evidence theory is investigated. It is rigorously proved that a surrogate model providing only correct sign prediction of the performance function can meet the accuracy requirement of evidence-theory-based reliability analysis. Accordingly, a method based on the active learning kriging model which only correctly predicts the sign of the performance function is proposed. Interval Monte Carlo simulation and a modified optimization method based on Karush-Kuhn-Tucker conditions are introduced to make the method more efficient in estimating the bounds of failure probability based on the kriging model. Four examples are investigated to demonstrate the efficiency and accuracy of the proposed method.
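A compact sketch of the sign-prediction idea is given below: a Gaussian-process (kriging) surrogate is refined only where the sign of the performance function is most uncertain, using the common U learning function. The toy performance function, sampling scheme, and stopping rule are illustrative assumptions; the interval Monte Carlo and KKT-based optimization steps of the paper are not reproduced.

```python
# Active learning with a kriging (Gaussian process) surrogate whose only job is to
# predict the sign of a performance function g(x) (failure when g < 0).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def g(x):                                   # toy performance function (placeholder)
    return 3.0 - x[:, 0]**2 - x[:, 1]

rng = np.random.default_rng(1)
X = rng.uniform(-2, 2, size=(12, 2))        # small initial design
candidates = rng.uniform(-2, 2, size=(2000, 2))

for _ in range(30):
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
    gp.fit(X, g(X))
    mu, sigma = gp.predict(candidates, return_std=True)
    U = np.abs(mu) / np.maximum(sigma, 1e-12)   # small U = sign most uncertain
    if U.min() > 2.0:                           # common stopping criterion
        break
    X = np.vstack([X, candidates[np.argmin(U)]])

failure_prob = np.mean(gp.predict(candidates) < 0)   # crude plug-in estimate
```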
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shah, Nihar; Abhyankar, Nikit; Park, Won Young
Improving efficiency of air conditioners (ACs) typically involves improving the efficiency of various components such as compressors, heat exchangers, expansion valves, refrigerant and fans. We estimate the incremental cost of improving the efficiency of room ACs based on the cost of improving the efficiency of their key components. Further, we estimate the retail price increase required to cover the cost of efficiency improvement, compare it with electricity bill savings, and calculate the payback period for consumers to recover the additional price of a more efficient AC. We assess several efficiency levels, two of which are summarized below in the report. The finding that significant efficiency improvement is cost effective from a consumer perspective is robust over a wide range of assumptions. If we assume a 50% higher incremental price compared to our baseline estimate, the payback period for the efficiency level of 3.5 ISEER is 1.1 years. Given the findings of this study, establishing more stringent minimum efficiency performance criteria (one star level) should be evaluated rigorously considering significant benefits to consumers, energy security and environment.
The Evolution of a More Rigorous Approach to Benefit Transfer: Benefit Function Transfer
NASA Astrophysics Data System (ADS)
Loomis, John B.
1992-03-01
The desire for economic values of recreation for unstudied recreation resources dates back to the water resource development benefit-cost analyses of the early 1960s. Rather than simply applying existing estimates of benefits per trip to the study site, a fairly rigorous approach was developed by a number of economists. This approach involves application of travel cost demand equations and contingent valuation benefit functions from existing sites to the new site. In this way the spatial market of the new site (i.e., its differing own price, substitute prices and population distribution) is accounted for in the new estimate of total recreation benefits. The assumption of benefit transfer from recreation sites in one state to another state for the same recreation activity is empirically tested. The equality of demand coefficients for ocean sport salmon fishing in Oregon versus Washington and for freshwater steelhead fishing in Oregon versus Idaho is rejected. Thus transfer of either demand equations or average benefits per trip is likely to be in error. Using the Oregon steelhead equation, benefit transfers to rivers within the state are shown to be accurate to within 5-15%.
Adult asthma disease management: an analysis of studies, approaches, outcomes, and methods.
Maciejewski, Matthew L; Chen, Shih-Yin; Au, David H
2009-07-01
Disease management has been implemented for patients with asthma in various ways. We describe the approaches to and components of adult asthma disease-management interventions, examine the outcomes evaluated, and assess the quality of published studies. We searched the MEDLINE, EMBASE, CINAHL, PsychInfo, and Cochrane databases for studies published in 1986 through 2008, on adult asthma management. With the studies that met our inclusion criteria, we examined the clinical, process, medication, economic, and patient-reported outcomes reported, and the study designs, provider collaboration during the studies, and statistical methods. Twenty-nine articles describing 27 studies satisfied our inclusion criteria. There was great variation in the content, extent of collaboration between physician and non-physician providers responsible for intervention delivery, and outcomes examined across the 27 studies. Because of limitations in the design of 22 of the 27 studies, the differences in outcomes assessed, and the lack of rigorous statistical adjustment, we could not draw definitive conclusions about the effectiveness or cost-effectiveness of the asthma disease-management programs or which approach was most effective. Few well-designed studies with rigorous evaluations have been conducted to evaluate disease-management interventions for adults with asthma. Current evidence is insufficient to recommend any particular intervention.
Aerial photography flight quality assessment with GPS/INS and DEM data
NASA Astrophysics Data System (ADS)
Zhao, Haitao; Zhang, Bing; Shang, Jiali; Liu, Jiangui; Li, Dong; Chen, Yanyan; Zuo, Zhengli; Chen, Zhengchao
2018-01-01
The flight altitude, ground coverage, photo overlap, and other acquisition specifications of an aerial photography flight mission directly affect the quality and accuracy of the subsequent mapping tasks. To ensure smooth post-flight data processing and fulfill the pre-defined mapping accuracy, flight quality assessments should be carried out in time. This paper presents a novel and rigorous approach for flight quality evaluation of frame cameras with GPS/INS data and DEM, using geometric calculation rather than image analysis as in the conventional methods. This new approach is based mainly on the collinearity equations, in which the accuracy of a set of flight quality indicators is derived through a rigorous error propagation model and validated with scenario data. Theoretical analysis and practical flight test of an aerial photography mission using an UltraCamXp camera showed that the calculated photo overlap is accurate enough for flight quality assessment of 5 cm ground sample distance image, using the SRTMGL3 DEM and the POSAV510 GPS/INS data. An even better overlap accuracy could be achieved for coarser-resolution aerial photography. With this new approach, the flight quality evaluation can be conducted on site right after landing, providing accurate and timely information for decision making.
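For reference, the collinearity equations underlying such a geometric calculation can be written, in one common photogrammetric convention (symbols here are generic, not necessarily the paper's notation), as shown below.

```latex
x - x_0 = -f\,
  \frac{r_{11}(X - X_S) + r_{12}(Y - Y_S) + r_{13}(Z - Z_S)}
       {r_{31}(X - X_S) + r_{32}(Y - Y_S) + r_{33}(Z - Z_S)},
\qquad
y - y_0 = -f\,
  \frac{r_{21}(X - X_S) + r_{22}(Y - Y_S) + r_{23}(Z - Z_S)}
       {r_{31}(X - X_S) + r_{32}(Y - Y_S) + r_{33}(Z - Z_S)}
```

Here (x, y) are image coordinates, (x_0, y_0, f) the interior orientation, (X_S, Y_S, Z_S) the projection centre from GPS, r_ij the elements of the rotation matrix built from the INS attitude angles, and (X, Y, Z) a ground point taken from the DEM.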
ERIC Educational Resources Information Center
Celeste, Eric
2016-01-01
Communities of practice have become important tools for districts striving to improve teacher quality in a way that improves student outcomes, but scaling the benefits of these communities requires a more rigorous, intentional approach. That's why Learning Forward, with support from the Bill & Melinda Gates Foundation, created the Redesign PD…
Towards a Unified Theory of Engineering Education
ERIC Educational Resources Information Center
Salcedo Orozco, Oscar H.
2017-01-01
STEM education is an interdisciplinary approach to learning where rigorous academic concepts are coupled with real-world lessons and activities as students apply science, technology, engineering, and mathematics in contexts that make connections between school, community, work, and the global enterprise enabling STEM literacy (Tsupros, Kohler and…
DOT National Transportation Integrated Search
2015-12-01
MAP-21 and AASHTO's framework for transportation asset management (TAM) offer opportunities to use more rigorous approaches to collect and apply evidence within a TAM context. This report documents the results of a study funded by the Georgia D...
Diversity Management. Time for a New Approach.
ERIC Educational Resources Information Center
Ivancevich, John M.; Gilbert, Jacqueline A.
2000-01-01
A review of the history of diversity management resulted in a call for a new agenda that encourages more collaboration between scholars and administrators, increased researcher observation of workplace reactions to diversity management initiatives, more informative and rigorous case studies, and more third-party evaluations of diversity management…
Caution--Praise Can Be Dangerous.
ERIC Educational Resources Information Center
Dweck, Carol S.
1999-01-01
Reviews research into the effects of praise on students. Suggests an approach that gets students to focus on their potential to learn, to value challenge, and to concentrate on effort and learning processes in the face of obstacles. This can all be done while holding students to rigorous standards. (SLD)
Connected Learning Communities: A Toolkit for Reinventing High School.
ERIC Educational Resources Information Center
Almeida, Cheryl, Ed.; Steinberg, Adria, Ed.
This document presents tools and guidelines to help practitioners transform their high schools into institutions facilitating community-connected learning. The approach underpinning the tools and guidelines is based on the following principles: academic rigor and relevance; personalized learning; self-passage to adulthood; and productive learning…
Methods for assessing Phytophthora ramorum chlamydospore germination
Joyce Eberhart; Elilzabeth Stamm; Jennifer Parke
2013-01-01
Germination of chlamydospores is difficult to accurately assess when chlamydospores are attached to remnants of supporting hyphae. We developed two approaches for closely observing and rigorously quantifying the frequency of chlamydospore germination in vitro. The plate marking and scanning method was useful for quantifying germination of large...
A Point System Approach to Secondary Classroom Management
ERIC Educational Resources Information Center
Xenos, Anthony J.
2012-01-01
This article presents guiding principles governing the design, implementation, and management of a point system to promote discipline and academic rigor in a secondary classroom. Four considerations are discussed: (1) assigning appropriate point values to integral classroom behaviors and tasks; (2) determining the relationship among consequences,…
A Criterion-Referenced Approach to Student Ratings of Instruction
ERIC Educational Resources Information Center
Meyer, J. Patrick; Doromal, Justin B.; Wei, Xiaoxin; Zhu, Shi
2017-01-01
We developed a criterion-referenced student rating of instruction (SRI) to facilitate formative assessment of teaching. It involves four dimensions of teaching quality that are grounded in current instructional design principles: Organization and structure, Assessment and feedback, Personal interactions, and Academic rigor. Using item response…
ERIC Educational Resources Information Center
Powers, Laurie E.
2017-01-01
Action research approaches reflecting power sharing by academic and community researchers, full engagement of community partners across all study phases, and ongoing commitment to partnership and capacity building have been increasingly embraced, particularly in research affecting marginalized populations. Findings suggest action research…
Characterizing Surface Transport Barriers in the South China Sea
2015-09-30
to a coral reef system flow, rigorously identifying hyperbolic and elliptic flow structures. The FTLE approach was found to be... included in real world applications (Allshouse et al. 2015). [Figure 3 caption: The impact of windage on a hypothetical tracer release event of Ningaloo Reef.]
On the estimation of the worst-case implant-induced RF-heating in multi-channel MRI.
Córcoles, Juan; Zastrow, Earl; Kuster, Niels
2017-06-21
The increasing use of multiple radiofrequency (RF) transmit channels in magnetic resonance imaging (MRI) systems makes it necessary to rigorously assess the risk of RF-induced heating. This risk is especially aggravated with inclusions of medical implants within the body. The worst-case RF-heating scenario is achieved when the local tissue deposition in the at-risk region (generally in the vicinity of the implant electrodes) reaches its maximum value while MRI exposure is compliant with predefined general specific absorption rate (SAR) limits or power requirements. This work first reviews the common approach to estimate the worst-case RF-induced heating in multi-channel MRI environment, based on the maximization of the ratio of two Hermitian forms by solving a generalized eigenvalue problem. It is then shown that the common approach is not rigorous and may lead to an underestimation of the worst-case RF-heating scenario when there is a large number of RF transmit channels and there exist multiple SAR or power constraints to be satisfied. Finally, this work derives a rigorous SAR-based formulation to estimate a preferable worst-case scenario, which is solved by casting a semidefinite programming relaxation of this original non-convex problem, whose solution closely approximates the true worst-case including all SAR constraints. Numerical results for 2, 4, 8, 16, and 32 RF channels in a 3T-MRI volume coil for a patient with a deep-brain stimulator under a head imaging exposure are provided as illustrative examples.
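The first step reviewed above — maximizing the ratio of two Hermitian quadratic forms over the channel excitations — amounts to a generalized eigenvalue problem. The sketch below illustrates that single-constraint case with random placeholder matrices; the multi-constraint semidefinite-programming relaxation proposed in the paper is not shown.

```python
# Single-constraint worst case: maximize x^H A x / x^H B x over channel excitations x,
# where A models local deposition at the implant and B a positive-definite global SAR
# or power constraint. The maximum is the largest generalized eigenvalue of (A, B).
import numpy as np
from scipy.linalg import eigh

n_channels = 8
rng = np.random.default_rng(0)

def random_psd(n, jitter=0.0):
    M = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    return M @ M.conj().T + jitter * np.eye(n)

A = random_psd(n_channels)               # local quadratic form (placeholder)
B = random_psd(n_channels, jitter=1.0)   # constraint quadratic form, positive definite

w, V = eigh(A, B)                        # generalized eigenproblem A v = w B v
worst_ratio = w[-1]                      # worst-case ratio
worst_excitation = V[:, -1]              # corresponding channel weights
```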
A preliminary design for the GMT-Consortium Large Earth Finder (G-CLEF)
NASA Astrophysics Data System (ADS)
Szentgyorgyi, Andrew; Barnes, Stuart; Bean, Jacob; Bigelow, Bruce; Bouchez, Antonin; Chun, Moo-Young; Crane, Jeffrey D.; Epps, Harland; Evans, Ian; Evans, Janet; Frebel, Anna; Furesz, Gabor; Glenday, Alex; Guzman, Dani; Hare, Tyson; Jang, Bi-Ho; Jang, Jeong-Gyun; Jeong, Ueejong; Jordan, Andres; Kim, Kang-Min; Kim, Jihun; Li, Chih-Hao; Lopez-Morales, Mercedes; McCracken, Kenneth; McLeod, Brian; Mueller, Mark; Nah, Jakyung; Norton, Timothy; Oh, Heeyoung; Oh, Jae Sok; Ordway, Mark; Park, Byeong-Gon; Park, Chan; Park, Sung-Joon; Phillips, David; Plummer, David; Podgorski, William; Rodler, Florian; Seifahrt, Andreas; Tak, Kyung-Mo; Uomoto, Alan; Van Dam, Marcos A.; Walsworth, Ronald; Yu, Young Sam; Yuk, In-Soo
2014-08-01
The GMT-Consortium Large Earth Finder (G-CLEF) is an optical-band echelle spectrograph that has been selected as the first light instrument for the Giant Magellan Telescope (GMT). G-CLEF is a general-purpose, high dispersion spectrograph that is fiber fed and capable of extremely precise radial velocity measurements. The G-CLEF Concept Design (CoD) was selected in Spring 2013. Since then, G-CLEF has undergone science requirements and instrument requirements reviews and will be the subject of a preliminary design review (PDR) in March 2015. Since CoD review (CoDR), the overall G-CLEF design has evolved significantly as we have optimized the constituent designs of the major subsystems, i.e. the fiber system, the telescope interface, the calibration system and the spectrograph itself. These modifications have been made to enhance G-CLEF's capability to address frontier science problems, as well as to respond to the evolution of the GMT itself and developments in the technical landscape. G-CLEF has been designed by applying rigorous systems engineering methodology to flow Level 1 Scientific Objectives to Level 2 Observational Requirements and thence to Level 3 and Level 4. The rigorous systems approach applied to G-CLEF establishes a well defined science requirements framework for the engineering design. By adopting this formalism, we may flexibly update and analyze the capability of G-CLEF to respond to new scientific discoveries as we move toward first light. G-CLEF will exploit numerous technological advances and features of the GMT itself to deliver an efficient, high performance instrument, e.g. exploiting the adaptive optics secondary system to increase both throughput and radial velocity measurement precision.
Topological Isomorphisms of Human Brain and Financial Market Networks
Vértes, Petra E.; Nicol, Ruth M.; Chapman, Sandra C.; Watkins, Nicholas W.; Robertson, Duncan A.; Bullmore, Edward T.
2011-01-01
Although metaphorical and conceptual connections between the human brain and the financial markets have often been drawn, rigorous physical or mathematical underpinnings of this analogy remain largely unexplored. Here, we apply a statistical and graph theoretic approach to the study of two datasets – the time series of 90 stocks from the New York stock exchange over a 3-year period, and the fMRI-derived time series acquired from 90 brain regions over the course of a 10-min-long functional MRI scan of resting brain function in healthy volunteers. Despite the many obvious substantive differences between these two datasets, graphical analysis demonstrated striking commonalities in terms of global network topological properties. Both the human brain and the market networks were non-random, small-world, modular, hierarchical systems with fat-tailed degree distributions indicating the presence of highly connected hubs. These properties could not be trivially explained by the univariate time series statistics of stock price returns. This degree of topological isomorphism suggests that brains and markets can be regarded broadly as members of the same family of networks. The two systems, however, were not topologically identical. The financial market was more efficient and more modular – more highly optimized for information processing – than the brain networks; but also less robust to systemic disintegration as a result of hub deletion. We conclude that the conceptual connections between brains and markets are not merely metaphorical; rather these two information processing systems can be rigorously compared in the same mathematical language and turn out often to share important topological properties in common to some degree. There will be interesting scientific arbitrage opportunities in further work at the graph-theoretically mediated interface between systems neuroscience and the statistical physics of financial markets. PMID:22007161
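The kind of topological comparison described — thresholding a correlation matrix of time series into a graph and summarizing its clustering, path length, modularity, and degree distribution — can be sketched as follows. The time series and threshold are random placeholders, not the stock or fMRI data of the study.

```python
# Build a graph from a thresholded correlation matrix and compute summary metrics.
import numpy as np
import networkx as nx
from networkx.algorithms import community

rng = np.random.default_rng(0)
ts = rng.normal(size=(90, 300))               # 90 nodes x 300 time points (placeholder)
corr = np.corrcoef(ts)
np.fill_diagonal(corr, 0.0)

threshold = np.quantile(np.abs(corr), 0.9)    # keep the strongest 10% of edges
G = nx.from_numpy_array((np.abs(corr) >= threshold).astype(int))

clustering = nx.average_clustering(G)
giant = G.subgraph(max(nx.connected_components(G), key=len))
path_length = nx.average_shortest_path_length(giant)
parts = community.greedy_modularity_communities(G)
modularity = community.modularity(G, parts)
degrees = [d for _, d in G.degree()]          # inspect for a fat-tailed distribution
```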
Spectral edge: gradient-preserving spectral mapping for image fusion.
Connah, David; Drew, Mark S; Finlayson, Graham D
2015-12-01
This paper describes a novel approach to image fusion for color display. Our goal is to generate an output image whose gradient matches that of the input as closely as possible. We achieve this using a constrained contrast mapping paradigm in the gradient domain, where the structure tensor of a high-dimensional gradient representation is mapped exactly to that of a low-dimensional gradient field which is then reintegrated to form an output. Constraints on output colors are provided by an initial RGB rendering. Initially, we motivate our solution with a simple "ansatz" (educated guess) for projecting higher-D contrast onto color gradients, which we expand to a more rigorous theorem to incorporate color constraints. The solution to these constrained optimizations is closed-form, allowing for simple and hence fast and efficient algorithms. The approach can map any N-D image data to any M-D output and can be used in a variety of applications using the same basic algorithm. In this paper, we focus on the problem of mapping N-D inputs to 3D color outputs. We present results in five applications: hyperspectral remote sensing, fusion of color and near-infrared or clear-filter images, multilighting imaging, dark flash, and color visualization of magnetic resonance imaging diffusion-tensor imaging.
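The central object in such gradient-domain fusion is the structure tensor of the high-dimensional gradient, summed over channels; its per-pixel eigen-decomposition summarizes local contrast. The sketch below computes only that quantity on a random placeholder image; the contrast mapping, reintegration, and color-constraint steps of the paper are omitted.

```python
# Structure tensor of an N-channel image: S(p) = sum_k grad I_k(p) grad I_k(p)^T.
import numpy as np

def structure_tensor(image):              # image: (H, W, N) multichannel array
    H, W, N = image.shape
    S = np.zeros((H, W, 2, 2))
    for k in range(N):
        gy, gx = np.gradient(image[..., k])
        S[..., 0, 0] += gx * gx
        S[..., 0, 1] += gx * gy
        S[..., 1, 0] += gx * gy
        S[..., 1, 1] += gy * gy
    return S

img = np.random.default_rng(0).normal(size=(64, 64, 5))   # placeholder 5-channel image
S = structure_tensor(img)
eigvals = np.linalg.eigvalsh(S)           # per-pixel principal contrasts
```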
MRF energy minimization and beyond via dual decomposition.
Komodakis, Nikos; Paragios, Nikos; Tziritas, Georgios
2011-03-01
This paper introduces a new rigorous theoretical framework to address discrete MRF-based optimization in computer vision. Such a framework exploits the powerful technique of Dual Decomposition. It is based on a projected subgradient scheme that attempts to solve an MRF optimization problem by first decomposing it into a set of appropriately chosen subproblems, and then combining their solutions in a principled way. In order to determine the limits of this method, we analyze the conditions that these subproblems have to satisfy and demonstrate the extreme generality and flexibility of such an approach. We thus show that by appropriately choosing what subproblems to use, one can design novel and very powerful MRF optimization algorithms. For instance, in this manner we are able to derive algorithms that: 1) generalize and extend state-of-the-art message-passing methods, 2) optimize very tight LP-relaxations to MRF optimization, and 3) take full advantage of the special structure that may exist in particular MRFs, allowing the use of efficient inference techniques such as, e.g., graph-cut-based methods. Theoretical analysis on the bounds related with the different algorithms derived from our framework and experimental results/comparisons using synthetic and real data for a variety of tasks in computer vision demonstrate the extreme potentials of our approach.
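As a deliberately tiny, non-vision illustration of the dual-decomposition idea (not an MRF and not the paper's algorithms), the snippet below splits a one-dimensional objective into two subproblems that each receive a private copy of the shared variable, and drives the copies to agreement with a (sub)gradient update on the dual variable.

```python
# Dual decomposition toy: minimize (x-1)^2 + (x+1)^2 by giving each term its own copy
# x1, x2 and coupling them through a dual price lam; the optimum is x = 0, lam = 2.
def solve_sub1(lam):        # argmin_x (x - 1)^2 + lam * x
    return 1.0 - lam / 2.0

def solve_sub2(lam):        # argmin_x (x + 1)^2 - lam * x
    return -1.0 + lam / 2.0

lam, step = 0.0, 0.5
for _ in range(50):
    x1, x2 = solve_sub1(lam), solve_sub2(lam)
    lam += step * (x1 - x2)         # subgradient ascent on the dual function
print(x1, x2, lam)                  # copies agree near 0; lam converges to 2
```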
Nanosystem self-assembly pathways discovered via all-atom multiscale analysis.
Pankavich, Stephen D; Ortoleva, Peter J
2012-07-26
We consider the self-assembly of composite structures from a group of nanocomponents, each consisting of particles within an N-atom system. Self-assembly pathways and rates for nanocomposites are derived via a multiscale analysis of the classical Liouville equation. From a reduced statistical framework, rigorous stochastic equations for population levels of beginning, intermediate, and final aggregates are also derived. It is shown that the definition of an assembly type is a self-consistency criterion that must strike a balance between precision and the need for population levels to be slowly varying relative to the time scale of atomic motion. The deductive multiscale approach is complemented by a qualitative notion of multicomponent association and the ensemble of exact atomic-level configurations consistent with them. In processes such as viral self-assembly from proteins and RNA or DNA, there are many possible intermediates, so that it is usually difficult to predict the most efficient assembly pathway. However, in the current study, rates of assembly of each possible intermediate can be predicted. This avoids the need, as in a phenomenological approach, for recalibration with each new application. The method accounts for the feedback across scales in space and time that is fundamental to nanosystem self-assembly. The theory has applications to bionanostructures, geomaterials, engineered composites, and nanocapsule therapeutic delivery systems.
Surface-plasmon polariton scattering from a finite array of nanogrooves/ridges: Efficient mirrors
NASA Astrophysics Data System (ADS)
Sánchez-Gil, José A.; Maradudin, Alexei A.
2005-06-01
The scattering of surface-plasmon polaritons (SPP) by finite arrays of one-dimensional nanodefects on metal surfaces is theoretically investigated on the basis of the reduced Rayleigh equation. Numerical calculations are carried out that rigorously account for all the scattering channels: SPP reflection and transmission, and radiative leakage. We analyze the range of parameters (defect size and number) for which high SPP reflection efficiency (low radiative losses) is achieved within a SPP band gap (negligible SPP transmission), neglecting ohmic losses (justified for array lengths significantly shorter than the SPP inelastic length): Smaller defects perform better as SPP mirrors (e.g., efficiency >90% at λ ≈ 650 nm for Gaussian ridges/grooves with sub-30 nm height and half-width) than larger defects, since the latter yield significant radiative losses.
Dual-function beam splitter of a subwavelength fused-silica grating.
Feng, Jijun; Zhou, Changhe; Zheng, Jiangjun; Cao, Hongchao; Lv, Peng
2009-05-10
We present the design and fabrication of a novel dual-function subwavelength fused-silica grating that can be used as a polarization-selective beam splitter. For TM polarization, the grating can be used as a two-port beam splitter at a wavelength of 1550 nm with a total diffraction efficiency of 98%. For TE polarization, the grating can function as a high-efficiency grating, and the diffraction efficiency of the -1st order is 95% under Littrow mounting. This dual-function grating design is based on a simplified modal method. By using the rigorous coupled-wave analysis, the optimum grating parameters can be determined. Holographic recording technology and inductively coupled plasma etching are used to manufacture the fused-silica grating. Experimental results are in agreement with the theoretical values.
Entanglement negativity bounds for fermionic Gaussian states
NASA Astrophysics Data System (ADS)
Eisert, Jens; Eisler, Viktor; Zimborás, Zoltán
2018-04-01
The entanglement negativity is a versatile measure of entanglement that has numerous applications in quantum information and in condensed matter theory. It can not only efficiently be computed in the Hilbert space dimension, but for noninteracting bosonic systems, one can compute the negativity efficiently in the number of modes. However, such an efficient computation does not carry over to the fermionic realm, the ultimate reason for this being that the partial transpose of a fermionic Gaussian state is no longer Gaussian. To provide a remedy for this state of affairs, in this work, we introduce efficiently computable and rigorous upper and lower bounds to the negativity, making use of techniques of semidefinite programming, building upon the Lagrangian formulation of fermionic linear optics, and exploiting suitable products of Gaussian operators. We discuss examples in quantum many-body theory and hint at applications in the study of topological properties at finite temperature.
ERIC Educational Resources Information Center
Coalition for Evidence-Based Policy, 2014
2014-01-01
An important recent development in evidence-based policy is the federal government's use of a "tiered evidence" approach to allocating funding in grant programs such as the U.S. Department of Education's Investing in Innovation Fund (i3). A central feature of this approach is that the largest grants are awarded to fund large-scale…
Alternative approaches to research in physical therapy: positivism and phenomenology.
Shepard, K F; Jensen, G M; Schmoll, B J; Hack, L M; Gwyer, J
1993-02-01
This article presents philosophical approaches to research in physical therapy. A comparison is made to demonstrate how the research purpose, research design, research methods, and research data differ when one approaches research from the philosophical perspective of positivism (predominantly quantitative) as compared with the philosophical perspective of phenomenology (predominantly qualitative). Differences between the two approaches are highlighted by examples from research articles published in Physical Therapy. The authors urge physical therapy researchers to become familiar with the tenets, rigor, and knowledge gained from the use of both approaches in order to increase their options in conducting research relevant to the practice of physical therapy.
A feminist response to Weitzer.
Dines, Gail
2012-04-01
In his review of my book Pornland: How Porn has Hijacked our Sexuality, Ronald Weitzer claims that anti-porn feminists are incapable of objective, rigorous research because they operate within the "oppression paradigm," which he defines as "a perspective that depicts all types of sex work as exploitive, violent, and perpetuating gender inequality." (VAW, 2011, 666). This article argues that while anti-porn feminists do indeed see pornography as exploitive, such a position is rooted in the rigorous theories and methods of cultural studies developed by critical media scholars such as Stuart Hall and Antonio Gramsci. Pornland applies a cultural studies approach by exploring how porn images are part of a wider system of sexist representations that legitimize and normalize the economic, political and legal oppression of women.
Development of rigor mortis is not affected by muscle volume.
Kobayashi, M; Ikegaya, H; Takase, I; Hatanaka, K; Sakurada, K; Iwase, H
2001-04-01
There is a hypothesis suggesting that rigor mortis progresses more rapidly in small muscles than in large muscles. We measured rigor mortis as tension determined isometrically in rat musculus erector spinae that had been cut into muscle bundles of various volumes. The muscle volume did not influence either the progress or the resolution of rigor mortis, which contradicts the hypothesis. Differences in pre-rigor load on the muscles influenced the onset and resolution of rigor mortis in a few pairs of samples, but did not influence the time taken for rigor mortis to reach its full extent after death. Moreover, the progress of rigor mortis in this muscle was biphasic; this may reflect the early rigor of red muscle fibres and the late rigor of white muscle fibres.
Neoliberalism, Policy Reforms and Higher Education in Bangladesh
ERIC Educational Resources Information Center
Kabir, Ariful Haq
2013-01-01
Bangladesh has introduced neoliberal policies since the 1970s. Military regimes, since the dramatic political changes in 1975, accelerated the process. A succession of military rulers made rigorous changes in policy-making in various sectors. This article uses a critical approach to document analysis and examines the perceptions of key…
Quantitative Approaches to Group Research: Suggestions for Best Practices
ERIC Educational Resources Information Center
McCarthy, Christopher J.; Whittaker, Tiffany A.; Boyle, Lauren H.; Eyal, Maytal
2017-01-01
Rigorous scholarship is essential to the continued growth of group work, yet the unique nature of this counseling specialty poses challenges for quantitative researchers. The purpose of this proposal is to overview unique challenges to quantitative research with groups in the counseling field, including difficulty in obtaining large sample sizes…
ERIC Educational Resources Information Center
Boohan, Richard
2014-01-01
This article describes an approach to teaching about the energy concept that aims to be accessible to students starting in early secondary school, while being scientifically rigorous and forming the foundation for later work. It discusses how exploring thermal processes is a good starting point for a more general consideration of the ways that…
ERIC Educational Resources Information Center
Tucker, James E.
2014-01-01
The field of K-12 deaf education today continues to be fractured by ideological camps. A newcomer to the field quickly learns that the controversies related to language, communication, and instructional approaches continue to rage after almost 200 years of contentious debate. Much attention is given to auditory and speech development as well as…
This research will quantify the extent of de facto reuse of untreated wastewater at the global scale. Through the integration of multiple existing spatial data sources, this project will produce rigorous analyses assessing the relationship between wastewater irrigation, hea...
Management Information System Based on the Balanced Scorecard
ERIC Educational Resources Information Center
Kettunen, Juha; Kantola, Ismo
2005-01-01
Purpose: This study seeks to describe the planning and implementation in Finland of a campus-wide management information system using a rigorous planning methodology. Design/methodology/approach: The structure of the management information system is planned on the basis of the management process, where strategic management and the balanced…
Competency-Based Curriculum: An Effective Approach to Digital Curation Education
ERIC Educational Resources Information Center
Kim, Jeonghyun
2015-01-01
The University of North Texas conducted a project involving rigorous curriculum development and instructional design to address the goal of building capacity in the Library and Information Sciences curriculum. To prepare information professionals with the competencies needed for digital curation and data management practice, the project developed…
The Cost Effectiveness of 22 Approaches for Raising Student Achievement
ERIC Educational Resources Information Center
Yeh, Stuart S.
2010-01-01
Review of cost-effectiveness studies suggests that rapid assessment is more cost effective with regard to student achievement than comprehensive school reform (CSR), cross-age tutoring, computer-assisted instruction, a longer school day, increases in teacher education, teacher experience or teacher salaries, summer school, more rigorous math…
A Novel Approach to Physiology Education for Biomedical Engineering Students
ERIC Educational Resources Information Center
DiCecco, J.; Wu, J.; Kuwasawa, K.; Sun, Y.
2007-01-01
It is challenging for biomedical engineering programs to incorporate an in-depth study of the systemic interdependence of cells, tissues, and organs into the rigorous mathematical curriculum that is the cornerstone of engineering education. To be sure, many biomedical engineering programs require their students to enroll in anatomy and physiology…
Building "Applied Linguistic Historiography": Rationale, Scope, and Methods
ERIC Educational Resources Information Center
Smith, Richard
2016-01-01
In this article I argue for the establishment of "Applied Linguistic Historiography" (ALH), that is, a new domain of enquiry within applied linguistics involving a rigorous, scholarly, and self-reflexive approach to historical research. Considering issues of rationale, scope, and methods in turn, I provide reasons why ALH is needed and…
ERIC Educational Resources Information Center
Stanly, Pat
2009-01-01
Rough patches occur at both ends of the education pipeline, as students enter community colleges and move on to work or enrollment in four-year institutions. Career pathways--sequences of coherent, articulated, and rigorous career and academic courses that lead to an industry-recognized certificate or a college degree--are a promising approach to…
A Writing-Intensive, Methods-Based Laboratory Course for Undergraduates
ERIC Educational Resources Information Center
Colabroy, Keri L.
2011-01-01
Engaging undergraduate students in designing and executing original research should not only be accompanied by technique training but also intentional instruction in the critical analysis and writing of scientific literature. The course described here takes a rigorous approach to scientific reading and writing using primary literature as the model…
Group Practices: A New Way of Viewing CSCL
ERIC Educational Resources Information Center
Stahl, Gerry
2017-01-01
The analysis of "group practices" can make visible the work of novices learning how to inquire in science or mathematics. These ubiquitous practices are invisibly taken for granted by adults, but can be observed and rigorously studied in adequate traces of online collaborative learning. Such an approach contrasts with traditional…
A Transformative Model for Undergraduate Quantitative Biology Education
ERIC Educational Resources Information Center
Usher, David C.; Driscoll, Tobin A.; Dhurjati, Prasad; Pelesko, John A.; Rossi, Louis F.; Schleiniger, Gilberto; Pusecker, Kathleen; White, Harold B.
2010-01-01
The "BIO2010" report recommended that students in the life sciences receive a more rigorous education in mathematics and physical sciences. The University of Delaware approached this problem by (1) developing a bio-calculus section of a standard calculus course, (2) embedding quantitative activities into existing biology courses, and (3)…
Chinese Students' Satisfaction of the Study Abroad Experience
ERIC Educational Resources Information Center
Wang, Qinggang; Taplin, Ross; Brown, Alistair M.
2011-01-01
Purpose: Building upon McLeod and Wainwright's paradigm for rigorous scientific assessment of study abroad programs, this paper aims to use social learning theory to assess mainland Chinese students' satisfaction of the Chinese Curtin Student Accounting Academic Programme. Design/methodology/approach: A sample of mainland Chinese students enrolled…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-12
... that can demonstrate impact through other methodological approaches such as a quasi-experimental design... definition of ``relevant outcome.'' Lastly, quasi-experimental designs are already included in the definition... paragraph (b) of this definition, provided they are rigorous and comparable across schools. (b) For non...
Approaches to Cross-Cultural Research in Art Education.
ERIC Educational Resources Information Center
Anderson, Frances E.
1979-01-01
The author defines the aims of cross-cultural research in art education and examines the problems inherent in such research, using as an illustration a summary chart of Child's cross-cultural studies of esthetic sensitivity. Emphasis is placed on the need for rigor in research design and execution. (SJL)
Math Exchanges: Guiding Young Mathematicians in Small-Group Meetings
ERIC Educational Resources Information Center
Wedekind, Kassia Omohundro
2011-01-01
Traditionally, small-group math instruction has been used as a format for reaching children who struggle to understand. Math coach Kassia Omohundro Wedekind uses small-group instruction as the centerpiece of her math workshop approach, engaging all students in rigorous "math exchanges." The key characteristics of these mathematical conversations…
Emerging Action Research Traditions: Rigor in Practice
ERIC Educational Resources Information Center
Watkins, Karen E.; Nicolaides, Aliki; Marsick, Victoria J.
2016-01-01
The authors argue here that contemporary use of action research shares the exploratory, inductive nature of many qualitative research approaches--no matter the type of data collected--because the type of research problems studied are set in complex, dynamic, rapidly changing contexts and because action research is undertaken to support social and…
Testing Adaptive Toolbox Models: A Bayesian Hierarchical Approach
ERIC Educational Resources Information Center
Scheibehenne, Benjamin; Rieskamp, Jorg; Wagenmakers, Eric-Jan
2013-01-01
Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often unclear how to rigorously test the toolbox…
Reexamining Our Approach to College Access
ERIC Educational Resources Information Center
Pérez, Angel B.
2017-01-01
In this article Trinity College vice president for enrollment and student success, Angel Pérez addresses the nation's inability to offer consistent college preparation, academic rigor and counseling across varying socioeconomic communities. Research has highlighted the fact that standardized tests do more to keep low-income students out of top…
Data Use and Inquiry in Research-Practice Partnerships: Four Case Examples
ERIC Educational Resources Information Center
Biag, Manuelito; Gerstein, Amy; Fehrer, Kendra; Sanchez, Monika; Sipes, Laurel
2016-01-01
The four case examples presented in this brief are drawn from the Gardner Center's substantial experience conducting rigorous research in research-practice partnerships. The first case describes a partnership approach that enhances a school district's capacity to use integrated longitudinal data to tackle persistent problems of practice and…
Hughes, Brianna H; Greenberg, Neil J; Yang, Tom C; Skonberg, Denise I
2015-01-01
High-pressure processing (HPP) is used to increase meat safety and shelf-life, with conflicting quality effects depending on rigor status during HPP. In the seafood industry, HPP is used to shuck and pasteurize oysters, but its use on abalones has only been minimally evaluated and the effect of rigor status during HPP on abalone quality has not been reported. Farm-raised abalones (Haliotis rufescens) were divided into 12 HPP treatments and 1 unprocessed control treatment. Treatments were processed pre-rigor or post-rigor at 2 pressures (100 and 300 MPa) and 3 processing times (1, 3, and 5 min). The control was analyzed post-rigor. Uniform plugs were cut from adductor and foot meat for texture profile analysis, shear force, and color analysis. Subsamples were used for scanning electron microscopy of muscle ultrastructure. Texture profile analysis revealed that post-rigor processed abalone was significantly (P < 0.05) less firm and chewy than pre-rigor processed irrespective of muscle type, processing time, or pressure. L values increased with pressure to 68.9 at 300 MPa for pre-rigor processed foot, 73.8 for post-rigor processed foot, 90.9 for pre-rigor processed adductor, and 89.0 for post-rigor processed adductor. Scanning electron microscopy images showed fraying of collagen fibers in processed adductor, but did not show pressure-induced compaction of the foot myofibrils. Post-rigor processed abalone meat was more tender than pre-rigor processed meat, and post-rigor processed foot meat was lighter in color than pre-rigor processed foot meat, suggesting that waiting for rigor to resolve prior to processing abalones may improve consumer perceptions of quality and market value. © 2014 Institute of Food Technologists®
Zhang, Panpan; Li, Jing; Lv, Lingxiao; Zhao, Yang; Qu, Liangti
2017-05-23
Efficient utilization of solar energy for clean water is an attractive, renewable, and environment friendly way to solve the long-standing water crisis. For this task, we prepared the long-range vertically aligned graphene sheets membrane (VA-GSM) as the highly efficient solar thermal converter for generation of clean water. The VA-GSM was prepared by the antifreeze-assisted freezing technique we developed, which possessed the run-through channels facilitating the water transport, high light absorption capacity for excellent photothermal transduction, and the extraordinary stability in rigorous conditions. As a result, VA-GSM has achieved average water evaporation rates of 1.62 and 6.25 kg m⁻² h⁻¹ under 1 and 4 sun illumination with a superb solar thermal conversion efficiency of up to 86.5% and 94.2%, respectively, better than that of most carbon materials reported previously, which can efficiently produce the clean water from seawater, common wastewater, and even concentrated acid and/or alkali solutions.
NASA Astrophysics Data System (ADS)
Meyer, Toni; Körner, Christian; Vandewal, Koen; Leo, Karl
2018-04-01
In two terminal tandem solar cells, the current density - voltage (jV) characteristic of the individual subcells is typically not directly measurable, but often required for a rigorous device characterization. In this work, we reconstruct the jV-characteristic of organic solar cells from measurements of the external quantum efficiency under applied bias voltages and illumination. We show that it is necessary to perform a bias irradiance variation at each voltage and subsequently conduct a mathematical correction of the differential to the absolute external quantum efficiency to obtain an accurate jV-characteristic. Furthermore, we show that measuring the external quantum efficiency as a function of voltage for a single bias irradiance of 0.36 AM1.5g equivalent sun provides a good approximation of the photocurrent density over voltage curve. The method is tested on a selection of efficient, common single-junctions. The obtained conclusions can easily be transferred to multi-junction devices with serially connected subcells.
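As a rough illustration of the quantity being reconstructed, the sketch below integrates an external quantum efficiency curve against a photon flux to obtain a photocurrent density, J = q ∫ EQE(λ) Φ(λ) dλ. It is not the bias-irradiance correction procedure of the paper; the wavelength grid, EQE shape, and flux values are invented placeholders.

```python
import numpy as np

# Hedged sketch: estimate a photocurrent density from an EQE curve via
# J = q * integral( EQE(lambda) * phi(lambda) dlambda ).
# The EQE shape and photon flux below are invented placeholders, not data
# from the paper and not an AM1.5g tabulation.
q = 1.602176634e-19                                            # elementary charge, C

wavelength_nm = np.linspace(350.0, 800.0, 451)                 # assumed spectral range
eqe = 0.7 * np.exp(-((wavelength_nm - 550.0) / 200.0) ** 2)    # placeholder EQE curve
phi = 4.0e18 * np.ones_like(wavelength_nm)                     # photons m^-2 s^-1 nm^-1, placeholder

j_photo = q * np.trapz(eqe * phi, wavelength_nm)               # A m^-2
print(f"estimated photocurrent density: {j_photo / 10.0:.1f} mA/cm^2")
```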
Zhang, Zheshen; Voss, Paul L
2009-07-06
We propose a continuous variable based quantum key distribution protocol that makes use of discretely signaled coherent light and reverse error reconciliation. We present a rigorous security proof against collective attacks with realistic lossy, noisy quantum channels, imperfect detector efficiency, and detector electronic noise. This protocol is promising for convenient, high-speed operation at link distances up to 50 km with the use of post-selection.
Learning optimal quantum models is NP-hard
NASA Astrophysics Data System (ADS)
Stark, Cyril J.
2018-02-01
Physical modeling translates measured data into a physical model. Physical modeling is a major objective in physics and is generally regarded as a creative process. How good are computers at solving this task? Here, we show that in the absence of physical heuristics, the inference of optimal quantum models cannot be computed efficiently (unless P=NP ). This result illuminates rigorous limits to the extent to which computers can be used to further our understanding of nature.
Differential geometry based solvation model I: Eulerian formulation
NASA Astrophysics Data System (ADS)
Chen, Zhan; Baker, Nathan A.; Wei, G. W.
2010-11-01
This paper presents a differential geometry based model for the analysis and computation of the equilibrium property of solvation. Differential geometry theory of surfaces is utilized to define and construct smooth interfaces with good stability and differentiability for use in characterizing the solvent-solute boundaries and in generating continuous dielectric functions across the computational domain. A total free energy functional is constructed to couple polar and nonpolar contributions to the solvation process. Geometric measure theory is employed to rigorously convert a Lagrangian formulation of the surface energy into an Eulerian formulation so as to bring all energy terms into an equal footing. By optimizing the total free energy functional, we derive the coupled generalized Poisson-Boltzmann equation (GPBE) and generalized geometric flow equation (GGFE) for the electrostatic potential and the construction of realistic solvent-solute boundaries, respectively. By solving the coupled GPBE and GGFE, we obtain the electrostatic potential, the solvent-solute boundary profile, and the smooth dielectric function, and thereby improve the accuracy and stability of implicit solvation calculations. We also design efficient second-order numerical schemes for the solution of the GPBE and GGFE. The matrix resulting from the discretization of the GPBE is accelerated with appropriate preconditioners. An alternating direction implicit (ADI) scheme is designed to improve the stability of solving the GGFE. Two iterative approaches are designed to solve the coupled system of nonlinear partial differential equations. Extensive numerical experiments are designed to validate the present theoretical model, test computational methods, and optimize numerical algorithms. Example solvation analyses of both small compounds and proteins are carried out to further demonstrate the accuracy, stability, efficiency and robustness of the present new model and numerical approaches. Comparison is given to both experimental and theoretical results in the literature.
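For readers unfamiliar with the starting point that the GPBE generalizes, a sketch of the standard nonlinear Poisson-Boltzmann equation is given below in generic notation (this is the textbook form, not the paper's generalized equation, and the symbols are assumptions rather than quotations from the paper):

```latex
% Textbook nonlinear Poisson-Boltzmann equation (generic notation, not the GPBE)
\nabla \cdot \bigl[\, \epsilon(\mathbf{r})\, \nabla \phi(\mathbf{r}) \,\bigr]
  \;-\; \bar{\kappa}^{2}(\mathbf{r}) \sinh\!\bigl(\phi(\mathbf{r})\bigr)
  \;=\; -\,4\pi\, \rho_{f}(\mathbf{r})
```

Here φ is the reduced electrostatic potential, ε the position-dependent dielectric coefficient, κ̄ the modified Debye-Hückel screening parameter, and ρ_f the fixed solute charge density; in the paper's model the dielectric function is generated from the smooth, evolving solvent-solute interface rather than prescribed on a sharp boundary.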
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2012-01-01
Case Study with WIPP program overview, information regarding eligibility, and successes from Pennsylvania's Commission on Economic Opportunity (CEO) that demonstrate innovative approaches that maximize the benefit of the program. The Department of Energy (DOE) Office of Energy Efficiency and Renewable Energy (EERE) recently launched the Weatherization Innovation Pilot Program (WIPP) to accelerate innovations in whole-house weatherization and advance DOE's goal of increasing the energy efficiency and health and safety of homes of low-income families. Since 2010, WIPP has helped weatherization service providers as well as new and nontraditional partners leverage non-federal financial resources to supplement federal grants, saving taxpayer money. WIPP complements the Weatherization Assistance Program (WAP), which operates nation-wide, in U.S. territories and in three Native American tribes. Sixteen grantees are implementing weatherization innovation projects using experimental approaches to find new and better ways to weatherize homes. They are using approaches such as: (1) Financial tools - by understanding a diverse range of financing mechanisms, grantees can maximize the impact of the federal grant dollars while providing high-quality work and benefits to eligible low-income clients; (2) Green and healthy homes - in addition to helping families reduce their energy costs, grantees can protect their health and safety. Two WIPP projects (Connecticut and Maryland) will augment standard weatherization services with a comprehensive green and healthy homes approach; (3) New technologies and techniques - following the model of continuous improvement in weatherization, WIPP grantees will continue to use new and better technologies and techniques to improve the quality of work; (4) Residential energy behavior change - Two grantees are rigorously testing home energy monitors (HEMs) that display energy used in kilowatt-hours, allowing residents to monitor and reduce their energy use, and another is examining best-practices for mobile home energy efficiency; (5) Workforce development and volunteers - with a goal of creating a self-sustaining weatherization model that does not require future federal investment, three grantees are adapting business models successful in other sectors of the home performance business to perform weatherization work. Youthbuild is training youth to perform home energy upgrades to eligible clients and Habitat for Humanity is developing a model for how to incorporate volunteer labor in home weatherization. These innovative approaches will improve key weatherization outcomes, such as: Increasing the total number of homes that are weatherized; Reducing the weatherization cost per home; Increasing the energy savings in each weatherized home; Increasing the number of weatherization jobs created and retained; and Reducing greenhouse gas emissions.
Mathematical models and photogrammetric exploitation of image sensing
NASA Astrophysics Data System (ADS)
Puatanachokchai, Chokchai
Mathematical models of image sensing are generally categorized into physical/geometrical sensor models and replacement sensor models. While the former is determined from image sensing geometry, the latter is based on knowledge of the physical/geometric sensor models and on using such models for its implementation. The main thrust of this research is in replacement sensor models which have three important characteristics: (1) Highly accurate ground-to-image functions; (2) Rigorous error propagation that is essentially of the same accuracy as the physical model; and, (3) Adjustability, or the ability to upgrade the replacement sensor model parameters when additional control information becomes available after the replacement sensor model has replaced the physical model. In this research, such replacement sensor models are considered as True Replacement Models or TRMs. TRMs provide a significant advantage of universality, particularly for image exploitation functions. There have been several writings about replacement sensor models, and except for the so called RSM (Replacement Sensor Model as a product described in the Manual of Photogrammetry), almost all of them pay very little or no attention to errors and their propagation. This is because, it is suspected, the few physical sensor parameters are usually replaced by many more parameters, thus presenting a potential error estimation difficulty. The third characteristic, adjustability, is perhaps the most demanding. It provides an equivalent flexibility to that of triangulation using the physical model. Primary contributions of this thesis include not only "the eigen-approach", a novel means of replacing the original sensor parameter covariance matrices at the time of estimating the TRM, but also the implementation of the hybrid approach that combines the eigen-approach with the added parameters approach used in the RSM. Using either the eigen-approach or the hybrid approach, rigorous error propagation can be performed during image exploitation. Further, adjustability can be performed when additional control information becomes available after the TRM has been implemented. The TRM is shown to apply to imagery from sensors having different geometries, including an aerial frame camera, a spaceborne linear array sensor, an airborne pushbroom sensor, and an airborne whiskbroom sensor. TRM results show essentially negligible differences as compared to those from rigorous physical sensor models, both for geopositioning from single and overlapping images. Simulated as well as real image data are used to address all three characteristics of the TRM.
Combining correlative and mechanistic habitat suitability models to improve ecological compensation.
Meineri, Eric; Deville, Anne-Sophie; Grémillet, David; Gauthier-Clerc, Michel; Béchet, Arnaud
2015-02-01
Only a few studies have shown positive impacts of ecological compensation on species dynamics affected by human activities. We argue that this is due to inappropriate methods used to forecast required compensation in environmental impact assessments. These assessments are mostly descriptive and only valid at limited spatial and temporal scales. However, habitat suitability models developed to predict the impacts of environmental changes on potential species' distributions should provide rigorous science-based tools for compensation planning. Here we describe the two main classes of predictive models: correlative models and individual-based mechanistic models. We show how these models can be used alone or synoptically to improve compensation planning. While correlative models are easier to implement, they tend to ignore underlying ecological processes and lack accuracy. On the contrary, individual-based mechanistic models can integrate biological interactions, dispersal ability and adaptation. Moreover, among mechanistic models, those considering animal energy balance are particularly efficient at predicting the impact of foraging habitat loss. However, mechanistic models require more field data compared to correlative models. Hence we present two approaches which combine both methods for compensation planning, especially in relation to the spatial scale considered. We show how the availability of biological databases and software enabling fast and accurate population projections could be advantageously used to assess ecological compensation requirement efficiently in environmental impact assessments. © 2014 The Authors. Biological Reviews © 2014 Cambridge Philosophical Society.
Mandal, Aninda; Datta, Animesh K.
2014-01-01
A “thick stem” mutant of Corchorus olitorius L. was induced at M2 (0.50%, 4 h, EMS) and the true breeding mutant is assessed across generations (M5 to M7) considering morphometric traits as well as SEM analysis of pollen grains and raw jute fibres, stem anatomy, cytogenetical attributes, and lignin content in relation to control. Furthermore, single fibre diameter and tensile strength are also analysed. The objective is to assess the stability of mutant for its effective exploration for raising a new plant type in tossa jute for commercial exploitation and efficient breeding. The mutant trait is monogenic recessive to normal. Results indicate that “thick stem” mutant is stable across generations (2n = 14) with distinctive high seed and fibre yield and significantly low lignin content. Stem anatomy of the mutant shows significant enhancement in fibre zone, number of fibre pyramids and fibre bundles per pyramid, and diameter of fibre cell in relation to control. Moreover, tensile strength of mutant fibre is significantly higher than control fibre and the trait is inversely related to fibre diameter. However the mutant is associated with low germination frequency, poor seed viability, and high pollen sterility, which may be eliminated through mutational approach followed by rigorous selection and efficient breeding. PMID:24860822
Rosenfeld, Richard M; Shiffman, Richard N; Robertson, Peter
2013-01-01
Guidelines translate best evidence into best practice. A well-crafted guideline promotes quality by reducing health care variations, improving diagnostic accuracy, promoting effective therapy, and discouraging ineffective-or potentially harmful-interventions. Despite a plethora of published guidelines, methodology is often poorly defined and varies greatly within and among organizations. The third edition of this manual describes the principles and practices used successfully by the American Academy of Otolaryngology--Head and Neck Surgery Foundation to produce quality-driven, evidence-based guidelines using efficient and transparent methodology for actionable recommendations with multidisciplinary applicability. The development process emphasizes a logical sequence of key action statements supported by amplifying text, action statement profiles, and recommendation grades linking action to evidence. New material in this edition includes standards for trustworthy guidelines, updated classification of evidence levels, increased patient and public involvement, assessing confidence in the evidence, documenting differences of opinion, expanded discussion of conflict of interest, and use of computerized decision support for crafting actionable recommendations. As clinical practice guidelines become more prominent as a key metric of quality health care, organizations must develop efficient production strategies that balance rigor and pragmatism. Equally important, clinicians must become savvy in understanding what guidelines are--and are not--and how they are best used to improve care. The information in this manual should help clinicians and organizations achieve these goals.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eccleston, C.H.
1997-09-05
The National Environmental Policy Act (NEPA) of 1969 was established by Congress more than a quarter of a century ago, yet there is a surprising lack of specific tools, techniques, and methodologies for effectively implementing these regulatory requirements. Lack of professionally accepted techniques is a principal factor responsible for many inefficiencies. Often, decision makers do not fully appreciate or capitalize on the true potential which NEPA provides as a platform for planning future actions. New approaches and modern management tools must be adopted to fully achieve NEPA's mandate. A new strategy, referred to as Total Federal Planning, is proposed for unifying large-scale federal planning efforts under a single, systematic, structured, and holistic process. Under this approach, the NEPA planning process provides a unifying framework for integrating all early environmental and nonenvironmental decision-making factors into a single comprehensive planning process. To promote effectiveness and efficiency, modern tools and principles from the disciplines of Value Engineering, Systems Engineering, and Total Quality Management are incorporated. Properly integrated and implemented, these planning tools provide the rigorous, structured, and disciplined framework essential in achieving effective planning. Ultimately, the goal of a Total Federal Planning strategy is to construct a unified and interdisciplinary framework that substantially improves decision-making, while reducing the time, cost, redundancy, and effort necessary to comply with environmental and other planning requirements. At a time when Congress is striving to re-engineer the governmental framework, apparatus, and process, a Total Federal Planning philosophy offers a systematic approach for uniting the disjointed and often convoluted planning process currently used by most federal agencies. Potentially this approach has widespread implications in the way federal planning is approached.
2007-01-01
results could be compromised. While service development efforts tied to seabasing are approaching milestones for investment decisions, it is... estimates for joint seabasing options are developed and made transparent to DOD and Congress, decision makers will not be able to evaluate the cost... Integrate Service Initiatives... DOD Has Not Developed a Joint Experimentation Campaign Plan to Inform Decisions About Joint Seabasing... Timeframe for
Agut, C; Caron, A; Giordano, C; Hoffman, D; Ségalini, A
2011-09-10
In 2001, a multidisciplinary team made of analytical scientists and statisticians at Sanofi-aventis has published a methodology which has governed, from that time, the transfers from R&D sites to Manufacturing sites of the release monographs. This article provides an overview of the recent adaptations brought to this original methodology taking advantage of our experience and the new regulatory framework, and, in particular, the risk management perspective introduced by ICH Q9. Although some alternate strategies have been introduced in our practices, the comparative testing one, based equivalence testing as statistical approach, remains the standard for assays lying on very critical quality attributes. This is conducted with the concern to control the most important consumer's risk involved at two levels in analytical decisions in the frame of transfer studies: risk, for the receiving laboratory, to take poor release decisions with the analytical method and risk, for the sending laboratory, to accredit such a receiving laboratory on account of its insufficient performances with the method. Among the enhancements to the comparative studies, the manuscript presents the process settled within our company for a better integration of the transfer study into the method life-cycle, just as proposals of generic acceptance criteria and designs for assay and related substances methods. While maintaining rigor and selectivity of the original approach, these improvements tend towards an increased efficiency in the transfer operations. Copyright © 2011 Elsevier B.V. All rights reserved.
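Since the comparative-testing strategy rests on equivalence testing, a minimal sketch of a two one-sided tests (TOST) comparison between a sending and a receiving laboratory is shown below. The assay values, the ±2% acceptance limit, and the significance level are invented for illustration and are not the criteria described in the article.

```python
import numpy as np
from scipy import stats

# Minimal two one-sided tests (TOST) sketch for an assay transfer:
# conclude equivalence if the difference in lab means lies within +/- margin.
sending = np.array([99.1, 100.2, 99.8, 100.5, 99.6, 100.0])   # % label claim, illustrative
receiving = np.array([98.7, 99.9, 99.5, 100.1, 99.2, 99.4])   # % label claim, illustrative
margin = 2.0                                                   # acceptance limit, illustrative
alpha = 0.05

diff = receiving.mean() - sending.mean()
se = np.sqrt(sending.var(ddof=1) / len(sending) + receiving.var(ddof=1) / len(receiving))
df = len(sending) + len(receiving) - 2   # simple approximation of the degrees of freedom

# One-sided t-tests against the lower and upper equivalence bounds.
t_lower = (diff + margin) / se
t_upper = (diff - margin) / se
p_lower = 1 - stats.t.cdf(t_lower, df)   # H0: diff <= -margin
p_upper = stats.t.cdf(t_upper, df)       # H0: diff >= +margin

equivalent = max(p_lower, p_upper) < alpha
print(f"difference = {diff:.2f}, TOST p = {max(p_lower, p_upper):.4f}, equivalent = {equivalent}")
```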
NASA Astrophysics Data System (ADS)
Piefke, Christoph; Lechermann, Frank
2018-03-01
The theory of correlated electron systems on a lattice proves notoriously complicated because of the exponential growth of Hilbert space. Mean-field approaches provide valuable insight when the self-energy has a dominant local structure. Additionally, the extraction of effective low-energy theories from the generalized many-body representation is highly desirable. In this respect, the rotational-invariant slave-boson (RISB) approach in its mean-field formulation enables versatile access to correlated lattice problems. However, in its original form, due to numerical complexity, the RISB approach is limited to about three correlated orbitals per lattice site. We thus present a thorough symmetry-adapted advancement of RISB theory, suited to efficiently deal with multiorbital Hubbard Hamiltonians for complete atomic-shell manifolds. It is utilized to study the intriguing problem of Hund's physics for three- and especially five-orbital manifolds on the correlated lattice, including crystal-field terms as well as spin-orbit interaction. The well-known Janus-face phenomenology, i.e., strengthening of correlations at smaller-to-intermediate Hubbard U accompanied by a shift of the Mott transition to a larger U value, has a stronger signature and more involved multiplet resolution for five-orbital problems. Spin-orbit interaction effectively reduces the critical local interaction strength and weakens the Janus-face behavior. Application to the realistic challenge of Fe chalcogenides underlines the subtle interplay of the orbital degrees of freedom in these materials.
Xu, Yan; Chen, Yan; Li, Daliang; Liu, Qing; Xuan, Zhenyu; Li, Wen-Hong
2017-02-01
MicroRNAs are small non-coding RNAs acting as posttranscriptional repressors of gene expression. Identifying mRNA targets of a given miRNA remains an outstanding challenge in the field. We have developed a new experimental approach, TargetLink, that applied locked nucleic acid (LNA) as the affinity probe to enrich target genes of a specific microRNA in intact cells. TargetLink also consists of a rigorous and systematic data analysis pipeline to identify target genes by comparing LNA-enriched sequences between experimental and control samples. Using miR-21 as a test microRNA, we identified 12 target genes of miR-21 in a human colorectal cancer cell by this approach. The majority of the identified targets interacted with miR-21 via imperfect seed pairing. Target validation confirmed that miR-21 repressed the expression of the identified targets. The cellular abundance of the identified miR-21 target transcripts varied over a wide range, with some targets expressed at a rather low level, confirming that both abundant and rare transcripts are susceptible to regulation by microRNAs, and that TargetLink is an efficient approach for identifying the target set of a specific microRNA in intact cells. C20orf111, one of the novel targets identified by TargetLink, was found to reside in the nuclear speckle and to be reliably repressed by miR-21 through the interaction at its coding sequence.
NASA Technical Reports Server (NTRS)
Kuwata, Yoshiaki; Pavone, Marco; Balaram, J. (Bob)
2012-01-01
This paper presents a novel risk-constrained multi-stage decision-making approach to the architectural analysis of planetary rover missions. In particular, focusing on a 2018 Mars rover concept, which was considered as part of a potential Mars Sample Return campaign, we model the entry, descent, and landing (EDL) phase and the rover traverse phase as four sequential decision-making stages. The problem is to find a sequence of divert and driving maneuvers so that the rover drive is minimized and the probability of a mission failure (e.g., due to a failed landing) is below a user-specified bound. By solving this problem for several different values of the model parameters (e.g., divert authority), this approach enables rigorous, accurate and systematic trade-offs for the EDL system vs. the mobility system, and, more generally, cross-domain trade-offs for the different phases of a space mission. The overall optimization problem can be seen as a chance-constrained dynamic programming problem, with the additional complexity that 1) in some stages the disturbances do not have any probabilistic characterization, and 2) the state space is extremely large (i.e., hundreds of millions of states for trade-offs with high-resolution Martian maps). For this purpose, we solve the problem by performing an unconventional combination of average and minimax cost analysis and by leveraging highly efficient computation tools from the image processing community. Preliminary trade-off results are presented.
Preserving pre-rigor meat functionality for beef patty production.
Claus, J R; Sørheim, O
2006-06-01
Three methods were examined for preserving pre-rigor meat functionality in beef patties. Hot-boned semimembranosus muscles were processed as follows: (1) pre-rigor ground, salted, patties immediately cooked; (2) pre-rigor ground, salted and stored overnight; (3) pre-rigor injected with brine; and (4) post-rigor ground and salted. Raw patties contained 60% lean beef, 19.7% beef fat trim, 1.7% NaCl, 3.6% starch, and 15% water. Pre-rigor processing occurred at 3-3.5h postmortem. Patties made from pre-rigor ground meat had higher pH values; greater protein solubility; firmer, more cohesive, and chewier texture; and substantially lower cooking losses than the other treatments. Addition of salt was sufficient to reduce the rate and extent of glycolysis. Brine injection of intact pre-rigor muscles resulted in some preservation of the functional properties but not as pronounced as with salt addition to pre-rigor ground meat.
Toward a new approach to the study of personality in culture.
Cheung, Fanny M; van de Vijver, Fons J R; Leong, Frederick T L
2011-10-01
We review recent developments in the study of culture and personality measurement. Three approaches are described: an etic approach that focuses on establishing measurement equivalence in imported measures of personality, an emic (indigenous) approach that studies personality in specific cultures, and a combined emic-etic approach to personality. We propose the latter approach as a way of combining the methodological rigor of the etic approach and the cultural sensitivity of the emic approach. The combined approach is illustrated by two examples: the first with origins in Chinese culture and the second in South Africa. The article ends with a discussion of the theoretical and practical implications of the combined emic-etic approach for the study of culture and personality and for psychology as a science.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maunz, Peter; Wilhelm, Lukas
Qubits can be encoded in clock states of trapped ions. These states are well isolated from the environment resulting in long coherence times [1] while enabling efficient high-fidelity qubit interactions mediated by the Coulomb coupled motion of the ions in the trap. Quantum states can be prepared with high fidelity and measured efficiently using fluorescence detection. State preparation and detection with 99.93% fidelity have been realized in multiple systems [1,2]. Single qubit gates have been demonstrated below rigorous fault-tolerance thresholds [1,3]. Two qubit gates have been realized with more than 99.9% fidelity [4,5]. Quantum algorithms have been demonstrated on systems of 5 to 15 qubits [6–8].
Polarization sensitivity testing of off-plane reflection gratings
NASA Astrophysics Data System (ADS)
Marlowe, Hannah; McEntaffer, Randal L.; DeRoo, Casey T.; Miles, Drew M.; Tutt, James H.; Laubis, Christian; Soltwisch, Victor
2015-09-01
Off-Plane reflection gratings were previously predicted to have different efficiencies when the incident light is polarized in the transverse-magnetic (TM) versus transverse-electric (TE) orientations with respect to the grating grooves. However, more recent theoretical calculations which rigorously account for finitely conducting, rather than perfectly conducting, grating materials no longer predict significant polarization sensitivity. We present the first empirical results for radially ruled, laminar groove profile gratings in the off-plane mount which demonstrate no difference in TM versus TE efficiency across our entire 300-1500 eV bandpass. These measurements together with the recent theoretical results confirm that grazing incidence off-plane reflection gratings using real, not perfectly conducting, materials are not polarization sensitive.
Modeling and Optimization of Sub-Wavelength Grating Nanostructures on Cu(In,Ga)Se2 Solar Cell
NASA Astrophysics Data System (ADS)
Kuo, Shou-Yi; Hsieh, Ming-Yang; Lai, Fang-I.; Liao, Yu-Kuang; Kao, Ming-Hsuan; Kuo, Hao-Chung
2012-10-01
In this study, an optical simulation of Cu(In,Ga)Se2 (CIGS) solar cells by the rigorous coupled-wave analysis (RCWA) method is carried out to investigate the effects of surface morphology on the light absorption and power conversion efficiencies. Various sub-wavelength grating (SWG) nanostructures of periodic ZnO:Al (AZO) on CIGS solar cells were discussed in detail. SWG nanostructures were used as efficient antireflection layers. From the simulation results, AZO structures with nipple arrays effectively suppress the Fresnel reflection compared with nanorod- and cone-shaped AZO structures. The optimized reflectance decreased from 8.44 to 3.02% and the efficiency increased from 14.92 to 16.11% accordingly. The remarkable enhancement in light harvesting is attributed to the gradient refractive index profile between the AZO nanostructures and air.
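As rough intuition for why a graded, nipple-shaped profile outperforms an abrupt surface, the sketch below compares the normal-incidence Fresnel reflectance of a single flat air/AZO interface with a stack of small index steps. This is a crude, incoherent estimate rather than an RCWA calculation, and the refractive index value is an assumption, not a number from the study.

```python
# Normal-incidence Fresnel reflectance at a single flat interface, as a crude
# baseline for why graded-index (nipple-like) AZO surfaces suppress reflection.
# Indices are illustrative assumptions, not values from the study.
def fresnel_reflectance(n1: float, n2: float) -> float:
    return ((n1 - n2) / (n1 + n2)) ** 2

n_air, n_azo = 1.0, 1.9   # assumed refractive index of ZnO:Al in the visible
print(f"flat air/AZO interface: R = {fresnel_reflectance(n_air, n_azo):.3f}")

# A graded profile behaves roughly like many small index steps, each reflecting little.
steps = [1.0 + i * (n_azo - 1.0) / 10 for i in range(11)]
r_graded = sum(fresnel_reflectance(a, b) for a, b in zip(steps, steps[1:]))
print(f"sum of 10 small steps (incoherent, crude): R ~ {r_graded:.4f}")
```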
Assessing tiger population dynamics using photographic capture-recapture sampling.
Karanth, K Ullas; Nichols, James D; Kumar, N Samba; Hines, James E
2006-11-01
Although wide-ranging, elusive, large carnivore species, such as the tiger, are of scientific and conservation interest, rigorous inferences about their population dynamics are scarce because of methodological problems of sampling populations at the required spatial and temporal scales. We report the application of a rigorous, noninvasive method for assessing tiger population dynamics to test model-based predictions about population viability. We obtained photographic capture histories for 74 individual tigers during a nine-year study involving 5725 trap-nights of effort. These data were modeled under a likelihood-based, "robust design" capture-recapture analytic framework. We explicitly modeled and estimated ecological parameters such as time-specific abundance, density, survival, recruitment, temporary emigration, and transience, using models that incorporated effects of factors such as individual heterogeneity, trap-response, and time on probabilities of photo-capturing tigers. The model estimated a random temporary emigration parameter of gamma" = gamma' = 0.10 +/- 0.069 (values are estimated mean +/- SE). When scaled to an annual basis, tiger survival rates were estimated at S = 0.77 +/- 0.051, and the estimated probability that a newly caught animal was a transient was tau = 0.18 +/- 0.11. During the period when the sampled area was of constant size, the estimated population size N(t) varied from 17 +/- 1.7 to 31 +/- 2.1 tigers, with a geometric mean rate of annual population change estimated as lambda = 1.03 +/- 0.020, representing a 3% annual increase. The estimated recruitment of new animals, B(t), varied from 0 +/- 3.0 to 14 +/- 2.9 tigers. Population density estimates, D, ranged from 7.33 +/- 0.8 tigers/100 km2 to 21.73 +/- 1.7 tigers/100 km2 during the study. Thus, despite substantial annual losses and temporal variation in recruitment, the tiger density remained at relatively high levels in Nagarahole. Our results are consistent with the hypothesis that protected wild tiger populations can remain healthy despite heavy mortalities because of their inherently high reproductive potential. The ability to model the entire photographic capture history data set and incorporate reduced-parameter models led to estimates of mean annual population change that were sufficiently precise to be useful. This efficient, noninvasive sampling approach can be used to rigorously investigate the population dynamics of tigers and other elusive, rare, wide-ranging animal species in which individuals can be identified from photographs or other means.
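As a much simpler illustration of the capture-recapture logic underlying these robust-design models, the sketch below applies the classic Chapman-corrected Lincoln-Petersen estimator to two hypothetical photographic surveys. The counts are invented, and this closed-population estimator ignores the open-population structure, heterogeneity, and trap response modeled in the study.

```python
# Chapman-corrected Lincoln-Petersen estimator: a closed-population toy version
# of capture-recapture, far simpler than the robust-design models in the study.
# The counts below are invented for illustration only.
def lincoln_petersen_chapman(n1: int, n2: int, m2: int) -> tuple[float, float]:
    """n1: animals photographed in survey 1, n2: in survey 2,
    m2: animals in survey 2 already identified in survey 1."""
    n_hat = (n1 + 1) * (n2 + 1) / (m2 + 1) - 1
    var = ((n1 + 1) * (n2 + 1) * (n1 - m2) * (n2 - m2)) / ((m2 + 1) ** 2 * (m2 + 2))
    return n_hat, var ** 0.5

n_hat, se = lincoln_petersen_chapman(n1=18, n2=20, m2=12)
print(f"abundance estimate: {n_hat:.1f} +/- {se:.1f}")
```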
Asoubar, Daniel; Wyrowski, Frank
2015-07-27
The computer-aided design of high quality mono-mode, continuous-wave solid-state lasers requires fast, flexible and accurate simulation algorithms. Therefore in this work a model for the calculation of the transversal dominant mode structure is introduced. It is based on the generalization of the scalar Fox and Li algorithm to a fully-vectorial light representation. To provide a flexible modeling concept of different resonator geometries containing various optical elements, rigorous and approximative solutions of Maxwell's equations are combined in different subdomains of the resonator. This approach allows the simulation of plenty of different passive intracavity components as well as active media. For the numerically efficient simulation of nonlinear gain, thermal lensing and stress-induced birefringence effects in solid-state active crystals a semi-analytical vectorial beam propagation method is discussed in detail. As a numerical example the beam quality and output power of a flash-lamp-pumped Nd:YAG laser are improved. To that end we compensate the influence of stress-induced birefringence and thermal lensing by an aspherical mirror and a 90° quartz polarization rotator.
Multivariate localization methods for ensemble Kalman filtering
NASA Astrophysics Data System (ADS)
Roh, S.; Jun, M.; Szunyogh, I.; Genton, M. G.
2015-12-01
In ensemble Kalman filtering (EnKF), the small number of ensemble members that is feasible to use in a practical data assimilation application leads to sampling variability of the estimates of the background error covariances. The standard approach to reducing the effects of this sampling variability, which has also been found to be highly efficient in improving the performance of EnKF, is the localization of the estimates of the covariances. One family of localization techniques is based on taking the Schur (element-wise) product of the ensemble-based sample covariance matrix and a correlation matrix whose entries are obtained by the discretization of a distance-dependent correlation function. While the proper definition of the localization function for a single state variable has been extensively investigated, a rigorous definition of the localization function for multiple state variables that exist at the same locations has seldom been considered. This paper introduces two strategies for the construction of localization functions for multiple state variables. The proposed localization functions are tested in experiments that assimilate simulated observations into the bivariate Lorenz 95 model.
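A minimal sketch of the single-variable localization step described above: the ensemble sample covariance is tapered by a Schur product with a distance-dependent correlation matrix. A Gaussian taper on a synthetic one-dimensional grid stands in here for the correlation functions studied in the paper; the grid, ensemble size, and length scale are assumptions.

```python
import numpy as np

# Schur (element-wise) product localization of an ensemble sample covariance.
# Synthetic 1-D grid and ensemble; a Gaussian taper stands in for the commonly
# used Gaspari-Cohn function (an assumption for brevity, not from the paper).
rng = np.random.default_rng(0)
n_grid, n_ens, length_scale = 40, 10, 5.0

x = np.arange(n_grid, dtype=float)
ensemble = rng.standard_normal((n_ens, n_grid))           # rows are ensemble members

anomalies = ensemble - ensemble.mean(axis=0)
sample_cov = anomalies.T @ anomalies / (n_ens - 1)         # noisy for small n_ens

dist = np.abs(x[:, None] - x[None, :])
taper = np.exp(-0.5 * (dist / length_scale) ** 2)          # distance-dependent correlation

localized_cov = taper * sample_cov                         # Schur product localization
print(localized_cov.shape, np.allclose(localized_cov, localized_cov.T))
```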
Development of a dynamic computational model of social cognitive theory.
Riley, William T; Martin, Cesar A; Rivera, Daniel E; Hekler, Eric B; Adams, Marc A; Buman, Matthew P; Pavel, Misha; King, Abby C
2016-12-01
Social cognitive theory (SCT) is among the most influential theories of behavior change and has been used as the conceptual basis of health behavior interventions for smoking cessation, weight management, and other health behaviors. SCT and other behavior theories were developed primarily to explain differences between individuals, but explanatory theories of within-person behavioral variability are increasingly needed as new technologies allow for intensive longitudinal measures and interventions adapted from these inputs. These within-person explanatory theoretical applications can be modeled as dynamical systems. SCT constructs, such as reciprocal determinism, are inherently dynamical in nature, but SCT has not been modeled as a dynamical system. This paper describes the development of a dynamical system model of SCT using fluid analogies and control systems principles drawn from engineering. Simulations of this model were performed to assess if the model performed as predicted based on theory and empirical studies of SCT. This initial model generates precise and testable quantitative predictions for future intensive longitudinal research. Dynamic modeling approaches provide a rigorous method for advancing health behavior theory development and refinement and for guiding the development of more potent and efficient interventions.
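To make the fluid-analogy idea concrete, the sketch below integrates a single first-order "inventory" equation, tau * d(eta)/dt = gain * u(t) - eta(t), for one hypothetical SCT-style construct driven by a step input. The construct, gain, time constant, and input are invented placeholders, not the model reported in the paper.

```python
import numpy as np

# Toy fluid-analogy simulation of one construct level ("self-efficacy") driven
# by an external cue, as a first-order inventory model:
#   tau * d(eta)/dt = gain * u(t) - eta(t)
# Gain, time constant, and input are invented placeholders.
tau, gain, dt, t_end = 5.0, 0.8, 0.1, 30.0

t = np.arange(0.0, t_end, dt)
u = (t >= 5.0).astype(float)          # a step "intervention" switched on at t = 5
eta = np.zeros_like(t)                # construct level, starts at zero

for k in range(1, len(t)):
    d_eta = (gain * u[k - 1] - eta[k - 1]) / tau
    eta[k] = eta[k - 1] + dt * d_eta  # forward-Euler integration

print(f"steady-state level approx {eta[-1]:.3f} (expected {gain:.3f})")
```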
Classification of hyperspectral imagery with neural networks: comparison to conventional tools
NASA Astrophysics Data System (ADS)
Merényi, Erzsébet; Farrand, William H.; Taranik, James V.; Minor, Timothy B.
2014-12-01
Efficient exploitation of hyperspectral imagery is of great importance in remote sensing. Artificial intelligence approaches have been receiving favorable reviews for classification of hyperspectral data because the complexity of such data challenges the limitations of many conventional methods. Artificial neural networks (ANNs) were shown to outperform traditional classifiers in many situations. However, studies that use the full spectral dimensionality of hyperspectral images to classify a large number of surface covers are scarce if not non-existent. We advocate the need for methods that can handle the full dimensionality and a large number of classes to retain the discovery potential and the ability to discriminate classes with subtle spectral differences. We demonstrate that such a method exists in the family of ANNs. We compare the maximum likelihood, Mahalanobis distance, minimum distance, spectral angle mapper, and a hybrid ANN classifier for real hyperspectral AVIRIS data, using the full spectral resolution to map 23 cover types and using a small training set. Rigorous evaluation of the classification accuracies shows that the ANN outperforms the other methods and achieves ≈90% accuracy on test data.
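The sketch below illustrates the general workflow such a comparison implies: train a small neural network on full spectral vectors and score accuracy on held-out pixels. The spectra are synthetic stand-ins for AVIRIS data, the network is an ordinary scikit-learn multilayer perceptron rather than the hybrid ANN of the study, and the class count and layer sizes are arbitrary assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic stand-in for hyperspectral pixels: 224 bands, 5 cover classes.
rng = np.random.default_rng(1)
n_pixels, n_bands, n_classes = 1000, 224, 5
class_means = rng.uniform(0.1, 0.9, size=(n_classes, n_bands))
labels = rng.integers(0, n_classes, size=n_pixels)
spectra = class_means[labels] + 0.05 * rng.standard_normal((n_pixels, n_bands))

X_train, X_test, y_train, y_test = train_test_split(
    spectra, labels, test_size=0.3, random_state=0, stratify=labels)

# A small multilayer perceptron trained on the full spectral dimensionality.
clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
clf.fit(X_train, y_train)
print(f"held-out accuracy: {accuracy_score(y_test, clf.predict(X_test)):.2f}")
```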
Risk assessment and management of Chlamydia psittaci in poultry processing plants.
Deschuyffeleer, Thomas P G; Tyberghien, Laurens F V; Dickx, Veerle L C; Geens, Tom; Saelen, Jacques M M M; Vanrompay, Daisy C G; Braeckman, Lutgard A C M
2012-04-01
Chlamydia psittaci causes respiratory disease in poultry and can be transmitted to humans. Historical outbreaks of psittacosis in poultry workers indicated the need for higher awareness and an efficient risk assessment and management. This group reviewed relevant previous research, practical guidelines, and European directives. Subsequently, basic suggestions were made on how to assess and manage the risk of psittacosis in poultry processing plants based on a classical four-step approach. Collective and personal protective measures as well as the role of occupational medicine are described. Despite the finding that exposure is found in every branch, abattoir workstations seem to be associated with the highest prevalence of psittacosis. Complete eradication is difficult to achieve. Ventilation, cleaning, hand hygiene, and personal protective equipment are the most important protective measures to limit and control exposure to C. psittaci. Adequate information, communication, and health surveillance belong to the responsibilities of the occupational physician. Future challenges lay in the rigorous reporting of infections in both poultry and poultry workers and in the development of an avian and human vaccine.
Post-processing interstitialcy diffusion from molecular dynamics simulations
NASA Astrophysics Data System (ADS)
Bhardwaj, U.; Bukkuru, S.; Warrier, M.
2016-01-01
An algorithm to rigorously trace the interstitialcy diffusion trajectory in crystals is developed. The algorithm incorporates unsupervised learning and graph optimization which obviate the need to input extra domain specific information depending on crystal or temperature of the simulation. The algorithm is implemented in a flexible framework as a post-processor to molecular dynamics (MD) simulations. We describe in detail the reduction of interstitialcy diffusion into known computational problems of unsupervised clustering and graph optimization. We also discuss the steps, computational efficiency and key components of the algorithm. Using the algorithm, thermal interstitialcy diffusion from low to near-melting point temperatures is studied. We encapsulate the algorithms in a modular framework with functionality to calculate diffusion coefficients, migration energies and other trajectory properties. The study validates the algorithm by establishing the conformity of output parameters with experimental values and provides detailed insights for the interstitialcy diffusion mechanism. The algorithm along with the help of supporting visualizations and analysis gives convincing details and a new approach to quantifying diffusion jumps, jump-lengths, time between jumps and to identify interstitials from lattice atoms.
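As a simplified illustration of one output such a post-processor can report, the sketch below estimates a diffusion coefficient from displacement data with the Einstein relation D = <|Δr|²>/(6t). The random-walk trajectories are synthetic and the jump length and time per jump are assumed values; this is not the published framework's interface.

```python
import numpy as np

# Hedged sketch: diffusion coefficient from the Einstein relation
# D = <|r(t) - r(0)|^2> / (6 t) for 3-D jump diffusion.
# The trajectories are synthetic random walks, not MD output; the jump
# length and time per jump are assumed values.
rng = np.random.default_rng(42)
n_walkers, n_steps = 200, 2000
dt, jump_length = 1.0e-12, 2.5e-10            # s per jump, m per jump (assumed)

# Each jump has length `jump_length` with random +/- components along x, y, z.
jumps = jump_length * rng.choice([-1.0, 1.0], size=(n_walkers, n_steps, 3)) / np.sqrt(3.0)
net_disp = jumps.sum(axis=1)                  # net displacement of every walker

msd = np.mean(np.sum(net_disp ** 2, axis=1))  # ensemble-averaged mean-square displacement
d_est = msd / (6.0 * n_steps * dt)
d_expected = jump_length ** 2 / (6.0 * dt)    # analytic value for this toy walk
print(f"D estimated: {d_est:.3e} m^2/s (expected {d_expected:.3e} m^2/s)")
```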
Bayesian Lagrangian Data Assimilation and Drifter Deployment Strategies
NASA Astrophysics Data System (ADS)
Dutt, A.; Lermusiaux, P. F. J.
2017-12-01
Ocean currents transport a variety of natural (e.g. water masses, phytoplankton, zooplankton, sediments, etc.) and man-made materials and other objects (e.g. pollutants, floating debris, search and rescue, etc.). Lagrangian Coherent Structures (LCSs) or the most influential/persistent material lines in a flow, provide a robust approach to characterize such Lagrangian transports and organize classic trajectories. Using the flow-map stochastic advection and a dynamically-orthogonal decomposition, we develop uncertainty prediction schemes for both Eulerian and Lagrangian variables. We then extend our Bayesian Gaussian Mixture Model (GMM)-DO filter to a joint Eulerian-Lagrangian Bayesian data assimilation scheme. The resulting nonlinear filter allows the simultaneous non-Gaussian estimation of Eulerian variables (e.g. velocity, temperature, salinity, etc.) and Lagrangian variables (e.g. drifter/float positions, trajectories, LCSs, etc.). Its results are showcased using a double-gyre flow with a random frequency, a stochastic flow past a cylinder, and realistic ocean examples. We further show how our Bayesian mutual information and adaptive sampling equations provide a rigorous efficient methodology to plan optimal drifter deployment strategies and predict the optimal times, locations, and types of measurements to be collected.
Review and prospect of supersonic business jet design
NASA Astrophysics Data System (ADS)
Sun, Yicheng; Smith, Howard
2017-04-01
This paper reviews the environmental issues and challenges relevant to the design of supersonic business jets (SSBJs). There has been a renewed, worldwide interest in developing an environmentally friendly, economically viable and technologically feasible supersonic transport aircraft. A historical overview indicates that the SSBJ will be the pioneer for the next generation of supersonic airliners. As a high-end product itself, the SSBJ will likely take a market share in the future. The mission profile appropriate to this vehicle is explored in light of the rigorous environmental constraints. Mitigation of the sonic boom and improvement of aerodynamic efficiency in flight are the most challenging features of civil supersonic transport. Technical issues and challenges associated with this type of aircraft are identified, and methodologies for SSBJ design are discussed. Because of the tightly coupled issues, a multidisciplinary design, analysis and optimization environment is regarded as the essential approach to the creation of a low-boom, low-drag supersonic aircraft. Industrial and academic organizations with an interest in this type of vehicle are presented. Their investment in SSBJ design will, it is hoped, bring civil supersonic transport back soon.
Post-processing interstitialcy diffusion from molecular dynamics simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhardwaj, U., E-mail: haptork@gmail.com; Bukkuru, S.; Warrier, M.
2016-01-15
An algorithm to rigorously trace the interstitialcy diffusion trajectory in crystals is developed. The algorithm incorporates unsupervised learning and graph optimization, which obviate the need to input extra domain-specific information that depends on the crystal or the simulation temperature. The algorithm is implemented in a flexible framework as a post-processor to molecular dynamics (MD) simulations. We describe in detail the reduction of interstitialcy diffusion into the known computational problems of unsupervised clustering and graph optimization. We also discuss the steps, computational efficiency and key components of the algorithm. Using the algorithm, thermal interstitialcy diffusion from low to near-melting-point temperatures is studied. We encapsulate the algorithms in a modular framework with functionality to calculate diffusion coefficients, migration energies and other trajectory properties. The study validates the algorithm by establishing the conformity of output parameters with experimental values and provides detailed insights into the interstitialcy diffusion mechanism. The algorithm, together with supporting visualizations and analysis, provides convincing detail and a new approach to quantifying diffusion jumps, jump lengths, and times between jumps, and to distinguishing interstitials from lattice atoms.
On Mathematical Modeling Of Quantum Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Achuthan, P.; Dept. of Mathematics, Indian Institute of Technology, Madras, 600 036; Narayanankutty, Karuppath
2009-07-02
The world of physical systems at the most fundamental levels is replete with efficient, interesting models possessing sufficient ability to represent reality to a considerable extent. So far, quantum mechanics (QM), which forms the basis of almost all natural phenomena, has demonstrated beyond doubt its intrinsic ingenuity, capacity and robustness to stand rigorous tests of validity through appropriate calculations and experiments. No serious failures of quantum mechanical predictions have been reported yet. However, Albert Einstein, the greatest theoretical physicist of the twentieth century, and some other eminent men of science have stated firmly and categorically that QM, though successful by and large, is incomplete. There are classical and quantum reality models, including those based on consciousness. Relativistic quantum theoretical approaches to clearly understand the ultimate nature of matter as well as radiation still have much to accomplish in order to qualify for a final theory of everything (TOE). Mathematical models of greater suitability and strength are needed to achieve a satisfactory explanation of natural processes and phenomena. In this paper, we discuss some of these matters with apt illustrations.
ERIC Educational Resources Information Center
Cen, Ling; Ruta, Dymitr; Powell, Leigh; Hirsch, Benjamin; Ng, Jason
2016-01-01
The benefits of collaborative learning, although widely reported, lack the quantitative rigor and detailed insight into the dynamics of interactions within the group, while individual contributions and their impacts on group members and their collaborative work remain hidden behind joint group assessment. To bridge this gap we intend to address…
The Applicability of Course Experience Questionnaire for a Malaysian University Context
ERIC Educational Resources Information Center
Thien, Lei Mee; Ong, Mei Yean
2016-01-01
Purpose: The purpose of this study is to examine the applicability of the Course Experience Questionnaire (CEQ) in a Malaysian university context. Design/methodology/approach: The CEQ was translated into the Malay language using rigorous cross-cultural adaptation procedures. The Malay-version CEQ was administered to 190 undergraduate students in one…
Co-Occurrence of ADHD and High IQ: A Case Series Empirical Study
ERIC Educational Resources Information Center
Cordeiro, Mara L.; Farias, Antonio C.; Cunha, Alexandre; Benko, Cassia R.; Farias, Lucilene G.; Costa, Maria T.; Martins, Leandra F.; McCracken, James T.
2011-01-01
Objective: The validity of a diagnosis of ADHD in children with a high intelligence quotient (IQ) remains controversial. Using a multidisciplinary approach, rigorous diagnostic criteria, and worldwide-validated psychometric instruments, we identified a group of children attending public schools in southern Brazil for co-occurrence of high IQ and…
ERIC Educational Resources Information Center
Riazi, A. Mehdi
2016-01-01
Mixed-methods research (MMR), as an inter-discourse (quantitative and qualitative) methodology, can provide applied linguistics researchers the opportunity to draw on and integrate the strengths of the two research methodological approaches in favour of making more rigorous inferences about research problems. In this article, the argument is made…
Le Chatelier's Principle: The Effect of Temperature on the Solubility of Solids in Liquids.
ERIC Educational Resources Information Center
Brice, L. K.
1983-01-01
Provides a rigorous but straightforward thermodynamic treatment of the temperature dependence of the solubility of solids in liquids that is suitable for presentation to undergraduates, suggesting how to approach the qualitative aspects of the subject for freshmen. Considers unsolvated/solvated solutes and Le Chatelier's principle. (JN)
ERIC Educational Resources Information Center
Phan, Huy P.; Ngu, Bing H.
2017-01-01
In social sciences, the use of stringent methodological approaches is gaining increasing emphasis. Researchers have recognized the limitations of cross-sectional, non-manipulative data in the study of causality. True experimental designs, in contrast, are preferred as they represent rigorous standards for achieving causal flows between variables.…
Learning Transfer--Validation of the Learning Transfer System Inventory in Portugal
ERIC Educational Resources Information Center
Velada, Raquel; Caetano, Antonio; Bates, Reid; Holton, Ed
2009-01-01
Purpose: The purpose of this paper is to analyze the construct validity of learning transfer system inventory (LTSI) for use in Portugal. Furthermore, it also aims to analyze whether LTSI dimensions differ across individual variables such as gender, age, educational level and job tenure. Design/methodology/approach: After a rigorous translation…
ERIC Educational Resources Information Center
Thomas, Jason E.; Hornsey, Philip E.
2014-01-01
Formative Classroom Assessment Techniques (CAT) have been well-established instructional tools in higher education since their exposition in the late 1980s (Angelo & Cross, 1993). A large body of literature exists surrounding the strengths and weaknesses of formative CATs. Simpson-Beck (2011) suggested insufficient quantitative evidence exists…
Reconstructing Constructivism: Causal Models, Bayesian Learning Mechanisms, and the Theory Theory
ERIC Educational Resources Information Center
Gopnik, Alison; Wellman, Henry M.
2012-01-01
We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework…
The Mauritian Education System: Was There a Will to Anglicize it?
ERIC Educational Resources Information Center
Tirvsssen, Rada
2007-01-01
Clive Whitehead (2005: 315-329) makes an indisputable claim that British colonial education is a controversial topic in the history of education. The macro study of educational systems undertaken within a framework that guarantees a systematic and rigorous approach can offer answers to many disputed issues, but researchers should not underestimate…
ERIC Educational Resources Information Center
Kinsler, Paul; Favaro, Alberto; McCall, Martin W.
2009-01-01
The Poynting vector is an invaluable tool for analysing electromagnetic problems. However, even a rigorous stress-energy tensor approach can still leave us with the question: is it best defined as E x H or as D x B? Typical electromagnetic treatments provide yet another perspective: they regard E x B as the appropriate definition, because E and B…
ERIC Educational Resources Information Center
Pianta, Robert C.
2011-01-01
There is widespread acknowledgement that the production of effective teaching and teachers is perhaps the critical component of education reform and innovation for improvement of student learning. This aim requires a serious investment of time, rigor, and evaluation to produce professional-development programs that actually work. This report…
A Geometric Comparison of the Transformation Loci with Specific and Mobile Capital
ERIC Educational Resources Information Center
Colander, David; Gilbert, John; Oladi, Reza
2008-01-01
The authors show how the transformation loci in the specific factors model (capital specificity) and the Heckscher-Ohlin-Samuelson model (capital mobility) can be rigorously derived and easily compared by using geometric techniques on the basis of Savosnick geometry. The approach shows directly that the transformation locus with capital…
A Study of Statistics through Tootsie Pops
ERIC Educational Resources Information Center
Aaberg, Shelby; Vitosh, Jason; Smith, Wendy
2016-01-01
A classic TV commercial once asked, "How many licks does it take to get to the center of a Tootsie Roll Tootsie Pop?" The narrator claims, "The world may never know" (Tootsie Roll 2012), but an Internet search returns a multitude of answers, some of which include rigorous systematic approaches by academics to address the…
Characterizing dispersal patterns in a threatened seabird with limited genetic structure
Laurie A. Hall; Per J. Palsboll; Steven R. Beissinger; James T. Harvey; Martine Berube; Martin G. Raphael; Kim Nelson; Richard T. Golightly; Laura McFarlane-Tranquilla; Scott H. Newman; M. Zachariah Peery
2009-01-01
Genetic assignment methods provide an appealing approach for characterizing dispersal patterns on ecological time scales, but require sufficient genetic differentiation to accurately identify migrants and a large enough sample size of migrants to, for example, compare dispersal between sexes or age classes. We demonstrate that assignment methods can be rigorously used...
ERIC Educational Resources Information Center
McFadden, Paula; Taylor, Brian J.; Campbell, Anne; McQuilkin, Janice
2012-01-01
Context: The development of a consolidated knowledge base for social work requires rigorous approaches to identifying relevant research. Method: The quality of 10 databases and a web search engine were appraised by systematically searching for research articles on resilience and burnout in child protection social workers. Results: Applied Social…
Assessment in Work-Based Learning: Investigating a Pedagogical Approach to Enhance Student Learning
ERIC Educational Resources Information Center
Brodie, Pandy; Irving, Kate
2007-01-01
Work-based learning (WBL) is undertaken in a wide variety of higher education contexts and is increasingly viewed as a valuable, and increasingly essential, component of both the undergraduate and postgraduate student learning experience. However, the development of rigorous pedagogies to underpin WBL and its assessment is still embryonic. This…
Uncertainty analysis: an evaluation metric for synthesis science
Mark E. Harmon; Becky Fasth; Charles B. Halpern; James A. Lutz
2015-01-01
The methods for conducting reductionist ecological science are well known and widely used. In contrast, those used in the synthesis of ecological science (i.e., synthesis science) are still being developed, vary widely, and often lack the rigor of reductionist approaches. This is unfortunate because the synthesis of ecological parts into a greater whole is...
Traditionally, the EPA has monitored aquatic ecosystems using statistically rigorous sample designs and intensive field efforts which provide high quality datasets. But by their nature they leave many aquatic systems unsampled, follow a top down approach, have a long lag between ...
ERIC Educational Resources Information Center
Helton, Nicole D.; Helton, William S.
2008-01-01
In his reply to our paper Marangudakis raises important points regarding: (1) the measurement of environmental values; and (2) potential risks of deep ecological views to human welfare. We definitely agree that a more rigorous approach to the measurement of environmental values is needed. While the extent of belief in deep ecology remains an open…
Poverty in People with Disabilities: Indicators from the Capability Approach
ERIC Educational Resources Information Center
Rosano, Aldo; Mancini, Federica; Solipaca, Alessandro
2009-01-01
People with disabilities are particularly exposed to poor living conditions: on the one hand, they have more difficulty earning an income because of their impairments; on the other hand, conditions of poverty increase the risk of disability. However, little rigorous quantitative research has been undertaken to measure the real impact of disability on…
Shakespeare and the Common Core: An Opportunity to Reboot
ERIC Educational Resources Information Center
Turchi, Laura; Thompson, Ayanna
2013-01-01
The Common Core generally eschews mandating texts in favor of promoting critical analysis and rigor. So it's significant that Shakespeare is the only author invoked in imperatives. His explicit inclusion offers a significant opportunity for educators to rethink how we approach Shakespearean instruction. Rather than the traditional learning of…
ERIC Educational Resources Information Center
Mount, Helen; Cavet, Judith
1995-01-01
This article addresses the controversy concerning multisensory environments for children and adults with profound and multiple learning difficulties, from a British perspective. The need for critical evaluation of such multisensory interventions as the "snoezelen" approach and the paucity of relevant, rigorous research on educational…
The Theoretical and Empirical Basis for Meditation as an Intervention for PTSD
ERIC Educational Resources Information Center
Lang, Ariel J.; Strauss, Jennifer L.; Bomyea, Jessica; Bormann, Jill E.; Hickman, Steven D.; Good, Raquel C.; Essex, Michael
2012-01-01
In spite of the existence of good empirically supported treatments for posttraumatic stress disorder (PTSD), consumers and providers continue to ask for more options for managing this common and often chronic condition. Meditation-based approaches are being widely implemented, but there is minimal research rigorously assessing their effectiveness.…
A More Rigorous Quasi-Experimental Alternative to the One-Group Pretest-Posttest Design.
ERIC Educational Resources Information Center
Johnson, Craig W.
1986-01-01
A simple quasi-experimental design is described which may have utility in a variety of applied and laboratory research settings where ordinarily the one-group pretest-posttest pre-experimental design might otherwise be the procedure of choice. The design approaches the internal validity of true experimental designs while optimizing external…
Complexity, Representation and Practice: Case Study as Method and Methodology
ERIC Educational Resources Information Center
Miles, Rebecca
2015-01-01
While case study is considered a common approach to examining specific and particular examples in research disciplines such as law, medicine and psychology, in the social sciences case study is often treated as a lesser, flawed or undemanding methodology which is less valid, reliable or theoretically rigorous than other methodologies. Building on…
Rigorous Tests of Student Outcomes in CTE Programs of Study: Final Report
ERIC Educational Resources Information Center
Castellano, Marisa; Sundell, Kirsten E.; Overman, Laura T.; Richardson, George B.; Stone, James R., III
2014-01-01
This study was designed to investigate the relationship between participation in federally mandated college and career-preparatory programs--known as programs of study (POS)--and high school achievement outcomes. POS are an organized approach to college and career readiness that offer an aligned sequence of courses spanning secondary and…
ERIC Educational Resources Information Center
McEvoy, Suzanne
2012-01-01
With changing U.S. demographics, higher numbers of diverse, low-income, first-generation students are underprepared for the academic rigors of four-year institutions, often requiring assistance and remedial and/or developmental coursework in English and mathematics. Without intervention approaches these students are at high risk for…
Visualizing, Rather than Deriving, Russell-Saunders Terms: A Classroom Activity with Quantum Numbers
ERIC Educational Resources Information Center
Coppo, Paolo
2016-01-01
A 1 h classroom activity is presented, aimed at consolidating the concepts of microstates and Russell-Saunders energy terms in transition metal atoms and coordination complexes. The unconventional approach, based on logic and intuition rather than rigorous mathematics, is designed to stimulate discussion and enhance familiarity with quantum…
Designer Librarian: Embedded in K12 Online Learning
ERIC Educational Resources Information Center
Boyer, Brenda
2015-01-01
Over the past two decades, shifts in technology have altered the roles of school librarians in a multitude of ways. New rigorous standards, proliferation of devices, and steady growth of online and blended learning for the K12 market now demand librarians engage with learners in online environments. Taking an instructional design approach is the…
Bray, Jeremy W.; Kelly, Erin L.; Hammer, Leslie B.; Almeida, David M.; Dearing, James W.; King, Rosalind B.; Buxton, Orfeu M.
2013-01-01
Recognizing a need for rigorous, experimental research to support the efforts of workplaces and policymakers in improving the health and wellbeing of employees and their families, the National Institutes of Health and the Centers for Disease Control and Prevention formed the Work, Family & Health Network (WFHN). The WFHN is implementing an innovative multisite study with a rigorous experimental design (adaptive randomization, control groups), comprehensive multilevel measures, a novel and theoretically based intervention targeting the psychosocial work environment, and translational activities. This paper describes challenges and benefits of designing a multilevel and transdisciplinary research network that includes an effectiveness study to assess intervention effects on employees, families, and managers; a daily diary study to examine effects on family functioning and daily stress; a process study to understand intervention implementation; and translational research to understand and inform diffusion of innovation. Challenges were both conceptual and logistical, spanning all aspects of study design and implementation. In dealing with these challenges, however, the WFHN developed innovative, transdisciplinary, multi-method approaches to conducting workplace research that will benefit both the research and business communities. PMID:24618878
Bhopal, Raj
2017-03-01
Rigorous evaluation of associations in epidemiology is essential, especially given big data, data mining, and hypothesis-free analyses. There is a precedent in making judgments on associations in the monographs of the International Agency for Research on Cancer; however, only the carcinogenic effects of exposures are examined there. The idea of a World Council of Epidemiology and Causality to undertake rigorous, independent, comprehensive examination of associations has been debated, including in a workshop at the International Epidemiology Association's 20th World Congress of Epidemiology, 2014. The objective of the workshop was both to debate the idea briefly and to set out further questions and next steps. The principal conclusion from the feedback, including from the workshop, is that the World Council of Epidemiology and Causality idea, notwithstanding challenges, has promise and deserves more debate. The preferred model is a small independent body working closely with relevant partners, with a distributed approach to tasks. Recommendations are contextualized in contemporary approaches to causal thinking in epidemiology.
NASA Technical Reports Server (NTRS)
Buffalano, C.; Fogleman, S.; Gielecki, M.
1976-01-01
A methodology is outlined which can be used to estimate the costs of research and development projects. The approach uses the Delphi technique, a method developed by the Rand Corporation for systematically eliciting and evaluating group judgments in an objective manner. The use of the Delphi technique allows expert opinion to be integrated into the cost-estimating process in a consistent and rigorous fashion. This approach can also signal potential cost-problem areas, a result that can be useful in planning additional cost analysis or in estimating contingency funds. A Monte Carlo approach is also examined.
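A toy sketch of how Delphi-elicited low/most-likely/high cost estimates might be combined by Monte Carlo sampling to size a contingency follows; the work packages, distributions, and dollar figures are invented for illustration.

```python
# Toy Monte Carlo aggregation of Delphi-style (low, most likely, high) cost
# estimates; work packages and dollar figures are invented for illustration.
import numpy as np

rng = np.random.default_rng(1)
work_packages = {              # (low, mode, high) in $M, as elicited by Delphi
    "design":      (2.0, 3.0, 5.0),
    "fabrication": (4.0, 6.0, 10.0),
    "test":        (1.0, 1.5, 3.0),
}

n = 100_000
total = sum(rng.triangular(lo, mode, hi, n) for lo, mode, hi in work_packages.values())

p50, p80 = np.percentile(total, [50, 80])
print(f"median cost ${p50:.1f}M; 80th percentile ${p80:.1f}M; "
      f"suggested contingency ${p80 - p50:.1f}M")
```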
NASA Technical Reports Server (NTRS)
Lewis, Robert Michael; Patera, Anthony T.; Peraire, Jaume
1998-01-01
We present a Neumann-subproblem a posteriori finite element procedure for the efficient and accurate calculation of rigorous, 'constant-free' upper and lower bounds for sensitivity derivatives of functionals of the solutions of partial differential equations. The design motivation for sensitivity derivative error control is discussed; the a posteriori finite element procedure is described; the asymptotic bounding properties and computational complexity of the method are summarized; and illustrative numerical results are presented.
Rigorous analysis of thick microstrip antennas and wire antennas embedded in a substrate
NASA Astrophysics Data System (ADS)
Smolders, A. B.
1992-07-01
An efficient and rigorous method for the analysis of electrically thick rectangular microstrip antennas and wire antennas with a dielectric cover is presented. The method of moments is used in combination with the exact spectral-domain Green's function in order to find the unknown currents on the antenna. The microstrip antenna is fed by a coaxial cable, and a proper model of the feeding coaxial structure is used. In addition, a special attachment mode is applied to ensure continuity of current at the patch-coax transition. The efficiency of the method of moments is improved by using the so-called source-term extraction technique, in which a large part of the infinite integrals involved in the method-of-moments formulation is calculated analytically. Computation time can be saved by selecting a set of basis functions that describes the current distribution on the patch and probe accurately using only a few terms. Thick microstrip antennas have broadband characteristics; however, a proper match to 50 ohms is often difficult. This matching problem can be avoided by using a slightly different excitation structure in which the patch is electromagnetically coupled to the feeding probe. A bandwidth of more than 40% can easily be obtained for this type of microstrip antenna. The price to be paid is a degradation of the radiation characteristics.
Increasing rigor in NMR-based metabolomics through validated and open source tools
Eghbalnia, Hamid R; Romero, Pedro R; Westler, William M; Baskaran, Kumaran; Ulrich, Eldon L; Markley, John L
2016-01-01
The metabolome, the collection of small molecules associated with an organism, is a growing subject of inquiry, with the data utilized for data-intensive systems biology, disease diagnostics, biomarker discovery, and the broader characterization of small molecules in mixtures. Owing to their close proximity to the functional endpoints that govern an organism's phenotype, metabolites are highly informative about functional states. The field of metabolomics identifies and quantifies endogenous and exogenous metabolites in biological samples. Information acquired from nuclear magnetic resonance (NMR) spectroscopy, mass spectrometry (MS), and the published literature, as processed by statistical approaches, is driving increasingly wider applications of metabolomics. This review focuses on the role of databases and software tools in advancing the rigor, robustness, reproducibility, and validation of metabolomics studies. PMID:27643760
Thomson, Hilary
2013-08-01
Systematic reviews have the potential to promote knowledge exchange between researchers and decision-makers. Review planning requires engagement with evidence users to ensure preparation of relevant reviews, and well-conducted reviews should provide accessible and reliable syntheses to support decision-making. Yet systematic reviews are not routinely referred to by decision-makers, and innovative approaches to improve the utility of reviews are needed. Evidence synthesis for healthy public policy is typically complex and methodologically challenging. Although these challenges do not lessen the value of reviews, they can be overwhelming and threaten their utility. Using the interrelated principles of relevance, rigor, and readability, and in light of available resources, this article considers how the utility of evidence synthesis for healthy public policy might be improved.
2013-01-01
Systematic reviews have the potential to promote knowledge exchange between researchers and decision-makers. Review planning requires engagement with evidence users to ensure preparation of relevant reviews, and well-conducted reviews should provide accessible and reliable syntheses to support decision-making. Yet systematic reviews are not routinely referred to by decision-makers, and innovative approaches to improve the utility of reviews are needed. Evidence synthesis for healthy public policy is typically complex and methodologically challenging. Although these challenges do not lessen the value of reviews, they can be overwhelming and threaten their utility. Using the interrelated principles of relevance, rigor, and readability, and in light of available resources, this article considers how the utility of evidence synthesis for healthy public policy might be improved. PMID:23763400
Increasing rigor in NMR-based metabolomics through validated and open source tools.
Eghbalnia, Hamid R; Romero, Pedro R; Westler, William M; Baskaran, Kumaran; Ulrich, Eldon L; Markley, John L
2017-02-01
The metabolome, the collection of small molecules associated with an organism, is a growing subject of inquiry, with the data utilized for data-intensive systems biology, disease diagnostics, biomarker discovery, and the broader characterization of small molecules in mixtures. Owing to their close proximity to the functional endpoints that govern an organism's phenotype, metabolites are highly informative about functional states. The field of metabolomics identifies and quantifies endogenous and exogenous metabolites in biological samples. Information acquired from nuclear magnetic resonance (NMR) spectroscopy, mass spectrometry (MS), and the published literature, as processed by statistical approaches, is driving increasingly wider applications of metabolomics. This review focuses on the role of databases and software tools in advancing the rigor, robustness, reproducibility, and validation of metabolomics studies.
Fisher, Philip A.; Gilliam, Kathryn S.
2017-01-01
Although many psychotherapeutic approaches exist for treating troubled children and their families, not all have been demonstrated to be effective through research. Moreover, among those that have been determined to be “evidence-based,” few have followed as coherent and rigorous a path of scientific investigation as the interventions that have been developed at the Oregon Social Learning Center. As such, these interventions serve as a model of “research to theory to practice” that may not only be employed to support families with children in need of treatment, but may also guide other programs of treatment development. This is the story of how this work has unfolded over the past four decades. PMID:29225459
Reviewing the methodology of an integrative review.
Hopia, Hanna; Latvala, Eila; Liimatainen, Leena
2016-12-01
Whittemore and Knafl's updated description of a methodological approach for integrative reviews was published in 2005. Since then, the five stages of the approach have been regularly used as the basic conceptual structure of integrative reviews conducted by nursing researchers. However, this methodological approach is seldom examined from the perspective of how systematically and rigorously the stages are implemented in published integrative reviews. The aim was to appraise selected integrative reviews on the basis of the methodological approach and the five stages published by Whittemore and Knafl in 2005. A literature review was used in this study. CINAHL (Cumulative Index to Nursing and Allied Health), PubMed, OVID (Journals@Ovid) and the Cochrane Library databases were searched for integrative reviews published between 2002 and 2014. Papers were included if they used the methodological approach described by Whittemore and Knafl, were published in English and were focused on nursing education or nursing expertise. A total of 259 integrative review publications were identified for potential inclusion. Ten integrative reviews fulfilled the inclusion criteria. Findings from the studies were extracted and critically examined according to the five methodological stages. The reviews assessed followed the guidelines of the stated methodological approach to differing extents. The stages of literature search, data evaluation and data analysis were fairly poorly formulated and only partially implemented in the studies included in the sample. The other two stages, problem identification and presentation, followed those described in the methodological approach quite well. Increasing use of research in clinical practice is inevitable, and therefore integrative reviews can play a greater role in developing evidence-based nursing practices. Because of this, nurse researchers should pay more attention to sound integrative nursing research to systematise the review process and make it more rigorous.
Facilitating long-term changes in student approaches to learning science.
Buchwitz, Brian J; Beyer, Catharine H; Peterson, Jon E; Pitre, Emile; Lalic, Nevena; Sampson, Paul D; Wakimoto, Barbara T
2012-01-01
Undergraduates entering science curricula differ greatly in individual starting points and learning needs. The fast pace, high enrollment, and high stakes of introductory science courses, however, limit students' opportunities to self-assess and modify learning strategies. The University of Washington's Biology Fellows Program (BFP) intervenes through a 20-session, premajors course that introduces students to the rigor expected of bioscience majors and assists their development as science learners. This study uses quantitative and qualitative approaches to assess whether the 2007-2009 BFP achieved its desired short- and long-term impacts on student learning. Adjusting for differences in students' high school grade point average and Scholastic Aptitude Test scores, we found that participation in the BFP was associated with higher grades in two subsequent gateway biology courses, across multiple quarters and instructors. Two to 4 yr after participating in the program, students attributed changes in how they approached learning science to BFP participation. They reported having learned to "think like a scientist" and to value active-learning strategies and learning communities. In addition, they reported having developed a sense of belonging in bioscience communities. The achievement of long-term impacts for a short-term instructional investment suggests a practical means to prepare diverse students for the rigors of science curricula.
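The covariate adjustment described (course grades regressed on program participation while controlling for high-school GPA and SAT scores) can be sketched as below; the variable names, data, and coefficients are placeholders, not the study's data.

```python
# Sketch of the covariate adjustment described: gateway-course grade regressed
# on BFP participation, controlling for HS GPA and SAT. Data are placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 300
df = pd.DataFrame({
    "bfp": rng.integers(0, 2, n),                    # 1 = participated in BFP
    "hs_gpa": rng.normal(3.5, 0.3, n),
    "sat": rng.normal(1200, 150, n),
})
df["grade"] = 1.0 + 0.2 * df.bfp + 0.5 * df.hs_gpa + 0.001 * df.sat + rng.normal(0, 0.3, n)

model = smf.ols("grade ~ bfp + hs_gpa + sat", data=df).fit()
print(model.params["bfp"])   # adjusted association of BFP with course grade
```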
Solubility advantage of amorphous pharmaceuticals: I. A thermodynamic analysis.
Murdande, Sharad B; Pikal, Michael J; Shanker, Ravi M; Bogner, Robin H
2010-03-01
In recent years there has been growing interest in advancing amorphous pharmaceuticals as an approach for achieving adequate solubility. Due to difficulties in the experimental measurement of solubility, a reliable estimate of the solubility enhancement ratio of an amorphous form of a drug relative to its crystalline counterpart would be highly useful. We have developed a rigorous thermodynamic approach to estimate the enhancement in solubility that can be achieved by conversion of a crystalline form to the amorphous form. We rigorously treat the three factors that contribute to differences in solubility between amorphous and crystalline forms. First, we calculate the free energy difference between amorphous and crystalline forms from thermal properties measured by modulated differential scanning calorimetry (MDSC). Second, since an amorphous solute can absorb significant amounts of water, which reduces its activity and solubility, a correction is made using water sorption isotherm data and the Gibbs-Duhem equation. Next, a correction is made for differences in the degree of ionization due to differences in solubilities of the two forms. Using this approach, the theoretically estimated solubility enhancement ratio of 7.0 for indomethacin (amorphous/gamma-crystal) was found to be in close agreement with the experimentally determined ratio of 4.9.
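A hypothetical worked example of the three-factor estimate is sketched below: a free-energy term of the form exp(ΔG/RT) multiplied by placeholder corrections for water-induced activity reduction and for ionization. The numbers are illustrative and are not taken from the paper.

```python
# Hypothetical worked example of the three-factor solubility-ratio estimate:
# (1) free-energy term exp(dG/RT), (2) activity reduction from absorbed water,
# (3) ionization correction. All numbers are illustrative, not the paper's data.
import math

R, T = 8.314, 298.15              # J/(mol K), K
dG = 6.0e3                        # J/mol, amorphous - crystalline free energy (assumed)
free_energy_ratio = math.exp(dG / (R * T))

water_activity_correction = 0.85  # amorphous activity reduced by sorbed water (assumed)
ionization_correction = 1.05      # small difference in ionized fraction (assumed)

ratio = free_energy_ratio * water_activity_correction * ionization_correction
print(f"estimated amorphous/crystalline solubility ratio ~ {ratio:.1f}")
```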
NASA Astrophysics Data System (ADS)
Lee, K. David; Wiesenfeld, Eric; Gelfand, Andrew
2007-04-01
One of the greatest challenges in modern combat is maintaining a high level of timely Situational Awareness (SA). In many situations, computational complexity and accuracy considerations make the development and deployment of real-time, high-level inference tools very difficult. An innovative hybrid framework that combines Bayesian inference, in the form of Bayesian Networks, and Possibility Theory, in the form of Fuzzy Logic systems, has recently been introduced to provide a rigorous framework for high-level inference. Previous research has developed the theoretical basis and benefits of the hybrid approach. What has been lacking, however, is a concrete experimental comparison of the hybrid framework with traditional fusion methods to demonstrate and quantify this benefit. The goal of this research, therefore, is to provide a statistical comparison of the accuracy and performance of the hybrid approach against pure Bayesian and Fuzzy systems and an inexact Bayesian system approximated using Particle Filtering. To accomplish this task, domain-specific models will be developed under these different theoretical approaches and then evaluated, via Monte Carlo simulation, against situational ground truth to measure accuracy and fidelity. Following this, a rigorous statistical analysis of the performance results will be performed to quantify the benefit of hybrid inference over other fusion tools.
Wong, Michelle; Bejarano, Esther; Carvlin, Graeme; Fellows, Katie; King, Galatea; Lugo, Humberto; Jerrett, Michael; Meltzer, Dan; Northcross, Amanda; Olmedo, Luis; Seto, Edmund; Wilkie, Alexa; English, Paul
2018-03-15
Air pollution continues to be a global public health threat, and the expanding availability of small, low-cost air sensors has led to increased interest in both personal and crowd-sourced air monitoring. However, to date, few low-cost air monitoring networks have been developed with the scientific rigor or continuity needed to conduct public health surveillance and inform policy. In Imperial County, California, near the U.S./Mexico border, we used a collaborative, community-engaged process to develop a community air monitoring network that attains the scientific rigor required for research, while also achieving community priorities. By engaging community residents in the project design, monitor siting processes, data dissemination, and other key activities, the resulting air monitoring network data are relevant, trusted, understandable, and used by community residents. Integration of spatial analysis and air monitoring best practices into the network development process ensures that the data are reliable and appropriate for use in research activities. This combined approach results in a community air monitoring network that is better able to inform community residents, support research activities, guide public policy, and improve public health. Here we detail the monitor siting process and outline the advantages and challenges of this approach.
Wong, Michelle; Bejarano, Esther; Carvlin, Graeme; King, Galatea; Lugo, Humberto; Jerrett, Michael; Northcross, Amanda; Olmedo, Luis; Seto, Edmund; Wilkie, Alexa; English, Paul
2018-01-01
Air pollution continues to be a global public health threat, and the expanding availability of small, low-cost air sensors has led to increased interest in both personal and crowd-sourced air monitoring. However, to date, few low-cost air monitoring networks have been developed with the scientific rigor or continuity needed to conduct public health surveillance and inform policy. In Imperial County, California, near the U.S./Mexico border, we used a collaborative, community-engaged process to develop a community air monitoring network that attains the scientific rigor required for research, while also achieving community priorities. By engaging community residents in the project design, monitor siting processes, data dissemination, and other key activities, the resulting air monitoring network data are relevant, trusted, understandable, and used by community residents. Integration of spatial analysis and air monitoring best practices into the network development process ensures that the data are reliable and appropriate for use in research activities. This combined approach results in a community air monitoring network that is better able to inform community residents, support research activities, guide public policy, and improve public health. Here we detail the monitor siting process and outline the advantages and challenges of this approach. PMID:29543726
Facilitating Long-Term Changes in Student Approaches to Learning Science
Buchwitz, Brian J.; Beyer, Catharine H.; Peterson, Jon E.; Pitre, Emile; Lalic, Nevena; Sampson, Paul D.; Wakimoto, Barbara T.
2012-01-01
Undergraduates entering science curricula differ greatly in individual starting points and learning needs. The fast pace, high enrollment, and high stakes of introductory science courses, however, limit students’ opportunities to self-assess and modify learning strategies. The University of Washington's Biology Fellows Program (BFP) intervenes through a 20-session, premajors course that introduces students to the rigor expected of bioscience majors and assists their development as science learners. This study uses quantitative and qualitative approaches to assess whether the 2007–2009 BFP achieved its desired short- and long-term impacts on student learning. Adjusting for differences in students’ high school grade point average and Scholastic Aptitude Test scores, we found that participation in the BFP was associated with higher grades in two subsequent gateway biology courses, across multiple quarters and instructors. Two to 4 yr after participating in the program, students attributed changes in how they approached learning science to BFP participation. They reported having learned to “think like a scientist” and to value active-learning strategies and learning communities. In addition, they reported having developed a sense of belonging in bioscience communities. The achievement of long-term impacts for a short-term instructional investment suggests a practical means to prepare diverse students for the rigors of science curricula. PMID:22949424
A new algorithm for construction of coarse-grained sites of large biomolecules.
Li, Min; Zhang, John Z H; Xia, Fei
2016-04-05
The development of coarse-grained (CG) models for large biomolecules remains a challenge in multiscale simulations, including a rigorous definition of CG representations for them. In this work, we propose a new stepwise optimization with boundary constraints (SOBC) algorithm to construct the CG sites of large biomolecules, based on the scheme of essential dynamics coarse-graining. By means of SOBC, we can rigorously derive the CG representations of biomolecules with less computational cost. The SOBC is particularly efficient for the CG definition of large systems with thousands of residues. The resulting CG sites can be parameterized as a CG model using the normal-mode-analysis-based fluctuation matching method. Through normal mode analysis, the obtained modes of the CG model can accurately reflect the functionally related slow motions of biomolecules. The SOBC algorithm can be used for the construction of CG sites of large biomolecules such as F-actin and for the study of mechanical properties of biomaterials.
Considering Research Outcomes as Essential Tools for Medical Education Decision Making.
Miller, Karen Hughes; Miller, Bonnie M; Karani, Reena
2015-11-01
As medical educators face the challenge of incorporating new content, learning methods, and assessment techniques into the curriculum, the need for rigorous medical education research to guide efficient and effective instructional planning increases. When done properly, well-designed education research can provide guidance for complex education decision making. In this Commentary, the authors consider the 2015 Research in Medical Education (RIME) research and review articles in terms of the critical areas in teaching and learning that they address. The broad categories include (1) assessment (the largest collection of RIME articles, including both feedback from learners and instructors and the reliability of learner assessment), (2) the institution's impact on the learning environment, (3) what can be learned from program evaluation, and (4) emerging issues in faculty development. While the articles in this issue are broad in scope and potential impact, the RIME committee noted few studies of sufficient rigor focusing on areas of diversity and diverse learners. Although challenging to investigate, the authors encourage continuing innovation in research focused on these important areas.
NASA Astrophysics Data System (ADS)
Reis, T.; Phillips, T. N.
2008-12-01
In this reply to the comment by Lallemand and Luo, we defend our assertion that the alternative approach for the solution of the dispersion relation for a generalized lattice Boltzmann dispersion equation [T. Reis and T. N. Phillips, Phys. Rev. E 77, 026702 (2008)] is mathematically transparent, elegant, and easily justified. Furthermore, the rigorous perturbation analysis used by Reis and Phillips does not require the reciprocals of the relaxation parameters to be small.
Using constraints and their value for optimization of large ODE systems
Domijan, Mirela; Rand, David A.
2015-01-01
We provide analytical tools to facilitate a rigorous assessment of the quality and value of the fit of a complex model to data. We use these to provide approaches to model fitting, parameter estimation, the design of optimization functions and experimental optimization. This is in the context where multiple constraints are used to select or optimize a large model defined by differential equations. We illustrate the approach using models of circadian clocks and the NF-κB signalling system. PMID:25673300
Geometrical optics in the near field: local plane-interface approach with evanescent waves.
Bose, Gaurav; Hyvärinen, Heikki J; Tervo, Jani; Turunen, Jari
2015-01-12
We show that geometrical models may provide useful information on light propagation in wavelength-scale structures even if evanescent fields are present. We apply so-called local plane-wave and local plane-interface methods to study a geometry that resembles a scanning near-field microscope. We show that fair agreement between the geometrical approach and rigorous electromagnetic theory can be achieved in the case where evanescent waves are required to predict any transmission through the structure.
Why health care corruption needs a new approach.
Radin, Dagmar
2016-07-01
While corruption has been at the center of academic studies and on the agenda of international organizations for a couple of decades, corruption in the health care sector has not generated much interest or progress. At the centre of this issue is the lack of an interdisciplinary approach, which is warranted given the complexity of the problem, and the lack of cooperation between scientifically rigorous academics and policy-makers, leaving room for more cooperation and progress.
Bamberger, Michael; Tarsilla, Michele; Hesse-Biber, Sharlene
2016-04-01
Many widely-used impact evaluation designs, including randomized controlled trials (RCTs) and quasi-experimental designs (QEDs), frequently fail to detect what are often quite serious unintended consequences of development programs. This seems surprising, as experienced planners and evaluators are well aware that unintended consequences frequently occur. Most evaluation designs are intended to determine whether there is credible evidence (statistical, theory-based or narrative) that programs have achieved their intended objectives, and the logic of many evaluation designs, even those considered the most "rigorous," does not permit the identification of outcomes that were not specified in the program design. We take the example of RCTs as they are considered by many to be the most rigorous evaluation designs. We present a number of cases to illustrate how infusing RCTs with a mixed-methods approach (sometimes called an "RCT+" design) can strengthen the credibility of these designs and can also capture important unintended consequences. We provide a Mixed Methods Evaluation Framework that identifies nine ways in which unintended consequences (UCs) can occur, and we apply this framework to two of the case studies.
Fast synthesis of topographic mask effects based on rigorous solutions
NASA Astrophysics Data System (ADS)
Yan, Qiliang; Deng, Zhijie; Shiely, James
2007-10-01
Topographic mask effects can no longer be ignored at technology nodes of 45 nm, 32 nm and beyond. As feature sizes become comparable to the mask topographic dimensions and the exposure wavelength, the popular thin-mask model breaks down, because the mask transmission no longer follows the layout. A reliable mask transmission function has to be derived from the Maxwell equations. Unfortunately, rigorous solutions of the Maxwell equations are only manageable for limited field sizes and impractical for full-chip optical proximity correction (OPC) due to the prohibitive runtime. Approximation algorithms are in demand to achieve a balance between acceptable computation time and tolerable errors. In this paper, a fast algorithm is proposed and demonstrated to model topographic mask effects for OPC applications. The ProGen Topographic Mask (POTOMAC) model synthesizes the mask transmission functions from small-sized Maxwell solutions produced by a finite-difference time-domain (FDTD) engine, an industry-leading rigorous simulator of topographic mask effects from SOLID-E. The integral framework presents a seamless solution to the end user. Preliminary results indicate that the overhead introduced by POTOMAC is contained within the same order of magnitude in comparison to the thin-mask approach.
NASA Astrophysics Data System (ADS)
Nugraheni, Z.; Budiyono, B.; Slamet, I.
2018-03-01
Reaching higher-order thinking skills (HOTS) requires mastery of conceptual understanding and strategic competence, two basic components of HOTS. Rigorous Mathematical Thinking (RMT) is a unique realization of the cognitive conceptual construction approach based on Feuerstein's theory of Mediated Learning Experience (MLE) and Vygotsky's sociocultural theory. This was a quasi-experimental study comparing an experimental class taught with RMT as the learning method and a control class taught with Direct Learning (DL) as the conventional learning activity. The study examined whether the two learning models had different effects on the conceptual understanding and strategic competence of junior high school students. The data were analyzed using Multivariate Analysis of Variance (MANOVA), which showed a significant difference between the experimental and control classes when mathematics conceptual understanding and strategic competence were considered jointly (Wilks' Λ = 0.84). Further, independent t-tests showed a significant difference between the two classes on both mathematical conceptual understanding and strategic competence. These results indicate that Rigorous Mathematical Thinking (RMT) had a positive impact on mathematics conceptual understanding and strategic competence.
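The joint test reported (a MANOVA of conceptual understanding and strategic competence across the two learning methods, summarized by Wilks' Λ) can be sketched as follows; the data and column names are placeholders and the API shown is statsmodels'.

```python
# Sketch of the joint MANOVA test described (conceptual understanding and
# strategic competence vs. learning method). Data and names are placeholders.
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(3)
n = 60
df = pd.DataFrame({
    "method": np.repeat(["RMT", "DL"], n // 2),
    "conceptual": rng.normal(70, 10, n),
    "strategic": rng.normal(65, 10, n),
})

fit = MANOVA.from_formula("conceptual + strategic ~ method", data=df)
print(fit.mv_test())   # reports Wilks' lambda, Pillai's trace, etc.
```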
Sliding mode control for Mars entry based on extended state observer
NASA Astrophysics Data System (ADS)
Lu, Kunfeng; Xia, Yuanqing; Shen, Ganghui; Yu, Chunmei; Zhou, Liuyu; Zhang, Lijun
2017-11-01
This paper presents a high-precision Mars entry guidance and control approach based on sliding mode control (SMC) and an Extended State Observer (ESO). First, a differential flatness (DF) approach is applied to the dynamic equations of the entry vehicle to represent the state variables more conveniently. Then, the presented SMC law guarantees finite-time convergence of the tracking error and requires no information about the large uncertainties, which are instead estimated by the ESO; a rigorous proof of tracking-error convergence is given. Finally, Monte Carlo simulation results are presented to demonstrate the effectiveness of the suggested approach.
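A generic second-order illustration of the SMC-plus-ESO pairing (not the Mars entry dynamics) is sketched below: a linear extended state observer estimates the lumped disturbance, and a sliding mode law uses that estimate to track a reference. All dynamics, gains, and signals are invented for the sketch.

```python
# Generic illustration (not the Mars entry model): a linear extended state
# observer (ESO) estimates the lumped disturbance of a second-order plant,
# and a sliding mode law uses that estimate for reference tracking.
import numpy as np

dt, T = 1e-3, 10.0
steps = int(T / dt)
b = 1.0                                   # known input gain
c, k = 2.0, 5.0                           # sliding-surface slope, switching gain
l1, l2, l3 = 300.0, 3e4, 1e6              # ESO gains (high-gain observer)

x1, x2 = 0.0, 0.0                         # plant state
z1, z2, z3 = 0.0, 0.0, 0.0                # ESO state (z3 ~ lumped disturbance)

for i in range(steps):
    t = i * dt
    r, rd, rdd = np.sin(t), np.cos(t), -np.sin(t)      # reference and derivatives
    d = 0.5 * np.sin(3 * t) + 0.2                      # unknown disturbance

    # sliding surface on the tracking error, using ESO estimates
    e, ed = z1 - r, z2 - rd
    s = c * e + ed
    u = (-z3 + rdd - c * ed - k * np.tanh(s / 0.05)) / b   # tanh smooths sign(s)

    # plant (Euler step)
    x1 += dt * x2
    x2 += dt * (d + b * u)

    # linear ESO update driven by the measured output x1
    err = x1 - z1
    z1 += dt * (z2 + l1 * err)
    z2 += dt * (z3 + l2 * err + b * u)
    z3 += dt * (l3 * err)

print("final tracking error:", abs(x1 - np.sin(T)))
```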
Linari, Marco; Caremani, Marco; Piperio, Claudia; Brandt, Philip; Lombardi, Vincenzo
2007-04-01
The stiffness of the single myosin motor (epsilon) is determined in skinned fibers from rabbit psoas muscle by both mechanical and thermodynamic approaches. Changes in the elastic strain of the half-sarcomere (hs) are measured by fast mechanics both in rigor, when all myosin heads are attached, and during active contraction, with the isometric force (T0) modulated by changing either [Ca²⁺] or temperature. The hs compliance is 43.0 ± 0.8 nm MPa⁻¹ in isometric contraction at saturating [Ca²⁺], whereas in rigor it is 28.2 ± 1.1 nm MPa⁻¹. The equivalent compliance of the myofilaments is 21.0 ± 3.3 nm MPa⁻¹. Accordingly, the stiffness of the ensemble of myosin heads attached in the hs is 45.5 ± 1.7 kPa nm⁻¹ in isometric contraction at saturating [Ca²⁺] (e0), and in rigor (er) it rises to 138.9 ± 21.2 kPa nm⁻¹. Epsilon, calculated from er and the lattice molecular dimensions, is 1.21 ± 0.18 pN nm⁻¹. Epsilon estimated, using a thermodynamic approach, from the relation of T0 at saturating [Ca²⁺] versus the reciprocal of absolute temperature is 1.25 ± 0.14 pN nm⁻¹, similar to that estimated for fibers in rigor. Consequently, the ratio e0/er (0.33 ± 0.05) can be used to estimate the fraction of attached heads during isometric contraction at saturating [Ca²⁺]. If the osmotic agent dextran T-500 (4 g/100 ml) is used to reduce the lateral filament spacing of the relaxed fiber to the value before skinning, both e0 and er increase by approximately 40%. Epsilon becomes approximately 1.7 pN nm⁻¹, and the fraction and force of myosin heads attached in isometric contraction remain the same as before dextran application. The finding that the fraction of myosin heads attached to actin in an isometric contraction is 0.33 rules out the hypothesis of multiple mechanical cycles per ATP hydrolyzed.
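The key ratio quoted above, written out as a worked equation using the stiffness values from the abstract:

```latex
% Fraction of myosin heads attached in isometric contraction, from the
% half-sarcomere stiffness values quoted in the abstract:
\[
  \frac{e_0}{e_r} \;=\; \frac{45.5~\mathrm{kPa\,nm^{-1}}}{138.9~\mathrm{kPa\,nm^{-1}}} \;\approx\; 0.33
\]
```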
Identifying apparent local stable isotope equilibrium in a complex non-equilibrium system.
He, Yuyang; Cao, Xiaobin; Wang, Jianwei; Bao, Huiming
2018-02-28
Although they are out of equilibrium, biomolecules in organisms have the potential to approach isotope equilibrium locally because enzymatic reactions are intrinsically reversible. A rigorous approach that can describe the isotope distribution among biomolecules and their apparent deviation from the equilibrium state has been lacking, however. Applying the concept of a distance matrix from graph theory, we propose that apparent local isotope equilibrium among a subset of biomolecules can be assessed using an apparent fractionation difference (|Δα|) matrix, in which the differences between the observed isotope composition (δ') and the calculated equilibrium fractionation factor (1000lnβ) can be evaluated more rigorously than with a previous approach for multiple biomolecules. We tested our |Δα| matrix approach by re-analyzing published data on different amino acids (AAs) in potato and in green algae. Our re-analysis shows that biosynthesis pathways could be the reason for an apparently close-to-equilibrium relationship within AA families in potato leaves. Different biosynthesis/degradation pathways in tubers may have led to the observed difference in isotope distribution between potato leaves and tubers. The analysis of data from green algae does not support the conclusion that AAs are further from equilibrium in glucose-cultured green algae than in autotrophic ones. Application of the |Δα| matrix can help to locate potential reversible reactions or reaction networks in a complex system such as a metabolic system. The same approach can be broadly applied to all complex systems that have multiple components, e.g. geochemical or atmospheric systems of the early Earth or other planets.
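A schematic sketch of the pairwise matrix idea follows; the exact entry definition used here (the observed δ′ difference between two molecules minus the corresponding difference in 1000 lnβ) is an assumption about the paper's construction, and the values are invented.

```python
# Schematic |delta-alpha| matrix: for each pair of molecules, compare the
# observed isotope difference (delta-prime values) with the difference of the
# calculated equilibrium 1000*ln(beta) factors. The exact entry definition is
# an assumption about the paper's construction; values are invented.
import numpy as np

molecules = ["Ala", "Gly", "Ser", "Asp"]
delta_obs = np.array([-22.0, -18.5, -20.1, -19.0])     # observed delta' (permil)
ln_beta = np.array([70.2, 66.0, 68.3, 67.1])           # 1000 ln(beta), calculated

# |Delta-alpha|_ij = |(delta_i - delta_j) - (1000 ln beta_i - 1000 ln beta_j)|
d_obs = delta_obs[:, None] - delta_obs[None, :]
d_eq = ln_beta[:, None] - ln_beta[None, :]
abs_delta_alpha = np.abs(d_obs - d_eq)

print(abs_delta_alpha.round(2))   # small entries suggest apparent local equilibrium
```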
Comparative Effectiveness Research in Oncology
2013-01-01
Although randomized controlled trials represent the gold standard for comparative effectiveness research (CER), a number of additional methods are available when randomized controlled trials are lacking or are inconclusive because of their limitations. In addition to more relevant, efficient, and generalizable trials, there is a need for complementary approaches that apply rigorous methodology while fully recognizing their inherent limitations. CER is an important construct for defining and summarizing evidence on effectiveness and safety and for comparing the value of competing strategies, so that patients, providers, and policymakers can be offered appropriate recommendations for optimal patient care. Nevertheless, methodological as well as political and social challenges for CER remain. CER requires constant and sophisticated methodological oversight of study design and analysis, similar to that required for randomized trials, to reduce the potential for bias. At the same time, if appropriately conducted, CER offers an opportunity to identify the most effective and safe approach to patient care. Despite rising and unsustainable health care costs, an even greater challenge to the implementation of CER arises from a social and political environment that questions the very motives and goals of CER. Oncologists and oncology professional societies are uniquely positioned to provide informed clinical and methodological expertise to steer the appropriate application of CER toward critical discussions of health care costs, cost-effectiveness, and the comparative value of the available options for appropriate care of patients with cancer. PMID:23697601
A Fast Vector Radiative Transfer Model for Atmospheric and Oceanic Remote Sensing
NASA Astrophysics Data System (ADS)
Ding, J.; Yang, P.; King, M. D.; Platnick, S. E.; Meyer, K.
2017-12-01
A fast vector radiative transfer model is developed in support of atmospheric and oceanic remote sensing. This model is capable of simulating the Stokes vector observed at the top of the atmosphere (TOA) and at the terrestrial surface by accounting for absorption, scattering, and emission. Gas absorption is parameterized in terms of atmospheric gas concentrations, temperature, and pressure. The parameterization scheme combines a regression method with the correlated-k distribution method and can be easily integrated with multiple-scattering computations. The approach is more than four orders of magnitude faster than a line-by-line radiative transfer model, with transmissivity errors of less than 0.5%. A two-component approach is used to solve the vector radiative transfer equation (VRTE). The VRTE solver separates the phase matrices of aerosol and cloud into forward and diffuse parts, so the solution is likewise separated. The forward solution can be expressed by a semi-analytical equation based on the small-angle approximation and serves as the source for the diffuse part. The diffuse part is solved by the adding-doubling method. The adding-doubling implementation is computationally efficient because the diffuse component requires far fewer spherical-function expansion terms. The simulated Stokes vectors at both the TOA and the surface have accuracy comparable to counterparts computed with numerically rigorous methods.
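The adding-doubling step mentioned above has a compact matrix form: multiple reflections between two layers are summed through a matrix inverse, and a thin layer is doubled repeatedly until the target optical thickness is reached. The sketch below is a scalar, unpolarized Python illustration of that doubling step for a symmetric homogeneous layer, not the polarized VRTE solver described here; the starting reflection and transmission operators are hypothetical placeholders.

```python
import numpy as np

# Minimal doubling sketch (scalar, unpolarized, symmetric layer): given the
# reflection (R) and transmission (T) operators of a thin layer defined over a
# set of quadrature streams, combine two identical layers repeatedly.

def double_layer(R: np.ndarray, T: np.ndarray):
    """Return (R2, T2) for two identical layers stacked together."""
    n = R.shape[0]
    # Sum of multiple reflections between the two layers: (I - R R)^-1
    interaction = np.linalg.inv(np.eye(n) - R @ R)
    R2 = R + T @ interaction @ R @ T
    T2 = T @ interaction @ T
    return R2, T2

# Hypothetical starting operators for a very thin layer (placeholders only).
n_streams = 4
R = 0.01 * np.full((n_streams, n_streams), 1.0 / n_streams)
T = 0.98 * np.eye(n_streams)

# Double k times to reach a layer 2**k times thicker than the starting one.
for _ in range(10):
    R, T = double_layer(R, T)

print("reflection operator after doubling:\n", np.round(R, 4))
```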
Bellasio, Chandra; Beerling, David J; Griffiths, Howard
2016-06-01
The higher photosynthetic potential of C4 plants has led to extensive research over the past 50 years, spanning C4-dominated natural biomes, crops such as maize, and efforts to transfer C4 traits into C3 lineages. Photosynthetic gas exchange can be measured in air or in a 2% oxygen mixture using readily available commercial gas-exchange and modulated PSII fluorescence systems. Interpretation of these data, however, requires an understanding (or the development) of various modelling approaches, which limits their use by non-specialists. In this paper we present an accessible summary of the theory behind the analysis and derivation of C4 photosynthetic parameters, and provide a freely available Excel Fitting Tool (EFT), making rigorous C4 data analysis accessible to a broader audience. Outputs include those defining C4 photochemical and biochemical efficiency, the rate of photorespiration, bundle-sheath conductance to CO2 diffusion, and the in vivo biochemical constants for PEP carboxylase. The EFT compares several methodological variants proposed by different investigators, allowing users to choose the level of complexity required to interpret data. We provide a complete analysis of gas-exchange data on maize (as a model C4 organism and key global crop) to illustrate the approaches, their analysis and interpretation. © 2016 John Wiley & Sons Ltd.
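One building block in this kind of combined gas-exchange/fluorescence analysis is the estimate of total electron transport from modulated PSII fluorescence. Below is a minimal Python sketch of that commonly used relation; whether the EFT uses exactly this form and these default parameters is an assumption made here for illustration.

```python
# Minimal sketch: estimating total electron transport rate (J) from modulated
# PSII fluorescence, a relation commonly used when interpreting combined
# gas-exchange/fluorescence data. The default absorptance and PSII partitioning
# values below are illustrative assumptions, not values from the paper.

def electron_transport_rate(phi_psii: float,
                            ppfd: float,
                            absorptance: float = 0.85,
                            psii_fraction: float = 0.5) -> float:
    """J (umol e- m-2 s-1) = PhiPSII * PPFD * leaf absorptance * fraction of light to PSII."""
    return phi_psii * ppfd * absorptance * psii_fraction

# Example: PhiPSII = 0.30 at PPFD = 1500 umol photons m-2 s-1
print(electron_transport_rate(0.30, 1500.0))  # ~191 umol e- m-2 s-1
```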
Doyle, D M; Dauterive, R; Chuang, K H; Ellrodt, A G
2001-11-01
There are many challenges to effectively and efficiently translating evidence into practice. Potential strategies include (1) training more evidence-based practitioners in the art and science of evidence-based medicine, (2) enhancing the quality and availability of systematic reviews, and (3) more effectively linking evidence-based practitioners and evidence users through comprehensive behavioral change initiatives. Herein we explore the third strategy and highlight the key elements of success for a program using behavioral change strategies. We present a clinical model based on a clear understanding of the "problem," a systematic approach to diagnosis, selection of scientifically sound treatment options, and effective evaluation with appropriate modification of the treatment plan. A successful program begins with effective team leadership, the expression of a clinically compelling case for change, and commitment to the pursuit of perfection in the delivery of key evidence-based interventions. The team must then diagnose behavioral barriers to change, using a systematic approach based on a published, rigorous differential-diagnosis framework. This diagnostic step provides the foundation for selecting effective dissemination and implementation strategies (treatments) proven to improve processes of care and clinical outcomes. Finally, the team must evaluate progress toward perfection, reviewing interim data and adjusting the treatment regimen to address newly diagnosed barriers. We then present a specific project (improving pneumococcal immunization rates in our rural community) and interim results to demonstrate the use of the framework in the real world.
Zhang, Wenjun; Wang, Ming L.; Khalili, Sammy
2016-01-01
We live in exciting times for a new generation of biomarkers, enabled by advances in the design and use of biomaterials for medical and clinical applications, from nano- to macro-scale materials and from proteins to tissues. Key challenges arise, however, from both the scientific complexity and the compatibility of the interface between biology and engineered materials. The linking of mechanisms across scales by using a materials-science approach to provide structure–process–property relations characterizes the emerging field of 'materiomics,' which offers enormous promise to provide the hitherto missing tools for biomaterial development for clinical diagnostics and next-generation biomarker applications for personal health monitoring. In other words, the emerging field of materiomics represents a systematic approach to the investigation of biological material systems, integrating natural functions and processes with traditional materials-science perspectives. Here we outline how materiomics provides a game-changing technology platform for disruptive innovation in biomaterial science to enable the design of tailored and functional biomaterials—particularly the design and screening of DNA aptamers for targeting biomarkers related to oral diseases and oral health monitoring. Rigorous and complementary computational modeling and experimental techniques will provide an efficient means to develop new clinical technologies in silico, greatly accelerating the translation of materiomics-driven oral health diagnostics from concept to practice in the clinic. PMID:26760957
Uninformative Prior Multiple Target Tracking Using Evidential Particle Filters
NASA Astrophysics Data System (ADS)
Worthy, J. L., III; Holzinger, M. J.
Space situational awareness requires the ability to initialize state estimation from short measurements and to reliably associate observations in support of characterizing the space environment. The electro-optical systems used to observe space objects cannot fully characterize the state of an object given a short, unobservable sequence of measurements. Further, it is difficult to associate such short-arc measurements when many of them are generated by observation of a cluster of satellites, debris from a satellite break-up, or spurious detections of an object. An optimization-based, probabilistic short-arc observation association approach coupled with a Dempster-Shafer-based evidential particle filter in a multiple-target tracking framework is developed to address these problems. The optimization-based approach has been shown in the literature to be computationally efficient and can produce association probabilities, state estimates, and covariances while accounting for systematic errors. Rigorous application of Dempster-Shafer theory is effective at properly accounting for ignorance in estimation by augmenting probability with belief and plausibility. The proposed multiple-hypothesis framework uses a non-exclusive hypothesis formulation of Dempster-Shafer theory to assign belief mass to candidate association pairs and to generate tracks based on the belief-to-plausibility ratio. The proposed algorithm is demonstrated using simulated observations of a GEO satellite break-up scenario.
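The belief and plausibility quantities referred to here are the standard Dempster-Shafer constructs. The sketch below is a minimal Python illustration of a basic mass assignment, Dempster's rule of combination, and the resulting belief and plausibility for a candidate association; it is not the paper's non-exclusive hypothesis formulation, and the hypotheses and masses are hypothetical.

```python
# Minimal Dempster-Shafer sketch: belief, plausibility, and Dempster's rule of
# combination over a small frame of discernment. Hypotheses and masses are
# hypothetical placeholders chosen for illustration.

FRAME = frozenset({"track_A", "track_B", "clutter"})

def belief(mass: dict, hypothesis: frozenset) -> float:
    """Sum of mass over all focal elements contained in the hypothesis."""
    return sum(m for s, m in mass.items() if s and s <= hypothesis)

def plausibility(mass: dict, hypothesis: frozenset) -> float:
    """Sum of mass over all focal elements intersecting the hypothesis."""
    return sum(m for s, m in mass.items() if s & hypothesis)

def combine(m1: dict, m2: dict) -> dict:
    """Dempster's rule: intersect focal elements, then renormalize away conflict."""
    combined, conflict = {}, 0.0
    for s1, v1 in m1.items():
        for s2, v2 in m2.items():
            inter = s1 & s2
            if inter:
                combined[inter] = combined.get(inter, 0.0) + v1 * v2
            else:
                conflict += v1 * v2
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

# Two evidence sources: one favors association with track_A, one is largely ignorant.
m1 = {frozenset({"track_A"}): 0.6, FRAME: 0.4}
m2 = {frozenset({"track_A", "track_B"}): 0.5, FRAME: 0.5}
m = combine(m1, m2)

h = frozenset({"track_A"})
print("belief:", round(belief(m, h), 3), "plausibility:", round(plausibility(m, h), 3))
```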
Development of an evidence-based review with recommendations using an online iterative process.
Rudmik, Luke; Smith, Timothy L
2011-01-01
The practice of modern medicine is governed by evidence-based principles. Given the plethora of medical literature, clinicians often rely on systematic reviews and clinical guidelines to summarize the evidence and provide best practices. Implementation of an evidence-based clinical approach can minimize variation in health care delivery and optimize the quality of patient care. This article reports a method for developing an "Evidence-based Review with Recommendations" using an online iterative process. The manuscript describes the steps involved in this process: clinical topic selection, evidence-based review assignment, literature review and initial manuscript preparation, an iterative review process with author selection, and manuscript finalization. The goal of this article is to improve efficiency and increase the production of evidence-based reviews while maintaining the high quality and transparency associated with the rigorous methodology used for clinical guideline development. With the rise of evidence-based medicine, most medical and surgical specialties have an abundance of clinical topics that would benefit from a formal evidence-based review. Although clinical guideline development is an important methodology, its associated challenges limit development to only the highest-priority clinical topics. As outlined in this article, the online iterative approach to the development of an Evidence-based Review with Recommendations may improve productivity without compromising the quality associated with formal guideline development methodology. Copyright © 2011 American Rhinologic Society-American Academy of Otolaryngic Allergy, LLC.