NASA Astrophysics Data System (ADS)
Zhang, Gaigong; Lin, Lin; Hu, Wei; Yang, Chao; Pask, John E.
2017-04-01
Recently, we have proposed the adaptive local basis set for electronic structure calculations based on Kohn-Sham density functional theory in a pseudopotential framework. The adaptive local basis set is efficient and systematically improvable for total energy calculations. In this paper, we present the calculation of atomic forces, which can be used for a range of applications such as geometry optimization and molecular dynamics simulation. We demonstrate that, under mild assumptions, the computation of atomic forces can scale nearly linearly with the number of atoms in the system using the adaptive local basis set. We quantify the accuracy of the Hellmann-Feynman forces for a range of physical systems, benchmarked against converged planewave calculations, and find that the adaptive local basis set is efficient for both force and energy calculations, requiring at most a few tens of basis functions per atom to attain accuracies required in practice. Since the adaptive local basis set has implicit dependence on atomic positions, Pulay forces are in general nonzero. However, we find that the Pulay force is numerically small and systematically decreasing with increasing basis completeness, so that the Hellmann-Feynman force is sufficient for basis sizes of a few tens of basis functions per atom. We verify the accuracy of the computed forces in static calculations of quasi-1D and 3D disordered Si systems, vibration calculation of a quasi-1D Si system, and molecular dynamics calculations of H2 and liquid Al-Si alloy systems, where we show systematic convergence to benchmark planewave results and results from the literature.
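The division of the force into Hellmann-Feynman and Pulay contributions can be written schematically as follows (this is the generic textbook decomposition, not the paper's exact working expressions; f_i are occupation numbers and ε_i the Kohn-Sham eigenvalues):

$$ \mathbf{F}_I = -\frac{\partial E}{\partial \mathbf{R}_I} = \underbrace{-\,\Big\langle \Psi \Big| \frac{\partial \hat{H}}{\partial \mathbf{R}_I} \Big| \Psi \Big\rangle}_{\text{Hellmann-Feynman}} \;\; \underbrace{-\,2\sum_i f_i \Big\langle \frac{\partial \psi_i}{\partial \mathbf{R}_I} \Big| \hat{H} - \varepsilon_i \Big| \psi_i \Big\rangle}_{\text{Pulay}}, $$

where the Pulay term vanishes only when the basis functions are independent of the atomic positions or the basis is complete.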
Updating the OMERACT filter: core areas as a basis for defining core outcome sets.
Kirwan, John R; Boers, Maarten; Hewlett, Sarah; Beaton, Dorcas; Bingham, Clifton O; Choy, Ernest; Conaghan, Philip G; D'Agostino, Maria-Antonietta; Dougados, Maxime; Furst, Daniel E; Guillemin, Francis; Gossec, Laure; van der Heijde, Désirée M; Kloppenburg, Margreet; Kvien, Tore K; Landewé, Robert B M; Mackie, Sarah L; Matteson, Eric L; Mease, Philip J; Merkel, Peter A; Ostergaard, Mikkel; Saketkoo, Lesley Ann; Simon, Lee; Singh, Jasvinder A; Strand, Vibeke; Tugwell, Peter
2014-05-01
The Outcome Measures in Rheumatology (OMERACT) Filter provides guidelines for the development and validation of outcome measures for use in clinical research. The "Truth" section of the OMERACT Filter presupposes an explicit framework for identifying the relevant core outcomes that are universal to all studies of the effects of interventions. There is no published outline for instrument choice or development that is aimed at measuring outcome, was derived from broad consensus over its underlying philosophy, or includes a structured and documented critique. Therefore, a new proposal for defining core areas of measurement ("Filter 2.0 Core Areas of Measurement") was presented at OMERACT 11 to explore areas of consensus and to consider whether already endorsed core outcome sets fit into this newly proposed framework. Discussion groups critically reviewed the extent to which case studies of current OMERACT Working Groups complied with or negated the proposed framework, whether these observations had a more general application, and what issues remained to be resolved. Although there was broad acceptance of the framework in general, several important areas of construction, presentation, and clarity of the framework were questioned. The discussion groups and subsequent feedback highlighted 20 such issues. These issues will require resolution to reach consensus on accepting the proposed Filter 2.0 framework of Core Areas as the basis for the selection of Core Outcome Domains and hence appropriate Core Outcome Sets for clinical trials.
42 CFR 412.400 - Basis and scope of subpart.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Inpatient Hospital Services of Inpatient Psychiatric Facilities § 412.400 Basis and scope of subpart. (a... psychiatric facilities. (b) Scope. This subpart sets forth the framework for the prospective payment system for the inpatient hospital services of inpatient psychiatric facilities, including the methodology...
Ghanbari, Yasser; Smith, Alex R.; Schultz, Robert T.; Verma, Ragini
2014-01-01
Diffusion tensor imaging (DTI) offers rich insights into the physical characteristics of white matter (WM) fiber tracts and their development in the brain, facilitating a network representation of the brain's traffic pathways. Such a network representation of brain connectivity has provided a novel means of investigating brain changes arising from pathology, development or aging. The high dimensionality of these connectivity networks necessitates the development of methods that identify the connectivity building blocks or sub-network components that characterize the underlying variation in the population. In addition, the projection of the subject networks onto the basis set provides a low-dimensional representation that teases apart different sources of variation in the sample, facilitating variation-specific statistical analysis. We propose a unified framework of non-negative matrix factorization and graph embedding for learning sub-network patterns of connectivity by their projective non-negative decomposition into a reconstructive basis set, as well as additional basis sets representing variational sources in the population such as age and pathology. The proposed framework is applied to a study of diffusion-based connectivity in subjects with autism and shows localized sparse sub-networks which mostly capture the changes related to pathology and developmental variations. PMID:25037933
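The decomposition step at the heart of this framework can be sketched as a plain non-negative matrix factorization; the paper's full method additionally imposes projective and graph-embedding constraints, which are omitted here. All names and dimensions in this minimal Python sketch are hypothetical:

import numpy as np

def nmf(X, k, n_iter=500, eps=1e-9):
    # Multiplicative-update NMF: X ~= W @ H with W, H >= 0.
    rng = np.random.default_rng(0)
    n, m = X.shape
    W = rng.random((n, k))
    H = rng.random((k, m))
    for _ in range(n_iter):
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Rows: subjects; columns: vectorized connectivity entries (synthetic here).
X = np.abs(np.random.default_rng(1).normal(size=(40, 300)))
W, H = nmf(X, k=5)   # rows of H: sub-network basis patterns
coeffs = W           # low-dimensional projection of each subject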
Two Concepts of Radiation: A Case Study Investigating Existing Preconceptions
ERIC Educational Resources Information Center
Plotz, Thomas; Hopf, Martin
2016-01-01
Conceptual Change is a widely accepted theoretical framework for science education. Setting up successful learning and teaching arrangements in this framework necessarily entails including students' preconceptions into the construction of those arrangements. In order to provide a basis for such arrangements this study investigated and explored…
NASA Astrophysics Data System (ADS)
Klinting, Emil Lund; Thomsen, Bo; Godtliebsen, Ian Heide; Christiansen, Ove
2018-02-01
We present an approach to treat sets of general fit-basis functions in a single uniform framework, where the functional form is supplied on input, i.e., the use of different functions does not require new code to be written. The fit-basis functions can be used to carry out linear fits to the grid of single points, which are generated with an adaptive density-guided approach (ADGA). A non-linear conjugate gradient method is used to optimize non-linear parameters if such are present in the fit-basis functions. This means that a set of fit-basis functions with the same inherent shape as the potential cuts can be requested and no other choices with regard to the fit-basis functions need to be taken. The general fit-basis framework is explored in relation to anharmonic potentials for model systems, diatomic molecules, water, and imidazole. The behaviour and performance of Morse and double-well fit-basis functions are compared to those of polynomial fit-basis functions for unsymmetrical single-minimum and symmetrical double-well potentials. Furthermore, calculations for water and imidazole were carried out using both normal coordinates and hybrid optimized and localized coordinates (HOLCs). Our results suggest that choosing a suitable set of fit-basis functions can improve the stability of the fitting routine and the overall efficiency of potential construction by lowering the number of single point calculations required for the ADGA. It is possible to reduce the number of terms in the potential by choosing the Morse and double-well fit-basis functions. These effects are substantial for normal coordinates but become even more pronounced if HOLCs are used.
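As a concrete, hypothetical illustration of fitting a Morse-shaped fit-basis function with non-linear parameters to a grid of single points, here is a minimal Python sketch; note that it uses SciPy's least-squares fitter rather than the non-linear conjugate gradient method described in the paper, and all data are synthetic:

import numpy as np
from scipy.optimize import curve_fit

def morse(r, d, a, r0, c):
    # Morse-shaped fit-basis function; a and r0 are the non-linear parameters.
    return d * (1.0 - np.exp(-a * (r - r0)))**2 + c

# Synthetic stand-in for ADGA-generated single points along one potential cut.
r = np.linspace(0.7, 3.0, 25)
e = morse(r, 0.18, 1.1, 0.95, -0.2)
e += 1e-4 * np.random.default_rng(0).normal(size=r.size)

popt, pcov = curve_fit(morse, r, e, p0=[0.1, 1.0, 1.0, 0.0])
print(popt)  # fitted (d, a, r0, c)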
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gamble, John; Jacobson, Noah Tobias; Baczewski, Andrew
EMTpY is an implementation of effective mass theory in Python. It is designed to simulate semiconductor qubits within a non-perturbative, multi-valley effective mass theory framework using robust Gaussian basis sets.
A Bayesian spatial model for neuroimaging data based on biologically informed basis functions.
Huertas, Ismael; Oldehinkel, Marianne; van Oort, Erik S B; Garcia-Solis, David; Mir, Pablo; Beckmann, Christian F; Marquand, Andre F
2017-11-01
The dominant approach to neuroimaging data analysis employs the voxel as the unit of computation. While convenient, voxels lack biological meaning and their size is arbitrarily determined by the resolution of the image. Here, we propose a multivariate spatial model in which neuroimaging data are characterised as a linearly weighted combination of multiscale basis functions which map onto underlying brain nuclei or networks. In this model, the elementary building blocks are derived to reflect the functional anatomy of the brain during the resting state. This model is estimated using a Bayesian framework which accurately quantifies uncertainty and automatically finds the most accurate and parsimonious combination of basis functions describing the data. We demonstrate the utility of this framework by predicting quantitative SPECT images of striatal dopamine function and we compare a variety of basis sets including generic isotropic functions, anatomical representations of the striatum derived from structural MRI, and two different soft functional parcellations of the striatum derived from resting-state fMRI (rfMRI). We found that a combination of ∼50 multiscale functional basis functions accurately represented the striatal dopamine activity, and that functional basis functions derived from an advanced parcellation technique known as Instantaneous Connectivity Parcellation (ICP) provided the most parsimonious models of dopamine function. Importantly, functional basis functions derived from resting fMRI were more accurate than both structural and generic basis sets in representing dopamine function in the striatum for a fixed model order. We demonstrate the translational validity of our framework by constructing classification models for discriminating parkinsonian disorders and their subtypes. Here, we show that the ICP approach is the only basis set that performs well across all comparisons and performs better overall than the classical voxel-based approach. This spatial model constitutes an elegant alternative to voxel-based approaches in neuroimaging studies; not only are their atoms biologically informed, they are also adaptive to high resolutions, represent high dimensions efficiently, and capture long-range spatial dependencies, which are important and challenging objectives for neuroimaging data.
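Schematically, the generative model described here expresses the image y at voxel (or vertex) v as a weighted basis expansion, with the paper's Bayesian machinery supplying priors on the weights and the model-order selection:

$$ y(v) \;=\; \sum_{k=1}^{K} w_k\,\phi_k(v) \;+\; \varepsilon(v), $$

where the φ_k are the multiscale, biologically informed basis functions and ε is a noise term.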
ControlShell - A real-time software framework
NASA Technical Reports Server (NTRS)
Schneider, Stanley A.; Ullman, Marc A.; Chen, Vincent W.
1991-01-01
ControlShell is designed to enable modular design and implementation of real-time software. It is an object-oriented tool-set for real-time software system programming. It provides a series of execution and data interchange mechanisms that form a framework for building real-time applications. These mechanisms allow a component-based approach to real-time software generation and management. By defining a set of interface specifications for intermodule interaction, ControlShell provides a common platform that is the basis for real-time code development and exchange.
Basis set limit and systematic errors in local-orbital based all-electron DFT
NASA Astrophysics Data System (ADS)
Blum, Volker; Behler, Jörg; Gehrke, Ralf; Reuter, Karsten; Scheffler, Matthias
2006-03-01
With the advent of efficient integration schemes,^1,2 numeric atom-centered orbitals (NAOs) are an attractive basis choice in practical density functional theory (DFT) calculations of nanostructured systems (surfaces, clusters, molecules). Though all-electron, practical implementations promise efficiency on par with the best plane-wave pseudopotential codes, with noticeably higher accuracy if required: minimal-sized effective tight-binding like calculations and chemically accurate all-electron calculations are both possible within the same framework; non-periodic and periodic systems can be treated on equal footing; and the localized nature of the basis allows in principle for O(N)-like scaling. However, converging an observable with respect to the basis set is less straightforward than with competing systematic basis choices (e.g., plane waves). We here investigate the basis set limit of optimized NAO basis sets in all-electron calculations, using as examples small molecules and clusters (N2, Cu2, Cu4, Cu10). meV-level total energy convergence is possible using <=50 basis functions per atom in all cases. We also find a clear correlation between the errors which arise from underconverged basis sets, and the system geometry (interatomic distance). ^1 B. Delley, J. Chem. Phys. 92, 508 (1990), ^2 J.M. Soler et al., J. Phys.: Condens. Matter 14, 2745 (2002).
Case-based retrieval framework for gene expression data.
Anaissi, Ali; Goyal, Madhu; Catchpoole, Daniel R; Braytee, Ali; Kennedy, Paul J
2015-01-01
The process of retrieving similar cases in a case-based reasoning system is considered a big challenge for gene expression data sets. The huge number of gene expression values generated by microarray technology leads to complex data sets and similarity measures for high-dimensional data are problematic. Hence, gene expression similarity measurements require numerous machine-learning and data-mining techniques, such as feature selection and dimensionality reduction, to be incorporated into the retrieval process. This article proposes a case-based retrieval framework that uses a k-nearest-neighbor classifier with a weighted-feature-based similarity to retrieve previously treated patients based on their gene expression profiles. The herein-proposed methodology is validated on several data sets: a childhood leukemia data set collected from The Children's Hospital at Westmead, as well as the Colon cancer, the National Cancer Institute (NCI), and the Prostate cancer data sets. Results obtained by the proposed framework in retrieving patients of the data sets who are similar to new patients are as follows: 96% accuracy on the childhood leukemia data set, 95% on the NCI data set, 93% on the Colon cancer data set, and 98% on the Prostate cancer data set. The designed case-based retrieval framework is an appropriate choice for retrieving previous patients who are similar to a new patient, on the basis of their gene expression data, for better diagnosis and treatment of childhood leukemia. Moreover, this framework can be applied to other gene expression data sets using some or all of its steps.
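A minimal Python sketch of the retrieval step, a k-nearest-neighbor search under a weighted-feature similarity (all data, weights, and dimensions here are hypothetical; the paper's pipeline additionally performs feature selection and dimensionality reduction before retrieval):

import numpy as np

def retrieve(query, cases, weights, k=5):
    # Weighted Euclidean distance over gene-expression features,
    # returning the indices of the k most similar stored cases.
    d = np.sqrt(((cases - query)**2 * weights).sum(axis=1))
    return np.argsort(d)[:k]

rng = np.random.default_rng(0)
cases = rng.normal(size=(100, 50))   # 100 past patients, 50 selected genes
weights = rng.random(50)             # hypothetical per-feature relevance weights
query = rng.normal(size=50)          # new patient profile
print(retrieve(query, cases, weights))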
On the basis set convergence of electron–electron entanglement measures: helium-like systems
Hofer, Thomas S.
2013-01-01
A systematic investigation of three different electron–electron entanglement measures, namely the von Neumann, the linear and the occupation number entropy at full configuration interaction level has been performed for the four helium-like systems hydride, helium, Li+ and Be2+ using a large number of different basis sets. The convergence behavior of the resulting energies and entropies revealed that the latter in general do not show the expected strictly monotonic increase upon increase of the one-electron basis. Overall, the three different entanglement measures show good agreement among each other, the largest deviations being observed for small basis sets. The data clearly demonstrates that it is important to consider the nature of the chemical system when investigating entanglement phenomena in the framework of Gaussian type basis sets: while in case of hydride the use of augmentation functions is crucial, the application of core functions greatly improves the accuracy in case of cationic systems such as Li+ and Be2+. In addition, numerical derivatives of the entanglement measures with respect to the nuclear charge have been determined, which proved to be a very sensitive probe of the convergence, leading to qualitatively wrong results (i.e., the wrong sign) if too small basis sets are used. PMID:24790952
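For reference, the three measures have standard forms in terms of the one-electron reduced density matrix ρ₁ and its natural occupation numbers nᵢ (normalization conventions vary, so this is a schematic summary rather than the paper's exact definitions):

$$ S_{\mathrm{vN}} = -\operatorname{Tr}(\rho_1 \ln \rho_1), \qquad S_{\mathrm{lin}} = 1 - \operatorname{Tr}(\rho_1^2), \qquad S_{\mathrm{occ}} = -\sum_i n_i \ln n_i. $$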
Leadership and priority setting: the perspective of hospital CEOs.
Reeleder, David; Goel, Vivek; Singer, Peter A; Martin, Douglas K
2006-11-01
The role of leadership in health care priority setting remains largely unexplored. While the management leadership literature has grown rapidly, the growing literature on priority setting in health care has looked in other directions to improve priority setting practices--to health economics and ethical approaches. Consequently, potential for improvement in hospital priority setting practices may be overlooked. A qualitative study involving interviews with 46 Ontario hospital CEOs was done to describe the role of leadership in priority setting through the perspective of hospital leaders. For the first time, we report a framework of leadership domains including vision, alignment, relationships, values and process to facilitate priority setting practices in health services' organizations. We believe this fledgling framework forms the basis for the sharing of good leadership practices for health reform. It also provides a leadership guide for decision makers to improve the quality of their leadership, and in so doing, we believe, the fairness of their priority setting.
Earth's rotation in the framework of general relativity: rigid multipole moments
NASA Astrophysics Data System (ADS)
Klioner, S. A.; Soffel, M.; Xu, Ch.; Wu, X.
A set of equations describing the rotational motion of the Earth relative to the GCRS is formulated in the approximation of rigidly rotating multipoles. The external bodies are assumed to be mass monopoles. The derived set of formulas is intended to form the theoretical basis for a practical post-Newtonian theory of Earth precession and nutation.
Basis for paraxial surface-plasmon-polariton packets
NASA Astrophysics Data System (ADS)
Martinez-Herrero, Rosario; Manjavacas, Alejandro
2016-12-01
We present a theoretical framework for the study of surface-plasmon polariton (SPP) packets propagating along a lossy metal-dielectric interface within the paraxial approximation. Using a rigorous formulation based on the plane-wave spectrum formalism, we introduce a set of modes that constitute a complete basis set for the solutions of Maxwell's equations for a metal-dielectric interface in the paraxial approximation. The use of this set of modes allows us to fully analyze the evolution of the transversal structure of SPP packets beyond the single plane-wave approximation. As a paradigmatic example, we analyze the case of a Gaussian SPP mode, for which, exploiting the analogy with paraxial optical beams, we introduce a set of parameters that characterize its propagation.
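Within the paraxial approximation, the transverse envelope A(x, z) of an SPP packet propagating along z on the interface obeys, schematically (this is the generic paraxial envelope equation, not the paper's full modal construction):

$$ 2 i k_{\mathrm{SPP}} \frac{\partial A}{\partial z} + \frac{\partial^2 A}{\partial x^2} = 0, \qquad k_{\mathrm{SPP}} = k_0 \sqrt{\frac{\varepsilon_m \varepsilon_d}{\varepsilon_m + \varepsilon_d}}, $$

where the complex permittivity ε_m of the metal makes k_SPP complex and thereby accounts for propagation losses.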
Parameter and Structure Inference for Nonlinear Dynamical Systems
NASA Technical Reports Server (NTRS)
Morris, Robin D.; Smelyanskiy, Vadim N.; Millonas, Mark
2006-01-01
A great many systems can be modeled in the non-linear dynamical systems framework, as ẋ = f(x) + ξ(t), where f(·) is the potential function for the system and ξ(t) is the excitation noise. Modeling the potential using a set of basis functions, we derive the posterior for the basis coefficients. A more challenging problem is to determine the set of basis functions that are required to model a particular system. We show that, using the Bayesian Information Criterion (BIC) to rank models together with the beam search technique, we can accurately determine the structure of simple non-linear dynamical system models, and the structure of the coupling between non-linear dynamical systems where the individual systems are known. This last case has important ecological applications.
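The model-ranking criterion used here has the standard form (k parameters, n data points, maximized likelihood L̂); candidate basis-function sets with lower BIC are preferred during the beam search:

$$ \mathrm{BIC} = k \ln n - 2 \ln \hat{L}. $$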
Comparison of fMRI analysis methods for heterogeneous BOLD responses in block design studies.
Liu, Jia; Duffy, Ben A; Bernal-Casas, David; Fang, Zhongnan; Lee, Jin Hyung
2017-02-15
A large number of fMRI studies have shown that the temporal dynamics of evoked BOLD responses can be highly heterogeneous. Failing to model heterogeneous responses in statistical analysis can lead to significant errors in signal detection and characterization and alter the neurobiological interpretation. However, to date it is not clear which methods, out of a large number of options, are robust against variability in the temporal dynamics of BOLD responses in block-design studies. Here, we used rodent optogenetic fMRI data with heterogeneous BOLD responses and simulations guided by experimental data as a means to investigate different analysis methods' performance against heterogeneous BOLD responses. Evaluations are carried out within the general linear model (GLM) framework and consist of standard basis sets as well as independent component analysis (ICA). Analyses show that, in the presence of heterogeneous BOLD responses, the conventionally used GLM with a canonical basis set leads to considerable errors in the detection and characterization of BOLD responses. Our results suggest that the 3rd and 4th order gamma basis sets, the 7th to 9th order finite impulse response (FIR) basis sets, the 5th to 9th order B-spline basis sets, and the 2nd to 5th order Fourier basis sets are optimal for a good balance between detection and characterization, while the 1st order Fourier basis set (coherence analysis) used in our earlier studies shows good detection capability. ICA has mostly good detection and characterization capabilities, but detects a large volume of spurious activation with the control fMRI data.
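To make one of the benchmarked basis sets concrete, here is a minimal, self-contained Python sketch of an FIR design matrix for a block design; the data, dimensions, and names are all synthetic/hypothetical:

import numpy as np

def fir_design(onsets, order, n_scans):
    # FIR basis: one column per post-stimulus time bin (order columns).
    X = np.zeros((n_scans, order))
    for t in onsets:
        for j in range(order):
            if t + j < n_scans:
                X[t + j, j] = 1.0
    return X

# Hypothetical block design: onsets every 30 scans, 8th-order FIR basis.
X = fir_design(onsets=range(0, 300, 30), order=8, n_scans=300)
y = np.random.default_rng(0).normal(size=300)   # synthetic BOLD time series
beta, *_ = np.linalg.lstsq(X, y, rcond=None)    # estimated response shape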
Biological Collections: Chasing the Ideal
Kamenski, P. A.; Sazonov, A. E.; Fedyanin, A. A.; Sadovnichy, V. A.
2016-01-01
This article is based on the results of an analysis of existing biological collections in Russia and abroad set up in the framework of the project "Scientific Basis of the National Biobank – Depository of Living Systems" by M.V. Lomonosov Moscow State University [1]. PMID:27437135
Xu, Xin; Huang, Zhenhua; Graves, Daniel; Pedrycz, Witold
2014-12-01
In order to deal with sequential decision problems with large or continuous state spaces, feature representation and function approximation have been a major research topic in reinforcement learning (RL). In this paper, a clustering-based graph Laplacian framework is presented for feature representation and value function approximation (VFA) in RL. By making use of clustering-based techniques, that is, K-means clustering or fuzzy C-means clustering, a graph Laplacian is constructed by subsampling in Markov decision processes (MDPs) with continuous state spaces. The basis functions for VFA can be automatically generated from spectral analysis of the graph Laplacian. The clustering-based graph Laplacian is integrated with a class of approximation policy iteration algorithms called representation policy iteration (RPI) for RL in MDPs with continuous state spaces. Simulation and experimental results show that, compared with previous RPI methods, the proposed approach needs fewer sample points to compute an efficient set of basis functions and the learning control performance can be improved for a variety of parameter settings.
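A minimal Python sketch of the basis-construction step (synthetic states; for brevity a random subsample stands in for the paper's K-means or fuzzy C-means subsampling):

import numpy as np
from scipy.spatial.distance import cdist

rng = np.random.default_rng(0)
states = rng.random((200, 2))             # subsampled continuous states

# Gaussian-weighted adjacency between the subsampled states.
W = np.exp(-cdist(states, states)**2 / 0.05)
np.fill_diagonal(W, 0.0)
D = np.diag(W.sum(axis=1))
L = D - W                                 # combinatorial graph Laplacian

# Eigenvectors with the smallest eigenvalues are the smoothest functions
# on the graph; use them as basis functions for value-function approximation.
eigvals, eigvecs = np.linalg.eigh(L)
basis = eigvecs[:, :10]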
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kersten, J. A. F.; Alavi, Ali
2016-08-07
The Full Configuration Interaction Quantum Monte Carlo (FCIQMC) method has proved able to provide near-exact solutions to the electronic Schrödinger equation within a finite orbital basis set, without relying on an expansion about a reference state. However, a drawback to the approach is that, being based on an expansion of Slater determinants, the FCIQMC method suffers from a basis set incompleteness error that decays very slowly with the size of the employed single particle basis. The FCIQMC results obtained in a small basis set can be improved significantly with explicitly correlated techniques. Here, we present a study that assesses and compares two contrasting "universal" explicitly correlated approaches that fit into the FCIQMC framework: the [2]_R12 method of Kong and Valeev [J. Chem. Phys. 135, 214105 (2011)] and the explicitly correlated canonical transcorrelation approach of Yanai and Shiozaki [J. Chem. Phys. 136, 084107 (2012)]. The former is an a posteriori internally contracted perturbative approach, while the latter transforms the Hamiltonian prior to the FCIQMC simulation. These comparisons are made across the 55 molecules of the G1 standard set. We found that both methods consistently reduce the basis set incompleteness error for atomization energies in small basis sets, reducing the error from 28 mE_h to 3-4 mE_h. While many of the conclusions hold in general for any combination of multireference approaches with these methodologies, we also consider FCIQMC-specific advantages of each approach.
The Inter-Sectoral Impact Model Intercomparison Project (ISI–MIP): Project framework
Warszawski, Lila; Frieler, Katja; Huber, Veronika; Piontek, Franziska; Serdeczny, Olivia; Schewe, Jacob
2014-01-01
The Inter-Sectoral Impact Model Intercomparison Project offers a framework to compare climate impact projections in different sectors and at different scales. Consistent climate and socio-economic input data provide the basis for a cross-sectoral integration of impact projections. The project is designed to enable quantitative synthesis of climate change impacts at different levels of global warming. This report briefly outlines the objectives and framework of the first, fast-tracked phase of the Inter-Sectoral Impact Model Intercomparison Project, based on global impact models, and provides an overview of the participating models, input data, and scenario set-up. PMID:24344316
Model of a programmable quantum processing unit based on a quantum transistor effect
NASA Astrophysics Data System (ADS)
Ablayev, Farid; Andrianov, Sergey; Fetisov, Danila; Moiseev, Sergey; Terentyev, Alexandr; Urmanchev, Andrey; Vasiliev, Alexander
2018-02-01
In this paper we propose a model of a programmable quantum processing device realizable with existing nano-photonic technologies. It can be viewed as a basis for new high performance hardware architectures. Protocols for the physical implementation of the device, based on controlled photon transfer and atomic transitions, are presented. These protocols are designed for executing basic single-qubit and multi-qubit gates forming a universal set. We analyze the possible operation of this quantum computer scheme. We then formalize the physical architecture by a mathematical model of a Quantum Processing Unit (QPU), which we use as a basis for the Quantum Programming Framework. This framework makes it possible to perform universal quantum computations in a multitasking environment.
Abbass-Dick, Jennifer; Dennis, Cindy-Lee
Targeting mothers and fathers in breast-feeding promotion programs is recommended as research has found that father's support positively impacts breast-feeding duration and exclusivity. Breast-feeding coparenting refers to the manner in which parents work together to achieve their breast-feeding goals. The Breast-feeding Coparenting Framework was developed on the basis of diverse coparenting models and research related to father's involvement with breast-feeding. This framework consists of 5 components: joint breast-feeding goal setting, shared breast-feeding responsibility, proactive breast-feeding support, father's/partner's parental-child interactions, and productive communication and problem solving. This framework may be of value to policy makers and program providers working to improve breast-feeding outcomes.
Optimization of auxiliary basis sets for the LEDO expansion and a projection technique for LEDO-DFT.
Götz, Andreas W; Kollmar, Christian; Hess, Bernd A
2005-09-01
We present a systematic procedure for the optimization of the expansion basis for the limited expansion of diatomic overlap density functional theory (LEDO-DFT) and report on optimized auxiliary orbitals for the Ahlrichs split valence plus polarization basis set (SVP) for the elements H, Li--F, and Na--Cl. A new method to deal with near-linear dependences in the LEDO expansion basis is introduced, which greatly reduces the computational effort of LEDO-DFT calculations. Numerical results for a test set of small molecules demonstrate the accuracy of electronic energies, structural parameters, dipole moments, and harmonic frequencies. For larger molecular systems the numerical errors introduced by the LEDO approximation can lead to an uncontrollable behavior of the self-consistent field (SCF) process. A projection technique suggested by Löwdin is presented in the framework of LEDO-DFT, which guarantees SCF convergence. Numerical results on some critical test molecules suggest the general applicability of the auxiliary orbitals presented in combination with this projection technique. Timing results indicate that LEDO-DFT is competitive with conventional density fitting methods.
A framework for telehealth program evaluation.
Nepal, Surya; Li, Jane; Jang-Jaccard, Julian; Alem, Leila
2014-04-01
Evaluating telehealth programs is a challenging task, yet it is the most sensible first step when embarking on a telehealth study. How can we frame and report on telehealth studies? What are the health services elements to select based on the application needs? What are the appropriate terms to use to refer to such elements? Various frameworks have been proposed in the literature to answer these questions, and each framework is defined by a set of properties covering different aspects of telehealth systems. The most common properties include application, technology, and functionality. With the proliferation of telehealth, it is important not only to understand these properties, but also to define new properties to account for a wider range of context of use and evaluation outcomes. This article presents a comprehensive framework for delivery design, implementation, and evaluation of telehealth services. We first survey existing frameworks proposed in the literature and then present our proposed comprehensive multidimensional framework for telehealth. Six key dimensions of the proposed framework include health domains, health services, delivery technologies, communication infrastructure, environment setting, and socioeconomic analysis. We define a set of example properties for each dimension. We then demonstrate how we have used our framework to evaluate telehealth programs in rural and remote Australia. A few major international studies have been also mapped to demonstrate the feasibility of the framework. The key characteristics of the framework are as follows: (a) loosely coupled and hence easy to use, (b) provides a basis for describing a wide range of telehealth programs, and (c) extensible to future developments and needs.
Francis, Jill J; O'Connor, Denise; Curran, Janet
2012-04-24
Behaviour change is key to increasing the uptake of evidence into healthcare practice. Designing behaviour-change interventions first requires problem analysis, ideally informed by theory. Yet the large number of partly overlapping theories of behaviour makes it difficult to select the most appropriate theory. The need for an overarching theoretical framework of behaviour change was addressed in research in which 128 explanatory constructs from 33 theories of behaviour were identified and grouped. The resulting Theoretical Domains Framework (TDF) appears to be a helpful basis for investigating implementation problems. Research groups in several countries have conducted TDF-based studies. It seems timely to bring together the experience of these teams in a thematic series to demonstrate further applications and to report key developments. This overview article describes the TDF, provides a brief critique of the framework, and introduces this thematic series.In a brief review to assess the extent of TDF-based research, we identified 133 papers that cite the framework. Of these, 17 used the TDF as the basis for empirical studies to explore health professionals' behaviour. The identified papers provide evidence of the impact of the TDF on implementation research. Two major strengths of the framework are its theoretical coverage and its capacity to elicit beliefs that could signify key mediators of behaviour change. The TDF provides a useful conceptual basis for assessing implementation problems, designing interventions to enhance healthcare practice, and understanding behaviour-change processes. We discuss limitations and research challenges and introduce papers in this series.
A Framework for Bounding Nonlocality of State Discrimination
NASA Astrophysics Data System (ADS)
Childs, Andrew M.; Leung, Debbie; Mančinska, Laura; Ozols, Maris
2013-11-01
We consider the class of protocols that can be implemented by local quantum operations and classical communication (LOCC) between two parties. In particular, we focus on the task of discriminating a known set of quantum states by LOCC. Building on the work in the paper Quantum nonlocality without entanglement (Bennett et al., Phys Rev A 59:1070-1091, 1999), we provide a framework for bounding the amount of nonlocality in a given set of bipartite quantum states in terms of a lower bound on the probability of error in any LOCC discrimination protocol. We apply our framework to an orthonormal product basis known as the domino states and obtain an alternative and simplified proof that quantifies its nonlocality. We generalize this result for similar bases in larger dimensions, as well as the “rotated” domino states, resolving a long-standing open question (Bennett et al., Phys Rev A 59:1070-1091, 1999).
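For concreteness, the domino basis referenced here (as given in Bennett et al., 1999) is the following orthonormal product basis of two qutrits:

$$ |\psi_1\rangle = |1\rangle|1\rangle, \qquad |\psi_{2,3}\rangle = |0\rangle \otimes \tfrac{1}{\sqrt{2}}(|0\rangle \pm |1\rangle), \qquad |\psi_{4,5}\rangle = |2\rangle \otimes \tfrac{1}{\sqrt{2}}(|1\rangle \pm |2\rangle), $$
$$ |\psi_{6,7}\rangle = \tfrac{1}{\sqrt{2}}(|1\rangle \pm |2\rangle) \otimes |0\rangle, \qquad |\psi_{8,9}\rangle = \tfrac{1}{\sqrt{2}}(|0\rangle \pm |1\rangle) \otimes |2\rangle. $$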
Velocity-gauge real-time TDDFT within a numerical atomic orbital basis set
NASA Astrophysics Data System (ADS)
Pemmaraju, C. D.; Vila, F. D.; Kas, J. J.; Sato, S. A.; Rehr, J. J.; Yabana, K.; Prendergast, David
2018-05-01
The interaction of laser fields with solid-state systems can be modeled efficiently within the velocity-gauge formalism of real-time time dependent density functional theory (RT-TDDFT). In this article, we discuss the implementation of the velocity-gauge RT-TDDFT equations for electron dynamics within a linear combination of atomic orbitals (LCAO) basis set framework. Numerical results obtained from our LCAO implementation, for the electronic response of periodic systems to both weak and intense laser fields, are compared to those obtained from established real-space grid and Full-Potential Linearized Augmented Planewave approaches. Potential applications of the LCAO based scheme in the context of extreme ultra-violet and soft X-ray spectroscopies involving core-electronic excitations are discussed.
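Schematically, in atomic units and the dipole approximation, the velocity-gauge Kohn-Sham Hamiltonian couples a spatially uniform vector potential to the momentum (this is the generic form, not the specific LCAO working equations of the paper):

$$ \hat{H}(t) = \frac{1}{2}\left(-i\nabla + \frac{1}{c}\mathbf{A}(t)\right)^{2} + \hat{v}_{\mathrm{KS}}[n](\mathbf{r},t), \qquad \mathbf{A}(t) = -c \int^{t} \mathbf{E}(t')\, dt', $$

which preserves the lattice periodicity of the Hamiltonian and is therefore well suited to solids.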
Water adsorption on a copper formate paddlewheel model of CuBTC: A comparative MP2 and DFT study
NASA Astrophysics Data System (ADS)
Toda, Jordi; Fischer, Michael; Jorge, Miguel; Gomes, José R. B.
2013-11-01
Simultaneous adsorption of two water molecules on open metal sites of the HKUST-1 metal-organic framework (MOF), modeled with a Cu2(HCOO)4 cluster, was studied by means of density functional theory (DFT) and second-order Møller-Plesset (MP2) approaches together with correlation consistent basis sets. Experimental geometries and MP2 energetic data extrapolated to the complete basis set limit were used as benchmarks for testing the accuracy of several different exchange-correlation functionals in the correct description of the water-MOF interaction. M06-L and some LC-DFT methods arise as the most appropriate in terms of the quality of geometrical data, energetic data and computational resources needed.
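The complete-basis-set limit referenced here is typically obtained from correlation-consistent results by a two-point extrapolation of the correlation energy; a common form (the paper's exact scheme may differ) is:

$$ E_X^{\mathrm{corr}} = E_{\mathrm{CBS}}^{\mathrm{corr}} + A X^{-3} \;\;\Rightarrow\;\; E_{\mathrm{CBS}}^{\mathrm{corr}} = \frac{X^3 E_X^{\mathrm{corr}} - (X-1)^3 E_{X-1}^{\mathrm{corr}}}{X^3 - (X-1)^3}, $$

where X is the cardinal number of the basis set (e.g., X = 3, 4 for triple- and quadruple-zeta sets).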
Extraction of information from major element chemical analyses of lunar basalts
NASA Technical Reports Server (NTRS)
Butler, J. C.
1985-01-01
Major element chemical analyses often form the framework within which similarities and differences of analyzed specimens are noted and used to propose or devise models. When percentages are formed, the ratios of pairs of components are preserved, whereas many familiar statistical and geometrical descriptors are likely to exhibit major changes. This ratio-preserving aspect forms the basis for a proposed framework. An analysis of compositional variability within the data set of 42 major element analyses of lunar reference samples was selected to investigate this proposal.
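The ratio-preserving property is immediate from the closure operation that forms percentages: for raw components w_i,

$$ C(w)_i = \frac{100\, w_i}{\sum_j w_j}, \qquad \frac{C(w)_i}{C(w)_j} = \frac{w_i}{w_j}, $$

so ratios of pairs of components survive closure even though variances and correlations generally do not.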
Teaching and Learning Methodologies Supported by ICT Applied in Computer Science
ERIC Educational Resources Information Center
Capacho, Jose
2016-01-01
The main objective of this paper is to show a set of new methodologies applied in the teaching of Computer Science using ICT. The methodologies are framed in the conceptual basis of the following sciences: Psychology, Education and Computer Science. The theoretical framework of the research is supported by Behavioral Theory, Gestalt Theory.…
Women and Retirement: The Effect of Multiple Careers on Retirement Adjustment.
ERIC Educational Resources Information Center
Connidis, Ingrid
1982-01-01
The concept of career set is employed as the basis for a framework designed to analyze the impact of women's involvement in multiple careers on their adjustment to retirement. The author concludes that the familial careers engaged in by married, working women have a mediative effect on their transition to retirement. (Author/CT)
Stable Chimeras and Independently Synchronizable Clusters
NASA Astrophysics Data System (ADS)
Cho, Young Sul; Nishikawa, Takashi; Motter, Adilson E.
2017-08-01
Cluster synchronization is a phenomenon in which a network self-organizes into a pattern of synchronized sets. It has been shown that diverse patterns of stable cluster synchronization can be captured by symmetries of the network. Here, we establish a theoretical basis to divide an arbitrary pattern of symmetry clusters into independently synchronizable cluster sets, in which the synchronization stability of the individual clusters in each set is decoupled from that in all the other sets. Using this framework, we suggest a new approach to find permanently stable chimera states by capturing two or more symmetry clusters—at least one stable and one unstable—that compose the entire fully symmetric network.
A modeling framework for exposing risks in complex systems.
Sharit, J
2000-08-01
This article introduces and develops a modeling framework for exposing risks in the form of human errors and adverse consequences in high-risk systems. The modeling framework is based on two components: a two-dimensional theory of accidents in systems developed by Perrow in 1984, and the concept of multiple system perspectives. The theory of accidents differentiates systems on the basis of two sets of attributes. One set characterizes the degree to which systems are interactively complex; the other emphasizes the extent to which systems are tightly coupled. The concept of multiple perspectives provides alternative descriptions of the entire system that serve to enhance insight into system processes. The usefulness of these two model components derives from a modeling framework that cross-links them, enabling a variety of work contexts to be exposed and understood that would otherwise be very difficult or impossible to identify. The model components and the modeling framework are illustrated in the case of a large and comprehensive trauma care system. In addition to its general utility in the area of risk analysis, this methodology may be valuable in applications of current methods of human and system reliability analysis in complex and continually evolving high-risk systems.
Mapped grid methods for long-range molecules and cold collisions
NASA Astrophysics Data System (ADS)
Willner, K.; Dulieu, O.; Masnou-Seeuws, F.
2004-01-01
The paper discusses ways of improving the accuracy of numerical calculations for vibrational levels of diatomic molecules close to the dissociation limit or for ultracold collisions, in the framework of a grid representation. In order to avoid the implementation of very large grids, Kokoouline et al. [J. Chem. Phys. 110, 9865 (1999)] have proposed a mapping procedure through introduction of an adaptive coordinate x subjected to the variation of the local de Broglie wavelength as a function of the internuclear distance R. Some unphysical levels ("ghosts") then appear in the vibrational series computed via a mapped Fourier grid representation. In the present work the choice of the basis set is reexamined, and two alternative expansions are discussed: Sine functions and Hardy functions. It is shown that use of a basis set with fixed nodes at both grid ends is efficient to eliminate "ghost" solutions. It is further shown that the Hamiltonian matrix in the sine basis can be calculated very accurately by using an auxiliary basis of cosine functions, overcoming the problems arising from numerical calculation of the Jacobian J(x) of the R→x coordinate transformation.
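The mapping at issue stretches the grid according to the local de Broglie wavelength; schematically (one common choice, with μ the reduced mass and E_max an energy bound chosen above the dissociation limit):

$$ \frac{dx}{dR} \propto \frac{1}{\lambda_{\mathrm{loc}}(R)}, \qquad \lambda_{\mathrm{loc}}(R) = \frac{2\pi\hbar}{\sqrt{2\mu\,[E_{\max} - V(R)]}}, $$

so that a uniform grid in x concentrates points where the wavefunction oscillates rapidly.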
Patients' views on priority setting in neurosurgery: A qualitative study.
Gunaratnam, Caroline; Bernstein, Mark
2016-01-01
Accountability for Reasonableness is an ethical framework which has been implemented in various health care systems to improve and evaluate the fairness of priority setting. This framework is grounded on four mandatory conditions: relevance, publicity, appeals, and enforcement. There have been few studies which have evaluated the patient stakeholders' acceptance of this framework; certainly no studies have been done on patients' views on the prioritization system for allocating patients for operating time in a system where inpatient beds are a constrained resource. The aim of this study is to examine neurosurgical patients' views on the prioritization of patients for operating theater (OT) time on a daily basis at a tertiary and quaternary referral neurosurgery center. Semi-structured face-to-face interviews were conducted with thirty-seven patients, recruited from the neurosurgery clinic at Toronto Western Hospital. Family members and friends who accompanied the patient to their clinic visit were encouraged to contribute to the discussion. Interviews were audio recorded, transcribed verbatim, and subjected to thematic analysis using open and axial coding. Overall, patients are supportive of the concept of a priority-setting system based on fairness, but felt that a few changes would help to improve the fairness of the current system. These changes include lowering the level of priority given to volume-funded cases and giving scheduled surgeries that were previously canceled a higher level of prioritization. Good communication, early notification, and rescheduling canceled surgeries as soon as possible were important factors that directly reflected the patients' confidence level in their doctor, the hospital, and the health care system. This study is the first clinical qualitative study of patients' perspectives on a prioritization system used for allocating neurosurgical patients for OT time on a daily basis in a socialized not-for-profit health care system with fixed resources.
Research on the content framework of information disclosure mechanism in Shanxi power market
NASA Astrophysics Data System (ADS)
Sun, Yanzhang; Li, Tao; Hou, Zhehui; Cao, Xiaozhong
2018-06-01
With the further development of power reform, establishing a sound power market system with rich content and efficient operation has become an urgent need. Considering the current state of power market information disclosure in Shanxi province, this paper incorporates the actual situation, introduces indices into the power market information disclosure mechanism, and sets up a general information disclosure framework for the Shanxi power market, on the basis of which direct and indirect information disclosure mechanisms were designed. We then formulate a comprehensive power index system, a generation index system, a transmission and distribution index system, and a power utilization index system. In conclusion, these outcomes will enrich the power information disclosure mechanism in Shanxi province and provide a platform to guide the various market members in making sound business decisions.
Structural basis for inhibition of the histone chaperone activity of SET/TAF-Iβ by cytochrome c.
González-Arzola, Katiuska; Díaz-Moreno, Irene; Cano-González, Ana; Díaz-Quintana, Antonio; Velázquez-Campoy, Adrián; Moreno-Beltrán, Blas; López-Rivas, Abelardo; De la Rosa, Miguel A
2015-08-11
Chromatin is pivotal for regulation of the DNA damage process insofar as it influences access to DNA and serves as a DNA repair docking site. Recent works identify histone chaperones as key regulators of damaged chromatin's transcriptional activity. However, understanding how chaperones are modulated during DNA damage response is still challenging. This study reveals that the histone chaperone SET/TAF-Iβ interacts with cytochrome c following DNA damage. Specifically, cytochrome c is shown to be translocated into cell nuclei upon induction of DNA damage, but not upon stimulation of the death receptor or stress-induced pathways. Cytochrome c was found to competitively hinder binding of SET/TAF-Iβ to core histones, thereby locking its histone-binding domains and inhibiting its nucleosome assembly activity. In addition, we have used NMR spectroscopy, calorimetry, mutagenesis, and molecular docking to provide an insight into the structural features of the formation of the complex between cytochrome c and SET/TAF-Iβ. Overall, these findings establish a framework for understanding the molecular basis of cytochrome c-mediated blocking of SET/TAF-Iβ, which subsequently may facilitate the development of new drugs to silence the oncogenic effect of SET/TAF-Iβ's histone chaperone activity.
Nutrient intake values (NIVs): a recommended terminology and framework for the derivation of values.
King, Janet C; Vorster, Hester H; Tome, Daniel G
2007-03-01
Although most countries and regions around the world set recommended nutrient intake values for their populations, there is no standardized terminology or framework for establishing these standards. Different terms used for various components of a set of dietary standards are described in this paper and a common set of terminology is proposed. The recommended terminology suggests that the set of values be called nutrient intake values (NIVs) and that the set be composed of three different values. The average nutrient requirement (ANR) reflects the median requirement for a nutrient in a specific population. The individual nutrient level (INLx) is the recommended level of nutrient intake for all healthy people in the population, which is set at a certain level x above the mean requirement. For example, a value set at 2 standard deviations above the mean requirement would cover the needs of 98% of the population and would be INL98. The third component of the NIVs is an upper nutrient level (UNL), which is the highest level of daily nutrient intake that is likely to pose no risk of adverse health effects for almost all individuals in a specified life-stage group. The proposed framework for deriving a set of NIVs is based on a statistical approach for determining the midpoint of a distribution of requirements for a set of nutrients in a population (the ANR), the standard deviation of the requirements, and an individual nutrient level that assures health at some point above the mean, e.g., 2 standard deviations. Ideally, a second set of distributions of risk of excessive intakes is used as the basis for a UNL.
Open-Ended Recursive Approach for the Calculation of Multiphoton Absorption Matrix Elements
2015-01-01
We present an implementation of single residues for response functions to arbitrary order using a recursive approach. Explicit expressions in terms of density-matrix-based response theory for the single residues of the linear, quadratic, cubic, and quartic response functions are also presented. These residues correspond to one-, two-, three- and four-photon transition matrix elements. The newly developed code is used to calculate the one-, two-, three- and four-photon absorption cross sections of para-nitroaniline and para-nitroaminostilbene, making this the first treatment of four-photon absorption in the framework of response theory. We find that the calculated multiphoton absorption cross sections are not very sensitive to the size of the basis set as long as a reasonably large basis set with diffuse functions is used. The choice of exchange–correlation functional, however, significantly affects the calculated cross sections of both charge-transfer transitions and other transitions, in particular, for the larger para-nitroaminostilbene molecule. We therefore recommend the use of a range-separated exchange–correlation functional in combination with the augmented correlation-consistent double-ζ basis set aug-cc-pVDZ for the calculation of multiphoton absorption properties. PMID:25821415
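The quoted one- to four-photon matrix elements arise as single residues of response functions. As a hedged aid to intuition, a schematic sum-over-states evaluation of a two-photon transition moment for a random few-level model (all energies and dipoles are illustrative, and the rotational-averaging convention is one common choice, not necessarily the paper's) might look like:

```python
import numpy as np

# Schematic sum-over-states two-photon transition moment for a toy model:
#   S_ab(0->f) = sum_n [ mu_a(0,n) mu_b(n,f) + mu_b(0,n) mu_a(n,f) ]
#                       / (omega_n - omega_f / 2)
# Energies and dipole matrices below are random illustrative numbers.

nstate = 4
omega = np.array([0.0, 0.20, 0.35, 0.50])   # state energies (a.u.)
rng = np.random.default_rng(0)
mu = rng.normal(size=(3, nstate, nstate))   # Cartesian dipole matrices
mu = 0.5 * (mu + mu.transpose(0, 2, 1))     # symmetrize (real, Hermitian)

f = 2                                       # final state of interest
S = np.zeros((3, 3))
for a in range(3):
    for b in range(3):
        for n in range(nstate):             # schematic sum over all states
            denom = omega[n] - omega[f] / 2.0
            S[a, b] += (mu[a, 0, n] * mu[b, n, f]
                        + mu[b, 0, n] * mu[a, n, f]) / denom

# Rotational average for two parallel, linearly polarized photons
# (F = G = H = 2; S is symmetric here).
delta = (2.0 / 30.0) * (np.trace(S) ** 2 + 2.0 * np.sum(S * S))
print("two-photon transition strength:", delta)
```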
Keeping it wild: Mapping wilderness character in the United States
Steve Carver; James Tricker; Peter Landres
2013-01-01
A GIS-based approach is developed to identify the state of wilderness character in US wilderness areas using Death Valley National Park (DEVA) as a case study. A set of indicators and measures is identified by DEVA staff and used as the basis for developing a flexible and broadly applicable framework to map wilderness character using data inputs selected by park staff...
The 4C framework for making reasonable adjustments for people with learning disabilities.
Marsden, Daniel; Giles, Rachel
2017-01-18
Background People with learning disabilities experience significant inequalities in accessing healthcare. Legal frameworks, such as the Equality Act 2010, are intended to reduce such disparities in care, and require organisations to make 'reasonable adjustments' for people with disabilities, including learning disabilities. However, reasonable adjustments are often not clearly defined or adequately implemented in clinical practice. Aim To examine and synthesise the challenges in caring for people with learning disabilities to develop a framework for making reasonable adjustments for people with learning disabilities in hospital. This framework would assist ward staff in identifying and managing the challenges of delivering person-centred, safe and effective healthcare to people with learning disabilities in this setting. Method Fourth-generation evaluation, collaborative thematic analysis, reflection and a secondary analysis were used to develop a framework for making reasonable adjustments in the hospital setting. The authors attended ward manager and matron group meetings to collect their claims, concerns and issues, then conducted a collaborative thematic analysis with the group members to identify the main themes. Findings Four main themes were identified from the ward manager and matron group meetings: communication, choice-making, collaboration and coordination. These were used to develop the 4C framework for making reasonable adjustments for people with learning disabilities in hospital. Discussion The 4C framework has provided a basis for delivering person-centred care for people with learning disabilities. It has been used to inform training needs analyses, to develop audit tools for reviewing whether care is appropriately adjusted to the individual patient, and to develop competencies for learning disability champions. The most significant benefit of the 4C framework has been in helping to evaluate and resolve practice-based scenarios. Conclusion Use of the 4C framework may enhance the care of people with learning disabilities in hospital, by enabling reasonable adjustments to be made in these settings.
NASA Astrophysics Data System (ADS)
Laiti, L.; Mallucci, S.; Piccolroaz, S.; Bellin, A.; Zardi, D.; Fiori, A.; Nikulin, G.; Majone, B.
2018-03-01
Assessing the accuracy of gridded climate data sets is highly relevant to climate change impact studies, since evaluation, bias correction, and statistical downscaling of climate models commonly use these products as reference. Among impact studies, those addressing hydrological fluxes are the most affected by the errors and biases plaguing these data. This paper introduces a framework, coined the Hydrological Coherence Test (HyCoT), for assessing the coherence of gridded data sets with hydrological observations. HyCoT provides a means of excluding meteorological forcing data sets that do not comply with observations, as a function of the particular goal at hand. The proposed methodology allows falsifying the hypothesis that a given data set is coherent with hydrological observations on the basis of the performance of hydrological modeling, measured by a metric selected by the modeler. HyCoT is demonstrated in the Adige catchment (southeastern Alps, Italy) for streamflow analysis, using a distributed hydrological model. The comparison covers the period 1989-2008 and includes five gridded daily meteorological data sets: E-OBS, MSWEP, MESAN, APGD, and ADIGE. The analysis highlights that APGD and ADIGE, the data sets with the highest effective resolution, display similar spatiotemporal precipitation patterns and produce the largest hydrological efficiency indices. Lower performances are observed for E-OBS, MESAN, and MSWEP, especially in small catchments. HyCoT reveals deficiencies in the representation of spatiotemporal patterns of gridded climate data sets, which cannot be corrected by simply rescaling the meteorological forcing fields, as often done in bias correction of climate model outputs. We recommend this framework for assessing the hydrological coherence of gridded data sets to be used in large-scale hydroclimatic studies.
Sivan, Manoj; Gallagher, Justin; Holt, Ray; Weightman, Andy; Levesley, Martin; Bhakta, Bipin
2014-01-01
This study evaluates whether the International Classification of Functioning, Disability, and Health (ICF) framework provides a useful basis to ensure that key user needs are identified in the development of a home-based arm rehabilitation system for stroke patients. Using a qualitative approach, nine people with residual arm weakness after stroke and six healthcare professionals with expertise in stroke rehabilitation were enrolled in the user-centered design process. They were asked, through semi-structured interviews, to define the needs and specification for a potential home-based rehabilitation device to facilitate self-managed arm exercise. The topic list for the interviews was derived by brainstorming ideas within the clinical and engineering multidisciplinary research team based on previous experience and existing literature in user-centered design. Meaningful concepts were extracted from questions and responses of these interviews. These concepts were matched to the categories within the ICF comprehensive core set for stroke using ICF linking rules. Most of the concepts extracted from the interviews matched to the existing ICF Core Set categories. Personal factors like gender, age, interest, compliance, motivation, choice, and convenience that might determine device usability are yet to be categorized within the ICF comprehensive core set. The results suggest that the categories of the comprehensive ICF Core Set for stroke provide a useful basis for structuring interviews to identify most users' needs. However, some personal factors (related to end users and healthcare professionals) need to be considered in addition to the ICF categories.
Habitat and environment of islands: primary and supplemental island sets
Matalas, Nicholas C.; Grossling, Bernardo F.
2002-01-01
The original intent of the study was to develop a first-order synopsis of island hydrology with an integrated geologic basis on a global scale. As the study progressed, the aim was broadened to provide a framework for subsequent assessments on large regional or global scales of island resources and impacts on those resources that are derived from global changes. Fundamental to the study was the development of a comprehensive framework: a wide range of parameters that describe a set of 'saltwater' islands sufficiently large to characterize the spatial distribution of the world's islands; account for all major archipelagos; account for almost all oceanically isolated islands; and account collectively for a very large proportion of the total area of the world's islands, whereby additional islands would only marginally contribute to the representativeness and accountability of the island set. The comprehensive framework, which is referred to as the 'Primary Island Set,' is built on 122 parameters that describe 1,000 islands. To complement the investigations based on the Primary Island Set, two supplemental island sets, Set A (Other Islands, not in the Primary Island Set) and Set B (Lagoonal Atolls), are included in the study. The Primary Island Set, together with the Supplemental Island Sets A and B, provides a framework that can be used in various scientific disciplines for their island-based studies on broad regional or global scales. The study uses an informal, coherent, geophysical organization of the islands that belong to the three island sets. The organization is in the form of a global island chain, which is a particular sequential ordering of the islands referred to as the 'Alisida.' The Alisida was developed through a trial-and-error procedure by seeking to strike a balance between minimizing the length of the global chain and maximizing the chain's geophysical coherence. The fact that an objective function cannot be minimized and maximized simultaneously indicates that the Alisida is not unique. Global island chains other than the Alisida may better serve disciplines other than those of hydrology and geology.
Dash, Bibek
2018-04-26
The present work deals with a density functional theory (DFT) study of porous organic framework materials containing N-based functional groups for CO2 capture. In this study, first-principles calculations were performed for CO2 adsorption using N-containing covalent organic framework (COF) models. Ab initio and DFT-based methods were used to characterize the N-containing porous model systems on the basis of their interaction energies upon complexing with CO2 and nitrogen gas. Binding energies (BEs) of CO2 and N2 molecules with the polymer framework were calculated with DFT methods. The hybrid B3LYP and second-order MP2 methods, combined with the Pople 6-31G(d,p) basis set and the correlation-consistent basis sets cc-pVDZ, cc-pVTZ, and aug-cc-pVDZ, were used to calculate BEs. The effect of linker groups in the designed covalent organic framework model system on the CO2 and N2 interactions was studied using quantum-chemical calculations.
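The abstract does not name a code; a minimal sketch of such a supermolecular binding-energy calculation, assuming the open-source PySCF package and using NH3 as a stand-in for the N-containing linker (geometries are illustrative; production work would optimize structures and add a counterpoise correction for basis-set superposition error), could read:

```python
from pyscf import gto, scf, mp

# Hypothetical supermolecular binding-energy sketch at MP2/cc-pVDZ:
# BE = E(complex) - E(CO2) - E(fragment), with NH3 as a minimal
# stand-in for the N-containing COF linker. Geometries are illustrative.

def mp2_energy(atom):
    mol = gto.M(atom=atom, basis='cc-pvdz', verbose=0)
    mf = scf.RHF(mol).run()          # restricted Hartree-Fock reference
    return mp.MP2(mf).run().e_tot    # total MP2 energy

co2 = 'C 0 0 0; O 0 0 1.16; O 0 0 -1.16'
nh3 = 'N 0 0 3.0; H 0.94 0 3.33; H -0.47 0.81 3.33; H -0.47 -0.81 3.33'

e_complex = mp2_energy(co2 + '; ' + nh3)
e_co2 = mp2_energy(co2)
e_nh3 = mp2_energy(nh3)

be = e_complex - e_co2 - e_nh3       # negative value indicates binding
print('MP2/cc-pVDZ binding energy (hartree):', be)
```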
Höfener, Sebastian; Bischoff, Florian A; Glöss, Andreas; Klopper, Wim
2008-06-21
In the recent years, Slater-type geminals (STGs) have been used with great success to expand the first-order wave function in an explicitly-correlated perturbation theory. The present work reports on this theory's implementation in the framework of the Turbomole suite of programs. A formalism is presented for evaluating all of the necessary molecular two-electron integrals by means of the Obara-Saika recurrence relations, which can be applied when the STG is expressed as a linear combination of a small number (n) of Gaussians (STG-nG geminal basis). In the Turbomole implementation of the theory, density fitting is employed and a complementary auxiliary basis set (CABS) is used for the resolution-of-the-identity (RI) approximation of explicitly-correlated theory. By virtue of this RI approximation, the calculation of molecular three- and four-electron integrals is avoided. An approximation is invoked to avoid the two-electron integrals over the commutator between the operators of kinetic energy and the STG. This approximation consists of computing commutators between matrices in place of operators. Integrals over commutators between operators would have occurred if the theory had been formulated and implemented as proposed originally. The new implementation in Turbomole was tested by performing a series of calculations on rotational conformers of the alkanols n-propanol through n-pentanol. Basis-set requirements concerning the orbital basis, the auxiliary basis set for density fitting and the CABS were investigated. Furthermore, various (constrained) optimizations of the amplitudes of the explicitly-correlated double excitations were studied. These amplitudes can be optimized in orbital-variant and orbital-invariant manners, or they can be kept fixed at the values governed by the rational generator approach, that is, by the electron cusp conditions. Electron-correlation effects beyond the level of second-order perturbation theory were accounted for by conventional coupled-cluster calculations with single, double and perturbative triple excitations [CCSD(T)]. The explicitly-correlated perturbation theory results were combined with CCSD(T) results and compared with literature data obtained by basis-set extrapolation.
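The STG-nG expansion mentioned above amounts to a least-squares fit of a Slater-type function by a few Gaussians. A hedged numerical sketch follows (the grid range and weighting are ad hoc illustrative choices, not the Turbomole fitting procedure):

```python
import numpy as np
from scipy.optimize import least_squares

# Fit a Slater-type geminal exp(-gamma * r) by a small linear combination
# of Gaussians, sum_i c_i * exp(-a_i * r**2): an "STG-nG" expansion.

gamma, n = 1.0, 6
r = np.linspace(1e-3, 7.0, 400)
target = np.exp(-gamma * r)

def residual(p):
    c, a = p[:n], np.exp(p[n:])      # log-parametrize exponents so a_i > 0
    model = (c[:, None] * np.exp(-a[:, None] * r**2)).sum(axis=0)
    return (model - target) * np.exp(-0.2 * r)   # mild large-r down-weighting

p0 = np.concatenate([np.ones(n), np.linspace(-2, 3, n)])
fit = least_squares(residual, p0)
c, a = fit.x[:n], np.exp(fit.x[n:])
print("coefficients:", c)
print("exponents:   ", a)
print("max weighted residual on grid:", np.abs(residual(fit.x)).max())
```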
ERIC Educational Resources Information Center
Randler, Christoph; Kummer, Barbara; Wilhelm, Christian
2012-01-01
The aim of this study was to assess the outcome of a zoo visit in terms of learning and retention of knowledge concerning the adaptations and behavior of vertebrate species. The basis of the work was the concept of implementing zoo visits as an out-of-school setting for formal, curriculum-based learning. Our theoretical framework centers on the…
A structured policy review of the principles of professional self-regulation.
Benton, D C; González-Jurado, M A; Beneit-Montesinos, J V
2013-03-01
The International Council of Nurses (ICN) has, for many years, based its work on professional self-regulation on a set of 12 principles. These principles are research-based and were identified nearly three decades ago. ICN has conducted a number of reviews of the principles; however, changes have been minimal. In the past 5-10 years, a number of authors and governments, often as part of the review of regulatory systems, have started to propose principles to guide the way regulatory frameworks are designed and implemented. These principles vary in number and content. This study examines the current policy literature on principle-based regulation and compares this with the set of principles advocated by the ICN. A systematic search of the literature on principle-based regulation is used as the basis for a qualitative thematic analysis to compare and contrast the 12 principles of self-regulation with more recently published work. A mapping of terms based on a detailed description of the principles used in the various research and policy documents was generated. This mapping forms the basis of a critique of the current ICN principles. Gaps in the principles of professional self-regulation advocated by the ICN were identified. A revised and extended set of 13 principles is needed if contemporary developments in the field of regulatory frameworks are to be accommodated. These revised principles should be considered for adoption by the ICN to underpin their advocacy work on professional self-regulation.
Skirton, Heather; Lewis, Celine; Kent, Alastair; Coviello, Domenico A
2010-01-01
The use of genetics and genomics within a wide range of health-care settings requires health professionals to develop expertise to practise appropriately. There is a need for a common minimum standard of competence in genetics for health professionals in Europe but because of differences in professional education and regulation between European countries, setting curricula may not be practical. Core competences are used as a basis for health professional education in many fields and settings. An Expert Group working under the auspices of the EuroGentest project and European Society of Human Genetics Education Committee agreed that a pragmatic solution to the need to establish common standards for education and practice in genetic health care was to agree to a set of core competences that could apply across Europe. These were agreed through an exhaustive process of consultation with relevant health professionals and patient groups. Sets of competences for practitioners working in primary, secondary and tertiary care have been agreed and were approved by the European Society of Human Genetics. The competences provide an appropriate framework for genetics education of health professionals across national boundaries, and the suggested learning outcomes are available to guide development of curricula that are appropriate to the national context, educational system and health-care setting of the professional involved. Collaboration between individuals from many European countries and professions has resulted in an adaptable framework for both pre-registration and continuing professional education. This competence framework has the potential to improve the quality of genetic health care for patients globally. PMID:20442748
NASA Astrophysics Data System (ADS)
Poirier, Vincent
Mesh deformation schemes play an important role in numerical aerodynamic optimization. As the aerodynamic shape changes, the computational mesh must adapt to conform to the deformed geometry. In this work, an extension to an existing fast and robust Radial Basis Function (RBF) mesh movement scheme is presented. Using a reduced set of surface points to define the mesh deformation increases the efficiency of the RBF method, at the cost of introducing errors into the parameterization by not recovering the exact displacement of all surface points. A secondary mesh movement is implemented, within an adjoint-based optimization framework, to eliminate these errors. The proposed scheme is tested within a 3D Euler flow by reducing the pressure drag, while maintaining lift, of a wing-body configured Boeing-747 and an Onera-M6 wing. As well, an inverse pressure design is executed on the Onera-M6 wing, and an inverse span loading case is presented for a wing-body configured DLR-F6 aircraft.
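As a hedged sketch of the primary RBF interpolation step (the Wendland C2 kernel and all parameters are illustrative choices, and the secondary correction step described above is omitted):

```python
import numpy as np

# Minimal RBF mesh-movement sketch: interpolate prescribed displacements
# at a reduced set of surface points to interior mesh nodes using a
# compactly supported Wendland C2 radial basis function.

def wendland_c2(r, R):
    """Wendland C2 kernel with support radius R (zero beyond R)."""
    x = np.clip(r / R, 0.0, 1.0)
    return (1.0 - x) ** 4 * (4.0 * x + 1.0)

def rbf_deform(surf_pts, surf_disp, vol_pts, R=2.0):
    # pairwise distances: surface-surface and volume-surface
    dss = np.linalg.norm(surf_pts[:, None, :] - surf_pts[None, :, :], axis=-1)
    dvs = np.linalg.norm(vol_pts[:, None, :] - surf_pts[None, :, :], axis=-1)
    M = wendland_c2(dss, R)             # interpolation matrix
    w = np.linalg.solve(M, surf_disp)   # one weight vector per coordinate
    return wendland_c2(dvs, R) @ w      # displacements at volume nodes

rng = np.random.default_rng(1)
surf = rng.uniform(-1, 1, (20, 3))       # reduced surface point set
disp = 0.1 * rng.normal(size=(20, 3))    # prescribed surface displacements
vol = rng.uniform(-1, 1, (500, 3))       # interior mesh nodes
print(rbf_deform(surf, disp, vol).shape) # (500, 3)
```

Using only a reduced surface set keeps the dense solve small, which is the efficiency gain the abstract refers to; the residual surface error is what the secondary movement then removes.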
Schmitt, Jochen; Apfelbacher, Christian; Spuls, Phyllis I; Thomas, Kim S; Simpson, Eric L; Furue, Masutaka; Chalmers, Joanne; Williams, Hywel C
2015-01-01
Core outcome sets (COSs) are consensus-derived minimum sets of outcomes to be assessed in a specific situation. COSs are being increasingly developed to limit outcome-reporting bias, allow comparisons across trials, and strengthen clinical decision making. Despite the increasing interest in outcomes research, methods to develop COSs have not yet been standardized. The aim of this paper is to present the Harmonizing Outcomes Measures for Eczema (HOME) roadmap for the development and implementation of COSs, which was developed on the basis of our experience in the standardization of outcome measurements for atopic eczema. Following the establishment of a panel representing all relevant stakeholders and a research team experienced in outcomes research, the scope and setting of the core set should be defined. The next steps are the definition of a core set of outcome domains such as symptoms or quality of life, followed by the identification or development and validation of appropriate outcome measurement instruments to measure these core domains. Finally, the consented COS needs to be disseminated, implemented, and reviewed. We believe that the HOME roadmap is a useful methodological framework to develop COSs in dermatology, with the ultimate goal of better decision making and promoting patient-centered health care.
Ab initio molecular simulations with numeric atom-centered orbitals
NASA Astrophysics Data System (ADS)
Blum, Volker; Gehrke, Ralf; Hanke, Felix; Havu, Paula; Havu, Ville; Ren, Xinguo; Reuter, Karsten; Scheffler, Matthias
2009-11-01
We describe a complete set of algorithms for ab initio molecular simulations based on numerically tabulated atom-centered orbitals (NAOs) to capture a wide range of molecular and materials properties from quantum-mechanical first principles. The full algorithmic framework described here is embodied in the Fritz Haber Institute "ab initio molecular simulations" (FHI-aims) computer program package. Its comprehensive description should be relevant to any other first-principles implementation based on NAOs. The focus here is on density-functional theory (DFT) in the local and semilocal (generalized gradient) approximations, but an extension to hybrid functionals, Hartree-Fock theory, and MP2/GW electron self-energies for total energies and excited states is possible within the same underlying algorithms. An all-electron/full-potential treatment that is both computationally efficient and accurate is achieved for periodic and cluster geometries on equal footing, including relaxation and ab initio molecular dynamics. We demonstrate the construction of transferable, hierarchical basis sets, allowing the calculation to range from qualitative tight-binding like accuracy to meV-level total energy convergence with the basis set. Since all basis functions are strictly localized, the otherwise computationally dominant grid-based operations scale as O(N) with system size N. Together with a scalar-relativistic treatment, the basis sets provide access to all elements from light to heavy. Both low-communication parallelization of all real-space grid based algorithms and a ScaLapack-based, customized handling of the linear algebra for all matrix operations are possible, guaranteeing efficient scaling (CPU time and memory) up to massively parallel computer systems with thousands of CPUs.
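The O(N) claim follows directly from strict localization: orbital pairs farther apart than the sum of their cutoff radii contribute exactly zero matrix elements. A toy count on a 1D chain (spacing and cutoff are illustrative) makes the linear scaling of the number of nonzero overlaps visible:

```python
import numpy as np

# Why strictly localized basis functions give O(N) sparsity: on a chain
# of atoms, orbital pairs farther apart than twice the cutoff radius have
# exactly zero overlap, so nonzero overlap-matrix entries grow linearly
# with the number of atoms (constant nonzeros per atom).

spacing, rcut = 2.0, 6.0   # atom spacing and strict localization radius

for natoms in (100, 200, 400, 800):
    x = spacing * np.arange(natoms)          # atom positions
    d = np.abs(x[:, None] - x[None, :])      # pair distances
    nnz = np.count_nonzero(d < 2.0 * rcut)   # potentially overlapping pairs
    print(natoms, nnz, nnz / natoms)         # nnz per atom stays constant
```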
Model-theoretic framework for sensor data fusion
NASA Astrophysics Data System (ADS)
Zavoleas, Kyriakos P.; Kokar, Mieczyslaw M.
1993-09-01
The main goal of our research in sensory data fusion (SDF) is the development of a systematic approach (a methodology) to designing systems for interpreting sensory information and for reasoning about the situation based upon this information and upon available databases and knowledge bases. To achieve this goal, two kinds of subgoals have been set: (1) develop a theoretical framework in which rational design/implementation decisions can be made, and (2) design a prototype SDF system along the lines of the framework. Our initial design of the framework has been described in our previous papers. In this paper we concentrate on the model-theoretic aspects of this framework. We postulate that data are embedded in data models, and information processing mechanisms are embedded in model operators. The paper is devoted to analyzing the classes of model operators and their significance in SDF. We investigate transformation, abstraction, and fusion operators. A prototype SDF system, fusing data from range and intensity sensors, is presented, exemplifying the structures introduced. Our framework is justified by the fact that it provides modularity, traceability of information flow, and a basis for a specification language for SDF.
A framework for modelling the complexities of food and water security under globalisation
NASA Astrophysics Data System (ADS)
Dermody, Brian J.; Sivapalan, Murugesu; Stehfest, Elke; van Vuuren, Detlef P.; Wassen, Martin J.; Bierkens, Marc F. P.; Dekker, Stefan C.
2018-01-01
We present a new framework for modelling the complexities of food and water security under globalisation. The framework sets out a method to capture regional and sectoral interdependencies and cross-scale feedbacks within the global food system that contribute to emergent water use patterns. The framework integrates aspects of existing models and approaches in the fields of hydrology and integrated assessment modelling. The core of the framework is a multi-agent network of city agents connected by infrastructural trade networks. Agents receive socio-economic and environmental constraint information from integrated assessment models and hydrological models respectively and simulate complex, socio-environmental dynamics that operate within those constraints. The emergent changes in food and water resources are aggregated and fed back to the original models with minimal modification of the structure of those models. It is our conviction that the framework presented can form the basis for a new wave of decision tools that capture complex socio-environmental change within our globalised world. In doing so they will contribute to illuminating pathways towards a sustainable future for humans, ecosystems and the water they share.
An ICT Adoption Framework for Education: A Case Study in Public Secondary School of Indonesia
NASA Astrophysics Data System (ADS)
Nurjanah, S.; Santoso, H. B.; Hasibuan, Z. A.
2017-01-01
This paper presents preliminary research findings on an ICT adoption framework for education. Although many studies of ICT adoption frameworks in education have been conducted in various countries, they lack analysis of the degree to which each component contributes to the framework's success. In this paper, a set of components linked to ICT adoption in education is observed, based on the literature and explorative analysis. The components are Infrastructure, Application, User Skills, Utilization, Finance, and Policy. The components are used as a basis to develop a questionnaire to capture the current state of ICT adoption in schools. The questionnaire data are processed using structural equation modeling (SEM). The results show that each component contributes differently to the ICT adoption framework. Finance has the strongest effect on Infrastructure readiness, whilst User Skills has the strongest effect on Utilization. The study concludes that the development of an ICT adoption framework should consider the contribution weights of the components, which can be used to guide the implementation of ICT adoption in education.
Toxicology ontology perspectives.
Hardy, Barry; Apic, Gordana; Carthew, Philip; Clark, Dominic; Cook, David; Dix, Ian; Escher, Sylvia; Hastings, Janna; Heard, David J; Jeliazkova, Nina; Judson, Philip; Matis-Mitchell, Sherri; Mitic, Dragana; Myatt, Glenn; Shah, Imran; Spjuth, Ola; Tcheremenskaia, Olga; Toldo, Luca; Watson, David; White, Andrew; Yang, Chihae
2012-01-01
The field of predictive toxicology requires the development of open, public, computable, standardized toxicology vocabularies and ontologies to support the applications required by in silico, in vitro, and in vivo toxicology methods and related analysis and reporting activities. In this article we review ontology developments based on a set of perspectives showing how ontologies are being used in predictive toxicology initiatives and applications. Perspectives on resources and initiatives reviewed include OpenTox, eTOX, Pistoia Alliance, ToxWiz, Virtual Liver, EU-ADR, BEL, ToxML, and Bioclipse. We also review existing ontology developments in neighboring fields that can contribute to establishing an ontological framework for predictive toxicology. A significant set of resources is already available to provide a foundation for an ontological framework for 21st century mechanistic-based toxicology research. Ontologies such as ToxWiz provide a basis for application to toxicology investigations, whereas other ontologies under development in the biological, chemical, and biomedical communities could be incorporated in an extended future framework. OpenTox has provided a semantic web framework for the implementation of such ontologies into software applications and linked data resources. Bioclipse developers have shown the benefit of interoperability obtained through ontology by being able to link their workbench application with remote OpenTox web services. Although these developments are promising, an increased international coordination of efforts is greatly needed to develop a more unified, standardized, and open toxicology ontology framework.
Van Dijk-de Vries, Anneke N; Duimel-Peeters, Inge G P; Muris, Jean W; Wesseling, Geertjan J; Beusmans, George H M I; Vrijhoef, Hubertus J M
2016-04-08
Teamwork between healthcare providers is a precondition for the delivery of integrated care. This study aimed to assess the usefulness of the conceptual framework Integrated Team Effectiveness Model for the development and testing of the Integrated Team Effectiveness Instrument. Focus groups with healthcare providers in an integrated care setting for people with chronic obstructive pulmonary disease (COPD) were conducted to examine the recognisability of the conceptual framework and to explore critical success factors for collaborative COPD practice arising from this framework. The resulting items were transposed into a pilot instrument. This was reviewed by expert opinion and completed 153 times by healthcare providers. The underlying structure and internal consistency of the instrument were verified by factor analysis and Cronbach's alpha. The conceptual framework turned out to be comprehensible for discussing teamwork effectiveness. The pilot instrument measures 25 relevant aspects of teamwork in integrated COPD care. Factor analysis suggested three reliable components: teamwork effectiveness, team processes and team psychosocial traits (Cronbach's alpha between 0.76 and 0.81). The conceptual framework Integrated Team Effectiveness Model is relevant for developing a practical full-spectrum instrument to facilitate discussion of teamwork effectiveness. The Integrated Team Effectiveness Instrument provides a well-founded basis for healthcare providers to self-evaluate teamwork effectiveness in integrated COPD care. Recommendations are provided for the improvement of the instrument.
Banerjee, Amartya S.; Lin, Lin; Hu, Wei; ...
2016-10-21
The Discontinuous Galerkin (DG) electronic structure method employs an adaptive local basis (ALB) set to solve the Kohn-Sham equations of density functional theory in a discontinuous Galerkin framework. The adaptive local basis is generated on-the-fly to capture the local material physics and can systematically attain chemical accuracy with only a few tens of degrees of freedom per atom. A central issue for large-scale calculations, however, is the computation of the electron density (and subsequently, ground state properties) from the discretized Hamiltonian in an efficient and scalable manner. We show in this work how Chebyshev polynomial filtered subspace iteration (CheFSI) can be used to address this issue and push the envelope in large-scale materials simulations in a discontinuous Galerkin framework. We describe how the subspace filtering steps can be performed in an efficient and scalable manner using a two-dimensional parallelization scheme, thanks to the orthogonality of the DG basis set and the block-sparse structure of the DG Hamiltonian matrix. The on-the-fly nature of the ALB functions requires additional care in carrying out the subspace iterations. We demonstrate the parallel scalability of the DG-CheFSI approach in calculations of large-scale two-dimensional graphene sheets and bulk three-dimensional lithium-ion electrolyte systems. Employing 55,296 computational cores, the time per self-consistent field iteration for a sample of the bulk 3D electrolyte containing 8,586 atoms is 90 s, and the time for a graphene sheet containing 11,520 atoms is 75 s.
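A minimal serial sketch of the CheFSI kernel follows (a dense numpy stand-in for the block-sparse DG Hamiltonian; the filter bounds are taken from an exact diagonalization purely for illustration, whereas practical codes estimate them, e.g., with a few Lanczos steps):

```python
import numpy as np

# Minimal Chebyshev-filtered subspace iteration (CheFSI) for the lowest
# eigenpairs of a symmetric H. The filter damps the unwanted spectrum in
# [a, b] while amplifying components below a.

def cheb_filter(H, X, m, a, b):
    """Apply a degree-m Chebyshev filter for the interval [a, b] to X."""
    e, c = (b - a) / 2.0, (b + a) / 2.0
    Y = (H @ X - c * X) / e
    for _ in range(2, m + 1):
        Ynew = 2.0 * (H @ Y - c * Y) / e - X   # Chebyshev recurrence
        X, Y = Y, Ynew
    return Y

rng = np.random.default_rng(0)
n, k = 400, 20
A = rng.normal(size=(n, n))
H = (A + A.T) / 2.0                      # symmetric test Hamiltonian

X = rng.normal(size=(n, k))              # random starting subspace
evals = np.linalg.eigvalsh(H)            # toy-only: exact filter bounds
a, b = evals[k] + 0.1, evals[-1]         # damp everything above state k

for it in range(20):
    X = cheb_filter(H, X, m=10, a=a, b=b)
    X, _ = np.linalg.qr(X)               # re-orthonormalize the block
    Hs = X.T @ H @ X                     # Rayleigh-Ritz projection
    w, V = np.linalg.eigh(Hs)
    X = X @ V                            # rotate to Ritz vectors

print("lowest Ritz values:", w[:5])
print("exact eigenvalues: ", evals[:5])
```

The appeal of the scheme in the DG setting is that its dominant cost is matrix-block products with H, which parallelize well when H is block-sparse and the basis is orthogonal.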
Pitz, Carline; Mahy, Grégory; Vermeulen, Cédric; Marlet, Christine; Séleck, Maxime
2016-07-01
This study aims to establish a common Key Performance Indicator (KPI) framework for reporting on biodiversity in the gypsum industry at the European level. In order to integrate different opinions and to reach a consensus framework, an original participatory process was developed among different stakeholder groups: Eurogypsum, European and regional authorities, university scientists, consulting offices, European and regional associations for the conservation of nature, and the extractive industry. The strategy is developed around four main steps: (1) building a maximum set of indicators to be submitted to stakeholders based on the literature (Focus Group method); (2) evaluating the consensus about indicators through a policy Delphi survey aiming at the prioritization of indicator classes, using the Analytic Hierarchy Process (AHP) method, and of individual indicators; (3) testing acceptability and feasibility through analysis of Environmental Impact Assessments (EIAs) and visits to three European quarries; (4) Eurogypsum final decision and communication. The resulting framework contains a set of 11 indicators considered the most suitable by all the stakeholders. Our KPIs respond to European legislation and strategies for biodiversity. The framework aims at improving sustainability in quarries and at helping to manage biodiversity, as well as allowing the creation of coherent reporting systems. The final goal is to allow for the definition of the actual biodiversity status of gypsum quarries and for enhancing it. The framework is adaptable to the local context of each gypsum quarry.
Helitzer, Deborah L; Sussman, Andrew L; Hoffman, Richard M; Getrich, Christina M; Warner, Teddy D; Rhyne, Robert L
2014-08-01
Conceptual frameworks (CFs) have historically been used to develop program theory. We re-examine the literature on the role of CFs in this context, specifically how they can be used to create descriptive and prescriptive theories as building blocks for a program theory. Using a case example of colorectal cancer screening intervention development, we describe the process of developing our initial CF, the methods used to explore the constructs in the framework, and the revision of the framework for intervention development. We present seven steps that guided the development of our CF: (1) assemble the "right" research team, (2) incorporate existing literature into the emerging CF, (3) construct the conceptual framework, (4) diagram the framework, (5) operationalize the framework: develop the research design and measures, (6) conduct the research, and (7) revise the framework. A revised conceptual framework depicted more complicated inter-relationships of the different predisposing, enabling, reinforcing, and system-based factors. The updated framework led us to generate program theory and serves as the basis for designing future intervention studies and outcome evaluations. A CF can build a foundation for program theory. We provide a set of concrete steps and lessons learned to assist practitioners in developing a CF.
The evolution of PBMA: towards a macro-level priority setting framework for health regions.
Mitton, Craig R; Donaldson, Cam; Waldner, Howard; Eagle, Chris
2003-11-01
To date, relatively little work on priority setting has been carried out at a macro-level across major portfolios within integrated health care organizations. This paper describes a macro marginal analysis (MMA) process for setting priorities and allocating resources in health authorities, based on work carried out in a major urban health region in Alberta, Canada. MMA centers around an expert working group of managers and clinicians who are charged with identifying areas for resource re-allocation on an ongoing basis. Trade-offs between services are based on locally defined criteria and are informed by multiple inputs such as evidence from the literature and local expert opinion. The approach is put forth as a significant improvement on historical resource allocation patterns.
Two-rowed Hecke algebra representations at roots of unity
NASA Astrophysics Data System (ADS)
Welsh, Trevor Alan
1996-02-01
In this paper, we initiate a study into the explicit construction of irreducible representations of the Hecke algebra H_n(q) of type A_{n-1} in the non-generic case where q is a root of unity. The approach is via the Specht modules of H_n(q), which are irreducible in the generic case and possess a natural basis indexed by Young tableaux. The general framework in which the irreducible non-generic H_n(q)-modules are to be constructed is set up and, in particular, the full set of modules corresponding to two-part partitions is described. Plentiful examples are given.
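For reference, the defining relations of H_n(q) in one common normalization (conventions for the quadratic relation differ across the literature, so this is one standard choice rather than necessarily the paper's) are:

```latex
% Defining relations of the type A_{n-1} Hecke algebra H_n(q),
% in one common convention:
\begin{align*}
  T_i T_{i+1} T_i &= T_{i+1} T_i T_{i+1}, && 1 \le i \le n-2,\\
  T_i T_j &= T_j T_i, && |i-j| \ge 2,\\
  (T_i - q)(T_i + 1) &= 0, && 1 \le i \le n-1.
\end{align*}
% For generic q the Specht modules indexed by Young tableaux are
% irreducible; at a root of unity they need not be, which is the
% non-generic case studied here.
```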
Goal setting in paediatric rehabilitation for children with motor disabilities: a scoping review.
Pritchard-Wiart, Lesley; Phelan, Shanon K
2018-02-01
The three objectives of this scoping review were to (1) identify key conceptual/theoretical frameworks and the extent to which they are used to inform rehabilitation goal setting with children with motor disabilities, (2) describe research that has evaluated goal setting processes and outcomes, and (3) summarize the purposes of goal setting described in the paediatric rehabilitation literature. The scoping review process described by Arksey and O'Malley was used to guide article selection and data extraction. A total of 62 articles were included in the final review. While the concept of family-centered care was well represented, theoretical frameworks specific to goal setting (i.e. goal setting theory as described by Locke and Latham, mastery motivation, social cognitive, personal construct, and self-determination theories) were rarely addressed. No articles reviewed addressed prominent behavior change theory. With the exception of the description of tools specifically designed for use with children, the role of the child in the goal setting process was generally absent or not well described. Few studies (n = 6) discussed the linkage between goals and intervention strategies explicitly. Only two studies in the review evaluated outcomes associated with goal setting. The primary purpose for goal setting identified in the literature was to develop goals that are meaningful to families (n = 49). The results highlight significant gaps in the literature explicating a sound theoretical basis for goal setting in paediatric rehabilitation, and in research evaluating the effects of goal qualities and goal setting processes on the achievement of meaningful outcomes.
Haji, Faizal A; Da Silva, Celina; Daigle, Delton T; Dubrowski, Adam
2014-08-01
Presently, health care simulation research is largely conducted on a study-by-study basis. Although such "project-based" research generates a plethora of evidence, it can be chaotic and contradictory. A move toward sustained, thematic, theory-based programs of research is necessary to advance knowledge in the field. Recognizing that simulation is a complex intervention, we present a framework for developing research programs in simulation-based education adapted from the Medical Research Council (MRC) guidance. This framework calls for an iterative approach to developing, refining, evaluating, and implementing simulation interventions. The adapted framework guidance emphasizes: (1) identification of theory and existing evidence; (2) modeling and piloting interventions to clarify active ingredients and identify mechanisms linking the context, intervention, and outcomes; and (3) evaluation of intervention processes and outcomes in both the laboratory and real-world setting. The proposed framework will aid simulation researchers in developing more robust interventions that optimize simulation-based education and advance our understanding of simulation pedagogy.
A framework for designing hand hygiene educational interventions in schools.
Appiah-Brempong, Emmanuel; Harris, Muriel J; Newton, Samuel; Gulis, Gabriel
2018-03-01
Hygiene education appears to be the commonest school-based intervention for preventing infectious diseases, especially in the developing world. Nevertheless, there remains a gap in the literature regarding a school-specific, theory-based framework for designing hand hygiene educational interventions in schools. We sought to suggest a framework underpinned by psychosocial theories to bridge this knowledge gap. Furthermore, we sought to propound a more comprehensive definition of hand hygiene that could guide the conceptualisation of hand hygiene interventions in varied settings. The literature search was guided by a standardized tool, and literature was retrieved on the basis of predetermined inclusion criteria. Databases consulted include PubMed, ERIC, and EBSCOhost (Medline, CINAHL, PsycINFO, etc.). Evidence concerning a theoretical framework to aid the design of school-based hand hygiene educational interventions is summarized narratively. School-based hand hygiene educational interventions seeking to positively influence behavioural outcomes could consider enhancing psychosocial variables including behavioural capacity, attitudes, and subjective norms (normative beliefs and motivation to comply). A framework underpinned by formalized psychosocial theories is relevant and could enhance the design of hand hygiene educational interventions, especially in schools.
Health literacy and public health: a systematic review and integration of definitions and models.
Sørensen, Kristine; Van den Broucke, Stephan; Fullam, James; Doyle, Gerardine; Pelikan, Jürgen; Slonska, Zofia; Brand, Helmut
2012-01-25
Health literacy concerns the knowledge and competences of persons to meet the complex demands of health in modern society. Although its importance is increasingly recognised, there is no consensus about the definition of health literacy or about its conceptual dimensions, which limits the possibilities for measurement and comparison. The aim of the study is to review definitions and models on health literacy to develop an integrated definition and conceptual model capturing the most comprehensive evidence-based dimensions of health literacy. A systematic literature review was performed to identify definitions and conceptual frameworks of health literacy. A content analysis of the definitions and conceptual frameworks was carried out to identify the central dimensions of health literacy and develop an integrated model. The review resulted in 17 definitions of health literacy and 12 conceptual models. Based on the content analysis, an integrative conceptual model was developed containing 12 dimensions referring to the knowledge, motivation and competencies of accessing, understanding, appraising and applying health-related information within the healthcare, disease prevention and health promotion setting, respectively. Based upon this review, a model is proposed integrating medical and public health views of health literacy. The model can serve as a basis for developing health literacy enhancing interventions and provide a conceptual basis for the development and validation of measurement tools, capturing the different dimensions of health literacy within the healthcare, disease prevention and health promotion settings.
Novel gene sets improve set-level classification of prokaryotic gene expression data.
Holec, Matěj; Kuželka, Ondřej; Železný, Filip
2015-10-28
Set-level classification of gene expression data has received significant attention recently. In this setting, high-dimensional vectors of features corresponding to genes are converted into lower-dimensional vectors of features corresponding to biologically interpretable gene sets. The dimensionality reduction brings the promise of a decreased risk of overfitting, potentially resulting in improved accuracy of the learned classifiers. However, recent empirical research has not confirmed this expectation. Here we hypothesize that the reported unfavorable classification results in the set-level framework were due to the adoption of unsuitable gene sets defined typically on the basis of the Gene Ontology and the KEGG database of metabolic networks. We explore an alternative approach to defining gene sets, based on regulatory interactions, which we expect to collect genes with more correlated expression. We hypothesize that such more correlated gene sets will allow more accurate classifiers to be learned. We define two families of gene sets using information on regulatory interactions, and evaluate them on phenotype-classification tasks using public prokaryotic gene expression data sets. From each of the two gene-set families, we first select the best-performing subtype. The two selected subtypes are then evaluated on independent (testing) data sets against state-of-the-art gene sets and against the conventional gene-level approach. The novel gene sets are indeed more correlated than the conventional ones, and lead to significantly more accurate classifiers. Novel gene sets defined on the basis of regulatory interactions thus improve set-level classification of gene expression data. The experimental scripts and other material needed to reproduce the experiments are available at http://ida.felk.cvut.cz/novelgenesets.tar.gz.
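The set-level pipeline itself is simple to state. A hedged sketch with synthetic data and random set memberships follows (in the study the groupings would instead come from regulatory interactions, e.g., regulons, and accuracy on this random toy is necessarily near chance):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Generic set-level classification: collapse gene-level expression to
# set-level features (here, the mean over each gene set), then learn a
# classifier on the reduced representation.

rng = np.random.default_rng(0)
n_samples, n_genes, n_sets = 60, 500, 40
X_gene = rng.normal(size=(n_samples, n_genes))   # expression matrix
y = rng.integers(0, 2, size=n_samples)           # phenotype labels

# random gene-set memberships of 10 genes each (synthetic stand-in)
gene_sets = [rng.choice(n_genes, size=10, replace=False)
             for _ in range(n_sets)]

# set-level feature matrix: one column per gene set
X_set = np.column_stack([X_gene[:, g].mean(axis=1) for g in gene_sets])

clf = LogisticRegression(max_iter=1000)
print("set-level CV accuracy:", cross_val_score(clf, X_set, y, cv=5).mean())
```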
Using the 'protective environment' framework to analyse children's protection needs in Darfur.
Ager, Alastair; Boothby, Neil; Bremer, Megan
2009-10-01
A major humanitarian concern during the continuing crisis in Darfur, Sudan, has been the protection of children, although there has been little in the way of comprehensive analysis to guide intervention. Founded on a situational analysis conducted between October 2005 and March 2006, this paper documents the significant threats to children's well-being directly linked to the political conflict. It demonstrates the role of non-conflict factors in exacerbating these dangers and in promoting additional protection violations, and it uses the 'protective environment' framework (UNICEF Sudan, 2006a) to identify systematic features of the current environment that put children at risk. This framework is shown to provide a coherent basis for assessment and planning, prompting broad, multidisciplinary analysis, concentrating on preventive and protective action, and fostering a systemic approach (rather than placing an undue focus on the discrete needs of 'vulnerable groups'). Constraints on its present utility in emergency settings are also noted.
Witt, Claudia M; Pérard, Marion; Berman, Brian; Berman, Susan; Birdsall, Timothy C; Defren, Horst; Kümmel, Sherko; Deng, Gary; Dobos, Gustav; Drexler, Atje; Holmberg, Christine; Horneber, Markus; Jütte, Robert; Knutson, Lori; Kummer, Christopher; Volpers, Susanne; Schweiger, David
2015-01-01
An increasing number of clinics offer complementary or integrative medicine services; however, clear guidance about how complementary medicine could be successfully and efficiently integrated into conventional health care settings is still lacking. Combining conventional and complementary medicine into integrative medicine can be regarded as a kind of merger. In a merger, two or more organizations - usually companies - are combined into one in order to strengthen the companies financially and strategically. The corporate culture of both merger partners has an important influence on the integration. The aim of this project was to transfer the concept of corporate culture in mergers to the merging of two medical systems. A two-step approach (literature analyses and expert consensus procedure) was used to develop practical guidance for the development of a cultural basis for integrative medicine, based on the framework of corporate culture in "mergers," which could be used to build an integrative medicine department or integrative medicine service. Results include recommendations for general strategic dimensions (definition of the medical model, motivation for integration, clarification of the available resources, development of the integration team, and development of a communication strategy), and recommendations to overcome cultural differences (the clinic environment, the professional language, the professional image, and the implementation of evidence-based medicine). The framework of mergers in corporate culture provides an understanding of the difficulties involved in integrative medicine projects. The specific recommendations provide a good basis for more efficient implementation.
Van Dijk-de Vries, Anneke N.; Duimel-Peeters, Inge G. P.; Muris, Jean W.; Wesseling, Geertjan J.; Beusmans, George H. M. I.
2016-01-01
Introduction: Teamwork between healthcare providers is a precondition for the delivery of integrated care. This study aimed to assess the usefulness of the conceptual framework Integrated Team Effectiveness Model for developing and testing the Integrated Team Effectiveness Instrument. Theory and methods: Focus groups with healthcare providers in an integrated care setting for people with chronic obstructive pulmonary disease (COPD) were conducted to examine the recognisability of the conceptual framework and to explore critical success factors for collaborative COPD practice derived from this framework. The resulting items were transposed into a pilot instrument. This was reviewed by experts and completed 153 times by healthcare providers. The underlying structure and internal consistency of the instrument were verified by factor analysis and Cronbach’s alpha. Results: The conceptual framework turned out to be comprehensible for discussing teamwork effectiveness. The pilot instrument measures 25 relevant aspects of teamwork in integrated COPD care. Factor analysis suggested three reliable components: teamwork effectiveness, team processes and team psychosocial traits (Cronbach’s alpha between 0.76 and 0.81). Conclusions and discussion: The conceptual framework Integrated Team Effectiveness Model is relevant in developing a practical full-spectrum instrument to facilitate discussing teamwork effectiveness. The Integrated Team Effectiveness Instrument provides a well-founded basis for healthcare providers to self-evaluate teamwork effectiveness in integrated COPD care. Recommendations are provided for the improvement of the instrument. PMID:27616953
NASA Astrophysics Data System (ADS)
Fondevilla, Víctor; Dinarès-Turell, Jaume; Oms, Oriol
2016-05-01
The evolution of the end-Cretaceous terrestrial ecosystems and faunas outside of North America is largely restricted to the European Archipelago. The information scattered in this last area can only be integrated in a chronostratigraphic framework on the basis of robust age constraints and stratigraphy. Therefore, we have revisited the puzzling age calibration of the sedimentary infilling from the Isona sector in the Tremp syncline (South-Central Pyrenees), an area renowned for its rich Maastrichtian dinosaur fossil record. Aiming to shed light on existing controversial age determinations, we carried out a new magnetostratigraphic study along the ~ 420 m long Orcau and Nerets sections of that area. Our results reveal that most of the succession correlates to the early Maastrichtian (mostly chron C31r) in accordance with ages proposed by recent planktonic foraminifera biostratigraphy. The resulting chronostratigraphic framework of the entire Maastrichtian basin recorded in the Tremp syncline shows that a significant sedimentary hiatus of about 3 My characterizes most of the late Maastrichtian in the study area. This hiatus, related to an abrupt migration of the basin depocenter, is temporally close to similar hiatuses, decreases in sedimentary rates and facies shifts recorded in other southwestern European areas. The present chronologic framework sets the basis for a thorough assessment of end-Cretaceous terrestrial faunal turnover and extinction patterns, and the establishment of a more rigorous Pyrenean basin evolution analysis.
Tamura, Koichiro; Tao, Qiqing; Kumar, Sudhir
2018-01-01
RelTime estimates divergence times by relaxing the assumption of a strict molecular clock in a phylogeny. It shows excellent performance in estimating divergence times for both simulated and empirical molecular sequence data sets in which evolutionary rates varied extensively throughout the tree. RelTime is computationally efficient and scales well with increasing size of data sets. Until now, however, RelTime has not had a formal mathematical foundation. Here, we show that the basis of the RelTime approach is a relative rate framework (RRF) that combines comparisons of evolutionary rates in sister lineages with the principle of minimum rate change between evolutionary lineages and their respective descendants. We present analytical solutions for estimating relative lineage rates and divergence times under RRF. We also discuss the relationship of RRF with other approaches, including the Bayesian framework. We conclude that RelTime will be useful for phylogenies with branch lengths derived not only from molecular data, but also morphological and biochemical traits. PMID:29893954
Towards a Quantum Theory of Humour
NASA Astrophysics Data System (ADS)
Gabora, Liane; Kitto, Kirsty
2016-12-01
This paper proposes that cognitive humour can be modelled using the mathematical framework of quantum theory, suggesting that a Quantum Theory of Humour (QTH) is a viable approach. We begin with brief overviews of both research on humour and the generalized quantum framework. We show how the bisociation of incongruous frames or word meanings in jokes can be modelled as a linear superposition of a set of basis states, or possible interpretations, in a complex Hilbert space. The choice of possible interpretations depends on the context provided by the set-up versus the punchline of a joke. We apply QTH first to a verbal pun, and then consider how this might be extended to frame blending in cartoons. An initial study of 85 participant responses to 35 jokes (and a number of variants) suggests that a quantum approach to the modelling of cognitive humour is a viable new avenue of research for the field of quantum cognition.
Huo, Zhiguang; Ding, Ying; Liu, Silvia; Oesterreich, Steffi; Tseng, George
2016-01-01
Disease phenotyping by omics data has become a popular approach that potentially can lead to better personalized treatment. Identifying disease subtypes via unsupervised machine learning is the first step towards this goal. In this paper, we extend a sparse K-means method towards a meta-analytic framework to identify novel disease subtypes when expression profiles of multiple cohorts are available. The lasso regularization and meta-analysis identify a unique set of gene features for subtype characterization. An additional pattern matching reward function guarantees consistent subtype signatures across studies. The method was evaluated by simulations and leukemia and breast cancer data sets. The identified disease subtypes from meta-analysis were characterized with improved accuracy and stability compared to single study analysis. The breast cancer model was applied to an independent METABRIC dataset and generated improved survival difference between subtypes. These results provide a basis for diagnosis and development of targeted treatments for disease subgroups. PMID:27330233
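The sparse K-means building block that this meta-analytic framework extends can be sketched in a few lines. The following is a minimal, illustrative single-study implementation (alternating clustering on weighted features with an L1/L2-constrained weight update, in the style of Witten and Tibshirani); it is not the authors' meta-analytic method, and all function names and parameter values are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

def soft_threshold(x, delta):
    return np.sign(x) * np.maximum(np.abs(x) - delta, 0.0)

def per_feature_bcss(X, labels):
    # between-cluster sum of squares per feature: total SS minus within-cluster SS
    total = ((X - X.mean(0)) ** 2).sum(0)
    within = np.zeros(X.shape[1])
    for k in np.unique(labels):
        Xk = X[labels == k]
        within += ((Xk - Xk.mean(0)) ** 2).sum(0)
    return total - within

def sparse_kmeans(X, n_clusters, s, n_iter=10, seed=0):
    """Alternate (a) k-means on sqrt(w)-scaled features and (b) a lasso-style
    weight update maximizing weighted BCSS subject to ||w||_2 <= 1, ||w||_1 <= s."""
    w = np.ones(X.shape[1]) / np.sqrt(X.shape[1])
    for _ in range(n_iter):
        labels = KMeans(n_clusters, n_init=10, random_state=seed).fit_predict(X * np.sqrt(w))
        a = per_feature_bcss(X, labels)
        lo, hi = 0.0, np.abs(a).max()
        for _ in range(50):  # binary search for the soft threshold meeting the L1 budget
            delta = 0.5 * (lo + hi)
            w_new = soft_threshold(a, delta)
            norm = np.linalg.norm(w_new)
            w_new = w_new / norm if norm > 0 else w_new
            lo, hi = (delta, hi) if np.abs(w_new).sum() > s else (lo, delta)
        w = w_new
    return labels, w

# toy check: only the first 5 of 50 features carry cluster signal
X = np.random.default_rng(0).normal(size=(100, 50))
X[:50, :5] += 2.0
labels, w = sparse_kmeans(X, n_clusters=2, s=3.0)
print("heaviest features:", np.argsort(w)[::-1][:5])
```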
DFT study of the effect of substituents on the absorption and emission spectra of Indigo
2012-01-01
Background Theoretical analyses of the indigo dye molecule and its derivatives with chlorine (Cl), sulfur (S), selenium (Se) and bromine (Br) substituents, as well as an analysis of the Hemi-Indigo molecule, were performed using the Gaussian 03 software package. Results Calculations were performed within the framework of density functional theory (DFT) with the Becke three-parameter Lee–Yang–Parr (B3LYP) functional, where the 6-31G(d,p) basis set was employed. The configuration interaction singles (CIS) method with the same basis set was employed for the analysis of excited states and for the acquisition of the emission spectra. Conclusions The presented absorption and emission spectra were affected by the substitution position. When a hydrogen atom of the molecule was substituted by Cl or Br, practically no change in the absorbed and emitted energies relative to those of the indigo molecule was observed; however, when N was substituted by S or Se, the absorbed and emitted energies increased. PMID:22809100
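As a concrete, hedged illustration of the computational recipe in this abstract, the same level of theory (a B3LYP/6-31G(d,p) ground state followed by CIS for excited states) can be reproduced with an open-source code such as PySCF. The original work used Gaussian 03, and the tiny placeholder molecule below stands in for indigo, which is far too large for a quick demo.

```python
from pyscf import gto, scf, dft, tdscf

# placeholder molecule (water) standing in for the indigo derivatives
mol = gto.M(atom="O 0 0 0; H 0 0 0.96; H 0.93 0 -0.24", basis="6-31g**")  # 6-31G(d,p)

# ground state: DFT with the B3LYP functional, as in the abstract
mf = dft.RKS(mol)
mf.xc = "b3lyp"
mf.kernel()

# excited states: CIS, i.e. the Tamm-Dancoff approximation on a HF reference
hf = scf.RHF(mol).run()
cis = tdscf.TDA(hf)
cis.nstates = 3
cis.kernel()
print("lowest CIS excitation energies (Hartree):", cis.e)
```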
Compressing random microstructures via stochastic Wang tilings.
Novák, Jan; Kučerová, Anna; Zeman, Jan
2012-10-01
This Rapid Communication presents a stochastic Wang tiling-based technique to compress or reconstruct disordered microstructures on the basis of given spatial statistics. Unlike the existing approaches based on a single unit cell, it utilizes a finite set of tiles assembled by a stochastic tiling algorithm, thereby making it possible to reproduce long-range orientation orders accurately and in a computationally efficient manner. Although the basic features of the method are demonstrated for a two-dimensional particulate suspension, the present framework is fully extensible to generic multidimensional media.
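For readers unfamiliar with Wang tiles, the assembly step can be sketched as follows: tiles carry edge codes, and a stochastic tiler picks uniformly among the tiles whose west and north edges match the already-placed neighbours. This is only a generic toy; the tile set below and the microstructure-synthesis details of the paper are not reproduced.

```python
import random

# a Wang tile is a 4-tuple of edge codes: (north, east, south, west)
TILES = [(0, 1, 0, 1), (1, 0, 1, 0), (0, 0, 1, 1),
         (1, 1, 0, 0), (0, 1, 1, 0), (1, 0, 0, 1)]

def stochastic_tiling(rows, cols, tiles=TILES, seed=0):
    """Assemble a grid row by row, picking uniformly among tiles whose west
    edge matches the left neighbour's east edge and whose north edge matches
    the upper neighbour's south edge."""
    rng = random.Random(seed)
    grid = [[None] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            candidates = [t for t in tiles
                          if (j == 0 or t[3] == grid[i][j - 1][1])
                          and (i == 0 or t[0] == grid[i - 1][j][2])]
            if not candidates:  # dead end: a poorer tile set could fail here
                raise RuntimeError("no admissible tile; enlarge the tile set")
            grid[i][j] = rng.choice(candidates)
    return grid

tiling = stochastic_tiling(4, 8)
print(tiling[0])
```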
A theoretical framework to support research of health service innovation.
Fox, Amanda; Gardner, Glenn; Osborne, Sonya
2015-02-01
Health service managers and policy makers are increasingly concerned about the sustainability of innovations implemented in health care settings. The increasing demand on health services requires that innovations are both effective and sustainable; however, research in this field is limited, with multiple disciplines, approaches and paradigms influencing the field. These variations prevent a cohesive approach, and therefore the accumulation of research findings, in the development of a body of knowledge. The purpose of this paper is to provide a thorough examination of the research findings and an appropriate theoretical framework for examining the sustainability of health service innovation. This paper presents an integrative review of the literature available in relation to sustainability of health service innovation and develops a theoretical framework based on integration and synthesis of the literature. A theoretical framework serves to guide research, determine variables, influence data analysis and is central to the quest for ongoing knowledge development. This research outlines the sustainability of innovation framework, a theoretical framework suitable for examining the sustainability of health service innovation. If left unaddressed, health services research will continue in an ad hoc manner, preventing full utilisation of outcomes, recommendations and knowledge for effective provision of health services. The sustainability of innovation theoretical framework provides an operational basis upon which reliable future research can be conducted.
Developing a holistic policy and intervention framework for global mental health.
Khenti, Akwatu; Fréel, Stéfanie; Trainor, Ruth; Mohamoud, Sirad; Diaz, Pablo; Suh, Erica; Bobbili, Sireesha J; Sapag, Jaime C
2016-02-01
There are significant gaps in the accessibility and quality of mental health services around the globe. A wide range of institutions are addressing the challenges, but there is limited reflection and evaluation on the various approaches, how they compare with each other, and conclusions regarding the most effective approach for particular settings. This article presents a framework for global mental health capacity building that could potentially serve as a promising or best practice in the field. The framework is the outcome of a decade of collaborative global health work at the Centre for Addiction and Mental Health (CAMH) (Ontario, Canada). The framework is grounded in scientific evidence, relevant learning and behavioural theories and the underlying principles of health equity and human rights. Grounded in CAMH's research, programme evaluation and practical experience in developing and implementing mental health capacity building interventions, this article presents the iterative learning process and impetus that formed the basis of the framework. A developmental evaluation (Patton, M. 2010. Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use. New York: Guilford Press) approach was used to build the framework, as global mental health collaboration occurs in complex or uncertain environments and evolving learning systems. A multilevel framework consists of five central components: (1) holistic health, (2) cultural and socioeconomic relevance, (3) partnerships, (4) collaborative action-based education and learning and (5) sustainability. The framework's practical application is illustrated through the presentation of three international case studies and four policy implications. Lessons learned, limitations and future opportunities are also discussed. The holistic policy and intervention framework for global mental health reflects an iterative learning process that can be applied and scaled up across different settings through appropriate modifications. © The Author 2015. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine.
Computerized structural mechanics for 1990's: Advanced aircraft needs
NASA Technical Reports Server (NTRS)
Viswanathan, A. V.; Backman, B. F.
1989-01-01
The needs for computerized structural mechanics (CSM) as seen from the standpoint of the aircraft industry are discussed. These needs are projected into the 1990's with special focus on the new advanced materials. Preliminary design/analysis, research, and detail design/analysis are identified as major areas. The role of local/global analyses in these different areas is discussed. The lessons learned in the past are used as a basis for the design of a CSM framework that could modify and consolidate existing technology and include future developments in a rational and useful way. A philosophy is stated, and a set of analyses needs driven by the emerging advanced composites is enumerated. The roles of NASA, the universities, and the industry are identified. Finally, a set of rational research targets is recommended based on both the new types of computers and the increased complexity the industry faces. Computerized structural mechanics should be more than new methods in structural mechanics and numerical analyses. It should be a set of engineering applications software products that combines innovations in structural mechanics, numerical analysis, data processing, search and display features, and recent hardware advances and is organized in a framework that directly supports the design process.
Harris, Claire; Green, Sally; Elshaug, Adam G
2017-09-08
This is the tenth in a series of papers reporting a program of Sustainability in Health care by Allocating Resources Effectively (SHARE) in a local healthcare setting. After more than a decade of research, there is little published evidence of active and successful disinvestment. The paucity of frameworks, methods and tools is reported to be a factor in the lack of success. However there are clear and consistent messages in the literature that can be used to inform development of a framework for operationalising disinvestment. This paper, along with the conceptual review of disinvestment in Paper 9 of this series, aims to integrate the findings of the SHARE Program with the existing disinvestment literature to address the lack of information regarding systematic organisation-wide approaches to disinvestment at the local health service level. A framework for disinvestment in a local healthcare setting is proposed. Definitions for essential terms and key concepts underpinning the framework have been made explicit to address the lack of consistent terminology. Given the negative connotations of the word 'disinvestment' and the problems inherent in considering disinvestment in isolation, the basis for the proposed framework is 'resource allocation' to address the spectrum of decision-making from investment to disinvestment. The focus is positive: optimising healthcare, improving health outcomes, using resources effectively. The framework is based on three components: a program for decision-making, projects to implement decisions and evaluate outcomes, and research to understand and improve the program and project activities. The program consists of principles for decision-making and settings that provide opportunities to introduce systematic prompts and triggers to initiate disinvestment. The projects follow the steps in the disinvestment process. Potential methods and tools are presented, however the framework does not stipulate project design or conduct; allowing application of any theories, methods or tools at each step. Barriers are discussed and examples illustrating constituent elements are provided. The framework can be employed at network, institutional, departmental, ward or committee level. It is proposed as an organisation-wide application, embedded within existing systems and processes, which can be responsive to needs and priorities at the level of implementation. It can be used in policy, management or clinical contexts.
Uciteli, Alexandr; Groß, Silvia; Kireyev, Sergej; Herre, Heinrich
2011-08-09
This paper presents an ontologically founded basic architecture for information systems, which are intended to capture, represent, and maintain metadata for various domains of clinical and epidemiological research. Clinical trials constitute an important basis for clinical research, and the accurate specification of metadata and their documentation and application in clinical and epidemiological study projects represents a significant expense in the project preparation and has a relevant impact on the value and quality of these studies. An ontological foundation of an information system provides a semantic framework for the precise specification of those entities which are presented in this system. This semantic framework should be grounded, according to our approach, on a suitable top-level ontology. Such an ontological foundation leads to a deeper understanding of the entities of the domain under consideration, and provides a common unifying semantic basis, which supports the integration of data and the interoperability between different information systems. The intended information systems will be applied to the field of clinical and epidemiological research and will provide, depending on the application context, a variety of functionalities. In the present paper, we focus on a basic architecture which might be common to all such information systems. The research set forth in this paper is included in a broader framework of clinical research and continues the work of the IMISE on these topics.
8D likelihood effective Higgs couplings extraction framework in h → 4ℓ
Chen, Yi; Di Marco, Emanuele; Lykken, Joe; ...
2015-01-23
We present an overview of a comprehensive analysis framework aimed at performing direct extraction of all possible effective Higgs couplings to neutral electroweak gauge bosons in the decay to electrons and muons, the so-called ‘golden channel’. Our framework is based primarily on a maximum likelihood method constructed from analytic expressions of the fully differential cross sections for h → 4ℓ and for the dominant irreducible qq̄ → 4ℓ background, where 4ℓ = 2e2μ, 4e, 4μ. Detector effects are included by an explicit convolution of these analytic expressions with the appropriate transfer function over all center of mass variables. Utilizing the full set of observables, we construct an unbinned detector-level likelihood which is continuous in the effective couplings. We consider possible ZZ, Zγ, and γγ couplings simultaneously, allowing for general CP odd/even admixtures. A broad overview is given of how the convolution is performed and we discuss the principles and theoretical basis of the framework. This framework can be used in a variety of ways to study Higgs couplings in the golden channel using data obtained at the LHC and other future colliders.
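The core statistical object described here, an unbinned likelihood that is continuous in the couplings, is easy to illustrate in one dimension. The sketch below fits a coupling-like shape parameter of a toy signal pdf mixed with a flat background; the real framework is eight-dimensional and convolves analytic cross sections with detector transfer functions, so every function and number here is an invented stand-in.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)

def p_sig(x, c):
    # toy normalized signal shape on [0, 1]; c plays the role of a coupling
    return 1.0 + c * (2.0 * x - 1.0)

def p_bkg(x):
    # flat toy background on [0, 1]
    return np.ones_like(x)

f_sig = 0.7  # assumed signal fraction
x = np.concatenate([rng.beta(2, 1, 700), rng.uniform(0, 1, 300)])  # pseudo-data

def nll(c):
    # unbinned negative log-likelihood, continuous in c
    return -np.sum(np.log(f_sig * p_sig(x, c) + (1.0 - f_sig) * p_bkg(x)))

res = minimize_scalar(nll, bounds=(-0.99, 0.99), method="bounded")
print("fitted coupling-like parameter:", res.x)
```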
Decision making and coping in healthcare: the Coping in Deliberation (CODE) framework.
Witt, Jana; Elwyn, Glyn; Wood, Fiona; Brain, Kate
2012-08-01
To develop a framework of decision making and coping in healthcare that describes the twin processes of appraisal and coping faced by patients making preference-sensitive healthcare decisions. We briefly review the literature for decision making theories and coping theories applicable to preference-sensitive decisions in healthcare settings. We first describe decision making, then coping, and finally attempt to integrate these processes by building on current theory. Deliberation in healthcare may be described as a six-step process comprising the presentation of a health threat, choice, options, preference construction, the decision itself and consolidation post-decision. Coping can be depicted in three stages, beginning with a threat, followed by primary and secondary appraisal and ultimately resulting in a coping effort. Drawing together concepts from prominent decision making theories and coping theories, we propose a multidimensional, interactive framework which integrates both processes and describes coping in deliberation. The proposed framework offers an insight into the complexity of decision making in preference-sensitive healthcare contexts from a patient perspective and may act as a theoretical basis for decision support. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Health literacy and public health: A systematic review and integration of definitions and models
2012-01-01
Background Health literacy concerns the knowledge and competences of persons to meet the complex demands of health in modern society. Although its importance is increasingly recognised, there is no consensus about the definition of health literacy or about its conceptual dimensions, which limits the possibilities for measurement and comparison. The aim of the study is to review definitions and models on health literacy to develop an integrated definition and conceptual model capturing the most comprehensive evidence-based dimensions of health literacy. Methods A systematic literature review was performed to identify definitions and conceptual frameworks of health literacy. A content analysis of the definitions and conceptual frameworks was carried out to identify the central dimensions of health literacy and develop an integrated model. Results The review resulted in 17 definitions of health literacy and 12 conceptual models. Based on the content analysis, an integrative conceptual model was developed containing 12 dimensions referring to the knowledge, motivation and competencies of accessing, understanding, appraising and applying health-related information within the healthcare, disease prevention and health promotion setting, respectively. Conclusions Based upon this review, a model is proposed integrating medical and public health views of health literacy. The model can serve as a basis for developing health literacy enhancing interventions and provide a conceptual basis for the development and validation of measurement tools, capturing the different dimensions of health literacy within the healthcare, disease prevention and health promotion settings. PMID:22276600
Twelve evidence-based principles for implementing self-management support in primary care.
Battersby, Malcolm; Von Korff, Michael; Schaefer, Judith; Davis, Connie; Ludman, Evette; Greene, Sarah M; Parkerton, Melissa; Wagner, Edward H
2010-12-01
Recommendations to improve self-management support and health outcomes for people with chronic conditions in primary care settings are provided on the basis of expert opinion supported by evidence for practices and processes. Practices and processes that could improve self-management support in primary care were identified through a nominal group process. In a targeted search strategy, reviews and meta-analyses were then identified using terms from a wide range of chronic conditions and behavioral risk factors in combination with Self-Care, Self-Management, and Primary Care. On the basis of these reviews, evidence-based principles for self-management support were developed. The evidence is organized within the framework of the Chronic Care Model. Evidence-based principles in 12 areas were associated with improved patient self-management and/or health outcomes: (1) brief targeted assessment, (2) evidence-based information to guide shared decision-making, (3) use of a nonjudgmental approach, (4) collaborative priority and goal setting, (5) collaborative problem solving, (6) self-management support by diverse providers, (7) self-management interventions delivered by diverse formats, (8) patient self-efficacy, (9) active followup, (10) guideline-based case management for selected patients, (11) linkages to evidence-based community programs, and (12) multifaceted interventions. A framework is provided for implementing these principles in three phases of the primary care visit: enhanced previsit assessment, a focused clinical encounter, and expanded postvisit options. There is a growing evidence base for how self-management support for chronic conditions can be integrated into routine health care.
Gorecki, Claudia; Lamping, Donna L; Brown, Julia M; Madill, Anna; Firth, Jill; Nixon, Jane
2010-12-01
Evaluating outcomes such as health-related quality of life is particularly important and relevant in skin conditions such as pressure ulcers where the condition and associated interventions pose a substantial burden to patients. Measures to evaluate such outcomes need to be developed by utilising the patient perspective to ensure that content and conceptualisation are relevant to patients. Our aim was to develop a conceptual framework of health-related quality of life in pressure ulcers, based on patients' views about the impact of pressure ulcers and interventions on health-related quality of life to inform the development of a new patient-reported outcome measure. SETTING, PARTICIPANTS AND METHODS: We developed a working conceptual framework based on a previous review of the literature, then used semi-structured qualitative interviews with 30 adults with pressure ulcers (22-94 years) purposively sampled from hospital, community and rehabilitation care settings in England and Northern Ireland to obtain patients' views, and thematic content analysis and review by a multidisciplinary expert group to develop the final conceptual framework. Our conceptual model includes four health-related quality of life domains (symptoms, physical functioning, psychological well-being, social functioning), divided into 13 sub-domains and defined by specific descriptive components. We have identified health-related quality of life outcomes that are important to people with pressure ulcers and developed a conceptual framework using robust and systematic methods, which provides the basis for the development of a new pressure ulcer-specific measure of health-related quality of life. Copyright © 2010 Elsevier Ltd. All rights reserved.
Using and developing role plays in teaching aimed at preparing for social responsibility.
Doorn, Neelke; Kroesen, J Otto
2013-12-01
In this paper, we discuss the use of role plays in ethics education for engineering students. After presenting a rough taxonomy of different objectives, we illustrate how role plays can be used to broaden students' perspectives. We do this on the basis of our experiences with a newly developed role play about a Dutch political controversy concerning pig transport. The role play is special in that the discussion is about setting up an institutional framework for responsible action that goes beyond individual action. In that sense, the role play serves a double purpose. It not only aims at teaching students to become aware of the different dimensions in decision making, it also encourages students to think about what such an institutional framework for responsible action might possibly look like.
Health in global context; beyond the social determinants of health?
Krumeich, Anja; Meershoek, Agnes
2014-01-01
The rise of the social determinants of health (SDH) discourse on the basis of statistical evidence that correlates ill health to SDH and pictures causal pathways in comprehensive theoretical frameworks led to widespread awareness that health and health disparities are the outcome of complex pathways of interconnecting SDH. In this paper we explore whether and how SDH frameworks can be translated to effectively inform particular national health policies. To this end we identified major challenges for this translation followed by reflections on ways to overcome them. The most important challenges affecting adequate translation of these frameworks into concrete policy and intervention are 1) overcoming the inclination to conceptualize SDH as mere barriers to health behavior to be modified by lifestyle interventions, by addressing them as structural factors instead; 2) obtaining sufficient in-depth insight into, and evidence for, the exact nature of the relationship between SDs and health; 3) adequately translating the general determinants and pathways into explanations for ill health and limited access to health care in local settings; 4) developing and implementing policies and other interventions that are adjusted to those local circumstances. We conclude that to transform generic SDH models into useful policy tools and to prevent them from transforming into SDH themselves, an in-depth understanding of the unique interplay between local, national and global SDH in a local setting, gathered by ethnographic research, is needed to be able to address structural SD in the local setting and decrease health inequity.
Usability Guidelines for Product Recommenders Based on Example Critiquing Research
NASA Astrophysics Data System (ADS)
Pu, Pearl; Faltings, Boi; Chen, Li; Zhang, Jiyong; Viappiani, Paolo
Over the past decade, our group has developed a suite of decision tools based on example critiquing to help users find their preferred products in e-commerce environments. In this chapter, we survey important usability research work relevant to example critiquing and summarize the major results by deriving a set of usability guidelines. Our survey is focused on three key interaction activities between the user and the system: the initial preference elicitation process, the preference revision process, and the presentation of the system's recommendation results. To provide a basis for the derivation of the guidelines, we developed a multi-objective framework of three interacting criteria: accuracy, confidence, and effort (ACE). We use this framework to analyze our past work and provide a specific context for each guideline: when the system should maximize its ability to increase users' decision accuracy, when to increase user confidence, and when to minimize the interaction effort for the users. Due to the general nature of this multi-criteria model, the set of guidelines that we propose can be used to ease the usability engineering process of other recommender systems, especially those used in e-commerce environments. The ACE framework presented here is also the first in the field to evaluate the performance of preference-based recommenders from a user-centric point of view.
Clinical effort against secondhand smoke exposure: development of framework and intervention.
Winickoff, Jonathan P; Park, Elyse R; Hipple, Bethany J; Berkowitz, Anna; Vieira, Cecilia; Friebely, Joan; Healey, Erica A; Rigotti, Nancy A
2008-08-01
The purpose of this work was to describe a novel process and present results of formative research to develop a pediatric office intervention that uses available systems of care for addressing parental smoking. The scientific development of the intervention occurred in 3 stages. In stage 1, we designed an office system for parental tobacco control in the pediatric outpatient setting on the basis of complementary conceptual frameworks of preventive services delivery, conceptualized for the child health care setting through a process of key interviews with leaders in the field of implementing practice change; existing Public Health Service guidelines that had been shown effective in adult practices; and adaptation of an evidence-based adult office system for tobacco control. This was an iterative process that yielded a theoretically framed intervention prototype. In stage 2, we performed focus-group testing in pediatric practices with pediatricians, nurses, clinical assistants, and key office staff. Using qualitative methods, we adapted the intervention prototype on the basis of this feedback to include 5 key implementation steps for the child health care setting. In stage 3, we presented the intervention to breakout groups at 2 national meetings of pediatric practitioners for additional refinements. The main result was a theoretically grounded intervention that was responsive to the barriers and suggestions raised in the focus groups and at the national meetings. The Clinical Effort Against Secondhand Smoke Exposure intervention was designed to be flexible and adaptable to the particular practices' staffing, resources, and physical configuration. Practice staff can choose materials relevant to their own particular systems of care (www.ceasetobacco.org). Conceptually grounded and focus-group-tested strategies for parental tobacco control are now available for implementation in the pediatric outpatient setting. The tobacco-control intervention-development process might have particular relevance for other chronic pediatric conditions that have a strong evidence base and have available treatments or resources that are underused.
Basis sets for the calculation of core-electron binding energies
NASA Astrophysics Data System (ADS)
Hanson-Heine, Magnus W. D.; George, Michael W.; Besley, Nicholas A.
2018-05-01
Core-electron binding energies (CEBEs) computed within a Δ self-consistent field approach require large basis sets to achieve convergence with respect to the basis set limit. It is shown that supplementing a basis set with basis functions from the corresponding basis set for the element with the next highest nuclear charge (Z + 1) provides basis sets that give CEBEs close to the basis set limit. This simple procedure provides relatively small basis sets that are well suited for calculations where the description of a core-ionised state is important, such as time-dependent density functional theory calculations of X-ray emission spectroscopy.
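The Z + 1 supplementation described in the abstract amounts to merging two element basis sets. Below is a hedged sketch with PySCF (the choice of code is an assumption; the paper does not prescribe one): build a carbon basis that also carries nitrogen's cc-pVTZ functions. The Δ self-consistent field core-hole calculation itself is omitted here.

```python
from pyscf import gto

# supplement carbon's cc-pVTZ with the nitrogen (Z + 1) cc-pVTZ functions
c_shells = gto.basis.load("cc-pvtz", "C")
n_shells = gto.basis.load("cc-pvtz", "N")
zplus1 = c_shells + n_shells  # both are lists of shells in PySCF internal format

mol = gto.M(
    atom="C 0 0 0; O 0 0 1.128",          # CO as a small core-ionisation test case
    basis={"C": zplus1, "O": "cc-pvtz"},  # Z+1-augmented basis on the core site only
)
print("number of basis functions:", mol.nao)
```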
DOE Office of Scientific and Technical Information (OSTI.GOV)
Papajak, Ewa; Truhlar, Donald G.
We present sets of convergent, partially augmented basis set levels corresponding to subsets of the augmented “aug-cc-pV(n+d)Z” basis sets of Dunning and co-workers. We show that for many molecular properties a basis set fully augmented with diffuse functions is computationally expensive and almost always unnecessary. On the other hand, unaugmented cc-pV(n+d)Z basis sets are insufficient for many properties that require diffuse functions. Therefore, we propose using intermediate basis sets. We developed an efficient strategy for partial augmentation, and in this article, we test it and validate it. Sequentially deleting diffuse basis functions from the “aug” basis sets yields the “jul”, “jun”, “may”, “apr”, etc. basis sets. Tests of these basis sets for Møller-Plesset second-order perturbation theory (MP2) show the advantages of using these partially augmented basis sets and allow us to recommend which basis sets offer the best accuracy for a given number of basis functions for calculations on large systems. Similar truncations in the diffuse space can be performed for the aug-cc-pVxZ, aug-cc-pCVxZ, etc. basis sets.
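The month naming encodes a simple truncation ladder. As a hedged reading of the scheme (the exact month-to-subshell mapping should be checked against the paper): "jul" removes the diffuse functions from H and He, and each earlier month additionally removes the highest remaining angular momentum of diffuse functions from the non-hydrogen atoms. A toy bookkeeping function:

```python
# assumed truncation ladder for a TZ-level set, whose heavy-atom diffuse
# subshells are s, p, d, f; verify against the paper before relying on it
HEAVY_DIFFUSE = ["s", "p", "d", "f"]

def calendar_diffuse(level):
    """Return (diffuse subshells kept on heavy atoms, diffuse kept on H/He?)."""
    months = ["aug", "jul", "jun", "may", "apr"]
    i = months.index(level)
    if level == "aug":
        return HEAVY_DIFFUSE, True
    return HEAVY_DIFFUSE[: len(HEAVY_DIFFUSE) - (i - 1)], False

for month in ["aug", "jul", "jun", "may", "apr"]:
    print(month, calendar_diffuse(month))
```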
NASA Astrophysics Data System (ADS)
Spackman, Peter R.; Karton, Amir
2015-05-01
Coupled cluster calculations with all single and double excitations (CCSD) converge exceedingly slowly with the size of the one-particle basis set. We assess the performance of a number of approaches for obtaining CCSD correlation energies close to the complete basis-set limit in conjunction with relatively small DZ and TZ basis sets. These include global and system-dependent extrapolations based on the A + B/Lα two-point extrapolation formula, and the well-known additivity approach that uses an MP2-based basis-set-correction term. We show that the basis set convergence rate can change dramatically between different systems (e.g., it is slower for molecules with polar bonds and/or second-row elements). The system-dependent basis-set extrapolation scheme, in which unique basis-set extrapolation exponents for each system are obtained from lower-cost MP2 calculations, significantly accelerates the basis-set convergence relative to the global extrapolations. Nevertheless, we find that the simple MP2-based basis-set additivity scheme outperforms the extrapolation approaches. For example, the following root-mean-squared deviations are obtained for the 140 basis-set limit CCSD atomization energies in the W4-11 database: 9.1 (global extrapolation), 3.7 (system-dependent extrapolation), and 2.4 (additivity scheme) kJ mol-1. The CCSD energy in these approximations is obtained from basis sets of up to TZ quality and the latter two approaches require additional MP2 calculations with basis sets of up to QZ quality. We also assess the performance of the basis-set extrapolations and additivity schemes for a set of 20 basis-set limit CCSD atomization energies of larger molecules including amino acids, DNA/RNA bases, aromatic compounds, and platonic hydrocarbon cages. We obtain the following RMSDs for the above methods: 10.2 (global extrapolation), 5.7 (system-dependent extrapolation), and 2.9 (additivity scheme) kJ mol-1.
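Both schemes compared in this abstract reduce to a few lines of arithmetic. Below is a hedged sketch: the two-point complete-basis-set estimate obtained by solving E_L = E_CBS + B/L^α, and the MP2-based additivity correction; the energies are illustrative placeholders, not values from the paper.

```python
def extrapolate_two_point(e_lo, e_hi, l_lo, l_hi, alpha):
    """Solve E_L = E_CBS + B / L**alpha from two correlation energies."""
    w_lo, w_hi = l_lo ** (-alpha), l_hi ** (-alpha)
    return (e_hi * w_lo - e_lo * w_hi) / (w_lo - w_hi)

def ccsd_additivity(e_ccsd_tz, e_mp2_tz, e_mp2_cbs):
    """MP2-based additivity: CCSD/CBS ~= CCSD/TZ + (MP2/CBS - MP2/TZ)."""
    return e_ccsd_tz + (e_mp2_cbs - e_mp2_tz)

# illustrative CCSD correlation energies (Hartree) at L = 2 (DZ) and L = 3 (TZ)
e_dz, e_tz = -0.350, -0.390
print("two-point CBS estimate:", extrapolate_two_point(e_dz, e_tz, 2, 3, alpha=3.0))
```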
Fault Management Design Strategies
NASA Technical Reports Server (NTRS)
Day, John C.; Johnson, Stephen B.
2014-01-01
Development of dependable systems relies on the ability of the system to determine and respond to off-nominal system behavior. Specification and development of these fault management capabilities must be done in a structured and principled manner to improve our understanding of these systems, and to make significant gains in dependability (safety, reliability and availability). Prior work has described a fundamental taxonomy and theory of System Health Management (SHM), and of its operational subset, Fault Management (FM). This conceptual foundation provides a basis for developing a framework to design and implement FM strategies that protect mission objectives and account for system design limitations. Selection of an SHM strategy has implications for the functions required to perform the strategy, and it places constraints on the set of possible design solutions. The framework developed in this paper provides a rigorous and principled approach to classifying SHM strategies, as well as methods for determination and implementation of SHM strategies. An illustrative example is used to describe the application of the framework and the resulting benefits to system and FM design and dependability.
Generalized probability theories: what determines the structure of quantum theory?
NASA Astrophysics Data System (ADS)
Janotta, Peter; Hinrichsen, Haye
2014-08-01
The framework of generalized probabilistic theories is a powerful tool for studying the foundations of quantum physics. It provides the basis for a variety of recent findings that significantly improve our understanding of the rich physical structure of quantum theory. This review paper tries to present the framework and recent results to a broader readership in an accessible manner. To achieve this, we follow a constructive approach. Starting from a few basic physically motivated assumptions we show how a given set of observations can be manifested in an operational theory. Furthermore, we characterize consistency conditions limiting the range of possible extensions. In this framework classical and quantum theory appear as special cases, and the aim is to understand what distinguishes quantum mechanics as the fundamental theory realized in nature. It turns out that non-classical features of single systems can equivalently result from higher-dimensional classical theories that have been restricted. Entanglement and non-locality, however, are shown to be genuine non-classical features.
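For orientation, the basic operational ingredients that such a framework rests on can be written compactly; the following is standard generalized-probabilistic-theory notation rather than anything specific to this review, and the gbit parameterization shown is one common convention.

```latex
% A state is a point \omega in a convex set \Omega; an effect e is an
% affine functional with the probability rule
\[
  p(e \mid \omega) = e(\omega), \qquad 0 \le e(\omega) \le 1 .
\]
% Example state spaces: a classical bit (a line segment), a qubit (the
% Bloch ball), and the "gbit" (a square), whose four pure states in one
% common parameterization are
\[
  \omega_{\pm\pm} = (\pm 1, \pm 1, 1)^{\mathsf{T}} .
\]
```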
Abramson, David M.; Grattan, Lynn M.; Mayer, Brian; Colten, Craig E.; Arosemena, Farah A.; Rung, Ariane; Lichtveld, Maureen
2014-01-01
A number of governmental agencies have called for enhancing citizen’s resilience as a means of preparing populations in advance of disasters, and as a counter-balance to social and individual vulnerabilities. This increasing scholarly, policy and programmatic interest in promoting individual and communal resilience presents a challenge to the research and practice communities: to develop a translational framework that can accommodate multi-disciplinary scientific perspectives into a single, applied model. The Resilience Activation Framework provides a basis for testing how access to social resources, such as formal and informal social support and help, promotes positive adaptation or reduced psychopathology among individuals and communities exposed to the acute collective stressors associated with disasters, whether manmade, natural, or technological in origin. Articulating the mechanisms by which access to social resources activate and sustain resilience capacities for optimal mental health outcomes post-disaster can lead to the development of effective preventive and early intervention programs. PMID:24870399
Cross-cultural re-entry for missionaries: a new application for the Dual Process Model.
Selby, Susan; Clark, Sheila; Braunack-Mayer, Annette; Jones, Alison; Moulding, Nicole; Beilby, Justin
Nearly half a million foreign aid workers currently work worldwide, including over 140,000 missionaries. During re-entry these workers may experience significant psychological distress. This article positions previous research about psychological distress during re-entry, emphasizing loss and grief. At present there is no identifiable theoretical framework to provide a basis for assessment, management, and prevention of re-entry distress in the clinical setting. The development of theoretical concepts and frameworks surrounding loss and grief including the Dual Process Model (DPM) are discussed. All the parameters of the DPM have been shown to be appropriate for the proposed re-entry model, the Dual Process Model applied to Re-entry (DPMR). It is proposed that the DPMR is an appropriate framework to address the processes and strategies of managing re-entry loss and grief. Possible future clinical applications and limitations of the proposed model are discussed. The DPMR is offered for further validation and use in clinical practice.
GIS Application System Design Applied to Information Monitoring
NASA Astrophysics Data System (ADS)
Qun, Zhou; Yujin, Yuan; Yuena, Kang
A natural environment information management system involves on-line instrument monitoring, data communications, database establishment, information management software development, and so on. Its core lies in collecting effective and reliable environmental information, increasing the utilization and sharing of that information through advanced information technology, and providing as timely and scientific a foundation as possible for environmental monitoring and management. This thesis adopts C# plug-in application development and uses a complete set of embedded GIS component and tool libraries provided by GIS Engine to build the core of a plug-in GIS application framework, namely the design and implementation of the framework host program and each functional plug-in, as well as the design and implementation of the plug-in GIS application framework platform. By exploiting dynamic plug-in loading and configuration, the thesis quickly establishes GIS applications through visual, component-based collaborative modeling and realizes GIS application integration. The developed platform is applicable to any application integration involving GIS (the ESRI platform) and can serve as a base platform for GIS application development.
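The host-plus-plug-ins architecture described here is language-agnostic; the original platform is C# on ArcGIS Engine, but the discovery-and-registration pattern can be sketched generically. A minimal Python stand-in, with all names invented:

```python
import importlib
import pkgutil

class PluginHost:
    """Minimal dynamic plug-in loader: any module in the 'plugins' package
    exposing a register(host) function is discovered and activated."""
    def __init__(self):
        self.commands = {}

    def add_command(self, name, func):
        self.commands[name] = func

    def load_plugins(self, package_name="plugins"):
        package = importlib.import_module(package_name)
        for info in pkgutil.iter_modules(package.__path__):
            module = importlib.import_module(f"{package_name}.{info.name}")
            if hasattr(module, "register"):
                module.register(self)  # the plug-in wires itself into the host

host = PluginHost()
# host.load_plugins()  # requires a 'plugins' package on the import path
```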
Human Exploration Framework Team: Strategy and Status
NASA Technical Reports Server (NTRS)
Muirhead, Brian K.; Sherwood, Brent; Olson, John
2011-01-01
The Human Exploration Framework Team (HEFT) was formulated to create a decision framework for human space exploration that drives out the knowledge, capabilities and infrastructure NASA needs to send people to explore multiple destinations in the Solar System in an efficient, sustainable way. The specific goal is to generate an initial architecture that can evolve into a long-term, enterprise-wide architecture that is the basis for a robust human space flight enterprise. This paper will discuss the initial HEFT activity, which focused on starting up the cross-agency team, getting it functioning, developing a comprehensive development and analysis process and conducting multiple iterations of the process. The outcome of this process will be discussed, including initial analysis of capabilities and missions for at least two decades, keeping Mars as the ultimate destination. Details are provided on strategies that span a broad technical and programmatic trade space, are analyzed against design reference missions and evaluated against a broad set of figures of merit including affordability, operational complexity, and technical and programmatic risk.
Abramson, David M; Grattan, Lynn M; Mayer, Brian; Colten, Craig E; Arosemena, Farah A; Bedimo-Rung, Ariane; Lichtveld, Maureen
2015-01-01
A number of governmental agencies have called for enhancing citizens' resilience as a means of preparing populations in advance of disasters, and as a counterbalance to social and individual vulnerabilities. This increasing scholarly, policy, and programmatic interest in promoting individual and communal resilience presents a challenge to the research and practice communities: to develop a translational framework that can accommodate multidisciplinary scientific perspectives into a single, applied model. The Resilience Activation Framework provides a basis for testing how access to social resources, such as formal and informal social support and help, promotes positive adaptation or reduced psychopathology among individuals and communities exposed to the acute collective stressors associated with disasters, whether human-made, natural, or technological in origin. Articulating the mechanisms by which access to social resources activate and sustain resilience capacities for optimal mental health outcomes post-disaster can lead to the development of effective preventive and early intervention programs.
A unifying kinetic framework for modeling oxidoreductase-catalyzed reactions.
Chang, Ivan; Baldi, Pierre
2013-05-15
Oxidoreductases are a fundamental class of enzymes responsible for the catalysis of oxidation-reduction reactions, crucial in most bioenergetic metabolic pathways. From their common root in the ancient prebiotic environment, oxidoreductases have evolved into diverse and elaborate protein structures with specific kinetic properties and mechanisms adapted to their individual functional roles and environmental conditions. While accurate kinetic modeling of oxidoreductases is thus important, current models are restricted to the steady-state domain, lack empirical validation, or are too specialized to a single system or set of conditions. To address these limitations, we introduce a novel unifying modeling framework for kinetic descriptions of oxidoreductases. The framework is based on a set of seven elementary reactions that (i) form the basis for 69 pairs of enzyme state transitions for encoding various specific microscopic intra-enzyme reaction networks (micro-models), and (ii) lead to various specific macroscopic steady-state kinetic equations (macro-models) via thermodynamic assumptions. Thus, a synergistic bridge between the micro and macro kinetics can be achieved, enabling us to extract unitary rate constants, simulate reaction variance and validate the micro-models using steady-state empirical data. To facilitate the application of this framework, we make available RedoxMech: a Mathematica™ software package that automates the generation and customization of micro-models. The Mathematica™ source code for RedoxMech, the documentation and the experimental datasets are all available from: http://www.igb.uci.edu/tools/sb/metabolic-modeling. pfbaldi@ics.uci.edu Supplementary data are available at Bioinformatics online.
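To make the micro-model idea concrete, here is a deliberately tiny intra-enzyme state network (a single binding transition plus a catalytic one) integrated as ODEs. The paper's framework builds such networks from seven elementary reactions and 69 transition pairs; this sketch shows only the flavour, and all rate constants are invented.

```python
import numpy as np
from scipy.integrate import solve_ivp

# toy micro-model: E + S <-> ES -> E + P
k_on, k_off, k_cat = 1.0e6, 1.0e2, 1.0e1  # assumed unitary rate constants

def rhs(t, y):
    E, ES, S, P = y
    v_bind = k_on * E * S - k_off * ES  # reversible substrate binding
    v_cat = k_cat * ES                  # irreversible chemistry step
    return [-v_bind + v_cat, v_bind - v_cat, -v_bind, v_cat]

y0 = [1e-6, 0.0, 1e-3, 0.0]  # [E], [ES], [S], [P] in mol/L
sol = solve_ivp(rhs, (0.0, 5.0), y0, method="LSODA", rtol=1e-8, atol=1e-12)
print("quasi-steady-state rate k_cat*[ES]:", k_cat * sol.y[1, -1])
```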
Alkhatib, Omar J
2017-12-01
The construction industry is typically characterized as a fragmented, multi-organizational setting in which members from different technical backgrounds and moral values join together to develop a particular business or project. The most challenging obstacle in the construction process is to achieve a successful practice and to identify and apply an ethical framework to manage the behavior of involved specialists and contractors and to ensure the quality of all completed construction activities. The framework should reflect a common moral ground for myriad people involved in this process to survive and compete ethically in today's turbulent construction market. This study establishes a framework for moral judgment of behavior and actions conducted in the construction process. The moral framework provides the basis for judging actions as "moral" or "immoral" based on three levels of moral accountability: personal, professional, and social. The social aspect of the proposed framework is developed primarily from the essential attributes of normative business decision-making models identified in the literature review and subsequently incorporates additional attributes related to professional and personal moral values. The normative decision-making models reviewed are based primarily on social attributes as related to moral theories (e.g., utilitarianism, duty, rights, virtue, etc.). The professional and moral attributes are established by identifying a set of common moral values recognized by professionals in the construction industry and required to prevent common construction breaches. The moral framework presented here is the complementary part of the ethical framework developed in Part I of this article and is based primarily on personal behavior, or the moral aspect, of professional responsibility. The framework can be implemented as a form of preventive personal ethics, which would help avoid ethical dilemmas and moral implications in the first place. Furthermore, the moral framework can be considered as a decision-making model to guide actions and improve the moral reasoning process, which would help individuals think through possible implications and the consequences of ethical and moral issues in the construction industry.
Bayesian anomaly detection in monitoring data applying relevance vector machine
NASA Astrophysics Data System (ADS)
Saito, Tomoo
2011-04-01
A method for automatically classifying monitoring data into two categories, normal and anomalous, is developed in order to remove anomalous data from the enormous amount of monitoring data collected. The method applies the relevance vector machine (RVM) to a probabilistic discriminative model with basis functions and weight parameters whose posterior PDF (probability density function), conditional on the learning data set, is given by Bayes' theorem. The proposed framework is applied to actual monitoring data sets containing some anomalous data collected at two buildings in Tokyo, Japan, which shows that the trained models discriminate anomalous data from normal data very clearly, giving high probabilities of being normal to normal data and low probabilities of being normal to anomalous data.
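A faithful RVM classifier is a few hundred lines, but the shape of the model (basis functions centred on training points with sparsity-inducing Bayesian weights) can be approximated cheaply. The sketch below substitutes scikit-learn's ARD regression on 0/1 labels for the RVM's Bernoulli likelihood; that swap, the synthetic data, and all parameters are assumptions, not the paper's method.

```python
import numpy as np
from sklearn.linear_model import ARDRegression
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(0)

# synthetic monitoring features: normal data around 0, anomalies offset
X_normal = rng.normal(0, 1, (200, 3))
X_anom = rng.normal(4, 1, (20, 3))
X = np.vstack([X_normal, X_anom])
y = np.r_[np.ones(200), np.zeros(20)]  # 1 = normal, 0 = anomaly

# RBF basis functions centred on the training points, as in an RVM setup
Phi = rbf_kernel(X, X, gamma=0.5)

# the ARD prior prunes irrelevant basis weights, the same sparsity mechanism
# the RVM uses, here in a regression-on-labels simplification
model = ARDRegression().fit(Phi, y)
scores = np.clip(model.predict(Phi), 0, 1)  # crude "probability of normal"
print("mean score, normal vs anomalous:", scores[:200].mean(), scores[200:].mean())
```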
The interval testing procedure: A general framework for inference in functional data analysis.
Pini, Alessia; Vantini, Simone
2016-09-01
In this work we introduce the Interval Testing Procedure (ITP), a novel inferential technique for functional data. The procedure can be used to test different functional hypotheses, e.g., distributional equality between two or more functional populations, or equality of the mean function of a functional population to a reference. ITP involves three steps: (i) the representation of data on a (possibly high-dimensional) functional basis; (ii) the test of each possible set of consecutive basis coefficients; (iii) the computation of the adjusted p-values associated with each basis component, by means of a newly proposed strategy. We define a new type of error control, the interval-wise control of the family-wise error rate, particularly suited for functional data. We show that ITP provides this control. A simulation study comparing ITP with other testing procedures is reported. ITP is then applied to the analysis of hemodynamic features involved in cerebral aneurysm pathology. ITP is implemented in the fdatest R package. © 2016, The International Biometric Society.
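The three steps (basis representation, tests on every interval of consecutive coefficients, max-over-intervals adjustment) can be miniaturized for two samples of basis coefficients. Below is a hedged toy implementation with permutation tests; the statistic and the sizes are arbitrary choices, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def interval_testing(C1, C2, n_perm=500):
    """C1, C2: (n_curves, n_basis) coefficient matrices of two samples.
    Tests every interval of consecutive basis components by permutation,
    then adjusts: p_adj[k] = max p over intervals containing component k."""
    n1, p = C1.shape
    C = np.vstack([C1, C2])
    n = C.shape[0]

    def stat(a, b, idx):  # squared mean difference summed over the interval
        return np.sum((a[:, idx].mean(0) - b[:, idx].mean(0)) ** 2)

    p_adj = np.zeros(p)
    for i in range(p):
        for j in range(i, p):
            idx = np.arange(i, j + 1)
            t_obs = stat(C1, C2, idx)
            count = 0
            for _ in range(n_perm):
                perm = rng.permutation(n)
                count += stat(C[perm[:n1]], C[perm[n1:]], idx) >= t_obs
            p_int = (count + 1) / (n_perm + 1)
            p_adj[i : j + 1] = np.maximum(p_adj[i : j + 1], p_int)
    return p_adj

# two samples of 6 basis coefficients; components 2-3 differ in mean
C1 = rng.normal(0, 1, (20, 6))
C2 = rng.normal(0, 1, (25, 6))
C2[:, 2:4] += 1.0
print(interval_testing(C1, C2, n_perm=200))
```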
Conceptual frameworks of individual work performance: a systematic review.
Koopmans, Linda; Bernaards, Claire M; Hildebrandt, Vincent H; Schaufeli, Wilmar B; de Vet, Henrica C W; van der Beek, Allard J
2011-08-01
Individual work performance is differently conceptualized and operationalized in different disciplines. The aim of the current review was twofold: (1) identifying conceptual frameworks of individual work performance and (2) integrating these to reach a heuristic conceptual framework. A systematic review was conducted in medical, psychological, and management databases. Studies were selected independently by two researchers and included when they presented a conceptual framework of individual work performance. A total of 17 generic frameworks (applying across occupations) and 18 job-specific frameworks (applying to specific occupations) were identified. Dimensions frequently used to describe individual work performance were task performance, contextual performance, counterproductive work behavior, and adaptive performance. On the basis of the literature, a heuristic conceptual framework of individual work performance was proposed. This framework can serve as a theoretical basis for future research and practice.
NASA Astrophysics Data System (ADS)
Fukuda, Jun'ichi; Johnson, Kaj M.
2010-06-01
We present a unified theoretical framework and solution method for probabilistic, Bayesian inversions of crustal deformation data. The inversions involve multiple data sets with unknown relative weights, model parameters that are related linearly or non-linearly to observations through theoretical models, prior information on model parameters and regularization priors to stabilize underdetermined problems. To efficiently handle non-linear inversions in which some of the model parameters are linearly related to the observations, this method combines both analytical least-squares solutions and a Monte Carlo sampling technique. In this method, model parameters that are linearly and non-linearly related to observations, relative weights of multiple data sets and relative weights of prior information and regularization priors are determined in a unified Bayesian framework. In this paper, we define the mixed linear-non-linear inverse problem, outline the theoretical basis for the method, provide a step-by-step algorithm for the inversion, validate the inversion method using synthetic data and apply the method to two real data sets. We apply the method to inversions of multiple geodetic data sets with unknown relative data weights for interseismic fault slip and locking depth. We also apply the method to the problem of estimating the spatial distribution of coseismic slip on faults with unknown fault geometry, relative data weights and smoothing regularization weight.
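The key computational trick, handling the linear parameters analytically inside a Monte Carlo loop over the nonlinear ones, can be sketched as below. For brevity the sketch conditions on the least-squares solution (a profile-likelihood simplification) rather than marginalizing as the paper does; the forward model and all numbers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)
# Toy forward model d = G(theta) @ m + noise: theta is a nonlinear parameter
# (think locking depth), m holds linear parameters (think slip amplitudes).
x = np.linspace(0.0, 10.0, 50)

def G(theta):
    return np.c_[np.exp(-x / theta), np.ones_like(x)]

theta_true, m_true, sigma = 3.0, np.array([2.0, 0.5]), 0.05
d = G(theta_true) @ m_true + sigma * rng.normal(size=x.size)

def loglike(theta):
    # The linear parameters are handled analytically by least squares, so the
    # sampler only explores theta. (The paper marginalizes the linear part;
    # conditioning on the least-squares solution is a simplification.)
    m_hat, *_ = np.linalg.lstsq(G(theta), d, rcond=None)
    r = d - G(theta) @ m_hat
    return -0.5 * np.sum(r ** 2) / sigma ** 2

# Plain Metropolis sampling of the nonlinear parameter.
theta, chain = 1.0, []
for _ in range(5000):
    prop = theta + 0.2 * rng.normal()
    if prop > 0 and np.log(rng.uniform()) < loglike(prop) - loglike(theta):
        theta = prop
    chain.append(theta)
print(np.mean(chain[1000:]), np.std(chain[1000:]))  # posterior ~ theta_true
```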
A dynamic water-quality modeling framework for the Neuse River estuary, North Carolina
Bales, Jerad D.; Robbins, Jeanne C.
1999-01-01
As a result of fish kills in the Neuse River estuary in 1995, nutrient reduction strategies were developed for point and nonpoint sources in the basin. However, because of the interannual variability in the natural system and the resulting complex hydrologic-nutrient interactions, it is difficult to detect through a short-term observational program the effects of management activities on Neuse River estuary water quality and aquatic health. A properly constructed water-quality model can be used to evaluate some of the potential effects of management actions on estuarine water quality. Such a model can be used to predict estuarine response to present and proposed nutrient strategies under the same set of meteorological and hydrologic conditions, thus removing the vagaries of weather and streamflow from the analysis. A two-dimensional, laterally averaged hydrodynamic and water-quality modeling framework was developed for the Neuse River estuary by using previously collected data. Development of the modeling framework consisted of (1) computational grid development, (2) assembly of data for model boundary conditions and model testing, (3) selection of initial values of model parameters, and (4) limited model testing. The model domain extends from Streets Ferry to Oriental, N.C., and includes seven lateral embayments that have continual exchange with the mainstem of the estuary, three point-source discharges, and three tributary streams. Thirty-five computational segments represent the mainstem of the estuary, and the entire framework contains a total of 60 computational segments. Each computational cell is 0.5 meter thick; segment lengths range from 500 meters to 7,125 meters. Data that were used to develop the modeling framework were collected during March through October 1991 and represent the most comprehensive data set available prior to 1997. Most of the data were collected by the North Carolina Division of Water Quality, the University of North Carolina Institute of Marine Sciences, and the U.S. Geological Survey. Limitations in the modeling framework were clearly identified. These limitations formed the basis for a set of suggestions to refine the Neuse River estuary water-quality model.
Kutateladze, Andrei G; Mukhina, Olga A
2014-09-05
Spin-spin coupling constants in ¹H NMR carry a wealth of structural information and offer a powerful tool for deciphering molecular structures. However, accurate ab initio or DFT calculations of spin-spin coupling constants have been very challenging and expensive. Scaling of (easy) Fermi contacts, fc, especially in the context of recent findings by Bally and Rablen (Bally, T.; Rablen, P. R. J. Org. Chem. 2011, 76, 4818), offers a framework for achieving practical evaluation of spin-spin coupling constants. We report a faster and more precise parametrization approach utilizing a new basis set for hydrogen atoms optimized in conjunction with (i) inexpensive B3LYP/6-31G(d) molecular geometries, (ii) an inexpensive 4-31G basis set for carbon atoms in fc calculations, and (iii) individual parametrization for different atom types/hybridizations, not unlike a force field in molecular mechanics, but designed for the fc's. With the training set of 608 experimental constants we achieved rmsd <0.19 Hz. The methodology performs very well as we illustrate with a set of complex organic natural products, including strychnine (rmsd 0.19 Hz), morphine (rmsd 0.24 Hz), etc. This precision is achieved with much shorter computational times: accurate spin-spin coupling constants for the two conformers of strychnine were computed in parallel on two 16-core nodes of a Linux cluster within 10 min.
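The core idea, a cheap linear scaling of computed Fermi contacts fitted separately per atom type, might be sketched as follows; the fc values and couplings below are invented for illustration and are not the paper's training data.

```python
import numpy as np

# Invented training data: computed Fermi-contact terms (Hz) and experimental
# couplings, grouped by a coarse atom-type label, mirroring the idea of
# individual parametrization per atom type/hybridization.
fc = {"sp3-sp3": np.array([6.1, 7.3, 11.8, 12.4]),
      "sp2-sp2": np.array([7.9, 8.6, 15.2])}
J_exp = {"sp3-sp3": np.array([7.0, 8.2, 13.1, 13.9]),
         "sp2-sp2": np.array([8.1, 9.0, 15.8])}

params, resid = {}, []
for typ in fc:
    # One linear scaling J ~ a*fc + b per atom type, fitted by least squares.
    a, b = np.polyfit(fc[typ], J_exp[typ], 1)
    params[typ] = (round(a, 3), round(b, 3))
    resid.extend(a * fc[typ] + b - J_exp[typ])

print(params, f"rmsd = {np.sqrt(np.mean(np.square(resid))):.2f} Hz")
```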
A blueprint for strategic urban research: the urban piazza
Kourtit, Karima; Nijkamp, Peter; Franklin, Rachel S.; Rodríguez-Pose, Andrés
2014-01-01
Urban research in many countries has failed to keep up with the pace of rapidly and constantly evolving urban change. The growth of cities, the increasing complexity of their functions and the complex intra- and inter-urban linkages in this ‘urban century’ demand new approaches to urban analysis, which, from a systemic perspective, supersede the existing fragmentation in urban studies. In this paper we propose the concept of the urban piazza as a framework in order to address some of the inefficiencies associated with current urban analysis. By combining wealth-creating potential with smart urban mobility, ecological resilience and social buzz in this integrated and systemic framework, the aim is to set the basis for a ‘New Urban World’ research blueprint, which lays the foundation for a broader and more integrated research programme for strategic urban issues. PMID:25339782
Erlich, Yaniv; Gordon, Assaf; Brand, Michael; Hannon, Gregory J.; Mitra, Partha P.
2011-01-01
Over the past three decades we have steadily increased our knowledge of the genetic basis of many severe disorders. Nevertheless, there are still great challenges in applying this knowledge routinely in the clinic, mainly due to the relatively tedious and expensive process of genotyping. Since the genetic variations that underlie the disorders are relatively rare in the population, they can be thought of as a sparse signal. Using methods and ideas from compressed sensing and group testing, we have developed a cost-effective genotyping protocol to detect carriers for severe genetic disorders. In particular, we have adapted our scheme to a recently developed class of high throughput DNA sequencing technologies. The mathematical framework presented here has some important distinctions from the 'traditional' compressed sensing and group testing frameworks in order to address biological and technical constraints of our setting. PMID:21451737
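In the compressed-sensing view, pooled measurements y = Ax of a sparse carrier vector x can be decoded greedily. The sketch below uses a generic Gaussian sensing matrix and orthogonal matching pursuit as stand-ins for the paper's pooling design and decoder.

```python
import numpy as np

rng = np.random.default_rng(3)
n, m, k = 200, 40, 3            # individuals, pooled assays, carriers
A = rng.normal(size=(m, n))     # generic sensing matrix (stand-in for pools)
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = 1.0   # rare carriers: a sparse signal
y = A @ x                                  # pooled measurements

# Orthogonal matching pursuit: repeatedly pick the column most correlated
# with the residual, then re-fit the measurements on the selected support.
support, r = [], y.copy()
for _ in range(k):
    support.append(int(np.argmax(np.abs(A.T @ r))))
    coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    r = y - A[:, support] @ coef

print(sorted(support), np.flatnonzero(x).tolist())  # recovered vs true
```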
Qian, Ying-Jun; Li, Shi-Zhu; Xu, Jun-Fang; Zhang, Li; Fu, Qing; Zhou, Xiao-Nong
2013-12-01
To set up a framework of indicators for schistosomiasis and malaria to guide the formulation and evaluation of vector-borne disease control policies focusing on adaptation to the negative impact of climate change. A two-level indicator framework was set up on the basis of a literature review, and the Delphi method was applied to a total of 22 and 19 experts working on schistosomiasis and malaria, respectively. The results were analyzed to calculate the weights of the various indicators. A total of 41 questionnaires were delivered, and 38 valid responses were received (92.7%). The system included 4 indicators at the first level, i.e. surveillance, scientific research, disease control and intervention, and adaptation capacity building, with 25 indicators for schistosomiasis and 21 for malaria at the second level. Among the indicators at the first level, disease surveillance ranked first with a weight of 0.32. Among the indicators at the second level, vector monitoring scored the highest for both schistosomiasis and malaria. The indicators set up by the Delphi method are practical, universal, and effective for use in the field, and provide technical support for establishing adaptation to climate change in the field of public health.
Correlation consistent basis sets for the atoms In–Xe
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mahler, Andrew; Wilson, Angela K., E-mail: akwilson@unt.edu
In this work, the correlation consistent family of Gaussian basis sets has been expanded to include all-electron basis sets for In–Xe. The methodology for developing these basis sets is described, and several examples of the performance and utility of the new sets have been provided. Dissociation energies and bond lengths for both homonuclear and heteronuclear diatomics demonstrate the systematic convergence behavior with respect to increasing basis set quality expected by the family of correlation consistent basis sets in describing molecular properties. Comparison with recently developed correlation consistent sets designed for use with the Douglas-Kroll Hamiltonian is provided.
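Systematic convergence of correlation consistent sets is commonly exploited through two-point extrapolations of the form E(X) = E_CBS + A·X⁻³ (a standard Helgaker-style formula in this field, not a result specific to this paper). A minimal sketch, with hypothetical energies:

```python
def cbs_two_point(e_x, x, e_y, y):
    """Two-point extrapolation assuming E(X) = E_CBS + A * X**-3 for basis
    set cardinal numbers X < Y (e.g., X=3 for triple-zeta, Y=4 for
    quadruple-zeta). Solving the two equations for E_CBS gives this form."""
    return (y ** 3 * e_y - x ** 3 * e_x) / (y ** 3 - x ** 3)

# Hypothetical correlation energies (hartree) at triple- and quadruple-zeta.
print(cbs_two_point(-0.3521, 3, -0.3678, 4))
```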
Advancing the use of performance evaluation in health care.
Traberg, Andreas; Jacobsen, Peter; Duthiers, Nadia Monique
2014-01-01
The purpose of this paper is to develop a framework for health care performance evaluation that enables decision makers to identify areas indicative of corrective actions. The framework should provide information on strategic pro-/regress in an operational context that justifies the need for organizational adjustments. The study adopts qualitative methods for constructing the framework, subsequently implementing the framework in a Danish magnetic resonance imaging (MRI) unit. Workshops and interviews form the basis of the qualitative construction phase, and two internal and five external databases are used for a quantitative data collection. By aggregating performance outcomes, collective measures of performance are achieved. This enables easy and intuitive identification of areas not strategically aligned. In general, the framework has proven helpful in an MRI unit, where operational decision makers have been struggling with extensive amounts of performance information. The implementation of the framework in a single case in a public and highly political environment restricts the generalizing potential. The authors acknowledge that there may be more suitable approaches in organizations with different settings. The strength of the framework lies in the identification of performance problems prior to decision making. The quality of decisions is directly related to the individual decision maker. The only function of the framework is to support these decisions. The study demonstrates a more refined and transparent use of performance reporting by combining strategic weight assignment and performance aggregation in hierarchies. In this way, the framework accentuates performance as a function of strategic progress or regress, thus assisting decision makers in exerting operational effort in pursuit of strategic alignment.
Globalization and health: a framework for analysis and action.
Woodward, D.; Drager, N.; Beaglehole, R.; Lipson, D.
2001-01-01
Globalization is a key challenge to public health, especially in developing countries, but the linkages between globalization and health are complex. Although a growing amount of literature has appeared on the subject, it is piecemeal, and suffers from a lack of an agreed framework for assessing the direct and indirect health effects of different aspects of globalization. This paper presents a conceptual framework for the linkages between economic globalization and health, with the intention that it will serve as a basis for synthesizing existing relevant literature, identifying gaps in knowledge, and ultimately developing national and international policies more favourable to health. The framework encompasses both the indirect effects on health, operating through the national economy, household economies and health-related sectors such as water, sanitation and education, as well as more direct effects on population-level and individual risk factors for health and on the health care system. Proposed also is a set of broad objectives for a programme of action to optimize the health effects of economic globalization. The paper concludes by identifying priorities for research corresponding with the five linkages identified as critical to the effects of globalization on health. PMID:11584737
Principles of Public Reason in the UNFCCC: Rethinking the Equity Framework.
Boran, Idil
2017-10-01
Since 2011, the focus of international negotiations under the UNFCCC has been on producing a new climate agreement to be adopted in 2015. This phase of negotiations is known as the Durban Platform for Enhanced Action. The goal has been to update the global effort on climate for long-term cooperation. In this period, various changes have been contemplated on the design of the architecture of the global climate effort. Whereas previously the negotiation process consisted of setting mandated targets exclusively for developed countries, the current setting requests each country to pledge its contribution to the climate effort in the form of Intended Nationally Determined Contributions (INDCs). The shift away from establishing negotiated targets for rich countries alone towards a universal system of participation through intended contributions raised persistent questions on how exactly the new agreement can ensure equitable terms. How to conceptualize equity within the 2015 climate agreement, and beyond, is the focus of this paper. The paper advances a framework on equity, which moves away from substantive moral conceptions of burden allocation toward refining principles of public reason specially designed for the negotiation process under the UNFCCC. The paper outlines the framework's main features and discusses how it can serve a facilitating role for multilateral discussion on equity on a long-term basis capable of adapting to changing circumstances.
The underlying pathway structure of biochemical reaction networks
Schilling, Christophe H.; Palsson, Bernhard O.
1998-01-01
Bioinformatics is yielding extensive, and in some cases complete, genetic and biochemical information about individual cell types and cellular processes, providing the composition of living cells and the molecular structure of their components. These components together perform integrated cellular functions that now need to be analyzed. In particular, the functional definition of biochemical pathways and their role in the context of the whole cell is lacking. In this study, we show how the mass balance constraints that govern the function of biochemical reaction networks lead to the translation of this problem into the realm of linear algebra. The functional capabilities of biochemical reaction networks, and thus the choices that cells can make, are reflected in the null space of their stoichiometric matrix. The null space is spanned by a finite number of basis vectors. We present an algorithm for the synthesis of a set of basis vectors for spanning the null space of the stoichiometric matrix, in which these basis vectors represent the underlying biochemical pathways that are fundamental to the corresponding biochemical reaction network. In other words, all possible flux distributions achievable by a defined set of biochemical reactions are represented by a linear combination of these basis pathways. These basis pathways thus represent the underlying pathway structure of the defined biochemical reaction network. This development is significant from a fundamental and conceptual standpoint because it yields a holistic definition of biochemical pathways in contrast to definitions that have arisen from the historical development of our knowledge about biochemical processes. Additionally, this new conceptual framework will be important in defining, characterizing, and studying biochemical pathways from the rapidly growing information on cellular function. PMID:9539712
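The linear-algebra core of this idea is small enough to sketch directly: build a stoichiometric matrix S and compute a basis of its null space, every steady-state flux distribution being a combination of the basis vectors. The toy network below is hypothetical.

```python
import numpy as np
from scipy.linalg import null_space

# Toy network: metabolites A, B, C; reactions v1: A->B, v2: B->C, v3: A->C,
# v4: ->A (uptake), v5: C-> (secretion). Rows are metabolites, columns fluxes.
S = np.array([[-1,  0, -1,  1,  0],   # A
              [ 1, -1,  0,  0,  0],   # B
              [ 0,  1,  1,  0, -1]])  # C

# Steady state requires S @ v = 0, so every admissible flux distribution is
# a linear combination of null-space basis vectors ("basis pathways").
N = null_space(S)
print(N.shape, np.allclose(S @ N, 0))  # (5, 2) True: two independent routes

# Note: null_space returns an orthonormal algebraic basis; the paper's
# algorithm constructs basis vectors that are biochemically interpretable.
```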
Lindsey, Bruce D.; Bickford, Tammy M.
1999-01-01
State agencies responsible for regulating pesticides are required by the U.S. Environmental Protection Agency to develop state management plans for specific pesticides. A key part of these management plans includes assessing the potential for contamination of ground water by pesticides throughout the state. As an example of how a statewide assessment could be implemented, a plan is presented for the Commonwealth of Pennsylvania to illustrate how a hydrogeologic framework can be used as a basis for sampling areas within a state with the highest likelihood of having elevated pesticide concentrations in ground water. The framework was created by subdividing the state into 20 areas on the basis of physiography and aquifer type. Each of these 20 hydrogeologic settings is relatively homogeneous with respect to aquifer susceptibility and pesticide use—factors that would be likely to affect pesticide concentrations in ground water. Existing data on atrazine occurrence in ground water were analyzed to determine (1) which areas of the state already have sufficient samples collected to make statistical comparisons among hydrogeologic settings, and (2) the effect of factors such as land use and aquifer characteristics on pesticide occurrence. The theoretical vulnerability and the results of the data analysis were used to rank each of the 20 hydrogeologic settings on the basis of vulnerability of ground water to contamination by pesticides. Example sampling plans are presented for nine of the hydrogeologic settings that lack sufficient data to assess vulnerability to contamination. Of the highest priority areas of the state, two out of four have been adequately sampled, one of the three areas of moderate to high priority has been adequately sampled, four of the nine areas of moderate to low priority have been adequately sampled, and none of the three low priority areas have been sampled. Sampling to date has shown that, even in the most vulnerable hydrogeologic settings, pesticide concentrations in ground water rarely exceed U.S. Environmental Protection Agency Drinking Water Standards or Health Advisory Levels. Analyses of samples from 1,159 private water supplies reveal only 3 sites for which samples with concentrations of pesticides exceeded drinking-water standards. In most cases, samples with elevated concentrations could be traced to point sources at pesticide loading or mixing areas. These analyses included data from some of the most vulnerable areas of the state, indicating that it is highly unlikely that pesticide concentrations in water from wells in other areas of the state would exceed the drinking-water standards unless a point source of contamination were present. Analysis of existing data showed that water from wells in areas of the state underlain by carbonate (limestone and dolomite) bedrock, which commonly have a high percentage of corn production, was much more likely to have pesticides detected. Application of pesticides to the land surface generally has not caused concentrations of the five state priority pesticides in ground water to exceed health standards; however, this study has not evaluated the potential human health effects of mixtures of pesticides or pesticide degradation products in drinking water. This study also has not determined whether concentrations in ground water are stable, increasing, or decreasing.
NASA Astrophysics Data System (ADS)
Witte, Jonathon; Neaton, Jeffrey B.; Head-Gordon, Martin
2016-05-01
With the aim of systematically characterizing the convergence of common families of basis sets such that general recommendations for basis sets can be made, we have tested a wide variety of basis sets against complete-basis binding energies across the S22 set of intermolecular interactions—noncovalent interactions of small and medium-sized molecules consisting of first- and second-row atoms—with three distinct density functional approximations: SPW92, a form of local-density approximation; B3LYP, a global hybrid generalized gradient approximation; and B97M-V, a meta-generalized gradient approximation with nonlocal correlation. We have found that it is remarkably difficult to reach the basis set limit; for the methods and systems examined, the most complete basis is Jensen's pc-4. The Dunning correlation-consistent sequence of basis sets converges slowly relative to the Jensen sequence. The Karlsruhe basis sets are quite cost effective, particularly when a correction for basis set superposition error is applied: counterpoise-corrected def2-SVPD binding energies are better than corresponding energies computed in comparably sized Dunning and Jensen bases, and on par with uncorrected results in basis sets 3-4 times larger. These trends are exhibited regardless of the level of density functional approximation employed. A sense of the magnitude of the intrinsic incompleteness error of each basis set not only provides a foundation for guiding basis set choice in future studies but also facilitates quantitative comparison of existing studies on similar types of systems.
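The counterpoise correction referred to above combines five single-point energies; a minimal sketch of the bookkeeping (with made-up numbers, not results from the paper):

```python
def interaction_energies(e_ab, e_a_mono, e_b_mono, e_a_ghost, e_b_ghost):
    """Uncorrected and counterpoise-corrected binding energies.

    e_ab      : dimer energy in the dimer basis
    e_*_mono  : monomer energies, each in its own basis
    e_*_ghost : monomer energies in the full dimer basis (ghost partner)
    """
    e_int = e_ab - e_a_mono - e_b_mono          # uncorrected
    e_int_cp = e_ab - e_a_ghost - e_b_ghost     # counterpoise-corrected
    return e_int, e_int_cp, e_int_cp - e_int    # last term is the BSSE

# Hypothetical energies (hartree) for a small dimer in a modest basis.
print(interaction_energies(-152.1300, -76.0600, -76.0620, -76.0610, -76.0635))
```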
Inferring the nature of anthropogenic threats from long-term abundance records.
Shoemaker, Kevin T; Akçakaya, H Resit
2015-02-01
Diagnosing the processes that threaten species persistence is critical for recovery planning and risk forecasting. Dominant threats are typically inferred by experts on the basis of a patchwork of informal methods. Transparent, quantitative diagnostic tools would contribute much-needed consistency, objectivity, and rigor to the process of diagnosing anthropogenic threats. Long-term census records, available for an increasingly large and diverse set of taxa, may exhibit characteristic signatures of specific threatening processes and thereby provide information for threat diagnosis. We developed a flexible Bayesian framework for diagnosing threats on the basis of long-term census records and diverse ancillary sources of information. We tested this framework with simulated data from artificial populations subjected to varying degrees of exploitation and habitat loss and several real-world abundance time series for which threatening processes are relatively well understood: bluefin tuna (Thunnus maccoyii) and Atlantic cod (Gadus morhua) (exploitation) and Red Grouse (Lagopus lagopus scotica) and Eurasian Skylark (Alauda arvensis) (habitat loss). Our method correctly identified the process driving population decline for over 90% of time series simulated under moderate to severe threat scenarios. Successful identification of threats approached 100% for severe exploitation and habitat loss scenarios. Our method identified threats less successfully when threatening processes were weak and when populations were simultaneously affected by multiple threats. Our method selected the presumed true threat model for all real-world case studies, although results were somewhat ambiguous in the case of the Eurasian Skylark. In the latter case, incorporation of an ancillary source of information (records of land-use change) increased the weight assigned to the presumed true model from 70% to 92%, illustrating the value of the proposed framework in bringing diverse sources of information into a common rigorous framework. Ultimately, our framework may greatly assist conservation organizations in documenting threatening processes and planning species recovery. © 2014 Society for Conservation Biology.
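The diagnostic logic, comparing the marginal likelihood of a census record under competing threat mechanisms, can be caricatured as below. The two one-parameter models and all data are toy stand-ins for the paper's far richer framework.

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(4)

def trajectory(model, p, T=30, N0=500.0, g=0.1, K0=500.0):
    # Caricature dynamics: "exploitation" removes a fraction p each year;
    # "habitat" shrinks the carrying capacity of a Ricker model by p per year.
    N, K = [N0], K0
    for _ in range(T - 1):
        if model == "exploitation":
            N.append(N[-1] * np.exp(g) * (1.0 - p))
        else:
            K *= (1.0 - p)
            N.append(N[-1] * np.exp(g * (1.0 - N[-1] / K)))
    return np.maximum(np.array(N), 1e-9)

# Hypothetical census record generated under habitat loss.
counts = rng.poisson(trajectory("habitat", 0.05))

def marginal_loglike(model, grid=np.linspace(0.01, 0.2, 40)):
    # Crude grid marginalization over each model's single rate parameter.
    lls = np.array([poisson.logpmf(counts, trajectory(model, p)).sum()
                    for p in grid])
    return lls.max() + np.log(np.mean(np.exp(lls - lls.max())))

l_ex = marginal_loglike("exploitation")
l_hab = marginal_loglike("habitat")
w_hab = 1.0 / (1.0 + np.exp(l_ex - l_hab))  # posterior weight, equal priors
print(f"P(habitat loss | data) ~ {w_hab:.2f}")
```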
Evolutionary interrogation of human biology in well-annotated genomic framework of rhesus macaque.
Zhang, Shi-Jian; Liu, Chu-Jun; Yu, Peng; Zhong, Xiaoming; Chen, Jia-Yu; Yang, Xinzhuang; Peng, Jiguang; Yan, Shouyu; Wang, Chenqu; Zhu, Xiaotong; Xiong, Jingwei; Zhang, Yong E; Tan, Bertrand Chin-Ming; Li, Chuan-Yun
2014-05-01
With genome sequence and composition highly analogous to human, rhesus macaque represents a unique reference for evolutionary studies of human biology. Here, we developed a comprehensive genomic framework of rhesus macaque, the RhesusBase2, for evolutionary interrogation of human genes and the associated regulations. A total of 1,667 next-generation sequencing (NGS) data sets were processed, integrated, and evaluated, generating 51.2 million new functional annotation records. With extensive NGS annotations, RhesusBase2 refined the fine-scale structures in 30% of the macaque Ensembl transcripts, reporting an accurate, up-to-date set of macaque gene models. On the basis of these annotations and accurate macaque gene models, we further developed an NGS-oriented Molecular Evolution Gateway to access and visualize macaque annotations in reference to human orthologous genes and associated regulations (www.rhesusbase.org/molEvo). We highlighted the application of this well-annotated genomic framework in generating hypothetical link of human-biased regulations to human-specific traits, by using mechanistic characterization of the DIEXF gene as an example that provides novel clues to the understanding of digestive system reduction in human evolution. On a global scale, we also identified a catalog of 9,295 human-biased regulatory events, which may represent novel elements that have a substantial impact on shaping human transcriptome and possibly underpin recent human phenotypic evolution. Taken together, we provide an NGS data-driven, information-rich framework that will broadly benefit genomics research in general and serves as an important resource for in-depth evolutionary studies of human biology.
The Natural Hospital Environment: a Socio-Technical-Material perspective.
Fernando, Juanita; Dawson, Linda
2014-02-01
This paper introduces two concepts into analyses of information security and hospital-based information systems: a Socio-Technical-Material theoretical framework and the Natural Hospital Environment. The research is grounded in a review of pertinent literature with previously published Australian (Victoria) case study data to analyse the way clinicians work with privacy and security in their work. The analysis was sorted into thematic categories, providing the basis for the Natural Hospital Environment and Socio-Technical-Material framework theories discussed here. Natural Hospital Environments feature inadequate yet pervasive computer use, aural privacy shortcomings, shared workspace, meagre budgets, complex regulation that hinders training outcomes, and outdated infrastructure, and are highly interruptive. Working collaboratively in many cases, participants found ways to avoid or misuse security tools, such as passwords or screensavers, for patient care. Workgroup infrastructure was old, architecturally limited, haphazard in some instances, and less useful than paper handover sheets for ensuring the quality of patient care outcomes. Despite valiant efforts by some participants, they were unable to control factors influencing the privacy of patient health information in public hospital settings. Future improvements to hospital-based organisational frameworks for e-health can only be made when there is an improved understanding of the Socio-Technical-Material theoretical framework and Natural Hospital Environment contexts. Aspects within the control of clinicians and administrators can be addressed directly, although some others are beyond their control. An understanding and acknowledgement of these issues will benefit the management and planning of improved and secure hospital settings. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Transnational gestational surrogacy: does it have to be exploitative?
Kirby, Jeffrey
2014-01-01
This article explores the controversial practice of transnational gestational surrogacy and poses a provocative question: Does it have to be exploitative? Various existing models of exploitation are considered and a novel exploitation-evaluation heuristic is introduced to assist in the analysis of the potentially exploitative dimensions/elements of complex health-related practices. On the basis of application of the heuristic, I conclude that transnational gestational surrogacy, as currently practiced in low-income country settings (such as rural, western India), is exploitative of surrogate women. Arising out of consideration of the heuristic's exploitation conditions, a set of public education and enabled choice, enhanced protections, and empowerment reforms to transnational gestational surrogacy practice is proposed that, if incorporated into a national regulatory framework and actualized within a low income country, could possibly render such practice nonexploitative.
Resourcing the National Goals for Schooling: An Agreed Framework of Principles for Funding Schools
ERIC Educational Resources Information Center
Ministerial Council on Education, Employment, Training and Youth Affairs (NJ1), 2012
2012-01-01
Funding for school education in Australia should be on the basis of clear and agreed policy principles for achieving effectiveness, efficiency, equity and a socially and culturally cohesive society. On the basis of these principles a national framework for funding schools will be supported by complementary State and Commonwealth models for funding…
First principles calculations for interaction of tyrosine with (ZnO)3 cluster
NASA Astrophysics Data System (ADS)
Singh, Satvinder; Singh, Gurinder; Kaura, Aman; Tripathi, S. K.
2018-04-01
First-principles calculations have been performed to study the interaction of the phenol ring of tyrosine (C6H5OH) with a (ZnO)3 atomic cluster. All calculations have been performed within the density functional theory (DFT) framework. Structural and electronic properties of (ZnO)3/C6H5OH have been studied. A Gaussian basis set approach has been adopted for the calculations. A ring-type, most stable (ZnO)3 atomic cluster has been modeled, analyzed, and used for the calculations. The consistency of the results with previous studies is also presented.
First principle investigation of structural and optical properties of cubic titanium dioxide
NASA Astrophysics Data System (ADS)
Dash, Debashish; Chaudhury, Saurabh; Tripathy, Susanta K.
2018-05-01
This paper presents an analysis of the structural and optical properties of cubic titanium dioxide (TiO2) using the Orthogonalized Linear Combinations of Atomic Orbitals (OLCAO) basis set within the framework of Density Functional Theory (DFT). The structural property, especially the lattice constant 'a', and the optical properties such as refractive index, extinction coefficient, and reflectivity are investigated and discussed in the energy range of 0-16 eV. Further, the results have been compared with previous theoretical as well as experimental results. It was found that the DFT-based simulation results are a good approximation to the experimental results.
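The optical quantities named above follow from the complex dielectric function by standard textbook relations (not formulas specific to this paper); a minimal sketch with hypothetical input values:

```python
import numpy as np

def optical_constants(eps1, eps2):
    """Refractive index n, extinction coefficient k, and normal-incidence
    reflectivity R from the real (eps1) and imaginary (eps2) parts of the
    dielectric function, via the standard relations."""
    mod = np.sqrt(eps1 ** 2 + eps2 ** 2)
    n = np.sqrt((mod + eps1) / 2.0)
    k = np.sqrt((mod - eps1) / 2.0)
    R = ((n - 1.0) ** 2 + k ** 2) / ((n + 1.0) ** 2 + k ** 2)
    return n, k, R

# Hypothetical dielectric data at a single photon energy.
print(optical_constants(6.5, 1.2))
```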
Many-body-theory study of lithium photoionization
NASA Technical Reports Server (NTRS)
Chang, T. N.; Poe, R. T.
1975-01-01
A detailed theoretical calculation is carried out for the photoionization of lithium at low energies within the framework of Brueckner-Goldstone perturbational approach. In this calculation extensive use is made of the recently developed multiple-basis-set technique. Through this technique all second-order perturbation terms, plus a number of important classes of terms to infinite order, have been taken into account. Analysis of the results enables one to resolve the discrepancies between two previous works on this subject. The detailed calculation also serves as a test on the convergence of the many-body perturbation-expansion approach.
Proof Rules for Automated Compositional Verification through Learning
NASA Technical Reports Server (NTRS)
Barringer, Howard; Giannakopoulou, Dimitra; Pasareanu, Corina S.
2003-01-01
Compositional proof systems not only enable the stepwise development of concurrent processes but also provide a basis to alleviate the state explosion problem associated with model checking. An assume-guarantee style of specification and reasoning has long been advocated to achieve compositionality. However, this style of reasoning is often non-trivial, typically requiring human input to determine appropriate assumptions. In this paper, we present novel assume-guarantee rules in the setting of finite labelled transition systems with blocking communication. We show how these rules can be applied in an iterative and fully automated fashion within a framework based on learning.
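For reference, the best-known asymmetric assume-guarantee rule, which learning-based frameworks automate by having an algorithm such as L* conjecture the assumption A, can be written as (the paper's novel rules generalize this pattern):

```latex
\frac{\langle A \rangle \, M_1 \, \langle P \rangle
      \qquad
      \langle \mathit{true} \rangle \, M_2 \, \langle A \rangle}
     {\langle \mathit{true} \rangle \, M_1 \parallel M_2 \, \langle P \rangle}
```

Read: if M1 satisfies property P whenever its environment behaves like A, and M2 satisfies A unconditionally, then the parallel composition M1 ∥ M2 satisfies P.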
Ory, Marcia G; Altpeter, Mary; Belza, Basia; Helduser, Janet; Zhang, Chen; Smith, Matthew Lee
2014-01-01
Dissemination and implementation (D&I) frameworks are increasingly being promoted in public health research. However, less is known about their uptake in the field, especially for diverse sets of programs. Limited questionnaires exist to assess the ways that frameworks can be utilized in program planning and evaluation. We present a case study from the United States that describes the implementation of the RE-AIM framework by state aging services providers and public health partners and a questionnaire that can be used to assess the utility of such frameworks in practice. An online questionnaire was developed to capture community perspectives about the utility of the RE-AIM framework. Distributed to project leads in 27 funded states in an evidence-based disease prevention initiative for older adults, 40 key stakeholders responded representing a 100% state-participation rate among the 27 funded states. Findings suggest that there is perceived utility in using the RE-AIM framework when evaluating grand-scale initiatives for older adults. The RE-AIM framework was seen as useful for planning, implementation, and evaluation with relevance for evaluators, providers, community leaders, and policy makers. Yet, the uptake was not universal, and some respondents reported difficulties in use, especially adopting the framework as a whole. This questionnaire can serve as the basis to assess ways the RE-AIM framework can be utilized by practitioners in state-wide D&I efforts. Maximal benefit can be derived from examining the assessment of RE-AIM-related knowledge and confidence as part of a continual quality assurance process. We recommend such an assessment be performed before the implementation of new funding initiatives and throughout their course to assess RE-AIM uptake and to identify areas for technical assistance.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Choi, Sunghwan; Hong, Kwangwoo; Kim, Jaewook
2015-03-07
We developed a self-consistent field program based on Kohn-Sham density functional theory using Lagrange-sinc functions as a basis set and examined its numerical accuracy for atoms and molecules through comparison with the results of Gaussian basis sets. The result of the Kohn-Sham inversion formula from the Lagrange-sinc basis set manifests that the pseudopotential method is essential for cost-effective calculations. The Lagrange-sinc basis set shows faster convergence of the kinetic and correlation energies of benzene as its size increases than the finite difference method does, though both share the same uniform grid. Using a scaling factor smaller than or equal to 0.226 bohr and pseudopotentials with nonlinear core correction, its accuracy for the atomization energies of the G2-1 set is comparable to all-electron complete basis set limits (mean absolute deviation ≤1 kcal/mol). The same basis set also shows small mean absolute deviations in the ionization energies, electron affinities, and static polarizabilities of atoms in the G2-1 set. In particular, the Lagrange-sinc basis set shows high accuracy with rapid convergence in describing density or orbital changes by an external electric field. Moreover, the Lagrange-sinc basis set can readily improve its accuracy toward a complete basis set limit by simply decreasing the scaling factor regardless of systems.
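Lagrange-sinc functions are cardinal functions on a uniform grid; the sketch below constructs them and checks orthonormality numerically. The grid extent and spacing are arbitrary illustration choices, not the paper's settings.

```python
import numpy as np

# Lagrange-sinc basis on a uniform grid: phi_n(x) = sinc((x - x_n)/h)/sqrt(h),
# with np.sinc(u) = sin(pi*u)/(pi*u). Each function equals 1/sqrt(h) at its
# own grid point and vanishes at every other grid point (cardinality).
h = 0.2                           # scaling factor = grid spacing
centers = h * np.arange(-20, 21)  # centers[20] sits at x = 0

def phi(n, x):
    return np.sinc((x - centers[n]) / h) / np.sqrt(h)

# Check approximate orthonormality by quadrature on a fine, wide grid.
x = np.linspace(-12.0, 12.0, 240001)
dx = x[1] - x[0]
s00 = np.sum(phi(20, x) * phi(20, x)) * dx
s01 = np.sum(phi(20, x) * phi(21, x)) * dx
print(f"<phi0|phi0> = {s00:.3f}, <phi0|phi1> = {s01:.1e}")  # ~1 and ~0
```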
A Model-Based Prioritisation Exercise for the European Water Framework Directive
Daginnus, Klaus; Gottardo, Stefania; Payá-Pérez, Ana; Whitehouse, Paul; Wilkinson, Helen; Zaldívar, José-Manuel
2011-01-01
A model-based prioritisation exercise has been carried out for the Water Framework Directive (WFD) implementation. The approach considers two aspects: the hazard of a certain chemical and its exposure levels, and focuses on aquatic ecosystems, but also takes into account hazards due to secondary poisoning, bioaccumulation through the food chain and potential human health effects. A list provided by EU Member States, Stakeholders and Non-Governmental Organizations comprising 2,034 substances was evaluated according to hazard and exposure criteria. Then 78 substances classified as “of high concern” where analysed and ranked in terms of risk ratio (Predicted Environmental Concentration/Predicted No-Effect Concentration). This exercise has been complemented by a monitoring-based prioritization exercise using data provided by Member States. The proposed approach constitutes the first step in setting the basis for an open modular screening tool that could be used for the next prioritization exercises foreseen by the WFD. PMID:21556195
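The ranking step reduces to sorting substances by the PEC/PNEC risk ratio; a minimal sketch with invented names and concentrations:

```python
# Hazard-times-exposure prioritisation: rank substances by the risk ratio
# PEC/PNEC. Names and concentrations below are invented for illustration.
substances = {
    "substance_A": {"pec": 0.80, "pnec": 0.10},
    "substance_B": {"pec": 0.05, "pnec": 0.20},
    "substance_C": {"pec": 1.50, "pnec": 0.50},
}
ranked = sorted(substances.items(),
                key=lambda kv: kv[1]["pec"] / kv[1]["pnec"], reverse=True)
for name, d in ranked:
    print(f"{name}: risk ratio = {d['pec'] / d['pnec']:.1f}")
```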
Locality preserving non-negative basis learning with graph embedding.
Ghanbari, Yasser; Herrington, John; Gur, Ruben C; Schultz, Robert T; Verma, Ragini
2013-01-01
The high dimensionality of connectivity networks necessitates the development of methods identifying the connectivity building blocks that not only characterize the patterns of brain pathology but also reveal representative population patterns. In this paper, we present a non-negative component analysis framework for learning localized and sparse sub-network patterns of connectivity matrices by decomposing them into two sets of discriminative and reconstructive bases. In order to obtain components that are designed towards extracting population differences, we exploit the geometry of the population by using a graph-theoretical scheme that imposes locality-preserving properties while maintaining the underlying distance between distant nodes in the original and the projected space. The effectiveness of the proposed framework is demonstrated by applying it to two clinical studies using connectivity matrices derived from DTI to study a population of subjects with ASD, as well as a developmental study of structural brain connectivity that extracts gender differences.
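A generic flavor of such constrained decompositions is graph-regularized non-negative matrix factorization; the sketch below follows the standard GNMF multiplicative updates (in the spirit of Cai et al.) on random data and is not the authors' discriminative/reconstructive formulation.

```python
import numpy as np

rng = np.random.default_rng(5)
# V: non-negative connectivity features (m features x n subjects); A: a
# subject-similarity graph encoding the geometry of the population.
m, n, k = 30, 40, 4
V = rng.random((m, n))
A = np.triu((rng.random((n, n)) > 0.9).astype(float), 1)
A = A + A.T                       # symmetric adjacency
D = np.diag(A.sum(1))             # degree matrix
lam = 0.1

# Minimize ||V - W H||^2 + lam * Tr(H L H^T) with L = D - A, using the
# standard GNMF multiplicative updates; the columns of W are learned bases.
W = rng.random((m, k)) + 0.1
H = rng.random((k, n)) + 0.1
for _ in range(300):
    W *= (V @ H.T) / (W @ H @ H.T + 1e-12)
    H *= (W.T @ V + lam * H @ A) / (W.T @ W @ H + lam * H @ D + 1e-12)

print(np.linalg.norm(V - W @ H))  # reconstruction error after learning
```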
Cartographic Modeling: Computer-assisted Analysis of Spatially Defined Neighborhoods
NASA Technical Reports Server (NTRS)
Berry, J. K.; Tomlin, C. D.
1982-01-01
Cartographic models addressing a wide variety of applications are composed of fundamental map processing operations. These primitive operations are neither data base nor application-specific. By organizing the set of operations into a mathematical-like structure, the basis for a generalized cartographic modeling framework can be developed. Among the major classes of primitive operations are those associated with reclassifying map categories, overlaying maps, determining distance and connectivity, and characterizing cartographic neighborhoods. The conceptual framework of cartographic modeling is established and techniques for characterizing neighborhoods are used as a means of demonstrating some of the more sophisticated procedures of computer-assisted map analysis. A cartographic model for assessing effective roundwood supply is briefly described as an example of a computer analysis. Most of the techniques described have been implemented as part of the map analysis package developed at the Yale School of Forestry and Environmental Studies.
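Primitive map-algebra operations of the kind described, reclassification and neighborhood characterization, map naturally onto array operations; a minimal sketch on a toy raster (not the original map analysis package):

```python
import numpy as np
from scipy.ndimage import uniform_filter

# A toy elevation raster and two primitive map-algebra operations:
# a neighborhood (focal) characterization and a reclassification.
elev = np.add.outer(np.arange(50.0), np.arange(50.0))

focal_mean = uniform_filter(elev, size=5, mode="nearest")  # 5x5 neighborhood
relief = np.abs(elev - focal_mean)                         # local relief

# Reclassify the relief into three categories (flat / moderate / steep).
classes = np.digitize(relief, bins=[0.5, 2.0])
print(np.bincount(classes.ravel()))
```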
Embedded-cluster calculations in a numeric atomic orbital density-functional theory framework.
Berger, Daniel; Logsdail, Andrew J; Oberhofer, Harald; Farrow, Matthew R; Catlow, C Richard A; Sherwood, Paul; Sokol, Alexey A; Blum, Volker; Reuter, Karsten
2014-07-14
We integrate the all-electron electronic structure code FHI-aims into the general ChemShell package for solid-state embedding quantum and molecular mechanical (QM/MM) calculations. A major undertaking in this integration is the implementation of pseudopotential functionality into FHI-aims to describe cations at the QM/MM boundary through effective core potentials and therewith prevent spurious overpolarization of the electronic density. Based on numeric atomic orbital basis sets, FHI-aims offers particularly efficient access to exact exchange and second order perturbation theory, rendering the established QM/MM setup an ideal tool for hybrid and double-hybrid level density functional theory calculations of solid systems. We illustrate this capability by calculating the reduction potential of Fe in the Fe-substituted ZSM-5 zeolitic framework and the reaction energy profile for (photo-)catalytic water oxidation at TiO2(110).
Responsible gambling: general principles and minimal requirements.
Blaszczynski, Alex; Collins, Peter; Fong, Davis; Ladouceur, Robert; Nower, Lia; Shaffer, Howard J; Tavares, Hermano; Venisse, Jean-Luc
2011-12-01
Many international jurisdictions have introduced responsible gambling programs. These programs intend to minimize negative consequences of excessive gambling, but vary considerably in their aims, focus, and content. Many responsible gambling programs lack a conceptual framework and, in the absence of empirical data, their components are based only on general considerations and impressions. This paper outlines the consensus viewpoint of an international group of researchers suggesting fundamental responsible gambling principles, roles of key stakeholders, and minimal requirements that stakeholders can use to frame and inform responsible gambling programs across jurisdictions. Such a framework does not purport to offer value statements regarding the legal status of gambling or its expansion. Rather, it proposes gambling-related initiatives aimed at government, industry, and individuals to promote responsible gambling and consumer protection. This paper argues that there is a set of basic principles and minimal requirements that should form the basis for every responsible gambling program.
Dweck, Carol S
2017-11-01
Drawing on both classic and current approaches, I propose a theory that integrates motivation, personality, and development within one framework, using a common set of principles and mechanisms. The theory begins by specifying basic needs and by suggesting how, as people pursue need-fulfilling goals, they build mental representations of their experiences (beliefs, representations of emotions, and representations of action tendencies). I then show how these needs, goals, and representations can serve as the basis of both motivation and personality, and can help to integrate disparate views of personality. The article builds on this framework to provide a new perspective on development, particularly on the forces that propel development and the roles of nature and nurture. I argue throughout that the focus on representations provides an important entry point for change and growth. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Optimization of selected molecular orbitals in group basis sets.
Ferenczy, György G; Adams, William H
2009-04-07
We derive a local basis equation which may be used to determine the orbitals of a group of electrons in a system when the orbitals of that group are represented by a group basis set, i.e., not the basis set one would normally use but a subset suited to a specific electronic group. The group orbitals determined by the local basis equation minimize the energy of a system when a group basis set is used and the orbitals of other groups are frozen. In contrast, under the constraint of a group basis set, the group orbitals satisfying the Huzinaga equation do not minimize the energy. In a test of the local basis equation on HCl, the group basis set included only 12 of the 21 functions in a basis set one might ordinarily use, but the calculated active orbital energies were within 0.001 hartree of the values obtained by solving the Hartree-Fock-Roothaan (HFR) equation using all 21 basis functions. The total energy found was just 0.003 hartree higher than the HFR value. The errors with the group basis set approximation to the Huzinaga equation were larger by over two orders of magnitude. Similar results were obtained for PCl₃ with the group basis approximation. Retaining more basis functions allows an even higher accuracy, as shown by the perfect reproduction of the HFR energy of HCl with 16 out of 21 basis functions in the valence basis set. When the core basis set was also truncated, no additional error was introduced in the calculations performed for HCl with various basis sets. The same calculations with fixed core orbitals taken from isolated heavy atoms added a small error of about 10⁻⁴ hartree. This offers a practical way to calculate wave functions with a predetermined fixed core and a reduced valence basis at reduced computational cost. The local basis equation can also be used to combine the above approximations with the assignment of local basis sets to groups of localized valence molecular orbitals and to derive a priori localized orbitals. An appropriately chosen localization and basis set assignment allowed a reproduction of the energy of n-hexane with an error of 10⁻⁵ hartree, while the energy difference between its two conformers was reproduced with similar accuracy for several combinations of localizations and basis set assignments. These calculations include localized orbitals extending over 4-5 heavy atoms and thus require solving reduced-dimension secular equations. The dimensions are not expected to increase with increasing system size, and thus the local basis equation may find use in linear scaling electronic structure calculations.
Electronic Structure Calculations of Hydrogen Storage in Lithium-Decorated Metal-Graphyne Framework.
Kumar, Sandeep; Dhilip Kumar, Thogluva Janardhanan
2017-08-30
Porous metal-graphyne framework (MGF) made up of a graphyne linker decorated with lithium has been investigated for hydrogen storage. Applying density functional theory with the spin-polarized generalized gradient approximation and the Perdew-Burke-Ernzerhof functional, including Grimme's dispersion correction, with a double numeric polarization basis set, the structural stability and physicochemical properties have been analyzed. Each linker binds two Li atoms over the surface of the graphyne linker, forming MGF-Li8 by Dewar coordination. On saturation with hydrogen, each Li atom physisorbs three H2 molecules, resulting in MGF-Li8-H24. H2 and Li interact by a charge polarization mechanism leading to elongation of the average H-H bond length, indicating physisorption. The sorption energy decreases gradually from ≈0.4 to 0.20 eV on H2 loading. Molecular dynamics simulations and the computed sorption energy range indicate high reversibility of H2 in the MGF-Li8 framework, with a hydrogen storage capacity of 6.4 wt %. The calculated thermodynamics indicate practical hydrogen storage at room temperature, making the Li-decorated MGF system a promising hydrogen storage material.
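The quoted gravimetric capacity is simple mass bookkeeping; as a consistency check (the host mass below is back-calculated, not a value taken from the paper), 24 H2 at 6.4 wt % implies a host formula mass near 708 g/mol:

```python
M_H2 = 2.016          # g/mol
n_H2 = 24             # three H2 per Li across the eight Li sites

def wt_percent(m_host, n=n_H2):
    # Gravimetric capacity: stored H2 mass over total (host + H2) mass.
    m_h2 = n * M_H2
    return 100.0 * m_h2 / (m_host + m_h2)

# The 708 g/mol host mass is inferred from the quoted 6.4 wt %, purely as
# a consistency check; it is not a formula mass quoted by the paper.
print(wt_percent(707.7))   # ~6.4
```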
Predicting Player Position for Talent Identification in Association Football
NASA Astrophysics Data System (ADS)
Razali, Nazim; Mustapha, Aida; Yatim, Faiz Ahmad; Aziz, Ruhaya Ab
2017-08-01
This paper introduces a new framework, from the perspective of Computer Science, for identifying talent in the sport of football based on the players' individual qualities: physical, mental, and technical. The combination of qualities as assessed by coaches is then used to predict the position that best suits each player in a particular team formation. Evaluation of the proposed framework is two-fold: quantitatively, via classification experiments to predict player position, and qualitatively, via a Talent Identification Site developed to achieve the same goal. Results from the classification experiments using Bayesian Networks, Decision Trees, and K-Nearest Neighbor have shown an average of 98% accuracy, which will promote consistency in decision-making through elimination of personal bias in team selection. The positive reviews of the Talent Identification Site based on user acceptance evaluation also indicate that the framework is sufficient to serve as the basis for developing an intelligent team management system in different sports, whereby the growth and performance of sport players can be monitored and identified.
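Of the three classifiers evaluated, k-nearest neighbors is the simplest to sketch; the ratings and labelling rule below are fabricated and merely illustrate the predict-position-from-qualities setup.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(6)
# Fabricated coach ratings (1-10) for physical, mental, technical quality.
X = rng.integers(1, 11, size=(120, 3)).astype(float)
# A made-up labelling rule standing in for coach-assigned best positions.
positions = np.array(["defender", "midfielder", "striker", "goalkeeper"])
y = positions[(X.sum(axis=1) // 8).astype(int) % 4]

clf = KNeighborsClassifier(n_neighbors=5)
print(cross_val_score(clf, X, y, cv=5).mean())  # cross-validated accuracy

clf.fit(X, y)
print(clf.predict([[9.0, 7.0, 8.0]]))           # position for a new player
```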
Towards a Lifecycle Information Framework and Technology in Manufacturing.
Hedberg, Thomas; Feeney, Allison Barnard; Helu, Moneer; Camelio, Jaime A
2017-06-01
Industry has been chasing the dream of integrating and linking data across the product lifecycle and enterprises for decades. However, it has been challenged by the fact that the context in which data is used varies based on the function/role in the product lifecycle that is interacting with the data. Holistically, the data across the product lifecycle must be considered an unstructured data-set because multiple data repositories and domain-specific schemas exist in each phase of the lifecycle. This paper explores a concept called the Lifecycle Information Framework and Technology (LIFT). LIFT is a conceptual framework for lifecycle information management and the integration of emerging and existing technologies, which together form the basis of a research agenda for dynamic information modeling in support of digital-data curation and reuse in manufacturing. This paper provides a discussion of the existing technologies and activities that the LIFT concept leverages. Also, the paper describes the motivation for applying such work to the domain of manufacturing. Then, the LIFT concept is discussed in detail, while underlying technologies are further examined and a use case is detailed. Lastly, potential impacts are explored.
Vervisch, Thomas G A; Vlassenroot, Koen; Braeckman, Johan
2013-04-01
The failure of food security and livelihood interventions to adapt to conflict settings remains a key challenge in humanitarian responses to protracted crises. This paper proposes a social capital analysis to address this policy gap, adding a political economy dimension on food security and conflict to the actor-based livelihood framework. A case study of three hillsides in north Burundi provides an ethnographic basis for this hypothesis. While relying on a theoretical framework in which different combinations of social capital (bonding, bridging, and linking) account for a diverse range of outcomes, the findings offer empirical insights into how social capital portfolios adapt to a protracted crisis. It is argued that these social capital adaptations have the effect of changing livelihood policies, institutions, and processes (PIPs), and clarify the impact of the distribution of power and powerlessness on food security issues. In addition, they represent a solid way of integrating political economy concerns into the livelihood framework. © 2013 The Author(s). Journal compilation © Overseas Development Institute, 2013.
Baudin, Pablo; Kristensen, Kasper
2017-06-07
We present a new framework for calculating coupled cluster (CC) excitation energies at a reduced computational cost. It relies on correlated natural transition orbitals (NTOs), denoted CIS(D')-NTOs, which are obtained by diagonalizing generalized hole and particle density matrices determined from configuration interaction singles (CIS) information and additional terms that represent correlation effects. A transition-specific reduced orbital space is determined based on the eigenvalues of the CIS(D')-NTOs, and a standard CC excitation energy calculation is then performed in that reduced orbital space. The new method is denoted CorNFLEx (Correlated Natural transition orbital Framework for Low-scaling Excitation energy calculations). We calculate second-order approximate CC singles and doubles (CC2) excitation energies for a test set of organic molecules and demonstrate that CorNFLEx yields excitation energies of CC2 quality at a significantly reduced computational cost, even for relatively small systems and delocalized electronic transitions. In order to illustrate the potential of the method for large molecules, we also apply CorNFLEx to calculate CC2 excitation energies for a series of solvated formamide clusters (up to 4836 basis functions).
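The NTO construction at the heart of the method is a singular value decomposition of the single-excitation amplitudes; the sketch below shows the plain CIS case on random amplitudes (the paper augments the density matrices with correlation terms before diagonalizing).

```python
import numpy as np

rng = np.random.default_rng(7)
# Toy CIS single-excitation amplitudes T (n_occ x n_virt). The hole and
# particle density matrices are T T^T and T^T T; their eigenvectors, i.e.
# the singular vectors of T, are the natural transition orbitals.
n_occ, n_vir = 5, 12
T = rng.normal(size=(n_occ, n_vir))
T /= np.linalg.norm(T)                 # normalized excitation vector

U, s, Vt = np.linalg.svd(T, full_matrices=False)
weights = s ** 2                       # eigenvalues of the hole density
print(np.round(weights, 3), weights.sum())  # weights sum to 1

# Truncate the orbital space by keeping NTO pairs with significant weight;
# a CC calculation can then be run in this reduced space.
keep = weights > 0.01
hole_ntos, particle_ntos = U[:, keep], Vt[keep].T
print(hole_ntos.shape, particle_ntos.shape)
```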
Intelligent Control of a Sensor-Actuator System via Kernelized Least-Squares Policy Iteration
Liu, Bo; Chen, Sanfeng; Li, Shuai; Liang, Yongsheng
2012-01-01
In this paper, a new framework, called Compressive Kernelized Reinforcement Learning (CKRL), for computing near-optimal policies in sequential decision making under uncertainty is proposed by incorporating non-adaptive, data-independent random projections and nonparametric Kernelized Least-Squares Policy Iteration (KLSPI). Random projections are a fast, non-adaptive dimensionality-reduction framework in which high-dimensional data are projected onto a random lower-dimensional subspace via spherically random rotation and coordinate sampling. KLSPI introduces the kernel trick into the LSPI framework for reinforcement learning, often achieving faster convergence and providing automatic feature selection via various kernel sparsification approaches. In this approach, policies are computed in a low-dimensional subspace generated by projecting the high-dimensional features onto a set of random basis vectors. We first show how random projections constitute an efficient sparsification technique and how our method often converges faster than regular LSPI, at lower computational cost. The theoretical foundation underlying this approach is a fast approximation of the singular value decomposition (SVD). Finally, simulation results are presented on benchmark MDP domains, which confirm gains both in computation time and in performance in large feature spaces. PMID:22736969
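As a rough illustration of the random-projection step described above, here is a minimal sketch in Python/NumPy (a dense Gaussian projection; the paper's spherically random rotation and coordinate sampling variant differs, and all names here are hypothetical):

    import numpy as np

    def random_projection(features, k, seed=0):
        # Project an (n x d) feature matrix onto a random k-dimensional
        # subspace. Gaussian entries with variance 1/k approximately
        # preserve pairwise distances (Johnson-Lindenstrauss).
        rng = np.random.default_rng(seed)
        d = features.shape[1]
        R = rng.normal(0.0, 1.0 / np.sqrt(k), size=(d, k))
        return features @ R

    # Example: compress 10,000-dimensional features to 50 dimensions
    phi = np.random.rand(200, 10_000)
    phi_low = random_projection(phi, k=50)

Policies would then be computed by running LSPI in the low-dimensional feature space phi_low rather than in phi.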
Accurate Methods for Large Molecular Systems (Preprint)
2009-01-06
tensor, EFP calculations are basis set dependent. The smallest recommended basis set is 6-31++G(d,p). The dependence of the computational cost of... and second-order perturbation theory (MP2) levels with the 6-31G(d,p) basis set. Additional SFM tests are presented for a small set of alpha...helices using the 6-31++G(d,p) basis set. The larger 6-311++G(3df,2p) basis set is employed for creating all EFPs used for non-bonded interactions, since
Scobbie, Lesley; Dixon, Diane; Wyke, Sally
2011-05-01
Setting and achieving goals is fundamental to rehabilitation practice, but goal setting has been criticized for being atheoretical, and the key components of replicable goal-setting interventions are not well established. To describe the development of a theory-based goal-setting practice framework for use in rehabilitation settings and to detail its component parts. Causal modelling was used to map theories of behaviour change onto the process of setting and achieving rehabilitation goals, and to suggest the mechanisms through which patient outcomes are likely to be affected. A multidisciplinary task group developed the causal model into a practice framework for use in rehabilitation settings through iterative discussion and implementation with six patients. Four components of a goal-setting and action-planning practice framework were identified: (i) goal negotiation, (ii) goal identification, (iii) planning, and (iv) appraisal and feedback. The variables hypothesized to effect change in patient outcomes were self-efficacy and action-plan attainment. A theory-based goal-setting practice framework for use in rehabilitation settings is described. The framework requires further development and systematic evaluation in a range of rehabilitation settings.
Weiss, Curtis H; Krishnan, Jerry A; Au, David H; Bender, Bruce G; Carson, Shannon S; Cattamanchi, Adithya; Cloutier, Michelle M; Cooke, Colin R; Erickson, Karen; George, Maureen; Gerald, Joe K; Gerald, Lynn B; Goss, Christopher H; Gould, Michael K; Hyzy, Robert; Kahn, Jeremy M; Mittman, Brian S; Mosesón, Erika M; Mularski, Richard A; Parthasarathy, Sairam; Patel, Sanjay R; Rand, Cynthia S; Redeker, Nancy S; Reiss, Theodore F; Riekert, Kristin A; Rubenfeld, Gordon D; Tate, Judith A; Wilson, Kevin C; Thomson, Carey C
2016-10-15
Many advances in health care fail to reach patients. Implementation science is the study of novel approaches to mitigate this evidence-to-practice gap. The American Thoracic Society (ATS) created a multidisciplinary ad hoc committee to develop a research statement on implementation science in pulmonary, critical care, and sleep medicine. The committee used an iterative consensus process to define implementation science and review the use of conceptual frameworks to guide implementation science for the pulmonary, critical care, and sleep community and to explore how professional medical societies such as the ATS can promote implementation science. The committee defined implementation science as the study of the mechanisms by which effective health care interventions are either adopted or not adopted in clinical and community settings. The committee also distinguished implementation science from the act of implementation. Ideally, implementation science should include early and continuous stakeholder involvement and the use of conceptual frameworks (i.e., models to systematize the conduct of studies and standardize the communication of findings). Multiple conceptual frameworks are available, and we suggest the selection of one or more frameworks on the basis of the specific research question and setting. Professional medical societies such as the ATS can have an important role in promoting implementation science. Recommendations for professional societies to consider include: unifying implementation science activities through a single organizational structure, linking front-line clinicians with implementation scientists, seeking collaborations to prioritize and conduct implementation science studies, supporting implementation science projects through funding opportunities, working with research funding bodies to set the research agenda in the field, collaborating with external bodies responsible for health care delivery, disseminating results of implementation science through scientific journals and conferences, and teaching the next generation about implementation science through courses and other media. Implementation science plays an increasingly important role in health care. Through support of implementation science, the ATS and other professional medical societies can work with other stakeholders to lead this effort.
The PSEUDODOJO: Training and grading a 85 element optimized norm-conserving pseudopotential table
NASA Astrophysics Data System (ADS)
van Setten, M. J.; Giantomassi, M.; Bousquet, E.; Verstraete, M. J.; Hamann, D. R.; Gonze, X.; Rignanese, G.-M.
2018-05-01
First-principles calculations in crystalline structures are often performed with a planewave basis set. To make the number of basis functions tractable, two approximations are usually introduced: core electrons are frozen and the diverging Coulomb potential near the nucleus is replaced by a smoother expression. The norm-conserving pseudopotential was the first successful method to apply these approximations in a fully ab initio way. Later on, more efficient and more exact approaches were developed based on the ultrasoft and the projector augmented wave formalisms. These formalisms are, however, more complex, and developing new features in these frameworks is usually more difficult than in the norm-conserving framework. Most of the existing tables of norm-conserving pseudopotentials, generated long ago, do not include the latest developments, are not systematically tested, or are not designed primarily for high precision. In this paper, we present our PSEUDODOJO framework for developing and testing full tables of pseudopotentials, and demonstrate it with a new table generated with the ONCVPSP approach. The PSEUDODOJO is an open source project, building on the ABIPY package, for developing and systematically testing pseudopotentials. At present it contains 7 different batteries of tests executed with ABINIT, which are performed as a function of the energy cutoff. The results of these tests are then used to provide hints for the energy cutoff for actual production calculations. Our final set contains 141 pseudopotentials split into a standard and a stringent accuracy table. In total around 70,000 calculations were performed to test the pseudopotentials. The process of developing the final table led to new insights into the effects of both the core-valence partitioning and the non-linear core corrections on the stability, convergence, and transferability of norm-conserving pseudopotentials. The PSEUDODOJO hence provides a set of pseudopotentials and general purpose tools for further testing and development, focusing on highly accurate calculations and their use in the development of ab initio packages. The pseudopotential files are available on the PSEUDODOJO web-interface pseudo-dojo.org under the name NC (ONCVPSP) v0.4 in the psp8, UPF2, and PSML 1.1 formats. The web-interface also provides the inputs, which are compatible with the 3.3.1 and higher versions of ONCVPSP. All tests have been performed with ABINIT 8.4.
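As a minimal sketch of the kind of cutoff-convergence test on which such hints are based (hypothetical data and tolerance; the actual PSEUDODOJO test batteries are run with ABINIT and are far more extensive):

    def cutoff_hint(ecuts, energies, tol=1e-3):
        # Smallest cutoff (Ha) whose total energy differs from the
        # best-converged value by less than tol (Ha/atom).
        ref = energies[-1]  # highest-cutoff result taken as reference
        for ecut, e in zip(ecuts, energies):
            if abs(e - ref) < tol:
                return ecut
        return ecuts[-1]

    # Illustrative convergence series (made-up numbers)
    ecuts = [20, 30, 40, 50, 60]
    energies = [-5.8213, -5.8450, -5.8461, -5.8463, -5.8463]
    print(cutoff_hint(ecuts, energies))  # -> 40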
Ogunnaike, Babatunde A; Gelmi, Claudio A; Edwards, Jeremy S
2010-05-21
Gene expression studies generate large quantities of data with the defining characteristic that the number of genes (whose expression profiles are to be determined) exceeds the number of available replicates by several orders of magnitude. Standard spot-by-spot analysis still seeks to extract useful information for each gene on the basis of the number of available replicates, and thus plays to the weakness of microarrays. On the other hand, because of the data volume, treating the entire data set as an ensemble, and developing theoretical distributions for these ensembles, provides a framework that plays instead to the strength of microarrays. We present theoretical results showing that, under reasonable assumptions, the distribution of microarray intensities follows the Gamma model, with the biological interpretations of the model parameters emerging naturally. We subsequently establish that for each microarray data set the fractional intensities can be represented as a mixture of Beta densities, and develop a procedure for using these results to draw statistical inference regarding differential gene expression. We illustrate the results with experimental data from gene expression studies on Deinococcus radiodurans following DNA damage using cDNA microarrays. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
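A sketch of the ensemble idea on synthetic data (not the authors' implementation): instead of testing gene by gene, fit the Gamma model to the pooled intensity distribution with SciPy.

    import numpy as np
    from scipy import stats

    # Synthetic microarray intensities pooled over all genes
    rng = np.random.default_rng(1)
    intensities = rng.gamma(shape=2.5, scale=400.0, size=20_000)

    # Fit the two-parameter Gamma model (location fixed at zero)
    alpha, loc, beta = stats.gamma.fit(intensities, floc=0)
    print(f"alpha = {alpha:.2f}, beta = {beta:.1f}")

The fitted ensemble distribution, rather than per-gene replicates, then serves as the reference against which differential expression is judged.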
Topological framework for local structure analysis in condensed matter
Lazar, Emanuel A.; Han, Jian; Srolovitz, David J.
2015-01-01
Physical systems are frequently modeled as sets of points in space, each representing the position of an atom, molecule, or mesoscale particle. As many properties of such systems depend on the underlying ordering of their constituent particles, understanding that structure is a primary objective of condensed matter research. Although perfect crystals are fully described by a set of translation and basis vectors, real-world materials are never perfect, as thermal vibrations and defects introduce significant deviation from ideal order. Meanwhile, liquids and glasses present yet more complexity. A complete understanding of structure thus remains a central, open problem. Here we propose a unified mathematical framework, based on the topology of the Voronoi cell of a particle, for classifying local structure in ordered and disordered systems that is powerful and practical. We explain the underlying reason why this topological description of local structure is better suited for structural analysis than continuous descriptions. We demonstrate the connection of this approach to the behavior of physical systems and explore how crystalline structure is compromised at elevated temperatures. We also illustrate potential applications to identifying defects in plastically deformed polycrystals at high temperatures, automating analysis of complex structures, and characterizing general disordered systems. PMID:26460045
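One basic ingredient of such an analysis, sketched here with SciPy, is the Voronoi cell of each particle and the number of faces it shares with neighbors (the paper's topological classification goes well beyond simple face counting):

    import numpy as np
    from scipy.spatial import Voronoi

    rng = np.random.default_rng(0)
    points = rng.random((200, 3))  # particle positions in a unit box

    vor = Voronoi(points)
    # In 3D, each Voronoi ridge is a face shared by exactly two input
    # points, so a particle's face count is the number of ridges it
    # participates in. (Boundary cells are unbounded; production
    # analyses would use a periodic box.)
    faces = np.zeros(len(points), dtype=int)
    for p, q in vor.ridge_points:
        faces[p] += 1
        faces[q] += 1
    print(faces[:10])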
Usvyat, Denis; Civalleri, Bartolomeo; Maschio, Lorenzo; Dovesi, Roberto; Pisani, Cesare; Schütz, Martin
2011-06-07
The atomic orbital basis set limit is approached in periodic correlated calculations for solid LiH. The valence correlation energy is evaluated at the level of the local periodic second order Møller-Plesset perturbation theory (MP2), using basis sets of progressively increasing size, and also employing "bond"-centered basis functions in addition to the standard atom-centered ones. Extended basis sets, which contain linear dependencies, are processed only at the MP2 stage via a dual basis set scheme. The local approximation (domain) error has been consistently eliminated by expanding the orbital excitation domains. As a final result, it is demonstrated that the complete basis set limit can be reached for both HF and local MP2 periodic calculations, and a general scheme is outlined for the definition of high-quality atomic-orbital basis sets for solids. © 2011 American Institute of Physics
Feller, David; Peterson, Kirk A
2013-08-28
The effectiveness of the recently developed, explicitly correlated coupled cluster method CCSD(T)-F12b is examined in terms of its ability to reproduce atomization energies derived from complete basis set extrapolations of standard CCSD(T). Most of the standard-method findings were obtained with aug-cc-pV7Z or aug-cc-pV8Z basis sets. For a few homonuclear diatomic molecules it was possible to push the basis set to the aug-cc-pV9Z level. F12b calculations were performed with the cc-pVnZ-F12 (n = D, T, Q) basis set sequence and were also extrapolated to the basis set limit using a Schwenke-style, parameterized formula. A systematic bias was observed in the F12b method with the (VTZ-F12/VQZ-F12) basis set combination. This bias resulted in the underestimation of reference values associated with small molecules (valence correlation energies < 0.5 E_h) and an even larger overestimation of atomization energies for bigger systems. Consequently, caution should be exercised in the use of F12b for high-accuracy studies. Root mean square and mean absolute deviation error metrics for this basis set combination were comparable to complete basis set values obtained with standard CCSD(T) and the aug-cc-pVDZ through aug-cc-pVQZ basis set sequence. However, the mean signed deviation was an order of magnitude larger. Problems partially due to basis set superposition error were identified with second-row compounds, which resulted in weak performance for the smaller VDZ-F12/VTZ-F12 combination of basis sets.
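For comparison, the simplest widely used CBS extrapolation is the two-point inverse-cubic formula for correlation energies; a sketch (the Schwenke-style formula used in the paper is a parameterized refinement of this idea):

    def cbs_two_point(e_small, e_large, n_small, n_large):
        # Assumes E(n) = E_CBS + A / n**3 for the correlation energy,
        # where n is the cardinal number of the basis set.
        c_s, c_l = n_small**3, n_large**3
        return (e_large * c_l - e_small * c_s) / (c_l - c_s)

    # Hypothetical correlation energies (Hartree) for n = 3 (T) and 4 (Q)
    print(cbs_two_point(-0.27512, -0.28214, 3, 4))  # ~ -0.28726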
NASA Astrophysics Data System (ADS)
Chmela, Jiří; Harding, Michael E.
2018-06-01
Optimised auxiliary basis sets for lanthanide atoms (Ce to Lu) for four basis sets of the Karlsruhe error-balanced segmented contracted def2 series (SVP, TZVP, TZVPP and QZVPP) are reported. These auxiliary basis sets enable the use of the resolution-of-the-identity (RI) approximation in post-Hartree-Fock methods, such as second-order perturbation theory (MP2) and coupled cluster (CC) theory. The auxiliary basis sets are tested on an enlarged set of about a hundred molecules, where the test criterion is the size of the RI error in MP2 calculations. Our tests also show that the same auxiliary basis sets can be used together with different effective core potentials. With these auxiliary basis sets, calculations of MP2 and CC quality can now be performed efficiently on medium-sized molecules containing lanthanides.
High-order nonlinear susceptibilities of He
NASA Astrophysics Data System (ADS)
Liu, W.-C.; Clark, Charles W.
1996-05-01
High-order nonlinear optical response of noble gases to intense laser radiation is of considerable experimental interest, but is difficult to measure or calculate accurately. We have begun a set of calculations of frequency-dependent nonlinear susceptibilities of He 1s^2, within the framework of Rayleigh-Schrödinger perturbation theory at lowest applicable order, with the goal of providing critically evaluated atomic data for modelling high harmonic generation processes. The atomic Hamiltonian is decomposed in terms of Hylleraas coordinates and spherical harmonics using the formalism of Pont and Shakeshaft (M. Pont and R. Shakeshaft, Phys. Rev. A 51, 257 (1995)), and the hierarchy of inhomogeneous equations of perturbation theory is solved iteratively. A combination of Hylleraas and Frankowski basis functions is used (J. D. Baker, Master's thesis, U. Delaware (1988); J. D. Baker, R. N. Hill, and J. D. Morgan, AIP Conference Proceedings 189, 123 (1989)); the compact Hylleraas basis provides a highly accurate representation of the ground state wavefunction, whereas the diffuse Frankowski basis functions efficiently reproduce the correct asymptotic structure of the perturbed orbitals.
A walk through the approximations of ab initio multiple spawning
NASA Astrophysics Data System (ADS)
Mignolet, Benoit; Curchod, Basile F. E.
2018-04-01
Full multiple spawning offers an in principle exact framework for excited-state dynamics, where nuclear wavefunctions in different electronic states are represented by a set of coupled trajectory basis functions that follow classical trajectories. The couplings between trajectory basis functions can be approximated to treat molecular systems, leading to the ab initio multiple spawning method which has been successfully employed to study the photochemistry and photophysics of several molecules. However, a detailed investigation of its approximations and their consequences is currently missing in the literature. In this work, we simulate the explicit photoexcitation and subsequent excited-state dynamics of a simple system, LiH, and we analyze (i) the effect of the ab initio multiple spawning approximations on different observables and (ii) the convergence of the ab initio multiple spawning results towards numerically exact quantum dynamics upon a progressive relaxation of these approximations. We show that, despite the crude character of the approximations underlying ab initio multiple spawning for this low-dimensional system, the qualitative excited-state dynamics is adequately captured, and affordable corrections can further be applied to ameliorate the coupling between trajectory basis functions.
Penocchio, Emanuele; Piccardo, Matteo; Barone, Vincenzo
2015-10-13
The B2PLYP double hybrid functional, coupled with the correlation-consistent triple-ζ cc-pVTZ (VTZ) basis set, has been validated in the framework of the semiexperimental (SE) approach for deriving accurate equilibrium structures of molecules containing up to 15 atoms. A systematic comparison between new B2PLYP/VTZ results and several equilibrium SE structures previously determined at other levels, in particular B3LYP/SNSD and CCSD(T) with various basis sets, has demonstrated the accuracy and the remarkable stability of this model chemistry for both equilibrium structures and vibrational corrections. New SE equilibrium structures for phenylacetylene, pyruvic acid, peroxyformic acid, and the phenyl radical are discussed and compared with literature data. Particular attention has been devoted to the discussion of systems for which a lack of sufficient experimental data prevents a complete SE determination. In order to obtain an accurate equilibrium SE structure in these situations, the so-called templating molecule approach is discussed and generalized with respect to our previous work. Important applications are those involving biological building blocks, like uracil and thiouracil. In addition, for more general situations the linear regression approach has been proposed and validated.
Cáceres, Carlos F; Koechlin, Florence; Goicochea, Pedro; Sow, Papa-Salif; O'Reilly, Kevin R; Mayer, Kenneth H; Godfrey-Faussett, Peter
2015-01-01
Towards the end of the twentieth century, significant success was achieved in reducing incidence in several global HIV epidemics through ongoing prevention strategies. However, further progress in risk reduction was uncertain. For one thing, it was clear that social vulnerability had to be addressed, through research on interventions addressing health systems and other structural barriers. As soon as antiretroviral treatment became available, researchers started to conceive that antiretrovirals might play a role in decreasing either susceptibility in uninfected people or infectiousness among people living with HIV. In this paper we focus on the origin, present status, and potential contribution of pre-exposure prophylaxis (PrEP) within the combination HIV prevention framework. After a phase of controversy, PrEP efficacy trials took off. By 2015, daily oral PrEP, using tenofovir alone or in combination with emtricitabine, has been proven efficacious, though efficacy seems heavily contingent upon adherence to pill uptake. Initial demonstration projects after release of efficacy results have shown that PrEP can be implemented in real settings and adherence can be high, leading to high effectiveness. Despite its substantial potential, beliefs persist about unfeasibility in real-life settings due to stigma, cost, adherence, and potential risk compensation barriers. The strategic synergy of behavioural change communication, biomedical strategies (including PrEP), and structural programmes is providing the basis for the combination HIV prevention framework. If PrEP is to ever become a key component of that framework, several negative beliefs must be confronted based on emerging evidence; moreover, research gaps regarding PrEP implementation must be filled, and appropriate prioritization strategies must be set up. Those challenges are significant, proportional to the impact that PrEP implementation may have in the global response to HIV.
Updated United Nations Framework Classification for reserves and resources of extractive industries
Ahlbrandt, T.S.; Blaise, J.R.; Blystad, P.; Kelter, D.; Gabrielyants, G.; Heiberg, S.; Martinez, A.; Ross, J.G.; Slavov, S.; Subelj, A.; Young, E.D.
2004-01-01
The United Nations has studied how the oil and gas resource classification developed jointly by the SPE, the World Petroleum Congress (WPC) and the American Association of Petroleum Geologists (AAPG) could be harmonized with the United Nations Framework Classification (UNFC) for Solid Fuel and Mineral Resources (1). The United Nations has continued to build on this and other works, with support from many relevant international organizations, with the objective of updating the UNFC to apply to the extractive industries. The result is the United Nations Framework Classification for Energy and Mineral Resources (2) that this paper will present. Reserves and resources are categorized with respect to three sets of criteria: (i) economic and commercial viability; (ii) field project status and feasibility; and (iii) the level of geologic knowledge. The field project status criteria are readily recognized as the ones highlighted in the SPE/WPC/AAPG classification system of 2000. The geologic criteria absorb the rich traditions that form the primary basis for the Russian classification system, and the ones used to delimit, in part, proved reserves. Economic and commercial criteria facilitate the use of the classification in general, and reflect the commercial considerations used to delimit proved reserves in particular. The classification system will help to develop a common understanding of reserves and resources for all the extractive industries and will assist: (i) international and national resources management to secure supplies; (ii) industries' management of business processes to achieve efficiency in exploration and production; and (iii) an appropriate basis for documenting the value of reserves and resources in financial statements.
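The three-axis categorization maps naturally onto a small data structure; a hypothetical sketch (the E/F/G axis labels follow UNFC convention, but the numeric codes here are purely illustrative):

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class UNFCCategory:
        economic: int   # economic and commercial viability (E axis)
        project: int    # field project status and feasibility (F axis)
        geologic: int   # level of geologic knowledge (G axis)

        def code(self) -> str:
            # A resource class is a point on the three axes
            return f"E{self.economic};F{self.project};G{self.geologic}"

    print(UNFCCategory(economic=1, project=2, geologic=3).code())  # E1;F2;G3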
Ab Initio Density Fitting: Accuracy Assessment of Auxiliary Basis Sets from Cholesky Decompositions.
Boström, Jonas; Aquilante, Francesco; Pedersen, Thomas Bondo; Lindh, Roland
2009-06-09
The accuracy of auxiliary basis sets derived by Cholesky decompositions of the electron repulsion integrals is assessed in a series of benchmarks on total ground state energies and dipole moments of a large test set of molecules. The test set includes molecules composed of atoms from the first three rows of the periodic table as well as transition metals. The accuracy of the auxiliary basis sets are tested for the 6-31G**, correlation consistent, and atomic natural orbital basis sets at the Hartree-Fock, density functional theory, and second-order Møller-Plesset levels of theory. By decreasing the decomposition threshold, a hierarchy of auxiliary basis sets is obtained with accuracies ranging from that of standard auxiliary basis sets to that of conventional integral treatments.
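The numerical core of this construction can be illustrated with an incomplete pivoted Cholesky decomposition of a positive semidefinite matrix: tightening the threshold adds columns and yields the hierarchy of increasingly accurate auxiliary sets described above. A toy sketch (a small SPD matrix stands in for the electron repulsion integral matrix):

    import numpy as np

    def pivoted_cholesky(M, tol=1e-6):
        # Incomplete Cholesky with diagonal pivoting: M ~= L @ L.T,
        # where the number of columns of L grows as tol is tightened.
        M = M.astype(float).copy()
        cols = []
        for _ in range(M.shape[0]):
            d = np.diag(M)
            p = int(np.argmax(d))
            if d[p] < tol:
                break
            col = M[:, p] / np.sqrt(d[p])
            cols.append(col)
            M = M - np.outer(col, col)
        return np.array(cols).T

    A = np.array([[4.0, 2.0, 0.6], [2.0, 5.0, 1.0], [0.6, 1.0, 3.0]])
    L = pivoted_cholesky(A, tol=1e-10)
    print(np.allclose(L @ L.T, A))  # True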
Evaluating Health Information Systems Using Ontologies.
Eivazzadeh, Shahryar; Anderberg, Peter; Larsson, Tobias C; Fricker, Samuel A; Berglund, Johan
2016-06-16
There are several frameworks that attempt to address the challenges of evaluation of health information systems by offering models, methods, and guidelines about what to evaluate, how to evaluate, and how to report the evaluation results. Model-based evaluation frameworks usually suggest universally applicable evaluation aspects but do not consider case-specific aspects. On the other hand, evaluation frameworks that are case specific, by eliciting user requirements, limit their output to the evaluation aspects suggested by the users in the early phases of system development. In addition, these case-specific approaches extract different sets of evaluation aspects from each case, making it challenging to collectively compare, unify, or aggregate the evaluation of a set of heterogeneous health information systems. The aim of this paper is to find a method capable of suggesting evaluation aspects for a set of one or more health information systems-whether similar or heterogeneous-by organizing, unifying, and aggregating the quality attributes extracted from those systems and from an external evaluation framework. On the basis of the available literature in semantic networks and ontologies, a method (called Unified eValuation using Ontology; UVON) was developed that can organize, unify, and aggregate the quality attributes of several health information systems into a tree-style ontology structure. The method was extended to integrate its generated ontology with the evaluation aspects suggested by model-based evaluation frameworks. An approach was developed to extract evaluation aspects from the ontology that also considers evaluation case practicalities such as the maximum number of evaluation aspects to be measured or their required degree of specificity. The method was applied and tested in Future Internet Social and Technological Alignment Research (FI-STAR), a project of 7 cloud-based eHealth applications that were developed and deployed across European Union countries. The relevance of the evaluation aspects created by the UVON method for the FI-STAR project was validated by the corresponding stakeholders of each case. These evaluation aspects were extracted from a UVON-generated ontology structure that reflects both the internally declared required quality attributes in the 7 eHealth applications of the FI-STAR project and the evaluation aspects recommended by the Model for ASsessment of Telemedicine applications (MAST) evaluation framework. The extracted evaluation aspects were used to create questionnaires (for the corresponding patients and health professionals) to evaluate each individual case and the whole of the FI-STAR project. The UVON method can provide a relevant set of evaluation aspects for a heterogeneous set of health information systems by organizing, unifying, and aggregating the quality attributes through ontological structures. Those quality attributes can be either suggested by evaluation models or elicited from the stakeholders of those systems in the form of system requirements. The method continues to be systematic, context sensitive, and relevant across a heterogeneous set of health information systems.
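A toy sketch of the unify-and-aggregate step (structure and attribute names hypothetical): per-system quality attributes are merged into a single tree, and each node records which systems contributed it.

    def merge_attributes(systems):
        # systems: {system_name: ["top/sub", ...]} attribute paths
        tree = {}
        for name, paths in systems.items():
            for path in paths:
                children = tree
                for part in path.split("/"):
                    node = children.setdefault(
                        part, {"systems": set(), "children": {}})
                    node["systems"].add(name)
                    children = node["children"]
        return tree

    systems = {
        "app1": ["usability/learnability", "security/privacy"],
        "app2": ["usability/efficiency", "security/privacy"],
    }
    tree = merge_attributes(systems)
    print(sorted(tree["usability"]["systems"]))  # ['app1', 'app2']

Evaluation aspects shared by many systems sit high in the tree, while case-specific ones appear as leaves, which is what permits aggregation across heterogeneous systems.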
Struijs, Jeroen N; Drewes, Hanneke W; Heijink, Richard; Baan, Caroline A
2015-04-01
Many countries face the persistent twin challenge of providing high-quality care while keeping health systems affordable and accessible. As a result, interest in more efficient strategies to stimulate population health is increasing. A possibly successful strategy is population management (PM). PM strives to address health needs for the population at risk and the chronically ill at all points along the health continuum by integrating services across health care, prevention, social care and welfare. The Care Continuum Alliance (CCA) population health guide (the CCA recently changed its name to the Population Health Alliance, PHA) provides a useful instrument for implementing and evaluating such innovative approaches. This framework was developed specifically for PM and describes the core elements of the PM concept on the basis of six successive, interrelated steps. The aim of this article is to transform the CCA framework into an analytical framework. Quantitative methods are refined, and a set of indicators is operationalized to measure the impact of PM in terms of the Triple Aim (population health, quality of care and cost per capita). Additionally, we added a qualitative part to gain insight into the implementation process of PM. This resulted in a broadly applicable analytical framework based on a mixed-methods approach. In the coming years, the analytical framework will be applied within the Dutch Monitor Population Management to derive transferable 'lessons learned' and to methodologically underpin the concept of PM. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Oberhofer, Harald; Blumberger, Jochen
2010-12-01
We present a plane wave basis set implementation for the calculation of electronic coupling matrix elements of electron transfer reactions within the framework of constrained density functional theory (CDFT). Following the work of Wu and Van Voorhis [J. Chem. Phys. 125, 164105 (2006)], the diabatic wavefunctions are approximated by the Kohn-Sham determinants obtained from CDFT calculations, and the coupling matrix element calculated by an efficient integration scheme. Our results for intermolecular electron transfer in small systems agree very well with high-level ab initio calculations based on generalized Mulliken-Hush theory, and with previous local basis set CDFT calculations. The effect of thermal fluctuations on the coupling matrix element is demonstrated for intramolecular electron transfer in the tetrathiafulvalene-diquinone (Q-TTF-Q-) anion. Sampling the electronic coupling along density functional based molecular dynamics trajectories, we find that thermal fluctuations, in particular the slow bending motion of the molecule, can lead to changes in the instantaneous electron transfer rate by more than an order of magnitude. The thermal average, $\langle |H_{ab}|^2 \rangle^{1/2} = 6.7$ mH, is significantly higher than the value obtained for the minimum energy structure, $|H_{ab}| = 3.8$ mH. While CDFT in combination with generalized gradient approximation (GGA) functionals describes the intermolecular electron transfer in the studied systems well, exact exchange is required for Q-TTF-Q- in order to obtain coupling matrix elements in agreement with experiment (3.9 mH). The implementation presented opens up the possibility to compute electronic coupling matrix elements for extended systems where donor, acceptor, and the environment are treated at the quantum mechanical (QM) level.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rossi, Tuomas P., E-mail: tuomas.rossi@alumni.aalto.fi; Sakko, Arto; Puska, Martti J.
We present an approach for generating local numerical basis sets of improving accuracy for first-principles nanoplasmonics simulations within time-dependent density functional theory. The method is demonstrated for copper, silver, and gold nanoparticles that are of experimental interest but computationally demanding due to the semi-core d-electrons that affect their plasmonic response. The basis sets are constructed by augmenting numerical atomic orbital basis sets by truncated Gaussian-type orbitals generated by the completeness-optimization scheme, which is applied to the photoabsorption spectra of homoatomic metal atom dimers. We obtain basis sets of improving accuracy up to the complete basis set limit and demonstrate that the performance of the basis sets transfers to simulations of larger nanoparticles and nanoalloys as well as to calculations with various exchange-correlation functionals. This work promotes the use of the local basis set approach of controllable accuracy in first-principles nanoplasmonics simulations and beyond.
NASA Astrophysics Data System (ADS)
Neuville, R.; Pouliot, J.; Poux, F.; Hallot, P.; De Rudder, L.; Billen, R.
2017-10-01
This paper deals with the establishment of a comprehensive methodological framework that defines 3D visualisation rules and its application in a decision support tool. Whilst the use of 3D models grows in many application fields, their visualisation remains challenging from the point of view of the mapping and rendering aspects to be applied to suitably support the decision-making process. Indeed, there exist a great number of 3D visualisation techniques, but as far as we know, a decision support tool that facilitates the production of an efficient 3D visualisation is still missing. This is why a comprehensive methodological framework is proposed in order to build decision tables for specific data, tasks and contexts. Based on the second-order logic formalism, we define a set of functions and propositions among and between two collections of entities: on one hand, static retinal variables (hue, size, shape…) and 3D environment parameters (directional lighting, shadow, haze…), and on the other hand, their effect(s) regarding specific visual tasks. This enables us to define 3D visualisation rules according to four categories: consequence, compatibility, potential incompatibility and incompatibility. In this paper, the application of the methodological framework is demonstrated for an urban visualisation at high density considering a specific set of entities. On the basis of our analysis and the results of many studies conducted in 3D semiotics, which refers to the study of symbols and how they relay information, the truth values of the propositions are determined. 3D visualisation rules are then extracted for the considered context and set of entities and are presented in a decision table with a colour coding. Finally, the decision table is implemented in a plugin developed with three.js, a cross-browser JavaScript library. The plugin consists of a sidebar and warning windows that help the designer in the use of a set of static retinal variables and 3D environment parameters.
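A hypothetical sketch of such a decision table (all entries invented for illustration; the real table is derived from the truth values of the propositions):

    # Rule categories from the framework
    CONSEQUENCE, COMPATIBLE, POTENTIAL_INCOMPAT, INCOMPAT = range(4)

    # (retinal variable, 3D environment parameter) -> rule category
    decision_table = {
        ("hue", "directional_lighting"): POTENTIAL_INCOMPAT,
        ("size", "haze"): INCOMPAT,
        ("shape", "shadow"): COMPATIBLE,
    }

    def check(pairs):
        # Warn about combinations the table flags as problematic
        for pair in pairs:
            cat = decision_table.get(pair, COMPATIBLE)
            if cat >= POTENTIAL_INCOMPAT:
                print(f"warning: {pair[0]} with {pair[1]} (category {cat})")

    check([("hue", "directional_lighting"), ("size", "haze")])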
Estimation and Application of Ecological Memory Functions in Time and Space
NASA Astrophysics Data System (ADS)
Itter, M.; Finley, A. O.; Dawson, A.
2017-12-01
A common goal in quantitative ecology is the estimation or prediction of ecological processes as a function of explanatory variables (or covariates). Frequently, the ecological process of interest and associated covariates vary in time, space, or both. Theory indicates many ecological processes exhibit memory to local, past conditions. Despite such theoretical understanding, few methods exist to integrate observations from the recent past or within a local neighborhood as drivers of these processes. We build upon recent methodological advances in ecology and spatial statistics to develop a Bayesian hierarchical framework to estimate so-called ecological memory functions; that is, weight-generating functions that specify the relative importance of local, past covariate observations to ecological processes. Memory functions are estimated using a set of basis functions in time and/or space, allowing for flexible ecological memory based on a reduced set of parameters. Ecological memory functions are entirely data driven under the Bayesian hierarchical framework—no a priori assumptions are made regarding functional forms. Memory function uncertainty follows directly from posterior distributions for model parameters allowing for tractable propagation of error to predictions of ecological processes. We apply the model framework to simulated spatio-temporal datasets generated using memory functions of varying complexity. The framework is also applied to estimate the ecological memory of annual boreal forest growth to local, past water availability. Consistent with ecological understanding of boreal forest growth dynamics, memory to past water availability peaks in the year previous to growth and slowly decays to zero in five to eight years. The Bayesian hierarchical framework has applicability to a broad range of ecosystems and processes allowing for increased understanding of ecosystem responses to local and past conditions and improved prediction of ecological processes.
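A minimal sketch of such a weight-generating memory function (basis choice, coefficients, and scales hypothetical): the weights over past years are a positive combination of a few temporal basis functions, normalized to sum to one.

    import numpy as np

    def memory_weights(lags, coeffs, scales):
        # Gaussian basis functions in lag, combined and normalized
        basis = np.exp(-0.5 * (lags[:, None] / np.asarray(scales)) ** 2)
        w = basis @ np.asarray(coeffs)
        return w / w.sum()

    lags = np.arange(9.0)  # years before the growth year
    w = memory_weights(lags, coeffs=[0.2, 1.0], scales=[1.0, 3.0])
    print(np.round(w, 3))  # weights decay toward zero over 5-8 years

In the Bayesian hierarchical setting, the basis coefficients are parameters with posterior distributions, so uncertainty in the memory function propagates directly to predictions of the ecological process.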
Using the living laboratory framework as a basis for understanding next-generation analyst work
NASA Astrophysics Data System (ADS)
McNeese, Michael D.; Mancuso, Vincent; McNeese, Nathan; Endsley, Tristan; Forster, Pete
2013-05-01
The preparation of next generation analyst work requires alternative levels of understanding and new methodological departures from the way current work transpires. Current work practices typically do not provide a comprehensive approach that emphasizes the role of, and interplay between, (a) cognition, (b) emergent activities in a shared situated context, and (c) collaborative teamwork. In turn, effective and efficient problem solving fails to take place, and practice is often composed of piecemeal, techno-centric tools that isolate analysts by providing rigid, limited levels of understanding of situation awareness. This, coupled with the fact that many analyst activities are classified, produces a challenging situation for researching such phenomena and for designing and evaluating systems to support analyst cognition and teamwork. Through our work with cyber, image, and intelligence analysts we have realized that more is required of researchers to study human-centered designs to provide for analysts' needs in a timely fashion. This paper identifies and describes how The Living Laboratory Framework can be utilized as a means to develop a comprehensive, human-centric, and problem-focused approach to next generation analyst work, design, and training. We explain how the framework is utilized for specific cases in various applied settings (e.g., crisis management analysis, image analysis, and cyber analysis) to demonstrate its value and power in addressing an area of utmost importance to our national security. Attributes of analyst work settings are delineated to suggest potential design affordances that could help improve cognitive activities and awareness. Finally, the paper puts forth a research agenda for the use of the framework for future work that will move the analyst profession in a viable manner to address the concerns identified.
An open-source framework for testing tracking devices using Lego Mindstorms
NASA Astrophysics Data System (ADS)
Jomier, Julien; Ibanez, Luis; Enquobahrie, Andinet; Pace, Danielle; Cleary, Kevin
2009-02-01
In this paper, we present an open-source framework for testing tracking devices in surgical navigation applications. At the core of image-guided intervention systems is the tracking interface that handles communication with the tracking device and gathers tracking information. Given that the correctness of tracking information is critical for protecting patient safety and for ensuring the successful execution of an intervention, the tracking software component needs to be thoroughly tested on a regular basis. Furthermore, with the widespread use of extreme programming methodology, which emphasizes continuous and incremental testing of application components, testing design becomes critical. While it is easy to automate most of the testing process, it is often more difficult to test components that require manual intervention, such as a tracking device. Our framework consists of a robotic arm built from a set of Lego Mindstorms and an open-source toolkit written in C++ to control the robot movements and assess the accuracy of the tracking devices. The application program interface (API) is cross-platform and runs on Windows, Linux and MacOS. We applied this framework to the continuous testing of the Image-Guided Surgery Toolkit (IGSTK), an open-source toolkit for image-guided surgery, and have shown that regression testing on tracking devices can be performed at low cost and significantly improve the quality of the software.
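A schematic sketch of the kind of regression test this enables (every API name here is a hypothetical placeholder, not the actual toolkit interface): command the robot through known poses and fail the test when the tracker's reading deviates beyond tolerance.

    def test_tracker_accuracy(robot, tracker, poses, tol_mm=1.5):
        # poses: list of (x, y, z) positions in tracker coordinates
        failures = []
        for pose in poses:
            robot.move_to(pose)                 # hypothetical robot API
            measured = tracker.read_position()  # hypothetical tracker API
            err = sum((m - p) ** 2 for m, p in zip(measured, pose)) ** 0.5
            if err > tol_mm:
                failures.append((pose, round(err, 2)))
        assert not failures, f"errors above {tol_mm} mm: {failures}"

Run nightly, such a test turns tracking accuracy into an automatically checked regression quantity rather than a manual spot check.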
Hermanowski, Tomasz Roman; Drozdowska, Aleksandra Krystyna; Kowalczyk, Marta
2015-01-01
Objectives In this paper, we emphasised that effective management of health plan beneficiaries' access to reimbursed medicines requires a proper institutional set-up. The main objective was to identify and recommend an institutional framework of integrated pharmaceutical care providing effective, safe and equitable access to medicines. Method The institutional framework of drug policy was derived on the basis of publications obtained by systematic reviews. A comparative analysis concerning adaptation of coordinated pharmaceutical care services in the USA, the UK, Poland, Italy, Denmark and Germany was performed. Results While most European Union Member States promote the implementation of selected e-Health tools, like e-Prescribing, these efforts do not necessarily implement an integrated package. There is no single agent who would manage insured patients' access to medicines and health care in a coordinated manner, thereby increasing the efficiency and safety of drug policy. More attention should be paid by European Union Member States as to how to integrate various e-Health tools to enhance benefits to both individuals and societies. One solution could be to implement an integrated "pharmacy benefit management" model, which is well established in the USA and Canada and provides an integrated package of cost-containment methods, implemented within a transparent institutional framework and powered by strong motivation of the agent. PMID:26528099
NASA Astrophysics Data System (ADS)
Witte, Jonathon; Neaton, Jeffrey B.; Head-Gordon, Martin
2017-06-01
With the aim of mitigating the basis set error in density functional theory (DFT) calculations employing local basis sets, we herein develop two empirical corrections for basis set superposition error (BSSE) in the def2-SVPD basis, a basis which—when stripped of BSSE—is capable of providing near-complete-basis DFT results for non-covalent interactions. Specifically, we adapt the existing pairwise geometrical counterpoise (gCP) approach to the def2-SVPD basis, and we develop a beyond-pairwise approach, DFT-C, which we parameterize across a small set of intermolecular interactions. Both gCP and DFT-C are evaluated against the traditional Boys-Bernardi counterpoise correction across a set of 3402 non-covalent binding energies and isomerization energies. We find that the DFT-C method represents a significant improvement over gCP, particularly for non-covalently-interacting molecular clusters. Moreover, DFT-C is transferable among density functionals and can be combined with existing functionals—such as B97M-V—to recover large-basis results at a fraction of the cost.
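For reference, the Boys-Bernardi counterpoise correction used as the benchmark here evaluates every fragment in the full dimer basis:

    $$ \Delta E_\mathrm{int}^\mathrm{CP} = E_{AB}^{AB} - E_{A}^{AB} - E_{B}^{AB}, $$

where the subscript labels the fragment and the superscript the basis in which it is computed; each monomer is evaluated with ghost functions on the partner's atomic sites. gCP and DFT-C aim to approximate this correction at negligible cost.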
Sparse approximation of currents for statistics on curves and surfaces.
Durrleman, Stanley; Pennec, Xavier; Trouvé, Alain; Ayache, Nicholas
2008-01-01
Computing, processing, and visualizing statistics on shapes like curves or surfaces is a real challenge, with many applications ranging from medical image analysis to computational geometry. Modelling such geometrical primitives with currents avoids feature-based approaches as well as point-correspondence methods. This framework has proven powerful for registering brain surfaces and for measuring geometrical invariants. However, while state-of-the-art methods perform pairwise registrations efficiently, new numerical schemes are required to process groupwise statistics, due to the increasing complexity as the size of the database grows. Statistics such as the mean and principal modes of a set of shapes often have a heavy and highly redundant representation. We therefore propose to find an adapted basis on which the mean and principal modes have a sparse decomposition. Besides the computational improvement, this sparse representation offers a way to visualize and interpret statistics on currents. Experiments show the relevance of the approach on 34 sets of 70 sulcal lines and on 50 sets of 10 meshes of deep brain structures.
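The greedy flavor of such sparse decompositions can be illustrated with plain matching pursuit on vectors (the paper works with currents and an adapted basis, which this generic sketch does not reproduce):

    import numpy as np

    def matching_pursuit(signal, dictionary, n_atoms):
        # Repeatedly pick the unit-norm dictionary column most
        # correlated with the residual and subtract its contribution.
        residual = signal.astype(float).copy()
        coeffs = np.zeros(dictionary.shape[1])
        for _ in range(n_atoms):
            corr = dictionary.T @ residual
            k = int(np.argmax(np.abs(corr)))
            coeffs[k] += corr[k]
            residual -= corr[k] * dictionary[:, k]
        return coeffs, residual

    rng = np.random.default_rng(0)
    D = rng.normal(size=(64, 256))
    D /= np.linalg.norm(D, axis=0)         # unit-norm atoms
    x = 2.0 * D[:, 3] + 0.5 * D[:, 100]    # a 2-sparse signal
    c, _ = matching_pursuit(x, D, n_atoms=5)
    print(np.nonzero(np.abs(c) > 0.05)[0])  # indices 3 and 100 dominate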
ERIC Educational Resources Information Center
Ku, Ya-Lie; Kuo, Chien-Lin
2016-01-01
The purpose of this study was to develop a framework of a creative thinking teaching mode for RN-BSN students on the basis of the creative process of clinical nurses in Taiwan. A purposive sample of nurses who had earned creativity awards was recruited from the medical, surgical, maternity, paediatric, community and psychiatric departments in Taiwan. Semi-structured…
NASA Astrophysics Data System (ADS)
Vogiatzis, Konstantinos D.; Mavrandonakis, Antreas; Klopper, Wim; Froudakis, George
2009-08-01
The separation, capture and storage of carbon dioxide from flue gas is an environmental and economic problem of significant importance. Zeolites and activated carbons have been used by industry to reduce emissions of CO2. A new family of materials, the metal-organic frameworks (MOFs), has recently been proposed as an efficient substitute for the abovementioned materials. In particular, materials based on zinc complexes with imidazole-like aromatic compounds, which build frameworks similar to those of zeolites (zeolitic imidazolate frameworks, ZIFs), have the potential for efficient separation of CO2 from CO and CH4 [1]. Weak interactions between carbon dioxide and heterocyclic aromatic compounds are examined with high-accuracy ab initio methods. CO2 has zero dipole moment but a significant quadrupole moment, which enables it to act as a weak acid or weak base, according to its environment. Nitrogen-containing aromatic compounds act as electron donors, while CO2 acts as an electron acceptor. Electrostatic interactions induce a non-permanent dipole moment on CO2, and the complex is stabilized by in-plane hydrogen bonds between the charged oxygens of CO2 and nearby hydrogens of the aromatic molecule. In addition, dispersion forces from the electron correlation contribute to the interaction energy. By using explicitly correlated methods (MP2-F12/aug-cc-pVTZ) [2] and by adding the contribution from triples excitations, calculated with a smaller basis (6-311++G**), we arrive at an approximate CCSD(T) complete basis set result. [3] Extrapolation schemes were used in order to reach the MP2 basis set limit and compare it with the CCSD(T)/CBS result. Those results are in excellent agreement with the explicitly correlated MP2-F12. In addition, our complexes are investigated with DFT methods that calculate the dispersion energy separately (DFT-D) [4] and with modified MP2 methods that scale the spin-pair correlation contributions [5]. DFT-D results are in good agreement with CCSD(T)/CBS results, providing us with a computationally cheap method of high accuracy. The quantification of the interaction is examined by changing the aromaticity of the heterocyclic molecules and by taking into account the electron correlation. [6] The electron density of the nitrogen that binds CO2 gradually decreases upon substituting carbons with nitrogens in pyridine (pyrimidine, pyrazine, triazine), leading to lower binding energies.
Site specific interaction between ZnO nanoparticles and tyrosine: A density functional theory study
NASA Astrophysics Data System (ADS)
Singh, Satvinder; Singh, Janpreet; Singh, Baljinder; Singh, Gurinder; Kaura, Aman; Tripathi, S. K.
2018-05-01
First-principles calculations have been performed on a ZnO/tyrosine atomic complex to study the site-specific interaction of tyrosine with ZnO nanoparticles. The calculated results show that binding through the -COOH group of tyrosine is energetically more favorable than through the -NH2 group, and the interactions show ionic bonding between ZnO and tyrosine. All calculations have been performed within the density functional theory (DFT) framework using a Gaussian basis set approach. Structural and electronic properties of the (ZnO)3/tyrosine complex have been studied, and a most stable ring-type (ZnO)3 atomic cluster has been modeled, analyzed and used for the calculations.
The Political Economy of Longevity: Developing New Forms of Solidarity for Later Life
Phillipson, Chris
2015-01-01
Aging populations now exert influence on all aspects of social life. This article examines changes to the major social and economic institutions linked with old age, over the period from the mid-20th century to the opening decades of the 21st century. These developments are set within the context of the influence of globalization as well as the impact of the 2008 financial crisis, both of which have restructured debates around the longevity revolution. The article examines how the basis for a new framework for accommodating longevity can be built, outlining ways of securing new forms of solidarity in later life. PMID:25678722
Waves in a plane graphene - dielectric waveguide structure
NASA Astrophysics Data System (ADS)
Evseev, Dmitry A.; Eliseeva, Svetlana V.; Sementsov, Dmitry I.
2017-10-01
The propagation features of guided TE modes in a planar structure consisting of alternating dielectric and graphene layers have been investigated on the basis of computer simulations. Within the effective medium approximation, dispersion relations have been obtained for symmetric and antisymmetric waveguide modes, together with the frequency ranges of their existence. The wave field distribution across the structure, the frequency dependences of the propagation constants and of the transverse components of the wave vectors, as well as the group and phase velocities of the waveguide modes have been obtained, and the effect of the graphene fraction of the structure on the waveguide mode behavior has been shown.
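As a rough illustration of the effective-medium step, the following sketch replaces a graphene-dielectric superlattice by a uniaxial effective medium; the Drude-type graphene model and all parameter values are assumptions for illustration, not the authors' settings:

```python
# Graphene modeled as a thin conducting layer (thickness d_g) with an
# intraband Drude surface conductivity; all numbers are illustrative.
import numpy as np

e0, hbar, q = 8.854e-12, 1.055e-34, 1.602e-19

def graphene_eps(omega, mu_c=0.2 * q, gamma=1e12, d_g=0.335e-9):
    """In-plane permittivity of a graphene sheet (intraband Drude term only)."""
    sigma = 1j * q**2 * mu_c / (np.pi * hbar**2 * (omega + 1j * gamma))
    return 1.0 + 1j * sigma / (e0 * omega * d_g)

def effective_eps(omega, eps_d=4.0, d_d=50e-9, d_g=0.335e-9):
    f = d_g / (d_g + d_d)                             # graphene filling fraction
    eps_g = graphene_eps(omega)
    eps_par = f * eps_g + (1 - f) * eps_d             # in-plane component
    eps_perp = 1.0 / (f / eps_g + (1 - f) / eps_d)    # out-of-plane component
    return eps_par, eps_perp

print(effective_eps(2 * np.pi * 10e12))               # at 10 THz
```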
Finite Nuclei in the Quark-Meson Coupling Model.
Stone, J R; Guichon, P A M; Reinhard, P G; Thomas, A W
2016-03-04
We report the first use of the effective quark-meson coupling (QMC) energy density functional (EDF), derived from a quark model of hadron structure, to study a broad range of ground state properties of even-even nuclei across the periodic table in the nonrelativistic Hartree-Fock+BCS framework. The novelty of the QMC model is that nuclear medium effects are treated through modification of the internal structure of the nucleon. The density dependence is microscopically derived and the spin-orbit term arises naturally. The QMC EDF depends on a single set of four adjustable parameters having a clear physics basis. When applied to diverse ground state data, the QMC EDF already produces, in its present simple form, overall agreement with experiment of a quality comparable to a representative Skyrme EDF. There exist, however, multiple Skyrme parameter sets, frequently tailored to describe selected nuclear phenomena. The QMC EDF, with the smaller parameter set derived in this work, is not open to such variation: the chosen set is applied, without adjustment, to both the properties of finite nuclei and nuclear matter.
Quantum Dynamics with Short-Time Trajectories and Minimal Adaptive Basis Sets.
Saller, Maximilian A C; Habershon, Scott
2017-07-11
Methods for solving the time-dependent Schrödinger equation via basis set expansion of the wave function can generally be categorized as having either static (time-independent) or dynamic (time-dependent) basis functions. We have recently introduced an alternative simulation approach which represents a middle road between these two extremes, employing dynamic (classical-like) trajectories to create a static basis set of Gaussian wavepackets in regions of phase-space relevant to future propagation of the wave function [J. Chem. Theory Comput., 11, 8 (2015)]. Here, we propose and test a modification of our methodology which aims to reduce the size of basis sets generated in our original scheme. In particular, we employ short-time classical trajectories to continuously generate new basis functions for short-time quantum propagation of the wave function; to avoid the continued growth of the basis set describing the time-dependent wave function, we employ Matching Pursuit to periodically minimize the number of basis functions required to accurately describe the wave function. Overall, this approach generates a basis set which is adapted to evolution of the wave function while also being as small as possible. In applications to challenging benchmark problems, namely a 4-dimensional model of photoexcited pyrazine and three different double-well tunnelling problems, we find that our new scheme enables accurate wave function propagation with basis sets which are around an order-of-magnitude smaller than our original trajectory-guided basis set methodology, highlighting the benefits of adaptive strategies for wave function propagation.
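The periodic basis-compression step lends itself to a compact sketch. Below is a toy matching pursuit that greedily retains the fewest basis vectors needed to represent a state to a tolerance; random vectors stand in for the Gaussian wavepackets, so this illustrates the idea rather than the authors' algorithm:

```python
import numpy as np

def matching_pursuit(psi, basis, tol=1e-3, max_terms=50):
    """Greedily select columns of `basis` until `psi` is reproduced to `tol`."""
    residual, selected, coeffs = psi.copy(), [], []
    for _ in range(max_terms):
        overlaps = basis.conj().T @ residual
        k = int(np.argmax(np.abs(overlaps)))
        selected.append(k)
        coeffs.append(overlaps[k])
        residual = residual - overlaps[k] * basis[:, k]
        if np.linalg.norm(residual) < tol * np.linalg.norm(psi):
            break
    return selected, coeffs

rng = np.random.default_rng(1)
basis = rng.normal(size=(200, 400))
basis /= np.linalg.norm(basis, axis=0)
psi = basis[:, :10] @ rng.normal(size=10)   # state living on 10 basis functions
idx, c = matching_pursuit(psi, basis)
print(len(idx), "basis functions retained")
```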
Polarization functions for the modified m6-31G basis sets for atoms Ga through Kr.
Mitin, Alexander V
2013-09-05
The 2df polarization functions for the modified m6-31G basis sets of the third-row atoms Ga through Kr (Int J Quantum Chem 2007, 107, 3028; Int J Quantum Chem 2009, 109, 1158) are proposed. The performance of the m6-31G, m6-31G(d,p), and m6-31G(2df,p) basis sets was examined in molecular calculations carried out with the density functional theory (DFT) method using the B3LYP hybrid functional, second-order Møller-Plesset perturbation theory (MP2), and the quadratic configuration interaction method with single and double substitutions, and was compared with that of the known 6-31G basis sets as well as of the similar 641 and 6-311G basis sets with and without polarization functions. The results show that the m6-31G, m6-31G(d,p), and m6-31G(2df,p) basis sets perform better than the known 6-31G, 6-31G(d,p) and 6-31G(2df,p) basis sets. These improvements are mainly achieved through better approximations for the different electrons belonging to different atomic shells in the modified basis sets. The applicability of the modified basis sets in thermochemical calculations is also discussed. © 2013 Wiley Periodicals, Inc.
Cox, Robin S; Danford, Taryn
2014-04-01
Competency models attempt to define what makes expert performers "experts." Successful disaster psychosocial planning, and the institutionalizing of psychosocial response within emergency management, require clearly defined skill sets. This necessitates anticipating both the short- and long-term psychosocial implications of a disaster or health emergency (i.e., a pandemic) by developing effective and sustained working relationships among psychosocial providers, programs, and other planning partners. The following article outlines recommended competencies for psychosocial responders, to enable communities and organizations to prepare for and effectively manage a disaster response. Competency-based models are founded on observable performance or behavioral indicators, attitudes, traits, or personality characteristics related to effective performance in a specific role or job. Based on an analysis of the literature on competency-based frameworks, a competency framework detailing 13 competency domains is proposed. Each domain describes a series of competencies and suggests behavioral indicators for each competency and, where relevant, associated training expectations. These domains are organized under three distinct categories of competencies: general competency domains; disaster psychosocial intervention competency domains; and disaster psychosocial program leadership and coordination competency domains. Competencies do not replace job descriptions, nor should they be confused with performance assessments. What they can do is update and revise job descriptions; orient existing and new employees to their disaster/emergency roles and responsibilities; target training needs; provide the basis for ongoing self-assessment by agencies and individuals as they evaluate their readiness to respond; and provide a job- or role-relevant basis for performance appraisal dimensions or standards and review discussions. Using a modular approach to psychosocial planning, service providers can improve their response capacity by utilizing differences in levels of expertise and training. The competencies outlined in this paper can thus be used to standardize expectations about levels of psychosocial support interventions. In addition, this approach provides an adaptable framework that can be adjusted for various contexts.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mao, Yuezhi; Horn, Paul R.; Mardirossian, Narbe
2016-07-28
Recently developed density functionals have good accuracy for both thermochemistry (TC) and non-covalent interactions (NC) if very large atomic orbital basis sets are used. To approach the basis set limit with potentially lower computational cost, a new self-consistent field (SCF) scheme is presented that employs minimal adaptive basis (MAB) functions. The MAB functions are optimized on each atomic site by minimizing a surrogate function. High accuracy is obtained by applying a perturbative correction (PC) to the MAB calculation, similar to dual basis approaches. Compared to exact SCF results, using this MAB-SCF (PC) approach with the same large target basis set produces <0.15 kcal/mol root-mean-square deviations for most of the tested TC datasets, and <0.1 kcal/mol for most of the NC datasets. The performance of density functionals near the basis set limit can be even better reproduced. With further improvement to its implementation, MAB-SCF (PC) is a promising lower-cost substitute for conventional large-basis calculations as a method to approach the basis set limit of modern density functionals.
Simple and efficient LCAO basis sets for the diffuse states in carbon nanostructures.
Papior, Nick R; Calogero, Gaetano; Brandbyge, Mads
2018-06-27
We present a simple way to describe the lowest unoccupied diffuse states in carbon nanostructures in density functional theory calculations using a minimal LCAO (linear combination of atomic orbitals) basis set. By comparing with plane wave basis calculations, we show how these states can be captured by adding long-range orbitals to the standard LCAO basis sets for the extreme cases of planar sp2 (graphene) and curved carbon (C60). In particular, using long-range Bessel functions as additional basis functions retains a minimal basis size. This provides a smaller and simpler atom-centered basis set compared to the standard pseudo-atomic orbitals (PAOs) with multiple polarization orbitals or to adding non-atom-centered states to the basis.
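For illustration, a long-range radial function of the kind named above can be generated from a spherical Bessel function confined to a cutoff radius; the angular momentum and cutoff below are assumptions, not the parameters used in the paper:

```python
import numpy as np
from scipy.special import spherical_jn

l, r_c = 0, 12.0                # angular momentum and a deliberately long cutoff (bohr)
q = np.pi / r_c                 # first zero of j_0; for l > 0, solve spherical_jn(l, q*r_c) = 0
r = np.linspace(1e-6, r_c, 200)
chi = spherical_jn(l, q * r)    # smooth, long-ranged radial orbital vanishing at r_c
print(q, chi[:3])
```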
A conformal truncation framework for infinite-volume dynamics
Katz, Emanuel; Khandker, Zuhair U.; Walters, Matthew T.
2016-07-28
Here, we present a new framework for studying conformal field theories deformed by one or more relevant operators. The original CFT is described in infinite volume using a basis of states with definite momentum, P, and conformal Casimir, C. The relevant deformation is then considered using lightcone quantization, with the resulting Hamiltonian expressed in terms of this CFT basis. Truncating to states with C ≤ C_max, one can numerically find the resulting spectrum, as well as other dynamical quantities, such as spectral densities of operators. This method requires the introduction of an appropriate regulator, which can be chosen to preserve the conformal structure of the basis. We check this framework in three dimensions for various perturbative deformations of a free scalar CFT, and for the case of a free O(N) CFT deformed by a mass term and a non-perturbative quartic interaction at large N. In all cases, the truncation scheme correctly reproduces known analytic results. We also discuss a general procedure for generating a basis of Casimir eigenstates for a free CFT in any number of dimensions.
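The numerical core of such a truncation can be sketched in a few lines: keep only basis states with Casimir C ≤ C_max, build the deformed Hamiltonian in that subspace, and diagonalize. The matrix elements below are random placeholders standing in for the actual CFT data:

```python
import numpy as np

rng = np.random.default_rng(2)
casimir = np.sort(rng.uniform(0, 40, size=300))   # Casimir eigenvalue per state
h_cft = np.diag(casimir)                          # CFT part, diagonal in this basis
v = rng.normal(size=(300, 300))
v = 0.5 * (v + v.T)                               # relevant deformation, symmetric

def truncated_spectrum(c_max, coupling=0.1):
    keep = casimir <= c_max
    h = h_cft[np.ix_(keep, keep)] + coupling * v[np.ix_(keep, keep)]
    return np.linalg.eigvalsh(h)

for c_max in (10, 20, 40):                        # check convergence in C_max
    print(c_max, truncated_spectrum(c_max)[:3])
```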
How to apply the ICF and ICF core sets for low back pain.
Stier-Jarmer, Marita; Cieza, Alarcos; Borchers, Michael; Stucki, Gerold
2009-01-01
To introduce the International Classification of Functioning, Disability and Health (ICF) as a conceptual model and classification, and the ICF Core Sets as a way to specify functioning for a specific health condition such as low back pain, and to illustrate the application of the ICF and ICF Core Sets in the context of clinical practice, the planning and reporting of studies, and the comparison of health status measures. A decision-making and consensus process was performed to develop the ICF Core Sets for Low Back Pain, the linking procedure was applied as the basis for the content comparison of health status measures, and the Rehab-Cycle was used to exemplify the application of the ICF and ICF Core Sets in clinical practice. Two different ICF Core Sets, a comprehensive and a brief version, are presented; three different health status measures were linked to the ICF and compared; and a case example of a patient with low back pain was described based on the Rehab-Cycle. The ICF is a promising new framework and classification for assessing the impact of low back pain. The ICF and practical tools such as the ICF Core Sets for Low Back Pain are useful for clinical practice, outcome and rehabilitation research, education, health statistics, and regulation.
NASA Astrophysics Data System (ADS)
Gembong, S.; Suwarsono, S. T.; Prabowo
2018-03-01
Schema in the current study refers to a set of action, process, object and other schemas already possessed that build an individual's way of thinking to solve a given problem. The current study investigates the schemas built by elementary school students in solving problems involving the addition of fractions. The schema building was analysed qualitatively on the basis of the analytical framework of the APOS theory (Action, Process, Object, and Schema). Findings for students of high and middle ability show the following. In the action stage, students were able to add two fractions by drawing a picture or by a procedural method. In the process stage, they could add two and three fractions. In the object stage, they could explain the steps of adding two fractions and rewrite a fraction as an addition of fractions. In the last stage, schema, they could add fractions by relating them to another schema they already possessed, i.e. the least common multiple. Students of high and middle mathematical ability thus built schemas for the addition of fractions in line with the framework of the APOS theory. Students of low mathematical ability, however, showed that their schemas at each stage did not work properly.
Björck-Åkesson, Eva; Wilder, Jenny; Granlund, Mats; Pless, Mia; Simeonsson, Rune; Adolfsson, Margareta; Almqvist, Lena; Augustine, Lilly; Klang, Nina; Lillvist, Anne
2010-01-01
Early childhood intervention and habilitation services for children with disabilities operate on an interdisciplinary basis. This requires a common language between professionals and a shared framework for intervention goals and intervention implementation. The International Classification of Functioning, Disability and Health (ICF) and its version for children and youth (ICF-CY) may serve as this common framework and language. This overview of studies implemented by our research group is based on three research questions: Does the ICF-CY conceptual model have valid content, and is it logically coherent when investigated empirically? Is the ICF-CY classification useful for documenting child characteristics in services? What difficulties and benefits are related to using the ICF-CY model as a basis for intervention when it is implemented in services? A series of studies undertaken by the CHILD researchers is analysed, based on data sets from published studies or master theses. The results show that the ICF-CY has useful content and is logically coherent at the model level. Professionals find it useful for documenting children's body functions and activities. Guidelines for separating activity and participation are needed. The ICF-CY is a complex classification, and implementing it in services is a long-term project.
Pneumothorax detection in chest radiographs using local and global texture signatures
NASA Astrophysics Data System (ADS)
Geva, Ofer; Zimmerman-Moreno, Gali; Lieberman, Sivan; Konen, Eli; Greenspan, Hayit
2015-03-01
A novel framework for automatic detection of pneumothorax abnormality in chest radiographs is presented. The suggested method is based on a texture analysis approach combined with supervised learning techniques. The proposed framework consists of two main steps: first, a texture analysis process is performed for detection of local abnormalities. Labeled image patches are extracted in the texture analysis procedure, following which the local analysis values are incorporated into a novel global image representation. The global representation, which is designed around the distinctive shape of the lung and the characteristics of typical pneumothorax abnormalities, is used for training and detection of the abnormality at the image level. A supervised learning process was performed on both the local and global data, leading to a trained detection system. The system was tested on a dataset of 108 upright chest radiographs, and several state-of-the-art texture feature sets (Local Binary Patterns, Maximum Response filters) were evaluated. The optimal configuration yielded a sensitivity of 81% with a specificity of 87%. The results of the evaluation are promising, establishing the current framework as a basis for further improvements and extensions.
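A hedged sketch of the local step of such a pipeline, pairing uniform LBP histograms per patch with a supervised classifier; the random arrays stand in for labeled radiograph patches and none of the parameters are taken from the paper:

```python
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.svm import SVC

P, R, BINS = 8, 1.0, 10                    # uniform LBP with P=8 has P+2 = 10 codes

def patch_histogram(patch):
    """Normalized histogram of uniform LBP codes for one image patch."""
    codes = local_binary_pattern(patch, P, R, method="uniform")
    hist, _ = np.histogram(codes, bins=BINS, range=(0, BINS), density=True)
    return hist

rng = np.random.default_rng(3)
patches = (rng.random(size=(200, 32, 32)) * 255).astype(np.uint8)  # stand-in patches
labels = rng.integers(0, 2, size=200)      # 1 = pneumothorax-like texture (toy labels)
features = np.array([patch_histogram(p) for p in patches])

clf = SVC(kernel="rbf").fit(features, labels)
print(clf.predict(features[:5]))
```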
Towards a Lifecycle Information Framework and Technology in Manufacturing
Hedberg, Thomas; Feeney, Allison Barnard; Helu, Moneer; Camelio, Jaime A.
2016-01-01
Industry has been chasing the dream of integrating and linking data across the product lifecycle and enterprises for decades. However, industry has been challenged by the fact that the context in which data is used varies based on the function / role in the product lifecycle that is interacting with the data. Holistically, the data across the product lifecycle must be considered an unstructured data-set because multiple data repositories and domain-specific schema exist in each phase of the lifecycle. This paper explores a concept called the Lifecycle Information Framework and Technology (LIFT). LIFT is a conceptual framework for lifecycle information management and the integration of emerging and existing technologies, which together form the basis of a research agenda for dynamic information modeling in support of digital-data curation and reuse in manufacturing. This paper provides a discussion of the existing technologies and activities that the LIFT concept leverages. Also, the paper describes the motivation for applying such work to the domain of manufacturing. Then, the LIFT concept is discussed in detail, while underlying technologies are further examined and a use case is detailed. Lastly, potential impacts are explored. PMID:28265224
NASA Astrophysics Data System (ADS)
Orellana, Diego A.; Salas, Alberto A.; Solarz, Pablo F.; Medina Ruiz, Luis; Rotger, Viviana I.
2016-04-01
The production of clinical information about each patient is constantly increasing, and this information is created in different formats and at diverse points of care, resulting in fragmented, incomplete, inaccurate and isolated health information. The use of health information technology has been promoted as having a decisive impact on improving the efficiency, cost-effectiveness, quality and safety of medical care delivery. In developing countries, however, the utilization of health information technology is insufficient and lacking in standards, among other problems. In the present work we evaluate the EHRGen framework, based on the openEHR standard, as a means to achieve the generation and availability of patient-centered information. The framework has been evaluated through the tools it provides for end users, that is, without the intervention of computer experts. It makes the openEHR ideas easier to adopt and provides an open-source basis with a set of services, although some limitations in its current state work against interoperability and usability. Despite these limitations with respect to usability and semantic interoperability, EHRGen is, at least regionally, a considerable step toward EHR adoption and interoperability, and should therefore be supported by academic and administrative institutions.
Derivation of a formula for the resonance integral for a nonorthogonal basis set
Yim, Yung-Chang; Eyring, Henry
1981-01-01
In a self-consistent field calculation, a formula for the off-diagonal matrix elements of the core Hamiltonian is derived for a nonorthogonal basis set by a polyatomic approach. A set of parameters is then introduced for the repulsion integral formula of Mataga-Nishimoto to fit the experimental data. The matrix elements computed for the nonorthogonal basis set in the π-electron approximation are transformed to those for an orthogonal basis set by the Löwdin symmetrical orthogonalization. PMID:16593009
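The Löwdin symmetric orthogonalization named above is compact enough to show directly: given a Hamiltonian H and overlap S in a nonorthogonal basis, the orthogonalized matrix is S^(-1/2) H S^(-1/2). The 2x2 numbers below are illustrative only, not the paper's parameters:

```python
import numpy as np

S = np.array([[1.0, 0.4],
              [0.4, 1.0]])              # overlap of two nonorthogonal pi orbitals (toy)
H = np.array([[-11.0, -2.4],
              [-2.4, -11.0]])           # core Hamiltonian matrix (eV), illustrative

w, U = np.linalg.eigh(S)                # S = U diag(w) U^T, w > 0 for a valid overlap
S_inv_half = U @ np.diag(w**-0.5) @ U.T
H_orth = S_inv_half @ H @ S_inv_half    # matrix in the Löwdin-orthogonalized basis
print(H_orth)
```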
Consumables and wastes estimations for the First Lunar Outpost
NASA Technical Reports Server (NTRS)
Theis, Ronald L. A.; Ballin, Mark G.; Evert, Martha F.
1992-01-01
The First Lunar Outpost mission is a design reference mission for the first human return to the moon. This paper describes a set of consumables and waste material estimations made on the basis of the First Lunar Outpost mission scenario developed by the NASA Exploration Programs Office. The study includes the definition of a functional interface framework and a top-level set of consumables and waste materials to be evaluated, the compilation of mass flow information from mission developers supplemented with information from the literature, and the analysis of the resulting mass flow information to gain insight about the possibility of material flow integration between the moon outpost elements. The results of the study of the details of the piloted mission and the habitat are used to identify areas where integration of consumables and wastes across different mission elements could provide possible launch mass savings.
Brown, C; Hofer, T; Johal, A; Thomson, R; Nicholl, J; Franklin, B D; Lilford, R J
2008-06-01
This is the first of a four-part series of articles examining the epistemology of patient safety research. Parts 2 and 3 will describe different study designs and methods of measuring outcomes in the evaluation of patient safety interventions, before Part 4 suggests that "one size does not fit all". Part 1 sets the scene by defining patient safety research as a challenging form of service delivery and organisational research that has to deal (although not exclusively) with some very rare events. It then considers two inter-related ideas: a causal chain that can be used to identify where in an organisation's structure and/or processes an intervention may impact; and the need for preimplementation evaluation of proposed interventions. Finally, the paper outlines the authors' pragmatist ontological stance to patient safety research, which sets the philosophical basis for the remaining three articles.
Study on Mine Emergency Mechanism based on TARP and ICS
NASA Astrophysics Data System (ADS)
Xi, Jian; Wu, Zongzhi
2018-01-01
By analyzing the experience and practice of mine emergency response in China and abroad, especially in the United States and Australia, the normative, risk management and adaptability principles for constructing a mine emergency mechanism based on Trigger Action Response Plans (TARP) and the Incident Command System (ICS) are summarized. A classification method, framework, flow and subjects for TARP and ICS suited to the actual situation of domestic mine emergencies are proposed. A system dynamics model of TARP and ICS is established, with parameters such as the evacuation ratio, response rate, per capita emergency capability and entry rate of rescuers. By simulating the operation of TARP and ICS, the impact of these parameters on the emergency process is analyzed, which can provide a reference and basis for building emergency capacity, formulating emergency plans and setting up action plans during an emergency.
NASA Astrophysics Data System (ADS)
Leopold-Wildburger, Ulrike; Pickl, Stefan
2008-10-01
In our research we use experiments to study human behavior in a simulation environment based on a simple Lotka-Volterra predator-prey ecology. The aim is to study the influence of participants' harvesting strategies, and of certain personality traits derived from [1], on the outcome in terms of sustainability and economic performance. This approach is embedded in a research program that aims to develop and understand interactive resource planning processes. We present the general framework as well as the new decision support system EXPOSIM. The key element is the combination of experimental design, analytical understanding of time-discrete systems (especially Lotka-Volterra systems) and economic performance. In the first part, the general role of laboratory experiments is discussed. The second part summarizes the concept of sustainable development, taken from [18]. As we use Lotka-Volterra systems as the basis for our simulations, a theoretical framework is described afterwards; for such systems it is possible to determine optimal behavior. The empirical setting puts the subjects in the position of a decision-maker: they model the environment in such a way that harvesting can be observed. We suggest an experimental setting which might lead to new insights in an anticipatory sense.
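A minimal time-discrete Lotka-Volterra predator-prey model with a per-round harvest conveys the kind of dynamics the subjects steer; the parameter values and the constant-quota harvesting policy below are assumptions, not the EXPOSIM settings:

```python
def simulate(steps=100, harvest=0.5, x=10.0, y=5.0,
             a=0.6, b=0.05, c=0.4, d=0.02):
    """Time-discrete predator-prey dynamics with a fixed prey harvest per round."""
    history = []
    for _ in range(steps):
        x_next = x + a * x - b * x * y - harvest   # prey: growth, predation, harvest
        y_next = y + d * x * y - c * y             # predator: conversion, mortality
        x, y = max(x_next, 0.0), max(y_next, 0.0)
        history.append((x, y))
        if x == 0.0:                               # stock collapsed: unsustainable policy
            break
    return history

run = simulate(harvest=0.5)
print(len(run), run[-1])   # rounds survived and final stock levels
```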
Ory, Marcia G.; Altpeter, Mary; Belza, Basia; Helduser, Janet; Zhang, Chen; Smith, Matthew Lee
2015-01-01
Dissemination and implementation (D&I) frameworks are increasingly being promoted in public health research. However, less is known about their uptake in the field, especially for diverse sets of programs, and few questionnaires exist to assess the ways that frameworks can be utilized in program planning and evaluation. We present a case study from the United States that describes the implementation of the RE-AIM framework by state aging services providers and public health partners, together with a questionnaire that can be used to assess the utility of such frameworks in practice. An online questionnaire was developed to capture community perspectives on the utility of the RE-AIM framework. It was distributed to project leads in 27 states funded under an evidence-based disease prevention initiative for older adults; 40 key stakeholders responded, representing a 100% state-participation rate among the funded states. Findings suggest that there is perceived utility in using the RE-AIM framework when evaluating large-scale initiatives for older adults. The RE-AIM framework was seen as useful for planning, implementation, and evaluation, with relevance for evaluators, providers, community leaders, and policy makers. Yet uptake was not universal, and some respondents reported difficulties in use, especially in adopting the framework as a whole. This questionnaire can serve as the basis for assessing the ways the RE-AIM framework can be utilized by practitioners in state-wide D&I efforts. Maximal benefit can be derived from examining the assessment of RE-AIM-related knowledge and confidence as part of a continual quality assurance process. We recommend that such an assessment be performed before the implementation of new funding initiatives and throughout their course, to assess RE-AIM uptake and to identify areas for technical assistance. PMID:25964897
Hung, Linda; da Jornada, Felipe H.; Souto-Casares, Jaime; ...
2016-08-15
Here, we present first-principles calculations on the vertical ionization potentials (IPs), electron affinities (EAs), and singlet excitation energies of an aromatic-molecule test set (benzene, thiophene, 1,2,5-thiadiazole, naphthalene, benzothiazole, and tetrathiafulvalene) within the GW and Bethe-Salpeter equation (BSE) formalisms. Our computational framework, which employs a real-space basis for ground-state and a transition-space basis for excited-state calculations, is well suited for high-accuracy calculations on molecules, as we show by comparing against G0W0 calculations within a plane-wave-basis formalism. We then generalize our framework to test variants of the GW approximation that include a local density approximation (LDA)-derived vertex function (ΓLDA) and quasiparticle-self-consistent (QS) iterations. We find that ΓLDA and quasiparticle self-consistency shift IPs and EAs by roughly the same magnitude, but with opposite sign for IPs and the same sign for EAs. G0W0 and QS GWΓLDA are more accurate for IPs, while G0W0ΓLDA and QS GW are best for EAs. For optical excitations, we find that perturbative GW-BSE underestimates the singlet excitation energy, while self-consistent GW-BSE results in good agreement with previous best-estimate values for both valence and Rydberg excitations. Finally, our work suggests that a hybrid approach, in which G0W0 energies are used for occupied orbitals and G0W0ΓLDA for unoccupied orbitals, also yields optical excitation energies in good agreement with experiment but at a smaller computational cost.
NASA Astrophysics Data System (ADS)
Eakins, John P.; Edwards, Jonathan D.; Riley, K. Jonathan; Rosin, Paul L.
2001-01-01
Many different kinds of features have been used as the basis for shape retrieval from image databases. This paper investigates the relative effectiveness of several types of global shape feature, both singly and in combination. The features compared include well-established descriptors such as Fourier coefficients and moment invariants, as well as recently proposed measures of triangularity and ellipticity. Experiments were conducted within the framework of the ARTISAN shape retrieval system, and retrieval effectiveness was assessed on a database of over 10,000 images, using 24 queries and associated ground truth supplied by the UK Patent Office. Our experiments revealed only minor differences in retrieval effectiveness between different measures, suggesting that a wide variety of shape feature combinations can provide adequate discriminating power for effective shape retrieval in multi-component image collections such as trademark registries. Marked differences between measures were observed for some individual queries, suggesting that there could be considerable scope for improving retrieval effectiveness by providing users with an improved framework for searching multi-dimensional feature space.
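One of the global features compared above, the Fourier descriptor, can be sketched directly; the toy contour and normalization choices below are illustrative, not the ARTISAN implementation:

```python
import numpy as np

def fourier_descriptors(contour, n_keep=8):
    """contour: complex array x + iy of ordered boundary points."""
    coeffs = np.fft.fft(contour)
    coeffs[0] = 0.0                      # drop DC term -> translation invariance
    mags = np.abs(coeffs)                # drop phase -> rotation/start-point invariance
    return mags[1:n_keep + 1] / mags[1]  # normalize by first harmonic -> scale invariance

t = np.linspace(0, 2 * np.pi, 64, endpoint=False)
square = np.sign(np.cos(t)) + 1j * np.sign(np.sin(t))  # crude closed contour
print(fourier_descriptors(square))
```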
Zhou, Mingming; Kam, Chester Chun Seng
2018-05-17
In this study, we sought to extend the research on self-determination, future orientation, and personal identity construction by integrating theories of self-determination and future orientation into a conceptual framework for understanding the relations between personal identity and the following individual characteristics: hope, optimism, awareness of self, and perceived choice. 191 university students in China completed hardcopy surveys on an individual basis. Our SEM results revealed that proximal future orientation influenced the mechanisms through which distal psychological traits affected identity construction. Specifically, hope mediated the effects of self-awareness on the participants' personal identity ratings (b = .45, p < .05). Although optimism was related to both awareness of self and perceived choice, it was not significantly related to personal identity. This study suggests an extended framework through which to understand how the interaction between future orientation and self-determination predicts personal identity. The findings have significant implications for interventions in educational settings.
Environmental statistics and optimal regulation
NASA Astrophysics Data System (ADS)
Sivak, David; Thomson, Matt
2015-03-01
The precision with which an organism can detect its environment, and the timescale for and statistics of environmental change, will affect the suitability of different strategies for regulating protein levels in response to environmental inputs. We propose a general framework--here applied to the enzymatic regulation of metabolism in response to changing nutrient concentrations--to predict the optimal regulatory strategy given the statistics of fluctuations in the environment and measurement apparatus, and the costs associated with enzyme production. We find: (i) relative convexity of enzyme expression cost and benefit influences the fitness of thresholding or graded responses; (ii) intermediate levels of measurement uncertainty call for a sophisticated Bayesian decision rule; and (iii) in dynamic contexts, intermediate levels of uncertainty call for retaining memory of the past. Statistical properties of the environment, such as variability and correlation times, set optimal biochemical parameters, such as thresholds and decay rates in signaling pathways. Our framework provides a theoretical basis for interpreting molecular signal processing algorithms and a classification scheme that organizes known regulatory strategies and may help conceptualize heretofore unknown ones.
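The "sophisticated Bayesian decision rule" of point (ii) can be illustrated with a toy posterior-threshold policy; the priors, costs, and noise model below are assumptions for demonstration, not values from the work:

```python
import numpy as np
from scipy.stats import norm

p_high = 0.3                            # prior: environment is nutrient-rich
mu_lo, mu_hi, sigma = 1.0, 3.0, 1.0     # signal statistics seen through a noisy sensor
cost, benefit = 1.0, 5.0                # enzyme production cost; payoff if nutrient present

def express(measurement):
    """Express the enzyme iff expected benefit exceeds cost under the posterior."""
    like_hi = norm.pdf(measurement, mu_hi, sigma) * p_high
    like_lo = norm.pdf(measurement, mu_lo, sigma) * (1 - p_high)
    posterior_high = like_hi / (like_hi + like_lo)
    return posterior_high * benefit > cost

for m in (0.5, 2.0, 3.5):
    print(m, express(m))
```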
U.S. History Framework for the 2010 National Assessment of Educational Progress
ERIC Educational Resources Information Center
National Assessment Governing Board, 2009
2009-01-01
This framework identifies the main ideas, major events, key individuals, and unifying themes of American history as a basis for preparing the 2010 assessment. The framework recognizes that U.S. history includes powerful ideas, common and diverse traditions, economic developments, technological and scientific innovations, philosophical debates,…
ERIC Educational Resources Information Center
Bowen, Barbara Lynn
This study presents a holistic framework which can be used as a basis for decision-making at various points in the curriculum-instruction development process as described by Johnson in a work published in 1967. The proposed framework has conceptual bases in the work of Thomas S. Kuhn and David P. Ausubel and utilizes the work of several perceptual…
Mackie, Iain D; DiLabio, Gino A
2011-10-07
The first-principles calculation of non-covalent (particularly dispersion) interactions between molecules is a considerable challenge. In this work we studied the binding energies of ten small non-covalently bonded dimers with several combinations of correlation methods (MP2, coupled-cluster singles and doubles (CCSD), and coupled-cluster singles and doubles with perturbative triples (CCSD(T))), correlation-consistent basis sets (aug-cc-pVXZ, X = D, T, Q), two-point complete basis set energy extrapolations, and counterpoise corrections. For this work, complete basis set results were estimated from averaged counterpoise- and non-counterpoise-corrected CCSD(T) binding energies obtained from extrapolations with the aug-cc-pVQZ and aug-cc-pVTZ basis sets. It is demonstrated that, in almost all cases, binding energies converge more rapidly to the basis set limit by averaging the counterpoise- and non-counterpoise-corrected values than by using either method alone. Examination of the effect of basis set size and electron correlation shows that the triples contribution to the CCSD(T) binding energies is fairly constant with basis set size, with a slight underestimation by CCSD(T)/aug-cc-pVDZ compared to the value at the (estimated) complete basis set limit, and that the contributions to the binding energies obtained by MP2 generally overestimate the analogous CCSD(T) contributions. Taking these factors together, we conclude that the binding energies of non-covalently bonded systems can be accurately determined using a composite method that combines CCSD(T)/aug-cc-pVDZ with energy corrections obtained from basis-set-extrapolated MP2 (utilizing the aug-cc-pVQZ and aug-cc-pVTZ basis sets), provided all of the components are obtained by averaging the counterpoise and non-counterpoise energies. With this approach, binding energies for the set of ten dimers are predicted with a mean absolute deviation of 0.02 kcal/mol, a maximum absolute deviation of 0.05 kcal/mol, and a mean percent absolute deviation of only 1.7%, relative to the (estimated) complete basis set CCSD(T) results. Application of this composite approach to an additional set of eight dimers gave binding energies to within 1% of previously published high-level data. It is also shown that the binding within parallel and parallel-crossed conformations of the naphthalene dimer is predicted by the composite approach to be 9% greater than previously reported in the literature. The ability of some recently developed dispersion-corrected density functional theory methods to predict the binding energies of the set of ten small dimers was also examined. © 2011 American Institute of Physics
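Written out schematically, the composite recipe is short; every input energy below is a hypothetical placeholder, with cp/nocp denoting counterpoise-corrected and uncorrected binding energies:

```python
def avg(cp, nocp):
    """Average the counterpoise and non-counterpoise binding energies."""
    return 0.5 * (cp + nocp)

def mp2_cbs(e_tz, e_qz):
    """Two-point X**-3 extrapolation from aug-cc-pVTZ (X=3) and aug-cc-pVQZ (X=4)."""
    return (4**3 * e_qz - 3**3 * e_tz) / (4**3 - 3**3)

# averaged CP/non-CP binding energies (illustrative numbers, kcal/mol)
ccsdt_adz = avg(-2.80, -3.10)
mp2_adz   = avg(-2.95, -3.25)
mp2_atz   = avg(-3.05, -3.20)
mp2_aqz   = avg(-3.08, -3.16)

# composite: CCSD(T)/aDZ plus an MP2 basis-set correction to the CBS limit
e_bind = ccsdt_adz + (mp2_cbs(mp2_atz, mp2_aqz) - mp2_adz)
print(e_bind)
```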
Seismic risk management of non-engineered buildings
NASA Astrophysics Data System (ADS)
Winar, Setya
Earthquakes have long been feared as one of nature's most terrifying and devastating events. Although seismic codes exist in countries with high seismic risk to save lives and reduce human suffering, earthquakes still cause tragic events with high death tolls, particularly due to the collapse of widespread non-engineered buildings without seismic resistance in developing countries such as Indonesia. The implementation of seismic codes in non-engineered construction is the key to ensuring earthquake safety. In practice, such implementation is not simple, because it involves cross-disciplinary and cross-sectoral linkages at different levels of understanding, commitment, and skill. This suggests that a widely agreed framework can help to harmonise the various perspectives. Hence, this research is aimed at developing an integrated framework for guiding and monitoring seismic risk reduction of non-engineered buildings in Indonesia via a risk management method.

The proposed framework draws heavily on the wider literature, on three existing frameworks from around the world, and on the contributions of the various stakeholders who participated in the study. A postal questionnaire survey, selected interviews, and a workshop event constituted the primary data collection methods. To make the framework robust, two further workshop events, conducted in Yogyakarta City and Bengkulu City in Indonesia, were held to check practicality and validity and to identify any improvement requirements. The data collected were analysed with the assistance of the SPSS and NVivo software programmes.

This research found that the framework comprises 63 pairs of characteristic-indicators complemented by (a) three important factors for effective seismic risk management of non-engineered buildings, (b) three guiding principles for sustainable dissemination to grass-roots communities, and (c) a map of agents of change. Among the 63 pairs, there are 19 technical and 44 non-technical interventions. These findings contribute to wider knowledge in the domain of seismic risk management of non-engineered buildings, in that they: (a) provide a basis for effective political advocacy, (b) reflect the multidimensional and interdisciplinary nature of seismic risk reduction, (c) assist a wide range of users in determining roles, responsibilities, and accountabilities, and (d) provide a basis for setting goals and targets.
Cysewski, Piotr; Jeliński, Tomasz
2013-10-01
The electronic spectra of four different anthraquinones (1,2-dihydroxyanthraquinone, 1-aminoanthraquinone, 2-aminoanthraquinone and 1-amino-2-methylanthraquinone) in methanol solution were measured and used as reference data for theoretical color prediction. The visible part of the spectrum was modeled within the TD-DFT framework with a broad range of DFT functionals. The convoluted theoretical spectra were validated against experimental data by direct color comparison in terms of the CIE XYZ and CIE Lab tristimulus color models. It was found that the 6-31G** basis set provides the most accurate color prediction, and that extending the basis set does not improve the predicted color. Although different functionals gave the most accurate color prediction for different anthraquinones, it is possible to apply the same DFT approach to the whole set of analyzed dyes; three functionals in particular seem valuable, namely mPW1LYP, B1LYP and PBE0, owing to their very similar spectral predictions. The major source of discrepancy between theoretical and experimental spectra comes from the L values, representing lightness, and the a parameter, representing the position on the green→magenta axis. Fortunately, the agreement between the computed and observed blue→yellow axis (parameter b) is very precise for the studied anthraquinone dyes in methanol solution. Despite these shortcomings, color prediction from first-principles quantum chemistry computations can give quite satisfactory results, expressed in terms of color space parameters.
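The spectrum-to-color step can be sketched as follows; the Gaussian color-matching-function fits and the synthetic absorption band are crude stand-ins for the CIE 1931 tables and the measured spectra, so this illustrates the workflow rather than reproducing the paper's results:

```python
import numpy as np

lam = np.arange(380.0, 781.0, 5.0)                      # wavelength grid, nm
g = lambda l, mu, s: np.exp(-0.5 * ((l - mu) / s) ** 2)
xbar = 1.06 * g(lam, 598, 38) + 0.36 * g(lam, 446, 21)  # rough CIE 1931 fits
ybar = 1.01 * g(lam, 556, 46)
zbar = 1.78 * g(lam, 449, 22)

absorbance = 0.8 * g(lam, 480, 40)          # hypothetical dye absorption band
transmittance = 10.0 ** (-absorbance)

dlam = 5.0
def tristim(spec):
    """Integrate a spectrum against the color-matching functions."""
    return tuple(np.sum(spec * cmf) * dlam for cmf in (xbar, ybar, zbar))

X, Y, Z = tristim(transmittance)
Xn, Yn, Zn = tristim(np.ones_like(lam))     # flat (equal-energy) illuminant white point

def f(t, d=6 / 29):
    return np.cbrt(t) if t > d**3 else t / (3 * d**2) + 4 / 29

L = 116 * f(Y / Yn) - 16                    # CIE Lab conversion
a = 500 * (f(X / Xn) - f(Y / Yn))
b = 200 * (f(Y / Yn) - f(Z / Zn))
print(L, a, b)
```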
Dixit, Anant; Claudot, Julien; Lebègue, Sébastien; Rocca, Dario
2017-06-07
By using a formulation based on the dynamical polarizability, we propose a novel implementation of second-order Møller-Plesset perturbation (MP2) theory within a plane wave (PW) basis set. Because of the intrinsic properties of PWs, this method is not affected by basis set superposition errors. Additionally, results are converged without relying on complete basis set extrapolation techniques; this is achieved by using the eigenvectors of the static polarizability as an auxiliary basis set to compactly and accurately represent the response functions involved in the MP2 equations. Summations over the large number of virtual states are avoided by using a formalism inspired by density functional perturbation theory, and the Lanczos algorithm is used to include dynamical effects. To demonstrate this method, applications to three weakly interacting dimers are presented.
Nursing Routine Data as a Basis for Association Analysis in the Domain of Nursing Knowledge
Sellemann, Björn; Stausberg, Jürgen; Hübner, Ursula
2012-01-01
This paper describes the data mining method of association analysis within the framework of Knowledge Discovery in Databases (KDD), with the aim of identifying standard patterns of nursing care. The approach is application-oriented and is applied to nursing routine data recorded with the LEP Nursing 2 method. The increasing use of information technology in hospitals, especially of nursing information systems, requires the storage of large data sets, which hitherto have not always been analyzed adequately. Three association analyses, for the days of admission, surgery and discharge, have been performed. The results of almost 1.5 million generated association rules indicate that it is valid to apply association analysis to nursing routine data. All rules are semantically trivial, since they reflect existing knowledge from the domain of nursing; this may be due either to the LEP Nursing 2 method or to the nursing activities themselves. Nonetheless, association analysis may in future become a useful analytical tool on the basis of structured nursing routine data. PMID:24199122
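A toy association analysis in the same KDD spirit, treating each patient-day as a "transaction" of documented nursing activities; the activity names and thresholds are illustrative, not LEP data (the mlxtend library is used here for brevity):

```python
import pandas as pd
from mlxtend.preprocessing import TransactionEncoder
from mlxtend.frequent_patterns import apriori, association_rules

days = [
    ["admission_interview", "vital_signs", "mobilisation"],
    ["vital_signs", "mobilisation", "wound_care"],
    ["admission_interview", "vital_signs"],
    ["vital_signs", "wound_care", "discharge_planning"],
    ["vital_signs", "mobilisation"],
]
onehot = TransactionEncoder()
df = pd.DataFrame(onehot.fit(days).transform(days), columns=onehot.columns_)

frequent = apriori(df, min_support=0.4, use_colnames=True)   # frequent activity sets
rules = association_rules(frequent, metric="confidence", min_threshold=0.8)
print(rules[["antecedents", "consequents", "support", "confidence"]])
```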
Jutterström, S; Andersson, H C; Omstedt, A; Malmaeus, J M
2014-09-15
The paper discusses the combined effects of ocean acidification, eutrophication and climate change on the Baltic Sea and the implications for current management strategies. The scientific basis is built on results gathered in the BONUS+ projects Baltic-C and ECOSUPPORT. Model results indicate that the Baltic Sea is likely to be warmer, more hypoxic and more acidic in the future. At present, management strategies do not take into account temporal trends or potential ecosystem change due to warming and/or acidification, and therefore fulfilling the obligations specified within the Marine Strategy Framework Directive, the OSPAR and HELCOM conventions and national environmental objectives may become significantly more difficult. The paper aims to provide a basis for a discussion on the effectiveness of current policy instruments and possible strategies for setting practical environmental objectives in a changing climate and with multiple stressors. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Maschio, Lorenzo; Kirtman, Bernard; Rérat, Michel; Orlando, Roberto; Dovesi, Roberto
2013-10-01
In this work, we validate a new, fully analytical method for calculating Raman intensities of periodic systems, developed and presented in Paper I [L. Maschio, B. Kirtman, M. Rérat, R. Orlando, and R. Dovesi, J. Chem. Phys. 139, 164101 (2013)]. Our validation of this method and its implementation in the CRYSTAL code is done through several internal checks as well as comparison with experiment. The internal checks include consistency of results when increasing the number of periodic directions (from 0D to 1D, 2D, 3D), comparison with numerical differentiation, and a test of the sum rule for derivatives of the polarizability tensor. The choice of basis set as well as the Hamiltonian is also studied. Simulated Raman spectra of α-quartz and of the UiO-66 Metal-Organic Framework are compared with the experimental data.
Towards a global network of gamma-ray detector calibration facilities
NASA Astrophysics Data System (ADS)
Tijs, Marco; Koomans, Ronald; Limburg, Han
2016-09-01
Gamma-ray logging tools are applied worldwide. At various locations, calibration facilities are used to calibrate these gamma-ray logging systems. Several attempts have been made to cross-correlate well-known calibration pits, but this cross-correlation does not include calibration facilities in Europe or private company calibration facilities. Our aim is to set up a framework that makes it possible to interlink all calibration facilities worldwide by using `tools of opportunity' - tools that have been calibrated in different calibration facilities, whether on a coordinated basis or by coincidence. To compare measurements from different tools, it is important to understand the behaviour of the tools in the different calibration pits. Borehole properties, such as diameter, fluid, casing and probe diameter, strongly influence the outcome of gamma-ray borehole logging. Logs need to be properly calibrated and compensated for these borehole properties in order to obtain in-situ grades or to do cross-hole correlation. Some tool providers supply tool-specific correction curves for this purpose; others rely on reference measurements against sources of known radionuclide concentration and geometry. In this article, we present an attempt to set up a framework for transferring `local' calibrations so that they can be applied `globally'. This framework includes corrections for any geometry and detector size to give absolute concentrations of radionuclides from borehole measurements. The model is used to compare measurements in the calibration pits of Grand Junction, located in the USA; Adelaide (previously known as AMDEL), located in Adelaide, Australia; and Stonehenge, located at Medusa Explorations BV in the Netherlands.
NASA Astrophysics Data System (ADS)
Martin, Jan M. L.; Sundermann, Andreas
2001-02-01
We propose large-core correlation-consistent (cc) pseudopotential basis sets for the heavy p-block elements Ga-Kr and In-Xe. The basis sets are of cc-pVTZ and cc-pVQZ quality, and have been optimized for use with the large-core (valence-electrons only) Stuttgart-Dresden-Bonn (SDB) relativistic pseudopotentials. Validation calculations on a variety of third-row and fourth-row diatomics suggest them to be comparable in quality to the all-electron cc-pVTZ and cc-pVQZ basis sets for lighter elements. Especially the SDB-cc-pVQZ basis set in conjunction with a core polarization potential (CPP) yields excellent agreement with experiment for compounds of the later heavy p-block elements. For accurate calculations on Ga (and, to a lesser extent, Ge) compounds, explicit treatment of 13 valence electrons appears to be desirable, while it seems inevitable for In compounds. For Ga and Ge, we propose correlation-consistent basis sets extended for (3d) correlation. For accurate calculations on organometallic complexes of interest in homogeneous catalysis, we recommend a combination of the standard cc-pVTZ basis set for first- and second-row elements, the presently derived SDB-cc-pVTZ basis set for heavier p-block elements, and, for transition metals, the small-core [6s5p3d] Stuttgart-Dresden basis set-relativistic effective core potential combination supplemented by (2f1g) functions with exponents given in the Appendix to the present paper.
From plane waves to local Gaussians for the simulation of correlated periodic systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Booth, George H., E-mail: george.booth@kcl.ac.uk; Tsatsoulis, Theodoros; Grüneis, Andreas, E-mail: a.grueneis@fkf.mpg.de
2016-08-28
We present a simple, robust, and black-box approach to the implementation and use of local, periodic, atom-centered Gaussian basis functions within a plane wave code, in a computationally efficient manner. The procedure outlined is based on the representation of the Gaussians within a finite bandwidth by their underlying plane wave coefficients. The core region is handled within the projector augmented wave framework, by pseudizing the Gaussian functions within a cutoff radius around each nucleus and smoothing the functions so that they are faithfully represented by a plane wave basis with only a moderate kinetic energy cutoff. To mitigate the effects of the basis set superposition error and incompleteness at the mean-field level introduced by the Gaussian basis, we also propose a hybrid approach, whereby the complete occupied space is first converged within a large plane wave basis, and the Gaussian basis is used to construct a complementary virtual space for the application of correlated methods. We demonstrate that these pseudized Gaussians yield compact and systematically improvable spaces with an accuracy comparable to their non-pseudized Gaussian counterparts. A key advantage of the described method is its ability to efficiently capture and describe electronic correlation effects of weakly bound and low-dimensional systems, where plane waves are not sufficiently compact or able to be truncated without unphysical artifacts. We investigate the accuracy of the pseudized Gaussians for the water dimer interaction, neon solid, and water adsorption on a LiH surface, at the level of second-order Møller-Plesset perturbation theory.
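The underlying representation trick is easy to illustrate: an atom-centered Gaussian has an analytic plane-wave expansion, so its coefficients can be laid onto the G-vector grid of a plane-wave code up to a kinetic-energy cutoff. The cell size, exponent, and cutoff below are assumptions for illustration:

```python
import numpy as np

a = 8.0                                   # cubic cell edge (bohr), assumed
ecut = 20.0                               # kinetic cutoff (hartree): |G|^2 / 2 <= ecut
alpha = 0.5                               # Gaussian exponent, assumed
center = np.array([1.0, 2.0, 3.0])        # atom position in the cell

n = np.arange(-12, 13)
grid = np.array(np.meshgrid(n, n, n)).reshape(3, -1).T
G = 2 * np.pi / a * grid                  # reciprocal lattice vectors
G = G[0.5 * (G**2).sum(1) <= ecut]        # keep only G inside the cutoff sphere

# plane-wave coefficients of exp(-alpha * |r - center|^2), up to normalization:
# the Fourier transform of a Gaussian is exp(-|G|^2 / (4*alpha)) times a phase
coeff = np.exp(-(G**2).sum(1) / (4 * alpha)) * np.exp(-1j * G @ center)
print(len(G), coeff[:3])
```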
Hybrid Grid and Basis Set Approach to Quantum Chemistry DMRG
NASA Astrophysics Data System (ADS)
Stoudenmire, Edwin Miles; White, Steven
We present a new approach to using DMRG for quantum chemistry that combines the advantages of a basis set with those of a grid approximation. Because DMRG scales linearly for quasi-one-dimensional systems, it is feasible to approximate the continuum with a fine grid in one direction while using a standard basis set approach for the transverse directions. Compared to standard basis set methods, we reach larger systems and achieve better scaling when approaching the basis set limit. The flexibility and reduced cost of our approach even make it feasible to incorporate advanced DMRG techniques such as simulating real-time dynamics. Supported by the Simons Collaboration on the Many-Electron Problem.
Khvostichenko, Daria; Choi, Andrew; Boulatov, Roman
2008-04-24
We investigated the effect of several computational variables, including the choice of the basis set, application of symmetry constraints, and zero-point energy (ZPE) corrections, on the structural parameters and predicted ground electronic state of model 5-coordinate hemes (iron(II) porphines axially coordinated by a single imidazole or 2-methylimidazole). We studied the performance of B3LYP and B3PW91 with eight Pople-style basis sets (up to 6-311+G*) and of the B97-1, OLYP, and TPSS functionals with the 6-31G and 6-31G* basis sets. Only the hybrid functionals B3LYP, B3PW91, and B97-1 reproduced the quintet ground state of the model hemes. With a given functional, the choice of the basis set caused up to 2.7 kcal/mol variation of the quintet-triplet electronic energy gap (ΔEel), in several cases resulting in an inversion of the sign of ΔEel. Single-point energy calculations with triple-zeta basis sets of the Pople (up to 6-311++G(2d,2p)), Ahlrichs (TZVP and TZVPP), and Dunning (cc-pVTZ) families showed the same trend. The zero-point energy of the quintet state was approximately 1 kcal/mol lower than that of the triplet, and accounting for ZPE corrections was crucial for establishing the ground state if the electronic energy of the triplet state was approximately 1 kcal/mol less than that of the quintet. Within a given model chemistry, the effects of symmetry constraints and of a "tense" structure of the iron porphine fragment coordinated to 2-methylimidazole on ΔEel were limited to 0.3 kcal/mol. For both model hemes the best agreement with crystallographic structural data was achieved with the small 6-31G and 6-31G* basis sets. The deviation of the computed frequency of the Fe-Im stretching mode from the experimental value decreased with the basis set in the order: nonaugmented basis sets, basis sets with polarization functions, and basis sets with polarization and diffuse functions. Contraction of Pople-style basis sets (double-zeta or triple-zeta) affected the results insignificantly for iron(II) porphyrin coordinated with imidazole. Poor performance of a "locally dense" basis set with a large number of basis functions on the Fe center was observed in calculations of quintet-triplet gaps. Our results lead to a series of suggestions for density functional theory calculations of quintet-triplet energy gaps in ferrohemes with a single axial imidazole; these suggestions are potentially applicable to other transition-metal complexes.
Ab Initio Calculations of Spin-Orbit Coupling for Heavy-Metal Containing Radicals
NASA Astrophysics Data System (ADS)
Cheng, Lan
2016-06-01
The perturbative treatment of spin-orbit coupling (SOC) on top of scalar-relativistic calculations is a cost-effective alternative to rigorous fully relativistic calculations. In this work the applicability of the perturbative scheme in the framework of spin-free exact two-component theory is demonstrated with calculations of SO splittings and SOC contributions to molecular properties in small heavy-metal-containing radicals, including AuO, AuS, and ThO^+. Equation-of-motion coupled-cluster techniques have been used to account accurately for electron-correlation effects in these radicals, and basis-set effects are carefully analyzed. The computed results are compared with experimental measurements of SO splittings and dipole moments where available.
Identifying and applying psychological theory to setting and achieving rehabilitation goals.
Scobbie, Lesley; Wyke, Sally; Dixon, Diane
2009-04-01
Goal setting is considered to be a fundamental part of rehabilitation; however, theories of behaviour change relevant to goal-setting practice have not been comprehensively reviewed. (i) To identify and discuss specific theories of behaviour change relevant to goal-setting practice in the rehabilitation setting. (ii) To identify 'candidate' theories that offer the most potential to inform clinical practice. The rehabilitation and self-management literature was systematically searched to identify review papers or empirical studies that proposed a specific theory of behaviour change relevant to setting and/or achieving goals in a clinical context. Data from included papers were extracted under the headings of key constructs, clinical application and empirical support. Twenty-four papers were included in the review, which together proposed a total of five theories: (i) social cognitive theory, (ii) goal setting theory, (iii) the health action process approach, (iv) proactive coping theory, and (v) the self-regulatory model of illness behaviour. The first three of these theories demonstrated the most potential to inform clinical practice, on the basis of their capacity to inform interventions that resulted in improved patient outcomes. Social cognitive theory, goal setting theory and the health action process approach are theories of behaviour change that can inform clinicians in the process of setting and achieving goals in the rehabilitation setting. Overlapping constructs within these theories have been identified and can be applied in clinical practice through the development and evaluation of a goal-setting practice framework.
Localized basis sets for unbound electrons in nanoelectronics.
Soriano, D; Jacob, D; Palacios, J J
2008-02-21
It is shown how unbound electron wave functions can be expanded in suitably chosen localized basis sets for any desired range of energies. In particular, we focus on the use of Gaussian basis sets, commonly used in first-principles codes. The possible usefulness of these basis sets in a first-principles description of field emission or scanning tunneling microscopy at large bias is illustrated by studying a simpler related phenomenon: the lifetime of an electron in a H atom subjected to a strong electric field.
Marshall, Martin; Klazinga, Niek; Leatherman, Sheila; Hardy, Charlie; Bergmann, Eckhard; Pisco, Luis; Mattke, Soeren; Mainz, Jan
2006-09-01
This article describes a project undertaken as part of the Organization for Economic Co-operation and Development (OECD)'s Healthcare Quality Indicator (HCQI) Project, which aimed to develop a set of quality indicators representing the domains of primary care, prevention and health promotion, and which could be used to assess the performance of primary care systems. Existing quality indicators from around the world were mapped to an organizing framework which related primary care, prevention, and health promotion. The indicators were judged against the US Institute of Medicine's assessment criteria of importance and scientific soundness, and only those which met these criteria and were likely to be feasible were included. An initial large set of indicators was reduced by the primary care expert panel using a modified Delphi process. A set of 27 indicators was produced. Six of them were related to health promotion, covering health-related behaviours that are typically targeted by health education and outreach campaigns, 13 to preventive care with a focus on prenatal care and immunizations and eight to primary clinical care mainly addressing activities related to risk reduction. The indicators selected placed a strong emphasis on the public health aspects of primary care. This project represents an important but preliminary step towards a set of measures to evaluate and compare primary care quality. Further work is required to assess the operational feasibility of the indicators and the validity of any benchmarking data drawn from international comparisons. A conceptual framework needs to be developed that comprehensively captures the complex construct of primary care as a basis for the selection of additional indicators.
NASA Astrophysics Data System (ADS)
Smeulders, G. G. B.; Koho, K. A.; de Stigter, H. C.; Mienis, F.; de Haas, H.; van Weering, T. C. E.
2014-01-01
The extent of the cold-water coral mounds in the modern ocean basins has recently been revealed by new state-of-the-art equipment. However, not much is known about their geological extent or development through time. In the facies model presented here, seven different types of seabed substrate are distinguished, which may be used for reconstruction of fossil coral habitats. The studied substrates include: off-mound settings, (foram) sands, hardgrounds, dead coral debris, and substrates characterized by a variable density of living coral framework. Whereas sediment characteristics only provide a basis for distinguishing on- and off-mound habitats and the loci of most prolific coral growth, benthic foraminiferal assemblages are the key to identifying different mound substrates in more detail. Specific foraminiferal assemblages are distinguished that are characteristic of these environments. Assemblages from off-mound settings are dominated by (attached) epifaunal species such as Cibicides refulgens and Cibicides variabilis. The attached epibenthic species Discanomalina coronata is also common in off-mound sediments, but it is most abundant where hardgrounds have formed. In contrast, the settings with coral debris or living corals attract shallow infaunal species that are associated with more fine-grained soft sediments. The typical 'living coral assemblage' is composed of Cassidulina obtusa, Bulimina marginata, and Cassidulina laevigata. The abundance of these species shows an almost linear increase with the density of the living coral cover. The benthic foraminifera encountered from off-mound to top-mound settings appear to represent a gradient of decreasing current intensity and availability of suspended food particles, and increasing availability of organic matter associated with fine-grained sediment trapped between the coral framework.
Dietscher, Christina
2017-02-01
Networks in health promotion (HP) have, after the launch of WHO's Ottawa Charter [World Health Organization (WHO) (eds) (1986) Ottawa Charter on Health Promotion. Towards A New Public Health. World Health Organization, Geneva], become a widespread tool to disseminate HP, especially in conjunction with the settings approach. Despite their allegedly high importance for HP practice and more than two decades of experience with networking so far, a sound theoretical basis to support effective planning, formation, coordination and strategy development for networks in the settings approach of HP (HPSN) is still widely missing. Brößkamp-Stone's multi-faceted interorganizational network assessment framework (2004) provides a starting point but falls short of specifying the outcomes that can reasonably be expected from the specific network type of HPSN, and the specific processes/strategies and structures that are needed to achieve them. Based on outcome models in HP; on social, managerial and health science theories of networks, settings and organizations; a sociological systems theory approach; and the capacity approach in HP, this article points out why existing approaches to studying networks are insufficient for HPSN, what can be understood by their functioning and effectiveness, what preconditions there are for HPSN effectiveness, and how an HPSN functioning and effectiveness framework proposed on these grounds can be used for researching networks in practice, drawing on experiences from the 'Project on an Internationally Comparative Evaluation Study of the International Network of Health Promoting Hospitals and Health Services' (PRICES-HPH), which was coordinated by the WHO Collaborating Centre for Health Promotion in Hospitals and Health Services (Vienna WHO-CC) from 2008 to 2012.
Efficient view based 3-D object retrieval using Hidden Markov Model
NASA Astrophysics Data System (ADS)
Jain, Yogendra Kumar; Singh, Roshan Kumar
2013-12-01
Recent research effort has been dedicated to view-based 3-D object retrieval, because 3-D objects are highly discriminative and admit a multi-view representation. State-of-the-art methods depend heavily on their own camera array settings for capturing views of a 3-D object, and use a complex Zernike descriptor and HAC for representative view selection, which limits their practical application and makes them inefficient for retrieval. Therefore, an efficient and effective algorithm is required for 3-D object retrieval. In order to move toward a general framework for efficient 3-D object retrieval that is independent of the camera array setting and avoids representative view selection, we propose an Efficient View Based 3-D Object Retrieval (EVBOR) method using a Hidden Markov Model (HMM). In this framework, each object is represented by an independent set of views, meaning that views can be captured from any direction without any camera array restriction. Views (including query views) are clustered to generate view clusters, which are then used to build the query model with an HMM. The HMM is used in a twofold manner: in training (HMM estimation) and in retrieval (HMM decoding). The proposed approach removes the static camera array setting for view capture and can be applied to any 3-D object database to retrieve 3-D objects efficiently and effectively. Experimental results demonstrate that the proposed scheme shows better performance than existing methods.
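A hedged sketch of the clustering/estimate/decode pipeline described above, using scikit-learn and hmmlearn; the descriptors, cluster count, and HMM settings are illustrative assumptions, not the authors' implementation:

```python
import numpy as np
from sklearn.cluster import KMeans
from hmmlearn import hmm

rng = np.random.default_rng(0)
views = rng.normal(size=(40, 16))      # toy view descriptors for one object

# View clustering step (query views would be clustered together with these).
clusters = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(views)

# Training ("HMM estimate"): fit an HMM to the object's view sequence.
# For brevity this toy model is fit on raw descriptors rather than clusters.
model = hmm.GaussianHMM(n_components=5, covariance_type="diag", n_iter=50)
model.fit(views)

# Retrieval ("HMM decode"): score a query view sequence against the model;
# candidate objects would be ranked by log-likelihood.
query = rng.normal(size=(8, 16))
print("log P(query | object):", model.score(query))
```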
A framework for air quality monitoring based on free public data and open source tools
NASA Astrophysics Data System (ADS)
Nikolov, Hristo; Borisova, Denitsa
2014-10-01
In recent years, space agencies (e.g. NASA, ESA) have increasingly adopted policies of providing Earth observation (EO) data and end products concerning air quality, especially in large urban areas, at no cost to researchers and SMEs. These EO data are complemented by an increasing amount of in-situ data, also provided at no cost either by national authorities or from crowdsourced origins. This accessibility, together with the increased processing capabilities of free and open source software, is a prerequisite for the creation of a solid framework for air modeling in support of decision making at medium and large scales. An essential part of this framework is a web-based GIS mapping tool responsible for dissemination of the generated output. In this research an attempt is made to establish a running framework based solely on openly accessible air quality data and on a set of freely available software tools for processing and modeling, taking into account the present status quo in Bulgaria. Among the primary sources of data, especially for bigger urban areas, for different types of gases and dust particles, are the National Institute of Meteorology and Hydrology of Bulgaria (NIMH) and the National System for Environmental Monitoring managed by the Bulgarian Executive Environmental Agency (ExEA). Both authorities provide data on the concentrations of several gases, among them CO, CO2, NO2 and SO2, and of fine suspended dust (PM10, PM2.5) on a monthly (for some data, daily) basis. In the proposed framework these data will complement data from satellite-based sensors such as the OMI instrument aboard the EOS-Aura satellite and the TROPOMI instrument payload of the future ESA Sentinel-5P mission. An integral part of the framework is the modern land use/land cover map provided by the EEA through the GIO Land CORINE initiative; this map is also a product of EO data distributed at the European level. First and above all, our effort is focused on providing the wider public living in urbanized areas with one reliable source of information on present air quality conditions. This information might also serve as an indicator of acid rain in agricultural areas close to industrial or electricity plants, and its availability on a regular basis makes it a valuable source in cases of man-made industrial disasters or incidents such as forest fires. A key issue in developing this framework is to ensure the delivery of reliable air quality data products at a larger scale than those available at the moment.
Near Hartree-Fock quality GTO basis sets for the second-row atoms
NASA Technical Reports Server (NTRS)
Partridge, Harry
1987-01-01
Energy optimized, near Hartree-Fock quality Gaussian basis sets ranging in size from (17s12p) to (20s15p) are presented for the ground states of the second-row atoms and for Na(2P), Na(+), Na(-), Mg(3P), P(-), S(-), and Cl(-). In addition, optimized supplementary functions are given for the ground state basis sets to describe the negative ions and the excited Na(2P) and Mg(3P) atomic states. The ratios of successive orbital exponents describing the inner part of the 1s and 2p orbitals are found to be nearly independent of both nuclear charge and basis set size. This provides a method of obtaining good starting estimates for other basis set optimizations.
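The near-constant exponent ratios suggest geometric-progression starting guesses for exponent optimization; a minimal sketch, with all numerical values illustrative:

```python
import numpy as np

# Minimal sketch: build a geometric-progression starting guess for an
# exponent optimization, exploiting the observation that ratios of
# successive inner exponents are nearly constant. Values are illustrative.

def geometric_guess(alpha0, ratio, n):
    """Exponents alpha0 * ratio**k for k = 0..n-1 (largest first)."""
    return alpha0 * ratio ** np.arange(n)

s_exponents = geometric_guess(alpha0=2.0e6, ratio=1 / 2.8, n=18)
print(s_exponents[:4])   # starting estimates to be refined variationally
```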
A Survey and Analysis of Aircraft Maintenance Metrics: A Balanced Scorecard Approach
2014-03-27
Metrics Set Theory/Framework; Balanced Scorecard overview. Figure 1: Metric Evaluation Criteria (Caplice & Sheffi, 1994, p. 14). The researcher included an examination of established theory and frameworks on how metrics sets are constructed in the literature review.
Teodoro, Tiago Quevedo; Visscher, Lucas; da Silva, Albérico Borges Ferreira; Haiduke, Roberto Luiz Andrade
2017-03-14
The f-block elements are addressed in this third part of a series of prolapse-free basis sets of quadruple-ζ quality (RPF-4Z). Relativistic adapted Gaussian basis sets (RAGBSs) are used as primitive sets of functions while correlating/polarization (C/P) functions are chosen by analyzing energy lowerings upon basis set increments in Dirac-Coulomb multireference configuration interaction calculations with single and double excitations of the valence spinors. These function exponents are obtained by applying the RAGBS parameters in a polynomial expression. Moreover, through the choice of C/P characteristic exponents from functions of lower angular momentum spaces, a reduction in the computational demand is attained in relativistic calculations based on the kinetic balance condition. The present study thus complements the RPF-4Z sets for the whole periodic table (Z ≤ 118). The sets are available as Supporting Information and can also be found at http://basis-sets.iqsc.usp.br .
Combination of large and small basis sets in electronic structure calculations on large systems
NASA Astrophysics Data System (ADS)
Røeggen, Inge; Gao, Bin
2018-04-01
Two basis sets—a large and a small one—are associated with each nucleus of the system. Each atom has its own separate one-electron basis comprising the large basis set of the atom in question and the small basis sets of its partner atoms in the complex. The perturbed atoms in molecules and solids model is at the core of the approach, since it allows for the definition of perturbed atoms in a system. It is argued that this basis set approach should be particularly useful for periodic systems. Test calculations are performed on one-dimensional arrays of H and Li atoms. The ground-state energy per atom in the linear H array is determined versus bond length.
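A hedged sketch of the per-atom mixed-basis idea (not the perturbed-atoms model itself), using PySCF's labeled-atom basis assignment; the basis choices are illustrative:

```python
from pyscf import gto, scf

# Sketch: each labeled atom carries its own basis, so one center can use a
# large set while its partners use small sets. Basis choices illustrative.
mol = gto.M(
    atom="H1 0 0 0; H2 0 0 0.74",    # labels H1/H2 allow per-atom bases
    basis={"H1": "cc-pvqz",           # large set on the atom of interest
           "H2": "sto-3g"},           # small set on the partner atom
    unit="Angstrom",
)
print("HF energy with mixed basis:", scf.RHF(mol).run().e_tot)
```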
Kapiriri, Lydia
2017-06-19
While there have been efforts to develop frameworks to guide healthcare priority setting, there has been limited focus on evaluation frameworks. Moreover, while the few existing frameworks identify quality indicators for successful priority setting, they do not provide users with strategies to verify these indicators. Kapiriri and Martin (Health Care Anal 18:129-147, 2010) developed a framework for evaluating priority setting in low and middle income countries. This framework provides both parameters for successful priority setting and proposed means of their verification. Before its use in real life contexts, this paper presents results from a validation process of the framework. The framework validation involved 53 policy makers and priority setting researchers at the global, national and sub-national levels (in Uganda). They were requested to indicate the relative importance of the proposed parameters as well as the feasibility of obtaining the related information. We also pilot tested the proposed means of verification. Almost all the respondents evaluated all the parameters, including the contextual factors, as 'very important'. However, some respondents at the global level rated 'presence of incentives to comply', 'reduced disagreements', 'increased public understanding', 'improved institutional accountability' and 'meeting the ministry of health objectives' as less important, which could be a reflection of their levels of decision making. All the proposed means of verification were assessed as feasible with the exception of meeting observations, which would require an insider. These findings were consistent with those obtained from the pilot testing. These findings are relevant to policy makers and researchers involved in priority setting in low and middle income countries. To the best of our knowledge, this is one of the few initiatives that has involved potential users of a framework (at the global level and in a low income country) in its validation. The favorable validation of all the parameters at the national and sub-national levels implies that the framework has potential usefulness at those levels as is. The parameters that were disputed at the global level necessitate further discussion when using the framework at that level. The next step is to use the validated framework to evaluate actual priority setting at the different levels.
NASA Astrophysics Data System (ADS)
Fallatah, Mohammed I.; Kerans, Charles
2018-01-01
A sequence stratigraphic framework of the Late Jurassic (Oxfordian) Hanifa Formation at its exposure in Central Arabia is presented for the first time. This study offers the first high-resolution stratigraphic framework of the Hanifa along the Tuwaiq Escarpment, based on measuring 15 sections (~770 m total thickness) over an oblique-to-dip distance of 260 km and collecting 295 samples for petrographic analysis. On the basis of these data, the Hanifa Formation can be subdivided into eight facies: 1) tabular cross-bedded quartz-peloidal-skeletal grainstone, 2) cross-bedded skeletal-peloidal grainstone, 3) bioturbated foraminiferal wackestone/mud-dominated packstone, 4) oncolitic rudstone, 5) stromatoporoid-coral biostrome/bioherm, 6) peloidal/composite-grain grain-dominated packstone/grainstone, 7) bioturbated spiculitic wackestone/mud-dominated packstone, and 8) thinly-bedded argillaceous mudstone/wackestone. The vertical and lateral distributions of these facies along the exposure define their sequence setting using the principles of sequence stratigraphy. By recognizing erosional surfaces, facies offset, and changes in facies proportions, five third-order sequences, with an average duration of 1.1 Myr, are interpreted for the Hanifa Formation. The correlation of the sequences across the study area shows that only four sequences are preserved in the north where shallow-water deposits are well-developed. Facies trends within these sequences are further illustrated in depositional models representing the highstand systems tracts (HST) and the transgressive systems tracts (TST) of the Hanifa Formation. These proposed models represent depositional settings of a carbonate ramp with normal open-marine conditions. The HST depositional model is characterized by a high-energy shoreline and depicts the presence of an offshore, structurally controlled skeletal-peloidal shoal body described here for the first time at the Hanifa exposure in the Hozwa area. This work provides a predictive framework and outcrop analog for applications in hydrocarbon exploration and development. Furthermore, a basinal setting predicted to the south of the study area is a potential site for unconventional plays.
Jia, Zhilong; Liu, Ying; Guan, Naiyang; Bo, Xiaochen; Luo, Zhigang; Barnes, Michael R
2016-05-27
Drug repositioning, finding new indications for existing drugs, has gained much recent attention as a potentially efficient and economical strategy for accelerating new therapies into the clinic. Although improvement in the sensitivity of computational drug repositioning methods has identified numerous credible repositioning opportunities, few have been progressed. Arguably the "black box" nature of drug action in a new indication is one of the main blocks to progression, highlighting the need for methods that inform on the broader target mechanism in the disease context. We demonstrate that the analysis of co-expressed genes may be a critical first step towards illumination of both disease pathology and mode of drug action. We achieve this using a novel framework, co-expressed gene-set enrichment analysis (cogena), for co-expression analysis of gene expression signatures and gene set enrichment analysis of co-expressed genes. The cogena framework enables simultaneous, pathway-driven, disease and drug repositioning analysis. Cogena can be used to illuminate coordinated changes within disease transcriptomes and identify drugs acting mechanistically within this framework. We illustrate this using a psoriatic skin transcriptome, as an exemplar, and recover two widely used psoriasis drugs (Methotrexate and Ciclosporin) with distinct modes of action. Cogena outperforms the results of the Connectivity Map and NFFinder webservers in similar disease transcriptome analyses. Furthermore, we investigated the literature support for the other top-ranked compounds to treat psoriasis and showed how the outputs of cogena analysis can contribute new insight to support the progression of drugs into the clinic. We have made cogena freely available within Bioconductor and at https://github.com/zhilongjia/cogena . In conclusion, by targeting co-expressed genes within disease transcriptomes, cogena offers novel biological insight, which can be effectively harnessed for drug discovery and repositioning, allowing the grouping and prioritisation of drug repositioning candidates on the basis of putative mode of action.
Integrated performance and reliability specification for digital avionics systems
NASA Technical Reports Server (NTRS)
Brehm, Eric W.; Goettge, Robert T.
1995-01-01
This paper describes an automated tool for performance and reliability assessment of digital avionics systems, called the Automated Design Tool Set (ADTS). ADTS is based on an integrated approach to design assessment that unifies traditional performance and reliability views of system designs, and that addresses interdependencies between performance and reliability behavior via the exchange of parameters and results between mathematical models of each type. A multi-layer tool set architecture has been developed for ADTS that separates the concerns of system specification, model generation, and model solution. Performance and reliability models are generated automatically as a function of candidate system designs, and model results are expressed within the system specification. The layered approach helps deal with the inherent complexity of the design assessment process, and preserves long-term flexibility to accommodate a wide range of models and solution techniques within the tool set structure. ADTS research and development to date has focused on the development of a language for the specification of system designs as a basis for performance and reliability evaluation. A model generation and solution framework has also been developed for ADTS, which will ultimately encompass an integrated set of analytic and simulation-based techniques for performance, reliability, and combined design assessment.
A dual indicator set to help farms achieve more sustainable crop protection.
Wustenberghs, Hilde; Delcour, Ilse; D'Haene, Karoline; Lauwers, Ludwig; Marchand, Fleur; Steurbaut, Walter; Spanoghe, Pieter
2012-08-01
Farmers are being called to use plant protection products (PPPs) more consciously and adopt more sustainable crop protection strategies. Indicators will help farmers to monitor their progress towards sustainability and will support their learning process. Talking the indicators through in farmers' discussion groups and the resulting peer encouragement will foster knowledge acquirement and can lead to changes in attitudes, norms, perception and behaviour. Using a participatory approach, a conceptual framework for on-farm sustainable crop protection practices was created. The same participatory approach was used to design a dual indicator set, which pairs a pesticide impact assessment system (PIAS) with a farm inquiry. The PIAS measures the risk for human health and the environment exerted by chemical crop protection. The inquiry reveals the farmers' response to this risk, both in terms of the actions they take and their knowledge, awareness and attitude. The dual indicator set allows for implementation in four tiers, each representing increased potential for monitoring and social learning. The indicator set can be adjusted on the basis of new findings, and the participatory approach can be extrapolated to other situations. Copyright © 2012 Society of Chemical Industry.
Setting conservation priorities.
Wilson, Kerrie A; Carwardine, Josie; Possingham, Hugh P
2009-04-01
A generic framework for setting conservation priorities based on the principles of classic decision theory is provided. This framework encapsulates the key elements of any problem, including the objective, the constraints, and knowledge of the system. Within the context of this framework the broad array of approaches for setting conservation priorities are reviewed. While some approaches prioritize assets or locations for conservation investment, it is concluded here that prioritization is incomplete without consideration of the conservation actions required to conserve the assets at particular locations. The challenges associated with prioritizing investments through time in the face of threats (and also spatially and temporally heterogeneous costs) can be aided by proper problem definition. Using the authors' general framework for setting conservation priorities, multiple criteria can be rationally integrated and where, how, and when to invest conservation resources can be scheduled. Trade-offs are unavoidable in priority setting when there are multiple considerations, and budgets are almost always finite. The authors discuss how trade-offs, risks, uncertainty, feedbacks, and learning can be explicitly evaluated within their generic framework for setting conservation priorities. Finally, they suggest ways that current priority-setting approaches may be improved.
McGraw, Caroline; Drennan, Vari M
2015-02-01
To evaluate the suitability of root cause analysis frameworks for the investigation of community-acquired pressure ulcers. The objective was to identify the extent to which these frameworks take account of the setting where the ulcer originated as being the person's home rather than a hospital setting. Pressure ulcers involving full-thickness skin loss are increasingly being regarded as indicators of nursing patient safety failure, requiring investigation using root cause analysis frameworks. Evidence suggests that root cause analysis frameworks developed in hospital settings ignore the unique dimensions of risk in home healthcare settings. A systematic literature review and documentary analysis of frameworks used to investigate community-acquired grade three and four pressure ulcers by home nursing services in England. No published papers were identified for inclusion in the review. Fifteen patient safety investigative frameworks were collected and analysed. Twelve of the retrieved frameworks were intended for the investigation of community-acquired pressure ulcers; seven of which took account of the setting where the ulcer originated as being the patient's home. This study provides evidence to suggest that many of the root cause analysis frameworks used to investigate community-acquired pressure ulcers in England are unsuitable for this purpose. This study provides researchers and practitioners with evidence of the need to develop appropriate home nursing root cause analysis frameworks to investigate community-acquired pressure ulcers. © 2014 John Wiley & Sons Ltd.
ERIC Educational Resources Information Center
Schweizer, Karl
2008-01-01
Structural equation modeling provides the framework for investigating experimental effects on the basis of variances and covariances in repeated measurements. A special type of confirmatory factor analysis as part of this framework enables the appropriate representation of the experimental effect and the separation of experimental and…
The "Blueprint" Framework for Career Management Skills: A Critical Exploration
ERIC Educational Resources Information Center
Hooley, Tristram; Watts, A. G.; Sultana, Ronald G.; Neary, Siobhan
2013-01-01
This article examines the Blueprint framework for career management skills as it has been revealed across sequential implementations in the USA, Canada and Australia. It is argued that despite its lack of an empirical basis, the framework forms a useful and innovative means through which career theory, practice and policy can be connected. The…
ERIC Educational Resources Information Center
Hinton, Kip Austin
2015-01-01
Social science research on communities of color has long been shaped by theories of social and cultural capital. This article is a hermeneutic reading of metaphorical capital frameworks, including community cultural wealth and funds of knowledge. Financial capital, the basis of these frameworks, is premised on unequal exchange. Money only becomes…
ERIC Educational Resources Information Center
de Muynck, Bram; Reijnoudt-Klein, Willemieke; Spruyt-de Kloe, Marike
2017-01-01
This article reports the development of a framework that structures differences in Christian educational practices worldwide. One of its purposes is to simplify the complexity of the contexts in which global partners cooperate. The framework also offers the theoretical basis for an instrument that nongovernmental organizations can use to determine…
Identifying Attributes of CO2 Leakage Zones in Shallow Aquifers Using a Parametric Level Set Method
NASA Astrophysics Data System (ADS)
Sun, A. Y.; Islam, A.; Wheeler, M.
2016-12-01
Leakage through abandoned wells and geologic faults poses the greatest risk to CO2 storage permanence. For shallow aquifers, secondary CO2 plumes emanating from the leak zones may go undetected for a sustained period of time and have the greatest potential to cause large-scale and long-term environmental impacts. Identification of the attributes of leak zones, including their shape, location, and strength, is required for proper environmental risk assessment. This study applies a parametric level set (PaLS) method to characterize the leakage zone. Level set methods are appealing for tracking topological changes and recovering unknown shapes of objects. However, level set evolution using conventional level set methods is challenging. In PaLS, the level set function is approximated by a weighted sum of basis functions, and the level set evolution problem is replaced by an optimization problem. The efficacy of PaLS is demonstrated by recovering the source zone created by CO2 leakage into a carbonate aquifer. Our results show that PaLS is a robust source identification method that can recover approximate source locations in the presence of measurement errors, model parameter uncertainty, and inaccurate initial guesses of source flux strengths. The PaLS inversion framework introduced in this work is generic and can be adapted to any reactive transport model by switching the pre- and post-processing routines.
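A minimal sketch of the PaLS representation as described (a level set function built from a weighted sum of radial basis functions, with the source region given by its positive part); the centers, widths, and weights are illustrative assumptions:

```python
import numpy as np

# Minimal PaLS-style sketch: represent the level set function as a weighted
# sum of Gaussian radial basis functions; the leak-zone estimate is the
# region where phi > 0. An inversion would adjust the weights (and possibly
# centers) to minimize data misfit; here we only evaluate phi.

def phi(points, centers, widths, weights, c0=-0.5):
    """Parametric level set: phi(x) = c0 + sum_i w_i exp(-|x-c_i|^2 / s_i^2)."""
    d2 = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return c0 + (weights * np.exp(-d2 / widths**2)).sum(axis=1)

# Evaluate on a 2-D grid and extract the implied source region.
x = np.linspace(0, 10, 101)
X, Y = np.meshgrid(x, x)
pts = np.column_stack([X.ravel(), Y.ravel()])
centers = np.array([[3.0, 4.0], [6.5, 5.5]])
vals = phi(pts, centers, widths=np.array([1.0, 1.5]),
           weights=np.array([1.0, 0.8]))
inside = vals.reshape(X.shape) > 0          # estimated leak-zone support
print("cells inside source region:", int(inside.sum()))
```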
NASA Astrophysics Data System (ADS)
Ozaki, H.
2004-01-01
Using the closed-time-path formalism, we construct perturbative frameworks, in terms of a quasiparticle picture, for studying quasiuniform relativistic quantum field systems near equilibrium and non-equilibrium quasistationary systems. We employ the derivative expansion and keep terms up to second order, i.e., one order higher than the gradient approximation. After constructing the self-energy-resummed propagator, we formulate two kinds of mutually equivalent perturbative frameworks: the first is formulated on the basis of the "bare" number density function, and the second on the basis of the "physical" number density function. In the course of constructing the second framework, the generalized Boltzmann equations come out directly, describing the evolution of the system.
Dynamical basis sets for algebraic variational calculations in quantum-mechanical scattering theory
NASA Technical Reports Server (NTRS)
Sun, Yan; Kouri, Donald J.; Truhlar, Donald G.; Schwenke, David W.
1990-01-01
New basis sets are proposed for linear algebraic variational calculations of transition amplitudes in quantum-mechanical scattering problems. These basis sets are hybrids of those that yield the Kohn variational principle (KVP) and those that yield the generalized Newton variational principle (GNVP) when substituted in Schlessinger's stationary expression for the T operator. Trial calculations show that efficiencies almost as great as those of the GNVP and much greater than those of the KVP can be obtained, even for basis sets with the majority of the members independent of energy.
On basis set superposition error corrected stabilization energies for large n-body clusters.
Walczak, Katarzyna; Friedrich, Joachim; Dolg, Michael
2011-10-07
In this contribution, we propose an approximate basis set superposition error (BSSE) correction scheme for the site-site function counterpoise and for the Valiron-Mayer function counterpoise correction of second order to account for the basis set superposition error in clusters with a large number of subunits. The accuracy of the proposed scheme has been investigated for a water cluster series at the CCSD(T), CCSD, MP2, and self-consistent field levels of theory using Dunning's correlation consistent basis sets. The BSSE corrected stabilization energies for a series of water clusters are presented. A study regarding the possible savings with respect to computational resources has been carried out as well as a monitoring of the basis set dependence of the approximate BSSE corrections. © 2011 American Institute of Physics
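For orientation, the two-body Boys–Bernardi counterpoise correction that the site-site and Valiron-Mayer schemes generalize can be sketched in a few lines; the energies below are illustrative placeholders, not computed values:

```python
# Minimal sketch of the two-body Boys-Bernardi counterpoise (CP) correction
# that site-site and Valiron-Mayer schemes generalize to n bodies:
#   E_int(CP) = E_AB[AB] - E_A[AB] - E_B[AB],
# where [AB] marks the full dimer basis (each monomer computed with ghost
# functions on the partner site). Energies below are illustrative only.

def cp_interaction_energy(e_dimer, e_a_in_dimer_basis, e_b_in_dimer_basis):
    return e_dimer - e_a_in_dimer_basis - e_b_in_dimer_basis

# Hartree values for a water-dimer-like case (placeholders, not real data):
print(cp_interaction_energy(-152.672, -76.332, -76.332))  # ~ -0.008 Eh
```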
Setting Healthcare Priorities at the Macro and Meso Levels: A Framework for Evaluation
Barasa, Edwine W.; Molyneux, Sassy; English, Mike; Cleary, Susan
2015-01-01
Background: Priority setting in healthcare is a key determinant of health system performance. However, there is no widely accepted priority setting evaluation framework. We reviewed literature with the aim of developing and proposing a framework for the evaluation of macro and meso level healthcare priority setting practices. Methods: We systematically searched Econlit, PubMed, CINAHL, and EBSCOhost databases and supplemented this with searches in Google Scholar, relevant websites and reference lists of relevant papers. A total of 31 papers on evaluation of priority setting were identified. These were supplemented by broader theoretical literature related to evaluation of priority setting. A conceptual review of selected papers was undertaken. Results: Based on a synthesis of the selected literature, we propose an evaluative framework that requires that priority setting practices at the macro and meso levels of the health system meet the following conditions: (1) Priority setting decisions should incorporate both efficiency and equity considerations as well as the following outcomes; (a) Stakeholder satisfaction, (b) Stakeholder understanding, (c) Shifted priorities (reallocation of resources), and (d) Implementation of decisions. (2) Priority setting processes should also meet the procedural conditions of (a) Stakeholder engagement, (b) Stakeholder empowerment, (c) Transparency, (d) Use of evidence, (e) Revisions, (f) Enforcement, and (g) Being grounded on community values. Conclusion: Available frameworks for the evaluation of priority setting are mostly grounded on procedural requirements, while few have included outcome requirements. There is, however, increasing recognition of the need to incorporate both consequential and procedural considerations in priority setting practices. In this review, we adapt an integrative approach to develop and propose a framework for the evaluation of priority setting practices at the macro and meso levels that draws from these complementary schools of thought. PMID:26673332
Setting Healthcare Priorities at the Macro and Meso Levels: A Framework for Evaluation.
Barasa, Edwine W; Molyneux, Sassy; English, Mike; Cleary, Susan
2015-09-16
Priority setting in healthcare is a key determinant of health system performance. However, there is no widely accepted priority setting evaluation framework. We reviewed literature with the aim of developing and proposing a framework for the evaluation of macro and meso level healthcare priority setting practices. We systematically searched Econlit, PubMed, CINAHL, and EBSCOhost databases and supplemented this with searches in Google Scholar, relevant websites and reference lists of relevant papers. A total of 31 papers on evaluation of priority setting were identified. These were supplemented by broader theoretical literature related to evaluation of priority setting. A conceptual review of selected papers was undertaken. Based on a synthesis of the selected literature, we propose an evaluative framework that requires that priority setting practices at the macro and meso levels of the health system meet the following conditions: (1) Priority setting decisions should incorporate both efficiency and equity considerations as well as the following outcomes; (a) Stakeholder satisfaction, (b) Stakeholder understanding, (c) Shifted priorities (reallocation of resources), and (d) Implementation of decisions. (2) Priority setting processes should also meet the procedural conditions of (a) Stakeholder engagement, (b) Stakeholder empowerment, (c) Transparency, (d) Use of evidence, (e) Revisions, (f) Enforcement, and (g) Being grounded on community values. Available frameworks for the evaluation of priority setting are mostly grounded on procedural requirements, while few have included outcome requirements. There is, however, increasing recognition of the need to incorporate both consequential and procedural considerations in priority setting practices. In this review, we adapt an integrative approach to develop and propose a framework for the evaluation of priority setting practices at the macro and meso levels that draws from these complementary schools of thought. © 2015 by Kerman University of Medical Sciences.
Carinci, F; Van Gool, K; Mainz, J; Veillard, J; Pichora, E C; Januel, J M; Arispe, I; Kim, S M; Klazinga, N S
2015-04-01
To review and update the conceptual framework, indicator content and research priorities of the Organisation for Economic Cooperation and Development's (OECD) Health Care Quality Indicators (HCQI) project, after a decade of collaborative work. A structured assessment was carried out using a modified Delphi approach, followed by a consensus meeting, to assess the suite of HCQI for international comparisons, agree on revisions to the original framework and set priorities for research and development. International group of countries participating to OECD projects. Members of the OECD HCQI expert group. A reference matrix, based on a revised performance framework, was used to map and assess all seventy HCQI routinely calculated by the OECD expert group. A total of 21 indicators were agreed to be excluded, due to the following concerns: (i) relevance, (ii) international comparability, particularly where heterogeneous coding practices might induce bias, (iii) feasibility, when the number of countries able to report was limited and the added value did not justify sustained effort and (iv) actionability, for indicators that were unlikely to improve on the basis of targeted policy interventions. The revised OECD framework for HCQI represents a new milestone of a long-standing international collaboration among a group of countries committed to building common ground for performance measurement. The expert group believes that the continuation of this work is paramount to provide decision makers with a validated toolbox to directly act on quality improvement strategies. © The Author 2015. Published by Oxford University Press in association with the International Society for Quality in Health Care; all rights reserved.
Margevicius, Kristen J.; Generous, Nicholas; Taylor-McCabe, Kirsten J.; Brown, Mac; Daniel, W. Brent; Castro, Lauren; Hengartner, Andrea; Deshpande, Alina
2014-01-01
In recent years, biosurveillance has become the buzzword under which a diverse set of ideas and activities regarding detecting and mitigating biological threats are incorporated depending on context and perspective. Increasingly, biosurveillance practice has become global and interdisciplinary, requiring information and resources across public health, One Health, and biothreat domains. Even within the scope of infectious disease surveillance, multiple systems, data sources, and tools are used with varying and often unknown effectiveness. Evaluating the impact and utility of state-of-the-art biosurveillance is, in part, confounded by the complexity of the systems and the information derived from them. We present a novel approach conceptualizing biosurveillance from the perspective of the fundamental data streams that have been or could be used for biosurveillance and to systematically structure a framework that can be universally applicable for use in evaluating and understanding a wide range of biosurveillance activities. Moreover, the Biosurveillance Data Stream Framework and associated definitions are proposed as a starting point to facilitate the development of a standardized lexicon for biosurveillance and characterization of currently used and newly emerging data streams. Criteria for building the data stream framework were developed from an examination of the literature, analysis of information on operational infectious disease biosurveillance systems, and consultation with experts in the area of biosurveillance. To demonstrate utility, the framework and definitions were used as the basis for a schema of a relational database for biosurveillance resources and in the development and use of a decision support tool for data stream evaluation. PMID:24392093
Dziadkowiec, Oliwier; Callahan, Tiffany; Ozkaynak, Mustafa; Reeder, Blaine; Welton, John
2016-01-01
Objectives: We examine the following: (1) the appropriateness of using a data quality (DQ) framework developed for relational databases as a data-cleaning tool for a data set extracted from two EPIC databases, and (2) the differences in statistical parameter estimates between a data set cleaned with the DQ framework and a data set not cleaned with it. Background: The use of data contained within electronic health records (EHRs) has the potential to open doors for a new wave of innovative research. Without adequate preparation of such large data sets for analysis, the results might be erroneous, which might affect clinical decision-making or the results of Comparative Effectiveness Research studies. Methods: Two emergency department (ED) data sets extracted from EPIC databases (adult ED and children's ED) were used as examples for examining the five concepts of DQ based on a DQ assessment framework designed for EHR databases. The first data set contained 70,061 visits; the second contained 2,815,550 visits. SPSS Syntax examples as well as step-by-step instructions for applying the five key DQ concepts to these EHR database extracts are provided. Conclusions: SPSS Syntax to address each of the DQ concepts proposed by Kahn et al. (2012) was developed. The data set cleaned using Kahn's framework yielded more accurate results than the data set cleaned without this framework. Future plans involve creating functions in the R language for cleaning data extracted from the EHR, as well as an R package that combines DQ checks with missing data analysis functions. PMID:27429992
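The five DQ concepts are not enumerated in this record; purely as an illustration, checks of this general kind (completeness, conformance, plausibility) can be written in a few lines of pandas, with all column names and thresholds assumed:

```python
import pandas as pd

# Illustrative-only data quality checks on an ED-visit extract; the column
# names and thresholds are assumptions, not the framework's definitions.
visits = pd.DataFrame({
    "visit_id": [1, 2, 3, 4],
    "age": [34, -2, 67, None],
    "arrival": pd.to_datetime(["2015-01-03", "2015-02-07", None, "2015-03-01"]),
    "discharge": pd.to_datetime(["2015-01-03", "2015-02-06", None, "2015-03-02"]),
})

completeness = visits.isna().mean()                       # share missing per column
conformance = visits["age"].notna() & ~visits["age"].between(0, 120)
plausibility = visits["discharge"] < visits["arrival"]    # discharge before arrival

print(completeness)
print("non-conforming ages:", visits.loc[conformance, "visit_id"].tolist())
print("implausible visits:", visits.loc[plausibility, "visit_id"].tolist())
```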
A unified framework for evaluating the risk of re-identification of text de-identification tools.
Scaiano, Martin; Middleton, Grant; Arbuckle, Luk; Kolhatkar, Varada; Peyton, Liam; Dowling, Moira; Gipson, Debbie S; El Emam, Khaled
2016-10-01
It has become regular practice to de-identify unstructured medical text for use in research using automatic methods, the goal of which is to remove patient identifying information to minimize re-identification risk. The metrics commonly used to determine if these systems are performing well do not accurately reflect the risk of a patient being re-identified. We therefore developed a framework for measuring the risk of re-identification associated with textual data releases. We apply the proposed evaluation framework to a data set from the University of Michigan Medical School. Our risk assessment results are then compared with those that would be obtained using a typical contemporary micro-average evaluation of recall in order to illustrate the difference between the proposed evaluation framework and the current baseline method. We demonstrate how this framework compares against common measures of the re-identification risk associated with an automated text de-identification process. For the probability of re-identification using our evaluation framework we obtained a mean value for direct identifiers of 0.0074 and a mean value for quasi-identifiers of 0.0022. The 95% confidence interval for these estimates were below the relevant thresholds. The threshold for direct identifier risk was based on previously used approaches in the literature. The threshold for quasi-identifiers was determined based on the context of the data release following commonly used de-identification criteria for structured data. Our framework attempts to correct for poorly distributed evaluation corpora, accounts for the data release context, and avoids the often optimistic assumptions that are made using the more traditional evaluation approach. It therefore provides a more realistic estimate of the true probability of re-identification. This framework should be used as a basis for computing re-identification risk in order to more realistically evaluate future text de-identification tools. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
High quality Gaussian basis sets for fourth-row atoms
NASA Technical Reports Server (NTRS)
Partridge, Harry; Faegri, Knut, Jr.
1992-01-01
Energy optimized Gaussian basis sets of triple-zeta quality for the atoms Rb-Xe have been derived. Two series of basis sets are developed: (24s 16p 10d) and (26s 16p 10d) sets, which were expanded to 13d and 19p functions as the 4d and 5p shells become occupied. For the atoms lighter than Cd, the (24s 16p 10d) sets with triple-zeta valence distributions are higher in energy than the corresponding double-zeta distribution. To ensure a triple-zeta distribution and a global energy minimum, the (26s 16p 10d) sets were derived. Total atomic energies from the largest basis sets are between 198 and 284 μE_H above the numerical Hartree-Fock energies.
Acceptance of lean redesigns in primary care: A contextual analysis.
Hung, Dorothy; Gray, Caroline; Martinez, Meghan; Schmittdiel, Julie; Harrison, Michael I
Lean is a leading change strategy used in health care to achieve short-term efficiency and quality improvement while promising longer-term system transformation. Most research examines Lean intervention to address isolated problems, rather than to achieve broader systemic changes to care delivery. Moreover, no studies examine contextual influences on system-wide Lean implementation efforts in primary care. The aim of this study was to identify contextual factors most critical to implementing and scaling Lean redesigns across all primary care clinics in a large, ambulatory care delivery system. Over 100 interviews and focus groups were conducted with frontline physicians, clinical staff, and operational leaders. Data analysis was guided by a modified Consolidated Framework for Implementation Research (CFIR), a popular implementation science framework. On the basis of expert recommendations, the modified framework targets factors influencing the implementation of process redesigns. This modified framework, the CFIR-PR, informed our identification of contextual factors that most impacted Lean acceptance among frontline physicians and staff. Several domains identified by the CFIR-PR were critical to acceptance of Lean redesigns. Regarding the implementation process acceptance was influenced by time and intensity of exposure to changes, "top-down" versus "bottom-up" implementation styles, and degrees of employee engagement in developing new workflows. Important factors in the inner setting were the clinic's culture and style of leadership, along with availability of information about Lean's effectiveness. Last, implementation efforts were impacted by individual and team characteristics regarding changed work roles and related issues of professional identity, authority, and autonomy. This study underscores the need for change leaders to consider the contextual factors that surround efforts to implement Lean in primary care. As Lean redesigns are scaled across a system, special attention is warranted with respect to the implementation approach, internal clinic setting, and implications for professional roles and identities of physicians and staff.
Relativistic well-tempered Gaussian basis sets for helium through mercury
DOE Office of Scientific and Technical Information (OSTI.GOV)
Okada, S.; Matsuoka, O.
1989-10-01
Exponent parameters of the nonrelativistically optimized well-tempered Gaussian basis sets of Huzinaga and Klobukowski have been employed for Dirac-Fock-Roothaan calculations without reoptimization. For the light atoms He (atomic number Z=2) through Rh (Z=45), the number of exponent parameters used is the same as in the nonrelativistic basis sets; for the heavier atoms Pd (Z=46) through Hg (Z=80), two 2p (and three 3d) Gaussian basis functions have been added. The scheme of kinetic energy balance and the uniformly charged sphere model of atomic nuclei have been adopted. The qualities of the calculated basis sets are close to the Dirac-Fock limit.
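For context, a minimal sketch of the well-tempered exponent formula of Huzinaga and Klobukowski; the four parameter values here are illustrative, not the published ones:

```python
import numpy as np

# Sketch of the well-tempered exponent formula of Huzinaga and Klobukowski:
#   alpha_k = alpha * beta**(k-1) * (1 + gamma * (k/K)**delta),  k = 1..K.
# The four parameters shown are illustrative, not the published values.

def well_tempered(alpha, beta, gamma, delta, K):
    k = np.arange(1, K + 1)
    return alpha * beta ** (k - 1) * (1 + gamma * (k / K) ** delta)

print(well_tempered(alpha=0.05, beta=2.6, gamma=0.6, delta=5.0, K=20)[:5])
```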
Shafer; Inglis
2000-07-01
Managing protected areas involves balancing the enjoyment of visitors with the protection of a variety of cultural and biophysical resources. Tourism pressures in the Great Barrier Reef World Heritage Area (GBRWHA) are creating concerns about how to strike this balance in a marine environment. Terrestrial research has led to conceptual planning and management frameworks that address issues of human use and resource protection. The limits of acceptable change (LAC) framework was used as a conceptual basis for a study of snorkeling at reef sites in the GBRWHA. The intent was to determine whether different settings existed among tourism operators traveling to the reef and, if so, to identify specific conditions relating to those settings. Snorkelers (N = 1475) traveling with tourism operations of different sizes to different sites completed surveys. Results indicated that snorkelers who traveled with larger operations (more people and infrastructure) differed from those traveling with smaller operations (few people and little on-site infrastructure) in the benefits received and in the way that specific conditions influenced their enjoyment. Benefits related to nature, escape, and family helped to define reef experiences. Conditions related to coral, fish, and operator staff had a positive influence on the enjoyment of most visitors, but the number of people on the trip and site infrastructure may have the greatest potential as setting indicators. The data support the potential usefulness of visitor input in applying the LAC concept to a marine environment where tourism and recreational uses are rapidly changing.
Medicine shortages in Australia: causes, impact and management strategies in the community setting.
Tan, Yee Xi; Moles, Rebekah J; Chaar, Betty B
2016-10-01
Background Medicine shortages are an ongoing global problem. The Therapeutic Goods Administration (TGA) dedicated a website to the monitoring of medicine shortages in Australia in May 2014, as part of the Medicine Shortage Information Initiative. This study aimed to explore the views of pharmacists regarding medicine shortages in the community setting and the impact of the TGA website in Australia. Setting Community pharmacies in New South Wales, Australia. Method Twenty semi-structured interviews were conducted with community pharmacists. Data collected were analysed thematically utilising the framework analysis method. Main outcome measure Qualitative analysis conducted using the framework approach. Results Findings clearly indicated that medicine shortages were experienced on a regular basis, but most participants were unaware of the TGA website. Medicine shortages reportedly impacted both pharmacists and consumers, and various workarounds were undertaken to manage the issue. The "price disclosure policy" was found to be a prominent contributing factor in emerging shortages. Suggestions were made for ways to address the growing occurrence of shortages. Conclusion Overall, the study found a lack of familiarity with the TGA website, despite pharmacists experiencing regular shortages of medicines in practice. Also highlighted was the importance of pharmacists prioritising patient care over business decisions. Notifying doctors about shortages, so that early action can be taken at higher levels of the supply chain, was also considered important to reduce the prescribing of out-of-stock medicines. Findings of this study may help direct future policy-making, as medicine shortages are a problem shared by healthcare providers in most countries around the world.
Prescott, Sarah; Fleming, Jennifer; Doig, Emmah
2017-06-11
The aim of this study was to explore clinicians' experiences of implementing goal setting with community dwelling clients with acquired brain injury, to develop a goal setting practice framework. Grounded theory methodology was employed. Clinicians, representing six disciplines across seven services, were recruited and interviewed until theoretical saturation was achieved. A total of 22 clinicians were interviewed. A theoretical framework was developed to explain how clinicians support clients to actively engage in goal setting in routine practice. The framework incorporates three phases: a needs identification phase, a goal operationalisation phase, and an intervention phase. Contextual factors, including personal and environmental influences, also affect how clinicians and clients engage in this process. Clinicians use additional strategies to support clients with impaired self-awareness. These include structured communication and metacognitive strategies to operationalise goals. For clients with emotional distress, clinicians provide additional time and intervention directed at new identity development. The goal setting practice framework may guide clinician's understanding of how to engage in client-centred goal setting in brain injury rehabilitation. There is a predilection towards a client-centred goal setting approach in the community setting, however, contextual factors can inhibit implementation of this approach. Implications for Rehabilitation The theoretical framework describes processes used to develop achievable client-centred goals with people with brain injury. Building rapport is a core strategy to engage clients with brain injury in goal setting. Clients with self-awareness impairment benefit from additional metacognitive strategies to participate in goal setting. Clients with emotional distress may need additional time for new identity development.
NASA Astrophysics Data System (ADS)
Wang, Feng; Pang, Wenning; Duffy, Patrick
2012-12-01
Performance of a number of commonly used density functional methods in chemistry (B3LYP, BHandH, BP86, PW91, VWN, LB94, PBE0, SAOP and X3LYP) and the Hartree-Fock (HF) method has been assessed using orbital momentum distributions of the 7σ orbital of nitrous oxide (NNO), which models electron behaviour in a chemically significant region. The density functional methods are combined with a number of Gaussian basis sets (Pople's 6-31G*, 6-311G**, DGauss TZVP and Dunning's aug-cc-pVTZ, as well as even-tempered Slater basis sets, namely et-DZPp, et-QZ3P, et-QZ+5P and et-pVQZ). Orbital momentum distributions of the 7σ orbital in the ground electronic state of NNO, obtained from a Fourier transform into momentum space of single-point electronic structure calculations employing the above models, are compared with experimental measurements of the same orbital from electron momentum spectroscopy (EMS). The present study reveals information on the performance of (a) the density functional methods, (b) Gaussian and Slater basis sets, (c) combinations of the density functional methods and basis sets, that is, the models, (d) orbital momentum distributions, rather than a group of specific molecular properties and (e) the entire region of chemical significance of the orbital. It is found that discrepancies between the measured and calculated distributions of this orbital occur in the small-momentum region (i.e. the large-r region). In general, Slater basis sets achieve better overall performance than the Gaussian basis sets. Performance of the Gaussian basis sets varies noticeably when combined with different exchange-correlation (Vxc) functionals, but Dunning's aug-cc-pVTZ basis set achieves the best performance for the momentum distributions of this orbital. The overall performance of the B3LYP and BP86 models is similar to newer models such as X3LYP and SAOP. The present study also demonstrates that the combinations of the density functional methods and the basis sets indeed make a difference in the quality of the calculated orbitals.
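To make the momentum-space step concrete, here is a minimal Python sketch (illustrative only): it evaluates the spherically averaged momentum density of a toy s-type Gaussian contraction via the analytic Fourier transform of a Gaussian. The coefficients and exponents are invented for demonstration; the paper's distributions come from full DFT/HF calculations on NNO.

    import numpy as np

    # Toy s-type Gaussian contraction; coefficients/exponents are illustrative.
    coeffs = [0.4, 0.7]
    alphas = [0.3, 1.2]

    def phi_p(p):
        # Analytic 3D Fourier transform of exp(-a*r**2) with the
        # (2*pi)**(-3/2) convention: phi(p) = (2a)**(-3/2) * exp(-p**2/(4a)).
        return sum(c * (2.0 * a) ** -1.5 * np.exp(-p**2 / (4.0 * a))
                   for c, a in zip(coeffs, alphas))

    p = np.linspace(0.0, 5.0, 200)
    rho = 4.0 * np.pi * p**2 * phi_p(p) ** 2   # spherically averaged density
    print(p[np.argmax(rho)])                   # peak momentum of the toy orbital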
Plumley, Joshua A.; Dannenberg, J. J.
2011-01-01
We evaluate the performance of ten functionals (B3LYP, M05, M05-2X, M06, M06-2X, B2PLYP, B2PLYPD, X3LYP, B97D and MPWB1K) in combination with 16 basis sets ranging in complexity from 6-31G(d) to aug-cc-pV5Z for the calculation of the H-bonded water dimer, with the goal of defining which combinations of functionals and basis sets provide a combination of economy and accuracy for H-bonded systems. We have compared the results to the best non-DFT molecular orbital calculations and to experimental results. Several of the smaller basis sets lead to qualitatively incorrect geometries when optimized on a normal potential energy surface (PES). This problem disappears when the optimization is performed on a counterpoise-corrected PES. The calculated ΔEs with the largest basis sets vary from -4.42 (B97D) to -5.19 (B2PLYPD) kcal/mol for the different functionals. Small basis sets generally predict stronger interactions than the large ones. We found that, due to error compensation, the smaller basis sets gave the best results (in comparison to experimental and high-level non-DFT MO calculations) when combined with a functional that predicts a weak interaction with the largest basis set. Since many applications are complex systems and require economical calculations, we suggest the following functional/basis set combinations in order of increasing complexity and cost: 1) D95(d,p) with B3LYP, B97D, M06 or MPWB1K; 2) 6-311G(d,p) with B3LYP; 3) D95++(d,p) with B3LYP, B97D or MPWB1K; 4) 6-311++G(d,p) with B3LYP or B97D; and 5) aug-cc-pVDZ with M05-2X, M06-2X or X3LYP. PMID:21328398
Measuring sexual function in community surveys: development of a conceptual framework.
Mitchell, Kirstin R; Wellings, Kaye
2013-01-01
Among the many psychometric measures of sexual (dys)function, none is entirely suited to use in community surveys. Faced with the need to include a brief and non-intrusive measure of sexual function in a general population survey, a new measure was developed. Findings from qualitative research with men and women in the community designed to inform the conceptual framework for this measure are presented. Thirty-two semi-structured interviews with individuals recruited from a general practice, an HIV/AIDS charity, and a sexual problems clinic were conducted. From their accounts, 31 potential criteria of a functional sex life were identified. Using evidence from qualitative data and the existing literature, and applying a set of decision rules, the list was reduced to 13 (eight for those not in a relationship), and a further eight criteria were added to enable individuals to self-rate their level of function and indicate the severity of difficulties. These criteria constitute a conceptual framework that is grounded in participant perceptions; is relevant to all, regardless of sexual experience or orientation; provides opportunity to state the degree of associated distress; and incorporates relational, psychological, and physiological aspects. It provides the conceptual basis for a concise and acceptable measure of sexual function.
A weak Galerkin generalized multiscale finite element method
Mu, Lin; Wang, Junping; Ye, Xiu
2016-03-31
In this study, we propose a general framework for the weak Galerkin generalized multiscale (WG-GMS) finite element method for elliptic problems with rapidly oscillating or high-contrast coefficients. This general WG-GMS method features high-order accuracy on general meshes and can work with multiscale basis functions derived from different numerical schemes. A special case is studied under this WG-GMS framework in which the multiscale basis functions are obtained by solving local problems with the weak Galerkin finite element method. Convergence analysis and numerical experiments are presented for the special case.
NASA Astrophysics Data System (ADS)
Miliordos, Evangelos; Xantheas, Sotiris S.
2015-03-01
We report the variation of the binding energy of the formic acid dimer with the size of the basis set at the coupled cluster with iterative singles, doubles and perturbatively connected triple replacements [CCSD(T)] level of theory, estimate the complete basis set (CBS) limit, and examine the validity of the basis set superposition error (BSSE) correction for this quantity, which was previously challenged by Kalescky, Kraka, and Cremer (KKC) [J. Chem. Phys. 140, 084315 (2014)]. Our results indicate that the BSSE correction, including terms that account for the substantial geometry change of the monomers due to the formation of two strong hydrogen bonds in the dimer, is indeed valid for obtaining accurate estimates of the binding energy of this system, as it exhibits the expected decrease with increasing basis set size. We attribute the discrepancy between our current results and those of KKC to their use of a valence basis set in conjunction with the correlation of all electrons (i.e., including the 1s of C and O). We further show that the use of a core-valence set in conjunction with all-electron correlation converges faster to the CBS limit, as the BSSE correction is less than half that of the valence-electron/valence-basis-set case. The uncorrected and BSSE-corrected binding energies were found to produce the same (within 0.1 kcal/mol) CBS limits. We obtain CCSD(T)/CBS best estimates of De = -16.1 ± 0.1 kcal/mol and D0 = -14.3 ± 0.1 kcal/mol, the latter in excellent agreement with the experimental value of -14.22 ± 0.12 kcal/mol.
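For readers unfamiliar with CBS extrapolation, the sketch below shows a common two-point scheme assuming E(n) = E_CBS + A/n^3 in the cardinal number n; this is a standard form for correlation energies, not necessarily the exact scheme used in the paper, and the input energies are placeholders.

    def cbs_two_point(e_n1, e_n2, n1, n2):
        # Eliminate A from E(n) = E_CBS + A * n**-3 using two basis sets.
        w = n2**3 / (n2**3 - n1**3)
        return w * e_n2 + (1.0 - w) * e_n1

    # Illustrative quadruple-/quintuple-zeta binding energies (kcal/mol):
    print(cbs_two_point(-15.90, -16.05, 4, 5))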
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mardirossian, Narbe; Head-Gordon, Martin
2013-08-22
For a set of eight equilibrium intermolecular complexes, it is discovered in this paper that the basis set limit (BSL) cannot be reached by aug-cc-pV5Z for three of the Minnesota density functionals: M06-L, M06-HF, and M11-L. In addition, the M06 and M11 functionals exhibit substantial, but less severe, difficulties in reaching the BSL. By using successively finer grids, it is demonstrated that this issue is not related to the numerical integration of the exchange-correlation functional. In addition, it is shown that the difficulty in reaching the BSL is not a direct consequence of the structure of the augmented functions in Dunning's basis sets, since modified augmentation yields similar results. By using a very large custom basis set, the BSL appears to be reached for the HF dimer for all of the functionals. As a result, it is concluded that the difficulties faced by several of the Minnesota density functionals are related to an interplay between the form of these functionals and the structure of standard basis sets. It is speculated that the difficulty in reaching the basis set limit is related to the magnitude of the inhomogeneity correction factor (ICF) of the exchange functional. A simple modification of the M06-L exchange functional that systematically reduces the basis set superposition error (BSSE) for the HF dimer in the aug-cc-pVQZ basis set is presented, further supporting the speculation that the difficulty in reaching the BSL is caused by the magnitude of the exchange functional ICF. Finally, the BSSE is plotted with respect to the internuclear distance of the neon dimer for two of the examined functionals.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miliordos, Evangelos; Aprà, Edoardo; Xantheas, Sotiris S.
We establish a new estimate for the binding energy between two benzene molecules in the parallel-displaced (PD) conformation by systematically converging (i) the intra- and intermolecular geometry at the minimum, (ii) the expansion of the orbital basis set, and (iii) the level of electron correlation. The calculations were performed at the second-order Møller–Plesset perturbation (MP2) and the coupled cluster including singles, doubles, and a perturbative estimate of triples replacement [CCSD(T)] levels of electronic structure theory. At both levels of theory, by including results corrected for basis set superposition error (BSSE), we have estimated the complete basis set (CBS) limit by employing the family of Dunning's correlation-consistent polarized valence basis sets. The largest MP2 calculation was performed with the cc-pV6Z basis set (2772 basis functions), whereas the largest CCSD(T) calculation was with the cc-pV5Z basis set (1752 basis functions). The cluster geometries were optimized with basis sets up to quadruple-ζ quality, observing that both their intra- and intermolecular parts have practically converged with the triple-ζ quality sets. The use of converged geometries was found to play an important role in obtaining accurate estimates of the CBS limits. Our results demonstrate that the binding energies with the families of the plain (cc-pVnZ) and augmented (aug-cc-pVnZ) sets converge [within <0.01 kcal/mol for MP2 and <0.15 kcal/mol for CCSD(T)] to the same CBS limit. In addition, the average of the uncorrected and BSSE-corrected binding energies was found to converge to the same CBS limit much faster than either of its two constituents (the uncorrected or BSSE-corrected binding energies). Because the family of augmented basis sets (especially the larger sets) causes serious linear-dependency problems, the plain basis sets (for which no linear dependencies were found) are deemed a more efficient and straightforward path to an accurate CBS limit. We considered extrapolations of the uncorrected (ΔE) and BSSE-corrected (ΔE_cp) binding energies, their average value (ΔE_ave), as well as the average of the latter over the plain and augmented sets (ΔẼ_ave), with the cardinal number of the basis set n. Our best estimate of the CCSD(T)/CBS limit for the π–π binding energy in the PD benzene dimer is De = -2.65 ± 0.02 kcal/mol. The best CCSD(T)/cc-pV5Z calculated value is -2.62 kcal/mol, just 0.03 kcal/mol away from the CBS limit. For comparison, the MP2/CBS limit estimate is -5.00 ± 0.01 kcal/mol, demonstrating a 90% overbinding with respect to CCSD(T). Finally, the spin-component-scaled (SCS) MP2 variant was found to closely reproduce the CCSD(T) results for each basis set, while scaled-opposite-spin (SOS) MP2 yielded results that are too low when compared to CCSD(T).
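A short sketch of the counterpoise (CP) recipe behind the BSSE-corrected values, and of the uncorrected/corrected average that the authors find converges fastest; all energies below are made-up placeholders, and monomer deformation terms are omitted for brevity.

    def cp_binding(e_dimer_ab, e_mon1_ab, e_mon2_ab):
        # Counterpoise-corrected binding energy: each monomer is evaluated
        # in the full dimer basis (ghost functions on the partner's sites).
        return e_dimer_ab - e_mon1_ab - e_mon2_ab

    de_raw = -2.80                                   # uncorrected, kcal/mol
    de_cp = cp_binding(-463.10, -230.20, -230.15)    # placeholder totals
    de_ave = 0.5 * (de_raw + de_cp)                  # the average extrapolated above
    print(de_cp, de_ave)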
Atomization Energies of SO and SO2; Basis Set Extrapolation Revisited
NASA Technical Reports Server (NTRS)
Bauschlicher, Charles W., Jr.; Ricca, Alessandra; Arnold, James (Technical Monitor)
1998-01-01
The addition of tight functions to sulphur and extrapolation to the complete basis set limit are required to obtain accurate atomization energies. Six different extrapolation procedures are tried. The best atomization energies come from the series of basis sets that yield the most consistent results across all extrapolation techniques. In the variable-alpha approach, alpha values larger than 4.5 or smaller than 3 appear to suggest that the extrapolation may not be reliable. It does not appear possible to determine a reliable basis set series using only the triple- and quadruple-zeta based sets. The scalar relativistic effects reduce the atomization energies of SO and SO2 by 0.34 and 0.81 kcal/mol, respectively, and clearly must be accounted for if a highly accurate atomization energy is to be computed. The magnitude of the core-valence (CV) contribution to the atomization energy is affected by missing diffuse valence functions. The CV contribution is much more stable if basis set superposition errors are accounted for. A similar study of SF, SF(+), and SF6 shows that the best family of basis sets varies with the nature of the S bonding.
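The variable-alpha approach can be sketched as a three-parameter fit E(n) = E_CBS + A*n**(-alpha) to a basis-set series; per the abstract, fitted alpha values outside roughly [3, 4.5] flag an unreliable extrapolation. The energies below are synthetic, generated for illustration only.

    import numpy as np
    from scipy.optimize import curve_fit

    def model(n, e_cbs, a, alpha):
        return e_cbs + a * n ** (-alpha)

    n = np.array([3.0, 4.0, 5.0])                    # triple- to quintuple-zeta
    e = np.array([124.943, 125.106, 125.157])        # synthetic energies, kcal/mol
    (e_cbs, a, alpha), _ = curve_fit(model, n, e, p0=(125.0, -10.0, 3.0))
    print(e_cbs, alpha)                              # alpha near 3.5 for these data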
Whiting, Mark
2013-03-01
Parenting a child with complex needs or disabilities is a challenging proposition. This study, which drew upon the experiences of the parents of 34 children (in 33 families), set out to explore the themes of impact, need for help and support, and meaning/sense-making as they were related by parents. Data were collected using semi-structured interviews, and an emerging theoretical framework was validated through the use of a series of mind-maps® which were presented to individual parents as the basis for a second-round (verificational) interview. Parents were nominated into the study by health care professionals who were asked to assign the subject children to one of three separate sub-groups: children with a disability; children with a life-limiting/life-threatening illness; or children with a technology dependence. Comparisons were made between the three study sub-groups in order to identify areas of consistency and of inconsistency. A fourth study theme, 'battleground', emerged entirely from within the data set. Sense-making occupied a central position within the overall theoretical framework for the study, and parental perception of 'battleground' presented as a significant element of parental sense-making, particularly in the context of their relationships with professional staff. © The Author(s) 2012.
King, L; Gill, T; Allender, S; Swinburn, B
2011-05-01
Best practice in obesity prevention has generally been defined in terms of 'what' needs to be done while neglecting 'how'. A multifaceted definition of best practice, which combines available evidence on what actions to take with an established process for interpreting this information in a specific community context, provides a more appropriate basis for defining the principles of best practice in community-based obesity prevention. Based on analysis of a range of literature, a preliminary set of principles was drafted and progressively revised through further analyses of published literature and a series of consultations. The framework for best practice principles comprises: community engagement, programme design and planning, evaluation, implementation and sustainability, and governance. Specific principles were formulated within this framework. While many principles were generic, distinctive features of obesity prevention were also covered. The engagement of end-users influenced the design and formatting of the outputs, which represent three levels of knowledge transfer: detailed evidence summaries, guiding questions for programme planners and a briefer set of questions for simpler communication purposes. The best practice principles provide a valuable mechanism for the translation of existing evidence and experience into the decision-making processes for planning, implementing and evaluating the complex community-based interventions needed for successful obesity prevention. © 2010 The Authors. Obesity Reviews © 2010 International Association for the Study of Obesity.
Hadoop neural network for parallel and distributed feature selection.
Hodge, Victoria J; O'Keefe, Simon; Austin, Jim
2016-06-01
In this paper, we introduce a theoretical basis for a Hadoop-based neural network for parallel and distributed feature selection in Big Data sets. It is underpinned by an associative memory (binary) neural network, which is highly amenable to parallel and distributed processing and fits with the Hadoop paradigm. There are many feature selectors described in the literature, all with various strengths and weaknesses. We present the implementation details of five feature selection algorithms constructed using our artificial neural network framework embedded in Hadoop YARN. Hadoop allows parallel and distributed processing: each feature selector can be divided into subtasks, and the subtasks can then be processed in parallel. Multiple feature selectors can also be processed simultaneously (in parallel), allowing multiple feature selectors to be compared. We identify commonalities among the five feature selectors. All can be processed in the framework using a single representation, and the overall processing can also be greatly reduced by processing the common aspects of the feature selectors only once and propagating these aspects across all five feature selectors as necessary. This allows the best feature selector, and the actual features to select, to be identified for large and high-dimensional data sets by exploiting the efficiency and flexibility of embedding the binary associative-memory neural network in Hadoop. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
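The map-style parallelism described here can be illustrated with a plain-Python analogy (not the paper's binary associative memory): each feature-scoring subtask is independent, so it can be farmed out to workers, just as subtasks map onto Hadoop/YARN nodes. The scoring criterion below is a simple correlation, chosen only for brevity.

    import numpy as np
    from multiprocessing import Pool

    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 50))                 # samples x features
    y = (X[:, 3] + 0.5 * X[:, 7] > 0).astype(float)

    def score(j):
        # Absolute correlation between feature j and the label.
        return j, abs(np.corrcoef(X[:, j], y)[0, 1])

    if __name__ == "__main__":
        with Pool(4) as pool:                       # stand-in for cluster workers
            scores = dict(pool.map(score, range(X.shape[1])))
        best = sorted(scores, key=scores.get, reverse=True)[:5]
        print(best)                                 # features 3 and 7 rank highly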
ERIC Educational Resources Information Center
Bowen, J. Philip; Sorensen, Jennifer B.; Kirschner, Karl N.
2007-01-01
The analysis explains the basis set superposition error (BSSE) and fragment relaxation involved in calculating interaction energies using various first-principles theories. Treating the interacting fragments at a correlated level and increasing the size of the basis set can help decrease the BSSE to a great extent.
pyGIMLi: An open-source library for modelling and inversion in geophysics
NASA Astrophysics Data System (ADS)
Rücker, Carsten; Günther, Thomas; Wagner, Florian M.
2017-12-01
Many tasks in applied geosciences cannot be solved by single measurements, but require the integration of geophysical, geotechnical and hydrological methods. Numerical simulation techniques are essential both for planning and interpretation, as well as for the process understanding of modern geophysical methods. These trends encourage open, simple, and modern software architectures aiming at a uniform interface for interdisciplinary and flexible modelling and inversion approaches. We present pyGIMLi (Python Library for Inversion and Modelling in Geophysics), an open-source framework that provides tools for modelling and inversion of various geophysical, but also hydrological, methods. The modelling component supplies discretization management and the numerical basis for finite-element and finite-volume solvers in 1D, 2D and 3D on arbitrarily structured meshes. The generalized inversion framework solves the minimization problem with a Gauss-Newton algorithm for any physical forward operator and provides opportunities for uncertainty and resolution analyses. More general requirements, such as flexible regularization strategies, time-lapse processing and different sorts of coupling between individual methods, are provided independently of the actual methods used. The usage of pyGIMLi is first demonstrated by solving the steady-state heat equation, followed by a demonstration of more complex capabilities for the combination of different geophysical data sets. A fully coupled hydrogeophysical inversion of electrical resistivity tomography (ERT) data from a simulated tracer experiment is presented, which allows direct reconstruction of the underlying hydraulic conductivity distribution of the aquifer. Another example demonstrates the improvement of jointly inverting ERT and ultrasonic data with respect to saturation by a new approach that incorporates petrophysical relations in the inversion. Potential applications of the presented framework are manifold and include time-lapse, constrained, joint, and coupled inversions of various geophysical and hydrological data sets.
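The regularized Gauss-Newton update at the heart of such inversion frameworks can be sketched in generic numpy (this is not pyGIMLi's actual API; data weighting is omitted for brevity):

    import numpy as np

    def gauss_newton_step(m, d_obs, forward, jacobian, lam, Wm):
        # One step of min ||d - f(m)||^2 + lam * ||Wm m||^2, linearized at m.
        J = jacobian(m)                             # sensitivity matrix
        r = d_obs - forward(m)                      # data residual
        A = J.T @ J + lam * (Wm.T @ Wm)             # regularized normal equations
        dm = np.linalg.solve(A, J.T @ r - lam * (Wm.T @ Wm) @ m)
        return m + dm

    # Toy linear problem f(m) = G m: one step yields the regularized solution.
    G = np.random.default_rng(1).normal(size=(20, 10))
    d = G @ np.ones(10)
    m = gauss_newton_step(np.zeros(10), d, lambda m: G @ m, lambda m: G,
                          lam=0.1, Wm=np.eye(10))
    print(np.round(m, 2))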
Delaney, Aogán; Tamás, Peter A; Crane, Todd A; Chesterman, Sabrina
2016-01-01
There is increasing interest in using systematic review to synthesize evidence on the social and environmental effects of, and adaptations to, climate change. Use of systematic review for evidence in this field is complicated by the heterogeneity of methods used and by uneven reporting. In order to facilitate synthesis of results and design of subsequent research, a method, construct-centered methods aggregation, was designed to 1) provide a transparent, valid and reliable description of research methods, 2) support comparability of primary studies and 3) contribute to a shared empirical basis for improving research practice. Rather than taking research reports at face value, research designs are reviewed through inductive analysis. This involves bottom-up identification of constructs, definitions and operationalizations; assessment of concepts' commensurability through comparison of definitions; identification of theoretical frameworks through patterns of construct use; and integration of transparently reported and valid operationalizations into ideal-type research frameworks. Through the integration of reliable bottom-up inductive coding from operationalizations and top-down coding driven from stated theory with expert interpretation, construct-centered methods aggregation enabled both resolution of heterogeneity within identically named constructs and merging of differently labeled but identical constructs. These two processes allowed transparent, rigorous and contextually sensitive synthesis of the research presented in an uneven set of reports undertaken in a heterogeneous field. If adopted more broadly, construct-centered methods aggregation may contribute to the emergence of a valid, empirically grounded description of methods used in primary research. These descriptions may function as a set of expectations that improves the transparency of reporting and as an evolving comprehensive framework that supports both interpretation of existing and design of future research.
Major, M E; Kwakman, R; Kho, M E; Connolly, B; McWilliams, D; Denehy, L; Hanekom, S; Patman, S; Gosselink, R; Jones, C; Nollet, F; Needham, D M; Engelbert, R H H; van der Schaaf, M
2016-10-29
The study objective was to obtain consensus on physical therapy (PT) in the rehabilitation of critical illness survivors after hospital discharge. Research questions were: what are PT goals, what are recommended measurement tools, and what constitutes an optimal PT intervention for survivors of critical illness? A Delphi consensus study was conducted. Panelists were included based on relevant fields of expertise, years of clinical experience, and publication record. A literature review identified five themes, forming the basis for Delphi round one, which was aimed at generating ideas. Statements were drafted and ranked on a 5-point Likert scale in two additional rounds with the objective of reaching consensus. Results were expressed as median and semi-interquartile range (SIQR), with the consensus threshold set at SIQR ≤ 0.5. Ten internationally established researchers and clinicians participated in this Delphi panel, with response rates of 80%, 100%, and 100% across the three rounds. Consensus was reached on 88.5% of the statements, resulting in a framework for PT after hospital discharge. Essential handover information should include information on 15 parameters. A core set of outcomes should test exercise capacity, skeletal muscle strength, function in activities of daily living, mobility, quality of life, and pain. PT interventions should include functional exercises, circuit and endurance training, strengthening exercises for limb and respiratory muscles, education on recovery, and a nutritional component. Screening tools to identify impairments in other health domains and referral to specialists are proposed. A consensus-based framework for optimal PT after hospital discharge is proposed. Future research should focus on feasibility testing of this framework, developing risk stratification tools and validating core outcome measures for ICU survivors.
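The consensus statistic used here is easy to state in code; a minimal sketch of the median/semi-interquartile-range rule, with illustrative panel ratings:

    import numpy as np

    def consensus(ratings, threshold=0.5):
        # Median Likert rating with semi-interquartile range (SIQR);
        # consensus is declared when SIQR <= threshold.
        q1, med, q3 = np.percentile(ratings, [25, 50, 75])
        siqr = (q3 - q1) / 2.0
        return med, siqr, siqr <= threshold

    print(consensus([4, 4, 5, 4, 4, 5, 4, 3, 4, 4]))   # illustrative ratings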
ERIC Educational Resources Information Center
Barnhardt, Bradford; Ginns, Paul
2014-01-01
This article orients a recently proposed alienation-based framework for student learning theory (SLT) to the empirical basis of the approaches to learning perspective. The proposed framework makes new macro-level interpretations of an established micro-level theory, across three levels of interpretation: (1) a context-free psychological state…
The effect of diffuse basis functions on valence bond structural weights
NASA Astrophysics Data System (ADS)
Galbraith, John Morrison; James, Andrew M.; Nemes, Coleen T.
2014-03-01
Structural weights and bond dissociation energies have been determined for H-F, H-X, and F-X molecules (-X = -OH, -NH2, and -CH3) at the valence bond self-consistent field (VBSCF) and breathing orbital valence bond (BOVB) levels of theory with the aug-cc-pVDZ and 6-31++G(d,p) basis sets. At the BOVB level, the aug-cc-pVDZ basis set yields a counterintuitive ordering of ionic structural weights when the initial heavy-atom s-type basis functions are included. For H-F, H-OH, and F-X, the ordering follows chemical intuition when these basis functions are not included. These counterintuitive weights are shown to result from the diffuse polarisation function on one VB fragment being spatially located, in part, on the other VB fragment. Except in the case of F-CH3, this problem is corrected with the 6-31++G(d,p) basis set. The initial heavy-atom s-type functions are shown to make an important contribution to the VB orbitals and bond dissociation energies and, therefore, should not be excluded. It is recommended not to use diffuse basis sets in valence bond calculations unless absolutely necessary. If diffuse basis sets are needed, the 6-31++G(d,p) basis set should be used with caution and the structural weights checked against VBSCF values, which have been shown to follow the expected ordering in all cases.
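The abstract does not spell out which weight definition is used; a common choice in VB theory is the Chirgwin-Coulson definition, sketched below as an assumption, with illustrative coefficients and structure overlaps:

    import numpy as np

    def chirgwin_coulson(c, S):
        # Chirgwin-Coulson weights W_i = c_i * (S c)_i, normalized to sum to 1.
        w = c * (S @ c)
        return w / w.sum()

    c = np.array([0.70, 0.45, 0.30])        # illustrative structure coefficients
    S = np.array([[1.0, 0.4, 0.3],
                  [0.4, 1.0, 0.2],
                  [0.3, 0.2, 1.0]])         # structure overlap matrix
    print(chirgwin_coulson(c, S))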
Donny, Eric C.; Hatsukami, Dorothy K.; Benowitz, Neal L.; Sved, Alan F.; Tidey, Jennifer W.; Cassidy, Rachel N.
2014-01-01
Introduction Both the Tobacco Control Act in the U.S. and Article 9 of the Framework Convention on Tobacco Control enable governments to directly address the addictiveness of combustible tobacco by reducing nicotine through product standards. Although nicotine may have some harmful effects, the detrimental health effects of smoked tobacco are primarily due to non-nicotine constituents. Hence, the health effects of nicotine reduction would likely be determined by changes in behavior that result in changes in smoke exposure. Methods Herein, we review the current evidence on nicotine reduction and discuss some of the challenges in establishing the empirical basis for regulatory decisions. Results To date, research suggests that very low nicotine content cigarettes produce a desirable set of outcomes, including reduced exposure to nicotine, reduced smoking, and reduced dependence, without significant safety concerns. However, much is still unknown, including the effects of gradual versus abrupt changes in nicotine content, effects in vulnerable populations, and impact on youth. Discussion A coordinated effort must be made to provide the best possible scientific basis for regulatory decisions. The outcome of this effort may provide the foundation for a novel approach to tobacco control that dramatically reduces the devastating health consequences of smoked tobacco. PMID:24967958
Idzerda, Leanne; Rader, Tamara; Tugwell, Peter; Boers, Maarten
2014-05-01
The usefulness of randomized controlled trials to advance clinical care depends upon the outcomes reported, but disagreement on the choice of outcome measures has resulted in inconsistency and the potential for reporting bias. One solution to this problem is the development of a core outcome set: a minimum set of outcome measures deemed critical for clinical decision making. Within rheumatology, the Outcome Measures in Rheumatology (OMERACT) initiative has pioneered the development of core outcome sets since 1992. As the number of diseases addressed by OMERACT has increased and its experience in formulating core sets has grown, clarification and update of the conceptual framework and formulation of a more explicit process of area/domain core set development have become necessary. As part of the update process of the OMERACT Filter criteria to version 2, a literature review was undertaken to compare and contrast the OMERACT conceptual framework with others within and outside rheumatology. A scoping search was undertaken to examine the extent, range, and nature of conceptual frameworks for core set outcome selection in health. We searched the following resources: Cochrane Library Methods Group Register; Medline; Embase; PsycInfo; Environmental Studies and Policy Collection; and ABI/INFORM Global. We also conducted a targeted Google search. Five conceptual frameworks were identified: the WHO tripartite definition of health; the 5 Ds (discomfort, disability, drug toxicity, dollar cost, and death); the International Classification of Functioning (ICF); PROMIS (Patient-Reported Outcomes Measurement System); and the Outcomes Hierarchy. Of these, only the 5 Ds and ICF frameworks have been systematically applied in core set development. Outside the area of rheumatology, several core sets were identified; these had been developed through a limited range of consensus-based methods with varying degrees of methodological rigor. None applied a framework to ensure content validity of the end product. This scoping review reinforced the need for clear methods and standards for core set development. Based on these findings, OMERACT will make its own conceptual framework and working process more explicit. Proposals for how to achieve this were discussed at the OMERACT 11 conference.
Toward a computational theory for motion understanding: The expert animators model
NASA Technical Reports Server (NTRS)
Mohamed, Ahmed S.; Armstrong, William W.
1988-01-01
Artificial intelligence researchers claim to understand some aspect of human intelligence when their model is able to emulate it. In the context of computer graphics, the ability to go from motion representation to convincing animation should accordingly be treated not simply as a trick for computer graphics programmers but as an important epistemological and methodological goal. In this paper we investigate a unifying model for animating a group of articulated bodies, such as humans and robots, in a three-dimensional environment. The proposed model is considered in the framework of knowledge representation and processing, with special reference to motion knowledge. The model is meant to help set the basis for a computational theory of motion understanding applied to articulated bodies.
A periodic table of coiled-coil protein structures.
Moutevelis, Efrosini; Woolfson, Derek N
2009-01-23
Coiled coils are protein structure domains with two or more alpha-helices packed together via interlacing of side chains, known as knobs-into-holes packing. We analysed and classified a large set of coiled-coil structures using a combination of automated and manual methods. This led to a systematic classification that we termed a "periodic table of coiled coils," which we have made available at http://coiledcoils.chm.bris.ac.uk/ccplus/search/periodic_table. In this table, coiled-coil assemblies are arranged in columns with increasing numbers of alpha-helices and in rows of increased complexity. The table provides a framework for understanding possibilities in and limits on coiled-coil structures and a basis for future prediction, engineering and design studies.
Sparse representation of Gravitational Sound
NASA Astrophysics Data System (ADS)
Rebollo-Neira, Laura; Plastino, A.
2018-03-01
Gravitational Sound clips produced by the Laser Interferometer Gravitational-Wave Observatory (LIGO) and the Massachusetts Institute of Technology (MIT) are considered within the particular context of data reduction. We advance a procedure to this effect and show that these types of signals can be approximated with high quality using significantly fewer elementary components than those required within the standard orthogonal basis framework. Furthermore, a local sparsity measure is shown to render meaningful information about the variation of a signal along time, by generating a set of local sparsity values which is much smaller than the dimension of the signal. This point is further illustrated with a more complex signal, generated by Milde Science Communication to disseminate Gravitational Sound in the form of a ring tone.
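The flavor of sparse approximation in a redundant dictionary can be conveyed with a minimal matching-pursuit sketch over an overcomplete cosine dictionary; this is a generic illustration, not the authors' procedure or data.

    import numpy as np

    def matching_pursuit(signal, D, n_atoms):
        # Greedily pick the dictionary atom best correlated with the residual.
        r, coeffs = signal.copy(), {}
        for _ in range(n_atoms):
            k = int(np.argmax(np.abs(D.T @ r)))
            a = D[:, k] @ r
            coeffs[k] = coeffs.get(k, 0.0) + a
            r = r - a * D[:, k]
        return coeffs, r

    N, M = 256, 1024                         # signal length, dictionary size
    t = np.arange(N)
    D = np.cos(np.pi * np.outer(t + 0.5, np.arange(M)) / M)
    D /= np.linalg.norm(D, axis=0)           # unit-norm atoms
    x = np.sin(2 * np.pi * 5 * t / N) * np.exp(-t / 100.0)
    coeffs, resid = matching_pursuit(x, D, 20)
    print(len(coeffs), np.linalg.norm(resid) / np.linalg.norm(x))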
On the effects of basis set truncation and electron correlation in conformers of 2-hydroxy-acetamide
NASA Astrophysics Data System (ADS)
Szarecka, A.; Day, G.; Grout, P. J.; Wilson, S.
Ab initio quantum chemical calculations have been used to study the differences in energy between two gas-phase conformers of the 2-hydroxy-acetamide molecule that possess intramolecular hydrogen bonding. In particular, rotation around the central C-C bond has been considered as a factor determining the structure of the hydrogen bond and stabilization of the conformer. Energy calculations include full geometry optimization using both the restricted matrix Hartree-Fock model and second-order many-body perturbation theory with a number of commonly used basis sets. The basis sets employed ranged from the minimal STO-3G set to 'split-valence' sets up to 6-31G. The effects of polarization functions were also studied. The results display a strong basis set dependence.
A Reduced Basis Method with Exact-Solution Certificates for Symmetric Coercive Equations
2013-11-06
We consider the energy associated with the infinite-dimensional weak solution of parametrized symmetric coercive partial differential equations. The method builds bounds with respect to the infinite-dimensional weak solution and aims to entirely remove the issue of the "truth" discretization within the certified reduced basis framework. We in particular introduce a reduced basis method that provides rigorous upper and lower bounds.
International spinal cord injury pulmonary function basic data set.
Biering-Sørensen, F; Krassioukov, A; Alexander, M S; Donovan, W; Karlsson, A-K; Mueller, G; Perkash, I; Sheel, A William; Wecht, J; Schilero, G J
2012-06-01
To develop the International Spinal Cord Injury (SCI) Pulmonary Function Basic Data Set within the framework of the International SCI Data Sets, in order to facilitate consistent collection and reporting of basic bronchopulmonary findings in the SCI population. International. The SCI Pulmonary Function Data Set was developed by an international working group. The initial data set document was revised on the basis of suggestions from members of the Executive Committee of the International SCI Standards and Data Sets, the International Spinal Cord Society (ISCoS) Executive and Scientific Committees, the American Spinal Injury Association (ASIA) Board, other interested organizations and societies, and individual reviewers. In addition, the data set was posted for 2 months on the ISCoS and ASIA websites for comments. The final International SCI Pulmonary Function Data Set contains questions on pulmonary conditions diagnosed before the spinal cord lesion, if available, to be obtained only once; smoking history; and pulmonary complications and conditions after the spinal cord lesion, which may be collected at any time. These data include information on pneumonia, asthma, chronic obstructive pulmonary disease and sleep apnea. Current utilization of ventilator assistance, including mechanical ventilation, diaphragmatic pacing, phrenic nerve stimulation and bi-level positive airway pressure, can be reported, as well as results from pulmonary function testing: forced vital capacity, forced expiratory volume in one second and peak expiratory flow. The complete instructions for data collection and the data sheet itself are freely available on the website of ISCoS (http://www.iscos.org.uk).
The ICAP Framework: Linking Cognitive Engagement to Active Learning Outcomes
ERIC Educational Resources Information Center
Chi, Michelene T. H.; Wylie, Ruth
2014-01-01
This article describes the ICAP framework that defines cognitive engagement activities on the basis of students' overt behaviors and proposes that engagement behaviors can be categorized and differentiated into one of four modes: "Interactive," "Constructive," "Active," and "Passive." The ICAP hypothesis…
On the optimization of Gaussian basis sets
NASA Astrophysics Data System (ADS)
Petersson, George A.; Zhong, Shijun; Montgomery, John A.; Frisch, Michael J.
2003-01-01
A new procedure for the optimization of the exponents, α_j, of Gaussian basis functions, Y_lm(ϑ,φ) r^l exp(-α_j r²), is proposed and evaluated. The direct optimization of the exponents is hindered by the very strong coupling between these nonlinear variational parameters. However, expansion of the logarithms of the exponents in the orthonormal Legendre polynomials, P_k, of the index, j: ln α_j = Σ_{k=0}^{k_max} A_k P_k((2j - 2)/(N_prim - 1) - 1), yields a new set of well-conditioned parameters, A_k, and a complete sequence of well-conditioned exponent optimizations proceeding from the even-tempered basis set (k_max = 1) to a fully optimized basis set (k_max = N_prim - 1). The error relative to the exact numerical self-consistent-field limit for a six-term expansion is consistently no more than 25% larger than the error for the completely optimized basis set. Thus, there is no need to optimize more than six well-conditioned variational parameters, even for the largest sets of Gaussian primitives.
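The parametrization is easy to reproduce; a minimal sketch generating exponents from the Legendre expansion above (the A_k values are illustrative, not optimized):

    import numpy as np
    from numpy.polynomial import legendre

    def exponents(A, n_prim):
        # ln(alpha_j) = sum_k A_k P_k(x_j), x_j = (2j - 2)/(N_prim - 1) - 1.
        j = np.arange(1, n_prim + 1)
        x = (2 * j - 2) / (n_prim - 1) - 1   # maps j = 1..N_prim onto [-1, 1]
        return np.exp(legendre.legval(x, A))

    print(exponents([1.0, -2.5], 6))         # k_max = 1: even-tempered (constant ratio)
    print(exponents([1.0, -2.5, 0.3], 6))    # one additional degree of freedom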
Lau-Walker, Margaret
2006-02-01
This paper analyses two prominent psychological theories of patient response, illness representation and self-efficacy, and explores the possibilities for the development of a conceptual individualized care model that would make use of both theories. Analysis of the literature established common themes that were used as the basis to form a conceptual framework intended to assist in the joint application of these theories to therapeutic settings. Both theories emphasize personal experience, pre-construction of self, individual response to illness and treatment, and that patients' beliefs are more influential in their recovery than the severity of the illness. Where the theories are most divergent is in their application to therapeutic interventions, which reflects the different sources of influence that each theory emphasizes. Based on their similarities and differences, it is possible to integrate the two theories into a conceptual care model. The Interactive Care Model combines both theories of patient response and provides an explicit framework for further research into the design of effective therapeutic interventions in rehabilitation care.
Collaborative classification of hyperspectral and visible images with convolutional neural network
NASA Astrophysics Data System (ADS)
Zhang, Mengmeng; Li, Wei; Du, Qian
2017-10-01
Recent advances in remote sensing technology have made multisensor data available for the same area, and it is well known that remote sensing data processing and analysis often benefit from multisource data fusion. Specifically, the low spatial resolution of hyperspectral imagery (HSI) degrades the quality of the subsequent classification task, while using visible (VIS) images with high spatial resolution enables high-fidelity spatial analysis. A collaborative classification framework is proposed to fuse HSI and VIS images for finer classification. First, a convolutional neural network model is employed to extract deep spectral features for HSI classification. Second, effective binarized statistical image features are learned as contextual basis vectors for the high-resolution VIS image, followed by a classifier. The proposed approach employs diversified data in a decision fusion, leading to an integration of the rich spectral information, spatial information, and statistical representation information. In particular, the proposed approach eliminates the potential problems of the curse of dimensionality and excessive computation time. Experiments evaluated on two standard data sets demonstrate the better classification performance offered by this framework.
Homeostasis, inflammation, and disease susceptibility.
Kotas, Maya E; Medzhitov, Ruslan
2015-02-26
While modernization has dramatically increased lifespan, it has also witnessed the increasing prevalence of diseases such as obesity, hypertension, and type 2 diabetes. Such chronic, acquired diseases result when normal physiologic control goes awry and may thus be viewed as failures of homeostasis. However, while nearly every process in human physiology relies on homeostatic mechanisms for stability, only some have demonstrated vulnerability to dysregulation. Additionally, chronic inflammation is a common accomplice of the diseases of homeostasis, yet the basis for this connection is not fully understood. Here we review the design of homeostatic systems and discuss universal features of control circuits that operate at the cellular, tissue, and organismal levels. We suggest a framework for classification of homeostatic signals that is based on different classes of homeostatic variables they report on. Finally, we discuss how adaptability of homeostatic systems with adjustable set points creates vulnerability to dysregulation and disease. This framework highlights the fundamental parallels between homeostatic and inflammatory control mechanisms and provides a new perspective on the physiological origin of inflammation. Copyright © 2015 Elsevier Inc. All rights reserved.
Walakira, Eddy J; Ochen, Eric A; Bukuluki, Paul; Allan, Sue
2014-01-01
This article describes a model of care for abandoned and neglected infants in need of urgent physical, social, and medical support as implemented by the Child's i Foundation, an international, nongovernmental organization operating in Uganda. The model discounts the need for long-term care of young children within institutions and challenges the basis for intercountry adoption. Underpinned by the essentials of care continuum provided under the Uganda National Alternative Care Framework (Ministry of Gender, Labour and Social Development, 2012), the model emphasizes the need to effect the reintegration of the separated child within the family of his or her birth, or locally organize foster care or adoption. Highlighting policy and programming lessons, the model showcases a holistic approach to the problem and puts emphasis on interventions that are protective, promotional, and transformational and the use of a community-oriented approach. The model offers guidance to both government and nongovernment actors in addressing the problems of child neglect and abandonment through the implementation of the alternative care framework. © 2014 Michigan Association for Infant Mental Health.
Extending cluster lot quality assurance sampling designs for surveillance programs.
Hund, Lauren; Pagano, Marcello
2014-07-20
Lot quality assurance sampling (LQAS) has a long history of applications in industrial quality control. LQAS is frequently used for rapid surveillance in global health settings, with areas classified as having poor or acceptable performance on the basis of the binary classification of an indicator. Historically, LQAS surveys have relied on simple random samples from the population; however, implementing two-stage cluster designs for surveillance sampling is often more cost-effective than simple random sampling. By applying survey sampling results to the binary classification procedure, we develop a simple and flexible nonparametric procedure to incorporate clustering effects into the LQAS sample design and appropriately inflate the sample size, accommodating finite numbers of clusters in the population when relevant. We then use this framework to discuss principled selection of survey design parameters in longitudinal surveillance programs. We apply this framework to design surveys to detect rises in malnutrition prevalence in nutrition surveillance programs in Kenya and South Sudan, accounting for clustering within villages. By combining historical information with data from previous surveys, we design surveys to detect spikes in the childhood malnutrition rate. Copyright © 2014 John Wiley & Sons, Ltd.
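A sketch of the basic clustering adjustment such designs build on, using the standard design effect DEFF = 1 + (m - 1)*ICC to inflate a simple-random-sample LQAS size (the paper's nonparametric procedure is more refined; numbers are illustrative):

    import math

    def clustered_sample_size(n_srs, m, icc):
        # Inflate an SRS sample size for a two-stage cluster design.
        deff = 1.0 + (m - 1) * icc
        n = n_srs * deff
        clusters = math.ceil(n / m)          # m children sampled per village
        return clusters, clusters * m

    print(clustered_sample_size(n_srs=192, m=10, icc=0.05))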
Guidelines for Bacteriophage Product Certification.
Fauconnier, Alan
2018-01-01
Following decades in the wilderness, bacteriophage therapy is now reappearing as a credible antimicrobial strategy. However, this reemerging therapy does not rekindle without raising sensitive regulatory concerns. Indeed, whereas the European regulatory framework was essentially designed for ready-to-use pharmaceuticals produced on a large scale, bacteriophage therapy relies on a dynamic approach requiring regulation suited to personalized medicine, nonexistent at present. Because of this, no guidelines are currently available for addressing the scientific and regulatory issues specifically related to phage therapy medicinal products (PTMPs). Pending the implementation of an appropriate regulatory framework and the development of ensuing guidelines, several avenues that might lead to PTMP regulatory compliance are explored here. Insights might come from the multi-strain dossier approach set up for particular animal vaccines, from the homologous group concept developed for allergen products, or from the licensing process for veterinary autogenous vaccines. Depending on national legislation, customized preparations prescribed as magistral formulas or used on a named-patient basis are possible regulatory approaches to be considered. However, these schemes are not optimal and should thus be regarded as transitional.
European Union Framework Programme 7 Building the Europe of Knowledge
NASA Astrophysics Data System (ADS)
Akkaş, Nuri
In March 2000, the Lisbon European Council set the goal of becoming by 2010 "the most competitive and dynamic knowledge-based economy in the world, capable of sustainable economic growth with more and better jobs and greater social cohesion". This was called the Lisbon Strategy. The project of creating a European Research Area (ERA) was endorsed as a central element of the Lisbon Strategy to achieve this goal. However, the EU still invests too little in R&D. In 2003, the top 500 private R&D spenders in the EU decreased their R&D investment by 2.0%, while the top 500 private R&D spenders outside the EU increased theirs by 3.9%. Overall R&D investment figures are: EU 1.96%; US 2.59%; S. Korea 2.91%; Japan 3.12%. The ERA is implemented through so-called Framework Programmes (FPs). FP7 is proposed on the basis of a doubling of funds and a duration of 7 years (2007-13). FP7 will fund R&D projects of immediate industrial relevance that address the needs of industry. Projects will include both public research institutions and private companies (PPP).
Fleming, Marc L; Hatfield, Mark D; Wattana, Monica K; Todd, Knox H
2014-03-01
Emergency physicians (EPs) are faced with significant challenges regarding pain management while preventing abuse of prescription opioids. Prescription monitoring programs (PMPs) are increasingly used to help curb the abuse of controlled substances. The objective of this study was to determine EPs' intention to use the Texas PMP within the framework of the Technology Acceptance Model. A cross-sectional, 24-item survey instrument was developed and distributed to EPs attending an emergency medicine conference. PMP nonusers reported a positive intention to use the PMP in the future, with attitude (β = 0.61, p < 0.01) as the only statistically significant predictor of intention. PMP users reported a positive intention to use the PMP, with perceived usefulness (β = 0.62, p < 0.01) as the only statistically significant predictor of intention for PMP users. This exploratory study provides a basis for understanding EPs' intention to use a PMP. The use of PMPs by EPs may lead to a decrease in prescription opioid abuse and improve patient safety related to opioid prescribing in the emergency department setting.
Improving the Effectiveness of Electronic Health Record-Based Referral Processes
2012-01-01
Electronic health records are increasingly being used to facilitate referral communication in the outpatient setting. However, despite support by technology, referral communication between primary care providers and specialists is often unsatisfactory and is unable to eliminate care delays. This may be in part due to lack of attention to how information and communication technology fits within the social environment of health care. Making electronic referral communication effective requires a multifaceted “socio-technical” approach. Using an 8-dimensional socio-technical model for health information technology as a framework, we describe ten recommendations that represent good clinical practices to design, develop, implement, improve, and monitor electronic referral communication in the outpatient setting. These recommendations were developed on the basis of our previous work, current literature, sound clinical practice, and a systems-based approach to understanding and implementing health information technology solutions. Recommendations are relevant to system designers, practicing clinicians, and other stakeholders considering use of electronic health records to support referral communication. PMID:22973874
Robust representation and recognition of facial emotions using extreme sparse learning.
Shojaeilangari, Seyedehsamaneh; Yau, Wei-Yun; Nandakumar, Karthik; Li, Jun; Teoh, Eam Khwang
2015-07-01
Recognition of natural emotions from human faces is an interesting topic with a wide range of potential applications, such as human-computer interaction, automated tutoring systems, image and video retrieval, smart environments, and driver warning systems. Traditionally, facial emotion recognition systems have been evaluated on laboratory-controlled data, which is not representative of the environment faced in real-world applications. To robustly recognize facial emotions in real-world natural situations, this paper proposes an approach called extreme sparse learning, which has the ability to jointly learn a dictionary (a set of basis vectors) and a nonlinear classification model. The proposed approach combines the discriminative power of the extreme learning machine with the reconstruction property of sparse representation to enable accurate classification when presented with noisy signals and imperfect data recorded in natural settings. In addition, this paper presents a new local spatio-temporal descriptor that is distinctive and pose-invariant. The proposed framework achieves state-of-the-art recognition accuracy on both acted and spontaneous facial emotion databases.
Pedagogical competence and value clarification among health educators.
Wistoft, Karen
2009-09-01
Individual and social values are increasingly important in health education. This article examines how health educators in Greenland and Denmark engage in value clarification as part of their educational practices. It presents the results of a study of health professionals in a variety of settings, focusing in particular on how development work and experimentation can strengthen their pedagogical competences. The study focuses on belief, reasoning, interpretation and reflection, rather than routines, skills, or ethical rules, and takes a participatory approach that oscillates between dialogical and qualitative empirical methodologies. It observes pedagogical practice in selected settings in Greenland and the municipality of Copenhagen. Within the framework provided by four discourses that appear to organize communication about health, it shows how values became important to the progress of two research-based development projects. On this basis, the article argues that health education can be effectively grounded in the values, perceptions, and experiences of a given population, while being guided by the health educators' biomedical knowledge and educational values.
Ageing and the economic life cycle: The National Transfer Accounts approach.
Temple, Jeromey B; Rice, James M; McDonald, Peter F
2017-12-01
To illustrate the use of National Transfer Accounts (NTA) for understanding ageing and the economic life cycle in Australia, the NTA methodology is applied using a range of unit record, demographic, and administrative data sets from 1981 to 2010. During early and later life, total consumption (public and private) is greater than labour income. On a time-series and cohort basis, we show that each successive generation has improved its level of well-being (as measured by consumption) relative to previous years or previous cohorts from 1981-82 onwards. We also show a substantial increase in labour income earned by mature-age workers over this period. International comparisons show Australia to have consumption and labour income age profiles very similar to those of Canada but dissimilar to many other countries, driven by differences in demographic and policy settings. The NTA approach provides a powerful framework to track differences in the economic life cycle across age groups, across time, across cohorts and across countries. © 2017 AJA Inc.
Mining algorithm for association rules in big data based on Hadoop
NASA Astrophysics Data System (ADS)
Fu, Chunhua; Wang, Xiaojing; Zhang, Lijun; Qiao, Liying
2018-04-01
To address the inability of traditional association rule mining algorithms to meet the efficiency and scalability demands of very large data sets, the FP-Growth algorithm is taken as an example and parallelized on the Hadoop framework using the MapReduce model. On this basis, it is further improved with a transaction-reduction method to enhance mining efficiency. Experiments on a Hadoop cluster verify the parallel mining results, compare the efficiency of the serial and parallel versions, and examine how mining time varies with the number of nodes and with data volume. The experiments show that the parallelized FP-Growth algorithm accurately mines frequent item sets with good performance and scalability, and can better meet the requirements of big data mining by efficiently extracting frequent item sets and association rules from large data sets.
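As a rough illustration of the two steps this abstract describes (parallel support counting in the map/reduce style, followed by transaction reduction), the following Python sketch simulates the phases in memory; it is a toy stand-in under stated assumptions, not the authors' Hadoop code, and all function names are illustrative.

```python
from collections import Counter
from itertools import chain

# Toy in-memory stand-in for the Hadoop Map/Reduce phases described above
# (the paper runs this on a real Hadoop cluster).

def map_phase(transactions):
    # Map: emit (item, 1) for every item occurrence.
    return chain.from_iterable(((item, 1) for item in t) for t in transactions)

def reduce_phase(pairs):
    # Reduce: sum the counts per item to obtain global supports.
    counts = Counter()
    for item, n in pairs:
        counts[item] += n
    return counts

def transaction_reduce(transactions, counts, min_support):
    # "Transaction reduce" step: prune infrequent items (and then empty
    # transactions) so the parallel FP-Growth passes scan less data.
    frequent = {i for i, n in counts.items() if n >= min_support}
    reduced = [[i for i in t if i in frequent] for t in transactions]
    return [t for t in reduced if t]

transactions = [["a", "b", "c"], ["a", "c"], ["a", "d"], ["b", "c"]]
counts = reduce_phase(map_phase(transactions))
print(transaction_reduce(transactions, counts, min_support=2))
# [['a', 'b', 'c'], ['a', 'c'], ['a'], ['b', 'c']]
```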
Zhu, Wuming; Trickey, S B
2017-12-28
In high magnetic field calculations, anisotropic Gaussian type orbital (AGTO) basis functions are capable of reconciling the competing demands of the spherically symmetric Coulombic interaction and cylindrical magnetic (B field) confinement. However, the best available a priori procedure for composing highly accurate AGTO sets for atoms in a strong B field [W. Zhu et al., Phys. Rev. A 90, 022504 (2014)] yields very large basis sets. Their size is problematical for use in any calculation with unfavorable computational cost scaling. Here we provide an alternative constructive procedure. It is based upon analysis of the underlying physics of atoms in B fields that allows identification of several principles for the construction of AGTO basis sets. Aided by numerical optimization and parameter fitting, followed by fine tuning of fitting parameters, we devise formulae for generating accurate AGTO basis sets in an arbitrary B field. For the hydrogen iso-electronic sequence, a set depends on B field strength, nuclear charge, and orbital quantum numbers. For multi-electron systems, the basis set formulae also include adjustment to account for orbital occupations. Tests of the new basis sets for atoms H through C (1 ≤ Z ≤ 6) and ions Li+, Be+, and B+, in a wide B field range (0 ≤ B ≤ 2000 a.u.), show an accuracy better than a few μhartree for single-electron systems and a few hundredths to a few millihartrees for multi-electron atoms. The relative errors are similar for different atoms and ions in a large B field range, from a few to a couple of tens of millionths, thereby confirming rather uniform accuracy across the nuclear charge Z and B field strength values. Residual basis set errors are two to three orders of magnitude smaller than the electronic correlation energies in multi-electron atoms, a signal of the usefulness of the new AGTO basis sets in correlated wavefunction or density functional calculations for atomic and molecular systems in an external strong B field.
NASA Astrophysics Data System (ADS)
Zhu, Wuming; Trickey, S. B.
2017-12-01
In high magnetic field calculations, anisotropic Gaussian type orbital (AGTO) basis functions are capable of reconciling the competing demands of the spherically symmetric Coulombic interaction and cylindrical magnetic (B field) confinement. However, the best available a priori procedure for composing highly accurate AGTO sets for atoms in a strong B field [W. Zhu et al., Phys. Rev. A 90, 022504 (2014)] yields very large basis sets. Their size is problematical for use in any calculation with unfavorable computational cost scaling. Here we provide an alternative constructive procedure. It is based upon analysis of the underlying physics of atoms in B fields that allows identification of several principles for the construction of AGTO basis sets. Aided by numerical optimization and parameter fitting, followed by fine tuning of fitting parameters, we devise formulae for generating accurate AGTO basis sets in an arbitrary B field. For the hydrogen iso-electronic sequence, a set depends on B field strength, nuclear charge, and orbital quantum numbers. For multi-electron systems, the basis set formulae also include adjustment to account for orbital occupations. Tests of the new basis sets for atoms H through C (1 ≤ Z ≤ 6) and ions Li+, Be+, and B+, in a wide B field range (0 ≤ B ≤ 2000 a.u.), show an accuracy better than a few μhartree for single-electron systems and a few hundredths to a few millihartrees for multi-electron atoms. The relative errors are similar for different atoms and ions in a large B field range, from a few to a couple of tens of millionths, thereby confirming rather uniform accuracy across the nuclear charge Z and B field strength values. Residual basis set errors are two to three orders of magnitude smaller than the electronic correlation energies in multi-electron atoms, a signal of the usefulness of the new AGTO basis sets in correlated wavefunction or density functional calculations for atomic and molecular systems in an external strong B field.
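To make the basis form concrete, here is a minimal sketch of a single anisotropic Gaussian-type orbital in cylindrical coordinates, with independent transverse and longitudinal exponents. The functional form shown is a generic AGTO ansatz and all parameter values are illustrative assumptions; the paper's fitted formulae for generating the exponents are not reproduced.

```python
import numpy as np

# Sketch of one AGTO in cylindrical coordinates (rho, phi, z): a power of
# rho and z times a Gaussian with separate in-plane (alpha) and axial
# (beta) exponents, so the function can squeeze transversely under strong
# B-field confinement while staying Coulomb-like along z. Normalization
# and the paper's exponent-generating formulae are omitted.

def agto(rho, phi, z, m=0, k=0, alpha=1.0, beta=1.0):
    return (rho**abs(m) * z**k
            * np.exp(-alpha * rho**2 - beta * z**2)
            * np.exp(1j * m * phi))

# Strong-field regime: the transverse exponent grows with the Landau
# confinement while the axial exponent stays of the Coulombic order.
print(abs(agto(0.05, 0.0, 0.3, m=1, alpha=1000.0, beta=1.2)))
```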
NASA Astrophysics Data System (ADS)
Choi, Chu Hwan
2002-09-01
Ab initio chemistry has shown great promise in reproducing experimental results and in its predictive power. The many complicated computational models and methods seem impenetrable to an inexperienced scientist, and the reliability of the results is not easily interpreted. The application of midbond orbitals is used to determine a general method for use in calculating weak intermolecular interactions, especially those involving electron-deficient systems. Using the criteria of consistency, flexibility, accuracy and efficiency we propose a supermolecular method of calculation using the full counterpoise (CP) method of Boys and Bernardi, coupled with Møller-Plesset (MP) perturbation theory as an efficient electron-correlative method. We also advocate the use of the highly efficient and reliable correlation-consistent polarized valence basis sets of Dunning. To these basis sets, we add a general set of midbond orbitals and demonstrate greatly enhanced efficiency in the calculation. The H2-H2 dimer is taken as a benchmark test case for our method, and details of the computation are elaborated. Our method reproduces with great accuracy the dissociation energies from previous theoretical studies. The added efficiency of extending the basis sets with conventional means is compared with the performance of our midbond-extended basis sets. The improvement found with midbond functions is notably superior in every case tested. Finally, a novel application of midbond functions to the BH5 complex is presented. The system is an unusual van der Waals complex. The interaction potential curves are presented for several standard basis sets and midbond-enhanced basis sets, as well as for two popular, alternative correlation methods. We report that MP theory appears to be superior to coupled-cluster (CC) in speed, while it is more stable than B3LYP, a widely-used density functional theory (DFT). Application of our general method yields excellent results for the midbond basis sets. Again they prove superior to conventional extended basis sets. Based on these results, we recommend our general approach as a highly efficient, accurate method for calculating weakly interacting systems.
Petruzielo, F R; Toulouse, Julien; Umrigar, C J
2011-02-14
A simple yet general method for constructing basis sets for molecular electronic structure calculations is presented. These basis sets consist of atomic natural orbitals from a multiconfigurational self-consistent field calculation supplemented with primitive functions, chosen such that the asymptotics are appropriate for the potential of the system. Primitives are optimized for the homonuclear diatomic molecule to produce a balanced basis set. Two general features that facilitate this basis construction are demonstrated. First, weak coupling exists between the optimal exponents of primitives with different angular momenta. Second, the optimal primitive exponents for a chosen system depend weakly on the particular level of theory employed for optimization. The explicit case considered here is a basis set appropriate for the Burkatzki-Filippi-Dolg pseudopotentials. Since these pseudopotentials are finite at nuclei and have a Coulomb tail, the recently proposed Gauss-Slater functions are the appropriate primitives. Double- and triple-zeta bases are developed for elements hydrogen through argon. These new bases offer significant gains over the corresponding Burkatzki-Filippi-Dolg bases at various levels of theory. Using a Gaussian expansion of the basis functions, these bases can be employed in any electronic structure method. Quantum Monte Carlo provides an added benefit: expansions are unnecessary since the integrals are evaluated numerically.
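Since the argument above hinges on the primitives being finite at the nucleus while carrying a Coulomb (Slater-like) tail, a sketch of a Gauss-Slater radial factor may help. The form below, exp(-(ζr)²/(1+ζr)), matches our reading of the published Gauss-Slater construction but should be checked against the cited reference before use; the exponent value is illustrative.

```python
import numpy as np

# Sketch of a Gauss-Slater radial primitive: Gaussian-like at the nucleus
# (finite value, zero cusp, matching a pseudopotential that is finite at
# the nucleus) and decaying like a Slater function exp(-zeta*r) at large r,
# matching the Coulomb tail.

def gauss_slater(r, zeta):
    zr = zeta * r
    return np.exp(-zr**2 / (1.0 + zr))

r = np.linspace(0.0, 10.0, 5)
print(gauss_slater(r, zeta=1.5))
# Near r=0 this behaves like exp(-(zeta*r)**2); for zeta*r >> 1 it decays
# like exp(-zeta*r).
```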
Control of Distributed Parameter Systems
1990-08-01
A variant of the general Lotka-Volterra model for interspecific competition was developed; the variant described the emergence of one subpopulation from another. A unified approximation framework for parameter estimation in general linear PDE models has been completed. This framework has provided the theoretical basis for a number of …
Hill, J Grant
2013-09-30
Auxiliary basis sets (ABS) specifically matched to the cc-pwCVnZ-PP and aug-cc-pwCVnZ-PP orbital basis sets (OBS) have been developed and optimized for the 4d elements Y-Pd at the second-order Møller-Plesset perturbation theory level. Calculation of the core-valence electron correlation energies for small to medium sized transition metal complexes demonstrates that the error due to the use of these new sets in density fitting is three to four orders of magnitude smaller than that due to the OBS incompleteness, and hence is considered negligible. Utilizing the ABSs in the resolution-of-the-identity component of explicitly correlated calculations is also investigated, where it is shown that i-type functions are important to produce well-controlled errors in both integrals and correlation energy. Benchmarking at the explicitly correlated coupled cluster with single, double, and perturbative triple excitations level indicates impressive convergence with respect to basis set size for the spectroscopic constants of 4d monofluorides; explicitly correlated double-ζ calculations produce results close to conventional quadruple-ζ, and triple-ζ is within chemical accuracy of the complete basis set limit. Copyright © 2013 Wiley Periodicals, Inc.
Benchmark of Ab Initio Bethe-Salpeter Equation Approach with Numeric Atom-Centered Orbitals
NASA Astrophysics Data System (ADS)
Liu, Chi; Kloppenburg, Jan; Kanai, Yosuke; Blum, Volker
The Bethe-Salpeter equation (BSE) approach based on the GW approximation has been shown to be successful for optical spectra prediction of solids and recently also for small molecules. We here present an all-electron implementation of the BSE using numeric atom-centered orbital (NAO) basis sets. In this work, we present a benchmark of the BSE as implemented in FHI-aims for low-lying excitation energies of a set of small organic molecules, the well-known Thiel set. The difference between our implementation (using an analytic continuation of the GW self-energy on the real axis) and results generated by a fully frequency-dependent GW treatment on the real axis is on the order of 0.07 eV for the benchmark molecular set. We study the convergence behavior toward the complete basis set limit for excitation spectra, using a group of valence correlation consistent NAO basis sets (NAO-VCC-nZ), as well as standard NAO basis sets for ground state DFT with extended augmentation functions (NAO+aug). The BSE results and convergence behavior are compared to linear-response time-dependent DFT, where excellent numerical convergence is shown for NAO+aug basis sets.
Wolf, Alexander; Reiher, Markus; Hess, Bernd Artur
2004-05-08
The first molecular calculations with the generalized Douglas-Kroll method up to fifth order in the external potential (DKH5) are presented. We study the spectroscopic parameters and electron affinity of the tin oxide molecule SnO and its anion SnO(-) applying nonrelativistic as well as relativistic calculations with higher orders of the DK approximation. In order to guarantee highly accurate results close to the basis set limit, an all-electron basis for Sn of at least quintuple-zeta quality has been constructed and optimized. All-electron CCSD(T) calculations of the potential energy curves of both SnO and SnO(-) reproduce the experimental values very well. Relative energies and valence properties are already well described with the established standard second-order approximation DKH2 and the higher-order corrections DKH3-DKH5 hardly affect these quantities. However, an accurate description of total energies and inner-shell properties requires superior relativistic schemes up to DKH5. (c) 2004 American Institute of Physics.
Convoluted Quasi Sturmian basis for the two-electron continuum
NASA Astrophysics Data System (ADS)
Ancarani, Lorenzo Ugo; Zaytsev, A. S.; Zaytsev, S. A.
2016-09-01
In the construction of solutions for the Coulomb three-body scattering problem one encounters a series of mathematical and numerical difficulties, one of which is the cumbersome boundary conditions the wave function must obey. We propose to describe the continuum of a Coulomb three-body system with a set of two-particle functions, named Convoluted Quasi Sturmian (CQS) functions. They are built from recently introduced Quasi Sturmian (QS) functions, which have the merit of possessing a closed form. Unlike a simple product of two one-particle functions, the CQS functions by construction look asymptotically like a six-dimensional outgoing spherical wave. The proposed CQS basis is tested through the study of the double ionization of helium by high-energy electron impact in the framework of the Temkin-Poet model. An adequate logarithmic-like phase factor is further included in order to take into account the Coulomb interelectronic interaction and formally build the correct asymptotic behavior when all interparticle distances are large. With such a phase factor (which can easily be extended to take into account higher partial waves), rapid convergence of the expansion is obtained.
Spatial Bayesian Latent Factor Regression Modeling of Coordinate-based Meta-analysis Data
Montagna, Silvia; Wager, Tor; Barrett, Lisa Feldman; Johnson, Timothy D.; Nichols, Thomas E.
2017-01-01
Summary Now over 20 years old, functional MRI (fMRI) has a large and growing literature that is best synthesised with meta-analytic tools. As most authors do not share image data, only the peak activation coordinates (foci) reported in the paper are available for Coordinate-Based Meta-Analysis (CBMA). Neuroimaging meta-analysis is used to 1) identify areas of consistent activation; and 2) build a predictive model of task type or cognitive process for new studies (reverse inference). To simultaneously address these aims, we propose a Bayesian point process hierarchical model for CBMA. We model the foci from each study as a doubly stochastic Poisson process, where the study-specific log intensity function is characterised as a linear combination of a high-dimensional basis set. A sparse representation of the intensities is guaranteed through latent factor modeling of the basis coefficients. Within our framework, it is also possible to account for the effect of study-level covariates (meta-regression), significantly expanding the capabilities of the current neuroimaging meta-analysis methods available. We apply our methodology to synthetic data and neuroimaging meta-analysis datasets. PMID:28498564
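A toy numerical sketch of the model structure described above (a doubly stochastic Poisson process whose log intensity is expanded in a basis, with coefficients generated by latent factors) follows; the 1D grid, Gaussian bumps as basis functions, and all dimensions are illustrative assumptions, not the paper's specification.

```python
import numpy as np

# Toy version of the CBMA intensity model: log-intensity = B @ theta, with
# theta generated by a low-dimensional latent factor model
# (theta = Lambda @ eta). The sparsity priors on the loadings are omitted.

rng = np.random.default_rng(3)
grid = np.linspace(0.0, 1.0, 200)              # stand-in for brain voxels
centers = np.linspace(0.0, 1.0, 20)
B = np.exp(-0.5 * ((grid[:, None] - centers[None, :]) / 0.05) ** 2)

Lambda = rng.standard_normal((20, 3))          # factor loadings
eta = rng.standard_normal(3)                   # study-specific factors
intensity = np.exp(B @ (Lambda @ eta) - 4.0)   # study's Poisson intensity

dx = grid[1] - grid[0]
print(round(float(intensity.sum() * dx), 3))   # expected number of foci
```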
NASA Technical Reports Server (NTRS)
Dyall, Kenneth G.; Faegri, Knut, Jr.
1990-01-01
The paper investigates bounds failure in calculations using Gaussian basis sets for the solution of the one-electron Dirac equation for the 2p1/2 state of Hg(79+). It is shown that bounds failure indicates inadequacies in the basis set, both in terms of the exponent range and the number of functions. It is also shown that overrepresentation of the small component space may lead to unphysical results. It is concluded that it is important to use matched large and small component basis sets with an adequate size and exponent range.
Ab Initio and Analytic Intermolecular Potentials for Ar-CF₄
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vayner, Grigoriy; Alexeev, Yuri; Wang, Jiangping
2006-03-09
Ab initio calculations at the CCSD(T) level of theory are performed to characterize the Ar + CF₄ intermolecular potential. Extensive calculations, with and without a correction for basis set superposition error (BSSE), are performed with the cc-pVTZ basis set. Additional calculations are performed with other correlation consistent (cc) basis sets to extrapolate the Ar---CF₄ potential energy minimum to the complete basis set (CBS) limit. Both the size of the basis set and BSSE have substantial effects on the Ar + CF₄ potential. Calculations with the cc-pVTZ basis set and without a BSSE correction appear to give a good representation of the potential at the CBS limit and with a BSSE correction. In addition, MP2 theory is found to give potential energies in very good agreement with those determined by the much higher level CCSD(T) theory. Two analytic potential energy functions were determined for Ar + CF₄ by fitting the cc-pVTZ calculations both with and without a BSSE correction. These analytic functions were written as a sum of two-body potentials, and excellent fits to the ab initio potentials were obtained by representing each two-body interaction as a Buckingham potential.
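A minimal sketch of the analytic form described above, assuming the Ar--CF₄ energy is the sum of a Buckingham term for the Ar-C pair and one for each of the four Ar-F pairs; the parameter values below are placeholders, not the fitted values from the paper.

```python
import numpy as np

# Two-body Buckingham potential and its sum over the Ar-C and four Ar-F
# pairs, mirroring the "sum of two-body potentials" form in the abstract.

def buckingham(r, A, B, C):
    # V(r) = A*exp(-B*r) - C/r**6
    return A * np.exp(-B * r) - C / r**6

def ar_cf4_potential(r_ar_c, r_ar_f, p_c, p_f):
    # r_ar_f: distances from Ar to each of the four F atoms.
    return buckingham(r_ar_c, *p_c) + sum(buckingham(r, *p_f) for r in r_ar_f)

# Placeholder parameters (A in kcal/mol, B in 1/Angstrom, C in kcal*A^6/mol):
p_c, p_f = (40000.0, 3.5, 1200.0), (30000.0, 3.8, 600.0)
print(ar_cf4_potential(3.8, [3.2, 3.6, 4.1, 4.4], p_c, p_f))
```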
On the performance of large Gaussian basis sets for the computation of total atomization energies
NASA Technical Reports Server (NTRS)
Martin, J. M. L.
1992-01-01
The total atomization energies of a number of molecules have been computed using an augmented coupled-cluster method and (5s4p3d2f1g) and (4s3p2d1f) atomic natural orbital (ANO) basis sets, as well as the correlation consistent valence triple zeta plus polarization (cc-pVTZ) and correlation consistent valence quadruple zeta plus polarization (cc-pVQZ) basis sets. The performance of ANO and correlation consistent basis sets is comparable throughout, although the latter can result in significant CPU time savings. Whereas the inclusion of g functions has significant effects on the computed ΣD(e) values, chemical accuracy is still not reached for molecules involving multiple bonds. A Gaussian-1 (G1) type correction lowers the error, but not much beyond the accuracy of the G1 model itself. Using separate corrections for sigma bonds, pi bonds, and valence pairs brings the mean absolute error down to less than 1 kcal/mol for the spdf basis sets, and about 0.5 kcal/mol for the spdfg basis sets. Some conclusions on the success of the Gaussian-1 and Gaussian-2 models are drawn.
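The bond-type correction scheme described above reduces to simple arithmetic, sketched below; the increments and the raw energy are placeholder numbers, not the fitted values from the paper.

```python
# Arithmetic sketch: the raw coupled-cluster atomization energy is shifted
# by one empirical increment per sigma bond, pi bond, and valence (lone)
# pair. All quantities in kcal/mol; increments are placeholders.

def corrected_atomization(sigma_de_raw, n_sigma, n_pi, n_pairs,
                          c_sigma=0.6, c_pi=1.2, c_pair=0.3):
    return sigma_de_raw + n_sigma * c_sigma + n_pi * c_pi + n_pairs * c_pair

# e.g. a triple-bonded diatomic like N2 (one sigma bond, two pi bonds,
# two lone pairs), with a placeholder raw value:
print(corrected_atomization(222.0, n_sigma=1, n_pi=2, n_pairs=2))  # 225.6
```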
Family Systems Theory: A Unifying Framework for Codependence.
ERIC Educational Resources Information Center
Prest, Layne A.; Protinsky, Howard
1993-01-01
Considers addictions and construct of codependence. Offers critical review and synthesis of codependency literature, along with an intergenerational family systems framework for conceptualizing the relationship of the dysfunctional family to the construct of codependence. Presents theoretical basis for systemic clinical work and research in this…
Cloud computing strategic framework (FY13 - FY15).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arellano, Lawrence R.; Arroyo, Steven C.; Giese, Gerald J.
This document presents an architectural framework (plan) and roadmap for the implementation of a robust Cloud Computing capability at Sandia National Laboratories. It is intended to be a living document and serve as the basis for detailed implementation plans, project proposals and strategic investment requests.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shirkov, Leonid; Makarewicz, Jan, E-mail: jama@amu.edu.pl
An ab initio intermolecular potential energy surface (PES) has been constructed for the benzene-krypton (BKr) van der Waals (vdW) complex. The interaction energy has been calculated at the coupled cluster level of theory with single, double, and perturbatively included triple excitations using different basis sets. As a result, a few analytical PESs of the complex have been determined. They allowed a prediction of the complex structure and its vibrational vdW states. The vibrational energy level pattern exhibits a distinct polyad structure. Comparison of the equilibrium structure, the dipole moment, and vibrational levels of BKr with their experimental counterparts has allowed us to design an optimal basis set composed of a small Dunning's basis set for the benzene monomer, a larger effective core potential adapted basis set for Kr, and additional midbond functions. Such a basis set yields vibrational energy levels that agree very well with the experimental ones as well as with those calculated from the available empirical PES derived from the microwave spectra of the BKr complex. The basis proposed can be applied to larger complexes including Kr because of a reasonable computational cost and accurate results.
Polarized atomic orbitals for self-consistent field electronic structure calculations
NASA Astrophysics Data System (ADS)
Lee, Michael S.; Head-Gordon, Martin
1997-12-01
We present a new self-consistent field approach which, given a large "secondary" basis set of atomic orbitals, variationally optimizes molecular orbitals in terms of a small "primary" basis set of distorted atomic orbitals, which are simultaneously optimized. If the primary basis is taken as a minimal basis, the resulting functions are termed polarized atomic orbitals (PAO's) because they are valence (or core) atomic orbitals which have distorted or polarized in an optimal way for their molecular environment. The PAO's derive their flexibility from the fact that they are formed from atom-centered linear-combinations of the larger set of secondary atomic orbitals. The variational conditions satisfied by PAO's are defined, and an iterative method for performing a PAO-SCF calculation is introduced. We compare the PAO-SCF approach against full SCF calculations for the energies, dipoles, and molecular geometries of various molecules. The PAO's are potentially useful for studying large systems that are currently intractable with larger than minimal basis sets, as well as offering potential interpretative benefits relative to calculations in extended basis sets.
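The defining structural feature of the PAO parameterization, namely that each small-basis orbital is an atom-centered combination of that atom's secondary functions only, can be sketched as a block-diagonal transformation; the sketch below shows that structure under stated assumptions (toy dimensions, random blocks) and omits the variational SCF optimization entirely.

```python
import numpy as np

# Each primary (minimal-basis) orbital on an atom is expanded only in that
# atom's secondary basis functions, so the transformation B from secondary
# AOs to PAOs is block-diagonal over atoms.

rng = np.random.default_rng(0)
sec_per_atom = [9, 5, 5]   # secondary AOs per atom (e.g. O, H, H)
min_per_atom = [5, 1, 1]   # primary (minimal) AOs per atom

blocks = [rng.standard_normal((ns, nm))
          for ns, nm in zip(sec_per_atom, min_per_atom)]
B = np.zeros((sum(sec_per_atom), sum(min_per_atom)))
r = c = 0
for blk in blocks:
    ns, nm = blk.shape
    B[r:r + ns, c:c + nm] = blk   # atom-centered block; zeros elsewhere
    r += ns
    c += nm

# Molecular orbitals are then expanded in the small PAO set:
# C_secondary = B @ C_pao, with both B and C_pao optimized self-consistently.
print(B.shape)               # (19, 7): 19 secondary AOs -> 7 PAOs
print(np.count_nonzero(B))   # only the atom-diagonal blocks are nonzero
```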
Patch-based image reconstruction for PET using prior-image derived dictionaries
NASA Astrophysics Data System (ADS)
Tahaei, Marzieh S.; Reader, Andrew J.
2016-09-01
In PET image reconstruction, regularization is often needed to reduce the noise in the resulting images. Patch-based image processing techniques have recently been successfully used for regularization in medical image reconstruction through a penalized likelihood framework. Re-parameterization within reconstruction is another powerful regularization technique in which the object in the scanner is re-parameterized using coefficients for spatially-extensive basis vectors. In this work, a method for extracting patch-based basis vectors from the subject’s MR image is proposed. The coefficients for these basis vectors are then estimated using the conventional MLEM algorithm. Furthermore, using the alternating direction method of multipliers, an algorithm for optimizing the Poisson log-likelihood while imposing sparsity on the parameters is also proposed. This novel method is then utilized to find sparse coefficients for the patch-based basis vectors extracted from the MR image. The results indicate the superiority of the proposed methods to patch-based regularization using the penalized likelihood framework.
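A hedged sketch of the coefficient estimation step described above follows: MLEM applied to coefficients of patch-derived basis vectors. It assumes a generic linear system model and nonnegative basis columns so the multiplicative EM update keeps coefficients nonnegative; the ADMM sparsity constraint from the paper is not included, and all matrices are toy inputs.

```python
import numpy as np

# MLEM estimation of coefficients theta for basis vectors B (columns,
# e.g. patches extracted from the subject's MR image), with image = B @
# theta and Poisson data y = Poisson(P @ image).

def mlem_coefficients(y, P, B, n_iter=50, eps=1e-12):
    A = P @ B                        # projector applied to each basis vector
    sens = A.sum(axis=0) + eps       # sensitivity term (A^T 1)
    theta = np.ones(A.shape[1])
    for _ in range(n_iter):
        ratio = y / (A @ theta + eps)
        theta *= (A.T @ ratio) / sens
    return B @ theta                 # reconstructed image

rng = np.random.default_rng(1)
P = rng.random((40, 25))             # toy projector
B = rng.random((25, 8))              # 8 nonnegative patch basis vectors
x_true = B @ rng.random(8)
y = rng.poisson(P @ x_true)          # Poisson-noisy data
print(mlem_coefficients(y, P, B).shape)  # (25,)
```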
van der Togt, Remko; Bakker, Piet J M; Jaspers, Monique W M
2011-04-01
RFID offers great opportunities to health care. Nevertheless, prior experiences also show that RFID systems have not been designed and tested in response to the particular needs of health care settings and might introduce new risks. The aim of this study is to present a framework that can be used to assess the performance of RFID systems particularly in health care settings. We developed a framework describing a systematic approach that can be used for assessing the feasibility of using an RFID technology in a particular healthcare setting; more specifically, for testing the impact of environmental factors on the quality of RFID-generated data and vice versa. This framework is based on our own experiences with an RFID pilot implementation in an academic hospital in The Netherlands and a literature review concerning RFID test methods and current insights into RFID implementations in healthcare. The implementation of an RFID system within the blood transfusion chain inside a hospital setting was used as a show case to explain the different phases of the framework. The framework consists of nine phases, including an implementation development plan, RFID and medical equipment interference tests, and data accuracy and data completeness tests to be run in laboratory, simulated field, and real field settings. The potential risks that RFID technologies may bring to the healthcare setting should be thoroughly evaluated before they are introduced into a vital environment. The RFID performance assessment framework that we present can act as a reference model to start an RFID development, engineering, implementation and testing plan and, more specifically, to assess the potential risks of interference and to test the quality of the RFID-generated data as potentially influenced by physical objects in specific health care environments. Copyright © 2010 Elsevier Inc. All rights reserved.
Hendriks, Anna-Marie; Jansen, Maria W J; Gubbels, Jessica S; De Vries, Nanne K; Paulussen, Theo; Kremers, Stef P J
2013-04-18
Childhood obesity is a 'wicked' public health problem that is best tackled by an integrated approach, which is enabled by integrated public health policies. The development and implementation of such policies have in practice proven to be difficult, however, and studying why this is the case requires a tool that may assist local policy-makers and those assisting them. A comprehensive framework that can help to identify options for improvement and to systematically develop solutions may be used to support local policy-makers. We propose the 'Behavior Change Ball' as a tool to study the development and implementation of integrated public health policies within local government. Based on the tenets of the 'Behavior Change Wheel' by Michie and colleagues (2011), the proposed conceptual framework distinguishes organizational behaviors of local policy-makers at the strategic, tactical and operational levels, as well as the determinants (motivation, capability, opportunity) required for these behaviors, and interventions and policy categories that can influence them. To illustrate the difficulty of achieving sustained integrated approaches, we use the metaphor of a ball in our framework: the mountainous landscapes surrounding the ball reflect the system's resistance to change (by making it difficult for the ball to roll). We apply this framework to the problem of childhood obesity prevention. The added value provided by the framework lies in its comprehensiveness, theoretical basis, diagnostic and heuristic nature and face validity. Since integrated public health policies have not been widely developed and implemented in practice, organizational behaviors relevant to the development of these policies remain to be investigated. A conceptual framework that can assist in systematically studying the policy process may facilitate this. Our Behavior Change Ball adds significant value to existing public health policy frameworks by incorporating multiple theoretical perspectives, specifying a set of organizational behaviors and linking the analysis of these behaviors to interventions and policies. We would encourage examination by others of our framework as a tool to explain and guide the development of integrated policies for the prevention of wicked public health problems.
2013-01-01
Background Childhood obesity is a ‘wicked’ public health problem that is best tackled by an integrated approach, which is enabled by integrated public health policies. The development and implementation of such policies have in practice proven to be difficult, however, and studying why this is the case requires a tool that may assist local policy-makers and those assisting them. A comprehensive framework that can help to identify options for improvement and to systematically develop solutions may be used to support local policy-makers. Discussion We propose the ‘Behavior Change Ball’ as a tool to study the development and implementation of integrated public health policies within local government. Based on the tenets of the ‘Behavior Change Wheel’ by Michie and colleagues (2011), the proposed conceptual framework distinguishes organizational behaviors of local policy-makers at the strategic, tactical and operational levels, as well as the determinants (motivation, capability, opportunity) required for these behaviors, and interventions and policy categories that can influence them. To illustrate the difficulty of achieving sustained integrated approaches, we use the metaphor of a ball in our framework: the mountainous landscapes surrounding the ball reflect the system’s resistance to change (by making it difficult for the ball to roll). We apply this framework to the problem of childhood obesity prevention. The added value provided by the framework lies in its comprehensiveness, theoretical basis, diagnostic and heuristic nature and face validity. Summary Since integrated public health policies have not been widely developed and implemented in practice, organizational behaviors relevant to the development of these policies remain to be investigated. A conceptual framework that can assist in systematically studying the policy process may facilitate this. Our Behavior Change Ball adds significant value to existing public health policy frameworks by incorporating multiple theoretical perspectives, specifying a set of organizational behaviors and linking the analysis of these behaviors to interventions and policies. We would encourage examination by others of our framework as a tool to explain and guide the development of integrated policies for the prevention of wicked public health problems. PMID:23597122
Harden, Samantha M.; Smith, Matthew Lee; Ory, Marcia G.; Smith-Ray, Renae L.; Estabrooks, Paul A.; Glasgow, Russell E.
2018-01-01
The RE-AIM Framework is a planning and evaluation model that has been used in a variety of settings to address various programmatic, environmental, and policy innovations for improving population health. In addition to the broad application and diverse use of the framework, there are lessons learned and recommendations for the future use of the framework across clinical, community, and corporate settings. The purposes of this article are to: (A) provide a brief overview of the RE-AIM Framework and its pragmatic use for planning and evaluation; (B) offer recommendations to facilitate the application of RE-AIM in clinical, community, and corporate settings; and (C) share perspectives and lessons learned about employing RE-AIM dimensions in the planning, implementation, and evaluation phases within these different settings. In this article, we demonstrate how the RE-AIM concepts and elements within each dimension can be applied by researchers and practitioners in diverse settings, among diverse populations and for diverse health topics. PMID:29623270
Campbell, Norm; Young, Eric R; Drouin, Denis; Legowski, Barbara; Adams, Michael A; Farrell, Judi; Kaczorowski, Janusz; Lewanczuk, Richard; Moy Lum-Kwong, Margaret; Tobe, Sheldon
2012-05-01
Increased blood pressure is a leading risk for premature death and disability. The causes of increased blood pressure are intuitive and well known. However, the fundamental basis and means for improving blood pressure control are highly integrated into our complex societal structure both inside and outside our health system and hence require a comprehensive discussion of the pathway forward. A group of Canadian experts was appointed by Hypertension Canada with funding from Public Health Agency of Canada and the Heart and Stroke Foundation of Canada, Canadian Institute for Health Research (HSFC-CIHR) Chair in Hypertension Prevention and Control to draft a discussion Framework for prevention and control of hypertension. The report includes an environmental scan of past and current activities, proposals for key indicators, and targets to be achieved by 2020, and what changes are likely to be required in Canada to achieve the proposed targets. The key targets are to reduce the prevalence of hypertension to 13% of adults and improve control to 78% of those with hypertension. Broad changes in government policy, research, and health services delivery are required for these changes to occur. The Hypertension Framework process is designed to have 3 phases. The first includes the experts' report which is summarized in this report. The second phase is to gather input and priorities for action from individuals and organizations for revision of the Framework. It is hoped the Framework will stimulate discussion and input for its full intended lifespan 2011-2020. The third phase is to work with individuals and organizations on the priorities set in phase 2. Copyright © 2012 Canadian Cardiovascular Society. Published by Elsevier Inc. All rights reserved.
From everyday emotions to aesthetic emotions: towards a unified theory of musical emotions.
Juslin, Patrik N
2013-09-01
The sound of music may arouse profound emotions in listeners. But such experiences seem to involve a 'paradox', namely that music--an abstract form of art, which appears removed from our concerns in everyday life--can arouse emotions - biologically evolved reactions related to human survival. How are these (seemingly) non-commensurable phenomena linked together? Key is to understand the processes through which sounds are imbued with meaning. It can be argued that the survival of our ancient ancestors depended on their ability to detect patterns in sounds, derive meaning from them, and adjust their behavior accordingly. Such an ecological perspective on sound and emotion forms the basis of a recent multi-level framework that aims to explain emotional responses to music in terms of a large set of psychological mechanisms. The goal of this review is to offer an updated and expanded version of the framework that can explain both 'everyday emotions' and 'aesthetic emotions'. The revised framework--referred to as BRECVEMA--includes eight mechanisms: Brain Stem Reflex, Rhythmic Entrainment, Evaluative Conditioning, Contagion, Visual Imagery, Episodic Memory, Musical Expectancy, and Aesthetic Judgment. In this review, it is argued that all of the above mechanisms may be directed at information that occurs in a 'musical event' (i.e., a specific constellation of music, listener, and context). Of particular significance is the addition of a mechanism corresponding to aesthetic judgments of the music, to better account for typical 'appreciation emotions' such as admiration and awe. Relationships between aesthetic judgments and other mechanisms are reviewed based on the revised framework. It is suggested that the framework may contribute to a long-needed reconciliation between previous approaches that have conceptualized music listeners' responses in terms of either 'everyday emotions' or 'aesthetic emotions'. © 2013 Elsevier B.V. All rights reserved.
A general framework for characterizing studies of brain interface technology.
Mason, S G; Jackson, M M Moore; Birch, G E
2005-11-01
The development of brain interface (BI) technology continues to attract researchers with a wide range of backgrounds and expertise. Though the BI community is committed to accurate and objective evaluation of methods, systems, and technology, the very diversity of the methods and terminology used in the field hinders understanding and impairs technology cross-fertilization and cross-group validation of findings. Underlying this dilemma is a lack of common perspective and language. As seen in our previous works in this area, our approach to remedy this problem is to propose language in the form of taxonomy and functional models. Our intent is to document and validate our best thinking in this area and publish a perspective that will stimulate discussion. We encourage others to do the same with the belief that focused discussion on language issues will accelerate the inherently slow natural evolution of language selection and thus alleviate related problems. In this work, we propose a theoretical framework for describing BI-technology-related studies. The proposed framework is based on theoretical concepts and terminology from classical science, assistive technology development, human-computer interaction, and previous BI-related works. Using a representative set of studies from the literature, the proposed BI study framework was shown to be a complete and appropriate perspective for thoroughly characterizing a BI study. We have also demonstrated that this BI study framework is useful for (1) objectively reviewing existing BI study designs and results, (2) comparing designs and results of multiple BI studies, (3) designing new studies or objectively reporting BI study results, and (4) facilitating intra- and inter-group communication and the education of new researchers. As such, it forms a sound and appropriate basis for community discussion.
Palazuelos, Daniel; DaEun Im, Dana; Peckarsky, Matthew; Schwarz, Dan; Farmer, Didi Bertrand; Dhillon, Ranu; Johnson, Ari; Orihuela, Claudia; Hackett, Jill; Bazile, Junior; Berman, Leslie; Ballard, Madeleine; Panjabi, Raj; Ternier, Ralph; Slavin, Sam; Lee, Scott; Selinsky, Steve; Mitnick, Carole Diane
2013-01-01
Introduction Despite decades of experience with community health workers (CHWs) in a wide variety of global health projects, there is no established conceptual framework that structures how implementers and researchers can understand, study and improve their respective programs based on lessons learned by other CHW programs. Objective To apply an original, non-linear framework and case study method, 5-SPICE, to multiple sister projects of a large, international non-governmental organization (NGO), and other CHW projects. Design Engaging a large group of implementers, researchers and the best available literature, the 5-SPICE framework was refined and then applied to a selection of CHW programs. Insights gleaned from the case study method were summarized in a tabular format named the ‘5×5-SPICE chart’. This format graphically lists the ways in which essential CHW program elements interact, both positively and negatively, in the implementation field. Results The 5×5-SPICE charts reveal a variety of insights that come from a more complex understanding of how essential CHW project elements interact and influence each other in their unique context. Some have been well described in the literature previously, while others are exclusive to this article. An analysis of how best to compensate CHWs is also offered as an example of the type of insights that this method may yield. Conclusions The 5-SPICE framework is a novel instrument that can be used to guide discussions about CHW projects. Insights from this process can help guide quality improvement efforts, or be used as hypotheses that form the basis of a program's research agenda. Recent experience with research protocols embedded into successfully implemented projects demonstrates how such hypotheses can be rigorously tested. PMID:23561023
Weiskel, Peter K.; Wolock, David M.; Zarriello, Phillip J.; Vogel, Richard M.; Levin, Sara B.; Lent, Robert M.
2014-01-01
Runoff-based indicators of terrestrial water availability are appropriate for humid regions, but have tended to limit our basic hydrologic understanding of drylands – the dry-subhumid, semiarid, and arid regions which presently cover nearly half of the global land surface. In response, we introduce an indicator framework that gives equal weight to humid and dryland regions, accounting fully for both vertical (precipitation + evapotranspiration) and horizontal (groundwater + surface-water) components of the hydrologic cycle in any given location – as well as fluxes into and out of landscape storage. We apply the framework to a diverse hydroclimatic region (the conterminous USA) using a distributed water-balance model consisting of 53 400 networked landscape hydrologic units. Our model simulations indicate that about 21% of the conterminous USA either generated no runoff or consumed runoff from upgradient sources on a mean-annual basis during the 20th century. Vertical fluxes exceeded horizontal fluxes across 76% of the conterminous area. Long-term-average total water availability (TWA) during the 20th century, defined here as the total influx to a landscape hydrologic unit from precipitation, groundwater, and surface water, varied spatially by about 400 000-fold, a range of variation ~100 times larger than that for mean-annual runoff across the same area. The framework includes but is not limited to classical, runoff-based approaches to water-resource assessment. It also incorporates and reinterprets the green- and blue-water perspective now gaining international acceptance. Implications of the new framework for several areas of contemporary hydrology are explored, and the data requirements of the approach are discussed in relation to the increasing availability of gridded global climate, land-surface, and hydrologic data sets.
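Since TWA is defined above as a simple sum of influxes, a short arithmetic sketch may make the indicator concrete; the values are illustrative, not output of the authors' model.

```python
# Total water availability (TWA) for one landscape hydrologic unit: the
# total influx from precipitation (vertical) plus groundwater and
# surface-water inflows (horizontal). Illustrative values in mm/yr.

def total_water_availability(precip, gw_in, sw_in):
    return precip + gw_in + sw_in

humid_unit = total_water_availability(precip=1200.0, gw_in=30.0, sw_in=250.0)
dryland_unit = total_water_availability(precip=180.0, gw_in=5.0, sw_in=600.0)
print(humid_unit, dryland_unit)
# In the dryland unit the horizontal influx dominates, the situation the
# framework is designed to capture and runoff-only indicators miss.
```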
From everyday emotions to aesthetic emotions: Towards a unified theory of musical emotions
NASA Astrophysics Data System (ADS)
Juslin, Patrik N.
2013-09-01
The sound of music may arouse profound emotions in listeners. But such experiences seem to involve a ‘paradox’, namely that music - an abstract form of art, which appears removed from our concerns in everyday life - can arouse emotions - biologically evolved reactions related to human survival. How are these (seemingly) non-commensurable phenomena linked together? Key is to understand the processes through which sounds are imbued with meaning. It can be argued that the survival of our ancient ancestors depended on their ability to detect patterns in sounds, derive meaning from them, and adjust their behavior accordingly. Such an ecological perspective on sound and emotion forms the basis of a recent multi-level framework that aims to explain emotional responses to music in terms of a large set of psychological mechanisms. The goal of this review is to offer an updated and expanded version of the framework that can explain both ‘everyday emotions’ and ‘aesthetic emotions’. The revised framework - referred to as BRECVEMA - includes eight mechanisms: Brain Stem Reflex, Rhythmic Entrainment, Evaluative Conditioning, Contagion, Visual Imagery, Episodic Memory, Musical Expectancy, and Aesthetic Judgment. In this review, it is argued that all of the above mechanisms may be directed at information that occurs in a ‘musical event’ (i.e., a specific constellation of music, listener, and context). Of particular significance is the addition of a mechanism corresponding to aesthetic judgments of the music, to better account for typical ‘appreciation emotions’ such as admiration and awe. Relationships between aesthetic judgments and other mechanisms are reviewed based on the revised framework. It is suggested that the framework may contribute to a long-needed reconciliation between previous approaches that have conceptualized music listeners' responses in terms of either ‘everyday emotions’ or ‘aesthetic emotions’.
Building a Framework for Engineering Design Experiences in High School
ERIC Educational Resources Information Center
Denson, Cameron D.; Lammi, Matthew
2014-01-01
In this article, Denson and Lammi put forth a conceptual framework that will help promote the successful infusion of engineering design experiences into high school settings. When considering a conceptual framework of engineering design in high school settings, it is important to consider the complex issue at hand. For the purposes of this…
Ensemble Methods for Classification of Physical Activities from Wrist Accelerometry.
Chowdhury, Alok Kumar; Tjondronegoro, Dian; Chandran, Vinod; Trost, Stewart G
2017-09-01
To investigate whether the use of ensemble learning algorithms improves physical activity recognition accuracy compared to single classifier algorithms, and to compare the classification accuracy achieved by three conventional ensemble machine learning methods (bagging, boosting, random forest) and a custom ensemble model comprising four algorithms commonly used for activity recognition (binary decision tree, k nearest neighbor, support vector machine, and neural network). The study used three independent data sets that included wrist-worn accelerometer data. For each data set, a four-step classification framework consisting of data preprocessing, feature extraction, normalization and feature selection, and classifier training and testing was implemented. For the custom ensemble, decisions from the single classifiers were aggregated using three decision fusion methods: weighted majority vote, naïve Bayes combination, and behavior knowledge space combination. Classifiers were cross-validated using leave-one-subject-out cross-validation and compared on the basis of average F1 scores. In all three data sets, ensemble learning methods consistently outperformed the individual classifiers. Among the conventional ensemble methods, random forest models provided consistently high activity recognition; however, the custom ensemble model using weighted majority voting demonstrated the highest classification accuracy in two of the three data sets. Combining multiple individual classifiers using conventional or custom ensemble learning methods can improve activity recognition accuracy from wrist-worn accelerometer data.
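The weighted-majority-vote fusion named above is straightforward to sketch: each base classifier's vote is weighted (for instance by its cross-validated F1 score) and the class with the largest total weight wins. The base classifiers are abstracted away as predicted labels, and the weights are illustrative.

```python
import numpy as np

# Weighted majority vote over the predictions of several base classifiers.
# predictions: (n_classifiers, n_samples) integer class labels.

def weighted_majority_vote(predictions, weights, n_classes):
    votes = np.zeros((n_classes, predictions.shape[1]))
    for pred, w in zip(predictions, weights):
        votes[pred, np.arange(predictions.shape[1])] += w
    return votes.argmax(axis=0)

preds = np.array([[0, 1, 2, 2],    # e.g. decision tree
                  [0, 1, 1, 2],    # kNN
                  [1, 1, 2, 0],    # SVM
                  [0, 2, 2, 2]])   # neural network
weights = [0.70, 0.80, 0.90, 0.60]  # e.g. per-classifier F1 scores
print(weighted_majority_vote(preds, weights, n_classes=3))  # [0 1 2 2]
```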
Milano, Giulia; Saenz, Elizabeth; Clark, Nicolas; Busse, Anja; Gale, John; Campello, Giovanna; Mattfeld, Elizabeth; Maalouf, Wadih; Heikkila, Hanna; Martelli, Antonietta; Morales, Brian; Gerra, Gilberto
2017-11-10
Very little evidence has been reported in the literature regarding the misuse of substances in rural areas. Despite the common perception of rural communities as a protective and risk-mitigating environment, the scientific literature demonstrates the existence of many risk factors in rural communities. The Drug Prevention and Health Branch (DHB) of the United Nations Office on Drugs and Crime (UNODC) and the World Health Organization (WHO), in June 2016, organized a meeting of experts in the treatment and prevention of SUDs in rural settings. The content presented during the meeting and the related discussion provided material for the preparation of an outline document, which is the basis for a technical tool on SUD prevention and treatment in rural settings. The UNODC framework for interventions in rural settings is a technical tool intended to assist policy makers and managers at the national level. This paper is a report on UNODC/WHO efforts to improve the clinical conditions of people affected by SUDs and living in rural areas. Its purpose is to draw attention to a severe clinical and social problem in a reality forgotten by everyone.
NASA Astrophysics Data System (ADS)
Iwashita, Fabio; Brooks, Andrew; Spencer, John; Borombovits, Daniel; Curwen, Graeme; Olley, Jon
2015-04-01
Assessing bank stability using geotechnical models traditionally involves the laborious collection of data on the bank and floodplain stratigraphy, as well as in-situ geotechnical data for each sedimentary unit within a river bank. The application of geotechnical bank stability models is limited to those sites where extensive field data have been collected, and their ability to provide predictions of bank erosion at the reach scale is limited without a very extensive and expensive field data collection program. Challenges in the construction and application of riverbank erosion and hydraulic numerical models include their one-dimensionality, steady-state requirements, lack of calibration data, and nonuniqueness. Numerical models can also be too rigid with respect to detecting unexpected features like the onset of trends, non-linear relations, or patterns restricted to sub-samples of a data set. These shortcomings create the need for an alternate modelling approach capable of using the available data. The Self-Organizing Map (SOM) approach is well suited to the analysis of noisy, sparse, nonlinear, multidimensional, and scale-dependent data; it is a type of unsupervised artificial neural network with hybrid competitive-cooperative learning. In this work we present a method that uses a database of geotechnical data collected at over 100 sites throughout the state of Queensland, Australia, to develop a modelling approach that enables geotechnical parameters (soil effective cohesion, friction angle, soil erodibility and critical stress) to be derived from sediment particle size distribution (PSD) data. The model framework and predicted values were evaluated in two ways: by splitting the dataset into training and validation sets, and through a bootstrap approach. The basis of bootstrap cross-validation is a leave-one-out strategy: one data value is left out of the training set while a new SOM is created to estimate that missing value from the remaining data. As a new SOM is created up to 30 times for each value under scrutiny, this forms the basis for a stochastic framework from which residuals are used to evaluate error statistics and model bias. The proposed method is suitable for estimating soil geotechnical properties, revealing and quantifying relationships between geotechnical variables and particle size distribution that are not properly captured by linear multivariate statistical approaches.
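A minimal sketch of the leave-one-out SOM estimation idea follows: train a small self-organizing map on joint [PSD, geotechnical] vectors, then estimate a held-out sample's geotechnical values from the best-matching unit found using only its PSD components. A toy random data set stands in for the Queensland field database, and the bare-bones SOM (no neighborhood function, single map instead of 30 bootstrap replicates) is an assumption for brevity.

```python
import numpy as np

def train_som(data, n_units=16, n_iter=500, seed=0):
    # Bare-bones SOM training: move the best-matching unit toward each
    # sample with a decaying learning rate (neighborhood update omitted).
    rng = np.random.default_rng(seed)
    w = data[rng.integers(0, len(data), n_units)].astype(float)
    for t in range(n_iter):
        x = data[rng.integers(len(data))]
        lr = 0.5 * (1.0 - t / n_iter)
        bmu = np.argmin(((w - x) ** 2).sum(axis=1))
        w[bmu] += lr * (x - w[bmu])
    return w

rng = np.random.default_rng(42)
psd = rng.random((100, 6))                                    # PSD features
geo = psd @ rng.random((6, 4)) + 0.05 * rng.random((100, 4))  # cohesion etc.
data = np.hstack([psd, geo])

w = train_som(data[1:])                          # leave sample 0 out
bmu = np.argmin(((w[:, :6] - data[0, :6]) ** 2).sum(axis=1))
print("estimated:", w[bmu, 6:])                  # predicted geotechnical values
print("actual:   ", data[0, 6:])
```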
DOE Office of Scientific and Technical Information (OSTI.GOV)
Venkatesan, R.C., E-mail: ravi@systemsresearchcorp.com; Plastino, A., E-mail: plastino@fisica.unlp.edu.ar
The (i) reciprocity relations for the relative Fisher information (RFI, hereafter) and (ii) a generalized RFI–Euler theorem are self-consistently derived from the Hellmann–Feynman theorem. These new reciprocity relations generalize the RFI–Euler theorem and constitute the basis for building up a mathematical Legendre transform structure (LTS, hereafter), akin to that of thermodynamics, that underlies the RFI scenario. This demonstrates the possibility of translating the entire mathematical structure of thermodynamics into a RFI-based theoretical framework. Virial theorems play a prominent role in this endeavor, as a Schrödinger-like equation can be associated to the RFI. Lagrange multipliers are determined invoking the RFI–LTS link and the quantum mechanical virial theorem. An appropriate ansatz allows for the inference of probability density functions (pdf's, hereafter) and energy-eigenvalues of the above mentioned Schrödinger-like equation. The energy-eigenvalues obtained here via inference are benchmarked against established theoretical and numerical results. A principled theoretical basis to reconstruct the RFI-framework from the FIM framework is established. Numerical examples for exemplary cases are provided. - Highlights: • Legendre transform structure for the RFI is obtained with the Hellmann–Feynman theorem. • Inference of the energy-eigenvalues of the SWE-like equation for the RFI is accomplished. • Basis for reconstruction of the RFI framework from the FIM-case is established. • Substantial qualitative and quantitative distinctions with prior studies are discussed.
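For reference, the derivation's starting point is the standard Hellmann–Feynman theorem, stated below; the paper's RFI-specific reciprocity relations are not reproduced here.

```latex
% Standard Hellmann-Feynman theorem for a Hamiltonian depending on a
% parameter lambda, with normalized eigenstate psi_lambda:
\frac{dE_{\lambda}}{d\lambda}
  = \left\langle \psi_{\lambda} \right|
      \frac{\partial \hat{H}_{\lambda}}{\partial \lambda}
    \left| \psi_{\lambda} \right\rangle
```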
Code of Federal Regulations, 2010 CFR
2010-10-01
42 CFR § 415.170: Conditions for payment on a fee schedule basis for physician services in a teaching setting. (Public Health; Centers for Medicare & Medicaid Services; Part 415: Services Furnished by Physicians in Providers, Supervising Physicians in Teaching Settings, and Residents in Certain Settings; Subpart: Physician Services in Teaching Settings.)
Brand, Sarah L.; Fleming, Lora E.; Wyatt, Katrina M.
2015-01-01
Many healthy workplace interventions have been developed for healthcare settings to address the consistently low scores of healthcare professionals on assessments of mental and physical well-being. Complex healthcare settings present challenges for the scale-up and spread of successful interventions from one setting to another. Despite general agreement regarding the importance of the local setting in affecting intervention success across different settings, there is no consensus on what it is about a local setting that needs to be taken into account to design healthy workplace interventions appropriate for different local settings. Complexity theory principles were used to understand a workplace as a complex adaptive system and to create a framework of eight domains (system characteristics) that affect the emergence of system-level behaviour. This Workplace of Well-being (WoW) framework is responsive and adaptive to local settings and allows a shared understanding of the enablers and barriers to behaviour change by capturing local information for each of the eight domains. We use the results of applying the WoW framework to one workplace, a UK National Health Service ward, to describe the utility of this approach in informing design of setting-appropriate healthy workplace interventions that create workplaces conducive to healthy behaviour change. PMID:26380358
Projected Hybrid Orbitals: A General QM/MM Method
2015-01-01
A projected hybrid orbital (PHO) method was described to model the covalent boundary in a hybrid quantum mechanical and molecular mechanical (QM/MM) system. The PHO approach can be used in ab initio wave function theory and in density functional theory with any basis set without introducing system-dependent parameters. In this method, a secondary basis set on the boundary atom is introduced to formulate a set of hybrid atomic orbitals. The primary basis set on the boundary atom used for the QM subsystem is projected onto the secondary basis to yield a representation that provides a good approximation to the electron-withdrawing power of the primary basis set to balance electronic interactions between QM and MM subsystems. The PHO method has been tested on a range of molecules and properties. Comparison with results obtained from QM calculations on the entire system shows that the present PHO method is a robust and balanced QM/MM scheme that preserves the structural and electronic properties of the QM region. PMID:25317748
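The abstract does not give working equations; one standard way such a projection can be written, which may differ in detail from the PHO working equations, is through the overlap-metric projector onto the span of the secondary set {η_i}:

```latex
% S_{ij} = <eta_i | eta_j> is the overlap matrix of the secondary set
\hat P = \sum_{ij} |\eta_i\rangle \, (S^{-1})_{ij} \, \langle \eta_j | ,
\qquad
|\tilde\chi_\mu\rangle = \hat P \, |\chi_\mu\rangle ,
```

where the χ_μ are the primary basis functions on the boundary atom and the χ̃_μ are their projected counterparts.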
A novel Gaussian-Sinc mixed basis set for electronic structure calculations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jerke, Jonathan L.; Lee, Young; Tymczak, C. J.
2015-08-14
A Gaussian-Sinc basis set methodology is presented for the calculation of the electronic structure of atoms and molecules at the Hartree–Fock level of theory. This methodology has several advantages over previous methods. The all-electron electronic structure in a Gaussian-Sinc mixed basis spans both the “localized” and “delocalized” regions. A basis set for each region is combined to make a new basis methodology—a lattice of orthonormal sinc functions is used to represent the “delocalized” regions and the atom-centered Gaussian functions are used to represent the “localized” regions to any desired accuracy. For this mixed basis, all the Coulomb integrals are definable and can be computed in a dimensionally separated methodology. Additionally, the Sinc basis is translationally invariant, which allows for the Coulomb singularity to be placed anywhere including on lattice sites. Finally, boundary conditions are always satisfied with this basis. To demonstrate the utility of this method, we calculated the ground state Hartree–Fock energies for atoms up to neon, the diatomic systems H2, O2, and N2, and the multi-atom system benzene. Together, it is shown that the Gaussian-Sinc mixed basis set is a flexible and accurate method for solving the electronic structure of atomic and molecular species.
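The property underpinning the "delocalized" half of the mixed basis, that a lattice of scaled sinc functions is orthonormal, can be checked numerically in a few lines. The spacing and grid below are arbitrary illustrative choices, not values from the paper:

```python
# Numerical check that a lattice of scaled sinc functions is orthonormal.
import numpy as np

h = 0.5                                      # lattice spacing (arbitrary)
x = np.linspace(-100, 100, 400001)           # fine quadrature grid
dx = x[1] - x[0]
phi = lambda n: np.sinc((x - n * h) / h) / np.sqrt(h)  # np.sinc(t) = sin(pi t)/(pi t)

for n in range(-2, 3):
    for m in range(-2, 3):
        overlap = np.sum(phi(n) * phi(m)) * dx   # Riemann-sum quadrature
        expected = 1.0 if n == m else 0.0
        assert abs(overlap - expected) < 2e-3    # limited by grid truncation
print("sinc lattice is orthonormal to quadrature accuracy")
```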
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hill, J. Grant, E-mail: grant.hill@sheffield.ac.uk, E-mail: kipeters@wsu.edu; Peterson, Kirk A., E-mail: grant.hill@sheffield.ac.uk, E-mail: kipeters@wsu.edu
New correlation consistent basis sets, cc-pVnZ-PP-F12 (n = D, T, Q), for all the post-d main group elements Ga–Rn have been optimized for use in explicitly correlated F12 calculations. The new sets, which include not only orbital basis sets but also the matching auxiliary sets required for density fitting both conventional and F12 integrals, are designed for correlation of valence sp, as well as the outer-core d electrons. The basis sets are constructed for use with the previously published small-core relativistic pseudopotentials of the Stuttgart-Cologne variety. Benchmark explicitly correlated coupled-cluster singles and doubles with perturbative triples [CCSD(T)-F12b] calculations of the spectroscopic properties of numerous diatomic molecules involving 4p, 5p, and 6p elements have been carried out and compared to the analogous conventional CCSD(T) results. In general the F12 results obtained with a n-zeta F12 basis set were comparable to conventional aug-cc-pVxZ-PP or aug-cc-pwCVxZ-PP basis set calculations obtained with x = n + 1 or even x = n + 2. The new sets used in CCSD(T)-F12b calculations are particularly efficient at accurately recovering the large correlation effects of the outer-core d electrons.
Expertise, Task Complexity, and Artificial Intelligence: A Conceptual Framework.
ERIC Educational Resources Information Center
Buckland, Michael K.; Florian, Doris
1991-01-01
Examines the relationship between users' expertise, task complexity of information system use, and artificial intelligence to provide the basis for a conceptual framework for considering the role that artificial intelligence might play in information systems. Cognitive and conceptual models are discussed, and cost effectiveness is considered. (27…
The Intelligent Career Framework as a Basis for Interdisciplinary Inquiry
ERIC Educational Resources Information Center
Parker, Polly; Khapova, Svetlana N.; Arthur, Michael B.
2009-01-01
This paper examines how separate behavioral science disciplines can be brought together to more fully understand the dynamics of contemporary careers. We adopt one interdisciplinary framework--that of the "intelligent career"--and use it to examine how separate disciplinary approaches relate to one another. The intelligent career framework…
Graphical Means for Inspecting Qualitative Models of System Behaviour
ERIC Educational Resources Information Center
Bouwer, Anders; Bredeweg, Bert
2010-01-01
This article presents the design and evaluation of a tool for inspecting conceptual models of system behaviour. The basis for this research is the Garp framework for qualitative simulation. This framework includes modelling primitives, such as entities, quantities and causal dependencies, which are combined into model fragments and scenarios.…
Questioning and Experimentation
ERIC Educational Resources Information Center
Mutanen, Arto
2014-01-01
The paper is a philosophical analysis of experimentation. The philosophical framework of the analysis is the interrogative model of inquiry developed by Hintikka. The basis of the model is explicit and well-formed logic of questions and answers. The framework allows us to formulate a flexible logic of experimentation. In particular, the formulated…
A Component-Based Vocabulary-Extensible Sign Language Gesture Recognition Framework.
Wei, Shengjing; Chen, Xiang; Yang, Xidong; Cao, Shuai; Zhang, Xu
2016-04-19
Sign language recognition (SLR) can provide a helpful tool for the communication between the deaf and the external world. This paper proposed a component-based vocabulary extensible SLR framework using data from surface electromyographic (sEMG) sensors, accelerometers (ACC), and gyroscopes (GYRO). In this framework, a sign word was considered to be a combination of five common sign components, including hand shape, axis, orientation, rotation, and trajectory, and sign classification was implemented based on the recognition of the five components. Specifically, the proposed SLR framework consisted of two major parts. The first part was to obtain the component-based form of sign gestures and establish the code table of the target sign gesture set using data from a reference subject. In the second part, which was designed for new users, component classifiers were trained using a training set suggested by the reference subject, and the classification of unknown gestures was performed with a code matching method. Five subjects participated in this study, and recognition experiments with different training-set sizes were implemented on a target gesture set consisting of 110 frequently used Chinese Sign Language (CSL) sign words. The experimental results demonstrated that the proposed framework can realize large-scale gesture set recognition with a small-scale training set. With the smallest training sets (containing about one-third of the gestures of the target gesture set) suggested by two reference subjects, average recognition accuracies of (82.6 ± 13.2)% and (79.7 ± 13.4)% were obtained for the 110 words, and the average recognition accuracy climbed to (88 ± 13.7)% and (86.3 ± 13.7)% when the training set included 50-60 gestures (about half of the target gesture set). The proposed framework can significantly reduce the user's training burden in large-scale gesture recognition, which will facilitate the implementation of a practical SLR system.
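The code-matching idea can be sketched compactly: a sign word is a 5-tuple of component labels, and an unknown gesture is assigned to the code-table entry its components agree with most. The component codes and words below are invented placeholders, not the paper's actual code table:

```python
# Sketch of component-based code matching for sign classification.
components = ("hand_shape", "axis", "orientation", "rotation", "trajectory")

code_table = {                                 # hypothetical code-table entries
    "thank_you": ("flat", "vertical", "palm_in", "none", "arc"),
    "friend":    ("hook", "horizontal", "palm_up", "twist", "line"),
    "study":     ("spread", "vertical", "palm_down", "none", "wiggle"),
}

def classify(observed):
    """Return the sign whose component code agrees with the observed one most."""
    score = lambda code: sum(o == c for o, c in zip(observed, code))
    return max(code_table, key=lambda word: score(code_table[word]))

print(classify(("flat", "vertical", "palm_in", "twist", "arc")))  # -> thank_you
```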
Manna, Debashree; Kesharwani, Manoj K; Sylvetsky, Nitai; Martin, Jan M L
2017-07-11
Benchmark ab initio energies for BEGDB and WATER27 data sets have been re-examined at the MP2 and CCSD(T) levels with both conventional and explicitly correlated (F12) approaches. The basis set convergence of both conventional and explicitly correlated methods has been investigated in detail, both with and without counterpoise corrections. For the MP2 and CCSD-MP2 contributions, rapid basis set convergence observed with explicitly correlated methods is compared to conventional methods. However, conventional, orbital-based calculations are preferred for the calculation of the (T) term, since it does not benefit from F12. CCSD(F12*) converges somewhat faster with the basis set than CCSD-F12b for the CCSD-MP2 term. The performance of various DFT methods is also evaluated for the BEGDB data set, and results show that Head-Gordon's ωB97X-V and ωB97M-V functionals outperform all other DFT functionals. Counterpoise-corrected DSD-PBEP86 and raw DSD-PBEPBE-NL also perform well and are close to MP2 results. In the WATER27 data set, the anionic (deprotonated) water clusters exhibit unacceptably slow basis set convergence with the regular cc-pVnZ-F12 basis sets, which have only diffuse s and p functions. To overcome this, we have constructed modified basis sets, denoted aug-cc-pVnZ-F12 or aVnZ-F12, which have been augmented with diffuse functions on the higher angular momenta. The calculated final dissociation energies of BEGDB and WATER27 data sets are available in the Supporting Information. Our best calculated dissociation energies can be reproduced through n-body expansion, provided one pushes to the basis set and electron correlation limit for the two-body term; for the three-body term, post-MP2 contributions (particularly CCSD-MP2) are important for capturing the three-body dispersion effects. Terms beyond four-body can be adequately captured at the MP2-F12 level.
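The n-body expansion invoked in the closing sentences has the standard form below (our restatement for a cluster of N monomers; the authors' conventions may differ in detail):

```latex
E_{\text{int}} = \sum_{i<j} \Delta E_{ij} + \sum_{i<j<k} \Delta E_{ijk} + \cdots ,
\qquad
\Delta E_{ij} = E_{ij} - E_i - E_j ,
```
```latex
\Delta E_{ijk} = E_{ijk}
  - \left( \Delta E_{ij} + \Delta E_{ik} + \Delta E_{jk} \right)
  - \left( E_i + E_j + E_k \right) ,
```

so that the two-body sum carries the bulk of the interaction energy and the three-body term isolates the cooperative (e.g., dispersion) effects discussed above.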
The chromosomal analysis of teaching: the search for promoter genes.
Skeff, Kelley M
2007-01-01
The process of teaching is ubiquitous in medicine, both in the practice of medicine and the promotion of medical science. Yet, until the last 50 years, the process of medical teaching had been neglected. To improve this process, the research group at the Stanford Faculty Development Center for Medical Teachers developed an educational framework to assist teachers to analyze and improve the teaching process. Utilizing empirical data drawn from videotapes of actual clinical teaching and educational literature, we developed a seven-category systematic scheme for the analysis of medical teaching, identifying key areas and behaviors that could enable teachers to enhance their effectiveness. The organizational system of this scheme is similar to that used in natural sciences, such as genetics. Whereas geneticists originally identified chromosomes and ultimately individual and related genes, this classification system identifies major categories and specific teaching behaviors that can enhance teaching effectiveness. Over the past two decades, this organizational framework has provided the basis for a variety of faculty development programs for improving teaching effectiveness. Results of those programs have revealed several positive findings, including the usefulness of the methods for a wide variety of medical teachers in a variety of settings. This research indicates that the development of a framework for analysis has been, as in the natural sciences, an important way to improve the science of the art of teaching.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clausen, Alison, E-mail: aliclausen@protocol.com.a; Vu, Hoang Hoa, E-mail: hoanghoavu@yahoo.co; Pedrono, Miguel, E-mail: pedrono@cirad.f
Vietnam has one of the fastest growing economies in the world and has achieved significant socio-economic development in recent years. However this growth is placing increased pressure on an already depleted natural environment. Environmental impact assessment (EIA) is recognised by the Government and international organizations as an important tool in the management of the impacts of future development on the country's natural resource base. The Government's commitment to EIA has been demonstrated through the development and adoption of the Law on Environment Protection (Revised) in 2005 which sets out the requirements for EIA and which represents a major step in the development of a robust legislative framework for EIA in Vietnam. The Law on Environment Protection (Revised) 2005 has now been operational for several years and we have undertaken an evaluation of the resulting EIA system in Vietnam. We argue that while significant improvements have been achieved in the EIA policy framework, an important gap remains between EIA theory and practice. We contend that the basis of the current EIA legislation is strong and that future developments of the EIA system in Vietnam should focus on improving capacity of EIA practitioners rather than further substantial legislative change. Such improvements would allow the Vietnamese EIA system to emerge as an effective and efficient tool for environmental management in Vietnam and as a model EIA framework for other developing countries.
Souba, Wiley W
2011-02-24
The ethical foundation of the medical profession, which values service above reward and holds the doctor-patient relationship as inviolable, continues to be challenged by the commercialization of health care. This article contends that a realigned leadership framework - one that distinguishes being a leader as the ontological basis for what leaders know, have, and do - is central to safeguarding medicine's ethical foundation. Four ontological pillars of leadership - awareness, commitment, integrity, and authenticity - are proposed as fundamental elements that anchor this foundation and the basic tenets of professionalism. Ontological leadership is shaped by and accessible through language; what health care leaders create in language "uses" them by providing a point of view (a context) within and from which they orient their conversations, decisions, and conduct such that they are ethically aligned and grounded. This contextual leadership framework exposes for us the limitations imposed by our mental maps, creating new opportunity sets for being and action (previously unavailable) that embody medicine's charter on professionalism. While this leadership methodology contrasts with the conventional results-oriented model where leading is generally equated with a successful clinical practice, a distinguished research program, or a promotion, it is not a replacement for it; indeed, results are essential for performance. Rather, being and action are interrelated and their correlated nature equips leaders with a framework for tackling health care's most complex problems in a manner that preserves medicine's venerable ethical heritage.
Establishing a framework for studying the emerging cislunar economy
NASA Astrophysics Data System (ADS)
Entrena Utrilla, Carlos Manuel
2017-12-01
Recent developments from the New Space industry have seen the appearance of a number of new companies interested in the creation of a self-sustained economy in cislunar space. Industries such as asteroid mining, Moon mining, and on-orbit manufacturing require the existence of a developed economy in space for the business cases to close in the long term, without the need to have the government as a permanent anchor customer. However, most studies and business plans do not consider the global picture of the cislunar economy, and only work with Earth-based activities when evaluating possible customers and competition. This work aims to set the framework for the study of the cislunar economy as a whole by identifying the market verticals that will form the basis of the economic activities in cislunar space, focusing on activities that create value in space for space. The prospective cislunar market verticals are identified based on a comprehensive review of current space activities and of proposed future business cases. This framework can be expanded in the future with evaluations of market sizes and relationships between verticals to inform business plans and investment decisions. The study was performed during the first two months in the summer of 2016 as part of the author's internship at NASA's Space Portal Office to complete the International Space University Master of Space Studies.
Policy development for biodiversity offsets: a review of offset frameworks.
McKenney, Bruce A; Kiesecker, Joseph M
2010-01-01
Biodiversity offsets seek to compensate for residual environmental impacts of planned developments after appropriate steps have been taken to avoid, minimize or restore impacts on site. Offsets are emerging as an increasingly employed mechanism for achieving net environmental benefits, with offset policies being advanced in a wide range of countries (i.e., United States, Australia, Brazil, Colombia, and South Africa). To support policy development for biodiversity offsets, we review a set of major offset policy frameworks-US wetlands mitigation, US conservation banking, EU Natura 2000, Australian offset policies in New South Wales, Victoria, and Western Australia, and Brazilian industrial and forest offsets. We compare how the frameworks define offset policy goals, approach the mitigation process, and address six key issues for implementing offsets: (1) equivalence of project impacts with offset gains; (2) location of the offset relative to the impact site; (3) "additionality" (a new contribution to conservation) and acceptable types of offsets; (4) timing of project impacts versus offset benefits; (5) offset duration and compliance; and (6) "currency" and mitigation replacement ratios. We find substantial policy commonalities that may serve as a sound basis for future development of biodiversity offsets policy. We also identify issues requiring further policy guidance, including how best to: (1) ensure conformance with the mitigation hierarchy; (2) identify the most environmentally preferable offsets within a landscape context; and (3) determine appropriate mitigation replacement ratios.
On the Use of a Mixed Gaussian/Finite-Element Basis Set for the Calculation of Rydberg States
NASA Technical Reports Server (NTRS)
Thuemmel, Helmar T.; Langhoff, Stephen (Technical Monitor)
1996-01-01
Configuration-interaction studies are reported for the Rydberg states of the helium atom using mixed Gaussian/finite-element (GTO/FE) one-particle basis sets. Standard Gaussian valence basis sets are employed, like those used extensively in quantum chemistry calculations. It is shown that the term values for high-lying Rydberg states of the helium atom can be obtained accurately (within 1 cm⁻¹), even for a small GTO set, by augmenting the n-particle space with configurations in which orthonormalized interpolation polynomials are singly occupied.
Structures of cage, prism, and book isomers of water hexamer from broadband rotational spectroscopy.
Pérez, Cristóbal; Muckle, Matt T; Zaleski, Daniel P; Seifert, Nathan A; Temelso, Berhane; Shields, George C; Kisiel, Zbigniew; Pate, Brooks H
2012-05-18
Theory predicts the water hexamer to be the smallest water cluster with a three-dimensional hydrogen-bonding network as its minimum energy structure. There are several possible low-energy isomers, and calculations with different methods and basis sets assign them different relative stabilities. Previous experimental work has provided evidence for the cage, book, and cyclic isomers, but no experiment has identified multiple coexisting structures. Here, we report that broadband rotational spectroscopy in a pulsed supersonic expansion unambiguously identifies all three isomers; we determined their oxygen framework structures by means of oxygen-18-substituted water (H₂¹⁸O). Relative isomer populations at different expansion conditions establish that the cage isomer is the minimum energy structure. Rotational spectra consistent with predicted heptamer and nonamer structures have also been identified.
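The quantity fitted in such spectra, the set of rotational constants, follows from the rigid geometry through the inertia tensor. The sketch below uses a toy water-monomer geometry, not the hexamer structures from the paper; the 505379 MHz·amu·Å² conversion factor is the standard one for h/(8π²I):

```python
# Rotational constants (A, B, C) from a rigid geometry via the inertia tensor.
import numpy as np

masses = np.array([15.999, 1.008, 1.008])                 # O, H, H (amu)
coords = np.array([[0.0, 0.0, 0.117], [0.0, 0.757, -0.469],
                   [0.0, -0.757, -0.469]])                # Angstrom (approximate)

com = (masses[:, None] * coords).sum(0) / masses.sum()    # center of mass
r = coords - com
I = np.zeros((3, 3))
for m, (x, y, z) in zip(masses, r):
    I += m * np.array([[y*y + z*z, -x*y, -x*z],
                       [-x*y, x*x + z*z, -y*z],
                       [-x*z, -y*z, x*x + y*y]])
principal = np.linalg.eigvalsh(I)                         # principal moments, amu A^2
A, B, C = sorted(505379.0 / principal, reverse=True)      # B[MHz] = 505379 / I
print(f"A = {A:.0f}  B = {B:.0f}  C = {C:.0f}  MHz")
```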
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1978-01-31
This report is a part of the interim report documentation for the Global Spent Fuel Logistics System (GSFLS) study. This report describes a global framework that evaluates spent fuel disposition requirements, influencing factors and strategies. A broad sampling of foreign governmental officials, electric utility spokesmen and nuclear power industry officials responsible for GSFLS policies, plans and programs were surveyed as to their views with respect to national and international GSFLS related considerations. The results of these GSFLS visit findings are presented herein. These findings were then evaluated in terms of technical, institutional and legal/regulatory implications. The GSFLS evaluations, in conjunction with perceived US spent fuel objectives, formed the basis for selecting a set of GSFLS strategies which are reported herein.
Hamiltonian BFV-BRST theory of closed quantum cosmological models
NASA Astrophysics Data System (ADS)
Kamenshchik, A. Yu.; Lyakhovich, S. L.
1997-08-01
We introduce and study a new discrete basis of gravity constraints by making use of the harmonic expansion for closed cosmological models. The full set of constraints is split into area-preserving spatial diffeomorphisms, forming a closed subalgebra, and Virasoro-like generators. The operatorial Hamiltonian BFV-BRST quantization is performed in the framework of a perturbative expansion in the dimensionless parameter which is a positive power of the ratio of the Planck volume to the volume of the Universe. For the (N + 1) - dimensional generalization of a stationary closed Bianchi-I cosmology the nilpotency condition for the BRST operator is examined in the first quantum approximation. It turns out that a relationship between the dimensionality of the space and the spectrum of matter fields emerges from the requirement of quantum consistency of the model.
Variable habitat conditions drive species covariation in the human microbiota
Mora, Thierry; Walczak, Aleksandra M.
2017-01-01
Two species with similar resource requirements respond in a characteristic way to variations in their habitat—their abundances rise and fall in concert. We use this idea to learn how bacterial populations in the microbiota respond to habitat conditions that vary from person-to-person across the human population. Our mathematical framework shows that habitat fluctuations are sufficient for explaining intra-bodysite correlations in relative species abundances from the Human Microbiome Project. We explicitly show that the relative abundances of closely related species are positively correlated and can be predicted from taxonomic relationships. We identify a small set of functional pathways related to metabolism and maintenance of the cell wall that form the basis of a common resource sharing niche space of the human microbiota. PMID:28448493
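The central claim, that a shared habitat response induces covariation without any direct interaction, can be illustrated with a few lines of simulation; the response coefficients and noise level below are arbitrary illustrative choices, not the paper's fitted values:

```python
# Toy illustration: two non-interacting species driven by the same habitat
# variable become positively correlated across hosts.
import numpy as np

rng = np.random.default_rng(0)
habitat = rng.normal(size=1000)                       # per-host habitat condition
noise_a, noise_b = rng.normal(size=(2, 1000)) * 0.5   # species-specific noise
log_a = 1.2 * habitat + noise_a                       # both species respond with
log_b = 0.9 * habitat + noise_b                       # the same sign to the habitat
print("correlation:", np.corrcoef(log_a, log_b)[0, 1])  # clearly positive
```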
NASA Astrophysics Data System (ADS)
Leon, Neira B. Oscar; Fabio, Mejía Elio; Elizabeth, y. Rincón B.
2008-04-01
The organic molecules of a chain structure containing phenyl, oxazole and oxadiazole rings are used in different combinations as active media for tunable lasers. From this viewpoint, we focused on the theoretical study of organic compounds of three rings, which have similar optical properties (fluorescence and laser properties). The main goal of this study is to compare the electronic structure through the analysis of molecular global descriptors, defined in the DFT framework, of 2-[2-X-phenyl]-5-phenyl-1,3-Oxazole, 2-[2-X-phenyl]-5-phenyl-1,3,4-Oxadiazole, and 2-[2-X-phenyl]-5-phenyl-furane with X = H, F and Cl. The basis set used was 6-31+G(d).
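The global descriptors referred to are typically evaluated from finite differences of total energies. One widespread set of conventions is shown below; note that the hardness η is sometimes defined with an extra factor of 1/2, and the paper's exact definitions may differ:

```latex
\mu \simeq -\tfrac{1}{2}(I + A), \qquad
\eta \simeq I - A, \qquad
\omega = \frac{\mu^{2}}{2\eta},
```

with the ionization potential and electron affinity obtained as I ≈ E(N-1) - E(N) and A ≈ E(N) - E(N+1) for the N-electron system.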
New DMFT capabilities in CASTEP
NASA Astrophysics Data System (ADS)
Plekhanov, Evgeny; Sacksteder, Vincent; Hasnip, Phil; Probert, Matt; Clark, Stewart; Weber, Cedric; Refson, Keith
We present the first implementation of Dynamical Mean-Field Theory in the UK's major ab initio code CASTEP. This implementation: i) is modular; ii) allows great flexibility in choosing the local basis set for downfolding/upfolding of the self-energy; iii) permits a wide choice of impurity solvers (including external solver libraries); and iv) gives the user the possibility to use several self-consistency schemes and to calculate total energies and forces. We explain in detail the theoretical framework used. We benchmark our implementation on several strongly correlated insulating systems with d- and f-shells: γ-Ce and Ce2O3, using the Hubbard I and CTHYB-QMC solvers. Our results are in excellent agreement with reference data published previously in the literature. EPSRC-funded project "Strong Correlation meets Materials Modelling: DMFT and GW in CASTEP".
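For orientation, the self-consistency cycle that any DMFT implementation iterates can be sketched in a few lines. The toy below uses a Bethe-lattice bath and an atomic-limit (Hubbard-I-like) self-energy at particle-hole symmetry; it illustrates the loop structure only and is in no way CASTEP's actual code:

```python
# Schematic DMFT self-consistency loop (Bethe lattice, Matsubara axis).
import numpy as np

beta, U, t, n_iw = 20.0, 2.0, 0.5, 256              # inverse T, interaction, hopping
iw = 1j * np.pi / beta * (2 * np.arange(n_iw) + 1)  # fermionic Matsubara frequencies

# Atomic-limit (Hubbard-I-like) self-energy at particle-hole symmetry; a real
# solver (e.g. CT-HYB) would instead take the bath Green's function as input.
sigma = (U / 2) ** 2 / iw

g = 1.0 / iw                                        # initial local Green's function
for it in range(200):
    g0_inv = iw - t ** 2 * g                        # Bethe-lattice self-consistency
    g_new = 1.0 / (g0_inv - sigma)                  # impurity Green's function
    if np.max(np.abs(g_new - g)) < 1e-12:
        break
    g = 0.5 * g + 0.5 * g_new                       # linear mixing for stability
print(f"converged in {it} iterations")
```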
NASA Astrophysics Data System (ADS)
Aurenheimer, C.
The process of transfer of jurisdictions on environmental matters from the central administration to an autonomous regional government is described. The main approaches taken by different autonomous regions in Spain to deal with the problems of setting up an environmental administrative organization are also briefly discussed. Finally, the specific structure of the environmental administration in the Comunidad Valenciana and the main guidelines of its environmental policies are explained. It is concluded that earth science can provide a very useful basis for the drafting of these policies in the region, because of the specific nature of the problems to be dealt with, related mainly to water management, soil conservation, land occupation, natural hazards and preservation of natural areas.
NASA Astrophysics Data System (ADS)
Ehmann, B.; Balázs, L.; Fülöp, É.; Hargitai, R.; Kabai, P.; Péley, B.; Pólya, T.; Vargha, A.; László, J.
2011-05-01
This paper is about a pilot application of narrative psychological content analysis in the psychological status monitoring of Crew 71 of a space analog simulation environment, the Mars Desert Research Station (MDRS). Both the method and its theoretical framework, Scientific Narrative Psychology, are original developments by Hungarian psychologists [5] (László, 2008). The software was NooJ, a multilingual linguistic development environment [11] (Silberztein, 2008). Three measures were conceptualized and assessed: emotional status, team spirit and subjective physical comfort. The results showed the patterns of these three measures on a daily basis at group level, and allowed for detecting individual differences as well. The method is adaptable to languages involved in space psychology, e.g. Russian, French and German in addition to English.
Benchmarking physician performance, part 2.
Collier, David A; Collier, Cindy Eddins; Kelly, Thomas M
2006-01-01
Part 1 of this article (January-February 2006) reviewed ways of measuring the work of physicians through methods such as data envelopment analysis (DEA) and relative value units (RVUs). These techniques provide insights into: 1. Who are the best-performing physicians? 2. Who are the underperforming physicians? 3. How can underperforming physicians improve? 4. What are the underperformers' performance targets? 5. How do you deal with full- and part-time physicians in a university setting? Part 2 compares the performance of 16 primary care physicians in the same medical specialty using DEA efficiency scores. DEA is capable of modeling multiple criteria and automatically determines the relative weights of each performance measure. This research also provides a preliminary framework for how work measurement and DEA can be used as a basis for a medical team or physician compensation system.
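As an illustration of the DEA machinery the article builds on, the sketch below solves the input-oriented CCR model in multiplier form with scipy. The physician input/output data are invented placeholders, not figures from the article, and the article's actual model specification may differ:

```python
# Minimal DEA (CCR, input-oriented, multiplier form) efficiency scores.
import numpy as np
from scipy.optimize import linprog

# Toy data: rows = physicians; inputs (e.g. clinic hours, support FTEs)
X = np.array([[40.0, 2.0], [60.0, 3.0], [50.0, 4.0], [45.0, 2.5]])
# Outputs (e.g. RVUs, patients seen)
Y = np.array([[900.0, 300.0], [1100.0, 420.0], [800.0, 500.0], [950.0, 330.0]])

def ccr_efficiency(X, Y, o):
    n, m = X.shape                                      # units, inputs
    s = Y.shape[1]                                      # outputs; variables z = [u, v]
    c = np.concatenate([-Y[o], np.zeros(m)])            # maximize u . y_o
    A_eq = np.concatenate([np.zeros(s), X[o]])[None, :] # v . x_o = 1 (normalization)
    A_ub = np.hstack([Y, -X])                           # u . y_j - v . x_j <= 0 for all j
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n),
                  A_eq=A_eq, b_eq=[1.0], bounds=(0, None))
    return -res.fun                                     # efficiency in (0, 1]

for o in range(len(X)):
    print(f"physician {o}: efficiency = {ccr_efficiency(X, Y, o):.3f}")
```

DEA's appeal in this setting is visible in the constraints: each physician is scored with the weights most favorable to them, so no a priori weighting of RVUs versus patient counts has to be imposed.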
NASA Astrophysics Data System (ADS)
Graham, Matthew J.
To flourish in the new data-intensive environment of twenty-first century science, we need to evolve new skills. These can be expressed in terms of the systemized framework that formed the basis of mediaeval education—the trivium (logic, grammar and rhetoric) and quadrivium (arithmetic, geometry, music and astronomy). However, rather than focusing on number, data are the new keystone. We need to understand what rules they obey, how they are symbolized and communicated, and what their relationship is to physical space and time. In this paper, we will review this understanding in terms of the technologies and processes that data require. We contend that, at least, an appreciation of all these aspects is crucial to enabling us to extract scientific information and knowledge from the data sets that threaten to engulf and overwhelm us.
Oller, Stephen D
2005-01-01
The pragmatic mapping process and its variants have proven effective in second language learning and teaching. The goal of this paper is to show that the same process applies in teaching and intervention with disordered populations. A secondary goal, ultimately more important, is to give clinicians, teachers, and other educators a tool-kit, or a framework, from which they can evaluate and implement interventions. What is offered is an introduction to a general theory of signs and some examples of how it can be applied in treating communication disorders. (1) Readers will be able to relate the three theoretical consistency requirements to language teaching and intervention. (2) Readers will be introduced to a general theory of signs that provides a basis for evaluating and implementing interventions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bolisetti, Chandrakanth; Yu, Chingching; Coleman, Justin
This report provides a framework for assessing the benefits of seismic isolation and exercises the framework on a Generic Department of Energy Nuclear Facility (GDNF). These benefits are (1) reduction in the risk of unacceptable seismic performance and a dramatic reduction in the probability of unacceptable performance at beyond-design basis shaking, and (2) a reduction in capital cost at sites with moderate to high seismic hazard. The framework includes probabilistic risk assessment and estimates of overnight capital cost for the GDNF.
Wen, Dingqiao; Yu, Yun; Hahn, Matthew W.; Nakhleh, Luay
2016-01-01
The role of hybridization and subsequent introgression has been demonstrated in an increasing number of species. Recently, Fontaine et al. (Science, 347, 2015, 1258524) conducted a phylogenomic analysis of six members of the Anopheles gambiae species complex. Their analysis revealed a reticulate evolutionary history and pointed to extensive introgression on all four autosomal arms. The study further highlighted the complex evolutionary signals that the co-occurrence of incomplete lineage sorting (ILS) and introgression can give rise to in phylogenomic analyses. While tree-based methodologies were used in the study, phylogenetic networks provide a more natural model to capture reticulate evolutionary histories. In this work, we reanalyse the Anopheles data using a recently devised framework that combines the multispecies coalescent with phylogenetic networks. This framework allows us to capture ILS and introgression simultaneously, and forms the basis for statistical methods for inferring reticulate evolutionary histories. The new analysis reveals a phylogenetic network with multiple hybridization events, some of which differ from those reported in the original study. To elucidate the extent and patterns of introgression across the genome, we devise a new method that quantifies the use of reticulation branches in the phylogenetic network by each genomic region. Applying the method to the mosquito data set reveals the evolutionary history of all the chromosomes. This study highlights the utility of ‘network thinking’ and the new insights it can uncover, in particular in phylogenomic analyses of large data sets with extensive gene tree incongruence. PMID:26808290
Perturbation corrections to Koopmans' theorem. V - A study with large basis sets
NASA Technical Reports Server (NTRS)
Chong, D. P.; Langhoff, S. R.
1982-01-01
The vertical ionization potentials of N2, F2 and H2O were calculated by perturbation corrections to Koopmans' theorem using six different basis sets. The largest set used includes several sets of polarization functions. Comparison is made with measured values and with results of computations using Green's functions.
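For context, Koopmans' theorem approximates the k-th vertical ionization potential by the negative of the corresponding Hartree-Fock orbital energy; the perturbation corrections studied here account for the orbital relaxation and electron correlation that this approximation neglects:

```latex
\mathrm{IP}_k \;\approx\; -\varepsilon_k^{\mathrm{HF}},
\qquad
\mathrm{IP}_k \;=\; -\varepsilon_k^{\mathrm{HF}} + \Delta_{\text{relax}} + \Delta_{\text{corr}} .
```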
A new basis set for molecular bending degrees of freedom.
Jutier, Laurent
2010-07-21
We present a new basis set as an alternative to Legendre polynomials for the variational treatment of bending vibrational degrees of freedom in order to greatly reduce the number of basis functions. This basis set is inspired by the harmonic oscillator eigenfunctions but is defined for a bending angle θ in the range [0, π]. The aim is to bring the basis functions closer to the nature of the final (ro)vibronic wave functions. Our methodology is extended to complicated potential energy surfaces, such as quasilinearity or multiequilibrium geometries, by using several free parameters in the basis functions. These parameters allow several density maxima, linear or not, around which the basis functions will be mainly located. Divergences at linearity in integral computations are resolved as for generalized Legendre polynomials. All integral computations required for the evaluation of molecular Hamiltonian matrix elements are given for both the discrete variable representation and the finite basis representation. Convergence tests for the low-energy vibronic states of HCCH²⁺, HCCH⁺, and HCCS are presented.
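The reference point for the new basis is the Legendre-polynomial orthogonality on θ ∈ [0, π] with the sin θ volume element, which can be verified by quadrature; this snippet is illustrative only and is not the paper's code:

```python
# Quadrature check of P_l(cos theta) orthogonality on [0, pi] with sin(theta) weight.
import numpy as np
from scipy.special import eval_legendre
from scipy.integrate import quad

for l in range(4):
    for m in range(4):
        f = lambda th: (eval_legendre(l, np.cos(th))
                        * eval_legendre(m, np.cos(th)) * np.sin(th))
        val, _ = quad(f, 0.0, np.pi)
        expected = 2.0 / (2 * l + 1) if l == m else 0.0   # substitution x = cos(theta)
        assert abs(val - expected) < 1e-8
print("P_l(cos theta) orthogonality on [0, pi] verified")
```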
NASA Technical Reports Server (NTRS)
Banks, H. T.; Ito, K.
1988-01-01
Numerical techniques for parameter identification in distributed-parameter systems are developed analytically. A general convergence and stability framework (for continuous dependence on observations) is derived for first-order systems on the basis of (1) a weak formulation in terms of sesquilinear forms and (2) the resolvent convergence form of the Trotter-Kato approximation. The extension of this framework to second-order systems is considered.
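In this setting a first-order system is typically posed weakly. Our paraphrase of the standard formulation (not a quotation of the report) reads: find u(t) in the state space V such that

```latex
\langle \dot u(t), \varphi \rangle + \sigma(q)\big(u(t), \varphi\big)
  = \langle f(t), \varphi \rangle
\qquad \forall \varphi \in V ,
```

where the sesquilinear form σ(q) carries the unknown parameters q, which are then estimated by minimizing a least-squares misfit between the model output and the observations; the Trotter-Kato result supplies convergence of the finite-dimensional approximants.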
Borrok, D.; Turner, B.F.; Fein, J.B.
2005-01-01
Adsorption onto bacterial cell walls can significantly affect the speciation and mobility of aqueous metal cations in many geologic settings. However, a unified thermodynamic framework for describing bacterial adsorption reactions does not exist. This problem originates from the numerous approaches that have been chosen for modeling bacterial surface protonation reactions. In this study, we compile all currently available potentiometric titration datasets for individual bacterial species, bacterial consortia, and bacterial cell wall components. Using a consistent, four discrete site, non-electrostatic surface complexation model, we determine total functional group site densities for all suitable datasets, and present an averaged set of 'universal' thermodynamic proton binding and site density parameters for modeling bacterial adsorption reactions in geologic systems. Modeling results demonstrate that the total concentrations of proton-active functional group sites for the 36 bacterial species and consortia tested are remarkably similar, averaging 3.2 ± 1.0 (1σ) × 10⁻⁴ moles/wet gram. Examination of the uncertainties involved in the development of proton-binding modeling parameters suggests that ignoring factors such as bacterial species, ionic strength, temperature, and growth conditions introduces relatively small error compared to the unavoidable uncertainty associated with the determination of cell abundances in realistic geologic systems. Hence, we propose that reasonable estimates of the extent of bacterial cell wall deprotonation can be made using averaged thermodynamic modeling parameters from all of the experiments that are considered in this study, regardless of bacterial species used, ionic strength, temperature, or growth condition of the experiment. The average site densities for the four discrete sites are 1.1 ± 0.7 × 10⁻⁴, 9.1 ± 3.8 × 10⁻⁵, 5.3 ± 2.1 × 10⁻⁵, and 6.6 ± 3.0 × 10⁻⁵ moles/wet gram bacteria for the sites with pKa values of 3.1, 4.7, 6.6, and 9.0, respectively. It is our hope that this thermodynamic framework for modeling bacteria-proton binding reactions will also provide the basis for the development of an internally consistent set of bacteria-metal binding constants. 'Universal' constants for bacteria-metal binding reactions can then be used in conjunction with equilibrium constants for other important metal adsorption and complexation reactions to calculate the overall distribution of metals in realistic geologic systems.
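Because the model is non-electrostatic, each discrete site follows a simple Henderson-Hasselbalch relation, so the averaged parameters quoted above translate directly into a deprotonation curve; the sketch below uses only those published averages:

```python
# Deprotonated site concentration vs pH from the averaged 'universal' parameters.
import numpy as np

pKa   = np.array([3.1, 4.7, 6.6, 9.0])              # four discrete sites
sites = np.array([1.1e-4, 9.1e-5, 5.3e-5, 6.6e-5])  # mol per wet gram

def deprotonated(pH):
    frac = 1.0 / (1.0 + 10.0 ** (pKa - pH))         # [A-]/([A-]+[HA]) per site
    return np.dot(sites, frac)                      # total deprotonated sites

for pH in (3, 5, 7, 9):
    print(f"pH {pH}: {deprotonated(pH):.2e} mol deprotonated sites / wet gram")
```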
A common evaluation framework for the African Health Initiative.
Bryce, Jennifer; Requejo, Jennifer Harris; Moulton, Lawrence H; Ram, Malathi; Black, Robert E
2013-01-01
The African Health Initiative includes highly diverse partnerships in five countries (Ghana, Mozambique, Rwanda, Tanzania, and Zambia), each of which is working to improve population health by strengthening health systems and to evaluate the results. One aim of the Initiative is to generate cross-site learning that can inform implementation in the five partnerships during the project period and identify lessons that may be generalizable to other countries in the region. Collaborators in the Initiative developed a common evaluation framework as a basis for this cross-site learning. This paper describes the components of the framework; this includes the conceptual model, core metrics to be measured in all sites, and standard guidelines for reporting on the implementation of partnership activities and contextual factors that may affect implementation, or the results it produces. We also describe the systems that have been put in place for data management, data quality assessments, and cross-site analysis of results. The conceptual model for the Initiative highlights points in the causal chain between health system strengthening activities and health impact where evidence produced by the partnerships can contribute to learning. This model represents an important advance over its predecessors by including contextual factors and implementation strength as potential determinants, and explicitly including equity as a component of both outcomes and impact. Specific measurement challenges include the prospective documentation of program implementation and contextual factors. Methodological issues addressed in the development of the framework include the aggregation of data collected using different methods and the challenge of evaluating a complex set of interventions being improved over time based on continuous monitoring and intermediate results.
The Digital Anatomist Distributed Framework and Its Applications to Knowledge-based Medical Imaging
Brinkley, James F.; Rosse, Cornelius
1997-01-01
The domain of medical imaging is anatomy. Therefore, anatomic knowledge should be a rational basis for organizing and analyzing images. The goals of the Digital Anatomist Program at the University of Washington include the development of an anatomically based software framework for organizing, analyzing, visualizing and utilizing biomedical information. The framework is based on representations for both spatial and symbolic anatomic knowledge, and is being implemented in a distributed architecture in which multiple client programs on the Internet are used to update and access an expanding set of anatomical information resources. The development of this framework is driven by several practical applications, including symbolic anatomic reasoning, knowledge based image segmentation, anatomy information retrieval, and functional brain mapping. Since each of these areas involves many difficult image processing issues, our research strategy is an evolutionary one, in which applications are developed somewhat independently, and partial solutions are integrated in a piecemeal fashion, using the network as the substrate. This approach assumes that networks of interacting components can synergistically work together to solve problems larger than either could solve on its own. Each of the individual projects is described, along with evaluations that show that the individual components are solving the problems they were designed for, and are beginning to interact with each other in a synergistic manner. We argue that this synergy will increase, not only within our own group, but also among groups as the Internet matures, and that an anatomic knowledge base will be a useful means for fostering these interactions. PMID:9147337
The international spinal cord injury endocrine and metabolic function basic data set.
Bauman, W A; Biering-Sørensen, F; Krassioukov, A
2011-10-01
To develop the International Spinal Cord Injury (SCI) Endocrine and Metabolic Function Basic Data Set within the framework of the International SCI Data Sets that would facilitate consistent collection and reporting of basic endocrine and metabolic findings in the SCI population. International. The International SCI Endocrine and Metabolic Function Data Set was developed by a working group. The initial data set document was revised on the basis of suggestions from members of the Executive Committee of the International SCI Standards and Data Sets, the International Spinal Cord Society (ISCoS) Executive and Scientific Committees, American Spinal Injury Association (ASIA) Board, other interested organizations and societies, and individual reviewers. In addition, the data set was posted for 2 months on ISCoS and ASIA websites for comments. The final International SCI Endocrine and Metabolic Function Data Set contains questions on the endocrine and metabolic conditions diagnosed before and after spinal cord lesion. If available, information collected before injury is to be obtained only once, whereas information after injury may be collected at any time. These data include information on diabetes mellitus, lipid disorders, osteoporosis, thyroid disease, adrenal disease, gonadal disease and pituitary disease. The question of gonadal status includes stage of sexual development and that for females also includes menopausal status. Data will be collected for body mass index and for the fasting serum lipid profile. The complete instructions for data collection and the data sheet itself are freely available on the websites of ISCoS (http://www.iscos.org.uk) and ASIA (http://www.asia-spinalinjury.org).
Regan, Tracey J; Taylor, Barbara L; Thompson, Grant G; Cochrane, Jean Fitts; Ralls, Katherine; Runge, Michael C; Merrick, Richard
2013-08-01
Lack of guidance for interpreting the definitions of endangered and threatened in the U.S. Endangered Species Act (ESA) has resulted in case-by-case decision making leaving the process vulnerable to being considered arbitrary or capricious. Adopting quantitative decision rules would remedy this but requires the agency to specify the relative urgency concerning extinction events over time, cutoff risk values corresponding to different levels of protection, and the importance given to different types of listing errors. We tested the performance of 3 sets of decision rules that use alternative functions for weighting the relative urgency of future extinction events: a threshold rule set, which uses a decision rule of x% probability of extinction over y years; a concave rule set, where the relative importance of future extinction events declines exponentially over time; and a shoulder rule set that uses a sigmoid shape function, where relative importance declines slowly at first and then more rapidly. We obtained decision cutoffs by interviewing several biologists and then emulated the listing process with simulations that covered a range of extinction risks typical of ESA listing decisions. We evaluated performance of the decision rules under different data quantities and qualities on the basis of the relative importance of misclassification errors. Although there was little difference between the performance of alternative decision rules for correct listings, the distribution of misclassifications differed depending on the function used. Misclassifications for the threshold and concave listing criteria resulted in more overprotection errors, particularly as uncertainty increased, whereas errors for the shoulder listing criteria were more symmetrical. We developed and tested the framework for quantitative decision rules for listing species under the U.S. ESA. If policy values can be agreed on, use of this framework would improve the implementation of the ESA by increasing transparency and consistency.
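The three weighting shapes described can be sketched directly; the parameter values below are illustrative placeholders, not the authors' elicited cutoffs:

```python
# Sketch of the three time-weighting shapes: threshold, concave, shoulder.
import numpy as np

def threshold_weight(t, horizon=50.0):
    """Extinction events before the horizon count fully; later ones not at all."""
    return np.where(t <= horizon, 1.0, 0.0)

def concave_weight(t, rate=0.05):
    """Importance declines exponentially from the present."""
    return np.exp(-rate * t)

def shoulder_weight(t, midpoint=50.0, steepness=0.15):
    """Importance declines slowly at first, then rapidly (sigmoid shape)."""
    return 1.0 / (1.0 + np.exp(steepness * (t - midpoint)))

t = np.linspace(0, 100, 5)          # years into the future
for w in (threshold_weight, concave_weight, shoulder_weight):
    print(w.__name__, np.round(w(t), 3))
```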
NASA Astrophysics Data System (ADS)
Hill, J. Grant; Peterson, Kirk A.; Knizia, Gerald; Werner, Hans-Joachim
2009-11-01
Accurate extrapolation to the complete basis set (CBS) limit of valence correlation energies calculated with the explicitly correlated MP2-F12 and CCSD(T)-F12b methods has been investigated using a Schwenke-style approach for molecules containing both first and second row atoms. Extrapolation coefficients that are optimal for molecular systems containing first row elements differ from those optimized for second row analogs, hence values optimized for a combined set of first and second row systems are also presented. The new coefficients are shown to produce excellent results in both Schwenke-style and equivalent power-law-based two-point CBS extrapolations, with the MP2-F12/cc-pV(D,T)Z-F12 extrapolations producing an average error of just 0.17 mEh with a maximum error of 0.49 mEh for a collection of 23 small molecules. The use of larger basis sets, i.e., cc-pV(T,Q)Z-F12 and aug-cc-pV(Q,5)Z, in extrapolations of the MP2-F12 correlation energy leads to average errors that are smaller than the degree of confidence in the reference data (~0.1 mEh). The latter were obtained through use of very large basis sets in MP2-F12 calculations on small molecules containing both first and second row elements. CBS limits obtained from optimized coefficients for conventional MP2 are only comparable to the accuracy of the MP2-F12/cc-pV(D,T)Z-F12 extrapolation when the aug-cc-pV(5+d)Z and aug-cc-pV(6+d)Z basis sets are used. The CCSD(T)-F12b correlation energy is extrapolated as two distinct parts: CCSD-F12b and (T). While the CCSD-F12b extrapolations with smaller basis sets are statistically less accurate than those of the MP2-F12 correlation energies, this is presumably due to the slower basis set convergence of the CCSD-F12b method compared to MP2-F12. The use of larger basis sets in the CCSD-F12b extrapolations produces correlation energies with accuracies exceeding the confidence in the reference data (also obtained in large basis set F12 calculations). It is demonstrated that the use of the 3C(D) Ansatz is preferred for MP2-F12 CBS extrapolations. Optimal values of the geminal Slater exponent are presented for the diagonal, fixed amplitude Ansatz in MP2-F12 calculations, and these are also recommended for CCSD-F12b calculations.
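The two equivalent two-point extrapolation forms compared here can be summarized as follows (our restatement of the standard expressions, with E_m and E_n the correlation energies in the smaller and larger basis, F the optimized extrapolation coefficient, and β the corresponding power-law exponent):

```latex
% Schwenke-style two-point extrapolation
E_{\mathrm{CBS}} \approx E_m + F \, (E_n - E_m), \qquad m < n,
```
```latex
% equivalent power-law two-point form, F = n^{\beta} / (n^{\beta} - m^{\beta})
E_{\mathrm{CBS}} \approx \frac{n^{\beta} E_n - m^{\beta} E_m}{n^{\beta} - m^{\beta}} .
```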
A framework to support human factors of automation in railway intelligent infrastructure.
Dadashi, Nastaran; Wilson, John R; Golightly, David; Sharples, Sarah
2014-01-01
Technological and organisational advances have increased the potential for remote access and proactive monitoring of the infrastructure in various domains and sectors - water and sewage, oil and gas, and transport. Intelligent Infrastructure (II) is an architecture that potentially enables the generation of timely and relevant information about the state of any type of infrastructure asset, providing a basis for reliable decision-making. This paper reports an exploratory study to understand the concepts and human factors associated with II in the railway, largely drawing from structured interviews with key industry decision-makers and attachment to pilot projects. Outputs from the study include a data-processing framework defining the key human factors at different levels of the data structure within a railway II system and a system-level representation. The framework and other study findings will form a basis for human factors contributions to systems design elements such as information interfaces and role specifications.
NASA Astrophysics Data System (ADS)
Newton, Alice; Borja, Angel; Solidoro, Cosimo; Grégoire, Marilaure
2015-10-01
The Marine Strategy Framework Directive (MSFD; EC, 2008) is an ambitious European policy instrument that aims to achieve Good Environmental Status (GES) in the 5,720,000 km2 of European seas by 2020, using an Ecosystem Approach. GES is to be assessed using 11 descriptors and up to 56 indicators (European Commission, 2010), and the goal is for clean, healthy and productive seas that are the basis for marine-based development, known as Blue-Growth. The MSFD is one of many policy instruments, such as the Water Framework Directive, the Common Fisheries Policy and the Habitats Directive that, together, should result in "Healthy Oceans and Productive Ecosystems - HOPE". Researchers working together with stakeholders such as the Member States environmental agencies, the European Environmental Agency, and the Regional Sea Conventions, are to provide the scientific knowledge basis for the implementation of the MSFD. This represents both a fascinating challenge and a stimulating opportunity.
NASA Astrophysics Data System (ADS)
Khayyer, Abbas; Gotoh, Hitoshi; Falahaty, Hosein; Shimizu, Yuma
2018-02-01
Simulation of incompressible fluid flow-elastic structure interactions is targeted by using fully-Lagrangian mesh-free computational methods. A projection-based fluid model (moving particle semi-implicit (MPS)) is coupled with either a Newtonian or a Hamiltonian Lagrangian structure model (MPS or HMPS) in a mathematically-physically consistent manner. The fluid model is founded on the solution of Navier-Stokes and continuity equations. The structure models are configured either in the framework of Newtonian mechanics on the basis of conservation of linear and angular momenta, or Hamiltonian mechanics on the basis of variational principle for incompressible elastodynamics. A set of enhanced schemes are incorporated for projection-based fluid model (Enhanced MPS), thus, the developed coupled solvers for fluid structure interaction (FSI) are referred to as Enhanced MPS-MPS and Enhanced MPS-HMPS. Besides, two smoothed particle hydrodynamics (SPH)-based FSI solvers, being developed by the authors, are considered and their potential applicability and comparable performance are briefly discussed in comparison with MPS-based FSI solvers. The SPH-based FSI solvers are established through coupling of projection-based incompressible SPH (ISPH) fluid model and SPH-based Newtonian/Hamiltonian structure models, leading to Enhanced ISPH-SPH and Enhanced ISPH-HSPH. A comparative study is carried out on the performances of the FSI solvers through a set of benchmark tests, including hydrostatic water column on an elastic plate, high speed impact of an elastic aluminum beam, hydroelastic slamming of a marine panel and dam break with elastic gate.
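Schematically, the projection step shared by the MPS and ISPH fluid models advances an intermediate velocity and then enforces incompressibility through a pressure Poisson equation. This is a generic statement of the projection method, not the paper's exact particle discretization:

```latex
\mathbf{u}^{*} = \mathbf{u}^{n} + \Delta t \left( \nu \nabla^{2} \mathbf{u}^{n} + \mathbf{g} \right), \qquad
\nabla^{2} p^{\,n+1} = \frac{\rho}{\Delta t} \, \nabla \cdot \mathbf{u}^{*}, \qquad
\mathbf{u}^{n+1} = \mathbf{u}^{*} - \frac{\Delta t}{\rho} \nabla p^{\,n+1} .
```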
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mizrachi, Eshchar; Verbeke, Lieven; Christie, Nanette
As a consequence of their remarkable adaptability, fast growth, and superior wood properties, eucalypt tree plantations have emerged as key renewable feedstocks (over 20 million ha globally) for the production of pulp, paper, bioenergy, and other lignocellulosic products. However, most biomass properties such as growth, wood density, and wood chemistry are complex traits that are hard to improve in long-lived perennials. Systems genetics, a process of harnessing multiple levels of component trait information (e.g., transcript, protein, and metabolite variation) in populations that vary in complex traits, has proven effective for dissecting the genetics and biology of such traits. We havemore » applied a network-based data integration (NBDI) method for a systems-level analysis of genes, processes and pathways underlying biomass and bioenergy-related traits using a segregating Eucalyptus hybrid population. We show that the integrative approach can link biologically meaningful sets of genes to complex traits and at the same time reveal the molecular basis of trait variation. Gene sets identified for related woody biomass traits were found to share regulatory loci, cluster in network neighborhoods, and exhibit enrichment for molecular functions such as xylan metabolism and cell wall development. These findings offer a framework for identifying the molecular underpinnings of complex biomass and bioprocessing-related traits. Furthermore, a more thorough understanding of the molecular basis of plant biomass traits should provide additional opportunities for the establishment of a sustainable bio-based economy.« less
Mizrachi, Eshchar; Verbeke, Lieven; Christie, Nanette; ...
2017-01-17
As a consequence of their remarkable adaptability, fast growth, and superior wood properties, eucalypt tree plantations have emerged as key renewable feedstocks (over 20 million ha globally) for the production of pulp, paper, bioenergy, and other lignocellulosic products. However, most biomass properties such as growth, wood density, and wood chemistry are complex traits that are hard to improve in long-lived perennials. Systems genetics, a process of harnessing multiple levels of component trait information (e.g., transcript, protein, and metabolite variation) in populations that vary in complex traits, has proven effective for dissecting the genetics and biology of such traits. We have applied a network-based data integration (NBDI) method for a systems-level analysis of genes, processes and pathways underlying biomass and bioenergy-related traits using a segregating Eucalyptus hybrid population. We show that the integrative approach can link biologically meaningful sets of genes to complex traits and at the same time reveal the molecular basis of trait variation. Gene sets identified for related woody biomass traits were found to share regulatory loci, cluster in network neighborhoods, and exhibit enrichment for molecular functions such as xylan metabolism and cell wall development. These findings offer a framework for identifying the molecular underpinnings of complex biomass and bioprocessing-related traits. Furthermore, a more thorough understanding of the molecular basis of plant biomass traits should provide additional opportunities for the establishment of a sustainable bio-based economy.
Mizrachi, Eshchar; Verbeke, Lieven; Christie, Nanette; Fierro, Ana C; Mansfield, Shawn D; Davis, Mark F; Gjersing, Erica; Tuskan, Gerald A; Van Montagu, Marc; Van de Peer, Yves; Marchal, Kathleen; Myburg, Alexander A
2017-01-31
As a consequence of their remarkable adaptability, fast growth, and superior wood properties, eucalypt tree plantations have emerged as key renewable feedstocks (over 20 million ha globally) for the production of pulp, paper, bioenergy, and other lignocellulosic products. However, most biomass properties such as growth, wood density, and wood chemistry are complex traits that are hard to improve in long-lived perennials. Systems genetics, a process of harnessing multiple levels of component trait information (e.g., transcript, protein, and metabolite variation) in populations that vary in complex traits, has proven effective for dissecting the genetics and biology of such traits. We have applied a network-based data integration (NBDI) method for a systems-level analysis of genes, processes and pathways underlying biomass and bioenergy-related traits using a segregating Eucalyptus hybrid population. We show that the integrative approach can link biologically meaningful sets of genes to complex traits and at the same time reveal the molecular basis of trait variation. Gene sets identified for related woody biomass traits were found to share regulatory loci, cluster in network neighborhoods, and exhibit enrichment for molecular functions such as xylan metabolism and cell wall development. These findings offer a framework for identifying the molecular underpinnings of complex biomass and bioprocessing-related traits. A more thorough understanding of the molecular basis of plant biomass traits should provide additional opportunities for the establishment of a sustainable bio-based economy.
Building Background Knowledge through Reading: Rethinking Text Sets
ERIC Educational Resources Information Center
Lupo, Sarah M.; Strong, John Z.; Lewis, William; Walpole, Sharon; McKenna, Michael C.
2018-01-01
To increase reading volume and help students access challenging texts, the authors propose a four-dimensional framework for text sets. The quad text set framework is designed around a target text: a challenging content area text, such as a canonical literary work, research article, or historical primary source document. The three remaining…
ERIC Educational Resources Information Center
Anwar-McHenry, Julia; Donovan, Robert John; Nicholas, Amberlee; Kerrigan, Simone; Francas, Stephanie; Phan, Tina
2016-01-01
Purpose: Mentally Healthy WA developed and implemented the Mentally Healthy Schools Framework in 2010 in response to demand from schools wanting to promote the community-based Act-Belong-Commit mental health promotion message within a school setting. Schools are an important setting for mental health promotion; therefore, the Framework encourages…
NASA Technical Reports Server (NTRS)
Mackenzie, Anne I.; Baginski, Michael E.; Rao, Sadasiva M.
2008-01-01
In this work, we present an alternate set of basis functions, each defined over a pair of planar triangular patches, for the method of moments solution of electromagnetic scattering and radiation problems associated with arbitrarily-shaped, closed, conducting surfaces. The present basis functions are point-wise orthogonal to the pulse basis functions previously defined. The prime motivation to develop the present set of basis functions is to utilize them for the electromagnetic solution of dielectric bodies using a surface integral equation formulation which involves both electric and magnetic currents. However, in the present work, only the conducting body solution is presented and compared with other data.
Priority setting: what constitutes success? A conceptual framework for successful priority setting.
Sibbald, Shannon L; Singer, Peter A; Upshur, Ross; Martin, Douglas K
2009-03-05
The sustainability of healthcare systems worldwide is threatened by a growing demand for services and expensive innovative technologies. Decision makers struggle in this environment to set priorities appropriately, particularly because they lack consensus about which values should guide their decisions. One way to approach this problem is to determine what all relevant stakeholders understand successful priority setting to mean. The goal of this research was to develop a conceptual framework for successful priority setting. Three separate empirical studies were completed using qualitative data collection methods (one-on-one interviews with healthcare decision makers from across Canada; focus groups with representation of patients, caregivers and policy makers; and a Delphi study including scholars and decision makers from five countries). This paper synthesizes the findings from three studies into a framework of ten separate but interconnected elements germane to successful priority setting: stakeholder understanding, shifted priorities/reallocation of resources, decision making quality, stakeholder acceptance and satisfaction, positive externalities, stakeholder engagement, use of explicit process, information management, consideration of values and context, and revision or appeals mechanism. The ten elements specify both quantitative and qualitative dimensions of priority setting and relate to both process and outcome components. To our knowledge, this is the first framework that describes successful priority setting. The ten elements identified in this research provide guidance for decision makers and a common language to discuss priority setting success and work toward improving priority setting efforts.
Using Recommendations in Evaluation: A Decision-Making Framework for Evaluators
ERIC Educational Resources Information Center
Iriti, Jennifer E.; Bickel, William E.; Nelson, Catherine Awsumb
2005-01-01
Is it appropriate and useful for evaluators to use findings to make recommendations? If so, under what circumstances? How specific should they be? This article presents a decision-making framework for the appropriateness of recommendations in varying contexts. On the basis of reviews of evaluation theory, selected evaluation reports, and feedback…
Support for Mobile Collaborative Learning Applications
ERIC Educational Resources Information Center
Martin, Sergio; Boticki, Ivica; Jacobs, George; Castro, Manuel; Peire, Juan
2010-01-01
This work is intended to describe a framework aimed to address the challenges in the development of mobile Collaborative Learning applications. Firstly, the paper offers an overview of some of the main principles of Collaborative Learning that will be the basis of the framework, which is based on three main pillars: collaboration and communication…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beckingsal, David; Gamblin, Todd
Modern performance portability frameworks provide application developers with a flexible way to determine how to run application kernels; however, they provide no guidance as to the best configuration for a given kernel. Apollo provides a model-generation framework that, when integrated with the RAJA library, uses lightweight decision tree models to select the fastest execution configuration on a per-kernel basis.
ERIC Educational Resources Information Center
Wells, John G.
2016-01-01
The PIRPOSAL model is both a conceptual and pedagogical framework intended for use as a pragmatic guide to classroom implementation of Integrative STEM Education. Designerly questioning prompted by a "need to know" serves as the basis for transitioning student designers within and among multiple phases while they progress toward an…
Self, System, Synergy: A Career-Life Development Framework for Individuals and Organizations.
ERIC Educational Resources Information Center
Gelatt, H. B.
1998-01-01
The Self-System-Synergy model provides the philosophical framework for the concept of career resiliency, which has become the basis for many organizational initiatives. The three elements are self-reliance (the power of personal beliefs), interdependence (the connectedness of multiple systems), and self-renewal through continuous learning. (JOW)
Conducting Research with the Disability Community: A Rights-Based Approach
ERIC Educational Resources Information Center
Munger, Kelly M.; Mertens, Donna M.
2011-01-01
This article explores philosophical and theoretical frameworks that are useful for the conduct of research with people with disabilities. It then uses these frameworks as a basis for discussion of research practices, with a specific focus on differences that occur because of specific impairments and various cultural meanings of disability. The…
ERIC Educational Resources Information Center
Talbot, Kristen; Hug, Barbara
2013-01-01
Teachers often ask: How can I engage my students in the study of "real" science? The answer can be found in the National Research Council's "A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas" (NRC 2012). This framework calls for a new approach to science education and is the basis for…
NASA Astrophysics Data System (ADS)
Goh, K. L.; Liew, S. C.; Hasegawa, B. H.
1997-12-01
Computer simulation results from our previous studies showed that energy-dependent systematic errors exist in the values of attenuation coefficient synthesized using the basis material decomposition technique with acrylic and aluminum as the basis materials, especially when a high atomic number element (e.g., iodine from radiographic contrast media) was present in the body. The errors were reduced when a basis set was chosen from materials mimicking those found in the phantom. In the present study, we employed a basis material coefficients transformation method to correct for the energy-dependent systematic errors. In this method, the basis material coefficients were first reconstructed using the conventional basis materials (acrylic and aluminum) as the calibration basis set. The coefficients were then numerically transformed to those for a more desirable set of materials. The transformation was done at the energies of the low- and high-energy windows of the X-ray spectrum. With this correction method using acrylic and an iodine-water mixture as our desired basis set, computer simulation results showed that accuracy of better than 2% could be achieved even when iodine was present in the body at a concentration as high as 10% by mass. Simulation work was also carried out on a more inhomogeneous 2D thorax phantom of the 3D MCAT phantom. The results on the accuracy of quantitation are presented here.
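One way to write such a coefficient transformation, in generic notation (the study's exact formulation may differ): if the synthesized attenuation in the calibration basis is a linear combination of the acrylic and aluminum attenuation functions, the coefficients (b1, b2) in a desired basis with attenuation functions μ̃1, μ̃2 follow from matching the synthesized μ at the two window energies E_L and E_H:

```latex
\mu(E) = a_{1}\,\mu_{\mathrm{acr}}(E) + a_{2}\,\mu_{\mathrm{Al}}(E),
\qquad
\begin{pmatrix} b_{1} \\ b_{2} \end{pmatrix}
=
\begin{pmatrix}
\tilde{\mu}_{1}(E_{L}) & \tilde{\mu}_{2}(E_{L}) \\
\tilde{\mu}_{1}(E_{H}) & \tilde{\mu}_{2}(E_{H})
\end{pmatrix}^{-1}
\begin{pmatrix} \mu(E_{L}) \\ \mu(E_{H}) \end{pmatrix}
```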
Brandenburg, Jan Gerit; Grimme, Stefan
2014-01-01
We present and evaluate dispersion corrected Hartree-Fock (HF) and Density Functional Theory (DFT) based quantum chemical methods for organic crystal structure prediction. The necessity of correcting for missing long-range electron correlation, also known as van der Waals (vdW) interaction, is pointed out and some methodological issues such as inclusion of three-body dispersion terms are discussed. One of the most efficient and widely used methods is the semi-classical dispersion correction D3. Its applicability for the calculation of sublimation energies is investigated for the benchmark set X23 consisting of 23 small organic crystals. For PBE-D3 the mean absolute deviation (MAD) is below the estimated experimental uncertainty of 1.3 kcal/mol. For two larger π-systems, the equilibrium crystal geometry is investigated and very good agreement with experimental data is found. Since these calculations are carried out with huge plane-wave basis sets, they are rather time-consuming and routinely applicable only to systems with fewer than about 200 atoms in the unit cell. Aiming at crystal structure prediction, which involves screening of many structures, a pre-sorting with faster methods is mandatory. Small, atom-centered basis sets can speed up the computation significantly but they suffer greatly from basis set errors. We present the recently developed geometrical counterpoise correction gCP. It is a fast semi-empirical method which corrects for most of the inter- and intramolecular basis set superposition error. For HF calculations with nearly minimal basis sets, we additionally correct for short-range basis incompleteness. We combine all three terms in the scheme denoted HF-3c, which performs very well for the X23 sublimation energies with an MAD of only 1.5 kcal/mol, which is close to the huge basis set DFT-D3 result.
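For context, the two-body part of the D3 correction mentioned above has the standard published form, with scaling factors s_n fitted per functional and a damping function f_damp,n:

```latex
E_{\mathrm{disp}}^{\mathrm{D3}} =
-\frac{1}{2}\sum_{A\neq B}\;\sum_{n=6,8}
s_{n}\,\frac{C_{n}^{AB}}{r_{AB}^{\,n}}\,f_{\mathrm{damp},n}\!\left(r_{AB}\right)
```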
Point Set Denoising Using Bootstrap-Based Radial Basis Function.
Liew, Khang Jie; Ramli, Ahmad; Abd Majid, Ahmad
2016-01-01
This paper examines the application of a bootstrap test error estimation of radial basis functions, specifically thin-plate spline fitting, in surface smoothing. The presence of noisy data is a common issue of the point set model that is generated from 3D scanning devices, and hence, point set denoising is one of the main concerns in point set modelling. Bootstrap test error estimation, which is applied when searching for the smoothing parameters of radial basis functions, is revisited. The main contribution of this paper is a smoothing algorithm that relies on a bootstrap-based radial basis function. The proposed method incorporates a k-nearest neighbour search and then projects the point set to the approximated thin-plate spline surface. Therefore, the denoising process is achieved, and the features are well preserved. A comparison of the proposed method with other smoothing methods is also carried out in this study.
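A minimal sketch of the thin-plate-spline smoothing step is given below. It fits one global surface for brevity, whereas the paper works on k-nearest-neighbour patches, and the smoothing parameter here is fixed rather than chosen by the bootstrap error estimate; the data are synthetic.

```python
import numpy as np
from scipy.interpolate import Rbf

# Noisy height-field point set (synthetic, for illustration only).
rng = np.random.default_rng(0)
x, y = rng.uniform(-1, 1, (2, 200))
z = np.sin(np.pi * x) * np.cos(np.pi * y) + rng.normal(0, 0.05, 200)

# Thin-plate-spline RBF with a fixed smoothing parameter; in the paper this
# parameter would be selected by bootstrap test error estimation.
tps = Rbf(x, y, z, function='thin_plate', smooth=1e-2)

# "Denoising" = projecting each point onto the fitted smooth surface.
z_denoised = tps(x, y)
print(float(np.std(z - z_denoised)))   # magnitude of the removed noise
```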
Highway extraction from high resolution aerial photography using a geometric active contour model
NASA Astrophysics Data System (ADS)
Niu, Xutong
Highway extraction and vehicle detection are two of the most important steps in traffic-flow analysis from multi-frame aerial photographs. The traditional method of deriving traffic flow trajectories relies on manual vehicle counting from a sequence of aerial photographs, which is tedious and time-consuming. This research presents a new framework for semi-automatic highway extraction. The basis of the new framework is an improved geometric active contour (GAC) model. This novel model seeks to minimize an objective function that transforms a problem of propagation of regular curves into an optimization problem. The implementation of curve propagation is based on level set theory. By using an implicit representation of a two-dimensional curve, a level set approach can be used to deal with topological changes naturally, and the output is unaffected by different initial positions of the curve. However, the original GAC model, on which the new model is based, only incorporates boundary information into the curve propagation process. An error-producing phenomenon called leakage is inevitable wherever there is an uncertain weak edge. In this research, region-based information is added as a constraint into the original GAC model, thereby giving this proposed method the ability to integrate both boundary and region-based information during the curve propagation. Adding the region-based constraint eliminates the leakage problem. This dissertation applies the proposed augmented GAC model to the problem of highway extraction from high-resolution aerial photography. First, an optimized stopping criterion is designed and used in the implementation of the GAC model. It effectively saves processing time and computations. Second, a seed point propagation framework is designed and implemented. This framework incorporates highway extraction, tracking, and linking into one procedure. A seed point is usually placed at an end node of highway segments close to the boundary of the image or at a position where possible blocking may occur, such as at an overpass bridge or near vehicle crowds. These seed points can be automatically propagated throughout the entire highway network. During the process, road center points are also extracted, which introduces a search direction for solving possible blocking problems. This new framework has been successfully applied to highway network extraction from a large orthophoto mosaic. In the process, vehicles on the highway extracted from the mosaic were detected with an 83% success rate.
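In level-set form, a geometric active contour augmented with a region-based constraint can be written as follows (a generic formulation; the dissertation's exact term may differ), where g(I) is an edge-stopping function, R(I) a region indicator, and λ the constraint weight:

```latex
\frac{\partial\phi}{\partial t} =
g(I)\,|\nabla\phi|\left(\operatorname{div}\frac{\nabla\phi}{|\nabla\phi|}+\nu\right)
+\nabla g\cdot\nabla\phi
+\lambda\,R(I)\,|\nabla\phi|
```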
NASA Astrophysics Data System (ADS)
van Hoeve, Miriam D.; Klobukowski, Mariusz
2018-03-01
Simulation of the electronic spectra of HRgF (Rg = Ar, Kr, Xe, Rn) was carried out using the time-dependent density functional method, with the CAM-B3LYP functional and several basis sets augmented with even-tempered diffuse functions. A full spectral assignment for the HRgF systems was performed. The effect of the rare gas matrix on the HRgF (Rg = Ar and Kr) spectra was investigated and it was found that the matrix blue-shifted the spectra. Scalar relativistic effects on the spectra were also studied and it was found that while the excitation energies of HArF and HKrF were insignificantly affected by relativistic effects, most of the excitation energies of HXeF and HRnF were red-shifted. Spin-orbit coupling was found to significantly affect excitation energies in HRnF. Analysis of the performance of the model core potential basis set relative to all-electron (AE) basis sets showed that the former increased computational efficiency and gave results similar to those obtained with the AE basis sets.
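Even-tempered augmentation, as used above, extends a basis with exponents in geometric progression; a minimal sketch (the starting exponent and ratio are illustrative, not taken from the paper):

```python
# Even-tempered Gaussian exponents: alpha_k = alpha * beta**k.
def even_tempered(alpha, beta, n):
    """Generate n exponents in geometric progression from alpha with ratio beta."""
    return [alpha * beta**k for k in range(n)]

# Example: three diffuse exponents extending a base exponent of 0.05
# with ratio 1/2.5 (illustrative values).
print(even_tempered(0.05, 1 / 2.5, 3))   # ~[0.05, 0.02, 0.008]
```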
Midbond basis functions for weakly bound complexes
NASA Astrophysics Data System (ADS)
Shaw, Robert A.; Hill, J. Grant
2018-06-01
Weakly bound systems present a difficult problem for conventional atom-centred basis sets due to large separations, necessitating the use of large, computationally expensive bases. This can be remedied by placing a small number of functions in the region between molecules in the complex. We present compact sets of optimised midbond functions for a range of complexes involving noble gases, alkali metals and small molecules for use in high-accuracy coupled-cluster calculations, along with a more robust procedure for their optimisation. It is shown that excellent results are possible with double-zeta quality orbital basis sets when a few midbond functions are added, improving both the interaction energy and the equilibrium bond lengths of a series of noble gas dimers by 47% and 8%, respectively. When used in conjunction with explicitly correlated methods, near complete basis set limit accuracy is readily achievable at a fraction of the cost that using a large basis would entail. General purpose auxiliary sets are developed to allow explicitly correlated midbond function studies to be carried out, making it feasible to perform very high accuracy calculations on weakly bound complexes.
A modular computational framework for automated peak extraction from ion mobility spectra
2014-01-01
Background: An ion mobility (IM) spectrometer coupled with a multi-capillary column (MCC) measures volatile organic compounds (VOCs) in the air or in exhaled breath. This technique is utilized in several biotechnological and medical applications. Each peak in an MCC/IM measurement represents a certain compound, which may be known or unknown. For clustering and classification of measurements, the raw data matrix must be reduced to a set of peaks. Each peak is described by its coordinates (retention time in the MCC and reduced inverse ion mobility) and shape (signal intensity, further shape parameters). This fundamental step is referred to as peak extraction. It is the basis for identifying discriminating peaks, and hence putative biomarkers, between two classes of measurements, such as a healthy control group and a group of patients with a confirmed disease. Current state-of-the-art peak extraction methods require human interaction, such as hand-picking approximate peak locations, assisted by a visualization of the data matrix. In a high-throughput context, however, it is preferable to have robust methods for fully automated peak extraction. Results: We introduce PEAX, a modular framework for automated peak extraction. The framework consists of several steps in a pipeline architecture. Each step performs a specific sub-task and can be instantiated by different methods implemented as modules. We provide open-source software for the framework and several modules for each step. Additionally, an interface that allows easy extension by a new module is provided. Combining the modules in all reasonable ways leads to a large number of peak extraction methods. We evaluate all combinations using intrinsic error measures and by comparing the resulting peak sets with an expert-picked one. Conclusions: Our software PEAX is able to automatically extract peaks from MCC/IM measurements within a few seconds. The automatically obtained results keep up with the results provided by current state-of-the-art peak extraction methods. This opens a high-throughput context for the MCC/IM application field. Our software is available at http://www.rahmannlab.de/research/ims. PMID:24450533
A modular computational framework for automated peak extraction from ion mobility spectra.
D'Addario, Marianna; Kopczynski, Dominik; Baumbach, Jörg Ingo; Rahmann, Sven
2014-01-22
An ion mobility (IM) spectrometer coupled with a multi-capillary column (MCC) measures volatile organic compounds (VOCs) in the air or in exhaled breath. This technique is utilized in several biotechnological and medical applications. Each peak in an MCC/IM measurement represents a certain compound, which may be known or unknown. For clustering and classification of measurements, the raw data matrix must be reduced to a set of peaks. Each peak is described by its coordinates (retention time in the MCC and reduced inverse ion mobility) and shape (signal intensity, further shape parameters). This fundamental step is referred to as peak extraction. It is the basis for identifying discriminating peaks, and hence putative biomarkers, between two classes of measurements, such as a healthy control group and a group of patients with a confirmed disease. Current state-of-the-art peak extraction methods require human interaction, such as hand-picking approximate peak locations, assisted by a visualization of the data matrix. In a high-throughput context, however, it is preferable to have robust methods for fully automated peak extraction. We introduce PEAX, a modular framework for automated peak extraction. The framework consists of several steps in a pipeline architecture. Each step performs a specific sub-task and can be instantiated by different methods implemented as modules. We provide open-source software for the framework and several modules for each step. Additionally, an interface that allows easy extension by a new module is provided. Combining the modules in all reasonable ways leads to a large number of peak extraction methods. We evaluate all combinations using intrinsic error measures and by comparing the resulting peak sets with an expert-picked one. Our software PEAX is able to automatically extract peaks from MCC/IM measurements within a few seconds. The automatically obtained results keep up with the results provided by current state-of-the-art peak extraction methods. This opens a high-throughput context for the MCC/IM application field. Our software is available at http://www.rahmannlab.de/research/ims.
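A minimal sketch of the pipeline-of-pluggable-modules idea described above; the stage and module names, and the toy data, are illustrative assumptions rather than the actual PEAX modules:

```python
from itertools import product

# Each pipeline stage is a plain callable; alternative modules per stage can
# be combined freely, yielding the family of candidate pipelines.
def smooth_noop(m):
    return m

def smooth_scale(m):
    return [[0.5 * v for v in row] for row in m]

def pick_threshold(m):
    return [(i, j) for i, row in enumerate(m)
            for j, v in enumerate(row) if v > 1.0]

def pick_argmax(m):
    best = max((v, i, j) for i, row in enumerate(m) for j, v in enumerate(row))
    return [(best[1], best[2])]

# Cartesian product of module choices = all pipeline variants to evaluate.
PIPELINES = list(product([smooth_noop, smooth_scale],
                         [pick_threshold, pick_argmax]))

measurement = [[0.2, 2.4], [1.6, 0.1]]   # toy stand-in for an MCC/IM matrix
for smooth, pick in PIPELINES:
    print(smooth.__name__, pick.__name__, pick(smooth(measurement)))
```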
2012-01-01
Background Mobile phone technology has demonstrated the potential to improve health service delivery, but there is little guidance to inform decisions about acquiring and implementing mHealth technology at scale in health systems. Using the case of community-based health services (CBS) in South Africa, we apply a framework to appraise the opportunities and challenges to effective implementation of mHealth at scale in health systems. Methods A qualitative study reviewed the benefits and challenges of mHealth in community-based services in South Africa, through a combination of key informant interviews, site visits to local projects and document reviews. Using a framework adapted from three approaches to reviewing sustainable information and communication technology (ICT), the lessons from local experience and elsewhere formed the basis of a wider consideration of scale up challenges in South Africa. Results Four key system dimensions were identified and assessed: government stewardship and the organisational, technological and financial systems. In South Africa, the opportunities for successful implementation of mHealth include the high prevalence of mobile phones, a supportive policy environment for eHealth, successful use of mHealth for CBS in a number of projects and a well-developed ICT industry. However there are weaknesses in other key health systems areas such as organisational culture and capacity for using health information for management, and the poor availability and use of ICT in primary health care. The technological challenges include the complexity of ensuring interoperability and integration of information systems and securing privacy of information. Finally, there are the challenges of sustainable financing required for large scale use of mobile phone technology in resource limited settings. Conclusion Against a background of a health system with a weak ICT environment and limited implementation capacity, it remains uncertain that the potential benefits of mHealth for CBS would be retained with immediate large-scale implementation. Applying a health systems framework facilitated a systematic appraisal of potential challenges to scaling up mHealth for CBS in South Africa and may be useful for policy and practice decision-making in other low- and middle-income settings. PMID:23126370
Automated Design Framework for Synthetic Biology Exploiting Pareto Optimality.
Otero-Muras, Irene; Banga, Julio R
2017-07-21
In this work we consider Pareto optimality for automated design in synthetic biology. We present a generalized framework based on a mixed-integer dynamic optimization formulation that, given design specifications, allows the computation of Pareto optimal sets of designs, that is, the set of best trade-offs for the metrics of interest. We show how this framework can be used for (i) forward design, that is, finding the Pareto optimal set of synthetic designs for implementation, and (ii) reverse design, that is, analyzing and inferring motifs and/or design principles of gene regulatory networks from the Pareto set of optimal circuits. Finally, we illustrate the capabilities and performance of this framework considering four case studies. In the first problem we consider the forward design of an oscillator. In the remaining problems, we illustrate how to apply the reverse design approach to find motifs for stripe formation, rapid adaptation, and fold-change detection, respectively.
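The core of any such framework is a non-dominance test over candidate designs; a minimal sketch for a minimization problem (the metric values are toy numbers, not from the case studies):

```python
import numpy as np

def pareto_front(points):
    """Return the non-dominated rows of `points` (all metrics minimized)."""
    pts = np.asarray(points, dtype=float)
    front = []
    for p in pts:
        # p is dominated if some q is <= p in every metric and < p in at least one.
        dominated = np.any(np.all(pts <= p, axis=1) & np.any(pts < p, axis=1))
        if not dominated:
            front.append(p)
    return np.array(front)

# Toy trade-off between two cost-like metrics (illustrative values only).
designs = [(1.0, 5.0), (2.0, 3.0), (3.0, 4.0), (4.0, 1.0)]
print(pareto_front(designs))   # (3.0, 4.0) drops out: dominated by (2.0, 3.0)
```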
Wu, Alex Chi; Morell, Matthew K.; Gilbert, Robert G.
2013-01-01
A core set of genes involved in starch synthesis has been defined by genetic studies, but the complexity of starch biosynthesis has frustrated attempts to elucidate the precise functional roles of the enzymes encoded. The chain-length distribution (CLD) of amylopectin in cereal endosperm is modeled here on the basis that the CLD is produced by concerted actions of three enzyme types: starch synthases, branching and debranching enzymes, including their respective isoforms. The model, together with fitting to experiment, provides four key insights. (1) To generate crystalline starch, defined restrictions on particular ratios of enzymatic activities apply. (2) It provides independent confirmation of the conclusion, previously reached solely from genetic studies, that debranching enzyme is absolutely required for crystalline amylopectin synthesis. (3) The model provides a mechanistic basis for understanding how successive arrays of crystalline lamellae are formed, based on the identification of two independent types of long amylopectin chains, one type remaining in the amorphous lamella, while the other propagates into, and is integral to the formation of, an adjacent crystalline lamella. (4) The model provides a means by which a small number of key parameters defining the core enzymatic activities can be derived from the amylopectin CLD, providing the basis for focusing studies on the enzymatic requirements for generating starches of a particular structure. The modeling approach provides both a new tool to accelerate efforts to understand granular starch biosynthesis and a basis for focusing efforts to manipulate starch structure and functionality using a series of testable predictions based on a robust mechanistic framework. PMID:23762422
The search for the genetic basis of hypertension.
Yagil, Yoram; Yagil, Chana
2005-03-01
This review surveys the literature on the search for the genetic basis of hypertension during the 10 months since November 2003. The goals set forth by this search are defined and the highlights of the work accomplished are provided. The search for the genetic basis of hypertension is ongoing, generating an abundance of new data. These data consist of a large number of candidate genes, association of previously known and novel candidate genes with various facets of hypertension, detection of new quantitative trait loci and identification of genes that mediate susceptibility to hypertension. The renin-angiotensin-aldosterone system continues to dominate the interest of investigators. Other gene systems are also emerging, but no single-gene system can be singled out beyond the renin-angiotensin-aldosterone system, and the data are mostly sporadic and do not reflect a guided or coordinated effort to resolve unanswered issues. The notion that hypertension is polygenic is reinforced, yet few data are provided as to the actual number of genes involved, gene-gene interaction or gene-environment interaction. Advanced biotechnological tools involving transcriptomics and proteomics are underused. Research on the genetic basis of hypertension has generated over the past year a large number of candidate genes and tied them to various aspects of hypertension. How these genes fit into the complex pathophysiological network that induces hypertension remains unclear. The task of putting together these genes into a cohesive framework still lies ahead, but promises to enlighten us as to the true nature of hypertension, the pathogenic mechanisms involved and improved therapeutic and preventive measures.
A Framework and Toolkit for the Construction of Multimodal Learning Interfaces
1998-04-29
human communication modalities in the context of a broad class of applications, specifically those that support state manipulation via parameterized actions. The multimodal semantic model is also the basis for a flexible, domain independent, incrementally trainable multimodal interpretation algorithm based on a connectionist network. The second major contribution is an application framework consisting of reusable components and a modular, distributed system architecture. Multimodal application developers can assemble the components in the framework into a new application,
Pigolkin, Iu I; Murzova, T V; Mirzoev, Kh M
2011-01-01
The authors discuss the specific features of performing forensic medical expert assessments in cases of unfavourable outcomes of dental treatment. A methodological basis for such expert assessment has been developed for application in situations involving unfavourable outcomes of dental care.
Varandas, A J C
2009-02-01
The potential energy surface for the C(20)-He interaction is extrapolated for three representative cuts to the complete basis set limit using second-order Møller-Plesset perturbation calculations with correlation consistent basis sets up to the doubly augmented variety. The results both with and without counterpoise correction show consistency with each other, supporting that extrapolation without such a correction provides a reliable scheme to avoid the basis-set superposition error. Converged attributes are obtained for the C(20)-He interaction, which are then used to predict those of the fullerene dimer. Time requirements show that the method can be drastically more economical than the counterpoise procedure and even competitive with Kohn-Sham density functional theory for the title system.
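For reference, the widely used two-point form of complete-basis-set extrapolation with consecutive cardinal numbers X < Y illustrates the idea (the paper's specific extrapolation protocol may differ in detail):

```latex
E^{\mathrm{corr}}_{X} \approx E^{\mathrm{corr}}_{\mathrm{CBS}} + \frac{A}{X^{3}}
\quad\Longrightarrow\quad
E^{\mathrm{corr}}_{\mathrm{CBS}} \approx
\frac{X^{3}E^{\mathrm{corr}}_{X} - Y^{3}E^{\mathrm{corr}}_{Y}}{X^{3}-Y^{3}}
```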
Exact exchange-correlation potentials of singlet two-electron systems
NASA Astrophysics Data System (ADS)
Ryabinkin, Ilya G.; Ospadov, Egor; Staroverov, Viktor N.
2017-10-01
We suggest a non-iterative analytic method for constructing the exchange-correlation potential, v_XC(r), of any singlet ground-state two-electron system. The method is based on a convenient formula for v_XC(r) in terms of quantities determined only by the system's electronic wave function, exact or approximate, and is essentially different from the Kohn-Sham inversion technique. When applied to Gaussian-basis-set wave functions, the method yields finite-basis-set approximations to the corresponding basis-set-limit v_XC(r), whereas the Kohn-Sham inversion produces physically inappropriate (oscillatory and divergent) potentials. The effectiveness of the procedure is demonstrated by computing accurate exchange-correlation potentials of several two-electron systems (helium isoelectronic series, H2, H3+) using common ab initio methods and Gaussian basis sets.
Dumez, Birgit; Van Damme, Karel; Casteleyn, Ludwine
2008-06-05
Assessment of ethical aspects and authorization by ethics committees have become a major constraint for health research including human subjects. Ethical reference values often are extrapolated from clinical settings, where emphasis lies on decisional autonomy and protection of individual's privacy. The question arises whether this set of values used in clinical research can be considered as relevant references for HBM research, which is at the basis of public health surveillance. Current and future research activities using human biomarkers are facing new challenges and expectations on sensitive socio-ethical issues. Reflection is needed on the necessity to balance individual rights against public interest. In addition, many HBM research programs require international collaboration. Domestic legislation is not always easily applicable in international projects. Also, there seem to be considerable inconsistencies in ethical assessments of similar research activities between different countries and even within one country. All this is causing delay and putting the researcher in situations in which it is unclear how to act in accordance with necessary legal requirements. Therefore, analysis of ethical practices and their consequences for HBM research is needed. This analysis will be performed by a bottom-up approach, based on a methodology for comparative analysis of determinants in ethical reasoning, allowing different social, cultural, political and historical traditions to be taken into account, in view of safeguarding common EU values. Based on information collected in real life complexity, paradigm cases and virtual case scenarios will be developed and discussed with relevant stakeholders to openly discuss possible obstacles and to identify options for improvement in regulation. The material collected will allow developing an ethical framework which may constitute the basis for a more harmonized and consistent socio-ethical and legal approach. This will not only increase the possibilities for comparison between data generated but may also allow for more equality in the protection of the rights of European citizens and establish trustful relationships between science and society, based on firmly rooted ethical values within the EU legislative framework. These considerations outline part of the research on legal, socio-ethical and communication aspects of HBM within the scope of ECNIS (NoE) and NewGeneris (IP).
Dumez, Birgit; Van Damme, Karel; Casteleyn, Ludwine
2008-01-01
Assessment of ethical aspects and authorization by ethics committees have become a major constraint for health research including human subjects. Ethical reference values often are extrapolated from clinical settings, where emphasis lies on decisional autonomy and protection of individual's privacy. The question arises whether this set of values used in clinical research can be considered as relevant references for HBM research, which is at the basis of public health surveillance. Current and future research activities using human biomarkers are facing new challenges and expectations on sensitive socio-ethical issues. Reflection is needed on the necessity to balance individual rights against public interest. In addition, many HBM research programs require international collaboration. Domestic legislation is not always easily applicable in international projects. Also, there seem to be considerable inconsistencies in ethical assessments of similar research activities between different countries and even within one country. All this is causing delay and putting the researcher in situations in which it is unclear how to act in accordance with necessary legal requirements. Therefore, analysis of ethical practices and their consequences for HBM research is needed. This analysis will be performed by a bottom-up approach, based on a methodology for comparative analysis of determinants in ethical reasoning, allowing different social, cultural, political and historical traditions to be taken into account, in view of safeguarding common EU values. Based on information collected in real life complexity, paradigm cases and virtual case scenarios will be developed and discussed with relevant stakeholders to openly discuss possible obstacles and to identify options for improvement in regulation. The material collected will allow developing an ethical framework which may constitute the basis for a more harmonized and consistent socio-ethical and legal approach. This will not only increase the possibilities for comparison between data generated but may also allow for more equality in the protection of the rights of European citizens and establish trustful relationships between science and society, based on firmly rooted ethical values within the EU legislative framework. These considerations outline part of the research on legal, socio-ethical and communication aspects of HBM within the scope of ECNIS (NoE) and NewGeneris (IP). PMID:18541073
'CHEATS': a generic information communication technology (ICT) evaluation framework.
Shaw, Nicola T
2002-05-01
This paper describes a generic framework for the evaluation of information communication technologies. This framework, CHEATS, utilises both qualitative and quantitative research methods and has proved appropriate in multiple clinical settings including telepsychiatry, teledermatology and teleeducation. The paper demonstrates how a multidisciplinary approach is essential when evaluating new and emerging technologies, particularly when such systems are implemented in real service as opposed to a research setting.
Developing a pressure ulcer risk factor minimum data set and risk assessment framework.
Coleman, Susanne; Nelson, E Andrea; Keen, Justin; Wilson, Lyn; McGinnis, Elizabeth; Dealey, Carol; Stubbs, Nikki; Muir, Delia; Farrin, Amanda; Dowding, Dawn; Schols, Jos M G A; Cuddigan, Janet; Berlowitz, Dan; Jude, Edward; Vowden, Peter; Bader, Dan L; Gefen, Amit; Oomens, Cees W J; Schoonhoven, Lisette; Nixon, Jane
2014-10-01
To agree a draft pressure ulcer risk factor Minimum Data Set to underpin the development of a new evidence-based Risk Assessment Framework. A recent systematic review identified the need for a pressure ulcer risk factor Minimum Data Set and development and validation of an evidence-based pressure ulcer Risk Assessment Framework. This was undertaken through the Pressure UlceR Programme Of reSEarch (RP-PG-0407-10056), funded by the National Institute for Health Research and incorporates five phases. This article reports phase two, a consensus study, which used a modified nominal group technique based on the Research and Development/University of California at Los Angeles appropriateness method. This incorporated an expert group, review of the evidence and the views of a Patient and Public Involvement service user group. Data were collected December 2010-December 2011. The risk factors and assessment items of the Minimum Data Set (including immobility, pressure ulcer and skin status, perfusion, diabetes, skin moisture, sensory perception and nutrition) were agreed. In addition, a draft Risk Assessment Framework incorporating all Minimum Data Set items was developed, comprising a two stage assessment process (screening and detailed full assessment) and decision pathways. The draft Risk Assessment Framework will undergo further design and pre-testing with clinical nurses to assess and improve its usability. It will then be evaluated in clinical practice to assess its validity and reliability. The Minimum Data Set could be used in future for large scale risk factor studies informing refinement of the Risk Assessment Framework. © 2014 The Authors. Journal of Advanced Nursing Published by John Wiley & Sons Ltd.
The LSST metrics analysis framework (MAF)
NASA Astrophysics Data System (ADS)
Jones, R. L.; Yoachim, Peter; Chandrasekharan, Srinivasan; Connolly, Andrew J.; Cook, Kem H.; Ivezic, Željko; Krughoff, K. S.; Petry, Catherine; Ridgway, Stephen T.
2014-07-01
We describe the Metrics Analysis Framework (MAF), an open-source python framework developed to provide a user-friendly, customizable, easily extensible set of tools for analyzing data sets. MAF is part of the Large Synoptic Survey Telescope (LSST) Simulations effort. Its initial goal is to provide a tool to evaluate LSST Operations Simulation (OpSim) simulated surveys to help understand the effects of telescope scheduling on survey performance; however, MAF can be applied to a much wider range of datasets. The building blocks of the framework are Metrics (algorithms to analyze a given quantity of data), Slicers (subdividing the overall data set into smaller data slices as relevant for each Metric), and Database classes (to access the dataset and read data into memory). We describe how these building blocks work together, and provide an example of using MAF to evaluate different dithering strategies. We also outline how users can write their own custom Metrics and use these within the framework.
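A toy analogue of the Metric/Slicer decomposition described above; the class and method names are illustrative stand-ins, not the actual MAF API:

```python
import numpy as np

class MeanMetric:
    """Metric: an algorithm applied to one slice of data."""
    def __call__(self, values):
        return float(np.mean(values))

class PixelSlicer:
    """Slicer: subdivide rows of a data set by an integer sky-pixel column."""
    def __init__(self, pixel_ids):
        self.pixel_ids = np.asarray(pixel_ids)
    def slices(self):
        for pix in np.unique(self.pixel_ids):
            yield pix, np.where(self.pixel_ids == pix)[0]

seeing = np.array([0.7, 0.9, 1.1, 0.8])   # per-visit quantity to analyze
pixels = np.array([0, 0, 1, 1])           # which sky pixel each visit hits
metric, slicer = MeanMetric(), PixelSlicer(pixels)
print({pix: metric(seeing[idx]) for pix, idx in slicer.slices()})
# ~{0: 0.8, 1: 0.95}
```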
Correlation consistent basis sets for actinides. I. The Th and U atoms.
Peterson, Kirk A
2015-02-21
New correlation consistent basis sets based on both pseudopotential (PP) and all-electron Douglas-Kroll-Hess (DKH) Hamiltonians have been developed from double- to quadruple-zeta quality for the actinide atoms thorium and uranium. Sets for valence electron correlation (5f6s6p6d), cc-pVnZ-PP and cc-pVnZ-DK3, as well as outer-core correlation (valence + 5s5p5d), cc-pwCVnZ-PP and cc-pwCVnZ-DK3, are reported (n = D, T, Q). The -PP sets are constructed in conjunction with small-core, 60-electron PPs, while the -DK3 sets utilized the 3rd-order Douglas-Kroll-Hess scalar relativistic Hamiltonian. Both series of basis sets show systematic convergence towards the complete basis set limit, both at the Hartree-Fock and correlated levels of theory, making them amenable to standard basis set extrapolation techniques. To assess the utility of the new basis sets, extensive coupled cluster composite thermochemistry calculations of ThFn (n = 2-4), ThO2, and UFn (n = 4-6) have been carried out. After accurately accounting for valence and outer-core correlation, spin-orbit coupling, and even Lamb shift effects, the final 298 K atomization enthalpies of ThF4, ThF3, ThF2, and ThO2 are all within their experimental uncertainties. Bond dissociation energies of ThF4 and ThF3, as well as UF6 and UF5, were similarly accurate. The derived enthalpies of formation for these species also showed a very satisfactory agreement with experiment, demonstrating that the new basis sets allow for the use of accurate composite schemes just as in molecular systems composed only of lighter atoms. The differences between the PP and DK3 approaches were found to increase with the change in formal oxidation state on the actinide atom, approaching 5-6 kcal/mol for the atomization enthalpies of ThF4 and ThO2. The DKH3 atomization energy of ThO2 was calculated to be smaller than the DKH2 value by ∼1 kcal/mol.
An efficient, scalable, and adaptable framework for solving generic systems of level-set PDEs
Mosaliganti, Kishore R.; Gelas, Arnaud; Megason, Sean G.
2013-01-01
In the last decade, level-set methods have been actively developed for applications in image registration, segmentation, tracking, and reconstruction. However, the development of a wide variety of level-set PDEs and their numerical discretization schemes, coupled with hybrid combinations of PDE terms, stopping criteria, and reinitialization strategies, has created a software logistics problem. In the absence of an integrative design, current toolkits support only specific types of level-set implementations which restrict future algorithm development since extensions require significant code duplication and effort. In the new NIH/NLM Insight Toolkit (ITK) v4 architecture, we implemented a level-set software design that is flexible to different numerical (continuous, discrete, and sparse) and grid representations (point, mesh, and image-based). Given that a generic PDE is a summation of different terms, we used a set of linked containers to which level-set terms can be added or deleted at any point in the evolution process. This container-based approach allows the user to explore and customize terms in the level-set equation at compile-time in a flexible manner. The framework is optimized so that repeated computations of common intensity functions (e.g., gradient and Hessians) across multiple terms is eliminated. The framework further enables the evolution of multiple level-sets for multi-object segmentation and processing of large datasets. For doing so, we restrict level-set domains to subsets of the image domain and use multithreading strategies to process groups of subdomains or level-set functions. Users can also select from a variety of reinitialization policies and stopping criteria. Finally, we developed a visualization framework that shows the evolution of a level-set in real-time to help guide algorithm development and parameter optimization. We demonstrate the power of our new framework using confocal microscopy images of cells in a developing zebrafish embryo. PMID:24501592
An efficient, scalable, and adaptable framework for solving generic systems of level-set PDEs.
Mosaliganti, Kishore R; Gelas, Arnaud; Megason, Sean G
2013-01-01
In the last decade, level-set methods have been actively developed for applications in image registration, segmentation, tracking, and reconstruction. However, the development of a wide variety of level-set PDEs and their numerical discretization schemes, coupled with hybrid combinations of PDE terms, stopping criteria, and reinitialization strategies, has created a software logistics problem. In the absence of an integrative design, current toolkits support only specific types of level-set implementations which restrict future algorithm development since extensions require significant code duplication and effort. In the new NIH/NLM Insight Toolkit (ITK) v4 architecture, we implemented a level-set software design that is flexible to different numerical (continuous, discrete, and sparse) and grid representations (point, mesh, and image-based). Given that a generic PDE is a summation of different terms, we used a set of linked containers to which level-set terms can be added or deleted at any point in the evolution process. This container-based approach allows the user to explore and customize terms in the level-set equation at compile-time in a flexible manner. The framework is optimized so that repeated computations of common intensity functions (e.g., gradient and Hessians) across multiple terms is eliminated. The framework further enables the evolution of multiple level-sets for multi-object segmentation and processing of large datasets. For doing so, we restrict level-set domains to subsets of the image domain and use multithreading strategies to process groups of subdomains or level-set functions. Users can also select from a variety of reinitialization policies and stopping criteria. Finally, we developed a visualization framework that shows the evolution of a level-set in real-time to help guide algorithm development and parameter optimization. We demonstrate the power of our new framework using confocal microscopy images of cells in a developing zebrafish embryo.
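A minimal sketch of the container-of-terms design described above: the level-set update is the sum of independent term objects, so terms can be added or removed without touching the solver. The term classes here are illustrative stand-ins, not ITK's implementation:

```python
import numpy as np

class CurvatureTerm:
    """Smoothing term; uses the Laplacian of phi as a crude curvature proxy."""
    def __init__(self, weight):
        self.weight = weight
    def __call__(self, phi):
        gy, gx = np.gradient(phi)
        return self.weight * (np.gradient(gx, axis=1) + np.gradient(gy, axis=0))

class PropagationTerm:
    """Constant-speed inflation/deflation term, proportional to |grad phi|."""
    def __init__(self, weight):
        self.weight = weight
    def __call__(self, phi):
        gy, gx = np.gradient(phi)
        return self.weight * np.sqrt(gx**2 + gy**2)

def evolve(phi, terms, dt=0.1, steps=10):
    """Explicit time stepping; the PDE right-hand side is the sum of terms."""
    for _ in range(steps):
        phi = phi + dt * sum(term(phi) for term in terms)
    return phi

# Signed-distance-like initialization: a circle of radius 8 in a 33x33 grid.
phi0 = np.fromfunction(lambda i, j: np.hypot(i - 16, j - 16) - 8, (33, 33))
phi = evolve(phi0, [CurvatureTerm(0.2), PropagationTerm(-1.0)])
print(phi.shape)
```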
Orbital-Dependent Density Functionals for Chemical Catalysis
2014-10-17
noncollinear density functional theory to show that the low-spin state of Mn3 in a model of the oxygen-evolving complex of photosystem II avoids...DK, which denotes the cc-pV5Z-DK basis set for 3d metals and hydrogen and the ma-cc-pV5Z-DK basis set for oxygen) and to nonrelativistic all...cc-pV5Z basis set for oxygen). As compared to NCBS-DK results, all ECP calculations perform worse than def2-TZVP all-electron relativistic
Electric dipole moment of diatomic molecules by configuration interaction. IV.
NASA Technical Reports Server (NTRS)
Green, S.
1972-01-01
The theory of basis set dependence in configuration interaction calculations is discussed, taking into account a perturbation model which is valid for small changes in the self-consistent field orbitals. It is found that basis set corrections are essentially additive through first order. It is shown that an error found in a previously published dipole moment calculation by Green (1972) for the metastable first excited state of CO was indeed due to an inadequate basis set as claimed.
Project Stakeholder Management in the Clinical Research Environment: How to Do it Right
Pandi-Perumal, Seithikurippu R.; Akhter, Sohel; Zizi, Ferdinard; Jean-Louis, Girardin; Ramasubramanian, Chellamuthu; Edward Freeman, R.; Narasimhan, Meera
2015-01-01
This review introduces a conceptual framework for understanding stakeholder management (ShM) in the clinical and community-based research environment. In recent years, an evolution in practice has occurred among many applicants for public and non-governmental funding of public health research in hospital settings. Community health research projects are inherently complex and have sought to involve patients and other stakeholders at the center of the research process. Substantial evidence has now been provided that stakeholder involvement is essential for management effectiveness in clinical research. Feedback from stakeholders has critical value for research managers inasmuch as it alerts them to the social, environmental, and ethical implications of research activities. Additionally, those who are directly affected by program development and clinical research, the patients, their families, and others, almost universally have a strong motivation to be involved in the planning and execution of new program changes. The current overview introduces a conceptual framework for ShM in the clinical research environment and offers practical suggestions for fostering meaningful stakeholder engagement. The fifth edition of the Project Management Institute's PMBOK® has served as the basis for many of the suggested guidelines that are put forward in this article. PMID:26042053
MATRIX-VBS Condensing Organic Aerosols in an Aerosol Microphysics Model
NASA Technical Reports Server (NTRS)
Gao, Chloe Y.; Tsigaridis, Konstas; Bauer, Susanne E.
2015-01-01
The condensation of organic aerosols is represented in a newly developed box-model scheme, where its effect on the growth and composition of particles is examined. We implemented the volatility-basis set (VBS) framework into the aerosol mixing state resolving microphysical scheme Multiconfiguration Aerosol TRacker of mIXing state (MATRIX). This new scheme is unique and advances the representation of organic aerosols in models in that, contrary to the traditional treatment of organic aerosols as non-volatile in most climate models and in the original version of MATRIX, this new scheme treats them as semi-volatile. Such treatment is important because low-volatility organics contribute significantly to the growth of particles. The new scheme includes several classes of semi-volatile organic compounds from the VBS framework that can partition among aerosol populations in MATRIX, thus representing the growth of particles via condensation of low volatility organic vapors. Results from test cases representing Mexico City and Finnish forest conditions show good representation of the time evolution of concentrations for VBS species in the gas phase and in the condensed particulate phase. Emitted semi-volatile primary organic aerosols evaporate almost completely in the high-volatility range, and they condense more efficiently in the low-volatility range.
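The heart of any VBS scheme is the equilibrium gas-particle split per volatility bin; a minimal sketch of the standard partitioning relation, solved by fixed-point iteration because the organic-aerosol mass depends on the partitioning itself (the bin values are illustrative, not those of MATRIX-VBS):

```python
import numpy as np

# Condensed fraction per bin: xi_i = (1 + C*_i / C_OA)^(-1), where C*_i is the
# bin's saturation concentration and C_OA the total organic-aerosol mass.
c_star = np.array([0.1, 1.0, 10.0, 100.0])   # saturation conc. (ug/m3), toy bins
c_tot  = np.array([0.5, 2.0, 3.0, 4.0])      # total (gas + particle) per bin
seed   = 1.0                                  # non-volatile seed aerosol mass

c_oa = seed
for _ in range(100):                          # fixed-point iteration to equilibrium
    xi = 1.0 / (1.0 + c_star / c_oa)
    c_oa_new = seed + np.sum(xi * c_tot)
    if abs(c_oa_new - c_oa) < 1e-10:
        break
    c_oa = c_oa_new

print(c_oa, xi)   # equilibrium OA mass and condensed fraction per bin
```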
Gordon, Abekah Nkrumah; Hinson, Robert Ebo
2007-01-01
The purpose of this paper is to argue for a theoretical framework by which development of computer based health information systems (CHIS) can be made sustainable. Health management and promotion thrive on well-articulated CHIS. There are high levels of risk associated with the development of CHIS in the context of least developed countries (LDC), thereby making them unsustainable. This paper is based largely on a literature survey on health promotion and information systems. The main factors accounting for the sustainability problem in less developed countries include inappropriate donor policies and strategies, poor infrastructure and inadequate human resource capacity. To counter these challenges and to ensure that CHIS deployment in LDCs is sustainable, it is proposed that the activities involved in the implementation of these systems be incorporated into organizational routines. This will ensure and secure the needed resources as well as the relevant support from all stakeholders of the system, on a continuous basis. This paper sets out to look at the issue of CHIS sustainability in LDCs, theoretically explains the factors that account for the sustainability problem and develops a conceptual model based on theoretical literature and existing empirical findings.
Project Stakeholder Management in the Clinical Research Environment: How to Do it Right.
Pandi-Perumal, Seithikurippu R; Akhter, Sohel; Zizi, Ferdinard; Jean-Louis, Girardin; Ramasubramanian, Chellamuthu; Edward Freeman, R; Narasimhan, Meera
2015-01-01
This review introduces a conceptual framework for understanding stakeholder management (ShM) in the clinical and community-based research environment. In recent years, an evolution in practice has occurred among many applicants for public and non-governmental funding of public health research in hospital settings. Community health research projects are inherently complex and have sought to involve patients and other stakeholders at the center of the research process. Substantial evidence has now been provided that stakeholder involvement is essential for management effectiveness in clinical research. Feedback from stakeholders has critical value for research managers inasmuch as it alerts them to the social, environmental, and ethical implications of research activities. Additionally, those who are directly affected by program development and clinical research, the patients, their families, and others, almost universally have a strong motivation to be involved in the planning and execution of new program changes. The current overview introduces a conceptual framework for ShM in the clinical research environment and offers practical suggestions for fostering meaningful stakeholder engagement. The fifth edition of the Project Management Institute's PMBOK(®) has served as the basis for many of the suggested guidelines that are put forward in this article.
Efficient iterative method for solving the Dirac-Kohn-Sham density functional theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Lin; Shao, Sihong; E, Weinan
2012-11-06
We present for the first time an efficient iterative method to directly solve the four-component Dirac-Kohn-Sham (DKS) density functional theory. Due to the existence of the negative energy continuum in the DKS operator, the existing iterative techniques for solving the Kohn-Sham systems cannot be efficiently applied to solve the DKS systems. The key component of our method is a novel filtering step (F) which acts as a preconditioner in the framework of the locally optimal block preconditioned conjugate gradient (LOBPCG) method. The resulting method, dubbed the LOBPCG-F method, is able to compute the desired eigenvalues and eigenvectors in the positive energy band without computing any state in the negative energy band. The LOBPCG-F method introduces mild extra cost compared to the standard LOBPCG method and can be easily implemented. We demonstrate our method in the pseudopotential framework with a planewave basis set which naturally satisfies the kinetic balance prescription. Numerical results for Pt$_2$, Au$_2$, TlF, and Bi$_2$Se$_3$ indicate that the LOBPCG-F method is a robust and efficient method for investigating the relativistic effect in systems containing heavy elements.
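The paper's filter F is tailored to the DKS spectrum and is not reproduced here. As a hedged illustration of the general idea, the sketch below runs SciPy's LOBPCG on a toy diagonal "Hamiltonian" with a shift-and-invert style preconditioner standing in for the filtering step; all matrices and parameters are illustrative.

```python
import numpy as np
from scipy.sparse import diags, identity, csc_matrix
from scipy.sparse.linalg import lobpcg, LinearOperator, factorized

n, k = 500, 5
A = diags(np.arange(1.0, n + 1.0), 0, format="csc")   # toy "Hamiltonian"
rng = np.random.default_rng(0)
X = rng.standard_normal((n, k))                        # initial block of vectors

# Preconditioner standing in for the filtering step F: an approximate inverse
# of (A - sigma*I) steers the iteration toward the band near sigma.
sigma = 0.0
solve = factorized(csc_matrix(A - sigma * identity(n)))
M = LinearOperator((n, n), matvec=solve)

eigvals, eigvecs = lobpcg(A, X, M=M, largest=False, tol=1e-8, maxiter=200)
print(eigvals)   # lowest five eigenvalues of the toy operator
```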
NASA Astrophysics Data System (ADS)
Hinko, Kathleen
2016-03-01
University educators (UEs) have a long history of teaching physics not only in formal classroom settings but also in informal outreach environments. The pedagogical practices of UEs in informal physics teaching have not been widely studied, and they may provide insight into formal practices and preparation. We investigate the interactions between UEs and children in an afterschool physics program facilitated by university physics students from the University of Colorado Boulder. In this program, physics undergraduates, graduate students and post-doctoral researchers work with K-8 children on hands-on physics activities on a weekly basis over the course of a semester. We use an Activity Theoretic framework as a tool to examine situational aspects of individuals' behavior in the complex structure of the afterschool program. Using this framework, we analyze video of UE-child interactions and identify three main pedagogical modalities that UEs display during activities: Instruction, Consultation and Participation modes. These modes are characterized by certain language, physical location, and objectives that establish differences in UE-child roles and division of labor. Based on this analysis, we discuss implications for promoting pedagogical strategies through purposeful curriculum development and university educator preparation.
NASA Technical Reports Server (NTRS)
Mackenzie, Anne I.; Baginski, Michael E.; Rao, Sadasiva M.
2007-01-01
In this work, we present a new set of basis functions, defined over a pair of planar triangular patches, for the solution of electromagnetic scattering and radiation problems associated with arbitrarily-shaped surfaces using the method of moments solution procedure. The basis functions are constant over the function subdomain and resemble pulse functions for one and two dimensional problems. Further, another set of basis functions, point-wise orthogonal to the first set, is also defined over the same function space. The primary objective of developing these basis functions is to utilize them for the electromagnetic solution involving conducting, dielectric, and composite bodies. However, in the present work, only the conducting body solution is presented and compared with other data.
NASA Technical Reports Server (NTRS)
Mackenzie, Anne I.; Baginski, Michael E.; Rao, Sadasiva M.
2008-01-01
In this work, we present a new set of basis functions, defined over a pair of planar triangular patches, for the solution of electromagnetic scattering and radiation problems associated with arbitrarily-shaped surfaces using the method of moments solution procedure. The basis functions are constant over the function subdomain and resemble pulse functions for one and two dimensional problems. Further, another set of basis functions, point-wise orthogonal to the first set, is also defined over the same function space. The primary objective of developing these basis functions is to utilize them for the electromagnetic solution involving conducting, dielectric, and composite bodies. However, in the present work, only the conducting body solution is presented and compared with other data.
NASA Astrophysics Data System (ADS)
Awatey, M. T.; Irving, J.; Oware, E. K.
2016-12-01
Markov chain Monte Carlo (McMC) inversion frameworks are becoming increasingly popular in geophysics due to their ability to recover multiple equally plausible geologic features that honor the limited noisy measurements. Standard McMC methods, however, become computationally intractable with increasing dimensionality of the problem, for example, when working with spatially distributed geophysical parameter fields. We present a McMC approach based on a sparse proper orthogonal decomposition (POD) model parameterization that implicitly incorporates the physics of the underlying process. First, we generate training images (TIs) via Monte Carlo simulations of the target process constrained to a conceptual model. We then apply POD to construct basis vectors from the TIs. A small number of basis vectors can represent most of the variability in the TIs, leading to dimensionality reduction. A projection of the starting model into the reduced basis space generates the starting POD coefficients. At each iteration, only coefficients within a specified sampling window are resimulated assuming a Gaussian prior. The sampling window grows at a specified rate as the iterations progress, starting from the coefficients corresponding to the highest-ranked basis vectors and moving toward those of the least informative ones. We found this gradual increment in the sampling window to be more stable compared to resampling all the coefficients right from the first iteration. We demonstrate the performance of the algorithm with both synthetic and lab-scale electrical resistivity imaging of saline tracer experiments, employing the same set of basis vectors for all inversions. We consider two scenarios of unimodal and bimodal plumes. The unimodal plume is consistent with the hypothesis underlying the generation of the TIs, whereas bimodality in plume morphology was not theorized. We show that uncertainty quantification using McMC can proceed in the reduced dimensionality space while accounting for the physics of the underlying process.
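A much-simplified sketch of the workflow described above, assuming synthetic stand-ins for the training images and omitting the likelihood (accept/reject) evaluation entirely: POD basis vectors come from an SVD of the TI matrix, and a random walk over the coefficients uses a sampling window that grows from the leading modes toward the least informative ones.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic "training images": random walks standing in for simulated plumes
n_ti, n_pix = 200, 400
tis = rng.standard_normal((n_ti, n_pix)).cumsum(axis=1)

# POD basis from the SVD of the centered TI matrix; keep the leading modes
mean = tis.mean(axis=0)
_, _, Vt = np.linalg.svd(tis - mean, full_matrices=False)
n_modes = 20
basis = Vt[:n_modes]                       # rows = POD basis vectors

def to_field(coeffs):
    return mean + coeffs @ basis           # coefficients -> parameter field

# Random walk over POD coefficients with a gradually growing sampling window
coeffs = np.zeros(n_modes)
window = 1
for it in range(2000):
    prop = coeffs.copy()
    prop[:window] += 0.1 * rng.standard_normal(window)
    # likelihood-based accept/reject against the data would go here (omitted)
    coeffs = prop
    if it % 100 == 99 and window < n_modes:
        window += 1                        # move toward less informative modes
```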
Amateur Image Pipeline Processing using Python plus PyRAF
NASA Astrophysics Data System (ADS)
Green, Wayne
2012-05-01
A template pipeline spanning observation planning through publishing is offered as a basis for establishing a long-term observing program. The data reduction pipeline encapsulates all policies and procedures, providing an accountable framework for data analysis and a teaching framework for IRAF. This paper introduces the technical details of a complete pipeline processing environment using Python, PyRAF and a few other languages. The pipeline encapsulates all processing decisions within an auditable framework. The framework quickly handles the heavy lifting of image processing. It also serves as an excellent teaching environment for astronomical data management and IRAF reduction decisions.
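A bare-bones skeleton of such an auditable pipeline. The step names and their bodies are hypothetical placeholders; real reductions would call PyRAF/IRAF tasks, which are not reproduced here.

```python
from dataclasses import dataclass, field

@dataclass
class PipelineLog:
    entries: list = field(default_factory=list)
    def record(self, step, detail):
        self.entries.append((step, detail))   # every decision is auditable

def calibrate(frame, log):                    # hypothetical step name
    log.record("calibrate", "bias/dark/flat applied")
    return frame

def photometry(frame, log):                   # hypothetical step name
    log.record("photometry", "aperture photometry run")
    return {"frame": frame, "mag": None}

def run_pipeline(frames):
    log = PipelineLog()
    results = [photometry(calibrate(f, log), log) for f in frames]
    return results, log

results, log = run_pipeline(["obj001.fits", "obj002.fits"])
print(log.entries)
```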
Aquilante, Francesco; Gagliardi, Laura; Pedersen, Thomas Bondo; Lindh, Roland
2009-04-21
Cholesky decomposition of the atomic two-electron integral matrix has recently been proposed as a procedure for automated generation of auxiliary basis sets for the density fitting approximation [F. Aquilante et al., J. Chem. Phys. 127, 114107 (2007)]. In order to increase computational performance while maintaining accuracy, we propose here to reduce the number of primitive Gaussian functions of the contracted auxiliary basis functions by means of a second Cholesky decomposition. Test calculations show that this procedure is most beneficial in conjunction with highly contracted atomic orbital basis sets such as atomic natural orbitals, and that the error resulting from the second decomposition is negligible. We also demonstrate theoretically as well as computationally that the locality of the fitting coefficients can be controlled by means of the decomposition threshold even with the long-ranged Coulomb metric. Cholesky decomposition-based auxiliary basis sets are thus ideally suited for local density fitting approximations.
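The paper concerns a second Cholesky decomposition of contracted auxiliary functions; the sketch below only illustrates the generic building block, a threshold-controlled pivoted (incomplete) Cholesky decomposition of a positive semidefinite matrix, on a toy stand-in for the two-electron integral matrix.

```python
import numpy as np

def pivoted_cholesky(M, tau):
    """Threshold-controlled pivoted Cholesky of a PSD matrix M.

    Stops once the largest remaining diagonal drops below tau; the number
    of retained columns plays the role of the auxiliary basis size.
    """
    R = M.astype(float).copy()                # residual matrix
    n = R.shape[0]
    L = np.zeros((n, n))
    k = 0
    for _ in range(n):
        j = int(np.argmax(np.diag(R)))        # full pivoting on the diagonal
        if R[j, j] < tau:
            break
        L[:, k] = R[:, j] / np.sqrt(R[j, j])
        R -= np.outer(L[:, k], L[:, k])
        k += 1
    return L[:, :k]

rng = np.random.default_rng(1)
A = rng.standard_normal((8, 8))
M = A @ A.T                                   # toy PSD "integral matrix"
L = pivoted_cholesky(M, tau=1e-10)
print(np.allclose(M, L @ L.T))                # True: full-rank reconstruction
```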
NASA Astrophysics Data System (ADS)
Aquilante, Francesco; Gagliardi, Laura; Pedersen, Thomas Bondo; Lindh, Roland
2009-04-01
Cholesky decomposition of the atomic two-electron integral matrix has recently been proposed as a procedure for automated generation of auxiliary basis sets for the density fitting approximation [F. Aquilante et al., J. Chem. Phys. 127, 114107 (2007)]. In order to increase computational performance while maintaining accuracy, we propose here to reduce the number of primitive Gaussian functions of the contracted auxiliary basis functions by means of a second Cholesky decomposition. Test calculations show that this procedure is most beneficial in conjunction with highly contracted atomic orbital basis sets such as atomic natural orbitals, and that the error resulting from the second decomposition is negligible. We also demonstrate theoretically as well as computationally that the locality of the fitting coefficients can be controlled by means of the decomposition threshold even with the long-ranged Coulomb metric. Cholesky decomposition-based auxiliary basis sets are thus ideally suited for local density fitting approximations.
Reducing racial bias among health care providers: lessons from social-cognitive psychology.
Burgess, Diana; van Ryn, Michelle; Dovidio, John; Saha, Somnath
2007-06-01
The paper sets forth a set of evidence-based recommendations for interventions to combat unintentional bias among health care providers, drawing upon theory and research in social cognitive psychology. Our primary aim is to provide a framework that outlines strategies and skills, which can be taught to medical trainees and practicing physicians, to prevent unconscious racial attitudes and stereotypes from negatively influencing the course and outcomes of clinical encounters. These strategies and skills are designed to: 1) enhance internal motivation to reduce bias, while avoiding external pressure; 2) increase understanding about the psychological basis of bias; 3) enhance providers' confidence in their ability to successfully interact with socially dissimilar patients; 4) enhance emotional regulation skills; and 5) improve the ability to build partnerships with patients. We emphasize the need for programs to provide a nonthreatening environment in which to practice new skills and the need to avoid making providers ashamed of having racial, ethnic, or cultural stereotypes. These recommendations are also intended to provide a springboard for research on interventions to reduce unintentional racial bias in health care.
Agent-Based Mapping of Credit Risk for Sustainable Microfinance
Lee, Joung-Hun; Jusup, Marko; Podobnik, Boris; Iwasa, Yoh
2015-01-01
By drawing analogies with independent research areas, we propose an unorthodox framework for mapping microfinance credit risk---a major obstacle to the sustainability of lenders outreaching to the poor. Specifically, using the elements of network theory, we constructed an agent-based model that obeys the stylized rules of microfinance industry. We found that in a deteriorating economic environment confounded with adverse selection, a form of latent moral hazard may cause a regime shift from a high to a low loan payment probability. An after-the-fact recovery, when possible, required the economic environment to improve beyond that which led to the shift in the first place. These findings suggest a small set of measurable quantities for mapping microfinance credit risk and, consequently, for balancing the requirements to reasonably price loans and to operate on a fully self-financed basis. We illustrate how the proposed mapping works using a 10-year monthly data set from one of the best-known microfinance representatives, Grameen Bank in Bangladesh. Finally, we discuss an entirely new perspective for managing microfinance credit risk based on enticing spontaneous cooperation by building social capital. PMID:25945790
NASA Astrophysics Data System (ADS)
Randler, Christoph; Kummer, Barbara; Wilhelm, Christian
2012-06-01
The aim of this study was to assess the outcome of a zoo visit in terms of learning and retention of knowledge concerning the adaptations and behavior of vertebrate species. The basis of the work was the concept of implementing zoo visits as an out-of-school setting for formal, curriculum-based learning. Our theoretical framework centers on the self-determination theory; therefore, we used a group-based, hands-on learning environment. To address these questions, we used a treatment-control design (BACI) with different treatments and a control group. Pre-, post- and retention tests were applied. All treatments led to a substantial increase of learning and retention knowledge compared to the control group. Immediately after the zoo visit, the zoo-guide tour provided the highest scores, while after a delay of 6 weeks, the learner-centered environment combined with a teacher-guided summarizing scored best. We suggest incorporating the zoo as an out-of-school environment into formal school learning, and we propose different methods to improve learning in zoo settings.
Jian, Wen-Shan; Hsu, Chien-Yeh; Hao, Te-Hui; Wen, Hsyien-Chia; Hsu, Min-Huei; Lee, Yen-Liang; Li, Yu-Chuan; Chang, Polun
2007-11-01
Traditional electronic health record (EHR) data are produced from various hospital information systems. They could not have existed independently of an information system until the advent of XML technology. The interoperability of a healthcare system can be divided into two dimensions: functional interoperability and semantic interoperability. Currently, no single EHR standard exists that provides complete EHR interoperability. In order to establish a national EHR standard, we developed a set of local EHR templates. The Taiwan Electronic Medical Record Template (TMT) is a standard that aims to achieve semantic interoperability in EHR exchanges nationally. The TMT architecture is basically composed of forms, components, sections, and elements. Data are stored in the elements, which can be referenced by the code set, data type, and narrative block. The TMT was established with the following requirements in mind: (1) transformable to international standards; (2) having a minimal impact on the existing healthcare system; (3) easy to implement and deploy; and (4) compliant with Taiwan's current laws and regulations. The TMT provides a basis for building a portable, interoperable information infrastructure for EHR exchange in Taiwan.
Agent-based mapping of credit risk for sustainable microfinance.
Lee, Joung-Hun; Jusup, Marko; Podobnik, Boris; Iwasa, Yoh
2015-01-01
By drawing analogies with independent research areas, we propose an unorthodox framework for mapping microfinance credit risk--a major obstacle to the sustainability of lenders outreaching to the poor. Specifically, using the elements of network theory, we constructed an agent-based model that obeys the stylized rules of microfinance industry. We found that in a deteriorating economic environment confounded with adverse selection, a form of latent moral hazard may cause a regime shift from a high to a low loan payment probability. An after-the-fact recovery, when possible, required the economic environment to improve beyond that which led to the shift in the first place. These findings suggest a small set of measurable quantities for mapping microfinance credit risk and, consequently, for balancing the requirements to reasonably price loans and to operate on a fully self-financed basis. We illustrate how the proposed mapping works using a 10-year monthly data set from one of the best-known microfinance representatives, Grameen Bank in Bangladesh. Finally, we discuss an entirely new perspective for managing microfinance credit risk based on enticing spontaneous cooperation by building social capital.
Developing a radiomics framework for classifying non-small cell lung carcinoma subtypes
NASA Astrophysics Data System (ADS)
Yu, Dongdong; Zang, Yali; Dong, Di; Zhou, Mu; Gevaert, Olivier; Fang, Mengjie; Shi, Jingyun; Tian, Jie
2017-03-01
Patient-targeted treatment of non-small cell lung carcinoma (NSCLC) has been well documented according to the histologic subtypes over the past decade. In parallel, the development of quantitative image biomarkers has recently been highlighted as an important diagnostic tool to facilitate histological subtype classification. In this study, we present a radiomics analysis that classifies adenocarcinoma (ADC) and squamous cell carcinoma (SqCC). We extract 52-dimensional, CT-based features (7 statistical features and 45 image texture features) to represent each nodule. We evaluate our approach on a clinical dataset including 324 ADC and 110 SqCC patients with CT image scans. Classification of these features is performed with four different machine-learning classifiers, including Support Vector Machines with Radial Basis Function kernel (RBF-SVM), Random Forest (RF), K-nearest neighbor (KNN), and RUSBoost algorithms. To improve the classifiers' performance, an optimal feature subset is selected from the original feature set by using an iterative forward-inclusion and backward-elimination algorithm. Extensive experimental results demonstrate that radiomics features achieve encouraging classification results on both the complete feature set (AUC=0.89) and the optimal feature subset (AUC=0.91).
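As a hedged illustration of one of the four classifiers, the sketch below cross-validates an RBF-SVM on synthetic features with the dimensions quoted in the abstract (434 nodules, 52 features); the greedy forward-inclusion/backward-elimination selection would wrap this scoring step. The data are random, so the AUC here is near 0.5, unlike the paper's results.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.standard_normal((434, 52))        # 52 radiomics features per nodule
y = np.r_[np.ones(324), np.zeros(110)]    # ADC vs SqCC labels (synthetic)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()
print(f"cross-validated AUC: {auc:.2f}")  # ~0.5 here, since features are random
```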
Reducing Racial Bias Among Health Care Providers: Lessons from Social-Cognitive Psychology
van Ryn, Michelle; Dovidio, John; Saha, Somnath
2007-01-01
The paper sets forth a set of evidence-based recommendations for interventions to combat unintentional bias among health care providers, drawing upon theory and research in social cognitive psychology. Our primary aim is to provide a framework that outlines strategies and skills, which can be taught to medical trainees and practicing physicians, to prevent unconscious racial attitudes and stereotypes from negatively influencing the course and outcomes of clinical encounters. These strategies and skills are designed to: 1) enhance internal motivation to reduce bias, while avoiding external pressure; 2) increase understanding about the psychological basis of bias; 3) enhance providers' confidence in their ability to successfully interact with socially dissimilar patients; 4) enhance emotional regulation skills; and 5) improve the ability to build partnerships with patients. We emphasize the need for programs to provide a nonthreatening environment in which to practice new skills and the need to avoid making providers ashamed of having racial, ethnic, or cultural stereotypes. These recommendations are also intended to provide a springboard for research on interventions to reduce unintentional racial bias in health care. PMID:17503111
An Analysis of Internet’s MBONE: A Media Choice Perspective
1994-09-01
The symbolic interactionism framework provides a basis for understanding the factors involved in determining which medium best fits a user's communication needs. Under this framework, the equivocality of a message should affect media choice; the other factors discussed include uncertainty, media as a symbol, and social presence.
ERIC Educational Resources Information Center
Murin, Tricia M.
2016-01-01
Providing equitable education for all students is the responsibility of administrators, teachers, and parents. Even though the MTSS/RtII Framework has evolved from the RtI and RtII models, the basis is the same: intervening and identifying students' needs and analyzing data and programming instruction to meet all students' needs. Even though in…
A conceptual framework for the study of human ecosystems in urban areas
Steward T.A. Pickett; William R. Burch; Shawn E. Dalton; Timothy W. Foresman; J. Morgan Grove; Rowan Rowntree
1997-01-01
The need for integrated concepts, capable of satisfying natural and social scientists and supporting integrated research, motivates a conceptual framework for understanding the role of humans in ecosystems. The question is how to add humans to the ecological models used to understand urban ecosystems. The ecosystem concept can serve as the basis, but specific social...
ERIC Educational Resources Information Center
Evangelista, Nancy; McLellan, Mary J.
2004-01-01
The expansion of early childhood services has brought increasing recognition of the need to address mental health disorders in young children. The transactional perspective of developmental psychopathology is the basis for review of diagnostic frameworks for young children. The Diagnostic and Statistical Manual of Mental Disorders (DSM-IV) is…
The Concept of Energy in Psychological Theory. Cognitive Science Program, Technical Report No. 86-2.
ERIC Educational Resources Information Center
Posner, Michael I.; Rothbart, Mary Klevjord
This paper describes a basic framework for integration of computational and energetic concepts in psychological theory. The framework is adapted from a general effort to understand the neural systems underlying cognition. The element of the cognitive system that provides the best basis for attempting to relate energetic and computational ideas is…
ERIC Educational Resources Information Center
Basham, James D.; Lowrey, K. Alisa; deNoyelles, Aimee
2010-01-01
This study investigated the Universal Design for Learning (UDL) framework as a basis for a bi-university computer mediated communication (CMC) collaborative project. Participants in the research included 78 students from two special education programs enrolled in teacher education courses. The focus of the investigation was on exploring the…
ERIC Educational Resources Information Center
Lai, Su-Huei
A conceptual framework of the modes of problem-solving action has been developed on the basis of a simple relationship cone to assist individuals in diversified professions in inquiry and implementation of theory and practice in their professional development. The conceptual framework is referred to as the Cone-Deciphered Modes of Problem Solving…
A Conceptual Framework for the Indirect Method of Reporting Net Cash Flow from Operating Activities
ERIC Educational Resources Information Center
Wang, Ting J.
2010-01-01
This paper describes the fundamental concept of the reconciliation behind the indirect method of the statement of cash flows. A conceptual framework is presented to demonstrate how accrual and cash-basis accounting methods relate to each other and to illustrate the concept of reconciling these two accounting methods. The conceptual framework…
ERIC Educational Resources Information Center
Gade, Sharada
2015-01-01
Long association with a mathematics teacher at a Grade 4-6 school in Sweden is the basis for reporting a case of teacher-researcher collaboration. Three theoretical frameworks used to study its development over time are relational knowing, relational agency and cogenerative dialogue. While relational knowing uses narrative perspectives to explore the…
ERIC Educational Resources Information Center
Jacobo, Rodolfo; Ochoa, Alberto M.
2011-01-01
This article examines the experiences of selected undocumented college-aged (UCA) students attending a community and four year college, and the trauma they live on a daily basis. A conceptual framework is provided for examining the tensions experienced by undocumented students. The framework is suggested as a tool to analyze the explicit and…
Evaluation, Sustainable Development, and the Environment in the South Pacific
ERIC Educational Resources Information Center
Turvey, Rosario
2007-01-01
This article outlines the Results-Based Evaluation (RBE) framework proposed for the ex-post assessment of the National Environmental Management Strategies (NEMS) in 12 small-island developing states (SIDS) in the South Pacific. It gives an overview of the methods and basis of developing an evaluation framework in the context of SIDS in the region.…
Reasons Preventing Teachers from Acting within the Framework of Ethical Principles
ERIC Educational Resources Information Center
Dag, Nilgün; Arslantas, Halis Adnan
2015-01-01
This study aims at putting forth the reasons preventing teachers from acting ethically, acting within the framework of ethical principles and having an ethical tendency. This study featuring a qualitative research model taking as a basis the case study approach followed a path of selecting people that can be a rich source of information for…
Empirical grounding of the nature of scientific inquiry: A study of developing researchers
NASA Astrophysics Data System (ADS)
Stucky, Amy Preece
This work uses grounded theory methodology for developing theory about the nature of authentic scientific inquiry that occurs on a day-to-day basis in an academic research laboratory. Symbolic interaction and situated learning provide a theoretical framework. Data were collected from field notes, over 100 hours of videotape of researchers working in a chemical research laboratory, and interviews with participants. The phenomena of a research laboratory suggest that authentic daily work stretches scientists in three learning modalities: cognitive, affective and motivational beliefs and goals, which influence action to promote learning. A laboratory's line of research is divided into individual, thematic projects. Researchers are enabled in a specialized laboratory environment with sets of unique artifacts, substances, people and theoretical concepts to facilitate production of significant research goals. The work itself consists of chemical and mechanical processes facilitated by human actions, appropriate mental states, and theoretical explanations. The cognitive, affective (emotional), and conative (motivational) stretching then leads to explicit learning as well as implicit learning in the gain of experience and tacit knowledge. Implications of these findings about the nature of authentic scientific research on a day-to-day basis are applied to inquiry in science education in undergraduate and graduate education.
Banishing the Control Homunculi in Studies of Action Control and Behavior Change
Verbruggen, Frederick; McLaren, Ian P. L.; Chambers, Christopher D.
2014-01-01
For centuries, human self-control has fascinated scientists and nonscientists alike. Current theories often attribute it to an executive control system. But even though executive control receives a great deal of attention across disciplines, most aspects of it are still poorly understood. Many theories rely on an ill-defined set of “homunculi” doing jobs like “response inhibition” or “updating” without explaining how they do so. Furthermore, it is not always appreciated that control takes place across different timescales. These two issues hamper major advances. Here we focus on the mechanistic basis for the executive control of actions. We propose that at the most basic level, action control depends on three cognitive processes: signal detection, action selection, and action execution. These processes are modulated via error-correction or outcome-evaluation mechanisms, preparation, and task rules maintained in working and long-term memory. We also consider how executive control of actions becomes automatized with practice and how people develop a control network. Finally, we discuss how the application of this unified framework in clinical domains can increase our understanding of control deficits and provide a theoretical basis for the development of novel behavioral change interventions. PMID:25419227
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aquino, Fredy W.; Govind, Niranjan; Autschbach, Jochen
2011-10-01
Density functional theory (DFT) calculations of NMR chemical shifts and molecular g-tensors with Gaussian-type orbitals are implemented via second-order energy derivatives within the scalar relativistic zeroth order regular approximation (ZORA) framework. Nonhybrid functionals, standard (global) hybrids, and range-separated (Coulomb-attenuated, long-range corrected) hybrid functionals are tested. Origin invariance of the results is ensured by use of gauge-including atomic orbital (GIAO) basis functions. The new implementation in the NWChem quantum chemistry package is verified by calculations of nuclear shielding constants for the heavy atoms in HX (X = F, Cl, Br, I, At) and H2X (X = O, S, Se, Te, Po), and Te chemical shifts in a number of tellurium compounds. The basis set and functional dependence of g-shifts is investigated for 14 radicals with light and heavy atoms. The problem of accurately predicting F NMR shielding in UF6-nCln, n = 1 to 6, is revisited. The results are sensitive to approximations in the density functionals, indicating a delicate balance of DFT self-interaction vs. correlation. For the uranium halides, the results with the range-separated functionals are mixed.
Spatial Bayesian latent factor regression modeling of coordinate-based meta-analysis data.
Montagna, Silvia; Wager, Tor; Barrett, Lisa Feldman; Johnson, Timothy D; Nichols, Thomas E
2018-03-01
Now over 20 years old, functional MRI (fMRI) has a large and growing literature that is best synthesised with meta-analytic tools. As most authors do not share image data, only the peak activation coordinates (foci) reported in the article are available for Coordinate-Based Meta-Analysis (CBMA). Neuroimaging meta-analysis is used to (i) identify areas of consistent activation; and (ii) build a predictive model of task type or cognitive process for new studies (reverse inference). To simultaneously address these aims, we propose a Bayesian point process hierarchical model for CBMA. We model the foci from each study as a doubly stochastic Poisson process, where the study-specific log intensity function is characterized as a linear combination of a high-dimensional basis set. A sparse representation of the intensities is guaranteed through latent factor modeling of the basis coefficients. Within our framework, it is also possible to account for the effect of study-level covariates (meta-regression), significantly expanding the capabilities of the current neuroimaging meta-analysis methods available. We apply our methodology to synthetic data and neuroimaging meta-analysis datasets. © 2017, The International Biometric Society.
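A one-dimensional analogue of the intensity model described above, assuming Gaussian bumps as the basis set and a sparse coefficient vector in place of the full latent-factor prior; foci are then simulated from the doubly stochastic Poisson process by thinning. None of this reproduces the paper's hierarchical model or its posterior computation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Gaussian-bump basis on [0, 1]; sparse weights stand in for the latent factors
x = np.linspace(0.0, 1.0, 200)
centers = np.linspace(0.0, 1.0, 30)
B = np.exp(-0.5 * ((x[:, None] - centers[None, :]) / 0.05) ** 2)

theta = np.zeros(30)
theta[rng.choice(30, size=3, replace=False)] = rng.normal(1.5, 0.3, 3)
intensity = np.exp(B @ theta)             # log-linear Poisson intensity

# Simulate foci from the inhomogeneous Poisson process by thinning
lam_max = intensity.max()
n_cand = rng.poisson(lam_max)             # homogeneous candidates on [0, 1]
cand = rng.random(n_cand)
keep = rng.random(n_cand) < np.interp(cand, x, intensity) / lam_max
foci = cand[keep]
print(len(foci), "simulated foci")
```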
NASA Astrophysics Data System (ADS)
Balabanov, Nikolai B.; Peterson, Kirk A.
2005-08-01
Sequences of basis sets that systematically converge towards the complete basis set (CBS) limit have been developed for the first-row transition metal elements Sc-Zn. Two families of basis sets, nonrelativistic and Douglas-Kroll-Hess (-DK) relativistic, are presented that range in quality from triple-ζ to quintuple-ζ. Separate sets are developed for the description of valence (3d4s) electron correlation (cc-pVnZ and cc-pVnZ-DK; n =T,Q, 5) and valence plus outer-core (3s3p3d4s) correlation (cc-pwCVnZ and cc-pwCVnZ-DK; n =T,Q, 5), as well as these sets augmented by additional diffuse functions for the description of negative ions and weak interactions (aug-cc-pVnZ and aug-cc-pVnZ-DK). Extensive benchmark calculations at the coupled cluster level of theory are presented for atomic excitation energies, ionization potentials, and electron affinities, as well as molecular calculations on selected hydrides (TiH, MnH, CuH) and other diatomics (TiF, Cu2). In addition to observing systematic convergence towards the CBS limits, both 3s3p electron correlation and scalar relativity are calculated to strongly impact many of the atomic and molecular properties investigated for these first-row transition metal species.
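Basis-set families like these rely on systematic convergence toward the CBS limit. A common (though not necessarily the authors') way to exploit that convergence is a two-point 1/n^3 extrapolation of correlation energies, sketched below with hypothetical triple- and quadruple-zeta values.

```python
def cbs_two_point(e_n, e_m, n, m):
    """Two-point 1/n**3 extrapolation of correlation energies to the CBS limit."""
    return (n**3 * e_n - m**3 * e_m) / (n**3 - m**3)

# Hypothetical correlation energies (hartree) at triple-zeta (n=3) and quadruple-zeta (n=4)
print(cbs_two_point(-0.250, -0.262, 3, 4))   # estimate of the CBS-limit energy
```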
Gum, Lyn Frances; Lloyd, Andrea; Lawn, Sharon; Richards, Janet Noreen; Lindemann, Iris; Sweet, Linda; Ward, Helena; King, Alison; Bramwell, Donald
2013-11-01
This article is based on a partnership between a primary health service and a university whose shared goal was to prepare students and graduates for interprofessional practice (IPP). This collaborative process led to the development of consensus on an interprofessional capability framework. An action research methodology was adopted to study the development and progress of the partnership between university and health service providers. The initial aim was to understand their perceptions of IPP. Following this, the findings and draft capabilities were presented back to the groups. Finalisation of the capabilities took place with shared discussion and debate on how to implement them in the primary care setting. Several ideas and strategies were generated as to how to prepare effective interprofessional learning experiences for students in both environments (university and primary health care setting). Extensive stakeholder consultation from healthcare providers and educators has produced a framework, which incorporates the shared views and understandings, and can therefore be widely used in both settings. Development of a framework of capabilities for IPP, through a collaborative process, is a useful strategy for achieving agreement. Such a framework can guide curriculum for use in university and health service settings to assist incorporation of interprofessional capabilities into students' learning and practice.
Priority setting: what constitutes success? A conceptual framework for successful priority setting
Sibbald, Shannon L; Singer, Peter A; Upshur, Ross; Martin, Douglas K
2009-01-01
Background The sustainability of healthcare systems worldwide is threatened by a growing demand for services and expensive innovative technologies. Decision makers struggle in this environment to set priorities appropriately, particularly because they lack consensus about which values should guide their decisions. One way to approach this problem is to determine what all relevant stakeholders understand successful priority setting to mean. The goal of this research was to develop a conceptual framework for successful priority setting. Methods Three separate empirical studies were completed using qualitative data collection methods (one-on-one interviews with healthcare decision makers from across Canada; focus groups with representation of patients, caregivers and policy makers; and Delphi study including scholars and decision makers from five countries). Results This paper synthesizes the findings from three studies into a framework of ten separate but interconnected elements germane to successful priority setting: stakeholder understanding, shifted priorities/reallocation of resources, decision making quality, stakeholder acceptance and satisfaction, positive externalities, stakeholder engagement, use of explicit process, information management, consideration of values and context, and revision or appeals mechanism. Conclusion The ten elements specify both quantitative and qualitative dimensions of priority setting and relate to both process and outcome components. To our knowledge, this is the first framework that describes successful priority setting. The ten elements identified in this research provide guidance for decision makers and a common language to discuss priority setting success and work toward improving priority setting efforts. PMID:19265518
NASA Astrophysics Data System (ADS)
Hill, J. Grant; Peterson, Kirk A.
2017-12-01
New correlation consistent basis sets based on pseudopotential (PP) Hamiltonians have been developed from double- to quintuple-zeta quality for the late alkali (K-Fr) and alkaline earth (Ca-Ra) metals. These are accompanied by new all-electron basis sets of double- to quadruple-zeta quality that have been contracted for use with both Douglas-Kroll-Hess (DKH) and eXact 2-Component (X2C) scalar relativistic Hamiltonians. Sets for valence correlation (ms), cc-pVnZ-PP and cc-pVnZ-(DK,DK3/X2C), in addition to outer-core correlation [valence + (m-1)sp], cc-p(w)CVnZ-PP and cc-pwCVnZ-(DK,DK3/X2C), are reported. The -PP sets have been developed for use with small-core PPs [I. S. Lim et al., J. Chem. Phys. 122, 104103 (2005) and I. S. Lim et al., J. Chem. Phys. 124, 034107 (2006)], while the all-electron sets utilized second-order DKH Hamiltonians for 4s and 5s elements and third-order DKH for 6s and 7s. The accuracy of the basis sets is assessed through benchmark calculations at the coupled-cluster level of theory for both atomic and molecular properties. Not surprisingly, it is found that outer-core correlation is vital for accurate calculation of the thermodynamic and spectroscopic properties of diatomic molecules containing these elements.
NASA Astrophysics Data System (ADS)
Frisch, Michael J.; Binkley, J. Stephen; Schaefer, Henry F., III
1984-08-01
The relative energies of the stationary points on the FH2 and H2CO nuclear potential energy surfaces relevant to the hydrogen atom abstraction, H2 elimination and 1,2-hydrogen shift reactions have been examined using fourth-order Møller-Plesset perturbation theory and a variety of basis sets. The theoretical absolute zero activation energy for the F+H2→FH+H reaction is in better agreement with experiment than previous theoretical studies, and part of the disagreement between earlier theoretical calculations and experiment is found to result from the use of assumed rather than calculated zero-point vibrational energies. The fourth-order reaction energy for the elimination of hydrogen from formaldehyde is within 2 kcal mol-1 of the experimental value using the largest basis set considered. The qualitative features of the H2CO surface are unchanged by expansion of the basis set beyond the polarized triple-zeta level, but diffuse functions and several sets of polarization functions are found to be necessary for quantitative accuracy in predicted reaction and activation energies. Basis sets and levels of perturbation theory which represent good compromises between computational efficiency and accuracy are recommended.
NASA Astrophysics Data System (ADS)
Romero, Angel H.
2017-10-01
The influence of ring puckering angle on the multipole moments of sixteen four-membered heterocycles (1-16) was theoretically estimated using MP2 and different DFTs in combination with the 6-31+G(d,p) basis set. To obtain an accurate evaluation, the CCSD/cc-pVDZ level and the MP2 and PBE1PBE methods in combination with the aug-cc-pVDZ and aug-cc-pVTZ basis sets were applied to the planar geometries of 1-16. In general, the DFT and MP2 approaches provided an identical dependence of the electrical properties on the puckering angle for 1-16. Quantitatively, the quality of the level of theory and basis set significantly affects the predictions of the multipole moments, in particular for the heterocycles containing C=O and C=S bonds. Basis set convergence within the MP2 and PBE1PBE approximations is reached in the dipole moment calculations when the aug-cc-pVTZ basis set is used, while the quadrupole and octupole moment computations require a basis set larger than aug-cc-pVTZ. On the other hand, the multipole moments showed a strong dependence on the molecular geometry and the nature of the carbon-heteroatom bonds. Specifically, the C-X bond determines the behavior of the μ(ϕ), θ(ϕ) and Ω(ϕ) functions, while the C=Y bond plays an important role in the magnitude of the studied properties.
NASA Astrophysics Data System (ADS)
Girard, L.; Weiss, J.; Molines, J. M.; Barnier, B.; Bouillon, S.
2009-08-01
Sea ice drift and deformation from models are evaluated on the basis of statistical and scaling properties. These properties are derived from two observation data sets: the RADARSAT Geophysical Processor System (RGPS) and buoy trajectories from the International Arctic Buoy Program (IABP). Two simulations obtained with the Louvain-la-Neuve Ice Model (LIM) coupled to a high-resolution ocean model and a simulation obtained with the Los Alamos Sea Ice Model (CICE) were analyzed. Model ice drift compares well with observations in terms of large-scale velocity field and distributions of velocity fluctuations although a significant bias on the mean ice speed is noted. On the other hand, the statistical properties of ice deformation are not well simulated by the models: (1) The distributions of strain rates are incorrect: RGPS distributions of strain rates are power law tailed, i.e., exhibit "wild randomness," whereas models distributions remain in the Gaussian attraction basin, i.e., exhibit "mild randomness." (2) The models are unable to reproduce the spatial and temporal correlations of the deformation fields: In the observations, ice deformation follows spatial and temporal scaling laws that express the heterogeneity and the intermittency of deformation. These relations do not appear in simulated ice deformation. Mean deformation in models is almost scale independent. The statistical properties of ice deformation are a signature of the ice mechanical behavior. The present work therefore suggests that the mechanical framework currently used by models is inappropriate. A different modeling framework based on elastic interactions could improve the representation of the statistical and scaling properties of ice deformation.
The Heterogeneous Dynamics of Economic Complexity
Cristelli, Matthieu; Tacchella, Andrea; Pietronero, Luciano
2015-01-01
What will be the growth of the Gross Domestic Product (GDP) or the competitiveness of China, United States, and Vietnam in the next 3, 5 or 10 years? Although this kind of question has a large societal impact and an extreme value for economic policy making, providing a scientific basis for economic predictability is still a very challenging problem. Recent results of a new branch—Economic Complexity—have set the basis for a framework to approach such a challenge and to provide new perspectives to cast economic prediction into the conceptual scheme of forecasting the evolution of a dynamical system, as in the case of weather dynamics. We argue that a recently introduced non-monetary metric for country competitiveness (fitness) allows for quantifying the hidden growth potential of countries by means of the comparison of this measure for intangible assets with monetary figures, such as GDP per capita. This comparison defines the fitness-income plane, where we observe that country dynamics presents strongly heterogeneous patterns of evolution. The flow in some zones is found to be laminar, while in others a chaotic behavior is instead observed. These two regimes correspond to very different predictability features for the evolution of countries: in the former regime we find strongly predictable patterns, while the latter scenario exhibits a very low predictability. In such a framework, regressions, the usual tool used in economics, are no longer the appropriate strategy to deal with such a heterogeneous scenario, and new concepts, borrowed from dynamical systems theory, are mandatory. We therefore propose a data-driven method—the selective predictability scheme—in which we adopt a strategy similar to the method of analogues, first introduced by Lorenz, to assess the future evolution of countries. PMID:25671312
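The fitness metric referred to above is, by assumption, the nonlinear iterative map of Tacchella et al.; a minimal sketch on a random binary country-product matrix follows. The matrix, normalization choice, and iteration count are all illustrative.

```python
import numpy as np

def fitness_complexity(M, n_iter=200):
    """Iterative fitness-complexity map on a binary country-product matrix M."""
    F = np.ones(M.shape[0])                   # country fitness
    Q = np.ones(M.shape[1])                   # product complexity
    for _ in range(n_iter):
        F_new = M @ Q
        Q_new = 1.0 / (M.T @ (1.0 / np.maximum(F, 1e-12)))
        F = F_new / F_new.mean()              # normalize at every step
        Q = Q_new / Q_new.mean()
    return F, Q

rng = np.random.default_rng(3)
M = (rng.random((10, 30)) < 0.4).astype(float)   # toy export matrix
M[0] = 1.0          # one fully diversified country, so no product is orphaned
F, Q = fitness_complexity(M)
print(F)            # heterogeneous fitness values across the ten countries
```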
Belmar, Oscar; Velasco, Josefa; Martinez-Capel, Francisco
2011-05-01
Hydrological classification constitutes the first step of a new holistic framework for developing regional environmental flow criteria: the "Ecological Limits of Hydrologic Alteration (ELOHA)". The aim of this study was to develop a classification for 390 stream sections of the Segura River Basin based on 73 hydrological indices that characterize their natural flow regimes. The hydrological indices were calculated with 25 years of natural monthly flows (1980/81-2005/06) derived from a rainfall-runoff model developed by the Spanish Ministry of Environment and Public Works. These indices included, on a monthly or annual basis, measures of duration of droughts and of central tendency and dispersion of flow magnitude (average, low and high flow conditions). Principal Component Analysis (PCA) indicated high redundancy among most hydrological indices, as well as two gradients: flow magnitude for mainstream rivers and temporal variability for tributary streams. A classification with eight flow-regime classes was chosen as the most easily interpretable in the Segura River Basin, which was supported by ANOSIM analyses. These classes can be simplified into 4 broader groups with different seasonal discharge patterns: large rivers, perennial stable streams, perennial seasonal streams, and intermittent and ephemeral streams. They showed a high degree of spatial cohesion, following a gradient associated with climatic aridity from NW to SE, and were well defined in terms of the fundamental variables in Mediterranean streams: magnitude and temporal variability of flows. Therefore, this classification is a fundamental tool to support water management and planning in the Segura River Basin. Future research will allow us to study the flow alteration-ecological response relationship for each river type, and set the basis to design scientifically credible environmental flows following the ELOHA framework.
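A hedged sketch of the classification pipeline outlined above, using a random stand-in for the 390 x 73 index matrix; the paper's exact procedure, including its clustering algorithm, may differ from the standardize-PCA-cluster chain used here.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(7)
X = rng.standard_normal((390, 73))   # stand-in: 73 indices per stream section

Z = StandardScaler().fit_transform(X)
scores = PCA(n_components=2).fit_transform(Z)    # e.g. magnitude / variability axes
labels = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(scores)
print(np.bincount(labels))           # section counts per flow-regime class
```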
The heterogeneous dynamics of economic complexity.
Cristelli, Matthieu; Tacchella, Andrea; Pietronero, Luciano
2015-01-01
What will be the growth of the Gross Domestic Product (GDP) or the competitiveness of China, United States, and Vietnam in the next 3, 5 or 10 years? Although this kind of question has a large societal impact and an extreme value for economic policy making, providing a scientific basis for economic predictability is still a very challenging problem. Recent results of a new branch--Economic Complexity--have set the basis for a framework to approach such a challenge and to provide new perspectives to cast economic prediction into the conceptual scheme of forecasting the evolution of a dynamical system as in the case of weather dynamics. We argue that a recently introduced non-monetary metric for country competitiveness (fitness) allows for quantifying the hidden growth potential of countries by means of the comparison of this measure for intangible assets with monetary figures, such as GDP per capita. This comparison defines the fitness-income plane, where we observe that country dynamics presents strongly heterogeneous patterns of evolution. The flow in some zones is found to be laminar, while in others a chaotic behavior is instead observed. These two regimes correspond to very different predictability features for the evolution of countries: in the former regime we find strongly predictable patterns, while the latter scenario exhibits a very low predictability. In such a framework, regressions, the usual tool used in economics, are no longer the appropriate strategy to deal with such a heterogeneous scenario, and new concepts, borrowed from dynamical systems theory, are mandatory. We therefore propose a data-driven method--the selective predictability scheme--in which we adopt a strategy similar to the method of analogues, first introduced by Lorenz, to assess the future evolution of countries.
Large-scale data integration framework provides a comprehensive view on glioblastoma multiforme.
Ovaska, Kristian; Laakso, Marko; Haapa-Paananen, Saija; Louhimo, Riku; Chen, Ping; Aittomäki, Viljami; Valo, Erkka; Núñez-Fontarnau, Javier; Rantanen, Ville; Karinen, Sirkku; Nousiainen, Kari; Lahesmaa-Korpinen, Anna-Maria; Miettinen, Minna; Saarinen, Lilli; Kohonen, Pekka; Wu, Jianmin; Westermarck, Jukka; Hautaniemi, Sampsa
2010-09-07
Coordinated efforts to collect large-scale data sets provide a basis for systems level understanding of complex diseases. In order to translate these fragmented and heterogeneous data sets into knowledge and medical benefits, advanced computational methods for data analysis, integration and visualization are needed. We introduce a novel data integration framework, Anduril, for translating fragmented large-scale data into testable predictions. The Anduril framework allows rapid integration of heterogeneous data with state-of-the-art computational methods and existing knowledge in bio-databases. Anduril automatically generates thorough summary reports and a website that shows the most relevant features of each gene at a glance, allows sorting of data based on different parameters, and provides direct links to more detailed data on genes, transcripts or genomic regions. Anduril is open-source; all methods and documentation are freely available. We have integrated multidimensional molecular and clinical data from 338 subjects having glioblastoma multiforme, one of the deadliest and most poorly understood cancers, using Anduril. The central objective of our approach is to identify genetic loci and genes that have significant survival effect. Our results suggest several novel genetic alterations linked to glioblastoma multiforme progression and, more specifically, reveal Moesin as a novel glioblastoma multiforme-associated gene that has a strong survival effect and whose depletion in vitro significantly inhibited cell proliferation. All analysis results are available as a comprehensive website. Our results demonstrate that integrated analysis and visualization of multidimensional and heterogeneous data by Anduril enables drawing conclusions on functional consequences of large-scale molecular data. Many of the identified genetic loci and genes having significant survival effect have not been reported earlier in the context of glioblastoma multiforme. Thus, in addition to generally applicable novel methodology, our results provide several glioblastoma multiforme candidate genes for further studies. Anduril is available at http://csbi.ltdk.helsinki.fi/anduril/ and the glioblastoma multiforme analysis results at http://csbi.ltdk.helsinki.fi/anduril/tcga-gbm/
A unifying framework for rigid multibody dynamics and serial and parallel computational issues
NASA Technical Reports Server (NTRS)
Fijany, Amir; Jain, Abhinandan
1989-01-01
A unifying framework for various formulations of the dynamics of open-chain rigid multibody systems is discussed. Their suitability for serial and parallel processing is assessed. The framework is based on the derivation of intrinsic, i.e., coordinate-free, equations of the algorithms, which provides a suitable abstraction and permits a distinction to be made between the computational redundancy in the intrinsic and extrinsic equations. A set of spatial notation is used which allows the derivation of the various algorithms in a common setting and thus clarifies the relationships among them. The three classes of algorithms, viz., O(n), O(n^2) and O(n^3), for the solution of the dynamics problem are investigated. Researchers begin with the derivation of O(n^3) algorithms based on the explicit computation of the mass matrix, which provides insight into the underlying basis of the O(n) algorithms. From a computational perspective, the optimal choice of a coordinate frame for the projection of the intrinsic equations is discussed and the serial computational complexity of the different algorithms is evaluated. The three classes of algorithms are also analyzed for suitability for parallel processing. It is shown that the problem belongs to the class NC and that the time and processor bounds are of O(log^2(n)) and O(n^4), respectively. However, the algorithm that achieves these bounds is not stable. Researchers show that the fastest stable parallel algorithm achieves a computational complexity of O(n) with O(n^2) processors, and results from the parallelization of the O(n^3) serial algorithm.
Data science ethics in government.
Drew, Cat
2016-12-28
Data science can offer huge opportunities for government. With the ability to process larger and more complex datasets than ever before, it can provide better insights for policymakers and make services more tailored and efficient. As with all new technologies, there is a risk that we do not take up its opportunities and miss out on its enormous potential. We want people to feel confident to innovate with data. So, over the past 18 months, the Government Data Science Partnership has taken an open, evidence-based and user-centred approach to creating an ethical framework. It is a practical document that brings all the legal guidance together in one place, and is written in the context of new data science capabilities. As part of its development, we ran a public dialogue on data science ethics, including deliberative workshops, an experimental conjoint survey and an online engagement tool. The research supported the principles set out in the framework as well as provided useful insight into how we need to communicate about data science. It found that people had a low awareness of the term 'data science', but that showing data science examples can increase broad support for government exploring innovative uses of data. But people's support is highly context driven. People consider acceptability on a case-by-case basis, first thinking about the overall policy goals and likely intended outcome, and then weighing up privacy and unintended consequences. The ethical framework is a crucial start, but it does not solve all the challenges it highlights, particularly as technology is creating new challenges and opportunities every day. Continued research is needed into data minimization and anonymization, robust data models, algorithmic accountability, and transparency and data security. It also has revealed the need to set out a renewed deal between the citizen and state on data, to maintain and solidify trust in how we use people's data for social good.This article is part of the themed issue 'The ethical impact of data science'. © 2016 The Author(s).
The Cost of Quality--Its Application to Libraries.
ERIC Educational Resources Information Center
Franklin, Brinley
1994-01-01
Examines the conceptual basis for the cost of quality and its application to libraries. The framework for analysis of this conceptual basis includes definitions of the cost of quality; a brief historical review of the cost of quality; and the application of quality cost to libraries, including an explanation of how quality costs respond to quality…
Random sampling of elementary flux modes in large-scale metabolic networks.
Machado, Daniel; Soons, Zita; Patil, Kiran Raosaheb; Ferreira, Eugénio C; Rocha, Isabel
2012-09-15
The description of a metabolic network in terms of elementary (flux) modes (EMs) provides an important framework for metabolic pathway analysis. However, their application to large networks has been hampered by the combinatorial explosion in the number of modes. In this work, we develop a method for generating random samples of EMs without computing the whole set. Our algorithm is an adaptation of the canonical basis approach, where we add an additional filtering step which, at each iteration, selects a random subset of the new combinations of modes. In order to obtain an unbiased sample, all candidates are assigned the same probability of getting selected. This approach avoids the exponential growth of the number of modes during computation, thus generating a random sample of the complete set of EMs within reasonable time. We generated samples of different sizes for a metabolic network of Escherichia coli, and observed that they preserve several properties of the full EM set. It is also shown that EM sampling can be used for rational strain design. A well-distributed sample that is representative of the complete set of EMs should be suitable for most EM-based methods for analysis and optimization of metabolic networks. Source code for a cross-platform implementation in Python is freely available at http://code.google.com/p/emsampler. Contact: dmachado@deb.uminho.pt. Supplementary data are available at Bioinformatics online.
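A much-simplified sketch of one elimination step with the random filtering described above: each mode is a vector, opposite-signed pairs on the current constraint are combined into candidates that cancel that constraint, and the candidates are subsampled uniformly at random. A real EM computation also requires elementarity (adjacency/rank) tests, which are omitted here.

```python
import numpy as np

rng = np.random.default_rng(0)

def eliminate_constraint(modes, row, n_keep):
    """One elimination step with uniform random filtering of new combinations."""
    pos = [m for m in modes if m[row] > 1e-12]
    neg = [m for m in modes if m[row] < -1e-12]
    zero = [m for m in modes if abs(m[row]) <= 1e-12]
    # Nonnegative combinations that cancel entry `row`
    cands = [p[row] * q - q[row] * p for p in pos for q in neg]
    if len(cands) > n_keep:                   # filtering keeps growth bounded
        idx = rng.choice(len(cands), size=n_keep, replace=False)
        cands = [cands[i] for i in idx]       # every candidate equally likely
    return zero + cands

modes = [np.array([1.0, -1.0, 0.0]),
         np.array([0.0, 1.0, -1.0]),
         np.array([-1.0, 0.0, 1.0])]
print(eliminate_constraint(modes, row=1, n_keep=10))
```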
Marital Status and Persons With Dementia in Assisted Living.
Fields, Noelle L; Richardson, Virginia E; Schuman, Donna
2017-03-01
Despite the prevalence of dementia among residents in assisted living (AL), few researchers have focused on the length of stay (LOS) in AL among this population. Little is known about the factors that may contribute to LOS in these settings, particularly for residents with dementia. In the current study, a subset of AL residents with dementia (n = 112) was utilized to examine whether marital status was associated with LOS in AL, as this has received sparse attention in previous research despite studies suggesting that marital status influences LOS in other health-care and long-term care settings. The Andersen-Newman behavioral model was used as the conceptual framework for this study of LOS, marital status, and dementia in AL. We hypothesized that persons with dementia who were married would have a longer LOS than unmarried persons with dementia in AL. Cox regression was used to examine the association between marital status and LOS in AL of residents with dementia and whether activities of daily living were related to discharge from AL settings among married and unmarried residents with dementia. Main effects for marital status and the interaction between marital status and mobility with LOS were examined. Study findings provide information related to the psychosocial needs of AL residents with dementia and offer implications for assessing the on-going needs of vulnerable AL residents.
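A hedged sketch of the kind of Cox model described above, using the lifelines package on synthetic data sized to the study's n = 112; the variable names, effect sizes, and censoring rule are all invented for illustration and do not reflect the study's data.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 112
married = rng.integers(0, 2, n)
mobility = rng.integers(1, 4, n)                 # hypothetical ADL mobility score
hazard = np.exp(-0.4 * married + 0.3 * (mobility == 3))
los = rng.exponential(24.0 / hazard)             # months until discharge
df = pd.DataFrame({"los_months": np.minimum(los, 60.0),
                   "discharged": (los < 60).astype(int),   # censor at 60 months
                   "married": married,
                   "mobility": mobility})
df["married_x_mobility"] = df["married"] * df["mobility"]  # interaction term

cph = CoxPHFitter()
cph.fit(df, duration_col="los_months", event_col="discharged")
cph.print_summary()   # main effects plus the marital status x mobility interaction
```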
A conceptualisation framework for building consensus on environmental sensitivity.
González Del Campo, Ainhoa
2017-09-15
Examination of the intrinsic attributes of a system that render it more or less sensitive to potential stressors provides further insight into the baseline environment. In impact assessment, sensitivity of environmental receptors can be conceptualised on the basis of their: a) quality status according to statutory indicators and associated thresholds or targets; b) statutory protection; or c) inherent risk. Where none of these considerations are pertinent, subjective value judgments can be applied to determine sensitivity. This pragmatic conceptual framework formed the basis of a stakeholder consultation process for harmonising degrees of sensitivity of a number of environmental criteria. Harmonisation was sought to facilitate their comparative and combined analysis. Overall, full or wide agreement was reached on relative sensitivity values for the large majority of the reviewed criteria. Consensus was easier to reach on some themes (e.g. biodiversity, water and cultural heritage) than others (e.g. population and soils). As anticipated, existing statutory measures shaped the outcomes but, ultimately, knowledge-based values prevailed. The agreed relative sensitivities warrant extensive consultation but the conceptual framework provides a basis for increasing stakeholder consensus and objectivity of baseline assessments. This, in turn, can contribute to improving the evidence-base for characterising the significance of potential impacts.
The role of government in supporting technological advance
NASA Astrophysics Data System (ADS)
Tucker, Christopher K.
A broad and poorly focused debate has raged for some time across the social science disciplines and policy-related professions over the proper role of government in a mixed economy. Current debates are largely constrained by a basic set of 'market failure' concepts developed in economics. This dissertation interrogates the histories of the automobile, electrical, and aircraft industries in the six decades spanning the turn of the 20th century with a theoretical framework that draws on recent theorizing on the co-evolution of technologies, industrial structure, and supporting institutions. In highlighting institutional and technological aspects of industrial development, this dissertation provides a basis for science and technology policy making that moves beyond 'market failure' analysis.
A Case-Based Reasoning Method with Rank Aggregation
NASA Astrophysics Data System (ADS)
Sun, Jinhua; Du, Jiao; Hu, Jian
2018-03-01
To improve the accuracy of case-based reasoning (CBR), this paper proposes a new CBR framework built on the principle of rank aggregation. First, a ranking method is applied in each attribute subspace to obtain the ordering relation between cases on that attribute, yielding a ranking matrix. Second, the retrieval of similar cases from the ranking matrix is transformed into a rank aggregation optimization problem, solved with the Kemeny optimal aggregation. On this basis, a rank aggregation case-based reasoning algorithm, named RA-CBR, is designed. Experiments on UCI data sets show that the case retrieval accuracy of RA-CBR is higher than that of Euclidean-distance CBR and Mahalanobis-distance CBR, indicating that the RA-CBR method can increase the performance and efficiency of CBR.
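To make the aggregation step concrete, here is a minimal brute-force Python sketch of Kemeny-optimal rank aggregation, which returns the ranking minimizing the total Kendall tau distance to the input rankings. Enumerating all permutations is feasible only for small case sets; the case identifiers are invented, and the per-attribute ranking step is assumed to have been done already.

    from itertools import permutations

    def kendall_tau_distance(r1, r2):
        # Number of item pairs ordered differently by the two rankings.
        pos1 = {item: i for i, item in enumerate(r1)}
        pos2 = {item: i for i, item in enumerate(r2)}
        items = list(pos1)
        d = 0
        for i in range(len(items)):
            for j in range(i + 1, len(items)):
                a, b = items[i], items[j]
                if (pos1[a] - pos1[b]) * (pos2[a] - pos2[b]) < 0:
                    d += 1
        return d

    def kemeny_aggregate(rankings):
        # Brute force: try every ordering, keep the one with the smallest
        # summed Kendall tau distance to all input rankings.
        best, best_cost = None, float("inf")
        for cand in permutations(rankings[0]):
            cost = sum(kendall_tau_distance(cand, r) for r in rankings)
            if cost < best_cost:
                best, best_cost = list(cand), cost
        return best

    # Hypothetical per-attribute rankings of three cases
    rankings = [["c1", "c2", "c3"], ["c2", "c1", "c3"], ["c1", "c3", "c2"]]
    print(kemeny_aggregate(rankings))  # -> ['c1', 'c2', 'c3']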
Modeling the data systems role of the scientist (for the NEEDS Command and Control Task)
NASA Technical Reports Server (NTRS)
Hei, D. J., Jr.; Winter, W. J., Jr.; Brookes, R.; Locke, M.
1981-01-01
Research was conducted into the command and control activities of the scientists for five space missions: International Ultraviolet Explorer, Solar Maximum Mission, International Sun-Earth Explorer, High-Energy Astronomy Observatory 1, and Atmospheric Explorer 5. This provided a basis for developing a generalized description of the scientists' activities, and a series of flowcharts was chosen to express it. This set of flowcharts constitutes a model of the scientists' activities within the total data system. The model was developed through three levels of detail. The first is general and provides a conceptual framework for discussing the system. The second identifies major functions and should provide a fundamental understanding of the scientists' command and control activities. The third level expands the major functions into a more detailed description.
Distributed Computing Framework for Synthetic Radar Application
NASA Technical Reports Server (NTRS)
Gurrola, Eric M.; Rosen, Paul A.; Aivazis, Michael
2006-01-01
We are developing an extensible software framework in response to Air Force and NASA needs for distributed computing facilities for a variety of radar applications. The objective of this work is to develop a Python-based software framework, that is, the framework elements of the middleware that allow developers to control processing flow on a grid in a distributed computing environment. Framework architectures to date allow developers to connect processing functions together as interchangeable objects, so that a data-flow graph can be devised for each specific problem to be solved. The Pyre framework, developed at the California Institute of Technology (Caltech) and now being used as the basis for next-generation radar processing at JPL, is such a Python-based software framework. We have extended the Pyre framework with new facilities to deploy processing components as services, including components that monitor and assess the state of the distributed network for eventual real-time control of grid resources.
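The sketch below illustrates, in Python, the general idea of wiring interchangeable processing components into a data-flow graph. It is not the Pyre API; all class, stage, and function names are invented for illustration.

    class Component:
        # A processing stage that can be wired to downstream stages.
        def __init__(self, name, func):
            self.name = name
            self.func = func
            self.downstream = []

        def connect(self, other):
            # Wire this component's output to another component's input;
            # returning the target allows chained connect() calls.
            self.downstream.append(other)
            return other

        def run(self, data):
            result = self.func(data)
            for node in self.downstream:
                node.run(result)
            return result

    # Hypothetical pipeline: ingest -> smooth -> report
    ingest = Component("ingest", lambda d: d)
    smooth = Component("smooth", lambda d: [0.5 * (a + b) for a, b in zip(d, d[1:])])
    report = Component("report", lambda d: print("processed:", d))

    ingest.connect(smooth).connect(report)
    ingest.run([1.0, 3.0, 5.0, 7.0])  # prints: processed: [2.0, 4.0, 6.0]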
NASA Astrophysics Data System (ADS)
Varandas, António J. C.
2018-04-01
Because the one-electron basis set limit is difficult to reach in correlated post-Hartree-Fock ab initio calculations, the low-cost route of using methods that extrapolate to the estimated basis set limit attracts immediate interest. The situation is somewhat more satisfactory at the Hartree-Fock level because numerical calculation of the energy is often affordable at nearly converged basis set levels. Still, extrapolation schemes for the Hartree-Fock energy are addressed here, although the focus is on the more slowly convergent and computationally demanding correlation energy. Because they are frequently based on the gold-standard coupled-cluster theory with single, double, and perturbative triple excitations [CCSD(T)], correlated calculations are often affordable only with the smallest basis sets, and hence single-level extrapolations from one raw energy could attain maximum usefulness. This possibility is examined. Whenever possible, this review uses raw data from second-order Møller-Plesset perturbation theory, as well as CCSD, CCSD(T), and multireference configuration interaction methods. Inescapably, the emphasis is on work done by the author's research group. Certain issues in need of further research or review are pinpointed.
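As one concrete instance of the extrapolation schemes under review, the widely used two-point scheme assumes the correlation energy converges as E(X) = E_CBS + A X^(-alpha), with alpha = 3 the common choice, and solves for E_CBS from energies at two cardinal numbers X. This is a generic textbook recipe, not the author's specific protocol; the Python sketch below and its example energies are illustrative only.

    def cbs_two_point(e_x, e_y, x, y, alpha=3.0):
        # Extrapolate to the complete-basis-set (CBS) limit assuming
        # E(X) = E_CBS + A * X**(-alpha), for cardinal numbers X
        # (e.g. 3 for cc-pVTZ, 4 for cc-pVQZ).
        px, py = x ** (-alpha), y ** (-alpha)
        return (e_y * px - e_x * py) / (px - py)

    # Hypothetical correlation energies (hartree) at TZ and QZ levels:
    print(cbs_two_point(-0.2750, -0.2850, x=3, y=4))  # approx -0.2923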
Kagawa, Rina; Kawazoe, Yoshimasa; Ida, Yusuke; Shinohara, Emiko; Tanaka, Katsuya; Imai, Takeshi; Ohe, Kazuhiko
2017-07-01
Phenotyping is an automated technique that can be used to distinguish patients based on electronic health records. To improve the quality of medical care and advance type 2 diabetes mellitus (T2DM) research, the demand for T2DM phenotyping has been increasing. Some existing phenotyping algorithms are not sufficiently accurate for screening or for identifying clinical research subjects. We propose a practical phenotyping framework that uses both expert knowledge and a machine learning approach to develop 2 phenotyping algorithms: one for screening, the other for identifying research subjects. We employ expert knowledge as rules to exclude obvious control patients and machine learning to increase accuracy for complicated patients. We developed phenotyping algorithms on the basis of our framework and performed binary classification to determine whether a patient has T2DM. To facilitate development of practical phenotyping algorithms, this study introduces new evaluation metrics: area under the precision-sensitivity curve (AUPS) with a high sensitivity, and AUPS with a high positive predictive value. The proposed phenotyping algorithms based on our framework show higher performance than baseline algorithms. Depending on the tuning approach, our framework can be used to develop the 2 types of phenotyping algorithms: one for screening, the other for identifying research subjects. The framework can be easily implemented because its evaluation metrics are chosen in accordance with users' objectives, and the resulting phenotyping algorithms are useful for extracting T2DM patients in retrospective studies.
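One plausible reading of the high-sensitivity variant of the AUPS metric is the area under the precision-sensitivity (recall) curve restricted to the high-sensitivity region; the Python sketch below implements that reading with scikit-learn. The 0.9 cutoff, the linear interpolation between curve points, and the toy data are all assumptions made for illustration, not taken from the paper.

    import numpy as np
    from sklearn.metrics import precision_recall_curve

    def aups_high_sensitivity(y_true, y_score, min_sensitivity=0.9):
        # Area under the precision-sensitivity curve over the region
        # [min_sensitivity, 1]; linear interpolation is a simplification.
        precision, recall, _ = precision_recall_curve(y_true, y_score)
        r, p = recall[::-1], precision[::-1]           # ascending recall
        grid = np.linspace(min_sensitivity, 1.0, 201)  # dense recall grid
        return np.trapz(np.interp(grid, r, p), grid)

    # Toy example: true T2DM labels and classifier scores for 8 patients
    y_true  = [1, 0, 1, 1, 0, 1, 0, 0]
    y_score = [0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2]
    print(aups_high_sensitivity(y_true, y_score))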
DOE Office of Scientific and Technical Information (OSTI.GOV)
McKemmish, Laura K., E-mail: laura.mckemmish@gmail.com; Research School of Chemistry, Australian National University, Canberra
Algorithms for the efficient calculation of two-electron integrals in the newly developed mixed ramp-Gaussian basis sets are presented, alongside a Fortran90 implementation of these algorithms, RAMPITUP. These new basis sets have significant potential to (1) give some speed-up (estimated at up to 20% for large molecules in fully optimised code) to general-purpose Hartree-Fock (HF) and density functional theory quantum chemistry calculations, replacing all-Gaussian basis sets, and (2) give very large speed-ups for calculations of core-dependent properties, such as electron density at the nucleus, NMR parameters, relativistic corrections, and total energies, replacing the current use of Slater basis functions or very large specialised all-Gaussian basis sets for these purposes. This initial implementation already demonstrates roughly 10% speed-ups in HF/R-31G calculations compared to HF/6-31G calculations for large linear molecules, demonstrating the promise of this methodology, particularly for the second application. As well as the reduction in the total primitive number in R-31G compared to 6-31G, this timing advantage can be attributed to the significant reduction in the number of mathematically complex intermediate integrals after modelling each ramp-Gaussian basis-function pair as a sum of ramps on a single atomic centre.
An Expanded Theoretical Framework of Care Coordination Across Transitions in Care Settings.
Radwin, Laurel E; Castonguay, Denise; Keenan, Carolyn B; Hermann, Cherice
2016-01-01
For many patients, high-quality, patient-centered, and cost-effective health care requires coordination among multiple clinicians and settings. Ensuring optimal care coordination requires a clear understanding of how clinician activities and continuity during transitions affect patient-centeredness and quality outcomes. This article describes an expanded theoretical framework to better understand care coordination. The framework provides clear articulation of concepts. Examples are provided of ways to measure the concepts.
Quantum Mechanical Calculations of Monoxides of Silicon Carbide Molecules
2003-03-01
[Tabulated data: final energies (hartree), electron affinities (eV), zero-point energies (hartree), and ZPE-corrected electron affinities for CO and linear O-C-Si at several charge and multiplicity combinations, computed with the DZV basis set.]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Okada, S.; Shinada, M.; Matsuoka, O.
1990-10-01
A systematic calculation of new relativistic Gaussian basis sets is reported. The new basis sets are similar to the previously reported ones [J. Chem. Phys. 91, 4193 (1989)], but in the present calculation the Breit interaction has been explicitly included alongside the Dirac–Coulomb Hamiltonian. The basis sets have been adopted for the calculation of the self-consistent field effect on the Breit interaction energies and are expected to be useful for studies of higher-order effects such as electron correlation and other quantum electrodynamical effects.
DOE Office of Scientific and Technical Information (OSTI.GOV)
De Jong, Wibe A.; Harrison, Robert J.; Dixon, David A.
A parallel implementation of the spin-free one-electron Douglas-Kroll(-Hess) Hamiltonian (DKH) in NWChem is discussed. An efficient and accurate method to calculate DKH gradients is introduced. It is shown that the use of standard (non-relativistic) contracted basis sets can produce erroneous results for elements beyond the first row. The generation of DKH-contracted cc-pVXZ (X = D, T, Q, 5) basis sets for H, He, B–Ne, Al–Ar, and Ga–Br is also discussed.
Heslop, Carl William; Burns, Sharyn; Lobo, Roanna; McConigley, Ruth
2017-01-01
Introduction: There is limited research examining community-based or multilevel interventions that address the sexual health of young people in the rural Australian context. This paper describes the Participatory Action Research (PAR) project that will develop and validate a framework for planning, implementing and evaluating multilevel community-based sexual health interventions for young people aged 16–24 years in the Australian rural setting. Methods and analysis: To develop a framework for sexual health interventions with stakeholders, PAR will be used. Three PAR cycles will be conducted, using semistructured one-on-one interviews, focus groups, community mapping and photovoice to inform the development of a draft framework. Cycles 2 and 3 will use targeted Delphi studies to gather evaluation and feedback on the draft framework. All data collected will be reviewed, analysed in detail, and coded as concepts become apparent at each stage of the process. Ethics and dissemination: This protocol describes a supervised doctoral research project. The project seeks to contribute to the literature on PAR in the rural setting and the use of the Delphi technique within PAR projects. The framework developed by the project will provide a foundation for further research testing its application in other settings and health areas. This research has received ethics approval from the Curtin University Human Research and Ethics Committee (HR96/2015). PMID:28559453
NASA Astrophysics Data System (ADS)
Sanchez, Marina; Provasi, Patricio F.; Aucar, Gustavo A.; Sauer, Stephan P. A.
Locally dense basis sets (