Combining Formal and Functional Approaches to Topic Structure
ERIC Educational Resources Information Center
Zellers, Margaret; Post, Brechtje
2012-01-01
Fragmentation between formal and functional approaches to prosodic variation is an ongoing problem in linguistic research. In particular, the frameworks of the Phonetics of Talk-in-Interaction (PTI) and Empirical Phonology (EP) take very different theoretical and methodological approaches to this kind of variation. We argue that it is fruitful to…
Systems, methods and apparatus for pattern matching in procedure development and verification
NASA Technical Reports Server (NTRS)
Hinchey, Michael G. (Inventor); Rouff, Christopher A. (Inventor); Rash, James L. (Inventor)
2011-01-01
Systems, methods and apparatus are provided through which, in some embodiments, a formal specification is pattern-matched from scenarios, the formal specification is analyzed, and flaws in the formal specification are corrected. The systems, methods and apparatus may include pattern-matching an equivalent formal model from an informal specification. Such a model can be analyzed for contradictions, conflicts, use of resources before the resources are available, competition for resources, and so forth. From such a formal model, an implementation can be automatically generated in a variety of notations. The approach can improve the resulting implementation, which, in some embodiments, is provably equivalent to the procedures described at the outset; this in turn can improve confidence that the system reflects the requirements, reduce system development time, and reduce the amount of testing required of a new system. Moreover, in some embodiments, two or more implementations can be "reversed" to appropriate formal models, the models can be combined, and the resulting combination checked for conflicts. Then, the combined, error-free model can be used to generate a new (single) implementation that combines the functionality of the original separate implementations, and may be more likely to be correct.
Formal Verification of Complex Systems based on SysML Functional Requirements
2014-12-23
Formal Verification of Complex Systems based on SysML Functional Requirements. Hoda Mehrpouyan, Irem Y. Tumer, Chris Hoyle, Dimitra Giannakopoulou ... requirements for design of complex engineered systems. The proposed approach combines a SysML modeling approach to document and structure safety requirements ... methods and tools to support the integration of safety into the design solution. 2.1 SysML for Complex Engineered Systems: Traditional methods and tools
NASA Astrophysics Data System (ADS)
Zhou, Shiqi
2004-07-01
A universal formalism, which enables calculation of the solvent-mediated potential (SMP) between two equal or unequal solute particles of any shape immersed in a solvent reservoir consisting of atomic particles and/or polymer chains or their mixture, is proposed by importing a density functional theory externally into the Ornstein-Zernike (OZ) equation system. Provided that the size asymmetry of the solvent bath components is moderate, the present formalism can calculate the SMP in any complex fluid at the present stage of development of statistical mechanics, and therefore avoids all of the limitations of previous approaches to the SMP. A preliminary calculation indicates the reliability of the present formalism.
VLBI-derived troposphere parameters during CONT08
NASA Astrophysics Data System (ADS)
Heinkelmann, R.; Böhm, J.; Bolotin, S.; Engelhardt, G.; Haas, R.; Lanotte, R.; MacMillan, D. S.; Negusini, M.; Skurikhina, E.; Titov, O.; Schuh, H.
2011-07-01
Time-series of zenith wet and total troposphere delays as well as north and east gradients are compared, and zenith total delays ( ZTD) are combined on the level of parameter estimates. Input data sets are provided by ten Analysis Centers (ACs) of the International VLBI Service for Geodesy and Astrometry (IVS) for the CONT08 campaign (12-26 August 2008). The inconsistent usage of meteorological data and models, such as mapping functions, causes systematics among the ACs, and differing parameterizations and constraints add noise to the troposphere parameter estimates. The empirical standard deviation of ZTD among the ACs with regard to an unweighted mean is 4.6 mm. The ratio of the analysis noise to the observation noise assessed by the operator/software impact (OSI) model is about 2.5. These and other effects have to be accounted for to improve the intra-technique combination of VLBI-derived troposphere parameters. While the largest systematics caused by inconsistent usage of meteorological data can be avoided and the application of different mapping functions can be considered by applying empirical corrections, the noise has to be modeled in the stochastic model of intra-technique combination. The application of different stochastic models shows no significant effects on the combined parameters but results in different mean formal errors: the mean formal errors of the combined ZTD are 2.3 mm (unweighted), 4.4 mm (diagonal), 8.6 mm [variance component (VC) estimation], and 8.6 mm (operator/software impact, OSI). On the one hand, the OSI model, i.e. the inclusion of off-diagonal elements in the cofactor-matrix, considers the reapplication of observations yielding a factor of about two for mean formal errors as compared to the diagonal approach. On the other hand, the combination based on VC estimation shows large differences among the VCs and exhibits a comparable scaling of formal errors. Thus, for the combination of troposphere parameters a combination of the two extensions of the stochastic model is recommended.
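The intra-technique combination described above amounts to a weighted least-squares estimate of a common ZTD with a cofactor matrix whose structure encodes the stochastic model. A minimal sketch, assuming purely illustrative ZTD values, variances, and correlation level (none taken from the CONT08 data):

```python
# Hypothetical sketch: combining one epoch of ZTD estimates from several
# Analysis Centers by weighted least squares with a cofactor matrix Q.
# The choices of Q mimic the stochastic models discussed above
# (identity ~ "unweighted", diagonal ~ per-AC variances, off-diagonal ~ OSI-like).
import numpy as np

ztd = np.array([2334.1, 2339.0, 2336.2, 2331.8, 2337.5])  # mm, illustrative values
A = np.ones((len(ztd), 1))                                 # design matrix: one common ZTD

def combine(y, Q):
    """Weighted LS combination; returns combined estimate and its formal error."""
    W = np.linalg.inv(Q)
    N = A.T @ W @ A
    x = np.linalg.solve(N, A.T @ W @ y)
    sigma = np.sqrt(np.linalg.inv(N)[0, 0])
    return x[0], sigma

Q_unweighted = np.eye(5)
Q_diagonal = np.diag([4.0, 9.0, 6.0, 5.0, 7.0])            # mm^2, per-AC variances
rho = 0.5                                                  # assumed correlation from observation reuse
Q_osi = Q_diagonal.copy()
for i in range(5):
    for j in range(5):
        if i != j:
            Q_osi[i, j] = rho * np.sqrt(Q_diagonal[i, i] * Q_diagonal[j, j])

for name, Q in [("unweighted", Q_unweighted), ("diagonal", Q_diagonal), ("OSI-like", Q_osi)]:
    est, err = combine(ztd, Q)
    print(f"{name:10s}: ZTD = {est:7.1f} mm, formal error = {err:4.2f} mm")
```

As in the study, the off-diagonal cofactors inflate the formal error of the combined value relative to the purely diagonal weighting.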
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duchemin, Ivan, E-mail: ivan.duchemin@cea.fr; Jacquemin, Denis; Institut Universitaire de France, 1 rue Descartes, 75005 Paris Cedex 5
We have implemented the polarizable continuum model within the framework of the many-body Green's function GW formalism for the calculation of electron addition and removal energies in solution. The present formalism includes both ground-state and non-equilibrium polarization effects. In addition, the polarization energies are state-specific, allowing one to obtain the bath-induced renormalisation energy of all occupied and virtual energy levels. Our implementation is validated by comparisons with ΔSCF calculations performed at both the density functional theory and coupled-cluster singles and doubles levels for solvated nucleobases. The present study opens the way to GW and Bethe-Salpeter calculations in disordered condensed phases of interest in organic optoelectronics, wet chemistry, and biology.
NASA Astrophysics Data System (ADS)
Nourali, Mahrouz; Ghahraman, Bijan; Pourreza-Bilondi, Mohsen; Davary, Kamran
2016-09-01
In the present study, DREAM(ZS), Differential Evolution Adaptive Metropolis combined with both formal and informal likelihood functions, is used to investigate the uncertainty of the parameters of the HEC-HMS model in the Tamar watershed, Golestan province, Iran. In order to assess the uncertainty of the 24 parameters used in HMS, three flood events were used to calibrate and one flood event was used to validate the posterior distributions. Moreover, the performance of seven different likelihood functions (L1-L7) was assessed by means of the DREAM(ZS) approach. Four likelihood functions (L1-L4), the Nash-Sutcliffe (NS) efficiency, normalized absolute error (NAE), index of agreement (IOA), and Chiew-McMahon efficiency (CM), are considered informal, whereas the remaining three (L5-L7) are formal. L5 builds on the relationship between traditional least-squares fitting and Bayesian inference, and L6 is a heteroscedastic maximum likelihood error (HMLE) estimator. Finally, in likelihood function L7, serial dependence of residual errors is accounted for using a first-order autoregressive (AR) model of the residuals. According to the results, the sensitivities of the parameters strongly depend on the likelihood function and vary across likelihood functions. Most of the parameters were better defined by the formal likelihood functions L5 and L7 and showed a high sensitivity to model performance. Posterior cumulative distributions corresponding to the informal likelihood functions L1, L2, L3, L4 and the formal likelihood function L6 are approximately the same for most of the sub-basins, and these likelihood functions have an almost identical effect on parameter sensitivity. The 95% total prediction uncertainty bounds bracketed most of the observed data. Considering all the statistical indicators and criteria of uncertainty assessment, including RMSE, KGE, NS, P-factor and R-factor, the results showed that the DREAM(ZS) algorithm performed better under the formal likelihood functions L5 and L7, but likelihood function L5 may result in biased and unreliable estimation of parameters due to violation of the residual-error assumptions. Thus, likelihood function L7 provides a credible posterior distribution of the model parameters and can therefore be employed for further applications.
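To make the distinction between informal and formal likelihood functions concrete, here is a minimal sketch contrasting the Nash-Sutcliffe efficiency (the study's L1) with a Gaussian log-likelihood whose residuals follow a first-order autoregressive model (in the spirit of L7). The toy hydrograph values and the error parameters sigma and phi are illustrative assumptions, not values from the study:

```python
# Hypothetical sketch of one informal and one formal likelihood, in the spirit
# of the L1 (Nash-Sutcliffe) and L7 (AR(1) residual-error) functions above.
import numpy as np

def nash_sutcliffe(obs, sim):
    """Informal likelihood: NS efficiency (1 = perfect fit)."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def gaussian_ar1_loglik(obs, sim, sigma, phi):
    """Formal conditional log-likelihood: Gaussian errors with AR(1) serial correlation
    (the first residual is treated as given)."""
    e = obs - sim                      # raw residuals
    nu = e[1:] - phi * e[:-1]          # innovations after removing AR(1) dependence
    n = len(nu)
    return -0.5 * n * np.log(2 * np.pi * sigma ** 2) - 0.5 * np.sum(nu ** 2) / sigma ** 2

# Toy hydrograph (illustrative numbers only)
obs = np.array([5.0, 8.0, 20.0, 35.0, 22.0, 12.0, 7.0])
sim = np.array([4.5, 9.0, 18.0, 33.0, 24.0, 11.0, 6.5])
print("NS efficiency       :", round(nash_sutcliffe(obs, sim), 3))
print("AR(1) log-likelihood:", round(gaussian_ar1_loglik(obs, sim, sigma=1.5, phi=0.3), 3))
```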
Composing, Analyzing and Validating Software Models
NASA Astrophysics Data System (ADS)
Sheldon, Frederick T.
1998-10-01
This research has been conducted at the Computational Sciences Division of the Information Sciences Directorate at Ames Research Center (Automated Software Engineering Group). The principal work this summer has been to review and refine the agenda that was carried forward from last summer. Formal specifications provide good support for designing a functionally correct system; however, they are weak at incorporating non-functional performance requirements (like reliability). Techniques which utilize stochastic Petri nets (SPNs) are good for evaluating the performance and reliability of a system, but they may be too abstract and cumbersome from the standpoint of specifying and evaluating functional behavior. Therefore, one major objective of this research is to provide an integrated approach to assist the user in specifying both functionality (qualitative: mutual exclusion and synchronization) and performance requirements (quantitative: reliability and execution deadlines). In this way, the merits of a powerful modeling technique for performability analysis (using SPNs) can be combined with a well-defined formal specification language. In doing so, we can come closer to providing a formal approach to designing a functionally correct system that meets reliability and performance goals.
Composing, Analyzing and Validating Software Models
NASA Technical Reports Server (NTRS)
Sheldon, Frederick T.
1998-01-01
This research has been conducted at the Computational Sciences Division of the Information Sciences Directorate at Ames Research Center (Automated Software Engineering Group). The principal work this summer has been to review and refine the agenda that was carried forward from last summer. Formal specifications provide good support for designing a functionally correct system; however, they are weak at incorporating non-functional performance requirements (like reliability). Techniques which utilize stochastic Petri nets (SPNs) are good for evaluating the performance and reliability of a system, but they may be too abstract and cumbersome from the standpoint of specifying and evaluating functional behavior. Therefore, one major objective of this research is to provide an integrated approach to assist the user in specifying both functionality (qualitative: mutual exclusion and synchronization) and performance requirements (quantitative: reliability and execution deadlines). In this way, the merits of a powerful modeling technique for performability analysis (using SPNs) can be combined with a well-defined formal specification language. In doing so, we can come closer to providing a formal approach to designing a functionally correct system that meets reliability and performance goals.
Performance Measurement and Analysis of Certain Search Algorithms
1979-05-01
methodology that combines experiment and analysis in complementary and highly specialized and formalized roles, and that the richness of the domains makes it ... it is difficult to determine what fraction of the observed differences between the two sets is due to bias in sample set 1, and what fraction simply ... given by its characteristic KMIN and KMAX functions. We posit a formal model of "knowledge" itself in which there are at least as many distinct "states
Combining formal and functional approaches to topic structure.
Zellers, Margaret; Post, Brechtje
2012-03-01
Fragmentation between formal and functional approaches to prosodic variation is an ongoing problem in linguistic research. In particular, the frameworks of the Phonetics of Talk-in-Interaction (PTI) and Empirical Phonology (EP) take very different theoretical and methodological approaches to this kind of variation. We argue that it is fruitful to adopt the insights of both PTI's qualitative analysis and EP's quantitative analysis and combine them into a multiple-methods approach. One realm in which it is possible to combine these frameworks is in the analysis of discourse topic structure and the prosodic cues relevant to it. By combining a quantitative and a qualitative approach to discourse topic structure, it is possible to give a better account of the observed variation in prosody, for example in the case of fundamental frequency (F0) peak timing, which can be explained in terms of pitch accent distribution over different topic structure categories. Similarly, local and global patterns in speech rate variation can be better explained and motivated by adopting insights from both PTI and EP in the study of topic structure. Combining PTI and EP can provide better accounts of speech data as well as opening up new avenues of investigation which would not have been possible in either approach alone.
Music acquisition: effects of enculturation and formal training on development.
Hannon, Erin E; Trainor, Laurel J
2007-11-01
Musical structure is complex, consisting of a small set of elements that combine to form hierarchical levels of pitch and temporal structure according to grammatical rules. As with language, different systems use different elements and rules for combination. Drawing on recent findings, we propose that music acquisition begins with basic features, such as peripheral frequency-coding mechanisms and multisensory timing connections, and proceeds through enculturation, whereby everyday exposure to a particular music system creates, in a systematic order of acquisition, culture-specific brain structures and representations. Finally, we propose that formal musical training invokes domain-specific processes that affect salience of musical input and the amount of cortical tissue devoted to its processing, as well as domain-general processes of attention and executive functioning.
Combining Formal, Non-Formal and Informal Learning for Workforce Skill Development
ERIC Educational Resources Information Center
Misko, Josie
2008-01-01
This literature review, undertaken for Australian Industry Group, shows how multiple variations and combinations of formal, informal and non-formal learning, accompanied by various government incentives and organisational initiatives (including job redesign, cross-skilling, multi-skilling, diversified career pathways, action learning projects,…
Fisz, Jacek J
2006-12-07
An optimization approach based on the genetic algorithm (GA) combined with the multiple linear regression (MLR) method is discussed. The GA-MLR optimizer is designed for nonlinear least-squares problems in which the model functions are linear combinations of nonlinear functions. GA optimizes the nonlinear parameters, and the linear parameters are calculated from MLR. GA-MLR is an intuitive optimization approach and it exploits all the advantages of the genetic algorithm technique. This optimization method results from an appropriate combination of two well-known optimization methods. The MLR method is embedded in the GA optimizer, and linear and nonlinear model parameters are optimized in parallel. The MLR method is the only strictly mathematical "tool" involved in GA-MLR. The GA-MLR approach simplifies and considerably accelerates the optimization process because the linear parameters are not among the fitted ones. Its properties are exemplified by the analysis of a kinetic biexponential fluorescence decay surface corresponding to a two-excited-state interconversion process. A short discussion of the variable projection (VP) algorithm, designed for the same class of optimization problems, is presented. VP is a very advanced mathematical formalism that involves the methods of nonlinear functionals, the algebra of linear projectors, and the formalism of Fréchet derivatives and pseudo-inverses. Additional explanatory comments are added on the application of the recently introduced GA-NR optimizer to the simultaneous recovery of linear and weakly nonlinear parameters occurring in the same optimization problem together with nonlinear parameters. The GA-NR optimizer combines the GA method with the NR method, in which the minimum-value condition for the quadratic approximation to chi(2), obtained from the Taylor series expansion of chi(2), is recovered by means of the Newton-Raphson algorithm. The application of the GA-NR optimizer to model functions which are multi-linear combinations of nonlinear functions is indicated. The VP algorithm does not distinguish the weakly nonlinear parameters from the nonlinear ones and it does not apply to model functions which are multi-linear combinations of nonlinear functions.
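A minimal sketch of the separable GA-MLR idea for a biexponential decay: each trial pair of nonlinear rate constants is scored by solving for the linear amplitudes with linear least squares (the MLR step). The random search below is only a stand-in for the genetic algorithm, and all data values and parameter ranges are made up for illustration:

```python
# Minimal sketch of the GA-MLR idea for a biexponential decay
# y(t) = a1*exp(-k1*t) + a2*exp(-k2*t): a random-search stand-in plays the
# role of the GA for the nonlinear rates (k1, k2); the linear amplitudes
# (a1, a2) are recovered by linear least squares at every trial.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 5.0, 60)
y = 2.0 * np.exp(-0.7 * t) + 0.5 * np.exp(-3.0 * t)   # synthetic, noise-free data

def mlr_chi2(k1, k2):
    """Given trial nonlinear rates, solve for the linear amplitudes and return chi^2."""
    X = np.column_stack([np.exp(-k1 * t), np.exp(-k2 * t)])
    amps, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ amps
    return np.sum(resid ** 2), amps

best = (np.inf, None, None)
for _ in range(5000):                                  # stand-in for GA generations
    k1, k2 = sorted(rng.uniform(0.1, 5.0, size=2))
    chi2, amps = mlr_chi2(k1, k2)
    if chi2 < best[0]:
        best = (chi2, (k1, k2), amps)

print("best rates     :", np.round(best[1], 3))
print("MLR amplitudes :", np.round(best[2], 3))
print("chi^2          :", best[0])
```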
Hybrid Theory of P-Wave Electron-Hydrogen Elastic Scattering
NASA Technical Reports Server (NTRS)
Bhatia, Anand
2012-01-01
We report on a study of electron-hydrogen scattering, using a combination of a modified method of polarized orbitals and the optical potential formalism. The calculation is restricted to P waves in the elastic region, where the correlation functions are of Hylleraas type. It is found that the phase shifts are not significantly affected by the modification of the target function by a method similar to the method of polarized orbitals, and they are close to the phase shifts calculated earlier by Bhatia. This indicates that the correlation function is general enough to include the target distortion (polarization) in the presence of the incident electron. The important fact is that in the present calculation, to obtain similar results, only a 35-term correlation function is needed in the wave function, compared to the 220-term wave function required in the above-mentioned previous calculation. Results for the phase shifts, obtained in the present hybrid formalism, are rigorous lower bounds to the exact phase shifts.
Peeters, José M; Pot, Anne Margriet; de Lange, Jacomine; Spreeuwenberg, Peter M; Francke, Anneke L
2016-03-09
In the Netherlands, various organisational models of dementia case management exist. In this study the following four models are distinguished, based on differences in the availability of the service and in the case management function: Model 1: the case management service is available from first dementia symptoms + is always a separate specialist function; Model 2: the case management service is only available after a formal dementia diagnosis + is always a separate specialist function; Model 3: the case management service is available from first dementia symptoms + is often a combined function; Model 4: the case management service is only available after a formal dementia diagnosis + is often a combined function. The objectives of this study are to give insight into whether satisfaction with dementia case management and the development of caregiver burden depend on the organisational model. A survey was carried out in regional dementia care networks in the Netherlands among 554 informal carers for people with dementia at the start of case management (response of 85 %), and one year later. Descriptive statistics and multilevel models were used to analyse the data. The satisfaction with the case manager was high in general (an average of 8.0 within a possible range of 1 to 10), although the caregiver burden did not decrease in the first year after starting with case management. No differences were found between the four organisational models regarding the development of caregiver burden. However, statistically significant differences (p < 0.05) were found regarding satisfaction: informal carers in the organisational model where case management is only available after formal diagnosis of dementia and is often a combined function had on average the lowest satisfaction scores. Nevertheless, the satisfaction of informal carers within all organisational models was high (ranging from 7.51 to 8.40 within a range of 1 to 10). Organisational features of case management seem to make little or no difference to the development in caregiver burden and the satisfaction of informal carers. Future research is needed to explore whether the individual characteristics of the case managers themselves are associated with case management outcomes.
Bridging single and multireference coupled cluster theories with universal state selective formalism
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhaskaran-Nair, Kiran; Kowalski, Karol
2013-05-28
The universal state selective (USS) multireference approach is used to construct new energy functionals which offer a unique possibility of bridging single and multireference coupled cluster theories (SR/MRCC). These functionals, which can be used to develop iterative and non-iterative approaches, utilize a special form of the trial wavefunctions, which assure additive separability (or size-consistency) of the USS energies in the non-interacting subsystem limit. When the USS formalism is combined with approximate SRCC theories, the resulting formalism can be viewed as a size-consistent version of the method of moments of coupled cluster equations (MMCC) employing a MRCC trial wavefunction. Special cases of the USS formulations, which utilize single reference state specific CC (V.V. Ivanov, D.I. Lyakh, L. Adamowicz, Phys. Chem. Chem. Phys. 11, 2355 (2009)) and tailored CC (T. Kinoshita, O. Hino, R.J. Bartlett, J. Chem. Phys. 123, 074106 (2005)) expansions, are also discussed.
F-Nets and Software Cabling: Deriving a Formal Model and Language for Portable Parallel Programming
NASA Technical Reports Server (NTRS)
DiNucci, David C.; Saini, Subhash (Technical Monitor)
1998-01-01
Parallel programming is still being based upon antiquated sequence-based definitions of the terms "algorithm" and "computation", resulting in programs which are architecture dependent and difficult to design and analyze. By focusing on obstacles inherent in existing practice, a more portable model is derived here, which is then formalized into a model called Soviets which utilizes a combination of imperative and functional styles. This formalization suggests more general notions of algorithm and computation, as well as insights into the meaning of structured programming in a parallel setting. To illustrate how these principles can be applied, a very-high-level graphical architecture-independent parallel language, called Software Cabling, is described, with many of the features normally expected from today's computer languages (e.g. data abstraction, data parallelism, and object-based programming constructs).
Symmetries of the Space of Linear Symplectic Connections
NASA Astrophysics Data System (ADS)
Fox, Daniel J. F.
2017-01-01
There is constructed a family of Lie algebras that act in a Hamiltonian way on the symplectic affine space of linear symplectic connections on a symplectic manifold. The associated equivariant moment map is a formal sum of the Cahen-Gutt moment map, the Ricci tensor, and a translational term. The critical points of a functional constructed from it interpolate between the equations for preferred symplectic connections and the equations for critical symplectic connections. The commutative algebra of formal sums of symmetric tensors on a symplectic manifold carries a pair of compatible Poisson structures, one induced from the canonical Poisson bracket on the space of functions on the cotangent bundle polynomial in the fibers, and the other induced from the algebraic fiberwise Schouten bracket on the symmetric algebra of each fiber of the cotangent bundle. These structures are shown to be compatible, and the required Lie algebras are constructed as central extensions of their linear combinations restricted to formal sums of symmetric tensors whose first order term is a multiple of the differential of its zeroth order term.
Poss, Jeffrey W; Hirdes, John P; Fries, Brant E; McKillop, Ian; Chase, Mary
2008-04-01
The case-mix system Resource Utilization Groups version III for Home Care (RUG-III/HC) was derived using a modest data sample from Michigan, but to date no comprehensive large scale validation has been done. This work examines the performance of the RUG-III/HC classification using a large sample from Ontario, Canada. Cost episodes over a 13-week period were aggregated from individual level client billing records and matched to assessment information collected using the Resident Assessment Instrument for Home Care, from which classification rules for RUG-III/HC are drawn. The dependent variable, service cost, was constructed using formal services plus informal care valued at approximately one-half that of a replacement worker. An analytic dataset of 29,921 episodes showed a skewed distribution with over 56% of cases falling into the lowest hierarchical level, reduced physical functions. Case-mix index values for formal and informal cost showed very close similarities to those found in the Michigan derivation. Explained variance for a function of combined formal and informal cost was 37.3% (20.5% for formal cost alone), with personal support services as well as informal care showing the strongest fit to the RUG-III/HC classification. RUG-III/HC validates well compared with the Michigan derivation work. Potential enhancements to the present classification should consider the large numbers of undifferentiated cases in the reduced physical function group, and the low explained variance for professional disciplines.
Fernández, J J; Tablero, C; Wahnón, P
2004-06-08
In this paper we present an analysis of the convergence of the band structure properties, particularly the influence on the modification of the bandgap and bandwidth values in half metallic compounds by the use of the exact exchange formalism. This formalism for general solids has been implemented using a localized basis set of numerical functions to represent the exchange density. The implementation has been carried out using a code which uses a linear combination of confined numerical pseudoatomic functions to represent the Kohn-Sham orbitals. The application of this exact exchange scheme to a half-metallic semiconductor compound, in particular to Ga(4)P(3)Ti, a promising material in the field of high efficiency solar cells, confirms the existence of the isolated intermediate band in this compound. (c) 2004 American Institute of Physics.
Combining Trust and Behavioral Analysis to Detect Security Threats in Open Environments
2010-11-01
behavioral feature values. This would provide a baseline notional object trust and is formally defined as follows: T_O(1) ∈ [0, 1] = ∑_{0,n:ν_bt} w_tP(S) (8) ... T_O(2) ∈ [0, 1] = ∑ w_tP(S) · identity(O, P) (9) ... respectively. The w_tP weight function determines the significance of a particular behavioral feature in the final trust calculation. Note that the weight
Construction of even and odd combinations of Morse-like coherent states
NASA Astrophysics Data System (ADS)
Récamier, José; Jáuregui, Rocio
2003-06-01
In this work we construct approximate coherent states for the Morse potential using a method inspired by the f-oscillator formalism (Man'ko et al 1996 Proc. 4th Wigner Symp. ed N M Atakishiyev, T H Seligman and K B Wolf (Singapore: World Scientific) p 421). We make even and odd combinations of these states and evaluate the temporal evolution of the position operator and its dispersion as a function of time when the states evolve under a nonlinear Morse Hamiltonian.
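For reference, the even and odd combinations have the standard Schrödinger-cat form shown below; the normalization given is the one for ordinary Glauber coherent states and is only indicative for the Morse-like (f-deformed) states, whose overlap ⟨α|−α⟩ differs:

\[
|\alpha_\pm\rangle = N_\pm\big(|\alpha\rangle \pm |{-\alpha}\rangle\big),
\qquad
N_\pm = \Big[\,2\big(1 \pm \langle\alpha|{-\alpha}\rangle\big)\Big]^{-1/2},
\]

with \(\langle\alpha|{-\alpha}\rangle = e^{-2|\alpha|^2}\) in the ordinary (undeformed) case.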
Neural system modeling and simulation using Hybrid Functional Petri Net.
Tang, Yin; Wang, Fei
2012-02-01
The Petri net formalism has been proved to be powerful in biological modeling. It not only boasts of a most intuitive graphical presentation but also combines the methods of classical systems biology with the discrete modeling technique. Hybrid Functional Petri Net (HFPN) was proposed specially for biological system modeling. An array of well-constructed biological models using HFPN yielded very interesting results. In this paper, we propose a method to represent neural system behavior, where biochemistry and electrical chemistry are both included using the Petri net formalism. We built a model for the adrenergic system using HFPN and employed quantitative analysis. Our simulation results match the biological data well, showing that the model is very effective. Predictions made on our model further manifest the modeling power of HFPN and improve the understanding of the adrenergic system. The file of our model and more results with their analysis are available in our supplementary material.
NASA Astrophysics Data System (ADS)
Nielsen, N. K.; Quaade, U. J.
1995-07-01
The physical phase space of the relativistic top, as defined by Hansson and Regge, is expressed in terms of canonical coordinates of the Poincaré group manifold. The system is described in the Hamiltonian formalism by the mass-shell condition and constraints that reduce the number of spin degrees of freedom. The constraints are second class and are modified into a set of first class constraints by adding combinations of gauge-fixing functions. The Batalin-Fradkin-Vilkovisky method is then applied to quantize the system in the path integral formalism in Hamiltonian form. It is finally shown that different gauge choices produce different equivalent forms of the constraints.
Bellavance, Gabriel; Barriault, Louis
2014-06-23
The remarkable biological activities of polyprenylated polycyclic acylphloroglucinols (PPAPs), combined with their highly decorated bicyclo[3.3.1]nonane-2,4,9-trione frameworks, have inspired synthetic organic chemists over the last decade. The concise total syntheses of four natural PPAPs, hyperforin and papuaforins A-C, and the formal synthesis of nemorosone are reported. Key to the realization of this strategy is the short and scalable synthesis of densely substituted PPAP scaffolds through a gold(I)-catalyzed 6-endo-dig carbocyclization of cyclic enol ethers for late-stage functionalization. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Quantifying similarity in reliability surfaces using the probability of agreement
Stevens, Nathaniel T.; Anderson-Cook, Christine Michaela
2017-03-30
When separate populations exhibit similar reliability as a function of multiple explanatory variables, combining them into a single population is tempting. This can simplify future predictions and reduce uncertainty associated with estimation. However, combining these populations may introduce bias if the underlying relationships are in fact different. The probability of agreement formally and intuitively quantifies the similarity of estimated reliability surfaces across a two-factor input space. An example from the reliability literature demonstrates the utility of the approach when deciding whether to combine two populations or to keep them as distinct. As a result, new graphical summaries provide strategies for visualizing the results.
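Schematically, and with notation chosen here rather than taken from the paper, the probability of agreement at a point x of the two-factor input space can be written as

\[
p_A(\mathbf{x}) \;=\; \Pr\big(\,|R_1(\mathbf{x}) - R_2(\mathbf{x})| \le \delta\,\big),
\]

where \(R_1\) and \(R_2\) are the estimated reliability surfaces of the two populations and \(\delta\) is a user-chosen margin of practical equivalence; values of \(p_A\) near 1 across the input space support combining the populations.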
Quantifying similarity in reliability surfaces using the probability of agreement
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stevens, Nathaniel T.; Anderson-Cook, Christine Michaela
When separate populations exhibit similar reliability as a function of multiple explanatory variables, combining them into a single population is tempting. This can simplify future predictions and reduce uncertainty associated with estimation. However, combining these populations may introduce bias if the underlying relationships are in fact different. The probability of agreement formally and intuitively quantifies the similarity of estimated reliability surfaces across a two-factor input space. An example from the reliability literature demonstrates the utility of the approach when deciding whether to combine two populations or to keep them as distinct. As a result, new graphical summaries provide strategies for visualizing the results.
NASA Technical Reports Server (NTRS)
Bhatia, A. K.
2012-01-01
The P-wave hybrid theory of electron-hydrogen elastic scattering [Phys. Rev. A 85, 052708 (2012)] is applied to P-wave scattering from the He ion. In this method, both short-range and long-range correlations are included in the Schroedinger equation at the same time, by using a combination of a modified method of polarized orbitals and the optical potential formalism. The short-range correlation functions are of Hylleraas type. It is found that the phase shifts are not significantly affected by the modification of the target function by a method similar to the method of polarized orbitals, and they are close to the phase shifts calculated earlier by Bhatia [Phys. Rev. A 69, 032714 (2004)]. This indicates that the correlation function is general enough to include the target distortion (polarization) in the presence of the incident electron. The important fact is that in the present calculation, to obtain similar results, only a 20-term correlation function is needed in the wave function, compared to the 220-term wave function required in the above-mentioned calculation. Results for the phase shifts, obtained in the present hybrid formalism, are rigorous lower bounds to the exact phase shifts. The lowest P-wave resonances in the He atom and the hydrogen ion have been calculated and compared with the results obtained using the Feshbach projection operator formalism [Phys. Rev. A 11, 2018 (1975)]. It is concluded that accurate resonance parameters can be obtained by the present method, which has the advantage of including corrections due to neighboring resonances, bound states, and the continuum in which these resonances are embedded.
NASA Technical Reports Server (NTRS)
Narkawicz, Anthony J.; Munoz, Cesar A.
2014-01-01
Sturm's Theorem is a well-known result in real algebraic geometry that provides a function that computes the number of roots of a univariate polynomial in a semi-open interval. This paper presents a formalization of this theorem in the PVS theorem prover, as well as a decision procedure that checks whether a polynomial is always positive, nonnegative, nonzero, negative, or nonpositive on any input interval. The soundness and completeness of the decision procedure are proven in PVS. The procedure and its correctness properties enable the implementation of a PVS strategy for automatically proving existential and universal univariate polynomial inequalities. Since the decision procedure is formally verified in PVS, the soundness of the strategy depends solely on the internal logic of PVS rather than on an external oracle. The procedure itself uses a combination of Sturm's Theorem, an interval bisection procedure, and the fact that a polynomial with exactly one root in a bounded interval is always nonnegative on that interval if and only if it is nonnegative at both endpoints.
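As an illustration of the classical result being formalized (not of the PVS development itself), here is a minimal sketch of Sturm-sequence root counting; the polynomial representation and helper names are chosen for this example:

```python
# Minimal sketch of Sturm's theorem for counting real roots in (a, b].
# Polynomials are coefficient lists, highest degree first (illustrative helpers,
# not the PVS formalization described above).
import numpy as np

def sturm_sequence(p):
    """Build the Sturm sequence: p0, p1 = p0', then negated polynomial remainders."""
    seq = [np.poly1d(p), np.poly1d(p).deriv()]
    while seq[-1].order > 0 or seq[-1].coeffs[0] != 0:
        _, rem = np.polydiv(seq[-2].coeffs, seq[-1].coeffs)
        rem = np.poly1d(-rem)
        if np.allclose(rem.coeffs, 0):
            break
        seq.append(rem)
    return seq

def sign_changes(seq, x):
    """Count sign changes in the sequence evaluated at x (zeros are skipped)."""
    vals = [float(q(x)) for q in seq]
    signs = [v for v in vals if abs(v) > 1e-12]
    return sum(1 for a, b in zip(signs, signs[1:]) if a * b < 0)

def count_roots(p, a, b):
    """Number of distinct real roots of p in the half-open interval (a, b]."""
    seq = sturm_sequence(p)
    return sign_changes(seq, a) - sign_changes(seq, b)

# Example: x^3 - 3x has roots -sqrt(3), 0, sqrt(3); two of them lie in (-1, 2].
print(count_roots([1, 0, -3, 0], -1.0, 2.0))   # -> 2
```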
Longitudinal Associations Between Formal Volunteering and Cognitive Functioning.
Proulx, Christine M; Curl, Angela L; Ermer, Ashley E
2018-03-02
The present study examines the association between formal volunteering and cognitive functioning over time. We also examine the moderating roles of race, sex, education, and time. Using 11,100 participants aged 51 years and older and nine waves of data from the Health and Retirement Study, we simultaneously modeled the longitudinal associations between engaging in formal volunteering and changes in cognitive functioning using multilevel models. Formal volunteering was associated with higher levels of cognitive functioning over time, especially with aspects of cognitive functioning related to working memory and processing. This association was stronger for women than it was for men, and for those with below-average levels of education. The positive association between formal volunteering and cognitive functioning weakened over time when cognitive functioning was conceptualized as memory, but strengthened over time when conceptualized as working memory and processing. Volunteering is a productive activity that is beneficial not just to society, but to volunteers' levels of cognitive functioning in older age. For women and those with lower levels of education, formal volunteering appears particularly beneficial to working memory and processing. © The Author 2017. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved.
A Strategy for Language Assessment of Young Children: A Combination of Two Approaches.
ERIC Educational Resources Information Center
Kelly, Donna J.; Rice, Mabel L.
1986-01-01
A proposed strategy for language assessment advocates a combination of descriptive and formal assessment measures. This approach involves a parent-clinician interview, parent-child observations, clinician-directed formal and nonformal assessment procedures, and a parent-clinician interpretation. An elaborated sample of language assessment is…
Planning Non-Formal Education Curricula: The Case of Israel.
ERIC Educational Resources Information Center
Keller, Diana; Dror, Ilana
This paper compares the formal and non-formal education systems currently operating in Israel, describing the special features of curriculum planning in non-formal education. The central argument is that the non-formal education system fulfills functions that constitute a critique of the formal education system. The non-formal system offers the…
Boman, Inga-Lill; Persson, Ann-Christine; Bartfai, Aniko
2016-03-07
This project, Smart Assisted Living involving Informal careGivers++ (SALIG), intends to develop an ICT-based device for persons with cognitive impairment combined with remote support possibilities for significant others and formal caregivers. This paper presents the identification of the target groups' needs and requirements for such a device and the evaluation of the first mock-up, demonstrated on a tablet. The inclusive design method, which includes end-users in the design process, was chosen. First, a scoping review was conducted in order to examine the target group's need for an ICT-based device, and to gather recommendations regarding its design and functionalities. In order to capture the users' requirements for the design and functionalities of the device, three targeted focus groups were conducted. Based on the findings from the publications and the focus groups, a user requirement specification was developed. After that, a design concept and a first mock-up were developed in an iterative process. The mock-up was evaluated through interviews with persons with cognitive impairment, health care professionals and significant others. Data were analysed using content analysis. Several useful recommendations for the design and functionalities of the SALIG device for persons with cognitive impairment were identified. The main benefit of the mock-up was that it was a single device with a set of functionalities installed on a tablet and designed for persons with cognitive impairment. An additional benefit was that it could be used remotely by significant others and formal caregivers. The SALIG device has the potential to facilitate everyday life for persons with cognitive impairment and their significant others, and the work situation for formal caregivers. The results may provide guidance in the development of different types of technologies for the target population and for people with diverse disabilities. Further work will focus on developing a prototype to be empirically tested by persons with cognitive impairment, their significant others and formal caregivers.
NASA Astrophysics Data System (ADS)
Yoshimoto, Yuta; Li, Zhen; Kinefuchi, Ikuya; Karniadakis, George Em
2017-12-01
We propose a new coarse-grained (CG) molecular simulation technique based on the Mori-Zwanzig (MZ) formalism along with the iterative Boltzmann inversion (IBI). Non-Markovian dissipative particle dynamics (NMDPD) taking into account memory effects is derived in a pairwise interaction form from the MZ-guided generalized Langevin equation. It is based on the introduction of auxiliary variables that allow for the replacement of a non-Markovian equation with a Markovian one in a higher dimensional space. We demonstrate that the NMDPD model exploiting MZ-guided memory kernels can successfully reproduce the dynamic properties such as the mean square displacement and velocity autocorrelation function of a Lennard-Jones system, as long as the memory kernels are appropriately evaluated based on the Volterra integral equation using the force-velocity and velocity-velocity correlations. Furthermore, we find that the IBI correction of a pair CG potential significantly improves the representation of static properties characterized by a radial distribution function and pressure, while it has little influence on the dynamic processes. Our findings suggest that combining the advantages of both the MZ formalism and IBI leads to an accurate representation of both the static and dynamic properties of microscopic systems that exhibit non-Markovian behavior.
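For reference, the iterative Boltzmann inversion step referred to above is commonly written as the update below; the notation (iteration index i, target radial distribution function g*) is standard but not taken verbatim from this paper:

\[
V_{i+1}(r) \;=\; V_i(r) \;+\; k_B T \,\ln\!\frac{g_i(r)}{g^{*}(r)},
\]

where \(g_i(r)\) is the radial distribution function produced by the current coarse-grained potential \(V_i\) and \(g^{*}(r)\) is the target (microscopic) radial distribution function; the iteration stops once \(g_i \approx g^{*}\).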
NASA Technical Reports Server (NTRS)
Chelton, Dudley B.; Schlax, Michael G.
1994-01-01
A formalism is presented for determining the wavenumber-frequency transfer function associated with an irregularly sampled multidimensional dataset. This transfer function reveals the filtering characteristics and aliasing patterns inherent in the sample design. In combination with information about the spectral characteristics of the signal, the transfer function can be used to quantify the spatial and temporal resolution capability of the dataset. Application of the method to idealized Geosat altimeter data (i.e., neglecting measurement errors and data dropouts) concludes that the Geosat orbit configuration is capable of resolving scales of about 3 deg in latitude and longitude by about 30 days.
Defect and grain boundary scattering in tungsten: A combined theoretical and experimental study
NASA Astrophysics Data System (ADS)
Lanzillo, Nicholas A.; Dixit, Hemant; Milosevic, Erik; Niu, Chengyu; Carr, Adra V.; Oldiges, Phil; Raymond, Mark V.; Cho, Jin; Standaert, Theodorus E.; Kamineni, Vimal K.
2018-04-01
Several major electron scattering mechanisms in tungsten (W) are evaluated using a combination of first-principles density functional theory, a Non-Equilibrium Green's Function formalism, and thin film Kelvin 4-point sheet resistance measurements. The impact of grain boundary scattering is found to be roughly an order of magnitude larger than the impact of defect scattering. Ab initio simulations predict average grain boundary reflection coefficients for a number of twin grain boundaries to lie in the range r = 0.47 to r = 0.62, while experimental data can be fit to the empirical Mayadas-Schatzkes model with a comparable but slightly larger value of r = 0.69. The experimental and simulation data for grain boundary resistivity as a function of grain size show excellent agreement. These results provide crucial insights for understanding the impact of scaling of W-based contacts between active devices and back-end-of-line interconnects in next-generation semiconductor technology.
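For orientation, the grain-boundary-only limit of the Mayadas-Schatzkes model mentioned above is usually quoted in the textbook form below; the symbols (bulk resistivity ρ0, electron mean free path λ, grain size d, reflection coefficient R) are generic and not necessarily the exact parameterization used in the paper:

\[
\frac{\rho_0}{\rho_{\mathrm{GB}}} \;=\; 3\left[\frac{1}{3} - \frac{\alpha}{2} + \alpha^2 - \alpha^3 \ln\!\left(1+\frac{1}{\alpha}\right)\right],
\qquad
\alpha \;=\; \frac{\lambda}{d}\,\frac{R}{1-R},
\]

so that fitting measured resistivity versus grain size, as done above, amounts to extracting the single parameter R.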
Hentschinski, M; Kusina, A; Kutak, K; Serino, M
2018-01-01
We calculate the transverse momentum dependent gluon-to-gluon splitting function within [Formula: see text]-factorization, generalizing the framework employed in the calculation of the quark splitting functions in Hautmann et al. (Nucl Phys B 865:54-66, arXiv:1205.1759, 2012), Gituliar et al. (JHEP 01:181, arXiv:1511.08439, 2016), Hentschinski et al. (Phys Rev D 94(11):114013, arXiv:1607.01507, 2016) and demonstrate at the same time the consistency of the extended formalism with previous results. While existing versions of [Formula: see text] factorized evolution equations contain already a gluon-to-gluon splitting function i.e. the leading order Balitsky-Fadin-Kuraev-Lipatov (BFKL) kernel or the Ciafaloni-Catani-Fiorani-Marchesini (CCFM) kernel, the obtained splitting function has the important property that it reduces both to the leading order BFKL kernel in the high energy limit, to the Dokshitzer-Gribov-Lipatov-Altarelli-Parisi (DGLAP) gluon-to-gluon splitting function in the collinear limit as well as to the CCFM kernel in the soft limit. At the same time we demonstrate that this splitting kernel can be obtained from a direct calculation of the QCD Feynman diagrams, based on a combined implementation of the Curci-Furmanski-Petronzio formalism for the calculation of the collinear splitting functions and the framework of high energy factorization.
Schwinger-Keldysh diagrammatics for primordial perturbations
NASA Astrophysics Data System (ADS)
Chen, Xingang; Wang, Yi; Xianyu, Zhong-Zhi
2017-12-01
We present a systematic introduction to the diagrammatic method for practical calculations in inflationary cosmology, based on Schwinger-Keldysh path integral formalism. We show in particular that the diagrammatic rules can be derived directly from a classical Lagrangian even in the presence of derivative couplings. Furthermore, we use a quasi-single-field inflation model as an example to show how this formalism, combined with the trick of mixed propagator, can significantly simplify the calculation of some in-in correlation functions. The resulting bispectrum includes the lighter scalar case (m<3H/2) that has been previously studied, and the heavier scalar case (m>3H/2) that has not been explicitly computed for this model. The latter provides a concrete example of quantum primordial standard clocks, in which the clock signals can be observably large.
2013-01-01
Background: To formulate sustainable long-term care policies, it is critical first to understand the relationship between informal care and formal care expenditure. The aim of this paper is to examine to what extent informal care reduces public expenditure on elderly care. Methods: Data from a geriatric rehabilitation program conducted in Finland (Age Study, n = 732) were used to estimate the annual public care expenditure on elderly care. We first constructed hierarchical multilevel regression models to determine the factors associated with elderly care expenditure. Second, we calculated the adjusted mean costs of care in four care patterns: 1) informal care only for elderly living alone; 2) informal care only from a co-resident family member; 3) a combination of formal and informal care; and 4) formal care only. We included functional independence and health-related quality of life (15D score) measures into our models. This method standardizes the care needs of a heterogeneous subject group and enabled us to compare expenditure among various care categories even when differences were observed in the subjects’ physical health. Results: Elder care that consisted of formal care only had the highest expenditure at 25,300 Euros annually. The combination of formal and informal care had an annual expenditure of 22,300 Euros. If a person received mainly informal care from a co-resident family member, then the annual expenditure was only 4,900 Euros and just 6,000 Euros for a person living alone and receiving informal care. Conclusions: Our analysis of a frail elderly Finnish population shows that the availability of informal care considerably reduces public care expenditure. Therefore, informal care should be taken into account when formulating policies for long-term care. The process whereby families choose to provide care for their elderly relatives has a significant impact on long-term care expenditure. PMID:23947622
Implementing NLO DGLAP evolution in parton showers
Hoche, Stefan; Krauss, Frank; Prestel, Stefan
2017-10-13
Here, we present a parton shower which implements the DGLAP evolution of parton densities and fragmentation functions at next-to-leading order precision up to effects stemming from local four-momentum conservation. The Monte-Carlo simulation is based on including next-to-leading order collinear splitting functions in an existing parton shower and combining their soft enhanced contributions with the corresponding terms at leading order. Soft double counting is avoided by matching to the soft eikonal. Example results from two independent realizations of the algorithm, implemented in the two event generation frameworks Pythia and Sherpa, illustrate the improved precision of the new formalism.
Implementing NLO DGLAP evolution in parton showers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Höche, Stefan; Krauss, Frank; Prestel, Stefan
2017-10-01
We present a parton shower which implements the DGLAP evolution of parton densities and fragmentation functions at next-to-leading order precision up to effects stemming from local four-momentum conservation. The Monte-Carlo simulation is based on including next-to-leading order collinear splitting functions in an existing parton shower and combining their soft enhanced contributions with the corresponding terms at leading order. Soft double counting is avoided by matching to the soft eikonal. Example results from two independent realizations of the algorithm, implemented in the two event generation frameworks Pythia and Sherpa, illustrate the improved precision of the new formalism.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ritboon, Atirach, E-mail: atirach.3.14@gmail.com; Department of Physics, Faculty of Science, Prince of Songkla University, Hat Yai 90112; Daengngam, Chalongrat, E-mail: chalongrat.d@psu.ac.th
2016-08-15
Białynicki-Birula introduced a photon wave function similar to the matter wave function that satisfies the Schrödinger equation. Its second-quantization form can be applied to investigate nonlinear optics at a nearly full quantum level. In this paper, we applied the photon wave function formalism to analyze both linear optical processes in the well-known Mach–Zehnder interferometer and nonlinear optical processes for sum-frequency generation in a dispersive and lossless medium. Results obtained with the photon wave function formalism agree with the well-established Maxwell treatments and existing experimental verifications.
Supporting Professional Learning in a Massive Open Online Course
ERIC Educational Resources Information Center
Milligan, Colin; Littlejohn, Allison
2014-01-01
Professional learning, combining formal and on the job learning, is important for the development and maintenance of expertise in the modern workplace. To integrate formal and informal learning, professionals have to have good self-regulatory ability. Formal learning opportunities are opening up through massive open online courses (MOOCs),…
Charlesworth, Jac C; Peralta, Juan M; Drigalenko, Eugene; Göring, Harald Hh; Almasy, Laura; Dyer, Thomas D; Blangero, John
2009-12-15
Gene identification using linkage, association, or genome-wide expression is often underpowered. We propose that formal combination of information from multiple gene-identification approaches may lead to the identification of novel loci that are missed when only one form of information is available. Firstly, we analyze the Genetic Analysis Workshop 16 Framingham Heart Study Problem 2 genome-wide association data for HDL-cholesterol using a "gene-centric" approach. Then we formally combine the association test results with genome-wide transcriptional profiling data for high-density lipoprotein cholesterol (HDL-C), from the San Antonio Family Heart Study, using a Z-transform test (Stouffer's method). We identified 39 genes by the joint test at a conservative 1% false-discovery rate, including 9 from the significant gene-based association test and 23 whose expression was significantly correlated with HDL-C. Seven genes identified as significant in the joint test were not independently identified by either the association or expression tests. This combined approach has increased power and leads to the direct nomination of novel candidate genes likely to be involved in the determination of HDL-C levels. Such information can then be used as justification for a more exhaustive search for functional sequence variation within the nominated genes. We anticipate that this type of analysis will improve our speed of identification of regulatory genes causally involved in disease risk.
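A minimal sketch of the Stouffer Z-transform combination described above, assuming equal weights and one-sided p-values; the gene names and p-values are invented for illustration:

```python
# Minimal sketch of Stouffer's Z-transform test for combining a gene-based
# association p-value with an expression-correlation p-value (equal weights
# assumed; gene names and p-values are made up for illustration).
from scipy.stats import norm

def stouffer_combine(p_values):
    """Combine p-values: Z_i = Phi^-1(1 - p_i), Z = sum(Z_i) / sqrt(k)."""
    z = [norm.isf(p) for p in p_values]
    z_comb = sum(z) / len(z) ** 0.5
    return z_comb, norm.sf(z_comb)

genes = {
    "GENE_A": (2e-4, 1e-3),   # (association p, expression-correlation p)
    "GENE_B": (0.04, 0.8),
}
for gene, (p_assoc, p_expr) in genes.items():
    z, p = stouffer_combine([p_assoc, p_expr])
    print(f"{gene}: combined Z = {z:5.2f}, combined p = {p:.2e}")
```

A gene with moderate evidence from each source alone (like GENE_A here) can reach significance in the joint test, which is the behavior the study exploits to nominate novel candidates.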
Davis, Kelly D.; Zarit, Steven H.; Moen, Phyllis; Hammer, Leslie B.; Almeida, David M.
2016-01-01
Objectives. Women who combine formal and informal caregiving roles represent a unique, understudied population. In the literature, healthcare employees who simultaneously provide unpaid elder care at home have been referred to as double-duty caregivers. The present study broadens this perspective by examining the psychosocial implications of double-duty child care (child care only), double-duty elder care (elder care only), and triple-duty care (both child care and elder care or “sandwiched” care). Method. Drawing from the Work, Family, and Health Study, we focus on a large sample of women working in nursing homes in the United States (n = 1,399). We use multiple regression analysis and analysis of covariance tests to examine a range of psychosocial implications associated with double- and triple-duty care. Results. Compared with nonfamily caregivers, double-duty child caregivers indicated greater family-to-work conflict and poorer partner relationship quality. Double-duty elder caregivers reported more family-to-work conflict, perceived stress, and psychological distress, whereas triple-duty caregivers indicated poorer psychosocial functioning overall. Discussion. Relative to their counterparts without family caregiving roles, women with combined caregiving roles reported poorer psychosocial well-being. Additional research on women with combined caregiving roles, especially triple-duty caregivers, should be a priority amidst an aging population, older workforce, and growing number of working caregivers. PMID:25271309
Modular Knowledge Representation and Reasoning in the Semantic Web
NASA Astrophysics Data System (ADS)
Serafini, Luciano; Homola, Martin
Construction of modular ontologies by combining different modules is becoming a necessity in ontology engineering in order to cope with the increasing complexity of ontologies and the domains they represent. The modular ontology approach takes inspiration from software engineering, where modularization is a widely acknowledged feature. Distributed reasoning is the other side of the coin of modular ontologies: given an ontology comprising a set of modules, it is desirable to perform reasoning by combining multiple reasoning processes performed locally on each of the modules. In the last ten years, a number of approaches for combining logics have been developed in order to formalize modular ontologies. In this chapter, we survey and compare the main formalisms for modular ontologies and distributed reasoning in the Semantic Web. We select four formalisms built on the formal logical grounds of Description Logics: Distributed Description Logics, ℰ-connections, Package-based Description Logics and Integrated Distributed Description Logics. We concentrate on the expressivity and distinctive modeling features of each framework. We also discuss the reasoning capabilities of each framework.
Using the Chebychev expansion in quantum transport calculations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Popescu, Bogdan; Rahman, Hasan; Kleinekathöfer, Ulrich, E-mail: u.kleinekathoefer@jacobs-university.de
2015-04-21
Irradiation by laser pulses and a fluctuating surrounding liquid environment can, for example, lead to time-dependent effects in the transport through molecular junctions. From the theoretical point of view, time-dependent theories of quantum transport are still challenging. In one of these existing transport theories, the energy-dependent coupling between molecule and leads is decomposed into Lorentzian functions. This trick has successfully been combined with quantum master approaches, hierarchical formalisms, and non-equilibrium Green’s functions. The drawback of this approach is, however, its serious limitation to certain forms of the molecule-lead coupling and to higher temperatures. Tian and Chen [J. Chem. Phys. 137, 204114 (2012)] recently employed a Chebychev expansion to circumvent some of these latter problems. Here, we report on a similar approach also based on the Chebychev expansion but leading to a different set of coupled differential equations using the fact that a derivative of a zeroth-order Bessel function can again be given in terms of Bessel functions. Test calculations show the excellent numerical accuracy and stability of the presented formalism. The time span for which this Chebychev expansion scheme is valid without any restrictions on the form of the spectral density or temperature can be determined a priori.
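The Bessel-function property invoked above (the derivative of the zeroth-order Bessel function expressed through Bessel functions again) is the standard identity

\[
\frac{d}{dx}J_0(x) = -J_1(x),
\qquad
\frac{d}{dx}J_n(x) = \tfrac{1}{2}\big[J_{n-1}(x) - J_{n+1}(x)\big] \quad (n \ge 1),
\]

which is what lets the time derivatives of the Chebychev-expanded propagator close into the set of coupled differential equations mentioned in the abstract.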
Schools Together: Enhancing the Citizenship Curriculum through a Non-Formal Education Programme
ERIC Educational Resources Information Center
O'Connor, Una
2012-01-01
In divided societies education for diversity, often introduced via the combined approaches of civic education, citizenship education and community-relations activity, is advocated as a core element of the school curriculum. Its delivery, through formal and non-formal educational approaches, has been routinely recognised as an opportunity for…
NASA Astrophysics Data System (ADS)
Bhatia, A. K.
2012-09-01
The P-wave hybrid theory of electron-hydrogen elastic scattering [Bhatia, Phys. Rev. A 85, 052708 (2012)] is applied to the P-wave scattering from He ion. In this method, both short-range and long-range correlations are included in the Schrödinger equation at the same time, by using a combination of a modified method of polarized orbitals and the optical potential formalism. The short-range-correlation functions are of Hylleraas type. It is found that the phase shifts are not significantly affected by the modification of the target function by a method similar to the method of polarized orbitals and they are close to the phase shifts calculated earlier by Bhatia [Phys. Rev. A 69, 032714 (2004)]. This indicates that the correlation function is general enough to include the target distortion (polarization) in the presence of the incident electron. The important fact is that in the present calculation, to obtain similar results only a 20-term correlation function is needed in the wave function compared to the 220-term wave function required in the above-mentioned calculation. Results for the phase shifts, obtained in the present hybrid formalism, are rigorous lower bounds to the exact phase shifts. The lowest P-wave resonances in He atom and hydrogen ion have also been calculated and compared with the results obtained using the Feshbach projection operator formalism [Bhatia and Temkin, Phys. Rev. A 11, 2018 (1975)] and also with the results of other calculations. It is concluded that accurate resonance parameters can be obtained by the present method, which has the advantage of including corrections due to neighboring resonances, bound states, and the continuum in which these resonances are embedded.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dai, Ling -Yun; Kang, Zhong -Bo; Prokudin, Alexei
2015-12-22
Here, we study the Sivers asymmetry in semi-inclusive hadron production in deep inelastic scattering. We concentrate on the contribution from the photon-gluon fusion channel at O(α_em^2 α_s), where three-gluon correlation functions play a major role within the twist-3 collinear factorization formalism. We establish the correspondence between such a formalism with three-gluon correlation functions and the usual transverse momentum-dependent (TMD) factorization formalism at moderate hadron transverse momenta. We derive the coefficient functions used in the usual TMD evolution formalism related to the quark Sivers function expansion in terms of the three-gluon correlation functions. We further perform the next-to-leading order calculation for the transverse momentum-weighted spin-dependent differential cross section and identify the off-diagonal contribution from the three-gluon correlation functions to the QCD collinear evolution of the twist-3 Qiu-Sterman function.
ERIC Educational Resources Information Center
Goldratt, Miri; Cohen, Eric H.
2016-01-01
This article explores encounters between formal, informal, and non-formal education and the role of mentor-educators in creating values education in which such encounters take place. Mixed-methods research was conducted in Israeli public schools participating in the Personal Education Model, which combines educational modes. Ethnographic and…
NASA Astrophysics Data System (ADS)
Zhou, Chenyi; Guo, Hong
2017-01-01
We report a diagrammatic method to solve the general problem of calculating configurationally averaged Green's function correlators that appear in quantum transport theory for nanostructures containing disorder. The theory treats both equilibrium and nonequilibrium quantum statistics on an equal footing. Since random impurity scattering is a problem that cannot be solved exactly in a perturbative approach, we combine our diagrammatic method with the coherent potential approximation (CPA) so that a reliable closed-form solution can be obtained. Our theory not only ensures the internal consistency of the diagrams derived at different levels of the correlators but also satisfies a set of Ward-like identities that corroborate the conserving consistency of transport calculations within the formalism. The theory is applied to calculate the quantum transport properties such as average ac conductance and transmission moments of a disordered tight-binding model, and results are numerically verified to high precision by comparing to the exact solutions obtained from enumerating all possible disorder configurations. Our formalism can be employed to predict transport properties of a wide variety of physical systems where disorder scattering is important.
The Formalization of Cultural Psychology. Reasons and Functions.
Salvatore, Sergio
2017-03-01
In this paper I discuss two basic theses about the formalization of cultural psychology. First, I claim that formalization is a relevant, even necessary stage of development of this domain of science. This is so because formalization allows the scientific language to achieve a much needed autonomy from the commonsensical language of the phenomena that this science deals with. Second, I envisage the two main functions that formalization has to perform in the field of cultural psychology: on the one hand, it has to provide formal rules grounding and constraining the deductive construction of the general theory; on the other hand, it has to provide the devices for supporting the interpretation of local phenomena, in terms of the abductive reconstruction of the network of linkages among empirical occurrences comprising the local phenomena.
Thermodynamic assessment of the U–Y–O system
Brese, R. G.; McMurray, J. W.; Shin, D.; ...
2015-02-03
We developed a CALPHAD assessment of the U-Y-O system. To represent the YO2 compound in the compound energy formalism (CEF) for U1-yYyO2±x, the lattice stability was calculated using density functional theory (DFT), while a partially ionic liquid sub-lattice model was used to describe the liquid phase. Moreover, a Gibbs function for the stoichiometric rhombohedral UY6O12 phase is proposed. Models representing the phases in the U-O and Y-O systems taken from the literature, along with the phases that appear in the U-Y-O ternary, are combined to form a unified assessment.
Electronic transport properties in [n]cycloparaphenylenes molecular devices
NASA Astrophysics Data System (ADS)
Hu, Lizhi; Guo, Yandong; Yan, Xiaohong; Zeng, Hongli; Zhou, Jie
2017-07-01
The electronic transport of [n]cycloparaphenylenes ([n]CPPs) is investigated using the nonequilibrium Green's function formalism in combination with density-functional theory. A negative differential resistance (NDR) phenomenon is observed. Further analysis shows that the NDR effect arises from the bias-induced reduction of the transmission peak near the Fermi energy. Replacing the electrode (from a carbon chain to an Au electrode), doping with N atoms, and changing the size of the nanohoop (n = 5, 6, 8, 10) have also been studied; the NDR persists in all cases, suggesting that it is an intrinsic feature of such [n]CPP systems, which would be quite useful in future nanoelectronic devices.
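For context, a minimal sketch of the standard NEGF/Landauer expression evaluated in such transport calculations (the specific transmission function of the [n]CPP junctions is of course obtained from the first-principles calculation, not reproduced here):

    I(V) = \frac{2e}{h} \int dE\; T(E,V) \left[ f_L(E-\mu_L) - f_R(E-\mu_R) \right],

so a bias-induced suppression of T(E, V) inside the window between \mu_L and \mu_R directly produces the NDR behavior described above.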
Highly efficient spin polarizer based on individual heterometallic cubane single-molecule magnets
NASA Astrophysics Data System (ADS)
Dong, Damin
2015-09-01
The spin-polarized transport across a single-molecule magnet [Mn3Zn(hmp)3O(N3)3(C3H5O2)3].2CHCl3 has been investigated using density functional theory combined with the Keldysh non-equilibrium Green's function formalism. It is shown that this single-molecule magnet exhibits perfect spin-filter behaviour. By adsorbing a Ni3 cluster onto the non-magnetic Au electrode, a large magnetoresistance exceeding 172% is found, displaying a molecular spin-valve feature. Due to tunneling via discrete quantum-mechanical states, the I-V curve has a stepwise character and shows negative differential resistance behaviour.
Generalizing Prototype Theory: A Formal Quantum Framework
Aerts, Diederik; Broekaert, Jan; Gabora, Liane; Sozzo, Sandro
2016-01-01
Theories of natural language and concepts have been unable to model the flexibility, creativity, context-dependence, and emergence exhibited by words, concepts and their combinations. The mathematical formalism of quantum theory has instead been successful in capturing phenomena such as graded membership, situational meaning, composition of categories, and also more complex decision-making situations, which cannot be modeled in traditional probabilistic approaches. We show how a formal quantum approach to concepts and their combinations can provide a powerful extension of prototype theory. We explain how prototypes can interfere in conceptual combinations as a consequence of their contextual interactions, and provide an illustration of this using an intuitive wave-like diagram. This quantum-conceptual approach gives new life to original prototype theory, without however making it a privileged concept theory, as we explain at the end of our paper. PMID:27065436
ERIC Educational Resources Information Center
Penuel, William R.; Riel, Margaret; Joshi, Aasha; Pearlman, Leslie; Kim, Chong Min; Frank, Kenneth A.
2010-01-01
Previous qualitative studies show that when the formal organization of a school and patterns of informal interaction are aligned, faculty and leaders in a school are better able to coordinate instructional change. This article combines social network analysis with interview data to analyze how well the formal and informal aspects of a school's…
The Influence of Rural Location on Utilization of Formal Home Care: The Role of Medicaid
ERIC Educational Resources Information Center
McAuley, William J.; Spector, William D.; Van Nostrand, Joan; Shaffer, Tom
2004-01-01
Purpose: This research examines the impact of rural-urban residence on formal home-care utilization among older people and determines whether and how Medicaid coverage influences the association between rural-urban location and risk of formal home-care use. Design and Methods: We combined data from the 1998 consolidated file of the Medical…
Bridging the gap between formal and experience-based knowledge for context-aware laparoscopy.
Katić, Darko; Schuck, Jürgen; Wekerle, Anna-Laura; Kenngott, Hannes; Müller-Stich, Beat Peter; Dillmann, Rüdiger; Speidel, Stefanie
2016-06-01
Computer assistance is increasingly common in surgery. However, the amount of information is bound to overload the processing abilities of surgeons. We propose methods to recognize the current phase of a surgery for context-aware information filtering. The purpose is to select the most suitable subset of information for surgical situations which require special assistance. We combine formal knowledge, represented by an ontology, and experience-based knowledge, represented by training samples, to recognize phases. For this purpose, we have developed two different methods. Firstly, we use formal knowledge about possible phase transitions to create a composition of random forests. Secondly, we propose a method based on cultural optimization to infer formal rules from experience to recognize phases. The proposed methods are compared with a purely formal knowledge-based approach using rules and a purely experience-based one using regular random forests. The comparative evaluation on laparoscopic pancreas resections and adrenalectomies employs a consistent set of quality criteria on clean and noisy input. The rule-based approaches proved best with noise-free data. The random forest-based ones were more robust in the presence of noise. Formal and experience-based knowledge can be successfully combined for robust phase recognition.
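As an illustration of the general idea only (a minimal sketch with invented features and a hypothetical three-phase transition matrix, not the authors' implementation), a random-forest phase classifier can be constrained by formal knowledge about admissible phase transitions as follows:

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    # Toy data: each row is a feature vector of instrument/sensor signals for
    # one time window; labels are surgical phase indices 0..2 (synthetic here).
    rng = np.random.default_rng(0)
    X_train = rng.random((300, 8))
    y_train = rng.integers(0, 3, 300)
    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

    # Formal knowledge: from phase i the procedure may only stay in i or advance to i+1.
    allowed = np.array([[1, 1, 0],
                        [0, 1, 1],
                        [0, 0, 1]], dtype=bool)

    def recognize(stream, prev_phase=0):
        # Pick the most probable phase among those reachable from the previous one.
        phases = []
        for x in stream:
            proba = clf.predict_proba(x.reshape(1, -1))[0]
            proba = np.where(allowed[prev_phase], proba, 0.0)
            prev_phase = int(np.argmax(proba))
            phases.append(prev_phase)
        return phases

    print(recognize(rng.random((5, 8))))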
NASA Astrophysics Data System (ADS)
Suzuki, Yoshi-ichi; Seideman, Tamar; Stener, Mauro
2004-01-01
Time-resolved photoelectron differential cross sections are computed within a quantum dynamical theory that combines a formally exact solution of the nuclear dynamics with density functional theory (DFT)-based approximations of the electronic dynamics. Various observables of time-resolved photoelectron imaging techniques are computed at the Kohn-Sham and at the time-dependent DFT levels. Comparison of the results serves to assess the reliability of the former method and hence its usefulness as an economic approach for time-domain photoelectron cross section calculations, that is applicable to complex polyatomic systems. Analysis of the matrix elements that contain the electronic dynamics provides insight into a previously unexplored aspect of femtosecond-resolved photoelectron imaging.
DePasquale, Nicole; Davis, Kelly D; Zarit, Steven H; Moen, Phyllis; Hammer, Leslie B; Almeida, David M
2016-03-01
Women who combine formal and informal caregiving roles represent a unique, understudied population. In the literature, healthcare employees who simultaneously provide unpaid elder care at home have been referred to as double-duty caregivers. The present study broadens this perspective by examining the psychosocial implications of double-duty child care (child care only), double-duty elder care (elder care only), and triple-duty care (both child care and elder care or "sandwiched" care). Drawing from the Work, Family, and Health Study, we focus on a large sample of women working in nursing homes in the United States (n = 1,399). We use multiple regression analysis and analysis of covariance tests to examine a range of psychosocial implications associated with double- and triple-duty care. Compared with nonfamily caregivers, double-duty child caregivers indicated greater family-to-work conflict and poorer partner relationship quality. Double-duty elder caregivers reported more family-to-work conflict, perceived stress, and psychological distress, whereas triple-duty caregivers indicated poorer psychosocial functioning overall. Relative to their counterparts without family caregiving roles, women with combined caregiving roles reported poorer psychosocial well-being. Additional research on women with combined caregiving roles, especially triple-duty caregivers, should be a priority amidst an aging population, older workforce, and growing number of working caregivers.
How linear response shaped models of neural circuits and the quest for alternatives.
Herfurth, Tim; Tchumatchenko, Tatjana
2017-10-01
In the past decades, many mathematical approaches to solve complex nonlinear systems in physics have been successfully applied to neuroscience. One of these tools is the concept of linear response functions. However, phenomena observed in the brain emerge from fundamentally nonlinear interactions and feedback loops rather than from a composition of linear filters. Here, we review the successes achieved by applying the linear response formalism to topics, such as rhythm generation and synchrony and by incorporating it into models that combine linear and nonlinear transformations. We also discuss the challenges encountered in the linear response applications and argue that new theoretical concepts are needed to tackle feedback loops and non-equilibrium dynamics which are experimentally observed in neural networks but are outside of the validity regime of the linear response formalism.
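For readers unfamiliar with the formalism, the linear response ansatz referred to above is simply (standard definition, not tied to any particular model in the review)

    r(t) = r_0 + \int_0^{\infty} \chi(\tau)\, s(t-\tau)\, d\tau,

where s is a weak input, r the resulting (e.g. firing-rate) response and \chi the linear response kernel; the review's point is that feedback loops and strong nonlinearities drive neural dynamics outside the regime in which such a kernel description is valid.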
Legaz-García, María Del Carmen; Dentler, Kathrin; Fernández-Breis, Jesualdo Tomás; Cornet, Ronald
2017-01-01
ArchMS is a framework that represents clinical information and knowledge using ontologies in OWL, which facilitates semantic interoperability and thereby the exploitation and secondary use of clinical data. However, it does not yet support the automated assessment of quality of care. CLIF is a stepwise method to formalize quality indicators. The method has been implemented in the CLIF tool which supports its users in generating computable queries based on a patient data model which can be based on archetypes. To enable the automated computation of quality indicators using ontologies and archetypes, we tested whether ArchMS and the CLIF tool can be integrated. We successfully automated the process of generating SPARQL queries from quality indicators that have been formalized with CLIF and integrated them into ArchMS. Hence, ontologies and archetypes can be combined for the execution of formalized quality indicators.
Some properties for integro-differential operator defined by a fractional formal.
Abdulnaby, Zainab E; Ibrahim, Rabha W; Kılıçman, Adem
2016-01-01
Recently, the study of the fractional formal (operators, polynomials and classes of special functions) has increased. This interest is not confined to mathematics but extends to other fields. In this effort, we investigate a generalized integro-differential operator [Formula: see text] defined by a fractional formal (fractional differential operator) and study some of its geometric properties by employing it in new subclasses of analytic univalent functions.
A quasilinear operator retaining magnetic drift effects in tokamak geometry
NASA Astrophysics Data System (ADS)
Catto, Peter J.; Lee, Jungpyo; Ram, Abhay K.
2017-12-01
The interaction of radio frequency waves with charged particles in a magnetized plasma is usually described by the quasilinear operator that was originally formulated by Kennel & Engelmann (Phys. Fluids, vol. 9, 1966, pp. 2377-2388). In their formulation the plasma is assumed to be homogenous and embedded in a uniform magnetic field. In tokamak plasmas the Kennel-Engelmann operator does not capture the magnetic drifts of the particles that are inherent to the non-uniform magnetic field. To overcome this deficiency a combined drift and gyrokinetic derivation is employed to derive the quasilinear operator for radio frequency heating and current drive in a tokamak with magnetic drifts retained. The derivation requires retaining the magnetic moment to higher order in both the unperturbed and perturbed kinetic equations. The formal prescription for determining the perturbed distribution function then follows a novel procedure in which two non-resonant terms must be evaluated explicitly. The systematic analysis leads to a diffusion equation that is compact and completely expressed in terms of the drift kinetic variables. The equation is not transit averaged, and satisfies the entropy principle, while retaining the full poloidal angle variation without resorting to Fourier decomposition. As the diffusion equation is in physical variables, it can be implemented in any computational code. In the Kennel-Engelmann formalism, the wave-particle resonant delta function is either for the Landau resonance or the Doppler shifted cyclotron resonance. In the combined gyro and drift kinetic approach, a term related to the magnetic drift modifies the resonance condition.
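To make the modification concrete (standard relations, with the drift term written only schematically), the Kennel-Engelmann quasilinear diffusion is localized on the resonance

    \omega - k_\parallel v_\parallel - n\Omega = 0, \qquad n = 0, \pm 1, \pm 2, \ldots,

with n = 0 the Landau resonance and n \neq 0 the Doppler-shifted cyclotron resonances, whereas retaining the magnetic drift velocity v_d adds, roughly, a k \cdot v_d contribution to the resonance condition,

    \omega - k_\parallel v_\parallel - \mathbf{k} \cdot \mathbf{v}_d - n\Omega \approx 0,

which is the drift-related modification referred to above.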
Egidi, Franco; Sun, Shichao; Goings, Joshua J; Scalmani, Giovanni; Frisch, Michael J; Li, Xiaosong
2017-06-13
We present a linear response formalism for the description of the electronic excitations of a noncollinear reference defined via Kohn-Sham spin density functional methods. A set of auxiliary variables, defined using the density and noncollinear magnetization density vector, allows the generalization of spin density functional kernels commonly used in collinear DFT to noncollinear cases, including local density, GGA, meta-GGA and hybrid functionals. Working equations and derivations of functional second derivatives with respect to the noncollinear density, required in the linear response noncollinear TDDFT formalism, are presented in this work. This formalism takes all components of the spin magnetization into account independent of the type of reference state (open or closed shell). As a result, the method introduced here is able to afford a nonzero local xc torque on the spin magnetization while still satisfying the zero-torque theorem globally. The formalism is applied to a few test cases using the variational exact-two-component reference including spin-orbit coupling to illustrate the capabilities of the method.
Cognitive function in the oldest old: women perform better than men.
van Exel, E; Gussekloo, J; de Craen, A J; Bootsma-van der Wiel, A; Houx, P; Knook, D L; Westendorp, R G
2001-07-01
Limited formal education is associated with poor cognitive function. This could explain sex differences in cognitive function in the oldest old. Whether limited formal education explains differences in cognitive function between elderly women and men was explored. The Leiden 85-plus Study is a population based study investigating all 85 year old inhabitants of Leiden with an overall response rate of 87%. A sample of 599 participants were visited at their place of residence. The mini mental state examination was completed by all participants. Cognitive speed and memory were determined with four neuropsychological tests in participants with a mini mental state examination score higher than 18 points. The proportion of women with limited formal education was significantly higher than that of men (70% v 53%, p=0.001), but women had better scores for cognitive speed and memory than men (p<0.05). After adjustment for differences in limited formal education and the presence of depressive symptoms, the odds ratio for women to have a higher cognitive speed than men was 1.7 (95% CI; 1.0 to 2.6), and for them to have a better memory the odds ratio was 1.8 (95%CI; 1.2 to 2.7). Women have a better cognitive function than men, despite their lower level of formal education. Limited formal education alone, therefore, cannot explain the differences in cognitive function in women and men. These findings support the alternative hypothesis that biological differences, such as atherosclerosis, between women and men account for the sex differences in cognitive decline.
Papadimitriou, Konstantinos I.; Liu, Shih-Chii; Indiveri, Giacomo; Drakakis, Emmanuel M.
2014-01-01
The field of neuromorphic silicon synapse circuits is revisited and a parsimonious mathematical framework able to describe the dynamics of this class of log-domain circuits in the aggregate and in a systematic manner is proposed. Starting from the Bernoulli Cell Formalism (BCF), originally formulated for the modular synthesis and analysis of externally linear, time-invariant logarithmic filters, and by means of the identification of new types of Bernoulli Cell (BC) operators presented here, a generalized formalism (GBCF) is established. The expanded formalism covers two new possible and practical combinations of a MOS transistor (MOST) and a linear capacitor. The corresponding mathematical relations codifying each case are presented and discussed through the tutorial treatment of three well-known transistor-level examples of log-domain neuromorphic silicon synapses. The proposed mathematical tool unifies past analysis approaches of the same circuits under a common theoretical framework. The speed advantage of the proposed mathematical framework as an analysis tool is also demonstrated by a compelling comparative circuit analysis example of high order, where the GBCF and another well-known log-domain circuit analysis method are used for the determination of the input-output transfer function of the high (4th) order topology. PMID:25653579
Strain-induced tunable negative differential resistance in triangle graphene spirals
NASA Astrophysics Data System (ADS)
Tan, Jie; Zhang, Xiaoming; Liu, Wenguan; He, Xiujie; Zhao, Mingwen
2018-05-01
Using the non-equilibrium Green's function formalism combined with density functional theory calculations, we investigate the significant changes in electronic and transport properties of triangle graphene spirals (TGSs) in response to external strain. Tunable negative differential resistance (NDR) behavior is predicted. The NDR bias region, NDR width, and peak-to-valley ratio can be well tuned by external strain. Further analysis shows that these peculiar properties can be attributed to the dispersion widths of the p_z orbitals. Moreover, the conductance of TGSs is very sensitive to the applied stress, which is promising for applications in nanosensor devices. Our findings reveal a novel approach to produce tunable electronic devices based on graphene spirals.
Service composition towards increasing end-user accessibility.
Kaklanis, Nikolaos; Votis, Konstantinos; Tzovaras, Dimitrios
2015-01-01
This paper presents the Cloud4all Service Synthesizer Tool, a framework that enables efficient orchestration of accessibility services, as well as their combination into complex forms, providing more advanced functionalities towards increasing the accessibility of end-users with various types of functional limitations. The supported services are described formally within an ontology, thus enabling semantic service composition. The proposed service composition approach is based on semantic matching between service specifications on the one hand and user needs/preferences and the current context of use on the other hand. The use of automatic composition of accessibility services can significantly enhance end-users' accessibility, especially in cases where assistive solutions are not available in their device.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fediai, Artem, E-mail: artem.fediai@nano.tu-dresden.de; Ryndyk, Dmitry A.; Center for Advancing Electronics Dresden, TU Dresden, 01062 Dresden
2016-09-05
Using a dedicated combination of the non-equilibrium Green function formalism and large-scale density functional theory calculations, we investigated how incomplete metal coverage influences two of the most important electrical properties of carbon nanotube (CNT)-based transistors: contact resistance and its scaling with contact length, and maximum current. These quantities have been derived from parameter-free simulations of atomic systems that are as close as possible to experimental geometries. Physical mechanisms that govern these dependences have been identified for various metals, representing different CNT-metal interaction strengths from chemisorption to physisorption. Our results pave the way for an application-oriented design of CNT-metal contacts.
The potential for increased power from combining P-values testing the same hypothesis.
Ganju, Jitendra; Julie Ma, Guoguang
2017-02-01
The conventional approach to hypothesis testing for formal inference is to prespecify a single test statistic thought to be optimal. However, we usually have more than one test statistic in mind for testing the null hypothesis of no treatment effect but we do not know which one is the most powerful. Rather than relying on a single p-value, combining p-values from prespecified multiple test statistics can be used for inference. Combining functions include Fisher's combination test and the minimum p-value. Using randomization-based tests, the increase in power can be remarkable when compared with a single test and Simes's method. The versatility of the method is that it also applies when the number of covariates exceeds the number of observations. The increase in power is large enough to prefer combined p-values over a single p-value. The limitation is that the method does not provide an unbiased estimator of the treatment effect and does not apply to situations when the model includes treatment by covariate interaction.
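As a small illustration of the two combining functions named above (a generic sketch; in the randomization-based setting of the paper the reference distribution of the combined statistic would be obtained by re-randomization rather than from these closed forms):

    import numpy as np
    from scipy import stats

    def fisher_combination(pvals):
        # Fisher's statistic: -2 * sum(log p) ~ chi-square with 2k degrees of
        # freedom under H0, assuming k independent p-values.
        pvals = np.asarray(pvals, dtype=float)
        statistic = -2.0 * np.log(pvals).sum()
        return stats.chi2.sf(statistic, df=2 * pvals.size)

    def min_p(pvals):
        # Minimum p-value with a Bonferroni-style adjustment.
        pvals = np.asarray(pvals, dtype=float)
        return min(1.0, pvals.size * pvals.min())

    p = [0.04, 0.20, 0.07]   # p-values from three prespecified tests (made-up numbers)
    print(fisher_combination(p), min_p(p))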
Formal System Verification - Extension 2
2012-08-08
vision of truly trustworthy systems has been to provide a formally verified microkernel basis. We have previously developed the seL4 microkernel... together with a formal proof (in the theorem prover Isabelle/HOL) of its functional correctness [6]. This means that all the behaviours of the seL4 C source code are included in the high-level, formal specification of the kernel. This work enabled us to provide further formal guarantees about seL4, in…
Hybrid density-functional calculations of phonons in LaCoO3
NASA Astrophysics Data System (ADS)
Gryaznov, Denis; Evarestov, Robert A.; Maier, Joachim
2010-12-01
Phonon frequencies at Γ point in nonmagnetic rhombohedral phase of LaCoO3 were calculated using density-functional theory with hybrid exchange correlation functional PBE0. The calculations involved a comparison of results for two types of basis functions commonly used in ab initio calculations, namely, the plane-wave approach and linear combination of atomic orbitals, as implemented in VASP and CRYSTAL computer codes, respectively. A good qualitative, but also within an error margin of less than 30%, a quantitative agreement was observed not only between the two formalisms but also between theoretical and experimental phonon frequency predictions. Moreover, the correlation between the phonon symmetries in cubic and rhombohedral phases is discussed in detail on the basis of group-theoretical analysis. It is concluded that the hybrid PBE0 functional is able to predict correctly the phonon properties in LaCoO3 .
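For reference, the PBE0 hybrid used here mixes a fixed fraction of exact exchange into the PBE functional (standard definition):

    E_{xc}^{\mathrm{PBE0}} = \tfrac{1}{4} E_x^{\mathrm{HF}} + \tfrac{3}{4} E_x^{\mathrm{PBE}} + E_c^{\mathrm{PBE}}.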
Dynamic Forms. Part 1: Functions
NASA Technical Reports Server (NTRS)
Meyer, George; Smith, G. Allan
1993-01-01
The formalism of dynamic forms is developed as a means for organizing and systematizing the design of control systems. The formalism allows the designer to easily compute derivatives to various orders of large composite functions that occur in flight-control design. Such functions involve many function-of-a-function calls that may be nested to many levels. The component functions may be multiaxis, nonlinear, and they may include rotation transformations. A dynamic form is defined as a variable together with its time derivatives up to some fixed but arbitrary order. The variable may be a scalar, a vector, a matrix, a direction cosine matrix, Euler angles, or Euler parameters. Algorithms for standard elementary functions and operations of scalar dynamic forms are developed first. Then vector and matrix operations and transformations between parameterizations of rotations are developed in the next level in the hierarchy. Commonly occurring algorithms in control-system design, including inversion of pure feedback systems, are developed in the third level. A large-angle, three-axis attitude servo and other examples are included to illustrate the effectiveness of the developed formalism. All algorithms were implemented in FORTRAN code. Practical experience shows that the proposed formalism may significantly improve the productivity of the design and coding process.
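A minimal scalar sketch of the idea (illustrative only; the formalism described above also covers vectors, matrices and rotation parameterizations, and was implemented in FORTRAN rather than Python): a dynamic form can be represented as the tuple of time derivatives of a variable, with elementary operations propagating the derivatives by the Leibniz rule.

    from math import comb

    class DynamicForm:
        # A scalar variable together with its time derivatives up to a fixed order.
        def __init__(self, derivs):
            self.d = list(derivs)          # d[k] = k-th time derivative

        def __add__(self, other):
            return DynamicForm(a + b for a, b in zip(self.d, other.d))

        def __mul__(self, other):
            # Leibniz rule: (fg)^(k) = sum_j C(k, j) f^(j) g^(k-j)
            n = len(self.d)
            return DynamicForm(
                sum(comb(k, j) * self.d[j] * other.d[k - j] for j in range(k + 1))
                for k in range(n)
            )

    x = DynamicForm([2.0, 1.0, 0.0])       # x = 2, xdot = 1, xddot = 0
    y = DynamicForm([3.0, -1.0, 0.5])      # y = 3, ydot = -1, yddot = 0.5
    print((x * y).d)                       # derivatives of x*y: [6.0, 1.0, -1.0]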
International Workshop on Principles of Program Analysis
1999-01-01
with respect to a semantics of the programming language. It is a sad fact that new program analyses often contain subtle bugs, and a formal ... It defines a higher-order function f with formal parameter x and body x 1; then it defines two functions g and h that are given as actual parameters...begin by presenting a formal semantics for WHILE. The material of this section may be skimmed through on a first reading; however, it is frequently
Zimbovskaya, Natalya A
2016-07-27
In this paper, we theoretically analyze steady-state thermoelectric transport through a single-molecule junction with a vibrating bridge. The thermally induced charge current in the system is explored using a nonequilibrium Green function formalism. We study the combined effects of Coulomb interactions between charge carriers on the bridge and electron-phonon interactions on the thermocurrent beyond the linear response regime. It is shown that electron-vibron interactions may significantly affect both the magnitude and the direction of the thermocurrent, and vibrational signatures may appear.
Patra, Bikash; Jana, Subrata; Samal, Prasanjit
2018-03-28
The exchange hole, which is one of the principal constituents of the density functional formalism, can be used to design accurate range-separated hybrid functionals in association with an appropriate correlation functional. In this regard, the exchange hole derived from the density matrix expansion has gained attention due to its fulfillment of some of the desired exact constraints. The new long-range corrected density functional proposed here therefore combines a meta generalized gradient approximation exchange functional, designed from the density matrix expansion based exchange hole, with ab initio Hartree-Fock exchange through range separation of the Coulomb interaction operator using the standard error-function technique. In association with the Lee-Yang-Parr correlation functional, assessment and benchmarking of the newly constructed range-separated functional on various well-known test sets shows its reasonable performance for a broad range of molecular properties, such as thermochemistry, non-covalent interactions and barrier heights of chemical reactions.
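Schematically, the range separation referred to above splits the Coulomb operator with the standard error function,

    \frac{1}{r_{12}} = \frac{\operatorname{erf}(\mu r_{12})}{r_{12}} + \frac{\operatorname{erfc}(\mu r_{12})}{r_{12}},

with \mu the range-separation parameter; in the long-range-corrected construction described above, the erf (long-range) piece is treated by Hartree-Fock exchange and the erfc (short-range) piece by the DME-based semilocal exchange.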
Durham, J; Michael, Marcos; Hill, P S; Paviignani, E
2015-09-28
In most societies the health marketplace is pluralistic in character, with a mix of formal and informal providers. In high-income countries, state regulation of the market helps ensure quality and access and mitigate market failures. In the present study, using Haiti as a case study, we explore what happens to the functioning of the pluralistic health marketplace in severely disrupted environments where the informal sector is able to flourish. The overall research design was qualitative. Research methods included an extensive documentary and policy analysis, based on peer-reviewed articles, books and "grey" literature--government policy and program reports, unpublished research and evaluations, reviews and reviews from key multilateral and bilateral donors, and non-government organisations, combined with field site visits and in-depth key informant interviews (N = 45). The findings show that state fragility has resulted in a privatised, commoditised and largely unregulated and informal health market. While different market segments can be identified, in reality the boundaries between international/domestic, public/private, for profit/not-for-profit, legal/illegal are hazy and shifting. The lack of state capacity to provide an enabling environment, establish, and enforce its regulatory framework has resulted in a highly segmented, heterogeneous and informal health market. The result is deplorable health indices which are far below regional averages and many other low-income countries. Working in fragile states with limited capacity to undertake the core function of securing the health of its population requires new and innovative ways of working. This needs longer time-frames, combining incremental top-down and bottom-up strategies which recognize and work with state and civil society, public and private actors, formal and informal institutions, and progressively facilitate changes in the different market functions of supply, demand, regulation and supporting functions.
The Lagrangian-Hamiltonian formalism for higher order field theories
NASA Astrophysics Data System (ADS)
Vitagliano, Luca
2010-06-01
We generalize the Lagrangian-Hamiltonian formalism of Skinner and Rusk to higher order field theories on fiber bundles. As a byproduct we solve the long standing problem of defining, in a coordinate free manner, a Hamiltonian formalism for higher order Lagrangian field theories. Namely, our formalism does only depend on the action functional and, therefore, unlike previously proposed ones, is free from any relevant ambiguity.
The Transition to Formal Thinking in Mathematics
ERIC Educational Resources Information Center
Tall, David
2008-01-01
This paper focuses on the changes in thinking involved in the transition from school mathematics to formal proof in pure mathematics at university. School mathematics is seen as a combination of visual representations, including geometry and graphs, together with symbolic calculations and manipulations. Pure mathematics in university shifts…
NASA Astrophysics Data System (ADS)
Xu, Xiao; Holzwarth, N. A. W.
2011-10-01
This paper presents the formulation and numerical implementation of a self-consistent treatment of orbital-dependent exchange-correlation functionals within the projector-augmented-wave method of Blöchl [Phys. Rev. B 50, 17953 (1994)] for electronic structure calculations. The methodology is illustrated with binding energy curves for C in the diamond structure and LiF in the rock salt structure, by comparing results from the Hartree-Fock (HF) formalism and the optimized effective potential formalism in the so-called KLI approximation [Krieger, Li, and Iafrate, Phys. Rev. A 45, 101 (1992)] with those of the local density approximation. While the work here uses pure Fock exchange only, the formalism can be extended to treat orbital-dependent functionals more generally.
A High-Level Formalization of Floating-Point Number in PVS
NASA Technical Reports Server (NTRS)
Boldo, Sylvie; Munoz, Cesar
2006-01-01
We develop a formalization of floating-point numbers in PVS based on a well-known formalization in Coq. We first describe the definitions of all the needed notions, e.g., floating-point number, format, rounding modes, etc.; then, we present an application to polynomial evaluation for elementary function evaluation. The application already existed in Coq, but our formalization shows a clear improvement in the quality of the result due to the automation provided by PVS. We finally integrate our formalization into a PVS hardware-level formalization of the IEEE-854 standard previously developed at NASA.
NASA Astrophysics Data System (ADS)
Brock, Ryan J.
Nature deficit, in which children become disconnected from nature, has come to the forefront of environmental education in recent years. This study explored how fourth graders in an after-school Nature Club developed or strengthened their environmental identity, thus decreasing nature deficit. The program used semi-formal instruction, combining classroom learning and direct experiences with nature over a nine-week period. Six children were followed as qualitative data were collected and analyzed for themes that would reveal how adolescent children in the developmental stage of concrete operations develop environmental identity. The results indicate that all students strengthened their environmental identity when social aspects were embedded. Students who entered Nature Club with low environmental identity required more direct experiences with nature, while those with higher environmental identity required a combination of reflective components along with nature experiences. Based upon this study, the nine-week program, which combined formal and non-formal means of learning, was able to strengthen environmental identity in each of the participants. A strong theme of social learning, not explicitly identified in the literature, was found. Additionally, and most importantly, the findings indicate that educators, both formal and non-formal, who teach environmental education and seek to strengthen environmental identity in adolescents through early interventions need to understand the development of environmental identity in concrete operational learners at a theoretical level.
Juan-Senabre, Xavier J; Porras, Ignacio; Lallena, Antonio M
2013-06-01
A variation of the TG-43 protocol for seeds with cylindrical symmetry, aiming at a better description of the radial and anisotropy functions, is proposed. The TG-43 two-dimensional formalism is modified by introducing a new anisotropy function. New fitting functions that permit a more robust description of the radial and anisotropy functions than the usual polynomials are also studied. The relationship between the new anisotropy function and the anisotropy factor included in the one-dimensional TG-43 formalism is analyzed. The new formalism is tested for the (125)I Nucletron selectSeed brachytherapy source, using Monte Carlo simulations performed with PENELOPE. The goodness of the new parameterizations is discussed. The results obtained indicate that precise fits can be achieved, with a better description than that provided by previous parameterizations. Special care has been taken in the description and fitting of the anisotropy factor near the source. The modified formalism shows advantages with respect to the usual one in the description of the anisotropy functions. The new parameterizations can be easily implemented in clinical planning calculation systems, provided that the ratio between geometry factors is also modified according to the new dose rate expression.
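For context, the unmodified two-dimensional TG-43 dose-rate equation on which the proposed variation builds is (standard AAPM formalism)

    \dot{D}(r,\theta) = S_K\, \Lambda\, \frac{G_L(r,\theta)}{G_L(r_0,\theta_0)}\, g_L(r)\, F(r,\theta), \qquad r_0 = 1\ \mathrm{cm}, \; \theta_0 = 90^\circ,

where S_K is the air-kerma strength, \Lambda the dose-rate constant, G_L the line-source geometry function, g_L the radial dose function and F the two-dimensional anisotropy function; the work summarized above replaces F and the fitting forms of g_L and F, with the geometry-factor ratio adjusted accordingly.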
An extension of stochastic hierarchy equations of motion for the equilibrium correlation functions
NASA Astrophysics Data System (ADS)
Ke, Yaling; Zhao, Yi
2017-06-01
In this paper, a traditional stochastic hierarchy equations of motion method is extended to correlated real-time and imaginary-time propagation for application in calculating equilibrium correlation functions. The central idea is based on a combined employment of stochastic unravelling and hierarchical techniques for the temperature-dependent and temperature-free parts of the influence functional, respectively, in the path integral formalism of open quantum systems coupled to a harmonic bath. The feasibility and validity of the proposed method are justified by the emission spectra of a homodimer compared with those obtained through the deterministic hierarchy equations of motion. Besides, it is interesting to find that the complex noises generated from a small portion of real-time and imaginary-time cross terms can be safely dropped to produce stable and accurate position and flux correlation functions in a broad parameter regime.
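Written explicitly (standard definition), the target quantity makes the need for correlated real-time and imaginary-time propagation evident:

    C_{AB}(t) = \frac{1}{Z}\, \mathrm{Tr}\!\left[ e^{-\beta H}\, e^{iHt} A\, e^{-iHt} B \right], \qquad Z = \mathrm{Tr}\, e^{-\beta H},

since the Boltzmann operator e^{-\beta H} (imaginary time) and the Heisenberg evolution e^{\pm iHt} (real time) involve the same system-bath coupling and therefore cannot be unravelled independently.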
On the formalization and reuse of scientific research.
King, Ross D; Liakata, Maria; Lu, Chuan; Oliver, Stephen G; Soldatova, Larisa N
2011-10-07
The reuse of scientific knowledge obtained from one investigation in another investigation is basic to the advance of science. Scientific investigations should therefore be recorded in ways that promote the reuse of the knowledge they generate. The use of logical formalisms to describe scientific knowledge has potential advantages in facilitating such reuse. Here, we propose a formal framework for using logical formalisms to promote reuse. We demonstrate the utility of this framework by using it in a worked example from biology: demonstrating cycles of investigation formalization [F] and reuse [R] to generate new knowledge. We first used logic to formally describe a Robot scientist investigation into yeast (Saccharomyces cerevisiae) functional genomics [f(1)]. With Robot scientists, unlike human scientists, the production of comprehensive metadata about their investigations is a natural by-product of the way they work. We then demonstrated how this formalism enabled the reuse of the research in investigating yeast phenotypes [r(1) = R(f(1))]. This investigation found that the removal of non-essential enzymes generally resulted in enhanced growth. The phenotype investigation was then formally described using the same logical formalism as the functional genomics investigation [f(2) = F(r(1))]. We then demonstrated how this formalism enabled the reuse of the phenotype investigation to investigate yeast systems-biology modelling [r(2) = R(f(2))]. This investigation found that yeast flux-balance analysis models fail to predict the observed changes in growth. Finally, the systems biology investigation was formalized for reuse in future investigations [f(3) = F(r(2))]. These cycles of reuse are a model for the general reuse of scientific knowledge.
Positive Character Development in School Sport Programs. ERIC Digest.
ERIC Educational Resources Information Center
Beller, Jennifer
This digest discusses the formal and informal processes of moral character development through sport in light of the types of programs that have been shown to improve moral character, sportsmanship, and fair play, noting that such efforts involve combined lifelong formal and informal educational processes with three interrelated dimensions: knowing,…
Establishing the Validity of Recovery from Stuttering without Formal Treatment.
ERIC Educational Resources Information Center
Finn, Patrick
1996-01-01
This study examined a validation procedure combining self-reports with independent verification to identify cases of recovery from stuttering without formal treatment. A Speech Behavior Checklist was administered to 42 individuals familiar with recovered subjects' past speech. Analysis of subjects' descriptions of their past stuttering was…
Spatial Proportional Reasoning Is Associated with Formal Knowledge about Fractions
ERIC Educational Resources Information Center
Möhring, Wenke; Newcombe, Nora S.; Levine, Susan C.; Frick, Andrea
2016-01-01
Proportional reasoning involves thinking about parts and wholes (i.e., about fractional quantities). Yet, research on proportional reasoning and fraction learning has proceeded separately. This study assessed proportional reasoning and formal fraction knowledge in 8- to 10-year-olds. Participants (N = 52) saw combinations of cherry juice and water…
Formal verification and testing: An integrated approach to validating Ada programs
NASA Technical Reports Server (NTRS)
Cohen, Norman H.
1986-01-01
An integrated set of tools called a validation environment is proposed to support the validation of Ada programs by a combination of methods. A Modular Ada Validation Environment (MAVEN) is described which proposes a context in which formal verification can fit into the industrial development of Ada software.
Combining Education and Work; Experiences in Asia and Oceania: Bangladesh.
ERIC Educational Resources Information Center
Dacca Univ., Bangladesh. Inst. of Education and Research.
Bangladesh stresses the importance of education responsive to the country's development needs and capable of producing, through formal or non-formal methods, skilled, employable manpower. Although no pre-vocational training exists, new curricula have introduced practical work experience in the primary schools and have integrated agriculture,…
A review of research on formal reasoning and science teaching
NASA Astrophysics Data System (ADS)
Lawson, Anton E.
A central purpose of education is to improve students' reasoning abilities. The present review examines research in developmental psychology and science education that has attempted to assess the validity of Piaget's theory of formal thought and its relation to educational practice. Should a central objective of schools be to help students become formal thinkers? To answer this question research has focused on the following subordinate questions: (1) What role does biological maturation play in the development of formal reasoning? (2) Are Piaget's formal tasks reliable and valid? (3) Does formal reasoning constitute a unified and general mode of intellectual functioning? (4) How does the presence or absence of formal reasoning affect school achievement? (5) Can formal reasoning be taught? (6) What is the structural or functional nature of advanced reasoning? The general conclusion drawn is that although Piaget's work and that which has sprung from it leaves a number of unresolved theoretical and methodological problems, it provides an important background from which to make substantial progress toward a most significant educational objective.All our dignity lies in thought. By thought we must elevate ourselves, not by space and time which we can not fill. Let us endeavor then to think well; therein lies the principle of morality. Blaise Pascal 1623-1662.
Formal Education, Eminence, and Dogmatism: The Curvilinear Relationship.
ERIC Educational Resources Information Center
Simonton, Dean Keith
The relationship between formal education and creativity was investigated in two studies. A reanalysis of Cox's (1926) 301 geniuses indicated that achieved eminence of creators is a curvilinear inverted-U function of formal education. Secondly, a study of 33 American presidents found that dogmatism (i.e., idealistic inflexibility) is a curvilinear…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Romano, J.D.; Woan, G.
Data from the Laser Interferometer Space Antenna (LISA) is expected to be dominated by frequency noise from its lasers. However, the noise from any one laser appears more than once in the data and there are combinations of the data that are insensitive to this noise. These combinations, called time delay interferometry (TDI) variables, have received careful study and point the way to how LISA data analysis may be performed. Here we approach the problem from the direction of statistical inference, and show that these variables are a direct consequence of a principal component analysis of the problem. We present a formal analysis for a simple LISA model and show that there are eigenvectors of the noise covariance matrix that do not depend on laser frequency noise. Importantly, these orthogonal basis vectors correspond to linear combinations of TDI variables. As a result we show that the likelihood function for source parameters using LISA data can be based on TDI combinations of the data without loss of information.
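A toy numerical sketch of the principal-component argument (illustrative numbers only, not the LISA noise model of the paper): when one noise source dominates and enters several data streams through a common direction, the small-eigenvalue eigenvectors of the noise covariance matrix define data combinations free of that noise.

    import numpy as np

    # Three data streams share one dominant "laser" noise component along v,
    # plus small independent secondary noise (all values invented).
    sigma_laser, sigma_sec = 100.0, 1.0
    v = np.ones(3) / np.sqrt(3)
    C = sigma_laser**2 * np.outer(v, v) + sigma_sec**2 * np.eye(3)

    evals, evecs = np.linalg.eigh(C)        # eigenvalues in ascending order
    # The two smallest-eigenvalue eigenvectors are orthogonal to v, i.e. these
    # linear data combinations carry no laser noise (analogue of TDI variables).
    print(evals)
    print(np.abs(evecs[:, :2].T @ v))       # ~0: no laser-noise content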
Spectral likelihood expansions for Bayesian inference
NASA Astrophysics Data System (ADS)
Nagel, Joseph B.; Sudret, Bruno
2016-03-01
A spectral approach to Bayesian inference is presented. It pursues the emulation of the posterior probability density. The starting point is a series expansion of the likelihood function in terms of orthogonal polynomials. From this spectral likelihood expansion all statistical quantities of interest can be calculated semi-analytically. The posterior is formally represented as the product of a reference density and a linear combination of polynomial basis functions. Both the model evidence and the posterior moments are related to the expansion coefficients. This formulation avoids Markov chain Monte Carlo simulation and allows one to make use of linear least squares instead. The pros and cons of spectral Bayesian inference are discussed and demonstrated on the basis of simple applications from classical statistics and inverse modeling.
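In compact notation (ours, for illustration), with polynomials \psi_k orthonormal with respect to the reference density \pi(\theta) (taken here to be the prior) and \psi_0 \equiv 1,

    \mathcal{L}(\theta) \approx \sum_k a_k\, \psi_k(\theta), \qquad \pi(\theta \mid d) \propto \pi(\theta)\, \mathcal{L}(\theta), \qquad Z = \int \mathcal{L}(\theta)\, \pi(\theta)\, d\theta = a_0,

so that once the coefficients a_k have been fitted by linear least squares, the evidence and the posterior moments follow from simple algebra on the a_k.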
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tabacchi, G; Hutter, J; Mundy, C
2005-04-07
A combined linear response--frozen electron density model has been implemented in a molecular dynamics scheme derived from an extended Lagrangian formalism. This approach is based on a partition of the electronic charge distribution into a frozen region described by Kim-Gordon theory, and a response contribution determined by the instantaneous ionic configuration of the system. The method is free from empirical pair-potentials and the parameterization protocol involves only calculations on properly chosen subsystems. They apply this method to a series of alkali halides in different physical phases and are able to reproduce experimental structural and thermodynamic properties with an accuracy comparable to Kohn-Sham density functional calculations.
First-principles investigation on transport properties of NiO monowire-based molecular device
NASA Astrophysics Data System (ADS)
Chandiramouli, R.; Sriram, S.
2014-08-01
The electronic transport properties of a novel NiO monowire connected to gold electrodes are investigated using density functional theory combined with the nonequilibrium Green's function formalism. The densities of states of the monowire under various bias conditions are discussed. The transport properties are discussed in terms of the transmission spectrum and current-voltage characteristics of the NiO monowire. The transmission pathways provide insight into the transmission of electrons along the monowire. Under different bias voltages, a current on the order of a few microamperes flows across the monowire. The applied voltage controls the flow of current through the monowire, which allows the current in this low range of magnitudes to be controlled efficiently in the molecular device.
Thermoelectric transport properties of Ti doped/adsorbed monolayer blue phosphorene.
Zhu, Lin; Li, Bowen; Yao, Kailun
2018-08-10
The thermoelectric transport properties of Ti-doped or Ti-adsorbed monolayer blue phosphorene are investigated by density functional theory combined with the nonequilibrium Green's function formalism. A thermal giant magnetoresistance and a nearly 100% spin polarization, which rely solely on the temperature gradient between the electrodes without a bias or gate voltage, are observed. Moreover, the spin Seebeck effect is also found. Furthermore, taking into account the electronic and phonon dispersions, the thermoelectric figure of merit for Ti doping in monolayer blue phosphorene at room temperature is also studied; its maximum value can reach 1.01 near the Fermi level. The results indicate that Ti-doped or Ti-adsorbed monolayer blue phosphorene has potential applications in both spintronics and spin caloritronics.
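For reference, the dimensionless figure of merit evaluated in such studies is (standard definition)

    ZT = \frac{S^2 \sigma T}{\kappa_e + \kappa_{ph}},

with Seebeck coefficient S, electrical conductance \sigma, temperature T, and electronic and phononic thermal conductances \kappa_e and \kappa_{ph}; the phonon dispersion mentioned above enters through \kappa_{ph}.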
Unraveling hadron structure with generalized parton distributions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrei Belitsky; Anatoly Radyushkin
2004-10-01
The recently introduced generalized parton distributions have emerged as a universal tool to describe hadrons in terms of quark and gluonic degrees of freedom. They combine the features of form factors, parton densities and distribution amplitudes - the functions used for a long time in studies of hadronic structure. Generalized parton distributions are analogous to the phase-space Wigner quasi-probability function of non-relativistic quantum mechanics which encodes full information on a quantum-mechanical system. We give an extensive review of main achievements in the development of this formalism. We discuss physical interpretation and basic properties of generalized parton distributions, their modeling and QCD evolution in the leading and next-to-leading orders. We describe how these functions enter a wide class of exclusive reactions, such as electro- and photo-production of photons, lepton pairs, or mesons.
A new line-of-sight approach to the non-linear Cosmic Microwave Background
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fidler, Christian; Koyama, Kazuya; Pettinari, Guido W., E-mail: christian.fidler@port.ac.uk, E-mail: kazuya.koyama@port.ac.uk, E-mail: guido.pettinari@gmail.com
2015-04-01
We develop the transport operator formalism, a new line-of-sight integration framework to calculate the anisotropies of the Cosmic Microwave Background (CMB) at the linear and non-linear level. This formalism utilises a transformation operator that removes all inhomogeneous propagation effects acting on the photon distribution function, thus achieving a split between perturbative collisional effects at recombination and non-perturbative line-of-sight effects at later times. The former can be computed in the framework of standard cosmological perturbation theory with a second-order Boltzmann code such as SONG, while the latter can be treated within a separate perturbative scheme allowing the use of non-linear Newtonian potentials. We thus provide a consistent framework to compute all physical effects contained in the Boltzmann equation and to combine the standard remapping approach with Boltzmann codes at any order in perturbation theory, without assuming that all sources are localised at recombination.
The Nature and Origin of Time-Asymmetric Spacetime Structures
NASA Astrophysics Data System (ADS)
Zeh, H. Dieter
Time-asymmetric spacetime structures, in particular those representing black holes and the expansion of the universe, are intimately related to other arrows of time, such as the second law and the retardation of radiation. The nature of the quantum arrow, often attributed to a collapse of the wave function, is essential, in particular, for understanding the much discussed black hole information loss paradox. This paradox assumes a new form and can possibly be avoided in a consistent causal treatment that may be able to avoid horizons and singularities. The master arrow that would combine all arrows of time does not have to be identified with a direction of the formal time parameter that serves to formulate the dynamics as a succession of global states (a trajectory in configuration or Hilbert space). It may even change direction with respect to a fundamental physical clock such as the cosmic expansion parameter if this was formally extended either into a future contraction era or to negative pre-big-bang values.
NASA Astrophysics Data System (ADS)
Collart, T. G.; Stacey, W. M.
2015-11-01
Several methods are presented for extending the traditional analytic "circular" representation of flux-surface aligned curvilinear coordinate systems to more accurately describe equilibrium plasma geometry and magnetic fields in DIII-D. The formalism originally presented by Miller is extended to include different poloidal variations in the upper and lower hemispheres. A coordinate system based on separate Fourier expansions of major radius and vertical position greatly improves accuracy in edge plasma structure representation. Scale factors and basis vectors for a system formed by expanding the circular model minor radius can be represented using linear combinations of Fourier basis functions. A general method for coordinate system orthogonalization is presented and applied to all curvilinear models. A formalism for the magnetic field structure in these curvilinear models is presented, and the resulting magnetic field predictions are compared against calculations performed in a Cartesian system using an experimentally based EFIT prediction for the Grad-Shafranov equilibrium. Supported by: US DOE under DE-FG02-00ER54538.
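A minimal sketch of the kind of Fourier-expanded flux-surface parametrization described above (the truncation order and coefficient definitions are illustrative and may differ from the authors' conventions):

```latex
R(r,\theta) = R_0(r) + \sum_{n=1}^{N}\bigl[a_n(r)\cos n\theta + b_n(r)\sin n\theta\bigr],
\qquad
Z(r,\theta) = Z_0(r) + \sum_{n=1}^{N}\bigl[c_n(r)\cos n\theta + d_n(r)\sin n\theta\bigr]
```

with flux-surface label r and poloidal angle θ; keeping independent sine and cosine coefficients allows different poloidal shaping in the upper and lower hemispheres, as described above.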
Deformation quantization of fermi fields
DOE Office of Scientific and Technical Information (OSTI.GOV)
Galaviz, I.; Garcia-Compean, H.; Departamento de Fisica, Centro de Investigacion y de Estudios Avanzados del IPN, P.O. Box 14-740, 07000 Mexico, D.F.
2008-04-15
Deformation quantization for any Grassmann scalar free field is described via the Weyl-Wigner-Moyal formalism. The Stratonovich-Weyl quantizer, the Moyal *-product and the Wigner functional are obtained by extending the formalism proposed recently in [I. Galaviz, H. Garcia-Compean, M. Przanowski, F.J. Turrubiates, Weyl-Wigner-Moyal Formalism for Fermi Classical Systems, arXiv:hep-th/0612245] to the fermionic systems of infinite number of degrees of freedom. In particular, this formalism is applied to quantize the Dirac free field. It is observed that the use of suitable oscillator variables facilitates considerably the procedure. The Stratonovich-Weyl quantizer, the Moyal *-product, the Wigner functional, the normal ordering operator, and finally, the Dirac propagator have been found with the use of these variables.
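For orientation, the familiar bosonic Weyl-Wigner-Moyal star product has the form below; the construction described in the abstract generalizes this kind of product to Grassmann-valued fields, broadly speaking with the derivatives replaced by Grassmann derivatives.

```latex
(f \star g)(q,p) = f(q,p)\,
\exp\!\left[\frac{i\hbar}{2}\left(
\overleftarrow{\partial_q}\,\overrightarrow{\partial_p}
-\overleftarrow{\partial_p}\,\overrightarrow{\partial_q}\right)\right] g(q,p)
```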
Formal verification of an avionics microprocessor
NASA Technical Reports Server (NTRS)
Srivas, Mandayam, K.; Miller, Steven P.
1995-01-01
Formal specification combined with mechanical verification is a promising approach for achieving the extremely high levels of assurance required of safety-critical digital systems. However, many questions remain regarding their use in practice: Can these techniques scale up to industrial systems, where are they likely to be useful, and how should industry go about incorporating them into practice? This report discusses a project undertaken to answer some of these questions, the formal verification of the AAMP5 microprocessor. This project consisted of formally specifying in the PVS language a Rockwell proprietary microprocessor at both the instruction-set and register-transfer levels and using the PVS theorem prover to show that the microcode correctly implemented the instruction-level specification for a representative subset of instructions. Notable aspects of this project include the use of a formal specification language by practicing hardware and software engineers, the integration of traditional inspections with formal specifications, and the use of a mechanical theorem prover to verify a portion of a commercial, pipelined microprocessor that was not explicitly designed for formal verification.
Langevin dynamics for vector variables driven by multiplicative white noise: A functional formalism
NASA Astrophysics Data System (ADS)
Moreno, Miguel Vera; Arenas, Zochil González; Barci, Daniel G.
2015-04-01
We discuss general multidimensional stochastic processes driven by a system of Langevin equations with multiplicative white noise. In particular, we address the problem of how the time reversal of diffusion processes is affected by the variety of conventions available to deal with stochastic integrals. We present a functional formalism to build up the generating functional of correlation functions without any type of discretization of the Langevin equations at any intermediate step. The generating functional is characterized by a functional integration over two sets of commuting variables, as well as Grassmann variables. In this representation, the time reversal transformation becomes a linear transformation in the extended variables, thereby simplifying the complexity introduced by the mixture of prescriptions and the associated calculus rules. The stochastic calculus is codified in our formalism in the structure of the Grassmann algebra. We study some examples, such as higher order derivative Langevin equations and the functional representation of the micromagnetic stochastic Landau-Lifshitz-Gilbert equation.
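As a reminder of the setting (the notation here is generic rather than the authors'), a multidimensional Langevin system with multiplicative white noise reads

```latex
\dot{x}_i(t) = A_i(x) + B_{ij}(x)\,\eta_j(t),
\qquad
\langle \eta_j(t)\rangle = 0,\qquad
\langle \eta_j(t)\,\eta_k(t')\rangle = \delta_{jk}\,\delta(t-t')
```

and the stochastic integral must be supplemented by a discretization prescription (Itô, Stratonovich, or a general intermediate convention), which is precisely the ambiguity the functional formalism is designed to handle without discretizing.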
A systematic study of finite BRST-BFV transformations in generalized Hamiltonian formalism
NASA Astrophysics Data System (ADS)
Batalin, Igor A.; Lavrov, Peter M.; Tyutin, Igor V.
2014-09-01
We study systematically finite BRST-BFV transformations in the generalized Hamiltonian formalism. We present explicitly their Jacobians and the form of a solution to the compensation equation determining the functional field dependence of finite Fermionic parameters, necessary to generate an arbitrary finite change of gauge-fixing functions in the path integral.
Working the College System: Six Strategies for Building a Personal Powerbase
ERIC Educational Resources Information Center
Simplicio, Joseph S. C.
2008-01-01
Within each college system there are prescribed formalized methods for accomplishing tasks and achieving established goals. To truly understand how a college, or any large organization functions, it is vital to understand the basis of the formal structure. Those individuals who understand formal systems within a college can use this knowledge to…
The generation of gravitational waves. 1. Weak-field sources: A plug-in-and-grind formalism
NASA Technical Reports Server (NTRS)
Thorne, K. S.; Kovacs, S. J.
1974-01-01
A plug-in-and-grind formalism is derived for calculating the gravitational waves emitted by any system with weak internal gravitational fields. If the internal fields have negligible influence on the system's motions, then the formalism reduces to standard linearized theory. Whether or not gravity affects the motions, if the motions are slow and internal stresses are weak, then the new formalism reduces to the standard quadrupole-moment formalism. In the general case the new formalism expresses the radiation in terms of a retarded Green's function for slightly curved spacetime, and then breaks the Green's-function integral into five easily understood pieces: direct radiation, produced directly by the motions of the sources; whump radiation, produced by the gravitational stresses of the source; transition radiation, produced by a time-changing time delay (Shapiro effect) in the propagation of the nonradiative, 1/r field of the source; focusing radiation, produced when one portion of the source focuses, in a time-dependent way, the nonradiative field of another portion of the source; and tail radiation, produced by backscatter of the nonradiative field in regions of focusing.
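In the slow-motion, weak-stress limit mentioned above, the formalism reduces to the standard quadrupole result, which in schematic form (transverse-traceless gauge, source at distance r) reads

```latex
h^{TT}_{jk}(t,r) \simeq \frac{2G}{c^{4} r}\,\ddot{Q}_{jk}\!\left(t-\frac{r}{c}\right)
```

where Q_jk is the trace-free mass quadrupole moment of the source.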
A Deterministic Transport Code for Space Environment Electrons
NASA Technical Reports Server (NTRS)
Nealy, John E.; Chang, C. K.; Norman, Ryan B.; Blattnig, Steve R.; Badavi, Francis F.; Adamczyk, Anne M.
2010-01-01
A deterministic computational procedure has been developed to describe transport of space environment electrons in various shield media. This code is an upgrade and extension of an earlier electron code. Whereas the former code was formulated on the basis of parametric functions derived from limited laboratory data, the present code utilizes well established theoretical representations to describe the relevant interactions and transport processes. The shield material specification has been made more general, as have the pertinent cross sections. A combined mean free path and average trajectory approach has been used in the transport formalism. Comparisons with Monte Carlo calculations are presented.
Analysis of Yb3+/Er3+-codoped microring resonator cross-grid matrices
NASA Astrophysics Data System (ADS)
Vallés, Juan A.; Gǎlǎtuş, Ramona
2014-09-01
An analytic model of the scattering response of a highly Yb3+/Er3+-codoped phosphate glass microring resonator matrix is considered to obtain the transfer functions of an M x N cross-grid microring resonator structure. A detailed model, including a microscopic statistical formalism that describes the energy-transfer mechanisms induced at high dopant concentrations, is then used to calculate the pump and signal propagation; passive and active features are combined to realistically simulate the performance as a wavelength-selective amplifier or laser. This analysis allows the optimization of these structures for telecom or sensing applications.
Contrasting Cognitive Effects of Formal and Informal Education.
ERIC Educational Resources Information Center
Lave, Jean
This study of informal education examines traditional tailors' apprenticeship training in Liberia. The purpose is to compare and contrast a form of informal education with formal schooling. An examination was made of a group of one hundred tailors having all combinations of tailoring experience, from none to thirty years, and schooling, from none…
Goldberg, Wendy A; Prause, Joann; Lucas-Thompson, Rachel; Himsel, Amy
2008-01-01
This meta-analysis of 68 studies (770 effect sizes) used random effects models to examine whether children's achievement differed depending on whether their mothers were employed. Four achievement outcomes were emphasized: formal tests of achievement and intellectual functioning, grades, and teacher ratings of cognitive competence. When all employment was compared with nonemployment for combined and separate achievement outcomes without moderators, effects were nonsignificant. Small beneficial effects of part-time compared with full-time employment were apparent for all achievement outcomes combined and for each individual achievement outcome. Significant sample-level moderators of the associations between maternal employment and achievement for all outcomes combined included family structure, race/ethnicity, and socioeconomic status; associations were positive when samples were majority 1-parent families and mixed 1- and 2-parent families, racially/ethnically diverse or international in composition, and not middle-upper class. Analyses of child gender indicated more positive effects for girls. Children's age was a significant moderator for the outcome of intellectual functioning. The identification of sample-level moderators of the relationship between maternal employment and children's achievement highlights the importance of social context in understanding work-family linkages. Copyright (c) 2008 APA.
NASA Astrophysics Data System (ADS)
Batalin, Igor A.; Lavrov, Peter M.; Tyutin, Igor V.
2014-09-01
We study systematically finite BRST-BFV transformations in Sp(2)-extended generalized Hamiltonian formalism. We present explicitly their Jacobians and the form of a solution to the compensation equation determining the functional field dependence of finite Fermionic parameters, necessary to generate arbitrary finite change of gauge-fixing functions in the path integral.
Application of the Extended Completeness Relation to the Absorbing Boundary Condition
NASA Astrophysics Data System (ADS)
Iwasaki, Masataka; Otani, Reiji; Ito, Makoto
The strength function of the linear response to the external field is calculated in the formalism of the absorbing boundary condition (ABC). The dipole excitation of a schematic two-body system is treated in the present study. The extended completeness relation, which is assumed by analogy with the formulation in the complex scaling method (CSM), is applied to the calculation of the strength function. The calculation of the strength function is successful in the present formalism; hence, the extended completeness relation seems to work well in the ABC formalism. The contributions from the resonance and the non-resonant continuum are also analyzed according to the decomposition of the energy levels in the extended completeness relation.
Patel, N S; Chiu-Tsao, S T; Tsao, H S; Harrison, L B
2001-01-01
Intravascular brachytherapy (IVBT) is an emerging modality for the treatment of atherosclerotic lesions in the artery. As part of the refinement of this rapidly evolving modality of treatment, the current simplistic dosimetry approach based on a fixed-point prescription must be challenged by a future rigorous dosimetry method employing image-based three-dimensional (3D) treatment planning. The goals of 3D IVBT treatment planning calculations include (1) achieving high accuracy in a slim cylindrical region of interest, (2) accounting for the edge effect around the source ends, and (3) supporting multiple dwell positions. The formalism recommended by Task Group 60 (TG-60) of the American Association of Physicists in Medicine (AAPM) is applicable for gamma sources, as well as short beta sources with lengths less than twice the beta particle range. However, for elongated beta sources and/or seed trains with lengths greater than twice the beta range, a new formalism is required to handle their distinctly different dose characteristics. Specifically, these characteristics consist of (a) flat isodose curves in the central region, (b) a steep dose gradient at the source ends, and (c) exponential dose fall-off in the radial direction. In this paper, we present a novel formalism that evolved from TG-60 in maintaining the dose rate as a product of four key quantities. We propose to employ cylindrical coordinates (R, Z, phi), which are more natural and suitable to the slim cylindrical shape of the volume of interest, as opposed to the spherical coordinate system (r, theta, phi) used in the TG-60 formalism. The four quantities used in this formalism are (1) the distribution factor, H(R, Z), (2) the modulation function, M(R, Z), (3) the transverse dose function, h(R), and (4) the reference dose rate at 2 mm along the perpendicular bisector, D(R0=2 mm, Z0=0). The first three are counterparts of the geometry factor, the anisotropy function, and the radial dose function in the TG-60 formalism, respectively. The reference dose rate is identical to that recommended by TG-60. The distribution factor is intended to resemble the dose profile due to the spatial distribution of activity in the elongated beta source, and it is a modified Fermi-Dirac function in mathematical form. The utility of this formalism also includes the slowly varying nature of the modulation function, allowing for more accurate treatment planning calculations based on interpolation. The transverse dose function describes the exponential fall-off of the dose in the radial direction and can be fit by an exponential or a polynomial. At the same time, the decoupled nature of these dose-related quantities facilitates image-based 3D treatment planning calculations for long beta sources used in IVBT. The new formalism also supports the dosimetry involving multiple dwell positions required for lesions longer than the source length. An example of the utilization of this formalism is illustrated for a 90Y coil source in a carbon dioxide-filled balloon. The pertinent dosimetric parameters were generated and tabulated for future use.
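Collecting the four quantities listed above, the proposed dose-rate decomposition can be written schematically in cylindrical coordinates as

```latex
\dot{D}(R,Z) = \dot{D}(R_0\!=\!2\,\mathrm{mm},\,Z_0\!=\!0)\; H(R,Z)\; M(R,Z)\; h(R)
```

mirroring the TG-60 product of reference dose rate, geometry factor, anisotropy function, and radial dose function; the exact grouping used by the authors may differ in detail.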
The generation of gravitational waves. 2: The post-linear formalism revisited
NASA Technical Reports Server (NTRS)
Crowley, R. J.; Thorne, K. S.
1975-01-01
Two different versions of the Green's function for the scalar wave equation in weakly curved spacetime (one due to DeWitt and DeWitt, the other to Thorne and Kovacs) are compared and contrasted; and their mathematical equivalence is demonstrated. The DeWitt-DeWitt Green's function is used to construct several alternative versions of the Thorne-Kovacs post-linear formalism for gravitational-wave generation. Finally it is shown that, in calculations of gravitational bremsstrahlung radiation, some of our versions of the post-linear formalism allow one to treat the interacting bodies as point masses, while others do not.
Mitsos, Alexander; Melas, Ioannis N; Morris, Melody K; Saez-Rodriguez, Julio; Lauffenburger, Douglas A; Alexopoulos, Leonidas G
2012-01-01
Modeling of signal transduction pathways plays a major role in understanding cells' function and predicting cellular response. Mathematical formalisms based on logic are relatively simple but can describe how signals propagate from one protein to the next, and they have led to the construction of models that simulate the cells' response to environmental or other perturbations. Constrained fuzzy logic was recently introduced to train models to cell-specific data, resulting in quantitative pathway models of the specific cellular behavior. There are two major issues in this pathway optimization: i) excessive CPU time requirements and ii) a loosely constrained optimization problem due to lack of data with respect to large signaling pathways. Herein, we address both issues: the former by reformulating the pathway optimization as a regular nonlinear optimization problem; and the latter by enhanced algorithms to pre/post-process the signaling network to remove parts that cannot be identified given the experimental conditions. As a case study, we tackle the construction of cell type specific pathways in normal and transformed hepatocytes using medium and large-scale functional phosphoproteomic datasets. The proposed Non Linear Programming (NLP) formulation allows for fast optimization of signaling topologies by combining the versatile nature of logic modeling with state of the art optimization algorithms.
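A toy illustration of casting a logic-based pathway model as a continuous nonlinear program is sketched below. The transfer function, the two-input OR topology, and the data are invented for illustration and are not the models, software, or datasets of the study.

```python
import numpy as np
from scipy.optimize import minimize

def hill(x, n, k):
    """Normalized Hill-style transfer function mapping [0, 1] to [0, 1]."""
    return (x ** n / (x ** n + k ** n)) * (1.0 + k ** n)

def model(params, s1, s2):
    """Toy node activated by OR(f(s1), f(s2)), each edge with its own (n, k)."""
    n1, k1, n2, k2 = params
    a, b = hill(s1, n1, k1), hill(s2, n2, k2)
    return a + b - a * b          # continuous relaxation of logical OR

# Invented experimental design (two stimuli on/off) and measured node activities.
s1 = np.array([0.0, 1.0, 0.0, 1.0])
s2 = np.array([0.0, 0.0, 1.0, 1.0])
measured = np.array([0.05, 0.70, 0.60, 0.90])

def objective(params):
    return np.sum((model(params, s1, s2) - measured) ** 2)

res = minimize(objective, x0=[2.0, 0.5, 2.0, 0.5],
               bounds=[(1.0, 4.0), (0.1, 1.0)] * 2, method="L-BFGS-B")
print("fitted (n1, k1, n2, k2):", np.round(res.x, 2), " SSE:", round(float(res.fun), 4))
```

The point of the continuous relaxation (Hill-type edges, a + b - ab for OR) is that standard gradient-based NLP solvers become applicable to what would otherwise be a discrete logic-fitting problem.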
Dominant partition method. [based on a wave function formalism]
NASA Technical Reports Server (NTRS)
Dixon, R. M.; Redish, E. F.
1979-01-01
By use of the L'Huillier, Redish, and Tandy (LRT) wave function formalism, a partially connected method, the dominant partition method (DPM) is developed for obtaining few body reductions of the many body problem in the LRT and Bencze, Redish, and Sloan (BRS) formalisms. The DPM maps the many body problem to a fewer body one by using the criterion that the truncated formalism must be such that consistency with the full Schroedinger equation is preserved. The DPM is based on a class of new forms for the irreducible cluster potential, which is introduced in the LRT formalism. Connectivity is maintained with respect to all partitions containing a given partition, which is referred to as the dominant partition. Degrees of freedom corresponding to the breakup of one or more of the clusters of the dominant partition are treated in a disconnected manner. This approach for simplifying the complicated BRS equations is appropriate for physical problems where a few body reaction mechanism prevails.
Formal development of a clock synchronization circuit
NASA Technical Reports Server (NTRS)
Miner, Paul S.
1995-01-01
This talk presents the latest stage in the formal development of a fault-tolerant clock synchronization circuit. The development spans from a high level specification of the required properties to a circuit realizing the core function of the system. An abstract description of an algorithm has been verified to satisfy the high-level properties using the mechanical verification system EHDM. This abstract description is recast as a behavioral specification input to the Digital Design Derivation system (DDD) developed at Indiana University. DDD provides a formal design algebra for developing correct digital hardware. Using DDD as the principal design environment, a core circuit implementing the clock synchronization algorithm was developed. The design process consisted of standard DDD transformations augmented with an ad hoc refinement justified using the Prototype Verification System (PVS) from SRI International. Subsequent to the above development, Wilfredo Torres-Pomales discovered an area-efficient realization of the same function. Establishing correctness of this optimization requires reasoning in arithmetic, so a general verification is outside the domain of both DDD transformations and model-checking techniques. DDD represents digital hardware by systems of mutually recursive stream equations. A collection of PVS theories was developed to aid in reasoning about DDD-style streams. These theories include a combinator for defining streams that satisfy stream equations, and a means for proving stream equivalence by exhibiting a stream bisimulation. DDD was used to isolate the sub-system involved in Torres-Pomales' optimization. The equivalence between the original design and the optimized version was then verified in PVS by exhibiting a suitable bisimulation. The verification depended upon type constraints on the input streams and made extensive use of the PVS type system. The dependent types in PVS provided a useful mechanism for defining an appropriate bisimulation.
Van der Fels-Klerx, Ine H J; Goossens, Louis H J; Saatkamp, Helmut W; Horst, Suzan H S
2002-02-01
This paper presents a protocol for a formal expert judgment process using a heterogeneous expert panel aimed at the quantification of continuous variables. The emphasis is on the process's requirements related to the nature of expertise within the panel, in particular the heterogeneity of both substantive and normative expertise. The process provides the opportunity for interaction among the experts so that they fully understand and agree upon the problem at hand, including qualitative aspects relevant to the variables of interest, prior to the actual quantification task. Individual experts' assessments on the variables of interest, cast in the form of subjective probability density functions, are elicited with a minimal demand for normative expertise. The individual experts' assessments are aggregated into a single probability density function per variable, thereby weighting the experts according to their expertise. Elicitation techniques proposed include the Delphi technique for the qualitative assessment task and the ELI method for the actual quantitative assessment task. Appropriately, the Classical model was used to weight the experts' assessments in order to construct a single distribution per variable. Applying this model, the experts' quality typically was based on their performance on seed variables. An application of the proposed protocol in the broad and multidisciplinary field of animal health is presented. Results of this expert judgment process showed that the proposed protocol in combination with the proposed elicitation and analysis techniques resulted in valid data on the (continuous) variables of interest. In conclusion, the proposed protocol for a formal expert judgment process aimed at the elicitation of quantitative data from a heterogeneous expert panel provided satisfactory results. Hence, this protocol might be useful for expert judgment studies in other broad and/or multidisciplinary fields of interest.
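As an illustration of the final aggregation step, a performance-weighted linear pool of the experts' densities can be computed as sketched below. The densities, weights, and grid are hypothetical; in the Classical model referred to above, the weights would be derived from calibration and information scores on the seed variables.

```python
import numpy as np

# Common grid for one continuous variable of interest.
x = np.linspace(0.0, 10.0, 501)
dx = x[1] - x[0]

def triangular_pdf(lo, mode, hi):
    """A simple subjective density built from an expert's low/central/high judgments."""
    pdf = np.where(x < mode,
                   2 * (x - lo) / ((hi - lo) * (mode - lo)),
                   2 * (hi - x) / ((hi - lo) * (hi - mode)))
    pdf = np.clip(pdf, 0.0, None)          # zero outside the expert's range
    return pdf / (pdf.sum() * dx)          # normalize to unit area

expert_pdfs = [triangular_pdf(1, 3, 6), triangular_pdf(2, 5, 9), triangular_pdf(0, 4, 7)]
weights = np.array([0.5, 0.3, 0.2])        # hypothetical performance-based weights

combined = sum(w * p for w, p in zip(weights, expert_pdfs))
combined /= combined.sum() * dx            # renormalize the linear pool

print("combined mean estimate:", (x * combined).sum() * dx)
```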
A formalism for the calculus of variations with spinors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bäckdahl, Thomas, E-mail: thobac@chalmers.se; Valiente Kroon, Juan A., E-mail: j.a.valiente-kroon@qmul.ac.uk
2016-02-15
We develop a frame and dyad gauge-independent formalism for the calculus of variations of functionals involving spinorial objects. As a part of this formalism, we define a modified variation operator which absorbs frame and spin dyad gauge terms. This formalism is applicable to both the standard spacetime (i.e., SL(2, ℂ)) 2-spinors as well as to space (i.e., SU(2, ℂ)) 2-spinors. We compute expressions for the variations of the connection and the curvature spinors.
ERIC Educational Resources Information Center
Seltman, Muriel; Seltman, P. E. J.
1978-01-01
The authors stress the importance of bringing together the causal logic of history and the formal logic of mathematics in order to humanize mathematics and make it more accessible. An example of such treatment is given in a discussion of the centrality of Euclid and the Euclidean system to mathematics development. (MN)
ERIC Educational Resources Information Center
Arena, Dylan A.; Schwartz, Daniel L.
2014-01-01
Well-designed digital games can deliver powerful experiences that are difficult to provide through traditional instruction, while traditional instruction can deliver formal explanations that are not a natural fit for gameplay. Combined, they can accomplish more than either can alone. An experiment tested this claim using the topic of statistics,…
The Role of the Board of Education in the Process of Resource Allocation for Public Schools.
ERIC Educational Resources Information Center
Chichura, Elaine Marie
Public schools as formal organizations have broad-based goals, limited resources, and a formal hierarchy with which to manage the goal achievement process. The board of education combines this organization's economic and political dimensions to provide a thorough, efficient education for all children in the state. This paper investigates the…
ERIC Educational Resources Information Center
Burns, Janet Z.; Schaefer, Karen; Hayden, Jessie M.
2005-01-01
Trade and industrial (T&I) teachers enter the classroom as content level experts who may have acquired their content expertise through a combination of formal industry training and informal on-the-job experiences. When they make the career transition from industry to teaching, they must acquire professional teaching competencies. Like the content…
Non-Formal Education and Civil Society in Post-Soviet Russia: What Is the Relationship?
ERIC Educational Resources Information Center
Morgan, W. John; Kliucharev, Grigori A.
2011-01-01
The article describes collaborative research into the relationship between non-formal education and civil society in post-Soviet Russia. It shows how through social survey data and case studies of non-governmental organisations (NGOs) and other civil society organisations (CSOs), using a combination of social science perspectives, much can be…
NASA Astrophysics Data System (ADS)
Rumpfhuber, E.; Keller, G. R.; Velasco, A. A.
2005-12-01
Many large-scale experiments conduct both controlled-source and passive deployments to investigate the lithospheric structure of a targeted region. Many of these studies utilize each data set independently, resulting in different images of the Earth depending on the data set investigated. In general, formal integration of these data sets, such as joint inversions, with other data has not been performed. The CD-ROM experiment, which included both 2-D controlled-source and passive recording along a profile extending from southern Wyoming to northern New Mexico, serves as an excellent data set to develop a formal integration strategy between controlled-source and passive experiments. These data are ideal for developing this strategy because: 1) the analysis of refraction/wide-angle reflection data yields the Vp structure, and sometimes the Vs structure, of the crust and uppermost mantle; 2) analysis of the PmP phase (Moho reflection) yields estimates of the average Vp of the crust; and 3) receiver functions contain full-crustal reverberations and yield the Vp/Vs ratio, but do not constrain the absolute P and S velocity. Thus, a simple form of integration involves using the Vp/Vs ratio from receiver functions and the average Vp from refraction measurements to solve for the average Vs of the crust. When refraction/wide-angle reflection data and several receiver functions nearby are available, an integrated 2-D model can be derived. In receiver functions, the PS conversion gives the S-wave travel time (ts) through the crust along the ray path traveled from the Moho to the surface. Since the receiver function crustal reverberation gives the Vp/Vs ratio, it is also possible to use the arrival time of the converted phase, PS, to solve for the travel time of the direct teleseismic P-wave through the crust along the ray path. Raytracing can yield the point where the teleseismic wave intersects the Moho. In this approach, the conversion point is essentially a pseudo-shotpoint, and thus the converted arrival at the surface can be jointly modeled with refraction data using a 3-D inversion code. Employing the combined CD-ROM data sets, we will be investigating the joint inversion results of controlled-source data and receiver functions.
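The "simple form of integration" described above amounts to an elementary relation; the numbers shown are illustrative only, not values from the CD-ROM study.

```latex
\bar{V}_s = \frac{\bar{V}_p}{(V_p/V_s)_{\mathrm{RF}}}\,,
\qquad \text{e.g.}\quad
\bar{V}_p = 6.4~\mathrm{km/s},\ \ (V_p/V_s)_{\mathrm{RF}} = 1.75
\ \Rightarrow\ \bar{V}_s \approx 3.66~\mathrm{km/s}
```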
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berdiyorov, G. R., E-mail: gberdiyorov@qf.org.qa; El-Mellouhi, F.; Madjet, M. E.
Density functional theory in combination with the nonequilibrium Green's function formalism is used to study the electronic transport properties of the methylammonium lead-iodide perovskite CH3NH3PbI3. Electronic transport in the homogeneous ferroelectric and antiferroelectric phases, both of which do not contain any charged domain walls, is quite similar. The presence of a charged domain wall drastically (by about an order of magnitude) enhances the electronic transport in the lateral direction. The increase of the transmission originates from the smaller variation of the electrostatic potential profile along the charged domain walls. This fact may provide a tool for tuning the transport properties of such hybrid materials by manipulating molecular cations having a dipole moment.
Rectification induced in N2AA-doped armchair graphene nanoribbon device
NASA Astrophysics Data System (ADS)
Chen, Tong; Li, Xiao-Fei; Wang, Ling-Ling; Luo, Kai-Wu; Xu, Liang
2014-07-01
By using the non-equilibrium Green's function formalism in combination with density functional theory, we investigated the electronic transport properties of armchair graphene nanoribbon devices in which one lead is undoped and the other is N2AA-doped, with two quasi-adjacent substitutional nitrogen atoms occupying a pair of neighboring carbon sites in the same sublattice A. Two kinds of N2AA doping are considered, with the N dopants substituting either the center or the edge carbon atoms. Our results show that rectification behavior with a large rectifying ratio can be found in these devices and that the rectifying characteristics can be modulated by changing the width of the graphene nanoribbons or the position of the N2AA dopant. The mechanisms explaining the rectifying behavior are revealed.
Deformation quantizations with separation of variables on a Kähler manifold
NASA Astrophysics Data System (ADS)
Karabegov, Alexander V.
1996-10-01
We give a simple geometric description of all formal differentiable deformation quantizations on a Kähler manifold M such that for each open subset U⊂ M ⋆-multiplication from the left by a holomorphic function and from the right by an antiholomorphic function on U coincides with the pointwise multiplication by these functions. We show that these quantizations are in 1-1 correspondence with the formal deformations of the original Kähler metrics on M.
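The defining property stated above can be written compactly: for every open subset U ⊂ M, every holomorphic function a and antiholomorphic function b̄ on U, and every smooth function f,

```latex
a \star f = a\,f,
\qquad
f \star \bar{b} = f\,\bar{b}
```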
Developing a Treatment Planning Software Based on TG-43U1 Formalism for Cs-137 LDR Brachytherapy.
Sina, Sedigheh; Faghihi, Reza; Soleimani Meigooni, Ali; Siavashpour, Zahra; Mosleh-Shirazi, Mohammad Amin
2013-08-01
The old treatment planning systems (TPSs) used for intracavitary brachytherapy with the Cs-137 Selectron source utilize traditional dose calculation methods, considering each source as a point source. Using such methods introduces significant errors in dose estimation. Since 1995, TG-43 has been used as the main dose calculation formalism in TPSs. The purpose of this study is to design and establish treatment planning software for the Cs-137 Selectron brachytherapy source, based on the TG-43U1 formalism and accounting for the effects of the applicator and dummy spacers. The two software packages used for treatment planning of Cs-137 sources in Iran (STPS and PLATO) are based on the old formalisms; this work therefore establishes and develops a TPS for the Selectron source based on the TG-43 formalism. In this planning system, the dosimetry parameters of each pellet at different positions inside the applicators were obtained with the MCNP4c code. The dose distribution around every combination of active and inactive pellets was then obtained by summing the doses. The accuracy of this algorithm was checked by comparing its results for a specific combination of active and inactive pellets with MC simulations. Finally, the uncertainty of the old dose calculation formalisms was investigated by comparing the results of the STPS and PLATO software with those obtained by the new algorithm. For a typical arrangement of 10 active pellets in the applicator, the percentage difference between doses obtained by the new algorithm at 1 cm distance from the tip of the applicator and those obtained by the old formalisms is about 30%, while the difference between the results of MCNP and the new algorithm is less than 5%. According to the results, the old dosimetry formalisms overestimate the dose, especially towards the applicator's tip, while the TG-43U1-based software performs the calculations more accurately.
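For reference, the general 2D dose-rate equation of the AAPM TG-43/TG-43U1 formalism on which such a planning system is built has the standard form

```latex
\dot{D}(r,\theta) = S_K\,\Lambda\,
\frac{G_L(r,\theta)}{G_L(r_0,\theta_0)}\; g_L(r)\; F(r,\theta)
```

with air-kerma strength S_K, dose-rate constant Λ, geometry function G_L, radial dose function g_L, and 2D anisotropy function F, evaluated relative to the reference point (r_0 = 1 cm, θ_0 = 90°); the applicator and dummy-spacer effects described above are additional corrections beyond this source-level formalism.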
ERIC Educational Resources Information Center
Purse, Katie; Gardner, Hilary
2013-01-01
This study aimed to consider collaborative practice in contributing to joint assessment and producing appropriate referral of children to speech and language therapy (SLT). Results of formal testing of selected comprehension skills are compared with functional/classroom performance as rated by class teachers. Thirty children aged 6.5-8.4 years,…
Double Parton Fragmentation Function and its Evolution in Quarkonium Production
NASA Astrophysics Data System (ADS)
Kang, Zhong-Bo
2014-01-01
We summarize the results of a recent study of a new perturbative QCD factorization formalism for the production of heavy quarkonia of large transverse momentum pT at collider energies. The new factorization formalism includes both the leading power (LP) and next-to-leading power (NLP) contributions to the cross section in the m_Q^2/p_T^2 expansion, where m_Q is the heavy quark mass. The NLP contribution involves the so-called double parton fragmentation functions, whose evolution equations have been derived. We estimate the fragmentation functions in the non-relativistic QCD formalism and find that their contribution reproduces the bulk of the large enhancement found in explicit NLO calculations in the color singlet model. Heavy quarkonia produced from NLP channels prefer longitudinal polarization, in contrast to those produced from the single parton fragmentation function. This might shed some light on the heavy quarkonium polarization puzzle.
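Schematically, and with conventions simplified, the factorized cross section described here combines a single-parton (LP) channel with heavy-quark-pair (NLP) channels:

```latex
d\sigma_{A+B\to H+X} \simeq
\sum_{f} d\hat{\sigma}_{f} \otimes D_{f\to H}(z;\mu)
+ \sum_{[Q\bar{Q}(\kappa)]} d\hat{\sigma}_{[Q\bar{Q}(\kappa)]}
  \otimes \mathcal{D}_{[Q\bar{Q}(\kappa)]\to H}(z,\zeta_1,\zeta_2;\mu)
+ \mathcal{O}\!\left(\frac{m_Q^4}{p_T^4}\right)
```

where the second sum runs over heavy-quark-pair states and the calligraphic D are the double parton fragmentation functions.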
Coarse-grained hydrodynamics from correlation functions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palmer, Bruce
This paper describes a formalism for using correlation functions between different grid cells as the basis for determining coarse-grained hydrodynamic equations for modeling the behavior of mesoscopic fluid systems. Configurations from a molecular dynamics simulation are projected onto basis functions representing grid cells in a continuum hydrodynamic simulation. Equilibrium correlation functions between different grid cells are evaluated from the molecular simulation and used to determine the evolution operator for the coarse-grained hydrodynamic system. The formalism is applied to some simple hydrodynamic cases to determine the feasibility of applying this to realistic nanoscale systems.
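A minimal sketch of the projection-plus-correlation step described above, using synthetic particle data in place of a real MD trajectory (the grid, observable, and estimator are illustrative assumptions, not the formalism's actual evolution-operator construction):

```python
import numpy as np

rng = np.random.default_rng(0)
n_frames, n_particles, box, n_cells = 200, 500, 10.0, 5

# Synthetic MD-like trajectory: 1D particle positions and velocities per frame.
positions  = rng.uniform(0.0, box, size=(n_frames, n_particles))
velocities = rng.normal(0.0, 1.0, size=(n_frames, n_particles))

# Project the particle momenta onto piecewise-constant basis functions (grid cells).
cell_index = np.minimum((positions / box * n_cells).astype(int), n_cells - 1)
momentum = np.zeros((n_frames, n_cells))
for c in range(n_cells):
    momentum[:, c] = np.where(cell_index == c, velocities, 0.0).sum(axis=1)

# Equal-time equilibrium correlation matrix between grid cells.
fluct = momentum - momentum.mean(axis=0)
corr = fluct.T @ fluct / n_frames
print(np.round(corr, 2))
```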
Pérès, Sabine; Felicori, Liza; Rialle, Stéphanie; Jobard, Elodie; Molina, Franck
2010-01-01
Motivation: In the available databases, biological processes are described from molecular and cellular points of view, but these descriptions are represented with text annotations that make it difficult to handle them for computation. Consequently, there is an obvious need for formal descriptions of biological processes. Results: We present a formalism that uses the BioΨ concepts to model biological processes from molecular details to networks. This computational approach, based on elementary bricks of actions, allows us to calculate on biological functions (e.g. process comparison, mapping structure–function relationships, etc.). We illustrate its application with two examples: the functional comparison of proteases and the functional description of the glycolysis network. This computational approach is compatible with detailed biological knowledge and can be applied to different kinds of systems of simulation. Availability: www.sysdiag.cnrs.fr/publications/supplementary-materials/BioPsi_Manager/ Contact: sabine.peres@sysdiag.cnrs.fr; franck.molina@sysdiag.cnrs.fr Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20448138
NASA Astrophysics Data System (ADS)
Hilditch, David; Harms, Enno; Bugner, Marcus; Rüter, Hannes; Brügmann, Bernd
2018-03-01
A long-standing problem in numerical relativity is the satisfactory treatment of future null-infinity. We propose an approach for the evolution of hyperboloidal initial data in which the outer boundary of the computational domain is placed at infinity. The main idea is to apply the ‘dual foliation’ formalism in combination with hyperboloidal coordinates and the generalized harmonic gauge formulation. The strength of the present approach is that, following the ideas of Zenginoğlu, a hyperboloidal layer can be naturally attached to a central region using standard coordinates of numerical relativity applications. Employing a generalization of the standard hyperboloidal slices, developed by Calabrese et al, we find that all formally singular terms take a trivial limit as we head to null-infinity. A byproduct is a numerical approach for hyperboloidal evolution of nonlinear wave equations violating the null-condition. The height-function method, used often for fixed background spacetimes, is generalized in such a way that the slices can be dynamically ‘waggled’ to maintain the desired outgoing coordinate lightspeed precisely. This is achieved by dynamically solving the eikonal equation. As a first numerical test of the new approach we solve the 3D flat space scalar wave equation. The simulations, performed with the pseudospectral bamps code, show that outgoing waves are cleanly absorbed at null-infinity and that errors converge away rapidly as resolution is increased.
ERIC Educational Resources Information Center
Sato, Yukiko; Rachmawan, Irene Erlyn Wina; Brückner, Stefan; Waragai, Ikumi; Kiyoki, Yasushi
2017-01-01
This study aims to enhance foreign language learners' language competence by integrating formal and informal learning environments and considers how they can improve their grammatical and lexical skills through the gathering (comprehension) and sharing (writing) of information in the foreign language. Experiments with German learners at a Japanese…
ERIC Educational Resources Information Center
Lukes, Marguerite
2011-01-01
This study explores the potential of native language literacy instruction for adult immigrant English language learners who have limited formal schooling or have had interruptions in their formal education. By examining 3 programs that provide native language literacy in combination with English as a second language (ESL) instruction, this study…
ERIC Educational Resources Information Center
Kalechofsky, Robert
This research paper proposes several mathematical models which help clarify Piaget's theory of cognition on the concrete and formal operational stages. Some modified lattice models were used for the concrete stage and a combined Boolean Algebra and group theory model was used for the formal stage. The researcher used experiments cited in the…
Formal Home Care Utilization Patterns by Rural–Urban Community Residence
Spector, William; Van Nostrand, Joan
2009-01-01
Background We examined formal home care utilization among civilian adults across metro and nonmetro residential categories before and after adjustment for predisposing, enabling, and need variables. Methods Two years of the Medical Expenditure Panel Survey (MEPS) were combined to produce a nationally representative sample of adults who resided in the community for a calendar year. We established 6 rural–urban categories based upon Urban Influence Codes and examined 2 dependent variables: (a) likelihood of using any formal home care and (b) number of provider days received by users. The Area Resource File provided county-level information. Logistic and negative binomial regression analyses were employed, with adjustments for the MEPS complex sampling design and the combined years. Results Under controls for predisposing, enabling, and need variables, differences in likelihood of any formal home care use disappear, but differences in number of provider days received by users emerged, with fewer provider days in remote areas than in metro and several other nonmetro types. Conclusions It is important to fully account for predisposing, enabling, and need factors when assessing rural and urban home care utilization patterns. The limited provider days in remote counties under controls suggest a possible access problem for adults in these areas. PMID:19196690
ERIC Educational Resources Information Center
Ziermans, Tim; Swaab, Hanna; Stockmann, Alexander; de Bruin, Esther; van Rijn, Sophie
2017-01-01
Formal thought disorder (FTD) is a disruption in the flow of thought and a common feature in psychotic disorders and autism spectrum disorder (ASD). Executive dysfunction has often been associated with FTD, yet for ASD convincing evidence is lacking. This study investigated FTD and three core executive functions in 50 young children and…
NASA Astrophysics Data System (ADS)
Melis, Stefano
2015-01-01
We present a review of current Transverse Momentum Dependent (TMD) phenomenology, focusing our attention on the unpolarized TMD parton distribution function and the Sivers function. The paper introduces and comments on the new Collins-Soper-Sterman (CSS) TMD evolution formalism [1]. We make use of a selection of results obtained by several groups to illustrate the achievements and the failures of the simple Gaussian approach and of the TMD CSS evolution formalism.
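The "simple Gaussian approach" referred to here usually means the factorized ansatz below, with the width treated as a fit parameter (conventions vary between groups):

```latex
f_{q/p}(x,k_\perp^2) = f_{q/p}(x)\,
\frac{e^{-k_\perp^{2}/\langle k_\perp^{2}\rangle}}{\pi\,\langle k_\perp^{2}\rangle}
```

normalized so that integration over transverse momentum returns the collinear distribution f_{q/p}(x).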
Formal functional test designs with a test representation language
NASA Technical Reports Server (NTRS)
Hops, J. M.
1993-01-01
The application of the category-partition method to the test design phase of hardware, software, or system test development is discussed. The method provides a formal framework for reducing the total number of possible test cases to a minimum logical subset for effective testing. An automatic tool and a formal language were developed to implement the method and produce the specification of test cases.
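A minimal sketch of the category-partition idea follows; the categories, choices, and constraint are hypothetical and are not the tool or test representation language developed in the report. Each parameter is partitioned into categories with discrete choices, and constraints prune the cross-product down to a tractable set of test frames.

```python
from itertools import product

# Hypothetical categories and partitioned choices for a file-copy command under test.
categories = {
    "source":      ["missing", "empty", "regular", "very_large"],
    "destination": ["same_volume", "other_volume", "read_only"],
    "permissions": ["user_ok", "user_denied"],
}

def allowed(frame):
    """Constraints that rule out meaningless combinations."""
    if frame["source"] == "missing" and frame["permissions"] == "user_denied":
        return False  # the permission check is never reached when the source is absent
    return True

names = list(categories)
frames = [dict(zip(names, combo)) for combo in product(*categories.values())]
frames = [f for f in frames if allowed(f)]

print(f"{len(frames)} test frames after applying constraints")
for f in frames[:3]:
    print(f)
```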
On the abundance of planetary water and exo-life after Kepler
NASA Astrophysics Data System (ADS)
Wandel, Amri
2015-08-01
Combining the recent results of the Kepler mission on the abundance of small planets within the Habitable Zone with a Drake-equation formalism, I derive the space density of planets with surface water and of biotic planets as a function of the yet unknown probabilities for the evolution of an Earthlike atmosphere and biosphere, respectively. I describe how these probabilities may be estimated by future spectral observations of exoplanet biomarkers such as atmospheric oxygen and water. I find that planets with surface liquid water may be expected within 10 light years and biotic planets within 10-100 light years from Earth. ArXiv 1412.1302.
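A toy version of the Drake-like estimate described here is sketched below; every numerical factor is a placeholder for the unknown probabilities discussed in the abstract, not a value taken from the paper.

```python
import math

n_star  = 0.004   # stellar number density near the Sun, stars per cubic light year (approx.)
f_hz    = 0.2     # fraction of stars hosting a small planet in the habitable zone (Kepler-like)
f_water = 0.5     # probability such a planet develops surface liquid water (unknown)
f_bio   = 0.1     # probability a water-bearing planet develops a biosphere (unknown)

def nearest_distance(number_density):
    """Typical distance to the nearest object for a given space density (light years)."""
    return (3.0 / (4.0 * math.pi * number_density)) ** (1.0 / 3.0)

n_water  = n_star * f_hz * f_water
n_biotic = n_water * f_bio

print(f"nearest water-bearing planet ~ {nearest_distance(n_water):.0f} ly")
print(f"nearest biotic planet        ~ {nearest_distance(n_biotic):.0f} ly")
```

With these placeholder factors the nearest water-bearing planet lands at roughly 10 light years and the nearest biotic planet at a few tens of light years, the same order of magnitude quoted in the abstract.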
Assessment of parametric uncertainty for groundwater reactive transport modeling
Shi, Xiaoqing; Ye, Ming; Curtis, Gary P.; Miller, Geoffery L.; Meyer, Philip D.; Kohler, Matthias; Yabusaki, Steve; Wu, Jichun
2014-01-01
The validity of using Gaussian assumptions for model residuals in uncertainty quantification of a groundwater reactive transport model was evaluated in this study. Least squares regression methods explicitly assume Gaussian residuals, and the assumption leads to Gaussian likelihood functions, model parameters, and model predictions. While the Bayesian methods do not explicitly require the Gaussian assumption, Gaussian residuals are widely used. This paper shows that the residuals of the reactive transport model are non-Gaussian, heteroscedastic, and correlated in time; characterizing them requires using a generalized likelihood function such as the formal generalized likelihood function developed by Schoups and Vrugt (2010). For the surface complexation model considered in this study for simulating uranium reactive transport in groundwater, parametric uncertainty is quantified using the least squares regression methods and Bayesian methods with both Gaussian and formal generalized likelihood functions. While the least squares methods and Bayesian methods with Gaussian likelihood function produce similar Gaussian parameter distributions, the parameter distributions of Bayesian uncertainty quantification using the formal generalized likelihood function are non-Gaussian. In addition, predictive performance of formal generalized likelihood function is superior to that of least squares regression and Bayesian methods with Gaussian likelihood function. The Bayesian uncertainty quantification is conducted using the differential evolution adaptive metropolis (DREAM(zs)) algorithm; as a Markov chain Monte Carlo (MCMC) method, it is a robust tool for quantifying uncertainty in groundwater reactive transport models. For the surface complexation model, the regression-based local sensitivity analysis and Morris- and DREAM(ZS)-based global sensitivity analysis yield almost identical ranking of parameter importance. The uncertainty analysis may help select appropriate likelihood functions, improve model calibration, and reduce predictive uncertainty in other groundwater reactive transport and environmental modeling.
Silbergleit, Alice K; Cook, Diana; Kienzle, Scott; Boettcher, Erica; Myers, Daniel; Collins, Denise; Peterson, Edward; Silbergleit, Matthew A; Silbergleit, Richard
2018-04-04
Formal agreement studies on the interpretation of the videofluoroscopic swallowing study (VFSS) procedure among speech-language pathologists, radiology house officers, and staff radiologists have not been pursued. Each of these professions participates in the procedure, interprets the examination, and writes separate reports on the findings. The aim of this study was to determine the reliability of interpretation between and within the disciplines and to determine whether structured training improved reliability. Thirteen speech-language pathologists (SLPs), ten diagnostic radiologists (RADs), and twenty-one diagnostic radiology house officers (HOs) participated in this study. Each group viewed 24 VFSS samples and rated the presence or absence of seven aberrant swallowing features as well as the presence of dysphagia and the identification of oral dysphagia, pharyngeal dysphagia, or both. During part two, the groups were provided with a training session on normal and abnormal swallowing, using different VFSS samples from those in part one, followed by re-rating of the original 24 VFSS samples. A generalized estimating equations (GEE) approach with a binomial link function was used to examine each question separately. For each cluster of tests (for example, all pairwise comparisons between the three groups in the pretraining period), a Hochberg correction for multiple testing was used to determine significance. A GEE approach with a binomial link function was used to compare the pre-training measure to the post-training measure for each of the three groups of raters stratified by experience. The primary result revealed that the HO group scored significantly lower than the SLP and RAD groups on identification of the presence of dysphagia (p = 0.008; p = 0.001, respectively), identification of oral phase dysphagia (p = 0.003; p = 0.001, respectively), and identification of both oral and pharyngeal phase dysphagia (p = 0.014; p = 0.001, respectively) before training. After training, there was no statistically significant difference between the three groups on identification of dysphagia and identification of combined oral and pharyngeal dysphagia. Formal training to identify oropharyngeal dysphagia characteristics appears to improve the accuracy of interpretation of the VFSS procedure for radiology house officers. Consideration of including formal training in this area in radiology residency training programs is recommended.
The generation of gravitational waves. I - Weak-field sources
NASA Technical Reports Server (NTRS)
Thorne, K. S.; Kovacs, S. J.
1975-01-01
This paper derives and summarizes a 'plug-in-and-grind' formalism for calculating the gravitational waves emitted by any system with weak internal gravitational fields. If the internal fields have negligible influence on the system's motions, the formalism reduces to standard 'linearized theory'. Independent of the effects of gravity on the motions, the formalism reduces to the standard 'quadrupole-moment formalism' if the motions are slow and internal stresses are weak. In the general case, the formalism expresses the radiation in terms of a retarded Green's function for slightly curved spacetime and breaks the Green's function integral into five easily understood pieces: direct radiation, produced directly by the motions of the source; whump radiation, produced by the 'gravitational stresses' of the source; transition radiation, produced by a time-changing time delay ('Shapiro effect') in the propagation of the nonradiative 1/r field of the source; focusing radiation, produced when one portion of the source focuses, in a time-dependent way, the nonradiative field of another portion of the source; and tail radiation, produced by 'back-scatter' of the nonradiative field in regions of focusing.
Tactical Synthesis Of Efficient Global Search Algorithms
NASA Technical Reports Server (NTRS)
Nedunuri, Srinivas; Smith, Douglas R.; Cook, William R.
2009-01-01
Algorithm synthesis transforms a formal specification into an efficient algorithm to solve a problem. Algorithm synthesis in Specware combines the formal specification of a problem with a high-level algorithm strategy. To derive an efficient algorithm, a developer must define operators that refine the algorithm by combining the generic operators in the algorithm with the details of the problem specification. This derivation requires skill and a deep understanding of the problem and the algorithmic strategy. In this paper we introduce two tactics to ease this process. The tactics serve a similar purpose to tactics used for determining indefinite integrals in calculus, that is, suggesting possible ways to attack the problem.
Neural substrates and behavioral profiles of romantic jealousy and its temporal dynamics.
Sun, Yan; Yu, Hongbo; Chen, Jie; Liang, Jie; Lu, Lin; Zhou, Xiaolin; Shi, Jie
2016-06-07
Jealousy is not only a way of experiencing love but also a stabilizer of romantic relationships, although morbid romantic jealousy is maladaptive. Being engaged in a formal romantic relationship can tune one's romantic jealousy towards a specific target. Little is yet known about how the human brain processes romantic jealousy. Here, by combining scenario-based imagination and functional MRI, we investigated the behavioral and neural correlates of romantic jealousy and their development across stages (before vs. after being in a formal relationship). Romantic jealousy scenarios elicited activations primarily in the basal ganglia (BG) across stages, and both the behavioral ratings and the BG activation were significantly higher after the relationship was established. The intensity of romantic jealousy was related to the intensity of romantic happiness, which mainly correlated with ventral medial prefrontal cortex activation. The increase in jealousy across stages was associated with the tendency for interpersonal aggression. These results bridge the gap between the theoretical conceptualization of romantic jealousy and its neural correlates and shed light on the dynamic changes in jealousy.
Adhesion of a bimetallic interface. Ph.D. Thesis - Case Western Reserve Univ.; [for Al, Mg, and Zn]
NASA Technical Reports Server (NTRS)
Ferrante, J.
1978-01-01
The Hohenberg-Kohn and Kohn-Sham formalisms are used to examine binding (binding energy as a function of separation) for combinations of the simple metals Al(111), Zn(0001), Mg(0001), and Na(110) in contact. Similar metal contacts between Al, Zn, Mg, and Na are examined self-consistently in an ab initio calculation using the Kohn-Sham formalism. Crystallinity is included using the Ashcroft pseudopotential via first order perturbation theory for the electron-ion interaction; and the ion-ion interaction is included exactly via a lattice sum. Binding energy was determined both in the local-density approximation and including gradient corrections to the exchange and correlation energy. Binding was found in all cases. In dissimilar metal contacts, interfacial bonding was greater than that in the weaker material, predicting the possibility of metallic transfer. The nonzero position of the energy minimum in like metal contacts is explained in terms of consistency between the Ashcroft pseudopotential and the bulk charge density. Good agreement with experimental surface energies is obtained in the self-consistent calculation when nonlocal terms are included.
The Aeronautical Data Link: Taxonomy, Architectural Analysis, and Optimization
NASA Technical Reports Server (NTRS)
Morris, A. Terry; Goode, Plesent W.
2002-01-01
The future Communication, Navigation, and Surveillance/Air Traffic Management (CNS/ATM) System will rely on global satellite navigation, and ground-based and satellite-based communications via Multi-Protocol Networks (e.g. combined Aeronautical Telecommunications Network (ATN)/Internet Protocol (IP)) to bring about needed improvements in efficiency and safety of operations to meet increasing levels of air traffic. This paper will discuss the development of an approach that completely describes optimal data link architecture configuration and behavior to meet the multiple conflicting objectives of concurrent and different operations functions. The practical application of the approach enables the design and assessment of configurations relative to airspace operations phases. The approach includes a formal taxonomic classification, an architectural analysis methodology, and optimization techniques. The formal taxonomic classification provides a multidimensional correlation of data link performance with data link service, information protocol, spectrum, and technology mode, and with flight operations phase and environment. The architectural analysis methodology assesses the impact of a specific architecture configuration and behavior on the local ATM system performance. Deterministic and stochastic optimization techniques maximize architectural design effectiveness while addressing operational, technology, and policy constraints.
Development of dynamic Bayesian models for web application test management
NASA Astrophysics Data System (ADS)
Azarnova, T. V.; Polukhin, P. V.; Bondarenko, Yu V.; Kashirina, I. L.
2018-03-01
The mathematical apparatus of dynamic Bayesian networks is an effective and technically proven tool that can be used to model complex stochastic dynamic processes. According to the results of the research, mathematical models and methods of dynamic Bayesian networks provide broad coverage of the stochastic tasks associated with error testing in multiuser software products operated in a dynamically changing environment. A formalized representation of the discrete test process as a dynamic Bayesian model allows us to organize the logical connections between individual test assets across multiple time slices. This approach makes it possible to represent testing as a discrete process with defined structural components responsible for the generation of test assets. Dynamic Bayesian network-based models allow us to combine, in one management area, individual units and testing components with different functionalities that directly influence each other in the process of comprehensive testing of various groups of computer bugs. The application of the proposed models makes it possible to use a consistent approach to formalizing test principles and procedures, methods used to treat situational error signs, and methods used to produce analytical conclusions based on test results.
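As a minimal illustration of the kind of two-slice dynamic Bayesian network described above, the sketch below performs forward filtering of a hidden "defect present" state from observed test outcomes across time slices. The variables, probabilities, and function names are illustrative assumptions, not the authors' model.

```python
# Minimal sketch (not the authors' model): a two-slice discrete dynamic
# Bayesian network for a test process, with one hidden variable
# ("defect present") evolving over time slices and one observed variable
# ("test case failed"). Names and probabilities are illustrative only.
import numpy as np

# P(defect_t | defect_{t-1}): rows = previous state, cols = current state
transition = np.array([[0.9, 0.1],   # no defect -> {no defect, defect}
                       [0.2, 0.8]])  # defect    -> {no defect, defect}

# P(test outcome | defect): rows = hidden state, cols = {pass, fail}
emission = np.array([[0.95, 0.05],   # no defect: tests mostly pass
                     [0.30, 0.70]])  # defect: tests mostly fail

def filter_defect_belief(observations, prior=np.array([0.8, 0.2])):
    """Forward filtering over time slices: returns P(defect | evidence so far)."""
    belief = prior.copy()
    history = []
    for obs in observations:              # obs: 0 = pass, 1 = fail
        belief = transition.T @ belief    # predict the next time slice
        belief *= emission[:, obs]        # incorporate the test outcome
        belief /= belief.sum()            # normalize
        history.append(belief[1])
    return history

print(filter_defect_belief([1, 1, 0, 1]))  # P(defect) after each test run
```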
Bondonic effects in group-IV honeycomb nanoribbons with Stone-Wales topological defects.
Putz, Mihai V; Ori, Ottorino
2014-04-03
This work advances the modeling of bondonic effects on graphenic and honeycomb structures, with an original two-fold generalization: (i) by employing the fourth-order path integral bondonic formalism in considering the high-order derivatives of the Wiener topological potential of those 1D systems; and (ii) by modeling a class of honeycomb defective structures starting from graphene, the carbon-based reference case, and then generalizing the treatment to Si (silicene), Ge (germanene), Sn (stannene) by using the fermionic two-degenerate statistical states function in terms of electronegativity. The honeycomb nanostructures present η-sized Stone-Wales topological defects, the isomeric dislocation dipoles originally named by the authors the Stone-Wales wave, or SWw. For these defective nanoribbons, the bondonic formalism foresees a specific phase transition whose critical behavior shows typical bondonic fast critical times and bonding energies. The quantum transition of the ideal-to-defect structural transformations is fully described by computing the caloric capacities for nanostructures triggered by η-sized topological isomerisations. The present model may be easily applied to hetero-combinations of Group-IV elements such as C-Si, C-Ge, C-Sn, Si-Ge, Si-Sn, Ge-Sn.
Systematic errors in transport calculations of shear viscosity using the Green-Kubo formalism
NASA Astrophysics Data System (ADS)
Rose, J. B.; Torres-Rincon, J. M.; Oliinychenko, D.; Schäfer, A.; Petersen, H.
2018-05-01
The purpose of this study is to provide a reproducible framework for the use of the Green-Kubo formalism to extract transport coefficients. More specifically, in the case of shear viscosity, we investigate the limitations and technical details of fitting the auto-correlation function to a decaying exponential. This fitting procedure is found to be applicable for systems interacting both through constant and energy-dependent cross-sections, although this is only true for sufficiently dilute systems in the latter case. We find that the optimal fit technique consists in simultaneously fixing the intercept of the correlation function and using a fitting interval constrained by the relative error on the correlation function. The formalism is then applied to the full hadron gas, for which we obtain the shear viscosity to entropy ratio.
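For orientation, the following is a minimal Python sketch of the Green-Kubo workflow summarized above: build the autocorrelation function of an off-diagonal pressure-tensor component, fit a decaying exponential with the intercept fixed (as the abstract recommends), and integrate to obtain the shear viscosity. The data, units, and function names are placeholders and this is not the paper's code.

```python
# Illustrative sketch (not the paper's code): extract a shear viscosity from a
# time series of an off-diagonal pressure-tensor component P_xy(t) via the
# Green-Kubo relation  eta = V/(k_B T) * integral_0^inf <P_xy(0) P_xy(t)> dt,
# fitting the autocorrelation to C(0)*exp(-t/tau) with the intercept fixed.
# All inputs here are synthetic placeholders.
import numpy as np
from scipy.optimize import curve_fit

def autocorrelation(x, max_lag):
    x = x - x.mean()
    return np.array([np.mean(x[:len(x) - lag] * x[lag:]) for lag in range(max_lag)])

def shear_viscosity(p_xy, dt, volume, kBT, max_lag=200):
    c = autocorrelation(p_xy, max_lag)
    t = np.arange(max_lag) * dt
    fit = lambda t, tau: c[0] * np.exp(-t / tau)   # intercept C(0) is fixed
    popt, _ = curve_fit(fit, t, c, p0=[10 * dt])
    tau = popt[0]
    return volume / kBT * c[0] * tau               # integral of C(0)*exp(-t/tau)

# synthetic stationary signal with a known correlation time, for demonstration
rng = np.random.default_rng(0)
signal = np.convolve(rng.normal(size=20000), np.exp(-np.arange(100) / 10.0), mode="same")
print(shear_viscosity(signal, dt=0.01, volume=1.0, kBT=1.0))
```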
The detailed balance requirement and general empirical formalisms for continuum absorption
NASA Technical Reports Server (NTRS)
Ma, Q.; Tipping, R. H.
1994-01-01
Two general empirical formalisms are presented for the spectral density which take into account the deviations from the Lorentz line shape in the wing regions of resonance lines. These formalisms satisfy the detailed balance requirement. Empirical line shape functions, which are essential to provide the continuum absorption at different temperatures in various frequency regions for atmospheric transmission codes, can be obtained by fitting to experimental data.
Modeling of clover detector in addback mode
NASA Astrophysics Data System (ADS)
Kshetri, R.
2012-07-01
Based on absorption and scattering of gamma-rays, a formalism has been presented for modeling the clover germanium detector in addback mode and for predicting its response to high-energy γ-rays. In the present formalism, the operation of a bare clover detector can be described in terms of only three quantities. With one additional parameter, the formalism can be extended to the suppressed clover. Using experimental data on relative single-crystal efficiency and the addback factor as input, the peak-to-total ratio has been calculated for three energies (Eγ = 3.401, 5.324 and 10.430 MeV) where direct measurement of the peak-to-total ratio is impossible due to the absence of a radioactive source emitting a single monoenergetic gamma-ray at that energy. The experimental validation and consistency of the formalism have been shown using data for the TIGRESS clover detector. In a recent work (R. Kshetri, JINST 2012 7 P04008), we showed that for a given γ-ray energy the formalism can be used to predict the peak-to-total ratio as a function of the number of detector modules. In the present paper, we show that for a given composite detector (the clover detector is considered here) the formalism can be used to predict the peak-to-total ratio as a function of γ-ray energy.
Systematic Model-in-the-Loop Test of Embedded Control Systems
NASA Astrophysics Data System (ADS)
Krupp, Alexander; Müller, Wolfgang
Current model-based development processes offer new opportunities for verification automation, e.g., in automotive development. The duty of functional verification is the detection of design flaws. Current functional verification approaches exhibit a major gap between requirement definition and formal property definition, especially when analog signals are involved. Besides the lack of methodical support for natural-language formalization, there is no standardized and accepted means for formal property definition as a target for verification planning. This article addresses several shortcomings of embedded system verification. An Enhanced Classification Tree Method is developed based on the established Classification Tree Method for Embedded Systems (CTM/ES), which applies a hardware verification language to define a verification environment.
Palm: Easing the Burden of Analytical Performance Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tallent, Nathan R.; Hoisie, Adolfy
2014-06-01
Analytical (predictive) application performance models are critical for diagnosing performance-limiting resources, optimizing systems, and designing machines. Creating models, however, is difficult because they must be both accurate and concise. To ease the burden of performance modeling, we developed Palm, a modeling tool that combines top-down (human-provided) semantic insight with bottom-up static and dynamic analysis. To express insight, Palm defines a source code modeling annotation language. By coordinating models and source code, Palm's models are `first-class' and reproducible. Unlike prior work, Palm formally links models, functions, and measurements. As a result, Palm (a) uses functions to either abstract or express complexity; (b) generates hierarchical models (representing an application's static and dynamic structure); and (c) automatically incorporates measurements to focus attention, represent constant behavior, and validate models. We discuss generating models for three different applications.
Ricotta, Carlo; Pacini, Alessandra; Avena, Giancarlo
2002-01-01
We propose a measure of divergence from species to life-form diversity aimed at summarizing the ecological similarity among different plant communities without losing information on traditional taxonomic diversity. First, species and life-form relative abundances within a given plant community are determined. Next, using Rényi's generalized entropy, the diversity profiles of the analyzed community are computed both from species and life-form relative abundances. Finally, the speed of decrease from species to life-form diversity is obtained by combining the outcome of both profiles. Interestingly, the proposed measure shows some formal analogies with multifractal functions developed in statistical physics for the analysis of spatial patterns. As an application for demonstration, a small data set from a plant community sampled in the archaeological site of Paestum (southern Italy) is used.
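As a minimal sketch of the ingredients described above, the following Python snippet computes Rényi generalized-entropy diversity profiles from species and life-form relative abundances and a crude summary of how quickly diversity drops when species are aggregated into life forms. The aggregation map and the profile comparison used here are illustrative assumptions, not the authors' exact measure.

```python
# Illustrative sketch: Rényi diversity profiles H_alpha = ln(sum p_i^alpha)/(1-alpha)
# for species and for life-form relative abundances, and their difference as a
# rough "speed of decrease" from species to life-form diversity.
import numpy as np

def renyi_profile(abundances, alphas):
    p = np.asarray(abundances, dtype=float)
    p = p[p > 0] / p.sum()
    profile = []
    for a in alphas:
        if np.isclose(a, 1.0):                    # Shannon entropy as the alpha -> 1 limit
            profile.append(-np.sum(p * np.log(p)))
        else:
            profile.append(np.log(np.sum(p ** a)) / (1.0 - a))
    return np.array(profile)

alphas = np.linspace(0.0, 3.0, 13)
species = [0.30, 0.25, 0.15, 0.10, 0.10, 0.05, 0.05]      # species relative abundances
life_form_of = [0, 0, 1, 1, 2, 2, 2]                      # species -> life-form map (hypothetical)
life_forms = np.bincount(life_form_of, weights=species)   # aggregated abundances

h_species = renyi_profile(species, alphas)
h_life_forms = renyi_profile(life_forms, alphas)
drop = h_species - h_life_forms   # decrease from species to life-form diversity, per alpha
print(np.round(drop, 3))
```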
Rectification induced in N{sub 2}{sup AA}-doped armchair graphene nanoribbon device
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Tong; Wang, Ling-Ling, E-mail: llwang@hnu.edu.cn; Luo, Kai-Wu
2014-07-07
By using non-equilibrium Green function formalism in combination with density functional theory, we investigated the electronic transport properties of armchair graphene nanoribbon devices in which one lead is undoped and the other is N{sub 2}{sup AA}-doped with two quasi-adjacent substitutional nitrogen atoms incorporating pairs of neighboring carbon atoms in the same sublattice A. Two kinds of N{sub 2}{sup AA}-doping styles are considered, in which the N dopants substitute either the center or the edge carbon atoms. Our results show that rectification behavior with a large rectifying ratio can be found in these devices and that the rectifying characteristics can be modulated by changing the width of the graphene nanoribbons or the position of the N{sub 2}{sup AA} dopant. The mechanisms are revealed to explain the rectifying behaviors.
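As standard background for this and the related NEGF-plus-DFT transport records later in this collection (not quoted from the record itself), the transmission function and current in such calculations are usually obtained from the Landauer-Büttiker expressions:

```latex
T(E,V) \;=\; \mathrm{Tr}\!\left[\Gamma_{L}(E)\,G^{r}(E)\,\Gamma_{R}(E)\,G^{a}(E)\right],
\qquad
I(V) \;=\; \frac{2e}{h}\int T(E,V)\,\bigl[f_{L}(E)-f_{R}(E)\bigr]\,\mathrm{d}E ,
```

where G^{r,a} are the retarded/advanced Green's functions of the scattering region, Γ_{L,R} are the electrode broadening matrices, and f_{L,R} are the electrode Fermi functions. The rectification ratio quoted in such studies is then typically R(V) = |I(+V)/I(-V)|.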
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Tong; Wang, Ling-Ling, E-mail: llwang@hnu.edu.cn; Li, Quan
2014-02-07
The electronic band structures and transport properties of N{sub 2}{sup AA}-doped armchair graphene nanoribbons (aGNRs) with two quasi-adjacent substitutional nitrogen atoms incorporated in pairs of neighboring carbon atoms in the same sublattice A are investigated by using the non-equilibrium Green function formalism in combination with density functional theory. The results show that the coupling effect between the Pz orbitals of the carbon and nitrogen atoms plays an important role in the transition between semiconductor and metal for different locations of the N{sub 2}{sup AA} dopants in the aGNRs. Moreover, striking negative differential resistance behavior can be found in such devices. These properties suggest potential application of N{sub 2}{sup AA}-doped aGNRs in graphene-based nanoelectronic devices.
NASA Astrophysics Data System (ADS)
Tauber, C.
2018-05-01
We propose a general edge index definition for two-dimensional Floquet topological phases based on a switch-function formalism. When the Floquet operator has a spectral gap, the index covers both clean and disordered phases, anomalous or not, and does not require the bulk to be fully localized. It is interpreted as a nonadiabatic charge pumping that is quantized when the sample is placed next to an effective vacuum. This vacuum is gap-dependent and obtained from a Floquet Hamiltonian. The choice of a vacuum provides a simple and alternative gap-selection mechanism. Inspired by the model of Rudner et al., we then illustrate these concepts on disordered Floquet phases. The switch-function formalism is usually restricted to infinite samples in the thermodynamic limit. Here we circumvent this issue and propose a numerical implementation of the edge index that could be adapted to any bulk or edge index expressed in terms of switch functions, already existing for many topological phases.
NASA Astrophysics Data System (ADS)
Nagai, Tetsuro
2017-01-01
Replica-exchange molecular dynamics (REMD) has demonstrated its efficiency by combining trajectories of a wide range of temperatures. As an extension of the method, the author formalizes the mass-manipulating replica-exchange molecular dynamics (MMREMD) method that allows for arbitrary mass scaling with respect to temperature and individual particles. The formalism enables the versatile application of mass-scaling approaches to the REMD method. The key change introduced in the novel formalism is the generalized rules for the velocity and momentum scaling after accepted replica-exchange attempts. As an application of this general formalism, the refinement of the viscosity-REMD (V-REMD) method [P. H. Nguyen,
Quantum localization of classical mechanics
NASA Astrophysics Data System (ADS)
Batalin, Igor A.; Lavrov, Peter M.
2016-07-01
Quantum localization of classical mechanics within the BRST-BFV and BV (or field-antifield) quantization methods is studied. It is shown that a special choice of gauge-fixing functions (or of the BRST-BFV charge), together with the unitary limit, leads to Hamiltonian localization in the path integral of the BRST-BFV formalism. In turn, we find that a special choice of gauge-fixing functions proportional to extremals of an initial non-degenerate classical action, together with a very special solution of the classical master equation, results in Lagrangian localization in the partition function of the BV formalism.
FOURIER ANALYSIS OF BLAZAR VARIABILITY: KLEIN–NISHINA EFFECTS AND THE JET SCATTERING ENVIRONMENT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Finke, Justin D.; Becker, Peter A., E-mail: justin.finke@nrl.navy.mil, E-mail: pbecker@gmu.edu
The strong variability of blazars can be characterized by power spectral densities (PSDs) and Fourier frequency-dependent time lags. In previous work, we created a new theoretical formalism for describing the PSDs and time lags produced via a combination of stochastic particle injection and emission via the synchrotron, synchrotron self-Compton, and external Compton (EC) processes. This formalism used the Thomson cross section and simple δ-function approximations to model the synchrotron and Compton emissivities. Here we expand upon this work, using the full Compton cross section and detailed and accurate emissivities. Our results indicate good agreement between the PSDs computed using the δ-function approximations and those computed using the accurate expressions, provided the observed photons are produced primarily by electrons with energies exceeding the lower limit of the injected particle population. Breaks are found in the PSDs at frequencies corresponding to the cooling timescales of the electrons primarily responsible for the observed emission, and the associated time lags are related to the difference in electron cooling timescales between the two energy channels, as expected. If the electron cooling timescales can be determined from the observed time lags and/or the observed EC PSDs, then one could in principle use the method developed here to determine the energy of the external seed photon source for EC, which is an important unsolved problem in blazar physics.
Multi-level meta-workflows: new concept for regularly occurring tasks in quantum chemistry.
Arshad, Junaid; Hoffmann, Alexander; Gesing, Sandra; Grunzke, Richard; Krüger, Jens; Kiss, Tamas; Herres-Pawlis, Sonja; Terstyanszky, Gabor
2016-01-01
In Quantum Chemistry, many tasks recur frequently, e.g. geometry optimizations, benchmarking series, etc. Here, workflows can help to reduce the time spent on manual job definition and output extraction. These workflows are executed on computing infrastructures and may require large computing and data resources. Scientific workflows hide these infrastructures and the resources needed to run them. It requires significant effort and specific expertise to design, implement and test these workflows. Many of these workflows are complex and monolithic entities that can be used only for particular scientific experiments. Hence, their modification is not straightforward, and this makes it almost impossible to share them. To address these issues we propose developing atomic workflows and embedding them in meta-workflows. Atomic workflows deliver a well-defined, research-domain-specific function. Publishing workflows in repositories enables workflow sharing inside and/or among scientific communities. We formally specify atomic and meta-workflows in order to define data structures to be used in repositories for uploading and sharing them. Additionally, we present a formal description focused on the orchestration of atomic workflows into meta-workflows. We investigated the operations that represent basic functionalities in Quantum Chemistry, developed the relevant atomic workflows and combined them into meta-workflows. Having these workflows, we defined the structure of the Quantum Chemistry workflow library and uploaded these workflows to the SHIWA Workflow Repository. Graphical Abstract: Meta-workflows and embedded workflows in the template representation.
Utilisation of Healthcare and Associated Services in Huntington’s disease: a data mining study
Busse, Monica; Al-Madfai, Dr. Hasan; Kenkre, Joyce; Landwehrmeyer, G. Bernhard; Bentivoglio, AnnaRita; Rosser, Anne
2011-01-01
Background: People with Huntington’s disease (HD) often require tailored healthcare and support packages that develop as the disease progresses. The Client Service Receipt Inventory (CSRI) gathers retrospective information on service utilization. This study investigated the use of formal services and informal care as measured by the CSRI and explored associations between informal care, disease severity and functional ability as measured by the Unified Huntington’s Disease Rating Scale Total Motor Score (UHDRS-TMS) and functional scales. Methods: All monitored longitudinal data from annual clinical assessments of UHDRS-TMS and functional assessments and CSRI collected under the auspices of the European Huntington’s Disease Network (EHDN) REGISTRY study between the years 2004 and 2009 were utilised in the analyses. Disease severity was reflected by UHDRS-TMS. Functional ability was measured using the UHDRS functional scales. CSRI data were analysed according to percentage use of individual formal services and total estimated hours per week of informal care. Regression analyses were conducted to identify any associations between disease severity, functional ability and hours of informal care. Results: 451 HD patients (212 female; 239 male) completed one visit; 105 patients (54 females; 51 males) completed two visits and 47 patients (20 females; 27 males) completed three visits in total over the 5-year period. The mean time between visits was 1.2 years. At visit one, 74% of the participants reported being in receipt of at least one formal hospital-based service in the previous six months, and 89% reported receipt of formal primary and community care services. In contrast, at the third visit, 62% of people had used hospital-based services and 94% formal community-based services in the previous six months. Fifty percent of individuals required some form of informal care in the home at visit 1; this increased to 68% at visits 2 and 3. The mean (SD) estimated weekly total informal care hours at visits 1, 2 and 3 were 32.8 (49.4); 21.6 (53.6) and 21.3 (62.4) respectively. Only the scores on the Functional Assessment Scale (FAS) accounted for the variance in the weekly total informal care hours at each visit. Conclusions: Although it must be acknowledged that service use is supply driven, most HD patients across Europe surveyed as part of this study were in receipt of formal primary and community care services and, to a lesser extent, formal hospital-based services. There was, however, a large reliance on informal care in the home. The FAS appears to have predictive value for informal care requirements and may have utility in facilitating pro-active service provision, in particular when managing carer burden in this population. PMID:21304753
Coarse-grained hydrodynamics from correlation functions
NASA Astrophysics Data System (ADS)
Palmer, Bruce
2018-02-01
This paper will describe a formalism for using correlation functions between different grid cells as the basis for determining coarse-grained hydrodynamic equations for modeling the behavior of mesoscopic fluid systems. Configurations from a molecular dynamics simulation or other atomistic simulation are projected onto basis functions representing grid cells in a continuum hydrodynamic simulation. Equilibrium correlation functions between different grid cells are evaluated from the molecular simulation and used to determine the evolution operator for the coarse-grained hydrodynamic system. The formalism is demonstrated on a discrete particle simulation of diffusion with a spatially dependent diffusion coefficient. Correlation functions are calculated from the particle simulation and the spatially varying diffusion coefficient is recovered using a fitting procedure.
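The following is a schematic Python sketch of the general workflow described above: project particle configurations onto grid-cell basis functions, accumulate equal-time and lagged cell-cell correlation matrices, and estimate a discrete evolution operator. It is an illustration under simplifying assumptions (1D, indicator-function cells, synthetic Brownian particles), not the paper's implementation.

```python
# Schematic sketch of the workflow in the abstract (not the paper's code):
# particle positions from a discrete-particle simulation are projected onto
# grid-cell indicator functions, cell-cell correlation matrices at lag 0 and
# lag tau are accumulated, and a coarse-grained evolution operator is
# estimated as  A = C(tau) C(0)^{-1}.  All numbers are synthetic.
import numpy as np

def cell_densities(positions, n_cells, box_length):
    """Project particle positions (1D here, for brevity) onto grid-cell counts."""
    idx = np.floor(positions / box_length * n_cells).astype(int) % n_cells
    return np.bincount(idx, minlength=n_cells).astype(float)

def evolution_operator(trajectory, n_cells, box_length, lag):
    rho = np.array([cell_densities(x, n_cells, box_length) for x in trajectory])
    rho -= rho.mean(axis=0)                               # density fluctuations
    c0 = rho[:-lag].T @ rho[:-lag] / (len(rho) - lag)     # equal-time correlations C(0)
    ct = rho[lag:].T @ rho[:-lag] / (len(rho) - lag)      # lagged correlations C(tau)
    return ct @ np.linalg.pinv(c0)                        # coarse-grained propagator

# synthetic Brownian particles in a periodic 1D box, for demonstration
rng = np.random.default_rng(1)
x = rng.uniform(0.0, 10.0, size=500)
traj = []
for _ in range(2000):
    x = (x + rng.normal(scale=0.05, size=x.size)) % 10.0
    traj.append(x.copy())

A = evolution_operator(traj, n_cells=10, box_length=10.0, lag=10)
print(np.round(A[:3, :3], 3))
```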
Chemical markup, XML, and the world wide web. 6. CMLReact, an XML vocabulary for chemical reactions.
Holliday, Gemma L; Murray-Rust, Peter; Rzepa, Henry S
2006-01-01
A set of components (CMLReact) for managing chemical and biochemical reactions has been added to CML. These can be combined to support most of the strategies for the formal representation of reactions. The elements, attributes, and types are formally defined as XMLSchema components, and their semantics are developed. New syntax and semantics in CML are reported and illustrated with 10 examples.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dai, Lingyun; Prokudin, Alexei; Kang, Zhong-Bo
2015-09-01
We study the three-gluon correlation function contribution to the Sivers asymmetry in semi-inclusive deep inelastic scattering. We first establish the matching between the usual twist-3 collinear factorization approach and the transverse-momentum-dependent (TMD) factorization formalism for the moderate transverse momentum region. We then derive the so-called coefficient functions used in the usual TMD evolution formalism. Finally, we perform the next-to-leading order calculation for the transverse-momentum-weighted spin-dependent differential cross section, from which we identify the QCD collinear evolution of the twist-3 Qiu-Sterman function: the off-diagonal contribution from the three-gluon correlation functions.
Code of Federal Regulations, 2010 CFR
2010-07-01
... overhaul; and (2) An analysis of the cost to implement the overhaul within a year versus a proposed... be based on a formal comprehensive appraisal or a series of formal appraisals of the functional...
Understanding visualization: a formal approach using category theory and semiotics.
Vickers, Paul; Faith, Joe; Rossiter, Nick
2013-06-01
This paper combines the vocabulary of semiotics and category theory to provide a formal analysis of visualization. It shows how familiar processes of visualization fit the semiotic frameworks of both Saussure and Peirce, and extends these structures using the tools of category theory to provide a general framework for understanding visualization in practice, including: Relationships between systems, data collected from those systems, renderings of those data in the form of representations, the reading of those representations to create visualizations, and the use of those visualizations to create knowledge and understanding of the system under inspection. The resulting framework is validated by demonstrating how familiar information visualization concepts (such as literalness, sensitivity, redundancy, ambiguity, generalizability, and chart junk) arise naturally from it and can be defined formally and precisely. This paper generalizes previous work on the formal characterization of visualization by, inter alia, Ziemkiewicz and Kosara and allows us to formally distinguish properties of the visualization process that previous work does not.
Mayerhöfer, Thomas G; Pahlow, Susanne; Hübner, Uwe; Popp, Jürgen
2018-06-25
A hybrid formalism combining elements from Kramers-Kronig-based analyses and dispersion analysis was developed, which allows interference-based effects to be removed from the infrared spectra of layers on highly reflecting substrates. To enable highly convenient application, the correction procedure is fully automated and usually requires less than a minute with non-optimized software on a typical office PC. The formalism was tested with both synthetic and experimental spectra of poly(methyl methacrylate) on gold. The results confirmed the usefulness of the formalism: apparent peak ratios as well as the interference fringes in the original spectra were successfully corrected. Accordingly, the introduced formalism makes it possible to use inexpensive and robust highly reflecting substrates for routine infrared spectroscopic investigations of layers or films whose thickness is limited by the requirement that the reflectance absorbance be smaller than about 1. For thicker films the formalism is still useful, but requires estimates of the optical constants.
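For context (standard optics background, not drawn from this record), the Kramers-Kronig relation underlying KK-based analyses connects the real refractive index n to the absorption index k through a principal-value integral over wavenumber:

```latex
n(\tilde{\nu}) \;=\; n_{\infty} \;+\; \frac{2}{\pi}\,
\mathcal{P}\!\int_{0}^{\infty}
\frac{\tilde{\nu}'\,k(\tilde{\nu}')}{\tilde{\nu}'^{2}-\tilde{\nu}^{2}}\,
\mathrm{d}\tilde{\nu}' ,
```

whereas dispersion analysis fits an explicit oscillator model for the complex refractive index; the hybrid formalism of the abstract combines elements of both approaches.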
RUIZ-RAMOS, MARGARITA; MÍNGUEZ, M. INÉS
2006-01-01
• Background Plant structural (i.e. architectural) models explicitly describe plant morphology by providing detailed descriptions of the display of leaf and stem surfaces within heterogeneous canopies and thus provide the opportunity for modelling the functioning of plant organs in their microenvironments. The outcome is a class of structural–functional crop models that combines advantages of current structural and process approaches to crop modelling. ALAMEDA is such a model. • Methods The formalism of Lindenmayer systems (L-systems) was chosen for the development of a structural model of the faba bean canopy, providing both numerical and dynamic graphical outputs. It was parameterized according to the results obtained through detailed morphological and phenological descriptions that capture the detailed geometry and topology of the crop. The analysis distinguishes between relationships of general application for all sowing dates and stem ranks and others valid only for all stems of a single crop cycle. • Results and Conclusions The results reveal that in faba bean, structural parameterization valid for the entire plant may be drawn from a single stem. ALAMEDA was formed by linking the structural model to the growth model ‘Simulation d'Allongement des Feuilles’ (SAF) with the ability to simulate approx. 3500 crop organs and components of a group of nine plants. Model performance was verified for organ length, plant height and leaf area. The L-system formalism was able to capture the complex architecture of canopy leaf area of this indeterminate crop and, with the growth relationships, generate a 3D dynamic crop simulation. Future development and improvement of the model are discussed. PMID:16390842
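As a generic illustration of the L-system formalism on which ALAMEDA is based (parallel string rewriting), the short Python sketch below derives a branching structure from an axiom and a rule set. The alphabet and rules are placeholders, not the actual faba bean parameterization.

```python
# Generic Lindenmayer-system (L-system) illustration: all symbols of the
# current string are rewritten in parallel at each derivation step.
# The alphabet and rules below are hypothetical, not the ALAMEDA model.
def lsystem(axiom, rules, steps):
    s = axiom
    for _ in range(steps):
        s = "".join(rules.get(ch, ch) for ch in s)   # parallel rewriting
    return s

# A (apex) produces an internode I, a lateral leaf [L], and a new apex.
rules = {"A": "I[L]A"}
for n in range(4):
    print(n, lsystem("A", rules, n))
```

Interpreting the resulting string with turtle-graphics-like semantics (brackets as branch push/pop) is what turns such derivations into the dynamic 3D graphical output mentioned in the abstract.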
Normal order and extended Wick theorem for a multiconfiguration reference wave function
NASA Astrophysics Data System (ADS)
Kutzelnigg, Werner; Mukherjee, Debashis
1997-07-01
A generalization of normal ordering and of Wick's theorem with respect to an arbitrary reference function Φ as some generalized "physical vacuum" is formulated in a different (but essentially equivalent) way than that suggested previously by one of the present authors. Guiding principles are that normal order operators with respect to any reference state must be expressible as linear combinations of those with respect to the genuine vacuum, that the vacuum expectation value of a normal order operator must vanish (with respect to the vacuum to which it is in normal order), and that the well-known formalism for a single Slater determinant as physical vacuum must be contained as a special case. The derivation is largely based on the concepts of "Quantum Chemistry in Fock space," which means that particle-number-conserving operators (excitation operators) play a central role. Nevertheless, the contraction rules in the frame of a generalized Wick theorem are derived, that hold for non-particle-number-conserving operators as well. The contraction rules are formulated and illustrated in terms of diagrams. The contractions involve the "residual n-particle density matrices" λ, which are the irreducible (non-factorizable) parts of the conventional n-particle density matrices γ, in the sense of a cumulant expansion for the density. A spinfree formulation is presented as well. The expression of the Hamiltonian in normal order with respect to a multiconfiguration reference function leads to a natural definition of a generalized Fock operator. MC-SCF-theory is easily worked out in this context. The paper concludes with a discussion of the excited configurations and the first-order interacting space, that underlies a perturbative coupled cluster type correction to the MCSCF function for an arbitrary reference function, and with general implications of the new formalism, that is related to "internally contracted multireference configuration interaction." The present generalization of normal ordering is not only valid for arbitrary reference functions, but also if the reference state is an ensemble state.
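As a concrete instance of the cumulant structure described above (a standard relation in this formalism, written here in generic notation rather than copied from the paper, so index and normalization conventions may differ), the two-particle density matrix separates into antisymmetrized products of one-particle density matrices plus an irreducible cumulant:

```latex
\gamma^{(2)}_{pq,rs} \;=\; \gamma_{pr}\,\gamma_{qs} \;-\; \gamma_{ps}\,\gamma_{qr}
\;+\; \lambda^{(2)}_{pq,rs},
```

where γ_{pr} is the one-particle density matrix of the reference Φ and the cumulant λ^{(2)} vanishes when Φ is a single Slater determinant, so that the conventional Wick theorem is recovered as the special case mentioned in the abstract.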
Gulf Coast Clean Energy Application Center
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dillingham, Gavin
The Gulf Coast Clean Energy Application Center was initiated to significantly improve market and regulatory conditions for the implementation of combined heat and power technologies. The GC CEAC was responsible for the development of CHP in Texas, Louisiana and Oklahoma. Through this program we employed a variety of outreach and education techniques, developed and deployed assessment tools and conducted market assessments. These efforts resulted in the growth of the combined heat and power market in the Gulf Coast region with a realization of more efficient energy generation, reduced emissions and a more resilient infrastructure. Specific to research, we did not formally investigate any techniques with any formal research design or methodology.
Formal Physical Therapy After Total Hip Arthroplasty Is Not Required: A Randomized Controlled Trial.
Austin, Matthew S; Urbani, Brian T; Fleischman, Andrew N; Fernando, Navin D; Purtill, James J; Hozack, William J; Parvizi, Javad; Rothman, Richard H
2017-04-19
The value of formal physical therapy after total hip arthroplasty is unknown. With substantial changes that have occurred in surgical and anesthesia techniques, self-directed therapy may be efficacious in restoring function to patients undergoing total hip arthroplasty. We conducted a single-center, randomized trial of 120 patients undergoing primary, unilateral total hip arthroplasty who were eligible for direct home discharge. The experimental group followed a self-directed home exercise program for 10 weeks. The control group received the standard protocol for physical therapy that included in-home visits with a physical therapist for the first 2 weeks followed by formal outpatient physical therapy for 8 weeks. Functional outcomes were measured using validated instruments including the Harris hip score (HHS), the Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC), and the Short Form-36 Health Survey (SF-36) preoperatively, at 1 month postoperatively, and at 6 to 12 months postoperatively. Of 120 randomized patients, 108 were included in the final analysis. Ten patients (19%) randomized to unsupervised home exercise and 20 patients (37%) randomized to formal outpatient therapy crossed over between groups. There was no significant difference in any of the measured functional outcomes between patients receiving formal therapy (n = 54) and those participating in unsupervised home exercise (n = 54) at any time point (HHS, p = 0.82; WOMAC, p = 0.80; and SF-36 physical health, p = 0.90). This randomized trial suggests that unsupervised home exercise is both safe and efficacious for a majority of patients undergoing total hip arthroplasty, and formal physical therapy may not be required. Therapeutic Level I. See Instructions for Authors for a complete description of levels of evidence.
Farran, Carol J; Etkin, Caryn D; Eisenstein, Amy; Paun, Olimpia; Rajan, Kumar B; Sweet, Cynthia M Castro; McCann, Judith J; Barnes, Lisa L; Shah, Raj C; Evans, Denis A
2017-01-01
Objective: Alzheimer’s disease and related dementias (ADRD) affect more than five million Americans and their family caregivers. Caregiving creates challenges, may contribute to decreased caregiver health and is associated with $9.7 billion of caregiver health care costs. The purpose of this 12-month randomized clinical trial (RCT) was to examine if the Enhancing Physical Activity Intervention (EPAI), a moderate to vigorous physical activity (MVPA) treatment group, versus the Caregiver Skill Building Intervention (CSBI) control, would have greater: (1) MVPA adherence; and (2) physical function. Methods: Caregivers were randomly assigned to EPAI or CSBI (N=211). MVPA was assessed using a self-report measure; and physical function was objectively assessed using two measures. Intention-to-treat analyses used descriptive, categorical and generalized estimating equations (GEE), with an exchangeable working correlation matrix and a log link, to examine main effects and interactions in change of MVPA and physical function over time. Results: At 12 months, EPAI significantly increased MVPA (p < 0.001) and number of steps (p < 0.01); maintained stable caregiving hours and use of formal services; while CSBI increased hours of caregiving (p < 0.001) and used more formal services (p < 0.02). Qualitative physical function data indicated that approximately 50% of caregivers had difficulties completing physical function tests. Conclusion: The EPAI had a stronger 12-month effect on caregiver MVPA and physical function, as well as maintaining stability of caregiving hours and formal service use; while CSBI increased caregiving hours and use of formal services. A study limitation included greater EPAI versus CSBI attrition. Future directions are proposed for dementia family caregiver physical activity research. PMID:28752016
Extension of the Kohn-Sham formulation of density functional theory to finite temperature
NASA Astrophysics Data System (ADS)
Gonis, A.; Däne, M.
2018-05-01
Based on Mermin's extension of the Hohenberg and Kohn theorems to non-zero temperature, the Kohn-Sham formulation of density functional theory (KS-DFT) is generalized to finite temperature. We show that present formulations are inconsistent, with Mermin's functional containing expressions, in particular those describing the Coulomb energy, that defy derivation and are even in violation of rules of logical inference. Moreover, current methodology is in violation of fundamental laws of both quantum and classical mechanics. Based on this feature, we demonstrate the impossibility of extending the KS formalism to finite temperature through the self-consistent solutions of the single-particle Schrödinger equation at T > 0. Guided by the form of Mermin's functional that depends on the eigenstates of a Hamiltonian, determined at T = 0, we base our extension of KS-DFT on the determination of the excited states of a non-interacting system at the zero of temperature. The resulting formulation is consistent with that of Mermin, constructing the free energy at T > 0 in terms of the excited states of a non-interacting Hamiltonian (system) that, within the KS formalism, are described by Slater determinants. To determine the excited states at T = 0, use is made of the extension of the Hohenberg and Kohn theorems to excited states presented in previous work, applied here to a non-interacting collection of replicas of a non-interacting N-particle system, whose ground state density is taken to match that of K non-interacting replicas of an interacting N-particle system at T = 0. The formalism allows for an ever denser population of the excitation spectrum of a Hamiltonian, within the KS approximation. The form of the auxiliary potential (Kohn-Sham potential) is formally identical to that in the ground state formalism, with the contribution of the Coulomb energy provided by the derivative of the Coulomb energy in all excited states taken into account. Once the excited states are determined, the minimum of the free energy within the KS formalism follows immediately in the form of Mermin's functional, but with the exact excited states in that functional represented by Slater determinants obtained through self-consistency conditions at the zero of temperature. It is emphasized that, in departure from all existing formulations, no self-consistency conditions are implemented at finite T; as we show, in fact, such formulations are rigorously blocked.
Putkinen, Vesa; Tervaniemi, Mari; Saarikivi, Katri; Huotilainen, Minna
2015-03-01
Adult musicians show superior neural sound discrimination when compared to nonmusicians. However, it is unclear whether these group differences reflect the effects of experience or preexisting neural enhancement in individuals who seek out musical training. Tracking how brain function matures over time in musically trained and nontrained children can shed light on this issue. Here, we review our recent longitudinal event-related potential (ERP) studies that examine how formal musical training and less formal musical activities influence the maturation of brain responses related to sound discrimination and auditory attention. These studies found that musically trained school-aged children and preschool-aged children attending a musical playschool show more rapid maturation of neural sound discrimination than their control peers. Importantly, we found no evidence for pretraining group differences. In a related cross-sectional study, we found ERP and behavioral evidence for improved executive functions and control over auditory novelty processing in musically trained school-aged children and adolescents. Taken together, these studies provide evidence for the causal role of formal musical training and less formal musical activities in shaping the development of important neural auditory skills and suggest transfer effects with domain-general implications. © 2015 New York Academy of Sciences.
Combining Variables, Controlling Variables, and Proportions: Is There a Psychological Link?
ERIC Educational Resources Information Center
Lawson, Anton E.
1979-01-01
Investigated the degree of relationship among the performance of 28 seventh grade students on the following three formal operations tasks: chemical combinations, bending rods, and balance beam. Results show that task performance ranged widely from early concrete operational to fully operational. (HM)
Abel, David L.
2011-01-01
Is life physicochemically unique? No. Is life unique? Yes. Life manifests innumerable formalisms that cannot be generated or explained by physicodynamics alone. Life pursues thousands of biofunctional goals, not the least of which is staying alive. Neither physicodynamics, nor evolution, pursue goals. Life is largely directed by linear digital programming and by the Prescriptive Information (PI) instantiated particularly into physicodynamically indeterminate nucleotide sequencing. Epigenomic controls only compound the sophistication of these formalisms. Life employs representationalism through the use of symbol systems. Life manifests autonomy, homeostasis far from equilibrium in the harshest of environments, positive and negative feedback mechanisms, prevention and correction of its own errors, and organization of its components into Sustained Functional Systems (SFS). Chance and necessity—heat agitation and the cause-and-effect determinism of nature’s orderliness—cannot spawn formalisms such as mathematics, language, symbol systems, coding, decoding, logic, organization (not to be confused with mere self-ordering), integration of circuits, computational success, and the pursuit of functionality. All of these characteristics of life are formal, not physical. PMID:25382119
Formal Verification at System Level
NASA Astrophysics Data System (ADS)
Mazzini, S.; Puri, S.; Mari, F.; Melatti, I.; Tronci, E.
2009-05-01
System Level Analysis calls for a language comprehensible to experts with different backgrounds and yet precise enough to support meaningful analyses. SysML is emerging as an effective balance between such conflicting goals. In this paper we outline some of the results on SysML-based system-level functional formal verification obtained by an ESA/ESTEC study carried out in collaboration between INTECS and La Sapienza University of Roma. The study focuses on SysML-based system-level functional requirements techniques.
Spin-dependent optimized effective potential formalism for open and closed systems
NASA Astrophysics Data System (ADS)
Rigamonti, S.; Horowitz, C. M.; Proetto, C. R.
2015-12-01
Orbital-based exchange (x) correlation (c) energy functionals, leading to the optimized effective potential (OEP) formalism of density-functional theory (DFT), are gaining increasing importance in ground-state DFT, as applied to the calculation of the electronic structure of closed systems with a fixed number of particles, such as atoms and molecules. These types of functionals also prove to be extremely valuable for dealing with solid-state systems of reduced dimensionality, such as electrons trapped at the interface between two different semiconductors, or narrow metallic slabs. In both cases, the electrons build a quasi-two-dimensional electron gas, or Q2DEG. We provide here a general DFT-OEP formal scheme valid for Q2DEGs that are either isolated (closed) or in contact with a particle bath (open), and show that the two representations are equivalent, the choice of one or the other being essentially a question of convenience. Based on this equivalence, a calculation scheme is proposed which avoids the noninvertibility problem of the density response function for closed systems. We also consider the case of spontaneously spin-polarized Q2DEGs, and find that far from the region where the Q2DEG is localized, the exact x-only exchange potential approaches two different, spin-dependent asymptotic limits. As an example, aside from these formal results, we also provide numerical results for a spin-polarized jellium slab, using the new OEP formalism for closed systems. The accuracy of the Krieger-Li-Iafrate approximation has also been tested for the same system, and found to be as good as it is for atoms and molecules.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hummer, G.; Garcia, A.E.; Soumpasis, D.M.
1994-10-01
To understand the functioning of living organisms on a molecular level, it is crucial to dissect the intricate interplay of the immense number of biological molecules. Most of the biochemical processes in cells occur in a liquid environment formed mainly by water and ions. This solvent environment plays an important role in biological systems. The potential-of-mean-force (PMF) formalism attempts to describe quantitatively the interactions of the solvent with biological macromolecules on the basis of an approximate statistical-mechanical representation. At its current stage of development, it deals with ionic effects on the biomolecular structure and with the structural hydration of biomolecules. The underlying idea of the PMF formalism is to identify the dominant sources of interactions and incorporate these interactions into the theoretical formalism using PMFs (or particle correlation functions) extracted from bulk-liquid systems. In the following, the authors shall briefly outline the statistical-mechanical foundation of the PMF formalism and introduce the PMF expansion formalism, which is intimately linked to superposition approximations for higher-order particle correlation functions. The authors shall then sketch applications, which describe the effects of the ionic environment on nucleic-acid structure. Finally, the authors shall present the more recent extension of the PMF idea to describe quantitatively the structural hydration of biomolecules. Results for the interface of ice and water and for the hydration of deoxyribonucleic acid (DNA) will be discussed.
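The textbook relation behind extracting PMFs from bulk-liquid correlation functions (standard background, not quoted from the record) links the potential of mean force between two species to their pair correlation function:

```latex
W_{ij}(r) \;=\; -\,k_{\mathrm B}T\,\ln g_{ij}(r),
```

so that tabulated g_{ij}(r) for water and ionic species in the bulk liquid can be converted into effective solvent-mediated interactions acting on the sites of a biomolecule, which is the sense in which the formalism imports bulk-liquid information into the biomolecular problem.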
NASA Astrophysics Data System (ADS)
McCaul, G. M. G.; Lorenz, C. D.; Kantorovich, L.
2017-03-01
We present a partition-free approach to the evolution of density matrices for open quantum systems coupled to a harmonic environment. The influence functional formalism combined with a two-time Hubbard-Stratonovich transformation allows us to derive a set of exact differential equations for the reduced density matrix of an open system, termed the extended stochastic Liouville-von Neumann equation. Our approach generalizes previous work based on Caldeira-Leggett models and a partitioned initial density matrix. This provides a simple, yet exact, closed-form description for the evolution of open systems from equilibriated initial conditions. The applicability of this model and the potential for numerical implementations are also discussed.
Prototype design based on NX subdivision modeling application
NASA Astrophysics Data System (ADS)
Zhan, Xianghui; Li, Xiaoda
2018-04-01
Prototype design is an important part of product design, providing a quick and easy way to draw a three-dimensional product prototype. In combination with actual production, the prototype can be modified several times, resulting in a highly efficient and reasonable design before the formal design. Subdivision modeling is a common method of modeling product prototypes. Through Subdivision modeling, a three-dimensional product prototype can be obtained in a short time with simple operations. This paper discusses the operation method of Subdivision modeling for geometry. Taking a vacuum cleaner as an example, the NX Subdivision modeling functions are applied. Finally, the development of Subdivision modeling is forecast.
Theorem Proving in Intel Hardware Design
NASA Technical Reports Server (NTRS)
O'Leary, John
2009-01-01
For the past decade, a framework combining model checking (symbolic trajectory evaluation) and higher-order logic theorem proving has been in production use at Intel. Our tools and methodology have been used to formally verify execution cluster functionality (including floating-point operations) for a number of Intel products, including the Pentium® 4 and Core™ i7 processors. Hardware verification in 2009 is much more challenging than it was in 1999 - today's CPU chip designs contain many processor cores and significant firmware content. This talk will attempt to distill the lessons learned over the past ten years, discuss how they apply to today's problems, and outline some future directions.
NASA Astrophysics Data System (ADS)
Tkacz, J.; Bukowiec, A.; Doligalski, M.
2017-08-01
The paper presents a method for the modeling and implementation of concurrent controllers. Concurrent controllers are specified by Petri nets. The Petri nets are then decomposed using a symbolic deduction method of analysis. Formal methods, such as a sequent calculus system together with elements of Thelen's algorithm, have been used here. As a result, linked state machines (LSMs) are obtained. Each FSM is implemented using methods of structural decomposition during the logic synthesis process. The method of multiple encoding of microinstructions has been applied. This leads to a decreased number of Boolean functions realized by the combinational part of the FSM. The additional decoder can be implemented with the use of memory blocks.
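As a generic illustration of the Petri net execution semantics that underlies such controller specifications (an assumption-level sketch, not the paper's decomposition or encoding method), the following Python snippet models a fork-join of two concurrent branches: a transition is enabled when all of its input places hold tokens, and firing it moves tokens from inputs to outputs.

```python
# Generic Petri net semantics sketch (illustration only): a transition fires
# when every input place holds at least one token; firing consumes one token
# per input place and produces one token per output place.
def enabled(marking, transition):
    inputs, _ = transition
    return all(marking.get(p, 0) > 0 for p in inputs)

def fire(marking, transition):
    inputs, outputs = transition
    m = dict(marking)
    for p in inputs:
        m[p] -= 1
    for p in outputs:
        m[p] = m.get(p, 0) + 1
    return m

# two concurrent branches forked by t1 and joined by t4 (hypothetical net)
net = {
    "t1": (["p1"], ["p2", "p3"]),
    "t2": (["p2"], ["p4"]),
    "t3": (["p3"], ["p5"]),
    "t4": (["p4", "p5"], ["p1"]),
}
marking = {"p1": 1}
for t in ["t1", "t2", "t3", "t4"]:
    if enabled(marking, net[t]):
        marking = fire(marking, net[t])
print(marking)   # the join t4 returns the single token to p1
```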
Zhou, Yirong; Breit, Bernhard
2017-12-22
An unprecedented asymmetric N-H functionalization of quinazolinones with allenes and allylic carbonates was successfully achieved by rhodium catalysis with the assistance of chiral bidentate diphosphine ligands. The high efficiency and practicality of this method was demonstrated by a low catalyst loading of 1 mol % as well as excellent chemo-, regio-, and enantioselectivities with broad functional group compatibility. Furthermore, this newly developed strategy was applied as key step in the first enantioselective formal total synthesis of (-)-chaetominine. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
ERIC Educational Resources Information Center
Abrandt, Madeleine
This study investigated whether students of physiotherapy experienced the concepts "health," "movement," "function," and "interaction" differently during formal education and after some professional experience. Data were gathered by interviewing two groups of physiotherapy students at Linkoping University…
Organization Structure and Administrative Control: A Question of Dimensionality.
ERIC Educational Resources Information Center
Montanari, John R.; Freedman, Sara M.
1981-01-01
Used a sample of national firms (N=836) to investigate the relationship between specialization, formalization, and centralization in the functional work unit. Data indicated that the three variables compose a single dimension of organizational structure. Another finding was that, within this dimension, specialization, formalization, and…
Fusing Quantitative Requirements Analysis with Model-based Systems Engineering
NASA Technical Reports Server (NTRS)
Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven
2006-01-01
A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.
Nuclear spin relaxation due to chemical shift anisotropy of gas-phase 129Xe.
Hanni, Matti; Lantto, Perttu; Vaara, Juha
2011-08-14
Nuclear spin relaxation provides detailed dynamical information on molecular systems and materials. Here, first-principles modeling of the chemical shift anisotropy (CSA) relaxation time for the prototypic monoatomic (129)Xe gas is carried out, both complementing and predicting the results of NMR measurements. Our approach is based on molecular dynamics simulations combined with pre-parametrized ab initio binary nuclear shielding tensors, an "NMR force field". By using the Redfield relaxation formalism, the simulated CSA time correlation functions lead to spectral density functions that, for the first time, quantitatively determine the experimental spin-lattice relaxation times T(1). The quality requirements on both the Xe-Xe interaction potential and binary shielding tensor are investigated in the context of CSA T(1). Persistent dimers Xe(2) are found to be responsible for the CSA relaxation mechanism in the low-density limit of the gas, completely in line with the earlier experimental findings.
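For orientation (standard Redfield-theory background, not reproduced from the record), in the simplest case of an axially symmetric shielding tensor with anisotropy Δσ and a single-exponential correlation function with correlation time τ_c, the CSA spin-lattice relaxation rate takes the form

```latex
\frac{1}{T_{1}^{\mathrm{CSA}}}
\;=\; \frac{2}{15}\,\Delta\sigma^{2}\,\omega_{0}^{2}\,
\frac{\tau_{c}}{1+\omega_{0}^{2}\tau_{c}^{2}} ,
```

where ω_0 is the Larmor frequency; the simulations described above replace this single-exponential assumption with spectral densities computed from the trajectory-based CSA correlation functions.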
All-phosphorus flexible devices with non-collinear electrodes: a first principles study.
Li, Junjun; Ruan, Lufeng; Wu, Zewen; Zhang, Guiling; Wang, Yin
2018-03-07
With the continuous expansion of the family of two-dimensional (2D) materials, flexible electronics based on 2D materials have quickly emerged. Theoretically, predicting the transport properties of the flexible devices made up of 2D materials using first principles is of great importance. Using density functional theory combined with the non-equilibrium Green's function formalism, we calculated the transport properties of all-phosphorus flexible devices with non-collinear electrodes, and the results predicted that the device with compressed metallic phosphorene electrodes sandwiching a P-type semiconducting phosphorene shows a better and robust conducting behavior against the bending of the semiconducting region when the angle between the two electrodes is less than 45°, which indicates that this system is very promising for flexible electronics. The calculation of a quantum transport system with non-collinear electrodes demonstrated in this work will provide more interesting information on mesoscopic material systems and related devices.
Petri net-based dependability modeling methodology for reconfigurable field programmable gate arrays
NASA Astrophysics Data System (ADS)
Graczyk, Rafał; Orleański, Piotr; Poźniak, Krzysztof
2015-09-01
Dependability modeling is an important issue for aerospace and space equipment designers. From a system-level perspective, one has to choose from a multitude of possible architectures, redundancy levels and component combinations in a way that meets the desired properties and dependability and finally fits within the required cost and time budgets. Modeling of such systems is getting harder as their levels of complexity grow, together with the demand for more functional and flexible, yet more available, systems that govern more and more crucial parts of our civilization's infrastructure (aerospace transport systems, telecommunications, exploration probes). In this article a promising method of modeling complex systems using Petri nets is introduced in the context of qualitative and quantitative dependability analysis. This method, although it has some limitations and drawbacks, still offers a convenient visual formal method of describing system behavior on different levels (functional, timing, random events) and a straightforward correspondence to the underlying mathematical engine, perfect for simulations and engineering support.
Transport properties of CNT/oligosilane/CNT heterojunctions
NASA Astrophysics Data System (ADS)
Yu, J.; Zhang, G. L.; Shang, Y.; Wang, K. D.; Zhang, H.; Sun, M.; Liu, B.; Zeng, T.
2013-02-01
Combining the non-equilibrium Green's function formalism with density functional theory, we systematically studied the transport properties of nine CNT/oligosilane/CNT heterojunctions. We found that incorporating an oligosilane linkage at the carbon nanotube mouth can significantly tune the transport properties compared with pure oligosilane and pure CNT. P- and B-doping of the oligosilane moiety not only enhances the conductivity but also gives rise to multiple negative differential resistance behavior in the CNT/oligosilane/CNT heterojunctions. The concentration of heteroatoms plays an important role in the transport properties of the heterojunctions, while the number of oligosilane linkages has little effect on the conductivity. The B-doped CNT/oligosilane/CNT heterojunctions show higher conductivity than the P-doped ones. The p-n junction formed by B- and P-codoping exhibits a rectifying effect, with a rectification ratio of up to 7.19.
Spin-filtering and giant magnetoresistance effects in polyacetylene-based molecular devices
NASA Astrophysics Data System (ADS)
Chen, Tong; Yan, Shenlang; Xu, Liang; Liu, Desheng; Li, Quan; Wang, Lingling; Long, Mengqiu
2017-07-01
Using the non-equilibrium Green's function formalism in combination with density functional theory, we performed ab initio calculations of spin-dependent electron transport in molecular devices consisting of a polyacetylene (CnHn+1) chain vertically attached to a carbon chain sandwiched between two semi-infinite zigzag-edged graphene nanoribbon electrodes. Spin-charge transport in the device could be modulated to different magnetic configurations by an external magnetic field. The results showed that single spin conduction could be obtained. Specifically, the proposed CnHn+1 devices exhibited several interesting effects, including (dual) spin filtering, spin negative differential resistance, odd-even oscillation, and magnetoresistance (MR). Marked spin polarization with a filtering efficiency of up to 100% over a large bias range was found, and the highest MR ratio for the CnHn+1 junctions reached 4.6 × 10^4. In addition, the physical mechanisms for these phenomena were also revealed.
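Two of the figures of merit quoted above are simple ratios of spin-resolved transmissions and conductances. A short sketch with hypothetical numbers, not the paper's data:

    def spin_filter_efficiency(T_up, T_down):
        """SFE = (T_up - T_down) / (T_up + T_down), evaluated at the Fermi level or a given bias."""
        return (T_up - T_down) / (T_up + T_down)

    def magnetoresistance_ratio(G_parallel, G_antiparallel):
        """MR = (G_P - G_AP) / G_AP, often quoted in percent."""
        return (G_parallel - G_antiparallel) / G_antiparallel

    # hypothetical zero-bias values, for illustration only
    print(spin_filter_efficiency(0.95, 0.002))          # close to 1, i.e. ~100% filtering
    print(magnetoresistance_ratio(1.2e-2, 2.6e-7))      # a large MR ratio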
2012-01-01
Background In New Zealand, around 45,000 people live with stroke and many studies have reported that benefits gained during initial rehabilitation are not sustained. Evidence indicates that participation in physical interventions can prevent the functional decline that frequently occurs after discharge from acute care facilities. However, on-going stroke services provision following discharge from acute care is often related to non-medical factors such as availability of resources and geographical location. Currently most people receive no treatment beyond three months post stroke. The study aims to determine if the Augmented Community Telerehabilitation Intervention (ACTIV) results in better physical function for people with stroke than usual care, as measured by the Stroke Impact Scale, physical subcomponent. Methods/design This study will use a multi-site, two-arm, assessor blinded, parallel randomised controlled trial design. People will be eligible if they have had their first ever stroke, are over 20 and have some physical impairment in either arm or leg, or both. Following discharge from formal physiotherapy services (inpatient, outpatient or community), participants will be randomised into ACTIV or usual care. ACTIV uses readily available technology, telephone and mobile phones, combined with face-to-face visits from a physiotherapist over a six-month period, to help people with stroke resume activities they enjoyed before the stroke. The impact of stroke on physical function and quality of life will be assessed, measures of cost will be collected and a discrete choice survey will be used to measure preferences for rehabilitation options. These outcomes will be collected at baseline, six months and 12 months. In-depth interviews will be used to explore the experiences of people participating in the intervention arm of the study. Discussion The lack of on-going rehabilitation for people with stroke diminishes the chance of their best possible outcome and may contribute to a functional decline following discharge from formal rehabilitation. Best practice guidelines recommend a prolonged period of rehabilitation; however, this is expensive and therefore not undertaken in most publicly funded centres. An effective, cost-effective, and preference-sensitive therapy using basic technology to assist programme delivery may improve patient autonomy as they leave formal rehabilitation and return home. Trial registration ACTRN12612000464864 PMID:23216861
Projector Augmented Wave formulation of orbital-dependent exchange-correlation functionals
NASA Astrophysics Data System (ADS)
Xu, Xiao; Holzwarth, N. A. W.
2012-02-01
The use of orbital-dependent exchange-correlation functionals within electronic structure calculations has recently received renewed attention for improving the accuracy of the calculations, especially for correcting self-interaction errors. Since the Projector Augmented Wave (PAW) method [P. Blöchl, Phys. Rev. B 50, 17953 (1994)] is an efficient pseudopotential-like scheme which ensures accurate evaluation of all multipole moments of direct and exchange Coulomb integrals, it is a natural choice for implementing orbital-dependent formalisms. Using Fock exchange as an example of an orbital-dependent functional, we developed the formulation and numerical implementation of the approximate optimized effective potential formalism of Krieger, Li, and Iafrate (KLI) [J. B. Krieger, Y. Li, and G. J. Iafrate, Phys. Rev. A 45, 101 (1992)] within the PAW method, comparing results with the analogous Hartree-Fock treatment [Xiao Xu and N. A. W. Holzwarth, Phys. Rev. B 81, 245105 (2010); 84, 155113 (2011)]. Test results are presented for ground state properties of two well-known materials -- diamond and LiF. This formalism can be extended to treat orbital-dependent functionals more generally.
Formalization of software requirements for information systems using fuzzy logic
NASA Astrophysics Data System (ADS)
Yegorov, Y. S.; Milov, V. R.; Kvasov, A. S.; Sorokoumova, S. N.; Suvorova, O. V.
2018-05-01
The paper considers an approach to the design of information systems based on flexible software development methodologies. The possibility of improving the management of the life cycle of information systems by assessing the functional relationship between requirements and business objectives is described. An approach is proposed to establish the relationship between the degree of achievement of business objectives and the fulfillment of requirements for the projected information system. Solutions are described that formalize the formation of functional and non-functional requirements with the help of the fuzzy logic apparatus. The form of the objective function is constructed on the basis of expert knowledge and refined via learning from a very small data set.
Smarr formula for Lovelock black holes: A Lagrangian approach
NASA Astrophysics Data System (ADS)
Liberati, Stefano; Pacilio, Costantino
2016-04-01
The mass formula for black holes can be formally expressed in terms of a Noether charge surface integral plus a suitable volume integral, for any gravitational theory. The integrals can be constructed as an application of Wald's formalism. We apply this formalism to compute the mass and the Smarr formula for static Lovelock black holes. Finally, we propose a new prescription for Wald's entropy in the case of Lovelock black holes, which takes into account topological contributions to the entropy functional.
Electromagnetic δ-function sphere
NASA Astrophysics Data System (ADS)
Parashar, Prachi; Milton, Kimball A.; Shajesh, K. V.; Brevik, Iver
2017-10-01
We develop a formalism to extend our previous work on the electromagnetic δ-function plates to a spherical surface. The electric (λe) and magnetic (λg) couplings to the surface are through δ-function potentials defining the dielectric permittivity and the diamagnetic permeability, with two anisotropic coupling tensors. The formalism incorporates dispersion. The electromagnetic Green's dyadic breaks up into transverse electric and transverse magnetic parts. We derive the Casimir interaction energy between two concentric δ-function spheres in this formalism and show that it has the correct asymptotic flat-plate limit. We systematically derive expressions for the Casimir self-energy and the total stress on a spherical shell using a δ-function potential, properly regulated by temporal and spatial point splitting, which are different from the conventional temporal point splitting. In the strong-coupling limit, we recover the usual result for the perfectly conducting spherical shell but in addition there is an integrated curvature-squared divergent contribution. For finite coupling, there are additional divergent contributions; in particular, there is a familiar logarithmic divergence occurring in the third order of the uniform asymptotic expansion that renders it impossible to extract a unique finite energy except in the case of an isorefractive sphere, which translates into λg = -λe.
NASA Astrophysics Data System (ADS)
Nurhidayati, I.; Suparmi, A.; Cari, C.
2018-03-01
The Schrödinger equation has been extended by applying the minimal length formalism to a trigonometric potential. The wave function and energy spectra, obtained using the hypergeometric method, were used to describe the behavior of a subatomic particle. The results show that the energy increases with both the minimal length parameter and the potential parameter. The energies were calculated numerically using MATLAB.
Topological vertex formalism with O5-plane
NASA Astrophysics Data System (ADS)
Kim, Sung-Soo; Yagi, Futoshi
2018-01-01
We propose a new topological vertex formalism for a type IIB (p,q) 5-brane web with an O5-plane. We apply our proposal to five-dimensional N=1 Sp(1) gauge theory with Nf = 0, 1, 8 flavors to compute the topological string partition functions and check the agreement with the known results. Especially for the Nf = 8 case, which corresponds to E-string theory on a circle, we obtain a new, yet simple, expression of the partition function as a sum over two Young diagrams.
Dissipative transport in superlattices within the Wigner function formalism
Jonasson, O.; Knezevic, I.
2015-07-30
Here, we employ the Wigner function formalism to simulate partially coherent, dissipative electron transport in biased semiconductor superlattices. We introduce a model collision integral with terms that describe energy dissipation, momentum relaxation, and the decay of spatial coherences (localization). Based on a particle-based solution to the Wigner transport equation with the model collision integral, we simulate quantum electronic transport at 10 K in a GaAs/AlGaAs superlattice and accurately reproduce its current density vs field characteristics obtained in experiment.
NASA Astrophysics Data System (ADS)
Ayalon, Michal; Watson, Anne; Lerman, Steve
2016-09-01
This study examines expressions of reasoning by some higher achieving 11 to 18 year-old English students responding to a survey consisting of function tasks developed in collaboration with their teachers. We report on 70 students, 10 from each of English years 7-13. Iterative and comparative analysis identified capabilities and difficulties of students and suggested conjectures concerning links between the affordances of the tasks, the curriculum, and students' responses. The paper focuses on five of the survey tasks and highlights connections between informal and formal expressions of reasoning about variables in learning. We introduce the notion of `schooled' expressions of reasoning, neither formal nor informal, to emphasise the role of the formatting tools introduced in school that shape future understanding and reasoning.
Neuhauser, Daniel; Gao, Yi; Arntsen, Christopher; Karshenas, Cyrus; Rabani, Eran; Baer, Roi
2014-08-15
We develop a formalism to calculate the quasiparticle energy within the GW many-body perturbation correction to the density functional theory. The occupied and virtual orbitals of the Kohn-Sham Hamiltonian are replaced by stochastic orbitals used to evaluate the Green function G, the polarization potential W, and, thereby, the GW self-energy. The stochastic GW (sGW) formalism relies on novel theoretical concepts such as stochastic time-dependent Hartree propagation, stochastic matrix compression, and spatial or temporal stochastic decoupling techniques. Beyond the theoretical interest, the formalism enables linear scaling GW calculations breaking the theoretical scaling limit for GW as well as circumventing the need for energy cutoff approximations. We illustrate the method for silicon nanocrystals of varying sizes with N_{e}>3000 electrons.
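The statistical trick underlying the stochastic-orbital machinery, replacing sums over orbitals by averages over random vectors, can be illustrated with the generic Hutchinson trace estimator. This sketch shows only the underlying idea, not the sGW implementation:

    import numpy as np

    def stochastic_trace(A, n_samples=200, seed=0):
        """Estimate Tr(A) as the mean of chi^T A chi over random +/-1 vectors chi."""
        rng = np.random.default_rng(seed)
        n = A.shape[0]
        samples = []
        for _ in range(n_samples):
            chi = rng.choice([-1.0, 1.0], size=n)
            samples.append(chi @ (A @ chi))
        return np.mean(samples), np.std(samples) / np.sqrt(n_samples)

    A = np.diag(np.linspace(0.1, 1.0, 500))      # toy matrix standing in for an operator
    estimate, error = stochastic_trace(A)
    print(estimate, "+/-", error, "exact:", np.trace(A))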
Formal Mentoring Relationships and Attachment Theory: Implications for Human Resource Development
ERIC Educational Resources Information Center
Germain, Marie-Line
2011-01-01
An attachment theory perspective of mentoring is presented to explain the degree of functionality of a mentor-protege formal match in an organizational setting. By focusing on Bowlby's (1969/1982) behavioral system of attachment and its triarchic taxonomy of secure, avoidant, and anxious-ambivalent attachment, previous conceptualizations are…
Avoiding School Management Conflicts and Crisis through Formal Communication
ERIC Educational Resources Information Center
Nwogbaga, David M. E.; Nwankwo, Oliver U.; Onwa, Doris O.
2015-01-01
This paper examined how conflicts and crisis can be avoided through formal communication. It was necessitated by the observation that most of the conflicts and crisis which tend to mar school management today are functions of the inconsistencies arising from "grapevines, rumours, and gossips" generated through informal communication…
Unified formalism for the generalized kth-order Hamilton-Jacobi problem
NASA Astrophysics Data System (ADS)
Colombo, Leonardo; de Léon, Manuel; Prieto-Martínez, Pedro Daniel; Román-Roy, Narciso
2014-08-01
The geometric formulation of the Hamilton-Jacobi theory enables us to generalize it to systems of higher-order ordinary differential equations. In this work we introduce the unified Lagrangian-Hamiltonian formalism for the geometric Hamilton-Jacobi theory on higher-order autonomous dynamical systems described by regular Lagrangian functions.
Helping System Engineers Bridge the Peaks
NASA Technical Reports Server (NTRS)
Rungta, Neha; Tkachuk, Oksana; Person, Suzette; Biatek, Jason; Whalen, Michael W.; Castle, Joseph; Gundy-Burlet, Karen
2014-01-01
In our experience at NASA, system engineers generally follow the Twin Peaks approach when developing safety-critical systems. However, iterations between the peaks require considerable manual, and in some cases duplicate, effort. A significant part of the manual effort stems from the fact that requirements are written in English natural language rather than a formal notation. In this work, we propose an approach that enables system engineers to leverage formal requirements and automated test generation to streamline iterations, effectively "bridging the peaks". The key to the approach is a formal language notation that a) system engineers are comfortable with, b) is supported by a family of automated V&V tools, and c) is semantically rich enough to describe the requirements of interest. We believe the combination of formalizing requirements and providing tool support to automate the iterations will lead to a more efficient Twin Peaks implementation at NASA.
Formal Education Level Versus Self-Rated Literacy as Predictors of Cognitive Aging
Shrira, Amit; Palgi, Yuval; Spalter, Tal; Ben-Ezra, Menachem; Shmotkin, Dov
2012-01-01
Objectives. To compare the prediction of cognitive functioning by formal education and self-rated literacy and the differences in prediction across younger and older cohorts. Method. Data on 28,535 respondents were drawn from a cross-sectional representative sample of community-dwelling older individuals (≥50), participating in the Survey of Health, Ageing, and Retirement in Europe. Education level was classified according to the International Standard Classification of Education 1997 (ISCED-1997); self-rated literacy was determined by having respondents rate their reading and writing on 1–5 scales. Cognitive functioning was measured by verbal recall, word fluency, and arithmetic ability. Results. Structural equation modeling demonstrated that self-rated literacy was more strongly associated with cognitive functioning than was education level, with or without additional exogenous variables (age, sex, household income, medical conditions, activities of daily living, reading eyesight, and country). The association between education level and cognitive functioning was weaker in older than in younger age groups, whereas the association between self-rated literacy and cognitive functioning showed the opposite trend. Discussion. Self-rated literacy was found to be a better predictor of late-life cognitive functioning than was the level of formal education. The results have implications for studies of age-related differences in which education level is taken into account. PMID:22421808
Formal education level versus self-rated literacy as predictors of cognitive aging.
Kavé, Gitit; Shrira, Amit; Palgi, Yuval; Spalter, Tal; Ben-Ezra, Menachem; Shmotkin, Dov
2012-11-01
To compare the prediction of cognitive functioning by formal education and self-rated literacy and the differences in prediction across younger and older cohorts. Data on 28,535 respondents were drawn from a cross-sectional representative sample of community-dwelling older individuals (≥50), participating in the Survey of Health, Ageing, and Retirement in Europe. Education level was classified according to the International Standard Classification of Education 1997 (ISCED-1997); self-rated literacy was determined by having respondents rate their reading and writing on 1-5 scales. Cognitive functioning was measured by verbal recall, word fluency, and arithmetic ability. Structural equation modeling demonstrated that self-rated literacy was more strongly associated with cognitive functioning than was education level, with or without additional exogenous variables (age, sex, household income, medical conditions, activities of daily living, reading eyesight, and country). The association between education level and cognitive functioning was weaker in older than in younger age groups, whereas the association between self-rated literacy and cognitive functioning showed the opposite trend. Self-rated literacy was found to be a better predictor of late-life cognitive functioning than was the level of formal education. The results have implications for studies of age-related differences in which education level is taken into account.
Systems engineering principles for the design of biomedical signal processing systems.
Faust, Oliver; Acharya U, Rajendra; Sputh, Bernhard H C; Min, Lim Choo
2011-06-01
Systems engineering aims to produce reliable systems which function according to specification. In this paper we follow a systems engineering approach to design a biomedical signal processing system. We discuss requirements capturing, specification definition, implementation and testing of a classification system. These steps are executed as formally as possible. The requirements, which motivate the system design, are based on diabetes research. The main requirement for the classification system is to be a reliable component of a machine which controls diabetes. Reliability is very important, because uncontrolled diabetes may lead to hyperglycaemia (raised blood sugar) and over a period of time may cause serious damage to many of the body systems, especially the nerves and blood vessels. In a second step, these requirements are refined into a formal CSP‖B model. The formal model expresses the system functionality in a clear and semantically strong way. Subsequently, the proven system model was translated into an implementation. This implementation was tested with use cases and failure cases. Formal modeling and automated model checking gave us deep insight into the system functionality. This insight enabled us to create a reliable and trustworthy implementation. With extensive tests we established trust in the reliability of the implementation. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Ab-initio Computation of the Electronic, transport, and Bulk Properties of Calcium Oxide.
NASA Astrophysics Data System (ADS)
Mbolle, Augustine; Banjara, Dipendra; Malozovsky, Yuriy; Franklin, Lashounda; Bagayoko, Diola
We report results from ab initio, self-consistent, local density approximation (LDA) calculations of electronic and related properties of calcium oxide (CaO) in the rock salt structure. We employed the Ceperley and Alder LDA potential and the linear combination of atomic orbitals (LCAO) formalism. Our calculations are non-relativistic. We implemented the LCAO formalism following the Bagayoko, Zhao, and Williams (BZW) method, as enhanced by Ekuma and Franklin (BZW-EF). The BZW-EF method involves a methodical search for the optimal basis set that yields the absolute minima of the occupied energies, as required by density functional theory (DFT). Our calculated indirect band gap of 6.91 eV, from towards the L point, is in excellent agreement with the experimental value of 6.93-7.7 eV at room temperature (RT). We have also calculated the total (DOS) and partial (pDOS) densities of states as well as the bulk modulus. Our calculated bulk modulus is in excellent agreement with experiment. Work funded in part by the US Department of Energy (DOE), National Nuclear Security Administration (NNSA) (Award No. DE-NA0002630), the National Science Foundation (NSF) (Award No. 1503226), LaSPACE, and LONI-SUBR.
Das, Siddhartha; Chakraborty, Suman
2011-08-01
In this paper, we quantitatively demonstrate that exponentially decaying attractive potentials can effectively mimic strong hydrophobic interactions between monomer units of a polymer chain dissolved in aqueous solvent. Classical approaches to modeling hydrophobic solvation interactions are based on invariant attractive length scales. However, we demonstrate here that the solvation interaction decay length may need to be posed as a function of the relative separation distances and the sizes of the interacting species (or beads or monomers) to replicate the necessary physical interactions. As an illustrative example, we derive a universal scaling relationship for a given solute-solvent combination between the solvation decay length, the bead radius, and the distance between the interacting beads. With our formalism, the hydrophobic component of the net attractive interaction between monomer units can be synergistically accounted for within the unified framework of a simple exponentially decaying potential law, where the characteristic decay length incorporates the distinctive and critical physical features of the underlying interaction. The present formalism, even in a mesoscopic computational framework, is capable of incorporating the essential physics of the appropriate solute-size dependence and solvent-interaction dependence in the hydrophobic force estimation, without explicitly resolving the underlying molecular level details.
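A sketch of the kind of potential discussed above: an exponentially decaying attraction whose decay length varies with bead size and separation. The specific scaling form, parameter names, and values below are hypothetical placeholders, not the scaling relationship derived in the paper.

    import numpy as np

    def solvation_potential(r, a, U0=1.0, lam0=0.3, alpha=0.5):
        """Assumed form: U(r) = -U0 exp(-(r - 2a)/lambda), with lambda depending on a and r."""
        lam = lam0 * (1.0 + alpha * a / r)        # separation- and size-dependent decay length (assumed)
        return -U0 * np.exp(-(r - 2.0 * a) / lam)

    r = np.linspace(0.65, 3.0, 6)                 # centre-to-centre separations (arbitrary units)
    print(solvation_potential(r, a=0.3))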
Unified theory for inhomogeneous thermoelectric generators and coolers including multistage devices.
Gerstenmaier, York Christian; Wachutka, Gerhard
2012-11-01
A novel generalized Lagrange multiplier method for functional optimization with inclusion of subsidiary conditions is presented and applied to the optimization of material distributions in thermoelectric converters. Multistaged devices are considered within the same formalism by inclusion of position-dependent electric current in the legs leading to a modified thermoelectric equation. Previous analytical solutions for maximized efficiencies for generators and coolers obtained by Sherman [J. Appl. Phys. 31, 1 (1960)], Snyder [Phys. Rev. B 86, 045202 (2012)], and Seifert et al. [Phys. Status Solidi A 207, 760 (2010)] by a method of local optimization of reduced efficiencies are recovered by independent proof. The outstanding maximization problems for generated electric power and cooling power can be solved swiftly numerically by solution of a differential equation-system obtained within the new formalism. As far as suitable materials are available, the inhomogeneous TE converters can have increased performance by use of purely temperature-dependent material properties in the thermoelectric legs or by use of purely spatial variation of material properties or by a combination of both. It turns out that the optimization domain is larger for the second kind of device which can, thus, outperform the first kind of device.
Liu, Hao; Zhu, Lili; Bai, Shuming; Shi, Qiang
2014-04-07
We investigated applications of the hierarchical equation of motion (HEOM) method to perform high order perturbation calculations of reduced quantum dynamics for a harmonic bath with arbitrary spectral densities. Three different schemes are used to decompose the bath spectral density into analytical forms that are suitable to the HEOM treatment: (1) The multiple Lorentzian mode model that can be obtained by numerically fitting the model spectral density. (2) The combined Debye and oscillatory Debye modes model that can be constructed by fitting the corresponding classical bath correlation function. (3) A new method that uses undamped harmonic oscillator modes explicitly in the HEOM formalism. Methods to extract system-bath correlations were investigated for the above bath decomposition schemes. We also show that HEOM in the undamped harmonic oscillator modes can give detailed information on the partial Wigner transform of the total density operator. Theoretical analysis and numerical simulations of the spin-Boson dynamics and the absorption line shape of molecular dimers show that the HEOM formalism for high order perturbations can serve as an important tool in studying the quantum dissipative dynamics in the intermediate coupling regime.
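Scheme (1) above amounts to a least-squares fit of the bath spectral density by a small set of analytic modes. A minimal sketch using a sum of Debye (Drude-Lorentz) terms as the fitting family and a made-up target J(ω); the paper's actual multiple-Lorentzian form and bath parameters are not reproduced here.

    import numpy as np
    from scipy.optimize import curve_fit

    def multi_debye(w, l1, g1, l2, g2):
        # two Debye modes: J(w) = sum_k 2 * lambda_k * gamma_k * w / (w^2 + gamma_k^2)
        return 2*l1*g1*w / (w**2 + g1**2) + 2*l2*g2*w / (w**2 + g2**2)

    w = np.linspace(0.01, 10.0, 500)                                      # arbitrary frequency units
    J_target = 2*0.5*0.8*w/(w**2 + 0.8**2) + 2*0.2*3.0*w/(w**2 + 3.0**2)  # made-up spectral density

    popt, _ = curve_fit(multi_debye, w, J_target, p0=[0.3, 1.0, 0.3, 2.0])
    print("fitted (lambda1, gamma1, lambda2, gamma2):", popt)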
Toward fidelity between specification and implementation
NASA Technical Reports Server (NTRS)
Callahan, John R.; Montgomery, Todd L.; Morrison, Jeff; Wu, Yunqing
1994-01-01
This paper describes the methods used to specify and implement a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally by two complementary teams using a combination of formal and informal techniques in an attempt to ensure the correctness of the protocol implementation. The first team, called the Design team, initially specified protocol requirements using a variant of SCR requirements tables and implemented a prototype solution. The second team, called the V&V team, developed a state model based on the requirements tables and derived test cases from these tables to exercise the implementation. In a series of iterative steps, the Design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation through testing. Test cases derived from state transition paths in the formal model formed the dialogue between teams during development and served as the vehicles for keeping the model and implementation in fidelity with each other. This paper describes our experiences in developing our process model, details of our approach, and some example problems found during the development of RMP.
Verification and validation of a reliable multicast protocol
NASA Technical Reports Server (NTRS)
Callahan, John R.; Montgomery, Todd L.
1995-01-01
This paper describes the methods used to specify and implement a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally by two complementary teams using a combination of formal and informal techniques in an attempt to ensure the correctness of the protocol implementation. The first team, called the Design team, initially specified protocol requirements using a variant of SCR requirements tables and implemented a prototype solution. The second team, called the V&V team, developed a state model based on the requirements tables and derived test cases from these tables to exercise the implementation. In a series of iterative steps, the Design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation through testing. Test cases derived from state transition paths in the formal model formed the dialogue between teams during development and served as the vehicles for keeping the model and implementation in fidelity with each other. This paper describes our experiences in developing our process model, details of our approach, and some example problems found during the development of RMP.
Tetzlaff, Christian; Kolodziejski, Christoph; Timme, Marc; Wörgötter, Florentin
2011-01-01
Synaptic scaling is a slow process that modifies synapses, keeping the firing rate of neural circuits in specific regimes. Together with other processes, such as conventional synaptic plasticity in the form of long term depression and potentiation, synaptic scaling changes the synaptic patterns in a network, ensuring diverse, functionally relevant, stable, and input-dependent connectivity. How synaptic patterns are generated and stabilized, however, is largely unknown. Here we formally describe and analyze synaptic scaling based on results from experimental studies and demonstrate that the combination of different conventional plasticity mechanisms and synaptic scaling provides a powerful general framework for regulating network connectivity. In addition, we design several simple models that reproduce experimentally observed synaptic distributions as well as the observed synaptic modifications during sustained activity changes. These models predict that the combination of plasticity with scaling generates globally stable, input-controlled synaptic patterns, also in recurrent networks. Thus, in combination with other forms of plasticity, synaptic scaling can robustly yield neuronal circuits with high synaptic diversity, which potentially enables robust dynamic storage of complex activation patterns. This mechanism is even more pronounced when considering networks with a realistic degree of inhibition. Synaptic scaling combined with plasticity could thus be the basis for learning structured behavior even in initially random networks. PMID:22203799
Tempo: A Toolkit for the Timed Input/Output Automata Formalism
2008-01-30
Tempo is a toolkit for the Timed Input/Output Automata formalism, supporting the specification, simulation and analysis of distributed systems, and the generation of distributed code from specifications. Many distributed systems involve a combination of timing constraints and nondeterminism; the Tempo simulator addresses this by putting the modeler in charge of resolving the nondeterminism, and it can be required to check assertions after every single step.
Adolescent thinking à la Piaget: The formal stage.
Dulit, E
1972-12-01
Two of the formal-stage experiments of Piaget and Inhelder, selected largely for their closeness to the concepts defining the stage, were replicated with groups of average and gifted adolescents. This report describes the relevant Piagetian concepts (formal stage, concrete stage) in context, gives the methods and findings of this study, and concludes with a section discussing implications and making some reformulations which generally support but significantly qualify some of the central themes of the Piaget-Inhelder work. Fully developed formal-stage thinking emerges as far from commonplace among normal or average adolescents (by marked contrast with the impression created by the Piaget-Inhelder text, which chooses to report no middle or older adolescents who function at less than fully formal levels). In this respect, the formal stage differs appreciably from the earlier Piagetian stages, and early adolescence emerges as the age for which a "single path" model of cognitive development becomes seriously inadequate and a more complex model becomes essential. Formal-stage thinking seems best conceptualized, like most other aspects of psychological maturity, as a potentiality only partially attained by most and fully attained only by some.
δM formalism and anisotropic chaotic inflation power spectrum
NASA Astrophysics Data System (ADS)
Talebian-Ashkezari, A.; Ahmadi, N.
2018-05-01
A new analytical approach to linear perturbations in anisotropic inflation has been introduced in [A. Talebian-Ashkezari, N. Ahmadi and A.A. Abolhasani, JCAP 03 (2018) 001] under the name of δM formalism. In this paper we apply the mentioned approach to a model of anisotropic inflation driven by a scalar field, coupled to the kinetic term of a vector field with a U(1) symmetry. The δM formalism provides an efficient way of computing tensor-tensor, tensor-scalar as well as scalar-scalar 2-point correlations that are needed for the analysis of the observational features of an anisotropic model on the CMB. A comparison between δM results and the tedious calculations using in-in formalism shows the aptitude of the δM formalism in calculating accurate two point correlation functions between physical modes of the system.
NASA Astrophysics Data System (ADS)
Bommier, Véronique
2017-11-01
Context. In previous papers of this series, we presented a formalism able to account for both statistical equilibrium of a multilevel atom and coherent and incoherent scatterings (partial redistribution). Aims: This paper provides theoretical expressions of the redistribution function for the two-term atom. This redistribution function includes both coherent (RII) and incoherent (RIII) scattering contributions with their branching ratios. Methods: The expressions were derived by applying the formalism outlined above. The statistical equilibrium equation for the atomic density matrix is first formally solved in the case of the two-term atom with unpolarized and infinitely sharp lower levels. Then the redistribution function is derived by substituting this solution for the expression of the emissivity. Results: Expressions are provided for both magnetic and non-magnetic cases. Atomic fine structure is taken into account. Expressions are also separately provided under zero and non-zero hyperfine structure. Conclusions: Redistribution functions are widely used in radiative transfer codes. In our formulation, collisional transitions between Zeeman sublevels within an atomic level (depolarizing collisions effect) are taken into account when possible (i.e., in the non-magnetic case). However, the need for a formal solution of the statistical equilibrium as a preliminary step prevents us from taking into account collisional transfers between the levels of the upper term. Accounting for these collisional transfers could be done via a numerical solution of the statistical equilibrium equation system.
Azar, R Julian; Horn, Paul Richard; Sundstrom, Eric Jon; Head-Gordon, Martin
2013-02-28
The problem of describing the energy-lowering associated with polarization of interacting molecules is considered in the overlapping regime for self-consistent field wavefunctions. The existing approach of solving for absolutely localized molecular orbital (ALMO) coefficients that are block-diagonal in the fragments is shown based on formal grounds and practical calculations to often overestimate the strength of polarization effects. A new approach using a minimal basis of polarized orthogonal local MOs (polMOs) is developed as an alternative. The polMO basis is minimal in the sense that one polarization function is provided for each unpolarized orbital that is occupied; such an approach is exact in second-order perturbation theory. Based on formal grounds and practical calculations, the polMO approach is shown to underestimate the strength of polarization effects. In contrast to the ALMO method, however, the polMO approach yields results that are very stable to improvements in the underlying AO basis expansion. Combining the ALMO and polMO approaches allows an estimate of the range of energy-lowering due to polarization. Extensive numerical calculations on the water dimer using a large range of basis sets with Hartree-Fock theory and a variety of different density functionals illustrate the key considerations. Results are also presented for the polarization-dominated Na(+)CH4 complex. Implications for energy decomposition analysis of intermolecular interactions are discussed.
A Mathematical Account of the NEGF Formalism
NASA Astrophysics Data System (ADS)
Cornean, Horia D.; Moldoveanu, Valeriu; Pillet, Claude-Alain
2018-02-01
The main goal of this paper is to put on solid mathematical grounds the so-called Non-Equilibrium Green's Function (NEGF) transport formalism for open systems. In particular, we derive the Jauho-Meir-Wingreen formula for the time-dependent current through an interacting sample coupled to non-interacting leads. Our proof is non-perturbative and uses neither complex-time Keldysh contours, nor Langreth rules of 'analytic continuation'. We also discuss other technical identities (Langreth, Keldysh) involving various many body Green's functions. Finally, we study the Dyson equation for the advanced/retarded interacting Green's function and we rigorously construct its (irreducible) self-energy, using the theory of Volterra operators.
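For orientation, the steady-state special case of the current formula that the paper derives rigorously is usually quoted, in standard NEGF notation and up to spin-degeneracy and convention-dependent factors, as

    I_L = \frac{ie}{2h} \int d\varepsilon \,
          \mathrm{Tr}\Big\{ \big[ f_L(\varepsilon)\,\Gamma^L(\varepsilon) - f_R(\varepsilon)\,\Gamma^R(\varepsilon) \big]
          \big( G^r(\varepsilon) - G^a(\varepsilon) \big)
          + \big( \Gamma^L(\varepsilon) - \Gamma^R(\varepsilon) \big)\, G^<(\varepsilon) \Big\},

which, for a non-interacting sample, reduces to the Landauer form

    I_L = \frac{e}{h} \int d\varepsilon \, \big[ f_L(\varepsilon) - f_R(\varepsilon) \big] \,
          \mathrm{Tr}\big\{ \Gamma^L G^r \Gamma^R G^a \big\}.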
Extension of the Kohn-Sham formulation of density functional theory to finite temperature
Gonis, A.; Dane, M.
2017-12-20
Based on Mermin's extension of the Hohenberg and Kohn theorems to non-zero temperature, the Kohn-Sham formulation of density functional theory (KS-DFT) is generalized to finite temperature. Here, we show that present formulations are inconsistent with Mermin's functional, containing expressions, in particular describing the Coulomb energy, that defy derivation and are even in violation of rules of logical inference. Moreover, current methodology is in violation of fundamental laws of both quantum and classical mechanics. Based on this feature, we demonstrate the impossibility of extending the KS formalism to finite temperature through the self-consistent solutions of the single-particle Schrödinger equation at T>0. Guided by the form of Mermin's functional, which depends on the eigenstates of a Hamiltonian determined at T>0, we base our extension of KS-DFT on the determination of the excited states of a non-interacting system at the zero of temperature. The resulting formulation is consistent with that of Mermin, constructing the free energy at T>0 in terms of the excited states of a non-interacting Hamiltonian (system) that, within the KS formalism, are described by Slater determinants. To determine the excited states at T=0, use is made of the extension of the Hohenberg and Kohn theorems to excited states presented in previous work, applied here to a non-interacting collection of replicas of a non-interacting N-particle system whose ground state density is taken to match that of K non-interacting replicas of an interacting N-particle system at T>0. The formalism allows for an ever denser population of the excitation spectrum of a Hamiltonian within the KS approximation. The form of the auxiliary (Kohn-Sham) potential is formally identical to that in the ground state formalism, with the contribution of the Coulomb energy provided by the derivative of the Coulomb energy in all excited states taken into account. Once the excited states are determined, the minimum of the free energy within the KS formalism follows immediately in the form of Mermin's functional, but with the exact excited states in that functional represented by Slater determinants obtained through self-consistency conditions at the zero of temperature. Lastly, it is emphasized that, in departure from all existing formulations, no self-consistency conditions are implemented at finite T; as we show, such formulations are in fact rigorously blocked.
The Formal Pragmatics of Non-at-Issue Intensification in English and Japanese
ERIC Educational Resources Information Center
Taniguchi, Ai
2017-01-01
This dissertation concerns the formal pragmatics of constructions in English and Japanese that are perceptively intensificative in their discourse function in some way. In particular I examine polarity emphasis (verum focus), exclamatives, and acts of notification and surprise in language using a compositional version of Farkas and Bruce (2010)'s…
Defining role models for staff orientation.
Kinley, H
This article examines the need for a formal role model to help integrate new staff within a unit. While acknowledging the range of titles and functions ascribed to such a role in the literature, the author suggests that the essence of the role and its formal recognition has benefits for experienced staff and orientees alike.
The mathematical bases for qualitative reasoning
NASA Technical Reports Server (NTRS)
Kalagnanam, Jayant; Simon, Herbert A.; Iwasaki, Yumi
1991-01-01
The practices of researchers in many fields who use qualitative reasoning are summarized and explained. The goal is to gain an understanding of the formal assumptions and mechanisms that underlie this kind of analysis. The explanations given are based on standard mathematical formalisms, particularly on ordinal properties, continuous differentiable functions, and the mathematics of nonlinear dynamic systems.
Connecting Formal and Informal Learning Experiences
ERIC Educational Resources Information Center
O'Mahony, Timothy Kieran
2010-01-01
The learning study reports on part of a larger project being lead by the author. In this dissertation I explore one goal of this project--to understand effects on student learning outcomes as a function of using different methods for connecting out-of-school experiential learning with formal school-based instruction. There is a long history of…
Formal Operations, the Imaginary Audience and the Personal Fable.
ERIC Educational Resources Information Center
Hudson, Lynne M.; Gray, William M.
1986-01-01
Administered the Adolescent Egocentrism Scale (AES) to middle and high school students (N=129). Found partial support for Inhelder and Piaget's and Elkind's views that adolescent egocentrism is a function of beginning formal operations. Discusses the difficulty of assessing the true thoughts/feelings of persons who are worried how they will appear…
Learning Goal Orientation, Formal Mentoring, and Leadership Competence in HRD: A Conceptual Model
ERIC Educational Resources Information Center
Kim, Sooyoung
2007-01-01
Purpose: The purpose of this paper is to suggest a conceptual model of formal mentoring as a leadership development initiative including "learning goal orientation", "mentoring functions", and "leadership competencies" as key constructs of the model. Design/methodology/approach: Some empirical studies, though there are not many, will provide…
40 CFR 262.200 - Definitions for this subpart.
Code of Federal Regulations, 2012 CFR
2012-07-01
... research as its primary function and files as a non-profit organization under the tax code of 26 U.S.C. 501... college or university, or a non-profit research institute that is owned by or has a formal written... written affiliation agreement with a college or university. Formal written affiliation agreement for a non...
40 CFR 262.200 - Definitions for this subpart.
Code of Federal Regulations, 2013 CFR
2013-07-01
... research as its primary function and files as a non-profit organization under the tax code of 26 U.S.C. 501... college or university, or a non-profit research institute that is owned by or has a formal written... written affiliation agreement with a college or university. Formal written affiliation agreement for a non...
40 CFR 262.200 - Definitions for this subpart.
Code of Federal Regulations, 2011 CFR
2011-07-01
... research as its primary function and files as a non-profit organization under the tax code of 26 U.S.C. 501... college or university, or a non-profit research institute that is owned by or has a formal written... written affiliation agreement with a college or university. Formal written affiliation agreement for a non...
40 CFR 262.200 - Definitions for this subpart.
Code of Federal Regulations, 2014 CFR
2014-07-01
... research as its primary function and files as a non-profit organization under the tax code of 26 U.S.C. 501... college or university, or a non-profit research institute that is owned by or has a formal written... written affiliation agreement with a college or university. Formal written affiliation agreement for a non...
The Acquisition of Korean Plural Marking by Native English Speakers
ERIC Educational Resources Information Center
Hwang, Sun Hee
2013-01-01
This study investigated the L2 acquisition of Korean plural marking by English-speaking learners within a feature-reassembly approach--a formal feature-based approach suggesting that native-like attainment of L2 morphosyntactic knowledge is determined by whether learners can reconfigure the formal features assembled in functional categories and…
Mechanically verified hardware implementing an 8-bit parallel IO Byzantine agreement processor
NASA Technical Reports Server (NTRS)
Moore, J. Strother
1992-01-01
Consider a network of four processors that use the Oral Messages (Byzantine Generals) Algorithm of Pease, Shostak, and Lamport to achieve agreement in the presence of faults. Bevier and Young have published a functional description of a single processor that, when interconnected appropriately with three identical others, implements this network under the assumption that the four processors step in synchrony. By formalizing the original work of Pease et al., Bevier and Young mechanically proved that such a network achieves fault tolerance. We develop, formalize, and discuss a hardware design that has been mechanically proven to implement their processor. In particular, we formally define mapping functions from the abstract state space of the Bevier-Young processor to a concrete state space of a hardware module and state a theorem that expresses the claim that the hardware correctly implements the processor. We briefly discuss the Brock-Hunt Formal Hardware Description Language which permits designs both to be proved correct with the Boyer-Moore theorem prover and to be expressed in a commercially supported hardware description language for additional electrical analysis and layout. We briefly describe our implementation.
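For readers unfamiliar with the underlying algorithm, the classical Oral Messages recursion OM(m) for four processors tolerating one fault can be sketched in a few lines. This is the textbook Pease-Shostak-Lamport algorithm with a simulated traitor, not the Bevier-Young processor or the verified hardware design itself.

    from collections import Counter

    def send(sender, value, receiver, faulty):
        """A faulty sender may transmit an arbitrary (here receiver-dependent) value."""
        return (value + receiver) % 2 if sender in faulty else value

    def om(m, commander, lieutenants, value, faulty):
        """Return the value each lieutenant decides on after OM(m)."""
        direct = {l: send(commander, value, l, faulty) for l in lieutenants}
        if m == 0:
            return direct
        # each lieutenant relays what it received, acting as commander of OM(m-1)
        relayed = {j: om(m - 1, j, [p for p in lieutenants if p != j], direct[j], faulty)
                   for j in lieutenants}
        decided = {}
        for i in lieutenants:
            votes = [direct[i]] + [relayed[j][i] for j in lieutenants if j != i]
            decided[i] = Counter(votes).most_common(1)[0][0]      # majority vote
        return decided

    # commander 0 sends value 1 to lieutenants 1-3; processor 2 is Byzantine
    print(om(1, 0, [1, 2, 3], 1, faulty={2}))   # loyal lieutenants 1 and 3 agree on 1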
NASA Astrophysics Data System (ADS)
Angulo, Raul E.; Hilbert, Stefan
2015-03-01
We explore the cosmological constraints from cosmic shear using a new way of modelling the non-linear matter correlation functions. The new formalism extends the method of Angulo & White, which manipulates outputs of N-body simulations to represent the 3D non-linear mass distribution in different cosmological scenarios. We show that predictions from our approach for shear two-point correlations at 1-300 arcmin separations are accurate at the ~10 per cent level, even for extreme changes in cosmology. For moderate changes, with target cosmologies similar to that preferred by analyses of recent Planck data, the accuracy is close to ~5 per cent. We combine this approach with a Monte Carlo Markov chain sampler to explore constraints on a Λ cold dark matter model from the shear correlation functions measured in the Canada-France-Hawaii Telescope Lensing Survey (CFHTLenS). We obtain constraints on the parameter combination σ8(Ωm/0.27)^0.6 = 0.801 ± 0.028. Combined with results from cosmic microwave background data, we obtain marginalized constraints on σ8 = 0.81 ± 0.01 and Ωm = 0.29 ± 0.01. These results are statistically compatible with previous analyses, which supports the validity of our approach. We discuss the advantages of our method and the potential it offers, including a path to model in detail (i) the effects of baryons, (ii) high-order shear correlation functions, and (iii) galaxy-galaxy lensing, among others, in future high-precision cosmological analyses.
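The headline constraint above is a derived parameter formed sample-by-sample from the MCMC chain. A toy sketch of that post-processing step, using fabricated placeholder samples rather than the actual CFHTLenS chain:

    import numpy as np

    rng = np.random.default_rng(1)
    # placeholder (fabricated) posterior samples; a real analysis would load the chain itself
    omega_m = rng.normal(0.29, 0.02, 100_000)
    sigma_8 = rng.normal(0.80, 0.03, 100_000)

    S = sigma_8 * (omega_m / 0.27) ** 0.6        # derived lensing-amplitude parameter
    print(S.mean(), S.std())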
NASA Astrophysics Data System (ADS)
Cui, S. T.
The stress-stress correlation function and the viscosity of a united-atom model of liquid decane are studied by equilibrium molecular dynamics simulation using two different formalisms for the stress tensor: the atomic and the molecular formalisms. The atomic and molecular correlation functions show dramatic difference in short-time behaviour. The integrals of the two correlation functions, however, become identical after a short transient period which is significantly shorter than the rotational relaxation time of the molecule. Both reach the same plateau value in a time period corresponding to this relaxation time. These results provide a convenient guide for the choice of the upper integral time limit in calculating the viscosity by the Green-Kubo formula.
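The Green-Kubo step referred to above integrates the stress-stress autocorrelation function up to its plateau. A minimal sketch in which the stress time series and all prefactor inputs are placeholders standing in for MD output:

    import numpy as np

    def green_kubo_viscosity(sigma_xy, dt, volume, kB_T):
        """eta = V/(kB T) * integral of <sigma_xy(0) sigma_xy(t)> dt, one off-diagonal component."""
        n = len(sigma_xy)
        s = sigma_xy - sigma_xy.mean()
        f = np.fft.rfft(s, 2 * n)                       # zero-padded FFT
        acf = np.fft.irfft(f * np.conj(f))[:n] / n      # stress autocorrelation function
        cutoff = n // 4                                 # integrate only up to the plateau region
        return volume / kB_T * np.trapz(acf[:cutoff], dx=dt)

    rng = np.random.default_rng(0)
    sigma_xy = rng.normal(0.0, 1.0e7, 20_000)           # placeholder stress trace, Pa
    print(green_kubo_viscosity(sigma_xy, dt=1.0e-15, volume=1.0e-26, kB_T=4.1e-21), "Pa s")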
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mendes, Albert C.R., E-mail: albert@fisica.ufjf.br; Takakura, Flavio I., E-mail: takakura@fisica.ufjf.br; Abreu, Everton M.C., E-mail: evertonabreu@ufrrj.br
In this work we have obtained a higher-derivative Lagrangian for a charged fluid coupled with the electromagnetic field, and the Dirac constraint analysis was carried out. A set of first-class constraints fixed by a noncovariant gauge condition was obtained. The path integral formalism was used to obtain the partition function for the corresponding higher-derivative Hamiltonian, and the Faddeev–Popov ansatz was used to construct an effective Lagrangian. Through the partition function, a Stefan–Boltzmann type law was obtained. - Highlights: • Higher-derivative Lagrangian for a charged fluid. • Electromagnetic coupling and Dirac's constraint analysis. • Partition function through path integral formalism. • Stefan–Boltzmann-type law through the partition function.
The Leadership Practices of the Dean of Combined Arms Academy
ERIC Educational Resources Information Center
Tafere, Matebe
2014-01-01
This study examines the leadership practices of the dean of the Combined Arms Academy. The research uses a qualitative design. The academic staff members were the participants of the study. Formal and informal conversational interview approaches, personal observation and document analysis were the instruments of the study. Thematic analysis was used for…
Understanding the Impact of Exposure Patterns on Risks from Combined Exposures to Multiple Chemicals
The talk was invited so there is no formal abstract. However, the focus of the talk is on the use of exposure information in the evaluation of risks from combined exposures to chemicals. The talk presents a bit of history and several case studies. All empirical data presented hav...
Formal analysis of imprecise system requirements with Event-B.
Le, Hong Anh; Nakajima, Shin; Truong, Ninh Thuan
2016-01-01
Formal analysis of the functional properties of system requirements needs precise descriptions. However, stakeholders sometimes describe the system with ambiguous, vague or fuzzy terms, hence formal frameworks for modeling and verifying such requirements are desirable. Fuzzy If-Then rules have been used to represent imprecise requirements, but verifying their functional properties still needs new methods. In this paper, we propose a refinement-based modeling approach for the specification and verification of such requirements. First, we introduce a representation of imprecise requirements in set theory. Then we make use of Event-B refinement, providing a set of translation rules from Fuzzy If-Then rules to Event-B notations. After that, we show how to verify both safety and eventuality properties with RODIN/Event-B. Finally, we illustrate the proposed method on the example of a Crane Controller.
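As a toy illustration of reading a fuzzy If-Then rule set-theoretically, so that it can act as the guard of an event, here is a hypothetical rule with a made-up membership function and alpha-cut threshold; the paper's actual Event-B translation rules are not reproduced here.

    def mu_high_temperature(t_celsius):
        """Piecewise-linear membership degree of 'temperature is high' (assumed shape)."""
        if t_celsius <= 30.0:
            return 0.0
        if t_celsius >= 50.0:
            return 1.0
        return (t_celsius - 30.0) / 20.0

    def rule_fires(t_celsius, alpha=0.7):
        """IF temperature is high THEN open_valve, read as an alpha-cut guard on the event."""
        return mu_high_temperature(t_celsius) >= alpha

    for t in (25.0, 40.0, 48.0):
        print(t, rule_fires(t))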
Accurate Energies and Orbital Description in Semi-Local Kohn-Sham DFT
NASA Astrophysics Data System (ADS)
Lindmaa, Alexander; Kuemmel, Stephan; Armiento, Rickard
2015-03-01
We present our progress on a scheme in semi-local Kohn-Sham density-functional theory (KS-DFT) for improving the orbital description while still retaining the level of accuracy of the usual semi-local exchange-correlation (xc) functionals. DFT is a widely used tool for first-principles calculations of properties of materials. A given task normally requires a balance of accuracy and computational cost, which is well achieved with semi-local DFT. However, commonly used semi-local xc functionals have important shortcomings which often can be attributed to features of the corresponding xc potential. One shortcoming is an overly delocalized representation of localized orbitals. Recently a semi-local GGA-type xc functional was constructed to address these issues; however, it has the trade-off of lower accuracy of the total energy. We discuss the source of this error in terms of a surplus energy contribution in the functional that needs to be accounted for, and offer a remedy for this issue which formally stays within KS-DFT and which does not significantly increase the computational effort. The end result is a scheme that combines accurate total energies (e.g., relaxed geometries) with an improved orbital description (e.g., improved band structure).
Identifying enhanced cortico-basal ganglia loops associated with prolonged dance training
Li, Gujing; He, Hui; Huang, Mengting; Zhang, Xingxing; Lu, Jing; Lai, Yongxiu; Luo, Cheng; Yao, Dezhong
2015-01-01
Studies have revealed that prolonged, specialized training combined with higher cognitive conditioning induces enhanced brain alteration. In particular, dancers with long-term dance experience exhibit superior motor control and integration with their sensorimotor networks. However, little is known about the functional connectivity patterns of spontaneous intrinsic activities in the sensorimotor network of dancers. Our study examined the functional connectivity density (FCD) of dancers with a mean period of over 10 years of dance training, in contrast with a matched non-dancer group without formal dance training, using resting-state fMRI scans. FCD was mapped and analyzed, and functional connectivity (FC) analyses were then performed based on the difference in FCD. Compared to the non-dancers, the dancers exhibited significantly increased FCD in the precentral gyri, postcentral gyri and bilateral putamen. Furthermore, the results of the FC analysis revealed enhanced connections between the middle cingulate cortex and the bilateral putamen and between the precentral and the postcentral gyri. All findings indicated an enhanced functional integration in the cortico-basal ganglia loops that govern motor control and integration in dancers. These findings might reflect improved sensorimotor function for the dancers consequent to long-term dance training. PMID:26035693
From non-trivial geometries to power spectra and vice versa
NASA Astrophysics Data System (ADS)
Brooker, D. J.; Tsamis, N. C.; Woodard, R. P.
2018-04-01
We review a recent formalism which derives the functional forms of the primordial tensor and scalar power spectra of scalar potential inflationary models. The formalism incorporates the case of geometries with a non-constant first slow-roll parameter. Analytic expressions for the power spectra are given that explicitly display the dependence on the geometric properties of the background. Moreover, we present the full algorithm for using our formalism to reconstruct the model from the observed power spectra. Our techniques are applied to models possessing "features" in their potential, with excellent agreement.
Towards the Formal Verification of a Distributed Real-Time Automotive System
NASA Technical Reports Server (NTRS)
Endres, Erik; Mueller, Christian; Shadrin, Andrey; Tverdyshev, Sergey
2010-01-01
We present the status of a project which aims at building, formally and pervasively verifying a distributed automotive system. The target system is a gate-level model which consists of several interconnected electronic control units with independent clocks. This model is verified against the specification as seen by a system programmer. The automotive system is implemented on several FPGA boards. The pervasive verification is carried out using a combination of interactive theorem proving (Isabelle/HOL) and model checking (LTL).
Solving the three-body Coulomb breakup problem using exterior complex scaling
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCurdy, C.W.; Baertschy, M.; Rescigno, T.N.
2004-05-17
Electron-impact ionization of the hydrogen atom is the prototypical three-body Coulomb breakup problem in quantum mechanics. The combination of subtle correlation effects and the difficult boundary conditions required to describe two electrons in the continuum have made this one of the outstanding challenges of atomic physics. A complete solution of this problem in the form of a "reduction to computation" of all aspects of the physics is given by the application of exterior complex scaling, a modern variant of the mathematical tool of analytic continuation of the electronic coordinates into the complex plane that was used historically to establish the formal analytic properties of the scattering matrix. This review first discusses the essential difficulties of the three-body Coulomb breakup problem in quantum mechanics. It then describes the formal basis of exterior complex scaling of electronic coordinates as well as the details of its numerical implementation using a variety of methods including finite difference, finite elements, discrete variable representations, and B-splines. Given these numerical implementations of exterior complex scaling, the scattering wave function can be generated with arbitrary accuracy on any finite volume in the space of electronic coordinates, but there remains the fundamental problem of extracting the breakup amplitudes from it. Methods are described for evaluating these amplitudes. The question of the volume-dependent overall phase that appears in the formal theory of ionization is resolved. A summary is presented of accurate results that have been obtained for the case of electron-impact ionization of hydrogen as well as a discussion of applications to the double photoionization of helium.
NASA Astrophysics Data System (ADS)
Li, Bin
Spatial control behaviors account for a large proportion of human everyday activities, from normal daily tasks, such as reaching for objects, to specialized tasks, such as driving, surgery, or operating equipment. These behaviors involve intensive interactions within internal processes (i.e. cognitive, perceptual, and motor control) and with the physical world. This dissertation builds on a concept of interaction pattern and a hierarchical functional model. An interaction pattern represents a type of behavioral synergy through which humans coordinate cognitive, perceptual, and motor control processes. It contributes to the construction of the hierarchical functional model that delineates human spatial control behaviors as the coordination of three functional subsystems: planning, guidance, and tracking/pursuit. This dissertation formalizes and validates these two theories and extends them for the investigation of human spatial control skills, encompassing development and assessment. Specifically, this dissertation first presents an overview of studies in human spatial control skills, encompassing definition, characteristics, development, and assessment, to provide theoretical evidence for the concept of interaction pattern and the hierarchical functional model. Next, the human experiments for collecting motion and gaze data, and the techniques used to register and classify gaze data, are described. The dissertation then elaborates and mathematically formalizes the hierarchical functional model and the concept of interaction pattern. These theories then enable the construction of a succinct simulation model that can reproduce a variety of human performance with a minimal set of hypotheses. This validates the hierarchical functional model as a normative framework for interpreting human spatial control behaviors. The dissertation then investigates human skill development and captures the emergence of interaction patterns. The final part of the dissertation applies the hierarchical functional model to skill assessment and introduces techniques to capture interaction patterns both from the top down, using their geometric features, and from the bottom up, using their dynamical characteristics. The validity and generality of the skill assessment are illustrated using two experiments: remote-control flight and laparoscopic surgical training.
Verification of Triple Modular Redundancy (TMR) Insertion for Reliable and Trusted Systems
NASA Technical Reports Server (NTRS)
Berg, Melanie; LaBel, Kenneth A.
2016-01-01
We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems. If a system is expected to be protected using TMR, improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. This manuscript addresses the challenge of confirming that TMR has been inserted without corruption of functionality and with correct application of the expected TMR topology. The proposed verification method combines the usage of existing formal analysis tools with a novel search-detect-and-verify tool. Keywords: field programmable gate array (FPGA), Triple Modular Redundancy (TMR), verification, trust, reliability.
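To make the TMR topology being verified concrete, a minimal majority-voter sketch is given below; it is a generic illustration of triple modular redundancy, not the manuscript's search-detect-and-verify tool, and the bit-level voter and example inputs are assumptions.

```python
def tmr_vote(a, b, c):
    """Bitwise 2-out-of-3 majority vote over three redundant copies."""
    return (a & b) | (b & c) | (a & c)

def tmr_run(module, inputs, fault=None):
    """Run three copies of `module`; `fault` optionally corrupts one copy's output."""
    outputs = [module(x) for x in (inputs, inputs, inputs)]
    if fault is not None:
        idx, mask = fault
        outputs[idx] ^= mask          # inject a single-copy upset
    return tmr_vote(*outputs)

adder = lambda x: (x + 3) & 0xFF      # hypothetical 8-bit module under protection
assert tmr_run(adder, 10) == 13
assert tmr_run(adder, 10, fault=(1, 0b100)) == 13   # a single upset is masked
print("majority voter masks single faults")
```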
Velocity-gauge real-time TDDFT within a numerical atomic orbital basis set
NASA Astrophysics Data System (ADS)
Pemmaraju, C. D.; Vila, F. D.; Kas, J. J.; Sato, S. A.; Rehr, J. J.; Yabana, K.; Prendergast, David
2018-05-01
The interaction of laser fields with solid-state systems can be modeled efficiently within the velocity-gauge formalism of real-time time-dependent density functional theory (RT-TDDFT). In this article, we discuss the implementation of the velocity-gauge RT-TDDFT equations for electron dynamics within a linear combination of atomic orbitals (LCAO) basis set framework. Numerical results obtained from our LCAO implementation, for the electronic response of periodic systems to both weak and intense laser fields, are compared to those obtained from established real-space grid and Full-Potential Linearized Augmented Planewave approaches. Potential applications of the LCAO-based scheme in the context of extreme ultraviolet and soft X-ray spectroscopies involving core-electronic excitations are discussed.
Microscopic pressure-cooker model for studying molecules in confinement
NASA Astrophysics Data System (ADS)
Santamaria, Ruben; Adamowicz, Ludwik; Rosas-Acevedo, Hortensia
2015-04-01
A model for a system of a finite number of molecules in confinement is presented and expressions for determining the temperature, pressure, and volume of the system are derived. The present model is a generalisation of the Zwanzig-Langevin model because it includes pressure effects in the system. It also has general validity, preserves the ergodic hypothesis, and provides a formal framework for previous studies of hydrogen clusters in confinement. The application of the model is illustrated by an investigation of a set of prebiotic compounds exposed to varying pressure and temperature. The simulations performed within the model involve the use of a combination of molecular dynamics and density functional theory methods implemented on a computer system with a mixed CPU-GPU architecture.
NASA Technical Reports Server (NTRS)
Pathak, P. H.; Kouyoumjian, R. G.
1974-01-01
The diffraction of a TM_0 surface wave by a terminated dielectric slab which is flush mounted in a perfectly conducting surface is studied. The incident surface wave gives rise to waves reflected and diffracted by the termination; these reflected and diffracted fields may be expressed in terms of the geometrical theory of diffraction by introducing surface wave reflection and diffraction coefficients which are associated with the termination. In this investigation, the surface wave reflection and diffraction coefficients have been deduced from a formally exact solution to this canonical problem. The solution is obtained by a combination of the generalized scattering matrix technique and function theoretic methods.
A lattice calculation of the hadronic vacuum polarization contribution to (g - 2)_µ
NASA Astrophysics Data System (ADS)
Della Morte, M.; Francis, A.; Gérardin, A.; Gülpers, V.; Herdoíza, G.; von Hippel, G.; Horch, H.; Jäger, B.; Meyer, H. B.; Nyffeler, A.; Wittig, H.
2018-03-01
We present results of calculations of the hadronic vacuum polarisation contribution to the muon anomalous magnetic moment. Specifically, we focus on controlling the infrared regime of the vacuum polarisation function. Our results are corrected for finite-size effects by combining the Gounaris-Sakurai parameterisation of the timelike pion form factor with the Lüscher formalism. The impact of quark-disconnected diagrams and the precision of the scale determination is discussed and included in our final result in two-flavour QCD, which carries an overall uncertainty of 6%. We present preliminary results computed on ensembles with Nf = 2 + 1 dynamical flavours and discuss how the long-distance contribution can be accurately constrained by a dedicated spectrum calculation in the iso-vector channel.
An algebraic interpretation of PSP composition.
Vaucher, G
1998-01-01
The introduction of time in artificial neurons is a delicate problem on which many groups are working. Our approach combines some properties of biological models and the algebraic properties of McCulloch and Pitts artificial neuron (AN) (McCulloch and Pitts, 1943) to produce a new model which links both characteristics. In this extended artificial neuron, postsynaptic potentials (PSPs) are considered as numerical elements, having two degrees of freedom, on which the neuron computes operations. Modelled in this manner, a group of neurons can be seen as a computer with an asynchronous architecture. To formalize the functioning of this computer, we propose an algebra of impulses. This approach might also be interesting in the modelling of the passive electrical properties in some biological neurons.
Robust functional statistics applied to Probability Density Function shape screening of sEMG data.
Boudaoud, S; Rix, H; Al Harrach, M; Marin, F
2014-01-01
Recent studies pointed out possible shape modifications of the Probability Density Function (PDF) of surface electromyographical (sEMG) data according to several contexts like fatigue and muscle force increase. Following this idea, criteria have been proposed to monitor these shape modifications, mainly using High Order Statistics (HOS) parameters like skewness and kurtosis. In experimental conditions, these parameters are confronted with small sample sizes in the estimation process. This small sample size induces errors in the estimated HOS parameters, limiting real-time and precise sEMG PDF shape monitoring. Recently, a functional formalism, the Core Shape Model (CSM), has been used to analyse shape modifications of PDF curves. In this work, taking inspiration from the CSM method, robust functional statistics are proposed to emulate both skewness and kurtosis behaviors. These functional statistics combine both kernel density estimation and PDF shape distances to evaluate shape modifications even in the presence of small sample sizes. Then, the proposed statistics are tested, using Monte Carlo simulations, on both normal and Log-normal PDFs that mimic observed sEMG PDF shape behavior during muscle contraction. According to the obtained results, the functional statistics seem to be more robust than HOS parameters to the small-sample-size effect and more accurate in sEMG PDF shape screening applications.
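As a schematic sketch only (not the authors' CSM-based statistics), one can combine a kernel density estimate with numerically integrated moments to obtain smoothed skewness- and kurtosis-like shape descriptors from a small sample; the sample sizes and distributions below are assumptions.

```python
import numpy as np
from scipy.stats import gaussian_kde

def kde_shape_stats(sample, grid_size=2048):
    """Skewness- and kurtosis-like descriptors computed from a Gaussian KDE."""
    kde = gaussian_kde(sample)
    lo, hi = sample.min() - 3 * sample.std(), sample.max() + 3 * sample.std()
    x = np.linspace(lo, hi, grid_size)
    pdf = kde(x)
    pdf /= np.trapz(pdf, x)                      # renormalize on the grid
    mean = np.trapz(x * pdf, x)
    var = np.trapz((x - mean) ** 2 * pdf, x)
    skew = np.trapz((x - mean) ** 3 * pdf, x) / var ** 1.5
    kurt = np.trapz((x - mean) ** 4 * pdf, x) / var ** 2
    return skew, kurt

rng = np.random.default_rng(1)
small_normal = rng.normal(size=100)               # mimics a short recording epoch
small_lognorm = rng.lognormal(sigma=0.5, size=100)
print(kde_shape_stats(small_normal), kde_shape_stats(small_lognorm))
```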
The hadronic vacuum polarization contribution to the muon g - 2 from lattice QCD
NASA Astrophysics Data System (ADS)
Morte, M. Della; Francis, A.; Gülpers, V.; Herdoíza, G.; von Hippel, G.; Horch, H.; Jäger, B.; Meyer, H. B.; Nyffeler, A.; Wittig, H.
2017-10-01
We present a calculation of the hadronic vacuum polarization contribution to the muon anomalous magnetic moment, a_μ^{hvp}, in lattice QCD employing dynamical up and down quarks. We focus on controlling the infrared regime of the vacuum polarization function. To this end we employ several complementary approaches, including Padé fits, time moments and the time-momentum representation. We correct our results for finite-volume effects by combining the Gounaris-Sakurai parameterization of the timelike pion form factor with the Lüscher formalism. On a subset of our ensembles we have derived an upper bound on the magnitude of quark-disconnected diagrams and found that they decrease the estimate for a_μ^{hvp} by at most 2%. Our final result is a_μ^{hvp} = (654 ± 32 ^{+21}_{-23}) · 10^{-10}, where the first error is statistical and the second denotes the combined systematic uncertainty. Based on our findings we discuss the prospects for determining a_μ^{hvp} with sub-percent precision.
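For context, the time-momentum representation mentioned above is commonly written via the Bernecker-Meyer master formula; the sketch below states its schematic form and is not quoted from the paper itself.

```latex
% Schematic time-momentum representation of the leading-order HVP contribution:
% G(t) is the spatially summed, zero-momentum vector-vector correlator and
% \tilde{K}(t) a known QED kernel.
a_\mu^{\mathrm{hvp}} \;=\; \left(\frac{\alpha}{\pi}\right)^{2}
\int_{0}^{\infty} \mathrm{d}t \; G(t)\, \tilde{K}(t)
```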
How Students Combine Resources to Make Conceptual Breakthroughs
NASA Astrophysics Data System (ADS)
Richards, A. J.; Jones, Darrick C.; Etkina, Eugenia
2018-04-01
We use the framework of cognitive resources to investigate how students construct understanding of a complex physics topic, namely, a photovoltaic cell. By observing students as they learn about how a solar cell functions, we identified over 60 distinct resources that learners may activate while thinking about photovoltaic cells. We classify these resources into three main types: phenomenological primitives, conceptual resources, and epistemological resources. Furthermore, we found a pattern that suggests that when students make conceptual breakthroughs they may be more likely to activate combinations of resources of different types in concert, especially if a resource from each of the three categories is used. This pattern suggests that physics instructors should encourage students to activate multiple types of prior knowledge during the learning process. This can result from instructors deliberately and explicitly connecting new knowledge to students' prior experience both in and outside the formal physics classroom, as well as allowing students to reflect metacognitively on how the new knowledge fits into their existing understanding of the natural world.
MatLab Script and Functional Programming
NASA Technical Reports Server (NTRS)
Shaykhian, Gholam Ali
2007-01-01
MatLab Script and Functional Programming: MatLab is one of the most widely used very high level programming languages for scientific and engineering computations. It is very user-friendly and requires practically no formal programming knowledge. Presented here are MatLab programming aspects, and not just MatLab commands, for scientists and engineers who do not have formal programming training and have no significant time to spare for learning programming to solve their real-world problems. Specifically provided are programs for visualization. The MatLab seminar covers the functional and script programming aspects of the MatLab language. Specific expectations are: a) Recognize MatLab commands, scripts and functions. b) Create and run a MatLab function. c) Read, recognize, and describe MatLab syntax. d) Recognize decisions, loops and matrix operators. e) Evaluate scope among multiple files, and multiple functions within a file. f) Declare, define and use scalar variables, vectors and matrices.
Comparative study of DFT+U functionals for non-collinear magnetism
NASA Astrophysics Data System (ADS)
Ryee, Siheon; Han, Myung Joon
2018-07-01
We performed a comparative analysis of DFT+U functionals to better understand their applicability to non-collinear magnetism. Taking LiNiPO4 and Sr2IrO4 as examples, we investigated the results of two formalisms based on charge-only density and spin-density functional plus U calculations. Our results show that the ground-state spin order in terms of tilting angle is strongly dependent on Hund's J. In particular, the opposite behavior of canting angles as a function of J is found for LiNiPO4. The dependence on the other physical parameters, such as Hubbard U and the Slater parameterization, is investigated. We also discuss the formal aspects of these functional dependences as well as the parameter dependences. The current study provides useful information and important intuition for first-principles calculations of non-collinear magnetic materials.
ERIC Educational Resources Information Center
Doronila, Maria Luisa C.
In the Philippines, introduction of a formal education system, new written language, and the knowledge encoded in it have been part of a colonization process and not the result of direct evolution from informal education. The discontinuities between formal and informal education--abstraction, systematization, and specialization--are greater and…
Liaison Roles in the Communication Structure of a Formal Organization: A Pilot Study.
ERIC Educational Resources Information Center
Schwartz, Donald F.
The purpose of this study was first to map the functional communication structure of a 142-member formal organization, then to analyze that structure to identify work groups (cliques) and interlinking liaison role persons, and finally to describe certain differences between liaison persons and nonliaison members of the work groups as perceived by…
ERIC Educational Resources Information Center
Koskela, Merja; Pilke, Nina
2016-01-01
This article explores how linguistic resources from two local languages, Finnish and Swedish, are used in expert presentations in bilingual formal meetings and how they function with respect to the three ideal criteria of professional communication: economy, efficiency, and precision. Based on the results, the article suggests a typology of…
Johnson, Benjamin K; Tierney, David M; Rosborough, Terry K; Harris, Kevin M; Newell, Marc C
2016-02-01
Although focused cardiac ultrasonographic (FoCUS) examination has been evaluated in emergency departments and intensive care units with good correlation to formal echocardiography, accuracy for the assessment of left ventricular systolic function (LVSF) when performed by internal medicine physicians still needs independent evaluation. This prospective observational study in a 640-bed, academic, quaternary care center, included 178 inpatients examined by 10 internal medicine physicians who had completed our internal medicine bedside ultrasound training program. The ability to estimate LVSF with FoCUS as "normal," "mild to moderately decreased," or "severely decreased" was compared with left ventricular ejection fraction (>50%, 31-49%, and <31%, respectively) from formal echocardiography interpreted by a cardiologist. Sensitivity and specificity of FoCUS for any degree of LVSF impairment were 0.91 (95% confidence interval [CI] 0.80, 0.97) and 0.88 (95% CI 0.81, 0.93), respectively. The interrater agreement between internal medicine physician-performed FoCUS and formal echocardiography for any LVSF impairment was "good/substantial" with κ = 0.77 (p < 0.001), 95% CI (0.67, 0.87). Formal echocardiography was classified as "technically limited due to patient factors" in 20% of patients; however, echogenicity was sufficient in 100% of FoCUS exams to classify LVSF. Internal medicine physicians using FoCUS identify normal versus decreased LVSF with high sensitivity, specificity, and "good/substantial" interrater agreement when compared with formal echocardiography. These results support the role of cardiac FoCUS by properly trained internal medicine physicians for discriminating normal from reduced LVSF. © 2015 Wiley Periodicals, Inc.
Theory of the dynamical thermal conductivity of metals
NASA Astrophysics Data System (ADS)
Bhalla, Pankaj; Kumar, Pradeep; Das, Nabyendu; Singh, Navinder
2016-09-01
Mori's projection method, known as the memory function method, is an important theoretical formalism for studying various transport coefficients. In the present work, we calculate the dynamical thermal conductivity of metals using the memory function formalism. We introduce thermal memory functions for the first time and discuss the behavior of the thermal conductivity both in the zero-frequency limit and at nonzero frequencies. We compare our results for the zero-frequency case with the results obtained by the Bloch-Boltzmann kinetic approach and find that both approaches agree with each other. Motivated by some recent experimental advancements, we obtain several new results for the ac or dynamical thermal conductivity.
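For orientation, the memory-function (generalized Drude) approach typically casts a dynamical transport coefficient in the schematic form below, with the dissipation encoded in a frequency-dependent memory function; this is a generic sketch, and the thermal memory functions introduced in the paper may be defined differently.

```latex
% Generalized Drude form for the dynamical electrical conductivity in the
% memory-function formalism (Gaussian units); a thermal memory function plays
% the analogous role for the dynamical thermal conductivity.
\sigma(\omega, T) \;=\; \frac{i}{4\pi}\,
\frac{\omega_{p}^{2}}{\omega + M(\omega, T)}
```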
Non-functional Avionics Requirements
NASA Astrophysics Data System (ADS)
Paulitsch, Michael; Ruess, Harald; Sorea, Maria
Embedded systems in aerospace become more and more integrated in order to reduce weight, volume/size, and power of hardware for more fuel-efficiency. Such integration tendencies change architectural approaches of system architectures, which subsequently change non-functional requirements for platforms. This paper provides some insight into the state-of-the-practice of non-functional requirements for developing ultra-critical embedded systems in the aerospace industry, including recent changes and trends. In particular, formal requirement capture and formal analysis of non-functional requirements of avionic systems - including hard real-time, fault-tolerance, reliability, and performance - are exemplified by means of recent developments in SAL and HiLiTE.
Bernardes, Sónia F; Matos, Marta; Goubert, Liesbet
2017-09-01
Chronic pain among older adults is common and often disabling. Pain-related formal social support (e.g., provided by staff at day-care centers, nursing homes), and the extent to which it promotes functional autonomy or dependence, plays a significant role in the promotion of older adults' ability to engage in their daily activities. Assessing older adults' preferences for pain-related social support for functional autonomy or dependence could contribute to increase formal social support responsiveness to individuals' needs. Therefore, this study aimed at developing and validating the preferences for formal social support of autonomy and dependence in pain inventory (PFSSADI). One hundred and sixty-five older adults with chronic musculoskeletal pain (M age = 79.1, 67.3% women), attending day-care centers, completed the PFSSADI, the revised formal social support for autonomy and dependence in pain inventory, and a measure of desire for (in)dependence; the PFSSADI was filled out again 6 weeks later. Confirmatory factor analyses showed a structure of two correlated factors (r = .56): (a) preferences for autonomy support (α = .99) and (b) preferences for dependence support (α = .98). The scale showed good test-retest reliability, sensitivity and discriminant and concurrent validity; the higher the preferences for dependence support, the higher the desire for dependence (r = .33) and the lower the desire for independence (r = -.41). The PFSSADI is an innovative tool, which may contribute to explore the role of pain-related social support responsiveness on the promotion of older adults' functional autonomy when in pain.
NASA Astrophysics Data System (ADS)
Hsieh, Chang-Yu; Cao, Jianshu
2018-01-01
We extend a standard stochastic theory to study open quantum systems coupled to a generic quantum environment. We exemplify the general framework by studying a two-level quantum system coupled bilinearly to the three fundamental classes of non-interacting particles: bosons, fermions, and spins. In this unified stochastic approach, the generalized stochastic Liouville equation (SLE) formally captures the exact quantum dissipations when noise variables with appropriate statistics for different bath models are applied. Anharmonic effects of a non-Gaussian bath are precisely encoded in the bath multi-time correlation functions that noise variables have to satisfy. Starting from the SLE, we devise a family of generalized hierarchical equations by averaging out the noise variables and expand bath multi-time correlation functions in a complete basis of orthonormal functions. The general hierarchical equations constitute systems of linear equations that provide numerically exact simulations of quantum dynamics. For bosonic bath models, our general hierarchical equation of motion reduces exactly to an extended version of hierarchical equation of motion which allows efficient simulation for arbitrary spectral densities and temperature regimes. Similar efficiency and flexibility can be achieved for the fermionic bath models within our formalism. The spin bath models can be simulated with two complementary approaches in the present formalism. (I) They can be viewed as an example of non-Gaussian bath models and be directly handled with the general hierarchical equation approach given their multi-time correlation functions. (II) Alternatively, each bath spin can be first mapped onto a pair of fermions and be treated as fermionic environments within the present formalism.
NASA Astrophysics Data System (ADS)
Welden, Alicia Rae; Rusakov, Alexander A.; Zgid, Dominika
2016-11-01
Including finite-temperature effects from the electronic degrees of freedom in electronic structure calculations of semiconductors and metals is desired; however, in practice it remains exceedingly difficult when using zero-temperature methods, since these methods require an explicit evaluation of multiple excited states in order to account for any finite-temperature effects. Using a Matsubara Green's function formalism remains a viable alternative, since in this formalism it is easier to include thermal effects and to connect the dynamic quantities such as the self-energy with static thermodynamic quantities such as the Helmholtz energy, entropy, and internal energy. However, despite the promising properties of this formalism, little is known about the multiple solutions of the non-linear equations present in the self-consistent Matsubara formalism and only a few cases involving a full Coulomb Hamiltonian were investigated in the past. Here, to shed some light onto the iterative nature of the Green's function solutions, we self-consistently evaluate the thermodynamic quantities for a one-dimensional (1D) hydrogen solid at various interatomic separations and temperatures using the self-energy approximated to second-order (GF2). At many points in the phase diagram of this system, multiple phases such as a metal and an insulator exist, and we are able to determine the most stable phase from the analysis of Helmholtz energies. Additionally, we show the evolution of the spectrum of 1D boron nitride to demonstrate that GF2 is capable of qualitatively describing the temperature effects influencing the size of the band gap.
Approaches to Foster Transfer of Formal Principles: Which Route to Take?
Schalk, Lennart; Saalbach, Henrik; Stern, Elsbeth
2016-01-01
Enabling learners to transfer knowledge about formal principles to new problems is a major aim of science and mathematics education, which, however, is notoriously difficult to reach. Previous research advocates different approaches of how to introduce principles to foster the transfer of knowledge about formal principles. One approach suggests teaching a generic formalism of the principles. Another approach suggests presenting (at least) two concrete cases instantiating the principle. A third approach suggests presenting a generic formalism accompanied by a case. As yet, though, empirical results regarding the transfer potential of these approaches are mixed and difficult to integrate, as the three approaches have rarely been tested competitively. Furthermore, the approaches have been evaluated in relation to different control conditions, and they have been assessed using varying transfer measures. In the present experiment, we introduced undergraduates to the formal principles of propositional logic with the aim to systematically compare the transfer potential of the different approaches in relation to each other and to a common control condition by using various learning and transfer tasks. Results indicate that all approaches supported successful learning and transfer of the principles, but also caused systematic differences in the magnitude of transfer. The combination of a generic formalism with a case was surprisingly unsuccessful, while learners who compared two cases outperformed the control condition. We discuss how the simultaneous assessment of the different approaches allows the underlying learning mechanisms to be captured more precisely and advances theory on how these mechanisms contribute to transfer performance.
NASA Astrophysics Data System (ADS)
Zheng, Mingfang; He, Cunfu; Lu, Yan; Wu, Bin
2018-01-01
We presented a numerical method to solve the phase dispersion curves in general anisotropic plates. This approach involves an exact solution to the problem in the form of Legendre polynomial multiple integrals, which we substituted into the state-vector formalism. In order to improve the efficiency of the proposed method, we made a special effort to demonstrate the analytical methodology. Furthermore, we analyzed the algebraic symmetries of the matrices in the state-vector formalism for anisotropic plates. The basic feature of the proposed method was the expansion of field quantities in Legendre polynomials. The Legendre polynomial method avoids solving the transcendental dispersion equation, which can only be solved numerically. This state-vector formalism combined with the Legendre polynomial expansion distinguishes adjacent dispersion modes clearly, even when the modes are very close. We then illustrate the theoretical solutions of the dispersion curves obtained by this method for isotropic and anisotropic plates. Finally, we compare the proposed method with the global matrix method (GMM) and find excellent agreement.
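As a small, generic illustration of the expansion step only (not the paper's full state-vector solver), the sketch below projects a through-thickness field profile onto the first few Legendre polynomials; the profile and truncation order are assumptions.

```python
import numpy as np
from numpy.polynomial import legendre as L

def legendre_coefficients(profile, n_terms, n_quad=64):
    """Project a field profile u(z), z in [-1, 1], onto P_0 ... P_{n_terms-1}."""
    z, w = L.leggauss(n_quad)                    # Gauss-Legendre quadrature nodes/weights
    coeffs = []
    for m in range(n_terms):
        Pm = L.Legendre.basis(m)(z)
        # orthogonality: \int_{-1}^{1} P_m(z)^2 dz = 2 / (2m + 1)
        coeffs.append((2 * m + 1) / 2.0 * np.sum(w * profile(z) * Pm))
    return np.array(coeffs)

# hypothetical displacement profile across the plate thickness
u = lambda z: np.cos(2.5 * z) + 0.3 * z ** 2
c = legendre_coefficients(u, n_terms=8)
u_approx = L.Legendre(c)
zz = np.linspace(-1, 1, 200)
print(np.max(np.abs(u(zz) - u_approx(zz))))      # truncation error of the expansion
```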
You, Jing
2016-05-01
This paper assesses the causal impact on child health of borrowing formal microcredit for Chinese rural households by exploiting a panel dataset (2000 and 2004) in a poor northwest province. Endogenous borrowing is controlled for in a dynamic regression-discontinuity design creating a quasi-experimental environment for causal inferences. There is a causal relationship running from formal microcredit to improved child health in the short term, while past borrowing behaviour has no protracted impact on subsequent child health outcomes. Moreover, formal microcredit appears to be a complement to health insurance in improving child health through two mechanisms: it enhances affordability of out-of-pocket health care expenditure and helps buffer consumption against adverse health shocks and financial risk incurred by current health insurance arrangements. Government efforts in expanding health insurance for rural households would be more likely to achieve the goal of improving child health outcomes if combined with sufficient access to formal microcredit. Copyright © 2015 John Wiley & Sons, Ltd.
ARIES: Acquisition of Requirements and Incremental Evolution of Specifications
NASA Technical Reports Server (NTRS)
Roberts, Nancy A.
1993-01-01
This paper describes a requirements/specification environment specifically designed for large-scale software systems. This environment is called ARIES (Acquisition of Requirements and Incremental Evolution of Specifications). ARIES provides assistance to requirements analysts for developing operational specifications of systems. This development begins with the acquisition of informal system requirements. The requirements are then formalized and gradually elaborated (transformed) into formal and complete specifications. ARIES provides guidance to the user in validating formal requirements by translating them into natural language representations and graphical diagrams. ARIES also provides ways of analyzing the specification to ensure that it is correct, e.g., testing the specification against a running simulation of the system to be built. Another important ARIES feature, especially when developing large systems, is the sharing and reuse of requirements knowledge. This leads to much less duplication of effort. ARIES combines all of its features in a single environment that makes the process of capturing a formal specification quicker and easier.
Unified quantitative characterization of epithelial tissue development
Guirao, Boris; Rigaud, Stéphane U; Bosveld, Floris; Bailles, Anaïs; López-Gay, Jesús; Ishihara, Shuji; Sugimura, Kaoru
2015-01-01
Understanding the mechanisms regulating development requires a quantitative characterization of cell divisions, rearrangements, cell size and shape changes, and apoptoses. We developed a multiscale formalism that relates the characterizations of each cell process to tissue growth and morphogenesis. Having validated the formalism on computer simulations, we quantified separately all morphogenetic events in the Drosophila dorsal thorax and wing pupal epithelia to obtain comprehensive statistical maps linking cell and tissue scale dynamics. While globally cell shape changes, rearrangements and divisions all significantly participate in tissue morphogenesis, locally, their relative participations display major variations in space and time. By blocking division we analyzed the impact of division on rearrangements, cell shape changes and tissue morphogenesis. Finally, by combining the formalism with mechanical stress measurement, we evidenced unexpected interplays between patterns of tissue elongation, cell division and stress. Our formalism provides a novel and rigorous approach to uncover mechanisms governing tissue development. DOI: http://dx.doi.org/10.7554/eLife.08519.001 PMID:26653285
A Spherical Harmonic Analysis of the Ooty Wide Field Array (OWFA) Visibility Signal
NASA Astrophysics Data System (ADS)
Chatterjee, Suman; Bharadwaj, Somnath
2018-04-01
Considering redshifted 21-cm intensity mapping with the upcoming OWFA, whose field of view subtends ~57° in the N-S direction, we present a formalism which relates the measured visibilities to the spherical harmonic coefficients of the sky signal. We use this to calculate window functions which relate the two-visibility correlations, i.e. the correlation between the visibilities measured at two baselines and two frequencies, to different multipoles of the multi-frequency angular power spectrum Cℓ(ν1, ν2). The formalism here is validated using simulations. We also present approximate closed-form analytical expressions which can be used to calculate the window functions. Comparing with the widely adopted flat-sky approximation, we find that its predictions match those of our spherical harmonic formalism to within 16% across the entire OWFA baseline range. The match improves at large baselines, where we have <5% deviations.
48 CFR 945.603-70 - Plant clearance function.
Code of Federal Regulations, 2011 CFR
2011-10-01
48 Federal Acquisition Regulations System 5 (2011-10-01). Plant clearance function. If the plant clearance function has not been formally delegated to another Federal agency, the contracting officer shall assume all responsibilities of the plant clearance officer...
48 CFR 945.670-1 - Plant clearance function.
Code of Federal Regulations, 2014 CFR
2014-10-01
48 Federal Acquisition Regulations System 5 (2014-10-01). Management; Government Property; Reporting, Reutilization, and Disposal; 945.670-1 Plant clearance function. If the plant clearance function has not been formally delegated to another Federal agency, the...
48 CFR 945.670-1 - Plant clearance function.
Code of Federal Regulations, 2013 CFR
2013-10-01
48 Federal Acquisition Regulations System 5 (2013-10-01). Management; Government Property; Reporting, Reutilization, and Disposal; 945.670-1 Plant clearance function. If the plant clearance function has not been formally delegated to another Federal agency, the...
48 CFR 945.603-70 - Plant clearance function.
Code of Federal Regulations, 2012 CFR
2012-10-01
48 Federal Acquisition Regulations System 5 (2012-10-01). Plant clearance function. If the plant clearance function has not been formally delegated to another Federal agency, the contracting officer shall assume all responsibilities of the plant clearance officer...
Weinkam, Patrick; Romesberg, Floyd E.; Wolynes, Peter G.
2010-01-01
A grand canonical formalism is developed to combine discrete simulations for chemically distinct species in equilibrium. Each simulation is based on a perturbed funneled landscape. The formalism is illustrated using the alkaline-induced transitions of cytochrome c as observed by FTIR spectroscopy and with various other experimental approaches. The grand canonical simulation method accounts for the acid/base chemistry of deprotonation, the inorganic chemistry of heme ligation and misligation, and the minimally frustrated folding energy landscape, thus elucidating the physics of protein folding involved with an acid/base titration of a protein. The formalism combines simulations for each of the relevant chemical species, varying by protonation and ligation states. In contrast to models based on perfectly funneled energy landscapes that contain only contacts found in the native structure, the current study introduces “chemical frustration” from deprotonation and misligation that gives rise to many intermediates at alkaline pH. While the nature of these intermediates cannot be easily inferred from available experimental data, the current study provides specific structural details of these intermediates thus extending our understanding of how cytochrome c changes with increasing pH. The results demonstrate the importance of chemical frustration for understanding biomolecular energy landscapes. PMID:19199810
Efficient evaluation of nonlocal operators in density functional theory
NASA Astrophysics Data System (ADS)
Chen, Ying-Chih; Chen, Jing-Zhe; Michaud-Rioux, Vincent; Shi, Qing; Guo, Hong
2018-02-01
We present a method which combines plane waves (PW) and numerical atomic orbitals (NAO) to efficiently evaluate nonlocal operators in density functional theory with periodic boundary conditions. Nonlocal operators are first expanded using PW and then transformed to NAO so that the problem of distance truncation is avoided. The general formalism is implemented using the hybrid functional HSE06, where the nonlocal operator is the exact exchange. Comparison of electronic structures of a wide range of semiconductors to a pure PW scheme validates the accuracy of our method. Due to the locality of NAO, and thus the sparsity of matrix representations of the operators, the computational complexity of the method is asymptotically quadratic in the number of electrons. Finally, we apply the technique to investigate the electronic structure of the interface between single-layer black phosphorus and the high-κ dielectric material c-HfO2. We predict that the band offset between the two materials is 1.29 eV and 2.18 eV for the valence and conduction band edges, respectively, and such offsets are suitable for 2D field-effect transistor applications.
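To illustrate the basis-transformation step described above in the most generic terms (this is not the paper's implementation), a nonlocal operator assembled in a plane-wave basis can be rotated into a localized-orbital basis with a single matrix congruence; the matrix sizes and the random placeholder data are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pw, n_nao = 200, 20

# placeholder nonlocal operator in the plane-wave basis (made Hermitian by construction)
A = rng.normal(size=(n_pw, n_pw)) + 1j * rng.normal(size=(n_pw, n_pw))
V_pw = 0.5 * (A + A.conj().T)

# placeholder expansion coefficients of the atomic orbitals in plane waves,
# C[g, mu] = <G_g | phi_mu>
C = rng.normal(size=(n_pw, n_nao)) + 1j * rng.normal(size=(n_pw, n_nao))

# congruence transform: V_nao[mu, nu] = sum_{g, g'} C*[g, mu] V_pw[g, g'] C[g', nu]
V_nao = C.conj().T @ V_pw @ C
print(V_nao.shape, np.allclose(V_nao, V_nao.conj().T))   # (20, 20) True
```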
The Routine Fitting of Kinetic Data to Models
Berman, Mones; Shahn, Ezra; Weiss, Marjory F.
1962-01-01
A mathematical formalism is presented for use with digital computers to permit the routine fitting of data to physical and mathematical models. Given a set of data, the mathematical equations describing a model, initial conditions for an experiment, and initial estimates for the values of model parameters, the computer program automatically proceeds to obtain a least squares fit of the data by an iterative adjustment of the values of the parameters. When the experimental measures are linear combinations of functions, the linear coefficients for a least squares fit may also be calculated. The values of both the parameters of the model and the coefficients for the sum of functions may be unknown independent variables, unknown dependent variables, or known constants. In the case of dependence, only linear dependencies are provided for in routine use. The computer program includes a number of subroutines, each one of which performs a special task. This permits flexibility in choosing various types of solutions and procedures. One subroutine, for example, handles linear differential equations, another, special non-linear functions, etc. The use of analytic or numerical solutions of equations is possible. PMID:13867975
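In the same spirit, a modern minimal sketch of iteratively fitting kinetic data to a model is shown below, using a two-exponential compartmental curve; the model, parameters, and synthetic data are assumptions, and this is not the original program described above.

```python
import numpy as np
from scipy.optimize import least_squares

def model(params, t):
    """Two-compartment kinetic model: a sum of two exponentials."""
    a1, k1, a2, k2 = params
    return a1 * np.exp(-k1 * t) + a2 * np.exp(-k2 * t)

def residuals(params, t, y):
    return model(params, t) - y

# synthetic "experimental" data
t = np.linspace(0.0, 10.0, 60)
true = (3.0, 1.2, 1.0, 0.15)
rng = np.random.default_rng(2)
y = model(true, t) + 0.05 * rng.normal(size=t.size)

# iterative least-squares adjustment starting from rough initial estimates
fit = least_squares(residuals, x0=[1.0, 1.0, 1.0, 0.1], args=(t, y))
print(fit.x)   # recovered parameter values
```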
Nonlocal response with local optics
NASA Astrophysics Data System (ADS)
Kong, Jiantao; Shvonski, Alexander J.; Kempa, Krzysztof
2018-04-01
For plasmonic systems too small for classical, local simulations to be valid, but too large for ab initio calculations to be computationally feasible, we developed a practical approach: a nonlocal-to-local mapping that enables the use of a modified local system to obtain the response due to nonlocal effects to lowest order, at the cost of higher structural complexity. In this approach, the nonlocal surface region of a metallic structure is mapped onto a local dielectric film, mathematically preserving the nonlocality of the entire system. The most significant feature of this approach is its full compatibility with conventional, highly efficient finite difference time domain (FDTD) simulation codes. Our optimized choice of mapping is based on Feibelman's d-function formalism, and it produces an effective dielectric function of the local film that obeys all required sum rules, as well as the Kramers-Kronig causality relations. We demonstrate the power of our approach combined with an FDTD scheme, in a series of comparisons with experiments and ab initio density functional theory calculations from the literature, for structures with dimensions from the subnanoscopic to microscopic range.
Interacting hadron resonance gas model in the K -matrix formalism
NASA Astrophysics Data System (ADS)
Dash, Ashutosh; Samanta, Subhasis; Mohanty, Bedangadas
2018-05-01
An extension of the hadron resonance gas (HRG) model is constructed to include interactions using a relativistic virial expansion of the partition function. The noninteracting part of the expansion contains all the stable baryons and mesons, and the interacting part contains all the higher-mass resonances which decay into two stable hadrons. The virial coefficients are related to the phase shifts, which are calculated using the K-matrix formalism in the present work. We have calculated various thermodynamic quantities like pressure, energy density, and entropy density of the system. A comparison of thermodynamic quantities with the noninteracting HRG model, calculated using the same number of hadrons, shows that the results of the above formalism are larger. A good agreement between the equation of state calculated in the K-matrix formalism and lattice QCD simulations is observed. Specifically, the interaction measure calculated on the lattice is well described in our formalism. We have also calculated second-order fluctuations and correlations of conserved charges in the K-matrix formalism. We observe a good agreement of second-order fluctuations and the baryon-strangeness correlation with lattice data below the crossover temperature.
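As a schematic reminder of how phase shifts enter such a virial expansion (a generic Beth-Uhlenbeck-type form rather than the paper's exact conventions, with the K-matrix relation quoted for a single elastic channel):

```latex
% Interacting (second virial) contribution written with an effective spectral
% function built from the scattering phase shifts \delta_\ell(M); Z^{(0)}(M) is
% the free partition function of a particle of invariant mass M.
\ln Z_{\mathrm{int}} \;\sim\; \int \mathrm{d}M \;
\frac{1}{\pi} \sum_{\ell} (2\ell + 1)\,
\frac{\mathrm{d}\delta_{\ell}(M)}{\mathrm{d}M}\; \ln Z^{(0)}(M),
\qquad
\tan\delta_{\ell}(M) \;=\; K_{\ell}(M) \quad \text{(single elastic channel)}
```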
37 CFR 1.152 - Design drawings.
Code of Federal Regulations, 2012 CFR
2012-07-01
... are not permitted in a design drawing. Photographs and ink drawings are not permitted to be combined as formal drawings in one application. Photographs submitted in lieu of ink drawings in design patent...
37 CFR 1.152 - Design drawings.
Code of Federal Regulations, 2014 CFR
2014-07-01
... are not permitted in a design drawing. Photographs and ink drawings are not permitted to be combined as formal drawings in one application. Photographs submitted in lieu of ink drawings in design patent...
Surface infrastructure functions, requirements and subsystems for a manned Mars mission
NASA Technical Reports Server (NTRS)
Fairchild, Kyle
1986-01-01
Planning and development for a permanently manned scientific outpost on Mars requires an in-depth understanding and analysis of the functions the outpost is expected to perform. The optimum configuration that accomplishes these functions then arises during the trade studies process. In a project this complex, it becomes necessary to use a formal methodology to document the design and planning process. The method chosen for this study is called top-down functional decomposition. This method is used to determine the functions that are needed to accomplish the overall mission, then determine what requirements and systems are needed to do each of the functions. This method facilitates automation of the trades and options process. In the example, this was done with an off-the-shelf software package called TK!Solver. The basic functions that a permanently manned outpost on Mars must accomplish are: (1) Establish the Life Critical Systems; (2) Support Planetary Sciences and Exploration; and (3) Develop and Maintain Long-term Support Functions, including those systems needed towards self-sufficiency. The top-down functional decomposition methodology, combined with standard spreadsheet software, offers a powerful tool to quickly assess various design trades and analyze options. As the specific subsystems and the relational rule algorithms are further refined, it will be possible to very accurately determine the implications of continually evolving mission requirements.
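As a purely illustrative sketch of the top-down decomposition bookkeeping (not the study's TK!Solver models), functions can be mapped to requirements and candidate subsystems and a simple trade rule evaluated over them; every name and number below is invented.

```python
# Hypothetical top-down functional decomposition: function -> requirement -> options
outpost = {
    "establish_life_critical_systems": {
        "requirement_kw_power": 60.0,
        "options": {"solar_array": {"kw": 40.0, "mass_t": 6.0},
                    "fission_reactor": {"kw": 100.0, "mass_t": 9.0}},
    },
    "support_planetary_science": {
        "requirement_kw_power": 15.0,
        "options": {"rover_lab": {"kw": 15.0, "mass_t": 3.5}},
    },
}

def feasible_options(decomposition):
    """Keep only the options that satisfy each function's power requirement."""
    trades = {}
    for function, spec in decomposition.items():
        need = spec["requirement_kw_power"]
        trades[function] = {name: opt for name, opt in spec["options"].items()
                            if opt["kw"] >= need}
    return trades

for function, options in feasible_options(outpost).items():
    print(function, "->", sorted(options))
```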
Application of Lightweight Formal Methods to Software Security
NASA Technical Reports Server (NTRS)
Gilliam, David P.; Powell, John D.; Bishop, Matt
2005-01-01
Formal specification and verification of security has proven a challenging task. There is no single method that has proven feasible. Instead, an integrated approach which combines several formal techniques can increase the confidence in the verification of software security properties. Such an approach, which specifies security properties in a library that can be reused, and two instruments and their methodologies developed for the National Aeronautics and Space Administration (NASA) at the Jet Propulsion Laboratory (JPL), are described herein. The Flexible Modeling Framework (FMF) is a model-based verification instrument that uses Promela and the SPIN model checker. The Property Based Tester (PBT) uses TASPEC and a Text Execution Monitor (TEM). They are used to reduce vulnerabilities and unwanted exposures in software during the development and maintenance life cycles.
Chauvet, G A
1993-03-29
In paper I a theory of functional organization in terms of functional interactions was proposed for a formal biological system (FBS). A functional interaction was defined as the product emitted by a structural unit, i.e. an assembly of molecules, cells, tissues or organs, which acts on another. We have shown that a self-association hypothesis could be an explanation for the source of these functional interactions because it is consistent with increased stability of the system after association. The construction of the set of interactions provides the topology of the biological system, called (O-FBS), in contrast to the (D-FBS) which describes the dynamics of the processes associated with the functional interactions. In this paper, an optimum principle is established, due to the non-symmetry of functional interactions, which could explain the stability of an FBS, and a criterion of evolution for the hierarchical topological organization of a FBS during development is deduced from that principle. The combinatorics of the (O-FBS) leads to the topological stability of the related graph. It is shown that this problem can be expressed as the re-distribution of sources and sinks, when one of them is suppressed, given the constraint of the invariance of the physiological function. Such an optimum principle could be called a 'principle of increase in functional order by hierarchy'. The first step is the formulation of a 'potential' for the functional organization, which describes the ability of the system to combine functional interactions, such that the principle of vital coherence (paper I) is satisfied. This function measures the number of potential functional interactions. The second step is to discover the maximum of this function. Biological systems in such a state of maximum organization are shown to satisfy particular dynamics, which can be experimentally verified: either the number of sinks decreases, or this number increases, in a monotonic way. The class of systems considered here is assumed to satisfy such an extremum hypothesis. The third step is a study of the variation of the degree of organization (paper I), i.e. the number of structural units when the biological system is growing. We establish an optimum principle for a new function called 'orgatropy'. By adding a criterion of specialization to the system we show the emergence of a level of organization with a re-organization of the system.(ABSTRACT TRUNCATED AT 400 WORDS)
ERIC Educational Resources Information Center
Hutt, Ethan L.
2012-01-01
Background/Context: Though the impact of the legal system in shaping public education over the last sixty years is unquestioned, scholars have largely overlooked the impact of the legal system on the early development and trajectory of public schools in America. Scholars have given particularly little attention to the period in the late nineteenth…
Bol, Nadine; van Weert, Julia C M; de Haes, Hanneke C J M; Loos, Eugene F; Smets, Ellen M A
2015-04-24
Older adults are increasingly using the Internet for health information; however, they are often not able to correctly recall Web-based information (eHealth information). Recall of information is crucial for optimal health outcomes, such as adequate disease management and adherence to medical regimes. Combining effective message strategies may help to improve recall of eHealth information among older adults. Presenting information in an audiovisual format using conversational narration style is expected to optimize recall of information compared to other combinations of modality and narration style. The aim of this paper is to investigate the effect of modality and narration style on recall of health information, and whether there are differences between younger and older adults. We conducted a Web-based experiment using a 2 (modality: written vs audiovisual information) by 2 (narration style: formal vs conversational style) between-subjects design (N=440). Age was assessed in the questionnaire and included as a factor: younger (<65 years) versus older (≥65 years) age. Participants were randomly assigned to one of four experimental webpages where information about lung cancer treatment was presented. A Web-based questionnaire assessed recall of eHealth information. Audiovisual modality (vs written modality) was found to increase recall of information in both younger and older adults (P=.04). Although conversational narration style (vs formal narration style) did not increase recall of information (P=.17), a synergistic effect between modality and narration style was revealed: combining audiovisual information with conversational style outperformed combining written information with formal style (P=.01), as well as written information with conversational style (P=.045). This finding suggests that conversational style especially increases recall of information when presented audiovisually. This combination of modality and narration style improved recall of information among both younger and older adults. We conclude that combining audiovisual information with conversational style is the best way to present eHealth information to younger and older adults. Even though older adults did not proportionally recall more when audiovisual information was combined with conversational style than younger adults, this study reveals interesting implications for improving eHealth information that is effective for both younger and older adults.
Moving formal methods into practice. Verifying the FTPP Scoreboard: Results, phase 1
NASA Technical Reports Server (NTRS)
Srivas, Mandayam; Bickford, Mark
1992-01-01
This report documents the Phase 1 results of an effort aimed at formally verifying a key hardware component, called Scoreboard, of a Fault-Tolerant Parallel Processor (FTPP) being built at Charles Stark Draper Laboratory (CSDL). The Scoreboard is part of the FTPP virtual bus that guarantees reliable communication between processors in the presence of Byzantine faults in the system. The Scoreboard implements a piece of control logic that approves and validates a message before it can be transmitted. The goal of Phase 1 was to lay the foundation of the Scoreboard verification. A formal specification of the functional requirements and a high-level hardware design for the Scoreboard were developed. The hardware design was based on a preliminary Scoreboard design developed at CSDL. A main correctness theorem, from which the functional requirements can be established as corollaries, was proved for the Scoreboard design. The goal of Phase 2 is to verify the final detailed design of Scoreboard. This task is being conducted as part of a NASA-sponsored effort to explore integration of formal methods in the development cycle of current fault-tolerant architectures being built in the aerospace industry.
NASA Astrophysics Data System (ADS)
Kalanov, Temur Z.
2013-04-01
A critical analysis of the standard foundations of differential and integral calculus, as the mathematical formalism of theoretical physics, is proposed. The methodological basis of the analysis is the unity of formal logic and rational dialectics. It is shown that: (a) the foundations, i.e. dy/dx ≡ lim_{Δx→0} Δy/Δx = lim_{Δx→0} [f(x+Δx) − f(x)]/Δx, dx ≡ Δx, dy ≡ Δy, where y = f(x) is a continuous function of one argument x, Δx and Δy are increments, and dx and dy are differentials, do not satisfy the formal-logical law of identity; (b) the infinitesimal quantities dx, dy are fictitious quantities: they have neither algebraic nor geometrical meaning because they do not take numerical values and therefore have no quantitative measure; (c) expressions of the kind x + dx are erroneous because x (a finite quantity) and dx (an infinitely diminished quantity) have different senses, different qualitative determinacy; since x ≡ const under Δx → 0, a derivative does not contain the variable quantity x and depends only on a constant c. Consequently, the standard concepts "infinitesimal quantity (uninterruptedly diminishing quantity)", "derivative", and "derivative as a function of a variable quantity" represent an incorrect basis of mathematics and theoretical physics.
NASA Astrophysics Data System (ADS)
Fitzpatrick, Matthew R. C.; Kennett, Malcolm P.
2018-05-01
We develop a formalism that allows the study of correlations in space and time in both the superfluid and Mott insulating phases of the Bose-Hubbard model. Specifically, we obtain a two-particle irreducible effective action within the contour-time formalism that allows for both equilibrium and out-of-equilibrium phenomena. We derive equations of motion for both the superfluid order parameter and two-point correlation functions. To assess the accuracy of this formalism, we study the equilibrium solution of the equations of motion and compare our results to existing strong-coupling methods as well as exact methods where possible. We discuss applications of this formalism to out-of-equilibrium situations.
NASA Astrophysics Data System (ADS)
Aouaini, Fatma; Knani, Salah; Yahia, Manel Ben; Bahloul, Neila; Ben Lamine, Abdelmottaleb; Kechaou, Nabil
2015-12-01
In this paper, we present a new investigation that allows the pore size distribution (PSD) of a porous medium to be determined. The PSD is obtained from the desorption isotherms of four varieties of olive leaves by means of a statistical physics formalism and Kelvin's law. The results are compared with those obtained by scanning electron microscopy. The effect of temperature on the pore distribution function is studied, and the influence of each parameter on the PSD is interpreted. A similar adsorption energy distribution (AED) function is deduced from the PSD.
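For context, Kelvin's law referred to above relates the relative vapour pressure at which capillary condensation or desorption occurs to the pore radius. A minimal sketch of the standard form is given below, assuming the usual cylindrical-pore picture; the surface tension γ, molar volume V_m, contact angle θ and adsorbed-film correction t_film are illustrative symbols, and the paper's exact working equations may differ:

$$ \ln\frac{p}{p_0} \;=\; -\,\frac{2\gamma V_m \cos\theta}{r_K\,R\,T}, \qquad r_\text{pore} \;\simeq\; r_K + t_\text{film}, $$

so that each point of the desorption isotherm (a relative pressure p/p0 at temperature T) maps to a Kelvin radius r_K, from which a pore size distribution can be built.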
ERIC Educational Resources Information Center
Ogden, Daniel M., Jr.
1978-01-01
Suggests that the most practical budgeting system for most managers is a formalized combination of incremental and zero-based analysis because little can be learned about most programs from an annual zero-based budget. (Author/IRT)
Programmable Potentials: Approximate N-body potentials from coarse-level logic.
Thakur, Gunjan S; Mohr, Ryan; Mezić, Igor
2016-09-27
This paper gives a systematic method for constructing an N-body potential, approximating the true potential, that accurately captures meso-scale behavior of the chemical or biological system using pairwise potentials coming from experimental data or ab initio methods. The meso-scale behavior is translated into logic rules for the dynamics. Each pairwise potential has an associated logic function that is constructed using the logic rules, a class of elementary logic functions, and AND, OR, and NOT gates. The effect of each logic function is to turn its associated potential on and off. The N-body potential is constructed as linear combination of the pairwise potentials, where the "coefficients" of the potentials are smoothed versions of the associated logic functions. These potentials allow a potentially low-dimensional description of complex processes while still accurately capturing the relevant physics at the meso-scale. We present the proposed formalism to construct coarse-grained potential models for three examples: an inhibitor molecular system, bond breaking in chemical reactions, and DNA transcription from biology. The method can potentially be used in reverse for design of molecular processes by specifying properties of molecules that can carry them out.
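To make the construction concrete, here is a minimal Python sketch (not the authors' code) of the idea: each pairwise potential is multiplied by a smoothed logic function built from elementary gates, and the N-body potential is the sum of these gated terms. The particular potentials, gate rule, and smoothing function are illustrative assumptions.

```python
import numpy as np

# Smoothed elementary logic: values in [0, 1] instead of {0, 1}.
def smooth_step(x, x0, width=0.25):
    """Soft indicator for 'x > x0' (illustrative smoothing choice)."""
    return 1.0 / (1.0 + np.exp(-(x - x0) / width))

def AND(a, b):  return a * b          # soft conjunction
def OR(a, b):   return a + b - a * b  # soft disjunction
def NOT(a):     return 1.0 - a        # soft negation

# Two illustrative pairwise potentials (units and parameters are arbitrary).
def lennard_jones(r, eps=1.0, sigma=1.0):
    s = (sigma / r) ** 6
    return 4.0 * eps * (s * s - s)

def harmonic_bond(r, k=50.0, r0=1.2):
    return 0.5 * k * (r - r0) ** 2

def n_body_potential(positions):
    """Sum of pairwise potentials gated by smoothed logic functions.

    Toy rule: the 'bond' term between particles i and j is switched on
    only when both are close to a third (catalyst-like) particle 0,
    otherwise the plain dispersion term acts.
    """
    V = 0.0
    n = len(positions)
    for i in range(1, n):
        for j in range(i + 1, n):
            r_ij = np.linalg.norm(positions[i] - positions[j])
            near_i = smooth_step(-np.linalg.norm(positions[i] - positions[0]), -1.5)
            near_j = smooth_step(-np.linalg.norm(positions[j] - positions[0]), -1.5)
            gate = AND(near_i, near_j)            # logic rule -> "coefficient"
            V += gate * harmonic_bond(r_ij) + NOT(gate) * lennard_jones(r_ij)
    return V

pos = np.array([[0.0, 0.0, 0.0], [1.0, 0.2, 0.0], [1.8, -0.1, 0.0]])
print(n_body_potential(pos))
```

Because the gates are smoothed rather than binary, the resulting potential stays differentiable, which is what allows it to be used directly in molecular dynamics.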
Programmable Potentials: Approximate N-body potentials from coarse-level logic
NASA Astrophysics Data System (ADS)
Thakur, Gunjan S.; Mohr, Ryan; Mezić, Igor
2016-09-01
This paper gives a systematic method for constructing an N-body potential, approximating the true potential, that accurately captures meso-scale behavior of the chemical or biological system using pairwise potentials coming from experimental data or ab initio methods. The meso-scale behavior is translated into logic rules for the dynamics. Each pairwise potential has an associated logic function that is constructed using the logic rules, a class of elementary logic functions, and AND, OR, and NOT gates. The effect of each logic function is to turn its associated potential on and off. The N-body potential is constructed as linear combination of the pairwise potentials, where the “coefficients” of the potentials are smoothed versions of the associated logic functions. These potentials allow a potentially low-dimensional description of complex processes while still accurately capturing the relevant physics at the meso-scale. We present the proposed formalism to construct coarse-grained potential models for three examples: an inhibitor molecular system, bond breaking in chemical reactions, and DNA transcription from biology. The method can potentially be used in reverse for design of molecular processes by specifying properties of molecules that can carry them out.
3D GIS spatial operation based on extended Euler operators
NASA Astrophysics Data System (ADS)
Xu, Hongbo; Lu, Guonian; Sheng, Yehua; Zhou, Liangchen; Guo, Fei; Shang, Zuoyan; Wang, Jing
2008-10-01
At present, implementations of 3D spatial operations are tied to particular data structures, lack universality, and cannot handle non-manifold cases. The ISO/DIS 19107 standard only defines Boolean and set operators for topological relationship queries, and OGC GeoXACML gives formal definitions of several set functions without implementation details. To address these problems, this paper builds on the mathematical foundation of cell complex theory, a non-manifold data structure, and related research on non-manifold geometric modelling. First, based on the non-manifold Euler-Poincaré formula, we construct six extended Euler operators and their inverses for creating, updating and deleting 3D spatial elements, together with several pairs of supplementary Euler operators that make it convenient to implement advanced functions. Second, we recast the topological-element operation sequences of Boolean operations, set operations, and the set functions defined in GeoXACML as combinations of extended Euler operators, which separates the upper-level functions from the lower-level data structure. Finally, we develop an underground 3D GIS prototype system in which the practicability and reliability of the extended Euler operators for 3D GIS presented in this paper are validated.
Programmable Potentials: Approximate N-body potentials from coarse-level logic
Thakur, Gunjan S.; Mohr, Ryan; Mezić, Igor
2016-01-01
This paper gives a systematic method for constructing an N-body potential, approximating the true potential, that accurately captures meso-scale behavior of the chemical or biological system using pairwise potentials coming from experimental data or ab initio methods. The meso-scale behavior is translated into logic rules for the dynamics. Each pairwise potential has an associated logic function that is constructed using the logic rules, a class of elementary logic functions, and AND, OR, and NOT gates. The effect of each logic function is to turn its associated potential on and off. The N-body potential is constructed as linear combination of the pairwise potentials, where the “coefficients” of the potentials are smoothed versions of the associated logic functions. These potentials allow a potentially low-dimensional description of complex processes while still accurately capturing the relevant physics at the meso-scale. We present the proposed formalism to construct coarse-grained potential models for three examples: an inhibitor molecular system, bond breaking in chemical reactions, and DNA transcription from biology. The method can potentially be used in reverse for design of molecular processes by specifying properties of molecules that can carry them out. PMID:27671683
Maximum entropy formalism for the analytic continuation of matrix-valued Green's functions
NASA Astrophysics Data System (ADS)
Kraberger, Gernot J.; Triebl, Robert; Zingl, Manuel; Aichhorn, Markus
2017-10-01
We present a generalization of the maximum entropy method to the analytic continuation of matrix-valued Green's functions. To treat off-diagonal elements correctly based on Bayesian probability theory, the entropy term has to be extended for spectral functions that are possibly negative in some frequency ranges. In that way, all matrix elements of the Green's function matrix can be analytically continued; we introduce a computationally cheap element-wise method for this purpose. However, this method cannot ensure important constraints on the mathematical properties of the resulting spectral functions, namely positive semidefiniteness and Hermiticity. To improve on this, we present a full matrix formalism, where all matrix elements are treated simultaneously. We show the capabilities of these methods using insulating and metallic dynamical mean-field theory (DMFT) Green's functions as test cases. Finally, we apply the methods to realistic material calculations for LaTiO3, where off-diagonal matrix elements in the Green's function appear due to the distorted crystal structure.
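As a pointer to the kind of objective functional involved, maximum entropy continuation selects the spectral function by balancing a data-misfit term against an entropy term. For off-diagonal (possibly sign-changing) spectral functions, one commonly used "positive-negative" entropy takes the following form; this is a sketch under the assumption of a strictly positive default model D(ω), and the paper's exact definitions may differ:

$$ Q_\alpha[A] \;=\; \alpha S[A] - \tfrac{1}{2}\chi^2[A], \qquad \chi^2[A] \;=\; \sum_n \frac{\bigl|\,G_n - \sum_l K_{nl}\,A_l\,\Delta\omega_l\,\bigr|^2}{\sigma_n^2}, $$

$$ S[A] \;=\; \int d\omega\left[\sqrt{A(\omega)^2 + 4D(\omega)^2} - 2D(\omega) - A(\omega)\,\ln\frac{\sqrt{A(\omega)^2 + 4D(\omega)^2} + A(\omega)}{2D(\omega)}\right], $$

which reduces to the usual Shannon-type relative entropy when A(ω) is non-negative.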
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yan, Yiying, E-mail: yiyingyan@sjtu.edu.cn; Lü, Zhiguo, E-mail: zglv@sjtu.edu.cn; Zheng, Hang, E-mail: hzheng@sjtu.edu.cn
We present a theoretical formalism for resonance fluorescence radiating from a two-level system (TLS) driven by any periodic driving and coupled to multiple reservoirs. The formalism is derived analytically based on the combination of Floquet theory and the Born–Markov master equation. The formalism allows us to calculate the spectrum when the Floquet states and quasienergies are solved analytically or numerically for simple or complicated driving fields. We can systematically explore the spectral features by implementing the present formalism. To exemplify this theory, we apply the unified formalism to comprehensively study a generic model in which a harmonically driven TLS is simultaneously coupled to a radiative reservoir and a dephasing reservoir. We demonstrate that the significant features of the fluorescence spectra, the driving-induced asymmetry and the dephasing-induced asymmetry, can be attributed to the violation of the detailed balance condition, and explained in terms of the driving-related transition quantities between Floquet states and their steady populations. In addition, we find distinct features of the fluorescence spectra under biharmonic and multiharmonic driving fields in contrast with those of the harmonic driving case. In the case of biharmonic driving, we find that the spectra differ significantly from the result of the RWA under multiple resonance conditions. Through these concrete applications, we illustrate that the present formalism provides a routine tool for comprehensively exploring the fluorescence spectrum of periodically and strongly driven TLSs.
Affine group formulation of the Standard Model coupled to gravity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chou, Ching-Yi, E-mail: l2897107@mail.ncku.edu.tw; Ita, Eyo, E-mail: ita@usna.edu; Soo, Chopin, E-mail: cpsoo@mail.ncku.edu.tw
In this work we apply the affine group formalism for four-dimensional gravity of Lorentzian signature, which is based on Klauder's affine algebraic program, to the formulation of the Hamiltonian constraint of the interaction of matter and all forces, including gravity with non-vanishing cosmological constant Λ, as an affine Lie algebra. We use the hermitian action of fermions coupled to gravitation and Yang–Mills theory to find the density weight one fermionic super-Hamiltonian constraint. This term, combined with the Yang–Mills and Higgs energy densities, is composed with York's integrated time functional. The result, when combined with the imaginary part of the Chern–Simons functional Q, forms the affine commutation relation with the volume element V(x). Affine algebraic quantization of gravitation and matter on an equal footing implies a fundamental uncertainty relation which is predicated upon a non-vanishing cosmological constant. Highlights: •Wheeler–DeWitt equation (WDW) quantized as affine algebra, realizing Klauder's program. •WDW formulated for interaction of matter and all forces, including gravity, as affine algebra. •WDW features Hermitian generators in spite of fermionic content: Standard Model addressed. •Constructed a family of physical states for the full, coupled theory via affine coherent states. •Fundamental uncertainty relation, predicated on non-vanishing cosmological constant.
Derivatization and diffusive motion of molecular fullerenes: Ab initio and atomistic simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berdiyorov, G., E-mail: gberdiyorov@qf.org.qa; Tabet, N.; Harrabi, K.
2015-07-14
Using first principles density functional theory in combination with the nonequilibrium Green's function formalism, we study the effect of derivatization on the electronic and transport properties of C60 fullerene. As a typical example, we consider [6,6]-phenyl-C61-butyric acid methyl ester (PCBM), which forms one of the most efficient organic photovoltaic materials in combination with electron donating polymers. Extra peaks are observed in the density of states (DOS) due to the formation of new electronic states localized at/near the attached molecule. Despite such peculiar behavior in the DOS of an isolated molecule, derivatization does not have a pronounced effect on the electronic transport properties of the fullerene molecular junctions. Both C60 and PCBM show the same response to finite voltage biasing with new features in the transmission spectrum due to voltage induced delocalization of some electronic states. We also study the diffusive motion of molecular fullerenes in ethanol solvent and inside poly(3-hexylthiophene) lamella using reactive molecular dynamics simulations. We found that the mobility of the fullerene reduces considerably due to derivatization; the diffusion coefficient of C60 is an order of magnitude larger than the one for PCBM.
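Several of the transport studies collected here rest on the same Landauer/NEGF machinery: a retarded Green's function of the scattering region dressed by lead self-energies, from which a transmission function follows. The toy Python sketch below, a one-dimensional tight-binding chain with a single impurity site, illustrates that formalism only; it is not the DFT-based calculation of the paper, and the hopping, coupling and on-site parameters are illustrative assumptions.

```python
import numpy as np

t = 1.0        # lead hopping (illustrative)
t_c = 0.8      # coupling of the device site to each lead
eps_imp = 0.5  # on-site energy of the single device site

def surface_g(E, t, eta=1e-9):
    """Retarded surface Green's function of a semi-infinite 1D chain."""
    z = E + 1j * eta
    sq = np.sqrt(z**2 - 4 * t**2 + 0j)
    g = (z - sq) / (2 * t**2)
    if g.imag > 0:                       # pick the retarded branch (Im g <= 0)
        g = (z + sq) / (2 * t**2)
    return g

def transmission(E):
    gs = surface_g(E, t)
    sigma_L = t_c**2 * gs                # lead self-energies (1x1 here)
    sigma_R = t_c**2 * gs
    gamma_L = 1j * (sigma_L - sigma_L.conjugate())   # broadening functions
    gamma_R = 1j * (sigma_R - sigma_R.conjugate())
    G = 1.0 / (E + 1e-9j - eps_imp - sigma_L - sigma_R)   # retarded Green's function
    return float((gamma_L * G * gamma_R * G.conjugate()).real)  # T = Tr[GL G GR G+]

for E in np.linspace(-2.0, 2.0, 9):
    print(f"E = {E:+.2f}   T(E) = {transmission(E):.4f}")
```

The DFT+NEGF calculations referenced in these abstracts follow the same pattern with matrices in an atomic-orbital basis in place of the scalars used here.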
Black hole entropy and Lorentz-diffeomorphism Noether charge
NASA Astrophysics Data System (ADS)
Jacobson, Ted; Mohd, Arif
2015-12-01
We show that, in the first or second order orthonormal frame formalism, black hole entropy is the horizon Noether charge for a combination of diffeomorphism and local Lorentz symmetry involving the Lie derivative of the frame. The Noether charge for diffeomorphisms alone is unsuitable, since a regular frame cannot be invariant under the flow of the Killing field at the bifurcation surface. We apply this formalism to Lagrangians polynomial in wedge products of the frame field 1-form and curvature 2-form, including general relativity, Lovelock gravity, and "topological" terms in four dimensions.
An object-oriented description method of EPMM process
NASA Astrophysics Data System (ADS)
Jiang, Zuo; Yang, Fan
2017-06-01
In order to use mature object-oriented tools and languages in software process modelling, and to make software process models conform more closely to industrial standards, it is necessary to study object-oriented modelling of software processes. Based on the formal process definitions in EPMM, and taking into account that Petri nets are primarily a formal modelling tool, this paper combines Petri net modelling with object-oriented modelling ideas and provides an implementation method for converting the Petri-net-based EPMM into object models based on an object-oriented description.
An Approach to Verification and Validation of a Reliable Multicasting Protocol
NASA Technical Reports Server (NTRS)
Callahan, John R.; Montgomery, Todd L.
1994-01-01
This paper describes the process of implementing a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally using a combination of formal and informal techniques in an attempt to ensure the correctness of its implementation. Our development process involved three concurrent activities: (1) the initial construction and incremental enhancement of a formal state model of the protocol machine; (2) the initial coding and incremental enhancement of the implementation; and (3) model-based testing of iterative implementations of the protocol. These activities were carried out by two separate teams: a design team and a V&V team. The design team built the first version of RMP with limited functionality to handle only nominal requirements of data delivery. In a series of iterative steps, the design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation. This was done by generating test cases based on suspected errant or off-nominal behaviors predicted by the current model. If the execution of a test was different between the model and implementation, then the differences helped identify inconsistencies between the model and implementation. The dialogue between both teams drove the co-evolution of the model and implementation. Testing served as the vehicle for keeping the model and implementation in fidelity with each other. This paper describes (1) our experiences in developing our process model; and (2) three example problems found during the development of RMP.
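The model-based testing loop described above, keeping a formal state model and an implementation in step and generating tests from suspected off-nominal behaviors, can be illustrated with a minimal Python sketch. The protocol here is a made-up two-state acknowledgement machine, not RMP itself, and all names are illustrative.

```python
from itertools import product

# Formal state model: (state, event) -> (next_state, output)
MODEL = {
    ("IDLE", "send"): ("WAIT_ACK", "DATA"),
    ("WAIT_ACK", "ack"): ("IDLE", None),
    ("WAIT_ACK", "timeout"): ("WAIT_ACK", "DATA"),  # model says: retransmit
}

class Implementation:
    """Hand-written implementation under test (deliberately buggy)."""
    def __init__(self):
        self.state = "IDLE"
    def step(self, event):
        if self.state == "IDLE" and event == "send":
            self.state = "WAIT_ACK"; return "DATA"
        if self.state == "WAIT_ACK" and event == "ack":
            self.state = "IDLE"; return None
        if self.state == "WAIT_ACK" and event == "timeout":
            self.state = "IDLE"; return None   # bug: drops the retransmission
        return None

def run_model(events):
    state, outputs = "IDLE", []
    for e in events:
        state, out = MODEL.get((state, e), (state, None))
        outputs.append(out)
    return outputs

def run_impl(events):
    impl = Implementation()
    return [impl.step(e) for e in events]

# Generate short test sequences (including off-nominal orderings) and compare.
for events in product(["send", "ack", "timeout"], repeat=3):
    if run_model(events) != run_impl(events):
        print("divergence on", events)
        break
```

A divergence between model and implementation on a generated sequence is exactly the kind of signal that, in the process described above, triggers a dialogue between the design and V&V teams.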
An approach to verification and validation of a reliable multicasting protocol
NASA Technical Reports Server (NTRS)
Callahan, John R.; Montgomery, Todd L.
1995-01-01
This paper describes the process of implementing a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally using a combination of formal and informal techniques in an attempt to ensure the correctness of its implementation. Our development process involved three concurrent activities: (1) the initial construction and incremental enhancement of a formal state model of the protocol machine; (2) the initial coding and incremental enhancement of the implementation; and (3) model-based testing of iterative implementations of the protocol. These activities were carried out by two separate teams: a design team and a V&V team. The design team built the first version of RMP with limited functionality to handle only nominal requirements of data delivery. In a series of iterative steps, the design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation. This was done by generating test cases based on suspected errant or off-nominal behaviors predicted by the current model. If the execution of a test was different between the model and implementation, then the differences helped identify inconsistencies between the model and implementation. The dialogue between both teams drove the co-evolution of the model and implementation. Testing served as the vehicle for keeping the model and implementation in fidelity with each other. This paper describes (1) our experiences in developing our process model; and (2) three example problems found during the development of RMP.
The SAGA Survey. I. Satellite Galaxy Populations around Eight Milky Way Analogs
NASA Astrophysics Data System (ADS)
Geha, Marla; Wechsler, Risa H.; Mao, Yao-Yuan; Tollerud, Erik J.; Weiner, Benjamin; Bernstein, Rebecca; Hoyle, Ben; Marchi, Sebastian; Marshall, Phil J.; Muñoz, Ricardo; Lu, Yu
2017-09-01
We present the survey strategy and early results of the “Satellites Around Galactic Analogs” (SAGA) Survey. The SAGA Survey’s goal is to measure the distribution of satellite galaxies around 100 systems analogous to the Milky Way down to the luminosity of the Leo I dwarf galaxy (M_r < -12.3). We define a Milky Way analog based on K-band luminosity and local environment. Here, we present satellite luminosity functions for eight Milky-Way-analog galaxies between 20 and 40 Mpc. These systems have nearly complete spectroscopic coverage of candidate satellites within the projected host virial radius down to r_o < 20.75 using low-redshift gri color criteria. We have discovered a total of 25 new satellite galaxies: 14 new satellite galaxies meet our formal criteria around our complete host systems, plus 11 additional satellites in either incompletely surveyed hosts or below our formal magnitude limit. Combined with 13 previously known satellites, there are a total of 27 satellites around 8 complete Milky-Way-analog hosts. We find a wide distribution in the number of satellites per host, from 1 to 9, in the luminosity range for which there are 5 Milky Way satellites. Standard abundance matching extrapolated from higher luminosities predicts less scatter between hosts and a steeper luminosity function slope than observed. We find that the majority of satellites (26 of 27) are star-forming. These early results indicate that the Milky Way has a different satellite population than typical in our sample, potentially changing the physical interpretation of measurements based only on the Milky Way’s satellite galaxies.
Theoretical computer science and the natural sciences
NASA Astrophysics Data System (ADS)
Marchal, Bruno
2005-12-01
I present some fundamental theorems in computer science and illustrate their relevance in Biology and Physics. I do not assume prerequisites in mathematics or computer science beyond the set N of natural numbers, functions from N to N, the use of some notational conveniences to describe functions, and at some point, a minimal amount of linear algebra and logic. I start with Cantor's transcendental proof by diagonalization of the non-enumerability of the collection of functions from the natural numbers to the natural numbers. I explain why this proof is not entirely convincing and show how, by restricting the notion of function in terms of discrete well-defined processes, we are led to the non-algorithmic enumerability of the computable functions, but also, through Church's thesis, to the algorithmic enumerability of the partial computable functions. Such a notion of function constitutes, with respect to our purpose, a crucial generalization of that concept. This will make it easy to justify deep and astonishing (counter-intuitive) incompleteness results about computers and similar machines. The modified Cantor diagonalization will provide a theory of concrete self-reference, and I illustrate it by pointing toward an elementary theory of self-reproduction (in the amoeba's way) and cellular self-regeneration (in the flatworm Planaria's way). To make this easier, I introduce a very simple and powerful formal system known as the Schoenfinkel-Curry combinators. I will use the combinators to illustrate in a more concrete way the notions introduced above. The combinators, thanks to their low-level fine-grained design, will also make it possible to give a rough but hopefully illuminating description of the main lessons gained by the careful observation of nature, and to describe some new relations, which should exist between computer science, the science of life and the science of inert matter, once some philosophical, if not theological, hypotheses are made in the cognitive sciences. In the last section, I come back to self-reference and give an exposition of its modal logics. This is used to show that theoretical computer science makes those “philosophical hypotheses” in theoretical cognitive science experimentally and mathematically testable.
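As a concrete taste of the Schoenfinkel-Curry combinators mentioned above, here is a tiny Python sketch (an illustration in ordinary curried functions, not the author's notation): the two combinators K and S suffice to express other functions, for example the identity combinator I = S K K and the composition combinator B.

```python
# Schoenfinkel-Curry combinators, written as curried Python functions.
K = lambda x: lambda y: x                       # K x y = x
S = lambda f: lambda g: lambda x: f(x)(g(x))    # S f g x = f x (g x)

# The identity combinator can be built from S and K alone: I = S K K.
I = S(K)(K)
assert I(42) == 42

# Composition B = S (K S) K satisfies B f g x = f (g x).
B = S(K(S))(K)
double = lambda n: 2 * n
succ = lambda n: n + 1
assert B(double)(succ)(3) == double(succ(3)) == 8

print(I("any value"), B(double)(succ)(3))
```

That such a minimal system is computationally universal is what makes it a convenient "fine-grained" vehicle for the self-reference arguments sketched in the abstract.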
Jiang, Zhuoling; Wang, Hao; Shen, Ziyong; Sanvito, Stefano; Hou, Shimin
2016-07-28
The atomic structure and electronic transport properties of a single hydrogen molecule connected to both symmetric and asymmetric Cu electrodes are investigated by using the non-equilibrium Green's function formalism combined with the density functional theory. Our calculations show that in symmetric Cu-H2-Cu junctions, the low-bias conductance drops rapidly upon stretching, while asymmetric ones present a low-bias conductance spanning the 0.2-0.3 G0 interval for a wide range of electrode separations. This is in good agreement with experiments on Cu atomic contacts in a hydrogen environment. Furthermore, the distribution of the calculated vibrational energies of the two hydrogen atoms in the asymmetric Cu-H2-Cu junction is also consistent with experiments. These findings provide clear evidence for the formation of asymmetric Cu-H2-Cu molecular junctions in breaking Cu atomic contacts in the presence of hydrogen and are also helpful for the design of molecular devices with Cu electrodes.
NASA Astrophysics Data System (ADS)
Caliskan, Serkan
2018-05-01
Using first principles study, through Density Functional Theory combined with Non Equilibrium Green's Function Formalism, electronic properties of endohedral N@C20 fullerene molecule joining Au electrodes (Au-N@C20) was addressed in the presence of spin property. The electronic transport behavior across the Au-N@C20 molecular junction was investigated by spin resolved transmission, density of states, molecular orbitals, differential conductance and current-voltage (I-V) characteristics. Spin asymmetric variation was clearly observed in the results due to single N atom encapsulated in the C20 fullerene cage, where the N atom played an essential role in the electronic behavior of Au-N@C20. This N@C20 based molecular bridge, exhibiting a spin dependent I-V variation, revealed a metallic behavior within the bias range from -1 V to 1 V. The induced magnetic moment, spin polarization and other relevant quantities associated with the spin resolved transport were elucidated.
Dynamic Relaxational Behaviour of Hyperbranched Polyether Polyols
NASA Astrophysics Data System (ADS)
Navarro-Gorris, A.; Garcia-Bernabé, A.; Stiriba, S.-E.
2008-08-01
Hyperbranched polymers are highly branched cascade polymers easily accessible via a one-pot procedure from ABm-type monomers. A key property of hyperbranched polymers is their molecular architecture, which allows core-shell morphology to be manipulated for further specific applications in material and medical sciences. Since the discovery of hyperbranched polymer materials, an increasing number of reports have been published describing synthetic procedures and technological applications of such materials, but their physical properties have remained less studied until the last decade. In the present work, different esterified hyperbranched polyglycerols have been prepared starting from polyglycerol precursors in the presence of acetic acid, generating degrees of functionalization ranging from 0 to 94%. Thermal analysis of the obtained samples has been carried out by Differential Scanning Calorimetry (DSC). Dielectric Spectroscopy measurements have been analyzed by combining loss spectra deconvolution with the modulus formalism. In this regard, all acetylated polyglycerols exhibited a main relaxation related to the glass transition (α process) and two sub-glassy relaxations (β and γ processes) which vanish at high degrees of functionalization.
Proynov, Emil; Liu, Fenglai; Gan, Zhengting; Wang, Matthew; Kong, Jing
2015-01-01
We implement and compute the density functional nonadditive three-body dispersion interaction using a combination of the Tang-Karplus formalism and the exchange-dipole moment model of Becke and Johnson. The computation of the C9 dispersion coefficients is done in a non-empirical fashion. The obtained C9 values of a series of noble atom triplets agree well with highly accurate values in the literature. We also calculate the C9 values for a series of benzene trimers and find a good agreement with high-level ab initio values reported recently in the literature. For the question of damping of the three-body dispersion at short distances, we propose two damping schemes and optimize them based on the benzene trimer data and on analytic potentials of He3 and Ar3 trimers fitted to the results of high-level wavefunction theories available from the literature. Both damping schemes respond well to the optimization of two parameters. PMID:26328836
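For reference, the nonadditive three-body dispersion energy that the C9 coefficients enter is usually written in the Axilrod-Teller-Muto triple-dipole form, shown here as a sketch (the damping functions discussed in the paper would multiply this term):

$$ E^{(3)}_{ijk} \;=\; C_9^{ijk}\,\frac{1 + 3\cos\theta_i\cos\theta_j\cos\theta_k}{\bigl(R_{ij}R_{jk}R_{ki}\bigr)^{3}}, $$

where θ_i, θ_j, θ_k are the interior angles of the triangle formed by atoms i, j, k, and R_{ij}, R_{jk}, R_{ki} are its side lengths.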
Tuning the conductance of H2O@C60 by position of the encapsulated H2O
Zhu, Chengbo; Wang, Xiaolin
2015-01-01
The change of conductance of single-molecule junction in response to various external stimuli is the fundamental mechanism for the single-molecule electronic devices with multiple functionalities. We propose the concept that the conductance of molecular systems can be tuned from inside. The conductance is varied in C60 with encapsulated H2O, H2O@C60. The transport properties of the H2O@C60-based nanostructure sandwiched between electrodes are studied using first-principles calculations combined with the non-equilibrium Green’s function formalism. Our results show that the conductance of the H2O@C60 is sensitive to the position of the H2O and its dipole direction inside the cage with changes in conductance up to 20%. Our study paves a way for the H2O@C60 molecule to be a new platform for novel molecule-based electronics and sensors. PMID:26643873
Efficient implicit LES method for the simulation of turbulent cavitating flows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Egerer, Christian P., E-mail: christian.egerer@aer.mw.tum.de; Schmidt, Steffen J.; Hickel, Stefan
2016-07-01
We present a numerical method for efficient large-eddy simulation of compressible liquid flows with cavitation based on an implicit subgrid-scale model. Phase change and subgrid-scale interface structures are modeled by a homogeneous mixture model that assumes local thermodynamic equilibrium. Unlike previous approaches, emphasis is placed on operating on a small stencil (at most four cells). The truncation error of the discretization is designed to function as a physically consistent subgrid-scale model for turbulence. We formulate a sensor functional that detects shock waves or pseudo-phase boundaries within the homogeneous mixture model for localizing numerical dissipation. In smooth regions of the flow field, a formally non-dissipative central discretization scheme is used in combination with a regularization term to model the effect of unresolved subgrid scales. The new method is validated by computing standard single- and two-phase test-cases. Comparison of results for a turbulent cavitating mixing layer obtained with the new method demonstrates its suitability for the target applications.
Adsorption of CO2 on Fe-doped graphene nano-ribbons: Investigation of transport properties
NASA Astrophysics Data System (ADS)
Othman, W.; Fahed, M.; Hatim, S.; Sherazi, A.; Berdiyorov, G.; Tit, N.
2017-07-01
Density functional theory combined with the non-equilibrium Green’s function formalism is used to study the conductance response of Fe-doped graphene nano-ribbons (GNRs) to CO2 gas adsorption. A single Fe atom is either adsorbed on the GNR surface (aFe-graphene) or substituted for a carbon atom (sFe-graphene). Metal atom doping reduces the electronic transmission of pristine graphene due to the localization of electronic states near the impurity site. Moreover, the aFe-graphene is found to be less sensitive to CO2 molecule attachment than the sFe-graphene system. These behaviours are confirmed by calculating the I-V characteristics, from which both the surface resistance and its sensitivity to the gas are estimated. Since the change in conductivity is one of the main outputs of sensors, our findings will be useful in developing efficient graphene-based solid-state gas sensors.
Carbon-doping-induced negative differential resistance in armchair phosphorene nanoribbons
NASA Astrophysics Data System (ADS)
Guo, Caixia; Xia, Congxin; Wang, Tianxing; Liu, Yufang
2017-03-01
By using a combined method of density functional theory and non-equilibrium Green’s function formalism, we investigate the electronic transport properties of carbon-doped armchair phosphorene nanoribbons (APNRs). The results show that C atom doping can strongly affect the electronic transport properties of the APNR and change it from a semiconductor to a metal. Meanwhile, obvious negative differential resistance (NDR) behaviors are obtained by tuning the doping position and concentration. In particular, as the doping concentration is reduced, the NDR peak position can shift into the mV bias range. These results provide theoretical support for designing related nanodevices by tuning the doping position and concentration in APNRs. Project supported by the National Natural Science Foundation of China (No. 11274096), the University Science and Technology Innovation Team Support Project of Henan Province (No. 13IRTSTHN016), and the University Key Science Research Project of Henan Province (No. 16A140043). The calculations in this work were supported by the High Performance Computing Center of Henan Normal University.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Proynov, Emil; Wang, Matthew; Kong, Jing, E-mail: jing.kong@mtsu.edu
We implement and compute the density functional nonadditive three-body dispersion interaction using a combination of the Tang-Karplus formalism and the exchange-dipole moment model of Becke and Johnson. The computation of the C9 dispersion coefficients is done in a non-empirical fashion. The obtained C9 values of a series of noble atom triplets agree well with highly accurate values in the literature. We also calculate the C9 values for a series of benzene trimers and find a good agreement with high-level ab initio values reported recently in the literature. For the question of damping of the three-body dispersion at short distances, we propose two damping schemes and optimize them based on the benzene trimer data and on analytic potentials of He3 and Ar3 trimers fitted to the results of high-level wavefunction theories available from the literature. Both damping schemes respond well to the optimization of two parameters.
Theoretical characterisation of point defects on a MoS2 monolayer by scanning tunnelling microscopy.
González, C; Biel, B; Dappe, Y J
2016-03-11
Different S and Mo vacancies as well as their corresponding antisite defects in a free-standing MoS2 monolayer are analysed by means of scanning tunnelling microscopy (STM) simulations. Our theoretical methodology, based on the Keldysh nonequilibrium Green function formalism within the density functional theory (DFT) approach, is applied to simulate STM images for different voltages and tip heights. Combining the geometrical and electronic effects, all features of the different STM images can be explained, providing a valuable guide for future experiments. Our results confirm previous reports on S atom imaging, but also reveal a strong dependence on the applied bias for vacancies and antisite defects that include extra S atoms. By contrast, when additional Mo atoms cover the S vacancies, the MoS2 gap vanishes and a bias-independent bright protrusion is obtained in the STM image. Finally, we show that the inclusion of these point defects promotes the emergence of reactive dangling bonds that may act as efficient adsorption sites for external adsorbates.
Dose rate calculations around 192Ir brachytherapy sources using a Sievert integration model
NASA Astrophysics Data System (ADS)
Karaiskos, P.; Angelopoulos, A.; Baras, P.; Rozaki-Mavrouli, H.; Sandilos, P.; Vlachos, L.; Sakelliou, L.
2000-02-01
The classical Sievert integral method is a valuable tool for dose rate calculations around brachytherapy sources, combining simplicity with reasonable computational times. However, its accuracy in predicting dose rate anisotropy around 192Ir brachytherapy sources has been repeatedly put into question. In this work, we used a primary and scatter separation technique to improve an existing modification of the Sievert integral (Williamson's isotropic scatter model) that determines dose rate anisotropy around commercially available 192Ir brachytherapy sources. The proposed Sievert formalism provides increased accuracy while maintaining the simplicity and computational time efficiency of the Sievert integral method. To describe transmission within the materials encountered, the formalism makes use of narrow beam attenuation coefficients which can be directly and easily calculated from the initially emitted 192Ir spectrum. The other numerical parameters required for its implementation, once calculated with the aid of our home-made Monte Carlo simulation code, can be used for any 192Ir source design. Calculations of dose rate and anisotropy functions with the proposed Sievert expression, around commonly used 192Ir high dose rate sources and other 192Ir elongated source designs, are in good agreement with corresponding accurate Monte Carlo results which have been reported by our group and other authors.
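The classical Sievert approach underlying the method can be sketched numerically: the active wire is divided into segments, and each segment's primary contribution falls off as the inverse square of the distance and is attenuated along the oblique path through the source filtration. The following Python sketch is a generic illustration with made-up dimensions and attenuation coefficient; it is not the authors' improved primary-and-scatter formalism.

```python
import numpy as np

def sievert_dose_rate(r, theta, L=0.35, t_filter=0.04, mu=4.0, n=200):
    """Relative primary dose rate at polar coordinates (r [cm], theta [rad])
    measured from the centre of a line source of active length L [cm],
    with filtration thickness t_filter [cm] and linear attenuation
    coefficient mu [1/cm] (all values illustrative).
    """
    # Calculation point in the plane containing the source axis (z-axis).
    x, z = r * np.sin(theta), r * np.cos(theta)
    zs = np.linspace(-L / 2, L / 2, n)          # source segments along the axis
    dx, dz = x - 0.0, z - zs
    d2 = dx**2 + dz**2                          # squared distance to each segment
    # Oblique path length through the filter for a ray leaving each segment:
    # t_filter divided by the sine of the angle between ray and source axis.
    sin_psi = np.clip(np.abs(dx) / np.sqrt(d2), 1e-6, None)
    path = t_filter / sin_psi
    contrib = np.exp(-mu * path) / d2           # attenuated inverse-square term
    return contrib.sum() / n

# Simple anisotropy profile: dose rate vs angle at fixed radius, normalised to 90 deg.
r = 2.0
ref = sievert_dose_rate(r, np.pi / 2)
for deg in (10, 30, 60, 90):
    val = sievert_dose_rate(r, np.radians(deg))
    print(f"theta = {deg:3d} deg   relative dose = {val / ref:.3f}")
```

The drop of the relative dose toward small polar angles in this toy model is the kind of anisotropy that, as the abstract notes, the plain Sievert method only predicts approximately for 192Ir.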
Property-Based Monitoring of Analog and Mixed-Signal Systems
NASA Astrophysics Data System (ADS)
Havlicek, John; Little, Scott; Maler, Oded; Nickovic, Dejan
In the recent past, there has been a steady growth of the market for consumer embedded devices such as cell phones, GPS and portable multimedia systems. In embedded systems, digital, analog and software components are combined on a single chip, resulting in increasingly complex designs that introduce richer functionality on smaller devices. As a consequence, the potential insertion of errors into a design becomes higher, yielding an increasing need for automated analog and mixed-signal validation tools. In the purely digital setting, formal verification based on properties expressed in industrial specification languages such as PSL and SVA is nowadays successfully integrated in the design flow. On the other hand, the validation of analog and mixed-signal systems still largely depends on simulation-based, ad-hoc methods. In this tutorial, we consider some ingredients of the standard verification methodology that can be successfully exported from digital to analog and mixed-signal setting, in particular property-based monitoring techniques. Property-based monitoring is a lighter approach to the formal verification, where the system is seen as a "black-box" that generates sets of traces, whose correctness is checked against a property, that is its high-level specification. Although incomplete, monitoring is effectively used to catch faults in systems, without guaranteeing their full correctness.
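As a flavour of what property-based monitoring of an analog trace looks like, here is a minimal Python sketch (illustrative only; industrial monitors express such properties in languages like PSL, SVA, or STL). It checks a bounded-settling property, "whenever the signal exceeds a threshold, it must drop back below within tau seconds", over a uniformly sampled trace.

```python
import numpy as np

def check_bounded_settling(t, x, threshold=1.0, tau=0.5):
    """Monitor: every excursion of x above `threshold` must end within `tau`.

    Returns (True, None) if the property holds on the trace,
    or (False, t_start) for the first violating excursion.
    """
    above_since = None
    for ti, xi in zip(t, x):
        if xi > threshold:
            if above_since is None:
                above_since = ti              # excursion starts
            elif ti - above_since > tau:
                return False, above_since     # stayed high too long
        else:
            above_since = None                # excursion ended in time
    return True, None

# Example trace: a decaying oscillation sampled at 1 kHz (made-up signal).
t = np.linspace(0.0, 3.0, 3001)
x = 1.6 * np.exp(-t) * np.abs(np.sin(6 * np.pi * t))
ok, when = check_bounded_settling(t, x, threshold=1.0, tau=0.2)
print("property satisfied" if ok else f"violated by excursion starting at t={when:.3f}s")
```

Treating the simulator or measurement setup as a black box that only delivers such traces is exactly the "lighter" verification stance described above.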
Guariguata, Leonor; de Beer, Ingrid; Hough, Rina; Mulongeni, Pancho; Feeley, Frank G.; Rinke de Wit, Tobias F.
2015-01-01
Introduction The burden of non-communicable diseases (NCDs) is growing in sub-Saharan Africa combined with an already high prevalence of infectious disease, like HIV. Engaging the formal employment sector may present a viable strategy for addressing both HIV and NCDs in people of working age. This study assesses the presence of three of the most significant threats to health in Namibia among employees in the formal sector: elevated blood pressure, elevated blood glucose, and HIV and assesses the knowledge and self-perceived risk of employees for these conditions. Methods A health and wellness screening survey of employees working in 13 industries in the formal sector of Namibia was conducted including 11,192 participants in the Bophelo! Project in Namibia, from January 2009 to October 2010. The survey combined a medical screening for HIV, blood glucose and blood pressure with an employee-completed survey on knowledge and risk behaviors for those conditions. We estimated the prevalence of the three conditions and compared to self-reported employee knowledge and risk behaviors and possible determinants. Results 25.8% of participants had elevated blood pressure, 8.3% of participants had an elevated random blood glucose measurement, and 8.9% of participants tested positive for HIV. Most participants were not smokers (80%), reported not drinking alcohol regularly (81.2%), and had regular condom use (66%). Most participants could not correctly identify risk factors for hypertension (57.2%), diabetes (57.3%), or high-risk behaviors for HIV infection (59.5%). In multivariate analysis, having insurance (OR:1.15, 95%CI: 1.03 – 1.28) and a managerial position (OR: 1.29, 95%CI: 1.13 – 1.47) were associated with better odds of knowledge of diabetes. Conclusion The prevalence of elevated blood pressure, elevated blood glucose, and HIV among employees of the Namibian formal sector is high, while risk awareness is low. Attention must be paid to improving the knowledge of health-related risk factors as well as providing care to those with chronic conditions in the formal sector through programs such as workplace wellness. PMID:26167926
Guariguata, Leonor; de Beer, Ingrid; Hough, Rina; Mulongeni, Pancho; Feeley, Frank G; Rinke de Wit, Tobias F
2015-01-01
The burden of non-communicable diseases (NCDs) is growing in sub-Saharan Africa combined with an already high prevalence of infectious disease, like HIV. Engaging the formal employment sector may present a viable strategy for addressing both HIV and NCDs in people of working age. This study assesses the presence of three of the most significant threats to health in Namibia among employees in the formal sector: elevated blood pressure, elevated blood glucose, and HIV and assesses the knowledge and self-perceived risk of employees for these conditions. A health and wellness screening survey of employees working in 13 industries in the formal sector of Namibia was conducted including 11,192 participants in the Bophelo! Project in Namibia, from January 2009 to October 2010. The survey combined a medical screening for HIV, blood glucose and blood pressure with an employee-completed survey on knowledge and risk behaviors for those conditions. We estimated the prevalence of the three conditions and compared to self-reported employee knowledge and risk behaviors and possible determinants. 25.8% of participants had elevated blood pressure, 8.3% of participants had an elevated random blood glucose measurement, and 8.9% of participants tested positive for HIV. Most participants were not smokers (80%), reported not drinking alcohol regularly (81.2%), and had regular condom use (66%). Most participants could not correctly identify risk factors for hypertension (57.2%), diabetes (57.3%), or high-risk behaviors for HIV infection (59.5%). In multivariate analysis, having insurance (OR:1.15, 95%CI: 1.03 - 1.28) and a managerial position (OR: 1.29, 95%CI: 1.13 - 1.47) were associated with better odds of knowledge of diabetes. The prevalence of elevated blood pressure, elevated blood glucose, and HIV among employees of the Namibian formal sector is high, while risk awareness is low. Attention must be paid to improving the knowledge of health-related risk factors as well as providing care to those with chronic conditions in the formal sector through programs such as workplace wellness.
Modeling Cyber Conflicts Using an Extended Petri Net Formalism
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zakrzewska, Anita N; Ferragut, Erik M
2011-01-01
When threatened by automated attacks, critical systems that require human-controlled responses have difficulty making optimal responses and adapting protections in real time and may therefore be overwhelmed. Consequently, experts have called for the development of automatic real-time reaction capabilities. However, a technical gap exists in the modeling and analysis of cyber conflicts to automatically understand the repercussions of responses. There is a need for modeling cyber assets that accounts for concurrent behavior, incomplete information, and payoff functions. We address this need by extending the Petri net formalism to allow real-time cyber conflicts to be modeled in a way that is expressive and concise. This formalism includes transitions controlled by players as well as firing rates attached to transitions. This allows us to model both player actions and factors that are beyond the control of players in real time. We show that our formalism is able to represent situational awareness, concurrent actions, incomplete information and objective functions. These factors make it well-suited to modeling cyber conflicts in a way that allows for useful analysis. MITRE has compiled the Common Attack Pattern Enumeration and Classification (CAPEC), an extensive list of cyber attacks at various levels of abstraction. CAPEC includes factors such as attack prerequisites, possible countermeasures, and attack goals. These elements are vital to understanding cyber attacks and to generating the corresponding real-time responses. We demonstrate that the formalism can be used to extract precise models of cyber attacks from CAPEC. Several case studies show that our Petri net formalism is more expressive than other models, such as attack graphs, for modeling cyber conflicts and that it is amenable to exploring cyber strategies.
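A bare-bones Python sketch of such an extended Petri net is given below. It is a toy illustration of the two ingredients named in the abstract, player-controlled transitions and rate-attached transitions; it is not the authors' formalism, and the attack and defence places are made up.

```python
import random

class ExtendedPetriNet:
    """Petri net with player-controlled transitions and rate-based transitions."""
    def __init__(self, marking):
        self.marking = dict(marking)          # place -> token count
        self.controlled = {}                  # name -> (player, inputs, outputs)
        self.timed = {}                       # name -> (rate, inputs, outputs)

    def add_controlled(self, name, player, inputs, outputs):
        self.controlled[name] = (player, inputs, outputs)

    def add_timed(self, name, rate, inputs, outputs):
        self.timed[name] = (rate, inputs, outputs)

    def enabled(self, inputs):
        return all(self.marking.get(p, 0) >= n for p, n in inputs.items())

    def fire(self, inputs, outputs):
        for p, n in inputs.items():
            self.marking[p] -= n
        for p, n in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + n

    def step(self, player_choice=None):
        """One step: a player action if chosen and enabled, otherwise a random
        enabled timed transition sampled proportionally to its firing rate."""
        if player_choice in self.controlled:
            _, ins, outs = self.controlled[player_choice]
            if self.enabled(ins):
                self.fire(ins, outs)
                return player_choice
        candidates = [(n, r) for n, (r, ins, _) in self.timed.items() if self.enabled(ins)]
        if not candidates:
            return None
        names, rates = zip(*candidates)
        chosen = random.choices(names, weights=rates)[0]
        _, ins, outs = self.timed[chosen]
        self.fire(ins, outs)
        return chosen

# Toy cyber-conflict fragment: an exploit fires at some rate while the
# defender may patch (a controlled transition) to remove the vulnerability.
net = ExtendedPetriNet({"vulnerable": 1, "patched": 0, "compromised": 0})
net.add_timed("exploit", rate=0.3, inputs={"vulnerable": 1}, outputs={"compromised": 1})
net.add_controlled("patch", player="defender",
                   inputs={"vulnerable": 1}, outputs={"patched": 1})
print(net.step(player_choice="patch"), net.marking)
```

The separation between controlled and rate-based transitions is what lets a model like this capture both deliberate responses and events outside the players' control.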
Graph-based linear scaling electronic structure theory.
Niklasson, Anders M N; Mniszewski, Susan M; Negre, Christian F A; Cawkwell, Marc J; Swart, Pieter J; Mohd-Yusof, Jamal; Germann, Timothy C; Wall, Michael E; Bock, Nicolas; Rubensson, Emanuel H; Djidjev, Hristo
2016-06-21
We show how graph theory can be combined with quantum theory to calculate the electronic structure of large complex systems. The graph formalism is general and applicable to a broad range of electronic structure methods and materials, including challenging systems such as biomolecules. The methodology combines well-controlled accuracy, low computational cost, and natural low-communication parallelism. This combination addresses substantial shortcomings of linear scaling electronic structure theory, in particular with respect to quantum-based molecular dynamics simulations.
Graph-based linear scaling electronic structure theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Niklasson, Anders M. N., E-mail: amn@lanl.gov; Negre, Christian F. A.; Cawkwell, Marc J.
2016-06-21
We show how graph theory can be combined with quantum theory to calculate the electronic structure of large complex systems. The graph formalism is general and applicable to a broad range of electronic structure methods and materials, including challenging systems such as biomolecules. The methodology combines well-controlled accuracy, low computational cost, and natural low-communication parallelism. This combination addresses substantial shortcomings of linear scaling electronic structure theory, in particular with respect to quantum-based molecular dynamics simulations.
NASA Technical Reports Server (NTRS)
Hahne, G. E.
1991-01-01
A formal theory of the scattering of time-harmonic acoustic scalar waves from impenetrable, immobile obstacles is established. The time-independent formal scattering theory of nonrelativistic quantum mechanics, in particular the theory of the complete Green's function and the transition (T) operator, provides the model. The quantum-mechanical approach is modified to allow the treatment of acoustic-wave scattering with imposed boundary conditions of impedance type on the surface (delta-Omega) of an impenetrable obstacle. With k0 as the free-space wavenumber of the signal, a simplified expression is obtained for the k0-dependent T operator for a general case of homogeneous impedance boundary conditions for the acoustic wave on delta-Omega. All the nonelementary operators entering the expression for the T operator are formally simple rational algebraic functions of a certain invertible linear radiation impedance operator which maps any sufficiently well-behaved complex-valued function on delta-Omega into another such function on delta-Omega. In the subsequent study, the short-wavelength and the long-wavelength behavior of the radiation impedance operator and its inverse (the 'radiation admittance' operator) as two-point kernels on a smooth delta-Omega are studied for pairs of points that are close together.
Revised Thomas-Fermi approximation for singular potentials
NASA Astrophysics Data System (ADS)
Dufty, James W.; Trickey, S. B.
2016-08-01
Approximations for the many-fermion free-energy density functional that include the Thomas-Fermi (TF) form for the noninteracting part lead to singular densities for singular external potentials (e.g., attractive Coulomb). This limitation of the TF approximation is addressed here by a formal map of the exact Euler equation for the density onto an equivalent TF form characterized by a modified Kohn-Sham potential. It is shown to be a "regularized" version of the Kohn-Sham potential, tempered by convolution with a finite-temperature response function. The resulting density is nonsingular, with the equilibrium properties obtained from the total free-energy functional evaluated at this density. This new representation is formally exact. Approximate expressions for the regularized potential are given to leading order in a nonlocality parameter, and the limiting behavior at high and low temperatures is described. The noninteracting part of the free energy in this approximation is the usual Thomas-Fermi functional. These results generalize and extend to finite temperatures the ground-state regularization by R. G. Parr and S. Ghosh [Proc. Natl. Acad. Sci. U.S.A. 83, 3577 (1986), 10.1073/pnas.83.11.3577] and by L. R. Pratt, G. G. Hoffman, and R. A. Harris [J. Chem. Phys. 88, 1818 (1988), 10.1063/1.454105] and formally systematize the finite-temperature regularization given by the latter authors.
Scalar formalism for non-Abelian gauge theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hostler, L.C.
1986-09-01
The gauge field theory of an N-dimensional multiplet of spin-1/2 particles is investigated using the Klein-Gordon-type wave equation [Π·(1+iσ)·Π + m²]Φ = 0, with Π_μ ≡ ∂/∂(ix_μ) − eA_μ, investigated before by a number of authors, to describe the fermions. Here Φ is a 2 x 1 Pauli spinor, and σ represents a Lorentz spin tensor whose components σ_μν are ordinary 2 x 2 Pauli spin matrices. Feynman rules for the scalar formalism for non-Abelian gauge theory are derived starting from the conventional field theory of the multiplet and converting it to the new description. The equivalence of the new and the old formalism for arbitrary radiative processes is thereby established. The conversion to the scalar formalism is accomplished in a novel way by working in terms of the path integral representation of the generating functional of the vacuum tau-functions, τ(2,1, … 3 …) ≡ ⟨0|T(Ψ_in(2) Ψ̄_in(1) … A_μ(3)_in … S)|0⟩, where Ψ_in is a Heisenberg operator belonging to a 4N x 1 Dirac wave function of the multiplet. The Feynman rules obtained generalize earlier results for the Abelian case of quantum electrodynamics.
Michaels, Thomas C T; Šarić, Anđela; Habchi, Johnny; Chia, Sean; Meisl, Georg; Vendruscolo, Michele; Dobson, Christopher M; Knowles, Tuomas P J
2018-04-20
Understanding how normally soluble peptides and proteins aggregate to form amyloid fibrils is central to many areas of modern biomolecular science, ranging from the development of functional biomaterials to the design of rational therapeutic strategies against increasingly prevalent medical conditions such as Alzheimer's and Parkinson's diseases. As such, there is a great need to develop models to mechanistically describe how amyloid fibrils are formed from precursor peptides and proteins. Here we review and discuss how ideas and concepts from chemical reaction kinetics can help to achieve this objective. In particular, we show how a combination of theory, experiments, and computer simulations, based on chemical kinetics, provides a general formalism for uncovering, at the molecular level, the mechanistic steps that underlie the phenomenon of amyloid fibril formation.
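As an illustration of the kind of kinetic description meant here, the following Python sketch integrates a minimal set of moment equations for the fibril number concentration P(t) and fibril mass concentration M(t), with primary nucleation, elongation, and monomer-dependent secondary nucleation. The rate constants and reaction orders are illustrative placeholders, not values from the article.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameters (arbitrary units).
m_tot = 5.0      # total monomer concentration
k_n   = 1e-4     # primary nucleation rate constant
k_plus = 5e2     # elongation rate constant
k_2   = 1e-3     # secondary nucleation rate constant
n_c, n_2 = 2, 2  # reaction orders of primary and secondary nucleation

def moments(t, y):
    P, M = y                        # fibril number and fibril mass concentrations
    m = max(m_tot - M, 0.0)         # free monomer (mass conservation)
    dP = k_n * m**n_c + k_2 * m**n_2 * M   # primary + secondary nucleation
    dM = 2.0 * k_plus * m * P              # growth by elongation at both fibril ends
    return [dP, dM]

sol = solve_ivp(moments, (0.0, 5.0), [0.0, 0.0], dense_output=True, rtol=1e-8)
for t in np.linspace(0.0, 5.0, 6):
    P, M = sol.sol(t)
    print(f"t = {t:.1f}   fibril mass fraction M/m_tot = {M / m_tot:.3f}")
```

Integrating these equations reproduces the characteristic sigmoidal growth of fibril mass, and fitting such rate laws to experimental aggregation curves is how the individual mechanistic steps are disentangled in practice.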
Velocity-gauge real-time TDDFT within a numerical atomic orbital basis set
Pemmaraju, C. D.; Vila, F. D.; Kas, J. J.; ...
2018-02-07
The interaction of laser fields with solid-state systems can be modeled efficiently within the velocity-gauge formalism of real-time time dependent density functional theory (RT-TDDFT). In this article, we discuss the implementation of the velocity-gauge RT-TDDFT equations for electron dynamics within a linear combination of atomic orbitals (LCAO) basis set framework. Numerical results obtained from our LCAO implementation, for the electronic response of periodic systems to both weak and intense laser fields, are compared to those obtained from established real-space grid and Full-Potential Linearized Augmented Planewave approaches. As a result, potential applications of the LCAO based scheme in the context of extreme ultra-violet and soft X-ray spectroscopies involving core-electronic excitations are discussed.
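For orientation, in the velocity gauge the laser field enters through a spatially uniform vector potential rather than a scalar potential, which is what makes the formalism compatible with periodic boundary conditions. A schematic single-particle Hamiltonian of the kind propagated in such calculations is shown below as a sketch in atomic units; implementation details such as nonlocal-pseudopotential corrections are not shown and may differ from the paper:

$$ \hat{H}(t) \;=\; \tfrac{1}{2}\Bigl(\hat{\mathbf{p}} + \tfrac{1}{c}\mathbf{A}(t)\Bigr)^{2} + \hat{v}_{\mathrm{KS}}[\rho](\mathbf{r},t), \qquad \mathbf{E}(t) \;=\; -\tfrac{1}{c}\,\partial_t \mathbf{A}(t). $$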
[Surgical tactics at "difficult" perforative duodenal ulcers].
Kolosovych, I V; Bezrodnyĭ, B H; Chemodanov, P V; Sysak, O M
2013-09-01
Bacteriological examination of abdominal exudate was performed in 264 patients with perforated duodenal ulcers, and the dynamics of peritonitis were studied in the postoperative period. It was found that, already one hour after perforation of a duodenal ulcer, pH-metry and bacteriological examination of the peritoneal contents show that optimal conditions for the progression of inflammatory and infectious processes have been created in the abdominal cavity. Therefore, the formal time elapsed since perforation cannot serve as an index of the degree of inflammation (bacterial contamination) of the peritoneum. Improved methods of duodenoplasty for giant perforated duodenal ulcers, and for ulcers combined with tubular duodenal stenosis, made it possible to avoid purulent-septic postoperative complications caused by suture failure and severe motor function disturbances.
NASA Astrophysics Data System (ADS)
Michaels, Thomas C. T.; Šarić, Anđela; Habchi, Johnny; Chia, Sean; Meisl, Georg; Vendruscolo, Michele; Dobson, Christopher M.; Knowles, Tuomas P. J.
2018-04-01
Understanding how normally soluble peptides and proteins aggregate to form amyloid fibrils is central to many areas of modern biomolecular science, ranging from the development of functional biomaterials to the design of rational therapeutic strategies against increasingly prevalent medical conditions such as Alzheimer's and Parkinson's diseases. As such, there is a great need to develop models to mechanistically describe how amyloid fibrils are formed from precursor peptides and proteins. Here we review and discuss how ideas and concepts from chemical reaction kinetics can help to achieve this objective. In particular, we show how a combination of theory, experiments, and computer simulations, based on chemical kinetics, provides a general formalism for uncovering, at the molecular level, the mechanistic steps that underlie the phenomenon of amyloid fibril formation.
Velocity-gauge real-time TDDFT within a numerical atomic orbital basis set
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pemmaraju, C. D.; Vila, F. D.; Kas, J. J.
The interaction of laser fields with solid-state systems can be modeled efficiently within the velocity-gauge formalism of real-time time dependent density functional theory (RT-TDDFT). In this article, we discuss the implementation of the velocity-gauge RT-TDDFT equations for electron dynamics within a linear combination of atomic orbitals (LCAO) basis set framework. Numerical results obtained from our LCAO implementation, for the electronic response of periodic systems to both weak and intense laser fields, are compared to those obtained from established real-space grid and Full-Potential Linearized Augmented Planewave approaches. As a result, potential applications of the LCAO based scheme in the context of extreme ultra-violet and soft X-ray spectroscopies involving core-electronic excitations are discussed.
Cylinders out of a top hat: counts-in-cells for projected densities
NASA Astrophysics Data System (ADS)
Uhlemann, Cora; Pichon, Christophe; Codis, Sandrine; L'Huillier, Benjamin; Kim, Juhan; Bernardeau, Francis; Park, Changbom; Prunet, Simon
2018-06-01
Large deviation statistics is implemented to predict the statistics of cosmic densities in cylinders applicable to photometric surveys. It yields analytical predictions, accurate to a few per cent, for the one-point probability distribution function (PDF) of densities in concentric or compensated cylinders, and also captures the density dependence of their angular clustering (cylinder bias). All predictions are found to be in excellent agreement with the cosmological simulation Horizon Run 4 in the quasi-linear regime where standard perturbation theory normally breaks down. These results are combined with a simple local bias model that relates dark matter and tracer densities in cylinders, and validated on simulated halo catalogues. This formalism can be used to probe cosmology with existing and upcoming photometric surveys like DES, Euclid or WFIRST containing billions of galaxies.
Combining Task Execution and Background Knowledge for the Verification of Medical Guidelines
NASA Astrophysics Data System (ADS)
Hommersom, Arjen; Groot, Perry; Lucas, Peter; Balser, Michael; Schmitt, Jonathan
The use of a medical guideline can be seen as the execution of computational tasks, sequentially or in parallel, in the face of patient data. It has been shown that many of such guidelines can be represented as a 'network of tasks', i.e., as a number of steps that have a specific function or goal. To investigate the quality of such guidelines we propose a formalization of criteria for good practice medicine a guideline should comply to. We use this theory in conjunction with medical background knowledge to verify the quality of a guideline dealing with diabetes mellitus type 2 using the interactive theorem prover KIV. Verification using task execution and background knowledge is a novel approach to quality checking of medical guidelines.
Nucleon Resonance Decay by the K0Σ+ Channel
NASA Astrophysics Data System (ADS)
Castelijns, R.; Bacelar, J.; Löhner, H.; Messchendorp, J. G. M.; Shende, S.
2006-06-01
At the tagged photon beam of the ELSA electron synchrotron at the University of Bonn in Germany the Crystal Barrel and TAPS photon spectrometers have been combined to provide a 4π detector for multi-neutral-particle final states from photonuclear reactions. In a series of experiments on single and multiple neutral meson emission we have concentrated on hyperon production off the proton, and in particular on the K0Σ+ channel. High-quality excitation functions, recoil polarizations, and angular distributions from the KΣ threshold up to 2.3 GeV c.m. energy were obtained. Particular care was taken to establish the cross section normalization. The experimental results are compared with predictions of a recent coupled-channels calculation within the K-matrix formalism by A. Usov and O. Scholten.
A hierarchical transition state search algorithm
NASA Astrophysics Data System (ADS)
del Campo, Jorge M.; Köster, Andreas M.
2008-07-01
A hierarchical transition state search algorithm is developed and its implementation in the density functional theory program deMon2k is described. This search algorithm combines the double ended saddle interpolation method with local uphill trust region optimization. A new formalism for the incorporation of the distance constraint in the saddle interpolation method is derived. The similarities between the constrained optimizations in the local trust region method and the saddle interpolation are highlighted. The saddle interpolation and local uphill trust region optimizations are validated on a test set of 28 representative reactions. The hierarchical transition state search algorithm is applied to an intramolecular Diels-Alder reaction with several internal rotors, which makes automatic transition state search rather challenging. The obtained reaction mechanism is discussed in the context of the experimentally observed product distribution.
Kaltsatou, Antonia C H; Kouidi, Evangelia I; Anifanti, Maria A; Douka, Stella I; Deligiannis, Asterios P
2014-02-01
To compare the effects of traditional dancing with formal exercise training in terms of functional and cardiovascular benefits and motivation in patients with chronic heart failure. Randomized controlled trial. Sports Medicine Laboratory. Fifty-one Greek male patients aged 67.1±5.5 years with chronic heart failure of New York Heart Association (NYHA) class II-III participated in an eight-month study. They were randomly assigned to either training with Greek traditional dances (group A, n=18), formal exercise training (group B, n=16) or a sedentary control group (group C, n=17). At entry and at the end of the study all patients underwent cardiopulmonary exercise testing, functional ability assessment and quality of life evaluations. The Intrinsic Motivation Inventory was also used to assess participants' subjective experience. After training, group A showed a 33.8% increase in peak oxygen consumption (19.5 vs. 26.1 ml/kg/min, p<0.05) and group B a 32.3% increase (19.5 vs. 25.8 ml/kg/min, p<0.05), increases in maximal treadmill tolerance of 48.5% (p<0.05) and 46.4% (p<0.05), and decreases in the slope of minute ventilation versus carbon dioxide output (VE/VCO2 slope) of 18% (p<0.05) and 19.5% (p<0.05), respectively. Trained patients showed significant improvement in quality of life indices. Intrinsic Motivation Inventory scores increased only in group A, by 26.2% (3.08 vs. 3.87, p<0.05). Exercise training with Greek traditional dances in chronic heart failure patients led to functional and cardiovascular benefits similar to formal exercise training, and to a higher level of motivation.
Ensemble Kalman filtering in presence of inequality constraints
NASA Astrophysics Data System (ADS)
van Leeuwen, P. J.
2009-04-01
Kalman filtering in the presence of constraints is an active area of research. Based on the Gaussian assumption for the probability-density functions, it appears hard to bring extra constraints into the formalism. On the other hand, in geophysical systems we often encounter constraints related to e.g. the underlying physics or chemistry, which are violated by the Gaussian assumption. For instance, concentrations are always non-negative, model layers have non-negative thickness, and sea-ice concentration is between 0 and 1. Several methods to bring inequality constraints into the Kalman-filter formalism have been proposed. One of them is probability density function (pdf) truncation, in which the Gaussian mass from the non-allowed part of the variables is distributed equally over the part of the pdf where the variables are allowed, as proposed by Shimada et al. 1998. However, a problem with this method is that the probability that e.g. the sea-ice concentration is zero, is zero! The new method proposed here does not have this drawback. It assumes that the probability-density function is a truncated Gaussian, but the truncated mass is not distributed equally over all allowed values of the variables; instead it is put into a delta distribution at the truncation point. This delta distribution can easily be handled within Bayes' theorem, leading to posterior probability-density functions that are also truncated Gaussians with delta distributions at the truncation location. In this way a much better representation of the system is obtained, while still keeping most of the benefits of the Kalman-filter formalism. The full Kalman-filter formalism is prohibitively expensive in large-scale systems, but efficient implementation is possible in ensemble variants of the Kalman filter. Applications to low-dimensional systems and large-scale systems will be discussed.
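As an illustration of the idea described above, the following minimal Python sketch (not from the original work; the scalar setting and variable names are assumptions) computes the constrained posterior implied by placing the truncated Gaussian mass in a delta distribution at the bound, for a single non-negative variable such as a concentration:

```python
import numpy as np
from scipy.stats import norm

def constrained_posterior(mu, sigma, lower=0.0):
    """Turn an unconstrained Gaussian analysis N(mu, sigma^2) into a
    truncated Gaussian plus a delta distribution at the lower bound.

    Returns the probability mass at the bound and the mean of the
    constrained distribution (illustrative scalar case only)."""
    # Mass of the Gaussian lying below the bound -> delta at the bound
    z = (lower - mu) / sigma
    p_bound = norm.cdf(z)
    # E[X * 1{X > lower}] for X ~ N(mu, sigma^2)
    mean_allowed = mu * (1.0 - norm.cdf(z)) + sigma * norm.pdf(z)
    # Constrained mean = (mass at bound) * bound + allowed-part contribution
    mean_constrained = p_bound * lower + mean_allowed
    return p_bound, mean_constrained

# Example: analysis mean slightly negative, as can happen for concentrations
p0, m = constrained_posterior(mu=-0.05, sigma=0.1)
print(f"P(x = 0) = {p0:.3f}, constrained mean = {m:.4f}")
```

Unlike equal redistribution of the truncated mass, this construction assigns a finite probability to the variable sitting exactly at the bound.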
Importance of Vibronic Effects in the UV-Vis Spectrum of the 7,7,8,8-Tetracyanoquinodimethane Anion.
Tapavicza, Enrico; Furche, Filipp; Sundholm, Dage
2016-10-11
We present a computational method for simulating vibronic absorption spectra in the ultraviolet-visible (UV-vis) range and apply it to the 7,7,8,8-tetracyanoquinodimethane anion (TCNQ - ), which has been used as a ligand in black absorbers. Gaussian broadening of vertical electronic excitation energies of TCNQ - from linear-response time-dependent density functional theory produces only one band, which is qualitatively incorrect. Thus, the harmonic vibrational modes of the two lowest doublet states were computed, and the vibronic UV-vis spectrum was simulated using the displaced harmonic oscillator approximation, the frequency-shifted harmonic oscillator approximation, and the full Duschinsky formalism. An efficient real-time generating function method was implemented to avoid the exponential complexity of conventional Franck-Condon approaches to vibronic spectra. The obtained UV-vis spectra for TCNQ - agree well with experiment; the Duschinsky rotation is found to have only a minor effect on the spectrum. Born-Oppenheimer molecular dynamics simulations combined with calculations of the electronic excitation energies for a large number of molecular structures were also used for simulating the UV-vis spectrum. The Born-Oppenheimer molecular dynamics simulations yield a broadening of the energetically lowest peak in the absorption spectrum, but additional vibrational bands present in the experimental and simulated quantum harmonic oscillator spectra are not observed in the molecular dynamics simulations. Our results underline the importance of vibronic effects for the UV-vis spectrum of TCNQ - , and they establish an efficient method for obtaining vibronic spectra using a combination of linear-response time-dependent density functional theory and a real-time generating function approach.
Introduction Part II: Formal Dynamics.
ERIC Educational Resources Information Center
Stephens, Suzanne
1979-01-01
In the current period of questioning of architectural values and directions, the implications of energy use on form must be confronted sooner or later. Efforts by various practitioners at combining art and technology are shown. (Author/MLF)
Caricato, Marco
2013-07-28
The calculation of vertical electronic transition energies of molecular systems in solution with accurate quantum mechanical methods requires the use of approximate and yet reliable models to describe the effect of the solvent on the electronic structure of the solute. The polarizable continuum model (PCM) of solvation represents a computationally efficient way to describe this effect, especially when combined with coupled cluster (CC) methods. Two formalisms are available to compute transition energies within the PCM framework: State-Specific (SS) and Linear-Response (LR). The former provides a more complete account of the solute-solvent polarization in the excited states, while the latter is computationally very efficient (i.e., comparable to gas phase) and transition properties are well defined. In this work, I review the theory for the two formalisms within CC theory with a focus on their computational requirements, and present the first implementation of the LR-PCM formalism with the coupled cluster singles and doubles method (CCSD). Transition energies computed with LR- and SS-CCSD-PCM are presented, as well as a comparison between solvation models in the LR approach. The numerical results show that the two formalisms provide different absolute values of transition energy, but similar relative solvatochromic shifts (from nonpolar to polar solvents). The LR formalism may then be used to explore the solvent effect on multiple states and evaluate transition probabilities, while the SS formalism may be used to refine the description of specific states and for the exploration of excited state potential energy surfaces of solvated systems.
Shivley, Chelsey B; Garry, Franklyn B; Kogan, Lori R; Grandin, Temple
2016-05-15
OBJECTIVE To explore the extent to which veterinary colleges and schools accredited by the AVMA Council on Education (COE) have incorporated specific courses related to animal welfare, behavior, and ethics. DESIGN Survey and curriculum review. SAMPLE All 49 AVMA COE-accredited veterinary colleges and schools (institutions). PROCEDURES The study consisted of 2 parts. In part 1, a survey regarding animal welfare, behavior, and ethics was emailed to the associate dean of academic affairs at all 49 AVMA COE-accredited institutions. In part 2, the curricula for the 30 AVMA COE-accredited institutions in the United States were reviewed for courses on animal behavior, ethics, and welfare. RESULTS Seventeen of 49 (35%) institutions responded to the survey of part 1, of which 10 offered a formal animal welfare course, 9 offered a formal animal behavior course, 8 offered a formal animal ethics course, and 5 offered a combined animal welfare, behavior, and ethics course. The frequency with which courses on animal welfare, behavior, and ethics were offered differed between international and US institutions. Review of the curricula for the 30 AVMA COE-accredited US institutions revealed that 6 offered a formal course on animal welfare, 22 offered a formal course on animal behavior, and 18 offered a formal course on animal ethics. CONCLUSIONS AND CLINICAL RELEVANCE Results suggested that AVMA COE-accredited institutions need to provide more formal education on animal welfare, behavior, and ethics so veterinarians can be advocates for animals and assist with behavioral challenges.
Approaches to Foster Transfer of Formal Principles: Which Route to Take?
Schalk, Lennart; Saalbach, Henrik; Stern, Elsbeth
2016-01-01
Enabling learners to transfer knowledge about formal principles to new problems is a major aim of science and mathematics education, which, however, is notoriously difficult to reach. Previous research advocates different approaches for introducing principles in order to foster the transfer of knowledge about formal principles. One approach suggests teaching a generic formalism of the principles. Another approach suggests presenting (at least) two concrete cases instantiating the principle. A third approach suggests presenting a generic formalism accompanied by a case. As yet, though, empirical results regarding the transfer potential of these approaches are mixed and difficult to integrate, as the three approaches have rarely been tested competitively. Furthermore, the approaches have been evaluated in relation to different control conditions, and they have been assessed using varying transfer measures. In the present experiment, we introduced undergraduates to the formal principles of propositional logic with the aim of systematically comparing the transfer potential of the different approaches in relation to each other and to a common control condition, using various learning and transfer tasks. Results indicate that all approaches supported successful learning and transfer of the principles, but also caused systematic differences in the magnitude of transfer. In particular, the combination of a generic formalism with a case was surprisingly unsuccessful, while learners who compared two cases outperformed the control condition. We discuss how the simultaneous assessment of the different approaches makes it possible to capture the underlying learning mechanisms more precisely and to advance theory on how these mechanisms contribute to transfer performance. PMID:26871902
A discriminatory function for prediction of protein-DNA interactions based on alpha shape modeling.
Zhou, Weiqiang; Yan, Hong
2010-10-15
Protein-DNA interactions are of significant importance in many biological processes. However, the underlying principle of the molecular recognition process is still largely unknown. As more high-resolution 3D structures of protein-DNA complexes become available, the surface characteristics of these complexes become an important research topic. In our work, we apply an alpha shape model to represent the surface structure of the protein-DNA complex and develop an interface-atom curvature-dependent conditional probability discriminatory function for the prediction of protein-DNA interaction. The interface-atom curvature-dependent formalism captures atomic interaction details better than the atomic distance-based method. The proposed method performs well in discriminating native structures from docking decoy sets, and outperforms the distance-dependent formalism in terms of the z-score. Computer experiment results show that the curvature-dependent formalism with optimal parameters can achieve a native z-score of -8.17 in discriminating the native structure from the highest surface-complementarity scored decoy set and a native z-score of -7.38 in discriminating the native structure from the lowest RMSD decoy set. The interface-atom curvature-dependent formalism can also be used to predict the apo versions of DNA-binding proteins. These results suggest that the interface-atom curvature-dependent formalism has good predictive capability for protein-DNA interactions. The code and data sets are available for download at http://www.hy8.com/bioinformatics.htm. Contact: kenandzhou@hotmail.com.
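For reference, the native z-score quoted above is conventionally the native structure's score expressed in standard deviations of the decoy-score distribution; a minimal sketch (array names and example values are hypothetical, not the paper's code):

```python
import numpy as np

def native_z_score(native_score, decoy_scores):
    """z-score of the native structure relative to a set of decoy scores.
    More negative values indicate better discrimination when lower
    (more favourable) scores are better."""
    decoy_scores = np.asarray(decoy_scores, dtype=float)
    return (native_score - decoy_scores.mean()) / decoy_scores.std(ddof=1)

# Hypothetical example: a strongly favourable native score versus 500 decoys
decoys = np.random.normal(loc=-80.0, scale=6.0, size=500)
print(f"native z-score: {native_z_score(-125.0, decoys):.2f}")
```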
ERIC Educational Resources Information Center
Haas, Lory E.
2011-01-01
Three main purposes provided the foundation for this study. The first purpose was to investigate academic achievement through analyses of data obtained through formal and informal assessments among kindergarten through eighth grade students who participated in a Head Start program, center-based care program, or home-based care prior to school…
Precision calculation of the lowest 1S resonance in e-H scattering. [electron-hydrogen scattering
NASA Technical Reports Server (NTRS)
Ho, Y. K.; Bhatia, A. K.; Temkin, A.
1977-01-01
The position and width of the lowest resonance in electron-hydrogen scattering have been calculated using a Hylleraas correlation function with up to 95 terms in the optical potential formalism. The results should be useful as calibration points for experimental electron scattering purposes. A formula relating the conventional (Breit-Wigner) width with the Feshbach formalism is derived.
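For context, the conventional (Breit-Wigner) description referred to above characterizes an isolated resonance by its position $E_r$ and width $\Gamma$; the textbook forms of the resonant phase shift and cross-section profile are (standard expressions quoted for orientation, not the specific relation derived in the paper):

```latex
\delta(E) = \delta_{\mathrm{bg}}(E) + \arctan\!\left(\frac{\Gamma/2}{E_r - E}\right),
\qquad
\sigma_{\mathrm{res}}(E) \propto \frac{(\Gamma/2)^2}{(E - E_r)^2 + (\Gamma/2)^2}.
```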
Symbolic Dynamics and Grammatical Complexity
NASA Astrophysics Data System (ADS)
Hao, Bai-Lin; Zheng, Wei-Mou
The following sections are included: * Formal Languages and Their Complexity * Formal Language * Chomsky Hierarchy of Grammatical Complexity * The L-System * Regular Language and Finite Automaton * Finite Automaton * Regular Language * Stefan Matrix as Transfer Function for Automaton * Beyond Regular Languages * Feigenbaum and Generalized Feigenbaum Limiting Sets * Even and Odd Fibonacci Sequences * Odd Maximal Primitive Prefixes and Kneading Map * Even Maximal Primitive Prefixes and Distinct Excluded Blocks * Summary of Results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vázquez-Báez, V.; Ramírez, C.
We present calculations towards obtaining wave functions of the universe for supersymmetric closed string tachyon cosmology. Supersymmetrization, in the superfield formalism, is performed by taking advantage of the time reparametrization invariance of the cosmological action and generalizing the transformations to include Grassmann variables. We calculate the corresponding Hamiltonian by means of the Dirac formalism, and make use of the superalgebra to find solutions to the Wheeler-DeWitt equations indirectly.
Terrestrial cross-calibrated assimilation of various datasources
NASA Astrophysics Data System (ADS)
Groß, André; Müller, Richard; Schömer, Elmar; Trentmann, Jörg
2014-05-01
We introduce a novel software tool, ANACLIM, for the efficient assimilation of multiple two-dimensional data sets using a variational approach. We consider a single objective function in two spatial coordinates with higher derivatives. This function measures the deviation of the input data from the target data set. By using the Euler-Lagrange formalism the minimization of this objective function can be transformed into a sparse system of linear equations, which can be efficiently solved by a conjugate gradient solver on a desktop workstation. The objective function allows for a series of physically-motivated constraints. The user can control the relative global weights, as well as the individual weight of each constraint on a per-grid-point level. The different constraints are realized as separate terms of the objective function: one similarity term for each input data set and two additional smoothness terms, penalizing high gradient and curvature values. ANACLIM is designed to combine similarity and smoothness operators easily and to allow different solvers to be chosen. We performed a series of benchmarks to calibrate and verify our solution. We use, for example, terrestrial stations of BSRN and GEBA for the solar incoming flux and AERONET stations for aerosol optical depth. First results show that, with our approach, the combination of these data sources yields a significant benefit over the individual input datasets. ANACLIM also includes a region-growing algorithm for the assimilation of ground-based data. The region-growing algorithm computes the maximum area around a station that represents the station data. The regions are grown under several constraints, such as the homogeneity of the area. The resulting dataset is then used within the assimilation process. Verification is performed by cross-validation. The method and validation results will be presented and discussed.
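The 1-D analogue of the scheme described above can be sketched as follows: the Euler-Lagrange conditions of a quadratic objective (weighted similarity terms plus gradient and curvature penalties) reduce to a sparse symmetric linear system, which a conjugate-gradient solver handles efficiently. This is an illustrative re-implementation under assumed weights and penalty coefficients, not ANACLIM code:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import cg

def assimilate_1d(datasets, weights, lam_grad=1.0, lam_curv=1.0):
    """Minimise sum_k w_k ||x - y_k||^2 + lam_grad ||D1 x||^2 + lam_curv ||D2 x||^2
    on a 1-D grid; the normal equations form a sparse SPD system solved by CG."""
    n = len(datasets[0])
    # First- and second-difference operators (discrete gradient / curvature)
    D1 = sp.diags([-1.0, 1.0], [0, 1], shape=(n - 1, n))
    D2 = sp.diags([1.0, -2.0, 1.0], [0, 1, 2], shape=(n - 2, n))
    A = sum(weights) * sp.identity(n) \
        + lam_grad * (D1.T @ D1) + lam_curv * (D2.T @ D2)
    b = sum(w * np.asarray(y, dtype=float) for w, y in zip(weights, datasets))
    x, info = cg(A, b, atol=1e-10)
    assert info == 0, "CG did not converge"
    return x

# Two noisy versions of the same smooth signal, combined with unequal weights
t = np.linspace(0, 1, 200)
truth = np.sin(2 * np.pi * t)
y1 = truth + 0.3 * np.random.randn(t.size)
y2 = truth + 0.1 * np.random.randn(t.size)
x = assimilate_1d([y1, y2], weights=[0.5, 2.0])
```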
Spin coefficients and gauge fixing in the Newman-Penrose formalism
NASA Astrophysics Data System (ADS)
Nerozzi, Andrea
2017-03-01
Since its introduction in 1962, the Newman-Penrose formalism has been widely used in analytical and numerical studies of Einstein's equations, like for example for the Teukolsky master equation, or as a powerful wave extraction tool in numerical relativity. Despite the many applications, Einstein's equations in the Newman-Penrose formalism appear complicated and not easily applicable to general studies of spacetimes, mainly because physical and gauge degrees of freedom are mixed in a nontrivial way. In this paper we approach the whole formalism with the goal of expressing the spin coefficients as functions of tetrad invariants once a particular tetrad is chosen. We show that it is possible to do so, and give for the first time a general recipe for the task, as well as an indication of the quantities and identities that are required.
Choudhary, Kamal; Zhang, Qin; Reid, Andrew C E; Chowdhury, Sugata; Van Nguyen, Nhan; Trautt, Zachary; Newrock, Marcus W; Congo, Faical Yannick; Tavazza, Francesca
2018-05-08
We perform high-throughput density functional theory (DFT) calculations for optoelectronic properties (electronic bandgap and frequency dependent dielectric function) using the OptB88vdW functional (OPT) and the Tran-Blaha modified Becke Johnson potential (MBJ). This data is distributed publicly through JARVIS-DFT database. We used this data to evaluate the differences between these two formalisms and quantify their accuracy, comparing to experimental data whenever applicable. At present, we have 17,805 OPT and 7,358 MBJ bandgaps and dielectric functions. MBJ is found to predict better bandgaps and dielectric functions than OPT, so it can be used to improve the well-known bandgap problem of DFT in a relatively inexpensive way. The peak positions in dielectric functions obtained with OPT and MBJ are in comparable agreement with experiments. The data is available on our websites http://www.ctcms.nist.gov/~knc6/JVASP.html and https://jarvis.nist.gov.
Li, Yongqiang; Reinhardt, Jan D; Gosney, James E; Zhang, Xia; Hu, Xiaorong; Chen, Sijing; Ding, Mingpu; Li, Jianan
2012-06-01
To characterize a spinal cord injury (SCI) population from the 2008 Sichuan earthquake in China; to evaluate functional outcomes of physical rehabilitation interventions; to assess potential determinants of rehabilitation effectiveness; and to assess medical complications and management outcomes. A total of 51 earthquake victims with SCI were enrolled and underwent rehabilitation programming. Functional rehabilitation outcomes included ambulation ability, wheelchair mobility and activities of daily living (ADL) assessed with the Modified Barthel Index at the beginning and end of rehabilitation. Effectiveness of rehabilitation and the effect of other predictors were evaluated by mixed effects regression. Outcomes of medical complication management were determined by comparing the incidence of the respective complications at the beginning and end of rehabilitation. Ambulation, wheelchair mobility and ADL were significantly improved with rehabilitation programming. Both earlier rescue and earlier onset of rehabilitation were significant positive predictors of rehabilitation effectiveness, whereas delayed onset of rehabilitation combined with prolonged time to rescue resulted in a lesser positive effect. Medical complications were managed effectively in 63% (pressure ulcers) to 85% (deep vein thrombosis) of patients during rehabilitation. Earthquake victims with SCI may achieve significantly improved functional outcomes through a formal, institution-based physical rehabilitation programme.
Learning-assisted theorem proving with millions of lemmas☆
Kaliszyk, Cezary; Urban, Josef
2015-01-01
Large formal mathematical libraries consist of millions of atomic inference steps that give rise to a corresponding number of proved statements (lemmas). Analogously to the informal mathematical practice, only a tiny fraction of such statements is named and re-used in later proofs by formal mathematicians. In this work, we suggest and implement criteria defining the estimated usefulness of the HOL Light lemmas for proving further theorems. We use these criteria to mine the large inference graph of the lemmas in the HOL Light and Flyspeck libraries, adding up to millions of the best lemmas to the pool of statements that can be re-used in later proofs. We show that in combination with learning-based relevance filtering, such methods significantly strengthen automated theorem proving of new conjectures over large formal mathematical libraries such as Flyspeck. PMID:26525678
Large deviations in the presence of cooperativity and slow dynamics
NASA Astrophysics Data System (ADS)
Whitelam, Stephen
2018-06-01
We study simple models of intermittency, involving switching between two states, within the dynamical large-deviation formalism. Singularities appear in the formalism when switching is cooperative or when its basic time scale diverges. In the first case the unbiased trajectory distribution undergoes a symmetry breaking, leading to a change in shape of the large-deviation rate function for a particular dynamical observable. In the second case the symmetry of the unbiased trajectory distribution remains unbroken. Comparison of these models suggests that singularities of the dynamical large-deviation formalism can signal the dynamical equivalent of an equilibrium phase transition but do not necessarily do so.
Le Deunff, Erwan; Malagoli, Philippe
2014-01-01
Background and Aims In spite of major breakthroughs in the last three decades in the identification of root nitrate uptake transporters in plants and the associated regulation of nitrate transport activities, a simplified and operational modelling approach for nitrate uptake is still lacking. This is due mainly to the difficulty in linking the various regulations of nitrate transport that act at different levels of time and on different spatial scales. Methods A cross-combination of a Flow–Force approach applied to nitrate influx isotherms and experimentally determined environmental and in planta regulation is used to model nitrate uptake in oilseed rape, Brassica napus. In contrast to ‘Enzyme–Substrate’ interpretations, a Flow–Force modelling approach considers the root as a single catalytic structure and does not infer hypothetical cellular processes among nitrate transporter activities across cellular layers in the mature roots. In addition, this approach accounts for the driving force on ion transport based on the gradient of electrochemical potential, which is more appropriate from a thermodynamic viewpoint. Key Results and Conclusions Use of a Flow–Force formalism on nitrate influx isotherms leads to the development of a new conceptual mechanistic basis to model N uptake more accurately by a winter oilseed rape crop under field conditions during the whole growth cycle. This forms the functional component of a proposed new structure–function mechanistic model of N uptake. PMID:24638820
Decision theory applied to image quality control in radiology.
Lessa, Patrícia S; Caous, Cristofer A; Arantes, Paula R; Amaro, Edson; de Souza, Fernando M Campello
2008-11-13
The present work applies decision theory to radiological image quality control (QC) in the diagnostic routine. The main problem addressed within this framework is whether to accept or reject a film lot from a radiology service. The probability of each decision, for a given set of variables, was obtained from the selected films. Based on the routine of a radiology service, a decision probability function was determined for each considered combination of characteristics related to film quality control. These characteristics were framed as a set of 8 possibilities, resulting in 256 possible decision rules. To define a general utility function for assessing decision risk, we used a single parameter called r. The payoffs chosen were the diagnostic result (correct/incorrect), cost (high/low) and patient satisfaction (yes/no), giving eight possible combinations. Depending on the value of r, the decision-making carries more or less risk. The utility function was evaluated in order to determine the probability of a decision, with input from patients' and administrators' opinions at a radiology service center. The model is a formal quantitative approach to decisions about medical image quality, providing an instrument to discriminate what is really necessary in order to accept or reject a film or a film lot. The method presented herein can help to assess the risk of an incorrect radiological diagnostic decision.
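To make the counting above concrete: with 8 possible observation combinations and a binary accept/reject choice for each, there are 2^8 = 256 decision rules. The sketch below uses entirely hypothetical probabilities and a hypothetical utility with risk parameter r; only the combinatorial structure follows the abstract:

```python
import itertools
import numpy as np

# 8 observation combinations: (diagnosis correct?, cost low?, patient satisfied?)
combos = list(itertools.product([True, False], repeat=3))

# Hypothetical probability of observing each combination (sums to 1)
p = np.random.dirichlet(np.ones(len(combos)))

def utility(accept, combo, r=0.5):
    """Hypothetical utility: reward correct/cheap/satisfying outcomes when the
    lot is accepted; rejection yields a fixed fallback value. r tunes risk."""
    correct, low_cost, satisfied = combo
    score = correct + low_cost + satisfied          # 0..3
    return (1 - np.exp(-r * score)) if accept else 0.2

# Each decision rule maps the 8 combos to accept (1) or reject (0): 2**8 = 256 rules
rules = list(itertools.product([0, 1], repeat=len(combos)))
expected = [sum(pi * utility(bool(a), c) for pi, a, c in zip(p, rule, combos))
            for rule in rules]
best = rules[int(np.argmax(expected))]
print(f"{len(rules)} rules evaluated; best rule: {best}")
```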
Atia, Jolene; McCloskey, Conor; Shmygol, Anatoly S.; Rand, David A.; van den Berg, Hugo A.; Blanks, Andrew M.
2016-01-01
Uterine smooth muscle cells remain quiescent throughout most of gestation, only generating spontaneous action potentials immediately prior to, and during, labor. This study presents a method that combines transcriptomics with biophysical recordings to characterise the conductance repertoire of these cells, the ‘conductance repertoire’ being the total complement of ion channels and transporters expressed by an electrically active cell. Transcriptomic analysis provides a set of potential electrogenic entities, of which the conductance repertoire is a subset. Each entity within the conductance repertoire was modeled independently and its gating parameter values were fixed using the available biophysical data. The only remaining free parameters were the surface densities for each entity. We characterise the space of combinations of surface densities (density vectors) consistent with experimentally observed membrane potential and calcium waveforms. This yields insights on the functional redundancy of the system as well as its behavioral versatility. Our approach couples high-throughput transcriptomic data with physiological behaviors in health and disease, and provides a formal method to link genotype to phenotype in excitable systems. We accurately predict current densities and chart functional redundancy. For example, we find that to evoke the observed voltage waveform, the BK channel is functionally redundant whereas hERG is essential. Furthermore, our analysis suggests that activation of calcium-activated chloride conductances by intracellular calcium release is the key factor underlying spontaneous depolarisations. PMID:27105427
Determination of the Time-Space Magnetic Correlation Functions in the Solar Wind
NASA Astrophysics Data System (ADS)
Weygand, J. M.; Matthaeus, W. H.; Kivelson, M.; Dasso, S.
2013-12-01
Magnetic field data from many different intervals and 7 different solar wind spacecraft are employed to estimate the scale-dependent time decorrelation function in the interplanetary magnetic field in both the slow and fast solar wind. This estimation requires correlations varying with both space and time lags. The two-point correlation function with no time lag is determined by correlating time series data from multiple spacecraft separated in space and, for complete coverage of length scales, relies on many intervals with different spacecraft spatial separations. In addition, we employ single-spacecraft time-lagged correlations and two-spacecraft time-lagged correlations to access different spatial and temporal correlation data. Combining these data sets gives estimates of the scale-dependent time decorrelation function, which in principle tells us how rapidly time decorrelation occurs at a given wavelength. For static fields the scale-dependent time decorrelation function is trivially unity, but in turbulence the nonlinear cascade process induces time decorrelation at a given length scale that occurs more rapidly with decreasing scale. The scale-dependent time decorrelation function is valuable input to theories as well as various applications such as scattering, transport, and the study of predictability. It is also a fundamental element of formal turbulence theory. Our results are an extension of the Eulerian correlation functions estimated in Matthaeus et al. [2010] and Weygand et al. [2012; 2013].
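A minimal illustration of the time-lagged, two-signal correlation estimate described above (the synthetic signals and normalization convention are assumptions, not the authors' processing chain):

```python
import numpy as np

def lagged_correlation(b1, b2, lag):
    """Normalized cross-correlation of two magnetic-field fluctuation time
    series at a given time lag (in samples); b1 and b2 are 1-D arrays."""
    b1 = b1 - b1.mean()
    b2 = b2 - b2.mean()
    if lag >= 0:
        x, y = b1[:len(b1) - lag], b2[lag:]
    else:
        x, y = b1[-lag:], b2[:len(b2) + lag]
    return np.mean(x * y) / (b1.std() * b2.std())

# Synthetic example: two noisy copies of the same red-noise-like signal, one delayed
n, delay = 4096, 30
s = np.cumsum(np.random.randn(n + delay))
b1 = s[:n] + 0.5 * np.random.randn(n)
b2 = s[delay:delay + n] + 0.5 * np.random.randn(n)
lags = range(-100, 101)
corr = [lagged_correlation(b1, b2, L) for L in lags]
print("peak correlation at lag", lags[int(np.argmax(corr))])
```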
Cost implications of organizing nursing home workforce in teams.
Mukamel, Dana B; Cai, Shubing; Temkin-Greener, Helena
2009-08-01
To estimate the costs associated with formal and self-managed daily practice teams in nursing homes. Medicaid cost reports for 135 nursing homes in New York State in 2006 and survey data for 6,137 direct care workers. A retrospective statistical analysis: we estimated hybrid cost functions that include team penetration variables. Inference was based on robust standard errors. Formal and self-managed team penetration (i.e., percent of staff working in a team) were calculated from survey responses. Annual variable costs, beds, case mix-adjusted days, admissions, home care visits, outpatient clinic visits, day care days, wages, and ownership were calculated from the cost reports. Formal team penetration was significantly associated with costs, while self-managed team penetration was not. Costs declined with increasing penetration up to 13 percent formal-team penetration, and increased above this level. Formal teams in nursing homes in the upward-sloping range of the curve were more diverse, with a larger number of participating disciplines, and were more likely to include physicians. Organization of the workforce in formal teams may offer nursing homes a cost-saving strategy. More research is required to understand the relationship between team composition and costs.
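The U-shaped relationship reported above (costs falling up to roughly 13 percent formal-team penetration and rising beyond it) is the behaviour of a cost function with linear and quadratic penetration terms; a schematic regression of that form is sketched below with simulated data and assumed variable names, not the study's Medicaid cost reports:

```python
import numpy as np

# Simulated nursing homes: penetration share and log variable cost
rng = np.random.default_rng(0)
pen = rng.uniform(0, 0.4, 135)                       # formal-team penetration
log_cost = 10 - 1.2 * pen + 4.6 * pen**2 + 0.05 * rng.standard_normal(135)

# Hybrid cost function with penetration and penetration^2 (other regressors omitted)
X = np.column_stack([np.ones_like(pen), pen, pen**2])
beta, *_ = np.linalg.lstsq(X, log_cost, rcond=None)
turning_point = -beta[1] / (2 * beta[2])             # minimum of the quadratic
print(f"estimated cost-minimising penetration: {turning_point:.1%}")
```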
Invention as a combinatorial process: evidence from US patents
Youn, Hyejin; Strumsky, Deborah; Bettencourt, Luis M. A.; Lobo, José
2015-01-01
Invention has been commonly conceptualized as a search over a space of combinatorial possibilities. Despite the existence of a rich literature, spanning a variety of disciplines, elaborating on the recombinant nature of invention, we lack a formal and quantitative characterization of the combinatorial process underpinning inventive activity. Here, we use US patent records dating from 1790 to 2010 to formally characterize invention as a combinatorial process. To do this, we treat patented inventions as carriers of technologies and avail ourselves of the elaborate system of technology codes used by the United States Patent and Trademark Office to classify the technologies responsible for an invention's novelty. We find that the combinatorial inventive process exhibits an invariant rate of ‘exploitation’ (refinements of existing combinations of technologies) and ‘exploration’ (the development of new technological combinations). This combinatorial dynamic contrasts sharply with the creation of new technological capabilities—the building blocks to be combined—that has significantly slowed down. We also find that, notwithstanding the very reduced rate at which new technologies are introduced, the generation of novel technological combinations engenders a practically infinite space of technological configurations. PMID:25904530
An Integrated Environment for Efficient Formal Design and Verification
NASA Technical Reports Server (NTRS)
1998-01-01
The general goal of this project was to improve the practicality of formal methods by combining techniques from model checking and theorem proving. At the time the project was proposed, the model checking and theorem proving communities were applying different tools to similar problems, but there was not much cross-fertilization. This project involved a group from SRI that had substantial experience in the development and application of theorem-proving technology, and a group at Stanford that specialized in model checking techniques. Now, over five years after the proposal was submitted, there are many research groups working on combining theorem-proving and model checking techniques, and much more communication between the model checking and theorem proving research communities. This project contributed significantly to this research trend. The research work under this project covered a variety of topics: new theory and algorithms; prototype tools; verification methodology; and applications to problems in particular domains.
Phase space quantum mechanics - Direct
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nasiri, S.; Sobouti, Y.; Taati, F.
2006-09-15
The conventional approach to quantum mechanics in phase space (q,p) is to take the operator-based quantum mechanics of Schroedinger, or an equivalent, and assign a c-number function in phase space to it. We propose to begin with a higher level of abstraction, in which the independence and the symmetric role of q and p is maintained throughout, and at once arrive at phase space state functions. Upon reduction to the q- or p-space the proposed formalism gives the conventional quantum mechanics, however, with a definite rule for ordering of factors of noncommuting observables. Further conceptual and practical merits of the formalism are demonstrated throughout the text.
Null tests of the standard model using the linear model formalism
NASA Astrophysics Data System (ADS)
Marra, Valerio; Sapone, Domenico
2018-04-01
We test both the Friedmann-Lemaître-Robertson-Walker geometry and ΛCDM cosmology in a model-independent way by reconstructing the Hubble function H(z), the comoving distance D(z), and the growth of structure fσ8(z) using the most recent data available. We use the linear model formalism in order to optimally reconstruct the above cosmological functions, together with their derivatives and integrals. We then evaluate four of the null tests available in the literature that probe both background and perturbation assumptions. For all the four tests, we find agreement, within the errors, with the standard cosmological model.
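One widely used example of such a null test is the curvature consistency relation, which combines the reconstructed H(z), D(z) and the derivative D'(z), and must be constant (equal to Ω_K) for any FLRW geometry; it is quoted here for orientation and is not necessarily one of the four specific tests evaluated in the paper:

```latex
\mathcal{O}_K(z) = \frac{\left[H(z)\,D'(z)\right]^2 - c^2}{\left[H_0\,D(z)\right]^2}
\;\stackrel{\mathrm{FLRW}}{=}\; \Omega_K \quad \text{for all } z,
```

with D(z) the transverse comoving distance; any detected z-dependence would falsify the FLRW assumption independently of the dark energy model.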
Formal optimization of hovering performance using free wake lifting surface theory
NASA Technical Reports Server (NTRS)
Chung, S. Y.
1986-01-01
Free wake techniques for performance prediction and optimization of hovering rotor are discussed. The influence functions due to vortex ring, vortex cylinder, and source or vortex sheets are presented. The vortex core sizes of rotor wake vortices are calculated and their importance is discussed. Lifting body theory for finite thickness body is developed for pressure calculation, and hence performance prediction of hovering rotors. Numerical optimization technique based on free wake lifting line theory is presented and discussed. It is demonstrated that formal optimization can be used with the implicit and nonlinear objective or cost function such as the performance of hovering rotors as used in this report.
Modeling non-locality of plasmonic excitations with a fictitious film
NASA Astrophysics Data System (ADS)
Kong, Jiantao; Shvonski, Alexander; Kempa, Krzysztof
Non-local effects, requiring a wavevector (q) dependent dielectric response are becoming increasingly important in studies of plasmonic and metamaterial structures. The phenomenological hydrodynamic approximation (HDA) is the simplest, and most often used model, but it often fails. We show that the d-function formalism, exact to first order in q, is a powerful and simple-to-use alternative. Recently, we developed a mapping of the d-function formalism into a purely local fictitious film. This geometric mapping allows for non-local extensions of any local calculation scheme, including FDTD. We demonstrate here, that such mapped FDTD simulation of metallic nanoclusters agrees very well with various experiments.
Nonperturbative functions for SIDIS and Drell-Yan processes
NASA Astrophysics Data System (ADS)
Sun, Peng; Isaacson, Joshua; Yuan, C.-P.; Yuan, Feng
2018-04-01
We update the well-known BLNY fit to low transverse momentum Drell-Yan lepton pair production in hadronic collisions, by considering the constraints from semi-inclusive hadron production in deep inelastic scattering (SIDIS) from the HERMES and COMPASS experiments. We follow the Collins-Soper-Sterman (CSS) formalism with the b∗-prescription. A nonperturbative form factor associated with the transverse momentum dependent quark distributions is found in the analysis, with a new functional form different from that of BLNY. This releases the tension between the BLNY fit to the Drell-Yan data and the SIDIS data from HERMES/COMPASS in the CSS resummation formalism.
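For orientation, the b*-prescription referred to above regularizes the large-b region of the impact-parameter-space cross section through the standard replacement below (a textbook form of the CSS construction; W_pert and S_NP are schematic labels, and the specific nonperturbative form factor fitted in the paper is not reproduced here):

```latex
b_* = \frac{b}{\sqrt{1 + b^2/b_{\max}^2}}, \qquad
\widetilde{W}(b, Q) = \widetilde{W}_{\mathrm{pert}}(b_*, Q)\, e^{-S_{\mathrm{NP}}(b, Q)},
```

so that the perturbative piece is always evaluated at b_* < b_max while all genuinely nonperturbative large-b behaviour is absorbed into the fitted form factor S_NP.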
NASA Technical Reports Server (NTRS)
Prescod-Weinstein, Chanda; Afshordi, Niayesh
2011-01-01
Structure formation provides a strong test of any cosmic acceleration model because a successful dark energy model must not inhibit or overpredict the development of observed large-scale structures. Traditional approaches to studies of structure formation in the presence of dark energy or modified gravity implement a modified Press-Schechter formalism, which relates the linear overdensities to the abundance of dark matter haloes at the same time. We critically examine the universality of the Press-Schechter formalism for different cosmologies, and show that the halo abundance is best correlated with the spherical linear overdensity at 94% of the collapse (or observation) time. We then extend this argument to ellipsoidal collapse (which decreases the fractional time of best correlation for small haloes), and show that our results agree with deviations from the modified Press-Schechter formalism seen in simulated mass functions. This provides a novel universal prescription to measure linear density evolution, based on current and future observations of the cluster (or dark matter) halo mass function. In particular, even observations of cluster abundance in a single epoch will constrain the entire history of the linear growth of cosmological perturbations.
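For reference, the standard Press-Schechter mass function that the modified formalisms above build on relates the halo abundance to the variance σ(M) of the linear density field and a collapse threshold δ_c; it is quoted here in its textbook form, before the modifications discussed in the paper:

```latex
\frac{dn}{d\ln M} = \sqrt{\frac{2}{\pi}}\,\frac{\bar{\rho}_m}{M}\,
\frac{\delta_c}{\sigma(M)}\left|\frac{d\ln\sigma}{d\ln M}\right|
\exp\!\left[-\frac{\delta_c^2}{2\,\sigma^2(M)}\right].
```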
Lorentz covariance of loop quantum gravity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rovelli, Carlo; Speziale, Simone
2011-05-15
The kinematics of loop gravity can be given a manifestly Lorentz-covariant formulation: the conventional SU(2)-spin-network Hilbert space can be mapped to a space K of SL(2,C) functions, where Lorentz covariance is manifest. K can be described in terms of a certain subset of the projected spin networks studied by Livine, Alexandrov and Dupuis. It is formed by SL(2,C) functions completely determined by their restriction on SU(2). These are square-integrable in the SU(2) scalar product, but not in the SL(2,C) one. Thus, SU(2)-spin-network states can be represented by Lorentz-covariant SL(2,C) functions, as two-component photons can be described in the Lorentz-covariant Gupta-Bleuler formalism. As shown by Wolfgang Wieland in a related paper, this manifestly Lorentz-covariant formulation can also be directly obtained from canonical quantization. We show that the spinfoam dynamics of loop quantum gravity is locally SL(2,C)-invariant in the bulk, and yields states that are precisely in K on the boundary. This clarifies how the SL(2,C) spinfoam formalism yields an SU(2) theory on the boundary. These structures define a tidy Lorentz-covariant formalism for loop gravity.
Who cares? A comparison of informal and formal care provision in Spain, England and the USA.
Solé-Auró, Aïda; Crimmins, Eileen M
2014-03-01
This paper investigates the prevalence of incapacity in performing daily activities and the associations between household composition and availability of family members and receipt of care among older adults with functioning problems in Spain, England and the United States of America (USA). We examine how living arrangements, marital status, child availability, limitations in functioning ability, age and gender affect the probability of receiving formal care and informal care from household members and from others in three countries with different family structures, living arrangements and policies supporting care of the incapacitated. Data sources include the 2006 Survey of Health, Ageing and Retirement in Europe for Spain, the third wave of the English Longitudinal Study of Ageing (2006), and the eighth wave of the USA Health and Retirement Study (2006). Logistic and multinomial logistic regressions are used to estimate the probability of receiving care and the sources of care among persons age 50 and older. The percentage of people with functional limitations receiving care is higher in Spain. More care comes from outside the household in the USA and England than in Spain. The use of formal care among the incapacitated is lowest in the USA and highest in Spain.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Helton, Jon C.; Brooks, Dusty Marie; Sallaberry, Cedric Jean-Marie.
Probability of loss of assured safety (PLOAS) is modeled for weak link (WL)/strong link (SL) systems in which one or more WLs or SLs could potentially degrade into a precursor condition to link failure that will be followed by an actual failure after some amount of elapsed time. The following topics are considered: (i) Definition of precursor occurrence time cumulative distribution functions (CDFs) for individual WLs and SLs, (ii) Formal representation of PLOAS with constant delay times, (iii) Approximation and illustration of PLOAS with constant delay times, (iv) Formal representation of PLOAS with aleatory uncertainty in delay times, (v) Approximation and illustration of PLOAS with aleatory uncertainty in delay times, (vi) Formal representation of PLOAS with delay times defined by functions of link properties at occurrence times for failure precursors, (vii) Approximation and illustration of PLOAS with delay times defined by functions of link properties at occurrence times for failure precursors, and (viii) Procedures for the verification of PLOAS calculations for the three indicated definitions of delayed link failure.
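A schematic Monte Carlo estimate of PLOAS for one WL/SL pair with constant delay times is sketched below, assuming (as an interpretation, not a quotation from the report) that loss of assured safety means the SL fails before the WL; precursor occurrence times are drawn from the supplied CDFs by inverse-CDF sampling:

```python
import numpy as np

def ploas_constant_delay(inv_cdf_wl, inv_cdf_sl, delay_wl, delay_sl,
                         n=200_000, seed=1):
    """Monte Carlo estimate of the probability that the strong link (SL)
    fails before the weak link (WL), with constant delays between precursor
    occurrence and actual failure; inv_cdf_* are quantile functions of the
    precursor occurrence times."""
    rng = np.random.default_rng(seed)
    t_fail_wl = inv_cdf_wl(rng.random(n)) + delay_wl
    t_fail_sl = inv_cdf_sl(rng.random(n)) + delay_sl
    return float(np.mean(t_fail_sl < t_fail_wl))

# Hypothetical example: exponential precursor occurrence times with different rates
inv_exp = lambda rate: (lambda u: -np.log(1.0 - u) / rate)
print(ploas_constant_delay(inv_exp(2.0), inv_exp(0.5), delay_wl=0.1, delay_sl=0.3))
```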
Ritzema, A M; Lach, L M; Nicholas, D; Sladeczek, I E
2018-03-01
Both child function and supports and services have been found to impact the well-being of parents of children with neurodevelopmental disorders (NDD). The relationship between function and services and the well-being of children with NDD is less well-understood and is important to clarify in order to effect program and service change. The current project assessed whether child function as well as the adequacy of formal supports and services provided to children and their families were predictive of child well-being. Well-being was assessed using a measure of quality of life developed for use with children with NDD. Data from 234 parents were analysed using structural equation modelling. Each predictor was found to load significantly on the overall outcome variable of well-being. Parent concerns about child function were significantly related to child well-being; parents who reported more concerns about their children's functioning reported lower levels of child well-being. Unmet needs for formal supports and services were also significantly related to child well-being; parents who reported that more of their children's and family's service needs were unmet reported lower child well-being. An indirect relationship was also found between child function and child well-being. When parents reported that their formal support needs were adequately met, their children's functional difficulties had a lower impact on parent perceptions of their children's overall well-being. Taken together, the results of the current study enrich our understanding of well-being for children with NDD. Discussion focuses on the service implications for children with NDD and their families. © 2017 John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Randler, Christoph; Kummer, Barbara; Wilhelm, Christian
2012-06-01
The aim of this study was to assess the outcome of a zoo visit in terms of learning and retention of knowledge concerning the adaptations and behavior of vertebrate species. The basis of the work was the concept of implementing zoo visits as an out-of-school setting for formal, curriculum-based learning. Our theoretical framework centers on self-determination theory; therefore, we used a group-based, hands-on learning environment. To address this question, we used a treatment-control design (BACI) with different treatments and a control group. Pre-, post- and retention tests were applied. All treatments led to a substantial increase in learning and retention of knowledge compared to the control group. Immediately after the zoo visit, the zoo-guide tour provided the highest scores, while after a delay of 6 weeks, the learner-centered environment combined with teacher-guided summarizing scored best. We suggest incorporating the zoo as an out-of-school environment into formal school learning, and we propose different methods to improve learning in zoo settings.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pederson, Mark R., E-mail: mark.pederson@science.doe.gov
2015-02-14
A recent modification of the Perdew-Zunger self-interaction-correction to the density-functional formalism has provided a framework for explicitly restoring unitary invariance to the expression for the total energy. The formalism depends upon construction of Löwdin orthonormalized Fermi-orbitals which parametrically depend on variational quasi-classical electronic positions. Derivatives of these quasi-classical electronic positions, required for efficient minimization of the self-interaction corrected energy, are derived and tested, here, on atoms. Total energies and ionization energies in closed-shell singlet atoms, where correlation is less important, using the Perdew-Wang 1992 Local Density Approximation (PW92) functional, are in good agreement with experiment and non-relativistic quantum-Monte-Carlo results, albeit slightly too low.
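The Löwdin (symmetric) orthonormalization step mentioned above can be illustrated compactly: given a set of Fermi-orbital coefficient vectors with overlap matrix S, the orthonormal set is obtained by applying S^{-1/2}. This is a generic numerical sketch, not the implementation described in the paper, and it assumes real orbitals expanded in an orthonormal basis:

```python
import numpy as np

def lowdin_orthonormalize(F):
    """Symmetric (Löwdin) orthonormalization of Fermi-orbital coefficient
    vectors stored as rows of F (assumed expanded in an orthonormal basis,
    so the overlap matrix is S = F F^T)."""
    S = F @ F.T
    w, U = np.linalg.eigh(S)                  # S is symmetric positive definite
    S_inv_sqrt = U @ np.diag(1.0 / np.sqrt(w)) @ U.T
    return S_inv_sqrt @ F                     # rows are now orthonormal

# Example: three linearly independent but non-orthogonal coefficient vectors
F = np.array([[1.0, 0.2, 0.0],
              [0.1, 1.0, 0.3],
              [0.0, 0.2, 1.0]])
phi = lowdin_orthonormalize(F)
print(np.allclose(phi @ phi.T, np.eye(3)))    # True: orthonormal rows
```

Among all orthonormalizations, the symmetric one minimally perturbs the original vectors, which is why it is the natural choice for Fermi orbitals.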
Tmd Factorization and Evolution for Tmd Correlation Functions
NASA Astrophysics Data System (ADS)
Mert Aybat, S.; Rogers, Ted C.
We discuss the application of transverse momentum dependent (TMD) factorization theorems to phenomenology. Our treatment relies on recent extensions of the Collins-Soper-Sterman (CSS) formalism. Emphasis is placed on the importance of using well-defined TMD parton distribution functions (PDFs) and fragmentation functions (FFs) in calculating the evolution of these objects. We explain how parametrizations of unpolarized TMDs can be obtained from currently existing fixed-scale Gaussian fits and previous implementations of the CSS formalism in the Drell-Yan process, and provide some examples. We also emphasize the importance of agreed-upon definitions for having an unambiguous prescription for calculating higher orders in the hard part, and provide examples of higher order calculations. We end with a discussion of strategies for extending the phenomenological applications of TMD factorization to situations beyond the unpolarized case.
Decidability of formal theories and hyperincursivity theory
NASA Astrophysics Data System (ADS)
Grappone, Arturo G.
2000-05-01
This paper shows the limits of the Proof Standard Theory (briefly, PST) and gives some ideas of how to build a proof anticipatory theory (briefly, PAT) that has no such limits. Also, this paper considers that Gödel's proof of the undecidability of Principia Mathematica formal theory is not valid for axiomatic theories that use a PAT to build their proofs because the (hyper)incursive functions are self-representable.
ERIC Educational Resources Information Center
Parkinson, Eric F.
2004-01-01
Construction kits have played a significant part in nurturing the growth and development of the minds and manipulation-based skills of children (and adults) in formal and non-formal education settings. These kits have origins rooted in the representation of the built world and now have a diversity of form and function, including technical versions…
On the Conformable Fractional Quantum Mechanics
NASA Astrophysics Data System (ADS)
Mozaffari, F. S.; Hassanabadi, H.; Sobhani, H.; Chung, W. S.
2018-05-01
In this paper, a conformable fractional quantum mechanics has been introduced using three postulates. Then, in such a formalism, the Schrödinger equation, probability density, probability flux and continuity equation have been derived. As an application of the considered formalism, a fractional-radial harmonic oscillator has been considered. After obtaining its wave function and energy spectrum, the effects of the conformable fractional parameter on some quantities have been investigated and plotted for different excited states.
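For context, the conformable fractional derivative that underlies such a formalism is, in the standard definition of Khalil and co-workers (presumably the one adopted by the postulates above), for 0 < α ≤ 1:

```latex
T_\alpha f(t) = \lim_{\varepsilon \to 0}
\frac{f\!\left(t + \varepsilon\, t^{1-\alpha}\right) - f(t)}{\varepsilon},
\qquad
T_\alpha f(t) = t^{1-\alpha}\,\frac{df}{dt}\quad \text{for differentiable } f,
```

so that the ordinary Schrödinger framework is recovered at α = 1.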
ERIC Educational Resources Information Center
Herrera, D.; Valencia, A. M.; Pennini, F.; Curilef, S.
2008-01-01
In this work, we review two formalisms of coherent states for the case of a particle in a magnetic field. We focus our revision on both pioneering (Feldman and Kahn 1970 "Phys. Rev." B 1 4584) and recent (Kowalski and Rembielinski 2005 "J. Phys. A: Math. Gen." 38 8247) formulations of coherent states for this problem. We introduce a general…
Rudolf, Klaus-Dieter; Kus, Sandra; Chung, Kevin C; Johnston, Marie; LeBlanc, Monique; Cieza, Alarcos
2012-01-01
A formal decision-making and consensus process was applied to develop the first version of the International Classification on Functioning, Disability and Health (ICF) Core Sets for Hand Conditions. To convene an international panel to develop the ICF Core Sets for Hand Conditions (HC), preparatory studies were conducted, which included an expert survey, a systematic literature review, a qualitative study and an empirical data collection process involving persons with hand conditions. A consensus conference was convened in Switzerland in May 2009 that was attended by 23 healthcare professionals, who treat hand conditions, representing 22 countries. The preparatory studies identified a set of 743 ICF categories at the second, third or fourth hierarchical level. Altogether, 117 chapter-, second-, or third-level categories were included in the comprehensive ICF Core Set for HC. The brief ICF Core Set for HC included a total of 23 chapter- and second-level categories. A formal consensus process integrating evidence and expert opinion based on the ICF led to the formal adoption of the ICF Core Sets for Hand Conditions. The next phase of this ICF project is to conduct a formal validation process to establish its applicability in clinical settings.
On verifying a high-level design. [cost and error analysis
NASA Technical Reports Server (NTRS)
Mathew, Ben; Wehbeh, Jalal A.; Saab, Daniel G.
1993-01-01
An overview of design verification techniques is presented, and some of the current research in high-level design verification is described. Formal hardware description languages that are capable of adequately expressing the design specifications have been developed, but some time will be required before they can have the expressive power needed to be used in real applications. Simulation-based approaches are more useful in finding errors in designs than they are in proving the correctness of a certain design. Hybrid approaches that combine simulation with other formal design verification techniques are argued to be the most promising over the short term.
Formalized description and construction of semantic dictionary of graphic-text spatial relationship
NASA Astrophysics Data System (ADS)
Sun, Yizhong; Xue, Xiaolei; Zhao, Xiaoqin
2008-10-01
Graphics and text are the two major elements used to exhibit the results of urban planning and land administration. In combination, they convey the complex relationships resulting from spatial analysis and decision-making. Accurately interpreting and representing these relationships are important steps towards an intelligent GIS for urban planning. This paper employs a concept hierarchy tree to formalize graphic-text relationships through a framework comprising a spatial object lexicon, a spatial relationship lexicon, a restriction lexicon, an applied pattern base, and a word segmentation rule base. The methodology is verified and shown to be effective on several urban planning archives.
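To make the five knowledge bases named above concrete, the following toy sketch shows one possible shape for them; every class, term, and rule here is a hypothetical stand-in, since the paper's actual schema is not reproduced in the abstract.

```python
# Illustrative only: hypothetical stand-ins for the spatial object lexicon,
# spatial relationship lexicon, restriction lexicon, applied pattern base,
# and word segmentation rule base described in the abstract.
from dataclasses import dataclass, field

@dataclass
class ConceptNode:                      # one node of a concept hierarchy tree
    term: str
    children: list = field(default_factory=list)

spatial_object_lexicon = ConceptNode("spatial object", [
    ConceptNode("parcel"), ConceptNode("road"), ConceptNode("green space")])

spatial_relation_lexicon = {"adjacent to", "within", "overlaps"}
restriction_lexicon = {"no closer than", "at least", "at most"}
applied_patterns = ["<object> <relation> <object> <restriction> <value>"]

def segment(sentence: str, vocab: set) -> list:
    """Toy segmentation rule: report which known vocabulary terms occur."""
    return [term for term in vocab if term in sentence]

print(segment("the parcel is adjacent to the road", spatial_relation_lexicon))
```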
Wagner, Edwin E
2008-07-01
I present a formal system that accounts for the misleading distinction between tests formerly termed objective and projective, duly noted by Meyer and Kurtz (2006). Three principles of Response Rightness, Response Latitude and Stimulus Ambiguity are shown to govern, in combination, the formal operating characteristics of tests, producing inevitable overlap between "objective" and "projective" tests and creating at least three "types" of tests historically regarded as being projective in nature. The system resolves many past issues regarding test classification and can be generalized to include all psychological tests.
NASA Astrophysics Data System (ADS)
Collins, J.; Gamberg, L.; Prokudin, A.; Rogers, T. C.; Sato, N.; Wang, B.
2016-08-01
We construct an improved implementation for combining transverse-momentum-dependent (TMD) factorization and collinear factorization. TMD factorization is suitable for low transverse momentum physics, while collinear factorization is suitable for high transverse momenta and for a cross section integrated over transverse momentum. The result is a modified version of the standard W +Y prescription traditionally used in the Collins-Soper-Sterman (CSS) formalism and related approaches. We further argue that questions regarding the shape and Q dependence of the cross sections at lower Q are largely governed by the matching to the Y term.
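For orientation, the decomposition being modified has the familiar schematic form below; the paper's precise matching prescription differs in detail and is not reproduced here.

```latex
% Schematic W+Y structure of the CSS formalism: W resums the low-q_T (TMD)
% region, Y restores the collinear fixed-order (FO) result at q_T ~ Q, with
% ASY the low-q_T asymptote of FO.
\frac{d\sigma}{dQ^2\,dy\,d^2\mathbf{q}_T} \;\simeq\; W(q_T,Q) + Y(q_T,Q),
\qquad
Y(q_T,Q) \;=\; \mathrm{FO}(q_T,Q) - \mathrm{ASY}(q_T,Q).
```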
Putz, Mihai V.
2009-01-01
The density matrix theory, the ancestor of density functional theory, provides the immediate framework for Path Integral (PI) development, allowing the canonical density to be extended to many-electronic systems through the density functional closure relationship. The use of the path integral formalism for prescribing the electronic density presents several advantages: it assures an inner quantum mechanical description of the system by parameterized paths; averages the quantum fluctuations; behaves as the propagator for the time-space evolution of quantum information; resembles the Schrödinger equation; and allows a quantum statistical description of the system through computation of the partition function. In this framework, four levels of path integral formalism are presented: the Feynman quantum mechanical, the semiclassical, the Feynman-Kleinert effective classical, and the Fokker-Planck non-equilibrium ones. In each case the density matrix and/or the canonical density are rigorously defined and presented. The practical specializations for quantum free and harmonic motions, for statistical high and low temperature limits, the smearing justification for Bohr's quantum stability postulate with the paradigmatic hydrogen atom excursion, along with the quantum chemical calculation of semiclassical electronegativity and hardness, of chemical action and Mulliken electronegativity, as well as the Markovian generalizations of Becke-Edgecombe electron localization functions, all advocate for the reliability of the PI formalism of quantum mechanics as a versatile framework, suited for analytical and/or computational modeling of a variety of fundamental physical and chemical reactivity concepts characterizing (density-driven) many-electronic systems. PMID:20087467
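As a reminder of the link the abstract builds on, the canonical density matrix of a single particle can be written as an imaginary-time path integral, with the diagonal giving the canonical density (standard notation, one dimension):

```latex
% Canonical density matrix as an imaginary-time path integral; the diagonal,
% normalized by the partition function Z, gives the canonical density n(x).
\rho(x_b,x_a;\beta) \;=\; \int_{x(0)=x_a}^{x(\hbar\beta)=x_b}
  \mathcal{D}[x(\tau)]\; e^{-S_E[x]/\hbar},
\qquad
S_E[x] \;=\; \int_0^{\hbar\beta}\! d\tau\,
  \Bigl[\tfrac{m}{2}\dot{x}^{2}+V(x)\Bigr],
\qquad
n(x) \;=\; \frac{\rho(x,x;\beta)}{Z},\quad Z=\int\! dx\,\rho(x,x;\beta).
```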
NASA Astrophysics Data System (ADS)
Stefanucci, G.; Pavlyukh, Y.; Uimonen, A.-M.; van Leeuwen, R.
2014-09-01
We present a diagrammatic approach to construct self-energy approximations within many-body perturbation theory with positive spectral properties. The method cures the problem of negative spectral functions which arises from a straightforward inclusion of vertex diagrams beyond the GW approximation. Our approach consists of a two-step procedure: We first express the approximate many-body self-energy as a product of half-diagrams and then identify the minimal number of half-diagrams to add in order to form a perfect square. The resulting self-energy is an unconventional sum of self-energy diagrams in which the internal lines of half a diagram are time-ordered Green's functions, whereas those of the other half are anti-time-ordered Green's functions, and the lines joining the two halves are either lesser or greater Green's functions. The theory is developed using noninteracting Green's functions and subsequently extended to self-consistent Green's functions. Issues related to the conserving properties of diagrammatic approximations with positive spectral functions are also addressed. As a major application of the formalism we derive the minimal set of additional diagrams to make positive the spectral function of the GW approximation with lowest-order vertex corrections and screened interactions. The method is then applied to vertex corrections in the three-dimensional homogeneous electron gas by using a combination of analytical frequency integrations and numerical Monte Carlo momentum integrations to evaluate the diagrams.
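Schematically (a paraphrase, not the paper's notation), positivity follows because the rate (cut) function is arranged as a perfect square of partial half-diagram amplitudes:

```latex
% Perfect-square organization of the rate function; the cross terms a_i a_j^*
% correspond to the mixed time-ordered/anti-time-ordered diagrams described
% in the abstract.
\Gamma(\omega) \;\propto\; \Bigl|\sum_i a_i(\omega)\Bigr|^{2}
  \;=\; \sum_{i,j} a_i(\omega)\,a_j^{*}(\omega) \;\ge\; 0 .
```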
Quantum Statistics of the Toda Oscillator in the Wigner Function Formalism
NASA Astrophysics Data System (ADS)
Vojta, Günter; Vojta, Matthias
Classical and quantum mechanical Toda systems (Toda molecules, Toda lattices, Toda quantum fields) have recently found growing interest as nonlinear systems showing solitons and chaos. In this paper the statistical thermodynamics of a system of quantum mechanical Toda oscillators characterized by a potential energy V(q) = V0 cosh q is treated within the Wigner function formalism (phase space formalism of quantum statistics). The partition function is given as a Wigner-Kirkwood series expansion in terms of powers of ℏ² (semiclassical expansion). The partition function and all thermodynamic functions are written, to good accuracy, as simple closed expressions containing only the modified Hankel functions of purely imaginary argument, K0 and K1, evaluated at V0/kT.
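For orientation, the leading (classical) term of such an expansion already shows where these Bessel-type functions enter; a minimal sketch for a single oscillator of mass m is:

```latex
% Classical (leading Wigner-Kirkwood) partition function for one Toda
% oscillator with V(q) = V_0 cosh q; higher orders bring in K_1 as well.
Z_{\mathrm{cl}} \;=\; \frac{1}{h}\int_{-\infty}^{\infty}\! dp\,
  e^{-p^{2}/2mk_{B}T}\int_{-\infty}^{\infty}\! dq\, e^{-V_{0}\cosh q/k_{B}T}
  \;=\; \frac{\sqrt{2\pi m k_{B}T}}{h}\; 2\,K_{0}\!\left(\frac{V_{0}}{k_{B}T}\right).
```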
Development of structured ICD-10 and its application to computer-assisted ICD coding.
Imai, Takeshi; Kajino, Masayuki; Sato, Megumi; Ohe, Kazuhiko
2010-01-01
This paper presents: (1) a framework for the formal representation of ICD-10, which functions as a bridge between ontological information and natural language expressions; and (2) a methodology for using the formally described ICD-10 for computer-assisted ICD coding. First, we analyzed and structured the meanings of categories in 15 chapters of ICD-10. Then we expanded the structured ICD-10 (S-ICD10) by adding subordinate concepts and labels derived from Japanese Standard Disease Names. The information model used to describe the formal representation was refined repeatedly; the resultant model includes 74 types of semantic links. We also developed an ICD coding module based on S-ICD10 and a 'Coding Principle,' which achieved high accuracy (>70%) for four chapters. These results not only demonstrate the basic feasibility of our coding framework but might also inform the development of the information model for the formal description framework in the ICD-11 revision.
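The paper's information model is not reproduced in the abstract, so the short sketch below is purely hypothetical: it only illustrates the general idea of a category carrying typed semantic links and of a coding step that scores a disease name against those links.

```python
# Hypothetical illustration of a "structured ICD-10"-style record; all link
# names and the scoring rule are invented for the sketch.
from dataclasses import dataclass, field

@dataclass
class StructuredCategory:
    code: str
    label: str
    links: dict = field(default_factory=dict)   # semantic link type -> value

j18_9 = StructuredCategory(
    code="J18.9",
    label="Pneumonia, unspecified organism",
    links={"hasPathology": "inflammation",
           "hasAnatomicalSite": "lung",
           "hasCausativeAgent": "unspecified"},
)

def match(disease_name_links: dict, category: StructuredCategory) -> float:
    """Toy coding step: fraction of the category's semantic links that the
    (already structured) disease name satisfies."""
    hits = sum(1 for k, v in category.links.items()
               if disease_name_links.get(k) == v)
    return hits / len(category.links)

print(match({"hasPathology": "inflammation", "hasAnatomicalSite": "lung"}, j18_9))
```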
Stopping power of dense plasmas: The collisional method and limitations of the dielectric formalism.
Clauser, C F; Arista, N R
2018-02-01
We present a study of the stopping power of plasmas using two main approaches: the collisional (scattering theory) and the dielectric formalisms. In the former case, we use a semiclassical method based on quantum scattering theory. In the latter case, we use the full description given by the extension of the Lindhard dielectric function for plasmas of all degeneracies. We compare these two theories and show that the dielectric formalism has limitations when it is used for slow heavy ions or atoms in dense plasmas. We present a study of these limitations and show the regimes where the dielectric formalism can be used, with appropriate corrections to include the usual quantum and classical limits. On the other hand, the semiclassical method shows the correct behavior for all plasma conditions and projectile velocity and charge. We consider different models for the ion charge distributions, including bare and dressed ions as well as neutral atoms.
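For reference, the linear-response (dielectric) expression whose range of validity the paper probes is the standard one for a point projectile of charge Z1e and velocity v (Gaussian units):

```latex
% Standard dielectric (linear-response) stopping power; the paper examines
% where this description fails for slow, heavy, or neutral projectiles in
% dense plasmas.
S(v) \;=\; \frac{2Z_1^{2}e^{2}}{\pi v^{2}}
  \int_0^{\infty}\!\frac{dk}{k}\int_0^{kv}\! d\omega\;
  \omega\,\mathrm{Im}\!\left[\frac{-1}{\varepsilon(k,\omega)}\right].
```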
Combining conceptual graphs and argumentation for aiding in the teleexpertise.
Doumbouya, Mamadou Bilo; Kamsu-Foguem, Bernard; Kenfack, Hugues; Foguem, Clovis
2015-08-01
Current medical information systems are too complex to be meaningfully exploited. Hence there is a need to develop new strategies for maximising the exploitation of medical data to the benefit of medical professionals. It is against this backdrop that we want to propose a tangible contribution by providing a tool which combines conceptual graphs and Dung's argumentation system in order to assist medical professionals in their decision making process. The proposed tool allows medical professionals to easily manipulate and visualise queries and answers for making decisions during the practice of teleexpertise. The knowledge modelling is made using an open application programming interface (API) called CoGui, which offers the means for building structured knowledge bases with the dedicated functionalities of graph-based reasoning via retrieved data from different institutions (hospitals, national security centre, and nursing homes). The tool that we have described in this study supports a formal traceable structure of the reasoning with acceptable arguments to elucidate some ethical problems that occur very often in the telemedicine domain. Copyright © 2015 Elsevier Ltd. All rights reserved.
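To give a flavor of the argumentation half of such a tool, the sketch below computes the grounded extension of a Dung framework by iterating the characteristic function; the arguments and attacks are invented for the example and do not come from the paper.

```python
# Minimal Dung argumentation framework: arguments, an attack relation, and the
# grounded extension obtained by iterating F(S) = {a : every attacker of a is
# attacked by S} from the empty set (least fixed point for a finite framework).
def grounded_extension(arguments, attacks):
    attackers = {a: {x for (x, y) in attacks if y == a} for a in arguments}
    s = set()
    while True:
        nxt = {a for a in arguments
               if all(any((d, b) in attacks for d in s) for b in attackers[a])}
        if nxt == s:
            return s
        s = nxt

args = {"prescribe", "contraindication", "allergy_ruled_out"}
atts = {("contraindication", "prescribe"),
        ("allergy_ruled_out", "contraindication")}
# Both "allergy_ruled_out" and "prescribe" end up accepted: the contraindication
# argument is defeated, so the prescription argument is defended.
print(grounded_extension(args, atts))
```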
Receipt of Caregiving and Fall Risk in US Community-dwelling Older Adults.
Hoffman, Geoffrey J; Hays, Ron D; Wallace, Steven P; Shapiro, Martin F; Yakusheva, Olga; Ettner, Susan L
2017-04-01
Falls and fall-related injuries (FRI) are common and costly occurrences among older adults living in the community, with increased risk for those with physical and cognitive limitations. Caregivers provide support for older adults with physical functioning limitations, which are associated with fall risk. Using the 2004-2012 waves of the Health and Retirement Study, we examined whether receipt of low (0-13 weekly hours) and high levels (≥14 weekly hours) of informal care or any formal care is associated with lower risk of falls and FRIs among community-dwelling older adults. We additionally tested whether serious physical functioning limitations (≥3 activities of daily living) or cognitive limitations moderated this relationship. Caregiving receipt categories were jointly significant in predicting noninjurious falls (P=0.03) but not FRIs (P=0.30). The high level of informal care category (P=0.001) and formal care (P<0.001) had stronger associations with reduced fall risk relative to low levels of informal care. Among individuals with ≥3 activity of daily living limitations, fall risks were reduced by 21% for those receiving high levels of informal care; additionally, FRIs were reduced by 42% and 58% for those receiving high levels of informal care and any formal care. High levels of informal care receipt were also associated with a 54% FRI risk reduction among the cognitively impaired. Fall risk reductions among older adults occurred predominantly among those with significant physical and cognitive limitations. Accordingly, policy efforts involving fall prevention should target populations with increased physical functioning limitations and cognitive limitations. They should also reduce financial barriers to informal and formal caregiving.
TRL - A FORMAL TEST REPRESENTATION LANGUAGE AND TOOL FOR FUNCTIONAL TEST DESIGNS
NASA Technical Reports Server (NTRS)
Hops, J. M.
1994-01-01
A Formal Test Representation Language and Tool for Functional Test Designs (TRL) is an automatic tool and a formal language that is used to implement the Category-Partition Method and produce the specification of test cases in the testing phase of software development. The Category-Partition Method is particularly useful in defining the inputs, outputs and purpose of the test design phase and combines the benefits of choosing normal cases with error exposing properties. Traceability can be maintained quite easily by creating a test design for each objective in the test plan. The effort to transform the test cases into procedures is simplified by using an automatic tool to create the cases based on the test design. The method allows the rapid elimination of undesired test cases from consideration, and easy review of test designs by peer groups. The first step in the category-partition method is functional decomposition, in which the specification and/or requirements are decomposed into functional units that can be tested independently. A secondary purpose of this step is to identify the parameters that affect the behavior of the system for each functional unit. The second step, category analysis, carries the work done in the previous step further by determining the properties or sub-properties of the parameters that would make the system behave in different ways. The designer should analyze the requirements to determine the features or categories of each parameter and how the system may behave if the category were to vary its value. If the parameter undergoing refinement is a data-item, then categories of this data-item may be any of its attributes, such as type, size, value, units, frequency of change, or source. After all the categories for the parameters of the functional unit have been determined, the next step is to partition each category's range space into mutually exclusive values that the category can assume. In choosing partition values, all possible kinds of values should be included, especially the ones that will maximize error detection. The purpose of the final step, partition constraint analysis, is to refine the test design specification so that only the technically effective and economically feasible test cases are implied. TRL is written in C-language to be machine independent. It has been successfully implemented on an IBM PC compatible running MS DOS, a Sun4 series computer running SunOS, an HP 9000/700 series workstation running HP-UX, a DECstation running DEC RISC ULTRIX, and a DEC VAX series computer running VMS. TRL requires 1Mb of disk space and a minimum of 84K of RAM. The documentation is available in electronic form in Word Perfect format. The standard distribution media for TRL is a 5.25 inch 360K MS-DOS format diskette. Alternate distribution media and formats are available upon request. TRL was developed in 1993 and is a copyrighted work with all copyright vested in NASA.
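As a concrete miniature of the category-partition steps described above (functional decomposition, category analysis, partitioning, constraint analysis), the sketch below builds test frames for a single hypothetical functional unit; the categories, choices, and constraint are invented for illustration and are not TRL syntax.

```python
# Toy category-partition example: categories and partition choices for one
# functional unit, the cross product of choices, and a constraint filter.
from itertools import product

categories = {
    "file_size":   ["empty", "small", "huge"],
    "permissions": ["readable", "unreadable"],
    "encoding":    ["ascii", "utf8", "corrupt"],
}

def violates_constraints(frame: dict) -> bool:
    # Example constraint from "partition constraint analysis":
    # an empty file cannot be corrupt.
    return frame["file_size"] == "empty" and frame["encoding"] == "corrupt"

names = list(categories)
frames = [dict(zip(names, combo)) for combo in product(*categories.values())]
test_frames = [f for f in frames if not violates_constraints(f)]
print(len(frames), "raw frames ->", len(test_frames), "after constraint analysis")
```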
Statistics of primordial density perturbations from discrete seed masses
NASA Technical Reports Server (NTRS)
Scherrer, Robert J.; Bertschinger, Edmund
1991-01-01
The statistics of density perturbations for general distributions of seed masses with arbitrary matter accretion is examined. Formal expressions for the power spectrum, the N-point correlation functions, and the density distribution function are derived. These results are applied to the case of uncorrelated seed masses, and power spectra are derived for accretion of both hot and cold dark matter plus baryons. The reduced moments (cumulants) of the density distribution are computed and used to obtain a series expansion for the density distribution function. Analytic results are obtained for the density distribution function in the case of a distribution of seed masses with a spherical top-hat accretion pattern. More generally, the formalism makes it possible to give a complete characterization of the statistical properties of any random field generated from a discrete linear superposition of kernels. In particular, the results can be applied to density fields derived by smoothing a discrete set of points with a window function.
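To make the closing statement concrete for the simplest case, if seeds are laid down at uncorrelated (Poisson-distributed) positions with mean number density n, the power spectrum is fixed by the Fourier transform of the accretion kernel (a standard result; mean subtraction and normalization conventions are suppressed here):

```latex
% Density field as a superposition of accretion kernels around discrete seeds,
% and its power spectrum for uncorrelated seed positions (average over the
% seed mass distribution).
\delta(\mathbf{x}) \;=\; \sum_i f(\mathbf{x}-\mathbf{x}_i;\,m_i),
\qquad
P(k) \;=\; n\,\bigl\langle\,|\tilde{f}(k;m)|^{2}\,\bigr\rangle_{m}.
```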
Decisional Conflict: Relationships Between and Among Family Context Variables in Cancer Survivors.
Lim, Jung-Won; Shon, En-Jung
2016-07-01
To investigate the relationships among life stress, family functioning, family coping, reliance on formal and informal resources, and decisional conflict in cancer survivors. Cross-sectional. Participants were recruited from the California Cancer Surveillance Program, hospital registries, and community agencies in southern California and Cleveland, Ohio. 243 European American, African American, Chinese American, and Korean American cancer survivors diagnosed with breast, colorectal, or prostate cancer. The merged data from an ethnically diverse cohort of cancer survivors participating in the two survey studies were used. Standardized measures were used to identify family context variables and decisional conflict. Life stress, family functioning, family coping, reliance on formal and informal resources, and decisional conflict. Structural equation modeling demonstrated that life stress was significantly associated with decisional conflict. Family functioning significantly mediated the impact of life stress on decisional conflict through family coping. Reliance on formal and informal resources moderated the relationships among the study variables. The role of the family context, which includes family functioning and coping, on decisional conflict is important in the adjustment process to make high-quality decisions in cancer survivorship care. Findings present nursing practice and research implications that highlight the need for efforts to encourage and support family involvement in the decision-making process and to enhance cancer survivors' adjustment process.
Discrimination and Mental Health–Related Service Use in a National Study of Asian Americans
Chen, Juan; Gee, Gilbert C.; Fabian, Cathryn G.; Takeuchi, David T.
2010-01-01
Objectives. We examined the association between perceived discrimination and use of mental health services among a national sample of Asian Americans. Methods. Our data came from the National Latino and Asian American Study, the first national survey of Asian Americans. Our sample included 600 Chinese, 508 Filipinos, 520 Vietnamese, and 467 other Asians (n=2095). We used logistic regression to examine the association between discrimination and formal and informal service use and the interactive effect of discrimination and English language proficiency. Results. Perceived discrimination was associated with more use of informal services, but not with less use of formal services. Additionally, higher levels of perceived discrimination combined with lower English proficiency were associated with more use of informal services. Conclusions. The effect of perceived discrimination and language proficiency on service use indicates a need for more bilingual services and more collaborations between formal service systems and community resources. PMID:20299649
WOMEN’S RELIGIOUS AUTHORITY IN A SUB-SAHARAN SETTING
AGADJANIAN, VICTOR
2015-01-01
Western scholarship on religion and gender has devoted considerable attention to women’s entry into leadership roles across various religious traditions and denominations. However, very little is known about the dynamics of women’s religious authority and leadership in developing settings, especially in sub-Saharan Africa, a region of powerful and diverse religious expressions. This study employs a combination of uniquely rich and diverse data to examine women’s formal religious authority in a predominantly Christian setting in Mozambique. I first use survey data to test hypotheses regarding the prevalence and patterns of women’s formal leadership across different denominational groups. I then support and extend the quantitative results with insights on pathways and consequences of women’s ascent to formal congregation authority drawn from qualitative data. The analysis illustrates how women’s religious authority both defies and reasserts the gendered constraints of the religious marketplace and the broader gender ideology in this developing context. PMID:27011432
Formal Analysis of Self-Efficacy in Job Interviewee’s Mental State Model
NASA Astrophysics Data System (ADS)
Ajoge, N. S.; Aziz, A. A.; Yusof, S. A. Mohd
2017-08-01
This paper presents a formal analysis approach for a self-efficacy model of an interviewee's mental state during a job interview session. Self-efficacy is a construct that has been hypothesised to combine with motivation and interviewee anxiety to define the state influence of interviewees. The conceptual model was built based on psychological theories and models related to self-efficacy. A number of well-known relations between events and the course of self-efficacy are summarized from the literature, and it is shown that the proposed model exhibits those patterns. In addition, this formal model has been mathematically analysed to find out which stable situations exist. Finally, it is pointed out how this model can be used in a software agent or robot-based platform. Such a platform can provide an interview coaching approach where support is provided to the user based on their individual mental state during interview sessions.
Hydride ions in oxide hosts hidden by hydroxide ions
Hayashi, Katsuro; Sushko, Peter V.; Hashimoto, Yasuhiro; Shluger, Alexander L.; Hosono, Hideo
2014-01-01
The true oxidation state of formally ‘H−’ ions incorporated in an oxide host is frequently discussed in connection with chemical shifts of 1H nuclear magnetic resonance spectroscopy, as they can exhibit values typically attributed to H+. Here we systematically investigate the link between geometrical structure and chemical shift of H− ions in an oxide host, mayenite, with a combination of experimental and ab initio approaches, in an attempt to resolve this issue. We demonstrate that the electron density near the hydrogen nucleus in an OH− ion (formally H+ state) exceeds that in an H− ion. This behaviour is the opposite to that expected from formal valences. We deduce a relationship between the chemical shift of H− and the distance from the H− ion to the coordinating electropositive cation. This relationship is pivotal for resolving H− species that are masked by various states of H+ ions. PMID:24662678
Computational logic: its origins and applications.
Paulson, Lawrence C
2018-02-01
Computational logic is the use of computers to establish facts in a logical formalism. Originating in nineteenth century attempts to understand the nature of mathematical reasoning, the subject now comprises a wide variety of formalisms, techniques and technologies. One strand of work follows the 'logic for computable functions (LCF) approach' pioneered by Robin Milner, where proofs can be constructed interactively or with the help of users' code (which does not compromise correctness). A refinement of LCF, called Isabelle, retains these advantages while providing flexibility in the choice of logical formalism and much stronger automation. The main application of these techniques has been to prove the correctness of hardware and software systems, but increasingly researchers have been applying them to mathematics itself.
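For flavor, here is a tiny machine-checked proof in Lean 4; it is offered only as an analogue of the interactive style the article describes, not as an example from Isabelle or the original LCF systems.

```lean
-- A minimal machine-checked statement: commutativity of natural-number
-- addition, discharged by a lemma from Lean's standard library.
theorem add_comm_example (a b : Nat) : a + b = b + a :=
  Nat.add_comm a b
```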
van Weert, Julia CM; de Haes, Hanneke CJM; Loos, Eugene F; Smets, Ellen MA
2015-01-01
Background: Older adults are increasingly using the Internet for health information; however, they are often not able to correctly recall Web-based information (eHealth information). Recall of information is crucial for optimal health outcomes, such as adequate disease management and adherence to medical regimes. Combining effective message strategies may help to improve recall of eHealth information among older adults. Presenting information in an audiovisual format using conversational narration style is expected to optimize recall of information compared to other combinations of modality and narration style. Objective: The aim of this paper is to investigate the effect of modality and narration style on recall of health information, and whether there are differences between younger and older adults. Methods: We conducted a Web-based experiment using a 2 (modality: written vs audiovisual information) by 2 (narration style: formal vs conversational style) between-subjects design (N=440). Age was assessed in the questionnaire and included as a factor: younger (<65 years) versus older (≥65 years) age. Participants were randomly assigned to one of four experimental webpages where information about lung cancer treatment was presented. A Web-based questionnaire assessed recall of eHealth information. Results: Audiovisual modality (vs written modality) was found to increase recall of information in both younger and older adults (P=.04). Although conversational narration style (vs formal narration style) did not increase recall of information (P=.17), a synergistic effect between modality and narration style was revealed: combining audiovisual information with conversational style outperformed combining written information with formal style (P=.01), as well as written information with conversational style (P=.045). This finding suggests that conversational style especially increases recall of information when presented audiovisually. This combination of modality and narration style improved recall of information among both younger and older adults. Conclusions: We conclude that combining audiovisual information with conversational style is the best way to present eHealth information to younger and older adults. Even though older adults did not proportionally recall more when audiovisual information was combined with conversational style than younger adults, this study reveals interesting implications for improving eHealth information that is effective for both younger and older adults. PMID:25910416
Kinetic Effects in Inertial Confinement Fusion
NASA Astrophysics Data System (ADS)
Kagan, Grigory
2014-10-01
Sharp background gradients, inevitably introduced during ICF implosion, are likely responsible for the discrepancy between the predictions of the standard single-fluid rad-hydro codes and the experimental observations. On the one hand, these gradients drive the inter-ion-species transport, so the fuel composition no longer remains constant, unlike what the single-fluid codes assume. On the other hand, once the background scale is comparable to the mean free path, a fluid description becomes invalid. This point takes on special significance in plasmas, where the particle's mean free path scales with the square of this particle's energy. The distribution function of energetic ions may therefore be far from Maxwellian, even if thermal ions are nearly equilibrated. Ironically, it is these energetic, or tail, ions that are supposed to fuse at the onset of ignition. A combination of studies has been conducted to clarify the role of such kinetic effects on ICF performance. First, transport formalism applicable to multi-component plasmas has been developed. In particular, a novel "electro-diffusion" mechanism of the ion species separation has been shown to exist. Equally important, in drastic contrast to the classical case of the neutral gas mixture, thermo-diffusion is predicted to be comparable to, or even much larger than, baro-diffusion. By employing the effective potential theory this formalism has then been generalized to the case of a moderately coupled plasma with multiple ion species, making it applicable to the problem of mix at the shell/fuel interface in ICF implosion. Second, distribution function for the energetic ions has been found from first principles and the fusion reactivity reduction has been calculated for hot-spot relevant conditions. A technique for approximate evaluation of the distribution function has been identified. This finding suggests a path to effectively introducing the tail modification effects into mainline rad-hydro codes, while being in good agreement with the first principle based solution. This work was partially supported by the Laboratory Directed Research and Development (LDRD) program of LANL.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waidmann, Christopher R.; Miller, Alexander J.; Ng, Cheuk-Wa A.
Studies in proton-coupled electron transfer (PCET) often require the combination of an outer-sphere oxidant and a base, to remove an electron and a proton. A common problem is the incompatibility of the oxidant and the base, because the former is electron deficient and the latter electron rich. We have tested a variety of reagents and report a number of oxidant/base combinations that are compatible and therefore potentially useful as PCET reagents. A formal bond dissociation free energy (BDFE) for a reagent combination is defined by the redox potential of the oxidant and pKa of the base. This is a formal BDFE because no X-H bond is homolytically cleaved, but it is a very useful way to categorize the H• accepting ability of an oxidant/base PCET pair. Formal BDFEs of stable oxidant/base combinations range from 71 to at least 100 kcal mol-1. Effects of solvent, concentration, temperature, and counterions on the stability of the oxidant/base combinations are discussed. Possible extensions to related reductant/acid combinations are mentioned. We gratefully acknowledge the financial support of the U.S. National Science Foundation Center for Enabling New Technologies through Catalysis, the Camille and Henry Dreyfus Postdoctoral Program in Environmental Chemistry (for a fellowship to A.J.M.M.), the U.S. National Institutes of Health (grant GM-50422), and the Center for Molecular Electrocatalysis, an Energy Frontier Research Center funded by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences.
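The formal BDFE mentioned here follows the usual thermochemical combination of the oxidant potential and the base pKa; at 298 K, in kcal per mole with E° in volts and C_G the solvent-dependent constant of the cycle,

```latex
% Formal bond dissociation free energy of an oxidant/base pair (298 K);
% 1.37 = 2.303RT in kcal/mol and 23.06 = F in kcal/(mol V); C_G depends on
% the solvent.
\mathrm{BDFE} \;=\; 1.37\,\mathrm{p}K_{\mathrm{a}} \;+\; 23.06\,E^{\circ}
  \;+\; C_{\mathrm{G}} \qquad [\mathrm{kcal\ mol^{-1}}].
```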
Perception, Cognition, and Visualization.
ERIC Educational Resources Information Center
Arnheim, Rudolf
1991-01-01
Describes how pictures can combine aspects of naturalistic representation with more formal shapes to enhance cognitive understanding. These "diagrammatic" shapes derive from elementary geometry and thereby lend visual concreteness to the concepts conveyed by the pictures. Leonardo da Vinci's anatomical drawings are used as examples…
Semantically-Rigorous Systems Engineering Modeling Using Sysml and OWL
NASA Technical Reports Server (NTRS)
Jenkins, J. Steven; Rouquette, Nicolas F.
2012-01-01
The Systems Modeling Language (SysML) has found wide acceptance as a standard graphical notation for the domain of systems engineering. SysML subsets and extends the Unified Modeling Language (UML) to define conventions for expressing structural, behavioral, and analytical elements, and relationships among them. SysML-enabled modeling tools are available from multiple providers, and have been used for diverse projects in military aerospace, scientific exploration, and civil engineering. The Web Ontology Language (OWL) has found wide acceptance as a standard notation for knowledge representation. OWL-enabled modeling tools are available from multiple providers, as well as auxiliary assets such as reasoners and application programming interface libraries, etc. OWL has been applied to diverse projects in a wide array of fields. While the emphasis in SysML is on notation, SysML inherits (from UML) a semantic foundation that provides for limited reasoning and analysis. UML's partial formalization (FUML), however, does not cover the full semantics of SysML, which is a substantial impediment to developing high confidence in the soundness of any conclusions drawn therefrom. OWL, by contrast, was developed from the beginning on formal logical principles, and consequently provides strong support for verification of consistency and satisfiability, extraction of entailments, conjunctive query answering, etc. This emphasis on formal logic is counterbalanced by the absence of any graphical notation conventions in the OWL standards. Consequently, OWL has had only limited adoption in systems engineering. The complementary strengths and weaknesses of SysML and OWL motivate an interest in combining them in such a way that we can benefit from the attractive graphical notation of SysML and the formal reasoning of OWL. This paper describes an approach to achieving that combination.
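As a small illustration of the kind of SysML-to-OWL mapping this motivates, the sketch below uses the rdflib library to assert a block and a part relationship as OWL axioms; the IRIs and the particular mapping rules are assumptions of the example, not the paper's tooling.

```python
# Illustrative mapping sketch: a SysML block becomes an OWL class and a part
# property becomes an object property. IRIs and mapping choices are invented.
from rdflib import Graph, Namespace, Literal
from rdflib.namespace import RDF, RDFS, OWL

EX = Namespace("http://example.org/sysml2owl#")
g = Graph()
g.bind("ex", EX)

g.add((EX.Spacecraft, RDF.type, OWL.Class))
g.add((EX.PropulsionSubsystem, RDF.type, OWL.Class))
g.add((EX.hasPart, RDF.type, OWL.ObjectProperty))
g.add((EX.hasPart, RDFS.domain, EX.Spacecraft))
g.add((EX.hasPart, RDFS.range, EX.PropulsionSubsystem))
g.add((EX.Spacecraft, RDFS.label, Literal("Spacecraft (SysML block)")))

print(g.serialize(format="turtle"))
```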
NASA Astrophysics Data System (ADS)
Shafii, M.; Tolson, B.; Matott, L. S.
2012-04-01
Hydrologic modeling has benefited from significant developments over the past two decades. This has resulted in building of higher levels of complexity into hydrologic models, which eventually makes the model evaluation process (parameter estimation via calibration and uncertainty analysis) more challenging. In order to avoid unreasonable parameter estimates, many researchers have suggested implementation of multi-criteria calibration schemes. Furthermore, for predictive hydrologic models to be useful, proper consideration of uncertainty is essential. Consequently, recent research has emphasized comprehensive model assessment procedures in which multi-criteria parameter estimation is combined with statistically-based uncertainty analysis routines such as Bayesian inference using Markov Chain Monte Carlo (MCMC) sampling. Such a procedure relies on the use of formal likelihood functions based on statistical assumptions, and moreover, the Bayesian inference structured on MCMC samplers requires a considerably large number of simulations. Due to these issues, especially in complex non-linear hydrological models, a variety of alternative informal approaches have been proposed for uncertainty analysis in the multi-criteria context. This study aims at exploring a number of such informal uncertainty analysis techniques in multi-criteria calibration of hydrological models. The informal methods addressed in this study are (i) Pareto optimality which quantifies the parameter uncertainty using the Pareto solutions, (ii) DDS-AU which uses the weighted sum of objective functions to derive the prediction limits, and (iii) GLUE which describes the total uncertainty through identification of behavioral solutions. The main objective is to compare such methods with MCMC-based Bayesian inference with respect to factors such as computational burden, and predictive capacity, which are evaluated based on multiple comparative measures. The measures for comparison are calculated both for calibration and evaluation periods. The uncertainty analysis methodologies are applied to a simple 5-parameter rainfall-runoff model, called HYMOD.
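To ground the informal end of this comparison, the sketch below runs a GLUE-style analysis on a deliberately toy two-parameter model (not HYMOD): Monte Carlo sampling, a Nash-Sutcliffe informal likelihood, a behavioral cutoff, and likelihood-weighted prediction limits. Thresholds and the model itself are arbitrary choices for illustration.

```python
# GLUE-style sketch on a toy exponential-recession "model" with two parameters.
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(100)

def model(a, b):
    return a * np.exp(-b * t / 50.0)

obs = model(2.0, 1.0) + rng.normal(0, 0.05, t.size)   # synthetic observations

def nse(sim, obs):
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

samples = rng.uniform([0.5, 0.2], [4.0, 3.0], size=(5000, 2))
sims = np.array([model(a, b) for a, b in samples])
scores = np.array([nse(s, obs) for s in sims])

behavioral = scores > 0.7                  # informal behavioral threshold
weights = scores[behavioral] - 0.7
weights /= weights.sum()

# Likelihood-weighted 5%/95% prediction limits, computed per time step.
order = np.argsort(sims[behavioral], axis=0)
sorted_sims = np.take_along_axis(sims[behavioral], order, axis=0)
sorted_w = np.take_along_axis(np.repeat(weights[:, None], t.size, 1), order, axis=0)
cdf = np.cumsum(sorted_w, axis=0)
lower = sorted_sims[np.argmax(cdf >= 0.05, axis=0), np.arange(t.size)]
upper = sorted_sims[np.argmax(cdf >= 0.95, axis=0), np.arange(t.size)]
print(behavioral.sum(), "behavioral runs; first-step 90% limits:",
      round(float(lower[0]), 2), round(float(upper[0]), 2))
```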
Gulati, Gaurav; Jones, Jordan T; Lee, Gregory; Altaye, Mekibib; Beebe, Dean W; Meyers-Eaton, Jamie; Wiley, Kasha; Brunner, Hermine I; DiFrancesco, Mark W
2017-02-01
To evaluate a safe, noninvasive magnetic resonance imaging (MRI) method to measure regional blood-brain barrier integrity and investigate its relationship with neurocognitive function and regional gray matter volume in juvenile-onset systemic lupus erythematosus (SLE). In this cross-sectional, case-control study, capillary permeability was measured as a marker of blood-brain barrier integrity in juvenile SLE patients and matched healthy controls, using a combination of arterial spin labeling and diffusion-weighted brain MRI. Regional gray matter volume was measured by voxel-based morphometry. Correlation analysis was done to investigate the relationship between regional capillary permeability and regional gray matter volume. Formal neurocognitive testing was completed (measuring attention, visuoconstructional ability, working memory, and psychomotor speed), and scores were regressed against regional blood-brain barrier integrity among juvenile SLE patients. Formal cognitive testing confirmed normal cognitive ability in all juvenile SLE subjects (n = 11) included in the analysis. Regional capillary permeability was negatively associated (P = 0.026) with neurocognitive performance concerning psychomotor speed in the juvenile SLE cohort. Compared with controls (n = 11), juvenile SLE patients had significantly greater capillary permeability involving Brodmann's areas 19, 28, 36, and 37 and caudate structures (P < 0.05 for all). There is imaging evidence of increased regional capillary permeability in juvenile SLE patients with normal cognitive performance using a novel noninvasive MRI technique. These blood-brain barrier outcomes appear consistent with functional neuronal network alterations and gray matter volume loss previously observed in juvenile SLE patients with overt neurocognitive deficits, supporting the notion that blood-brain barrier integrity loss precedes the loss of cognitive ability in juvenile SLE. Longitudinal studies are needed to confirm the findings of this pilot study. © 2016, American College of Rheumatology.
Maritz, Roxanne; Aronsky, Dominik; Prodinger, Birgit
2017-09-20
The International Classification of Functioning, Disability and Health (ICF) is the World Health Organization's standard for describing health and health-related states. Examples of how the ICF has been used in Electronic Health Records (EHRs) have not been systematically summarized and described yet. To provide a systematic review of peer-reviewed literature about the ICF's use in EHRs, including related challenges and benefits. Peer-reviewed literature, published between January 2001 and July 2015 was retrieved from Medline®, CINAHL®, Scopus®, and ProQuest® Social Sciences using search terms related to ICF and EHR concepts. Publications were categorized according to three groups: Requirement specification, development and implementation. Information extraction was conducted according to a qualitative content analysis method, deductively informed by the evaluation framework for Health Information Systems: Human, Organization and Technology-fit (HOT-fit). Of 325 retrieved articles, 17 publications were included; 4 were categorized as requirement specification, 7 as development, and 6 as implementation publications. Information regarding the HOT-fit evaluation framework was summarized. Main benefits of using the ICF in EHRs were its unique comprehensive perspective on health and its interdisciplinary focus. Main challenges included the fact that the ICF is not structured as a formal terminology as well as the need for a reduced number of ICF codes for more feasible and practical use. Different approaches and technical solutions exist for integrating the ICF in EHRs, such as combining the ICF with other existing standards for EHR or selecting ICF codes with natural language processing. Though the use of the ICF in EHRs is beneficial as this review revealed, the ICF could profit from further improvements such as formalizing the knowledge representation in the ICF to support and enhance interoperability.
Wawrzyniak, Piotr K; Alia, A; Schaap, Roland G; Heemskerk, Mattijs M; de Groot, Huub J M; Buda, Francesco
2008-12-14
Bacteriochlorophyll-histidine complexes are ubiquitous in nature and are essential structural motifs supporting the conversion of solar energy into chemically useful compounds in a wide range of photosynthesis processes. A systematic density functional theory study of the NMR chemical shifts for histidine and for bacteriochlorophyll-a-histidine complexes in the light-harvesting complex II (LH2) is performed using the BLYP functional in combination with the 6-311++G(d,p) basis set. The computed chemical shift patterns are consistent with available experimental data for positive and neutral(τ) (N(τ)-protonated) crystalline histidines. The results for the bacteriochlorophyll-a-histidine complexes in LH2 provide evidence that the protein environment is stabilizing the histidine close to the Mg ion, thereby inducing a large charge transfer of approximately 0.5 electronic equivalent. Due to this protein-induced geometric constraint, the Mg-coordinated histidine in LH2 appears to be in a frustrated state very different from the formal neutral(π) (N(π)-protonated) form. This finding could be important for the understanding of basic functional mechanisms involved in tuning the electronic properties and exciton coupling in LH2.
Beinecke, R H
1999-01-01
An expanded range of oversight mechanisms is being adopted to hold public human service programs more accountable to funding sources as well as consumers, family members, and providers. Most of these approaches are hierarchical in nature. Some involve negotiated agreements and each is designed to meet certain goals and functions. Each utilizes different forms of decision-making. Stakeholders prefer to be part of a shared decision-making process. Understanding these underlying premises can help to assess the strengths and weaknesses of each method and can suggest how to most effectively utilize combinations of approaches to improve program performance. Whether we will move toward a new paradigm emphasizing participation and collaboration rather than more formal structural approaches is yet undetermined but will greatly affect how programs are monitored and evaluated in the future.
FAST TRACK COMMUNICATION A DFT + DMFT approach for nanosystems
NASA Astrophysics Data System (ADS)
Turkowski, Volodymyr; Kabir, Alamgir; Nayyar, Neha; Rahman, Talat S.
2010-11-01
We propose a combined density functional theory and dynamical mean-field theory (DFT + DMFT) approach for the reliable inclusion of electron-electron correlation effects in nanosystems. Compared with the widely used DFT + U approach, this method has several advantages, the most important of which is that it takes into account dynamical correlation effects. The formalism is illustrated through different calculations of the magnetic properties of a set of small iron clusters (2 ≤ N ≤ 5 atoms). It is shown that the inclusion of dynamical effects leads to a reduction in the cluster magnetization (as compared to results from DFT + U) and that, even for such small clusters, the magnetization values agree well with experimental estimates. These results justify confidence in the ability of the method to accurately describe the magnetic properties of clusters of interest to nanoscience.
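For orientation, the self-consistency cycle such a scheme iterates is the standard DMFT one, written below for a periodic system; for a finite cluster the momentum sum becomes a site-resolved matrix inversion, and a double-counting correction to the DFT Hamiltonian is implied.

```latex
% Standard DMFT self-consistency: local Green's function built from the DFT
% Hamiltonian and the impurity self-energy, and the Weiss field returned to
% the impurity solver.
G_{\mathrm{loc}}(i\omega_n) \;=\; \sum_{\mathbf{k}}
  \bigl[(i\omega_n+\mu)\,\mathbb{1}-H_{\mathrm{DFT}}(\mathbf{k})
        -\Sigma(i\omega_n)\bigr]^{-1},
\qquad
\mathcal{G}_0^{-1}(i\omega_n) \;=\; G_{\mathrm{loc}}^{-1}(i\omega_n)+\Sigma(i\omega_n).
```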
First-principles studies of electronic, transport and bulk properties of pyrite FeS2
NASA Astrophysics Data System (ADS)
Banjara, Dipendra; Mbolle, Augustine; Malozovsky, Yuriy; Franklin, Lashounda; Bagayoko, Diola
We present results of ab initio, self-consistent density functional theory (DFT) calculations of electronic, transport, and bulk properties of pyrite FeS2. We employed a local density approximation (LDA) potential and the linear combination of atomic orbitals (LCAO) formalism, following the Bagayoko, Zhao and Williams (BZW) method, as enhanced by Ekuma and Franklin (BZW-EF). The BZW-EF method requires successive, self-consistent calculations with increasing basis sets to reach the ground state of the system under study. We report the band structure, the band gap, total and partial densities of states, effective masses, and the bulk modulus. Work funded in part by the US Department of Energy (DOE), National Nuclear Security Administration (NNSA) (Award No. DE-NA0002630), the National Science Foundation (NSF) (Award No. 1503226), LaSPACE, and LONI-SUBR.
Nazarpour, Soheila; Simbar, Masoumeh; Ramezani Tehrani, Fahimeh; Alavi Majd, Hamid
2017-07-01
The sex lives of women are strongly affected by menopause. Non-pharmacologic approaches to improving the sexual function of postmenopausal women might prove effective. To compare two methods of intervention (formal sex education and Kegel exercises) with routine postmenopausal care services in a randomized clinical trial. A randomized clinical trial was conducted of 145 postmenopausal women residing in Chalus and Noshahr, Iran. Their sexual function statuses were assessed using the Female Sexual Function Index (FSFI) questionnaire. After obtaining written informed consents, they were randomly assigned to one of three groups: (i) formal sex education, (ii) Kegel exercises, or (iii) routine postmenopausal care. After 12 weeks, all participants completed the FSFI again. Analysis of covariance was used to compare the participants' sexual function before and after the interventions, and multiple linear regression analysis was used to determine the predictive factors for variation in FSFI scores in the postintervention stage. Sexual function was assessed using the FSFI. There were no statistically significant differences in demographic and socioeconomic characteristics and FSFI total scores among the three study groups at the outset of the study. After 12 weeks, the scores of arousal in the formal sex education and Kegel groups were significantly higher compared with the control group (3.38 and 3.15 vs 2.77, respectively). The scores of orgasm and satisfaction in the Kegel group were significantly higher compared with the control group (4.43 and 4.88 vs 3.95 and 4.39, respectively). Formal sex education and Kegel exercises were used as two non-pharmacologic approaches to improve the sexual function of women after menopause. The main strength of this study was its design: a well-organized randomized trial using precise eligibility criteria with a small sample loss. The second strength was the methods of intervention used, namely non-pharmacologic approaches that are simple, easily accessible, and fairly inexpensive. The main limitation of the study was our inability to objectively assess the participants' commitment to exercise and the sexual function of their partners. Sex education programs and Kegel exercises could cause improvements in some domains of sexual function-specifically arousal, orgasm, and satisfaction-in postmenopausal women. Nazarpour S, Simbar M, Tehrani FR, Majd HA. Effects of Sex Education and Kegel Exercises on the Sexual Function of Postmenopausal Women: A Randomized Clinical Trial. J Sex Med 2017;14:959-967. Copyright © 2017 International Society for Sexual Medicine. Published by Elsevier Inc. All rights reserved.
A Theory of the Function of Technical Writing.
ERIC Educational Resources Information Center
Ross, Donald, Jr.
1981-01-01
Advances the theory that technical writing functions as a replacement for memory--an information storage receptacle. Lists the formal and stylistic features implied by such a theory. Considers the future development of technical writing within the context of this theory. (RL)
NASA Astrophysics Data System (ADS)
Cleff, Carsten; Rigneault, Hervé; Brasselet, Sophie; Duboisset, Julien
2017-07-01
We describe coherent Raman scattering in a complete spherical formalism, allowing a better understanding of the coherent Raman process with respect to its symmetry properties, which is especially helpful in polarized coherent Raman microscopy. We describe how to build the coherent Raman tensor from the spontaneous Raman tensor for crystalline and disordered media. We introduce a distribution function for molecular bonds and show how this distribution function results in a new macroscopic symmetry, which can be very different from the symmetry of the vibrational modes. Finally, we explicitly show polarization configurations for coherent anti-Stokes Raman scattering to probe specific vibration symmetries in crystalline samples and lipid layers.
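Schematically, the vibrationally resonant part of the coherent Raman response is built from products of two spontaneous Raman tensors over the vibrational resonance denominator; the index pairing below is only indicative (it depends on the field-ordering convention), and orientation averaging over the bond distribution function then fixes the macroscopic symmetry.

```latex
% Schematic resonant coherent Raman susceptibility: products of Raman tensor
% components (polarizability derivatives along the normal coordinate Q_R)
% divided by the vibrational resonance denominator.
\chi^{(3)}_{ijkl} \;\propto\; \sum_R
  \frac{\bigl(\partial\alpha_{ij}/\partial Q_R\bigr)
        \bigl(\partial\alpha_{kl}/\partial Q_R\bigr)}
       {\Omega_R-(\omega_p-\omega_S)-i\Gamma_R}.
```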
Cost Implications of Organizing Nursing Home Workforce in Teams
Mukamel, Dana B; Cai, Shubing; Temkin-Greener, Helena
2009-01-01
Objective: To estimate the costs associated with formal and self-managed daily practice teams in nursing homes. Data Sources/Study Setting: Medicaid cost reports for 135 nursing homes in New York State in 2006 and survey data for 6,137 direct care workers. Study Design: A retrospective statistical analysis; we estimated hybrid cost functions that include team penetration variables. Inference was based on robust standard errors. Data Collection: Formal and self-managed team penetration (i.e., percent of staff working in a team) were calculated from survey responses. Annual variable costs, beds, case mix-adjusted days, admissions, home care visits, outpatient clinic visits, day care days, wages, and ownership were calculated from the cost reports. Principal Findings: Formal team penetration was significantly associated with costs, while self-managed team penetration was not. Costs declined with increasing penetration up to a formal team penetration of 13 percent, and increased above this level. Formal teams in nursing homes in the upward-sloping range of the curve were more diverse, with a larger number of participating disciplines, and were more likely to include physicians. Conclusions: Organization of the workforce in formal teams may offer nursing homes a cost-saving strategy. More research is required to understand the relationship between team composition and costs. PMID:19486181
Small intestinal function and dietary status in dermatitis herpetiformis.
Gawkrodger, D J; McDonald, C; O'Mahony, S; Ferguson, A
1991-01-01
Small intestinal morphology and function were assessed in 82 patients with dermatitis herpetiformis, 51 of whom were taking a normal diet and 31 a gluten free diet. Methods used were histopathological evaluation of jejunal mucosal biopsy specimens, quantitation of intraepithelial lymphocytes, cellobiose/mannitol permeability test, tissue disaccharidase values, serum antigliadin antibodies, and formal assessment of dietary gluten content by a dietician. There was no correlation between dietary gluten intake and the degree of enteropathy in the 51 patients taking a normal diet, whereas biopsy specimens were normal in 24 of the 31 patients on a gluten free diet, all previously having been abnormal. Eighteen patients on gluten containing diets had normal jejunal histology and in seven of these all tests of small intestinal morphology and function were entirely normal. Intestinal permeability was abnormal and serum antigliadin antibodies were present in most patients with enteropathy. Studies of acid secretion in seven patients showed that hypochlorhydria or achlorhydria did not lead to abnormal permeability in the absence of enteropathy. This study shows that a combination of objective tests of small intestinal architecture and function will detect abnormalities in most dermatitis herpetiformis patients, including some with histologically normal jejunal biopsy specimens. Nevertheless there is a small group in whom all conventional intestinal investigations are entirely normal. PMID:2026337
NASA Astrophysics Data System (ADS)
Wang, Zhen; Cui, Shengcheng; Yang, Jun; Gao, Haiyang; Liu, Chao; Zhang, Zhibo
2017-03-01
We present a novel hybrid scattering order-dependent variance reduction method to accelerate the convergence rate in both forward and backward Monte Carlo radiative transfer simulations involving highly forward-peaked scattering phase function. This method is built upon a newly developed theoretical framework that not only unifies both forward and backward radiative transfer in scattering-order-dependent integral equation, but also generalizes the variance reduction formalism in a wide range of simulation scenarios. In previous studies, variance reduction is achieved either by using the scattering phase function forward truncation technique or the target directional importance sampling technique. Our method combines both of them. A novel feature of our method is that all the tuning parameters used for phase function truncation and importance sampling techniques at each order of scattering are automatically optimized by the scattering order-dependent numerical evaluation experiments. To make such experiments feasible, we present a new scattering order sampling algorithm by remodeling integral radiative transfer kernel for the phase function truncation method. The presented method has been implemented in our Multiple-Scaling-based Cloudy Atmospheric Radiative Transfer (MSCART) model for validation and evaluation. The main advantage of the method is that it greatly improves the trade-off between numerical efficiency and accuracy order by order.
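The sketch below is not the paper's hybrid method; it only illustrates the basic weighting idea behind directional importance sampling for a single scattering event with a strongly forward-peaked Henyey-Greenstein phase function, with an arbitrary backward "detector" cone and a biased sampling distribution chosen for the example.

```python
# Importance sampling of one scattering angle under a forward-peaked
# Henyey-Greenstein (HG) phase function: draw from a biased (backward-peaked)
# HG and carry the weight w = p/q.
import numpy as np

rng = np.random.default_rng(1)
g = 0.9                                  # forward-peaked HG asymmetry parameter

def hg_pdf(mu, g):
    """HG phase function as a pdf in mu = cos(theta), normalized on [-1, 1]."""
    return 0.5 * (1 - g**2) / (1 + g**2 - 2 * g * mu) ** 1.5

def hg_sample(g, n):
    """Analytic inversion of the HG cumulative distribution."""
    xi = rng.random(n)
    return (1 + g**2 - ((1 - g**2) / (1 - g + 2 * g * xi)) ** 2) / (2 * g)

in_cone = lambda mu: mu < -0.999         # rare backward "detector" direction

# (a) analog sampling: nearly all samples land in the forward peak
mu_analog = hg_sample(g, 200_000)
p_analog = in_cone(mu_analog).mean()

# (b) importance sampling from a backward-peaked HG, weighted by p/q
g_b = -0.7
mu_is = hg_sample(g_b, 200_000)
w = hg_pdf(mu_is, g) / hg_pdf(mu_is, g_b)
p_is = np.mean(in_cone(mu_is) * w)

print(f"analog estimate: {p_analog:.2e}   importance-sampled: {p_is:.2e}")
```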
Approaches to modelling hydrology and ecosystem interactions
NASA Astrophysics Data System (ADS)
Silberstein, Richard P.
2014-05-01
As the pressures of industry, agriculture and mining on groundwater resources increase there is a burgeoning unmet need to be able to capture these multiple, direct and indirect stresses in a formal framework that will enable better assessment of impact scenarios. While there are many catchment hydrological models and there are some models that represent ecological states and change (e.g. FLAMES, Liedloff and Cook, 2007), these have not been linked in any deterministic or substantive way. Without such coupled eco-hydrological models, quantitative assessments of impacts from water use intensification on water dependent ecosystems under changing climate are difficult, if not impossible. The concept would include facility for direct and indirect water-related stresses that may develop around mining and well operations, climate stresses, such as rainfall and temperature, biological stresses, such as diseases and invasive species, and competition such as encroachment from other competing land uses. An indirect water impact could be, for example, a change in groundwater conditions that affects the stream flow regime, and hence aquatic ecosystems. This paper reviews previous work examining models combining ecology and hydrology with a view to developing a conceptual framework linking a biophysically defensible model that combines ecosystem function with hydrology. The objective is to develop a model capable of representing the cumulative impact of multiple stresses on water resources and associated ecosystem function.
Thermodynamics and proton activities of protic ionic liquids with quantum cluster equilibrium theory
NASA Astrophysics Data System (ADS)
Ingenmey, Johannes; von Domaros, Michael; Perlt, Eva; Verevkin, Sergey P.; Kirchner, Barbara
2018-05-01
We applied the binary Quantum Cluster Equilibrium (bQCE) method to a number of alkylammonium-based protic ionic liquids in order to predict boiling points, vaporization enthalpies, and proton activities. The theory combines statistical thermodynamics of van-der-Waals-type clusters with ab initio quantum chemistry and yields the partition functions (and associated thermodynamic potentials) of binary mixtures over a wide range of thermodynamic phase points. Unlike conventional cluster approaches that are limited to the prediction of thermodynamic properties, dissociation reactions can be effortlessly included into the bQCE formalism, giving access to ionicities, as well. The method is open to quantum chemical methods at any level of theory, but combination with low-cost composite density functional theory methods and the proposed systematic approach to generate cluster sets provides a computationally inexpensive and mostly parameter-free way to predict such properties at good-to-excellent accuracy. Boiling points can be predicted within an accuracy of 50 K, reaching excellent accuracy for ethylammonium nitrate. Vaporization enthalpies are predicted within an accuracy of 20 kJ mol-1 and can be systematically interpreted on a molecular level. We present the first theoretical approach to predict proton activities in protic ionic liquids, with results fitting well into the experimentally observed correlation. Furthermore, enthalpies of vaporization were measured experimentally for some alkylammonium nitrates and an excellent linear correlation with vaporization enthalpies of their respective parent amines is observed.
Durning, Steven J; Graner, John; Artino, Anthony R; Pangaro, Louis N; Beckman, Thomas; Holmboe, Eric; Oakes, Terrance; Roy, Michael; Riedy, Gerard; Capaldi, Vincent; Walter, Robert; van der Vleuten, Cees; Schuwirth, Lambert
2012-09-01
Clinical reasoning is essential to medical practice, but because it entails internal mental processes, it is difficult to assess. Functional magnetic resonance imaging (fMRI) and think-aloud protocols may improve understanding of clinical reasoning as these methods can more directly assess these processes. The objective of our study was to use a combination of fMRI and think-aloud procedures to examine fMRI correlates of a leading theoretical model in clinical reasoning based on experimental findings to date: analytic (i.e., actively comparing and contrasting diagnostic entities) and nonanalytic (i.e., pattern recognition) reasoning. We hypothesized that there would be functional neuroimaging differences between analytic and nonanalytic reasoning. Seventeen board-certified experts in internal medicine answered and reflected on validated U.S. Medical Licensing Exam and American Board of Internal Medicine multiple-choice questions (easy and difficult) during an fMRI scan. This procedure was followed by completion of a formal think-aloud procedure. fMRI findings provide some support for the presence of analytic and nonanalytic reasoning systems. Statistically significant activation of the prefrontal cortex distinguished answering incorrectly versus correctly (p < 0.01), whereas activation of the precuneus and midtemporal gyrus distinguished not guessing from guessing (p < 0.01). We found limited fMRI evidence to support analytic and nonanalytic reasoning theory, as our results indicate functional differences with correct vs. incorrect answers and guessing vs. not guessing. However, our findings did not suggest one consistent fMRI activation pattern of internal medicine expertise. This model of employing fMRI correlates offers opportunities to enhance our understanding of theory, as well as to improve our teaching and assessment of clinical reasoning, a key outcome of medical education.
PyMCT: A Very High Level Language Coupling Tool For Climate System Models
NASA Astrophysics Data System (ADS)
Tobis, M.; Pierrehumbert, R. T.; Steder, M.; Jacob, R. L.
2006-12-01
At the Climate Systems Center of the University of Chicago, we have been examining strategies for applying agile programming techniques to complex high-performance modeling experiments. While the "agile" development methodology differs from a conventional requirements process and its associated milestones, the process remains a formal one. It is distinguished by continuous improvement in functionality, large numbers of small releases, extensive and ongoing testing strategies, and a strong reliance on very high level languages (VHLL). Here we report on PyMCT, which we intend as a core element in a model ensemble control superstructure. PyMCT is a set of Python bindings for MCT, the Fortran-90 based Model Coupling Toolkit, which forms the infrastructure for the inter-component communication in the Community Climate System Model (CCSM). MCT provides a scalable model communication infrastructure. In order to take maximum advantage of agile software development methodologies, we exposed MCT functionality to Python, a prominent VHLL. We describe how the scalable architecture of MCT allows us to overcome the relatively weak runtime performance of Python, so that the performance of the combined system is not severely impacted. To demonstrate these advantages, we reimplemented the CCSM coupler in Python. While this alone offers no new functionality, it does provide a rigorous test of PyMCT functionality and performance. We reimplemented the CPL6 library, presenting an interesting case study of the comparison between conventional Fortran-90 programming and the higher abstraction level provided by a VHLL. The powerful abstractions provided by Python will allow much more complex experimental paradigms. In particular, we hope to build on the scriptability of our coupling strategy to enable systematic sensitivity tests. Our most ambitious objective is to combine our efforts with Bayesian inverse modeling techniques toward objective tuning at the highest level, across model architectures.
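PyMCT's actual bindings are not shown in the abstract, so the sketch below does not use them; it is only a toy, pure-Python driver loop with hypothetical Atmosphere and Ocean stubs, meant to illustrate the kind of high-level coupling control that a VHLL layer over a coupling toolkit enables:

```python
import numpy as np

class Atmosphere:
    """Toy component: produces a surface heat-flux field each step."""
    def __init__(self, n, seed=0):
        self.n = n
        self.rng = np.random.default_rng(seed)
    def step(self, sst):
        # flux relaxes toward a 300 K reference SST, plus "weather" noise
        return 0.1 * (300.0 - sst) + self.rng.normal(0.0, 0.01, self.n)

class Ocean:
    """Toy component: integrates the received flux into an SST field."""
    def __init__(self, n):
        self.sst = np.full(n, 290.0)
    def step(self, flux, dt=3600.0):
        self.sst = self.sst + dt * 1e-5 * flux
        return self.sst

def run_coupled(n_steps=24, n_points=16):
    atm, ocn = Atmosphere(n_points), Ocean(n_points)
    sst = ocn.sst.copy()
    for _ in range(n_steps):        # the "coupler": hand fields back and forth
        flux = atm.step(sst)        # atmosphere sees the latest ocean state
        sst = ocn.step(flux)        # ocean sees the new atmospheric flux
    return sst

print(run_coupled().mean())
```

Because the driver is ordinary Python, ensemble control, sensitivity sweeps, or Bayesian tuning loops can wrap `run_coupled` directly, which is the scriptability argument the abstract makes for a VHLL coupling layer.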
Generalizing Is Necessary or Even Unavoidable
ERIC Educational Resources Information Center
Otte, Michael F.; Mendonça, Tânia M.; de Barros, Luiz
2015-01-01
The problems of geometry and mechanics have driven forward the generalization of the concepts of number and function. This shows how application and generalization together prevent mathematics from becoming a mere formalism. Thoughts are signs, and signs have meaning within a certain context. Meaning is a function of a term: This function produces a…
On the subsystem formulation of linear-response time-dependent DFT.
Pavanello, Michele
2013-05-28
A new and thorough derivation of linear-response subsystem time-dependent density functional theory (TD-DFT) is presented and analyzed in detail. Two equivalent derivations are presented and naturally yield self-consistent subsystem TD-DFT equations. One reduces to the subsystem TD-DFT formalism of Neugebauer [J. Chem. Phys. 126, 134116 (2007)]. The other yields Dyson type equations involving three types of subsystem response functions: coupled, uncoupled, and Kohn-Sham. The Dyson type equations for subsystem TD-DFT are derived here for the first time. The response function formalism reveals previously hidden qualities and complications of subsystem TD-DFT compared with the regular TD-DFT of the supersystem. For example, analysis of the pole structure of the subsystem response functions shows that each function contains information about the electronic spectrum of the entire supersystem. In addition, comparison of the subsystem and supersystem response functions shows that, while the correlated response is subsystem additive, the Kohn-Sham response is not. Comparison with the non-subjective partition DFT theory shows that this non-additivity is largely an artifact introduced by the subjective nature of the density partitioning in subsystem DFT.
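For orientation, the conventional supersystem linear-response TD-DFT Dyson equation that the subsystem formulation generalizes can be written schematically as follows (standard notation; the coupled, uncoupled, and Kohn-Sham subsystem response functions derived in the paper are not reproduced here):

```latex
\chi(\mathbf{r},\mathbf{r}',\omega) \;=\; \chi_{\mathrm{KS}}(\mathbf{r},\mathbf{r}',\omega)
\;+\; \int d\mathbf{r}_1\, d\mathbf{r}_2\;
\chi_{\mathrm{KS}}(\mathbf{r},\mathbf{r}_1,\omega)\,
f_{\mathrm{Hxc}}(\mathbf{r}_1,\mathbf{r}_2,\omega)\,
\chi(\mathbf{r}_2,\mathbf{r}',\omega).
```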
Formal methods demonstration project for space applications
NASA Technical Reports Server (NTRS)
Divito, Ben L.
1995-01-01
The Space Shuttle program is cooperating in a pilot project to apply formal methods to live requirements analysis activities. As one of the larger ongoing shuttle Change Requests (CR's), the Global Positioning System (GPS) CR involves a significant upgrade to the Shuttle's navigation capability. Shuttles are to be outfitted with GPS receivers and the primary avionics software will be enhanced to accept GPS-provided positions and integrate them into navigation calculations. Prior to implementing the CR, requirements analysts at Loral Space Information Systems, the Shuttle software contractor, must scrutinize the CR to identify and resolve any requirements issues. We describe an ongoing task of the Formal Methods Demonstration Project for Space Applications whose goal is to find an effective way to use formal methods in the GPS CR requirements analysis phase. This phase is currently under way and a small team from NASA Langley, ViGYAN Inc. and Loral is now engaged in this task. Background on the GPS CR is provided and an overview of the hardware/software architecture is presented. We outline the approach being taken to formalize the requirements, only a subset of which is being attempted. The approach features the use of the PVS specification language to model 'principal functions', which are major units of Shuttle software. Conventional state machine techniques form the basis of our approach. Given this background, we present interim results based on a snapshot of work in progress. Samples of requirements specifications rendered in PVS are offered as illustration. We walk through a specification sketch for the principal function known as GPS Receiver State processing. Results to date are summarized and feedback from Loral requirements analysts is highlighted. Preliminary data are shown comparing issues detected by the formal methods team versus those detected using existing requirements analysis methods. We conclude by discussing our plan to complete the remaining activities of this task.
A formalism for the systematic treatment of rapidity logarithms in Quantum Field Theory
NASA Astrophysics Data System (ADS)
Chiu, Jui-Yu; Jain, Ambar; Neill, Duff; Rothstein, Ira Z.
2012-05-01
Many observables in QCD rely upon the resummation of perturbation theory to retain predictive power. Resummation follows after one factorizes the cross section into the relevant modes. The class of observables which are sensitive to soft recoil effects is particularly challenging to factorize and resum since they involve rapidity logarithms. Such observables include: transverse momentum distributions at p_T much less than the high-energy scattering scale, jet broadening, exclusive hadroproduction and decay, as well as the Sudakov form factor. In this paper we will present a formalism which allows one to factorize and resum the perturbative series for such observables in a systematic fashion through the notion of a "rapidity renormalization group". That is, a Collins-Soper-like equation is realized as a renormalization group equation, but has a more universal applicability to observables beyond the traditional transverse momentum dependent parton distribution functions (TMDPDFs) and the Sudakov form factor. This formalism has the feature that it allows one to track the (non-standard) scheme dependence which is inherent in any scenario where one performs a resummation of rapidity divergences. We present a pedagogical introduction to the formalism by applying it to the well-known massive Sudakov form factor. The formalism is then used to study observables of current interest. A factorization theorem for the transverse momentum distribution of Higgs production is presented along with the result for the resummed cross section at NLL. Our formalism allows one to define gauge invariant TMDPDFs which are independent of both the hard scattering amplitude and the soft function, i.e. they are universal. We present details of the factorization and resummation of the jet broadening cross section including a renormalization in p_⊥ space. We furthermore show how to regulate and renormalize exclusive processes which are plagued by endpoint singularities in such a way as to allow for a consistent resummation.
From Informal Safety-Critical Requirements to Property-Driven Formal Validation
NASA Technical Reports Server (NTRS)
Cimatti, Alessandro; Roveri, Marco; Susi, Angelo; Tonetta, Stefano
2008-01-01
Most of the efforts in formal methods have historically been devoted to comparing a design against a set of requirements. The validation of the requirements themselves, however, has often been disregarded, and it can be considered a largely open problem, which poses several challenges. The first challenge is given by the fact that requirements are often written in natural language, and may thus contain a high degree of ambiguity. Despite the progress in Natural Language Processing techniques, the task of understanding a set of requirements cannot be automated, and must be carried out by domain experts, who are typically not familiar with formal languages. Furthermore, in order to retain a direct connection with the informal requirements, the formalization cannot follow standard model-based approaches. The second challenge lies in the formal validation of requirements. On one hand, it is not even clear which are the correctness criteria or the high-level properties that the requirements must fulfill. On the other hand, the expressivity of the language used in the formalization may go beyond the theoretical and/or practical capacity of state-of-the-art formal verification. In order to solve these issues, we propose a new methodology that comprises a chain of steps, each supported by a specific tool. The main steps are the following. First, the informal requirements are split into basic fragments, which are classified into categories, and dependency and generalization relationships among them are identified. Second, the fragments are modeled using a visual language such as UML. The UML diagrams are both syntactically restricted (in order to guarantee a formal semantics), and enriched with a highly controlled natural language (to allow for modeling static and temporal constraints). Third, an automatic formal analysis phase iterates over the modeled requirements, by combining several, complementary techniques: checking consistency; verifying whether the requirements entail some desirable properties; verifying whether the requirements are consistent with selected scenarios; diagnosing inconsistencies by identifying inconsistent cores; identifying vacuous requirements; constructing multiple explanations by enabling the fault-tree analysis related to particular fault models; verifying whether the specification is realizable.
NASA Astrophysics Data System (ADS)
Yao, Yi; Kanai, Yosuke
2017-06-01
We present the implementation and performance of the strongly constrained and appropriately normed (SCAN) meta-GGA exchange-correlation (XC) approximation in the planewave-pseudopotential (PW-PP) formalism using the Troullier-Martins pseudopotential scheme. We studied its performance by applying the PW-PP implementation to several practical applications of interest in condensed matter sciences: (a) crystalline silicon and germanium, (b) martensitic phase transition energetics of phosphorene, and (c) a single water molecule physisorption on a graphene sheet. Given the much-improved accuracy over the GGA functionals and its relatively low computational cost compared to hybrid XC functionals, the SCAN functional is highly promising for various practical applications of density functional theory calculations for condensed matter systems. At the same time, the SCAN meta-GGA functional appears to require more careful attention to numerical details. The meta-GGA functional shows more significant dependence on the fast Fourier transform grid, which is used for evaluating the XC potential in real space in the PW-PP formalism, than other more conventional GGA functionals do. Additionally, using pseudopotentials that are generated at a different/lower level of XC approximation could introduce noticeable errors in calculating some properties such as phase transition energetics.
FDE-vdW: A van der Waals inclusive subsystem density-functional theory.
Kevorkyants, Ruslan; Eshuis, Henk; Pavanello, Michele
2014-07-28
We present a formally exact van der Waals inclusive electronic structure theory, called FDE-vdW, based on the Frozen Density Embedding formulation of subsystem Density-Functional Theory. In subsystem DFT, the energy functional is composed of subsystem additive and non-additive terms. We show that an appropriate definition of the long-range correlation energy is given by the value of the non-additive correlation functional. This functional is evaluated using the fluctuation-dissipation theorem aided by a formally exact decomposition of the response functions into subsystem contributions. FDE-vdW is derived in detail and several approximate schemes are proposed, which lead to practical implementations of the method. We show that FDE-vdW is Casimir-Polder consistent, i.e., it reduces to the generalized Casimir-Polder formula for asymptotic inter-subsystems separations. Pilot calculations of binding energies of 13 weakly bound complexes singled out from the S22 set show a dramatic improvement upon semilocal subsystem DFT, provided that an appropriate exchange functional is employed. The convergence of FDE-vdW with basis set size is discussed, as well as its dependence on the choice of associated density functional approximant.
FDE-vdW: A van der Waals inclusive subsystem density-functional theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kevorkyants, Ruslan; Pavanello, Michele, E-mail: m.pavanello@rutgers.edu; Eshuis, Henk
2014-07-28
We present a formally exact van der Waals inclusive electronic structure theory, called FDE-vdW, based on the Frozen Density Embedding formulation of subsystem Density-Functional Theory. In subsystem DFT, the energy functional is composed of subsystem additive and non-additive terms. We show that an appropriate definition of the long-range correlation energy is given by the value of the non-additive correlation functional. This functional is evaluated using the fluctuation-dissipation theorem aided by a formally exact decomposition of the response functions into subsystem contributions. FDE-vdW is derived in detail and several approximate schemes are proposed, which lead to practical implementations of the method. We show that FDE-vdW is Casimir-Polder consistent, i.e., it reduces to the generalized Casimir-Polder formula for asymptotic inter-subsystems separations. Pilot calculations of binding energies of 13 weakly bound complexes singled out from the S22 set show a dramatic improvement upon semilocal subsystem DFT, provided that an appropriate exchange functional is employed. The convergence of FDE-vdW with basis set size is discussed, as well as its dependence on the choice of associated density functional approximant.
NASA Astrophysics Data System (ADS)
Andrade-Ines, Eduardo; Robutel, Philippe
2018-01-01
We present an analytical formalism to study the secular dynamics of a system consisting of N-2 planets orbiting a binary star in outer orbits. We introduce a canonical coordinate system and expand the disturbing function in terms of canonical elliptic elements, combining both Legendre polynomials and Laplace coefficients, to obtain a general formalism for the secular description of this type of configuration. With a quadratic approximation of the development, we present a simplified analytical solution for the planetary orbits for both the single planet and the two-planet cases. From the two-planet model, we show that the inner planet accelerates the precession rate of the binary pericenter, which, in turn, may enter in resonance with the secular frequency of the outer planet, characterizing a secular resonance. We calculate an analytical expression for the approximate location of this resonance and apply it to known circumbinary systems, where we show that it can occur at relatively close orbits, for example at 2.4 au for the Kepler-38 system. With a more refined model, we analyse the dynamics of this secular resonance and we show that a bifurcation of the corresponding fixed points can affect the long-term evolution and stability of planetary systems. By comparing our results with complete integrations of the exact equations of motion, we verified the accuracy of our analytical model.
Relations between nonlinear Riccati equations and other equations in fundamental physics
NASA Astrophysics Data System (ADS)
Schuch, Dieter
2014-10-01
Many phenomena in the observable macroscopic world obey nonlinear evolution equations while the microscopic world is governed by quantum mechanics, a fundamental theory that is supposedly linear. In order to combine these two worlds in a common formalism, at least one of them must sacrifice one of its dogmas. Linearizing nonlinear dynamics would destroy the fundamental property of this theory; however, it can be shown that quantum mechanics can be reformulated in terms of nonlinear Riccati equations. In a first step, it will be shown that the information about the dynamics of quantum systems with analytical solutions can be obtained not only from the time-dependent Schrödinger equation but equally well from a complex Riccati equation. Comparison with supersymmetric quantum mechanics shows that even additional information can be obtained from the nonlinear formulation. Furthermore, the time-independent Schrödinger equation can also be rewritten as a complex Riccati equation for any potential. Extension of the Riccati formulation to include irreversible dissipative effects is straightforward. Via (real and complex) Riccati equations, other fields of physics can also be treated within the same formalism, e.g., statistical thermodynamics, nonlinear dynamical systems like those obeying a logistic equation as well as wave equations in classical optics, Bose-Einstein condensates and cosmological models. Finally, the link to abstract "quantizations" such as the Pythagorean triples and Riccati equations connected with trigonometric and hyperbolic functions will be shown.
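As a concrete instance of the time-independent case mentioned in the abstract, the standard logarithmic-derivative substitution turns the one-dimensional Schrödinger equation into a Riccati equation (a textbook manipulation, not the paper's full complex formulation):

```latex
-\frac{\hbar^{2}}{2m}\,\psi''(x) + V(x)\,\psi(x) = E\,\psi(x),
\qquad
y(x) \equiv \frac{\psi'(x)}{\psi(x)}
\;\Longrightarrow\;
y'(x) + y(x)^{2} = \frac{2m}{\hbar^{2}}\bigl(V(x) - E\bigr).
```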
Structures de l'apprentissage dans les pays de l'Europe Occidentale
NASA Astrophysics Data System (ADS)
Lengrand, Paul
1982-06-01
In West European countries, as in most modern societies, learning can be divided into three sectors. Informal education occupies the greatest space, both because of its duration and because it extends into every part of life. It happens in many and various ways and circumstances — from learning the language and socialisation in the family environment to the experiences of retirement and the third age. A large number of factors are involved, particularly married life, family responsibilities, work, the influence of mass media and participation in political activities. It is also the area of self-education. Formal education, dispensed by schools and universities, corrects and guides what is learned in the informal sector. It provides part of the necessary learning in the fields of the arts and of the sciences. However, it falls short so far as the evolution of ideas, morals and social behaviour are concerned. It also only imperfectly fulfils its function of democratization, and because of its concentration on matters intellectual, it does not promote the development of the diverse capacities of the whole person. Nonformal education resembles informal education in that it relates to life, and formal education in its structured character. It is the domain of educational innovation, especially in the context of social relationships. Through the harmonious combination of these three sectors of learning, the principles of a global and integrated lifelong education can be implemented.
NASA Astrophysics Data System (ADS)
Mégnin, Charles; Romanowicz, Barbara
1999-08-01
Most global tomographic models to date are derived using a combination of surface wave (or normal-mode) data and body wave traveltime data. The traveltime approach limits the number of phases available for inversion by requiring them to be isolated on the seismogram. This may ultimately result in limiting the resolution of 3-D structure, at least in some depth ranges in the mantle. In a previous study, we successfully derived a degree 12 whole-mantle SH-velocity tomographic model (SAW12D) using exclusively waveform data. In that inversion, a normal-mode formalism suitable for body waveforms, the non-linear asymptotic coupling theory (NACT), was combined with a body wave windowing scheme, referred to as the `individual wavepacket' (IW) technique, which allows one to assign individual weights to different body wave energy packets. We here compare the relative merits of this choice of theoretical formalism and windowing scheme at different depth ranges in the mantle. Choosing as the reference a model obtained using 7500 transverse-component body wave and 8000 surface wave seismograms and the NACT and IW approaches, we discuss the relative performance of the path average approximation (PAVA), a zeroth-order theoretical approximation appropriate for single-mode surface waves, relative to NACT, and compare the IW windowing scheme with a more standard `full window' (FW) approach, in which a single time window is considered from the first body wave arrival to the fundamental-mode surface waves. The combination PAVA/FW is often used in global tomography to supplement the traveltime data. We show that although the quality of the image derived under the PAVA/FW formalism is very similar to that derived under NACT/IW in the first 300 km of the upper mantle, where the resolution is dominated by surface waves, it deteriorates at greater depths. Images of the lower mantle are shown to be strongly sensitive to the theoretical formalism. In contrast, the resolution of structure near the core-mantle boundary depends mostly on the windowing scheme. This is because this resolution is controlled by low-amplitude phases such as S_diff, which are downweighted in the FW scheme. Whilst the image obtained in D'' using the combination NACT/IW is in good agreement with images obtained by other authors using both waveforms and traveltimes, we show that, when using FW, uppermost mantle structure can be mapped into D''. This result is confirmed by synthetic tests performed on a composite of the upper-mantle geodynamic model 3SMAC. We also show, based on synthetic tests, that for structures in the upper mantle with sharp boundaries, differences are observed between NACT and PAVA. Whilst a combination of traveltimes and surface wave data is adequate for resolving relatively smooth features in the mantle, our results show that by potentially increasing the achievable sampling, the waveform approach shows great promise for future high-resolution tomographic modelling of mantle structure, if cast in an appropriate theoretical framework.
The transition to formal thinking in mathematics
NASA Astrophysics Data System (ADS)
Tall, David
2008-09-01
This paper focuses on the changes in thinking involved in the transition from school mathematics to formal proof in pure mathematics at university. School mathematics is seen as a combination of visual representations, including geometry and graphs, together with symbolic calculations and manipulations. Pure mathematics in university shifts towards a formal framework of axiomatic systems and mathematical proof. In this paper, the transition in thinking is formulated within a framework of `three worlds of mathematics'- the `conceptual-embodied' world based on perception, action and thought experiment, the `proceptual-symbolic' world of calculation and algebraic manipulation compressing processes such as counting into concepts such as number, and the `axiomatic-formal' world of set-theoretic concept definitions and mathematical proof. Each `world' has its own sequence of development and its own forms of proof that may be blended together to give a rich variety of ways of thinking mathematically. This reveals mathematical thinking as a blend of differing knowledge structures; for instance, the real numbers blend together the embodied number line, symbolic decimal arithmetic and the formal theory of a complete ordered field. Theoretical constructs are introduced to describe how genetic structures set before birth enable the development of mathematical thinking, and how experiences that the individual has met before affect their personal growth. These constructs are used to consider how students negotiate the transition from school to university mathematics as embodiment and symbolism are blended with formalism. At a higher level, structure theorems proved in axiomatic theories link back to more sophisticated forms of embodiment and symbolism, revealing the intimate relationship between the three worlds.
Formal thought disorder in people at ultra-high risk of psychosis
Weinstein, Sara; Stahl, Daniel; Day, Fern; Valmaggia, Lucia; Rutigliano, Grazia; De Micheli, Andrea; Fusar-Poli, Paolo; McGuire, Philip
2017-01-01
Background Formal thought disorder is a cardinal feature of psychosis. However, the extent to which formal thought disorder is evident in ultra-high-risk individuals and whether it is linked to the progression to psychosis remains unclear. Aims Examine the severity of formal thought disorder in ultra-high-risk participants and its association with future psychosis. Method The Thought and Language Index (TLI) was used to assess 24 ultra-high-risk participants, 16 people with first-episode psychosis and 13 healthy controls. Ultra-high-risk individuals were followed up for a mean duration of 7 years (s.d.=1.5) to determine the relationship between formal thought disorder at baseline and transition to psychosis. Results TLI scores were significantly greater in the ultra-high-risk group compared with the healthy control group (effect size (ES)=1.2), but lower than in people with first-episode psychosis (ES=0.8). Total and negative TLI scores were higher in ultra-high-risk individuals who developed psychosis, but this was not significant. Combining negative TLI scores with attenuated psychotic symptoms and basic symptoms predicted transition to psychosis (P=0.04; ES=1.04). Conclusions TLI is beneficial in evaluating formal thought disorder in ultra-high-risk participants, and complements existing instruments for the evaluation of psychopathology in this group. PMID:28713586
Who cares? A comparison of informal and formal care provision in Spain, England and the USA
SOLÉ-AURÓ, AÏDA; CRIMMINS, EILEEN M.
2013-01-01
This paper investigates the prevalence of incapacity in performing daily activities and the associations between household composition and availability of family members and receipt of care among older adults with functioning problems in Spain, England and the United States of America (USA). We examine how living arrangements, marital status, child availability, limitations in functioning ability, age and gender affect the probability of receiving formal care and informal care from household members and from others in three countries with different family structures, living arrangements and policies supporting care of the incapacitated. Data sources include the 2006 Survey of Health, Ageing and Retirement in Europe for Spain, the third wave of the English Longitudinal Study of Ageing (2006), and the eighth wave of the USA Health and Retirement Study (2006). Logistic and multinomial logistic regressions are used to estimate the probability of receiving care and the sources of care among persons age 50 and older. The percentage of people with functional limitations receiving care is higher in Spain. More care comes from outside the household in the USA and England than in Spain. The use of formal care among the incapacitated is lowest in the USA and highest in Spain. PMID:24550574
An Interdisciplinary Teacher Education Program.
ERIC Educational Resources Information Center
Little, Robert M.; And Others
1980-01-01
The University of Washington School of Dentistry developed a 36-month formal teacher education program in combination with joint specialty training in pedodontics and orthodontics. The rationale and structure of the original program is outlined and the reasons for its termination are discussed. (Author/MLW)
A Three-Step Approach to Veterinary Medical Education
ERIC Educational Resources Information Center
Kavanaugh, J. F.
1976-01-01
A formal education plan with two admission steps is outlined. Animal agriculture and the basic sciences are combined in a two-year middle stage. The medical education (third stage) that specifically addresses pathology and the clinical sciences encompasses three years. (Author/LBH)
The Words of Children's Television.
ERIC Educational Resources Information Center
Rice, Mabel L.
1984-01-01
Dialog features--communication flow, language structures, and meaning/content--and nonverbal formal features of six children's television programs are examined to determine if there is dialog simplification, if certain dialog characteristics differentiate among shows sampled, and if there are different combinations of linguistic features and…
NASA Astrophysics Data System (ADS)
Hoy, Erik P.; Mazziotti, David A.; Seideman, Tamar
2017-11-01
Can an electronic device be constructed using only a single molecule? Since this question was first asked by Aviram and Ratner in the 1970s [Chem. Phys. Lett. 29, 277 (1974)], the field of molecular electronics has exploded with significant experimental advancements in the understanding of the charge transport properties of single molecule devices. Efforts to explain the results of these experiments and identify promising new candidate molecules for molecular devices have led to the development of numerous new theoretical methods including the current standard theoretical approach for studying single molecule charge transport, i.e., the non-equilibrium Green's function formalism (NEGF). By pairing this formalism with density functional theory (DFT), a wide variety of transport problems in molecular junctions have been successfully treated. For some systems though, the conductance and current-voltage curves predicted by common DFT functionals can be several orders of magnitude above experimental results. In addition, since density functional theory relies on approximations to the exact exchange-correlation functional, the predicted transport properties can show significant variation depending on the functional chosen. As a first step to addressing this issue, the authors have replaced density functional theory in the NEGF formalism with a 2-electron reduced density matrix (2-RDM) method, creating a new approach known as the NEGF-RDM method. 2-RDM methods provide a more accurate description of electron correlation compared to density functional theory, and they have lower computational scaling compared to wavefunction based methods of similar accuracy. Additionally, 2-RDM methods are capable of capturing static electron correlation which is untreatable by existing NEGF-DFT methods. When studying dithiol alkane chains and dithiol benzene in model junctions, the authors found that the NEGF-RDM predicts conductances and currents that are 1-2 orders of magnitude below those of B3LYP and M06 DFT functionals. This suggests that the NEGF-RDM method could be a viable alternative to NEGF-DFT for molecular junction calculations.
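The NEGF-RDM method itself is not spelled out in the abstract; as a generic illustration of the Green's-function ingredients it shares with NEGF-DFT, the sketch below computes the Landauer transmission T(E) = Tr[Γ_L G^r Γ_R G^a] for a toy tight-binding chain with wide-band-limit leads (all parameters are arbitrary illustrative values, not those of the dithiol junctions studied in the paper):

```python
import numpy as np

def transmission(energy, H, gamma_L=0.5, gamma_R=0.5, eta=1e-9):
    """Landauer transmission T(E) = Tr[Gamma_L G^r Gamma_R G^a] for a
    tight-binding 'molecule' H coupled to wide-band-limit leads that
    attach to the first and last sites."""
    n = H.shape[0]
    sigma_L = np.zeros((n, n), dtype=complex)
    sigma_R = np.zeros((n, n), dtype=complex)
    sigma_L[0, 0] = -0.5j * gamma_L            # wide-band lead self-energies
    sigma_R[-1, -1] = -0.5j * gamma_R
    G_r = np.linalg.inv((energy + 1j * eta) * np.eye(n) - H - sigma_L - sigma_R)
    Gam_L = 1j * (sigma_L - sigma_L.conj().T)  # broadening matrices
    Gam_R = 1j * (sigma_R - sigma_R.conj().T)
    return np.trace(Gam_L @ G_r @ Gam_R @ G_r.conj().T).real

# 4-site chain, on-site energy 0 and hopping -1 (arbitrary units)
H = -1.0 * (np.eye(4, k=1) + np.eye(4, k=-1))
energies = np.linspace(-3.0, 3.0, 601)
T = [transmission(E, H) for E in energies]
print(max(T))   # peaks approach 1 near the isolated chain's eigenenergies
```

In an NEGF-DFT or NEGF-RDM calculation, the toy Hamiltonian and self-energies above are replaced by the correlated electronic-structure quantities, which is where the two approaches differ.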
An Analytical Framework for Soft and Hard Data Fusion: A Dempster-Shafer Belief Theoretic Approach
2012-08-01
fusion. Therefore, we provide a detailed discussion on uncertain data types, their origins and three uncertainty processing formalisms that are popular... suitable membership functions corresponding to the fuzzy sets. The DS belief theory, originally proposed by Dempster, can be thought of as... originated and various imperfections of the source. Uncertainty handling formalisms provide techniques for modeling and working with these uncertain data types
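Purely as an illustration of the DS belief formalism the snippets refer to (not code from the report), Dempster's rule of combination for two basic probability assignments over a small frame of discernment can be written directly:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts mapping frozenset hypotheses to mass)
    with Dempster's rule: intersect focal elements, then renormalize by 1 - K,
    where K is the mass assigned to conflicting (empty) intersections."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {h: w / (1.0 - conflict) for h, w in combined.items()}

# frame of discernment {A, B}; two partially conflicting (made-up) sources
m1 = {frozenset("A"): 0.6, frozenset("AB"): 0.4}
m2 = {frozenset("B"): 0.5, frozenset("AB"): 0.5}
print(dempster_combine(m1, m2))
```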
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoon, Peter H., E-mail: yoonp@umd.edu; School of Space Research, Kyung Hee University, Yongin, Gyeonggi 446-701
2015-09-15
A previous paper [P. H. Yoon, "Kinetic theory of turbulence for parallel propagation revisited: Formal results," Phys. Plasmas 22, 082309 (2015)] revisited the second-order nonlinear kinetic theory for turbulence propagating in directions parallel/anti-parallel to the ambient magnetic field, in which the original work according to Yoon and Fang [Phys. Plasmas 15, 122312 (2008)] was refined, following the paper by Gaelzer et al. [Phys. Plasmas 22, 032310 (2015)]. The main finding involved the dimensional correction pertaining to discrete-particle effects in Yoon and Fang's theory. However, the final result was presented in terms of formal linear and nonlinear susceptibility response functions. In the present paper, the formal equations are explicitly written down for the case of the low-to-intermediate frequency regime by making use of approximate forms for the response functions. The resulting equations are sufficiently concrete so that they can readily be solved by numerical means or analyzed by theoretical means. The derived set of equations describes nonlinear interactions of quasi-parallel modes whose frequency range covers the Alfvén wave range to the ion-cyclotron mode, but is sufficiently lower than the electron cyclotron mode. The application of the present formalism may range from the nonlinear evolution of the whistler anisotropy instability in the high-beta regime to the nonlinear interaction of electrons with whistler-range turbulence.
Specification and verification of gate-level VHDL models of synchronous and asynchronous circuits
NASA Technical Reports Server (NTRS)
Russinoff, David M.
1995-01-01
We present a mathematical definition of hardware description language (HDL) that admits a semantics-preserving translation to a subset of VHDL. Our HDL includes the basic VHDL propagation delay mechanisms and gate-level circuit descriptions. We also develop formal procedures for deriving and verifying concise behavioral specifications of combinational and sequential devices. The HDL and the specification procedures have been formally encoded in the computational logic of Boyer and Moore, which provides a LISP implementation as well as a facility for mechanical proof-checking. As an application, we design, specify, and verify a circuit that achieves asynchronous communication by means of the biphase mark protocol.
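As a software companion to the verified circuit (a toy model only, not Russinoff's HDL formalization, and assuming ideal sampling with no clock drift), the biphase mark coding discipline — a transition at every cell boundary, plus a mid-cell transition only for a 1 — can be sketched as:

```python
def biphase_mark_encode(bits, cell=4):
    """Encode bits with biphase mark coding: the signal level flips at the
    start of every cell, and flips again at mid-cell iff the bit is 1."""
    level, signal = 0, []
    for b in bits:
        level ^= 1                        # transition at the cell boundary
        signal += [level] * (cell // 2)
        if b:
            level ^= 1                    # extra mid-cell transition encodes a 1
        signal += [level] * (cell - cell // 2)
    return signal

def biphase_mark_decode(signal, cell=4):
    """Recover bits by checking for a mid-cell transition in each cell
    (idealized: assumes perfect alignment with the sender's cells)."""
    bits = []
    for i in range(0, len(signal), cell):
        bits.append(1 if signal[i] != signal[i + cell // 2] else 0)
    return bits

msg = [1, 0, 1, 1, 0, 0, 1]
assert biphase_mark_decode(biphase_mark_encode(msg)) == msg
```

The formal verification problem addressed in the paper is precisely the part this toy ignores: showing that decoding still succeeds when the receiver's clock phase and rate differ from the sender's.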
A formal language for the specification and verification of synchronous and asynchronous circuits
NASA Technical Reports Server (NTRS)
Russinoff, David M.
1993-01-01
A formal hardware description language for the intended application of verifiable asynchronous communication is described. The language is developed within the logical framework of the Nqthm system of Boyer and Moore and is based on the event-driven behavioral model of VHDL, including the basic VHDL signal propagation mechanisms, the notion of simulation deltas, and the VHDL simulation cycle. A core subset of the language corresponds closely with a subset of VHDL and is adequate for the realistic gate-level modeling of both combinational and sequential circuits. Various extensions to this subset provide means for convenient expression of behavioral circuit specifications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collins, J.; Gamberg, L.; Prokudin, A.
We construct an improved implementation for combining transverse-momentum-dependent (TMD) factorization and collinear factorization. TMD factorization is suitable for low transverse momentum physics, while collinear factorization is suitable for high transverse momenta and for a cross section integrated over transverse momentum. The result is a modified version of the standard W + Y prescription traditionally used in the Collins-Soper-Sterman (CSS) formalism and related approaches. We further argue that questions regarding the shape and Q-dependence of the cross sections at lower Q are largely governed by the matching to the Y-term.
Collins, J.; Gamberg, L.; Prokudin, A.; ...
2016-08-08
We construct an improved implementation for combining transverse-momentum-dependent (TMD) factorization and collinear factorization. TMD factorization is suitable for low transverse momentum physics, while collinear factorization is suitable for high transverse momenta and for a cross section integrated over transverse momentum. The result is a modified version of the standard W + Y prescription traditionally used in the Collins-Soper-Sterman (CSS) formalism and related approaches. We further argue that questions regarding the shape and Q-dependence of the cross sections at lower Q are largely governed by the matching to the Y-term.
Green's function multiple-scattering theory with a truncated basis set: An augmented-KKR formalism
NASA Astrophysics Data System (ADS)
Alam, Aftab; Khan, Suffian N.; Smirnov, A. V.; Nicholson, D. M.; Johnson, Duane D.
2014-11-01
The Korringa-Kohn-Rostoker (KKR) Green's function, multiple-scattering theory is an efficient site-centered, electronic-structure technique for addressing an assembly of N scatterers. Wave functions are expanded in a spherical-wave basis on each scattering center and indexed up to a maximum orbital and azimuthal number L_max = (l,m)_max, while scattering matrices, which determine spectral properties, are truncated at L_tr = (l,m)_tr, where phase shifts δ_l for l > l_tr are negligible. Historically, L_max is set equal to L_tr, which is correct for large enough L_max but not computationally expedient; a better procedure retains higher-order (free-electron and single-site) contributions for L_max > L_tr with δ_l (l > l_tr) set to zero [X.-G. Zhang and W. H. Butler, Phys. Rev. B 46, 7433 (1992), 10.1103/PhysRevB.46.7433]. We present a numerically efficient and accurate augmented-KKR Green's function formalism that solves the KKR equations by exact matrix inversion [an R^3 process with rank N(l_tr+1)^2] and includes higher-L contributions via linear algebra [an R^2 process with rank N(l_max+1)^2]. The augmented-KKR approach yields properly normalized wave functions, numerically cheaper basis-set convergence, and a total charge density and electron count that agrees with Lloyd's formula. We apply our formalism to fcc Cu, bcc Fe, and L1_0 CoPt and present the numerical results for accuracy and for the convergence of the total energies, Fermi energies, and magnetic moments versus L_max for a given L_tr.
Modeling Structure-Function Relationships in Synthetic DNA Sequences using Attribute Grammars
Cai, Yizhi; Lux, Matthew W.; Adam, Laura; Peccoud, Jean
2009-01-01
Recognizing that certain biological functions can be associated with specific DNA sequences has led various fields of biology to adopt the notion of the genetic part. This concept provides a finer level of granularity than the traditional notion of the gene. However, a method of formally relating how a set of parts relates to a function has not yet emerged. Synthetic biology both demands such a formalism and provides an ideal setting for testing hypotheses about relationships between DNA sequences and phenotypes beyond the gene-centric methods used in genetics. Attribute grammars are used in computer science to translate the text of a program source code into the computational operations it represents. By associating attributes with parts, modifying the value of these attributes using rules that describe the structure of DNA sequences, and using a multi-pass compilation process, it is possible to translate DNA sequences into molecular interaction network models. These capabilities are illustrated by simple example grammars expressing how gene expression rates are dependent upon single or multiple parts. The translation process is validated by systematically generating, translating, and simulating the phenotype of all the sequences in the design space generated by a small library of genetic parts. Attribute grammars represent a flexible framework connecting parts with models of biological function. They will be instrumental for building mathematical models of libraries of genetic constructs synthesized to characterize the function of genetic parts. This formalism is also expected to provide a solid foundation for the development of computer assisted design applications for synthetic biology. PMID:19816554
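As a loose, hypothetical illustration of the parts-with-attributes idea (not the authors' attribute grammar or its multi-pass compiler; the part names and strengths below are made up for the example), attributes attached to parts can be combined by a composition rule into a derived expression rate:

```python
from dataclasses import dataclass

@dataclass
class Part:
    """A genetic part carrying attributes used by composition rules."""
    name: str
    kind: str            # 'promoter', 'rbs', 'cds', or 'terminator'
    strength: float = 0.0

def expression_rate(cassette):
    """Toy synthesized attribute: transcription (promoter strength) times
    translation (RBS efficiency), defined only for a well-formed cassette."""
    kinds = [p.kind for p in cassette]
    if kinds != ['promoter', 'rbs', 'cds', 'terminator']:
        raise ValueError("not a well-formed expression cassette")
    promoter, rbs = cassette[0], cassette[1]
    return promoter.strength * rbs.strength

cassette = [Part("pTac", "promoter", 0.8), Part("B0034", "rbs", 0.9),
            Part("gfp", "cds"), Part("B0015", "terminator")]
print(expression_rate(cassette))   # 0.72 (arbitrary units)
```

A real attribute grammar generalizes this in two ways the toy omits: the well-formedness check becomes the grammar's production rules, and the computed attributes feed a molecular interaction network model rather than a single number.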
Green's function multiple-scattering theory with a truncated basis set: An augmented-KKR formalism
Alam, Aftab; Khan, Suffian N.; Smirnov, A. V.; ...
2014-11-04
The Korringa-Kohn-Rostoker (KKR) Green's function, multiple-scattering theory is an efficient site-centered, electronic-structure technique for addressing an assembly of N scatterers. Wave functions are expanded in a spherical-wave basis on each scattering center and indexed up to a maximum orbital and azimuthal number L_max = (l,m)_max, while scattering matrices, which determine spectral properties, are truncated at L_tr = (l,m)_tr, where phase shifts δ_l for l > l_tr are negligible. Historically, L_max is set equal to L_tr, which is correct for large enough L_max but not computationally expedient; a better procedure retains higher-order (free-electron and single-site) contributions for L_max > L_tr with δ_l (l > l_tr) set to zero [Zhang and Butler, Phys. Rev. B 46, 7433]. We present a numerically efficient and accurate augmented-KKR Green's function formalism that solves the KKR equations by exact matrix inversion [an R^3 process with rank N(l_tr + 1)^2] and includes higher-L contributions via linear algebra [an R^2 process with rank N(l_max + 1)^2]. The augmented-KKR approach yields properly normalized wave functions, numerically cheaper basis-set convergence, and a total charge density and electron count that agrees with Lloyd's formula. We apply our formalism to fcc Cu, bcc Fe, and L1_0 CoPt, and present the numerical results for accuracy and for the convergence of the total energies, Fermi energies, and magnetic moments versus L_max for a given L_tr.
Governance Change and Institutional Adaptation: A Case Study from Harenna Forest, Ethiopia
NASA Astrophysics Data System (ADS)
Wakjira, Dereje T.; Fischer, Anke; Pinard, Michelle A.
2013-04-01
Many common pool resources have traditionally been managed through intricate local governance arrangements. Over time, such arrangements are confronted with manifold political, social, economic and ecological changes. However, the ways in which local governance arrangements react to such changes are poorly understood. Using the theoretical concept of institutional adaptation, we analyse the history of Harenna forest, Ethiopia, to examine processes of institutional change over the last 150 years. We find that the traditional institutions that governed Harenna's resources persisted, in essence, over time. However, these institutions were modified repeatedly to address changes caused by varying formal, supra-regional governance regimes, the development of markets for forest products, increasing population pressure and changes in formal property rights. A key mechanism for adaptation was combining elements from both informal and formal institutions, which allowed traditional rules to persist in the guise of more formal arrangements. Our findings also highlight several constraints of institutional adaptation. For example, by abolishing fora for collective decision-making, regime changes limited adaptive capacity. To conclude, we argue that such insights into traditional resource governance and its adaptability and dynamics over time are essential to develop sustainable approaches to participatory forest management for the future, both in Harenna and more generally.
Arts, Remo A G J; George, Erwin L J; Janssen, Miranda A M L; Griessner, Andreas; Zierhofer, Clemens; Stokroos, Robert J
2018-06-01
Previous studies show that intracochlear electrical stimulation independent of environmental sounds appears to suppress tinnitus, even long-term. In order to assess the viability of this potential treatment option it is essential to study the effects of this tinnitus specific electrical stimulation on speech perception. A randomised, prospective crossover design. Ten patients with unilateral or asymmetric hearing loss and severe tinnitus complaints. The audiological effects of standard clinical CI, formal auditory training and tinnitus specific electrical stimulation were investigated. Results show that standard clinical CI in unilateral or asymmetric hearing loss is shown to be beneficial for speech perception in quiet, speech perception in noise and subjective hearing ability. Formal auditory training does not appear to improve speech perception performance. However, CI-related discomfort reduces significantly more rapidly during CI rehabilitation in subjects receiving formal auditory training. Furthermore, tinnitus specific electrical stimulation has neither positive nor negative effects on speech perception. In combination with the findings from previous studies on tinnitus suppression using intracochlear electrical stimulation independent of environmental sounds, the results of this study contribute to the viability of cochlear implantation based on tinnitus complaints.
d'Alessio, M. A.; Williams, C.F.
2007-01-01
A suite of new techniques in thermochronometry allows analysis of the thermal history of a sample over a broad range of temperature sensitivities. New analysis tools must be developed that fully and formally integrate these techniques, allowing a single geologic interpretation of the rate and timing of exhumation and burial events consistent with all data. We integrate a thermal model of burial and exhumation, (U-Th)/He age modeling, and fission track age and length modeling. We then use a genetic algorithm to efficiently explore possible time-exhumation histories of a vertical sample profile (such as a borehole), simultaneously solving for exhumation and burial rates as well as changes in background heat flow. We formally combine all data in a rigorous statistical fashion. By parameterizing the model in terms of exhumation rather than time-temperature paths (as traditionally done in fission track modeling), we can ensure that exhumation histories result in a sedimentary basin whose thickness is consistent with the observed basin, a physically based constraint that eliminates otherwise acceptable thermal histories. We apply the technique to heat flow and thermochronometry data from the 2.1-km-deep San Andreas Fault Observatory at Depth pilot hole near the San Andreas fault, California. We find that the site experienced <1 km of exhumation or burial since the onset of San Andreas fault activity ~30 Ma.
Uncertainty principle in loop quantum cosmology by Moyal formalism
NASA Astrophysics Data System (ADS)
Perlov, Leonid
2018-03-01
In this paper, we derive the uncertainty principle for the loop quantum cosmology homogeneous and isotropic Friedmann-Lemaître-Robertson-Walker model with the holonomy-flux algebra. The uncertainty principle is between the variable c, with the meaning of connection, and μ, having the meaning of the physical cell volume to the power 2/3, i.e., v^(2/3), or a plaquette area. Since both μ and c are not operators, but rather random variables, the Robertson uncertainty principle derivation that works for Hermitian operators cannot be used. Instead we use the Wigner-Moyal-Groenewold phase space formalism. The Wigner-Moyal-Groenewold formalism was originally applied to the Heisenberg algebra of quantum mechanics. One can derive it from both the canonical and path integral quantum mechanics, as well as the uncertainty principle. In this paper, we apply it to the holonomy-flux algebra in the case of homogeneous and isotropic space. Another result is the expression for the Wigner function on the space of cylindrical wave functions defined on R_b in c variables rather than in dual-space μ variables.
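For reference, the ordinary quantum-mechanical Wigner function — the object whose holonomy-flux analogue is constructed in the paper — is defined by the standard transform:

```latex
W(x,p) \;=\; \frac{1}{\pi\hbar}\int_{-\infty}^{\infty}
\psi^{*}(x+y)\,\psi(x-y)\,e^{2ipy/\hbar}\,dy ,
```

with expectation values obtained as phase-space averages against W in the Moyal formalism.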
NASA Technical Reports Server (NTRS)
Moore, J. Strother
1992-01-01
In this paper we present a formal model of asynchronous communication as a function in the Boyer-Moore logic. The function transforms the signal stream generated by one processor into the signal stream consumed by an independently clocked processor. This transformation 'blurs' edges and 'dilates' time due to differences in the phases and rates of the two clocks and the communications delay. The model can be used quantitatively to derive concrete performance bounds on asynchronous communications at ISO protocol level 1 (physical level). We develop part of the reusable formal theory that permits the convenient application of the model. We use the theory to show that a biphase mark protocol can be used to send messages of arbitrary length between two asynchronous processors. We study two versions of the protocol, a conventional one which uses cells of size 32 cycles and an unconventional one which uses cells of size 18. We conjecture that the protocol can be proved to work under our model for smaller cell sizes and more divergent clock rates but the proofs would be harder.
Computational logic: its origins and applications
2018-01-01
Computational logic is the use of computers to establish facts in a logical formalism. Originating in nineteenth century attempts to understand the nature of mathematical reasoning, the subject now comprises a wide variety of formalisms, techniques and technologies. One strand of work follows the ‘logic for computable functions (LCF) approach’ pioneered by Robin Milner, where proofs can be constructed interactively or with the help of users’ code (which does not compromise correctness). A refinement of LCF, called Isabelle, retains these advantages while providing flexibility in the choice of logical formalism and much stronger automation. The main application of these techniques has been to prove the correctness of hardware and software systems, but increasingly researchers have been applying them to mathematics itself. PMID:29507522
Reformulating the Schrödinger equation as a Shabat-Zakharov system
NASA Astrophysics Data System (ADS)
Boonserm, Petarpa; Visser, Matt
2010-02-01
We reformulate the second-order Schrödinger equation as a set of two coupled first-order differential equations, a so-called "Shabat-Zakharov system" (sometimes called a "Zakharov-Shabat" system). There is considerable flexibility in this approach, and we emphasize the utility of introducing an "auxiliary condition" or "gauge condition" that is used to cut down the degrees of freedom. Using this formalism, we derive the explicit (but formal) general solution to the Schrödinger equation. The general solution depends on three arbitrarily chosen functions, and a path-ordered exponential matrix. If one considers path ordering to be an "elementary" process, then this represents complete quadrature, albeit formal, of the second-order linear ordinary differential equation.
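One simple gauge choice (a constant reference momentum k_0 and the standard variation-of-parameters ansatz; the paper allows a more general auxiliary function) turns ψ'' + k(x)²ψ = 0 into a coupled first-order system for the amplitudes a and b:

```latex
\psi(x) = a(x)\,e^{ik_0 x} + b(x)\,e^{-ik_0 x},
\qquad
a'(x)\,e^{ik_0 x} + b'(x)\,e^{-ik_0 x} = 0
\;\Longrightarrow\;
\begin{aligned}
a'(x) &= \phantom{-}\frac{i\,[\,k(x)^{2}-k_0^{2}\,]}{2k_0}\,
          \bigl[a(x) + b(x)\,e^{-2ik_0 x}\bigr],\\
b'(x) &= -\frac{i\,[\,k(x)^{2}-k_0^{2}\,]}{2k_0}\,
          \bigl[a(x)\,e^{2ik_0 x} + b(x)\bigr].
\end{aligned}
```

Iterating this system formally produces the path-ordered exponential mentioned in the abstract.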
Relationship of executive function and educational status with functional balance in older adults.
Voos, Mariana Callil; Custódio, Elaine Bazilio; Malaquias, Joel
2011-01-01
The Berg Balance Scale (BBS) is frequently used to assess functional balance in older adults. The relationship of executive function and level of education with the BBS performance has not been described. The aim of this study was to determine whether (1) the performance on a task requiring executive function (part B of the Trail Making Test, TMT-B) influences results of motor and cognitive tests and (2) the number of years of formal education could be related to performance on BBS in older adults. We also explored whether there would be differences, based on performance on TMT-B (high vs low) in motor function (BBS, the timed up and go [TUG]) or cognitive function (TMT-A and TMTDELTA), the Mini Mental State Examination (MMSE), as well as years of education. Participants included 101 older adults (age range, 60-80 years) residing in São Paulo, Brazil. Functional balance was assessed using BBS and TUG. Executive function was assessed using the TMT and MMSE. Educational status was determined by self-report of participant's total number of years of formal education. The BBS scores were inversely related to TMT-A time (r = -0.63, r² = 0.40, P < .001) and TMT-B time (r = -0.56, r² = 0.31, P < .001). There was a similar relationship with TMTDELTA (r = -0.47, r² = 0.22, P < .001). The BBS scores were positively correlated to years of formal education (r = 0.48, r² = 0.23, P < .001). There was a ceiling effect on the TMT-B, with many individuals reaching the maximum score of 300 seconds. Participants with high levels of executive function had higher BBS and MMSE scores, more education, and lower TMT-A, TMTDELTA and TUG scores (P < .001) than the lower functioning group. Individuals with higher capacity on tasks requiring visuospatial abilities, psychomotor speed, and executive function, such as the TMT, had better performance on BBS. Individuals with a high executive function, measured by TMT-B, also performed better on other motor and cognitive tests.
Våge, Selina; Pree, Bernadette; Thingstad, T Frede
2016-11-01
For more than 25 years, virus-to-bacteria ratios (VBR) have been measured and interpreted as indicators of the importance of viruses in aquatic ecosystems, yet a generally accepted theory for understanding mechanisms controlling VBR is still lacking. Assuming that the denominator (total bacterial abundance) is primarily predator controlled, while viral lysis compensates for host growth rates exceeding this grazing loss, the numerator (viral abundance) reflects activity differences between prokaryotic hosts. VBR is then a ratio between mechanisms generating structure within the bacterial community and interactions between different plankton functional types controlling bacterial community size. We here show how these arguments can be formalized by combining a recently published model for co-evolutionary host-virus interactions, with a previously published "minimum" model for the microbial food web. The result is a framework where viral lysis links bacterial diversity to microbial food web structure and function, creating relationships between different levels of organization that are strongly modified by organism-level properties such as cost of resistance. © 2016 The Authors. Environmental Microbiology Reports published by Society for Applied Microbiology and John Wiley & Sons Ltd.
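A deliberately minimal, hedged sketch of the kind of food-web bookkeeping described above (this is not the published co-evolutionary model; the structure, parameter values, and the function names vbr_rhs/run are illustrative assumptions only) couples a single bacterial host to a virus and a grazer and tracks the resulting virus-to-bacteria ratio:

# Toy chemostat-style model: one bacterial host B, one virus V, one grazer F,
# and a shared resource R.  Illustrative only -- not the published model.
import numpy as np
from scipy.integrate import solve_ivp

def vbr_rhs(t, y, mu_max=1.0, K=1.0, beta=1e-2, burst=20.0,
            g=5e-3, eff=0.3, mV=0.1, mF=0.05, D=0.1, R0=10.0, Y=1.0):
    R, B, V, F = y
    mu = mu_max * R / (K + R)                  # Monod growth on the resource
    dR = D * (R0 - R) - mu * B / Y             # supply minus consumption
    dB = mu * B - beta * V * B - g * F * B     # growth, viral lysis, grazing
    dV = burst * beta * V * B - mV * V         # lytic production and decay
    dF = eff * g * F * B - mF * F              # grazer growth and mortality
    return [dR, dB, dV, dF]

def run(t_end=500.0):
    sol = solve_ivp(vbr_rhs, (0.0, t_end), [10.0, 1.0, 10.0, 0.1], rtol=1e-8)
    R, B, V, F = sol.y[:, -1]
    print(f"final bacteria = {B:.3g}, viruses = {V:.3g}, VBR = {V / B:.3g}")

if __name__ == "__main__":
    run()

In this toy setting the grazer caps total bacterial abundance (the denominator of VBR) while viral production reflects host activity (the numerator), mirroring the argument of the abstract.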
High-order nonlinear susceptibilities of He
NASA Astrophysics Data System (ADS)
Liu, W.-C.; Clark, Charles W.
1996-05-01
High-order nonlinear optical response of noble gases to intense laser radiation is of considerable experimental interest, but is difficult to measure or calculate accurately. We have begun a set of calculations of frequency-dependent nonlinear susceptibilities of He 1s^2, within the framework of Rayleigh-Schrödinger perturbation theory at lowest applicable order, with the goal of providing critically evaluated atomic data for modelling high harmonic generation processes. The atomic Hamiltonian is decomposed in terms of Hylleraas coordinates and spherical harmonics using the formalism of Pont and Shakeshaft (M. Pont and R. Shakeshaft, Phys. Rev. A 51, 257 (1995)), and the hierarchy of inhomogeneous equations of perturbation theory is solved iteratively. A combination of Hylleraas and Frankowski basis functions is used (J. D. Baker, Master's thesis, U. Delaware (1988); J. D. Baker, R. N. Hill, and J. D. Morgan, AIP Conference Proceedings 189, 123 (1989)); the compact Hylleraas basis provides a highly accurate representation of the ground state wavefunction, whereas the diffuse Frankowski basis functions efficiently reproduce the correct asymptotic structure of the perturbed orbitals.
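The hierarchy referred to above is, in its generic static Rayleigh–Schrödinger form (sketched here from standard perturbation theory; the frequency-dependent case used in the calculation additionally shifts the left-hand side by integer multiples of the photon energy),

\[
(H_0 - E_0)\,|\psi^{(n)}\rangle = \bigl(E^{(1)} - V\bigr)\,|\psi^{(n-1)}\rangle + \sum_{k=2}^{n} E^{(k)}\,|\psi^{(n-k)}\rangle ,
\qquad
E^{(n)} = \langle \psi^{(0)} | V | \psi^{(n-1)}\rangle ,
\]

solved order by order for the wavefunction corrections \(|\psi^{(n)}\rangle\), from which the nonlinear susceptibilities follow.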
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiang, Zhuoling (Centre for Nanoscale Science and Technology, Academy for Advanced Interdisciplinary Studies, Peking University, Beijing 100871); Wang, Hao
The atomic structure and electronic transport properties of a single hydrogen molecule connected to both symmetric and asymmetric Cu electrodes are investigated by using the non-equilibrium Green's function formalism combined with density functional theory. Our calculations show that in symmetric Cu–H2–Cu junctions, the low-bias conductance drops rapidly upon stretching, while asymmetric ones present a low-bias conductance spanning the 0.2–0.3 G0 interval for a wide range of electrode separations. This is in good agreement with experiments on Cu atomic contacts in a hydrogen environment. Furthermore, the distribution of the calculated vibrational energies of the two hydrogen atoms in the asymmetric Cu–H2–Cu junction is also consistent with experiments. These findings provide clear evidence for the formation of asymmetric Cu–H2–Cu molecular junctions in breaking Cu atomic contacts in the presence of hydrogen and are also helpful for the design of molecular devices with Cu electrodes.
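The transmission calculation behind such conductance numbers can be illustrated with a minimal NEGF toy model (a sketch only: a one-dimensional tight-binding chain rather than DFT-derived Hamiltonians, and the numerical parameters below are arbitrary assumptions):

# Minimal Landauer/NEGF sketch: T(E) = Tr[Gamma_L G Gamma_R G^+] for a small
# tight-binding "device" between two semi-infinite 1D leads.  Toy model only;
# real calculations use DFT Hamiltonians, not these numbers.
import numpy as np

def surface_gf(E, eps0=0.0, t=1.0, eta=1e-8):
    """Retarded surface Green's function of a semi-infinite 1D chain
    (closed form; keep the decaying, smaller-magnitude root)."""
    z = E + 1j * eta - eps0
    s = np.sqrt(z * z - 4.0 * t * t)
    g1 = (z - s) / (2.0 * t * t)
    g2 = (z + s) / (2.0 * t * t)
    return g1 if abs(t * g1) <= abs(t * g2) else g2

def transmission(E, H_dev, t_lead=1.0, t_coupling=0.8, eta=1e-8):
    n = H_dev.shape[0]
    g_s = surface_gf(E, t=t_lead, eta=eta)
    Sigma_L = np.zeros((n, n), dtype=complex)
    Sigma_R = np.zeros((n, n), dtype=complex)
    Sigma_L[0, 0] = t_coupling**2 * g_s        # left lead attaches to site 0
    Sigma_R[-1, -1] = t_coupling**2 * g_s      # right lead attaches to last site
    Gamma_L = 1j * (Sigma_L - Sigma_L.conj().T)
    Gamma_R = 1j * (Sigma_R - Sigma_R.conj().T)
    G = np.linalg.inv((E + 1j * eta) * np.eye(n) - H_dev - Sigma_L - Sigma_R)
    return np.trace(Gamma_L @ G @ Gamma_R @ G.conj().T).real

if __name__ == "__main__":
    # Two-site "molecule" with onsite energy 0 and internal hopping 0.5 (arbitrary).
    H_dev = np.array([[0.0, 0.5],
                      [0.5, 0.0]])
    for E in (-0.5, 0.0, 0.5):
        print(f"E = {E:+.2f}  T(E) = {transmission(E, H_dev):.4f}")

The low-bias conductance is then T(E_F) in units of G0, which is how numbers such as the 0.2-0.3 G0 plateau above are read off.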
Wang, Xiaoming; Zebarjadi, Mona; Esfarjani, Keivan
2018-06-18
Two-dimensional (2D) van der Waals heterostructures (vdWHs) have shown multiple functionalities with great potential in electronics and photovoltaics. Here, we show their potential for solid-state thermionic energy conversion and demonstrate a design strategy towards high-performance devices. We propose two promising thermionic devices, namely, the p-type Pt-G-WSe2-G-Pt and the n-type Sc-WSe2-MoSe2-WSe2-Sc. We characterize the thermionic energy conversion performance of the latter using first-principles GW calculations combined with real-space Green's function (GF) formalism. The optimal barrier height and high thermal resistance lead to an excellent performance. The proposed device is found to have a room-temperature equivalent figure of merit of 1.2, which increases to 3 above 600 K. A high performance with cooling efficiency over 30% of the Carnot efficiency above 450 K is achieved. Our design and characterization method can be used to pursue other potential thermionic devices based on vdWHs.
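A hedged sketch of the transport expressions that typically underlie such device-level numbers (standard Landauer-type formulas, not necessarily the exact working equations of the paper):

\[
I = \frac{2e}{h} \int dE\; \mathcal{T}(E)\,\bigl[f_{\mathrm{hot}}(E) - f_{\mathrm{cold}}(E)\bigr],
\qquad
\dot{Q} = \frac{2}{h} \int dE\;(E - \mu)\,\mathcal{T}(E)\,\bigl[f_{\mathrm{hot}}(E) - f_{\mathrm{cold}}(E)\bigr],
\]

where \(\mathcal{T}(E)\) is the transmission through the heterostructure; the cooling efficiency follows from the ratio of extracted heat to input electrical power, and the barrier height sets the energy window over which \(\mathcal{T}(E)\) contributes.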
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiang, Zhuoling; Wang, Hao; Sanvito, Stefano
Inelastic electron tunneling spectroscopy (IETS) of a single hydrogen atom on the Cu(100) surface in a scanning tunneling microscopy (STM) configuration has been investigated by employing the non-equilibrium Green's function formalism combined with density functional theory. The electron-vibration interaction is treated at the level of the lowest-order expansion. Our calculations show that the single peak observed in the previous STM-IETS experiments is dominated by the perpendicular mode of the adsorbed H atom, while the parallel one only makes a negligible contribution even when the STM tip is laterally displaced from the top position of the H atom. This propensity of the IETS is deeply rooted in the symmetry of the vibrational modes and the characteristics of the conduction channel of the Cu-H-Cu tunneling junction, which is mainly composed of the 4s and 4p_z atomic orbitals of the Cu apex atom and the 1s orbital of the adsorbed H atom. These findings are helpful for deepening our understanding of the propensity rules for IETS and promoting IETS as a more popular spectroscopic tool for molecular devices.
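For reference, the IETS signal analyzed in such calculations is conventionally the normalized second derivative (a standard definition, stated here as background rather than taken from the paper):

\[
\mathrm{IETS}(V) \;=\; \frac{d^{2} I / dV^{2}}{\,dI/dV\,},
\]

which exhibits peaks (or dips) when the bias matches a vibrational quantum, \(e|V| \simeq \hbar\omega_\lambda\).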
Strain distribution and band structure of InAs/GaAs quantum ring superlattice
NASA Astrophysics Data System (ADS)
Mughnetsyan, Vram; Kirakosyan, Albert
2017-12-01
The elastic strain distribution and the band structure of a one-layer InAs/GaAs quantum ring superlattice with square symmetry are considered in this work. The Green's function formalism based on the method of inclusions is applied to calculate the components of the strain tensor, while the combination of the Green's function method with a Fourier transformation to momentum space in the Pikus-Bir Hamiltonian is used to obtain the miniband energy dispersion surfaces via an exact diagonalization procedure. The dependencies of the strain tensor components on the spatial coordinates are compared with those for a single quantum ring and are in good agreement with previously obtained results for cylindrical quantum disks. It is shown that strain significantly affects the miniband structure of the superlattice and contributes to the degeneracy-lifting effect due to heavy hole-light hole coupling. The demonstrated method is simple and provides reasonable results for a comparatively small Hamiltonian matrix. The obtained results may be useful for further investigation and construction of novel devices based on quantum ring superlattices.
Tailoring Quantum Dot Assemblies to Extend Exciton Coherence Times and Improve Exciton Transport
NASA Astrophysics Data System (ADS)
Seward, Kenton; Lin, Zhibin; Lusk, Mark
2012-02-01
The motion of excitons through nanostructured assemblies plays a central role in a wide range of physical phenomena including quantum computing, molecular electronics, photosynthetic processes, excitonic transistors, and light-emitting diodes. All of these technologies are severely handicapped, though, by quasi-particle lifetimes on the order of a nanosecond. Exciton motion must therefore be as efficient as possible in order to transport excitons over meaningful distances. This is problematic for assemblies of small Si quantum dots (QDs), where excitons quickly localize and entangle with dot phonon modes. The ensuing exciton transport is then characterized by a classical random walk reduced to very short distances because of efficient recombination. We use a combination of the master equation (Haken-Strobl) formalism and density functional theory to estimate the rate of decoherence in Si QD assemblies and its impact on exciton mobility. Exciton-phonon coupling and Coulomb interactions are calculated as a function of dot size, spacing, and termination to minimize the rate of intra-dot phonon entanglement. This extends the time over which more efficient exciton transport, characterized by partial coherence, can be maintained.
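In its simplest (pure-dephasing) form, the Haken-Strobl master equation invoked above reads (a textbook statement, not necessarily the exact variant used in the study)

\[
\frac{d\rho_{nm}}{dt} \;=\; -\frac{i}{\hbar}\,[H,\rho]_{nm} \;-\; \gamma\,\bigl(1-\delta_{nm}\bigr)\,\rho_{nm},
\]

where \(H\) contains the site energies and inter-dot couplings, and the dephasing rate \(\gamma\), set by the exciton-phonon coupling, controls the crossover from coherent, ballistic-like motion to classical hopping.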
Single-molecular diodes based on opioid derivatives.
Siqueira, M R S; Corrêa, S M; Gester, R M; Del Nero, J; Neto, A M J C
2015-12-01
We propose an efficient single-molecule rectifier based on a derivative of an opioid. Electron transport properties are investigated within the non-equilibrium Green's function formalism combined with density functional theory. The analysis of the current-voltage characteristics indicates clear diode-like behavior. While heroin presents a rectification coefficient R>1, indicating preferential electron current from the electron-donating to the electron-withdrawing side, 3- and 6-acetylmorphine and morphine exhibit the contrary behavior, R<1. Our calculations indicate that the simple inclusion of acetyl groups modulates a range of devices, varying from simple rectifying to resonant-tunneling diodes. In particular, the heroin diodes carry microampere currents with rectification maxima of R=9.1 at a very low bias voltage of ∼0.6 V and R=14.3 at ∼1.8 V, with resistance varying between 0.4 and 1.5 MΩ, whereas most current single-molecule diodes rectify only nanoampere currents, are not stable above 1.0 V, and present electrical resistances around 10 MΩ. Molecular devices based on opioid derivatives are therefore promising for molecular electronics.
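The rectification coefficient quoted above is conventionally defined as (a standard definition)

\[
R(V) \;=\; \left|\frac{I(+V)}{I(-V)}\right|,
\]

so that \(R>1\) indicates preferential current flow in the forward direction and \(R<1\) the opposite.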
Cleaving Off Uranyl Oxygens through Chelation: A Mechanistic Study in the Gas Phase
Abergel, Rebecca J.; de Jong, Wibe A.; Deblonde, Gauthier J. -P.; ...
2017-10-11
Recent efforts to activate the strong uranium-oxygen bonds in the dioxo uranyl cation have been limited to single oxo-group activation, through either uranyl reduction and functionalization in solution or collision-induced dissociation (CID) in the gas phase using mass spectrometry (MS). Here, we report and investigate the surprising double activation of uranyl by an organic ligand, 3,4,3-LI(CAM), leading to the formation of a formal U⁶⁺ chelate in the gas phase. The cleavage of both uranyl oxo bonds was experimentally evidenced by CID, using deuterium and 18O isotopic substitutions, and by infrared multiple photon dissociation (IRMPD) spectroscopy. Density functional theory (DFT) computations predict that the overall reaction requires only 132 kJ/mol, with the first oxygen activation entailing about 107 kJ/mol. Combined with analysis of similar but unreactive ligands, these results shed light on the chelation-driven mechanism of uranyl oxo bond cleavage, demonstrating its dependence on the presence of ligand hydroxyl protons available for direct interactions with the uranyl oxygens.
Ab initio molecular dynamics simulation of LiBr association in water
NASA Astrophysics Data System (ADS)
Izvekov, Sergei; Philpott, Michael R.
2000-12-01
A computationally economical scheme that unifies the density functional description of an ionic solute with the classical description of a solvent was developed. The density functional part of the scheme comprises Car-Parrinello and related formalisms. A substantial saving in computer time is achieved by performing the ab initio molecular dynamics of the solute electronic structure in a relatively small basis set constructed from the lowest-energy Kohn-Sham orbitals calculated for a single anion in vacuum, instead of using a plane-wave basis. The methodology permits simulation of an ionic solution over longer time scales while maintaining accuracy in the prediction of the solute electronic structure. As an example, the association of the Li+-Br- ion pair in water is studied. The results of the combined molecular dynamics simulation are compared with those obtained from a classical simulation with the ion-ion interaction described by a pair potential of the Born-Huggins-Mayer type. The comparison reveals an important role played by the polarization of the Br- ion in the dynamics of ion-pair association.
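The Born-Huggins-Mayer pair potential mentioned above is typically written as (a standard parameterization, given for orientation; the specific coefficients used in the comparison are not reproduced here)

\[
U_{ij}(r) \;=\; \frac{q_i q_j}{r} \;+\; A_{ij}\,\exp\!\bigl[B\,(\sigma_{ij}-r)\bigr] \;-\; \frac{C_{ij}}{r^{6}} \;-\; \frac{D_{ij}}{r^{8}} ,
\]

combining Coulomb, Born-Mayer repulsion, and dispersion terms.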
Prospect Theory and Coercive Bargaining
ERIC Educational Resources Information Center
Butler, Christopher K.
2007-01-01
Despite many applications of prospect theory's concepts to explain political and strategic phenomena, formal analyses of strategic problems using prospect theory are rare. Using Fearon's model of bargaining, Tversky and Kahneman's value function, and an existing probability weighting function, I construct a model that demonstrates the differences…
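For readers unfamiliar with the ingredients, the Tversky-Kahneman value function and a commonly used probability weighting function take the forms (standard parameterizations; the specific weighting function adopted in the article may differ)

\[
v(x) \;=\;
\begin{cases}
x^{\alpha}, & x \ge 0,\\
-\lambda\,(-x)^{\beta}, & x < 0,
\end{cases}
\qquad
w(p) \;=\; \frac{p^{\gamma}}{\bigl(p^{\gamma} + (1-p)^{\gamma}\bigr)^{1/\gamma}},
\]

with loss aversion \(\lambda > 1\) and curvature parameters \(\alpha,\beta,\gamma \in (0,1]\); Tversky and Kahneman's own estimates were roughly \(\alpha \approx \beta \approx 0.88\) and \(\lambda \approx 2.25\).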
Orthogonal polynomial projectors for the Projector Augmented Wave (PAW) formalism.
NASA Astrophysics Data System (ADS)
Holzwarth, N. A. W.; Matthews, G. E.; Tackett, A. R.; Dunning, R. B.
1998-03-01
The PAW method for density functional electronic structure calculations developed by Blöchl (Phys. Rev. B 50, 17953 (1994)) and also used by our group (Phys. Rev. B 55, 2005 (1997)) has the numerical advantages of a pseudopotential technique while retaining the physics of an all-electron formalism. We describe a new method for generating the necessary set of atom-centered projector and basis functions, based on choosing the projector functions from a set of orthogonal polynomials multiplied by a localizing weight factor. Numerical benefits of the new scheme result from having direct control of the shape of the projector functions and from the use of a simple repulsive local potential term to eliminate "ghost state" problems, which can haunt calculations of this kind. We demonstrate the method by calculating the cohesive energies of CaF2 and Mo and the density of states of CaMoO4, which shows detailed agreement with LAPW results over a 66 eV range of energy including upper core, valence, and conduction band states.
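For context, the PAW construction ties the projectors to the partial waves through the standard transformation (textbook form; the orthogonal-polynomial scheme above concerns how the projectors are generated),

\[
|\psi_n\rangle \;=\; |\tilde\psi_n\rangle \;+\; \sum_i \bigl(|\phi_i\rangle - |\tilde\phi_i\rangle\bigr)\,\langle \tilde p_i\,|\,\tilde\psi_n\rangle,
\qquad
\langle \tilde p_i\,|\,\tilde\phi_j\rangle = \delta_{ij},
\]

so the quality of a PAW calculation hinges on the choice of the projector functions \(\tilde p_i\) and partial waves \(\phi_i,\tilde\phi_i\).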
NASA Astrophysics Data System (ADS)
Reimberg, Paulo; Bernardeau, Francis
2018-01-01
We present a formalism based on the large deviation principle (LDP) applied to cosmological density fields, and more specifically to arbitrary functionals of density profiles, and we apply it to the derivation of the cumulant generating function and one-point probability distribution function (PDF) of the aperture mass (Map), a common observable in cosmic shear observations. We show that the LDP can indeed be used in practice for a much larger family of observables than previously envisioned, such as those built from continuous and nonlinear functionals of density profiles. Taking advantage of this formalism, we can extend previous results, which were based on crude definitions of the aperture mass, with top-hat windows and the use of the reduced-shear approximation (replacing the reduced shear with the shear itself). We are able to quantify precisely how this latter approximation affects the Map statistical properties. In particular, we derive the corrective term for the skewness of the Map and reconstruct its one-point PDF.
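The aperture mass referred to above is, in its usual form (standard definitions from the weak-lensing literature; the paper's generalized filters follow the same pattern),

\[
M_{\mathrm{ap}}(\theta) \;=\; \int d^2\vartheta\; U(|\vartheta|)\,\kappa(\vartheta)
\;=\; \int d^2\vartheta\; Q(|\vartheta|)\,\gamma_t(\vartheta),
\]

where \(U\) is a compensated filter (\(\int \vartheta\,U(\vartheta)\,d\vartheta = 0\)), \(Q\) is the associated tangential-shear filter, \(\kappa\) is the convergence and \(\gamma_t\) the tangential shear; the reduced-shear approximation discussed above amounts to replacing \(g_t = \gamma_t/(1-\kappa)\) by \(\gamma_t\) in the observable estimator.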
Koopmans-Compliant Spectral Functionals for Extended Systems
NASA Astrophysics Data System (ADS)
Nguyen, Ngoc Linh; Colonna, Nicola; Ferretti, Andrea; Marzari, Nicola
2018-04-01
Koopmans-compliant functionals have been shown to provide accurate spectral properties for molecular systems; this accuracy is driven by the generalized linearization condition imposed on each charged excitation, i.e., on changing the occupation of any orbital in the system, while accounting for screening and relaxation from all other electrons. In this work, we discuss the theoretical formulation and the practical implementation of this formalism to the case of extended systems, where a third condition, the localization of Koopmans's orbitals, proves crucial to reach seamlessly the thermodynamic limit. We illustrate the formalism by first studying one-dimensional molecular systems of increasing length. Then, we consider the band gaps of 30 paradigmatic solid-state test cases, for which accurate experimental and computational results are available. The results are found to be comparable with the state of the art in many-body perturbation theory, notably using just a functional formulation for spectral properties and the generalized-gradient approximation for the exchange and correlation functional.
Extension of the quasistatic far-wing line shape theory to multicomponent anisotropic potentials
NASA Technical Reports Server (NTRS)
Ma, Q.; Tipping, R. H.
1994-01-01
The formalism developed previously for the calculation of the far-wing line shape function and the corresponding absorption coefficient using a single-component anisotropic interaction term and the binary collision and quasistatic approximations is generalized to multicomponent anisotropic potential functions. Explicit expressions are presented for several common cases, including the long-range dipole-dipole plus dipole-quadrupole interaction and a linear molecule interacting with a perturber atom. After determining the multicomponent functional representation for the interaction between CO2 and Ar from previously published data, we calculate the theoretical line shape function and the corresponding absorption due to the ν3 band of CO2 in the frequency range 2400-2580 cm⁻¹ and compare our results with previous calculations carried out using a single-component anisotropic interaction, and with the results obtained assuming Lorentzian line shapes. The principal uncertainties in the present results, possible refinements of the theoretical formalism, and the applicability to other systems are discussed briefly.
Bayne, Michael G; Scher, Jeremy A; Ellis, Benjamin H; Chakraborty, Arindam
2018-05-21
Electron-hole or quasiparticle representation plays a central role in describing electronic excitations in many-electron systems. For charge-neutral excitation, the electron-hole interaction kernel is the quantity of interest for calculating important excitation properties such as optical gap, optical spectra, electron-hole recombination and electron-hole binding energies. The electron-hole interaction kernel can be formally derived from the density-density correlation function using both Green's function and TDDFT formalism. The accurate determination of the electron-hole interaction kernel remains a significant challenge for precise calculations of optical properties in the GW+BSE formalism. From the TDDFT perspective, the electron-hole interaction kernel has been viewed as a path to systematic development of frequency-dependent exchange-correlation functionals. Traditional approaches, such as MBPT formalism, use unoccupied states (which are defined with respect to Fermi vacuum) to construct the electron-hole interaction kernel. However, the inclusion of unoccupied states has long been recognized as the leading computational bottleneck that limits the application of this approach for larger finite systems. In this work, an alternative derivation that avoids using unoccupied states to construct the electron-hole interaction kernel is presented. The central idea of this approach is to use explicitly correlated geminal functions for treating electron-electron correlation for both ground and excited state wave functions. Using this ansatz, it is derived using both diagrammatic and algebraic techniques that the electron-hole interaction kernel can be expressed only in terms of linked closed-loop diagrams. It is proved that the cancellation of unlinked diagrams is a consequence of linked-cluster theorem in real-space representation. The electron-hole interaction kernel derived in this work was used to calculate excitation energies in many-electron systems and results were found to be in good agreement with the EOM-CCSD and GW+BSE methods. The numerical results highlight the effectiveness of the developed method for overcoming the computational barrier of accurately determining the electron-hole interaction kernel to applications of large finite systems such as quantum dots and nanorods.
NASA Astrophysics Data System (ADS)
Matsubara, Takahiko
2003-02-01
We formulate a general method for perturbative evaluations of statistics of smoothed cosmic fields and provide useful formulae for application of the perturbation theory to various statistics. This formalism is an extensive generalization of the method used by Matsubara, who derived a weakly nonlinear formula of the genus statistic in a three-dimensional density field. After describing the general method, we apply the formalism to a series of statistics, including genus statistics, level-crossing statistics, Minkowski functionals, and a density extrema statistic, regardless of the dimensions in which each statistic is defined. The relation between the Minkowski functionals and other geometrical statistics is clarified. These statistics can be applied to several cosmic fields, including three-dimensional density field, three-dimensional velocity field, two-dimensional projected density field, and so forth. The results are detailed for second-order theory of the formalism. The effect of the bias is discussed. The statistics of smoothed cosmic fields as functions of rescaled threshold by volume fraction are discussed in the framework of second-order perturbation theory. In CDM-like models, their functional deviations from linear predictions plotted against the rescaled threshold are generally much smaller than that plotted against the direct threshold. There is still a slight meatball shift against rescaled threshold, which is characterized by asymmetry in depths of troughs in the genus curve. A theory-motivated asymmetry factor in the genus curve is proposed.
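As a point of reference for the second-order results described above, the genus curve of a Gaussian random field, against which the weakly nonlinear deviations are measured, is (a standard result)

\[
g(\nu) \;=\; A\,\bigl(1-\nu^{2}\bigr)\,e^{-\nu^{2}/2},
\qquad
f \;=\; \frac{1}{\sqrt{2\pi}} \int_{\nu_f}^{\infty} e^{-t^{2}/2}\,dt ,
\]

where \(\nu\) is the threshold in units of the field's rms, and the rescaled threshold \(\nu_f\) is fixed by the volume fraction \(f\) above the threshold; it is against this rescaled variable that the deviations from linear predictions quoted above are smallest.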
ERIC Educational Resources Information Center
Bennett, Ruth, Ed.; Goodwin, Norman
A booklet on traditional fishing practices of the Karuk Indians of northwestern California is presented in the formal, literary English speech of Norman Goodwin, a Karuk medicine man involved in preserving ancient tribal traditions. Empirical information and personal narratives are combined in descriptions of different kinds of nets, social rules…
ERIC Educational Resources Information Center
DiJulio, Betsy
2009-01-01
The best artistic challenges open students' eyes, hearts, and minds by combining both formal and conceptual concerns. In this article, the author describes a project inspired by a temporary exhibition of African Shona sculpture entitled "Mutambo! (Celebrate!)" at the Norfolk Botanical Gardens in Norfolk, Virginia. (Contains 2 online…
A Novel Integrated Ecological Model for the study of Sustainability
In recent years, there has been a growing interest among various sections of the society in the study of sustainability. Recently, a generalized mathematical model depicting a combined economic-ecological-social system has been proposed to help in the formal study of sustainabili...
Sadiqi, Said; Lehr, A Mechteld; Post, Marcel W; Jacobs, Wilco C H; Aarabi, Bizhan; Chapman, Jens R; Dunn, Robert N; Dvorak, Marcel F; Fehlings, Michael G; Rajasekaran, S; Vialle, Luiz R; Vaccaro, Alexander R; Oner, F Cumhur
2016-08-01
There is no outcome instrument specifically designed and validated for spine trauma patients without complete paralysis, which makes it difficult to compare outcomes of different treatments of the spinal column injury within and between studies. The paper aimed to report on the evidence-based consensus process that resulted in the selection of core International Classification of Functioning, Disability, and Health (ICF) categories, as well as the response scale for use in a universal patient-reported outcome measure for patients with traumatic spinal column injury. The study used a formal decision-making and consensus process. The sample includes patients with a primary diagnosis of traumatic spinal column injury, excluding completely paralyzed and polytrauma patients. The wide array of function and health status of patients with traumatic spinal column injury was explored through the identification of all potentially meaningful ICF categories. A formal decision-making and consensus process integrated evidence from four preparatory studies. Three studies aimed to identify relevant ICF categories from three different perspectives. The research perspective was covered by a systematic literature review identifying outcome measures focusing on the functioning and health of spine trauma patients. The expert perspective was explored through an international web-based survey among spine surgeons from the five AOSpine International world regions. The patient perspective was investigated in an international empirical study. A fourth study investigated various response scales for their potential use in the future universal outcome instrument. This work was supported by AOSpine. AOSpine is a clinical division of the AO Foundation, an independent medically guided non-profit organization. The AOSpine Knowledge Forums are pathology-focused working groups acting on behalf of AOSpine in their domain of scientific expertise. Combining the results of the preparatory studies, the list of ICF categories presented at the consensus conference included 159 different ICF categories. Based on voting and discussion, 11 experts from 6 countries selected a total of 25 ICF categories as core categories for patient-reported outcome measurement in adult traumatic spinal column injury patients (9 body functions, 14 activities and participation, and 2 environmental factors). The experts also agreed to use the Numeric Rating Scale 0-100 as response scale in the future universal outcome instrument. A formal consensus process integrating evidence and expert opinion led to a set of 25 core ICF categories for patient-reported outcome measurement in adult traumatic spinal column injury patients, as well as the response scale for use in the future universal disease-specific outcome instrument. The adopted core ICF categories could also serve as a benchmark for assessing the content validity of existing and future outcome instruments used in this specific patient population. Copyright © 2016 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Deakin, Michael A. B.
2006-01-01
This classroom note presents a final solution for the functional equation f(xy) = xf(y) + yf(x). The functional equation is formally similar to the familiar product rule of elementary calculus, and this similarity prompted its study by Ren et al., who derived some results concerning it. The purpose of the present note is to extend these results and…
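For orientation, the continuous solutions on the positive reals can be sketched in a few lines (a standard argument; the note itself treats more general settings). Putting \(g(x) = f(x)/x\) for \(x>0\),

\[
f(xy) = x f(y) + y f(x) \;\Longleftrightarrow\; g(xy) = g(x) + g(y),
\]

so under continuity \(g(x) = c\,\ln x\) for some constant \(c\), and hence

\[
f(x) = c\, x \ln x ,
\]

which indeed mirrors the product rule, since \((uv)' = u\,v' + v\,u'\) has the same formal shape.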
Tag Questions across Irish English and British English: A Corpus Analysis of Form and Function
ERIC Educational Resources Information Center
Barron, Anne; Pandarova, Irina; Muderack, Karoline
2015-01-01
The present study, situated in the area of variational pragmatics, contrasts tag question (TQ) use in Ireland and Great Britain using spoken data from the Irish and British components of the International Corpus of English (ICE). Analysis is on the formal and functional level and also investigates form-functional relationships. Findings reveal…
NASA Astrophysics Data System (ADS)
Ayu Nurul Handayani, Hemas; Waspada, Indra
2018-05-01
Non-formal Early Childhood Education (non-formal ECE) is education provided for children under 4 years old. In the District of Banyumas, non-formal ECE is monitored by the district government and supported by Sanggar Kegiatan Belajar (SKB) Purwokerto, one of the organizers of non-formal education. The government has a program to extend ECE to all villages in Indonesia, but the locations where ECE schools will be constructed in the coming years have not yet been decided. To support this program, a decision support system was built to recommend villages for constructing ECE buildings. The data are projected using Brown's double exponential smoothing method, and the Preference Ranking Organization Method for Enrichment Evaluation (Promethee) is used to generate a priority ranking. As its recommendation output, the system produces a map visualization colored according to the priority level of each sub-district and village. The system was tested with black-box testing, Promethee testing, and usability testing. The results showed that the system functionality and the Promethee algorithm worked properly and that users were satisfied.
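A hedged sketch of the two computational ingredients follows: Brown's double exponential smoothing for the projection and a Promethee II-style net-flow ranking. The function names, weights, and sample data are illustrative assumptions, not values from the system described above.

# Illustrative sketch: Brown's double exponential smoothing forecast and a
# simple Promethee II net-flow ranking.  Data and parameters are made up.
import numpy as np

def brown_des_forecast(series, alpha=0.3, horizon=1):
    """Brown's double exponential smoothing; forecast `horizon` steps ahead."""
    s1 = s2 = series[0]
    for x in series[1:]:
        s1 = alpha * x + (1 - alpha) * s1          # first smoothing
        s2 = alpha * s1 + (1 - alpha) * s2         # second smoothing
    level = 2 * s1 - s2
    trend = alpha / (1 - alpha) * (s1 - s2)
    return level + trend * horizon

def promethee_ii(scores, weights):
    """Net outranking flows with the 'usual' preference function
    (1 if strictly better on a criterion, else 0); all criteria maximized."""
    scores = np.asarray(scores, dtype=float)       # alternatives x criteria
    n = scores.shape[0]
    pi = np.zeros((n, n))
    for a in range(n):
        for b in range(n):
            if a != b:
                pref = (scores[a] > scores[b]).astype(float)
                pi[a, b] = np.dot(weights, pref)
    return (pi.sum(axis=1) - pi.sum(axis=0)) / (n - 1)   # net flow per alternative

if __name__ == "__main__":
    # Hypothetical yearly counts of under-4 children in one village.
    history = [120, 132, 141, 155, 160]
    print("projected next year:", round(brown_des_forecast(history, horizon=1)))

    # Hypothetical villages scored on (projected children, distance to nearest ECE);
    # both are treated as "higher means higher priority" here.
    villages = ["A", "B", "C"]
    crit = [[160, 4.0], [140, 6.5], [150, 2.0]]
    flows = promethee_ii(crit, weights=np.array([0.6, 0.4]))
    print("priority order:", sorted(zip(villages, flows), key=lambda t: -t[1]))

In the actual system the projected values would feed the criteria matrix, and the net flows would determine the coloring of the map visualization.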
Semileptonic decays of B and D mesons in the light-front formalism
NASA Astrophysics Data System (ADS)
Jaus, W.
1990-06-01
The light-front formalism is used to present a relativistic calculation of form factors for semileptonic D and B decays in the constituent quark model. The quark-antiquark wave functions of the mesons can be obtained, in principle, from an analysis of the meson spectrum, but are approximated in this work by harmonic-oscillator wave functions. The predictions of the model are consistent with the experimental data for B decays. The Kobayashi-Maskawa (KM) matrix element |Vcs| is determined by a comparison of the experimental and theoretical rates for D0 → K- e+ ν, and is consistent with a unitary KM matrix for three families. The predictions for D → K* transitions are in conflict with the data.
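The form factors computed in such analyses are conventionally defined through (a standard decomposition for a pseudoscalar-to-pseudoscalar transition; conventions vary in overall signs and normalization)

\[
\langle K(p')\,|\,\bar s\,\gamma^{\mu} c\,|\,D(p)\rangle
\;=\; f_{+}(q^{2})\,\bigl(p + p'\bigr)^{\mu} \;+\; f_{-}(q^{2})\,\bigl(p - p'\bigr)^{\mu},
\qquad q = p - p',
\]

and the semileptonic rate for D0 → K- e+ ν is governed essentially by \(|V_{cs}|^{2}\,|f_{+}(q^{2})|^{2}\) in the limit of a massless charged lepton.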