Sample records for formal sensitivity analysis

  1. Improving engineering system design by formal decomposition, sensitivity analysis, and optimization

    NASA Technical Reports Server (NTRS)

    Sobieski, J.; Barthelemy, J. F. M.

    1985-01-01

    A method is presented for designing a complex engineering system by decomposing the problem into a set of smaller subproblems. Coupling among the subproblems is preserved by means of the sensitivity derivatives of each subproblem's solution with respect to the inputs it receives from the system. The method allows the work to be divided among many people and computers.
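
    As a concrete illustration of the coupling idea (a minimal sketch, not the paper's method or code; the subproblem and its inputs are hypothetical), the system level can carry a finite-difference Jacobian of each subproblem's solution and use the linearization in place of a full re-solve:

    ```python
    import numpy as np

    def solve_subproblem(inputs):
        # Hypothetical subproblem: returns its local solution for given system inputs.
        x, y = inputs
        return np.array([x**2 + 0.5 * y, np.sin(x) * y])

    def sensitivity_derivatives(inputs, h=1e-6):
        """Finite-difference Jacobian d(solution)/d(inputs)."""
        base = solve_subproblem(inputs)
        jac = np.zeros((base.size, len(inputs)))
        for j in range(len(inputs)):
            p = np.array(inputs, dtype=float)
            p[j] += h
            jac[:, j] = (solve_subproblem(p) - base) / h
        return base, jac

    base, jac = sensitivity_derivatives([1.0, 2.0])
    delta = np.array([0.1, -0.05])      # proposed change in system-level inputs
    predicted = base + jac @ delta      # linear estimate, no subproblem re-solve needed
    ```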

  2. A Model for Analyzing Disability Policy

    ERIC Educational Resources Information Center

    Turnbull, Rud; Stowe, Matthew J.

    2017-01-01

    This article describes a 12-step model that can be used for policy analysis. The model encompasses policy development, implementation, and evaluation; takes into account structural foundations of policy; addresses both legal formalism and legal realism; demonstrates contextual sensitivity; and addresses application issues and different…

  3. Assessment of parametric uncertainty for groundwater reactive transport modeling

    USGS Publications Warehouse

    Shi, Xiaoqing; Ye, Ming; Curtis, Gary P.; Miller, Geoffery L.; Meyer, Philip D.; Kohler, Matthias; Yabusaki, Steve; Wu, Jichun

    2014-01-01

    The validity of using Gaussian assumptions for model residuals in uncertainty quantification of a groundwater reactive transport model was evaluated in this study. Least squares regression methods explicitly assume Gaussian residuals, and the assumption leads to Gaussian likelihood functions, model parameters, and model predictions. While Bayesian methods do not explicitly require the Gaussian assumption, Gaussian residuals are widely assumed. This paper shows that the residuals of the reactive transport model are non-Gaussian, heteroscedastic, and correlated in time; characterizing them requires a generalized likelihood function such as the formal generalized likelihood function developed by Schoups and Vrugt (2010). For the surface complexation model considered in this study for simulating uranium reactive transport in groundwater, parametric uncertainty is quantified using the least squares regression methods and Bayesian methods with both Gaussian and formal generalized likelihood functions. While the least squares methods and the Bayesian methods with a Gaussian likelihood function produce similar Gaussian parameter distributions, the parameter distributions from Bayesian uncertainty quantification using the formal generalized likelihood function are non-Gaussian. In addition, the predictive performance of the formal generalized likelihood function is superior to that of the least squares regression and Bayesian methods with a Gaussian likelihood function. The Bayesian uncertainty quantification is conducted using the differential evolution adaptive metropolis (DREAM(ZS)) algorithm; as a Markov chain Monte Carlo (MCMC) method, it is a robust tool for quantifying uncertainty in groundwater reactive transport models. For the surface complexation model, the regression-based local sensitivity analysis and the Morris- and DREAM(ZS)-based global sensitivity analyses yield almost identical rankings of parameter importance. The uncertainty analysis may help select appropriate likelihood functions, improve model calibration, and reduce predictive uncertainty in other groundwater reactive transport and environmental modeling.
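
    To make the likelihood distinction concrete, here is a minimal sketch (our simplification, not the study's code) contrasting the i.i.d. Gaussian log-likelihood with a generalized likelihood that allows heteroscedastic scales and removes lag-1 autocorrelation, in the spirit of Schoups and Vrugt (2010); their full scheme also treats skew and kurtosis:

    ```python
    import numpy as np

    def gaussian_loglik(residuals, sigma):
        """i.i.d. Gaussian log-likelihood (what least squares implicitly assumes)."""
        n = residuals.size
        return -0.5 * n * np.log(2 * np.pi * sigma**2) - np.sum(residuals**2) / (2 * sigma**2)

    def generalized_loglik(residuals, sim, sigma0, sigma1, phi):
        """Simplified generalized likelihood: the residual std dev grows with the
        simulated value (heteroscedasticity), and an AR(1) filter removes lag-1
        correlation before assuming independence of the innovations."""
        sigma_t = sigma0 + sigma1 * np.abs(sim)   # heteroscedastic scale
        a = residuals / sigma_t                   # standardized residuals
        innov = a[1:] - phi * a[:-1]              # AR(1)-decorrelated innovations
        n = innov.size
        return (-0.5 * n * np.log(2 * np.pi) - np.sum(np.log(sigma_t[1:]))
                - 0.5 * np.sum(innov**2))

    e = np.random.default_rng(0).normal(size=100)           # toy residual series
    sim = np.abs(np.sin(np.linspace(0, 9, 100))) * 10       # toy simulated values
    print(gaussian_loglik(e, 1.0), generalized_loglik(e, sim, 0.5, 0.05, 0.3))
    ```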

  4. Understanding visualization: a formal approach using category theory and semiotics.

    PubMed

    Vickers, Paul; Faith, Joe; Rossiter, Nick

    2013-06-01

    This paper combines the vocabulary of semiotics and category theory to provide a formal analysis of visualization. It shows how familiar processes of visualization fit the semiotic frameworks of both Saussure and Peirce, and extends these structures using the tools of category theory to provide a general framework for understanding visualization in practice, including: Relationships between systems, data collected from those systems, renderings of those data in the form of representations, the reading of those representations to create visualizations, and the use of those visualizations to create knowledge and understanding of the system under inspection. The resulting framework is validated by demonstrating how familiar information visualization concepts (such as literalness, sensitivity, redundancy, ambiguity, generalizability, and chart junk) arise naturally from it and can be defined formally and precisely. This paper generalizes previous work on the formal characterization of visualization by, inter alia, Ziemkiewicz and Kosara and allows us to formally distinguish properties of the visualization process that previous work does not.

  5. Formal Analysis of Key Integrity in PKCS#11

    NASA Astrophysics Data System (ADS)

    Falcone, Andrea; Focardi, Riccardo

    PKCS#11 is a standard API to cryptographic devices such as smartcards, hardware security modules, and USB crypto-tokens. Though widely adopted, this API has been shown to be prone to attacks in which a malicious user gains access to the sensitive keys stored in the devices. In 2008, Delaune, Kremer and Steel proposed a model for formal reasoning about this kind of attack. We extend this model to also describe flaws that are based on integrity violations of the stored keys. In particular, we consider scenarios in which a malicious overwriting of keys might fool honest users into using the attacker's own keys while performing sensitive operations. We further enrich the model with a trusted key mechanism ensuring that only controlled, non-tampered keys are used in cryptographic operations, and we show how this modified API prevents the above-mentioned key-replacement attacks.
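
    A toy model (ours, not real PKCS#11 and not the authors' formalism) of the key-replacement flaw and the trusted-key countermeasure described above:

    ```python
    # Toy model: keys stored by handle, with no integrity protection by default.
    device = {"wrap_key": "honest_k1", "sensitive": "secret_k2"}
    trusted = {"honest_k1"}                      # keys vetted by a security officer

    def wrap(handle_wrapping, handle_target, enforce_trusted=False):
        wrapping_key = device[handle_wrapping]
        if enforce_trusted and wrapping_key not in trusted:
            raise PermissionError("untrusted wrapping key")
        # stands in for: encrypt the target key under the wrapping key
        return ("wrapped", device[handle_target], wrapping_key)

    # Attack: overwrite the stored wrapping key with one the attacker knows.
    device["wrap_key"] = "attacker_key"
    blob = wrap("wrap_key", "sensitive")         # honest user wraps as usual
    # The attacker knows 'attacker_key', so the sensitive key leaks from `blob`.

    # With the trusted-key mechanism, the same call is rejected:
    try:
        wrap("wrap_key", "sensitive", enforce_trusted=True)
    except PermissionError as e:
        print("blocked:", e)
    ```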

  6. Bayesian Sensitivity Analysis of Statistical Models with Missing Data

    PubMed Central

    ZHU, HONGTU; IBRAHIM, JOSEPH G.; TANG, NIANSHENG

    2013-01-01

    Methods for handling missing data depend strongly on the mechanism that generated the missing values, such as missing completely at random (MCAR) or missing at random (MAR), as well as other distributional and modeling assumptions at various stages. It is well known that the resulting estimates and tests may be sensitive to these assumptions as well as to outlying observations. In this paper, we introduce various perturbations to modeling assumptions and individual observations, and then develop a formal sensitivity analysis to assess these perturbations in the Bayesian analysis of statistical models with missing data. We develop a geometric framework, called the Bayesian perturbation manifold, to characterize the intrinsic structure of these perturbations. We propose several intrinsic influence measures to perform sensitivity analysis and quantify the effect of various perturbations to statistical models. We use the proposed sensitivity analysis procedure to systematically investigate the tenability of the not missing at random (NMAR) assumption. Simulation studies are conducted to evaluate our methods, and a dataset is analyzed to illustrate the use of our diagnostic measures. PMID:24753718

  7. Gain, noise, and contrast sensitivity of linear visual neurons

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.

    1990-01-01

    Contrast sensitivity is a measure of the ability of an observer to detect contrast signals of particular spatial and temporal frequencies. A formal definition of contrast sensitivity that can be applied to individual linear visual neurons is derived. A neuron is modeled by a contrast transfer function, whose modulus is the contrast gain, and by a noise power spectrum. The distributions of neural responses to signal and blank presentations are derived, and from these a definition of contrast sensitivity is obtained. This formal definition may be used to relate the sensitivities of various populations of neurons, and to relate the sensitivities of neurons to that of the behaving animal.
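
    One way to make such a definition concrete is an SNR-style sketch (our notation and toy functions, not necessarily Watson's exact formulation): detectability is response amplitude over RMS noise, and sensitivity is the reciprocal of threshold contrast:

    ```python
    import numpy as np

    def dprime(contrast, freq, gain, noise_psd, T=1.0):
        """Detectability of a grating: response amplitude = contrast * gain,
        divided by the RMS noise in the analysis band (bandwidth ~ 1/T)."""
        signal = contrast * gain(freq)
        noise_rms = np.sqrt(noise_psd(freq) / T)
        return signal / noise_rms

    def contrast_sensitivity(freq, gain, noise_psd, criterion=1.0, T=1.0):
        """Sensitivity = 1 / threshold contrast, where threshold is the
        contrast at which d-prime reaches the criterion."""
        threshold = criterion / dprime(1.0, freq, gain, noise_psd, T)
        return 1.0 / threshold

    gain = lambda f: f * np.exp(-f / 4.0)    # toy bandpass contrast gain
    noise = lambda f: 1e-4                   # toy flat noise power spectrum
    print(contrast_sensitivity(3.0, gain, noise))
    ```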

  8. What do we mean by sensitivity analysis? The need for comprehensive characterization of "global" sensitivity in Earth and Environmental systems models

    NASA Astrophysics Data System (ADS)

    Razavi, Saman; Gupta, Hoshin V.

    2015-05-01

    Sensitivity analysis is an essential paradigm in Earth and Environmental Systems modeling. However, the term "sensitivity" has a clear definition, based on partial derivatives, only when specified locally around a particular point (e.g., optimal solution) in the problem space. Accordingly, no unique definition exists for "global sensitivity" across the problem space, when considering one or more model responses to different factors such as model parameters or forcings. A variety of approaches have been proposed for global sensitivity analysis, based on different philosophies and theories, and each of these formally characterizes a different "intuitive" understanding of sensitivity. These approaches focus on different properties of the model response at a fundamental level and may therefore lead to different (even conflicting) conclusions about the underlying sensitivities. Here we revisit the theoretical basis for sensitivity analysis, summarize and critically evaluate existing approaches in the literature, and demonstrate their flaws and shortcomings through conceptual examples. We also demonstrate the difficulty involved in interpreting "global" interaction effects, which may undermine the value of existing interpretive approaches. With this background, we identify several important properties of response surfaces that are associated with the understanding and interpretation of sensitivities in the context of Earth and Environmental System models. Finally, we highlight the need for a new, comprehensive framework for sensitivity analysis that effectively characterizes all of the important sensitivity-related properties of model response surfaces.
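
    A minimal illustration of how two common notions of sensitivity can disagree (toy model and crude binning estimator, ours): local partial derivatives at a nominal point versus variance-based (Sobol-type) first-order indices over the whole factor space:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    f = lambda x1, x2: x1 + 5.0 * x2**2         # toy model response

    # Local sensitivity: partial derivatives at the nominal point (x1, x2) = (1, 0)
    h = 1e-6
    x = np.array([1.0, 0.0])
    local = [(f(x[0] + h, x[1]) - f(*x)) / h,   # df/dx1 = 1
             (f(x[0], x[1] + h) - f(*x)) / h]   # df/dx2 = 0 at x2 = 0

    # Global (variance-based) first-order indices over x1, x2 ~ U(-1, 1),
    # via the conditional-variance definition S_i = Var(E[f|x_i]) / Var(f).
    n = 200_000
    x1, x2 = rng.uniform(-1, 1, n), rng.uniform(-1, 1, n)
    y = f(x1, x2)

    def first_order(xi, y, bins=50):
        edges = np.quantile(xi, np.linspace(0, 1, bins + 1))
        idx = np.clip(np.digitize(xi, edges[1:-1]), 0, bins - 1)
        cond_means = np.array([y[idx == b].mean() for b in range(bins)])
        return cond_means.var() / y.var()

    print("local:", local)                                   # says x2 does not matter
    print("global S1, S2:", first_order(x1, y), first_order(x2, y))  # x2 dominates
    ```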

  9. Reconciling uncertain costs and benefits in bayes nets for invasive species management

    USGS Publications Warehouse

    Burgman, M.A.; Wintle, B.A.; Thompson, C.A.; Moilanen, A.; Runge, M.C.; Ben-Haim, Y.

    2010-01-01

    Bayes nets are used increasingly to characterize environmental systems and formalize probabilistic reasoning to support decision making. These networks treat probabilities as exact quantities. Sensitivity analysis can be used to evaluate the importance of assumptions and parameter estimates. Here, we outline an application of info-gap theory to Bayes nets that evaluates the sensitivity of decisions to possibly large errors in the underlying probability estimates and utilities. We apply it to an example of management and eradication of Red Imported Fire Ants in Southern Queensland, Australia, and show how changes in management decisions can be justified when uncertainty is considered. © 2009 Society for Risk Analysis.

  10. Error analysis applied to several inversion techniques used for the retrieval of middle atmospheric constituents from limb-scanning MM-wave spectroscopic measurements

    NASA Technical Reports Server (NTRS)

    Puliafito, E.; Bevilacqua, R.; Olivero, J.; Degenhardt, W.

    1992-01-01

    The formal retrieval error analysis of Rodgers (1990) allows the quantitative determination of such retrieval properties as measurement error sensitivity, resolution, and inversion bias. This technique was applied to five numerical inversion techniques and two nonlinear iterative techniques used for the retrieval of middle atmospheric constituent concentrations from limb-scanning millimeter-wave spectroscopic measurements. It is found that the iterative methods have better vertical resolution but are slightly more sensitive to measurement error than the constrained matrix methods. The iterative methods converge to the exact solution, whereas two of the matrix methods under consideration impose an explicit constraint, which makes the solution sensitive to the a priori profile. Tradeoffs among these retrieval characteristics are presented.
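
    The flavor of Rodgers-style retrieval diagnostics can be sketched for a linear retrieval (random stand-in matrices, not the paper's instrument model): the averaging kernel A = GK carries resolution and a priori sensitivity, and G S_e Gᵀ carries measurement-error sensitivity:

    ```python
    import numpy as np

    # Linear retrieval x_hat = x_a + G (y - K x_a), in the spirit of Rodgers.
    m, n = 8, 20                                 # channels, retrieval levels
    rng = np.random.default_rng(1)
    K = rng.normal(size=(m, n))                  # weighting-function matrix
    S_e = 0.01 * np.eye(m)                       # measurement-noise covariance
    S_a = np.eye(n)                              # a priori covariance (the constraint)

    # Optimal-estimation gain: G = S_a K^T (K S_a K^T + S_e)^(-1)
    G = S_a @ K.T @ np.linalg.inv(K @ S_a @ K.T + S_e)

    A = G @ K                                    # averaging kernel matrix
    noise_cov = G @ S_e @ G.T                    # measurement-error sensitivity
    apriori_weight = np.eye(n) - A               # sensitivity to the a priori profile

    print("resolution proxy (trace of A):", np.trace(A))   # degrees of freedom
    print("retrieval noise (first levels):", np.diag(noise_cov)[:3])
    ```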

  11. Sensitivity of landscape metrics to pixel size

    Treesearch

    J. D. Wickham; K. H. Riitters

    1995-01-01

    Analysis of diversity and evenness metrics using land cover data is becoming formalized in landscape ecology. Diversity and evenness metrics depend on the pixel size (scale) at which the data are collected. Aerial photography was interpreted for land cover and converted into four raster data sets with 4, 12, 28, and 80 m pixel sizes, representing pixel sizes...
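
    The scale dependence can be demonstrated with a toy raster (ours, purely illustrative): aggregate a categorical land-cover map by majority rule into coarser pixels and recompute Shannon diversity and evenness:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    cover = rng.choice(4, size=(240, 240), p=[0.55, 0.25, 0.15, 0.05])  # toy map

    def shannon(raster):
        _, counts = np.unique(raster, return_counts=True)
        p = counts / counts.sum()
        H = -(p * np.log(p)).sum()
        return H, H / np.log(p.size)             # diversity, evenness

    def coarsen(raster, k):
        """Majority-rule aggregation into k x k blocks (coarser 'pixels')."""
        h, w = raster.shape
        blocks = raster[:h - h % k, :w - w % k].reshape(h // k, k, w // k, k)
        blocks = blocks.transpose(0, 2, 1, 3).reshape(h // k, w // k, k * k)
        return np.array([[np.bincount(c).argmax() for c in row] for row in blocks])

    for k in (1, 3, 7, 20):                      # mimics the 4 to 80 m progression
        print(k, shannon(coarsen(cover, k)))     # diversity drops as rare classes vanish
    ```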

  12. Testing local Lorentz invariance with short-range gravity

    DOE PAGES

    Kostelecký, V. Alan; Mewes, Matthew

    2017-01-10

    The Newton limit of gravity is studied in the presence of Lorentz-violating gravitational operators of arbitrary mass dimension. The linearized modified Einstein equations are obtained and the perturbative solutions are constructed and characterized. We develop a formalism for data analysis in laboratory experiments testing gravity at short range and demonstrate that these tests provide unique sensitivity to deviations from local Lorentz invariance.

  13. Mapping of polycrystalline films of biological fluids utilizing the Jones-matrix formalism

    NASA Astrophysics Data System (ADS)

    Ushenko, Vladimir A.; Dubolazov, Alexander V.; Pidkamin, Leonid Y.; Sakchnovsky, Michael Yu; Bodnar, Anna B.; Ushenko, Yuriy A.; Ushenko, Alexander G.; Bykov, Alexander; Meglinski, Igor

    2018-02-01

    Utilizing a polarized light approach, we reconstruct the spatial distribution of birefringence and optical activity in polycrystalline films of biological fluids. The Jones-matrix formalism is used for an accessible quantitative description of these types of optical anisotropy. We demonstrate that differentiation of polycrystalline films of biological fluids can be performed based on a statistical analysis of the distribution of rotation angles and phase shifts associated with the optical activity and birefringence, respectively. Finally, practical operational characteristics, such as sensitivity, specificity and accuracy of the Jones-matrix reconstruction of optical anisotropy, were identified with special emphasis on biomedical application, specifically for differentiation of bile films taken from healthy donors and from patients with cholelithiasis.
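
    A minimal Jones-matrix sketch (our parameterization and composition order, a common simplification rather than the paper's full reconstruction) of a film element with both birefringence and optical activity:

    ```python
    import numpy as np

    def rotator(theta):
        """Jones matrix of optical activity: rotation of the polarization plane."""
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[c, -s], [s, c]])

    def retarder(delta, axis=0.0):
        """Jones matrix of linear birefringence: phase shift delta about an axis."""
        R = rotator(axis)
        J = np.array([[np.exp(-1j * delta / 2), 0], [0, np.exp(1j * delta / 2)]])
        return R @ J @ R.T

    # Film element = birefringence followed by optical activity (a simplification).
    J_film = rotator(np.deg2rad(12)) @ retarder(np.deg2rad(40), axis=np.deg2rad(30))
    E_in = np.array([1.0, 0.0])                  # horizontally polarized probe
    E_out = J_film @ E_in
    intensity_crossed = abs(E_out[1])**2         # signal through a crossed analyzer
    print(intensity_crossed)
    ```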

  14. Surrogate models for efficient stability analysis of brake systems

    NASA Astrophysics Data System (ADS)

    Nechak, Lyes; Gillot, Frédéric; Besset, Sébastien; Sinou, Jean-Jacques

    2015-07-01

    This study assesses the capacity of global sensitivity analysis, combined with the kriging formalism, to support robust stability analysis of brake systems, which is too costly when performed with the classical complex eigenvalue analysis (CEA) based on finite element models (FEMs). By considering a simplified brake system, global sensitivity analysis is first shown to be very helpful for understanding the effects of design parameters on the brake system's stability. This is enabled by the so-called Sobol indices, which discriminate design parameters with respect to their influence on the stability. Consequently, only the uncertainty of the influential parameters is taken into account in the following step, namely surrogate modelling based on kriging. The latter is then demonstrated to be an interesting alternative to FEMs, since it allows, at lower cost, an accurate estimation of the system's proportions of instability with respect to the influential parameters.

  15. The development and initial validation of a sensitive bedside cognitive screening test.

    PubMed

    Faust, D; Fogel, B S

    1989-01-01

    Brief bedside cognitive examinations such as the Mini-Mental State Examination are designed to detect delirium and dementia but not more subtle or delineated cognitive deficits. Formal neuropsychological evaluation provides greater sensitivity and detects a wider range of cognitive deficits but is too lengthy for efficient use at the bedside or in epidemiological studies. The authors developed the High Sensitivity Cognitive Screen (HSCS), a 20-minute interview-based test, to identify patients who show disorder on formal neuropsychological evaluation. An initial study demonstrated satisfactory test-retest and interrater reliability. The HSCS was then administered to 60 psychiatric and neurological patients with suspected cognitive deficits but without gross impairment, who also completed formal neuropsychological testing. Results of both tests were independently classified as either normal, borderline, or abnormal. The HSCS correctly classified 93% of patients across the normal-abnormal dichotomy and showed promise for characterizing the extent and severity of cognitive dysfunction.

  16. Optimizing sensitivity to γ with B⁰ → DK⁺π⁻, D → K_S⁰π⁺π⁻ double Dalitz plot analysis

    NASA Astrophysics Data System (ADS)

    Craik, D.; Gershon, T.; Poluektov, A.

    2018-03-01

    Two of the most powerful methods currently used to determine the angle γ of the CKM Unitarity Triangle exploit B⁺ → DK⁺, D → K_S⁰π⁺π⁻ decays and B⁰ → DK⁺π⁻, D → K⁺K⁻, π⁺π⁻ decays. It is possible to combine the strengths of both approaches in a "double Dalitz plot" analysis of B⁰ → DK⁺π⁻, D → K_S⁰π⁺π⁻ decays. The potential sensitivity of such an analysis is investigated in the light of recently published experimental information on the B⁰ → DK⁺π⁻ decay. The formalism is also expanded, compared to previous discussions in the literature, to allow B⁰ → DK⁺π⁻ with any subsequent D decay to be included.

  17. Derivation and validation of the automated search algorithms to identify cognitive impairment and dementia in electronic health records.

    PubMed

    Amra, Sakusic; O'Horo, John C; Singh, Tarun D; Wilson, Gregory A; Kashyap, Rahul; Petersen, Ronald; Roberts, Rosebud O; Fryer, John D; Rabinstein, Alejandro A; Gajic, Ognjen

    2017-02-01

    Long-term cognitive impairment is a common and important problem in survivors of critical illness. We developed electronic search algorithms to identify cognitive impairment and dementia from the electronic medical records (EMRs) that provide opportunity for big data analysis. Eligible patients met 2 criteria. First, they had a formal cognitive evaluation by The Mayo Clinic Study of Aging. Second, they were hospitalized in intensive care unit at our institution between 2006 and 2014. The "criterion standard" for diagnosis was formal cognitive evaluation supplemented by input from an expert neurologist. Using all available EMR data, we developed and improved our algorithms in the derivation cohort and validated them in the independent validation cohort. Of 993 participants who underwent formal cognitive testing and were hospitalized in intensive care unit, we selected 151 participants at random to form the derivation and validation cohorts. The automated electronic search algorithm for cognitive impairment was 94.3% sensitive and 93.0% specific. The search algorithms for dementia achieved respective sensitivity and specificity of 97% and 99%. EMR search algorithms significantly outperformed International Classification of Diseases codes. Automated EMR data extractions for cognitive impairment and dementia are reliable and accurate and can serve as acceptable and efficient alternatives to time-consuming manual data review. Copyright © 2016 Elsevier Inc. All rights reserved.

  18. Subsumption principles underlying medical concept systems and their formal reconstruction.

    PubMed Central

    Bernauer, J.

    1994-01-01

    Conventional medical concept systems represent generic concept relations by hierarchical coding principles. Often, these coding principles constrain the concept system and reduce the potential for automatic derivation of subsumption. Formal reconstruction of medical concept systems is an approach that is based on the conceptual representation of meanings and that allows formal criteria for subsumption to be applied. Those criteria must reflect the intuitive principles of subordination that underlie conventional medical concept systems. In particular, these are: the subordinate concept results (1) from adding a specializing criterion to the superordinate concept, (2) from refining the primary category, or a criterion of the superordinate concept, by a concept that is less general, (3) from adding a partitive criterion to a criterion of the superordinate, (4) from refining a criterion by a concept that is less comprehensive, and finally (5) from coordinating the superordinate concept, or one of its criteria. This paper introduces a formalism called BERNWARD that aims at the formal reconstruction of medical concept systems according to these intuitive principles. The automatic derivation of hierarchical relations is supported primarily by explicit generic and explicit partitive hierarchies of concepts and, secondly, by two formal criteria that are based on the structure of concept descriptions and on explicit hierarchical relations between their elements, namely formal subsumption and part-sensitive subsumption. Formal subsumption takes only generic relations into account; part-sensitive subsumption additionally regards partitive relations between criteria. This approach seems to be flexible enough to cope with unforeseeable effects of partitive criteria on subsumption. PMID:7949907

  19. An Attempt of Formalizing the Selection Parameters for Settlements Generalization in Small-Scales

    NASA Astrophysics Data System (ADS)

    Karsznia, Izabela

    2014-12-01

    The paper covers one of the most important problems concerning context-sensitive settlement selection for small-scale maps. So far, no formal parameters for small-scale settlement generalization have been specified, hence the problem is an important and innovative challenge. It is also crucial from the practical point of view, as appropriate generalization algorithms must be developed for the General Geographic Objects Database, an essential Spatial Data Infrastructure component in Poland. The author proposes and verifies quantitative generalization parameters for the settlement selection process in small-scale maps. The selection of settlements was carried out in two research areas: Lower Silesia and Łódź Province. Based on the conducted analysis, appropriate context-sensitive settlement selection parameters have been defined. Particular effort has been made to develop a methodology of quantitative settlement selection that would be useful in automation processes and that would preserve the specific character of the generalized objects.

  20. Through the Lens of Cultural Awareness: A Primer for US Armed Forces Deploying to Arab and Middle Eastern Countries

    DTIC Science & Technology

    2006-01-01

    Snippet from briefing slides (recovered content). Dimensions of cultural variation: behaviors (context sensitivity); values (individualism vs. collectivism, power distance, formality vs. informality, uncertainty avoidance, relationship vs. deal focus, long-term vs. short-term orientation, time orientation); cognition (reasoning styles); decision-related factors (willingness to compromise, risk avoidance, time to decision).

  1. In Search of Rationality: The Purposes behind the Use of Formal Analysis in Organizations.

    ERIC Educational Resources Information Center

    Langley, Ann

    1989-01-01

    Examines how formal analysis is actually practiced in 3 different organizations. Identifies 4 main groups of purposes for formal analysis and relates them to various hierarchical relationships. Formal analysis and social interaction seem inextricably linked in organizational decision-making. Different structural configurations may generate…

  2. Sensitivity Analysis of earth and environmental models: a systematic review to guide scientific advancement

    NASA Astrophysics Data System (ADS)

    Wagener, Thorsten; Pianosi, Francesca

    2016-04-01

    Sensitivity Analysis (SA) investigates how the variation in the output of a numerical model can be attributed to variations of its input factors. SA is increasingly being used in earth and environmental modelling for a variety of purposes, including uncertainty assessment, model calibration and diagnostic evaluation, dominant control analysis, and robust decision-making. Here we provide some practical advice regarding best practice in SA and discuss important open questions based on a detailed recent review of the existing body of work in SA. Open questions relate to the consideration of input factor interactions, methods for factor mapping, and the formal inclusion of discrete factors in SA (for example, for model structure comparison). We analyse these questions using relevant examples and discuss possible ways forward. We aim to stimulate discussion within the community of SA developers and users regarding the setting of good practices and the definition of priorities for future research.

  3. What Do We Mean By Sensitivity Analysis? The Need For A Comprehensive Characterization Of Sensitivity In Earth System Models

    NASA Astrophysics Data System (ADS)

    Razavi, S.; Gupta, H. V.

    2014-12-01

    Sensitivity analysis (SA) is an important paradigm in the context of Earth System model development and application, and provides a powerful tool that serves several essential functions in modelling practice, including 1) Uncertainty Apportionment - attribution of total uncertainty to different uncertainty sources, 2) Assessment of Similarity - diagnostic testing and evaluation of similarities between the functioning of the model and the real system, 3) Factor and Model Reduction - identification of non-influential factors and/or insensitive components of model structure, and 4) Factor Interdependence - investigation of the nature and strength of interactions between the factors, and the degree to which factors intensify, cancel, or compensate for the effects of each other. A variety of sensitivity analysis approaches have been proposed, each of which formally characterizes a different "intuitive" understanding of what is meant by the "sensitivity" of one or more model responses to its dependent factors (such as model parameters or forcings). These approaches are based on different philosophies and theoretical definitions of sensitivity, and range from simple local derivatives and one-factor-at-a-time procedures to rigorous variance-based (Sobol-type) approaches. In general, each approach focuses on, and identifies, different features and properties of the model response and may therefore lead to different (even conflicting) conclusions about the underlying sensitivity. This presentation revisits the theoretical basis for sensitivity analysis, and critically evaluates existing approaches so as to demonstrate their flaws and shortcomings. With this background, we discuss several important properties of response surfaces that are associated with the understanding and interpretation of sensitivity. Finally, a new approach towards global sensitivity assessment is developed that is consistent with important properties of Earth System model response surfaces.

  4. Cavity-enhanced Faraday rotation measurement with auto-balanced photodetection.

    PubMed

    Chang, Chia-Yu; Shy, Jow-Tsong

    2015-10-01

    Optical cavity enhancement of a tiny Faraday rotation is demonstrated with auto-balanced photodetection. This configuration is analyzed using the Jones matrix formalism. The resonant rotation signal is amplified, and thus the angular sensitivity is improved. In the experiment, the Faraday rotation of air is measured with an auto-balanced photoreceiver in single-pass and cavity geometries. The result shows that the Faraday rotation measured in the single-pass geometry is enhanced by a factor of 85 in the cavity geometry, and the sensitivity is improved to 7.54×10⁻¹⁰ rad/√Hz, which agrees well with the Jones matrix analysis. With this verification, we propose an AC magnetic sensor whose magnetic sensitivity is expected to reach 10 pT/√Hz.

  5. Sensitivity analysis of Repast computational ecology models with R/Repast.

    PubMed

    Prestes García, Antonio; Rodríguez-Patón, Alfonso

    2016-12-01

    Computational ecology is an emerging interdisciplinary discipline founded mainly on modeling and simulation methods for studying ecological systems. Among the existing modeling formalisms, individual-based modeling is particularly well suited for capturing the complex temporal and spatial dynamics, as well as the nonlinearities, arising in ecosystems, communities, or populations due to individual variability. In addition, being a bottom-up approach, it is useful for providing new insights into the local mechanisms that generate observed global dynamics. Of course, no conclusions about model results can be taken seriously if they are based on a single model execution and are not analyzed carefully. Therefore, a sound methodology should always be used to underpin the interpretation of model results. Sensitivity analysis is a methodology for quantitatively assessing the effect of input uncertainty on simulation output, and it should be a compulsory part of every work based on an in silico experimental setup. In this article, we present R/Repast, a GNU R package for running and analyzing Repast Simphony models, accompanied by two worked examples of how to perform global sensitivity analysis and how to interpret the results.

  6. The economics of protecting tiger populations: Linking household behavior to poaching and prey depletion

    USGS Publications Warehouse

    Damania, R.; Stringer, R.; Karanth, K.U.; Stith, B.

    2003-01-01

    The tiger (Panthera tigris) is classified as endangered and populations continue to decline. This paper presents a formal economic analysis of the two most imminent threats to the survival of wild tigers: poaching tigers and hunting their prey. A model is developed to examine interactions between tigers and farm households living in and around tiger habitats. The analysis extends the existing literature on tiger demography, incorporating predator-prey interactions and exploring the sensitivity of tiger populations to key economic parameters. The analysis aims to contribute to policy debates on how best to protect one of the world's most endangered wild cats.
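
    The kind of sensitivity exploration described can be sketched with a toy predator-prey model in which poaching and prey hunting enter as removal rates (all parameter values illustrative, not from the paper):

    ```python
    def simulate(poach=0.02, prey_hunt=0.1, years=50, dt=0.01):
        """Toy Lotka-Volterra dynamics: tigers T prey on ungulates U; households
        remove tigers (poach) and prey (prey_hunt). Illustrative only."""
        U, T = 1000.0, 40.0
        for _ in range(int(years / dt)):
            dU = (0.5 * U * (1 - U / 2000.0) - 0.002 * U * T - prey_hunt * U) * dt
            dT = (0.0004 * U * T - 0.2 * T - poach * T) * dt
            U, T = max(U + dU, 0.0), max(T + dT, 0.0)
        return T

    # Sensitivity of the long-run tiger population to the two threats:
    for poach in (0.0, 0.02, 0.05):
        for hunt in (0.0, 0.1, 0.2):
            print(f"poach={poach:.2f} hunt={hunt:.2f} tigers={simulate(poach, hunt):7.1f}")
    ```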

  7. Can feedback analysis be used to uncover the physical origin of climate sensitivity and efficacy differences?

    NASA Astrophysics Data System (ADS)

    Rieger, Vanessa S.; Dietmüller, Simone; Ponater, Michael

    2017-10-01

    Different strengths and types of radiative forcing cause variations in climate sensitivities and efficacies. To relate these changes to their physical origin, this study tests whether a feedback analysis is a suitable approach. To this end, we apply the partial radiative perturbation method. Combining the forward and backward calculations turns out to be indispensable to ensure the additivity of feedbacks and to yield a closed forcing-feedback balance at the top of the atmosphere. For a set of CO₂-forced simulations, the climate sensitivity changes with increasing forcing. The albedo, cloud, and combined water vapour and lapse rate feedbacks are found to be responsible for the variations in the climate sensitivity. An O₃-forced simulation (induced by enhanced NOₓ and CO surface emissions) yields a smaller efficacy than a CO₂-forced simulation with a similar magnitude of forcing. We find that the Planck, albedo, and most likely the cloud feedback are responsible for this effect. Reducing the radiative forcing impedes the statistical separability of the feedbacks. We additionally discuss formal inconsistencies between the common ways of comparing climate sensitivities and feedbacks. Moreover, methodical recommendations for future work are given.
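
    The bookkeeping behind such efficacy comparisons can be sketched numerically (illustrative feedback values, not the study's results): the net feedback parameter is the sum of the individual feedbacks, and equilibrium warming is the forcing divided by its negative:

    ```python
    # Forcing-feedback bookkeeping (illustrative numbers, W m^-2 K^-1):
    feedbacks_co2 = {"Planck": -3.2, "water vapour + lapse rate": 1.1,
                     "albedo": 0.4, "cloud": 0.5}
    feedbacks_o3  = {"Planck": -3.3, "water vapour + lapse rate": 1.1,
                     "albedo": 0.2, "cloud": 0.3}

    def sensitivity(feedbacks, forcing):
        lam = sum(feedbacks.values())        # net feedback parameter (negative = stable)
        return -forcing / lam                # equilibrium warming dT = -F / lambda

    F = 1.0                                  # same forcing magnitude (W m^-2)
    dT_co2, dT_o3 = sensitivity(feedbacks_co2, F), sensitivity(feedbacks_o3, F)
    efficacy = dT_o3 / dT_co2                # < 1, as found for the O3 case
    print(dT_co2, dT_o3, efficacy)
    ```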

  8. Concept similarity and related categories in information retrieval using formal concept analysis

    NASA Astrophysics Data System (ADS)

    Eklund, P.; Ducrou, J.; Dau, F.

    2012-11-01

    The application of formal concept analysis to the problem of information retrieval has been shown to be useful but has lacked any real analysis of the idea of relevance ranking of search results. SearchSleuth is a program developed to experiment with automated local analysis of Web search using formal concept analysis. SearchSleuth extends a standard search interface to include a conceptual neighbourhood centred on a formal concept derived from the initial query. This neighbourhood of the concept derived from the search terms is decorated with its upper and lower neighbours, representing more general and more specific concepts, respectively. SearchSleuth is in many ways an archetype of search engines based on formal concept analysis, with some novel features. In SearchSleuth, the notion of related categories, which are themselves formal concepts, is also introduced. This allows the retrieval focus to shift to a new formal concept called a sibling. This movement across the concept lattice needs to relate one formal concept to another in a principled way. This paper presents the issues concerning exploring, searching, and ordering the space of related categories. The focus is on understanding the use and meaning of proximity and semantic distance in the context of information retrieval using formal concept analysis.

  9. Effectiveness of groundwater governance structures and institutions in Tanzania

    NASA Astrophysics Data System (ADS)

    Gudaga, J. L.; Kabote, S. J.; Tarimo, A. K. P. R.; Mosha, D. B.; Kashaigili, J. J.

    2018-05-01

    This paper examines the effectiveness of groundwater governance structures and institutions in Mbarali District, Mbeya Region. The paper adopts an exploratory sequential research design to collect quantitative and qualitative data. A random sample of 90 groundwater users, 50% of them women, was involved in the survey. Descriptive statistics, the Kruskal-Wallis H test, and the Mann-Whitney U test were used to compare the differences in responses between groups, while qualitative data were subjected to content analysis. The results show that the Village Councils and Community Water Supply Organizations (COWSOs) were effective in governing groundwater. The results also show a statistically significant difference in the overall effectiveness of the Village Councils in governing groundwater between villages (P = 0.0001), yet there was no significant difference (P > 0.05) between male and female responses on the effectiveness of Village Councils, village water committees, and COWSOs. The Mann-Whitney U test showed a statistically significant difference between male and female responses on the effectiveness of formal and informal institutions (P = 0.0001), such that informal institutions were more effective than formal institutions. The Kruskal-Wallis H test also showed a statistically significant difference (P ≤ 0.05) in the extent of effectiveness of formal institutions, norms, and values between the low, medium, and high categories. The paper concludes that COWSOs were more effective in governing groundwater than other governance structures and, similarly, that norms and values were more effective than formal institutions. The paper recommends sensitization and awareness creation on formal institutions so that they can influence water users' behaviour in governing groundwater.
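
    The two tests named are standard nonparametric procedures; a minimal sketch with synthetic ratings (not the survey data) using SciPy:

    ```python
    import numpy as np
    from scipy.stats import kruskal, mannwhitneyu

    rng = np.random.default_rng(3)
    # Synthetic 1-5 effectiveness ratings (illustrative, not the survey data)
    village_a = rng.integers(1, 6, 30)
    village_b = rng.integers(2, 6, 30)
    village_c = rng.integers(1, 5, 30)
    male, female = rng.integers(1, 6, 45), rng.integers(1, 6, 45)

    # Kruskal-Wallis H: do ratings differ across the three villages?
    H, p_villages = kruskal(village_a, village_b, village_c)

    # Mann-Whitney U: do male and female ratings differ?
    U, p_gender = mannwhitneyu(male, female, alternative="two-sided")
    print(f"H={H:.2f} p={p_villages:.4f}; U={U:.0f} p={p_gender:.4f}")
    ```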

  10. Element sensitive reconstruction of nanostructured surfaces with finite elements and grazing incidence soft X-ray fluorescence.

    PubMed

    Soltwisch, Victor; Hönicke, Philipp; Kayser, Yves; Eilbracht, Janis; Probst, Jürgen; Scholze, Frank; Beckhoff, Burkhard

    2018-03-29

    The geometry of a Si3N4 lamellar grating was investigated experimentally with reference-free grazing-incidence X-ray fluorescence analysis. While simple layered systems are usually treated with the matrix formalism to determine the X-ray standing-wave field, this approach fails for laterally structured surfaces. Maxwell solvers based on finite elements are often used to model electrical field strengths for any 2D or 3D structures in the optical spectral range. We show that this approach can also be applied in the field of X-rays. The electrical field distribution obtained with the Maxwell solver can subsequently be used to calculate the fluorescence intensities in full analogy to the X-ray standing-wave field obtained by the matrix formalism. Only the effective 1D integration for the layer system has to be replaced by a 2D integration of the finite elements, taking into account the local excitation conditions. We will show that this approach is capable of reconstructing the geometric line shape of a structured surface with high elemental sensitivity. This combination of GIXRF and finite-element simulations paves the way for a versatile characterization of nanoscale-structured surfaces.

  11. Local influence for generalized linear models with missing covariates.

    PubMed

    Shi, Xiaoyan; Zhu, Hongtu; Ibrahim, Joseph G

    2009-12-01

    In the analysis of missing data, sensitivity analyses are commonly used to check the sensitivity of the parameters of interest with respect to the missing data mechanism and other distributional and modeling assumptions. In this article, we formally develop a general local influence method to carry out sensitivity analyses of minor perturbations to generalized linear models in the presence of missing covariate data. We examine two types of perturbation schemes (the single-case and global perturbation schemes) for perturbing various assumptions in this setting. We show that the metric tensor of a perturbation manifold provides useful information for selecting an appropriate perturbation. We also develop several local influence measures to identify influential points and test model misspecification. Simulation studies are conducted to evaluate our methods, and real datasets are analyzed to illustrate the use of our local influence measures.

  12. Developing Sensitivity to Subword Combinatorial Orthographic Regularity (SCORe): A Two-Process Framework

    ERIC Educational Resources Information Center

    Mano, Quintino R.

    2016-01-01

    Accumulating evidence suggests that literacy acquisition involves developing sensitivity to the statistical regularities of the textual environment. To organize accumulating evidence and help guide future inquiry, this article integrates data from disparate fields of study and formalizes a new two-process framework for developing sensitivity to…

  13. Descriptive Analysis and Strategic Options to Defeat Commodity-Based Threat Financing Methodologies Related to Gold

    DTIC Science & Technology

    2015-09-01

    …continue to occur in the Peruvian Andes and the low-lying Amazon basin, in the environmentally sensitive and protected region of Madre de Dios. In 2013, PwC stated that "six mining companies and the small producers of the region of Madre de Dios concentrate 62% of [gold] production" (PwC, 2013b, p. 16). … Illegal mining operations occur throughout Madre de Dios without attempts at formalization. In Madre de Dios, forests are clear-cut of vegetation…

  14. Sensitivity and specificity of a two-question screening tool for depression in a specialist palliative care unit.

    PubMed

    Payne, Ann; Barry, Sandra; Creedon, Brian; Stone, Carol; Sweeney, Catherine; O' Brien, Tony; O' Sullivan, Kathleen

    2007-04-01

    The primary objective of this study is to determine the sensitivity and specificity of a two-item screening interview for depression versus the formal psychiatric interview, in the setting of a specialist palliative in-patient unit, so that individuals suffering from depressive disorder can be identified and their management optimised in this often-complex population. A prospective sample of consecutive admissions (n = 167) consented to partake in the study, and the screening interview was administered separately from the formal psychiatric interview. The two-item questionnaire achieved a sensitivity of 90.7% (95% CI 76.9-97.0) but a lower specificity of 67.7% (95% CI 58.7-75.7). The false positive rate was 32.3% (95% CI 24.3-41.3), but the false negative rate was found to be a low 9.3% (95% CI 3.0-23.1). A subgroup analysis of individuals with a past experience of depressive illness (n = 95) revealed that a significantly larger proportion screened positive for depression by the screening test, 55.2% (16/29), compared with those with no background history of depression, 33.3% (22/66) (P = 0.045). The high sensitivity and low false negative rate of the two-question screening tool will aid health professionals in identifying depression in the in-patient specialist palliative care unit. Individuals who admit to a previous experience of depressive illness are more likely to respond positively to the two-item questionnaire than those who report no prior history of depressive illness (P = 0.045).
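
    The reported operating characteristics follow from a 2×2 table against the psychiatric-interview gold standard. A sketch with counts that reproduce the reported rates (our reconstruction, an assumption) and Wilson intervals (one common CI choice; the paper's exact CI method is not stated here):

    ```python
    from math import sqrt

    # 2x2 counts implied by the reported rates (n = 167, 43 depressed by
    # psychiatric interview): an assumption for illustration, not the paper's table.
    TP, FN, FP, TN = 39, 4, 40, 84

    def wilson(k, n, z=1.96):
        """Wilson score interval for a proportion k/n."""
        p = k / n
        centre = (p + z**2 / (2 * n)) / (1 + z**2 / n)
        half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / (1 + z**2 / n)
        return centre - half, centre + half

    sens, spec = TP / (TP + FN), TN / (TN + FP)
    print(f"sensitivity {sens:.1%}, CI {wilson(TP, TP + FN)}")
    print(f"specificity {spec:.1%}, CI {wilson(TN, TN + FP)}")
    ```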

  15. Putting it all together: Exhumation histories from a formal combination of heat flow and a suite of thermochronometers

    USGS Publications Warehouse

    d'Alessio, M. A.; Williams, C.F.

    2007-01-01

    A suite of new techniques in thermochronometry allow analysis of the thermal history of a sample over a broad range of temperature sensitivities. New analysis tools must be developed that fully and formally integrate these techniques, allowing a single geologic interpretation of the rate and timing of exhumation and burial events consistent with all data. We integrate a thermal model of burial and exhumation, (U-Th)/He age modeling, and fission track age and length modeling. We then use a genetic algorithm to efficiently explore possible time-exhumation histories of a vertical sample profile (such as a borehole), simultaneously solving for exhumation and burial rates as well as changes in background heat flow. We formally combine all data in a rigorous statistical fashion. By parameterizing the model in terms of exhumation rather than time-temperature paths (as traditionally done in fission track modeling), we can ensure that exhumation histories result in a sedimentary basin whose thickness is consistent with the observed basin, a physically based constraint that eliminates otherwise acceptable thermal histories. We apply the technique to heat flow and thermochronometry data from the 2.1-km-deep San Andreas Fault Observatory at Depth pilot hole near the San Andreas fault, California. We find that the site experienced <1 km of exhumation or burial since the onset of San Andreas fault activity ~30 Ma.

  16. Experience report: Using formal methods for requirements analysis of critical spacecraft software

    NASA Technical Reports Server (NTRS)

    Lutz, Robyn R.; Ampo, Yoko

    1994-01-01

    Formal specification and analysis of requirements continues to gain support as a method for producing more reliable software. However, the introduction of formal methods to a large software project is difficult, due in part to the unfamiliarity of the specification languages and the lack of graphics. This paper reports results of an investigation into the effectiveness of formal methods as an aid to the requirements analysis of critical, system-level fault-protection software on a spacecraft currently under development. Our experience indicates that formal specification and analysis can enhance the accuracy of the requirements and add assurance prior to design development in this domain. The work described here is part of a larger, NASA-funded research project whose purpose is to use formal-methods techniques to improve the quality of software in space applications. The demonstration project described here is part of the effort to evaluate experimentally the effectiveness of supplementing traditional engineering approaches to requirements specification with the more rigorous specification and analysis available with formal methods.

  17. Concepts of formal concept analysis

    NASA Astrophysics Data System (ADS)

    Žáček, Martin; Homola, Dan; Miarka, Rostislav

    2017-07-01

    The aim of this article is to apply Formal Concept Analysis to the concept of the world. Formal concept analysis (FCA), as a methodology of data analysis, information management, and knowledge representation, has the potential to be applied to a variety of linguistic problems. FCA is a mathematical theory of concepts and concept hierarchies that reflects an understanding of what a concept is. Formal concept analysis explicitly formalizes the extension and intension of a concept and their mutual relationships. A distinguishing feature of FCA is an inherent integration of three components of the conceptual processing of data and knowledge, namely the discovery of and reasoning with concepts in data, the discovery of and reasoning with dependencies in data, and the visualization of data, concepts, and dependencies with folding/unfolding capabilities.
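
    The two derivation operators at the heart of FCA are easy to state concretely; a minimal sketch over a toy cross-table (ours, not from the article):

    ```python
    # Minimal formal concept analysis: derivation operators over a small context.
    objects = {"sparrow": {"flies", "has_feathers"},
               "penguin": {"has_feathers", "swims"},
               "trout":   {"swims"}}

    def intent(objs):
        """Attributes shared by all given objects (the ' operator on object sets)."""
        if not objs:
            return set.union(*objects.values())   # empty object set -> all attributes
        return set.intersection(*(objects[o] for o in objs))

    def extent(attrs):
        """Objects having all given attributes (the ' operator on attribute sets)."""
        return {o for o, a in objects.items() if attrs <= a}

    # A formal concept is a pair (extent, intent) closed under both operators:
    A = {"has_feathers"}
    concept = (extent(A), intent(extent(A)))
    print(concept)   # ({'sparrow', 'penguin'}, {'has_feathers'})
    ```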

  18. Bedside Ultrasound in the Emergency Department to Detect Hydronephrosis for the Evaluation of Suspected Ureteric Colic.

    PubMed

    Shrestha, R; Shakya, R M; Khan A, A

    2016-01-01

    Background Renal colic is a common emergency department presentation. Hydronephrosis is an indirect sign of urinary obstruction, which may be due to an obstructing ureteric calculus, and can be detected easily by bedside ultrasound with minimal training. Objective To compare the accuracy of detection of hydronephrosis performed by the emergency physician with that of the radiologist in suspected renal colic cases. Method This was a prospective observational study performed over a period of 6 months. Patients >8 years with a provisional diagnosis of renal colic who had both the bedside ultrasound and the formal ultrasound performed were included. The presence of hydronephrosis in both ultrasounds, and the size and location of the ureteric stone if present in the formal ultrasound, were recorded. The accuracy of the emergency physician's detection of hydronephrosis was determined using the scan reported by the radiologists as the "gold standard", as computed tomography was unavailable. Statistical analysis was executed using SPSS 17.0. Result Among the 111 included patients, 56.7% had a ureteric stone detected in the formal ultrasound. The overall sensitivity, specificity, positive predictive value, and negative predictive value of bedside ultrasound performed by the emergency physician for detection of hydronephrosis, relative to formal ultrasound performed by the radiologist, were 90.8%, 78.3%, 85.5%, and 85.7%, respectively. Both bedside ultrasound and formal ultrasound detected hydronephrosis more often in patients with larger stones, and the difference was statistically significant (p < 0.001). Conclusion Bedside ultrasound can potentially be used as an important tool for detecting clinically significant hydronephrosis in the emergency department when evaluating suspected ureteric colic. Focused training in ultrasound could greatly improve the emergency management of these patients.

  19. Clinical Applicability and Cutoff Values for an Unstructured Neuropsychological Assessment Protocol for Older Adults with Low Formal Education

    PubMed Central

    de Paula, Jonas Jardim; Bertola, Laiss; Ávila, Rafaela Teixeira; Moreira, Lafaiete; Coutinho, Gabriel; de Moraes, Edgar Nunes; Bicalho, Maria Aparecida Camargos; Nicolato, Rodrigo; Diniz, Breno Satler; Malloy-Diniz, Leandro Fernandes

    2013-01-01

    Background and Objectives The neuropsychological exam plays a central role in the assessment of elderly patients with cognitive complaints. It is particularly relevant to differentiate patients with mild dementia from those with mild cognitive impairment. Formal education is a critical factor in neuropsychological performance; however, few studies have evaluated the psychometric properties, especially criterion-related validity, of neuropsychological tests for patients with low formal education. The present study aims to investigate the validity of an unstructured neuropsychological assessment protocol for this population and to develop cutoff values for clinical use. Methods and Results A protocol composed of the Rey Auditory Verbal Learning Test, Frontal Assessment Battery, Category and Letter Fluency, Stick Design Test, Clock Drawing Test, Digit Span, Token Test, and TN-LIN was administered to 274 older adults (96 normal aging, 85 mild cognitive impairment, and 93 mild Alzheimer's disease) with predominantly low formal education. Factor analysis showed a four-factor structure related to Executive Functions, Language/Semantic Memory, Episodic Memory, and Visuospatial Abilities, accounting for 65% of the explained variance. Most of the tests showed good sensitivity and specificity for differentiating the diagnostic groups. The neuropsychological protocol showed significant ecological validity, as 3 of the cognitive factors explained 31% of the variance in Instrumental Activities of Daily Living. Conclusion The study presents evidence of the construct, criterion, and ecological validity of this protocol. The neuropsychological tests and the proposed cutoff values can be used for the clinical assessment of older adults with low formal education. PMID:24066031

  20. Data Acquisition and Preprocessing in Studies on Humans: What Is Not Taught in Statistics Classes?

    PubMed

    Zhu, Yeyi; Hernandez, Ladia M; Mueller, Peter; Dong, Yongquan; Forman, Michele R

    2013-01-01

    The aim of this paper is to address issues in research that may be missing from statistics classes and important for (bio-)statistics students. In the context of a case study, we discuss data acquisition and preprocessing steps that fill the gap between research questions posed by subject matter scientists and statistical methodology for formal inference. Issues include participant recruitment, data collection training and standardization, variable coding, data review and verification, data cleaning and editing, and documentation. Despite the critical importance of these details in research, most of these issues are rarely discussed in an applied statistics program. One reason for the lack of more formal training is the difficulty in addressing the many challenges that can possibly arise in the course of a study in a systematic way. This article can help to bridge this gap between research questions and formal statistical inference by using an illustrative case study for a discussion. We hope that reading and discussing this paper and practicing data preprocessing exercises will sensitize statistics students to these important issues and achieve optimal conduct, quality control, analysis, and interpretation of a study.

  1. Generalized Linear Covariance Analysis

    NASA Technical Reports Server (NTRS)

    Carpenter, James R.; Markley, F. Landis

    2014-01-01

    This talk presents a comprehensive approach to filter modeling for generalized covariance analysis of both batch least-squares and sequential estimators. We review and extend in two directions the results of prior work that allowed for partitioning of the state space into "solve-for" and "consider" parameters, accounted for differences between the formal values and the true values of the measurement noise, process noise, and a priori solve-for and consider covariances, and explicitly partitioned the errors into subspaces containing only the influence of the measurement noise, process noise, and solve-for and consider covariances. In this work, we explicitly add sensitivity analysis to this prior work and relax an implicit assumption that the batch estimator's epoch time occurs prior to the definitive span. We also apply the method to an integrated orbit and attitude problem, in which gyro and accelerometer errors, though not estimated, influence the orbit determination performance. We illustrate our results using two graphical presentations, which we call the "variance sandpile" and the "sensitivity mosaic", and we compare the linear covariance results to confidence intervals associated with ensemble statistics from a Monte Carlo analysis.
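
    The solve-for/consider partition can be sketched for a batch least-squares estimator (a minimal linear version, not the authors' full filter formulation): consider parameters are not estimated, but their covariance inflates the true estimation error through a sensitivity matrix:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    m, ns, nc = 30, 3, 2                 # measurements, solve-for, consider params
    Hs = rng.normal(size=(m, ns))        # partials w.r.t. solve-for parameters
    Hc = rng.normal(size=(m, nc))        # partials w.r.t. consider parameters
    R  = 0.1 * np.eye(m)                 # measurement-noise covariance
    Pc = np.diag([0.5, 0.2])             # a priori consider covariance

    # Batch least-squares estimator for the solve-for parameters only:
    W = np.linalg.inv(R)
    P_formal = np.linalg.inv(Hs.T @ W @ Hs)        # formal (noise-only) covariance
    G = P_formal @ Hs.T @ W                        # estimator gain

    S = G @ Hc                                     # sensitivity to consider errors
    P_true = P_formal + S @ Pc @ S.T               # consider-inflated true covariance
    print(np.diag(P_formal), np.diag(P_true))
    ```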

  2. Age and Schooling Effects on Early Literacy and Phoneme Awareness

    ERIC Educational Resources Information Center

    Cunningham, Anna; Carroll, Julia

    2011-01-01

    Previous research on age and schooling effects is largely restricted to studies of children who begin formal schooling at 6 years of age, and the measures of phoneme awareness used have typically lacked sensitivity for beginning readers. Our study addresses these issues by testing 4- to 6-year-olds (first 2 years of formal schooling in the United…

  3. MOOC & B-Learning: Students' Barriers and Satisfaction in Formal and Non-Formal Learning Environments

    ERIC Educational Resources Information Center

    Gutiérrez-Santiuste, Elba; Gámiz-Sánchez, Vanesa-M.; Gutiérrez-Pérez, Jose

    2015-01-01

    The study presents a comparative analysis of two virtual learning formats: one non-formal through a Massive Open Online Course (MOOC) and the other formal through b-learning. We compare the communication barriers and the satisfaction perceived by the students (N = 249) by developing a qualitative analysis using semi-structured questionnaires and…

  4. Internal medicine point-of-care ultrasound assessment of left ventricular function correlates with formal echocardiography.

    PubMed

    Johnson, Benjamin K; Tierney, David M; Rosborough, Terry K; Harris, Kevin M; Newell, Marc C

    2016-02-01

    Although focused cardiac ultrasonographic (FoCUS) examination has been evaluated in emergency departments and intensive care units with good correlation to formal echocardiography, accuracy for the assessment of left ventricular systolic function (LVSF) when performed by internal medicine physicians still needs independent evaluation. This prospective observational study in a 640-bed, academic, quaternary care center, included 178 inpatients examined by 10 internal medicine physicians who had completed our internal medicine bedside ultrasound training program. The ability to estimate LVSF with FoCUS as "normal," "mild to moderately decreased," or "severely decreased" was compared with left ventricular ejection fraction (>50%, 31-49%, and <31%, respectively) from formal echocardiography interpreted by a cardiologist. Sensitivity and specificity of FoCUS for any degree of LVSF impairment were 0.91 (95% confidence interval [CI] 0.80, 0.97) and 0.88 (95% CI 0.81, 0.93), respectively. The interrater agreement between internal medicine physician-performed FoCUS and formal echocardiography for any LVSF impairment was "good/substantial" with κ = 0.77 (p < 0.001), 95% CI (0.67, 0.87). Formal echocardiography was classified as "technically limited due to patient factors" in 20% of patients; however, echogenicity was sufficient in 100% of FoCUS exams to classify LVSF. Internal medicine physicians using FoCUS identify normal versus decreased LVSF with high sensitivity, specificity, and "good/substantial" interrater agreement when compared with formal echocardiography. These results support the role of cardiac FoCUS by properly trained internal medicine physicians for discriminating normal from reduced LVSF. © 2015 Wiley Periodicals, Inc.

  5. An efficient method of reducing glass dispersion tolerance sensitivity

    NASA Astrophysics Data System (ADS)

    Sparrold, Scott W.; Shepard, R. Hamilton

    2014-12-01

    Constraining the Seidel aberrations of optical surfaces is a common technique for relaxing tolerance sensitivities in the optimization process. We offer an observation that a lens's Abbe number tolerance is directly related to the magnitude by which its longitudinal and transverse color are permitted to vary in production. Based on this observation, we propose a computationally efficient and easy-to-use merit function constraint for relaxing dispersion tolerance sensitivity. Using the relationship between an element's chromatic aberration and dispersion sensitivity, we derive a fundamental limit for lens scale and power that is capable of achieving high production yield for a given performance specification, which provides insight on the point at which lens splitting or melt fitting becomes necessary. The theory is validated by comparing its predictions to a formal tolerance analysis of a Cooke Triplet, and then applied to the design of a 1.5x visible linescan lens to illustrate optimization for reduced dispersion sensitivity. A selection of lenses in high volume production is then used to corroborate the proposed method of dispersion tolerance allocation.
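
    The observation can be made concrete with thin-lens chromatic bookkeeping (illustrative numbers; our sketch, not the paper's merit function): an element's axial color contribution scales as φ/V, so a first-order Abbe-number perturbation changes it by (φ/V²)·δV, i.e., by the relative amount δV/V:

    ```python
    # Thin-lens chromatic bookkeeping (illustrative): axial color ~ phi / V.
    phi = 1 / 50.0          # element power (1/mm), f = 50 mm
    V = 60.0                # nominal Abbe number
    dV = 0.5                # melt-to-melt Abbe tolerance

    color_nominal = phi / V
    d_color = -phi / V**2 * dV              # first-order change in axial color
    print(f"nominal color contribution: {color_nominal:.2e} 1/mm")
    print(f"change for dV = {dV}: {d_color:.2e} 1/mm "
          f"({abs(d_color) / color_nominal:.1%} of nominal)")
    # The relative change is dV/V, so high-power elements (large phi) consume a
    # fixed absolute color budget sooner, which motivates the lens-splitting limit.
    ```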

  6. Maternal employment in low- and middle-income countries is associated with improved infant and young child feeding.

    PubMed

    Oddo, Vanessa M; Ickes, Scott B

    2018-03-01

    Women's employment improves household income, and can increase resources available for food expenditure. However, employed women face time constraints that may influence caregiving and infant and young child feeding (IYCF) practices. As economic and social trends shift to include more women in the labor force in low- and middle-income countries (LMICs), a current understanding of the association between maternal employment and IYCF is needed. We investigated the association between maternal employment and IYCF. Using cross-sectional samples from 50 Demographic and Health Surveys, we investigated the association between maternal employment and 3 indicators of IYCF: exclusive breastfeeding (EBF) among children aged <6 mo (n = 47,340) and minimum diet diversity (MDD) and minimum meal frequency (MMF) (n = 137,208) among children aged 6-23 mo. Mothers were categorized as formally employed, informally employed, or nonemployed. We used meta-analysis to pool associations across all countries and by region. According to pooled estimates, neither formal [pooled odds ratio (POR) = 0.91; 95% CI: 0.81, 1.03] nor informal employment (POR = 1.05; 95% CI: 0.95, 1.16), compared to nonemployment, was associated with EBF. Children of both formally and informally employed women, compared to children of nonemployed women, had higher odds of meeting MDD (formal POR = 1.47; 95% CI: 1.35, 1.60; informal POR = 1.11; 95% CI: 1.03, 1.20) and MMF (formal POR = 1.18; 95% CI: 1.10, 1.26; informal POR = 1.15; 95% CI: 1.06, 1.24). Sensitivity analyses indicated that compared to nonemployed mothers, the odds of continued breastfeeding at 1 y were lower among formally employed mothers (POR = 0.82; 95% CI: 0.73, 0.98) and higher among informally employed mothers (POR = 1.19; 95% CI: 1.01, 1.40). Efforts to promote formalized employment among mothers may be an effective method for improving diet diversity and feeding frequency in LMICs. Formally employed mothers may benefit from support for breastfeeding to enable continued breastfeeding through infancy. This trial was registered at clinicaltrials.gov as NCT03209999.
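
    Pooled odds ratios like those quoted above are commonly obtained by inverse-variance weighting of log-ORs. The sketch below is a fixed-effect version with hypothetical country-level estimates; the paper's actual meta-analysis specification is not reproduced here.

    ```python
    import math

    def pool_odds_ratios(ors, cis):
        """Pool ORs given their (lower, upper) 95% CIs via inverse variance."""
        logs = [math.log(o) for o in ors]
        # Recover SE of log(OR) from CI width: SE = (ln(hi) - ln(lo)) / (2 * 1.96)
        ses = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for lo, hi in cis]
        w = [1 / se**2 for se in ses]
        pooled = sum(wi * li for wi, li in zip(w, logs)) / sum(w)
        se_p = math.sqrt(1 / sum(w))
        return (math.exp(pooled),
                (math.exp(pooled - 1.96 * se_p), math.exp(pooled + 1.96 * se_p)))

    # Hypothetical country estimates, not the survey data:
    por, ci = pool_odds_ratios([1.30, 1.55, 1.42],
                               [(1.1, 1.6), (1.3, 1.9), (1.2, 1.7)])
    print(f"POR = {por:.2f}, 95% CI ({ci[0]:.2f}, {ci[1]:.2f})")
    ```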

  7. Development of Multiobjective Optimization Techniques for Sonic Boom Minimization

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Aditi; Rajadas, John Narayan; Pagaldipti, Naryanan S.

    1996-01-01

    A discrete, semi-analytical sensitivity analysis procedure has been developed for calculating aerodynamic design sensitivities. The sensitivities of the flow variables and the grid coordinates are numerically calculated using direct differentiation of the respective discretized governing equations. The sensitivity analysis techniques are adapted within a parabolized Navier Stokes equations solver. Aerodynamic design sensitivities for high speed wing-body configurations are calculated using the semi-analytical sensitivity analysis procedures. Representative results obtained compare well with those obtained using the finite difference approach and establish the computational efficiency and accuracy of the semi-analytical procedures. Multidisciplinary design optimization procedures have been developed for aerospace applications, namely gas turbine blades and high speed wing-body configurations. In complex applications, the coupled optimization problems are decomposed into sublevels using multilevel decomposition techniques. In cases with multiple objective functions, formal multiobjective formulations such as the Kreisselmeier-Steinhauser function approach and the modified global criteria approach have been used. Nonlinear programming techniques for continuous design variables and a hybrid optimization technique, based on a simulated annealing algorithm, for discrete design variables have been used for solving the optimization problems. The optimization procedure for gas turbine blades improves the aerodynamic and heat transfer characteristics of the blades. The two-dimensional, blade-to-blade aerodynamic analysis is performed using a panel code. The blade heat transfer analysis is performed using an in-house developed finite element procedure. The optimization procedure yields blade shapes with significantly improved velocity and temperature distributions. The multidisciplinary design optimization procedures for high speed wing-body configurations simultaneously improve the aerodynamic, the sonic boom and the structural characteristics of the aircraft. The flow solution is obtained using a comprehensive parabolized Navier Stokes solver. Sonic boom analysis is performed using an extrapolation procedure. The aircraft wing load carrying member is modeled as either an isotropic or a composite box beam. The isotropic box beam is analyzed using thin wall theory. The composite box beam is analyzed using a finite element procedure. The developed optimization procedures yield significant improvements in all the performance criteria and provide interesting design trade-offs. The semi-analytical sensitivity analysis techniques offer significant computational savings and allow the use of comprehensive analysis procedures within design optimization studies.
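
    The Kreisselmeier-Steinhauser function named above has a compact closed form: KS(x) = g_max + (1/rho) * ln(sum_i exp(rho * (g_i - g_max))), a smooth upper envelope of the component functions that tightens as rho grows. A minimal sketch with hypothetical objective values:

    ```python
    import math

    def ks(values, rho=50.0):
        """Kreisselmeier-Steinhauser envelope (shifted by g_max for stability)."""
        gmax = max(values)
        return gmax + math.log(sum(math.exp(rho * (g - gmax)) for g in values)) / rho

    objectives = [0.8, 1.1, 0.95]   # hypothetical normalized objective values
    print(f"KS envelope: {ks(objectives):.4f} (componentwise max is {max(objectives)})")
    ```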

  8. Determining the privacy policy deficiencies of health ICT applications through semi-formal modelling.

    PubMed

    Croll, Peter R

    2011-02-01

    To ensure that patient confidentiality is securely maintained, health ICT applications that contain sensitive personal information demand comprehensive privacy policies. Determining the adequacy of these policies to meet legal conformity together with clinical users and patient expectation is demanding in practice. Organisations and agencies looking to analyse their Privacy and Security policies can benefit from guidance provided by outside entities such as the Privacy Office of their State or Government together with law firms and ICT specialists. The advice given is not uniform and often open to different interpretations. Of greater concern is the possibility of overlooking any important aspects that later result in a data breach. Based on three case studies, this paper considers whether a more formal approach to privacy analysis could be taken that would help identify the full coverage of a Privacy Impact Analysis and determine the deficiencies with an organisation's current policies and approach. A diagrammatic model showing the relationships between Confidentiality, Privacy, Trust, Security and Safety is introduced. First the validity of this model is determined by mapping it against the real-world case studies taken from three healthcare services that depend on ICT. Then, by using software engineering methods, a formal mapping of the relationships is undertaken to identify a full set of policies needed to satisfy the model. How effective this approach may prove as a generic method for deriving a comprehensive set of policies in health ICT applications is finally discussed. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  9. Effect of formal and informal likelihood functions on uncertainty assessment in a single event rainfall-runoff model

    NASA Astrophysics Data System (ADS)

    Nourali, Mahrouz; Ghahraman, Bijan; Pourreza-Bilondi, Mohsen; Davary, Kamran

    2016-09-01

    In the present study, DREAM(ZS), Differential Evolution Adaptive Metropolis combined with both formal and informal likelihood functions, is used to investigate uncertainty of parameters of the HEC-HMS model in the Tamar watershed, Golestan province, Iran. In order to assess the uncertainty of the 24 parameters used in HMS, three flood events were used to calibrate and one flood event was used to validate the posterior distributions. Moreover, the performance of seven different likelihood functions (L1-L7) was assessed by means of the DREAM(ZS) approach. Four likelihood functions (L1-L4), Nash-Sutcliffe (NS) efficiency, Normalized absolute error (NAE), Index of agreement (IOA), and Chiew-McMahon efficiency (CM), are considered informal, whereas the remaining three (L5-L7) are formal. L5 focuses on the relationship between the traditional least squares fitting and the Bayesian inference, and L6 is a heteroscedastic maximum likelihood error (HMLE) estimator. Finally, in likelihood function L7, serial dependence of residual errors is accounted for using a first-order autoregressive (AR) model of the residuals. According to the results, sensitivities of the parameters strongly depend on the likelihood function, and vary for different likelihood functions. Most of the parameters were better defined by the formal likelihood functions L5 and L7 and showed a high sensitivity to model performance. Posterior cumulative distributions corresponding to the informal likelihood functions L1, L2, L3, L4 and the formal likelihood function L6 are approximately the same for most of the sub-basins, and these likelihood functions depict an almost similar effect on the sensitivity of the parameters. The 95% total prediction uncertainty bounds bracketed most of the observed data. Considering all the statistical indicators and criteria of uncertainty assessment, including RMSE, KGE, NS, P-factor and R-factor, the results showed that the DREAM(ZS) algorithm performed better under the formal likelihood functions L5 and L7, but likelihood function L5 may result in biased and unreliable estimation of parameters due to violation of the residual error assumptions. Thus, likelihood function L7 provides credible posterior distributions of the model parameters and can therefore be employed for further applications.
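
    The AR(1) treatment in L7 can be sketched compactly: transform the raw residuals into innovations and evaluate a Gaussian likelihood on those. The form below assumes Gaussian innovations and is only illustrative; the exact likelihood paired with DREAM(ZS) in the study may differ in detail.

    ```python
    import numpy as np

    def log_likelihood_ar1(obs, sim, sigma, phi):
        """Gaussian log-likelihood with AR(1)-correlated residual errors."""
        e = np.asarray(obs) - np.asarray(sim)   # raw residuals
        innov = e[1:] - phi * e[:-1]            # decorrelated innovations
        n = innov.size
        return (-0.5 * n * np.log(2 * np.pi * sigma**2)
                - 0.5 * np.sum(innov**2) / sigma**2)

    # Hypothetical hydrograph fit (flows in m^3/s):
    obs = np.array([5.0, 7.2, 12.5, 9.3, 6.1])
    sim = np.array([4.6, 7.9, 11.8, 9.9, 5.7])
    print(log_likelihood_ar1(obs, sim, sigma=0.6, phi=0.4))
    ```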

  10. Is probabilistic bias analysis approximately Bayesian?

    PubMed Central

    MacLehose, Richard F.; Gustafson, Paul

    2011-01-01

    Case-control studies are particularly susceptible to differential exposure misclassification when exposure status is determined following incident case status. Probabilistic bias analysis methods have been developed as ways to adjust standard effect estimates based on the sensitivity and specificity of exposure misclassification. The iterative sampling method advocated in probabilistic bias analysis bears a distinct resemblance to a Bayesian adjustment; however, it is not identical. Furthermore, without a formal theoretical framework (Bayesian or frequentist), the results of a probabilistic bias analysis remain somewhat difficult to interpret. We describe, both theoretically and empirically, the extent to which probabilistic bias analysis can be viewed as approximately Bayesian. While the differences between probabilistic bias analysis and Bayesian approaches to misclassification can be substantial, these situations often involve unrealistic prior specifications and are relatively easy to detect. Outside of these special cases, probabilistic bias analysis and Bayesian approaches to exposure misclassification in case-control studies appear to perform equally well. PMID:22157311
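
    The iterative sampling referred to above can be illustrated in a few lines: draw sensitivity and specificity from priors, back-correct the observed 2x2 counts, and accumulate adjusted odds ratios. The counts and priors below are hypothetical, and only the simple nondifferential case is shown.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def bias_adjusted_or(a, b, c, d, n_iter=5000):
        """a, b: exposed/unexposed cases; c, d: exposed/unexposed controls."""
        ors = []
        for _ in range(n_iter):
            se = rng.beta(80, 20)   # prior draw for sensitivity
            sp = rng.beta(95, 5)    # prior draw for specificity
            # Back-correct observed exposed counts for misclassification:
            a0 = (a - (a + b) * (1 - sp)) / (se + sp - 1)
            c0 = (c - (c + d) * (1 - sp)) / (se + sp - 1)
            b0, d0 = (a + b) - a0, (c + d) - c0
            if min(a0, b0, c0, d0) > 0:
                ors.append((a0 * d0) / (b0 * c0))
        return np.percentile(ors, [2.5, 50, 97.5])

    print(bias_adjusted_or(a=45, b=55, c=25, d=75))  # hypothetical counts
    ```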

  11. Deep first formal concept search.

    PubMed

    Zhang, Tao; Li, Hui; Hong, Wenxue; Yuan, Xiamei; Wei, Xinyu

    2014-01-01

    The calculation of formal concepts is a very important part of the theory of formal concept analysis (FCA); however, computing all formal concepts is the main challenge within the FCA framework because of its exponential complexity and the difficulty of visualizing the calculation process. Building on the basic idea of depth-first search, this paper presents a visualization algorithm based on the attribute topology of the formal context. Constrained by the calculation rules, all concepts are obtained by a global visual search over the topology, degenerated with fixed start and end points, without repetition or omission. This method makes the calculation of formal concepts precise and easy to operate and reflects the integrity of the algorithm, making it suitable for visualization analysis.
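
    For readers new to FCA, the sketch below shows the standard derivation operators on a tiny hypothetical context; a formal concept is a pair (A, B) that is a fixed point of the two maps. The paper's attribute-topology search itself is not reproduced here.

    ```python
    context = {            # hypothetical formal context: object -> attributes
        "o1": {"a", "b"},
        "o2": {"a", "c"},
        "o3": {"a", "b", "c"},
    }

    def intent(objs):
        """Attributes shared by every object in objs (the prime operator)."""
        if not objs:
            return set.union(*context.values())
        return set.intersection(*(context[o] for o in objs))

    def extent(attrs):
        """Objects possessing every attribute in attrs (the dual operator)."""
        return {o for o, a in context.items() if attrs <= a}

    A = extent({"a", "b"})
    B = intent(A)
    print(A, B)   # ({'o1', 'o3'}, {'a', 'b'}) is a formal concept: extent(B) == A
    ```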

  12. Participation, Power, and Policy: Developing a Gender-Sensitive Political Geography.

    ERIC Educational Resources Information Center

    Cope, Meghan

    1997-01-01

    Outlines three approaches to exploring gender-sensitive political geography: (1) examining formal (voting and holding office) and informal (activism and interest groups) political activity; (2) delineating the variety of gender-based power constructs; and (3) reviewing recent policy developments, specifically the Family and Medical Leave Act of…

  13. Meta-analysis of diagnostic test data: a bivariate Bayesian modeling approach.

    PubMed

    Verde, Pablo E

    2010-12-30

    In the last decades, the amount of published results on clinical diagnostic tests has expanded very rapidly. The counterpart to this development has been the formal evaluation and synthesis of diagnostic results. However, published results present substantial heterogeneity, and they can be regarded as so far removed from the classical domain of meta-analysis that they provide a rather severe test of classical statistical methods. Recently, bivariate random effects meta-analytic methods, which model the pairs of sensitivities and specificities, have been presented from the classical point of view. In this work a bivariate Bayesian modeling approach is presented. This approach substantially extends the scope of classical bivariate methods by allowing the structural distribution of the random effects to depend on multiple sources of variability. Meta-analysis is summarized by the predictive posterior distributions for sensitivity and specificity. This new approach also allows substantial model checking, model diagnostics, and model selection. Statistical computations are implemented in public domain statistical software (WinBUGS and R) and illustrated with real data examples. Copyright © 2010 John Wiley & Sons, Ltd.
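
    The core bivariate structure can be written generatively: study-level logit-sensitivity and logit-specificity are drawn from a correlated normal distribution, with binomial sampling within studies. The sketch below simulates that structure with assumed hyperparameters; it is not the paper's Bayesian fit, which the authors implement in WinBUGS and R.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    mu = np.array([1.8, 2.2])            # mean logit(sens), logit(spec) (assumed)
    cov = np.array([[0.30, -0.12],
                    [-0.12, 0.25]])      # between-study covariance (assumed)

    def simulate_studies(n_studies=8, n_diseased=60, n_healthy=80):
        theta = rng.multivariate_normal(mu, cov, size=n_studies)
        sens = 1 / (1 + np.exp(-theta[:, 0]))
        spec = 1 / (1 + np.exp(-theta[:, 1]))
        tp = rng.binomial(n_diseased, sens)   # within-study binomial sampling
        tn = rng.binomial(n_healthy, spec)
        return tp, tn, sens, spec

    tp, tn, sens, spec = simulate_studies()
    print("simulated study sensitivities:", np.round(sens, 2))
    ```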

  14. Towards the Construction of a Personal Professional Pathway: An Experimental Project for the Recognition of Non-Formal and Informal Learning in the University of Catania

    ERIC Educational Resources Information Center

    Piazza, Roberta

    2013-01-01

    In Italy, accreditation of prior learning is a sensitive issue. Despite the lack of laws or qualification frameworks regulating the recognition of non-formal and informal learning, most Italian universities proceed with caution, allowing only a restricted number of credits in the university curriculum related to practical activities or to external…

  15. Defining the optimal therapy sequence in synchronous resectable liver metastases from colorectal cancer: a decision analysis approach.

    PubMed

    Van Dessel, E; Fierens, K; Pattyn, P; Van Nieuwenhove, Y; Berrevoet, F; Troisi, R; Ceelen, W

    2009-01-01

    Approximately 5%-20% of colorectal cancer (CRC) patients present with synchronous potentially resectable liver metastatic disease. Preclinical and clinical studies suggest a benefit of the 'liver first' approach, i.e. resection of the liver metastasis followed by resection of the primary tumour. A formal decision analysis may support a rational choice between several therapy options. Survival and morbidity data were retrieved from relevant clinical studies identified by a Web of Science search. Data were entered into decision analysis software (TreeAge Pro 2009, Williamstown, MA, USA). Transition probabilities including the risk of death from complications or disease progression associated with individual therapy options were entered into the model. Sensitivity analysis was performed to evaluate the model's validity under a variety of assumptions. The result of the decision analysis confirms the superiority of the 'liver first' approach. Sensitivity analysis demonstrated that this assumption is valid on condition that the mortality associated with the hepatectomy first is < 4.5%, and that the mortality of colectomy performed after hepatectomy is < 3.2%. The results of this decision analysis suggest that, in patients with synchronous resectable colorectal liver metastases, the 'liver first' approach is to be preferred. Randomized trials will be needed to confirm the results of this simulation based outcome.
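
    A decision tree of this kind reduces to expected-value arithmetic over branch probabilities. The toy sketch below uses hypothetical mortality and survival figures rather than the study's inputs, and adds a one-way sensitivity sweep of the sort described.

    ```python
    def expected_survival(p_death_first_op, p_death_second_op, survival_if_ok):
        """Expected payoff (months) of a two-stage surgical strategy."""
        p_complete = (1 - p_death_first_op) * (1 - p_death_second_op)
        return p_complete * survival_if_ok

    # Hypothetical inputs, not the study's data:
    liver_first = expected_survival(0.03, 0.02, 42.0)
    primary_first = expected_survival(0.025, 0.035, 38.0)
    print(f"liver first: {liver_first:.1f} mo, primary first: {primary_first:.1f} mo")

    # One-way sensitivity analysis over hepatectomy mortality:
    for p in (0.01, 0.03, 0.045, 0.06):
        print(p, round(expected_survival(p, 0.02, 42.0), 2))
    ```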

  16. An Analysis of the Formal Features of "Reality-Based" Television Programs.

    ERIC Educational Resources Information Center

    Neapolitan, D. M.

    Reality-based television programs showcase actual footage or recreate actual events, and include programs such as "America's Most Wanted" and "Rescue 911." To identify the features that typify reality-based television programs, this study conducted an analysis of formal features used in reality-based programs. Formal features…

  17. Using Formal Methods to Assist in the Requirements Analysis of the Space Shuttle GPS Change Request

    NASA Technical Reports Server (NTRS)

    DiVito, Ben L.; Roberts, Larry W.

    1996-01-01

    We describe a recent NASA-sponsored pilot project intended to gauge the effectiveness of using formal methods in Space Shuttle software requirements analysis. Several Change Requests (CR's) were selected as promising targets to demonstrate the utility of formal methods in this application domain. A CR to add new navigation capabilities to the Shuttle, based on Global Positioning System (GPS) technology, is the focus of this report. Carried out in parallel with the Shuttle program's conventional requirements analysis process was a limited form of analysis based on formalized requirements. Portions of the GPS CR were modeled using the language of SRI's Prototype Verification System (PVS). During the formal methods-based analysis, numerous requirements issues were discovered and submitted as official issues through the normal requirements inspection process. Shuttle analysts felt that many of these issues were uncovered earlier than would have occurred with conventional methods. We present a summary of these encouraging results and conclusions we have drawn from the pilot project.

  18. Polarization holograms allow highly efficient generation of complex light beams.

    PubMed

    Ruiz, U; Pagliusi, P; Provenzano, C; Volke-Sepúlveda, K; Cipparrone, Gabriella

    2013-03-25

    We report a viable method to generate complex beams, such as the non-diffracting Bessel and Weber beams, which relies on the encoding of amplitude information, in addition to phase and polarization, using polarization holography. The holograms are recorded in polarization sensitive films by the interference of a reference plane wave with a tailored complex beam, having orthogonal circular polarizations. The high efficiency, the intrinsic achromaticity and the simplicity of use of the polarization holograms make them competitive with respect to existing methods and attractive for several applications. Theoretical analysis, based on the Jones formalism, and experimental results are shown.
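
    The Jones formalism mentioned above operates on 2-vectors and 2x2 matrices. The sketch below only illustrates that machinery, propagating a circularly polarized beam through a half-wave plate; the paper's hologram analysis is considerably richer.

    ```python
    import numpy as np

    def waveplate(retardance, theta):
        """Jones matrix of a linear retarder with fast axis at angle theta."""
        c, s = np.cos(theta), np.sin(theta)
        R = np.array([[c, -s], [s, c]])
        W = np.array([[np.exp(-1j * retardance / 2), 0],
                      [0, np.exp(1j * retardance / 2)]])
        return R @ W @ R.T

    lcp = np.array([1, 1j]) / np.sqrt(2)        # circular input (one convention)
    out = waveplate(np.pi, theta=0.0) @ lcp     # half-wave plate
    print(np.round(out * np.sqrt(2) * 1j, 3))   # ~[1, -1j]: handedness flipped
    ```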

  19. Critical Analysis on Open Source LMSs Using FCA

    ERIC Educational Resources Information Center

    Sumangali, K.; Kumar, Ch. Aswani

    2013-01-01

    The objective of this paper is to apply Formal Concept Analysis (FCA) to identify the best open source Learning Management System (LMS) for an E-learning environment. FCA is a mathematical framework that represents knowledge derived from a formal context. In constructing the formal context, LMSs are treated as objects and their features as…

  20. A New Measure of Text Formality: An Analysis of Discourse of Mao Zedong

    ERIC Educational Resources Information Center

    Li, Haiying; Graesser, Arthur C.; Conley, Mark; Cai, Zhiqiang; Pavlik, Philip I., Jr.; Pennebaker, James W.

    2016-01-01

    Formality has long been of interest in the study of discourse, with periodic discussions of the best measure of formality and the relationship between formality and text categories. In this research, we explored what features predict formality as humans perceive the construct. We categorized a corpus consisting of 1,158 discourse samples published…

  1. Mixed messages: residents' experiences learning cross-cultural care.

    PubMed

    Park, Elyse R; Betancourt, Joseph R; Kim, Minah K; Maina, Angela W; Blumenthal, David; Weissman, Joel S

    2005-09-01

    An Institute of Medicine report issued in 2002 cited cross-cultural training as a mechanism to address racial and ethnic disparities in health care, but little is known about residents' training and capabilities to provide quality care to diverse populations. This article explores a select group of residents' perceptions of their preparedness to deliver quality care to diverse populations. Seven focus groups and ten individual interviews were conducted with 68 residents in locations nationwide. Qualitative analysis of focus-group and individual interview transcripts was performed to assess residents' perceptions of (1) preparedness to deliver care to diverse patients; (2) educational climate; and (3) training experiences. Most residents in this study noted the importance of cross-cultural care yet reported little formal training in this area. Residents wanted more formal training yet expressed concern that culture-specific training could lead to stereotyping. Most residents had developed ad hoc, informal skills to care for diverse patients. Although residents perceived institutional endorsement, they sensed it was a low priority due to lack of time and resources. Residents in this study reported receiving mixed messages about cross-cultural care. They were told it is important, yet they received little formal training and did not have time to treat diverse patients in a culturally sensitive manner. As a result, many developed coping behaviors rather than skills based on formally taught best practices. Training environments need to increase training to enhance residents' preparedness to deliver high-quality cross-cultural care if the medical profession is to achieve the goals set by the Institute of Medicine.

  2. Cost-Benefit Analysis of U.S. Copyright Formalities. Final Report.

    ERIC Educational Resources Information Center

    King Research, Inc., Rockville, MD.

    This study of the feasibility of conducting a cost-benefit analysis in the complex environment of the formalities used in the United States as part of its administration of the copyright law focused on the formalities of copyright notice, deposit, registration, and recordation. The U.S. system is also compared with the less centralized copyright…

  3. Aerospace engineering design by systematic decomposition and multilevel optimization

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.; Barthelemy, J. F. M.; Giles, G. L.

    1984-01-01

    A method for systematic analysis and optimization of large engineering systems, by decomposition of a large task into a set of smaller subtasks that are solved concurrently, is described. The subtasks may be arranged in hierarchical levels. Analyses are carried out in each subtask using inputs received from other subtasks, and are followed by optimizations carried out from the bottom up. Each optimization at the lower levels is augmented by an analysis of its sensitivity to the inputs received from other subtasks, to account for the couplings among the subtasks in a formal manner. The analysis and optimization operations alternate iteratively until they converge to a system design whose performance is maximized with all constraints satisfied. The method, which is still under development, is tentatively validated by test cases in structural applications and an aircraft configuration optimization.
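
    The key coupling device here, the sensitivity of a subtask optimum to the inputs it receives from the rest of the system, can be mocked up with a scalar subproblem. The sketch below uses a hypothetical quadratic-plus-quartic subtask and finite differences; the structural and aircraft cases in the paper are far richer.

    ```python
    from scipy.optimize import minimize_scalar

    def subtask_optimum(y):
        """Optimal subtask objective for a given system-level coupling input y."""
        res = minimize_scalar(lambda x: (x - y)**2 + 0.1 * x**4)
        return res.fun

    def optimum_sensitivity(y, h=1e-4):
        """d(f*)/dy by central finite differences, fed back to the system level."""
        return (subtask_optimum(y + h) - subtask_optimum(y - h)) / (2 * h)

    for y in (0.5, 1.0, 2.0):
        print(y, round(subtask_optimum(y), 4), round(optimum_sensitivity(y), 4))
    ```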

  4. Formalization of an environmental model using formal concept analysis - FCA

    NASA Astrophysics Data System (ADS)

    Bourdon-García, Rubén D.; Burgos-Salcedo, Javier D.

    2016-08-01

    Nowadays, there is a pressing need for novel strategies for analyzing social-ecological systems in order to resolve global sustainability problems. The main purpose of this paper is to apply formal concept analysis to formalize the theory of Augusto Ángel Maya, who was, without a doubt, one of the most important environmental philosophers in South America; Ángel Maya proposed and established that Ecosystem-Culture relations, rather than Human-Nature ones, are determinant in our understanding and management of natural resources. On this basis, a concept lattice, formal concepts, subconcept-superconcept relations, partially ordered sets, the supremum and infimum of the lattice, and implications between attributes (the Duquenne-Guigues base) were determined for the ecosystem-culture relations.

  5. An Ontology for State Analysis: Formalizing the Mapping to SysML

    NASA Technical Reports Server (NTRS)

    Wagner, David A.; Bennett, Matthew B.; Karban, Robert; Rouquette, Nicolas; Jenkins, Steven; Ingham, Michel

    2012-01-01

    State Analysis is a methodology developed over the last decade for architecting, designing and documenting complex control systems. Although it was originally conceived for designing robotic spacecraft, recent applications include the design of control systems for large ground-based telescopes. The European Southern Observatory (ESO) began a project to design the European Extremely Large Telescope (E-ELT), which will require coordinated control of over a thousand articulated mirror segments. The designers are using State Analysis as a methodology and the Systems Modeling Language (SysML) as a modeling and documentation language in this task. To effectively apply the State Analysis methodology in this context it became necessary to provide ontological definitions of the concepts and relations in State Analysis and greater flexibility through a mapping of State Analysis into a practical extension of SysML. The ontology provides the formal basis for verifying compliance with State Analysis semantics including architectural constraints. The SysML extension provides the practical basis for applying the State Analysis methodology with SysML tools. This paper will discuss the method used to develop these formalisms (the ontology), the formalisms themselves, the mapping to SysML and approach to using these formalisms to specify a control system and enforce architectural constraints in a SysML model.

  6. An exploration of student midwives' language to describe non-formal learning in professional practice.

    PubMed

    Finnerty, Gina; Pope, Rosemary

    2005-05-01

    The essence of non-formal learning in midwifery practice has not been previously explored. This paper provides an in-depth analysis of the language of a sample of student midwives' descriptions of their practice learning in a range of clinical settings. The students submitted audio-diaries as part of a national study (Pope, R., Graham, L., Finnerty, G., Magnusson, C., 2003. An investigation of the preparation and assessment for midwifery practice within a range of settings. Project Report. University of Surrey). Participants detailed their learning activities and support obtained whilst working with their named mentors for approximately 10 days or shifts. The rich audio-diary data have been analysed using Discourse Analysis. A typology of non-formal learning (Eraut, M., 2000. Non-formal learning and implicit knowledge in professional work. British Journal of Educational Psychology 70, 113-136) has been used to provide a framework for the analysis. Non-formal learning is defined as any learning which does not take place within a formally organised learning programme (Eraut, M., 2000. Non-formal learning and implicit knowledge in professional work. British Journal of Educational Psychology 70, 113-136). Findings indicate that fear and ambiguity hindered students' learning. Recommendations include the protection of time by mentors within the clinical curriculum to guide and supervise students in both formal and non-formal elements of midwifery practice. This paper will explore the implications of the findings for practice-based education.

  7. Formalizing and Enforcing Purpose Restrictions

    DTIC Science & Technology

    2012-05-09

    purpose restrictions [AKSX02, BBL05, HA05, AF07, BL08, PGY08, JSNS09, NBL+10, EKWB11]. However, each of these endeavors starts by assuming that actions... These works do not empirically show that their formalism corresponds to the actual meaning of purpose... methodology for organizing privacy policies and their enforcement [BBL05, BL08, NBL+10]. They associate purposes with sensitive resources and with roles

  8. Learning about gender on campus: an analysis of the hidden curriculum for medical students.

    PubMed

    Cheng, Ling-Fang; Yang, Hsing-Chen

    2015-03-01

    Gender sensitivity is a crucial factor in the provision of quality health care. This paper explores acquired gendered values and attitudes among medical students through an analysis of the hidden curriculum that exists within formal medical classes and informal learning. Discourse analysis was adopted as the research method. Data were collected from the Bulletin Board System (BBS), which represented an essential communication platform among students in Taiwan before the era of Facebook. The study examined 197 gender-related postings on the BBS boards of nine of 11 universities with a medical department in Taiwan, over a period of 10 years from 2000 to 2010. The five distinctive characteristics of the hidden curriculum were as follows: (i) gendered stereotypes of physiological knowledge; (ii) biased treatment of women; (iii) stereotyped gender-based division of labour; (iv) sexual harassment and a hostile environment, and (v) ridiculing of lesbian, gay, bisexual and transgender (LGBT) people. Both teachers and students co-produced a heterosexual masculine culture and sexism, including 'benevolent sexism' and 'hostile sexism'. As a result, the self-esteem and learning opportunities of female and LGBT students have been eroded. The paper explores gender dynamics in the context of a hidden curriculum in which heterosexual masculinity and stereotyped sexism are prevalent as norms. Both teachers and students, whether through formal medical classes or informal extracurricular interactive activities, are noted to contribute to the consolidation of such norms. The study tentatively suggests three strategies for integrating gender into medical education: (i) by separating physiological knowledge from gender stereotyping in teaching; (ii) by highlighting the importance of gender sensitivity in the language used within and outside the classroom by teachers and students, and (iii) by broadening the horizons of both teachers and students by recounting examples of the lived experiences of those who have been excluded and discriminated against, particularly members of LGBT and other minorities. © 2015 John Wiley & Sons Ltd.

  9. 49 CFR 236.923 - Task analysis and basic requirements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... classroom, simulator, computer-based, hands-on, or other formally structured training and testing, except... for Processor-Based Signal and Train Control Systems § 236.923 Task analysis and basic requirements...) Based on a formal task analysis, identify the installation, maintenance, repair, modification...

  10. Proceedings of the Second NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar (Editor)

    2010-01-01

    This publication contains the proceedings of the Second NASA Formal Methods Symposium sponsored by the National Aeronautics and Space Administration and held in Washington D.C. April 13-15, 2010. Topics covered include: Decision Engines for Software Analysis using Satisfiability Modulo Theories Solvers; Verification and Validation of Flight-Critical Systems; Formal Methods at Intel -- An Overview; Automatic Review of Abstract State Machines by Meta Property Verification; Hardware-independent Proofs of Numerical Programs; Slice-based Formal Specification Measures -- Mapping Coupling and Cohesion Measures to Formal Z; How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project; A Machine-Checked Proof of A State-Space Construction Algorithm; Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications; Modeling Regular Replacement for String Constraint Solving; Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol; Can Regulatory Bodies Expect Efficient Help from Formal Methods?; Synthesis of Greedy Algorithms Using Dominance Relations; A New Method for Incremental Testing of Finite State Machines; Verification of Faulty Message Passing Systems with Continuous State Space in PVS; Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking; A Prototype Embedding of Bluespec System Verilog in the PVS Theorem Prover; SimCheck: An Expressive Type System for Simulink; Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness; Software Model Checking of ARINC-653 Flight Code with MCP; Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B; Formal Verification of Large Software Systems; Symbolic Computation of Strongly Connected Components Using Saturation; Towards the Formal Verification of a Distributed Real-Time Automotive System; Slicing AADL Specifications for Model Checking; Model Checking with Edge-valued Decision Diagrams; and Data-flow based Model Analysis.

  11. Formalizing Space Shuttle Software Requirements

    NASA Technical Reports Server (NTRS)

    Crow, Judith; DiVito, Ben L.

    1996-01-01

    This paper describes two case studies in which requirements for new flight-software subsystems on NASA's Space Shuttle were analyzed, one using standard formal specification techniques, the other using state exploration. These applications serve to illustrate three main theses: (1) formal methods can complement conventional requirements analysis processes effectively, (2) formal methods confer benefits regardless of how extensively they are adopted and applied, and (3) formal methods are most effective when they are judiciously tailored to the application.

  12. Development of Boolean calculus and its application

    NASA Technical Reports Server (NTRS)

    Tapia, M. A.

    1979-01-01

    Formal procedures for the synthesis of asynchronous sequential systems using commercially available edge-sensitive flip-flops are developed. The Boolean differential is defined, and the exact number of compatible integrals of a Boolean differential was calculated.
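
    The Boolean differential at the heart of such a calculus has a standard definition: df/dx = f(x=0) XOR f(x=1), which equals 1 exactly when f is sensitive to x. A minimal sketch (the report's calculus goes well beyond this):

    ```python
    def boolean_difference(f, i, point):
        """Boolean difference of f w.r.t. variable i, evaluated at `point`."""
        lo = list(point); lo[i] = 0
        hi = list(point); hi[i] = 1
        return f(*lo) ^ f(*hi)

    f = lambda a, b, c: (a & b) | c
    # f is sensitive to a only when b = 1 and c = 0:
    print(boolean_difference(f, 0, (0, 1, 0)))  # 1
    print(boolean_difference(f, 0, (0, 1, 1)))  # 0
    ```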

  13. Ontological analysis of SNOMED CT.

    PubMed

    Héja, Gergely; Surján, György; Varga, Péter

    2008-10-27

    SNOMED CT is the most comprehensive medical terminology. However, its use for intelligent services based on formal reasoning is questionable. The analysis of the structure of SNOMED CT is based on the formal top-level ontology DOLCE. The analysis revealed several ontological and knowledge-engineering errors, the most important being errors in the hierarchy (mostly from an ontological point of view, but also regarding medical aspects) and the mixing of subsumption relations with other relation types (mostly 'part of'). The errors found impede formal reasoning. The paper presents a possible way to correct these problems.

  14. Formal Analysis of Extended Well-Clear Boundaries for Unmanned Aircraft

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar; Narkawicz, Anthony

    2016-01-01

    This paper concerns the application of formal methods to the definition of a detect and avoid concept for unmanned aircraft systems (UAS). In particular, it illustrates how formal analysis was used to explain and correct unexpected behaviors of the logic that issues alerts when two aircraft are predicted not to be well clear from one another. As a result of this analysis, a recommendation was proposed to, and subsequently adopted by, the US standards organization that defines the minimum operational requirements for the UAS detect and avoid concept.
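
    A heavily simplified flavor of a well-clear predicate is a distance-or-time check on the relative state. The sketch below is horizontal-only with assumed thresholds; the actual standardized definition analyzed in the paper has additional terms and a vertical component.

    ```python
    def horizontal_well_clear(s, v, DMOD=1.2, TAU_MOD=35.0):
        """s: relative position, v: relative velocity (consistent units)."""
        r2 = s[0]**2 + s[1]**2
        if r2 <= DMOD**2:
            return False                        # inside the distance threshold
        s_dot_v = s[0]*v[0] + s[1]*v[1]         # equals r*rdot; negative if closing
        if s_dot_v < 0:
            tau_mod = (DMOD**2 - r2) / s_dot_v  # modified-tau time metric
            if tau_mod <= TAU_MOD:
                return False
        return True

    print(horizontal_well_clear((5.0, 0.0), (-0.05, 0.0)))  # True: still far
    print(horizontal_well_clear((2.0, 0.0), (-0.05, 0.0)))  # False: predicted loss
    ```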

  15. Formal Assurance Arguments: A Solution In Search of a Problem?

    NASA Technical Reports Server (NTRS)

    Graydon, Patrick J.

    2015-01-01

    An assurance case comprises evidence and argument showing how that evidence supports assurance claims (e.g., about safety or security). It is unsurprising that some computer scientists have proposed formalizing assurance arguments: most associate formality with rigor. But while engineers can sometimes prove that source code refines a formal specification, it is not clear that formalization will improve assurance arguments or that this benefit is worth its cost. For example, formalization might reduce the benefits of argumentation by limiting the audience to people who can read formal logic. In this paper, we present (1) a systematic survey of the literature surrounding formal assurance arguments, (2) an analysis of errors that formalism can help to eliminate, (3) a discussion of existing evidence, and (4) suggestions for experimental work to definitively answer the question.

  16. Simplification and its consequences in biological modelling: conclusions from a study of calcium oscillations in hepatocytes.

    PubMed

    Hetherington, James P J; Warner, Anne; Seymour, Robert M

    2006-04-22

    Systems Biology requires that biological modelling is scaled up from small components to system level. This can produce exceedingly complex models, which obscure understanding rather than facilitate it. The successful use of highly simplified models would resolve many of the current problems faced in Systems Biology. This paper questions whether the conclusions of simple mathematical models of biological systems are trustworthy. The simplification of a specific model of calcium oscillations in hepatocytes is examined in detail, and the conclusions drawn from this scrutiny generalized. We formalize our choice of simplification approach through the use of functional 'building blocks'. A collection of models is constructed, each a progressively more simplified version of a well-understood model. The limiting model is a piecewise linear model that can be solved analytically. We find that, as expected, in many cases the simpler models produce incorrect results. However, when we make a sensitivity analysis, examining which aspects of the behaviour of the system are controlled by which parameters, the conclusions of the simple model often agree with those of the richer model. The hypothesis that the simplified model retains no information about the real sensitivities of the unsimplified model can be very strongly ruled out by treating the simplification process as a pseudo-random perturbation on the true sensitivity data. We conclude that sensitivity analysis is, therefore, of great importance to the analysis of simple mathematical models in biology. Our comparisons reveal which results of the sensitivity analysis regarding calcium oscillations in hepatocytes are robust to the simplifications necessarily involved in mathematical modelling. For example, we find that if a treatment is observed to strongly decrease the period of the oscillations while increasing the proportion of the cycle during which cellular calcium concentrations are rising, without affecting the inter-spike or maximum calcium concentrations, then it is likely that the treatment is acting on the plasma membrane calcium pump.
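
    The comparison at the center of this argument is straightforward to operationalize: rank the parameters of a full and a simplified model by normalized finite-difference sensitivities of some output and compare the rankings. The models below are placeholders, not the hepatocyte calcium models of the study.

    ```python
    import numpy as np

    def sensitivities(model, params, h=1e-3):
        """Normalized |d log y / d log p_i| by forward finite differences."""
        base = model(params)
        out = []
        for i, p in enumerate(params):
            q = params.copy()
            q[i] = p * (1 + h)
            out.append(abs((model(q) - base) / (base * h)))
        return np.array(out)

    full = lambda p: p[0]**0.8 * (1 + p[1]) / (0.5 + p[2])   # stand-in "rich" model
    simple = lambda p: p[0] * (1 + 0.9 * p[1])               # drops p[2] entirely

    p0 = np.array([1.0, 0.5, 2.0])
    print(np.argsort(-sensitivities(full, p0)))    # parameter ranking, rich model
    print(np.argsort(-sensitivities(simple, p0)))  # ranking after simplification
    ```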

  17. Diaphragm size and sensitivity for fiber optic pressure sensors

    NASA Technical Reports Server (NTRS)

    He, Gang; Cuomo, Frank W.; Zuckerwar, Allan J.

    1991-01-01

    A mechanism which leads to a significant increase in sensitivity and linear operating range in reflective type fiber optic pressure transducers with minute active dimensions is studied. A general theoretical formalism is presented which is in good agreement with the experimental data. These results are found useful in the development of small pressure sensors used in turbulent boundary layer studies and other applications.
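
    The size-sensitivity trade can be seen from standard small-deflection plate theory (assumed here as a stand-in; the paper develops its own formalism): the center deflection of a clamped circular diaphragm is w0 = 3(1 - nu^2) P a^4 / (16 E t^3), so sensitivity scales as a^4/t^3 and shrinking the active radius costs sensitivity rapidly unless the diaphragm is thinned.

    ```python
    def center_deflection(P, a, t, E=190e9, nu=0.27):
        """Clamped circular diaphragm, small-deflection theory (SI units)."""
        return 3 * (1 - nu**2) * P * a**4 / (16 * E * t**3)

    # Hypothetical 10-um-thick steel-like diaphragm, 1 kPa load:
    for a in (0.5e-3, 1.0e-3, 2.0e-3):   # radius in meters
        w0 = center_deflection(1e3, a, 10e-6)
        print(f"a = {a*1e3:.1f} mm: w0 = {w0*1e9:.0f} nm per kPa")
    ```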

  18. Critical Analysis of the Mathematical Formalism of Theoretical Physics. II. Foundations of Vector Calculus

    NASA Astrophysics Data System (ADS)

    Kalanov, Temur Z.

    2014-03-01

    A critical analysis of the foundations of standard vector calculus is proposed. The methodological basis of the analysis is the unity of formal logic and of rational dialectics. It is proved that the vector calculus is an incorrect theory because: (a) it is not based on a correct methodological basis - the unity of formal logic and of rational dialectics; (b) it does not contain the correct definitions of "movement," "direction" and "vector"; (c) it does not take into consideration the dimensions of physical quantities (i.e., number names, denominate numbers, concrete numbers) characterizing the concept of "physical vector," and, therefore, it has no natural-scientific meaning; (d) operations on "physical vectors" and the vector calculus propositions relating to "physical vectors" are contrary to formal logic.

  19. Revenue Potential for Inpatient IR Consultation Services: A Financial Model.

    PubMed

    Misono, Alexander S; Mueller, Peter R; Hirsch, Joshua A; Sheridan, Robert M; Siddiqi, Assad U; Liu, Raymond W

    2016-05-01

    Interventional radiology (IR) has historically failed to fully capture the value of evaluation and management services in the inpatient setting. Understanding financial benefits of a formally incorporated billing discipline may yield meaningful insights for interventional practices. A revenue modeling tool was created deploying standard financial modeling techniques, including sensitivity and scenario analyses. Sensitivity analysis calculates revenue fluctuation related to dynamic adjustment of discrete variables. In scenario analysis, possible future scenarios as well as revenue potential of different-size clinical practices are modeled. Assuming a hypothetical inpatient IR consultation service with a daily patient census of 35 patients and two new consults per day, the model estimates annual charges of $2.3 million and collected revenue of $390,000. Revenues are most sensitive to provider billing documentation rates and patient volume. A range of realistic scenarios-from cautious to optimistic-results in a range of annual charges of $1.8 million to $2.7 million and a collected revenue range of $241,000 to $601,000. Even a small practice with a daily patient census of 5 and 0.20 new consults per day may expect annual charges of $320,000 and collected revenue of $55,000. A financial revenue modeling tool is a powerful adjunct in understanding economics of an inpatient IR consultation service. Sensitivity and scenario analyses demonstrate a wide range of revenue potential and uncover levers for financial optimization. Copyright © 2016 SIR. Published by Elsevier Inc. All rights reserved.
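
    A toy version of such a revenue model makes the sensitivity structure obvious. All figures below are hypothetical, chosen only to land near the magnitudes quoted; they are not the article's model.

    ```python
    def annual_collections(daily_census, new_consults, charge_per_encounter,
                           documentation_rate, collection_rate, days=365):
        """Annual charges and collected revenue for an inpatient consult service."""
        encounters = (daily_census + new_consults) * days
        charges = encounters * charge_per_encounter * documentation_rate
        return charges, charges * collection_rate

    charges, revenue = annual_collections(35, 2, 180.0, 0.95, 0.17)
    print(f"charges ~${charges/1e6:.1f}M, collected ~${revenue/1e3:.0f}K")

    # One-way sensitivity to billing documentation rate (a lever noted above):
    for dr in (0.6, 0.75, 0.9, 1.0):
        print(dr, round(annual_collections(35, 2, 180.0, dr, 0.17)[1]))
    ```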

  20. Formal Methods for Life-Critical Software

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Johnson, Sally C.

    1993-01-01

    The use of computer software in life-critical applications, such as for civil air transports, demands the use of rigorous formal mathematical verification procedures. This paper demonstrates how to apply formal methods to the development and verification of software by leading the reader step-by-step through requirements analysis, design, implementation, and verification of an electronic phone book application. The current maturity and limitations of formal methods tools and techniques are then discussed, and a number of examples of the successful use of formal methods by industry are cited.

  1. What Do Differences Between Multi-voxel and Univariate Analysis Mean? How Subject-, Voxel-, and Trial-level Variance Impact fMRI Analysis

    PubMed Central

    Davis, Tyler; LaRocque, Karen F.; Mumford, Jeanette; Norman, Kenneth A.; Wagner, Anthony D.; Poldrack, Russell A.

    2014-01-01

    Multi-voxel pattern analysis (MVPA) has led to major changes in how fMRI data are analyzed and interpreted. Many studies now report both MVPA results and results from standard univariate voxel-wise analysis, often with the goal of drawing different conclusions from each. Because MVPA results can be sensitive to latent multidimensional representations and processes whereas univariate voxel-wise analysis cannot, one conclusion that is often drawn when MVPA and univariate results differ is that the activation patterns underlying MVPA results contain a multidimensional code. In the current study, we conducted simulations to formally test this assumption. Our findings reveal that MVPA tests are sensitive to the magnitude of voxel-level variability in the effect of a condition within subjects, even when the same linear relationship is coded in all voxels. We also find that MVPA is insensitive to subject-level variability in mean activation across an ROI, which is the primary variance component of interest in many standard univariate tests. Together, these results illustrate that differences between MVPA and univariate tests do not afford conclusions about the nature or dimensionality of the neural code. Instead, targeted tests of the informational content and/or dimensionality of activation patterns are critical for drawing strong conclusions about the representational codes that are indicated by significant MVPA results. PMID:24768930
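
    The paper's central manipulation is easy to reproduce in miniature: give every voxel the same linear effect of condition but vary its magnitude across voxels, and a pattern classifier can still decode condition even though no multidimensional code is present. The toy simulation below uses nearest-centroid decoding, not the study's exact pipeline.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    n_trials, n_voxels = 100, 50
    betas = rng.gamma(2.0, 1.0, n_voxels)    # voxel-level variability in ONE effect
    cond = rng.integers(0, 2, n_trials)      # two conditions
    X = np.outer(cond, betas) + rng.normal(0, 2.0, (n_trials, n_voxels))

    # Split-half nearest-centroid "MVPA":
    train, test = slice(0, 50), slice(50, 100)
    mus = [X[train][cond[train] == k].mean(0) for k in (0, 1)]
    pred = np.argmin([((X[test] - m)**2).sum(1) for m in mus], axis=0)
    print("decoding accuracy:", (pred == cond[test]).mean())
    ```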

  2. A focus-group study on spirituality and substance-abuse treatment

    PubMed Central

    Heinz, Adrienne J.; Disney, Elizabeth R.; Epstein, David H.; Glezen, Louise A.; Clark, Pamela I.; Preston, Kenzie L.

    2010-01-01

    Individuals recovering from addictions frequently cite spirituality as a helpful influence. However, little is known about whether or how spirituality could be incorporated into formal treatment in a manner that is sensitive to individual differences. In the present study, focus groups were conducted with 25 methadone-maintained outpatients (primarily high-school educated, African-American males) to examine beliefs about the role of spirituality in recovery and its appropriateness in formal treatment. Groups also discussed the relationship between spirituality and behavior during active addiction. Thematic analyses suggested that spirituality and religious practices suffered in complex ways during active addiction, but went “hand in hand” with recovery. Nearly all participants agreed that integration of a voluntary spiritual discussion group into formal treatment would be preferable to currently available alternatives. One limitation was that all participants identified as strongly spiritual. Studies of more diverse samples will help guide the development and evaluation of spiritually based interventions in formal treatment settings. PMID:20025443

  3. Informal and formal mental health: preliminary qualitative findings

    PubMed Central

    O'Neill, Linda; George, Serena; Koehn, Corinne; Shepard, Blythe

    2013-01-01

    Background: Northern-based research on mental health support, no matter the specific profession, helps to inform instruction of new practitioners and practitioners already working in rural or isolated conditions. Understanding the complexities of northern mental health support not only benefits clients and practitioners living in the North, but also helps prepare psychologists and counsellors preparing to work in other countries with large rural and isolated populations. The qualitative phase is part of a multi-year research study on informal and formal mental health support in northern Canada involving the use of qualitative and quantitative data collection and analysis methods. Objective: The main objective of the qualitative phase interviews was to document in-depth the situation of formal and informal helpers in providing mental health support in isolated northern communities in northern British Columbia, northern Alberta, Yukon and Northwest Territories (NWT). The intent of in-depth interviews was to collect descriptive information on the unique working conditions of northern helping practitioners for the development of a survey and subsequent community action plans for helping practitioner support. Design: Twenty participants in northern BC, Yukon and NWT participated in narrative interviews. Consensual qualitative research (CQR) was used in the analysis completed by 7 researchers. The principal researcher and research associate then worked through all 7 analyses, defining common categories and themes, and using selections from each researcher in order to ensure that everyone's analysis was represented in the final consensual summary. Results: The preliminary results include 7 main categories consisting of various themes. Defining elements of northern practice included the need for generalist knowledge and cultural sensitivity. The task of working with and negotiating membership in community was identified as essential for northern mental health support. The need for revised codes of ethics relevant to the reality of northern work was a major category, as was insight on how to best sustain northern practice. Conclusion: Many of the practitioners who participated in this study have found ways to overcome the biggest challenges of northern practice, yet the limitations of small populations and lack of resources in small communities to adequately address mental health support were identified as existing. Empowering communities by building community capacity to educate, supervise and support formal and informal mental health workers may be the best approach to overcoming the lack of external resources. PMID:23977648

  4. Informal and formal mental health: preliminary qualitative findings.

    PubMed

    O'Neill, Linda; George, Serena; Koehn, Corinne; Shepard, Blythe

    2013-01-01

    Northern-based research on mental health support, no matter the specific profession, helps to inform instruction of new practitioners and practitioners already working in rural or isolated conditions. Understanding the complexities of northern mental health support not only benefits clients and practitioners living in the North, but also helps prepare psychologists and counsellors preparing to work in other countries with large rural and isolated populations. The qualitative phase is part of a multi-year research study on informal and formal mental health support in northern Canada involving the use of qualitative and quantitative data collection and analysis methods. The main objective of the qualitative phase interviews was to document in-depth the situation of formal and informal helpers in providing mental health support in isolated northern communities in northern British Columbia, northern Alberta, Yukon and Northwest Territories (NWT). The intent of in-depth interviews was to collect descriptive information on the unique working conditions of northern helping practitioners for the development of a survey and subsequent community action plans for helping practitioner support. Twenty participants in northern BC, Yukon and NWT participated in narrative interviews. Consensual qualitative research (CQR) was used in the analysis completed by 7 researchers. The principal researcher and research associate then worked through all 7 analyses, defining common categories and themes, and using selections from each researcher in order to ensure that everyone's analysis was represented in the final consensual summary. The preliminary results include 7 main categories consisting of various themes. Defining elements of northern practice included the need for generalist knowledge and cultural sensitivity. The task of working with and negotiating membership in community was identified as essential for northern mental health support. The need for revised codes of ethics relevant to the reality of northern work was a major category, as was insight on how to best sustain northern practice. Many of the practitioners who participated in this study have found ways to overcome the biggest challenges of northern practice, yet the limitations of small populations and lack of resources in small communities to adequately address mental health support were identified as existing. Empowering communities by building community capacity to educate, supervise and support formal and informal mental health workers may be the best approach to overcoming the lack of external resources.

  5. IATA for skin sensitization potential – 1 out of 2 or 2 out of 3? ...

    EPA Pesticide Factsheets

    To meet EU regulatory requirements and to avoid or minimize animal testing, there is a need for non-animal methods to assess skin sensitization potential. Given the complexity of the skin sensitization endpoint, there is an expectation that integrated testing and assessment approaches (IATA) will need to be developed which rely on assays representing key events in the pathway. Three non-animal assays have been formally validated: the direct peptide reactivity assay (DPRA), the KeratinoSens™ assay and the h-CLAT assay. At the same time, there have been many efforts to develop IATA, with the “2 out of 3” approach attracting much attention, whereby a chemical is classified on the basis of the majority outcome. A set of 271 chemicals with mouse, human and non-animal sensitization test data was evaluated to compare the predictive performances of the 3 individual non-animal assays, their binary combinations and the ‘2 out of 3’ approach. The analysis revealed that the most predictive approach was to use both the DPRA and h-CLAT: 1. Perform DPRA – if positive, classify as a sensitizer; 2. If negative, perform h-CLAT – a positive outcome denotes a sensitizer, a negative, a non-sensitizer. With this approach, 83% (LLNA) and 93% (human) of the non-sensitizer predictions were correct, in contrast to the ‘2 out of 3’ approach, which had 69% (LLNA) and 79% (human) of non-sensitizer predictions correct. The views expressed are those of the authors and do not necessarily reflect the views or policies of the U.S. EPA.
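
    The two strategies compared above reduce to one-line decision functions over binary assay outcomes (True = positive). Only the classification logic is shown; assay thresholds and the data handling behind the analysis are not reproduced.

    ```python
    def sequential_dpra_hclat(dpra, h_clat):
        """DPRA first; h-CLAT is consulted only if DPRA is negative."""
        return True if dpra else h_clat

    def two_out_of_three(dpra, keratinosens, h_clat):
        """Majority vote across the three formally validated assays."""
        return sum([dpra, keratinosens, h_clat]) >= 2

    print(sequential_dpra_hclat(dpra=False, h_clat=True))                 # sensitizer
    print(two_out_of_three(dpra=False, keratinosens=False, h_clat=True))  # non-sensitizer
    ```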

  6. A formal and data-based comparison of measures of motor-equivalent covariation.

    PubMed

    Verrel, Julius

    2011-09-15

    Different analysis methods have been developed for assessing motor-equivalent organization of movement variability. In the uncontrolled manifold (UCM) method, the structure of variability is analyzed by comparing goal-equivalent and non-goal-equivalent variability components at the level of elemental variables (e.g., joint angles). In contrast, in the covariation by randomization (CR) approach, motor-equivalent organization is assessed by comparing variability at the task level between empirical and decorrelated surrogate data. UCM effects can be due to both covariation among elemental variables and selective channeling of variability to elemental variables with low task sensitivity ("individual variation"), suggesting a link between the UCM and CR method. However, the precise relationship between the notion of covariation in the two approaches has not been analyzed in detail yet. Analysis of empirical and simulated data from a study on manual pointing shows that in general the two approaches are not equivalent, but the respective covariation measures are highly correlated (ρ > 0.7) for two proposed definitions of covariation in the UCM context. For one-dimensional task spaces, a formal comparison is possible and in fact the two notions of covariation are equivalent. In situations in which individual variation does not contribute to UCM effects, for which necessary and sufficient conditions are derived, this entails the equivalence of the UCM and CR analysis. Implications for the interpretation of UCM effects are discussed. Copyright © 2011 Elsevier B.V. All rights reserved.
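
    The linearized UCM decomposition is a null-space projection: deviations of the elemental variables are split between the task Jacobian's null space (goal-equivalent) and its orthogonal complement. The sketch below uses an assumed one-dimensional task with three elemental variables, not the pointing data of the study; a per-DOF ratio V_ucm/V_ort above 1 would indicate goal-equivalent structuring.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    J = np.array([[1.0, 1.0, 1.0]])   # task Jacobian: 1-D task, 3 elemental variables
    _, _, Vt = np.linalg.svd(J)
    null = Vt[1:].T                   # basis of the UCM (null space of J)
    rang = Vt[:1].T                   # task-relevant direction

    dq = rng.normal(0, 1, (200, 3)) * [1.5, 1.2, 1.0]   # joint deviations (assumed)
    v_ucm = ((dq @ null)**2).sum(1).mean() / null.shape[1]
    v_ort = ((dq @ rang)**2).sum(1).mean() / rang.shape[1]
    print(f"V_ucm={v_ucm:.2f}/DOF, V_ort={v_ort:.2f}/DOF, ratio={v_ucm/v_ort:.2f}")
    ```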

  7. Formal Methods Specification and Analysis Guidebook for the Verification of Software and Computer Systems. Volume 2; A Practitioner's Companion

    NASA Technical Reports Server (NTRS)

    1995-01-01

    This guidebook, the second of a two-volume series, is intended to facilitate the transfer of formal methods to the avionics and aerospace community. The first volume concentrates on administrative and planning issues [NASA-95a], and the second volume focuses on the technical issues involved in applying formal methods to avionics and aerospace software systems. Hereafter, the term "guidebook" refers exclusively to the second volume of the series. The title of this second volume, A Practitioner's Companion, conveys its intent. The guidebook is written primarily for the nonexpert and requires little or no prior experience with formal methods techniques and tools. However, it does attempt to distill some of the more subtle ingredients in the productive application of formal methods. To the extent that it succeeds, those conversant with formal methods will also find the guidebook useful. The discussion is illustrated through the development of a realistic example, relevant fragments of which appear in each chapter. The guidebook focuses primarily on the use of formal methods for analysis of requirements and high-level design, the stages at which formal methods have been most productively applied. Although much of the discussion applies to low-level design and implementation, the guidebook does not discuss issues involved in the later life cycle application of formal methods.

  8. Restorative Practices as Formal and Informal Education

    ERIC Educational Resources Information Center

    Carter, Candice C.

    2013-01-01

    This article reviews restorative practices (RP) as education in formal and informal contexts of learning that are fertile sites for cultivating peace. Formal practices involve instruction about response to conflict, while informal learning occurs beyond academic lessons. The research incorporated content analysis and a critical examination of the…

  9. Formal hardware verification of digital circuits

    NASA Technical Reports Server (NTRS)

    Joyce, J.; Seger, C.-J.

    1991-01-01

    The use of formal methods to verify the correctness of digital circuits is less constrained by the growing complexity of digital circuits than conventional methods based on exhaustive simulation. This paper briefly outlines three main approaches to formal hardware verification: symbolic simulation, state machine analysis, and theorem-proving.

  10. Maternal employment and childhood overweight in low- and middle-income countries.

    PubMed

    Oddo, Vanessa M; Mueller, Noel T; Pollack, Keshia M; Surkan, Pamela J; Bleich, Sara N; Jones-Smith, Jessica C

    2017-10-01

    Objective: To investigate the association between maternal employment and childhood overweight in low- and middle-income countries (LMIC). Design/Setting: We utilized cross-sectional data from forty-five Demographic and Health Surveys from 2010 to 2016 (n = 268 763). Mothers were categorized as formally employed, informally employed or non-employed. We used country-specific logistic regression models to investigate the association between maternal employment and childhood overweight (BMI Z-score > 2) and assessed heterogeneity in the association by maternal education with the inclusion of an interaction term. We used meta-analysis to pool the associations across countries. Sensitivity analyses included modelling BMI Z-score and normal weight (weight-for-age Z-score ≥ -2 to < 2) as outcomes. Participants included children 0-5 years old and their mothers (aged 18-49 years). In most countries, neither formal nor informal employment was associated with childhood overweight. However, children of employed mothers, compared with children of non-employed mothers, had higher BMI Z-scores and higher odds of normal weight. In countries where the association varied by education, children of formally employed women with high education, compared with children of non-employed women with high education, had higher odds of overweight (pooled OR = 1·2; 95% CI 1·0, 1·4). We find no clear association between employment and child overweight. However, maternal employment is associated with a modestly higher BMI Z-score and normal weight, suggesting that employment is currently associated with beneficial effects on children's weight status in most LMIC.
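
    The pooling step named above is standard inverse-variance meta-analysis; as an illustrative aside, a minimal fixed-effect sketch (the country-level odds ratios and confidence intervals below are invented, not the study's estimates):

      import numpy as np

      # Fixed-effect inverse-variance pooling of log odds ratios.
      ors     = np.array([1.15, 1.30, 0.95])    # hypothetical country ORs
      ci_low  = np.array([0.90, 1.05, 0.70])
      ci_high = np.array([1.47, 1.61, 1.29])

      log_or = np.log(ors)
      se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)  # SE from CI width
      w = 1.0 / se**2                                        # inverse-variance weights
      pooled = np.sum(w * log_or) / np.sum(w)
      pooled_se = np.sqrt(1.0 / np.sum(w))
      lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
      print(f"pooled OR = {np.exp(pooled):.2f} "
            f"(95% CI {np.exp(lo):.2f}, {np.exp(hi):.2f})")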

  11. draco: Analysis and simulation of drift scan radio data

    NASA Astrophysics Data System (ADS)

    Shaw, J. Richard

    2017-12-01

    draco analyzes transit radio data with the m-mode formalism. It is telescope agnostic, and is used as part of the analysis and simulation pipeline for the CHIME (Canadian Hydrogen Intensity Mapping Experiment) telescope. It can simulate time stream data from maps of the sky (using the m-mode formalism) and add gain fluctuations and correctly correlated instrumental noise (i.e. Wishart distributed). Further, it can perform various cuts on the data and make maps of the sky from data using the m-mode formalism.
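
    The m-mode idea itself is compact: a drift-scan telescope's response is periodic in sidereal time, so Fourier-transforming the time stream over one sidereal day yields modes indexed by m that decouple in the analysis. A minimal numpy sketch with synthetic data (draco's actual containers and API are not shown here):

      import numpy as np

      # Hypothetical time stream: (frequency, baseline, sidereal-time samples)
      rng = np.random.default_rng(0)
      n_freq, n_baseline, n_time = 4, 8, 4096
      timestream = rng.standard_normal((n_freq, n_baseline, n_time))

      # FFT over the sidereal-time axis gives one coefficient per (freq,
      # baseline, m); each m can then be treated independently.
      mmodes = np.fft.rfft(timestream, axis=-1) / n_time
      print(mmodes.shape)   # (4, 8, 2049): modes m = 0 .. n_time // 2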

  12. Formalizing New Navigation Requirements for NASA's Space Shuttle

    NASA Technical Reports Server (NTRS)

    DiVito, Ben L.

    1996-01-01

    We describe a recent NASA-sponsored pilot project intended to gauge the effectiveness of using formal methods in Space Shuttle software requirements analysis. Several Change Requests (CRs) were selected as promising targets to demonstrate the utility of formal methods in this demanding application domain. A CR to add new navigation capabilities to the Shuttle, based on Global Positioning System (GPS) technology, is the focus of this industrial usage report. Portions of the GPS CR were modeled using the language of SRI's Prototype Verification System (PVS). During a limited analysis conducted on the formal specifications, numerous requirements issues were discovered. We present a summary of these encouraging results and conclusions we have drawn from the pilot project.

  13. Effects of time delay and pitch control sensitivity in the flared landing

    NASA Technical Reports Server (NTRS)

    Berthe, C. J.; Chalk, C. R.; Wingarten, N. C.; Grantham, W.

    1986-01-01

    Between December 1985 and January 1986, a flared landing program was conducted, using the USAF Total In-Flight Simulator airplane, to examine time delay effects in a formal manner. Results show that as pitch sensitivity is increased, tolerance to time delay decreases. With the proper selection of pitch sensitivity, Level 1 performance was maintained with time delays ranging from 150 milliseconds to greater than 300 milliseconds. With higher sensitivity, configurations with Level 1 performance at 150 milliseconds degraded to Level 2 at 200 milliseconds. When metrics of time delay and pitch sensitivity effects are applied to enhance previously developed predictive criteria, the result is an improved prediction technique which accounts for significant closed-loop items.

  14. A design automation framework for computational bioenergetics in biological networks.

    PubMed

    Angione, Claudio; Costanza, Jole; Carapezza, Giovanni; Lió, Pietro; Nicosia, Giuseppe

    2013-10-01

    The bioenergetic activity of mitochondria can be thoroughly investigated by using computational methods. In particular, in our work we focus on ATP and NADH, namely the metabolites representing the production of energy in the cell. We develop a computational framework to perform an exhaustive investigation at the level of species, reactions, genes and metabolic pathways. The framework integrates several methods implementing the state-of-the-art algorithms for many-objective optimization, sensitivity, and identifiability analysis applied to biological systems. We use this computational framework to analyze three case studies related to the human mitochondria and the algal metabolism of Chlamydomonas reinhardtii, formally described with algebraic differential equations or flux balance analysis. Integrating the results of our framework applied to interacting organelles would provide a general-purpose method for assessing the production of energy in a biological network.
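
    Of the formalisms mentioned, flux balance analysis reduces to a linear program: maximize an objective flux subject to steady-state mass balance S v = 0 and flux bounds. A toy sketch on a hypothetical two-metabolite network (not the paper's mitochondrial or algal models):

      import numpy as np
      from scipy.optimize import linprog

      # Metabolites: A, ATP.  Reactions: v1: -> A;  v2: A -> 2 ATP;  v3: ATP ->
      S = np.array([[ 1, -1,  0],     # A balance
                    [ 0,  2, -1]])    # ATP balance
      bounds = [(0, 10), (0, None), (0, None)]   # uptake v1 capped at 10
      c = np.array([0, 0, -1.0])                  # maximize v3 (linprog minimizes)

      res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
      print(res.x)       # optimal flux vector, here [10, 10, 20]
      print(-res.fun)    # maximal ATP production rate: 20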

  15. Sequential Least-Squares Using Orthogonal Transformations. [spacecraft communication/spacecraft tracking-data smoothing

    NASA Technical Reports Server (NTRS)

    Bierman, G. J.

    1975-01-01

    Square root information estimation, starting from its beginnings in least-squares parameter estimation, is considered. Special attention is devoted to discussions of sensitivity and perturbation matrices, computed solutions and their formal statistics, consider-parameters and consider-covariances, and the effects of a priori statistics. The constant-parameter model is extended to include time-varying parameters and process noise, and the error analysis capabilities are generalized. Efficient and elegant smoothing results are obtained as easy consequences of the filter formulation. The value of the techniques is demonstrated by the navigation results that were obtained for the Mariner Venus-Mercury (Mariner 10) multiple-planetary space probe and for the Viking Mars space mission.
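
    The core of square root information estimation is easy to sketch: the information array [R | z] is updated by an orthogonal (QR) transformation as each measurement batch arrives, and the estimate and its formal covariance follow from the triangular factor. A minimal sketch assuming unit-variance, pre-whitened measurements (all data below are synthetic):

      import numpy as np

      def srif_update(R, z, H, y):
          """Absorb measurements y = H x + noise via one QR factorization."""
          stacked = np.vstack([np.column_stack([R, z]),
                               np.column_stack([H, y])])
          _, r = np.linalg.qr(stacked)
          n = R.shape[0]
          return r[:n, :n], r[:n, n]

      n = 2
      R, z = np.eye(n) * 1e-3, np.zeros(n)     # weak a priori information
      rng = np.random.default_rng(1)
      x_true = np.array([2.0, -1.0])
      for _ in range(500):                      # process measurements sequentially
          H = rng.standard_normal((1, n))
          y = H @ x_true + rng.standard_normal(1)
          R, z = srif_update(R, z, H, y)

      x_hat = np.linalg.solve(R, z)             # estimate
      P = np.linalg.inv(R) @ np.linalg.inv(R).T # formal covariance
      print(x_hat)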

  16. Strain-induced tunable negative differential resistance in triangle graphene spirals

    NASA Astrophysics Data System (ADS)

    Tan, Jie; Zhang, Xiaoming; Liu, Wenguan; He, Xiujie; Zhao, Mingwen

    2018-05-01

    Using non-equilibrium Green’s function formalism combined with density functional theory calculations, we investigate the significant changes in electronic and transport properties of triangle graphene spirals (TGSs) in response to external strain. Tunable negative differential resistance (NDR) behavior is predicted. The NDR bias region, NDR width, and peak-to-valley ratio can be well tuned by external strain. Further analysis shows that these peculiar properties can be attributed to the dispersion widths of the p_z orbitals. Moreover, the conductance of TGSs is very sensitive to the applied stress, which is promising for applications in nanosensor devices. Our findings reveal a novel approach to produce tunable electronic devices based on graphene spirals.
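
    As background on the method named here (not the paper's TGS model): in the non-equilibrium Green's function formalism the transmission is T(E) = Tr[Γ_L G Γ_R G†]. A minimal sketch for a five-site tight-binding chain with wide-band-limit leads, all parameters illustrative:

      import numpy as np

      N, t, gamma = 5, -1.0, 0.5
      H = t * (np.eye(N, k=1) + np.eye(N, k=-1))     # device Hamiltonian

      # Wide-band-limit self-energies on the two end sites
      sigma_L = np.zeros((N, N), complex); sigma_L[0, 0]   = -0.5j * gamma
      sigma_R = np.zeros((N, N), complex); sigma_R[-1, -1] = -0.5j * gamma
      gam_L = 1j * (sigma_L - sigma_L.conj().T)       # broadening matrices
      gam_R = 1j * (sigma_R - sigma_R.conj().T)

      for E in np.linspace(-2.5, 2.5, 6):
          G = np.linalg.inv((E + 1e-12j) * np.eye(N) - H - sigma_L - sigma_R)
          T = np.trace(gam_L @ G @ gam_R @ G.conj().T).real  # T(E)
          print(f"E = {E:+.2f}  T = {T:.3f}")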

  17. Strain-induced tunable negative differential resistance in triangle graphene spirals.

    PubMed

    Tan, Jie; Zhang, Xiaoming; Liu, Wenguan; He, Xiujie; Zhao, Mingwen

    2018-05-18

    Using non-equilibrium Green's function formalism combined with density functional theory calculations, we investigate the significant changes in electronic and transport properties of triangle graphene spirals (TGSs) in response to external strain. Tunable negative differential resistance (NDR) behavior is predicted. The NDR bias region, NDR width, and peak-to-valley ratio can be well tuned by external strain. Further analysis shows that these peculiar properties can be attributed to the dispersion widths of the p_z orbitals. Moreover, the conductance of TGSs is very sensitive to the applied stress, which is promising for applications in nanosensor devices. Our findings reveal a novel approach to produce tunable electronic devices based on graphene spirals.

  18. Learning Needs Analysis of Collaborative E-Classes in Semi-Formal Settings: The REVIT Example

    ERIC Educational Resources Information Center

    Mavroudi, Anna; Hadzilacos, Thanasis

    2013-01-01

    Analysis, the first phase of the typical instructional design process, is often downplayed. This paper focuses on the analysis concerning a series of e-courses for collaborative adult education in semi-formal settings by reporting and generalizing results from the REVIT project. REVIT, an EU-funded research project, offered custom e-courses to…

  19. Pedagogical Basis of DAS Formalism in Engineering Education

    ERIC Educational Resources Information Center

    Hiltunen, J.; Heikkinen, E.-P.; Jaako, J.; Ahola, J.

    2011-01-01

    The paper presents a new approach for a bachelor-level curriculum structure in engineering. The approach is called DAS formalism according to its three phases: description, analysis and synthesis. Although developed specifically for process and environmental engineering, DAS formalism has a generic nature and it could also be used in other…

  20. Developing an approach for teaching and learning about Lewis structures

    NASA Astrophysics Data System (ADS)

    Kaufmann, Ilana; Hamza, Karim M.; Rundgren, Carl-Johan; Eriksson, Lars

    2017-08-01

    This study explores first-year university students' reasoning as they learn to draw Lewis structures. We also present a theoretical account of the formal procedure commonly taught for drawing these structures. Students' discussions during problem-solving activities were video recorded and detailed analyses of the discussions were made through the use of practical epistemology analysis (PEA). Our results show that the formal procedure was central for drawing Lewis structures, but its use varied depending on situational aspects. Commonly, the use of individual steps of the formal procedure was contingent on prior experiences of chemical structures and on other information, such as the characteristics of the given problem. The analysis revealed a number of patterns in how students constructed, checked and modified the structure in relation to the formal procedure and the situational aspects. We suggest that explicitly teaching the formal procedure as a process of constructing, checking and modifying might be helpful for students learning to draw Lewis structures. By doing so, students may learn to check the accuracy of the generated structure in relation not only to the octet rule and formal charge but also to other experiences that are not explicitly included in the formal procedure.

  1. Effects of Incidental Emotions on Moral Dilemma Judgments: An Analysis Using the CNI Model.

    PubMed

    Gawronski, Bertram; Conway, Paul; Armstrong, Joel; Friesdorf, Rebecca; Hütter, Mandy

    2018-02-01

    Effects of incidental emotions on moral dilemma judgments have garnered interest because they demonstrate the context-dependent nature of moral decision-making. Six experiments (N = 727) investigated the effects of incidental happiness, sadness, and anger on responses in moral dilemmas that pit the consequences of a given action for the greater good (i.e., utilitarianism) against the consistency of that action with moral norms (i.e., deontology). Using the CNI model of moral decision-making, we further tested whether the three kinds of emotions shape moral dilemma judgments by influencing (a) sensitivity to consequences, (b) sensitivity to moral norms, or (c) general preference for inaction versus action regardless of consequences and moral norms (or some combination of the three). Incidental happiness reduced sensitivity to moral norms without affecting sensitivity to consequences or general preference for inaction versus action. Incidental sadness and incidental anger did not show any significant effects on moral dilemma judgments. The findings suggest a central role of moral norms in the contribution of emotional responses to moral dilemma judgments, requiring refinements of dominant theoretical accounts and supporting the value of formal modeling approaches in providing more nuanced insights into the determinants of moral dilemma judgments. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  2. Formal verification of a fault tolerant clock synchronization algorithm

    NASA Technical Reports Server (NTRS)

    Rushby, John; Vonhenke, Frieder

    1989-01-01

    A formal specification and mechanically assisted verification of the interactive convergence clock synchronization algorithm of Lamport and Melliar-Smith is described. Several technical flaws in the analysis given by Lamport and Melliar-Smith were discovered, even though their presentation is unusually precise and detailed. It seems that these flaws were not detected by informal peer scrutiny. The flaws are discussed and a revised presentation of the analysis is given that not only corrects the flaws but is also more precise and easier to follow. Some of the corrections to the flaws require slight modifications to the original assumptions underlying the algorithm and to the constraints on its parameters, and thus change the external specifications of the algorithm. The formal analysis of the interactive convergence clock synchronization algorithm was performed using the Enhanced Hierarchical Development Methodology (EHDM) formal specification and verification environment. This application of EHDM provides a demonstration of some of the capabilities of the system.

  3. A cost-effectiveness analysis of two different antimicrobial stewardship programs.

    PubMed

    Okumura, Lucas Miyake; Riveros, Bruno Salgado; Gomes-da-Silva, Monica Maria; Veroneze, Izelandia

    2016-01-01

    There is a lack of formal economic analysis to assess the efficiency of antimicrobial stewardship programs. Herein, we conducted a cost-effectiveness study to assess two different strategies of antimicrobial stewardship programs. A 30-day Markov model was developed to analyze the cost-effectiveness of a Bundled Antimicrobial Stewardship Program implemented in a university hospital in Brazil. Clinical data were derived from a historical cohort that compared two different strategies of antimicrobial stewardship programs and had 30-day mortality as the main outcome. Selected costs included workload, cost of defined daily doses, length of stay, and laboratory and imaging resources used to diagnose infections. Data were analyzed by deterministic and probabilistic sensitivity analysis to assess the model's robustness, a tornado diagram and a cost-effectiveness acceptability curve. The Bundled Strategy was more expensive (cost difference US$ 2119.70) but more efficient (US$ 27,549.15 vs 29,011.46). Deterministic and probabilistic sensitivity analyses suggested that critical variables did not alter the final incremental cost-effectiveness ratio. The Bundled Strategy had a higher probability of being cost-effective, which was endorsed by the cost-effectiveness acceptability curve. As health systems call for efficient technologies, this study concludes that the Bundled Antimicrobial Stewardship Program was more cost-effective, which means that stewardship strategies with such characteristics would be of special interest from a societal and clinical perspective. Copyright © 2016 Elsevier Editora Ltda. All rights reserved.

  4. The (Surplus) Value of Scientific Communication.

    ERIC Educational Resources Information Center

    Frohlich, Gerhard

    1996-01-01

    Discusses research on scientific communication. Topics include theory-less and formal technical/natural scientific models of scientific communication; social-scientific, power-sensitive models; the sociology of scientific communication; sciences as fields of competition; fraud and deception; potential surplus value across subject information…

  5. Unpacking buyer-seller differences in valuation from experience: A cognitive modeling approach.

    PubMed

    Pachur, Thorsten; Scheibehenne, Benjamin

    2017-12-01

    People often indicate a higher price for an object when they own it (i.e., as sellers) than when they do not (i.e., as buyers), a phenomenon known as the endowment effect. We develop a cognitive modeling approach to formalize, disentangle, and compare alternative psychological accounts (e.g., loss aversion, loss attention, strategic misrepresentation) of such buyer-seller differences in pricing decisions for monetary lotteries. To also be able to test possible buyer-seller differences in memory and learning, we study pricing decisions from experience, obtained with the sampling paradigm, where people learn about a lottery's payoff distribution from sequential sampling. We first formalize the different accounts as models within three computational frameworks (reinforcement learning, instance-based learning theory, and cumulative prospect theory), and then fit the models to empirical selling and buying prices. In Study 1 (a reanalysis of published data with hypothetical decisions), models assuming buyer-seller differences in response bias (implementing a strategic-misrepresentation account) performed best; models assuming buyer-seller differences in choice sensitivity or memory (implementing a loss-attention account) generally fared worst. In a new experiment involving incentivized decisions (Study 2), models assuming buyer-seller differences in both outcome sensitivity (as proposed by a loss-aversion account) and response bias performed best. In both Study 1 and Study 2, the models implemented in cumulative prospect theory performed best. Model recovery studies validated our cognitive modeling approach, showing that the models can be distinguished rather well. In summary, our analysis supports a loss-aversion account of the endowment effect, but also reveals a substantial contribution of simple response bias.
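
    To make the competing accounts concrete, here is a minimal cumulative-prospect-theory sketch for a two-outcome gain lottery, with a buyer/seller response bias added to the reported price. All parameter values and the additive bias term are illustrative assumptions, not the paper's fitted models:

      import numpy as np

      def weight(p, gamma=0.7):
          """Inverse-S probability weighting function."""
          return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

      def cpt_value(x_hi, x_lo, p_hi, alpha=0.8):
          """CPT value of a gain lottery (x_hi with prob p_hi, else x_lo)."""
          w_hi = weight(p_hi)                      # rank-dependent weight
          return w_hi * x_hi**alpha + (1 - w_hi) * x_lo**alpha

      def reported_price(x_hi, x_lo, p_hi, role_bias=0.0, alpha=0.8):
          ce = cpt_value(x_hi, x_lo, p_hi, alpha) ** (1 / alpha)  # certainty equiv.
          # role_bias > 0 for sellers, < 0 for buyers: a crude stand-in for
          # the strategic-misrepresentation (response bias) account.
          return ce + role_bias

      lottery = (10.0, 0.0, 0.5)
      print(reported_price(*lottery, role_bias=+1.0))   # seller price
      print(reported_price(*lottery, role_bias=-1.0))   # buyer price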

  6. NEMA, a functional-structural model of nitrogen economy within wheat culms after flowering. II. Evaluation and sensitivity analysis.

    PubMed

    Bertheloot, Jessica; Wu, Qiongli; Cournède, Paul-Henry; Andrieu, Bruno

    2011-10-01

    Simulating nitrogen economy in crop plants requires formalizing the interactions between soil nitrogen availability, root nitrogen acquisition, distribution between vegetative organs and remobilization towards grains. This study evaluates and analyses the functional-structural and mechanistic model of nitrogen economy, NEMA (Nitrogen Economy Model within plant Architecture), developed for winter wheat (Triticum aestivum) after flowering. NEMA was calibrated for field plants under three nitrogen fertilization treatments at flowering. Model behaviour was investigated and sensitivity to parameter values was analysed. Nitrogen content of all photosynthetic organs and in particular nitrogen vertical distribution along the stem and remobilization patterns in response to fertilization were simulated accurately by the model, from Rubisco turnover modulated by light intercepted by the organ and a mobile nitrogen pool. This pool proved to be a reliable indicator of plant nitrogen status, allowing efficient regulation of nitrogen acquisition by roots, remobilization from vegetative organs and accumulation in grains in response to nitrogen treatments. In our simulations, root capacity to import carbon, rather than carbon availability, limited nitrogen acquisition and ultimately nitrogen accumulation in grains, while Rubisco turnover intensity mostly affected dry matter accumulation in grains. NEMA enabled interpretation of several key patterns usually observed in field conditions and the identification of plausible processes limiting for grain yield, protein content and root nitrogen acquisition that could be targets for plant breeding; however, further understanding requires more mechanistic formalization of carbon metabolism. Its strong physiological basis and its realistic behaviour support its use to gain insights into nitrogen economy after flowering.

  7. ON THE USE OF SHOT NOISE FOR PHOTON COUNTING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zmuidzinas, Jonas, E-mail: jonas@caltech.edu

    Lieu et al. have recently claimed that it is possible to substantially improve the sensitivity of radio-astronomical observations. In essence, their proposal is to make use of the intensity of the photon shot noise as a measure of the photon arrival rate. Lieu et al. provide a detailed quantum-mechanical calculation of a proposed measurement scheme that uses two detectors and conclude that this scheme avoids the sensitivity degradation that is associated with photon bunching. If correct, this result could have a profound impact on radio astronomy. Here I present a detailed analysis of the sensitivity attainable using shot-noise measurement schemes that use either one or two detectors, and demonstrate that neither scheme can avoid the photon bunching penalty. I perform both semiclassical and fully quantum calculations of the sensitivity, obtaining consistent results, and provide a formal proof of the equivalence of these two approaches. These direct calculations are furthermore shown to be consistent with an indirect argument based on a correlation method that establishes an independent limit to the sensitivity of shot-noise measurement schemes. Furthermore, these calculations are directly applicable to the regime of interest identified by Lieu et al. Collectively, these results conclusively demonstrate that the photon-bunching sensitivity penalty applies to shot-noise measurement schemes just as it does to ordinary photon counting, in contradiction to the fundamental claim made by Lieu et al. The source of this contradiction is traced to a logical fallacy in their argument.
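
    The bunching penalty at issue can be stated compactly. For chaotic (thermal) light with mean occupation n̄ per mode, the detected photon count is super-Poissonian; this is a standard result quoted as background, not the paper's derivation:

      \sigma_N^2 \;=\; \bar{N}\,\bigl(1 + \bar{n}\bigr),
      \qquad
      \frac{\delta P}{P} \;\simeq\; \sqrt{\frac{1 + \bar{n}}{\bar{N}}}
      \;\longrightarrow\; \frac{1}{\sqrt{N_{\mathrm{modes}}}}
      \quad (\bar{n} \gg 1),

    where N̄ = n̄ N_modes is the total count. In the radio regime n̄ ≫ 1, the fractional sensitivity is thus set by the number of modes rather than the number of photons, reproducing the radiometer behavior (N_modes ≈ Δν τ); the √(1 + n̄) factor is the bunching penalty that the abstract argues cannot be evaded.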

  8. Nurse manager succession planning: A cost-benefit analysis.

    PubMed

    Phillips, Tracy; Evans, Jennifer L; Tooley, Stephanie; Shirey, Maria R

    2018-03-01

    This commentary presents a cost-benefit analysis to advocate for the use of succession planning to mitigate the problems ensuing from nurse manager turnover. An estimated 75% of nurse managers will leave the workforce by 2020. Many benefits are associated with proactively identifying and developing internal candidates. Fewer than 7% of health care organisations have implemented formal leadership succession planning programmes. A cost-benefit analysis of a formal succession-planning programme from one hospital illustrates the benefits of the programme in their organisation and can be replicated easily. Assumptions of nursing manager succession planning cost-benefit analysis are identified and discussed. The succession planning exemplar demonstrates the integration of cost-benefit analysis principles. Comparing the costs of a formal nurse manager succession planning strategy with the status quo results in a positive cost-benefit ratio. The implementation of a formal nurse manager succession planning programme effectively reduces replacement costs and time to transition into the new role. This programme provides an internal pipeline of future leaders who will be more successful than external candidates. Using an actual cost-benefit analysis equips nurse managers with valuable evidence depicting succession planning as a viable business strategy. © 2017 John Wiley & Sons Ltd.

  9. EFL Teachers' Formal Assessment Practices Based on Exam Papers

    ERIC Educational Resources Information Center

    Kiliçkaya, Ferit

    2016-01-01

    This study reports initial findings from a small-scale qualitative study aimed at gaining insights into English language teachers' assessment practices in Turkey by examining the formal exam papers. Based on the technique of content analysis, formal exam papers were analyzed in terms of assessment items, language skills tested as well as the…

  10. Matching biomedical ontologies based on formal concept analysis.

    PubMed

    Zhao, Mengyi; Zhang, Songmao; Li, Weizhuo; Chen, Guowei

    2018-03-19

    The goal of ontology matching is to identify correspondences between entities from different yet overlapping ontologies so as to facilitate semantic integration, reuse and interoperability. As a well developed mathematical model for analyzing individuals and structuring concepts, Formal Concept Analysis (FCA) has been applied to ontology matching (OM) tasks since the beginning of OM research, whereas the ontological knowledge exploited in FCA-based methods is limited. This motivates the study in this paper, i.e., to empower FCA with as much ontological knowledge as possible for identifying mappings across ontologies. We propose a method based on Formal Concept Analysis to identify and validate mappings across ontologies, including one-to-one mappings, complex mappings and correspondences between object properties. Our method, called FCA-Map, incrementally generates a total of five types of formal contexts and extracts mappings from the lattices derived. First, the token-based formal context describes how class names, labels and synonyms share lexical tokens, leading to lexical mappings (anchors) across ontologies. Second, the relation-based formal context describes how classes are in taxonomic, partonomic and disjoint relationships with the anchors, leading to positive and negative structural evidence for validating the lexical matching. Third, the positive relation-based context can be used to discover structural mappings. Afterwards, the property-based formal context describes how object properties are used in axioms to connect anchor classes across ontologies, leading to property mappings. Last, the restriction-based formal context describes co-occurrence of classes across ontologies in anonymous ancestors of anchors, from which extended structural mappings and complex mappings can be identified. Evaluation on the Anatomy, the Large Biomedical Ontologies, and the Disease and Phenotype track of the 2016 Ontology Alignment Evaluation Initiative campaign demonstrates the effectiveness of FCA-Map and its competitiveness with the top-ranked systems. FCA-Map can achieve a better balance between precision and recall for large-scale domain ontologies through constructing multiple FCA structures, whereas it performs unsatisfactorily for smaller-sized ontologies with fewer lexical and semantic expressions. Compared with other FCA-based OM systems, the study in this paper is more comprehensive as an attempt to push the envelope of the Formal Concept Analysis formalism in ontology matching tasks. Five types of formal contexts are constructed incrementally, and their derived concept lattices are used to cluster the commonalities among classes at the lexical and structural levels, respectively. Experiments on large, real-world domain ontologies show promising results and reveal the power of FCA.
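
    The underlying FCA machinery is compact enough to sketch: a formal concept of a binary object-attribute context is a pair (extent, intent) closed under the two derivation operators. A brute-force enumeration on a toy context (labels are invented; FCA-Map's five contexts are far larger):

      from itertools import combinations

      objects = ["cls1", "cls2", "cls3"]
      attributes = ["tokA", "tokB", "tokC"]
      incidence = {("cls1", "tokA"), ("cls1", "tokB"),
                   ("cls2", "tokB"), ("cls2", "tokC"),
                   ("cls3", "tokB")}

      def common_attrs(objs):
          return {a for a in attributes if all((o, a) in incidence for o in objs)}

      def common_objs(attrs):
          return {o for o in objects if all((o, a) in incidence for a in attrs)}

      # Every pair (A'', A') obtained by closing a subset A of objects is a
      # formal concept, and all concepts arise this way.
      concepts = set()
      for r in range(len(objects) + 1):
          for objs in combinations(objects, r):
              intent = frozenset(common_attrs(set(objs)))
              extent = frozenset(common_objs(intent))
              concepts.add((extent, intent))

      for extent, intent in sorted(concepts, key=lambda c: len(c[0])):
          print(sorted(extent), sorted(intent))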

  11. Parameter screening: the use of a dummy parameter to identify non-influential parameters in a global sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Khorashadi Zadeh, Farkhondeh; Nossent, Jiri; van Griensven, Ann; Bauwens, Willy

    2017-04-01

    Parameter estimation is a major concern in hydrological modeling, which may limit the use of complex simulators with a large number of parameters. To support the selection of parameters to include in or exclude from the calibration process, Global Sensitivity Analysis (GSA) is widely applied in modeling practices. Based on the results of GSA, the influential and the non-influential parameters are identified (i.e. parameter screening). Nevertheless, the choice of the screening threshold below which parameters are considered non-influential is a critical issue, which has recently received more attention in the GSA literature. In theory, the sensitivity index of a non-influential parameter has a value of zero. However, since numerical approximations, rather than analytical solutions, are utilized in GSA methods to calculate the sensitivity indices, small but non-zero indices may be obtained for the indices of non-influential parameters. In order to assess the threshold that identifies non-influential parameters in GSA methods, we propose to calculate the sensitivity index of a "dummy parameter". This dummy parameter has no influence on the model output, but will have a non-zero sensitivity index, representing the error due to the numerical approximation. Hence, the parameters whose indices are above the sensitivity index of the dummy parameter can be classified as influential, whereas the parameters whose indices are below this index are within the range of the numerical error and should be considered as non-influential. To demonstrate the effectiveness of the proposed "dummy parameter approach", 26 parameters of a Soil and Water Assessment Tool (SWAT) model are selected to be analyzed and screened, using the variance-based Sobol' and moment-independent PAWN methods. The sensitivity index of the dummy parameter is calculated from sampled data, without changing the model equations. Moreover, the calculation does not even require additional model evaluations for the Sobol' method. A formal statistical test validates these parameter screening results. Based on the dummy parameter screening, 11 model parameters are identified as influential. Therefore, it can be denoted that the "dummy parameter approach" can facilitate the parameter screening process and provide guidance for GSA users to define a screening threshold, with only limited additional resources. Key words: Parameter screening, Global sensitivity analysis, Dummy parameter, Variance-based method, Moment-independent method
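
    A minimal sketch of the dummy-parameter idea with a classic Sobol' pick-freeze estimator of first-order indices (the toy model and sample sizes are illustrative; the study itself used a SWAT model with the Sobol' and PAWN methods). The dummy column is ignored by the model, so its estimated index is pure numerical error, and it serves as the screening threshold:

      import numpy as np

      rng = np.random.default_rng(42)
      d, N = 4, 20000                     # 3 real inputs + 1 dummy column

      def model(x):                        # toy model; column 3 (dummy) unused
          return 2*x[:, 0] + x[:, 1] + 0.1*x[:, 2]**2

      A = rng.uniform(size=(N, d))
      B = rng.uniform(size=(N, d))
      fA, fB = model(A), model(B)
      var_y = np.var(np.concatenate([fA, fB]))
      f0sq = fA.mean() * fB.mean()         # (E[Y])^2 from independent samples

      for i in range(d):
          BAi = B.copy(); BAi[:, i] = A[:, i]        # B with column i from A
          Si = (np.mean(fA * model(BAi)) - f0sq) / var_y   # Sobol' estimator
          name = "dummy" if i == d - 1 else f"x{i+1}"
          print(f"S[{name}] = {Si:+.4f}")
      # Inputs whose index is at or below S[dummy] are screened as non-influential.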

  12. Tip-Enhanced Raman Voltammetry: Coverage Dependence and Quantitative Modeling.

    PubMed

    Mattei, Michael; Kang, Gyeongwon; Goubert, Guillaume; Chulhai, Dhabih V; Schatz, George C; Jensen, Lasse; Van Duyne, Richard P

    2017-01-11

    Electrochemical atomic force microscopy tip-enhanced Raman spectroscopy (EC-AFM-TERS) was employed for the first time to observe nanoscale spatial variations in the formal potential, E0', of a surface-bound redox couple. TERS cyclic voltammograms (TERS CVs) of single Nile Blue (NB) molecules were acquired at different locations spaced 5-10 nm apart on an indium tin oxide (ITO) electrode. Analysis of TERS CVs at different coverages was used to verify the observation of single-molecule electrochemistry. The resulting TERS CVs were fit to the Laviron model for surface-bound electroactive species to quantitatively extract the formal potential E0' at each spatial location. Histograms of single-molecule E0' at each coverage indicate that the electrochemical behavior of the cationic oxidized species is less sensitive to local environment than the neutral reduced species. This information is not accessible using purely electrochemical methods or ensemble spectroelectrochemical measurements. We anticipate that quantitative modeling and measurement of site-specific electrochemistry with EC-AFM-TERS will have a profound impact on our understanding of the role of nanoscale electrode heterogeneity in applications such as electrocatalysis, biological electron transfer, and energy production and storage.
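
    For reference, in its reversible (Nernstian) surface-confined limit the Laviron model invoked here gives the familiar peak-shaped voltammetric wave (standard textbook form, quoted as background rather than from the paper):

      i(E) \;=\; \frac{n^2 F^2}{RT}\,\nu A \Gamma_T\,
                 \frac{\exp\!\bigl[\tfrac{nF}{RT}(E - E^{0'})\bigr]}
                      {\Bigl(1 + \exp\!\bigl[\tfrac{nF}{RT}(E - E^{0'})\bigr]\Bigr)^{2}},

    where ν is the scan rate, A the electrode area and Γ_T the surface coverage; the wave peaks at E = E^{0'}, which is why fitting TERS CVs to this shape localizes the formal potential at each tip position.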

  13. Modified Petri net model sensitivity to workload manipulations

    NASA Technical Reports Server (NTRS)

    White, S. A.; Mackinnon, D. P.; Lyman, J.

    1986-01-01

    Modified Petri Nets (MPNs) are investigated as a workload modeling tool. The results of an exploratory study of the sensitivity of MPNs to workload manipulations in a dual task are described. Petri nets have been used to represent systems with asynchronous, concurrent and parallel activities (Peterson, 1981). These characteristics led some researchers to suggest the use of Petri nets in workload modeling, where concurrent and parallel activities are common. Petri nets are represented by places and transitions. In the workload application, places represent operator activities and transitions represent events. MPNs have been used to formally represent task events and activities of a human operator in a man-machine system. Some descriptive applications demonstrate the usefulness of MPNs in the formal representation of systems. The general hypothesis herein is that, in addition to descriptive applications, MPNs may be useful for workload estimation and prediction. The results of the first of a series of experiments designed to develop and test an MPN system of workload estimation and prediction are reported. This first experiment is a screening test of the MPN model's general sensitivity to changes in workload. Positive results from this experiment will justify the more complicated analyses and techniques necessary for developing a workload prediction system.
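
    The token-firing semantics that make Petri nets attractive for modeling concurrent operator activities fit in a few lines. A minimal sketch with hypothetical places and transitions (not the dual-task MPN of the study):

      # Places hold tokens; a transition fires when every input place is
      # marked, consuming one token per input and producing one per output.
      marking = {"stimulus": 1, "attend": 0, "respond": 0}
      transitions = [
          {"name": "perceive", "inputs": ["stimulus"], "outputs": ["attend"]},
          {"name": "act",      "inputs": ["attend"],   "outputs": ["respond"]},
      ]

      def enabled(t):
          return all(marking[p] > 0 for p in t["inputs"])

      step = 0
      while any(enabled(t) for t in transitions):
          t = next(t for t in transitions if enabled(t))   # fire first enabled
          for p in t["inputs"]:
              marking[p] -= 1
          for p in t["outputs"]:
              marking[p] += 1
          step += 1
          print(f"step {step}: fired {t['name']!r} -> {marking}")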

  14. Multicultural Supervision: What Difference Does Difference Make?

    ERIC Educational Resources Information Center

    Eklund, Katie; Aros-O'Malley, Megan; Murrieta, Imelda

    2014-01-01

    Multicultural sensitivity and competency represent critical components to contemporary practice and supervision in school psychology. Internship and supervision experiences are a capstone experience for many new school psychologists; however, few receive formal training and supervision in multicultural competencies. As an increased number of…

  15. Early assessment of the likely cost-effectiveness of a new technology: A Markov model with probabilistic sensitivity analysis of computer-assisted total knee replacement.

    PubMed

    Dong, Hengjin; Buxton, Martin

    2006-01-01

    The objective of this study is to apply a Markov model to compare the cost-effectiveness of total knee replacement (TKR) using computer-assisted surgery (CAS) with that of TKR using a conventional manual method in the absence of formal clinical trial evidence. A structured search was carried out to identify evidence relating to the clinical outcome, cost, and effectiveness of TKR. Nine Markov states were identified based on the progress of the disease after TKR. Effectiveness was expressed by quality-adjusted life years (QALYs). The simulation was carried out initially for 120 cycles of a month each, starting with 1,000 TKRs. A discount rate of 3.5 percent was used for both cost and effectiveness in the incremental cost-effectiveness analysis. Then, a probabilistic sensitivity analysis was carried out using a Monte Carlo approach with 10,000 iterations. Computer-assisted TKR was a long-term cost-effective technology, but the QALYs gained were small. After the first 2 years, computer-assisted TKR was dominant, being both cheaper and yielding more QALYs. The incremental cost-effectiveness ratio (ICER) was sensitive to the "effect of CAS," to the CAS extra cost, and to the utility of the state "Normal health after primary TKR," but it was not sensitive to the utilities of other Markov states. Both probabilistic and deterministic analyses produced similar cumulative serious or minor complication rates and complex or simple revision rates. They also produced similar ICERs. Compared with conventional TKR, computer-assisted TKR is a cost-saving technology in the long term and may offer small additional QALYs. The "effect of CAS" is to reduce revision rates and complications through more accurate and precise alignment, and although the conclusions from the model, even when allowing for a full probabilistic analysis of uncertainty, are clear, the "effect of CAS" on the rate of revisions awaits long-term clinical evidence.
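
    In the same spirit as the model described (monthly cycles, 3.5 percent annual discounting), here is a minimal three-state Markov cohort sketch with an ICER calculation. All transition probabilities, costs, and utilities below are hypothetical placeholders, not the study's inputs:

      import numpy as np

      states = ["post-TKR", "revision", "dead"]
      P = {  # monthly transition matrices (rows sum to 1), hypothetical
          "conventional": np.array([[0.995, 0.004, 0.001],
                                    [0.900, 0.099, 0.001],
                                    [0.000, 0.000, 1.000]]),
          "CAS":          np.array([[0.997, 0.002, 0.001],
                                    [0.900, 0.099, 0.001],
                                    [0.000, 0.000, 1.000]]),
      }
      cost0 = {"conventional": 8000.0, "CAS": 9500.0}   # up-front surgery cost
      c_state = np.array([30.0, 5000.0, 0.0])           # monthly state costs
      u_state = np.array([0.75, 0.50, 0.0]) / 12        # monthly QALY weights
      disc = (1 + 0.035) ** (-1 / 12)                   # 3.5%/yr, monthly

      def run(arm, cycles=120):
          x = np.array([1.0, 0.0, 0.0])                 # cohort starts post-TKR
          cost, qaly = cost0[arm], 0.0
          for k in range(1, cycles + 1):
              x = x @ P[arm]                            # propagate the cohort
              cost += disc**k * (x @ c_state)
              qaly += disc**k * (x @ u_state)
          return cost, qaly

      (c1, q1), (c2, q2) = run("conventional"), run("CAS")
      print(f"ICER = {(c2 - c1) / (q2 - q1):,.0f} per QALY")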

  16. Formal reasoning about systems biology using theorem proving

    PubMed Central

    Hasan, Osman; Siddique, Umair; Tahar, Sofiène

    2017-01-01

    Systems biology provides the basis to understand the behavioral properties of complex biological organisms at different levels of abstraction. Traditionally, the analysis of systems-biology-based models of various diseases has been carried out by paper-and-pencil proofs and simulations. However, these methods cannot provide an accurate analysis, which is a serious drawback for the safety-critical domain of human medicine. In order to overcome these limitations, we propose a framework to formally analyze biological networks and pathways. In particular, we formalize the notion of reaction kinetics in higher-order logic and formally verify some of the commonly used reaction-based models of biological networks using the HOL Light theorem prover. Furthermore, we have ported our earlier formalization of Zsyntax, i.e., a deductive language for reasoning about biological networks and pathways, from HOL4 to the HOL Light theorem prover to make it compatible with the above-mentioned formalization of reaction kinetics. To illustrate the usefulness of the proposed framework, we present the formal analysis of three case studies, i.e., the pathway leading to TP53 phosphorylation, the pathway leading to the death of cancer stem cells and the tumor growth based on cancer stem cells, which is used for the prognosis and future drug designs to treat cancer patients. PMID:28671950
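
    The reaction kinetics being formalized are, in their familiar textbook form, mass-action rate laws; for an elementary reaction A + B → C with rate constant k (standard form, shown as background rather than the HOL Light encoding):

      \frac{d[\mathrm{C}]}{dt} \;=\; k\,[\mathrm{A}]\,[\mathrm{B}],
      \qquad
      \frac{d[\mathrm{A}]}{dt} \;=\; \frac{d[\mathrm{B}]}{dt}
      \;=\; -\,k\,[\mathrm{A}]\,[\mathrm{B}].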

  17. Optical impedance spectroscopy with single-mode electro-active-integrated optical waveguides.

    PubMed

    Han, Xue; Mendes, Sergio B

    2014-02-04

    An optical impedance spectroscopy (OIS) technique based on a single-mode electro-active-integrated optical waveguide (EA-IOW) was developed to investigate electron-transfer processes of redox adsorbates. A highly sensitive single-mode EA-IOW device was used to optically follow the time-dependent faradaic current originated from a submonolayer of cytochrome c undergoing redox exchanges driven by a harmonic modulation of the electric potential at several dc bias potentials and at several frequencies. To properly retrieve the faradaic current density from the ac-modulated optical signal, we introduce here a mathematical formalism that (i) accounts for intrinsic changes that invariably occur in the optical baseline of the EA-IOW device during potential modulation and (ii) provides accurate results for the electro-chemical parameters. We are able to optically reconstruct the faradaic current density profile against the dc bias potential in the working electrode, identify the formal potential, and determine the energy-width of the electron-transfer process. In addition, by combining the optically reconstructed faradaic signal with simple electrical measurements of impedance across the whole electrochemical cell and the capacitance of the electric double-layer, we are able to determine the time-constant connected to the redox reaction of the adsorbed protein assembly. For cytochrome c directly immobilized onto the indium tin oxide (ITO) surface, we measured a reaction rate constant of 26.5 s^-1. Finally, we calculate the charge-transfer resistance and pseudocapacitance associated with the electron-transfer process and show that the frequency dependence of the redox reaction of the protein submonolayer follows as expected the electrical equivalent of an RC-series admittance diagram. Above all, we show here that OIS with single-mode EA-IOWs provides strong analytical signals that can be readily monitored even for small surface-densities of species involved in the redox process (e.g., fmol/cm^2, 0.1% of a full protein monolayer). This experimental approach, when combined with the analytical formalism described here, brings additional sensitivity, accuracy, and simplicity to electro-chemical analysis and is expected to become a useful tool in investigations of redox processes.

  18. Formal verification of an oral messages algorithm for interactive consistency

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1992-01-01

    The formal specification and verification of an algorithm for Interactive Consistency based on the Oral Messages algorithm for Byzantine Agreement is described. We compare our treatment with that of Bevier and Young, who presented a formal specification and verification for a very similar algorithm. Unlike Bevier and Young, who observed that 'the invariant maintained in the recursive subcases of the algorithm is significantly more complicated than is suggested by the published proof' and who found its formal verification 'a fairly difficult exercise in mechanical theorem proving,' our treatment is very close to the previously published analysis of the algorithm, and our formal specification and verification are straightforward. This example illustrates how delicate choices in the formulation of the problem can have significant impact on the readability of its formal specification and on the tractability of its formal verification.

  19. Optical asymmetric cryptography based on elliptical polarized light linear truncation and a numerical reconstruction technique.

    PubMed

    Lin, Chao; Shen, Xueju; Wang, Zhisong; Zhao, Cheng

    2014-06-20

    We demonstrate a novel optical asymmetric cryptosystem based on the principle of elliptical polarized light linear truncation and a numerical reconstruction technique. The device of an array of linear polarizers is introduced to achieve linear truncation on the spatially resolved elliptical polarization distribution during image encryption. This encoding process can be characterized as confusion-based optical cryptography that involves no Fourier lens and diffusion operation. Based on the Jones matrix formalism, the intensity transmittance for this truncation is deduced to perform elliptical polarized light reconstruction based on two intensity measurements. Use of a quick response code makes the proposed cryptosystem practical, with versatile key sensitivity and fault tolerance. Both simulation and preliminary experimental results that support theoretical analysis are presented. An analysis of the resistance of the proposed method on a known public key attack is also provided.

  20. Shared Governance in the Community College: An Analysis of Formal Authority in Collective Bargaining Agreements

    ERIC Educational Resources Information Center

    McDermott, Linda A.

    2012-01-01

    This qualitative study examines shared governance in Washington State's community and technical colleges and provides an analysis of faculty participation in governance based on formal authority in collective bargaining agreements. Contracts from Washington's thirty community and technical college districts were reviewed in order to identify in…

  1. Formalization and Analysis of Reasoning by Assumption

    ERIC Educational Resources Information Center

    Bosse, Tibor; Jonker, Catholijn M.; Treur, Jan

    2006-01-01

    This article introduces a novel approach for the analysis of the dynamics of reasoning processes and explores its applicability for the reasoning pattern called reasoning by assumption. More specifically, for a case study in the domain of a Master Mind game, it is shown how empirical human reasoning traces can be formalized and automatically…

  2. The Verification-based Analysis of Reliable Multicast Protocol

    NASA Technical Reports Server (NTRS)

    Wu, Yunqing

    1996-01-01

    Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP Multicasting. In this paper, we develop formal models for RMP using existing automatic verification systems, and perform verification-based analysis on the formal RMP specifications. We also use the formal models of the RMP specifications to generate a test suite for conformance testing of the RMP implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress between the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding for the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  3. A Survey of Logic Formalisms to Support Mishap Analysis

    NASA Technical Reports Server (NTRS)

    Johnson, Chris; Holloway, C. M.

    2003-01-01

    Mishap investigations provide important information about adverse events and near miss incidents. They are intended to help avoid any recurrence of previous failures. Over time, they can also yield statistical information about incident frequencies that helps to detect patterns of failure and can validate risk assessments. However, the increasing complexity of many safety critical systems is posing new challenges for mishap analysis. Similarly, the recognition that many failures have complex, systemic causes has helped to widen the scope of many mishap investigations. These two factors have combined to pose new challenges for the analysis of adverse events. A new generation of formal and semi-formal techniques has been proposed to help investigators address these problems. We introduce the term mishap logics to collectively describe these notations that might be applied to support the analysis of mishaps. The proponents of these notations have argued that they can be used to formally prove that certain events created the necessary and sufficient causes for a mishap to occur. These proofs can be used to reduce the bias that is often perceived to affect the interpretation of adverse events. Others have argued that one cannot use logic formalisms to prove causes in the same way that one might prove propositions or theorems. Such mechanisms cannot accurately capture the wealth of inductive, deductive and statistical forms of inference that investigators must use in their analysis of adverse events. This paper provides an overview of these mishap logics. It also identifies several additional classes of logic that might also be used to support mishap analysis.

  4. Method for computationally efficient design of dielectric laser accelerator structures

    DOE PAGES

    Hughes, Tyler; Veronis, Georgios; Wootton, Kent P.; ...

    2017-06-22

    Here, dielectric microstructures have generated much interest in recent years as a means of accelerating charged particles when powered by solid state lasers. The acceleration gradient (or particle energy gain per unit length) is an important figure of merit. To design structures with high acceleration gradients, we explore the adjoint variable method, a highly efficient technique used to compute the sensitivity of an objective with respect to a large number of parameters. With this formalism, the sensitivity of the acceleration gradient of a dielectric structure with respect to its entire spatial permittivity distribution is calculated by the use of only two full-field electromagnetic simulations, the original and 'adjoint'. The adjoint simulation corresponds physically to the reciprocal situation of a point charge moving through the accelerator gap and radiating. Using this formalism, we perform numerical optimizations aimed at maximizing acceleration gradients, which generate fabricable structures of greatly improved performance in comparison to previously examined geometries.
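
    Schematically (up to constant factors and sign conventions, which vary with the formulation; this is background, not the paper's exact expression), the adjoint recipe reads:

      \frac{\delta G}{\delta \epsilon(\mathbf{r})}
      \;\propto\;
      \mathrm{Re}\!\left[\mathbf{E}_{\mathrm{fwd}}(\mathbf{r})
      \cdot \mathbf{E}_{\mathrm{adj}}(\mathbf{r})\right],

    where G is the acceleration gradient, E_fwd is the field of the original simulation, and E_adj solves the same Maxwell problem driven by the reciprocal source, namely the radiating point charge described above. One forward and one adjoint solve thus yield the sensitivity with respect to every permittivity voxel at once, which is what makes gradient-based optimization over the entire design region tractable.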

  5. Interpersonal impact messages associated with different forms of achievement motivation.

    PubMed

    Conroy, David E; Pincus, Aaron L

    2011-08-01

    Two studies evaluated relations between different forms of achievement motivation and transactional interpersonal impact messages during a dyadic puzzle-solving task. In Study 1, 400 college students received no formal competence feedback during the task. In Study 2, competence feedback was manipulated for 600 college students and used to create high-, low-, and mixed-status dyads. Expectancies of success had robust actor and partner effects on submission in both studies. Competence valuation was linked with communal partner effects in Study 1 and a generalized interpersonal sensitivity in Study 2. When competence was ambiguous, approach and avoidance achievement motives exhibited affectively driven actor and partner effects consistent with their roots in pride and shame, respectively; however, when competence was established formally, motives had more cognitively driven effects on person perception and behavior (e.g., rejection sensitivity). Collectively, these findings highlight the importance of the achievement motivation system for organizing interpersonal impact messages during competence pursuits. © 2011 The Authors. Journal Compilation © 2011, Wiley Periodicals, Inc.

  6. Error tolerance analysis of wave diagnostic based on coherent modulation imaging in high power laser system

    NASA Astrophysics Data System (ADS)

    Pan, Xingchen; Liu, Cheng; Zhu, Jianqiang

    2018-02-01

    Coherent modulation imaging, providing fast convergence and high resolution from a single diffraction pattern, is a promising technique to satisfy the urgent demand for on-line multi-parameter diagnostics with a single setup in high power laser facilities (HPLF). However, the influence of noise on the finally calculated parameters of interest has not been investigated yet. Based on a series of simulations with twenty different sampling beams generated from the practical parameters and performance of the HPLF, a quantitative analysis based on statistical results was first carried out considering five different error sources. We found that the detector background noise and high quantization error seriously affect the final accuracy, and that different parameters have different sensitivities to different noise sources. The simulation results and the corresponding analysis indicate potential directions for further improving the final accuracy of parameter diagnostics, which is critically important to formal application in the daily routines of the HPLF.

  7. [The workplace-based learning: a main paradigm of an effective continuing medical education].

    PubMed

    Lelli, Maria Barbara

    2010-01-01

    On the strength of the literature analysis and the Emilia-Romagna Region experience, we suggest a reflection on workplace-based learning that goes beyond the analysis of the effectiveness of specific didactic methodologies and aspects related to Continuing Medical Education. The issue of health education and training is viewed from a wider perspective that integrates the three learning dimensions (formal, non-formal and informal). In such a perspective, workplace-based learning becomes an essential paradigm to reshape the explicit knowledge conveyed in formal contexts and to emphasize informal contexts where innovation is generated.

  8. The Alignment of the Informal and Formal Organizational Supports for Reform: Implications for Improving Teaching in Schools

    ERIC Educational Resources Information Center

    Penuel, William R.; Riel, Margaret; Joshi, Aasha; Pearlman, Leslie; Kim, Chong Min; Frank, Kenneth A.

    2010-01-01

    Previous qualitative studies show that when the formal organization of a school and patterns of informal interaction are aligned, faculty and leaders in a school are better able to coordinate instructional change. This article combines social network analysis with interview data to analyze how well the formal and informal aspects of a school's…

  9. English Language Education in Formal and Cram School Contexts: An Analysis of Listening Strategy and Learning Style

    ERIC Educational Resources Information Center

    Chou, Mu-hsuan

    2017-01-01

    Formal English language education in Taiwan now starts at Year 3 in primary school, with an emphasis on communicative proficiency. In addition to formal education, attending English cram schools after regular school has become a common phenomenon for Taiwanese students. The main purpose of gaining additional reinforcement in English cram schools…

  10. Deterring Future Incidents of Intimate Partner Violence: Does Type of Formal Intervention Matter?

    PubMed

    Broidy, Lisa; Albright, Danielle; Denman, Kristine

    2016-08-01

    Few studies examine the comparative effectiveness of different formal interventions for domestic violence. Using arrest and civil protection order data, we compare three intervention scenarios (arrest, civil protection order, and both). Results suggest that intervention type has no substantive influence on the odds of reoffending. However, subsequent domestic violence is significantly associated with offender age, sex, and prior offense history as well as victim age and sex. We discuss our findings and their policy implications, noting that responding agencies should be sensitive to the characteristics that increase the odds of reoffending among those they come into contact with. © The Author(s) 2015.

  11. Why Engineers Should Consider Formal Methods

    NASA Technical Reports Server (NTRS)

    Holloway, C. Michael

    1997-01-01

    This paper presents a logical analysis of a typical argument favoring the use of formal methods for software development, and suggests an alternative argument that is simpler and stronger than the typical one.

  12. Software Formal Inspections Guidebook

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The Software Formal Inspections Guidebook is designed to support the inspection process of software developed by and for NASA. This document provides information on how to implement a recommended and proven method for conducting formal inspections of NASA software. This Guidebook is a companion document to NASA Standard 2202-93, Software Formal Inspections Standard, approved April 1993, which provides the rules, procedures, and specific requirements for conducting software formal inspections. Application of the Formal Inspections Standard is optional to NASA program or project management. In cases where program or project management decide to use the formal inspections method, this Guidebook provides additional information on how to establish and implement the process. The goal of the formal inspections process as documented in the above-mentioned Standard and this Guidebook is to provide a framework and model for an inspection process that will enable the detection and elimination of defects as early as possible in the software life cycle. An ancillary aspect of the formal inspection process incorporates the collection and analysis of inspection data to effect continual improvement in the inspection process and the quality of the software subjected to the process.

  13. Teaching Fundamental Skills in Microsoft Excel to First-Year Students in Quantitative Analysis

    ERIC Educational Resources Information Center

    Rubin, Samuel J.; Abrams, Binyomin

    2015-01-01

    Despite their technological savvy, most students entering university lack the necessary computer skills to succeed in a quantitative analysis course, in which they are often expected to input, analyze, and plot results of experiments without any previous formal education in Microsoft Excel or similar programs. This lack of formal education results…

  14. Applications of Formal Methods to Specification and Safety of Avionics Software

    NASA Technical Reports Server (NTRS)

    Hoover, D. N.; Guaspari, David; Humenn, Polar

    1996-01-01

    This report treats several topics in applications of formal methods to avionics software development. Most of these topics concern decision tables, an orderly, easy-to-understand format for formally specifying complex choices among alternative courses of action. The topics relating to decision tables include: generalizations of decision tables that are more concise and support the use of decision tables in a refinement-based formal software development process; a formalism for systems of decision tables with behaviors; an exposition of Parnas tables for users of decision tables; and test coverage criteria and decision tables. We outline features of a revised version of ORA's decision table tool, Tablewise, which will support many of the new ideas described in this report. We also survey formal safety analysis of specifications and software.

  15. Approximate Micromechanics Treatise of Composite Impact

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Handler, Louis M.

    2005-01-01

    A formalism is described for the micromechanics of composite impact. The formalism consists of numerous equations which describe all aspects of impact, from impactor and composite conditions to impact contact, damage progression, and penetration or containment. The formalism is based on a through-the-thickness displacement-increment simulation, which makes it convenient to track local damage in terms of microfailure modes and their respective characteristics. A flow chart is provided to cast the formalism (numerous equations) into a computer code for embedment in composite mechanics codes and/or finite element composite structural analysis.

  16. Dynamic Assessment: The Dialectic Integration of Instruction and Assessment

    ERIC Educational Resources Information Center

    Lantolf, James P.

    2009-01-01

    This presentation is situated within the general framework of Vygotsky's educational theory, which argues that development in formal educational activity is a fundamentally different process from development that occurs in the everyday world. A cornerstone of Vygotsky's theory is that to be successful education must be sensitive to learners' zone…

  17. Wikis for Building Content Knowledge in the Foreign Language Classroom

    ERIC Educational Resources Information Center

    Pellet, Stephanie H.

    2012-01-01

    Most pedagogical applications of wikis in foreign language education draw on this collaborative tool to improve (formal) writing skills or to develop target language cultural sensitivity, missing largely on the opportunity to support student-developed L2 content knowledge. Seeking an alternative to traditional teacher-centered approaches, this…

  18. Hospitalizations of Adults with Intellectual Disability in Academic Medical Centers

    ERIC Educational Resources Information Center

    Ailey, Sarah H.; Johnson, Tricia; Fogg, Louis; Friese, Tanya R.

    2014-01-01

    Individuals with intellectual disability (ID) represent a small but important group of hospitalized patients who often have complex health care needs. Individuals with ID experience high rates of hospitalization for ambulatory-sensitive conditions and high rates of hospitalizations in general, even when in formal community care systems; however,…

  19. Educating for Sustainable Development: An Overview of Environmental Education Programmes in the Caribbean.

    ERIC Educational Resources Information Center

    Howell, Calvin A.

    1995-01-01

    Traditional approaches to formal education in the Caribbean have not achieved sufficiently high levels of sensitivity towards the environment. Nontraditional approaches and new strategies for sustainable development are needed. This paper reviews some approaches being undertaken in the English-speaking Caribbean designed to foster positive change…

  20. Learning Competences in Open Mobile Environments: A Comparative Analysis between Formal and Non-Formal Spaces

    ERIC Educational Resources Information Center

    Figaredo, Daniel Domínguez; Miravalles, Paz Trillo

    2014-01-01

    As a result of the increasing use of mobile devices in education, new approaches to define the learning competences in the field of digitally mediated learning have emerged. This paper examines these approaches, using data obtained from empirical research with a group of Spanish university students. The analysis is focused on the experiences of…

  1. State Event Models for the Formal Analysis of Human-Machine Interactions

    NASA Technical Reports Server (NTRS)

    Combefis, Sebastien; Giannakopoulou, Dimitra; Pecheur, Charles

    2014-01-01

    The work described in this paper was motivated by our experience with applying a framework for formal analysis of human-machine interactions (HMI) to a realistic model of an autopilot. The framework is built around a formally defined conformance relation called "full-control" between an actual system and the mental model according to which the system is operated. Systems are well-designed if they can be described by relatively simple, full-control mental models for their human operators. For this reason, our framework supports automated generation of minimal full-control mental models for HMI systems, where both the system and the mental models are described as labelled transition systems (LTS). The autopilot that we analysed has been developed in the NASA Ames HMI prototyping tool ADEPT. In this paper, we describe how we extended the models that our HMI analysis framework handles to allow adequate representation of ADEPT models. We then provide a property-preserving reduction from these extended models to LTSs, to enable application of our LTS-based formal analysis algorithms. Finally, we briefly discuss the analyses we were able to perform on the autopilot model with our extended framework.

  2. Learning in non-formal education: Is it "youthful" for youth in action?

    NASA Astrophysics Data System (ADS)

    Norqvist, Lars; Leffler, Eva

    2017-04-01

    This article offers insights into the practices of a non-formal education programme for youth provided by the European Union (EU). It takes a qualitative approach and is based on a case study of the European Voluntary Service (EVS). Data were collected during individual and focus group interviews with learners (the EVS volunteers), decision takers and trainers, with the aim of deriving an understanding of learning in non-formal education. The research questions concerned learning, the recognition of learning and perspectives of usefulness. The study also examined the Youthpass documentation tool as a key to understanding the recognition of learning and to determine whether the learning was useful for learners (the volunteers). The findings and analysis offer several interpretations of learning, and the recognition of learning, which take place in non-formal education. The findings also revealed that it is complicated to divide learning into formal and non- formal categories; instead, non-formal education is useful for individual learners when both formal and non-formal educational contexts are integrated. As a consequence, the division of formal and non-formal (and possibly even informal) learning creates a gap which works against the development of flexible and interconnected education with ubiquitous learning and mobility within and across formal and non-formal education. This development is not in the best interests of learners, especially when seeking useful learning and education for youth (what the authors term "youthful" for youth in action).

  3. Structuring Formal Control Systems Specifications for Reuse: Surviving Hardware Changes

    NASA Technical Reports Server (NTRS)

    Thompson, Jeffrey M.; Heimdahl, Mats P. E.; Erickson, Debra M.

    2000-01-01

    Formal capture and analysis of the required behavior of control systems have many advantages. For instance, it encourages rigorous requirements analysis, the required behavior is unambiguously defined, and we can assure that various safety properties are satisfied. Formal modeling is, however, a costly and time-consuming process, and if one could reuse the formal models over a family of products, significant cost savings would be realized. In an ongoing project we are investigating how to structure state-based models to achieve a high level of reusability within product families. In this paper we discuss a high-level structure of requirements models that achieves reusability of the desired control behavior across varying hardware platforms in a product family. The structuring approach is demonstrated through a case study in the mobile robotics domain where the desired robot behavior is reused on two diverse platforms: one commercial mobile platform and one built in-house. We use our language RSML-e to capture the control behavior for reuse and our tool NIMBUS to demonstrate how the formal specification can be validated and used as a prototype on the two platforms.

  4. Controlled pattern imputation for sensitivity analysis of longitudinal binary and ordinal outcomes with nonignorable dropout.

    PubMed

    Tang, Yongqiang

    2018-04-30

    The controlled imputation method refers to a class of pattern mixture models that have been commonly used as sensitivity analyses of longitudinal clinical trials with nonignorable dropout in recent years. These pattern mixture models assume that participants in the experimental arm after dropout have similar response profiles to the control participants or have worse outcomes than otherwise similar participants who remain on the experimental treatment. In spite of its popularity, the controlled imputation has not been formally developed for longitudinal binary and ordinal outcomes partially due to the lack of a natural multivariate distribution for such endpoints. In this paper, we propose 2 approaches for implementing the controlled imputation for binary and ordinal data based respectively on the sequential logistic regression and the multivariate probit model. Efficient Markov chain Monte Carlo algorithms are developed for missing data imputation by using the monotone data augmentation technique for the sequential logistic regression and a parameter-expanded monotone data augmentation scheme for the multivariate probit model. We assess the performance of the proposed procedures by simulation and the analysis of a schizophrenia clinical trial and compare them with the fully conditional specification, last observation carried forward, and baseline observation carried forward imputation methods. Copyright © 2018 John Wiley & Sons, Ltd.
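
    Since the mechanics of controlled imputation are easiest to see in code, the sketch below illustrates the control-based ("copy reference") idea behind the sequential logistic regression approach: after dropout, experimental-arm participants are imputed visit by visit from models fitted to the control arm. It is a minimal single-imputation sketch in Python; the simulated data, the model settings, and the shortcut of one stochastic draw (rather than Bayesian multiple imputation via monotone data augmentation, as in the paper) are all illustrative assumptions.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      n, n_visits = 400, 4
      arm = rng.integers(0, 2, n)                    # 0 = control, 1 = experimental
      y = np.full((n, n_visits), np.nan)
      y[:, 0] = rng.binomial(1, 0.3 + 0.1 * arm)     # baseline response, always observed
      for t in range(1, n_visits):
          p = np.clip(0.2 + 0.3 * y[:, t - 1] + 0.15 * arm, 0, 1)
          y[:, t] = rng.binomial(1, p)               # response carries over between visits
      first_missing = rng.integers(1, n_visits + 1, n)
      for i in range(n):                             # impose monotone dropout
          y[i, first_missing[i]:] = np.nan

      # impute each visit from a model fitted to control-arm completers only,
      # so treated dropouts are assumed to behave like controls after dropout
      for t in range(1, n_visits):
          ref = (arm == 0) & ~np.isnan(y[:, t])
          model = LogisticRegression().fit(y[ref, :t], y[ref, t])
          miss = np.isnan(y[:, t])
          p_miss = model.predict_proba(y[miss, :t])[:, 1]
          y[miss, t] = rng.binomial(1, p_miss)       # stochastic draw, not rounding

      print("final-visit response rate, control vs experimental:",
            y[arm == 0, -1].mean().round(3), y[arm == 1, -1].mean().round(3))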

  5. Spatial but not temporal numerosity thresholds correlate with formal math skills in children.

    PubMed

    Anobile, Giovanni; Arrighi, Roberto; Castaldi, Elisa; Grassi, Eleonora; Pedonese, Lara; Moscoso, Paula A M; Burr, David C

    2018-03-01

    Humans and other animals are able to make rough estimations of quantities using what has been termed the approximate number system (ANS). Much evidence suggests that sensitivity to numerosity correlates with symbolic math capacity, leading to the suggestion that the ANS may serve as a start-up tool to develop symbolic math. Many experiments have demonstrated that numerosity perception transcends the sensory modality of stimuli and their presentation format (sequential or simultaneous), but it remains an open question whether the relationship between numerosity and math generalizes over stimulus format and modality. Here we measured precision for estimating the numerosity of clouds of dots and sequences of flashes or clicks, as well as for paired comparisons of the numerosity of clouds of dots. Our results show that in children, formal math abilities correlate positively with sensitivity for estimation and paired-comparisons of the numerosity of visual arrays of dots. However, precision of numerosity estimation for sequences of flashes or sounds did not correlate with math, although sensitivities in all estimations tasks (for sequential or simultaneous stimuli) were strongly correlated with each other. In adults, we found no significant correlations between math scores and sensitivity to any of the psychophysical tasks. Taken together these results support the existence of a generalized number sense, and go on to demonstrate an intrinsic link between mathematics and perception of spatial, but not temporal numerosity. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  6. How do physicians learn to provide palliative care?

    PubMed

    Schulman-Green, Dena

    2003-01-01

    Medical interns, residents, and fellows are heavily involved in caring for dying patients and interacting with their families. Due to a lack of formal medical education in the area, these house staff often have a limited knowledge of palliative care. The purpose of this study was to determine how, given inadequate formal education, house staff learn to provide palliative care. Specifically, this study sought to explore the extent to which physicians learn to provide palliative care through formal medical education, from physicians and other hospital staff, and by on-the-job learning. Twenty physicians were interviewed about their medical education and other learning experiences in palliative care. ATLAS/ti software was used for data coding and analysis. Analysis of transcripts indicated that house staff learn little to nothing through formal education, to varying degrees from attending physicians and hospital staff, and mostly on the job and by making mistakes.

  7. ADGS-2100 Adaptive Display and Guidance System Window Manager Analysis

    NASA Technical Reports Server (NTRS)

    Whalen, Mike W.; Innis, John D.; Miller, Steven P.; Wagner, Lucas G.

    2006-01-01

    Recent advances in modeling languages have made it feasible to formally specify and analyze the behavior of large system components. Synchronous data flow languages, such as Lustre, SCR, and RSML-e are particularly well suited to this task, and commercial versions of these tools such as SCADE and Simulink are growing in popularity among designers of safety critical systems, largely due to their ability to automatically generate code from the models. At the same time, advances in formal analysis tools have made it practical to formally verify important properties of these models to ensure that design defects are identified and corrected early in the lifecycle. This report describes how these tools have been applied to the ADGS-2100 Adaptive Display and Guidance Window Manager being developed by Rockwell Collins Inc. This work demonstrates how formal methods can be easily and cost-efficiently used to remove defects early in the design cycle.

  8. Advancing Clinical Proteomics via Analysis Based on Biological Complexes: A Tale of Five Paradigms.

    PubMed

    Goh, Wilson Wen Bin; Wong, Limsoon

    2016-09-02

    Despite advances in proteomic technologies, idiosyncratic data issues, for example, incomplete coverage and inconsistency, resulting in large data holes, persist. Moreover, because of naïve reliance on statistical testing and its accompanying p values, differential protein signatures identified from such proteomics data have little diagnostic power. Thus, deploying conventional analytics on proteomics data is insufficient for identifying novel drug targets or precise yet sensitive biomarkers. Complex-based analysis is a new analytical approach that has potential to resolve these issues but requires formalization. We categorize complex-based analysis into five method classes or paradigms and propose an even-handed yet comprehensive evaluation rubric based on both simulated and real data. The first four paradigms are well represented in the literature. The fifth and newest paradigm, the network-paired (NP) paradigm, represented by a method called Extremely Small SubNET (ESSNET), dominates in precision-recall and reproducibility, maintains strong performance in small sample sizes, and sensitively detects low-abundance complexes. In contrast, the commonly used over-representation analysis (ORA) and direct-group (DG) test paradigms maintain good overall precision but have severe reproducibility issues. The other two paradigms considered here are the hit-rate and rank-based network analysis paradigms; both of these have good precision-recall and reproducibility, but they do not consider low-abundance complexes. Therefore, given its strong performance, NP/ESSNET may prove to be a useful approach for improving the analytical resolution of proteomics data. Additionally, given its stability, it may also be a powerful new approach toward functional enrichment tests, much like its ORA and DG counterparts.

  9. The Diagnostic Value of Capillary Refill Time for Detecting Serious Illness in Children: A Systematic Review and Meta-Analysis

    PubMed Central

    Fleming, Susannah; Gill, Peter; Jones, Caroline; Taylor, James A.; Van den Bruel, Ann; Heneghan, Carl; Roberts, Nia; Thompson, Matthew

    2015-01-01

    Importance: Capillary refill time (CRT) is widely recommended as part of the routine assessment of unwell children. Objective: To determine the diagnostic value of capillary refill time for a range of serious outcomes in children. Methods: We searched Medline, Embase and CINAHL from inception to June 2014. We included studies that measured both capillary refill time and a relevant clinical outcome such as mortality, dehydration, meningitis, or other serious illnesses in children up to 18 years of age. We screened 1,265 references, of which 24 papers were included in this review. Where sufficient studies were available, we conducted meta-analysis and constructed hierarchical summary ROC curves. Results: Meta-analysis on the relationship between capillary refill time and mortality resulted in sensitivity of 34.6% (95% CI 23.9 to 47.1%), specificity 92.3% (88.6 to 94.8%), positive likelihood ratio 4.49 (3.06 to 6.57), and negative likelihood ratio 0.71 (0.60 to 0.84). Studies of children attending Emergency Departments with vomiting and diarrhea showed that capillary refill time had specificity of 89 to 94% for identifying 5% dehydration, but sensitivity ranged from 0 to 94%. This level of heterogeneity precluded formal meta-analysis of this outcome. Meta-analysis was not possible for other outcomes due to insufficient data, but we found consistently high specificity for a range of outcomes including meningitis, sepsis, admission to hospital, hypoxia, severity of illness and dengue. Conclusions: Our results show that capillary refill time is a specific sign, indicating that it can be used as a “red-flag”: children with prolonged capillary refill time have a four-fold risk of dying compared to children with normal capillary refill time. The low sensitivity means that a normal capillary refill time should not reassure clinicians. PMID:26375953
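
    The reported likelihood ratios follow directly from the pooled sensitivity and specificity, and the "four-fold risk" statement can be checked with Bayes' rule. A short worked example in Python (the 5% pre-test probability is an illustrative assumption, not a figure from the review):

      sens, spec = 0.346, 0.923              # pooled values from the meta-analysis
      lr_pos = sens / (1 - spec)             # about 4.49: prolonged CRT raises the odds
      lr_neg = (1 - sens) / spec             # about 0.71: normal CRT barely lowers them
      pre = 0.05                             # illustrative pre-test probability
      odds = pre / (1 - pre) * lr_pos
      post = odds / (1 + odds)               # about 0.19, roughly a four-fold increase
      print(f"LR+ {lr_pos:.2f}, LR- {lr_neg:.2f}, post-test probability {post:.2f}")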

  10. Surgical management of bilateral vocal fold paralysis: A cost-effectiveness comparison of two treatments.

    PubMed

    Naunheim, Matthew R; Song, Phillip C; Franco, Ramon A; Alkire, Blake C; Shrime, Mark G

    2017-03-01

    Endoscopic management of bilateral vocal fold paralysis (BVFP) includes cordotomy and arytenoidectomy, and has become a well-accepted alternative to tracheostomy. However, the costs and quality-of-life benefits of endoscopic management have not been examined with formal economic analysis. This study undertakes a cost-effectiveness analysis of tracheostomy versus endoscopic management of BVFP. Cost-effectiveness analysis. A literature review identified a range of costs and outcomes associated with surgical options for BVFP. Additional costs were derived from Medicare reimbursement data; all were adjusted to 2014 dollars. Cost-effectiveness analysis evaluated both therapeutic strategies in short-term and long-term scenarios. Probabilistic sensitivity analysis was used to assess confidence levels regarding the economic evaluation. The incremental cost effectiveness ratio for endoscopic management versus tracheostomy is $31,600.06 per quality-adjusted life year (QALY), indicating that endoscopic management is the cost-effective short-term strategy at a willingness-to-pay (WTP) threshold of $50,000/QALY. The probability that endoscopic management is more cost-effective than tracheostomy at this WTP is 65.1%. Threshold analysis demonstrated that the model is sensitive to both utilities and cost in the short-term scenario. When costs of long-term care are included, tracheostomy is dominated by endoscopic management, indicating the cost-effectiveness of endoscopic management at any WTP. Endoscopic management of BVFP appears to be more cost-effective than tracheostomy. Though endoscopic cordotomy and arytenoidectomy require expertise and specialized equipment, this model demonstrates utility gains and long-term cost advantages to an endoscopic strategy. These findings are limited by the relative paucity of robust utility data and emphasize the need for further economic analysis in otolaryngology. Laryngoscope, 127:691-697, 2017. © 2016 The American Laryngological, Rhinological and Otological Society, Inc.
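
    The decision rule in this abstract reduces to comparing an incremental cost-effectiveness ratio (ICER) against the willingness-to-pay threshold. A minimal sketch of that arithmetic follows; the individual cost and QALY inputs are hypothetical placeholders chosen only so that the ratio matches the reported ~$31,600/QALY, not the study's actual model parameters.

      cost_endo, cost_trach = 20_000.0, 12_100.0   # hypothetical costs per strategy
      qaly_endo, qaly_trach = 1.75, 1.50           # hypothetical QALYs per strategy
      icer = (cost_endo - cost_trach) / (qaly_endo - qaly_trach)
      wtp = 50_000.0                               # willingness-to-pay per QALY
      choice = "endoscopic management" if icer < wtp else "tracheostomy"
      print(f"ICER = ${icer:,.0f}/QALY -> prefer {choice} at WTP ${wtp:,.0f}/QALY")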

  11. Formal Methods for Verification and Validation of Partial Specifications: A Case Study

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Callahan, John

    1997-01-01

    This paper describes our work exploring the suitability of formal specification methods for independent verification and validation (IV&V) of software specifications for large, safety critical systems. An IV&V contractor often has to perform rapid analysis on incomplete specifications, with no control over how those specifications are represented. Lightweight formal methods show significant promise in this context, as they offer a way of uncovering major errors, without the burden of full proofs of correctness. We describe a case study of the use of partial formal models for V&V of the requirements for Fault Detection Isolation and Recovery on the space station. We conclude that the insights gained from formalizing a specification are valuable, and it is the process of formalization, rather than the end product that is important. It was only necessary to build enough of the formal model to test the properties in which we were interested. Maintenance of fidelity between multiple representations of the same requirements (as they evolve) is still a problem, and deserves further study.

  12. The inner formal structure of the H-T-P drawings: an exploratory study.

    PubMed

    Vass, Z

    1998-08-01

    The study describes some interrelated patterns of traits of the House-Tree-Person (H-T-P) drawings with the instruments of hierarchical cluster analysis. First, according to the literature, 17 formal or structural aspects of the projective drawings were collected, after which a detailed manual for coding was compiled. Second, the interrater reliability and the consistency of this manual were tested. Third, the hierarchical cluster structure of the reliable and consistent formal aspects was analysed. Results are: (a) a psychometrically tested coding manual of the investigated formal-structural aspects, each of them illustrated with drawings that showed the highest interrater agreement; and (b) the hierarchical cluster structure of the formal aspects of the H-T-P drawings of "normal" adults.

  13. Illicit and pharmaceutical drug consumption estimated via wastewater analysis. Part B: placing back-calculations in a formal statistical framework.

    PubMed

    Jones, Hayley E; Hickman, Matthew; Kasprzyk-Hordern, Barbara; Welton, Nicky J; Baker, David R; Ades, A E

    2014-07-15

    Concentrations of metabolites of illicit drugs in sewage water can be measured with great accuracy and precision, thanks to the development of sensitive and robust analytical methods. Based on assumptions about factors including the excretion profile of the parent drug, routes of administration and the number of individuals using the wastewater system, the level of consumption of a drug can be estimated from such measured concentrations. When presenting results from these 'back-calculations', the multiple sources of uncertainty are often discussed, but are not usually explicitly taken into account in the estimation process. In this paper we demonstrate how these calculations can be placed in a more formal statistical framework by assuming a distribution for each parameter involved, based on a review of the evidence underpinning it. Using a Monte Carlo simulations approach, it is then straightforward to propagate uncertainty in each parameter through the back-calculations, producing a distribution, instead of a single estimate, of daily or average consumption. This can be summarised for example by a median and credible interval. To demonstrate this approach, we estimate cocaine consumption in a large urban UK population, using measured concentrations of two of its metabolites, benzoylecgonine and norbenzoylecgonine. We also demonstrate a more sophisticated analysis, implemented within a Bayesian statistical framework using Markov chain Monte Carlo simulation. Our model allows the two metabolites to simultaneously inform estimates of daily cocaine consumption and explicitly allows for variability between days. After accounting for this variability, the resulting credible interval for average daily consumption is appropriately wider, representing additional uncertainty. We discuss possibilities for extensions to the model, and whether analysis of wastewater samples has potential to contribute to a prevalence model for illicit drug use. Copyright © 2014. Published by Elsevier B.V.
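
    The Monte Carlo propagation described here is simple to sketch: draw every input of the standard back-calculation from a distribution and read off a median and credible interval from the resulting consumption distribution. In the Python sketch below, the distributions, plant parameters, and excretion fraction are illustrative stand-ins, not the evidence-based choices reviewed in the paper.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 100_000
      conc_ngL = rng.normal(800, 40, n)         # benzoylecgonine in influent, ng/L
      flow_Lday = rng.normal(2.0e8, 1.5e7, n)   # daily wastewater flow, L/day
      excreted = rng.uniform(0.30, 0.40, n)     # fraction of dose excreted as BE
      mw_ratio = 303.35 / 289.33                # approx. cocaine/BE molar mass ratio
      population = rng.normal(9.0e5, 5.0e4, n)  # people served by the plant

      load_g = conc_ngL * flow_Lday * 1e-9      # metabolite load, g/day
      # grams of cocaine consumed per day per 1000 inhabitants
      consumption = load_g * mw_ratio / excreted / population * 1000

      lo, med, hi = np.percentile(consumption, [2.5, 50, 97.5])
      print(f"median {med:.2f} g/day/1000 inh., 95% interval ({lo:.2f}, {hi:.2f})")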

  14. Modeling and stochastic analysis of dynamic mechanisms of the perception

    NASA Astrophysics Data System (ADS)

    Pisarchik, A.; Bashkirtseva, I.; Ryashko, L.

    2017-10-01

    Modern studies in physiology and cognitive neuroscience consider noise an important constructive factor of brain functionality. Under adequate noise, the brain can rapidly access different ordered states and provide decision-making by preventing deadlocks. Bistable dynamic models are often used for the study of the underlying mechanisms of visual perception. In the present paper, we consider a bistable energy model subject to both additive and parametric noise. Using the catastrophe theory formalism and the stochastic sensitivity function technique, we analyze the response of the equilibria to noise and study noise-induced transitions between equilibria. We demonstrate and analyze the effect of hysteresis squeezing as the intensity of noise is increased. Stochastic bifurcations connected with the suppression of oscillations by parametric noise are discussed.
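
    A minimal sketch of such a bistable model, assuming a generic double-well potential V(x) = x^4/4 - x^2/2 and Euler-Maruyama integration (the potential and all noise intensities are illustrative choices, not the authors' energy model), shows the noise-induced transitions between the two stable equilibria:

      import numpy as np

      rng = np.random.default_rng(2)
      dt, steps = 1e-3, 200_000                 # 200 time units in total
      sigma_add, sigma_par = 0.5, 0.1           # additive / parametric noise levels
      x = np.empty(steps); x[0] = -1.0          # start in the left well (percept A)

      for k in range(steps - 1):
          drift = x[k] - x[k]**3                # -V'(x) for V(x) = x**4/4 - x**2/2
          dW1, dW2 = rng.normal(0.0, np.sqrt(dt), 2)
          # parametric noise perturbs the linear (well-depth) term
          x[k + 1] = x[k] + drift * dt + sigma_par * x[k] * dW1 + sigma_add * dW2

      # count well-to-well transitions with a hysteresis band to ignore jitter
      well, hops = -1, 0
      for xv in x:
          if well == -1 and xv > 0.8:
              well, hops = 1, hops + 1
          elif well == 1 and xv < -0.8:
              well, hops = -1, hops + 1
      print(f"noise-induced transitions between percepts: {hops}")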

  15. Simulated Data for High Temperature Composite Design

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Abumeri, Galib H.

    2006-01-01

    The paper describes an effective formal method for simulating design properties of composites that is inclusive of all the effects that influence those properties. The simulation method integrates computer codes for composite micromechanics, composite macromechanics, laminate theory, structural analysis, and a multi-factor interaction model. Demonstration of the method includes sample examples of static, thermal, and fracture reliability for a unidirectional metal matrix composite, as well as rupture strength and fatigue strength for a high-temperature superalloy. Typical results obtained for a unidirectional composite show that the thermal properties are more sensitive to internal local damage, that the longitudinal properties degrade slowly with temperature, and that the transverse and shear properties degrade rapidly with temperature, as do the rupture and fatigue strengths of superalloys.
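
    Multi-factor interaction models in this line of work are typically products of power-law terms that drive a reference property toward zero at a final state. Assuming the single-factor temperature form P/P0 = ((T_F - T)/(T_F - T0))^n (an assumed form; the exponents and temperatures below are illustrative, not the paper's calibrated values), the differing degradation rates noted in the abstract fall out of the exponent:

      def mfim_property(P0, T, T0=21.0, TF=1200.0, n=0.5):
          """Degraded property P at temperature T (deg C) from reference P0."""
          return P0 * ((TF - T) / (TF - T0)) ** n

      # a small exponent degrades slowly (longitudinal-like behavior),
      # a large exponent degrades rapidly (transverse/shear-like behavior)
      for T in (21, 400, 800, 1100):
          print(T, round(mfim_property(1.0, T, n=0.25), 3),
                   round(mfim_property(1.0, T, n=1.0), 3))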

  16. Developing a Non-Formal Education and Literacy Database in the Asia-Pacific Region. Final Report of the Expert Group Consultation Meeting (Dhaka, Bangladesh, December 15-18, 1997).

    ERIC Educational Resources Information Center

    United Nations Educational, Scientific, and Cultural Organization, Bangkok (Thailand). Regional Office for Education in Asia and the Pacific.

    The objectives of the Expert Group Consultation Meeting for Developing a Non-Formal Education and Literacy Database in the Asia-Pacific Region were: to exchange information and review the state-of-the-art in the field of data collection, analysis and indicators of non-formal education and literacy programs; to examine and review the set of…

  17. What Can History Teach Us? A Comparative Historical Analysis on the Reserve Officer Training Corps and the Department of Homeland Security

    DTIC Science & Technology

    2015-12-01

    professional development aspirations. An organization that realized a very similar narrative to the DHS is the Department of Defense (DOD), more...is one that finds itself embedded in several debates surrounding the development of formalized education/preparatory efforts for its core civilian... development of formalized education efforts for its workforce. There is formalized preparatory training for several different kinds of homeland security

  18. Influence Strategy: Principles and Levels of Analysis

    DTIC Science & Technology

    2011-12-01

    expended its own. The United States formally entered the war in December 1941 following the Japanese surprise attack at Pearl Harbor. Less formally...placed in key positions and the Reich Cinema Law (RLG) introduced as a means to exercise further control. For instance, the RLG required all film...Western Europe by Germany. However, for this purpose it will not be counted until the formal declaration of war in 1941. Following the Japanese

  19. Contributions of the Computer-Administered Neuropsychological Screen for Mild Cognitive Impairment (CANS-MCI) for the diagnosis of MCI in Brazil.

    PubMed

    Memória, Cláudia M; Yassuda, Mônica S; Nakano, Eduardo Y; Forlenza, Orestes V

    2014-05-07

    Background: The Computer-Administered Neuropsychological Screen for Mild Cognitive Impairment (CANS-MCI) is a computer-based cognitive screening instrument that involves automated administration and scoring and immediate analyses of test sessions. The objective of this study was to translate and culturally adapt the Brazilian Portuguese version of the CANS-MCI (CANS-MCI-BR) and to evaluate its reliability and validity for the diagnostic screening of MCI and dementia due to Alzheimer's disease. Methods: The test was administered to 97 older adults (mean age 73.41 ± 5.27 years) with at least four years of formal education (mean education 12.23 ± 4.48 years). Participants were classified into three diagnostic groups according to global cognitive status (normal controls, n = 41; MCI, n = 35; AD, n = 21) based on clinical data and formal neuropsychological assessments. Results: The results indicated high internal consistency (Cronbach's α = 0.77) in the total sample. Three-month test-retest reliability correlations were significant and robust (0.875; p < 0.001). A moderate level of concurrent validity was attained relative to the screening test for MCI (MoCA test, r = 0.76, p < 0.001). Confirmatory factor analysis supported the three-factor model of the original test, i.e., memory, language/spatial fluency, and executive function/mental control. Goodness of fit indicators were strong (Bentler Comparative Fit Index = 0.96, Root Mean Square Error of Approximation = 0.09). Receiver operating characteristic curve analyses suggested high sensitivity and specificity (81% and 73% respectively) to screen for possible MCI cases. Conclusions: The CANS-MCI-BR maintains adequate psychometric characteristics that render it suitable to identify elderly adults with probable cognitive impairment to whom a more extensive evaluation by formal neuropsychological tests may be required.

  1. Formal Analysis of BPMN Models Using Event-B

    NASA Astrophysics Data System (ADS)

    Bryans, Jeremy W.; Wei, Wei

    The use of business process models has gone far beyond documentation purposes. In the development of business applications, they can play the role of an artifact on which high-level properties can be verified and design errors can be revealed in an effort to reduce overhead at later software development and diagnosis stages. This paper demonstrates how formal verification may add value to the specification, design and development of business process models in an industrial setting. The analysis of these models is achieved via an algorithmic translation from the de facto standard business process modeling language BPMN to Event-B, a widely used formal language supported by the Rodin platform, which offers a range of simulation and verification technologies.

  2. Defining Features of Moral Sensitivity and Moral Motivation: Pathways to Moral Reasoning in Medical Students

    ERIC Educational Resources Information Center

    Morton, Kelly R.; Worthley, Joanna S.; Testerman, John K.; Mahoney, Marita L.

    2006-01-01

    Kohlberg's theory of moral development explores the roles of cognition and emotion but focuses primarily on cognition. Contemporary post-formal theories lead to the conclusion that skills resulting from cognitive-affective integration facilitate consistency between moral judgement and moral behaviour. Rest's four-component model of moral…

  3. NASA software specification and evaluation system design, part 1

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The research to develop methods for reducing the effort expended in software development and verification is reported. The development of a formal software requirements methodology, a formal specifications language, a programming language, a language preprocessor, and code analysis tools is discussed.

  4. Inductive reasoning about causally transmitted properties.

    PubMed

    Shafto, Patrick; Kemp, Charles; Bonawitz, Elizabeth Baraff; Coley, John D; Tenenbaum, Joshua B

    2008-11-01

    Different intuitive theories constrain and guide inferences in different contexts. Formalizing simple intuitive theories as probabilistic processes operating over structured representations, we present a new computational model of category-based induction about causally transmitted properties. A first experiment demonstrates undergraduates' context-sensitive use of taxonomic and food web knowledge to guide reasoning about causal transmission and shows good qualitative agreement between model predictions and human inferences. A second experiment demonstrates strong quantitative and qualitative fits to inferences about a more complex artificial food web. A third experiment investigates human reasoning about complex novel food webs where species have known taxonomic relations. Results demonstrate a double-dissociation between the predictions of our causal model and a related taxonomic model [Kemp, C., & Tenenbaum, J. B. (2003). Learning domain structures. In Proceedings of the 25th annual conference of the cognitive science society]: the causal model predicts human inferences about diseases but not genes, while the taxonomic model predicts human inferences about genes but not diseases. We contrast our framework with previous models of category-based induction and previous formal instantiations of intuitive theories, and outline challenges in developing a complete model of context-sensitive reasoning.

  5. NASA Langley's Formal Methods Research in Support of the Next Generation Air Transportation System

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Munoz, Cesar A.

    2008-01-01

    This talk will provide a brief introduction to the formal methods developed at NASA Langley and the National Institute of Aerospace (NIA) for air traffic management applications. NASA Langley's formal methods research supports the Interagency Joint Planning and Development Office (JPDO) effort to define and develop the 2025 Next Generation Air Transportation System (NGATS). The JPDO was created by the passage of the Vision 100 Century of Aviation Reauthorization Act in December 2003. The NGATS vision calls for a major transformation of the nation's air transportation system that will enable growth to 3 times the traffic of the current system. The transformation will require an unprecedented level of safety-critical automation used in complex procedural operations based on 4-dimensional (4D) trajectories that enable dynamic reconfiguration of airspace scalable to geographic and temporal demand. The goal of our formal methods research is to provide verification methods that can be used to ensure the safety of the NGATS system. Our work has focused on the safety assessment of concepts of operation and fundamental algorithms for conflict detection and resolution (CD&R) and self-spacing in the terminal area. Formal analysis of a concept of operations is a novel area of application of formal methods. Here one must establish that a system concept involving aircraft, pilots, and ground resources is safe. The formal analysis of algorithms is a more traditional endeavor. However, the formal analysis of ATM algorithms involves reasoning about the interaction of algorithmic logic and aircraft trajectories defined over an airspace. These trajectories are described using 2D and 3D vectors and are often constrained by trigonometric relations. Thus, in many cases it has been necessary to employ the full power of an advanced theorem prover. The verification challenge is to establish that the safety-critical algorithms produce valid solutions that are guaranteed to maintain separation under all possible scenarios. Current research has assumed perfect knowledge of the location of other aircraft in the vicinity so absolute guarantees are possible, but increasingly we are relaxing the assumptions to allow incomplete, inaccurate, and/or faulty information from communication sources.

  6. "Do You Want Me to Translate This in English or in a Better Mandinka Language?": Unequal Literacy Regimes and Grassroots Spelling Practices in Peri-Urban Gambia

    ERIC Educational Resources Information Center

    Juffermans, Kasper

    2011-01-01

    This paper presents a comparative ethnographic analysis of two versions of a grassroots text in Mandinka language, one written by a non-formally educated man, the other a respelling by a formally educated urbanite. The analysis points at a crucial difference in spelling practices and inequality in literacy regimes, i.e., between established…

  7. Participation in Non-Formal Learning in EU-15 and EU-8 Countries: Demand and Supply Side Factors

    ERIC Educational Resources Information Center

    Roosmaa, Eve-Liis; Saar, Ellu

    2012-01-01

    The main purpose of this paper is to provide an in-depth analysis of participation in non-formal learning in different European Union member states. The paper also seeks to extend analysis of the training gap by pursuing the distinction between the supply and the demand for skills. We use aggregate data from the Adult Education Survey (Eurostat)…

  8. "An integrative formal model of motivation and decision making: The MGPM*": Correction to Ballard et al. (2016).

    PubMed

    2017-02-01

    Reports an error in "An integrative formal model of motivation and decision making: The MGPM*" by Timothy Ballard, Gillian Yeo, Shayne Loft, Jeffrey B. Vancouver and Andrew Neal (Journal of Applied Psychology, 2016[Sep], Vol 101[9], 1240-1265). Equation A3 contained an error. The correct equation is provided in the erratum. (The following abstract of the original article appeared in record 2016-28692-001.) We develop and test an integrative formal model of motivation and decision making. The model, referred to as the extended multiple-goal pursuit model (MGPM*), is an integration of the multiple-goal pursuit model (Vancouver, Weinhardt, & Schmidt, 2010) and decision field theory (Busemeyer & Townsend, 1993). Simulations of the model generated predictions regarding the effects of goal type (approach vs. avoidance), risk, and time sensitivity on prioritization. We tested these predictions in an experiment in which participants pursued different combinations of approach and avoidance goals under different levels of risk. The empirical results were consistent with the predictions of the MGPM*. Specifically, participants pursuing 1 approach and 1 avoidance goal shifted priority from the approach to the avoidance goal over time. Among participants pursuing 2 approach goals, those with low time sensitivity prioritized the goal with the larger discrepancy, whereas those with high time sensitivity prioritized the goal with the smaller discrepancy. Participants pursuing 2 avoidance goals generally prioritized the goal with the smaller discrepancy. Finally, all of these effects became weaker as the level of risk increased. We used quantitative model comparison to show that the MGPM* explained the data better than the original multiple-goal pursuit model, and that the major extensions from the original model were justified. The MGPM* represents a step forward in the development of a general theory of decision making during multiple-goal pursuit. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  9. 41 CFR 109-1.5204 - Review and approval of a designated contractor's personal property management system.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... overhaul; and (2) An analysis of the cost to implement the overhaul within a year versus a proposed... be based on a formal comprehensive appraisal or a series of formal appraisals of the functional...

  10. International Workshop on Principles of Program Analysis

    DTIC Science & Technology

    1999-01-01

    with respect to a semantics of the programming language. It is a sad fact that new program analyses often contain subtle bugs, and a formal ... It defines a higher-order function f with formal parameter x and body x 1; then it defines two functions g and h that are given as actual parameters...begin by presenting a formal semantics for WHILE. The material of this section may be skimmed through on a first reading; however, it is frequently

  11. Formal Foundations for Hierarchical Safety Cases

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Pai, Ganesh; Whiteside, Iain

    2015-01-01

    Safety cases are increasingly being required in many safety-critical domains to assure, using structured argumentation and evidence, that a system is acceptably safe. However, comprehensive system-wide safety arguments present appreciable challenges to develop, understand, evaluate, and manage, partly due to the volume of information that they aggregate, such as the results of hazard analysis, requirements analysis, testing, formal verification, and other engineering activities. Previously, we have proposed hierarchical safety cases, hicases, to aid the comprehension of safety case argument structures. In this paper, we build on a formal notion of safety case to formalise the use of hierarchy as a structuring technique, and show that hicases satisfy several desirable properties. Our aim is to provide a formal, theoretical foundation for safety cases. In particular, we believe that tools for high assurance systems should be granted similar assurance to the systems to which they are applied. To this end, we formally specify and prove the correctness of key operations for constructing and managing hicases, which gives the specification for implementing hicases in AdvoCATE, our toolset for safety case automation. We motivate and explain the theory with the help of a simple running example, extracted from a real safety case and developed using AdvoCATE.

  12. [Discussion between informal and formal caregivers of community-dwelling older adults].

    PubMed

    Jacobs, M T; Broese van Groenou, M I; Deeg, D J H

    2014-04-01

    Current Dutch policy on long-term care is aimed at a stronger connection between formal home care and informal care. We examined if formal and informal caregivers of community-dwelling older adults discuss the care and whether this is related to characteristics of the older adult, the care network and the individual caregivers. Data are derived from 63 community-dwelling older adults, including their health, their perceived control of the care and their care network. In addition, 79 informal and 90 formal caregivers are interviewed on their motives and vision on caregiving. The 112 dyads between those formal and informal caregivers are the units of analysis in the current study. Bivariate analyses reveal that informal caregivers are more likely to discuss the care with formal caregivers when they are residing with the older adult, when they provide a lot of care and/or when they are strongly motivated to keep the older adult at home. This is particularly the case when the care demands are high. Characteristics of the formal caregivers were not important. In conclusion, discussion of care between non-resident informal caregivers and formal caregivers is not self-evident and requires more effort to be established.

  13. Advanced polarization sensitive analysis in optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Wieloszyńska, Aleksandra; Strąkowski, Marcin R.

    2017-08-01

    Optical coherence tomography (OCT) is an optical imaging method that is widely applied in a variety of fields. The technology provides high-resolution cross-sectional or surface imaging in a non-contact, non-destructive way. OCT is very useful in medical applications like ophthalmology, dermatology or dentistry, as well as in non-biomedical fields such as stress mapping in polymers or defect detection in protective coatings. Standard OCT imaging is based on intensity images, which can visualize the inner structure of scattering samples. However, a number of extensions improve the measurement abilities of OCT, chiefly polarization-sensitive OCT (PS-OCT), Doppler OCT (D-OCT), and spectroscopic OCT (S-OCT). Our research activities have focused on PS-OCT systems. Polarization-sensitive analysis delivers useful information about the optical anisotropy of the evaluated sample. This kind of measurement is very important for internal stress monitoring and, for example, tissue recognition. Based on our research results and knowledge, standard PS-OCT provides data only on the birefringence of the measured sample; however, more information, including depolarization and diattenuation, can be obtained from the OCT measurements. In this work, a method based on the Jones formalism is presented and used to determine the birefringence, dichroism, and optic-axis orientation of the tested sample. The setup of the optical system, as well as test results verifying the measurement abilities of the system, are presented, along with a brief discussion of the effectiveness and usefulness of this approach.

  14. Blunted Ambiguity Aversion During Cost-Benefit Decisions in Antisocial Individuals.

    PubMed

    Buckholtz, Joshua W; Karmarkar, Uma; Ye, Shengxuan; Brennan, Grace M; Baskin-Sommers, Arielle

    2017-05-17

    Antisocial behavior is often assumed to reflect aberrant risk processing. However, many of the most significant forms of antisocial behavior, including crime, reflect the outcomes of decisions made under conditions of ambiguity rather than risk. While risk and ambiguity are formally distinct and experimentally dissociable, little is known about ambiguity sensitivity in individuals who engage in chronic antisocial behavior. We used a financial decision-making task in a high-risk community-based sample to test for associations between sensitivity to ambiguity, antisocial behavior, and arrest history. Sensitivity to ambiguity was lower in individuals who met diagnostic criteria for Antisocial Personality Disorder. Lower ambiguity sensitivity was also associated with higher externalizing (but not psychopathy) scores, and with higher levels of aggression (but not rule-breaking). Finally, blunted sensitivity to ambiguity also predicted a greater frequency of arrests. Together, these data suggest that alterations in cost-benefit decision-making under conditions of ambiguity may promote antisocial behavior.

  15. Aerospace engineering design by systematic decomposition and multilevel optimization

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.; Giles, G. L.; Barthelemy, J.-F. M.

    1984-01-01

    This paper describes a method for systematic analysis and optimization of large engineering systems, e.g., aircraft, by decomposition of a large task into a set of smaller, self-contained subtasks that can be solved concurrently. The subtasks may be arranged in many hierarchical levels with the assembled system at the top level. Analyses are carried out in each subtask using inputs received from other subtasks, and are followed by optimizations carried out from the bottom up. Each optimization at the lower levels is augmented by analysis of its sensitivity to the inputs received from other subtasks to account for the couplings among the subtasks in a formal manner. The analysis and optimization operations alternate iteratively until they converge to a system design whose performance is maximized with all constraints satisfied. The method, which is still under development, is tentatively validated by test cases in structural applications and an aircraft configuration optimization. It is pointed out that the method is intended to be compatible with the typical engineering organization and the modern technology of distributed computing.
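
    A toy two-level version of this scheme, assuming invented quadratic objectives, shows the key mechanics in Python: the subproblem returns its optimum and the sensitivity of that optimum to the coupling input it receives, and the system level uses that sensitivity to update the coupling. This is a sketch of the coordination idea only, not the paper's multilevel algorithm.

      import numpy as np
      from scipy.optimize import minimize_scalar

      def subproblem(z):
          """Optimal local design y*(z) and optimum value f*(z) for coupling z."""
          res = minimize_scalar(lambda y: (y - z)**2 + 0.1 * y**2)
          return res.x, res.fun

      def subproblem_sensitivity(z, h=1e-6):
          """Finite-difference sensitivity of the subproblem optimum to its input."""
          return (subproblem(z + h)[1] - subproblem(z - h)[1]) / (2 * h)

      # system level: minimize system cost (z - 3)**2 plus the subsystem optimum,
      # taking simple gradient steps that use the reported sensitivity
      z = 0.0
      for _ in range(200):
          grad = 2 * (z - 3) + subproblem_sensitivity(z)
          z -= 0.1 * grad
      print(f"converged coupling z = {z:.4f}, subsystem y* = {subproblem(z)[0]:.4f}")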

  16. Global Sensitivity Analysis of OnGuard Models Identifies Key Hubs for Transport Interaction in Stomatal Dynamics

    PubMed Central

    Vialet-Chabrand, Silvere; Griffiths, Howard

    2017-01-01

    The physical requirement for charge to balance across biological membranes means that the transmembrane transport of each ionic species is interrelated, and manipulating solute flux through any one transporter will affect other transporters at the same membrane, often with unforeseen consequences. The OnGuard systems modeling platform has helped to resolve the mechanics of stomatal movements, uncovering previously unexpected behaviors of stomata. To date, however, the manual approach to exploring model parameter space has captured little formal information about the emergent connections between parameters that define the most interesting properties of the system as a whole. Here, we introduce global sensitivity analysis to identify interacting parameters affecting a number of outputs commonly accessed in experiments in Arabidopsis (Arabidopsis thaliana). The analysis highlights synergies between transporters affecting the balance between Ca2+ sequestration and Ca2+ release pathways, notably those associated with internal Ca2+ stores and their turnover. Other, unexpected synergies appear, including with the plasma membrane anion channels and H+-ATPase and with the tonoplast TPK K+ channel. These emergent synergies, and the core hubs of interaction that they define, identify subsets of transporters associated with free cytosolic Ca2+ concentration that represent key targets to enhance plant performance in the future. They also highlight the importance of interactions between the voltage regulation of the plasma membrane and tonoplast in coordinating transport between the different cellular compartments. PMID:28432256
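
    Global sensitivity analysis of this kind can be sketched with a hand-rolled elementary-effects (Morris-style) screen. In the Python sketch below, the toy function standing in for an OnGuard-scale transport model is an invented placeholder; the point is only the mechanics of ranking inputs by mean absolute effect and spread.

      import numpy as np

      rng = np.random.default_rng(3)

      def model(p):
          # invented surrogate: strong dependence on p[0] and p[2] plus an
          # interaction, weak dependence on p[1], none on p[3]
          return p[0]**2 + 0.1 * p[1] + np.sin(p[2]) + p[0] * p[2]

      k, r, delta = 4, 50, 0.1                 # inputs, repetitions, step size
      effects = np.zeros((r, k))
      for j in range(r):
          base = rng.uniform(0, 1 - delta, k)  # random base point in [0,1]^k
          for i in range(k):
              step = base.copy(); step[i] += delta
              effects[j, i] = (model(step) - model(base)) / delta

      mu_star = np.abs(effects).mean(axis=0)   # mean |elementary effect|: influence
      sigma = effects.std(axis=0)              # spread: nonlinearity/interaction
      for i in range(k):
          print(f"p[{i}]: mu* = {mu_star[i]:.3f}, sigma = {sigma[i]:.3f}")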

  17. Cost and sensitivity of restricted active-space calculations of metal L-edge X-ray absorption spectra.

    PubMed

    Pinjari, Rahul V; Delcey, Mickaël G; Guo, Meiyuan; Odelius, Michael; Lundberg, Marcus

    2016-02-15

    The restricted active-space (RAS) approach can accurately simulate metal L-edge X-ray absorption spectra of first-row transition metal complexes without the use of any fitting parameters. These characteristics provide a unique capability to identify unknown chemical species and to analyze their electronic structure. To find the best balance between cost and accuracy, the sensitivity of the simulated spectra with respect to the method variables has been tested for two models, [FeCl6 ](3-) and [Fe(CN)6 ](3-) . For these systems, the reference calculations give deviations, when compared with experiment, of ≤1 eV in peak positions, ≤30% for the relative intensity of major peaks, and ≤50% for minor peaks. When compared with these deviations, the simulated spectra are sensitive to the number of final states, the inclusion of dynamical correlation, and the ionization potential electron affinity shift, in addition to the selection of the active space. The spectra are less sensitive to the quality of the basis set and even a double-ζ basis gives reasonable results. The inclusion of dynamical correlation through second-order perturbation theory can be done efficiently using the state-specific formalism without correlating the core orbitals. Although these observations are not directly transferable to other systems, they can, together with a cost analysis, aid in the design of RAS models and help to extend the use of this powerful approach to a wider range of transition metal systems. © 2015 Wiley Periodicals, Inc.

  18. Cost-effectiveness of point-of-care testing for dehydration in the pediatric ED.

    PubMed

    Whitney, Rachel E; Santucci, Karen; Hsiao, Allen; Chen, Lei

    2016-08-01

    Acute gastroenteritis (AGE) and subsequent dehydration account for a large proportion of pediatric emergency department (PED) visits. Point-of-care (POC) testing has been used in conjunction with clinical assessment to determine the degree of dehydration. Despite the wide acceptance of POC testing, little formal cost-effectiveness analysis of POC testing in the PED exists. We aim to examine the cost-effectiveness of using POC electrolyte testing vs traditional serum chemistry testing in the PED for children with AGE. This was a cost-effectiveness analysis using data from a randomized controlled trial of children with AGE. A decision analysis model was constructed to calculate cost savings from the point of view of the payer and the provider. We used parameters obtained from the trial, including cost of testing, admission rates, cost of admission, and length of stay. Sensitivity analyses were performed to evaluate the stability of our model. Using the data set of 225 subjects, POC testing results in a cost savings of $303.30 per patient compared with traditional serum testing from the point of view of the payer. From the point of view of the provider, POC testing results in consistent mean savings of $36.32 ($8.29-$64.35) per patient. Sensitivity analyses demonstrated the stability of the model and consistent savings. This decision analysis provides evidence that POC testing in children with gastroenteritis-related moderate dehydration results in significant cost savings from the points of view of payers and providers compared to traditional serum chemistry testing. Copyright © 2016 Elsevier Inc. All rights reserved.
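
    The provider-side comparison boils down to an expected cost per patient under each testing strategy. The sketch below shows the decision-tree arithmetic with invented unit costs and admission rates; only the rough size of the reported mean saving is taken from the abstract.

      poc = dict(test_cost=10.0, admit_rate=0.20, admit_cost=2000.0)    # invented
      serum = dict(test_cost=25.0, admit_rate=0.21, admit_cost=2000.0)  # invented

      def expected_cost(strategy):
          """Expected cost per patient: test cost plus admission risk times cost."""
          return strategy["test_cost"] + strategy["admit_rate"] * strategy["admit_cost"]

      saving = expected_cost(serum) - expected_cost(poc)
      print(f"expected saving with POC testing: ${saving:.2f} per patient")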

  19. Critical Analysis of the Mathematical Formalism of Theoretical Physics. I. Foundations of Differential and Integral Calculus

    NASA Astrophysics Data System (ADS)

    Kalanov, Temur Z.

    2013-04-01

    Critical analysis of the standard foundations of differential and integral calculus -- as the mathematical formalism of theoretical physics -- is proposed. The methodological basis of the analysis is the unity of formal logic and rational dialectics. It is shown that: (a) the foundations (i.e. dy/dx = lim_{δx→0} δy/δx = lim_{δx→0} [f(x + δx) - f(x)]/δx, dy = δy, dx = δx, where y = f(x) is a continuous function of one argument x; δx and δy are increments; dx and dy are differentials) do not satisfy a law of formal logic -- the law of identity; (b) the infinitesimal quantities dx and dy are fictitious quantities: they have neither algebraic nor geometrical meaning because these quantities do not take numerical values and, therefore, have no quantitative measure; (c) expressions of the kind x + dx are erroneous because x (a finite quantity) and dx (an infinitely diminished quantity) have different sense, different qualitative determinacy; since x = const under δx → 0, a derivative does not contain the variable quantity x and depends only on a constant c. Consequently, the standard concepts ``infinitesimal quantity (uninterruptedly diminishing quantity)'', ``derivative'', and ``derivative as a function of a variable quantity'' represent an incorrect basis for mathematics and theoretical physics.

  20. Development of a Software Safety Process and a Case Study of Its Use

    NASA Technical Reports Server (NTRS)

    Knight, J. C.

    1997-01-01

    Research in the year covered by this reporting period has been primarily directed toward the following areas: (1) Formal specification of user interfaces; (2) Fault-tree analysis including software; (3) Evaluation of formal specification notations; (4) Evaluation of formal verification techniques; (5) Expanded analysis of the shell architecture concept; (6) Development of techniques to address the problem of information survivability; and (7) Development of a sophisticated tool for the manipulation of formal specifications written in Z. This report summarizes activities under the grant. The technical results relating to this grant and the remainder of the principal investigator's research program are contained in various reports and papers. The remainder of this report is organized as follows. In the next section, an overview of the project is given. This is followed by a summary of accomplishments during the reporting period and details of students funded. Seminars presented describing work under this grant are listed in the following section, and the final section lists publications resulting from this grant.

  1. Sensitivity of nonuniform sampling NMR.

    PubMed

    Palmer, Melissa R; Suiter, Christopher L; Henry, Geneive E; Rovnyak, James; Hoch, Jeffrey C; Polenova, Tatyana; Rovnyak, David

    2015-06-04

    Many information-rich multidimensional experiments in nuclear magnetic resonance spectroscopy can benefit from a signal-to-noise ratio (SNR) enhancement of up to about 2-fold if a decaying signal in an indirect dimension is sampled with nonconsecutive increments, termed nonuniform sampling (NUS). This work provides formal theoretical results and applications to resolve major questions about the scope of the NUS enhancement. First, we introduce the NUS Sensitivity Theorem in which any decreasing sampling density applied to any exponentially decaying signal always results in higher sensitivity (SNR per square root of measurement time) than uniform sampling (US). Several cases will illustrate this theorem and show that even conservative applications of NUS improve sensitivity by useful amounts. Next, we turn to a serious limitation of uniform sampling: the SNR by US decreases for extending evolution times, and thus total experimental times, beyond 1.26T2 (T2 = signal decay constant). Thus, SNR and resolution cannot be simultaneously improved by extending US beyond 1.26T2. We find that NUS can eliminate this constraint, and we introduce the matched NUS SNR Theorem: an exponential sampling density matched to the signal decay always improves the SNR with additional evolution time. Though proved for a specific case, broader classes of NUS densities also improve SNR with evolution time. Applications of these theoretical results are given for a soluble plant natural product and a solid tripeptide (u-(13)C,(15)N-MLF). These formal results clearly demonstrate the inadequacies of applying US to decaying signals in indirect nD-NMR dimensions, supporting a broader adoption of NUS.
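
    The 1.26T2 limit for uniform sampling is easy to check numerically. The sketch below (Python) uses the simple proportionality SNR ~ (summed signal envelope)/sqrt(N) for an exponentially decaying signal; this is an illustrative model of the argument, not the paper's formal proof:

      import numpy as np

      T2, dt = 1.0, 0.01
      best_T, best_snr = 0.0, -np.inf
      for n in range(10, 500):
          t = np.arange(n) * dt
          snr = np.exp(-t / T2).sum() / np.sqrt(n)  # peak SNR per root measurement time
          if snr > best_snr:
              best_T, best_snr = t[-1], snr
      # The optimum lands near 1.26*T2, matching the limit quoted above.
      print(f"uniform-sampling sensitivity peaks at T_max = {best_T:.2f} * T2")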

  2. Interoperability between phenotype and anatomy ontologies.

    PubMed

    Hoehndorf, Robert; Oellrich, Anika; Rebholz-Schuhmann, Dietrich

    2010-12-15

    Phenotypic information is important for the analysis of the molecular mechanisms underlying disease. A formal ontological representation of phenotypic information can help to identify, interpret and infer phenotypic traits based on experimental findings. The methods that are currently used to represent data and information about phenotypes fail to make the semantics of the phenotypic trait explicit and do not interoperate with ontologies of anatomy and other domains. Therefore, valuable resources for the analysis of phenotype studies remain unconnected and inaccessible to automated analysis and reasoning. We provide a framework to formalize phenotypic descriptions and make their semantics explicit. Based on this formalization, we provide the means to integrate phenotypic descriptions with ontologies of other domains, in particular anatomy and physiology. We demonstrate how our framework leads to the capability to represent disease phenotypes, perform powerful queries that were not possible before and infer additional knowledge. http://bioonto.de/pmwiki.php/Main/PheneOntology.
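
    As a loose illustration of entity-quality style phenotype formalization (the toy classes and miniature anatomy hierarchy below are assumptions for illustration, not the paper's actual framework), a phenotype can be modeled as a quality inhering in an anatomical entity, which lets anatomy-ontology reasoning answer phenotype queries:

      from dataclasses import dataclass

      @dataclass(frozen=True)
      class Phenotype:
          entity: str   # term from an anatomy ontology
          quality: str  # term from a quality ontology such as PATO

      is_a = {"femur": "bone", "bone": "anatomical structure"}  # toy anatomy ontology

      def ancestors(term):
          while term in is_a:
              term = is_a[term]
              yield term

      def affects(p, target):
          # A phenotype of an entity is also a phenotype of its ancestors.
          return p.entity == target or target in ancestors(p.entity)

      p = Phenotype(entity="femur", quality="decreased length")
      print(affects(p, "bone"))  # True: a femur phenotype is a bone phenotype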

  3. EUS for the staging of gastric cancer: a meta-analysis.

    PubMed

    Mocellin, Simone; Marchet, Alberto; Nitti, Donato

    2011-06-01

    The role of EUS in the locoregional staging of gastric carcinoma is undefined. We aimed to comprehensively review and quantitatively summarize the available evidence on the staging performance of EUS. We systematically searched the MEDLINE, Cochrane, CANCERLIT, and EMBASE databases for relevant studies published until July 2010. Formal meta-analysis of diagnostic accuracy parameters was performed by using a bivariate random-effects model. Fifty-four studies enrolling 5601 patients with gastric cancer undergoing disease staging with EUS were eligible for the meta-analysis. EUS staging accuracy across eligible studies was measured by computing overall sensitivity, specificity, positive likelihood ratio (PLR), negative likelihood ratio (NLR), and diagnostic odds ratio (DOR). EUS can differentiate T1-2 from T3-4 gastric cancer with high accuracy, with overall sensitivity, specificity, PLR, NLR, and DOR of 0.86 (95% CI, 0.81-0.90), 0.91 (95% CI, 0.89-0.93), 9.8 (95% CI, 7.5-12.8), 0.15 (95% CI, 0.11-0.21), and 65 (95% CI, 41-105), respectively. In contrast, the diagnostic performance of EUS for lymph node status is less reliable, with overall sensitivity, specificity, PLR, NLR, and DOR of 0.69 (95% CI, 0.63-0.74), 0.84 (95% CI, 0.81-0.88), 4.4 (95% CI, 3.6-5.4), 0.37 (95% CI, 0.32-0.44), and 12 (95% CI, 9-16), respectively. Results regarding single T categories (including T1 substages) and Bayesian nomograms to calculate posttest probabilities for any target condition prevalence are also provided. Statistical heterogeneity was generally high; unfortunately, subgroup analysis did not identify a consistent source of the heterogeneity. Our results support the use of EUS for the locoregional staging of gastric cancer, which can affect the therapeutic management of these patients. However, clinicians must be aware of the performance limits of this staging tool. Copyright © 2011 American Society for Gastrointestinal Endoscopy. Published by Mosby, Inc. All rights reserved.
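
    The Bayesian nomograms mentioned above amount to Bayes' rule in odds form: posttest odds = pretest odds x likelihood ratio. A short sketch (Python) using the pooled T-stage ratios reported above:

      def posttest_prob(pretest_prob, lr):
          odds = pretest_prob / (1.0 - pretest_prob) * lr
          return odds / (1.0 + odds)

      # Assume a 40% pretest probability of T3-4 disease (illustrative).
      print(f"{posttest_prob(0.40, 9.8):.2f}")   # positive EUS, PLR = 9.8 -> 0.87
      print(f"{posttest_prob(0.40, 0.15):.2f}")  # negative EUS, NLR = 0.15 -> 0.09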

  4. A Logical Analysis of Quantum Voting Protocols

    NASA Astrophysics Data System (ADS)

    Rad, Soroush Rafiee; Shirinkalam, Elahe; Smets, Sonja

    2017-12-01

    In this paper we provide a logical analysis of the Quantum Voting Protocol for Anonymous Surveying as developed by Horoshko and Kilin in (Phys. Lett. A 375, 1172-1175 2011). In particular we make use of the probabilistic logic of quantum programs as developed in (Int. J. Theor. Phys. 53, 3628-3647 2014) to provide a formal specification of the protocol and to derive its correctness. Our analysis is part of a wider program on the application of quantum logics to the formal verification of protocols in quantum communication and quantum computation.

  5. Evaluation of high throughput gene expression platforms using a genomic biomarker signature for prediction of skin sensitization.

    PubMed

    Forreryd, Andy; Johansson, Henrik; Albrekt, Ann-Sofie; Lindstedt, Malin

    2014-05-16

    Allergic contact dermatitis (ACD) develops upon exposure to certain chemical compounds termed skin sensitizers. To reduce the occurrence of skin sensitizers, chemicals are regularly screened for their capacity to induce sensitization. The recently developed Genomic Allergen Rapid Detection (GARD) assay is an in vitro alternative to animal testing for identification of skin sensitizers, classifying chemicals by evaluating transcriptional levels of a genomic biomarker signature. During assay development and biomarker identification, genome-wide expression analysis was applied using microarrays covering approximately 30,000 transcripts. However, the microarray platform suffers from drawbacks in terms of low sample throughput, high cost per sample, and time-consuming protocols, and is a limiting factor for adaptation of GARD into a routine assay for screening of potential sensitizers. With the aim of simplifying assay procedures, improving technical parameters, and increasing sample throughput, we assessed the performance of three high-throughput gene expression platforms--nCounter®, BioMark HD™ and OpenArray®--and correlated their performance metrics against our previously generated microarray data. We measured the levels of 30 transcripts from the GARD biomarker signature across 48 samples. Detection sensitivity, reproducibility, correlations and overall structure of gene expression measurements were compared across platforms. Gene expression data from all of the evaluated platforms could be used to classify most of the sensitizers from non-sensitizers in the GARD assay. Results also showed high data quality and acceptable reproducibility for all platforms but only medium to poor correlations of expression measurements across platforms. In addition, the evaluated platforms were superior to the microarray platform in terms of cost efficiency, simplicity of protocols and sample throughput. We evaluated the performance of three non-array based platforms using a limited set of transcripts from the GARD biomarker signature. We demonstrated that it was possible to achieve acceptable discriminatory power in terms of separation between sensitizers and non-sensitizers in the GARD assay while reducing assay costs, simplifying assay procedures, and increasing sample throughput by using an alternative platform, providing a first step towards the goal of preparing GARD for formal validation and adaptation of the assay for industrial screening of potential sensitizers.

  6. Formal Solutions for Polarized Radiative Transfer. II. High-order Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Janett, Gioele; Steiner, Oskar; Belluzzi, Luca, E-mail: gioele.janett@irsol.ch

    When integrating the radiative transfer equation for polarized light, the necessity of high-order numerical methods is well known. In fact, well-performing high-order formal solvers enable higher accuracy and the use of coarser spatial grids. Aiming to provide a clear comparison between formal solvers, this work presents different high-order numerical schemes and applies the systematic analysis proposed by Janett et al., emphasizing their advantages and drawbacks in terms of order of accuracy, stability, and computational cost.
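
    As an illustration of a high-order formal solver on the simplest possible case, the sketch below (Python) applies a classical fourth-order Runge-Kutta step to the scalar transfer equation dI/ds = -chi(s) * (I - S(s)); the paper itself treats the full polarized, matrix-valued problem:

      import math

      def rk4_step(I, s, h, chi, S):
          f = lambda s_, I_: -chi(s_) * (I_ - S(s_))
          k1 = f(s, I)
          k2 = f(s + h / 2, I + h / 2 * k1)
          k3 = f(s + h / 2, I + h / 2 * k2)
          k4 = f(s + h, I + h * k3)
          return I + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

      # Constant-property test: chi = 1, S = 1, I(0) = 0 has exact solution 1 - exp(-s).
      chi, S = (lambda s: 1.0), (lambda s: 1.0)
      I, s, h = 0.0, 0.0, 0.1
      for _ in range(50):
          I, s = rk4_step(I, s, h, chi, S), s + h
      print(abs(I - (1 - math.exp(-5.0))))  # tiny error, consistent with order four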

  7. Higher-harmonic collective modes in a trapped gas from second-order hydrodynamics

    DOE PAGES

    Lewis, William E.; Romatschke, P.

    2017-02-21

    Utilizing a second-order hydrodynamics formalism, the dispersion relations for the frequencies and damping rates of collective oscillations as well as spatial structure of these modes up to the decapole oscillation in both two- and three- dimensional gas geometries are calculated. In addition to higher-order modes, the formalism also gives rise to purely damped "non-hydrodynamic" modes. We calculate the amplitude of the various modes for both symmetric and asymmetric trap quenches, finding excellent agreement with an exact quantum mechanical calculation. Furthermore, we find that higher-order hydrodynamic modes are more sensitive to the value of shear viscosity, which may be of interest for the precision extraction of transport coefficients in Fermi gas systems.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ayala, Alejandro; Hentschinski, Martin; Jalilian-Marian, Jamal

    Azimuthal angular correlations between produced hadrons/jets in high energy collisions are a sensitive probe of the dynamics of QCD at small x. Here we derive the triple differential cross section for inclusive production of 3 polarized partons in DIS at small x using the spinor helicity formalism. The target proton or nucleus is described using the Color Glass Condensate (CGC) formalism. The resulting expressions are used to study azimuthal angular correlations between produced partons in order to probe the gluon structure of the target hadron or nucleus. Finally, our analytic expressions can also be used to calculate the real part of the Next to Leading Order (NLO) corrections to di-hadron production in DIS by integrating out one of the three final state partons.

  10. Investigation on the use of optimization techniques for helicopter airframe vibrations design studies

    NASA Technical Reports Server (NTRS)

    Sreekanta Murthy, T.

    1992-01-01

    Results of the investigation of formal nonlinear programming-based numerical optimization techniques for helicopter airframe vibration reduction are summarized. The objective and constraint functions and the sensitivity expressions used in the formulation of airframe vibration optimization problems are presented and discussed. Implementation of a new computational procedure based on MSC/NASTRAN and CONMIN in a computer program system called DYNOPT for optimizing airframes subject to strength, frequency, dynamic response, and dynamic stress constraints is described. An optimization methodology is proposed which is thought to provide a new way of applying formal optimization techniques during the various phases of the airframe design process. Numerical results obtained from the application of the DYNOPT optimization code to a helicopter airframe are discussed.
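
    A toy analogue (Python/SciPy) of this kind of formulation, with a surrogate vibration objective and a single frequency-margin constraint standing in for the MSC/NASTRAN-based functions; every expression here is hypothetical:

      from scipy.optimize import minimize

      def vibration(x):    # surrogate vibration response vs. two design variables
          return (x[0] - 2.0) ** 2 + 0.5 * (x[1] - 1.0) ** 2

      def freq_margin(x):  # >= 0 keeps a natural frequency away from a rotor harmonic
          return x[0] + x[1] - 2.5

      res = minimize(vibration, x0=[1.0, 1.0],
                     bounds=[(0.5, 5.0), (0.5, 5.0)],
                     constraints=[{"type": "ineq", "fun": freq_margin}])
      print(res.x, res.fun)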

  11. δ M formalism and anisotropic chaotic inflation power spectrum

    NASA Astrophysics Data System (ADS)

    Talebian-Ashkezari, A.; Ahmadi, N.

    2018-05-01

    A new analytical approach to linear perturbations in anisotropic inflation has been introduced in [A. Talebian-Ashkezari, N. Ahmadi and A.A. Abolhasani, JCAP 03 (2018) 001] under the name of the δM formalism. In this paper we apply this approach to a model of anisotropic inflation driven by a scalar field, coupled to the kinetic term of a vector field with a U(1) symmetry. The δM formalism provides an efficient way of computing tensor-tensor, tensor-scalar, as well as scalar-scalar two-point correlations that are needed for the analysis of the observational features of an anisotropic model on the CMB. A comparison between the δM results and the tedious calculations using the in-in formalism shows the aptitude of the δM formalism in calculating accurate two-point correlation functions between physical modes of the system.

  12. Culturally Sensitive Counselling in Nunavut: Implications of Inuit Traditional Knowledge

    ERIC Educational Resources Information Center

    Wihak, Christine; Merali, Noorfarah

    2003-01-01

    The success of the Inuit people of Canada in seeking political autonomy resulted in the creation of the Nunavut territory. The new Government of Nunavut (GN) has instituted Inuit Quajimajatiqangit (IQ), the values, norms, and traditional knowledge of the Inuit, as formal policy to guide the delivery of health, social, and civil services in order…

  13. Personal and Formal Backgrounds as Factors Which Influence Linguistic and Cultural Competency in the Teaching of Mathematics

    ERIC Educational Resources Information Center

    Nguyen-Le, Khanh

    2010-01-01

    This dissertation addresses the need for effective teaching approaches in a society with increasing numbers of culturally and linguistically diverse (CLD) students. The focus of this study is on teachers' linguistic and cultural competency (LCC), which I define as teachers' critical thinking and sensitivity concerning issues of language and…

  14. Spatial but Not Temporal Numerosity Thresholds Correlate with Formal Math Skills in Children

    ERIC Educational Resources Information Center

    Anobile, Giovanni; Arrighi, Roberto; Castaldi, Elisa; Grassi, Eleonora; Pedonese, Lara; Moscoso, Paula A. M.; Burr, David C.

    2018-01-01

    Humans and other animals are able to make rough estimations of quantities using what has been termed the "approximate number system" (ANS). Much evidence suggests that sensitivity to numerosity correlates with symbolic math capacity, leading to the suggestion that the ANS may serve as a start-up tool to develop symbolic math. Many…

  15. Extrusion cast explosive

    DOEpatents

    Scribner, Kenneth J.

    1985-01-01

    Improved multiphase, high-performance, high-energy, extrusion cast explosive compositions are disclosed, comprising: a crystalline explosive material; an energetic liquid plasticizer; a urethane prepolymer comprising a blend of polyvinyl formal and polycaprolactone; a polyfunctional isocyanate; and a catalyst. These new explosive compositions exhibit higher explosive content, a smooth detonation front, excellent stability over long periods of storage, and lower sensitivity to mechanical stimuli.

  16. Preparing Science Teachers to Address Contentious and Sensitive Science Topics

    ERIC Educational Resources Information Center

    Ado, Gustave

    2015-01-01

    Purpose: Despite high HIV prevalence rates in Ivory Coast, the formal K-12 curriculum was not developed to address HIV/AIDS information completely for many African students. The purpose of this study was to identify factors that influenced Ivorian teachers' teaching of the HIV/AIDS curriculum in middle school science curricula in nine middle…

  17. Elicitation Techniques: Getting People to Talk about Ideas They Don't Usually Talk About

    ERIC Educational Resources Information Center

    Barton, Keith C.

    2015-01-01

    Elicitation techniques are a category of research tasks that use visual, verbal, or written stimuli to encourage participants to talk about their ideas. These tasks are particularly useful for exploring topics that may be difficult to discuss in formal interviews, such as those that involve sensitive issues or rely on tacit knowledge. Elicitation…

  18. Socio-Sexual Education: A Practical Study in Formal Thinking and Teachable Moments

    ERIC Educational Resources Information Center

    Wagner, Paul A.

    2011-01-01

    Sex education is almost as sensitive a topic in public schooling as is the imposition of high-stakes testing. Both typically claim to be value-free contributions to the development of the student's cognitive, psychological and sometimes even moral maturity. Ironically each seems to short-change students in all three areas of development. The focus…

  19. Rule acquisition in formal decision contexts based on formal, object-oriented and property-oriented concept lattices.

    PubMed

    Ren, Yue; Li, Jinhai; Aswani Kumar, Cherukuri; Liu, Wenqi

    2014-01-01

    Rule acquisition is one of the main purposes in the analysis of formal decision contexts. Up to now, there have been several types of rules in formal decision contexts such as decision rules, decision implications, and granular rules, which can be viewed as ∧-rules since all of them have the following form: "if conditions 1,2,…, and m hold, then decisions hold." In order to enrich the existing rule acquisition theory in formal decision contexts, this study puts forward two new types of rules which are called ∨-rules and ∨-∧ mixed rules based on formal, object-oriented, and property-oriented concept lattices. Moreover, a comparison of ∨-rules, ∨-∧ mixed rules, and ∧-rules is made from the perspectives of inclusion and inference relationships. Finally, some real examples and numerical experiments are conducted to compare the proposed rule acquisition algorithms with the existing one in terms of the running efficiency.
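
    A brute-force check of the two rule forms over a miniature formal decision context (hypothetical data; the paper derives such rules from concept lattices rather than by enumeration):

      # Each object maps to (condition attributes, decision attributes).
      objects = {
          "o1": ({"a", "b"}, {"d"}),
          "o2": ({"a"},      {"d"}),
          "o3": ({"b"},      set()),
      }

      def holds_and(conds, decs):
          """And-rule: if ALL of conds hold, then decs hold."""
          return all(decs <= d for c, d in objects.values() if conds <= c)

      def holds_or(conds, decs):
          """Or-rule: if ANY of conds holds, then decs hold."""
          return all(decs <= d for c, d in objects.values() if conds & c)

      print(holds_and({"a", "b"}, {"d"}))  # True: only o1 has both a and b
      print(holds_or({"a"}, {"d"}))        # True: o1 and o2 carry a; both have d
      print(holds_or({"b"}, {"d"}))        # False: o3 carries b but not d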

  20. The role of early language abilities on math skills among Chinese children.

    PubMed

    Zhang, Juan; Fan, Xitao; Cheung, Sum Kwing; Meng, Yaxuan; Cai, Zhihui; Hu, Bi Ying

    2017-01-01

    The present study investigated the role of early language abilities in the development of math skills among Chinese K-3 students. About 2000 children in China, who were on average aged 6 years, were assessed for both informal math (e.g., basic number concepts such as counting objects) and formal math (calculations including addition and subtraction) skills, language abilities and nonverbal intelligence. Correlation analysis showed that language abilities were more strongly associated with informal than formal math skills, and regression analyses revealed that children's language abilities could uniquely predict both informal and formal math skills with age, gender, and nonverbal intelligence controlled. Mediation analyses demonstrated that the relationship between children's language abilities and formal math skills was partially mediated by informal math skills. The current findings indicate that: 1) children's language abilities have strong predictive value for both informal and formal math skills; and 2) language abilities impact formal math skills partially through the mediation of informal math skills.

  3. Applying formal methods and object-oriented analysis to existing flight software

    NASA Technical Reports Server (NTRS)

    Cheng, Betty H. C.; Auernheimer, Brent

    1993-01-01

    Correctness is paramount for safety-critical software control systems. Critical software failures in medical radiation treatment, communications, and defense are familiar to the public. The significant quantity of software malfunctions regularly reported to the software engineering community, the laws concerning liability, and a recent NRC Aeronautics and Space Engineering Board report additionally motivate the use of error-reducing and defect-detection software development techniques. The benefits of formal methods in requirements-driven software development ('forward engineering') are well documented. One advantage of rigorously engineering software is that formal notations are precise, verifiable, and facilitate automated processing. This paper describes the application of formal methods to reverse engineering, where formal specifications are developed for a portion of the shuttle on-orbit digital autopilot (DAP). Three objectives of the project were to: demonstrate the use of formal methods on a shuttle application, facilitate the incorporation and validation of new requirements for the system, and verify the safety-critical properties to be exhibited by the software.

  4. Using software security analysis to verify the secure socket layer (SSL) protocol

    NASA Technical Reports Server (NTRS)

    Powell, John D.

    2004-01-01

    The National Aeronautics and Space Administration (NASA) has tens of thousands of networked computer systems and applications. Software security vulnerabilities present risks such as lost or corrupted data, information theft, and unavailability of critical systems. These risks represent potentially enormous costs to NASA. The NASA Code Q research initiative 'Reducing Software Security Risk (RSSR) Through an Integrated Approach' offers, among its capabilities, formal verification of software security properties, through the use of model-based verification (MBV) to address software security risks. [1,2,3,4,5,6] MBV is a formal approach to software assurance that combines analysis of software, via abstract models, with technology, such as model checkers, that provides automation of the mechanical portions of the analysis process. This paper will discuss: the need for formal analysis to assure software systems with respect to software security and why testing alone cannot provide it; the means by which MBV with a Flexible Modeling Framework (FMF) accomplishes the necessary analysis task; and an example of FMF-style MBV in the verification of properties over the Secure Socket Layer (SSL) communication protocol as a demonstration.
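
    At its core, model-based verification exhaustively explores an abstract model's state space and checks a property in every reachable state. The generic breadth-first sketch below (Python, with an invented toy handshake model) illustrates the idea only; it is not the FMF tooling:

      from collections import deque

      transitions = {  # hypothetical abstract protocol model
          "start":          ["hello_sent"],
          "hello_sent":     ["key_agreed", "start"],  # retry allowed
          "key_agreed":     ["data_encrypted"],
          "data_encrypted": [],
      }

      def property_ok(state):
          # Safety property: data is never sent without an agreed key.
          return state != "data_unencrypted"

      seen, queue = {"start"}, deque(["start"])
      while queue:
          s = queue.popleft()
          assert property_ok(s), f"property violated in state {s}"
          for nxt in transitions[s]:
              if nxt not in seen:
                  seen.add(nxt)
                  queue.append(nxt)
      print(f"checked {len(seen)} reachable states; property holds")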

  5. An experiment on the impact of a neonicotinoid pesticide on honeybees: the value of a formal analysis of the data.

    PubMed

    Schick, Robert S; Greenwood, Jeremy J D; Buckland, Stephen T

    2017-01-01

    We assess the analysis of the data resulting from a field experiment conducted by Pilling et al. (PLoS ONE. doi: 10.1371/journal.pone.0077193, 5) on the potential effects of thiamethoxam on honeybees. The experiment had low levels of replication, so Pilling et al. concluded that formal statistical analysis would be misleading. This would be true if such an analysis merely comprised tests of statistical significance and if the investigators concluded that lack of significance meant little or no effect. However, an analysis that includes estimation of the size of any effects, with confidence limits, allows one to reach conclusions that are not misleading and that produce useful insights. For the data of Pilling et al., we use straightforward statistical analysis to show that the confidence limits are generally so wide that any effects of thiamethoxam could have been large without being statistically significant. Instead of formal analysis, Pilling et al. simply inspected the data and concluded that they provided no evidence of detrimental effects and from this that thiamethoxam poses a "low risk" to bees. Conclusions derived from the inspection of the data were not just misleading in this case but also are unacceptable in principle, for if data are inadequate for a formal analysis (or only good enough to provide estimates with wide confidence intervals), then they are bound to be inadequate as a basis for reaching any sound conclusions. Given that the data in this case are largely uninformative with respect to the treatment effect, any conclusions reached from such informal approaches can do little more than reflect the prior beliefs of those involved.
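
    The point about wide confidence limits is easy to demonstrate: with few replicates, even a large estimated effect can come with an interval that spans zero. A sketch (Python) with invented numbers, not the Pilling et al. data:

      import math
      from scipy import stats

      n, mean_diff, sd = 4, -15.0, 20.0   # e.g., % change in a colony endpoint
      se = sd / math.sqrt(n)
      t = stats.t.ppf(0.975, df=n - 1)
      lo, hi = mean_diff - t * se, mean_diff + t * se
      # (-46.8, 16.8): not significant, yet consistent with substantial harm.
      print(f"95% CI: ({lo:.1f}, {hi:.1f})")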

  6. Leadership for Community Engagement--A Distributed Leadership Perspective

    ERIC Educational Resources Information Center

    Liang, Jia G.; Sandmann, Lorilee R.

    2015-01-01

    This article presents distributed leadership as a framework for analysis, showing how the phenomenon complements formal higher education structures by mobilizing leadership from various sources, formal and informal. This perspective more accurately portrays the reality of leading engaged institutions. Using the application data from 224…

  7. On the formalization and reuse of scientific research.

    PubMed

    King, Ross D; Liakata, Maria; Lu, Chuan; Oliver, Stephen G; Soldatova, Larisa N

    2011-10-07

    The reuse of scientific knowledge obtained from one investigation in another investigation is basic to the advance of science. Scientific investigations should therefore be recorded in ways that promote the reuse of the knowledge they generate. The use of logical formalisms to describe scientific knowledge has potential advantages in facilitating such reuse. Here, we propose a formal framework for using logical formalisms to promote reuse. We demonstrate the utility of this framework by using it in a worked example from biology: demonstrating cycles of investigation formalization [F] and reuse [R] to generate new knowledge. We first used logic to formally describe a Robot scientist investigation into yeast (Saccharomyces cerevisiae) functional genomics [f(1)]. With Robot scientists, unlike human scientists, the production of comprehensive metadata about their investigations is a natural by-product of the way they work. We then demonstrated how this formalism enabled the reuse of the research in investigating yeast phenotypes [r(1) = R(f(1))]. This investigation found that the removal of non-essential enzymes generally resulted in enhanced growth. The phenotype investigation was then formally described using the same logical formalism as the functional genomics investigation [f(2) = F(r(1))]. We then demonstrated how this formalism enabled the reuse of the phenotype investigation to investigate yeast systems-biology modelling [r(2) = R(f(2))]. This investigation found that yeast flux-balance analysis models fail to predict the observed changes in growth. Finally, the systems biology investigation was formalized for reuse in future investigations [f(3) = F(r(2))]. These cycles of reuse are a model for the general reuse of scientific knowledge.

  8. Formal Solutions for Polarized Radiative Transfer. III. Stiffness and Instability

    NASA Astrophysics Data System (ADS)

    Janett, Gioele; Paganini, Alberto

    2018-04-01

    Efficient numerical approximation of the polarized radiative transfer equation is challenging because this system of ordinary differential equations exhibits stiff behavior, which potentially results in numerical instability. This negatively impacts the accuracy of formal solvers, and small step-sizes are often necessary to retrieve physical solutions. This work presents stability analyses of formal solvers for the radiative transfer equation of polarized light, identifies instability issues, and suggests practical remedies. In particular, the assumptions and the limitations of the stability analysis of Runge–Kutta methods play a crucial role. On this basis, a suitable and pragmatic formal solver is outlined and tested. An insightful comparison to the scalar radiative transfer equation is also presented.
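
    Stiffness can be illustrated on the scalar test problem dI/ds = -k(I - 1): an explicit Euler step is stable only for step-sizes h < 2/k, whereas the implicit trapezoidal step is stable for any h. A minimal demonstration (Python), separate from the paper's analysis of Runge-Kutta formal solvers:

      k, h, n = 100.0, 0.05, 40   # h*k = 5 > 2: beyond the explicit stability limit
      I_exp = I_imp = 0.0
      for _ in range(n):
          I_exp = I_exp + h * (-k) * (I_exp - 1.0)                     # explicit Euler
          I_imp = ((1 - h * k / 2) * I_imp + h * k) / (1 + h * k / 2)  # trapezoidal
      # Explicit iterate blows up (~1e24); trapezoidal settles at the limit 1.
      print(f"explicit Euler: {I_exp:.3e}   trapezoidal: {I_imp:.3f}")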

  9. Racial and/or Ethnic Differences in Formal Sex Education and Sex Education by Parents among Young Women in the United States.

    PubMed

    Vanderberg, Rachel H; Farkas, Amy H; Miller, Elizabeth; Sucato, Gina S; Akers, Aletha Y; Borrero, Sonya B

    2016-02-01

    We sought to investigate the associations between race and/or ethnicity and young women's formal sex education and sex education by parents. Cross-sectional analysis of a nationally representative sample of 1768 women aged 15-24 years who participated in the 2011-2013 National Survey of Family Growth. We assessed 6 main outcomes, participants' reports of: (1) any formal sex education; (2) formal contraceptive education; (3) formal sexually transmitted infection (STI) education; (4) any sex education by parents; (5) contraceptive education by parents; and (6) STI education by parents. The primary independent variable was self-reported race and/or ethnicity. Nearly all participants (95%) reported any formal sex education, 68% reported formal contraceptive education, and 92% reported formal STI education. Seventy-five percent of participants reported having any sex education by parents, and only 61% and 56% reported contraceptive and STI education by parents, respectively. US-born Hispanic women were more likely than white women to report STI education by parents (adjusted odds ratio = 1.87; 95% confidence interval, 1.17-2.99). No other significant racial and/or ethnic differences in sex education were found. There are few racial and/or ethnic differences in formal sex education and sex education by parents among young women. Copyright © 2016 North American Society for Pediatric and Adolescent Gynecology. All rights reserved.

  10. Comparison of sensitivity and specificity among 15 criteria for chronic inflammatory demyelinating polyneuropathy.

    PubMed

    Breiner, Ari; Brannagan, Thomas H

    2014-07-01

    There have been 15 formal sets of criteria published for the diagnosis of CIDP. No study to date has compared the sensitivity and specificity of all published criteria in the same patient population. We conducted a retrospective chart review of patients with CIDP (n = 56) and controls with diabetic polyneuropathy (n = 37) or amyotrophic lateral sclerosis (n = 39) who were followed in an academic neuromuscular practice. The sensitivity and specificity of each CIDP criterion was calculated, including clinical, laboratory, and electrodiagnostic components. Sensitivities ranged from 1.8% to 87.5%; the Dyck (87.5%), Neuropathy Association (75%), and European Federation of Neurological Societies (EFNS; 73.2%) criteria ranked highest. Specificities ranged from 65.6% to 100% and, among the 3 most sensitive criteria, the EFNS (90.8%) and Neuropathy Association (82.9%) criteria were most specific. In our patient population, the EFNS and Neuropathy Association criteria stand out due to high sensitivity and specificity. Copyright © 2013 Wiley Periodicals, Inc.
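
    Each criterion's operating characteristics follow from simple counts over the 56 CIDP cases and 76 disease controls; the counts below (Python) are chosen to reproduce two of the figures quoted above and are otherwise hypothetical:

      def sens_spec(tp, fn, tn, fp):
          # Sensitivity over cases, specificity over controls.
          return tp / (tp + fn), tn / (tn + fp)

      sens, spec = sens_spec(tp=49, fn=7, tn=69, fp=7)  # 56 cases, 76 controls
      print(f"sensitivity {sens:.1%}, specificity {spec:.1%}")  # 87.5%, 90.8%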

  11. Analysis of scattering statistics and governing distribution functions in optical coherence tomography.

    PubMed

    Sugita, Mitsuro; Weatherbee, Andrew; Bizheva, Kostadinka; Popov, Ivan; Vitkin, Alex

    2016-07-01

    The probability density function (PDF) of light scattering intensity can be used to characterize the scattering medium. We have recently shown that in optical coherence tomography (OCT), a PDF formalism can be sensitive to the number of scatterers in the probed scattering volume and can be represented by the K-distribution, a functional descriptor for non-Gaussian scattering statistics. Expanding on this initial finding, here we examine polystyrene microsphere phantoms with different sphere sizes and concentrations, and also human skin and fingernail in vivo. It is demonstrated that the K-distribution offers an accurate representation for the measured OCT PDFs. The behavior of the shape parameter of K-distribution that best fits the OCT scattering results is investigated in detail, and the applicability of this methodology for biological tissue characterization is demonstrated and discussed.
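
    One way to see why a K-distribution arises is the compound picture: exponentially distributed speckle intensity whose local mean itself fluctuates with a gamma-distributed effective number of scatterers. The Monte Carlo sketch below (Python) illustrates that picture only; it is not the paper's fitting procedure. As the shape parameter grows, the normalized second moment approaches 2, the fully developed (Gaussian) speckle value:

      import numpy as np

      rng = np.random.default_rng(0)

      def k_distributed_intensity(shape, mean=1.0, size=100_000):
          local_mean = rng.gamma(shape, mean / shape, size)  # scatterer-number fluctuations
          return rng.exponential(local_mean)                 # speckle on top

      for shape in (1.0, 5.0, 50.0):
          I = k_distributed_intensity(shape)
          print(shape, (I ** 2).mean() / I.mean() ** 2)  # approx. 2 * (1 + 1/shape)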

  12. Bayesian network modelling of upper gastrointestinal bleeding

    NASA Astrophysics Data System (ADS)

    Aisha, Nazziwa; Shohaimi, Shamarina; Adam, Mohd Bakri

    2013-09-01

    Bayesian networks are graphical probabilistic models that represent causal and other relationships between domain variables. In the context of medical decision making, these models have been explored to help in medical diagnosis and prognosis. In this paper, we discuss the Bayesian network formalism in building medical support systems and we learn a tree-augmented naive Bayes network (TAN) from gastrointestinal bleeding data. The accuracy of the TAN in classifying the source of gastrointestinal bleeding (GIB) as upper or lower is evaluated. The TAN achieves a high classification accuracy of 86% and an area under the curve of 92%. A sensitivity analysis of the model shows relatively high levels of entropy reduction for color of the stool, history of gastrointestinal bleeding, consistency, and the ratio of blood urea nitrogen to creatinine. The TAN facilitates the identification of the source of GIB and requires further validation.
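
    A stripped-down relative of the TAN model, plain naive Bayes over two of the predictors named above, shows how such evidence combines; the prior and conditional probabilities are invented for illustration:

      priors = {"upper": 0.5, "lower": 0.5}
      cpt = {  # P(feature value | source), hypothetical
          ("stool", "melena"):    {"upper": 0.7, "lower": 0.2},
          ("bun_cr", "elevated"): {"upper": 0.6, "lower": 0.3},
      }

      def posterior(evidence):
          score = dict(priors)
          for key in evidence:
              for src in score:
                  score[src] *= cpt[key][src]
          z = sum(score.values())
          return {src: p / z for src, p in score.items()}

      print(posterior([("stool", "melena"), ("bun_cr", "elevated")]))
      # -> upper-source probability of about 0.88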

  13. Sensitivity to coincidences and paranormal belief.

    PubMed

    Hadlaczky, Gergö; Westerlund, Joakim

    2011-12-01

    Often it is difficult to find a natural explanation as to why a surprising coincidence occurs. In attempting to find one, people may be inclined to accept paranormal explanations. The objective of this study was to investigate whether people with a lower threshold for being surprised by coincidences have a greater propensity to become believers compared to those with a higher threshold. Participants were exposed to artificial coincidences, which were formally defined as less or more probable, and were asked to provide remarkability ratings. Paranormal belief was measured by the Australian Sheep-Goat Scale. An analysis of the remarkability ratings revealed a significant interaction effect between Sheep-Goat score and type of coincidence, suggesting that people with lower thresholds of surprise, when experiencing coincidences, harbor higher paranormal belief than those with a higher threshold. The theoretical aspects of these findings were discussed.

  14. The legacy of disadvantage: multigenerational neighborhood effects on cognitive ability.

    PubMed

    Sharkey, Patrick; Elwert, Felix

    2011-05-01

    This study examines how the neighborhood environments experienced over multiple generations of a family influence children's cognitive ability. Building on recent research showing strong continuity in neighborhood environments across generations of family members, the authors argue for a revised perspective on "neighborhood effects" that considers the ways in which the neighborhood environment in one generation may have a lingering impact on the next generation. To analyze multigenerational effects, the authors use newly developed methods designed to estimate unbiased treatment effects when treatments and confounders vary over time. The results confirm a powerful link between neighborhoods and cognitive ability that extends across generations. A family's exposure to neighborhood poverty across two consecutive generations reduces child cognitive ability by more than half a standard deviation. A formal sensitivity analysis suggests that results are robust to unobserved selection bias.

  15. Uncertainty Analysis of Sonic Boom Levels Measured in a Simulator at NASA Langley

    NASA Technical Reports Server (NTRS)

    Rathsam, Jonathan; Ely, Jeffry W.

    2012-01-01

    A sonic boom simulator has been constructed at NASA Langley Research Center for testing the human response to sonic booms heard indoors. Like all measured quantities, sonic boom levels in the simulator are subject to systematic and random errors. To quantify these errors, and their net influence on the measurement result, a formal uncertainty analysis is conducted. Knowledge of the measurement uncertainty, or range of values attributable to the quantity being measured, enables reliable comparisons among measurements at different locations in the simulator as well as comparisons with field data or laboratory data from other simulators. The analysis reported here accounts for acoustic excitation from two sets of loudspeakers: one loudspeaker set at the facility exterior that reproduces the exterior sonic boom waveform and a second set of interior loudspeakers for reproducing indoor rattle sounds. The analysis also addresses the effect of pressure fluctuations generated when exterior doors of the building housing the simulator are opened. An uncertainty budget is assembled to document each uncertainty component, its sensitivity coefficient, and the combined standard uncertainty. The latter quantity will be reported alongside measurement results in future research reports to indicate data reliability.
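
    The combination step in such a budget is a root sum of squares of each component's standard uncertainty weighted by its sensitivity coefficient, per standard GUM practice; the component values below (Python) are hypothetical:

      import math

      budget = [
          # (component,              sensitivity coeff., std. uncertainty in dB)
          ("exterior loudspeakers",  1.0,                0.4),
          ("interior rattle system", 1.0,                0.3),
          ("door-induced pressure",  0.5,                0.6),
      ]
      combined = math.sqrt(sum((c * u) ** 2 for _, c, u in budget))
      print(f"combined standard uncertainty: {combined:.2f} dB")  # 0.58 dB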

  16. REQUIREMENTS PATTERNS FOR FORMAL CONTRACTS IN ARCHITECTURAL ANALYSIS AND DESIGN LANGUAGE (AADL) MODELS

    DTIC Science & Technology

    2017-04-17

    Keywords: Cyberphysical Systems, Formal Methods, Requirements Patterns, AADL, Assume Guarantee Reasoning Environment. Rockwell Collins has been addressing these challenges by developing compositional reasoning methods that permit the verification of systems that exceed…

  17. Recognition of Emotions in Autism: A Formal Meta-Analysis

    ERIC Educational Resources Information Center

    Uljarevic, Mirko; Hamilton, Antonia

    2013-01-01

    Determining the integrity of emotion recognition in autistic spectrum disorder is important to our theoretical understanding of autism and to teaching social skills. Previous studies have reported both positive and negative results. Here, we take a formal meta-analytic approach, bringing together data from 48 papers testing over 980 participants…

  18. Establishing the Validity of Recovery from Stuttering without Formal Treatment.

    ERIC Educational Resources Information Center

    Finn, Patrick

    1996-01-01

    This study examined a validation procedure combining self-reports with independent verification to identify cases of recovery from stuttering without formal treatment. A Speech Behavior Checklist was administered to 42 individuals familiar with recovered subjects' past speech. Analysis of subjects' descriptions of their past stuttering was…

  19. Romanian Higher Education as a Facilitator of Romania's Continued Formal and Informal Integration in the European Union

    ERIC Educational Resources Information Center

    Salajan, Florin D.; Chiper, Sorina

    2013-01-01

    This article conducts an exploration of Romania's European integration process through higher education. It contends that integration occurs at "formal" and "informal levels" through institutional norms and human agency, respectively. Through theoretical and empirical analysis, the authors discuss the modalities through which…

  20. Leading the Teacher Team--Balancing between Formal and Informal Power in Program Leadership

    ERIC Educational Resources Information Center

    Högfeldt, Anna-Karin; Malmi, Lauri; Kinnunen, Päivi; Jerbrant, Anna; Strömberg, Emma; Berglund, Anders; Villadsen, Jørgen

    2018-01-01

    This continuous research within Nordic engineering institutions targets the contexts and possibilities for leadership among engineering education program directors. The IFP-model, developed based on analysis of interviews with program leaders in these institutions, visualizes the program director's informal and formal power. The model is presented…

  1. Integrating ethics in design through the value-sensitive design approach.

    PubMed

    Cummings, Mary L

    2006-10-01

    The Accreditation Board of Engineering and Technology (ABET) has declared that to achieve accredited status, 'engineering programs must demonstrate that their graduates have an understanding of professional and ethical responsibility.' Many engineering professors struggle to integrate this required ethics instruction in technical classes and projects because of the lack of a formalized ethics-in-design approach. However, one methodology developed in human-computer interaction research, the Value-Sensitive Design approach, can serve as an engineering education tool that bridges the gap between design and ethics for many engineering disciplines. The three major components of Value-Sensitive Design (conceptual, technical, and empirical) are exemplified through a case study that focuses on the development of a command and control supervisory interface for a military cruise missile.

  2. Does Formal Research Training Lead to Academic Success in Plastic Surgery? A Comprehensive Analysis of U.S. Academic Plastic Surgeons.

    PubMed

    Lopez, Joseph; Ameri, Afshin; Susarla, Srinivas M; Reddy, Sashank; Soni, Ashwin; Tong, J W; Amini, Neda; Ahmed, Rizwan; May, James W; Lee, W P Andrew; Dorafshar, Amir

    2016-01-01

    It is currently unknown whether formal research training has an influence on academic advancement in plastic surgery. The purpose of this study was to determine whether formal research training was associated with higher research productivity, academic rank, and procurement of extramural National Institutes of Health (NIH) funding in plastic surgery, comparing academic surgeons who completed said research training with those without. This was a cross-sectional study of full-time academic plastic surgeons in the United States. The main predictor variable was formal research training, defined as completion of a postdoctoral research fellowship or attainment of a Doctor of Philosophy (PhD). The primary outcome was scientific productivity measured by the Hirsch index (h-index: the number of publications h that each have at least h citations). The secondary outcomes were academic rank and NIH funding. Descriptive, bivariate, and multiple regression statistics were computed. A total of 607 academic surgeons were identified from 94 Accreditation Council for Graduate Medical Education-accredited plastic surgery training programs. In all, 179 (29.5%) surgeons completed formal research training. The mean h-index was 11.7 ± 9.9, and 58 surgeons (9.6%) successfully procured NIH funding. The distribution of academic rank was the following: endowed professor (5.4%), professor (23.9%), associate professor (23.4%), assistant professor (46.0%), and instructor (1.3%). In a multiple regression analysis, completion of formal research training was significantly predictive of a higher h-index and successful procurement of NIH funding. Current evidence demonstrates that formal research training is associated with higher scientific productivity and increased likelihood of future NIH funding. Copyright © 2016 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
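
    The h-index itself is simple to compute from a citation list (Python):

      def h_index(citations):
          # Largest h such that h publications have at least h citations each.
          cits = sorted(citations, reverse=True)
          return sum(1 for i, c in enumerate(cits, start=1) if c >= i)

      print(h_index([25, 8, 5, 3, 3, 1]))  # 3: three papers with >= 3 citations each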

  3. Effectiveness of interventions to promote help-seeking for mental health problems: systematic review and meta-analysis.

    PubMed

    Xu, Ziyan; Huang, Fangfang; Kösters, Markus; Staiger, Tobias; Becker, Thomas; Thornicroft, Graham; Rüsch, Nicolas

    2018-06-01

    Help-seeking is important to access appropriate care and improve mental health. However, individuals often delay or avoid seeking help for mental health problems. Interventions to improve help-seeking have been developed, but their effectiveness is unclear. A systematic review and meta-analysis were therefore conducted to examine the effectiveness of mental health related help-seeking interventions. Nine databases in English, German and Chinese were searched for randomised and non-randomised controlled trials. Effect sizes were calculated for attitudes, intentions and behaviours to seek formal, informal and self-help. Ninety-eight studies with 69 208 participants were included. Interventions yielded significant short-term benefits in terms of formal help-seeking, self-help, as well as mental health literacy and personal stigma. There were also positive long-term effects on formal help-seeking behaviours. The most common intervention types were strategies to increase mental health literacy, destigmatisation (both had positive short-term effects on formal help-seeking behaviours) as well as motivational enhancement (with positive long-term effects on formal help-seeking behaviours). Interventions improved formal help-seeking behaviours if delivered to people with or at risk of mental health problems, but not among children, adolescents or the general public. There was no evidence that interventions increased the use of informal help. Few studies were conducted in low- and middle-income countries (LMICs). This study provides evidence for the effectiveness of help-seeking interventions in terms of improving attitudes, intentions and behaviours to seek formal help for mental health problems among adults. Future research should develop effective interventions to improve informal help-seeking, for specific target groups and in LMICs settings.

  4. Neuromorphic log-domain silicon synapse circuits obey bernoulli dynamics: a unifying tutorial analysis

    PubMed Central

    Papadimitriou, Konstantinos I.; Liu, Shih-Chii; Indiveri, Giacomo; Drakakis, Emmanuel M.

    2014-01-01

    The field of neuromorphic silicon synapse circuits is revisited and a parsimonious mathematical framework able to describe the dynamics of this class of log-domain circuits in the aggregate and in a systematic manner is proposed. Starting from the Bernoulli Cell Formalism (BCF), originally formulated for the modular synthesis and analysis of externally linear, time-invariant logarithmic filters, and by means of the identification of new types of Bernoulli Cell (BC) operators presented here, a generalized formalism (GBCF) is established. The expanded formalism covers two new possible and practical combinations of a MOS transistor (MOST) and a linear capacitor. The corresponding mathematical relations codifying each case are presented and discussed through the tutorial treatment of three well-known transistor-level examples of log-domain neuromorphic silicon synapses. The proposed mathematical tool unifies past analysis approaches of the same circuits under a common theoretical framework. The speed advantage of the proposed mathematical framework as an analysis tool is also demonstrated by a compelling comparative circuit analysis example of high order, where the GBCF and another well-known log-domain circuit analysis method are used for the determination of the input-output transfer function of the high (4th) order topology. PMID:25653579

  6. The relationship between organizational leadership for safety and learning from patient safety events.

    PubMed

    Ginsburg, Liane R; Chuang, You-Ta; Berta, Whitney Blair; Norton, Peter G; Ng, Peggy; Tregunno, Deborah; Richardson, Julia

    2010-06-01

    To examine the relationship between organizational leadership for patient safety and five types of learning from patient safety events (PSEs). Forty-nine general acute care hospitals in Ontario, Canada. A nonexperimental design using cross-sectional surveys of hospital patient safety officers (PSOs) and patient care managers (PCMs). PSOs provided data on organization-level learning from (a) minor events, (b) moderate events, (c) major near misses, (d) major event analysis, and (e) major event dissemination/communication. PCMs provided data on organizational leadership (formal and informal) for patient safety. Hospitals were the unit of analysis. Seemingly unrelated regression was used to examine the influence of formal and informal leadership for safety on the five types of learning from PSEs. The interaction between leadership and hospital size was also examined. Formal organizational leadership for patient safety is an important predictor of learning from minor, moderate, and major near-miss events, and major event dissemination. This relationship is significantly stronger for small hospitals (<100 beds). We find support for the relationship between patient safety leadership and patient safety behaviors such as learning from safety events. Formal leadership support for safety is of particular importance in small organizations where the economic burden of safety programs is disproportionately large and formal leadership is closer to the front lines.

  7. Linking communities to formal health care providers through village health teams in rural Uganda: lessons from linking social capital.

    PubMed

    Musinguzi, Laban Kashaija; Turinawe, Emmanueil Benon; Rwemisisi, Jude T; de Vries, Daniel H; Mafigiri, David K; Muhangi, Denis; de Groot, Marije; Katamba, Achilles; Pool, Robert

    2017-01-11

    Community-based programmes, particularly community health workers (CHWs), have been portrayed as a cost-effective alternative to the shortage of health workers in low-income countries. The literature usually emphasises how easily CHWs link and connect communities to formal health care services. There is little evidence in Uganda to support or dispute such claims. Drawing on the linking social capital framework, this paper examines the claim that village health teams (VHTs), as an example of CHWs, link and connect communities with formal health care services. Data were collected through ethnographic fieldwork undertaken as part of a larger research programme in Luwero District, Uganda, between 2012 and 2014. The main method of data collection was participant observation in events organised by VHTs. In addition, a total of 91 in-depth interviews and 42 focus group discussions (FGDs) were conducted with adult community members as part of the larger project. After preliminary analysis of the data, we conducted an additional six in-depth interviews and three FGDs with VHTs and four FGDs with community members on the role of VHTs. Key informant interviews were conducted with local government staff, health workers, local leaders, and NGO staff with health programmes in Luwero. Thematic analysis was used during data analysis. The ability of VHTs to link communities with formal health care was affected by the stakeholders' perception of their roles. Community members perceive VHTs as working for and under the instructions of "others", which makes them powerless in the formal health care system. One of the challenges associated with VHTs' linking role is the limited support they receive from the government and formal health care providers. Formal health care providers perceived VHTs as seeking special recognition for their services even though they are not "experts". For some health workers, the introduction of VHTs is seen as a ploy by the government to control people and to hide its inability to provide health services. Having received training and initial support from an NGO, VHTs suffered a transition failure from the NGO to the formal public health care structure. As a result, VHTs are entangled in power relations that affect their role of linking community members with formal health care services. We also found that factors such as lack of money for treatment, poor transport networks, the attitudes of health workers, and the existence of multiple health care systems, all of which hinder access to formal health care, cannot be addressed by the VHTs. As the linking social capital framework shows, for VHTs to act effectively as links between the community and formal health care and to harness the resources that exist in institutions beyond the community, it is important to take into account the power relationships embedded in vertical relationships and to forge a partnership between public health providers and the communities they serve. This will ensure strengthened partnerships and improved capacity of local people to leverage resources embedded in vertical power networks.

  8. The effect of musical practice on gesture/sound pairing.

    PubMed

    Proverbio, Alice M; Attardo, Lapo; Cozzi, Matteo; Zani, Alberto

    2015-01-01

    Learning to play a musical instrument is a demanding process requiring years of intense practice. Dramatic changes in brain connectivity, volume, and functionality have been shown in skilled musicians. It is thought that music learning involves the formation of novel audio-visuomotor associations, but not much is known about the gradual acquisition of this ability. In the present study, we investigated whether formal music training enhances audiovisual multisensory processing. To this end, pupils at different stages of education were examined, based on the hypothesis that the strength of audio-visuomotor associations would be augmented as a function of the number of years of conservatory study (expertise). The study participants were violin and clarinet students at pre-academic and academic levels, differing in chronological age and age of acquisition. A violinist and a clarinetist each played the same score, and each participant viewed the video corresponding to his or her instrument. Pitch, intensity, rhythm, and sound duration were matched across instruments. In half of the trials, the soundtrack did not match (in pitch) the corresponding musical gestures. Data analysis indicated a correlation between the number of years of formal training (expertise) and the ability to detect an audiomotor incongruence in music performance (relative to the musical instrument practiced), thus suggesting a direct correlation between knowing how to play and perceptual sensitivity.

  9. From Informal Safety-Critical Requirements to Property-Driven Formal Validation

    NASA Technical Reports Server (NTRS)

    Cimatti, Alessandro; Roveri, Marco; Susi, Angelo; Tonetta, Stefano

    2008-01-01

    Most of the efforts in formal methods have historically been devoted to comparing a design against a set of requirements. The validation of the requirements themselves, however, has often been disregarded, and it can be considered a largely open problem that poses several challenges. The first challenge is that requirements are often written in natural language and may thus contain a high degree of ambiguity. Despite progress in Natural Language Processing techniques, the task of understanding a set of requirements cannot be automated and must be carried out by domain experts, who are typically not familiar with formal languages. Furthermore, in order to retain a direct connection with the informal requirements, the formalization cannot follow standard model-based approaches. The second challenge lies in the formal validation of the requirements. On one hand, it is not even clear which correctness criteria or high-level properties the requirements must fulfill. On the other hand, the expressivity of the language used in the formalization may go beyond the theoretical and/or practical capacity of state-of-the-art formal verification. In order to address these issues, we propose a new methodology that comprises a chain of steps, each supported by a specific tool. The main steps are the following. First, the informal requirements are split into basic fragments, which are classified into categories, and dependency and generalization relationships among them are identified. Second, the fragments are modeled using a visual language such as UML. The UML diagrams are both syntactically restricted (in order to guarantee a formal semantics) and enriched with a highly controlled natural language (to allow for modeling static and temporal constraints). Third, an automatic formal analysis phase iterates over the modeled requirements by combining several complementary techniques: checking consistency; verifying whether the requirements entail desirable properties; verifying whether the requirements are consistent with selected scenarios; diagnosing inconsistencies by identifying inconsistent cores; identifying vacuous requirements; constructing multiple explanations by enabling fault-tree analysis related to particular fault models; and verifying whether the specification is realizable.

  10. Sensitivity and specificity of a briefer version of the Cambridge Cognitive Examination (CAMCog-Short) in the detection of cognitive decline in the elderly: An exploratory study.

    PubMed

    Radanovic, Marcia; Facco, Giuliana; Forlenza, Orestes V

    2018-05-01

    To create a reduced and briefer version of the widely used Cambridge Cognitive Examination (CAMCog) battery as a concise cognitive test for use at the primary and secondary levels of health care to detect cognitive decline. Our aim was to reduce the administration time of the original test while maintaining its diagnostic accuracy. On the basis of the analysis of 835 CAMCog tests performed by 429 subjects (107 controls, 192 mild cognitive impairment [MCI] patients, and 130 dementia patients), we extracted the items that contributed most to intergroup differentiation, according to 2 educational levels (≤8 and >8 y of formal schooling). The final 33-item "low education" and 24-item "high education" versions of the CAMCog-Short correspond to 48.5% and 35% of the original version and yielded similar rates of accuracy: areas under ROC curves (AUC) > 0.9 in differentiating controls from dementia patients and MCI from dementia patients (sensitivities > 75%; specificities > 90%); AUC > 0.7 in differentiating controls from MCI patients (sensitivities > 65%; specificities > 75%). The CAMCog-Short emerges as a brief yet sufficiently accurate screening tool for use in clinical settings. Further prospective studies designed to validate its diagnostic accuracy are needed. Copyright © 2018 John Wiley & Sons, Ltd.
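
    The accuracy figures above are standard ROC quantities. A minimal sketch of how such numbers are computed (hypothetical scores, labels, and cut-off; scikit-learn assumed available; not the study's data or code):

        # Sensitivity, specificity, and AUC of a short test score against a label.
        import numpy as np
        from sklearn.metrics import roc_auc_score

        labels = np.array([0, 0, 0, 1, 1, 1, 1, 0, 1, 0])            # 1 = dementia (hypothetical)
        scores = np.array([28, 30, 25, 12, 15, 10, 18, 27, 14, 29])  # test scores (hypothetical)

        # Lower scores indicate impairment, so negate scores for the ROC convention.
        auc = roc_auc_score(labels, -scores)

        # Sensitivity and specificity at a hypothetical cut-off of 20 points.
        impaired = scores < 20
        sensitivity = (impaired & (labels == 1)).sum() / (labels == 1).sum()
        specificity = (~impaired & (labels == 0)).sum() / (labels == 0).sum()
        print(f"AUC={auc:.2f} sensitivity={sensitivity:.2f} specificity={specificity:.2f}")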

  11. Formalizing the Austrian Procedure Catalogue: A 4-step methodological analysis approach.

    PubMed

    Neururer, Sabrina Barbara; Lasierra, Nelia; Peiffer, Karl Peter; Fensel, Dieter

    2016-04-01

    Due to the lack of an internationally accepted and adopted standard for coding health interventions, Austria has established its own country-specific procedure classification system - the Austrian Procedure Catalogue (APC). Even though the APC is an elaborate coding standard for medical procedures, it has shortcomings that limit its usability. In order to enhance usability and usefulness, especially for research purposes and e-health applications, we developed an ontologized version of the APC. In this paper we present a novel four-step approach for the ontology engineering process, which enables accurate extraction of relevant concepts for medical ontologies from written text. The proposed approach for formalizing the APC consists of the following four steps: (1) comparative pre-analysis, (2) definition analysis, (3) typological analysis, and (4) ontology implementation. The first step contained a comparison of the APC to other well-established or elaborate health intervention coding systems in order to identify strengths and weaknesses of the APC. In the second step, a list of definitions of medical terminology used in the APC was obtained. This list of definitions was used as input for Step 3, in which we identified the most important concepts to describe medical procedures using the qualitative typological analysis approach. The definition analysis as well as the typological analysis are well-known and effective methods used in social sciences, but not commonly employed in the computer science or ontology engineering domain. Finally, this list of concepts was used in Step 4 to formalize the APC. The pre-analysis highlighted the major shortcomings of the APC, such as the lack of formal definition, leading to implicitly available, but not directly accessible information (hidden data), or the poor procedural type classification. After performing the definition and subsequent typological analyses, we were able to identify the following main characteristics of health interventions: (1) Procedural type, (2) Anatomical site, (3) Medical device, (4) Pathology, (5) Access, (6) Body system, (7) Population, (8) Aim, (9) Discipline, (10) Technique, and (11) Body Function. These main characteristics were taken as input of classes for the formalization of the APC. We were also able to identify relevant relations between classes. The proposed four-step approach for formalizing the APC provides a novel, systematically developed, strong framework to semantically enrich procedure classifications. Although this methodology was designed to address the particularities of the APC, the included methods are based on generic analysis tasks, and therefore can be re-used to provide a systematic representation of other procedure catalogs or classification systems and hence contribute towards a universal alignment of such representations, if desired. Copyright © 2015 Elsevier Inc. All rights reserved.

  12. Performance Measurement and Analysis of Certain Search Algorithms

    DTIC Science & Technology

    1979-05-01

    methodology that combines experiment and analysis in complementary and highly specialized and formalized roles, and that the richness of the domains make it ... it is difficult to determine what fraction of the observed differences between the two sets is due to bias in sample set 1, and what fraction simply...given by its characteristic KMIN and KMAX functions. We posit a formal model of "knowledge" itself in which there are at least as many distinct "states

  13. Software Formal Inspections Standard

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This Software Formal Inspections Standard (hereinafter referred to as Standard) is applicable to NASA software. This Standard defines the requirements that shall be fulfilled by the software formal inspections process whenever this process is specified for NASA software. The objective of this Standard is to define the requirements for a process that inspects software products to detect and eliminate defects as early as possible in the software life cycle. The process also provides for the collection and analysis of inspection data to improve the inspection process as well as the quality of the software.

  14. Photon wave function formalism for analysis of Mach–Zehnder interferometer and sum-frequency generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ritboon, Atirach, E-mail: atirach.3.14@gmail.com; Department of Physics, Faculty of Science, Prince of Songkla University, Hat Yai 90112; Daengngam, Chalongrat, E-mail: chalongrat.d@psu.ac.th

    2016-08-15

    Bialynicki-Birula introduced a photon wave function similar to the matter wave function that satisfies the Schrödinger equation. Its second-quantized form can be applied to investigate nonlinear optics at a nearly full quantum level. In this paper, we applied the photon wave function formalism to analyze both linear optical processes in the well-known Mach–Zehnder interferometer and nonlinear optical processes for sum-frequency generation in a dispersive and lossless medium. Results from the photon wave function formalism agree with the well-established Maxwell treatments and existing experimental verifications.
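
    For readers unfamiliar with the linear part of that analysis, a minimal transfer-matrix sketch of a Mach-Zehnder interferometer (my own illustration in ordinary matrix quantum optics, not the photon-wave-function formalism of the paper):

        # Single-photon amplitudes through two 50/50 beam splitters with a
        # relative phase phi in one arm; probabilities show cos^2/sin^2 fringes.
        import numpy as np

        def mach_zehnder_probs(phi):
            bs = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)  # 50/50 beam splitter
            phase = np.diag([1, np.exp(1j * phi)])          # phase shift in arm 2
            out = bs @ phase @ bs @ np.array([1, 0])        # photon enters port 1
            return abs(out[0]) ** 2, abs(out[1]) ** 2

        for phi in (0.0, np.pi / 2, np.pi):
            p1, p2 = mach_zehnder_probs(phi)
            print(f"phi={phi:.2f}: P(port 1)={p1:.2f}, P(port 2)={p2:.2f}")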

  15. Extrusion cast explosive

    DOEpatents

    Scribner, K.J.

    1985-01-29

    Improved, multiphase, high-performance, high-energy, extrusion cast explosive compositions are disclosed, comprising: a crystalline explosive material; an energetic liquid plasticizer; a urethane prepolymer comprising a blend of polyvinyl formal and polycaprolactone; a polyfunctional isocyanate; and a catalyst. These new explosive compositions exhibit higher explosive content, a smooth detonation front, excellent stability over long periods of storage, and lower sensitivity to mechanical stimulants. 1 fig.

  16. Fisher information and Rényi entropies in dynamical systems.

    PubMed

    Godó, B; Nagy, Á

    2017-07-01

    The link between the Fisher information and Rényi entropies is explored. The relationship rests on a thermodynamical formalism built on the Fisher information with a parameter β, which is interpreted as the inverse temperature. The Fisher heat capacity is defined and found to be sensitive to changes of higher order than the analogous quantity in the conventional formulation.
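
    For concreteness, the Rényi entropy referenced above has the standard discrete form sketched below (a generic illustration, not the paper's dynamical-systems construction):

        # H_alpha = log(sum_i p_i^alpha) / (1 - alpha); tends to the Shannon
        # entropy as alpha -> 1.
        import numpy as np

        def renyi_entropy(p, alpha):
            p = np.asarray(p, dtype=float)
            p = p / p.sum()
            if np.isclose(alpha, 1.0):                 # Shannon limit
                return float(-(p * np.log(p)).sum())
            return float(np.log((p ** alpha).sum()) / (1.0 - alpha))

        p = [0.5, 0.25, 0.125, 0.125]
        for a in (0.5, 1.0, 2.0):
            print(f"alpha={a}: H={renyi_entropy(p, a):.4f}")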

  17. Awareness, Interest, Sensitivity, and Advocacy--AISA: An Educational "Take-Away" for Business Ethics Students

    ERIC Educational Resources Information Center

    Smith, Brent

    2009-01-01

    It has been nearly 30 years since business schools began providing formal courses in business ethics to their students. In that time, the public has witnessed countless cases of business misconduct, often performed by these students. Scholars and researchers agree that ethics education is important, yet they disagree about how it should be taught,…

  18. Extrusion cast explosive

    DOEpatents

    Scribner, K.J.

    1985-11-26

    Disclosed are improved, multiphase, high-performance, high-energy, extrusion cast explosive compositions, comprising: a crystalline explosive material; an energetic liquid plasticizer; a urethane prepolymer comprising a blend of polyvinyl formal and polycaprolactone; a polyfunctional isocyanate; and a catalyst. These new explosive compositions exhibit higher explosive content, a smooth detonation front, excellent stability over long periods of storage, and lower sensitivity to mechanical stimulants. 1 fig.

  19. EPA Facility Registry Service (FRS): ICIS

    EPA Pesticide Factsheets

    This web feature service contains location and facility identification information from EPA's Facility Registry Service (FRS) for the subset of facilities that link to the Integrated Compliance Information System (ICIS). When complete, ICIS will provide a database that will contain integrated enforcement and compliance information across most of EPA's programs. The vision for ICIS is to replace EPA's independent databases that contain enforcement data with a single repository for that information. Currently, ICIS contains all Federal Administrative and Judicial enforcement actions and a subset of the Permit Compliance System (PCS), which supports the National Pollutant Discharge Elimination System (NPDES). ICIS exchanges non-sensitive enforcement/compliance activities, non-sensitive formal enforcement actions and NPDES information with FRS. This web feature service contains the enforcement/compliance activities and formal enforcement action related facilities; the NPDES facilities are contained in the PCS_NPDES web feature service. FRS identifies and geospatially locates facilities, sites or places subject to environmental regulations or of environmental interest. Using vigorous verification and data management procedures, FRS integrates facility data from EPA's national program systems, other federal agencies, and State and tribal master facility records and provides EPA with a centrally managed, single source of comprehensive and authoritative information on facilities.

  20. Perspectives of policy and political decision makers on access to formal dementia care: expert interviews in eight European countries.

    PubMed

    Broda, Anja; Bieber, Anja; Meyer, Gabriele; Hopper, Louise; Joyce, Rachael; Irving, Kate; Zanetti, Orazio; Portolani, Elisa; Kerpershoek, Liselot; Verhey, Frans; Vugt, Marjolein de; Wolfs, Claire; Eriksen, Siren; Røsvik, Janne; Marques, Maria J; Gonçalves-Pereira, Manuel; Sjölund, Britt-Marie; Woods, Bob; Jelley, Hannah; Orrell, Martin; Stephan, Astrid

    2017-08-03

    As part of the ActifCare (ACcess to Timely Formal Care) project, we conducted expert interviews in eight European countries with policy and political decision makers, or representatives of relevant institutions, to determine their perspectives on access to formal care for people with dementia and their carers. Each ActifCare country (Germany, Ireland, Italy, The Netherlands, Norway, Portugal, Sweden, United Kingdom) conducted semi-structured interviews with 4-7 experts (total N = 38). The interview guide addressed the topics "Complexity and Continuity of Care", "Formal Services", and "Public Awareness". Country-specific analysis of interview transcripts used an inductive qualitative content analysis. Cross-national synthesis focused on similarities in themes across the ActifCare countries. The analysis revealed ten common themes and two additional sub-themes across countries. Among others, the experts highlighted the need for a coordinating role and the necessity of information to address issues of complexity and continuity of care, demanded person-centred, tailored, and multidisciplinary formal services, and referred to education, mass media and campaigns as means to raise public awareness. Policy and political decision makers appear well acquainted with current discussions among both researchers and practitioners of possible approaches to improve access to dementia care. Experts described pragmatic, realistic strategies to influence dementia care. Suggested innovations concerned how to achieve improved dementia care, rather than transforming the nature of the services provided. Knowledge gained in these expert interviews may be useful to national decision makers when they consider reshaping the organisation of dementia care, and may thus help to develop best-practice strategies and recommendations.

  1. The mathematical bases for qualitative reasoning

    NASA Technical Reports Server (NTRS)

    Kalagnanam, Jayant; Simon, Herbert A.; Iwasaki, Yumi

    1991-01-01

    The practices of researchers in many fields who use qualitative reasoning are summarized and explained. The goal is to gain an understanding of the formal assumptions and mechanisms that underlie this kind of analysis. The explanations given are based on standard mathematical formalisms, particularly on ordinal properties, continuous differentiable functions, and the mathematics of nonlinear dynamic systems.

  2. Proceedings 3rd NASA/IEEE Workshop on Formal Approaches to Agent-Based Systems (FAABS-III)

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael (Editor); Rash, James (Editor); Truszkowski, Walt (Editor); Rouff, Christopher (Editor)

    2004-01-01

    These proceedings contain 18 papers and 4 poster presentations, covering topics such as multi-agent systems, agent-based control, formalism, and norms, as well as physical and biological models of agent-based systems. Applications presented in the proceedings include systems analysis, software engineering, computer networks, and robot control.

  3. Communication Patterns in Normal and Disturbed Families.

    ERIC Educational Resources Information Center

    Angermeyer, Matthias C.; Hecker, Hartmut

    A study of formal communication in 30 families each with a schizophrenic son and 28 families, each with a "normal" son was conducted in Germany. By means of factor analysis four types of formal speech behavior were identified using musical terminology: "staccato," a highly fragmented flow of conversation with high turnover rate; "solo" in which…

  4. A Comparative Analysis of Recidivism with Propensity Score Matching of Informal and Formal Juvenile Probationers

    ERIC Educational Resources Information Center

    Onifade, Eyitayo; Wilkins, Jeffrey; Davidson, William; Campbell, Christina; Petersen, Jodi

    2011-01-01

    Given service costs and evidence suggesting mixing young offenders of different risk levels increases recidivism, this study determined the extent to which differential disposition and risk determined subsequent recidivism. Furthermore, this study entailed a comparison of offense outcomes for informal probationers (n = 581) and formal probationers…

  5. Financial Satisfaction and (In)formal Sector in a Transition Country

    ERIC Educational Resources Information Center

    Ferrer-i-Carbonell, Ada; Gerxhani, Klarita

    2011-01-01

    This paper examines the relationship between working in the formal or informal sector and self-reported individual financial satisfaction in a country in transition. It does so by allowing for individual heterogeneity in terms of perceived financial insecurity and tax morale. The empirical analysis uses a dataset for Albania, a country in…

  6. Developing Formal Object-oriented Requirements Specifications: A Model, Tool and Technique.

    ERIC Educational Resources Information Center

    Jackson, Robert B.; And Others

    1995-01-01

    Presents a formal object-oriented specification model (OSS) for computer software system development that is supported by a tool that automatically generates a prototype from an object-oriented analysis model (OSA) instance, lets the user examine the prototype, and permits the user to refine the OSA model instance to generate a requirements…

  7. The Lived Faculty Experience with Formalized Assessment Initiatives: An Interpretive Phenomenological Analysis

    ERIC Educational Resources Information Center

    Leary, Thomas D., IV.

    2017-01-01

    Institutions of higher education both value and need student assessment data. Faculty, as seen in numerous studies, however, have generally negatively received the formalization and reporting of student assessments to gather this assessment data. If we could better understand faculty experiences and perceptions of student assessment data within…

  8. Counting Strategies and Semantic Analysis as Applied to Class Inclusion. Report No. 61.

    ERIC Educational Resources Information Center

    Wilkinson, Alexander

    This investigation examined strategic and semantic aspects of the answers given by preschool children to class inclusion problems. The Piagetian logical formalism for class inclusion was contrasted with a new, problem processing formalism in three experiments. In experiment 1, it was found that 48 nursery school subjects nearly always performed…

  9. Formal Methods of V&V of Partial Specifications: An Experience Report

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Callahan, John

    1997-01-01

    This paper describes our work exploring the suitability of formal specification methods for independent verification and validation (IV&V) of software specifications for large, safety-critical systems. An IV&V contractor often has to perform rapid analysis on incomplete specifications, with no control over how those specifications are represented. Lightweight formal methods show significant promise in this context, as they offer a way of uncovering major errors without the burden of full proofs of correctness. We describe an experiment in the application of the method SCR to testing for consistency properties of a partial model of requirements for Fault Detection, Isolation and Recovery on the space station. We conclude that the insights gained from formalizing a specification are valuable, and that it is the process of formalization, rather than the end product, that is important. It was only necessary to build enough of the formal model to test the properties in which we were interested. Maintenance of fidelity between multiple representations of the same requirements (as they evolve) is still a problem and deserves further study.

  10. The Markov process admits a consistent steady-state thermodynamic formalism

    NASA Astrophysics Data System (ADS)

    Peng, Liangrong; Zhu, Yi; Hong, Liu

    2018-01-01

    The search for a unified formulation for describing various non-equilibrium processes is a central task of modern non-equilibrium thermodynamics. In this paper, a novel steady-state thermodynamic formalism was established for general Markov processes described by the Chapman-Kolmogorov equation. Furthermore, the corresponding formalisms of steady-state thermodynamics for the master equation and the Fokker-Planck equation can be rigorously derived from it. To be concrete, we proved that (1) in the limit of continuous time, the steady-state thermodynamic formalism for the Chapman-Kolmogorov equation fully agrees with that for the master equation; (2) a similar one-to-one correspondence can be established rigorously between the master equation and the Fokker-Planck equation in the limit of large system size; (3) when a Markov process is restricted to one-step jumps, the steady-state thermodynamic formalism for the Fokker-Planck equation with discrete state variables also reduces to that for the master equation as the discretization step tends to zero. Our analysis indicated that general Markov processes admit a unified and self-consistent non-equilibrium steady-state thermodynamic formalism, regardless of the underlying detailed models.
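
    A small numerical sketch of the objects involved (standard definitions; the rate matrix is hypothetical and the entropy-production formula is the usual Schnakenberg form, not a result quoted from the paper):

        # Stationary distribution and steady-state entropy production rate of a
        # continuous-time Markov chain with generator q (rows sum to zero).
        import numpy as np

        q = np.array([[-0.9, 0.5, 0.4],
                      [ 0.3, -0.5, 0.2],
                      [ 0.1, 0.6, -0.7]])

        # Stationary p solves p q = 0 with sum(p) = 1 (solved by least squares).
        a = np.vstack([q.T, np.ones(3)])
        b = np.array([0.0, 0.0, 0.0, 1.0])
        p, *_ = np.linalg.lstsq(a, b, rcond=None)

        sigma = 0.0
        for i in range(3):
            for j in range(3):
                if i != j:
                    flux, rev = p[i] * q[i, j], p[j] * q[j, i]
                    sigma += 0.5 * (flux - rev) * np.log(flux / rev)
        print("stationary p:", np.round(p, 4), " entropy production:", round(sigma, 4))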

  11. Proceedings of the Sixth NASA Langley Formal Methods (LFM) Workshop

    NASA Technical Reports Server (NTRS)

    Rozier, Kristin Yvonne (Editor)

    2008-01-01

    Today's verification techniques are hard-pressed to scale with the ever-increasing complexity of safety critical systems. Within the field of aeronautics alone, we find the need for verification of algorithms for separation assurance, air traffic control, auto-pilot, Unmanned Aerial Vehicles (UAVs), adaptive avionics, automated decision authority, and much more. Recent advances in formal methods have made verifying more of these problems realistic. Thus we need to continually re-assess what we can solve now and identify the next barriers to overcome. Only through an exchange of ideas between theoreticians and practitioners from academia to industry can we extend formal methods for the verification of ever more challenging problem domains. This volume contains the extended abstracts of the talks presented at LFM 2008: The Sixth NASA Langley Formal Methods Workshop held on April 30 - May 2, 2008 in Newport News, Virginia, USA. The topics of interest that were listed in the call for abstracts were: advances in formal verification techniques; formal models of distributed computing; planning and scheduling; automated air traffic management; fault tolerance; hybrid systems/hybrid automata; embedded systems; safety critical applications; safety cases; accident/safety analysis.

  12. Group adaptation, formal darwinism and contextual analysis.

    PubMed

    Okasha, S; Paternotte, C

    2012-06-01

    We consider the question: under what circumstances can the concept of adaptation be applied to groups, rather than individuals? Gardner and Grafen (2009, J. Evol. Biol. 22: 659-671) develop a novel approach to this question, building on Grafen's 'formal Darwinism' project, which defines adaptation in terms of links between evolutionary dynamics and optimization. They conclude that only clonal groups, and to a lesser extent groups in which reproductive competition is repressed, can be considered as adaptive units. We re-examine the conditions under which the selection-optimization links hold at the group level. We focus on an important distinction between two ways of understanding the links, which have different implications regarding group adaptationism. We show how the formal Darwinism approach can be reconciled with G.C. Williams' famous analysis of group adaptation, and we consider the relationships between group adaptation, the Price equation approach to multi-level selection, and the alternative approach based on contextual analysis. © 2012 The Authors. Journal of Evolutionary Biology © 2012 European Society For Evolutionary Biology.
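
    For reference, the Price equation mentioned in the abstract takes its usual form, with w_i the fitness and z_i the trait value of unit i:

        \Delta \bar{z} \;=\; \frac{\operatorname{Cov}(w_i, z_i)}{\bar{w}} \;+\; \frac{\operatorname{E}(w_i \,\Delta z_i)}{\bar{w}} .

    In multi-level selection applications the covariance term is further partitioned into between-group and within-group components; contextual analysis offers an alternative partition, which is the comparison at stake above.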

  13. A Content Analysis of E-mail Communication between Patients and Their Providers: Patients Get the Message

    PubMed Central

    White, Casey B.; Moyer, Cheryl A.; Stern, David T.; Katz, Steven J.

    2004-01-01

    Objective: E-mail use in the clinical setting has been slow to diffuse for several reasons, including providers' concerns about patients' inappropriate and inefficient use of the technology. This study examined the content of a random sample of patient–physician e-mail messages to determine the validity of those concerns. Design: A qualitative analysis of patient–physician e-mail messages was performed. Measurements: A total of 3,007 patient–physician e-mail messages were collected over 11 months as part of a randomized, controlled trial of a triage-based e-mail system in two primary care centers (including 98 physicians); 10% of messages were randomly selected for review. Messages were coded across such domains as message type, number of requests per e-mail, inclusion of sensitive content, necessity of a physician response, and message tone. Results: The majority (82.8%) of messages addressed a single issue. The most common message types included information updates to the physicians (41.4%), prescription renewals (24.2%), health questions (13.2%), questions about test results (10.9%), referrals (8.8%), “other” (including thank yous, apologies) (8.8%), appointments (5.4%), requests for non-health-related information (4.8%), and billing questions (0.3%). Overall, messages were concise, formal, and medically relevant. Very few (5.1%) included sensitive content, and none included urgent messages. Less than half (43.2%) required a physician response. Conclusion: A triage-based e-mail system promoted e-mail exchanges appropriate for primary care. Most patients adhered to guidelines aimed at focusing content, limiting the number of requests per message, and avoiding urgent requests or highly sensitive content. Thus, physicians' concerns about the content of patients' e-mails may be unwarranted. PMID:15064295

  14. Systematic Review of Health Economic Evaluations of Diagnostic Tests in Brazil: How accurate are the results?

    PubMed

    Oliveira, Maria Regina Fernandes; Leandro, Roseli; Decimoni, Tassia Cristina; Rozman, Luciana Martins; Novaes, Hillegonda Maria Dutilh; De Soárez, Patrícia Coelho

    2017-08-01

    The aim of this study is to identify and characterize the health economic evaluations (HEEs) of diagnostic tests conducted in Brazil, in terms of their adherence to international guidelines for reporting economic studies and specific questions in test accuracy reports. We systematically searched multiple databases, selecting partial and full HEEs of diagnostic tests, published between 1980 and 2013. Two independent reviewers screened articles for relevance and extracted the data. We performed a qualitative narrative synthesis. Forty-three articles were reviewed. The most frequently studied diagnostic tests were laboratory tests (37.2%) and imaging tests (32.6%). Most were non-invasive tests (51.2%) and were performed in the adult population (48.8%). The intended purposes of the technologies evaluated were mostly diagnostic (69.8%), but diagnosis and treatment and screening, diagnosis, and treatment accounted for 25.6% and 4.7%, respectively. Of the reviewed studies, 12.5% described the methods used to estimate the quantities of resources, 33.3% reported the discount rate applied, and 29.2% listed the type of sensitivity analysis performed. Among the 12 cost-effectiveness analyses, only two studies (17%) referred to the application of formal methods to check the quality of the accuracy studies that provided support for the economic model. The existing Brazilian literature on the HEEs of diagnostic tests exhibited reasonably good performance. However, the following points still require improvement: 1) the methods used to estimate resource quantities and unit costs, 2) the discount rate, 3) descriptions of sensitivity analysis methods, 4) reporting of conflicts of interest, 5) evaluations of the quality of the accuracy studies considered in the cost-effectiveness models, and 6) the incorporation of accuracy measures into sensitivity analyses.

  15. Formal methods demonstration project for space applications

    NASA Technical Reports Server (NTRS)

    Divito, Ben L.

    1995-01-01

    The Space Shuttle program is cooperating in a pilot project to apply formal methods to live requirements analysis activities. As one of the larger ongoing Shuttle Change Requests (CRs), the Global Positioning System (GPS) CR involves a significant upgrade to the Shuttle's navigation capability. Shuttles are to be outfitted with GPS receivers, and the primary avionics software will be enhanced to accept GPS-provided positions and integrate them into navigation calculations. Prior to implementing the CR, requirements analysts at Loral Space Information Systems, the Shuttle software contractor, must scrutinize the CR to identify and resolve any requirements issues. We describe an ongoing task of the Formal Methods Demonstration Project for Space Applications whose goal is to find an effective way to use formal methods in the GPS CR requirements analysis phase. This phase is currently under way, and a small team from NASA Langley, ViGYAN Inc., and Loral is now engaged in this task. Background on the GPS CR is provided and an overview of the hardware/software architecture is presented. We outline the approach being taken to formalize the requirements, only a subset of which is being attempted. The approach features the use of the PVS specification language to model 'principal functions', which are major units of Shuttle software. Conventional state machine techniques form the basis of our approach. Given this background, we present interim results based on a snapshot of work in progress. Samples of requirements specifications rendered in PVS are offered as illustrations. We walk through a specification sketch for the principal function known as GPS Receiver State Processing. Results to date are summarized and feedback from Loral requirements analysts is highlighted. Preliminary data are shown comparing issues detected by the formal methods team with those detected using existing requirements analysis methods. We conclude by discussing our plan to complete the remaining activities of this task.

  16. An integrative formal model of motivation and decision making: The MGPM*.

    PubMed

    Ballard, Timothy; Yeo, Gillian; Loft, Shayne; Vancouver, Jeffrey B; Neal, Andrew

    2016-09-01

    We develop and test an integrative formal model of motivation and decision making. The model, referred to as the extended multiple-goal pursuit model (MGPM*), is an integration of the multiple-goal pursuit model (Vancouver, Weinhardt, & Schmidt, 2010) and decision field theory (Busemeyer & Townsend, 1993). Simulations of the model generated predictions regarding the effects of goal type (approach vs. avoidance), risk, and time sensitivity on prioritization. We tested these predictions in an experiment in which participants pursued different combinations of approach and avoidance goals under different levels of risk. The empirical results were consistent with the predictions of the MGPM*. Specifically, participants pursuing 1 approach and 1 avoidance goal shifted priority from the approach to the avoidance goal over time. Among participants pursuing 2 approach goals, those with low time sensitivity prioritized the goal with the larger discrepancy, whereas those with high time sensitivity prioritized the goal with the smaller discrepancy. Participants pursuing 2 avoidance goals generally prioritized the goal with the smaller discrepancy. Finally, all of these effects became weaker as the level of risk increased. We used quantitative model comparison to show that the MGPM* explained the data better than the original multiple-goal pursuit model, and that the major extensions from the original model were justified. The MGPM* represents a step forward in the development of a general theory of decision making during multiple-goal pursuit. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  17. Temporal diagnostic analysis of the SWAT model to detect dominant periods of poor model performance

    NASA Astrophysics Data System (ADS)

    Guse, Björn; Reusser, Dominik E.; Fohrer, Nicola

    2013-04-01

    Hydrological models generally include thresholds and non-linearities, such as snow-rain-temperature thresholds, non-linear reservoirs, infiltration thresholds and the like. When relating observed variables to modelling results, formal methods often calculate performance metrics over long periods, reporting model performance with only a few numbers. Such approaches are not well suited to comparing the dominant processes in reality and in the model, or to understanding when thresholds and non-linearities drive model results. We present a combination of two temporally resolved model diagnostic tools to answer when a model is performing (not so) well and what the dominant processes are during these periods. We look at the temporal dynamics of parameter sensitivities and of model performance to answer this question. For this, the eco-hydrological SWAT model is applied in the Treene lowland catchment in Northern Germany. As a first step, the temporal dynamics of parameter sensitivities are analyzed using the Fourier Amplitude Sensitivity Test (FAST). The sensitivities of the eight model parameters investigated show strong temporal variations. High sensitivities were detected most of the time for two groundwater parameters (GW_DELAY, ALPHA_BF) and one evaporation parameter (ESCO). The periods of high parameter sensitivity can be related to different phases of the hydrograph, with the groundwater parameters dominating in the recession phases and ESCO in baseflow and resaturation periods. Surface runoff parameters show high sensitivities during precipitation events in combination with high soil water contents. The dominant parameters indicate the controlling processes during a given period in the catchment. The second step is the temporal analysis of model performance. For each time step, model performance was characterized with a "finger print" consisting of a large set of performance measures. These finger prints were clustered into four recurring patterns of typical model performance, which can be related to different phases of the hydrograph. Overall, the baseflow cluster has the lowest performance. By combining the periods of poor model performance with the model components that dominate during these phases, the groundwater module was identified as the model part with the highest potential for improvement. The detection of dominant processes in periods of poor model performance enhances the understanding of the SWAT model. Based on this, concepts for improving the SWAT model structure for application in German lowland catchments are derived.
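
    A minimal sketch of FAST-style first-order sensitivity indices using the SALib library (my choice of tool, applied to a hypothetical stand-in model; the study applied FAST to SWAT output at every time step):

        import numpy as np
        from SALib.sample import fast_sampler
        from SALib.analyze import fast

        problem = {
            "num_vars": 3,
            "names": ["GW_DELAY", "ALPHA_BF", "ESCO"],  # parameter names from the abstract
            "bounds": [[0.0, 500.0], [0.0, 1.0], [0.0, 1.0]],
        }

        x = fast_sampler.sample(problem, 1000)          # Fourier-based sampling design

        # Hypothetical stand-in for the model response at one time step; the real
        # analysis evaluates simulated discharge and repeats this per time step.
        y = 0.002 * x[:, 0] + 5.0 * x[:, 1] + 2.0 * x[:, 2] ** 2

        si = fast.analyze(problem, y)
        print(dict(zip(problem["names"], np.round(si["S1"], 3))))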

  18. p-exponent and p-leaders, Part II: Multifractal analysis. Relations to detrended fluctuation analysis

    NASA Astrophysics Data System (ADS)

    Leonarduzzi, R.; Wendt, H.; Abry, P.; Jaffard, S.; Melot, C.; Roux, S. G.; Torres, M. E.

    2016-04-01

    Multifractal analysis studies signals, functions, images or fields via the fluctuations of their local regularity along time or space, which capture crucial features of their temporal/spatial dynamics. It has become a standard signal and image processing tool and is commonly used in numerous applications of different natures. In its common formulation, it relies on the Hölder exponent as a measure of local regularity, which is by nature restricted to positive values and can hence be used for locally bounded functions only. In this contribution, it is proposed to replace the Hölder exponent with a collection of novel exponents for measuring local regularity, the p-exponents. One of the major virtues of p-exponents is that they can potentially take negative values. The corresponding wavelet-based multiscale quantities, the p-leaders, are constructed and shown to permit the definition of a new multifractal formalism, yielding an accurate practical estimation of the multifractal properties of real-world data. Moreover, theoretical and practical connections to and comparisons against another multifractal formalism, referred to as multifractal detrended fluctuation analysis, are achieved. The performance of the proposed p-leader multifractal formalism is studied and compared to previous formalisms using synthetic multifractal signals and images, illustrating its theoretical and practical benefits. The present contribution is complemented by a companion article studying in depth the theoretical properties of p-exponents and the rich classification of local singularities it permits.

  19. Formal analysis of imprecise system requirements with Event-B.

    PubMed

    Le, Hong Anh; Nakajima, Shin; Truong, Ninh Thuan

    2016-01-01

    Formal analysis of the functional properties of system requirements needs precise descriptions. However, stakeholders sometimes describe the system with ambiguous, vague, or fuzzy terms, hence formal frameworks for modeling and verifying such requirements are desirable. Fuzzy If-Then rules have been used to represent imprecise requirements, but verifying their functional properties still needs new methods. In this paper, we propose a refinement-based modeling approach for the specification and verification of such requirements. First, we introduce a representation of imprecise requirements in set theory. Then we make use of Event-B refinement, providing a set of translation rules from Fuzzy If-Then rules to Event-B notations. After that, we show how to verify both safety and eventuality properties with RODIN/Event-B. Finally, we illustrate the proposed method on a Crane Controller example.

  20. Nonlinear estimation of parameters in biphasic Arrhenius plots.

    PubMed

    Puterman, M L; Hrboticky, N; Innis, S M

    1988-05-01

    This paper presents a formal procedure for the statistical analysis of data on the thermotropic behavior of membrane-bound enzymes generated using the Arrhenius equation, and compares the analysis to several alternatives. The data are modeled by a bent hyperbola. Nonlinear regression is used to obtain estimates and standard errors of the intersection of the line segments, defined as the transition temperature, and of the slopes, defined as the energies of activation of the enzyme reaction. The methodology allows formal tests of the adequacy of a biphasic model against either a single straight line or a curvilinear model. Examples are given using data on the thermotropic behavior of pig brain synaptosomal acetylcholinesterase. The data support the biphasic temperature dependence of this enzyme. The methodology represents a formal procedure for the statistical validation of any biphasic data and allows for calculation of all line parameters with estimates of precision.
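
    A sketch of such a fit on synthetic data, using a smooth "bent hyperbola" parameterization of two intersecting line segments (a Bacon-Watts-style form assumed here; the paper's data and exact model are not reproduced):

        # Fit ln(k) vs. 1000/T with two limiting slopes b1 -/+ b2 meeting near
        # x0; gamma controls the smoothness of the bend (gamma -> 0: sharp break).
        import numpy as np
        from scipy.optimize import curve_fit

        def bent_hyperbola(x, b0, b1, b2, x0, gamma):
            return b0 + b1 * (x - x0) + b2 * np.sqrt((x - x0) ** 2 + gamma ** 2)

        rng = np.random.default_rng(0)
        x = np.linspace(3.1, 3.6, 30)                         # 1000/T (1/K)
        lnk = np.where(x < 3.35, 10 - 2.0 * x, 13.35 - 3.0 * x)
        lnk = lnk + rng.normal(0, 0.02, x.size)               # synthetic biphasic data

        p0 = [lnk.mean(), -2.5, -0.5, 3.3, 0.05]
        popt, pcov = curve_fit(bent_hyperbola, x, lnk, p0=p0, maxfev=20000)
        se = np.sqrt(np.diag(pcov))                           # standard errors
        print(f"transition T = {1000 / popt[3]:.1f} K (x0 = {popt[3]:.3f} +/- {se[3]:.3f})")
        print(f"limiting slopes: {popt[1] - popt[2]:.2f}, {popt[1] + popt[2]:.2f}")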

  1. Prediction of skin sensitizers using alternative methods to animal experimentation.

    PubMed

    Johansson, Henrik; Lindstedt, Malin

    2014-07-01

    Regulatory frameworks within the European Union demand that chemical substances are investigated for their ability to induce sensitization, an adverse health effect caused by the human immune system in response to chemical exposure. A recent ban on the use of animal tests within the cosmetics industry has led to an urgent need for alternative animal-free test methods that can be used for assessment of chemical sensitizers. To date, no such alternative assay has yet completed formal validation. However, a number of assays are in development and the understanding of the biological mechanisms of chemical sensitization has greatly increased during the last decade. In this MiniReview, we aim to summarize and give our view on the recent progress of method development for alternative assessment of chemical sensitizers. We propose that integrated testing strategies should comprise complementary assays, providing measurements of a wide range of mechanistic events, to perform well-educated risk assessments based on weight of evidence. © 2014 Nordic Association for the Publication of BCPT (former Nordic Pharmacological Society).

  2. Health-related quality of life in end-stage COPD and lung cancer patients.

    PubMed

    Habraken, Jolanda M; ter Riet, Gerben; Gore, Justin M; Greenstone, Michael A; Weersink, Els J M; Bindels, Patrick J E; Willems, Dick L

    2009-06-01

    Historically, palliative care has been developed for cancer patients and is not yet generally available for patients suffering from chronic life-limiting illnesses, such as chronic obstructive pulmonary disease (COPD). To examine whether COPD patients experience similar or worse disease burden in comparison with non-small cell lung cancer (NSCLC) patients, we compared the health-related quality of life (HRQOL) scores of severe COPD patients with those of advanced NSCLC patients. We also formally updated previous evidence in this area provided by a landmark study published by Gore et al. in 2000. In updating this previous evidence, we addressed the methodological limitations of this study and a number of confounding variables. Eighty-two GOLD IV COPD patients and 19 Stage IIIb or IV NSCLC patients completed generic and disease-specific HRQOL questionnaires. We used an individual patient data meta-analysis to integrate the new and existing evidence (total n=201). Finally, to enhance between-group comparability, we performed a sensitivity analysis using a subgroup of patients with a similar degree of "terminality," namely those who had died within one year after study entry. Considerable differences in HRQOL were found for physical functioning, social functioning, mental health, general health perceptions, dyspnea, activities of daily living, and depression. All differences favored the NSCLC patients. The sensitivity analysis, using only terminal NSCLC and COPD patients, confirmed these findings. In conclusion, end-stage COPD patients experience poor HRQOL comparable to or worse than that of advanced NSCLC patients. We discuss these findings in the light of the notion that these COPD patients may have a similar need for palliative care.

  3. A formalism for the systematic treatment of rapidity logarithms in Quantum Field Theory

    NASA Astrophysics Data System (ADS)

    Chiu, Jui-Yu; Jain, Ambar; Neill, Duff; Rothstein, Ira Z.

    2012-05-01

    Many observables in QCD rely upon the resummation of perturbation theory to retain predictive power. Resummation follows after one factorizes the cross section into the relevant modes. Observables sensitive to soft recoil effects are particularly challenging to factorize and resum since they involve rapidity logarithms. Such observables include: transverse momentum distributions at pT much less than the high-energy scattering scale, jet broadening, exclusive hadroproduction and decay, as well as the Sudakov form factor. In this paper we present a formalism which allows one to factorize and resum the perturbative series for such observables in a systematic fashion, through the notion of a "rapidity renormalization group". That is, a Collins-Soper-like equation is realized as a renormalization group equation, but one with a more universal applicability to observables beyond the traditional transverse momentum dependent parton distribution functions (TMDPDFs) and the Sudakov form factor. This formalism has the feature that it allows one to track the (non-standard) scheme dependence which is inherent in any scenario where one performs a resummation of rapidity divergences. We present a pedagogical introduction to the formalism by applying it to the well-known massive Sudakov form factor. The formalism is then used to study observables of current interest. A factorization theorem for the transverse momentum distribution of Higgs production is presented, along with the result for the resummed cross section at NLL. Our formalism allows one to define gauge-invariant TMDPDFs which are independent of both the hard scattering amplitude and the soft function, i.e. they are universal. We present details of the factorization and resummation of the jet broadening cross section, including a renormalization in p⊥ space. We furthermore show how to regulate and renormalize exclusive processes which are plagued by endpoint singularities in such a way as to allow for a consistent resummation.

  4. The Effect of Instruction on the Acquisition of Conservation of Volume.

    ERIC Educational Resources Information Center

    Butts, David P.; Howe, Ann C.

    Tested was the hypothesis that science instruction based on task analysis will lead to the acquisition of the ability to perform certain Piaget volume tasks which have been characterized as requiring formal operations for their solutions. A Test on Formal Operations and a Learning Hierarchies Test were given to fourth- and sixth-grade students in…

  5. A Descriptive Analysis of Pointing and Oral Movements in a Home Sign System.

    ERIC Educational Resources Information Center

    Torigoe, Takashi; Takei, Wataru

    2002-01-01

    Discussed a social survey on communication among deaf people who had no formal schooling. Participants were deaf individuals who lived in the Okinawa Islands of Japan. Reveals many elderly deaf people had no formal education, no access to conventional sign languages during childhood, and no contact with a Deaf community. Despite this, most…

  6. Reductions of topologically massive gravity I: Hamiltonian analysis of second order degenerate Lagrangians

    NASA Astrophysics Data System (ADS)

    Ćaǧatay Uçgun, Filiz; Esen, Oǧul; Gümral, Hasan

    2018-01-01

    We present Skinner-Rusk and Hamiltonian formalisms of second order degenerate Clément and Sarıoğlu-Tekin Lagrangians. The Dirac-Bergmann constraint algorithm is employed to obtain Hamiltonian realizations of Lagrangian theories. The Gotay-Nester-Hinds algorithm is used to investigate Skinner-Rusk formalisms of these systems.

  7. Growing the Desert: Educational Pathways for Remote Indigenous People. Support Document

    ERIC Educational Resources Information Center

    Collier, Pam; King, Sharijn; Lawrence, Kate; Nangala, Irene; Nangala, Marilyn; Schaber, Evelyn; Young, Metta; Guenther, John; Oster, John

    2007-01-01

    As part of a project funded by the National Centre for Vocational Education and Research (NCVER) and the Desert Knowledge CRC (DKCRC), the "Growing the desert" research team have conducted a broad-ranging analysis of the role of formal and non-formal training opportunities that lead to employment and enterprise opportunities in the…

  8. Factors Related to Educational Participation among Adults. ASHE 1985 Annual Meeting Paper.

    ERIC Educational Resources Information Center

    Graham, Steve

    Students enrolled in formal continuing education classes were studied to determine if their motivations were similar. Eighty-six percent of the students were enrolled in formal credit courses. Students were also compared to graduates who did not continue their education. Included in the analysis were college graduates (23-62 years old) from 46…

  9. Views of HR Specialists on Formal Mentoring: Current Situation and Prospects for the Future

    ERIC Educational Resources Information Center

    Laiho, Maarit; Brandt, Tiina

    2012-01-01

    Purpose: The article aims to report the findings of quantitative and qualitative analysis of the benefits, drawbacks and future prospects of formal mentoring in medium-sized and large organisations. Design/methodology/approach: The empirical data for the study were collected via an online survey, and consist of responses from 152 human resource…

  10. Spontaneous Meta-Arithmetic as a First Step toward School Algebra

    ERIC Educational Resources Information Center

    Caspi, Shai; Sfard, Anna

    2012-01-01

    Taking as the point of departure the vision of school algebra as a formalized meta-discourse of arithmetic, we have been following five pairs of 7th grade students as they progress in algebraic discourse during 24 months, from their informal algebraic talk to the formal algebraic discourse, as taught in school. Our analysis follows changes that…

  11. An astronomer's guide to period searching

    NASA Astrophysics Data System (ADS)

    Schwarzenberg-Czerny, A.

    2003-03-01

    We concentrate on the analysis of unevenly sampled time series, interrupted by periodic gaps, as often encountered in astronomy. While some of our conclusions may appear surprising, all are based on the classical statistical principles of Fisher and his successors. Except for the discussion of resolution issues, it is best for the reader to forget temporarily about Fourier transforms and to concentrate on the problem of fitting a time series with a model curve. According to their statistical content we divide the issues into several sections, consisting of: (ii) statistical and numerical aspects of model fitting, (iii) evaluation of fitted models as hypothesis testing, (iv) the role of orthogonal models in signal detection, (v) conditions for the equivalence of periodograms, and (vi) rating sensitivity by test power. An experienced observer working with individual objects would benefit little from a formalized statistical approach. However, we demonstrate the usefulness of this approach in evaluating the performance of periodograms and in the quantitative design of large variability surveys.
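
    In the same spirit of curve fitting rather than raw Fourier analysis, a small period-search sketch for unevenly sampled data (the astropy Lomb-Scargle implementation is my choice of tool, not code from the article):

        import numpy as np
        from astropy.timeseries import LombScargle

        rng = np.random.default_rng(1)
        t = np.sort(rng.uniform(0, 40, 120))        # uneven sampling times (days)
        true_period = 3.7
        y = 0.8 * np.sin(2 * np.pi * t / true_period) + rng.normal(0, 0.3, t.size)

        ls = LombScargle(t, y)
        frequency, power = ls.autopower()
        best = frequency[np.argmax(power)]
        print(f"best period: {1 / best:.3f} d (true: {true_period} d)")
        # The hypothesis-testing step emphasized above: how likely is a peak
        # this high under pure noise?
        print("false-alarm probability:", ls.false_alarm_probability(power.max()))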

  12. Integrated Evaluation of Reliability and Power Consumption of Wireless Sensor Networks.

    PubMed

    Dâmaso, Antônio; Rosa, Nelson; Maciel, Paulo

    2017-11-05

    Power consumption is a primary concern in Wireless Sensor Networks (WSNs), and a large number of strategies have been proposed to evaluate it. However, those approaches usually consider neither reliability issues nor the power consumption of the applications executing in the network. A central problem is the lack of consolidated solutions that enable us to evaluate the power consumption of applications and of the network stack while also considering their reliability. To solve this problem, we introduce a fully automatic solution for designing power-consumption-aware WSN applications and communication protocols. The solution presented in this paper comprises a methodology to evaluate power consumption based on the integration of formal models, a set of power consumption and reliability models, a sensitivity analysis strategy for selecting WSN configurations, and a toolbox named EDEN that fully supports the proposed methodology. This solution allows the power consumption of WSN applications and of the network stack to be estimated accurately and in an automated way.

  13. Automatic Estimation of Verified Floating-Point Round-Off Errors via Static Analysis

    NASA Technical Reports Server (NTRS)

    Moscato, Mariano; Titolo, Laura; Dutle, Aaron; Munoz, Cesar A.

    2017-01-01

    This paper introduces a static analysis technique for computing formally verified round-off error bounds of floating-point functional expressions. The technique is based on a denotational semantics that computes a symbolic estimation of floating-point round-off errors along with a proof certificate that ensures its correctness. The symbolic estimation can be evaluated on concrete inputs using rigorous enclosure methods to produce formally verified numerical error bounds. The proposed technique is implemented in the prototype research tool PRECiSA (Program Round-off Error Certifier via Static Analysis) and used in the verification of floating-point programs of interest to NASA.
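
    The sketch below is not PRECiSA; it is only a minimal illustration of the kind of a priori round-off bound such tools certify, using the standard first-order bound for a dot product under IEEE-754 double precision. The bound and unit roundoff are textbook facts; the data are synthetic.

    ```python
    # Forward round-off bound for a dot product: every binary64 operation
    # satisfies fl(a op b) = (a op b)(1 + d) with |d| <= u, which gives the
    # first-order a priori bound |fl(x.y) - x.y| <= n * u * sum(|x_i * y_i|).
    import math
    import numpy as np

    u = 2.0 ** -53                                # unit roundoff for binary64

    def dot_error_bound(x, y):
        return len(x) * u * np.sum(np.abs(x * y))

    rng = np.random.default_rng(0)
    x = rng.normal(size=1000)
    y = rng.normal(size=1000)
    ref = math.fsum(x * y)                        # accurately rounded reference
    err = abs(np.dot(x, y) - ref)
    print(f"observed |error| = {err:.3e} <= bound = {dot_error_bound(x, y):.3e}")
    ```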

  14. Data to DecisionsTerminate, Tolerate, Transfer, or Treat

    DTIC Science & Technology

    2016-07-25

    and patching, a risk-based cyber-security decision model that enables a predictive capability to respond to impending cyber-attacks is needed...States. This sensitive data includes business proprietary information on key programs of record and infrastructure, including government documents at...leverage nationally. The Institute for Defense Analyses (IDA) assisted the DoD CIO in formalizing a proof of concept for cyber initiatives and

  15. The African American Student Network: An Informal Networking Group as a Therapeutic Intervention for Black College Students on a Predominantly White Campus

    ERIC Educational Resources Information Center

    Grier-Reed, Tabitha

    2013-01-01

    Informal support networks, as opposed to formal mental health counseling, may represent a culture-specific, indigenous style of coping for Black college students. Using the African American Student Network (or, as students refer to it, AFAM), this article comments on the potential of an informal networking group as a culturally sensitive therapeutic…

  16. Adjoint shape optimization for fluid-structure interaction of ducted flows

    NASA Astrophysics Data System (ADS)

    Heners, J. P.; Radtke, L.; Hinze, M.; Düster, A.

    2018-03-01

    Based on the coupled problem of time-dependent fluid-structure interaction, equations for an appropriate adjoint problem are derived by the consequent use of the formal Lagrange calculus. Solutions of both primal and adjoint equations are computed in a partitioned fashion and enable the formulation of a surface sensitivity. This sensitivity is used in the context of a steepest descent algorithm for the computation of the required gradient of an appropriate cost functional. The efficiency of the developed optimization approach is demonstrated by minimization of the pressure drop in a simple two-dimensional channel flow and in a three-dimensional ducted flow surrounded by a thin-walled structure.
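
    The adjoint trick the abstract relies on, where one extra linear solve yields the full gradient, can be illustrated on a toy discrete model. The sketch below is an assumption-laden stand-in (a 2x2 linear state equation, not the coupled FSI solver) and checks the adjoint gradient against a finite difference.

    ```python
    # Adjoint sensitivity for a discrete model: state equation A(p) u = b,
    # cost J(u) = c^T u. One adjoint solve A^T lam = c gives
    # dJ/dp = -lam^T (dA/dp u), independent of the number of parameters.
    import numpy as np

    def A(p):
        return np.array([[2.0 + p, 1.0],
                         [1.0, 3.0]])

    dA_dp = np.array([[1.0, 0.0],
                      [0.0, 0.0]])
    b = np.array([1.0, 2.0])
    c = np.array([1.0, 1.0])

    p = 0.5
    u = np.linalg.solve(A(p), b)            # primal solve
    lam = np.linalg.solve(A(p).T, c)        # adjoint solve
    dJ_dp = -lam @ (dA_dp @ u)              # gradient for the descent step

    eps = 1e-6                              # finite-difference check
    u2 = np.linalg.solve(A(p + eps), b)
    print(dJ_dp, (c @ u2 - c @ u) / eps)    # the two values should agree
    ```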

  17. Formal language theory: refining the Chomsky hierarchy

    PubMed Central

    Jäger, Gerhard; Rogers, James

    2012-01-01

    The first part of this article gives a brief overview of the four levels of the Chomsky hierarchy, with a special emphasis on context-free and regular languages. It then recapitulates the arguments why neither regular nor context-free grammar is sufficiently expressive to capture all phenomena in the natural language syntax. In the second part, two refinements of the Chomsky hierarchy are reviewed, which are both relevant to the extant research in cognitive science: the mildly context-sensitive languages (which are located between context-free and context-sensitive languages), and the sub-regular hierarchy (which distinguishes several levels of complexity within the class of regular languages). PMID:22688632

  18. Formal language theory: refining the Chomsky hierarchy.

    PubMed

    Jäger, Gerhard; Rogers, James

    2012-07-19

    The first part of this article gives a brief overview of the four levels of the Chomsky hierarchy, with a special emphasis on context-free and regular languages. It then recapitulates the arguments why neither regular nor context-free grammar is sufficiently expressive to capture all phenomena in the natural language syntax. In the second part, two refinements of the Chomsky hierarchy are reviewed, which are both relevant to the extant research in cognitive science: the mildly context-sensitive languages (which are located between context-free and context-sensitive languages), and the sub-regular hierarchy (which distinguishes several levels of complexity within the class of regular languages).

  19. Situation resolution with context-sensitive fuzzy relations

    NASA Astrophysics Data System (ADS)

    Jakobson, Gabriel; Buford, John; Lewis, Lundy

    2009-05-01

    Context plays a significant role in situation resolution by intelligent agents (human or machine), affecting how situations are recognized, interpreted, acted upon, or predicted. Many definitions and formalisms for the notion of context have emerged in various research fields, including psychology, economics, and computer science (computational linguistics, data management, control theory, artificial intelligence, and others). In this paper we examine the role of context in situation management, particularly how to resolve situations that are described using fuzzy (inexact) relations among their components. We propose a language for describing context-sensitive inexact constraints and an algorithm for interpreting relations using inexact (fuzzy) computations.
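
    As a minimal sketch of the kind of inexact computation such an interpreter performs, the snippet below composes two fuzzy relations by the standard max-min rule; the relations and their meanings are invented for illustration.

    ```python
    # Max-min composition of fuzzy relations.
    import numpy as np

    # R[i, j]: degree to which observed feature i matches situation j
    R = np.array([[0.9, 0.2],
                  [0.6, 0.8]])
    # S[j, k]: degree to which situation j warrants response k
    S = np.array([[0.7, 0.1],
                  [0.4, 0.9]])

    # (R o S)[i, k] = max_j min(R[i, j], S[j, k])
    composed = np.max(np.minimum(R[:, :, None], S[None, :, :]), axis=1)
    print(composed)   # degree to which feature i supports response k
    ```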

  20. Older adults' preferences for formal social support of autonomy and dependence in pain: development and validation of a scale.

    PubMed

    Bernardes, Sónia F; Matos, Marta; Goubert, Liesbet

    2017-09-01

    Chronic pain among older adults is common and often disabling. Pain-related formal social support (e.g., provided by staff at day-care centers and nursing homes), and the extent to which it promotes functional autonomy or dependence, plays a significant role in the promotion of older adults' ability to engage in their daily activities. Assessing older adults' preferences for pain-related social support for functional autonomy or dependence could help increase formal social support responsiveness to individuals' needs. Therefore, this study aimed at developing and validating the Preferences for Formal Social Support of Autonomy and Dependence in Pain Inventory (PFSSADI). One hundred and sixty-five older adults with chronic musculoskeletal pain (M age = 79.1, 67.3% women), attending day-care centers, completed the PFSSADI, the revised Formal Social Support for Autonomy and Dependence in Pain Inventory, and a measure of desire for (in)dependence; the PFSSADI was filled out again 6 weeks later. Confirmatory factor analyses showed a structure of two correlated factors (r = .56): (a) preferences for autonomy support (α = .99) and (b) preferences for dependence support (α = .98). The scale showed good test-retest reliability, sensitivity, and discriminant and concurrent validity; the higher the preferences for dependence support, the higher the desire for dependence (r = .33) and the lower the desire for independence (r = -.41). The PFSSADI is an innovative tool, which may contribute to exploring the role of pain-related social support responsiveness in the promotion of older adults' functional autonomy when in pain.
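
    The internal-consistency figures quoted above (α = .99 and .98) come from the paper; the sketch below only illustrates how Cronbach's alpha is computed for a set of scale items, on synthetic data.

    ```python
    # Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances) / var(total)).
    import numpy as np

    def cronbach_alpha(items):
        """items: (n_respondents, k_items) array of item scores."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_var = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_var / total_var)

    rng = np.random.default_rng(0)
    latent = rng.normal(size=(165, 1))                   # one underlying trait
    items = latent + 0.3 * rng.normal(size=(165, 10))    # 10 noisy items
    print(f"alpha = {cronbach_alpha(items):.2f}")
    ```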

  1. "I Treat Him as a Normal Patient": Unveiling the Normalization Coping Strategy Among Formal Caregivers of Persons With Dementia and Its Implications for Person-Centered Care.

    PubMed

    Bentwich, Miriam Ethel; Dickman, Nomy; Oberman, Amitai; Bokek-Cohen, Ya'arit

    2017-11-01

    Currently, 47 million people worldwide have dementia, often requiring paid care by formal caregivers. Research on family caregivers suggests normalization as a model for coping with the negative emotional outcomes of caring for a person with dementia (PWD). This study aims to explore whether the normalization coping mechanism exists among formal caregivers, to reveal differences in its application among caregivers from different cultures, and to examine how this coping mechanism may be related to implementing person-centered care for PWDs. Content analysis of interviews with 20 formal caregivers from three cultural groups (Jews born in Israel [JI], Arabs born in Israel [AI], Russian immigrants [RI]) attending to PWDs. We extracted five normalization modes, revealing that AI caregivers had substantially more utterances of normalization expressions than their colleagues. The normalization modes most commonly expressed by AI caregivers relate to the personhood of PWDs. These normalization modes may enhance formal caregivers' ability to employ person-centered care.

  2. Removing interference-based effects from the infrared transflectance spectra of thin films on metallic substrates: a fast and wave optics conform solution.

    PubMed

    Mayerhöfer, Thomas G; Pahlow, Susanne; Hübner, Uwe; Popp, Jürgen

    2018-06-25

    A hybrid formalism combining elements from Kramers-Kronig based analyses and dispersion analysis was developed, which allows removing interference-based effects in the infrared spectra of layers on highly reflecting substrates. In order to enable highly convenient application, the correction procedure is fully automatized and usually requires less than a minute with non-optimized software on a typical office PC. The formalism was tested with both synthetic and experimental spectra of poly(methyl methacrylate) on gold. The results confirmed the usefulness of the formalism: apparent peak ratios as well as the interference fringes in the original spectra were successfully corrected. Accordingly, the introduced formalism makes it possible to use inexpensive and robust highly reflecting substrates for routine infrared spectroscopic investigations of layers or films whose thickness is limited by the imperative that the reflectance absorbance be smaller than about 1. For thicker films the formalism is still useful, but it requires estimates of the optical constants.

  3. Constellation Probabilistic Risk Assessment (PRA): Design Consideration for the Crew Exploration Vehicle

    NASA Technical Reports Server (NTRS)

    Prassinos, Peter G.; Stamatelatos, Michael G.; Young, Jonathan; Smith, Curtis

    2010-01-01

    Managed by NASA's Office of Safety and Mission Assurance, a pilot probabilistic risk analysis (PRA) of the NASA Crew Exploration Vehicle (CEV) was performed in early 2006. The PRA methods used follow the general guidance provided in the NASA PRA Procedures Guide for NASA Managers and Practitioners. Phased-mission based event trees and fault trees are used to model a lunar sortie mission of the CEV, involving the following phases: launch of a cargo vessel and a crew vessel; rendezvous of these two vessels in low Earth orbit; transit to the Moon; lunar surface activities; ascension from the lunar surface; and return to Earth. The analysis is based upon assumptions, preliminary system diagrams, and failure data that may involve large uncertainties or may lack formal validation. Furthermore, some of the data used were based upon expert judgment or extrapolated from similar components/systems. This paper includes a discussion of the system-level models and provides an overview of the analysis results used to identify insights into CEV risk drivers, and trade and sensitivity studies. Lastly, the PRA model was used to determine changes in risk as the system configurations or key parameters are modified.
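
    As a minimal sketch of the fault-tree arithmetic underlying such a PRA, the snippet below combines independent basic-event probabilities through AND/OR gates; the event names and numbers are illustrative, not CEV data.

    ```python
    # Fault-tree evaluation for independent basic events.
    def p_and(*ps):            # all events must occur
        out = 1.0
        for p in ps:
            out *= p
        return out

    def p_or(*ps):             # at least one event occurs
        out = 1.0
        for p in ps:
            out *= (1.0 - p)
        return 1.0 - out

    engine_fail = p_or(1e-3, 5e-4)                  # ignition or feed fault
    top_event = p_or(p_and(engine_fail, 0.1),       # engine fault not aborted
                     2e-5)                          # other catastrophic faults
    print(f"P(top event) ~ {top_event:.2e}")
    ```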

  4. Modular analysis of biological networks.

    PubMed

    Kaltenbach, Hans-Michael; Stelling, Jörg

    2012-01-01

    The analysis of complex biological networks has traditionally relied on decomposition into smaller, semi-autonomous units such as individual signaling pathways. With the increased scope of systems biology (models), rational approaches to modularization have become an important topic. With increasing acceptance of de facto modularity in biology, widely different definitions of what constitutes a module have sparked controversies. Here, we therefore review prominent classes of modular approaches based on formal network representations. Despite some promising research directions, several important theoretical challenges remain open on the way to formal, function-centered modular decompositions for dynamic biological networks.

  5. [Universalization of health or of social security?].

    PubMed

    Levy-Algazi, Santiago

    2011-01-01

    This article presents an analysis of the architecture of Mexico's health system in light of the country's main economic problem: the failure to achieve a GDP growth rate sufficient to increase real wages and provide workers in formal employment with social security coverage. The analysis describes the relationship between the population's social security coverage and its employment status (formal or informal employment) and the impact of this situation on the health system. It ends with a reform proposal that would give all workers the same social rights, i.e., grant universal social security.

  6. An Energy Efficient Mutual Authentication and Key Agreement Scheme Preserving Anonymity for Wireless Sensor Networks.

    PubMed

    Lu, Yanrong; Li, Lixiang; Peng, Haipeng; Yang, Yixian

    2016-06-08

    WSNs (Wireless sensor networks) are nowadays viewed as a vital portion of the IoT (Internet of Things). Security is a significant issue in WSNs, especially in resource-constrained environments. AKA (Authentication and key agreement) enhances the security of WSNs against adversaries attempting to get sensitive sensor data. Various AKA schemes have been developed for verifying the legitimate users of a WSN. Firstly, we scrutinize Amin-Biswas's recent scheme and demonstrate the major security loopholes in their work. Next, we propose a lightweight AKA scheme, using symmetric key cryptography based on smart cards, which is resilient against all well-known security attacks. Furthermore, we prove that the scheme accomplishes mutual handshake and session key agreement securely between the participants involved under BAN (Burrows, Abadi and Needham) logic. Moreover, formal security analysis and simulations are also conducted using AVISPA (Automated Validation of Internet Security Protocols and Applications) to show that our scheme is secure against active and passive attacks. Additionally, performance analysis shows that our proposed scheme is secure and efficient for resource-constrained WSNs.

  7. An Energy Efficient Mutual Authentication and Key Agreement Scheme Preserving Anonymity for Wireless Sensor Networks

    PubMed Central

    Lu, Yanrong; Li, Lixiang; Peng, Haipeng; Yang, Yixian

    2016-01-01

    WSNs (Wireless sensor networks) are nowadays viewed as a vital portion of the IoT (Internet of Things). Security is a significant issue in WSNs, especially in resource-constrained environments. AKA (Authentication and key agreement) enhances the security of WSNs against adversaries attempting to get sensitive sensor data. Various AKA schemes have been developed for verifying the legitimate users of a WSN. Firstly, we scrutinize Amin-Biswas's recent scheme and demonstrate the major security loopholes in their work. Next, we propose a lightweight AKA scheme, using symmetric key cryptography based on smart cards, which is resilient against all well-known security attacks. Furthermore, we prove that the scheme accomplishes mutual handshake and session key agreement securely between the participants involved under BAN (Burrows, Abadi and Needham) logic. Moreover, formal security analysis and simulations are also conducted using AVISPA (Automated Validation of Internet Security Protocols and Applications) to show that our scheme is secure against active and passive attacks. Additionally, performance analysis shows that our proposed scheme is secure and efficient for resource-constrained WSNs. PMID:27338382
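
    The paper's actual message flow is not reproduced here; the sketch below is only a generic symmetric-key challenge-response with session-key derivation, to illustrate the mutual handshake and key agreement properties such a scheme targets. HMAC-SHA256 and the message layout are assumptions, not the scheme's design.

    ```python
    # Generic symmetric-key mutual authentication with session-key agreement.
    import hmac, hashlib, os

    shared_key = os.urandom(32)          # pre-shared between user and sensor

    def mac(key, *parts):
        return hmac.new(key, b"|".join(parts), hashlib.sha256).digest()

    # user -> sensor: nonce_u ; sensor -> user: nonce_s, proof_s
    nonce_u, nonce_s = os.urandom(16), os.urandom(16)
    proof_s = mac(shared_key, b"sensor", nonce_u, nonce_s)
    # the user recomputes proof_s and checks it in constant time
    assert hmac.compare_digest(proof_s, mac(shared_key, b"sensor", nonce_u, nonce_s))

    # user -> sensor: proof_u ; both sides then derive the session key
    proof_u = mac(shared_key, b"user", nonce_s, nonce_u)
    session_key = mac(shared_key, b"session", nonce_u, nonce_s)
    print("session key agreed:", session_key.hex()[:16], "...")
    ```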

  8. Advanced Weather Awareness and Reporting Enhancements

    NASA Technical Reports Server (NTRS)

    Busquets, Anthony M. (Technical Monitor); Ruokangas, Corinne Clinton; Kelly, Wallace E., III

    2005-01-01

    AWARE (Aviation Weather Awareness and Reporting Enhancements) was a NASA Cooperative Research and Development program conducted jointly by Rockwell Scientific, Rockwell Collins, and NASA. The effort culminated in an enhanced weather briefing and reporting tool prototype designed to integrate graphical and text-based aviation weather data to provide clear situational awareness in the context of a specific pilot, flight and equipment profile. The initial implementation of AWARE was as a web-based preflight planning tool, specifically for general aviation pilots, who do not have access to support such as the dispatchers available for commercial airlines. Initial usability tests showed that for VFR (Visual Flight Rules) pilots, AWARE provided faster and more effective weather evaluation. In a subsequent formal usability test for IFR (Instrument Flight Rules) pilots, all users finished the AWARE tests faster than the parallel DUAT tests, and all subjects graded AWARE higher for effectiveness, efficiency, and usability. The decision analysis basis of AWARE differentiates it from other aviation safety programs, providing analysis of context-sensitive data in a personalized graphical format to aid pilots/dispatchers in their complex flight requirements.

  9. Formal Analysis of Self-Efficacy in Job Interviewee’s Mental State Model

    NASA Astrophysics Data System (ADS)

    Ajoge, N. S.; Aziz, A. A.; Yusof, S. A. Mohd

    2017-08-01

    This paper presents a formal analysis approach for a self-efficacy model of an interviewee's mental state during a job interview session. Self-efficacy is a construct that has been hypothesised to combine with motivation and interviewee anxiety to define the state influence of interviewees. The conceptual model was built based on psychological theories and models related to self-efficacy. A number of well-known relations between events and the course of self-efficacy are summarized from the literature, and it is shown that the proposed model exhibits those patterns. In addition, this formal model has been mathematically analysed to find out which stable situations exist. Finally, it is pointed out how this model can be used in a software agent or robot-based platform. Such a platform can provide an interview coaching approach in which support is provided to users based on their individual mental state during interview sessions.

  10. A formal framework of scenario creation and analysis of extreme hydrological events

    NASA Astrophysics Data System (ADS)

    Lohmann, D.

    2007-12-01

    We present a formal framework for hydrological risk analysis. Different measures of risk are introduced, such as average annual loss and occurrence exceedance probability. These are important measures for, e.g., insurance companies to determine the cost of insurance. One key aspect of investigating the potential consequences of extreme hydrological events (floods and droughts) is the creation of meteorological scenarios that reflect realistic spatial and temporal patterns of precipitation while also having correct local statistics. 100,000 years of these meteorological scenarios are used in a calibrated rainfall-runoff-flood-loss-risk model to produce flood and drought events that have never been observed. The results of this hazard model are statistically analyzed and linked to socio-economic data and vulnerability functions to show the impact of severe flood events. We show results from the Risk Management Solutions (RMS) Europe Flood Model to introduce this formal framework.
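
    As a minimal sketch of the two risk measures named above, the snippet below estimates the average annual loss (AAL) and an occurrence exceedance probability (OEP) from a simulated event catalogue; the loss distribution and threshold are invented, not RMS model output.

    ```python
    # AAL and OEP from a synthetic multi-year event catalogue.
    import numpy as np

    rng = np.random.default_rng(42)
    n_years = 100_000
    events_per_year = rng.poisson(0.4, size=n_years)   # flood counts per year
    year_max = np.zeros(n_years)                       # largest event loss
    year_sum = np.zeros(n_years)                       # total annual loss
    for i, n in enumerate(events_per_year):
        if n:
            losses = rng.lognormal(mean=15, sigma=1.0, size=n)
            year_max[i] = losses.max()
            year_sum[i] = losses.sum()

    aal = year_sum.mean()                              # average annual loss
    threshold = 100e6
    oep = np.mean(year_max > threshold)   # P(largest event in a year > threshold)
    print(f"AAL = {aal:,.0f},  OEP(100M) = {oep:.5f}")
    ```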

  11. Reaction-diffusion-like formalism for plastic neural networks reveals dissipative solitons at criticality

    NASA Astrophysics Data System (ADS)

    Grytskyy, Dmytro; Diesmann, Markus; Helias, Moritz

    2016-06-01

    Self-organized structures in networks with spike-timing dependent synaptic plasticity (STDP) are likely to play a central role for information processing in the brain. In the present study we derive a reaction-diffusion-like formalism for plastic feed-forward networks of nonlinear rate-based model neurons with a correlation sensitive learning rule inspired by and being qualitatively similar to STDP. After obtaining equations that describe the change of the spatial shape of the signal from layer to layer, we derive a criterion for the nonlinearity necessary to obtain stable dynamics for arbitrary input. We classify the possible scenarios of signal evolution and find that close to the transition to the unstable regime metastable solutions appear. The form of these dissipative solitons is determined analytically and the evolution and interaction of several such coexistent objects is investigated.

  12. Formal Specification and Automatic Analysis of Business Processes under Authorization Constraints: An Action-Based Approach

    NASA Astrophysics Data System (ADS)

    Armando, Alessandro; Giunchiglia, Enrico; Ponta, Serena Elisa

    We present an approach to the formal specification and automatic analysis of business processes under authorization constraints based on the action language C. The use of C allows for a natural and concise modeling of the business process and the associated security policy and for the automatic analysis of the resulting specification by using the Causal Calculator (CCALC). Our approach improves upon previous work by greatly simplifying the specification step while retaining the ability to perform a fully automatic analysis. To illustrate the effectiveness of the approach we describe its application to a version of a business process taken from the banking domain and use CCALC to determine resource allocation plans complying with the security policy.

  13. An inventory of reasons for sperm donation in formal versus informal settings.

    PubMed

    Bossema, Ercolie R; Janssens, Pim M W; Treucker, Roswitha G L; Landwehr, Frieda; van Duinen, Kor; Nap, Annemiek W; Geenen, Rinie

    2014-03-01

    The shortage of sperm donors in formal settings (i.e., assisted reproduction clinics) and the availability of sperm donors in informal settings (such as through contacts on the internet) motivated us to investigate why men may prefer either a formal or an informal setting for sperm donation. Interviews with ten sperm donors and non-sperm donors yielded 55 reasons for sperm donation in the two settings. These reasons were categorized according to similarity by 14 sperm donors and non-sperm donors. These categorizations were then structured by means of hierarchical cluster analysis. Reasons favouring formal settings included being legally and physically protected, evading paternal feelings or social consequences, and having a simple, standardized procedure in terms of effort and finances. Reasons favouring informal settings related to engagement, the possibility to choose a recipient, lack of rules and regulations, having contact with the donor child, and having an (intimate) bond with the recipient. The overview of reasons identified may help potential sperm donors decide on whether to donate in a formal or informal setting, and may fuel discussions by professionals about the most appropriate conditions and legislation for sperm donation in formal settings.

  14. Cost implications of organizing nursing home workforce in teams.

    PubMed

    Mukamel, Dana B; Cai, Shubing; Temkin-Greener, Helena

    2009-08-01

    To estimate the costs associated with formal and self-managed daily practice teams in nursing homes. Medicaid cost reports for 135 nursing homes in New York State in 2006 and survey data for 6,137 direct care workers. A retrospective statistical analysis: we estimated hybrid cost functions that include team penetration variables. Inference was based on robust standard errors. Formal and self-managed team penetration (i.e., the percent of staff working in a team) were calculated from survey responses. Annual variable costs, beds, case-mix-adjusted days, admissions, home care visits, outpatient clinic visits, day care days, wages, and ownership were calculated from the cost reports. Formal team penetration was significantly associated with costs, while self-managed team penetration was not. Costs declined with increasing penetration up to 13 percent formal team penetration and increased above this level. Formal teams in nursing homes in the upward-sloping range of the curve were more diverse, with a larger number of participating disciplines, and were more likely to include physicians. Organization of the workforce in formal teams may offer nursing homes a cost-saving strategy. More research is required to understand the relationship between team composition and costs.
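
    A minimal sketch of how a cost-minimizing penetration level falls out of a quadratic hybrid cost function is shown below; the synthetic data are constructed so the turning point lands near the 13 percent reported above, and none of it is the paper's data.

    ```python
    # Log-linear cost function with a quadratic team-penetration term;
    # the turning point of the quadratic is the cost-minimizing penetration.
    import numpy as np

    rng = np.random.default_rng(5)
    n = 135
    pen = rng.uniform(0, 0.4, n)                 # formal team penetration
    days = rng.uniform(2e4, 8e4, n)              # case-mix-adjusted days
    cost = (400 * days * (1 - 0.3 * pen + 1.15 * pen**2)
            * np.exp(rng.normal(0, 0.05, n)))    # U-shape in penetration

    # regression: log(cost) ~ 1 + log(days) + pen + pen^2
    X = np.column_stack([np.ones(n), np.log(days), pen, pen**2])
    beta, *_ = np.linalg.lstsq(X, np.log(cost), rcond=None)
    turning = -beta[2] / (2 * beta[3])           # minimum of the quadratic
    print(f"cost-minimizing formal-team penetration ~ {turning:.0%}")
    ```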

  15. Don't abandon hope all ye who enter here: The protective role of formal mentoring and learning processes on burnout in correctional officers.

    PubMed

    Farnese, M L; Barbieri, B; Bellò, B; Bartone, P T

    2017-01-01

    Within a Job Demands-Resources Model framework, formal mentoring can be conceived as a job resource expressing the organization's support for new members, which may prevent their being at risk for burnout. This research aims at understanding the protective role of formal mentoring against burnout, through its effect of increasing personal learning resources. Specifically, we hypothesized that formal mentoring enhances newcomers' learning about the job and social domains related to the new work context, thus leading to lower burnout. In order to test the hypotheses, a multiple regression analysis using the bootstrapping method was used. Based on a questionnaire administered to 117 correctional officer newcomers who had a formal mentor assigned, our results confirm that formal mentoring exerts a positive influence on newcomers' adjustment, and that this in turn exerts a protective influence against burnout onset by reducing cynicism and interpersonal stress and enhancing the sense of personal accomplishment. Confirming suggestions in the previous literature, supportive mentoring and effective socialization seem to represent job and personal resources that are protective against burnout. This study provides empirical support for this relation in the prison context.

  16. A formal approach to the analysis of clinical computer-interpretable guideline modeling languages.

    PubMed

    Grando, M Adela; Glasspool, David; Fox, John

    2012-01-01

    To develop proof strategies to formally study the expressiveness of workflow-based languages, and to investigate their applicability to clinical computer-interpretable guideline (CIG) modeling languages. We propose two strategies for studying the expressiveness of workflow-based languages based on a standard set of workflow patterns expressed as Petri nets (PNs) and notions of congruence and bisimilarity from process calculus. Proof that a PN-based pattern P can be expressed in a language L can be carried out semi-automatically. Proof that a language L cannot provide the behavior specified by a PN P requires proof by exhaustion based on analysis of cases and cannot be performed automatically. The proof strategies are generic but we exemplify their use with a particular CIG modeling language, PROforma. To illustrate the method we evaluate the expressiveness of PROforma against three standard workflow patterns and compare our results with a previous similar but informal comparison. We show that the two proof strategies are effective in evaluating a CIG modeling language against standard workflow patterns. We find that using the proposed formal techniques we obtain different results to a comparable previously published but less formal study. We discuss the utility of these analyses as the basis for principled extensions to CIG modeling languages. Additionally we explain how the same proof strategies can be reused to prove the satisfaction of patterns expressed in the declarative language CIGDec. The proof strategies we propose are useful tools for analysing the expressiveness of CIG modeling languages. This study provides good evidence of the benefits of applying formal methods of proof over semi-formal ones. Copyright © 2011 Elsevier B.V. All rights reserved.

  17. Modeling Cyber Conflicts Using an Extended Petri Net Formalism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zakrzewska, Anita N; Ferragut, Erik M

    2011-01-01

    When threatened by automated attacks, critical systems that require human-controlled responses have difficulty making optimal responses and adapting protections in real time and may therefore be overwhelmed. Consequently, experts have called for the development of automatic real-time reaction capabilities. However, a technical gap exists in the modeling and analysis of cyber conflicts to automatically understand the repercussions of responses. There is a need for modeling cyber assets that accounts for concurrent behavior, incomplete information, and payoff functions. We address this need by extending the Petri net formalism to allow real-time cyber conflicts to be modeled in a way that is expressive and concise. This formalism includes transitions controlled by players as well as firing rates attached to transitions. This allows us to model both player actions and factors that are beyond the control of players in real time. We show that our formalism is able to represent situational awareness, concurrent actions, incomplete information and objective functions. These factors make it well-suited to modeling cyber conflicts in a way that allows for useful analysis. MITRE has compiled the Common Attack Pattern Enumeration and Classification (CAPEC), an extensive list of cyber attacks at various levels of abstraction. CAPEC includes factors such as attack prerequisites, possible countermeasures, and attack goals. These elements are vital to understanding cyber attacks and to generating the corresponding real-time responses. We demonstrate that the formalism can be used to extract precise models of cyber attacks from CAPEC. Several case studies show that our Petri net formalism is more expressive than other models, such as attack graphs, for modeling cyber conflicts and that it is amenable to exploring cyber strategies.
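
    As a minimal sketch of the extended formalism described above (player-controlled transitions plus firing rates), the snippet below fires one step of a toy attacker-versus-defender net; the net structure and rates are invented for illustration, not taken from the paper.

    ```python
    # A tiny Petri net with player-controlled transitions and firing rates.
    import random

    class Transition:
        def __init__(self, name, pre, post, player=None, rate=1.0):
            self.name, self.pre, self.post = name, pre, post
            self.player = player          # None = environment-controlled
            self.rate = rate              # relative firing rate

    def enabled(marking, t):
        return all(marking.get(p, 0) >= n for p, n in t.pre.items())

    def fire(marking, t):
        m = dict(marking)
        for p, n in t.pre.items():
            m[p] -= n
        for p, n in t.post.items():
            m[p] = m.get(p, 0) + n
        return m

    marking = {"host_clean": 1}
    attack = Transition("exploit", {"host_clean": 1}, {"host_owned": 1},
                        player="attacker", rate=0.3)
    patch = Transition("patch", {"host_clean": 1}, {"host_patched": 1},
                       player="defender", rate=0.7)

    # race between players: sample among enabled transitions by firing rate
    ts = [t for t in (attack, patch) if enabled(marking, t)]
    chosen = random.choices(ts, weights=[t.rate for t in ts])[0]
    print(chosen.name, "->", fire(marking, chosen))
    ```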

  18. Physics graduate students' perceptions of the value of teaching

    NASA Astrophysics Data System (ADS)

    Verley, Jim D.

    An exploratory study was undertaken to examine the perceptions of physics graduate students regarding teaching and the institutional and departmental support for their teaching efforts. A Likert survey was developed and distributed to 249 physics graduate students at four Rocky Mountain institutions of higher education. The survey was distributed through individual physics department email lists to prevent spam and virus blockers from removing the survey email. Of the 249 students receiving the survey, 132 responded (53%), and of those responding, 50% gave written comments about their perceptions of the value of teaching. Two of the institutions surveyed have some level of formal teaching development and assistance programming available to graduate students, and two had no formal programs in place either departmentally or institutionally. Both quantitative and qualitative analyses were used to examine the survey questions, demographic information, and an open-ended question regarding the students' personal perceptions of teaching. Results of the survey analysis indicate that this group of physics graduate students places a high value on the importance of teaching. The results also indicate that while awareness of formal programs to aid teaching was high among students, this did not translate into a high perceived institutional or departmental value placed on teaching. Students at institutions that maintain formal programs for teaching development and support, while aware of those programs, often perceive departmental support for their teaching efforts to be lacking and feel unable to pursue a personal interest in teaching because of a departmental focus on research. Students attending the institution with no formal institutional or departmental teaching programs had the highest perceived departmental value of teaching and support for teaching compared to those at institutions with formal programs in place.

  19. Cross-sectional comparison of point-of-care with laboratory HbA1c in detecting diabetes in real-world remote Aboriginal settings

    PubMed Central

    Marley, Julia V; Oh, May S; Hadgraft, Nyssa; Singleton, Sally; Isaacs, Kim; Atkinson, David

    2015-01-01

    Objectives To determine if point-of-care (POC) glycated haemoglobin (HbA1c) is sufficiently accurate in real-world remote settings to predict or exclude the diagnosis of diabetes based on laboratory HbA1c measurements. Design Cross-sectional study comparing POC capillary HbA1c results with corresponding venous HbA1c levels measured in a reference laboratory. Participants Aboriginal patients ≥15 years old who were due for diabetes screening at the participating clinics were invited to participate. Two hundred and fifty-five Aboriginal participants were enrolled and 241 were included in the analysis. Setting 6 primary healthcare sites in the remote Kimberley region of Western Australia from September 2011 to November 2013. Main outcome measures Concordance and mean differences between POC capillary blood HbA1c measurement and laboratory measurement of venous blood HbA1c level; POC capillary blood HbA1c equivalence value for screening for diabetes or a high risk of developing diabetes; sensitivity, specificity and positive-predictive value for diagnosing and screening for diabetes; barriers to conducting POC testing. Results Concordance between POC and laboratory results was good (ρ=0.88, p<0.001). The mean difference was −0.15% (95% limits of agreement, −0.67% to 0.36%). POC HbA1c measurements ≥6.5%, 48 mmol/mol had a specificity of 98.2% and sensitivity of 73.7% for laboratory measurements ≥6.5%. The POC equivalence value for screening for diabetes or a high risk of developing diabetes was ≥5.7%, 39 mmol/mol (sensitivity, 91%; specificity, 76.7% for laboratory measurements ≥6.0%, 42 mmol/mol). Staff trained by other clinic staff ‘on the job’ performed as well as people with formal accredited training. Staff reported difficulty in maintaining formal accreditation. Conclusions POC HbA1c testing is sufficiently accurate to be a useful component in screening for, and diagnosing, diabetes in remote communities. Limited local training is adequate to produce results comparable to laboratory results and accreditation processes need to reflect this. PMID:25765020
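
    The snippet below sketches the two statistics at the heart of the comparison above: the Bland-Altman mean difference with 95% limits of agreement, and the sensitivity/specificity of the POC cut-off against the laboratory result. It runs on synthetic paired values, not the Kimberley data.

    ```python
    # Bland-Altman agreement plus cut-off sensitivity/specificity.
    import numpy as np

    rng = np.random.default_rng(7)
    lab = rng.normal(6.0, 1.0, 241)                 # laboratory HbA1c (%)
    poc = lab - 0.15 + rng.normal(0, 0.26, 241)     # POC with small negative bias

    diff = poc - lab
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)                   # 95% limits of agreement
    print(f"mean difference = {bias:+.2f}%, "
          f"limits of agreement = {bias - loa:.2f}% to {bias + loa:.2f}%")

    # sensitivity/specificity of POC >= 6.5% against laboratory >= 6.5%
    pred, truth = poc >= 6.5, lab >= 6.5
    sens = (pred & truth).sum() / truth.sum()
    spec = (~pred & ~truth).sum() / (~truth).sum()
    print(f"sensitivity = {sens:.1%}, specificity = {spec:.1%}")
    ```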

  20. Cost Analysis of Non-Formal ETV Systems: A Case Study of the "Extra-Scolaire" System in the Ivory Coast.

    ERIC Educational Resources Information Center

    Klees, Steven J.

    Building on previous evaluations of the ETV systems--both formal and informal--of the Ivory Coast, this study examines the system costs of the "Extra Scolaire" (E/S) program for rural adults. Educational television is utilized through the Ivorian primary system, and battery operated televisions have been widely distributed to schools in…

  1. Prácticas de las Relaciones Formales e Informales de los Profesores con los Estudiantes que Inciden en la Persistencia [Practices of Formal and Informal Relationships of Professors with Students That Influence Persistence]

    ERIC Educational Resources Information Center

    Arroyo, Rosa

    2012-01-01

    This work includes three models of retention for higher education. The study was performed at a private university, with 144 students in the second semester of their second year. The theme is formal and informal relationships between students and their professors. The statistical analysis was performed with SPSS 19 and Stata 12. [The dissertation citations…

  2. In Search for the Ideological Roots of Non-Formal Environment-Related Education in Finland: The Case of Helsinki Humane Society before World War II

    ERIC Educational Resources Information Center

    Rouhiainen, Henna; Vuorisalo, Timo

    2014-01-01

    So far the research on historical environment-related education has focused on scientific rather than "humanistic" (including Romantic and religious) educational approaches and ideologies. In the field of non-formal education implemented by associations these have, however, been common. We used content analysis to study two membership…

  3. Solar Workbook. A Step by Step Procedure for Assessing Solar Opportunities in Existing Buildings in New York State.

    ERIC Educational Resources Information Center

    New York State Energy Office, Albany.

    Determining the feasibility of modifying existing buildings to take advantage of solar energy requires a formal technical study. This workbook is designed to help the reader perform a preliminary analysis to decide whether investing in such a formal study would be beneficial. The book guides the user through an exploration of the general…

  4. Does basing an intervention on behavioral theory enhance the efficacy/effectiveness on dietary change for obesity prevention among children? A systematic review and meta-analysis

    USDA-ARS?s Scientific Manuscript database

    Our purpose was to test whether interventions based on theory, multiple theories, or a formal planning process were more effective in changing fruit and vegetable (FV) consumption among children than interventions with no behavioral theoretical foundation or no formal planning. The authors conducted...

  5. Developing Metrics for Effective Teaching in Extension Education: A Multi-State Factor-Analytic and Psychometric Analysis of Effective Teaching

    ERIC Educational Resources Information Center

    McKim, Billy R.; Lawver, Rebecca G.; Enns, Kellie; Smith, Amy R.; Aschenbrener, Mollie S.

    2013-01-01

    To successfully educate the public about agriculture, food, and natural resources, we must have effective educators in both formal and nonformal settings. Specifically, this study, which is a valuable part of a larger sequential mixed-method study addressing effective teaching in formal and nonformal agricultural education, provides direction for…

  6. Tacit Knowledge and General Qualification: Concepts of Learning in Everyday Life and Formal Education When Work Changes with Examples from Office Work.

    ERIC Educational Resources Information Center

    Olesen, Henning Salling

    An analysis of office work (OW) highlights the relationship between formal vocational qualifications and tacit knowledge gained through experience. In OW, "abstracted" skills (typewriting, correspondence) and theory are taught in schools out of their practical context and can become obsolete because of technological change. Some types of…

  7. A Comparative Study of Pre-Service Education for Preschool Teachers in China and the United States

    ERIC Educational Resources Information Center

    Gong, Xin; Wang, Pengcheng

    2017-01-01

    This study provides a comparative analysis of the pre-service education system for preschool educators in China and the United States. Based on collected data and materials (literature, policy documents, and statistical data), we compare two areas of pre-service training: (1) the formal system; (2) the informal system. In the formal system, most…

  8. Social Network Analysis of Crowds

    DTIC Science & Technology

    2009-08-06

    crowd responses to non-lethal weapons and systems – prior, existing social relationships – real-time social interactions – formal/informal… [slide titles: Crowd Behavior Testbed Layout; Video Cameras on Trusses; Importance of Social Factors] • Response to non-lethal weapons fire depends on social relationships among crowd members – pre-existing personal relationships – ongoing real-time social interactions – formal/informal hierarchies • Therefore

  9. Prescriptive models to support decision making in genetics.

    PubMed

    Pauker, S G; Pauker, S P

    1987-01-01

    Formal prescriptive models can help patients and clinicians better understand the risks and uncertainties they face and better formulate well-reasoned decisions. Using Bayes rule, the clinician can interpret pedigrees, historical data, physical findings and laboratory data, providing individualized probabilities of various diagnoses and outcomes of pregnancy. With the advent of screening programs for genetic disease, it becomes increasingly important to consider the prior probabilities of disease when interpreting an abnormal screening test result. Decision trees provide a convenient formalism for structuring diagnostic, therapeutic and reproductive decisions; such trees can also enhance communication between clinicians and patients. Utility theory provides a mechanism for patients to understand the choices they face and to communicate their attitudes about potential reproductive outcomes in a manner which encourages the integration of those attitudes into appropriate decisions. Using a decision tree, the relevant probabilities and the patients' utilities, physicians can estimate the relative worth of various medical and reproductive options by calculating the expected utility of each. By performing relevant sensitivity analyses, clinicians and patients can understand the impact of various soft data, including the patients' attitudes toward various health outcomes, on the decision making process. Formal clinical decision analytic models can provide deeper understanding and improved decision making in clinical genetics.
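
    As a worked example of the point about prior probabilities in screening, the sketch below applies Bayes rule to the same positive test result under different priors; the sensitivity and specificity used are illustrative numbers, not from the paper.

    ```python
    # Bayes rule for interpreting an abnormal screening result.
    def posterior(prior, sensitivity, specificity):
        """P(disease | positive test)."""
        p_pos = sensitivity * prior + (1 - specificity) * (1 - prior)
        return sensitivity * prior / p_pos

    # the same positive result means very different things at different priors
    for prior in (0.001, 0.01, 0.25):   # population screen vs. affected pedigree
        print(f"prior {prior:>5}: P(disease | +) = "
              f"{posterior(prior, sensitivity=0.99, specificity=0.98):.3f}")
    ```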

  10. Diagnostic accuracy of tests to detect Hepatitis C antibody: a meta-analysis and review of the literature.

    PubMed

    Tang, Weiming; Chen, Wen; Amini, Ali; Boeras, Debi; Falconer, Jane; Kelly, Helen; Peeling, Rosanna; Varsaneux, Olivia; Tucker, Joseph D; Easterbrook, Philippa

    2017-11-01

    Although direct-acting antivirals can achieve sustained virological response rates greater than 90% in Hepatitis C virus (HCV)-infected persons, at present the majority of HCV-infected individuals remain undiagnosed and therefore untreated. While there is a wide range of HCV serological tests available, there is a lack of formal assessment of their diagnostic performance. We undertook a systematic review and meta-analysis to evaluate the diagnostic accuracy of available rapid diagnostic tests (RDTs) and laboratory-based EIA assays in detecting antibodies to HCV. We used the PRISMA checklist and Cochrane guidance to develop our search protocol. The search strategy was registered in PROSPERO (CRD42015023567). The search focused on hepatitis C, diagnostic tests, and diagnostic accuracy within eight databases (MEDLINE, EMBASE, the Cochrane Central Register of Controlled Trials, Science Citation Index Expanded, Conference Proceedings Citation Index-Science, SCOPUS, Literatura Latino-Americana e do Caribe em Ciências da Saúde, and WHO Global Index Medicus). Studies were included if they evaluated an assay to determine the sensitivity and specificity of HCV antibody (HCV Ab) in humans. Two reviewers independently extracted data and performed a quality assessment of the studies using the QUADAS tool. We pooled test estimates using the DerSimonian-Laird method, using the software R and RevMan 5.3. A total of 52 studies were identified that included 52,673 unique test measurements. Based on five studies, the pooled sensitivity and specificity of HCV Ab rapid diagnostic tests (RDTs) were 98% (95% CI 98-100%) and 100% (95% CI 100-100%) compared to an enzyme immunoassay (EIA) reference standard. High HCV Ab RDT sensitivity and specificity were observed across screening populations (general population, high-risk populations, and hospital patients) using different reference standards (EIA, nucleic acid testing, immunoblot). There were insufficient studies to undertake subanalyses based on HIV co-infection. Oral HCV Ab RDTs also had excellent sensitivity and specificity compared to blood reference tests, at 94% (95% CI 93-96%) and 100% (95% CI 100-100%), respectively. Among studies that assessed individual oral RDTs, the eight studies revealed that OraQuick ADVANCE® had a slightly higher sensitivity (98%, 95% CI 97-98%) than the other oral brands (pooled sensitivity: 88%, 95% CI 84-92%). RDTs, including oral tests, have excellent sensitivity and specificity compared to laboratory-based methods for HCV antibody detection across a wide range of settings. Oral HCV Ab RDTs had good sensitivity and specificity compared to blood reference standards.
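
    A minimal sketch of the DerSimonian-Laird random-effects pooling named above is shown below, applied to logit-transformed per-study sensitivities; the study counts are synthetic, not the review's data.

    ```python
    # DerSimonian-Laird random-effects pooling on the logit scale.
    import numpy as np

    def dersimonian_laird(effects, variances):
        w = 1.0 / variances
        fixed = np.sum(w * effects) / np.sum(w)
        q = np.sum(w * (effects - fixed) ** 2)       # Cochran's Q
        k = len(effects)
        tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
        w_star = 1.0 / (variances + tau2)            # random-effects weights
        pooled = np.sum(w_star * effects) / np.sum(w_star)
        se = np.sqrt(1.0 / np.sum(w_star))
        return pooled, se, tau2

    # per-study sensitivity as log-odds, with var = 1/tp + 1/fn
    sens = np.array([0.97, 0.99, 0.96, 0.98, 0.985])
    tp = np.array([190, 410, 110, 260, 330])
    fn = np.round(tp * (1 - sens) / sens)
    logit = np.log(tp / fn)
    var = 1.0 / tp + 1.0 / fn
    pooled, se, tau2 = dersimonian_laird(logit, var)
    p = 1 / (1 + np.exp(-pooled))                    # back to proportion
    print(f"pooled sensitivity = {p:.3f}, tau^2 = {tau2:.3f}")
    ```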

  11. The specification-based validation of reliable multicast protocol: Problem Report. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Wu, Yunqing

    1995-01-01

    Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP multicasting. In this report, we develop formal models for RMP using existing automated verification systems and perform validation on the formal RMP specifications. The validation analysis helped identify some minor specification and design problems. We also use the formal models of RMP to generate a test suite for conformance testing of the implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress of the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding of the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  12. Evidence Arguments for Using Formal Methods in Software Certification

    NASA Technical Reports Server (NTRS)

    Denney, Ewen W.; Pai, Ganesh

    2013-01-01

    We describe a generic approach for automatically integrating the output generated from a formal method/tool into a software safety assurance case, as an evidence argument, by (a) encoding the underlying reasoning as a safety case pattern, and (b) instantiating it using the data produced from the method/tool. We believe this approach not only improves the trustworthiness of the evidence generated from a formal method/tool, by explicitly presenting the reasoning and mechanisms underlying its genesis, but also provides a way to gauge the suitability of the evidence in the context of the wider assurance case. We illustrate our work by application to a real example, an unmanned aircraft system, where we invoke a formal code analysis tool from its autopilot software safety case, automatically transform the verification output into an evidence argument, and then integrate it into the former.

  13. A Formal Semantics for the WS-BPEL Recovery Framework

    NASA Astrophysics Data System (ADS)

    Dragoni, Nicola; Mazzara, Manuel

    While current studies on Web services composition are mostly focused, from the technical viewpoint, on standards and protocols, this work investigates the adoption of formal methods for dependable composition. The Web Services Business Process Execution Language (WS-BPEL), an OASIS standard widely adopted in both academic and industrial environments, is considered as a touchstone for concrete composition languages, and an analysis of its ambiguous Recovery Framework specification is offered. In order to show the use of formal methods, a precise and unambiguous description of its (simplified) mechanisms is provided by means of a conservative extension of the π-calculus. This is intended as a well-known case study providing methodological arguments for the adoption of formal methods in software specification. The aspect of verification is not the main topic of the paper, but some hints are given.

  14. Asymptotic symmetries and geometry on the boundary in the first order formalism

    NASA Astrophysics Data System (ADS)

    Korovin, Yegor

    2018-03-01

    Proper understanding of the geometry on the boundary of a spacetime is a critical step on the way to extending holography to spaces with non-AdS asymptotics. In general the boundary cannot be described in terms of the Riemannian geometry and the first order formalism is more appropriate as we show. We analyze the asymptotic symmetries in the first order formalism for large classes of theories on AdS, Lifshitz or flat space. In all cases the asymptotic symmetry algebra is realized on the first order variables as a gauged symmetry algebra. First order formalism geometrizes and simplifies the analysis. We apply our framework to the issue of scale versus conformal invariance in AdS/CFT and obtain new perspective on the structure of asymptotic expansions for AdS and flat spaces.

  15. Learning Through Experience: Influence of Formal and Informal Training on Medical Error Disclosure Skills in Residents.

    PubMed

    Wong, Brian M; Coffey, Maitreya; Nousiainen, Markku T; Brydges, Ryan; McDonald-Blumer, Heather; Atkinson, Adelle; Levinson, Wendy; Stroud, Lynfa

    2017-02-01

    Residents' attitudes toward error disclosure have improved over time. It is unclear whether this has been accompanied by improvements in disclosure skills. To measure the disclosure skills of internal medicine (IM), paediatrics, and orthopaedic surgery residents, and to explore resident perceptions of formal versus informal training in preparing them for disclosure in real-world practice. We assessed residents' error disclosure skills using a structured role play with a standardized patient in 2012-2013. We compared disclosure skills across programs using analysis of variance. We conducted a multiple linear regression, including data from a historical cohort of IM residents from 2005, to investigate the influence of predictor variables on performance: training program, cohort year, and prior disclosure training and experience. We conducted a qualitative descriptive analysis of data from semistructured interviews with residents to explore resident perceptions of formal versus informal disclosure training. In a comparison of disclosure skills for 49 residents, there was no difference in overall performance across specialties (4.1 to 4.4 of 5, P = .19). In regression analysis, only the current cohort was significantly associated with skill: current residents performed better than a historical cohort of 42 IM residents (P < .001). Qualitative analysis identified the importance of both formal (workshops, morbidity and mortality rounds) and informal (role modeling, debriefing) activities in preparation for disclosure in real-world practice. Residents across specialties have similar skills in the disclosure of errors. Residents identified role modeling and a strong local patient safety culture as key facilitators for disclosure.

  16. Timing of Formal Phase Safety Reviews for Large-Scale Integrated Hazard Analysis

    NASA Technical Reports Server (NTRS)

    Massie, Michael J.; Morris, A. Terry

    2010-01-01

    Integrated hazard analysis (IHA) is a process used to identify and control unacceptable risk. As such, it does not occur in a vacuum. IHA approaches must be tailored to fit the system being analyzed. Physical, resource, organizational and temporal constraints on large-scale integrated systems impose additional direct or derived requirements on the IHA. The timing and interaction between engineering and safety organizations can provide either benefits or hindrances to the overall end product. The traditional approach for formal phase safety review timing and content, which generally works well for small- to moderate-scale systems, does not work well for very large-scale integrated systems. This paper proposes a modified approach to timing and content of formal phase safety reviews for IHA. Details of the tailoring process for IHA will describe how to avoid temporary disconnects in major milestone reviews and how to maintain a cohesive end-to-end integration story particularly for systems where the integrator inherently has little to no insight into lower level systems. The proposal has the advantage of allowing the hazard analysis development process to occur as technical data normally matures.

  17. Using ICT techniques for improving mechatronic systems' dependability

    NASA Astrophysics Data System (ADS)

    Miron, Emanuel; Silva, João P. M. A.; Machado, José; Olaru, Dumitru; Prisacaru, Gheorghe

    2013-10-01

    The use of analysis techniques such as simulation and formal verification for analyzing industrial controllers is complex in an industrial context. This complexity is due to the fact that such techniques sometimes require a high investment in specifically skilled human resources with sufficient theoretical knowledge in those domains. This paper aims, mainly, to show that it is possible to obtain a timed automata model for formal verification purposes from the CAD model of a mechanical component. This systematic approach can be used by companies for the analysis of industrial controller programs. For this purpose, the paper discusses the best way to systematize these procedures; it describes only the first step of a complex process and promotes a discussion of the main difficulties that can be found, along with a possible way to handle those difficulties. A library for formal verification purposes is obtained from original 3D CAD models using a Software as a Service (SaaS) platform, which nowadays has become a common delivery model for many applications because SaaS is typically accessed by users via internet access.

  18. A Visible Light Initiating System for Free Radical Promoted Cationic Polymerization

    DTIC Science & Technology

    1994-02-02

    identify the end groups in the polymer of cyclohexene oxide. N,N-Dimethylnaphthyl amine (DNA), a compound with high fluorescence quantum yield, was used...candidates to be polymerized via a cationic mechanism include cyclic ethers, cyclic formals and acetals, vinyl ethers, and epoxy compounds. Of these...reported sensitizer, bears two dimethylamino groups, is direct evidence that an aromatic amine can be present in a cationically photopolymerizable system

  19. Proceedings of the First NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Denney, Ewen (Editor); Giannakopoulou, Dimitra (Editor); Pasareanu, Corina S. (Editor)

    2009-01-01

    Topics covered include: Model Checking - My 27-Year Quest to Overcome the State Explosion Problem; Applying Formal Methods to NASA Projects: Transition from Research to Practice; TLA+: Whence, Wherefore, and Whither; Formal Methods Applications in Air Transportation; Theorem Proving in Intel Hardware Design; Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering; Model Checking for Autonomic Systems Specified with ASSL; A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process; Software Model Checking Without Source Code; Generalized Abstract Symbolic Summaries; A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing; Component-Oriented Behavior Extraction for Autonomic System Design; Automated Verification of Design Patterns with LePUS3; A Module Language for Typing by Contracts; From Goal-Oriented Requirements to Event-B Specifications; Introduction of Virtualization Technology to Multi-Process Model Checking; Comparing Techniques for Certified Static Analysis; Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder; jFuzz: A Concolic Whitebox Fuzzer for Java; Machine-Checkable Timed CSP; Stochastic Formal Correctness of Numerical Algorithms; Deductive Verification of Cryptographic Software; Coloured Petri Net Refinement Specification and Correctness Proof with Coq; Modeling Guidelines for Code Generation in the Railway Signaling Context; Tactical Synthesis Of Efficient Global Search Algorithms; Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems; and Formal Methods for Automated Diagnosis of Autosub 6000.

  20. Formal caregivers' experiences of aggressive behaviour in older people living with dementia in nursing homes: A systematic review.

    PubMed

    Holst, Adelheid; Skär, Lisa

    2017-12-01

    The purpose of this study was to investigate formal caregivers' experiences of aggressive behaviour in older people living with dementia in nursing homes. Aggressive behaviour symptoms among older people living with dementia are reported to be prevalent. As aggressive behaviour includes both verbal and physical behaviours, such as kicking, hitting and screaming, it causes an increased burden on formal caregivers. Professionals experiencing this aggression perceived it as challenging, causing physical and psychological damage, leading to anger, stress and depression. A systematic review was conducted. A search of published research studies between 2000 and 2015 was conducted using appropriate search terms. Eleven studies were identified and included in this review. The analysis resulted in four categories: formal caregivers' views on triggers of aggression, expressions of aggression, the effect of aggressive behaviours on formal caregivers and formal caregivers' strategies to address aggression. The results show that aggressive behaviour may lead to negative feelings in formal caregivers and nursing home residents. The results of this study suggest that having the ability to identify triggers possibly assists caregivers with addressing aggressive behaviour. Aggressive behaviour might also affect quality of care. Results from this systematic review indicate that caregivers prefer person-centred strategies to handle aggressive behaviour among older people, while the use of pharmaceuticals and coercion strategies is a last resort. © 2017 John Wiley & Sons Ltd.

  1. Use of healthcare services by injured people in Khartoum State, Sudan.

    PubMed

    El Tayeb, Sally; Abdalla, Safa; Van den Bergh, Graziella; Heuch, Ivar

    2015-05-01

    Trauma care is an important factor in preventing death and reducing disability. Injured persons in low- and middle-income countries are expected to use the formal healthcare system in increasing numbers. The objective of this paper is to examine use of healthcare services after injury in Khartoum State, Sudan. A community-based survey using a stratified two-stage cluster sampling technique in Khartoum State was performed. Information on healthcare utilisation was taken from injured people. A logistic regression analysis was used to explore factors affecting the probability of using formal healthcare services. During the 12 months preceding the survey a total of 441 cases of non-fatal injuries occurred, with 260 patients accessing formal healthcare. About a quarter of the injured persons were admitted to hospital. Injured people with primary education were less likely to use formal healthcare compared to those with no education. Formal health services were most used by males and in cases of road traffic injuries. The lowest socio-economic strata were least likely to use formal healthcare. Public health measures and social security should be strengthened by identifying other real barriers that prevent low socio-economic groups from making use of formal healthcare facilities. Integration and collaboration with traditional orthopaedic practitioners are important aspects that need further attention. © The Author 2014. Published by Oxford University Press on behalf of Royal Society of Tropical Medicine and Hygiene.
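
    A minimal sketch of the logistic-regression step described above, in Python with statsmodels; the predictors, coefficients, and data are hypothetical stand-ins for the survey's actual variables and coding.

        # Hedged sketch: simulated survey data, illustrative predictors only.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 441  # non-fatal injuries reported in the survey

        # Hypothetical covariates: sex, road-traffic injury, primary education.
        male = rng.integers(0, 2, n)
        rti = rng.integers(0, 2, n)
        primary_edu = rng.integers(0, 2, n)

        # Hypothetical outcome: used formal healthcare (1) or not (0).
        logit = -0.3 + 0.8 * male + 0.9 * rti - 0.4 * primary_edu
        used_formal = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

        X = sm.add_constant(np.column_stack([male, rti, primary_edu]))
        res = sm.Logit(used_formal, X).fit(disp=0)
        print(np.exp(res.params))  # odds ratios: const, male, rti, primary_edu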

  2. Assessing a change mechanism in a randomized home-visiting trial: Reducing disrupted maternal communication decreases infant disorganization.

    PubMed

    Tereno, Susana; Madigan, Sheri; Lyons-Ruth, Karlen; Plamondon, Andre; Atkinson, Leslie; Guedeney, Nicole; Greacen, Tim; Dugravier, Romain; Saias, Thomas; Guedeney, Antoine

    2017-05-01

    Although randomized intervention trials have been shown to reduce the incidence of disorganized attachment, no studies to date have identified the mechanisms of change responsible for such reductions. Maternal sensitivity has been assessed in various studies and shown to change with intervention, but in the only study to formally assess mediation, changes in maternal sensitivity did not mediate changes in infant security of attachment (Cicchetti, Rogosch, & Toth, 2006). The primary aims of the current randomized controlled intervention trial in a high-risk population were to fill gaps in the literature by assessing (a) whether the intervention reduced disorganization, (b) whether it reduced disrupted maternal communication, and (c) whether reductions in disrupted maternal communication mediated changes in infant disorganization. The results indicated that, compared to controls (n = 52), both infant disorganization and disrupted maternal communication were significantly reduced in the intervention group (n = 65), which received regular home-visiting during pregnancy and the first year of life. Furthermore, reductions in disrupted maternal communication partially accounted for the observed reductions in infant disorganization relative to randomized controls. The results are discussed in relation to the societal cost-effectiveness of early attachment-informed interventions for mothers and infants, as well as the importance of formally assessing underlying mechanisms of change in order to improve and appropriately target preventive interventions.
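
    The trial's mediation logic (intervention lowers disrupted communication, which in turn lowers disorganization) can be illustrated with a product-of-coefficients bootstrap; the data and effect sizes below are simulated assumptions, not the trial's measurements.

        # Hedged sketch: simulated data sized like the trial (65 vs 52).
        import numpy as np

        rng = np.random.default_rng(1)
        n = 117
        treat = np.array([1] * 65 + [0] * 52)

        # Assumed paths: a (treat -> mediator) and b (mediator -> outcome).
        disrupted = 2.0 - 0.8 * treat + rng.normal(0, 1, n)
        disorg = 1.0 + 0.6 * disrupted - 0.2 * treat + rng.normal(0, 1, n)

        def ols(y, X):
            X1 = np.column_stack([np.ones(len(y)), X])
            return np.linalg.lstsq(X1, y, rcond=None)[0]

        def indirect(idx):
            a = ols(disrupted[idx], treat[idx])[1]
            b = ols(disorg[idx], np.column_stack([disrupted[idx], treat[idx]]))[1]
            return a * b  # mediated (indirect) effect

        boots = [indirect(rng.integers(0, n, n)) for _ in range(2000)]
        lo, hi = np.percentile(boots, [2.5, 97.5])
        print(f"indirect effect: {indirect(np.arange(n)):.3f}  95% CI [{lo:.3f}, {hi:.3f}]")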

  3. A neural mediator of human anxiety sensitivity.

    PubMed

    Harrison, Ben J; Fullana, Miquel A; Soriano-Mas, Carles; Via, Esther; Pujol, Jesus; Martínez-Zalacaín, Ignacio; Tinoco-Gonzalez, Daniella; Davey, Christopher G; López-Solà, Marina; Pérez Sola, Victor; Menchón, José M; Cardoner, Narcís

    2015-10-01

    Advances in the neuroscientific understanding of bodily autonomic awareness, or interoception, have led to the hypothesis that human trait anxiety sensitivity (AS), the fear of bodily autonomic arousal, is primarily mediated by the anterior insular cortex. Despite broad appeal, few experimental studies have comprehensively addressed this hypothesis. We recruited 55 individuals exhibiting a range of AS and assessed them with functional magnetic resonance imaging (fMRI) during aversive fear conditioning. For each participant, three primary measures of interest were derived: a trait Anxiety Sensitivity Index score; an in-scanner rating of elevated bodily anxiety sensations during fear conditioning; and a corresponding estimate of whole-brain functional activation to the conditioned versus nonconditioned stimuli. Using a voxel-wise mediation analysis framework, we formally tested for 'neural mediators' of the predicted association between trait AS score and in-scanner anxiety sensations during fear conditioning. Contrary to the anterior insular hypothesis, no evidence of significant mediation was observed for this brain region, which was instead linked to perceived anxiety sensations independently from AS. Evidence for significant mediation was obtained for the dorsal anterior cingulate cortex, a finding that we argue is more consistent with the hypothesized role of human cingulofrontal cortex in conscious threat appraisal processes, including threat overestimation. This study offers an important neurobiological validation of the AS construct and identifies a specific neural substrate that may underlie high AS clinical phenotypes, including but not limited to panic disorder. © 2015 Wiley Periodicals, Inc.

  4. Investigating Cooperative Behavior in Ecological Settings: An EEG Hyperscanning Study.

    PubMed

    Toppi, Jlenia; Borghini, Gianluca; Petti, Manuela; He, Eric J; De Giusti, Vittorio; He, Bin; Astolfi, Laura; Babiloni, Fabio

    2016-01-01

    Coordinated interaction between individuals is fundamental to success in some professions. We report on brain-to-brain cooperative interactions between civil pilots during a simulated flight. We demonstrate for the first time how the combination of neuroelectrical hyperscanning and intersubject connectivity can provide indicators sensitive to the degree of synchronization between humans during a highly demanding task performed in an ecological environment. Our results show that intersubject connectivity was able to i) characterize the degree of cooperation between pilots in different phases of the flight, and ii) highlight the role of specific brain macro-areas in cooperative behavior. During the most cooperative flight phases, pilots showed dense patterns of interbrain connectivity, mainly linking frontal and parietal brain areas, whereas the number of interbrain connections fell close to zero in the non-cooperative phase. The reliability of the interbrain connectivity patterns was verified against a baseline condition of formal couples, i.e. pilots paired offline for the connectivity analysis but not recorded simultaneously during the flight. Interbrain density was significantly higher in real couples than in formal couples during the cooperative flight phases. These results demonstrate that the description of brain networks underlying cooperation can benefit from a hyperscanning approach: interbrain connectivity was more informative for investigating cooperative behavior than established EEG signal-processing methodologies applied at the single-subject level.
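
    A toy version of the real-couple versus formal-couple comparison, assuming spectral coherence as the coupling measure and simulated signals; the study's actual intersubject connectivity estimator is more elaborate.

        # Hedged sketch: two "pilots" sharing a task-driven 10 Hz component,
        # versus a "formal couple" member recorded in another session.
        import numpy as np
        from scipy.signal import coherence

        rng = np.random.default_rng(2)
        fs = 256
        t = np.arange(0, 30, 1 / fs)            # 30 s of simulated EEG

        shared = np.sin(2 * np.pi * 10 * t)     # common task-driven rhythm
        pilot_a = shared + rng.normal(0, 1, t.size)
        pilot_b = shared + rng.normal(0, 1, t.size)
        stranger = rng.normal(0, 1, t.size)     # not recorded with pilot_a

        f, c_real = coherence(pilot_a, pilot_b, fs=fs, nperseg=512)
        f, c_formal = coherence(pilot_a, stranger, fs=fs, nperseg=512)

        band = (f >= 8) & (f <= 12)
        print(f"real couple   (8-12 Hz): {c_real[band].mean():.3f}")
        print(f"formal couple (8-12 Hz): {c_formal[band].mean():.3f}")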

  5. Investigating Cooperative Behavior in Ecological Settings: An EEG Hyperscanning Study

    PubMed Central

    Petti, Manuela; He, Eric J.; De Giusti, Vittorio; He, Bin; Astolfi, Laura; Babiloni, Fabio

    2016-01-01

    Coordinated interaction between individuals is fundamental to success in some professions. We report on brain-to-brain cooperative interactions between civil pilots during a simulated flight. We demonstrate for the first time how the combination of neuroelectrical hyperscanning and intersubject connectivity can provide indicators sensitive to the degree of synchronization between humans during a highly demanding task performed in an ecological environment. Our results show that intersubject connectivity was able to i) characterize the degree of cooperation between pilots in different phases of the flight, and ii) highlight the role of specific brain macro-areas in cooperative behavior. During the most cooperative flight phases, pilots showed dense patterns of interbrain connectivity, mainly linking frontal and parietal brain areas, whereas the number of interbrain connections fell close to zero in the non-cooperative phase. The reliability of the interbrain connectivity patterns was verified against a baseline condition of formal couples, i.e. pilots paired offline for the connectivity analysis but not recorded simultaneously during the flight. Interbrain density was significantly higher in real couples than in formal couples during the cooperative flight phases. These results demonstrate that the description of brain networks underlying cooperation can benefit from a hyperscanning approach: interbrain connectivity was more informative for investigating cooperative behavior than established EEG signal-processing methodologies applied at the single-subject level. PMID:27124558

  6. Recent technological developments: in situ histopathological interrogation of surgical tissues and resection margins

    PubMed Central

    Upile, Tahwinder; Fisher, Cyril; Jerjes, Waseem; El Maaytah, Mohammed; Singh, Sandeep; Sudhoff, Holger; Searle, Adam; Archer, Daniel; Michaels, Leslie; Hopper, Colin; Rhys-Evans, Peter; Howard, David; Wright, Anthony

    2007-01-01

    Objectives The tumour margin is an important surgical concept significantly affecting patient morbidity and mortality. In this prospective study we aimed to apply the microendoscope to tissue margins from patients undergoing surgery for oral cancer, in vivo and ex vivo, and to compare it with the gold standard of paraffin-wax histology, with inter-observer agreement measured; we also aimed to present the surgical pathologist with a practical guide to the everyday use of the microendoscope in both the clinical and surgical fields. Materials and methods Forty patients undergoing resection of oral squamous cell carcinoma were recruited. The surgical margin was first marked by the operator, followed by microendoscopic assessment. Biopsies were taken from areas suggestive of close or positive margins after microendoscopic examination. These histological samples were later scrutinized formally and the resection margins revisited accordingly when necessary. Results Using the microendoscope we report our experience in the determination of surgical margins at operation and their later comparison with frozen-section and paraffin-section ("gold standard") margins. We were able to obtain a sensitivity of 95% and a specificity of 90%. Inter-observer kappa scores comparing the microendoscope with formal histological analysis of normal and abnormal mucosa were 0.85. Conclusion The advantage of this technique is that a large area of mucosa can be sampled and any histomorphological changes can be visualized in real time, allowing the operator to make important informed decisions with regard to the intra-operative resection margin at the time of surgery. PMID:17331229
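
    The reported accuracy figures follow from a 2x2 table; the counts below are hypothetical, but chosen so the formulas reproduce the quoted 95% sensitivity, 90% specificity, and kappa of 0.85.

        # Hedged sketch: hypothetical margin counts, not the study's raw data.
        tp, fn = 19, 1   # histologically abnormal margins: flagged / missed
        tn, fp = 18, 2   # histologically normal margins: cleared / flagged

        sensitivity = tp / (tp + fn)            # 0.95
        specificity = tn / (tn + fp)            # 0.90

        # Cohen's kappa: agreement beyond chance between the microendoscope
        # and formal histology.
        n = tp + fn + tn + fp
        p_obs = (tp + tn) / n
        p_chance = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
        kappa = (p_obs - p_chance) / (1 - p_chance)   # 0.85
        print(sensitivity, specificity, round(kappa, 2))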

  7. Diagnostic accuracy and limitations of post-mortem MRI for neurological abnormalities in fetuses and children.

    PubMed

    Arthurs, O J; Thayyil, S; Pauliah, S S; Jacques, T S; Chong, W K; Gunny, R; Saunders, D; Addison, S; Lally, P; Cady, E; Jones, R; Norman, W; Scott, R; Robertson, N J; Wade, A; Chitty, L; Taylor, A M; Sebire, N J

    2015-08-01

    To assess the diagnostic accuracy of non-invasive cerebral post-mortem magnetic resonance imaging (PMMRI) for cerebral and neurological abnormalities in a series of fetuses and children, compared against conventional autopsy. Institutional ethics approval and parental consent were obtained. Pre-autopsy cerebral PMMRI was performed in a sequential prospective cohort (n = 400) of fetuses (n = 277; 185 at ≤24 weeks and 92 at >24 weeks gestation) and children aged <16 years (n = 123). PMMRI and conventional autopsy findings were reported blinded and independently of each other. Cerebral PMMRI had sensitivities and specificities (95% confidence interval) of 88.4% (75.5 to 94.9) and 95.2% (92.1 to 97.1), respectively, for cerebral malformations; 100% (83.9 to 100) and 99.1% (97.2 to 99.7) for major intracranial bleeds; and 87.5% (80.1 to 92.4) and 74.1% (68 to 79.4) for overall brain pathology. Formal neuropathological examination was non-diagnostic due to maceration/autolysis in 43/277 (16%) fetuses; of these, cerebral PMMRI provided clinically important information in 23 (53%). The sensitivity of PMMRI for detecting significant ante-mortem ischaemic injury was only 68% (48.4 to 82.8) overall. PMMRI is an accurate investigational technique for identifying significant neuropathology in fetuses and children, and may provide important information even in cases where autolysis prevents formal neuropathological examination; however, PMMRI is less sensitive at detecting hypoxic-ischaemic brain injury, and may not detect rarer disorders not encountered in this study. Copyright © 2015 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.

  8. Software Design for Real-Time Systems on Parallel Computers: Formal Specifications.

    DTIC Science & Technology

    1996-04-01

    This research investigated the important issues related to the analysis and design of real-time systems targeted to parallel architectures. In particular, the software specification models for real-time systems on parallel architectures were evaluated. A survey of current formal methods for uniprocessor real-time system specifications was conducted to determine their extensibility in specifying real-time systems on parallel architectures.

  9. Embracing the Devil: An Analysis of the Formal Adoption of Red Teaming in the Security Planning for Major Events

    DTIC Science & Technology

    2017-03-01

    little hope of a better solution, low self-esteem temporarily induced by recent failures, and difficulties in determining feasible alternatives... The ability to think creatively and communicate potentially negative findings effectively are unique skills improved with formal training...

  10. Determinants of Salary Growth in Shenzhen, China: An Analysis of Formal Education, On-the-Job Training, and Adult Education with a Three-Level Model.

    ERIC Educational Resources Information Center

    Xiao, Jin

    2002-01-01

    Uses hierarchical linear model to estimate the effects of three forms of human capital on employee salary in China: Formal education, employer-provided on-the-job training, and adult education. Finds, for example, that employees' experience in changing production technology and on-the-job training are positively associated with salary increases…

  11. Language of Mechanisms: Exam Analysis Reveals Students' Strengths, Strategies, and Errors When Using the Electron-Pushing Formalism (Curved Arrows) in New Reactions

    ERIC Educational Resources Information Center

    Flynn, Alison B.; Featherstone, Ryan B.

    2017-01-01

    This study investigated students' successes, strategies, and common errors in their answers to questions that involved the electron-pushing (curved arrow) formalism (EPF), part of organic chemistry's language. We analyzed students' answers to two question types on midterms and final exams: (1) draw the electron-pushing arrows of a reaction step,…

  12. The Economic Aspects of Non-Formal Education: A Selected Annotated Bibliography. Program of Studies in Non-Formal Education, Supplementary Series. Paper No. 3.

    ERIC Educational Resources Information Center

    Mannan, M. A.

    The 303 items in the annotated bibliography are arrayed under four headings: (1) general literature--economic issues, (2) general literature--nonformal education, (3) economics of nonformal education (including cost-benefit analysis, investment and return in human capital, and economics of on-the-job training and retraining), and (4) planning and…

  13. Student approaches for learning in medicine: what does it tell us about the informal curriculum?

    PubMed

    Zhang, Jianzhen; Peterson, Raymond F; Ozolins, Ieva Z

    2011-10-21

    It has long been acknowledged that medical students frequently focus their learning on that which will enable them to pass examinations, and that they use a range of study approaches and resources in preparing for their examinations. A recent qualitative study identified that in addition to the formal curriculum, students are using a range of resources and study strategies which could be attributed to the informal curriculum. What is not clearly established is the extent to which these informal learning resources and strategies are utilized by medical students. The aim of this study was to establish the extent to which students in a graduate-entry medical program use various learning approaches to assist their learning and preparation for examinations, apart from those resources offered as part of the formal curriculum. A validated survey instrument was administered to 522 medical students. Factor analysis, internal consistency testing, descriptive analyses, and comparisons with demographic variables were completed. The factor analysis identified eight scales with acceptable levels of internal consistency, with alpha coefficients between 0.72 and 0.96. Nearly 80% of the students reported that they were overwhelmed by the amount of work perceived necessary to complete the formal curriculum, 74.3% believed that the informal learning approaches helped them pass the examinations, and 61.3% believed that these approaches prepared them to be good doctors. Informal learning activities utilized by students included using past students' notes (85.8%) and PBL tutor guides (62.7%), taking part in self-organised study groups (62.6%), and attending peer-led tutorials (60.2%). Almost all students accessed the formal school resources for at least 10% of their study time. Students in the first year of the program were more likely to rely on the formal curriculum resources compared to those in Year 2 (p = 0.008). Curriculum planners should examine the level of use of informal learning activities in their schools, and investigate whether this is to enhance student progress, a result of perceived weakness in the delivery and effectiveness of formal resources, or a way to overcome anxiety about the volume of work expected by medical programs.
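
    The reliability step can be sketched by computing Cronbach's alpha for one scale; the responses below are simulated, and only the formula (alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)) reflects the paper's analysis.

        # Hedged sketch: simulated 5-point responses for one six-item scale.
        import numpy as np

        rng = np.random.default_rng(3)
        n_students, n_items = 522, 6
        trait = rng.normal(0, 1, n_students)
        items = np.clip(np.rint(3 + trait[:, None]
                                + rng.normal(0, 0.8, (n_students, n_items))), 1, 5)

        k = n_items
        item_var = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        alpha = k / (k - 1) * (1 - item_var / total_var)
        print(f"Cronbach's alpha = {alpha:.2f}")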

  14. Modeling and Analysis of Asynchronous Systems Using SAL and Hybrid SAL

    NASA Technical Reports Server (NTRS)

    Tiwari, Ashish; Dutertre, Bruno

    2013-01-01

    We present formal models and results of formal analysis of two different asynchronous systems. We first examine a mid-value select module that merges the signals coming from three different sensors that are each asynchronously sampling the same input signal. We then consider the phase locking protocol proposed by Daly, Hopkins, and McKenna. This protocol is designed to keep a set of non-faulty (asynchronous) clocks phase locked even in the presence of Byzantine-faulty clocks on the network. All models and verifications have been developed using the SAL model checking tools and the Hybrid SAL abstractor.
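
    The mid-value select module can be shown in miniature: assuming it simply outputs the median of three asynchronously sampled readings, a single faulty sensor cannot pull the output outside the range of the two good ones. A sketch only; the paper's actual models and proofs are in SAL, not Python.

        # Hedged sketch of a mid-value select (median of three readings).
        def mid_value_select(a: float, b: float, c: float) -> float:
            return sorted((a, b, c))[1]

        good1, good2, faulty = 10.0, 10.2, 99.9
        out = mid_value_select(good1, good2, faulty)
        # One faulty reading is masked by the two good ones:
        assert min(good1, good2) <= out <= max(good1, good2)
        print(out)  # 10.2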

  15. The inter-relationship between formal and informal care: a study in France and Israel

    PubMed Central

    LITWIN, HOWARD; ATTIAS-DONFUT, CLAUDINE

    2012-01-01

    This study examined whether formal care services delivered to frail older people’s homes in France and Israel substitute for or complement informal support. The two countries have comparable family welfare systems but many historical, cultural and religious differences. Data for the respondents aged 75 or more years at the first wave of the Survey of Health, Ageing and Retirement in Europe (SHARE) were analysed. Regressions were examined of three patterns of care from outside the household: informal support only, formal support only and both formal and informal care, with the predictor variables including whether informal help was provided by a family member living in the household. The results revealed that about one-half of the respondents received no help at all (France 51%, Israel 55%), about one-tenth received care from a household member (France 8%, Israel 10%), and one-third were helped by informal carers from outside the household (France 34%, Israel 33%). More French respondents (35%) received formal care services at home than Israelis (27%). Most predictors of the care patterns were similar in the two countries. The analysis showed that complementarity is a common outcome of the co-existence of formal and informal care, and that mixed provision occurs more frequently in situations of greater need. It is also shown that spouse care-givers had less formal home-care supports than either co-resident children or other family care-givers. Even so, spouses, children and other family care-givers all had considerable support from formal home-delivered care. PMID:23316096

  16. Recruiting and retaining low-income Latinos in psychotherapy research.

    PubMed

    Miranda, J; Azocar, F; Organista, K C; Muñoz, R F; Lieberman, A

    1996-10-01

    This article offers suggestions for recruiting and retaining low-income Latinos in treatment studies. Because Latinos underuse traditional mental health services, places such as medical centers or churches with large Latino constituencies are suggested as useful alternative recruitment sources. To keep Latinos in research protocols, providing culturally sensitive treatments is necessary. Culturally sensitive treatments should incorporate families as part of recruitment efforts, particularly older men in the family. In addition, showing respect is an important aspect of traditional Latino culture that includes using formal titles and taking time to listen carefully. Finally, traditional Latinos tend to like interactions with others that are warmer and more personal than is generally part of a research atmosphere.

  17. Application of Adjoint Methodology to Supersonic Aircraft Design Using Reversed Equivalent Areas

    NASA Technical Reports Server (NTRS)

    Rallabhandi, Sriram K.

    2013-01-01

    This paper presents an approach to shaping an aircraft to meet equivalent-area-based objectives using the discrete adjoint approach. Equivalent areas can be obtained either using the reversed augmented Burgers equation or by direct conversion of off-body pressures into equivalent area. Formal coupling with CFD allows computation of sensitivities of equivalent-area objectives with respect to aircraft shape parameters. The exactness of the adjoint sensitivities is verified against derivatives obtained using the complex-step approach. This methodology has the benefit of using designer-friendly equivalent areas in the shape design of low-boom aircraft. Shape optimization results with equivalent-area cost functionals are discussed and further refined using ground-loudness-based objectives.
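
    The complex-step check used to verify the adjoint derivatives can be demonstrated on a stand-in objective: for an analytic function f, Im(f(x + ih))/h recovers f'(x) without subtractive cancellation, so it stays accurate at step sizes where finite differences fail.

        # Hedged sketch: f is an arbitrary analytic stand-in, not the
        # equivalent-area objective from the paper.
        import cmath

        def f(x):
            return cmath.exp(x) * cmath.sin(x)

        x, h = 1.3, 1e-30
        d_complex = f(complex(x, h)).imag / h                # complex step
        d_central = (f(x + 1e-7) - f(x - 1e-7)).real / 2e-7  # finite difference

        print(f"complex-step : {d_complex:.15f}")
        print(f"central diff : {d_central:.15f}")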

  18. The use of the Petri net method in the simulation modeling of mitochondrial swelling.

    PubMed

    Danylovych, Yu V; Chunikhin, A Y; Danylovych, G V; Kolomiets, O V

    2016-01-01

    Using photon correlation spectroscopy, which allows investigating changes in the hydrodynamic diameter of the particles in suspension, it was shown that ultrahigh concentrations of Ca2+ (over 10 mM) induce swelling of isolated mitochondria. An increase in hydrodynamic diameter was caused by an increase of non-specific mitochondrial membrane permeability to Ca ions, matrix Ca2+ overload, activation of ATP- and Ca2+-sensitive K+-channels, as well as activation of cyclosporin-sensitive permeability transition pore. To formalize the experimental data and to assess conformity of experimental results with theoretical predictions we developed a simulation model using the hybrid functional Petri net method.
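
    A minimal Petri-net interpreter conveys the mechanics such a model builds on: a transition fires when all its input places hold tokens, moving tokens downstream. The places and transitions below are illustrative, not the published hybrid functional Petri net.

        # Hedged sketch: two-transition net; token counts are arbitrary.
        places = {"Ca_out": 10, "Ca_matrix": 0, "pore_open": 0}
        transitions = {
            "Ca_uptake": (["Ca_out"], ["Ca_matrix"]),
            "pore_opening": (["Ca_matrix"], ["pore_open"]),
        }

        def enabled(name):
            return all(places[p] > 0 for p in transitions[name][0])

        def fire(name):
            ins, outs = transitions[name]
            for p in ins:
                places[p] -= 1
            for p in outs:
                places[p] += 1

        for _ in range(12):
            for name in transitions:
                if enabled(name):
                    fire(name)
        print(places)  # tokens end up in pore_open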

  19. Amplitude Analysis of the Decay $D_s^+ \to \pi^+ \pi^- \pi^+$ in the Experiment E831/FOCUS (in Portuguese)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schilithz, Anderson Correa; /Rio de Janeiro, CBPF

    We present in this thesis the Dalitz plot analysis of the D_s^+ → π+ π- π+ decay, using data from the E831/FOCUS experiment, which took data in 1996 and 1997. The masses and widths of the f0(980) and f0(1370) are free parameters of the fit to the Dalitz plot, with the aim of studying these resonances in detail. After this analysis we present a Spectator Model study of the S wave in this decay, using the formalism developed by M. Svec [2] for scattering. We present a comparison between the Isobar Model, frequently used in Dalitz plot analysis, and this formalism.

  20. Quantifying hypoxia in human cancers using static PET imaging.

    PubMed

    Taylor, Edward; Yeung, Ivan; Keller, Harald; Wouters, Bradley G; Milosevic, Michael; Hedley, David W; Jaffray, David A

    2016-11-21

    Compared to FDG, the signal of 18F-labelled hypoxia-sensitive tracers in tumours is low. This means that in addition to the presence of hypoxic cells, transport properties contribute significantly to the uptake signal in static PET images. This sensitivity to transport must be minimized in order for static PET to provide a reliable standard for hypoxia quantification. A dynamic compartmental model based on a reaction-diffusion formalism was developed to interpret tracer pharmacokinetics and applied to static images of FAZA in twenty patients with pancreatic cancer. We use our model to identify tumour properties (well-perfused without substantial necrosis or partitioning) for which static PET images can reliably quantify hypoxia. Normalizing the measured activity in a tumour voxel by the value in blood leads to a reduction in the sensitivity to variations in 'inter-corporal' transport properties (blood volume and clearance rate) as well as imaging study protocols. Normalization thus enhances the correlation between static PET images and the FAZA binding rate K3, a quantity which quantifies hypoxia in a biologically significant way. The ratio of FAZA uptake in spinal muscle and blood can vary substantially across patients due to long muscle equilibration times. Normalized static PET images of hypoxia-sensitive tracers can reliably quantify hypoxia for homogeneously well-perfused tumours with minimal tissue partitioning. The ideal normalizing reference tissue is blood, either drawn from the patient before PET scanning or imaged using PET. If blood is not available, uniform, homogeneously well-perfused muscle can be used. For tumours that are not homogeneously well-perfused or for which partitioning is significant, only an analysis of dynamic PET scans can reliably quantify hypoxia.

  1. Quantifying hypoxia in human cancers using static PET imaging

    NASA Astrophysics Data System (ADS)

    Taylor, Edward; Yeung, Ivan; Keller, Harald; Wouters, Bradley G.; Milosevic, Michael; Hedley, David W.; Jaffray, David A.

    2016-11-01

    Compared to FDG, the signal of 18F-labelled hypoxia-sensitive tracers in tumours is low. This means that in addition to the presence of hypoxic cells, transport properties contribute significantly to the uptake signal in static PET images. This sensitivity to transport must be minimized in order for static PET to provide a reliable standard for hypoxia quantification. A dynamic compartmental model based on a reaction-diffusion formalism was developed to interpret tracer pharmacokinetics and applied to static images of FAZA in twenty patients with pancreatic cancer. We use our model to identify tumour properties—well-perfused without substantial necrosis or partitioning—for which static PET images can reliably quantify hypoxia. Normalizing the measured activity in a tumour voxel by the value in blood leads to a reduction in the sensitivity to variations in ‘inter-corporal’ transport properties—blood volume and clearance rate—as well as imaging study protocols. Normalization thus enhances the correlation between static PET images and the FAZA binding rate K3, a quantity which quantifies hypoxia in a biologically significant way. The ratio of FAZA uptake in spinal muscle and blood can vary substantially across patients due to long muscle equilibration times. Normalized static PET images of hypoxia-sensitive tracers can reliably quantify hypoxia for homogeneously well-perfused tumours with minimal tissue partitioning. The ideal normalizing reference tissue is blood, either drawn from the patient before PET scanning or imaged using PET. If blood is not available, uniform, homogeneously well-perfused muscle can be used. For tumours that are not homogeneously well-perfused or for which partitioning is significant, only an analysis of dynamic PET scans can reliably quantify hypoxia.

  2. Reservoir Performance Under Future Climate For Basins With Different Hydrologic Sensitivities

    NASA Astrophysics Data System (ADS)

    Mateus, M. C.; Tullos, D. D.

    2013-12-01

    In addition to long-standing uncertainties related to variable inflows and the market price of power, reservoir operators face a number of new uncertainties related to hydrologic nonstationarity, changing environmental regulations, and rapidly growing water and energy demands. This study investigates the impact, sensitivity, and uncertainty of changing hydrology on hydrosystem performance across different hydrogeologic settings. We evaluate the performance of reservoirs in the Santiam River basin, including a case study in the North Santiam Basin, with high permeability and extensive groundwater storage, and the South Santiam Basin, with low permeability, little groundwater storage, and rapid runoff response. The modeling objective is to address the following study questions: (1) for the two hydrologic regimes, how does the flood management, water supply, and environmental performance of current reservoir operations change under future 2.5th, 50th, and 97.5th percentile streamflow projections; and (2) how much change in inflow is required to initiate a failure to meet downstream minimum or maximum flows in the future. We couple global climate model results with a rainfall-runoff model and a formal Bayesian uncertainty analysis to simulate future inflow hydrographs as inputs to a reservoir operations model. To evaluate reservoir performance under a changing climate, we calculate reservoir refill reliability, changes in flood frequency, and the time and volumetric reliability of meeting minimum spring and summer flow targets. Reservoir performance under future hydrology appears to vary with hydrogeology. We find higher sensitivity to floods for the North Santiam Basin and higher sensitivity to minimum flow targets for the South Santiam Basin. Greater uncertainty is associated with basins having more complex hydrogeology. Results from model simulations contribute to understanding of the reliability and vulnerability of reservoirs under a changing climate.
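
    The reliability metrics named here reduce to simple fractions of time and volume; a sketch with simulated daily flows and an assumed minimum flow target follows.

        # Hedged sketch: gamma-distributed daily flows stand in for the
        # climate-driven streamflow projections used in the study.
        import numpy as np

        rng = np.random.default_rng(4)
        target = 30.0                        # assumed minimum flow, m^3/s
        flow = rng.gamma(4.0, 9.0, 365)      # one simulated year

        shortfall = np.clip(target - flow, 0.0, None)
        volumetric = 1 - shortfall.sum() / (target * flow.size)
        temporal = (flow >= target).mean()
        print(f"volumetric reliability: {volumetric:.2%}")
        print(f"time reliability:       {temporal:.2%}")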

  3. Electric-field-enhanced nutrient consumption in dielectric biomaterials that contain anchorage-dependent cells.

    PubMed

    Belfiore, Laurence A; Floren, Michael L; Belfiore, Carol J

    2012-02-01

    This research contribution addresses electric-field stimulation of intra-tissue mass transfer and cell proliferation in viscoelastic biomaterials. The unsteady state reaction-diffusion equation is solved according to the von Kármán-Pohlhausen integral method of boundary layer analysis when nutrient consumption and tissue regeneration occur in response to harmonic electric potential differences across a parallel-plate capacitor in a dielectric-sandwich configuration. The partial differential mass balance with diffusion and electro-kinetic consumption contains the Damköhler (Λ²) and Deborah (De) numbers. Zero-field and electric-field-sensitive Damköhler numbers affect nutrient boundary layer growth. Diagonal elements of the 2nd-rank diffusion tensor are enhanced in the presence of weak electric fields, in agreement with the formalism of equilibrium and nonequilibrium thermodynamics. Induced dipole polarization density within viscoelastic biomaterials is calculated via the real and imaginary components of the complex dielectric constant, according to the Debye equation, to quantify electro-kinetic stimulation. Rates of nutrient consumption under zero-field conditions are described by third-order kinetics that include local mass densities of nutrients, oxygen, and attached cells. Thinner nutrient boundary layers are stabilized at shorter dimensionless diffusion times when the zero-field intra-tissue Damköhler number increases above its initial-condition-sensitive critical value [i.e., (Λ²zero-field)critical ≥ 53, see Eq. (23)], such that the biomaterial core is starved of essential ingredients required for successful proliferation. When tissue regeneration occurs above the critical electric-field-sensitive intra-tissue Damköhler number, the electro-kinetic contribution to nutrient consumption cannot be neglected. The critical electric-field-sensitive intra-tissue Damköhler number is proportional to the Deborah number. Copyright © 2011 Elsevier B.V. All rights reserved.

  4. Sensitivity and specificity of the Beck Depression Inventory in cardiologic inpatients: how useful is the conventional cut-off score?

    PubMed

    Forkmann, Thomas; Vehren, Thomas; Boecker, Maren; Norra, Christine; Wirtz, Markus; Gauggel, Siegfried

    2009-10-01

    The Beck Depression Inventory (BDI) is widely used for depression screening in various patient populations. However, there are still insufficient data about its sensitivity and specificity in nonpsychiatric patients. Furthermore, some research suggests that somatic BDI items artificially inflate its sum score in physically ill patients. The aim of the present study was to validate the conventional BDI cut-off score by examining its sensitivity and specificity in a mixed sample of cardiac inpatients, and to compare it to a modified "cognitive-emotional" BDI (BDI(c/e)) after exclusion of somatic items. A total of 126 cardiologic inpatients were assessed. Receiver operating characteristic (ROC) curves were calculated for the total BDI (BDI(t)) and the BDI(c/e). Screening performance of cut-off scores was evaluated using the Youden index (Y). With the application of the conventional BDI cut-off score, ROC analysis revealed a moderate overall screening performance with Y = 0.526 and an area under the curve (AUC) of 0.83. In contrast, Y improved to 0.575 at a cut-off score of >9, but screening performance was still not optimal. The BDI(c/e) also showed moderate screening performance (AUC = 0.82); Y was maximized at a cut-off score of >8 (Y = 0.535). Again, no cut-off score provided optimal screening performance. The BDI cannot be recommended as a formal screening instrument in cardiac inpatients, since no cut-off score for either BDI(t) or BDI(c/e) combined sufficiently high sensitivity and specificity. However, the shorter BDI(c/e) could be used as an alternative to the BDI(t), which may be confounded in physically ill patients. Generally, researchers should consider using alternative screening instruments (e.g., the Hospital Anxiety and Depression Scale) instead.
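
    Cut-off selection with the Youden index J = sensitivity + specificity - 1 amounts to scanning candidate cut-offs and keeping the maximiser; the score distributions below are simulated, not the study's patients.

        # Hedged sketch: hypothetical BDI score distributions.
        import numpy as np

        rng = np.random.default_rng(5)
        depressed = rng.normal(18, 6, 40)
        not_depressed = rng.normal(8, 5, 86)

        best_cut, best_j = None, -1.0
        for cut in range(0, 40):
            sens = (depressed > cut).mean()
            spec = (not_depressed <= cut).mean()
            j = sens + spec - 1
            if j > best_j:
                best_cut, best_j = cut, j
        print(f"optimal cut-off >{best_cut}, Youden J = {best_j:.3f}")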

  5. Gender-related differences in reasoning skills and learning interests of junior high school students

    NASA Astrophysics Data System (ADS)

    Shemesh, Michal

    The purpose of this study was to investigate gender-related differences in the relationship between the development of formal reasoning skills and learning interests during the early adolescent stage. For this purpose, 249 students, from seventh to ninth grade, were assessed for their level of mastery of formal reasoning skills by a test based on videotaped simple experiments. Learning interests were assessed by a written response to an open question. Results showed that adolescent boys develop patterns of formal reasoning before their girl classmates. In addition, boys tend to prefer science and technology subjects, while girls tend to prefer language, social studies, and humanities. Analysis of interactions showed that boys' tendency toward science and technology is positively correlated to their age and development of formal reasoning, while girls' tendency to the above subjects is positively related to their development of formal reasoning capacity, but inversely related to their age. Possible explanations to the above-described findings and suggestions for instructional modes that may increase girls' interest in science and technology are discussed.

  6. IEEE/NASA Workshop on Leveraging Applications of Formal Methods, Verification, and Validation

    NASA Technical Reports Server (NTRS)

    Margaria, Tiziana (Editor); Steffen, Bernhard (Editor); Hichey, Michael G.

    2005-01-01

    This volume contains the Preliminary Proceedings of the 2005 IEEE ISoLA Workshop on Leveraging Applications of Formal Methods, Verification, and Validation, with a special track on the theme of Formal Methods in Human and Robotic Space Exploration. The workshop was held on 23-24 September 2005 at the Loyola College Graduate Center, Columbia, MD, USA. The idea behind the Workshop arose from the experience and feedback of ISoLA 2004, the 1st International Symposium on Leveraging Applications of Formal Methods held in Paphos (Cyprus) last October-November. ISoLA 2004 served the need of providing a forum for developers, users, and researchers to discuss issues related to the adoption and use of rigorous tools and methods for the specification, analysis, verification, certification, construction, test, and maintenance of systems from the point of view of their different application domains.

  7. The Formal Semantics of PVS

    NASA Technical Reports Server (NTRS)

    Owre, Sam; Shankar, Natarajan

    1999-01-01

    A specification language is a medium for expressing what is computed rather than how it is computed. Specification languages share some features with programming languages but are also different in several important ways. For our purpose, a specification language is a logic within which the behavior of computational systems can be formalized. Although a specification can be used to simulate the behavior of such systems, we mainly use specifications to state and prove system properties with mechanical assistance. We present the formal semantics of the specification language of SRI's Prototype Verification System (PVS). This specification language is based on the simply typed lambda calculus. The novelty in PVS is that it contains very expressive language features whose static analysis (e.g., typechecking) requires the assistance of a theorem prover. The formal semantics illuminates several of the design considerations underlying PVS, the interaction between theorem proving and typechecking.

  8. Diagnostic accuracy of fundal height and handheld ultrasound-measured abdominal circumference to screen for fetal growth abnormalities

    PubMed Central

    Haragan, Adriane F.; Hulsey, Thomas C.; Hawk, Angela F.; Newman, Roger B.; Chang, Eugene Y.

    2015-01-01

    OBJECTIVE We sought to compare fundal height and handheld ultrasound-measured fetal abdominal circumference (HHAC) for the prediction of fetal growth restriction (FGR) or large for gestational age (LGA). STUDY DESIGN This was a diagnostic accuracy study in nonanomalous singleton pregnancies between 24 and 40 weeks’ gestation. Patients underwent HHAC and fundal height measurement prior to formal growth ultrasound. FGR was defined as estimated fetal weight below the 10th percentile, whereas LGA was defined as estimated fetal weight above the 90th percentile. Sensitivity and specificity were calculated and compared using methods described elsewhere. RESULTS There were 251 patients included in this study. HHAC had superior sensitivity and specificity for the detection of FGR (sensitivity: 100% vs 42.86%; specificity: 92.62% vs 85.24%). HHAC had higher specificity but lower sensitivity when screening for LGA (specificity: 85.66% vs 66.39%; sensitivity: 57.14% vs 71.43%). CONCLUSION HHAC could prove to be a valuable screening tool in the detection of FGR. Further studies are needed in a larger population. PMID:25818672

  9. Direct evidence that prostate tumors show high sensitivity to fractionation (low alpha/beta ratio), similar to late-responding normal tissue.

    PubMed

    Brenner, David J; Martinez, Alvaro A; Edmundson, Gregory K; Mitchell, Christina; Thames, Howard D; Armour, Elwood P

    2002-01-01

    We take a direct approach to the question of whether prostate tumors have an atypically high sensitivity to fractionation (low alpha/beta ratio), more typical of the surrounding late-responding normal tissue. Earlier estimates of alpha/beta for prostate cancer have relied on comparing results from external beam radiotherapy (EBRT) and brachytherapy, an approach with significant pitfalls due to the many differences between the treatments. To circumvent this, we analyze recent data from a single EBRT + high-dose-rate (HDR) brachytherapy protocol, in which the brachytherapy was given in either 2 or 3 implants, and at various doses. For the analysis, standard models of tumor cure based on Poisson statistics were used in conjunction with the linear-quadratic formalism. Biochemical control at 3 years was the clinical endpoint. Patients receiving 3 vs. 2 HDR implants were matched by clinical stage, pretreatment prostate-specific antigen (PSA), Gleason score, length of follow-up, and age. The estimated value of alpha/beta from the current analysis, 1.2 Gy (95% CI: 0.03, 4.1 Gy), is consistent with previous estimates for prostate tumor control. This alpha/beta value is considerably less than typical values for tumors (> or = 8 Gy), and more comparable to values in the surrounding late-responding normal tissues. This analysis provides strong supporting evidence that alpha/beta values for prostate tumor control are atypically low, as indicated by previous analyses and radiobiological considerations. If true, hypofractionation or HDR regimens for prostate radiotherapy (with appropriate doses) should produce tumor control and late sequelae that are at least as good as or better than currently achieved, with the added possibility that early sequelae may be reduced.
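
    The linear-quadratic bookkeeping behind this argument can be made concrete through the biologically effective dose, BED = nd(1 + d/(alpha/beta)): with alpha/beta near 1.2 Gy, large dose fractions are disproportionately effective. The schedules and the comparison value of 10 Gy below are illustrative, not taken from the paper.

        # Hedged sketch: BED for a conventional and a hypofractionated
        # schedule under two assumed alpha/beta values.
        def bed(n, d, alpha_beta):
            return n * d * (1 + d / alpha_beta)

        conventional = (39, 2.0)    # 39 fractions of 2 Gy
        hypo = (5, 7.25)            # 5 fractions of 7.25 Gy

        for ab, label in [(1.2, "prostate tumour"), (10.0, "typical tumour")]:
            print(f"alpha/beta {ab:>4} Gy ({label}): "
                  f"BED {bed(*conventional, ab):.0f} Gy vs {bed(*hypo, ab):.0f} Gy")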

  10. Prostate Cancer Associated Lipid Signatures in Serum Studied by ESI-Tandem Mass Spectrometry as Potential New Biomarkers.

    PubMed

    Duscharla, Divya; Bhumireddy, Sudarshana Reddy; Lakshetti, Sridhar; Pospisil, Heike; Murthy, P V L N; Walther, Reinhard; Sripadi, Prabhakar; Ummanni, Ramesh

    2016-01-01

    Prostate cancer (PCa) is one of the most common cancers in western men, and its incidence is on the rise worldwide. The present study deals with the serum lipidome profiling of patients diagnosed with PCa to identify potential new biomarkers. We employed ESI-MS/MS and GC-MS for identification of significantly altered lipids in cancer patients' serum compared to controls. Lipidomic data revealed that 24 lipids are significantly altered in cancer patients' serum (n = 18) compared to normal controls (n = 18) with no history of PCa. By using hierarchical clustering and principal component analysis (PCA) we could clearly separate cancer patients from the control group. Correlation and partition analysis along with Formal Concept Analysis (FCA) identified that PC (39:6) and FA (22:3) could classify samples with high certainty. Both lipids, PC (39:6) and FA (22:3), could influence the cataloguing of patients with 100% sensitivity (all 18 control samples classified correctly) and 77.7% specificity (4 of 18 tumor samples misclassified), with a p-value of 1.612×10-6 in Fisher's exact test. Further, we performed GC-MS to identify fatty acids altered in PCa patients and found that alpha-linolenic acid (ALA) levels are altered in PCa. We also performed an in vitro proliferation assay to determine the effect of ALA on the survival of the classical human PCa cell lines LNCaP and PC3. We hereby report that the altered lipids PC (39:6) and FA (22:3) offer a new set of biomarkers in addition to the existing diagnostic tests that could significantly improve sensitivity and specificity in PCa diagnosis.
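
    The quoted p-value can be reproduced from the 2x2 table implied by the reported figures (18/18 controls classified correctly, 14/18 tumours correct):

        # Hedged sketch: table inferred from the reported sensitivity
        # and specificity, tested with scipy's Fisher's exact test.
        from scipy.stats import fisher_exact

        table = [[18, 0],    # true controls: classified control / tumour
                 [4, 14]]    # true tumours:  classified control / tumour

        odds_ratio, p = fisher_exact(table, alternative="two-sided")
        print(f"p = {p:.3e}")   # ~1.612e-06, matching the reported value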

  11. Prostate Cancer Associated Lipid Signatures in Serum Studied by ESI-Tandem Mass Spectrometry as Potential New Biomarkers

    PubMed Central

    Duscharla, Divya; Bhumireddy, Sudarshana Reddy; Lakshetti, Sridhar; Pospisil, Heike; Murthy, P. V. L. N.; Walther, Reinhard; Sripadi, Prabhakar; Ummanni, Ramesh

    2016-01-01

    Prostate cancer (PCa) is one of the most common cancers in western men, and its incidence is on the rise worldwide. The present study deals with the serum lipidome profiling of patients diagnosed with PCa to identify potential new biomarkers. We employed ESI-MS/MS and GC-MS for identification of significantly altered lipids in cancer patients’ serum compared to controls. Lipidomic data revealed that 24 lipids are significantly altered in cancer patients’ serum (n = 18) compared to normal controls (n = 18) with no history of PCa. By using hierarchical clustering and principal component analysis (PCA) we could clearly separate cancer patients from the control group. Correlation and partition analysis along with Formal Concept Analysis (FCA) identified that PC (39:6) and FA (22:3) could classify samples with high certainty. Both lipids, PC (39:6) and FA (22:3), could influence the cataloguing of patients with 100% sensitivity (all 18 control samples classified correctly) and 77.7% specificity (4 of 18 tumor samples misclassified), with a p-value of 1.612×10−6 in Fisher’s exact test. Further, we performed GC-MS to identify fatty acids altered in PCa patients and found that alpha-linolenic acid (ALA) levels are altered in PCa. We also performed an in vitro proliferation assay to determine the effect of ALA on the survival of the classical human PCa cell lines LNCaP and PC3. We hereby report that the altered lipids PC (39:6) and FA (22:3) offer a new set of biomarkers in addition to the existing diagnostic tests that could significantly improve sensitivity and specificity in PCa diagnosis. PMID:26958841

  12. MEDICAL DEVICE PRICES IN ECONOMIC EVALUATIONS.

    PubMed

    Akpinar, Ilke; Jacobs, Philip; Husereau, Don

    2015-01-01

    Economic evaluations, although not formally used in purchasing decisions for medical devices in Canada, are still being conducted and published. The aim of this study was to examine the way that prices have been included in Canadian economic evaluations of medical devices. We conducted a review of the economic concepts and implications of methods used in economic evaluations of the eleven most implanted medical devices from the Canadian perspective. We found Canadian economic studies for five of the eleven medical devices, identifying nineteen Canadian studies in total. Overall, device costs were important components of total procedure cost, with an average ratio of 44.1%. Observational estimates of device costs were obtained from buyers or sellers in 13 of the 19 studies. Although most of the devices last more than 1 year, standard costing methods for capital equipment were never used. In addition, only eight studies included a sensitivity analysis for the device cost, and none of the sensitivity analyses were based on actual price distributions. Economic evaluations are potentially important for policy making, but although they are being conducted, there is no standardized approach for incorporating medical device prices in economic analyses. Our review provides suggestions for improvements in how prices are incorporated in economic evaluations of medical devices.

  13. The Dynamics of Son Preference, Technology Diffusion, and Fertility Decline Underlying Distorted Sex Ratios at Birth: A Simulation Approach.

    PubMed

    Kashyap, Ridhi; Villavicencio, Francisco

    2016-10-01

    We present a micro-founded simulation model that formalizes the "ready, willing, and able" framework, originally used to explain historical fertility decline, to the practice of prenatal sex selection. The model generates sex ratio at birth (SRB) distortions from the bottom up and attempts to quantify plausible levels, trends, and interactions of son preference, technology diffusion, and fertility decline that underpin SRB trajectories at the macro level. Calibrating our model for South Korea, we show how even as the proportion with a preference for sons was declining, SRB distortions emerged due to rapid diffusion of prenatal sex determination technology combined with small but growing propensities to abort at low birth parities. Simulations reveal that relatively low levels of son preference (about 20% to 30% wanting one son) can result in skewed SRB levels if technology diffuses early and steadily, and if fertility falls rapidly to encourage sex-selective abortion at low parities. Model sensitivity analysis highlights how the shape of sex ratio trajectories is particularly sensitive to the timing and speed of prenatal sex-determination technology diffusion. The maximum SRB levels reached in a population are influenced by how the readiness to abort rises as a function of the fertility decline.
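
    A bottom-up sketch of the "ready, willing, and able" mechanism: a female conception is replaced only when the parents want a son (willing), have access to sex determination (able), and would act on it (ready). All propensities below are illustrative, not the calibrated South Korean values.

        # Hedged sketch: one-parity micro-simulation of the SRB.
        import numpy as np

        rng = np.random.default_rng(6)
        n = 200_000
        P_MALE = 0.512                      # natural share of male births

        willing = rng.random(n) < 0.25      # son preference
        able = rng.random(n) < 0.60         # access to prenatal technology
        ready = rng.random(n) < 0.30        # propensity to abort

        male = rng.random(n) < P_MALE
        # Selected female conceptions are re-drawn until a boy results,
        # which is equivalent to forcing them male.
        male[willing & able & ready & ~male] = True

        srb = 100 * male.mean() / (1 - male.mean())
        print(f"simulated SRB ~ {srb:.0f} boys per 100 girls")  # vs ~105 natural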

  14. Quantum caustics in resonance-fluorescence trajectories

    NASA Astrophysics Data System (ADS)

    Naghiloo, M.; Tan, D.; Harrington, P. M.; Lewalle, P.; Jordan, A. N.; Murch, K. W.

    2017-11-01

    We employ phase-sensitive amplification to perform homodyne detection of the resonance fluorescence from a driven superconducting artificial atom. Entanglement between the emitter and its fluorescence allows us to track the individual quantum state trajectories of the emitter conditioned on the outcomes of the field measurements. We analyze the ensemble properties of these trajectories by considering trajectories that connect specific initial and final states. By applying the stochastic path-integral formalism, we calculate equations of motion for the most-likely path between two quantum states and compare these predicted paths to experimental data. Drawing on the mathematical similarity between the action formalism of the most-likely quantum paths and ray optics, we study the emergence of caustics in quantum trajectories: places where multiple extrema in the stochastic action occur. We observe such multiple most-likely paths in experimental data and find these paths to be in reasonable quantitative agreement with theoretical calculations.

  15. Resonance fluorescence trajectories in a superconducting qubit

    NASA Astrophysics Data System (ADS)

    Naghiloo, Mahdi; Tan, Dian; Harrington, Patrick; Lewalle, Philippe; Jordan, Andrew; Murch, Kater

    We employ phase-sensitive amplification to perform homodyne detection of the resonance fluorescence from a driven superconducting artificial atom. Entanglement between the emitter and its fluorescence allows us to track the individual quantum state trajectories of the emitter. We analyze the ensemble properties of these trajectories by considering paths that connect specific initial and final states. By applying a stochastic path integral formalism, we calculate equations of motion for the most likely path between two quantum states and compare these predicted paths to experimental data. Drawing on the mathematical similarity between the action formalism of the most likely quantum paths and ray optics, we study the emergence of caustics in quantum trajectories: situations where multiple extrema in the stochastic action occur. We observe such multiple most likely paths in experimental data and find these paths to be in reasonable quantitative agreement with theoretical calculations. Supported by the John Templeton Foundation.

  16. Does formal complexity reflect cognitive complexity? Investigating aspects of the Chomsky Hierarchy in an artificial language learning study.

    PubMed

    Öttl, Birgit; Jäger, Gerhard; Kaup, Barbara

    2015-01-01

    This study investigated whether formal complexity, as described by the Chomsky Hierarchy, corresponds to cognitive complexity during language learning. According to the Chomsky Hierarchy, nested dependencies (context-free) are less complex than cross-serial dependencies (mildly context-sensitive). In two artificial grammar learning (AGL) experiments participants were presented with a language containing either nested or cross-serial dependencies. A learning effect for both types of dependencies could be observed, but no difference between dependency types emerged. These behavioral findings do not seem to reflect complexity differences as described in the Chomsky Hierarchy. This study extends previous findings in demonstrating learning effects for nested and cross-serial dependencies with more natural stimulus materials in a classical AGL paradigm after only one hour of exposure. The current findings can be taken as a starting point for further exploring the degree to which the Chomsky Hierarchy reflects cognitive processes.
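
    The two dependency types can be generated over paired symbols: nested dependencies close in reverse order (context-free, A1 A2 B2 B1), while cross-serial dependencies close in the same order (mildly context-sensitive, A1 A2 B1 B2).

        # Sketch of the two string patterns contrasted in the study.
        def nested(pairs):
            a, b = zip(*pairs)
            return list(a) + list(reversed(b))

        def cross_serial(pairs):
            a, b = zip(*pairs)
            return list(a) + list(b)

        pairs = [("A1", "B1"), ("A2", "B2"), ("A3", "B3")]
        print(" ".join(nested(pairs)))        # A1 A2 A3 B3 B2 B1
        print(" ".join(cross_serial(pairs)))  # A1 A2 A3 B1 B2 B3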

  17. Does Formal Complexity Reflect Cognitive Complexity? Investigating Aspects of the Chomsky Hierarchy in an Artificial Language Learning Study

    PubMed Central

    Öttl, Birgit; Jäger, Gerhard; Kaup, Barbara

    2015-01-01

    This study investigated whether formal complexity, as described by the Chomsky Hierarchy, corresponds to cognitive complexity during language learning. According to the Chomsky Hierarchy, nested dependencies (context-free) are less complex than cross-serial dependencies (mildly context-sensitive). In two artificial grammar learning (AGL) experiments participants were presented with a language containing either nested or cross-serial dependencies. A learning effect for both types of dependencies could be observed, but no difference between dependency types emerged. These behavioral findings do not seem to reflect complexity differences as described in the Chomsky Hierarchy. This study extends previous findings in demonstrating learning effects for nested and cross-serial dependencies with more natural stimulus materials in a classical AGL paradigm after only one hour of exposure. The current findings can be taken as a starting point for further exploring the degree to which the Chomsky Hierarchy reflects cognitive processes. PMID:25885790

  18. Professional Identity Development of Teacher Candidates Participating in an Informal Science Education Internship: A focus on drawings as evidence

    NASA Astrophysics Data System (ADS)

    Katz, Phyllis; McGinnis, J. Randy; Hestness, Emily; Riedinger, Kelly; Marbach-Ad, Gili; Dai, Amy; Pease, Rebecca

    2011-06-01

    This study investigated the professional identity development of teacher candidates participating in an informal afterschool science internship in a formal science teacher preparation programme. We used a qualitative research methodology. Data were collected from the teacher candidates, their informal internship mentors, and the researchers. The data were analysed through an identity development theoretical framework, informed by participants' mental models of science teaching and learning. We learned that the experience in an afterschool informal internship encouraged the teacher candidates to see themselves, and to be seen by others, as enacting key recommendations by science education standards documents, including exhibiting: positive attitudes, sensitivity to diversity, and increasing confidence in facilitating hands-on science participation, inquiry, and collaborative work. Our study provided evidence that the infusion of an informal science education internship in a formal science teacher education programme influenced positively participants' professional identity development as science teachers.

  19. Cost Implications of Organizing Nursing Home Workforce in Teams

    PubMed Central

    Mukamel, Dana B; Cai, Shubing; Temkin-Greener, Helena

    2009-01-01

    Objective To estimate the costs associated with formal and self-managed daily practice teams in nursing homes. Data Sources/Study Setting Medicaid cost reports for 135 nursing homes in New York State in 2006 and survey data for 6,137 direct care workers. Study Design A retrospective statistical analysis: We estimated hybrid cost functions that include team penetration variables. Inference was based on robust standard errors. Data Collection Formal and self-managed team penetration (i.e., percent of staff working in a team) were calculated from survey responses. Annual variable costs, beds, case mix-adjusted days, admissions, home care visits, outpatient clinic visits, day care days, wages, and ownership were calculated from the cost reports. Principal Findings Formal team penetration was significantly associated with costs, while self-managed team penetration was not. Costs declined as formal team penetration increased up to 13 percent, and increased above this level. Formal teams in nursing homes in the upward-sloping range of the curve were more diverse, with a larger number of participating disciplines, and were more likely to include physicians. Conclusions Organization of the workforce in formal teams may offer nursing homes a cost-saving strategy. More research is required to understand the relationship between team composition and costs. PMID:19486181
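
    The turning point reported above can be illustrated with a toy quadratic cost curve. This is only a sketch: the coefficients below are hypothetical values chosen so the minimum lands near 13 percent, and the study's actual hybrid cost functions include many covariates (beds, case-mix days, wages, ownership) omitted here.

      # Toy quadratic cost curve in formal-team penetration p (fraction of staff).
      # Coefficients are hypothetical, not the study's estimates.
      def annual_cost(p, base=10.0, b1=-4.0, b2=15.4):
          return base + b1 * p + b2 * p ** 2  # declines, then rises, in p

      # A quadratic base + b1*p + b2*p^2 is minimized at p* = -b1 / (2 * b2).
      p_star = -(-4.0) / (2 * 15.4)
      print(f"cost-minimizing penetration: {p_star:.1%}")  # ~13% with these toy numbers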

  20. Quantifying patient preferences for symptomatic breast clinic referral: a decision analysis study.

    PubMed

    Quinlan, Aisling; O'Brien, Kirsty K; Galvin, Rose; Hardy, Colin; McDonnell, Ronan; Joyce, Doireann; McDowell, Ronald D; Aherne, Emma; Keogh, Claire; O'Sullivan, Katriona; Fahey, Tom

    2018-05-31

    Decision analysis study that incorporates patient preferences and probability estimates to investigate the impact of women's preferences for referral or an alternative strategy of watchful waiting if faced with symptoms that could be due to breast cancer. Community-based study. Asymptomatic women aged 30-60 years. Participants were presented with 11 health scenarios that represent the possible consequences of symptomatic breast problems. Participants were asked the risk of death that they were willing to take in order to avoid the health scenario using the standard gamble utility method. This process was repeated for all 11 health scenarios. Formal decision analysis for the preferred individual decision was then estimated for each participant. The preferred diagnostic strategy was either watchful waiting or referral to a breast clinic. Sensitivity analysis was used to examine how each varied according to changes in the probabilities of the health scenarios. A total of 35 participants completed the interviews, with a median age of 41 years (IQR 35-47 years). The majority of the study sample was employed (n=32, 91.4%), with a third-level (university) education (n=32, 91.4%) and with knowledge of someone with breast cancer (n=30, 85.7%). When individual preferences were accounted for, 25 (71.4%) patients preferred watchful waiting to referral for triple assessment as their preferred initial diagnostic strategy. Sensitivity analysis shows that referral for triple assessment becomes the dominant strategy at the upper probability estimate (18%) of breast cancer in the community. Watchful waiting is an acceptable strategy for most women who present to their general practitioner (GP) with breast symptoms. These findings suggest that current referral guidelines should take more explicit account of women's preferences in relation to their GP's initial management strategy. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
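
    The standard gamble and expected-utility comparison described above can be sketched in a few lines. All probabilities and utilities below are hypothetical placeholders, not the study's estimates; the study elicited utilities for 11 scenarios per participant.

      # Minimal sketch of standard-gamble decision analysis; numbers are invented.
      # A participant willing to accept at most a risk of death p to avoid a
      # health state implicitly assigns that state the utility u = 1 - p.
      def standard_gamble_utility(max_acceptable_risk_of_death):
          return 1.0 - max_acceptable_risk_of_death

      # Expected utility of a strategy: sum of P(scenario) * utility(scenario).
      def expected_utility(probs, utils):
          return sum(p * u for p, u in zip(probs, utils))

      # Toy two-scenario comparison (no cancer vs. cancer, probability 3%).
      eu_waiting  = expected_utility([0.97, 0.03],
                                     [standard_gamble_utility(0.01),
                                      standard_gamble_utility(0.40)])
      eu_referral = expected_utility([0.97, 0.03],
                                     [standard_gamble_utility(0.05),
                                      standard_gamble_utility(0.30)])
      print("preferred:", "watchful waiting" if eu_waiting > eu_referral else "referral")

    Re-running such a comparison while sweeping the cancer probability upward is the kind of sensitivity analysis described above, which identified 18% as the point where referral becomes dominant.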

  1. Effect of Formal Education on Vascular Cognitive Impairment after Stroke: A Meta-analysis and Study in Young-Stroke Patients.

    PubMed

    Kessels, Roy P C; Eikelboom, Willem Sake; Schaapsmeerders, Pauline; Maaijwee, Noortje A M; Arntz, Renate M; van Dijk, Ewoud J; de Leeuw, Frank-Erik

    2017-03-01

    The extent of vascular cognitive impairment (VCI) after stroke varies greatly across individuals, even when the same amount of brain damage is present. Education level is a potentially protective factor explaining these differences, but results on its effects on VCI are inconclusive. First, we performed a meta-analysis on formal education and VCI, identifying 21 studies (N=7770). Second, we examined the effect of formal education on VCI in young-stroke patients who were cognitively assessed on average 11.0 (SD=8.2) years post-stroke (the FUTURE study cohort). The total sample consisted of 277 young-stroke patients with a mean age at follow-up of 50.9 (SD=10.3) years. Age- and education-adjusted expected scores were computed using 146 matched stroke-free controls. The meta-analysis showed an overall effect size (z') of 0.25 (95% confidence interval [0.18-0.31]), indicating that formal education level had a small to medium effect on VCI. Analyses of the FUTURE data showed that the effect of education on post-stroke executive dysfunction was mediated by age (β age = -0.015; p<.05). Below-average performance in the attention domain was more frequent for low-education patients (χ2(2)=9.8; p<.05). While education level was found to be related to post-stroke VCI in previous research, the effects were small. Further analysis in a large stroke cohort showed that these education effects were fully mediated by age, even in relatively young stroke patients. Education level in and of itself does not appear to be a valid indicator of cognitive reserve. Multi-indicator methods may be more valid, but have not been studied in relation to VCI. (JINS, 2017, 23, 223-238).

  2. The Profile Envision and Splicing Tool (PRESTO): Developing an Atmospheric Wind Analysis Tool for Space Launch Vehicles Using Python

    NASA Technical Reports Server (NTRS)

    Orcutt, John M.; Barbre, Robert E., Jr.; Brenton, James C.; Decker, Ryan K.

    2017-01-01

    Launch vehicle programs require vertically complete atmospheric profiles. Many systems exist at the ER to make the necessary measurements, but all have different EVR, vertical coverage, and temporal coverage. The MSFC Natural Environments Branch developed a tool to create a vertically complete profile from multiple inputs using Python. Forward work: finish formal testing (acceptance testing and end-to-end testing), followed by formal release.
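
    The core splicing idea can be sketched as below: interpolate each input profile onto a common altitude grid and keep the most-preferred source wherever it has coverage. This is only an illustration of the concept; PRESTO's actual selection and splicing rules, and the instrument names used here, are assumptions for the sake of the example.

      import numpy as np

      # Toy splice of two wind-speed profiles with different vertical coverage.
      # Sources are tried in order of preference; gaps left by one source are
      # filled by the next. Hypothetical data: a low-level profiler and a balloon.
      def splice(alt_grid, profiles):
          out = np.full(alt_grid.shape, np.nan)
          for alt, speed in profiles:
              filled = ~np.isnan(out)
              interp = np.interp(alt_grid, alt, speed, left=np.nan, right=np.nan)
              out = np.where(filled, out, interp)
          return out

      profiler = (np.array([0.0, 1.0, 2.0, 3.0]),   np.array([2.5, 4.0, 5.5, 7.0]))
      balloon  = (np.array([0.0, 5.0, 10.0, 15.0]), np.array([3.0, 8.0, 15.0, 22.0]))
      grid = np.linspace(0.0, 15.0, 7)  # altitudes in km
      print(splice(grid, [profiler, balloon]))  # m/s, profiler preferred below 3 km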

  3. Confronting quasi-exponential inflation with WMAP seven

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pal, Barun Kumar; Pal, Supratik; Basu, B., E-mail: barunp1985@rediffmail.com, E-mail: pal@th.physik.uni-bonn.de, E-mail: banasri@isical.ac.in

    2012-04-01

    We confront quasi-exponential models of inflation with the WMAP seven-year dataset using the Hamilton-Jacobi formalism. With a phenomenological Hubble parameter representing quasi-exponential inflation, we develop the formalism and subject the analysis to confrontation with WMAP seven using the publicly available code CAMB. The observable parameters are found to fare extremely well with WMAP seven. We also obtain a tensor-to-scalar amplitude ratio which may be detectable by PLANCK.

  4. Model-Driven Test Generation of Distributed Systems

    NASA Technical Reports Server (NTRS)

    Easwaran, Arvind; Hall, Brendan; Schweiker, Kevin

    2012-01-01

    This report describes a novel test generation technique for distributed systems. Utilizing formal models and formal verification tools, specifically the Symbolic Analysis Laboratory (SAL) tool-suite from SRI, we present techniques to generate concurrent test vectors for distributed systems. These are initially explored within an informal test validation context and later extended to achieve full MC/DC coverage of the TTEthernet protocol operating within a system-centric context.

  5. Fourth NASA Langley Formal Methods Workshop

    NASA Technical Reports Server (NTRS)

    Holloway, C. Michael (Compiler); Hayhurst, Kelly J. (Compiler)

    1997-01-01

    This publication consists of papers presented at NASA Langley Research Center's fourth workshop on the application of formal methods to the design and verification of life-critical systems. Topics considered include: proving properties of accidents; modeling and validating SAFER in VDM-SL; requirements analysis of real-time control systems using PVS; a tabular language for system design; and automated deductive verification of parallel systems. Also included is a fundamental hardware design in PVS.

  6. Development of Formal Agricultural Education in Canada (Based on the Analysis of Scientific Periodicals of the 19th-Early 20th Centuries)

    ERIC Educational Resources Information Center

    Havrylenko, Kateryna

    2016-01-01

    The article states that one of the world leaders in agricultural sector training is Canada, which has gained a great scientific and practical experience. The paper examines the role of periodicals of the 19th-early 20th centuries, preserved in the Canadian book funds for the establishment and development of formal agricultural education of this…

  7. The equivalence of Darmois-Israel and distributional method for thin shells in general relativity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mansouri, R.; Khorrami, M.

    1996-11-01

    A distributional method to solve Einstein's field equations for thin shells is formulated. The familiar field equations and jump conditions of the Darmois-Israel formalism are derived. A careful analysis of the Bianchi identities shows that, for the cases under consideration, they make sense as distributions and lead to the jump conditions of the Darmois-Israel formalism. © 1996 American Institute of Physics.
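
    For reference, the familiar jump condition in question, often called the Lanczos equation, relates the surface stress-energy of the shell to the jump in extrinsic curvature. In units G = c = 1 (sign conventions vary across the literature):

      S_{ab} = -\frac{1}{8\pi}\left([K_{ab}] - h_{ab}\,[K]\right),
      \qquad [K_{ab}] \equiv K^{+}_{ab} - K^{-}_{ab},

    where h_{ab} is the metric induced on the shell and K_{ab} is the extrinsic curvature on either side.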

  8. Approaches to formalization of the informal waste sector into municipal solid waste management systems in low- and middle-income countries: Review of barriers and success factors.

    PubMed

    Aparcana, Sandra

    2017-03-01

    The Municipal Solid Waste Management (MSWM) sector represents a major challenge for low- and middle-income countries due to significant environmental and socioeconomic issues involving rapid urbanization, their MSWM systems, and the existence of the informal waste sector. Recognizing its role, several countries have implemented various formalization measures, aiming to address the social problems linked to this sector. However, regardless of these initiatives, not all attempts at formalization have proved successful due to the existence of barriers preventing their implementation in the long term. Along with this, there is a frequent lack of knowledge or understanding regarding these barriers and the kind of measures that may enable formalization, thereby attaining a win-win situation for all the stakeholders involved. In this context, policy- and decision-makers in the public and private sectors are frequently confronted with the dilemma of finding workable approaches to formalization, adjusted to their particular MSWM contexts. Building on the review of frequently implemented approaches to formalization, including an analysis of the barriers to and enabling measures for formalization, this paper aims to address this gap by explaining to policy- and decision-makers, and to waste managers in the private sector, certain dynamics that can be observed and that should be taken into account when designing formalization strategies that are adapted to their particular socioeconomic and political-institutional context. This includes possible links between formalization approaches and barriers, the kinds of barriers that need to be removed, and enabling measures leading to successful formalization in the long term. This paper involved a literature review of common approaches to formalization, which were classified into three categories: (1) informal waste workers organized in associations or cooperatives; (2) organized in CBOs or MSEs; and (3) contracted as individual workers by the formal waste sector. This was followed by the identification and subsequent classification of measures for removing common barriers to formalization into five categories: policy/legal, institutional/organizational, technical, social, and economic/financial. The approaches to formalization, as well as the barrier categories, were validated through the assessment of twenty case studies of formalization. Building on the assessment, the paper discussed possible links between formalization approaches and barriers, the 'persistent' challenges that represent barriers to formalization, as well as key enabling factors improving the likelihood of successful formalization. Regardless of the type of approach adopted to formalization, the review identifies measures to remove barriers in all five categories, with a stronger link between the approaches 1 and 2 and the existence of measures in the policy, institutional, and financial categories. Regarding persistent barriers, the review identified ones arising from the absence of measures to address a particular issue before formalization or due to specific country- or sector-related conditions, and their interaction with the MSWM context. 75% of the case studies had persistent barriers in respect of policy/legal issues, 50% of institutional/organizational issues, 45% of financial/economic issues, and 40% and 35% of social and technical issues, respectively.
This paper concludes that independently of the formalization approach, the lack of interventions or measures in any of the five categories of barriers may lead formalization initiatives to fail, as unaddressed barriers become 'persistent' after formalization is implemented. Furthermore, 'persistent barriers' may also appear due to unfavorable country-specific conditions. The success of a formalization initiative does not depend on a specific approach, but most likely on the inclusion of country-appropriate measures at the policy, economic and institutional levels. The empowerment of informal waste-workers is again confirmed as a further key success factor for their formalization. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Extension of Liouville Formalism to Postinstability Dynamics

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    2003-01-01

    A mathematical formalism has been developed for predicting the postinstability motions of a dynamic system governed by a system of nonlinear equations and subject to initial conditions. Previously, there was no general method for prediction and mathematical modeling of postinstability behaviors (e.g., chaos and turbulence) in such a system. The formalism of nonlinear dynamics does not afford means to discriminate between stable and unstable motions: an additional stability analysis is necessary for such discrimination. However, an additional stability analysis does not suggest any modifications of a mathematical model that would enable the model to describe postinstability motions efficiently. The most important type of instability that necessitates a postinstability description is associated with positive Lyapunov exponents. Such an instability leads to exponential growth of small errors in initial conditions or, equivalently, exponential divergence of neighboring trajectories. The development of the present formalism was undertaken in an effort to remove positive Lyapunov exponents. The means chosen to accomplish this is coupling of the governing dynamical equations with the corresponding Liouville equation that describes the evolution of the flow of error probability. The underlying idea is to suppress the divergences of different trajectories that correspond to different initial conditions, without affecting a target trajectory, which is one that starts with prescribed initial conditions.
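
    The exponential divergence of neighboring trajectories can be made concrete with a short numerical sketch (not taken from the report): estimate the largest Lyapunov exponent of the logistic map by repeatedly renormalizing the separation of two nearby trajectories. A positive result is precisely the instability described above.

      import math

      # Benettin-style estimate of the largest Lyapunov exponent of the logistic
      # map x -> r*x*(1-x). Two trajectories start a distance d0 apart; after each
      # step we log their separation growth and renormalize back to d0.
      def largest_lyapunov(r=4.0, x0=0.2, d0=1e-9, steps=2000):
          x, y, total = x0, x0 + d0, 0.0
          for _ in range(steps):
              x, y = r * x * (1 - x), r * y * (1 - y)
              d = abs(y - x) or d0
              total += math.log(d / d0)
              y = x + d0 * (1 if y >= x else -1)  # renormalize the separation
          return total / steps

      print(largest_lyapunov())  # ~ln 2 ≈ 0.693 for r = 4: chaotic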

  10. Quasipolynomial generalization of Lotka-Volterra mappings

    NASA Astrophysics Data System (ADS)

    Hernández-Bermejo, Benito; Brenig, Léon

    2002-07-01

    In recent years, it has been shown that Lotka-Volterra mappings constitute a valuable tool from both the theoretical and the applied points of view, with developments in very diverse fields such as physics, population dynamics, chemistry, and economics. The purpose of this work is to demonstrate that many of the most important ideas and algebraic methods that constitute the basis of the quasipolynomial formalism (originally conceived for the analysis of ordinary differential equations) can be extended into the mapping domain. The extension of the formalism into the discrete-time context is remarkable insofar as the quasipolynomial methodology had never been shown to be applicable beyond the differential case. It will be demonstrated that Lotka-Volterra mappings play a central role in the quasipolynomial formalism for the discrete-time case. Moreover, the extension of the formalism into the discrete-time domain allows a significant generalization of Lotka-Volterra mappings as well as a whole transfer of algebraic methods into the discrete-time context. The result is a novel and more general conceptual framework for the understanding of Lotka-Volterra mappings, as well as a new range of possibilities that become open not only for the theoretical analysis of Lotka-Volterra mappings and their generalizations, but also for the development of new applications.
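
    For readers unfamiliar with the discrete-time objects involved, a two-species Lotka-Volterra mapping in a common exponential form can be iterated as below. The parameter values are illustrative only, and the paper's quasipolynomial generalization goes well beyond this special case.

      import math

      # One step of a Lotka-Volterra mapping x_i -> x_i * exp(r_i + sum_j A_ij x_j).
      def lv_step(x, r, A):
          return [xi * math.exp(ri + sum(a * xj for a, xj in zip(row, x)))
                  for xi, ri, row in zip(x, r, A)]

      x = [0.5, 0.5]                      # initial populations (illustrative)
      r = [0.5, -0.2]                     # intrinsic growth rates
      A = [[-0.6, -0.3],                  # interaction matrix
           [ 0.4, -0.5]]
      for _ in range(5):
          x = lv_step(x, r, A)
          print([round(v, 4) for v in x])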

  11. Exact reconstruction with directional wavelets on the sphere

    NASA Astrophysics Data System (ADS)

    Wiaux, Y.; McEwen, J. D.; Vandergheynst, P.; Blanc, O.

    2008-08-01

    A new formalism is derived for the analysis and exact reconstruction of band-limited signals on the sphere with directional wavelets. It represents an evolution of the wavelet formalism previously developed by Antoine & Vandergheynst and Wiaux et al. The translations of the wavelets at any point on the sphere and their proper rotations are still defined through the continuous three-dimensional rotations. The dilations of the wavelets are directly defined in harmonic space through a new kernel dilation, which is a modification of an existing harmonic dilation. A family of factorized steerable functions with compact harmonic support which are suitable for this kernel dilation are first identified. A scale-discretized wavelet formalism is then derived, relying on this dilation. The discrete nature of the analysis scales allows the exact reconstruction of band-limited signals. A corresponding exact multi-resolution algorithm is finally described and an implementation is tested. The formalism is of interest notably for the denoising or the deconvolution of signals on the sphere with a sparse expansion in wavelets. In astrophysics, it finds a particular application for the identification of localized directional features in the cosmic microwave background data, such as the imprint of topological defects, in particular, cosmic strings, and for their reconstruction after separation from the other signal components.

  12. A Formal Methods Approach to the Analysis of Mode Confusion

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Miller, Steven P.; Potts, James N.; Carreno, Victor A.

    2004-01-01

    The goal of the new NASA Aviation Safety Program (AvSP) is to reduce the civil aviation fatal accident rate by 80% in ten years and 90% in twenty years. This program is being driven by the accident data with a focus on the most recent history. Pilot error is the most commonly cited cause for fatal accidents (up to 70%) and obviously must be given major consideration in this program. While the greatest source of pilot error is the loss of situation awareness, mode confusion is increasingly becoming a major contributor as well. The January 30, 1995 issue of Aviation Week lists 184 incidents and accidents involving mode awareness, including the Bangalore A320 crash 2/14/90, the Strasbourg A320 crash 1/20/92, the Mulhouse-Habsheim A320 crash 6/26/88, and the Toulouse A330 crash 6/30/94. These incidents and accidents reveal that pilots sometimes become confused about what the cockpit automation is doing. Consequently, human factors research is an obvious investment area. However, even a cursory look at the accident data reveals that the mode confusion problem is much deeper than just training deficiencies and a lack of human-oriented design. This is readily acknowledged by human factors experts. It seems that further progress in human factors must come through a deeper scrutiny of the internals of the automation. It is in this arena that formal methods can contribute. Formal methods refers to the use of techniques from logic and discrete mathematics in the specification, design, and verification of computer systems, both hardware and software. The fundamental goal of formal methods is to capture requirements, designs and implementations in a mathematically based model that can be analyzed in a rigorous manner. Research in formal methods is aimed at automating this analysis as much as possible. By capturing the internal behavior of a flight deck in a rigorous and detailed formal model, the dark corners of a design can be analyzed. This paper will explore how formal models and analyses can be used to help eliminate mode confusion from flight deck designs and at the same time increase our confidence in the safety of the implementation. The paper is based upon interim results from a new project involving NASA Langley and Rockwell Collins in applying formal methods to a realistic business jet Flight Guidance System (FGS).

  13. Safety Verification of the Small Aircraft Transportation System Concept of Operations

    NASA Technical Reports Server (NTRS)

    Carreno, Victor; Munoz, Cesar

    2005-01-01

    A critical factor in the adoption of any new aeronautical technology or concept of operation is safety. Traditionally, safety is accomplished through a rigorous process that involves human factors, low and high fidelity simulations, and flight experiments. As this process is usually performed on final products or functional prototypes, concept modifications resulting from this process are very expensive to implement. This paper describes an approach to system safety that can take place at early stages of a concept design. It is based on a set of mathematical techniques and tools known as formal methods. In contrast to testing and simulation, formal methods provide the capability of exhaustive state exploration analysis. We present the safety analysis and verification performed for the Small Aircraft Transportation System (SATS) Concept of Operations (ConOps). The concept of operations is modeled using discrete and hybrid mathematical models. These models are then analyzed using formal methods. The objective of the analysis is to show, in a mathematical framework, that the concept of operation complies with a set of safety requirements. It is also shown that the ConOps has some desirable characteristics such as liveness and absence of deadlock. The analysis and verification are performed in the Prototype Verification System (PVS), which is a computer-based specification language and a theorem proving assistant.

  14. Analyzing the influence of institutions on health policy development in Uganda: a case study of the decision to abolish user fees.

    PubMed

    Moat, K A; Abelson, J

    2011-12-01

    During the 2001 election campaign, President Yoweri Museveni announced he was abolishing user fees for health services in Uganda. No analysis has been carried out to explain how he was able to initiate such an important policy decision without encountering any immediate barriers. To explain this outcome through in-depth policy analysis driven by the application of key analytical frameworks. An explanatory case study informed by analytical frameworks from the institutionalism literature was undertaken. Multiple data sources were used including: academic literature, key government documents, grey literature, and a variety of print media. According to the analytical frameworks employed, several formal institutional constraints existed that would have reduced the prospects for the abolition of user fees. However, prevalent informal institutions such as "Big Man" presidentialism and clientelism that were both 'competing' and 'complementary' can be used to explain the policy outcome. The analysis suggests that these factors trumped the impact of more formal institutional structures in the Ugandan context. Consideration should be given to the interactions between formal and informal institutions in the analysis of health policy processes in Uganda, as they provide a more nuanced understanding of how each set of factors influence policy outcomes.

  15. Critical Analysis of the Mathematical Formalism of Theoretical Physics. V. Foundations of the Theory of Negative Numbers

    NASA Astrophysics Data System (ADS)

    Kalanov, Temur Z.

    2015-04-01

    Analysis of the foundations of the theory of negative numbers is proposed. The unity of formal logic and of rational dialectics is the methodological basis of the analysis. The statement of the problem is as follows. As is known, point O in the Cartesian coordinate system XOY determines the position of zero on the scale. The number ``zero'' belongs to both the scale of positive numbers and the scale of negative numbers. In this case, the following formal-logical contradiction arises: the number 0 is both a positive number and a negative number; or, equivalently, the number 0 is neither a positive number nor a negative number, i.e. the number 0 has no sign. Then the following question arises: Do negative numbers exist in science and practice? A detailed analysis of the problem shows that negative numbers do not exist, because the foundations of the theory of negative numbers are contrary to the formal-logical laws. It is proved that: (a) all numbers have no signs; (b) the concepts ``negative number'' and ``negative sign of number'' represent a formal-logical error; (c) the signs ``plus'' and ``minus'' are only symbols of mathematical operations. The logical errors determine the essence of the theory of negative numbers: the theory of negative numbers is a false theory.

  16. On the Formal-Logical Analysis of the Foundations of Mathematics Applied to Problems in Physics

    NASA Astrophysics Data System (ADS)

    Kalanov, Temur Z.

    2016-03-01

    Analysis of the foundations of mathematics applied to problems in physics is proposed. The unity of formal logic and of rational dialectics is the methodological basis of the analysis. It is shown that critical analysis of the concept of mathematical quantity - the central concept of mathematics - leads to the following conclusion: (1) The concept of ``mathematical quantity'' is the result of the following mental operations: (a) abstraction of the ``quantitative determinacy of physical quantity'' from the ``physical quantity'', whereby the ``quantitative determinacy of physical quantity'' becomes an independent object of thought; (b) abstraction of the ``amount (i.e., abstract number)'' from the ``quantitative determinacy of physical quantity'', whereby the ``amount (i.e., abstract number)'' becomes an independent object of thought. In this case, unnamed, abstract numbers are the only sign of the ``mathematical quantity''. This sign is not an essential sign of material objects. (2) The concept of mathematical quantity is a meaningless, erroneous, and inadmissible concept in science because it represents the following formal-logical and dialectical-materialistic error: negation of the existence of the essential sign of the concept (i.e., negation of the existence of the essence of the concept) and negation of the existence of a measure of a material object.

  17. Investigating Actuation Force Fight with Asynchronous and Synchronous Redundancy Management Techniques

    NASA Technical Reports Server (NTRS)

    Hall, Brendan; Driscoll, Kevin; Schweiker, Kevin; Dutertre, Bruno

    2013-01-01

    Within distributed fault-tolerant systems the term force-fight is colloquially used to describe the level of command disagreement present at redundant actuation interfaces. This report details an investigation of force-fight using three distributed system case-study architectures. Each case study architecture is abstracted and formally modeled using the Symbolic Analysis Laboratory (SAL) tool chain from the Stanford Research Institute (SRI). We use the formal SAL models to produce k-induction based proofs of a bounded actuation agreement property. We also present a mathematically derived bound of redundant actuation agreement for sine-wave stimulus. The report documents our experiences and lessons learned developing the formal models and the associated proofs.
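
    A first-order version of the sine-wave observation can be sketched as follows; this toy is not the report's derivation, and the numbers are illustrative. Two redundant channels that sample the same command A*sin(w*t) with a relative time skew of at most dt can disagree by no more than the command's maximum slew rate times the skew, A*w*dt.

      import math

      A, w, dt = 1.0, 2 * math.pi * 1.0, 0.005  # 1 Hz command, 5 ms channel skew

      # Brute-force the worst-case disagreement over one period and compare it
      # with the first-order slew bound A*w*dt.
      worst = max(abs(A * math.sin(w * t) - A * math.sin(w * (t + dt)))
                  for t in (i / 10000 for i in range(10000)))
      print(f"simulated worst-case force fight: {worst:.4f}")
      print(f"first-order bound A*w*dt:         {A * w * dt:.4f}")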

  18. Two formalisms, one renormalized stress-energy tensor

    NASA Astrophysics Data System (ADS)

    Barceló, C.; Carballo, R.; Garay, L. J.

    2012-04-01

    We explicitly compare the structure of the renormalized stress-energy tensor of a massless scalar field in a (1+1) curved spacetime as obtained by two different strategies: normal-mode construction of the field operator and one-loop effective action. We pay special attention to where and how the information related to the choice of vacuum state in both formalisms is encoded. By establishing a clear translation map between both procedures, we show that these two potentially different renormalized stress-energy tensors are actually equal, when using vacuum-state choices related by this map. One specific aim of the analysis is to facilitate the comparison of results regarding semiclassical effects in gravitational collapse as obtained within these different formalisms.

  19. ADM Analysis of gravity models within the framework of bimetric variational formalism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Golovnev, Alexey; Karčiauskas, Mindaugas; Nyrhinen, Hannu J., E-mail: agolovnev@yandex.ru, E-mail: mindaugas.karciauskas@helsinki.fi, E-mail: hannu.nyrhinen@helsinki.fi

    2015-05-01

    Bimetric variational formalism was recently employed to construct novel bimetric gravity models. In these models an affine connection is generated by an additional tensor field which is independent of the physical metric. In this work we demonstrate how the ADM decomposition can be applied to study such models and provide some technical intermediate details. Using the ADM decomposition we are able to prove that a linear model is unstable, as has previously been indicated by perturbative analysis. Moreover, we show that it is also very difficult, if not impossible, to construct a non-linear model which is ghost-free within the framework of bimetric variational formalism. However, we demonstrate that viable models are possible along similar lines of thought. To this end, we consider a setup in which the affine connection is a variation of the Levi-Civita one. As a proof of principle we construct a gravity model with a massless scalar field obtained this way.

  20. Dependability modeling and assessment in UML-based software development.

    PubMed

    Bernardi, Simona; Merseguer, José; Petriu, Dorina C

    2012-01-01

    Assessment of software nonfunctional properties (NFP) is an important problem in software development. In the context of model-driven development, an emerging approach for the analysis of different NFPs consists of the following steps: (a) to extend the software models with annotations describing the NFP of interest; (b) to transform automatically the annotated software model to the formalism chosen for NFP analysis; (c) to analyze the formal model using existing solvers; (d) to assess the software based on the results and give feedback to designers. Such a modeling→analysis→assessment approach can be applied to any software modeling language, be it general purpose or domain specific. In this paper, we focus on UML-based development and on the dependability NFP, which encompasses reliability, availability, safety, integrity, and maintainability. The paper presents the profile used to extend UML with dependability information, the model transformation to generate a DSPN formal model, and the assessment of the system properties based on the DSPN results.

  1. Searching for ethical leadership in nursing.

    PubMed

    Makaroff, Kara Schick; Storch, Janet; Pauly, Bernie; Newton, Lorelei

    2014-09-01

    Attention to ethical leadership in nursing has diminished over the past several decades. The aim of our study was to investigate how frontline nurses and formal nurse leaders envision ethical nursing leadership. Meta-ethnography was used to guide our analysis and synthesis of four studies that explored the notion of ethical nursing leadership. These four original studies were conducted from 1999-2008 in Canada with 601 participants. Ethical approval from the original studies covered future analysis. Using the analytic strategy of lines-of-argument, we found that 1) ethical nursing leadership must be responsive to practitioners and to the contextual system in which they and formal nurse leaders work, and 2) ethical nursing leadership requires receiving and providing support to increase the capacity to practice and discuss ethics in the day-to-day. Formal nurse leaders play a critical, yet often neglected role, in providing ethical leadership and supporting ethical nursing practice at the point of patient care. © The Author(s) 2014.

  2. Dependability Modeling and Assessment in UML-Based Software Development

    PubMed Central

    Bernardi, Simona; Merseguer, José; Petriu, Dorina C.

    2012-01-01

    Assessment of software nonfunctional properties (NFP) is an important problem in software development. In the context of model-driven development, an emerging approach for the analysis of different NFPs consists of the following steps: (a) to extend the software models with annotations describing the NFP of interest; (b) to transform automatically the annotated software model to the formalism chosen for NFP analysis; (c) to analyze the formal model using existing solvers; (d) to assess the software based on the results and give feedback to designers. Such a modeling→analysis→assessment approach can be applied to any software modeling language, be it general purpose or domain specific. In this paper, we focus on UML-based development and on the dependability NFP, which encompasses reliability, availability, safety, integrity, and maintainability. The paper presents the profile used to extend UML with dependability information, the model transformation to generate a DSPN formal model, and the assessment of the system properties based on the DSPN results. PMID:22988428

  3. Assessing Requirements Quality through Requirements Coverage

    NASA Technical Reports Server (NTRS)

    Rajan, Ajitha; Heimdahl, Mats; Woodham, Kurt

    2008-01-01

    In model-based development, the development effort is centered around a formal description of the proposed software system: the model. This model is derived from some high-level requirements describing the expected behavior of the software. For validation and verification purposes, this model can then be subjected to various types of analysis, for example, completeness and consistency analysis [6], model checking [3], theorem proving [1], and test-case generation [4, 7]. This development paradigm is making rapid inroads in certain industries, e.g., automotive, avionics, space applications, and medical technology. This shift towards model-based development naturally leads to changes in the verification and validation (V&V) process. The model validation problem, determining that the model accurately captures the customer's high-level requirements, has received little attention and the sufficiency of the validation activities has been largely determined through ad-hoc methods. Since the model serves as the central artifact, its correctness with respect to the user's needs is absolutely crucial. In our investigation, we attempt to answer the following two questions with respect to validation: (1) Are the requirements sufficiently defined for the system? and (2) How well does the model implement the behaviors specified by the requirements? The second question can be addressed using formal verification. Nevertheless, the size and complexity of many industrial systems make formal verification infeasible even if we have a formal model and formalized requirements. Thus, presently, there is no objective way of answering these two questions. To this end, we propose an approach based on testing that, when given a set of formal requirements, explores the relationship between requirements-based structural test-adequacy coverage and model-based structural test-adequacy coverage. The proposed technique uses requirements coverage metrics defined in [9] on formal high-level software requirements and existing model coverage metrics such as the Modified Condition and Decision Coverage (MC/DC) used when testing highly critical software in the avionics industry [8]. Our work is related to Chockler et al. [2], but we base our work on traditional testing techniques as opposed to verification techniques.
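
    Since MC/DC is central to the coverage metrics mentioned above, a tiny sketch may help: MC/DC requires that every condition in a decision be shown to independently affect the decision's outcome. The decision below is an arbitrary example, not one from the paper.

      from itertools import product

      def decision(a, b, c):
          return a and (b or c)

      # For each condition, search for an "independence pair": two test vectors
      # that differ only in that condition and flip the decision's outcome.
      for i, name in enumerate("abc"):
          pairs = []
          for v in product([False, True], repeat=3):
              w = list(v)
              w[i] = not w[i]
              if decision(*v) != decision(*w):
                  pairs.append((v, tuple(w)))
          print(name, "independence pair:", pairs[0])  # one pair per condition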

  4. Heuristics structure and pervade formal risk assessment.

    PubMed

    MacGillivray, Brian H

    2014-04-01

    Lay perceptions of risk appear rooted more in heuristics than in reason. A major concern of the risk regulation literature is that such "error-strewn" perceptions may be replicated in policy, as governments respond to the (mis)fears of the citizenry. This has led many to advocate a relatively technocratic approach to regulating risk, characterized by high reliance on formal risk and cost-benefit analysis. However, through two studies of chemicals regulation, we show that the formal assessment of risk is pervaded by its own set of heuristics. These include rules to categorize potential threats, define what constitutes valid data, guide causal inference, and select and apply formal models. Some of these heuristics lay claim to theoretical or empirical justifications, others are more back-of-the-envelope calculations, while still others purport not to reflect some truth but simply to constrain discretion or perform a desk-clearing function. These heuristics can be understood as a way of authenticating or formalizing risk assessment as a scientific practice, representing a series of rules for bounding problems, collecting data, and interpreting evidence (a methodology). Heuristics are indispensable elements of induction. And so they are not problematic per se, but they can become so when treated as laws rather than as contingent and provisional rules. Pitfalls include the potential for systematic error, masking uncertainties, strategic manipulation, and entrenchment. Our central claim is that by studying the rules of risk assessment qua rules, we develop a novel representation of the methods, conventions, and biases of the prior art. © 2013 Society for Risk Analysis.

  5. The redefinition of the familialist home care model in France: the complex formalization of care through cash payment.

    PubMed

    Le Bihan, Blanche

    2012-05-01

    This article investigates the impact of policy measures on the organisation of home-based care for older people in France, by examining the balance between formal and informal care and the redefinition of the initial familialist model. It focuses on the specific cash-for-care scheme (the Allocation personnalisée d'autonomie - Personalised allowance for autonomy) which is at the core of the French home-based care policy. The author argues that in a redefined context of 'welfare mix', the French public strategy for supporting home-based care in France is articulated around two major objectives, which can appear contradictory. It aims to formalise a professional care sector, with respect to the employment policy, while allowing the development of new forms of informal care, which cannot be considered to be formal employment. The data collection is two-fold. Firstly, a detailed analysis was made of different policy documents and public reports, together with a systematic review of existing studies. Secondly, statistical data on home-based care resources were collected, which was not easy, as home-care services for older people in France are part of a larger sector of activity, 'personal services' (services à la personne). The article presents three main findings. First, it highlights the complexity of the formalisation process related to the introduction of the French care allowance and demonstrates that formalisation, which facilitates the recognition of care as work, does not necessarily mean professionalisation. Second, it outlines the diversity of the resources available: heterogeneous professional care, semi-formal forms of care work with the possibility to employ a relative, and informal family care. Finally, the analysis outlines the importance of the regulation of cash payments on the reshaping of formal and informal care and comments on its impact on the redefinition of informal caring activities. © 2012 Blackwell Publishing Ltd.

  6. Results with OECD recommended positive control sensitizers in the maximization, Buehler and local lymph node assays.

    PubMed

    Basketter, D A; Selbie, E; Scholes, E W; Lees, D; Kimber, I; Botham, P A

    1993-01-01

    The guinea pig maximization test and the Buehler occluded patch test are used widely to identify the sensitization potential of new chemicals. This information enables toxicologists and/or regulatory authorities to determine whether a chemical should be classified formally as a skin sensitizer. Both to improve and to harmonize these assessments internationally, the OECD has recently recommended that moderate rather than strong contact sensitizers be used as positive control substances. The purpose is to ensure an adequate level of sensitivity in sensitization assays performed at specific testing establishments. Results from two laboratories reported here show that the minimum acceptable standard laid down by the OECD can be achieved and indeed commonly exceeded by a substantial margin. Furthermore, results with these positive controls in a new method, the local lymph node assay, also appear to satisfy similar criteria, suggesting that results from this assay, including negative data, should be acceptable for classification purposes. However, a review of the way in which results with new chemicals will be interpreted for regulatory purposes, in the context of positive control data, reveals that considerable inadequacies still exist. It is recommended that, ultimately, sensitization data can only be interpreted meaningfully (i.e. to protect humans from sensitization hazards) by considering the potency of the contact allergen in the context of the sensitivity of the assay performed at the particular testing institution.

  7. Sensitivity analysis for linear structural equation models, longitudinal mediation with latent growth models and blended learning in biostatistics education

    NASA Astrophysics Data System (ADS)

    Sullivan, Adam John

    In chapter 1, we consider the biases that may arise when an unmeasured confounder is omitted from a structural equation model (SEM) and sensitivity analysis techniques to correct for such biases. We give an analysis of which effects in an SEM are and are not biased by an unmeasured confounder. It is shown that a single unmeasured confounder will bias not just one but numerous effects in an SEM. We present sensitivity analysis techniques to correct for biases in total, direct, and indirect effects when using SEM analyses, and illustrate these techniques with a study of aging and cognitive function. In chapter 2, we consider longitudinal mediation with latent growth curves. We define the direct and indirect effects using counterfactuals and consider the assumptions needed for identifiability of those effects. We develop models with a binary treatment/exposure followed by a model where treatment/exposure changes with time, allowing for treatment/exposure-mediator interaction. We thus formalize mediation analysis with latent growth curve models using counterfactuals, making clear the assumptions and extending these methods to allow for exposure-mediator interactions. We present and illustrate the techniques with a study on Multiple Sclerosis (MS) and depression. In chapter 3, we report on a pilot study in blended learning that took place during the Fall 2013 and Summer 2014 semesters here at Harvard. We blended the traditional BIO 200: Principles of Biostatistics and created ID 200: Principles of Biostatistics and Epidemiology. We used materials from the edX course PH207x: Health in Numbers: Quantitative Methods in Clinical & Public Health Research as a video textbook, with students watching a given number of these videos prior to class. Using surveys as well as exam data, we informally assess these blended classes from the students' perspective and compare these students with students in another course, BIO 201: Introduction to Statistical Methods, in Fall 2013, as well as with students from BIO 200 in the Fall semesters of 1992 and 1993. We then suggest improvements upon our original course designs and follow up with an informal look at how these implemented changes affected the second offering of the newly blended ID 200 in Summer 2014.
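
    To indicate the flavor of such corrections (a standard linear-model result, stated here as background rather than as the chapter's exact formulas): if an omitted confounder U increases the outcome by γ per unit and differs on average by δ between exposure groups, the estimated effect carries a bias of γδ, suggesting the adjustment

      \hat{\beta}_{\text{adjusted}} = \hat{\beta}_{\text{unadjusted}} - \gamma\,\delta .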

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aartsen, M.G.; Abraham, K.; Ackermann, M.

    We present an improved event-level likelihood formalism for including neutrino telescope data in global fits to new physics. We derive limits on spin-dependent dark matter-proton scattering by employing the new formalism in a re-analysis of data from the 79-string IceCube search for dark matter annihilation in the Sun, including explicit energy information for each event. The new analysis excludes a number of models in the weak-scale minimal supersymmetric standard model (MSSM) for the first time. This work is accompanied by the public release of the 79-string IceCube data, as well as an associated computer code for applying the new likelihood to arbitrary dark matter models.

  9. Fitting Higgs data with nonlinear effective theory.

    PubMed

    Buchalla, G; Catà, O; Celis, A; Krause, C

    2016-01-01

    In a recent paper we showed that the electroweak chiral Lagrangian at leading order is equivalent to the conventional κ formalism used by ATLAS and CMS to test Higgs anomalous couplings. Here we apply this fact to fit the latest Higgs data. The new aspect of our analysis is a systematic interpretation of the fit parameters within an EFT. Concentrating on the processes of Higgs production and decay that have been measured so far, six parameters turn out to be relevant: [Formula: see text], [Formula: see text], [Formula: see text], [Formula: see text], [Formula: see text], [Formula: see text]. A global Bayesian fit is then performed with the result [Formula: see text], [Formula: see text], [Formula: see text], [Formula: see text], [Formula: see text], [Formula: see text]. Additionally, we show how this leading-order parametrization can be generalized to next-to-leading order, thus improving the κ formalism systematically. The differences with a linear EFT analysis including operators of dimension six are also discussed. One of the main conclusions of our analysis is that since the conventional κ formalism can be properly justified within a QFT framework, it should continue to play a central role in analyzing and interpreting Higgs data.

  10. A Framework for Analysis of Research Risks and Benefits to Participants in Standard of Care Pragmatic Clinical Trials

    PubMed Central

    Chen, Stephanie C; Kim, Scott Y H

    2016-01-01

    Background/Aims Standard of care pragmatic clinical trials (SCPCTs) that compare treatments already in use could improve care and reduce cost but there is considerable debate about the research risks of SCPCTs and how to apply informed consent regulations to such trials. We sought to develop a framework integrating the insights from opposing sides of the debate. Methods We developed a formal risk-benefit analysis framework for SCPCTs and then applied it to key provisions of the U.S. federal regulations. Results Our formal framework for SCPCT risk-benefit analysis takes into account three key considerations: the ex ante estimates of risks and benefits of the treatments to be compared in a SCPCT, the allocation ratios of treatments inside and outside a SCPCT, and the significance of some participants receiving a different treatment inside a SCPCT than outside the trial. The framework provides practical guidance on how the research ethics regulations on informed consent should be applied to SCPCTs. Conclusions Our proposed formal model makes explicit the relationship between the concepts used by opposing sides of the debate about the research risks of SCPCTs and can be used to clarify the implications for informed consent. PMID:27365010
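
    The three considerations can be combined in a toy expected-risk calculation; the function and every number below are hypothetical illustrations, not the authors' model.

      # Ex ante expected risk under an allocation that gives treatment A with
      # probability p_a and treatment B otherwise. risk_a and risk_b are the
      # ex ante per-patient risk estimates for the two standard-of-care options.
      def expected_risk(p_a, risk_a, risk_b):
          return p_a * risk_a + (1 - p_a) * risk_b

      risk_a, risk_b = 0.010, 0.014      # hypothetical ex ante risks
      p_trial, p_usual = 0.5, 0.8        # 1:1 inside the SCPCT; 80% get A outside

      inside = expected_risk(p_trial, risk_a, risk_b)
      outside = expected_risk(p_usual, risk_a, risk_b)
      # The incremental research risk of enrolling is the difference, which also
      # reflects that some participants receive a different treatment inside.
      print(f"inside: {inside:.4f}  outside: {outside:.4f}  delta: {inside - outside:+.4f}")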

  11. The Portuguese formal social support for autonomy and dependence in pain inventory (FSSADI_PAIN): a preliminary validation study.

    PubMed

    Matos, Marta; Bernardes, Sónia F

    2013-09-01

    Development and preliminary validation of a Portuguese measure of perceived Formal Social Support for Autonomy and Dependence in Pain (FSSADI_PAIN). One hundred and fifty-one older adults (88.1% women), between 56 and 94 years of age (M = 75.41; SD = 9.11), who attended one of the following institutions--day care centre (33.1%), nursing home (36.4%) and senior university (30.5%)--were recruited for this study. Along with the FSSADI_PAIN, participants filled out the Portuguese versions of the Brief Pain Inventory (Azevedo et al., 2007, Dor, 15, 6) and the Social Support Scale of Medical Outcomes Survey (Pais-Ribeiro & Ponte, 2009, Psicologia, Saúde & Doença, 10, 163). The factorial structure reflected the functions of perceived promotion of (1) dependence and (2) autonomy, showing good internal consistency (α > .70) and sensitivity indices. The FSSADI_PAIN showed good content, discriminant and criterion validity; it differentiated the perceptions of promotion of dependence/autonomy according to individual's pain severity and disability, as well as the type of institution. These preliminary findings suggest that the FSSADI_PAIN is an innovative and promising measure of perceived formal social support adapted to pain-related contexts. © 2012 The British Psychological Society.

  12. Determination of absorbed dose to water from a miniature kilovoltage x-ray source using a parallel-plate ionization chamber

    NASA Astrophysics Data System (ADS)

    Watson, Peter G. F.; Popovic, Marija; Seuntjens, Jan

    2018-01-01

    Electronic brachytherapy sources are widely accepted as alternatives to radionuclide-based systems. Yet, formal dosimetry standards for these devices to independently complement the dose protocol provided by the manufacturer are lacking. This article presents a formalism for calculating and independently verifying the absorbed dose to water from a kV x-ray source (The INTRABEAM System) measured in a water phantom with an ionization chamber calibrated in terms of air-kerma. This formalism uses a Monte Carlo (MC) calculated chamber conversion factor, CQ , to convert air-kerma in a reference beam to absorbed dose to water in the measurement beam. In this work CQ was determined for a PTW 34013 parallel-plate ionization chamber. Our results show that CQ was sensitive to the chamber plate separation tolerance, with differences of up to 15%. CQ was also found to have a depth dependence which varied with chamber plate separation (0 to 10% variation for the smallest and largest cavity height, over 3 to 30 mm depth). However for all chamber dimensions investigated, CQ was found to be significantly larger than the manufacturer reported value, suggesting that the manufacturer recommended method of dose calculation could be underestimating the dose to water.
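
    In a generic air-kerma-based notation consistent with the description above (not necessarily the authors' complete formalism), the conversion reads

      D_{w,Q} = M \, N_K \, C_Q ,

    where M is the corrected chamber reading in the measurement beam Q, N_K is the air-kerma calibration coefficient obtained in the reference beam, and C_Q is the Monte Carlo calculated chamber conversion factor whose sensitivity to plate separation and depth is reported above.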

  13. E-SAP: Efficient-Strong Authentication Protocol for Healthcare Applications Using Wireless Medical Sensor Networks

    PubMed Central

    Kumar, Pardeep; Lee, Sang-Gon; Lee, Hoon-Jae

    2012-01-01

    A wireless medical sensor network (WMSN) can sense humans’ physiological signs without sacrificing patient comfort and transmit patient vital signs to health professionals’ hand-held devices. The patient physiological data are highly sensitive and WMSNs are extremely vulnerable to many attacks. Therefore, it must be ensured that patients’ medical signs are not exposed to unauthorized users. Consequently, strong user authentication is the main concern for the success and large scale deployment of WMSNs. In this regard, this paper presents an efficient, strong authentication protocol, named E-SAP, for healthcare application using WMSNs. The proposed E-SAP includes: (1) a two-factor (i.e., password and smartcard) professional authentication; (2) mutual authentication between the professional and the medical sensor; (3) symmetric encryption/decryption for providing message confidentiality; (4) establishment of a secure session key at the end of authentication; and (5) professionals can change their password. Further, the proposed protocol requires three message exchanges between the professional, medical sensor node and gateway node, and achieves efficiency (i.e., low computation and communication cost). Through the formal analysis, security analysis and performance analysis, we demonstrate that E-SAP is more secure against many practical attacks, and allows a tradeoff between the security and the performance cost for healthcare application using WMSNs. PMID:22438729
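
    The listed ingredients (a password-plus-smartcard factor, mutual challenge-response, and session-key establishment) can be sketched generically as follows. This toy is not the E-SAP message flow; the key derivations and labels here are assumptions made for illustration.

      import hashlib, hmac, os

      def kdf(*parts):
          # Illustrative key derivation: hash the concatenated inputs.
          return hashlib.sha256(b"|".join(parts)).digest()

      shared = kdf(b"password", b"smartcard-secret")    # two-factor long-term key
      n_prof, n_sensor = os.urandom(16), os.urandom(16) # fresh nonces (challenges)

      # Each side proves knowledge of the shared key over both nonces; distinct
      # labels keep the two directions of the exchange from being confused.
      tag_prof = hmac.new(shared, n_prof + n_sensor + b"prof", hashlib.sha256).digest()
      expected = hmac.new(shared, n_prof + n_sensor + b"prof", hashlib.sha256).digest()
      assert hmac.compare_digest(tag_prof, expected)    # sensor verifies professional

      # Both ends can now derive the same per-session key for symmetric encryption.
      session_key = kdf(shared, n_prof, n_sensor)
      print("session key established:", session_key.hex()[:16], "...")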

  14. E-SAP: efficient-strong authentication protocol for healthcare applications using wireless medical sensor networks.

    PubMed

    Kumar, Pardeep; Lee, Sang-Gon; Lee, Hoon-Jae

    2012-01-01

    A wireless medical sensor network (WMSN) can sense humans' physiological signs without sacrificing patient comfort and transmit patient vital signs to health professionals' hand-held devices. The patient physiological data are highly sensitive and WMSNs are extremely vulnerable to many attacks. Therefore, it must be ensured that patients' medical signs are not exposed to unauthorized users. Consequently, strong user authentication is the main concern for the success and large scale deployment of WMSNs. In this regard, this paper presents an efficient, strong authentication protocol, named E-SAP, for healthcare application using WMSNs. The proposed E-SAP includes: (1) a two-factor (i.e., password and smartcard) professional authentication; (2) mutual authentication between the professional and the medical sensor; (3) symmetric encryption/decryption for providing message confidentiality; (4) establishment of a secure session key at the end of authentication; and (5) professionals can change their password. Further, the proposed protocol requires three message exchanges between the professional, medical sensor node and gateway node, and achieves efficiency (i.e., low computation and communication cost). Through the formal analysis, security analysis and performance analysis, we demonstrate that E-SAP is more secure against many practical attacks, and allows a tradeoff between the security and the performance cost for healthcare application using WMSNs.

  15. Crowdsourced Formal Verification: A Business Case Analysis Toward a Human-Centered Business Model

    DTIC Science & Technology

    2015-06-01

    literacycampaignmc.org/wp-content/uploads/2011/11/Compressed-State-of-Literacy-MC1.pdf Ryan, R. M., & Deci, E. L. (2000). Self-determination theory and the...crowd-sourced formal verification games provide intrinsic motivation. Ryan and Deci (2000) summarized three needs that drive the intrinsic motivation...competence, relatedness, and autonomy. Therefore, such games have to embrace the self-determination of the customers. Games, per se, can satisfy

  16. Renormalization in Coulomb-gauge QCD within the Lagrangian formalism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Niegawa, A.

    2006-08-15

    We study renormalization of Coulomb-gauge QCD within the Lagrangian, second-order, formalism. We derive a Ward identity and the Zinn-Justin equation, and, with the help of the latter, we give a proof of algebraic renormalizability of the theory. Through diagrammatic analysis, we show that, in the strict Coulomb gauge, g²D⁰⁰ is invariant under renormalization. (D⁰⁰ is the time-time component of the gluon propagator.)

  17. The COUNSELOR Project: Understanding Legal Argument.

    DTIC Science & Technology

    1986-01-01

    utilize is one presented by Stephen Toulmin [Toulmin 58]. The Toulmin model is one of the most widely accepted formalizations in existence as it is...provides allows analysis and criticism of propositions to occur at several levels. Argument, as seen by Toulmin, is defined as "movement from...optional pieces of the Toulmin model. It is these features that allow the model a great deal of flexibility and give it advantages over other formalisms

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pilat, Joseph F

    The application of the methodology developed by the GenIV International Forum's (GIF's) Proliferation Resistance and Physical Protection (PR&PP) Working Group is an expert elicitation. Although the framework of the methodology is structured and systematic, it does not by itself constitute or require a formal elicitation. However, formal elicitation can be utilized in the PR&PP context to provide a systematic, credible and transparent qualitative analysis and develop input for quantitative analyses. This section provides an overview of expert elicitations, a discussion of the role formal expert elicitations can play in the PR&PP methodology, an outline of the formal expert elicitation process and a brief practical guide to conducting formal expert elicitations. Expert elicitation is a process utilizing knowledgeable people in cases, for example, when an assessment is needed but physically based data is absent or open to interpretation. More specifically, it can be used to: (1) predict future events; (2) provide estimates on new, rare, complex or poorly understood phenomena; (3) integrate or interpret existing information; or (4) determine what is currently known, how well it is known or what is worth learning in a field. Expert elicitation can be informal or formal. The informal application of expert judgment is frequently used. Although it can produce good results, it often provides demonstrably biased or otherwise flawed answers to problems. This, along with the absence of transparency, can result in a loss of confidence when experts speak on issues. More formal expert elicitation is a structured process that makes use of people knowledgeable in certain areas to make assessments. The reason for advocating formal use is that the quality and accuracy of expert judgment comes from the completeness of the expert's understanding of the phenomena and the process used to elicit and analyze the data. The use of a more formal process to obtain, understand and analyze expert judgment has led to an improved acceptance of expert judgment because of the rigor and transparency of the results.

  19. Detection limits of quantitative and digital PCR assays and their influence in presence-absence surveys of environmental DNA

    USGS Publications Warehouse

    Hunter, Margaret; Dorazio, Robert M.; Butterfield, John S.; Meigs-Friend, Gaia; Nico, Leo; Ferrante, Jason A.

    2017-01-01

    A set of universal guidelines is needed to determine the limit of detection (LOD) in PCR-based analyses of low-concentration DNA. In particular, environmental DNA (eDNA) studies require sensitive and reliable methods to detect rare and cryptic species through shed genetic material in environmental samples. Current strategies for assessing detection limits of eDNA are either too stringent or subjective, possibly resulting in biased estimates of species’ presence. Here, a conservative LOD analysis grounded in analytical chemistry is proposed to correct for overestimated DNA concentrations predominantly caused by the concentration plateau, a nonlinear relationship between expected and measured DNA concentrations. We have used statistical criteria to establish formal mathematical models for both quantitative and droplet digital PCR. To assess the method, a new Grass Carp (Ctenopharyngodon idella) TaqMan assay was developed and tested on both PCR platforms using eDNA in water samples. The LOD adjustment reduced Grass Carp occupancy and detection estimates while increasing uncertainty, indicating that caution needs to be applied to eDNA data without LOD correction. Compared to quantitative PCR, digital PCR had higher occurrence estimates due to increased sensitivity and dilution of inhibitors at low concentrations. Without accurate LOD correction, species occurrence and detection probabilities based on eDNA estimates are prone to a source of bias that cannot be reduced by an increase in sample size or PCR replicates. Other applications, such as GMO food analysis and forensic and clinical diagnostics, could also benefit from a standardized LOD.
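
    As an illustration of one common replicate-based LOD criterion (the lowest standard concentration at which at least 95% of PCR replicates amplify), the sketch below uses invented dilution-series counts; the formal statistical models developed in the paper are more elaborate than this discrete rule.

        # Minimal replicate-based limit of detection (LOD): the lowest standard
        # concentration at which >= 95% of PCR replicates amplify.
        # Dilution-series counts below are made up for illustration.
        standards = {          # copies/reaction -> (positive replicates, total replicates)
            1:   (2, 24),
            5:   (14, 24),
            10:  (23, 24),
            100: (24, 24),
        }

        def discrete_lod(data, threshold=0.95):
            detected = sorted(c for c, (pos, n) in data.items() if pos / n >= threshold)
            return detected[0] if detected else None

        print(discrete_lod(standards))   # -> 10 copies/reaction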

  20. A Case Study of Manned Strategic Bomber Acquisition: The B-70 Valkyrie

    DTIC Science & Technology

    1990-09-01

    Approved for public release. DEPARTMENT OF THE AIR FORCE, AIR UNIVERSITY, AIR FORCE INSTITUTE OF TECHNOLOGY, THESIS, Wright-Patterson Air...found that covered the political history in a thorough, chronological manner. Additionally, because of security classification or other sensitivities...discussions. The first step in securing the interview was a formal, written request from the AFIT/LS Dean, shown in Appendix A. Any further steps in the

  1. Point design targets, specifications, and requirements for the 2010 NIF ignition campaign

    NASA Astrophysics Data System (ADS)

    Haan, Steven

    2010-11-01

    A set of point design targets has been specified for the initial ignition campaign on the National Ignition Facility [G. Miller, E. Moses, and C. Wuest, Opt. Eng. 43, 2841 (2004)]. The targets use an ablator of either Be(Cu) or CH(Ge). They are imploded in a U or Au hohlraum at a peak radiation temperature of 270 to 300 eV. Considerations determining the point design include laser-plasma interactions, hydrodynamic stability, laser operations, and target fabrication. Simulations were used to evaluate choices, to define requirements, and to estimate sensitivity to uncertainties. Designs were updated to account for 2009 experimental results. We describe a formalism to evaluate the margin for ignition, captured in a parameter called the Ignition Threshold Factor (ITF). Uncertainty and shot-to-shot variability can be evaluated, as well as sensitivity to systematic uncertainties. The formalism is used to estimate the probability of ignition for each target. In collaboration with J Lindl, D Callahan, D Clark, J Salmonson, B Hammel, L Atherton, R Cook, J Edwards, S Glenzer, A Hamza, S Hatchett, D Hinkel, D Ho, O Jones, O Landen, B MacGowan, M Marinak, E Moses, D Munro, S Pollaine, B Spears, P Springer, L Suter, C Thomas, R Town, S Weber, D Wilson, G Kyrala, M Herrmann, R Olson, R Vesey, A Nikroo, H Huang, and K Moreno.

  2. PTW-diamond detector: dose rate and particle type dependence.

    PubMed

    Fidanzio, A; Azario, L; Miceli, R; Russo, A; Piermattei, A

    2000-11-01

    In this paper the suitability of a PTW natural diamond detector (DD) for relative and reference dosimetry of photon and electron beams, with dose per pulse between 0.068 mGy and 0.472 mGy, was studied, and the results were compared with those obtained by a stereotactic silicon detector (SFD). The results show that, in the range of examined dose per pulse, the DD sensitivity changes by up to 1.8% while the SFD sensitivity changes by up to 4.5%. The fitting parameter, delta, used to correct the dose-per-pulse dependence of solid-state detectors, was delta = 0.993 ± 0.002 for the diamond detector and delta = 1.025 ± 0.002 for the silicon diode. The delta values were found to be independent of particle type for two conventional beams (a 10 MV x-ray beam and a 21 MeV electron beam). So if delta is determined for one radiotherapy beam, it can be used to correct relative dosimetry for other conventional radiotherapy beams. Moreover, the diamond detector shows a calibration factor which is independent of beam quality and particle type, so an empirical dosimetric formalism is proposed here to obtain the reference dosimetry. This formalism is based on a dose-to-water calibration factor and on an empirical coefficient that takes into account the reading dependence on the dose per pulse.
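
    A worked example of how such a fitting parameter is commonly used, assuming the power-law model in which the detector reading scales as M proportional to (dose per pulse)**delta, so relative sensitivity goes as (dose per pulse)**(delta - 1); the paper's exact formalism may differ in detail. The delta values are the ones quoted above.

        # Dose-per-pulse sensitivity variation under the assumed power-law model
        # M ~ Dp**delta, so relative sensitivity varies as (Dp/Dp_ref)**(delta - 1).
        def relative_sensitivity(dp, dp_ref, delta):
            return (dp / dp_ref) ** (delta - 1.0)

        for name, delta in [("diamond detector", 0.993), ("silicon diode", 1.025)]:
            change = relative_sensitivity(0.472, 0.068, delta) - 1.0
            print(f"{name}: sensitivity change {change:+.1%} over 0.068-0.472 mGy/pulse")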

  3. Effects of user mental state on EEG-BCI performance.

    PubMed

    Myrden, Andrew; Chau, Tom

    2015-01-01

    Changes in psychological state have been proposed as a cause of variation in brain-computer interface performance, but little formal analysis has been conducted to support this hypothesis. In this study, we investigated the effects of three mental states (fatigue, frustration, and attention) on BCI performance. Twelve able-bodied participants were trained to use a two-class EEG-BCI based on the performance of user-specific mental tasks. Following training, participants completed three testing sessions, during which they used the BCI to play a simple maze navigation game while periodically reporting their perceived levels of fatigue, frustration, and attention. Statistical analysis indicated that there is a significant relationship between frustration and BCI performance, while the relationship between fatigue and BCI performance approached significance. BCI performance was 7% lower than average when self-reported fatigue was low and 7% higher than average when self-reported frustration was moderate. A multivariate analysis of mental state revealed the presence of contiguous regions in mental state space where BCI performance was more accurate than average, suggesting the importance of moderate fatigue for achieving effortless focus on BCI control, frustration as a potential motivating factor, and attention as a compensatory mechanism for increasing frustration. Finally, a visual analysis showed the sensitivity of the underlying class distributions to changes in mental state. Collectively, these results indicate that mental state is closely related to BCI performance, encouraging future development of psychologically adaptive BCIs.

  4. Application of BALB/c mouse in the local lymph node assay:BrdU-ELISA for the prediction of the skin sensitizing potential of chemicals.

    PubMed

    Hou, Fenxia; Xing, Caihong; Li, Bin; Cheng, Juan; Chen, Wei; Zhang, Man

    2015-01-01

    Allergic contact dermatitis (ACD) is a skin disease characterized by eczema and itching. A considerable proportion of chemicals induce ACD in humans. More than 10,000 substances should be tested for skin sensitization potential under the Registration, Evaluation, Authorization and Restriction of Chemical substances (REACH) regulation. The Local Lymph Node Assay (LLNA) has been designated as the first-choice in vivo assay for sensitization testing by REACH. The LLNA:BrdU-ELISA is a validated non-radioactive modification of the LLNA. For both the LLNA and the LLNA:BrdU-ELISA, the CBA/JN mouse is the preferred strain recommended in the regulatory guidelines. However, the availability of CBA/JN mice in China is limited to a few animal suppliers, which makes the strain difficult to obtain. The BALB/c mouse, which is widely commercially available, is considered for alternative use, but it can only be used in the assay after it has been evaluated in a formal validation study. Thus, a validation study was conducted in our laboratory to determine whether BALB/c mice could also be used in the LLNA:BrdU-ELISA. The 43 test substances (32 LLNA sensitizers and 11 LLNA non-sensitizers), their vehicles and the concentrations used were the same as those used in the formal validation study of the LLNA:BrdU-ELISA using CBA/JN mice. Female BALB/c mice aged 8-10 weeks were randomly allocated to groups (four mice per group). The test substance (25 μl) or the vehicle alone was applied to the dorsum of both ears daily for 3 consecutive days. A single intraperitoneal injection of 0.5 ml of BrdU (10 mg/ml) solution was given on day 5. On day 6, a pair of auricular lymph nodes from each mouse was excised, weighed and stored at -20°C until BrdU-ELISA was conducted. This validation study of the LLNA:BrdU-ELISA using BALB/c mice correctly identified 30 of 31 sensitizers and 8 of 11 non-sensitizers. The accuracy, sensitivity, specificity, false positive rate, false negative rate, and positive and negative predictivity values in this study, which indicate the performance of the LLNA:BrdU-ELISA using BALB/c mice, did not differ statistically from those of the validation study using CBA/JN mice. This validation study indicates that the BALB/c mouse could be used as an alternative in the LLNA:BrdU-ELISA for the prediction of the skin sensitizing potential of chemicals. Copyright © 2015 Elsevier Inc. All rights reserved.
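
    For concreteness, the headline performance figures can be recomputed directly from the counts reported above (30 of 31 sensitizers and 8 of 11 non-sensitizers correctly identified):

        # Recomputing performance metrics from the counts reported in the abstract.
        tp, fn = 30, 1   # sensitizers: correctly identified / missed
        tn, fp = 8, 3    # non-sensitizers: correctly identified / false positives

        sensitivity = tp / (tp + fn)                    # 0.968
        specificity = tn / (tn + fp)                    # 0.727
        accuracy    = (tp + tn) / (tp + fn + tn + fp)   # 0.905
        ppv         = tp / (tp + fp)                    # 0.909 (positive predictivity)
        npv         = tn / (tn + fn)                    # 0.889 (negative predictivity)

        print(f"sensitivity={sensitivity:.3f}, specificity={specificity:.3f}, "
              f"accuracy={accuracy:.3f}, PPV={ppv:.3f}, NPV={npv:.3f}")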

  5. The relationship between temperamental traits and the level of performance of an eye-hand co-ordination task in jet pilots.

    PubMed

    Biernacki, Marcin; Tarnowski, Adam

    2008-01-01

    When assessing psychological suitability for the profession of a pilot, it is important to consider personality traits and psychomotor abilities. Our study aimed at estimating the role of temperamental traits, as components of pilots' personality, in eye-hand co-ordination. The assumption was that differences in the levels of temperamental traits, as measured with the Formal Characteristics of Behaviour-Temperament Inventory (FCB-TI), would significantly influence eye-hand co-ordination. At the level of general scores, enhanced briskness proved to be the most important trait for eye-hand co-ordination. An analysis of partial scores additionally underlined the importance of sensory sensitivity, endurance and activity. The application of eye-hand co-ordination tasks, which involve energetic and temporal dimensions of performance, helped to disclose the role of biologically based personality traits in psychomotor performance. The implications of these findings for selecting pilots are discussed.

  6. An advanced environment for hybrid modeling of biological systems based on modelica.

    PubMed

    Pross, Sabrina; Bachmann, Bernhard

    2011-01-20

    Biological systems are often very complex, so an appropriate formalism is needed for modeling their behavior. Hybrid Petri Nets, consisting of time-discrete Petri Net elements as well as continuous ones, have proven to be ideal for this task. Therefore, a new Petri Net library was implemented based on the object-oriented modeling language Modelica, which allows the modeling of discrete, stochastic and continuous Petri Net elements by differential, algebraic and discrete equations. An appropriate Modelica tool performs the hybrid simulation with discrete events and the solution of continuous differential equations. A special sub-library contains so-called wrappers for specific reactions to simplify the modeling process. The Modelica models can be connected to Simulink models for parameter optimization, sensitivity analysis and stochastic simulation in Matlab. The present paper illustrates the implementation of the Petri Net component models, their usage within the modeling process and the coupling between the Modelica tool Dymola and Matlab/Simulink. The application is demonstrated by modeling the metabolism of Chinese Hamster Ovary cells.
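
    To make the discrete/continuous coupling concrete, here is a deliberately minimal Python sketch of a hybrid net: one continuous place filled at a constant rate and one discrete transition that fires each time a full unit of marking accumulates. It is illustrative only; the Modelica library described above handles such nets with proper event detection and DAE solvers rather than fixed-step Euler integration.

        # Toy hybrid Petri-net-style loop: continuous filling plus a discrete
        # transition that consumes one unit of marking whenever it is enabled.
        def simulate(rate=0.3, dt=0.01, t_end=20.0):
            level, tokens, fires = 0.0, 0, []
            t = 0.0
            while t < t_end:
                level += rate * dt      # continuous transition: d(level)/dt = rate
                if level >= 1.0:        # discrete transition enabled
                    level -= 1.0
                    tokens += 1
                    fires.append(round(t, 2))
                t += dt
            return tokens, fires[:3]

        print(simulate())   # ~6 firings, roughly one every 1/rate seconds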

  7. Lanthanides caged by the organic chelates; structural properties

    NASA Astrophysics Data System (ADS)

    Smentek, Lidia

    2011-04-01

    The structure, in particular the symmetry, geometry and morphology, of organic chelates coordinated with lanthanide ions is analyzed in the present review. This is the first part of a comprehensive theoretical description of the properties of these systems, which are widely used in technology but, most of all, in molecular biology and medicine. The discussion is focused on the symmetry and geometry of the cages, since these features play a dominant role in the spectroscopic activity of lanthanides caged by organic chelates. At the same time, the spectroscopic properties require a more formal presentation in the language of Racah algebra and deserve a separate analysis. In addition to the parent systems of DOTA, DOTP, EDTMP and CDTMP presented here, their modifications by various antennas are analyzed. The conclusions, which have a strong impact upon the theory of energy transfer and the sensitized luminescence of these systems, are based on the results of numerical density functional theory calculations.

  8. Integrated Evaluation of Reliability and Power Consumption of Wireless Sensor Networks

    PubMed Central

    Dâmaso, Antônio; Maciel, Paulo

    2017-01-01

    Power consumption is a primary concern in Wireless Sensor Networks (WSNs), and a large number of strategies have been proposed to evaluate it. However, those approaches usually consider neither reliability issues nor the power consumption of applications executing in the network. A central concern is the lack of consolidated solutions that enable us to evaluate the power consumption of applications and the network stack while also considering their reliability. To solve this problem, we introduce a fully automatic solution for designing power-consumption-aware WSN applications and communication protocols. The solution presented in this paper comprises a methodology to evaluate power consumption based on the integration of formal models, a set of power consumption and reliability models, a sensitivity analysis strategy to select WSN configurations, and a toolbox named EDEN to fully support the proposed methodology. This solution allows the power consumption of WSN applications and the network stack to be estimated accurately in an automated way. PMID:29113078

  9. Photo-assisted electron emission from illuminated monolayer graphene

    NASA Astrophysics Data System (ADS)

    Upadhyay Kahaly, M.; Misra, Shikha; Mishra, S. K.

    2017-05-01

    We establish a formalism to address co-existing and complementary thermionic and photoelectric emission from a monolayer graphene sheet illuminated by monochromatic laser radiation and operating at a finite temperature. Taking into account the two-dimensional Fermi-Dirac statistics applicable to a graphene sheet, the electron energy redistribution due to thermal agitation via laser irradiation, Fowler's approach to electron emission, and Born's approximation to evaluate the tunneling probability, expressions for the photoelectric and thermionic emission flux have been derived. The cumulative emission flux is observed to be sensitive to the parametric tuning of the laser and material specifications. Based on the parametric analysis, the photoemission flux is found to dominate over the coexisting thermionic emission flux for smaller values of the material work function, surface temperature, and laser wavelength; the analytical estimates are in reasonably good agreement with recent experimental observations [Massicotte et al., Nat. Commun. 7, 12174 (2016)]. The results point to the efficient use of a graphene layer as a photo-thermionic emitter.

  10. THE CAUSAL ANALYSIS / DIAGNOSIS DECISION ...

    EPA Pesticide Factsheets

    CADDIS is an on-line decision support system that helps investigators in the regions, states and tribes find, access, organize, use and share information to produce causal evaluations in aquatic systems. It is based on the US EPA's Stressor Identification process, which is a formal method for identifying causes of impairments in aquatic systems. CADDIS 2007 increases access to relevant information useful for causal analysis and provides methods and tools that practitioners can use to analyze their own data. The new Candidate Cause section provides overviews of commonly encountered causes of impairments to aquatic systems: metals, sediments, nutrients, flow alteration, temperature, ionic strength, and low dissolved oxygen. CADDIS includes new Conceptual Models that illustrate the relationships from sources to stressors to biological effects. An Interactive Conceptual Model for phosphorus links the diagram with supporting literature citations. The new Analyzing Data section helps practitioners analyze their data sets and interpret and use those results as evidence within the USEPA causal assessment process. Downloadable tools include a graphical user interface statistical package (CADStat), programs for use with the freeware R statistical package, and a Microsoft Excel template. These tools can be used to quantify associations between causes and biological impairments using innovative methods such as species-sensitivity distributions and biological inference.
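
    As a pointer to what a species-sensitivity distribution (SSD) involves, the sketch below fits a log-normal SSD to hypothetical per-species toxicity values and reads off the HC5 (the concentration expected to affect 5% of species). The toxicity values are invented for illustration; CADStat provides far richer SSD tooling.

        # Minimal log-normal species-sensitivity distribution and HC5 estimate.
        import numpy as np
        from scipy import stats

        ec50 = np.array([3.1, 7.4, 12.0, 18.5, 26.0, 44.0, 90.0, 150.0])  # mg/L, hypothetical

        # Fit a normal distribution to log-toxicity, then invert at the 5th percentile.
        mu, sigma = np.mean(np.log(ec50)), np.std(np.log(ec50), ddof=1)
        hc5 = np.exp(stats.norm.ppf(0.05, loc=mu, scale=sigma))

        print(f"HC5 ~= {hc5:.2f} mg/L")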

  11. Symbolic discrete event system specification

    NASA Technical Reports Server (NTRS)

    Zeigler, Bernard P.; Chi, Sungdo

    1992-01-01

    Extending discrete event modeling formalisms to facilitate greater symbol manipulation capabilities is important to further their use in intelligent control and design of high autonomy systems. An extension to the DEVS formalism that facilitates symbolic expression of event times by extending the time base from the real numbers to the field of linear polynomials over the reals is defined. A simulation algorithm is developed to generate the branching trajectories resulting from the underlying nondeterminism. To efficiently manage symbolic constraints, a consistency checking algorithm for linear polynomial constraints based on feasibility checking algorithms borrowed from linear programming has been developed. The extended formalism offers a convenient means to conduct multiple, simultaneous explorations of model behaviors. Examples of application are given with concentration on fault model analysis.
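
    As a concrete example of the feasibility checking that the abstract borrows from linear programming, the sketch below tests a set of linear constraints over symbolic event times by solving a zero-objective LP with SciPy; the constraints shown are invented for illustration.

        # Feasibility check for linear polynomial constraints via a zero-objective LP.
        # Symbolic event times t1, t2 with t1 - t2 <= -1 and t2 - t1 <= -1
        # (jointly contradictory, so the solver reports infeasibility).
        import numpy as np
        from scipy.optimize import linprog

        A_ub = np.array([[1.0, -1.0],
                         [-1.0, 1.0]])
        b_ub = np.array([-1.0, -1.0])

        res = linprog(c=np.zeros(2), A_ub=A_ub, b_ub=b_ub,
                      bounds=[(None, None)] * 2, method="highs")
        print("feasible" if res.status == 0 else "infeasible")   # -> infeasible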

  12. Student approaches for learning in medicine: What does it tell us about the informal curriculum?

    PubMed Central

    2011-01-01

    Background It has long been acknowledged that medical students frequently focus their learning on that which will enable them to pass examinations, and that they use a range of study approaches and resources in preparing for their examinations. A recent qualitative study identified that, in addition to the formal curriculum, students are using a range of resources and study strategies which could be attributed to the informal curriculum. What is not clearly established is the extent to which these informal learning resources and strategies are utilized by medical students. The aim of this study was to establish the extent to which students in a graduate-entry medical program use various learning approaches to assist their learning and preparation for examinations, apart from those resources offered as part of the formal curriculum. Methods A validated survey instrument was administered to 522 medical students. Factor analysis, internal consistency checks, descriptive analysis and comparisons with demographic variables were completed. The factor analysis identified eight scales with acceptable levels of internal consistency, with alpha coefficients between 0.72 and 0.96. Results Nearly 80% of the students reported that they were overwhelmed by the amount of work perceived necessary to complete the formal curriculum, with 74.3% believing that the informal learning approaches helped them pass the examinations and 61.3% believing that these approaches prepared them to be good doctors. The informal learning activities utilized by students included using past student notes (85.8%) and PBL tutor guides (62.7%), and being part of self-organised study groups (62.6%) and peer-led tutorials (60.2%). Almost all students accessed the formal school resources for at least 10% of their study time. Students in the first year of the program were more likely to rely on the formal curriculum resources than those in Year 2 (p = 0.008). Conclusions Curriculum planners should examine the level of use of informal learning activities in their schools, and investigate whether this use is to enhance student progress, a result of perceived weaknesses in the delivery and effectiveness of formal resources, or a way to overcome anxiety about the volume of work expected by medical programs. PMID:22013994

  13. The impact of signal normalization on seizure detection using line length features.

    PubMed

    Logesparan, Lojini; Rodriguez-Villegas, Esther; Casson, Alexander J

    2015-10-01

    Accurate automated seizure detection remains a desirable but elusive target for many neural monitoring systems. While much attention has been given to the different feature extractions that can be used to highlight seizure activity in the EEG, very little formal attention has been given to the normalization that these features are routinely paired with. This normalization is essential in patient-independent algorithms to correct for broad-level differences in the EEG amplitude between people, and in patient-dependent algorithms to correct for amplitude variations over time. It is crucial, however, that the normalization used does not have a detrimental effect on the seizure detection process. This paper presents the first formal investigation into the impact of signal normalization techniques on seizure discrimination performance when using the line length feature to emphasize seizure activity. Comparing five normalization methods, based upon the mean, median, standard deviation, signal peak and signal range, we demonstrate differences in seizure detection accuracy (assessed as the area under a sensitivity-specificity ROC curve) of up to 52%. This is despite the same analysis feature being used in all cases. Further, changes in performance of up to 22% are present depending on whether the normalization is applied to the raw EEG itself or directly to the line length feature. Our results highlight the median decaying memory as the best current approach for providing normalization when using line length features, and they quantify the under-appreciated challenge of providing signal normalization that does not impair seizure detection algorithm performance.
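
    To ground the terminology: the line length of an EEG window is simply the summed absolute difference between consecutive samples. The sketch below computes it and applies a decaying-memory normalization; note that the paper's median decaying memory is approximated here by an exponentially weighted running average, so this is a simplified stand-in rather than the exact scheme evaluated above.

        # Line length feature with an (approximate) decaying-memory normalization.
        import numpy as np

        def line_length(window):
            # Sum of absolute first differences within the window.
            return np.sum(np.abs(np.diff(window)))

        def normalized_line_length(eeg, win=256, lam=0.99):
            features, memory = [], None
            for start in range(0, len(eeg) - win + 1, win):
                ll = line_length(eeg[start:start + win])
                # Exponentially decaying memory of past feature values.
                memory = ll if memory is None else lam * memory + (1 - lam) * ll
                features.append(ll / memory)   # values >> 1 suggest seizure-like activity
            return np.array(features)

        rng = np.random.default_rng(0)
        print(normalized_line_length(rng.standard_normal(256 * 10))[:5])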

  14. The formal support experiences of mothers of adolescents with intellectual disabilities in Edinburgh, UK: a longitudinal qualitative design.

    PubMed

    Lin, Mei-Chun; Macmillan, Maureen; Brown, Norrie

    2010-03-01

    In the United Kingdom, healthcare provision for children with intellectual disabilities (ID) has shifted away from institutions to the community. Today, family members most often assume the primary caregiver role and look after care recipients in the home. The support needs of caregivers therefore represent an important area of research that should help caregivers enhance their quality of life. The aim of this study was to understand how mothers of adolescents with ID perceived the formal support they received over time. This study used a longitudinal qualitative method in three phases. Semistructured interviews were conducted with seven mothers at three points in time (initially, at 6 months, and at 18 months). Constant comparative analysis was conducted on the transcribed interviews. Three themes emerged from the research: (1) the process of complex emotions, (2) the perception of received support, and (3) the process of fighting reactions. Mothers expressed different levels of satisfaction and dissatisfaction with the range of support received. Respite care was, overall, a beneficial intervention for participants. However, some mothers felt health professionals to be insensitive, showing a lack of understanding and empathy, diminishing personhood, and a perceived lack of respect for the human value of their children with ID. The "fighting process" experienced when applying for financial help from the social welfare system was also singled out as stressful. Further exploration of the professional support needs of mothers is important so that they can be effectively supported in their caregiving role. Professionals should increase their awareness of caregiver sensitivities and be respectful of individual responses by providing empathy and understanding from the caregivers' point of view.

  15. Decoherence in Neutrino Propagation Through Matter, and Bounds from IceCube/DeepCore

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coloma, Pilar; Lopez-Pavon, Jacobo; Martinez-Soler, Ivan

    We revisit neutrino oscillations in matter considering the open quantum system framework, which allows one to introduce possible decoherence effects generated by New Physics in a phenomenological manner. We assume that the decoherence parameters $\gamma_{ij}$ may depend on the neutrino energy as $\gamma_{ij}=\gamma_{ij}^{0}(E/\text{GeV})^n$ with $n = 0,\pm1,\pm2$. The case of non-uniform matter is studied in detail, both within the adiabatic approximation and in the more general non-adiabatic case. In particular, we develop a consistent formalism to study the non-adiabatic case, dividing the matter profile into an arbitrary number of layers of constant density. This formalism is then applied to explore the sensitivity of IceCube and DeepCore to this type of effect. Our study is the first atmospheric neutrino analysis in which a consistent treatment of the matter effects in the three-neutrino case is performed in the presence of decoherence. We show that matter effects are indeed extremely relevant in this context. We find that IceCube is able to considerably improve over current bounds in the solar sector ($\gamma_{21}$) and in the atmospheric sector ($\gamma_{31}$ and $\gamma_{32}$) for $n=0,1,2$ and, in particular, by several orders of magnitude (between 3 and 9) for the $n=1,2$ cases. For $n=0$ we find $\gamma_{32},\gamma_{31}< 4.0\cdot10^{-24}\ (1.3\cdot10^{-24})$ GeV and $\gamma_{21}<1.3\cdot10^{-24}\ (4.1\cdot10^{-24})$ GeV, for normal (inverted) mass ordering.
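
    For intuition about the scale of these bounds: in a simple two-flavor picture, decoherence exponentially damps the oscillation interference term by a factor exp(-γL). The sketch below is illustrative only (it is not the paper's full three-flavor matter treatment) and evaluates that damping for the quoted n = 0 bound over an Earth-diameter baseline.

        # Toy two-flavor decoherence damping exp(-gamma * L), with
        # gamma = gamma0 * (E/GeV)**n as parametrized in the abstract.
        import numpy as np

        KM_IN_INV_GEV = 5.068e18     # 1 km expressed in natural units (1/GeV)

        def damping(gamma0_gev, n, energy_gev, baseline_km):
            gamma = gamma0_gev * energy_gev ** n           # GeV
            return np.exp(-gamma * baseline_km * KM_IN_INV_GEV)

        # Quoted n = 0 bound (4.0e-24 GeV) over ~Earth diameter; energy drops out for n = 0.
        print(damping(4.0e-24, n=0, energy_gev=1e3, baseline_km=1.27e4))  # -> ~0.77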

  16. On Discerning Critical Elements, Relationships and Shifts in Attaining Scientific Terms: The Challenge of Polysemy/Homonymy and Reference

    NASA Astrophysics Data System (ADS)

    Strömdahl, Helge R.

    2012-01-01

    Words with a well-known meaning in colloquial language often pose an educational challenge when introduced as terms with formal scientific meaning. New connections must be established between the word, already constrained by existing meaning and reference, and the intended formal scientific meaning and reference. A two-dimensional semantic/semiotic analysing schema (acronym 2-D SAS) has been developed to clarify a given word/term in a structured mode, both according to non-formal senses and referents and according to formal scientific meaning and referents. The schema is constructed on ideas from semantics, semiotics and the history and philosophy of science. The approach is intended as a contribution to a fine-grained analysis of the structure and dynamics of conceptual change. The role of referents and referent change in conceptual change is highlighted by analysing the character of the recurrent mix-up of the terms heat and temperature among students at different educational levels.

  17. Towards improving phenotype representation in OWL

    PubMed Central

    2012-01-01

    Background Phenotype ontologies are used in species-specific databases for the annotation of mutagenesis experiments and to characterize human diseases. The Entity-Quality (EQ) formalism is a means to describe complex phenotypes based on one or more affected entities and a quality. EQ-based definitions have been developed for many phenotype ontologies, including the Human and Mammalian Phenotype ontologies. Methods We analyze formalizations of complex phenotype descriptions in the Web Ontology Language (OWL) that are based on the EQ model, identify several representational challenges and analyze potential solutions to address these challenges. Results In particular, we suggest a novel, role-based approach to represent relational qualities such as concentration of iron in spleen, discuss its ontological foundation in the General Formal Ontology (GFO) and evaluate its representation in OWL and the benefits it can bring to the representation of phenotype annotations. Conclusion Our analysis of OWL-based representations of phenotypes can contribute to improving consistency and expressiveness of formal phenotype descriptions. PMID:23046625

  18. Planning for Future Care and the End of Life: A Qualitative Analysis of Gay, Lesbian, and Heterosexual Couples.

    PubMed

    Thomeer, Mieke Beth; Donnelly, Rachel; Reczek, Corinne; Umberson, Debra

    2017-12-01

    Two key components of end-of-life planning are (1) informal discussions about future care and other end-of-life preferences and (2) formal planning via living wills and other legal documents. We leverage previous work on the institutional aspects of marriage and on sexual-minority discrimination to theorize why and how heterosexual, gay, and lesbian married couples engage in informal and formal end-of-life planning. We analyze qualitative dyadic in-depth interviews with 45 midlife gay, lesbian, and heterosexual married couples (N = 90 spouses). Findings suggest that same-sex spouses devote considerable attention to informal planning conversations and formal end-of-life plans, while heterosexual spouses report minimal formal or informal planning. The primary reasons same-sex spouses give for making end-of-life preparations are related to the absence of legal protections and concerns about discrimination from families. These findings raise questions about future end-of-life planning for same- and different-sex couples given a rapidly shifting legal and social landscape.

  19. Interoperability between biomedical ontologies through relation expansion, upper-level ontologies and automatic reasoning.

    PubMed

    Hoehndorf, Robert; Dumontier, Michel; Oellrich, Anika; Rebholz-Schuhmann, Dietrich; Schofield, Paul N; Gkoutos, Georgios V

    2011-01-01

    Researchers design ontologies as a means to accurately annotate and integrate experimental data across heterogeneous and disparate data- and knowledge bases. Formal ontologies make the semantics of terms and relations explicit such that automated reasoning can be used to verify the consistency of knowledge. However, many biomedical ontologies do not sufficiently formalize the semantics of their relations and are therefore limited with respect to automated reasoning for large scale data integration and knowledge discovery. We describe a method to improve automated reasoning over biomedical ontologies and identify several thousand contradictory class definitions. Our approach aligns terms in biomedical ontologies with foundational classes in a top-level ontology and formalizes composite relations as class expressions. We describe the semi-automated repair of contradictions and demonstrate expressive queries over interoperable ontologies. Our work forms an important cornerstone for data integration, automatic inference and knowledge discovery based on formal representations of knowledge. Our results and analysis software are available at http://bioonto.de/pmwiki.php/Main/ReasonableOntologies.

  20. Cultural competence in healthcare in the community: A concept analysis.

    PubMed

    Henderson, Saras; Horne, Maria; Hills, Ruth; Kendall, Elizabeth

    2018-03-07

    This study aims to conduct a concept analysis of cultural competence in community healthcare. Clarification of the concept of cultural competence is needed to enable clarity in definition and operation, research and theory development, and to assist healthcare providers to better understand this evolving concept. Rodgers' evolutionary concept analysis method was used to clarify the concept's context, surrogate terms, antecedents, attributes and consequences and to determine implications for further research. Articles from 2004 to 2015 were sought from Medline, PubMed, CINAHL and Scopus using the terms "cultural competency" AND "health," "cultural competence" OR "cultural safety" OR "cultural knowledge" OR "cultural awareness" OR "cultural sensitivity" OR "cultural skill" AND "health." Articles with antecedents, attributes and consequences of cultural competence in community health were included. The 26 articles selected covered nursing (n = 8), health (n = 8), psychology (n = 2), social work (n = 1), mental health (n = 3), medicine (n = 3) and occupational therapy (n = 1). Findings identify cultural openness, awareness, desire, knowledge, sensitivity and encounter as antecedents of cultural competence. Defining attributes are respecting and tailoring care aligned with clients' values, needs, practices and expectations, providing equitable and ethical care, and understanding. Consequences of cultural competence are satisfaction with care, the perception of quality healthcare, better adherence to treatments, effective interaction and improved health outcomes. An interesting finding is that the antecedents and attributes of cultural competence appear to represent a superficial level of understanding, sometimes manifested only through the need for social desirability. What is reported as critical in sustaining competence is the carers' capacity for a higher level of moral reasoning, attainable through formal education in cultural and ethics knowledge. Our conceptual analysis incorporates moral reasoning into the definition of cultural competence. Further research to underpin moral reasoning with antecedents, attributes and consequences could enhance its clarity and promote a sustainable enactment of cultural competence. © 2018 John Wiley & Sons Ltd.

  1. Elemental depth profiling in transparent conducting oxide thin film by X-ray reflectivity and grazing incidence X-ray fluorescence combined analysis

    NASA Astrophysics Data System (ADS)

    Rotella, H.; Caby, B.; Ménesguen, Y.; Mazel, Y.; Valla, A.; Ingerle, D.; Detlefs, B.; Lépy, M.-C.; Novikova, A.; Rodriguez, G.; Streli, C.; Nolot, E.

    2017-09-01

    The optical and electrical properties of transparent conducting oxide (TCO) thin films are strongly linked with structural and chemical properties such as the elemental depth profile. In R&D environments, the development of non-destructive characterization techniques to probe the composition over the depth of deposited films is thus necessary. The combination of Grazing-Incidence X-ray Fluorescence (GIXRF) and X-ray reflectometry (XRR) is emerging as a fab-compatible solution for the measurement of thickness, density and elemental profile in complex stacks. Based on the same formalism, both techniques can be implemented on the same experimental set-up, and the analyses can be combined in a single software package in order to refine the sample model. While XRR is sensitive to the electronic density profile, GIXRF is sensitive to the atomic density (i.e., the elemental depth profile). The combination of both techniques gives simultaneous information about structural properties (thickness and roughness) as well as chemical properties. In this study, we performed a combined XRR-GIXRF analysis on indium-free TCO thin films (a Ga-doped ZnO compound) in order to correlate the optical properties of the films with the elemental distribution of the Ga dopant over the thickness. The variations in optical properties due to the annealing process were probed by spectroscopic ellipsometry measurements. We studied the evolution of the atomic profiles before and after annealing. We show that the blue shift of the band gap in the optical absorption edge is linked to a homogenization of the atomic profiles of Ga and Zn over the layer after annealing. This work demonstrates that the combination of the techniques gives insight into the material composition and makes combined XRR-GIXRF analysis a promising technique for elemental depth profiling.

  2. Which Method of Assigning Bond Orders in Lewis Structures Best Reflects Experimental Data? An Analysis of the Octet Rule and Formal Charge Systems for Period 2 and 3 Nonmetallic Compounds

    ERIC Educational Resources Information Center

    See, Ronald F.

    2009-01-01

    Two systems were evaluated for drawing Lewis structures of period 2 and 3 non-metallic compounds: the octet rule and minimization of formal charge. The test set of molecules consisted of the oxides, halides, oxohalides, oxoanions, and oxoacids of B, N, O, F, Al, P, S, and Cl. Bond orders were quantified using experimental data, including bond…

  3. Formal Verification at System Level

    NASA Astrophysics Data System (ADS)

    Mazzini, S.; Puri, S.; Mari, F.; Melatti, I.; Tronci, E.

    2009-05-01

    System Level Analysis calls for a language comprehensible to experts with different backgrounds and yet precise enough to support meaningful analyses. SysML is emerging as an effective balance between such conflicting goals. In this paper we outline some of the results on SysML-based system-level functional formal verification obtained in an ESA/ESTEC study, carried out in collaboration between INTECS and La Sapienza University of Rome. The study focuses on SysML-based techniques for system-level functional requirements.

  4. Statechart Analysis with Symbolic PathFinder

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.

    2012-01-01

    We report here on our on-going work that addresses the automated analysis and test case generation for software systems modeled using multiple Statechart formalisms. The work is motivated by large programs such as NASA Exploration, that involve multiple systems that interact via safety-critical protocols and are designed with different Statechart variants. To verify these safety-critical systems, we have developed Polyglot, a framework for modeling and analysis of model-based software written using different Statechart formalisms. Polyglot uses a common intermediate representation with customizable Statechart semantics and leverages the analysis and test generation capabilities of the Symbolic PathFinder tool. Polyglot is used as follows: First, the structure of the Statechart model (expressed in Matlab Stateflow or Rational Rhapsody) is translated into a common intermediate representation (IR). The IR is then translated into Java code that represents the structure of the model. The semantics are provided as "pluggable" modules.

  5. Cyto-Sim: a formal language model and stochastic simulator of membrane-enclosed biochemical processes.

    PubMed

    Sedwards, Sean; Mazza, Tommaso

    2007-10-15

    Compartments and membranes are the basis of cell topology, and more than 30% of the human genome codes for membrane proteins. While it is possible to represent compartments and membrane proteins in a nominal way with many mathematical formalisms used in systems biology, few, if any, explicitly model the topology of the membranes themselves. Discrete stochastic simulation potentially offers the most accurate representation of cell dynamics. Since the details of every molecular interaction in a pathway are often not known, the relationship between chemical species is not necessarily best described at the lowest level, i.e., by mass action. Simulation is a form of computer-aided analysis, relying on human interpretation to derive meaning. To improve efficiency and gain meaning in an automatic way, it is necessary to have a formalism based on a model which has decidable properties. We present Cyto-Sim, a stochastic simulator of membrane-enclosed hierarchies of biochemical processes, where the membranes comprise an inner, outer and integral layer. The underlying model is based on formal language theory and has been shown to have decidable properties (Cavaliere and Sedwards, 2006), allowing formal analysis in addition to simulation. The simulator provides variable levels of abstraction via arbitrary chemical kinetics which link to ordinary differential equations. In addition to its compact native syntax, Cyto-Sim currently supports models described as Petri nets, can import all versions of SBML and can export SBML and MATLAB m-files. Cyto-Sim is available free, either as an applet or a stand-alone Java program, via the web page (http://www.cosbi.eu/Rpty_Soft_CytoSim.php). Other versions can be made available upon request.
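
    For readers unfamiliar with discrete stochastic simulation, the following minimal Gillespie-style loop simulates a single mass-action reaction A + B -> C. It is a generic illustration of the technique, not Cyto-Sim's syntax or its membrane semantics.

        # Minimal Gillespie-style stochastic simulation of A + B -> C.
        import random

        def gillespie(a_count, b_count, k=0.01, t_end=10.0):
            t = 0.0
            while a_count > 0 and b_count > 0:
                propensity = k * a_count * b_count     # mass-action rate of the reaction
                t += random.expovariate(propensity)    # exponential waiting time to next firing
                if t > t_end:
                    break
                a_count -= 1                           # one A and one B are consumed
                b_count -= 1
            return a_count, b_count

        print(gillespie(100, 80))   # remaining (A, B) counts at t_end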

  6. A service-oriented architecture for integrating the modeling and formal verification of genetic regulatory networks

    PubMed Central

    2009-01-01

    Background The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees a transparent access to formal verification technology for modelers of genetic regulatory networks. PMID:20042075

  7. Development of X-ray laser media. Measurement of gain and development of cavity resonators for wavelengths near 130 angstroms, volume 3

    NASA Astrophysics Data System (ADS)

    Forsyth, J. M.

    1983-02-01

    In this document the authors summarize their investigation of the reflecting properties of X-ray multilayers. The breadth of this investigation indicates the utility of the difference-equation formalism in the analysis of such structures. The formalism is particularly useful in analyzing multilayers whose structure is not a simple periodic bilayer. The complexity in structure can be either intentional, as in multilayers made by in-situ reflectance monitoring, or a consequence of a degradation mechanism, such as random thickness errors or interlayer diffusion. Both the analysis of thickness errors and the analysis of interlayer diffusion are conceptually simple, effectively one-dimensional problems that are straightforward to pose. In their analysis of in-situ reflectance monitoring, the authors provide a quantitative understanding of an experimentally successful process that has not previously been treated theoretically. As X-ray multilayers come into wider use, there will undoubtedly be an increasing need for a more precise understanding of their reflecting properties. Thus, it is expected that in the future more detailed modeling will be undertaken of less easily specified structures than those above. The authors believe that their formalism will continue to prove useful in the modeling of these more complex structures. One such structure that may be of interest is that of a multilayer degraded by interfacial roughness.

  8. Stability analysis for non-minimally coupled dark energy models in the Palatini formalism

    NASA Astrophysics Data System (ADS)

    Wang, Zuobin; Wu, Puxun; Yu, Hongwei

    2018-06-01

    In this paper, we use the method of global analysis to study the stability of de Sitter solutions in a universe dominated by a scalar-field dark energy which couples non-minimally with the Ricci scalar defined in the Palatini formalism. Effective-potential and phase-space diagrams are introduced to describe qualitatively the de Sitter solutions and their stability. We find that for the simple power-law potential V(φ) = V₀φⁿ there are no stable de Sitter solutions, while for some more complicated potentials, i.e. V(φ) = V₀φⁿ + Λ and V(φ) = V₀(e^{-λφ} + e^{λφ})², stable de Sitter solutions can exist.

  9. A primer in macromolecular linguistics.

    PubMed

    Searls, David B

    2013-03-01

    Polymeric macromolecules, when viewed abstractly as strings of symbols, can be treated in terms of formal language theory, providing a mathematical foundation for characterizing such strings both as collections and in terms of their individual structures. In addition this approach offers a framework for analysis of macromolecules by tools and conventions widely used in computational linguistics. This article introduces the ways that linguistics can be and has been applied to molecular biology, covering the relevant formal language theory at a relatively nontechnical level. Analogies between macromolecules and human natural language are used to provide intuitive insights into the relevance of grammars, parsing, and analysis of language complexity to biology. Copyright © 2012 Wiley Periodicals, Inc.
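
    As a flavour of the grammar-based view described above, here is a toy recursive recognizer for the language of perfect RNA stems, generated by the context-free grammar S -> aSu | uSa | gSc | cSg | ε. Grammars used for real biological structure (stem-loops with bulges, pseudoknots, and so on) are considerably richer.

        # Recognizer for perfect RNA stems: the outermost bases must pair,
        # and the remainder must itself be a perfect stem.
        PAIRS = {("a", "u"), ("u", "a"), ("g", "c"), ("c", "g")}

        def is_perfect_stem(seq: str) -> bool:
            if seq == "":
                return True
            if len(seq) < 2 or (seq[0], seq[-1]) not in PAIRS:
                return False
            return is_perfect_stem(seq[1:-1])

        print(is_perfect_stem("aagg"))      # False: a-g do not pair
        print(is_perfect_stem("gacuaguc"))  # True: g-c, a-u, c-g, u-a pair inside-out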

  10. Formal concept analysis with background knowledge: a case study in paleobiological taxonomy of belemnites

    NASA Astrophysics Data System (ADS)

    Belohlavek, Radim; Kostak, Martin; Osicka, Petr

    2013-05-01

    We present a case study in the identification of taxa in paleobiological data. Our approach utilizes formal concept analysis and is based on conceiving a taxon as a group of individuals sharing a collection of attributes. In addition to the incidence relation between individuals and their attributes, the method uses expert background knowledge regarding the importance of attributes, which helps to filter out correctly formed but paleobiologically irrelevant taxa. We present the results of experiments carried out with belemnites, a group of extinct cephalopods which seems particularly suitable for this purpose. We demonstrate that the methods are capable of revealing taxa, and relationships among them, that are relevant from a paleobiological point of view.
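
    To illustrate the core machinery of formal concept analysis, the sketch below enumerates all formal concepts (maximal object/attribute rectangles) of a tiny binary context by closing every attribute subset. The objects and attributes are placeholders, not the belemnite data, and the paper's expert attribute weighting is not reproduced.

        # Naive formal concept enumeration over a small object-attribute context.
        from itertools import combinations

        context = {
            "specimen1": {"grooved", "elongated"},
            "specimen2": {"grooved", "elongated", "conical"},
            "specimen3": {"conical"},
        }
        attributes = sorted(set().union(*context.values()))

        def extent(intent_set):   # objects having every attribute in the intent
            return {o for o, attrs in context.items() if intent_set <= attrs}

        def intent(objs):         # attributes shared by every object in the extent
            return set(attributes) if not objs else set.intersection(*(context[o] for o in objs))

        concepts = set()
        for r in range(len(attributes) + 1):
            for combo in combinations(attributes, r):
                e = extent(set(combo))
                concepts.add((frozenset(e), frozenset(intent(e))))   # Galois closure

        for e, i in sorted(concepts, key=lambda c: -len(c[0])):
            print(sorted(e), "<->", sorted(i))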

  11. Adjusting game difficulty level through Formal Concept Analysis

    NASA Astrophysics Data System (ADS)

    Gómez-Martín, Marco A.; Gómez-Martín, Pedro P.; Gonzâlez-Calero, Pedro A.; Díaz-Agudo, Belén

    In order to reach as many players as possible, videogames usually allow the user to choose the difficulty level. To support this, game designers have to decide the values that some game parameters will take depending on that choice. In simple videogames this is almost trivial: minesweeper is harder with larger board sizes and more mines. In more complex games, game designers may take advantage of data mining to establish which of all the possible parameters will positively affect the player experience. This paper describes the use of Formal Concept Analysis to help balance a game using the logs obtained in the tests made prior to the release of the game.

  12. Imaging modalities for characterising focal pancreatic lesions.

    PubMed

    Best, Lawrence Mj; Rawji, Vishal; Pereira, Stephen P; Davidson, Brian R; Gurusamy, Kurinchi Selvan

    2017-04-17

    Increasing numbers of incidental pancreatic lesions are being detected each year. Accurate characterisation of pancreatic lesions into benign, precancerous, and cancer masses is crucial in deciding whether to use treatment or surveillance. Distinguishing benign lesions from precancerous and cancerous lesions can prevent patients from undergoing unnecessary major surgery. Despite the importance of accurately classifying pancreatic lesions, there is no clear algorithm for management of focal pancreatic lesions. To determine and compare the diagnostic accuracy of various imaging modalities in detecting cancerous and precancerous lesions in people with focal pancreatic lesions. We searched the CENTRAL, MEDLINE, Embase, and Science Citation Index until 19 July 2016. We searched the references of included studies to identify further studies. We did not restrict studies based on language or publication status, or whether data were collected prospectively or retrospectively. We planned to include studies reporting cross-sectional information on the index test (CT (computed tomography), MRI (magnetic resonance imaging), PET (positron emission tomography), EUS (endoscopic ultrasound), EUS elastography, and EUS-guided biopsy or FNA (fine-needle aspiration)) and reference standard (confirmation of the nature of the lesion was obtained by histopathological examination of the entire lesion by surgical excision, or histopathological examination for confirmation of precancer or cancer by biopsy and clinical follow-up of at least six months in people with negative index tests) in people with pancreatic lesions irrespective of language or publication status or whether the data were collected prospectively or retrospectively. Two review authors independently searched the references to identify relevant studies and extracted the data. We planned to use the bivariate analysis to calculate the summary sensitivity and specificity with their 95% confidence intervals and the hierarchical summary receiver operating characteristic (HSROC) to compare the tests and assess heterogeneity, but used simpler models (such as univariate random-effects model and univariate fixed-effect model) for combining studies when appropriate because of the sparse data. We were unable to compare the diagnostic performance of the tests using formal statistical methods because of sparse data. We included 54 studies involving a total of 3,196 participants evaluating the diagnostic accuracy of various index tests. In these 54 studies, eight different target conditions were identified with different final diagnoses constituting benign, precancerous, and cancerous lesions. None of the studies was of high methodological quality. None of the comparisons in which single studies were included was of sufficiently high methodological quality to warrant highlighting of the results. For differentiation of cancerous lesions from benign or precancerous lesions, we identified only one study per index test. The second analysis, of studies differentiating cancerous versus benign lesions, provided three tests in which meta-analysis could be performed. The sensitivities and specificities for diagnosing cancer were: EUS-FNA: sensitivity 0.79 (95% confidence interval (CI) 0.07 to 1.00), specificity 1.00 (95% CI 0.91 to 1.00); EUS: sensitivity 0.95 (95% CI 0.84 to 0.99), specificity 0.53 (95% CI 0.31 to 0.74); PET: sensitivity 0.92 (95% CI 0.80 to 0.97), specificity 0.65 (95% CI 0.39 to 0.84). 
The third analysis, of studies differentiating precancerous or cancerous lesions from benign lesions, provided only one test (EUS-FNA) in which meta-analysis was performed. EUS-FNA had moderate sensitivity for diagnosing precancerous or cancerous lesions (sensitivity 0.73 (95% CI 0.01 to 1.00)) and high specificity (0.94 (95% CI 0.15 to 1.00)), the extremely wide confidence intervals reflecting the heterogeneity between the studies. The fourth analysis, of studies differentiating cancerous (invasive carcinoma) from precancerous (dysplasia) lesions, provided three tests in which meta-analysis was performed. The sensitivities and specificities for diagnosing invasive carcinoma were: CT: sensitivity 0.72 (95% CI 0.50 to 0.87), specificity 0.92 (95% CI 0.81 to 0.97); EUS: sensitivity 0.78 (95% CI 0.44 to 0.94), specificity 0.91 (95% CI 0.61 to 0.98); EUS-FNA: sensitivity 0.66 (95% CI 0.03 to 0.99), specificity 0.92 (95% CI 0.73 to 0.98). The fifth analysis, of studies differentiating cancerous (high-grade dysplasia or invasive carcinoma) versus precancerous (low- or intermediate-grade dysplasia) lesions, provided six tests in which meta-analysis was performed. The sensitivities and specificities for diagnosing cancer (high-grade dysplasia or invasive carcinoma) were: CT: sensitivity 0.87 (95% CI 0.00 to 1.00), specificity 0.96 (95% CI 0.00 to 1.00); EUS: sensitivity 0.86 (95% CI 0.74 to 0.92), specificity 0.91 (95% CI 0.83 to 0.96); EUS-FNA: sensitivity 0.47 (95% CI 0.24 to 0.70), specificity 0.91 (95% CI 0.32 to 1.00); EUS-FNA carcinoembryonic antigen 200 ng/mL: sensitivity 0.58 (95% CI 0.28 to 0.83), specificity 0.51 (95% CI 0.19 to 0.81); MRI: sensitivity 0.69 (95% CI 0.44 to 0.86), specificity 0.93 (95% CI 0.43 to 1.00); PET: sensitivity 0.90 (95% CI 0.79 to 0.96), specificity 0.94 (95% CI 0.81 to 0.99). The sixth analysis, of studies differentiating cancerous (invasive carcinoma) from precancerous (low-grade dysplasia) lesions, provided no tests in which meta-analysis was performed. The seventh analysis, of studies differentiating precancerous or cancerous (intermediate- or high-grade dysplasia or invasive carcinoma) from precancerous (low-grade dysplasia) lesions, provided two tests in which meta-analysis was performed. The sensitivity and specificity for diagnosing cancer were: CT: sensitivity 0.83 (95% CI 0.68 to 0.92), specificity 0.83 (95% CI 0.64 to 0.93) and MRI: sensitivity 0.80 (95% CI 0.58 to 0.92), specificity 0.81 (95% CI 0.53 to 0.95), respectively. The eighth analysis, of studies differentiating precancerous or cancerous (intermediate- or high-grade dysplasia or invasive carcinoma) lesions from precancerous (low-grade dysplasia) or benign lesions, provided no test in which meta-analysis was performed. There were no major alterations in the subgroup analysis of cystic pancreatic focal lesions (42 studies; 2086 participants). None of the included studies evaluated EUS elastography or sequential testing. We were unable to arrive at any firm conclusions because of the differences in the way that study authors classified focal pancreatic lesions into cancerous, precancerous, and benign lesions; the inclusion of few studies with wide confidence intervals for each comparison; poor methodological quality in the studies; and heterogeneity in the estimates within comparisons.

  13. A Scalable Analysis Toolkit

    NASA Technical Reports Server (NTRS)

    Aiken, Alexander

    2001-01-01

    The Scalable Analysis Toolkit (SAT) project aimed to demonstrate that it is feasible and useful to statically detect software bugs in very large systems. The technical focus of the project was on a relatively new class of constraint-based techniques for software analysis, where the desired facts about programs (e.g., the presence of a particular bug) are phrased as constraint problems to be solved. At the beginning of this project, the most successful forms of formal software analysis were limited forms of automatic theorem proving (as exemplified by the analyses used in language type systems and optimizing compilers), semi-automatic theorem proving for full verification, and model checking. With a few notable exceptions, these approaches had not been demonstrated to scale to software systems of even 50,000 lines of code. Realistic approaches to large-scale software analysis cannot hope to make every conceivable formal method scale. Thus, the SAT approach is to mix different methods in one application, using coarse and fast but still adequate methods at the largest scales, and reserving more precise but more expensive methods for critical aspects (that is, aspects critical to the analysis problem under consideration) of a software system at smaller scales. The principled method proposed for combining a heterogeneous collection of formal systems with different scalability characteristics is mixed constraints. This idea had been used previously in small-scale applications with encouraging results: using mostly coarse methods and narrowly targeted precise methods, useful information (meaning the discovery of bugs in real programs) was obtained with excellent scalability.

  14. How Nutrition Sensitive Are the Nutrition Policies of New Zealand Food Manufacturers? A Benchmarking Study.

    PubMed

    Doonan, Rebecca; Field, Penny

    2017-12-19

    Nutrition sensitive policy addresses the underlying determinants of nutrition-related disease and is a powerful tool in reducing the incidence of non-communicable disease. Some members of the food industry have long-standing commitments to health-oriented nutrition policies. The aim of this study was to develop and apply a balanced scorecard of nutrition sensitive indicators to the policies of influential New Zealand food and beverage manufacturers and to explore factors affecting policy processes. The average nutrition sensitivity score of the twenty influential manufacturers' policies was 42 against a benchmark of 75. Some manufacturers performed well whilst others had substantial scope for improvement; the largest variation was in policy development and implementation, whereas nutrition quality was relatively consistent. Manufacturers with written policy (n = 11) scored on average three times higher than their counterparts with verbal policy. The value a manufacturer placed on nutrition influenced whether formal nutrition policies were developed. The reputational risk of failing to deliver on publicly declared nutrition commitments acted as an informal accountability mechanism. We conclude that the balanced scorecard offers a useful tool for assessing the nutrition sensitivity of influential food and beverage manufacturers' policies. Our results provide a baseline for repeat assessments of the nutrition sensitivity of food manufacturers' policies.

  15. How Nutrition Sensitive Are the Nutrition Policies of New Zealand Food Manufacturers? A Benchmarking Study

    PubMed Central

    Doonan, Rebecca

    2017-01-01

    Nutrition sensitive policy addresses the underlying determinants of nutrition-related disease and is a powerful tool in reducing the incidence of non-communicable disease. Some members of the food industry have long-standing commitments to health-oriented nutrition policies. The aim of this study was to develop and apply a balanced scorecard of nutrition sensitive indicators to the policies of influential New Zealand food and beverage manufacturers and to explore factors affecting policy processes. The average nutrition sensitivity score of the twenty influential manufacturers' policies was 42 against a benchmark of 75. Some manufacturers performed well whilst others had substantial scope for improvement; the largest variation was in policy development and implementation, whereas nutrition quality was relatively consistent. Manufacturers with written policy (n = 11) scored on average three times higher than their counterparts with verbal policy. The value a manufacturer placed on nutrition influenced whether formal nutrition policies were developed. The reputational risk of failing to deliver on publicly declared nutrition commitments acted as an informal accountability mechanism. We conclude the balanced scorecard offers a useful tool for assessing the nutrition sensitivity of influential food and beverage manufacturers’ policies. Our results provide a baseline for repeat assessments of the nutrition sensitivity of food manufacturers’ policies. PMID:29257049

  16. FORMAL SCENARIO DEVELOPMENT FOR ENVIRONMENTAL IMPACT ASSESSMENT STUDIES

    EPA Science Inventory

    Scenario analysis is a process of evaluating possible future events through the consideration of alternative plausible (though not equally likely) outcomes (scenarios). The analysis is designed to enable improved decision-making and assessment through a more rigorous evaluation o...

  17. Bus incident reporting, tracking and analysis system

    DOT National Transportation Integrated Search

    2006-08-01

    Many Florida transit systems do little formal analysis of all accidents on an aggregate basis. In many transit systems, accidents and incidents are not being tracked or analyzed to identify common trends from types of incidents, location, driver, bus r...

  18. Tunneling current noise spectra of biased impurity with a phonon mode

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maslova, N. S.; Arseev, P. I.; Mantsevich, V. N., E-mail: vmantsev@gmail.com

    We report the results of theoretical investigations of the tunneling current noise spectra through a single-level impurity both in the presence and in the absence of electron–phonon interaction based on the nonequilibrium Green’s functions formalism. We show that due to the quantum nature of tunneling, the Fano factor is dramatically different from the Poisson limit both in the presence and in the absence of inelastic processes. The results are demonstrated to be sensitive to the tunneling contact parameters.
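
    For context, the Fano factor quoted here is the standard zero-frequency ratio of current noise to Poissonian shot noise (a textbook definition, not a result specific to this paper):

        F \;=\; \frac{S(0)}{2e\,\langle I\rangle},
        \qquad F = 1 \ \text{(Poisson limit)},
        \qquad F \neq 1 \ \text{(correlated tunneling)}.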

  19. Mapping vulnerability to bipolar disorder: a systematic review and meta-analysis of neuroimaging studies

    PubMed Central

    Fusar-Poli, Paolo; Howes, Oliver; Bechdolf, Andreas; Borgwardt, Stefan

    2012-01-01

    Background Although early interventions in individuals with bipolar disorder may reduce the associated personal and economic burden, the neurobiologic markers of enhanced risk are unknown. Methods Neuroimaging studies involving individuals at enhanced genetic risk for bipolar disorder (HR) were included in a systematic review. We then performed a region of interest (ROI) analysis and a whole-brain meta-analysis combined with a formal effect-sizes meta-analysis in a subset of studies. Results There were 37 studies included in our systematic review. The overall sample for the systematic review included 1258 controls and 996 HR individuals. No significant differences were detected between HR individuals and controls in the selected ROIs: striatum, amygdala, hippocampus, pituitary and frontal lobe. The HR group showed increased grey matter volume compared with patients with established bipolar disorder. The HR individuals showed increased neural response in the left superior frontal gyrus, medial frontal gyrus and left insula compared with controls, independent from the functional magnetic resonance imaging task used. There were no publication biases. Sensitivity analysis confirmed the robustness of these results. Limitations As the included studies were cross-sectional, it remains to be determined whether the observed neurofunctional and structural alterations represent risk factors that can be clinically used in preventive interventions for prodromal bipolar disorder. Conclusion Accumulating structural and functional imaging evidence supports the existence of neurobiologic trait abnormalities in individuals at genetic risk for bipolar disorder at various scales of investigation. PMID:22297067

  20. Why has energy consumption increased. An energy and society approach to the American case

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lacy, M.G.

    1981-01-01

    The general intellectual debate over energy issues has not exhausted the possibilities for sociological work. Sociology can improve on such previous work by providing an empirical-analytic moment, attending to meaning adequacy, recognizing process, assessing the materially determinative character of energy, and by being critical. However, if these several dimensions are taken as prescriptive criteria, even the strictly sociological literature on energy and society has numerous errors and omissions. Based on the findings of that critical examination of the sociological energy literature, a simple formal theory is developed to attack a particular substantive problem: why has energy consumption increased in the United States during the twentieth century? This formalism requires that we begin by regarding energy consumption as completely determined by population, affluence, and technology. The results of the first empirical analysis using that formalism show that rising affluence, rather than deteriorating technology, is the culprit. However, the urge to praise technology is too hasty, since a second analysis shows that there actually have been two trends in energy technology, only one of which tended to hold down energy consumption.
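
    One standard way to write the population-affluence-technology accounting that the author starts from is the multiplicative identity below (an IPAT/Kaya-style illustration, not necessarily the author's exact formalism), where E is energy consumption, P population, and Y national income:

        E \;=\; P \times \underbrace{\frac{Y}{P}}_{\text{affluence}} \times \underbrace{\frac{E}{Y}}_{\text{technology (energy intensity)}}.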

  1. Formal thought disorder in schizophrenia and bipolar disorder: A systematic review and meta-analysis.

    PubMed

    Yalincetin, Berna; Bora, Emre; Binbay, Tolga; Ulas, Halis; Akdede, Berna Binnur; Alptekin, Koksal

    2017-07-01

    Historically, formal thought disorder has been considered one of the distinctive symptoms of schizophrenia. However, research in the last few decades has suggested that there is considerable clinical and neurobiological overlap between schizophrenia and bipolar disorder (BP). We conducted a meta-analysis of studies comparing positive (PTD) and negative formal thought disorder (NTD) in schizophrenia and BP. We included 19 studies comparing 715 schizophrenia and 474 BP patients. In the acute inpatient samples, there was no significant difference in the severity of PTD (d=-0.07, CI=-0.22-0.09) between schizophrenia and BP. In stable patients, schizophrenia was associated with increased PTD compared to BP (d=1.02, CI=0.35-1.70). NTD was significantly more severe (d=0.80, CI=0.52-1.08) in schizophrenia compared to BP. Our findings suggest that PTD is a shared feature of both schizophrenia and BP, but persistent PTD or NTD can distinguish subgroups of schizophrenia from BP and from schizophrenia patients with better clinical outcomes. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. Specifying the ovarian cancer risk threshold of 'premenopausal risk-reducing salpingo-oophorectomy' for ovarian cancer prevention: a cost-effectiveness analysis.

    PubMed

    Manchanda, Ranjit; Legood, Rosa; Antoniou, Antonis C; Gordeev, Vladimir S; Menon, Usha

    2016-09-01

    Risk-reducing salpingo-oophorectomy (RRSO) is the most effective intervention to prevent ovarian cancer (OC). It is only available to high-risk women with >10% lifetime OC risk. This threshold has not been formally tested for cost-effectiveness. To specify the OC risk thresholds at which RRSO is cost-effective for preventing OC in premenopausal women. The costs as well as effects of surgical prevention ('RRSO') were compared over a lifetime with 'no RRSO' using a decision analysis model. RRSO was undertaken in premenopausal women >40 years. The model was evaluated at lifetime OC risk levels of 2%, 4%, 5%, 6%, 8%, and 10%. Costs and outcomes are discounted at 3.5%. Uncertainty in the model was assessed using both deterministic sensitivity analysis and probabilistic sensitivity analysis (PSA). Outcomes included in the analyses were OC, breast cancer (BC), and additional deaths from coronary heart disease. Total costs and effects were estimated in terms of quality-adjusted life-years (QALYs); incidence of OC and BC; and the incremental cost-effectiveness ratio (ICER). Published literature, the Nurses' Health Study, the British National Formulary, Cancer Research UK, National Institute for Health and Care Excellence guidelines, and National Health Service reference costs were used. The time horizon is lifetime, and the perspective is that of the payer. Premenopausal RRSO is cost-effective at 4% OC risk (life expectancy gained=42.7 days, ICER=£19 536/QALY), with benefits largely driven by reduction in BC risk. RRSO remains cost-effective at >8.2% OC risk without hormone replacement therapy (ICER=£29 071/QALY, life expectancy gained=21.8 days) or 6% if BC risk reduction=0 (ICER=£27 212/QALY, life expectancy gained=35.3 days). Sensitivity analysis indicated that results are not much affected by the costs of surgical prevention or of treating OC/BC or cardiovascular disease. However, results were sensitive to RRSO utility scores. Additionally, 37%, 61%, 74%, 84%, 96%, and 99.5% of PSA simulations were cost-effective for RRSO at the 2%, 4%, 5%, 6%, 8%, and 10% levels of OC risk, respectively. Premenopausal RRSO appears to be extremely cost-effective at ≥4% lifetime OC risk, with ≥42.7 days gain in life expectancy if compliance with hormone replacement therapy is high. Current guidelines should be re-evaluated to reduce the RRSO OC risk threshold to benefit the many at-risk women who presently cannot access risk-reducing surgery. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
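
    The headline figures above are incremental cost-effectiveness ratios; a minimal sketch of that calculation, with hypothetical lifetime costs and QALYs standing in for the model's outputs:

        # Minimal ICER sketch for a two-strategy comparison; numbers are
        # hypothetical, not the review's model outputs.
        def icer(cost_new, qaly_new, cost_old, qaly_old):
            """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
            return (cost_new - cost_old) / (qaly_new - qaly_old)

        # Hypothetical discounted lifetime costs (GBP) and QALYs per woman.
        ratio = icer(cost_new=9500.0, qaly_new=19.90,
                     cost_old=8200.0, qaly_old=19.83)
        print(f"ICER = GBP {ratio:,.0f} per QALY")  # compare to a 20,000-30,000 threshold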

  3. Analysis of Phase-Type Stochastic Petri Nets With Discrete and Continuous Timing

    NASA Technical Reports Server (NTRS)

    Jones, Robert L.; Goode, Plesent W. (Technical Monitor)

    2000-01-01

    The Petri net formalism is useful in studying many discrete-state, discrete-event systems exhibiting concurrency, synchronization, and other complex behavior. As a bipartite graph, the net can conveniently capture salient aspects of the system. As a mathematical tool, the net can specify an analyzable state space. Indeed, one can reason about certain qualitative properties (from state occupancies) and how they arise (the sequence of events leading there). By introducing deterministic or random delays, the model is forced to sojourn in states some amount of time, giving rise to an underlying stochastic process, one that can be specified in a compact way and capable of providing quantitative, probabilistic measures. We formalize a new non-Markovian extension to the Petri net that captures both discrete and continuous timing in the same model. The approach affords efficient, stationary analysis in most cases and efficient transient analysis under certain restrictions. Moreover, this new formalism has the added benefit in modeling fidelity stemming from the simultaneous capture of discrete- and continuous-time events (as opposed to capturing only one and approximating the other). We show how the underlying stochastic process, which is non-Markovian, can be resolved into simpler Markovian problems that enjoy efficient solutions. Solution algorithms are provided that can be easily programmed.
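
    The discrete-event core being extended is the ordinary Petri net firing rule; a minimal untimed sketch with an illustrative two-transition net (not the paper's formalism or notation):

        from typing import Dict

        Marking = Dict[str, int]

        class Transition:
            def __init__(self, name: str, inputs: Marking, outputs: Marking):
                self.name, self.inputs, self.outputs = name, inputs, outputs

            def enabled(self, m: Marking) -> bool:
                # Enabled iff every input place holds the required tokens.
                return all(m.get(p, 0) >= k for p, k in self.inputs.items())

            def fire(self, m: Marking) -> Marking:
                assert self.enabled(m), f"{self.name} not enabled"
                m = dict(m)
                for p, k in self.inputs.items():   # consume input tokens
                    m[p] -= k
                for p, k in self.outputs.items():  # produce output tokens
                    m[p] = m.get(p, 0) + k
                return m

        # Two processes synchronizing on a shared resource place.
        t_acquire = Transition("acquire", {"idle": 1, "resource": 1}, {"busy": 1})
        t_release = Transition("release", {"busy": 1}, {"idle": 1, "resource": 1})
        m0 = {"idle": 2, "resource": 1, "busy": 0}
        m1 = t_acquire.fire(m0)
        print(m1, t_acquire.enabled(m1))  # second acquire blocked: resource held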

  4. Studies 1. The Yugoslav Serbo-Croatian-English Contrastive Project.

    ERIC Educational Resources Information Center

    Filipovic, Rudolf, Ed.

    The first volume in this series on Serbo-Croatian-English contrastive analysis contains four articles. They are: "Contrasting via Translation: Formal Correspondence vs. Translation Equivalence," by Vladimir Ivir; "Approach to Contrastive Analysis," by Leonardo Spalatin; and "The Choice of the Corpus for the Contrastive Analysis of Serbo-Croatian…

  5. Studies 2. The Yugoslav Serbo-Croatian-English Contrastive Project.

    ERIC Educational Resources Information Center

    Filipovic, Rudolf, Ed.

    The second volume in this series on Serbo-Croatian-English contrastive analysis contains five articles. They are: "On Contrastive Contrastive Grammar," by Eric P. Hamp; "Remarks on Contrastive Analysis and Translation," by Vladimir Ivir; "Formal and Semantic Considerations in Contrastive Analysis," by Jerry L. Liston; "On Differences in…

  6. Methods for Mediation Analysis with Missing Data

    ERIC Educational Resources Information Center

    Zhang, Zhiyong; Wang, Lijuan

    2013-01-01

    Despite wide applications of both mediation models and missing data techniques, formal discussion of mediation analysis with missing data is still rare. We introduce and compare four approaches to dealing with missing data in mediation analysis including list wise deletion, pairwise deletion, multiple imputation (MI), and a two-stage maximum…

  7. The formal de Rham complex

    NASA Astrophysics Data System (ADS)

    Zharinov, V. V.

    2013-02-01

    We propose a formal construction generalizing the classic de Rham complex to a wide class of models in mathematical physics and analysis. The presentation is divided into a sequence of definitions and elementary, easily verified statements; proofs are therefore given only in the key case. Linear operations are everywhere performed over a fixed number field F = ℝ, ℂ. All linear spaces, algebras, and modules, although not stipulated explicitly, are by definition or by construction endowed with natural locally convex topologies, and their morphisms are continuous.
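
    For orientation, the classical object being generalized is a cochain complex with a square-zero differential (standard notation, not the paper's):

        0 \longrightarrow \Omega^{0} \xrightarrow{\;d\;} \Omega^{1}
          \xrightarrow{\;d\;} \Omega^{2} \xrightarrow{\;d\;} \cdots,
        \qquad d \circ d = 0,
        \qquad H^{k} = \ker d\big|_{\Omega^{k}} \big/ \operatorname{im} d\big|_{\Omega^{k-1}}.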

  8. Formalization and Transformation of Informal Analysis Models into Executive REFINE (trademark) Specifications

    DTIC Science & Technology

    1992-12-01

    …describing how. EDDA is an attempt to add mathematical formalism to SADT. Because it is based on SADT, it cannot easily represent any other…design methodology. EDDA has two forms: G-EDDA, the standard graphical version of SADT, and S-EDDA, a textual language that partially represents the…used. EDDA only supports the SADT methodology and is too limited in scope to be useful in our research. SAMM lacks the semantic richness of…

  9. Integration Toolkit and Methods (ITKM) Corporate Data Integration Tools (CDIT). Review of the State-of-the-Art with Respect to Integration Toolkits and Methods (ITKM)

    DTIC Science & Technology

    1992-06-01

    …system capabilities such as memory management and network communications are provided by a virtual machine-type operating environment. Various human…thinking. The elements of this substrate include representational formality, genericity, a method of formal analysis, and augmentation of human analytical…the form of identifying: the data entity itself; its aliases (including how the data is presented to programs or human users in the form of copy…

  10. Diagnostic accuracy of serological diagnosis of hepatitis C and B using dried blood spot samples (DBS): two systematic reviews and meta-analyses.

    PubMed

    Lange, Berit; Cohn, Jennifer; Roberts, Teri; Camp, Johannes; Chauffour, Jeanne; Gummadi, Nina; Ishizaki, Azumi; Nagarathnam, Anupriya; Tuaillon, Edouard; van de Perre, Philippe; Pichler, Christine; Easterbrook, Philippa; Denkinger, Claudia M

    2017-11-01

    Dried blood spots (DBS) are a convenient tool to enable diagnostic testing for viral diseases due to transport, handling and logistical advantages over conventional venous blood sampling. A better understanding of the performance of serological testing for hepatitis C virus (HCV) and hepatitis B virus (HBV) from DBS is important to enable more widespread use of this sampling approach in resource-limited settings, and to inform the 2017 World Health Organization (WHO) guidance on testing for HBV/HCV. We conducted two systematic reviews and meta-analyses on the diagnostic accuracy of HCV antibody (HCV-Ab) and HBV surface antigen (HBsAg) from DBS samples compared to venous blood samples. MEDLINE, EMBASE, Global Health and the Cochrane library were searched for studies that assessed diagnostic accuracy with DBS and agreement between DBS and venous sampling. Heterogeneity of results was assessed and, where possible, a pooled analysis of sensitivity and specificity was performed using a bivariate analysis with maximum likelihood estimation and 95% confidence intervals (95% CI). We conducted a narrative review on the impact of varying storage conditions or limits of detection in subsets of samples. The QUADAS-2 tool was used to assess risk of bias. For the diagnostic accuracy of HBsAg from DBS compared to venous blood, 19 studies were included in a quantitative meta-analysis, and 23 in a narrative review. Pooled sensitivity and specificity were 98% (95% CI: 95-99%) and 100% (95% CI: 99-100%), respectively. For the diagnostic accuracy of HCV-Ab from DBS, 19 studies were included in a pooled quantitative meta-analysis, and 23 studies were included in a narrative review. Pooled estimates of sensitivity and specificity were 98% (95% CI: 95-99%) and 99% (95% CI: 98-100%), respectively. Overall quality of studies and heterogeneity were rated as moderate in both systematic reviews. HCV-Ab and HBsAg testing using DBS compared to venous blood sampling was associated with excellent diagnostic accuracy. However, generalizability is limited as no uniform protocol was applied and most studies did not use fresh samples. Future studies on diagnostic accuracy should include an assessment of the impact of environmental conditions common in low-resource field settings. Manufacturers also need to formally validate their assays for DBS for use with their commercial assays.

  11. Cost-Effectiveness of Dabigatran Compared to Vitamin-K Antagonists for the Treatment of Deep Venous Thrombosis in the Netherlands Using Real-World Data.

    PubMed

    van Leent, Merlijn W J; Stevanović, Jelena; Jansman, Frank G; Beinema, Maarten J; Brouwers, Jacobus R B J; Postma, Maarten J

    2015-01-01

    Vitamin-K antagonists (VKAs) present an effective anticoagulant treatment in deep venous thrombosis (DVT). However, the use of VKAs is limited because of the risk of bleeding and the necessity of frequent and long-term laboratory monitoring. Therefore, new oral anticoagulant drugs (NOACs) such as dabigatran, with lower rates of (major) intracranial bleeding compared to VKAs and not requiring monitoring, may be considered. To estimate resource utilization and costs of patients treated with the VKAs acenocoumarol and phenprocoumon for the indication DVT. Furthermore, a formal cost-effectiveness analysis of dabigatran compared to VKAs for DVT treatment was performed, using these estimates. A retrospective observational study design in the thrombotic service of a teaching hospital (Deventer, The Netherlands) was applied to estimate real-world resource utilization and costs of VKA monitoring. A pooled analysis of data from RE-COVER and RE-COVER II on DVT was used to reflect the probabilities for events in the cost-effectiveness model. Dutch costs, utilities and specific data on coagulation monitoring levels were incorporated in the model. In addition to the base-case analysis, univariate probabilistic sensitivity and scenario analyses were performed. Real-world resource utilization in the thrombotic service of patients treated with VKA for the indication of DVT consisted of 12.3 measurements of the international normalized ratio (INR), with corresponding INR monitoring costs of €138 for a standardized treatment period of 180 days. In the base case, dabigatran treatment compared to VKAs in a cohort of 1,000 DVT patients resulted in savings of €18,900 (95% uncertainty interval (UI) -95,832, 151,162) and 41 (95% UI -18, 97) quality-adjusted life-years (QALYs) gained, calculated from a societal perspective. The probability that dabigatran is cost-effective at a conservative willingness-to-pay threshold of €20,000 per QALY was 99%. Sensitivity and scenario analyses also indicated cost savings or cost-effectiveness below this same threshold. Total INR monitoring costs per patient were estimated at a minimum of €138. When these real-world data were inserted into a cost-effectiveness analysis for patients diagnosed with DVT, dabigatran appeared to be a cost-saving alternative to VKAs in the Netherlands in the base case. Cost savings or favorable cost-effectiveness were robust in sensitivity and scenario analyses. Our results warrant confirmation in other settings and locations.
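
    The probabilistic sensitivity analysis reported here can be sketched as Monte Carlo sampling of incremental costs and effects against a willingness-to-pay threshold. The distributions below are hypothetical, scaled only to mirror the per-patient base-case figures, not the study's actual model:

        import numpy as np

        rng = np.random.default_rng(1)
        n = 10_000
        d_cost = rng.normal(-18.9, 60.0, n)   # incremental cost per patient (EUR)
        d_qaly = rng.normal(0.041, 0.030, n)  # incremental QALYs per patient
        wtp = 20_000.0                        # EUR per QALY threshold

        # Net monetary benefit > 0  <=>  cost-effective at this threshold.
        nmb = wtp * d_qaly - d_cost
        print(f"P(cost-effective at EUR {wtp:,.0f}/QALY) = {np.mean(nmb > 0):.2%}")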

  12. Error analysis for the ground-based microwave ozone measurements during STOIC

    NASA Technical Reports Server (NTRS)

    Connor, Brian J.; Parrish, Alan; Tsou, Jung-Jung; McCormick, M. Patrick

    1995-01-01

    We present a formal error analysis and characterization of the microwave measurements made during the Stratospheric Ozone Intercomparison Campaign (STOIC). The most important error sources are found to be determination of the tropospheric opacity, the pressure-broadening coefficient of the observed line, and systematic variations in instrument response as a function of frequency ('baseline'). Net precision is 4-6% between 55 and 0.2 mbar, while accuracy is 6-10%. Resolution is 8-10 km below 3 mbar, increasing to 17 km at 0.2 mbar. We show the 'blind' microwave measurements from STOIC and make limited comparisons to other measurements. We use the averaging kernels of the microwave measurement to eliminate resolution and a priori effects from a comparison to SAGE II. The STOIC results and comparisons are broadly consistent with the formal analysis.
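
    The averaging-kernel comparison follows the standard retrieval relation below (generic Rodgers-style formalism, not this paper's exact equations): applying one instrument's kernels A and a priori profile x_a to another instrument's high-resolution profile yields a smoothed profile with matched resolution and a priori dependence, enabling a like-for-like comparison.

        \hat{x} \;=\; x_a + A\,\left(x_{\mathrm{true}} - x_a\right) + \varepsilon.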

  13. Probabilistic Micromechanics and Macromechanics for Ceramic Matrix Composites

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.; Mital, Subodh K.; Shah, Ashwin R.

    1997-01-01

    The properties of ceramic matrix composites (CMC's) are known to display a considerable amount of scatter due to variations in fiber/matrix properties, interphase properties, interphase bonding, amount of matrix voids, and many geometry- or fabrication-related parameters, such as ply thickness and ply orientation. This paper summarizes preliminary studies in which formal probabilistic descriptions of the material-behavior- and fabrication-related parameters were incorporated into micromechanics and macromechanics for CMC's. In this process two existing methodologies, namely CMC micromechanics and macromechanics analysis and a fast probability integration (FPI) technique, are synergistically coupled to obtain the probabilistic composite behavior or response. Preliminary results in the form of cumulative probability distributions and information on the probability sensitivities of the response to primitive variables for a unidirectional silicon carbide/reaction-bonded silicon nitride (SiC/RBSN) CMC are presented. The cumulative distribution functions are computed for composite moduli, thermal expansion coefficients, thermal conductivities, and longitudinal tensile strength at room temperature. The variations in the constituent properties that directly affect these composite properties are accounted for via assumed probabilistic distributions. Collectively, the results show that the present technique provides valuable information about the composite properties and sensitivity factors, which is useful to design or test engineers. Furthermore, the present methodology is computationally more efficient than a standard Monte Carlo simulation technique; and the agreement between the two solutions is excellent, as shown via select examples.
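
    A minimal sketch of the Monte Carlo side of such a calculation: propagate assumed scatter in constituent properties through a simple Voigt rule of mixtures for the longitudinal modulus. The distributions and the rule of mixtures stand in for the paper's full micromechanics equations and FPI method:

        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000
        E_f = rng.normal(400e9, 20e9, n)   # fiber modulus (Pa), assumed scatter
        E_m = rng.normal(120e9, 10e9, n)   # matrix modulus (Pa), assumed scatter
        V_f = rng.uniform(0.28, 0.32, n)   # fiber volume fraction

        E_c = V_f * E_f + (1 - V_f) * E_m  # Voigt (parallel) rule of mixtures

        # Empirical CDF of the composite modulus at a few probability levels.
        for p in (0.05, 0.50, 0.95):
            print(f"{p:.0%} quantile of E_c: {np.quantile(E_c, p) / 1e9:.1f} GPa")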

  14. Applications of a formal approach to decipher discrete genetic networks.

    PubMed

    Corblin, Fabien; Fanchon, Eric; Trilling, Laurent

    2010-07-20

    A growing demand for tools to assist the building and analysis of biological networks exists in systems biology. We argue that the use of a formal approach is relevant and applicable to address questions raised by biologists about such networks. The behaviour of these systems being complex, it is essential to exploit efficiently every bit of experimental information. In our approach, both the evolution rules and the partial knowledge about the structure and the behaviour of the network are formalized using a common constraint-based language. In this article our formal and declarative approach is applied to three biological applications. The software environment that we developed allows each application to be specifically addressed through a new class of biologically relevant queries. We show that we can easily and formally describe the partial knowledge about a genetic network. Moreover, we show that this environment, based on a constraint algorithmic approach, offers a wide variety of functionalities, going beyond simple simulations, such as proof of consistency, model revision, prediction of properties, and search for minimal models relative to specified criteria. The formal approach proposed here deeply changes the way to proceed in the exploration of genetic and biochemical networks, first by avoiding the usual trial-and-error procedure, and second by placing the emphasis on sets of solutions, rather than a single solution arbitrarily chosen among many others. Last, the constraint approach promotes an integration of model and experimental data in a single framework.
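
    The flavor of the constraint-based approach, and its emphasis on sets of solutions, can be sketched in a tiny Boolean setting (illustrative only, not the authors' formalism): enumerate candidate update rules for one gene and keep every rule consistent with the observed transitions, instead of simulating one guessed model at a time.

        from itertools import product

        # A rule for gene X is a Boolean function of regulators (A, B);
        # represent it by its truth table over the four input combinations.
        inputs = list(product([0, 1], repeat=2))
        observations = [((0, 1), 1), ((1, 1), 0)]  # observed (state, next X) pairs

        consistent = []
        for table in product([0, 1], repeat=4):    # all 16 Boolean functions
            rule = dict(zip(inputs, table))
            if all(rule[s] == nx for s, nx in observations):
                consistent.append(rule)

        print(f"{len(consistent)} of 16 candidate rules remain")  # -> 4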

  15. Reasoning about variables in 11 to 18 year olds: informal, schooled and formal expression in learning about functions

    NASA Astrophysics Data System (ADS)

    Ayalon, Michal; Watson, Anne; Lerman, Steve

    2016-09-01

    This study examines expressions of reasoning by some higher achieving 11 to 18 year-old English students responding to a survey consisting of function tasks developed in collaboration with their teachers. We report on 70 students, 10 from each of English years 7-13. Iterative and comparative analysis identified capabilities and difficulties of students and suggested conjectures concerning links between the affordances of the tasks, the curriculum, and students' responses. The paper focuses on five of the survey tasks and highlights connections between informal and formal expressions of reasoning about variables in learning. We introduce the notion of `schooled' expressions of reasoning, neither formal nor informal, to emphasise the role of the formatting tools introduced in school that shape future understanding and reasoning.

  16. Greenland Regional and Ice Sheet-wide Geometry Sensitivity to Boundary and Initial conditions

    NASA Astrophysics Data System (ADS)

    Logan, L. C.; Narayanan, S. H. K.; Greve, R.; Heimbach, P.

    2017-12-01

    Ice sheet and glacier model outputs depend on inputs from imperfectly known initial and boundary conditions and other parameters. Conservation and constitutive equations formalize the relationship between model inputs and outputs, and the sensitivity of model-derived quantities of interest (e.g., ice sheet volume above flotation) to model variables can be obtained via the adjoint model of an ice sheet. We show how one particular ice sheet model, SICOPOLIS (SImulation COde for POLythermal Ice Sheets), depends on these inputs through comprehensive adjoint-based sensitivity analyses. SICOPOLIS discretizes the shallow-ice and shallow-shelf approximations for ice flow and is well-suited for paleo-studies of Greenland and Antarctica, among other computational domains. The adjoint model of SICOPOLIS was developed via algorithmic differentiation, facilitated by the source transformation tool OpenAD (developed at Argonne National Lab). While model sensitivity to various inputs can be computed by costly methods involving input perturbation simulations, the time-dependent adjoint model of SICOPOLIS delivers model sensitivities to initial and boundary conditions throughout time at lower cost. Here, we explore the sensitivities of the Greenland Ice Sheet's total and regional volumes to initial ice thickness, precipitation, basal sliding, and geothermal flux over the Holocene epoch. Sensitivity studies such as described here are now accessible to the modeling community, based on the latest version of SICOPOLIS that has been adapted for OpenAD to generate correct and efficient adjoint code.
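
    A toy example of why the adjoint pays off: for a linear time-stepping model, one backward sweep yields the sensitivity of the final state to the forcing at every time step, whereas perturbation needs one forward run per input. The model below is an illustrative scalar toy, not SICOPOLIS:

        import numpy as np

        # h_{t+1} = a*h_t + p_t: thickness forced by accumulation p_t.
        a, T = 0.95, 50
        p = np.full(T, 0.3)

        def forward(p):
            h = 1000.0
            for t in range(T):
                h = a * h + p[t]
            return h                      # quantity of interest J = h_T

        # Adjoint sweep: dJ/dh_T = 1, and the adjoint propagates backward by a.
        lam, grad = 1.0, np.zeros(T)
        for t in reversed(range(T)):
            grad[t] = lam                 # dJ/dp_t, all inputs in one sweep
            lam *= a

        # Check one entry against a finite-difference perturbation run.
        eps, t0 = 1e-6, 10
        p2 = p.copy(); p2[t0] += eps
        print(grad[t0], (forward(p2) - forward(p)) / eps)  # should agree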

  17. Invisible realities: Caring for older Moroccan migrants with dementia in Belgium.

    PubMed

    Berdai Chaouni, Saloua; De Donder, Liesbeth

    2018-01-01

    The number of older Moroccan migrants reaching the age of high risk for dementia is increasing in Belgium. Yet no study has been performed to explore how Moroccan families facing dementia experience and manage the condition. The study employed a qualitative design using semi-structured interviews with 12 informal and 13 formal caregivers to answer this research question. Findings indicate that the experience of dementia includes several invisible realities that challenge the informal and formal caregivers: (1) the invisibility of dementia as a condition; (2) the invisible subtleties of informal care execution; (3) the invisibility and inaccessibility of care services as an explanation for these families' non-use of available services; and (4) the overlooking of culture, migration, and religion as invisible influencers of the overall dementia experience. A better understanding of these hidden realities of migrant older people with dementia and their caregivers could lead to interventions that provide effective and tailored person-centred care that is sensitive to the individual's life experiences, culture, and religious background.

  18. Preferences for and Barriers to Formal and Informal Athletic Training Continuing Education Activities

    PubMed Central

    Armstrong, Kirk J.; Weidner, Thomas G.

    2011-01-01

    Context: Our previous research determined the frequency of participation and perceived effect of formal and informal continuing education (CE) activities. However, actual preferences for and barriers to CE must be characterized. Objective: To determine the types of formal and informal CE activities preferred by athletic trainers (ATs) and barriers to their participation in these activities. Design: Cross-sectional study. Setting: Athletic training practice settings. Patients or Other Participants: Of a geographically stratified random sample of 1000 ATs, 427 ATs (42.7%) completed the survey. Main Outcome Measure(s): As part of a larger study, the Survey of Formal and Informal Athletic Training Continuing Education Activities (FIATCEA) was developed and administered electronically. The FIATCEA consists of demographic characteristics and Likert scale items (1 = strongly disagree, 5 = strongly agree) about preferred CE activities and barriers to these activities. Internal consistency of survey items, as determined by Cronbach α, was 0.638 for preferred CE activities and 0.860 for barriers to these activities. Descriptive statistics were computed for all items. Differences between respondent demographic characteristics and preferred CE activities and barriers to these activities were determined via analysis of variance and dependent t tests. The α level was set at .05. Results: Hands-on clinical workshops and professional networking were the preferred formal and informal CE activities, respectively. The most frequently reported barriers to formal CE were the cost of attending and travel distance, whereas the most frequently reported barriers to informal CE were personal and job-specific factors. Differences were noted between both the cost of CE and travel distance to CE and all other barriers to CE participation (F(1,411) = 233.54, P < .001). Conclusions: Overall, ATs preferred formal CE activities. The same barriers (eg, cost, travel distance) to formal CE appeared to be universal to all ATs. Informal CE was highly valued by ATs because it could be individualized. PMID:22488195

  19. Formal and Informal Continuing Education Activities and Athletic Training Professional Practice

    PubMed Central

    Armstrong, Kirk J.; Weidner, Thomas G.

    2010-01-01

    Abstract Context: Continuing education (CE) is intended to promote professional growth and, ultimately, to enhance professional practice. Objective: To determine certified athletic trainers' participation in formal (ie, approved for CE credit) and informal (ie, not approved for CE credit) CE activities and the perceived effect these activities have on professional practice with regard to improving knowledge, clinical skills and abilities, attitudes toward patient care, and patient care itself. Design: Cross-sectional study. Setting: Athletic training practice settings. Patients or Other Participants: Of a geographic, stratified random sample of 1000 athletic trainers, 427 (42.7%) completed the survey. Main Outcome Measure(s): The Survey of Formal and Informal Athletic Training Continuing Education Activities was developed and administered electronically. The survey consisted of demographic characteristics and Likert-scale items regarding CE participation and perceived effect of CE on professional practice. Internal consistency of survey items was determined using the Cronbach α (α  =  0.945). Descriptive statistics were computed for all items. An analysis of variance and dependent t tests were calculated to determine differences among respondents' demographic characteristics and their participation in, and perceived effect of, CE activities. The α level was set at .05. Results: Respondents completed more informal CE activities than formal CE activities. Participation in informal CE activities included reading athletic training journals (75.4%), whereas formal CE activities included attending a Board of Certification–approved workshop, seminar, or professional conference not conducted by the National Athletic Trainers' Association or affiliates or committees (75.6%). Informal CE activities were perceived to improve clinical skills or abilities and attitudes toward patient care. Formal CE activities were perceived to enhance knowledge. Conclusions: More respondents completed informal CE activities than formal CE activities. Both formal and informal CE activities were perceived to enhance athletic training professional practice. Informal CE activities should be explored and considered for CE credit. PMID:20446842

  20. Experiences with Dating Violence and Help Seeking Among Hispanic Females in Their Late Adolescence

    PubMed Central

    Gonzalez-Guarda, Rosa M.; Ferranti, Dina; Halstead, Valerie; Ilias, Vanessa M.

    2017-01-01

    Hispanic females in their late adolescence appear to be disproportionately affected by dating violence, yet the majority of victims never seek out formal services. The purpose of this study was to explore the dating violence and help-seeking experiences of Hispanic females in their late adolescence. Participants were recruited from a social service agency providing wrap-around services to individuals and families affected by abuse in South Florida. Eleven in-depth qualitative interviews were conducted with Hispanic female victims of dating violence in their late adolescence (18 to 24 years of age) in English or Spanish. A thematic analysis of transcripts identified four major themes: (a) conflict, culture, and context influences Hispanic couples; (b) missed opportunities to accessing help; (c) pivotal moments are needed to access formal services; and (d) family matters. Participants of this study believed that dating violence was more normative in Hispanic relationships than “American” relationships. Although participants had opportunities to seek formal services early in their relationships, formal services were only sought after pivotal moments. Families played an important role in supporting or further victimizing the participants. Findings from this study can be used to inform interventions addressing both informal and formal sources of support for Hispanic female victims of dating violence in their late adolescence. PMID:27077507

  1. Formal Methods in Air Traffic Management: The Case of Unmanned Aircraft Systems

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar A.

    2015-01-01

    As the technological and operational capabilities of unmanned aircraft systems (UAS) continue to grow, so too does the need to introduce these systems into civil airspace. Unmanned Aircraft Systems Integration in the National Airspace System is a NASA research project that addresses the integration of civil UAS into non-segregated airspace operations. One of the major challenges of this integration is the lack of an onboard pilot to comply with the legal requirement that pilots see and avoid other aircraft. The need to provide an equivalent to this requirement for UAS has motivated the development of a detect and avoid (DAA) capability to provide the appropriate situational awareness and maneuver guidance in avoiding and remaining well clear of traffic aircraft. Formal methods has played a fundamental role in the development of this capability. This talk reports on the formal methods work conducted under NASA's Safe Autonomous System Operations project in support of the development of DAA for UAS. This work includes specification of low-level and high-level functional requirements, formal verification of algorithms, and rigorous validation of software implementations. The talk also discusses technical challenges in formal methods research in the context of the development and safety analysis of advanced air traffic management concepts.

  2. Fair Inference on Outcomes

    PubMed Central

    Nabi, Razieh; Shpitser, Ilya

    2017-01-01

    In this paper, we consider the problem of fair statistical inference involving outcome variables. Examples include classification and regression problems, and estimating treatment effects in randomized trials or observational data. The issue of fairness arises in such problems where some covariates or treatments are “sensitive,” in the sense of having potential of creating discrimination. In this paper, we argue that the presence of discrimination can be formalized in a sensible way as the presence of an effect of a sensitive covariate on the outcome along certain causal pathways, a view which generalizes (Pearl 2009). A fair outcome model can then be learned by solving a constrained optimization problem. We discuss a number of complications that arise in classical statistical inference due to this view and provide workarounds based on recent work in causal and semi-parametric inference.

  3. Probing New Long-Range Interactions by Isotope Shift Spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berengut, Julian C.; Budker, Dmitry; Delaunay, Cédric

    We explore a method to probe new long- and intermediate-range interactions using precision atomic isotope shift spectroscopy. We develop a formalism to interpret linear King plots as bounds on new physics with minimal theory inputs. We focus only on bounding the new physics contributions that can be calculated independently of the standard model nuclear effects. We apply our method to existing Ca+ data and project its sensitivity to conjectured new bosons with spin-independent couplings to the electron and the neutron using narrow transitions in other atoms and ions, specifically, Sr and Yb. Future measurements are expected to improve the relative precision by 5 orders of magnitude, and they can potentially lead to an unprecedented sensitivity for bosons within the 0.3 to 10 MeV mass range.

  4. Probing New Long-Range Interactions by Isotope Shift Spectroscopy.

    PubMed

    Berengut, Julian C; Budker, Dmitry; Delaunay, Cédric; Flambaum, Victor V; Frugiuele, Claudia; Fuchs, Elina; Grojean, Christophe; Harnik, Roni; Ozeri, Roee; Perez, Gilad; Soreq, Yotam

    2018-03-02

    We explore a method to probe new long- and intermediate-range interactions using precision atomic isotope shift spectroscopy. We develop a formalism to interpret linear King plots as bounds on new physics with minimal theory inputs. We focus only on bounding the new physics contributions that can be calculated independently of the standard model nuclear effects. We apply our method to existing Ca+ data and project its sensitivity to conjectured new bosons with spin-independent couplings to the electron and the neutron using narrow transitions in other atoms and ions, specifically, Sr and Yb. Future measurements are expected to improve the relative precision by 5 orders of magnitude, and they can potentially lead to an unprecedented sensitivity for bosons within the 0.3 to 10 MeV mass range.

  5. Probing New Long-Range Interactions by Isotope Shift Spectroscopy

    DOE PAGES

    Berengut, Julian C.; Budker, Dmitry; Delaunay, Cédric; ...

    2018-02-26

    We explore a method to probe new long- and intermediate-range interactions using precision atomic isotope shift spectroscopy. We develop a formalism to interpret linear King plots as bounds on new physics with minimal theory inputs. We focus only on bounding the new physics contributions that can be calculated independently of the standard model nuclear effects. We apply our method to existing Ca+ data and project its sensitivity to conjectured new bosons with spin-independent couplings to the electron and the neutron using narrow transitions in other atoms and ions, specifically, Sr and Yb. Future measurements are expected to improve the relative precision by 5 orders of magnitude, and they can potentially lead to an unprecedented sensitivity for bosons within the 0.3 to 10 MeV mass range.

  6. Power-law modeling based on least-squares minimization criteria.

    PubMed

    Hernández-Bermejo, B; Fairén, V; Sorribas, A

    1999-10-01

    The power-law formalism has been successfully used as a modeling tool in many applications. The resulting models, either as Generalized Mass Action or as S-systems models, allow one to characterize the target system and to simulate its dynamical behavior in response to external perturbations and parameter changes. The power-law formalism was first derived as a Taylor series approximation in logarithmic space for kinetic rate-laws. The special characteristics of this approximation produce an extremely useful systemic representation that allows a complete system characterization. Furthermore, their parameters have a precise interpretation as local sensitivities of each of the individual processes and as rate-constants. This facilitates a qualitative discussion and a quantitative estimation of their possible values in relation to the kinetic properties. Following this interpretation, parameter estimation is also possible by relating the systemic behavior to the underlying processes. Without leaving the general formalism, in this paper we suggest deriving the power-law representation in an alternative way that uses least-squares minimization. The resulting power-law mimics the target rate-law in a wider range of concentration values than the classical power-law. Although the implications of this alternative approach remain to be established, our results show that the predicted steady-state using the least-squares power-law is closest to the actual steady-state of the target system.
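
    The two derivations can be contrasted on a Michaelis-Menten rate law v(S) = Vmax*S/(Km+S): the classical Taylor power law matches v and its log-log slope at one operating point, while the least-squares power law fits log v over a range. Parameter values and the fitting range below are illustrative:

        import numpy as np

        Vmax, Km, S0 = 1.0, 2.0, 1.0
        v = lambda S: Vmax * S / (Km + S)

        # Classical power law: kinetic order g = dlog v/dlog S at S0;
        # rate constant alpha chosen so alpha * S0**g = v(S0).
        g_taylor = Km / (Km + S0)
        a_taylor = v(S0) / S0 ** g_taylor

        # Least-squares power law: regress log v on log S over an
        # (assumed) operating range, here S in [0.2, 5].
        S = np.linspace(0.2, 5.0, 200)
        g_ls, log_a = np.polyfit(np.log(S), np.log(v(S)), 1)
        a_ls = np.exp(log_a)

        print(f"Taylor:        v ~ {a_taylor:.3f} * S^{g_taylor:.3f}")
        print(f"Least squares: v ~ {a_ls:.3f} * S^{g_ls:.3f}")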

  7. Managerial capacity and adoption of culturally competent practices in outpatient substance abuse treatment organizations.

    PubMed

    Guerrero, Erick G

    2010-12-01

    The field of cultural competence is shifting its primary emphasis from enhancement of counselors' skills to management, organizational policy, and processes of care. This study examined managers' characteristics associated with adoption of culturally competent practices in the nation's outpatient substance abuse treatment field. Findings indicate that in 1995, supervisors' cultural sensitivity played the most significant role in adopting practices, such as matching counselors and clients based on race and offering bilingual services. Staff's exposure to cross-cultural training increased from 1995 to 2005. In this period, managers' cultural sensitivity and connection with the community were positively associated with staff receiving cross-cultural training and with the number of training hours completed. However, exposure to and investment in this training were negatively correlated with managers' formal education. Health administration policy should consider the extent to which the decision makers' education, community involvement, and cultural sensitivity contribute to building culturally responsive systems of care. Copyright © 2010 Elsevier Inc. All rights reserved.

  8. Managerial Capacity and Adoption of Culturally Competent Practices in Outpatient Substance Abuse Treatment Organizations

    PubMed Central

    Guerrero, Erick G.

    2010-01-01

    The field of cultural competence is shifting its primary emphasis from enhancement of counselors' skills to management, organizational policy and processes of care. This study examined managers' characteristics associated with adoption of culturally competent practices in the nation's outpatient substance abuse treatment field. Findings indicate that in 1995 supervisors' cultural sensitivity played the most significant role in adopting practices, such as matching counselors and clients based on race and offering bilingual services. Staff's exposure to cross-cultural training increased from 1995 to 2005. In this time period, managers' cultural sensitivity and connection with the community were positively associated with staff receiving cross-cultural training and with the number of training hours completed. However, exposure to and investment in this training were negatively correlated with managers' formal education. Health administration policy should consider the extent to which decision makers' education, community involvement and cultural sensitivity contribute to building culturally responsive systems of care. PMID:20727703

  9. Evaluating trade-offs in bull trout reintroduction strategies using structured decision making

    USGS Publications Warehouse

    Brignon, William R.; Peterson, James T.; Dunham, Jason B.; Schaller, Howard A.; Schreck, Carl B.

    2018-01-01

    Structured decision making allows reintroduction decisions to be made despite uncertainty by linking reintroduction goals with alternative management actions through predictive models of ecological processes. We developed a decision model to evaluate the trade-offs between six bull trout (Salvelinus confluentus) reintroduction decisions with the goal of maximizing the number of adults in the recipient population without reducing the donor population to an unacceptable level. Sensitivity analyses suggested that the decision identity and outcome were most influenced by survival parameters that result in increased adult abundance in the recipient population, increased juvenile survival in the donor and recipient populations, adult fecundity rates, and sex ratio. The decision was least sensitive to survival parameters associated with the captive-reared population, the effect of naivety on released individuals, and juvenile carrying capacity of the reintroduced population. The model and sensitivity analyses can serve as the foundation for formal adaptive management and improved effectiveness, efficiency, and transparency of bull trout reintroduction decisions.

  10. Limited predictive ability of surrogate indices of insulin sensitivity/resistance in Asian-Indian men.

    PubMed

    Muniyappa, Ranganath; Irving, Brian A; Unni, Uma S; Briggs, William M; Nair, K Sreekumaran; Quon, Michael J; Kurpad, Anura V

    2010-12-01

    Insulin resistance is highly prevalent in Asian Indians and contributes to worldwide public health problems, including diabetes and related disorders. Surrogate measurements of insulin sensitivity/resistance are used frequently to study Asian Indians, but these are not formally validated in this population. In this study, we compared the ability of simple surrogate indices to accurately predict insulin sensitivity as determined by the reference glucose clamp method. In this cross-sectional study of Asian-Indian men (n = 70), we used a calibration model to assess the ability of simple surrogate indices for insulin sensitivity [quantitative insulin sensitivity check index (QUICKI), homeostasis model assessment (HOMA2-IR), fasting insulin-to-glucose ratio (FIGR), and fasting insulin (FI)] to predict an insulin sensitivity index derived from the reference glucose clamp method (SI(Clamp)). Predictive accuracy was assessed by both root mean squared error (RMSE) of prediction as well as leave-one-out cross-validation-type RMSE of prediction (CVPE). QUICKI, FIGR, and FI, but not HOMA2-IR, had modest linear correlations with SI(Clamp) (QUICKI: r = 0.36; FIGR: r = -0.36; FI: r = -0.27; P < 0.05). No significant differences were noted among CVPE or RMSE from any of the surrogate indices when compared with QUICKI. Surrogate measurements of insulin sensitivity/resistance such as QUICKI, FIGR, and FI are easily obtainable in large clinical studies, but these may only be useful as secondary outcome measurements in assessing insulin sensitivity/resistance in clinical studies of Asian Indians.
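
    For reference, the closed-form surrogate indices compared here are computed from fasting insulin (uU/mL) and fasting glucose (mg/dL) as below. HOMA1-IR is shown because HOMA2-IR comes from the published HOMA2 calculator rather than a closed-form expression; the fasting values are hypothetical:

        import math

        def quicki(fi, fg):
            """QUICKI = 1 / (log10(FI) + log10(FG))."""
            return 1.0 / (math.log10(fi) + math.log10(fg))

        def homa1_ir(fi, fg):
            """HOMA1-IR = FI * FG / 405 (glucose in mg/dL)."""
            return fi * fg / 405.0

        def figr(fi, fg):
            """Fasting insulin-to-glucose ratio."""
            return fi / fg

        fi, fg = 10.0, 90.0  # hypothetical fasting insulin and glucose
        print(quicki(fi, fg), homa1_ir(fi, fg), figr(fi, fg))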

  11. Cultural Sensitivity Among Clinical Nurses: A Descriptive Study.

    PubMed

    Yilmaz, Medine; Toksoy, Serap; Direk, Zübeyde Denizci; Bezirgan, Selma; Boylu, Münevver

    2017-03-01

    The purpose of this study was to investigate the cultural sensitivity of nurses working in rural and urban hospitals in Turkey. The sample for this descriptive, correlational study comprised only clinical nurses working in inpatient clinics (n = 516). The data collection tools were the Socio-Demographic Questionnaire and the Intercultural Sensitivity Scale. A majority of the participating nurses experienced culture-related problems. Intercultural Sensitivity Scale scores were moderately high; the mean cultural sensitivity level was 84.01 ± 9.1 (range = 43-107). The nurses had more problems in areas related to language barriers, patients' education level, and health perceptions about disease and religious beliefs when providing health care. Participants who were female, had an undergraduate or graduate education, had received in-service education on cultural care, or had taken transcultural nursing coursework obtained higher scores on the Intercultural Sensitivity Scale and its Interaction Engagement subscale. The proportion of nurses who had received no in-service education was very high. They wanted to participate in an education program to gain a better understanding of the culture of the society in which they lived. The results of the present study demonstrated that nurses should be prepared in cultural sensitivity and cultural competence. Continuing education and formal courses on cultural sensitivity for nursing professionals are essential for optimal health outcomes. Thus, inequalities in health could be prevented and the quality of health care improved. © 2017 Sigma Theta Tau International.

  12. Recognition of emotions in autism: a formal meta-analysis.

    PubMed

    Uljarevic, Mirko; Hamilton, Antonia

    2013-07-01

    Determining the integrity of emotion recognition in autistic spectrum disorder is important to our theoretical understanding of autism and to teaching social skills. Previous studies have reported both positive and negative results. Here, we take a formal meta-analytic approach, bringing together data from 48 papers testing over 980 participants with autism. Results show there is an emotion recognition difficulty in autism, with a mean effect size of 0.80, which reduces to 0.41 when a correction for publication bias is applied. Recognition of happiness was only marginally impaired in autism, but recognition of fear was marginally worse than recognition of happiness. This meta-analysis provides an opportunity to survey the state of emotion recognition research in autism and to outline potential future directions.

  13. Improved limits on dark matter annihilation in the Sun with the 79-string IceCube detector and implications for supersymmetry

    NASA Astrophysics Data System (ADS)

    Aartsen, M. G.; Abraham, K.; Ackermann, M.; Adams, J.; Aguilar, J. A.; Ahlers, M.; Ahrens, M.; Altmann, D.; Anderson, T.; Ansseau, I.; Anton, G.; Archinger, M.; Arguelles, C.; Arlen, T. C.; Auffenberg, J.; Bai, X.; Barwick, S. W.; Baum, V.; Bay, R.; Beatty, J. J.; Becker Tjus, J.; Becker, K.-H.; Beiser, E.; BenZvi, S.; Berghaus, P.; Berley, D.; Bernardini, E.; Bernhard, A.; Besson, D. Z.; Binder, G.; Bindig, D.; Bissok, M.; Blaufuss, E.; Blumenthal, J.; Boersma, D. J.; Bohm, C.; Börner, M.; Bos, F.; Bose, D.; Böser, S.; Botner, O.; Braun, J.; Brayeur, L.; Bretz, H.-P.; Buzinsky, N.; Casey, J.; Casier, M.; Cheung, E.; Chirkin, D.; Christov, A.; Clark, K.; Classen, L.; Coenders, S.; Collin, G. H.; Conrad, J. M.; Cowen, D. F.; Cruz Silva, A. H.; Danninger, M.; Daughhetee, J.; Davis, J. C.; Day, M.; de André, J. P. A. M.; De Clercq, C.; del Pino Rosendo, E.; Dembinski, H.; De Ridder, S.; Desiati, P.; de Vries, K. D.; de Wasseige, G.; de With, M.; DeYoung, T.; Díaz-Vélez, J. C.; di Lorenzo, V.; Dumm, J. P.; Dunkman, M.; Eberhardt, B.; Edsjö, J.; Ehrhardt, T.; Eichmann, B.; Euler, S.; Evenson, P. A.; Fahey, S.; Fazely, A. R.; Feintzeig, J.; Felde, J.; Filimonov, K.; Finley, C.; Flis, S.; Fösig, C.-C.; Fuchs, T.; Gaisser, T. K.; Gaior, R.; Gallagher, J.; Gerhardt, L.; Ghorbani, K.; Gier, D.; Gladstone, L.; Glagla, M.; Glüsenkamp, T.; Goldschmidt, A.; Golup, G.; Gonzalez, J. G.; Góra, D.; Grant, D.; Griffith, Z.; Groß, A.; Ha, C.; Haack, C.; Haj Ismail, A.; Hallgren, A.; Halzen, F.; Hansen, E.; Hansmann, B.; Hanson, K.; Hebecker, D.; Heereman, D.; Helbing, K.; Hellauer, R.; Hickford, S.; Hignight, J.; Hill, G. C.; Hoffman, K. D.; Hoffmann, R.; Holzapfel, K.; Homeier, A.; Hoshina, K.; Huang, F.; Huber, M.; Huelsnitz, W.; Hulth, P. O.; Hultqvist, K.; In, S.; Ishihara, A.; Jacobi, E.; Japaridze, G. S.; Jeong, M.; Jero, K.; Jones, B. J. P.; Jurkovic, M.; Kappes, A.; Karg, T.; Karle, A.; Katz, U.; Kauer, M.; Keivani, A.; Kelley, J. L.; Kemp, J.; Kheirandish, A.; Kiryluk, J.; Klein, S. R.; Kohnen, G.; Koirala, R.; Kolanoski, H.; Konietz, R.; Köpke, L.; Kopper, C.; Kopper, S.; Koskinen, D. J.; Kowalski, M.; Krings, K.; Kroll, G.; Kroll, M.; Krückl, G.; Kunnen, J.; Kurahashi, N.; Kuwabara, T.; Labare, M.; Lanfranchi, J. L.; Larson, M. J.; Lesiak-Bzdak, M.; Leuermann, M.; Leuner, J.; Lu, L.; Lünemann, J.; Madsen, J.; Maggi, G.; Mahn, K. B. M.; Mandelartz, M.; Maruyama, R.; Mase, K.; Matis, H. S.; Maunu, R.; McNally, F.; Meagher, K.; Medici, M.; Meier, M.; Meli, A.; Menne, T.; Merino, G.; Meures, T.; Miarecki, S.; Middell, E.; Mohrmann, L.; Montaruli, T.; Morse, R.; Nahnhauer, R.; Naumann, U.; Neer, G.; Niederhausen, H.; Nowicki, S. C.; Nygren, D. R.; Obertacke Pollmann, A.; Olivas, A.; Omairat, A.; O'Murchadha, A.; Palczewski, T.; Pandya, H.; Pankova, D. V.; Paul, L.; Pepper, J. A.; Pérez de los Heros, C.; Pfendner, C.; Pieloth, D.; Pinat, E.; Posselt, J.; Price, P. B.; Przybylski, G. T.; Quinnan, M.; Raab, C.; Rädel, L.; Rameez, M.; Rawlins, K.; Reimann, R.; Relich, M.; Resconi, E.; Rhode, W.; Richman, M.; Richter, S.; Riedel, B.; Robertson, S.; Rongen, M.; Rott, C.; Ruhe, T.; Ryckbosch, D.; Sabbatini, L.; Sander, H.-G.; Sandrock, A.; Sandroos, J.; Sarkar, S.; Savage, C.; Schatto, K.; Schimp, M.; Schlunder, P.; Schmidt, T.; Schoenen, S.; Schöneberg, S.; Schönwald, A.; Schulte, L.; Schumacher, L.; Scott, P.; Seckel, D.; Seunarine, S.; Silverwood, H.; Soldin, D.; Song, M.; Spiczak, G. M.; Spiering, C.; Stahlberg, M.; Stamatikos, M.; Stanev, T.; Stasik, A.; Steuer, A.; Stezelberger, T.; Stokstad, R. 
G.; Stößl, A.; Ström, R.; Strotjohann, N. L.; Sullivan, G. W.; Sutherland, M.; Taavola, H.; Taboada, I.; Tatar, J.; Ter-Antonyan, S.; Terliuk, A.; Tešić, G.; Tilav, S.; Toale, P. A.; Tobin, M. N.; Toscano, S.; Tosi, D.; Tselengidou, M.; Turcati, A.; Unger, E.; Usner, M.; Vallecorsa, S.; Vandenbroucke, J.; van Eijndhoven, N.; Vanheule, S.; van Santen, J.; Veenkamp, J.; Vehring, M.; Voge, M.; Vraeghe, M.; Walck, C.; Wallace, A.; Wallraff, M.; Wandkowsky, N.; Weaver, Ch.; Wendt, C.; Westerhoff, S.; Whelan, B. J.; Wiebe, K.; Wiebusch, C. H.; Wille, L.; Williams, D. R.; Wills, L.; Wissing, H.; Wolf, M.; Wood, T. R.; Woschnagg, K.; Xu, D. L.; Xu, X. W.; Xu, Y.; Yanez, J. P.; Yodh, G.; Yoshida, S.; Zoll, M.

    2016-04-01

    We present an improved event-level likelihood formalism for including neutrino telescope data in global fits to new physics. We derive limits on spin-dependent dark matter-proton scattering by employing the new formalism in a re-analysis of data from the 79-string IceCube search for dark matter annihilation in the Sun, including explicit energy information for each event. The new analysis excludes a number of models in the weak-scale minimal supersymmetric standard model (MSSM) for the first time. This work is accompanied by the public release of the 79-string IceCube data, as well as an associated computer code for applying the new likelihood to arbitrary dark matter models.
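    As a rough illustration of what an event-level (unbinned) likelihood of this kind looks like, the sketch below combines a Poisson term for the total event count with a per-event mixture of signal and background energy PDFs. It is a minimal sketch under stated assumptions: the PDF shapes and toy numbers are hypothetical, and the released IceCube code implements a far more detailed instrument response.

    ```python
    import numpy as np

    def event_level_log_likelihood(n_sig, energies, signal_pdf, background_pdf,
                                   n_bkg_expected):
        """Unbinned extended log-likelihood: a Poisson term for the observed
        event count plus a per-event signal/background mixture over energy.
        Toy sketch only, not the released IceCube likelihood code."""
        n_obs = len(energies)
        n_tot = n_sig + n_bkg_expected
        log_l = n_obs * np.log(n_tot) - n_tot          # extended (Poisson) term
        f_sig = n_sig / n_tot
        mixture = (f_sig * signal_pdf(energies)
                   + (1.0 - f_sig) * background_pdf(energies))
        return log_l + np.sum(np.log(mixture))

    # Hypothetical spectral shapes and fake reconstructed energies.
    rng = np.random.default_rng(0)
    energies = rng.exponential(scale=2.0, size=50)
    background_pdf = lambda e: 0.5 * np.exp(-0.5 * e)   # normalized on [0, inf)
    signal_pdf = lambda e: 0.2 * np.exp(-0.2 * e)       # harder hypothetical spectrum
    print(event_level_log_likelihood(5.0, energies, signal_pdf, background_pdf, 45.0))
    ```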

  14. Compressible fluids with Maxwell-type equations, the minimal coupling with electromagnetic field and the Stefan–Boltzmann law

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mendes, Albert C.R., E-mail: albert@fisica.ufjf.br; Takakura, Flavio I., E-mail: takakura@fisica.ufjf.br; Abreu, Everton M.C., E-mail: evertonabreu@ufrrj.br

    In this work we have obtained a higher-derivative Lagrangian for a charged fluid coupled to the electromagnetic field, and the Dirac constraint analysis was discussed. A set of first-class constraints, fixed by a noncovariant gauge condition, was obtained. The path-integral formalism was used to obtain the partition function for the corresponding higher-derivative Hamiltonian, and the Faddeev–Popov ansatz was used to construct an effective Lagrangian. Through the partition function, a Stefan–Boltzmann-type law was obtained. - Highlights: • Higher-derivative Lagrangian for a charged fluid. • Electromagnetic coupling and Dirac constraint analysis. • Partition function through the path-integral formalism. • Stefan–Boltzmann-type law through the partition function.
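    For readers unfamiliar with how a Stefan–Boltzmann-type law falls out of a partition function, the schematic below shows the standard photon-gas route to the T^4 scaling in natural units; this is textbook material for orientation, not the paper's specific higher-derivative result.

    ```latex
    % Standard photon-gas route to a T^4 law (schematic, natural units).
    \begin{align}
      \ln Z &= -\frac{V}{\pi^2}\int_0^\infty dk\,k^2 \ln\!\left(1-e^{-\beta k}\right)
             = \frac{\pi^2 V}{45\,\beta^3},\\
      u &= -\frac{1}{V}\frac{\partial \ln Z}{\partial \beta}
         = \frac{\pi^2}{15}\,T^4 \qquad (\hbar = c = k_B = 1).
    \end{align}
    ```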

  15. Factors Relating Infrastructure Provision by Developer in Formal Housing

    NASA Astrophysics Data System (ADS)

    Putri, H. T.; Maryati, S.; Humaira, A. N. S.

    2018-03-01

    In big cities, housing developers play a significant role in infrastructure provision. Nevertheless, in some cases developers have not fulfilled their role of equipping housing with the needed infrastructure. The objective of this study is to explore the characteristics of, and the factors related to, infrastructure provision in formal housing built by developers, using quantitative and association analysis methods. The infrastructure considered comprises clean water, sewage, drainage, and solid waste systems. This study used Parongpong District, West Bandung Regency, an area where the need for infrastructure is not fulfilled, as its case study. Based on the analysis, it can be concluded that there is some variation in infrastructure provision and that the factor related to its condition is the income level of the targeted home owners.

  16. Approach for classification and taxonomy within family Rickettsiaceae based on the Formal Order Analysis.

    PubMed

    Shpynov, S; Pozdnichenko, N; Gumenuk, A

    2015-01-01

    Genome sequences of 36 Rickettsia and Orientia were analyzed using Formal Order Analysis (FOA). This approach takes into account the arrangement of nucleotides in each sequence. A numerical characteristic, the average distance (remoteness) "g", was used to compare genomes. Our results corroborated the previous separation of three groups within the genus Rickettsia (the typhus group, the classic spotted fever group, and the ancestral group) and of Orientia as a separate genus. Rickettsia felis URRWXCal2 and R. akari Hartford were not in the same group based on FOA; therefore, the designation of a so-called transitional Rickettsia group could not be confirmed with this approach. Copyright © 2015 Institut Pasteur. Published by Elsevier Masson SAS. All rights reserved.
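    Formal Order Analysis characterizes a sequence by the spacing of identical symbols. The published definition of the average remoteness g differs in detail; the toy sketch below is offered purely as an assumed illustration and computes one simple variant: the mean gap between consecutive occurrences of the same nucleotide.

    ```python
    def mean_remoteness(seq):
        """Toy stand-in for an FOA-style average distance: the mean gap between
        consecutive occurrences of the same nucleotide. Illustrative only; the
        published FOA definition of g differs in detail."""
        last_seen = {}
        gaps = []
        for i, base in enumerate(seq):
            if base in last_seen:
                gaps.append(i - last_seen[base])
            last_seen[base] = i
        return sum(gaps) / len(gaps) if gaps else 0.0

    print(mean_remoteness("ATGCGATACGCTTGA"))  # compare values across genomes
    ```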

  17. Circuitbot

    DTIC Science & Technology

    2016-03-01

    constraints problem. Game rules described valid moves allowing the player to generate a memory graph, supporting improved verification of C programs. Subject terms: Formal Verification, Static Analysis, Abstract Interpretation, Pointer Analysis, Fixpoint Iteration.

  18. Integrating Security into the Curriculum

    DTIC Science & Technology

    1998-12-01

    predicate calculus, discrete math, and finite-state machine theory. In addition to applying standard mathematical foundations to constructing hardware and...models, specifications, and the use of formal methods for verification and covert channel analysis. The means for analysis is based on discrete math, information

  19. Detecting dark-matter waves with a network of precision-measurement tools

    NASA Astrophysics Data System (ADS)

    Derevianko, Andrei

    2018-04-01

    Virialized ultralight fields (VULFs) are viable cold dark-matter candidates and include scalar and pseudoscalar bosonic fields, such as axions and dilatons. Direct searches for VULFs rely on low-energy precision-measurement tools. While previous proposals have focused on detecting coherent oscillations of the VULF signals at the VULF Compton frequencies for individual devices, here I consider a network of such devices. Virialized ultralight fields are essentially dark-matter waves and as such they carry both temporal and spatial phase information. Thereby, the discovery reach can be improved by using networks of precision-measurement tools. To formalize this idea, I derive a spatiotemporal two-point correlation function for the ultralight dark-matter fields in the framework of the standard halo model. Due to VULFs being Gaussian random fields, the derived two-point correlation function fully determines N-point correlation functions. For a network of N_D devices within the coherence length of the field, the sensitivity compared to a single device can be improved by a factor of √(N_D). Further, I derive a VULF dark-matter signal profile for an individual device. The resulting line shape is strongly asymmetric due to the parabolic dispersion relation for massive nonrelativistic bosons. I discuss the aliasing effect that extends the discovery reach to VULF frequencies higher than the experimental sampling rate. I present sensitivity estimates and develop a stochastic field signal-to-noise ratio statistic. Finally, I consider an application of the formalism developed to atomic clocks and their networks.
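    The √(N_D) network gain quoted above is the familiar coherent-averaging effect. The sketch below is a minimal numerical illustration with made-up numbers, not the paper's clock analysis: a common Gaussian field is sampled by N_D devices with independent instrument noise, and averaging shows the signal-to-noise ratio growing like √(N_D).

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_samples = 100_000
    field = rng.normal(0.0, 1.0, n_samples)  # common "dark-matter field" at all devices

    def network_snr(n_devices, noise_sigma=5.0):
        """SNR of the network average: the field is common to all devices
        (within a coherence length); the instrument noise is independent."""
        readings = field + rng.normal(0.0, noise_sigma, (n_devices, n_samples))
        avg = readings.mean(axis=0)
        noise_power = np.var(avg - field)   # residual instrument noise
        return np.sqrt(np.var(field) / noise_power)

    for n in (1, 4, 16, 64):
        print(n, round(network_snr(n), 2))  # SNR grows roughly as sqrt(n)
    ```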

  20. Some Implications of a Behavioral Analysis of Verbal Behavior for Logic and Mathematics

    PubMed Central

    2013-01-01

    The evident power and utility of the formal models of logic and mathematics pose a puzzle: Although such models are instances of verbal behavior, they are also essentialistic. But behavioral terms, and indeed all products of selection contingencies, are intrinsically variable and in this respect appear to be incommensurate with essentialism. A distinctive feature of verbal contingencies resolves this puzzle: The control of behavior by the nonverbal environment is often mediated by the verbal behavior of others, and behavior under control of verbal stimuli is blind to the intrinsic variability of the stimulating environment. Thus, words and sentences serve as filters of variability and thereby facilitate essentialistic model building and the formal structures of logic, mathematics, and science. Autoclitic frames, verbal chains interrupted by interchangeable variable terms, are ubiquitous in verbal behavior. Variable terms can be substituted in such frames almost without limit, a feature fundamental to formal models. Consequently, our fluency with autoclitic frames fosters generalization to formal models, which in turn permit deduction and other kinds of logical and mathematical inference. PMID:28018038

  1. Mechanically verified hardware implementing an 8-bit parallel IO Byzantine agreement processor

    NASA Technical Reports Server (NTRS)

    Moore, J. Strother

    1992-01-01

    Consider a network of four processors that use the Oral Messages (Byzantine Generals) Algorithm of Pease, Shostak, and Lamport to achieve agreement in the presence of faults. Bevier and Young have published a functional description of a single processor that, when interconnected appropriately with three identical others, implements this network under the assumption that the four processors step in synchrony. By formalizing the original Pease et al. work, Bevier and Young mechanically proved that such a network achieves fault tolerance. We develop, formalize, and discuss a hardware design that has been mechanically proven to implement their processor. In particular, we formally define mapping functions from the abstract state space of the Bevier-Young processor to a concrete state space of a hardware module and state a theorem that expresses the claim that the hardware correctly implements the processor. We briefly discuss the Brock-Hunt Formal Hardware Description Language, which permits designs both to be proved correct with the Boyer-Moore theorem prover and to be expressed in a commercially supported hardware description language for additional electrical analysis and layout. We briefly describe our implementation.
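    For context on the algorithm the verified hardware implements, the sketch below is a toy software rendering of the Oral Messages algorithm OM(1) for four processors (one commander, three lieutenants) tolerating a single traitor. It is illustrative only and is unrelated to the Bevier-Young formalization or the Boyer-Moore proofs.

    ```python
    from collections import Counter

    def om1(commander_value, traitor=None):
        """Toy OM(1) with one commander and three lieutenants.
        'traitor' names the faulty node ('commander' or a lieutenant index);
        a traitor sends conflicting values (here modeled as flipped bits)."""
        lieutenants = [0, 1, 2]

        # Round 1: the commander sends a value to each lieutenant.
        if traitor == "commander":
            received = {l: commander_value ^ (l % 2) for l in lieutenants}  # conflicting
        else:
            received = {l: commander_value for l in lieutenants}

        # Round 2: each lieutenant relays what it received to the others.
        relayed = {l: {} for l in lieutenants}
        for sender in lieutenants:
            for dest in lieutenants:
                if dest == sender:
                    continue
                value = received[sender]
                if traitor == sender:
                    value ^= 1  # a traitorous lieutenant lies when relaying
                relayed[dest][sender] = value

        # Decision: majority over own value plus the relayed values.
        decisions = {}
        for l in lieutenants:
            votes = [received[l]] + list(relayed[l].values())
            decisions[l] = Counter(votes).most_common(1)[0][0]
        return decisions

    print(om1(1))                       # all loyal: everyone decides 1
    print(om1(1, traitor=2))            # faulty lieutenant: loyal ones still agree
    print(om1(1, traitor="commander"))  # faulty commander: lieutenants agree with each other
    ```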

  2. [The process and effect of heaviness exercise in autogenic training: factor analytical study of subjective response induced by the concentration upon and formal language of the sense of heaviness in the arm].

    PubMed

    Ikezuki, M; Sasaki, Y

    1996-02-01

    The present study examined the subjective response induced by concentration upon, and repetition of, the formal language expressing the sense of heaviness: "The arm is heavy." Factor analysis of an experiment with 60 subjects yielded the following five factors: (1) overall sense of improvement; (2) awareness of the sensation of the arm; (3) change toward less nervousness; (4) awareness of positive aspects; (5) understanding of the formal language. Subjects who were aware of psychosomatic symptoms felt the change toward less nervousness more strongly, and their understanding of the formal language was significantly higher, than subjects who were not aware of such symptoms. The results suggest the possibility that excessive concentration upon the body in connection with psychosomatic symptoms may have shifted to concentration upon the sense of heaviness, or that the resulting reduction of stress may have brought about the change toward less nervousness.

  3. The Effective-One-Body Approach to the General Relativistic Two Body Problem

    NASA Astrophysics Data System (ADS)

    Damour, Thibault; Nagar, Alessandro

    The two-body problem in General Relativity has been the subject of many analytical investigations. After reviewing some of the methods used to tackle this problem (and, more generally, the N-body problem), we focus on a new, recently introduced approach to the motion and radiation of (comparable mass) binary systems: the Effective One Body (EOB) formalism. We review the basic elements of this formalism, and discuss some of its recent developments. Several recent comparisons between EOB predictions and Numerical Relativity (NR) simulations have shown the aptitude of the EOB formalism to provide accurate descriptions of the dynamics and radiation of various binary systems (comprising black holes or neutron stars) in regimes that are inaccessible to other analytical approaches (such as the last orbits and the merger of comparable mass black holes). In synergy with NR simulations, post-Newtonian (PN) theory and Gravitational Self-Force (GSF) computations, the EOB formalism is likely to provide an efficient way of computing the very many accurate template waveforms that are needed for Gravitational Wave (GW) data analysis purposes.

  4. 76 FR 53638 - Approval and Promulgation of Air Quality Implementation Plans; Delaware; Infrastructure State...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-29

    ...), DNREC supplemented its September 16, 2009 submittal with a technical analysis submitted to EPA for... supplemental technical analysis, for which it has requested parallel-processing, through the public notice and... submitted the technical analysis to EPA as a formal supplement to its September 16, 2009 submittal. The...

  5. The methodology of semantic analysis for extracting physical effects

    NASA Astrophysics Data System (ADS)

    Fomenkova, M. A.; Kamaev, V. A.; Korobkin, D. M.; Fomenkov, S. A.

    2017-01-01

    The paper presents a new methodology of semantic analysis for extracting physical effects. The methodology is based on the Tuzov ontology, which formally describes the Russian language. Semantic patterns are described for extracting structural physical information in the form of physical effects, and a new text-analysis algorithm is presented.
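    The record gives no implementation detail, but an extractor of this general kind can be pictured as matching role-labelled patterns over text. The sketch below is purely hypothetical (an English surface pattern instead of the Tuzov ontology's semantic structures) and only illustrates the input-object-output shape of a physical-effect description.

    ```python
    import re

    # Hypothetical surface pattern: "<input> applied to <object> produces <output>".
    PATTERN = re.compile(
        r"(?P<input>[\w\s]+?) applied to (?P<object>[\w\s]+?) produces (?P<output>[\w\s]+)",
        re.IGNORECASE)

    def extract_effects(text):
        """Return (input, object, output) triples for each matched effect pattern."""
        return [(m["input"].strip(), m["object"].strip(), m["output"].strip())
                for m in PATTERN.finditer(text)]

    print(extract_effects(
        "A magnetic field applied to a conductor produces an electric current."))
    ```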

  6. Performance criteria for emergency medicine residents: a job analysis.

    PubMed

    Blouin, Danielle; Dagnone, Jeffrey Damon

    2008-11-01

    A major role of admission interviews is to assess a candidate's suitability for a residency program. Structured interviews have greater reliability and validity than do unstructured ones. The development of content for a structured interview is typically based on the dimensions of performance that are perceived as important to succeed in a particular line of work. A formal job analysis is normally conducted to determine these dimensions. The dimensions essential to succeed as an emergency medicine (EM) resident have not yet been studied. We aimed to analyze the work of EM residents to determine these essential dimensions. The "critical incident technique" was used to generate scenarios of poor and excellent resident performance. Two reviewers independently read each scenario and labelled the performance dimensions that were reflected in each. All labels assigned to a particular scenario were pooled and reviewed again until a consensus was reached. Five faculty members (25% of our total faculty) comprised the subject experts. Fifty-one incidents were generated and 50 different labels were applied. Eleven dimensions of performance applied to at least 5 incidents. "Professionalism" was the most valued performance dimension, represented in 56% of the incidents, followed by "self-confidence" (22%), "experience" (20%) and "knowledge" (20%). "Professionalism," "self-confidence," "experience" and "knowledge" were identified as the performance dimensions essential to succeed as an EM resident based on our formal job analysis using the critical incident technique. Performing a formal job analysis may assist training program directors with developing admission interviews.

  7. 'Haven of safety' and 'secure base': a qualitative inquiry into factors affecting child attachment security in Nairobi, Kenya.

    PubMed

    Polkovnikova-Wamoto, Anastasia; Mathai, Muthoni; Stoep, Ann Vander; Kumar, Manasi

    2016-01-01

    Secure attachment in childhood and adolescence protects children from engagement in high-risk behaviors and development of mental health problems over the life span. Poverty has been shown to create impoverishment in certain aspects of caregiving and correspondingly to compromise the development of secure attachment in children. Nineteen children aged 8 to 14 years from two schools, one in a middle-income area and one in an urban informal settlement area of Nairobi, were interviewed using an adapted Child Attachment Interview (CAI) protocol. The CAI was developed to provide a glimpse into the 'meta-theories' children have about themselves, parents, parenting, and their attachment ties with parents and extended family members. Narratives obtained with the CAI were analyzed using thematic analysis. Both Bowlby's idea of a 'secure base' and Bronfenbrenner's 'ecological niche' are used as reference points to situate child attachment and parenting practices in the larger Kenyan context. We found that, with slight linguistic alterations, the CAI can be used to assess attachment security of Kenyan children in this age range. We also found that the narration ability in both groups of children was generally good, such that formal coding was possible despite cultural differences. Our analysis suggested differences in narrative quality between the children from middle-class and lower socio-economic-class schools on specific themes: sensitivity of parenting (the main aspects of sensitivity were associated with disciplinary methods and the child's access to education), birth order, parental emotional availability, and the severity of inter-parental conflicts and the child's level of exposure to them. The paper puts in context a few cultural practices, such as the greater household responsibility accorded to the eldest child and the stern to harsh disciplinary methods adopted by parents in the Kenyan setting.

  8. Detection limits of quantitative and digital PCR assays and their influence in presence-absence surveys of environmental DNA.

    PubMed

    Hunter, Margaret E; Dorazio, Robert M; Butterfield, John S S; Meigs-Friend, Gaia; Nico, Leo G; Ferrante, Jason A

    2017-03-01

    A set of universal guidelines is needed to determine the limit of detection (LOD) in PCR-based analyses of low-concentration DNA. In particular, environmental DNA (eDNA) studies require sensitive and reliable methods to detect rare and cryptic species through shed genetic material in environmental samples. Current strategies for assessing detection limits of eDNA are either too stringent or subjective, possibly resulting in biased estimates of species' presence. Here, a conservative LOD analysis grounded in analytical chemistry is proposed to correct for overestimated DNA concentrations predominantly caused by the concentration plateau, a nonlinear relationship between expected and measured DNA concentrations. We have used statistical criteria to establish formal mathematical models for both quantitative and droplet digital PCR. To assess the method, a new Grass Carp (Ctenopharyngodon idella) TaqMan assay was developed and tested on both PCR platforms using eDNA in water samples. The LOD adjustment reduced Grass Carp occupancy and detection estimates while increasing uncertainty, indicating that caution needs to be applied to eDNA data without LOD correction. Compared to quantitative PCR, digital PCR had higher occurrence estimates due to increased sensitivity and dilution of inhibitors at low concentrations. Without accurate LOD correction, species occurrence and detection probabilities based on eDNA estimates are prone to a source of bias that cannot be reduced by an increase in sample size or PCR replicates. Other applications, such as GMO food analysis and forensic and clinical diagnostics, could also benefit from a standardized LOD. Published 2016. This article is a U.S. Government work and is in the public domain in the USA.
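    The record does not reproduce its formal models, but a useful reference point, stated here only as an idealized assumption and not as the authors' method, is the Poisson limit for presence-absence PCR: a replicate detects when it receives at least one template copy, which caps the best achievable LOD.

    ```python
    import numpy as np

    def detection_prob(copies_per_reaction):
        """Ideal Poisson model: a replicate detects if it receives >= 1 template
        copy, so p(detect) = 1 - exp(-lambda). Real assays fall below this ideal."""
        return 1.0 - np.exp(-np.asarray(copies_per_reaction, dtype=float))

    def lod95():
        """Mean copy number at which 95% of replicates detect, under the ideal model."""
        return -np.log(0.05)   # about 3 template copies per reaction

    print(detection_prob([1, 3, 5]))  # roughly [0.63, 0.95, 0.99]
    print(lod95())
    ```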

  9. Worth of Geophysical Data in Natural-Disaster-Insurance Rate Setting.

    NASA Astrophysics Data System (ADS)

    Attanasi, E. D.; Karlinger, M. R.

    1982-04-01

    Insurance firms that offer natural-disaster insurance base their rates on available information. The benefits from collecting additional data and incorporating this information to improve parameter estimates of probability distributions that are used to characterize natural-disaster events can be determined by computing changes in premiums as a function of additional data. Specifically, the worth of data can be measured by changes in consumer's surplus (the widely applied measure of benefits to consumers used in benefit-cost analysis) brought about when the premiums are adjusted. In this paper, a formal model of the process for setting insurance rates is hypothesized in which the insurance firm sets rates so as to trade off penalties of overestimation and underestimation of expected damages estimated from currently available hydrologic data. A Bayesian preposterior analysis is performed which permits the determination of the expected benefits of collecting additional geophysical data by examining the changes in expected premium rates as a function of the longer record before the data are actually collected. An estimate of the expected benefits associated with collecting more data for the representative consumer is computed using an assumed demand function for insurance. In addition, a sensitivity analysis of expected benefits to changes in insurance demand and firm rate-setting procedures is carried out. From these results, conclusions are drawn regarding aggregate benefits to all flood insurance purchasers.
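    To make the preposterior idea concrete, the sketch below is a toy stand-in (normal-normal conjugate updating with hypothetical numbers, not the paper's hydrologic model): it shows how the expected posterior variance of mean flood damage, which drives the premium's overestimation and underestimation penalties, can be computed for any proposed record length before the data are actually collected.

    ```python
    def expected_posterior_var(prior_var, obs_var, n_new):
        """Preposterior analysis for a normal mean with known observation
        variance: the posterior variance after n_new observations is known
        in advance of collecting them (it does not depend on the data values)."""
        return 1.0 / (1.0 / prior_var + n_new / obs_var)

    prior_var, obs_var = 4.0, 25.0   # hypothetical damage-distribution parameters
    for n in (0, 5, 10, 50):
        print(n, round(expected_posterior_var(prior_var, obs_var, n), 3))
    ```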

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hartmann, Anja, E-mail: hartmann@ipk-gatersleben.de; Schreiber, Falk; Martin-Luther-University Halle-Wittenberg, Halle

    The characterization of biological systems with respect to their behavior and functionality, based on versatile biochemical interactions, is a major challenge. To understand these complex mechanisms at the systems level, modeling approaches are investigated. Different modeling formalisms allow metabolic models to be analyzed depending on the question to be solved, the biochemical knowledge, and the availability of experimental data. Here, we describe a method for an integrative analysis of the structure and dynamics represented by qualitative and quantitative metabolic models. Using various formalisms, the metabolic model is analyzed from different perspectives. Determined structural and dynamic properties are visualized in the context of the metabolic model. Interaction techniques allow exploration and visual analysis, thereby leading to a broader understanding of the behavior and functionality of the underlying biological system. The System Biology Metabolic Model Framework (SBM² Framework) implements the developed method and, as an example, is applied to the integrative analysis of the crop plant potato.

  11. Rewriting Modulo SMT and Open System Analysis

    NASA Technical Reports Server (NTRS)

    Rocha, Camilo; Meseguer, Jose; Munoz, Cesar

    2014-01-01

    This paper proposes rewriting modulo SMT, a new technique that combines the power of SMT solving, rewriting modulo theories, and model checking. Rewriting modulo SMT is ideally suited to model and analyze infinite-state open systems, i.e., systems that interact with a non-deterministic environment. Such systems exhibit both internal non-determinism, which is proper to the system, and external non-determinism, which is due to the environment. In a reflective formalism, such as rewriting logic, rewriting modulo SMT can be reduced to standard rewriting. Hence, rewriting modulo SMT naturally extends rewriting-based reachability analysis techniques, which are available for closed systems, to open systems. The proposed technique is illustrated with the formal analysis of: (i) a real-time system that is beyond the scope of timed-automata methods and (ii) automatic detection of reachability violations in a synchronous language developed to support autonomous spacecraft operations.
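    As a hedged illustration of the core idea (using the Z3 Python bindings rather than the rewriting-logic machinery the paper describes), the sketch below performs one symbolic rewrite step: the rule fires only if its guard is satisfiable together with the accumulated constraint, so a single symbolic state stands for infinitely many concrete ones, including the environment's non-deterministic choices.

    ```python
    from z3 import Int, Solver, sat

    # Symbolic state: a counter x, constrained by an accumulated formula.
    x = Int("x")
    nxt = Int("x_next")

    def step(constraint):
        """One toy rewrite step modulo SMT for the rule
           x -> x + k, where the open environment picks any k with k > 0.
        The rule fires only if guard plus accumulated constraint is satisfiable."""
        k = Int("k")
        s = Solver()
        s.add(constraint, k > 0, nxt == x + k)   # guard and transition relation
        if s.check() == sat:
            return s.model()
        return None

    # From any state with x >= 0, the environment can move the system forward.
    print(step(x >= 0))  # one concrete witness of the symbolic transition
    ```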

  12. Usability engineering: domain analysis activities for augmented-reality systems

    NASA Astrophysics Data System (ADS)

    Gabbard, Joseph; Swan, J. E., II; Hix, Deborah; Lanzagorta, Marco O.; Livingston, Mark; Brown, Dennis B.; Julier, Simon J.

    2002-05-01

    This paper discusses our usability engineering process for the Battlefield Augmented Reality System (BARS). Usability engineering is a structured, iterative, stepwise development process. Like the related disciplines of software and systems engineering, usability engineering is a combination of management principles and techniques, formal and semi-formal evaluation techniques, and computerized tools. BARS is an outdoor augmented reality system that displays heads-up battlefield intelligence information to a dismounted warrior. The paper discusses our general usability engineering process. We originally developed the process in the context of virtual reality applications, but in this work we are adapting the procedures to an augmented reality system. The focus of this paper is our work on domain analysis, the first activity of the usability engineering process. We describe our plans for and our progress to date on our domain analysis for BARS. We give results in terms of a specific urban battlefield use case we have designed.

  13. Non-standard analysis and embedded software

    NASA Technical Reports Server (NTRS)

    Platek, Richard

    1995-01-01

    One model for computing in the future is ubiquitous, embedded computational devices analogous to embedded electrical motors. Many of these computers will control physical objects and processes. Such hidden computerized environments introduce new safety and correctness concerns whose treatment goes beyond present Formal Methods. In particular, one has to begin to speak about Real Space software in analogy with Real Time software. By this we mean computerized systems which have to meet requirements expressed in the real geometry of space. How to translate such requirements into ordinary software specifications and how to carry out proofs is a major challenge. In this talk we propose a research program based on the use of non-standard analysis. Much detail remains to be carried out. The purpose of the talk is to inform the Formal Methods community that Non-Standard Analysis provides a possible avenue of attack that we believe will be fruitful.

  14. Stellar rotation periods determined from simultaneously measured Ca II H&K and Ca II IRT lines

    NASA Astrophysics Data System (ADS)

    Mittag, M.; Hempelmann, A.; Schmitt, J. H. M. M.; Fuhrmeister, B.; González-Pérez, J. N.; Schröder, K.-P.

    2017-11-01

    Aims: Previous studies have shown that, for late-type stars, activity indicators derived from the Ca II infrared-triplet (IRT) lines are correlated with the indicators derived from the Ca II H&K lines. Therefore, the Ca II IRT lines are in principle usable for activity studies, but they may be less sensitive when measuring the rotation period. Our goal is to determine whether the Ca II IRT lines are sufficiently sensitive to measure rotation periods and how any Ca II IRT derived rotation periods compare with periods derived from the "classical" Mount Wilson S-index. Methods: To analyse the Ca II IRT lines' sensitivity and to measure rotation periods, we define an activity index for each of the Ca II IRT lines similar to the Mount Wilson S-index and perform a period analysis for the lines separately and jointly. Results: For eleven late-type stars we can measure the rotation periods using the Ca II IRT indices similar to those found in the Mount Wilson S-index time series and find that a period derived from all four indices gives the most probable rotation period; we find good agreement for stars with already existing literature values. In a few cases the computed periodograms show a complicated structure with multiple peaks, meaning that formally different periods are derived in different indices. We show that in one case, this is due to data sampling effects and argue that denser cadence sampling is necessary to provide credible evidence for differential rotation. However, our TIGRE data for HD 101501 shows good evidence for the presence of differential rotation.
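    The record does not name its period-search method; a standard tool for unevenly sampled activity time series, offered here only as an illustrative assumption, is the generalized Lomb-Scargle periodogram. The sketch below recovers a rotation period from a toy S-index-like series with made-up numbers.

    ```python
    import numpy as np
    from astropy.timeseries import LombScargle

    rng = np.random.default_rng(2)
    t = np.sort(rng.uniform(0, 500, 180))        # irregular observing epochs (days)
    period_true = 21.0                           # hypothetical rotation period
    s_index = (0.30 + 0.01 * np.sin(2 * np.pi * t / period_true)
               + rng.normal(0, 0.004, t.size))   # toy activity-index time series

    frequency, power = LombScargle(t, s_index).autopower(
        minimum_frequency=1 / 100.0, maximum_frequency=1 / 2.0)
    best_period = 1 / frequency[np.argmax(power)]
    print(round(best_period, 1))                 # close to 21 days
    ```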

  15. Tools reference manual for a Requirements Specification Language (RSL), version 2.0

    NASA Technical Reports Server (NTRS)

    Fisher, Gene L.; Cohen, Gerald C.

    1993-01-01

    This report describes a general-purpose Requirements Specification Language, RSL. The purpose of RSL is to specify precisely the external structure of a mechanized system and to define requirements that the system must meet. A system can be comprised of a mixture of hardware, software, and human processing elements. RSL is a hybrid of features found in several popular requirements specification languages, such as SADT (Structured Analysis and Design Technique), PSL (Problem Statement Language), and RMF (Requirements Modeling Framework). While languages such as these have useful features for structuring a specification, they generally lack formality. To overcome the deficiencies of informal requirements languages, RSL has constructs for formal mathematical specification. These constructs are similar to those found in formal specification languages such as EHDM (Enhanced Hierarchical Development Methodology), Larch, and OBJ3.

  16. Formal analysis of temporal dynamics in anxiety states and traits for virtual patients

    NASA Astrophysics Data System (ADS)

    Aziz, Azizi Ab; Ahmad, Faudziah; Yusof, Nooraini; Ahmad, Farzana Kabir; Yusof, Shahrul Azmi Mohd

    2016-08-01

    This paper presents a temporal dynamic model of anxiety states and traits for an individual. Anxiety is a natural part of life, and most of us experience it from time to time; for some people, however, anxiety can be extreme. Based on several personal characteristics, traits, and a representation of events (i.e., psychological and physiological stressors), the formal model can represent whether a human who experiences certain scenarios will fall into an anxiety-state condition. A number of well-known relations between events and the course of anxiety are summarized from the literature, and it is shown that the model exhibits those patterns. In addition, the formal model has been mathematically analyzed to find out which stable situations exist. Finally, it is pointed out how this model can be used in therapy, supported by a software agent.
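    The paper's precise equations are not reproduced in this record; the sketch below shows only a generic temporal-dynamics pattern of the kind such models use, with hypothetical parameters: the anxiety state rises with stressor events weighted by trait vulnerability and decays toward a trait-determined baseline.

    ```python
    def simulate_anxiety(stressors, trait=0.6, baseline=0.2, decay=0.3, dt=1.0):
        """Generic leaky-accumulator dynamics (illustrative, not the paper's model):
        the state moves toward baseline + trait-weighted stressor input each step."""
        state, trace = baseline, []
        for s in stressors:
            target = baseline + trait * s
            state += dt * decay * (target - state)
            trace.append(round(state, 3))
        return trace

    # A calm stretch, an acute stressor, then recovery.
    print(simulate_anxiety([0, 0, 1, 1, 1, 0, 0, 0]))
    ```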

  17. Stochastic Formal Correctness of Numerical Algorithms

    NASA Technical Reports Server (NTRS)

    Daumas, Marc; Lester, David; Martin-Dorel, Erik; Truffert, Annick

    2009-01-01

    We provide a framework to bound the probability that accumulated errors were never above a given threshold in numerical algorithms. Such algorithms are used, for example, in aircraft and nuclear power plants. This report contains simple formulas based on Levy's and Markov's inequalities, and it presents a formal theory of random variables with a special focus on producing concrete results. We selected four very common applications that fit in our framework and cover the common practices of systems that evolve for a long time. We compute the number of bits that remain continuously significant in the first two applications with a probability of failure around one in a billion, where worst-case analysis considers that no significant bit remains. We use PVS, as such formal tools force explicit statement of all hypotheses and prevent incorrect uses of theorems.
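    For reference, these are the two classical bounds the report's formulas build on, stated generically; the report's own formulas specialize them to accumulated rounding errors and are mechanized in PVS.

    ```latex
    % Markov's inequality: for a nonnegative random variable X and a > 0,
    \[ \Pr(X \ge a) \;\le\; \frac{\mathbb{E}[X]}{a}. \]
    % Levy's maximal inequality: for partial sums S_k of independent,
    % symmetric random errors (e.g., accumulated rounding errors),
    \[ \Pr\Big(\max_{k \le n} S_k \ge a\Big) \;\le\; 2\,\Pr(S_n \ge a). \]
    ```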

  18. Professionalism and ethics: A proposed curriculum for undergraduates.

    PubMed

    Mahajan, Rajiv; Aruldhas, Blessed Winston; Sharma, Monika; Badyal, Dinesh K; Singh, Tejinder

    2016-01-01

    Professionalism comprises the attributes, behaviors, commitments, values, and goals that characterize a profession. In the medical profession, it encompasses a strong societal role and involves an emotional component as well. Ethics, on the other hand, is the study of morality: the careful and systematic analysis of moral decisions and behaviors, and the practice of those decisions. Medical ethics focuses primarily on issues arising out of the practice of medicine. It is generally believed that professionalism and ethics are caught by watching one's teachers and seniors rather than taught formally. Professionalism and ethics have traditionally been diffused passively to students through "the hidden curriculum," leaving a lot to chance. Over time, however, it has been advocated that graduates need to be formally trained in the concepts of professionalism and ethics. In this paper, we propose a formal curriculum on professionalism and ethics, tailor-made for Indian medical graduates.

  19. What is the right formalism to search for resonances?

    NASA Astrophysics Data System (ADS)

    Mikhasenko, M.; Pilloni, A.; Nys, J.; Albaladejo, M.; Fernández-Ramírez, C.; Jackura, A.; Mathieu, V.; Sherrill, N.; Skwarnicki, T.; Szczepaniak, A. P.

    2018-03-01

    Hadron decay chains constitute one of the main sources of information on the QCD spectrum. We discuss the differences between several partial-wave analysis formalisms used in the literature to build the amplitudes. We match the helicity amplitudes to the covariant tensor basis. In doing so, we pay attention to the analytical properties of the amplitudes and separate singularities of kinematical and dynamical nature. We study the analytical properties of the spin-orbit (LS) formalism and of some of the covariant tensor approaches. In particular, we explicitly build the amplitudes for the B → ψπK and B → D̄ππ decays, and show that the energy dependence of the covariant approach is model dependent. We also show that the usual recursive construction of covariant tensors explicitly violates crossing symmetry, which would lead to different resonance parameters extracted from scattering and decay processes.

  20. Occupational risk for Legionella infection among dental healthcare workers: meta-analysis in occupational epidemiology.

    PubMed

    Petti, Stefano; Vitali, Matteo

    2017-07-13

    The occupational risk for Legionella infection among dental healthcare workers (DHCWs) is conjectured because of the routine inhalation of potentially contaminated aerosols produced by dental instruments. Nevertheless, occupational epidemiology studies are conflicting. This meta-analysis assessed the level of scientific evidence regarding the relative occupational risk for Legionella infection among DHCWs. A literature search was performed without time and language restrictions, using broad data banks (PubMed, Scopus, Web of Science, Google Scholar) and generic keywords ('legionella' AND 'dent*'). Analytical cross-sectional studies comparing the prevalence of high serum Legionella antibody levels in DHCWs and occupationally unexposed individuals were considered. The relative occupational risk was assessed through the prevalence ratio (PR) with 95% CI. Between-study heterogeneity was assessed (Cochran's Q test) and was used to choose the meta-analytic method. Study quality (modified Newcastle-Ottawa Scale) and publication bias (Begg and Mazumdar's test, Egger and colleagues' test, trim-and-fill R0 method) were assessed formally and considered in the sensitivity analysis. Sensitivity analysis with respect to study inclusion and subgroup analyses (dental staff categories; publication year, before vs after 1998, i.e., 5 years after the release by the Centers for Disease Control and Prevention of the infection control guidelines for dental healthcare settings) were performed. Seven studies were included (2232 DHCWs, 1172 occupationally unexposed individuals). No evidence of publication bias was detected. The pooled PR estimate was statistically non-significant at the 95% level (1.7; 95% CI 0.8 to 3.2), and study-quality adjustment did not change the PR considerably (PR, 1.5; 95% CI 0.5 to 4.1). The PR was statistically significant before 1998 and no longer significant after 1998. Subgroup analysis according to DHCW categories was inconclusive. There is no scientific evidence that DHCWs are at high occupational risk. The differences between former and recent studies could be due to different characteristics of municipal water systems and to the dissemination of the infection control guidelines. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
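    For readers who want the mechanics of a pooled prevalence ratio, the sketch below implements generic inverse-variance random-effects pooling of log-PR values (the DerSimonian-Laird estimator); the numbers are made up and are not the seven studies in this review.

    ```python
    import numpy as np

    def pooled_pr(pr, lo, hi):
        """DerSimonian-Laird random-effects pooling of prevalence ratios.
        pr, lo, hi: per-study PR and 95% CI bounds. Toy data, not this review's."""
        y = np.log(pr)                                # log-PR per study
        se = (np.log(hi) - np.log(lo)) / (2 * 1.96)   # SE from the CI width
        w = 1 / se**2                                 # fixed-effect weights
        q = np.sum(w * (y - np.sum(w * y) / np.sum(w))**2)   # Cochran's Q
        df = len(y) - 1
        c = np.sum(w) - np.sum(w**2) / np.sum(w)
        tau2 = max(0.0, (q - df) / c)                 # between-study variance
        w_re = 1 / (se**2 + tau2)                     # random-effects weights
        mean = np.sum(w_re * y) / np.sum(w_re)
        se_mean = np.sqrt(1 / np.sum(w_re))
        ci = np.exp([mean - 1.96 * se_mean, mean + 1.96 * se_mean])
        return np.exp(mean), ci

    print(pooled_pr(pr=[1.2, 2.5, 0.9], lo=[0.6, 1.1, 0.4], hi=[2.4, 5.7, 2.0]))
    ```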
