Sample records for objective analysis scheme

  1. The GEMPAK Barnes objective analysis scheme

    NASA Technical Reports Server (NTRS)

    Koch, S. E.; Desjardins, M.; Kocin, P. J.

    1981-01-01

    GEMPAK, an interactive computer software system developed for the purpose of assimilating, analyzing, and displaying various conventional and satellite meteorological data types, is discussed. The objective map analysis scheme possesses certain characteristics that allowed it to be adapted to meet the analysis needs of GEMPAK. Those characteristics and the specific adaptation of the scheme to GEMPAK are described. A step-by-step guide for using the GEMPAK Barnes scheme on an interactive computer (in real time) to analyze various types of meteorological datasets is also presented.

  2. Error determination of a successive correction type objective analysis scheme [for surface meteorological data]

    NASA Technical Reports Server (NTRS)

    Smith, D. R.; Leslie, F. W.

    1984-01-01

    The Purdue Regional Objective Analysis of the Mesoscale (PROAM) is a successive correction type scheme for the analysis of surface meteorological data. The scheme is subjected to a series of experiments to evaluate its performance under a variety of analysis conditions. The tests include use of a known analytic temperature distribution to quantify error bounds for the scheme. Similar experiments were conducted using actual atmospheric data. Results indicate that the multiple pass technique increases the accuracy of the analysis. Furthermore, the tests suggest appropriate values for the analysis parameters in resolving disturbances for the data set used in this investigation.

  3. Comparison of Optimum Interpolation and Cressman Analyses

    NASA Technical Reports Server (NTRS)

    Baker, W. E.; Bloom, S. C.; Nestler, M. S.

    1984-01-01

    The objective of this investigation is to develop a state-of-the-art optimum interpolation (O/I) objective analysis procedure for use in numerical weather prediction studies. A three-dimensional multivariate O/I analysis scheme has been developed. Some characteristics of the GLAS O/I compared with those of the NMC and ECMWF systems are summarized. Some recent enhancements of the GLAS scheme include a univariate analysis of water vapor mixing ratio, a geographically dependent model prediction error correlation function and a multivariate oceanic surface analysis.
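The innovation-weighting step at the heart of an optimum interpolation analysis can be sketched as follows. This is a generic univariate 1-D illustration with an assumed Gaussian background error correlation, not the GLAS multivariate scheme itself:

```python
import numpy as np

def gaussian_corr(d, L):
    """Gaussian correlation as a function of separation distance d and length scale L."""
    return np.exp(-(d / L) ** 2)

def oi_analysis(xg, xo, yo_innov, sigma_b, sigma_o, L):
    """Univariate optimum interpolation on a 1-D grid.

    xg       : grid point positions
    xo       : observation positions
    yo_innov : innovations (obs minus background interpolated to obs points)
    sigma_b, sigma_o : background and observation error standard deviations
    L        : background error correlation length scale
    Returns the analysis increment at the grid points.
    """
    # Background error covariance between observation points, plus obs error on the diagonal
    d_oo = np.abs(xo[:, None] - xo[None, :])
    A = sigma_b**2 * gaussian_corr(d_oo, L) + sigma_o**2 * np.eye(len(xo))
    # Background error covariance between grid points and observation points
    d_go = np.abs(xg[:, None] - xo[None, :])
    B_go = sigma_b**2 * gaussian_corr(d_go, L)
    # Optimal weights solve A w = innovations; the increment is B_go @ w
    w = np.linalg.solve(A, yo_innov)
    return B_go @ w
```

With a perfect observation (sigma_o = 0) the analysis draws fully to the observed value at the observation point; with sigma_o = sigma_b it draws halfway, which is the usual sanity check on an OI implementation.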

  4. Statistical Field Estimation for Complex Coastal Regions and Archipelagos (PREPRINT)

    DTIC Science & Technology

    2011-04-09

    and study the computational properties of these schemes. Specifically, we extend a multiscale Objective Analysis (OA) approach to complex coastal regions and... multiscale free-surface code builds on the primitive-equation model of the Harvard Ocean Prediction System (HOPS, Haley et al. (2009)). Additionally

  5. Process optimization of solid radwaste management at the Shelter object transformation to the ecologically safe system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Batiy, V.G.; Stojanov, A.I.; Schmieman, E.

    2007-07-01

    A methodological approach to optimizing schemes for solid radwaste management at the Object Shelter (Shelter) and the ChNPP industrial site during transformation to an ecologically safe system was developed. On the basis of the modeling studies conducted, an ALARA analysis was carried out to choose the optimum variant among the schemes and technologies for solid radwaste management. Criteria for choosing the optimum schemes, aimed at optimizing doses and financial expenses, minimizing the amount of radwaste formed, etc., were developed for this ALARA analysis. (authors)

  6. Modification and evaluation of a Barnes-type objective analysis scheme for surface meteorological data

    NASA Technical Reports Server (NTRS)

    Smith, D. R.

    1982-01-01

    The Purdue Regional Objective Analysis of the Mesoscale (PROAM) is a Barnes-type scheme for the analysis of surface meteorological data. Modifications were introduced to the original version to increase its flexibility and ease of use. The code was rewritten for an interactive computer environment, and a multiple-iteration technique suggested by Barnes was implemented for greater accuracy. PROAM was subjected to a series of experiments to evaluate its performance under a variety of analysis conditions. The tests include use of a known analytic temperature distribution to quantify error bounds for the scheme. Similar experiments were conducted using actual atmospheric data. Results indicate that the multiple-iteration technique increases the accuracy of the analysis. Furthermore, the tests verify appropriate values for the analysis parameters in resolving meso-beta-scale phenomena.
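The multiple-pass Barnes (successive correction) technique described in this abstract can be sketched as follows; the grid layout, weight parameter, and two-pass structure are generic textbook choices, not PROAM's actual code:

```python
import numpy as np

def barnes_analysis(xg, yg, xo, yo, fo, kappa, gamma=0.3, passes=2):
    """Multi-pass Barnes (successive correction) analysis on a 2-D grid.

    xg, yg     : 1-D arrays of grid coordinates (meshed internally)
    xo, yo, fo : observation locations and values
    kappa      : Gaussian weight parameter (squared length scale)
    gamma      : convergence parameter shrinking kappa on later passes
    """
    GX, GY = np.meshgrid(xg, yg, indexing="ij")
    grid = np.zeros_like(GX, dtype=float)
    obs_est = np.zeros_like(fo, dtype=float)   # analysis interpolated back to obs points
    k = kappa
    for _ in range(passes):
        resid = fo - obs_est                   # first pass: residual is the obs itself
        # Gaussian weights from every observation to every grid point
        r2 = (GX[..., None] - xo) ** 2 + (GY[..., None] - yo) ** 2
        w = np.exp(-r2 / k)
        grid = grid + (w * resid).sum(axis=-1) / w.sum(axis=-1)
        # update the estimate at the observation points the same way
        r2o = (xo[:, None] - xo) ** 2 + (yo[:, None] - yo) ** 2
        wo = np.exp(-r2o / k)
        obs_est = obs_est + (wo * resid).sum(axis=-1) / wo.sum(axis=-1)
        k = gamma * k                          # sharpen the response on later passes
    return grid
```

The second pass adds back Gaussian-weighted residuals with a tighter length scale, which is what restores amplitude in small-scale features that the first pass smooths out.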

  7. Multilevel Green's function interpolation method for scattering from composite metallic and dielectric objects.

    PubMed

    Shi, Yan; Wang, Hao Gang; Li, Long; Chan, Chi Hou

    2008-10-01

    A multilevel Green's function interpolation method based on two kinds of multilevel partitioning schemes--the quasi-2D and the hybrid partitioning scheme--is proposed for analyzing electromagnetic scattering from objects comprising both conducting and dielectric parts. The problem is formulated using the surface integral equation for homogeneous dielectric and conducting bodies. A quasi-2D multilevel partitioning scheme is devised to improve the efficiency of the Green's function interpolation. In contrast to previous multilevel partitioning schemes, noncubic groups are introduced to discretize the whole EM structure in this quasi-2D multilevel partitioning scheme. Based on the detailed analysis of the dimension of the group in this partitioning scheme, a hybrid quasi-2D/3D multilevel partitioning scheme is proposed to effectively handle objects with fine local structures. Selection criteria for some key parameters relating to the interpolation technique are given. The proposed algorithm is ideal for the solution of problems involving objects such as missiles, microstrip antenna arrays, photonic bandgap structures, etc. Numerical examples are presented to show that CPU time is between O(N) and O(N log N) while the computer memory requirement is O(N).

  8. Sensitivity of Forecast Skill to Different Objective Analysis Schemes

    NASA Technical Reports Server (NTRS)

    Baker, W. E.

    1979-01-01

    Numerical weather forecasts are characterized by rapidly declining skill in the first 48 to 72 h. Recent estimates of the sources of forecast error indicate that the inaccurate specification of the initial conditions contributes substantially to this error. The sensitivity of the forecast skill to the initial conditions is examined by comparing a set of real-data experiments whose initial data were obtained with two different analysis schemes. Results are presented to emphasize the importance of the objective analysis techniques used in the assimilation of observational data.

  9. A simple, objective analysis scheme for scatterometer data. [Seasat A satellite observation of wind over ocean

    NASA Technical Reports Server (NTRS)

    Levy, G.; Brown, R. A.

    1986-01-01

    A simple economical objective analysis scheme is devised and tested on real scatterometer data. It is designed to treat dense data such as those of the Seasat A Satellite Scatterometer (SASS) for individual or multiple passes, and preserves subsynoptic scale features. Errors are evaluated with the aid of sampling ('bootstrap') statistical methods. In addition, sensitivity tests have been performed which establish qualitative confidence in calculated fields of divergence and vorticity. The SASS wind algorithm could be improved; however, the data at this point are limited by instrument errors rather than analysis errors. The analysis error is typically negligible in comparison with the instrument error, but amounts to 30 percent of the instrument error in areas of strong wind shear. The scheme is very economical, and thus suitable for large volumes of dense data such as SASS data.

  10. Using object-based image analysis to guide the selection of field sample locations

    USDA-ARS's Scientific Manuscript database

    One of the most challenging tasks for resource management and research is designing field sampling schemes to achieve unbiased estimates of ecosystem parameters as efficiently as possible. This study focused on the potential of fine-scale image objects from object-based image analysis (OBIA) to be u...

  11. The Nature of All "Inappropriate Referrals" Made to a Countywide Physical Activity Referral Scheme: Implications for Practice

    ERIC Educational Resources Information Center

    Johnston, Lynne Halley; Warwick, Jane; De Ste Croix, Mark; Crone, Diane; Sldford, Adrienne

    2005-01-01

    Objective: The aim of this study was to evaluate the impact of a centralised referral mechanism (CRM) upon the number and type of "inappropriate referrals" made to a countywide physical activity referral scheme. Design: Case study. Method: Phase 1: Hierarchical Content Analysis of 458 "inappropriate referrals" made to a countywide scheme over a…

  12. Objective analysis of observational data from the FGGE observing systems

    NASA Technical Reports Server (NTRS)

    Baker, W.; Edelmann, D.; Iredell, M.; Han, D.; Jakkempudi, S.

    1981-01-01

    An objective analysis procedure for updating the GLAS second and fourth order general atmospheric circulation models using observational data from the First GARP Global Experiment (FGGE) is described. The objective analysis procedure is based on a successive corrections method, and the model is updated in a data assimilation cycle. Preparation of the observational data for analysis and the objective analysis scheme are described. The organization of the program and descriptions of the required data sets are presented. The program logic and detailed descriptions of each subroutine are given.

  13. Comparison of Optimum Interpolation and Cressman Analyses

    NASA Technical Reports Server (NTRS)

    Baker, W. E.; Bloom, S. C.; Nestler, M. S.

    1985-01-01

    The development of a state of the art optimum interpolation (O/I) objective analysis procedure for use in numerical weather prediction studies was investigated. A three dimensional multivariate O/I analysis scheme was developed. Some characteristics of the GLAS O/I compared with those of the NMC and ECMWF systems are summarized. Some recent enhancements of the GLAS scheme include a univariate analysis of water vapor mixing ratio, a geographically dependent model prediction error correlation function and a multivariate oceanic surface analysis.

  14. Health financing for universal coverage and health system performance: concepts and implications for policy

    PubMed Central

    2013-01-01

    Abstract Unless the concept is clearly understood, “universal coverage” (or universal health coverage, UHC) can be used to justify practically any health financing reform or scheme. This paper unpacks the definition of health financing for universal coverage as used in the World Health Organization’s World health report 2010 to show how UHC embodies specific health system goals and intermediate objectives and, broadly, how health financing reforms can influence these. All countries seek to improve equity in the use of health services, service quality and financial protection for their populations. Hence, the pursuit of UHC is relevant to every country. Health financing policy is an integral part of efforts to move towards UHC, but for health financing policy to be aligned with the pursuit of UHC, health system reforms need to be aimed explicitly at improving coverage and the intermediate objectives linked to it, namely, efficiency, equity in health resource distribution and transparency and accountability. The unit of analysis for goals and objectives must be the population and health system as a whole. What matters is not how a particular financing scheme affects its individual members, but rather, how it influences progress towards UHC at the population level. Concern only with specific schemes is incompatible with a universal coverage approach and may even undermine UHC, particularly in terms of equity. Conversely, if a scheme is fully oriented towards system-level goals and objectives, it can further progress towards UHC. Policy and policy analysis need to shift from the scheme to the system level. PMID:23940408

  15. Health financing for universal coverage and health system performance: concepts and implications for policy.

    PubMed

    Kutzin, Joseph

    2013-08-01

    Unless the concept is clearly understood, "universal coverage" (or universal health coverage, UHC) can be used to justify practically any health financing reform or scheme. This paper unpacks the definition of health financing for universal coverage as used in the World Health Organization's World health report 2010 to show how UHC embodies specific health system goals and intermediate objectives and, broadly, how health financing reforms can influence these. All countries seek to improve equity in the use of health services, service quality and financial protection for their populations. Hence, the pursuit of UHC is relevant to every country. Health financing policy is an integral part of efforts to move towards UHC, but for health financing policy to be aligned with the pursuit of UHC, health system reforms need to be aimed explicitly at improving coverage and the intermediate objectives linked to it, namely, efficiency, equity in health resource distribution and transparency and accountability. The unit of analysis for goals and objectives must be the population and health system as a whole. What matters is not how a particular financing scheme affects its individual members, but rather, how it influences progress towards UHC at the population level. Concern only with specific schemes is incompatible with a universal coverage approach and may even undermine UHC, particularly in terms of equity. Conversely, if a scheme is fully oriented towards system-level goals and objectives, it can further progress towards UHC. Policy and policy analysis need to shift from the scheme to the system level.

  16. Error analysis of finite difference schemes applied to hyperbolic initial boundary value problems

    NASA Technical Reports Server (NTRS)

    Skollermo, G.

    1979-01-01

    Finite difference methods for the numerical solution of mixed initial boundary value problems for hyperbolic equations are studied. The objective of the reported investigation is to develop a technique for the total error analysis of a finite difference scheme, taking into account the initial approximation, boundary conditions, and interior approximation. Attention is given to the Cauchy problem and the initial approximation, the homogeneous problem in an infinite strip with inhomogeneous boundary data, the reflection of errors at the boundaries, and two different boundary approximations for the leapfrog scheme with a fourth-order accurate difference operator in space.

  17. Connecting Payments for Ecosystem Services and Agri-Environment Regulation: An Analysis of the Welsh Glastir Scheme

    ERIC Educational Resources Information Center

    Wynne-Jones, Sophie

    2013-01-01

    Policy debates in the European Union have increasingly emphasised "Payments for Ecosystem Services" (PES) as a model for delivering agri-environmental objectives. This paper examines the Glastir scheme, introduced in Wales in 2009, as a notable attempt to move between long standing models of European agri-environment regulation and…

  18. Ancient numerical daemons of conceptual hydrological modeling: 2. Impact of time stepping schemes on model analysis and prediction

    NASA Astrophysics Data System (ADS)

    Kavetski, Dmitri; Clark, Martyn P.

    2010-10-01

    Despite the widespread use of conceptual hydrological models in environmental research and operations, they remain frequently implemented using numerically unreliable methods. This paper considers the impact of the time stepping scheme on model analysis (sensitivity analysis, parameter optimization, and Markov chain Monte Carlo-based uncertainty estimation) and prediction. It builds on the companion paper (Clark and Kavetski, 2010), which focused on numerical accuracy, fidelity, and computational efficiency. Empirical and theoretical analysis of eight distinct time stepping schemes for six different hydrological models in 13 diverse basins demonstrates several critical conclusions. (1) Unreliable time stepping schemes, in particular, fixed-step explicit methods, suffer from troublesome numerical artifacts that severely deform the objective function of the model. These deformations are not rare isolated instances but can arise in any model structure, in any catchment, and under common hydroclimatic conditions. (2) Sensitivity analysis can be severely contaminated by numerical errors, often to the extent that it becomes dominated by the sensitivity of truncation errors rather than the model equations. (3) Robust time stepping schemes generally produce "better behaved" objective functions, free of spurious local optima, and with sufficient numerical continuity to permit parameter optimization using efficient quasi-Newton methods. When implemented within a multistart framework, modern Newton-type optimizers are robust even when started far from the optima and provide valuable diagnostic insights not directly available from evolutionary global optimizers. (4) Unreliable time stepping schemes lead to inconsistent and biased inferences of the model parameters and internal states. (5) Even when interactions between hydrological parameters and numerical errors provide "the right result for the wrong reason" and the calibrated model performance appears adequate, unreliable time stepping schemes make the model unnecessarily fragile in predictive mode, undermining validation assessments and operational use. Erroneous or misleading conclusions of model analysis and prediction arising from numerical artifacts in hydrological models are intolerable, especially given that robust numerics are accepted as mainstream in other areas of science and engineering. We hope that the vivid empirical findings will encourage the conceptual hydrological community to close its Pandora's box of numerical problems, paving the way for more meaningful model application and interpretation.
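The kind of artifact the paper attributes to fixed-step explicit methods is easy to reproduce on a single linear reservoir. This toy sketch (my own example, not one of the paper's six models) contrasts explicit Euler with the analytic solution; for large steps the explicit update oscillates and diverges even though the true solution decays smoothly:

```python
import numpy as np

def linear_reservoir_euler(S0, k, P, dt, n):
    """Fixed-step explicit Euler for the linear reservoir dS/dt = P - k*S."""
    S = S0
    out = []
    for _ in range(n):
        S = S + dt * (P - k * S)   # truncation error grows with dt
        out.append(S)
    return np.array(out)

def linear_reservoir_exact(S0, k, P, t):
    """Analytic solution of dS/dt = P - k*S."""
    return P / k + (S0 - P / k) * np.exp(-k * t)
```

With k*dt small the Euler trajectory tracks the exact decay; once k*dt exceeds 2 the update factor (1 - k*dt) has magnitude greater than one and the numerical storage grows without bound, the sort of scheme-induced behavior that then contaminates any objective function built on the simulated series.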

  19. Assessment of numerical methods for the solution of fluid dynamics equations for nonlinear resonance systems

    NASA Technical Reports Server (NTRS)

    Przekwas, A. J.; Yang, H. Q.

    1989-01-01

    The capability of accurate nonlinear flow analysis of resonance systems is essential in many problems, including combustion instability. Classical numerical schemes are either too diffusive or too dispersive especially for transient problems. In the last few years, significant progress has been made in the numerical methods for flows with shocks. The objective was to assess advanced shock capturing schemes on transient flows. Several numerical schemes were tested including TVD, MUSCL, ENO, FCT, and Riemann Solver Godunov type schemes. A systematic assessment was performed on scalar transport, Burgers' and gas dynamic problems. Several shock capturing schemes are compared on fast transient resonant pipe flow problems. A system of 1-D nonlinear hyperbolic gas dynamics equations is solved to predict propagation of finite amplitude waves, the wave steepening, formation, propagation, and reflection of shocks for several hundred wave cycles. It is shown that high accuracy schemes can be used for direct, exact nonlinear analysis of combustion instability problems, preserving high harmonic energy content for long periods of time.
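A minimal first-order Godunov (Riemann-solver) scheme for inviscid Burgers' equation, one of the scheme families assessed above; this is a textbook sketch under my own discretization choices, not the authors' code:

```python
import numpy as np

def godunov_flux(ul, ur):
    """Exact Riemann flux for inviscid Burgers' equation, f(u) = u^2 / 2."""
    if ul > ur:                       # shock: pick the state by the shock speed sign
        s = 0.5 * (ul + ur)
        return 0.5 * ul**2 if s > 0 else 0.5 * ur**2
    # rarefaction
    if ul > 0:
        return 0.5 * ul**2
    if ur < 0:
        return 0.5 * ur**2
    return 0.0                        # sonic point inside the fan

def burgers_godunov(u, dx, dt, steps):
    """First-order Godunov scheme with periodic boundaries.

    Conservative update: u_i <- u_i - dt/dx * (F_{i+1/2} - F_{i-1/2}).
    """
    u = u.copy()
    for _ in range(steps):
        up = np.roll(u, -1)           # right neighbor
        # F[i] is the flux at interface i+1/2; roll(F, 1)[i] is at i-1/2
        F = np.array([godunov_flux(a, b) for a, b in zip(u, up)])
        u = u - dt / dx * (F - np.roll(F, 1))
    return u
```

Being conservative, the scheme preserves the cell average exactly, and under the CFL condition it is monotone, so no new extrema appear as the sine wave steepens into a shock.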

  20. Method of center localization for objects containing concentric arcs

    NASA Astrophysics Data System (ADS)

    Kuznetsova, Elena G.; Shvets, Evgeny A.; Nikolaev, Dmitry P.

    2015-02-01

    This paper proposes a method for automatic center localization of objects containing concentric arcs. The method utilizes structure tensor analysis and a voting scheme optimized with the Fast Hough Transform. Two applications of the proposed method are considered: (i) wheel tracking in a video-based system for automatic vehicle classification and (ii) tree growth ring analysis on a tree cross-cut image.
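The voting idea can be sketched in brute force. The paper accelerates the accumulation with the Fast Hough Transform; this simple gradient-line voter is an illustrative assumption, not the published algorithm:

```python
import numpy as np

def vote_center(img, grad_thresh=0.1):
    """Locate the common center of concentric arcs by gradient-direction voting.

    Every strong-gradient pixel casts votes along the line through itself in
    the gradient direction; for concentric arcs those lines all pass through
    the center, so the accumulator peaks there.
    """
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    acc = np.zeros_like(mag)
    h, w = img.shape
    ys, xs = np.nonzero(mag > grad_thresh * mag.max())
    for y, x in zip(ys, xs):
        dx, dy = gx[y, x] / mag[y, x], gy[y, x] / mag[y, x]
        for t in np.arange(-max(h, w), max(h, w)):
            px = int(round(x + t * dx))
            py = int(round(y + t * dy))
            if 0 <= px < w and 0 <= py < h:
                acc[py, px] += 1
    cy, cx = np.unravel_index(acc.argmax(), acc.shape)
    return cx, cy
```

On a synthetic ring the accumulator maximum lands at the ring's center; the Fast Hough Transform version replaces the inner loop with a hierarchical line-sum structure to cut the cost.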

  1. No-Fault Compensation for Adverse Events Following Immunization: A Review of Chinese Law And Practice.

    PubMed

    Fei, Lanfang; Peng, Zhou

    2017-02-01

    In 2005, China introduced an administrative no-fault one-time compensation scheme for adverse events following immunization (AEFI). The scheme aims to ensure fair compensation for those injured by adverse reactions following immunization. These individuals bear a significant burden for the benefits of widespread immunization. However, there is little empirical evidence of how the scheme has been implemented and how it functions in practice. The article aims to fill this gap. Based on an analysis of the legal basis of the scheme and of practical compensation cases, this article examines the structuring, function, and effects of the scheme; evaluates loopholes in the scheme; evaluates the extent to which the scheme has achieved its intended objectives; and discusses further development of the scheme.

  2. Research on comprehensive decision-making of PV power station connecting system

    NASA Astrophysics Data System (ADS)

    Zhou, Erxiong; Xin, Chaoshan; Ma, Botao; Cheng, Kai

    2018-04-01

    To address the incomplete index system, and the lack of decision-making that accounts for both the subjectivity and objectivity of PV power station grid connection, a combination of an improved Analytic Hierarchy Process (AHP), Criteria Importance Through Intercriteria Correlation (CRITIC), and grey correlation degree analysis (GCDA) is proposed to select an appropriate grid-connection scheme for a PV power station. First, the indexes of the PV power station connecting system are organized into a recursive hierarchy and subjective weights are calculated with the improved AHP. Then, CRITIC is adopted to determine the objective weight of each index from the contrast intensity and conflict between indexes. Finally, the improved GCDA is applied to screen the optimal scheme, so that the connecting system is selected from both subjective and objective angles. A comprehensive decision for a Xinjiang PV power station is conducted and reasonable analysis results are attained. The research results might provide a scientific basis for investment decisions.
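The objective-weighting step can be sketched with a minimal CRITIC implementation; the min-max normalization and benefit-criteria assumption are mine, not necessarily the paper's exact formulation:

```python
import numpy as np

def critic_weights(X):
    """CRITIC objective weights from a decision matrix X (alternatives x criteria).

    Each criterion's weight combines its contrast intensity (standard deviation
    of the normalized column) with its conflict with the other criteria
    (sum of 1 minus the pairwise correlations).
    """
    # min-max normalize each criterion (assumes all are benefit criteria)
    Xn = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
    s = Xn.std(axis=0, ddof=1)               # contrast intensity
    R = np.corrcoef(Xn, rowvar=False)        # criterion correlation matrix
    C = s * (1.0 - R).sum(axis=0)            # information content per criterion
    return C / C.sum()                       # normalized objective weights
```

Criteria that vary strongly across alternatives and correlate weakly with the rest carry more information and therefore receive larger weights; the subjective AHP weights would then be combined with these before the grey correlation ranking.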

  3. On the convergence of nonconvex minimization methods for image recovery.

    PubMed

    Xiao, Jin; Ng, Michael Kwok-Po; Yang, Yu-Fei

    2015-05-01

    Nonconvex nonsmooth regularization methods have been shown to be effective for restoring images with neat edges. Fast alternating minimization schemes have also been proposed and developed to solve the nonconvex nonsmooth minimization problem. The main contribution of this paper is to show the convergence of these alternating minimization schemes, based on the Kurdyka-Łojasiewicz property. In particular, we show that the iterates generated by the alternating minimization scheme converge to a critical point of the nonconvex nonsmooth objective function. We also extend the analysis to a nonconvex nonsmooth regularization model with box constraints, and obtain similar convergence results for the related minimization algorithm. Numerical examples are given to illustrate our convergence analysis.
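The mechanism of two-block alternating minimization, and its monotone descent toward a critical point, can be illustrated on a toy smooth objective (my own example, far simpler than the paper's nonconvex nonsmooth image model):

```python
def alternating_minimization(a, b, lam, iters=100):
    """Two-block alternating (exact) minimization of
        F(x, y) = (x - a)^2 + (y - b)^2 + lam * x * y.

    Each block subproblem is solved exactly, so F decreases monotonically
    and the iterates settle at a critical point (here, the solution of the
    coupled stationarity equations, for |lam| < 2).
    """
    def F(x, y):
        return (x - a) ** 2 + (y - b) ** 2 + lam * x * y

    x, y = 0.0, 0.0
    values = [F(x, y)]
    for _ in range(iters):
        x = a - lam * y / 2.0        # exact argmin over x with y fixed
        y = b - lam * x / 2.0        # exact argmin over y with x fixed
        values.append(F(x, y))
    return x, y, values
```

Because each block update can only lower F, the recorded objective values form a nonincreasing sequence, which is the descent property the Kurdyka-Łojasiewicz machinery strengthens into convergence of the iterates themselves.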

  4. Improved biliary detection and diagnosis through intelligent machine analysis.

    PubMed

    Logeswaran, Rajasvaran

    2012-09-01

    This paper reports on work undertaken to improve automated detection of bile ducts in magnetic resonance cholangiopancreatography (MRCP) images, with the objective of conducting preliminary classification of the images for diagnosis. The proposed I-BDeDIMA (Improved Biliary Detection and Diagnosis through Intelligent Machine Analysis) scheme is a multi-stage framework consisting of successive phases of image normalization, denoising, structure identification, object labeling, feature selection, and disease classification. A combination of multiresolution wavelets, dynamic intensity thresholding, segment-based region growing, region elimination, statistical analysis, and neural networks is used in this framework to achieve good structure detection and preliminary diagnosis. Tests conducted on over 200 clinical images with known diagnoses have shown promising results of over 90% accuracy. The scheme outperforms related work in the literature, making it a viable framework for computer-aided diagnosis of biliary diseases.

  5. Classification schemes for knowledge translation interventions: a practical resource for researchers.

    PubMed

    Slaughter, Susan E; Zimmermann, Gabrielle L; Nuspl, Megan; Hanson, Heather M; Albrecht, Lauren; Esmail, Rosmin; Sauro, Khara; Newton, Amanda S; Donald, Maoliosa; Dyson, Michele P; Thomson, Denise; Hartling, Lisa

    2017-12-06

    As implementation science advances, the number of interventions to promote the translation of evidence into healthcare, health systems, or health policy is growing. Accordingly, classification schemes for these knowledge translation (KT) interventions have emerged. A recent scoping review identified 51 classification schemes of KT interventions to integrate evidence into healthcare practice; however, the review did not evaluate the quality of the classification schemes or provide detailed information to assist researchers in selecting a scheme for their context and purpose. This study aimed to further examine and assess the quality of these classification schemes of KT interventions, and provide information to aid researchers when selecting a classification scheme. We abstracted the following information from each of the original 51 classification scheme articles: authors' objectives; purpose of the scheme and field of application; socioecologic level (individual, organizational, community, system); adaptability (broad versus specific); target group (patients, providers, policy-makers); intent (policy, education, practice); and purpose (dissemination versus implementation). Two reviewers independently evaluated the methodological quality of the development of each classification scheme using an adapted version of the AGREE II tool. Based on these assessments, two independent reviewers reached consensus about whether to recommend each scheme for researcher use, or not. Of the 51 original classification schemes, we excluded seven that were not specific classification schemes, not accessible, or duplicates. Of the remaining 44 classification schemes, nine were not recommended. Of the 35 recommended classification schemes, ten focused on behaviour change and six focused on population health. Many schemes (n = 29) addressed practice considerations. Fewer schemes addressed educational or policy objectives. Twenty-five classification schemes had broad applicability, six were specific, and four had elements of both. Twenty-three schemes targeted health providers, nine targeted both patients and providers, and one targeted policy-makers. Most classification schemes were intended for implementation rather than dissemination. Thirty-five classification schemes of KT interventions were developed and reported with sufficient rigour to be recommended for use by researchers interested in KT in healthcare. Our additional categorization and quality analysis will aid in selecting suitable classification schemes for research initiatives in the field of implementation science.

  6. Adaptive critic neural network-based object grasping control using a three-finger gripper.

    PubMed

    Jagannathan, S; Galan, Gustavo

    2004-03-01

    Grasping of objects has been a challenging task for robots. The complex grasping task can be defined as object contact control and manipulation subtasks. In this paper, object contact control subtask is defined as the ability to follow a trajectory accurately by the fingers of a gripper. The object manipulation subtask is defined in terms of maintaining a predefined applied force by the fingers on the object. A sophisticated controller is necessary since the process of grasping an object without a priori knowledge of the object's size, texture, softness, gripper, and contact dynamics is rather difficult. Moreover, the object has to be secured accurately and considerably fast without damaging it. Since the gripper, contact dynamics, and the object properties are not typically known beforehand, an adaptive critic neural network (NN)-based hybrid position/force control scheme is introduced. The feedforward action generating NN in the adaptive critic NN controller compensates the nonlinear gripper and contact dynamics. The learning of the action generating NN is performed on-line based on a critic NN output signal. The controller ensures that a three-finger gripper tracks a desired trajectory while applying desired forces on the object for manipulation. Novel NN weight tuning updates are derived for the action generating and critic NNs so that Lyapunov-based stability analysis can be shown. Simulation results demonstrate that the proposed scheme successfully allows fingers of a gripper to secure objects without the knowledge of the underlying gripper and contact dynamics of the object compared to conventional schemes.

  7. Assessing the Predictability of Convection using Ensemble Data Assimilation of Simulated Radar Observations in an LETKF system

    NASA Astrophysics Data System (ADS)

    Lange, Heiner; Craig, George

    2014-05-01

    This study uses the Local Ensemble Transform Kalman Filter (LETKF) to perform storm-scale data assimilation of simulated Doppler radar observations into the non-hydrostatic, convection-permitting COSMO model. In perfect-model experiments (OSSEs), it is investigated how the limited predictability of convective storms affects precipitation forecasts. The study compares a fine analysis scheme with small RMS errors to a coarse scheme that allows for errors in position, shape, and occurrence of storms in the ensemble. The coarse scheme uses superobservations, a coarser grid for analysis weights, a larger localization radius, and a larger observation error, which together broaden the Gaussian error statistics. Three-hour forecasts of convective systems (with typical lifetimes exceeding 6 hours) from the detailed analyses of the fine scheme prove superior to those of the coarse scheme during the first 1-2 hours with respect to the predicted storm positions. After 3 hours in the convective regime used here, the forecast quality of the two schemes appears indiscernible, judging by RMSE and by verification methods for rain fields and objects. It is concluded that, for operational assimilation systems, the analysis scheme might not need to resolve detail down to the grid scale of the model. Depending on the forecast lead time, and on the presence of orographic or synoptic forcing that enhances the predictability of storm occurrences, analyses from a coarser scheme might suffice.

  8. Experiments with a three-dimensional statistical objective analysis scheme using FGGE data

    NASA Technical Reports Server (NTRS)

    Baker, Wayman E.; Bloom, Stephen C.; Woollen, John S.; Nestler, Mark S.; Brin, Eugenia

    1987-01-01

    A three-dimensional (3D), multivariate, statistical objective analysis scheme (referred to as optimum interpolation or OI) has been developed for use in numerical weather prediction studies with the FGGE data. Some novel aspects of the present scheme include: (1) a multivariate surface analysis over the oceans, which employs an Ekman balance instead of the usual geostrophic relationship, to model the pressure-wind error cross correlations, and (2) the capability to use an error correlation function which is geographically dependent. A series of 4-day data assimilation experiments are conducted to examine the importance of some of the key features of the OI in terms of their effects on forecast skill, as well as to compare the forecast skill using the OI with that utilizing a successive correction method (SCM) of analysis developed earlier. For the three cases examined, the forecast skill is found to be rather insensitive to varying the error correlation function geographically. However, significant differences are noted between forecasts from a two-dimensional (2D) version of the OI and those from the 3D OI, with the 3D OI forecasts exhibiting better forecast skill. The 3D OI forecasts are also more accurate than those from the SCM initial conditions. The 3D OI with the multivariate oceanic surface analysis was found to produce forecasts which were slightly more accurate, on the average, than a univariate version.
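As a toy illustration of the optimum interpolation step itself, consider a univariate analysis at a single grid point with a Gaussian background-error correlation model (the scheme in the paper is three-dimensional and multivariate; all inputs here are assumptions for the sketch). The analysis increment is the innovation weighted by W = B_og (B_oo + R)^-1:

```python
import numpy as np

def oi_analysis(xg, pos_g, xb_o, y_o, pos_o, sigma_b, sigma_o, L):
    """Univariate OI at one grid point (illustrative sketch).

    xg, pos_g : background value and position at the grid point
    xb_o, y_o : background and observed values at the observation sites
    pos_o     : observation positions (1-D for simplicity)
    sigma_b, sigma_o, L : background/obs error std. dev. and correlation length
    """
    r_oo = np.abs(pos_o[:, None] - pos_o[None, :])       # obs-obs distances
    r_og = np.abs(pos_o - pos_g)                         # obs-grid distances
    B_oo = sigma_b**2 * np.exp(-r_oo**2 / (2 * L**2))    # background error cov.
    B_og = sigma_b**2 * np.exp(-r_og**2 / (2 * L**2))
    R = sigma_o**2 * np.eye(len(y_o))                    # obs error cov.
    w = np.linalg.solve(B_oo + R, B_og)                  # OI weights
    return xg + w @ (y_o - xb_o)                         # analysed value
```

With a near-perfect observation collocated with the grid point, the analysis reduces to the observed value, as expected.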

  9. Juggling land retirement objectives on an agricultural landscape: coordination, conflict, or compromise?

    PubMed

    Marshall, Elizabeth P; Homans, Frances R

    2006-07-01

    Strategic land retirement in agricultural settings has been used as one way to achieve a combination of social objectives, which include ameliorating water quality problems and enhancing existing systems of wildlife habitat. This study uses a simulation model operating on a virtual landscape, along with the compromise programming method, to illustrate the implications of alternative weighting schemes for the long-term performance of the landscape toward various objectives. The analysis suggests that particular spatial patterns may be related to how various objectives are weighted. The analysis also illustrates the inevitable trade-offs among objectives, although it may be tempting to present retirement strategies as "win-win."

  10. Multiobjective hyper heuristic scheme for system design and optimization

    NASA Astrophysics Data System (ADS)

    Rafique, Amer Farhan

    2012-11-01

    As system design becomes more multifaceted, integrated, and complex, traditional single-objective approaches to optimal design are becoming less efficient and effective. Single-objective optimization methods yield a unique optimal solution, whereas multiobjective methods yield a Pareto front. The foremost intent is to predict a reasonably distributed Pareto-optimal solution set, independent of the problem instance, through a multiobjective scheme. A further objective is to improve the quality of the outputs of the complex engineering system design process at the conceptual design phase. The process is automated in order to give the system designer the leverage of studying and analyzing a large number of possible solutions in a short time. This article presents a Multiobjective Hyper Heuristic Optimization Scheme based on low-level meta-heuristics developed for application in engineering system design. Herein, we present a stochastic function to manage the low-level meta-heuristics so as to increase the likelihood of reaching a globally optimal solution. A Genetic Algorithm, Simulated Annealing and Swarm Intelligence are used as low-level meta-heuristics in this study. The performance of the proposed scheme is investigated through a comprehensive empirical analysis, yielding acceptable results. One of the primary motives for performing multiobjective optimization is that current engineering systems require simultaneous optimization of multiple, conflicting objectives. Randomized decision making makes the implementation of this scheme attractive and easy. Injecting feasible solutions significantly alters the search direction and also adds population diversity, resulting in the accomplishment of the predefined goals set in the proposed scheme.

  11. Concealed object segmentation and three-dimensional localization with passive millimeter-wave imaging

    NASA Astrophysics Data System (ADS)

    Yeom, Seokwon

    2013-05-01

    Millimeter-wave imaging draws increasing attention in security applications for the detection of weapons concealed under clothing. In this paper, concealed object segmentation and three-dimensional localization schemes are reviewed. A concealed object is segmented by the k-means algorithm. A feature-based stereo-matching method estimates the longitudinal distance of the concealed object: the distance is estimated from the disparity between the corresponding centers of the segmented objects in the two views. Experimental results are provided along with an analysis of the depth resolution.
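A minimal sketch of the two ingredients, a plain k-means on pixel intensities and the standard stereo depth relation Z = fB/d, with all numerical parameters illustrative rather than taken from the paper:

```python
import numpy as np

def kmeans_1d(vals, k=2, iters=20):
    """Plain k-means on pixel intensities (k=2: object vs. background).
    Assumes both clusters stay non-empty for this simple sketch."""
    cents = np.linspace(vals.min(), vals.max(), k)
    for _ in range(iters):
        labels = np.argmin(np.abs(vals[:, None] - cents[None, :]), axis=1)
        cents = np.array([vals[labels == j].mean() for j in range(k)])
    return labels

def depth_from_disparity(cx_left, cx_right, focal_px, baseline_m):
    """Longitudinal distance from the horizontal offset of the two
    segmented-object centers (standard stereo relation Z = f*B/d)."""
    d = cx_left - cx_right                 # disparity in pixels
    return focal_px * baseline_m / d
```

For example, with a 500 px focal length, a 0.1 m baseline, and segmented-object centers 10 px apart, the object lies at 5 m.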

  12. Depth data research of GIS based on clustering analysis algorithm

    NASA Astrophysics Data System (ADS)

    Xiong, Yan; Xu, Wenli

    2018-03-01

    GIS data have a spatial distribution: geographic data has both spatial and attribute characteristics, and it also changes with time, so the volume of data is very large. Nowadays many industries and government departments use GIS, but without a proper data analysis and mining scheme GIS cannot achieve its full effectiveness and much of the data is wasted. In this paper, we take the geographic information requirements of a national security department as the experimental object and, accounting for the temporal, spatial, and attribute characteristics of GIS data, apply a cluster analysis algorithm. We further develop a mining scheme for in-depth data and obtain an algorithm model. The algorithm automatically classifies sample data and then carries out exploratory analysis. The research shows that the algorithm model and the information mining scheme can quickly uncover hidden in-depth information from the surface data of GIS, thus improving the efficiency of the security department. The algorithm can also be extended to other fields.

  13. Hologram representation of design data in an expert system knowledge base

    NASA Technical Reports Server (NTRS)

    Shiva, S. G.; Klon, Peter F.

    1988-01-01

    A novel representational scheme for design object descriptions is presented. An abstract notion of modules and signals is developed as a conceptual foundation for the scheme. This abstraction relates the objects to the meaning of system descriptions. Anchored on this abstraction, a representational model which incorporates dynamic semantics for these objects is presented. This representational model is called a hologram scheme since it represents dual level information, namely, structural and semantic. The benefits of this scheme are presented.

  14. Improved numerical methods for turbulent viscous recirculating flows

    NASA Technical Reports Server (NTRS)

    Turan, A.

    1985-01-01

    The hybrid-upwind finite difference schemes employed in generally available combustor codes possess excessive numerical diffusion errors which preclude accurate quantitative calculations. The primary objective of the present study is the identification and assessment of an improved solution algorithm, as well as discretization schemes, applicable to the analysis of turbulent viscous recirculating flows. The assessment is carried out primarily in two-dimensional/axisymmetric geometries with a view to identifying an appropriate technique to be incorporated in a three-dimensional code.

  15. Power Allocation and Outage Probability Analysis for SDN-based Radio Access Networks

    NASA Astrophysics Data System (ADS)

    Zhao, Yongxu; Chen, Yueyun; Mai, Zhiyuan

    2018-01-01

    In this paper, the performance of an SDN (Software Defined Network)-based radio access network architecture is analyzed with respect to power allocation. A PSO-PA (Particle Swarm Optimization power allocation) scheme is proposed, subject to a constant total power constraint, with the objective of minimizing the system outage probability. The entire access network resource configuration is controlled by the SDN controller, which sends the optimized power distribution factor to the base station source node (SN) and the relay node (RN). Simulation results show that the proposed scheme reduces the system outage probability at low computational complexity.
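A minimal sketch of the optimization step: a 1-D particle swarm searches for the fraction of the fixed total power given to the source node. The outage objective below is a symmetric stand-in (the paper's actual objective is the outage probability of the SDN-controlled relay link), so all constants are illustrative:

```python
import numpy as np

def pso_minimize(f, lo, hi, n=30, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal 1-D particle swarm optimizer (illustrative)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, n)
    v = np.zeros(n)
    pbest, pval = x.copy(), f(x)            # personal bests
    g = pbest[pval.argmin()]                # global best
    for _ in range(iters):
        r1, r2 = rng.random(n), rng.random(n)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)          # stay within the power bounds
        val = f(x)
        better = val < pval
        pbest[better], pval[better] = x[better], val[better]
        g = pbest[pval.argmin()]
    return g

# Stand-in outage objective: fraction a of the total power goes to the
# source node, (1 - a) to the relay; either link starving drives outage up.
outage = lambda a: np.exp(-10 * a) + np.exp(-10 * (1 - a))
a_opt = pso_minimize(outage, 0.01, 0.99)
```

For this symmetric toy link the optimum is an even split, which the swarm recovers.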

  16. Assessment of numerical techniques for unsteady flow calculations

    NASA Technical Reports Server (NTRS)

    Hsieh, Kwang-Chung

    1989-01-01

    The characteristics of unsteady flow motions have long been a serious concern in the study of various fluid dynamic and combustion problems. With the advancement of computer resources, numerical approaches to these problems appear to be feasible. The objective of this paper is to assess the accuracy of several numerical schemes for unsteady flow calculations. In the present study, Fourier error analysis is performed for various numerical schemes based on a two-dimensional wave equation. Four methods selected from the error analysis are then adopted for further assessment. Model problems include unsteady quasi-one-dimensional inviscid flows, two-dimensional wave propagation, and unsteady two-dimensional inviscid flows. According to the comparison between numerical and exact solutions, although the second-order upwind scheme captures the unsteady flow and wave motions quite well, it is more dissipative than the sixth-order central difference scheme. Among the various numerical approaches tested in this paper, the best-performing combination is the Runge-Kutta method for time integration with sixth-order central differencing for spatial discretization.
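The Fourier (von Neumann) error analysis mentioned above compares a scheme's effective (modified) wavenumber with the exact one for the test mode exp(ikx); the gap measures the dispersion error. A sketch for the second-order and sixth-order central first-derivative stencils:

```python
import numpy as np

def modified_wavenumber(kh, scheme="central6"):
    """Modified wavenumber k'h of a central-difference first derivative.

    For the exact derivative k'h == kh; the deviation is the scheme's
    dispersion error at that resolution (kh = wavenumber * grid spacing).
    """
    if scheme == "central2":
        # f'_j ~ (f_{j+1} - f_{j-1}) / (2h)
        return np.sin(kh)
    # sixth-order central difference:
    # f'_j ~ (45(f_{j+1}-f_{j-1}) - 9(f_{j+2}-f_{j-2}) + (f_{j+3}-f_{j-3})) / (60h)
    return (45 * np.sin(kh) - 9 * np.sin(2 * kh) + np.sin(3 * kh)) / 30.0
```

At well-resolved wavenumbers the sixth-order stencil tracks the exact wavenumber far more closely than the second-order one, which is why it dissipates and disperses unsteady waves less.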

  17. Nondestructive pavement evaluation using ILLI-PAVE based artificial neural network models.

    DOT National Transportation Integrated Search

    2008-09-01

    The overall objective in this research project is to develop advanced pavement structural analysis models for more accurate solutions with fast computation schemes. Soft computing and modeling approaches, specifically the Artificial Neural Network (A...

  18. Numerical analysis and design optimization of supersonic after-burning with strut fuel injectors for scramjet engines

    NASA Astrophysics Data System (ADS)

    Candon, M. J.; Ogawa, H.

    2018-06-01

    Scramjets are a class of hypersonic airbreathing engine that offer promise for economical, reliable and high-speed access-to-space and atmospheric transport. The flow expanding through the scramjet nozzle contains unburned hydrogen. An after-burning scheme can be used to effectively utilize the remaining hydrogen by supplying additional oxygen into the nozzle, aiming to augment the thrust. This paper presents the results of a single-objective design optimization of a strut fuel injection scheme considering four design variables, with the objective of maximizing thrust augmentation. Thrust is found to be augmented significantly owing to a combination of aerodynamic and combustion effects. Further understanding and physical insights have been gained by performing variance-based global sensitivity analysis, scrutinizing the nozzle flowfields, analyzing the distributions and contributions of the forces acting on the nozzle wall, and examining the combustion efficiency.

  19. Applications of Bayesian Procrustes shape analysis to ensemble radar reflectivity nowcast verification

    NASA Astrophysics Data System (ADS)

    Fox, Neil I.; Micheas, Athanasios C.; Peng, Yuqiang

    2016-07-01

    This paper introduces the use of Bayesian full Procrustes shape analysis in object-oriented meteorological applications. In particular, the Procrustes methodology is used to generate mean forecast precipitation fields from a set of ensemble forecasts. This approach has advantages over other ensemble averaging techniques in that it can produce a forecast that retains the morphological features of the precipitation structures and present the range of forecast outcomes represented by the ensemble. The production of the ensemble mean avoids the problems of smoothing that result from simple pixel or cell averaging, while producing credible sets that retain information on ensemble spread. Also in this paper, the full Bayesian Procrustes scheme is used as an object verification tool for precipitation forecasts. This is an extension of a previously presented Procrustes shape analysis based verification approach into a full Bayesian format designed to handle the verification of precipitation forecasts that match objects from an ensemble of forecast fields to a single truth image. The methodology is tested on radar reflectivity nowcasts produced in the Warning Decision Support System - Integrated Information (WDSS-II) by varying parameters in the K-means cluster tracking scheme.

  20. The impact of catchment source group classification on the accuracy of sediment fingerprinting outputs.

    PubMed

    Pulley, Simon; Foster, Ian; Collins, Adrian L

    2017-06-01

    The objective classification of sediment source groups is at present an under-investigated aspect of source tracing studies, which has the potential to statistically improve discrimination between sediment sources and reduce uncertainty. This paper investigates this potential using three different source group classification schemes. The first classification scheme used simple surface and subsurface groupings (Scheme 1). The tracer signatures were then used in a two-step cluster analysis to identify the sediment source groupings naturally defined by the tracer signatures (Scheme 2). The cluster source groups were then modified by splitting each one into a surface and subsurface component to suit catchment management goals (Scheme 3). The schemes were tested using artificial mixtures of sediment source samples. Controlled corruptions were made to some of the mixtures to mimic the potential causes of tracer non-conservatism present when using tracers in natural fluvial environments. It was determined how accurately the known proportions of sediment sources in the mixtures were identified after unmixing modelling using the three classification schemes. The cluster analysis derived source groups (Scheme 2) significantly increased tracer variability ratios (inter-/intra-source group variability) (up to 2122%, median 194%) compared to the surface and subsurface groupings (Scheme 1). As a result, the composition of the artificial mixtures was identified an average of 9.8% more accurately on the 0-100% contribution scale. It was found that the cluster groups could be reclassified into a surface and subsurface component (Scheme 3) with no significant increase in composite uncertainty (a 0.1% increase over Scheme 2). The far smaller effects of simulated tracer non-conservatism for the cluster analysis based schemes (2 and 3) were primarily attributed to the increased inter-group variability producing a far larger sediment source signal than the non-conservatism noise (Scheme 1). Modified cluster analysis based classification methods have the potential to reduce composite uncertainty significantly in future source tracing studies. Copyright © 2016 Elsevier Ltd. All rights reserved.
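The unmixing step in such studies estimates source proportions from tracer signatures. A crude stand-in is a constrained least squares with a heavily weighted sum-to-one row, clipped and renormalized afterwards; real fingerprinting models typically optimize a weighted mixing model with Monte Carlo uncertainty analysis, so this is only a sketch of the idea:

```python
import numpy as np

def unmix(sources, mixture, lam=100.0):
    """Estimate proportions p (p >= 0, sum(p) = 1) with sources.T @ p ~= mixture.

    sources : (s, t) mean tracer signature of each of s source groups
    mixture : (t,)   tracer signature of the sediment mixture
    lam     : weight of the sum-to-one constraint row
    """
    s = sources.shape[0]
    A = np.vstack([sources.T, lam * np.ones((1, s))])  # append sum-to-one row
    b = np.append(mixture, lam)
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    p = np.clip(p, 0.0, None)                          # enforce non-negativity
    return p / p.sum()                                 # renormalize to sum to 1
```

Greater inter-group tracer variability (as produced by the cluster-based schemes) makes the system better conditioned, which is why the mixture proportions are recovered more accurately.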

  1. Small Engine Technology. Task 4: Advanced Small Turboshaft Compressor (ASTC) Performance and Range Investigation

    NASA Technical Reports Server (NTRS)

    Hansen, Jeff L.; Delaney, Robert A.

    1997-01-01

    This contract had two main objectives involving both numerical and experimental investigations of a small, highly loaded two-stage axial compressor designated the Advanced Small Turboshaft Compressor (ASTC), which had a design pressure ratio goal of 5:1 at a flowrate of 10.53 lbm/s. The first objective was to conduct 3-D Navier-Stokes multistage analyses of the ASTC using several different flow modelling schemes. The second main objective was to complete a numerical/experimental investigation into stall range enhancement of the ASTC. This compressor was designed under a cooperative Space Act Agreement, and all testing was completed at NASA Lewis Research Center. For the multistage analyses, four different flow model schemes were used, namely: (1) steady-state ADPAC analysis, (2) unsteady ADPAC analysis, (3) steady-state APNASA analysis, and (4) steady-state OCOM3D analysis. The results of all the predictions were compared to the experimental data. The steady-state ADPAC and APNASA codes predicted similar overall performance and produced good agreement with the data; however, the blade row performance and flowfield details were quite different. In general, it can be concluded that the APNASA average-passage code does a better job of predicting the performance and flowfield details of the highly loaded ASTC compressor.

  2. Dependency graph for code analysis on emerging architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shashkov, Mikhail Jurievich; Lipnikov, Konstantin

    A directed acyclic dependency graph (DAG) is becoming the standard for modern multi-physics codes. The ideal DAG is the true block scheme of a multi-physics code. It is therefore a convenient object for in situ analysis of the cost of computations and of algorithmic bottlenecks related to statistically frequent data motion and the dynamical machine state.

  3. OBJECTIVE METEOROLOGICAL CLASSIFICATION SCHEME DESIGNED TO ELUCIDATE OZONE'S DEPENDENCE ON METEOROLOGY

    EPA Science Inventory

    This paper utilizes a two-stage clustering approach as part of an objective classification scheme designed to elucidate O3's dependence on meteorology. When applied to ten years (1981-1990) of meteorological data for Birmingham, Alabama, the classification scheme identified seven ...

  4. A kernel-based novelty detection scheme for the ultra-fast detection of chirp evoked Auditory Brainstem Responses.

    PubMed

    Corona-Strauss, Farah I; Delb, Wolfgang; Schick, Bernhard; Strauss, Daniel J

    2010-01-01

    Auditory Brainstem Responses (ABRs) are used as an objective method for the diagnosis and quantification of hearing loss. Many methods for the automatic recognition of ABRs have been developed, but none of them includes the individual measurement setup in the analysis. The purpose of this work was to design a fast recognition scheme for chirp-evoked ABRs that is adjusted to the individual measurement condition using spontaneous electroencephalographic activity (SA). For the classification, the kernel-based novelty detection scheme used features based on the inter-sweep instantaneous phase synchronization as well as energy and entropy relations in the time-frequency domain. This method discriminated SA from stimulations above the hearing threshold with a minimum number of sweeps, i.e., 200 individual responses. It is concluded that the proposed paradigm, processing procedures and stimulation techniques improve the detection of ABRs in terms of the degree of objectivity, i.e., automation of the procedure, and measurement time.

  5. A staggered-grid finite-difference scheme optimized in the time–space domain for modeling scalar-wave propagation in geophysical problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tan, Sirui, E-mail: siruitan@hotmail.com; Huang, Lianjie, E-mail: ljh@lanl.gov

    For modeling scalar-wave propagation in geophysical problems using finite-difference schemes, optimizing the coefficients of the finite-difference operators can reduce numerical dispersion. Most optimized finite-difference schemes for modeling seismic-wave propagation suppress only spatial but not temporal dispersion errors. We develop a novel optimized finite-difference scheme for numerical scalar-wave modeling to control dispersion errors not only in space but also in time. Our optimized scheme is based on a new stencil that contains a few more grid points than the standard stencil. We design an objective function for minimizing relative errors of phase velocities of waves propagating in all directions within a given range of wavenumbers. Dispersion analysis and numerical examples demonstrate that our optimized finite-difference scheme is computationally up to 2.5 times faster than the optimized schemes using the standard stencil while achieving similar modeling accuracy for a given 2D or 3D problem. Compared with the high-order finite-difference scheme using the same new stencil, our optimized scheme reduces the computational cost by 50 percent for similar modeling accuracy. This new optimized finite-difference scheme is particularly useful for large-scale 3D scalar-wave modeling and inversion.

  6. An operational air quality objective analysis of surface pollutants

    NASA Astrophysics Data System (ADS)

    Menard, R.; Robichaud, A.

    2013-05-01

    As of December 2012, a surface analysis of O3 and PM2.5 at a resolution of 10 km over Canada and the USA has become an operational product of Environment Canada. Analyses based on an optimum interpolation scheme adapted to the variability of surface pollutants are run each hour. We briefly discuss the specifics of the scheme, the technical work that led to the operational implementation, and a description and validation of the product as it stands today. An analysis of NO2 and a map of an air quality health index are also under way. We are now developing a high-resolution analysis at 2.5 km over major cities in the Montreal-Toronto area and over the oil sands region. The effect of state-dependent error covariance modeling will be presented, with some early results of the high-resolution analysis/assimilation.

  7. Quantitative and Qualitative Analysis of Nutrition and Food Safety Information in School Science Textbooks of India

    ERIC Educational Resources Information Center

    Subba Rao, G. M.; Vijayapushapm, T.; Venkaiah, K.; Pavarala, V.

    2012-01-01

    Objective: To assess quantity and quality of nutrition and food safety information in science textbooks prescribed by the Central Board of Secondary Education (CBSE), India for grades I through X. Design: Content analysis. Methods: A coding scheme was developed for quantitative and qualitative analyses. Two investigators independently coded the…

  8. An Automated Scheme for the Large-Scale Survey of Herbig-Haro Objects

    NASA Astrophysics Data System (ADS)

    Deng, Licai; Yang, Ji; Zheng, Zhongyuan; Jiang, Zhaoji

    2001-04-01

    Owing to their spectral properties, Herbig-Haro (HH) objects can be discovered using photometric methods through a combination of filters, sampling the characteristic spectral lines and the nearby continuum. The data are commonly processed through direct visual inspection of the images. To make data reduction more efficient and the results more uniform and complete, an automated searching scheme for HH objects is developed to manipulate the images using IRAF. This approach helps to extract images with only intrinsic HH emissions. By using this scheme, the pointlike stellar sources and extended nebulous sources with continuum emission can be eliminated from the original images. The objects with only characteristic HH emission become prominent and can be easily picked up. In this paper our scheme is illustrated by a sample field and has been applied to our surveys for HH objects.
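The core of such a scheme is continuum subtraction: point-like stars and nebulae with continuum emission appear in both the narrowband and continuum frames and cancel, leaving only the intrinsic line emission of HH objects. A minimal sketch (in practice the scale factor comes from photometry of field stars, and the paper's IRAF pipeline is considerably more involved):

```python
import numpy as np

def line_only(narrowband, continuum, scale):
    """Subtract the scaled continuum frame from the narrowband frame.

    narrowband : 2-D image through a line filter (e.g. sampling HH emission)
    continuum  : 2-D image through a nearby continuum filter
    scale      : flux ratio of the two filters (from field-star photometry)
    """
    residual = narrowband - scale * continuum
    return np.clip(residual, 0.0, None)    # keep only positive line emission
```

After subtraction, only pixels with excess line flux survive, so candidate HH objects become prominent against an essentially empty frame.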

  9. A frequency-based window width optimized two-dimensional S-Transform profilometry

    NASA Astrophysics Data System (ADS)

    Zhong, Min; Chen, Feng; Xiao, Chao

    2017-11-01

    A new scheme, a frequency-based window-width-optimized two-dimensional S-transform profilometry, is proposed, in which parameters pu and pv are introduced to control the width of a two-dimensional Gaussian window. Unlike the standard two-dimensional S-transform, which uses a Gaussian window whose width is proportional to the reciprocal of the local frequency of the tested signal, the window width of the optimized two-dimensional S-transform varies with the pu-th (pv-th) power of the reciprocal local frequency fx (fy) in the x (y) direction. The paper gives a detailed theoretical analysis of the optimized two-dimensional S-transform in fringe analysis, as well as the characteristics of the modified Gaussian window. Simulations are used to evaluate the proposed scheme; the results show that the new scheme has better noise-reduction ability and can extract the phase distribution more precisely than the standard two-dimensional S-transform, even when the surface of the measured object varies sharply. Finally, the proposed scheme is demonstrated on three-dimensional surface reconstruction of a complex plastic cat mask to show its effectiveness.
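A 1-D analogue may clarify the window-width idea: the standard S-transform uses a Gaussian window of width 1/|f|, while the optimized variant scales the width as 1/|f|**p (the paper's pu and pv play the role of p along the two image axes). A sketch, with the window normalization chosen for simplicity rather than to match the paper:

```python
import numpy as np

def s_transform_point(sig, t0, f, p=1.0, fs=1.0):
    """One coefficient of a discrete S-transform-like analysis.

    The Gaussian window is centered at t0 and its width scales as
    1/|f|**p; p = 1 recovers the standard S-transform behaviour.
    """
    n = len(sig)
    t = np.arange(n) / fs
    sigma = 1.0 / abs(f)**p                        # frequency-dependent width
    w = np.exp(-(t - t0)**2 / (2 * sigma**2))
    w /= w.sum()                                   # normalize window weights
    return np.sum(sig * w * np.exp(-2j * np.pi * f * t))
```

A pure cosine analyzed at its own frequency yields a large coefficient, while a mismatched frequency yields a small one; tuning p trades time resolution against noise robustness, which is the lever the paper optimizes in 2-D.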

  10. Chain-Based Communication in Cylindrical Underwater Wireless Sensor Networks

    PubMed Central

    Javaid, Nadeem; Jafri, Mohsin Raza; Khan, Zahoor Ali; Alrajeh, Nabil; Imran, Muhammad; Vasilakos, Athanasios

    2015-01-01

    Appropriate network design is very significant for Underwater Wireless Sensor Networks (UWSNs). Application-oriented UWSNs are planned to achieve certain objectives, so there is always a demand for efficient data routing schemes that can fulfill the requirements of application-oriented UWSNs. These networks can be of any shape, i.e., rectangular, cylindrical or square. In this paper, we propose chain-based routing schemes for application-oriented cylindrical networks and also formulate mathematical models to find a global optimum path for data transmission. In the first scheme, we devise four interconnected chains of sensor nodes to perform data communication. In the second scheme, two chains of sensor nodes are interconnected, whereas in the third scheme, single-chain-based routing is done in cylindrical networks. After finding local optimum paths in separate chains, we find global optimum paths through their interconnection. Moreover, we develop a computational model for the analysis of end-to-end delay. We compare the performance of the three proposed schemes with that of Power Efficient Gathering System in Sensor Information Systems (PEGASIS) and Congestion-adjusted PEGASIS (C-PEGASIS). Simulation results show that our proposed 4-chain based scheme performs better than the other selected schemes in terms of network lifetime, end-to-end delay, path loss, transmission loss, and packet sending rate. PMID:25658394
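The end-to-end delay along one chain can be sketched as per-hop transmission time plus acoustic propagation time (sound travels at roughly 1500 m/s underwater); the hop lengths, packet size, and data rate below are illustrative, not taken from the paper:

```python
def chain_delay(hop_lengths_m, packet_bits, rate_bps, sound_mps=1500.0):
    """End-to-end delay along one chain of an underwater sensor network.

    Each hop costs one packet transmission (packet_bits / rate_bps)
    plus acoustic propagation over the hop length.
    """
    tx_time = packet_bits / rate_bps                 # transmission delay per hop
    return sum(tx_time + d / sound_mps for d in hop_lengths_m)
```

Because acoustic propagation dominates at these speeds, shorter chains with fewer long hops pay off, which is the intuition behind interconnecting several chains rather than routing along one long chain.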

  11. National Rural Employment Guarantee Scheme, poverty and prices in rural India.

    PubMed

    Gaiha, Raghav; Kulkarni, Vani S; Pandey, Manoj K; Imai, Katsushi S

    2010-01-01

    The objective of this analysis is mainly to construct an intuitive measure of the performance of the National Rural Employment Guarantee Scheme (NREGS) in India. The focus is on divergence between demand and supply at the district level. Some related issues addressed are: (i) whether the gap between demand and supply responds to poverty; and (ii) whether recent hikes in NREGS wages are inflationary. Our analysis confirms responsiveness of the positive gap between demand and supply to poverty. Also, apprehensions expressed about the inflationary potential of recent hikes in NREGS wages have been confirmed. More importantly, higher NREGS wages are likely to undermine self-selection of the poor in it.

  12. Inelastic and Dynamic Fracture and Stress Analyses

    NASA Technical Reports Server (NTRS)

    Atluri, S. N.

    1984-01-01

    Large deformation inelastic stress analysis and inelastic and dynamic crack propagation research work is summarized. The salient topics of interest in engine structure analysis that are discussed herein include: (1) a path-independent integral (T) in inelastic fracture mechanics, (2) analysis of dynamic crack propagation, (3) generalization of constitutive relations of inelasticity for finite deformations , (4) complementary energy approaches in inelastic analyses, and (5) objectivity of time integration schemes in inelastic stress analysis.

  13. Financial incentives to encourage healthy behaviour: an analysis of U.K. media coverage.

    PubMed

    Parke, Hannah; Ashcroft, Richard; Brown, Rebecca; Marteau, Theresa M; Seale, Clive

    2013-09-01

    Policies to use financial incentives to encourage healthy behaviour are controversial. Much of this controversy is played out in the mass media, both reflecting and shaping public opinion. To describe U.K. mass media coverage of incentive schemes, comparing schemes targeted at different client groups and assessing the relative prominence of the views of different interest groups. Thematic content analysis. National and local news coverage in newspapers, news media targeted at health-care providers and popular websites between January 2005 and February 2010. U.K. mass media. The study included 210 articles. Fifteen separate arguments favourable towards schemes, and 19 unfavourable, were identified. Overall, coverage was more favourable than unfavourable, although most articles reported a mix of views. Arguments about the prevalence and seriousness of the health problems targeted by incentive schemes were uncontested. Moral and ethical objections to such schemes were common, focused in particular on recipients such as drug users or the overweight who were already stereotyped as morally deficient, and these arguments were largely uncontested. Arguments about the effectiveness of schemes and their potential for benefit or harm were areas of greater contestation. Government, public health and other health-care provider interests dominated favourable coverage; opposition came from rival politicians, taxpayers' representatives, certain charities and from some journalists themselves. Those promoting incentive schemes for people who might be regarded as 'undeserving' should plan a media strategy that anticipates their public reception. © 2011 John Wiley & Sons Ltd.

  14. Financial incentives to encourage healthy behaviour: an analysis of UK media coverage

    PubMed Central

    Parke, Hannah; Ashcroft, Richard; Brown, Rebecca; Marteau, Theresa M; Seale, Clive

    2011-01-01

    Abstract Background  Policies to use financial incentives to encourage healthy behaviour are controversial. Much of this controversy is played out in the mass media, both reflecting and shaping public opinion. Objective  To describe UK mass media coverage of incentive schemes, comparing schemes targeted at different client groups and assessing the relative prominence of the views of different interest groups. Design  Thematic content analysis. Subjects  National and local news coverage in newspapers, news media targeted at health‐care providers and popular websites between January 2005 and February 2010. Setting  UK mass media. Results  The study included 210 articles. Fifteen separate arguments favourable towards schemes, and 19 unfavourable, were identified. Overall, coverage was more favourable than unfavourable, although most articles reported a mix of views. Arguments about the prevalence and seriousness of the health problems targeted by incentive schemes were uncontested. Moral and ethical objections to such schemes were common, focused in particular on recipients such as drug users or the overweight who were already stereotyped as morally deficient, and these arguments were largely uncontested. Arguments about the effectiveness of schemes and their potential for benefit or harm were areas of greater contestation. Government, public health and other health‐care provider interests dominated favourable coverage; opposition came from rival politicians, taxpayers’ representatives, certain charities and from some journalists themselves. Conclusions  Those promoting incentive schemes for people who might be regarded as ‘undeserving’ should plan a media strategy that anticipates their public reception. PMID:21771227

  15. Consumer Information. Final Report.

    ERIC Educational Resources Information Center

    CEMREL, Inc., St. Ann, MO.

    One of three projects reported by the Central Midwestern Regional Educational Laboratory included analysis of 178 existing consumer information products. Steps in the analytical scheme were preparation of an annotated bibliography and development of a plan for providing objective, comparative information on such products. These were found in the…

  16. A Parameter Estimation Scheme for Multiscale Kalman Smoother (MKS) Algorithm Used in Precipitation Data Fusion

    NASA Technical Reports Server (NTRS)

    Wang, Shugong; Liang, Xu

    2013-01-01

A new approach is presented in this paper to effectively obtain parameter estimates for the Multiscale Kalman Smoother (MKS) algorithm. This new approach demonstrates promising potential for deriving better data products from data of different spatial scales and precisions. Our new approach employs a multi-objective (MO) parameter estimation scheme (called the MO scheme hereafter), rather than the conventional maximum likelihood scheme (called the ML scheme), to estimate the MKS parameters. Unlike the ML scheme, the MO scheme is not built on strict statistical assumptions about prediction errors and observation errors; rather, it directly associates the fused data of multiple scales with multiple objective functions in searching for the best parameter estimates for MKS through optimization. In the MO scheme, objective functions are defined to enforce consistency between the fused data at multiple scales and the input data at their original scales in terms of spatial patterns and magnitudes. The new approach is evaluated through a Monte Carlo experiment and a series of comparison analyses using synthetic precipitation data. Our results show that the MKS-fused precipitation performs better under the MO scheme than under the ML scheme. In particular, improvements over the ML scheme are significant for the fused precipitation at fine spatial resolutions. This is mainly because the MO scheme involves more criteria and constraints than the ML scheme. The weakness of the original ML scheme, which blindly puts more weight on the data associated with finer resolutions, is thus overcome in our new approach.
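As a hedged illustration of the MO idea, the sketch below scores candidate parameters against two criteria of the kind the abstract describes (a magnitude error and a spatial-pattern error) and keeps the non-dominated candidates. The `objectives` criteria and `pareto_front` helper are invented for illustration, not the paper's actual MKS objective functions.

```python
import numpy as np

def objectives(params, model, observed):
    """Two illustrative criteria for a fused field: magnitude error (RMSE)
    and spatial-pattern error (1 - correlation). Both are minimized."""
    fused = model(params)
    resid = fused - observed
    mag_err = float(np.sqrt(np.mean(resid ** 2)))
    pat_err = float(1.0 - np.corrcoef(fused.ravel(), observed.ravel())[0, 1])
    return np.array([mag_err, pat_err])

def pareto_front(candidates, scores):
    """Keep every candidate not dominated by another candidate
    (<= in all objectives and < in at least one)."""
    front = []
    for i, s in enumerate(scores):
        dominated = any(np.all(t <= s) and np.any(t < s)
                        for j, t in enumerate(scores) if j != i)
        if not dominated:
            front.append(candidates[i])
    return front
```

In a real MKS setting the candidate parameters would be searched by an optimizer rather than enumerated, but the dominance test is the same.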

  17. Decreased Risk of Preeclampsia After the Introduction of Universal Voucher Scheme for Antenatal Care and Birth Services in the Republic of Korea.

    PubMed

    Choe, Seung-Ah; Min, Hye Sook; Cho, Sung-Il

    2017-01-01

    Objectives A number of interventions to reduce disparities in maternal health have been introduced and implemented without concrete evidence to support them. In Korea, a universal voucher scheme for antenatal care and birth services was initiated in December 2008 to improve Korea's fertility rate. This study explores the risk of preeclampsia after the introduction of a universal voucher scheme. Methods Population-based cohort data from the National Health Insurance Service-National Sample Cohort (NHIS-NSC) covering 2002-2013 were analysed. A generalized linear mixed model (GLMM) was used to estimate the relationship between the risk of preeclampsia and voucher scheme introduction. Results The annual age-adjusted incidence of preeclampsia showed no significant unidirectional change during the study period. In the GLMM analysis, the introduction of a voucher scheme was associated with a reduced risk of preeclampsia, controlling for potential confounding factors. The interaction between household income level and voucher scheme was not significant. Conclusions for Practice This finding suggests that the introduction of a voucher scheme for mothers is related to a reduced risk of preeclampsia even under universal health coverage.

  18. The Selection of Computed Tomography Scanning Schemes for Lengthy Symmetric Objects

    NASA Astrophysics Data System (ADS)

    Trinh, V. B.; Zhong, Y.; Osipov, S. P.

    2017-04-01

The article describes the basic computed tomography scanning schemes for lengthy symmetric objects: continuous (discrete) rotation with discrete linear movement; continuous (discrete) rotation with discrete linear movement to acquire a 2D projection; continuous (discrete) linear movement with discrete rotation to acquire a one-dimensional projection; and continuous (discrete) rotation to acquire a 2D projection. A general method for calculating the scanning time is discussed in detail, from which a comparison principle for selecting a scanning scheme is extracted. The comparison is possible because the input data are the same for all scanning schemes: the maximum energy of the X-ray radiation; the power of the X-ray source; the angle of the X-ray cone beam; the transverse dimension of a single detector; the specified resolution; and the maximum time needed to form one point of the original image (which determines the number of registered photons). The proposed method is demonstrated by comparing the scanning schemes for a cylindrical object with a mass thickness of 4 g/cm², an effective atomic number of 15 and a length of 1300 mm. The scanning times are analyzed, the efficiency of the schemes is assessed, and the most productive scheme is selected.

  19. View subspaces for indexing and retrieval of 3D models

    NASA Astrophysics Data System (ADS)

    Dutagaci, Helin; Godil, Afzal; Sankur, Bülent; Yemez, Yücel

    2010-02-01

View-based indexing schemes for 3D object retrieval are gaining popularity since they provide good retrieval results. These schemes are consistent with the theory that humans recognize objects based on their 2D appearances. View-based techniques also allow users to search with various queries such as binary images, range images and even 2D sketches. Previous view-based techniques use classical 2D shape descriptors such as Fourier invariants, Zernike moments, Scale Invariant Feature Transform-based local features and 2D Discrete Fourier Transform coefficients. These methods describe each object independently of the others. In this work, we explore data-driven subspace models, such as Principal Component Analysis, Independent Component Analysis and Nonnegative Matrix Factorization, to describe the shape information of the views. We treat the depth images obtained from various points of the view sphere as 2D intensity images and train a subspace to extract the inherent structure of the views within a database. We also show the benefit of categorizing shapes according to their eigenvalue spread. Both the shape-categorization and data-driven feature-set conjectures are tested on the PSB database and compared with competing view-based 3D shape retrieval algorithms.
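A minimal sketch of the data-driven subspace idea, assuming PCA via SVD on flattened depth views; the function names are illustrative, not from the paper.

```python
import numpy as np

def train_view_subspace(views, k):
    """views: (n_views, h*w) matrix of flattened depth images.
    Returns the mean view and the top-k principal directions."""
    mean = views.mean(axis=0)
    # SVD of the centered data; rows of vt are orthonormal principal axes
    _, _, vt = np.linalg.svd(views - mean, full_matrices=False)
    return mean, vt[:k]

def view_descriptor(view, mean, basis):
    """k-dimensional descriptor: projection of a flattened view onto the subspace."""
    return basis @ (view - mean)
```

Retrieval would then compare query and database descriptors (e.g. by Euclidean distance) in the trained k-dimensional space.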

  20. A Novel Certificateless Signature Scheme for Smart Objects in the Internet-of-Things.

    PubMed

    Yeh, Kuo-Hui; Su, Chunhua; Choo, Kim-Kwang Raymond; Chiu, Wayne

    2017-05-01

Rapid advances in wireless communications and pervasive computing technologies have resulted in increasing interest in and popularity of the Internet-of-Things (IoT) architecture, ubiquitously providing intelligence and convenience in our daily life. In IoT-based network environments, smart objects are embedded everywhere as ubiquitous things connected in a pervasive manner. Ensuring the security of interactions between these smart things is thus significantly important, and a topic of ongoing interest. In this paper, we present a certificateless signature scheme for smart objects in IoT-based pervasive computing environments. We evaluate the utility of the proposed scheme on IoT-oriented testbeds, i.e., the Arduino Uno and Raspberry Pi 2. Experimental results demonstrate the practicality of the proposed scheme. Moreover, we revisit the scheme of Wang et al. (2015) and reveal that in that scheme a malicious super type I adversary can easily forge a legitimate signature to cheat any receiver as he/she wishes. The superiority of the proposed certificateless signature scheme over related studies is demonstrated through the summarized security and performance comparisons.

  1. A Novel Certificateless Signature Scheme for Smart Objects in the Internet-of-Things

    PubMed Central

    Yeh, Kuo-Hui; Su, Chunhua; Choo, Kim-Kwang Raymond; Chiu, Wayne

    2017-01-01

Rapid advances in wireless communications and pervasive computing technologies have resulted in increasing interest in and popularity of the Internet-of-Things (IoT) architecture, ubiquitously providing intelligence and convenience in our daily life. In IoT-based network environments, smart objects are embedded everywhere as ubiquitous things connected in a pervasive manner. Ensuring the security of interactions between these smart things is thus significantly important, and a topic of ongoing interest. In this paper, we present a certificateless signature scheme for smart objects in IoT-based pervasive computing environments. We evaluate the utility of the proposed scheme on IoT-oriented testbeds, i.e., the Arduino Uno and Raspberry Pi 2. Experimental results demonstrate the practicality of the proposed scheme. Moreover, we revisit the scheme of Wang et al. (2015) and reveal that in that scheme a malicious super type I adversary can easily forge a legitimate signature to cheat any receiver as he/she wishes. The superiority of the proposed certificateless signature scheme over related studies is demonstrated through the summarized security and performance comparisons. PMID:28468313

  2. Emergency material allocation and scheduling for the application to chemical contingency spills under multiple scenarios.

    PubMed

    Liu, Jie; Guo, Liang; Jiang, Jiping; Jiang, Dexun; Wang, Peng

    2017-01-01

In emergency management of chemical contingency spills, the efficiency of emergency rescue is strongly influenced by a reasonable assignment of the available emergency materials to the related risk sources. In this study, an emergency material scheduling model (EMSM) with time-effective and cost-effective objectives is developed to coordinate both the allocation and the scheduling of the emergency materials. Meanwhile, an improved genetic algorithm (IGA), which includes a revision operation for the EMSM, is proposed to identify the emergency material scheduling schemes. Then, scenario analysis is used to evaluate the optimal emergency rescue scheme under different emergency pollution conditions associated with different threat degrees, based on the analytic hierarchy process (AHP) method. The whole framework is then applied to a computational experiment based on the South-to-North Water Transfer Project in China. The results demonstrate that the developed method could not only guarantee the implementation of an emergency rescue satisfying the requirements of chemical contingency spills but also help decision makers identify appropriate emergency material scheduling schemes that balance the time-effective and cost-effective objectives.

  3. Coded aperture ptychography: uniqueness and reconstruction

    NASA Astrophysics Data System (ADS)

    Chen, Pengwen; Fannjiang, Albert

    2018-02-01

Uniqueness of solution is proved for any ptychographic scheme with a random mask under a minimum overlap condition, and a local geometric convergence analysis is given for the alternating projection (AP) and Douglas-Rachford (DR) algorithms. DR is shown to possess a unique fixed point in the object domain, and for AP a simple criterion for distinguishing the true solution among possibly many fixed points is given. A minimalist scheme, where adjacent masks overlap 50% of the area and each pixel of the object is illuminated by exactly four illuminations, is conveniently parametrized by the number q of shifted masks in each direction. The lower bound 1 - C/q^2 is proved for the geometric convergence rate of the minimalist scheme, predicting a poor performance at large q that is confirmed by numerical experiments. The twin-image ambiguity is shown to arise for certain Fresnel masks and to degrade the performance of reconstruction. Extensive numerical experiments are performed to explore the general features of a well-performing mask, the optimal value of q and the robustness with respect to measurement noise.
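The magnitude-projection step at the heart of AP can be sketched in a simplified single-measurement (Gerchberg-Saxton-style) setting; this is a stand-in for the idea only, not the paper's full ptychographic iteration with shifted masks.

```python
import numpy as np

def ap_iterate(mags, x0, n_iter=100):
    """Alternating projection for phase retrieval: alternate between the
    measured Fourier-magnitude constraint and the object-domain estimate."""
    x = x0.astype(complex)
    for _ in range(n_iter):
        z = np.fft.fft(x)
        z = mags * np.exp(1j * np.angle(z))  # project onto measured magnitudes
        x = np.fft.ifft(z)                   # back to the object domain
    return x
```

As the abstract notes for the real scheme, AP may converge to a spurious fixed point; the iterate is only guaranteed to satisfy the magnitude constraint, which is what distinguishing criteria must then test.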

  4. Processing uncertain RFID data in traceability supply chains.

    PubMed

    Xie, Dong; Xiao, Jie; Guo, Guangjun; Jiang, Tong

    2014-01-01

Radio Frequency Identification (RFID) is widely used to track and trace objects in traceability supply chains. However, the massive volumes of uncertain data produced by RFID readers cannot be used effectively or efficiently in RFID application systems. Following an analysis of the key features of RFID objects, this paper proposes a new framework for effectively and efficiently processing uncertain RFID data and supporting a variety of queries for tracking and tracing RFID objects. We adjust different smoothing windows according to different rates of uncertain data, employ different strategies to process uncertain readings, and distinguish ghost, missing, and incomplete data according to their apparent positions. We propose a comprehensive data model which is suitable for different application scenarios. In addition, a path coding scheme is proposed to significantly compress massive data by aggregating the path sequence, the position, and the time intervals. The scheme is suitable for cyclic or long paths. Moreover, we further propose a processing algorithm for group and independent objects. Experimental evaluations show that our approach is effective and efficient in terms of compression and traceability queries.
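One plausible reading of such path coding is run-length aggregation: consecutive reads of an object at the same position collapse into a single (position, enter, leave) segment. The encoding below is an illustrative sketch under that assumption, not the paper's exact scheme.

```python
def encode_path(readings):
    """readings: list of (position, timestamp) in time order.
    Collapse consecutive reads at the same position into
    (position, t_enter, t_leave) stay segments."""
    segments = []
    for pos, t in readings:
        if segments and segments[-1][0] == pos:
            # still at the same position: extend the current stay interval
            segments[-1] = (pos, segments[-1][1], t)
        else:
            segments.append((pos, t, t))
    return segments
```

Because only consecutive duplicates are merged, revisited positions on cyclic paths are kept as separate segments, matching the abstract's claim that the scheme suits cyclic or long paths.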

  5. Processing Uncertain RFID Data in Traceability Supply Chains

    PubMed Central

    Xie, Dong; Xiao, Jie

    2014-01-01

Radio Frequency Identification (RFID) is widely used to track and trace objects in traceability supply chains. However, the massive volumes of uncertain data produced by RFID readers cannot be used effectively or efficiently in RFID application systems. Following an analysis of the key features of RFID objects, this paper proposes a new framework for effectively and efficiently processing uncertain RFID data and supporting a variety of queries for tracking and tracing RFID objects. We adjust different smoothing windows according to different rates of uncertain data, employ different strategies to process uncertain readings, and distinguish ghost, missing, and incomplete data according to their apparent positions. We propose a comprehensive data model which is suitable for different application scenarios. In addition, a path coding scheme is proposed to significantly compress massive data by aggregating the path sequence, the position, and the time intervals. The scheme is suitable for cyclic or long paths. Moreover, we further propose a processing algorithm for group and independent objects. Experimental evaluations show that our approach is effective and efficient in terms of compression and traceability queries. PMID:24737978

  6. Impact of different parameterization schemes on simulation of mesoscale convective system over south-east India

    NASA Astrophysics Data System (ADS)

    Madhulatha, A.; Rajeevan, M.

    2018-02-01

The main objective of the present paper is to examine the role of various parameterization schemes in simulating the evolution of a mesoscale convective system (MCS) that occurred over south-east India. Using the Weather Research and Forecasting (WRF) model, numerical experiments are conducted with various planetary boundary layer, microphysics, and cumulus parameterization schemes. The performance of the different schemes is evaluated by examining the boundary layer, reflectivity, and precipitation features of the MCS using ground-based and satellite observations. Among the various physical parameterization schemes, the Mellor-Yamada-Janjic (MYJ) boundary layer scheme is able to produce a deep boundary layer by simulating the warm temperatures necessary for storm initiation; the Thompson (THM) microphysics scheme is able to simulate the reflectivity through a reasonable distribution of the different hydrometeors during the various stages of the system; and the Betts-Miller-Janjic (BMJ) cumulus scheme is able to capture the precipitation through a proper representation of the convective instability associated with the MCS. The present analysis suggests that MYJ, a local turbulent-kinetic-energy boundary layer scheme that accounts for strong vertical mixing; THM, a six-class hybrid-moment microphysics scheme that considers number concentration along with the mixing ratio of rain hydrometeors; and BMJ, a closure cumulus scheme that adjusts thermodynamic profiles based on climatological profiles, contributed to the better performance of the respective model simulations. A numerical simulation using this combination of schemes captures the storm initiation, propagation, surface variations, thermodynamic structure, and precipitation features reasonably well. This study clearly demonstrates that the simulation of MCS characteristics is highly sensitive to the choice of parameterization schemes.

  7. Impact of a variational objective analysis scheme on a regional area numerical model: The Italian Air Force Weather Service experience

    NASA Astrophysics Data System (ADS)

    Bonavita, M.; Torrisi, L.

    2005-03-01

    A new data assimilation system has been designed and implemented at the National Center for Aeronautic Meteorology and Climatology of the Italian Air Force (CNMCA) in order to improve its operational numerical weather prediction capabilities and provide more accurate guidance to operational forecasters. The system, which is undergoing testing before operational use, is based on an “observation space” version of the 3D-VAR method for the objective analysis component, and on the High Resolution Regional Model (HRM) of the Deutscher Wetterdienst (DWD) for the prognostic component. Notable features of the system include a completely parallel (MPI+OMP) implementation of the solution of analysis equations by a preconditioned conjugate gradient descent method; correlation functions in spherical geometry with thermal wind constraint between mass and wind field; derivation of the objective analysis parameters from a statistical analysis of the innovation increments.
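An observation-space ("PSAS-style") 3D-VAR of this kind solves the analysis equation in the space of the observations: xa = xb + B Hᵀ w, where w solves (H B Hᵀ + R) w = y − H xb. The sketch below uses small dense matrices and an unpreconditioned conjugate-gradient loop as a stand-in for the system's preconditioned, parallel solver; all dimensions and matrices are illustrative.

```python
import numpy as np

def oi_analysis(xb, y, H, B, R, tol=1e-10):
    """Observation-space analysis  xa = xb + B H^T w,  where w solves
    (H B H^T + R) w = y - H xb  by plain conjugate gradients."""
    A = H @ B @ H.T + R          # innovation covariance (symmetric positive definite)
    d = y - H @ xb               # innovation (observation-minus-background) vector
    w = np.zeros_like(d)
    r = d.copy()                 # CG residual, starting from w = 0
    p = r.copy()
    rs = r @ r
    for _ in range(len(d)):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        w += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return xb + B @ H.T @ w
```

Working in observation space keeps the linear system at the (usually much smaller) number of observations rather than the model dimension, which is why a conjugate-gradient descent method is a natural fit.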

  8. Computer Facilitated Mathematical Methods in Chemical Engineering--Similarity Solution

    ERIC Educational Resources Information Center

    Subramanian, Venkat R.

    2006-01-01

    High-performance computers coupled with highly efficient numerical schemes and user-friendly software packages have helped instructors to teach numerical solutions and analysis of various nonlinear models more efficiently in the classroom. One of the main objectives of a model is to provide insight about the system of interest. Analytical…

  9. A comparative study of advanced shock-capturing schemes applied to Burgers' equation

    NASA Technical Reports Server (NTRS)

    Yang, H. Q.; Przekwas, A. J.

    1990-01-01

    Several variations of the TVD scheme, ENO scheme, FCT scheme, and geometrical schemes, such as MUSCL and PPM, are considered. A comparative study of these schemes as applied to the Burgers' equation is presented. The objective is to assess their performance for problems involving formation and propagation of shocks, shock collisions, and expansion of discontinuities.
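For reference, the first-order Godunov upwind scheme, the baseline that TVD, ENO, FCT and MUSCL/PPM schemes all improve upon, can be written for Burgers' equation as a sketch (grid, time step and boundary treatment here are illustrative):

```python
import numpy as np

def godunov_flux(ul, ur):
    """Exact Godunov numerical flux for Burgers' equation, f(u) = u^2/2."""
    f = lambda u: 0.5 * u * u
    if ul <= ur:                 # rarefaction fan: minimize f over [ul, ur]
        if ul > 0.0:
            return f(ul)
        if ur < 0.0:
            return f(ur)
        return 0.0               # sonic point u = 0 lies inside the fan
    return max(f(ul), f(ur))     # shock: maximize f over [ur, ul]

def burgers_step(u, dt, dx):
    """One conservative update u_i <- u_i - dt/dx (F_{i+1/2} - F_{i-1/2}),
    with periodic boundaries."""
    ur = np.roll(u, -1)
    F = np.array([godunov_flux(u[i], ur[i]) for i in range(u.size)])  # F[i] = F_{i+1/2}
    return u - dt / dx * (F - np.roll(F, 1))
```

The higher-order schemes compared in the paper replace the piecewise-constant states fed to the flux with limited reconstructions (TVD/MUSCL) or adaptive stencils (ENO) to sharpen shocks without oscillations.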

  10. An optimal implicit staggered-grid finite-difference scheme based on the modified Taylor-series expansion with minimax approximation method for elastic modeling

    NASA Astrophysics Data System (ADS)

    Yang, Lei; Yan, Hongyong; Liu, Hong

    2017-03-01

The implicit staggered-grid finite-difference (ISFD) scheme is competitive for its high accuracy and stability, but its coefficients are conventionally determined by the Taylor-series expansion (TE) method, leading to a loss in numerical precision. In this paper, we modify the TE method using minimax approximation (MA) and propose a new optimal ISFD scheme based on the modified TE (MTE) with the MA method. The new ISFD scheme takes advantage of the TE method, which guarantees high accuracy at small wavenumbers, while retaining the property of the MA method that numerical errors stay within a limited bound. It thus yields high accuracy for the numerical solution of the wave equations. We derive the optimal ISFD coefficients by applying the new method to the construction of the objective function and using a Remez algorithm to minimize its maximum. A numerical analysis in comparison with the conventional TE-based ISFD scheme indicates that the MTE-based ISFD scheme with appropriate parameters can widen the wavenumber range of high accuracy and achieve greater precision than the conventional ISFD scheme. The numerical modeling results also demonstrate that the MTE-based ISFD scheme performs well in elastic wave simulation and is more efficient than the conventional ISFD scheme for elastic modeling.
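The Taylor-vs-optimized trade-off can be sketched for the simpler explicit staggered-grid first derivative, whose spectral response is 2 Σₘ cₘ sin((2m−1)β/2) ≈ β for β = k·dx. The least-squares fit over a wavenumber band below is a simple stand-in for the paper's minimax (Remez-based) optimization, shown only to illustrate why band-optimized coefficients beat Taylor coefficients at larger wavenumbers.

```python
import numpy as np

def taylor_staggered_coeffs(M):
    """Classical Taylor-series coefficients c_m of a 2M-point staggered first
    derivative: solve sum_m c_m (2m-1)^(2j-1) = delta_{j,1} for j = 1..M."""
    odd = np.arange(1.0, 2.0 * M, 2.0)
    V = np.stack([odd ** (2 * j - 1) for j in range(1, M + 1)])
    rhs = np.zeros(M)
    rhs[0] = 1.0
    return np.linalg.solve(V, rhs)

def lsq_staggered_coeffs(M, beta_max=2.5, n=400):
    """Band-optimized coefficients: fit 2 sum_m c_m sin((2m-1) beta / 2) ~ beta
    over beta in (0, beta_max] by least squares."""
    beta = np.linspace(1e-3, beta_max, n)
    A = np.stack([2.0 * np.sin((2 * m - 1) * beta / 2.0)
                  for m in range(1, M + 1)], axis=1)
    c, *_ = np.linalg.lstsq(A, beta, rcond=None)
    return c
```

For M = 2 the Taylor route recovers the classical fourth-order coefficients (9/8, −1/24), while the fitted coefficients trade a little small-β accuracy for a much smaller maximum dispersion error over the whole band, which is the spirit of the MTE/MA construction.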

  11. Integration of object-oriented knowledge representation with the CLIPS rule based system

    NASA Technical Reports Server (NTRS)

    Logie, David S.; Kamil, Hasan

    1990-01-01

The paper describes a portion of the work aimed at developing an integrated, knowledge-based environment for the development of engineering-oriented applications. An Object Representation Language (ORL) was implemented in C++ and is used to build and modify an object-oriented knowledge base. The ORL was designed so as to be easily integrated with other representation schemes that could effectively reason with the object base. Specifically, the integration of the ORL with the rule-based system CLIPS (C Language Integrated Production System), developed at the NASA Johnson Space Center, is discussed. The object-oriented knowledge representation provides a natural means of representing problem data as a collection of related objects. Objects are composed of descriptive properties and interrelationships. The object-oriented model promotes efficient handling of the problem data by allowing knowledge to be encapsulated in objects. Data is inherited through an object network via the relationship links. Together, the two schemes complement each other: the object-oriented approach efficiently handles problem data, while the rule-based knowledge is used to simulate the reasoning process. Alone, the object-based knowledge is little more than an object-oriented data storage scheme; the CLIPS inference engine, however, adds the mechanism to directly and automatically reason with that knowledge. In this hybrid scheme, the expert system dynamically queries for data and can modify the object base with complete access to all the functionality of the ORL from rules.

  12. 3D early embryogenesis image filtering by nonlinear partial differential equations.

    PubMed

    Krivá, Z; Mikula, K; Peyriéras, N; Rizzi, B; Sarti, A; Stasová, O

    2010-08-01

We present nonlinear diffusion equations, numerical schemes to solve them and their application to filtering 3D images obtained from laser scanning microscopy (LSM) of living zebrafish embryos, with the goal of identifying the optimal filtering method and its parameters. In large-scale applications dealing with the analysis of 3D+time embryogenesis images, an important objective is the correct detection of the number and position of cell nuclei yielding the spatio-temporal cell lineage tree of embryogenesis. Filtering is the first and necessary step of the image analysis chain and must lead to correct results, removing the noise, sharpening the nuclei edges and correcting the acquisition errors related to spuriously connected subregions. In this paper we study such properties for the regularized Perona-Malik model and for the generalized mean curvature flow equations in the level-set formulation. A comparison with other nonlinear diffusion filters, such as tensor anisotropic diffusion and Beltrami flow, is also included. All numerical schemes are based on the same discretization principles, i.e. the finite volume method in space and a semi-implicit scheme in time, for solving nonlinear partial differential equations. These numerical schemes are unconditionally stable, fast and naturally parallelizable. The filtering results are evaluated and compared first using the mean Hausdorff distance between a gold standard and different isosurfaces of the original and filtered data. Then, the number of isosurface connected components in a region of interest (ROI) detected in the original data and after the filtering is compared with the corresponding correct number of nuclei in the gold standard. Such analysis proves the robustness and reliability of edge-preserving nonlinear diffusion filtering for this type of data and leads to the optimal filtering parameters for the studied models and numerical schemes. A further comparison concerns the ability to split very close objects that are artificially connected due to acquisition errors intrinsically linked to the physics of LSM. In all studied aspects, the nonlinear diffusion filter called geodesic mean curvature flow (GMCF) turned out to have the best performance. Copyright 2010 Elsevier B.V. All rights reserved.
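The Perona-Malik model at the heart of this comparison can be sketched in its simplest explicit 2D form; note the paper's schemes are semi-implicit finite-volume and regularized, so the step below (with an assumed diffusivity g and time step) is only an illustration of the edge-stopping mechanism.

```python
import numpy as np

def perona_malik_step(u, dt=0.1, K=1.0):
    """One explicit Perona-Malik step: u += dt * div( g(|grad u|) grad u ),
    discretized with 4-neighbour differences and replicated (Neumann) borders."""
    g = lambda d: 1.0 / (1.0 + (d / K) ** 2)   # edge-stopping diffusivity
    p = np.pad(u, 1, mode="edge")
    dN = p[:-2, 1:-1] - u                       # differences to the 4 neighbours
    dS = p[2:, 1:-1] - u
    dW = p[1:-1, :-2] - u
    dE = p[1:-1, 2:] - u
    return u + dt * (g(np.abs(dN)) * dN + g(np.abs(dS)) * dS
                     + g(np.abs(dW)) * dW + g(np.abs(dE)) * dE)
```

Because g falls off for large gradients, smooth noise is diffused away while nuclei edges (large |grad u|) are preserved, which is exactly the property evaluated against the gold standard above.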

  13. Collaborative Emission Reduction Model Based on Multi-Objective Optimization for Greenhouse Gases and Air Pollutants.

    PubMed

    Meng, Qing-chun; Rong, Xiao-xia; Zhang, Yi-min; Wan, Xiao-le; Liu, Yuan-yuan; Wang, Yu-zhi

    2016-01-01

CO2 emission influences not only global climate change but also international economic and political situations. Thus, reducing the emission of CO2, a major greenhouse gas, has become a major issue in China and around the world with regard to preserving the environment and its ecology. Energy consumption from coal, oil, and natural gas is primarily responsible for the production of greenhouse gases and of air pollutants such as SO2 and NOX, which are the main air pollutants in China. In this study, a mathematical multi-objective optimization method was adopted to analyze the collaborative emission reduction of the three kinds of gases on the basis of their common constraints across different modes of energy consumption, in order to develop an economic, clean, and efficient scheme for energy distribution. The first part introduces the background research, the collaborative emission reduction for three kinds of gases, the multi-objective optimization, the main mathematical modeling, and the optimization method. The second part discusses the four mathematical tools utilized in this study: a Granger causality test to analyze the causality between air quality and pollutant emission, a function analysis to determine the quantitative relation between energy consumption and pollutant emission, a multi-objective optimization to set up the collaborative optimization model that considers energy consumption, and an optimality-condition analysis of the multi-objective optimization model to design the optimal-pole algorithm and obtain an efficient collaborative reduction scheme. In the empirical analysis, data on pollutant emissions and the final consumption of energies in Tianjin in 1996-2012 were employed to verify the effectiveness of the model and analyze the efficient solution and the corresponding dominant set. In the last part, several suggestions for collaborative reduction are recommended and conclusions are drawn.
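How conflicting objectives trade off against each other can be illustrated with weighted-sum scalarization over a discrete set of candidate schemes; this generic sketch (with invented objective names) is not the paper's optimal-pole algorithm, only the standard technique it builds on.

```python
import numpy as np

def weighted_sum_schemes(schemes, emission_cost, economic_cost, n_weights=11):
    """Scan the trade-off weight w in [0, 1]; for each w pick the scheme
    minimizing  w * emission_cost + (1 - w) * economic_cost.  The distinct
    winners approximate the (convex part of the) Pareto front."""
    winners = []
    for w in np.linspace(0.0, 1.0, n_weights):
        scores = [w * emission_cost(s) + (1.0 - w) * economic_cost(s)
                  for s in schemes]
        best = schemes[int(np.argmin(scores))]
        if best not in winners:
            winners.append(best)
    return winners
```

A dominated scheme (worse on both objectives than some alternative) can never win for any weight, so it drops out automatically; optimality-condition methods like the paper's aim to characterize the efficient set more completely than this weight scan can.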

  14. Collaborative Emission Reduction Model Based on Multi-Objective Optimization for Greenhouse Gases and Air Pollutants

    PubMed Central

    Zhang, Yi-min; Wan, Xiao-le; Liu, Yuan-yuan; Wang, Yu-zhi

    2016-01-01

CO2 emission influences not only global climate change but also international economic and political situations. Thus, reducing the emission of CO2, a major greenhouse gas, has become a major issue in China and around the world with regard to preserving the environment and its ecology. Energy consumption from coal, oil, and natural gas is primarily responsible for the production of greenhouse gases and of air pollutants such as SO2 and NOX, which are the main air pollutants in China. In this study, a mathematical multi-objective optimization method was adopted to analyze the collaborative emission reduction of the three kinds of gases on the basis of their common constraints across different modes of energy consumption, in order to develop an economic, clean, and efficient scheme for energy distribution. The first part introduces the background research, the collaborative emission reduction for three kinds of gases, the multi-objective optimization, the main mathematical modeling, and the optimization method. The second part discusses the four mathematical tools utilized in this study: a Granger causality test to analyze the causality between air quality and pollutant emission, a function analysis to determine the quantitative relation between energy consumption and pollutant emission, a multi-objective optimization to set up the collaborative optimization model that considers energy consumption, and an optimality-condition analysis of the multi-objective optimization model to design the optimal-pole algorithm and obtain an efficient collaborative reduction scheme. In the empirical analysis, data on pollutant emissions and the final consumption of energies in Tianjin in 1996–2012 were employed to verify the effectiveness of the model and analyze the efficient solution and the corresponding dominant set. In the last part, several suggestions for collaborative reduction are recommended and conclusions are drawn. PMID:27010658

  15. Tripartite counterfactual entanglement distribution.

    PubMed

    Chen, Yuanyuan; Gu, Xuemei; Jiang, Dong; Xie, Ling; Chen, Lijun

    2015-08-10

We propose two counterfactual schemes for tripartite entanglement distribution without any physical particles travelling through the quantum channel. One scheme arranges for the three participants to connect with the absorption object by using a switch. Using the "chained" quantum Zeno effect, the three participants can accomplish the task of entanglement distribution with a unique counterfactual interference probability. The other scheme uses a Michelson-type interferometer to swap two entangled pairs such that the photons of the three participants are entangled. Moreover, the distance of entanglement distribution is doubled, as two distant absorption objects are used. We also discuss implementation issues to show that the proposed schemes can be realized with current technology.

  16. An application of object-oriented knowledge representation to engineering expert systems

    NASA Technical Reports Server (NTRS)

    Logie, D. S.; Kamil, H.; Umaretiya, J. R.

    1990-01-01

    The paper describes an object-oriented knowledge representation and its application to engineering expert systems. The object-oriented approach promotes efficient handling of the problem data by allowing knowledge to be encapsulated in objects and organized by defining relationships between the objects. An Object Representation Language (ORL) was implemented as a tool for building and manipulating the object base. Rule-based knowledge representation is then used to simulate engineering design reasoning. Using a common object base, very large expert systems can be developed, comprised of small, individually processed, rule sets. The integration of these two schemes makes it easier to develop practical engineering expert systems. The general approach to applying this technology to the domain of the finite element analysis, design, and optimization of aerospace structures is discussed.

  17. Effect of CFRP Schemes on the Flexural Behavior of RC Beams Modeled by Using a Nonlinear Finite-element Analysis

    NASA Astrophysics Data System (ADS)

    Al-Rousan, R. Z.

    2015-09-01

    The main objective of this study was to assess the effect of the number and schemes of carbon-fiber-reinforced polymer (CFRP) sheets on the capacity of bending moment, the ultimate displacement, the ultimate tensile strain of CFRP, the yielding moment, concrete compression strain, and the energy absorption of RC beams, and to provide useful relationships that can be effectively utilized to determine the number of CFRP sheets required for a necessary increase in the flexural strength of the beams without a major loss of ductility. To accomplish this, various RC beams, identical in their geometric and reinforcement details but with different numbers and configurations of CFRP sheets, were modeled and analyzed using the ANSYS software and a nonlinear finite-element analysis.

  18. A Structure and Scheme for the Evaluation of Innovative Programs. The EPIC Brief, Issue No. 2.

    ERIC Educational Resources Information Center

    Objective evaluation of school programs is a process in which a school staff collects information used to provide feedback as to whether or not a given set of objectives has been met. The Evaluative Programs for Innovative Curriculums (EPIC) four-step scheme of objective evaluation is based on a three-dimensional structure of variables…

  19. An image understanding system using attributed symbolic representation and inexact graph-matching

    NASA Astrophysics Data System (ADS)

    Eshera, M. A.; Fu, K.-S.

    1986-09-01

    A powerful image understanding system using a semantic-syntactic representation scheme consisting of attributed relational graphs (ARGs) is proposed for the analysis of the global information content of images. A multilayer graph transducer scheme performs the extraction of ARG representations from images, with ARG nodes representing the global image features and the relations between features represented by attributed branches between the corresponding nodes. An efficient dynamic programming technique is employed to derive the distance between two ARGs and the inexact matching of their respective components. Noise, distortion, and ambiguity in real-world images are handled through modeling in the transducer mapping rules and through an appropriate cost of error-transformation for the inexact matching of the representation. The system is demonstrated for the case of locating objects in a scene composed of complex overlapped objects, and for the case of target detection in noisy and distorted synthetic aperture radar imagery.

  20. Numerical scoring for the Classic BILAG index

    PubMed Central

    Cresswell, Lynne; Yee, Chee-Seng; Farewell, Vernon; Rahman, Anisur; Teh, Lee-Suan; Griffiths, Bridget; Bruce, Ian N.; Ahmad, Yasmeen; Prabu, Athiveeraramapandian; Akil, Mohammed; McHugh, Neil; Toescu, Veronica; D’Cruz, David; Khamashta, Munther A.; Maddison, Peter; Isenberg, David A.

    2009-01-01

    Objective. To develop an additive numerical scoring scheme for the Classic BILAG index. Methods. SLE patients were recruited into this multi-centre cross-sectional study. At every assessment, data were collected on disease activity and therapy. Logistic regression was used to model an increase in therapy, as an indicator of active disease, by the Classic BILAG score in eight systems. As both indicate inactivity, scores of D and E were set to 0 and used as the baseline in the fitted model. The coefficients from the fitted model were used to determine the numerical values for Grades A, B and C. Different scoring schemes were then compared using receiver operating characteristic (ROC) curves. Validation analysis was performed using assessments from a single centre. Results. There were 1510 assessments from 369 SLE patients. The currently used coding scheme (A = 9, B = 3, C = 1 and D/E = 0) did not fit the data well. The regression model suggested three possible numerical scoring schemes: (i) A = 11, B = 6, C = 1 and D/E = 0; (ii) A = 12, B = 6, C = 1 and D/E = 0; and (iii) A = 11, B = 7, C = 1 and D/E = 0. These schemes produced comparable ROC curves. Based on this, A = 12, B = 6, C = 1 and D/E = 0 seemed a reasonable and practical choice. The validation analysis suggested that although the A = 12, B = 6, C = 1 and D/E = 0 coding is still reasonable, a scheme with slightly less weighting for B, such as A = 12, B = 5, C = 1 and D/E = 0, may be more appropriate. Conclusions. A reasonable additive numerical scoring scheme based on treatment decision for the Classic BILAG index is A = 12, B = 5, C = 1, D = 0 and E = 0. PMID:19779027
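    The recommended additive scheme can be sketched in a few lines of Python. The weights (A = 12, B = 5, C = 1, D = 0, E = 0) are those concluded in the study; the organ-system names in the example are illustrative, since the Classic BILAG index simply grades eight systems:

```python
def bilag_score(grades):
    """Additive numerical score for the Classic BILAG index.

    `grades` maps each assessed organ system to its letter grade.
    Weights follow the scheme recommended in the study:
    A = 12, B = 5, C = 1, D = 0, E = 0 (D and E both indicate inactivity).
    """
    weights = {"A": 12, "B": 5, "C": 1, "D": 0, "E": 0}
    return sum(weights[g] for g in grades.values())

# Example: one system graded A, two graded B, one C, the rest inactive.
grades = {
    "general": "A", "mucocutaneous": "B", "neurological": "B",
    "musculoskeletal": "C", "cardiorespiratory": "D",
    "vasculitis": "E", "renal": "E", "haematology": "E",
}
print(bilag_score(grades))  # 12 + 5 + 5 + 1 = 23
```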

  1. A color-coded vision scheme for robotics

    NASA Technical Reports Server (NTRS)

    Johnson, Kelley Tina

    1991-01-01

    Most vision systems for robotic applications rely entirely on the extraction of information from gray-level images. Humans, however, regularly depend on color to discriminate between objects. Therefore, the inclusion of color in a robot vision system seems a natural extension of the existing gray-level capabilities. A method for robot object recognition using a color-coding classification scheme is discussed. The scheme is based on an algebraic system in which a two-dimensional color image is represented as a polynomial of two variables. The system is then used to find the color contour of objects. In a controlled environment, such as that of the in-orbit space station, a particular class of objects can thus be quickly recognized by its color.

  2. Pricing schemes for new drugs: a welfare analysis.

    PubMed

    Levaggi, Rosella

    2014-02-01

    Drug price regulation is acquiring increasing significance in the investment choices of the pharmaceutical sector. The overall objective is to determine an optimal trade-off between the incentives for innovation, consumer protection, and value for money. However, price regulation is itself a source of distortion. In this study, we examine the welfare properties of listing through a bargaining process and of value-based pricing schemes. The latter are superior instruments to uncertain listing processes for maximising total welfare, but the distribution of the benefits between consumers and the industry depends on the rate of rebate chosen by the regulator. However, through an appropriate choice, it is always possible to define a value-based pricing scheme with risk sharing that both consumers and the industry prefer to an uncertain bargaining process. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. Watermarking textures in video games

    NASA Astrophysics Data System (ADS)

    Liu, Huajian; Berchtold, Waldemar; Schäfer, Marcel; Lieb, Patrick; Steinebach, Martin

    2014-02-01

    Digital watermarking is a promising solution to video game piracy. In this paper, based on an analysis of the special challenges and requirements of watermarking textures in video games, a novel watermarking scheme for DDS textures in video games is proposed. To meet the performance requirements of video game applications, the proposed algorithm embeds the watermark message directly in the compressed stream in DDS files and can be straightforwardly applied in the watermark container technique for real-time embedding. Furthermore, the embedding approach achieves a high watermark payload to handle collusion-secure fingerprinting codes of extreme length. Hence, the scheme is resistant to collusion attacks, which is indispensable in video game applications. The proposed scheme is evaluated in terms of transparency, robustness, security, and performance. In particular, in addition to classical objective evaluation, the visual quality and playing experience of watermarked games are assessed subjectively during game playing.

  4. Insights for the assessment of the economic impact of endemic diseases: specific adaptation of economic frameworks using the case of bovine viral diarrhoea.

    PubMed

    Stott, A W; Gunn, G J

    2017-04-01

    Generic frameworks for the economic analysis of farm animal disease are now well established. The paper, therefore, uses bovine viral diarrhoea (BVD) as an example to explore how these frameworks need to be adapted to fit the characteristics of a particular disease and the specific objectives of the analysis. In the case of BVD, given the relative strength of tests available to correctly identify virus-positive animals, thus enabling them to be culled, the emphasis has been on cost-benefit analysis of regional and national certification/eradication schemes. Such analyses in turn raise interesting questions about farmer uptake and maintenance of certification schemes and the equity and cost-effective implementation of these schemes. The complex epidemiology of BVD virus infections and the long-term, widespread and often occult nature of BVD effects make economic analysis of the disease and its control particularly challenging. However, this has resulted in a wider whole-farm perspective that captures the influence of multiple decisions, not just those directly associated with disease prevention and control. There is a need to include management of reproduction, risk and enterprise mix in the research on farmer decision-making, as all these factors impinge on, and are affected by, the spread of BVD.

  5. Setting monitoring objectives for landscape-size areas

    Treesearch

    Craig M. Olson; Dean Angelides

    2000-01-01

    The setting of objectives for monitoring schemes for landscape-size areas is a complex task in today's regulatory and sociopolitical atmosphere. The technology available today, the regulatory environment, and the sociopolitical considerations require multiresource inventory and monitoring schemes, whether the ownership is industrial or for preservation....

  6. Indoor Trajectory Tracking Scheme Based on Delaunay Triangulation and Heuristic Information in Wireless Sensor Networks.

    PubMed

    Qin, Junping; Sun, Shiwen; Deng, Qingxu; Liu, Limin; Tian, Yonghong

    2017-06-02

    Object tracking and detection is one of the most significant research areas for wireless sensor networks. Existing indoor trajectory tracking schemes in wireless sensor networks are based on continuous localization and moving object data mining. Indoor trajectory tracking based on the received signal strength indicator (RSSI) has received increased attention because it has low cost and requires no special infrastructure. However, RSSI tracking introduces uncertainty because of the inaccuracies of measurement instruments and the irregularities (instability, multipath, diffraction) of wireless signal transmissions in indoor environments. Heuristic information includes some key factors for trajectory tracking procedures. This paper proposes a novel trajectory tracking scheme based on Delaunay triangulation and heuristic information (TTDH). In this scheme, the entire field is divided into a series of triangular regions. The common side of adjacent triangular regions is regarded as a regional boundary. Our scheme detects heuristic information related to a moving object's trajectory, including boundaries and triangular regions. Then, the trajectory is formed by means of a dynamic time-warping position-fingerprint-matching algorithm with heuristic information constraints. Field experiments show that the average error distance of our scheme is less than 1.5 m, and that error does not accumulate among the regions.
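    The fingerprint-matching step rests on dynamic time warping. The following is only a generic DTW sketch under plain Euclidean per-sample distance, not the TTDH algorithm itself (which adds heuristic constraints); the anchor count and RSSI values are invented for illustration:

```python
import math

def dtw_distance(seq_a, seq_b):
    """Dynamic time warping distance between two RSSI fingerprint
    sequences; each sample is a tuple of readings from the anchor nodes."""
    n, m = len(seq_a), len(seq_b)
    inf = float("inf")
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = math.dist(seq_a[i - 1], seq_b[j - 1])  # per-sample distance
            cost[i][j] = d + min(cost[i - 1][j],       # insertion
                                 cost[i][j - 1],       # deletion
                                 cost[i - 1][j - 1])   # match
    return cost[n][m]

# Match an observed RSSI trace against candidate region fingerprints.
observed = [(-60, -72), (-58, -70), (-55, -69)]
fingerprints = {
    "region_1": [(-61, -71), (-57, -70), (-54, -68)],
    "region_2": [(-80, -50), (-78, -52), (-75, -55)],
}
best = min(fingerprints, key=lambda r: dtw_distance(observed, fingerprints[r]))
print(best)  # region_1
```

    DTW is attractive here because the moving object does not traverse a region at a fixed speed, so observed traces and stored fingerprints of different lengths can still be aligned.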

  7. Shark: Fast Data Analysis Using Coarse-grained Distributed Memory

    DTIC Science & Technology

    2013-05-01

    Table metadata is kept in a metastore (often MySQL or Derby) with a namespace for tables, table metadata, and partition information. Table data is stored in an HDFS directory. Custom SerDe (serialization/deserialization) Java interface implementations are supported, saving time and space for large data sets.

  8. Kinematics and force analysis of a robot hand based on an artificial biological control scheme

    NASA Astrophysics Data System (ADS)

    Kim, Man Guen

    An artificial biological control scheme (ABCS) is used to study the kinematics and statics of a multifingered hand with a view to developing an efficient control scheme for grasping. The ABCS is based on observation of human grasping, intuitively taking it as the optimum model for robotic grasping. A final chapter proposes several grasping measures to be applied to the design and control of a robot hand. The ABCS leads to the definition of two modes of the grasping action: natural grasping (NG), which is the human motion to grasp the object without any special task command, and forced grasping (FG), which is the motion with a specific task. The grasping direction line (GDL) is defined to determine the position and orientation of the object in the hand. The kinematic model of a redundant robot arm and hand is developed by reconstructing the human upper extremity and using anthropometric measurement data. The inverse kinematic analyses of various types of precision and power grasping are studied by replacing the three-link with one virtual link and using the GDL. The static force analysis for grasping with fingertips is studied by applying the ABCS. A measure of grasping stability, that maintains the positions of contacts as well as the configurations of the redundant fingers, is derived. The grasping stability measure (GSM), a measure of how well the hand maintains grasping under the existence of external disturbance, is derived by the torque vector of the hand calculated from the external force applied to the object. The grasping manipulability measure (GMM), a measure of how well the hand manipulates the object for the task, is derived by the joint velocity vector of the hand calculated from the object velocity. The grasping performance measure (GPM) is defined by the sum of the directional components of the GSM and the GMM. 
Finally, a planar redundant hand with two fingers is examined in order to study the various postures of the hand performing pinch grasping by applying the GSM and the GMM.

  9. A suggested color scheme for reducing perception-related accidents on construction work sites.

    PubMed

    Yi, June-seong; Kim, Yong-woo; Kim, Ki-aeng; Koo, Bonsang

    2012-09-01

    Changes in workforce demographics have led to the need for more sophisticated approaches to addressing the safety requirements of the construction industry. In other industry domains, extensive research has shown that perception-related accidents can be effectively reduced by implementing an appropriate color scheme, yet the construction industry has been passive in exploring this impact. The research demonstrated that the use of appropriate color schemes could improve the actions and psychology of workers on site, thereby increasing their awareness of potentially dangerous situations. As a preliminary study, the objects selected through rigorous analysis of accident reports were workwear, safety nets, gondolas, scaffolding, and safety passages. The colors applied on site to temporary facilities were adopted from existing theoretical and empirical research that suggests certain colors and their combinations improve visibility and conspicuity while minimizing work fatigue. The color schemes were also tested and confirmed through two workshops with workers and managers involved in actual projects. The impacts of the color schemes suggested in this paper are summarized as follows. First, the color schemes improve the conspicuity of facilities relative to other on-site components, enabling workers to quickly discern and orient themselves in their work environment. Second, the color schemes were selected to minimize the visual fatigue and monotony that can potentially increase accidents. Copyright © 2011 Elsevier Ltd. All rights reserved.

  10. On resilience studies of system detection and recovery techniques against stealthy insider attacks

    NASA Astrophysics Data System (ADS)

    Wei, Sixiao; Zhang, Hanlin; Chen, Genshe; Shen, Dan; Yu, Wei; Pham, Khanh D.; Blasch, Erik P.; Cruz, Jose B.

    2016-05-01

    With the explosive growth of network technologies, insider attacks have become a major concern for business operations that rely largely on computer networks. To better detect insider attacks that marginally manipulate network traffic over time, and to recover the system from attacks, in this paper we implement a temporal-based detection scheme using the sequential hypothesis testing technique. Two hypothetical states are considered: the null hypothesis that the collected information comes from benign historical traffic, and the alternative hypothesis that the network is under attack. The objective of the detection scheme is to recognize the change within the shortest time by comparing the two hypotheses. In addition, once an attack is detected, a server migration-based system recovery scheme can be triggered to restore the system to its state prior to the attack. To understand the mitigation of insider attacks, a multi-functional web display of the detection analysis was developed for real-time analytics. Experiments using real-world traffic traces evaluate the effectiveness of the Detection System and Recovery (DeSyAR) scheme. The evaluation data validate that the detection scheme based on sequential hypothesis testing and the server migration-based recovery scheme perform well in detecting insider attacks and recovering the system under attack.
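    The sequential hypothesis test described above can be sketched with Wald's classical SPRT on a stream of binary alerts. The alert probabilities and error rates below are illustrative assumptions, not values from the paper:

```python
import math

def sprt(observations, p0=0.1, p1=0.5, alpha=0.01, beta=0.01):
    """Wald's sequential probability ratio test on a stream of binary alerts.

    H0: alert rate p0 (benign historical traffic); H1: alert rate p1
    (network under attack).  alpha/beta are the tolerated false-positive and
    false-negative rates.  Returns the decision and the number of samples
    consumed, or (None, n) if the test has not yet terminated.
    """
    upper = math.log((1 - beta) / alpha)   # accept H1 at or above this
    lower = math.log(beta / (1 - alpha))   # accept H0 at or below this
    llr = 0.0                              # running log-likelihood ratio
    for n, x in enumerate(observations, start=1):
        if x:
            llr += math.log(p1 / p0)
        else:
            llr += math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "attack", n
        if llr <= lower:
            return "benign", n
    return None, len(observations)

decision, samples = sprt([1, 1, 0, 1, 1, 1, 1, 0, 1, 1])
print(decision, samples)  # attack 5
```

    The appeal of the sequential formulation, as in the paper, is that the test stops as soon as the evidence crosses a threshold, which minimizes the expected detection delay for the chosen error rates.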

  11. Does enrollment status in community-based insurance lead to poorer quality of care? Evidence from Burkina Faso

    PubMed Central

    2013-01-01

    Introduction In 2004, a community-based health insurance (CBI) scheme was introduced in Nouna health district, Burkina Faso, with the objective of improving financial access to high quality health services. We investigate the role of CBI enrollment in the quality of care provided at primary-care facilities in Nouna district, and measure differences in objective and perceived quality of care and patient satisfaction between enrolled and non-enrolled populations who visit the facilities. Methods We interviewed a systematic random sample of 398 patients after their visit to one of the thirteen primary-care facilities contracted with the scheme; 34% (n = 135) of the patients were currently enrolled in the CBI scheme. We assessed objective quality of care as consultation, diagnostic and counselling tasks performed by providers during outpatient visits, perceived quality of care as patient evaluations of the structures and processes of service delivery, and overall patient satisfaction. Two-sample t-tests were performed for group comparison and ordinal logistic regression (OLR) analysis was used to estimate the association between CBI enrollment and overall patient satisfaction. Results Objective quality of care evaluations show that CBI enrollees received substantially less comprehensive care for outpatient services than non-enrollees. In contrast, CBI enrollment was positively associated with overall patient satisfaction (aOR = 1.51, p = 0.014), controlling for potential confounders such as patient socio-economic status, illness symptoms, history of illness and characteristics of care received. Conclusions CBI patients perceived better quality of care, while objectively receiving worse quality of care, compared to patients who were not enrolled in CBI. Systematic differences in quality of care expectations between CBI enrollees and non-enrollees may explain this finding. 
One factor influencing quality of care may be the type of provider payment used by the CBI scheme, which has been identified as a leading factor in reducing provider motivation to deliver high quality care to CBI enrollees in previous studies. Based on this study, it is unlikely that perceived quality of care and patient satisfaction explain the low CBI enrollment rates in this community. PMID:23680066

  12. SASS wind forecast impact studies using the GLAS and NEPRF systems: Preliminary conclusions

    NASA Technical Reports Server (NTRS)

    Kalnay, E.; Atlas, R.; Baker, W. E.; Duffy, D.; Halem, M.; Helfand, M.

    1984-01-01

    For this project, a version of the GLAS Analysis/Forecast System was developed that includes an objective dealiasing scheme as an integral part of the analysis cycle. With this system, the (100 sq km) binned SASS wind data generated by S. Peteherych of AER, Canada, corresponding to the period 0000 GMT 7 September 1978 to 1200 GMT 13 September 1978, were objectively dealiased. The dealiased wind fields have been requested and received by JPL, NMC, and the British Meteorological Office. The first 3.5 days of objectively dealiased fields were subjectively enhanced on the McIDAS system. Approximately 20% of the wind directions were modified, and of these, about 70% were changed by less than 90 deg. Two SASS forecast impact studies were performed using the dealiased fields, with the GLAS and the NEPRF (Navy Environmental Prediction Research Facility) analysis/forecast systems.

  13. Effect of exercise referral schemes in primary care on physical activity and improving health outcomes: systematic review and meta-analysis

    PubMed Central

    Taylor, A H; Fox, K R; Hillsdon, M; Anokye, N; Campbell, J L; Foster, C; Green, C; Moxham, T; Mutrie, N; Searle, J; Trueman, P; Taylor, R S

    2011-01-01

    Objective To assess the impact of exercise referral schemes on physical activity and health outcomes. Design Systematic review and meta-analysis. Data sources Medline, Embase, PsycINFO, Cochrane Library, ISI Web of Science, SPORTDiscus, and ongoing trial registries up to October 2009. We also checked study references. Study selection Design: randomised controlled trials or non-randomised controlled (cluster or individual) studies published in peer review journals. Population: sedentary individuals with or without medical diagnosis. Exercise referral schemes defined as: clear referrals by primary care professionals to third party service providers to increase physical activity or exercise, physical activity or exercise programmes tailored to individuals, and initial assessment and monitoring throughout programmes. Comparators: usual care, no intervention, or alternative exercise referral schemes. Results Eight randomised controlled trials met the inclusion criteria, comparing exercise referral schemes with usual care (six trials), alternative physical activity intervention (two), and an exercise referral scheme plus a self determination theory intervention (one). Compared with usual care, follow-up data for exercise referral schemes showed an increased number of participants who achieved 90-150 minutes of physical activity of at least moderate intensity per week (pooled relative risk 1.16, 95% confidence intervals 1.03 to 1.30) and a reduced level of depression (pooled standardised mean difference −0.82, −1.28 to −0.35). Evidence of a between group difference in physical activity of moderate or vigorous intensity or in other health outcomes was inconsistent at follow-up. We did not find any difference in outcomes between exercise referral schemes and the other two comparator groups. 
None of the included trials separately reported outcomes in individuals with specific medical diagnoses. Substantial heterogeneity in the quality and nature of the exercise referral schemes across studies might have contributed to the inconsistency in outcome findings. Conclusions Considerable uncertainty remains as to the effectiveness of exercise referral schemes for increasing physical activity, fitness, or health indicators, or whether they are an efficient use of resources for sedentary people with or without a medical diagnosis. PMID:22058134

  14. Multi-objective optimization of radiotherapy: distributed Q-learning and agent-based simulation

    NASA Astrophysics Data System (ADS)

    Jalalimanesh, Ammar; Haghighi, Hamidreza Shahabi; Ahmadi, Abbas; Hejazian, Hossein; Soltani, Madjid

    2017-09-01

    Radiotherapy (RT) is among the standard techniques for the treatment of cancerous tumours, and many cancer patients are treated in this manner. Treatment planning is the most important phase in RT and plays a key role in achieving therapy quality. As the goal of RT is to irradiate the tumour with adequately high levels of radiation while sparing neighbouring healthy tissues as much as possible, it is naturally a multi-objective problem. In this study, we propose an agent-based model of vascular tumour growth and of the effects of RT. Next, we use a multi-objective distributed Q-learning algorithm to find Pareto-optimal solutions for calculating the RT dynamic dose. We consider multiple objectives, and each group of optimizer agents iteratively attempts to optimise one of them. At the end of each iteration, the agents combine their solutions to shape the Pareto front of the multi-objective problem. We propose a new approach by defining three treatment-planning schemes based on different combinations of our objectives, namely invasive, conservative, and moderate. In the invasive scheme, we emphasize killing cancer cells and pay less attention to the effects of irradiation on normal cells. In the conservative scheme, we take more care of normal cells and try to destroy cancer cells in a less aggressive manner. The moderate scheme stands in between. For implementation, each of these schemes is handled by one agent in the MDQ-learning algorithm, and the Pareto-optimal solutions are discovered through the collaboration of the agents. By applying this methodology, we could reach Pareto treatment plans for different scenarios of tumour growth and RT. The proposed multi-objective optimisation algorithm generates robust solutions and finds the best treatment plan for different conditions.
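    The Pareto-front construction at the heart of such methods can be sketched as a simple dominance filter. The two objectives and the candidate plan values below are illustrative, not taken from the study:

```python
def pareto_front(solutions):
    """Filter candidate treatment plans down to the Pareto-optimal set.

    Each solution is (tumour_kill, normal_tissue_sparing); both objectives
    are maximised.  A plan is dominated if another plan is at least as good
    on every objective and strictly better on at least one.
    """
    def dominates(a, b):
        return all(x >= y for x, y in zip(a, b)) and a != b
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions)]

plans = [(0.9, 0.2), (0.7, 0.6), (0.5, 0.8), (0.6, 0.5), (0.4, 0.7)]
print(pareto_front(plans))  # [(0.9, 0.2), (0.7, 0.6), (0.5, 0.8)]
```

    The three retained plans correspond loosely to the invasive, moderate, and conservative schemes: no single plan is best on both objectives, so the final choice is a clinical trade-off along the front.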

  15. A maternal health voucher scheme: what have we learned from the demand-side financing scheme in Bangladesh?

    PubMed

    Ahmed, Shakil; Khan, M Mahmud

    2011-01-01

    It is now more than 2 years since the Ministry of Health and Family Welfare of the Government of Bangladesh implemented the Maternal Health Voucher Scheme, a specialized form of demand-side financing programme. To analyse the early lessons from the scheme, information was obtained through semi-structured interviews with stakeholders at the sub-district level. The analysis identified a number of factors affecting the efficiency and performance of the scheme in the program area: delay in the release of voucher funds, selection criteria used for enrolling pregnant women in the programme, incentives created by the reimbursement system, etc. One of the objectives of the scheme was to encourage market competition among health care providers, but it failed to increase market competitiveness in the area. The resources made available through the scheme did not attract any new providers into the market and public facilities remained the only eligible provider both before and after scheme implementation. However, incentives provided through the voucher system did motivate public providers to offer a higher level of services. The beneficiaries expressed their overall satisfaction with the scheme as well. Since the local facility was not technically ready to provide all types of maternal health care services, providing vouchers may not improve access to care for many pregnant women. To improve the performance of the demand-side strategy, it has become important to adopt some supply-side interventions. In poor developing countries, a demand-side strategy may not be very effective without significant expansion of the service delivery capacity of health facilities at the sub-district level.

  16. A Wrf-Chem Flash Rate Parameterization Scheme and LNO(x) Analysis of the 29-30 May 2012 Convective Event in Oklahoma During DC3

    NASA Technical Reports Server (NTRS)

    Cummings, Kristin A.; Pickering, Kenneth E.; Barth, M.; Weinheimer, A.; Bela, M.; Li, Y.; Allen, D.; Bruning, E.; MacGorman, D.; Rutledge, S.

    2014-01-01

    The Deep Convective Clouds and Chemistry (DC3) field campaign in 2012 provided a plethora of aircraft and ground-based observations (e.g., trace gases, lightning and radar) to study deep convective storms, their convective transport of trace gases, and associated lightning occurrence and production of nitrogen oxides (NOx). Based on the measurements taken of the 29-30 May 2012 Oklahoma thunderstorm, an analysis against a Weather Research and Forecasting Chemistry (WRF-Chem) model simulation of the same event at 3-km horizontal resolution was performed. One of the main objectives was to include various flash rate parameterization schemes (FRPSs) in the model and identify which scheme(s) best captured the flash rates observed by the National Lightning Detection Network (NLDN) and Oklahoma Lightning Mapping Array (LMA). The comparison indicates how well the schemes predicted the timing, location, and number of lightning flashes. The FRPSs implemented in the model were based on the simulated thunderstorm's physical features, such as maximum vertical velocity, cloud-top height, and updraft volume. Adjustment factors were added to each FRPS to best capture the observed flash trend, and a sensitivity study was performed to compare the range in model-simulated lightning-generated nitrogen oxides (LNOx) generated by each FRPS over the storm's lifetime. Based on the best FRPS, model-simulated LNOx was compared against aircraft-measured NOx. The trace gas analysis, along with the increased detail in the model specification of the vertical distribution of lightning flashes as suggested by the LMA data, provides guidance in determining the scenario of NO production per intracloud and cloud-to-ground flash that best matches the NOx mixing ratios observed by the aircraft.
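    As an example of a cloud-top-height-based FRPS of the kind referenced above, the widely used Price and Rind (1992) relations can be sketched as follows. The power-law coefficients are the published ones; whether this particular scheme was among those tested, and how the study applied its adjustment factors, are not stated in the abstract, so the multiplicative adjustment here is an assumption:

```python
def flash_rate(cloud_top_km, continental=True, adjustment=1.0):
    """Cloud-top-height flash rate parameterization (Price and Rind, 1992).

    F = 3.44e-5 * H**4.9 flashes/min over land and
    F = 6.2e-4 * H**1.73 over ocean, with H the cloud-top height in km,
    scaled by a storm-specific adjustment factor (an assumed usage here).
    """
    if continental:
        rate = 3.44e-5 * cloud_top_km ** 4.9
    else:
        rate = 6.2e-4 * cloud_top_km ** 1.73
    return adjustment * rate

# A 13 km continental storm top, unadjusted: roughly 10 flashes per minute.
print(round(flash_rate(13.0), 1))
```

    The steep land exponent is why modest errors in simulated cloud-top height translate into large flash-rate errors, motivating the per-storm adjustment factors and the sensitivity study described in the abstract.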

  18. A New Hybrid Scheme for Preventing Channel Interference and Collision in Mobile Networks

    NASA Astrophysics Data System (ADS)

    Kim, Kyungjun; Han, Kijun

    This paper proposes a new hybrid scheme, based on a given set of channels, for preventing channel interference and collision in mobile networks. The proposed scheme is designed to improve system performance, focusing on performance related to path breakage and channel interference. The objective of this scheme is to improve the performance of inter-node communication. Simulation results show that the new hybrid scheme reduces control message overhead more than a conventional random scheme.

  19. Improved Object Localization Using Accurate Distance Estimation in Wireless Multimedia Sensor Networks

    PubMed Central

    Ur Rehman, Yasar Abbas; Tariq, Muhammad; Khan, Omar Usman

    2015-01-01

    Object localization plays a key role in many popular applications of Wireless Multimedia Sensor Networks (WMSN) and, as a result, it has acquired a significant status in the research community. A significant body of research performs this task without considering node orientation, object geometry, and environmental variations; as a result, the localized object does not reflect real-world scenarios. In this paper, a novel object localization scheme for WMSN is proposed that utilizes range-free localization, computer vision, and principal component analysis based algorithms. The proposed approach provides the best possible approximation of the distance between a WMSN sink and an object, and of the orientation of the object, using image-based information. Simulation results report 99% efficiency and an error ratio of 0.01 (around 1 ft) when compared to other popular techniques. PMID:26528919

  20. Extended opening hours and patient experience of general practice in England: multilevel regression analysis of a national patient survey

    PubMed Central

    Cowling, Thomas E; Harris, Matthew; Majeed, Azeem

    2017-01-01

    Background The UK government plans to extend the opening hours of general practices in England. The ‘extended hours access scheme’ pays practices for providing appointments outside core times (08:00 to 18:30, Monday to Friday) for at least 30 min per 1000 registered patients each week. Objective To determine the association between extended hours access scheme participation and patient experience. Methods Retrospective analysis of a national cross-sectional survey completed by questionnaire (General Practice Patient Survey 2013–2014); 903 357 survey respondents aged ≥18 years old and registered to 8005 general practices formed the study population. Outcome measures were satisfaction with opening hours, experience of making an appointment and overall experience (on five-level interval scales from 0 to 100). Mean differences between scheme participation groups were estimated using multilevel random-effects regression, propensity score matching and instrumental variable analysis. Results Most patients were very (37.2%) or fairly satisfied (42.7%) with the opening hours of their general practices; results were similar for experience of making an appointment and overall experience. Most general practices participated in the extended hours access scheme (73.9%). Mean differences in outcome measures between scheme participants and non-participants were positive but small across estimation methods (mean differences ≤1.79). For example, scheme participation was associated with a 1.25 (95% CI 0.96 to 1.55) increase in satisfaction with opening hours using multilevel regression; this association was slightly greater when patients could not take time off work to see a general practitioner (2.08, 95% CI 1.53 to 2.63). Conclusions Participation in the extended hours access scheme has a limited association with three patient experience measures. This calls into question the expected impact of current plans to extend opening hours on patient experience. PMID:27343274

  1. Kalman filters for assimilating near-surface observations into the Richards equation - Part 1: Retrieving state profiles with linear and nonlinear numerical schemes

    NASA Astrophysics Data System (ADS)

    Chirico, G. B.; Medina, H.; Romano, N.

    2014-07-01

    This paper examines the potential of different algorithms, based on the Kalman filtering approach, for assimilating near-surface observations into the one-dimensional Richards equation governing water flow in soil. Our specific objectives are: (i) to compare the efficiency of different Kalman filter algorithms in retrieving matric pressure head profiles when they are implemented with different numerical schemes of the Richards equation; (ii) to evaluate the performance of these algorithms when nonlinearities arise from the observation equation, i.e. when surface soil water content observations are assimilated to retrieve matric pressure head values. The study is based on a synthetic simulation of an evaporation process from a homogeneous soil column. The first objective is achieved by implementing a Standard Kalman Filter (SKF) algorithm with both an explicit finite difference scheme (EX) and a Crank-Nicolson (CN) linear finite difference scheme of the Richards equation. The Unscented Kalman Filter (UKF) and Ensemble Kalman Filter (EnKF) are applied to handle the nonlinearity of a backward Euler finite difference scheme. To accomplish the second objective, an analogous framework is applied, except that the SKF is replaced with the Extended Kalman Filter (EKF) in combination with a CN numerical scheme, so as to handle the nonlinearity of the observation equation. While the EX scheme is computationally too inefficient to be implemented in an operational assimilation scheme, the retrieval algorithm implemented with a CN scheme is found to be computationally more feasible and accurate than those implemented with the backward Euler scheme, at least for the examined one-dimensional problem. The UKF appears to be as feasible as the EnKF when one has to handle nonlinear numerical schemes or additional nonlinearities arising from the observation equation, at least for systems of small dimensionality such as the one examined in this study.
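    All of the filter variants compared in this record build on the same linear analysis step. A generic textbook Kalman update (not the paper's retrieval code; the state vector here merely stands in for a discretized matric pressure head profile):

```python
import numpy as np

def kf_update(x, P, y, H, R):
    """One analysis step of a standard (linear) Kalman filter.

    x : state mean (n,), e.g. a discretized matric pressure head profile
    P : state covariance (n, n)
    y : observations (m,), e.g. near-surface measurements
    H : observation operator (m, n)
    R : observation error covariance (m, m)
    """
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    x_new = x + K @ (y - H @ x)              # updated mean
    P_new = (np.eye(len(x)) - K @ H) @ P     # updated covariance
    return x_new, P_new
```

    Assimilating one surface value into a three-node profile with unit prior variance moves the observed node halfway toward the observation and, with an uncorrelated prior, leaves the unobserved nodes untouched; the nonlinear variants (EKF, UKF, EnKF) replace H or the covariance propagation while keeping this update structure.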

  2. Combining color and shape information for illumination-viewpoint invariant object recognition.

    PubMed

    Diplaros, Aristeidis; Gevers, Theo; Patras, Ioannis

    2006-01-01

    In this paper, we propose a new scheme that merges color- and shape-invariant information for object recognition. To obtain robustness against photometric changes, color-invariant derivatives are computed first. Color invariance is an important aspect of any object recognition scheme, as color changes considerably with variation in illumination, object pose, and camera viewpoint. These color-invariant derivatives are then used to obtain similarity-invariant shape descriptors. Shape invariance is equally important as, under a change in camera viewpoint and object pose, the shape of a rigid object undergoes a perspective projection on the image plane. Then, the color and shape invariants are combined in a multidimensional color-shape context which is subsequently used as an index. As the indexing scheme makes use of a color-shape invariant context, it provides a highly discriminative information cue robust against varying imaging conditions. The matching function of the color-shape context allows for fast recognition, even in the presence of object occlusion and clutter. Experimental results show that the method recognizes rigid objects with high accuracy in 3-D complex scenes and is robust against changing illumination, camera viewpoint, object pose, and noise.

  3. Mundane? Demographic characteristics as predictors of enrolment onto the National Health Insurance Scheme in two districts of Ghana.

    PubMed

    Seddoh, Anthony; Sataru, Fuseini

    2018-05-04

    In 2003, Ghana passed a law establishing a National Health Insurance Scheme (NHIS) to serve as the main vehicle for achieving universal health coverage. Over 60% of the population had registered by 2009, yet current active membership is 40%, and the stagnation in growth has been recorded across all membership categories. Clearly, the Scheme is falling short of its core objective. This analysis is a critical thematic contextual examination of the effects of demographic factors on enrolment onto the Scheme. Demographic secondary data for 625 respondents, collected using a structured questionnaire during a cross-sectional household survey in an urban (Ashaiman) and a rural (Adaklu) district, were analyzed in univariate and multivariate logistic regression models using the Statistical Package for the Social Sciences (SPSS). Statistical significance was set at P < 0.05. Variables included in the analysis were age, gender, education, occupation, and knowledge about the NHIS. Seventy-nine percent of survey respondents had ever enrolled onto the NHIS, three-fifths of them female. Of those ever enrolled, 63% had valid cards. Age, gender, and educational level were significant predictors of enrolment in the multivariate analysis. Respondents aged 41-60 years were twice as likely (p = 0.05) to be enrolled onto a district Scheme as respondents aged 21-40 years. Females were three times as likely (p = 0.00) to enroll as males. Respondents educated to the tertiary level were five times (p = 0.02), and to the post-graduate level four times (p = 0.05), as likely to enroll as non-educated respondents. No significant association was observed between occupation and enrolment. Uptake of the Scheme is declining despite high awareness and knowledge. Leadership, innovation, and collaboration are required at the district Scheme level to curtail low self-enrolment and to grow membership. Otherwise, the goal of universal coverage under the NHIS will become merely a slogan, and equity in financial access to health care for all Ghanaians will remain elusive.
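    The effect sizes reported above ("twice as likely", "three times as likely") are odds ratios, i.e. exponentiated logistic-regression coefficients. A generic sketch of that mapping (the function names are illustrative, and any coefficients supplied to them are hypothetical, not the authors' fitted model):

```python
import math

def odds_ratio(beta):
    """Odds ratio implied by a logistic-regression coefficient beta."""
    return math.exp(beta)

def enrolment_probability(intercept, coefs, covariates):
    """Predicted probability from a fitted logistic model: 1 / (1 + exp(-eta))."""
    eta = intercept + sum(b * x for b, x in zip(coefs, covariates))
    return 1.0 / (1.0 + math.exp(-eta))
```

    A coefficient of 0 corresponds to an odds ratio of 1 (no effect); a coefficient of ln 3 corresponds to the threefold odds reported for females versus males.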

  4. Parameter Estimation and Sensitivity Analysis of an Urban Surface Energy Balance Parameterization at a Tropical Suburban Site

    NASA Astrophysics Data System (ADS)

    Harshan, S.; Roth, M.; Velasco, E.

    2014-12-01

    Forecasting of urban weather and climate is of great importance as our cities become more populated, and considering the combined effects of global warming and local land-use changes, which make urban inhabitants more vulnerable to, e.g., heat waves and flash floods. In meso- and global-scale models, urban parameterization schemes are used to represent urban effects. However, these schemes require a large set of input parameters related to urban morphological and thermal properties, and obtaining all of these parameters through direct measurements is usually not feasible. A number of studies have reported on parameter estimation and sensitivity analysis to adjust and determine the most influential parameters for land surface schemes in non-urban areas. Similar work for urban areas is scarce; in particular, studies on urban parameterization schemes in tropical cities have so far not been reported. To address these issues, the Town Energy Balance (TEB) urban parameterization scheme (part of the SURFEX land surface modeling system) was subjected to a sensitivity and parameter estimation experiment at a suburban site in tropical Singapore. The sensitivity analysis was carried out as a screening test to identify the most sensitive or influential parameters, after which an optimization experiment was performed to calibrate the input parameters. The sensitivity experiment was based on the improved Sobol' global variance decomposition method. The analysis showed that parameters related to the road, the roof, and soil moisture have significant influence on the performance of the model. The parameter estimation experiment was performed using the AMALGAM (a multi-algorithm genetically adaptive multi-objective method) evolutionary algorithm and showed a remarkable improvement over simulations using the default parameter set. The calibrated parameters from this optimization experiment can be used in further model validation studies to identify inherent deficiencies in the model physics.
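    First-order Sobol' indices of the kind used in such a screening test can be estimated by plain Monte Carlo. A hedged sketch using a Saltelli-style pick-freeze estimator (inputs are assumed uniform on [0, 1]; `model` is any vectorized function standing in for the simulation, not TEB itself):

```python
import numpy as np

def sobol_first_order(model, n_params, n_samples=10_000, seed=0):
    """Monte Carlo first-order Sobol' indices via a pick-freeze estimator.

    `model` maps an (N, n_params) array of inputs in [0, 1] to N outputs;
    rescale inside `model` if parameters live on other ranges.
    """
    rng = np.random.default_rng(seed)
    A = rng.random((n_samples, n_params))
    B = rng.random((n_samples, n_params))
    fA, fB = model(A), model(B)
    total_var = np.var(np.concatenate([fA, fB]))
    S = np.empty(n_params)
    for i in range(n_params):
        ABi = A.copy()
        ABi[:, i] = B[:, i]          # replace only the i-th column of A
        S[i] = np.mean(fB * (model(ABi) - fA)) / total_var
    return S
```

    For the toy model Y = 2·X1 + X2 with uniform inputs, the analytic indices are 0.8 and 0.2, which the estimator recovers to within sampling error; parameters with indices near zero would be frozen before calibration.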

  5. One-dimensional high-order compact method for solving Euler's equations

    NASA Astrophysics Data System (ADS)

    Mohamad, M. A. H.; Basri, S.; Basuno, B.

    2012-06-01

    In the field of computational fluid dynamics, many numerical algorithms have been developed to simulate inviscid, compressible flow problems. Among the most famous and relevant are those based on flux-vector splitting and Godunov-type schemes. This system was previously developed in computational studies by Mawlood [1]; however, new test cases for compressible flows, namely the shock tube problems of receding flow and shock waves, were not investigated there. Thus, the objective of this study is to develop a high-order compact (HOC) finite difference solver for the one-dimensional Euler equations. Before developing the solver, a detailed investigation was conducted to assess the performance of the basic third-order compact central discretization schemes. Spatial discretization of the Euler equations is based on flux-vector splitting. Discretization of the convective flux terms uses a hybrid flux-vector splitting known as the advection upstream splitting method (AUSM), which combines the accuracy of flux-difference splitting with the robustness of flux-vector splitting. The AUSM scheme, combined with the third-order compact approximation of the finite difference equations, was then analyzed in detail. For the first-order schemes in the one-dimensional problem, an explicit time integration method is adopted. In addition, the developed and modified source code for one-dimensional flow is validated with four test cases: an unsteady shock tube, quasi-one-dimensional supersonic-subsonic nozzle flow, receding flow, and shock waves in shock tubes. These results also serve to verify that the corresponding Riemann problems are correctly identified. Further analysis compared the characteristics of the AUSM scheme against experimental results from previous works, and against computational results generated by the van Leer, KFVS, and AUSMPW schemes. The extension of the AUSM scheme from first-order to third-order accuracy yields a remarkable improvement in resolving shocks, contact discontinuities, and rarefaction waves.
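    The AUSM interface flux named above follows the original Liou-Steffen splitting of the Mach number and pressure. A minimal first-order sketch for the 1D Euler equations (a generic textbook form, not the authors' third-order compact implementation):

```python
import numpy as np

def ausm_flux(rho_l, u_l, p_l, rho_r, u_r, p_r, gamma=1.4):
    """Liou-Steffen AUSM interface flux for the 1D Euler equations.

    Returns the flux of (mass, momentum, energy) at the cell face between a
    left state (rho_l, u_l, p_l) and a right state (rho_r, u_r, p_r).
    """
    def split(rho, u, p):
        a = np.sqrt(gamma * p / rho)             # speed of sound
        M = u / a
        if abs(M) <= 1.0:                        # subsonic polynomial splittings
            Mp = 0.25 * (M + 1.0) ** 2
            Mm = -0.25 * (M - 1.0) ** 2
            pp = 0.25 * p * (M + 1.0) ** 2 * (2.0 - M)
            pm = 0.25 * p * (M - 1.0) ** 2 * (2.0 + M)
        else:                                    # supersonic: pure upwinding
            Mp = 0.5 * (M + abs(M))
            Mm = 0.5 * (M - abs(M))
            pp = 0.5 * p * (M + abs(M)) / M
            pm = 0.5 * p * (M - abs(M)) / M
        H = gamma / (gamma - 1.0) * p / rho + 0.5 * u * u   # total enthalpy
        Phi = np.array([rho * a, rho * a * u, rho * a * H])
        return Mp, Mm, pp, pm, Phi

    Mp_l, _, pp_l, _, Phi_l = split(rho_l, u_l, p_l)
    _, Mm_r, _, pm_r, Phi_r = split(rho_r, u_r, p_r)
    m_half = Mp_l + Mm_r                                  # interface Mach number
    conv = m_half * (Phi_l if m_half >= 0.0 else Phi_r)   # upwinded convection
    return conv + np.array([0.0, pp_l + pm_r, 0.0])       # add split pressure
```

    A uniform stationary state reproduces the exact flux (0, p, 0), a basic consistency check; the convective and pressure contributions are split exactly as the abstract describes for flux-vector splitting schemes.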

  6. Problems in the Study of lineaments

    NASA Astrophysics Data System (ADS)

    Anokhin, Vladimir; Kholmyanskii, Michael

    2015-04-01

    The study of linear features of the upper crust, called lineaments, led in its time to major scientific results: the discovery of the planetary regmatic network, the birth of new tectonic concepts, and the establishment of new prospecting indicators for mineral deposits. Yet lineaments are still insufficiently studied for such a promising research direction. Lineament geomorphology faces a number of problems. 1. Terminological problems. The lineament theme still has no generally accepted terminological base; different scientists interpret even the definition of a lineament differently. We offer an expanded definition: lineaments are linear features of the Earth's crust, expressed by linear landforms, linear geological forms, or linear anomalies of physical fields, which may follow each other and are associated with faults. The term "lineament" is not identical to the term "fault", but a lineament is always a reasonable suspicion of a fault, and this suspicion is justified in most cases. The structure of a lineament may include only objects that can, at least presumably, be attributed to deep processes. Specialists in the lineament theme can overcome the terminological problems if they jointly create a common terminological database. 2. Methodological problems. The procedure of manual lineament selection mainly consists of drawing straight line segments along the axes of linear morphostructures on some cartographic basis. 
The subjective factors of manual selection can be reduced by following a few simple rules: - choice of optimal projection, scale, and quality of the cartographic basis; - selection of the optimal type of linear objects under study; - establishment of boundary conditions for lineament allocation (minimum length, maximum bending, minimum length-to-width ratio, etc.); - allocation of a larger number of lineaments, for representative sampling and to reduce the influence of random errors; - ranking of lineaments: fine lines (rank 3) are combined to form larger lineaments of rank 2, which in turn are combined into large lineaments of rank 1; - correlation of the resulting lineament pattern with the pattern of already known faults in the study area; - separate allocation of lineaments by several experts, with correlation of the resulting schemes to create a common scheme. The problem of computer lineament allocation is not yet solved. Existing programs for lineament analysis are not so perfect that one can rely on them completely: in any of them, by changing the initial parameters, one can obtain lineament patterns of any desired configuration, and there is a high probability of heavy and hard-to-recognize systematic errors. In any case, computer-derived lineament patterns should be examined by an expert after their creation. 3. Interpretive problems. To minimize distortion of the results of lineament analysis, it is advisable to follow a few techniques and rules: - use visualization techniques, in particular rose diagrams presenting the azimuths and lengths of the selected lineaments; - downscale the analysis consistently, with a preliminary analysis of a larger area that includes the area of interest and its surroundings; - use the available information on the location of already known faults and other linear tectonic objects of the study area; - compare the lineament scheme with the schemes of other authors, which can reduce the element of subjectivity. 
The study of lineaments is a very promising direction of geomorphology and tectonics. The challenges facing the lineament theme are solvable; to solve them, professionals should meet and talk to each other. The results of further work in this direction may exceed expectations.

  7. Real-data tests of a single-Doppler radar assimilation system

    NASA Astrophysics Data System (ADS)

    Nehrkorn, Thomas; Hegarty, James; Hamill, Thomas M.

    1994-06-01

    Real data tests of a single-Doppler radar data assimilation and forecast system have been conducted for a Florida sea breeze case. The system consists of a hydrostatic mesoscale model used for prediction of the preconvective boundary layer, an objective analysis that combines model first guess fields with radar-derived horizontal winds, a thermodynamic retrieval scheme that obtains temperature information from the three-dimensional wind field and its temporal evolution, and a Newtonian nudging scheme for forcing the model forecast to closer agreement with the analysis. As was found in earlier experiments with simulated data, assimilation using Newtonian nudging benefits from temperature data in addition to wind data. The thermodynamic retrieval technique was successful in retrieving a horizontal temperature gradient from the radar-derived wind fields that, when assimilated into the model, led to a significantly improved forecast of the sea breeze strength and position.
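    Newtonian nudging amounts to augmenting the model tendency with a relaxation term toward the analysis. A one-line forward-Euler sketch of the idea (generic, not the mesoscale model's code; `G` is the nudging coefficient):

```python
def nudge_step(x, tendency, x_obs, G, dt):
    """One forward-Euler step of Newtonian nudging (relaxation).

    The model tendency is augmented with G * (x_obs - x), which relaxes the
    state toward the analysis value x_obs with e-folding time 1/G.
    """
    return x + dt * (tendency(x) + G * (x_obs - x))
```

    With a zero model tendency, repeated steps drive x exponentially toward x_obs; in practice G is chosen small enough that the observations steer, rather than overwhelm, the model dynamics.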

  8. Financing Maternal Health and Family Planning: Are We on the Right Track? Evidence from the Reproductive Health Subaccounts in Mexico, 2003–2012

    PubMed Central

    Aracena-Genao, Belkis; del Río-Zolezzi, Aurora

    2016-01-01

    Objective To analyze whether the changes observed in the level and distribution of resources for maternal health and family planning (MHFP) programs from 2003 to 2012 were consistent with the financial goals of the related policies. Materials and Methods A longitudinal descriptive analysis of the Mexican Reproductive Health Subaccounts 2003–2012 was performed by financing scheme and health function. Financing schemes included social security, government schemes, household out-of-pocket (OOP) payments, and private insurance plans. Functions were preventive care, including family planning, antenatal and puerperium health services, normal and cesarean deliveries, and treatment of complications. Changes in the financial imbalance indicators covered by MHFP policy were tracked: (a) public and OOP expenditures as percentages of total MHFP spending; (b) public expenditure per woman of reproductive age (WoRA, 15–49 years) by financing scheme; (c) public expenditure on treating complications as a percentage of preventive care; and (d) public expenditure on WoRA at state level. Statistical analyses of trends and distributions were performed. Results Public expenditure on government schemes grew by approximately 300%, and the financial imbalance between populations covered by social security and government schemes decreased. The financial burden on households declined, particularly among households without social security. Expenditure on preventive care grew by 16%, narrowing the financing gap between treatment of complications and preventive care. Finally, public expenditure per WoRA for government schemes nearly doubled at the state level, although considerable disparities persist. Conclusions Changes in the level and distribution of MHFP funding from 2003 to 2012 were consistent with the relevant policy goals. However, improving efficiency requires further analysis to ascertain the impact of investments on health outcomes. 
This, in turn, will require better financial data systems as a precondition for improving the monitoring and accountability functions in Mexico. PMID:26812646

  9. High aperture off-axis parabolic mirror applied in digital holographic microscopy

    NASA Astrophysics Data System (ADS)

    Kalenkov, Georgy S.; Kalenkov, Sergey G.; Shtanko, Alexander E.

    2018-04-01

    An optical scheme for recording digital holograms of micro-objects, based on a high-numerical-aperture off-axis parabolic mirror that forms a high-aperture reference wave, is proposed. Recording of digital holograms with the proposed optical scheme is confirmed experimentally. Application of the proposed approach to the registration of hyperspectral holograms of micro-objects in incoherent light is discussed.

  10. A pseudospectra-based approach to non-normal stability of embedded boundary methods

    NASA Astrophysics Data System (ADS)

    Rapaka, Narsimha; Samtaney, Ravi

    2017-11-01

    We present non-normal linear stability of embedded boundary (EB) methods employing pseudospectra and resolvent norms. Stability of the discrete linear wave equation is characterized in terms of the normalized distance of the EB to the nearest ghost node (α) in one and two dimensions. An important objective is that the CFL condition based on the Cartesian grid spacing remains unaffected by the EB. We consider various discretization methods, including both central and upwind-biased schemes. Stability is guaranteed when α ≤ α_max, where α_max ranges between 0.5 and 0.77 depending on the discretization scheme. The stability characteristics remain the same in both one and two dimensions. Sharper limits on the sufficient conditions for stability are obtained from the pseudospectral radius (the Kreiss constant) than the restrictive limits based on the usual singular value decomposition analysis. We present a simple and robust reclassification scheme for the ghost cells ("hybrid ghost cells") to ensure Lax stability of the discrete systems. This has been tested successfully for both low- and high-order discretization schemes, with transient growth of at most O(1). Moreover, we present a stable, fourth-order EB reconstruction scheme. Supported by the KAUST Office of Competitive Research Funds under Award No. URF/1/1394-01.
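    Pseudospectra and resolvent norms reduce to minimum singular values: the ε-pseudospectrum of A is the set of z with σ_min(zI − A) < ε, equivalently ‖(zI − A)⁻¹‖ > 1/ε. A small numpy sketch (generic; the EB-discretized operator would be substituted for A):

```python
import numpy as np

def resolvent_norm(A, z):
    """2-norm of the resolvent (zI - A)^-1, computed as 1 / sigma_min(zI - A)."""
    n = A.shape[0]
    smin = np.linalg.svd(z * np.eye(n) - A, compute_uv=False)[-1]
    return np.inf if smin == 0.0 else 1.0 / smin

def pseudospectrum_grid(A, re_pts, im_pts):
    """sigma_min(zI - A) on a grid; the eps-pseudospectrum is where values < eps."""
    return np.array([[1.0 / resolvent_norm(A, x + 1j * y) for x in re_pts]
                     for y in im_pts])
```

    For a non-normal operator the resolvent norm near the spectrum can far exceed 1/dist(z, spectrum): for the 2×2 Jordan block at z = 0.1 it is about 100, versus 10 for a normal matrix with the same eigenvalues, which is why pseudospectra yield sharper stability limits than eigenvalues alone.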

  11. Clinical Application of Esophageal High-resolution Manometry in the Diagnosis of Esophageal Motility Disorders.

    PubMed

    van Hoeij, Froukje B; Bredenoord, Albert J

    2016-01-31

    Esophageal high-resolution manometry (HRM) is replacing conventional manometry in the clinical evaluation of patients with esophageal symptoms, especially dysphagia. The introduction of HRM gave rise to new objective metrics and recognizable patterns of esophageal motor function, requiring a new classification scheme: the Chicago classification. HRM measurements are more detailed and more easily performed compared to conventional manometry. The visual presentation of acquired data improved the analysis and interpretation of esophageal motor function. This led to a more sensitive, accurate, and objective analysis of esophageal motility. In this review we discuss how HRM changed the way we define and categorize esophageal motility disorders. Moreover, we discuss the clinical applications of HRM for each esophageal motility disorder separately.

  12. Clinical Application of Esophageal High-resolution Manometry in the Diagnosis of Esophageal Motility Disorders

    PubMed Central

    van Hoeij, Froukje B; Bredenoord, Albert J

    2016-01-01

    Esophageal high-resolution manometry (HRM) is replacing conventional manometry in the clinical evaluation of patients with esophageal symptoms, especially dysphagia. The introduction of HRM gave rise to new objective metrics and recognizable patterns of esophageal motor function, requiring a new classification scheme: the Chicago classification. HRM measurements are more detailed and more easily performed compared to conventional manometry. The visual presentation of acquired data improved the analysis and interpretation of esophageal motor function. This led to a more sensitive, accurate, and objective analysis of esophageal motility. In this review we discuss how HRM changed the way we define and categorize esophageal motility disorders. Moreover, we discuss the clinical applications of HRM for each esophageal motility disorder separately. PMID:26631942

  13. Multicriteria decision analysis applied to Glen Canyon Dam

    USGS Publications Warehouse

    Flug, M.; Seitz, H.L.H.; Scott, J.F.

    2000-01-01

    Conflicts in water resources exist because river-reservoir systems are managed to optimize traditional benefits (e.g., hydropower and flood control), which are historically quantified in economic terms, whereas natural and environmental resources, including in-stream and riparian resources, are more difficult or impossible to quantify in economic terms. Multicriteria decision analysis provides a quantitative approach to evaluate resources subject to river basin management alternatives. This objective quantification method includes inputs from special interest groups, the general public, and concerned individuals, as well as professionals for each resource considered in a trade-off analysis. Multicriteria decision analysis is applied to resources and flow alternatives presented in the environmental impact statement for Glen Canyon Dam on the Colorado River. A numeric rating and priority-weighting scheme is used to evaluate 29 specific natural resource attributes, grouped into seven main resource objectives, for nine flow alternatives enumerated in the environmental impact statement.
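    The numeric rating and priority-weighting scheme amounts to a weighted-sum score per alternative. A minimal sketch with hypothetical attribute and alternative names (the real analysis used 29 attributes grouped into seven objectives over nine flow alternatives):

```python
def weighted_score(ratings, weights):
    """Weighted-sum score of one alternative from attribute ratings and weights."""
    total = sum(weights.values())
    return sum(weights[k] * r for k, r in ratings.items()) / total

def rank_alternatives(table, weights):
    """Order alternatives (best first) by their weighted score."""
    return sorted(table, key=lambda alt: weighted_score(table[alt], weights),
                  reverse=True)
```

    With ratings {"hydropower": 3, "riparian": 1} for one alternative and {"hydropower": 1, "riparian": 3} for another, a weight vector favoring hydropower ranks the first alternative higher; the stakeholder input described above enters through the weights.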

  14. Using EIGER for Antenna Design and Analysis

    NASA Technical Reports Server (NTRS)

    Champagne, Nathan J.; Khayat, Michael; Kennedy, Timothy F.; Fink, Patrick W.

    2007-01-01

    EIGER (Electromagnetic Interactions GenERalized) is a frequency-domain electromagnetics software package built upon a flexible framework designed using object-oriented techniques. The analysis methods used include moment-method solutions of integral equations, finite element solutions of partial differential equations, and combinations thereof. The framework design permits new analysis techniques (boundary conditions, Green's functions, etc.) to be added to the software suite with reasonable effort. The code has been designed to execute (in serial or parallel) on a wide variety of platforms, from Intel-based PCs to Unix-based workstations. Recently, new potential integration schemes that avoid singularity extraction techniques have been added for integral equation analysis. These new integration schemes are required to facilitate the use of higher-order elements and basis functions. Higher-order elements are better able to model geometrical curvature using fewer elements than linear elements. Higher-order basis functions are beneficial for simulating structures with rapidly varying fields or currents. Results presented here will demonstrate current and future capabilities of EIGER with respect to analysis of installed antenna system performance in support of NASA's mission of exploration. Examples include antenna coupling within an enclosed environment and antenna analysis on electrically large manned space vehicles.

  15. Multi-Dimensional High Order Essentially Non-Oscillatory Finite Difference Methods in Generalized Coordinates

    NASA Technical Reports Server (NTRS)

    Shu, Chi-Wang

    1998-01-01

    This project concerns the development of high-order, non-oscillatory schemes for computational fluid dynamics. Algorithm analysis, implementation, and applications are performed. Collaborations with NASA scientists have been carried out to ensure that the research is relevant to NASA objectives. The combination of the ENO finite difference method with a spectral method in two space dimensions is considered, jointly with Cai [3]. The resulting scheme behaves nicely for the two-dimensional test problems with or without shocks. Jointly with Cai and Gottlieb, we have also considered one-sided filters for spectral approximations to discontinuous functions [2]. We proved theoretically the existence of filters to recover spectral accuracy up to the discontinuity, and we constructed such filters for practical calculations.

  16. Optimal placement of fast cut back units based on the theory of cellular automata and agent

    NASA Astrophysics Data System (ADS)

    Yan, Jun; Yan, Feng

    2017-06-01

    Thermal power generation units with the fast cut back (FCB) function can serve power to the auxiliary system and maintain island operation after a major blackout, so they are excellent substitutes for traditional black-start power sources. Different placement schemes for FCB units influence the subsequent restoration process differently. Considering the locality of the emergency dispatching rules, the unpredictability of specific dispatching instructions, and unexpected situations such as failure of transmission line energization, a novel deduction model for network reconfiguration based on the theory of cellular automata and agents is established. Several indexes are then defined for evaluating placement schemes for FCB units. An attribute-weight determination method based on the integration of subjective and objective weights, together with grey relational analysis, is used to determine the optimal placement scheme for FCB units. The effectiveness of the proposed method is validated by test results on the New England 10-unit 39-bus power system.
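    Grey relational analysis ranks candidate schemes by their closeness to an ideal reference sequence. A minimal sketch of the standard formulation (ρ = 0.5 is the customary distinguishing coefficient; the index matrix is assumed already normalized, and the weighting step described above is omitted):

```python
import numpy as np

def grey_relational_grades(X, ideal, rho=0.5):
    """Grey relational grade of each candidate scheme against an ideal reference.

    X     : (n_schemes, n_indexes) matrix of normalized index values
    ideal : (n_indexes,) reference (ideal) sequence
    rho   : distinguishing coefficient, customarily 0.5
    """
    delta = np.abs(X - ideal)                            # deviation sequences
    dmin, dmax = delta.min(), delta.max()
    coeff = (dmin + rho * dmax) / (delta + rho * dmax)   # relational coefficients
    return coeff.mean(axis=1)                            # grade: mean over indexes
```

    A scheme identical to the ideal receives grade 1; the FCB placement with the highest grade would be selected.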

  17. Numerical analysis of base flowfield at high altitude for a four-engine clustered nozzle configuration

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See

    1993-01-01

    The objective of this study is to benchmark a four-engine clustered nozzle base flowfield with a computational fluid dynamics (CFD) model. The CFD model is a pressure-based, viscous flow formulation. An adaptive upwind scheme is employed for the spatial discretization. The upwind scheme is based on second- and fourth-order central differencing with adaptive artificial dissipation. Qualitative base flow features such as the reverse jet, wall jet, recompression shock, and plume-plume impingement have been captured. The computed quantitative flow properties, such as the radial base pressure distribution, model centerline Mach number and static pressure variation, and base pressure characteristic curve, agreed reasonably well with the measurements. A parametric study on the effects of grid resolution, turbulence model, inlet boundary condition, and difference scheme for the convective terms has been performed. The results showed that grid resolution and turbulence model are the two primary factors that influence the accuracy of the base flowfield prediction.

  18. A curricula-based comparison of biomedical and health informatics programs in the USA

    PubMed Central

    Hemminger, Bradley M

    2011-01-01

    Objective The field of Biomedical and Health Informatics (BMHI) continues to define itself, and there are many educational programs offering ‘informatics’ degrees with varied foci. The goal of this study was to develop a scheme for systematic comparison of programs across the entire BMHI spectrum and to identify commonalities among informatics curricula. Design Guided by several published competency sets, a grounded theory approach was used to develop a program/curricula categorization scheme based on the descriptions of 636 courses offered by 73 public health, nursing, health, medical, and bioinformatics programs in the USA. The scheme was then used to compare the programs in the aforementioned five informatics disciplines. Results The authors developed a Course-Based Informatics Program Categorization (CBIPC) scheme that can be used both to classify coursework for any BMHI educational program and to compare programs from the same or related disciplines. The application of the CBIPC scheme to the analysis of public health, nursing, health, medical, and bioinformatics programs reveals distinct intradisciplinary curricular patterns and a common core of courses across the entire BMHI education domain. Limitations The study is based on descriptions of courses from the universities' webpages. Thus, it is limited to sampling courses at one moment in time, and classification for the coding scheme is based primarily on course titles and course descriptions. Conclusion The CBIPC scheme combines empirical data about educational curricula from diverse informatics programs and several published competency sets. It also provides a foundation for discussion of BMHI education as a whole and can help define subdisciplinary competencies. PMID:21292707

  19. Robustly stable adaptive control of a tandem of master-slave robotic manipulators with force reflection by using a multiestimation scheme.

    PubMed

    Ibeas, Asier; de la Sen, Manuel

    2006-10-01

    The problem of controlling a tandem of robotic manipulators composing a teleoperation system with force reflection is addressed in this paper. The final objective of this paper is twofold: 1) to design a robust control law capable of ensuring closed-loop stability for robots with uncertainties and 2) to use the so-obtained control law to improve the tracking of each robot to its corresponding reference model in comparison with previously existing controllers when the slave is interacting with the obstacle. In this way, a multiestimation-based adaptive controller is proposed. Thus, the master robot is able to follow more accurately the constrained motion defined by the slave when interacting with an obstacle than when a single-estimation-based controller is used, improving the transparency property of the teleoperation scheme. The closed-loop stability is guaranteed if a minimum residence time, which might be updated online when unknown, between different controller parameterizations is respected. Furthermore, the analysis of the teleoperation and stability capabilities of the overall scheme is carried out. Finally, some simulation examples showing the working of the multiestimation scheme complete this paper.

  20. Classification and assessment of retrieved electron density maps in coherent X-ray diffraction imaging using multivariate analysis.

    PubMed

    Sekiguchi, Yuki; Oroguchi, Tomotaka; Nakasako, Masayoshi

    2016-01-01

    Coherent X-ray diffraction imaging (CXDI) is one of the techniques used to visualize structures of non-crystalline particles of micrometer to submicrometer size from materials and biological science. In the structural analysis of CXDI, the electron density map of a sample particle can theoretically be reconstructed from a diffraction pattern by using phase-retrieval (PR) algorithms. However, in practice, the reconstruction is difficult because diffraction patterns are affected by Poisson noise and missing data in small-angle regions due to the beam stop and the saturation of detector pixels. In contrast to X-ray protein crystallography, in which the phases of diffracted waves are experimentally estimated, phase retrieval in CXDI relies entirely on the computational procedure driven by the PR algorithms. Thus, objective criteria and methods to assess the accuracy of retrieved electron density maps are necessary in addition to conventional parameters monitoring the convergence of PR calculations. Here, a data analysis scheme, named ASURA, is proposed which selects the most probable electron density maps from a set of maps retrieved from 1000 different random seeds for a diffraction pattern. Each electron density map composed of J pixels is expressed as a point in a J-dimensional space. Principal component analysis is applied to describe characteristics in the distribution of the maps in the J-dimensional space. When the distribution is characterized by a small number of principal components, the distribution is classified using the k-means clustering method. The classified maps are evaluated by several parameters to assess the quality of the maps. Using the proposed scheme, structure analysis of a diffraction pattern from a non-crystalline particle is conducted in two stages: estimation of the overall shape and determination of the fine structure inside the support shape. In each stage, the most accurate and probable density maps are objectively selected. The validity of the proposed scheme is examined by application to diffraction data that were obtained from an aggregate of metal particles and a biological specimen at the XFEL facility SACLA using custom-made diffraction apparatus.
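    The core of the selection step — treating each retrieved map as a point in J-dimensional space, reducing with principal component analysis, then clustering with k-means — can be sketched on synthetic data. The two-solution toy model, sizes, and seeds below are illustrative only, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for phase retrieval: 60 "density maps" of J = 100 pixels,
# drawn from two distinct solutions plus noise (ASURA works with maps
# from 1000 random-seed PR runs).
J = 100
sol_a, sol_b = rng.random(J), rng.random(J)
maps = np.vstack([sol_a + 0.01 * rng.standard_normal((30, J)),
                  sol_b + 0.01 * rng.standard_normal((30, J))])

# PCA: each map is a point in J-dimensional space; project onto the
# two leading principal components via the SVD of the centered data.
centered = maps - maps.mean(axis=0)
_, _, Vt = np.linalg.svd(centered, full_matrices=False)
scores = centered @ Vt[:2].T

# Minimal k-means (k = 2) on the PC scores, deterministically seeded
# with one point from each end of the data set.
centers = np.stack([scores[0], scores[-1]])
for _ in range(20):
    labels = np.argmin(np.linalg.norm(scores[:, None] - centers, axis=2),
                       axis=1)
    centers = np.stack([scores[labels == k].mean(axis=0) for k in range(2)])
```

With well-separated solutions, the two clusters recover the two underlying maps; the real scheme then scores each cluster with further quality parameters.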

  1. Using a binaural biomimetic array to identify bottom objects ensonified by echolocating dolphins

    USGS Publications Warehouse

    Helweg, D.A.; Moore, P.W.; Martin, S.W.; Dankiewicz, L.A.

    2006-01-01

    The development of a unique dolphin biomimetic sonar produced data that were used to study signal processing methods for object identification. Echoes from four metallic objects proud on the bottom, and a substrate-only condition, were generated by bottlenose dolphins trained to ensonify the targets in very shallow water. Using the two-element ('binaural') receive array, object echo spectra were collected and submitted for identification to four neural network architectures. Identification accuracy was evaluated over two receive array configurations, and five signal processing schemes. The four neural networks included backpropagation, learning vector quantization, genetic learning and probabilistic network architectures. The processing schemes included four methods that capitalized on the binaural data, plus a monaural benchmark process. All the schemes resulted in above-chance identification accuracy when applied to learning vector quantization and backpropagation. Beam-forming or concatenation of spectra from both receive elements outperformed the monaural benchmark, with higher sensitivity and lower bias. Ultimately, best object identification performance was achieved by the learning vector quantization network supplied with beam-formed data. The advantages of multi-element signal processing for object identification are clearly demonstrated in this development of a first-ever dolphin biomimetic sonar. © 2006 IOP Publishing Ltd.

  2. Investigating the role of the land surface in explaining the interannual variation of the net radiation balance over the Western Sahara and sub-Sahara

    NASA Technical Reports Server (NTRS)

    Smith, Eric A.; Nicholson, Sharon

    1987-01-01

    The status of the data sets is discussed. Progress was made in both the data analysis and modeling areas. The atmospheric and land surface contributions to the net radiation budget over the Sahara-Sahel region are being decoupled. The interannual variability of these two processes was investigated, and this variability was related to seasonal rainfall fluctuations. A modified Barnes objective analysis scheme was developed which uses an elliptic scan pattern and a 3-pass iteration of the difference fields.
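    A Barnes successive-correction analysis of the kind referred to here can be sketched in a few lines. The Gaussian weight form and the convergence parameter follow the standard Barnes formulation with a circular scan, not this project's modified elliptic-scan variant; parameter values are illustrative:

```python
import numpy as np

def barnes(obs_xy, obs_val, grid_xy, kappa=1.0, gamma=0.3, passes=3):
    """Successive-correction Barnes analysis (a minimal single-field sketch).

    obs_xy  : (n, 2) observation locations
    obs_val : (n,)   observed values
    grid_xy : (m, 2) analysis grid points
    kappa   : Gaussian length-scale parameter of the first pass
    gamma   : convergence parameter shrinking kappa on later passes
    """
    def gauss_interp(targets, k):
        # Distance-weighted mean of the current residuals at `targets`.
        d2 = ((targets[:, None, :] - obs_xy[None, :, :]) ** 2).sum(-1)
        w = np.exp(-d2 / k)
        return (w * resid).sum(1) / w.sum(1)

    grid = np.zeros(len(grid_xy))
    at_obs = np.zeros(len(obs_xy))
    k = kappa
    for _ in range(passes):
        resid = obs_val - at_obs          # first pass: residual = obs itself
        grid += gauss_interp(grid_xy, k)
        at_obs += gauss_interp(obs_xy, k)
        k *= gamma                        # sharpen the response each pass
    return grid

# Example: five collinear observations of a linear field; evaluating the
# analysis at the observation points should nearly reproduce them.
obs_xy = np.array([[0., 0], [1, 0], [2, 0], [3, 0], [4, 0]])
obs_val = np.array([0., 1, 2, 3, 4])
analysis_at_obs = barnes(obs_xy, obs_val, obs_xy)
```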

  3. Critical evaluation of sample pretreatment techniques.

    PubMed

    Hyötyläinen, Tuulia

    2009-06-01

    Sample preparation before chromatographic separation is the most time-consuming and error-prone part of the analytical procedure. Therefore, selecting and optimizing an appropriate sample preparation scheme is a key factor in the final success of the analysis, and the judicious choice of an appropriate procedure greatly influences the reliability and accuracy of a given analysis. The main objective of this review is to critically evaluate the applicability, disadvantages, and advantages of various sample preparation techniques. Particular emphasis is placed on extraction techniques suitable for both liquid and solid samples.

  4. Evaluation of the Danish Leave Schemes. Summary of a Report.

    ERIC Educational Resources Information Center

    Andersen, Dines; Appeldorn, Alice; Weise, Hanne

    An evaluation examined how the Danish leave schemes, an offer to employed and unemployed persons who qualify for unemployment benefits, were functioning and to what extent the objectives have been achieved. It was found that 60 percent of those taking leave had previously been unemployed; women accounted for two-thirds of those joining the scheme;…

  5. Local sensory control of a dexterous end effector

    NASA Technical Reports Server (NTRS)

    Pinto, Victor H.; Everett, Louis J.; Driels, Morris

    1990-01-01

    A numerical scheme was developed to solve the inverse kinematics for a user-defined manipulator. The scheme was based on a nonlinear least-squares technique which determines the joint variables by minimizing the difference between the target end effector pose and the actual end effector pose. The scheme was adapted to a dexterous hand in which the joints are either prismatic or revolute and the fingers are considered open kinematic chains. Feasible solutions were obtained using a three-fingered dexterous hand. An algorithm to estimate the position and orientation of a pre-grasped object was also developed. The algorithm was based on triangulation using an ideal sensor and a spherical object model. By choosing the object to be a sphere, only the position of the object frame was important. Based on these simplifications, a minimum of three sensors are needed to find the position of a sphere. A two-dimensional example to determine the position of a circle coordinate frame using a two-fingered dexterous hand was presented.
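    The nonlinear least-squares idea described — minimizing the gap between the target pose and the actual end effector pose over the joint variables — can be sketched for a planar two-revolute-joint finger. The link lengths, damping factor, and Gauss-Newton update are illustrative choices, not the paper's solver:

```python
import numpy as np

def fk(q, l1=1.0, l2=1.0):
    """Forward kinematics of a planar two-revolute-joint chain."""
    x = l1 * np.cos(q[0]) + l2 * np.cos(q[0] + q[1])
    y = l1 * np.sin(q[0]) + l2 * np.sin(q[0] + q[1])
    return np.array([x, y])

def ik(target, q0, iters=100, damping=1e-3):
    """Damped Gauss-Newton on ||target - fk(q)||^2."""
    q = np.asarray(q0, float)
    for _ in range(iters):
        err = target - fk(q)
        # Analytic Jacobian of fk with respect to the joint angles.
        s1, c1 = np.sin(q[0]), np.cos(q[0])
        s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
        J = np.array([[-s1 - s12, -s12],
                      [ c1 + c12,  c12]])
        dq = np.linalg.solve(J.T @ J + damping * np.eye(2), J.T @ err)
        q += dq
    return q

target = np.array([1.2, 0.8])          # reachable fingertip position
q = ik(target, q0=[0.3, 0.5])
```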

  6. Optimization of storage tank locations in an urban stormwater drainage system using a two-stage approach.

    PubMed

    Wang, Mingming; Sun, Yuanxiang; Sweetapple, Chris

    2017-12-15

    Storage is important for flood mitigation and non-point source pollution control. However, seeking a cost-effective design scheme for storage tanks is very complex. This paper presents a two-stage optimization framework to find an optimal scheme for storage tanks using the storm water management model (SWMM). The objectives are to minimize flooding, total suspended solids (TSS) load and storage cost. The framework includes two modules: (i) the analytical module, which evaluates and ranks the flooding nodes with the analytic hierarchy process (AHP) using two indicators (flood depth and flood duration), and then obtains a preliminary scheme by calculating two efficiency indicators (flood reduction efficiency and TSS reduction efficiency); (ii) the iteration module, which obtains an optimal scheme using a generalized pattern search (GPS) method based on the preliminary scheme generated by the analytical module. The proposed approach was applied to a catchment in CZ city, China, to test its capability in choosing design alternatives. Different rainfall scenarios were considered to test its robustness. The results demonstrate that the optimization framework is feasible, and the optimization is fast when based on the preliminary scheme. The optimized scheme is better than the preliminary scheme at reducing runoff and pollutant loads under a given storage cost. The multi-objective optimization framework presented in this paper may be useful in finding the best scheme of storage tanks or low impact development (LID) controls. Copyright © 2017 Elsevier Ltd. All rights reserved.
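    The AHP step used to rank flooding nodes reduces to a principal-eigenvector computation on a pairwise comparison matrix, plus a consistency check. The 3x3 matrix below is purely hypothetical (the paper uses two indicators, flood depth and duration, for which a 2x2 comparison is trivially consistent); it only illustrates the mechanics:

```python
import numpy as np

# Hypothetical pairwise comparisons on Saaty's 1-9 scale.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

# Priority weights = normalized principal (Perron) eigenvector of A.
vals, vecs = np.linalg.eig(A)
i = np.argmax(vals.real)
w = vecs[:, i].real
w = w / w.sum()

# Consistency ratio: CI = (lambda_max - n) / (n - 1), RI(n=3) = 0.58;
# CR < 0.1 is the conventional acceptability threshold.
n = A.shape[0]
CI = (vals.real[i] - n) / (n - 1)
CR = CI / 0.58
```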

  7. Current use of impact models for agri-environment schemes and potential for improvements of policy design and assessment.

    PubMed

    Primdahl, Jørgen; Vesterager, Jens Peter; Finn, John A; Vlahos, George; Kristensen, Lone; Vejre, Henrik

    2010-06-01

    Agri-Environment Schemes (AES) to maintain or promote environmentally-friendly farming practices were implemented on about 25% of all agricultural land in the EU by 2002. This article analyses and discusses the actual and potential use of impact models in supporting the design, implementation and evaluation of AES. Impact models identify and establish the causal relationships between policy objectives and policy outcomes. We review and discuss the role of impact models at different stages in the AES policy process, and present results from a survey of impact models underlying 60 agri-environmental schemes in seven EU member states. We distinguished among three categories of impact models (quantitative, qualitative or common sense), depending on the degree of evidence in the formal scheme description, additional documents, or key person interviews. The categories of impact models used mainly depended on whether scheme objectives were related to natural resources, biodiversity or landscape. A higher proportion of schemes dealing with natural resources (primarily water) were based on quantitative impact models, compared to those concerned with biodiversity or landscape. Schemes explicitly targeted either on particular parts of individual farms or specific areas tended to be based more on quantitative impact models compared to whole-farm schemes and broad, horizontal schemes. We conclude that increased and better use of impact models has significant potential to improve efficiency and effectiveness of AES. (c) 2009 Elsevier Ltd. All rights reserved.

  8. Multiple-3D-object secure information system based on phase shifting method and single interference.

    PubMed

    Li, Wei-Na; Shi, Chen-Xiao; Piao, Mei-Lan; Kim, Nam

    2016-05-20

    We propose a multiple-3D-object secure information system for encrypting multiple three-dimensional (3D) objects based on the three-step phase shifting method. During the decryption procedure, five phase functions (PFs) are decreased to three PFs, in comparison with our previous method, which implies that one cross beam splitter is utilized to implement the single decryption interference. Moreover, the advantages of the proposed scheme also include: each 3D object can be decrypted discretionarily without decrypting a series of other objects earlier; the quality of the decrypted slice image of each object is high according to the correlation coefficient values, none of which is lower than 0.95; no iterative algorithm is involved. The feasibility of the proposed scheme is demonstrated by computer simulation results.

  9. The 'Real Welfare' scheme: benchmarking welfare outcomes for commercially farmed pigs.

    PubMed

    Pandolfi, F; Stoddart, K; Wainwright, N; Kyriazakis, I; Edwards, S A

    2017-10-01

    Animal welfare standards have been incorporated in EU legislation and in farm assurance schemes, based on scientific information and aiming to safeguard the welfare of the species concerned. Recently, emphasis has shifted from resource-based measures of welfare to animal-based measures, which are considered to assess more accurately the welfare status. The data used in this analysis were collected from April 2013 to May 2016 through the 'Real Welfare' scheme in order to assess on-farm pig welfare, as required for those finishing pigs under the UK Red Tractor Assurance scheme. The assessment involved five main measures (percentage of pigs requiring hospitalization, percentage of lame pigs, percentage of pigs with severe tail lesions, percentage of pigs with severe body marks and enrichment use ratio) and optional secondary measures (percentage of pigs with mild tail lesions, percentage of pigs with dirty tails, percentage of pigs with mild body marks, percentage of pigs with dirty bodies), with associated information about the environment and the enrichment in the farms. For the complete database, a sample of pens was assessed from 1928 farm units. Repeated measures were taken in the same farm unit over time, giving 112 240 records at pen level. These concerned a total of 13 480 289 pigs present on the farm during the assessments, with 5 463 348 pigs directly assessed using the 'Real Welfare' protocol. The three most common enrichment types were straw, chain and plastic objects. The main substrate was straw which was present in 67.9% of the farms. Compared with 2013, a significant increase of pens with undocked-tail pigs, substrates and objects was observed over time (P0.3). The results from the first 3 years of the scheme demonstrate a reduction of the prevalence of animal-based measures of welfare problems and highlight the value of this initiative.

  10. Towards photometry pipeline of the Indonesian space surveillance system

    NASA Astrophysics Data System (ADS)

    Priyatikanto, Rhorom; Religia, Bahar; Rachman, Abdul; Dani, Tiar

    2015-09-01

    Optical observation through a sub-meter telescope equipped with a CCD camera has become an alternative method for increasing orbital debris detection and surveillance. This observational mode is expected to monitor medium-sized objects in higher orbits (e.g. MEO, GTO, GSO & GEO), beyond the reach of the usual radar systems. However, such observation of fast-moving objects demands special treatment and analysis techniques. In this study, we performed photometric analysis of satellite track images photographed using the rehabilitated Schmidt Bima Sakti telescope at Bosscha Observatory. The Hough transformation was implemented to automatically detect linear streaks in the images. From this analysis and comparison with the USSPACECOM catalog, two satellites were identified and associated with the inactive Thuraya-3 satellite and Satcom-3 debris, which are located at geostationary orbit. Further aperture photometry analysis revealed the periodicity of the tumbling Satcom-3 debris. In the near future, a similar scheme could be applied to establish an analysis pipeline for an optical space surveillance system hosted in Indonesia.
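    The streak-detection step — a Hough transform vote over line parameters (ρ, θ) — can be sketched without any imaging library on a synthetic streak image. The image and peak-picking here are illustrative; the real pipeline operates on CCD frames:

```python
import numpy as np

def hough_line_peak(img, n_theta=180):
    """Return (rho, theta) of the strongest line in a binary image via a
    minimal Hough accumulator (a sketch of the idea, not a full detector)."""
    ys, xs = np.nonzero(img)
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    diag = int(np.ceil(np.hypot(*img.shape)))
    acc = np.zeros((2 * diag + 1, n_theta), dtype=int)
    # Each foreground pixel votes for every line through it:
    # rho = x cos(theta) + y sin(theta).
    rhos = np.round(xs[:, None] * np.cos(thetas)
                    + ys[:, None] * np.sin(thetas)).astype(int)
    for j in range(n_theta):
        np.add.at(acc[:, j], rhos[:, j] + diag, 1)
    r, j = np.unravel_index(np.argmax(acc), acc.shape)
    return r - diag, thetas[j]

# Synthetic "satellite streak": a horizontal line at row y = 20.
img = np.zeros((64, 64), dtype=bool)
img[20, 5:60] = True
rho, theta = hough_line_peak(img)
```

A horizontal streak at row 20 collects all its votes in the bin at θ = π/2, ρ = 20, which is exactly the accumulator peak.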

  11. Emerging: The Impact of the Artist Teacher Scheme MA on Students' Pedagogical and Artistic Practices

    ERIC Educational Resources Information Center

    Page, Tara; Adams, Jeff; Hyde, Wendy

    2011-01-01

    The United Kingdom Artist Teacher Scheme (ATS) commissioned a study of the artistic and pedagogical practices of students on a recently established Artist Teacher Scheme MA (ATS MA). The aims of this study were to: investigate the motives and objectives teachers/educators have for undertaking this ATS MA programme, the impact the programme had on…

  12. Evaluation of the MindMatters Buddy Support Scheme in Southwest Sydney: Strategies, Achievements and Challenges

    ERIC Educational Resources Information Center

    Khan, Raquiba J.; Bedford, Karen; Williams, Mandy

    2012-01-01

    Objective: Assessing the strategies, achievements and challenges of implementing MindMatters and the views of partner schools towards the buddy support scheme. Design: The MindMatters buddy support scheme (2007-2008) was designed to increase the capacity of secondary schools to adopt a whole-school approach to improving health and well-being of…

  13. Strong Motion Instrumentation of Seismically-Strengthened Port Structures in California by CSMIP

    USGS Publications Warehouse

    Huang, M.J.; Shakal, A.F.

    2009-01-01

    The California Strong Motion Instrumentation Program (CSMIP) has instrumented five port structures. Instrumentation of two more port structures is underway and another one is in planning. Two of the port structures have been seismically strengthened. The primary goals of the strong motion instrumentation are to obtain strong earthquake shaking data for verifying seismic analysis procedures and strengthening schemes, and for post-earthquake evaluations of port structures. The wharves instrumented by CSMIP were recommended by the Strong Motion Instrumentation Advisory Committee, a committee of the California Seismic Safety Commission. Extensive instrumentation of a wharf is difficult and would be impossible without the cooperation of the owners and the involvement of the design engineers. The instrumentation plan for a wharf is developed through study of the retrofit plans of the wharf, and the strong-motion sensors are installed at locations where specific instrumentation objectives can be achieved and access is possible. Some sensor locations have to be planned during design; otherwise they are not possible to install after construction. This paper summarizes the two seismically-strengthened wharves and discusses the instrumentation schemes and objectives. © 2009 ASCE.

  14. Metric freeness and projectivity for classical and quantum normed modules

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Helemskii, A Ya

    2013-07-31

    In functional analysis, there are several diverse approaches to the notion of a projective module. We show that a certain general categorical scheme contains all basic versions as special cases. In this scheme, the notion of a free object comes to the foreground, and, in the best categories, projective objects are precisely retracts of free ones. We are especially interested in the so-called metric version of projectivity and characterize the metrically free classical and quantum (= operator) normed modules. Informally speaking, so-called extremal projectivity, which was known earlier, is interpreted as a kind of 'asymptotical metric projectivity'. In addition, we answer the following specific question in the geometry of normed spaces: what is the structure of metrically projective modules in the simplest case of normed spaces? We prove that metrically projective normed spaces are precisely the subspaces of l_1(M) (where M is a set) that are denoted by l_1^0(M) and consist of finitely supported functions. Thus, in this case, projectivity coincides with freeness. Bibliography: 28 titles.

  15. Generalized interpretation scheme for arbitrary HR InSAR image pairs

    NASA Astrophysics Data System (ADS)

    Boldt, Markus; Thiele, Antje; Schulz, Karsten

    2013-10-01

    Land cover classification of remote sensing imagery is an important topic of research. For example, different applications require precise and fast information about the land cover of the imaged scenery (e.g., disaster management and change detection). Focusing on high resolution (HR) spaceborne remote sensing imagery, the user has the choice between passive and active sensor systems. Passive systems, such as multispectral sensors, have the disadvantage of being dependent on weather influences (fog, dust, clouds, etc.) and time of day, since they work in the visible part of the electromagnetic spectrum. Here, active systems like Synthetic Aperture Radar (SAR) provide improved capabilities. As an interactive method for analyzing HR InSAR image pairs, the CovAmCoh™ method was introduced in former studies. CovAmCoh represents the joint analysis of locality (coefficient of variation - Cov), backscatter (amplitude - Am) and temporal stability (coherence - Coh). It delivers information on the physical backscatter characteristics of imaged scene objects or structures and provides the opportunity to detect different classes of land cover (e.g., urban, rural, infrastructure and activity areas). As an example, railway tracks are easily distinguishable from other infrastructure due to their characteristic bluish coloring caused by the gravel between the sleepers. In consequence, imaged objects or structures have a characteristic appearance in CovAmCoh images which allows the development of classification rules. In this paper, a generalized interpretation scheme for arbitrary InSAR image pairs using the CovAmCoh method is proposed. This scheme is based on analyzing the information content of typical CovAmCoh imagery using semisupervised k-means clustering. It is shown that eight classes model the main local information content of CovAmCoh images sufficiently and can be used as the basis for a classification scheme.

  16. Plasmonic rainbow rings induced by white radial polarization.

    PubMed

    Lan, Tzu-Hsiang; Chung, Yi-Kuan; Li, Jie-En; Tien, Chung-Hao

    2012-04-01

    This Letter presents a scheme to embed both angular and spectral surface plasmon resonance (SPR) in a unique far-field rainbow feature by tightly focusing (effective NA=1.45) a polychromatic radially polarized beam on an Au (20 nm)/SiO2 (500 nm)/Au (20 nm) sandwich structure. Without the need for angular or spectral scanning, the virtual spectral probe snapshots a wide operation range (n=1-1.42; λ=400-700 nm) of SPR excitation in a locally nanosized region. Combined with high-speed spectral analysis, a proof-of-concept scenario was given by monitoring a NaCl liquid concentration change in real time. The proposed scheme will certainly have a promising impact on the development of objective-based SPR sensors and biometric studies due to its rapidity and versatility.

  17. Examination of Spectral Transformations on Spectral Mixture Analysis

    NASA Astrophysics Data System (ADS)

    Deng, Y.; Wu, C.

    2018-04-01

    While many spectral transformation techniques have been applied in spectral mixture analysis (SMA), few studies have examined their necessity and applicability. This paper focuses on exploring the differences between spectrally transformed schemes and the untransformed scheme to find out which transformed schemes perform better in SMA. In particular, nine spectrally transformed schemes as well as the untransformed scheme were examined in two study areas. Each transformed scheme was tested 100 times using different endmember classes' spectra under the endmember model of vegetation-high albedo impervious surface area-low albedo impervious surface area-soil (V-ISAh-ISAl-S). The performance of each scheme was assessed based on the mean absolute error (MAE). A statistical analysis technique, the paired-samples t-test, was applied to test the significance of the difference in mean MAEs between the transformed and untransformed schemes. Results demonstrated that only NSMA could exceed the untransformed scheme in all study areas. Some transformed schemes showed unstable performance since they outperformed the untransformed scheme in one area but weakened the SMA result in another region.
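    The per-pixel machinery behind such MAE comparisons — solving for endmember abundances and scoring the spectral reconstruction — can be sketched as follows. Random endmember spectra and a heavily weighted sum-to-one row stand in for a full SMA, which would also enforce non-negativity of the abundances:

```python
import numpy as np

def unmix_sum_to_one(E, pixel, w=1e3):
    """Least-squares abundance estimate with an (approximate) sum-to-one
    constraint imposed by appending a heavily weighted row of ones."""
    A = np.vstack([E, w * np.ones(E.shape[1])])
    b = np.append(pixel, w)
    f, *_ = np.linalg.lstsq(A, b, rcond=None)
    return f

# Toy endmember matrix: bands x endmembers (V, ISA_h, ISA_l, S).
rng = np.random.default_rng(1)
E = rng.random((6, 4))
true_f = np.array([0.4, 0.3, 0.2, 0.1])   # ground-truth abundances
pixel = E @ true_f                         # noise-free mixed pixel
f = unmix_sum_to_one(E, pixel)

# Per-pixel model fit of the kind aggregated into each scheme's MAE.
mae = np.mean(np.abs(pixel - E @ f))
```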

  18. Efficient Hybrid Watermarking Scheme for Security and Transmission Bit Rate Enhancement of 3D Color-Plus-Depth Video Communication

    NASA Astrophysics Data System (ADS)

    El-Shafai, W.; El-Rabaie, S.; El-Halawany, M.; Abd El-Samie, F. E.

    2018-03-01

    Three-Dimensional Video-plus-Depth (3DV + D) comprises diverse video streams captured by different cameras around an object. Therefore, there is a great need for efficient compression to transmit and store the 3DV + D content in compressed form to meet future resource bounds whilst preserving a decisive reception quality. Also, the security of the transmitted 3DV + D is a critical issue for protecting its copyright content. This paper proposes an efficient hybrid watermarking scheme for securing the 3DV + D transmission, which is the homomorphic transform based Singular Value Decomposition (SVD) in the Discrete Wavelet Transform (DWT) domain. The objective of the proposed watermarking scheme is to increase the immunity of the watermarked 3DV + D to attacks and achieve adequate perceptual quality. Moreover, the proposed watermarking scheme reduces the transmission-bandwidth requirements for transmitting the color-plus-depth 3DV over limited-bandwidth wireless networks through embedding the depth frames into the color frames of the transmitted 3DV + D. Thus, it saves the transmission bit rate and subsequently enhances the channel bandwidth-efficiency. The performance of the proposed watermarking scheme is compared with those of the state-of-the-art hybrid watermarking schemes. The comparisons depend on both subjective visual results and objective results: the Peak Signal-to-Noise Ratio (PSNR) of the watermarked frames and the Normalized Correlation (NC) of the extracted watermark frames. Extensive simulation results on standard 3DV + D sequences have been conducted in the presence of attacks. The obtained results confirm that the proposed hybrid watermarking scheme is robust in the presence of attacks. It achieves not only very good perceptual quality with appreciable PSNR values and savings in the transmission bit rate, but also high correlation coefficient values in the presence of attacks compared to the existing hybrid watermarking schemes.

  19. Improvements in estimating proportions of objects from multispectral data

    NASA Technical Reports Server (NTRS)

    Horwitz, H. M.; Hyde, P. D.; Richardson, W.

    1974-01-01

    Methods for estimating proportions of objects and materials imaged within the instantaneous field of view of a multispectral sensor were developed further. Improvements in the basic proportion estimation algorithm were devised as well as improved alien object detection procedures. Also, a simplified signature set analysis scheme was introduced for determining the adequacy of signature set geometry for satisfactory proportion estimation. Averaging procedures used in conjunction with the mixtures algorithm were examined theoretically and applied to artificially generated multispectral data. A computationally simpler estimator was considered and found unsatisfactory. Experiments conducted to find a suitable procedure for setting the alien object threshold yielded little definitive result. Mixtures procedures were used on a limited amount of ERTS data to estimate wheat proportion in selected areas. Results were unsatisfactory, partly because of the ill-conditioned nature of the pure signature set.

  20. A scheme for recording a fast process at nanosecond scale by using digital holographic interferometry with continuous wave laser

    NASA Astrophysics Data System (ADS)

    Wang, Jun; Zhao, Jianlin; Di, Jianglei; Jiang, Biqiang

    2015-04-01

    A scheme for recording a fast process at the nanosecond scale by using digital holographic interferometry with a continuous wave (CW) laser is described and demonstrated experimentally. The scheme employs delay-time fibers and an angular multiplexing technique, and can realize variable temporal resolution at the nanosecond scale as well as different measured depths of the object field at a given temporal resolution. The actual delay time is controlled by two delay-time fibers with different lengths. The object field information in two different states can be simultaneously recorded in a composite hologram. This scheme is also suitable for recording fast processes at the picosecond scale by using an electro-optic modulator.

  1. Optimal Resource Allocation for NOMA-TDMA Scheme with α-Fairness in Industrial Internet of Things.

    PubMed

    Sun, Yanjing; Guo, Yiyu; Li, Song; Wu, Dapeng; Wang, Bin

    2018-05-15

    In this paper, a joint non-orthogonal multiple access and time division multiple access (NOMA-TDMA) scheme is proposed for the Industrial Internet of Things (IIoT), which allows multiple sensors to transmit in the same time-frequency resource block using NOMA. The user scheduling, time slot allocation, and power control are jointly optimized in order to maximize the system α-fair utility under a transmit power constraint and a minimum rate constraint. The optimization problem is nonconvex because of the fractional objective function and the nonconvex constraints. To deal with the original problem, we first convert the objective function into a difference of two convex functions (D.C.) form, and then propose a NOMA-TDMA-DC algorithm to find the global optimum. Numerical results show that the NOMA-TDMA scheme significantly outperforms the traditional orthogonal multiple access scheme in terms of both spectral efficiency and user fairness.
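    The α-fair utility being maximized has a standard closed form, which a short function makes concrete (the scheduling and power-control optimization around it is the hard part and is not reproduced here). α = 0 recovers sum-rate, α → 1 gives proportional fairness via the logarithm, and large α approaches max-min fairness:

```python
import numpy as np

def alpha_fair_utility(rates, alpha):
    """System alpha-fair utility: sum over users of
    U(r) = r^(1-alpha) / (1-alpha) for alpha != 1, and log(r) for alpha = 1."""
    rates = np.asarray(rates, dtype=float)
    if np.isclose(alpha, 1.0):
        return np.sum(np.log(rates))
    return np.sum(rates ** (1.0 - alpha) / (1.0 - alpha))

rates = [2.0, 1.0, 0.5]   # example per-user rates (arbitrary units)
```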

  2. Spatial eigensolution analysis of energy-stable flux reconstruction schemes and influence of the numerical flux on accuracy and robustness

    NASA Astrophysics Data System (ADS)

    Mengaldo, Gianmarco; De Grazia, Daniele; Moura, Rodrigo C.; Sherwin, Spencer J.

    2018-04-01

This study focuses on the dispersion and diffusion characteristics of high-order energy-stable flux reconstruction (ESFR) schemes via the spatial eigensolution analysis framework proposed in [1]. The analysis is performed for five ESFR schemes, where the parameter 'c' dictating the properties of the specific scheme recovered is chosen so that it spans the entire class of ESFR methods, also referred to as VCJH schemes, proposed in [2]. In particular, we use five values of 'c': two corresponding to its lower and upper bounds, and three identifying schemes linked to common high-order methods, namely ESFR schemes recovering two versions of the discontinuous Galerkin method and one recovering the spectral difference scheme. The performance of each scheme is assessed with different numerical intercell fluxes (e.g. different levels of upwinding), ranging from "under-" to "over-upwinding". In contrast to the more common temporal analysis, the spatial eigensolution analysis framework adopted here yields crucial insights into the diffusion and dispersion properties of FR schemes for problems with non-periodic boundary conditions, typical of open-flow problems including turbulence, unsteady aerodynamics and aeroacoustics.

  3. Image encryption based on a delayed fractional-order chaotic logistic system

    NASA Astrophysics Data System (ADS)

    Wang, Zhen; Huang, Xia; Li, Ning; Song, Xiao-Na

    2012-05-01

    A new image encryption scheme is proposed based on a delayed fractional-order chaotic logistic system. In the process of generating a key stream, the time-varying delay and fractional derivative are embedded in the proposed scheme to improve the security. Such a scheme is described in detail with security analyses including correlation analysis, information entropy analysis, run statistic analysis, mean-variance gray value analysis, and key sensitivity analysis. Experimental results show that the newly proposed image encryption scheme possesses high security.
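The key-stream idea can be illustrated with a rough sketch. This is not the paper's delayed fractional-order system; the plain integer-order logistic map stands in for it, and the seed and parameter values are arbitrary:

```python
def logistic_keystream(x0, r, n, discard=100):
    """Generate n key-stream bytes from the classical logistic map
    x_{k+1} = r * x_k * (1 - x_k). The paper's delayed fractional-order
    system is replaced by the plain map purely for illustration."""
    x = x0
    for _ in range(discard):               # skip transient iterations
        x = r * x * (1.0 - x)
    stream = []
    for _ in range(n):
        x = r * x * (1.0 - x)
        stream.append(int(x * 256) % 256)  # quantize the state to a byte
    return bytes(stream)

def xor_cipher(data, key):
    """Encrypt/decrypt by XOR-ing pixel bytes with the key stream."""
    return bytes(d ^ k for d, k in zip(data, key))

plain = bytes(range(16))                   # toy "image" of 16 pixels
key = logistic_keystream(x0=0.3141, r=3.99, n=len(plain))
cipher = xor_cipher(plain, key)
assert xor_cipher(cipher, key) == plain    # XOR is its own inverse
```

The security analyses listed in the abstract (correlation, entropy, key sensitivity) probe exactly this kind of construction: a good key stream makes the cipher bytes statistically independent of the plain image.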

  4. On the numerical treatment of nonlinear source terms in reaction-convection equations

    NASA Technical Reports Server (NTRS)

    Lafon, A.; Yee, H. C.

    1992-01-01

The objectives of this paper are to investigate how various numerical treatments of the nonlinear source term in a model reaction-convection equation affect the stability of steady-state numerical solutions, and to show under what conditions the conventional linearized analysis breaks down. The underlying goal is to provide some of the basic building blocks toward constructing suitable numerical schemes for hypersonic reacting flows, combustion, and certain turbulence models in compressible Navier-Stokes computations. It can be shown that nonlinear analysis uncovers much of the nonlinear behavior that linearized analysis is not capable of predicting in a model reaction-convection equation.

  5. Risk neutral second best toll pricing.

    DOT National Transportation Integrated Search

    2011-08-01

We propose a risk-neutral second-best toll pricing scheme to account for the possible non-uniqueness of user equilibrium solutions. The scheme is designed to optimize the expected objective value as the UE solution varies within the solution s...

  6. In-Space Radiator Shape Optimization using Genetic Algorithms

    NASA Technical Reports Server (NTRS)

    Hull, Patrick V.; Kittredge, Ken; Tinker, Michael; SanSoucie, Michael

    2006-01-01

Future space exploration missions will require the development of more advanced in-space radiators. These radiators should be highly efficient, lightweight, deployable heat rejection systems. Typical radiators for in-space heat mitigation commonly comprise a substantial portion of the total vehicle mass, so a mass savings of even 5-10% can greatly improve vehicle performance. The objective of this paper is to present detailed tools for the analysis and design of in-space radiators using evolutionary computation techniques. The optimality criterion is a two-dimensional radiator shape with the smallest mass for the greatest overall heat transfer; the end result is thus a set of highly functional radiator designs. This cross-disciplinary work combines topology optimization and thermal analysis by means of a genetic algorithm. The proposed design tool consists of the following steps: design parameterization based on the exterior boundary of the radiator, objective function definition (mass minimization and heat loss maximization), objective function evaluation via finite element analysis (thermal radiation analysis), and optimization based on evolutionary algorithms. The radiator design problem is defined as follows: the input is a driving temperature and the output is heat loss. Appropriate modeling of the space environment is added to capture its effect on the radiator. The design parameters chosen for this shape optimization problem fall into two classes: variable height along the width of the radiator, and a spline curve defining the material boundary of the radiator. Demonstrating convergence between the two design parameter schemes gives the user more confidence in the radiator optimization tool. The tool also allows the user to manipulate the driving temperature regions, permitting detailed design of in-space radiators for unique situations. Preliminary results indicate an optimized shape following the temperature distribution regions in the "cooler" portions of the radiator; the results closely follow the expected radiator shape.
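The GA loop described above (parameterize the boundary, evaluate an objective, evolve) can be sketched with a toy fitness in place of the finite-element thermal solve. The perimeter/area proxies, population size, and height bounds below are all illustrative assumptions, not the paper's values:

```python
import random

random.seed(42)

W = 16                    # stations across the radiator width
H_MIN, H_MAX = 0.1, 1.0   # allowed height at each station

def fitness(heights, weight=0.5):
    """Toy objective: reward exposed perimeter (heat-loss proxy) and
    penalize area (mass proxy). The real tool evaluates heat loss with a
    finite-element thermal-radiation solve; this proxy is illustrative."""
    area = sum(heights)
    perimeter = (heights[0] + heights[-1] + W
                 + sum(abs(a - b) for a, b in zip(heights, heights[1:])))
    return perimeter - weight * area

def evolve(pop_size=40, generations=60):
    pop = [[random.uniform(H_MIN, H_MAX) for _ in range(W)]
           for _ in range(pop_size)]
    for _ in range(generations):
        nxt = []
        for _ in range(pop_size):
            p1 = max(random.sample(pop, 2), key=fitness)   # tournament select
            p2 = max(random.sample(pop, 2), key=fitness)
            cut = random.randrange(1, W)
            child = p1[:cut] + p2[cut:]                    # 1-point crossover
            i = random.randrange(W)                        # Gaussian mutation
            child[i] = min(H_MAX, max(H_MIN,
                                      child[i] + random.gauss(0.0, 0.05)))
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = evolve()   # best height profile found under the toy objective
```

Swapping `fitness` for a call into a thermal solver is the only change needed to move from this sketch toward the tool the paper describes.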

  7. Improving Biometric-Based Authentication Schemes with Smart Card Revocation/Reissue for Wireless Sensor Networks.

    PubMed

    Moon, Jongho; Lee, Donghoon; Lee, Youngsook; Won, Dongho

    2017-04-25

User authentication in wireless sensor networks is more difficult than in traditional networks owing to sensor network characteristics such as unreliable communication, limited resources, and unattended operation. For these reasons, various authentication schemes have been proposed to provide secure and efficient communication. In 2016, Park et al. proposed a secure biometric-based authentication scheme with smart card revocation/reissue for wireless sensor networks. However, we found that their scheme is still insecure against impersonation attacks and has a problem in the smart card revocation/reissue phase. In this paper, we show how an adversary can impersonate a legitimate user or sensor node and perform illegal smart card revocation/reissue, and we prove that Park et al.'s scheme fails to provide secure revocation/reissue. In addition, we propose an enhanced scheme that provides efficiency as well as anonymity and security. Finally, we provide a security and performance analysis comparing the proposed scheme with previous schemes, and a formal analysis based on the random oracle model. The results prove that the proposed scheme resolves the impersonation attack and the other security flaws identified in the security analysis section. Furthermore, the performance analysis shows that its computational cost is lower than that of the previous scheme.

  8. Improving Biometric-Based Authentication Schemes with Smart Card Revocation/Reissue for Wireless Sensor Networks

    PubMed Central

    Moon, Jongho; Lee, Donghoon; Lee, Youngsook; Won, Dongho

    2017-01-01

User authentication in wireless sensor networks is more difficult than in traditional networks owing to sensor network characteristics such as unreliable communication, limited resources, and unattended operation. For these reasons, various authentication schemes have been proposed to provide secure and efficient communication. In 2016, Park et al. proposed a secure biometric-based authentication scheme with smart card revocation/reissue for wireless sensor networks. However, we found that their scheme is still insecure against impersonation attacks and has a problem in the smart card revocation/reissue phase. In this paper, we show how an adversary can impersonate a legitimate user or sensor node and perform illegal smart card revocation/reissue, and we prove that Park et al.'s scheme fails to provide secure revocation/reissue. In addition, we propose an enhanced scheme that provides efficiency as well as anonymity and security. Finally, we provide a security and performance analysis comparing the proposed scheme with previous schemes, and a formal analysis based on the random oracle model. The results prove that the proposed scheme resolves the impersonation attack and the other security flaws identified in the security analysis section. Furthermore, the performance analysis shows that its computational cost is lower than that of the previous scheme. PMID:28441331

  9. Changes in animal performance and profitability of Holstein dairy operations after introduction of crossbreeding with Montbéliarde, Normande, and Scandinavian Red.

    PubMed

    Dezetter, C; Bareille, N; Billon, D; Côrtes, C; Lechartier, C; Seegers, H

    2017-10-01

An individual-based mechanistic, stochastic, and dynamic simulation model was developed to assess the economic effects of introducing crossbreeding in Holstein dairy operations through changes in performance for milk yield and solid contents, reproduction, health, and replacement. Three crossbreeding schemes, Holstein × Montbéliarde, Holstein × Montbéliarde × Normande, and Holstein × Montbéliarde × Scandinavian Red, were implemented in Holstein dairy operations and compared with pure Holstein breeding. Sires were selected based on their estimated breeding value for milk. Two initial operations were simulated according to the prevalence (average or high) of reproductive and health disorders in the lactating herd. Evolution of the operations was simulated over 15 yr under 2 alternative managerial goals (constant number of cows or constant volume of milk sold). After 15 yr, breed percentages reached equilibrium for the 2-breed but not for the 3-breed schemes. After 5 yr of simulation, all 3 crossbreeding schemes reduced average milk yield per cow-year compared with the pure Holstein scheme. Changes in other animal performance measures (milk solid contents, reproduction, udder health, and longevity) were always in favor of the crossbreeding schemes. Under an objective of constant number of cows, crossbreeding schemes slightly increased the margin over variable costs, in average discounted value over the 15 yr of simulation, by up to €32/cow-year for an average prevalence of disorders. In operations with a high prevalence of disorders, crossbreeding schemes increased the margin over variable costs by up to €91/cow-year. Under an objective of constant volume of milk sold, crossbreeding schemes improved the margin over variable costs by up to €10/1,000 L (around €96/cow-year) for average prevalence of disorders, and by up to €13/1,000 L (around €117/cow-year) for high prevalence of disorders. Under an objective of constant number of cows, an unfavorable pricing context (milk price vs. concentrate price) slightly increased the positive effects of crossbreeding on the margin over variable costs. Under an objective of constant volume of milk, only very limited differences in margins were found between the breeding schemes. Our results, conditional on the parameterization values used here, suggest that dairy crossbreeding should be considered a relevant option for Holstein dairy operations with production levels up to 9,000 kg/cow-year in France, and possibly in other countries. Copyright © 2017 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  10. Moving towards universal coverage in South Africa? Lessons from a voluntary government insurance scheme

    PubMed Central

    Govender, Veloshnee; Chersich, Matthew F.; Harris, Bronwyn; Alaba, Olufunke; Ataguba, John E.; Nxumalo, Nonhlanhla; Goudge, Jane

    2013-01-01

Background In 2005, the South African government introduced a voluntary, subsidised health insurance scheme for civil servants. In light of the global emphasis on universal coverage, empirical evidence is needed to understand the relationship between new health financing strategies and health care access, thereby improving global understanding of these issues. Objectives This study analysed coverage of the South African government health insurance scheme, the population groups with low uptake, and the individual-level factors, as well as characteristics of the scheme, that influenced enrolment. Methods Multi-stage random sampling was used to select 1,329 civil servants from the health and education sectors in four of South Africa's nine provinces. They were interviewed to determine factors associated with enrolment in the scheme. The analysis included both descriptive statistics and multivariate logistic regression. Results Notwithstanding the availability of a non-contributory option within the insurance scheme and access to privately provided primary care, a considerable portion of socio-economically vulnerable groups remained uninsured (57.7% of the lowest salary category). Non-insurance was highest among men, black African or coloured ethnic groups, less educated and lower-income employees, and those living in informal housing. The relatively poor uptake of the contributory and non-contributory insurance options was mostly attributed to insufficient information, perceived administrative challenges of taking up membership, and payment costs. Conclusion Barriers to enrolment include insufficient information, unaffordability of payments and perceived administrative complexity. Achieving universal coverage requires good physical access to service providers and appropriate benefit options within pre-payment health financing mechanisms. PMID:23364093

  11. Effects of information, education, and communication campaign on a community-based health insurance scheme in Burkina Faso

    PubMed Central

    Cofie, Patience; De Allegri, Manuela; Kouyaté, Bocar; Sauerborn, Rainer

    2013-01-01

    Objective The study analysed the effect of Information, Education, and Communication (IEC) campaign activities on the adoption of a community-based health insurance (CHI) scheme in Nouna, Burkina Faso. It also identified the factors that enhanced or limited the campaign's effectiveness. Design Complementary data collection approaches were used. A survey was conducted with 250 randomly selected household heads, followed by in-depth interviews with 22 purposively selected community leaders, group discussions with the project management team, and field observations. Bivariate analysis and multivariate logistic regression models were used to assess the association between household exposure to campaign and acquisition of knowledge as well as household exposure to campaign and enrolment. Results The IEC campaign had a positive effect on households’ knowledge about the CHI and to a lesser extent on household enrolment in the scheme. The effectiveness of the IEC strategy was mainly influenced by: (1) frequent and consistent IEC messages from multiple media channels (mass and interpersonal channels), including the radio, a mobile information van, and CHI team, and (2) community heads’ participation in the CHI scheme promotion. Education was the only significantly influential socio-demographic determinant of knowledge and enrolment among household heads. The relatively low effects of the IEC campaign on CHI enrolment are indicative of other important IEC mediating factors, which should be taken into account in future CHI campaign evaluation. Conclusion The study concludes that an IEC campaign is crucial to improving the understanding of the CHI scheme concept, which is an enabler to enrolment, and should be integrated into scheme designs and evaluations. PMID:24314344

  12. An objective isobaric/isentropic technique for upper air analysis

    NASA Technical Reports Server (NTRS)

    Mancuso, R. L.; Endlich, R. M.; Ehernberger, L. J.

    1981-01-01

An objective meteorological analysis technique is presented whereby both horizontal and vertical upper-air analyses are performed. The same process is used to interpolate grid-point values from the upper-air station data for grid points on an isobaric surface and on a vertical cross-sectional plane. The nearby data surrounding each grid point are used in the interpolation by means of an anisotropic weighting scheme, which is described. The interpolation for a grid-point potential temperature is performed isobarically, whereas wind, mixing-ratio, and pressure-height values are interpolated from data that lie on the isentropic surface passing through the grid point. Two versions (A and B) of the technique are evaluated by qualitatively comparing computer analyses with subjective hand-drawn analyses. The version A products generally correspond fairly well with the subjective analyses and with the station data, and depict the structure of the upper fronts, tropopauses, and jet streams reasonably well. The version B products correspond more closely to the subjective analyses, and show the same strong gradients across the upper front with only minor smoothing.
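The weighting idea behind such objective analysis schemes can be sketched with an isotropic Gaussian weight; the paper's anisotropic weighting is not specified here, so this is a simplified stand-in with an arbitrary smoothing parameter:

```python
import math

def objective_analysis(grid_pts, obs, kappa=2.0):
    """Distance-weighted interpolation in the spirit of successive-correction
    schemes: each grid-point value is a Gaussian-weighted mean of nearby
    observations. obs is a list of (x, y, value) tuples; kappa sets the
    smoothing radius. (Isotropic stand-in for the paper's anisotropic weights.)"""
    analysis = []
    for gx, gy in grid_pts:
        wsum = vsum = 0.0
        for ox, oy, val in obs:
            w = math.exp(-((gx - ox) ** 2 + (gy - oy) ** 2) / kappa)
            wsum += w
            vsum += w * val
        analysis.append(vsum / wsum)
    return analysis

# A grid point midway between two equal-distance observations gets their mean.
print(objective_analysis([(0.5, 0.0)], [(0.0, 0.0, 1.0), (1.0, 0.0, 3.0)]))
```

A successive-correction scheme such as Barnes's repeats this step on the residuals with a progressively smaller kappa, sharpening the analysis on each pass.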

  13. The physical and empirical basis for a specific clear-air turbulence risk index

    NASA Technical Reports Server (NTRS)

    Keller, J. L.

    1985-01-01

An improved operational clear-air turbulence (CAT) detection and forecasting technique, the specific clear air turbulence risk (SCATR) index, is developed and detailed. The index shows some promising results. The improvements seen with hand-analyzed data, which result from a more realistic representation of the vertical shear of the horizontal wind, are also realized in the data analysis used in the PROFS/CWP application. The SCATR index should improve as database enhancements such as profiler and VAS satellite data, which increase the resolution in space and time, are brought into more sophisticated objective analysis schemes.

  14. Asymptotic analysis of SPTA-based algorithms for no-wait flow shop scheduling problem with release dates.

    PubMed

    Ren, Tao; Zhang, Chuan; Lin, Lin; Guo, Meiting; Xie, Xionghang

    2014-01-01

We address the scheduling problem for a no-wait flow shop to optimize total completion time with release dates. Using asymptotic analysis, we prove that the objective values of two SPTA-based algorithms converge to the optimal value for sufficiently large problems. To further enhance the performance of the SPTA-based algorithms, an improvement scheme based on local search is provided for moderate-scale problems. A new lower bound is presented for evaluating the asymptotic optimality of the algorithms. Numerical simulations demonstrate the effectiveness of the proposed algorithms.
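The shortest-processing-time idea underlying SPTA-based algorithms can be sketched on a single-machine relaxation with release dates; the no-wait flow-shop timing constraints are omitted and the job data are illustrative:

```python
def spt_with_release_dates(jobs):
    """SPT-based list scheduling on one machine with release dates:
    whenever the machine frees up, start the shortest available job.
    jobs: list of (release_date, processing_time) tuples. Returns the
    total completion time. (Single-machine simplification of the paper's
    no-wait flow shop; the SPT ordering idea is the same.)"""
    pending = sorted(jobs)          # sorted by release date
    t = 0.0                         # current time
    total = 0.0                     # sum of completion times
    avail = []                      # released but unscheduled jobs
    while pending or avail:
        while pending and pending[0][0] <= t:
            avail.append(pending.pop(0))
        if not avail:               # machine idles until the next release
            t = pending[0][0]
            continue
        avail.sort(key=lambda j: j[1])   # shortest processing time first
        _, p = avail.pop(0)
        t += p
        total += t
    return total

print(spt_with_release_dates([(0, 3), (0, 1), (2, 1)]))
```

The asymptotic-optimality argument in the paper says, roughly, that as the number of jobs grows, the gap between such an SPT ordering and the optimum vanishes relative to the objective value.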

  15. Asymptotic Analysis of SPTA-Based Algorithms for No-Wait Flow Shop Scheduling Problem with Release Dates

    PubMed Central

    Ren, Tao; Zhang, Chuan; Lin, Lin; Guo, Meiting; Xie, Xionghang

    2014-01-01

We address the scheduling problem for a no-wait flow shop to optimize total completion time with release dates. Using asymptotic analysis, we prove that the objective values of two SPTA-based algorithms converge to the optimal value for sufficiently large problems. To further enhance the performance of the SPTA-based algorithms, an improvement scheme based on local search is provided for moderate-scale problems. A new lower bound is presented for evaluating the asymptotic optimality of the algorithms. Numerical simulations demonstrate the effectiveness of the proposed algorithms. PMID:24764774

  16. A light and faster regional convolutional neural network for object detection in optical remote sensing images

    NASA Astrophysics Data System (ADS)

    Ding, Peng; Zhang, Ye; Deng, Wei-Jian; Jia, Ping; Kuijper, Arjan

    2018-07-01

Detection of objects in satellite optical remote sensing images is very important for many commercial and governmental applications. With the development of deep convolutional neural networks (deep CNNs), the field of object detection has seen tremendous advances, and objects in satellite remote sensing images can now be detected using deep CNNs. In general, optical remote sensing images contain many dense, small objects, and the original Faster Regional CNN (Faster R-CNN) framework does not yield a suitably high precision on them. Therefore, after careful analysis we adopt dense convolutional networks, a multi-scale representation, and various combinations of improvement schemes to enhance the structure of the base VGG16-Net and improve precision. We also propose an approach to reduce the detection time and memory requirements. To validate the effectiveness of our approach, we perform experiments using satellite remote sensing image datasets of aircraft and automobiles. The results show that the improved network structure can detect objects in satellite optical remote sensing images more accurately and efficiently.

  17. Air quality and passenger comfort in an air-conditioned bus micro-environment.

    PubMed

    Zhu, Xiaoxuan; Lei, Li; Wang, Xingshen; Zhang, Yinghui

    2018-04-12

In this study, passenger comfort and the air pollution status of the micro-environmental conditions in an air-conditioned bus were investigated through questionnaires, field measurements, and a numerical simulation. As a subjective analysis, passengers' perceptions of indoor environmental quality and comfort levels were determined from questionnaires. As an objective analysis, a numerical simulation was conducted using a discrete phase model to determine the diffusion and distribution of pollutants, including particulate matter with a diameter < 10 μm (PM10), and verified against experimental results. The results revealed poor air quality and unsatisfactory thermal comfort conditions in Jinan's air-conditioned bus system. To solve these problems, three scenarios (schemes A, B, and C) were designed to alter the ventilation parameters. According to the results of an improved simulation of these scenarios, reducing or adding air outputs would shorten the time taken to reach steady-state conditions and weaken the airflow or lower the temperature in the cabin. The airflow pathway was closely related to the layout of the air conditioning. Scheme B lowered the temperature by 0.4 K and reduced the airflow by 0.01 m/s, while scheme C reduced the volume concentration of PM10 to 150 μg/m³. Changing the air supply angle could further improve the airflow and reduce the concentration of PM10. With regard to the perception of airflow and thermal comfort, the scheme with airflow provided by a 60° nozzle was considered best, and the concentration of PM10 was reduced to 130 μg/m³.

  18. On the time-splitting scheme used in the Princeton Ocean Model

    NASA Astrophysics Data System (ADS)

    Kamenkovich, V. M.; Nechaev, D. A.

    2009-05-01

    The analysis of the time-splitting procedure implemented in the Princeton Ocean Model (POM) is presented. The time-splitting procedure uses different time steps to describe the evolution of interacting fast and slow propagating modes. In the general case the exact separation of the fast and slow modes is not possible. The main idea of the analyzed procedure is to split the system of primitive equations into two systems of equations for interacting external and internal modes. By definition, the internal mode varies slowly and the crux of the problem is to determine the proper filter, which excludes the fast component of the external mode variables in the relevant equations. The objective of this paper is to examine properties of the POM time-splitting procedure applied to equations governing the simplest linear non-rotating two-layer model of constant depth. The simplicity of the model makes it possible to study these properties analytically. First, the time-split system of differential equations is examined for two types of the determination of the slow component based on an asymptotic approach or time-averaging. Second, the differential-difference scheme is developed and some criteria of its stability are discussed for centered, forward, or backward time-averaging of the external mode variables. Finally, the stability of the POM time-splitting schemes with centered and forward time-averaging is analyzed. The effect of the Asselin filter on solutions of the considered schemes is studied. It is assumed that questions arising in the analysis of the simplest model are inherent in the general model as well.
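The Robert-Asselin time filter at the heart of such leapfrog time-averaging schemes can be sketched on the scalar oscillation equation du/dt = iωu, a standard test problem for leapfrog stability; the parameter values below are illustrative:

```python
def leapfrog_asselin(omega=1.0, dt=0.1, steps=200, nu=0.05):
    """Leapfrog integration of du/dt = i*omega*u with the Robert-Asselin
    filter, which damps the spurious computational mode that leapfrog's
    two interleaved time levels would otherwise sustain:
        u_filt[n] = u[n] + nu * (u_filt[n-1] - 2*u[n] + u[n+1])
    """
    import cmath
    u_prev = 1.0 + 0.0j                  # filtered value at level n-1
    u_curr = cmath.exp(1j * omega * dt)  # exact value at level n
    for _ in range(steps):
        u_next = u_prev + 2.0 * dt * (1j * omega * u_curr)   # leapfrog step
        u_prev = u_curr + nu * (u_prev - 2.0 * u_curr + u_next)  # filter
        u_curr = u_next
    return u_curr

u = leapfrog_asselin()
# |u| stays near 1: the filter suppresses the computational mode at the
# cost of slightly damping the physical oscillation as well.
```

This slight damping of the physical mode is exactly the filter side effect whose influence on the POM time-splitting schemes the paper analyzes.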

  19. Analysis of biological time-lapse microscopic experiment from the point of view of the information theory.

    PubMed

    Štys, Dalibor; Urban, Jan; Vaněk, Jan; Císař, Petr

    2011-06-01

We report an objective analysis of the information in the microscopic image of a cell monolayer. The transfer of information about the cell through the microscope is analyzed in terms of the classical Shannon information transfer scheme: the information source is the biological object, the information transfer channel is the whole microscope including the camera chip, and the destination is the model of the biological system. The information contribution of a point is analyzed relative to the overall information in the image. We thereby obtain an information reflection of the biological object, which is transformed into the biological model, the destination in information terminology. This, we propose, should be constructed as state transitions in individual cells modulated by information bonds between the cells. We show examples of detected cell states in a multidimensional state space, reflected in a colour-channel-intensity phenomenological state space. We have also observed information bonds and show examples of them.
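The Shannon quantities involved can be sketched at their simplest: the histogram entropy of an intensity channel as the "overall information in the image", and the self-information of one intensity value as a point's contribution. This is a simplification of the paper's analysis, and the toy pixel data are illustrative:

```python
import math
from collections import Counter

def channel_entropy(pixels):
    """Shannon entropy (bits/pixel) of an intensity channel, estimated
    from the empirical histogram."""
    counts = Counter(pixels)
    n = len(pixels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def point_information(pixels, value):
    """Self-information -log2 p(value): the information a single point's
    intensity carries relative to the whole image."""
    p = pixels.count(value) / len(pixels)
    return -math.log2(p)

img = [0, 0, 0, 0, 1, 1, 2, 3]        # toy 8-pixel channel
print(channel_entropy(img))            # 1.75 bits/pixel
print(point_information(img, 3))       # rare intensity -> 3.0 bits
```

Rare intensities carry the most information, which is why the pointwise contribution analysis described above highlights unusual structures in the monolayer.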

  20. Analysis of biological time-lapse microscopic experiment from the point of view of the information theory.

    PubMed

    Stys, Dalibor; Urban, Jan; Vanek, Jan; Císar, Petr

    2010-07-01

We report an objective analysis of the information in the microscopic image of a cell monolayer. The transfer of information about the cell through the microscope is analyzed in terms of the classical Shannon information transfer scheme: the information source is the biological object, the information transfer channel is the whole microscope including the camera chip, and the destination is the model of the biological system. The information contribution of a point is analyzed relative to the overall information in the image. We thereby obtain an information reflection of the biological object, which is transformed into the biological model, the destination in information terminology. This, we propose, should be constructed as state transitions in individual cells modulated by information bonds between the cells. We show examples of detected cell states in a multidimensional state space, reflected in a colour-channel-intensity phenomenological state space. We have also observed information bonds and show examples of them. Copyright 2010 Elsevier Ltd. All rights reserved.

  1. A design study for an advanced ocean color scanner system. [spaceborne equipment

    NASA Technical Reports Server (NTRS)

    Kim, H. H.; Fraser, R. S.; Thompson, L. L.; Bahethi, O.

    1980-01-01

    Along with a colorimetric data analysis scheme, the instrumental parameters which need to be optimized in future spaceborne ocean color scanner systems are outlined. With regard to assessing atmospheric effects from ocean colorimetry, attention is given to computing size parameters of the aerosols in the atmosphere, total optical depth measurement, and the aerosol optical thickness. It is suggested that sensors based on the use of linear array technology will meet hardware objectives.

  2. Fringe pattern demodulation with a two-frame digital phase-locked loop algorithm.

    PubMed

    Gdeisat, Munther A; Burton, David R; Lalor, Michael J

    2002-09-10

A novel technique called a two-frame digital phase-locked loop for fringe pattern demodulation is presented. In this scheme, two fringe patterns with different spatial carrier frequencies are grabbed for an object. A digital phase-locked loop algorithm tracks and demodulates the phase difference between the two fringe patterns, employing the wrapped phase components of one fringe pattern as a reference to demodulate the second. The desired phase information can be extracted from the demodulated phase difference. We tested the algorithm experimentally using real fringe patterns. The technique is shown to be suitable for noncontact measurement of objects with rapid surface variations, and it outperforms the Fourier fringe analysis technique in this respect. Phase maps produced with this algorithm are, however, noisier than those generated with the Fourier fringe analysis technique.

  3. Development of generalized pressure velocity coupling scheme for the analysis of compressible and incompressible combusting flows

    NASA Technical Reports Server (NTRS)

    Chen, C. P.; Wu, S. T.

    1992-01-01

The objective of this investigation has been to develop algorithms that improve the accuracy and efficiency of computational fluid dynamics (CFD) models used to study the fundamental physics of combustion chamber flows, which is ultimately necessary for the design of propulsion systems such as the SSME and STME. During this three-year study (May 19, 1978 - May 18, 1992), a unique algorithm was developed for all-speed flows. The newly developed algorithm combines two pressure-based algorithms, PISOC and FICE: PISOC is a non-iterative scheme with characteristic advantages for low- and high-speed flows, while the modified FICE (MFICE) is an iterative scheme that has shown its efficiency and accuracy for flows in the transonic region. The combined algorithm applies generally to both time-accurate and steady-state flows, and was tested extensively for various flow conditions, such as turbulent flows, chemically reacting flows, and multiphase flows.

  4. Ability to Pay for Future National Health Financing Scheme among Malaysian Households.

    PubMed

    Aizuddin, Azimatun Noor; Aljunid, Syed Mohamed

Malaysia is no exception to the global challenge of health care financing. The objective of the present study was to assess the ability to pay among Malaysian households in preparation for a future national health financing scheme. This was a cross-sectional study involving representative samples of 774 households in Peninsular Malaysia. A majority of households were found to have the ability to pay for their health care. Household expenditure on health care per month was between MYR 1 and MYR 2,000, with a mean (standard deviation [SD]) of 73.54 (142.66), corresponding to between 0.05% and 50% of monthly income, with a mean (SD) of 2.74 (5.20). The final analysis indicated that ability to pay was significantly higher among younger and higher-income households. Sociodemographic and socioeconomic status are important eligibility factors to be considered in planning the proposed national health care financing scheme so as to shield the needy group from catastrophic health expenditures. Copyright © 2017 Icahn School of Medicine at Mount Sinai. Published by Elsevier Inc. All rights reserved.

  5. Modern Foreign Language Teachers--Don't Leave Those Kids Alone! Linguistic-Cultural "Give and Take" in an Ad-Hoc Tutoring Scheme

    ERIC Educational Resources Information Center

    Leroy, Norah

    2017-01-01

    This paper addresses the theme of social inclusion through language learning. The focus is on an ad-hoc tutoring scheme set up between newly arrived British migrant pupils and French monolingual pupils in a small secondary school in the south-west of France. Though the original objective of this tutoring scheme was to improve the English skills of…

  6. Analytic redundancy management for SCOLE

    NASA Technical Reports Server (NTRS)

    Montgomery, Raymond C.

    1988-01-01

    The objective of this work is to develop a practical sensor analytic redundancy management scheme for flexible spacecraft and to demonstrate it using the SCOLE experimental apparatus. The particular scheme to be used is taken from previous work on the Grid apparatus by Williams and Montgomery.

  7. A motion artefact study and locally deforming objects in computerized tomography

    NASA Astrophysics Data System (ADS)

    Hahn, Bernadette N.

    2017-11-01

    Movements of the object during the data collection in computerized tomography can introduce motion artefacts in the reconstructed image. They can be reduced by employing information about the dynamic behaviour within the reconstruction step. However, inaccuracies concerning the movement are inevitable in practice. In this article, we give an explicit characterization of what is visible in an image obtained by a reconstruction algorithm with incorrect motion information. Then, we use this result to study in detail the situation of locally deforming objects, i.e. individual parts of the object have a different dynamic behaviour. In this context, we prove that additional artefacts arise due to the global nature of the Radon transform, even if the motion is exactly known. Based on our analysis, we propose a numerical scheme to reduce these artefacts in the reconstructed image. All our results are illustrated by numerical examples.

  8. Whole-system evaluation research of a scheme to support inner city recruitment and retention of GPs.

    PubMed

    Bellman, Loretta

    2002-12-01

    The GP Assistant/Research Associate scheme developed in the Guy's, King's and St Thomas' School of Medicine, London, aims to attract and recruit young GPs (GP Assistants) and develop their commitment to work in local inner city practices. Continuing professional development for both young and established GPs is a key feature of the scheme. The objectives of the whole-system evaluation research were to explore the perspectives of 34 stakeholders in the academic department, the practices and the PCGs, and to investigate the experiences of 19 GP Assistants who have participated in the scheme. Qualitative methods included semi-structured interviews, non-participant observations in the practices, audio-taped meetings and personal journals. Data collection also included reviewing documentation of the scheme, i.e. the previous quantitative evaluation report, publications and e-mails. The multi-method approach enabled individual, group and team perspectives of the scheme and triangulation of the data through comparing dialogue with observations and documentary evidence. Thematic analysis was undertaken to elicit the complex experiences of the GP Assistants. Wide-ranging findings included enthusiastic support for the continuation of the scheme. The GP Assistants' personal and professional development was clearly evident from the themes 'eye opener', new knowledge, managing multiple roles, feeling vulnerable, time constraints and empowering processes. Seven of the GP Assistants have become partners and ten chose to remain working in local practices. Significant challenges for managing and leading the scheme were apparent. Greater co-operation and collaborative working between the academic department and the practices is required. The scheme provides a highly valued visible means of support for GPs and could act as a model for a career pathway aimed at enhancing recruitment and retention of GPs. 
The scheme is also at the forefront of national initiatives aimed at supporting single-handed practices and helping GPs with their continuing professional development. An integrated approach to change, education, research and development is advocated to enable recruitment and retention of GPs, their academic development, and to underpin the evolution of PCTs as learning organizations.

  9. A Classification Scheme for Analyzing Mobile Apps Used to Prevent and Manage Disease in Late Life

    PubMed Central

    Wang, Aiguo; Lu, Xin; Chen, Hongtu; Li, Changqun; Levkoff, Sue

    2014-01-01

    Background There are several mobile apps that offer tools for disease prevention and management among older adults, and promote health behaviors that could potentially reduce or delay the onset of disease. A classification scheme that categorizes apps could be useful to both older adult app users and app developers. Objective The objective of our study was to build and evaluate the effectiveness of a classification scheme that classifies mobile apps available for older adults in the “Health & Fitness” category of the iTunes App Store. Methods We constructed a classification scheme for mobile apps according to three dimensions: (1) the Precede-Proceed Model (PPM), which classifies mobile apps in terms of predisposing, enabling, and reinforcing factors for behavior change; (2) health care process, specifically prevention versus management of disease; and (3) health conditions, including physical health and mental health. Content analysis was conducted by the research team on health and fitness apps designed specifically for older adults, as well as those applicable to older adults, released during the months of June and August 2011 and August 2012. Face validity was assessed by a separate group of individuals unrelated to the study. A reliability analysis was conducted to confirm the accuracy of the coding of the sample apps in this study. Results After applying sample inclusion and exclusion criteria, a total of 119 apps were included in the study sample, of which 26/119 (21.8%) were released in June 2011, 45/119 (37.8%) in August 2011, and 48/119 (40.3%) in August 2012. Face validity was determined by interviewing 11 people, who agreed that the scheme accurately reflected the nature of the applications. The entire study sample was successfully coded, demonstrating satisfactory inter-rater reliability between two independent coders (95.8% initial concordance and 100% concordance after consensus was reached). 
The apps included in the study sample were more likely to be used for the management of disease than prevention of disease (109/119, 91.6% vs 15/119, 12.6%). More apps contributed to physical health rather than mental health (81/119, 68.1% vs 47/119, 39.5%). Enabling apps (114/119, 95.8%) were more common than reinforcing (20/119, 16.8%) or predisposing apps (10/119, 8.4%). Conclusions The findings, including face validity and inter-rater reliability, support the integrity of the proposed classification scheme for categorizing mobile apps for older adults in the “Health and Fitness” category available in the iTunes App Store. Using the proposed classification system, older adult app users would be better positioned to identify apps appropriate for their needs, and app developers would be able to obtain the distributions of available mobile apps for health-related concerns of older adults more easily. PMID:25098687
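    As a point of reference for the reliability figures above, percent concordance between two independent coders is straightforward to compute; the category labels and codings in the sketch below are hypothetical illustrations, not data from the study.

```python
# Percent concordance: share of items on which two coders agree.
# Codings are hypothetical; the study reports 95.8% initial concordance
# over 119 apps, rising to 100% after consensus.

def percent_concordance(coder_a, coder_b):
    """Fraction of items assigned the same category by both coders."""
    assert len(coder_a) == len(coder_b)
    agree = sum(a == b for a, b in zip(coder_a, coder_b))
    return agree / len(coder_a)

coder_a = ["enabling", "enabling", "reinforcing", "predisposing", "enabling"]
coder_b = ["enabling", "enabling", "reinforcing", "enabling", "enabling"]
print(percent_concordance(coder_a, coder_b))  # 4 of 5 agree -> 0.8
```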

  10. The performance of RegCM4 over the Central America and Caribbean region using different cumulus parameterizations

    NASA Astrophysics Data System (ADS)

    Martínez-Castro, Daniel; Vichot-Llano, Alejandro; Bezanilla-Morlot, Arnoldo; Centella-Artola, Abel; Campbell, Jayaka; Giorgi, Filippo; Viloria-Holguin, Cecilia C.

    2018-06-01

    A sensitivity study of the performance of the RegCM4 regional climate model driven by the ERA-Interim reanalysis is conducted for the Central America and Caribbean region. A set of numerical experiments is completed using four configurations of the model with a horizontal grid spacing of 25 km for a period of 6 years (1998-2003), based on three of the convective parameterization schemes implemented in the model: the Emanuel scheme, the Grell-over-land/Emanuel-over-ocean scheme, and the Tiedtke scheme in two configurations. The objective of the study is to investigate the ability of each configuration to reproduce different characteristics of the temperature, circulation and precipitation fields for the dry and rainy seasons. All schemes simulate the general temperature and precipitation patterns over land reasonably well, with relatively high correlations against observational datasets, though specific regions show positive or negative biases, which are greater in the rainy season. We also focus on circulation features relevant for the region, such as the Caribbean low-level jet and sea-breeze circulations over islands, which the model simulates with varied performance across the different configurations. We find that no model configuration performs best for all the analysis criteria selected, but the Tiedtke configurations, which include the capability of tuning in particular the exchanges between cloud and environment air, provide the most balanced range of biases across variables, with no outstanding systematic bias emerging.

  11. Mixed biodiversity benefits of agri-environment schemes in five European countries.

    PubMed

    Kleijn, D; Baquero, R A; Clough, Y; Díaz, M; De Esteban, J; Fernández, F; Gabriel, D; Herzog, F; Holzschuh, A; Jöhl, R; Knop, E; Kruess, A; Marshall, E J P; Steffan-Dewenter, I; Tscharntke, T; Verhulst, J; West, T M; Yela, J L

    2006-03-01

    Agri-environment schemes are an increasingly important tool for the maintenance and restoration of farmland biodiversity in Europe but their ecological effects are poorly known. Scheme design is partly based on non-ecological considerations and poses important restrictions on evaluation studies. We describe a robust approach to evaluate agri-environment schemes and use it to evaluate the biodiversity effects of agri-environment schemes in five European countries. We compared species density of vascular plants, birds, bees, grasshoppers and crickets, and spiders on 202 paired fields, one with an agri-environment scheme, the other conventionally managed. In all countries, agri-environment schemes had marginal to moderately positive effects on biodiversity. However, uncommon species benefited in only two of five countries and species listed in Red Data Books rarely benefited from agri-environment schemes. Scheme objectives may need to differentiate between biodiversity of common species that can be enhanced with relatively simple modifications in farming practices and diversity or abundance of endangered species which require more elaborate conservation measures.

  12. Factors affecting sustainability of rural water schemes in Swaziland

    NASA Astrophysics Data System (ADS)

    Peter, Graciana; Nkambule, Sizwe E.

    The Millennium Development Goal (MDG) target to reduce the proportion of people without sustainable access to safe drinking water by the year 2015 was met as of 2010, but huge disparities exist. Some regions, particularly Sub-Saharan Africa, are lagging behind; it is also in this region that up to 30% of the rural schemes are not functional at any given time. There is a need for more studies on the factors affecting sustainability and on the measures which, when implemented, will improve the sustainability of rural water schemes. The main objective of this study was to assess the main factors affecting the sustainability of rural water schemes in Swaziland using a Multi-Criteria Analysis Approach. The main factors considered were financial, social, technical, environmental and institutional. The study was done in the Lubombo region. Fifteen functional water schemes in 11 communities were studied. Data were collected using questionnaires, a checklist and a focus group discussion guide. A total of 174 heads of households were interviewed. The Statistical Package for the Social Sciences (SPSS) was used to analyse the data, to calculate sustainability scores for the water schemes, and to classify the scores into sustainability categories: sustainable, partially sustainable and non-sustainable. The averages of the ratings for the different sub-factors studied and the sustainability scores of the sustainable, partially sustainable and non-sustainable schemes were then computed and compared to establish the main factors influencing the sustainability of the water schemes. The results indicated technical and social factors as most critical, while financial and institutional factors, although important, played a lesser role. 
Factors which contributed to the sustainability of water schemes were: functionality; design flow; water fetching time; ability to meet additional demand; use by population; equity; participation in decision making on operation and maintenance; existence of fund for operation and maintenance; willingness to contribute money; existence of a user’s committee; participation in the initial planning and design of the water scheme; and coordination between the local leaders and user’s committee. The main factors which made the schemes unsustainable were: long fetching time; non-involvement in decision making; lack of willingness to contribute funds; absence of users committee; and lack of cooperation between local leaders and the users committee. Water service providers should address the technical, social, financial and institutional factors identified affecting sustainability in their planning and implementation of rural water schemes.
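    A weighted multi-criteria score of the kind described can be sketched as follows; the factor weights, ratings and category thresholds here are illustrative assumptions, not values taken from the study.

```python
# Sustainability score as a weighted average of factor ratings (0-1).
# Weights and thresholds below are illustrative, not the study's values.

FACTOR_WEIGHTS = {"technical": 0.3, "social": 0.3, "financial": 0.15,
                  "institutional": 0.15, "environmental": 0.1}

def sustainability_score(ratings):
    """Weighted average of per-factor ratings on a 0-1 scale."""
    return sum(FACTOR_WEIGHTS[f] * r for f, r in ratings.items())

def classify(score):
    if score >= 0.7:
        return "sustainable"
    if score >= 0.4:
        return "partially sustainable"
    return "non-sustainable"

scheme = {"technical": 0.8, "social": 0.9, "financial": 0.5,
          "institutional": 0.6, "environmental": 0.7}
s = sustainability_score(scheme)
print(classify(s))  # score 0.745 -> sustainable
```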

  13. Buffer Management Simulation in ATM Networks

    NASA Technical Reports Server (NTRS)

    Yaprak, E.; Xiao, Y.; Chronopoulos, A.; Chow, E.; Anneberg, L.

    1998-01-01

    This paper presents a simulation of a new dynamic buffer allocation management scheme in ATM networks. To achieve this objective, an algorithm that detects congestion and updates the dynamic buffer allocation scheme was developed for the OPNET simulation package via the creation of a new ATM module.
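    A generic threshold-style policy of this kind (not the authors' OPNET implementation, which the abstract does not detail) might look like:

```python
# Schematic dynamic buffer allocation: per-VC (virtual circuit) quotas
# are recomputed when a congestion test fires. A generic illustration,
# not the algorithm from the paper.

def detect_congestion(occupancy, capacity, threshold=0.8):
    """Flag congestion when overall buffer occupancy exceeds a threshold."""
    return occupancy / capacity > threshold

def reallocate(quotas, occupancies, capacity):
    """Shift quota toward the more heavily loaded connections."""
    total = sum(occupancies.values()) or 1
    return {vc: round(capacity * occupancies[vc] / total) for vc in quotas}

quotas = {"vc1": 50, "vc2": 50}
occupancies = {"vc1": 70, "vc2": 14}
capacity = 100
if detect_congestion(sum(occupancies.values()), capacity):
    quotas = reallocate(quotas, occupancies, capacity)
print(quotas)  # -> {'vc1': 83, 'vc2': 17}
```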

  14. An energy-efficient transmission scheme for real-time data in wireless sensor networks.

    PubMed

    Kim, Jin-Woo; Barrado, José Ramón Ramos; Jeon, Dong-Keun

    2015-05-20

    The Internet of Things (IoT) is a novel paradigm in which all things or objects in daily life can communicate with other devices and provide services over the Internet. Things or objects need identifying, sensing, networking and processing capabilities to make the IoT paradigm a reality. The IEEE 802.15.4 standard is one of the main communication protocols proposed for the IoT. It provides the guaranteed time slot (GTS) mechanism, which supports quality of service (QoS) for real-time data transmission. In spite of these QoS features, the problem of end-to-end delay remains. In order to solve this problem, we propose a cooperative medium access control (MAC) protocol for real-time data transmission. We also evaluate the performance of the proposed scheme through simulation. The simulation results demonstrate that the proposed scheme can improve the network performance.
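    For orientation, a GTS in IEEE 802.15.4 occupies whole slots of the 16-slot superframe; the timing arithmetic below uses the standard's constants for the 2.4 GHz PHY (aBaseSuperframeDuration = 960 symbols, 16 µs per symbol).

```python
# IEEE 802.15.4 superframe timing (2.4 GHz O-QPSK PHY): 16 equal slots
# per active superframe; a GTS spans one or more whole slots.

SYMBOL_US = 16                  # 62.5 ksymbols/s -> 16 us per symbol
BASE_SUPERFRAME_SYMBOLS = 960   # aBaseSuperframeDuration

def superframe_duration_us(SO):
    """Active superframe duration for superframe order SO."""
    return BASE_SUPERFRAME_SYMBOLS * (2 ** SO) * SYMBOL_US

def gts_duration_us(SO, slots):
    """Duration of a GTS spanning `slots` of the 16 superframe slots."""
    return superframe_duration_us(SO) // 16 * slots

print(superframe_duration_us(0))  # 15360 us
print(gts_duration_us(0, 2))      # 1920 us
```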

  15. Scheduling Future Water Supply Investments Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Huskova, I.; Matrosov, E. S.; Harou, J. J.; Kasprzyk, J. R.; Reed, P. M.

    2014-12-01

    Uncertain hydrological impacts of climate change, population growth and institutional changes pose a major challenge to the planning of water supply systems. Planners seek optimal portfolios of supply and demand management schemes, but must also decide when to activate assets, whilst considering many system goals and plausible futures. Incorporating scheduling into the planning-under-uncertainty problem strongly increases its complexity. We investigate approaches to scheduling with many-objective heuristic search. We apply a multi-scenario many-objective scheduling approach to the Thames River basin water supply system planning problem in the UK. Decisions include which new supply and demand schemes to implement, at what capacity and when. The impact of different system uncertainties on scheme implementation schedules is explored, i.e. how the choice of future scenarios affects the search process and its outcomes. The activation of schemes is influenced by the occurrence of extreme hydrological events in the ensemble of plausible scenarios, among other factors. The approach and results are compared with a previous study that addressed only the portfolio problem (without scheduling).

  16. An Energy-Efficient Transmission Scheme for Real-Time Data in Wireless Sensor Networks

    PubMed Central

    Kim, Jin-Woo; Barrado, José Ramón Ramos; Jeon, Dong-Keun

    2015-01-01

    The Internet of Things (IoT) is a novel paradigm in which all things or objects in daily life can communicate with other devices and provide services over the Internet. Things or objects need identifying, sensing, networking and processing capabilities to make the IoT paradigm a reality. The IEEE 802.15.4 standard is one of the main communication protocols proposed for the IoT. It provides the guaranteed time slot (GTS) mechanism, which supports quality of service (QoS) for real-time data transmission. In spite of these QoS features, the problem of end-to-end delay remains. In order to solve this problem, we propose a cooperative medium access control (MAC) protocol for real-time data transmission. We also evaluate the performance of the proposed scheme through simulation. The simulation results demonstrate that the proposed scheme can improve the network performance. PMID:26007722

  17. A metadata schema for data objects in clinical research.

    PubMed

    Canham, Steve; Ohmann, Christian

    2016-11-24

    A large number of stakeholders have accepted the need for greater transparency in clinical research and, in the context of various initiatives and systems, have developed a diverse and expanding number of repositories for storing the data and documents created by clinical studies (collectively known as data objects). To make the best use of such resources, we assert that it is also necessary for stakeholders to agree and deploy a simple, consistent metadata scheme. The relevant data objects and their likely storage are described, and the requirements for metadata to support data sharing in clinical research are identified. Issues concerning persistent identifiers, for both studies and data objects, are explored. A scheme is proposed that is based on the DataCite standard, with extensions to cover the needs of clinical researchers, specifically to provide (a) study identification data, including links to clinical trial registries; (b) data object characteristics and identifiers; and (c) data covering location, ownership and access to the data object. The components of the metadata scheme are described. The metadata schema is proposed as a natural extension of a widely agreed standard to fill a gap not tackled by other standards related to clinical research (e.g., Clinical Data Interchange Standards Consortium, Biomedical Research Integrated Domain Group). The proposal could be integrated with, but is not dependent on, other moves to better structure data in clinical research.

  18. Establishing Long-Term Efficacy in Chronic Disease: Use of Recursive Partitioning and Propensity Score Adjustment to Estimate Outcome in MS

    PubMed Central

    Goodin, Douglas S.; Jones, Jason; Li, David; Traboulsee, Anthony; Reder, Anthony T.; Beckmann, Karola; Konieczny, Andreas; Knappertz, Volker

    2011-01-01

    Context Establishing the long-term benefit of therapy in chronic diseases has been challenging. Long-term studies require non-randomized designs and, thus, are often confounded by biases. For example, although disease-modifying therapy in MS has a convincing benefit on several short-term outcome-measures in randomized trials, its impact on long-term function remains uncertain. Objective Data from the 16-year Long-Term Follow-up study of interferon-beta-1b is used to assess the relationship between drug-exposure and long-term disability in MS patients. Design/Setting To mitigate the bias of outcome-dependent exposure variation in non-randomized long-term studies, drug-exposure was measured as the medication-possession-ratio, adjusted up or down according to multiple different weighting-schemes based on MS severity and MS duration at treatment initiation. A recursive-partitioning algorithm assessed whether exposure (using any weighting scheme) affected long-term outcome. The optimal cut-point used to define “high” or “low” exposure-groups was chosen by the algorithm. Subsequent to verification of an exposure-impact that included all predictor variables, the two groups were compared using a weighted propensity-stratified analysis in order to mitigate any treatment-selection bias that may have been present. Finally, multiple sensitivity-analyses were undertaken using different definitions of long-term outcome and different assumptions about the data. Main Outcome Measure Long-Term Disability. Results In these analyses, the same weighting-scheme was consistently selected by the recursive-partitioning algorithm. This scheme reduced (down-weighted) the effectiveness of drug exposure as either disease duration or disability at treatment-onset increased. 
Applying this scheme and using propensity-stratification to further mitigate bias, high-exposure had a consistently better clinical outcome compared to low-exposure (Cox proportional hazard ratio = 0.30–0.42; p<0.0001). Conclusions Early initiation and sustained use of interferon-beta-1b has a beneficial impact on long-term outcome in MS. Our analysis strategy provides a methodological framework for bias-mitigation in the analysis of non-randomized clinical data. Trial Registration Clinicaltrials.gov NCT00206635 PMID:22140424
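    The exposure measure described above starts from the medication-possession-ratio; the sketch below pairs it with a hypothetical down-weighting by disease duration and baseline disability, since the abstract does not give the selected scheme's actual coefficients.

```python
# Medication-possession-ratio (MPR) with an illustrative down-weighting
# by disease duration and baseline disability. The weighting function is
# a hypothetical stand-in for the scheme selected in the study.

def mpr(days_supplied, days_observed):
    """Share of the observation period covered by medication, capped at 1."""
    return min(days_supplied / days_observed, 1.0)

def weighted_exposure(days_supplied, days_observed, duration_years, edss):
    # Down-weight exposure as duration or disability at treatment onset
    # increases (illustrative linear discount, floored at zero).
    weight = max(0.0, 1.0 - 0.02 * duration_years - 0.05 * edss)
    return mpr(days_supplied, days_observed) * weight

e = weighted_exposure(days_supplied=2900, days_observed=3650,
                      duration_years=5, edss=2.0)
print(round(e, 3))  # MPR 0.795 discounted by weight 0.8 -> 0.636
```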

  19. Application of a derivative-free global optimization algorithm to the derivation of a new time integration scheme for the simulation of incompressible turbulence

    NASA Astrophysics Data System (ADS)

    Alimohammadi, Shahrouz; Cavaglieri, Daniele; Beyhaghi, Pooriya; Bewley, Thomas R.

    2016-11-01

    This work applies a recently developed derivative-free optimization algorithm to derive a new mixed implicit-explicit (IMEX) time integration scheme for Computational Fluid Dynamics (CFD) simulations. The algorithm allows a specified order of accuracy for the time integration, together with other important stability properties, to be imposed as nonlinear constraints within the optimization problem. In this procedure, the coefficients of the IMEX scheme must satisfy a set of constraints simultaneously. The optimization process therefore estimates, at each iteration, the location of the optimal coefficients using a set of global surrogates for both the objective and constraint functions, as well as a model of the uncertainty of these surrogates based on the concept of Delaunay triangulation. This procedure has been proven to converge to the global minimum of the constrained optimization problem provided the constraints and objective functions are twice differentiable. As a result, a new third-order, low-storage IMEX Runge-Kutta time integration scheme is obtained with remarkably fast convergence. Numerical tests are then performed on turbulent channel flow simulations to validate the theoretical order of accuracy and stability properties of the new scheme.
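    The IMEX splitting itself can be illustrated at first order: for u' = -λu + f(t), the stiff linear term is treated implicitly and the forcing explicitly. This shows only the underlying idea; the paper's scheme is a third-order, low-storage Runge-Kutta method whose coefficients come from the optimization.

```python
# First-order IMEX step for u' = -lam*u + f(t): implicit in the stiff
# linear term, explicit in the non-stiff forcing. Solving
# u_{n+1} = u_n + dt*(-lam*u_{n+1}) + dt*f(t_n) gives the update below.

import math

def imex_euler_step(u, t, dt, lam, f):
    return (u + dt * f(t)) / (1.0 + dt * lam)

lam = 1000.0                  # stiff decay rate
f = math.sin                  # non-stiff forcing
u, t, dt = 1.0, 0.0, 0.01
for _ in range(100):          # dt*lam = 10: explicit Euler would blow up
    u = imex_euler_step(u, t, dt, lam, f)
    t += dt
print(f"{u:.6f}")             # remains bounded, near f(t)/lam
```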

  20. a Transplantable Compensation Scheme for the Effect of the Radiance from the Interior of a Camera on the Accuracy of Temperature Measurement

    NASA Astrophysics Data System (ADS)

    Dong, Shidu; Yang, Xiaofan; He, Bo; Liu, Guojin

    2006-11-01

    Radiance coming from the interior of an uncooled infrared camera has a significant effect on the measured value of the temperature of an object. This paper presents a three-phase compensation scheme to cope with this effect. The first phase acquires the calibration data and forms the calibration function by least-squares fitting. Likewise, the second phase obtains the compensation data and builds the compensation function by fitting. With the aid of these functions, the third phase determines the temperature of the object of concern at any given ambient temperature. Acquiring the compensation data of a camera is known to be very time-consuming. To obtain the compensation data at a reasonable time cost, we propose a transplantable scheme. The idea of this scheme is to calculate the ratio between the central pixel’s responsivity to the radiance from the interior of the child camera and that of the mother camera, and then to determine the compensation data of the child camera using this ratio and the compensation data of the mother camera. Experimental results show that both the child camera and the mother camera can measure the temperature of the object with an error of no more than 2°C.
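    The first-phase fit can be sketched as an ordinary least-squares problem mapping detector response to blackbody temperature; the response/temperature pairs below are synthetic, and the quadratic model is an assumption, since the paper does not state its fitting function.

```python
# Least-squares calibration: fit T = a*r^2 + b*r + c to synthetic
# (response r, temperature T in deg C) pairs. Both the data and the
# quadratic model are illustrative assumptions.

import numpy as np

data = [(100, 10.2), (150, 20.1), (200, 30.5), (250, 41.0), (300, 52.2)]
r = np.array([p[0] for p in data], dtype=float)
T = np.array([p[1] for p in data], dtype=float)

A = np.vstack([r**2, r, np.ones_like(r)]).T     # design matrix
(a, b, c), *_ = np.linalg.lstsq(A, T, rcond=None)

calibrate = lambda resp: a * resp**2 + b * resp + c
print(f"{calibrate(200):.1f}")                  # close to the measured 30.5
```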

  1. Players and processes behind the national health insurance scheme: a case study of Uganda

    PubMed Central

    2013-01-01

    Background Uganda is the last East African country to adopt a National Health Insurance Scheme (NHIS). To lessen the inequitable burden of healthcare spending, health financing reform has focused on the establishment of national health insurance. The objective of this research is to depict how stakeholders and their power and interests have shaped the process of agenda setting and policy formulation for Uganda’s proposed NHIS. The study provides a contextual analysis of the development of NHIS policy within the context of national policies and processes. Methods The methodology is a single case study of agenda setting and policy formulation related to the proposed NHIS in Uganda. It involves an analysis of the real-life context, the content of proposals, the process, and a retrospective stakeholder analysis in terms of policy development. Data collection comprised a literature review of published documents, technical reports, policy briefs, and memos obtained from Uganda’s Ministry of Health and other unpublished sources. Formal discussions were held with ministry staff involved in the design of the scheme and some members of the task force to obtain clarification, verify events, and gain additional information. Results The process of developing the NHIS has been an incremental one, characterised by small-scale, gradual changes and repeated adjustments through various stakeholder engagements during the three phases of development: from 1995 to 1999; 2000 to 2005; and 2006 to 2011. Despite political will in the government, progress with the NHIS has been slow, and it has yet to be implemented. Stakeholders, notably the private sector, played an important role in influencing the pace of the development process and the currently proposed design of the scheme. Conclusions This study underscores the importance of stakeholder analysis in major health reforms. 
Early use of stakeholder analysis combined with an ongoing review and revision of NHIS policy proposals during stakeholder discussions would be an effective strategy for avoiding potential pitfalls and obstacles in policy implementation. Given the private sector’s influence on negotiations over health insurance design in Uganda, this paper also reviews the experience of two countries with similar stakeholder dynamics. PMID:24053551

  2. Players and processes behind the national health insurance scheme: a case study of Uganda.

    PubMed

    Basaza, Robert K; O'Connell, Thomas S; Chapčáková, Ivana

    2013-09-22

    Uganda is the last East African country to adopt a National Health Insurance Scheme (NHIS). To lessen the inequitable burden of healthcare spending, health financing reform has focused on the establishment of national health insurance. The objective of this research is to depict how stakeholders and their power and interests have shaped the process of agenda setting and policy formulation for Uganda's proposed NHIS. The study provides a contextual analysis of the development of NHIS policy within the context of national policies and processes. The methodology is a single case study of agenda setting and policy formulation related to the proposed NHIS in Uganda. It involves an analysis of the real-life context, the content of proposals, the process, and a retrospective stakeholder analysis in terms of policy development. Data collection comprised a literature review of published documents, technical reports, policy briefs, and memos obtained from Uganda's Ministry of Health and other unpublished sources. Formal discussions were held with ministry staff involved in the design of the scheme and some members of the task force to obtain clarification, verify events, and gain additional information. The process of developing the NHIS has been an incremental one, characterised by small-scale, gradual changes and repeated adjustments through various stakeholder engagements during the three phases of development: from 1995 to 1999; 2000 to 2005; and 2006 to 2011. Despite political will in the government, progress with the NHIS has been slow, and it has yet to be implemented. Stakeholders, notably the private sector, played an important role in influencing the pace of the development process and the currently proposed design of the scheme. This study underscores the importance of stakeholder analysis in major health reforms. 
Early use of stakeholder analysis combined with an ongoing review and revision of NHIS policy proposals during stakeholder discussions would be an effective strategy for avoiding potential pitfalls and obstacles in policy implementation. Given the private sector's influence on negotiations over health insurance design in Uganda, this paper also reviews the experience of two countries with similar stakeholder dynamics.

  3. Financial protection under the new rural cooperative medical schemes in China.

    PubMed

    Wang, Juan; Zhou, Hong-Wei; Lei, Yi-Xiong; Wang, Xin-Wang

    2012-08-01

    This study was the first of its kind to analyze financial protection under the New Rural Cooperative Medical Scheme in China using a claims database analysis. A claims database analysis of all hospitalizations reimbursed by the New Rural Cooperative Medical Scheme between January 2005 and December 2008 in the Panyu district of Guangzhou, covering 108,414 discharges, was conducted to identify differences in the real reimbursement rate among 5 hospitalization cost categories by sex, age, and hospital type, and to investigate the distribution of hospital-type choices across age and hospitalization cost categories. The share of total cost reimbursed was only 34% on average, and it increased with age but decreased with higher hospitalization cost, undermining catastrophic coverage. Older people were more likely to be hospitalized at lower-level hospitals with a higher reimbursement rate. The mean cost per hospitalization and average length of stay increased, whereas the real reimbursement rate decreased, with hospital level among the top 4 diseases with the same ICD-10 diagnostic code (3-digit level) for each age group. Providing better protection against costly medical needs will require shifting the balance of objectives somewhat away from cost control toward more generous reimbursement, expanding the list of treatments that the insurance will cover, or some other policy to provide adequate care at lower-cost facilities where more of the cost is now covered.
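    The quantity at the centre of this analysis, the real reimbursement rate (reimbursed amount over total billed cost) per cost category, can be computed as below; the claims are fabricated for illustration, not records from the scheme's database.

```python
# Real reimbursement rate per hospitalization-cost category: total
# reimbursed over total billed cost. Claims below are fabricated
# illustrations, not records from the scheme's database.

from collections import defaultdict

claims = [
    # (cost_category, total_cost_yuan, reimbursed_yuan)
    ("<=2000",    1500,  700),
    ("<=2000",    1800,  800),
    ("2000-5000", 4000, 1500),
    ("2000-5000", 4800, 1600),
    (">10000",   15000, 3000),
]

totals = defaultdict(lambda: [0.0, 0.0])
for cat, cost, reimb in claims:
    totals[cat][0] += cost
    totals[cat][1] += reimb

rates = {cat: reimb / cost for cat, (cost, reimb) in totals.items()}
for cat, rate in rates.items():
    print(cat, f"{rate:.2f}")  # rate falls as the cost category rises
```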

  4. Development of Decision-Making Automated System for Optimal Placement of Physical Access Control System’s Elements

    NASA Astrophysics Data System (ADS)

    Danilova, Olga; Semenova, Zinaida

    2018-04-01

    The objective of this study is a detailed analysis of the development of physical protection systems for information resources. Optimization theory and the mathematical apparatus of decision-making are used to correctly formulate and construct an algorithm for selecting the optimal configuration of a security system, taking into account the locations of the secured object's access points and zones. The result of this study is a software implementation scheme of a decision-making system for optimal placement of the physical access control system's elements.

  5. A Study on the Security Levels of Spread-Spectrum Embedding Schemes in the WOA Framework.

    PubMed

    Wang, Yuan-Gen; Zhu, Guopu; Kwong, Sam; Shi, Yun-Qing

    2017-08-23

    Security analysis is a very important issue for digital watermarking. Several years ago, according to Kerckhoffs' principle, the famous four security levels, namely insecurity, key security, subspace security, and stego-security, were defined for spread-spectrum (SS) embedding schemes in the framework of watermarked-only attack. However, up to now there has been little application of the definition of these security levels to the theoretical analysis of the security of SS embedding schemes, due to the difficulty of the theoretical analysis. In this paper, based on the security definition, we present a theoretical analysis to evaluate the security levels of five typical SS embedding schemes, which are the classical SS, the improved SS (ISS), the circular extension of ISS, the nonrobust and robust natural watermarking, respectively. The theoretical analysis of these typical SS schemes are successfully performed by taking advantage of the convolution of probability distributions to derive the probabilistic models of watermarked signals. Moreover, simulations are conducted to illustrate and validate our theoretical analysis. We believe that the theoretical and practical analysis presented in this paper can bridge the gap between the definition of the four security levels and its application to the theoretical analysis of SS embedding schemes.

  6. Asymptotic analysis of discrete schemes for non-equilibrium radiation diffusion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cui, Xia, E-mail: cui_xia@iapcm.ac.cn; Yuan, Guang-wei; Shen, Zhi-jun

    Motivated by providing well-behaved fully discrete schemes in practice, this paper extends the asymptotic analysis on time integration methods for non-equilibrium radiation diffusion in [2] to space discretizations. Therein, studies were carried out on a two-temperature model with Larsen's flux-limited diffusion operator, and both the implicitly balanced (IB) and linearly implicit (LI) methods were shown to be asymptotic-preserving. In this paper, we focus on asymptotic analysis for space discrete schemes in dimensions one and two. First, in construction of the schemes, in contrast to traditional first-order approximations, asymmetric second-order accurate spatial approximations are devised for flux-limiters on the boundary, and discrete schemes with second-order accuracy on the global spatial domain are acquired consequently. Then, by employing formal asymptotic analysis, the first-order asymptotic-preserving property is shown for these schemes and furthermore for the fully discrete schemes. Finally, with the help of manufactured solutions, numerical tests are performed, which demonstrate quantitatively that the fully discrete schemes with IB time evolution indeed have the accuracy and asymptotic convergence the theory predicts, and hence are well qualified for both non-equilibrium and equilibrium radiation diffusion. Highlights: • Provide AP fully discrete schemes for non-equilibrium radiation diffusion. • Propose second order accurate schemes by asymmetric approach for boundary flux-limiter. • Show first order AP property of spatially and fully discrete schemes with IB evolution. • Devise subtle artificial solutions; verify accuracy and AP property quantitatively. • Ideas can be generalized to 3-dimensional problems and higher order implicit schemes.

  7. RICH: OPEN-SOURCE HYDRODYNAMIC SIMULATION ON A MOVING VORONOI MESH

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yalinewich, Almog; Steinberg, Elad; Sari, Re’em

    2015-02-01

    We present here RICH, a state-of-the-art two-dimensional hydrodynamic code based on Godunov's method, on an unstructured moving mesh (the acronym stands for Racah Institute Computational Hydrodynamics). This code is largely based on the code AREPO. It differs from AREPO in the interpolation and time-advancement schemes, as well as in a novel parallelization scheme based on Voronoi tessellation. Using our code, we study the pros and cons of a moving mesh (in comparison to a static mesh). We also compare its accuracy to other codes. Specifically, we show that our implementation of external sources and our time-advancement scheme are more accurate and robust than those of AREPO when the mesh is allowed to move. We performed a parameter study of the cell rounding mechanism (Lloyd iterations) and its effects. We find that in most cases a moving mesh gives better results than a static mesh, but this is not universally true. In the case where matter moves one way and a sound wave travels the other way (such that relative to the grid the wave is not moving), a static mesh gives better results than a moving mesh. We perform an analytic analysis for finite difference schemes that reveals that a Lagrangian simulation is better than an Eulerian simulation in the case of a highly supersonic flow. Moreover, we show that Voronoi-based moving mesh schemes suffer from an error, which is resolution independent, due to inconsistencies between the flux calculation and the change in the area of a cell. Our code is publicly available as open source and designed in an object-oriented, user-friendly way that facilitates incorporation of new algorithms and physical processes.

  8. An efficient scheme for automatic web pages categorization using the support vector machine

    NASA Astrophysics Data System (ADS)

    Bhalla, Vinod Kumar; Kumar, Neeraj

    2016-07-01

    In the past few years, with the evolution of the Internet and related technologies, the number of Internet users has grown exponentially. These users demand access to relevant web pages from the Internet within a fraction of a second. Achieving this goal requires efficient categorization of web page contents. Manual categorization of these billions of web pages with high accuracy is a challenging task, and most of the existing techniques reported in the literature are semi-automatic, so a higher level of accuracy cannot be achieved with them. To achieve these goals, this paper proposes an automatic categorization of web pages into domain categories. The proposed scheme is based on the identification of specific and relevant features of the web pages. In the proposed scheme, extraction and evaluation of features are done first, followed by filtering of the feature set for categorization of domain web pages. A feature extraction tool based on the HTML document object model of the web page is developed in the proposed scheme. Feature extraction and weight assignment are based on a collection of domain-specific keyword lists developed by considering various domain pages. Moreover, the keyword list is reduced on the basis of the ids of keywords in the keyword list. Also, stemming of keywords and tag text is done to achieve a higher accuracy. An extensive feature set is generated to develop a robust classification technique. The proposed scheme was evaluated using a machine learning method in combination with feature extraction and statistical analysis, using a support vector machine kernel as the classification tool. The results obtained confirm the effectiveness of the proposed scheme in terms of its accuracy on different categories of web pages.
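
    The keyword-driven feature extraction step can be sketched as follows. The tag-stripping regex and keyword list below are illustrative stand-ins for the paper's HTML-DOM-based tool, and the resulting vectors would then feed a support vector machine classifier:

```python
# A minimal sketch of keyword-based feature extraction: each web page is
# mapped to a count vector over a domain-specific keyword list. The list
# and the crude tag stripping are hypothetical, for illustration only.
import re

DOMAIN_KEYWORDS = ["tennis", "racket", "court", "match"]  # hypothetical list

def feature_vector(html_text, keywords=DOMAIN_KEYWORDS):
    # Strip tags crudely, lowercase, tokenize, then count keyword occurrences.
    text = re.sub(r"<[^>]+>", " ", html_text).lower()
    tokens = re.findall(r"[a-z]+", text)
    return [tokens.count(k) for k in keywords]

page = "<html><title>Tennis court booking</title><p>Book a court.</p></html>"
vec = feature_vector(page)
```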

  9. Moving object detection using dynamic motion modelling from UAV aerial images.

    PubMed

    Saif, A F M Saifuddin; Prabuwono, Anton Satria; Mahayuddin, Zainal Rasyid

    2014-01-01

    Motion-analysis-based moving object detection from UAV aerial images remains an unsolved issue because proper motion estimation is not adequately considered. Existing moving object detection approaches for UAV aerial images do not use motion-based pixel intensity measurement to detect moving objects robustly. Besides, current research on moving object detection from UAV aerial images mostly depends on either a frame difference or a segmentation approach alone. This research has two main purposes: first, to develop a new motion model called DMM (dynamic motion model), and second, to apply the proposed segmentation approach SUED (segmentation using edge based dilation) with frame difference embedded together with the DMM model. The proposed DMM model provides effective search windows based on the highest pixel intensity, so that SUED segments only the specific area containing a moving object rather than searching the whole frame. At each stage of the proposed scheme, the experimental fusion of DMM and SUED extracts moving objects faithfully. Experimental results demonstrate the validity of the proposed methodology.
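
    The frame-difference component that SUED builds on can be sketched in a few lines; the threshold and pixel values are illustrative, and the actual scheme couples this mask with DMM search windows and edge-based dilation:

```python
# Toy sketch of frame differencing: pixels whose intensity changes between
# consecutive frames by more than a threshold are marked as candidate
# moving-object pixels (data and threshold are illustrative).
def frame_difference(prev, curr, threshold=10):
    """prev, curr: 2-D lists of grayscale intensities; returns binary mask."""
    return [[1 if abs(c - p) > threshold else 0
             for p, c in zip(prow, crow)]
            for prow, crow in zip(prev, curr)]

prev = [[10, 10, 10],
        [10, 10, 10]]
curr = [[10, 50, 10],   # one pixel brightened: a candidate moving object
        [10, 10, 12]]   # small change below threshold: ignored
mask = frame_difference(prev, curr)
```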

  10. Layered data association using graph-theoretic formulation with applications to tennis ball tracking in monocular sequences.

    PubMed

    Yan, Fei; Christmas, William; Kittler, Josef

    2008-10-01

    In this paper, we propose a multilayered data association scheme with graph-theoretic formulation for tracking multiple objects that undergo switching dynamics in clutter. The proposed scheme takes as input object candidates detected in each frame. At the object candidate level, "tracklets'' are "grown'' from sets of candidates that have high probabilities of containing only true positives. At the tracklet level, a directed and weighted graph is constructed, where each node is a tracklet, and the edge weight between two nodes is defined according to the "compatibility'' of the two tracklets. The association problem is then formulated as an all-pairs shortest path (APSP) problem in this graph. Finally, at the path level, by analyzing the APSPs, all object trajectories are identified, and track initiation and track termination are automatically dealt with. By exploiting a special topological property of the graph, we have also developed a more efficient APSP algorithm than the general-purpose ones. The proposed data association scheme is applied to tennis sequences to track tennis balls. Experiments show that it works well on sequences where other data association methods perform poorly or fail completely.
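
    The tracklet-graph formulation can be illustrated with a toy all-pairs shortest path computation. The weights below are arbitrary stand-ins for the paper's compatibility measure, and the paper's own APSP algorithm exploits a special topological property of the graph to run faster than this generic Floyd-Warshall sketch:

```python
# Sketch of the tracklet graph: nodes are tracklets, directed edge weights
# encode (inverse) compatibility, and object trajectories correspond to
# shortest paths. Floyd-Warshall computes all-pairs shortest paths.
INF = float("inf")

def floyd_warshall(n, edges):
    """edges: {(u, v): weight}; returns the all-pairs shortest-path matrix."""
    dist = [[0.0 if i == j else INF for j in range(n)] for i in range(n)]
    for (u, v), w in edges.items():
        dist[u][v] = min(dist[u][v], w)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist

# Three tracklets: chaining 0 -> 1 -> 2 is cheaper than the direct jump 0 -> 2,
# so the shortest path links all three into one trajectory.
dist = floyd_warshall(3, {(0, 1): 1.0, (1, 2): 1.0, (0, 2): 5.0})
```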

  11. Automatic image database generation from CAD for 3D object recognition

    NASA Astrophysics Data System (ADS)

    Sardana, Harish K.; Daemi, Mohammad F.; Ibrahim, Mohammad K.

    1993-06-01

    The development and evaluation of multiple-view 3-D object recognition systems is based on a large set of model images. Due to the various advantages of using CAD, it is becoming more and more practical to use existing CAD data in computer vision systems. Current PC-level CAD systems are capable of providing physical image modelling and rendering involving positional variations in cameras, light sources, etc. We have formulated a modular scheme for automatic generation of various aspects (views) of the objects in a model-based 3-D object recognition system. These views are generated at desired orientations on the unit Gaussian sphere. With a suitable network file sharing system (NFS), the images can be stored directly in a database located on a file server. This paper presents the image modelling solutions using CAD in relation to the multiple-view approach. Our modular scheme for data conversion and automatic image database storage for such a system is discussed. We have used this approach in 3-D polyhedron recognition. An overview of the results, the advantages and limitations of using CAD data, and conclusions from using such a scheme are also presented.

  12. Visual tracking using objectness-bounding box regression and correlation filters

    NASA Astrophysics Data System (ADS)

    Mbelwa, Jimmy T.; Zhao, Qingjie; Lu, Yao; Wang, Fasheng; Mbise, Mercy

    2018-03-01

    Visual tracking is a fundamental problem in computer vision with extensive application domains in surveillance and intelligent systems. Recently, correlation filter-based tracking methods have shown a great achievement in terms of robustness, accuracy, and speed. However, such methods have a problem of dealing with fast motion (FM), motion blur (MB), illumination variation (IV), and drifting caused by occlusion (OCC). To solve this problem, a tracking method that integrates objectness-bounding box regression (O-BBR) model and a scheme based on kernelized correlation filter (KCF) is proposed. The scheme based on KCF is used to improve the tracking performance of FM and MB. For handling drift problem caused by OCC and IV, we propose objectness proposals trained in bounding box regression as prior knowledge to provide candidates and background suppression. Finally, scheme KCF as a base tracker and O-BBR are fused to obtain a state of a target object. Extensive experimental comparisons of the developed tracking method with other state-of-the-art trackers are performed on some of the challenging video sequences. Experimental comparison results show that our proposed tracking method outperforms other state-of-the-art tracking methods in terms of effectiveness, accuracy, and robustness.

  13. Comparison of the co-gasification of sewage sludge and food wastes and cost-benefit analysis of gasification- and incineration-based waste treatment schemes.

    PubMed

    You, Siming; Wang, Wei; Dai, Yanjun; Tong, Yen Wah; Wang, Chi-Hwa

    2016-10-01

    The compositions of food wastes and their co-gasification producer gas were compared with the existing data of sewage sludge. Results showed that food wastes are more favorable than sewage sludge for co-gasification based on residue generation and energy output. Two decentralized gasification-based schemes were proposed to dispose of the sewage sludge and food wastes in Singapore. Monte Carlo simulation-based cost-benefit analysis was conducted to compare the proposed schemes with the existing incineration-based scheme. It was found that the gasification-based schemes are financially superior to the incineration-based scheme based on the data of net present value (NPV), benefit-cost ratio (BCR), and internal rate of return (IRR). Sensitivity analysis was conducted to suggest effective measures to improve the economics of the schemes. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. Load Balancing Using Time Series Analysis for Soft Real Time Systems with Statistically Periodic Loads

    NASA Technical Reports Server (NTRS)

    Hailperin, Max

    1993-01-01

    This thesis provides design and analysis of techniques for global load balancing on ensemble architectures running soft-real-time object-oriented applications with statistically periodic loads. It focuses on estimating the instantaneous average load over all the processing elements. The major contribution is the use of explicit stochastic process models for both the loading and the averaging itself. These models are exploited via statistical time-series analysis and Bayesian inference to provide improved average load estimates, and thus to facilitate global load balancing. This thesis explains the distributed algorithms used and provides some optimality results. It also describes the algorithms' implementation and gives performance results from simulation. These results show that our techniques allow more accurate estimation of the global system loading, resulting in fewer object migrations than local methods. Our method is shown to provide superior performance, relative not only to static load-balancing schemes but also to many adaptive methods.
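
    As a much simpler stand-in for the thesis's stochastic-process models, the basic idea of smoothing noisy load samples into a usable global estimate can be sketched with an exponentially weighted moving average:

```python
# Toy sketch (not the thesis's Bayesian estimator): exponentially smooth
# noisy per-step average-load samples into the kind of estimate a global
# load balancer could act on.
def smoothed_load(samples, alpha=0.5):
    """samples: sequence of per-step average-load measurements."""
    estimate = samples[0]
    history = [estimate]
    for x in samples[1:]:
        estimate = alpha * x + (1 - alpha) * estimate  # standard EWMA update
        history.append(estimate)
    return history

history = smoothed_load([4.0, 8.0, 6.0])
```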

  15. Simplified two-dimensional microwave imaging scheme using metamaterial-loaded Vivaldi antenna

    NASA Astrophysics Data System (ADS)

    Johari, Esha; Akhter, Zubair; Bhaskar, Manoj; Akhtar, M. Jaleel

    2017-03-01

    In this paper, a highly efficient, low-cost scheme for two-dimensional microwave imaging is proposed. To this end, an AZIM (anisotropic zero index metamaterial) cell-loaded Vivaldi antenna is designed and tested as the effective electromagnetic radiation beam source required in the microwave imaging scheme. The designed antenna is first individually tested in the anechoic chamber, and its directivity along with the radiation pattern is obtained. The measurement setup for the imaging involves a vector network analyzer, the AZIM cell-loaded ultra-wideband Vivaldi antenna, and other associated microwave components. The potential of the designed antenna for microwave imaging is tested by first obtaining two-dimensional reflectivity images of metallic samples of different shapes placed in front of the antenna, using the proposed scheme. In the next step, these sets of samples are hidden behind wooden blocks of different thicknesses and the reflectivity image of the test media is reconstructed by using the proposed scheme. Finally, the reflectivity images of various dielectric samples (Teflon, Plexiglas, permanent magnet moving coil) along with a copper sheet placed on a piece of cardboard are reconstructed by using the proposed setup. The images obtained for each case are plotted and compared with the actual objects, and a close match is observed, which shows the applicability of the proposed scheme for through-wall imaging and the detection of concealed objects.

  16. Investigation of advanced counterrotation blade configuration concepts for high speed turboprop systems. Task 2: Unsteady ducted propfan analysis computer program users manual

    NASA Technical Reports Server (NTRS)

    Hall, Edward J.; Delaney, Robert A.; Bettner, James L.

    1991-01-01

    The primary objective of this study was the development of a time-dependent three-dimensional Euler/Navier-Stokes aerodynamic analysis to predict unsteady compressible transonic flows about ducted and unducted propfan propulsion systems at angle of attack. The computer codes resulting from this study are referred to as Advanced Ducted Propfan Analysis Codes (ADPAC). This report is intended to serve as a computer program user's manual for the ADPAC developed under Task 2 of NASA Contract NAS3-25270, Unsteady Ducted Propfan Analysis. Aerodynamic calculations were based on a four-stage Runge-Kutta time-marching finite volume solution technique with added numerical dissipation. A time-accurate implicit residual smoothing operator was utilized for unsteady flow predictions. For unducted propfans, a single H-type grid was used to discretize each blade passage of the complete propeller. For ducted propfans, a coupled system of five grid blocks utilizing an embedded C-grid about the cowl leading edge was used to discretize each blade passage. Grid systems were generated by a combined algebraic/elliptic algorithm developed specifically for ducted propfans. Numerical calculations were compared with experimental data for both ducted and unducted propfan flows. The solution scheme demonstrated efficiency and accuracy comparable with other schemes of this class.
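
    The four-stage Runge-Kutta time marching at the heart of ADPAC can be illustrated on a scalar ODE. This is the classical RK4 update; in the solver the right-hand side would be the finite-volume residual (with added numerical dissipation) rather than this toy function:

```python
# Classical four-stage Runge-Kutta step for dy/dt = f(t, y), a scalar
# stand-in for the solver's time-marching update.
def rk4_step(f, t, y, dt):
    k1 = f(t, y)
    k2 = f(t + dt / 2, y + dt * k1 / 2)
    k3 = f(t + dt / 2, y + dt * k2 / 2)
    k4 = f(t + dt, y + dt * k3)
    return y + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6

# Integrate dy/dt = -y from y(0) = 1 to t = 1; the exact answer is e**-1.
y, t, dt = 1.0, 0.0, 0.01
for _ in range(100):
    y = rk4_step(lambda t, y: -y, t, y, dt)
    t += dt
```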

  17. Atlas of TOMS ozone data collected during the Genesis of Atlantic Lows Experiment (GALE), 1986

    NASA Technical Reports Server (NTRS)

    Larko, David E.; Uccellini, Louis W.; Krueger, Arlin J.

    1986-01-01

    Data from the TOMS (Total Ozone Mapping Spectrometer) instrument aboard the Nimbus-7 satellite were collected daily in real time during the GALE (Genesis of Atlantic Lows Experiment) from January 15 through March 15, 1986. The TOMS ozone data values were processed into GEMPAK format and transferred from the Goddard Space Flight Center to GALE operations in Raleigh-Durham, NC, in as little as three hours for use, in part, to direct aircraft research flights recording in situ measurements of ozone and water vapor in areas of interest. Once in GEMPAK format, the ozone values were processed into gridded form using the Barnes objective analysis scheme, and contour plots of the ozone were created. This atlas provides objectively analyzed contour plots of the ozone for each of the sixty days of GALE as well as four-panel presentations of the ozone analysis combined on the basis of GALE Intensive Observing Periods (IOP's).
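
    The Barnes scheme used to grid the scattered ozone values weights each observation by a Gaussian function of its distance to the grid point. A single-pass sketch (the smoothing parameter kappa and the data are illustrative; the operational scheme applies successive correction passes):

```python
# Single-pass Barnes-style estimate at one grid point: each observation
# contributes with Gaussian weight exp(-r**2 / kappa). Data are illustrative.
import math

def barnes_estimate(grid_point, observations, kappa=1.0):
    """observations: list of ((x, y), value); returns the weighted estimate."""
    num = den = 0.0
    for (x, y), value in observations:
        r2 = (x - grid_point[0]) ** 2 + (y - grid_point[1]) ** 2
        w = math.exp(-r2 / kappa)
        num += w * value
        den += w
    return num / den

# Two equidistant observations are averaged exactly.
est = barnes_estimate((0.0, 0.0), [((1.0, 0.0), 300.0), ((-1.0, 0.0), 320.0)])
```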

  18. Efficient distance calculation using the spherically-extended polytope (s-tope) model

    NASA Technical Reports Server (NTRS)

    Hamlin, Gregory J.; Kelley, Robert B.; Tornero, Josep

    1991-01-01

    An object representation scheme which allows for Euclidean distance calculation is presented. The object model extends the polytope model by representing objects as the convex hull of a finite set of spheres. An algorithm for calculating distances between objects is developed which is linear in the total number of spheres specifying the two objects.
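
    The base case such a distance computation builds on, the Euclidean distance between two spheres, can be sketched directly (coordinates and radii below are illustrative; the full algorithm works over the convex hulls of sphere sets):

```python
# Distance between two spheres: center distance minus both radii, clamped
# at zero when the spheres overlap.
import math

def sphere_distance(c1, r1, c2, r2):
    center_dist = math.dist(c1, c2)  # Euclidean distance between centers
    return max(0.0, center_dist - r1 - r2)

d = sphere_distance((0.0, 0.0, 0.0), 1.0, (5.0, 0.0, 0.0), 2.0)
```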

  19. Mentoring for NHS doctors: perceived benefits across the personal–professional interface

    PubMed Central

    Steven, A; Oxley, J; Fleming, WG

    2008-01-01

    Summary Objective To investigate NHS doctors' perceived benefits of being involved in mentoring schemes and to explore the overlaps and relationships between areas of benefit. Design Extended qualitative analysis of a multi-site interview study following an interpretivist approach. Setting Six NHS mentoring schemes across England. Main outcome measures Perceived benefits. Results While primary analysis resulted in lists of perceived benefits, the extended analysis revealed three overarching areas: professional practice, personal well-being and development. Benefits appear to go beyond a doctor's professional role to cross the personal–professional interface. Problem solving and change management seem to be key processes underpinning the raft of personal and professional benefits reported. A conceptual map was developed to depict these areas and relationships. In addition secondary analysis suggests that in benefitting one area mentoring may lead to consequential benefits in others. Conclusions Prior research into mentoring has mainly taken place in a single health care sector. This multi-site study suggests that the perceived benefits of involvement in mentoring may cross the personal/professional interface and may override organizational differences. Furthermore the map developed highlights the complex relationships which exist between the three areas of professional practice, personal wellbeing and personal and professional development. Given the consistency of findings across several studies it seems probable that organizations would be strengthened by doctors who feel more satisfied and confident in their professional roles as a result of participation in mentoring. Mentoring may have the potential to take us beyond individual limits to greater benefits and the conceptual map may offer a starting point for the development of outcome criteria and evaluation tools for mentoring schemes. PMID:19029356

  20. Analysis and design of numerical schemes for gas dynamics. 2: Artificial diffusion and discrete shock structure

    NASA Technical Reports Server (NTRS)

    Jameson, Antony

    1994-01-01

    The effect of artificial diffusion on discrete shock structures is examined for a family of schemes which includes scalar diffusion, convective upwind and split pressure (CUSP) schemes, and upwind schemes with characteristic splitting. The analysis leads to conditions on the diffusive flux such that stationary discrete shocks can contain a single interior point. The simplest formulation which meets these conditions is a CUSP scheme in which the coefficient of the pressure differences is fully determined by the coefficient of convective diffusion. It is also shown how both the characteristic and CUSP schemes can be modified to preserve constant stagnation enthalpy in steady flow, leading to four variants: the E- and H-characteristic schemes, and the E- and H-CUSP schemes. Numerical results are presented which confirm the properties of these schemes.
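
    The scalar-diffusion member of this family can be sketched as a numerical flux for Burgers' equation: a central average of the physical fluxes plus an artificial-diffusion term. Here the diffusion coefficient alpha is taken as the local maximum wave speed (a Rusanov-type choice for illustration, not the paper's CUSP coefficients):

```python
# Numerical flux for Burgers' equation, f(u) = u**2 / 2, written as a
# central flux plus a scalar artificial-diffusion term with coefficient
# alpha (illustrative Rusanov-type choice: the local max wave speed).
def numerical_flux(u_left, u_right):
    f = lambda u: 0.5 * u * u
    alpha = max(abs(u_left), abs(u_right))  # scalar diffusion coefficient
    return 0.5 * (f(u_left) + f(u_right)) - 0.5 * alpha * (u_right - u_left)

# Across a stationary shock (u jumps from 1 to -1) the diffusion term
# dominates the central average.
flux = numerical_flux(1.0, -1.0)
```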

  1. A Secure and Robust User Authenticated Key Agreement Scheme for Hierarchical Multi-medical Server Environment in TMIS.

    PubMed

    Das, Ashok Kumar; Odelu, Vanga; Goswami, Adrijit

    2015-09-01

    The telecare medicine information system (TMIS) helps patients gain health monitoring facilities at home and access medical services over the Internet or mobile networks. Recently, Amin and Biswas presented a smart card based user authentication and key agreement security protocol usable for TMIS using the cryptographic one-way hash function and biohashing function, and claimed that their scheme is secure against all possible attacks. Though their scheme is efficient due to its use of a one-way hash function, we show that it has several security pitfalls and design flaws, namely: (1) it fails to protect against a privileged-insider attack, (2) it fails to protect against a strong replay attack, (3) it fails to protect against a strong man-in-the-middle attack, (4) it has a design flaw in the user registration phase, (5) it has a design flaw in the login phase, (6) it has a design flaw in the password change phase, (7) it lacks support for a biometric update phase, and (8) it has flaws in its formal security analysis. In order to withstand these security pitfalls and design flaws, we propose a secure and robust user authenticated key agreement scheme for the hierarchical multi-server environment suitable for TMIS, using the cryptographic one-way hash function and a fuzzy extractor. Through rigorous security analysis, including formal security analysis using the widely accepted Burrows-Abadi-Needham (BAN) logic, formal security analysis under the random oracle model, and informal security analysis, we show that our scheme is secure against possible known attacks. Furthermore, we simulate our scheme using the widely accepted Automated Validation of Internet Security Protocols and Applications (AVISPA) tool. The simulation results show that our scheme is also secure. Our scheme is more efficient in computation and communication as compared to Amin-Biswas's scheme and other related schemes.
In addition, our scheme supports extra functionality features as compared to other related schemes. As a result, our scheme is very appropriate for practical applications in TMIS.
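
    The cryptographic one-way hash function both schemes rely on can be illustrated generically. This is not the authors' protocol, just the salted-hash building block; real schemes use per-user random salts and additional protocol messages:

```python
# Generic salted-hash building block: the verifier stores h(password || salt)
# and checks a credential without ever storing the plaintext password.
import hashlib

def hash_credential(password, salt):
    return hashlib.sha256(password.encode() + salt).hexdigest()

salt = b"fixed-demo-salt"  # illustrative; real schemes use random salts
stored = hash_credential("patient-pw", salt)

ok = hash_credential("patient-pw", salt) == stored   # correct password
bad = hash_credential("wrong-pw", salt) == stored    # wrong password
```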

  2. Optimizing congestion and emissions via tradable credit charge and reward scheme without initial credit allocations

    NASA Astrophysics Data System (ADS)

    Zhu, Wenlong; Ma, Shoufeng; Tian, Junfang

    2017-01-01

    This paper investigates a revenue-neutral tradable credit charge and reward scheme, without initial credit allocations, that can reassign network traffic flow patterns to optimize congestion and emissions. First, we prove the existence of the proposed schemes and show that they decentralize the minimum-emission flow pattern to user equilibrium; we also design a solution method for the proposed credit scheme for the minimum-emission problem. Second, we investigate revenue-neutral tradable credit charge and reward schemes without initial credit allocations for the bi-objective problem of obtaining the Pareto system-optimum flow patterns of congestion and emissions, and show that the corresponding solutions lie in the polyhedron defined by a system of inequalities and equalities. Last, a numerical example based on a simple traffic network is adopted to obtain the proposed credit schemes and verify that they are revenue-neutral.

  3. A morphing-based scheme for large deformation analysis with stereo-DIC

    NASA Astrophysics Data System (ADS)

    Genovese, Katia; Sorgente, Donato

    2018-05-01

    A key step in the DIC-based image registration process is the definition of the initial guess for the non-linear optimization routine aimed at finding the parameters describing the pixel subset transformation. This initialization may prove very challenging, and possibly fail, when dealing with pairs of largely deformed images such as those obtained from two angled views of non-flat objects or from the temporal undersampling of rapidly evolving phenomena. To address this problem, we developed a procedure that generates a sequence of intermediate synthetic images for gradually tracking the pixel subset transformation between the two extreme configurations. To this end, a proper image warping function is defined over the entire image domain through the adoption of a robust feature-based algorithm followed by a NURBS-based interpolation scheme. This allows a fast and reliable estimation of the initial guess of the deformation parameters for the subsequent refinement stage of the DIC analysis. The proposed method is described step-by-step by illustrating the measurement of the large and heterogeneous deformation of a circular silicone membrane undergoing axisymmetric indentation. A comparative analysis of the results is carried out by taking as a benchmark a standard reference-updating approach. Finally, the morphing scheme is extended to the most general case of the correspondence search between two largely deformed textured 3D geometries. The feasibility of this latter approach is demonstrated on a very challenging case: the full-surface measurement of the severe deformation (> 150% strain) suffered by an aluminum sheet blank subjected to a pneumatic bulge test.

  4. Development of a solution adaptive unstructured scheme for quasi-3D inviscid flows through advanced turbomachinery cascades

    NASA Technical Reports Server (NTRS)

    Usab, William J., Jr.; Jiang, Yi-Tsann

    1991-01-01

    The objective of the present research is to develop a general solution adaptive scheme for the accurate prediction of inviscid quasi-three-dimensional flow in advanced compressor and turbine designs. The adaptive solution scheme combines an explicit finite-volume time-marching scheme for unstructured triangular meshes and an advancing front triangular mesh scheme with a remeshing procedure for adapting the mesh as the solution evolves. The unstructured flow solver has been tested on a series of two-dimensional airfoil configurations including a three-element analytic test case presented here. Mesh adapted quasi-three-dimensional Euler solutions are presented for three spanwise stations of the NASA rotor 67 transonic fan. Computed solutions are compared with available experimental data.

  5. A High Fuel Consumption Efficiency Management Scheme for PHEVs Using an Adaptive Genetic Algorithm

    PubMed Central

    Lee, Wah Ching; Tsang, Kim Fung; Chi, Hao Ran; Hung, Faan Hei; Wu, Chung Kit; Chui, Kwok Tai; Lau, Wing Hong; Leung, Yat Wah

    2015-01-01

    A high fuel efficiency management scheme for plug-in hybrid electric vehicles (PHEVs) has been developed. In order to achieve fuel consumption reduction, an adaptive genetic algorithm scheme has been designed to adaptively manage the energy resource usage. The objective function of the genetic algorithm is implemented by designing a fuzzy logic controller which closely monitors and resembles the driving conditions and environment of PHEVs, thus trading off petrol versus electricity for optimal driving efficiency. Comparison between calculated results and publicized data shows that the efficiency achieved by the fuzzified genetic algorithm is 10% better than that of existing schemes. The developed scheme, if fully adopted, would help reduce over 600 tons of CO2 emissions worldwide every day. PMID:25587974
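
    The genetic-algorithm idea can be sketched with a deliberately tiny example. The fitness function below is a made-up stand-in for the paper's fuzzy-logic objective, and the decision variable is a single petrol/electricity split ratio:

```python
# Toy genetic algorithm: evolve a petrol/electricity split ratio x in [0, 1]
# that maximizes a hypothetical efficiency function (illustrative only).
import random

def fitness(x):
    return -(x - 0.6) ** 2  # hypothetical optimum at a 60% electric split

def evolve(generations=60, pop_size=20, seed=1):
    rng = random.Random(seed)
    pop = [rng.random() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]  # selection: keep the fitter half
        children = [min(1.0, max(0.0, rng.choice(parents) + rng.gauss(0, 0.05)))
                    for _ in range(pop_size - len(parents))]  # mutation
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

    Because the fitter half of the population is carried over unchanged, the best individual never gets worse; mutation drives it toward the optimum.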

  6. GANDALF - Graphical Astrophysics code for N-body Dynamics And Lagrangian Fluids

    NASA Astrophysics Data System (ADS)

    Hubber, D. A.; Rosotti, G. P.; Booth, R. A.

    2018-01-01

    GANDALF is a new hydrodynamics and N-body dynamics code designed for investigating planet formation, star formation and star cluster problems. GANDALF is written in C++, parallelized with both OpenMP and MPI, and contains a Python library for analysis and visualization. The code has been written with a fully object-oriented approach to easily allow user-defined implementations of physics modules or other algorithms. The code currently contains implementations of smoothed particle hydrodynamics, meshless finite-volume and collisional N-body schemes, but can easily be adapted to include additional particle schemes. We present in this paper the details of its implementation, results from the test suite, serial and parallel performance results and discuss the planned future development. The code is freely available as an open-source project on GitHub at https://github.com/gandalfcode/gandalf under the GPLv2 license.
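
    As a flavor of the smoothed particle hydrodynamics machinery such a code implements, here is the standard M4 cubic spline kernel in 1D (a textbook formula, not code taken from GANDALF), checked for unit normalization over its compact support:

```python
def cubic_spline_kernel(r, h):
    """Standard M4 cubic spline SPH kernel in 1D (normalization 2/(3h))."""
    q = abs(r) / h
    sigma = 2.0 / (3.0 * h)
    if q < 1.0:
        return sigma * (1.0 - 1.5 * q**2 + 0.75 * q**3)
    if q < 2.0:
        return sigma * 0.25 * (2.0 - q) ** 3
    return 0.0  # compact support: zero beyond 2h

# The kernel should integrate to ~1 over its support [-2h, 2h].
h = 0.1
dx = 1e-4
total = sum(cubic_spline_kernel(i * dx - 2 * h, h) * dx
            for i in range(int(4 * h / dx)))
print(round(total, 3))  # -> 1.0
```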

  7. Versatile and declarative dynamic programming using pair algebras.

    PubMed

    Steffen, Peter; Giegerich, Robert

    2005-09-12

    Dynamic programming is a widely used programming technique in bioinformatics. In sharp contrast to the simplicity of textbook examples, implementing a dynamic programming algorithm for a novel and non-trivial application is a tedious and error-prone task. The algebraic dynamic programming approach seeks to alleviate this situation by clearly separating the dynamic programming recurrences and scoring schemes. Based on this programming style, we introduce a generic product operation of scoring schemes. This leads to a remarkable variety of applications, allowing us to achieve optimizations under multiple objective functions, alternative solutions and backtracing, holistic search space analysis, ambiguity checking, and more, without additional programming effort. We demonstrate the method on several applications for RNA secondary structure prediction. The product operation as introduced here adds a significant amount of flexibility to dynamic programming. It provides a versatile testbed for the development of new algorithmic ideas, which can immediately be put to practice.
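
    The core idea of a product of scoring schemes can be seen on a classic dynamic program: pairing each cell's optimal score with a count of co-optimal solutions evaluates two analyses in a single recurrence. The sketch below applies this to edit distance (an illustration of the underlying algebraic idea only, not the authors' ADP framework):

```python
# Each DP cell carries a pair (optimal distance, number of co-optimal
# alignments); the "product" combines the minimizing scheme with the
# counting scheme in one pass.

def edit_distance_with_counts(a, b):
    n, m = len(a), len(b)
    table = [[(0, 1)] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        table[i][0] = (i, 1)
    for j in range(1, m + 1):
        table[0][j] = (j, 1)
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            candidates = [
                (table[i - 1][j - 1][0] + cost, table[i - 1][j - 1][1]),
                (table[i - 1][j][0] + 1, table[i - 1][j][1]),
                (table[i][j - 1][0] + 1, table[i][j - 1][1]),
            ]
            best = min(c[0] for c in candidates)
            count = sum(c[1] for c in candidates if c[0] == best)
            table[i][j] = (best, count)
    return table[n][m]

print(edit_distance_with_counts("ab", "ba"))
# -> (2, 3): distance 2, reached by three co-optimal alignments
```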

  8. Research on multi-user encrypted search scheme in cloud environment

    NASA Astrophysics Data System (ADS)

    Yu, Zonghua; Lin, Sui

    2017-05-01

    Aiming at the problems of existing multi-user encrypted search schemes in cloud computing environments, a basic multi-user encrypted search scheme is proposed first, and the basic scheme is then extended with anonymous hierarchical management of authority. Compared with most existing schemes, the proposed scheme protects not only keyword information but also user identity privacy; at the same time, data owners, rather than the cloud server, directly control users' query permissions. In addition, a special query-key generation rule achieves hierarchical management of users' query permissions. The security analysis shows that the scheme is secure, and the performance analysis and experimental data show that it is practicable.
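
    The paper's query-key rules are not spelled out in the abstract; as a minimal, generic illustration of keyword search over encrypted metadata, the toy sketch below stores only keyed keyword digests, so the server can match a user-supplied trapdoor without learning the keyword itself (all names and keys are hypothetical, and this is not the paper's scheme).

```python
import hmac, hashlib

def trapdoor(query_key: bytes, keyword: str) -> str:
    # Keyed digest: only holders of the query key can form trapdoors.
    return hmac.new(query_key, keyword.encode(), hashlib.sha256).hexdigest()

def build_index(query_key, doc_keywords):
    # doc_id -> set of keyword digests; the server never sees plaintext.
    return {doc: {trapdoor(query_key, w) for w in words}
            for doc, words in doc_keywords.items()}

def search(index, td):
    return sorted(doc for doc, digests in index.items() if td in digests)

key = b"owner-issued query key"  # hypothetical
index = build_index(key, {"doc1": {"cloud", "search"}, "doc2": {"privacy"}})
print(search(index, trapdoor(key, "search")))   # -> ['doc1']
print(search(index, trapdoor(key, "privacy")))  # -> ['doc2']
```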

  9. Implementation analysis of RC5 algorithm on Preneel-Govaerts-Vandewalle (PGV) hashing schemes using length extension attack

    NASA Astrophysics Data System (ADS)

    Siswantyo, Sepha; Susanti, Bety Hayat

    2016-02-01

    Preneel-Govaerts-Vandewalle (PGV) schemes comprise 64 possible single-block-length constructions that can be used to build a hash function from a block cipher. Of those 64 schemes, Preneel claimed that 4 are secure. In this paper, we apply a length extension attack to those 4 secure PGV schemes, instantiated with the RC5 algorithm as the underlying block cipher, to test their collision-resistance property. The attack results show that collisions occur in all 4 secure PGV schemes. Based on the analysis, we indicate that the Feistel structure and data-dependent rotation operation in the RC5 algorithm, the XOR operations in the schemes, and the selection of the additional message-block value all contribute to the occurrence of collisions.
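
    For context, one of the four PGV schemes Preneel classified as secure is Davies-Meyer: H_i = E(m_i, H_{i-1}) XOR H_{i-1}, where the message block is used as the cipher key. The sketch below shows that construction with a toy XOR/rotate "cipher" standing in for RC5 (the toy cipher is an assumption purely for illustration and has no security value).

```python
MASK = 0xFFFFFFFF

def toy_cipher(key: int, block: int) -> int:
    # NOT a real cipher (and not RC5): a few key-dependent XOR/rotate
    # rounds, just enough to make the construction concrete.
    x = block
    for r in range(4):
        x = ((x << 7) | (x >> 25)) & MASK  # rotate left by 7
        x ^= (key + r) & MASK
    return x

def davies_meyer(message_blocks, iv=0x12345678):
    # Davies-Meyer chaining: H_i = E(m_i, H_{i-1}) XOR H_{i-1}
    h = iv
    for m in message_blocks:
        h = toy_cipher(m, h) ^ h
    return h

print(hex(davies_meyer([0xDEADBEEF, 0x0BADF00D])))
```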

  10. PCA-based spatially adaptive denoising of CFA images for single-sensor digital cameras.

    PubMed

    Zheng, Lei; Lukac, Rastislav; Wu, Xiaolin; Zhang, David

    2009-04-01

    Single-sensor digital color cameras use a process called color demosaicking to produce full color images from the data captured by a color filter array (CFA). The quality of demosaicked images is degraded by the sensor noise introduced during the image acquisition process. The conventional solution to combating CFA sensor noise is demosaicking first, followed by separate denoising processing. This strategy generates many noise-caused color artifacts in the demosaicking process, which are hard to remove in the denoising process. Few denoising schemes that work directly on the CFA images have been presented because of the difficulties arising from the red, green and blue interlaced mosaic pattern, yet a well-designed "denoising first and demosaicking later" scheme can have advantages such as fewer noise-caused color artifacts and cost-effective implementation. This paper presents a principal component analysis (PCA)-based spatially adaptive denoising algorithm, which works directly on the CFA data using a supporting window to analyze the local image statistics. By exploiting the spatial and spectral correlations existing in the CFA image, the proposed method can effectively suppress noise while preserving color edges and details. Experiments using both simulated and real CFA images indicate that the proposed scheme outperforms many existing approaches, including sophisticated demosaicking and denoising schemes, in terms of both objective measurement and visual evaluation.

  11. Multiple Object Based RFID System Using Security Level

    NASA Astrophysics Data System (ADS)

    Kim, Jiyeon; Jung, Jongjin; Ryu, Ukjae; Ko, Hoon; Joe, Susan; Lee, Yongjun; Kim, Boyeon; Chang, Yunseok; Lee, Kyoonha

    2007-12-01

    RFID systems are increasingly applied for operational convenience in a wide range of industries and in individual life. However, it is difficult for a person to manage many tags because common RFID systems restrict a tag to identifying just a single object. In addition, RFID systems can raise serious privacy and security problems because of their radio frequency communication. In this paper, we propose a multiple-object RFID tag that can keep multiple object identifiers for different applications in the same tag. The proposed tag allows simultaneous access by its paired applications. We also propose an authentication protocol for the multiple-object tag to prevent serious security and privacy problems in RFID applications. In particular, we focus on the efficiency of the authentication protocol by considering the security levels of applications. In the proposed protocol, applications go through different authentication procedures according to the security level of the object identifier stored in the tag. We implemented the proposed RFID scheme and obtained experimental results on its efficiency and stability.
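
    A minimal sketch of the level-dependent idea (with hypothetical procedures, not the paper's protocol): object identifiers tagged with a low security level go through a cheap authentication procedure, while high-level identifiers use a costlier one.

```python
import hashlib

# Illustrative only: per-identifier security levels select how much work
# the authentication procedure performs. Both procedures and the salt are
# hypothetical stand-ins.

PROCEDURES = {
    "low": lambda msg: hashlib.sha256(msg).hexdigest()[:8],
    "high": lambda msg: hashlib.pbkdf2_hmac(
        "sha256", msg, b"salt", 10_000).hex()[:8],
}

def authenticate(object_id: str, level: str, challenge: bytes) -> str:
    # The response binds the object identifier to the fresh challenge.
    return PROCEDURES[level](object_id.encode() + challenge)

print(authenticate("obj-42", "low", b"nonce"))
print(authenticate("obj-42", "high", b"nonce"))
```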

  12. A classification of errors in lay comprehension of medical documents.

    PubMed

    Keselman, Alla; Smith, Catherine Arnott

    2012-12-01

    Emphasis on participatory medicine requires that patients and consumers participate in tasks traditionally reserved for healthcare providers. This includes reading and comprehending medical documents, often but not necessarily in the context of interacting with Personal Health Records (PHRs). Research suggests that while giving patients access to medical documents has many benefits (e.g., improved patient-provider communication), lay people often have difficulty understanding medical information. Informatics can address the problem by developing tools that support comprehension; this requires in-depth understanding of the nature and causes of errors that lay people make when comprehending clinical documents. The objective of this study was to develop a classification scheme of comprehension errors, based on lay individuals' retellings of two documents containing clinical text: a description of a clinical trial and a typical office visit note. While not comprehensive, the scheme can serve as a foundation of further development of a taxonomy of patients' comprehension errors. Eighty participants, all healthy volunteers, read and retold two medical documents. A data-driven content analysis procedure was used to extract and classify retelling errors. The resulting hierarchical classification scheme contains nine categories and 23 subcategories. The most common error made by the participants involved incorrectly recalling brand names of medications. Other common errors included misunderstanding clinical concepts, misreporting the objective of a clinical research study and physician's findings during a patient's visit, and confusing and misspelling clinical terms. A combination of informatics support and health education is likely to improve the accuracy of lay comprehension of medical documents. Published by Elsevier Inc.

  13. Qualitative analysis scheme based on the properties of ion exchangers (in French)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Machiroux, R.; Merciny, E.; Schreiber, A.

    1973-01-01

    A systematic scheme of qualitative analysis of some cations is presented. For didactic purposes the properties of cationic and anionic ion exchangers were used. At the present time, this scheme is limited to 23 ions, including Sr. (auth)

  14. A supervised 'lesion-enhancement' filter by use of a massive-training artificial neural network (MTANN) in computer-aided diagnosis (CAD).

    PubMed

    Suzuki, Kenji

    2009-09-21

    Computer-aided diagnosis (CAD) has been an active area of study in medical image analysis. A filter for the enhancement of lesions plays an important role in improving the sensitivity and specificity in CAD schemes. The filter enhances objects similar to a model employed in the filter; e.g. a blob-enhancement filter based on the Hessian matrix enhances sphere-like objects. Actual lesions, however, often differ from a simple model; e.g. a lung nodule is generally modeled as a solid sphere, but there are nodules of various shapes and with internal inhomogeneities such as a nodule with spiculations and ground-glass opacity. Thus, conventional filters often fail to enhance actual lesions. Our purpose in this study was to develop a supervised filter for the enhancement of actual lesions (as opposed to a lesion model) by use of a massive-training artificial neural network (MTANN) in a CAD scheme for detection of lung nodules in CT. The MTANN filter was trained with actual nodules in CT images to enhance actual patterns of nodules. By use of the MTANN filter, the sensitivity and specificity of our CAD scheme were improved substantially. With a database of 69 lung cancers, nodule candidate detection by the MTANN filter achieved a 97% sensitivity with 6.7 false positives (FPs) per section, whereas nodule candidate detection by a difference-image technique achieved a 96% sensitivity with 19.3 FPs per section. Classification-MTANNs were applied for further reduction of the FPs. The classification-MTANNs removed 60% of the FPs with a loss of one true positive, thus achieving a 96% sensitivity with 2.7 FPs per section. Overall, with our CAD scheme based on the MTANN filter and classification-MTANNs, an 84% sensitivity with 0.5 FPs per section was achieved.
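
    The reported figures are internally consistent: removing 60% of the 6.7 FPs per section left by candidate detection yields the stated 2.7.

```python
# Sanity check of the reported false-positive (FP) reduction step.
fps_before = 6.7                      # FPs per section after MTANN filtering
fps_after = fps_before * (1 - 0.60)   # classification-MTANNs remove 60%
print(round(fps_after, 1))  # -> 2.7
```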

  15. A robust anonymous biometric-based authenticated key agreement scheme for multi-server environments

    PubMed Central

    Huang, Yuanfei; Ma, Fangchao

    2017-01-01

    In order to improve the security in remote authentication systems, numerous biometric-based authentication schemes using smart cards have been proposed. Recently, Moon et al. presented an authentication scheme to remedy the flaws of Lu et al.’s scheme, and claimed that their improved protocol supports the required security properties. Unfortunately, we found that Moon et al.’s scheme still has weaknesses. In this paper, we show that Moon et al.’s scheme is vulnerable to insider attack, server spoofing attack, user impersonation attack and guessing attack. Furthermore, we propose a robust anonymous multi-server authentication scheme using public key encryption to remove the aforementioned problems. From the subsequent formal and informal security analysis, we demonstrate that our proposed scheme provides strong mutual authentication and satisfies the desirable security requirements. The functional and performance analysis shows that the improved scheme has the best secure functionality and is computationally efficient. PMID:29121050

  16. A robust anonymous biometric-based authenticated key agreement scheme for multi-server environments.

    PubMed

    Guo, Hua; Wang, Pei; Zhang, Xiyong; Huang, Yuanfei; Ma, Fangchao

    2017-01-01

    In order to improve the security in remote authentication systems, numerous biometric-based authentication schemes using smart cards have been proposed. Recently, Moon et al. presented an authentication scheme to remedy the flaws of Lu et al.'s scheme, and claimed that their improved protocol supports the required security properties. Unfortunately, we found that Moon et al.'s scheme still has weaknesses. In this paper, we show that Moon et al.'s scheme is vulnerable to insider attack, server spoofing attack, user impersonation attack and guessing attack. Furthermore, we propose a robust anonymous multi-server authentication scheme using public key encryption to remove the aforementioned problems. From the subsequent formal and informal security analysis, we demonstrate that our proposed scheme provides strong mutual authentication and satisfies the desirable security requirements. The functional and performance analysis shows that the improved scheme has the best secure functionality and is computationally efficient.
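
    The paper's scheme combines biometrics, smart cards and public-key encryption; as a minimal, generic illustration of mutual authentication alone, the sketch below has each party prove knowledge of a shared key over fresh nonces, with role tags preventing a reflected response from being replayed as the other party's proof (all names are hypothetical, and this is not the paper's protocol).

```python
import hmac, hashlib, secrets

def prove(key: bytes, nonce: bytes, role: bytes) -> bytes:
    # Proof of key knowledge, bound to a fresh nonce and a role tag.
    return hmac.new(key, role + nonce, hashlib.sha256).digest()

def mutual_auth(shared_key: bytes) -> bool:
    n_user = secrets.token_bytes(16)     # user's challenge to the server
    n_server = secrets.token_bytes(16)   # server's challenge to the user
    server_proof = prove(shared_key, n_user, b"server")
    user_proof = prove(shared_key, n_server, b"user")
    # Each side verifies the other's proof in constant time.
    ok_server = hmac.compare_digest(server_proof,
                                    prove(shared_key, n_user, b"server"))
    ok_user = hmac.compare_digest(user_proof,
                                  prove(shared_key, n_server, b"user"))
    return ok_server and ok_user

print(mutual_auth(b"shared secret"))  # -> True
```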

  17. Knowledge and attitude of civil servants in Osun state, Southwestern Nigeria towards the national health insurance.

    PubMed

    Olugbenga-Bello, A I; Adebimpe, W O

    2010-12-01

    In Nigeria, inequity and poor accessibility to quality health care have been persistent problems. This study aimed to determine the knowledge and attitude of civil servants in Osun state towards the National Health Insurance Scheme (NHIS). This is a descriptive, cross-sectional study of 380 civil servants in the employment of the Osun state government, using a multi-stage sampling method. The research instrument was a pre-coded, semi-structured, self-administered questionnaire. About 60% were aware of out-of-pocket payment as the most prevalent form of health care financing, while 40% were aware of the NHIS; television and billboards were their main sources of awareness. However, none had good knowledge of the components of the NHIS, 26.7% knew about its objectives, and 30% knew who ideally should benefit from the scheme. Personal spending still accounts for as much as 74.7% of health care spending among respondents, yet respondents believed that this does not cover all their health needs. Only 0.3% had so far benefited from the NHIS, while 199 (52.5%) of respondents agreed to participate in the scheme. A significant association exists between willingness to participate in the NHIS and awareness of health care financing options and of the NHIS itself (P < 0.05). Poor knowledge of the objectives and mechanism of operation of the NHIS characterised the civil servants under study. The poor knowledge of the scheme's components and the fair attitude towards joining it observed in this study could be improved upon if stakeholders in the scheme carried out adequate awareness seminars targeted at civil servants.

  18. Proceedings of the IFIP WG 11.3 Working Conference on Database Security (6th) Held in Vancouver, British Columbia on 19-22 August 1992.

    DTIC Science & Technology

    1992-01-01

    Excerpts from the proceedings: "...multiversioning scheme for this purpose was presented in [9]. The scheme guarantees that high level methods would read down object states at lower levels that..."; "...order given by fork-stamp, and terminated writing versions with timestamp WStamp. Such a history is needed to implement the multiversioning scheme..."; "...recovery protocol for multiversion schedulers and show that this protocol is both correct and secure. The behavior of the recovery protocol depends..."

  19. Performance of Clinical Laboratories in South African Parasitology Proficiency Testing Surveys between 2004 and 2010

    PubMed Central

    Dini, Leigh; Frean, John

    2012-01-01

    Performance in proficiency testing (PT) schemes is an objective measure of a laboratory's best performance. We examined the performance of participants in two parasitology PT schemes in South Africa from 2004 through 2010. The average rates of acceptable scores over the period were 58% and 66% for the stool and blood parasite schemes, respectively. In our setting, participation in PT alone is insufficient to improve performance; a policy that provides additional resources and training seems necessary. PMID:22814470

  20. Aerodynamic optimization by simultaneously updating flow variables and design parameters with application to advanced propeller designs

    NASA Technical Reports Server (NTRS)

    Rizk, Magdi H.

    1988-01-01

    A scheme is developed for solving constrained optimization problems in which the objective function and the constraint function are dependent on the solution of the nonlinear flow equations. The scheme updates the design parameter iterative solutions and the flow variable iterative solutions simultaneously. It is applied to an advanced propeller design problem with the Euler equations used as the flow governing equations. The scheme's accuracy, efficiency and sensitivity to the computational parameters are tested.
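
    The simultaneous-update idea can be seen on a toy problem (entirely hypothetical, not the propeller design problem or the Euler equations): rather than fully converging the state equation before each design step, one fixed-point sweep of the state u (toward its equation u = cos p) is interleaved with one gradient step on the design parameter p for the objective (u - 0.5)^2.

```python
import math

def simultaneous_update(p=0.5, u=0.0, steps=200, tau=0.1):
    for _ in range(steps):
        u += tau * (math.cos(p) - u)               # partial "flow" update
        dJ_dp = 2.0 * (u - 0.5) * (-math.sin(p))   # chain rule via u = cos(p)
        p -= tau * dJ_dp                           # partial design update
    return p, u

p, u = simultaneous_update()
print(round(u, 3))  # u has converged to the target value 0.5
```

    Neither iteration is run to convergence on its own; the two converge together, which is exactly the efficiency argument made for such schemes.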

  1. Coding visual features extracted from video sequences.

    PubMed

    Baroffio, Luca; Cesana, Matteo; Redondi, Alessandro; Tagliasacchi, Marco; Tubaro, Stefano

    2014-05-01

    Visual features are successfully exploited in several applications (e.g., visual search, object recognition and tracking, etc.) due to their ability to efficiently represent image content. Several visual analysis tasks require features to be transmitted over a bandwidth-limited network, thus calling for coding techniques to reduce the required bit budget, while attaining a target level of efficiency. In this paper, we propose, for the first time, a coding architecture designed for local features (e.g., SIFT, SURF) extracted from video sequences. To achieve high coding efficiency, we exploit both spatial and temporal redundancy by means of intraframe and interframe coding modes. In addition, we propose a coding mode decision based on rate-distortion optimization. The proposed coding scheme can be conveniently adopted to implement the analyze-then-compress (ATC) paradigm in the context of visual sensor networks. That is, sets of visual features are extracted from video frames, encoded at remote nodes, and finally transmitted to a central controller that performs visual analysis. This is in contrast to the traditional compress-then-analyze (CTA) paradigm, in which video sequences acquired at a node are compressed and then sent to a central unit for further processing. In this paper, we compare these coding paradigms using metrics that are routinely adopted to evaluate the suitability of visual features in the context of content-based retrieval, object recognition, and tracking. Experimental results demonstrate that, thanks to the significant coding gains achieved by the proposed coding scheme, ATC outperforms CTA with respect to all evaluation metrics.
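
    The rate-distortion mode decision reduces to minimizing a Lagrangian cost J = D + λR over the available coding modes; the sketch below uses hypothetical (distortion, rate) pairs purely to show the mechanics.

```python
# Rate-distortion optimized mode decision: pick the mode minimizing
# J = D + lambda * R. The (D, R) numbers here are hypothetical.

def best_mode(modes, lam):
    """modes: dict name -> (distortion, rate_bits)."""
    return min(modes, key=lambda m: modes[m][0] + lam * modes[m][1])

modes = {
    "intra": (4.0, 100.0),  # hypothetical: cheaper in bits, more distortion
    "inter": (1.0, 150.0),  # hypothetical: more bits, less distortion
}
print(best_mode(modes, lam=0.01))  # -> 'inter' (rate is cheap)
print(best_mode(modes, lam=0.1))   # -> 'intra' (rate is expensive)
```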

  2. Generating spatially optimized habitat in a trade-off between social optimality and budget efficiency.

    PubMed

    Drechsler, Martin

    2017-02-01

    Auctions have been proposed as alternatives to payments for environmental services when spatial interactions and costs are better known to landowners than to the conservation agency (asymmetric information). Recently, an auction scheme was proposed that delivers optimal conservation in the sense that social welfare is maximized. I examined the social welfare and the budget efficiency delivered by this scheme, where social welfare represents the difference between the monetized ecological benefit and the conservation cost incurred to the landowners and budget efficiency is defined as maximizing the ecological benefit for a given conservation budget. For the analysis, I considered a stylized landscape with land patches that can be used for agriculture or conservation. The ecological benefit was measured by an objective function that increases with increasing number and spatial aggregation of conserved land patches. I compared the social welfare and the budget efficiency of the auction scheme with an agglomeration payment, a policy scheme that considers spatial interactions and that was proposed recently. The auction delivered a higher level of social welfare than the agglomeration payment. However, the agglomeration payment was more budget-efficient than the auction, so the comparative performances of the 2 schemes depended on the chosen policy criterion: social welfare or budget efficiency. Both policy criteria are relevant for conservation. Which one should be chosen depends on the problem at hand, for example, whether social preferences should be taken into account in the decision of how much money to invest in conservation or whether the available conservation budget is strictly limited. © 2016 Society for Conservation Biology.
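
    The stylized benefit function can be sketched as follows (weights are hypothetical): it grows with the number of conserved patches and with the count of conserved 4-neighbour pairs, so clustered conservation scores higher than scattered conservation of the same area.

```python
# Ecological benefit increasing with patch count and spatial aggregation
# (adjacency). alpha and beta are hypothetical weights.

def benefit(conserved, alpha=1.0, beta=0.5):
    """conserved: set of (row, col) grid cells under conservation."""
    n = len(conserved)
    adjacency = sum(1 for (r, c) in conserved if (r, c + 1) in conserved)
    adjacency += sum(1 for (r, c) in conserved if (r + 1, c) in conserved)
    return alpha * n + beta * adjacency

scattered = {(0, 0), (2, 2), (4, 4)}
clustered = {(0, 0), (0, 1), (1, 0)}
print(benefit(scattered), benefit(clustered))  # -> 3.0 4.0
```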

  3. Cryptosporidium genotyping in Europe: The current status and processes for a harmonised multi-locus genotyping scheme.

    PubMed

    Chalmers, Rachel M; Pérez-Cordón, Gregorio; Cacció, Simone M; Klotz, Christian; Robertson, Lucy J

    2018-06-13

    Due to the occurrence of genetic recombination, a reliable and discriminatory method to genotype Cryptosporidium isolates at the intra-species level requires the analysis of multiple loci, but a standardised scheme is not currently available. A workshop was held at the Robert Koch Institute, Berlin in 2016 that gathered 23 scientists with appropriate expertise (in either Cryptosporidium genotyping and/or surveillance, epidemiology or outbreaks) to discuss the processes for the development of a robust, standardised, multi-locus genotyping (MLG) scheme and propose an approach. The background evidence and main conclusions were outlined in a previously published report; the objectives of this further report are to describe 1) the current use of Cryptosporidium genotyping, 2) the elicitation and synthesis of the participants' opinions, and 3) the agreed processes and criteria for the development, evaluation and validation of a standardised MLG scheme for Cryptosporidium surveillance and outbreak investigations. Cryptosporidium was characterised to the species level in 7/12 (58%) participating European countries, mostly for human outbreak investigations. Further genotyping was mostly by sequencing the gp60 gene. A ranking exercise of performance and convenience criteria found that portability, biological robustness, typeability, and discriminatory power were considered by participants as the most important attributes in developing a multilocus scheme. The major barrier to implementation was lack of funding. A structured process for marker identification, evaluation, validation, implementation, and maintenance was proposed and outlined for application to Cryptosporidium, with prioritisation of Cryptosporidium parvum to support investigation of transmission in Europe. Copyright © 2018 Elsevier Inc. All rights reserved.

  4. Organizing and Typing Persistent Objects Within an Object-Oriented Framework

    NASA Technical Reports Server (NTRS)

    Madany, Peter W.; Campbell, Roy H.

    1991-01-01

    Conventional operating systems provide little or no direct support for the services required for an efficient persistent object system implementation. We have built a persistent object scheme using a customization and extension of an object-oriented operating system called Choices. Choices includes a framework for the storage of persistent data that is suited to the construction of both conventional file system and persistent object system. In this paper we describe three areas in which persistent object support differs from file system support: storage organization, storage management, and typing. Persistent object systems must support various sizes of objects efficiently. Customizable containers, which are themselves persistent objects and can be nested, support a wide range of object sizes in Choices. Collections of persistent objects that are accessed as an aggregate and collections of light-weight persistent objects can be clustered in containers that are nested within containers for larger objects. Automated garbage collection schemes are added to storage management and have a major impact on persistent object applications. The Choices persistent object store provides extensible sets of persistent object types. The store contains not only the data for persistent objects but also the names of the classes to which they belong and the code for the operation of the classes. Besides presenting persistent object storage organization, storage management, and typing, this paper discusses how persistent objects are named and used within the Choices persistent data/file system framework.

  5. Load Balancing Using Time Series Analysis for Soft Real Time Systems with Statistically Periodic Loads

    NASA Technical Reports Server (NTRS)

    Hailperin, M.

    1993-01-01

    This thesis provides design and analysis of techniques for global load balancing on ensemble architectures running soft-real-time object-oriented applications with statistically periodic loads. It focuses on estimating the instantaneous average load over all the processing elements. The major contribution is the use of explicit stochastic process models for both the loading and the averaging itself. These models are exploited via statistical time-series analysis and Bayesian inference to provide improved average load estimates, and thus to facilitate global load balancing. This thesis explains the distributed algorithms used and provides some optimality results. It also describes the algorithms' implementation and gives performance results from simulation. These results show that the author's techniques allow more accurate estimation of the global system loading, resulting in fewer object migrations than local methods. The author's method is shown to provide superior performance, relative not only to static load-balancing schemes but also to many adaptive load-balancing methods. Results from a preliminary analysis of another system and from simulation with a synthetic load provide some evidence of more general applicability.
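
    As a much simpler stand-in for the thesis's stochastic-process models, plain exponential smoothing already illustrates how filtering noisy load samples damps transients in the average-load estimate:

```python
# Exponential smoothing of per-node load samples (illustrative only; the
# thesis uses richer time-series and Bayesian models).

def smoothed_load(samples, alpha=0.3):
    estimate = samples[0]
    history = [estimate]
    for s in samples[1:]:
        estimate = alpha * s + (1 - alpha) * estimate
        history.append(estimate)
    return history

samples = [10, 12, 50, 11, 9, 10]   # one transient spike at t=2
est = smoothed_load(samples)
print(round(est[-1], 2))  # -> 14.2, the spike is heavily damped
```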

  6. A Bayesian network coding scheme for annotating biomedical information presented to genetic counseling clients.

    PubMed

    Green, Nancy

    2005-04-01

    We developed a Bayesian network coding scheme for annotating biomedical content in layperson-oriented clinical genetics documents. The coding scheme supports the representation of probabilistic and causal relationships among concepts in this domain, at a high enough level of abstraction to capture commonalities among genetic processes and their relationship to health. We are using the coding scheme to annotate a corpus of genetic counseling patient letters as part of the requirements analysis and knowledge acquisition phase of a natural language generation project. This paper describes the coding scheme and presents an evaluation of intercoder reliability for its tag set. In addition to giving examples of use of the coding scheme for analysis of discourse and linguistic features in this genre, we suggest other uses for it in analysis of layperson-oriented text and dialogue in medical communication.

  7. Security enhanced multi-factor biometric authentication scheme using bio-hash function.

    PubMed

    Choi, Younsung; Lee, Youngsook; Moon, Jongho; Won, Dongho

    2017-01-01

    With the rapid development of personal information and wireless communication technology, user authentication schemes have been crucial to ensure that wireless communications are secure. As such, various authentication schemes with multi-factor authentication have been proposed to improve the security of electronic communications. Multi-factor authentication involves the use of passwords, smart cards, and various biometrics to provide users with the utmost privacy and data protection. Cao and Ge analyzed various authentication schemes and found that Younghwa An's scheme was susceptible to a replay attack where an adversary masquerades as a legal server and a user masquerading attack where user anonymity is not provided, allowing an adversary to execute a password change process by intercepting the user's ID during login. Cao and Ge improved upon Younghwa An's scheme, but various security problems remained. This study demonstrates that Cao and Ge's scheme is susceptible to a biometric recognition error, slow wrong password detection, off-line password attack, user impersonation attack, ID guessing attack, a DoS attack, and that their scheme cannot provide session key agreement. Then, to address all weaknesses identified in Cao and Ge's scheme, this study proposes a security enhanced multi-factor biometric authentication scheme and provides a security analysis and formal analysis using Burrows-Abadi-Needham logic. Finally, the efficiency analysis reveals that the proposed scheme can protect against several possible types of attacks with only a slightly high computational cost.

  8. Empirical and modeled synoptic cloud climatology of the Arctic Ocean

    NASA Technical Reports Server (NTRS)

    Barry, R. G.; Newell, J. P.; Schweiger, A.; Crane, R. G.

    1986-01-01

    A set of cloud cover data was developed for the Arctic during the climatically important spring/early summer transition months. In parallel with the determination of mean monthly cloud conditions, data for different synoptic pressure patterns were also composited as a means of evaluating the role of synoptic variability in Arctic cloud regimes. In order to carry out this analysis, a synoptic classification scheme was developed for the Arctic using an objective typing procedure. A second major objective was to analyze model output of pressure fields and cloud parameters from a control run of the Goddard Institute for Space Studies climate model for the same area and to intercompare the synoptic climatology of the model with that based on the observational data.
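
    Objective synoptic typing procedures commonly assign each daily pressure field to the key pattern with which it correlates most strongly; the sketch below shows that correlation-based step (a generic illustration with invented patterns, not necessarily the authors' exact procedure).

```python
# Correlation-based synoptic typing: classify a gridded pressure field
# (flattened to a list) by its highest Pearson correlation with key types.

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

def classify(field, key_patterns):
    return max(key_patterns, key=lambda k: pearson(field, key_patterns[k]))

# Hypothetical key patterns on a tiny 4-point "grid".
keys = {"zonal": [1, 2, 3, 4], "meridional": [4, 3, 2, 1]}
print(classify([1.1, 1.9, 3.2, 3.8], keys))  # -> 'zonal'
```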

  9. Application of multi-agent coordination methods to the design of space debris mitigation tours

    NASA Astrophysics Data System (ADS)

    Stuart, Jeffrey; Howell, Kathleen; Wilson, Roby

    2016-04-01

    The growth in the number of defunct and fragmented objects near the Earth poses a growing hazard to launch operations as well as to existing on-orbit assets. Numerous studies have demonstrated the positive impact of active debris mitigation campaigns on the growth of debris populations, but comparatively few investigations incorporate specific mission scenarios. Furthermore, while many active mitigation methods have been proposed, certain classes of debris objects are amenable to mitigation campaigns employing chaser spacecraft with existing chemical and low-thrust propulsive technologies. This investigation incorporates an ant colony optimization routing algorithm and multi-agent coordination via auctions into a debris mitigation tour scheme suitable for preliminary mission design and analysis as well as spacecraft flight operations.
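
    Auction-based coordination can be sketched as a sequential greedy assignment: each debris target goes to the chaser submitting the lowest bid, with a load term so busy chasers bid higher (all costs and names are hypothetical; the study's ant colony routing component is not shown).

```python
# Sequential auction sketch for chaser/target assignment (illustrative).

def sequential_auction(costs):
    """costs[chaser][target] -> bid cost; returns target -> winning chaser."""
    assignment = {}
    load = {c: 0 for c in costs}            # crude load-balancing term
    targets = sorted(next(iter(costs.values())))
    for t in targets:
        winner = min(costs, key=lambda c: costs[c][t] + load[c])
        assignment[t] = winner
        load[winner] += 1                   # busier chasers bid higher
    return assignment

costs = {
    "chaser_A": {"debris_1": 1.0, "debris_2": 5.0},
    "chaser_B": {"debris_1": 4.0, "debris_2": 2.0},
}
print(sequential_auction(costs))
# -> {'debris_1': 'chaser_A', 'debris_2': 'chaser_B'}
```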

  10. Multi-objective optimization for conjunctive water use using coupled hydrogeological and agronomic models: a case study in Heihe mid-reach (China)

    NASA Astrophysics Data System (ADS)

    LI, Y.; Kinzelbach, W.; Pedrazzini, G.

    2017-12-01

    Groundwater is a vital resource for buffering unexpected drought risk in agricultural production, yet it is prone to unsustainable exploitation because of its open-access character and a much-underestimated marginal cost. Groundwater management shares the wicked-problem character of water resource management in general, and the fact that the resource stays hidden below the surface further amplifies the difficulty. China has faced this challenge in recent decades, particularly in the north, where irrigated agriculture is concentrated despite far scarcer surface water than in the south. Farmers have therefore increasingly exploited groundwater as an alternative in order to meet China's food self-sufficiency requirements and to support fast socio-economic development. In this work, we study the Heihe mid-reach in northern China, one of several regions showing symptoms of unsustainable groundwater use, such as a large drawdown of the groundwater table in some irrigation districts and soil salinization due to phreatic evaporation in others. We focus on solving a multi-objective optimization problem of conjunctive water use in order to find an alternative management scheme that fits decision makers' preferences. The methodology starts with a global sensitivity analysis to determine the most influential decision variables. A state-of-the-art multi-objective evolutionary algorithm (MOEA) is then employed to search the high-dimensional Pareto front. The aquifer system is simulated with a distributed MODFLOW model, which captures the main phenomena of interest. Results show that the current water allocation scheme exploits the water resources inefficiently: both areas with depression cones and areas with salinization or groundwater table rise can be mitigated with an alternative management scheme. When uncertain boundary conditions reflecting future climate change are assumed, the optimal solutions yield better economic productivity by reducing the opportunity cost under unexpected drought conditions.
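    The core MOEA building block, extracting the non-dominated (Pareto) set from a pool of candidate objective vectors, can be sketched in a few lines (minimization convention; the function name is illustrative, not from the paper):

    ```python
    def pareto_front(points):
        """Return the non-dominated subset of a list of objective vectors
        (minimization): p dominates q if p <= q componentwise and p < q somewhere."""
        def dominates(p, q):
            return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))
        return [p for p in points if not any(dominates(q, p) for q in points)]
    ```

    A full MOEA (e.g., NSGA-II) repeatedly applies a filter like this to rank populations; the decision makers then pick a point on the resulting front according to their preferences.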

  11. LightForce Photon-Pressure Collision Avoidance: Efficiency Assessment on an Entire Catalogue of Space Debris

    NASA Technical Reports Server (NTRS)

    Stupl, Jan Michael; Faber, Nicolas; Foster, Cyrus; Yang Yang, Fan; Levit, Creon

    2013-01-01

    The potential to perturb debris orbits using photon pressure from ground-based lasers has been confirmed by independent research teams. Two useful applications of this scheme are protecting space assets from impacts with debris and stabilizing the orbital debris environment, both relying on collision avoidance rather than de-orbiting debris. This paper presents the results of a new assessment method to analyze the efficiency of the concept for collision avoidance. Earlier research concluded that one ground-based system consisting of a 10 kW class laser, directed by a 1.5 m telescope with adaptive optics, can prevent a significant fraction of debris-debris collisions in low Earth orbit. That research used in-track displacement to measure efficiency and restricted itself to an analysis of a limited number of objects. As orbit prediction error depends on debris object properties, a static displacement threshold should be complemented with another measure of the scheme's efficiency. In this paper we present the results of an approach using probability of collision. Using a least-squares fitting method, we improve the quality of the original TLE catalogue in terms of state and co-state accuracy. We then calculate collision probabilities for all the objects in the catalogue. The conjunctions with the highest risk of collision are then engaged by a simulated network of laser ground stations. After those engagements, the perturbed orbits are used to re-assess the collision probability in a 20 minute window around the original conjunction. We then use different criteria to evaluate the utility of the laser-based collision avoidance scheme and assess the number of baseline ground stations needed to mitigate a significant number of high-probability conjunctions. Finally, we also give an account of how a laser ground station can be used for both orbit deflection and debris tracking.
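    The probability-of-collision measure can be illustrated with a hedged Monte Carlo sketch in the 2-D encounter plane, assuming a combined Gaussian position uncertainty and a combined hard-body radius. The paper's actual pipeline (TLE refinement, conjunction screening, engagement simulation) is far more involved; the names and parameters below are illustrative.

    ```python
    import random

    def collision_probability(mx, my, sx, sy, R, n=200_000, seed=1):
        """Monte Carlo estimate of 2-D encounter-plane collision probability.

        (mx, my): assumed miss distance; (sx, sy): combined position sigmas;
        R: combined hard-body radius. Counts samples falling inside the disk."""
        rng = random.Random(seed)
        hits = 0
        for _ in range(n):
            x = rng.gauss(mx, sx)
            y = rng.gauss(my, sy)
            if x * x + y * y <= R * R:
                hits += 1
        return hits / n
    ```

    For the symmetric zero-miss case the estimate can be checked against the closed form 1 - exp(-R^2 / (2*sigma^2)).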

  12. Convergence Analysis of the Graph Allen-Cahn Scheme

    DTIC Science & Technology

    2016-02-01

    Luo, Xiyang; Bertozzi, Andrea L. Graph partitioning problems have a wide range of … optimization; convergence and monotonicity are shown for a class of schemes under a graph-independent timestep restriction. We also analyze the effects of … spectral truncation, a common technique used to save computational cost. Convergence of the scheme with spectral truncation is also proved under a …

  13. An Innovative Approach to Scheme Learning Map Considering Tradeoff Multiple Objectives

    ERIC Educational Resources Information Center

    Lin, Yu-Shih; Chang, Yi-Chun; Chu, Chih-Ping

    2016-01-01

    An important issue in personalized learning is to provide learners with customized learning according to their learning characteristics. This paper focuses on scheming the learning map as follows. The learning goal can be achieved via different pathways based on alternative materials, which have the relationships of prerequisite, dependence,…

  14. The category MF in the semistable case

    NASA Astrophysics Data System (ADS)

    Faltings, G.

    2016-10-01

    The categories MF over discrete valuation rings were introduced by J. M. Fontaine as crystalline objects one might hope to associate with Galois representations. The definition was later extended to smooth base-schemes. Here we give a further extension to semistable schemes. As an application we show that certain Shimura varieties have semistable models.

  15. Catastrophic Health Expenditure and Rural Household Impoverishment in China: What Role Does the New Cooperative Health Insurance Scheme Play?

    PubMed Central

    Wu, Qunhong; Liu, Chaojie; Jiao, Mingli; Liu, Guoxiang; Hao, Yanhua; Ning, Ning

    2014-01-01

    Objective To determine whether the New Cooperative Medical Insurance Scheme (NCMS) is associated with decreased levels of catastrophic health expenditure and reduced impoverishment due to medical expenses in rural households of China. Methods An analysis of a nationally representative sample of 38,945 rural households (129,635 people) from the 2008 National Health Service Survey was performed. Logistic regression models used a binary indicator of catastrophic health expenditure as the dependent variable, with household consumption, demographic characteristics, health insurance schemes, and chronic illness as independent variables. Results A higher percentage of households experiencing catastrophic health expenditure and medical impoverishment correlated with increased health care need, while households of higher socio-economic status had levels of catastrophic health expenditure similar to those of the lowest. Households covered by the NCMS had similar levels of catastrophic health expenditure and medical impoverishment as those without health insurance. Conclusion Despite coverage of over 95%, the NCMS has failed to prevent catastrophic health expenditure and medical impoverishment. An upgrade of benefit packages is needed, and effective cost-control mechanisms on the provider side need to be considered. PMID:24714605

  16. Sunspot Pattern Classification using PCA and Neural Networks (Poster)

    NASA Technical Reports Server (NTRS)

    Rajkumar, T.; Thompson, D. E.; Slater, G. L.

    2005-01-01

    The sunspot classification scheme presented in this paper is considered as a 2-D classification problem on archived datasets, and is not a real-time system. As a first step, it mirrors the Zuerich/McIntosh historical classification system and reproduces classification of sunspot patterns based on preprocessing and neural-net training datasets. Ultimately, the project intends to move from more rudimentary schemes to spatial-temporal-spectral classes derived by correlating spatial and temporal variations in various wavelengths with the brightness fluctuation spectrum of the sun in those wavelengths. Once the approach is generalized, the focus will naturally move from a 2-D to an n-D classification, where "n" includes time and frequency. Here, the 2-D perspective refers both to the actual SOHO Michelson Doppler Imager (MDI) images that are processed and to the fact that a 2-D matrix is created from each image during preprocessing. The 2-D matrix is the result of running Principal Component Analysis (PCA) over the selected dataset images, and the resulting matrices and their eigenvalues are the objects that are stored in a database, classified, and compared. These matrices are indexed according to the standard McIntosh classification scheme.
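    The PCA preprocessing step can be sketched as follows, assuming flattened images and NumPy; the function name and the choice of k components are illustrative, not taken from the paper.

    ```python
    import numpy as np

    def pca_features(images, k=2):
        """Project flattened images onto their k leading principal components.

        Returns per-image coefficient vectors and the leading covariance
        eigenvalues; objects like these would be stored, classified, and compared."""
        X = np.asarray(images, dtype=float).reshape(len(images), -1)
        Xc = X - X.mean(axis=0)                    # center each pixel across images
        U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
        eigvals = S ** 2 / (len(images) - 1)       # covariance eigenvalues
        coeffs = Xc @ Vt[:k].T                     # k-dimensional feature per image
        return coeffs, eigvals[:k]
    ```

    Classification then reduces to comparing these low-dimensional coefficient vectors, e.g., by nearest neighbor or by feeding them to a neural network.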

  17. Benefits of incorporating the adaptive dynamic range optimization amplification scheme into an assistive listening device for people with mild or moderate hearing loss.

    PubMed

    Chang, Hung-Yue; Luo, Ching-Hsing; Lo, Tun-Shin; Chen, Hsiao-Chuan; Huang, Kuo-You; Liao, Wen-Huei; Su, Mao-Chang; Liu, Shu-Yu; Wang, Nan-Mai

    2017-08-28

    This study investigated whether a self-designed assistive listening device (ALD) that incorporates an adaptive dynamic range optimization (ADRO) amplification strategy can surpass a commercially available monaurally worn linear ALD, the SM100. Both subjective and objective measurements were implemented: Mandarin Hearing-In-Noise Test (MHINT) scores were the objective measurement, whereas participant satisfaction was the subjective measurement. The comparison was performed in a mixed design (i.e., hearing status mild or moderate, quiet versus noisy, and linear versus ADRO scheme). The participants were two groups of hearing-impaired subjects, nine with mild and eight with moderate hearing loss. The results of the ADRO system revealed a significant difference in the MHINT sentence reception threshold (SRT) in noisy environments between monaurally aided and unaided conditions, whereas the linear system did not. The benchmark results showed that the ADRO scheme is effectively beneficial to people with mild or moderate hearing loss in noisy environments. The satisfaction ratings indicated that the participants were satisfied with the overall speech quality of both the ADRO and linear schemes in quiet environments, and that they were more satisfied with ADRO than with the linear scheme in noisy environments.

  18. Multimedia content description framework

    NASA Technical Reports Server (NTRS)

    Bergman, Lawrence David (Inventor); Mohan, Rakesh (Inventor); Li, Chung-Sheng (Inventor); Smith, John Richard (Inventor); Kim, Michelle Yoonk Yung (Inventor)

    2003-01-01

    A framework is provided for describing multimedia content and a system in which a plurality of multimedia storage devices employing the content description methods of the present invention can interoperate. In accordance with one form of the present invention, the content description framework is a description scheme (DS) for describing streams or aggregations of multimedia objects, which may comprise audio, images, video, text, time series, and various other modalities. This description scheme can accommodate an essentially limitless number of descriptors in terms of features, semantics, or metadata, and facilitates content-based search, indexing, and retrieval, among other capabilities, for both streamed and aggregated multimedia objects.

  19. An efficient fully unsupervised video object segmentation scheme using an adaptive neural-network classifier architecture.

    PubMed

    Doulamis, A; Doulamis, N; Ntalianis, K; Kollias, S

    2003-01-01

    In this paper, an unsupervised video object (VO) segmentation and tracking algorithm is proposed based on an adaptable neural-network architecture. The proposed scheme comprises: 1) a VO tracking module and 2) an initial VO estimation module. Object tracking is handled as a classification problem and implemented through an adaptive network classifier, which provides better results compared to conventional motion-based tracking algorithms. Network adaptation is accomplished through an efficient and cost-effective weight updating algorithm, providing minimum degradation of the previous network knowledge and taking into account the current content conditions. A retraining set is constructed and used for this purpose based on initial VO estimation results. Two different scenarios are investigated. The first concerns extraction of human entities in video conferencing applications, while the second exploits depth information to identify generic VOs in stereoscopic video sequences. Human face/body detection based on Gaussian distributions is accomplished in the first scenario, while segmentation fusion is obtained using color and depth information in the second scenario. A decision mechanism is also incorporated to detect time instances for weight updating. Experimental results and comparisons indicate the good performance of the proposed scheme even in sequences with complicated content (object bending, occlusion).

  20. Medium-range, objective predictions of thunderstorm location and severity for aviation

    NASA Technical Reports Server (NTRS)

    Wilson, G. S.; Turner, R. E.

    1981-01-01

    This paper presents a computerized technique for medium-range (12-48h) prediction of both the location and severity of thunderstorms utilizing atmospheric predictions from the National Meteorological Center's limited-area fine-mesh model (LFM). A regional-scale analysis scheme is first used to examine the spatial and temporal distributions of forecasted variables associated with the structure and dynamics of mesoscale systems over an area of approximately 10 to the 6th sq km. The final prediction of thunderstorm location and severity is based upon an objective combination of these regionally analyzed variables. Medium-range thunderstorm predictions are presented for the late afternoon period of April 10, 1979, the day of the Wichita Falls, Texas tornado. Conventional medium-range thunderstorm forecasts, made from observed data, are presented with the case study to demonstrate the possible application of this objective technique in improving 12-48 h thunderstorm forecasts for aviation.

  1. Communication and Sensorimotor Functioning in Children with Autism.

    ERIC Educational Resources Information Center

    Abrahamsen, Eileen P.; Mitchell, Jennifer R.

    1990-01-01

    Sensorimotor functioning in 10 autistic children, age 3-7, was assessed on object permanence, means-end, causality, vocal and gestural imitation, the construction of objects in space, and schemes for relating objects. The number and diversity of pragmatic functions in the children's communication were also analyzed and related to sensorimotor…

  2. Gas Evolution Dynamics in Godunov-Type Schemes and Analysis of Numerical Shock Instability

    NASA Technical Reports Server (NTRS)

    Xu, Kun

    1999-01-01

    In this paper we study the gas evolution dynamics of exact and approximate Riemann solvers, e.g., the Flux Vector Splitting (FVS) and Flux Difference Splitting (FDS) schemes. Since the FVS scheme and the Kinetic Flux Vector Splitting (KFVS) scheme have the same physical mechanism and similar flux functions, the weaknesses and advantages of the FVS scheme are closely observed based on the analysis of the discretized KFVS scheme. The subtle dissipative mechanism of the Godunov method in the 2D case is also analyzed, and the physical reason for shock instability, i.e., carbuncle phenomena and odd-even decoupling, is presented.

  3. Security analysis and enhancements of an effective biometric-based remote user authentication scheme using smart cards.

    PubMed

    An, Younghwa

    2012-01-01

    Recently, many biometrics-based user authentication schemes using smart cards have been proposed to improve the security weaknesses in user authentication systems. In 2011, Das proposed an efficient biometric-based remote user authentication scheme using smart cards that can provide strong authentication and mutual authentication. In this paper, we analyze the security of Das's authentication scheme and show that it is still insecure against various attacks. We also propose an enhanced scheme that removes these security problems, even if the secret information stored in the smart card is revealed to an attacker. The security analysis shows that the enhanced scheme is secure against the user impersonation attack, the server masquerading attack, the password guessing attack, and the insider attack, and provides mutual authentication between the user and the server.

  4. Security Analysis and Enhancements of an Effective Biometric-Based Remote User Authentication Scheme Using Smart Cards

    PubMed Central

    An, Younghwa

    2012-01-01

    Recently, many biometrics-based user authentication schemes using smart cards have been proposed to improve the security weaknesses in user authentication systems. In 2011, Das proposed an efficient biometric-based remote user authentication scheme using smart cards that can provide strong authentication and mutual authentication. In this paper, we analyze the security of Das's authentication scheme and show that it is still insecure against various attacks. We also propose an enhanced scheme that removes these security problems, even if the secret information stored in the smart card is revealed to an attacker. The security analysis shows that the enhanced scheme is secure against the user impersonation attack, the server masquerading attack, the password guessing attack, and the insider attack, and provides mutual authentication between the user and the server. PMID:22899887

  5. TH-EF-207A-03: Photon Counting Implementation Challenges Using An Electron Multiplying Charged-Coupled Device Based Micro-CT System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Podgorsak, A; Bednarek, D; Rudin, S

    2016-06-15

    Purpose: To successfully implement and operate a photon counting scheme on an electron multiplying charged-coupled device (EMCCD) based micro-CT system. Methods: We built an EMCCD based micro-CT system and implemented a photon counting scheme. EMCCD detectors use avalanche transfer registries to multiply the input signal far above the readout noise floor. Due to intrinsic differences in the pixel array, using a global threshold for photon counting is not optimal. To address this shortcoming, we generated a threshold array based on sixty dark fields (no x-ray exposure). We calculated an average matrix and a variance matrix of the dark field sequence. The average matrix was used for the offset correction while the variance matrix was used to set individual pixel thresholds for the photon counting scheme. Three hundred photon counting frames were added for each projection and 360 projections were acquired for each object. The system was used to scan various objects, followed by reconstruction using an FDK algorithm. Results: Examination of the projection images and reconstructed slices of the objects indicated clear interior detail free of beam hardening artifacts. This suggests successful implementation of the photon counting scheme on our EMCCD based micro-CT system. Conclusion: This work indicates that it is possible to implement and operate a photon counting scheme on an EMCCD based micro-CT system, suggesting that these devices might be able to operate at very low x-ray exposures in a photon counting mode. Such devices could have future implications in clinical CT protocols. NIH Grant R01EB002873; Toshiba Medical Systems Corp.
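    The dark-field calibration described in the Methods can be sketched as follows. This is an assumption-laden illustration, not the authors' code: the mean map provides the offset correction, and a per-pixel threshold is derived from the per-pixel standard deviation (the n_sigma multiplier is an assumed choice).

    ```python
    import numpy as np

    def build_maps(dark_frames, n_sigma=3.0):
        """From a stack of dark frames, build an offset (mean) map and a
        per-pixel counting threshold from the pixel-wise noise spread."""
        stack = np.asarray(dark_frames, dtype=float)
        offset = stack.mean(axis=0)                 # per-pixel offset correction
        thresh = n_sigma * stack.std(axis=0)        # per-pixel counting threshold
        return offset, thresh

    def count_photons(frame, offset, thresh):
        """Count a photon wherever the offset-corrected frame exceeds its
        local threshold; summing such binary maps gives a projection."""
        return ((np.asarray(frame, dtype=float) - offset) > thresh).astype(int)
    ```

    Summing three hundred of these binary frames per view, as the abstract describes, would then yield one photon-counted projection.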

  6. A depictive neural model for the representation of motion verbs.

    PubMed

    Rao, Sunil; Aleksander, Igor

    2011-11-01

    In this paper, we present a depictive neural model for the representation of motion verb semantics in neural models of visual awareness. The problem of modelling motion verb representation is shown to be one of function application, mapping a set of given input variables defining the moving object and the path of motion to a defined outcome in the motion recognition context. The particular function-applicative implementation and consequent recognition model design presented are seen as arising from a noun-adjective recognition model enabling the recognition of colour adjectives as applied to a set of shapes representing objects to be recognised. The presence of such a function application scheme and a separately implemented position identification and path labelling scheme are accordingly shown to be the primitives required to enable the design and construction of a composite depictive motion verb recognition scheme. Extensions to the presented design to enable the representation of transitive verbs are also discussed.

  7. Long-distance thermal temporal ghost imaging over optical fibers

    NASA Astrophysics Data System (ADS)

    Yao, Xin; Zhang, Wei; Li, Hao; You, Lixing; Wang, Zhen; Huang, Yidong

    2018-02-01

    A thermal ghost imaging scheme between two distant parties is proposed and experimentally demonstrated over long-distance optical fibers. In the scheme, weak thermal light is split into two paths. Photons in one path are spatially diffused according to their frequencies by a spatial dispersion component, then illuminate the object and record its spatial transmission information. Photons in the other path are temporally diffused by a temporal dispersion component. By coincidence measurement between photons of the two paths, the object can be imaged in the manner of ghost imaging, based on the frequency correlation between photons in the two paths. In the experiment, the weak thermal light source is prepared by spontaneous four-wave mixing in a silicon waveguide. The temporal dispersion is introduced by 50 km of single-mode fiber, which can also be regarded as a fiber link. Experimental results show that this scheme can be realized over long-distance optical fibers.

  8. Improvement on Gabor order tracking and objective comparison with Vold Kalman filtering order tracking

    NASA Astrophysics Data System (ADS)

    Pan, Min-Chun; Liao, Shiu-Wei; Chiu, Chun-Chin

    2007-02-01

    The waveform-reconstruction schemes of order tracking (OT), such as the Gabor and the Vold-Kalman filtering (VKF) techniques, can extract specific order and/or spectral components in addition to characterizing the processed signal in the rpm-frequency domain. The study first improves the Gabor OT (GOT) technique to handle the order-crossing problem, and then objectively compares the features of the improved GOT scheme and the angular-displacement VKF OT technique. It is numerically observed that the improved method performs less accurately than the VKF_OT scheme at crossing occurrences, but without end effects in the reconstructed waveform. As OT is not an exact science, it may well be that the decrease in computation time justifies the reduced accuracy. The characterisation and discrimination of riding noise with crossing orders emitted by an electric scooter are presented as an application example.

  9. Content-based unconstrained color logo and trademark retrieval with color edge gradient co-occurrence histograms

    NASA Astrophysics Data System (ADS)

    Phan, Raymond; Androutsos, Dimitrios

    2008-01-01

    In this paper, we present a logo and trademark retrieval system for unconstrained color image databases that extends the Color Edge Co-occurrence Histogram (CECH) object detection scheme. We introduce more accurate information to the CECH, by virtue of incorporating color edge detection using vector order statistics. This produces a more accurate representation of edges in color images, in comparison to the simple color pixel difference classification of edges as seen in the CECH. Our proposed method is thus reliant on edge gradient information, and as such, we call this the Color Edge Gradient Co-occurrence Histogram (CEGCH). We use this as the main mechanism for our unconstrained color logo and trademark retrieval scheme. Results illustrate that the proposed retrieval system retrieves logos and trademarks with good accuracy, and outperforms the CECH object detection scheme with higher precision and recall.
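    The co-occurrence idea underlying the CECH/CEGCH can be sketched generically: count how often quantized labels co-occur at a fixed pixel displacement. The binning of color-edge gradients into integer labels is assumed done upstream; the function name, parameters, and displacement convention are illustrative, not the paper's exact definitions.

    ```python
    import numpy as np

    def cooccurrence_histogram(labels, n_bins, dx=1, dy=0):
        """Normalized co-occurrence histogram of an integer label image at
        displacement (dx, dy): entry (a, b) counts pixel pairs where label a
        has label b at the displaced position."""
        L = np.asarray(labels)
        h, w = L.shape
        a = L[max(0, -dy):h - max(0, dy), max(0, -dx):w - max(0, dx)]
        b = L[max(0, dy):h - max(0, -dy), max(0, dx):w - max(0, -dx)]
        hist = np.zeros((n_bins, n_bins), dtype=float)
        np.add.at(hist, (a.ravel(), b.ravel()), 1.0)   # unbuffered accumulation
        return hist / hist.sum()
    ```

    Retrieval then compares query and database histograms, e.g., by histogram intersection, accumulated over several displacements.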

  10. Data Management Systems (DMS): Complex data types study. Volume 1: Appendices A-B. Volume 2: Appendices C1-C5. Volume 3: Appendices D1-D3 and E

    NASA Technical Reports Server (NTRS)

    Leibfried, T. F., Jr.; Davari, Sadegh; Natarajan, Swami; Zhao, Wei

    1992-01-01

    Two categories were chosen for study. The first was the use of a preprocessor on the Ada code of application programs that interface with the Run-Time Object Data Base Standard Services (RODB STSV); the intent was to catch and correct any mis-registration errors by the programmer between the user-declared objects, their types, their addresses, and the corresponding RODB definitions. The second covered RODB STSV performance issues and the identification of problems with the planned methods for accessing primitive object attributes; this included the study of an alternate storage scheme to the 'store objects by attribute' scheme in the current RODB design. The study resulted in essentially three separate documents: an interpretation of the system requirements, an assessment of the preliminary design, and a detailing of the components of a detailed design.

  11. Security enhanced multi-factor biometric authentication scheme using bio-hash function

    PubMed Central

    Lee, Youngsook; Moon, Jongho

    2017-01-01

    With the rapid development of personal information and wireless communication technology, user authentication schemes have been crucial to ensure that wireless communications are secure. As such, various authentication schemes with multi-factor authentication have been proposed to improve the security of electronic communications. Multi-factor authentication involves the use of passwords, smart cards, and various biometrics to provide users with the utmost privacy and data protection. Cao and Ge analyzed various authentication schemes and found that Younghwa An's scheme was susceptible to a replay attack, where an adversary masquerades as a legal server, and a user masquerading attack, where user anonymity is not provided, allowing an adversary to execute a password change process by intercepting the user's ID during login. Cao and Ge improved upon Younghwa An's scheme, but various security problems remained. This study demonstrates that Cao and Ge's scheme is susceptible to a biometric recognition error, slow wrong-password detection, an off-line password attack, a user impersonation attack, an ID guessing attack, and a DoS attack, and that their scheme cannot provide session key agreement. Then, to address all weaknesses identified in Cao and Ge's scheme, this study proposes a security-enhanced multi-factor biometric authentication scheme and provides a security analysis and formal analysis using Burrows-Abadi-Needham logic. Finally, the efficiency analysis reveals that the proposed scheme can protect against several possible types of attacks with only a slightly higher computational cost. PMID:28459867

  12. Optimal rotated staggered-grid finite-difference schemes for elastic wave modeling in TTI media

    NASA Astrophysics Data System (ADS)

    Yang, Lei; Yan, Hongyong; Liu, Hong

    2015-11-01

    The rotated staggered-grid finite-difference (RSFD) method is an effective approach for numerical modeling to study the wavefield characteristics in tilted transversely isotropic (TTI) media. But it suffers from serious numerical dispersion, which directly affects the modeling accuracy. In this paper, we propose two different optimal RSFD schemes, based on the sampling approximation (SA) method and the least-squares (LS) method respectively, to overcome this problem. We first briefly introduce the RSFD theory, from which we derive the SA-based RSFD scheme and the LS-based RSFD scheme. Different forms of analysis are then used to compare the SA-based and LS-based RSFD schemes with the conventional RSFD scheme, which is based on the Taylor-series expansion (TE) method. The numerical accuracy analysis verifies the greater accuracy of the two proposed optimal schemes, and indicates that they can effectively widen the accurate wavenumber range compared with the TE-based RSFD scheme. Further comparisons between the two optimal schemes show that at small wavenumbers the SA-based RSFD scheme performs better, while at large wavenumbers the LS-based RSFD scheme leads to a smaller error. Finally, the modeling results demonstrate that for the same operator length, the SA-based and LS-based RSFD schemes can achieve greater accuracy than the TE-based RSFD scheme, while for the same accuracy, the optimal schemes can adopt shorter difference operators to save computing time.
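    The LS-based idea, choosing staggered-grid coefficients so the discrete spectral response tracks the exact one over a target wavenumber band, can be sketched as a sampled linear least-squares problem. The band limit b, the sampling, and the response convention (2*sum_m c_m*sin((m-1/2)*beta) should match beta, with beta = k*h) are illustrative assumptions, not the paper's exact formulation.

    ```python
    import numpy as np

    def ls_fd_coefficients(M, b=2.0, n_samples=400):
        """Fit M staggered-grid first-derivative coefficients c_m so that
        2*sum_m c_m*sin((m-1/2)*beta) ~= beta over 0 < beta <= b."""
        beta = np.linspace(b / n_samples, b, n_samples)
        # design matrix: one column per coefficient, one row per sampled wavenumber
        A = np.stack([2.0 * np.sin((m - 0.5) * beta) for m in range(1, M + 1)], axis=1)
        c, *_ = np.linalg.lstsq(A, beta, rcond=None)
        return c
    ```

    For a narrow band the single-coefficient fit recovers the classical second-order value of 1, while wider bands trade a small low-wavenumber error for accuracy near the band edge.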

  13. Forcing scheme analysis for the axisymmetric lattice Boltzmann method under incompressible limit.

    PubMed

    Zhang, Liangqi; Yang, Shiliang; Zeng, Zhong; Chen, Jie; Yin, Linmao; Chew, Jia Wei

    2017-04-01

    Because the standard lattice Boltzmann (LB) method is formulated for the Cartesian Navier-Stokes (NS) equations, additional source terms are necessary in the axisymmetric LB method to represent the axisymmetric effects. Therefore, the accuracy and applicability of axisymmetric LB models depend on the forcing schemes adopted for discretization of the source terms. In this study, three forcing schemes, namely, the trapezium-rule based scheme, the direct forcing scheme, and the semi-implicit centered scheme, are analyzed theoretically by investigating their derived macroscopic equations in the diffusive scale. In particular, the finite-difference interpretation of the standard LB method is extended to LB equations with source terms, and the accuracy of the different forcing schemes is evaluated for the axisymmetric LB method. Theoretical analysis indicates that the discrete lattice effects arising from the direct forcing scheme are part of the truncation error terms and thus would not affect the overall accuracy of the standard LB method with a general force term (i.e., only the source terms in the momentum equation are considered), but lead to incorrect macroscopic equations for the axisymmetric LB models. On the other hand, the trapezium-rule based scheme and the semi-implicit centered scheme both have the advantage of avoiding the discrete lattice effects and recovering the correct macroscopic equations. Numerical tests applied to validate the theoretical analysis show that both the numerical stability and the accuracy of the axisymmetric LB simulations are affected by the direct forcing scheme, indicating that forcing schemes free of the discrete lattice effects are necessary for the axisymmetric LB method.

  14. What's in a Name? A Comparison of Methods for Classifying Predominant Type of Maltreatment

    ERIC Educational Resources Information Center

    Lau, A.S.; Leeb, R.T.; English, D.; Graham, J.C.; Briggs, E.C.; Brody, K.E.; Marshall, J.M.

    2005-01-01

    Objective: The primary aim of the study was to identify a classification scheme for determining the predominant type of maltreatment in a child's history that best predicts differences in developmental outcomes. Method: Three different predominant-type classification schemes were examined in a sample of 519 children with a history of alleged…

  15. Hospital Coding Practice, Data Quality, and DRG-Based Reimbursement under the Thai Universal Coverage Scheme

    ERIC Educational Resources Information Center

    Pongpirul, Krit

    2011-01-01

    In the Thai Universal Coverage scheme, hospital providers are paid for their inpatient care using Diagnosis Related Group (DRG) reimbursement. Questionable quality of the submitted DRG codes has been of concern whereas knowledge about hospital coding practice has been lacking. The objectives of this thesis are (1) To explore hospital coding…

  16. An Optimally Stable and Accurate Second-Order SSP Runge-Kutta IMEX Scheme for Atmospheric Applications

    NASA Astrophysics Data System (ADS)

    Rokhzadi, Arman; Mohammadian, Abdolmajid; Charron, Martin

    2018-01-01

    The objective of this paper is to develop an optimized implicit-explicit (IMEX) Runge-Kutta scheme for atmospheric applications, focusing on stability and accuracy. Following the common terminology, the proposed method is called IMEX-SSP2(2,3,2), as it has second-order accuracy and is composed of diagonally implicit two-stage and explicit three-stage parts. This scheme enjoys the Strong Stability Preserving (SSP) property for both parts. The new scheme is applied to the nonhydrostatic compressible Boussinesq equations in two different arrangements: (i) semi-implicit and (ii) Horizontally Explicit-Vertically Implicit (HEVI) forms. The new scheme preserves the SSP property for larger regions of absolute monotonicity compared to the well-studied scheme in the same class. In addition, numerical tests confirm that IMEX-SSP2(2,3,2) improves the maximum stable time step as well as the level of accuracy and computational cost compared to other schemes in the same class. It is demonstrated that the A-stability property, together with the "second-stage order" and stiffly accurate conditions, leads the proposed scheme to better performance than existing schemes for the applications examined herein.
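
    The implicit-explicit splitting underlying such schemes can be illustrated with a minimal first-order sketch: treat the stiff linear term implicitly (backward Euler) and the remaining term explicitly (forward Euler). This shows only the splitting idea, not the paper's IMEX-SSP2(2,3,2) tableau; the function names are illustrative.

    ```python
    # Minimal first-order IMEX sketch for dy/dt = N(y) + L*y, where the stiff
    # linear part L*y is treated implicitly and N(y) explicitly. This is an
    # illustration of IMEX splitting only, NOT the IMEX-SSP2(2,3,2) scheme.

    def imex_euler_step(y, dt, L, N):
        """One step: solve y_new = y + dt*N(y) + dt*L*y_new for y_new."""
        return (y + dt * N(y)) / (1.0 - dt * L)

    def integrate(y0, dt, steps, L, N):
        y = y0
        for _ in range(steps):
            y = imex_euler_step(y, dt, L, N)
        return y
    ```

    Even for a very stiff decay rate (e.g., L = -1000 with dt = 0.01, far outside the explicit Euler stability region), the implicit treatment of L keeps the step stable.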

  17. Maternal healthcare financing: Gujarat's Chiranjeevi Scheme and its beneficiaries.

    PubMed

    Bhat, Ramesh; Mavalankar, Dileep V; Singh, Prabal V; Singh, Neelu

    2009-04-01

    Maternal mortality is an important public-health issue in India, specifically in Gujarat. Contributing factors are the Government's inability to operationalize the First Referral Units and to provide an adequate level of skilled birth attendants, especially to the poor. In response, the Gujarat state has developed a unique public-private partnership called the Chiranjeevi Scheme. This scheme focuses on institutional delivery, specifically emergency obstetric care for the poor. The objective of the study was to explore the targeting of the scheme, its coverage, and socioeconomic profile of the beneficiaries and to assess financial protection offered by the scheme, if any, in Dahod, one of the initial pilot districts of Gujarat. A household-level survey of beneficiaries (n=262) and non-users (n=394) indicated that the scheme is well-targeted to the poor but many poor people do not use the services. The beneficiaries saved more than Rs 3000 (US$ 75) in delivery-related expenses and were generally satisfied with the scheme. The study provided insights on how to improve the scheme further. Such a financing scheme could be replicated in other states and countries to address the cost barrier, especially in areas where high numbers of private specialists are available.

  18. Computational process to study the wave propagation in a non-linear medium by quasi-linearization

    NASA Astrophysics Data System (ADS)

    Sharath Babu, K.; Venkata Brammam, J.; Baby Rani, CH

    2018-03-01

    When two objects having distinct velocities come into contact, an impact can occur. In an impact study, i.e., the study of the displacement of the objects after the impact, the impact force is a function of time t and behaves similarly to a compressive force. The impact duration is very short, so impulses are generated and, subsequently, high stresses arise. In this work we examine the wave propagation inside the object after collision and measure the object's non-linear behavior in the one-dimensional case. Wave transmission is studied by means of the material's acoustic parameter value. The objective of this paper is to present a computational study of propagating pulse and harmonic waves in non-linear media using quasi-linearization, subsequently applying the central difference scheme. This study focuses on longitudinal, one-dimensional wave propagation. In the finite difference scheme, the non-linear system is reduced to a linear system by applying the quasi-linearization method. The computed results exhibit good agreement with the selected non-linear wave propagation.
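
    As a hedged illustration of the central-difference machinery described above, the sketch below advances the linear 1-D wave equation u_tt = c^2 u_xx, the kind of linearized problem each quasi-linearization iteration reduces to; the non-linear medium itself is not modeled here.

    ```python
    import numpy as np

    # Illustrative three-level central-difference update for the linear 1-D
    # wave equation u_tt = c^2 u_xx (the abstract's non-linear medium and the
    # quasi-linearization outer loop are not reproduced in this sketch).

    def wave_step(u_prev, u_curr, c, dx, dt):
        """Advance one time level with the standard central-difference scheme."""
        r2 = (c * dt / dx) ** 2          # squared CFL number; need c*dt/dx <= 1
        u_next = np.empty_like(u_curr)
        u_next[1:-1] = (2 * u_curr[1:-1] - u_prev[1:-1]
                        + r2 * (u_curr[2:] - 2 * u_curr[1:-1] + u_curr[:-2]))
        u_next[0] = u_next[-1] = 0.0     # fixed (Dirichlet) ends
        return u_next
    ```

    For a standing wave sin(pi x) on [0, 1] with zero initial velocity, the exact solution sin(pi x) cos(pi t) is reproduced closely at CFL 0.5, and the amplitude stays bounded.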

  19. Approach for scene reconstruction from the analysis of a triplet of still images

    NASA Astrophysics Data System (ADS)

    Lechat, Patrick; Le Mestre, Gwenaelle; Pele, Danielle

    1997-03-01

    Three-dimensional modeling of a scene from the automatic analysis of 2D image sequences is a major challenge for future interactive audiovisual services based on 3D content manipulation, such as virtual visits, 3D teleconferencing, and interactive television. We propose a scheme that computes 3D object models from stereo analysis of image triplets shot by calibrated cameras. After matching the different views with a correlation-based algorithm, a depth map referring to a given view is built using a fusion criterion that takes into account depth coherency, visibility constraints, and correlation scores. Because luminance segmentation helps to compute accurate object borders and to detect and improve unreliable depth values, a two-step segmentation algorithm using both the depth map and the gray-level image is applied to extract the object masks. First, an edge detection segments the luminance image into regions, and a multimodal thresholding method selects depth classes from the depth map. Then the regions are merged and labeled with the different depth-class numbers using a coherence test on depth values, according to the rate of reliable and dominant depth values and the size of the regions. The structures of the segmented objects are obtained with a constrained Delaunay triangulation followed by a refining stage. Finally, texture mapping is performed using Open Inventor or VRML 1.0 tools.

  20. An analysis of hydrogen production via closed-cycle schemes [thermochemical processing from water]

    NASA Technical Reports Server (NTRS)

    Chao, R. E.; Cox, K. E.

    1975-01-01

    A thermodynamic analysis and state-of-the-art review of three basic schemes for the production of hydrogen from water, namely electrolysis, thermal water-splitting, and multi-step thermochemical closed cycles, are presented. Criteria for work-saving thermochemical closed-cycle processes are established, and several schemes are reviewed in light of these criteria. An economic analysis is also presented in the context of energy costs.

  1. An enhanced biometric authentication scheme for telecare medicine information systems with nonce using chaotic hash function.

    PubMed

    Das, Ashok Kumar; Goswami, Adrijit

    2014-06-01

    Recently, Awasthi and Srivastava proposed a novel biometric remote user authentication scheme for the telecare medicine information system (TMIS) with nonce. Their scheme is very efficient, as it is based on an efficient chaotic one-way hash function and bitwise XOR operations. In this paper, we first analyze Awasthi-Srivastava's scheme and then show that it has several drawbacks: (1) an incorrect password change phase, (2) failure to preserve the user anonymity property, (3) failure to establish a secret session key between a legal user and the server, (4) failure to protect against strong replay attacks, and (5) a lack of rigorous formal security analysis. We then propose a novel and secure biometric-based remote user authentication scheme in order to withstand the security flaws found in Awasthi-Srivastava's scheme and enhance the features required for an ideal user authentication scheme. Through rigorous informal and formal security analysis, we show that our scheme is secure against possible known attacks. In addition, we simulate our scheme for formal security verification using the widely accepted AVISPA (Automated Validation of Internet Security Protocols and Applications) tool and show that it is secure against passive and active attacks, including replay and man-in-the-middle attacks. Our scheme is also efficient compared to Awasthi-Srivastava's scheme.
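
    The nonce-based challenge-response idea that such schemes build on can be sketched as follows. This is a toy illustration, not the protocol from either paper: SHA-256 stands in for the chaotic one-way hash, and the identifiers (register_user, make_login, verify_login) are invented for the example.

    ```python
    import hashlib
    import os

    # Toy nonce-based challenge-response sketch. SHA-256 is a stand-in for the
    # chaotic one-way hash of the schemes discussed above; all names here are
    # illustrative, not taken from the papers.

    def h(*parts: bytes) -> bytes:
        return hashlib.sha256(b"|".join(parts)).digest()

    def register_user(password: str):
        salt = os.urandom(16)
        return {"salt": salt, "verifier": h(salt, password.encode())}

    def make_login(password: str, record, nonce: bytes) -> bytes:
        # Client proves knowledge of the password bound to the server's nonce.
        return h(h(record["salt"], password.encode()), nonce)

    def verify_login(record, nonce: bytes, proof: bytes) -> bool:
        return proof == h(record["verifier"], nonce)
    ```

    Because the proof is bound to a fresh server nonce, replaying an old proof under a new nonce fails, which is the property the replay-attack analysis above is concerned with.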

  2. A Microbial Assessment Scheme to measure microbial performance of Food Safety Management Systems.

    PubMed

    Jacxsens, L; Kussaga, J; Luning, P A; Van der Spiegel, M; Devlieghere, F; Uyttendaele, M

    2009-08-31

    A Food Safety Management System (FSMS) implemented in the food processing industry is based on Good Hygienic Practices (GHP) and Hazard Analysis Critical Control Point (HACCP) principles, and should address both food safety control and assurance activities in order to guarantee food safety. One of the emerging challenges is to assess the performance of an existing FSMS. The objective of this work is to explain the development of a Microbial Assessment Scheme (MAS) as a tool for the systematic analysis of microbial counts in order to assess the current microbial performance of an implemented FSMS. It is assumed that low numbers of microorganisms and small variations in microbial counts indicate an effective FSMS. The MAS is a procedure that defines the identification of critical sampling locations, the selection of microbiological parameters, the assessment of sampling frequency, the selection of the sampling method and method of analysis, and finally data processing and interpretation. Based on the MAS assessment, microbial safety level profiles can be derived, indicating which microorganisms contribute to food safety for a specific food processing company, and to what extent. The MAS concept is illustrated with a case study in the pork processing industry, where ready-to-eat meat products (cured, cooked ham and cured, dried bacon) are produced.

  3. Coffee agroforestry for sustainability of Upper Sekampung Watershed management

    NASA Astrophysics Data System (ADS)

    Fitriani; Arifin, Bustanul; Zakaria, Wan Abbas; Hanung Ismono, R.

    2018-03-01

    The main objective of watershed management is to ensure optimal hydrological and natural resource use for ecological, social, and economic purposes. One important adaptive management step in dealing with the risk of damage to forest ecosystems is the practice of coffee agroforestry. This study aimed to (1) assess farmers' responses to ecological service responsibility and (2) analyze Sekampung watershed management through the provision of environmental services. The research location was Air Naningan sub-district, Tanggamus, Lampung Province, Indonesia, and the research was conducted from July until November 2016. Stratified random sampling based on the pattern of ownership of land rights was used to determine the respondents. Data were analyzed using descriptive statistics and logistic regression analysis. Based on the analysis, it was concluded that coffee farmers participate in coffee agroforestry in the form of shade planting (38%) and multiple cropping (62%). The logistic regression analysis indicated that the variables of experience, status of land ownership, and planned incentive size were able to explain variations in the willingness of coffee growers to follow the scheme of providing environmental services. Farmers operating under partnership and CBFM schemes, with their different land tenure arrangements in the upper Sekampung, are strategically positioned to minimize deforestation and to support the recovery of damaged watersheds.

  4. Univariate time series modeling and an application to future claims amount in SOCSO's invalidity pension scheme

    NASA Astrophysics Data System (ADS)

    Chek, Mohd Zaki Awang; Ahmad, Abu Bakar; Ridzwan, Ahmad Nur Azam Ahmad; Jelas, Imran Md.; Jamal, Nur Faezah; Ismail, Isma Liana; Zulkifli, Faiz; Noor, Syamsul Ikram Mohd

    2012-09-01

    The main objective of this study is to forecast the future claims amount of the Invalidity Pension Scheme (IPS). All data were derived from SOCSO annual reports from 1972 to 2010. These claims consist of all claim amounts from the 7 benefits offered by SOCSO: Invalidity Pension, Invalidity Grant, Survivors Pension, Constant Attendance Allowance, Rehabilitation, Funeral and Education. Predictions of future claims of the Invalidity Pension Scheme are made using univariate forecasting models, in order to predict the future claims among the workforce in Malaysia.

  5. Minimizing transient influence in WHPA delineation: An optimization approach for optimal pumping rate schemes

    NASA Astrophysics Data System (ADS)

    Rodriguez-Pretelin, A.; Nowak, W.

    2017-12-01

    For most groundwater protection management programs, Wellhead Protection Areas (WHPAs) have served as the primary protection measure. In their delineation, the influence of time-varying groundwater flow conditions is often underestimated because steady-state assumptions are commonly made. However, it has been demonstrated that temporal variations lead to significant changes in the required size and shape of WHPAs. Apart from natural transient groundwater drivers (e.g., changes in the regional angle of flow direction and seasonal natural groundwater recharge), anthropogenic causes such as transient pumping rates are among the most influential factors requiring larger WHPAs. We hypothesize that WHPA programs that integrate adaptive and optimized pumping-injection management schemes can counter transient effects and thus reduce the additional areal demand in well protection under transient conditions. The main goal of this study is to present a novel management framework that optimizes pumping schemes dynamically, in order to minimize the impact triggered by transient conditions in WHPA delineation. For optimizing pumping schemes, we consider three objectives: (1) to minimize the risk of pumping water from outside a given WHPA, (2) to maximize the groundwater supply, and (3) to minimize the involved operating costs. We solve transient groundwater flow with an available transient groundwater and Lagrangian particle tracking model, and formulate the optimization problem as a dynamic programming problem. Two different optimization approaches are explored: (I) the first aims for single-objective optimization under objective (1) only; (II) the second performs multi-objective optimization under all three objectives, where compromise pumping rates are selected from the current Pareto front. Finally, we look for WHPA outlines that are as small as possible, yet allow the optimization problem to find the most suitable solutions.
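
    The Pareto-front selection step in such a multi-objective setup can be sketched with a simple non-dominated filter. The candidate values and the dominance test below are illustrative, not the study's groundwater model: each candidate is a tuple (risk, -supply, cost), with all three objectives converted to "minimize".

    ```python
    # Hedged sketch of non-dominated (Pareto-optimal) selection among candidate
    # pumping schemes under three minimized objectives: risk, -supply, cost.
    # Candidate tuples and the brute-force filter are illustrative only.

    def dominates(a, b):
        """a dominates b if it is no worse in every objective, better in one."""
        return (all(x <= y for x, y in zip(a, b))
                and any(x < y for x, y in zip(a, b)))

    def pareto_front(candidates):
        return [c for c in candidates
                if not any(dominates(o, c) for o in candidates if o is not c)]
    ```

    A compromise pumping rate would then be picked from the returned front, e.g., by weighting the three objectives.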

  6. Proposal of a Framework for Internet Based Licensing of Learning Objects

    ERIC Educational Resources Information Center

    Santos, Osvaldo A.; Ramos, Fernando M. S.

    2004-01-01

    This paper presents a proposal of a framework whose main objective is to manage the delivery and rendering of learning objects in a digital rights controlled environment. The framework is based on a digital licensing scheme that requires each learning object to have the proper license in order to be rendered by a trusted player. A conceptual model…

  7. Single-pixel non-imaging object recognition by means of Fourier spectrum acquisition

    NASA Astrophysics Data System (ADS)

    Chen, Huichao; Shi, Jianhong; Liu, Xialin; Niu, Zhouzhou; Zeng, Guihua

    2018-04-01

    Single-pixel imaging has emerged over recent years as a novel imaging technique with significant application prospects. In this paper, we propose and experimentally demonstrate a scheme that achieves single-pixel non-imaging object recognition by acquiring the Fourier spectrum. In the experiment, four-step phase-shifting sinusoidal illumination light is used to irradiate the object image, the light intensity is measured with a single-pixel detection unit, and the Fourier coefficients of the object image are obtained by differential measurement. The Fourier coefficients are then cast into binary numbers to obtain the hash value. We propose a new perceptual hashing algorithm, combined with a discrete Fourier transform, to calculate the hash value. The hash distance is obtained by calculating the difference between the hash values of the object image and the contrast images. By setting an appropriate threshold, the object image can be quickly and accurately recognized. The proposed scheme realizes single-pixel non-imaging perceptual-hashing object recognition using fewer measurements. Our result might open a new path for realizing object recognition without imaging.
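
    A minimal sketch of a Fourier-based perceptual hash in the spirit described above: keep a low-frequency block of DFT magnitudes, binarize them against their median, and compare hashes by Hamming distance. The block size and thresholding rule are assumptions for illustration, not the paper's algorithm.

    ```python
    import numpy as np

    # Minimal Fourier-magnitude perceptual hash sketch. The 8x8 low-frequency
    # block and median threshold are illustrative choices, not the paper's.

    def phash(img, k=8):
        spec = np.abs(np.fft.fft2(img))[:k, :k]   # low-frequency magnitude block
        return (spec > np.median(spec)).flatten() # binarize against the median

    def hamming(h1, h2):
        return int(np.count_nonzero(h1 != h2))
    ```

    Because the bits compare magnitudes to their own median, the hash is invariant to a global intensity rescaling of the image, and identical images always have distance zero.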

  8. A secure user anonymity-preserving three-factor remote user authentication scheme for the telecare medicine information systems.

    PubMed

    Das, Ashok Kumar

    2015-03-01

    Recent advanced technology enables the telecare medicine information system (TMIS) for patients to gain health monitoring at home and also to access medical services over the Internet via mobile networks. Several remote user authentication schemes have been proposed in the literature for TMIS. However, most of them are either insecure against various known attacks or inefficient. Recently, Tan proposed an efficient user anonymity-preserving three-factor authentication scheme for TMIS. In this paper, we show that though Tan's scheme is efficient, it has several security drawbacks: (1) it fails to provide proper authentication during the login phase, (2) it fails to provide a correct update of a user's password and biometric during the password and biometric update phase, and (3) it fails to protect against replay attacks. In addition, Tan's scheme lacks formal security analysis and verification. Later, Arshad and Nikooghadam also pointed out some security flaws in Tan's scheme and then presented an improvement on it. However, we show that Arshad and Nikooghadam's scheme is still insecure against the privileged-insider attack via the stolen smart-card attack, and it also lacks formal security analysis and verification. In order to withstand the security loopholes found in both Tan's scheme and Arshad and Nikooghadam's scheme, we propose an effective and more secure three-factor remote user authentication scheme for TMIS. Our scheme provides the user anonymity property. Through rigorous informal and formal security analysis using random oracle models and the widely accepted AVISPA (Automated Validation of Internet Security Protocols and Applications) tool, we show that our scheme is secure against various known attacks, including replay and man-in-the-middle attacks. Furthermore, our scheme is also efficient compared to other related schemes.

  9. Improving Hydrological Simulations by Incorporating GRACE Data for Parameter Calibration

    NASA Astrophysics Data System (ADS)

    Bai, P.

    2017-12-01

    Hydrological model parameters are commonly calibrated against observed streamflow data. This calibration strategy is questionable when the modeled hydrological variables of interest are not limited to streamflow: well-performing streamflow simulations do not guarantee the reliable reproduction of other hydrological variables, in part because the model parameters are not reasonably identified. The Gravity Recovery and Climate Experiment (GRACE) satellite-derived total water storage change (TWSC) data provide an opportunity to constrain hydrological model parameterizations in combination with streamflow observations. We constructed a multi-objective calibration scheme based on GRACE-derived TWSC and streamflow observations, with the aim of improving the parameterizations of hydrological models. The multi-objective calibration scheme was compared with the traditional single-objective calibration scheme, which is based only on streamflow observations. Two monthly hydrological models were employed on 22 Chinese catchments with different hydroclimatic conditions. The model evaluation was performed using observed streamflows, GRACE-derived TWSC, and evapotranspiration (ET) estimates from flux towers and from the water balance approach. Results showed that the multi-objective calibration provided more reliable TWSC and ET simulations than the single-objective calibration, without significant deterioration in the accuracy of streamflow simulations. In addition, the improvements in TWSC and ET simulations were more significant in relatively dry catchments than in relatively wet catchments. This study highlights the importance of including additional constraints besides streamflow observations in parameter estimation to improve the performance of hydrological models.
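
    One common way to combine two calibration targets like these is to scalarize them, e.g., as a weighted sum of Nash-Sutcliffe efficiencies (NSE) on streamflow and on TWSC. The sketch below assumes equal weighting; the paper's actual objective formulation is not reproduced here.

    ```python
    import numpy as np

    # Sketch of a scalarized multi-objective calibration target: a weighted sum
    # of Nash-Sutcliffe efficiencies on streamflow (Q) and TWSC. The equal
    # default weight w=0.5 is an assumption, not taken from the study.

    def nse(sim, obs):
        sim = np.asarray(sim, dtype=float)
        obs = np.asarray(obs, dtype=float)
        return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

    def multi_objective(q_sim, q_obs, twsc_sim, twsc_obs, w=0.5):
        """Scalar score to maximize: w*NSE(Q) + (1-w)*NSE(TWSC)."""
        return w * nse(q_sim, q_obs) + (1 - w) * nse(twsc_sim, twsc_obs)
    ```

    A perfect simulation of both variables scores exactly 1; degrading either simulation lowers the combined score, which is what lets the TWSC constraint pull parameters away from streamflow-only optima.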

  10. Generating AN Optimum Treatment Plan for External Beam Radiation Therapy.

    NASA Astrophysics Data System (ADS)

    Kabus, Irwin

    1990-01-01

    The application of linear programming to the generation of an optimum external beam radiation treatment plan is investigated. MPSX, an IBM linear programming software package, was used. All data originated from the CAT scan of an actual patient who was treated for a malignant pancreatic tumor before this study began. An examination of several alternatives for representing the cross section of the patient showed that it was sufficient to use a set of strategically placed points in the vital organs and tumor, and a grid of points spaced about one half inch apart for the healthy tissue. Optimum treatment plans were generated from objective functions representing various treatment philosophies. The optimum plans were based on allowing for 216 external radiation beams, which accounted for wedges of any size. A beam reduction scheme then reduced the number of beams in the optimum plan to a number small enough for implementation. Regardless of the objective function, the linear programming treatment plan preserved about 95% of the patient's right kidney vs. 59% for the plan the hospital actually administered to the patient. The clinician on the case found most of the linear programming treatment plans to be superior to the hospital plan. An investigation was made, using parametric linear programming, into any possible benefits of generating treatment plans based on objective functions made up of convex combinations of two objective functions; however, this proved to have only limited value. This study also found, through dual-variable analysis, that there was no benefit gained from relaxing some of the constraints on the healthy regions of the anatomy. This conclusion was supported by the clinician. Finally, several schemes were found that, under certain conditions, can further reduce the number of beams in the final linear programming treatment plan.

  11. The Effect of Performance-Based Financial Incentives on Improving Health Care Provision in Burundi: A Controlled Cohort Study

    PubMed Central

    Rudasingwa, Martin; Soeters, Robert; Bossuyt, Michel

    2015-01-01

    To strengthen health care delivery, the Burundian Government, in collaboration with international NGOs, piloted performance-based financing (PBF) in 2006. The health facilities were assigned, using a simple matching method, either to begin the PBF scheme or to continue with the traditional input-based funding. Our objective was to analyse the effect of that PBF scheme on the quality of health services between 2006 and 2008. We conducted the analysis in 16 health facilities with the PBF scheme and 13 health facilities without it. We analysed the PBF effect using 58 composite quality indicators of eight health services: care management, outpatient care, maternity care, prenatal care, family planning, laboratory services, medicines management and materials management. The differences in quality improvement between the two groups of health facilities were assessed using descriptive statistics, a paired non-parametric Wilcoxon signed-rank test and a simple difference-in-difference approach at a significance level of 5%. We found an improvement in the quality of care in the PBF group and a significant deterioration in the non-PBF group in the same four health services: care management, outpatient care, maternity care, and prenatal care. The findings suggest a PBF effect of between 38 and 66 percentage points (p<0.001) in the quality scores of care management, outpatient care, prenatal care, and maternity care. We found no PBF effect on clinical support services: laboratory services, medicines management, and materials management. The PBF scheme in Burundi contributed to the improvement of the health services that were strongly under the control of medical personnel (physicians and nurses) within a short time of two years. The clinical support services that did not significantly improve were strongly under the control of laboratory technicians, pharmacists and non-medical personnel. PMID:25948432

  12. Simultaneous tensor decomposition and completion using factor priors.

    PubMed

    Chen, Yi-Lei; Hsu, Chiou-Ting; Liao, Hong-Yuan Mark

    2014-03-01

    The success of research on matrix completion is evident in a variety of real-world applications. Tensor completion, which is a high-order extension of matrix completion, has also generated a great deal of research interest in recent years. Given a tensor with incomplete entries, existing methods use either factorization or completion schemes to recover the missing parts. However, as the number of missing entries increases, factorization schemes may overfit the model because of incorrectly predefined ranks, while completion schemes may fail to interpret the model factors. In this paper, we introduce a novel concept: complete the missing entries and simultaneously capture the underlying model structure. To this end, we propose a method called simultaneous tensor decomposition and completion (STDC) that combines a rank minimization technique with Tucker model decomposition. Moreover, as the model structure is implicitly included in the Tucker model, we use factor priors, which are usually known a priori in real-world tensor objects, to characterize the underlying joint-manifold drawn from the model factors. By exploiting this auxiliary information, our method leverages two classic schemes and accurately estimates the model factors and missing entries. We conducted experiments to empirically verify the convergence of our algorithm on synthetic data and evaluate its effectiveness on various kinds of real-world data. The results demonstrate the efficacy of the proposed method and its potential usage in tensor-based applications. It also outperforms state-of-the-art methods on multilinear model analysis and visual data completion tasks.
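
    A hedged, much-simplified relative of the completion idea above is hard-impute matrix completion via truncated SVD: alternately fit a low-rank model and reimpose the observed entries. The paper's STDC method couples Tucker decomposition with factor priors; none of that structure is reproduced in this sketch.

    ```python
    import numpy as np

    # Hard-impute matrix completion sketch: iterate between a truncated-SVD
    # low-rank fit and reimposing the observed entries. This is a simplified
    # illustration of the completion idea, NOT the STDC method above.

    def complete_matrix(M, observed_mask, rank, iters=500):
        X = np.where(observed_mask, M, 0.0)       # start missing entries at 0
        for _ in range(iters):
            U, s, Vt = np.linalg.svd(X, full_matrices=False)
            low_rank = (U[:, :rank] * s[:rank]) @ Vt[:rank]  # rank-r fit
            X = np.where(observed_mask, M, low_rank)         # keep observed
        return X
    ```

    For a rank-1 matrix with a single missing entry, the iteration converges to the value consistent with the rank-1 structure, which is the behavior the "predefined rank" discussion above is about: with the wrong rank, such schemes overfit instead.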

  13. Advanced Numerical Methods and Software Approaches for Semiconductor Device Simulation

    DOE PAGES

    Carey, Graham F.; Pardhanani, A. L.; Bova, S. W.

    2000-01-01

    In this article we concisely present several modern strategies that are applicable to drift-dominated carrier transport in higher-order deterministic models such as the drift-diffusion, hydrodynamic, and quantum hydrodynamic systems. The approaches include extensions of "upwind" and artificial dissipation schemes, generalization of the traditional Scharfetter-Gummel approach, Petrov-Galerkin and streamline-upwind Petrov-Galerkin (SUPG) methods, "entropy" variables, transformations, least-squares mixed methods, and other stabilized Galerkin schemes such as Galerkin least squares and discontinuous Galerkin schemes. The treatment is representative rather than an exhaustive review, and several schemes are mentioned only briefly with appropriate reference to the literature. Some of the methods have been applied to the semiconductor device problem while others are still in the early stages of development for this class of applications. We have included numerical examples from our recent research tests with some of the methods. A second aspect of the work deals with algorithms that employ unstructured grids in conjunction with adaptive refinement strategies. The full benefits of such approaches have not yet been developed in this application area, and we emphasize the need for further work on analysis, data structures, and software to support adaptivity. Finally, we briefly consider some aspects of software frameworks. These include dial-an-operator approaches such as that used in the industrial simulator PROPHET, and object-oriented software support such as that in the Sandia National Laboratories framework SIERRA.
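
    The "upwind" idea named above can be illustrated on its simplest prototype, first-order upwinding for the model advection equation u_t + a u_x = 0 with a > 0; the actual drift-diffusion discretizations (Scharfetter-Gummel, SUPG, etc.) are considerably richer than this sketch.

    ```python
    import numpy as np

    # First-order upwind step for u_t + a u_x = 0 with a > 0: each point takes
    # its flux difference from the upstream (left) neighbor. Prototype for the
    # upwind stabilization discussed above, not a device-simulation scheme.

    def upwind_step(u, a, dx, dt):
        un = u.copy()
        un[1:] = u[1:] - a * dt / dx * (u[1:] - u[:-1])
        un[0] = u[0]                     # hold the inflow boundary value fixed
        return un
    ```

    With a CFL number a*dt/dx in (0, 1], each updated value is a convex combination of its two upstream neighbors, so the scheme is monotone: no new extrema (spurious oscillations) are created, at the cost of numerical diffusion.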

  14. Sub-national health care financing reforms in Indonesia.

    PubMed

    Sparrow, Robert; Budiyati, Sri; Yumna, Athia; Warda, Nila; Suryahadi, Asep; Bedi, Arjun S

    2017-02-01

    Indonesia has seen an emergence of local health care financing schemes over the last decade, implemented and operated by district governments. Often motivated by the local political context and characterized by a large degree of heterogeneity in scope and design, the common objective of the district schemes is to address the coverage gaps for the informal sector left by national social health insurance programs. This paper investigates the effect of these local health care financing schemes on access to health care and financial protection. Using data from a unique survey among District Health Offices, combined with data from the annual National Socioeconomic Surveys, the study is based on a fixed effects analysis for a panel of 262 districts over the period 2004-10, exploiting variation in local health financing reforms across districts in terms of type of reform and timing of implementation. Although the schemes had a modest impact on average, they do seem to have provided some contribution to closing the coverage gap, by increasing outpatient utilization for households in the middle quintiles that tend to fall just outside the target population of the national subsidized programs. However, there seems to be little effect on hospitalization or financial protection, indicating the limitations of local health care financing policies. In addition, we see effect heterogeneity across districts due to differences in design features. © The Author 2016. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  15. A Comprehensive Histological Assessment of Osteoarthritis Lesions in Mice

    PubMed Central

    McNulty, Margaret A.; Loeser, Richard F.; Davey, Cynthia; Callahan, Michael F.; Ferguson, Cristin M.; Carlson, Cathy S.

    2011-01-01

    Objective: Accurate histological assessment of osteoarthritis (OA) is critical in studies evaluating the effects of interventions on disease severity. The purpose of the present study was to develop a histological grading scheme that comprehensively and quantitatively assesses changes in multiple tissues that are associated with OA of the stifle joint in mice. Design: Two representative midcoronal sections from 158 stifle joints, including naturally occurring and surgically induced OA, were stained with H&E and Safranin-O stains. All slides were evaluated to characterize the changes present. A grading scheme that includes both measurements and semiquantitative scores was developed, and principal components analysis (PCA) was applied to the resulting data from the medial tibial plateaus. A subset of 30 tibial plateaus representing a wide range of severity was then evaluated by 4 observers. Reliability of the results was evaluated using intraclass correlation coefficients (ICCs) and area under the receiver operating characteristic (ROC) curve. Results: Five factors were retained by PCA, accounting for 74% of the total variance. Interobserver and intraobserver reproducibilities for evaluations of articular cartilage and subchondral bone were acceptable. The articular cartilage integrity and chondrocyte viability factor scores were able to distinguish severe OA from normal, minimal, mild, and moderate disease. Conclusion: This newly developed grading scheme and resulting factors characterize a range of joint changes in mouse stifle joints that are associated with OA. Overall, the newly developed scheme is reliable and reproducible, characterizes changes in multiple tissues, and provides comprehensive information regarding a specific site in the stifle joint. PMID:26069594

  16. Ethical issues raised by the introduction of payment for performance in France

    PubMed Central

    Saint-Lary, Olivier; Plu, Isabelle; Naiditch, Michel

    2012-01-01

    Context In France, a new payment for performance (P4P) scheme for primary care physicians was introduced in 2009 through the ‘Contract for Improving Individual Practice’ programme. Its objective was to reduce healthcare expenditure while improving adherence to clinical guidelines. Nevertheless, in every country where such a scheme has been implemented, it has raised concerns in the domain of professional ethics. Objective To identify the ethical tensions arising in the general practitioner (GP) profession in France following the introduction of P4P. Method Qualitative research using two focus groups: one with a sample of GPs who joined P4P and one with GPs who did not. All collective interviews were recorded and fully transcribed. An inductive thematic content analysis with construction of categories was conducted, and all data were triangulated. Results All participants agreed that conflicts of interest were a real issue, leading to a resurgence of medical dirigisme that could be detrimental to patient autonomy. GPs who did not join P4P believed the scheme would lead to patient selection, while those who joined did not. The maximal P4P bonus was considered low by all GPs; non-participating GPs regarded this as an affront, whereas for participating GPs the low level minimised the risk of patient selection. Conclusion This work identified several areas of ethical tension, some differing from those previously described in other countries. The authors discuss the potential impact of institutional contexts and the variability of implementation processes in shaping these differences. PMID:22493186

  17. Meeting the Cool Neighbors. XII. An Optically Anchored Analysis of the Near-infrared Spectra of L Dwarfs

    NASA Astrophysics Data System (ADS)

    Cruz, Kelle L.; Núñez, Alejandro; Burgasser, Adam J.; Abrahams, Ellianna; Rice, Emily L.; Reid, I. Neill; Looper, Dagny

    2018-01-01

    Discrepancies between competing optical and near-infrared (NIR) spectral typing systems for L dwarfs have motivated us to search for a classification scheme that ties the optical and NIR schemes together, and addresses complexities in the spectral morphology. We use new and extant optical and NIR spectra to compile a sample of 171 L dwarfs, including 27 low-gravity β and γ objects, with spectral coverage from 0.6–2.4 μm. We present 155 new low-resolution NIR spectra and 19 new optical spectra. We utilize a method for analyzing NIR spectra that partially removes the broad-band spectral slope and reveals similarities in the absorption features between objects of the same optical spectral type. Using the optical spectra as an anchor, we generate near-infrared spectral average templates for L0–L8, L0–L4γ, and L0–L1β type dwarfs. These templates reveal that NIR spectral morphologies are correlated with the optical types. They also show the range of spectral morphologies spanned by each spectral type. We compare low-gravity and field-gravity templates to provide recommendations on the minimum required observations for credibly classifying low-gravity spectra using low-resolution NIR data. We use the templates to evaluate the existing NIR spectral standards and propose new ones where appropriate. Finally, we build on the work of Kirkpatrick et al. to provide a spectral typing method that is tied to the optical and can be used when only H or K band data are available. The methods we present here provide resolutions to several long-standing issues with classifying L dwarf spectra and could also be the foundation for a spectral classification scheme for cloudy exoplanets.

  18. An improved approach for the segmentation of starch granules in microscopic images

    PubMed Central

    2010-01-01

    Background Starches are the main storage polysaccharides in plants and are distributed widely throughout plants, including seeds, roots, tubers, leaves and stems. Currently, microscopic observation is one of the most important ways to investigate and analyze the structure of starches. The position, shape, and size of the starch granules are the main measurements for quantitative analysis, and segmenting the granules from the background is an essential prerequisite for obtaining them. However, automatic segmentation of starch granules remains challenging because of the limitations of imaging conditions and the complexity of overlapping granules. Results We propose a novel method to segment starch granules in microscopic images. We first separate starch granules from the background by automatic thresholding and then roughly segment the image with the watershed algorithm. To reduce the oversegmentation inherent in the watershed algorithm, we use the roundness of each segment and analyze the gradient vector field to find critical points, so as to identify oversegments. Once oversegments are found, we extract features such as their position and intensity, and use fuzzy c-means clustering to merge the oversegments into objects with similar features. Experimental results demonstrate that the proposed method successfully alleviates the oversegmentation of the watershed algorithm. Conclusions We present a new scheme for starch granule segmentation that aims to alleviate watershed oversegmentation. We use shape information and critical points of the gradient vector flow (GVF) of starch granules to identify oversegments, and fuzzy c-means clustering based on prior knowledge to merge these oversegments into objects. Experimental results on twenty microscopic starch images demonstrate the effectiveness of the proposed scheme. PMID:21047380
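The roundness cue used in entry 18 to flag watershed oversegments can be sketched in a few lines; the 0.4 threshold and the edge-counting perimeter estimate are illustrative assumptions, not values from the paper:

```python
import numpy as np

def roundness(mask):
    """Roundness = 4*pi*area / perimeter**2; 1.0 for a perfect disc.

    The perimeter is approximated by counting exposed pixel edges
    (4-connectivity), which is adequate for flagging oversegments.
    """
    area = mask.sum()
    padded = np.pad(mask, 1)
    # An edge is exposed where a foreground pixel borders background.
    perim = sum(np.logical_and(padded, ~np.roll(padded, s, axis=a)).sum()
                for a, s in ((0, 1), (0, -1), (1, 1), (1, -1)))
    return 4 * np.pi * area / perim**2

def flag_oversegments(masks, threshold=0.4):
    """Return indices of segments whose roundness falls below threshold."""
    return [i for i, m in enumerate(masks) if roundness(m) < threshold]
```

A filled square scores roughly pi/4 (about 0.79), while a thin 1x20 strip scores about 0.14 and is flagged, matching the intuition that elongated fragments of a round granule are candidates for merging.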

  19. Secure anonymity-preserving password-based user authentication and session key agreement scheme for telecare medicine information systems.

    PubMed

    Sutrala, Anil Kumar; Das, Ashok Kumar; Odelu, Vanga; Wazid, Mohammad; Kumari, Saru

    2016-10-01

    Information and communication technology (ICT) has changed the entire paradigm of society. ICT enables people to use medical services over the Internet, considerably reducing travel costs, hospitalization costs and time. Recent advancements in Telecare Medicine Information Systems (TMIS) allow users/patients to access medical services over the Internet, including health monitoring at home. Amin and Biswas recently proposed an RSA-based user authentication and session key agreement protocol for TMIS, an improvement over Giri et al.'s RSA-based user authentication scheme for TMIS. In this paper, we show that although Amin-Biswas's scheme considerably improves on the security drawbacks of Giri et al.'s scheme, it still has security weaknesses: it suffers from a privileged insider attack, user impersonation attack, replay attack and offline password guessing attack. A new RSA-based user authentication scheme for TMIS is proposed, which overcomes the security pitfalls of Amin-Biswas's scheme and also preserves user anonymity. A careful formal security analysis is carried out using the widely accepted Burrows-Abadi-Needham (BAN) logic and the random oracle model, complemented by an informal security analysis; together these show the robustness of the new scheme against various known attacks, including those found in Amin-Biswas's scheme. The proposed scheme is also formally verified through simulation using the widely accepted Automated Validation of Internet Security Protocols and Applications (AVISPA) tool, and rigorous security analysis shows that it provides better security than other existing schemes. High security and additional functionality features make the proposed scheme applicable to telecare medicine information systems used in e-health care applications. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  20. Student Loans Schemes in Mauritius: Experience, Analysis and Scenarios

    ERIC Educational Resources Information Center

    Mohadeb, Praveen

    2006-01-01

    This study makes a comprehensive review of the situation of student loans schemes in Mauritius, and makes recommendations, based on best practices, for setting up a national scheme that attempts to avoid weaknesses identified in some of the loans schemes of other countries. It suggests that such a scheme would be cost-effective and beneficial both…

  1. School lunch program in India: background, objectives and components.

    PubMed

    Chutani, Alka Mohan

    2012-01-01

    The School Lunch Program in India (SLP) is the largest food and nutrition assistance program, feeding millions of children every day. This paper reviews the background of the SLP in India, earlier known as the national program for nutrition support to primary education (NP-NSPE) and later as the mid day meal scheme, including historical trends, objectives, and components/characteristics of the scheme. It also addresses steps being taken to meet the challenges faced by the program's administrators in monitoring and evaluation. The program was initially started in 1960 in a few states to address the intertwined problems of malnutrition and illiteracy. Mid Day Meal Scheme is the popular name for the school meal program. In 2001, as per Supreme Court orders, it became mandatory to provide a mid day meal to all children in primary, and later upper primary, classes of government and government-aided schools. The scheme benefitted 140 million children in government-assisted schools across India in 2008, strengthening child nutrition and literacy. In a country with a largely illiterate population and a high percentage of children unable to read or write, governmental and non-governmental organizations have reported that the mid day meal scheme has consistently increased school enrollment in India. One of the main goals of a school lunch program is to promote the health and well-being of the nation's children.

  2. A Robust and Effective Smart-Card-Based Remote User Authentication Mechanism Using Hash Function

    PubMed Central

    Odelu, Vanga; Goswami, Adrijit

    2014-01-01

    In a remote user authentication scheme, a remote server verifies whether a login user is genuine and trustworthy, and, for mutual authentication, the user likewise validates whether the remote server is genuine and trustworthy. Several remote user authentication schemes using passwords, biometrics, and smart cards have been proposed in the literature. However, most are either computationally expensive or insecure against several known attacks. In this paper, we propose a new robust and effective password-based remote user authentication scheme using a smart card. Our scheme is efficient because it uses only one-way hash functions and bitwise XOR operations. Through rigorous informal and formal security analysis, we show that our scheme is secure against possible known attacks. We perform the simulation for the formal security analysis using the widely accepted AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to ensure that our scheme is secure against passive and active attacks. Furthermore, our scheme supports an efficient password change phase that is always carried out correctly and locally, without contacting the remote server. In addition, our scheme performs significantly better than other existing schemes in terms of communication and computational overheads, security, and the features it provides. PMID:24892078
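The hash-plus-XOR design described above can be illustrated with a toy challenge-response sketch; this is not the authors' protocol, and the names and the masking construction are illustrative assumptions:

```python
import hashlib
import secrets

def h(*parts: bytes) -> bytes:
    """One-way hash (SHA-256) over the concatenation of its arguments."""
    return hashlib.sha256(b"".join(parts)).digest()

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Registration (toy): the server masks a recomputable secret with the
# user's password hash, so the smart card never stores it in the clear.
server_key = b"server-master-secret"
user_id, password = b"alice", b"correct horse battery"
shared = h(user_id, server_key)                  # server can recompute this
card_value = xor(shared, h(user_id, password))   # value stored on the card

# Login: the card unmasks the secret with the password and answers a
# fresh server challenge, using only hashing and XOR (no public-key ops).
def card_response(card_value: bytes, user_id: bytes,
                  password: bytes, nonce: bytes) -> bytes:
    shared = xor(card_value, h(user_id, password))
    return h(shared, nonce)

nonce = secrets.token_bytes(16)                  # server's fresh challenge
proof = card_response(card_value, user_id, password, nonce)
```

The server verifies `proof == h(h(user_id, server_key), nonce)`; the fresh nonce defeats replay, and a wrong password unmasks a wrong secret and hence yields a wrong proof.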

  3. A robust and effective smart-card-based remote user authentication mechanism using hash function.

    PubMed

    Das, Ashok Kumar; Odelu, Vanga; Goswami, Adrijit

    2014-01-01

    In a remote user authentication scheme, a remote server verifies whether a login user is genuine and trustworthy, and, for mutual authentication, the user likewise validates whether the remote server is genuine and trustworthy. Several remote user authentication schemes using passwords, biometrics, and smart cards have been proposed in the literature. However, most are either computationally expensive or insecure against several known attacks. In this paper, we propose a new robust and effective password-based remote user authentication scheme using a smart card. Our scheme is efficient because it uses only one-way hash functions and bitwise XOR operations. Through rigorous informal and formal security analysis, we show that our scheme is secure against possible known attacks. We perform the simulation for the formal security analysis using the widely accepted AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to ensure that our scheme is secure against passive and active attacks. Furthermore, our scheme supports an efficient password change phase that is always carried out correctly and locally, without contacting the remote server. In addition, our scheme performs significantly better than other existing schemes in terms of communication and computational overheads, security, and the features it provides.

  4. Gaining qualitative insight into the subjective experiences of adherers to an exercise referral scheme: A thematic analysis.

    PubMed

    Eynon, Michael John; O'Donnell, Christopher; Williams, Lynn

    2016-07-01

    Nine adults who had completed an exercise referral scheme participated in a semi-structured interview to uncover the key psychological factors associated with adherence to the scheme. Through thematic analysis, exercise identity emerged as a major factor associated with adherence, formed from a number of underpinning constructs including changes in self-esteem, self-efficacy and self-regulatory strategies. An additional theme, transitions in motivation to exercise, was also identified, showing participants' motivation shifting from extrinsic to intrinsic reasons to exercise over the course of the scheme.

  5. Cryptanalysis and Enhancement of Anonymity Preserving Remote User Mutual Authentication and Session Key Agreement Scheme for E-Health Care Systems.

    PubMed

    Amin, Ruhul; Islam, S K Hafizul; Biswas, G P; Khan, Muhammad Khurram; Li, Xiong

    2015-11-01

    E-health care systems employ IT infrastructure to maximize the utilization of health care resources and to provide flexible opportunities to remote patients. Transmission of medical data over public networks is therefore necessary, and patient authentication with secure data transmission is a critical issue. Although several user authentication schemes for accessing remote services are available, security analysis shows that none of them is free from relevant security attacks. We reviewed Das et al.'s scheme and demonstrated that it fails to protect user anonymity and lacks proper protection against several attacks, including off-line password guessing, smart card theft, user impersonation, server impersonation and session key disclosure. To overcome these pitfalls, this paper proposes an anonymity-preserving remote patient authentication scheme usable in E-health care systems. We validated the security of the proposed scheme using BAN logic, which ensures secure mutual authentication and session key agreement. We also present experimental results obtained with the AVISPA software, which confirm that our scheme is secure under the OFMC and CL-AtSe models. Resilience against relevant security attacks has been proved through both formal and informal security analysis. A performance analysis and comparison with other schemes show that the proposed scheme overcomes the security drawbacks of Das et al.'s scheme and additionally achieves extra security requirements.

  6. Adaptive Intuitionistic Fuzzy Enhancement of Brain Tumor MR Images

    NASA Astrophysics Data System (ADS)

    Deng, He; Deng, Wankai; Sun, Xianping; Ye, Chaohui; Zhou, Xin

    2016-10-01

    Image enhancement techniques can improve the contrast and visual quality of magnetic resonance (MR) images. However, conventional methods cannot compensate for the deficiencies of individual brain tumor MR imaging modes. In this paper, we propose an adaptive intuitionistic fuzzy sets-based scheme, called AIFE, which takes information provided by different MR acquisitions and enhances the normal and abnormal structural regions of the brain while displaying the enhanced result as a single image. The AIFE scheme first separates an input image into several sub-images, then divides each sub-image into object and background areas. Different novel fuzzification, hyperbolization and defuzzification operations are then applied to each object/background area, and an enhanced result is finally obtained via nonlinear fusion operators. The fuzzy operations can be processed in parallel. Experiments on real data demonstrate that the AIFE scheme not only effectively fuses information from images acquired with different MR sequences into a single image, but also achieves better enhancement performance than conventional baseline algorithms. This indicates that the proposed AIFE scheme has potential for improving the detection and diagnosis of brain tumors.

  7. Holographic memory system based on projection recording of computer-generated 1D Fourier holograms.

    PubMed

    Betin, A Yu; Bobrinev, V I; Donchenko, S S; Odinokov, S B; Evtikhiev, N N; Starikov, R S; Starikov, S N; Zlokazov, E Yu

    2014-10-01

    Computer generation of holographic structures significantly simplifies the optical scheme used to record microholograms in a holographic memory system, and digital holographic synthesis makes it possible to account for the nonlinear errors of the recording system and thereby improve microhologram quality. Multiplexed recording of holograms is a widespread technique for increasing data density. In this article we present a holographic memory system based on digital synthesis of amplitude one-dimensional (1D) Fourier transform holograms and their multiplexed recording onto the holographic carrier using an optical projection scheme. 1D Fourier transform holograms are very sensitive to the orientation of the anamorphic optical element (a cylindrical lens) required for reconstruction of the encoded data object. Multiplexed recording of several holograms with different orientations in an optical projection scheme allowed the data object to be reconstructed from each hologram by rotating the cylindrical lens through the corresponding angle. We also discuss two optical schemes for reading out the recorded holograms, a full-page readout system and a line-by-line readout system, consider the benefits of each, and present results of experimental modeling of non-multiplexed and multiplexed recording and reconstruction of 1D Fourier holograms.

  8. Method for Stereo Mapping Based on Objectarx and Pipeline Technology

    NASA Astrophysics Data System (ADS)

    Liu, F.; Chen, T.; Lin, Z.; Yang, Y.

    2012-07-01

    Stereo mapping is an important way to acquire 4D products. Building on developments in stereo mapping and on the characteristics of ObjectARX and pipeline technology, a new stereo mapping scheme that realizes interaction between AutoCAD and a digital photogrammetry system is presented. An experiment was carried out to verify its feasibility using the software MAP-AT (Modern Aerial Photogrammetry Automatic Triangulation); the experimental results show that the scheme is feasible and of significant value for integrating data acquisition and editing.

  9. A New UK 2006 National Kidney Allocation Scheme for deceased heart-beating donor kidneys.

    PubMed

    Johnson, Rachel J; Fuggle, Susan V; Mumford, Lisa; Bradley, J Andrew; Forsythe, John L R; Rudge, Chris J

    2010-02-27

    In 2004, it was agreed that a new allocation scheme for kidneys from deceased heart-beating donors was required in the United Kingdom to address observed inequities in access to transplant. The 2006 National Kidney Allocation Scheme (2006 NKAS) was developed to meet agreed objectives and preparatory work included a review of the criteria for human leukocyte antigen (HLA) matching and simulation evidence about the effectiveness of alternative schemes. ALGORITHM FOR 2006 NKAS: The 2006 NKAS gives absolute priority to all 000 HLA-A, -B, -DR-mismatched patients and well-matched pediatric patients (<18 years), and then a points score defines priorities for allocation with waiting time being most influential. Points for age and HLA mismatch are linked in a novel approach to ensure well-matched transplants for younger patients while recognizing that HLA matching is less important for older patients as retransplantation is less likely to be required. To improve equity for difficult to match patients, rare HLA specificities were defaulted to more common, related specificities. IMPACT OF 2006 NKAS: After 3 years, the scheme is already making good progress in achieving its objectives, with overall results similar to those observed in the simulations. There has been a significant benefit for patients waiting more than 5 years for transplant. A number of other advantages of the scheme are also apparent with equity of access improving in many respects, including the achievement of equity of access to transplant for HLA-DR homozygous patients, but geographical inequity of access will take a number of years to address fully.

  10. State-of-the-art for food taxes to promote public health.

    PubMed

    Jensen, J D; Smed, S

    2018-05-01

    The use of taxes to promote healthy nutritional behaviour has gained ground in the past decade. The present paper reviews existing applications of fiscal instruments in nutrition policy and derives some perspectives and recommendations from the experiences gained with these instruments. Many countries in different parts of the world have experiences with the taxation of sugar-sweetened beverages, in some cases in combination with taxes on unhealthy food commodities such as confectionery or high-fat foods. These tax schemes have many similarities, but also differ in their definitions of tax objects and in the applied tax rates. Denmark has been the only country in the world to operate a tax on saturated fat content in foods, from 2011 to 2012. Most of the existing food tax schemes have been introduced from fiscal motivations, with health promotion as a secondary objective, but a few have been introduced with health promotion as the primary objective. The diversity in experiences from existing tax schemes can provide valuable insights for future use of fiscal instruments to promote healthy nutrition, in terms of designing effective and efficient tax or subsidy instruments, and in terms of smooth and politically viable implementation of the instruments.

  11. Detection scheme for a partially occluded pedestrian based on occluded depth in lidar-radar sensor fusion

    NASA Astrophysics Data System (ADS)

    Kwon, Seong Kyung; Hyun, Eugin; Lee, Jin-Hee; Lee, Jonghun; Son, Sang Hyuk

    2017-11-01

    Object detection is a critical technology for the safety of pedestrians and drivers in autonomous vehicles. Above all, occluded pedestrian detection remains a challenging topic. We propose a new scheme for occluded pedestrian detection by means of lidar-radar sensor fusion. In the proposed method, the lidar and radar regions of interest (RoIs) are selected based on the respective sensor measurements. Occluded depth is a new means of determining whether an occluded target exists. The occluded depth is the region projected out by expanding the longitudinal distance while maintaining the angle formed by the two outermost end points of the lidar RoI. The occlusion RoI is the overlap between the radar RoI and the occluded depth. An object within the occlusion RoI is detected from the radar measurements, and the occluded object is identified as a pedestrian based on the human Doppler distribution. Various experiments on detecting a partially occluded pedestrian were performed in both outdoor and indoor environments. The experimental results show that the proposed sensor fusion scheme achieves much better detection performance than the case without it.

  12. Enhancement of brain tumor MR images based on intuitionistic fuzzy sets

    NASA Astrophysics Data System (ADS)

    Deng, Wankai; Deng, He; Cheng, Lifang

    2015-12-01

    Brain tumors are among the most fatal cancers; high-grade gliomas in particular are among the deadliest. However, brain tumor MR images usually suffer from low resolution and contrast compared with optical images. We therefore present a novel adaptive intuitionistic fuzzy enhancement scheme that combines a nonlinear fuzzy filtering operation with fusion operators for the enhancement of brain tumor MR images. The presented scheme consists of six steps. First, the image is divided into several sub-images. Second, for each sub-image, object and background areas are separated by a simple threshold. Third, intuitionistic fuzzy generators for the object and background areas are constructed based on a modified restricted equivalence function. Fourth, suitable operations are performed on the respective membership functions of the object and background areas. Fifth, the membership plane is inversely transformed back into the image plane. Finally, an enhanced image is obtained through fusion operators. Comparison and evaluation of enhancement performance demonstrate that, by enhancing fine detail in MR images, the presented scheme helps to delineate abnormal functional areas, guide surgery, judge prognosis, and plan radiotherapy.
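The fuzzification, hyperbolization and defuzzification steps of entry 12 can be sketched for a single sub-image; the Sugeno-type generator, the parameter values, and the way the hesitation degree is absorbed are illustrative assumptions rather than the authors' exact operators:

```python
import numpy as np

def ifs_enhance(img: np.ndarray, lam: float = 2.0, beta: float = 0.8) -> np.ndarray:
    """Toy intuitionistic-fuzzy enhancement of one grayscale sub-image."""
    lo, hi = float(img.min()), float(img.max())
    mu = (img - lo) / (hi - lo + 1e-12)        # fuzzification to [0, 1]
    nu = (1.0 - mu) / (1.0 + lam * mu)         # Sugeno-type non-membership
    pi = 1.0 - mu - nu                         # hesitation degree
    mu_mod = np.clip(mu + 0.5 * pi, 0.0, 1.0)  # absorb half the hesitation
    enhanced = mu_mod ** beta                  # hyperbolization-style stretch
    return lo + (hi - lo) * enhanced           # defuzzification to gray levels
```

Extremes are preserved (the minimum and maximum map to themselves) while mid-gray values are pushed up, raising local contrast; in the full scheme the object and background areas would each get their own operator before fusion.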

  13. A two-objective optimization scheme for high-OSNR and low-power-consuming all-optical networks

    NASA Astrophysics Data System (ADS)

    Abedifar, Vahid; Mirjalili, Seyed Mohammad; Eshghi, Mohammad

    2015-01-01

    In all-optical networks, the ASE noise of the optical power amplifiers is a major impairment, making the OSNR the dominant quality-of-service parameter. In this paper, a two-objective optimization scheme using Multi-Objective Particle Swarm Optimization (MOPSO) is proposed to maximize the OSNR for all channels while minimizing the optical power consumed by EDFAs and lasers. Two scenarios are investigated. Scenario 1 optimizes the gain values of a predefined number of EDFAs in the physical links, where the gains may differ from one another. Scenario 2 optimizes the gain value of the EDFAs (assumed identical within each physical link) as well as the number of EDFAs in each physical link. In both scenarios, the launch powers of the lasers are also optimized. Two novel encoding methods are proposed to uniquely represent the problem solutions, and two virtual demand sets are used to evaluate the performance of the proposed optimization scheme. Simulation results are presented for both scenarios and both virtual demands.
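At the core of any two-objective scheme like the one in entry 13 is Pareto dominance between candidate designs; a minimal sketch, with both objectives stored as minimization values (so OSNR is negated) and assuming nothing about the MOPSO update rules themselves:

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective and better in one.

    All objectives are minimized; negate any objective you want to
    maximize (e.g. store -OSNR alongside power consumption).
    """
    return (all(x <= y for x, y in zip(a, b)) and
            any(x < y for x, y in zip(a, b)))

def pareto_front(points):
    """Keep only the points not dominated by any other point."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]
```

With candidates scored as (-OSNR, power), a design that is worse on both axes is dropped from the non-dominated archive, which is exactly the set a multi-objective optimizer maintains between iterations.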

  14. Recovery Schemes for Primitive Variables in General-relativistic Magnetohydrodynamics

    NASA Astrophysics Data System (ADS)

    Siegel, Daniel M.; Mösta, Philipp; Desai, Dhruv; Wu, Samantha

    2018-05-01

    General-relativistic magnetohydrodynamic (GRMHD) simulations are an important tool to study a variety of astrophysical systems such as neutron star mergers, core-collapse supernovae, and accretion onto compact objects. A conservative GRMHD scheme numerically evolves a set of conservation equations for “conserved” quantities and requires the computation of certain primitive variables at every time step. This recovery procedure constitutes a core part of any conservative GRMHD scheme and it is closely tied to the equation of state (EOS) of the fluid. In the quest to include nuclear physics, weak interactions, and neutrino physics, state-of-the-art GRMHD simulations employ finite-temperature, composition-dependent EOSs. While different schemes have individually been proposed, the recovery problem still remains a major source of error, failure, and inefficiency in GRMHD simulations with advanced microphysics. The strengths and weaknesses of the different schemes when compared to each other remain unclear. Here we present the first systematic comparison of various recovery schemes used in different dynamical spacetime GRMHD codes for both analytic and tabulated microphysical EOSs. We assess the schemes in terms of (i) speed, (ii) accuracy, and (iii) robustness. We find large variations among the different schemes and that there is not a single ideal scheme. While the computationally most efficient schemes are less robust, the most robust schemes are computationally less efficient. More robust schemes may require an order of magnitude more calls to the EOS, which are computationally expensive. We propose an optimal strategy of an efficient three-dimensional Newton–Raphson scheme and a slower but more robust one-dimensional scheme as a fall-back.
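A one-dimensional recovery of the kind entry 14 recommends as a robust fall-back can be sketched for special-relativistic hydrodynamics, with an analytic Gamma-law EOS standing in for a microphysical table; the bracketing interval and the ideal-gas closure are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import brentq

GAMMA = 5.0 / 3.0  # ideal-gas EOS, an analytic stand-in for a table

def recover_primitives(D, S, tau):
    """1D pressure root-find for special-relativistic hydrodynamics.

    Conserved: D = rho*W, S = rho*h*W**2*v, tau = rho*h*W**2 - p - D.
    The residual f(p) = p_EOS(rho, eps) - p is bracketed and solved
    with a robust scalar method.
    """
    def residual(p):
        v = S / (tau + D + p)
        W = 1.0 / np.sqrt(1.0 - v * v)
        rho = D / W
        rho_eps = (tau + D + p) / W**2 - rho - p
        return (GAMMA - 1.0) * rho_eps - p

    p = brentq(residual, 1e-12, 10.0 * (tau + D))  # illustrative bracket
    v = S / (tau + D + p)
    W = 1.0 / np.sqrt(1.0 - v * v)
    return D / W, v, p   # rho, v, p
```

Round-tripping a known state (rho, v, p) through the conserved variables and back recovers it to solver precision; with a tabulated EOS, the `(GAMMA - 1.0) * rho_eps` pressure evaluation would become a table call, which is where the per-iteration cost discussed in the abstract comes from.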

  15. Moment method analysis of linearly tapered slot antennas: Low loss components for switched beam radiometers

    NASA Technical Reports Server (NTRS)

    Koeksal, Adnan; Trew, Robert J.; Kauffman, J. Frank

    1992-01-01

    A Moment Method Model for the radiation pattern characterization of single Linearly Tapered Slot Antennas (LTSA) in air or on a dielectric substrate is developed. This characterization consists of: (1) finding the radiated far-fields of the antenna; (2) determining the E-Plane and H-Plane beamwidths and sidelobe levels; and (3) determining the D-Plane beamwidth and cross polarization levels, as antenna parameters length, height, taper angle, substrate thickness, and the relative substrate permittivity vary. The LTSA geometry does not lend itself to analytical solution with the given parameter ranges. Therefore, a computer modeling scheme and a code are necessary to analyze the problem. This necessity imposes some further objectives or requirements on the solution method (modeling) and tool (computer code). These may be listed as follows: (1) a good approximation to the real antenna geometry; and (2) feasible computer storage and time requirements. According to these requirements, the work is concentrated on the development of efficient modeling schemes for these type of problems and on reducing the central processing unit (CPU) time required from the computer code. A Method of Moments (MoM) code is developed for the analysis of LTSA's within the parameter ranges given.

  16. A Delay-Aware and Reliable Data Aggregation for Cyber-Physical Sensing

    PubMed Central

    Zhang, Jinhuan; Long, Jun; Zhang, Chengyuan; Zhao, Guihu

    2017-01-01

Physical information sensed by various sensors in a cyber-physical system should be collected for further operation. In many applications, data aggregation should take reliability and delay into consideration. To address these problems, a novel Tiered Structure Routing-based Delay-Aware and Reliable Data Aggregation scheme named TSR-DARDA for spherical physical objects is proposed. By dividing the spherical network constructed by dispersed sensor nodes into circular tiers with specifically designed widths and cells, TSR-DARDA tries to enable as many nodes as possible to transmit data simultaneously. In order to ensure transmission reliability, lost packets are retransmitted. Moreover, to minimize the latency while maintaining reliability for data collection, in-network aggregation and broadcast techniques are adopted to deal with the transmission between data collecting nodes in the outer layer and their parent data collecting nodes in the inner layer. Thus, the optimization problem is transformed to minimize the delay under reliability constraints by controlling the system parameters. To demonstrate the effectiveness of the proposed scheme, we have conducted extensive theoretical analysis and comparisons to evaluate the performance of TSR-DARDA. The analysis and simulations show that TSR-DARDA leads to lower delay with reliability satisfaction. PMID:28218668

  17. A Foreign Object Damage Event Detector Data Fusion System for Turbofan Engines

    NASA Technical Reports Server (NTRS)

    Turso, James A.; Litt, Jonathan S.

    2004-01-01

    A Data Fusion System designed to provide a reliable assessment of the occurrence of Foreign Object Damage (FOD) in a turbofan engine is presented. The FOD-event feature level fusion scheme combines knowledge of shifts in engine gas path performance obtained using a Kalman filter, with bearing accelerometer signal features extracted via wavelet analysis, to positively identify a FOD event. A fuzzy inference system provides basic probability assignments (bpa) based on features extracted from the gas path analysis and bearing accelerometers to a fusion algorithm based on the Dempster-Shafer-Yager Theory of Evidence. Details are provided on the wavelet transforms used to extract the foreign object strike features from the noisy data and on the Kalman filter-based gas path analysis. The system is demonstrated using a turbofan engine combined-effects model (CEM), providing both gas path and rotor dynamic structural response, and is suitable for rapid-prototyping of control and diagnostic systems. The fusion of the disparate data can provide significantly more reliable detection of a FOD event than the use of either method alone. The use of fuzzy inference techniques combined with Dempster-Shafer-Yager Theory of Evidence provides a theoretical justification for drawing conclusions based on imprecise or incomplete data.
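The evidence-combination step can be sketched with the classic Dempster rule over a two-hypothesis frame. The bpa numbers below are invented for illustration, and the Yager variant actually cited differs by assigning the conflicting mass to the whole frame instead of renormalizing:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic probability assignments (bpa's), keyed by
    frozenset focal elements, using Dempster's renormalizing rule."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb                 # mass on disjoint hypotheses
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Hypothetical bpa's from a gas-path filter and accelerometer features:
FOD, OK = frozenset({"FOD"}), frozenset({"OK"})
theta = FOD | OK                                # the whole frame (ignorance)
m_gas = {FOD: 0.6, theta: 0.4}
m_accel = {FOD: 0.7, OK: 0.1, theta: 0.2}
m = dempster_combine(m_gas, m_accel)            # fused belief assignment
```

Agreement between the two sources concentrates mass on the FOD hypothesis, which is what lets the fused detector outperform either source alone.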

  18. Navier-Stokes flow field analysis of compressible flow in a high pressure safety relief valve

    NASA Technical Reports Server (NTRS)

    Vu, Bruce; Wang, Ten-See; Shih, Ming-Hsin; Soni, Bharat

    1993-01-01

The objective of this study is to investigate the complex three-dimensional flowfield of an oxygen safety pressure relief valve during an incident, with a computational fluid dynamic (CFD) analysis. Specifically, the analysis provides a flow pattern that would explain the eventual erosion pattern of the hardware, so that it can be combined with other findings to piece together the most likely scenario for the investigation. The CFD model is a pressure based solver. An adaptive upwind difference scheme is employed for the spatial discretization, and a predictor, multiple corrector method is used for the velocity-pressure coupling. The computational result indicated vortex formation near the opening of the valve, which matched the erosion pattern of the damaged hardware.

  19. Thrust imbalance of solid rocket motor pairs on Space Shuttle flights

    NASA Technical Reports Server (NTRS)

    Foster, W. A., Jr.; Shu, P. H.; Sforzini, R. H.

    1986-01-01

    This analysis extends the investigation presented at the 17th Joint Propulsion Conference in 1981 to include fifteen sets of Space Shuttle flight data. The previous report dealt only with static test data and the first flight pair. The objective is to compare the authors' previous theoretical analysis of thrust imbalance with actual Space Shuttle performance. The theoretical prediction method, which involves a Monte Carlo technique, is reviewed briefly as are salient features of the flight instrumentation system and the statistical analysis. A scheme for smoothing flight data is discussed. The effects of changes in design parameters are discussed with special emphasis on the filament wound motor case being developed to replace the steel case. Good agreement between the predictions and the flight data is demonstrated.

  20. Multi-Hierarchical Gray Correlation Analysis Applied in the Selection of Green Building Design Scheme

    NASA Astrophysics Data System (ADS)

    Wang, Li; Li, Chuanghong

    2018-02-01

As a sustainable form of ecological structure, green building is increasingly advocated and has drawn widespread attention in society. In the survey and design phase of preliminary project construction, evaluating and selecting the green building design scheme against a scientific and reasonable evaluation index system can largely and effectively improve the ecological benefits of green building projects. Based on the new Green Building Evaluation Standard, which came into effect on January 1, 2015, an evaluation index system for green building design schemes is constructed, taking into account the evaluation contents related to the design scheme. Experts experienced in construction scheme optimization scored each evaluation index, and the weight of each index was determined through the AHP method. The correlation degree between each candidate scheme and the ideal scheme was calculated using a multilevel gray relational analysis model, and the optimal scheme was then determined. The feasibility and practicability of the evaluation method are verified through examples.
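The core computation, the grey relational grade of each candidate scheme against the ideal scheme, can be sketched as follows; the index values and AHP weights are invented for illustration:

```python
def grey_relational_grades(schemes, ideal, weights, rho=0.5):
    """Weighted grey relational grade of each scheme against the ideal
    scheme: `schemes` holds normalized index vectors (larger is better),
    `weights` are AHP index weights, rho is the distinguishing coefficient."""
    deltas = [[abs(i - x) for i, x in zip(ideal, s)] for s in schemes]
    dmin = min(min(row) for row in deltas)   # global minimum deviation
    dmax = max(max(row) for row in deltas)   # global maximum deviation
    grades = []
    for row in deltas:
        coeffs = [(dmin + rho * dmax) / (d + rho * dmax) for d in row]
        grades.append(sum(w * c for w, c in zip(weights, coeffs)))
    return grades

# Two hypothetical design schemes scored on three indexes:
grades = grey_relational_grades(
    schemes=[[0.8, 0.9, 0.7], [0.6, 0.95, 0.8]],
    ideal=[1.0, 1.0, 1.0],
    weights=[0.5, 0.3, 0.2],
)
best = max(range(len(grades)), key=grades.__getitem__)
```

The scheme with the highest grade is closest to the ideal scheme and is selected as optimal.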

  1. Design and Analysis of a Dynamic Mobility Management Scheme for Wireless Mesh Network

    PubMed Central

    Roy, Sudipta

    2013-01-01

    Seamless mobility management of the mesh clients (MCs) in wireless mesh network (WMN) has drawn a lot of attention from the research community. A number of mobility management schemes such as mesh network with mobility management (MEMO), mesh mobility management (M3), and wireless mesh mobility management (WMM) have been proposed. The common problem with these schemes is that they impose uniform criteria on all the MCs for sending route update message irrespective of their distinct characteristics. This paper proposes a session-to-mobility ratio (SMR) based dynamic mobility management scheme for handling both internet and intranet traffic. To reduce the total communication cost, this scheme considers each MC's session and mobility characteristics by dynamically determining optimal threshold SMR value for each MC. A numerical analysis of the proposed scheme has been carried out. Comparison with other schemes shows that the proposed scheme outperforms MEMO, M3, and WMM with respect to total cost. PMID:24311982
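The per-MC decision at the heart of such a scheme can be sketched as a threshold test on the session-to-mobility ratio; the rates and threshold below are illustrative placeholders, not the paper's derived optimum:

```python
def update_decision(session_rate, movement_rate, threshold):
    """SMR threshold test: an MC whose sessions arrive rarely relative to
    its movements (low SMR) defers costly route updates, while an MC with
    a high SMR keeps its route fresh to avoid per-session forwarding cost."""
    smr = session_rate / movement_rate
    return "send-route-update" if smr >= threshold else "defer-update"
```

In the proposed scheme the threshold itself is chosen per MC so as to minimize the total communication cost.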

  2. Toward semantic-based retrieval of visual information: a model-based approach

    NASA Astrophysics Data System (ADS)

    Park, Youngchoon; Golshani, Forouzan; Panchanathan, Sethuraman

    2002-07-01

This paper centers on the problem of automated visual content classification. To enable classification-based image or visual object retrieval, we propose a new image representation scheme called the visual context descriptor (VCD), a multidimensional vector in which each element represents the frequency of a unique visual property of an image or a region. VCD utilizes predetermined quality dimensions (i.e., types of features and quantization levels) and semantic model templates mined a priori. Not only observed visual cues but also contextually relevant visual features are proportionally incorporated in VCD. The contextual relevance of a visual cue to a semantic class is determined by correlation analysis of ground truth samples. Such co-occurrence analysis of visual cues requires transformation of a real-valued visual feature vector (e.g., color histogram, Gabor texture, etc.) into a discrete event (e.g., terms in text). Good-features-to-track, the rule of thirds, iterative k-means clustering and TSVQ are involved in the transformation of feature vectors into unified symbolic representations called visual terms. Similarity-based visual cue frequency estimation is also proposed and used to ensure the correctness of model learning and matching, since sparseness of sample data otherwise causes unstable frequency estimates of visual cues. The proposed method naturally allows integration of heterogeneous visual, temporal, or spatial cues in a single classification or matching framework, and can be easily integrated into a semantic knowledge base such as a thesaurus or ontology. Robust semantic visual model template creation and object-based image retrieval are demonstrated based on the proposed content description scheme.

  3. A solution to the water resources crisis in wetlands: development of a scenario-based modeling approach with uncertain features.

    PubMed

    Lv, Ying; Huang, Guohe; Sun, Wei

    2013-01-01

A scenario-based interval two-phase fuzzy programming (SITF) method was developed for water resources planning in a wetland ecosystem. The SITF approach incorporates two-phase fuzzy programming, interval mathematical programming, and scenario analysis within a general framework. It can tackle fuzzy and interval uncertainties in terms of cost coefficients, resources availabilities, water demands, hydrological conditions and other parameters within a multi-source supply and multi-sector consumption context. The SITF method has the advantage of effectively improving the membership degrees of the system objective and all fuzzy constraints, so that both a higher satisfactory grade of the objective and more efficient utilization of system resources can be guaranteed. Under systematic consideration of the water demands of the ecosystem, the SITF method was successfully applied to Baiyangdian Lake, which is the largest wetland in North China. Multi-source supplies (including the inter-basin water sources of Yuecheng Reservoir and the Yellow River) and multiple water users (including agricultural, industrial and domestic sectors) were taken into account. The results indicated that the SITF approach would generate useful solutions to identify long-term water allocation and transfer schemes under multiple economic, environmental, ecological, and system-security targets. It supports a comparative analysis of the system satisfactory degrees of decisions under various policy scenarios. Moreover, it is of significance to quantify the relationship between hydrological change and human activities, such that a scheme for ecologically sustainable water supply to Baiyangdian Lake can be achieved. Copyright © 2012 Elsevier B.V. All rights reserved.

  4. A User Authentication Scheme Based on Elliptic Curves Cryptography for Wireless Ad Hoc Networks

    PubMed Central

    Chen, Huifang; Ge, Linlin; Xie, Lei

    2015-01-01

    The feature of non-infrastructure support in a wireless ad hoc network (WANET) makes it suffer from various attacks. Moreover, user authentication is the first safety barrier in a network. A mutual trust is achieved by a protocol which enables communicating parties to authenticate each other at the same time and to exchange session keys. For the resource-constrained WANET, an efficient and lightweight user authentication scheme is necessary. In this paper, we propose a user authentication scheme based on the self-certified public key system and elliptic curves cryptography for a WANET. Using the proposed scheme, an efficient two-way user authentication and secure session key agreement can be achieved. Security analysis shows that our proposed scheme is resilient to common known attacks. In addition, the performance analysis shows that our proposed scheme performs similar or better compared with some existing user authentication schemes. PMID:26184224

  5. A User Authentication Scheme Based on Elliptic Curves Cryptography for Wireless Ad Hoc Networks.

    PubMed

    Chen, Huifang; Ge, Linlin; Xie, Lei

    2015-07-14

    The feature of non-infrastructure support in a wireless ad hoc network (WANET) makes it suffer from various attacks. Moreover, user authentication is the first safety barrier in a network. A mutual trust is achieved by a protocol which enables communicating parties to authenticate each other at the same time and to exchange session keys. For the resource-constrained WANET, an efficient and lightweight user authentication scheme is necessary. In this paper, we propose a user authentication scheme based on the self-certified public key system and elliptic curves cryptography for a WANET. Using the proposed scheme, an efficient two-way user authentication and secure session key agreement can be achieved. Security analysis shows that our proposed scheme is resilient to common known attacks. In addition, the performance analysis shows that our proposed scheme performs similar or better compared with some existing user authentication schemes.
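The elliptic-curve primitive underlying schemes like this one can be sketched with a Diffie-Hellman exchange over a deliberately tiny curve (y² = x³ + 2x + 2 over F₁₇, far too small for real security); the private keys below are arbitrary illustrative values:

```python
P, A = 17, 2                 # toy curve y^2 = x^3 + 2x + 2 over F_17
G = (5, 1)                   # generator; the group has prime order 19

def inv(x):                  # modular inverse via Fermat's little theorem
    return pow(x, P - 2, P)

def add(p, q):               # elliptic-curve point addition (None = infinity)
    if p is None: return q
    if q is None: return p
    (x1, y1), (x2, y2) = p, q
    if x1 == x2 and (y1 + y2) % P == 0:
        return None          # p + (-p) = point at infinity
    if p == q:
        lam = (3 * x1 * x1 + A) * inv(2 * y1) % P
    else:
        lam = (y2 - y1) * inv(x2 - x1) % P
    x3 = (lam * lam - x1 - x2) % P
    return (x3, (lam * (x1 - x3) - y1) % P)

def mul(k, p):               # double-and-add scalar multiplication
    r = None
    while k:
        if k & 1:
            r = add(r, p)
        p = add(p, p)
        k >>= 1
    return r

# Diffie-Hellman: both sides derive the same shared point.
a_priv, b_priv = 3, 7
shared_a = mul(a_priv, mul(b_priv, G))
shared_b = mul(b_priv, mul(a_priv, G))
```

A production scheme would use a standardized curve and wrap this exchange in the self-certified public key machinery described in the paper.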

  6. Optimization of the scheme for natural ecology planning of urban rivers based on ANP (analytic network process) model.

    PubMed

    Zhang, Yichuan; Wang, Jiangping

    2015-07-01

Rivers serve as a highly valued component in ecosystems and urban infrastructure. River planning should follow the basic principles of maintaining or reconstructing the natural landscape and ecological functions of rivers. Optimization of the planning scheme is a prerequisite for successful construction of urban rivers. Therefore, relevant studies on the optimization of schemes for natural ecology planning of rivers are crucial. In the present study, four planning schemes for the Zhaodingpal River in Xinxiang City, Henan Province were taken as the objects for optimization. Fourteen factors that influence the natural ecology planning of urban rivers were selected from five aspects so as to establish the ANP model. The data processing was done using the Super Decisions software. The results showed that the importance degree of scheme 3 was the highest. A scientific, reasonable and accurate evaluation of schemes for natural ecology planning of urban rivers can be made by the ANP method. This method can be used to provide references for the sustainable development and construction of urban rivers. The ANP method is also suitable for optimizing schemes for urban green space planning and design.

  7. A genetic fuzzy analytical hierarchy process based projection pursuit method for selecting schemes of water transportation projects

    NASA Astrophysics Data System (ADS)

    Jin, Juliang; Li, Lei; Wang, Wensheng; Zhang, Ming

    2006-10-01

    The optimal selection of schemes of water transportation projects is a process of choosing a relatively optimal scheme from a number of schemes of water transportation programming and management projects, which is of importance in both theory and practice in water resource systems engineering. In order to achieve consistency and eliminate the dimensions of fuzzy qualitative and fuzzy quantitative evaluation indexes, to determine the weights of the indexes objectively, and to increase the differences among the comprehensive evaluation index values of water transportation project schemes, a projection pursuit method, named FPRM-PP for short, was developed in this work for selecting the optimal water transportation project scheme based on the fuzzy preference relation matrix. The research results show that FPRM-PP is intuitive and practical, the correction range of the fuzzy preference relation matrix A it produces is relatively small, and the result obtained is both stable and accurate; therefore FPRM-PP can be widely used in the optimal selection of different multi-factor decision-making schemes.

  8. Finite time step and spatial grid effects in δf simulation of warm plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sturdevant, Benjamin J., E-mail: benjamin.j.sturdevant@gmail.com; Department of Applied Mathematics, University of Colorado at Boulder, Boulder, CO 80309; Parker, Scott E.

    2016-01-15

This paper introduces a technique for analyzing time integration methods used with the particle weight equations in δf method particle-in-cell (PIC) schemes. The analysis applies to the simulation of warm, uniform, periodic or infinite plasmas in the linear regime and considers the collective behavior similar to the analysis performed by Langdon for full-f PIC schemes [1,2]. We perform both a time integration analysis and spatial grid analysis for a kinetic ion, adiabatic electron model of ion acoustic waves. An implicit time integration scheme is studied in detail for δf simulations using our weight equation analysis and for full-f simulations using the method of Langdon. It is found that the δf method exhibits a CFL-like stability condition for low temperature ions, which is independent of the parameter characterizing the implicitness of the scheme. The accuracy of the real frequency and damping rate due to the discrete time and spatial schemes is also derived using a perturbative method. The theoretical analysis of numerical error presented here may be useful for the verification of simulations and for providing intuition for the design of new implicit time integration schemes for the δf method, as well as understanding differences between δf and full-f approaches to plasma simulation.
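The flavor of such a time-integration analysis can be reproduced on the standard oscillatory test equation dw/dt = iωw (a generic model problem, not the paper's coupled weight-equation system): the one-parameter implicit scheme has an amplification factor whose modulus decides stability.

```python
def amplification(theta, z):
    """Amplification factor g of the theta scheme
    w_{n+1} = w_n + dt * (theta * f_{n+1} + (1 - theta) * f_n)
    applied to the test equation dw/dt = i*omega*w, with z = i*omega*dt."""
    return (1.0 + (1.0 - theta) * z) / (1.0 - theta * z)
```

For purely oscillatory z, theta = 1/2 (trapezoidal) is neutrally stable (|g| = 1), fully implicit theta = 1 damps the oscillation, and explicit theta = 0 amplifies it.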

  9. A Novel Scheme for an Energy Efficient Internet of Things Based on Wireless Sensor Networks.

    PubMed

    Rani, Shalli; Talwar, Rajneesh; Malhotra, Jyoteesh; Ahmed, Syed Hassan; Sarkar, Mahasweta; Song, Houbing

    2015-11-12

One of the emerging networking standards that bridges the gap between the physical world and the cyber one is the Internet of Things. In the Internet of Things, smart objects communicate with each other, data are gathered and certain requests of users are satisfied by different queried data. The development of energy efficient schemes for the IoT is a challenging issue, as the IoT becomes more complex due to its large scale and the current techniques of wireless sensor networks cannot be applied directly to the IoT. To achieve the green networked IoT, this paper addresses energy efficiency issues by proposing a novel deployment scheme. This scheme introduces: (1) a hierarchical network design; (2) a model for the energy efficient IoT; (3) a minimum energy consumption transmission algorithm to implement the optimal model. The simulation results show that the new scheme is more energy efficient and flexible than traditional WSN schemes and consequently it can be implemented for efficient communication in the IoT.

  10. A Novel Scheme for an Energy Efficient Internet of Things Based on Wireless Sensor Networks

    PubMed Central

    Rani, Shalli; Talwar, Rajneesh; Malhotra, Jyoteesh; Ahmed, Syed Hassan; Sarkar, Mahasweta; Song, Houbing

    2015-01-01

One of the emerging networking standards that bridges the gap between the physical world and the cyber one is the Internet of Things. In the Internet of Things, smart objects communicate with each other, data are gathered and certain requests of users are satisfied by different queried data. The development of energy efficient schemes for the IoT is a challenging issue, as the IoT becomes more complex due to its large scale and the current techniques of wireless sensor networks cannot be applied directly to the IoT. To achieve the green networked IoT, this paper addresses energy efficiency issues by proposing a novel deployment scheme. This scheme introduces: (1) a hierarchical network design; (2) a model for the energy efficient IoT; (3) a minimum energy consumption transmission algorithm to implement the optimal model. The simulation results show that the new scheme is more energy efficient and flexible than traditional WSN schemes and consequently it can be implemented for efficient communication in the IoT. PMID:26569260

  11. From development to success: the European surveillance scheme for travel associated Legionnaires' disease.

    PubMed

    Joseph, Carol A; Ricketts, Katherine D

    2007-12-01

    EWGLINET, the European surveillance scheme for travel associated Legionnaires' disease, was established in 1987 following the identification of the disease in 1976. In 1998, the European Commission's Decision 2119/98/EC provided a legal framework for EWGLINET's operation, and its aims and objectives were formalised. Since its inception, the scheme has encountered a number of challenges which have influenced its development as a Disease Specific Network. The solutions to these challenges, and their successes, may be of interest to similar schemes. This article traces the development of the scheme and its responses to the challenges it has encountered. One especially significant document developed by the scheme is the European Guidelines for Control and Prevention of Travel Associated Legionnaires' Disease;(1) its history is explored. In addition, EWGLINET's relationship with collaborating centres and other groups such as tour operators is highlighted. Despite changing over time, the collaborations and partnerships have been maintained and continue to ensure a close cooperation, maximizing public health effects.

  12. Efficient Low Dissipative High Order Schemes for Multiscale MHD Flows, I: Basic Theory

    NASA Technical Reports Server (NTRS)

    Sjoegreen, Bjoern; Yee, H. C.

    2003-01-01

The objective of this paper is to extend our recently developed highly parallelizable nonlinear stable high order schemes for complex multiscale hydrodynamic applications to the viscous MHD equations. These schemes employed multiresolution wavelets as adaptive numerical dissipation controls to limit the amount of and to aid the selection and/or blending of the appropriate types of dissipation to be used. The new scheme is formulated for both the conservative and non-conservative form of the MHD equations in curvilinear grids. The four advantages of the present approach over existing MHD schemes reported in the open literature are as follows. First, the scheme is constructed for long-time integrations of shock/turbulence/combustion MHD flows. Available schemes are too diffusive for long-time integrations and/or turbulence/combustion problems. Second, unlike existing schemes for the conservative MHD equations which suffer from ill-conditioned eigen-decompositions, the present scheme makes use of a well-conditioned eigen-decomposition obtained from a minor modification of the eigenvectors of the non-conservative MHD equations to solve the conservative form of the MHD equations. Third, this approach of using the non-conservative eigensystem when solving the conservative equations also works well in the context of standard shock-capturing schemes for the MHD equations. Fourth, a new approach to minimize the numerical error of the divergence-free magnetic condition for high order schemes is introduced. Numerical experiments with typical MHD model problems revealed the applicability of the newly developed schemes for the MHD equations.

  13. A scheme for the uniform mapping and monitoring of earth resources and environmental complexes using ERTS-1 imagery

    NASA Technical Reports Server (NTRS)

    Poulton, C. E. (Principal Investigator); Welch, R. I.

    1973-01-01

    There are no author-identified significant results in this report. Progress on plans for the development and testing of a practical procedure and system for the uniform mapping and monitoring of natural ecosystems and environmental complexes from space-acquired imagery is discussed. With primary emphasis on ERTS-1 imagery, but supported by appropriate aircraft photography as necessary, the objectives are to accomplish the following: (1) Develop and test in a few selected sites and areas of the western United States a standard format for an ecological and land use legend for making natural resource inventories on a simulated global basis. (2) Based on these same limited geographic areas, identify the potentialities and limitations of the legend concept for the recognition and annotation of ecological analogs and environmental complexes. An additional objective is to determine the optimum combination of space photography, aerial photography, ground data, human data analysis, and automatic data analysis for estimating crop yield in the rice growing areas of California and Louisiana.

  14. The construction of high-accuracy schemes for acoustic equations

    NASA Technical Reports Server (NTRS)

    Tang, Lei; Baeder, James D.

    1995-01-01

An accuracy analysis of various high order schemes is performed from an interpolation point of view. The analysis indicates that classical high order finite difference schemes, which use polynomial interpolation, hold high accuracy only at nodes and are therefore not suitable for time-dependent problems. Thus, some schemes improve their numerical accuracy within grid cells by the near-minimax approximation method, but their practical significance is degraded by maintaining the same stencil as classical schemes. One-step methods in space discretization, which use piecewise polynomial interpolation and involve data at only two points, can generate a uniform accuracy over the whole grid cell and avoid spurious roots. As a result, they are more accurate and efficient than multistep methods. In particular, the Cubic-Interpolated Pseudoparticle (CIP) scheme is recommended for computational acoustics.
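The two-point, one-step construction described above can be sketched as a cubic Hermite interpolant built from values and derivatives at the two cell endpoints, the in-cell representation that CIP-type schemes carry (the function name is illustrative):

```python
def hermite_cubic(f0, f1, d0, d1, x):
    """Cubic Hermite interpolant on the unit cell [0, 1], built from the
    values (f0, f1) and derivatives (d0, d1) at the two endpoints --
    a two-point, one-step construction with uniform in-cell accuracy."""
    h00 = (1.0 + 2.0 * x) * (1.0 - x) ** 2   # basis for f0
    h10 = x * (1.0 - x) ** 2                 # basis for d0
    h01 = x * x * (3.0 - 2.0 * x)            # basis for f1
    h11 = x * x * (x - 1.0)                  # basis for d1
    return h00 * f0 + h10 * d0 + h01 * f1 + h11 * d1
```

The interpolant reproduces any cubic exactly and matches both value and slope at each endpoint, which is what yields uniform accuracy across the cell rather than only at the nodes.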

  15. Evaluation of the Performance of the Hybrid Lattice Boltzmann Based Numerical Flux

    NASA Astrophysics Data System (ADS)

    Zheng, H. W.; Shu, C.

    2016-06-01

It is well known that the numerical scheme is a key factor in the stability and accuracy of a Navier-Stokes solver. Recently, a new hybrid lattice Boltzmann numerical flux (HLBFS) was developed by Shu's group. It combines two different LBFS schemes by a switch function. It solves the Boltzmann equation instead of the Euler equation. In this article, the main objective is to evaluate the ability of this HLBFS scheme with our in-house cell-centered hybrid mesh based Navier-Stokes code. Its performance is examined by several widely-used benchmark test cases. Comparisons between computed and experimental results are conducted. They show that the scheme can capture the shock wave as well as resolve the boundary layer.

  16. LevelScheme: A level scheme drawing and scientific figure preparation system for Mathematica

    NASA Astrophysics Data System (ADS)

    Caprio, M. A.

    2005-09-01

LevelScheme is a scientific figure preparation system for Mathematica. The main emphasis is upon the construction of level schemes, or level energy diagrams, as used in nuclear, atomic, molecular, and hadronic physics. LevelScheme also provides a general infrastructure for the preparation of publication-quality figures, including support for multipanel and inset plotting, customizable tick mark generation, and various drawing and labeling tasks. Coupled with Mathematica's plotting functions and powerful programming language, LevelScheme provides a flexible system for the creation of figures combining diagrams, mathematical plots, and data plots. Program summary: Title of program: LevelScheme. Catalogue identifier: ADVZ. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADVZ. Operating systems: any which supports Mathematica; tested under Microsoft Windows XP, Macintosh OS X, and Linux. Programming language used: Mathematica 4. Number of bytes in distributed program, including test and documentation: 3 051 807. Distribution format: tar.gz. Nature of problem: creation of level scheme diagrams; creation of publication-quality multipart figures incorporating diagrams and plots. Method of solution: a set of Mathematica packages has been developed, providing a library of level scheme drawing objects, tools for figure construction and labeling, and control code for producing the graphics.

  17. The Anatomy of a Mathematical Proof: Implications for Analyses with Toulmin's Scheme

    ERIC Educational Resources Information Center

    Simpson, Adrian

    2015-01-01

    A model solution to a proof question on an examination is explored and subjected to a detailed analysis in terms of Toulmin's scheme of argumentation. In doing so, the ways in which the scheme has been variously used in the mathematics education and philosophical literature are contrasted. The analysis raises a number of issues concerning the…

  18. Position And Force Control For Multiple-Arm Robots

    NASA Technical Reports Server (NTRS)

    Hayati, Samad A.

    1988-01-01

    Number of arms increased without introducing undue complexity. Strategy and computer architecture developed for simultaneous control of positions of number of robot arms manipulating same object and of forces and torques that arms exert on object. Scheme enables coordinated manipulation of object, causing it to move along assigned trajectory and be subjected to assigned internal forces and torques.

  19. An Improved Biometrics-Based Remote User Authentication Scheme with User Anonymity

    PubMed Central

    Kumari, Saru

    2013-01-01

    The authors review the biometrics-based user authentication scheme proposed by An in 2012. The authors show that there exist loopholes in the scheme which are detrimental for its security. Therefore the authors propose an improved scheme eradicating the flaws of An's scheme. Then a detailed security analysis of the proposed scheme is presented followed by its efficiency comparison. The proposed scheme not only withstands security problems found in An's scheme but also provides some extra features with mere addition of only two hash operations. The proposed scheme allows user to freely change his password and also provides user anonymity with untraceability. PMID:24350272

  20. An improved biometrics-based remote user authentication scheme with user anonymity.

    PubMed

    Khan, Muhammad Khurram; Kumari, Saru

    2013-01-01

    The authors review the biometrics-based user authentication scheme proposed by An in 2012. The authors show that there exist loopholes in the scheme which are detrimental for its security. Therefore the authors propose an improved scheme eradicating the flaws of An's scheme. Then a detailed security analysis of the proposed scheme is presented followed by its efficiency comparison. The proposed scheme not only withstands security problems found in An's scheme but also provides some extra features with mere addition of only two hash operations. The proposed scheme allows user to freely change his password and also provides user anonymity with untraceability.

  1. Icebreaker: The Evaluation

    ERIC Educational Resources Information Center

    Peerbhoy, Denise; Bourke, Cathriona

    2007-01-01

    Objective: To document young people's and teachers' responses to "Icebreaker", a Theatre in Education (TIE) performance exploring themes of sexual health and relationships, in relation to "Healthy Arts"' objectives. Design: Data reported here were part of a wider evaluation of a government funded scheme. Setting: Data was…

  2. Simultaneous Tensor Decomposition and Completion Using Factor Priors.

    PubMed

    Chen, Yi-Lei; Hsu, Chiou-Ting Candy; Liao, Hong-Yuan Mark

    2013-08-27

    Tensor completion, which is a high-order extension of matrix completion, has generated a great deal of research interest in recent years. Given a tensor with incomplete entries, existing methods use either factorization or completion schemes to recover the missing parts. However, as the number of missing entries increases, factorization schemes may overfit the model because of incorrectly predefined ranks, while completion schemes may fail to interpret the model factors. In this paper, we introduce a novel concept: complete the missing entries and simultaneously capture the underlying model structure. To this end, we propose a method called Simultaneous Tensor Decomposition and Completion (STDC) that combines a rank minimization technique with Tucker model decomposition. Moreover, as the model structure is implicitly included in the Tucker model, we use factor priors, which are usually known a priori in real-world tensor objects, to characterize the underlying joint-manifold drawn from the model factors. We conducted experiments to empirically verify the convergence of our algorithm on synthetic data, and evaluate its effectiveness on various kinds of real-world data. The results demonstrate the efficacy of the proposed method and its potential usage in tensor-based applications. It also outperforms state-of-the-art methods on multilinear model analysis and visual data completion tasks.
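
    The trade-off described above (factorization schemes overfitting under a wrongly predefined rank versus completion schemes losing interpretable factors) can be illustrated with a simplified matrix analogue of the factorization approach: alternately project onto a fixed-rank model, then re-impose the observed entries. This is a generic "hard-impute" sketch, not the authors' STDC algorithm, which combines Tucker decomposition with rank minimization and factor priors.

```python
import numpy as np

def complete_lowrank(M, mask, rank, n_iter=200):
    """Fill missing entries of M (mask==True where observed) by
    alternating a hard rank-r truncation with re-imposing the
    observed entries -- a simplified matrix analogue of
    factorization-based completion."""
    X = np.where(mask, M, 0.0)
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        X = (U[:, :rank] * s[:rank]) @ Vt[:rank]   # rank-r projection
        X[mask] = M[mask]                          # keep observed data
    return X

# toy example: a rank-1 matrix with one hidden entry
A = np.outer([1.0, 2.0, 3.0], [4.0, 5.0, 6.0])
mask = np.ones_like(A, dtype=bool)
mask[1, 2] = False                                 # hide A[1,2] = 12
A_hat = complete_lowrank(A, mask, rank=1)
```

Note how the behavior the abstract warns about appears here too: if `rank` is set larger than the true rank, the missing entry is simply reproduced as its imputed value and never corrected.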

  3. Multi-stage robust scheme for citrus identification from high resolution airborne images

    NASA Astrophysics Data System (ADS)

    Amorós-López, Julia; Izquierdo Verdiguier, Emma; Gómez-Chova, Luis; Muñoz-Marí, Jordi; Zoilo Rodríguez-Barreiro, Jorge; Camps-Valls, Gustavo; Calpe-Maravilla, Javier

    2008-10-01

    Identification of land cover types is one of the most critical activities in remote sensing. Nowadays, managing land resources with remote sensing techniques is becoming common practice, speeding up the process while reducing costs. However, data analysis procedures must satisfy the accuracy figures demanded by institutions and governments for further administrative actions. This paper presents a methodological scheme to update the citrus Geographical Information System (GIS) of the Comunidad Valenciana autonomous region (Spain). The proposed approach introduces a multi-stage automatic scheme to reduce visual photointerpretation and ground validation tasks. First, an object-oriented feature extraction process is carried out for each cadastral parcel from very high spatial resolution (VHR) images (0.5 m) acquired in the visible and near infrared. Next, several automatic classifiers (decision trees, multilayer perceptrons, and support vector machines) are trained and combined to improve the final accuracy of the results. The proposed strategy fulfills the high accuracy demanded by policy makers by combining automatic classification methods with the available visual photointerpretation resources. A level of confidence based on the agreement between classifiers allows effective management by fixing the number of parcels to be reviewed. The proposed methodology can be applied to similar problems and applications.

  4. Sparse Learning with Stochastic Composite Optimization.

    PubMed

    Zhang, Weizhong; Zhang, Lijun; Jin, Zhongming; Jin, Rong; Cai, Deng; Li, Xuelong; Liang, Ronghua; He, Xiaofei

    2017-06-01

    In this paper, we study Stochastic Composite Optimization (SCO) for sparse learning, which aims to learn a sparse solution from a composite function. Most recent SCO algorithms have already reached the optimal expected convergence rate O(1/λT), but they often fail to deliver sparse solutions at the end, either because of limited sparsity regularization during stochastic optimization (SO) or because of limitations in online-to-batch conversion. Even when the objective function is strongly convex, their high-probability bounds can only attain O(√{log(1/δ)/T}), where δ is the failure probability, which is much worse than the expected convergence rate. To address these limitations, we propose a simple yet effective two-phase Stochastic Composite Optimization scheme that adds a novel, powerful sparse online-to-batch conversion to general Stochastic Optimization algorithms. We further develop three concrete algorithms, OptimalSL, LastSL and AverageSL, directly under our scheme to prove its effectiveness. Both the theoretical analysis and the experimental results show that our methods outperform existing methods in their ability to learn sparse solutions, while improving the high-probability bound to approximately O(log(log(T)/δ)/λT).
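
    As a rough illustration of the composite objective involved, the sketch below runs a proximal stochastic-gradient loop on an l1-regularized least-squares problem, soft-thresholding after every stochastic step. It shows only the generic SO building block that such a two-phase scheme would wrap; the authors' OptimalSL, LastSL and AverageSL algorithms and their sparse online-to-batch conversion are not reproduced here, and the step-size schedule is an arbitrary choice for the toy problem.

```python
import numpy as np

def soft_threshold(w, t):
    """Proximal operator of t*||.||_1 -- the sparsity-inducing step."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def prox_sgd_lasso(X, y, lam=0.01, epochs=50, seed=0):
    """Minimal proximal stochastic gradient for the composite objective
    mean((x_i . w - y_i)^2) + lam*||w||_1."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / np.sqrt(100.0 + t)          # decaying step size
            grad = 2.0 * (X[i] @ w - y[i]) * X[i]   # per-sample gradient
            w = soft_threshold(w - eta * grad, eta * lam)
    return w

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 5))
y = 3.0 * X[:, 0]              # only the first feature matters
w = prox_sgd_lasso(X, y)
```

The per-step soft threshold is exactly the "limited sparsity regularization during SO" the abstract refers to: it keeps iterates roughly sparse but does not by itself guarantee a sparse final solution with high probability.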

  5. Stochastic user equilibrium model with a tradable credit scheme and application in maximizing network reserve capacity

    NASA Astrophysics Data System (ADS)

    Han, Fei; Cheng, Lin

    2017-04-01

    The tradable credit scheme (TCS) outperforms congestion pricing in terms of social equity and revenue neutrality, while matching its performance on congestion mitigation. This article investigates the effectiveness and efficiency of TCS for enhancing transportation network capacity in a stochastic user equilibrium (SUE) modelling framework. First, the SUE and credit market equilibrium conditions are presented; then an equivalent general SUE model with TCS is established by virtue of two constructed functions, which can be further simplified under a specific probability distribution. To enhance network capacity with a TCS, a bi-level mathematical programming model is established for the optimal TCS design problem, with the upper-level objective maximizing network reserve capacity and the lower level being the proposed SUE model. A heuristic sensitivity-analysis-based algorithm is developed to solve the bi-level model. Three numerical examples illustrate the improvement effect of TCS on the network in different scenarios.
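
    The lower-level SUE problem can be sketched on a toy network with the method of successive averages (MSA) and a logit route-choice model. This is a two-parallel-link illustration of stochastic user equilibrium only; the tradable-credit constraints, the constructed equivalence functions and the bi-level reserve-capacity design of the paper are assumed away, and the link cost functions are invented for the example.

```python
import math

def logit_sue_two_links(demand=10.0, theta=0.5, n_iter=500):
    """Method of successive averages for a logit-based SUE on two
    parallel links with travel times t1 = 10 + f1, t2 = 15 + 0.5*f2."""
    f1 = demand / 2.0
    for k in range(1, n_iter + 1):
        t1 = 10.0 + f1
        t2 = 15.0 + 0.5 * (demand - f1)
        # logit split: the cheaper link attracts more flow
        p1 = 1.0 / (1.0 + math.exp(-theta * (t2 - t1)))
        f1 += (demand * p1 - f1) / k        # MSA averaging step
    return f1, demand - f1

f1, f2 = logit_sue_two_links()
```

At the fixed point the flow on each link equals the demand times its logit choice probability under the resulting travel times, which is precisely the SUE condition; the 1/k averaging damps the oscillation that plain fixed-point iteration would exhibit.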

  6. Cubic-panorama image dataset analysis for storage and transmission

    NASA Astrophysics Data System (ADS)

    Salehi, Saeed; Dubois, Eric

    2013-02-01

    In this paper we address the problem of disparity estimation required for free navigation in acquired cubic-panorama image datasets. A client-server scheme is assumed, in which a remote user seeks information at each navigation step. Both the initial compression of such image datasets for storage and the transmission of the required data are addressed in this work. Regarding compression for storage, a fast method that uses properties of the epipolar geometry together with the cubic format of panoramas is used to estimate disparity vectors efficiently. Assuming the use of B pictures, the concept of forward and backward prediction is addressed. Regarding the transmission stage, a new disparity vector transcoding-like scheme is introduced and a frame conversion scenario is addressed. Details on how to pick the best vector among candidate disparity vectors are explained. In all the above cases, results are compared both visually, through error images, and using the objective measure of Peak Signal-to-Noise Ratio (PSNR) versus time.
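
    The objective measure quoted above has a standard definition: PSNR in decibels is computed from the mean squared error between a reference image and its reconstruction. A minimal sketch, assuming 8-bit images (peak value 255):

```python
import numpy as np

def psnr(ref, test, peak=255.0):
    """Peak Signal-to-Noise Ratio in dB: 10*log10(peak^2 / MSE)."""
    mse = np.mean((ref.astype(np.float64) - test.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")       # identical images
    return 10.0 * np.log10(peak ** 2 / mse)

ref = np.zeros((8, 8))
noisy = ref.copy()
noisy[0, 0] = 16.0                # one corrupted pixel: MSE = 256/64 = 4
```

For this toy pair, 10*log10(255^2/4) is roughly 42.1 dB; higher PSNR means a reconstruction closer to the reference.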

  7. Phase-Image Encryption Based on 3D-Lorenz Chaotic System and Double Random Phase Encoding

    NASA Astrophysics Data System (ADS)

    Sharma, Neha; Saini, Indu; Yadav, AK; Singh, Phool

    2017-12-01

    In this paper, an encryption scheme for phase-images based on the 3D-Lorenz chaotic system in the Fourier domain under a 4f optical system is presented. The encryption scheme uses a random amplitude mask in the spatial domain and a random phase mask in the frequency domain. Its inputs are phase-images, which are relatively more secure than intensity images because of non-linearity. The proposed scheme further derives its strength from the use of the 3D-Lorenz transform in the frequency domain. Although an experimental setup for optical realization of the proposed scheme is provided, the results presented here are based on MATLAB simulations. The scheme has been validated for grayscale images and is found to be sensitive to the encryption parameters of the Lorenz system. The attack analysis shows that the key space is large enough to resist brute-force attack, and the scheme is also resistant to noise and occlusion attacks. Statistical analysis and analysis based on the correlation distribution of adjacent pixels have been performed to test the efficacy of the encryption scheme. The results indicate that the proposed encryption scheme possesses a high level of security.
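
    The key idea of deriving a random mask from Lorenz dynamics can be sketched generically: integrate the 3D Lorenz system with RK4, discard a transient, and quantize the trajectory into phases. The quantization rule and parameters below are illustrative assumptions, not the paper's mask construction, and the 4f optical stage is omitted entirely.

```python
import numpy as np

def lorenz_keystream(n, x0=(0.1, 0.0, 0.0), sigma=10.0, rho=28.0,
                     beta=8.0 / 3.0, dt=0.002, burn=5000):
    """Chaotic keystream from the Lorenz system: RK4 integration,
    transient discarded, x-component quantized to phases in [0, 2*pi]."""
    def f(s):
        x, y, z = s
        return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])
    s = np.array(x0, dtype=float)
    out = np.empty(n)
    for i in range(burn + n):
        k1 = f(s); k2 = f(s + 0.5 * dt * k1)
        k3 = f(s + 0.5 * dt * k2); k4 = f(s + dt * k3)
        s = s + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        if i >= burn:
            out[i - burn] = s[0]
    # magnify and wrap the trajectory samples into phase values
    return 2 * np.pi * ((out * 1e4) % 1.0)

phases = lorenz_keystream(256)
mask = np.exp(1j * phases)        # unit-modulus random phase mask
```

The key sensitivity the abstract reports shows up even in this sketch: perturbing the initial condition in the sixth decimal place yields a completely different keystream after the transient.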

  8. Runge-Kutta methods combined with compact difference schemes for the unsteady Euler equations

    NASA Technical Reports Server (NTRS)

    Yu, Sheng-Tao

    1992-01-01

    Recent developments using compact difference schemes to solve the Navier-Stokes equations show spectral-like accuracy. A study was made of the numerical characteristics of various combinations of Runge-Kutta (RK) methods and compact difference schemes for calculating the unsteady Euler equations. The accuracy of finite difference schemes is assessed based on evaluations of dissipative error. The objectives are to reduce the numerical damping and, at the same time, preserve numerical stability. While this approach has had tremendous success for steady flows, the numerical characteristics of unsteady calculations remain largely unclear. For unsteady flows, in addition to dissipative errors, the phase velocity and harmonic content of the numerical results are of concern. As a result of the discretization procedure, the simulated unsteady flow motions actually propagate in a dispersive numerical medium. Consequently, the dispersion characteristics of the numerical schemes, which relate phase velocity and wave number, may greatly affect numerical accuracy. The aim is to assess the numerical accuracy of the simulated results; to this end, Fourier analysis provides the dispersive correlations of the various numerical schemes. First, a detailed investigation of existing RK methods is carried out. A generalized form of an N-step RK method is derived. With this generalized form, criteria are derived for three- and four-step RK methods to be third- and fourth-order time accurate for non-linear equations, e.g., the flow equations. These criteria are then applied to commonly used RK methods, such as Jameson's 3-step and 4-step schemes and Wray's algorithm, to identify the accuracy of the methods. For the spatial discretization, compact difference schemes are presented. The schemes are formulated in operator form to render them suitable for Fourier analysis. The performance of the numerical methods is shown by numerical examples, which are described in detail. The third case is a two-dimensional simulation of a Lamb vortex in a uniform flow. This calculation provides a realistic assessment of various finite difference schemes in terms of the conservation of vortex strength and harmonic content after travelling a substantial distance. The numerical implementation of Giles' non-reflective equations coupled with the characteristic equations as the boundary condition is discussed in detail. Finally, the single-vortex calculation is extended to simulate vortex pairing. When the distance between two vortices is less than a threshold value, numerical results show crisp resolution of the vortex merging.
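
    The Fourier (dispersion) analysis referred to above compares each spatial scheme's "modified wavenumber" against the exact one: substituting a Fourier mode into the difference operator yields k*h as a function of kh, and the deviation from the ideal line k*h = kh measures the dispersive error. A sketch for two textbook schemes (the explicit second-order central difference and the classical fourth-order compact/Padé scheme, not the specific operators of this report):

```python
import numpy as np

def modified_wavenumber_central2(kh):
    """Explicit second-order central difference: k*h = sin(kh)."""
    return np.sin(kh)

def modified_wavenumber_pade4(kh):
    """Classical fourth-order compact (Pade) scheme
       (1/4) f'_{i-1} + f'_i + (1/4) f'_{i+1} = 3 (f_{i+1} - f_{i-1}) / (4h),
       whose Fourier symbol gives k*h = 1.5 sin(kh) / (1 + 0.5 cos(kh))."""
    return 1.5 * np.sin(kh) / (1.0 + 0.5 * np.cos(kh))

kh = np.linspace(0.01, np.pi, 200)
err2 = np.abs(modified_wavenumber_central2(kh) - kh)
err4 = np.abs(modified_wavenumber_pade4(kh) - kh)
```

The compact scheme tracks the exact wavenumber over a much wider band (its leading error is O(kh^5) versus O(kh^3)), which is the "spectral-like accuracy" claimed for compact schemes.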

  9. Cryptanalysis of Chatterjee-Sarkar Hierarchical Identity-Based Encryption Scheme at PKC 06

    NASA Astrophysics Data System (ADS)

    Park, Jong Hwan; Lee, Dong Hoon

    In 2006, Chatterjee and Sarkar proposed a hierarchical identity-based encryption (HIBE) scheme which can support an unbounded number of identity levels. This property is particularly useful in providing forward secrecy by embedding time components within hierarchical identities. In this paper we show that their scheme does not provide the claimed property. Our analysis shows that if the number of identity levels becomes larger than the value of a fixed public parameter, an unintended receiver can reconstruct a new valid ciphertext and decrypt the ciphertext using his or her own private key. The analysis is similarly applied to a multi-receiver identity-based encryption scheme presented as an application of Chatterjee and Sarkar's HIBE scheme.

  10. ONU Power Saving Scheme for EPON System

    NASA Astrophysics Data System (ADS)

    Mukai, Hiroaki; Tano, Fumihiko; Tanaka, Masaki; Kozaki, Seiji; Yamanaka, Hideaki

    PON (Passive Optical Network) achieves FTTH (Fiber To The Home) economically by sharing an optical fiber among plural subscribers. Recently, global climate change has been recognized as a serious near-term problem, and power-saving techniques for electronic devices have become important. In PON systems, an ONU (Optical Network Unit) power saving scheme has been studied and defined for XG-PON. In this paper, we propose an ONU power saving scheme for EPON. We then present an analysis of the power reduction effect and the data transmission delay caused by the ONU power saving scheme. Based on this analysis, we propose an efficient provisioning method for the ONU power saving scheme which is applicable to both XG-PON and EPON.

  11. Security analysis of boolean algebra based on Zhang-Wang digital signature scheme

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zheng, Jinbin, E-mail: jbzheng518@163.com

    2014-10-06

    In 2005, Zhang and Wang proposed an improved signature scheme that uses neither a one-way hash function nor message redundancy. In this paper, we show through an analysis based on boolean algebra (e.g., bitwise exclusive-or) that this scheme has potential security weaknesses. By analyzing the output of an assembly program segment, we also point out that the mapping between assembly instructions and machine code is in fact not one-to-one, which may cause security problems unknown to the software.

  12. Willingness to pay for health insurance in the informal sector of Sierra Leone.

    PubMed

    Jofre-Bonet, Mireia; Kamara, Joseph

    2018-01-01

    The objective of this project is to study the willingness to pay (WTP) for health insurance (HI) of individuals working in the informal sector in Sierra Leone, using a purposely designed survey of a representative sample of this sector. We elicit the WTP using the Double-Bounded Dichotomous Choice with Follow-Up method. We also examine the factors that are positively and negatively associated with respondents' likelihood of agreeing to join an HI scheme and of paying each of three possible premiums to join it. We additionally analyze the individual and household characteristics associated with the maximum amount a household is willing to pay to join the HI scheme. The results indicate that the average WTP for the HI is 20,237.16 SLL (3.6 USD) per adult, but it ranges from about 14,000 SLL (2.5 USD) to about 35,000 SLL (6.2 USD) depending on region, occupation, and household and respondent characteristics. The analysis of the maximum WTP indicates that living outside the Western region and working in farming rather than petty trade are associated with a decrease in the maximum premium respondents are willing to pay for the HI scheme. By contrast, the maximum WTP is positively associated with being a driver or a biker; having secondary or tertiary education (as opposed to none); the number of pregnant women in the household; having a TV; and having paid for the last medical requirement. In summary, the various analyses show that a premium for the HI package could be set at approximately 20,000 SLL (3.54 USD), but also that establishing a single premium for all individuals in the informal sector could be risky. The efficient functioning of an HI scheme relies on covering as much of the population as possible, in order to spread risks and make the scheme viable. The impact of the various population characteristics raises the issue of how to rate premiums: setting a premium that is too high for a large proportion of the population could mean losing many potential enrollees and might have consequences for the viability of the scheme.

  13. An Object-Oriented Python Implementation of an Intermediate-Level Atmospheric Model

    NASA Astrophysics Data System (ADS)

    Lin, J. W.

    2008-12-01

    The Neelin-Zeng Quasi-equilibrium Tropical Circulation Model (QTCM1) is a Fortran-based intermediate-level atmospheric model that includes simplified treatments of several physical processes, including a GCM-like convective scheme and a land-surface scheme with representations of different surface types, evaporation, and soil moisture. This model has been used in studies of the Madden-Julian oscillation, ENSO, and vegetation-atmosphere interaction effects on climate. Through the assumption of convective quasi-equilibrium in the troposphere, the QTCM1 is able to include full nonlinearity, resolve baroclinic disturbances, and generate a reasonable climatology, all at low computational cost. One year of simulation on a PC at 5.625 × 3.75 degree longitude-latitude resolution takes under three minutes of wall-clock time. The Python package qtcm implements the QTCM1 in a mixed-language environment that retains the speed of compiled Fortran while providing the benefits of Python's object-oriented framework and robust suite of utilities and datatypes. We describe key programming constructs used to create this modeling environment: the decomposition of model runs into Python objects, providing methods so visualization tools are attached to model runs, and the use of Python's mutable datatypes (lists and dictionaries) to implement the "run list" entity, which enables total runtime control of subroutine execution order and content. The result is an interactive modeling environment where the traditional sequence of "hypothesis → modeling → visualization and analysis" is opened up and made nonlinear and flexible. In this environment, science tasks such as parameter-space exploration and testing alternative parameterizations can be easily automated, without the need for multiple versions of the model code interacting with a bevy of makefiles and shell scripts. 
The environment also simplifies interfacing of the atmospheric model to other models (e.g., hydrologic models, statistical models) and analysis tools. The tools developed for this package can be adapted to create similar environments for hydrologic models.
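
    The "run list" idea described above (a mutable list of sub-steps that gives total runtime control over execution order and content) can be sketched in a few lines. The class and method names below are hypothetical illustrations of the pattern, not the actual qtcm package API.

```python
class TinyModel:
    """Toy model whose time step executes a mutable list of callables,
    so parameterizations can be swapped in and out at runtime."""
    def __init__(self):
        self.state = {"T": 20.0}
        # default run list, executed in order on every step
        self.runlist = [self.dynamics, self.convection]

    def dynamics(self):
        self.state["T"] += 1.0          # stand-in for the dynamical core

    def convection(self):
        if self.state["T"] > 22.0:      # stand-in for a convective scheme
            self.state["T"] = 22.0

    def step(self, n=1):
        for _ in range(n):
            for sub in self.runlist:
                sub()

m = TinyModel()
m.step(5)                               # with convection: temperature capped

m2 = TinyModel()
m2.runlist.remove(m2.convection)        # disable a parameterization at runtime
m2.step(5)                              # without convection: unchecked warming
```

Because the run list is an ordinary Python list, tasks like testing an alternative parameterization reduce to replacing one element, with no recompilation or makefile changes, which is the flexibility the abstract describes.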

  14. Optimizing School-Based Health-Promotion Programmes: Lessons from a Qualitative Study of Fluoridated Milk Schemes in the UK

    ERIC Educational Resources Information Center

    Foster, Geraldine R. K.; Tickle, Martin

    2013-01-01

    Background and objective: Some districts in the United Kingdom (UK), where the level of child dental caries is high and water fluoridation has not been possible, implement school-based fluoridated milk (FM) schemes. However, process variables, such as consent to drink FM and loss of children as they mature, impede the effectiveness of these…

  15. Health Professionals' Perspectives on Exercise Referral and Physical Activity Promotion in Primary Care: Findings from a Process Evaluation of the National Exercise Referral Scheme in Wales

    ERIC Educational Resources Information Center

    Din, Nafees U.; Moore, Graham F.; Murphy, Simon; Wilkinson, Clare; Williams, Nefyn H.

    2015-01-01

    Background and objectives: Referring clinicians' experiences of exercise referral schemes (ERS) can provide valuable insights into their uptake. However, most qualitative studies focus on patient views only. This paper explores health professionals' perceptions of their role in promoting physical activity and experiences of a National Exercise…

  16. Dypas: A dynamic payload scheduler for shuttle missions

    NASA Technical Reports Server (NTRS)

    Davis, Stephen

    1988-01-01

    Decision and analysis systems have had broad and very practical application areas in the human decision making process. These software systems range from the help sections in simple accounting packages, to the more complex computer configuration programs. Dypas is a decision and analysis system that aids prelaunch shutlle scheduling, and has added functionality to aid the rescheduling done in flight. Dypas is written in Common Lisp on a Symbolics Lisp machine. Dypas differs from other scheduling programs in that it can draw its knowledge from different rule bases and apply them to different rule interpretation schemes. The system has been coded with Flavors, an object oriented extension to Common Lisp on the Symbolics hardware. This allows implementation of objects (experiments) to better match the problem definition, and allows a more coherent solution space to be developed. Dypas was originally developed to test a programmer's aptitude toward Common Lisp and the Symbolics software environment. Since then the system has grown into a large software effort with several programmers and researchers thrown into the effort. Dypas is currently using two expert systems and three inferencing procedures to generate a many object schedule. The paper will review the abilities of Dypas and comment on its functionality.

  17. MRI-based quantification of Duchenne muscular dystrophy in a canine model

    NASA Astrophysics Data System (ADS)

    Wang, Jiahui; Fan, Zheng; Kornegay, Joe N.; Styner, Martin A.

    2011-03-01

    Duchenne muscular dystrophy (DMD) is a progressive and fatal X-linked disease caused by mutations in the DMD gene. Magnetic resonance imaging (MRI) has shown potential to provide non-invasive and objective biomarkers for monitoring disease progression and therapeutic effect in DMD. In this paper, we propose a semi-automated scheme to quantify MRI features of golden retriever muscular dystrophy (GRMD), a canine model of DMD. Our method was applied to a natural history dataset and a hydrodynamic limb perfusion dataset. The scheme is composed of three modules: pre-processing, muscle segmentation, and feature analysis. The pre-processing module includes calculation of T2 maps; spatial registration of T2-weighted (T2WI) images, T2-weighted fat-suppressed (T2FS) images, and T2 maps; and intensity calibration of T2WI and T2FS images. We then manually segment six pelvic limb muscles. Finally, for each of the segmented muscles, we automatically measure volume and intensity statistics of the T2FS images and T2 maps. For the natural history study, our results showed that four of six muscles in affected dogs had smaller volumes, and all had higher mean intensities in T2 maps, as compared to normal dogs. For the perfusion study, the muscle volumes and mean intensities in T2FS were increased in the post-perfusion MRI scans as compared to pre-perfusion scans, as predicted. We conclude that our scheme successfully performs quantitative analysis of muscle MRI features of GRMD.
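
    The T2-map calculation in the pre-processing module rests on the standard mono-exponential decay model S(TE) = S0*exp(-TE/T2). The sketch below is the generic two-echo estimate with invented echo times; the paper's actual fitting procedure (number of echoes, masking, calibration) may differ.

```python
import numpy as np

def t2_map(echo1, echo2, te1=0.010, te2=0.040, eps=1e-12):
    """Voxel-wise two-point T2 estimate (seconds) from spin-echo images
    at echo times te1 < te2, assuming S(TE) = S0*exp(-TE/T2):
        T2 = (te2 - te1) / ln(S1 / S2)."""
    ratio = np.clip(echo1, eps, None) / np.clip(echo2, eps, None)
    # clip the ratio just above 1 so noise cannot produce T2 <= 0
    return (te2 - te1) / np.log(np.clip(ratio, 1.0 + 1e-9, None))

# synthetic voxel with a true T2 of 60 ms
s0, t2_true = 1000.0, 0.060
e1 = s0 * np.exp(-0.010 / t2_true)
e2 = s0 * np.exp(-0.040 / t2_true)
```

Applied voxel-wise to registered echo images, this yields the T2 map whose per-muscle mean intensities are reported above (elevated T2 reflecting the edema/fat changes of dystrophic muscle).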

  18. Impact of the health insurance scheme for stateless people on inpatient utilization in Kraburi Hospital, Thailand

    PubMed Central

    Suphanchaimat, Rapeepong; Prakongsai, Phusit; Limwattananon, Supon; Mills, Anne

    2016-01-01

    Objectives This study sought to investigate the impact of the Thai “Health Insurance for People with Citizenship Problems” (HI-PCP) on access to care for stateless patients, compared to Universal Coverage Scheme patients and the uninsured, using inpatient utilization as a proxy for impact. Methods Secondary data analysis of inpatient records of Kraburi Hospital, Ranong province, between 2009 (pre-policy) and 2012 (post-policy) was employed. Descriptive statistics and multivariate analysis by difference-in-difference model were performed. Results The volume of inpatient service utilization by stateless patients expanded after the introduction of the HI-PCP. However, this increase did not appear to stem from the HI-PCP per se. After controlling for key covariates, including patients’ characteristics, disease condition, and domicile, there was only a weak positive association between the HI-PCP and utilization. Critical factors contributing significantly to increased utilization were older age, proximity to the hospital, and presence of catastrophic illness. Conclusion A potential explanation for the insignificant impact of the HI-PCP on access to inpatient care of stateless patients is likely to be a lack of awareness of the existence of the scheme among the stateless population and local health staff. This problem is likely to have been accentuated by operational constraints in policy implementation, including the poor performance of local offices in registering stateless people. A key limitation of this study is a lack of data on patients who did not visit the health facility at the first opportunity. Further study of health-seeking behavior of stateless people at the household level is recommended. PMID:27942240

  19. Topology-independent shape modeling scheme

    NASA Astrophysics Data System (ADS)

    Malladi, Ravikanth; Sethian, James A.; Vemuri, Baba C.

    1993-06-01

    Developing shape models is an important aspect of computer vision research. Geometric and differential properties of the surface can be computed from shape models, which also aid the tasks of object representation and recognition. In this paper we present a new approach for shape modeling which, while retaining important features of the existing methods, overcomes most of their limitations. Our technique can be applied to model arbitrarily complex shapes, shapes with protrusions, and situations where no a priori assumption about the object's topology can be made. A single instance of our model, when presented with an image having more than one object of interest, has the ability to split freely to represent each object. Our method is based on the level set ideas developed by Osher & Sethian to follow propagating solid/liquid interfaces with curvature-dependent speeds. The interface is a closed, nonintersecting hypersurface flowing along its gradient field with constant speed or a speed that depends on the curvature. We move the interface by solving a `Hamilton-Jacobi' type equation written for a function in which the interface is a particular level set. A speed function synthesized from the image is used to stop the interface in the vicinity of the object boundaries. The resulting equations of motion are solved by numerical techniques borrowed from the technology of hyperbolic conservation laws. An added advantage of this scheme is that it can easily be extended to any number of space dimensions. The efficacy of the scheme is demonstrated with numerical experiments on synthesized images and noisy medical images.
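
    The basic numerical step behind such level-set evolution can be sketched with the first-order upwind (Godunov-type) discretization of phi_t + F|grad phi| = 0 for constant F > 0. Curvature-dependent speeds and the image-derived stopping function of the paper are omitted; this only shows the hyperbolic-conservation-law-style update and the implicit representation of the front.

```python
import numpy as np

def evolve_level_set(phi, speed=1.0, dt=0.1, steps=10, h=1.0):
    """First-order upwind update for phi_t + F*|grad phi| = 0
    with constant F > 0 (outward motion of the zero level set)."""
    for _ in range(steps):
        # one-sided differences (periodic edges via np.roll)
        dxm = (phi - np.roll(phi,  1, axis=1)) / h
        dxp = (np.roll(phi, -1, axis=1) - phi) / h
        dym = (phi - np.roll(phi,  1, axis=0)) / h
        dyp = (np.roll(phi, -1, axis=0) - phi) / h
        # upwind gradient magnitude for F > 0
        grad = np.sqrt(np.maximum(dxm, 0)**2 + np.minimum(dxp, 0)**2 +
                       np.maximum(dym, 0)**2 + np.minimum(dyp, 0)**2)
        phi = phi - dt * speed * grad
    return phi

# signed distance to a circle of radius 5; its zero level set expands
n = 32
Y, X = np.mgrid[0:n, 0:n]
phi0 = np.sqrt((X - 16.0)**2 + (Y - 16.0)**2) - 5.0
phi1 = evolve_level_set(phi0, speed=1.0, dt=0.2, steps=10)
```

Because the front is carried implicitly as the zero level set of phi, topological changes (splitting and merging) need no special handling, which is the property the abstract highlights for multi-object images.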

  20. Direct single-shot phase retrieval from the diffraction pattern of separated objects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leshem, Ben; Xu, Rui; Dallal, Yehonatan

    The non-crystallographic phase problem arises in numerous scientific and technological fields. An important application is coherent diffractive imaging. Recent advances in X-ray free-electron lasers allow capturing of the diffraction pattern from a single nanoparticle before it disintegrates, in so-called ‘diffraction before destruction’ experiments. Presently, the phase is reconstructed by iterative algorithms, imposing a non-convex computational challenge, or by Fourier holography, requiring a well-characterized reference field. Here we present a convex scheme for single-shot phase retrieval for two (or more) sufficiently separated objects, demonstrated in two dimensions. In our approach, the objects serve as unknown references to one another, reducing the phase problem to a solvable set of linear equations. We establish our method numerically and experimentally in the optical domain and demonstrate a proof-of-principle single-shot coherent diffractive imaging using X-ray free-electron laser pulses. Lastly, our scheme alleviates several limitations of current methods, offering a new pathway towards direct reconstruction of complex objects.

  1. Direct single-shot phase retrieval from the diffraction pattern of separated objects

    DOE PAGES

    Leshem, Ben; Xu, Rui; Dallal, Yehonatan; ...

    2016-02-22

    The non-crystallographic phase problem arises in numerous scientific and technological fields. An important application is coherent diffractive imaging. Recent advances in X-ray free-electron lasers allow capturing of the diffraction pattern from a single nanoparticle before it disintegrates, in so-called ‘diffraction before destruction’ experiments. Presently, the phase is reconstructed by iterative algorithms, imposing a non-convex computational challenge, or by Fourier holography, requiring a well-characterized reference field. Here we present a convex scheme for single-shot phase retrieval for two (or more) sufficiently separated objects, demonstrated in two dimensions. In our approach, the objects serve as unknown references to one another, reducing the phase problem to a solvable set of linear equations. We establish our method numerically and experimentally in the optical domain and demonstrate a proof-of-principle single-shot coherent diffractive imaging using X-ray free-electron laser pulses. Lastly, our scheme alleviates several limitations of current methods, offering a new pathway towards direct reconstruction of complex objects.

  2. Large-scale-system effectiveness analysis. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patton, A.D.; Ayoub, A.K.; Foster, J.W.

    1979-11-01

    The objective of the research project has been the investigation and development of methods for calculating system reliability indices which have absolute, measurable significance to consumers. Such indices are a necessary prerequisite to any scheme for system optimization which includes the economic consequences of consumer service interruptions. A further area of investigation has been the joint consideration of generation and transmission in reliability studies. Methods for finding or estimating the probability distributions of some measures of reliability performance have been developed, and the application of modern Monte Carlo simulation methods to compute reliability indices in generating systems has been studied.

  3. Known-plaintext attack on a joint transform correlator encrypting system.

    PubMed

    Barrera, John Fredy; Vargas, Carlos; Tebaldi, Myrian; Torroba, Roberto; Bolognini, Nestor

    2010-11-01

    We demonstrate in this Letter that a joint transform correlator shows vulnerability to known-plaintext attacks. An unauthorized user who intercepts both an object and its encrypted version can obtain the security key code mask. In this contribution, we conduct a hybrid attack that merges a heuristic scheme with a Gerchberg-Saxton routine to estimate the encrypting key and decode different ciphertexts encrypted with that same key. We also analyze the success of this attack for different pairs of plaintext-ciphertext used to obtain the encrypting code. We present simulation results for the decryption procedure to demonstrate the validity of our analysis.
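
    The Gerchberg-Saxton routine invoked in the attack alternates between two domains, keeping the current phase estimate while replacing each magnitude with the measured one. Only this generic GS core is sketched below, on a toy field with a trivially consistent magnitude pair; the heuristic search the authors merge with it, and the joint-transform-correlator setup itself, are not reproduced.

```python
import numpy as np

def gerchberg_saxton(amp_obj, amp_fourier, n_iter=20, seed=0):
    """Classic GS iteration: impose the measured magnitude in each
    domain while retaining the evolving phase estimate."""
    rng = np.random.default_rng(seed)
    g = amp_obj * np.exp(2j * np.pi * rng.random(amp_obj.shape))
    for _ in range(n_iter):
        G = np.fft.fft2(g)
        G = amp_fourier * np.exp(1j * np.angle(G))   # Fourier-magnitude constraint
        g = np.fft.ifft2(G)
        g = amp_obj * np.exp(1j * np.angle(g))       # object-magnitude constraint
    return g

# toy target: a plane wave of unknown phase (unit object magnitude,
# single-spike Fourier magnitude)
n = 32
Y, X = np.mgrid[0:n, 0:n]
true = np.exp(2j * np.pi * (3 * X + 5 * Y) / n)
g = gerchberg_saxton(np.abs(true), np.abs(np.fft.fft2(true)))
resid = np.linalg.norm(np.abs(np.fft.fft2(g)) - np.abs(np.fft.fft2(true)))
```

On realistic plaintext/ciphertext magnitude pairs GS can stagnate in local minima, which is presumably why the attack wraps it in a heuristic search rather than running it alone.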

  4. SCISEAL: A CFD code for analysis of fluid dynamic forces in seals

    NASA Technical Reports Server (NTRS)

    Athavale, Mahesh; Przekwas, Andrzej

    1994-01-01

    A viewgraph presentation is made of the objectives, capabilities, and test results of the computer code SCISEAL. Currently, the seal code has: a finite volume, pressure-based integration scheme; colocated variables with strong conservation approach; high-order spatial differencing, up to third-order; up to second-order temporal differencing; a comprehensive set of boundary conditions; a variety of turbulence models and surface roughness treatment; moving grid formulation for arbitrary rotor whirl; rotor dynamic coefficients calculated by the circular whirl and numerical shaker methods; and small perturbation capabilities to handle centered and eccentric seals.

  5. Subdiffraction incoherent optical imaging via spatial-mode demultiplexing: Semiclassical treatment

    NASA Astrophysics Data System (ADS)

    Tsang, Mankei

    2018-02-01

    I present a semiclassical analysis of a spatial-mode demultiplexing (SPADE) measurement scheme for far-field incoherent optical imaging under the effects of diffraction and photon shot noise. Building on previous results that assume two point sources or the Gaussian point-spread function, I generalize SPADE for a larger class of point-spread functions and evaluate its errors in estimating the moments of an arbitrary subdiffraction object. Compared with the limits to direct imaging set by the Cramér-Rao bounds, the results show that SPADE can offer far superior accuracy in estimating second- and higher-order moments.

  6. Improvement of a Quantum Proxy Blind Signature Scheme

    NASA Astrophysics Data System (ADS)

    Zhang, Jia-Lei; Zhang, Jian-Zhong; Xie, Shu-Cui

    2018-02-01

    Improvement of a quantum proxy blind signature scheme is proposed in this paper. Six-qubit entangled state functions as quantum channel. In our scheme, a trust party Trent is introduced so as to avoid David's dishonest behavior. The receiver David verifies the signature with the help of Trent in our scheme. The scheme uses the physical characteristics of quantum mechanics to implement message blinding, delegation, signature and verification. Security analysis proves that our scheme has the properties of undeniability, unforgeability, anonymity and can resist some common attacks.

  8. High-performance parallel analysis of coupled problems for aircraft propulsion

    NASA Technical Reports Server (NTRS)

    Felippa, C. A.; Farhat, C.; Lanteri, S.; Gumaste, U.; Ronaghi, M.

    1994-01-01

    Applications of high-performance parallel computation are described for the analysis of complete jet engines, treated as a multidisciplinary coupled problem. The coupled problem involves the interaction of structures with gas dynamics, heat conduction, and heat transfer in aircraft engines. The methodology issues addressed include: consistent discrete formulation of coupled problems with emphasis on coupling phenomena; the effect of partitioning strategies, augmentation and temporal solution procedures; sensitivity of the response to problem parameters; and methods for interfacing multiscale discretizations in different single fields. The computer implementation issues addressed include: parallel treatment of coupled systems; domain decomposition and mesh partitioning strategies; data representation in object-oriented form and its mapping to hardware-driven representation; and tradeoff studies between partitioning schemes and fully coupled treatment.

  9. Integrated optical 3D digital imaging based on DSP scheme

    NASA Astrophysics Data System (ADS)

    Wang, Xiaodong; Peng, Xiang; Gao, Bruce Z.

    2008-03-01

    We present a scheme of integrated optical 3-D digital imaging (IO3DI) based on a digital signal processor (DSP), which can acquire range images independently without PC support. The scheme uses a parallel hardware structure built around the DSP and a field programmable gate array (FPGA) to realize 3-D imaging, and adopts phase measurement profilometry. To realize pipeline processing of fringe projection, image acquisition and fringe pattern analysis, we developed a multi-threaded application program under the DSP/BIOS RTOS (real-time operating system), whose preemptive kernel and configuration tool allow real-time scheduling and synchronization. To accelerate automatic fringe analysis and phase unwrapping, we make use of software optimization techniques. The proposed scheme reaches a performance of 39.5 f/s (frames per second), so it is well suited to real-time fringe-pattern analysis and fast 3-D imaging. Experimental results are presented to show the validity of the proposed scheme.
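
Phase measurement profilometry, adopted in this scheme, recovers a wrapped phase map from several fringe images captured at known phase shifts. For the common four-step variant (a textbook formula, not the authors' DSP code):

```python
import math

def wrapped_phase(i1, i2, i3, i4):
    """Wrapped phase from four fringe intensities shifted by 0/90/180/270 deg.

    With I_n = A + B*cos(phi + n*pi/2):  I4 - I2 = 2B*sin(phi) and
    I1 - I3 = 2B*cos(phi), so phi = atan2(I4 - I2, I1 - I3).
    """
    return math.atan2(i4 - i2, i1 - i3)

# recover a known phase from synthetic fringe intensities
phi = 0.7
i1, i2, i3, i4 = (2.0 + math.cos(phi + n * math.pi / 2) for n in range(4))
print(abs(wrapped_phase(i1, i2, i3, i4) - phi) < 1e-9)  # → True
```

Applied per pixel across the four captured frames, this yields the wrapped phase map that the subsequent phase-unwrapping stage of the pipeline consumes.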

  10. A modified eco-efficiency framework and methodology for advancing the state of practice of sustainability analysis as applied to green infrastructure.

    PubMed

    Ghimire, Santosh R; Johnston, John M

    2017-09-01

    We propose a modified eco-efficiency (EE) framework and novel sustainability analysis methodology for green infrastructure (GI) practices used in water resource management. Green infrastructure practices such as rainwater harvesting (RWH), rain gardens, porous pavements, and green roofs are emerging as viable strategies for climate change adaptation. The modified framework includes 4 economic, 11 environmental, and 3 social indicators. Using 6 indicators from the framework, at least 1 from each dimension of sustainability, we demonstrate the methodology to analyze RWH designs. We use life cycle assessment and life cycle cost assessment to calculate the sustainability indicators of 20 design configurations as Decision Management Objectives (DMOs). Five DMOs emerged as relatively more sustainable along the EE analysis Tradeoff Line, and we used Data Envelopment Analysis (DEA), a widely applied statistical approach, to quantify the modified EE measures as DMO sustainability scores. We also addressed the subjectivity and sensitivity analysis requirements of sustainability analysis, and we evaluated the performance of 10 weighting schemes that included classical DEA, equal weights, National Institute of Standards and Technology's stakeholder panel, Eco-Indicator 99, Sustainable Society Foundation's Sustainable Society Index, and 5 derived schemes. We improved upon classical DEA by applying the weighting schemes to identify sustainability scores that ranged from 0.18 to 1.0, avoiding the nonuniqueness problem and revealing the least to most sustainable DMOs. Our methodology provides a more comprehensive view of water resource management and is generally applicable to GI and industrial, environmental, and engineered systems to explore the sustainability space of alternative design configurations. Integr Environ Assess Manag 2017;13:821-831. Published 2017. This article is a US Government work and is in the public domain in the USA. 
Integrated Environmental Assessment and Management published by Wiley Periodicals, Inc. on behalf of the Society of Environmental Toxicology & Chemistry (SETAC).

  11. An ocean data assimilation system and reanalysis of the World Ocean hydrophysical fields

    NASA Astrophysics Data System (ADS)

    Zelenko, A. A.; Vil'fand, R. M.; Resnyanskii, Yu. D.; Strukov, B. S.; Tsyrulnikov, M. D.; Svirenko, P. I.

    2016-07-01

    A new version of the ocean data assimilation system (ODAS) developed at the Hydrometcentre of Russia is presented. The assimilation is performed following the sequential scheme analysis-forecast-analysis. The main components of the ODAS are procedures for operational observation data processing, a variational analysis scheme, and an ocean general circulation model used to estimate the first guess fields involved in the analysis. In situ observations of temperature and salinity in the upper 1400-m ocean layer obtained from various observational platforms are used as input data. In the new ODAS version, the horizontal resolution of the assimilating model and of the output products is increased, the previous 2D-Var analysis scheme is replaced by a more general 3D-Var scheme, and a more flexible incremental analysis updating procedure is introduced to correct the model calculations. A reanalysis of the main World Ocean hydrophysical fields over the 2005-2015 period has been performed using the updated ODAS. The reanalysis results are compared with data from independent sources.
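
The analysis step shared by OI, 2D-Var and 3D-Var schemes corrects a model first guess toward observations, with weights set by the background and observation error statistics. A single-observation scalar sketch (illustrative only; operational schemes solve the full multivariate problem):

```python
def analysis_update(xb, y, H, B, R):
    """Scalar analysis update: xa = xb + K*(y - H*xb), with gain
    K = B*H / (H*B*H + R), where B and R are the background and observation
    error variances and H maps model space to observation space."""
    K = B * H / (H * B * H + R)
    return xb + K * (y - H * xb)

# equal trust in first guess and observation splits the innovation evenly
print(analysis_update(xb=10.0, y=12.0, H=1.0, B=1.0, R=1.0))  # → 11.0
```

Larger observation error R pulls the analysis back toward the first guess, which is how such schemes blend the circulation model with in situ temperature and salinity profiles.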

  12. Performance Evaluation of Counter-Based Dynamic Load Balancing Schemes for Massive Contingency Analysis with Different Computing Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Huang, Zhenyu; Chavarría-Miranda, Daniel

    Contingency analysis is a key function in the Energy Management System (EMS) used to assess the impact of various combinations of power system component failures based on state estimation. Contingency analysis is also extensively used in power market operation for feasibility testing of market solutions. High performance computing holds the promise of faster analysis of more contingency cases for the purpose of safe and reliable operation of today’s power grids with less operating margin and more intermittent renewable energy sources. This paper evaluates the performance of counter-based dynamic load balancing schemes for massive contingency analysis under different computing environments. Insights from the performance evaluation can be used as guidance for users to select suitable schemes in the application of massive contingency analysis. Case studies, as well as MATLAB simulations, of massive contingency cases using the Western Electricity Coordinating Council power grid model are presented to illustrate the application of high performance computing with counter-based dynamic load balancing schemes.
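
A counter-based dynamic load balancing scheme keeps a single shared counter of the next unprocessed case; each worker atomically increments it to claim work, so fast workers naturally take more cases. A minimal threaded Python sketch of the idea (illustrative; the paper's implementation targets HPC clusters, not Python threads):

```python
import threading

def run_contingencies(cases, analyze, n_workers=4):
    """Dispatch contingency cases to workers via a shared task counter."""
    lock = threading.Lock()
    next_case = [0]                      # shared counter of the next case
    results = [None] * len(cases)

    def worker():
        while True:
            with lock:                   # atomic fetch-and-increment
                i = next_case[0]
                next_case[0] += 1
            if i >= len(cases):
                return                   # no work left
            results[i] = analyze(cases[i])

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results

# toy "analysis": severity of case k is k squared
print(run_contingencies(list(range(10)), lambda k: k * k))
```

Because cases vary widely in runtime, this dynamic claim-by-counter dispatch keeps all workers busy, unlike a static even split of the case list.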

  13. Performance Analysis of Physical Layer Security of Opportunistic Scheduling in Multiuser Multirelay Cooperative Networks

    PubMed Central

    Shim, Kyusung; Do, Nhu Tri; An, Beongku

    2017-01-01

    In this paper, we study the physical layer security (PLS) of opportunistic scheduling for uplink scenarios of multiuser multirelay cooperative networks. To this end, we propose a low-complexity source relay selection scheme with comparable secrecy performance, called the proposed source relay selection (PSRS) scheme. Specifically, the PSRS scheme first selects the least vulnerable source and then selects the relay that maximizes the system secrecy capacity for the given selected source. Additionally, the maximal ratio combining (MRC) technique and the selection combining (SC) technique are each considered at the eavesdropper. Investigating the system performance in terms of secrecy outage probability (SOP), closed-form expressions of the SOP are derived. The developed analysis is corroborated through Monte Carlo simulation. Numerical results show that the PSRS scheme significantly improves the secrecy performance of the system compared to that of the random source relay selection scheme, but does not outperform the optimal joint source relay selection (OJSRS) scheme. However, the PSRS scheme drastically reduces the required amount of channel state information (CSI) estimations compared to that required by the OJSRS scheme, especially in dense cooperative networks. PMID:28212286
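
Selection schemes of this kind rank candidates by instantaneous secrecy capacity, the gap between the main-channel and eavesdropper-channel rates. A toy Python sketch of joint selection over all source-relay pairs (illustrative; the PSRS scheme decouples the two choices precisely to avoid this exhaustive search and its CSI cost):

```python
import math

def secrecy_capacity(snr_main, snr_eve):
    """Secrecy capacity in bits/s/Hz, floored at zero."""
    return max(0.0, math.log2(1 + snr_main) - math.log2(1 + snr_eve))

def select_source_relay(links):
    """links maps (source, relay) -> (main-link SNR, eavesdropper-link SNR);
    return the pair with the largest instantaneous secrecy capacity."""
    return max(links, key=lambda pair: secrecy_capacity(*links[pair]))

links = {('s1', 'r1'): (10.0, 2.0),
         ('s1', 'r2'): (20.0, 15.0),
         ('s2', 'r1'): (30.0, 1.0)}
print(select_source_relay(links))  # → ('s2', 'r1')
```

The exhaustive maximization above needs CSI for every pair; selecting the source first and then the relay, as PSRS does, trades a little secrecy capacity for far fewer channel estimates.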

  14. Organizational Schemes of Information Resources in Top 50 Academic Business Library Websites

    ERIC Educational Resources Information Center

    Kim, Soojung; DeCoster, Elizabeth

    2011-01-01

    This paper analyzes the organizational schemes of information resources found in top 50 academic business library websites through content analysis and discusses the development and evaluation of the identified schemes.

  15. Shape Analysis of Planar Multiply-Connected Objects Using Conformal Welding.

    PubMed

    Lok Ming Lui; Wei Zeng; Shing-Tung Yau; Xianfeng Gu

    2014-07-01

    Shape analysis is a central problem in the field of computer vision. In 2D shape analysis, classification and recognition of objects from their observed silhouettes are extremely crucial but difficult. It usually involves an efficient representation of 2D shape space with a metric, so that its mathematical structure can be used for further analysis. Although the study of 2D simply-connected shapes has been the subject of an extensive literature, the analysis of multiply-connected shapes is comparatively less studied. In this work, we propose a representation for general 2D multiply-connected domains with arbitrary topologies using conformal welding. A metric can be defined on the proposed representation space to measure dissimilarities between objects. The main idea is to map the exterior and interior of the domain conformally to unit disks and circle domains (unit disk with several inner disks removed), using holomorphic 1-forms. A set of diffeomorphisms of the unit circle S^1 can be obtained, which together with the conformal modules are used to define the shape signature. A shape distance between shape signatures can be defined to measure dissimilarities between shapes. We prove theoretically that the proposed shape signature uniquely determines the multiply-connected objects under suitable normalization. We also introduce a reconstruction algorithm to obtain shapes from their signatures. This completes our framework and allows us to move back and forth between shapes and signatures. With that, a morphing algorithm between shapes can be developed through the interpolation of the Beltrami coefficients associated with the signatures. Experiments have been carried out on shapes extracted from real images. Results demonstrate the efficacy of our proposed algorithm as a stable shape representation scheme.

  16. A review of accuracy assessment for object-based image analysis: From per-pixel to per-polygon approaches

    NASA Astrophysics Data System (ADS)

    Ye, Su; Pontius, Robert Gilmore; Rakshit, Rahul

    2018-07-01

    Object-based image analysis (OBIA) has gained widespread popularity for creating maps from remotely sensed data. Researchers routinely claim that OBIA procedures outperform pixel-based procedures; however, it is not immediately obvious how to evaluate the degree to which an OBIA map compares to reference information in a manner that accounts for the fact that the OBIA map consists of objects that vary in size and shape. Our study reviews 209 journal articles concerning OBIA published between 2003 and 2017. We focus on the three stages of accuracy assessment: (1) sampling design, (2) response design and (3) accuracy analysis. First, we report the literature's overall characteristics concerning OBIA accuracy assessment. Simple random sampling was the most used method among probability sampling strategies, slightly more than stratified sampling. Office-interpreted remotely sensed data was the dominant reference source. The literature reported accuracies ranging from 42% to 96%, with an average of 85%. A third of the articles failed to give sufficient information concerning accuracy methodology such as sampling scheme and sample size. We found few studies that focused specifically on the accuracy of the segmentation. Second, we identify a recent increase in OBIA articles using per-polygon approaches compared to per-pixel approaches for accuracy assessment. We clarify the impacts of the per-pixel versus the per-polygon approaches respectively on sampling, response design and accuracy analysis. Our review defines the technical and methodological needs in the current per-polygon approaches, such as polygon-based sampling, analysis of mixed polygons, matching of mapped with reference polygons and assessment of segmentation accuracy. Our review summarizes and discusses the current issues in object-based accuracy assessment to provide guidance for improved accuracy assessments for OBIA.
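
The per-pixel versus per-polygon distinction comes down to what the sample unit is and how units are weighted in the accuracy analysis. A small sketch of a weighted overall-accuracy computation (illustrative; real assessments also report class-wise and area-adjusted measures):

```python
def overall_accuracy(samples):
    """samples: (mapped_class, reference_class, weight) triples.
    Per-pixel assessment uses weight 1 per pixel; per-polygon assessment can
    weight each polygon by its area so large objects count proportionally."""
    total = sum(w for _, _, w in samples)
    return sum(w for m, r, w in samples if m == r) / total

polygons = [('greenhouse', 'greenhouse', 50.0),   # area-weighted polygons
            ('vegetation', 'vegetation', 30.0),
            ('greenhouse', 'vegetation', 20.0)]
print(overall_accuracy(polygons))  # → 0.8
```

With unit weights the same function reproduces the familiar per-pixel overall accuracy; the choice of weight is exactly the methodological question the review raises for mixed and variable-size polygons.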

  17. Genetic and economic evaluation of Japanese Black (Wagyu) cattle breeding schemes.

    PubMed

    Kahi, A K; Hirooka, H

    2005-09-01

    Deterministic simulation was used to evaluate 10 breeding schemes for genetic gain and profitability, in the context of maximizing returns from investment in Japanese Black cattle breeding. A breeding objective that integrated the cow-calf and feedlot segments was considered. Ten breeding schemes that differed in the records available for use as selection criteria were defined. The schemes ranged from one that used carcass traits currently available to Japanese Black cattle breeders (Scheme 1) to one that also included linear measurements and male and female reproduction traits (Scheme 10). The latter scheme represented the highest level of performance recording. In all breeding schemes, sires were chosen from the proportion selected during the first selection stage (performance testing), modeling a two-stage selection process. The effect on genetic gain and profitability of varying test capacity and number of progeny per sire and of ultrasound scanning of live animals was examined for all breeding schemes. Breeding schemes that selected young bulls during performance testing based on additional individual traits and information on carcass traits from their relatives generated additional genetic gain and profitability. Increasing test capacity resulted in an increase in genetic gain in all schemes. Profitability was optimal in Schemes 2 (a scheme similar to Scheme 1, but in which selection of young bulls was also based on information on carcass traits from their relatives) through 10 when 900 to 1,000 places were available for performance testing. Similarly, as the number of progeny used in the selection of sires increased, genetic gain first increased sharply and then gradually in all schemes. Profit was optimal across all breeding schemes when sires were selected based on information from 150 to 200 progeny. Additional genetic gain and profitability were generated in each breeding scheme with ultrasound scanning of live animals for carcass traits.
Ultrasound scanning of live animals was more important than the addition of any other traits in the selection criteria. These results may be used to provide guidance to Japanese Black cattle breeders.
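
Deterministic evaluations of breeding schemes like these rest on the classical prediction of genetic gain per year, ΔG = i·r·σ_a / L. A one-line sketch of that formula (textbook relation with invented numbers, not the paper's full bio-economic model):

```python
def genetic_gain_per_year(intensity, accuracy, sd_additive, gen_interval):
    """dG/year = i * r * sigma_a / L: selection intensity, accuracy of the
    selection criterion, additive genetic SD, and generation interval."""
    return intensity * accuracy * sd_additive / gen_interval

# e.g. extra records (such as ultrasound scanning) raising accuracy r from
# 0.4 to 0.5 lifts annual gain proportionally (all values invented)
print(genetic_gain_per_year(2.0, 0.4, 20.0, 5.0))  # → 3.2
print(genetic_gain_per_year(2.0, 0.5, 20.0, 5.0))  # → 4.0
```

This is why adding selection-criterion records raises gain in every scheme: each extra trait feeds through the accuracy term r, exactly the lever the ultrasound-scanning result illustrates.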

  18. Enhancing the Remote Variable Operations in NPSS/CCDK

    NASA Technical Reports Server (NTRS)

    Sang, Janche; Follen, Gregory; Kim, Chan; Lopez, Isaac; Townsend, Scott

    2001-01-01

    Many scientific applications in aerodynamics and solid mechanics are written in Fortran. Retrofitting these legacy Fortran codes with distributed objects can increase code reusability. The remote variable scheme provided in NPSS/CCDK helps programmers easily migrate Fortran codes toward a client-server platform. This scheme gives the client the capability of accessing variables at the server site. In this paper, we review and enhance the remote variable scheme by using the operator overloading features of C++. The enhancement enables NPSS programmers to use remote variables in much the same way as traditional variables. The remote variable scheme adopts a lazy update approach and a prefetch method. The design strategies and implementation techniques are described in detail. Preliminary performance evaluation shows that communication overhead can be greatly reduced.
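
The lazy-update idea behind the remote variable scheme can be sketched in a few lines: reads are fetched once and cached, writes are deferred until a flush, so repeated accesses avoid network round trips. A hypothetical Python analogue (the actual scheme is C++ with operator overloading; all class and method names here are invented):

```python
class RemoteVariable:
    """Client-side proxy for a server variable: lazy fetch, cached reads,
    deferred writes. None serves as a 'not yet fetched' sentinel, which is
    adequate for this sketch."""
    def __init__(self, server, name):
        self._server, self._name = server, name
        self._cache, self._dirty = None, False

    def get(self):
        if self._cache is None:
            self._cache = self._server.fetch(self._name)  # one round trip
        return self._cache

    def set(self, value):
        self._cache, self._dirty = value, True            # no round trip yet

    def flush(self):
        if self._dirty:                                   # push latest value
            self._server.store(self._name, self._cache)
            self._dirty = False

class FakeServer:
    """Stand-in for the remote side; counts round trips for illustration."""
    def __init__(self):
        self.vars = {'mach': 0.8}
        self.round_trips = 0
    def fetch(self, name):
        self.round_trips += 1
        return self.vars[name]
    def store(self, name, value):
        self.round_trips += 1
        self.vars[name] = value
```

With C++ operator overloading, `get`/`set` hide behind ordinary assignment and arithmetic, which is what lets remote variables read like traditional ones while the proxy batches the communication.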

  19. Quantum attack-resistant certificateless multi-receiver signcryption scheme.

    PubMed

    Li, Huixian; Chen, Xubao; Pang, Liaojun; Shi, Weisong

    2013-01-01

    The existing certificateless signcryption schemes were designed mainly on the basis of traditional public key cryptography, in which security relies on hard problems such as factor decomposition and discrete logarithm. However, these problems can be solved efficiently by quantum computing, so the existing certificateless signcryption schemes are vulnerable to quantum attack. Multivariate public key cryptography (MPKC), which can resist quantum attack, is one of the alternative solutions for guaranteeing the security of communications in the post-quantum age. Motivated by these concerns, we propose a new construction of a certificateless multi-receiver signcryption scheme (CLMSC) based on MPKC. The new scheme inherits the security of MPKC and can withstand quantum attack. Multivariate quadratic polynomial operations, which have lower computation complexity than bilinear pairing operations, are employed in signcrypting a message for a certain number of receivers in our scheme. Security analysis shows that our scheme is a secure MPKC-based scheme. We prove its security under the hardness of the Multivariate Quadratic (MQ) problem and its unforgeability under the Isomorphism of Polynomials (IP) assumption in the random oracle model. The analysis results show that our scheme also has the security properties of non-repudiation, perfect forward secrecy, perfect backward secrecy and public verifiability. Compared with the existing schemes in terms of computation complexity and ciphertext length, our scheme is more efficient, which makes it suitable for terminals with low computation capacity like smart cards.

  20. Phase II Evaluation of Clinical Coding Schemes

    PubMed Central

    Campbell, James R.; Carpenter, Paul; Sneiderman, Charles; Cohn, Simon; Chute, Christopher G.; Warren, Judith

    1997-01-01

    Abstract Objective: To compare three potential sources of controlled clinical terminology (READ codes version 3.1, SNOMED International, and Unified Medical Language System (UMLS) version 1.6) relative to attributes of completeness, clinical taxonomy, administrative mapping, term definitions and clarity (duplicate coding rate). Methods: The authors assembled 1929 source concept records from a variety of clinical information taken from four medical centers across the United States. The source data included medical as well as ample nursing terminology. The source records were coded in each scheme by an investigator and checked by the coding scheme owner. The codings were then scored by an independent panel of clinicians for acceptability. Codes were checked for definitions provided with the scheme. Codes for a random sample of source records were analyzed by an investigator for “parent” and “child” codes within the scheme. Parent and child pairs were scored by an independent panel of medical informatics specialists for clinical acceptability. Administrative and billing code mappings from the published schemes were reviewed for all coded records and analyzed by independent reviewers for accuracy. The investigator for each scheme exhaustively searched a sample of coded records for duplications. Results: SNOMED was judged to be significantly more complete in coding the source material than the other schemes (SNOMED* 70%; READ 57%; UMLS 50%; *p <.00001). SNOMED also had a richer clinical taxonomy judged by the number of acceptable first-degree relatives per coded concept (SNOMED* 4.56; UMLS 3.17; READ 2.14, *p <.005). Only the UMLS provided any definitions; these were found for 49% of records which had a coding assignment. READ and UMLS had better administrative mappings (composite score: READ* 40.6%; UMLS* 36.1%; SNOMED 20.7%, *p <.00001), and SNOMED had substantially more duplications of coding assignments (duplication rate: READ 0%; UMLS 4.2%; SNOMED* 13.9%, *p <.004) associated with a loss of clarity. Conclusion: No major terminology source can lay claim to being the ideal resource for a computer-based patient record. However, based upon this analysis of releases for April 1995, SNOMED International is considerably more complete, has a compositional nature and a richer taxonomy. It suffers from less clarity, resulting from a lack of syntax and evolutionary changes in its coding scheme. READ has greater clarity and better mapping to administrative schemes (ICD-10 and OPCS-4), is rapidly changing and is less complete. UMLS is a rich lexical resource, with mappings to many source vocabularies. It provides definitions for many of its terms. However, due to the varying granularities and purposes of its source schemes, it has limitations for the representation of clinical concepts within a computer-based patient record. PMID:9147343

  1. Review on dog rabies vaccination coverage in Africa: a question of dog accessibility or cost recovery?

    PubMed

    Jibat, Tariku; Hogeveen, Henk; Mourits, Monique C M

    2015-02-01

    Rabies still poses a significant human health problem throughout most of Africa, where the majority of the human cases result from dog bites. Mass dog vaccination is considered to be the most effective method to prevent rabies in humans. Our objective was to systematically review research articles on dog rabies parenteral vaccination coverage in Africa in relation to dog accessibility and vaccination cost recovery arrangement (i.e. free of charge or owner charged). A systematic literature search was made in the databases of CAB abstracts (EBSCOhost and OvidSP), Scopus, Web of Science, PubMed, Medline (EBSCOhost and OvidSP) and AJOL (African Journal Online) for peer reviewed articles on 1) rabies control, 2) dog rabies vaccination coverage and 3) dog demography in Africa. Identified articles were subsequently screened and selected using predefined selection criteria like year of publication (viz. ≥ 1990), type of study (cross sectional), objective(s) of the study (i.e. vaccination coverage rates, dog demographics and financial arrangements of vaccination costs), language of publication (English) and geographical focus (Africa). The selection process resulted in sixteen peer reviewed articles which were used to review dog demography and dog ownership status, and dog rabies vaccination coverage throughout Africa. The main review findings indicate that 1) the majority (up to 98.1%) of dogs in African countries are owned (and as such accessible), 2) puppies younger than 3 months of age constitute a considerable proportion (up to 30%) of the dog population and 3) male dogs dominate in numbers (up to 3.6 times the female dog population). Dog rabies parenteral vaccination coverage was compared between "free of charge" and "owner charged" vaccination schemes by means of meta-analysis.
Results indicate that the rabies vaccination coverage following a free of charge vaccination scheme (68%) is closer to the World Health Organization recommended coverage rate (70%) than the achieved coverage rate in owner-charged dog rabies vaccination schemes (18%). Most dogs in Africa are owned and accessible for parenteral vaccination against rabies if the campaign is performed "free of charge".
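
The pooled coverage figures quoted above come from a meta-analysis across studies. As a simplified stand-in for the inverse-variance pooling such analyses typically use, a sample-size-weighted pooled proportion can be computed as follows (study counts are invented for illustration):

```python
def pooled_coverage(studies):
    """studies: (dogs_vaccinated, dogs_examined) per study; returns the
    sample-size-weighted pooled vaccination coverage across studies."""
    vaccinated = sum(v for v, n in studies)
    examined = sum(n for _, n in studies)
    return vaccinated / examined

free_of_charge = [(700, 1000), (330, 500)]   # invented per-study counts
print(round(pooled_coverage(free_of_charge), 2))  # → 0.69
```

Comparing such pooled estimates against the WHO-recommended 70% threshold is exactly the comparison the review draws between free-of-charge (68%) and owner-charged (18%) schemes.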

  2. Review on Dog Rabies Vaccination Coverage in Africa: A Question of Dog Accessibility or Cost Recovery?

    PubMed Central

    Jibat, Tariku; Hogeveen, Henk; Mourits, Monique C. M.

    2015-01-01

    Background Rabies still poses a significant human health problem throughout most of Africa, where the majority of the human cases result from dog bites. Mass dog vaccination is considered to be the most effective method to prevent rabies in humans. Our objective was to systematically review research articles on dog rabies parenteral vaccination coverage in Africa in relation to dog accessibility and vaccination cost recovery arrangement (i.e. free of charge or owner charged). Methodology/Principal Findings A systematic literature search was made in the databases of CAB abstracts (EBSCOhost and OvidSP), Scopus, Web of Science, PubMed, Medline (EBSCOhost and OvidSP) and AJOL (African Journal Online) for peer reviewed articles on 1) rabies control, 2) dog rabies vaccination coverage and 3) dog demography in Africa. Identified articles were subsequently screened and selected using predefined selection criteria like year of publication (viz. ≥ 1990), type of study (cross sectional), objective(s) of the study (i.e. vaccination coverage rates, dog demographics and financial arrangements of vaccination costs), language of publication (English) and geographical focus (Africa). The selection process resulted in sixteen peer reviewed articles which were used to review dog demography and dog ownership status, and dog rabies vaccination coverage throughout Africa. The main review findings indicate that 1) the majority (up to 98.1%) of dogs in African countries are owned (and as such accessible), 2) puppies younger than 3 months of age constitute a considerable proportion (up to 30%) of the dog population and 3) male dogs dominate in numbers (up to 3.6 times the female dog population). Dog rabies parenteral vaccination coverage was compared between “free of charge” and “owner charged” vaccination schemes by means of meta-analysis.
Results indicate that the rabies vaccination coverage following a free of charge vaccination scheme (68%) is closer to the World Health Organization recommended coverage rate (70%) than the achieved coverage rate in owner-charged dog rabies vaccination schemes (18%). Conclusions/Significance Most dogs in Africa are owned and accessible for parenteral vaccination against rabies if the campaign is performed “free of charge”. PMID:25646774

  3. Differential evolution-simulated annealing for multiple sequence alignment

    NASA Astrophysics Data System (ADS)

    Addawe, R. C.; Addawe, J. M.; Sueño, M. R. K.; Magadia, J. C.

    2017-10-01

    Multiple sequence alignments (MSAs) are used in the analysis of molecular evolution and sequence-structure relationships. In this paper, a hybrid algorithm, Differential Evolution - Simulated Annealing (DESA), is applied to optimizing multiple sequence alignments based on structural information, non-gaps percentage and totally conserved columns. DESA is a robust algorithm characterized by self-organization, mutation, crossover, and an SA-like selection scheme for the strategy parameters. Here, the MSA problem is treated as a multi-objective optimization problem solved by the hybrid evolutionary algorithm, which we therefore name DESA-MSA. Simulated sequences and alignments were generated to evaluate the accuracy and efficiency of DESA-MSA using different indel sizes, sequence lengths, deletion rates and insertion rates. The proposed hybrid algorithm obtained acceptable solutions, particularly for the MSA problem evaluated based on the three objectives.
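
The DE-SA hybrid described above combines differential evolution's mutation and crossover with a simulated-annealing-style acceptance rule: a worse trial vector can still replace its parent with a temperature-controlled probability. A generic continuous-optimization sketch of that combination (not the authors' MSA-specific implementation; all parameter values are illustrative):

```python
import math
import random

def desa_minimize(f, bounds, pop_size=20, n_gen=200, F=0.7, CR=0.9,
                  t0=1.0, cooling=0.97, seed=3):
    """Differential evolution with SA-like acceptance (DESA sketch)."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    cost = [f(x) for x in pop]
    i0 = min(range(pop_size), key=cost.__getitem__)
    best_x, best_f = list(pop[i0]), cost[i0]
    temp = t0
    for _ in range(n_gen):
        for i in range(pop_size):
            # DE/rand/1 mutation from three distinct other members
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jr = rng.randrange(dim)
            trial = [pop[a][k] + F * (pop[b][k] - pop[c][k])
                     if (k == jr or rng.random() < CR) else pop[i][k]
                     for k in range(dim)]
            trial = [min(max(v, lo), hi) for v, (lo, hi) in zip(trial, bounds)]
            f_trial = f(trial)
            delta = f_trial - cost[i]
            # SA-like selection: always accept improvements, sometimes accept
            # worse trials with probability exp(-delta / temperature)
            if delta < 0 or rng.random() < math.exp(-delta / max(temp, 1e-12)):
                pop[i], cost[i] = trial, f_trial
                if f_trial < best_f:
                    best_x, best_f = list(trial), f_trial
        temp *= cooling                  # cooling schedule
    return best_x, best_f
```

Early on, the high temperature lets the population escape local optima; as it cools, the rule reduces to plain DE selection. For the MSA problem the same skeleton operates on alignment encodings and a multi-objective score rather than real vectors.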

  4. On the security of a novel probabilistic signature based on bilinear square Diffie-Hellman problem and its extension.

    PubMed

    Zhao, Zhenguo; Shi, Wenbo

    2014-01-01

    Probabilistic signature schemes have been widely used in modern electronic commerce since they provide integrity, authenticity, and nonrepudiation. Recently, Wu and Lin proposed a novel probabilistic signature (PS) scheme using the bilinear square Diffie-Hellman (BSDH) problem. They also extended it to a universal designated verifier signature (UDVS) scheme. In this paper, we analyze the security of Wu et al.'s PS and UDVS schemes. Through concrete attacks, we demonstrate that neither scheme is unforgeable. The security analysis shows that their schemes are not suitable for practical applications.

  5. Performance evaluation of object based greenhouse detection from Sentinel-2 MSI and Landsat 8 OLI data: A case study from Almería (Spain)

    NASA Astrophysics Data System (ADS)

    Novelli, Antonio; Aguilar, Manuel A.; Nemmaoui, Abderrahim; Aguilar, Fernando J.; Tarantino, Eufemia

    2016-10-01

    This paper presents the first comparison between data from the Sentinel-2 (S2) Multi Spectral Instrument (MSI) and the Landsat 8 (L8) Operational Land Imager (OLI) aimed at greenhouse detection. Two scenes closely related in time, one for each sensor, were classified by using Object Based Image Analysis and Random Forest (RF). The RF input consisted of several object-based features computed from the spectral bands, including mean values, spectral indices and textural features. The S2 and L8 data comparisons were also extended using a common segmentation dataset extracted from VHR WorldView-2 (WV2) imagery to test differences due only to their specific spectral contribution. The best band combinations for segmentation were found through a modified version of the Euclidean Distance 2 index. Four different RF classification schemes were considered, achieving best overall accuracies of 89.1%, 91.3%, 90.9% and 93.4%, respectively, evaluated over the whole study area.

  6. Optimisation of colour schemes to accurately display mass spectrometry imaging data based on human colour perception.

    PubMed

    Race, Alan M; Bunch, Josephine

    2015-03-01

    The choice of colour scheme used to present data can have a dramatic effect on the perceived structure present within the data. This is of particular significance in mass spectrometry imaging (MSI), where ion images that provide 2D distributions of a wide range of analytes are used to draw conclusions about the observed system. Commonly employed colour schemes are generally suboptimal for providing an accurate representation of the maximum amount of data. Rainbow-based colour schemes are extremely popular within the community, but they introduce well-documented artefacts which can be actively misleading in the interpretation of the data. In this article, we consider the suitability of colour schemes and composite image formation found in MSI literature in the context of human colour perception. We also discuss recommendations of rules for colour scheme selection for ion composites and multivariate analysis techniques such as principal component analysis (PCA).
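
The luminance artefact of rainbow colormaps that this record describes can be made concrete with a small check. The sketch below (an illustration, not code from the article) approximates a rainbow colormap with five RGB stops, computes the Rec. 709 relative luminance of each stop, and tests whether luminance increases monotonically with data value, as it does for a greyscale ramp:

```python
import numpy as np

# Rec. 709 relative luminance of an RGB colour (components in [0, 1]).
def luminance(rgb):
    r, g, b = rgb
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

# Crude 5-stop approximation of a rainbow ("jet"-like) colormap:
# blue -> cyan -> green -> yellow -> red.
rainbow = [(0, 0, 1), (0, 1, 1), (0, 1, 0), (1, 1, 0), (1, 0, 0)]

# A greyscale ramp of the same length, for comparison.
grey = [(v, v, v) for v in np.linspace(0, 1, 5)]

def is_monotonic(cmap):
    """True if perceived lightness increases with the mapped data value."""
    lum = [luminance(c) for c in cmap]
    return all(b >= a for a, b in zip(lum, lum[1:]))

print(is_monotonic(grey))     # True: greyscale luminance rises monotonically
print(is_monotonic(rainbow))  # False: rainbow luminance rises then falls
```

The non-monotonic luminance is exactly what creates spurious perceived structure in ion images.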

  7. A second step in development of a checklist for screening risk for violence in acute psychiatric patients: evaluation of interrater reliability of the Preliminary Scheme 33.

    PubMed

    Bjørkly, Stål; Moger, Tron A

    2007-12-01

    The Acute Project is a research project conducted on acute psychiatric admission wards in Norway. The objective is to develop and validate a structured, easy-to-use screening checklist for assessment of risk for violence in patients both during their stay in the ward and after discharge. The Preliminary Scheme 33 is a 33-item screening checklist with content domain inspired by the Historical-Clinical-Risk Management Scheme (HCR-20), the Brøset Violence Checklist, and eight risk factors extracted from the literature on risk assessment. The Preliminary Scheme 33 was designed and tested in two steps by a research group which includes the authors. The common aim of both steps was to develop this into a time economical, reliable, and valid checklist. In the first step in 2006 the predictive validity of the individual items was tested. The present work presents results from the second step, a study conducted to assess the interrater reliability of the 33 items. Eight clinicians working in an acute psychiatric unit volunteered to be raters and were trained to score the 33 items on a three-point scale in relation to 15 clinical vignettes, which contained information from 15 acute psychiatric patients' files. Analysis showed high interrater reliability for the total score with an intraclass correlation coefficient (ICC) of .86 (95% CI: 0.74-0.94). However, a substantial proportion of the items had medium to low ICCs. Consequences of this finding for further development of these items into a brief screen are discussed.

  8. Numerical analysis of electromagnetic cascades in emulsion chambers

    NASA Technical Reports Server (NTRS)

    Plyasheshnikov, A. V.; Vorobyev, K. V.

    1985-01-01

    A new Monte Carlo calculational scheme was developed for investigating the development of high- and extremely-high-energy electromagnetic cascades (EMC) in matter. The scheme was applied to the analysis of angular and radial distributions of EMC electrons in the atmosphere. By means of this scheme, EMC development in dense media was investigated, and some preliminary data are presented on the behavior of EMC in emulsion chambers. The results of a more detailed theoretical analysis of EMC development in emulsion chambers are discussed.

  9. Optimized scheme in coal-fired boiler combustion based on information entropy and modified K-prototypes algorithm

    NASA Astrophysics Data System (ADS)

    Gu, Hui; Zhu, Hongxia; Cui, Yanfeng; Si, Fengqi; Xue, Rui; Xi, Han; Zhang, Jiayu

    2018-06-01

    An integrated combustion optimization scheme is proposed that jointly considers coal-fired boiler combustion efficiency and outlet NOx emissions. Continuous attribute discretization and reduction are handled as optimization preparation by the E-Cluster and C_RED methods, in which the segmentation numbers need not be provided in advance and can be adapted continuously to the data characteristics. In order to obtain multi-objective results with a clustering method for mixed data, a modified K-prototypes algorithm is then proposed. This algorithm proceeds in two stages: a K-prototypes stage with self-adaptation of the cluster number, followed by clustering for multi-objective optimization. Field tests were carried out at a 660 MW coal-fired boiler to provide real data as a case study for controllable attribute discretization and reduction in the boiler system and for obtaining optimization parameters under the [max ηb, min yNOx] multi-objective rule.
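
The core of any K-prototypes variant, including the modified one described here, is a mixed-data dissimilarity: squared Euclidean distance on the numeric attributes plus a weight gamma times the number of categorical mismatches. A minimal sketch of that distance and one assignment step (illustrative toy data, not the paper's boiler records):

```python
import numpy as np

# K-prototypes mixed dissimilarity: numeric squared distance plus
# gamma * (number of categorical mismatches).
def kproto_distance(x_num, x_cat, proto_num, proto_cat, gamma=1.0):
    num_part = float(np.sum((np.asarray(x_num) - np.asarray(proto_num)) ** 2))
    cat_part = sum(a != b for a, b in zip(x_cat, proto_cat))
    return num_part + gamma * cat_part

# Assign each record to its nearest prototype (one step of the algorithm).
def assign(records, prototypes, gamma=1.0):
    labels = []
    for x_num, x_cat in records:
        d = [kproto_distance(x_num, x_cat, p_num, p_cat, gamma)
             for p_num, p_cat in prototypes]
        labels.append(int(np.argmin(d)))
    return labels

# Toy records: (numeric attributes, categorical attributes).
records = [((0.1, 0.2), ('low',)), ((0.9, 0.8), ('high',)),
           ((0.2, 0.1), ('low',))]
prototypes = [((0.0, 0.0), ('low',)), ((1.0, 1.0), ('high',))]
print(assign(records, prototypes))  # [0, 1, 0]
```

A full implementation would iterate assignment and prototype updates (means for numeric, modes for categorical attributes) until convergence.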

  10. Free-form reticulated shell structures searched for maximum buckling strength

    NASA Astrophysics Data System (ADS)

    Takiuchi, Yuji; Kato, Shiro; Nakazawa, Shoji

    2017-10-01

    In this paper, a shape optimization scheme is proposed for maximizing the buckling strength of free-form steel reticulated shells. To assess the effectiveness of different objective functions for maximizing buckling strength, several optimizations are applied to shallow steel single-layer reticulated shells with rigidly jointed tubular members. The objective functions compared are linear buckling load, strain energy, initial yield load, and elasto-plastic buckling strength evaluated with the Modified Dunkerley Formula. For the free forms obtained from the four optimization schemes, both elastic and elasto-plastic buckling behaviour are investigated and compared, accounting for geometrical imperfections. It is concluded that the first and fourth optimization methods are effective from the viewpoint of buckling strength. The relation between the generalized slenderness ratio and the appropriate objective function for buckling strength maximization is also clarified.

  11. Rate-distortion optimized tree-structured compression algorithms for piecewise polynomial images.

    PubMed

    Shukla, Rahul; Dragotti, Pier Luigi; Do, Minh N; Vetterli, Martin

    2005-03-01

    This paper presents novel coding algorithms based on tree-structured segmentation, which achieve the correct asymptotic rate-distortion (R-D) behavior for a simple class of signals, known as piecewise polynomials, by using an R-D based prune and join scheme. For the one-dimensional case, our scheme is based on binary-tree segmentation of the signal. This scheme approximates the signal segments using polynomial models and utilizes an R-D optimal bit allocation strategy among the different signal segments. The scheme further encodes similar neighbors jointly to achieve the correct exponentially decaying R-D behavior, D(R) ~ c0·2^(-c1·R), thus improving over classic wavelet schemes. We also prove that the computational complexity of the scheme is O(N log N). We then show the extension of this scheme to the two-dimensional case using a quadtree. This quadtree-coding scheme also achieves an exponentially decaying R-D behavior, for the polygonal image model composed of a white polygon-shaped object against a uniform black background, with a low computational cost of O(N log N). Again, the key is an R-D optimized prune and join strategy. Finally, we conclude with numerical results, which show that the proposed quadtree-coding scheme outperforms JPEG2000 by about 1 dB for real images, like cameraman, at low rates of around 0.15 bpp.
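
The pruning decision at the heart of such R-D optimized tree coders can be sketched with a Lagrangian cost J = D + λR: a node keeps its children only if their combined cost beats coding the node with a single model. The toy rates and distortions below are illustrative, not from the paper:

```python
# R-D optimal pruning on a binary tree. Each node carries the rate and
# distortion of coding its segment with one polynomial model; a node is
# split only if the children's Lagrangian cost J = D + lam * R is lower.
def prune(node, lam):
    """node = (own_rate, own_dist, left, right); leaves have left=right=None.
    Returns (rate, dist, tree) of the R-D optimal pruned subtree."""
    own_rate, own_dist, left, right = node
    own_cost = own_dist + lam * own_rate
    if left is None:  # leaf: nothing to prune
        return own_rate, own_dist, node
    lr, ld, lt = prune(left, lam)
    rr, rd, rt = prune(right, lam)
    split_rate, split_dist = lr + rr, ld + rd
    if split_dist + lam * split_rate < own_cost:
        return split_rate, split_dist, (own_rate, own_dist, lt, rt)
    return own_rate, own_dist, (own_rate, own_dist, None, None)  # prune children

# Toy tree: coding the root alone costs 10 bits at distortion 8.0;
# its two leaves together cost 12 bits at distortion 1.0.
tree = (10, 8.0, (6, 0.5, None, None), (6, 0.5, None, None))
print(prune(tree, lam=0.5)[:2])   # (12, 1.0): splitting wins at low lambda
print(prune(tree, lam=10.0)[:2])  # (10, 8.0): pruning wins at high lambda
```

Sweeping λ traces out the operational R-D curve; the join step of the paper would additionally merge similar neighboring leaves.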

  12. Revised Chapman-Enskog analysis for a class of forcing schemes in the lattice Boltzmann method

    NASA Astrophysics Data System (ADS)

    Li, Q.; Zhou, P.; Yan, H. J.

    2016-10-01

    In the lattice Boltzmann (LB) method, the forcing scheme, which is used to incorporate an external or internal force into the LB equation, plays an important role. It determines whether the force of the system is correctly implemented in an LB model and affects the numerical accuracy. In this paper we aim to clarify a critical issue about the Chapman-Enskog analysis for a class of forcing schemes in the LB method in which the velocity in the equilibrium density distribution function is given by u = Σ_α e_α f_α / ρ, while the actual fluid velocity is defined as û = u + δt F / (2ρ). It is shown that the usual Chapman-Enskog analysis for this class of forcing schemes should be revised so as to derive the actual macroscopic equations recovered from these forcing schemes. Three forcing schemes belonging to the above class are analyzed, among which Wagner's forcing scheme [A. J. Wagner, Phys. Rev. E 74, 056703 (2006), 10.1103/PhysRevE.74.056703] is shown to be capable of reproducing the correct macroscopic equations. The theoretical analyses are examined and demonstrated with two numerical tests, including the simulation of Womersley flow and the modeling of flat and circular interfaces by the pseudopotential multiphase LB model.
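
The two velocity definitions that distinguish this class of forcing schemes are easy to state in code. The sketch below computes, for a D2Q9 lattice, the bare moment velocity u = Σ_α e_α f_α / ρ and the half-force-shifted fluid velocity û = u + δt F / (2ρ); the populations and force are illustrative:

```python
import numpy as np

# D2Q9 lattice velocities e_alpha.
e = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]], dtype=float)

def velocities(f, F, dt=1.0):
    """Bare equilibrium velocity u = sum_a e_a f_a / rho and the actual
    fluid velocity u_hat = u + dt * F / (2 rho) used by this class of
    forcing schemes."""
    rho = f.sum()
    u = (e * f[:, None]).sum(axis=0) / rho
    u_hat = u + dt * F / (2.0 * rho)
    return rho, u, u_hat

# Uniform populations at rest plus a small body force along x.
f = np.full(9, 1.0 / 9.0)
F = np.array([0.018, 0.0])
rho, u, u_hat = velocities(f, F)
print(rho, u, u_hat)  # rho = 1.0, u = [0, 0], u_hat = [0.009, 0]
```

The half-step force shift in û is exactly the term whose treatment the revised Chapman-Enskog analysis in the paper addresses.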

  13. Comparison of finite-difference schemes for analysis of shells of revolution. [stress and free vibration analysis

    NASA Technical Reports Server (NTRS)

    Noor, A. K.; Stephens, W. B.

    1973-01-01

    Several finite difference schemes are applied to the stress and free vibration analysis of homogeneous isotropic and layered orthotropic shells of revolution. The study is based on a form of the Sanders-Budiansky first-approximation linear shell theory modified such that the effects of shear deformation and rotary inertia are included. A Fourier approach is used in which all the shell stress resultants and displacements are expanded in a Fourier series in the circumferential direction, and the governing equations reduce to ordinary differential equations in the meridional direction. While primary attention is given to finite difference schemes used in conjunction with first order differential equation formulation, comparison is made with finite difference schemes used with other formulations. These finite difference discretization models are compared with respect to simplicity of application, convergence characteristics, and computational efficiency. Numerical studies are presented for the effects of variations in shell geometry and lamination parameters on the accuracy and convergence of the solutions obtained by the different finite difference schemes. On the basis of the present study it is shown that the mixed finite difference scheme based on the first order differential equation formulation and two interlacing grids for the different fundamental unknowns combines a number of advantages over other finite difference schemes previously reported in the literature.

  14. Tracking multiple particles in fluorescence time-lapse microscopy images via probabilistic data association.

    PubMed

    Godinez, William J; Rohr, Karl

    2015-02-01

    Tracking subcellular structures as well as viral structures displayed as 'particles' in fluorescence microscopy images yields quantitative information on the underlying dynamical processes. We have developed an approach for tracking multiple fluorescent particles based on probabilistic data association. The approach combines a localization scheme that uses a bottom-up strategy based on the spot-enhancing filter as well as a top-down strategy based on an ellipsoidal sampling scheme that uses the Gaussian probability distributions computed by a Kalman filter. The localization scheme yields multiple measurements that are incorporated into the Kalman filter via a combined innovation, where the association probabilities are interpreted as weights calculated using an image likelihood. To track objects in close proximity, we compute the support of each image position relative to the neighboring objects of a tracked object and use this support to recalculate the weights. To cope with multiple motion models, we integrated the interacting multiple model algorithm. The approach has been successfully applied to synthetic 2-D and 3-D images as well as to real 2-D and 3-D microscopy images, and the performance has been quantified. In addition, the approach was successfully applied to the 2-D and 3-D image data of the recent Particle Tracking Challenge at the IEEE International Symposium on Biomedical Imaging (ISBI) 2012.
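
The "combined innovation" idea of probabilistic data association can be sketched in a few lines: each candidate measurement's innovation is weighted by an association probability (here taken proportional to a Gaussian likelihood under the innovation covariance S, a simplification of the paper's image likelihood), and the filter update uses the weighted sum:

```python
import numpy as np

# Combined innovation of probabilistic data association: each measurement's
# innovation nu_i is weighted by its association probability beta_i, and the
# Kalman update uses the weighted sum of innovations.
def combined_innovation(z_pred, measurements, S):
    S_inv = np.linalg.inv(S)
    nus = [z - z_pred for z in measurements]
    # Gaussian likelihood of each innovation under N(0, S).
    w = np.array([np.exp(-0.5 * nu @ S_inv @ nu) for nu in nus])
    beta = w / w.sum()                       # association probabilities
    return sum(b * nu for b, nu in zip(beta, nus)), beta

z_pred = np.array([5.0, 5.0])
measurements = [np.array([5.1, 5.0]), np.array([9.0, 9.0])]  # one near, one far
S = np.eye(2) * 0.5
nu, beta = combined_innovation(z_pred, measurements, S)
print(beta)  # the nearby measurement dominates the association weights
print(nu)    # combined innovation stays close to the near measurement's
```

In the full approach, these weights are further recalculated using the support of neighboring tracked objects to handle particles in close proximity.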

  15. A new multi-symplectic scheme for the generalized Kadomtsev-Petviashvili equation

    NASA Astrophysics Data System (ADS)

    Li, Haochen; Sun, Jianqiang

    2012-09-01

    We propose a new scheme for the generalized Kadomtsev-Petviashvili (KP) equation. The multi-symplectic conservation property of the new scheme is proved. Backward error analysis shows that the new multi-symplectic scheme has second-order accuracy in space and time. Numerical applications to the KPI and KPII equations are presented in detail.

  16. Nonlinear dynamic model for visual object tracking on Grassmann manifolds with partial occlusion handling.

    PubMed

    Khan, Zulfiqar Hasan; Gu, Irene Yu-Hua

    2013-12-01

    This paper proposes a novel Bayesian online learning and tracking scheme for video objects on Grassmann manifolds. Although manifold visual object tracking is promising, large and fast nonplanar (or out-of-plane) pose changes and long-term partial occlusions of deformable objects in video remain a challenge that limits tracking performance. The proposed method tackles these problems with the following main novelties: 1) online estimation of object appearances on Grassmann manifolds; 2) optimal criterion-based occlusion handling for online updating of object appearances; 3) a nonlinear dynamic model for both the appearance basis matrix and its velocity; and 4) Bayesian formulations, separately for the tracking process and the online learning process, realized by employing two particle filters: one on the manifold for generating appearance particles and another on the linear space for generating affine box particles. Tracking and online updating are performed in an alternating fashion to mitigate tracking drift. Experiments using the proposed tracker on videos captured by a single dynamic/static camera have shown robust tracking performance, particularly for scenarios in which target objects undergo significant nonplanar pose changes and long-term partial occlusions. Comparisons and evaluations against eight existing state-of-the-art and most relevant manifold/nonmanifold trackers provide further support for the proposed scheme.

  17. Single-Grasp Object Classification and Feature Extraction with Simple Robot Hands and Tactile Sensors.

    PubMed

    Spiers, Adam J; Liarokapis, Minas V; Calli, Berk; Dollar, Aaron M

    2016-01-01

    Classical robotic approaches to tactile object identification often involve rigid mechanical grippers, dense sensor arrays, and exploratory procedures (EPs). Though EPs are a natural method for humans to acquire object information, evidence also exists for meaningful tactile property inference from brief, non-exploratory motions (a 'haptic glance'). In this work, we implement tactile object identification and feature extraction techniques on data acquired during a single, unplanned grasp with a simple, underactuated robot hand equipped with inexpensive barometric pressure sensors. Our methodology utilizes two cooperating schemes based on an advanced machine learning technique (random forests) and parametric methods that estimate object properties. The available data are limited to actuator positions (one per two-link finger) and force sensor values (eight per finger). The schemes are able to work both independently and collaboratively, depending on the task scenario. When collaborating, the results of each method contribute to the other, improving the overall result in a synergistic fashion. Unlike prior work, the proposed approach does not require object exploration, re-grasping, grasp release, or force modulation, and works for arbitrary object start positions and orientations. Due to these factors, the technique may be integrated into practical robotic grasping scenarios without adding time or manipulation overheads.

  18. Space Station racks weight and CG measurement using the rack insertion end-effector

    NASA Technical Reports Server (NTRS)

    Brewer, William V.

    1994-01-01

    The objective was to design a method to measure weight and center of gravity (C.G.) location for Space Station Modules by adding sensors to the existing Rack Insertion End Effector (RIEE). Accomplishments included alternative sensor placement schemes organized into categories. Vendors were queried for suitable sensor equipment recommendations. Inverse mathematical models for each category determine expected maximum sensor loads. Sensors are selected using these computations, yielding cost and accuracy data. Accuracy data for individual sensors are inserted into forward mathematical models to estimate the accuracy of an overall sensor scheme. Cost of the schemes can be estimated. Ease of implementation and operation are discussed.
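
The forward model behind such a sensor scheme is elementary statics: total weight is the sum of the load readings, and the CG is the load-weighted mean of the sensor positions (moment balance). A minimal sketch with hypothetical sensor positions and readings:

```python
import numpy as np

# Statics sketch: load cells at known (x, y) positions support the rack.
# Total weight W = sum of readings; CG from moment balance:
# x_cg = sum(F_i * x_i) / W (and likewise for y).
def weight_and_cg(positions, forces):
    positions = np.asarray(positions, dtype=float)
    forces = np.asarray(forces, dtype=float)
    W = forces.sum()
    cg = (positions * forces[:, None]).sum(axis=0) / W
    return W, cg

# Hypothetical three-sensor layout (metres) and readings (newtons).
positions = [(0.0, 0.0), (2.0, 0.0), (1.0, 1.5)]
forces = [300.0, 300.0, 400.0]
W, cg = weight_and_cg(positions, forces)
print(W, cg)  # 1000.0, [1.0, 0.6]
```

The inverse models mentioned in the record run this relation backwards: given a worst-case weight and CG envelope, they bound the maximum load any one sensor must tolerate, which drives sensor selection and cost.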

  19. Correspondence between quantization schemes for two-player nonzero-sum games and CNOT complexity

    NASA Astrophysics Data System (ADS)

    Vijayakrishnan, V.; Balakrishnan, S.

    2018-05-01

    The well-known quantization schemes for two-player nonzero-sum games are the Eisert-Wilkens-Lewenstein scheme and the Marinatto-Weber scheme. In this work, we establish the connection between the two schemes from the perspective of quantum circuits. Further, we provide the correspondence between any game quantization scheme and the CNOT complexity, where CNOT complexity is defined up to local unitary operations. While CNOT complexity is known to be useful in the analysis of universal quantum circuits, in this work we find its applicability in quantum game theory.

  20. A secure biometrics-based authentication scheme for telecare medicine information systems.

    PubMed

    Yan, Xiaopeng; Li, Weiheng; Li, Ping; Wang, Jiantao; Hao, Xinhong; Gong, Peng

    2013-10-01

    The telecare medicine information system (TMIS) allows patients and doctors to access medical services or medical information at remote sites, and thus offers great convenience. To safeguard patients' privacy, authentication schemes for the TMIS have attracted wide attention. Recently, Tan proposed an efficient biometrics-based authentication scheme for the TMIS and claimed that it could withstand various attacks. However, in this paper, we point out that Tan's scheme is vulnerable to the denial-of-service attack. To enhance security, we also propose an improved scheme based on Tan's work. Security and performance analysis shows that our scheme not only overcomes the weaknesses in Tan's scheme but also has better performance.

  1. Parametric Study of Decay of Homogeneous Isotropic Turbulence Using Large Eddy Simulation

    NASA Technical Reports Server (NTRS)

    Swanson, R. C.; Rumsey, Christopher L.; Rubinstein, Robert; Balakumar, Ponnampalam; Zang, Thomas A.

    2012-01-01

    Numerical simulations of decaying homogeneous isotropic turbulence are performed with both low-order and high-order spatial discretization schemes. The turbulent Mach and Reynolds numbers for the simulations are 0.2 and 250, respectively. For the low-order schemes we use either second-order central or third-order upwind biased differencing. For higher order approximations we apply weighted essentially non-oscillatory (WENO) schemes, both with linear and nonlinear weights. There are two objectives in this preliminary effort to investigate possible schemes for large eddy simulation (LES). One is to explore the capability of a widely used low-order computational fluid dynamics (CFD) code to perform LES computations. The other is to determine the effect of higher order accuracy (fifth, seventh, and ninth order) achieved with high-order upwind biased WENO-based schemes. Turbulence statistics, such as kinetic energy, dissipation, and skewness, along with the energy spectra from simulations of the decaying turbulence problem are used to assess and compare the various numerical schemes. In addition, results from the best performing schemes are compared with those from a spectral scheme. The effects of grid density, ranging from 32 cubed to 192 cubed, on the computations are also examined. The fifth-order WENO-based scheme is found to be too dissipative, especially on the coarser grids. However, with the seventh-order and ninth-order WENO-based schemes we observe a significant improvement in accuracy relative to the lower order LES schemes, as revealed by the computed peak in the energy dissipation and by the energy spectrum.
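
The nonlinear versus linear WENO weights that the study compares can be illustrated at a single cell interface. The sketch below implements the standard Jiang-Shu fifth-order WENO smoothness indicators and weights for a five-point stencil (a textbook formulation, not the paper's specific scheme): on smooth data the nonlinear weights reduce to the linear weights (0.1, 0.6, 0.3), while near a discontinuity the weight shifts to the smooth sub-stencil.

```python
import numpy as np

# Fifth-order WENO (Jiang-Shu) nonlinear weights for one cell interface,
# from the five-point stencil values v[i-2..i+2].
def weno5_weights(v, eps=1e-6):
    # Smoothness indicators of the three candidate 3-point sub-stencils.
    b0 = 13/12 * (v[0] - 2*v[1] + v[2])**2 + 1/4 * (v[0] - 4*v[1] + 3*v[2])**2
    b1 = 13/12 * (v[1] - 2*v[2] + v[3])**2 + 1/4 * (v[1] - v[3])**2
    b2 = 13/12 * (v[2] - 2*v[3] + v[4])**2 + 1/4 * (3*v[2] - 4*v[3] + v[4])**2
    d = np.array([0.1, 0.6, 0.3])                 # linear (optimal) weights
    alpha = d / (eps + np.array([b0, b1, b2]))**2
    return alpha / alpha.sum()

smooth = np.linspace(0.0, 0.4, 5)            # smooth (linear) data
shock = np.array([0.0, 0.0, 0.0, 1.0, 1.0])  # discontinuity in the stencil
print(weno5_weights(smooth))  # ~[0.1, 0.6, 0.3]: fifth-order accuracy kept
print(weno5_weights(shock))   # weight concentrates on the smooth sub-stencil
```

This adaptivity is also the source of the extra numerical dissipation the study measures against the spectral reference.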

  2. Assessment of the reduction methods used to develop chemical schemes: building of a new chemical scheme for VOC oxidation suited to three-dimensional multiscale HOx-NOx-VOC chemistry simulations

    NASA Astrophysics Data System (ADS)

    Szopa, S.; Aumont, B.; Madronich, S.

    2005-09-01

    The objective of this work was to develop and assess an automatic procedure to generate reduced chemical schemes for the atmospheric photooxidation of volatile organic carbon (VOC) compounds. The procedure is based on (i) the development of a tool for writing the fully explicit schemes for VOC oxidation (see companion paper Aumont et al., 2005), (ii) the application of several commonly used reduction methods to the fully explicit scheme, and (iii) the assessment of resulting errors based on direct comparison between the reduced and full schemes.

    The reference scheme included seventy emitted VOCs chosen to be representative of both anthropogenic and biogenic emissions, and their atmospheric degradation chemistry required more than two million reactions among 350000 species. Three methods were applied to reduce the size of the reference chemical scheme: (i) use of operators, based on the redundancy of the reaction sequences involved in the VOC oxidation, (ii) grouping of primary species having similar reactivities into surrogate species and (iii) grouping of some secondary products into surrogate species. The number of species in the final reduced scheme is 147, this being small enough for practical inclusion in current three-dimensional models. Comparisons between the fully explicit and reduced schemes, carried out with a box model for several typical tropospheric conditions, showed that the reduced chemical scheme accurately predicts ozone concentrations and some other aspects of oxidant chemistry for both polluted and clean tropospheric conditions.
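
The second reduction method, grouping primary species of similar reactivity into surrogates, can be sketched as follows. This is an illustrative simplification, not the paper's procedure: species are ordered by OH rate constant and lumped into groups whose effective rate constant is the emission-weighted mean of the members (the names, rate constants, and emissions below are made up):

```python
# Sketch of the "surrogate species" reduction step: primary VOCs with
# similar OH reactivity are lumped into one surrogate whose rate constant
# is the emission-weighted mean of its members. All values are illustrative.
def lump(species, n_groups):
    """species: list of (name, k_OH, emission). Groups by sorted k_OH."""
    ordered = sorted(species, key=lambda s: s[1])
    size = -(-len(ordered) // n_groups)  # ceiling division
    surrogates = []
    for i in range(0, len(ordered), size):
        group = ordered[i:i + size]
        e_tot = sum(e for _, _, e in group)
        k_eff = sum(k * e for _, k, e in group) / e_tot  # emission-weighted
        surrogates.append(([n for n, _, _ in group], k_eff, e_tot))
    return surrogates

species = [("ethane", 0.25, 10.0), ("propane", 1.1, 8.0),
           ("isoprene", 100.0, 5.0), ("butene", 31.0, 4.0)]
for names, k, e in lump(species, 2):
    print(names, round(k, 2), e)
```

The assessment step of the paper then amounts to running both the full and the lumped mechanism in a box model and comparing predicted oxidant concentrations.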

  3. Identifing Atmospheric Pollutant Sources Using Artificial Neural Networks

    NASA Astrophysics Data System (ADS)

    Paes, F. F.; Campos, H. F.; Luz, E. P.; Carvalho, A. R.

    2008-05-01

    The estimation of area-source pollutant strength is a relevant issue for the atmospheric environment and constitutes an inverse problem in atmospheric pollution dispersion. In the inverse analysis, an area source domain is considered, where the strength of the area source term is assumed unknown. The inverse problem is solved using a supervised artificial neural network: a multi-layer perceptron. The connection weights of the neural network are computed with the delta rule learning process. The neural network inversion is compared with results from a standard inverse analysis (regularized inverse solution). In the regularization method, the inverse problem is formulated as a non-linear optimization problem, in which the objective function is given by the square difference between the measured pollutant concentrations and the mathematical model, associated with a regularization operator. In our numerical experiments, the forward problem is addressed by a source-receptor scheme, where a regressive Lagrangian model is applied to compute the transition matrix. The second-order maximum entropy regularization is used, and the regularization parameter is calculated by the L-curve technique. The objective function is minimized employing a deterministic scheme (a quasi-Newton algorithm) [1] and a stochastic technique (PSO: particle swarm optimization) [2]. The inverse problem methodology is tested with synthetic observational data from six measurement points in the physical domain. The best inverse solutions were obtained with the neural networks. References: [1] D. R. Roberti, D. Anfossi, H. F. Campos Velho, G. A. Degrazia (2005): Estimating Emission Rate and Pollutant Source Location, Ciencia e Natura, p. 131-134. [2] E.F.P. da Luz, H.F. de Campos Velho, J.C. Becceneri, D.R. Roberti (2007): Estimating Atmospheric Area Source Strength Through Particle Swarm Optimization. Inverse Problems, Design and Optimization Symposium IPDO-2007, April 16-18, Miami (FL), USA, vol 1, p. 354-359.
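
The stochastic minimization route mentioned above can be sketched with a minimal particle swarm optimizer applied to a regularized least-squares objective. The quadratic test objective and all parameters below are illustrative, standing in for the actual misfit-plus-entropy functional of the retrieval:

```python
import numpy as np

rng = np.random.default_rng(0)

# Minimal PSO sketch: particles move under inertia plus attraction to their
# personal best and the global best; the global best approximates argmin.
def pso(objective, dim, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5):
    x = rng.uniform(-5, 5, (n_particles, dim))    # positions
    v = np.zeros_like(x)                          # velocities
    pbest, pbest_val = x.copy(), np.array([objective(p) for p in x])
    g = pbest[pbest_val.argmin()].copy()          # global best
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        vals = np.array([objective(p) for p in x])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        g = pbest[pbest_val.argmin()].copy()
    return g, pbest_val.min()

# Toy objective: misfit to a known "source strength" plus a Tikhonov term.
target = np.array([2.0, -1.0])
obj = lambda s: np.sum((s - target) ** 2) + 0.01 * np.sum(s ** 2)
best, val = pso(obj, dim=2)
print(best)  # close to the regularized minimizer target / 1.01
```

In the actual retrieval, the objective would evaluate the Lagrangian forward model at the candidate source strengths instead of this closed-form quadratic.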

  4. Assessing Long-Term Seagrass Changes by Integrating a High-Spatial Resolution Image, Historical Aerial Photography and Field Data

    NASA Astrophysics Data System (ADS)

    Leon-Perez, M.; Hernandez, W. J.; Armstrong, R.

    2016-02-01

    Reported cases of seagrass loss have increased over the last 40 years, raising awareness of the need to assess seagrass health. In situ monitoring has been the main method to assess spatial and temporal changes in seagrass ecosystems. Although remote sensing techniques with multispectral imagery have recently been used for these purposes, long-term analysis is limited to the sensor's mission life. The objective of this project is to determine long-term changes in seagrass habitat cover at the Caja de Muertos Island Nature Reserve by combining in situ data with a satellite image and historical aerial photography. A recent WorldView-2 satellite image was used to generate a 2014 benthic habitat map for the study area. The multispectral image was pre-processed by converting digital numbers to radiance and applying atmospheric and water column corrections. Object-based image analysis was used to segment the image into polygons representing different benthic habitats and to classify those habitats according to the classification scheme developed for this project. The scheme includes the following benthic habitat categories: seagrass (sparse, dense and very dense), colonized hard bottom (sparse, dense and very dense), sand, and mixed algae on unconsolidated sediments. Field work was used to calibrate the satellite-derived benthic maps and to assess the accuracy of the final products. In addition, a time series of satellite imagery and historical aerial photography from 1950 to 2014 provided data to assess long-term changes in seagrass habitat cover within the Reserve. Preliminary results show an increase in seagrass habitat cover, contrasting with the worldwide declining trend. The results of this study will provide valuable information for the conservation and management of seagrass habitat in the Caja de Muertos Island Nature Reserve.

  5. Selecting registration schemes in case of interstitial lung disease follow-up in CT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vlachopoulos, Georgios; Korfiatis, Panayiotis; Skiadopoulos, Spyros

    Purpose: The primary goal of this study is to select optimal registration schemes in the framework of interstitial lung disease (ILD) follow-up analysis in CT. Methods: A set of 128 multiresolution schemes composed of multiresolution nonrigid and combinations of rigid and nonrigid registration schemes are evaluated, utilizing ten artificially warped ILD follow-up volumes, originating from ten clinical volumetric CT scans of ILD-affected patients, to select candidate optimal schemes. Specifically, all combinations of four transformation models (three rigid: rigid, similarity, affine; and one nonrigid: third-order B-spline), four cost functions (sum-of-square distances, normalized correlation coefficient, mutual information, and normalized mutual information), four gradient descent optimizers (standard, regular step, adaptive stochastic, and finite difference), and two types of pyramids (recursive and Gaussian-smoothing) were considered. The selection process involves two stages. The first stage identifies schemes with deformation field singularities, according to the determinant of the Jacobian matrix. In the second stage, the evaluation methodology is based on the distance between corresponding landmark points in both normal lung parenchyma (NLP) and ILD-affected regions. Statistical analysis was performed in order to select near-optimal registration schemes per evaluation metric. Performance of the candidate registration schemes was verified on a case sample of ten clinical follow-up CT scans to obtain the selected registration schemes. Results: By considering near-optimal schemes common to all ranking lists, 16 out of 128 registration schemes were initially selected. These schemes obtained submillimeter registration accuracy in terms of average distance errors, 0.18 ± 0.01 mm for NLP and 0.20 ± 0.01 mm for ILD, in the case of artificially generated follow-up data. Registration accuracy in terms of average distance error in clinical follow-up data was in the range of 1.985–2.156 mm and 1.966–2.234 mm for NLP and ILD-affected regions, respectively, excluding schemes with statistically significantly lower performance (Wilcoxon signed-ranks test, p < 0.05), resulting in 13 finally selected registration schemes. Conclusions: The selected registration schemes for ILD CT follow-up analysis indicate the significance of the adaptive stochastic gradient descent optimizer, as well as the importance of combined rigid and nonrigid schemes providing high accuracy and time efficiency. The selected optimal deformable registration schemes are equivalent in terms of their accuracy and thus compatible in terms of their clinical outcome.
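
The first-stage singularity screen can be sketched directly: a deformation field x → x + u(x) folds wherever the determinant of its Jacobian becomes non-positive. A 2-D toy version (illustrative, not the study's 3-D pipeline):

```python
import numpy as np

# Screen a 2-D deformation field for singularities: the mapping
# x -> x + u(x) folds where det(J) = det(I + grad u) <= 0.
def has_folding(ux, uy, spacing=1.0):
    """ux, uy: displacement components sampled on a regular grid."""
    duxdy, duxdx = np.gradient(ux, spacing)  # gradients along axis 0, axis 1
    duydy, duydx = np.gradient(uy, spacing)
    det = (1 + duxdx) * (1 + duydy) - duxdy * duydx
    return bool((det <= 0).any())

n = 32
y, x = np.mgrid[0:n, 0:n].astype(float)
smooth_ux = 0.1 * np.sin(2 * np.pi * x / n)       # gentle, invertible warp
print(has_folding(smooth_ux, np.zeros((n, n))))   # False: no singularities
print(has_folding(-1.5 * x, np.zeros((n, n))))    # True: the field folds over
```

Schemes whose estimated fields fail this check are discarded before the landmark-distance evaluation of the second stage.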

  6. Performance analyses and improvements for the IEEE 802.15.4 CSMA/CA scheme with heterogeneous buffered conditions.

    PubMed

    Zhu, Jianping; Tao, Zhengsu; Lv, Chunfeng

    2012-01-01

    Studies of the IEEE 802.15.4 Carrier Sense Multiple Access with Collision Avoidance (CSMA/CA) scheme have received considerable attention recently, with most of these studies focusing on homogeneous or saturated traffic. Two novel transmission schemes, OSTS/BSTS (One Service a Time Scheme/Bulk Service a Time Scheme), are proposed in this paper to improve the behavior of time-critical buffered networks with heterogeneous unsaturated traffic. First, we propose a model which combines two modified semi-Markov chains and a macro-Markov chain with the theory of M/G/1/K queues to evaluate the characteristics of these two improved CSMA/CA schemes, in which traffic arrivals and accessing packets are given non-preemptive priority over each other instead of strict prioritization. Then, throughput, packet delay, and energy consumption of unsaturated, unacknowledged IEEE 802.15.4 beacon-enabled networks are predicted from an overall point of view that takes the dependent interactions of different types of nodes into account. Moreover, performance comparisons of these two schemes with other non-priority schemes are also presented. Analysis and simulation results show that the delay and fairness of our schemes are superior to those of other schemes, while throughput and energy efficiency are superior in more heterogeneous situations. Comprehensive simulations demonstrate that the analysis results of these models match well with the simulation results.

  7. An Anonymous User Authentication and Key Agreement Scheme Based on a Symmetric Cryptosystem in Wireless Sensor Networks.

    PubMed

    Jung, Jaewook; Kim, Jiye; Choi, Younsung; Won, Dongho

    2016-08-16

    In wireless sensor networks (WSNs), a registered user can log in to the network and use a user authentication protocol to access data collected from the sensor nodes. Since WSNs are typically deployed in unattended environments and sensor nodes have limited resources, many researchers have made considerable efforts to design a secure and efficient user authentication process. Recently, Chen et al. proposed a secure user authentication scheme using symmetric key techniques for WSNs. They claim that their scheme assures high efficiency and security against different types of attacks. After careful analysis, however, we find that Chen et al.'s scheme is still vulnerable to the smart card loss attack and is susceptible to the denial of service attack, since simply comparing an entered ID with the ID stored in the smart card is not a valid verification. In addition, we also observe that their scheme cannot preserve user anonymity. Furthermore, their scheme cannot quickly detect an incorrect password during the login phase, and this flaw incurs both communication and computational overhead. In this paper, we describe how these attacks work, and propose an enhanced anonymous user authentication and key agreement scheme based on a symmetric cryptosystem in WSNs to address all of the aforementioned vulnerabilities in Chen et al.'s scheme. Our analysis shows that the proposed scheme improves the level of security, and is also more efficient relative to other related schemes.
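
The building block underlying symmetric-key user authentication of this kind is a keyed challenge-response. The sketch below is a generic illustration (not the scheme from this paper): the verifier checks that the user knows the shared key without the key crossing the network, and a fresh nonce per attempt prevents replay.

```python
import hmac, hashlib, os

# Generic symmetric-key challenge-response sketch. The shared key never
# leaves either party; only an HMAC over (nonce, user_id) is transmitted.
def make_response(shared_key: bytes, nonce: bytes, user_id: bytes) -> bytes:
    return hmac.new(shared_key, nonce + user_id, hashlib.sha256).digest()

def verify(shared_key, nonce, user_id, response) -> bool:
    expected = make_response(shared_key, nonce, user_id)
    return hmac.compare_digest(expected, response)  # constant-time compare

key = os.urandom(32)
nonce = os.urandom(16)       # fresh per login attempt
resp = make_response(key, nonce, b"alice")
print(verify(key, nonce, b"alice", resp))           # True: valid response
print(verify(key, os.urandom(16), b"alice", resp))  # False: replay rejected
```

A real WSN scheme, like the one proposed here, additionally binds in smart-card secrets and session-key agreement, and must avoid the ID-comparison pitfall the authors identify.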

  8. A Multiserver Biometric Authentication Scheme for TMIS using Elliptic Curve Cryptography.

    PubMed

    Chaudhry, Shehzad Ashraf; Khan, Muhammad Tawab; Khan, Muhammad Khurram; Shon, Taeshik

    2016-11-01

    Recently, several authentication schemes have been proposed for the telecare medicine information system (TMIS). Many such schemes have been shown to be weak against known attacks. Furthermore, numerous such schemes cannot be used in real scenarios, because they assume a single authentication server across the globe. Very recently, Amin et al. (J. Med. Syst. 39(11):180, 2015) designed an authentication scheme for secure communication between a patient and a medical practitioner using a trusted central medical server. They claimed that their scheme satisfies all security requirements and emphasized its efficiency. However, the analysis in this article proves that the scheme designed by Amin et al. is vulnerable to stolen smart card and stolen verifier attacks. Furthermore, their scheme has scalability issues along with inefficient password change and password recovery phases. We then propose an improved scheme. The proposed scheme is more practical, secure and lightweight than Amin et al.'s scheme. The security of the proposed scheme is proved using the popular automated tool ProVerif.

  9. Quantum Attack-Resistant Certificateless Multi-Receiver Signcryption Scheme

    PubMed Central

    Li, Huixian; Chen, Xubao; Pang, Liaojun; Shi, Weisong

    2013-01-01

    The existing certificateless signcryption schemes were designed mainly on the basis of traditional public key cryptography, in which security relies on hard problems such as factor decomposition and discrete logarithm. However, these problems can be solved easily by quantum computing, so the existing certificateless signcryption schemes are vulnerable to quantum attack. Multivariate public key cryptography (MPKC), which can resist quantum attack, is one of the alternative solutions to guarantee the security of communications in the post-quantum age. Motivated by these concerns, we propose a new construction of a certificateless multi-receiver signcryption scheme (CLMSC) based on MPKC. The new scheme inherits the security of MPKC and can therefore withstand quantum attack. Multivariate quadratic polynomial operations, which have lower computational complexity than bilinear pairing operations, are employed to signcrypt a message for a certain number of receivers in our scheme. Security analysis shows that our scheme is a secure MPKC-based scheme. We prove its security under the hardness of the Multivariate Quadratic (MQ) problem and its unforgeability under the Isomorphism of Polynomials (IP) assumption in the random oracle model. The analysis results also show that our scheme has the security properties of non-repudiation, perfect forward secrecy, perfect backward secrecy and public verifiability. Compared with existing schemes in terms of computational complexity and ciphertext length, our scheme is more efficient, which makes it suitable for terminals with low computational capacity such as smart cards. PMID:23967037
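
    The MQ problem that underpins MPKC can be illustrated concretely: evaluating a random system of multivariate quadratic polynomials over GF(2) is cheap, while recovering an input from an output requires search. The sketch below is a toy illustration of that asymmetry, not the paper's CLMSC construction; it brute-forces a small 8-variable instance.

```python
import itertools, random

def random_mq_system(n_vars, n_eqs, seed=7):
    """A random system of multivariate quadratic equations over GF(2)."""
    rng = random.Random(seed)
    system = []
    for _ in range(n_eqs):
        quad = {(i, j): rng.randrange(2)
                for i in range(n_vars) for j in range(i, n_vars)}
        lin = [rng.randrange(2) for _ in range(n_vars)]
        const = rng.randrange(2)
        system.append((quad, lin, const))
    return system

def evaluate(system, x):
    """Evaluate every polynomial at the bit vector x; this 'forward'
    direction of the map is cheap."""
    out = []
    for quad, lin, const in system:
        v = const
        for (i, j), c in quad.items():
            v ^= c & x[i] & x[j]
        for i, c in enumerate(lin):
            v ^= c & x[i]
        out.append(v)
    return out

# Inverting the map (the MQ problem) needs a search over all 2^n inputs.
system = random_mq_system(8, 8)
secret = [1, 0, 1, 1, 0, 0, 1, 0]
image = evaluate(system, secret)
preimages = [list(x) for x in itertools.product([0, 1], repeat=8)
             if evaluate(system, list(x)) == image]
```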

  10. On the Security of a Novel Probabilistic Signature Based on Bilinear Square Diffie-Hellman Problem and Its Extension

    PubMed Central

    Zhao, Zhenguo; Shi, Wenbo

    2014-01-01

    Probabilistic signature scheme has been widely used in modern electronic commerce since it could provide integrity, authenticity, and nonrepudiation. Recently, Wu and Lin proposed a novel probabilistic signature (PS) scheme using the bilinear square Diffie-Hellman (BSDH) problem. They also extended it to a universal designated verifier signature (UDVS) scheme. In this paper, we analyze the security of Wu et al.'s PS scheme and UDVS scheme. Through concrete attacks, we demonstrate both of their schemes are not unforgeable. The security analysis shows that their schemes are not suitable for practical applications. PMID:25025083

  11. Design of an extensive information representation scheme for clinical narratives.

    PubMed

    Deléger, Louise; Campillos, Leonardo; Ligozat, Anne-Laure; Névéol, Aurélie

    2017-09-11

    Knowledge representation frameworks are essential to the understanding of complex biomedical processes, and to the analysis of biomedical texts that describe them. Combined with natural language processing (NLP), they have the potential to contribute to retrospective studies by unlocking important phenotyping information contained in the narrative content of electronic health records (EHRs). This work aims to develop an extensive information representation scheme for clinical information contained in EHR narratives, and to support secondary use of EHR narrative data to answer clinical questions. We review recent work that proposed information representation schemes and applied them to the analysis of clinical narratives. We then propose a unifying scheme that supports the extraction of information to address a large variety of clinical questions. We devised a new information representation scheme for clinical narratives that comprises 13 entities, 11 attributes and 37 relations. The associated annotation guidelines, which can be used to consistently apply the scheme to clinical narratives, are available at https://cabernet.limsi.fr/annotation_guide_for_the_merlot_french_clinical_corpus-Sept2016.pdf . The information scheme includes many elements of the major schemes described in the clinical natural language processing literature, as well as a uniquely detailed set of relations.
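
    A minimal sketch of how an entity/attribute/relation annotation scheme of this kind can be represented in code: entities carry a label, a character span and attributes, and relations link entity identifiers. The labels, offsets and example sentence below are hypothetical, not taken from the MERLOT guidelines.

```python
from dataclasses import dataclass, field

@dataclass
class Entity:
    id: str
    label: str                  # e.g. "SignOrSymptom" (hypothetical label)
    span: tuple                 # (start, end) character offsets in the text
    attributes: dict = field(default_factory=dict)

@dataclass
class Relation:
    label: str                  # e.g. "treats" (hypothetical label)
    source: str                 # id of the source entity
    target: str                 # id of the target entity

text = "Patient denies chest pain; started aspirin 81 mg daily."
entities = [
    Entity("T1", "SignOrSymptom", (15, 25), {"negation": "negated"}),
    Entity("T2", "Drug", (35, 42), {"dose": "81 mg", "frequency": "daily"}),
]
relations = [Relation("treats", source="T2", target="T1")]
```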

  12. Robust biometrics based authentication and key agreement scheme for multi-server environments using smart cards.

    PubMed

    Lu, Yanrong; Li, Lixiang; Yang, Xing; Yang, Yixian

    2015-01-01

    Biometrics-based authentication schemes using smart cards have attracted much attention in multi-server environments. Several schemes of this type were proposed in the past. However, many of them were found to have design flaws. This paper concentrates on the security weaknesses of the three-factor authentication scheme by Mishra et al. After careful analysis, we find that their scheme does not resist replay attacks and fails to provide an efficient password change phase. We further propose an improvement of Mishra et al.'s scheme with the purpose of preventing these security threats. We demonstrate that the proposed scheme provides strong authentication against several attacks, including those applicable to the original scheme. In addition, we compare its performance and functionality with other multi-server authenticated key agreement schemes.

  13. Robust Biometrics Based Authentication and Key Agreement Scheme for Multi-Server Environments Using Smart Cards

    PubMed Central

    Lu, Yanrong; Li, Lixiang; Yang, Xing; Yang, Yixian

    2015-01-01

    Biometrics-based authentication schemes using smart cards have attracted much attention in multi-server environments. Several schemes of this type were proposed in the past. However, many of them were found to have design flaws. This paper concentrates on the security weaknesses of the three-factor authentication scheme by Mishra et al. After careful analysis, we find that their scheme does not resist replay attacks and fails to provide an efficient password change phase. We further propose an improvement of Mishra et al.'s scheme with the purpose of preventing these security threats. We demonstrate that the proposed scheme provides strong authentication against several attacks, including those applicable to the original scheme. In addition, we compare its performance and functionality with other multi-server authenticated key agreement schemes. PMID:25978373

  14. Cryptanalysis and Improvement of a Biometric-Based Multi-Server Authentication and Key Agreement Scheme.

    PubMed

    Wang, Chengqi; Zhang, Xiao; Zheng, Zhiming

    2016-01-01

    With the security requirements of networks, biometrics-authenticated schemes applied in the multi-server environment have become more crucial and are widely deployed. In this paper, we propose a novel biometric-based multi-server authentication and key agreement scheme that builds on our cryptanalysis of Mishra et al.'s scheme. Informal and formal security analyses of our scheme are given, which demonstrate that it satisfies the desirable security requirements. The presented scheme provides a variety of significant functionalities, some of which, such as user revocation or re-registration and biometric information protection, are not considered in most existing authentication schemes. Compared with several related schemes, our scheme has stronger security properties and lower computation cost, making it more appropriate for practical applications in remote distributed networks.

  15. Fostering and Inspiring Research Engagement (FIRE): program logic of a research incubator scheme for allied health students.

    PubMed

    Ziviani, Jenny; Feeney, Rachel; Schabrun, Siobhan; Copland, David; Hodges, Paul

    2014-08-01

    The purpose of this study was to present the application of a logic model in depicting the underlying theory of an undergraduate research scheme for occupational therapy, physiotherapy, and speech pathology university students in Queensland, Australia. Data gathered from key written documents on the goals and intended operation of the research incubator scheme were used to create a draft (unverified) logic model. The major components of the logic model were inputs and resources, activities/outputs, and outcomes (immediate/learning, intermediate/action, and longer term/impacts). Although immediate and intermediate outcomes chiefly pertained to students' participation in honours programs, longer-term outcomes (impacts) concerned their subsequent participation in research higher-degree programs and engagement in research careers. Program logic provided an effective means of clarifying program objectives and the mechanisms by which the research incubator scheme was designed to achieve its intended outcomes. This model was developed as the basis for evaluation of the effectiveness of the scheme in achieving its stated goals.

  16. Modern foreign language teachers - don't leave those kids alone! Linguistic-cultural "give and take" in an ad-hoc tutoring scheme

    NASA Astrophysics Data System (ADS)

    Leroy, Norah

    2017-08-01

    This paper addresses the theme of social inclusion through language learning. The focus is on an ad-hoc tutoring scheme set up between newly arrived British migrant pupils and French monolingual pupils in a small secondary school in the south-west of France. Though the original objective of this tutoring scheme was to improve the English skills of the younger pupils, feedback reports indicated that it also had a positive impact on the relationship between the British migrant pupils and their French peers. Teachers believed that those involved participated more fully in class, and appeared more self-assured and generally happy thanks to the interpersonal relationships this scheme helped to forge. This study demonstrates the necessity of analysing the socio-cultural context migrants may find themselves in, in order to identify potential challenges. The ad-hoc tutoring scheme described here is an example of how language learning can support the integration and inclusion of "new generation" migrants into everyday school life.

  17. FPGA design of correlation-based pattern recognition

    NASA Astrophysics Data System (ADS)

    Jridi, Maher; Alfalou, Ayman

    2017-05-01

    Optical/digital pattern recognition and tracking based on optical/digital correlation are well-known techniques to detect, identify and localize a target object in a scene. Despite the limited number of operations required by the correlation scheme, its computational time and resource requirements are relatively high. The most computationally intensive operation required by the correlation is the transformation from the spatial to the spectral domain and back from the spectral to the spatial domain. Furthermore, these transformations are used in optical/digital encryption schemes such as the double random phase encryption (DRPE). In this paper, we present a VLSI architecture for the correlation scheme based on the fast Fourier transform (FFT). One interesting feature of the proposed scheme is its ability to process streaming images in order to perform correlation for video sequences. A trade-off between hardware consumption and the robustness of the correlation can be made in order to understand the limitations of implementing correlation on reconfigurable and portable platforms. Experimental results obtained from HDL simulations and an FPGA prototype demonstrate the advantages of the proposed scheme.

  18. A data seamless interaction scheme between electric power secondary business systems

    NASA Astrophysics Data System (ADS)

    Ai, Wenkai; Qian, Feng

    2018-03-01

    At present, the volume of data interaction between electric power secondary business systems is very high, yet there is no universal way to develop programs for data interaction between systems from different manufacturers. Different manufacturers use different interaction schemes, which leads to high development cost, low reusability and difficult maintenance. This paper introduces a new seamless data interaction scheme for electric power secondary business systems. The scheme adopts the widely used Java Message Service protocol for transmission and the common JavaScript Object Notation format for data interchange. It unifies the way electric power secondary business systems exchange data, improves reusability, reduces complexity, and lays a solid foundation for monitoring the operation of these systems.

  19. A numerical resolution study of high order essentially non-oscillatory schemes applied to incompressible flow

    NASA Technical Reports Server (NTRS)

    Weinan, E.; Shu, Chi-Wang

    1994-01-01

    High order essentially non-oscillatory (ENO) schemes, originally designed for compressible flow and in general for hyperbolic conservation laws, are applied to incompressible Euler and Navier-Stokes equations with periodic boundary conditions. The projection to divergence-free velocity fields is achieved by fourth-order central differences through fast Fourier transforms (FFT) and a mild high-order filtering. The objective of this work is to assess the resolution of ENO schemes for large scale features of the flow when a coarse grid is used and small scale features of the flow, such as shears and roll-ups, are not fully resolved. It is found that high-order ENO schemes remain stable under such situations and quantities related to large scale features, such as the total circulation around the roll-up region, are adequately resolved.

  20. A numerical resolution study of high order essentially non-oscillatory schemes applied to incompressible flow

    NASA Technical Reports Server (NTRS)

    Weinan, E.; Shu, Chi-Wang

    1992-01-01

    High order essentially non-oscillatory (ENO) schemes, originally designed for compressible flow and in general for hyperbolic conservation laws, are applied to incompressible Euler and Navier-Stokes equations with periodic boundary conditions. The projection to divergence-free velocity fields is achieved by fourth order central differences through Fast Fourier Transforms (FFT) and a mild high-order filtering. The objective of this work is to assess the resolution of ENO schemes for large scale features of the flow when a coarse grid is used and small scale features of the flow, such as shears and roll-ups, are not fully resolved. It is found that high-order ENO schemes remain stable under such situations and quantities related to large-scale features, such as the total circulation around the roll-up region, are adequately resolved.
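
    The projection to divergence-free velocity fields mentioned in these two abstracts acts mode-by-mode in Fourier space: each velocity mode has its component along the wavevector removed. A minimal sketch of that single-mode operation (ignoring the FFT, the fourth-order differencing and the filtering of the actual method; the numbers are arbitrary):

```python
def project_divergence_free(k, u_hat):
    """Leray projection of a single Fourier velocity mode: subtract the
    component of u_hat along the wavevector k, so that k . u_hat = 0
    afterwards (the spectral statement of div u = 0)."""
    k2 = sum(ki * ki for ki in k)
    if k2 == 0:
        return tuple(u_hat)  # the mean mode is already divergence-free
    proj = sum(ki * ui for ki, ui in zip(k, u_hat)) / k2
    return tuple(ui - ki * proj for ki, ui in zip(k, u_hat))

k = (2.0, 1.0)           # wavevector of one 2D Fourier mode
u_hat = (0.3, -0.7)      # velocity coefficients of that mode
u_df = project_divergence_free(k, u_hat)
div = sum(ki * ui for ki, ui in zip(k, u_df))  # spectral divergence
```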

  1. Sea breeze: Induced mesoscale systems and severe weather

    NASA Technical Reports Server (NTRS)

    Nicholls, M. E.; Pielke, R. A.; Cotton, W. R.

    1990-01-01

    Sea-breeze-deep convective interactions over the Florida peninsula were investigated using a cloud/mesoscale numerical model. The objective was to gain a better understanding of sea-breeze and deep convective interactions over the Florida peninsula using a high resolution convectively explicit model and to use these results to evaluate convective parameterization schemes. A 3-D numerical investigation of Florida convection was completed. The Kuo and Fritsch-Chappell parameterization schemes are summarized and evaluated.

  2. Analysis of a Teacher's Pedagogical Arguments Using Toulmin's Model and Argumentation Schemes

    ERIC Educational Resources Information Center

    Metaxas, N.; Potari, D.; Zachariades, T.

    2016-01-01

    In this article, we elaborate methodologies to study the argumentation speech of a teacher involved in argumentative activities. The standard tool of analysis of teachers' argumentation concerning pedagogical matters is Toulmin's model. The theory of argumentation schemes offers an alternative perspective on the analysis of arguments. We propose…

  3. Content Analysis Coding Schemes for Online Asynchronous Discussion

    ERIC Educational Resources Information Center

    Weltzer-Ward, Lisa

    2011-01-01

    Purpose: Researchers commonly utilize coding-based analysis of classroom asynchronous discussion contributions as part of studies of online learning and instruction. However, this analysis is inconsistent from study to study with over 50 coding schemes and procedures applied in the last eight years. The aim of this article is to provide a basis…

  4. Outage Performance Analysis of Relay Selection Schemes in Wireless Energy Harvesting Cooperative Networks over Non-Identical Rayleigh Fading Channels †

    PubMed Central

    Do, Nhu Tri; Bao, Vo Nguyen Quoc; An, Beongku

    2016-01-01

    In this paper, we study relay selection in decode-and-forward wireless energy harvesting cooperative networks. In contrast to conventional cooperative networks, the relays harvest energy from the source's radio-frequency radiation and then use that energy to forward the source information. Considering the power-splitting receiver architecture used at the relays to harvest energy, we are concerned with the performance of two popular relay selection schemes, namely, the partial relay selection (PRS) scheme and the optimal relay selection (ORS) scheme. In particular, we analyze the system performance in terms of outage probability (OP) over independent and non-identical (i.n.i.d.) Rayleigh fading channels. We derive closed-form approximations for the system outage probabilities of both schemes and validate the analysis by Monte Carlo simulation. The numerical results provide a comprehensive performance comparison between the PRS and ORS schemes and reveal the effect of wireless energy harvesting on the outage performance of both schemes. Additionally, we also show the advantages and drawbacks of wireless energy harvesting cooperative networks compared to conventional cooperative networks. PMID:26927119

  5. Outage Performance Analysis of Relay Selection Schemes in Wireless Energy Harvesting Cooperative Networks over Non-Identical Rayleigh Fading Channels.

    PubMed

    Do, Nhu Tri; Bao, Vo Nguyen Quoc; An, Beongku

    2016-02-26

    In this paper, we study relay selection in decode-and-forward wireless energy harvesting cooperative networks. In contrast to conventional cooperative networks, the relays harvest energy from the source's radio-frequency radiation and then use that energy to forward the source information. Considering the power-splitting receiver architecture used at the relays to harvest energy, we are concerned with the performance of two popular relay selection schemes, namely, the partial relay selection (PRS) scheme and the optimal relay selection (ORS) scheme. In particular, we analyze the system performance in terms of outage probability (OP) over independent and non-identical (i.n.i.d.) Rayleigh fading channels. We derive closed-form approximations for the system outage probabilities of both schemes and validate the analysis by Monte Carlo simulation. The numerical results provide a comprehensive performance comparison between the PRS and ORS schemes and reveal the effect of wireless energy harvesting on the outage performance of both schemes. Additionally, we also show the advantages and drawbacks of wireless energy harvesting cooperative networks compared to conventional cooperative networks.
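
    The kind of closed-form outage analysis validated by Monte Carlo simulation in this paper can be illustrated on its simplest building block, a single Rayleigh-faded link: the instantaneous SNR is exponentially distributed, so P(SNR < threshold) = 1 - exp(-threshold/mean). The sketch below (not the PRS/ORS analysis itself) checks a simulation against that formula; the parameter values are arbitrary.

```python
import math, random

def outage_simulated(avg_snr, threshold, n=200_000, seed=3):
    """Monte Carlo outage probability of one Rayleigh-faded link: the
    instantaneous SNR is exponentially distributed with mean avg_snr."""
    rng = random.Random(seed)
    hits = sum(rng.expovariate(1.0 / avg_snr) < threshold for _ in range(n))
    return hits / n

def outage_closed_form(avg_snr, threshold):
    """P(SNR < threshold) for an exponentially distributed SNR."""
    return 1.0 - math.exp(-threshold / avg_snr)

sim = outage_simulated(10.0, 3.0)
ana = outage_closed_form(10.0, 3.0)
```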

  6. An Energy Efficient Mutual Authentication and Key Agreement Scheme Preserving Anonymity for Wireless Sensor Networks.

    PubMed

    Lu, Yanrong; Li, Lixiang; Peng, Haipeng; Yang, Yixian

    2016-06-08

    WSNs (wireless sensor networks) are nowadays viewed as a vital portion of the IoT (Internet of Things). Security is a significant issue in WSNs, especially in resource-constrained environments. AKA (authentication and key agreement) enhances the security of WSNs against adversaries attempting to obtain sensitive sensor data. Various AKA schemes have been developed for verifying the legitimate users of a WSN. First, we scrutinize Amin-Biswas's recent scheme and demonstrate major security loopholes in their work. Next, we propose a lightweight AKA scheme using symmetric key cryptography based on smart cards, which is resilient against all well-known security attacks. Furthermore, we prove that the scheme accomplishes mutual handshake and the session key agreement property securely between the participants involved under BAN (Burrows, Abadi and Needham) logic. Moreover, formal security analysis and simulations are also conducted using AVISPA (Automated Validation of Internet Security Protocols and Applications) to show that our scheme is secure against active and passive attacks. Additionally, performance analysis shows that our proposed scheme is secure and efficient to apply in resource-constrained WSNs.

  7. An Energy Efficient Mutual Authentication and Key Agreement Scheme Preserving Anonymity for Wireless Sensor Networks

    PubMed Central

    Lu, Yanrong; Li, Lixiang; Peng, Haipeng; Yang, Yixian

    2016-01-01

    WSNs (wireless sensor networks) are nowadays viewed as a vital portion of the IoT (Internet of Things). Security is a significant issue in WSNs, especially in resource-constrained environments. AKA (authentication and key agreement) enhances the security of WSNs against adversaries attempting to obtain sensitive sensor data. Various AKA schemes have been developed for verifying the legitimate users of a WSN. First, we scrutinize Amin-Biswas's recent scheme and demonstrate major security loopholes in their work. Next, we propose a lightweight AKA scheme using symmetric key cryptography based on smart cards, which is resilient against all well-known security attacks. Furthermore, we prove that the scheme accomplishes mutual handshake and the session key agreement property securely between the participants involved under BAN (Burrows, Abadi and Needham) logic. Moreover, formal security analysis and simulations are also conducted using AVISPA (Automated Validation of Internet Security Protocols and Applications) to show that our scheme is secure against active and passive attacks. Additionally, performance analysis shows that our proposed scheme is secure and efficient to apply in resource-constrained WSNs. PMID:27338382

  8. Von Neumann stability analysis of globally divergence-free RKDG schemes for the induction equation using multidimensional Riemann solvers

    NASA Astrophysics Data System (ADS)

    Balsara, Dinshaw S.; Käppeli, Roger

    2017-05-01

    In this paper we focus on the numerical solution of the induction equation using Runge-Kutta Discontinuous Galerkin (RKDG)-like schemes that are globally divergence-free. The induction equation plays a role in numerical MHD and other systems like it. It ensures that the magnetic field evolves in a divergence-free fashion; and that same property is shared by the numerical schemes presented here. The algorithms presented here are based on a novel DG-like method as it applies to the magnetic field components in the faces of a mesh. (I.e., this is not a conventional DG algorithm for conservation laws.) The other two novel building blocks of the method include divergence-free reconstruction of the magnetic field and multidimensional Riemann solvers; both of which have been developed in recent years by the first author. Since the method is linear, a von Neumann stability analysis is carried out in two-dimensions to understand its stability properties. The von Neumann stability analysis that we develop in this paper relies on transcribing from a modal to a nodal DG formulation in order to develop discrete evolutionary equations for the nodal values. These are then coupled to a suitable Runge-Kutta timestepping strategy so that one can analyze the stability of the entire scheme which is suitably high order in space and time. We show that our scheme permits CFL numbers that are comparable to those of traditional RKDG schemes. We also analyze the wave propagation characteristics of the method and show that with increasing order of accuracy the wave propagation becomes more isotropic and free of dissipation for a larger range of long wavelength modes. This makes a strong case for investing in higher order methods. We also use the von Neumann stability analysis to show that the divergence-free reconstruction and multidimensional Riemann solvers are essential algorithmic ingredients of a globally divergence-free RKDG-like scheme. 
    Numerical accuracy analyses of the RKDG-like schemes are presented and compared with the accuracy of PNPM schemes. It is found that PNPM schemes retrieve much of the accuracy of the RKDG-like schemes while permitting a larger CFL number.
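
    Von Neumann analysis of the kind used here can be demonstrated on the simplest possible case, first-order upwind for u_t + a u_x = 0 (not the RKDG scheme itself): substituting a Fourier mode u_j^n = G^n exp(i j theta) into the scheme gives an amplification factor G(theta) whose modulus must stay at or below 1 for stability, which holds exactly when the CFL number is at most 1.

```python
import cmath, math

def amplification(c, theta):
    """Von Neumann amplification factor of first-order upwind for
    u_t + a u_x = 0, with CFL number c = a*dt/dx and mode angle
    theta = k*dx: G = 1 - c*(1 - exp(-i*theta))."""
    return 1 - c * (1 - cmath.exp(-1j * theta))

# Scan the amplification factor over all mode angles in [0, 2*pi].
thetas = [i * math.pi / 100 for i in range(201)]
stable = all(abs(amplification(0.8, th)) <= 1 + 1e-12 for th in thetas)
unstable = any(abs(amplification(1.2, th)) > 1 for th in thetas)
```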

  9. Finite Volume Methods: Foundation and Analysis

    NASA Technical Reports Server (NTRS)

    Barth, Timothy; Ohlberger, Mario

    2003-01-01

    Finite volume methods are a class of discretization schemes that have proven highly successful in approximating the solution of a wide variety of conservation law systems. They are extensively used in fluid mechanics, porous media flow, meteorology, electromagnetics, models of biological processes, semi-conductor device simulation and many other engineering areas governed by conservative systems that can be written in integral control volume form. This article reviews elements of the foundation and analysis of modern finite volume methods. The primary advantages of these methods are numerical robustness through discrete maximum (minimum) principles, applicability on very general unstructured meshes, and the intrinsic local conservation properties of the resulting schemes. Throughout this article, specific attention is given to scalar nonlinear hyperbolic conservation laws and the development of high order accurate schemes for discretizing them. A key tool in the design and analysis of finite volume schemes suitable for non-oscillatory discontinuity capturing is discrete maximum principle analysis. A number of building blocks used in the development of numerical schemes possessing local discrete maximum principles are reviewed in one and several space dimensions, e.g. monotone fluxes, E-fluxes, TVD discretization, non-oscillatory reconstruction, slope limiters, positive coefficient schemes, etc. When available, theoretical results concerning a priori and a posteriori error estimates are given. Further advanced topics are then considered such as high order time integration, discretization of diffusion terms and the extension to systems of nonlinear conservation laws.
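
    A minimal concrete instance of the monotone, locally conservative schemes surveyed here: first-order upwind finite volumes for scalar advection on a periodic grid, which is TVD (the total variation of the solution never increases). Grid size and CFL number below are arbitrary.

```python
def total_variation(u):
    """Discrete total variation on a periodic grid."""
    return sum(abs(u[(i + 1) % len(u)] - u[i]) for i in range(len(u)))

def upwind_step(u, cfl):
    """One first-order upwind finite-volume step for u_t + u_x = 0 on a
    periodic grid: cell i loses flux u[i] and gains flux u[i-1] from
    its left neighbour."""
    return [u[i] - cfl * (u[i] - u[i - 1]) for i in range(len(u))]

# Advect a square pulse; a monotone scheme creates no new extrema,
# so the total variation cannot grow.
u = [1.0 if 10 <= i < 20 else 0.0 for i in range(50)]
tv_initial = total_variation(u)
for _ in range(100):
    u = upwind_step(u, 0.5)
tv_final = total_variation(u)
```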

  10. A comparative study on the motion of various objects inside an air tunnel

    NASA Astrophysics Data System (ADS)

    Shibani, Wanis Mustafa E.; Zulkafli, Mohd Fadhli; Basuno, Bambang

    2017-04-01

    This paper presents a comparative study of the movement of various rigid bodies through an air tunnel for both two- and three-dimensional flow problems. Three kinds of objects are investigated: box, ball and wedge shapes. The investigation was carried out using commercial CFD software, Fluent, in order to determine the aerodynamic forces acting on each object and to track its movement. The adopted numerical scheme solves the time-averaged Navier-Stokes equations with the k-ε turbulence model and uses the SIMPLE algorithm. A triangular-element grid was used in the 2D case and tetrahedral elements in the 3D case. Grid independence studies were performed for each problem from a coarse to a fine grid. The motion of each object is restricted to one direction only and is found by tracking its center of mass at every time step. The results indicate that the objects accelerate as the flow moves downstream, and that the box moves fastest of the three shapes in both the 2D and 3D cases.

  11. An Unscented Kalman-Particle Hybrid Filter for Space Object Tracking

    NASA Astrophysics Data System (ADS)

    Raihan A. V, Dilshad; Chakravorty, Suman

    2018-03-01

    Optimal and consistent estimation of the state of space objects is pivotal to surveillance and tracking applications. However, probabilistic estimation of space objects is made difficult by the non-Gaussianity and nonlinearity associated with orbital mechanics. In this paper, we present an unscented Kalman-particle hybrid filtering framework for recursive Bayesian estimation of space objects. The hybrid filtering scheme is designed to provide accurate and consistent estimates when measurements are sparse without incurring a large computational cost. It employs an unscented Kalman filter (UKF) for estimation when measurements are available. When the target is outside the field of view (FOV) of the sensor, it updates the state probability density function (PDF) via a sequential Monte Carlo method. The hybrid filter addresses the problem of particle depletion through a suitably designed filter transition scheme. To assess the performance of the hybrid filtering approach, we consider two test cases of space objects that are assumed to undergo full three-dimensional orbital motion under the effects of J2 and atmospheric drag perturbations. It is demonstrated that the hybrid filters can furnish fast, accurate and consistent estimates, outperforming standard UKF and particle filter (PF) implementations.
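
    The sequential Monte Carlo machinery referred to above can be sketched in its most generic one-dimensional form: reweight particles by the measurement likelihood, then resample in proportion to the weights to fight particle depletion. This is the standard building block only; the paper's hybrid filter and its transition scheme are more involved, and all models and numbers below are illustrative.

```python
import math, random

def smc_update(particles, weights, measurement, noise_std, rng):
    """One sequential Monte Carlo step: reweight each particle by the
    measurement likelihood (Gaussian noise assumed), normalize, then
    multinomially resample and reset to uniform weights."""
    likes = [math.exp(-0.5 * ((measurement - p) / noise_std) ** 2)
             for p in particles]
    w = [wi * li for wi, li in zip(weights, likes)]
    total = sum(w)
    w = [wi / total for wi in w]
    resampled = rng.choices(particles, weights=w, k=len(particles))
    uniform = [1.0 / len(particles)] * len(particles)
    return resampled, uniform

rng = random.Random(0)
particles = [rng.uniform(-10.0, 10.0) for _ in range(2000)]
weights = [1.0 / 2000] * 2000
particles, weights = smc_update(particles, weights, 2.5, 1.0, rng)
estimate = sum(p * w for p, w in zip(particles, weights))
```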

  12. Declining incidence of chickenpox in the absence of universal childhood immunisation

    PubMed Central

    Lowe, G; Salmon, R; Thomas, D; Evans, M

    2004-01-01

    Objective: To examine the epidemiology of chickenpox in Wales from 1986 to 2001. Design: Descriptive analysis of chickenpox consultations reported by the Welsh general practice sentinel surveillance scheme for infectious diseases, compared with annual shingles consultation rates from the same scheme to exclude reporting fatigue and data from a general practice morbidity database to validate results. Setting: A total of 226 884 patients registered with one of 30 volunteer general practices participating in the sentinel surveillance scheme. Main outcome measures: Age standardised and age specific incidence of chickenpox. Results: Crude and age standardised consultation rates for chickenpox declined from 1986 to 2001, with loss of epidemic cycling. Rates remained stable in 0–4 year olds but declined in all older age groups, particularly those aged 5–14 years. Shingles consultation rates remained constant over the same period. Data from the morbidity database displayed similar trends. Conclusion: General practitioner consultation rates for chickenpox are declining in Wales except in pre-school children. These findings are unlikely to be a reporting artefact but may be explained either by an overall decline in transmission or increased social mixing in those under 5 years old, through formal child care and earlier school entry, and associated increasing rates of mild or subclinical infection in this age group. Further investigation, particularly by serological surveillance, is necessary before universal varicella immunisation can be considered in the UK. PMID:15383443

  13. [Prognostic value of three different staging schemes based on pN, MLR and LODDS in patients with T3 esophageal cancer].

    PubMed

    Wang, L; Cai, L; Chen, Q; Jiang, Y H

    2017-10-23

    Objective: To evaluate the prognostic value of three different staging schemes based on positive lymph nodes (pN), metastatic lymph nodes ratio (MLR) and log odds of positive lymph nodes (LODDS) in patients with T3 esophageal cancer. Methods: From 2007 to 2014, clinicopathological characteristics of 905 patients who were pathologically diagnosed with T3 esophageal cancer and underwent radical esophagectomy in Zhejiang Cancer Hospital were retrospectively analyzed. Kaplan-Meier curves and multivariate Cox proportional hazards models were used to evaluate the independent prognostic factors. The values of the three lymph node staging schemes for predicting 5-year survival were analyzed by using receiver operating characteristic (ROC) curves. Results: The 1-, 3- and 5-year overall survival rates of patients with T3 esophageal cancer were 80.9%, 50.0% and 38.4%, respectively. Multivariate analysis showed that MLR stage, LODDS stage and differentiation were independent prognostic survival factors (P < 0.05 for all). ROC curves showed that the areas under the curve of the pN stage, MLR stage and LODDS stage were 0.607, 0.613 and 0.618, respectively. However, the differences were not statistically significant (P > 0.05). Conclusions: LODDS is an independent prognostic factor for patients with T3 esophageal cancer. The value of the LODDS staging system may be superior to the pN staging system for evaluating the prognosis of these patients.

  14. Diagnostic classification scheme in Iranian breast cancer patients using a decision tree.

    PubMed

    Malehi, Amal Saki

    2014-01-01

    The objective of this study was to determine a diagnostic classification scheme using a decision tree based model. The study was conducted as a retrospective case-control study in Imam Khomeini hospital in Tehran during 2001 to 2009. Data, including demographic and clinical-pathological characteristics, were uniformly collected from 624 females: 312 of them were referred with a positive diagnosis of breast cancer (cases) and 312 were healthy women (controls). The decision tree was implemented to develop a diagnostic classification scheme using CART 6.0 Software. The AUC (area under the curve) was measured as the overall performance of the diagnostic classification of the decision tree. Five variables were identified as main risk factors of breast cancer and six subgroups as high risk. The results indicated that increasing age, low age at menarche, single and divorced status, irregular menarche pattern and family history of breast cancer are the important diagnostic factors in Iranian breast cancer patients. The sensitivity and specificity of the analysis were 66% and 86.9%, respectively. The high AUC (0.82) also showed an excellent classification and diagnostic performance of the model. A decision tree based model appears to be suitable for identifying risk factors and high or low risk subgroups. It can also assist clinicians in decision making, since it identifies underlying prognostic relationships and the model is easy to interpret.
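The AUC reported above has a simple rank interpretation: it is the probability that a randomly chosen case receives a higher model score than a randomly chosen control (ties counting half). A minimal sketch with made-up scores, not the study's data:

```python
# Rank-based AUC: fraction of (case, control) pairs the model orders correctly.
def auc(pos_scores, neg_scores):
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5          # ties count half
    return wins / (len(pos_scores) * len(neg_scores))

cases    = [0.9, 0.8, 0.7, 0.4]      # hypothetical scores for cancer cases
controls = [0.6, 0.3, 0.2, 0.1]      # hypothetical scores for healthy controls
print(auc(cases, controls))
```

An AUC of 0.5 corresponds to chance-level ranking and 1.0 to perfect separation, which is why the study's 0.82 is read as strong discriminative performance.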

  15. A New Scheme for the Design of Hilbert Transform Pairs of Biorthogonal Wavelet Bases

    NASA Astrophysics Data System (ADS)

    Shi, Hongli; Luo, Shuqian

    2010-12-01

    In designing the Hilbert transform pairs of biorthogonal wavelet bases, it has been shown that the requirements of the equal-magnitude responses and the half-sample phase offset on the lowpass filters are the necessary and sufficient condition. In this paper, the relationship between the phase offset and the vanishing moment difference of biorthogonal scaling filters is derived, which implies a simple way to choose the vanishing moments so that the phase response requirement can be satisfied structurally. The magnitude response requirement is approximately achieved by a constrained optimization procedure, where the objective function and constraints are all expressed in terms of the auxiliary filters of scaling filters rather than the scaling filters directly. Generally, the calculation burden in the design implementation will be less than that of the current schemes. The integral of magnitude response difference between the primal and dual scaling filters has been chosen as the objective function, which expresses the magnitude response requirements in the whole frequency range. Two design examples illustrate that the biorthogonal wavelet bases designed by the proposed scheme are very close to Hilbert transform pairs.

  16. A Quantum Multi-proxy Blind Signature Scheme Based on Genuine Four-Qubit Entangled State

    NASA Astrophysics Data System (ADS)

    Tian, Juan-Hong; Zhang, Jian-Zhong; Li, Yan-Ping

    2016-02-01

    In this paper, we propose a multi-proxy blind signature scheme based on controlled teleportation. Genuine four-qubit entangled state functions as quantum channel. The scheme uses the physical characteristics of quantum mechanics to implement delegation, signature and verification. The security analysis shows the scheme satisfies the security features of multi-proxy signature, unforgeability, undeniability, blindness and unconditional security.

  17. Supporting newly qualified dental therapists into practice: a longitudinal evaluation of a foundation training scheme for dental therapists (TFT).

    PubMed

    Bullock, A D; Barnes, E; Falcon, H C; Stearns, K

    2013-04-01

    Focused on the dental therapists foundation training (TFT) scheme run by the Postgraduate Dental Deaneries of Oxford and Wessex (NHS Education South Central - NESC) the objectives were (1) to evaluate the TFT 2010/11 scheme, identifying strengths, areas for development and drawing comparisons with the 2009 evaluation; and (2) to follow-up previous cohorts, reporting current work and retrospective reflections on the scheme. Data were collected from 2010/11 ('current') trainees (n = 10) through group discussion, questionnaire and portfolio extracts. Eleven past-trainees from 2008/09 and 2009/10 took part in a structured telephone interview or responded to questions via e-mail. Data from 2011 consolidated that collected earlier. The scheme was highly valued. Current participants thought the scheme should be mandatory and all past-participants would recommend it to others. Trainees attributed an increase in confidence and ability in their clinical skills to participation in TFT. Current trainees' concerns about finding therapy work were echoed in past-participants' post-scheme employment. At the point of qualification, trainees do not feel well-prepared for starting work as dental therapists. Opportunity to develop confidence and skills in a supportive environment is a key benefit of the scheme. Maintaining ability in the full range of duties requires continued use of skills and the opportunity to do this remains an ongoing challenge.

  18. Long-term strategy for the statistical design of a forest health monitoring system

    Treesearch

    Hans T. Schreuder; Raymond L. Czaplewski

    1993-01-01

    A conceptual framework is given for a broad-scale survey of forest health that accomplishes three objectives: generate descriptive statistics; detect changes in such statistics; and simplify analytical inferences that identify, and possibly establish cause-effect relationships. Our paper discusses the development of sampling schemes to satisfy these three objectives,...

  19. Knowledge of Metabolic Syndrome in Chinese Adults: Implications for Health Education

    ERIC Educational Resources Information Center

    Lo, Sally Wai Sze; Chair, Sek Ying; Lee, Iris Fung Kam

    2016-01-01

    Objective: The objective of this study was to assess knowledge of metabolic syndrome (MS) among Chinese adults and provide directions for designing healthcare promotion schemes for improving MS awareness in the community. Design: The study adopted a cross-sectional design and a convenience sampling method. Method: Chinese adults aged 18-65 years…

  20. Index-in-retrospect and breeding objectives characterizing genetic improvement programs for South African Nguni cattle

    USDA-ARS?s Scientific Manuscript database

    The objective of the current study was to describe the historical selection applied to Nguni cattle in South Africa. Index-in-retrospect methods were applied to data originating from the National Beef Cattle Improvement Scheme. Data used were estimated breeding values (EBV) for animals born during t...

  1. Considering social and environmental concerns as reservoir operating objectives

    NASA Astrophysics Data System (ADS)

    Tilmant, A.; Georis, B.; Doulliez, P.

    2003-04-01

    Sustainability principles are now widely recognized as key criteria for water resource development schemes, such as hydroelectric and multipurpose reservoirs. Development decisions no longer rely solely on economic grounds, but also consider environmental and social concerns through so-called environmental and social impact assessments. The objective of this paper is to show that environmental and social concerns can also be addressed in the management (operation) of existing or projected reservoir schemes. By either adequately exploiting the results of environmental and social impact assessments, or by carrying out surveys of water users, experts and managers, efficient (Pareto optimal) reservoir operating rules can be derived using flexible mathematical programming techniques. By reformulating the problem as a multistage flexible constraint satisfaction problem, incommensurable and subjective operating objectives can contribute, along with classical economic objectives, to the determination of optimal release decisions. Employed in a simulation mode, the results can be used to assess the long-term impacts of various operating rules on the social well-being of affected populations as well as on the integrity of the environment. The methodology is illustrated with a reservoir reallocation problem in Chile.

  2. FR-type radio sources in COSMOS: relation of radio structure to size, accretion modes and large-scale environment

    NASA Astrophysics Data System (ADS)

    Vardoulaki, Eleni; Faustino Jimenez Andrade, Eric; Delvecchio, Ivan; Karim, Alexander; Smolčić, Vernesa; Magnelli, Benjamin; Bertoldi, Frank; Schinnener, Eva; Sargent, Mark; Finoguenov, Alexis; VLA COSMOS Team

    2018-01-01

    The radio sources associated with active galactic nuclei (AGN) can exhibit a variety of radio structures, from simple to more complex, giving rise to a variety of classification schemes. The question which still remains open, given deeper surveys revealing new populations of radio sources, is whether this plethora of radio structures can be attributed to the physical properties of the host or to the environment. Here we present an analysis of the radio structure of radio-selected AGN from the VLA-COSMOS Large Project at 3 GHz (JVLA-COSMOS; Smolčić et al.) in relation to: 1) their linear projected size, 2) the Eddington ratio, and 3) the environment their hosts lie within. We classify these as FRI (jet-like) and FRII (lobe-like) based on the FR-type classification scheme, and compare them to a sample of jet-less radio AGN in JVLA-COSMOS. We measure their linear projected sizes using a semi-automatic machine learning technique. Their Eddington ratios are calculated from X-ray data available for COSMOS. As environmental probes we take the X-ray groups (hundreds of kpc) and the density fields (~Mpc-scale) in COSMOS. We find that FRII radio sources are on average larger than FRIs, which agrees with the literature. But contrary to past studies, we find no dichotomy in FR objects in JVLA-COSMOS given their Eddington ratios, as on average they exhibit similar values. Furthermore, our results show that the large-scale environment does not explain the observed dichotomy in lobe- and jet-like FR-type objects, as both types are found in similar environments, but it does affect the shape of the radio structure, introducing bends for objects closer to the centre of an X-ray group.

  3. Analysis of parenchymal patterns using conspicuous spatial frequency features in mammograms applied to the BI-RADS density rating scheme

    NASA Astrophysics Data System (ADS)

    Perconti, Philip; Loew, Murray

    2006-03-01

    Automatic classification of the density of breast parenchyma is shown using a measure that is correlated to the human observer performance, and compared against the BI-RADS density rating. Increasingly popular in the United States, the Breast Imaging Reporting and Data System (BI-RADS) is used to draw attention to the increased screening difficulty associated with greater breast density; however, the BI-RADS rating scheme is subjective and is not intended as an objective measure of breast density. So, while popular, BI-RADS does not define density classes using a standardized measure, which leads to increased variability among observers. The adaptive thresholding technique is a more quantitative approach for assessing the percentage breast density, but considerable reader interaction is required. We calculate an objective density rating that is derived using a measure of local feature salience. Previously, this measure was shown to correlate well with radiologists' localization and discrimination of true positive and true negative regions-of-interest. Using conspicuous spatial frequency features, an objective density rating is obtained and correlated with adaptive thresholding, and the subjectively ascertained BI-RADS density ratings. Using 100 cases, obtained from the University of South Florida's DDSM database, we show that an automated breast density measure can be derived that is correlated with the interactive thresholding method for continuous percentage breast density, but not with the BI-RADS density rating categories for the selected cases. Comparison between interactive thresholding and the new salience percentage density resulted in a Pearson correlation of 76.7%. Using a four-category scale equivalent to the BI-RADS density categories, a Spearman correlation coefficient of 79.8% was found.

  4. Cryptanalysis and Improvement of a Biometric-Based Multi-Server Authentication and Key Agreement Scheme

    PubMed Central

    Wang, Chengqi; Zhang, Xiao; Zheng, Zhiming

    2016-01-01

    With the security requirements of networks, biometric authentication schemes applied in multi-server environments have become more crucial and are widely deployed. In this paper, we propose a novel biometric-based multi-server authentication and key agreement scheme which is based on the cryptanalysis of Mishra et al.’s scheme. The informal and formal security analyses of our scheme are given, which demonstrate that our scheme satisfies the desirable security requirements. The presented scheme provides a variety of significant functionalities, some of which are not considered in most existing authentication schemes, such as user revocation or re-registration and biometric information protection. Compared with several related schemes, our scheme has more secure properties and lower computation cost. It is clearly more appropriate for practical applications in remote distributed networks. PMID:26866606

  5. Numerical study of read scheme in one-selector one-resistor crossbar array

    NASA Astrophysics Data System (ADS)

    Kim, Sungho; Kim, Hee-Dong; Choi, Sung-Jin

    2015-12-01

    A comprehensive numerical circuit analysis of read schemes for a one-selector one-resistor (1S1R) resistance-change memory crossbar array is carried out. Three schemes, the ground, V/2, and V/3 schemes, are compared with each other in terms of sensing margin and power consumption. Without the aid of a complex analytical approach or SPICE-based simulation, a simple numerical iteration method is developed to simulate all current flows and node voltages within a crossbar array. Understanding such phenomena is essential in successfully evaluating the electrical specifications of selectors for suppressing intrinsic drawbacks of crossbar arrays, such as sneak current paths and series line resistance problems. This method provides a quantitative tool for the accurate analysis of crossbar arrays and provides guidelines for developing an optimal read scheme, array configuration, and selector device specifications.
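As a rough illustration of why the read-scheme choice matters, the standing sneak power in an N x N array of identical linear cells can be compared across bias schemes. This is a hypothetical back-of-envelope estimate only; the paper's iterative method additionally accounts for line resistance and nonlinear selector characteristics.

```python
# Ideal bias patterns (linear cells of conductance g, read voltage v):
#   V/2 scheme: 2(N-1) half-selected cells each see V/2, the rest see 0.
#   V/3 scheme: all N^2 - 1 non-selected cells see |V/3|.
def sneak_power(scheme, n, v=1.0, g=1e-6):
    if scheme == "V/2":
        return 2 * (n - 1) * (v / 2) ** 2 * g
    if scheme == "V/3":
        return (n * n - 1) * (v / 3) ** 2 * g
    raise ValueError("unknown scheme: " + scheme)

for n in (32, 256):
    print(n, sneak_power("V/2", n), sneak_power("V/3", n))
```

The sketch shows the usual trade-off: the V/3 scheme lowers the voltage stress on each unselected cell but dissipates power in every non-selected cell (scaling as N^2), while the V/2 scheme's sneak power grows only linearly with N.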

  6. Qualitative Analysis: The Current Status.

    ERIC Educational Resources Information Center

    Cole, G. Mattney, Jr.; Waggoner, William H.

    1983-01-01

    To assist in designing/implementing qualitative analysis courses, examines reliability/accuracy of several published separation schemes, notes methods where particular difficulties arise (focusing on Groups II/III), and presents alternative schemes for the separation of these groups. Only cation analyses are reviewed. Figures are presented in…

  7. Demographic and Socioeconomic Factors Influencing Public Attitudes Toward a Presumed Consent System for Organ Donation Without and With a Priority Allocation Scheme.

    PubMed

    Tumin, Makmor; Tafran, Khaled; Mutalib, Muzalwana Abdul Talib Abdul; Satar, NurulHuda Mohd; Said, Saad Mohd; Adnan, Wan Ahmad Hafiz Wan Md; Lu, Yong Sook

    2015-10-01

    The influence of demographic and socioeconomic factors on the public's attitude towards a presumed consent system (PCS) of organ donation was estimated in 2 scenarios: without and with a priority allocation scheme (PAS). Self-administered questionnaires were completed by 775 respondents. Using multiple logistic regressions, respondents' objections to donating organs in both scenarios were estimated. In total, 63.9% of respondents would object to donating under a PCS, whereas 54.6% would object under a PCS with a PAS. Respondents with pretertiary education were more likely to object than were respondents with tertiary education, in both the first (adjusted odds ratio [AOR] = 1.615) and second (AOR = 1.728) scenarios. Young respondents were less likely to object than were middle-aged respondents, in both the first (AOR = 0.648) and second (AOR = 0.572) scenarios. Respondents with mid-ranged personal monthly income were more likely to object than were respondents with low income, in both the first (AOR = 1.994) and second (AOR = 1.519) scenarios. It does not seem that Malaysia is ready to implement a PCS. The educational level, age, and income of the broader public should be considered if a PCS, without or with a PAS, is planned for implementation in Malaysia.

  8. High-quality slab-based intermixing method for fusion rendering of multiple medical objects.

    PubMed

    Kim, Dong-Joon; Kim, Bohyoung; Lee, Jeongjin; Shin, Juneseuk; Kim, Kyoung Won; Shin, Yeong-Gil

    2016-01-01

    The visualization of multiple 3D objects has been increasingly required in recent applications in medical fields. Due to heterogeneity in data representation or data configuration, it is difficult to efficiently render multiple medical objects in high quality. In this paper, we present a novel intermixing scheme for fusion rendering of multiple medical objects while preserving real-time performance. First, we present an in-slab visibility interpolation method for the representation of subdivided slabs. Second, we introduce virtual zSlab, which extends an infinitely thin boundary (such as polygonal objects) into a slab with a finite thickness. Finally, based on virtual zSlab and in-slab visibility interpolation, we propose a slab-based visibility intermixing method with the newly proposed rendering pipeline. Experimental results demonstrate that the proposed method delivers more effective multiple-object renderings in terms of rendering quality, compared to conventional approaches. The proposed intermixing scheme also provides high-quality intermixing results for the visualization of intersecting and overlapping surfaces by resolving aliasing and z-fighting problems. Moreover, two case studies are presented that apply the proposed method to real clinical applications. These case studies demonstrate that the proposed method offers the advantages of rendering independence and reusability. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  9. Qualitative Analysis, with Periodicity, for "Real" Solutions.

    ERIC Educational Resources Information Center

    Rich, Ronald L.

    1984-01-01

    Presents an outline of group separations for a nonhydrogen sulfide analytical scheme applicable to all metallic elements (Bromide scheme). Also presents another outline of an abbreviated and modified version (Iodide scheme) designed for emphasis on nutritionally important metals, with special attention to 10 cations. (JM)

  10. A new scheme for grading the quality of scientific reports that evaluate imaging modalities for cerebrovascular diseases.

    PubMed

    Qureshi, Adnan I

    2007-10-01

    Imaging of head and neck vasculature continues to improve with the application of new technology. To judge the value of new technologies reported in the literature, it is imperative to develop objective standards optimized against bias and favoring statistical power and clinical relevance. A review of the existing literature identified the following items as lending scientific value to a report on imaging technology: prospective design, comparison with an accepted modality, unbiased patient selection, standardized image acquisition, blinded interpretation, and measurement of reliability. These were incorporated into a new grading scheme. Two physicians tested the new scheme and an established scheme to grade reports published in the medical literature. Inter-observer reliability for both methods was calculated using the kappa coefficient. A total of 22 reports evaluating imaging modalities for cervical internal carotid artery stenosis were identified from a literature search and graded by both schemes. Agreement between the two physicians in grading the level of scientific evidence using the new scheme was excellent (kappa coefficient: 0.93, p<0.0001). Agreement using the established scheme was less rigorous (kappa coefficient: 0.39, p<0.0001). The weighted kappa coefficients were 0.95 and 0.38 for the new and established schemes, respectively. Overall agreement was higher for the newer scheme (95% versus 64%). The new grading scheme can be used reliably to categorize the strength of scientific knowledge provided by individual studies of vascular imaging. The new method could assist clinicians and researchers in determining appropriate clinical applications of newly reported technical advances.
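The kappa coefficient reported above corrects raw inter-observer agreement for agreement expected by chance. A minimal sketch of Cohen's kappa with illustrative ratings, not the study's data:

```python
# Cohen's kappa: (observed agreement - chance agreement) / (1 - chance agreement)
def cohens_kappa(a, b):
    n = len(a)
    labels = set(a) | set(b)
    po = sum(x == y for x, y in zip(a, b)) / n              # observed agreement
    pe = sum((a.count(l) / n) * (b.count(l) / n)            # chance agreement
             for l in labels)
    return (po - pe) / (1 - pe)

rater1 = ["high", "high", "low", "low", "high", "low"]      # physician 1 grades
rater2 = ["high", "high", "low", "high", "high", "low"]     # physician 2 grades
print(round(cohens_kappa(rater1, rater2), 3))
```

A kappa near 0 means agreement no better than chance, while values above roughly 0.8 (such as the new scheme's 0.93) are conventionally read as excellent agreement.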

  11. Cyanide and Aflatoxin Loads of Processed Cassava (Manihot esculenta) Tubers (Garri) in Njaba, Imo State, Nigeria

    PubMed Central

    Chikezie, Paul Chidoka; Ojiako, Okey A.

    2013-01-01

    Objectives: The present study sought to investigate the role of palm oil, in conjunction with the duration of fermentation, on the cyanide and aflatoxin (AFT) loads of processed cassava tubers (Garri). Materials and Methods: Matured cassava (Manihot esculenta Crantz) tubers were harvested from three different locations (Akunna, Mkporo-Oji and Durungwu) in Njaba Local Government Area, Imo State, Nigeria. The cassava tubers were processed into Garri according to standard schemes with required modifications and measured for cyanide content using titrimetric methods. Samples of Garri for determination of AFT levels were stored for 30 days before the commencement of spectrophotometric analysis. Results: The cyanide content of peeled cassava tubers was within the range of 4.07 ± 0.16-5.20 ± 0.19 mg hydrocyanic acid (HCN) equivalent/100 g wet weight, whereas that of the various processed cassava tubers was within the range of 1.44 ± 0.34-3.95 ± 0.23 mg HCN equivalents/100 g. For the 48 h fermentation scheme, Garri treated with palm oil exhibited marginal reductions in cyanide content of 0.96%, 3.52% and 3.69%, whereas the 4 h fermentation scheme in conjunction with palm oil treatment caused 4.42%, 7.47% and 5.15% elimination of cyanide content compared with the corresponding untreated Garri samples (P > 0.05). Levels of AFT of the various Garri samples ranged between 0.26 ± 0.07 and 0.55 ± 0.04 ppb/100 g. There was no significant difference (P > 0.05) in AFT levels among the various samples in relation to their corresponding sources. Conclusion: The present study showed that the 48 h fermentation scheme for Garri production caused a significant (P < 0.05) reduction in, but did not obliterate, the cyanide content of cassava tubers. Conversely, the 48 h fermentation scheme promoted the elevation of AFT levels, which were nevertheless relatively reduced in Garri samples treated with palm oil. PMID:24403736

  12. Study of Awareness, Enrollment, and Utilization of Rashtriya Swasthya Bima Yojana (National Health Insurance Scheme) in Maharashtra, India.

    PubMed

    Thakur, Harshad

    2015-01-01

    Government of India launched a social health protection program called Rashtriya Swasthya Bima Yojana (RSBY) in the year 2008 to provide financial protection from catastrophic health expenses to below poverty line households (HHs). The objectives of the current paper are to assess the current status of RSBY in Maharashtra at each step of awareness, enrollment, and utilization. In addition, urban and rural areas were compared, and social, political, economic, and cultural (SPEC) factors responsible for the better or poor proportions, especially for awareness of the scheme, were identified. The study followed a mixed methods approach. For quantitative data, a systematic multistage sampling design was adopted in both rural and urban areas covering 6000 HHs across 22 districts. For qualitative data, five districts were selected to conduct stakeholder analysis, focus group discussions, and in-depth interviews with key informants to supplement the findings. The data were analyzed using the innovative SPEC-by-steps tool developed by Health Inc. It is seen that the RSBY had very limited success in Maharashtra. Out of 6000 HHs, only 29.7% were aware of the scheme and 21.6% were enrolled during the period of 2010-2012. Only 11.3% of HHs reported that they were currently enrolled in RSBY. Although 1886 (33.1%) HHs reported at least one case of hospitalization in the last year, only 16 (0.3%) HHs could actually utilize the benefits during hospitalization. It is seen that at each step, there is an increase in the exclusion of eligible HHs from the scheme. The participants felt that such schemes did not reach their intended beneficiaries due to various SPEC factors. The results of this study were quite similar to other studies done in the recent past. RSBY might still be continued in Maharashtra with a modified focus along with a good and improved strategy. Various other similar schemes in India can learn important lessons from it, such as the need to improve awareness, issue prompt enrollment cards with proper details, achieve universal enrollment, ensure ongoing and prompt renewal, and ensure proper utilization by proactively educating the vulnerable sections.

  13. Development of a New System for Transport Simulation and Analysis at General Atomics

    NASA Astrophysics Data System (ADS)

    St. John, H. E.; Peng, Q.; Freeman, J.; Crotinger, J.

    1997-11-01

    General Atomics has begun a long-term program to improve all aspects of experimental data analysis related to DIII-D. The objective is to make local and visiting physicists as productive as possible, with only a small investment in training, by developing intuitive, sophisticated interfaces to existing and newly created computer programs. Here we describe our initial work and the results of a pilot project in this program. The pilot project is a collaborative effort between LLNL and GA which will ultimately result in the merger of Corsica and ONETWO (and selected modules from other codes) into a new advanced transport code system. The initial goal is to produce a graphical user interface to the transport code ONETWO which will couple to a programmable (steerable) front end designed for the transport system. This will be an object-oriented scheme written primarily in Python. The programmable application will integrate existing C, C++, and Fortran methods in a single computational paradigm. Its most important feature is the use of plug-in physics modules which will allow a high degree of customization.

  14. Flyby of large-size space debris objects and their transition to the disposal orbits in LEO

    NASA Astrophysics Data System (ADS)

    Baranov, Andrey A.; Grishko, Dmitriy A.; Razoumny, Yury N.; Jun, Li

    2017-06-01

    The article focuses on the flyby issue involving large-size space debris (LSSD) objects in low Earth orbits. The data on the overall sizes of the known upper stages and last stages of launch vehicles make it possible to distinguish five compact groups of such objects from the Satellite catalogue in the 600-2000 km altitude interval. The flyby maneuvers are executed by a single space vehicle (SV) that transfers the currently captured LSSD object to a specially selected circular or elliptical disposal orbit (DO) and after a period of time returns to capture a new one. The flight is always realized when the value of the Right Ascension of the Ascending Node (RAAN) is approximately the same for the current DO and for the orbit of the following LSSD object. Distinctive features of changes in the mutual distribution of the orbital planes of LSSD within a group are shown on the RAAN deviations' evolution portrait. In the case of the first three groups (inclinations 71°, 74° and 81°), the lines describing the relative orientation of the orbital planes are quasi-parallel. Such a configuration allows easy identification of the flyby order within a group, and calculation of the mission duration and the required total ΔV. In the case of the 4th and the 5th groups, the RAAN deviations' evolution portrait is a tangle of chaotically intersecting lines. The article studies changes in mission duration and in the required ΔV depending on the catalogue number of the first object in the flyby order. The article also contains a comparative efficiency analysis of two well-known schemes for de-orbiting LSSD objects; the analysis is carried out for all 5 distinguished LSSD groups.
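The RAAN evolution that drives the flyby ordering comes from J2 nodal precession, whose standard rate formula can be sketched as follows (circular orbits assumed; the altitudes below are illustrative, not values from the article):

```python
import math

MU = 398600.4418        # km^3/s^2, Earth's gravitational parameter
RE = 6378.137           # km, Earth equatorial radius
J2 = 1.08263e-3         # Earth's second zonal harmonic

def raan_rate_deg_per_day(alt_km, incl_deg):
    """Secular RAAN drift: dOmega/dt = -1.5 * J2 * n * (Re/a)^2 * cos(i)."""
    a = RE + alt_km
    n = math.sqrt(MU / a ** 3)          # mean motion, rad/s
    rate = -1.5 * J2 * n * (RE / a) ** 2 * math.cos(math.radians(incl_deg))
    return math.degrees(rate) * 86400.0

# Two hypothetical objects in the 71-deg group, 50 km apart in altitude:
print(round(raan_rate_deg_per_day(800.0, 71.0), 3))
print(round(raan_rate_deg_per_day(850.0, 71.0), 3))
```

Because all members of a group share nearly the same inclination and altitude band, their nodes drift westward at nearly equal rates, which is why the group's lines on the RAAN deviations' evolution portrait appear quasi-parallel; small altitude differences set the slow relative drift the SV exploits to time each transfer.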

  15. Modeling and performance analysis of an improved movement-based location management scheme for packet-switched mobile communication systems.

    PubMed

    Chung, Yun Won; Kwon, Jae Kyun; Park, Suwon

    2014-01-01

    One of the key technologies supporting the mobility of mobile stations (MS) in mobile communication systems is location management, which consists of location update and paging. In this paper, an improved movement-based location management scheme with two movement thresholds is proposed, considering the bursty data traffic characteristics of packet-switched (PS) services. The analytical modeling of the location update and paging signaling loads of the proposed scheme is developed thoroughly, and the performance of the proposed scheme is compared with that of the conventional scheme. We show that the proposed scheme outperforms the conventional scheme in terms of total signaling load with an appropriate selection of movement thresholds.
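The movement-based idea can be illustrated with a toy single-threshold counter (a simplification; the proposed scheme adds a second threshold to handle bursty PS traffic): the MS reports its location after every d cell-boundary crossings, trading update signaling against the paging area that must be searched.

```python
def count_location_updates(movements, d):
    """Count location updates for a given number of cell-boundary crossings
    under a movement threshold d (update on every d-th crossing)."""
    updates, counter = 0, 0
    for _ in range(movements):
        counter += 1
        if counter == d:        # threshold reached: MS reports its location
            updates += 1
            counter = 0
    return updates

for d in (1, 3, 5):
    print(d, count_location_updates(30, d))
```

Raising d cuts update load roughly as 1/d, but paging must then cover all cells within d rings of the last-reported cell, which is the trade-off the two-threshold analysis optimizes.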

  16. Surfactant studies for bench-scale operation

    NASA Technical Reports Server (NTRS)

    Hickey, Gregory S.; Sharma, Pramod K.

    1992-01-01

    A phase 2 study was initiated to investigate surfactant-assisted coal liquefaction, with the objective of quantifying the enhancement in liquid yields and product quality. This publication covers the first quarter of work. The major accomplishments were: the refurbishment of the high-pressure, high-temperature reactor autoclave, the completion of four coal liquefaction runs with Pittsburgh #8 coal, two each with and without sodium lignosulfonate surfactant, and the development of an analysis scheme for the product liquid filtrate and filter cake. Initial results at low reactor temperatures show that the addition of the surfactant produces an improvement in conversion yields and an increase in lighter boiling point fractions for the filtrate.

  17. Economic order quantity (EOQ) by game theory approach in probabilistic supply chain system under service level constraint for items with imperfect quality

    NASA Astrophysics Data System (ADS)

    Setiawan, R.

    2018-03-01

    In this paper, the Economic Order Quantity (EOQ) of a probabilistic two-level supply-chain system for items with imperfect quality is analyzed under a service level constraint. A firm applies an active service level constraint to avoid unpredictable shortage terms in the objective function. A mathematical analysis of the optimal result is delivered using two equilibrium concepts from game theory: Stackelberg equilibrium for the cooperative strategy and Stackelberg equilibrium for the noncooperative strategy. This is a new game-theoretic result for an inventory system in which a service level constraint is applied by the firm.
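
    For orientation, a minimal sketch of the classic EOQ building block underlying such models (not the paper's probabilistic two-level game formulation; all parameter values are hypothetical, and the defect adjustment is a crude illustration rather than the paper's treatment of imperfect quality):

```python
import math

# Hypothetical parameters: annual demand, cost per order, holding cost per
# unit per year. This sketches only the deterministic EOQ building block.
def eoq(demand, order_cost, holding_cost):
    """Classic economic order quantity minimizing ordering plus holding cost."""
    return math.sqrt(2.0 * demand * order_cost / holding_cost)

def eoq_with_defects(demand, order_cost, holding_cost, defect_rate):
    """Crude imperfect-quality adjustment: inflate the lot so the expected
    number of good units still covers demand."""
    return eoq(demand, order_cost, holding_cost) / (1.0 - defect_rate)

q = eoq(1200, 50, 2.4)
q_adj = eoq_with_defects(1200, 50, 2.4, 0.05)
print(q, q_adj)
```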

  18. Data acquisition and analysis of range-finding systems for space construction

    NASA Technical Reports Server (NTRS)

    Shen, C. N.

    1981-01-01

    For future space missions, completely autonomous robotic machines will be required to free astronauts from routine chores of equipment maintenance, servicing of faulty systems, etc., and to extend human capabilities in hazardous environments full of cosmic and other harmful radiation. In places with high radiation and uncontrollable ambient illumination, TV-camera-based vision systems cannot work effectively. However, a vision system utilizing range information measured directly with a time-of-flight laser rangefinder can operate successfully under these conditions. Such a system is independent of illumination conditions, and the interfering effects of intense radiation of all kinds are eliminated by the tuned input of the laser instrument. By processing the range data according to decision-theoretic, stochastic estimation and heuristic schemes, the laser-based vision system can recognize known objects and thus provide sufficient information to the robot's control system, which can then develop strategies for various objectives.

  19. The dynamics and fueling of active nuclei

    NASA Technical Reports Server (NTRS)

    Norman, C.; Silk, J.

    1983-01-01

    It is generally believed that quasars and active galactic nuclei produce their prodigious luminosities in connection with the release of gravitational energy associated with accretion and infall of matter onto a compact central object. In the present analysis, it is assumed that the central object is a massive black hole. The fact that a black hole provides the deepest possible central potential well does imply that it is the most natural candidate for the central engine. It is also assumed that the quasar is associated with the nucleus of a conventional galaxy. A number of difficulties arise in connection with finding a suitable stellar fueling model. A simple scheme is discussed for resolving these difficulties. Attention is given to fueling in a nonaxisymmetric potential, the effects of a massive accretion disk, and the variability in the disk luminosity caused by star-disk collisions assuming that the energy deposited in the disk is radiated.

  20. Weighted analysis methods for mapped plot forest inventory data: Tables, regressions, maps and graphs

    Treesearch

    Paul C. Van Deusen; Linda S. Heath

    2010-01-01

    Weighted estimation methods for analysis of mapped plot forest inventory data are discussed. The appropriate weighting scheme can vary depending on the type of analysis and graphical display. Both statistical issues and user expectations need to be considered in these methods. A weighting scheme is proposed that balances statistical considerations and the logical...
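
    A minimal sketch of the kind of area-weighted estimator used with mapped plots (the records below are hypothetical illustrations, not the Treesearch methodology itself): each plot's observation is weighted by the proportion of the plot mapped into the condition of interest.

```python
# Hypothetical mapped-plot records: each plot carries the proportion of its
# area mapped into the condition of interest and a per-acre volume observation.
plots = [
    {"prop": 1.00, "vol": 120.0},
    {"prop": 0.40, "vol": 95.0},    # partially mapped plot
    {"prop": 0.75, "vol": 140.0},
]

def area_weighted_mean(records):
    """Weight each plot's observation by its mapped-area proportion."""
    wsum = sum(r["prop"] for r in records)
    return sum(r["prop"] * r["vol"] for r in records) / wsum

mean_vol = area_weighted_mean(plots)
print(mean_vol)
```

    The choice of weights is exactly the statistical-versus-user-expectation trade-off the abstract refers to: equal plot weights and area-proportional weights generally give different answers on partially mapped plots.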

  1. An Improved Flame Test for Qualitative Analysis Using a Multichannel UV-Visible Spectrophotometer

    ERIC Educational Resources Information Center

    Blitz, Jonathan P.; Sheeran, Daniel J.; Becker, Thomas L.

    2006-01-01

    Qualitative analysis schemes are used in undergraduate laboratory settings as a way to introduce equilibrium concepts and logical thinking. The main component of all qualitative analysis schemes is a flame test, as the color of light emitted from certain elements is distinctive and a flame photometer or spectrophotometer in each laboratory is…

  2. An improved experimental scheme for simultaneous measurement of high-resolution zero electron kinetic energy (ZEKE) photoelectron and threshold photoion (MATI) spectra

    NASA Astrophysics Data System (ADS)

    Michels, François; Mazzoni, Federico; Becucci, Maurizio; Müller-Dethlefs, Klaus

    2017-10-01

    An improved detection scheme is presented for threshold ionization spectroscopy with simultaneous recording of the Zero Electron Kinetic Energy (ZEKE) and Mass Analysed Threshold Ionisation (MATI) signals. The objective is to obtain accurate dissociation energies for larger molecular clusters by simultaneously detecting the fragment and parent ion MATI signals with identical transmission. The scheme preserves an optimal ZEKE spectral resolution together with excellent separation of the spontaneous ion and MATI signals in the time-of-flight mass spectrum. The resulting improvement in sensitivity will allow for the determination of dissociation energies in clusters with substantial mass difference between parent and daughter ions.

  3. The synthesis of monomers with pendent ethynyl group for modified high performance thermoplastics

    NASA Technical Reports Server (NTRS)

    Nwokogu, Godson C.; Antoine, Miquel D.; Ansong, Omari

    1992-01-01

    The objectives of this project were to develop synthetic schemes for the following classes of modified monomers: (1) difunctional triarylethanes with pendent acetylenic groups; and (2) tertiary aspartimides with terminal acetylene groups at the two ends. Our efforts have resulted in the successful development of high yield schemes for the syntheses of several diamino and bisphenolic analogs of difunctional triarylethanes with pendent ethynyl group. A scheme for one new tertiary aspartimide was also established. Multi-gram samples of all prepared new monomers were provided to our technical contact at NASA-LaRC and preliminary polymerization studies were encouraging. Details of the accomplished work within the last four years are described.

  4. Determination of power distribution in the VVER-440 core on the basis of data from in-core monitors by means of a metric analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kryanev, A. V.; Udumyan, D. K.; Kurchenkov, A. Yu., E-mail: s327@vver.kiae.ru

    2014-12-15

    Problems associated with determining the power distribution in the VVER-440 core on the basis of a neutron-physics calculation and data from in-core monitors are considered. A new mathematical scheme based on a metric analysis is proposed for this purpose. Compared with existing mathematical schemes, the proposed scheme improves the accuracy and reliability of the resulting power distribution.

  5. High-order upwind schemes for the wave equation on overlapping grids: Maxwell's equations in second-order form

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Angel, Jordan B.; Banks, Jeffrey W.; Henshaw, William D.

    High-order accurate upwind approximations for the wave equation in second-order form on overlapping grids are developed. Although upwind schemes are well established for first-order hyperbolic systems, it was only recently shown by Banks and Henshaw how upwinding could be incorporated into the second-order form of the wave equation. This new upwind approach is extended here to solve the time-domain Maxwell's equations in second-order form; schemes of arbitrary order of accuracy are formulated for general curvilinear grids. Taylor time-stepping is used to develop single-step space-time schemes, and the upwind dissipation is incorporated by embedding the exact solution of a local Riemann problem into the discretization. Second-order and fourth-order accurate schemes are implemented for problems in two and three space dimensions, and overlapping grids are used to treat complex geometry and problems with multiple materials. Stability analysis of the upwind scheme on overlapping grids is performed using normal mode theory. The stability analysis and computations confirm that the upwind scheme remains stable on overlapping grids, including the difficult case of thin boundary grids when the traditional non-dissipative scheme becomes unstable. The accuracy properties of the scheme are carefully evaluated on a series of classical scattering problems for both perfect conductors and dielectric materials in two and three space dimensions. Finally, the upwind scheme is shown to be robust and provide high-order accuracy.

  6. High-order upwind schemes for the wave equation on overlapping grids: Maxwell's equations in second-order form

    DOE PAGES

    Angel, Jordan B.; Banks, Jeffrey W.; Henshaw, William D.

    2017-09-28

    High-order accurate upwind approximations for the wave equation in second-order form on overlapping grids are developed. Although upwind schemes are well established for first-order hyperbolic systems, it was only recently shown by Banks and Henshaw how upwinding could be incorporated into the second-order form of the wave equation. This new upwind approach is extended here to solve the time-domain Maxwell's equations in second-order form; schemes of arbitrary order of accuracy are formulated for general curvilinear grids. Taylor time-stepping is used to develop single-step space-time schemes, and the upwind dissipation is incorporated by embedding the exact solution of a local Riemann problem into the discretization. Second-order and fourth-order accurate schemes are implemented for problems in two and three space dimensions, and overlapping grids are used to treat complex geometry and problems with multiple materials. Stability analysis of the upwind scheme on overlapping grids is performed using normal mode theory. The stability analysis and computations confirm that the upwind scheme remains stable on overlapping grids, including the difficult case of thin boundary grids when the traditional non-dissipative scheme becomes unstable. The accuracy properties of the scheme are carefully evaluated on a series of classical scattering problems for both perfect conductors and dielectric materials in two and three space dimensions. Finally, the upwind scheme is shown to be robust and provide high-order accuracy.

  7. An Anonymous User Authentication and Key Agreement Scheme Based on a Symmetric Cryptosystem in Wireless Sensor Networks

    PubMed Central

    Jung, Jaewook; Kim, Jiye; Choi, Younsung; Won, Dongho

    2016-01-01

    In wireless sensor networks (WSNs), a registered user can log in to the network and use a user authentication protocol to access data collected from the sensor nodes. Since WSNs are typically deployed in unattended environments and sensor nodes have limited resources, many researchers have made considerable efforts to design a secure and efficient user authentication process. Recently, Chen et al. proposed a secure user authentication scheme using symmetric key techniques for WSNs. They claim that their scheme assures high efficiency and security against different types of attacks. After careful analysis, however, we find that Chen et al.’s scheme is still vulnerable to smart card loss attack and is susceptible to denial of service attack, since simply comparing an entered ID with the ID stored in the smart card is not a valid verification. In addition, we observe that their scheme cannot preserve user anonymity. Furthermore, their scheme cannot quickly detect an incorrect password during the login phase, a flaw that wastes both communication and computational resources. In this paper, we describe how these attacks work, and propose an enhanced anonymous user authentication and key agreement scheme based on a symmetric cryptosystem in WSNs to address all of the aforementioned vulnerabilities in Chen et al.’s scheme. Our analysis shows that the proposed scheme improves the level of security, and is also more efficient relative to other related schemes. PMID:27537890
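
    A generic symmetric challenge-response sketch of the kind of primitive such schemes build on (this is not Chen et al.'s protocol or the scheme proposed in the paper; the identifiers are hypothetical): an HMAC over a random nonce gives freshness, so replays fail, and verification is keyed rather than a bare comparison of a stored ID.

```python
import hmac, hashlib, os

# Generic symmetric challenge-response sketch; key is shared between the
# gateway (verifier) and sensor/user (prover). Identifiers are hypothetical.
key = os.urandom(32)

def prove(key, nonce, user_id):
    """Prover's response to a challenge nonce."""
    return hmac.new(key, nonce + user_id, hashlib.sha256).digest()

def verify(key, nonce, user_id, tag):
    """Verifier recomputes the tag; compare_digest is constant-time."""
    expected = hmac.new(key, nonce + user_id, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

nonce = os.urandom(16)
tag = prove(key, nonce, b"sensor-17")
print(verify(key, nonce, b"sensor-17", tag))           # valid login
print(verify(key, nonce, b"sensor-99", tag))           # wrong identity
print(verify(key, os.urandom(16), b"sensor-17", tag))  # replayed tag fails
```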

  8. High-order upwind schemes for the wave equation on overlapping grids: Maxwell's equations in second-order form

    NASA Astrophysics Data System (ADS)

    Angel, Jordan B.; Banks, Jeffrey W.; Henshaw, William D.

    2018-01-01

    High-order accurate upwind approximations for the wave equation in second-order form on overlapping grids are developed. Although upwind schemes are well established for first-order hyperbolic systems, it was only recently shown by Banks and Henshaw [1] how upwinding could be incorporated into the second-order form of the wave equation. This new upwind approach is extended here to solve the time-domain Maxwell's equations in second-order form; schemes of arbitrary order of accuracy are formulated for general curvilinear grids. Taylor time-stepping is used to develop single-step space-time schemes, and the upwind dissipation is incorporated by embedding the exact solution of a local Riemann problem into the discretization. Second-order and fourth-order accurate schemes are implemented for problems in two and three space dimensions, and overlapping grids are used to treat complex geometry and problems with multiple materials. Stability analysis of the upwind-scheme on overlapping grids is performed using normal mode theory. The stability analysis and computations confirm that the upwind scheme remains stable on overlapping grids, including the difficult case of thin boundary grids when the traditional non-dissipative scheme becomes unstable. The accuracy properties of the scheme are carefully evaluated on a series of classical scattering problems for both perfect conductors and dielectric materials in two and three space dimensions. The upwind scheme is shown to be robust and provide high-order accuracy.
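
    For orientation, a minimal 1D sketch of the standard non-dissipative centered scheme for the second-order wave equation u_tt = c²u_xx, i.e. the baseline that the upwind approach of the records above augments with Riemann-based dissipation (this is not the authors' overlapping-grid Maxwell solver; the grid, CFL number and standing-wave test case are illustrative):

```python
import math

# Standard non-dissipative centered leapfrog scheme for u_tt = c^2 u_xx on
# [0, 1] with homogeneous Dirichlet boundaries, verified against an exact
# standing wave. Parameters are illustrative.
c, N, T = 1.0, 100, 0.5
dx = 1.0 / N
dt = 0.5 * dx / c                       # CFL number 0.5
r2 = (c * dt / dx) ** 2

def exact(x, t):                        # standing wave, zero at both ends
    return math.sin(math.pi * x) * math.cos(math.pi * c * t)

xs = [j * dx for j in range(N + 1)]
u_prev = [exact(x, 0.0) for x in xs]

# Second-order accurate first step (u_t(x, 0) = 0 for this initial condition)
u = [0.0] * (N + 1)
for j in range(1, N):
    u[j] = u_prev[j] + 0.5 * r2 * (u_prev[j + 1] - 2 * u_prev[j] + u_prev[j - 1])

t = dt
while t < T - 1e-12:                    # leapfrog the interior points
    u_next = [0.0] * (N + 1)
    for j in range(1, N):
        u_next[j] = 2 * u[j] - u_prev[j] + r2 * (u[j + 1] - 2 * u[j] + u[j - 1])
    u_prev, u = u, u_next
    t += dt

err = max(abs(u[j] - exact(xs[j], t)) for j in range(N + 1))
print(err)                              # small: the scheme is second-order
```

    On a single smooth grid this scheme is stable and accurate; the papers' point is that on overlapping grids with thin boundary grids it can lose stability, which the added upwind dissipation restores.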

  9. Location and Size Planning of Distributed Photovoltaic Generation in Distribution network System Based on K-means Clustering Analysis

    NASA Astrophysics Data System (ADS)

    Lu, Siqi; Wang, Xiaorong; Wu, Junyong

    2018-01-01

    The paper presents a data-driven method, based on the K-means clustering algorithm, to generate planning scenarios for the location and size planning of distributed photovoltaic (PV) units in the network. Taking the power losses of the network, the installation and maintenance costs of distributed PV, the profit of distributed PV and the voltage offset as objectives, and the locations and sizes of distributed PV units as decision variables, the Pareto optimal front is obtained through a self-adaptive genetic algorithm (GA), and solutions are ranked by the technique for order preference by similarity to an ideal solution (TOPSIS). Finally, the top-ranked planning schemes are selected according to different planning emphases after detailed analysis. The proposed method is applied to a 10-kV distribution network in Gansu Province, China, and the results are discussed.
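
    A toy sketch of the scenario-generation step (synthetic data, not the paper's Gansu network study): 1D K-means reduces many daily PV-output values to a few representative planning scenarios.

```python
import random

# Synthetic daily PV energy values drawn from three weather regimes; the
# regime means (2, 5, 8) are assumed for the illustration.
random.seed(0)
data = ([random.gauss(2.0, 0.2) for _ in range(30)]     # overcast days
        + [random.gauss(5.0, 0.3) for _ in range(30)]   # mixed days
        + [random.gauss(8.0, 0.25) for _ in range(30)]) # clear days

def kmeans_1d(xs, k, iters=50):
    s = sorted(xs)
    # deterministic quantile initialization spreads the starting centers
    centers = [s[(2 * i + 1) * len(s) // (2 * k)] for i in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for x in xs:
            nearest = min(range(k), key=lambda c: abs(x - centers[c]))
            groups[nearest].append(x)
        centers = [sum(g) / len(g) if g else centers[i]
                   for i, g in enumerate(groups)]
    return sorted(centers)

centers = kmeans_1d(data, 3)
print(centers)   # three representative daily-energy scenarios
```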

  10. Continuous development of schemes for parallel computing of the electrostatics in biological systems: implementation in DelPhi.

    PubMed

    Li, Chuan; Petukh, Marharyta; Li, Lin; Alexov, Emil

    2013-08-15

    Due to the enormous importance of electrostatics in molecular biology, calculating the electrostatic potential and corresponding energies has become a standard computational approach for the study of biomolecules and nano-objects immersed in water and salt phase or other media. However, the electrostatics of large macromolecules and macromolecular complexes, including nano-objects, may not be obtainable via explicit methods, and even the standard continuum electrostatics methods may not be applicable due to high computational time and memory requirements. Here, we report further development of the parallelization scheme reported in our previous work (Li et al., J. Comput. Chem. 2012, 33, 1960) to include parallelization of the molecular surface and energy calculation components of the algorithm. The parallelization scheme utilizes different approaches such as space domain parallelization, algorithmic parallelization, multithreading, and task scheduling, depending on the quantity being calculated. This allows for efficient use of the computing resources of the corresponding computer cluster. The parallelization scheme is implemented in the popular software DelPhi and results in a severalfold speedup. As a demonstration of the efficiency and capability of this methodology, the electrostatic potential and electric field distributions are calculated for the bovine mitochondrial supercomplex, illustrating their complex topology, which cannot be obtained by modeling the supercomplex components alone. Copyright © 2013 Wiley Periodicals, Inc.

  11. Performance Analyses and Improvements for the IEEE 802.15.4 CSMA/CA Scheme with Heterogeneous Buffered Conditions

    PubMed Central

    Zhu, Jianping; Tao, Zhengsu; Lv, Chunfeng

    2012-01-01

    Studies of the IEEE 802.15.4 Carrier Sense Multiple Access with Collision Avoidance (CSMA/CA) scheme have received considerable attention recently, with most of these studies focusing on homogeneous or saturated traffic. Two novel transmission schemes—OSTS/BSTS (One Service a Time Scheme/Bulk Service a Time Scheme)—are proposed in this paper to improve the behavior of time-critical buffered networks with heterogeneous unsaturated traffic. First, we propose a model which combines two modified semi-Markov chains and a macro-Markov chain with the theory of M/G/1/K queues to evaluate the characteristics of these two improved CSMA/CA schemes, in which traffic arrivals and accessing packets are given non-preemptive priority over each other rather than strict prioritization. Then, the throughput, packet delay and energy consumption of unsaturated, unacknowledged IEEE 802.15.4 beacon-enabled networks are predicted from an overall point of view that takes the dependent interactions of different types of nodes into account. Moreover, performance comparisons of these two schemes with other non-priority schemes are also presented. Analysis and simulation results show that the delay and fairness of our schemes are superior to those of other schemes, while throughput and energy efficiency are superior in more heterogeneous situations. Comprehensive simulations demonstrate that the analysis results of these models match well with the simulation results. PMID:22666076

  12. Cut-Off Points for Mild, Moderate, and Severe Pain on the Numeric Rating Scale for Pain in Patients with Chronic Musculoskeletal Pain: Variability and Influence of Sex and Catastrophizing.

    PubMed

    Boonstra, Anne M; Stewart, Roy E; Köke, Albère J A; Oosterwijk, René F A; Swaan, Jeannette L; Schreurs, Karlein M G; Schiphorst Preuper, Henrica R

    2016-01-01

    Objectives: The 0-10 Numeric Rating Scale (NRS) is often used in pain management. The aims of our study were to determine the cut-off points for mild, moderate, and severe pain in terms of pain-related interference with functioning in patients with chronic musculoskeletal pain, to measure the variability of the optimal cut-off points, and to determine the influence of patients' catastrophizing and their sex on these cut-off points. Methods: 2854 patients were included. Pain was assessed by the NRS, functioning by the Pain Disability Index (PDI) and catastrophizing by the Pain Catastrophizing Scale (PCS). Cut-off point (CP) schemes were tested using ANOVAs with and without the PCS scores or sex as covariates and with the interaction between CP scheme and PCS score and sex, respectively. The variability of the optimal CP schemes was quantified using a bootstrapping procedure. Results and conclusion: The study showed that NRS scores ≤ 5 correspond to mild, scores of 6-7 to moderate and scores ≥8 to severe pain in terms of pain-related interference with functioning. Bootstrapping analysis identified this optimal NRS cut-off point scheme in 90% of the bootstrapping samples. The interpretation of the NRS is independent of sex, but seems to depend on catastrophizing. In patients with a high catastrophizing tendency, the optimal CP scheme equals that for the total study sample, but in patients with a low catastrophizing tendency, NRS scores ≤ 3 correspond to mild, scores of 4-6 to moderate and scores ≥7 to severe pain in terms of interference with functioning. In these optimal CP schemes, NRS scores of 4 and 5 correspond to moderate interference with functioning for patients with a low catastrophizing tendency and to mild interference for patients with a high catastrophizing tendency.
Theoretically one would therefore expect that among the patients with NRS scores 4 and 5 there would be a higher average PDI score for those with low catastrophizing than for those with high catastrophizing. However, we found the opposite. The fact that we did not find the same optimal CP scheme in the subgroups with lower and higher catastrophizing tendency may be due to chance variability.
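
    A simplified sketch of the cut-off selection and bootstrapping idea (synthetic data; the study itself used ANOVAs on PDI scores): candidate cut-off schemes are scored by the pooled within-category variance of an interference score, and bootstrap resampling checks how often the same scheme is selected.

```python
import random

# Synthetic patients: NRS score 0-10 with an interference score that rises
# linearly with pain plus noise. The data and the variance criterion are
# illustrative stand-ins for the study's PDI-based ANOVAs.
random.seed(1)
patients = [(nrs, 5.0 * nrs + random.gauss(0.0, 6.0))
            for _ in range(40) for nrs in range(11)]

# Candidate schemes: (upper bound of 'mild', upper bound of 'moderate')
schemes = [(3, 6), (4, 6), (5, 7)]

def pooled_within_variance(data, scheme):
    lo, hi = scheme
    groups = {0: [], 1: [], 2: []}
    for nrs, interference in data:
        groups[0 if nrs <= lo else 1 if nrs <= hi else 2].append(interference)
    total, n = 0.0, 0
    for g in groups.values():
        if len(g) > 1:
            m = sum(g) / len(g)
            total += sum((x - m) ** 2 for x in g)
            n += len(g)
    return total / n

def best_scheme(data):
    return min(schemes, key=lambda s: pooled_within_variance(data, s))

wins = {s: 0 for s in schemes}
for _ in range(200):                      # bootstrap resamples
    sample = [random.choice(patients) for _ in patients]
    wins[best_scheme(sample)] += 1
print(wins)
```

    A scheme that wins in most resamples, as the study's optimal scheme did in 90% of samples, is robust to sampling variability; a near-even split would signal the kind of chance variability discussed above.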

  13. Cut-Off Points for Mild, Moderate, and Severe Pain on the Numeric Rating Scale for Pain in Patients with Chronic Musculoskeletal Pain: Variability and Influence of Sex and Catastrophizing

    PubMed Central

    Boonstra, Anne M.; Stewart, Roy E.; Köke, Albère J. A.; Oosterwijk, René F. A.; Swaan, Jeannette L.; Schreurs, Karlein M. G.; Schiphorst Preuper, Henrica R.

    2016-01-01

    Objectives: The 0–10 Numeric Rating Scale (NRS) is often used in pain management. The aims of our study were to determine the cut-off points for mild, moderate, and severe pain in terms of pain-related interference with functioning in patients with chronic musculoskeletal pain, to measure the variability of the optimal cut-off points, and to determine the influence of patients’ catastrophizing and their sex on these cut-off points. Methods: 2854 patients were included. Pain was assessed by the NRS, functioning by the Pain Disability Index (PDI) and catastrophizing by the Pain Catastrophizing Scale (PCS). Cut-off point (CP) schemes were tested using ANOVAs with and without the PCS scores or sex as covariates and with the interaction between CP scheme and PCS score and sex, respectively. The variability of the optimal CP schemes was quantified using a bootstrapping procedure. Results and conclusion: The study showed that NRS scores ≤ 5 correspond to mild, scores of 6–7 to moderate and scores ≥8 to severe pain in terms of pain-related interference with functioning. Bootstrapping analysis identified this optimal NRS cut-off point scheme in 90% of the bootstrapping samples. The interpretation of the NRS is independent of sex, but seems to depend on catastrophizing. In patients with a high catastrophizing tendency, the optimal CP scheme equals that for the total study sample, but in patients with a low catastrophizing tendency, NRS scores ≤ 3 correspond to mild, scores of 4–6 to moderate and scores ≥7 to severe pain in terms of interference with functioning. In these optimal CP schemes, NRS scores of 4 and 5 correspond to moderate interference with functioning for patients with a low catastrophizing tendency and to mild interference for patients with a high catastrophizing tendency.
Theoretically one would therefore expect that among the patients with NRS scores 4 and 5 there would be a higher average PDI score for those with low catastrophizing than for those with high catastrophizing. However, we found the opposite. The fact that we did not find the same optimal CP scheme in the subgroups with lower and higher catastrophizing tendency may be due to chance variability. PMID:27746750

  14. Willingness to pay for health insurance in the informal sector of Sierra Leone

    PubMed Central

    Jofre-Bonet, Mireia; Kamara, Joseph

    2018-01-01

    Purpose The objective of this project is to study the willingness to pay (WTP) for health insurance (HI) of individuals working in the informal sector in Sierra Leone, using a purposely designed survey of a representative sample of this sector. Methods We elicit the WTP using the Double-Bounded Dichotomous Choice with Follow Up method. We also examine the factors that are positively and negatively associated with the likelihood of respondents answering affirmatively to joining a HI scheme and to paying three different possible premiums to join it. We additionally analyze the individual and household characteristics associated with the maximum amount a household is willing to pay to join the HI scheme. Results The results indicate that the average WTP for the HI is 20,237.16 SLL (3.6 USD) per adult, but it ranges from about 14,000 SLL (2.5 USD) to about 35,000 SLL (6.2 USD) depending on region, occupation, household and respondent characteristics. The analysis of the maximum WTP indicates that living outside the Western region and working in farming instead of petty trade are associated with a decrease in the maximum premium respondents are willing to pay for the HI scheme. In contrast, the maximum WTP is positively associated with being a driver or a biker; having secondary or tertiary education (as opposed to not having any); the number of pregnant women in the household; having a TV; and having paid for the last medical requirement. Conclusions In summary, the various analyses show that a premium for the HI package could be set at approximately 20,000 SLL (3.54 USD), but also that establishing a single premium for all individuals in the informal sector could be risky. The efficient functioning of a HI scheme relies on covering as much of the population as possible, in order to spread risks and make the scheme viable. The impact of the various population characteristics raises the issue of how to rate premiums.
In other words, setting a premium that may be too high for a big proportion of the population could mean losing many potential enrollees and might have viability consequences for the operation of the scheme. PMID:29768409

  15. Evaluation of the impact of a chronic disease scheme reimbursing medical costs of patients with diabetes in Anhui province, China: a follow-up study.

    PubMed

    Jiang, Qicheng; Jiang, Zhen; Xin, Zhang; Cherry, Nicola

    2016-09-15

    Although many studies have investigated the relationship between the introduction of the New Cooperative Medical Scheme (NCMS) in rural China in 2003 and increased use of medical services, the effect on objectively measured health status is seldom reported. In Anhui Province, a chronic disease scheme (CDS) reimbursing part of the cost of outpatient care is designed to improve management of those with chronic conditions, including diabetes. A follow-up study was designed in which patients with diabetes aged 40-70 years who had recently (in 2010) been granted a chronic disease card were individually matched on age, sex and village with a patient with diabetes not yet in the scheme. Each subject gave a fingertip blood sample to determine the concentration of glycosylated hemoglobin (HbA1c), a measure of blood glucose control during the previous 3 months. This measure was taken at recruitment and at 12-month follow-up; information on use of health services, quality of life and financial burden was also collected at the two contacts. Of 602 pairs initially recruited, 528 pairs were contacted at follow-up and are the subject of this report. To distinguish between outcomes associated with applying to the scheme and those of membership, the primary analysis was of 256 pairs in which one had been a member of the CDS throughout and the other had never applied. No difference between pairs on HbA1c was found either at recruitment or follow-up, but those in the CDS reported more hospital visits, more tests and more use of high-level hospitals. However, they had poorer scores on quality of life scales (SF-12, EQ-5D) and were more likely to report that the financial costs were very burdensome. Those who had recently applied for the scheme, or had been accepted since recruitment, had lower HbA1c scores. Ongoing membership of the CDS was associated with increased use of services, but this did not appear to result in better management of blood glucose or improved quality of life.
Those who had recently joined the scheme had signs of improvement, suggesting a need for active follow-up to maintain and reinforce early gains.

  16. An improved and effective secure password-based authentication and key agreement scheme using smart cards for the telecare medicine information system.

    PubMed

    Das, Ashok Kumar; Bruhadeshwar, Bezawada

    2013-10-01

    Recently, Lee and Liu proposed an efficient password-based authentication and key agreement scheme using smart cards for the telecare medicine information system [J. Med. Syst. (2013) 37:9933]. In this paper, we show that although their scheme is efficient, it still has two security weaknesses: (1) design flaws in the authentication phase and (2) design flaws in the password change phase. To withstand these flaws in Lee-Liu's scheme, we propose an improvement of their scheme that also retains its original merits. We show that our scheme is efficient compared to Lee-Liu's scheme. Further, through security analysis, we show that our scheme is secure against possible known attacks. In addition, we simulate our scheme for formal security verification using the widely accepted AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to show that it is secure against passive and active attacks.

  17. MASTER ANALYTICAL SCHEME FOR ORGANIC COMPOUNDS IN WATER: PART 1. PROTOCOLS

    EPA Science Inventory

    A Master Analytical Scheme (MAS) has been developed for the analysis of volatile (gas chromatographable) organic compounds in water. In developing the MAS, it was necessary to evaluate and modify existing analysis procedures and develop new techniques to produce protocols that pr...

  18. APPLICATION OF THE MASTER ANALYTICAL SCHEME TO POLAR ORGANICS IN DRINKING WATER

    EPA Science Inventory

    EPA's Master Analytical Scheme (MAS) for Organic Compounds in Water provides for comprehensive qualitative-quantitative analysis of gas chromatographable organics in many types of water. The paper emphasizes the analysis of polar and ionic organics, the more water soluble compoun...

  19. Computational aspects of helicopter trim analysis and damping levels from Floquet theory

    NASA Technical Reports Server (NTRS)

    Gaonkar, Gopal H.; Achar, N. S.

    1992-01-01

    Helicopter trim settings of periodic initial state and control inputs are investigated for convergence of Newton iteration in computing the settings sequentially and in parallel. The trim analysis uses a shooting method and a weak version of two temporal finite element methods, one with a displacement formulation and one with a mixed formulation of displacements and momenta. These three methods broadly represent the two main approaches to trim analysis: adaptation of initial-value and finite element boundary-value codes to periodic boundary conditions, particularly for unstable and marginally stable systems. In each method, both the sequential and in-parallel schemes are used, and the resulting nonlinear algebraic equations are solved by damped Newton iteration with an optimally selected damping parameter. The impact of damped Newton iteration, including earlier-observed divergence problems in trim analysis, is demonstrated by the maximum condition number of the Jacobian matrices of the iterative scheme and by the virtual elimination of divergence. The advantages of the in-parallel scheme over the conventional sequential scheme are also demonstrated.
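
    A scalar sketch of damped Newton iteration with a backtracking choice of the damping parameter (illustrative only, not the paper's trim formulation): for f(x) = arctan x, plain Newton diverges from x0 = 3 because the full step overshoots, while the damped update converges to the root at 0.

```python
import math

# Damped Newton iteration: shrink the damping parameter lam until the
# residual |f| actually decreases, then take the damped step.
def damped_newton(f, df, x0, tol=1e-10, max_iter=100):
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        step = fx / df(x)
        lam = 1.0
        while abs(f(x - lam * step)) >= abs(fx) and lam > 1e-8:
            lam *= 0.5
        x -= lam * step
    return x

# arctan(x) = 0: a classic case where the full Newton step diverges from
# x0 = 3 but the damped iteration converges steadily.
root = damped_newton(math.atan, lambda x: 1.0 / (1.0 + x * x), 3.0)
print(root)
```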

  20. Analysis/forecast experiments with a flow-dependent correlation function using FGGE data

    NASA Technical Reports Server (NTRS)

    Baker, W. E.; Bloom, S. C.; Carus, H.; Nestler, M. S.

    1986-01-01

The use of a flow-dependent correlation function to improve the accuracy of an optimum interpolation (OI) scheme is examined. The development of the correlation function for the OI analysis scheme used for numerical weather prediction is described. The scheme uses a multivariate surface analysis over the oceans to model the pressure-wind error cross-correlation, and it can use an error correlation function that is both flow- and geographically-dependent. A series of four-day data assimilation experiments, conducted from January 5-9, 1979, was used to investigate the effect of the different features of the OI scheme (error correlation) on forecast skill for the barotropic lows and highs. The skill of the OI was compared with that of a successive correction method (SCM) of analysis. The largest differences in the correlation statistics occurred in barotropic and baroclinic lows and highs. The comparison reveals that the OI forecasts were more accurate than the SCM forecasts.
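The OI update at the heart of such schemes combines a background field with observations weighted by the error covariances. A minimal single-observation sketch on a 1-D grid, assuming a Gaussian background-error correlation (the flow-dependent function of the paper is not reproduced here):

```python
import numpy as np

def oi_analysis(xb, obs, H, B, R):
    """One optimum-interpolation (OI) update:
    xa = xb + K (y - H xb),  K = B H^T (H B H^T + R)^{-1}."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return xb + K @ (obs - H @ xb)

# 1-D grid with a Gaussian background-error correlation, length scale L.
n, L = 21, 3.0
x = np.arange(n, dtype=float)
B = np.exp(-0.5 * ((x[:, None] - x[None, :]) / L) ** 2)
xb = np.zeros(n)                       # background field
H = np.zeros((1, n)); H[0, 10] = 1.0   # one observation at grid point 10
R = np.array([[0.1]])                  # observation-error variance
xa = oi_analysis(xb, np.array([1.0]), H, B, R)
```

The analysis increment peaks at the observation point and decays with the correlation length, which is exactly the lever a flow-dependent correlation function adjusts.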

  1. Incentivising effort in governance of public hospitals: Development of a delegation-based alternative to activity-based remuneration.

    PubMed

    Søgaard, Rikke; Kristensen, Søren Rud; Bech, Mickael

    2015-08-01

    This paper is a first examination of the development of an alternative to activity-based remuneration in public hospitals, which is currently being tested at nine hospital departments in a Danish region. The objective is to examine the process of delegating the authority of designing new incentive schemes from the principal (the regional government) to the agents (the hospital departments). We adopt a theoretical framework where, when deciding about delegation, the principal should trade off an initiative effect against the potential cost of loss of control. The initiative effect is evaluated by studying the development process and the resulting incentive schemes for each of the departments. Similarly, the potential cost of loss of control is evaluated by assessing the congruence between focus of the new incentive schemes and the principal's objectives. We observe a high impact of the effort incentive in the form of innovative and ambitious selection of projects by the agents, leading to nine very different solutions across departments. However, we also observe some incongruence between the principal's stated objectives and the revealed private interests of the agents. Although this is a baseline study involving high uncertainty about the future, the findings point at some issues with the delegation approach that could lead to inefficient outcomes. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  2. Multi-Objective Memetic Search for Robust Motion and Distortion Correction in Diffusion MRI.

    PubMed

    Hering, Jan; Wolf, Ivo; Maier-Hein, Klaus H

    2016-10-01

Effective image-based artifact correction is an essential step in the analysis of diffusion MR images. Many current approaches are based on retrospective registration, which becomes challenging in the realm of high b-values and low signal-to-noise ratio, rendering the corresponding correction schemes increasingly ineffective. We propose a novel registration scheme based on memetic search optimization that allows for simultaneous exploitation of different signal intensity relationships between the images, leading to more robust registration results. We demonstrate the increased robustness and efficacy of our method on simulated as well as in vivo datasets. In contrast to state-of-the-art methods, the median target registration error (TRE) stayed below the voxel size even for high b-values (3000 s·mm^-2 and higher) and low SNR conditions. We also demonstrate the increased precision in diffusion-derived quantities by evaluating Neurite Orientation Dispersion and Density Imaging (NODDI) derived measures on an in vivo dataset with severe motion artifacts. These promising results will potentially inspire further studies on metaheuristic optimization in diffusion MRI artifact correction and image registration in general.

  3. Simulation of adaptive semi-active magnetorheological seat damper for vehicle occupant blast protection

    NASA Astrophysics Data System (ADS)

    Yoo, Jin-Hyeong; Murugan, Muthuvel; Wereley, Norman M.

    2013-04-01

This study investigates a lumped-parameter human body model, including the lower leg in a seated posture, within a quarter-car model for blast injury assessment simulation. To simulate the shock acceleration of the vehicle, a mine blast analysis was conducted on a generic land vehicle crew compartment (sand box) structure. To capture human body dynamics with non-linear parameters, the lumped-parameter human body within the quarter-car model was implemented in multi-body dynamic simulation software. For the control scheme, a skyhook algorithm was coupled to the multi-body dynamic model by running a co-simulation with the control scheme software plug-in. The injury criteria and tolerance levels for the biomechanical effects are discussed for each of the identified vulnerable body regions, such as the relative head displacement and the neck bending moment. The objective of this analytical model development is to study the performance of an adaptive semi-active magnetorheological damper that can be used for vehicle-occupant protection enhancements to the seat design in a mine-resistant military vehicle.
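The skyhook logic referred to above switches a semi-active damper between high and low damping depending on whether the achievable force opposes the sprung-mass motion. A minimal sketch of the classic on-off rule (the function name and arguments are illustrative, not from the paper):

```python
def skyhook_damping(v_sprung, v_rel, c_max, c_min=0.0):
    """On-off skyhook logic for a semi-active damper.

    v_sprung: absolute velocity of the suspended mass (seat/occupant);
    v_rel: relative velocity across the damper (sprung minus base).
    When the achievable damper force opposes the sprung-mass motion
    (v_sprung * v_rel > 0), command high damping; otherwise command
    low damping.  The commanded damper force is then c * v_rel.
    """
    return c_max if v_sprung * v_rel > 0.0 else c_min
```

A magnetorheological damper realizes this by modulating the field current, approximating the ideal "damper to the sky" without an inertial reference.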

  4. A reevaluation of the costs of heart failure and its implications for allocation of health resources in the United States.

    PubMed

    Voigt, Jeff; Sasha John, M; Taylor, Andrew; Krucoff, Mitchell; Reynolds, Matthew R; Michael Gibson, C

    2014-05-01

    The annual cost of heart failure (HF) is estimated at $39.2 billion. This has been acknowledged to underestimate the true costs for care. The objective of this analysis is to more accurately assess these costs. Publicly available data sources were used. Cost calculations incorporated relevant factors such as Medicare hospital cost-to-charge ratios, reimbursement from both government and private insurance, and out-of-pocket expenditures. A recently published Atherosclerosis Risk in Communities (ARIC) HF scheme was used to adjust the HF classification scheme. Costs were calculated with HF as the primary diagnosis (HF in isolation, or HFI) or HF as one of the diagnoses/part of a disease milieu (HF syndrome, or HFS). Total direct costs for HF were calculated at $60.2 billion (HFI) and $115.4 billion (HFS). Indirect costs were $10.6 billion for both. Costs attributable to HF may represent a much larger burden to US health care than what is commonly referenced. These revised and increased costs have implications for policy makers.

  5. An opposite view data replacement approach for reducing artifacts due to metallic dental objects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yazdi, Mehran; Lari, Meghdad Asadi; Bernier, Gaston

Purpose: To present a conceptually new method for metal artifact reduction (MAR) that can be used on patients with multiple objects within the scan plane that are also small along the longitudinal (scanning) direction, such as dental fillings. Methods: The proposed algorithm, named opposite view replacement, achieves MAR by first detecting the projection data affected by metal objects and then replacing the affected projections by the corresponding opposite-view projections, which are not affected by metal objects. The authors also applied a fading process to avoid producing any discontinuities at the boundary of the affected projection areas in the sinogram. A skull phantom with and without a variety of dental metal inserts was made to extract the performance metrics of the algorithm. A head and neck case, typical of IMRT planning, was also tested. Results: The reconstructed CT images based on this new replacement scheme show a significant improvement in image quality for patients with metallic dental objects compared to MAR algorithms based on interpolation. For the phantom, the authors showed that the artifact reduction algorithm can efficiently recover the CT numbers in the area next to the metallic objects. Conclusions: The authors presented a new and efficient method for artifact reduction due to multiple small metallic objects. The results obtained from phantoms and clinical cases fully validate the proposed approach.
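For a parallel-beam geometry acquired over 360 degrees (an assumption of this sketch), the opposite view of a ray at (theta, s) is the conjugate ray at (theta + 180 deg, -s). A minimal sketch of the replacement step; the paper's fading process at the boundary of the affected region is omitted:

```python
import numpy as np

def opposite_view_replace(sino, mask):
    """Replace metal-affected samples of a 360-degree parallel-beam
    sinogram with their opposite (conjugate) views.

    sino: (n_angles, n_det) sinogram; mask: True where metal-affected.
    The conjugate of (theta, s) is (theta + 180 deg, -s): shift by half
    the angular range and mirror the detector axis.
    """
    half = sino.shape[0] // 2
    opposite = np.roll(sino[:, ::-1], half, axis=0)
    out = sino.copy()
    out[mask] = opposite[mask]
    return out
```

Because the conjugate ray traverses the same line through the patient, the replacement uses measured (rather than interpolated) data wherever the opposite view is metal-free.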

  6. Performance of Low Dissipative High Order Shock-Capturing Schemes for Shock-Turbulence Interactions

    NASA Technical Reports Server (NTRS)

    Sandham, N. D.; Yee, H. C.

    1998-01-01

    Accurate and efficient direct numerical simulation of turbulence in the presence of shock waves represents a significant challenge for numerical methods. The objective of this paper is to evaluate the performance of high order compact and non-compact central spatial differencing employing total variation diminishing (TVD) shock-capturing dissipations as characteristic based filters for two model problems combining shock wave and shear layer phenomena. A vortex pairing model evaluates the ability of the schemes to cope with shear layer instability and eddy shock waves, while a shock wave impingement on a spatially-evolving mixing layer model studies the accuracy of computation of vortices passing through a sequence of shock and expansion waves. A drastic increase in accuracy is observed if a suitable artificial compression formulation is applied to the TVD dissipations. With this modification to the filter step the fourth-order non-compact scheme shows improved results in comparison to second-order methods, while retaining the good shock resolution of the basic TVD scheme. For this characteristic based filter approach, however, the benefits of compact schemes or schemes with higher than fourth order are not sufficient to justify the higher complexity near the boundary and/or the additional computational cost.
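A much simpler relative of such TVD dissipations is the second-order upwind (MUSCL) scheme with a minmod limiter for linear advection; the sketch below illustrates the TVD property itself (total variation does not grow), not the characteristic-based filter of the paper:

```python
import numpy as np

def minmod(a, b):
    """Minmod limiter: smallest-magnitude slope, zero at extrema."""
    return np.where(a * b > 0.0,
                    np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

def advect_tvd(u, c, steps):
    """Second-order upwind (MUSCL) scheme with minmod limiting for
    u_t + a u_x = 0, a > 0, periodic BCs; c = a*dt/dx in (0, 1]."""
    for _ in range(steps):
        s = minmod(u - np.roll(u, 1), np.roll(u, -1) - u)  # limited slope
        f = u + 0.5 * (1.0 - c) * s      # reconstructed value at i+1/2
        u = u - c * (f - np.roll(f, 1))  # conservative update
    return u

def total_variation(u):
    return np.abs(u - np.roll(u, 1)).sum()

# Advect a square wave: the limiter prevents new extrema/oscillations.
u0 = np.where((np.arange(50) >= 10) & (np.arange(50) < 20), 1.0, 0.0)
u1 = advect_tvd(u0.copy(), 0.5, 20)
```

The limiter reverts to first-order dissipation only near discontinuities, which is the same design principle the characteristic-based filters apply dimension by dimension.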

  7. Classification of basic facilities for high-rise residential: A survey from 100 housing scheme in Kajang area

    NASA Astrophysics Data System (ADS)

    Ani, Adi Irfan Che; Sairi, Ahmad; Tawil, Norngainy Mohd; Wahab, Siti Rashidah Hanum Abd; Razak, Muhd Zulhanif Abd

    2016-08-01

High demand for housing and limited land in town areas have increased the provision of high-rise residential schemes. This type of housing has different owners but shares the same land lot and common facilities. Thus, maintenance works for the buildings and common facilities must be well organized. The purpose of this paper is to identify and classify basic facilities for high-rise residential buildings, with a view to improving the management of such schemes. The method adopted is a survey of 100 high-rise residential schemes, ranging from affordable to high-cost housing, selected by snowball sampling. The scope of this research is the Kajang area, which is rapidly being developed with high-rise housing. The objective of the survey is to list all facilities in every sampled scheme. The result confirmed that the 11 pre-determined classifications hold true and provide a realistic classification for high-rise residential schemes. This paper proposes a redefinition of the facilities provided, to create a better management system and to give a clear definition of the types of high-rise residential buildings based on their facilities.

  8. Pricing and reimbursement frameworks in Central Eastern Europe: a decision tool to support choices.

    PubMed

    Kolasa, Katarzyna; Kalo, Zoltan; Hornby, Edward

    2015-02-01

Given limited financial resources in the Central Eastern European (CEE) region, challenges in obtaining access to innovative medical technologies are formidable. The objective of this research was to develop a decision tree that supports decision makers and drug manufacturers from the CEE region in their search for optimal innovative pricing and reimbursement schemes (IPRSs). A systematic literature review was performed to search for published IPRSs, and ten experts from the CEE region were then interviewed to ascertain their opinions on these schemes. In total, 33 articles representing 46 unique IPRSs were analyzed. Based on the literature review and subsequent expert input, the key decision nodes and branches of the decision tree were developed. The results indicate that outcome-based schemes are better suited to dealing with uncertainties surrounding cost effectiveness, while non-outcome-based schemes are more appropriate for pricing and budget impact challenges.

  9. Fast viscosity solutions for shape from shading under a more realistic imaging model

    NASA Astrophysics Data System (ADS)

    Wang, Guohui; Han, Jiuqiang; Jia, Honghai; Zhang, Xinman

    2009-11-01

Shape from shading (SFS) is a classical and important problem in computer vision. The goal of SFS is to reconstruct the 3-D shape of an object from its 2-D intensity image. To this end, an image irradiance equation describing the relation between the shape of a surface and its corresponding brightness variations is used, and it is then recast as an explicit partial differential equation (PDE). Using the nonlinear programming principle, we propose a detailed solution to Prados and Faugeras's implicit scheme for approximating the viscosity solution of the resulting PDE. Furthermore, by combining implicit and semi-implicit schemes, a new approximation scheme is presented. To accelerate convergence, we apply the Gauss-Seidel idea and an alternating sweeping strategy to the approximation schemes. Experiments on both synthetic and real images demonstrate that the proposed methods are fast and accurate.

  10. Oral health finance and expenditure in South Africa.

    PubMed

    Naidoo, L C; Stephen, L X

    1997-12-01

The objective of this paper was to examine the cost of oral health in South Africa over the past decade. Particular emphasis was placed on the contribution made by medical schemes, which are the main source of private health care funding. Some of the problems facing this huge industry were also briefly explored. Primary aggregate data on oral health expenditure were obtained from the Department of Health, Pretoria, and from the offices of the Registrar of Medical Schemes, Pretoria. The results show that in 1994, 4.7 per cent of the total health care budget was allocated to oral health. Of this amount, 14.2 per cent came from the state, 71.9 per cent from medical schemes, and the remainder was calculated to be from direct out-of-pocket payments. Furthermore, real expenditure on oral health by medical schemes grew robustly and almost continuously from 1984 through to 1994, generally outstripping medical inflation.

  11. Valiant load-balanced robust routing under hose model for WDM mesh networks

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoning; Li, Lemin; Wang, Sheng

    2006-09-01

In this paper, we propose a Valiant load-balanced robust routing scheme for WDM mesh networks under the model of polyhedral uncertainty (i.e., the hose model); the proposed routing scheme is implemented with a traffic-grooming approach. Our objective is to maximize the hose-model throughput. A mathematical formulation of Valiant load-balanced robust routing is presented and three fast heuristic algorithms are proposed. When implementing the Valiant load-balanced robust routing scheme in WDM mesh networks, a novel traffic-grooming algorithm called MHF (minimizing hop first) is proposed. We compare the three heuristic algorithms with the VPN tree under the hose model. Finally, the simulation results demonstrate that MHF with the Valiant load-balanced robust routing scheme outperforms the traditional traffic-grooming algorithm in terms of throughput for uniform/non-uniform traffic matrices under the hose model.

  12. Matching by linear programming and successive convexification.

    PubMed

    Jiang, Hao; Drew, Mark S; Li, Ze-Nian

    2007-06-01

    We present a novel convex programming scheme to solve matching problems, focusing on the challenging problem of matching in a large search range and with cluttered background. Matching is formulated as metric labeling with L1 regularization terms, for which we propose a novel linear programming relaxation method and an efficient successive convexification implementation. The unique feature of the proposed relaxation scheme is that a much smaller set of basis labels is used to represent the original label space. This greatly reduces the size of the searching space. A successive convexification scheme solves the labeling problem in a coarse to fine manner. Importantly, the original cost function is reconvexified at each stage, in the new focus region only, and the focus region is updated so as to refine the searching result. This makes the method well-suited for large label set matching. Experiments demonstrate successful applications of the proposed matching scheme in object detection, motion estimation, and tracking.

  13. Unstructured grids for sonic-boom analysis

    NASA Technical Reports Server (NTRS)

    Fouladi, Kamran

    1993-01-01

A fast and efficient unstructured grid scheme is evaluated for sonic-boom applications. The scheme is used to predict the near-field pressure signatures of a body of revolution at several body lengths below the configuration, and those results are compared with experimental data. The introduction of the 'sonic-boom grid topology' to this scheme makes it well suited for sonic-boom applications, thus providing an alternative to conventional multiblock structured grid schemes.

  14. Stability analysis of implicit time discretizations for the Compton-scattering Fokker-Planck equation

    NASA Astrophysics Data System (ADS)

    Densmore, Jeffery D.; Warsa, James S.; Lowrie, Robert B.; Morel, Jim E.

    2009-09-01

    The Fokker-Planck equation is a widely used approximation for modeling the Compton scattering of photons in high energy density applications. In this paper, we perform a stability analysis of three implicit time discretizations for the Compton-Scattering Fokker-Planck equation. Specifically, we examine (i) a Semi-Implicit (SI) scheme that employs backward-Euler differencing but evaluates temperature-dependent coefficients at their beginning-of-time-step values, (ii) a Fully Implicit (FI) discretization that instead evaluates temperature-dependent coefficients at their end-of-time-step values, and (iii) a Linearized Implicit (LI) scheme, which is developed by linearizing the temperature dependence of the FI discretization within each time step. Our stability analysis shows that the FI and LI schemes are unconditionally stable and cannot generate oscillatory solutions regardless of time-step size, whereas the SI discretization can suffer from instabilities and nonphysical oscillations for sufficiently large time steps. With the results of this analysis, we present time-step limits for the SI scheme that prevent undesirable behavior. We test the validity of our stability analysis and time-step limits with a set of numerical examples.
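The three coefficient treatments can be illustrated on the scalar model problem du/dt = -u^3, a stand-in for the temperature-dependent coefficients (the paper's stability conclusions concern the full Fokker-Planck system, not this toy): SI freezes the coefficient u^2 at the beginning-of-step value, FI solves the nonlinear implicit equation, and LI linearizes it within the step.

```python
def si_step(u, dt):
    """Semi-implicit: backward Euler for du/dt = -(u^2) u with the
    coefficient u^2 frozen at its beginning-of-step value."""
    return u / (1.0 + dt * u * u)

def fi_step(u_old, dt):
    """Fully implicit: solve u + dt*u^3 = u_old by Newton iteration."""
    u = u_old
    for _ in range(50):
        u -= (u + dt * u**3 - u_old) / (1.0 + 3.0 * dt * u * u)
    return u

def li_step(u, dt):
    """Linearized implicit: u_new^3 ~ u^3 + 3 u^2 (u_new - u)
    within the step, giving a linear update."""
    return (u + 2.0 * dt * u**3) / (1.0 + 3.0 * dt * u * u)

dt, n_steps = 0.1, 100
u_si = u_fi = u_li = 1.0
for _ in range(n_steps):
    u_si, u_fi, u_li = si_step(u_si, dt), fi_step(u_fi, dt), li_step(u_li, dt)

u_exact = (1.0 + 2.0 * n_steps * dt) ** -0.5   # exact solution at t = 10
```

On this scalar problem all three variants decay monotonically toward the exact solution; the SI instabilities reported in the paper arise from coefficient lagging in the coupled photon-temperature system.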

  15. Genome-wide association analysis of secondary imaging phenotypes from the Alzheimer's disease neuroimaging initiative study.

    PubMed

    Zhu, Wensheng; Yuan, Ying; Zhang, Jingwen; Zhou, Fan; Knickmeyer, Rebecca C; Zhu, Hongtu

    2017-02-01

The aim of this paper is to systematically evaluate a biased-sampling issue associated with genome-wide association analysis (GWAS) of imaging phenotypes that affects most imaging genetic studies, including the Alzheimer's Disease Neuroimaging Initiative (ADNI). Specifically, the original sampling scheme of these imaging genetic studies is primarily the retrospective case-control design, whereas most existing statistical analyses of these studies ignore this sampling scheme by directly correlating imaging phenotypes (the secondary traits) with genotype. Although it is well documented in genetic epidemiology that ignoring the case-control sampling scheme can produce highly biased estimates, and subsequently lead to misleading results and spurious associations, such findings are not well documented in imaging genetics. We use extensive simulations and a large-scale imaging genetic data analysis of the Alzheimer's Disease Neuroimaging Initiative (ADNI) data to evaluate the effects of the case-control sampling scheme on GWAS results based on standard statistical methods, such as linear regression, while comparing them with several advanced statistical methods that appropriately adjust for the case-control sampling scheme. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. An extended GS method for dense linear systems

    NASA Astrophysics Data System (ADS)

    Niki, Hiroshi; Kohno, Toshiyuki; Abe, Kuniyoshi

    2009-09-01

    Davey and Rosindale [K. Davey, I. Rosindale, An iterative solution scheme for systems of boundary element equations, Internat. J. Numer. Methods Engrg. 37 (1994) 1399-1411] derived the GSOR method, which uses an upper triangular matrix [Omega] in order to solve dense linear systems. By applying functional analysis, the authors presented an expression for the optimum [Omega]. Moreover, Davey and Bounds [K. Davey, S. Bounds, A generalized SOR method for dense linear systems of boundary element equations, SIAM J. Comput. 19 (1998) 953-967] also introduced further interesting results. In this note, we employ a matrix analysis approach to investigate these schemes, and derive theorems that compare these schemes with existing preconditioners for dense linear systems. We show that the convergence rate of the Gauss-Seidel method with preconditioner PG is superior to that of the GSOR method. Moreover, we define some splittings associated with the iterative schemes. Some numerical examples are reported to confirm the theoretical analysis. We show that the EGS method with preconditioner produces an extremely small spectral radius in comparison with the other schemes considered.
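The comparison of convergence rates in such notes rests on the spectral radius of the iteration matrix M^{-1}N of a splitting A = M - N. A generic sketch using plain Jacobi and Gauss-Seidel splittings (not the GSOR or PG preconditioners of the note):

```python
import numpy as np

def iteration_matrix(A, method):
    """Iteration matrix M^{-1} N of the splitting A = M - N
    (M = D for Jacobi, M = D + L for Gauss-Seidel)."""
    D = np.diag(np.diag(A))
    L = np.tril(A, -1)
    M = D + L if method == "gauss-seidel" else D
    return np.linalg.solve(M, M - A)

def spectral_radius(T):
    """Largest eigenvalue modulus; the splitting converges iff < 1."""
    return np.max(np.abs(np.linalg.eigvals(T)))

# Dense, diagonally dominant test matrix.
n = 20
A = np.full((n, n), -0.02) + 2.0 * np.eye(n)
rho_j = spectral_radius(iteration_matrix(A, "jacobi"))
rho_gs = spectral_radius(iteration_matrix(A, "gauss-seidel"))
```

For this matrix the Jacobi iteration matrix is nonnegative, so by the Stein-Rosenberg theorem Gauss-Seidel converges strictly faster; a smaller spectral radius is precisely the sense in which one scheme is "superior" to another above.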

  17. Comparison of different incremental analysis update schemes in a realistic assimilation system with Ensemble Kalman Filter

    NASA Astrophysics Data System (ADS)

    Yan, Y.; Barth, A.; Beckers, J. M.; Brankart, J. M.; Brasseur, P.; Candille, G.

    2017-07-01

In this paper, three incremental analysis update schemes (IAU 0, IAU 50 and IAU 100) are compared in the same assimilation experiments with a realistic eddy-permitting primitive equation model of the North Atlantic Ocean using the Ensemble Kalman Filter. The difference between the three IAU schemes lies in the position of the increment update window. The relevance of each IAU scheme is evaluated through analyses of both thermohaline and dynamical variables. The assimilation results are validated according to both deterministic and probabilistic metrics against different sources of observations. For deterministic validation, the ensemble mean and the ensemble spread are compared to the observations. For probabilistic validation, the continuous ranked probability score (CRPS) is used to evaluate the ensemble forecast system in terms of reliability and resolution. The reliability is further decomposed into bias and dispersion by the reduced centred random variable (RCRV) score. The results show that (1) the IAU 50 scheme performs as well as the IAU 100 scheme; (2) the IAU 50/100 schemes outperform the IAU 0 scheme in error covariance propagation for thermohaline variables in relatively stable regions, while the IAU 0 scheme outperforms the IAU 50/100 schemes in estimating dynamical variables in dynamically active regions; and (3) with a sufficient number of observations and good error specification, the impact of the IAU scheme is negligible. The differences between the IAU 0 scheme and the IAU 50/100 schemes are mainly due to different model integration times and the different instabilities (density inversion, large vertical velocity, etc.) induced by the increment update. The longer model integration time with the IAU 50/100 schemes, especially the free model integration, allows better re-establishment of the equilibrium model state on the one hand, but smooths the strong gradients in dynamically active regions on the other.
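An IAU scheme applies the analysis increment gradually, weighted over an update window inside the assimilation cycle rather than all at once. A toy sketch with uniform weights and a trivial persistence model (the window placement distinguishing IAU 0/50/100 is only caricatured by the start/length arguments):

```python
import numpy as np

def iau_weights(n_steps, start, length):
    """Uniform IAU weights over an update window [start, start+length)
    of an n_steps assimilation cycle; the weights sum to one."""
    w = np.zeros(n_steps)
    w[start:start + length] = 1.0 / length
    return w

def integrate_with_iau(x0, increment, step, weights):
    """March the model, adding weights[k] * increment after each step,
    so the analysis increment is absorbed gradually."""
    x = x0
    for wk in weights:
        x = step(x) + wk * increment
    return x

w = iau_weights(10, 0, 5)                               # first-half window
x_final = integrate_with_iau(0.0, 2.0, lambda x: x, w)  # persistence model
```

Spreading the increment acts as a low-pass filter on the update, which is why it damps the shocks (density inversions, spurious vertical velocities) that an instantaneous update can trigger.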

18. Performance evaluation of a health insurance scheme in Nigeria using optimal resource use: health care providers' perspectives

    PubMed Central

    2014-01-01

Background Performance measures are often neglected during the transition period of national health insurance scheme implementation in many low- and middle-income countries. These measures evaluate the extent to which various aspects of the schemes meet their key objectives. This study assesses the implementation of a health insurance scheme using optimal-resource-use domains and examines possible factors that influence each domain, according to providers' perspectives. Methods A retrospective, cross-sectional survey was conducted between August and December 2010 in Kaduna state, and 466 health care provider personnel were interviewed. Optimal resource use was defined in four domains: provider payment mechanism (capitation and fee-for-service payment methods), benefit package, administrative efficiency, and active monitoring mechanism. Logistic regression analysis was used to identify provider factors that may influence each domain. Results In the provider payment mechanism domain, the capitation payment method (95%) performed better than the fee-for-service payment method (62%). The benefit package domain performed strongly (97%), while the active monitoring mechanism performed weakly (37%). In the administrative efficiency domain, both promptness of the referral system (80%) and prompt arrival of funds (93%) performed well. At the individual level, providers with fewer enrolees encountered difficulties with reimbursement. Other factors significantly influenced each of the optimal-resource-use domains. Conclusions The fee-for-service payment method and claims review, in the provider payment and active monitoring mechanisms respectively, performed weakly according to the providers' (individual-level) perspectives. A shortfall on the supply side of health insurance could lead to a direct or indirect adverse effect on the demand side of the scheme. Capitation payment per enrolee should be revised to conform to economic circumstances. 
Performance indicators and providers’ characteristics and experiences associated with resource use can assist policy makers to monitor and evaluate health insurance implementation. PMID:24628889

  19. Multi-objective calibration and uncertainty analysis of hydrologic models; A comparative study between formal and informal methods

    NASA Astrophysics Data System (ADS)

    Shafii, M.; Tolson, B.; Matott, L. S.

    2012-04-01

Hydrologic modeling has benefited from significant developments over the past two decades. This has resulted in higher levels of complexity being built into hydrologic models, which eventually makes the model evaluation process (parameter estimation via calibration and uncertainty analysis) more challenging. In order to avoid unreasonable parameter estimates, many researchers have suggested implementing multi-criteria calibration schemes. Furthermore, for predictive hydrologic models to be useful, proper consideration of uncertainty is essential. Consequently, recent research has emphasized comprehensive model assessment procedures in which multi-criteria parameter estimation is combined with statistically-based uncertainty analysis routines such as Bayesian inference using Markov Chain Monte Carlo (MCMC) sampling. Such a procedure relies on formal likelihood functions based on statistical assumptions; moreover, Bayesian inference structured on MCMC samplers requires a considerably large number of simulations. Due to these issues, especially in complex non-linear hydrological models, a variety of alternative informal approaches have been proposed for uncertainty analysis in the multi-criteria context. This study aims to explore a number of such informal uncertainty analysis techniques in multi-criteria calibration of hydrological models. The informal methods addressed in this study are (i) Pareto optimality, which quantifies the parameter uncertainty using the Pareto solutions, (ii) DDS-AU, which uses the weighted sum of objective functions to derive the prediction limits, and (iii) GLUE, which describes the total uncertainty through identification of behavioral solutions. The main objective is to compare such methods with MCMC-based Bayesian inference with respect to factors such as computational burden and predictive capacity, which are evaluated based on multiple comparative measures. 
The measures for comparison are calculated both for calibration and evaluation periods. The uncertainty analysis methodologies are applied to a simple 5-parameter rainfall-runoff model, called HYMOD.
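Of the informal methods listed, GLUE is the simplest to sketch: sample parameter sets, keep the "behavioral" ones whose informal likelihood exceeds a threshold, and form prediction limits from quantiles of the behavioral simulations. The toy linear model and the 1 - normalized-SSE likelihood below are illustrative choices, not those of the study:

```python
import numpy as np

rng = np.random.default_rng(0)

def glue(simulate, observed, n_samples, sample_params, threshold):
    """GLUE sketch: sample parameters, keep 'behavioral' sets whose
    informal likelihood (here 1 - SSE/max SSE) exceeds a threshold,
    and form 5-95% prediction limits from the behavioral simulations."""
    params = [sample_params(rng) for _ in range(n_samples)]
    sims = np.array([simulate(p) for p in params])
    sse = ((sims - observed) ** 2).sum(axis=1)
    lik = 1.0 - sse / sse.max()            # informal likelihood measure
    behavioral = sims[lik > threshold]     # behavioral solutions
    lower = np.quantile(behavioral, 0.05, axis=0)
    upper = np.quantile(behavioral, 0.95, axis=0)
    return lower, upper

# Toy 'hydrologic' model: y = a * t with unknown a; truth uses a = 2.
t = np.linspace(0.0, 1.0, 10)
obs = 2.0 * t
lower, upper = glue(lambda a: a * t, obs, 500,
                    lambda r: r.uniform(0.0, 4.0), 0.5)
```

Unlike MCMC-based Bayesian inference, the likelihood and threshold here are subjective choices, which is exactly the formal-versus-informal trade-off the study examines.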

  20. Study on test of coal co-firing for 600MW ultra supercritical boiler with four walls tangential burning

    NASA Astrophysics Data System (ADS)

    Ying, Wu; Yong-lu, Zhong; Guo-mingi, Yin

    2018-06-01

From nine coals commonly used at a Jiangxi power plant, two kinds of coal were selected for a co-firing test on the basis of industrial analysis, elemental analysis and thermogravimetric analysis. During the co-firing test, two load points were selected and three coal mixtures were prepared. Under each coal blending scheme, the optimal oxygen content was obtained by an oxygen-varying test. Finally, by measuring the boiler efficiency and the coal consumption of power supply under the different co-firing schemes, the recommended co-firing scheme was obtained.

  1. Laplace-Fourier-domain dispersion analysis of an average derivative optimal scheme for scalar-wave equation

    NASA Astrophysics Data System (ADS)

    Chen, Jing-Bo

    2014-06-01

    By using low-frequency components of the damped wavefield, Laplace-Fourier-domain full waveform inversion (FWI) can recover a long-wavelength velocity model from the original undamped seismic data lacking low-frequency information. Laplace-Fourier-domain modelling is an important foundation of Laplace-Fourier-domain FWI. Based on the numerical phase velocity and the numerical attenuation propagation velocity, a method for performing Laplace-Fourier-domain numerical dispersion analysis is developed in this paper. This method is applied to an average-derivative optimal scheme. The results show that within the relative error of 1 per cent, the Laplace-Fourier-domain average-derivative optimal scheme requires seven gridpoints per smallest wavelength and smallest pseudo-wavelength for both equal and unequal directional sampling intervals. In contrast, the classical five-point scheme requires 23 gridpoints per smallest wavelength and smallest pseudo-wavelength to achieve the same accuracy. Numerical experiments demonstrate the theoretical analysis.

  2. A novel equivalent definition of Caputo fractional derivative without singular kernel and superconvergent analysis

    NASA Astrophysics Data System (ADS)

    Liu, Zhengguang; Li, Xiaoli

    2018-05-01

In this article, we present a new second-order finite difference discrete scheme for a fractal mobile/immobile transport model based on an equivalent transformative Caputo formulation. The new transformative formulation removes the singular kernel, making the integral calculation more efficient. Furthermore, this definition is also effective when α is a positive integer. Besides, the T-Caputo derivative also helps us to increase the convergence rate of the discretization of the α-order (0 < α < 1) Caputo derivative from O(τ^(2-α)) to O(τ^(3-α)), where τ is the time step. For the numerical analysis, a Crank-Nicolson finite difference scheme to solve the fractal mobile/immobile transport model is introduced and analyzed. The unconditional stability and a priori estimates of the scheme are given rigorously. Moreover, the applicability and accuracy of the scheme are demonstrated by numerical experiments to support our theoretical analysis.
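A Crank-Nicolson scheme of the kind invoked above averages the spatial operator between time levels, giving second-order accuracy in time and unconditional stability. A sketch for the plain heat equation (not the fractal mobile/immobile model of the article):

```python
import numpy as np

def crank_nicolson_heat(u0, dx, dt, steps):
    """Crank-Nicolson for u_t = u_xx with homogeneous Dirichlet BCs:
    (I - r/2 L) u^{n+1} = (I + r/2 L) u^n, r = dt/dx^2, with L the
    standard second-difference matrix on the interior points."""
    n = len(u0)
    r = dt / dx**2
    L = -2.0 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)
    A = np.eye(n) - 0.5 * r * L
    B = np.eye(n) + 0.5 * r * L
    u = u0.copy()
    for _ in range(steps):
        u = np.linalg.solve(A, B @ u)
    return u

# Decay of the first sine mode on [0, 1]; here r = 4, well beyond the
# explicit stability limit r <= 1/2, yet the scheme remains stable.
n = 19
dx = 1.0 / (n + 1)
x = dx * np.arange(1, n + 1)
u0 = np.sin(np.pi * x)
u = crank_nicolson_heat(u0, dx, dt=0.01, steps=10)
```

The computed mode decays at very nearly the exact rate exp(-pi^2 t), illustrating the second-order accuracy that the article's fractional-derivative transformation is designed to preserve.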

  3. MASTER ANALYTICAL SCHEME FOR ORGANIC COMPOUNDS IN WATER. PART 2. APPENDICES TO PROTOCOLS

    EPA Science Inventory

    A Master Analytical Scheme (MAS) has been developed for the analysis of volatile (gas chromatographable) organic compounds in water. In developing the MAS, it was necessary to evaluate and modify existing analysis procedures and develop new techniques to produce protocols that pr...

  4. Waste characterization study for the Kemp's Ridley sea turtle. Technical memo

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malone, R.F.; Guarisco, M.

    1988-02-01

    The Kemp's Ridley sea turtle, Lepidochelys kempi, is an endangered species. The National Marine Fisheries Service's Head Start program is part of an international operation to save the turtles from extinction. Under the Head Start program, eggs from the Ridley's only known wild nesting beach at Rancho Nuevo in Mexico are transported to Padre Island on the Texas coast to be hatched. The head start enables the turtles to develop a survival advantage. The principal objective was to develop baseline waste-characterization data required to design a wastewater treatment scheme for the Galveston Head Start facility. As a secondary objective, preliminary testing of some filtration components was undertaken to determine which units were most appropriate for inclusion in a wastewater treatment scheme.

  5. Parallelization of implicit finite difference schemes in computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Decker, Naomi H.; Naik, Vijay K.; Nicoules, Michel

    1990-01-01

    Implicit finite difference schemes are often the preferred numerical schemes in computational fluid dynamics, requiring less stringent stability bounds than explicit schemes. Each iteration in an implicit scheme involves global data dependencies in the form of second- and higher-order recurrences, which makes efficient parallel implementation of such iterative methods considerably more difficult and less intuitive. The parallelization of the implicit schemes that are used for solving the Euler and thin-layer Navier-Stokes equations, and that require inversions of large linear systems in the form of block tridiagonal and/or block pentadiagonal matrices, is discussed. Three-dimensional cases are emphasized and schemes that minimize the total execution time are presented. Partitioning and scheduling schemes for alleviating the effects of the global data dependencies are described. An analysis of the communication and computation aspects of these methods is presented, and the effect of the boundary conditions on the parallel schemes is also discussed.
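    To make the recurrence structure concrete: in the scalar case, each tridiagonal inversion mentioned above reduces to the Thomas algorithm, whose forward and backward sweeps are exactly the first-order recurrences that partitioning and scheduling schemes must work around. A minimal sketch, not the Euler/Navier-Stokes solver itself:

```python
def thomas(a, b, c, d):
    """Solve a tridiagonal system with sub-diagonal a, diagonal b,
    super-diagonal c and right-hand side d (lists of length n; a[0]
    and c[-1] are unused). Both sweeps are first-order recurrences:
    each step depends on the previous one, which serializes them."""
    n = len(d)
    cp = [0.0] * n
    dp = [0.0] * n
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                      # forward sweep (sequential)
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):             # backward sweep (sequential)
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Example: the [-1, 2, -1] stencil of a 1-D Poisson problem, rhs = 1.
n = 5
a = [0.0] + [-1.0] * (n - 1)
b = [2.0] * n
c = [-1.0] * (n - 1) + [0.0]
print(thomas(a, b, c, [1.0] * n))
```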

  6. Security analysis and improvement of a privacy authentication scheme for telecare medical information systems.

    PubMed

    Wu, Fan; Xu, Lili

    2013-08-01

    Nowadays, patients can obtain many kinds of medical services online via Telecare Medical Information Systems (TMIS), owing to the rapid development of computer technology. The security of network communication between users and the server is therefore very significant, and authentication plays an important part in protecting information from malicious attackers. Recently, Jiang et al. proposed a privacy-enhanced scheme for TMIS using smart cards and claimed their scheme was better than Chen et al.'s. However, we show that Jiang et al.'s scheme suffers from ID uselessness and is vulnerable to off-line password guessing and user impersonation attacks if an attacker compromises the legal user's smart card. It also cannot resist denial-of-service (DoS) attacks in two cases: after a successful impersonation attack, and after wrong password input in the password change phase. We then propose an improved mutual authentication scheme for a telecare medical information system. Remote monitoring, review of patients' past medical records and medical consultation can be provided by the system, in which information is transmitted via the Internet. Finally, our analysis indicates that the suggested scheme overcomes the disadvantages of Jiang et al.'s scheme and is practical for TMIS.

  7. The search for structure - Object classification in large data sets. [for astronomers

    NASA Technical Reports Server (NTRS)

    Kurtz, Michael J.

    1988-01-01

    Research concerning object classification schemes is reviewed, focusing on large data sets. Classification techniques are discussed, including syntactic and decision-theoretic methods, fuzzy techniques, and stochastic and fuzzy grammars. Consideration is given to the automation of MK classification (Morgan and Keenan, 1973) and to other problems associated with the classification of spectra. In addition, the classification of galaxies is examined, including the problems of systematic errors, blended objects, galaxy types, and galaxy clusters.

  8. Study of flow over object problems by a nodal discontinuous Galerkin-lattice Boltzmann method

    NASA Astrophysics Data System (ADS)

    Wu, Jie; Shen, Meng; Liu, Chen

    2018-04-01

    Flow-over-object problems are studied by a nodal discontinuous Galerkin-lattice Boltzmann method (NDG-LBM) in this work. Unlike the standard lattice Boltzmann method, the current method applies the nodal discontinuous Galerkin method to the streaming process in the LBM to solve the resultant pure convection equation, in which the spatial discretization is performed on unstructured grids and a low-storage explicit Runge-Kutta scheme is used for time marching. The present method thus overcomes the standard LBM's dependence on uniform meshes. Moreover, the collision process in the LBM is handled with a multiple-relaxation-time scheme. After validation of the NDG-LBM by simulating the lid-driven cavity flow, flows over a fixed circular cylinder, a stationary airfoil and rotating-stationary cylinders are simulated. Good agreement of the present results with previous ones is achieved, which indicates that the current NDG-LBM is accurate and effective for flow-over-object problems.

  9. Vanpool trip planning based on evolutionary multiple objective optimization

    NASA Astrophysics Data System (ADS)

    Zhao, Ming; Yang, Disheng; Feng, Shibing; Liu, Hengchang

    2017-08-01

    Carpool and vanpool services draw a great deal of research attention, and vanpooling is the emphasis of this paper. A concrete definition of vanpool operation is given, and based on it this paper tackles vanpool operation optimization using a user experience decline index (UEDI). The focus is on giving each user an identical UEDI while minimizing the sum of all users' UEDIs. Three contributions are made. The first is a vanpool operation scheme diagram, each component of which is explained in detail. The second is treating all customers' UEDIs as a set, whose standard deviation and sum are used as objectives in multiple-objective optimization to decide the trip start address, trip start time and trip destination address. The third is a trip planning algorithm that seeks to minimize the sum of all users' UEDIs. The geographical distribution of charging stations and their utilization rates are considered in the trip planning process.
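    A minimal sketch of the second contribution as described above: the sum and standard deviation of the users' UEDI set serve as the two objectives, and only non-dominated candidate plans are kept. The plan data here are hypothetical placeholders, not from the paper.

```python
import math

def uedi_objectives(uedi):
    """Two objectives from the abstract: total user-experience decline
    (sum) and fairness across users (population standard deviation)."""
    n = len(uedi)
    total = sum(uedi)
    mean = total / n
    std = math.sqrt(sum((u - mean) ** 2 for u in uedi) / n)
    return total, std

def pareto_front(candidates):
    """Keep indices of plans not dominated in (sum, std): a plan is
    dominated if another is no worse in both objectives, better in one."""
    front = []
    for i, (ti, si) in enumerate(candidates):
        dominated = any(
            (tj <= ti and sj <= si) and (tj < ti or sj < si)
            for j, (tj, sj) in enumerate(candidates) if j != i
        )
        if not dominated:
            front.append(i)
    return front

# Three hypothetical trip plans, each yielding a per-user UEDI set.
plans = [[1.0, 1.0, 1.0], [0.2, 0.2, 2.0], [2.0, 2.0, 2.0]]
objs = [uedi_objectives(p) for p in plans]
print(pareto_front(objs))  # plan 2 is dominated by plan 0
```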

  10. A new method for recognizing quadric surfaces from range data and its application to telerobotics and automation, final phase

    NASA Technical Reports Server (NTRS)

    Mielke, Roland; Dcunha, Ivan; Alvertos, Nicolas

    1994-01-01

    In the final phase of the proposed research, a complete top-down three-dimensional object recognition scheme has been developed. The three-dimensional objects considered include spheres, cones, cylinders, ellipsoids, paraboloids, and hyperboloids. Utilizing a newly developed blob determination technique, a given range scene with several non-cluttered quadric surfaces is segmented. Next, using the alignment scheme developed earlier (in phase 1), each of the segmented objects is aligned in a desired coordinate system. For each quadric surface, a set of distinct features (curves) is obtained from its intersections with certain pre-determined planes. A database with entities such as the equations of these planes and their angular bounds has been created for each of the quadric surfaces. Real range data of spheres, cones, cylinders, and parallelepipeds have been utilized in the recognition process. The developed algorithm gave excellent results for the real data as well as for several sets of simulated range data.
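    Once a segmented object is aligned with the coordinate axes, much of the quadric family can already be told apart from coefficient signs alone. The sketch below is a simplified, hypothetical illustration of that idea for central quadrics of the form ax² + by² + cz² = 1; it is not the paper's plane-intersection feature scheme, which also covers cones and paraboloids.

```python
def classify_aligned_quadric(a, b, c, tol=1e-9):
    """Classify a*x^2 + b*y^2 + c*z^2 = 1 by the signs of its
    coefficients, assuming the surface is aligned with the axes."""
    coeffs = [a, b, c]
    pos = sum(1 for v in coeffs if v > tol)
    neg = sum(1 for v in coeffs if v < -tol)
    zero = 3 - pos - neg
    if pos == 3:
        # Equal positive coefficients give a sphere, unequal an ellipsoid.
        return "sphere" if abs(a - b) < tol and abs(b - c) < tol else "ellipsoid"
    if pos == 2 and neg == 1:
        return "hyperboloid of one sheet"
    if pos == 1 and neg == 2:
        return "hyperboloid of two sheets"
    if pos == 2 and zero == 1:
        return "elliptic cylinder"
    return "other"

print(classify_aligned_quadric(1.0, 1.0, 1.0))   # sphere
print(classify_aligned_quadric(1.0, 2.0, -1.0))  # hyperboloid of one sheet
```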

  11. Semi-regular remeshing based trust region spherical geometry image for 3D deformed mesh used MLWNN

    NASA Astrophysics Data System (ADS)

    Dhibi, Naziha; Elkefi, Akram; Bellil, Wajdi; Ben Amar, Chokri

    2017-03-01

    Triangular surface meshes are now widely used for modeling three-dimensional objects. Since these models have very high resolution and the mesh geometry is often very dense, it is necessary to remesh such objects to reduce their complexity and to improve the mesh quality (connectivity regularity). In this paper, we review the main state-of-the-art semi-regular remeshing methods, given that semi-regular remeshing is mainly relevant for wavelet-based compression. We then present our remeshing method, based on trust-region spherical geometry images, to obtain a good 3D mesh compression scheme for deforming 3D meshes with a Multi-library Wavelet Neural Network (MLWNN) structure. Experimental results show that the progressive remeshing algorithm is capable of obtaining more compact representations and semi-regular objects, and yields efficient compression with a minimal set of features, giving a good 3D deformation scheme.

  12. Analysis of sensitivity to different parameterization schemes for a subtropical cyclone

    NASA Astrophysics Data System (ADS)

    Quitián-Hernández, L.; Fernández-González, S.; González-Alemán, J. J.; Valero, F.; Martín, M. L.

    2018-05-01

    A sensitivity analysis to diverse WRF model physical parameterization schemes is carried out over the lifecycle of a subtropical cyclone (STC). STCs are low-pressure systems that share tropical and extratropical characteristics, with hybrid thermal structures. In October 2014, an STC made landfall in the Canary Islands, causing widespread damage there from strong winds and precipitation. The system began to develop on October 18 and its effects lasted until October 21. Accurate simulation of this type of cyclone remains a major challenge because of its rapid intensification and unique characteristics. In the present study, several numerical simulations were performed with the WRF model to analyze the sensitivity of its various parameterization schemes for the development and intensification of the STC, and the combination of parameterization schemes that best simulated this type of phenomenon was thereby determined. In particular, the parameterization combinations that included the Tiedtke cumulus scheme had the most positive effects on model results. Moreover, concerning STC track validation, optimal results were attained once the STC was fully formed and all convective processes had stabilized. Furthermore, to identify the parameterization schemes that optimally categorize the STC structure, a verification using Cyclone Phase Space was performed; the combinations including the Tiedtke cumulus scheme were again the best at categorizing the cyclone's subtropical structure. For strength validation, related atmospheric variables such as wind speed and precipitable water were analyzed. Finally, the effects of using a deterministic or probabilistic approach in simulating intense convective phenomena were evaluated.

  13. Computerized quantitative evaluation of mammographic accreditation phantom images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Yongbum; Tsai, Du-Yih; Shinohara, Norimitsu

    2010-12-15

    Purpose: The objective was to develop and investigate an automated scoring scheme for American College of Radiology (ACR) mammographic accreditation phantom (RMI 156, Middleton, WI) images. Methods: The developed method consisted of background subtraction, determination of regions of interest, classification of fiber and mass objects by Mahalanobis distance, detection of specks by template matching, and rule-based scoring. Fifty-one phantom images were collected from 51 facilities for this study (one image per facility). A medical physicist and two radiologic technologists also scored the images, and the human and computerized scores were compared. Results: In terms of meeting the ACR's criteria, the accuracies of the developed method for computerized evaluation of fibers, masses, and specks were 90%, 80%, and 98%, respectively. Contingency table analysis revealed a significant association between observer and computer scores for microcalcifications (p<5%) but not for masses and fibers. Conclusions: The developed method may achieve a stable assessment of the visibility of test objects in mammographic accreditation phantom images with respect to whether an image meets the ACR's criteria, although there is room for improvement in the approach for fiber and mass objects.
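    The fiber/mass step above is a minimum-Mahalanobis-distance classification. The sketch below shows that rule for two-dimensional features, using an explicit 2x2 covariance inverse; the feature names and class statistics are hypothetical, not taken from the study.

```python
def mahalanobis2(x, mean, cov):
    """Squared Mahalanobis distance for a 2-D feature vector, using the
    explicit inverse inv([[a,b],[b,d]]) = [[d,-b],[-b,a]] / (ad - b^2)."""
    dx = x[0] - mean[0]
    dy = x[1] - mean[1]
    a, b, d = cov[0][0], cov[0][1], cov[1][1]
    det = a * d - b * b
    return (d * dx * dx - 2.0 * b * dx * dy + a * dy * dy) / det

def classify(x, classes):
    """Assign x to the class whose Mahalanobis distance is smallest."""
    return min(classes, key=lambda name: mahalanobis2(x, *classes[name]))

# Hypothetical (elongation, area) statistics for the two object types.
classes = {
    "fiber": ((8.0, 20.0), [[4.0, 0.0], [0.0, 25.0]]),
    "mass":  ((1.5, 80.0), [[0.5, 0.0], [0.0, 400.0]]),
}
print(classify((7.0, 30.0), classes))  # fiber
```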

  14. On the single sweep processing of auditory brainstem responses: click vs. chirp stimulations and active vs. passive electrodes.

    PubMed

    Corona-Strauss, Farah I; Delb, Wolfgang; Bloching, Marc; Strauss, Daniel J

    2008-01-01

    We have recently shown that click-evoked auditory brainstem response (ABR) single sweeps can efficiently be processed by a hybrid novelty detection system. This approach allowed for the objective detection of hearing thresholds in a fraction of the time of conventional schemes, making it appropriate for the efficient implementation of newborn hearing screening procedures. The objective of this study is to evaluate whether this approach might be improved further by different stimulation paradigms and electrode settings. In particular, we evaluate chirp stimulations, which compensate for the basilar-membrane dispersion, and active electrodes, which are less sensitive to movements. This is the first study directed to single sweep processing of chirp-evoked ABRs. By concentrating on transparent features and a minimum number of adjustable parameters, we present an objective comparison of click vs. chirp stimulations and active vs. passive electrodes in ultrafast ABR detection. We show that chirp-evoked brainstem responses and active electrodes might improve the single sweep analysis of ABRs. Consequently, we conclude that single sweep processing of ABRs for the objective determination of hearing thresholds can be improved further by the use of optimized chirp stimulations and active electrodes.

  15. Adaptive Aft Signature Shaping of a Low-Boom Supersonic Aircraft Using Off-Body Pressures

    NASA Technical Reports Server (NTRS)

    Ordaz, Irian; Li, Wu

    2012-01-01

    The design and optimization of a low-boom supersonic aircraft using state-of-the-art off-body aerodynamics and sonic boom analysis has long been a challenging problem. The focus of this paper is to demonstrate an effective geometry parameterization scheme and a numerical optimization approach for the aft shaping of a low-boom supersonic aircraft using off-body pressure calculations. A gradient-based numerical optimization algorithm that models the objective and constraints as response surface equations is used to drive the aft ground signature toward a ramp shape. The design objective is the minimization of the variation between the ground signature and the target signature, subject to several geometric and signature constraints. The target signature is computed by a least-squares regression of the aft portion of the ground signature. The parameterization and deformation of the geometry are performed with a NASA in-house shaping tool. The optimization algorithm uses the shaping tool to drive the geometric deformation of a horizontal tail with a parameterization scheme that consists of seven camber design variables and an additional design variable that describes the spanwise location of the midspan section. The demonstration cases show that numerical optimization using state-of-the-art off-body aerodynamic calculations is not only feasible and repeatable but also allows the exploration of complex design spaces for which a knowledge-based design method becomes less effective.
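    The target signature described above is obtained by least-squares regression of the aft portion of the ground signature. As a hedged sketch of that single step, the closed-form line fit below turns hypothetical aft-signature samples into a ramp (slope and intercept); the sample values are invented for illustration.

```python
def linear_fit(t, p):
    """Closed-form least-squares line p ~ m*t + q, the kind of regression
    used to turn the aft portion of a ground signature into a ramp target."""
    n = len(t)
    st, sp = sum(t), sum(p)
    stt = sum(x * x for x in t)
    stp = sum(x * y for x, y in zip(t, p))
    m = (n * stp - st * sp) / (n * stt - st * st)  # slope of the ramp
    q = (sp - m * st) / n                          # intercept
    return m, q

# Hypothetical aft-signature samples (time in ms, overpressure in psf).
t = [0.0, 1.0, 2.0, 3.0, 4.0]
p = [0.1, -0.2, -0.6, -0.9, -1.3]
m, q = linear_fit(t, p)
print(m, q)
```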

  16. BlobContours: adapting Blobworld for supervised color- and texture-based image segmentation

    NASA Astrophysics Data System (ADS)

    Vogel, Thomas; Nguyen, Dinh Quyen; Dittmann, Jana

    2006-01-01

    Extracting features is the first and one of the most crucial steps in a modern image retrieval process. While the color and texture features of digital images can be extracted rather easily, the shape and layout features depend on reliable image segmentation. Unsupervised image segmentation, often used in image analysis, works on a merely syntactic basis: an unsupervised segmentation algorithm can segment only regions, not objects. To obtain high-level objects, which is desirable in image retrieval, human assistance is needed. Supervised image segmentation schemes can improve the reliability of segmentation and segmentation refinement. In this paper we propose a novel interactive image segmentation technique that combines the reliability of a human expert with the precision of automated image segmentation. The iterative procedure can be considered a variation on the Blobworld algorithm introduced by Carson et al. of the EECS Department, University of California, Berkeley. Starting with an initial segmentation as provided by the Blobworld framework, our algorithm, BlobContours, gradually updates it by recalculating every blob, based on the original features and the updated number of Gaussians. Since the original algorithm was hardly designed for interactive processing, we had to consider additional requirements for realizing a supervised segmentation scheme on the basis of Blobworld. Increasing the transparency of the algorithm by applying user-controlled iterative segmentation, providing different types of visualization for displaying the segmented image, and decreasing the computational time of segmentation are three major requirements, which are discussed in detail.

  17. A robust control scheme for flexible arms with friction in the joints

    NASA Technical Reports Server (NTRS)

    Rattan, Kuldip S.; Feliu, Vicente; Brown, H. Benjamin, Jr.

    1988-01-01

    A general control scheme for flexible arms with friction in the joints is proposed in this paper. The scheme has the advantage of being robust in the sense that it minimizes the effects of the Coulomb friction existing in the motor and the effects of changes in the dynamic friction coefficient. A justification of the robustness properties of the scheme is given in terms of sensitivity analysis.

  18. MultiScheme: A Parallel Processing System Based on MIT (Massachusetts Institute of Technology) Scheme.

    DTIC Science & Technology

    1987-09-01

    Later, when these allocation strategies become a performance concern, the scheduler can be molded to fit the particular application... distracts attention from the more important points that this example is intended to demonstrate. The implementation, therefore, is described separately in... for the benefit of outsiders. From the object's point of view the pipeline is nothing but a list of messages that tell it how to mutate its own state.

  19. A Fast MEANSHIFT Algorithm-Based Target Tracking System

    PubMed Central

    Sun, Jian

    2012-01-01

    Tracking moving targets in complex scenes using an active video camera is a challenging task. Tracking accuracy and efficiency are two key yet generally incompatible aspects of a Target Tracking System (TTS). A compromise scheme is studied in this paper: a fast mean-shift-based target tracking scheme is designed and realized, which is robust to partial occlusion and to changes in object appearance. A physical simulation shows that the image signal processing speed is >50 frames/s. PMID:22969397
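    At the core of a mean-shift tracker is a fixed-point iteration that moves a window to the local sample mean until it settles on a density mode. The sketch below is a 1-D flat-kernel version of that iteration, not the paper's color-histogram tracker.

```python
def mean_shift_mode(samples, start, bandwidth=1.0, tol=1e-6, max_iter=100):
    """Flat-kernel mean shift in 1-D: repeatedly move the estimate to the
    mean of the samples within `bandwidth`, converging to a density mode."""
    x = start
    for _ in range(max_iter):
        window = [s for s in samples if abs(s - x) <= bandwidth]
        if not window:
            break
        new_x = sum(window) / len(window)
        if abs(new_x - x) < tol:
            return new_x
        x = new_x
    return x

# Two clusters; starting near the first cluster locks onto its mode.
data = [0.9, 1.0, 1.1, 5.0, 5.1, 4.9]
print(round(mean_shift_mode(data, start=0.5), 3))  # 1.0
```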

  20. Distortion Representation of Forecast Errors for Model Skill Assessment and Objective Analysis. Revision 1.12

    NASA Technical Reports Server (NTRS)

    Hoffman, Ross N.; Nehrkorn, Thomas; Grassotti, Christopher

    1997-01-01

    We proposed a novel characterization of errors for numerical weather predictions. In its simplest form we decompose the error into a part attributable to phase errors and a remainder. The phase error is represented in the same fashion as a velocity field and is required to vary slowly and smoothly with position. A general distortion representation allows for the displacement and amplification or bias correction of forecast anomalies. Characterizing and decomposing forecast error in this way has two important applications, which we term the assessment application and the objective analysis application. For the assessment application, our approach results in new objective measures of forecast skill which are more in line with subjective measures of forecast skill and which are useful in validating models and diagnosing their shortcomings. With regard to the objective analysis application, meteorological analysis schemes balance forecast error and observational error to obtain an optimal analysis. Presently, representations of the error covariance matrix used to measure the forecast error are severely limited. For the objective analysis application our approach will improve analyses by providing a more realistic measure of the forecast error. We expect, a priori, that our approach should greatly improve the utility of remotely sensed data which have relatively high horizontal resolution, but which are indirectly related to the conventional atmospheric variables. In this project, we are initially focusing on the assessment application, restricted to a realistic but univariate 2-dimensional situation. Specifically, we study the forecast errors of the sea level pressure (SLP) and 500 hPa geopotential height fields for forecasts of the short and medium range. Since the forecasts are generated by the GEOS (Goddard Earth Observing System) data assimilation system with and without ERS 1 scatterometer data, these preliminary studies serve several purposes. They (1) provide a testbed for the use of the distortion representation of forecast errors, (2) act as one means of validating the GEOS data assimilation system and (3) help to describe the impact of the ERS 1 scatterometer data.
